Zhang, F., Lei, T., Li, J., Cai, X., Shao, X., Chang, J. and Tian, F., 2018. Real-Time Calibration and Registration Method for Indoor Scene with Joint Depth and Color Camera. International Journal of Pattern Recognition and Artificial Intelligence, 32 (7), 1-26.
Full text available as:
PDF: Real-time Calibration and Registration Method for Indoor Scene with Joint Depth and Color Camera.pdf (Accepted Version, 821kB). Available under License Creative Commons Attribution Non-commercial No Derivatives.
Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
DOI: 10.1142/S0218001418540216
Abstract
Traditional vision-based registration technologies require precisely designed markers or rich texture information in the captured video scene, and they have high computational complexity, while hardware-based registration technologies lack accuracy. In this paper, we therefore propose a novel registration method that takes advantage of an RGB-D camera to obtain depth information in real time: a binocular system combining a Time-of-Flight (ToF) camera and a commercial color camera is constructed to realize three-dimensional registration. First, we calibrate the binocular system to obtain the positional relationship between the two cameras, and the systematic errors are fitted and corrected with B-spline curves. To reduce anomalies and random noise, an elimination algorithm and an improved bilateral filtering algorithm are proposed to optimize the depth map; to meet the system's real-time requirement, this stage is further accelerated by parallel computing with CUDA. Then, a Camshift-based tracking algorithm is applied to capture the real object to be registered in the video stream, and the position and orientation of the object are tracked according to the correspondence between the color image and the 3D data. Finally, experiments are implemented and compared on our binocular system; the results demonstrate the feasibility and effectiveness of our method.
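The paper's improved bilateral filter is not reproduced in this record; as a point of reference for the depth-map optimization step the abstract describes, a minimal NumPy sketch of the *standard* bilateral filter applied to a depth map could look as follows. The hole handling (skipping zero-valued pixels) and the parameter names `sigma_s`/`sigma_r` are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def bilateral_filter_depth(depth, radius=2, sigma_s=2.0, sigma_r=30.0):
    """Standard bilateral filter on a depth map: smooths random noise
    while preserving depth discontinuities. Pixels with value 0 are
    treated as missing measurements and excluded from the average."""
    h, w = depth.shape
    out = np.array(depth, dtype=np.float64)
    for y in range(h):
        for x in range(w):
            c = depth[y, x]
            if c == 0:  # no measurement here: leave the hole as-is
                continue
            acc, wsum = 0.0, 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and depth[ny, nx] != 0:
                        d = depth[ny, nx]
                        # spatial weight (pixel distance) times range
                        # weight (depth difference): large depth jumps
                        # get near-zero weight, so edges are preserved
                        ws = np.exp(-(dx * dx + dy * dy) / (2.0 * sigma_s ** 2))
                        wr = np.exp(-((d - c) ** 2) / (2.0 * sigma_r ** 2))
                        acc += ws * wr * d
                        wsum += ws * wr
            out[y, x] = acc / wsum
    return out
```

On a depth map containing a step edge (e.g. a foreground object at 100 mm against a background at 500 mm), the range weight `wr` collapses to nearly zero across the edge, so each side is averaged only with its own region; the paper's CUDA acceleration parallelizes exactly this per-pixel independence.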
| Item Type: | Article |
|---|---|
| ISSN: | 0218-0014 |
| Uncontrolled Keywords: | Binocular stereoscopic vision; image tracking; depth map; filtering; registration; calibration |
| Group: | Faculty of Media & Communication |
| ID Code: | 30289 |
| Deposited By: | Symplectic RT2 |
| Deposited On: | 31 Jan 2018 14:49 |
| Last Modified: | 14 Mar 2022 14:09 |