Calibration of Omnidirectional, Perspective and Lidar Camera Systems
RESEARCH PROJECT DESCRIPTION
One of the most challenging issues in robotic perception applications is the fusion of information from several different sources. Today the majority of platforms include range (2D or 3D sonar/lidar) and camera (color/infrared, perspective/omni) sensors that usually capture the surrounding environment independently, although the information from the different sources can be used in a complementary way. In order to fuse the information from these independent devices, it is highly desirable to have a calibration among them, i.e. to transform the measured data into a common coordinate frame. To achieve this goal, either extrinsic or intrinsic-extrinsic calibration must be performed, depending on whether prior knowledge of the camera intrinsic parameters is available. In the case of extrinsic parameter estimation for a range-camera sensor pair, the rigid motion between the two reference frames is determined.
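The role of the extrinsic parameters can be sketched as follows: once the rigid motion (R, t) between the lidar and camera frames is known, lidar points can be transferred into the camera frame and projected onto the image. This is a minimal illustration assuming a simple pinhole model with known intrinsics K; all function names and numeric values are hypothetical, not part of the project's method.

```python
import numpy as np

def to_camera_frame(points_lidar, R, t):
    """Transform Nx3 lidar points into the camera frame via the rigid motion [R | t]."""
    return points_lidar @ R.T + t

def project_pinhole(points_cam, K):
    """Project 3D camera-frame points to pixel coordinates using intrinsics K."""
    uvw = points_cam @ K.T          # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3] # divide by depth

# Illustrative extrinsics: identity rotation, 10 cm lateral offset between sensors
R = np.eye(3)
t = np.array([0.1, 0.0, 0.0])
# Illustrative intrinsics: focal length 800 px, principal point (320, 240)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

pts_lidar = np.array([[1.0, 0.5, 4.0]])  # one lidar point, in meters
uv = project_pinhole(to_camera_frame(pts_lidar, R, t), K)
# uv is the pixel where this lidar point lands in the camera image
```

With such a mapping in place, depth and color measurements of the same scene point can be fused, which is exactly what the calibration enables.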
Extrinsic parameter estimation for 2D/3D lidar and perspective camera pairs has been addressed especially in environment mapping applications; however, the problem is far from trivial. Due to the different operating principles of the lidar and the camera, calibration is often performed manually, by placing special calibration targets in the images (e.g. checkerboard patterns), or by point feature extraction methods. These tend to be laborious and time consuming, especially if the calibration procedure has to be repeated during data acquisition. In practice it is often desirable to have a flexible one-step calibration for systems whose sensors are not necessarily fixed to a common platform.
In this project, a novel region-based framework is developed for the calibration of 2D and 3D camera systems.
- Robert Frohlich, Levente Tamas, and Zoltan Kato. Homography Estimation between Omnidirectional Cameras without Point Correspondences. In Lucian Busoniu and Levente Tamas, editors, Handling Uncertainty and Networked Structure in Robot Control, chapter 6, Springer, 2015. (to appear)
- Zoltan Kato and Levente Tamas. Relative Pose Estimation and Fusion of 2D Spectral and 3D Lidar Images. In Proceedings of the Computational Color Imaging Workshop (CCIW), Lecture Notes in Computer Science, Vol. 9016, Saint-Etienne, France, pages 33-42, March 2015. Springer. (keynote talk)
- Levente Tamas, Robert Frohlich, and Zoltan Kato. Relative Pose Estimation and Fusion of Omnidirectional and Lidar Cameras. In Proceedings of the ECCV Workshop on Computer Vision for Road Scene Understanding and Autonomous Driving (ECCV-CVRSUAD), Lecture Notes in Computer Science, Vol. 8926, Zurich, Switzerland, pages 640-651, September 2014. Springer.
- Levente Tamas and Zoltan Kato. Targetless Calibration of a Lidar - Perspective Camera Pair. In Proceedings of ICCV Workshop on Big Data in 3D Computer Vision (ICCV-BigData3DCV), Sydney, Australia, pages 668-675, December 2013. IEEE.