Due to its robust performance and wide potential application, the fusion of visual and inertial sensors for motion estimation has attracted significant attention recently. For this fusion, extrinsic calibration is a prerequisite, but most existing methods require a special configuration in which the camera can observe the inertial sensor. In this paper we present an extrinsic calibration method for visual and inertial sensors in a general configuration in which the visual and inertial sensors cannot observe each other. The proposed method introduces a total station as a reference. The total station can be adjusted to be parallel to the ground plane, to which the inertial sensor can provide its rotation. The rotation between the fisheye camera and the total station is then deduced via a homography, and finally the rotation between the fisheye camera and the inertial measurement unit can be obtained. We implement the proposed method on our lunar rover prototype, and the experimental results show the validity of the proposed method.

In this paper, we present a flexible new technique for single-viewpoint omnidirectional camera calibration. The proposed method only requires the camera to observe a planar pattern shown at a few different orientations; either the camera or the planar pattern can be freely moved. No a priori knowledge of the motion is required, nor a specific model of the omnidirectional sensor. The only assumption is that the image projection function can be described by a Taylor series expansion whose coefficients are estimated by solving a two-step least-squares linear minimization problem. To test the proposed technique, we calibrated a panoramic camera having a field of view greater than 200° in the vertical direction, and we obtained very good results. To investigate the accuracy of the calibration, we also used the estimated omni-camera model in a structure-from-motion experiment.
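As a rough illustration of the linear estimation step mentioned in the second abstract, the sketch below fits the Taylor coefficients of the projection function by ordinary least squares. It is a minimal sketch under simplifying assumptions: the pattern points are taken to be already expressed in the camera frame (the actual two-step procedure also estimates the pattern poses), the distortion centre is assumed to coincide with the image centre, and the function name `fit_projection_polynomial` and the polynomial degree are illustrative choices rather than part of the original method.

```python
import numpy as np

def fit_projection_polynomial(uv, xyz, degree=4):
    """Linearly estimate the Taylor coefficients a0, a2, ..., aN of the
    projection function f(rho) = a0 + a2*rho^2 + ... + aN*rho^N.

    uv  : (M, 2) image points, already centred on the distortion centre (pixels)
    xyz : (M, 3) corresponding pattern points expressed in the camera frame
          (i.e. the pattern pose is assumed known for this sketch)
    """
    u, v = uv[:, 0], uv[:, 1]
    X, Y, Z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    rho = np.hypot(u, v)        # radial distance of the image point
    R = np.hypot(X, Y)          # radial distance of the 3D point from the axis
    # Collinearity of the back-projected ray (u, v, f(rho)) with (X, Y, Z)
    # gives f(rho) / rho = Z / R, so each observation constrains f(rho)
    # linearly in the coefficients. Points on the optical axis (R == 0)
    # would have to be excluded first.
    target = rho * Z / R
    # Design matrix [1, rho^2, rho^3, ..., rho^N]; the linear term is omitted,
    # as is common for central omnidirectional models.
    powers = [np.ones_like(rho)] + [rho**k for k in range(2, degree + 1)]
    A = np.stack(powers, axis=1)
    coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
    return coeffs
```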
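The rotation chain described in the first abstract (camera → total station → ground plane → IMU) can be composed in a few lines once each link is measured. The following is a minimal sketch under assumed conventions: the function and argument names are illustrative, `R_a_b` is taken to map vectors from frame `b` to frame `a`, and any residual yaw between the levelled total station and the ground frame is ignored.

```python
import numpy as np

def camera_to_imu_rotation(R_ts_cam, R_ground_imu):
    """Compose the camera-to-IMU rotation along the chain
    camera -> total station -> ground plane -> IMU.

    R_ts_cam     : 3x3 rotation taking camera-frame vectors to the total-station
                   frame (e.g. recovered by decomposing a plane homography)
    R_ground_imu : 3x3 rotation taking IMU-frame vectors to the ground frame
                   (roll/pitch obtained from the inertial sensor's gravity reading)
    """
    # The total station is levelled, so its frame is assumed to coincide with
    # the ground frame; the chain then reduces to two measured rotations.
    R_ground_cam = R_ts_cam                      # TS frame ~ ground frame
    R_imu_cam = R_ground_imu.T @ R_ground_cam    # invert IMU attitude, compose
    return R_imu_cam

# Example: if the camera is aligned with the total station and the IMU is
# level, the recovered extrinsic rotation is the identity.
R = camera_to_imu_rotation(np.eye(3), np.eye(3))
```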