LIDAR AND CAMERA CALIBRATION USING A MOUNTED SPHERE
Extrinsic calibration between lidar and camera sensors is needed for multi-modal sensor data fusion. However, obtaining precise extrinsic calibration can be tedious, computationally expensive, or involve elaborate apparatus. This thesis proposes a simple, fast, and robust method for performing extrinsic calibration between a camera and a lidar. The only required calibration target is a hand-held colored sphere mounted on a whiteboard. Convolutional neural networks are developed to automatically localize the sphere relative to the camera and the lidar. Then, using the localization covariance models, the relative pose between the camera and the lidar is derived. To evaluate the accuracy of our method, we record image and lidar data of a sphere at a set of known grid positions using two rails mounted on a wall. Accurate calibration is demonstrated by projecting the grid centers into the camera image plane and measuring the error between these points and the hand-labeled sphere centers.
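The evaluation step described above, projecting 3D grid centers through the estimated extrinsics into the camera image plane and measuring the pixel error against hand-labeled centers, can be sketched as follows. This is a minimal illustration under a pinhole camera model; all numeric values (intrinsics, rotation, translation) are illustrative placeholders, not parameters from the thesis:

```python
import math

# Hypothetical pinhole intrinsics (focal lengths and principal point, in pixels).
fx, fy, cx, cy = 600.0, 600.0, 320.0, 240.0

def rotation_z(theta):
    """3x3 rotation matrix about the z-axis by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

# Illustrative lidar-to-camera extrinsics: a small rotation and translation.
R = rotation_z(math.radians(2.0))
t = [0.05, -0.10, 0.0]  # meters

def project(point_lidar):
    """Transform a 3D lidar point into the camera frame and project to pixels."""
    x, y, z = (sum(R[i][j] * point_lidar[j] for j in range(3)) + t[i]
               for i in range(3))
    return (fx * x / z + cx, fy * y / z + cy)

def pixel_error(p_proj, p_label):
    """Euclidean distance in pixels between projected and hand-labeled centers."""
    return math.hypot(p_proj[0] - p_label[0], p_proj[1] - p_label[1])
```

In the thesis's evaluation, `pixel_error` would be accumulated over all grid positions to summarize calibration accuracy; the rotation and translation here stand in for the extrinsics the method estimates.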
- In Collections: Electronic Theses & Dissertations
- Copyright Status: In Copyright
- Material Type: Theses
- Authors: Li, Jiajia
- Thesis Advisors: Morris, Daniel
- Committee Members: Radha, Hayder; Li, Zhaojian
- Date Published: 2020
- Subjects: Computer science; Engineering
- Program of Study: Electrical Engineering - Master of Science
- Degree Level: Masters
- Language: English
- Pages: 45 pages
- Permalink: https://doi.org/doi:10.25335/pg5g-7a22