Lidar-Camera Calibration

Perform calibration, estimate lidar-camera transform, and fuse data from each sensor

Most modern autonomous or semi-autonomous vehicles are equipped with sensor suites that contain multiple sensors. Rotational and translational transformations are required to calibrate and fuse data from these sensors. Fusing lidar data with corresponding camera data is particularly useful in the perception pipeline. The lidar-camera calibration (LCC) workflow serves this purpose, using a checkerboard pattern as the calibration target.

Lidar Toolbox™ algorithms provide functionality to extract checkerboard features from images and point clouds and use them to estimate the rigid transformation between the camera and the lidar sensor. The toolbox also provides downstream LCC functionality: projecting lidar points onto images, fusing color information into lidar point clouds, and transferring bounding boxes from camera data to lidar data.

Functions

estimateCheckerboardCorners3d - Estimate world frame coordinates of checkerboard corner points in image
detectRectangularPlanePoints - Detect rectangular plane of specified dimensions in point cloud
estimateLidarCameraTransform - Estimate rigid transformation from lidar sensor to camera
projectLidarPointsOnImage - Project lidar point cloud data onto image coordinate frame
fuseCameraToLidar - Fuse image information to lidar point cloud
bboxCameraToLidar - Estimate 3-D bounding boxes in point cloud from 2-D bounding boxes in image
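The functions above chain into an end-to-end workflow. The sketch below shows one way to combine them, assuming synchronized calibration images and point clouds, a known checkerboard square size, and a `cameraIntrinsics` object from a prior camera calibration; file paths and the square size are placeholders, and exact argument forms may vary by toolbox release.

```matlab
% Placeholder input data: synchronized calibration images and point clouds.
imageFiles = dir(fullfile("calib","images","*.png"));
imagePaths = fullfile({imageFiles.folder}, {imageFiles.name});
ptCloudFiles = dir(fullfile("calib","pointclouds","*.pcd"));
ptCloudPaths = fullfile({ptCloudFiles.folder}, {ptCloudFiles.name});

% intrinsics is assumed to be a cameraIntrinsics object obtained from a
% prior camera calibration (for example, with the Camera Calibrator app).
squareSize = 81; % checkerboard square edge length in millimeters (placeholder)

% Extract checkerboard corners from the images, then detect the matching
% checkerboard plane in the point clouds that produced valid corners.
[imageCorners3d, boardDimension, imagesUsed] = ...
    estimateCheckerboardCorners3d(imagePaths, intrinsics, squareSize);
lidarPlanes = detectRectangularPlanePoints( ...
    ptCloudPaths(imagesUsed), boardDimension, RemoveGround=true);

% Estimate the rigid transformation from the lidar sensor to the camera.
tform = estimateLidarCameraTransform(lidarPlanes, imageCorners3d, intrinsics);

% Downstream fusion: project lidar points into one image and colorize
% the corresponding point cloud with camera data.
ptCloud = pcread(ptCloudPaths{1});
I = imread(imagePaths{1});
imPts = projectLidarPointsOnImage(ptCloud, intrinsics, tform);
fusedPtCloud = fuseCameraToLidar(I, ptCloud, intrinsics, tform);
```

The same `tform` can also be passed to `bboxCameraToLidar` to lift 2-D detections from the image into 3-D bounding boxes in the point cloud.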

Topics

What Is Lidar Camera Calibration?

Integrate lidar and camera data.

Featured Examples