
Relative calibration extrinsics for multiple cameras

CCarl8
Beginner

Hi

Did anyone in the forum succeed in implementing camera extrinsics calibration for multiple D435 cameras following this paper?

https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/RealSense_D400%20_Custom_Calib_Paper.pdf

I need to find the world position and rotation of multiple cameras relative to each other, all looking at the same calibration chessboard. I'd like to know that it is achievable before I throw a lot of hours at it. In the end it is supposed to run in Unity3D.
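
For what it's worth, the underlying math is achievable with one chessboard pose estimate per camera. Below is a minimal, hedged sketch (Python with OpenCV, not Unity3D) under these assumptions: you already have one grayscale view of the board per camera plus each camera's intrinsic matrix and distortion coefficients, and the pattern size, square size, file names, and placeholder intrinsics are all hypothetical values you would replace with your own. The resulting 4x4 matrix could then be exported and loaded into Unity3D.

```python
# Minimal sketch: relative pose of two cameras that both see the same chessboard.
import numpy as np
import cv2

PATTERN = (9, 6)        # inner corners of the chessboard (columns, rows) - placeholder
SQUARE_SIZE = 0.025     # square edge length in meters - placeholder

# Placeholder inputs: one grayscale image per camera plus intrinsics/distortion.
# In practice, load real captures and read K/dist from your own calibration or the SDK.
gray_a = cv2.imread("cam_a.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file names
gray_b = cv2.imread("cam_b.png", cv2.IMREAD_GRAYSCALE)
K_a = K_b = np.array([[640.0, 0, 640], [0, 640, 400], [0, 0, 1]])  # placeholder intrinsics
dist_a = dist_b = np.zeros(5)

# 3D board points in the board's own coordinate system (Z = 0 plane)
obj_pts = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
obj_pts[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

def board_pose(gray, K, dist):
    """Return the 4x4 transform taking board coordinates into this camera's frame."""
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    assert found, "chessboard not visible in this image"
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners, K, dist)
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(rvec)
    T[:3, 3] = tvec.ravel()
    return T

T_a = board_pose(gray_a, K_a, dist_a)   # board -> camera A
T_b = board_pose(gray_b, K_b, dist_b)   # board -> camera B

# Pose of camera B expressed in camera A's frame: points in B's frame map to A via T_ab.
T_ab = T_a @ np.linalg.inv(T_b)
print("R =\n", T_ab[:3, :3], "\nt =", T_ab[:3, 3])
```

With more than two cameras the same composition applies pairwise against whichever camera you pick as the world origin.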

All the best

Carl Emil

MartyG
Honored Contributor III

Intel recently introduced a new chessboard calibration product in its Click online store that can calibrate multiple 400 Series cameras, though it costs $1500 as it is aimed at engineering departments and manufacturing facilities rather than consumers.

https://click.intel.com/realsense-d400-camera-oem-calibration.html Intel® RealSense™ D400 Camera OEM Calibration Target - Intel® RealSense™ Depth Cameras

Wgao6
Novice

Hi

I can calibrate between the left and right cameras (intrinsics and R/T), and I can calibrate between the left and color cameras (intrinsics and R/T).

They use the same reference: the left camera principal point as origin, which is also the point cloud coordinate system.

My questions:

Is the point cloud coordinate system reference for the D435 the left camera principal point BEFORE rectification or AFTER rectification?

If I use a chessboard to calibrate the relative pose (R/T) of two D435 cameras (say Camera A and Camera B), using only the left imager of Camera A and the left imager of Camera B for the extrinsic calibration, is the R/T between A and B the same as the transform between the point clouds generated from A and B?

I did the above, but point cloud A and point cloud B do not register correctly.

How do I use a single planar chessboard to do a relative calibration (R and T) between two point clouds?

Thanks

Wen

MartyG
Honored Contributor III

Your question is outside of my knowledge, unfortunately. Hopefully a member of the Intel support team can pick up your case and help you. Good luck!

idata
Employee

Hello Wen

 

 

For calibration, RealSense provides the Y16 data format, which is unrectified. I think all of your questions are answered in this custom calibration white paper: https://www.intel.com/content/www/us/en/support/articles/000026725/emerging-technologies/intel-realsense-technology.html
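
For reference, here is a hedged sketch of capturing those unrectified Y16 frames from both imagers with pyrealsense2. The 1280x800 resolution and frame rate are my assumptions based on the calibration profile described in the white paper; the exact profiles supported by your firmware are best confirmed with rs-enumerate-devices.

```python
# Hedged sketch: grab unrectified Y16 frames from both imagers for calibration.
import numpy as np
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.infrared, 1, 1280, 800, rs.format.y16, 15)  # left imager
cfg.enable_stream(rs.stream.infrared, 2, 1280, 800, rs.format.y16, 15)  # right imager
pipe.start(cfg)

try:
    frames = pipe.wait_for_frames()
    left = np.asanyarray(frames.get_infrared_frame(1).get_data())
    right = np.asanyarray(frames.get_infrared_frame(2).get_data())
    print("captured", left.shape, right.shape, left.dtype)
finally:
    pipe.stop()
```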

 

 

Regards,

 

Jesus

 

Intel Customer Support
Wgao6
Novice

Hi Jesus,

Thanks. I could not find this information in the Intel custom calibration document.

My question is:

Are the following two coordinate systems the same?

1) The coordinate system of the point cloud generated by the D435: origin and X, Y, Z directions.

2) The coordinate system of the left camera (OV9282): origin and X, Y, Z directions.

Example with two D435 cameras:

D435 Camera A: LeftCamA and RightCamA generate PointCloudA.

D435 Camera B: LeftCamB and RightCamB generate PointCloudB.

If I do an extrinsic calibration between LeftCamA and LeftCamB to get the rotation and translation matrices R/T, is the following true?

Applying R/T to PointCloudB (p_A = R * p_B + T for each point) will align it with PointCloudA.
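
For what it's worth, here is that relation written out as a small numpy sketch, assuming R and T map points from LeftCamB coordinates into LeftCamA coordinates and that each point cloud is an (N, 3) array in its own camera's depth (left imager) frame; the function name and array layout are mine, not from the SDK. It is also worth double-checking that R/T and the point clouds use the same units, since the dynamic calibration translation is given in millimeters while SDK point clouds are in meters.

```python
# Sketch of the relation in question: align point cloud B into camera A's frame.
import numpy as np

def align_b_to_a(points_b, R, t):
    """Apply p_A = R @ p_B + t to every row of an (N, 3) point array."""
    return points_b @ R.T + t.reshape(1, 3)

# aligned_b = align_b_to_a(point_cloud_b, R, t)
# aligned_b should then overlay point_cloud_a, up to calibration error.
```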

Thanks.

Wen

idata
Employee

Hello Wen,

 

 

The answer to your question is yes. This is implied in the multiple camera white paper (https://www.intel.com/content/www/us/en/support/articles/000028140/emerging-technologies/intel-realsense-technology.html), page 10, section C, Aligning Point Clouds.

 

 

Regards,

 

Jesus G.

 

Intel Customer Support

 

Wgao6
Novice

How do I do a point cloud transform inside the D435 camera?

I have an R/T matrix from an extrinsic calibration of the D435 point cloud relative to my workspace. How do I load the R/T into the D435 camera so that the transform is done inside the camera and the point cloud output from the D435 is already in my workspace coordinates?

Thanks.

Wen

idata
Employee

Hello Wen,

 

 

You can use the WriteCustomCalibrationParameters function from the Dynamic Calibration API to write those parameters to the camera.

 

Find more about it here: https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/RealSense_D400_Dyn_Calib_Programmer.pdf

 

 

Thank you,

 

Eliza
Wgao6
Novice

Hi Eliza,

Thank you.

The calibration write command writes the intrinsic, stereo, and RGB extrinsic parameters.

Yes, I use those to do the custom intrinsic and stereo calibration.

But what I would like to do is write extrinsic parameters that transform the point cloud.

Does the D435 have a way to do a point cloud transform (rotation and translation) inside the camera?

idata
Employee

Hello Wen,

 

 

The Rotation and Translation parameters of the IR cameras can be written using this function.

 

 

They are rotationLeftRight (The rotation from the right camera coordinate system to the left camera coordinate system, specified as a 3x3 row-major rotation matrix) and translationLeftRight (The translation from the right camera coordinate system to the left camera coordinate system, specified as a 3x1 vector in millimeters).

 

 

You might also want to check the Projection in RealSense SDK 2.0 page: https://github.com/IntelRealSense/librealsense/wiki/Projection-in-RealSense-SDK-2.0
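
As a concrete illustration of how the SDK exposes the extrinsics it stores, here is a hedged sketch using pyrealsense2 that reads the depth-to-color extrinsics via get_extrinsics_to, the stream profile call described on the Projection page. Note that the SDK only stores extrinsics between streams of the same device; a left-imager-to-left-imager transform across two different cameras has to come from your own chessboard calibration.

```python
# Hedged sketch: read the extrinsics the SDK stores between two streams
# (here depth / left imager -> color) using pyrealsense2.
import pyrealsense2 as rs

pipe = rs.pipeline()
profile = pipe.start()

depth_profile = profile.get_stream(rs.stream.depth)
color_profile = profile.get_stream(rs.stream.color)

# rs2_extrinsics: rotation as a list of 9 floats plus translation in meters
extr = depth_profile.get_extrinsics_to(color_profile)
print("rotation:", extr.rotation)
print("translation (m):", extr.translation)

pipe.stop()
```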

 

 

Best regards,

 

Eliza
Wgao6
Novice

The point cloud origin is the left camera principal point; the right-to-left extrinsic is for stereo matching after rectification.

But what I'd like to do is transform the point cloud (basically from the left IR camera origin) to my workspace origin.

I can do this outside of the D435, but I'd like to do it inside the camera.

As for the SDK commands mentioned in the document: can those commands only be run outside of the camera?

idata
Employee

Hello Wen,

 

 

The depth data is generated from the overlap of the individual left and right imager fields of view.

 

 

For the D435, the depth start point is referenced at -3.2 mm from the front of the camera cover glass. You can find this information on page 57, section 4.7 Depth Start Point (Ground Zero Reference), of the datasheet: https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/Intel-RealSense-D400-Series-Datasheet.pdf

 

In order to obtain depth data, you also need to consider the Min-Z depth (the minimum distance from the depth camera to the scene for which the Vision Processor D4 provides depth data), which for the D435 varies from 105 mm to 280 mm depending on the resolution. You can check the datasheet, page 56, section 4.4 Minimum-Z Depth, for the exact values.

 

 

Given that you need to take these into consideration, obtaining an already-transformed point cloud inside the camera is not possible.

 

 

Could you please let us know why you want to transform the point cloud inside the camera?

 

 

Thank you and best regards,

 

Eliza
Wgao6
Novice

What we do now:

1) We take the point cloud output from the D435 cameras (generated from disparity and from the intrinsic and stereo calibration parameters inside the D435).

2) We apply our extrinsic calibration matrix to transform the point cloud from the D435 frame to our workspace frame.

What we want to do:

1) Get point clouds from the D435 camera that are already transformed into our workspace (we can provide the rotation and translation matrix to the D435 camera), to save the conversion time on our computer.
idata
Employee

Hi WenLiang,

 

 

Unfortunately, the camera is not capable of doing these advanced transformations. You will have to do that processing on your computer.
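
For completeness, here is a hedged sketch of doing that processing on the host right after the SDK computes the point cloud. T_ws is a placeholder for your workspace calibration result (a 4x4 homogeneous matrix in meters), and the per-frame cost is essentially one vectorized matrix multiply in numpy.

```python
# Hedged sketch: apply the workspace extrinsic (4x4 homogeneous matrix T_ws)
# on the host, right after the SDK computes the point cloud.
import numpy as np
import pyrealsense2 as rs

T_ws = np.eye(4)  # placeholder: replace with your workspace calibration (meters)

pipe = rs.pipeline()
pipe.start()
pc = rs.pointcloud()

try:
    frames = pipe.wait_for_frames()
    depth = frames.get_depth_frame()
    points = pc.calculate(depth)

    # (N, 3) vertices in the depth (left imager) frame, in meters
    verts = np.asanyarray(points.get_vertices()).view(np.float32).reshape(-1, 3)

    # One vectorized multiply per frame: p_ws = R @ p + t
    verts_ws = verts @ T_ws[:3, :3].T + T_ws[:3, 3]
finally:
    pipe.stop()
```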

 

 

Regards,

 

Alexandra