
How can I get the relative position between D435's RGB camera and the depth camera?

idata
Employee

Hi,

I bought a D435 camera.

I want to calibrate the relative position of the D435 camera to other devices, so I need to measure the relative position between the D435's RGB camera and the depth camera for further work.

So I want to ask whether there is an exact parameter or method for this?

Thank you in advance for taking the time to answer my question.

MartyG
Honored Contributor III

The D435 is a stereoscopic-type camera. This means that it has an RGB sensor and a pair of infrared sensors (left and right). The left and right IR imagers are used to construct a depth image.

If you are aiming to align more than one RealSense camera but do not need their streams to be precisely synchronized, I wonder if the 'Multicam' sample program might meet your needs.

https://github.com/IntelRealSense/librealsense/tree/master/examples/multicam

If you are needing to align a D435 with a non-RealSense device, then multi-camera hardware sync would be the way to go.

https://realsense.intel.com/wp-content/uploads/sites/63/Multiple_Camera_WhitePaper_rev1.1.pdf

It is worth bearing in mind, though, that with a D435, to initiate the synchronization you would also need an additional D415 camera to act as the sync trigger pulse generator, or use an external signal generator.

idata
Employee
646 Views

Thank you, I will give it a try.

Actually, I want to align a D435 with a projector. Currently I have found a method to align an RGB sensor with a projector.

So I want to know whether it is OK to align a D435 with a projector directly by just aligning the RGB sensor.

MartyG
Honored Contributor III

Intel's depth testing guide says about external projectors: "Projectors do not need to be co-located with the depth sensor. It is also acceptable to move or shake the projector if desired, but it is not needed".

https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/BKMs_Tuning_RealSense_D4xx_Cam.pdf

Here is the rest of the text from the guide about external projectors.

**********

a. While the internal projector is very good (shown below), it is really quite low power and is not designed to illuminate very large rooms. It is therefore sometimes beneficial to add one or more additional external projectors. The D4xx camera will benefit from any additional texture projected onto a scene that has no inherent texture, like a flat white wall. Ordinarily, no tuning needs to be done in the D4 VPU when you use any external projectors as long as it is on continuously or flickering at >50KHz, so as not to interfere with the rolling shutter or auto exposure properties of the sensor.

b. External projectors can be really good in fixed installations where low power consumption is not critical, like a static box measurement device. It is also good for illuminating objects that are far away, by adding illumination nearer to the object.

c. Regarding the specific projection pattern, we recommend somewhere between 5000 and 300,000 semi-random dots. We normally recommend patterns that are scale invariant. This means that if the separation between the projector and the depth sensor changes, then the depth sensor will still see texture across a large range of distances.

d. It is possible to use light in visible or near-infrared (ranging from ~400-1000nm), although we normally recommend 850nm as being a good compromise in being invisible to the human eye while still being visible to the IR sensor with about 15-40% reduction in sensitivity, depending on the sensor used.

e. When using a laser-based projector, it is extremely important to take all means necessary to reduce the laser-speckle or it will adversely affect the depth. Non-coherent light sources will lead to best results in terms of depth noise, but cannot normally generate as high contrast pattern at long range. Even when all speckle reduction methods are employed it is not uncommon to see a passive target or LED projector give >30% better depth than a laser-based projector.

jb455
Valued Contributor II

Yes, the Extrinsics object contains the transform (a rotation plus a translation, in metres) between the two sensors.

To get it, depending on what language you're using, you'll need to do something like this:

var extrinsics = depthStream.GetExtrinsicsTo(colourStream);

Alternatively, IIRC, the calibration tools/API (https://downloadcenter.intel.com/download/27955/Intel-RealSense-D400-Series-Calibration-Tools-and-API) include a command-line app for outputting calibration data from a device.
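The `GetExtrinsicsTo` line above is C#. In Python (pyrealsense2) the query looks similar, and the math librealsense applies with the result can be sketched as below. The rotation/translation values here are placeholders (identity rotation, a made-up 15 mm shift), not measured D435 calibration:

```python
# Sketch: applying an rs2_extrinsics-style transform (3x3 rotation stored
# column-major as 9 floats, plus a 3-element translation in metres) -- the
# same math librealsense's rs2_transform_point_to_point performs.

def transform_point(rotation, translation, point):
    """Map a 3D point from one sensor's frame into another sensor's frame."""
    return [
        rotation[0] * point[0] + rotation[3] * point[1] + rotation[6] * point[2] + translation[0],
        rotation[1] * point[0] + rotation[4] * point[1] + rotation[7] * point[2] + translation[1],
        rotation[2] * point[0] + rotation[5] * point[1] + rotation[8] * point[2] + translation[2],
    ]

# With a live camera the values would come from something like:
#   import pyrealsense2 as rs
#   pipe = rs.pipeline(); profile = pipe.start()
#   depth = profile.get_stream(rs.stream.depth)
#   color = profile.get_stream(rs.stream.color)
#   extrinsics = depth.get_extrinsics_to(color)
# Placeholder values for illustration only (identity rotation, 15 mm shift):
rotation = [1, 0, 0, 0, 1, 0, 0, 0, 1]
translation = [0.015, 0.0, 0.0]

# A point 1 m in front of the depth sensor lands 15 mm over in the colour frame.
print(transform_point(rotation, translation, [0.0, 0.0, 1.0]))
```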

idata
Employee

Thank you, I will give it a try.

So the relative position between the two sensors in the D435 is just like this?

RT = [1 0 0 extrinsics
      0 1 0 0
      0 0 1 0
      0 0 0 1]
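In code, the guess above amounts to assembling a homogeneous 4x4 matrix from a rotation and a translation. A minimal sketch (the 15 mm offset is a made-up placeholder, and on a real D435 the rotation is close to, but not exactly, identity):

```python
def homogeneous(rotation, translation):
    """Build a 4x4 RT matrix from a row-major 3x3 rotation and a 3-vector translation."""
    r, t = rotation, translation
    return [
        [r[0][0], r[0][1], r[0][2], t[0]],
        [r[1][0], r[1][1], r[1][2], t[1]],
        [r[2][0], r[2][1], r[2][2], t[2]],
        [0.0, 0.0, 0.0, 1.0],
    ]

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
RT = homogeneous(identity, [0.015, 0.0, 0.0])  # 15 mm placeholder offset
for row in RT:
    print(row)
```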

ROhle
Novice

I don't think there are any numbers for this other than what is in the datasheet.

I believe (but you should double check): the RGB camera is 15mm to the left of IR1... IR2 is 50mm to the right. I think the Viewer takes the center of the baseline as the zero point... so RGB is at -42 mm.

If you are using external analysis ... and want to combine data or compare data with the RealSense results, you need to translate (and possibly rotate) your data if you aren't using the zero point of the camera as the origin of your data universe.

The depth data is already aligned to IR1 when you get it from the Viewer.
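That translation step can be sketched in a few lines of Python; the -42 mm RGB offset below is ROhle's unverified figure, used purely as an example value:

```python
# Sketch: shifting externally captured points into the camera's zero point.
RGB_OFFSET_M = (-0.042, 0.0, 0.0)  # unverified; double-check against the datasheet

def shift_points(points, offset):
    """Translate each (x, y, z) point by the given offset, in metres."""
    ox, oy, oz = offset
    return [(x + ox, y + oy, z + oz) for (x, y, z) in points]

cloud = [(0.0, 0.0, 1.0), (0.1, -0.05, 0.8)]
print(shift_points(cloud, RGB_OFFSET_M))
```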
