
What does adjusting the depth scale do?

AJenc
Beginner

Hi, I have a D435 and was curious: when I adjust the depth scale using the realsense-viewer, does it adjust the camera to use the entire 16-bit resolution, or is it simply a smaller or larger section of that resolution?

Does it actually change the camera to look at a different section in space, or does it just truncate the data?

3 Replies
MartyG
Honored Contributor III

The depth scale value translates pixels to meters. Dorodnic, the RealSense SDK Manager, when asked by another RealSense user whether the depth scale changes over distance, replied that the depth scale is "a fixed number for sensor. With D400 advanced mode APIs you can adjust it a bit, to get better precision up close or further away, but that's a rather advanced use-case".
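
As a minimal illustration of what "translates pixels to meters" means in practice (a sketch assuming pyrealsense2 and numpy are installed and a camera is attached, not code taken from the SDK samples), the raw 16-bit value of a depth pixel multiplied by the sensor's depth scale gives the distance in meters:

# Sketch: raw 16-bit depth value * depth scale = distance in meters.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
profile = pipeline.start()

depth_sensor = profile.get_device().first_depth_sensor()
depth_scale = depth_sensor.get_depth_scale()  # e.g. 0.001 for the default 1 mm unit

frames = pipeline.wait_for_frames()
depth_frame = frames.get_depth_frame()
depth_raw = np.asanyarray(depth_frame.get_data())  # uint16 value per pixel

# Distance at the center pixel, computed two ways: manual scaling and the SDK helper.
y, x = depth_raw.shape[0] // 2, depth_raw.shape[1] // 2
print("raw * scale:  ", depth_raw[y, x] * depth_scale, "m")
print("get_distance: ", depth_frame.get_distance(x, y), "m")

pipeline.stop()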


The 'Projection' section of the RealSense documentation for the 400 Series cameras has some information on the workings of the depth scale.


https://github.com/IntelRealSense/librealsense/wiki/Projection-in-RealSense-SDK-2.0#depth-image-formats


A practical effect of depth scales is that at very close range, the SR300 camera model (with its 1/32 mm depth scale, smaller than the 400 Series' one millimeter) may have greater accuracy than a 400 Series camera at its default setting at the same distance.
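
As a rough back-of-envelope comparison (taking the depth scale figures quoted in this thread at face value), the largest distance a 16-bit depth value can encode is 65535 multiplied by the depth scale, so a smaller scale buys finer steps at the cost of a shorter encodable range:

# Back-of-envelope: max encodable distance = 65535 * depth scale.
D400_DEFAULT_SCALE = 0.001   # 1 mm per unit (400 Series default)
SR300_SCALE = 0.001 / 32     # 1/32 mm per unit, as quoted above

for name, scale in [("D400 default", D400_DEFAULT_SCALE), ("SR300", SR300_SCALE)]:
    print(f"{name}: {scale * 1000:.5f} mm/unit -> up to {65535 * scale:.2f} m")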

AJenc
Beginner

That's nice to know about the SR300, but it's still not quite what I'm asking. I understand what the depth scale is used for, and I am indeed going for the advanced use case here. What I need to know is what happens internally when I adjust the depth scale slider in the realsense-viewer.

More specifically, let's say I move the slider from 0.010 to 0.0001, from the maximum to the minimum: how does the camera adjust its 16-bit depth resolution to compensate for the change? Does it leave the internal scale at 0.01 and simply adjust the display to show only values between 0 and 6.5 m, meaning the camera is really only using about 9 bits of resolution? Or does it change the internal scale to 0.0001 so that the full 16 bits of resolution are used?

MartyG
Honored Contributor III

Apologies for the delay in responding further, as I have been very carefully considering the best way to answer.

I think the best way to put it is to think of the depth scale in terms of 'depth units'. By default, the 1 mm depth scale of the 400 Series translates to depth units of 1000 µm. This gives the camera a sight range (not depth sensing range) of 65 meters.

Page 20 of the Intel camera tuning guide, in the 'Changing the depth units' section, explains how changing the depth unit affects the camera's sight range and vision.

https://realsense.intel.com/wp-content/uploads/sites/63/BKM-For-Tuning-D435-and-D415-Cameras-Webinar_Rev3.pdf
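
If you would rather change the depth unit from code than with the Viewer slider, a sketch along these lines should work (assuming a recent librealsense build that exposes the depth unit as a sensor option; older builds set it through the advanced-mode depth table instead):

# Sketch: set the depth unit to 0.0001 m (0.1 mm) per count, so the 16-bit range
# tops out at roughly 65535 * 0.0001 = 6.55 m, with finer steps than the 1 mm default.
import pyrealsense2 as rs

device = rs.context().query_devices()[0]
depth_sensor = device.first_depth_sensor()

if depth_sensor.supports(rs.option.depth_units):
    depth_sensor.set_option(rs.option.depth_units, 0.0001)
    print("Depth unit is now", depth_sensor.get_option(rs.option.depth_units), "m per count")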
