
Stream Depth And Y16 Simultaneously?

JSmit77
Beginner

Hello. We are currently working on an application that requires both depth and infrared streams of the same subject simultaneously. We are using a D415 with librealsense2.

For the infrared streams, we are trying to identify very subtle changes in light intensity, and after rigorous testing we have determined that the Y8 format lacks the level of detail we require; the Y16 stream, however, shows more promise.

Our issue is that we cannot find a configuration that allows depth images to be streamed alongside Y16 images simultaneously. Is there any configuration (frame rate and resolution) that would allow depth images to be received while streaming Y16? We are not concerned that Y16 images are unrectified and lack debayering; we simply want access to Y16 and depth simultaneously, in any form, and we will work from there (even if we only receive Y16 from one IR sensor).
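For illustration, here is a minimal sketch of the kind of configuration we have been attempting with the librealsense2 C++ API (the resolutions, frame rates and stream index are only examples); on our D415, start() fails to resolve every Y16-plus-depth combination we have tried:

#include <librealsense2/rs.hpp>
#include <iostream>

int main()
{
    rs2::config cfg;
    // Request depth (Z16) alongside Y16 from the left IR imager.
    // The specific resolutions / frame rates here are only examples.
    cfg.enable_stream(RS2_STREAM_DEPTH, 1280, 720, RS2_FORMAT_Z16, 15);
    cfg.enable_stream(RS2_STREAM_INFRARED, 1, 1920, 1080, RS2_FORMAT_Y16, 15);

    rs2::pipeline pipe;
    try
    {
        pipe.start(cfg); // throws if the request cannot be resolved
        rs2::frameset frames = pipe.wait_for_frames();
        rs2::depth_frame depth = frames.get_depth_frame();
        rs2::video_frame ir = frames.get_infrared_frame(1);
        std::cout << "Depth " << depth.get_width() << "x" << depth.get_height()
                  << ", Y16 IR " << ir.get_width() << "x" << ir.get_height() << "\n";
    }
    catch (const rs2::error& e)
    {
        std::cout << "Could not resolve this stream combination: " << e.what() << "\n";
    }
    return 0;
}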

Alternatively, is there any way we can process the Y16 images we receive to turn them into depth images in post? We have been trying to find, within the source code, the exact stereoscopic algorithm that is used to create the depth stream; however, we have yet to find it.

In short, is it possible to stream Y16 and depth simultaneously, or alternatively, is the depth algorithm exposed to users?

1 Solution
MartyG
Honored Contributor III

Intel advises against using Y16 in an application because it is unrectified and intended for use in camera calibration.

Y16 is also very limited in the modes in which it can be used, and whenever depth is enabled, Y16 mode is 'ghosted out', making it unusable for your goal of streaming both IR and depth. Below are the IR-only modes in which Y16 will operate (a configuration sketch follows the list).

 

1920x1080, 25 FPS, with depth stream disabled
1920x1080, 15 FPS, with depth stream disabled
960x540, 25 FPS, with depth stream disabled
960x540, 15 FPS, with depth stream disabled
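As a rough sketch (not tested here), requesting one of the Y16 IR-only modes listed above through the librealsense2 C++ API would look something like this:

#include <librealsense2/rs.hpp>
#include <cstdint>

int main()
{
    // Y16 from both IR imagers; depth must be left disabled for Y16 to be available.
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_INFRARED, 1, 1920, 1080, RS2_FORMAT_Y16, 15); // left imager
    cfg.enable_stream(RS2_STREAM_INFRARED, 2, 1920, 1080, RS2_FORMAT_Y16, 15); // right imager

    rs2::pipeline pipe;
    pipe.start(cfg);

    rs2::frameset frames = pipe.wait_for_frames();
    rs2::video_frame left = frames.get_infrared_frame(1);

    // Each pixel of a Y16 frame is a 16-bit intensity value.
    const uint16_t* pixels = static_cast<const uint16_t*>(left.get_data());
    (void)pixels;
    return 0;
}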

2019 Update: Advice is provided in the link below.

 

https://github.com/IntelRealSense/librealsense/issues/5062#issuecomment-542611322

 

 

*******************

 

If you are interested in depth post-processing, the paper linked to below may be useful.

 

https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/Intel-RealSense-Depth-PostProcess.pdf
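As a sketch of what that paper covers, the SDK's built-in post-processing filters can be chained on each depth frame like this (filter options are left at their defaults here; tuning is discussed in the paper):

#include <librealsense2/rs.hpp>

int main()
{
    rs2::pipeline pipe;
    pipe.start(); // the default configuration includes a depth stream

    // Post-processing blocks described in the paper.
    rs2::decimation_filter dec;       // downsample / reduce noise
    rs2::spatial_filter spat;         // edge-preserving spatial smoothing
    rs2::temporal_filter temp;        // smoothing across consecutive frames
    rs2::hole_filling_filter holes;   // fill small gaps of missing depth

    for (int i = 0; i < 100; ++i)
    {
        rs2::frameset frames = pipe.wait_for_frames();
        rs2::frame depth = frames.get_depth_frame();

        depth = dec.process(depth);
        depth = spat.process(depth);
        depth = temp.process(depth);
        depth = holes.process(depth);
        // 'depth' is now the filtered depth frame.
    }
    return 0;
}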

 

