
SR300. The mapping for a pen from color to depth is not accurate enough

KTien
Beginner

I want to map a point on a pen from the color image to the depth image. I used the demo program to do this. However, the mapped point jumps frequently in the depth image. My device fell on the floor previously, and I wonder whether that is the reason. Could you please tell me the cause? Thanks in advance! Below is the video link:

https://www.youtube.com/watch?v=fNwyM40Gz5w&feature=youtu.be Intel Real Sense SR300. The mapping for a pen from color image to depth image is not accurate? - YouTube

Best regards,

Kawhi

MartyG
Honored Contributor III

As I said in your other post about stripes in the scan pattern, I think your SR300 is probably not suffering from damage. I looked at the YouTube video that you kindly provided and noticed that you are holding the pen in your hand in front of the camera, and your hand is moving. This causes the camera to detect the motion and update its coordinates. The jumping of the coordinates would likely not occur if the pen was in a stationary position, such as resting in a pen pot.

KTien
Beginner

Dear Marty,

Thanks again for your helpful reply.

My goal is to get a stable path of the moving pen in the depth image, and the device is mounted on a VR HMD, so both the pen and the device are moving in my scenario. Is this possible?

I put the pen in a cup. However, if the device moves even a little, the point still jumps.

How about using the function "CreateDepthImageMappedToColor"? What is the difference between "CreateDepthImageMappedToColor" and "MapColorToDepth"?

https://www.youtube.com/watch?v=CV3U5oNmG_E&feature=youtu.be mapping problem - YouTube

MartyG
Honored Contributor III

CreateDepthImageMappedToColor ensures that the depth image and the color image are the same size. MapColorToDepth is different, as it maps color image coordinates to depth coordinates.

If you want to get the actual depth coordinates relating to the distance between the camera lens and the pen, I would recommend ProjectCameraToDepth, which maps the depth to real-world coordinates.

https://software.intel.com/sites/landingpage/realsense/camera-sdk/v1.1/documentation/html/index.html?projectcameratodepth_pxcprojection.html Intel® RealSense™ SDK 2016 R2 Documentation

If the camera that is mounted on your HMD is moving, and the pen is moving (because you are moving it in your hand as you record it with the head-mounted camera), then the depth coordinates will never be static. This is because the distance (depth) between the pen and the camera lens is constantly changing. Even holding the pen stationary would cause change, because the head naturally moves up and down slightly.

KTien
Beginner

Dear MartyG,

What I want is this: given a position in the RGB image, I can get the corresponding position in the corresponding depth image. The point has to be very stable and accurate. It seems that MapColorToDepth can't tolerate even small motions. Any suggestions?

MartyG
Honored Contributor III

By its nature, a hand-held pen is going to be frequently moving towards the camera and away from it, even if the camera is in a fixed position. So the real-world depth is always naturally going to be fluctuating.

From your description though, I am wondering if what you really want is the depth coordinates on the digital image that the camera has recorded, not the distance of the real-life pen object. MapDepthToColor may therefore be the appropriate instruction to use, to map depth coordinates onto the color image.

https://software.intel.com/sites/landingpage/realsense/camera-sdk/v1.1/documentation/html/index.html?mapdepthtocolor_pxcprojection.html Intel® RealSense™ SDK 2016 R2 Documentation

KTien
Beginner

No. I do want the distance of the real-life pen, in order to recover the moving path of the pen. I can get the position of the pen in the RGB image using some CV methods. Then I want to know the corresponding depth value of the pen via the mapping from RGB to depth. The mapping has to be stable and accurate.

KTien
Beginner

I mean the 3D moving path of a pen.

idata
Employee

Hello Kawhi,

Thank you for contacting Intel® RealSense™ Technology support.

Your case number is 03063707.

In order to assist you further, we would like to know the following basic information:

• Have you tried using Librealsense in your project?

• If you have, did you notice any difference when using it?

• If you have not, we recommend downloading Librealsense from:

https://github.com/IntelRealSense/librealsense

Please follow the installation guide; after the installation process is complete, open librealsense/examples/ and use cpp-alignimages.cpp.

• If you have questions about Librealsense, you can visit our frequently asked questions article for Librealsense by following this link:

https://www.intel.com/content/www/us/en/support/articles/000022705/emerging-technologies/intel-realsense-technology.html

We look forward to your reply.

Best regards,

Josh B.

Intel Customer Support

 

idata
Employee

Hello Kawhi,

Thank you for contacting Intel Technical Support.

This email is just a friendly reminder that case number 03063707 remains open.

Should you need our technical assistance, please do not hesitate to contact us again.

Best regards,

Josh B.

Intel Customer Support
KTien
Beginner

Hi Josh, I've solved my problem. I tried librealsense, and the depth image is better, so I will use it in the future. Now I want to know how to reduce the interference caused by the HTC Vive, since I use the HTC Vive and RealSense together. I tried the vibrating-motor method recommended by MartyG:

https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/shake27n27Sense.pdf

But it doesn't work. Do you have any suggestions? How about changing the wavelength of the RealSense? Thanks in advance.

MartyG
Honored Contributor III

I'm very glad to hear you fixed your problem using Librealsense!

Sorry to hear that the vibrating motor suggestion did not work for you. As the Vive's Lighthouse spins its depth sensing laser at over a thousand revolutions per minute (RPM), I would imagine that the interference-causing pattern gets refreshed much faster than the motor can compensate for.

Josh: the use of HTC Vive was mentioned in another of KawhiTien's posts.

The new D435 RealSense camera can sync its stream with non-Intel hardware devices, so it may provide better interference resistance than the SR300.

If you wish to keep using the SR300 though, your options for reducing interference could include:

1. Using an external shutter device with your SR300 (though these are expensive).

2. Spinning the camera on a turntable, like how the Vive Lighthouse laser is spun. In theory this would put the two interfering lasers out of sync, so that there were more occasions when one laser was pointing at the target whilst the other was facing away from it, giving one of the cameras a chance to take a clean shot of the target (since the other laser is projecting in the opposite direction at that moment). Given the rapid spin rate of the Vive laser, though, this may greatly reduce the number of opportunities the SR300 would have to get a clean scan without the Vive's laser intermixing with it.

It reminds me of an episode of Star Trek: The Next Generation where they wanted to transport to another ship but couldn't because its shields were raised and you can't transport through shields. The transporter operator knew that on that model of ship, there was a 0.5 second window in the shield's frequency cycle when a hole opened in the shields to let sensor scans pass through it, and in that half-second the transporter could beam a person through the shield onto the ship. In a way, spinning the SR300 would be like looking for that 0.5 second window when the disruption created by the other device is not in effect.

In another discussion on this subject, where the problem of spinning the camera without tangling the USB cable came up, the use of a "USB slip ring" was suggested to keep the cable centered in a hole in the middle of the turntable.

KTien
Beginner

Marty,

Could you please tell me more about external shutters? Is there any product I can buy?

MartyG
Honored Contributor III

I am not an expert on external shutters, but I will try to provide some guidance based on some research that I did.

It is important to recognize that there are external shutters (which go on the outside of your camera) and shutter remotes (which trigger the existing shutter inside your camera). These are two different things. I believe it is the external shutter that you need, not the cheaper remote. A good search term for relevant results is 'high speed external camera shutter'.

The top search result for an external shutter in my own search was the Cognisys model. This is also recommended by other people. I was not able to find an alternative to the Cognisys that can be purchased.

https://www.cognisys-inc.com/products/high-speed-shutter/high_speed_shutter.php Cognisys - High Speed Shutter

Some people have built their own external shutter instead of buying one. Here are some examples:

https://petermobbs.wordpress.com/2014/10/25/a-high-speed-external-shutter-for-daylight-strobe-photography/ A High Speed External Shutter for Daylight Strobe Photography | petermobbs

https://www.flickr.com/photos/fotoopa_hs/sets/72157631973998729/
