
RealSense Q&A by Brian Pruitt, RealSense Peripheral Segment manager, Intel

MartyG
Honored Contributor III

Hi everyone,

Brian Pruitt, RealSense Peripheral Segment manager at Intel, did a webinar session on Tuesday and answered questions put to him by attendees. I've posted them below for the RealSense community.

**********

1. Will you share slides later?

I do not plan to; check out our website for all the info!

2. What is the difference between active and passive?

Active uses an emitter to add texture to a scene, allowing the two imagers to differentiate features. Passive does not, meaning scenes with little texture, like a flat, same-colored wall, will be harder to determine depth for.

3. What is the z-error described here? Depth error?

Yes – Depth Error

4. Can multiple D435 sensors have their shutters synchronized?

Yes – Check out our whitepaper on http://realsense.intel.com

5. Are you planning to support Windows 7??

No – We would only consider porting to Windows 7 if there were a really large unit commitment. In short, we decided to use Microsoft's current release when we designed and built the camera.

6. How can I find out more about multi-sensor synchronization?

Check out our whitepaper on http://realsense.intel.com
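For reference (not covered in the webinar answer itself), enabling hardware sync looks roughly like the sketch below with the SDK's Python wrapper, pyrealsense2. The mode values (0 = default, 1 = master, 2 = slave) are the ones described in the multi-camera whitepaper, and the sketch assumes the attached cameras are already wired together for sync.

# Hedged sketch: designate the first detected camera as sync master and the
# rest as slaves via the inter_cam_sync_mode option (values per the whitepaper).
import pyrealsense2 as rs

ctx = rs.context()

for i, dev in enumerate(ctx.query_devices()):
    depth_sensor = dev.first_depth_sensor()
    if depth_sensor.supports(rs.option.inter_cam_sync_mode):
        mode = 1 if i == 0 else 2   # 1 = master, 2 = slave
        depth_sensor.set_option(rs.option.inter_cam_sync_mode, mode)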

7. Is there a way to switch between presets for reconstruction through the SDK?

Yes – check out the SDK – if we don't say anything in the doc, please post in our community!
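As an illustration only (the answer above just points at the SDK docs): with the Python wrapper, pyrealsense2, switching the depth preset at runtime can look like the sketch below. The choice of the High Accuracy preset here is arbitrary.

# Hedged sketch: switch the D400 visual preset through the SDK at runtime.
import pyrealsense2 as rs

pipeline = rs.pipeline()
profile = pipeline.start()

depth_sensor = profile.get_device().first_depth_sensor()
if depth_sensor.supports(rs.option.visual_preset):
    # Other values in rs.rs400_visual_preset include high_density, default, etc.
    depth_sensor.set_option(rs.option.visual_preset,
                            int(rs.rs400_visual_preset.high_accuracy))

If you are working with the downloadable JSON preset files instead, the SDK's advanced-mode interface (rs400_advanced_mode with load_json) can load those.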

8. The R200 SDK included a rich feature set. Is the previous SDK compatible with the D400 series?

No – We focused more on depth across multiple OSes and wrappers. This was in much higher demand than the other features.

9. You mentioned the possibility for 3rd-party software developers, like 3DiVi, to align with Intel. How can we as a 3rd party approach Intel with our ideas & proposals?

Please contact us via our website!

10. What's the price on the 30-packs?

Check with your favorite distributor.

11. Thanks! Is the D4 processor based on the low power Movidius acquisition?

No, the D4 VPU is a fixed-function processor, allowing for low power. Movidius allows for different vision processing.

12. How soon can we get the D435 units?

It's about a 9-10 week delay, but getting better!

13. Can you talk about the T260 tracking module?

Check out our website. I bet there will be a webinar in the future…

14. What is the price for the bundle?

Check with your distributor.

15. How does the D4 VPU stand up to other solutions?

We think it holds up quite well for a fixed-purpose processor. Check out the material we have online.

16. When will the skeletal tracking be available?

Now – Check out 3diVi and Nuitrack.

Edit by Marty: 3diVi is a company that offers software called Nuitrack.

https://www.youtube.com/watch?v=gMPtV4NXtUo Intel RealSense D415/D435 and Nuitrack skeletal tracking SDK replace Kinect SDK - YouTube

17. Does reconstruction use GPU processing?

We provide depth and RGB information. After it reaches the compute platform, it is up to the developer to determine what is done with it.
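To make the division of labour concrete, here is a minimal, hedged sketch (mine, not Intel's) of that developer-side step with pyrealsense2: the camera delivers depth frames, and whatever runs on the compute platform, CPU or GPU, builds the point cloud and any reconstruction on top of them.

# Hedged sketch: turn one depth frame from the camera into a point cloud on the host.
import pyrealsense2 as rs

pipeline = rs.pipeline()
pipeline.start()
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()

    pc = rs.pointcloud()
    points = pc.calculate(depth)      # computed on the host, not the camera
    vertices = points.get_vertices()  # (x, y, z) triples in metres
finally:
    pipeline.stop()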

18. Are there any object recognition databases available for developers?

We provide depth. Over time we hope the community and third parties will provide middleware like object recognition.

19. The camera keeps crashing with laptops (prob. due to the USB port not providing enough power due to power saving features). Will/can this problem be resolved?

Please submit a ticket on our website. We do not see this issue as long as you are using USB 3.

20. What are the options to provide wireless communications with the D400 series cameras?

None built in – you could plug the camera into a compute board and then have the compute board send data to the cloud or wherever.

21. I am interested in point cloud processing including finding nearest NURB and conical surfaces.

Great – Check out our website and if you have a specific need not answered please post!

22. Are you planning to add a microphone array to the camera package, similar to MS Kinect v2?

No – Microphones were used in <1% of the previous kits we made.

23. What are the possibilities for developers to obtain a camera for test purposes (e.g. as with glasses for AR)?

Yes – feel free to purchase from our website or any distributor.

24. Is the D435 camera limited to 90 FPS?

Yes.

25. What is the frame rate of your fastest module?

90 FPS

26. For high-res head tracking the SR300 seems most suitable. At 1m distance, can the D400 series achieve comparable accuracy?

Yes.

27. How fast is the reconstruction?

Depends on the software doing the reconstruction and the SoC powering it. Remember, we provide the data portion. We do all the algorithm processing at the camera, so that part is done.

28. How accurate is measuring object length (X, Y), assuming Z is distance?

<2% error @ 4m (roughly 8cm). More accurate closer, less accurate further away. It varies depending on conditions.

29. Will the LabVIEW adapter allow us to visualize the 3D point cloud similarly to Intel's platform, or is it an adapter for intercepting depth data into the environment?

Please check out our site for more on LabView!

30. Is access to these cameras limited to RGB in current web browsers?

RGB and depth are provided via all the different OSes and wrappers. See our website for details.

31. What are the output formats for pointcloud and images?

Please check out our website for the different output formats.
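As a rough, unofficial illustration of the kind of formats involved: with pyrealsense2 you can request depth as Z16 and colour as (for example) RGB8, and the SDK's point-cloud object can be exported to a .ply file. The resolutions and the output filename below are just placeholders.

# Hedged sketch: request explicit stream formats and save one point cloud as PLY.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 1280, 720, rs.format.z16, 30)   # 16-bit depth
config.enable_stream(rs.stream.color, 1280, 720, rs.format.rgb8, 30)  # 8-bit RGB
pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    color = frames.get_color_frame()

    pc = rs.pointcloud()
    pc.map_to(color)                            # texture the cloud with the RGB stream
    points = pc.calculate(depth)
    points.export_to_ply("capture.ply", color)  # one common point-cloud format
finally:
    pipeline.stop()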

32. Is there any way to use any of the cameras with one (or several) USB 2.0 ports?

Not today but maybe in th...

PSnip
New Contributor II

Hi Marty,

Thanks for sharing.

ARyba
Beginner

Thanks for sharing. I second the request for USB2 protocol support in the next DXXX F/W update. At 720p or lower resolution, USB2 would cover most needs. Also, USB2 cables are plentiful, while finding a good working USB3 cable of the proper length and flexibility is close to impossible.

-albertr

MartyG
Honored Contributor III

Have you seen this, Albert?

ARyba
Beginner

Marty, actually we are trying to use the D415 camera on a small drone, so we need a roughly 15cm cable that is fairly flexible and preferably light. Intel's choice to put the USB Type-C connector on the side of the camera doesn't help either; it would be easier if the connector were on the back of the camera. So far I have not been able to locate any working 15cm USB 3.0 cable that can be used with this camera. There would be many more choices if the camera supported the USB2 protocol instead.

-albertr

MartyG
Honored Contributor III

The D415 can be used in a USB 2.0 port with the short USB 3.0 cable that comes with the camera. It starts in USB2 mode, which limits it to 480x270 resolution only. I do not have a USB 2.0 version of the cable to test whether that works too, though.

ARyba
Beginner

Marty, thanks! But 480x270 doesn't cut it... we need 1280 x 720 @ 30 fps. I don't think one needs 5Gbit USB3 SuperSpeed for that; even with two concurrent streams, shouldn't USB2 be able to handle it? Also - do we really need raw output from the RGB sensor? I wouldn't mind MJPEG compression if it can be added to the next firmware release.

-albertr
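For reference, a rough back-of-the-envelope estimate of the raw bandwidth in question, assuming the camera's usual uncompressed formats (Z16 depth and YUY2 colour, both 2 bytes per pixel); the numbers are approximate:

# Hedged estimate: raw bandwidth for two 1280x720 @ 30 fps streams at 2 bytes/pixel.
bytes_per_frame = 1280 * 720 * 2
mbit_per_stream = bytes_per_frame * 30 * 8 / 1e6   # ~442 Mbit/s each
total_mbit = 2 * mbit_per_stream                    # ~885 Mbit/s combined

print(round(mbit_per_stream), round(total_mbit))
# Two raw streams exceed USB 2.0's 480 Mbit/s signalling rate, which is why
# compressing the RGB stream (e.g. MJPEG) would make a USB2 mode more practical.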

MartyG
Honored Contributor III

The settings offered in USB2 mode (424 x 240 or 480x270 for depth and IR) seem to be part of a "Low Res" setting that is chosen for this mode, and I could not find a way to change to a higher resolution on a USB 2 connection.

An RGB option is not provided by the RealSense camera in this mode, although you can add RGB from another camera with the Add Source option at the top of the RealSense Viewer. I was able to add a 640x480 stream from my laptop webcam with this. I don't know if you can get a higher resolution from attaching a better webcam - you might want to try that. It does not solve the depth resolution restriction though, of course.
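For what it's worth, explicitly requesting that low-res depth profile from code looks something like the sketch below (a pyrealsense2 sketch; whether this profile is actually accepted over a USB 2 link may depend on firmware):

# Hedged sketch: ask for the 480x270 depth profile mentioned above.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 480, 270, rs.format.z16, 30)
pipeline.start(config)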

You can see the release notes for firmware, which contain hints of what fixes are planned for upcoming firmware releases, by clicking on the firmware link on the SDK download page.
