
how to process depth xml data

GGabr4
Beginner

Using the R200, we were able to save the depth image for each frame as a JPG and the depth data as XML.

We used OpenCV's FileStorage to store the depth data in XML.

Attached is the XML file.

The XML data contains the depth in mm for each pixel of the image.
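
For reference, a minimal sketch of reading one of these XML files back with OpenCV's FileStorage in Python (the file name and the "depth" node name below are placeholders for whatever was used when writing):

import cv2
import numpy as np

# Open the XML written by OpenCV's FileStorage and pull the depth matrix out.
# The node name "depth" is an assumption -- use the name the matrix was saved under.
fs = cv2.FileStorage("frame_0001.xml", cv2.FILE_STORAGE_READ)
depth = fs.getNode("depth").mat()          # 2D array, depth in millimetres
fs.release()

print(depth.shape, depth.dtype)
h, w = depth.shape[:2]
print("Depth at image centre (mm):", depth[h // 2, w // 2])

# Pixels with no valid depth reading are typically stored as 0.
print("Valid depth pixels:", np.count_nonzero(depth > 0))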

Do you have any ideas on how this depth data could be used? We want to use it for skeleton modeling.

MartyG
Honored Contributor III

I'm not sure there is an easy, definitive answer to this question. Traditionally, with older (pre-400 Series) RealSense cameras, the two main ways to store RealSense data in a file that will work outside of the RealSense SDK are to place the data in an image (as you did) or to convert the RealSense stream data into a MATLAB .mat format file.

https://www.mathworks.com/help/pdf_doc/matlab/matfile_format.pdf
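
If the .mat route is of interest, here is a rough sketch of converting one of the saved XML frames (again assuming a "depth" node and placeholder file names):

import cv2
from scipy.io import savemat

# Load the depth matrix from the OpenCV XML file.
fs = cv2.FileStorage("frame_0001.xml", cv2.FILE_STORAGE_READ)
depth = fs.getNode("depth").mat()
fs.release()

# Write it out as a MATLAB .mat file so it can be processed outside the RealSense SDK.
savemat("frame_0001.mat", {"depth_mm": depth})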

Could you clarify what you are trying to achieve in your project please? Are you trying to detect the skeleton or to track its joint points? Thanks!

GGabr4
Beginner

We want to track the joint points of the upper body, but using the third dimension, which is the depth. We know that we can track the movement in 2D just by using the images, but our main goal is to use the depth data to move a skeletal model in Unity; the skeletal model is supposed to follow the movement captured by the camera. Does that make sense?
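
To make the "third dimension" part concrete, a 2D joint position in the image plus its depth value can be lifted into a 3D camera-space point with the standard pinhole model. A minimal sketch in Python (the intrinsics below are placeholders, not the R200's real calibration):

import numpy as np

# Placeholder intrinsics -- the real fx, fy, cx, cy come from the camera's calibration.
FX, FY = 580.0, 580.0
CX, CY = 320.0, 240.0

def deproject(u, v, depth_mm):
    # Convert a pixel (u, v) and its depth in mm to a 3D point in metres.
    z = depth_mm / 1000.0            # mm -> metres
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.array([x, y, z])

# Example: a joint detected at pixel (350, 200) with 1500 mm depth becomes a
# 3D camera-space position that could drive a skeletal model in Unity.
print(deproject(350, 200, 1500))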

MartyG
Honored Contributor III

If you are using Unity then there is a much simpler solution than recording the depth data to a file. The '2016 R2' RealSense SDK comes with a program called the Unity Toolkit. This can be used to import RealSense support into a Unity project, along with a number of useful pre-made tracking scripts that can be dropped into Unity objects to control them with camera input.

One of these scripts is called TrackingAction. Once placed in an object, you can configure it in Unity's Inspector panel with menus to move that object in response to the face, without needing any programming knowledge.

The Unity Toolkit can be found in the SDK's RSSDK > framework > Unity folder. To run it, you should first open your Unity project, and then run the Unity Toolkit file. This causes a list of RealSense toolkit files to pop up in your Unity window, and gives you the option to click 'Import' to import them automatically into your project, along with the camera driver files.

Whilst the R200 can only provide face tracking inputs, as it doesn't have hand joint tracking, the movement of the head can be used to infer the movement of other parts of the body. For example, if the face moves towards the camera in the Z-depth direction, that can be interpreted as the waist bending forwards, whilst moving the head back from the camera can be interpreted as straightening the waist up.
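
As a rough illustration of that idea outside Unity (the distance range and maximum bend angle below are arbitrary assumptions, purely to make the mapping concrete):

# Map the tracked face's distance from the camera (metres) to a waist bend angle.
NEAR_M, FAR_M = 0.5, 1.2        # face close to / far from the camera
MAX_BEND_DEG = 60.0             # fully bent forward at NEAR_M

def waist_bend_angle(face_z_m):
    t = (FAR_M - face_z_m) / (FAR_M - NEAR_M)   # 0 when upright, 1 when leaning in
    t = max(0.0, min(1.0, t))                   # clamp to [0, 1]
    return t * MAX_BEND_DEG

print(waist_bend_angle(1.2))    # 0.0  -> standing straight
print(waist_bend_angle(0.5))    # 60.0 -> bent fully forward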

Below is an old YouTube video from my own Unity project that demonstrates how the joints of a model can be realistically moved with simple camera inputs.

https://www.youtube.com/watch?v=T_V8SNwLTrY 'My Father's Face' Tech Trailer 7 - YouTube

I have a large range of published step-by-step Unity RealSense guides for the 2016 R2 SDK here:

https://software.intel.com/en-us/forums/realsense/topic/676139 Index of Marty G's RealSense Unity How-To Guides
