
Is there a way to get the Person Tracking module working in Unity for the SDK R2?

DCunn2
Beginner
2,302 Views

I want to use Person Tracking in Unity. Is there a way to get it working, and if so, how can I do it?

0 Kudos
16 Replies
MartyG
Honored Contributor III
744 Views

Somebody else asked this question back in December 2015. Intel support staffer David Lu replied that Intel had not yet had the chance to create a Unity version of the Person Tracking C++ sample. That situation remains the case: a Unity sample was never made and is unlikely to be, since support for the R200 in the official RealSense SDK ended after '2016 R2'.

Trying to use the C++ Person Tracking sample in Unity would likely be a hassle. Since it's not written in C# or JavaScript, Unity would have to treat it as a plugin file (like the camera DLL files) rather than as a script.

https://docs.unity3d.com/Manual/Plugins.html Unity - Manual: Plugins
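As a rough illustration of what the plugin route involves: the C++ code would have to be compiled into a native DLL and exposed to Unity C# through P/Invoke declarations along these lines (the DLL name and exported functions below are purely hypothetical, since no such wrapper exists for Person Tracking):

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

public class PersonTrackingPlugin : MonoBehaviour
{
    // Hypothetical exports -- a real wrapper would expose whatever
    // C++ entry points you compile into the native DLL yourself.
    [DllImport("MyPersonTrackingWrapper")]
    private static extern int PT_Init();

    [DllImport("MyPersonTrackingWrapper")]
    private static extern int PT_QueryPersonCount();

    void Start()
    {
        if (PT_Init() != 0)
            Debug.LogError("Failed to initialise the native tracking DLL.");
    }

    void Update()
    {
        Debug.Log("Persons detected: " + PT_QueryPersonCount());
    }
}
```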

To be honest, the Person Tracking feature is very hard to get working even when programming it outside of Unity. Trying to get it working in Unity would add a whole new layer of complication on top. Also, it was classified as a Preview feature and so only had limited functionality before development of it ceased when official SDK support for the R200 ended after R2.

There are ways to simulate full body motion in Unity that are much easier, if you think cleverly about how to use hand and face tracking. Although the R200 does not have hand joint tracking, you can use Blob Tracking to achieve the same control by tracking the palm of the hand with it.

Check out the full-body avatar with complete virtual limb control in my own Unity project.

https://www.youtube.com/watch?v=T_V8SNwLTrY 'My Father's Face' Tech Trailer 7 - YouTube

Edit: I realize I made the assumption that you were using an R200 camera because that is what most people who are interested in Person Tracking use. The SR300 has Person Tracking too though. Which camera are you using please?

0 Kudos
DCunn2
Beginner
744 Views

Sorry for the late answer. I'm using the SR300 camera. Thank you for your response!

I also had the idea to use Blob Tracking, but I thought it might be easier to get the Person Tracking module working in Unity than to completely re-implement person tracking with the Blob Tracking module. As far as I know, there are also no examples for Blob Tracking in Unity?
0 Kudos
MartyG
Honored Contributor III
744 Views

Blob Tracking was removed in the latest '2016 R3' SDK, according to its release notes. So you would have to use the '2016 R2' SDK to use that feature. The notes say that some features were removed in R3 as Intel refocuses the SDK's feature list based on developer feedback.

Blob Tracking is supported in Unity as part of the 'TrackingAction' script. In the TrackingAction's 'Tracking Source' drop-down menu, you can choose Blob Tracking instead of Hand Tracking.

0 Kudos
DCunn2
Beginner
744 Views

Hello,

I had success getting the Blob module working with the 2016 R2 SDK in Unity.

But I don't know how to use it. Can anybody give me a hint or an example?

0 Kudos
MartyG
Honored Contributor III
744 Views

Are you using the TrackingAction implementation of blob tracking, or did you write your own blob tracking script?

To use blob tracking, you should simply have to move a large flat-ish area of the body such as the forehead or the hand palm close to the camera in order for tracking to be activated.

0 Kudos
idata
Employee
744 Views

Hello realID,

I was wondering if you could check the questions asked by MartyG.

If you have any other questions or updates, don't hesitate to contact us.

Regards,

Andres V.
0 Kudos
DCunn2
Beginner
744 Views

I did use the TrackingAction implementation, the one from the screenshot you provided. I don't know what I am doing wrong.

How can I make the tracking visible, or should this happen automatically like with the hand and face tracking actions?

0 Kudos
MartyG
Honored Contributor III
744 Views

When using TrackingAction, the most obvious evidence that tracking is occurring is that the object that contains the TrackingAction is moving when you move your hand in front of the camera. If it does not seem to be moving, please bear in mind that with Blob Tracking, you need to put your hand much closer to the camera than with the more advanced Hand Tracking method.

Another way to see if your object is reacting to your hand movements is to run your project in the editor's test mode with the object containing your TrackingAction highlighted, so that the details of the TrackingAction script are shown in Unity's Inspector panel. If the Position or Rotation values at the top of the Inspector are changing during tracking, then the object is reacting to your input.
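If you want something more explicit than watching the Inspector, a small debug script on the same object will print to the Console whenever tracking actually moves it. This is plain Unity API, nothing RealSense-specific:

```csharp
using UnityEngine;

// Attach to the object that carries the TrackingAction.
// Logs whenever tracking moves or rotates the object.
public class TrackingDebugLogger : MonoBehaviour
{
    private Vector3 lastPosition;
    private Quaternion lastRotation;

    void Start()
    {
        lastPosition = transform.localPosition;
        lastRotation = transform.localRotation;
    }

    void Update()
    {
        if (transform.localPosition != lastPosition ||
            transform.localRotation != lastRotation)
        {
            Debug.Log("Tracking input detected: pos=" + transform.localPosition +
                      " rot=" + transform.localRotation.eulerAngles);
            lastPosition = transform.localPosition;
            lastRotation = transform.localRotation;
        }
    }
}
```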

Another thing to remember is that unless you set Position constraints in the 'Constraints' section of the TrackingAction, it may seem as though your object disappears when the hand is detected, because the object moves so far that it goes offscreen. Position constraints ensure that the object can only move a certain distance on the screen before it is forced to stop.

It would be very helpful if you could provide an image of your TrackingAction's configuration, including having the constraints section of it expanded open so we can see the settings you have in there. Thanks!

0 Kudos
DCunn2
Beginner
744 Views

Hello Marty,

I got it working now. I had to alter the SenseToolkitManager instance, because the BlobExtractor class was deprecated in my version of the SDK. I only had to change BlobExtractor to BlobModule and it's working now.
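For anyone finding this later: stripped of the Toolkit wrapper, the blob pipeline the SenseToolkitManager sets up looks roughly like the sketch below. I'm writing the PXCM* names from memory of the 2016 R2 C# API, so double-check them against your SDK version:

```csharp
using UnityEngine;

public class BlobPipelineSketch : MonoBehaviour
{
    private PXCMSenseManager senseManager;
    private PXCMBlobData blobData;

    void Start()
    {
        senseManager = PXCMSenseManager.CreateInstance();
        senseManager.EnableBlob();                    // turn on the blob module
        PXCMBlobModule blobModule = senseManager.QueryBlob();
        blobData = blobModule.CreateOutput();         // container for per-frame blob results
        senseManager.Init();
    }

    void Update()
    {
        if (senseManager.AcquireFrame(true) >= pxcmStatus.PXCM_STATUS_NO_ERROR)
        {
            blobData.Update();                        // refresh blob results for this frame
            Debug.Log("Blobs detected: " + blobData.QueryNumberOfBlobs());
            senseManager.ReleaseFrame();
        }
    }

    void OnDisable()
    {
        if (senseManager != null) senseManager.Dispose();
    }
}
```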

Now, what would I need to do in order to recognize the right body parts? I guess I need to extract the segmentation image and compare it to individual images of the right body parts. Is this possible?

0 Kudos
MartyG
Honored Contributor III
744 Views

Your idea about segmentation sounds feasible, but you should not try to match the images too strictly or it will only recognize the body that the comparison images were taken from.

0 Kudos
DCunn2
Beginner
744 Views

I'm a little bit stuck here, actually. To be honest, I don't know how I could compare the segmentation images. Could you please give me a hint? Or is it a better approach to compare the contour data of a blob?
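The only idea I have so far is a crude overlap score between two binary masks, along the lines of the generic sketch below (this is plain image maths, not a RealSense SDK call), but I don't know whether that's even the right direction:

```csharp
// Loose comparison of two binary masks of equal size.
// Returns intersection-over-union in the range 0..1; accepting a
// match at, say, IoU > 0.5 avoids demanding exact pixel equality.
public static float MaskIoU(bool[,] a, bool[,] b)
{
    int intersection = 0, union = 0;
    for (int y = 0; y < a.GetLength(0); y++)
    {
        for (int x = 0; x < a.GetLength(1); x++)
        {
            if (a[y, x] && b[y, x]) intersection++;
            if (a[y, x] || b[y, x]) union++;
        }
    }
    return union == 0 ? 0f : (float)intersection / union;
}
```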

0 Kudos
MartyG
Honored Contributor III
744 Views

This subject is outside of my knowledge, sadly. Hopefully someone else reading this today can offer useful advice on an approach for you to take. Good luck!

0 Kudos
DCunn2
Beginner
744 Views

Yeah, I hope so. Anyway, thank you! You're always helping!

0 Kudos
MartyG
Honored Contributor III
744 Views

In Unity, I simulated recognition of the movement of most body parts (shoulders, lower arms, waist, etc.) by using a method I developed called Reverse Thinking. Instead of trying to track the body points that RealSense can't follow, I track a hand or face point and then use scripting to calculate automatically how the body sections influenced by those hand and face points should move.

For example, if my hand palm moves up in front of the camera then my system knows that the lower arm should be lifting a little and the shoulder lifting a lot. If I lower my head then the system recognizes that the waist joint should be bending, causing the upper body to lean forwards. By applying this method of thinking, a few tracked points can be used to work out the positions of most of the body's parts.
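As a heavily simplified sketch of the principle in Unity (the joint setup and multipliers below are illustrative only, not the tuned values from my project):

```csharp
using UnityEngine;

// Simplified 'Reverse Thinking' sketch: one tracked point (the palm)
// drives the joints above it by fixed proportions. Real anatomy
// needs much more careful tuning than these illustrative multipliers.
public class ArmFromPalm : MonoBehaviour
{
    public Transform palmTarget;   // the object moved by the camera's tracking
    public Transform shoulder;     // upper-arm joint
    public Transform elbow;        // lower-arm joint

    void Update()
    {
        // Map the palm's height to a -1..1 lift value.
        float lift = Mathf.Clamp(palmTarget.localPosition.y, -1f, 1f);

        // The shoulder does most of the lifting, the elbow a little.
        shoulder.localRotation = Quaternion.Euler(-60f * lift, 0f, 0f);
        elbow.localRotation    = Quaternion.Euler(-15f * lift, 0f, 0f);
    }
}
```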

Here's an old video from my project that demonstrates these tracking principles with a full-body avatar.

https://www.youtube.com/watch?v=T_V8SNwLTrY 'My Father's Face' Tech Trailer 7 - YouTube

0 Kudos
Michael_M_Intel1
Employee
744 Views

MartyG, I believe the accepted name for what you call "Reverse Thinking" is actually "Inverse Kinematics". Basically, given the positions of some key joints, and some simple constraints (for instance, as joints move, they should stay as close as possible to their old positions, smoothness, energy minimization, etc) you can solve for some additional "free" joints. This is used a lot in animation and I'm sure Unity supports some variant of it. If not, you can implement it using a nonlinear solver. A reference (refers to robotics, but the same algorithms work for animation):

https://en.wikipedia.org/wiki/Inverse_kinematics Inverse kinematics - Wikipedia

Of course you still need to solve the problem of finding the positions of the "key joints".
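For reference, Unity's built-in humanoid IK will solve the 'free' arm joints from a single hand target along these lines (this assumes a Humanoid rig with 'IK Pass' enabled on the Animator layer):

```csharp
using UnityEngine;

// Requires a Humanoid avatar and 'IK Pass' ticked on the Animator layer.
// Unity solves the elbow and shoulder automatically from the hand target.
[RequireComponent(typeof(Animator))]
public class HandIK : MonoBehaviour
{
    public Transform rightHandTarget;  // e.g. an object moved by camera tracking
    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void OnAnimatorIK(int layerIndex)
    {
        animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 1f);
        animator.SetIKPosition(AvatarIKGoal.RightHand, rightHandTarget.position);
    }
}
```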

0 Kudos
MartyG
Honored Contributor III
744 Views

There may be similarities to the principles of Inverse Kinematics, but Reverse Thinking really does involve thinking about and studying how anatomy moves before you can program the untracked body parts to react correctly. It took multiple anatomy studies over a period of two years to achieve that real-time control of the avatar's limbs, because there are a lot of optical illusions in how limb parts move.

If you look at the arm whilst it is moving, you may draw conclusions about how the shoulder, armpit and lower arm are moving, but subsequent studies - especially of what the armpit is doing - may disprove those earlier conclusions. For example, an arm lift involves the shoulder lifting upwards diagonally whilst simultaneously moving forwards. This causes the upper arm to move sidewards, widening the armpit gap between arm and torso.

The early versions of the avatar had limb movements that looked okay but were technically incorrect, moving more like a toy action figure's (arms that lift and drop by rotating on the spot in the socket!). They could also perform motions that would require broken bones to achieve in real life, while some motions that a human can do couldn't be replicated because the joints couldn't travel in a certain direction. So it was late in development before the avatar was capable of exotic poses like the Kamehameha Wave from Dragon Ball (one hand at the side, the other reaching across the front of the body to meet it, and then both hands pushing forwards).

https://www.youtube.com/watch?v=fK26qKohh_Y Dragon Ball Super - Goku's first Kamehameha - YouTube

Thanks for the great tech discussion, McCool.

0 Kudos