Hi!
It seems that at least the rotation of the hand palm is not being calculated correctly:
I'm using the code as follows to read the position and rotations of the palm and the joints in Unity:
int handId = GetHandId(handIndex);
// Query data for the received hand IDs
PXCMHandData.IHand iHand;
_handAnalyzer.QueryHandDataById(handId, out iHand);
// Get hand position and rotation
Vector3 sensorHandPosition = iHand.QueryMassCenterWorld();
Data.Hands[handId].PalmCenter = ToUnitySpace(sensorHandPosition);
Quaternion sensorHandRotation = iHand.QueryPalmOrientation();
Data.Hands[handId].PalmRotation = ToUnitySpace(sensorHandRotation);
for (int i = 2; i < PXCMHandData.NUMBER_OF_JOINTS; i++)
{
    PXCMHandData.JointData iJoint;
    iHand.QueryTrackedJoint((PXCMHandData.JointType)i, out iJoint);
    int fingerId = Mathf.FloorToInt((i - 2.0f) / 4.0f);
    Finger.FingerType fingerType = (Finger.FingerType)fingerId;
    // Calculate joint id
    int jointId = i - fingerId * 4 - 1;
    Vector3 absolutePosition = ToUnitySpace(iJoint.positionWorld);
    Data.Hands[handId].Fingers[fingerType].Joints[jointId].AbsolutePosition = absolutePosition;
    Vector3 relativePosition = Data.Hands[handId].CalcRelativeJointPosition(absolutePosition);
    Data.Hands[handId].Fingers[fingerType].Joints[jointId].RelativePosition = relativePosition;
    Quaternion absoluteRotation = ToUnitySpace(iJoint.globalOrientation);
    Data.Hands[handId].Fingers[fingerType].Joints[jointId].AbsoluteRotation = absoluteRotation;
    Quaternion relativeRotation = Data.Hands[handId].CalcRelativeJointRotation(absoluteRotation);
    Data.Hands[handId].Fingers[fingerType].Joints[jointId].RelativeRotation = relativeRotation;
}
ToUnitySpace is the function that transforms positions and rotations into Unity world coordinates, so that I can place the camera in Unity exactly as it is positioned in reality. However, for testing purposes this function currently returns its input parameter (sensorHandPosition/sensorHandRotation) unchanged.
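For context, a minimal sketch of what the two ToUnitySpace overloads look like in this pass-through state; the mirroring shown in the comments is only an assumption on my part (modelled on the -x convention the SDK's HandsViewer sample uses), not the final mapping:

```csharp
// Sketch, assuming camera space differs from Unity space by a mirrored x axis.
// For testing, both overloads currently pass their input through unchanged.
Vector3 ToUnitySpace(Vector3 sensorPosition)
{
    // Intended later, e.g.:
    // return new Vector3(-sensorPosition.x, sensorPosition.y, sensorPosition.z);
    return sensorPosition;
}

Quaternion ToUnitySpace(Quaternion sensorRotation)
{
    // Mirroring x would also require adjusting the rotation, e.g.:
    // return new Quaternion(sensorRotation.x, -sensorRotation.y, -sensorRotation.z, sensorRotation.w);
    return sensorRotation;
}
```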
The behaviour I'm experiencing right now is as follows: the palm center rotates against the direction of the hand. This can be observed with a Debug.DrawRay call on the palm center point, using any of the direction vectors (such as Vector3.down). When tilting the hand in one direction, the palm center rotates in the opposite direction.
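To reproduce the observation, here is a minimal sketch of that debug visualization (assuming palmCenter/palmRotation are filled each frame from the SDK data read above; the component and field names are mine, not from the SDK):

```csharp
using UnityEngine;

public class PalmDebug : MonoBehaviour
{
    // Assumed to be updated each frame with the values read from the SDK.
    public Vector3 palmCenter;
    public Quaternion palmRotation;

    void Update()
    {
        // Draw a short ray from the palm center along the rotated down axis.
        // Tilting the hand one way makes this ray swing the opposite way.
        Debug.DrawRay(palmCenter, palmRotation * Vector3.down * 0.1f, Color.red);
    }
}
```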
I'm using the RSUnityToolkit of the Realsense 2016 R2 SDK in Unity 5.5.1.
Hi HoloGuy,
Thanks for your interest in the Intel® RealSense™ platform.
I'm sorry to hear that you are having a palm rotation issue with your code. I followed the Unity Hand Tracking sample in the 2016 R2 SDK and got really good results with the hand palm rotation; please see the images below:
I would encourage you to see and analyze the code that you can find in the following path: C:\Program Files (x86)\Intel\RSSDK\framework\Unity\FF_HandsAnimation\Assets\Scripts.
Hope this helps, have a nice day!
Best Regards,
-Jose P.
Hi Jose!
The palm itself rotates properly, that is correct. But the palm center, which is the red ball in your hand visualization, does not. I'm not able to upload any files here, so I can't provide my little coordinate-system asset (it consists of three orthogonal cylinders), but you can build one of your own, or just use a cube, to be able to see the rotation of the prefab.
If you change the code in HandsViewer.cs in lines 189 ff. from
PXCMPoint3DF32 smoothedPoint = smoother3D[i][j].SmoothValue(jointData[i][j].positionWorld);
myJoints[i][j].SetActive(true);
myJoints[i][j].transform.position = new Vector3(-1 * smoothedPoint.x, smoothedPoint.y, smoothedPoint.z) * 100f;
jointData[i][j] = null;
to
PXCMPoint3DF32 smoothedPoint = smoother3D[i][j].SmoothValue(jointData[i][j].positionWorld);
myJoints[i][j].SetActive(true);
myJoints[i][j].transform.position = new Vector3(-1 * smoothedPoint.x, smoothedPoint.y, smoothedPoint.z) * 100f;
if (j == 1)
    myJoints[i][j].transform.rotation = jointData[i][j].globalOrientation;
jointData[i][j] = null;
and then drop the new prefab into the "Palm Center Prefab" field, you will see that the rotation does not fit.
// edit: It seems the following rotation fixes my problem, though not in this sample scene:
Quaternion newRot = new Quaternion(-oldRot.x, oldRot.y, -oldRot.z, oldRot.w);
Anyway, something seems to be broken in the calculation of the rotation...
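For anyone hitting the same issue, the workaround above can be wrapped in a small helper (a sketch; the class and method names are mine, not part of the SDK):

```csharp
using UnityEngine;

static class PalmRotationFix
{
    // Negating the x and z components mirrors the rotation to compensate
    // for the handedness mismatch described above; y and w stay unchanged.
    public static Quaternion Mirror(Quaternion oldRot)
    {
        return new Quaternion(-oldRot.x, oldRot.y, -oldRot.z, oldRot.w);
    }
}
```

Usage would then be something like `Quaternion fixedRot = PalmRotationFix.Mirror(sensorHandRotation);` on the value read from QueryPalmOrientation in the code from my first post.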
Hi HoloGuy,
Just to be clear: was your issue fixed by defining the new quaternion? If so, what do you mean when you say "but not that in this scene"? Does that mean the issue persists?
We will be waiting for your reply, have a nice day!
Best Regards,
-Jose P.
Sorry, I meant that the solution above fixes the rotation issue within my application (at least for the palm center; I haven't checked the joints' rotations yet), but it does not work with the provided test scene that comes with the SDK, the one you used.
Hi HoloGuy,
It's great that you were able to fix your code!
Now, regarding the Unity Hand Tracking sample, I will report this issue to the responsible department.
We appreciate your feedback!
Have a great day!
Best Regards,
-Jose P.