
I want to get emotion values in Unity using the SR300

ktake5
Beginner

I am thinking of moving objects in Unity according to facial expressions.

https://software.intel.com/sites/landingpage/realsense/camera-sdk/v1.1/documentation/html/index.html?doc_pt_detecting_facial_expressions.html Intel® RealSense™ SDK 2016 R2 Documentation

I came up with this idea while reading the document above.

By judging which emotion the facial expression captured by the SR300 expresses, it should be possible to move an object according to that emotion.

But I am stuck.

Is there a way to implement this idea?

By the way, using PXCMFaceData.LandmarkType, I was able to move an object when the mouth is opened more than a certain amount.
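To illustrate the landmark approach mentioned above, here is a minimal sketch in C#. It follows the shape of the 2016 R2 C# interface (QueryLandmarks, QueryPoints, LandmarkPoint), but the specific landmark indices for the upper and lower lip are assumptions and must be checked against the SDK documentation:

```csharp
using UnityEngine;

// Sketch: estimate mouth openness from face landmarks and move an object
// when it exceeds a threshold. The lip landmark indices below are
// placeholders, not values from the documentation.
public class MouthOpenMover : MonoBehaviour
{
    public float threshold = 10.0f;   // openness in image pixels (tune per setup)
    public Transform target;          // object to move

    public void OnFaceData(PXCMFaceData.Face face)
    {
        PXCMFaceData.LandmarksData landmarks = face.QueryLandmarks();
        if (landmarks == null) return;

        PXCMFaceData.LandmarkPoint[] points;
        if (!landmarks.QueryPoints(out points)) return;

        // Assumed indices for the upper/lower lip centre points.
        const int UPPER_LIP = 47;
        const int LOWER_LIP = 51;

        float openness = Mathf.Abs(points[LOWER_LIP].image.y - points[UPPER_LIP].image.y);
        if (openness > threshold)
        {
            target.Translate(Vector3.up * Time.deltaTime);
        }
    }
}
```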

And I am not very good at English.

7 Replies
MartyG
Honored Contributor III

You do not need to use that scripting to move an object in Unity, as the R2 SDK has a tool set called the Unity Toolkit that contains pre-made tracking scripts that you can just drag and drop into objects. One of these, 'TrackingAction', can detect a facial expression and allow an object to move or rotate a certain amount based on the settings that you have put into TrackingAction's menus and settings in the Inspector panel of Unity.

I have written a large range of detailed step by step guides on using TrackingAction in Unity, and adapting its code to improve its capabilities.

https://software.intel.com/en-us/forums/realsense/topic/676139 Index of Marty G's RealSense Unity How-To Guides

I will also show you a very old video from my own Unity game project where I control an avatar character's face parts with TrackingAction. I am a little embarrassed to show it, as my body animation technology is far more advanced now, but it is the best example video of animating directly with TrackingAction that I have (I later wrote my own custom real-time animation system for Unity, called CamAnims).

https://www.youtube.com/watch?v=HspFO87NHeY&t=1m0s

Here are details of the more advanced CamAnims system for further reading.

https://communities.intel.com/message/486895#486895

ktake5
Beginner

Thank you for answering.

I watched a little of the video.

It looks like I can get some hints from it, so I will study the details later.

In the future, I am thinking of changing a 3D model's face according to a person's emotion.

MartyG
Honored Contributor III

There are two ways that you could detect emotions:

1. Use a 'SendMessageAction' component from the Unity Toolkit to trigger an action when a certain facial expression is detected, such as a smile, open mouth or closed eyes.

2. Use the method described in my CamAnims article, which uses the values of an object to calculate which emotion the player in front of the camera is currently expressing.
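For option 1, the SendMessageAction component fires a standard Unity message when its trigger rule is met, so the receiving object only needs a method whose name matches the one configured in the Inspector. A minimal sketch (the method name `OnSmileDetected` is just an example; it must match your own SendMessageAction settings):

```csharp
using UnityEngine;

// Receiver for the Unity Toolkit's SendMessageAction. Attach this to the
// object targeted in the SendMessageAction's Inspector settings; the method
// name below must match the message name configured there.
public class ExpressionReceiver : MonoBehaviour
{
    void OnSmileDetected()
    {
        // React to the detected expression, e.g. nudge the object upward.
        transform.Translate(Vector3.up * 0.1f);
    }
}
```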

ktake5
Beginner

I tried method 1.

However, I found it difficult to reach the level I was aiming for with this method, because the settings available in FaceExpression were insufficient.

I thought it would be better to use PersonExpressionsEnum rather than FaceExpression for judging emotion values.

In particular, I think it would be interesting if the model reacted to judgments of "SADNESS", "SURPRISE" and "ANGER".

When I tried to judge these emotions, I felt that FaceExpression could not handle it.

So I will now try whether it can be done with method 2.

FaceExpression https://software.intel.com/sites/landingpage/realsense/camera-sdk/v1.1/documentation/html/index.html?faceexpression_expressionsdata_pxcfacedata.html Intel® RealSense™ SDK 2016 R2 Documentation

PersonExpressionsEnum https://software.intel.com/sites/landingpage/realsense/camera-sdk/v1.1/documentation/html/index.html?personexpressionsenum_personexpressions_pxcpersontrackingdata.html Intel® RealSense™ SDK 2016 R2 Documentation
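For reference, this is roughly how the FaceExpression values linked above are read in C#. This is a sketch based on the R2 documentation; EXPRESSION_MOUTH_OPEN and the 0–100 intensity field are part of that API, but the surrounding frame-acquisition code is omitted:

```csharp
// Sketch: reading a FaceExpression intensity from the 2016 R2 C# API,
// given a PXCMFaceData.Face obtained from the face module.
PXCMFaceData.ExpressionsData expressions = face.QueryExpressions();
if (expressions != null)
{
    PXCMFaceData.ExpressionsData.FaceExpressionResult result;
    if (expressions.QueryExpression(
            PXCMFaceData.ExpressionsData.FaceExpression.EXPRESSION_MOUTH_OPEN,
            out result))
    {
        int intensity = result.intensity; // 0..100
        // The fixed set of expressions here (smile, mouth open, brow moves,
        // eye closes, ...) is why mapping them to SADNESS / SURPRISE / ANGER
        // judgments has to be done by your own code.
    }
}
```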

ktake5
Beginner

I tried method 2.

It acquires how much each part of the face has moved from its initial value, and makes a judgment by combining those movements.

I have the movement of the mouth working, so if I can also get the movement of the eyebrows, I may be able to do something similar.

However, since it seems difficult to build the judgment for each emotion independently, I thought it would be much easier if RealSense itself provided emotion judgment.
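The combination step described in this approach could be sketched as a simple threshold classifier over normalized part movements. All of the thresholds and the mapping to emotions below are illustrative assumptions, not something the SDK provides:

```csharp
// Sketch: combine per-part movements (each normalized to 0..1 relative to a
// calibrated initial pose) into a rough emotion judgment. The thresholds and
// the emotion mapping are assumptions to be tuned by experiment.
public static class EmotionJudge
{
    public static string Judge(float mouthOpen, float browRaise,
                               float browLower, float mouthCornerDown)
    {
        if (browLower > 0.6f && mouthOpen < 0.4f) return "ANGER";
        if (mouthOpen > 0.6f && browRaise > 0.5f) return "SURPRISE";
        if (mouthCornerDown > 0.5f && browRaise < 0.3f) return "SADNESS";
        return "NEUTRAL";
    }
}
```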
MartyG
Honored Contributor III

I do not think that Intel will create a new emotion detection program for RealSense. I am sure that they would be happy if somebody else made one and shared it with the RealSense community though.

ktake5
Beginner

I see.

I understand now that there is no emotion detection program; that is helpful to know.

Thank you, MartyG.
