Hi everyone,
I was going through my old RealSense tech demo videos and found one from late 2014 that showcased how the RealSense camera's hand tracking can be used to control an avatar with the feet and toes. This works because the camera treats the base of the foot like a palm and the toes like fingers. In the video, you can see the avatar's fingers wiggling in response to the wiggling of the real-life toes.
This method also works for controlling an object with leg movements, as the camera treats the knee as a large palm. The approach never went beyond experiments, though, because it was inherently impractical: the user would have to remove their trousers for knee tracking, or their shoes and socks for foot tracking. It may still have more specific applications as a means of object control for physically disabled users.
I have cut the toe-control section out of my tech video and posted it as a standalone clip so other developers can see it in action.
https://www.youtube.com/watch?v=IOJtApQqa8A RealSense and Foot Tracking - YouTube
On a related note for potential disability usability applications, I thought I'd also drop into this discussion my experiment with "thought controlled" avatar legs using facial micro-expressions with a RealSense F200 camera.
https://www.youtube.com/watch?v=HHzXdLqI8p4 'My Father's Face' Tech Trailer 2 - "Thought" controlled avatar legs - YouTube