Hi,
I'm working on allowing user interaction through hand movement and opening/closing the hands, using the Euclid dev kit with Linux (Ubuntu/ROS).
Body tracking seems quite limited with the Linux Person Tracking library: I only get 6 joints of the upper body, not the full skeleton. The hand position is a single point, and there is no additional information about the thumb or the other fingers, or about whether the hand is open or closed.
Is there any other way to get this information, or are there plans to extend the Person Tracking library for Linux? From what I have read online, this seems to work with the Windows SDK.
Best regards,
Tom
Your assessment of the Person Tracking feature's limitations is accurate. In my own RealSense project, I extrapolate a full skeleton by analyzing a handful of tracked body points, such as the palm and spine, and running calculations that estimate what the other, untracked body points would be doing when those tracked points move in a certain way.
That does not help with your problem of detecting hand open/close, though. In the Unity game engine's implementation of RealSense hand tracking, the palm is tracked to provide the hand status: if the hand is open and the palm is visible to the camera, this counts as Hand Open; if the palm is not visible because the hand is closed, this is regarded as Hand Closed.
Since the Hand Closed status is probably partly determined by analyzing hand joint positions (something the RealSense SDK for Linux cannot do), the easiest way to simulate Hand Closed may be to equate it to Hand Lost: if the camera cannot detect the palm, the hand is automatically classed as closed.
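Here is a minimal, hypothetical C++ sketch of that fallback. It assumes your own code can tell, once per frame, whether the Person Tracking library reported a palm point for the tracked person; the HandStateEstimator class, its names, and the 150 ms grace period are my own inventions, not part of the SDK.

```cpp
#include <chrono>

// Hypothetical helper: treat "palm not detected" as "hand closed".
// Nothing here is an SDK type; palmDetectedThisFrame is whatever your own
// code derives from the Person Tracking hand data each frame.
class HandStateEstimator {
public:
    enum class State { Open, Closed };

    // Call once per frame. A short grace period stops a single dropped
    // frame from flickering the state to Closed.
    State update(bool palmDetectedThisFrame) {
        const auto now = std::chrono::steady_clock::now();
        if (palmDetectedThisFrame) {
            lastSeen_ = now;
            return State::Open;
        }
        const bool missingTooLong =
            (now - lastSeen_) > std::chrono::milliseconds(150);
        return missingTooLong ? State::Closed : State::Open;
    }

private:
    std::chrono::steady_clock::time_point lastSeen_ =
        std::chrono::steady_clock::now();
};
```

You would feed it the result of your per-frame hand query and map State::Closed to whatever your "grab" action is.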
I could not find a way to tell whether the hand was closed. The Person Tracking library still returns the hand position when the hand is closed.
Is there any way to get more detail from the detection, from which it might be possible to derive the hand status (open/closed)?
I was unfortunately not able to find a built-in way to detect a hand-closed state in the RealSense SDK for Linux. It does, though, support a pointing gesture and comes with a sample program that demonstrates it. Perhaps you could use open hand / point instead of open hand / closed hand?
https://software.intel.com/sites/products/realsense/samples/md_samples_pt_tutorial_3_README_pt_tutorial_3.html Intel® RealSense™ SDK for Linux Samples: pt_tutorial_3
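To illustrate that swap at the application level, here is a hypothetical sketch (not SDK code): GestureEvent and InteractionMapper are placeholder names of my own, and the idea is simply that a detected pointing gesture triggers the action you had originally planned for "hand closed", re-arming once the user stops pointing.

```cpp
#include <functional>
#include <iostream>

// Placeholder event type: your code would produce these from the Person
// Tracking gesture data (pt_tutorial_3 shows how to query the pointing
// gesture). None of these names come from the SDK.
enum class GestureEvent { HandVisible, PointingDetected, HandLost };

class InteractionMapper {
public:
    explicit InteractionMapper(std::function<void()> onActivate)
        : onActivate_(std::move(onActivate)) {}

    // A pointing gesture plays the role that "hand closed" would have
    // played: it fires the grab/activate action exactly once, then waits
    // until the user stops pointing before it can fire again.
    void handle(GestureEvent e) {
        switch (e) {
            case GestureEvent::PointingDetected:
                if (!armed_) break;
                armed_ = false;
                onActivate_();
                break;
            case GestureEvent::HandVisible:
            case GestureEvent::HandLost:
                armed_ = true;  // re-arm for the next pointing gesture
                break;
        }
    }

private:
    std::function<void()> onActivate_;
    bool armed_ = true;
};

int main() {
    InteractionMapper mapper([] { std::cout << "activate\n"; });
    mapper.handle(GestureEvent::PointingDetected);  // prints "activate"
    mapper.handle(GestureEvent::PointingDetected);  // ignored until re-armed
    mapper.handle(GestureEvent::HandVisible);       // re-arms
    mapper.handle(GestureEvent::PointingDetected);  // prints "activate" again
}
```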
