The documentation seems to indicate that skeletal tracking is available for the R200 camera in a limited capacity.
While some of the function names appear to be slightly out of date, the API generally seems to work as advertised, with one major exception: QueryNumJoints always returns zero, even while a person is detected and joint tracking is enabled.
In fact, it fails to yield joint data not only in our own test project, but in the Person Tracking sample project as well.
My thought is that either this feature hasn't been implemented yet, or there is a problem with our camera setup. Can you help shed some light on this issue?
Body skeletal tracking is not available now. We will implement this feature in a future release. Thanks for supporting the RealSense SDK.
Hey David,
Is skeletal tracking available for Java on the R200 now? I saw code related to person tracking and skeleton tracking for the R200 in the documentation, but the Java library libpxcclr.java.jar I got doesn't contain any Java classes related to Person Tracking.
Please provide your feedback.
Thanks,
-MohanG
So what is the situation with skeleton tracking? Does it work in the latest SDK (in C++ or C#)?
It's worth mentioning that you can already approximate full skeletal tracking using the hand and face tracking that is already available. The idea is to observe how the face or hands move whenever the skeletal part you want to simulate moves, and track that instead.
For example, if you wanted to simulate the up-down and left-right movement of the spine then you can track the nose-tip face landmark, since the spine cannot move in real life without the nose moving. And if you want to simulate shoulder blade swing then you track the left-right movement of the hand, since the shoulder swings backward and forward as the hand moves left and right.
This video highlights the process.
Ok. I'm confused in the extreme. Every piece of marketing, advertising, and promotion of the RealSense technology, and of the SR300 camera I'm interested in, has claimed it supports full-body skeletal tracking. Before committing to a purchase, I decided to dig into the API to figure out how to start using this technology, and I found this conversation thread, which seems to indicate that all of that is a lie.
Could someone in a position of authority at Intel please answer the simple question: if I buy an SR300, get it all set up, stand in front of it and move my arms and legs about, is there an API call that will return the position of all the joints of my skeleton? Is there a map or description of which joints and bones are resolved? As in ankles, knees, hips, shoulders, elbows, wrists, head?
I should clarify that I do all my development with the F200, the SR300's predecessor, so all of my experience is from the perspective of that device. I can't offer solid information about the SR300, sadly, as anything I say about it would be speculation. *smiles*