Software Archive
Read-only legacy content

Recommended algorithm for skeleton tracking?

David_P_4
Beginner

I noticed there is no algorithm for skeleton tracking (arms and head only, of course) in the SDK Beta. Can you recommend a published algorithm that's compatible with your programming model?

6 Replies
Pubudu-Silva_Intel

David

We do have hand tracking and face tracking. I am not sure what you are referring to by "skeleton tracking" that cannot be achieved with those two. If you could describe at a high level what you are trying to achieve with the SDK, it would help me understand your question better.

David_P_4
Beginner

"Skeleton tracking" is the generation of a moving stick figure that matches the user's body poses in real-time. It's one of the features that drives Kinect-for-Windows adoption over DIY computer vision setups:

http://msdn.microsoft.com/en-us/library/hh973074.aspx

Although the Kinect system allows multiple Kinect sensors, I doubt that a Kinect and a RealSense camera could be used together, because neither would know how to compensate for the likely interference between their infrared signals. And it would be great if RealSense developers didn't also have to invest in Kinect, or have to choose between the two.

Alex_C_2
Beginner

So is this camera technology different from Kinect, or could skeletal tracking be built into the SDK?

For just hands and face, it looks like RealSense already does a lot of tracking.

David_P_4
Beginner

In the July webinar, the RealSense Program Manager didn't answer a question about what camera technology is used. Kinect for Windows v2 ("Kinect4Wv2"), which is also in beta right now, uses "time of flight," a change from the "structured light" used in v1.

RealSense tracks the face and fingers of a user less than 2 meters away. So far, it does not seem to provide the position and orientation of the elbows, shoulders, and spine, although those should be within its range.

Kinect for Windows v2 provides the position and orientation of ~20 joints for up to six people 0.5 to 4.5 meters from the camera.

My app needs to track the position and orientation of one user's head, shoulders, elbows, and hands within 4.5 meters, so I have to change my plans and use Kinect4Wv2.

Pubudu-Silva_Intel

David,

If I understand your description of "skeleton tracking" correctly, we have algorithm APIs with similar functionality. Please have a look at the sample apps that come with the installation to see if any of them offer the functionality you are looking for.

Dagan_E_Intel
Employee

Hi David,

The current hand and face tracking modules do not track the arm bones.

One of the reasons is that in user-facing scenarios the arms and shoulders are hidden most of the time.

We cannot suggest a specific algorithm, because we cannot guarantee its success.

May I ask what the goal of the arm tracking is?

If you don't need accurate positions and only want this for a "nice visualization," I would try "faking" the skeleton.

For example, you can estimate the shoulder positions from the head position, and then solve an inverse kinematics formula for the elbow position using the shoulder and the wrist.

It's not really skeleton tracking, but it might do the trick.
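
To make the idea concrete, here is a minimal sketch of that approximation in Python (not part of the RealSense SDK): the shoulders are placed at a fixed offset from the head, and each elbow is found with an analytic two-bone inverse kinematics step from the shoulder and wrist positions. The offsets, bone lengths, and the downward "elbow droop" direction are all assumed values for illustration.

import numpy as np

# Assumed body proportions (meters); tune for your users.
SHOULDER_DROP = 0.25        # vertical offset from head to shoulders
SHOULDER_HALF_WIDTH = 0.20  # lateral offset from head to each shoulder
UPPER_ARM = 0.30            # shoulder-to-elbow bone length
FOREARM = 0.27              # elbow-to-wrist bone length

def estimate_shoulders(head):
    """Place the shoulders at a fixed offset below and beside the head."""
    head = np.asarray(head, dtype=float)
    left = head + np.array([-SHOULDER_HALF_WIDTH, -SHOULDER_DROP, 0.0])
    right = head + np.array([SHOULDER_HALF_WIDTH, -SHOULDER_DROP, 0.0])
    return left, right

def solve_elbow(shoulder, wrist, pole=(0.0, -1.0, 0.0)):
    """Analytic two-bone IK: pick an elbow position consistent with the
    assumed bone lengths, bending toward the 'pole' (downward) direction."""
    shoulder = np.asarray(shoulder, dtype=float)
    wrist = np.asarray(wrist, dtype=float)
    pole = np.asarray(pole, dtype=float)
    to_wrist = wrist - shoulder
    u = to_wrist / (np.linalg.norm(to_wrist) + 1e-9)  # shoulder-to-wrist axis
    # Clamp the reach so the arm triangle always exists.
    d = np.clip(np.linalg.norm(to_wrist),
                abs(UPPER_ARM - FOREARM) + 1e-6,
                UPPER_ARM + FOREARM - 1e-6)
    # Distance along the axis to the elbow's projection, and its height off it.
    a = (UPPER_ARM**2 - FOREARM**2 + d**2) / (2.0 * d)
    h = np.sqrt(max(UPPER_ARM**2 - a**2, 0.0))
    # Bend direction: the part of the pole vector perpendicular to the axis.
    n = pole - np.dot(pole, u) * u
    n = n / (np.linalg.norm(n) + 1e-9)
    return shoulder + a * u + h * n

# Example with made-up head and right-hand positions (as they might come
# from the SDK's face and hand tracking).
head = [0.0, 0.0, 1.5]
right_wrist = [0.35, -0.30, 1.3]
_, right_shoulder = estimate_shoulders(head)
print("estimated right elbow:", solve_elbow(right_shoulder, right_wrist))

Feeding the real head and hand positions from the face and hand tracking modules into something like this should give a plausible-looking, if approximate, upper-body stick figure.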

hth,

Dagan
