
Hand Tracking Configuration

David_D_2
Beginner

What is the best way to set up the hand configuration for hand tracking?  I want to get the best and clearest results, especially when it comes to gesture tracking.

MartyG
Honored Contributor III

My own general rules for hand tracking include:

* Use the Joint 1 (J1) finger joint for all finger tracking if possible, as it's the joint the camera can see most consistently without losing tracking, whether the hand is palm-on to the camera or side-on (see the sketch after this list).

* Use the palm center for large movements instead of the wrist joint, as - again - the palm is easier for the camera to consistently see.  Use the wrist joint only if you want to turn your real-life wrist to rotate something.

*  For most hand-control purposes, using the 'Weighted' type of smoothing will probably work best for you.  A Weighted value of between 6 and 10 will give you a nice amount of control that does not make the controlled object shoot across the screen.  If that's too slow for you, the minimum value I use is 3.

*  Most of the time, the Real World Box and Virtual World Box values can all be set to zero if you're making something that does not need to have its movement restricted.

*  If your object is going to be able to move in more than one direction, I recommend using constraints to lock the Z direction so the object can only move horizontally (X) and vertically (Y).  If you have Z unlocked then the object is more liable to move backwards and forwards when you move your hand towards and away from the camera.
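To show what the joint advice above looks like in practice, here is a minimal C++ sketch of reading the index finger's J1 joint and the palm center through the RSSDK hand module.  I am writing this from memory of the SDK headers, so treat the exact names (PXCSenseManager, PXCHandData, JOINT_INDEX_JT1, JOINT_CENTER) as assumptions to verify against your own SDK version.

#include "pxcsensemanager.h"
#include "pxchanddata.h"

int main() {
    // Set up the pipeline with the hand module enabled.
    PXCSenseManager *sm = PXCSenseManager::CreateInstance();
    sm->EnableHand(0);
    PXCHandModule *handModule = sm->QueryHand();
    PXCHandData *handData = handModule->CreateOutput();
    sm->Init();

    while (sm->AcquireFrame(true) >= PXC_STATUS_NO_ERROR) {
        handData->Update();
        PXCHandData::IHand *hand = 0;
        if (handData->QueryHandData(PXCHandData::ACCESS_ORDER_NEAR_TO_FAR,
                                    0, hand) >= PXC_STATUS_NO_ERROR) {
            PXCHandData::JointData j1, palm;
            // J1 of the index finger for fine finger control...
            hand->QueryTrackedJoint(PXCHandData::JOINT_INDEX_JT1, j1);
            // ...and the palm center for large movements.
            hand->QueryTrackedJoint(PXCHandData::JOINT_CENTER, palm);
            // j1.positionWorld and palm.positionWorld are the 3D positions
            // you would feed into your own smoothing and constraints.
        }
        sm->ReleaseFrame();
    }
    sm->Release();
    return 0;
}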

Hope that helps!

David_D_2
Beginner

Thanks Marty!  I will definitely need to try these out.  Would these also help with improving gesture recognition?  That has been my biggest problem so far: the camera seems to be spotty when it comes to picking up different gestures.

MartyG
Honored Contributor III

Hand open and close, spread fingers and the thumb-and-index pinch are the gestures that I have had the most consistent recognition success with.  I haven't had as much luck with the other hand gestures.  

I think one of the main factors in successful recognition of a gesture is the distance of your hand from the camera.  I find that if you move your hand too close to the lens then the camera can no longer see the hand clearly.  It can also have trouble seeing the hand if it is too far back (e.g. near your chest).
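If it's useful, here is how I would restrict tracking to just those reliable gestures in the C++ SDK, so the recognizer isn't trying to match everything at once.  Again a sketch from memory: the gesture name strings ("spreadfingers", "fist", "two_fingers_pinch_open") come from the SDK documentation I used and may differ between releases.  It assumes the handModule/handData setup from my earlier post.

// Enable only the gestures that have recognized reliably for me.
PXCHandConfiguration *config = handModule->CreateActiveConfiguration();
config->DisableAllGestures();                      // start from a clean slate
config->EnableGesture(L"spreadfingers");           // open hand, fingers spread
config->EnableGesture(L"fist");                    // closed hand
config->EnableGesture(L"two_fingers_pinch_open");  // thumb-and-index pinch
config->ApplyChanges();

// Then, each frame after handData->Update():
PXCHandData::GestureData gesture;
if (handData->IsGestureFired(L"fist", gesture)) {
    // React to the closed fist here.
}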

David_D_2
Beginner

Alright, I may look into the distance when doing gesture recognition.  I have been working with only spread fingers and fist, which has given me mixed results.  Sometimes it will pick up the gestures no problem, but for the most part it has trouble finding the fist.  Most of the time when I make a fist it will just alert me, "Hand Lost", and I will have to open it back up (which sometimes doesn't even register as a spread fingers gesture) and have it calibrate again.

MartyG
Honored Contributor III

From your description, I am wondering if you are holding your hand up high enough.  One of the features of the current SDK is that if the camera loses tracking (e.g. because the hand has moved outside the camera's field of view) then it automatically goes into Stop mode.

If you have your camera positioned on your desk like I do instead of hanging off the top of your monitor, it is easy to unconsciously let your hands drift below the level of the top of the desk, and the hand-lost condition activates.  

Gravity can make it tiring to keep your hands held higher up in the air when the camera is atop the monitor.  So as a camera-on-desk user, I found that the best way to keep tracking from stalling is to keep your eyes loosely fixed on whatever you are controlling on-screen.  For example, with my full-body avatar I follow its head around the screen as it walks.  This helps to prevent your attention from subconsciously drifting and letting your hands drop too low.
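One thing you can also do in code is listen for the SDK's alerts, so your app can tell the user why tracking stopped instead of just showing "Hand Lost".  This sketch makes the same assumptions as my earlier snippets, and the alert enum names are what I remember from the headers, so double-check them against your SDK version.

// Enable the border/range alerts on the PXCHandConfiguration from before.
config->EnableAlert(PXCHandData::ALERT_HAND_OUT_OF_BORDERS);
config->EnableAlert(PXCHandData::ALERT_HAND_TOO_CLOSE);
config->EnableAlert(PXCHandData::ALERT_HAND_TOO_FAR);
config->ApplyChanges();

// Each frame, after handData->Update():
for (pxcI32 i = 0; i < handData->QueryFiredAlertsNumber(); ++i) {
    PXCHandData::AlertData alert;
    if (handData->QueryFiredAlertData(i, alert) >= PXC_STATUS_NO_ERROR) {
        if (alert.label == PXCHandData::ALERT_HAND_OUT_OF_BORDERS) {
            // e.g. prompt "raise your hand back into view" here.
        }
    }
}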

Edit: I am not sure what platform you are using for development (C, JavaScript, Unity, etc.).  In Unity at least, the hand opened and hand closed gestures have a setting called Openness Factor that determines how far a hand's fingers must open or close before the gesture is triggered.  If the Openness Factor is set high then it is very easy for the camera to think you've made a closed fist even when your hand is open.  As my avatar currently uses closed and open hand to toggle between crouch and standing positions, it was bouncing up and down on the spot like a kangaroo at one point!

Changing the Openness Factor from the default (which is '50') to a lower value ('20' is my preference) makes it more difficult for a gesture to be mis-triggered.
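If you're not in Unity, you can get a similar effect by thresholding the hand's openness value yourself.  In the C++ SDK the IHand interface has a QueryOpenness() call that, as I understand it, returns 0 for fully closed through 100 for fully open, so a sketch like this mirrors the 50-to-20 change.  The threshold values are my own tuning, not anything official.

// Hypothetical thresholds: demand a clearly shut hand before treating it
// as a fist, and a clearly open one before treating it as spread fingers.
const pxcI32 CLOSED_THRESHOLD = 20;
const pxcI32 OPEN_THRESHOLD   = 80;

pxcI32 openness = hand->QueryOpenness();   // 0 = closed, 100 = open
if (openness <= CLOSED_THRESHOLD) {
    // Firm closed fist (my avatar would toggle crouch here).
} else if (openness >= OPEN_THRESHOLD) {
    // Clearly open hand (toggle back to standing).
}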

 
