
Demonstration of avatar hand bending with RealSense

MartyG

Hi everyone,

During the development of the RealSense game project I am working on, it has long been clear that for our camera-controlled full-body avatar to use its hands effectively, those hands would have to replicate a real person's ability to bend up and down at the wrist.

Unless the hands can do this, the avatar's arms have to be moved into an unnatural-looking position so that the hands can reach where they need to go. The problem became even more urgent to solve once we had added physical collision detection to our avatar, because the human arm has great difficulty putting the hand on the front of the body unless the lower arm swings sideways via shoulder movement and the hand then bends at the end of the arm to touch against the skin.

Because our avatar arm lacked a bendable hand, it was very hard to persuade the arm to reach the hand around to the front from a relaxed position, because the arm would simply collide with the side of the body and stop. We could foresee that this would cause frustration for the player, which was unacceptable to us, as our goal is to make the camera-controlled arms as natural to move as the player's real limbs.

We finally managed to implement a hand bending system today, based on the experience gained from several failed prototype attempts.  Here is a YouTube video of the bend at work.

https://www.youtube.com/watch?v=vcUV3au-C4c&feature=youtu.be

HOW WE DID IT IN UNITY

Step One

Create two wrist joints from small 'Sphere' type objects.  Drag and drop one of the spheres onto the other to make the dropped sphere a "child object" of the other.

http://sambiglyon.org/sites/default/files/joint2.jpg

If your hand is part of a full avatar arm instead of just a hovering hand, drag and drop the parent sphere onto the object representing the wrist flesh on your avatar arm, and then drag and drop the parent object of the hand flesh onto the child sphere, so that the spherical joints sit in between the wrist and the hand.
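For anyone who prefers to build this from a script rather than by drag-and-drop in the Hierarchy, here is a minimal sketch of the same structure in Unity C#.  The 'wristFlesh' and 'handFlesh' names are placeholders of our own for this example, not objects from the SDK - assign your avatar's actual objects in the Inspector.

using UnityEngine;

public class WristJointBuilder : MonoBehaviour
{
    public Transform wristFlesh;  // the wrist flesh object on the avatar arm
    public Transform handFlesh;   // the parent object of the hand flesh

    void Start()
    {
        // Create the two small sphere joints.
        GameObject parentJoint = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        GameObject childJoint = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        parentJoint.name = "WristJoint_Parent";
        childJoint.name = "WristJoint_Child";
        parentJoint.transform.localScale = Vector3.one * 0.05f;  // keep them small

        // Make one sphere a child of the other...
        childJoint.transform.SetParent(parentJoint.transform, false);

        // ...then slot the pair in between the wrist flesh and the hand
        // flesh, keeping the hand's current world position.
        parentJoint.transform.SetParent(wristFlesh, false);
        handFlesh.SetParent(childJoint.transform, true);
    }
}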

Step Two

Give the parent and child spheres a 'TrackingAction' SDK script with the unique settings shown in the image below.

http://sambiglyon.org/sites/default/files/joint1.jpg

The main difference between the two is in the Constraint settings.  The parent joint constrains all directional axes except Y, so that the hand can bend up and down when the wrist joint on the real-life hand moves up and down.

On the child joint, meanwhile, the up / down Y axis is locked and the left-right X axis is unlocked instead, so that the hand can turn left and right.

When installing the TrackingAction scripts on the opposite hand, set the 'Index' option to '1' instead of '0' so that Unity recognizes the left and right hands as separate and turns them individually.
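The exact field layout of the TrackingAction script varies between SDK Toolkit versions, so rather than guess at its API, here is a small stand-alone Unity C# script of our own that illustrates the same constraint idea: it receives a full rotation (which the camera tracking would normally supply) and passes through only the axes left unlocked.  This is an illustration of the principle, not the SDK's actual script.

using UnityEngine;

// Stand-alone illustration of the constraint idea, not the SDK's
// TrackingAction itself. Each joint unlocks exactly one rotation axis
// and ignores the rest of the tracked rotation.
public class AxisConstrainedRotation : MonoBehaviour
{
    public bool unlockX;  // left/right turn (the child joint in our setup)
    public bool unlockY;  // up/down bend (the parent joint in our setup)
    public bool unlockZ;

    // Call this each frame with the tracked wrist rotation.
    public void ApplyTrackedRotation(Quaternion tracked)
    {
        Vector3 trackedAngles = tracked.eulerAngles;
        Vector3 current = transform.localEulerAngles;
        transform.localEulerAngles = new Vector3(
            unlockX ? trackedAngles.x : current.x,
            unlockY ? trackedAngles.y : current.y,
            unlockZ ? trackedAngles.z : current.z);
    }
}

In this picture, the parent joint would have only unlockY ticked and the child joint only unlockX, and whatever feeds in the tracked rotation would read from hand index 1 instead of 0 for the opposite hand.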

Step Three

Finally, put a 'Rigidbody' physics component with the settings below on both of the sphere joints.

http://sambiglyon.org/sites/default/files/joint3.jpg
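For reference, and assuming the fully-frozen constraint setup we describe in the edit note below, the equivalent Rigidbody setup in code is short.  The other values visible in the image, such as mass and gravity, are left at whatever you are already using.

using UnityEngine;

public static class JointPhysics
{
    // Adds a Rigidbody with every position and rotation axis frozen.
    // The TrackingAction still moves the joint (it ignores Rigidbody
    // constraints), but stray collisions handled by the physics engine
    // are much less likely to knock the joint out of position.
    public static void AddFrozenRigidbody(GameObject joint)
    {
        Rigidbody rb = joint.AddComponent<Rigidbody>();
        rb.constraints = RigidbodyConstraints.FreezeAll;
    }
}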

Edit: we thought we should add, for the sake of clarity, that whilst it may seem unusual that every position and rotation constraint on the Rigidbody is locked to disable all movement, TrackingAction scripts actually ignore Rigidbody settings and give their own constraint settings priority.  So even if rotation is locked in a Rigidbody, a camera-powered object will still rotate so long as those axes are unlocked in the TrackingAction's own constraint settings.

The reason why we bother to use constraints in the Rigidbody at all is that whilst the camera may ignore them, Unity's physics engine will still obey them under some circumstances, and so freezing them helps to reduce the likelihood that the joint will fall off the arm or be torn out of position if another object collides with it.
