Software Archive
Read-only legacy content

RealSense Unity avatar tech demo reel

MartyG
Honored Contributor III

Hi everyone,

We wanted to highlight the avatar technology that we developed with RealSense and Unity, but weren't sure whether the RealSense App Challenge rules would permit us to post our finalist video in public before the results were announced. So we stitched together a simple new tech demo video from our development archives, together with brand-new footage of the very latest state of our RealSense-powered facial animation.

https://www.youtube.com/watch?v=_wFvcmNM5xA&feature=youtu.be

We are aiming for a commercial release of the full game by early June to coincide with the contest result announcements, the E3 games convention and the reported launch window of Windows 10 (we are hoping to release a version of the game as a universal Windows 10 app that is also compatible with Xbox One). 

All of the new RealSense / Unity techniques we discover and develop will be shared on our free online developer diary, which is currently up to episode 41 (contest deadline day), with more to come between now and June.  The diary teaches how to build all of the mechanisms you saw in the above demo reel.

http://test.sambiglyon.org/fatherdiary

MartyG
Honored Contributor III

Edit: the archival footage of the arm movements did not really illustrate their full range of movement, so we recorded a new version of the tech reel with fresh arm control footage from the latest version of the game.

https://www.youtube.com/watch?v=PX_RePc5YUQ&feature=youtu.be

Our aim throughout the project has always been to enable the player to perform complex full-body movement choreography, like that in this clip from series 6 of the cartoon WinX Club.  

https://www.youtube.com/watch?v=HXX62gWDgMs

We have leg movement working in prototype form - details of that prototype are in the diary - and will aim to have leg detection fully supported in the final version of our Unity tech.

 

Vidyasagar_MSC
Innovator

Interesting, Marty!!! Would love to see the final version in June.

MartyG
Honored Contributor III

Thanks so much, Vidyasagar.  We got live opening and closing of the eyelids working for the first time today.  This is a feature that the Intel Unity avatar sample packaged with the SDK had, so we were keen to achieve it in our own avatar.

The Intel example works by overlaying loose objects called prefabs on top of a mannequin model, combined with a rather complex and lengthy custom animation controller script that is probably proprietary to Intel. Our approach throughout the project has been to use the SDK's 'TrackingAction' action script, sometimes combined with simple JavaScript scripting, so that others can easily duplicate our tech in their own projects without fear that they might be infringing a copyright.
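
To give a flavour of what we mean by simple scripting, here is a minimal Unity JavaScript sketch of the sort of helper we hang off a TrackingAction-driven object - for example, to open and close an eyelid. It uses only standard Unity calls; the variable names, angle values and the SetEyelid function are made up purely for this illustration and are not part of the RealSense SDK.

#pragma strict

// Illustrative eyelid helper: rotates an eyelid object between open and
// closed angles based on a tracked 'openness' value supplied from elsewhere
// (e.g. a TrackingAction-driven guide object). Names and ranges are invented.
var eyelid : Transform;            // assign the eyelid object in the Inspector
var closedAngle : float = 80.0;    // X rotation when fully closed
var openAngle : float = 0.0;       // X rotation when fully open

// 'openness' is expected in the range 0 (closed) to 1 (open).
function SetEyelid(openness : float)
{
    var angle : float = Mathf.Lerp(closedAngle, openAngle, Mathf.Clamp01(openness));
    eyelid.localRotation = Quaternion.Euler(angle, 0, 0);
}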

This should be helpful when the next App Challenge contest comes around: developers can build upon what we have already created instead of starting from scratch, without worrying that they might be disqualified for using someone else's assets - we give all our techniques and scripting away freely.

Vidyasagar_MSC
Innovator

Thanks for a clear explanation, Marty. This inspires us and motivates us to move forward. All the very best, and keep rocking as usual!!

MartyG
Honored Contributor III

Thanks Vidyasagar!  ^_^

Some may think that the arm movements are pre-made animations triggered by a gesture. They are actually live, real-time arm movements, which might seem impossible given that the arm itself has no tracking points.

It was built around the principle that a real-life lower and upper arm cannot move without certain parts of the hand (which is trackable) moving as well.  For instance, lower arm movement (side to side and up and down) is driven by the pinky finger of the hand, since the lower arm cannot move without the pinky moving too. 

The shoulder drives movement of the upper arm, raising and lowering it and swinging it sideways towards and away from the body. It is itself driven by movement of the palm of the hand up, down, left and right, because it is impossible for the real-life shoulder to move without the hand moving as well.
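
As a rough illustration of that mapping (not our actual production script), a Unity JavaScript sketch might look like this. The tracked palm and pinky values would come from the camera, for example via TrackingAction-driven guide objects; here they are simply passed in as normalised -1 to 1 parameters, and all names and ranges are invented for the example.

#pragma strict

// Hypothetical 'hand drives arm' mapping: palm movement rotates the shoulder
// (and therefore the whole arm), pinky movement rotates the lower arm only.
var shoulder : Transform;           // parent joint - swings the whole arm
var lowerArm : Transform;           // child joint - swings the forearm only

var shoulderRange : float = 45.0;   // degrees of shoulder swing
var lowerArmRange : float = 70.0;   // degrees of forearm swing

// palmX / palmY / pinkyY: normalised tracked offsets in the range -1..1.
function UpdateArm(palmX : float, palmY : float, pinkyY : float)
{
    // Palm movement drives the shoulder, carrying the upper and lower arm with it.
    shoulder.localRotation = Quaternion.Euler(palmY * shoulderRange, 0, palmX * shoulderRange);

    // Pinky movement drives the lower arm independently of the shoulder.
    lowerArm.localRotation = Quaternion.Euler(pinkyY * lowerArmRange, 0, 0);
}

The key point is that the shoulder and the lower arm each receive their own rotation, so the forearm can swing on its own even while the whole arm is being carried around by the shoulder.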

Because the lower arm is attached as a child object of the upper arm, the lower arm moves when the upper one does, yet it can still move independently because it has its own rotational joint. This is due to how the 'parent and child' object linking hierarchy works in Unity: an object higher in the hierarchy moves the ones below it. So the shoulder is the overall parent of the arm. The upper arm is a child of the shoulder and is moved when the shoulder moves. The lower arm is a child of the upper arm and moves when the upper arm moves. The wrist is a child of the lower arm and moves with it, and the hand is a child of the wrist and moves when the wrist moves.

BUT the second half of the parent-child relationship is that when a child object moves, it only moves the objects below it in the hierarchy - NOT the object above it. So the upper arm can move the lower arm, but the lower arm cannot move the upper arm. Like a real human arm, then, the upper and lower arm can move together, yet the lower arm can still move independently if it receives a rotation input from the camera.
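
Here is a tiny illustrative Unity JavaScript snippet of that parenting behaviour. In the real project the joints are parented in the editor's Hierarchy panel rather than built in code; this is just to show which rotations carry which objects with them.

#pragma strict

// Builds a shoulder -> upper arm -> lower arm chain in code, for illustration only.
function Start()
{
    var shoulder : GameObject = new GameObject("Shoulder");
    var upperArm : GameObject = new GameObject("UpperArm");
    var lowerArm : GameObject = new GameObject("LowerArm");

    // Parenting: each child inherits the movement of everything above it.
    upperArm.transform.parent = shoulder.transform;
    lowerArm.transform.parent = upperArm.transform;

    // Rotating the shoulder carries the upper AND lower arm with it...
    shoulder.transform.Rotate(0, 0, 30);

    // ...but rotating the lower arm moves only the lower arm (and anything
    // parented beneath it) - the upper arm and shoulder stay where they are.
    lowerArm.transform.Rotate(0, 0, 45);
}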

The term I used when designing these movements was 'Reverse Thinking'. Instead of thinking about how to move a shoulder when it has no tracking point, I started at the opposite end of the arm, where the tracking points are, and thought about how a hand could move a shoulder.
