
Download of Unity 5 RealSense-powered full body avatar available

MartyG
Honored Contributor III

Edit: since the original posting of this article, a new build has been uploaded with a significantly improved and stable leg animation system.  It can be downloaded using the link below.

Hi everyone,

The latest Unity 5 version of my company's RealSense-powered full-body avatar is now stable enough for us to release a zipped demo application of it for download and trying out.  The controls are still a little twitchy in places but we're making excellent progress in fine-tuning it.

http://sambiglyon.org/sites/default/files/RealSense-Avatar-Build-3.zip  (12 MB)

Once downloaded, open the folder 'RealSense Avatar Build 3' and run the executable file named 'My-Fathers-Face' inside.

BASIC CONTROLS

*  Face your palms towards the camera to control the arms.  Move the palms up and down to lift and drop the arms, and move the palms left and right to swing the arms sideways (see the sketch after this list).

*  Turn the head left and right to rotate the avatar left and right.

*  Lean forward a little in your chair to make the avatar walk forward in whatever direction it is facing, and lean back in the chair to bring it to a halt.

Note: the leg walking is currently in alpha, so please forgive it if the stride looks a bit wrong.

You can turn whilst walking by moving your head left and right as you walk forward.

*  Open and close your fingers slightly to open and close the avatar's fingers.  Do not close the hand fully or the sitting pose will be activated (we will likely assign the sit activation to another gesture later, since the avatar's hands will be used to hold and manipulate objects).
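
For Unity developers who want to see how a control scheme like this can be wired up, here is a minimal C# sketch of the mapping described in the list above.  It is not our actual code: the tracked values (palmHeight, palmSideways, headTurn, bodyLean) are placeholders for whatever your RealSense hand and face tracking feeds in each frame, and the angles and thresholds are purely illustrative.

using UnityEngine;

// Minimal sketch of the control mapping above. The tracked values are
// placeholders for data supplied by the camera tracking each frame.
public class AvatarControlSketch : MonoBehaviour
{
    public Transform upperArm;      // a shoulder joint on the avatar rig
    public Transform body;          // root transform of the avatar
    public float walkSpeed = 1.5f;

    // Roughly -1..1 values fed in from the camera tracking (placeholders).
    public float palmHeight;        // palm up/down
    public float palmSideways;      // palm left/right
    public float headTurn;          // head turned left/right
    public float bodyLean;          // forward lean in the chair

    void Update()
    {
        // Palm height lifts/drops the arm; palm left/right swings it sideways.
        Quaternion armTarget = Quaternion.Euler(palmHeight * -90f, 0f, palmSideways * 45f);
        upperArm.localRotation = Quaternion.Slerp(upperArm.localRotation, armTarget, 10f * Time.deltaTime);

        // Turning the head rotates the whole avatar.
        body.Rotate(0f, headTurn * 60f * Time.deltaTime, 0f);

        // Leaning forward past a small dead zone walks the avatar forward.
        if (bodyLean > 0.2f)
            body.Translate(Vector3.forward * walkSpeed * Time.deltaTime, Space.Self);
    }
}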

OTHER CONTROLS

The avatar's mouth moves up and down when yours does, and the lips change shape when you grimace or make a big 'O' shape with your mouth.  The eyelids and eyebrows also change shape to reflect the nature of the expression.
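
As a rough idea of how facial control like this can be driven in Unity, here is a small sketch that pushes tracked face values into blendshape weights.  The blendshape indices and the input values are assumptions for illustration, not our actual rig or code.

using UnityEngine;

// Sketch of driving facial animation with blendshapes. The indices and the
// tracked values are assumed placeholders; a real face rig will differ.
public class FaceAnimationSketch : MonoBehaviour
{
    public SkinnedMeshRenderer faceMesh;
    public int mouthOpenShape = 0;   // assumed blendshape indices
    public int mouthOShape = 1;
    public int browRaiseShape = 2;

    // 0..1 values taken from the camera's face tracking each frame.
    public float mouthOpen;
    public float mouthO;
    public float browRaise;

    void LateUpdate()
    {
        // SetBlendShapeWeight expects values in the 0..100 range.
        faceMesh.SetBlendShapeWeight(mouthOpenShape, mouthOpen * 100f);
        faceMesh.SetBlendShapeWeight(mouthOShape, mouthO * 100f);
        faceMesh.SetBlendShapeWeight(browRaiseShape, browRaise * 100f);
    }
}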

CONCLUSION

We hope that this demo app will inspire Unity 5 RealSense developers regarding what they can create for games or training simulations with full-body avatars.  Feedback is welcomed!

康夫_上_
Beginner

Dear Marty G

 

Hello,

Thank you for the wonderful sample files.

We will use them as a reference.

If possible, it would help us greatly to have the Unity project file as well... ^^;

 

MartyG
Honored Contributor III

Thank you very much!  I will be posting new samples of this project for people to try as the project progresses.


MartyG
Honored Contributor III

A brand new build of our RealSense-powered full-body avatar, compiled in the latest Unity 5.1, is now available for download.  (12 MB)

http://sambiglyon.org/sites/default/files/RealSense-Avatar-Build-4.zip

All of the controls listed above in this article still apply.

The additional changes in this new build include:

* An alpha of a crouching system.  To crouch the avatar, close your fingers into a fist.  To rise up again into the default standing position, move your hand downwards outside of the camera view whilst your hand is closed.

You can also toggle between standing and crouching by moving your fist up and down so that it goes in and out of the camera view (see the sketch after this list).

*  Controllable thumbs have been added to each hand, as a precursor to when we add the ability to handle objects with the hands (as thumbs act as locking mechanisms to grip objects in real life).

*  An alpha of an auto arm-swinging system that gently swings the arms back and forth during walking.

*  A gravity system that enables the avatar to rise up when stepping up on an object or dropping down to a lower level of floor.  The feature is not visible in this simple empty-room demo, but you can get a glimpse of it right at the beginning as the avatar does a little drop down to the floor.

In the tree-house game environment that our avatar is normally in - we hope to release a demo of this in the near future - this gravitation enables the avatar to travel up and down on an elevator.

*  There are also numerous bug fixes.
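
Here is a hedged sketch of how a fist-based crouch toggle like the one in the list above could be written in Unity.  The hand-state booleans are placeholders for whatever your RealSense hand tracking reports, and the "Crouching" Animator parameter is an assumption; our own implementation differs in detail.

using UnityEngine;

// Sketch of the fist-based crouch toggle described above: each time a closed
// hand enters or leaves the camera view, the crouch state flips.
// handVisible / handClosed are placeholders for the hand-tracking state.
public class CrouchToggleSketch : MonoBehaviour
{
    public Animator avatarAnimator;   // assumed to have a "Crouching" bool parameter
    public bool handVisible;          // is a hand currently being tracked?
    public bool handClosed;           // is that hand closed into a fist?

    bool crouching;
    bool wasVisible;

    void Update()
    {
        // Flip the crouch state on the frame the fist's visibility changes.
        if (handClosed && handVisible != wasVisible)
        {
            crouching = !crouching;
            avatarAnimator.SetBool("Crouching", crouching);
        }
        wasVisible = handVisible;
    }
}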

MartyG
Honored Contributor III

Quick update on our full-body avatar project in Unity 5.1.  

*  The arms now have full human-like capability - lift the lower arms up and down, and swing the arms across the front of the body using shoulder movement.  This has been a long-term goal of ours, and is a lot harder to achieve than you'd think!

*  The entire avatar has been re-scaled to reflect human proportions much more accurately (the previous model had a slender body and over-sized limbs).  It is important that an avatar looks believable to the user if it is a humanoid, and small but odd-looking issues such as too-short upper legs or the hands hanging down too low can break the user's immersion in the experience.

[Image: human.jpg]

We will have a new demo to download soon!

MartyG
Honored Contributor III

I have created a new 12 MB demo specially for this forum of the latest version of my company's RealSense-powered full-body avatar in Unity 5.1.  Among the numerous advancements since the last build we published on this forum is a new approach to movement joints that has removed all Rigidbody physics components, for smoother and more stable RealSense camera control.

http://sambiglyon.org/sites/default/files/RealSense-Avatar-Build-6.zip

See the first post at the top of this page for detailed control instructions.  Or just wave your hands around.  :)
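
For anyone curious about what dropping the Rigidbody physics means in practice, a minimal sketch along these lines (names illustrative, not our actual code) strips or neutralises the physics components on the rig so the bones can be rotated directly from camera data without the physics engine fighting the movement.

using UnityEngine;

// Sketch: remove (or disable) Rigidbody components on an avatar rig so
// that camera-driven transforms are not fought by the physics engine.
public class StripRigidbodiesSketch : MonoBehaviour
{
    public bool destroyInsteadOfKinematic = true;

    void Awake()
    {
        foreach (Rigidbody rb in GetComponentsInChildren<Rigidbody>())
        {
            if (destroyInsteadOfKinematic)
                Destroy(rb);             // remove physics from the bone entirely
            else
                rb.isKinematic = true;   // keep the component but stop physics driving it
        }
    }
}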

Ariska_Hidayat
Beginner

Dear Marty G.

Hi,

It is very cool. I want to try it, but I can't download it. Can you share a new link? :)

thx.
 

MartyG
Honored Contributor III

Hi Ariska!  The demo that you were trying to download is quite old and the avatar has advanced a lot in its capabilities since then.  So I created a brand new demo of the latest avatar build today for this forum.

http://sambiglyon.org/sites/default/files/RealSense-Avatar-Aug-2015.zip

A separate demo highlights our avatar face tech up close.  You can control the facial animation using your eye, head, eyebrow, eyelid and lip movements.

http://sambiglyon.org/sites/default/files/RealSense-Avatar-Aug-2015-Head.zip

Aside from numerous bug fixes and cosmetic enhancements, a new capability in this avatar is the ability to jump up in the air on the spot or whilst walking by jerking your head upwards.  The avatar now also has true realistic gravity: it is held up by its feet pushing down on the ground, and moving the feet up causes the avatar to drop down.  This effect can be seen in the demo, where the avatar bobs up and down as it walks because it pushes its legs away from the ground during each step.
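
A plausible way to implement the "jerk the head upwards" jump in Unity is to watch the vertical speed of the tracked head position and trigger a jump when it spikes.  The sketch below is an illustration under stated assumptions, not our actual code: headY is a placeholder for the head position reported by the camera, the threshold is hand-tuned, and the floor is assumed to be at y = 0.

using UnityEngine;

// Sketch of a head-jerk jump trigger plus a very simple gravity model.
// headY is a placeholder for the tracked head height fed in each frame.
public class HeadJumpSketch : MonoBehaviour
{
    public float headY;                      // tracked head height (placeholder)
    public float jumpSpeedThreshold = 0.6f;  // upward head speed needed to jump (tuned by hand)
    public float jumpVelocity = 4f;

    float lastHeadY;
    float verticalVelocity;
    bool grounded = true;

    void Update()
    {
        // Estimate how quickly the user's head is moving upwards.
        float headSpeed = (headY - lastHeadY) / Time.deltaTime;
        lastHeadY = headY;

        if (grounded && headSpeed > jumpSpeedThreshold)
        {
            verticalVelocity = jumpVelocity;
            grounded = false;
        }

        // Simple gravity on the avatar itself.
        verticalVelocity += Physics.gravity.y * Time.deltaTime;
        transform.Translate(0f, verticalVelocity * Time.deltaTime, 0f, Space.World);

        // Land when the avatar reaches the (assumed) floor at y = 0.
        if (transform.position.y <= 0f)
        {
            transform.position = new Vector3(transform.position.x, 0f, transform.position.z);
            verticalVelocity = 0f;
            grounded = true;
        }
    }
}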

Due to the avatar being on its own in the demo without its game world behind it, there are a couple of unusual animation quirks that are not normally present, such as the top half of the body leaning back further than usual when walking.  Please forgive these!

Summary of the controls:

*  Turn left and right on the spot or whilst walking by turning your head left and right.

*  Lean forwards in your chair to walk forwards.  

As the demo starts with the avatar facing towards the screen, it is best to turn your head left or right to rotate the avatar so its back is towards you before you begin walking it away into the horizon.

*  Activate the arms by pushing the palms of your hands towards the camera, then move your hands up and down and left and right to move the avatar's arms up and down and swing them sideways.

*  Open and close the fingers to open and close the fingers on the avatar.

*  Crouch and un-crouch by closing your hand into a fist and moving it downward.

*  Jerk the head up to jump.

Our latest avatar build also contains some features that we weren't able to stabilize enough to put in this demo (due to the previously mentioned quirks caused by having the avatar on its own in a blank environment), such as interacting solidly with objects, breaking into a run when leaning further forwards, and live control of the legs.  We will highlight these in a future demo.

MartyG
Honored Contributor III

A new September demo of the RealSense-powered full-body avatar has been created.  It contains the very latest advances in our arm control system and bug fixes, and also incorporates the fix for Z-angle locking that I recently published on this forum (see the stickies at the top of the forum).

http://sambiglyon.org/sites/default/files/RealSense-Avatar-Sept-2015.zip

 

Ariska_Hidayat
Beginner

 

Thank you very much, Mr. Marty.

I've been curious to try it. :D

Priyadhars_K_Intel

Hi 

1. Is there a Unity version of the EmotionViewer example? If not, is there any way to use that feature from Unity?

C:\Program Files (x86)\Intel\RSSDK\sample\FF_EmotionViewer

2. As an alternative to Facerig, I think I could use emotion detection and set the character's face to preset expressions based on the emotion value. Is that possible?

3. Have you used facial landmarks rather than the emotion detection?

Regards,

Priya 

MartyG
Honored Contributor III

The emotion recognition component of RealSense, powered by Emotient, has been marked for removal from the SDK in an upcoming version, according to the release notes of the current SDK version.  An explanation for this has not been given, though it is probably to do with the terms of Intel's licensing agreement with Emotient for the emotion software. 

Assuming that the emotion detection cannot be retroactively removed from the SDK (e.g. by a web patch) once the feature is removed from RealSense, you could probably make an emotion-detecting app by sticking with the current R4 version of the SDK for the complete duration of your project.  If you upgraded to a newer SDK during that project then you would probably lose the emotion functionality once it has been removed.

As far as I am aware, there has never been a Unity sample for the emotion detection, though the AR Mirror sample does track the facial landmarks and replicate the expressions on-screen.  That is not the same thing as emotion detection, of course.
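
On the second question, the general idea of mapping a detected emotion to a preset expression could look something like the sketch below in Unity.  Everything here is an assumption for illustration: the emotion label and intensity are placeholders for whatever detection you end up using (the emotion module while it lasts, or your own logic over the facial landmarks), and the blendshape indices depend entirely on your face rig.

using UnityEngine;

// Sketch of mapping a detected emotion to a preset facial expression.
// The emotion value and blendshape indices are placeholders for illustration.
public enum DetectedEmotion { Neutral, Joy, Surprise, Anger }

public class EmotionPresetSketch : MonoBehaviour
{
    public SkinnedMeshRenderer faceMesh;
    public int smileShape = 0;       // assumed blendshape indices
    public int browUpShape = 1;
    public int frownShape = 2;

    // Fed in each frame from whichever emotion/expression source you use.
    public DetectedEmotion emotion;
    public float intensity;          // 0..1

    void LateUpdate()
    {
        float w = intensity * 100f;  // SetBlendShapeWeight expects 0..100
        faceMesh.SetBlendShapeWeight(smileShape, emotion == DetectedEmotion.Joy ? w : 0f);
        faceMesh.SetBlendShapeWeight(browUpShape, emotion == DetectedEmotion.Surprise ? w : 0f);
        faceMesh.SetBlendShapeWeight(frownShape, emotion == DetectedEmotion.Anger ? w : 0f);
    }
}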
