Software Archive
Read-only legacy content

An update on the RealSense-powered Unity game "My Father's Face"

MartyG
Honored Contributor III

Hi everyone,

With a year and a half having passed since my company began development of its RealSense-powered PC game "My Father's Face" in Unity, I thought it would be a good time to give the community an update via 15 gallery images from the project.  The first half of the set showcases the main player character's RealSense-driven facial expressions and arm/hand movements, and the second half provides glimpses of the mountain-top city of Skyline - modeled on US-style cities - that the character inhabits and interacts with via his RealSense-provided controls.

 

[15 gallery images: mff-gallery-preview-1.png through mff-gallery-preview-15.png]

DKryd
Beginner

Nice work. Are you planning on distributing a game package for public testing?

MartyG
Honored Contributor III

Thanks Douglas!  The game will be released as a commercial product, yes, likely on the Steam game service (once we pass their approval process).  

Development has taken a while because of the technical challenges involved in making such a complex full-body avatar intuitively easy to control.  Our approach has always been that if the virtual body's controls mimic how a person moves their real body as closely as possible, the need for tutorials is minimized, because the player uses their virtual body to do something in much the same way as they would use their real-life body to do it.  They therefore barely have to think consciously about the controls, because life experience has already taught them.

Our process has been to design and try out countless experiments, which fail the majority of the time but advance our knowledge of what works and what doesn't a little further each time.  If we hit a difficult problem, we focus on another aspect of development, such as city construction, until an experiment yields a solution to that particular issue.  If the technique can be applied to other areas of the avatar, we also upgrade those parts in order to make the best possible overall avatar for the player.

An example of this is the lean-forward and knee-bending motions, which are controlled by the nose-tip.  There was a time when the body would barely lean forward and could only crouch down a little.  This made it impossible to pick objects up off the floor if they fell from the avatar's hands.  But once we worked out how to adjust the TrackingAction's code to multiply its movement effort by up to 5x, the avatar could easily reach their virtual hands all the way to the floor with a dip of the head, and even kneel down flat.
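To give a rough idea of the principle, here's a minimal Unity C# sketch of amplifying a small nose-tip dip into a full lean and knee bend.  This is just an illustration, not our actual project code: the proxy transform, field names and angle values are assumptions for the example, whereas in the real project the multiplication happens inside the TrackingAction script itself.

```csharp
using UnityEngine;

// Illustrative sketch: a small real-world dip of the nose-tip is amplified
// (up to roughly 5x) and mapped onto the avatar's spine lean and knee bend.
// The "noseProxy" transform is assumed to be moved by a face-tracking action;
// all names and values here are hypothetical, for illustration only.
public class NoseLeanAmplifier : MonoBehaviour
{
    public Transform noseProxy;            // transform driven by face tracking (assumed)
    public Transform spineJoint;           // avatar spine bone to lean forward
    public Transform kneeJoint;            // avatar knee bone to bend
    public float movementMultiplier = 5f;  // the "up to 5x" movement effort boost
    public float maxLeanAngle = 80f;       // degrees of spine lean at full effort
    public float maxKneeAngle = 120f;      // degrees of knee bend at full effort

    private float restNoseHeight;

    void Start()
    {
        // Remember where the nose-tip sits while the player is upright.
        restNoseHeight = noseProxy.localPosition.y;
    }

    void Update()
    {
        // How far the nose has dipped below its rest height, amplified and
        // clamped to 0..1 so the avatar can never over-rotate.
        float dip = Mathf.Max(0f, restNoseHeight - noseProxy.localPosition.y);
        float effort = Mathf.Clamp01(dip * movementMultiplier);

        // Map the amplified dip onto the spine lean and knee bend.
        spineJoint.localRotation = Quaternion.Euler(effort * maxLeanAngle, 0f, 0f);
        kneeJoint.localRotation  = Quaternion.Euler(effort * maxKneeAngle, 0f, 0f);
    }
}
```

The effect is the same as what we do with the TrackingAction: a small dip of the head in the real world becomes a full-body crouch on the avatar, so the player can reach the floor without exaggerated real-world movement.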

We'll keep evolving our RealSense techniques right up until the game is released, and we'll keep sharing them on this forum.

DKryd
Beginner

Will the game work with one particular camera, or with any of them?

MartyG
Honored Contributor III

It uses both hand and face tracking, so it needs the F200 camera.  It'll also likely be compatible with the new SR300 coming in the first quarter of 2016 (the next-generation replacement for the F200).
