Software Archive
Read-only legacy content

Using RealSense in novel ways with "meta" concepts

MartyG
Honored Contributor III
364 Views

Hi everyone,

You may be familiar with the concept of "meta" - replicating an experience inside another experience.  A common example is watching a video recording of yourself watching a video.  :)

Meta can be applied to RealSense too, to create novel applications.  For example, in my company's RealSense game project, a motion-controlled character can pick up and wear an Oculus Rift style augmented-reality super-hero mask in the game world that has front and rear cameras in it.

[Image: squirrelvr.jpg]

An R200 camera on the front of the mask sends a video feed to small screens in front of the eyes inside the mask to give the wearer a view of the environment, with added AR information.  Meanwhile, an F200 camera on the inside of the mask reads the wearer's eye movements and expressions and displays them on digital "eye" mini-screens on the front of the mask, with an eye-color change to help protect the wearer's true identity.

With imagination, you can feed the information picked up by real RealSense camera hardware into "virtual cameras" in a simulated world-environment!

6 Replies
Simon_B_5
Beginner

Hi Marty, I'm interested in your comment here: "an F200 camera on the inside of the mask reads the wearer's eye movements and expressions". We have not dabbled in RealSense as yet, as I'm not sure if it can do what we require. Specifically, I need to know if I can use it to measure the eye movements of a subject - in particular the eyelid movements, i.e. detecting blinks and measuring the profile of a blink. I'd appreciate it if you can point me to any information on this. Cheers!

MartyG
Honored Contributor III

You can measure eyelid movements with no problem.  Blinks are more problematic.  In the original F200 camera at least (the one that I have), I found that they move so fast that the camera cannot really pick them up, because the blink action is finished by the time the camera has registered it.  This is why, in my avatar, the eyelid blink is an automated animation rather than controlled by the camera.  I don't know if the hardware in newer RealSense cameras such as the SR300 is fast enough to detect blinks.
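To see why frame rate matters here, consider a toy simulation (plain Python, not RealSense SDK code; all the numbers are illustrative assumptions): a full blink lasts very roughly 100-300 ms, so at 30 fps only a frame or two can land while the eye is actually shut, leaving little for expression detection to work with.

```python
# Toy illustration of a blink slipping between camera frames.
# We model eyelid closure as a triangular pulse over time and
# sample it at the camera's frame interval.

def eyelid_closure(t_ms, blink_start=40.0, blink_len=100.0):
    """Fraction the eye is closed (0 = open, 1 = shut) at time t_ms."""
    half = blink_len / 2.0
    dt = t_ms - blink_start
    if 0.0 <= dt <= blink_len:
        return 1.0 - abs(dt - half) / half  # rises to 1, falls back to 0
    return 0.0

def frames_seeing_blink(fps, duration_ms=400.0, threshold=0.5):
    """Count sampled frames where the eye looks 'closed enough'."""
    interval = 1000.0 / fps
    t, hits = 0.0, 0
    while t < duration_ms:
        if eyelid_closure(t) > threshold:
            hits += 1
        t += interval
    return hits

print(frames_seeing_blink(30))   # 30 fps: only a frame or two catch it
print(frames_seeing_blink(120))  # 120 fps: several consecutive frames do
```

At higher frame rates, several consecutive frames see the closed eye, which gives a detector something to latch onto; at 30 fps, the handful of closed frames can easily be discarded as noise.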

You can definitely track slower eye movements such as a quarter or half-closing though, and have RealSense do something based on that action.  A recently posted video of my avatar shows the RL eyelid movements being translated into virtual eyelid positions.  The face part starts at 58 seconds in.

https://www.youtube.com/watch?v=HspFO87NHeY

There are other ways to measure tiny motions such as blinks.  When you blink or close your eyes, the eyebrows and the nose-tip make micro-movements.  The camera tends to find it easier to register these movements than to track the eyelids directly.  An example of this is my avatar bending down slightly when closing the eyes, because the waist bending is tied to nose-tip tracking, and the nose in turn is moved by the eye motion.
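The proxy-signal idea can be sketched like this (hypothetical Python, not project or SDK code; the class name, window size, and threshold are all invented for illustration): watch a steadier landmark such as the nose tip and flag a small dip relative to a rolling baseline.

```python
# Hypothetical sketch: inferring an eye-closure event from nose-tip
# micro-movement, which is reported to be easier to track than the
# eyelids themselves.  We flag a small downward displacement of the
# nose-tip landmark relative to a rolling baseline.

from collections import deque

class MicroMovementDetector:
    def __init__(self, window=10, threshold=1.5):
        self.history = deque(maxlen=window)  # recent nose-tip y positions
        self.threshold = threshold           # pixels of dip that count

    def update(self, nose_tip_y):
        """Feed one frame's nose-tip y; return True on a micro-dip."""
        if len(self.history) == self.history.maxlen:
            baseline = sum(self.history) / len(self.history)
            if nose_tip_y - baseline > self.threshold:  # y grows downward
                self.history.append(nose_tip_y)
                return True
        self.history.append(nose_tip_y)
        return False
```

Because the baseline is a rolling average, slow head drift is absorbed while a quick dip of a couple of pixels still fires.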

RBang
New Contributor II

Marty G. wrote:

You can measure eyelid movements with no problem.  Blinks are more problematic.  In the original F200 camera at least (the one that I have), I found that they move so fast that the camera cannot really pick them up, because the blink action is finished by the time the camera has registered it.  This is why, in my avatar, the eyelid blink is an automated animation rather than controlled by the camera.  I don't know if the hardware in newer RealSense cameras such as the SR300 is fast enough to detect blinks.

You can definitely track slower eye movements such as a quarter or half-closing though, and have RealSense do something based on that action.  A recently posted video of my avatar shows the RL eyelid movements being translated into virtual eyelid positions.  The face part starts at 58 seconds in.

https://www.youtube.com/watch?v=HspFO87NHeY

There are other ways to measure tiny motions such as blinks.  When you blink or close your eyes, the eyebrows and the nose-tip make micro-movements.  The camera tends to find it easier to register these movements than to track the eyelids directly.  An example of this is my avatar bending down slightly when closing the eyes, because the waist bending is tied to nose-tip tracking, and the nose in turn is moved by the eye motion.



Amazing stuff Marty! I would highly recommend you try the SR300 instead of the F200, as a lot of improvements were made based on feedback from the legacy device. It might even help with the game: you could assign an action or event to the number or duration of blinks.

MartyG
Honored Contributor III

Thanks for your kind words!  Sadly I have to use the F200 (one of the original models from 2014, still going strong despite constant use!), because I can't afford to upgrade from 2013 / 2014 hardware until the game comes out and hopefully sells well.  

It's a situation that works out well though, because I optimize my Unity C# code so much that if it runs fast on an i3 Haswell 4th-gen CPU and a 2013 graphics card like the PC I develop on, then it should be blazing fast on a modern PC and camera.  It also means that people with lesser hardware, such as laptops, will have a chance to play the game too.

I used to control objects directly with RealSense's 'TrackingAction' script.  These days, though, I use an animation system I wrote myself that enables Unity animation clips to be controlled by a TrackingAction inside a standalone controller object.  If a Unity animation clip has a running length between 0 and 1 (0 is the start of the clip and 1 is the end), the TrackingAction lets the user wind back and forth through the clip in real-time.  Raising the RL arm, for example, moves the animation clip forward, lifting the virtual arm that is animated by the clip.  Dropping the RL arm winds the clip backwards, dropping the animated arm.
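The core of that scrubbing idea is just a normalisation step. A minimal sketch in Python (not the actual Unity C# code; the function name and the raw-value range are illustrative assumptions):

```python
# Illustrative sketch of animation scrubbing: a raw tracking value is
# normalised into the clip's 0..1 range, so raising the real arm winds
# the clip forward and lowering it winds the clip back.

def scrub_position(raw, raw_min, raw_max):
    """Map a raw tracking value onto normalised clip time [0, 1]."""
    if raw_max <= raw_min:
        raise ValueError("raw_max must exceed raw_min")
    t = (raw - raw_min) / (raw_max - raw_min)
    return max(0.0, min(1.0, t))  # clamp so the clip never over-runs
```

The clamp at the end is what keeps an extreme hand movement from driving the clip past its start or end.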

This system gives rock-solid reliability and precision, because it enables you to run error-checking routines on the numbers being generated by the TrackingAction, and it also mostly eliminates flaws associated with direct TrackingAction control (such as objects moving when the RL wrist is turned, or 'flipping out' to reverse orientation because the TrackingAction's angle exceeds 360 degrees during extreme hand movements).
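One simple form such an error-checking routine can take is a per-frame rate limiter that treats implausibly large jumps as glitches. A hypothetical sketch (not the project's actual code; the step size is an invented example, in normalised clip units):

```python
# Sketch of per-frame error checking on tracking values: large jumps
# between consecutive frames are treated as glitches and clamped,
# which is one way a clip-driven rig avoids the "flipping out" seen
# when a directly-driven angle wraps past 360 degrees.

def sanitize(value, previous, max_step=0.1):
    """Clamp a per-frame change; over-large jumps advance by max_step."""
    delta = value - previous
    if abs(delta) > max_step:
        # Glitch: move only max_step in the delta's direction.
        return previous + (max_step if delta > 0 else -max_step)
    return value
```

Feeding each frame's value through such a filter before it drives the animation means a single bad frame nudges the rig slightly instead of snapping it to a reversed orientation.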

There is another video on my company YouTube channel that shows a prototype "thought control" system for virtual legs, where the legs move when the user thinks about moving their RL legs, as the camera detects facial micro-movements that happen when a person thinks about lifting their legs.  

https://www.youtube.com/watch?v=HHzXdLqI8p4

Simon_B_5
Beginner

Rishabh Banga wrote:

Amazing stuff Marty! I would highly recommend you try the SR300 instead of the F200, as a lot of improvements were made based on feedback from the legacy device. It might even help with the game: you could assign an action or event to the number or duration of blinks.

Rishabh, are you able to comment on the SR300's capability to track eyelid movements in real time (at 60 fps)?

I have an application that requires an accurate measure of the profile of a blink in terms of the eyelid movements. I would need to measure (in relative terms, not an absolute measure) the eye opening between some of the markers shown in the attached image - specifically between the markers at the middle of the upper eyelid and the middle of the lower eyelid. This was a screen grab from this demo:

https://www.youtube.com/watch?v=XlUiKE7L6gM

which uses Faceware Live 2, but I assume the RealSense SDK can provide this measure of the eye?

The relative measure could be in pixels or whatever. Do you know how I could get an example of this type of data to process?

Cheers
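For what it's worth, the relative measure described here is tracker-agnostic once you have 2D landmark points: divide the upper-to-lower eyelid gap by a stable reference such as the eye width, so the value is insensitive to the head moving nearer or farther. A generic Python sketch (the point names are hypothetical; no particular RealSense SDK or Faceware API call is assumed):

```python
# Generic relative eye-opening measure from 2D landmarks: the gap
# between the mid-upper and mid-lower eyelid points, normalised by
# the eye width so the measure survives changes in head distance.

import math

def eye_opening(upper, lower, eye_outer, eye_inner):
    """Relative opening: eyelid gap / eye width (dimensionless)."""
    gap = math.dist(upper, lower)          # pixels between eyelid points
    width = math.dist(eye_outer, eye_inner)  # pixels across the eye
    return gap / width if width else 0.0
```

Logging this value per frame over a blink would give the kind of blink profile being asked about, in relative rather than absolute units.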


RBang
New Contributor II

Simon B. wrote:

Rishabh Banga wrote:

Amazing stuff Marty! I would highly recommend you try the SR300 instead of the F200, as a lot of improvements were made based on feedback from the legacy device. It might even help with the game: you could assign an action or event to the number or duration of blinks.

Rishabh, are you able to comment on the SR300's capability to track eyelid movements in real time (at 60 fps)?

I have an application that requires an accurate measure of the profile of a blink in terms of the eyelid movements. I would need to measure (in relative terms, not an absolute measure) the eye opening between some of the markers shown in the attached image - specifically between the markers at the middle of the upper eyelid and the middle of the lower eyelid. This was a screen grab from this demo:

https://www.youtube.com/watch?v=XlUiKE7L6gM

which uses Faceware Live 2, but I assume the RealSense SDK can provide this measure of the eye?

The relative measure could be in pixels or whatever. Do you know how I could get an example of this type of data to process?

Cheers

Hi Simon,

Unfortunately I cannot comment on that, as I have not tried it out myself. But based on the video you shared, I'm fairly sure you should be able to track eyelid movements in real time, though at a somewhat lower frame rate.

I'll try to test it and get back to you on this one soon.
