Software Archive
Read-only legacy content

Eyes-up doesn't work when the chin is tilted or raised. How to fix it?

Mona_J_1

Hi,

I have noticed that the eyes-up algorithm in the Intel RealSense 3D SDK doesn't work when the chin is up or tilted, as shown in the linked photos; in both cases it reports zero. Any idea how that could be fixed?

 

I was wondering if:

1) combining head-pose information could help me, but if that's the case I am not sure which functions would help me do so;

2) using roll, pitch, and yaw in combination with eyes_up from the expression data would work. However, it is really hard (almost impossible) for me to interpret those three values and write if/else or switch statements to decide when the head is tilted or the chin is up. Has similar work been done on this before? Also, do we know the range over which each of roll, pitch, and yaw varies, for example from -10 to 10?
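For what it's worth, idea (2) could be sketched roughly as below. The struct, threshold values, and the gating helper are all my own illustrative assumptions, not SDK code; in the SDK itself the angles would come from the face module's pose query (degrees, with pitch positive for chin up, as far as I know), and the expression query returns an intensity from 0 to 100.

```cpp
#include <cassert>

// Hypothetical struct mirroring yaw/pitch/roll pose angles in degrees.
struct PoseAngles {
    float yaw;   // left-right head turn
    float pitch; // chin up (+) / chin down (-)
    float roll;  // sideways head tilt
};

// Guessed thresholds: treat the expression detector as reliable only
// while the head is close to frontal. Tune these against your own captures.
const float kMaxAbsPitch = 15.0f;
const float kMaxAbsYaw   = 20.0f;
const float kMaxAbsRoll  = 20.0f;

// Combine the raw eyes_up intensity (0-100) with the head pose.
// When the chin is raised beyond the pitch threshold, assume the eyes
// are raised along with the head and report 1 even though the
// expression detector returned 0.
int gatedEyesUp(int eyesUpIntensity, const PoseAngles& pose) {
    bool frontal = pose.pitch > -kMaxAbsPitch && pose.pitch < kMaxAbsPitch
                && pose.yaw   > -kMaxAbsYaw   && pose.yaw   < kMaxAbsYaw
                && pose.roll  > -kMaxAbsRoll  && pose.roll  < kMaxAbsRoll;
    if (frontal)
        return eyesUpIntensity > 50 ? 1 : 0;
    // Non-frontal: fall back to pitch alone for the chin-up case.
    return pose.pitch >= kMaxAbsPitch ? 1 : 0;
}
```

The thresholds would need to be calibrated per user and per camera placement; the ranges of yaw/pitch/roll are exactly what the question is asking about, so treat the numbers above as placeholders.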

 

P.S.: As a side note, I am wondering when you will be releasing your gaze-detection algorithm?

P.P.S.: I know that your eyes_up detection algorithm works up to 1.2 meters away from the camera. I would like to know how the width of the tracking area, centered on the camera, is defined. Is it about an arm's length wide? Twice that? And can I modify this width? How would I do so?

Here are the links to the photos; sorry, the photo upload didn't work. In all of these cases we want the eyes_up algorithm to report 1 instead of 0.

1) http://i.imgur.com/v27IeAD.png

2) http://i.imgur.com/BtKplgV.png

3) http://i.imgur.com/71yMJdF.png

Here's the link to the code (I have slightly modified it, and I used Visual Studio Community 2013):

https://github.com/lamiastella/FaceTracking

 

Thanks a lot,

Mona Jalal

MartyG

I too have had difficulty getting eyes-up and eyes-down to be recognized, as well as head-up and head-down. I find that when I need to track such a head movement, tracking the nose tip, eye center, or chin always works best.

Tracking the eye center will give you the left-right and up-down movement that you seek, and it also captures sideways head tilt. In my Unity project, I use the 'TrackingAction' script with all Position constraints locked and all Rotation constraints unlocked to roll my full-body avatar's eyes in all directions when the real-life head is moved around.
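Outside of Unity, the eye-center idea could be sketched in plain C++ like this. The point struct, the neutral-calibration step, and the dead-zone value are my assumptions for illustration; in practice the current position would come from the SDK's face-landmark data each frame.

```cpp
#include <cassert>

// Illustrative 2D image-space point for a tracked landmark.
struct Point2D { float x; float y; };

// Classify vertical eye movement by comparing the current eye-center
// landmark against a neutral position captured at calibration time.
// Returns +1 for eyes up, -1 for eyes down, 0 for inside the dead zone.
// Note: image y grows downward, so "up" means a smaller y value.
int classifyVerticalEyeMove(const Point2D& neutral,
                            const Point2D& current,
                            float deadZonePx = 5.0f) {
    float dy = current.y - neutral.y;
    if (dy < -deadZonePx) return  1; // moved up on screen
    if (dy >  deadZonePx) return -1; // moved down on screen
    return 0;
}
```

Capturing the neutral position while the user looks straight at the camera, then comparing each frame against it, avoids having to interpret raw roll/pitch/yaw at all for this particular case.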

Regarding setting a center-point distance from the camera: in Unity you can do this by defining an area in front of the camera called the Real World Box. I think you would give the Real World Box X-Y-Z (width, height, and depth) values of about '50' to get your program to respond to the camera only when your body is at this distance from it. I'm not sure how you would define something like a Real World Box in a non-Unity program; I'm sure there are people on this forum who can help with that.
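In a non-Unity program, the Real World Box idea reduces to an axis-aligned bounds check on the tracked point's camera-space position. A minimal sketch, where the box dimensions are placeholders and the units (centimeters) are my assumption:

```cpp
#include <cassert>
#include <cmath>

// Camera-space position of the tracked point, in centimeters,
// with the camera at the origin and z pointing away from the lens.
struct Point3D { float x; float y; float z; };

// Accept input only while the tracked point lies inside a box
// centered on the camera axis: width x height around the axis,
// extending from nearZ to farZ in front of the lens. All default
// sizes are illustrative; e.g. farZ = 120 cm would match the
// 1.2 m limit mentioned for expression tracking.
bool insideRealWorldBox(const Point3D& p,
                        float width = 50.f, float height = 50.f,
                        float nearZ = 20.f, float farZ = 120.f) {
    return std::fabs(p.x) <= width  * 0.5f
        && std::fabs(p.y) <= height * 0.5f
        && p.z >= nearZ && p.z <= farZ;
}
```

Widening the box is then just a matter of passing larger width/height values, which answers the "can I modify this width" part of the question, at least at the application level.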
