Hello,
I'm very interested in the new Intel RealSense SDK gaze tracking capabilities. Is there some documentation about:
- Gaze tracking accuracy?
- How well does it work when the person rotates their head?
- Does it still work well when the user moves further away from the camera?
Or does someone have experience with these points?
Thanks in advance,
Kenneth
My own experiences with eye tracking are solely with the Unity game engine version of RealSense, so my feedback relates to the performance of tracking on that platform.
* Eye tracking is very smooth and fluid. I can easily turn and lift / drop the head of my full-body avatar, and move the eye pupils around in all directions. I can also turn the head whilst leaning forwards in the chair to make my avatar walk, and change the direction of walking by turning the whole avatar body with a left-right head turn without stopping walking.
* Regarding distance, I've only used the short-range F200 desktop camera rather than the longer-range R200 mobile camera, so I can only speak for the behavior of the F200 face-tracking at different distances. My experience is that if you are sitting at the far limit of the tracking range in an armchair then you may have to lean forward a bit in order for the eyes to be detected and tracked, but it's only a small lean-forward that doesn't take much effort.
If you are sitting in an office chair, which is typically closer to the camera than an armchair, then the camera shouldn't have any trouble picking up the gaze. Having said that, if you get *too* close to the camera then it won't be able to see the eyes properly, in the same way that moving the hands too close to the lens can make them unreadable.
Hi Marty,
Any statistics on how much glasses interfere with the eye tracking, especially when there's significant glare from a monitor reflected on them?
I don't wear glasses, so I don't have first-hand experience of how they affect tracking. I did find an Intel blog article that gives an account of the effect of wearing glasses, though.
https://software.intel.com/en-us/blogs/2015/01/19/face-tracking-using-the-intel-realsense-sdk-bkms
The key points:
* "Reliably capturing expression information (e.g., EXPRESSION_MOUTH_OPEN, EXPRESSION_SMILE) can be problematic when the user is wearing eyeglasses"
* "Facial hair, glasses could make emotion detection harder" when detecting facial expressions.
* Landmark detection "Works on faces with/without facial hair and glasses"
Thanks Marty.
The reduced level of emotion detection is less of an issue now than when that blog article was written ... not because it was improved, but because the feature has been deprecated (advance warning of being made obsolete) and will be removed in a future SDK release.