Well, now we know where the emotion detection feature from the RealSense SDK went:
http://www.theverge.com/2016/1/7/10731232/apple-emotient-ai-startup-acquisition
Too bad; it was good stuff. Intel has deep pockets, so hopefully they'll find another partner of equal capability or acquire one.
-- roschler
I developed an alternative method of emotion detection in Unity. It checks the rotation angles of virtual facial objects to determine which emotion each angle represents, and then carries out an action once all of the defined trigger conditions are satisfied.
For example, if the rotation joint in my virtual eyebrow was between 10 and 20 degrees, that meant the eyebrow was in a downward-pointing "angry" pose. The same was true of the bottom lip's rotation object: if it was in that range, it indicated an "angry mouth" expression. If both "angry eyebrow" and "angry lip" were detected at the same time, an event was triggered (the virtual eye pupil texture was swapped from the normal one to a smaller "angry" pupil). When the angles moved back outside the 10-20 degree range, the pupil returned to its normal texture.
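To make the idea more concrete, here is a rough Unity C# sketch of that kind of angle-range check and texture swap. The field names (eyebrowJoint, lowerLipJoint, pupilRenderer, the pupil textures) and the rotation axis are just placeholders for illustration rather than the actual objects in my project, so treat it as an outline:

```csharp
using UnityEngine;

// Sketch only: object names and the axis used are illustrative placeholders.
public class AngryExpressionTrigger : MonoBehaviour
{
    public Transform eyebrowJoint;      // rotation joint driven by the facial rig
    public Transform lowerLipJoint;     // rotation object for the bottom lip
    public Renderer pupilRenderer;      // renderer whose texture gets swapped
    public Texture normalPupilTexture;
    public Texture angryPupilTexture;

    // Angle band (degrees) that counts as the "angry" pose, per the example above.
    public float minAngle = 10f;
    public float maxAngle = 20f;

    private bool isAngry = false;

    void Update()
    {
        bool angryBrow = InAngryRange(eyebrowJoint.localEulerAngles.x);
        bool angryLip  = InAngryRange(lowerLipJoint.localEulerAngles.x);

        // Both trigger conditions must hold at the same time to fire the event.
        if (angryBrow && angryLip && !isAngry)
        {
            pupilRenderer.material.mainTexture = angryPupilTexture;
            isAngry = true;
        }
        // Leaving the angle range restores the normal pupil texture.
        else if (isAngry && !(angryBrow && angryLip))
        {
            pupilRenderer.material.mainTexture = normalPupilTexture;
            isAngry = false;
        }
    }

    private bool InAngryRange(float angle)
    {
        return angle >= minAngle && angle <= maxAngle;
    }
}
```

The same pattern extends to other emotions by defining additional angle bands and trigger conditions.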
I'm not sure how easy it would be to apply this principle to a non-Unity project. But I'll give you a link to the guide that I posted yesterday on this forum, which is part of a much larger article on checking angles and taking actions based on them.
https://software.intel.com/en-us/forums/realsense/topic/606077#comment-1855505
