
Apple buys Emotient - The answer to "where'd the emotion tech go?"

Robert_Oschler
Beginner

Well, now we know where the emotion detection feature from the RealSense SDK went:

http://www.theverge.com/2016/1/7/10731232/apple-emotient-ai-startup-acquisition

Too bad, it was good stuff. Intel has deep pockets, so hopefully they'll find another partner of equal capability or acquire one.

-- roschler

MartyG
Honored Contributor III

I developed an alternative method of emotion detection in Unity. It checks the rotation angle of virtual facial objects to determine which emotion that angle represents, then carries out an action once all of the defined trigger conditions are satisfied.

For example, if the rotation joint in my virtual eyebrow was between 10 and 20 degrees, that meant the eyebrow was in a downward-pointing "angry" pose. The same was true if the bottom lip's rotation object was in that range, indicating an "angry mouth" expression. If both the "angry eyebrow" and the "angry lip" were detected at the same time, an event was triggered (changing the virtual eye pupil texture from the normal one to a smaller "angry" pupil). When the angles moved back outside the 10-20 degree range, the pupil returned to its normal texture.
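
To make that more concrete, here is a rough Unity (C#) sketch of the kind of angle check I mean. The object names, fields, and textures below are placeholder assumptions for illustration, not the actual objects in my project:

using UnityEngine;

// Rough sketch of the angle-based emotion check described above.
// All object and field names are hypothetical placeholders.
public class AngryExpressionTrigger : MonoBehaviour
{
    public Transform eyebrowJoint;      // rotation joint driven by the face tracking
    public Transform lowerLipJoint;     // rotation object for the bottom lip
    public Renderer pupilRenderer;      // renderer whose texture gets swapped
    public Texture normalPupil;
    public Texture angryPupil;

    // The "angry" range quoted above: 10 to 20 degrees.
    const float MinAngryAngle = 10f;
    const float MaxAngryAngle = 20f;

    void Update()
    {
        bool angryBrow = InAngryRange(eyebrowJoint.localEulerAngles.z);
        bool angryLip  = InAngryRange(lowerLipJoint.localEulerAngles.z);

        // Only trigger the event when both conditions are satisfied at once;
        // revert to the normal texture as soon as either angle leaves the range.
        pupilRenderer.material.mainTexture =
            (angryBrow && angryLip) ? angryPupil : normalPupil;
    }

    static bool InAngryRange(float angle)
    {
        return angle >= MinAngryAngle && angle <= MaxAngryAngle;
    }
}

In practice you would read the angle on whichever axis your face-tracking rig actually rotates, and you may need to account for Euler angles wrapping around 360 degrees.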

I'm not sure how easy it would be to apply this principle to a non-Unity project.  But I'll give you a link to the guide that I posted yesterday on this forum, which is part of a much larger article on checking angles and taking actions based on them.

https://software.intel.com/en-us/forums/realsense/topic/606077#comment-1855505
