Software Archive
Read-only legacy content

No more EmotionTracking?

Lucas_M_2
Beginner
1,055 Views

Hey guys,

I'm not able to use the PXCMEmotion class or the EnableEmotion() function as indicated in the documentation:
(https://software.intel.com/sites/landingpage/realsense/camera-sdk/v1.1/documentation/html/manuals_emotion_detection_via_senseman.html)
nor in this video, which I'm also using as a reference:
(https://www.youtube.com/watch?v=-A3pe7wgfS4)

Did they remove this function in the new SDK, or am I doing something wrong?

@EDIT

I've now seen that the algorithm is deprecated. Why is that? Is there any way to make it functional again?

0 Kudos
4 Replies
MartyG
Honored Contributor III
1,056 Views

Emotion detection was a non-Intel component made by a company called Emotient. It was part of the SDK for about a year before being deprecated and then removed. The logical explanation would be that either Intel or Emotient ended the licensing agreement after the first year. As far as I know, there are no plans to replace it at present.

Sometimes features get removed because a licensing deal expires, and sometimes for other reasons. For instance, Intel removed head-tilt tracking from the SDK around version R3 because they felt it didn't work as well as they wanted it to.

I would think the future of the Metaio-powered 3D augmented reality feature is uncertain too, as Metaio was purchased by Apple last summer and immediately announced that it wouldn't be entering into new licensing agreements for its tech (now Apple's tech). I don't know how long Intel's licensing period with Metaio was, but it seems unlikely that the agreement will be renewed once it ends.

Edit: talk about synchronicity! Within an hour of my writing this comment, the news broke that Apple has purchased Emotient too!

http://techcrunch.com/2016/01/07/apple-dives-deeper-into-artificial-intelligence-by-acquiring-emotient/

0 Kudos
Lucas_M_2
Beginner
1,055 Views

Well, I'm disappointed. I hope we see something similar to EmotionTracking in RealSense's future; it looked like a fun and promising technology to work with. I guess I'll just have to rethink my project to fit RealSense's capabilities.
Thank you for your feedback and time, Marty. It was really helpful. I didn't know some of the features were third-party!

0 Kudos
MartyG
Honored Contributor III
1,055 Views

You're very welcome, Lucas!

I've suggested in the past, to other people who wanted to detect emotions, that they could use Unity to set up invisible trigger objects that virtual representations of facial features (e.g. eyebrows) pass through.

For example, if you placed a trigger below the inside edge of a virtual eyebrow, then when the eyebrow flexed downwards and passed through that trigger, it would indicate an "angry" emotion: the virtual eyebrow couldn't have reached the trigger's position unless the user's real eyebrows (and hence the virtual eyebrow) were expressing that emotion. Likewise, putting a trigger above the virtual eyebrow would indicate that the user was happy, because their eyebrows would not likely have raised upwards otherwise.

The AR Mirror Unity sample that comes with the SDK provides an excellent visual representation of how such a system could work.
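
As a rough sketch of the trigger idea, something like the script below could sit on the invisible trigger object. It assumes your face-tracking rig moves a GameObject tagged "Eyebrow" to follow the user's real eyebrow; the class name and tag are just illustrative, not from the SDK.

```csharp
using UnityEngine;

// Rough sketch only. Attach to an invisible collider (with "Is Trigger"
// enabled) placed below the inside edge of the virtual eyebrow. Assumes
// a face-tracking rig that moves a GameObject tagged "Eyebrow"; one of
// the two objects also needs a Rigidbody for trigger events to fire.
public class AngryEyebrowTrigger : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // The virtual eyebrow can only reach this trigger when the real
        // eyebrow flexes downwards, so treat contact as an "angry" signal.
        if (other.CompareTag("Eyebrow"))
        {
            Debug.Log("Emotion detected: angry (eyebrow lowered)");
        }
    }
}
```

A second trigger above the eyebrow, logging "happy" instead, would cover the raised case in the same way.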

Of course, if you wanted to simply write your application in C# or C++ without having to bother with learning Unity, the above solution may be a bit of a hassle for you!

0 Kudos
MartyG
Honored Contributor III
1,055 Views

I had an idea for how you could detect emotions without using Unity. You could think about the emotions in terms of the angles of facial landmarks. So if your project was set up to increase or decrease the angle value of an eyebrow landmark as the user raises or lowers their real-life (RL) eyebrows, then you could program it to register that the user is angry if the angle moves about 70 degrees in the downwards direction (indicating that the angle must have been moved to that rotation by the RL eyebrow frowning downwards).

Likewise, it could register as "happy" if the landmark's angle goes in the upwards direction, indicating that the eyebrow has lifted upwards because the user is smiling.

The same "down is cross, up is happy" principle could also be applied to the sides of the bottom lip.

The starting angle of each detected landmark could be considered to be a "neutral" expression.
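
Here's a rough sketch of that logic in C#. How you read the landmark's angle each frame depends on your own face-tracking setup, and the thresholds (70 degrees down for angry, a smaller lift for happy) are illustrative, not from any real API.

```csharp
// Rough sketch only: classifies an emotion from an eyebrow landmark's
// angle, measured relative to the angle captured at startup ("neutral").
public enum Emotion { Neutral, Angry, Happy }

public class EyebrowAngleClassifier
{
    private readonly float neutralAngle;   // starting angle = neutral expression
    private const float AngryDelta = 70f;  // degrees rotated downwards
    private const float HappyDelta = 20f;  // degrees rotated upwards (illustrative)

    public EyebrowAngleClassifier(float startingAngle)
    {
        // Treat the first detected angle as the neutral expression.
        neutralAngle = startingAngle;
    }

    public Emotion Classify(float currentAngle)
    {
        float delta = currentAngle - neutralAngle;
        if (delta <= -AngryDelta) return Emotion.Angry;  // eyebrow frowned down
        if (delta >= HappyDelta) return Emotion.Happy;   // eyebrow lifted up
        return Emotion.Neutral;
    }
}
```

The same classifier could be reused for the sides of the bottom lip, each landmark with its own thresholds.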

0 Kudos