In my own project, I use the Unity game engine with RealSense to detect emotions based on the angle of rotational joints in objects representing facial parts. The objects are controlled with facial tracking scripts supplied with the RealSense SDK, such as 'TrackingAction'.
For example, suppose a rotating object represents an eyebrow and is tied to the movement of the user's real-life eyebrow. If the object rotates downward past a certain angle, my project recognizes an "angry" expression, since a downward-pointing angle on a real-life eyebrow represents anger on a human face. If the object rotates upwards, representing an upward flex of the eyebrow, that is recognized as a "happy" emotion.
The same principle can be applied to the lips of the mouth, as the corners rotate diagonally upwards when happy and diagonally downwards when angry.
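The angle-threshold logic above could be sketched roughly as follows. This is an illustration only, not code from my project or the RealSense SDK: the angle convention and the threshold values are assumptions, and in Unity you would read the angle from the object driven by 'TrackingAction' rather than pass it in directly.

```python
# Hypothetical sketch of mapping a facial object's rotation angle to an
# emotion label. Thresholds and sign convention are made up for illustration.

def classify_brow(angle_degrees: float) -> str:
    """Map a brow object's rotation to a coarse emotion label.

    Convention assumed here: positive angles mean the brow object has
    rotated upward, negative angles mean it has rotated downward.
    """
    ANGRY_THRESHOLD = -10.0   # downward rotation past this reads as anger
    HAPPY_THRESHOLD = 10.0    # upward rotation past this reads as happiness

    if angle_degrees <= ANGRY_THRESHOLD:
        return "angry"
    if angle_degrees >= HAPPY_THRESHOLD:
        return "happy"
    return "neutral"
```

The same threshold pattern would apply to the mouth corners, just with its own angle ranges per emotion.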
Thanks for the reply. That's a nice suggestion for calculating emotion.
I was exploring further and came across the PXCMEmotion class. The latest SDK does not have this. Is there any substitute for this class in the latest SDK, or anything available whose functionality closely matches it?
The emotion component was provided by a third-party company called Emotient, which was recently purchased by Apple. When Apple buys a company, it usually stops issuing new licences for that company's technology or renewing existing licensing agreements. This may be why the Emotient emotion component was removed from the SDK.
As far as I know, there is no replacement emotion detection component in the SDK, and the top Google results about RealSense emotion detection now lead to broken links. I suspect your best chance is to do what I did: think of ways to adapt the remaining features to simulate emotion.
For example, a non-Unity way to gauge emotion might be to use the infra-red function to read the blood-flow under the skin of the user's face (represented by different shades of monochrome color) and assign emotions to the different shades. For example, a deeper shade could indicate a state of happiness / excitement due to the face becoming flush, whilst a lighter shade could indicate upset or shock (as in "the blood drained from their face in shock" or "they went pale with fright").
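The shade-to-emotion mapping described above might be sketched like this. To be clear, this is a hypothetical illustration: the intensity thresholds are guesses, and you would first need to average the IR pixel values over the face region yourself, which is not shown here.

```python
# Hypothetical sketch: map the mean monochrome IR intensity of the face
# region to a coarse emotion label. The 0-255 scale and both thresholds
# are assumptions for illustration, not SDK values.

def classify_by_ir_shade(mean_intensity: float) -> str:
    """Deeper (darker) shades suggest more blood under the skin
    (flushed = happy/excited); lighter shades suggest the face has
    gone pale (upset/shocked)."""
    DEEP_SHADE = 90.0    # at or below this: face reads as flushed
    LIGHT_SHADE = 170.0  # at or above this: face reads as pale

    if mean_intensity <= DEEP_SHADE:
        return "happy/excited"
    if mean_intensity >= LIGHT_SHADE:
        return "upset/shocked"
    return "neutral"
```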
There's precedent for the camera being used to read blood flow with IR. One of the earliest demonstrated applications for RealSense in 2014 was as a heart rate sensor, with the heart rate being calculated based on how much blood was under the skin on the area of the body that the camera was looking at.
Some googling of the technique turned up a blog article from last summer about it. Apparently, the official term for heart sensing is 'pulse estimation'. The article suggested that one way of sensing a pulse without needing a high camera resolution is to examine colour changes and count each change as one pulse. By counting that way, you could get a 'pulses per minute' value.
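The counting idea from the article could be sketched naively like this. It is only an illustration of "count each colour change as one pulse": the threshold value is an assumption, and you would supply per-frame mean skin intensities captured from the camera yourself.

```python
# Naive sketch of pulse estimation by counting intensity changes.
# 'intensities' is a list of per-frame mean skin-region intensity samples;
# 'fps' is the camera frame rate; 'threshold' (an assumed value) is the
# minimum frame-to-frame rise counted as one pulse.

def estimate_pulse_per_minute(intensities, fps, threshold=2.0):
    pulses = 0
    for prev, cur in zip(intensities, intensities[1:]):
        if cur - prev >= threshold:  # count each sufficient rise as a pulse
            pulses += 1
    seconds = len(intensities) / fps
    return pulses * 60.0 / seconds   # scale the count to pulses per minute
```

A real implementation would need smoothing and a proper peak detector, but this shows the basic counting-and-scaling arithmetic.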
The article also mentions other ways of estimating blood flow using IR, such as reading the veins.
More information on Pulse Estimation is in the current SDK 2016 R1 documentation: https://software.intel.com/sites/landingpage/realsense/camera-sdk/v1.1/documentation/html/index.html?doc_face_pulse_estimation_data.html