Software Archive
Read-only legacy content

RealSense and detecting physiological changes

MartyG
Honored Contributor III
682 Views

Hi everyone,

I'd like to offer another suggestion for a way in which the RealSense camera could be used, this time with the infra-red (IR) function.

In human physiology, the mind leads and the body follows.  If a person strongly believes something, it can manifest physically in their body in various ways.  One of the simplest examples is the facial blush that accompanies a powerful emotional memory.

The same goes for believing that you have been physically struck on a part of your body.  Even if you are just playing a videogame against an avatar, if you identify with the character you are controlling as though you yourself were in the game, then seeing that character get punched in the face, or blocking a blow with the character's hand, can cause your real-life body to instinctively flinch, and perhaps even to feel a stimulus on the equivalent part of your real-life body.

I first became truly aware of the phenomenon some years ago while watching high-energy cartoon fights such as those in 'Dragon Ball Z'.  I noticed that when I switched my thoughts off and let myself be immersed in what was happening on-screen, my body generated responses to the blows the bad guy struck on the golden-haired hero Goku, because I identified with that character.

https://www.youtube.com/watch?v=5MNtMCBGMz4

If the mind believes that the body has been impacted in a particular place, it is likely to dispatch resources such as additional blood flow to that area, to compensate for the bruise or injury it expects to find there because it recognizes the stimulus.  That extra blood flow might show up in the IR stream of the RealSense camera as a region that is brighter than the surrounding flesh.

Developers could program their applications to respond when such brighter areas are detected.  For example, the application could register the detection as a hit by the opponent and decrease the health points of the player's avatar.
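As a rough, hypothetical sketch of what that detection might look like: the snippet below assumes you already have two 8-bit IR frames as NumPy arrays (a calm "baseline" frame and a current frame). The function name, the `delta` threshold, and the synthetic frame data are all illustrative inventions for this post, not part of any RealSense SDK.

```python
import numpy as np

def find_flush_regions(ir_frame, baseline, delta=12, min_pixels=25):
    """Flag pixels noticeably brighter than in a baseline IR frame.

    ir_frame, baseline: 2-D uint8 arrays (e.g. 480x640 IR images).
    delta: how many gray levels brighter a pixel must be to count.
    Returns (mask, bbox): a boolean mask of brightened pixels and the
    bounding box (x0, y0, x1, y1) of those pixels, or None if fewer
    than min_pixels changed.
    """
    diff = ir_frame.astype(np.int16) - baseline.astype(np.int16)
    mask = diff > delta
    if mask.sum() < min_pixels:
        return mask, None
    ys, xs = np.nonzero(mask)
    bbox = (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))
    return mask, bbox

# Toy example: a synthetic "IR" frame with one brightened patch.
baseline = np.full((480, 640), 100, dtype=np.uint8)
frame = baseline.copy()
frame[200:240, 300:360] += 30      # simulated flush/bright area
mask, bbox = find_flush_regions(frame, baseline)
print(bbox)  # (300, 200, 359, 239)
```

A real application would presumably restrict the mask to the face or body region reported by the tracking SDK and smooth the result over several frames before registering a "hit".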

If you want to read more about this idea, I wrote an article for the Virtual Education Journal last year about harnessing dormant student potential by combining virtual reality with imaginative play.

http://sambiglyon.org/sites/default/files/NLP.pdf

5 Replies
samontab
Valued Contributor II

Hi Marty,

It certainly looks interesting. It reminds me of the heartbeat detection that somebody created some time ago using a camera pointed at a finger.

That said, I don't think the current IR resolution (640x480) is high enough to capture the required level of detail. Or maybe it is enough...I guess there's only one way to find out :)
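For what it's worth, the finger-on-camera trick usually works from the overall brightness of the frame rather than fine spatial detail, so frame rate may matter more than resolution. A minimal sketch of the idea, using synthetic data and no camera at all (`estimate_bpm` is a made-up name for this post, not an SDK call):

```python
import numpy as np

def estimate_bpm(mean_brightness, fps):
    """Estimate pulse rate (beats/min) from per-frame mean brightness.

    mean_brightness: 1-D array, the average pixel value of each frame
    (e.g. of a fingertip filling the IR image).
    fps: camera frame rate in frames per second.
    """
    signal = np.asarray(mean_brightness, dtype=float)
    signal -= signal.mean()                     # drop the DC level
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Only consider the plausible human pulse band, 40-180 bpm.
    band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Toy example: synthesise 10 s of a 72 bpm pulse sampled at 30 fps.
fps, bpm = 30, 72
t = np.arange(10 * fps) / fps
trace = 100 + 2 * np.sin(2 * np.pi * (bpm / 60) * t)
print(round(estimate_bpm(trace, fps)))  # 72
```

With a real camera the brightness trace would be noisy, so you would band-pass filter it and average over a longer window before trusting the peak.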

MartyG
Honored Contributor III

I haven't personally tried the IR function yet.  I just know of it from when Intel suggested, back in May 2014, that it could be used for heart-rate detection.

I suppose that one could give an avatar a virtual beating heart that circulates virtual blood around the avatar's body as a fluid, and set the heart's pumping rate (and hence the speed and pressure of the internal blood circulation) by scanning the player's real-life heart rate.

There's just no end of possibilities with this camera, eh?  ;)

samontab
Valued Contributor II

Yes, it is a really convenient camera.

I wish I could get access to an embeddable version of it. I have a couple of ideas for it.

For the moment, I want to play around with comparing a custom-generated stereo 3D output, using the color and IR cameras as the source, against the active IR depth.
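A toy illustration of the passive-stereo side of that comparison: one rectified scanline matched with sum-of-absolute-differences block matching, in pure NumPy. A real pipeline would rectify the image pair first and use something like OpenCV's StereoBM; the function name and parameters here are made up for the sketch.

```python
import numpy as np

def sad_disparity_row(left_row, right_row, window=5, max_disp=16):
    """Per-pixel disparity along one rectified scanline via SAD matching."""
    half = window // 2
    n = len(left_row)
    disp = np.zeros(n, dtype=int)
    for x in range(half + max_disp, n - half):
        patch = left_row[x - half:x + half + 1].astype(int)
        costs = [
            np.abs(patch - right_row[x - d - half:x - d + half + 1].astype(int)).sum()
            for d in range(max_disp)
        ]
        disp[x] = int(np.argmin(costs))
    return disp

# Toy example: the "right" view is the "left" view shifted by 4 pixels,
# so the recovered disparity should be 4 everywhere it is computed.
left = np.arange(64, dtype=np.uint8)
right = np.roll(left, -4)
disp = sad_disparity_row(left, right)
print(disp[30])  # 4
```

Matching across a color camera and an IR camera compounds the usual problems, since the two modalities can see the same surface with very different intensities.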

Philip_N_
Beginner

You would have to be careful to do correct feature matching if you are planning to get 3D data from the colour + IR camera pair, as differences in the texture/colour/IR reflectance of the surfaces whose features you are trying to match will distort your 3D model. It would also be very interesting to do a proper calibration of the depth data, because on first impression it seems dependent on background illumination. For example, if you move your hand towards the camera, the depth values of your face decrease. This behaviour is odd considering that the structured-light pattern should remain unaffected by foreground objects...

MartyG
Honored Contributor III

I found that holding my hand in front of my chest often caused tracking of facial features such as the chin and lips to be adversely affected.  However, I always had the camera placed on my desk rather than hanging from the top of my monitor, so it was probably more the fault of my fear of knocking the camera flying than of the camera itself, since the camera could likely have seen past my hand if it had been positioned higher.  :)
