I am a newbie to RealSense and I would like to know if the facial tracking and analysis technology can be used on smartphones (both Android and iOS).
It seems the RealSense SDK works only with the designated cameras, but what if I am developing apps for a mobile platform? Can RealSense technology support that as well?
Looking forward to the reply. Thanks in advance.
The new RealSense smartphone developer kit might be just what you need. The kit is an Android smartphone with a RealSense ZR-300 camera built into it.
The ZR-300 has six sensors, including an R200 depth camera, an accelerometer, a gyroscope, an 8MP rear-facing camera, and a 2MP front-facing camera.
There isn't an official solution for iOS yet, though there is an open-source camera-capture library called librealsense that works on Mac OS X and Linux.
If you mean a normal phone without a RealSense camera inside it ... I guess it may be theoretically possible. After all, one can make applications that use an ordinary PC webcam. If you are planning on using live camera input, though, you would usually be limited to features that an ordinary webcam can support, such as audio processing and RGB video like that in Skype messaging. Features that need the dedicated depth hardware inside the RealSense camera, such as hand-joint tracking and face-landmark tracking, would normally not be possible with a non-RealSense camera.
Having said that, there are ways you can get around the absence of the dedicated hardware if you think imaginatively. For example, you can emulate the RealSense camera's inputs by using pre-made recordings instead of live camera input. Forum member Samontab also suggested a way to emulate the depth scanning function of the camera.
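The recording idea above can be sketched in plain Python: the application reads frames through one small interface, so a replayed recording can stand in for a live camera. All names here (`RecordedSource`, `nearest_point_mm`, the fake depth values) are illustrative, not part of any RealSense SDK; a real app would use the SDK's own record-and-playback facilities in place of this stub.

```python
# Minimal sketch of emulating a depth camera with a pre-made recording.
# Class and function names are illustrative, not a real RealSense API.

class RecordedSource:
    """Replays saved depth frames as if they came from a live depth camera."""
    def __init__(self, frames):
        self._frames = list(frames)  # e.g. frames loaded from a recording file

    def read(self):
        """Return the next frame, or None once the recording is exhausted."""
        return self._frames.pop(0) if self._frames else None

def nearest_point_mm(source):
    """Toy 'analysis' step: the closest depth value seen across all frames."""
    nearest = None
    while (frame := source.read()) is not None:
        for depth_mm in frame:
            if nearest is None or depth_mm < nearest:
                nearest = depth_mm
    return nearest

# A fake two-frame 'recording' of depth readings in millimetres.
recording = [[812, 640, 955], [700, 633, 801]]
print(nearest_point_mm(RecordedSource(recording)))  # prints 633
```

Because the analysis code only calls `read()`, swapping the recorded source for a live one later requires no changes to the processing logic.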