I looked through the forum, and I don't think this question was ever asked. I wanted to know if there was an emulator of the sensor that can be used for development. I'm working in a group that already has the F200 we can use, but I just wanted advice on how we can develop and test it individually without necessarily having the sensor at that time.
If it were just a matter of emulating the contents of a chip, like emulating a videogame console, then it might be within the realms of possibility. But the RealSense camera isn't an ordinary webcam. It relies on specialized laser-based sensing hardware, such as its depth and infrared cameras, for much of its data collection. You couldn't emulate that, at least not in real time.
I have noticed recently on this forum that some people have been looking at creating applications that process pre-made visual content such as pictures and videos. So I suppose it's within the realms of possibility that you could create a "test mode" in your application that looks at a pre-recorded video of a user instead of a live user whilst the camera isn't connected. You shouldn't need to write your own emulator for that, as the existing SDK software ought to be equipped to handle that offline processing already.
Processing loaded-in content rather than a live human isn't something I have experience with myself, but I'm sure there are folks on this forum who can contribute some well-informed advice.
You could use any normal camera, convert the color stream to grayscale, threshold it at some brightness level, set the lower values to 0, and treat the remaining values as distance in mm. Easy.
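As a rough sketch of that idea (assuming numpy and an 8-bit grayscale frame; the function name and the threshold value are my own choices, not anything from the RealSense SDK):

```python
import numpy as np

def fake_depth_from_gray(gray, threshold=40):
    """Turn an 8-bit grayscale frame into a fake depth map.

    Pixels darker than `threshold` become 0 (i.e. "no data");
    everything else is kept and reinterpreted as distance in mm.
    A crude stand-in for a real F200 depth stream, nothing more.
    """
    depth = gray.astype(np.uint16)   # real depth streams are typically 16-bit
    depth[gray < threshold] = 0      # zero out the darkest pixels
    return depth

# Tiny synthetic 2x3 "frame" standing in for a grayscale camera capture.
frame = np.array([[10, 50, 200],
                  [30, 40, 255]], dtype=np.uint8)
print(fake_depth_from_gray(frame))
# → [[  0  50 200]
#    [  0  40 255]]
```

Obviously the "depth" you get this way bears no relation to real geometry, but it can be enough to exercise the parts of your pipeline that just consume a 16-bit depth image.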
The SDK contains stream record/playback functionality specifically for such use cases. You can record a clip with the required image sequence and then use the playback instead of a real camera device. All SDK functionality is expected to stay the same (the only logical limitation is that you cannot change device parameters such as laser power, etc.).