I am currently trying to understand how HW sync works with Intel D435 cameras... I have read a lot of topics, but none of them fully answers my questions...
Let me explain my expectations a little more: I would like to synchronize 2 (or 3, but let's start simple) RS D435 cameras. I've connected them (through pins 5 and 9) and enabled HW sync with the realsense-viewer tool (on Ubuntu 16.04, with all RealSense-related packages installed properly).
But my real question is: how can I check that HW sync is working correctly? I mean, is there a small piece of code that could help me check that? I've tried filming a timer on my computer's screen, and it seems correct, but nothing assures me that HW sync is really working.
Thanks for any answer you can provide, and apologies in advance if this post is redundant with another one.
The multiple-camera white paper document states that if hardware sync is working correctly, you should see the timestamps of the cameras drift apart over time. If hardware sync is not working, the offset between the timestamps will remain constant, because each individual ASIC is counting the exact same number of cycles between frames.
So you can validate whether the sync is working by observing whether there is drift. As the white paper says, this is counter-intuitive to what you would expect from sync. According to the white paper, the drift can be expected to be less than 1 ms per minute.
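To make that check concrete, here is a minimal, hypothetical sketch of the drift test itself. It operates on timestamp samples (in milliseconds) that you would collect from paired frames of the two cameras; the function name and the sampling scheme are my own assumptions, not something from the white paper:

```python
def timestamp_drift(master_ts, slave_ts):
    """Given paired timestamp samples from the two cameras (one value
    per frame pair, in milliseconds), return how much the inter-camera
    offset changed between the first and the last sample.

    A steadily growing offset suggests the slave is following the
    master's trigger (HW sync active); a constant offset suggests the
    two cameras are free-running on identical internal clocks.
    """
    first_offset = slave_ts[0] - master_ts[0]
    last_offset = slave_ts[-1] - master_ts[-1]
    return abs(last_offset - first_offset)

# Example: the offset grows from 5.0 ms to 6.0 ms across the samples,
# so the drift is 1.0 ms, consistent with HW sync being active.
drift = timestamp_drift([0.0, 1000.0, 2000.0], [5.0, 1005.5, 2006.0])
```

You could then compare the measured drift against the white paper's rough rate of less than 1 ms per minute of streaming.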
Many thanks for your answers! Indeed, I didn't pay enough attention to this section of the white paper...
Do you have any idea how I could implement this "HW sync check" in a Python/C++ script? Maybe a test with the 'Image.timeStamp' property?
In the link below, Dorodnic, the RealSense SDK Manager, has advice for hardware sync in Python, including scripts for setting the cameras as Master and Slave.
I could not find a clear way to validate the sync using code. Visual observation of the timestamps seems to be the recommended approach.
Thanks for the link!
However, I still don't understand how I can check visually, with the Intel RealSense Viewer, whether HW sync is OK or not... I followed the instructions explained in this video, but whether I enable HW sync or not (or plug pin 5 or not), I can't interpret what's happening with the timestamps... Maybe I'm missing something.
For example, here are the timestamps of my 2 cameras with HW sync enabled:
And here are the timestamps with HW sync NOT enabled:
How can I deduce anything from that?
Many thanks for your help.
The timestamps on your images seem to be completely unaligned. What is hard to see on the YouTube video is that in the Inter Cam Sync section for each camera's settings in the RealSense Viewer, the Master camera should be set to a value of '1' and the other camera (the Slave) should be set to a value of '2' in this section. This will clearly define which camera is the Master controlling the sync and which camera is the Slave that follows the Master's timing.
Indeed, I retried it and, now, the timestamps seem more aligned. Does the order in which I set the parameters matter? I mean, should I set the master camera's parameter to 1, then start its Stereo module, then set the slave camera's parameter to 2, then start its module, or does the order not matter?
Also, even if the timestamps seem equal now, if I let the RealSense Viewer run for a while, I should observe the timestamps drifting, right?
(Master camera is on left, Slave on right)
I would advise setting the master / slave status in the Inter Cam Sync option before you start streaming either of them. That way, you will ensure that the streams are synced from the beginning. This is especially important if you are doing a recording of the synced streams.
Yes, if hardware sync is functioning correctly, the timestamps should drift apart over time, by about 1 ms per minute.
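To watch that drift over time outside the Viewer, a sketch like the following could log the inter-camera timestamp offset periodically. This assumes pyrealsense2 and two connected cameras already configured as master and slave; the resolution, frame rate, sampling interval, and function names are my own assumptions:

```python
import time

def format_offset(ts_master, ts_slave):
    """Human-readable slave-minus-master offset (timestamps in ms)."""
    return f"offset: {ts_slave - ts_master:+.3f} ms"

def watch_offset(duration_s=600, sample_every_s=30):
    """Print the inter-camera timestamp offset every few seconds.

    If HW sync is active, the printed offset should slowly change
    (roughly 1 ms per minute of streaming); if the cameras are
    free-running, it should stay essentially constant."""
    import pyrealsense2 as rs  # imported here; this part needs hardware
    ctx = rs.context()
    pipes = []
    for dev in list(ctx.query_devices())[:2]:
        cfg = rs.config()
        cfg.enable_device(dev.get_info(rs.camera_info.serial_number))
        cfg.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
        pipe = rs.pipeline(ctx)
        pipe.start(cfg)
        pipes.append(pipe)
    end = time.time() + duration_s
    while time.time() < end:
        ts = [p.wait_for_frames().get_timestamp() for p in pipes]
        print(format_offset(ts[0], ts[1]))
        time.sleep(sample_every_s)
    for p in pipes:
        p.stop()
```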
Many thanks for your precise answers @MartyG! It really helped me.
One last thing: do timestamps have a unit? When Googling this, I found that the timestamp unit is 100 ns. Does that mean that, for example, a timestamp of 390632.6 means the real timestamp is 390632.6 ns?
You are very welcome! I'm pleased I could help.
Usually with RealSense camera sync, the unit used is microseconds (µs).
I believe the 100 ns figure refers to an ideal target for latency reduction that the RealSense development team aim for. In practice, they achieve around 60-70 ms latency on the depth stream.
I re-checked the white paper's section about timestamps and drift. It says "[the cameras] will inevitably drift a few milliseconds over the course of 10s of minutes".
That would suggest to me that the drift is being measured in microseconds if the rough rate of drift is "around 1 ms per minute". Otherwise, if it were in milliseconds, the drift would be a lot more than a few milliseconds over the course of tens of minutes. I could not find any references that explicitly confirm this to be the case, though. A member of Intel may be able to provide confirmation (I don't work for Intel).