Hello,
I saw on this link that Intel has developed a tool to make films by synchronising four RealSense cameras.
https://realsense.intel.com/intel-realsense-volumetric-capture/ Volumetric Capture @ Sundance using Intel RealSense Depth Cameras
I also need to do real-time camera synchronisation.
Is it possible to have access to this tool? Or could we have the source code of it?
Thank you.
Syncing with D435 is easier now than it was at the time of that demo in January 2018, as hardware sync was added for D435 in firmware version 5.10.3. This should make the use of a commercial flash as a trigger, as Intel did at the Sundance demo, unnecessary.
You can sync the cameras using the support in firmware 5.10.3. You will have to align the individual point clouds yourself though, perhaps by using one of the methods (Vicalib or an affine transform) I described in my reply to your GitHub post.
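In case it is useful, here is a minimal pyrealsense2 sketch of how the hardware sync mode could be set from a script once the new firmware is installed. The serial number and the master/slave assignment are placeholders of mine, so treat it as a starting point rather than a tested multi-camera setup:

```
# Sketch: set one camera as sync master and the rest as slaves.
# MASTER_SERIAL is a placeholder - replace it with your own camera's serial.
import pyrealsense2 as rs

MASTER_SERIAL = "012345678901"  # placeholder serial number

ctx = rs.context()
for dev in ctx.query_devices():
    serial = dev.get_info(rs.camera_info.serial_number)
    sensor = dev.first_depth_sensor()
    if not sensor.supports(rs.option.inter_cam_sync_mode):
        continue  # firmware too old, or the option is not exposed on this model
    # 1 = master (generates the sync signal), 2 = slave (listens for it)
    mode = 1 if serial == MASTER_SERIAL else 2
    sensor.set_option(rs.option.inter_cam_sync_mode, mode)
```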
Point clouds can also be combined using the software Open3D. Intel are introducing an Open3D wrapper into the SDK soon for easier integration with it.
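To give an idea of what combining point clouds in Open3D looks like, here is a rough sketch of my own, assuming the clouds have already been exported to PLY files and that you have a calibration transform between the two cameras (the file names and the identity transform below are placeholders, and some method names vary between Open3D versions):

```
# Sketch: merge two saved point clouds after transforming one into the
# other camera's coordinate frame. The transform would come from your own
# calibration (e.g. Vicalib or a measured affine transform).
import numpy as np
import open3d as o3d

pcd1 = o3d.io.read_point_cloud("cloud_cam1.ply")  # placeholder file names
pcd2 = o3d.io.read_point_cloud("cloud_cam2.ply")

# Placeholder: 4x4 transform mapping camera 2's coordinates into camera 1's frame
T_cam2_to_cam1 = np.eye(4)
pcd2.transform(T_cam2_to_cam1)

combined = pcd1 + pcd2
combined = combined.voxel_down_sample(voxel_size=0.005)  # name differs in older Open3D releases
o3d.visualization.draw_geometries([combined])
```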
The first step is to download the firmware file. Then you use a firmware updater tool to install the file on the cameras. The tool is available in Windows and Linux versions.
You can get the firmware file, the updater tool and documentation from the link below.
https://downloadcenter.intel.com/download/28237/Latest-Firmware-for-Intel-RealSense-D400-Product-Family Download Latest Firmware for Intel® RealSense™ D400 Product Family
Once a D435 camera has firmware 5.10.3 or newer, it can do hardware sync. Without a firmware version that has sync support, the camera does not know how to sync.
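If you want to confirm from a script which firmware each connected camera is running, a quick pyrealsense2 check like the one below should do it (this is just my own snippet, not part of the updater tool):

```
# Sketch: print the name and firmware version of every connected RealSense camera,
# to confirm that 5.10.3 or newer is installed.
import pyrealsense2 as rs

ctx = rs.context()
for dev in ctx.query_devices():
    name = dev.get_info(rs.camera_info.name)
    fw = dev.get_info(rs.camera_info.firmware_version)
    print(name, "firmware:", fw)
```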
Once you have the firmware installed in both cameras, you will be able to test sync in the RealSense Viewer software using the 'sync' icon at the top of its options panel.
It looks good. I cannot test the process any further though, as I only have one 400 Series camera. Is sync available when the cameras are not streaming?
If not, unfortunately you will have to wait for advice from an Intel support team member (I don't work for Intel) about what to do next. I do apologize.
It would probably be best to add a note about sync to your existing case on GitHub.
https://github.com/IntelRealSense/librealsense/issues/2556 multicam live reconstruction · Issue #2556 · IntelRealSense/librealsense · GitHub
I haven't personally done syncing with cables, but Intel's webinar on multiple cameras on September 13, 2018 stated that hardware syncing with cables is best used when microsecond precision in the syncing is required. Cabling isn't a mandatory requirement though if you don't need microsecond precision.
The webinar added that the more cameras there are in a multi-camera setup, the higher the recommended PC specification (for example, an Intel i7 processor). Ideally, each camera should be connected directly to its own USB port rather than through a USB hub, so that each camera has its own USB controller.
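As a quick way to check how each camera is actually connected, the sketch below lists the attached cameras and, where the SDK version exposes it, the USB connection type reported for each one (again just my own example, not something from the webinar):

```
# Sketch: list connected cameras and their reported USB connection type,
# to help verify that every camera has a good (ideally USB 3) connection.
import pyrealsense2 as rs

ctx = rs.context()
for dev in ctx.query_devices():
    name = dev.get_info(rs.camera_info.name)
    serial = dev.get_info(rs.camera_info.serial_number)
    usb = "unknown"
    if dev.supports(rs.camera_info.usb_type_descriptor):  # not available in older SDK versions
        usb = dev.get_info(rs.camera_info.usb_type_descriptor)
    print(name, serial, "USB type:", usb)
```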
You can read the webinar transcript here:
