Hi,
I am an intern working on a project that includes a RealSense camera. My aim is to trigger the RealSense camera by sending a positive pulse from a pin of a Raspberry Pi 3 Model B+. I have connected pin 5 (VSYNC) of the camera directly to pin 40 of the Raspberry Pi (GPIO 29 when configured with wiringPi), and pin 9 of the camera to pin 6 (GND) of the Raspberry Pi.
However, I am new to programming and it is hard to find a well-explained tutorial on your librealsense GitHub. Is there a function that would let me read the signal, force the camera to capture an image at the rate I command, and display the image on the screen?
Here is what I want to do in C++:
1) The camera receives the pulse from the Raspberry Pi on pin 5.
2) It takes a picture and displays it on the screen.
I know there is a variable named RS2_STREAM_GPIO in the library, but I don't know how to handle it. Can someone send me a code example?
Many thanks,
CaoKha
A couple of discussions of the pins can be found in the links below.
https://github.com/IntelRealSense/librealsense/issues/2512
https://forums.intel.com/s/question/0D50P0000490URESA2/d435-vsync-pin?language=en_US
Intel's document on setting up sync between hardware has the following advice about external triggering:
**************
An external signal generator can be used as the master trigger with all cameras set to slave mode. When applying an external sync pulse, the HW SYNC input requires a 100 microsecond positive pulse at the nominal camera frame rate, 33.33 ms for a 30Hz frame rate for example.
Inputs are high impedance, 1.8V CMOS voltage levels. However, it is important to make sure to use a high resolution signal generator. The frequency of the signal generator needs to exactly match the sensor frame rate.
For example, if the sensor is set to 30 fps, the real frame rate may be 30.015 fps. You may need to use an oscilloscope to measure the real frame rate and configure the signal generator to the same frequency. For this reason, it may be better to just use one additional camera as the master sync signal generator.
*************
The full sync paper can be accessed here:
https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/RealSense_Multiple_Camera_WhitePaper.pdf
A trigger signal can also be wireless.
https://github.com/IntelRealSense/librealsense/issues/2171#issuecomment-409559217
Another means of triggering that has been demonstrated by Intel is to activate a commercial-grade flash unit and use the spike that the flash causes in the camera data as the trigger for capture.
https://realsense.intel.com/intel-realsense-volumetric-capture/
Could you please provide more details about the format in which you would like the capture to be displayed on screen? For example, do you wish to display just a single static frame, or a live-updating stream? Thanks!
Hi MartyG,
Thank you very much; your answer is really valuable to me. However, I use only one camera, and what I want is simply to program an automatic trigger that captures a photo at a certain rate (for example, one photo every 30 milliseconds) and displays it on my computer screen with a timestamp. This link (http://www.dslreports.com/forum/r31259855-Syncing-a-camera-and-a-pulsed-flash-light-using-a-TTL-signal) is the closest to what I want to do. However, when I slow down the pulse rate of the generator (sending one 100-microsecond pulse every 100 milliseconds) so as to take one photo per 100 milliseconds, the frame rate of the depth camera (configured as Slave) remains the same (I use the <chrono> library to time the wait_for_frames() call).
Therefore, do you know how I can verify that the camera takes a photo only when it receives the signal I send?
Thanks,
If the camera is set to Slave, then it should be listening for a pulse. The problem is that Intel's version of the flash trigger has only been described in a single line on a blog, and I have not seen any further technical details about how to implement it for the 400 Series cameras specifically.
Intel support agents have access to resources that I do not, so one of the support staff on this forum should hopefully be able to point you in the right direction about how to create a flash-based trigger. Thanks for your patience!