
D435 low latency streaming

JBos3
New Contributor I

How can we achieve the lowest-latency (and lowest-jitter) depth stream, and what values can we expect?

 

I would like to see an Intel document that contains a table of latencies for their recommended setup. I've read several white papers and haven't found this information (BKM for Tuning RS Depth Cameras, Multicam, Depth Post-Processing, RS D400 datasheet).

 

Specifically, I will be interested in the following fairly standard configuration:

 

  • A single D435 or D435i camera connected via USB3.1
  • Running the RealSense Viewer (or SDK if that's preferred).
  • "Accurate" RS Preset
  • Streaming only depth, 848x480, Z16
  • autoexposure off
  • post-processing off
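For reference, the configuration above can be expressed against the SDK's Python bindings. This is an untested sketch that assumes the pyrealsense2 package; the 90 fps value is an assumption, since no frame rate is specified above.

```python
# Sketch of the configuration listed above, assuming the pyrealsense2
# bindings. The 90 fps value is an assumption (not specified above).

DEPTH_CONFIG = {"width": 848, "height": 480, "fps": 90, "format": "z16"}

def start_depth_stream(cfg=DEPTH_CONFIG):
    """Start a depth-only 848x480 Z16 stream with autoexposure off."""
    import pyrealsense2 as rs  # deferred so this file loads without the SDK

    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, cfg["width"], cfg["height"],
                         rs.format.z16, cfg["fps"])
    profile = pipeline.start(config)

    depth_sensor = profile.get_device().first_depth_sensor()
    # "Accurate" corresponds to the High Accuracy visual preset.
    depth_sensor.set_option(rs.option.visual_preset,
                            int(rs.rs400_visual_preset.high_accuracy))
    # Autoexposure off, per the list above; no post-processing blocks are
    # created, so frames arrive exactly as the ASIC produced them.
    depth_sensor.set_option(rs.option.enable_auto_exposure, 0)
    return pipeline
```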

 

I measure 130 ms on my laptop (Xeon 2.8 GHz CPU, Windows 10) for a single IR stream (instead of the depth stream, since I can read the IR image to measure lag). I measured this value both with the Intel RealSense Viewer and with a custom application built on the SDK.
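One way to put numbers on both lag and jitter is to collect per-frame latency samples and summarize them (mean = lag, standard deviation = jitter). A minimal sketch, assuming the pyrealsense2 bindings and the SDK's global-time feature, which puts frame timestamps in the host clock domain; the camera-facing part is untested and needs real hardware:

```python
import time
from statistics import mean, pstdev

def latency_stats(samples_ms):
    """Summarize per-frame latency samples: mean = lag, stdev = jitter."""
    return {"mean_ms": round(mean(samples_ms), 2),
            "jitter_ms": round(pstdev(samples_ms), 2)}

def collect_samples(n=100):
    """Grab n per-frame latency samples from a live stream (needs a camera)."""
    import pyrealsense2 as rs  # deferred: hardware-dependent
    pipeline = rs.pipeline()
    pipeline.start()
    samples = []
    for _ in range(n):
        frames = pipeline.wait_for_frames()
        # With RS2_OPTION_GLOBAL_TIME_ENABLED (on by default in recent SDK
        # versions), get_timestamp() is in the host clock domain, so this
        # difference approximates capture-to-arrival latency in milliseconds.
        samples.append(time.time() * 1000.0 - frames.get_timestamp())
    pipeline.stop()
    return samples
```

Note this only captures latency up to frame arrival in the application; display latency in a viewer adds more on top.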

 

Is there specific software or hardware we can use to minimize lag? For example: a USB frame grabber, powered USB hubs, switching the OS to Linux, buying an OEM RealSense module and building our own high-speed data interface (e.g. FPGA to PCI Express), etc.

 

Thanks,

-Joe

1 Solution
Alexandru_O_Intel
Hello JBos3, Thank you for your interest in the Intel RealSense D435 Depth Camera. Unfortunately, Intel does not provide this level of guidance. All the information regarding latency can be found in the above-mentioned documents. Latency is highly dependent on the system and environment, and we would recommend testing to find the best configuration for your use case. Best regards, Alexandru


5 Replies
MartyG
Honored Contributor III

Advice about latency vs performance is available in the 'Frame Buffering Management' section of the official RealSense documentation.

 

https://github.com/IntelRealSense/librealsense/wiki/Frame-Buffering-Management-in-RealSense-SDK-2.0
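The core idea in that wiki page (keep the application-side frame queue shallow, so a slow consumer never reads stale frames) can be modeled in a few lines. This is a toy illustration in plain Python, not SDK code; in the SDK itself the equivalent knob is the frame queue capacity, e.g. `rs.frame_queue(1)` in the Python bindings or `rs2::frame_queue queue(1)` in C++:

```python
from collections import deque

class LatestFrameQueue:
    """Toy model of a shallow frame queue, as described in the Frame
    Buffering Management wiki: a small capacity trades dropped frames
    for lower worst-case latency, because stale frames are evicted
    rather than queued up in front of the consumer."""

    def __init__(self, capacity=1):
        self._q = deque(maxlen=capacity)  # oldest frames are evicted

    def enqueue(self, frame):
        self._q.append(frame)

    def poll(self):
        return self._q.popleft() if self._q else None

# A slow consumer only ever sees the most recent frame:
q = LatestFrameQueue(capacity=1)
for frame_id in range(5):   # producer runs ahead of the consumer
    q.enqueue(frame_id)
assert q.poll() == 4        # frames 0-3 were dropped, not queued up
```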

 

According to Dorodnic, the RealSense SDK manager, enabling stream compression on some low-end PCs may also slow down the recorder function if that machine's CPU cannot keep up with the built-in frame compression.

 

https://github.com/IntelRealSense/librealsense/issues/2102#issuecomment-436881193

 

A powered USB hub may provide greater stability, especially when running the camera continuously for hours or days. However, plugging the camera directly into a PC's USB port is the ideal situation, because each USB port on a computer usually has its own dedicated USB controller, whereas on a USB hub one controller may host several ports.

 

You are certainly welcome to integrate a caseless Depth Module into your own product design. The Vision Processor D4 component that processes the raw camera data can be integrated as a board or incorporated onto a motherboard along with its supporting components, as described on pages 71-72 of the current edition of the RealSense 400 Series datasheet. This gives you the flexibility to realize your hardware design in the way that best suits your needs.

JBos3
New Contributor I

Hi Marty,

 

Thanks for the quick response. Yes, I have seen the 'Frame Buffering Management' section on Github. Unfortunately, this document doesn't indicate what latencies are achievable.

 

I would be very interested to hear the Intel team's expectations for both the out-of-the-box case (i.e. a D435 via USB 3) and the D4 Vision Processor integrated into a dedicated motherboard. In other words, what are the latency contributions from image exposure, readout, depth processing, buffering, and transfer to the host PC (or integrated CPU), and what could one reasonably expect when following Intel's recommendations?
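The breakdown being asked for is essentially a per-stage latency budget that sums to the end-to-end figure. As a sketch of that structure only, with purely hypothetical placeholder numbers (these are NOT Intel figures; filling them in is exactly the request above):

```python
def total_latency_ms(stages):
    """End-to-end latency as the sum of per-stage contributions (ms)."""
    return sum(stages.values())

# Purely illustrative placeholder values (NOT Intel figures); the point
# is only the structure of the budget the question asks Intel to fill in.
example_budget = {
    "exposure": 8.0,          # roughly one integration time
    "sensor_readout": 5.0,
    "depth_processing": 5.0,  # D4 ASIC
    "buffering": 10.0,
    "usb_transfer": 5.0,
}
assert total_latency_ms(example_budget) == 33.0
```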

 

Thanks,

-Joe

 

MartyG
Honored Contributor III

The values in the Intel tuning guide represent the theoretical best performance (at the time of its writing) achievable with the 400 Series cameras. It is therefore a good benchmark to compare against what you can achieve in practice in the real world: an indicator of the level to aim for. There are always ongoing SDK and firmware improvements in the pipeline that may provide performance gains beyond what can currently be achieved.

 

Because of the number of variables that affect performance, from hardware specification to real-time environmental conditions, it may be difficult for Intel to provide a precise list of theoretical recommendations in advance. It may be easier to provide advice if an initial prototype is created and the design is then iterated on through different versions, with you providing data and Intel offering responses based on it. I know that Intel's support team members will be happy to help you in whatever way they can, though.
