Depth error

TTay1
Beginner
3,793 Views

Dear Intel,

For my application, I have a D435 mounted about 2 meters high, pointing toward the ground at roughly a 45-degree angle. The application reads depth data (no RGB) from the D435 and converts it into a point cloud. The point cloud is visualized as in the image below, which shows me with my hand about 0.5 meters above the ground.

This next image shows me with my hand slightly closer to the ground. Notice that the point cloud shows my hand and the ground connected even though they are not touching. In general, I've seen this issue in other scenarios where the depth visualization shows two objects connected even though they are not touching in reality.

What could be causing this problem? Does the D435 apply some kind of smoothing or regularization that could be causing two disjoint objects to appear connected?
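For reference, the depth-to-point-cloud conversion I described is essentially the standard librealsense point-cloud path. A minimal pyrealsense2 sketch of what I mean (the stream resolution and frame rate here are just examples, not necessarily what my application uses):

```python
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Depth only, no RGB (resolution/FPS are example values)
config.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
pipeline.start(config)

pc = rs.pointcloud()
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    points = pc.calculate(depth)
    # Vertices are (x, y, z) float triplets in meters, one per depth pixel
    verts = np.asanyarray(points.get_vertices()).view(np.float32).reshape(-1, 3)
    # Zero-depth (invalid) pixels produce (0, 0, 0) vertices; drop them before visualizing
    verts = verts[verts[:, 2] > 0]
finally:
    pipeline.stop()
```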

0 Kudos
25 Replies
idata
Employee
389 Views

Hello Teeter,

Thanks for the clear instructions. I am able to reproduce the phenomenon. I have not been able to find a combination of presets and "Advanced Mode" parameters that reliably reduces these effects. Unfortunately, it is as our engineers said: "There is no simple prescription to offer to users for tuning..." I recommend you continue to experiment with the presets and parameters to tune the camera to your needs.

Regards,

Jesus

Intel Customer Support
0 Kudos
TTay1
Beginner
389 Views

Hello Jesus,

It's good that you were able to reproduce the phenomenon.

This is a serious problem: when we use the D435s to observe people, these spurious depth points affect our ability to analyze the person from the point cloud.

Is it possible to have your engineering team take a look? I ask because it almost seems like a bug introduced in the D435; after all, the ZR300 is supposed to have inferior specs, yet it does not suffer from this problem.

Thanks.

0 Kudos
idata
Employee
389 Views

Hello Teeter,

The RealSense team is aware of this case. The Intel® RealSense™ Viewer tool (https://github.com/IntelRealSense/librealsense/tree/master/tools/realsense-viewer) supports several predefined depth presets that can be selected according to the use case. The user can also modify the settings via the Advanced Mode menu and save a customized user preset. All the settings can be saved and loaded via the tool's menu. Please review the information on presets and recommended use cases at: https://github.com/IntelRealSense/librealsense/wiki/D400-Series-Visual-Presets.

 

 

Regards,

Jesus

Intel Customer Support
0 Kudos
TTay1
Beginner
389 Views

Hello Jesus,

As I mentioned earlier, the "High Accuracy" preset reduces (but does not eliminate) this type of depth error. However, it also incorrectly creates holes in the depth data in other areas. And as your engineering team explained previously, the Advanced Mode controls were tuned using machine learning, and there is no documentation for how to adjust those settings manually.
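For anyone else hitting this, applying the "High Accuracy" preset from code looks roughly like the following. This is a minimal pyrealsense2 sketch that selects the preset by its value description rather than a hard-coded index; the stream settings are just examples:

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
profile = pipeline.start(config)

depth_sensor = profile.get_device().first_depth_sensor()

# Walk the visual_preset option range and pick the preset by name
preset_range = depth_sensor.get_option_range(rs.option.visual_preset)
for i in range(int(preset_range.max) + 1):
    name = depth_sensor.get_option_value_description(rs.option.visual_preset, i)
    if name == "High Accuracy":
        depth_sensor.set_option(rs.option.visual_preset, i)
        break
```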

Since the engineering team is aware of this problem, do they have a timeline for addressing it? Again, I would point out that the ZR300 doesn't exhibit this problem, and I hope there is a solution that isn't too difficult to implement.

Thanks.

0 Kudos
idata
Employee
389 Views

Hello Teeter,

The engineering team is aware of this performance difference. The only recommendation we can give right now is the one already provided. The team is looking into creating more presets, and work on presets and documentation is ongoing. Unfortunately, we do not have a timeline we can provide at the moment.

Regards,

Jesus

Intel Customer Support
0 Kudos
Reply