Software Archive
Read-only legacy content

Depth accuracy and reliability across different F200 sensors

Tayfur_C_

Hi,

I ordered two F200 sensors and I am experimenting a little with the accuracy of the depth sensor. I am a little confused regarding the accuracy of the two sensors. I am using the RealSense SDK version 1.3 and the firmware 2_38_5_0 for both sensors.

For both sensors I am checking the center pixel of the depth image using the raw_stream sample from the SDK.

Real Distance 800mm

Sensor 1: Measured Distance 831 => +31mm

Sensor 2: Measured Distance 813 => +13mm
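For reference, the check above amounts to something like this (a NumPy sketch; `depth_frame` and the uniform synthetic frame only stand in for the 16-bit millimetre depth image the raw_stream sample displays):

```python
import numpy as np

def center_depth_error(depth_frame, real_distance_mm):
    """Return (measured, signed error) in mm for the center pixel of a depth image."""
    h, w = depth_frame.shape
    measured = int(depth_frame[h // 2, w // 2])
    return measured, measured - real_distance_mm

# Synthetic frame standing in for a real capture (uniform 831 mm scene):
frame = np.full((480, 640), 831, dtype=np.uint16)
measured, error = center_depth_error(frame, 800)
# measured = 831, error = +31 (the Sensor 1 result at 800 mm)
```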

 

This difference gets worse with a higher distance:

 

Real Distance 1200mm

Sensor 1: Measured Distance 1252 => +52mm

Sensor 2: Measured Distance 1212 => +12mm

So the accuracy of Sensor 2 is quite OK even at a distance of 1.2m, while Sensor 1 is quite bad, with a mean error of 5.2cm at 1.2m and still 3.1cm at 800mm. My questions are:

1) Has anyone else experienced similar issues?

2) Is it normal that each F200 sensor gives different results like that, or could it be some kind of hardware defect, or that my camera was delivered with bad calibration?

As a side note: the cameras are not running simultaneously, so it is not a problem of interference.

 

Thanks in advance for any answers,

Best regards
21 Replies
daflippers

All measurements are within 5%.  What were you expecting?

David

Tayfur_C_

Well, in this post

https://software.intel.com/en-us/forums/topic/558754

it is stated that the accuracy is less than 1mm. But even at a close distance this is not the case for either of my F200 sensors.

And what I was expecting is that the results of two F200 sensors would be more or less the same, but there is a noticeable difference between them. Sensor 2 is actually quite OK, but Sensor 1 is not. I was hoping for results within +/- 2cm.

 

daflippers

I think anyone expecting an accuracy of <1mm at 1.2m (<0.1%) on something that sells for $99 without warranty is mad.

Camera Specification

Full VGA depth resolution
1080p RGB camera
0.2 – 1.2 meter range (Specific algorithms may have different range and accuracy)
USB 3.0 interface

I think a more likely answer would be an accuracy of +/- 5% with a resolution of <1mm.

Perhaps someone from Intel can confirm.

David

Tayfur_C_

Ok, thanks for the answer, but this is actually not very helpful. The information about an accuracy of < 1mm comes from David Lu (Intel), who provided it in the link I shared.

My original question remains:

1) Has anyone else experienced similar issues?

2) Is it normal that each F200 sensor gives different results like that, or could it be some kind of hardware defect, or that my camera was delivered with bad calibration?

 

Philip_N_

Hi Tayfur,

Thanks for doing these tests, it's very useful to know this data. From my experience with the Kinect v2, I know that you need to power on the device for about 40 minutes for it to thermally stabilise before you get reliable and accurate depth results. This may be due more to the ToF nature of the Kinect v2, but it could perhaps also be a reason why you are getting vastly different distances between your sensors. Can you track the z position of the centre pixel over time to see if it is drifting?

Phil

daflippers

I read that David Lu had posted that, but as I said, <1mm at 1.2m is 0.083%, and while it might be possible to calibrate a camera to measure to that accuracy, it would be very time-consuming. Not only that, but it will drift with temperature, so I will return to my original suggestion that the accuracy is likely to be +/-5% with a resolution of <1mm.

It might not be what you want to hear but in the real world things have tolerance and vary with temperature.  This is reality in the analogue world.

David

Philip_N_

The Kinect v1 had a minimum bound on depth resolution of 1.8mm at a distance of 800mm. I agree that this limit is not likely to be reached by the v1, due to the system's need to interpolate between the speckle points, but it is still interesting to see that the error reported above on the newer (and still structured-light based) f200 is so much larger.

If this effect is temperature based, then perhaps the passive cooling fins on the back of the f200 are not suitable for use cases where precision and stability are needed. 

As an aside, you can still achieve high resolution tracking (sub mm at 1m) with the f200 and other low-cost RGB-D sensors if you don't just track one pixel. If the error on a single measurement is random, then the error on a rigid set of points becomes very small, which is why KinectFusion-type algorithms or dense ICP can give you very detailed meshes or can perform very good motion tracking.
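The averaging effect described here can be sketched numerically (an illustrative NumPy simulation; the noise level `sigma` is an assumed value for demonstration, not an F200 specification):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 5.0        # assumed per-pixel depth noise in mm (illustrative only)
n_points = 2500    # e.g. a rigid 50x50 patch of tracked points
trials = 2000

# Error of a single-pixel measurement stays around sigma...
single = rng.normal(0.0, sigma, size=trials)

# ...but the mean over a rigid set of n points shrinks by about sqrt(n).
patch_means = rng.normal(0.0, sigma, size=(trials, n_points)).mean(axis=1)

print(single.std())       # about 5 mm
print(patch_means.std())  # about 0.1 mm (5 / sqrt(2500))
```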

Tayfur_C_

Cool thanks for the answers.

 

Regarding the temperature stuff:

Indeed both cameras improve their precision over time. I did a 25-minute test and measured the average depth value every 5 minutes over a 50x50 pixel window at the center of the image. All distances were measured at a real distance of 924mm.

5 minutes: +29mm

10 minutes: +18mm

15 minutes: +16mm

20 minutes: +15mm

25 minutes: +17mm

So far I only have these results for one of the sensors, but the other always starts with a much bigger difference between measured and real distance. It is also interesting that there is always a positive offset for both sensors. But since the offsets are not the same, each sensor would need to be "calibrated" to cancel out its own offset. Unfortunately, I only have two sensors right now. I would like to know how other F200 sensors perform.
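The windowed measurement reads roughly like this (a NumPy sketch; `window_mean_depth` is a hypothetical helper, and the uniform synthetic frame only stands in for a real capture):

```python
import numpy as np

def window_mean_depth(depth_frame, size=50):
    """Mean depth (mm) over a size x size window at the image center,
    ignoring zero pixels (no-data returns)."""
    h, w = depth_frame.shape
    r0, c0 = h // 2 - size // 2, w // 2 - size // 2
    window = depth_frame[r0:r0 + size, c0:c0 + size].astype(float)
    valid = window[window > 0]
    return valid.mean() if valid.size else float("nan")

# Synthetic frame: a uniform 953 mm scene (924 mm real + 29 mm offset)
frame = np.full((480, 640), 953, dtype=np.uint16)
print(window_mean_depth(frame) - 924)  # 29.0
```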

 

Tayfur_C_

Just remembered that in the previous post 20mm has to be added to each measurement. This is because in one test I tried to "out-calculate" the offset by subtracting 20mm and forgot to remove this from the code.

Tayfur_C_

Ok, now I tested both of my sensors over time. Both sensors ran for an hour and I measured the distance as the mean value of a 50x50 pixel window at the center of the depth image.

At a real distance of 840 mm:

Sensor 1: First measurement: +22mm; Last Measurement: -2mm (Drift of 24mm)

Sensor 2: First measurement: -6mm; Last Measurement: -24mm (Drift of 18mm)

Same conditions for both sensors. The results drift more or less linearly over time, and there is some kind of offset between the two sensors.

 

At a real distance of 1300mm:

Sensor 1: First measurement: +3mm; Last Measurement: -49mm (Drift of 52mm)

Sensor 2: First Measurement: +29mm; Last Measurement: -23mm (Drift of 42mm)
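Since the drift looks roughly linear, its rate can be quantified with a least-squares fit; a sketch using the Sensor 1 and Sensor 2 endpoints from the 840mm run (assuming the first and last measurements are one hour apart):

```python
import numpy as np

# Endpoint errors (mm) over the one-hour run at 840 mm described above.
t = np.array([0.0, 60.0])          # minutes since power-on
sensor1 = np.array([22.0, -2.0])   # +22 mm -> -2 mm
sensor2 = np.array([-6.0, -24.0])  # -6 mm -> -24 mm

# With only two samples the fit is exact; with a full time series,
# polyfit would give the least-squares drift rate in mm per minute.
rate1 = np.polyfit(t, sensor1, 1)[0]
rate2 = np.polyfit(t, sensor2, 1)[0]
print(rate1, rate2)  # roughly -0.4 and -0.3 mm/min
```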

 

daflippers

I'm surprised there has been no comment from Intel regarding the specification; however, all your measurements are within my suggestion of:

Resolution 1mm

Accuracy +/- 5%

David

Philip_N_

I'm currently lending my f200 to a colleague, but as soon as I get it back I'll do some similar testing with a digital temperature sensor attached to the heatsink. It will be interesting to see whether these values are just biased but still precise, i.e. if you move the sensor from a true distance of 500 to a true distance of 510, do you measure 530 going to 540?

Tayfur_C_

That would be cool, thanks. After my tests yesterday it looks like one of my two sensors is actually quite accurate. After a few minutes, all of the results at real ranges of 800mm, 1000mm, and 1300mm were more or less within +/-2cm. But the other sensor has a shift in its results. The worst depth value for the second sensor at 1300mm was 7cm. However, if you remove the mean error from the results of the second sensor, it reaches the same accuracy as the first sensor. That is why I believe my second sensor simply has a bad internal calibration.
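Removing the mean error as a per-sensor offset, as described here, amounts to a bias-only calibration; a sketch (the `bias_correct` helper is hypothetical, and the reference pairs are the Sensor 1 numbers from the first post):

```python
import numpy as np

def bias_correct(measured, real):
    """Estimate a constant per-sensor offset from reference measurements
    and return (offset, correction function)."""
    offset = np.mean(np.asarray(measured, float) - np.asarray(real, float))
    return offset, lambda d: d - offset

# Sensor 1 readings from the first post (real distances 800 mm and 1200 mm):
offset, correct = bias_correct([831, 1252], [800, 1200])
print(offset)        # 41.5 -> the estimated per-sensor bias in mm
print(correct(831))  # 789.5 -> residual error of -10.5 mm after correction
```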

daflippers

> The worst depth value for the second sensor at 1300mm was 7cm.

7/1300*100 = 0.53%

That is very accurate for a mass produced item.  It would be very helpful for Intel to provide the specification for Resolution and Accuracy of depth measurements.  

David 

Tayfur_C_

Yes, of course I would also appreciate an official answer. Actually, I don't want to criticize the product. It is definitely a very cool and cheap product, and the results are really good. But since there is a drift between the two sensors I tested, I want to know whether this is something we have to expect when using these sensors, or whether one of the two is miscalibrated. After removing the mean error, we can achieve the same accuracy with the second sensor as we get with the first. Knowing whether we need to calibrate each sensor ourselves is very important information. This is why I asked if other people have also tested the accuracy. At a close distance the first sensor reached an incredible mean error of 2mm, but the second has a mean error of 20mm. I agree that this is still a good result, but I would like to know the reason for the difference between the two sensors.

Philip_N_

Not to be that guy, but it's 70/1300*100 = 5.38%. Which is still not bad, but is not < 1%.

daflippers

Oops....

Michael_W_4

I'm also experiencing issues with accuracy and I believe it may be a problem with factory calibration. Does anyone know how to recalibrate the cameras to correct for this error?

samontab

A few thoughts:

- I can confirm that the accuracy out of the box of my camera is not stellar, but is good enough for some applications. If I remember correctly it was somewhere around the numbers posted here.

- The technology used to estimate depth in the F200 is called structured light. This is similar in principle to a stereo-based system, which means that the farther away the object is, the less accurate its range measurement will be. This is how this technology works, so you can't change it.

- This type of technology is based on computer vision. The algorithm that estimates depth needs to know the pose of the IR camera with respect to the IR projector, as well as a model for the lens used to reduce distortions in the image, and the position of the image sensor with respect to the optical centre. This is not trivial, and maybe Intel has a special procedure or special markers to mass calibrate these cameras.

- Differences in manufacturing will inevitably produce differences in the performance of these cameras. You can't really have a single solution and use it in all the cameras, since it will just not work. You need to have unique calibration parameters per camera. Since these cameras are mass produced, Intel needs to have a good enough calibration for all of them, which may work well for some cameras, but not as well for others.

- As far as I know, the RealSense SDK does not provide a way to recalibrate the cameras. And to be fair, it was never meant to cover something like this. The purpose of the RealSense SDK is to bring 3D development to the masses. Calibrating a camera seems outside of its scope.

- In order to improve on these results, I think you would have to do everything outside the SDK, maybe even the range estimate algorithm. Of course, this defeats the purpose of the SDK.

- Another important factor is the IVCAM parameters. Make sure both cameras have exactly the same parameters set there as well. You can see more information about that here:

https://software.intel.com/en-us/forums/realsense/topic/537872

 

 

Tayfur_C_

Hi there,

I meanwhile believe that these differences between cameras are only related to the development kit cameras. There are three possible reasons for the differences:

 

1.) Calibration issue

Just as samontab suggests, it is possible that one calibration configuration is used for all cameras. However, I believe this is only the case for the development kit cameras. The embedded versions which are delivered to, for example, DELL (Dell Venue 8) or other tablet PC and laptop manufacturers could come with a per-camera calibration. Although the camera is a low-cost camera, I do not believe that each of those tablet and laptop devices will come with a different accuracy.

2.) Sensitivity of the cameras

It is also possible that the cameras are very well factory-calibrated, even the development cameras, but that the installation into its housing by Creative destroys the calibration. External pressure on the sensor device itself might influence the quality of the camera.

3.) Camera is very sensitive to external force

Maybe the camera calibration gets broken by some external force when unpacking the camera.

As a consequence, I would be very interested in the accuracy of Intel F200 cameras embedded into their target systems (such as the Dell Venue 8). I will get one today and will test it as soon as possible.

By the way, I bought another development kit camera and was extremely careful when removing it from its package and putting it into my test setup. Without doing anything else with the camera I started my test, and indeed I got an accuracy of +/- 0.3cm at a depth of 75cm and +/- 1.0cm at a depth of 105cm. Maybe I got a very good camera by some random luck ;), or it is because of reason 3.

 
