
How to compute object's height for D435?

RSun9
New Contributor I

Hi there,

I am trying to convert an object's height in pixels to its actual height in meters. The equation I use is simple geometry. See the diagram below. PH = (D * bh)/f, where PH is the object height I want to measure, D is the measured distance (in meters) between the object and the camera, bh is the object's height in pixels, and f is the focal length.

I know that D is accurate (checked against the ground truth); bh is accurate (visually checked on display); f is generated by the SDK API ('intrinsic.fx' or 'intrinsic.fy'). At resolution 848x480, f is reported to be 423.499. Note f is obtained from the depth frame, as shown in the wiki API how-to ( https://github.com/IntelRealSense/librealsense/wiki/API-How-To).
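For reference, a minimal sketch of how the depth intrinsics can be read with pyrealsense2, following the API how-to above (the exact stream settings shown here are assumptions):

import pyrealsense2 as rs

# Start a pipeline with an 848x480 depth stream (frame rate is illustrative)
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
profile = pipeline.start(config)

# Read the depth stream's intrinsics
depth_profile = profile.get_stream(rs.stream.depth).as_video_stream_profile()
intr = depth_profile.get_intrinsics()
print(intr.fx, intr.fy, intr.ppx, intr.ppy)  # fx, fy ~ 423.5 at 848x480 here

pipeline.stop()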

However, when I plug the numbers into the equation, the computed height is way off. For example, for a person standing 3.4 meters away from the camera, the height in pixels is 334, and the computed height is 2.68 meters. His actual height is about 1.68 meters.

Why is this happening? Is this a calibration issue? Does my equation not fit D435's optical model?

BTW, the same equation works very well for D415.

Thanks,

Robby

21 Replies
MartyG
Honored Contributor III

The first thing that comes to mind is that the D415 and D435 have different field of view (FOV) angles. D435's is wider.

The D415's FOV is 69.4° x 42.5° x 77° and the D435's is 91.2° x 65.5° x 100.6° (horizontal x vertical x diagonal).

RSun9
New Contributor I

Yes, I am aware of that. In fact, I have verified the VFoV against the focal length and the frame height:

At 848x480 resolution, VFoV = 59.0808 degrees, and the focal length is 423.499. Therefore, tan(VFoV/2) = 0.5667 = (480/2)/423.499
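The same check in Python, as a quick sanity check on the numbers already given above:

import math

# VFoV implied by fy = 423.499 with 480 rows of pixels
vfov_deg = 2 * math.degrees(math.atan((480 / 2) / 423.499))
print(vfov_deg)  # ~59.08 degrees, matching the value above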

However, that does not seem to solve my problem.

-Robby

MartyG
Honored Contributor III

The focal length of the D415's left and right imagers is 1.88 mm and the focal length of the D435's left and right imagers is 1.93 mm. The color sensor's focal length is 1.93 mm. In the RealSense SDK 2.0 software, the focal length is expressed by the intrinsics values fx (the focal length as a multiple of pixel width) and fy (as a multiple of pixel height). So, as you say, the focal length in pixels depends on the resolution.

With math formulas, if you know some of the values for certain, you can rearrange the formula to make another value the target instead of PH. You know for certain what the person's height (PH), the distance of the person from the camera (D), and the focal length (f) should be.

If things were working correctly, the target value should come out right no matter how the formula is rearranged. So if you rearranged the formula to make the focal length the target, for example, it should give the correct focal length when the other values are plugged in.

Following the logic above, bh (the object's height in pixels) is the only value that you haven't been able to measure by other means, such as a tape measure. Putting on my Sherlock detective cap: if you are getting an incorrect PH result on the D435, it is likely because the object's pixel height is being miscalculated (since you have verified all the other values).
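By way of illustration, here is the rearrangement applied to the numbers quoted earlier in the thread (just a sketch of the check, not new data):

# Rearranged: f = (D * bh) / PH, using D = 3.4 m, bh = 334 px, PH = 1.68 m
D, bh, PH = 3.4, 334, 1.68
f_implied = (D * bh) / PH
print(f_implied)  # ~675.9 px, far from the 423.499 reported by the depth intrinsics

In other words, the three verified quantities are not mutually consistent with the reported focal length.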

RSun9
New Contributor I

Thanks... The object's height in pixels is computed with OpenCV, from the color frame. How else can I measure it?
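Roughly, the pixel-height step looks like this (a simplified sketch; the binary foreground mask of the person is assumed to come from an earlier detection step, and the OpenCV 4.x findContours signature is used):

import cv2

def pixel_height(mask):
    # bh is taken as the bounding-box height of the largest contour in the mask
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    _, _, _, h = cv2.boundingRect(largest)
    return h  # the object's height in pixels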

MartyG
Honored Contributor III

SDK 2.0 has an example script called 'Measure' for measuring the distance between two points, so you may be able to avoid using OpenCV to do the pixel height calculation if you adapt the code in that example.

https://github.com/IntelRealSense/librealsense/tree/master/examples/measure

RSun9
New Contributor I

Thanks. I may give that a try. Still, I have no reason to suspect that OpenCV is giving me the wrong height in pixels. I have visually verified the height on the display; it looks about right. Also, the same code gives me an accurate height estimate on the D415.

Unless the height obtained from the color frame is very different from the height in the depth frame ... is that possible?

MartyG
Honored Contributor III

If you are using both the color and depth, I would recommend using the same resolution for both if you are not doing so already, to help ensure that the two are aligned.
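For example, the alignment can be set up like this with pyrealsense2 (a sketch assuming 848x480 at 30 fps; adjust to your own configuration):

import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 848, 480, rs.format.bgr8, 30)
pipeline.start(config)

align = rs.align(rs.stream.color)      # map depth pixels onto the color image
frames = pipeline.wait_for_frames()
aligned = align.process(frames)
depth_frame = aligned.get_depth_frame()
color_frame = aligned.get_color_frame()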

RSun9
New Contributor I

Yes, I am using 848x480 for both the color and depth frames.

MartyG
Honored Contributor III

There is a long discussion on the link below by vision tech expert ThePaintedSnipe that attempts to work out why D415 and D435 images may have differences.

RSun9
New Contributor I

Thanks for the link. I skimmed through the threads. The discussion seems to be about a different problem.

As far as I can tell, my problem is not about the depth data quality -- I am only measuring one object, and the distance seems to be accurate. My problem arises when I convert the distance to height.

For my part, I'd like to know:

1. Will camera re-calibration solve my problem?

2. If yes, how easy or difficult is it to re-calibrate the D435?

MartyG
Honored Contributor III

It is hard to say if calibration will solve your issue. Doing so is much easier on the 400 Series cameras though.

RSun9
New Contributor I

Thanks. I'll look into that.

ARyba
Beginner

Robby, I'm just starting to use the D415 and am thinking of doing the same calculations you already did. I'm using OpenCV/Python. I would appreciate it if you could share your code so I don't have to reinvent the wheel...

-albertr

RSun9
New Contributor I

Sorry, this is part of a project and I can't share the source code at the moment.

idata
Employee

Hello Robby_S,

First, you have to convert the pixels to point-cloud coordinates and then use those coordinates:

X = depth * (x - ppx) / fx

Y = depth * (y - ppy) / fy

Z = depth

Then use Pythagoras:

Distance = sqrt(X*X + Y*Y + Z*Z)

That should fix your issue.

The D415 may have worked because its narrower FOV makes the approximation hold better.
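In code, the steps above look roughly like this (a sketch; intr is assumed to be the intrinsics object read from the depth stream profile):

import math

def deproject(x, y, depth, intr):
    # intr is a pyrealsense2 intrinsics object (fx, fy, ppx, ppy)
    X = depth * (x - intr.ppx) / intr.fx
    Y = depth * (y - intr.ppy) / intr.fy
    Z = depth
    return (X, Y, Z)

def distance_3d(p0, p1):
    # Pythagoras on two deprojected points
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p0, p1)))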

 

 

Please let us know if this fixes your issue.

Regards,

Jesus G.

Intel Customer Support
Robotz
Beginner

Wow - thanks so much for that. It's simple high-school math, and I was scouring the internet trying to find it. Glad the intrinsics contain the constants needed. This helped me convert the point cloud to a laser scan for SLAM usage.

Thanks again

 

RSun9
New Contributor I

Hi Jesus,

Thanks for the response, but I don't quite understand your solution: Distance = sqrt(X*X + Y*Y + Z*Z). What is this distance? Distance from what?

I am measuring the distance between two points, (X0, Y0, Z0) and (X1, Y1, Z1), in the point cloud, corresponding to pixels (x0, y0) and (x1, y1). Since I am only interested in the height, ideally X0 = X1 and Z0 = Z1. That leaves H = Y0 - Y1 = depth * (y0 - ppy)/fy - depth * (y1 - ppy)/fy = depth * (y0 - y1)/fy as the only non-zero component.

If we denote D = depth and bh = (y0 - y1), we get H = (D * bh)/fy. That's exactly the formula I started with.

So if I understand correctly, your answer is actually the same as my formula, which does not work on the D435.

On the other hand, if I have misunderstood your answer, please point it out and explain.

Thanks again,

Robby

idata
Employee

Hello @Robby_S,

We are checking with engineering on this issue. We appreciate your patience.

Regards,

Jesus

Intel Customer Support
Anders_G_Intel
Employee

First, let me clarify the equation: if you choose any two points in the point cloud after you have done the conversion described above, the points will be (X1, Y1, Z1) and (X2, Y2, Z2). You then calculate deltaX = X2 - X1, deltaY = Y2 - Y1, and deltaZ = Z2 - Z1. The distance between the two points is then Distance = sqrt(deltaX*deltaX + deltaY*deltaY + deltaZ*deltaZ). Yes, in the earlier reply X, Y and Z were used as shorthand for these differences.

In your case you assume that deltaX and deltaZ are zero. You may be able to control for that, for example by measuring on a white wall viewed exactly perpendicularly, but in general it is best to use the full equation above. Have you tried that?

If you do control for deltaX = 0 and deltaZ = 0 exactly, then YES, your equation is correct.

However, I think the real issue may be whether you are using the DEPTH x, y and the DEPTH focal length. It may be that you are using the color x, y or the color focal length. Note that the external color camera has a 1.5x smaller FOV on the D435, whereas the FOV for color and depth is the same on the D415, so this may be the source of the discrepancy.
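In pyrealsense2 terms, a sketch of doing the whole measurement on the depth stream (the two pixel coordinates px0 and px1, e.g. head and feet picked on the depth image, are placeholders for illustration):

import math
import pyrealsense2 as rs

def point_distance(depth_frame, px0, px1):
    # Use the DEPTH frame's pixels together with the DEPTH intrinsics
    intr = depth_frame.profile.as_video_stream_profile().get_intrinsics()
    d0 = depth_frame.get_distance(px0[0], px0[1])
    d1 = depth_frame.get_distance(px1[0], px1[1])
    p0 = rs.rs2_deproject_pixel_to_point(intr, [px0[0], px0[1]], d0)
    p1 = rs.rs2_deproject_pixel_to_point(intr, [px1[0], px1[1]], d1)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p0, p1)))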

RSun9
New Contributor I

Thanks for your detailed explanation.

I was indeed using (x, y) from the color frame. However, I made sure that I aligned the depth frame to the color frame beforehand, so I assumed that the (x, y) coordinates on the depth frame should be quite close to those on the color frame. No?

I got the focal length from the depth stream's intrinsics, so I must be using the depth focal length, right?
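For reference, one way to keep the pixel coordinates and the intrinsics on the same stream is to read the intrinsics from the aligned depth frame after rs.align(rs.stream.color); to my understanding the aligned frame then reports intrinsics that match the color image, so they can be used with (x, y) found on the color frame (a sketch under that assumption, reusing the 848x480 settings from earlier in the thread):

import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 848, 480, rs.format.bgr8, 30)
pipeline.start(config)

align = rs.align(rs.stream.color)
aligned = align.process(pipeline.wait_for_frames())
aligned_depth = aligned.get_depth_frame()

# Intrinsics of the aligned depth frame; assumed to describe the color image
# geometry, so they should match (x, y) coordinates found on the color frame
intr = aligned_depth.profile.as_video_stream_profile().get_intrinsics()
print(intr.fx, intr.fy)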
