Software Archive
Read-only legacy content

Why is StreamCalibration different from QueryColorFocalLength?

Jan_Rüegg
Beginner
511 Views

When using this code:

  PXCCapture::Device *device; // assumed to be a valid device handle obtained from the capture module
  std::cout << "Device focal length: " << device->QueryColorFocalLength().x
            << "," << device->QueryColorFocalLength().y << std::endl;

I get the following output for the resolution 640x480:

    Device focal length: 590.476,590.476


However, when running this code instead:

  PXCProjection *projection = device->CreateProjection();
  PXCCalibration *stream_calibration = projection->QueryInstance<PXCCalibration>();
  PXCCalibration::StreamCalibration calibration;
  PXCCalibration::StreamTransform transformation;
  stream_calibration->QueryStreamProjectionParameters(PXCCapture::STREAM_TYPE_COLOR, &calibration, &transformation);
  std::cout << "Stream focal length: " << calibration.focalLength.x << "," << calibration.focalLength.y << std::endl;

I get the following output, for the same resolution:

    Stream focal length: 460.141,613.521
 

Why is the focal length different in the two cases?

Why is the focal length different for x and y in the second case?

 

0 Kudos
14 Replies
SMali10
New Contributor I

Hi Jan,

Are you using the F200 or the R200?

I believe the documentation says that "QueryColorFocalLength" returns a "generic" value for the camera field of view (a value that is approximate for all models of that device) and "QueryStreamProjectionParameters" gets the specific calibrated values for that camera.

It's possible that "QueryStreamProjectionParameters" is returning the rectified values. However it's a bit strange that the values are that different, and it's really strange that the X and Y values are different at all.

I've been having a lot of trouble getting the calibration data from the R200 to work at all, and I have yet to get an answer as to whether the functions are broken or simply not supported for that camera.

Jan_Rüegg
Beginner

Thanks for the thoughts, Sam.

I have an F200. And I have to say that for the "sensor-native" 16:9 resolutions, the values for the X and Y focal lengths (in StreamProjectionParameters) are more reasonable, and approximately equal (though still not the same as in QueryColorFocalLength).

However, as soon as I go to a 4x3 resolution, the image I get is (obviously) cropped, and the focal lengths are completely messed up...

samontab
Valued Contributor II

Well, the calibration parameters should work for the image size used in the calibration.

If you change the image size, parameters like the focal length, and the optical centre need to be scaled accordingly. Not sure if the SDK is doing this though.

Jan_Rüegg
Beginner

Yes, the SDK is doing some scaling. And as mentioned, I think for the 16x9 calibrations the scaling is (more or less) correct.

However, for the 4x3 resolution, there seems to be some bug in the code to me...

Xusheng_L_Intel
Employee

The QueryColorFocalLength function returns the color sensor focal length, in pixels, along the horizontal and vertical axes. The values vary with the color stream resolution setting. The property is a model-fixed value, not the device-instance calibrated value. For details, see https://software.intel.com/sites/landingpage/realsense/camera-sdk/v1.1/documentation/html/querycolorfocallength_device_pxccapture.html?zoom_highlightsub=QueryColorFocalLength

Jan_Rüegg
Beginner

@David Lu: Thanks for the clarifications. That answers one question: why the value differs between the two functions.

However, it is still very unclear why the focal lengths for X and Y are so hugely different (460 vs. 613). That would mean that all the pixels of the returned image have an aspect ratio of 4 to 3, which is clearly not the case.

I'm pretty sure that if I calibrate the images with some toolbox and a checkerboard, I will get a focal length for x and y that are pretty much the same.

Is this a bug in the Intel SDK?

samontab
Valued Contributor II

Hi Jan,

I reckon this is because they need a "good enough" calibration procedure for all of their mass-produced cameras, which may lead to sub-par calibration parameters.

Some time ago I calibrated the camera to remove the barrel distortion from the IR stream, and the images ended up looking way better. I didn't use the SDK though, since I don't think there is a way to specify the calibration parameters to it (i.e. I think they are read only).

If it is not possible to specify the calibration parameters, then everything would need to be done outside the SDK, maybe even the depth estimation algorithm.... which would kinda defeat the purpose of the SDK.

Jan_Rüegg
Beginner

@samontab: I agree they have to take a "good enough" approach to calibrating all the cameras. But if the value they get is 30% off, they might as well not calibrate at all and just roll a die...

Also, the focal length *must* be the same from a theoretical standpoint when just cropping the image from 640x480 to 640x360, which is not the case in the SDK.

SMali10
New Contributor I

@samontab

As David pointed out I believe "QueryColorFocalLength" is the generic "good enough calibration" for all of the cameras. The values returned from "QueryStreamProjectionParameters" should be the device specific calibration values. I would be very surprised to find that these cameras are not individually calibrated.

@Jan

It seems (and this is partly speculation) that many of the algorithms in the SDK are very similar to, if not based on, OpenCV. If that's the case, the x and y focal lengths should be identical, regardless of the image aspect ratio. Generally, the returned focal length values refer to the lens itself, not the field of view of the image. Different values would mean a heavily distorted or anamorphic lens.

If you divide the x focal length by the y resolution you get 0.95862708, and the y focal length by the x resolution is 0.95862656, which are basically the same. If you are working with OpenCV my guess is to use the larger value 613.521 as your focal length.

However, it's also strange that the values are 460x613 when it's a 640x480 image, as if the values are inverted.

Jan_Rüegg
Beginner

@sam: Very good observation with dividing the focal lengths by the resolutions. My assumption is that the SDK applies an incorrect scaling to the focal length when the image is cropped.

I think the values they give would be correct if the image was not cropped but still the full sensor image (stretched) with the smaller resolution.

Xusheng_L_Intel
Employee

Both device->QueryColorFocalLength and QueryStreamProjectionParameters take their parameters directly from the camera, and the Projection module uses the same data as QueryStreamProjectionParameters. The camera is calibrated for a 16:9 aspect ratio, so for the 4:3 use case the following corrections must be applied to the calibration parameters:

  1. calibration.focalLength.x * 4 / 3
  2. calibration.principalPoint.x * 4 / 3 – colorWidth / 6

So, 460.141 * 4 / 3 = 613.521

Jan_Rüegg
Beginner

@David Lu: Ok, I see how you get to the new focal length. However, this is wrong: the focal length gives the distance, in pixels, from the camera center to the sensor. When the image is cropped to 4:3 (which is what the camera does), that distance stays the same and the width of a pixel stays the same, so the focal length must stay the same as well. If, on the other hand, you stretch the sensor image, then the width of one pixel gets smaller, which means the distance in pixels from the camera center gets bigger, leading to 460.141 * 4 / 3 = 613.521.

In summary, one of two things is wrong:

  • Either the focal length of 613.521 is wrong and should be the same as before
  • Or the image returned by the camera is wrong, and should be stretched instead of cropped

However, changing the principal point is correct when cropping the image left and right, since the top-left corner of the image then has an offset relative to where it was before.

samontab
Valued Contributor II

Yes, the focal length should not change when only cropping an image.

The problem is that I'm not sure how the SDK is creating those images at different resolutions, especially the ones that are not in the native aspect ratio of the camera. Maybe it does a crop and a resize? That would explain why the focal length changes...

Now, about cropping changing the principal point: it should only happen when the crop region does not start at the top/left of the original image.
 

The image coordinate frame starts at top/left with (0, 0), and goes up to bottom/right at (width-1, height-1). If calibration is done using the full sensor, the principal point should be somewhere near (width/2, height/2).

Cropping is basically taking a rectangular subset of this image.

If you crop, say, a very small area near the centre of the image, the principal point will now be a much smaller number than (width/2, height/2), since your reference is now way closer to it than the reference of the original image was.

But if your crop is, say, a small region at the top/left of the image (so the crop starts at (0, 0)), then the principal point will still be exactly (width/2, height/2), even though it may now lie outside the cropped image.

So, what really matters when cropping is the top/left coordinate.

I know that the SDK does some kind of cropping because the field of view changes when you request different resolutions, but I'm not sure about the top/left coordinate used, and if that is taken into account in the SDK, or if some scaling is done to the image...

I guess we need some data points at different requested resolutions to see how it is actually done in the SDK, and if there is a problem in the SDK or not...

SMali10
New Contributor I

Hey samontab. Basically everything you said is correct, but I do need to clarify something in case someone just starting out with optics and computer vision comes across this thread.

The principal point is never exactly in the direct center of the image (width/2, height/2). Really the only time it will be the center is in a perfectly made pinhole camera.

The principal point is where the optical center is located on the image plane, represented in pixel coordinates. Basically it's where the center of the lens is on the sensor itself.

Because of variations during manufacturing and small imperfections in an individual lens/optics system, the principal point is usually several pixels away from the center of the image. This is why every depth camera has to be individually calibrated: if the calculated distortion and optics are off by a few pixels, then everything else will be too.

So the principal point after a center crop would be (width/2, height/2) of the new image, plus the offset from the center of the old image to the principal point of the old image. (though this might only apply to the R200 if the F200 corrects for this automatically)
