We are trying to measure the distance between two points (u1,v1) and (u2,v2) on the depth image (Depth Stream) taken by the F200. We collected the streams as Sebastian suggested in his tutorial and used QueryVertices of PXCProjection to gather the 3D world coordinates (x,y,z) for the image. This is the code we are using:
_depth = sample->depth;
projection->QueryVertices(_depth, vertices);
Also, before initializing the PXCSenseManager, I change my coordinate system with this code so that the z values are positive:
PXCSession *pxcsession = pxcSenseManager->QuerySession();
pxcsession->SetCoordinateSystem(PXCSession::COORDINATE_SYSTEM_FRONT_DEFAULT);
When I call QueryVertices, I have to capture the result in an array of PXCPoint3DF32, so I allocate a vertices array, and I create the PXCProjection with this code:
PXCCaptureManager *capManager = pxcSenseManager->QueryCaptureManager();
PXCCapture::Device *device = capManager->QueryDevice();
PXCProjection *projection = device->CreateProjection();
PXCPoint3DF32 *vertices = (PXCPoint3DF32 *)malloc(sizeof(PXCPoint3DF32) * wid * hei);
So, my doubt is: when I get the coordinates of each point, the (x,y,z) values are not consistent. For points on the same plane parallel to the camera, the z values differ, which does not seem correct, because z indicates how far a (u,v) point is from the camera's x-y plane, and that should be the same for every point on a plane parallel to the camera.
Can someone suggest the sequence of steps to collect the world coordinates, in case we skipped any? Also, can the distance between two points in the depth stream be measured using the F200?
Please clear our doubt.
Thank you, Anjani J.
When you say the Z values are not the same, how different are they? A few mm? cm? metres?
It makes sense that each point is not exactly the same, because it's a real measurement, and the plane is not perfect, there's noise, etc.
If you want to, say, detect a plane in the real world, you have to fit a plane model to noisy real data, deal with outliers, etc. It's not a perfect plane equation like you can use in a game, for example.
Now, if your Z values are very different, then there's a problem somewhere.
Thank you for your reply samontab,
I have attached the Z values for an image that is almost a plane (PFA), showing the image and the z values taken down the middle,
i.e. all values at pixel locations (320, i) for image resolution 640x480, i = 1 to 480.
Briefly: suppose the plane is at 35 cm from the camera. The Z values across the depth stream of the plane (parallel to the camera) change gradually from 34.2 cm to 38.8 cm on average (some frames give slightly lower or higher values, so I thought it better to take an average). This is confusing, because the plane is parallel to the camera and the values should be almost the same, or barely changing.
Please tell us what steps to follow to get accurate z coordinate values, because this does not look like a good coordinate system.
Thank you again.
Those z values look pretty good to me! How sure are you that the plane is exactly parallel to the camera? It looks like you're getting correct results, but the camera is slightly off-angle. I'd be concerned if the distances went up and down seemingly randomly across the image, but you're getting a very linear increase. I really don't think you have anything to worry about here.
Hello James, thanks for the reply.
I understand what you are saying; yes, it is very likely that the camera is not EXACTLY parallel to the plane.
I have one more doubt: for any (u,v) pixel location on the image, is the z value a coordinate value, or the distance from the origin (camera) to the point (u,v) on the plane? I tried measuring the distance between the camera and the point, and the z value seems very close to that. If it is a distance, how can I get the coordinate values from the (x,y,z) returned by QueryVertices?
I'm not entirely sure what you're asking there. The 'z' in (x,y,z) and in (u,v,z) is the same value for a corresponding point: the distance in mm from the camera plane.
Are you asking how to map from the colour image (an x,y point) to the vertices (a u,v,z point)? If you want to map a few points at a time (i.e., significantly less than the whole image, unless you don't mind waiting), you can use Projection.MapColorToDepth(); if you want to do the whole image at once you can use Projection.ProjectColorToCamera(), though the latter is a little awkward because you have to map to the depth image first.