Software Archive
Read-only legacy content

How to simulate drag and move mouse?

CLi37
Beginner
1,350 Views

To rotate a 3D object, I can move the mouse cursor onto the object, press and hold the left button, and drag to rotate it. How can I do the same thing with my hand, through the camera, instead of a physical mouse?
6 Replies
steve-vink
Beginner
Take a look at last year's game winner, Mystic Blocks. It does exactly this, and the author shared a lot of information about how. I'm on my mobile phone at the moment, so I'm unable to find the links.
CLi37
Beginner

That may be one solution, but what I want is a mouse simulator; that should be part of the SDK.

In my case I need the screen coordinates of the hand cursor so that I can translate them into 3D world coordinates.

One problem is how to lock/release a position and generate the relative position of the hand motion.

Can I get real-world 3D coordinates of my hands from the cameras? And where is the origin?

samontab
Valued Contributor II

Yes, you can.

Get the raw depth stream, then use the projection utilities to obtain the real-world 3D coordinates of the pixels that represent the hand:

    /**
        @brief Map depth coordinates to world coordinates for a few pixels.
        @param[in]   npoints        The number of pixels to be mapped.
        @param[in]   pos_uvz        The array of depth coordinates + depth value in the PXCPoint3DF32 structure.
        @param[out]  pos3d          The array of world coordinates, in mm, to be returned.
        @return PXC_STATUS_NO_ERROR     Successful execution.
    */
    virtual pxcStatus PXCAPI ProjectDepthToCamera(pxcI32 npoints, PXCPoint3DF32 *pos_uvz, PXCPoint3DF32 *pos3d)=0;

Take a look at pxcprojection.h for more info.

CLi37
Beginner

I can capture the world coordinates of the hand from a Java example in the SDK. Oddly, Intel did not provide this sample in C++, so I had to convert it from Java to C++.

My problem is how to map or transform the 3D world coordinates to my 2D screen coordinates.

Or, how can I match my 3D model world to the 3D hand world? Intel may need to provide a solution to this basic problem; it is fundamental to using a 3D camera for virtual reality. Once the two worlds are matched, all operations become easy to understand. Moreover, why not provide a cursor at the palm center of the hand in the API?

 

Dagan_E_Intel
Employee

Hi Chang-Li,

You can have a look at the hands_viewer samples (either C++ or C#) to see how the joint information can be retrieved. Look for this method: QueryTrackedJoint. I would go for JOINT_WRIST.
You can then retrieve the image coordinates of the joint by calling:

int wristX=(int)jointData.positionImage.x;
int wristY=(int)jointData.positionImage.y;

This data is in pixels, in the coordinates of the depth image (640x480).
Now convert it to your screen's resolution:

int screenX = (wristX * screenWidth) / 640;
int screenY = (wristY * screenHeight) / 480;

This should give the correct screen location.

Now you can use a gesture event (say "fist") to trigger a mouse click.

 

CLi37
Beginner

I can get the image-center coordinate now, but I cannot find an API to simulate mouse drag & move.

A mouse drag & move involves:

1. processing the WM_LBUTTONDOWN message
2. processing the WM_MOUSEMOVE message
3. processing the WM_LBUTTONUP message

Which gesture can trigger WM_LBUTTONDOWN? There is no cursor available to select an object. Without these basic APIs it is hard to control objects in either 2D or 3D.
