Hi everyone,
After taking a recent break from RealSense development, my company has resumed work by importing our tech into Unity 5 and the new R3 SDK (v5). Here are our first impressions.
* We were disappointed that none of the techniques we developed and shared with the community over the past year were incorporated into the new SDK. These include the use of multiple Action scripts of the same type inside the same object; full avatar arm control (shoulder, upper arm and lower arm, not just the hand); movement amplification of face-tracked objects; slider-based object control mechanisms; and tracking of the real-life knee and toe joints. There is also the moving of Interactive Cloth with a TrackingAction, but that was too recent to have made the R3 release.
We are realistic, though: just because something is useful does not mean a platform-holder will necessarily make official use of it in their engine if it does not fit their vision for the product. We will continue to share the advances we develop with the community so that everyone can benefit from our work.
* Regarding the performance of Unity 5 in combination with the R3 SDK: our full-body avatar's arms and legs moved significantly more smoothly, and the limbs mirrored the movements of the user's real-life hands even more closely than before. This validates Intel's assertion in the release notes that hand-tracking is improved in R3. If anything, the detection was over-sensitive, but we can tune that and are generally pleased with the performance.
* Whilst the project worked well in the editor's play preview, the camera went dead when doing a full-screen Build and Run. Again, that is something we will look into.
We'll post on the forum about new techniques as we discover them. For now, though, we feel that Unity 5 and the R3 SDK (v5) give us an excellent foundation to continue to build upon.
Good feedback; it will help us to improve the RealSense SDK in the future. Thanks!
Thanks, David! For me personally, the rotation of objects when the hand is turned side-on, with no way to disable it, is by far the greatest problem; if that were the only item on our wish list to be addressed, I'd be very happy indeed.
My company spent months developing numerous prototypes to counteract this, without luck (yet). Editing the TrackingAction script code made no difference either, so my guess is that the hand side-turn is handled somewhere in the hand-tracking module of the library files that ordinary users can't edit.
The issue particularly affects my company's project because we built a full-body avatar with turnable wrists. When the hand is turned to get a better grip on an object, every other hand-driven TrackingAction in the avatar's body activates and moves a body part in whatever direction is set in that TrackingAction's constraints.
So if, for example, the avatar's upper arm is constrained to move up and down when the palm moves up and down in front of the camera, that arm will also move up and down when the hand is merely turned left and right. Face-tracking has a similar issue: it moves objects when the user's forehead moves towards or away from the camera.
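None of our prototypes have cracked it yet, but the general shape of what we have been trying is a gate that discards TrackingAction-driven movement while the palm is rolled past a threshold. The sketch below is illustrative only: the GetPalmRollDegrees() helper is hypothetical and stands in for however you obtain palm roll from the hand-tracking data, and the 30-degree threshold is just a starting value.

```csharp
using UnityEngine;

// Rough sketch of a gate that undoes movement applied to this object while the palm is rolled.
// Attach it to the same GameObject that a hand-driven TrackingAction moves.
// GetPalmRollDegrees() is a hypothetical placeholder, not part of the RealSense SDK.
public class SuppressWhileHandRolled : MonoBehaviour
{
    public float rollThresholdDegrees = 30f;   // illustrative threshold
    private Vector3 lastAcceptedPosition;

    void Start()
    {
        lastAcceptedPosition = transform.position;
    }

    void LateUpdate()
    {
        if (Mathf.Abs(GetPalmRollDegrees()) > rollThresholdDegrees)
        {
            // Hand is turned side-on: discard whatever movement was applied this frame.
            transform.position = lastAcceptedPosition;
        }
        else
        {
            lastAcceptedPosition = transform.position;
        }
    }

    private float GetPalmRollDegrees()
    {
        // Hypothetical placeholder - replace with a real palm-orientation query from the hand data.
        return 0f;
    }
}
```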
If we eventually find a solution ourselves, we'll be sure to post it on the forum.
Thanks again, David!
Continuing my feedback about using the latest RealSense SDK in Unity 5 as I continue to adjust our project ...
* I reported earlier that the RealSense camera would not activate when doing a full-screen Build and Run or when running the project as an external published executable. I have found the solution. When Unity 5 is installed, it defaults to building programs for the old x86 (32-bit) architecture. For the camera to work, the program needs to be built for x86_64 (64-bit).
You can set this by going to File > Build Settings in Unity and selecting 'x86_64' from the drop-down menu beside the 'Architecture' option. The camera will then activate when you run the program in full-screen.
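If you build from a script rather than the menu, the same target can be forced with a small editor script. This is a minimal sketch; the scene list, menu path and output path are placeholders for your own project.

```csharp
// Editor/BuildWin64.cs - minimal sketch; scene paths and output path are placeholders.
using UnityEditor;

public static class BuildWin64
{
    [MenuItem("Build/Windows x64")]
    public static void Build()
    {
        string[] scenes = { "Assets/Scenes/Main.unity" };  // placeholder scene list

        // Force the 64-bit standalone target so the RealSense camera initialises at runtime.
        BuildPipeline.BuildPlayer(scenes, "Builds/MyGame.exe",
            BuildTarget.StandaloneWindows64, BuildOptions.None);
    }
}
```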
* In the previous SDK, there was an awkward delay of a few seconds when switching between control of our avatar's arms and legs using hand-open and hand-close gestures. That delay has now vanished and the switch between arm and leg control is instant, making use of the avatar in a game much more practical.
The switch was too easily triggered, though: the legs would often start moving before the closed-hand gesture had actually been made. We found that changing the 'Openness Factor' of the Hand Opened and Hand Closed rules from the default '50' to a lower '30' fixed this problem.
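For anyone who prefers to make the switch in code rather than through the rules' Inspector settings, the underlying idea is just a threshold check on the hand's openness. The sketch below is ours, not part of the SDK; it assumes you already have an openness reading of 0 (closed fist) to 100 (fully open) from the hand-tracking data.

```csharp
// Illustrative sketch of openness-based switching between arm and leg control.
// 'openness' is assumed to come from the hand-tracking data (0 = closed fist, 100 = fully open);
// class and field names are ours, not part of the RealSense SDK.
public class LimbModeSwitcher
{
    public enum Mode { Arms, Legs }

    public Mode Current = Mode.Arms;

    // Equivalent of the rules' Openness Factor; we lowered ours from 50 to 30.
    public int closedThreshold = 30;   // hand counts as closed only below this value
    public int openThreshold = 30;     // hand counts as open at or above this value

    public void UpdateMode(int openness)
    {
        if (openness < closedThreshold)
            Current = Mode.Legs;       // closed-fist gesture drives the legs
        else if (openness >= openThreshold)
            Current = Mode.Arms;       // open-hand gesture drives the arms
    }
}
```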
* Tracked hand and face objects move significantly faster in response to camera inputs, so much so that we were able to strip out the movement amplification mechanisms we had installed when the project was in Unity 4.6.
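For reference, the amplification we removed was essentially a gain applied to the tracked object's frame-to-frame movement. A minimal version of the idea, attached to the same GameObject that the TrackingAction moves, looks something like this; the gain value is illustrative.

```csharp
using UnityEngine;

// Minimal sketch of movement amplification: scale the frame-to-frame movement of a tracked
// object by a gain. Runs in LateUpdate so it acts on whatever the tracking applied this frame.
public class MovementAmplifier : MonoBehaviour
{
    public float gain = 2.0f;          // how much to exaggerate the tracked movement (illustrative)
    private Vector3 lastPosition;

    void Start()
    {
        lastPosition = transform.position;
    }

    void LateUpdate()
    {
        Vector3 delta = transform.position - lastPosition;
        transform.position = lastPosition + delta * gain;
        lastPosition = transform.position;
    }
}
```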
* Collider-equipped objects with Rigidbody components that used the 'Extrapolate' setting in the Rigidbody's 'Interpolate' option seemed, at least in our project, to provide much weaker collision detection than in Unity 4.6. Switching this setting to 'Interpolate' made collisions reliable again.
The difference between the two is that Interpolate smooths an object's movement based on its transform in the previous physics frame, whilst Extrapolate smooths by predicting where the object will be in the next frame.
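If you have a lot of affected objects, the change can also be applied from a small script rather than through each Rigidbody's Inspector. A minimal sketch:

```csharp
using UnityEngine;

// One-line fix per object: switch the Rigidbody from Extrapolate back to Interpolate.
public class UseInterpolation : MonoBehaviour
{
    void Awake()
    {
        Rigidbody rb = GetComponent<Rigidbody>();
        if (rb != null)
            rb.interpolation = RigidbodyInterpolation.Interpolate;
    }
}
```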
* In the previous SDKs, we found that using the default value of '100' in the Virtual World Box section of the 'TrackingAction' script tended to be disruptive to performance, and setting it to '0' was preferable. With Unity 5 and the new SDK, though, leaving the Virtual World Box at its default '100' for X, Y and Z gave a noticeable improvement in the ease of control of our avatar's limbs and reduced the likelihood of the arms springing backwards behind the avatar's back when the hands were first detected by the camera.
