When I add the "Image" prefab to my empty scene with just a camera, the game seems to run at the same frame rate as the real-life camera.
Is this intentional?
For example, when lighting changes drastically, the camera's frame rate sometimes drops. But then so does the whole game: the main thread spikes to 60 ms and averages around 30 ms. The main thread should not be that heavy just to display a webcam texture on a plane.
I am using the older Senz3D camera, but I get the exact same issue with another camera I have tried. So I'm not sure if even the new camera will fix this.
Anyone have an idea why displaying the webcam feed on a plane with DrawImages would be so inefficient?
Does the RSSDK pause execution between camera frames or something?
Okay, I found a solution, but I'm not sure what side effects it will have.
Using the profiler, I found that AcquireFrame was causing all of my application's lag.
In SenseToolkitManager.cs, in the Update() function, change this line:
_sts = SenseManager.AcquireFrame(true, 100);
to this:
_sts = SenseManager.AcquireFrame(false, 1);
Now my frame rate has jumped from 15-20 fps to 90+ fps, and my main thread time is about 10 ms instead of 60 ms.
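For reference, here is roughly how that part of Update() reads after the change. The status check and the ReleaseFrame pairing below are the SDK's general acquire/release pattern, not copied verbatim from SenseToolkitManager.cs:

    // Don't wait for all streams (false); time out after 1 ms instead of
    // 100 ms, so a slow camera can't stall Unity's main thread.
    _sts = SenseManager.AcquireFrame(false, 1);

    if (_sts == pxcmStatus.PXCM_STATUS_NO_ERROR)
    {
        // ... read out the sample data needed this frame ...

        // Release promptly so the SDK can process the next frame.
        SenseManager.ReleaseFrame();
    }
    // On a timeout, simply skip this frame; Unity keeps rendering.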
Maybe I haven't understood correctly, and I may be wrong, but it sounds strange to me that the main thread has to wait for the camera's frame in order to move forward. Are you running the code that acquires frames in a separate thread?
I'm using a background worker that raises an event when each frame is ready (the same event hands the frame to the main thread). That way the main thread, which listens for events from the background worker, runs on its own with no lag and analyzes the frame when the event is raised.
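Roughly, the pattern looks like this (class and event names are just illustrative, not from the SDK; AcquireFrame, QuerySample, and ReleaseFrame are the actual RSSDK calls). Note that event handlers run on the worker thread, so they should copy out any image data they need before this loop calls ReleaseFrame:

    using System;
    using System.Threading;

    // Worker that blocks on AcquireFrame off the main thread and raises
    // an event per frame, so the render loop never waits on camera I/O.
    public class FrameGrabber
    {
        public event Action<PXCMCapture.Sample> FrameReady;

        private readonly PXCMSenseManager _sm;
        private volatile bool _running = true;

        public FrameGrabber(PXCMSenseManager sm) { _sm = sm; }

        public void Start()
        {
            var t = new Thread(() =>
            {
                while (_running)
                {
                    // Blocking here is harmless: we're not on the render thread.
                    if (_sm.AcquireFrame(true, 100) == pxcmStatus.PXCM_STATUS_NO_ERROR)
                    {
                        var handler = FrameReady;
                        if (handler != null) handler(_sm.QuerySample());
                        _sm.ReleaseFrame();
                    }
                }
            });
            t.IsBackground = true;
            t.Start();
        }

        public void Stop() { _running = false; }
    }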
Mike
I'm using everything that comes by default with the Unity Toolkit package.
I created an empty scene with just a camera and their "Image" prefab, and hit Play.
When I look at the stats, the game's frame rate seems to match the camera's.
For example, when I put my hand over the sensor and it gets darker, the camera's frame rate drops (because it's trying to correct for low lighting, I guess). That drop in the camera's frame rate also makes the game's frame rate drop.
I can make a video later showing exactly what's going on.
What seems to be happening is that the whole game pauses while it waits for I/O from the camera, unless you set that first argument to false.
OK, what I understand is that the game waits for camera input before moving forward. That's why I think decoupling the camera from the main thread might work. Unfortunately I have no experience with Unity and can't say much more :(
Mike
Thanks anyway. I don't have the new camera yet, so I don't know what functionality changes this will cause, but at least it did improve the frame rate.
AcquireFrame waits for data to become available and pauses processing (but not capture) until you invoke ReleaseFrame, so the frame should be released as soon as possible so that subsequent frames can be processed. In Unity, it's recommended to use AcquireFrame(false, 0) to avoid impacting rendering performance.
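Putting that together, a minimal sketch of the recommended pattern in a Unity script's Update(), assuming SenseManager is a PXCMSenseManager instance (variable names here are illustrative):

    void Update()
    {
        // false/0: poll without waiting, so a slow or stalled camera
        // never blocks Unity's render loop.
        if (SenseManager.AcquireFrame(false, 0) == pxcmStatus.PXCM_STATUS_NO_ERROR)
        {
            PXCMCapture.Sample sample = SenseManager.QuerySample();
            // ... copy sample.color / sample.depth into a texture here ...

            // Release immediately so capture and processing of the next
            // frame can continue.
            SenseManager.ReleaseFrame();
        }
        // If no frame is ready yet, just render this Unity frame without one.
    }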
