Hello,
I'm running gesture tracking and emotion tracking on two separate threads in two separate classes, and for some reason, when emotion tracking is enabled, my gesture-tracking feedback slows down dramatically.
Is there a way to fix this, or am I just going to have to deal with it for now and wait for a new SDK release?
Thanks!
FYI: There is not going to be a new SDK release for the contest.
The camera's default profiles (ConfidenceThreshold and FilterOption) for hand tracking and emotion detection are different, which causes the performance issue. We do not recommend using these modules at the same time. If you do need to run them together, you will need to find the best profiles for your case. Thanks!
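The compromise David describes can be sketched generically. The numbers below are invented placeholders, not the SDK's real defaults, and ConfidenceThreshold/FilterOption are treated here as plain settings rather than real API calls; the point is only the idea of deriving one shared profile from two modules' defaults:

```python
# Hypothetical defaults for illustration only -- not the SDK's actual values.
DEFAULTS = {
    "hand_tracking": {"ConfidenceThreshold": 90, "FilterOption": 6},
    "emotion":       {"ConfidenceThreshold": 75, "FilterOption": 1},
}

def shared_profile(modules):
    """Pick one compromise profile so every module drives the camera the same way."""
    thresholds = [DEFAULTS[m]["ConfidenceThreshold"] for m in modules]
    filters = [DEFAULTS[m]["FilterOption"] for m in modules]
    # Start from the strictest threshold and the lightest filter; tune from there.
    return {
        "ConfidenceThreshold": max(thresholds),
        "FilterOption": min(filters),
    }

print(shared_profile(["hand_tracking", "emotion"]))
```

In practice you would then apply this one profile to both modules and adjust it experimentally, which is what "find the best profiles" amounts to.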
Samontab, I know this. I plan on expanding my application in the future and am looking at a broader horizon than just the contest.
David, thank you for the information; knowing this, I can adjust my application accordingly. What would be the best profiles for using them at the same time? Is that something I need to find myself, or do I just need to set the ConfidenceThreshold and FilterOption the same for both?
Thank you!
You need to find the best profiles yourself if you do need to run those algorithms at the same time.
Thank you for your help David!
Knowing that this is a common problem with the current SDK, I feel more comfortable about compromising a bit in my demo.
Instead of capturing emotion data all the time, I adjusted it so it only captures about 5-6 seconds of emotion data whenever I need it. This way it doesn't slow down the flow of my application and still gives me the data I need.
Do you guys plan on fixing this within the next few SDK versions by chance?
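The burst-capture workaround described above can be sketched generically. This is not SDK code: `read_sample` is a hypothetical stand-in for whatever call polls the emotion module, and the pattern is simply "poll for a fixed window, then stop":

```python
import time

def capture_burst(read_sample, duration_s=5.0, interval_s=0.1):
    """Poll read_sample() for a short burst instead of streaming continuously."""
    samples = []
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        samples.append(read_sample())
        time.sleep(interval_s)
    return samples

# Stand-in sampler for demonstration; a real app would poll the emotion
# module here and use a duration of around 5-6 seconds.
burst = capture_burst(lambda: {"joy": 0.8}, duration_s=0.3, interval_s=0.1)
print(len(burst))
```

Because the emotion module is only queried inside the burst window, the gesture thread keeps the camera to itself the rest of the time.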
Is this a common issue when running any multiple modules at once?
If it is, then using multiple modules at once would not be recommended (only one enabled at a time), or there will be performance issues.
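The "only one enabled at a time" policy above can be expressed as a small gatekeeper. This is a generic sketch, not SDK code; the class and method names are illustrative:

```python
class ModuleSwitch:
    """Keep at most one tracking module enabled at a time."""

    def __init__(self, names):
        self.names = set(names)
        self.active = None

    def enable(self, name):
        if name not in self.names:
            raise ValueError(f"unknown module: {name}")
        # Enabling one module implicitly disables whichever was active before.
        self.active = name

    def disable(self):
        self.active = None

switch = ModuleSwitch(["gesture", "speech"])
switch.enable("gesture")
switch.enable("speech")
print(switch.active)  # only the most recently enabled module stays active
```

Routing all enable/disable calls through one object like this makes it impossible to accidentally leave two modules competing for the camera.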
I tried two modules: gesture and speech recognition. When running both, the tap gesture was not detected and some command phrases were not recognized. The performance hit directly affected the user experience.