Software Archive
Read-only legacy content

Using 3D segmentation and emotion detection together.

Peter_O_Hanlon
Innovator

Okay, so I have some code where I want to do both emotion detection and 3D segmentation together. I'm also doing gesture recognition. If I enable hand tracking and emotion detection, things work well. If I enable hand tracking and 3D segmentation, all works well. It's only when I try to mix emotion detection and 3D segmentation that things go wrong. Specifically, when I call senseManager.Init(handler), the call never returns. Effectively, what I have is the following:

PXCMSenseManager senseManager = PXCMSenseManager.CreateInstance();
PXCMSenseManager.Handler handler = new PXCMSenseManager.Handler { onModuleProcessedFrame = ProcessFrame };
senseManager.EnableHand();      // gesture recognition / hand tracking
senseManager.EnableEmotion();   // emotion detection
senseManager.Enable3DSeg();     // 3D segmentation
senseManager.Init(handler);     // never returns when Emotion and 3DSeg are both enabled

I'm curious whether anyone has been able to get these to work together.
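For context, the handler and the streaming call that follow Init look roughly like this in my code (a trimmed-down sketch; the real ProcessFrame does more than just keep the pipeline running):

// Called by the SDK each time a module finishes processing a frame.
pxcmStatus ProcessFrame(int mid, PXCMBase module, PXCMCapture.Sample sample)
{
    // Hand, emotion and segmentation results would be read here via the
    // module data objects; returning NO_ERROR keeps the pipeline streaming.
    return pxcmStatus.PXCM_STATUS_NO_ERROR;
}

// After Init succeeds, start streaming; with 'false' this returns
// immediately and ProcessFrame is invoked on the SDK's worker threads.
senseManager.StreamFrames(false);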

Xusheng_L_Intel
Employee

This is a known bug and will be fixed in a future release. Currently you cannot use the emotion module together with any other module at the same time.
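If you need both kinds of data before the fix lands, one thing you could try (purely a sketch, not something that has been verified) is to keep the emotion module in its own PXCMSenseManager instance so it never shares a pipeline with hand tracking or 3D segmentation. Here, mainHandler and emotionHandler are placeholders for your own PXCMSenseManager.Handler objects, and the sketch assumes the camera service allows two SenseManager instances to stream concurrently:

// Pipeline 1: hand tracking + 3D segmentation (known to work together).
PXCMSenseManager mainManager = PXCMSenseManager.CreateInstance();
mainManager.EnableHand();
mainManager.Enable3DSeg();
if (mainManager.Init(mainHandler) < pxcmStatus.PXCM_STATUS_NO_ERROR)
    return;

// Pipeline 2: emotion detection on its own instance (unverified assumption
// that a second SenseManager can stream from the camera at the same time).
PXCMSenseManager emotionManager = PXCMSenseManager.CreateInstance();
emotionManager.EnableEmotion();
if (emotionManager.Init(emotionHandler) < pxcmStatus.PXCM_STATUS_NO_ERROR)
    return;

// Non-blocking streaming: both handlers are called on SDK worker threads.
mainManager.StreamFrames(false);
emotionManager.StreamFrames(false);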

Peter_O_Hanlon
Innovator

Thanks David. Do you have any timescale on the fix?

Xusheng_L_Intel
Employee

You should get it pretty soon.

Piotr_P_
Beginner

Has this bug been fixed? I am using the SDK 2014 Gold release and see problems capturing gestures when the emotion module is enabled. Is there any patch that lets the emotion module be used without disturbing the other modules?

Tri
Beginner

Has it been fixed? And could we use segmentation with gesture recognition at the same time? 
Thanks.
