
SDK Performance Degradation

Wezley_S_
New Contributor I

Hi guys,

Am I the only one who notices this?

After a few days of using the Intel RealSense SDK without restarting my PC, the performance of RealSense applications degrades noticeably. I notice this from time to time while working on my own application, and when I go test the demo apps I find the same issues there.

Intel, are you guys aware of this and will it be fixed in the coming releases? 

Processor: 4th Gen Intel Core i7-4500U
RAM: 8 GB
GPU: NVIDIA GeForce GT 730M

MartyG
Honored Contributor III

If you mean that you are leaving RealSense running constantly for a few days without rebooting your PC, I wouldn't worry too much about a memory leak that takes that long to manifest, unless you are building an application that is designed to run continuously for days.

If you are building your application for the RealSense App Challenge contest, it's certainly very unlikely that the contest judges would spend that long testing your application :)

Wezley_S_
New Contributor I
I worked really hard on building a good garbage collection system within my application, and the fact that the demos start to lose performance as well makes me believe it's the SDK itself. Unless a memory leak in my program can cause the SDK to lose performance even after it's closed? I don't really think that's possible, though, because I've been keeping a very close eye on my application's resource consumption.
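
For reference, this is roughly the kind of check I've been running on my own app: a minimal sketch that logs the process's memory counters over time so a leak on my side can be ruled in or out (Windows-only, via GetProcessMemoryInfo; the helper name is just illustrative, not anything from the SDK):

#include <windows.h>
#include <psapi.h>   // link against Psapi.lib
#include <cstdio>

// Logs this process's working set and private bytes. Call it
// periodically during a long run; a steadily climbing private
// usage points at a leak in the host application.
void LogProcessMemory()
{
    PROCESS_MEMORY_COUNTERS_EX pmc = {};
    if (GetProcessMemoryInfo(GetCurrentProcess(),
                             reinterpret_cast<PROCESS_MEMORY_COUNTERS*>(&pmc),
                             sizeof(pmc)))
    {
        printf("working set: %zu KB, private: %zu KB\n",
               pmc.WorkingSetSize / 1024, pmc.PrivateUsage / 1024);
    }
}

My app's numbers stay flat with this kind of logging, which is why I suspect the SDK side.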
Artem_V_Intel
Employee

Hi Wezley S.,

Could you please specify what type of SDK functionality you are using? Could you provide a scenario that leads to such performance degradation?

Best regards,

Artem

Wezley_S_
New Contributor I
Hi Artem. The performance degradation shows up in the depth module, under basic usage. After a fresh PC restart, my application and the demo applications work fine. After 3-4 days of working on my app without a PC restart, the depth module's input seems laggy and lacking in performance. The same results can be seen in the SDK demos with my application completely closed, as in the sketch below.
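
For clarity, by "basic usage" I mean nothing more exotic than a plain acquire/release depth loop, roughly like this minimal sketch (the stream size and rate are just example values, and error handling is trimmed for brevity):

#include <pxcsensemanager.h>

int main()
{
    // Plain depth stream loop with the RealSense SDK.
    PXCSenseManager* sm = PXCSenseManager::CreateInstance();
    sm->EnableStream(PXCCapture::STREAM_TYPE_DEPTH, 640, 480, 30);
    if (sm->Init() < PXC_STATUS_NO_ERROR) return 1;

    while (sm->AcquireFrame(true) >= PXC_STATUS_NO_ERROR) {
        PXCCapture::Sample* sample = sm->QuerySample();
        // ... read sample->depth here ...
        sm->ReleaseFrame();   // frames must be released every iteration
    }
    sm->Release();
    return 0;
}

Even a loop this simple gets laggy once the machine has been up for a few days.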
samontab
Valued Contributor II

I had similar issues with my camera. I didn't really dig into the details to reproduce it, so I am not sure what the root cause is, but there's definitely something wrong.

I can confirm that when I noticed a lag in my app and then switched to the demos, the lag was still there, so there is something wrong with either the camera itself (unlikely) or with the SDK.

Having said that, I also left my app running overnight just gathering depth, color, and IR, and it was running fine in the morning, so I don't think the streams themselves are the problem.

I know that one of the projection functions in the SDK (the IR-camera-to-RGB-camera projection, I think) has a confirmed massive memory leak that becomes evident within a few frames. Maybe the other projection has a small one that only appears after prolonged usage.
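If anyone wants to check whether that leak is what's biting them, the pattern I mean looks roughly like the sketch below. I'm not certain which exact projection call leaks, so treat that part as an assumption on my side; the per-frame Release() of the mapped image is the detail to watch:

#include <pxcsensemanager.h>
#include <pxcprojection.h>

// Rough sketch: images returned by CreateColorImageMappedToDepth()
// are new objects on every call and must be Release()d each frame,
// or memory grows fast. (Which projection call actually leaks is
// my assumption, not something I've confirmed.)
void MapOneFrame(PXCCapture::Device* device,
                 PXCImage* depthImage, PXCImage* colorImage)
{
    PXCProjection* projection = device->CreateProjection();
    if (!projection) return;

    PXCImage* mapped =
        projection->CreateColorImageMappedToDepth(depthImage, colorImage);
    if (mapped) {
        // ... use the mapped image ...
        mapped->Release();   // skipping this leaks one image per frame
    }
    projection->Release();   // in a real app, create this once and reuse it
}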

Wezley_S_
New Contributor I

That's good to hear. I did a one-hour test of my application yesterday with optimal results, and then again today with less-than-optimal results. It has to be something with the SDK.

I really hope this doesn't affect the judging process, because a loss in application performance due to a memory leak in the SDK could hurt a lot of people's chances of scoring what they deserve.

Artem_V_Intel
Employee

Hi Wezley S.

Thank you for reporting the issue. It has been escalated to the engineering team and will be fixed in a future release of the Intel(R) RealSense(TM) SDK. I will inform you as soon as any update is available.

Best regards,

Artem
