Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer-vision-related on Intel® platforms.

OpenVINO inference engine slows down over time

eddie_patton
New Contributor I

We are observing the inference engine slowing down over time. For example, our measurements show this:

  • at start: inf time 68.351100 msec
  • 1 hr later: inf time 75.8 msec
  • 10 hrs later: inf time 191.1 msec

The inference engine is called at a rate of 10 Hz, so over 1 hr it produces 10 * 60 * 60 = 36,000 results. Granted, that's a heck of a lot of results after 10 hrs, but it's something a customer would do with our system.

We don't see a memory leak (observed using the task manager). 

Is there a known bug where this version of OpenVINO slows down over time? Our configuration:

  • openvino_2021.1.110
  • target CPU
  • Number of inferences = 1 
  • data precision: FP32
  • C++

Is there an inference engine reset we could try when we notice the processing time increasing? (It would have to be a fast reset, because the call rate could be more than 30 Hz.)

Thanks

1 Solution
Iffa_Intel
Moderator

Hi,


Since your model is confidential, it is difficult to pinpoint the root cause of your issue. In addition, OpenVINO 2021.1 is rather outdated compared to the latest release; we recommend migrating to a newer version of OpenVINO.


You may refer to the OpenVINO GitHub repo, which has release notes listing the fixes implemented in each release.


We searched our archive for related bug fixes but found hardly any. However, you may refer here for known issues.


Starting with OpenVINO 2022, the ov::shutdown() function is available to close/unload the inference engine's unused DLLs.


Note that, since you mentioned this is a production application tested continuously over a long duration, the issue could also be caused by CPU consumption from other processes/workloads running on the CPU over time.

 


Cordially,

Iffa



7 Replies
Iffa_Intel
Moderator

Hi,

before proceeding further, could you share:

  1. Which OpenVINO IE sample/demo did you use to test this issue? (Please share it if it's custom.)
  2. Which model is used (share the file if possible)
  3. What tool did you use to test the IE over time



Cordially,

Iffa


eddie_patton
New Contributor I

 

Hi Iffa

To answer your questions: 

  1. Which OpenVINO IE sample/demo did you use to test this issue? (please help to share if it's custom)
    1. Ans: This is production code; we didn't use a demo/sample to observe this issue.
  2. Which model is used (share the file if possible)
    1. Ans: Sorry, it's proprietary.
  3. What tool did you use to test the IE over time
    1. Ans: We used QueryPerformanceCounter() before and after the calls to ie_infer_request.hpp's StartAsync() and Wait().
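For anyone reproducing the measurement, a minimal, portable sketch of that timing wrapper could look like this. Assumptions: std::chrono::steady_clock stands in for the Windows-only QueryPerformanceCounter(), the callable stands in for the real StartAsync()/Wait() pair, and time_ms is a name I made up for illustration, not our production code:

```cpp
#include <chrono>
#include <utility>

// Hypothetical timing helper. The production code uses Windows'
// QueryPerformanceCounter(); std::chrono::steady_clock is a portable
// equivalent. The callable stands in for StartAsync() + Wait().
template <typename F>
double time_ms(F&& f) {
    const auto t0 = std::chrono::steady_clock::now();
    std::forward<F>(f)();                       // run the timed work
    const auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}
```

Logging each measured value at the 10 Hz call rate is enough to plot the latency drift over hours.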

I wish I could replicate this in an example to show you, but I'd likely not be able to create a similar model.

Could you point me to the openvino_2021.1.110 issue database? Perhaps there's a logged issue like this one. I couldn't find anything in the forum.

There doesn't appear to be a memory leak, but is there a way to refresh the inference engine without reloading and recompiling the model?

Iffa_Intel
Moderator

We will investigate this further and get back to you with a possible solution/recommendation.


Cordially,

Iffa


eddie_patton
New Contributor I

Thanks Iffa

We are in the process of upgrading to the latest OpenVINO. Thanks for pointing out ov::shutdown() and for searching the bug database for a similar issue.

Cheers
Eddie

eddie_patton
New Contributor I

I discovered a fix, but I don't understand why running one task from another causes the slowdown. What I observed is that memory usage (both commit and working-set memory, as observed using the Windows Resource Monitor) slowly increased over time. When memory grew to the point that page swaps started occurring (as evidenced by the disk activity), the inference engine started to slow down.

Our SW was designed like this:

Thread -> Concurrency::task_group (pre- and post-processing) -> Concurrency::task_group (inference engine)

For example, this is the syntax for running the pre-/post-processing and inference engine tasks:

 

	Concurrency::task_group m_processTask;

	m_processTask.run([this]()
	{
		m_hrProcess = ProcessInternal();
		if (FAILED(m_hrProcess))
		{
			// log the error
		}
	});

 

When I removed the second task_group::run() call, the memory leak stopped.
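For illustration, here is a minimal, portable sketch of the fixed structure: the whole pipeline runs as one task per frame, rather than the processing task spawning a second, nested task for inference. Assumptions: std::async stands in for Concurrency::task_group::run() (which is Windows PPL), and the three stage functions are made-up placeholders, not our production code:

```cpp
#include <future>

// Made-up placeholder stages; the real ones do image pre-processing,
// the OpenVINO inference call, and post-processing.
static int preprocess(int frame)  { return frame + 1; }
static int infer(int frame)       { return frame * 2; }
static int postprocess(int frame) { return frame - 1; }

// One task per frame runs the whole pipeline, instead of having the
// processing task hand off to a second, nested task group for inference.
static int process_frame(int frame) {
    auto done = std::async(std::launch::async, [frame] {
        return postprocess(infer(preprocess(frame)));
    });
    return done.get();  // wait for this frame's result
}
```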

Iffa_Intel
Moderator

Hi,


Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question. 



Cordially,

Iffa

