We are observing the inference engine slowing down over time. For example, our measurements show this:
- inf time at start: 68.35 msec
- 1 hr later: 75.8 msec
- 10 hrs later: 191.1 msec
The inference engine is called at 10 Hz, so over 1 hr it produces 10 * 60 * 60 = 36,000 results. Granted, that is a lot of results after 10 hrs, but it's something a customer would do with our system.
We don't see a memory leak (observed using the task manager).
Is there a known bug where this version of OpenVINO slows down over time?
- openvino_2021.1.110
- target CPU
- Number of inferences = 1
- data precision: FP32
- C++
Is there an inference engine reset we could try if we notice the processing time increasing? (It would have to be a fast reset because the call rate could be more than 30 Hz.)
Thanks
Hi,
Since your model is confidential, it is difficult to pinpoint the root cause of your issue. In addition, OpenVINO 2021.1 is rather outdated compared to the latest release, so we recommend migrating to a newer version of OpenVINO.
You may refer to the OpenVINO GitHub repo, which has logs of the fixes implemented in each release.
We tried to find any related bug fixes in our archive but hardly found any. However, you may refer here for known issues.
Starting from OpenVINO 2022, the ov::shutdown() function refreshes/closes all unused DLLs of the inference engine.
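As an illustration of how such a reset could be wired into a latency monitor, here is a minimal, hypothetical sketch. The LatencyWatchdog class, the threshold factor, and the callback are assumptions for illustration, not OpenVINO API; the reset callback is where an OpenVINO 2022+ application would tear down and rebuild its ov::Core (and call ov::shutdown() at process exit).

```cpp
#include <functional>
#include <utility>

// Hypothetical helper: triggers a caller-supplied reset when a latency
// sample drifts past a multiple of the baseline. The reset callback is
// where the inference-engine teardown/rebuild would go.
class LatencyWatchdog
{
public:
    LatencyWatchdog(double baseline_ms, double factor, std::function<void()> reset)
        : m_baseline_ms(baseline_ms), m_factor(factor), m_reset(std::move(reset)) {}

    // Feed one latency measurement; returns true if it triggered a reset.
    bool sample(double latency_ms)
    {
        if (latency_ms > m_baseline_ms * m_factor)
        {
            m_reset();
            return true;
        }
        return false;
    }

private:
    double m_baseline_ms;
    double m_factor;
    std::function<void()> m_reset;
};
```

A watchdog like this can run alongside the 10 Hz inference loop, since checking one sample is far cheaper than the inference call itself.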
Note that, since you mentioned this is a production application running continuously for a long duration, the issue could also be caused by CPU load from other processes/workloads on the CPU over time.
Cordially,
Iffa
Hi,
Before proceeding further, could you share:
- Which OpenVINO IE sample/demo did you use to test this issue? (please help to share if it's custom)
- Which model is used (share the file if possible)
- What tool did you use to test the IE over time
Cordially,
Iffa
Hi Iffa
To answer your questions:
- Which OpenVINO IE sample/demo did you use to test this issue? (please help to share if it's custom)
- Ans: This is production code. No demo/sample used to observe this issue.
- Which model is used (share the file if possible)
- Ans: Sorry, proprietary.
- What tool did you use to test the IE over time
- Ans: We used QueryPerformanceCounter() before and after the calls to ie_infer_request.hpp's StartAsync() and Wait().
I wish I could replicate this in an example to show you, but I'd likely not be able to create a similar model.
Could you point me to the openvino_2021.1.110 issue database? Perhaps there's a logged issue like this; I couldn't find anything in the forum.
There doesn't appear to be a memory leak, but is there a way to refresh the inference engine without reloading and recompiling the model?
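For reference, a portable equivalent of that measurement using std::chrono (the infer callable here is a stand-in for the real StartAsync()/Wait() pair, which is an assumption about how the timing was wrapped):

```cpp
#include <chrono>
#include <functional>

// Time one call's wall-clock duration in milliseconds.
// In the real code, infer() would wrap request.StartAsync() + request.Wait();
// steady_clock plays the same role as QueryPerformanceCounter() on Windows.
double time_call_ms(const std::function<void()>& infer)
{
    const auto t0 = std::chrono::steady_clock::now();
    infer();
    const auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}
```

steady_clock is monotonic, so the measurement is immune to wall-clock adjustments during a 10-hour run.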
We will further investigate this and get back to you with a possible solution/recommendation.
Cordially,
Iffa
Thanks Iffa
We are in the process of upgrading to the latest OpenVINO. Thanks for pointing out ov::shutdown() and for checking for a similar issue in the bug database.
Cheers
Eddie
I discovered a fix, but I don't understand why running one task from another causes the slowdown. What I observed is that memory usage (both commit and working-set memory, observed using the Windows Resource Monitor) slowly increased over time. When memory grew to the point that page swaps started occurring (as evidenced by disk activity), the inference engine started to slow down.
Our SW was designed like this:
Thread -> Concurrency::task_group (do pre and post processing) -> Concurrency::task_group (do Inference Engine)
This is the syntax for running the pre/post processing and the inference engine threads for example:
Concurrency::task_group m_processTask;
m_processTask.run([this]()
{
    m_hrProcess = ProcessInternal();
    if (FAILED(m_hrProcess))
    {
        // log the error here
    }
});
When I removed the second, nested task_group::run() call, the memory leak stopped.
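A portable sketch of the restructured flow, with the pipeline stage names and the std::async stand-in being assumptions (the original code used MSVC's Concurrency::task_group): instead of spawning a second task from inside the first, the inference step runs inline in the per-frame task, so only one task object exists per call.

```cpp
#include <future>

// Stand-ins for the real pipeline stages (hypothetical names/logic).
int pre_process(int frame)  { return frame + 1; }
int run_inference(int pre)  { return pre * 2; }   // formerly the nested task
int post_process(int out)   { return out - 1; }

// One task per frame; inference runs inline instead of in a nested
// task_group, so no second task accumulates per call.
int process_frame(int frame)
{
    const int pre = pre_process(frame);
    const int out = run_inference(pre);
    return post_process(out);
}

int process_frame_async(int frame)
{
    auto fut = std::async(std::launch::async, process_frame, frame);
    return fut.get();
}
```

The key change is structural, not the threading library: collapsing the nested task into the outer one removes the per-frame allocation that appeared to accumulate.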
Hi,
Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question.
Cordially,
Iffa
