Hi, I used valgrind to check my program for memory leaks. Besides the already-reported "definitely lost" leak, I got another probable leak in InferenceEngine::Core::LoadNetwork. I looked for a way to free this memory through the provided API, but I did not find one. I am using OpenVINO 2021.4.1.
Valgrind report:
160 bytes in 1 blocks are still reachable in loss record 20 of 29
at 0x4C3217F: operator new(unsigned long) (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
by 0x69D49EF: InferenceEngine::ExecutorManager::getIdleCPUStreamsExecutor(InferenceEngine::IStreamsExecutor::Config const&) (in libinference_engine.so)
by 0x7D21173: ???
by 0x699E420: InferenceEngine::IInferencePlugin::LoadNetwork(InferenceEngine::CNNNetwork const&, std::map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::less<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > > const&, std::shared_ptr<InferenceEngine::RemoteContext> const&) (in libinference_engine.so)
by 0x6996354: InferenceEngine::IInferencePlugin::LoadNetwork(InferenceEngine::CNNNetwork const&, std::map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::less<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > > const&) (in libinference_engine.so)
by 0x69FD723: ??? (in libinference_engine.so)
by 0x6A01142: ??? (in libinference_engine.so)
by 0x6A2B2A7: InferenceEngine::Core::LoadNetwork(InferenceEngine::CNNNetwork const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::less<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > > const&) (in libinference_engine.so)
Hi Matpe333,
Thanks for reaching out to us.
We sincerely apologize for any inconvenience you may have experienced.
For your information, we are aware of this issue and our developers are working to fix it in a future release.
Your continued support is appreciated.
Regards,
Wan
Hi Wan,
Thanks for your reply. Glad to hear that you are already working on it.
Regards,
Matpe333
Hi Matpe333,
Thanks for your question!
If you need any additional information from Intel, please submit a new question as this thread is no longer being monitored.
Regards,
Wan
Hi Matpe333,
Could you please try the latest OpenVINO release, 2021.4.2, which should include the fix for this memory leak?
Regards,
Wan
Hi Matpe333,
Thanks for your question.
This thread will no longer be monitored since we have provided a solution.
If you need any additional information from Intel, please submit a new question.
Regards,
Wan