Hi, I used valgrind to check my program for memory leaks. Besides the already reported "definitely lost" leak, I got another probable leak in InferenceEngine::Core::LoadNetwork. I looked for a way to free this memory through the provided API but could not find one. I am using OpenVINO 2021.4.1.
Valgrind report:
160 bytes in 1 blocks are still reachable in loss record 20 of 29
at 0x4C3217F: operator new(unsigned long) (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
by 0x69D49EF: InferenceEngine::ExecutorManager::getIdleCPUStreamsExecutor(InferenceEngine::IStreamsExecutor::Config const&) (in libinference_engine.so)
by 0x7D21173: ???
by 0x699E420: InferenceEngine::IInferencePlugin::LoadNetwork(InferenceEngine::CNNNetwork const&, std::map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::less<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > > const&, std::shared_ptr<InferenceEngine::RemoteContext> const&) (in libinference_engine.so)
by 0x6996354: InferenceEngine::IInferencePlugin::LoadNetwork(InferenceEngine::CNNNetwork const&, std::map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::less<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > > const&) (in libinference_engine.so)
by 0x69FD723: ??? (in libinference_engine.so)
by 0x6A01142: ??? (in libinference_engine.so)
by 0x6A2B2A7: InferenceEngine::Core::LoadNetwork(InferenceEngine::CNNNetwork const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::less<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > > const&) (in libinference_engine.so)
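For context, the allocation shows up during the usual Inference Engine load sequence. A minimal sketch of such a call path (the model path and device name below are placeholders, not my actual values) would be:

#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    // Read the IR model (path is a placeholder)
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");
    // LoadNetwork is where the allocation reported by valgrind
    // (ExecutorManager::getIdleCPUStreamsExecutor) takes place
    InferenceEngine::ExecutableNetwork executable = core.LoadNetwork(network, "CPU");
    InferenceEngine::InferRequest request = executable.CreateInferRequest();
    request.Infer();
    return 0;
}   // the executor created during LoadNetwork is still reachable at exit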
Hi Matpe333,
Thanks for reaching out to us.
We sincerely apologize for any inconvenience you may have experienced.
For your information, we are aware of this issue and our developers are working to fix it in a future release.
Your continued support is appreciated.
Regards,
Wan
Hi Wan,
Thanks for your reply. Glad to hear that you are already working on it.
Regards,
Matpe333
Hi Matpe333,
Thanks for your question!
If you need any additional information from Intel, please submit a new question as this thread is no longer being monitored.
Regards,
Wan
Hi Matpe333,
Could you please try the latest OpenVINO release, 2021.4.2, which should include the fix for this memory leak?
Regards,
Wan
Hi Matpe333,
Thanks for your question.
This thread will no longer be monitored since we have provided a solution.
If you need any additional information from Intel, please submit a new question.
Regards,
Wan