Hi,
I'm trying to load my pre-trained deep learning model into the plugin, with CPU as the target device.
When I try to load the network into the plugin using: auto executable_network = plugin.LoadNetwork(network, {}); a memory access exception is thrown.
See attached.
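For context, the relevant part of my code follows the usual 2019 R1 load sequence; here is a trimmed-down sketch (model.xml / model.bin stand in for my actual IR files):

```cpp
#include <inference_engine.hpp>
#include <iostream>
#include <map>
#include <string>

using namespace InferenceEngine;

int main() {
    try {
        // Read the IR produced by the Model Optimizer
        // (file names are placeholders for the real model).
        CNNNetReader netReader;
        netReader.ReadNetwork("model.xml");
        netReader.ReadWeights("model.bin");
        CNNNetwork network = netReader.getNetwork();

        // Locate the CPU plugin and load the network into it.
        InferencePlugin plugin = PluginDispatcher().getPluginByDevice("CPU");
        ExecutableNetwork executable_network = plugin.LoadNetwork(network, {});

        std::cout << "Network loaded on CPU" << std::endl;
    } catch (const std::exception& e) {
        // Inference Engine reports most load failures as C++ exceptions;
        // printing the message can narrow down the cause.
        std::cerr << "Exception: " << e.what() << std::endl;
        return 1;
    }
    return 0;
}
```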
Do you have any idea how to fix this?
Thank you.
I ran into almost the same issue.
I built a C++ library based on the "object_detection_demo_ssd_async" sample. The program works with the NCS2. However, it shows a memory access error when loading the network into the CPU plugin.
Is there anything specific to the CPU plugin that I might have missed in the C++ library?
The environment is: OpenVINO 2019 R1, Windows 10, C++ library, C# application, SSD network.
I was able to run on CPU with the sample program "object_detection_demo_ssd_async", but not with the library that I built.
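One thing I am not sure I carried over from the sample is the CPU-extension registration. In the 2019 R1 demos the CPU path looks roughly like this (a sketch from memory; the header and class names may differ slightly in your install):

```cpp
#include <inference_engine.hpp>
#include <ext_list.hpp>  // CPU extension layers shipped with the 2019 R1 samples
#include <memory>

using namespace InferenceEngine;

ExecutableNetwork load_on_cpu(CNNNetwork network) {
    InferencePlugin plugin = PluginDispatcher().getPluginByDevice("CPU");

    // The demos register the bundled CPU extension layers before LoadNetwork.
    // MYRIAD (NCS2) does not use these extensions, so omitting this step
    // only affects the CPU path.
    plugin.AddExtension(std::make_shared<Extensions::Cpu::CpuExtensions>());

    return plugin.LoadNetwork(network, {});
}
```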
Thanks,
Dear Delfrate, Jacques and Lee, Terry,
The best way to diagnose these types of issues is to build a debug version of the Inference Engine (IE) from the open-source GitHub repository.
Exceptions of the type you describe ("memory access violation") can have any number of causes, so it is impossible to debug them on a forum. However, if you build a DEBUG version of the Inference Engine by following the README, you will be able to figure out why this error is occurring as you step through your code.
Make sure to re-generate your IR using the Model Optimizer (MO) from the same GitHub sources first.
Thanks for using OpenVINO.
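A rough outline of both steps, assuming the open-source tree that hosted the Inference Engine at the time (opencv/dldt) and a standard CMake setup; adjust the paths, the generator, and the Model Optimizer flags to your own model and platform:

```sh
# 1) Build a Debug Inference Engine from source
git clone https://github.com/opencv/dldt.git      # open-source IE location for 2019 R1
cd dldt/inference-engine
git submodule update --init --recursive
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Debug ..                 # Linux/macOS single-config generators
cmake --build . -- -j8
# On Windows with a Visual Studio generator, pick the configuration at build time instead:
#   cmake -G "Visual Studio 15 2017 Win64" ..
#   cmake --build . --config Debug

# 2) Re-generate the IR with the Model Optimizer from the same tree
cd ../../model-optimizer
python3 mo.py --input_model <your_model> --output_dir <ir_output_dir>
```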
Shubha