Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

query_network() and supported layers

sushruthnagesh
Beginner
239 Views

I was trying to run inference on a customised MobileNet v1 model. I observed that the layers returned by ie.query_network(net, device) and net.layers.keys() are different. I know this means that some layers are not supported for inference on that particular device.

My question is: despite this, net.infer() doesn't throw an error and runs successfully. I wanted to know, if a layer is not supported on a particular device, will that layer be run on the CPU? Or how does the whole process work in this case?
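For reference, a minimal sketch of the comparison described above. The layer names and device string here are hypothetical stand-ins; in a real script, `all_layers` would come from `net.layers.keys()` and `supported` from `ie.query_network(net, device)`:

```python
# Hypothetical stand-ins for the real OpenVINO calls:
#   ie = IECore()
#   net = ie.read_network(model="model.xml", weights="model.bin")
#   supported = ie.query_network(network=net, device_name="MYRIAD")
#   all_layers = set(net.layers.keys())
all_layers = {"conv1", "relu1", "custom_op", "fc1"}                  # from net.layers.keys()
supported = {"conv1": "MYRIAD", "relu1": "MYRIAD", "fc1": "MYRIAD"}  # from query_network()

# Layers present in the network but absent from the query_network()
# result are the ones the target device cannot execute.
unsupported = sorted(all_layers - supported.keys())
print(unsupported)
```

Printing this difference before calling infer() is a quick way to see exactly which layers the device plugin declined to support.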

0 Kudos
1 Solution
Iffa_Intel
Moderator
219 Views

Greetings,


If you are running the sample application on hardware other than the CPU, additional hardware configuration steps are required.


If your model does run on the CPU despite the compatibility gap, it might be a lucky shot. However, you might face other problems when running your application afterwards.


You can refer to the Model Optimizer section on converting your model here:

https://software.intel.com/content/www/us/en/develop/tools/openvino-toolkit/get-started.html



Sincerely,

Iffa



1 Reply