Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

query_network() and supported layers

sushruthnagesh
Beginner

I was trying to run inference on a customised MobileNet v1 model. I observed that the layers returned by ie.query_network(net, device) and net.layers.keys() are different. I know this means that some layers are not supported for inference on that particular device.

My question is: despite this, net.infer() doesn't throw an error and runs successfully. If a layer is not supported on a particular device, will that layer be run on the CPU instead? Or how does the whole process happen in this case?
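For reference, the comparison described above can be sketched as a simple set difference. The layer names below are made up for illustration; at runtime they would come from net.layers.keys() and from ie.query_network(net, device), which (in the legacy Inference Engine Python API) returns a mapping of supported layer names to devices:

```python
# Hypothetical stand-ins for net.layers.keys() and ie.query_network(net, "MYRIAD").
# Real names depend on the model; these are illustrative only.
all_layers = {"conv1", "relu1", "depthwise_conv2", "custom_postproc"}
supported = {"conv1": "MYRIAD", "relu1": "MYRIAD", "depthwise_conv2": "MYRIAD"}

# Any layer present in the network but absent from the query result is
# not supported on that device.
unsupported = all_layers - set(supported)
if unsupported:
    print("Layers not supported on this device:", sorted(unsupported))
```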

1 Solution
Iffa_Intel
Moderator

Greetings,


If you are running the sample application on hardware other than the CPU, additional hardware configuration steps are required.


If your model does run on the CPU despite the compatibility gaps, it might be a lucky shot; however, you might face other problems when running your application afterwards.


You can refer to the Model Optimizer section (to convert your model) here:

https://software.intel.com/content/www/us/en/develop/tools/openvino-toolkit/get-started.html
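As an aside, one way to handle layers that a target device does not support is heterogeneous execution: the HETERO plugin takes an ordered list of devices and falls back through them per layer. A minimal sketch (the device list here is an example; the actual load would be something like ie.load_network(network=net, device_name=device) with the legacy API):

```python
# Sketch of building a HETERO device string for per-layer fallback.
# "MYRIAD" here is an example accelerator; substitute your target device.
fallback = ["MYRIAD", "CPU"]  # try the accelerator first, fall back to CPU
device = "HETERO:" + ",".join(fallback)
print(device)  # HETERO:MYRIAD,CPU
```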



Sincerely,

Iffa

