Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

OpenVINO inference

Farhâd
Novice

Trying to run inference with an OpenVINO model converted from YOLOv5 and getting this error:

 

net = ie.read_network(model=model)
File "ie_api.pyx", line 367, in openvino.inference_engine.ie_api.IECore.read_network
File "ie_api.pyx", line 410, in openvino.inference_engine.ie_api.IECore.read_network
RuntimeError: Cannot create Interpolate layer /model.11/Resize id:187 from unsupported opset: opset11
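For context, this is roughly the code I am running (simplified; the model path is a placeholder):

from openvino.inference_engine import IECore

ie = IECore()
model = "best_IR.xml"                        # IR produced by Model Optimizer (placeholder path)
net = ie.read_network(model=model)           # fails here with the opset11 error
exec_net = ie.load_network(network=net, device_name="CPU")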

Farhâd
Novice

What would you do to convert ONNX to IR with the Model Optimizer command if the ONNX model has layers like Resize and Interpolate in it? I think I am getting the above error at inference because of these unsupported layers in the ONNX model that I converted to IR.

Farhâd
Novice

(Screenshot attached: best_6-21-23_opset 10.xml)

Farhâd
Novice

I used this command to convert ONNX to IR (YOLOv5):

mo --input_model best_.onnx --model_name best_IR -s 255 --input_shape=[1,3,640,640] --input=images --reverse_input_channels --output /model.24/m.0/Conv,/model.24/m.1/Conv,/model.24/m.2/Conv
Aznie_Intel
Moderator

Hi Farhad,

 

Thanks for reaching out.

 

Resize is a supported layer for the ONNX framework, but Interpolate is not. You may refer to the list of ONNX Supported Operators.

 

For the unsupported layer, you may use the OpenVINO Extensibility Mechanism to implement a custom operation.
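As a rough sketch (the library name below is only a placeholder for your own build), a compiled CPU extension can be registered with the Inference Engine API before reading the network:

from openvino.inference_engine import IECore

ie = IECore()
# Register the custom CPU extension library built with the Extensibility Mechanism
ie.add_extension(extension_path="libuser_cpu_extension.so", device_name="CPU")
net = ie.read_network(model="best_IR.xml")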

 

Hope this helps.

 

 

Regards,

Aznie


Farhâd
Novice

I did run the same ONNX to IR conversion command line a few months ago and it worked.

The error says: "RuntimeError: Cannot create Interpolate layer /model.11/Resize id:187 from unsupported opset: opset11". Is there a version problem? If so, please give me a conversion command line with a version that works.

Aznie_Intel
Moderator

Hi Farhad,

 

Which OpenVINO version were you using before? Are you converting with the same OpenVINO version, or are you using a different version now?

 

On another note, you may share your model so I can validate it from my end.

 

 

Regards,

Aznie


Farhâd
Novice

I am using the latest MO as well as the latest openvino-dev. I am running this job on Colab, which always installs the latest versions. The new version of OpenVINO must be incompatible with the older version of MO.

! pip install openvino-dev
! git clone --depth 1 https://github.com/openvinotoolkit/openvino.git
 
! mo --input_model best.onnx --model_name best -s 255 --input_shape=[1,3,640,640] --input=images --reverse_input_channels --output /model.24/m.0/Conv,/model.24/m.1/Conv,/model.24/m.2/Conv
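A quick sanity check I could run to confirm which packages are actually picked up (assuming both import cleanly):

import openvino.runtime as ov        # runtime that reads the IR
from openvino.tools import mo        # Model Optimizer shipped with openvino-dev

print("openvino runtime:", ov.get_version())
print("model optimizer from:", mo.__file__)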
 
 
Aznie_Intel
Moderator

Hi Farhad,

 

Starting from the 2020.4 release, OpenVINO™ supports reading native ONNX models. You may use your ONNX model directly, without converting it, as long as all of its operations are supported. Can you try running your ONNX model with benchmark_app?
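For example, something along these lines should load the ONNX file directly (a minimal sketch; paths are placeholders):

from openvino.runtime import Core

core = Core()
model = core.read_model("best.onnx")          # ONNX is read natively, no IR conversion needed
compiled = core.compile_model(model, "CPU")   # compile for CPU inference
print(compiled.inputs)                        # confirms the model loaded

Equivalently, benchmark_app -m best.onnx -d CPU should report performance numbers if the model loads successfully.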

 

For us to further validate this, is it possible for you to share your model?

 

 

Regards,

Aznie


Farhâd1
Novice

My question is about OpenVINO IR, not ONNX.

Aznie_Intel
Moderator

Hi Farhad,


Thank you for your question. If you need any additional information from Intel, please submit a new question, as this thread is no longer being monitored.



Regards,

Aznie

