Trying to run inference with an OpenVINO model converted from YOLOv5, I get this error:
net = ie.read_network(model=model)
File "ie_api.pyx", line 367, in openvino.inference_engine.ie_api.IECore.read_network
File "ie_api.pyx", line 410, in openvino.inference_engine.ie_api.IECore.read_network
RuntimeError: Cannot create Interpolate layer /model.11/Resize id:187 from unsupported opset: opset11
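Since the error complains about an unsupported opset, one way to see what the converter actually emitted is to inspect the IR .xml, where each layer records its opset in a version attribute. A minimal sketch, assuming the IR file is named model.xml (a placeholder; use your own path):

```python
# Sketch: list the opsets declared by layers in an OpenVINO IR .xml,
# since the runtime error complains about "unsupported opset: opset11".
# "model.xml" is a placeholder for the IR produced by Model Optimizer.
import xml.etree.ElementTree as ET
from pathlib import Path

def ir_opsets(xml_path: str) -> set:
    """Return the set of opset versions used by layers in an IR file."""
    root = ET.parse(xml_path).getroot()
    return {layer.get("version", "?") for layer in root.iter("layer")}

if Path("model.xml").exists():
    print(sorted(ir_opsets("model.xml")))
```

If the set includes an opset newer than the installed runtime understands (here, opset11), the IR was produced by a newer Model Optimizer than the Inference Engine reading it.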
What Model Optimizer command would you use to convert ONNX to IR if the ONNX model contains layers like Resize and Interpolate? I think the inference error above is caused by these unsupported layers in the ONNX model I converted to IR.
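For reference, a typical Model Optimizer invocation for an ONNX model looks like the sketch below. The file name yolov5s.onnx is a placeholder for your exported model; the mo tool comes with the openvino-dev package. The snippet only prints the command, and runs it when mo and the model file are actually present:

```python
# Sketch of a Model Optimizer (mo) conversion command for an ONNX model.
# "yolov5s.onnx" is a placeholder; replace it with your exported model.
import shutil
import subprocess
from pathlib import Path

model = Path("yolov5s.onnx")
cmd = ["mo", "--input_model", str(model), "--output_dir", "ir"]
print(" ".join(cmd))

# Only run the conversion if mo is installed and the model exists.
if shutil.which("mo") and model.exists():
    subprocess.run(cmd, check=True)
```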
I used this command for converting ONNX to IR (yolov5).
Hi Farhad,
Thanks for reaching out.
Resize is a supported layer for ONNX, but Interpolate is not supported for the ONNX framework. You may refer to the list of ONNX Supported Operators.
For the unsupported layer, you may use the OpenVINO Extensibility Mechanism to implement a custom operation.
Hope this helps.
Regards,
Aznie
I ran the same ONNX-to-IR conversion command a few months ago and it worked.
The error says: "RuntimeError: Cannot create Interpolate layer /model.11/Resize id:187 from unsupported opset: opset11". Is this a version problem? If so, please give me a conversion command line with a version that works.
Hi Farhad,
Which OpenVINO version were you using before? Are you converting with the same OpenVINO version, or are you using a different one now?
On another note, you may share your model so I can validate it on my end.
Regards,
Aznie
I am using the latest Model Optimizer as well as the latest openvino-dev. I am running this job on Colab, which always installs the latest versions. The new version of OpenVINO must be incompatible with the older version of the Model Optimizer.
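One way to check for that kind of mismatch is to print the installed package versions side by side. A small sketch using only the standard library:

```python
# Print the installed versions of the OpenVINO packages, to spot a
# mismatch between the runtime (openvino) and the tooling (openvino-dev).
from importlib import metadata

def pkg_version(name: str) -> str:
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        return "not installed"

for pkg in ("openvino", "openvino-dev"):
    print(f"{pkg}: {pkg_version(pkg)}")
```

If the two versions differ, pinning both packages to the same release (e.g. with pip install openvino==X openvino-dev==X for a matching X) is the usual fix.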
Hi Farhad,
Starting from the 2020.4 release, OpenVINO™ supports reading native ONNX models. You may use your ONNX model without converting it, as long as all of its layers are supported. Can you try running your ONNX model with benchmark_app?
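A sketch of running benchmark_app against the ONNX model directly, with no IR conversion step. Here yolov5s.onnx is a placeholder path, and benchmark_app ships with the openvino-dev package; the snippet prints the command and only executes it when the tool and the model are available:

```python
# Sketch: run benchmark_app on an ONNX model directly (no IR conversion).
# "yolov5s.onnx" is a placeholder; benchmark_app comes with openvino-dev.
import shutil
import subprocess
from pathlib import Path

model = Path("yolov5s.onnx")
cmd = ["benchmark_app", "-m", str(model), "-d", "CPU"]
print(" ".join(cmd))

# Only run the benchmark if the tool is installed and the model exists.
if shutil.which("benchmark_app") and model.exists():
    subprocess.run(cmd, check=True)
```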
For us to further validate this, is it possible for you to share your model?
Regards,
Aznie
Hi Farhad,
Thank you for your question. If you need any additional information from Intel, please submit a new question as this thread is no longer being monitored.
Regards,
Aznie
