Beginner

OpenVINO ONNX Model with Opset-11


Hi, 

Currently, I am working on a custom SSD (Single Shot Detection) network in PyTorch. SSD requires Non-Maximum Suppression (NMS) on its output layers, and I am using the torch.onnx.export method to export my model.
I exported the model with ONNX opset 11, since NMS is only supported from opset 10 onwards.

I have successfully optimized my model with the OpenVINO Model Optimizer (mo_onnx.py).
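The conversion step looked roughly like this (the install path is the default 2020.x location; the model filename is a placeholder for my actual file):

```shell
# Convert the exported ONNX model to OpenVINO IR (2020.x layout).
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo_onnx.py \
    --input_model ssd_custom.onnx \
    --output_dir ir_output
```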

However, I encountered an error when I tried to load the resulting IR with OpenVINO (screenshot attached):

Screenshot from 2020-07-06 22-37-34.png

When I visualize both models (ONNX opset 9 and opset 11), I can see some differences in the inputs of the Clip layer.

11.jpg 

9.jpg


Is there any solution or suggestion for this problem? Also, how can a model pass Model Optimizer conversion and then fail at load time?

I am using the latest OpenVINO release (2020 R3). I also tried opset 10, and it does not work either.


Moderator

Hi @ibrahimsoliman 
The MobileNet SSD PyTorch topology has not been officially validated for ONNX-IR conversion via Model Optimizer, so unfortunately we cannot guarantee that it will work with the OpenVINO toolkit. You can find the list of supported PyTorch topologies here: https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_O...

We apologize for the inconvenience.

Best regards, Max.

 

Beginner

OK, thanks.
Just to mention: I removed NMS from the SSD PyTorch model, and after that I could optimize and run inference with OpenVINO. Unfortunately, this means I have to apply NMS manually to the model outputs, outside of inference.
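In case it helps anyone doing the same, here is a minimal greedy NMS sketch in plain NumPy that I can apply to the raw outputs. This is a generic post-processing implementation, not the exact ONNX NonMaxSuppression semantics:

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-maximum suppression.
    boxes: (N, 4) array of [x1, y1, x2, y2]; scores: (N,) confidences.
    Returns indices of the kept boxes, highest score first."""
    order = scores.argsort()[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        if rest.size == 0:
            break
        # Intersection of the current box with each remaining box.
        xx1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        yy1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        xx2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        yy2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_rest = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_rest - inter)
        # Drop remaining boxes that overlap the kept box too much.
        order = rest[iou <= iou_threshold]
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [20, 20, 30, 30]], dtype=np.float32)
scores = np.array([0.9, 0.8, 0.7], dtype=np.float32)
print(nms(boxes, scores))  # → [0, 2]: the two overlapping boxes collapse to the higher-scoring one
```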

Moderator

Hi @ibrahimsoliman 

Thank you for sharing this information with the OpenVINO community!

Beginner

Hi, I have checked the release notes for OpenVINO 2020.4 and noticed that support for Clip-11 has been added.

Screenshot from 2020-07-18 09-39-22.png

 

I successfully ran the Model Optimizer on my ONNX model again, this time with NMS included.
But when I load the network, I get a different error:

RuntimeError: get_shape was called on a descriptor::Tensor with dynamic shape

Screenshot from 2020-07-18 09-48-16.png

Moderator

Hi @ibrahimsoliman 

The Inference Engine now supports the NMS layer in the CPU plugin via the Extensibility mechanism, and also through the Shape Inference feature.
Please give it a try.


Moderator

Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question.
