Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

[VPU] PyTorch to ONNX: how to fix "Unsupported layers ['input.1', '20/Output_0/Data__const']" error

Tahar-berrabah__Samy
852 Views

Hi team,

I created my own model with PyTorch and want to run it on my brand-new Neural Compute Stick 2.

I converted the *.pt model to ONNX and then converted the ONNX model to IR with the Model Optimizer (MO) tool.

Here is the command-line log:

-------------------------------------

Model Optimizer arguments:
Common parameters:
    - Path to the Input Model:     /Users/samo/Documents/1_formation/pytorch/tryout/Onnx_model/onnx_model_traffic_sign_classifier.onnx
    - Path for generated IR:     /Users/samo/Documents/1_formation/pytorch/tryout/.
    - IR output name:     onnx_model_traffic_sign_classifier
    - Log level:     ERROR
    - Batch:     Not specified, inherited from the model
    - Input layers:     Not specified, inherited from the model
    - Output layers:     Not specified, inherited from the model
    - Input shapes:     Not specified, inherited from the model
    - Mean values:     Not specified
    - Scale values:     Not specified
    - Scale factor:     Not specified
    - Precision of IR:     FP16
    - Enable fusing:     True
    - Enable grouped convolutions fusing:     True
    - Move mean values to preprocess section:     False
    - Reverse input channels:     False
ONNX specific parameters:
Model Optimizer version:     2019.3.0-408-gac8584cb7

-----------------------------
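For reference, an MO invocation that would produce the arguments in the log above looks roughly like this (a sketch reconstructed from the log; the script location and working directory are assumptions about a 2019 R3 install):

```shell
# Hypothetical reconstruction of the Model Optimizer call from the log above.
# mo.py typically sits under deployment_tools/model_optimizer (path assumed).
python3 mo.py \
    --input_model onnx_model_traffic_sign_classifier.onnx \
    --output_dir . \
    --data_type FP16 \
    --log_level ERROR
```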

I did not get any errors while converting the model; however, when I check the supported layers, I see that there are unsupported ones: Unsupported layers found: ['input.1', '20/Output_0/Data__const'].

 

Does someone have an idea on how to fix this?

Thank you very much,

Samy

-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

MKLDNNPlugin: 2.1.32974
supported layers: ('11', 'MYRIAD')
supported layers: ('12', 'MYRIAD')
supported layers: ('13', 'MYRIAD')
supported layers: ('14', 'MYRIAD')
supported layers: ('15', 'MYRIAD')
supported layers: ('16', 'MYRIAD')
supported layers: ('17', 'MYRIAD')
supported layers: ('18', 'MYRIAD')
supported layers: ('19', 'MYRIAD')
supported layers: ('21', 'MYRIAD')
supported layers: ('22', 'MYRIAD')
supported layers: ('23', 'MYRIAD')
supported layers: ('24', 'MYRIAD')
Unsupported layers found: ['input.1', '20/Output_0/Data__const']
Check whether extensions are available to add to IECore.
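The check that produced the log above is typically done by diffing the network's layers against what IECore.query_network() reports for the target device. Below is a minimal pure-Python sketch of that diff: the Inference Engine calls are shown only as comments (the 2019 R3-era API names are assumptions), and the layer names are copied from the log in this thread.

```python
# Sketch of a supported-layer check against a device.
# With the Inference Engine, the real data would come from (names assumed,
# 2019 R3-era Python API):
#   from openvino.inference_engine import IECore, IENetwork
#   ie = IECore()
#   net = IENetwork(model="model.xml", weights="model.bin")
#   supported = ie.query_network(net, device_name="MYRIAD")  # {layer: device}
#   all_layers = list(net.layers.keys())

def find_unsupported(all_layers, supported):
    """Return the layers that query_network did not assign to the device."""
    return [layer for layer in all_layers if layer not in supported]

# Values below reproduce the log in this thread:
numbered = [str(n) for n in list(range(11, 20)) + list(range(21, 25))]
all_layers = ['input.1'] + numbered + ['20/Output_0/Data__const']
supported = {name: 'MYRIAD' for name in numbered}

print(find_unsupported(all_layers, supported))
# ['input.1', '20/Output_0/Data__const']
```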

4 Replies
David_C_Intel
Employee

Hi Samy,

Thanks for reaching out.

We recommend updating the OpenVINO™ toolkit to the latest version; you can download it here. Also, since you seem to be using a classification model, try checking our samples and demos in the Resources tab and test your model with a classification sample.

Please let us know if the issue continues.

Best regards,

David

Tahar-berrabah__Samy

Hi David,

Thanks for your reply,

I have tried with the device set to CPU, and it works; the problem only appears with MYRIAD (so the VPU, if I am not mistaken). I will try to update, although I downloaded the toolkit recently. I also participated in the Intel Udacity scholarship but unfortunately didn't reach phase 2.
 

What should I do when a layer is not supported? I know you can create a custom one, but I am not sure how.

 

Best,

Samy

David_C_Intel
Employee

Hi Samy,

When a layer is supported on one device but not on another (for example, supported on CPU but not on VPU), you can use the Heterogeneous plugin. It lets you run inference across your model's layers by setting a primary device and a fallback device as backup. Additionally, you can check the supported layers here.
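A minimal sketch of what falling back from MYRIAD to CPU looks like. Only the device-string construction below actually runs; the Inference Engine calls are shown as comments, and the 2019 R3-era API names and model paths are assumptions, not a definitive recipe.

```python
# Sketch: MYRIAD with CPU fallback via the HETERO plugin.
# The actual load would be something like (names and paths assumed):
#   from openvino.inference_engine import IECore, IENetwork
#   ie = IECore()
#   net = IENetwork(model="model.xml", weights="model.bin")
#   exec_net = ie.load_network(net, device_name=hetero_device("MYRIAD", "CPU"))

def hetero_device(primary, *fallbacks):
    """Build a HETERO device string: primary device first, fallbacks after."""
    return "HETERO:" + ",".join((primary,) + fallbacks)

print(hetero_device("MYRIAD", "CPU"))  # HETERO:MYRIAD,CPU
```

With this device string, layers the VPU cannot handle are assigned to the CPU instead of failing the whole load.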

Regards,

David 

Tahar-berrabah__Samy

Thanks, David. I will look at the resources.

Best,

Samy
