Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer-vision-related on Intel® platforms.

Cannot infer shapes or values for node "Where_837" when converting from PyTorch to OpenVINO IR Model

Vishnu_T
Novice

PyTorch to OpenVINO Model Optimizer Intermediate Representation conversion.

I successfully converted the PyTorch model file (.pt) to an ONNX file (.onnx).
I am facing the following error while converting the ONNX file (.onnx) to the OpenVINO Model Optimizer Intermediate Representation (.xml, .bin).

Error :

[ ERROR ] Cannot infer shapes or values for node "Where_837".
[ ERROR ] There is no registered "infer" function for node "Where_837" with op = "Where". Please implement this function in the extensions.

Command : 

python mo_onnx.py --input_model yolov5s.onnx --input_shape [1,3,480,640] --data_type FP32
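For context, the PyTorch-to-ONNX step is not shown in the thread; a common way to produce yolov5s.onnx at the time was the export script bundled with the Ultralytics YOLOv5 repository (the script path and flags below are assumptions based on that repository, not taken from the poster's setup):

```shell
# Assumption: run from an Ultralytics YOLOv5 checkout; the export
# script's location and flags vary between releases.
python models/export.py --weights yolov5s.pt --img 640 --batch 1

# Then convert the resulting ONNX model to IR, as in the original post:
python mo_onnx.py --input_model yolov5s.onnx --input_shape [1,3,480,640] --data_type FP32
```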

 

8 Replies
IntelSupport
Moderator

Hi Vishnu_T,


The error you are seeing is due to the Where layer not being supported in OpenVINO. Please take a look at the ONNX supported layers list. Could you try replacing that layer with one that is on the supported list?


https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Supported_Frameworks_Layers.html#onnx_supported_operators


Regards,

Jesus



Vishnu_T
Novice

Thanks for your reply.

I couldn't find a supported layer similar to the Where layer for the ONNX framework.

For the MXNet framework the Where layer is available, and its equivalent Intermediate Representation is the Select layer.

The link below lists the framework layers and their equivalent IR representations.

https://docs.openvinotoolkit.org/2019_R3.1/_docs_MO_DG_prepare_model_Supported_Frameworks_Layers.html

In my ONNX file I replaced the Where layer with the Select layer, but I am getting a framework error after the modification.

Please suggest how to find a supported replacement for the Where layer.
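For reference, the ONNX Where operator and the OpenVINO IR Select layer share the same elementwise-select semantics: Where(condition, X, Y) takes values from X where the condition holds and from Y elsewhere. A minimal pure-Python sketch of that behavior (broadcasting, which real tensors support, is omitted here):

```python
def where(condition, x, y):
    """Elementwise select: pick x[i] where condition[i] is true, else y[i].

    Mirrors the semantics of the ONNX Where operator (and the equivalent
    OpenVINO IR Select layer) for flat, equal-length inputs.
    """
    return [xi if c else yi for c, xi, yi in zip(condition, x, y)]

# Example: keep positive scores, zero out the rest
scores = [0.9, -0.2, 0.4, -0.7]
mask = [s > 0 for s in scores]
print(where(mask, scores, [0.0] * len(scores)))  # [0.9, 0.0, 0.4, 0.0]
```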

IntelSupport
Moderator

Hi Vishnu_T,


The Model Optimizer is not able to read the model; the model is either corrupted or may not have been converted from PyTorch to ONNX successfully. Please check the model and try running the Model Optimizer with the --log_level=DEBUG flag, then provide the output.
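Applied to the command from the original post, the debug run would look like this (piping through tee is just one convenient way to capture the log to a file):

```shell
# Same conversion command as before, with debug logging captured to mo_debug.log
python mo_onnx.py --input_model yolov5s.onnx --input_shape [1,3,480,640] \
    --data_type FP32 --log_level=DEBUG 2>&1 | tee mo_debug.log
```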


Regards,

Jesus


Vishnu_T
Novice

Hi Jesus,

Sorry for the late reply.

Converting the model from PyTorch to ONNX is successful.

I enabled the debug log level as per your suggestion.

I have attached the log file for your perusal.

Thanks

Vishnu T

JesusE_Intel
Moderator

Hi Vishnu,


The error in the log is related to the Where layer. Could you try upgrading to the latest release 2020.4? The Where layer is now listed as supported for MXNet, TensorFlow and ONNX frameworks. If you continue to see an error converting, please attach the log files and your model.


https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_Supported_Frameworks_Layers.html#onnx_supported_operators


Regards,

Jesus


Vishnu_T
Novice

Hi Jesus,

   Thanks for your continuous support.

I will check with the latest version and let you know the status.

Thanks

Vishnu T

Vishnu_T
Novice

The new version of OpenVINO (2020.4) solves the issue.

Thanks for the quick response and for the updated version that solved this issue.

JesusE_Intel
Moderator

Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question.

