TarunM
Beginner

Error converting ONNX model to IR

I have attached the model.

The error I got is shown below:

C:\Program Files (x86)\IntelSWTools\openvino_2021.3.394\deployment_tools\model_optimizer>python mo_onnx.py --input_model "C:\Users\tarunmis\Downloads\workspace\exported-models\my_model\model.onnx" --output_dir "C:\Users\tarunmis\Downloads\workspace\exported-models\my_model" --input_shape (1,512,512,3)
Model Optimizer arguments:
Common parameters:
- Path to the Input Model: C:\Users\tarunmis\Downloads\workspace\exported-models\my_model\model.onnx
- Path for generated IR: C:\Users\tarunmis\Downloads\workspace\exported-models\my_model
- IR output name: model
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: (1,512,512,3)
- Mean values: Not specified
- Scale values: Not specified
- Scale factor: Not specified
- Precision of IR: FP32
- Enable fusing: True
- Enable grouped convolutions fusing: True
- Move mean values to preprocess section: None
- Reverse input channels: False
ONNX specific parameters:
- Inference Engine found in: C:\Program Files (x86)\IntelSWTools\openvino_2021.3.394\python\python3.7\openvino
Inference Engine version: 2.1.2021.3.0-2787-60059f2c755-releases/2021/3
Model Optimizer version: 2021.3.0-2787-60059f2c755-releases/2021/3
C:\Users\tarunmis\AppData\Roaming\Python\Python37\site-packages\urllib3\util\selectors.py:14: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3,and in 3.9 it will stop working
from collections import namedtuple, Mapping
C:\Users\tarunmis\AppData\Roaming\Python\Python37\site-packages\urllib3\_collections.py:2: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3,and in 3.9 it will stop working
from collections import Mapping, MutableMapping
[ ERROR ] Cannot infer shapes or values for node "map/while/Preprocessor/ResizeToRange/cond".
[ ERROR ] There is no registered "infer" function for node "map/while/Preprocessor/ResizeToRange/cond" with op = "If". Please implement this function in the extensions.
For more information please refer to Model Optimizer FAQ, question #37. (https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html?q...)
[ ERROR ]
[ ERROR ] It can happen due to bug in custom shape infer function <UNKNOWN>.
[ ERROR ] Or because the node inputs have incorrect values/shapes.
[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ] Cannot infer shapes or values for node "StatefulPartitionedCall/map/while_loop".
[ ERROR ] Stopped shape/value propagation at "map/while/Preprocessor/ResizeToRange/cond" node.
For more information please refer to Model Optimizer FAQ, question #38. (https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html?q...)
[ ERROR ]
[ ERROR ] It can happen due to bug in custom shape infer function <function Loop.infer at 0x0000017B03B18048>.
[ ERROR ] Or because the node inputs have incorrect values/shapes.
[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ] Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "StatefulPartitionedCall/map/while_loop" node.
For more information please refer to Model Optimizer FAQ, question #38. (https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html?q...)
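The log itself points to the next diagnostic step: re-running with --log_level=DEBUG shows exactly where shape propagation stops. A sketch of that re-run (Windows cmd syntax, using the same paths as in the log above; note the input shape is quoted so the shell does not interpret the brackets):

```shell
:: Re-run Model Optimizer with debug logging to see where
:: shape/value propagation fails (paths taken from the log above).
python mo_onnx.py ^
  --input_model "C:\Users\tarunmis\Downloads\workspace\exported-models\my_model\model.onnx" ^
  --output_dir "C:\Users\tarunmis\Downloads\workspace\exported-models\my_model" ^
  --input_shape "[1,512,512,3]" ^
  --log_level DEBUG
```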


IntelSupport
Community Manager

Hi TarunM,

Thanks for reaching out. I have tested your model, and judging from the error, Model Optimizer cannot infer the shapes or values of certain nodes in your model. This can happen because of a bug in a custom shape-inference function, or because the node inputs have incorrect values or shapes.

By the way, may I know which specific ONNX model you are using? You may want to refer to the parameter information in Converting an ONNX* Model for your specific model type.

Regards,

Aznie
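As an aside, the failing nodes in the log are control-flow operators (an "If" inside a while loop), which this Model Optimizer version cannot shape-infer. A quick way to see which operator types an exported ONNX graph contains is to tally node.op_type over the graph's nodes. A minimal sketch follows; the stub Node list stands in for a real graph, and with the onnx package installed you would pass onnx.load(path).graph.node instead:

```python
from collections import Counter, namedtuple

def count_op_types(nodes):
    """Tally operator types in an ONNX graph's node list.

    Each node only needs an .op_type attribute, so with the real onnx
    package you could pass onnx.load(path).graph.node directly.
    """
    return Counter(node.op_type for node in nodes)

# Stand-in nodes for illustration; a real node list comes from onnx.load().
Node = namedtuple("Node", "op_type")
graph_nodes = [Node("Conv"), Node("If"), Node("Loop"), Node("Conv")]

counts = count_op_types(graph_nodes)
# Control-flow ops such as "If" and "Loop" are the ones this
# Model Optimizer version fails to shape-infer.
unsupported = {op: n for op, n in counts.items() if op in ("If", "Loop")}
print(unsupported)  # {'If': 1, 'Loop': 1}
```

If the tally shows "If" or "Loop" nodes, they usually come from the preprocessing subgraph of a TensorFlow Object Detection export, which is a common source of this exact error.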


IntelSupport
Community Manager

Hi TarunM,

Thank you for your question. If you need any additional information from Intel, please submit a new question, as this thread is no longer being monitored.


Regards,

Aznie

