Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

Help in converting this model to IR.


I tried converting the model using the command below:

>python mo.py --input_model C:\Users\tarunmis\Downloads\tampering\mxnet_exported.onnx --output_dir C:\Users\tarunmis\Downloads\tampering --input "data" --input_shape "(1,3,112,112)" --output "fc1"

I am getting the following error:

Model Optimizer arguments:
Common parameters:
- Path to the Input Model: C:\Users\tarunmis\Downloads\tampering\mxnet_exported.onnx
- Path for generated IR: C:\Users\tarunmis\Downloads\tampering
- IR output name: mxnet_exported
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: data
- Output layers: fc1
- Input shapes: (1,3,112,112)
- Mean values: Not specified
- Scale values: Not specified
- Scale factor: Not specified
- Precision of IR: FP32
- Enable fusing: True
- Enable grouped convolutions fusing: True
- Move mean values to preprocess section: None
- Reverse input channels: False
ONNX specific parameters:
- Inference Engine found in: C:\Program Files (x86)\Intel\openvino_2021\python\python3.7\openvino
Inference Engine version: 2021.4.0-3839-cd81789d294-releases/2021/4
Model Optimizer version: 2021.4.0-3839-cd81789d294-releases/2021/4
[ ERROR ] Cannot infer shapes or values for node "pre_fc1/WithoutBiases".
[ ERROR ] MatMul input shapes are incorrect. COL_INDEX_DIMs are not equal. Node: pre_fc1/WithoutBiases. Shapes: [array([ 1, 512, 1, 1], dtype=int64), array([ 1, 512, 512, 128], dtype=int64)]
[ ERROR ] It can happen due to bug in custom shape infer function <function MatMul.infer at 0x000002394B16C708>.
[ ERROR ] Or because the node inputs have incorrect values/shapes.
[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ] Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "pre_fc1/WithoutBiases" node.
For more information please refer to Model Optimizer FAQ, question #38.
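The MatMul complaint in the log can be reproduced outside Model Optimizer. A minimal NumPy sketch, using the exact shapes from the error message (the "fix" shown is only an assumption about what the fully-connected layer intends), shows why [1, 512, 1, 1] x [1, 512, 512, 128] cannot multiply:

```python
import numpy as np

# Stand-in arrays with the shapes reported in the error message;
# the real tensors live in the attached mxnet_exported.onnx.
act = np.zeros((1, 512, 1, 1), dtype=np.float32)        # activation
weight = np.zeros((1, 512, 512, 128), dtype=np.float32)  # FC weight

# MatMul contracts the last axis of `act` (size 1) with the
# second-to-last axis of `weight` (size 512); 1 != 512, so the
# multiply is ill-defined -- exactly what MO reports.
try:
    np.matmul(act, weight)
    mismatch_raised = False
except ValueError:
    mismatch_raised = True
print(mismatch_raised)  # True

# What a fully-connected layer presumably intends: flatten the
# 1x512x1x1 activation to 1x512 and multiply by a 512x128 weight.
w2 = np.zeros((512, 128), dtype=np.float32)
out = np.matmul(act.reshape(1, 512), w2)
print(out.shape)  # (1, 128)
```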


Please see the attached model.

Community Manager

Hi TarunM,

May I know the type of your ONNX model and the source it came from? According to the error, Model Optimizer cannot infer shapes or values for the specified node. This can happen because of a bug in the custom shape infer function, because the node inputs have incorrect values/shapes, or because the input shapes themselves are incorrect (embedded in the model or passed via --input_shape).


The MatMul operation might be incorrect. You can refer to the Using Shape Inference article for how to set the input shape. Meanwhile, I tested your model with OpenVINO and encountered an "Argument element types are inconsistent" error message. This might be related to an incorrect input shape in your model.




Community Manager

Hi TarunM,

Thank you for your question. If you need any additional information from Intel, please submit a new question as this thread is no longer being monitored.