Beginner

Error when converting an ONNX model

I am trying to convert an ONNX model to OpenVINO using the following command:

C:\Program Files (x86)\IntelSWTools\openvino_2020.3.194\deployment_tools\model_optimizer>python mo.py --input_model C:\Users\a\Desktop\test1\model.onnx
C:\Program Files (x86)\IntelSWTools\openvino_2020.3.194\deployment_tools\model_optimizer\mo\main.py:87: SyntaxWarning: "is" with a literal. Did you mean "=="?
  if op is 'k':
Model Optimizer arguments:
Common parameters:
        - Path to the Input Model:      C:\Users\a\Desktop\test1\model.onnx
        - Path for generated IR:        C:\Program Files (x86)\IntelSWTools\openvino_2020.3.194\deployment_tools\model_optimizer\.
        - IR output name:       model
        - Log level:    ERROR
        - Batch:        Not specified, inherited from the model
        - Input layers:         Not specified, inherited from the model
        - Output layers:        Not specified, inherited from the model
        - Input shapes:         Not specified, inherited from the model
        - Mean values:  Not specified
        - Scale values:         Not specified
        - Scale factor:         Not specified
        - Precision of IR:      FP32
        - Enable fusing:        True
        - Enable grouped convolutions fusing:   True
        - Move mean values to preprocess section:       False
        - Reverse input channels:       False
ONNX specific parameters:
Model Optimizer version:
C:\Program Files (x86)\IntelSWTools\openvino_2020.3.194\deployment_tools\model_optimizer\mo\ops\reshape.py:71: RuntimeWarning: divide by zero encountered in longlong_scalars
  undefined_dim = num_of_input_elements // num_of_output_elements
[ ERROR ]  Cannot infer shapes or values for node "innerProduct/WithoutBiases".
[ ERROR ]  MatMul input shapes are incorrect. COL_INDEX_DIMs are not equal. Node: innerProduct/WithoutBiases. Shapes: [array([0, 0], dtype=int64), array([1024,    3], dtype=int64)]
[ ERROR ]
[ ERROR ]  It can happen due to bug in custom shape infer function <function MatMul.infer at 0x0000022DB8A48CA0>.
[ ERROR ]  Or because the node inputs have incorrect values/shapes.
[ ERROR ]  Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ]  Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ]  Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "innerProduct/WithoutBiases" node.
 For more information please refer to Model Optimizer FAQ (https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html), question #38.

What could cause the error?

Thanks

Moderator

Greetings,

It seems that there is a syntax error in your code:

C:\Program Files (x86)\IntelSWTools\openvino_2020.3.194\deployment_tools\model_optimizer\mo\main.py:87: SyntaxWarning: "is" with a literal. Did you mean "=="?

Please check your code and ensure you are using the correct comparison operator (== rather than is).

 

Regarding the "[ ERROR ] MatMul input shapes are incorrect" message:
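For context, this error comes from shape inference on a MatMul node: the inner dimensions of the two inputs must match. Below is a minimal sketch of that kind of check; matmul_output_shape is a hypothetical simplification for illustration, not Model Optimizer's actual code.

```python
def matmul_output_shape(a_shape, b_shape):
    """Return the MatMul output shape, or raise if the inner dims differ."""
    if a_shape[-1] != b_shape[-2]:
        raise ValueError(
            f"MatMul input shapes are incorrect: {a_shape} x {b_shape}"
        )
    return a_shape[:-1] + b_shape[-1:]

# The shapes from your log: [0, 0] means the first input's shape was never
# resolved, so the inner dimensions (0 vs. 1024) cannot match.
try:
    matmul_output_shape([0, 0], [1024, 3])
except ValueError as e:
    print(e)
```

A fully-resolved pair such as [1, 1024] x [1024, 3] would pass this check, which is why pinning down the undefined input shape is the first thing to try.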

If your model contains more than one input, the Model Optimizer can convert the model with the inputs specified explicitly, for example:

input: "data"
input_shape {
    dim: 1
    dim: 3
    dim: 227
    dim: 227
}

You can see three ways of specifying the data input in the Model Optimizer FAQ: https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html

Please ensure that you use the exact input shape.
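For instance, if the shape is merely undefined in the exported model (rather than wrong), you could try passing it on the command line. This is a sketch only: the input name "data" and the NCHW shape [1,3,227,227] are assumptions here, so substitute your model's actual input name and shape.

```shell
cd "C:\Program Files (x86)\IntelSWTools\openvino_2020.3.194\deployment_tools\model_optimizer"
python mo.py --input_model C:\Users\a\Desktop\test1\model.onnx ^
    --input data --input_shape [1,3,227,227]
```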

 

Sincerely,

Iffa

Beginner

Thanks. This is a model exported from Azure; I was hoping to convert it to OpenVINO and use it with an NCS2.

Moderator

May I know more about the model (topology) that you are using?

FYI, here is the list of supported topologies:

https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_O... 

 

Sincerely,

Iffa
