Androsch__Horst
Beginner
128 Views

Release 2020.1 bug report - fusing replacer / Model Optimizer Tensorflow conversion error

Hello !

OpenVINO version:   2020.1

MO command line:

python3 mo_tf.py --input_model /media/xavier/Modelle2020/mobilenetv2_deeplabv3/mobilev2deep.pb --input 0:MobilenetV2/Conv/Conv2D --input_shape [1,481,641,3] --output ResizeBilinear_2 --output_dir /media/xavier/Modelle2020/openvino/mobilenetv2_deeplabv3

PB file: attached as a ZIP file.

Error output from mo_tf.py:

[ ERROR ]  Exception occurred during running replacer "fusing" (<class 'extensions.middle.fusings.Fusing'>): After partial shape inference were found shape collision for node Add2_ (old shape: [  1 241 321  32], new shape: [  1 241 321  -1])
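For context on what this message means: during the fusing pass, the Model Optimizer re-runs partial shape inference and compares the newly inferred shape of each node with the one stored earlier; here the channel dimension of Add2_ went from a known 32 to -1 (unknown), which it treats as a collision. A loose, hypothetical illustration of such a strict shape comparison (not OpenVINO's actual code):

```python
# Hypothetical sketch of a strict partial-shape comparison, where -1
# marks an unknown dimension.  Any dimension that changed between the
# old and re-inferred shape counts as a collision -- including a known
# value (32) being re-inferred as unknown (-1), as in the error above.

def shapes_collide(old_shape, new_shape):
    """Return True if the two shapes disagree in rank or any dimension."""
    if len(old_shape) != len(new_shape):
        return True
    return any(o != n for o, n in zip(old_shape, new_shape))

# The shapes from the error message:
print(shapes_collide([1, 241, 321, 32], [1, 241, 321, -1]))  # True
```

So the complaint is that shape inference *lost* information about the channel count of Add2_ somewhere in the fused subgraph.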

Before 2020.1 I was using 2019.3, where the output from mo_tf.py was:

List of operations that cannot be converted to Inference Engine IR:
[ ERROR ]      FusedBatchNormV3 (59)

I was told that the FusedBatchNormV3 problem would be solved in 2020.1, and so it was. But since 2020.1 there is the new "Add2_" problem described above.

Please help if you can.

Greetings!
1 Reply
Max_L_Intel
Moderator

Hello, Horst Androsch.

The OpenVINO toolkit does not support FusedBatchNormV3 layer conversion. Please see the following topic for a workaround provided by one of our users: https://software.intel.com/en-us/forums/intel-distribution-of-openvino-toolkit/topic/831459
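One workaround commonly discussed for unsupported FusedBatchNormV3 ops is to rewrite them as the older FusedBatchNorm in the frozen graph before running the Model Optimizer, dropping the extra "U" dtype attribute that only the V3 variant carries. A hypothetical sketch of that rewrite logic, using plain dicts instead of a real `tf.compat.v1.GraphDef` (a real fix would mutate `graph_def.node` and then re-serialize the .pb):

```python
# Hypothetical sketch (not the exact code from the linked topic):
# downgrade every FusedBatchNormV3 node to FusedBatchNorm and remove
# the "U" attribute, which the V1 op does not define.  Nodes are plain
# dicts here; with TensorFlow installed you would iterate over
# tf.compat.v1.GraphDef().node instead.

def downgrade_fused_batch_norm(nodes):
    for node in nodes:
        if node["op"] == "FusedBatchNormV3":
            node["op"] = "FusedBatchNorm"
            node["attr"].pop("U", None)  # attribute absent on the V1 op
    return nodes

graph = [
    {"op": "Conv2D", "attr": {}},
    {"op": "FusedBatchNormV3", "attr": {"T": "DT_FLOAT", "U": "DT_FLOAT"}},
]
downgrade_fused_batch_norm(graph)
print(graph[1]["op"])  # FusedBatchNorm
```

Note that this only works when the V3 node's extra reserve-space output is unused by the rest of the graph, which is typically the case for frozen inference graphs.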

Also, please take a look at this similar issue reported for ONNX model conversion - https://software.intel.com/en-us/forums/intel-distribution-of-openvino-toolkit/topic/832585

Hope this helps. 
