Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Error converting TF2 model

Elul__Eliav
Beginner

Hi,

I am trying to convert a MobileNet model that was retrained with TF2 to an OpenVINO model.

I load the TF model in "SavedModel" format with this command:

"python mo.py --framework tf --saved_model_dir <Path to model>\model --model_name ir_1 --output_dir <Path to model>\Out"

and I get this error:

[ ERROR ]  Unexpected exception happened during extracting attributes for node conv_pw_13_bn/moving_variance/Read/ReadVariableOp.
Original exception message: 'ascii' codec can't decode byte 0xca in position 1: ordinal not in range(128)

 

When I converted a similar model from a frozen graph with TensorFlow 1.13, everything worked correctly.

Is there a known problem with MobileNet layers trained with TF2?

- Tested with OpenVINO 2019 R3.1 and 2019 R2

Thanks,

Eliav

 

 

2 Replies
JesusE_Intel
Moderator

Hi Eliav,

Thanks for reaching out. Currently the Model Optimizer does not support TensorFlow 2.0. However, there have been reports from other users that if you use the TensorFlow 1.14 freeze_graph.py script to freeze a TensorFlow 2.0 model, the Model Optimizer may accept it.
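For reference, here is a minimal sketch of that workaround under TensorFlow 1.14, using graph_util.convert_variables_to_constants programmatically rather than the freeze_graph.py script. The file names, and the assumption that the retrained MobileNet can be loaded as a Keras model, are illustrative and not taken from this thread:

import tensorflow as tf
from tensorflow.python.framework import graph_util

# Inference mode so BatchNorm layers use their moving statistics
tf.keras.backend.set_learning_phase(0)

# Hypothetical path to the retrained MobileNet saved as a Keras model
model = tf.keras.models.load_model('retrained_mobilenet.h5')
sess = tf.keras.backend.get_session()

# Replace variables with constants and serialize the frozen graph
output_names = [out.op.name for out in model.outputs]
frozen_graph_def = graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), output_names)

with tf.gfile.GFile('frozen_mobilenet.pb', 'wb') as f:
    f.write(frozen_graph_def.SerializeToString())

The resulting frozen_mobilenet.pb could then be passed to the Model Optimizer with --input_model instead of --saved_model_dir.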

Hope this helps.

Regards,

Jesus

Anonymous
Not applicable

This is not true at all! I have wasted another day because Intel staff said you can convert TF2 models by freezing them with TF 1.14. It does not work.
