Elul__Eliav
Beginner

Error converting TF2 model


Hi,

I am trying to convert a MobileNet model that was retrained with TF2 to an OpenVINO model.

I load the TF model in SavedModel format with this command:

"python mo.py --framework tf --saved_model_dir <Path to model>\model --model_name ir_1 --output_dir <Path to model>\Out"

and I get this error:

[ ERROR ]  Unexpected exception happened during extracting attributes for node conv_pw_13_bn/moving_variance/Read/ReadVariableOp.
Original exception message: 'ascii' codec can't decode byte 0xca in position 1: ordinal not in range(128)

 

When I converted a similar model from a frozen graph with TensorFlow 1.13, everything worked fine.
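For reference, a frozen-graph conversion of that kind uses mo.py's --input_model flag rather than --saved_model_dir; the frozen graph file name below is only a placeholder:

"python mo.py --framework tf --input_model <Path to model>\frozen_mobilenet.pb --model_name ir_1 --output_dir <Path to model>\Out"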

Is there a known problem with MobileNet layers trained with TF2?

- Tested with OpenVINO 2019 R3.1 and 2019 R2

Thanks,

Eliav

 

 


2 Replies
JesusE_Intel
Moderator

Hi Eliav,

Thanks for reaching out. Currently, the Model Optimizer does not support TensorFlow 2.0. However, there have been reports from other users that if you use TensorFlow 1.14's freeze_graph.py to freeze a TensorFlow 2.0 model, the Model Optimizer may accept it.

Hope this helps.

Regards,

Jesus

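For reference, below is a minimal sketch of one way to produce a frozen graph from a retrained Keras MobileNet under TensorFlow 1.14 before running the Model Optimizer. It uses tf.graph_util.convert_variables_to_constants (the same mechanism freeze_graph.py relies on) rather than the freeze_graph.py script itself, and it assumes the retrained weights can be reloaded as a Keras model from a hypothetical mobilenet_retrained.h5 file; all paths and file names are placeholders, and, as the reply below notes, this workaround is not guaranteed to work for every TF2 model.

# Hedged sketch: freeze a retrained Keras MobileNet under TensorFlow 1.14
# so the resulting .pb can be passed to the Model Optimizer.
# "mobilenet_retrained.h5", "out" and "mobilenet_frozen.pb" are hypothetical names.
import tensorflow as tf  # assumed to be TensorFlow 1.14
from tensorflow.python.framework import graph_io

# Inference mode, so BatchNorm layers (e.g. conv_pw_13_bn) read their
# moving_mean/moving_variance values instead of using training behaviour.
tf.keras.backend.set_learning_phase(0)
model = tf.keras.models.load_model("mobilenet_retrained.h5")

sess = tf.keras.backend.get_session()
output_names = [out.op.name for out in model.outputs]

# Replace all variables in the graph with constants ("freezing").
frozen_graph_def = tf.graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), output_names)

graph_io.write_graph(frozen_graph_def, "out", "mobilenet_frozen.pb", as_text=False)

The resulting frozen .pb could then be passed to mo.py with --input_model instead of --saved_model_dir.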

Anonymous
Not applicable

This is not true at all! I have wasted another day based on Intel staff saying that you can convert TF2 models by freezing them in TF 1.14. It does not work.
