python3 mo_tf.py --input_meta_graph ~/Downloads/inception_v2_224_quant_20181026/inception_v2_224_quant.ckpt.meta
I am getting the error below:
[ ERROR ] Unexpected exception happened during extracting attributes for node case/cond/is_jpeg/Substr/Switch.
Original exception message: 'ascii' codec can't decode byte 0xff in position 0: ordinal not in range(128)
Kindly make sure that the model you are trying to optimize with this command (--input_meta_graph) is a non-frozen model.
In case you downloaded the model from the Supported Frozen Quantized Topologies list in Converting a TensorFlow* Model, it is a frozen model; to generate IR from it, kindly use the following command:
python mo_tf.py --input_model <model_dir>/inception_v2_224_quant_frozen.pb --input_shape [1,224,224,3]
You are right I am using Quantized Topologies but in that directory it contains both files frozen and non-frozen. As it is getting required files for non-frozen, it shouldn't work?
If inception_v2_224_quant.ckpt.meta is not a non-frozen file then please give me link of non-frozen files.
My understanding is .meta, .index , checkpoint, .ckpt-data are non-frozen files and .pb comes in frozen. Correct me if I'm wrong.
As per Converting a TensorFlow* Model, for the Supported Frozen Quantized Topologies the frozen model file (the .pb file) should be fed to the Model Optimizer.
You are right regarding the non-frozen files.
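To make the distinction concrete, here is a small illustrative sketch (a hypothetical helper, not part of the Model Optimizer) that picks the appropriate mo_tf.py flag based on which files are present in the model directory. The file suffixes and the fixed input shape are assumptions taken from this thread:

```python
# Hypothetical helper (not shipped with the Model Optimizer): build the
# mo_tf.py command line from the file names found in a model directory.

def mo_command(files):
    """Return a mo_tf.py command for the given list of file names."""
    frozen = [f for f in files if f.endswith(".pb")]
    meta = [f for f in files if f.endswith(".ckpt.meta")]
    if frozen:
        # Frozen graph: weights are baked into the .pb, so use --input_model.
        # The input shape [1,224,224,3] is the one used in this thread.
        return ("python3 mo_tf.py --input_model {} "
                "--input_shape [1,224,224,3]".format(frozen[0]))
    if meta:
        # Non-frozen checkpoint: the .meta file holds the graph definition and
        # the .index/.data files hold the variables, so use --input_meta_graph.
        return "python3 mo_tf.py --input_meta_graph {}".format(meta[0])
    raise ValueError("no convertible model file found")

# A directory with both file types (as in this thread) resolves to the
# frozen-model command, since the frozen .pb is the recommended input.
print(mo_command(["inception_v2_224_quant_frozen.pb",
                  "inception_v2_224_quant.ckpt.meta"]))
```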
Can you please explain what the command below does? It is under point 2 of the Loading Non-Frozen Models to the Model Optimizer link.
python3 mo_tf.py --input_meta_graph <INPUT_META_GRAPH>.meta