GUPTA__ANMOL
Beginner

OpenVINO 2018.3 Model Optimizer raises an error while generating IR for TensorFlow MobileNet v1 and v2

Hi,

I am unable to generate IR for the MobileNet v1 and v2 networks (implemented in TensorFlow) using the Model Optimizer.

I picked up the topologies listed at this link:

https://software.intel.com/en-us/articles/OpenVINO-Using-TensorFlow

On the same page, I followed the instructions in the section "Loading Non-Frozen Models to the Model Optimizer".

Since the model tar archive contained .meta files (and no checkpoint file), I used the following command:

mo_tf.py --input_meta_graph <INPUT_META_GRAPH>.meta

On running the command, I got the following errors:

Model Optimizer version: 1.2.185.5335e231
[ ERROR ]  Cannot load input model: Unsuccessful TensorSliceReader constructor: Failed to find any matching files for mobilenet_v1_1.0_224
[[Node: save/RestoreV2_366 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_save/Const_0_0, save/RestoreV2_366/tensor_names, save/RestoreV2_366/shape_and_slices)]]
 
Thanks
Monique_J_Intel
Employee

Hi Anmol,

Regarding mobilenet_v1_1.0_224,

I reproduced your issue, and this is a bug that we will fix. In the meantime, you can convert the model by running the following:

mo_tf.py --input_model mobilenet_v1_1.0_224_frozen.pb --input_shape [1,224,224,3]

Regarding mobilenet_v2_1.0_224,

I reproduced your issue, and this is a bug as well: there is an issue interpreting the 'InfeedEnqueueTuple' node before the model can be frozen successfully. I also tried converting the frozen .pb file and ran into an error where the input shapes cannot be inferred from the node "MobilenetV2/Conv/Conv2D". The workaround for this model is to download freeze_graph.py from here, then run the following command to freeze the .pbtxt file:

python3 freeze_graph.py --input_graph mobilenet_v2_1.0_224_eval.pbtxt --input_checkpoint mobilenet_v2_1.0_224.ckpt --binary=false --output_node_names=MobilenetV2/Predictions/Reshape_1 --output_graph mobilenet_v2_frozen.pb

Now you have your frozen MobileNet v2 model and can convert it by running:

python3 mo.py --input_model mobilenet_v2_frozen.pb --input_shape [1,224,224,3]

Kind Regards,

Monique Jones

 

Kolla__Tarun
Beginner

Hey, I have more or less the same problem. I have an unfrozen model and I'm running the following command:

python3 mo_tf.py --input_meta_graph /Users/tarunkolla/Desktop/model/model.ckpt.meta

I'm following the documentation for loading an unfrozen model: https://docs.openvinotoolkit.org/2019_R1/_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_TensorFlow.html#freeze-the-tensorflow-model

I get the following error:

[ ERROR ]  The directory "/opt/intel/openvino_2019.1.090/deployment_tools/model_optimizer/." is not writable

Could you help? Thank you.

Shubha_R_Intel
Employee

Dear Kolla, Tarun,

You either need to open the shell from which you run the Model Optimizer python script as Administrator (or with sudo), or use the --output_dir switch to write the generated IR to a directory where you have write permissions.
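For example, a minimal sketch of the second option (the output path ~/ir_output is hypothetical; any directory you can write to works):

```shell
# Create a directory you own, then point the Model Optimizer at it
# with --output_dir so it no longer tries to write into /opt/intel/...
mkdir -p ~/ir_output
python3 mo_tf.py \
    --input_meta_graph /Users/tarunkolla/Desktop/model/model.ckpt.meta \
    --output_dir ~/ir_output
```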

Thanks,

Shubha

 
