Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Exception occurred during replacer "REPLACER_ID"

Abdul_Aziz__Nurul_Fa
python3 mo_tf.py --input_model /home/camaroia5/scripts/PartScannerTrtModel/data/frozen_inference_graph.pb --output_dir Desktop --input_shape [1,300,300,3] --reverse_input_channels --log_level=DEBUG --data_type FP32 --tensorflow_use_custom_operations_config /opt/intel/computer_vision_sdk_2018.5.455/deployment_tools/model_optimizer/extensions/front/tf/ssd_v2_support.json --tensorflow_object_detection_api_pipeline_config /home/camaroia5/scripts/PartScannerTrtModel/data/ssd_inception_v2_coco.config

Above is the command line I use to generate the Model Optimizer output files (.xml and .bin). As shown in the command, I already pass --input_shape, but the same error is produced even when --input_shape is not included. I have attached a screen capture of the error. Note that the model used here is ssd_inception_v2, and it has already been converted into frozen_inference_graph.

Alternatively, I tried some optimization tools on my frozen_inference_graph and produced another .pb file called optimized_inference_graph. I passed it to the OpenVINO Model Optimizer, in case it was a better fit, but I got another error. This attempt looks slightly more promising than the previous one, because at least some output values are produced instead of the earlier unknowns.

I am not sure which model is suitable for conversion to OpenVINO, or what action should be taken to resolve these problems. I hope somebody here can help me convert my model with the OpenVINO Model Optimizer.

2 Replies
Shubha_R_Intel
Employee

Dearest Abdul Aziz, Nurul Fatin Nadiah,

Where did you get your original model? It's difficult to say exactly what is causing your problem, but MO FAQ #38 clearly says this:

38. What does the message "Stopped shape/value propagation at node" mean?

Model Optimizer cannot infer shapes or values for the specified node. It can happen because of a bug in the custom shape infer function, because the node inputs have incorrect values/shapes, or because the input shapes are incorrect.

This points to your input shapes as the problem, and I see that you are passing in --input_shape [1,300,300,3]. Normally you don't have to pass a specific input shape when running the Model Optimizer commands for TensorFlow Object Detection API models. Please see the documentation below:

https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_convert_model_tf_specific_Convert_Object_Detection_API_Models.html

I recommend you try and generate IR first without a custom input shape, and see if that works.
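As a rough sketch of what that could look like (reusing the paths and flags from your own command above, minus --input_shape and the debug logging; adjust paths to your setup):

python3 mo_tf.py --input_model /home/camaroia5/scripts/PartScannerTrtModel/data/frozen_inference_graph.pb --output_dir Desktop --reverse_input_channels --data_type FP32 --tensorflow_use_custom_operations_config /opt/intel/computer_vision_sdk_2018.5.455/deployment_tools/model_optimizer/extensions/front/tf/ssd_v2_support.json --tensorflow_object_detection_api_pipeline_config /home/camaroia5/scripts/PartScannerTrtModel/data/ssd_inception_v2_coco.config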

Finally, I noticed that you are using a very old version of OpenVINO. Please upgrade to the latest R2019.1.0.1 and try again.

Thanks,

Shubha

Abdul_Aziz__Nurul_Fa

First of all, thank you for your fast reply.

I have already gone through the MO FAQ several times, but I cannot figure out the solution to my problems. Honestly, I do not understand how to set the input shape for the MO command lines. Can you explain more about it? I have also attached the .pbtxt file that I converted after freezing my inference model, to make it easier to read the inference graph.

