Model Optimizer arguments:
Common parameters:
	- Path to the Input Model: 	/home/ubuntu/tensorflow/exported_graphs/frozen_inference_graph.pb
	- Path for generated IR: 	/home/ubuntu/openvino/ir/
	- IR output name: 	frozen_inference_graph
	- Log level: 	ERROR
	- Batch: 	Not specified, inherited from the model
	- Input layers: 	Not specified, inherited from the model
	- Output layers: 	Not specified, inherited from the model
	- Input shapes: 	Not specified, inherited from the model
	- Mean values: 	Not specified
	- Scale values: 	Not specified
	- Scale factor: 	Not specified
	- Precision of IR: 	FP32
	- Enable fusing: 	True
	- Enable grouped convolutions fusing: 	True
	- Move mean values to preprocess section: 	False
	- Reverse input channels: 	False
TensorFlow specific parameters:
	- Input model in text protobuf format: 	False
	- Path to model dump for TensorBoard: 	None
	- List of shared libraries with TensorFlow custom layers implementation: 	None
	- Update the configuration file with input/output node names: 	None
	- Use configuration file used to generate the model with Object Detection API: 	/home/ubuntu/tensorflow/trained_model/ssd_inception_v2_coco_2018_01_28/pipeline.config
	- Use the config file: 	None
Model Optimizer version: 	
The Preprocessor block has been removed. Only nodes performing mean value subtraction and scaling (if applicable) are kept.
[ ANALYSIS INFO ]  Your model looks like TensorFlow Object Detection API Model.
Check if all parameters are specified:
	--tensorflow_use_custom_operations_config
	--tensorflow_object_detection_api_pipeline_config
	--input_shape (optional)
	--reverse_input_channels (if you convert a model to use with the Inference Engine sample applications)
Detailed information about conversion of this model can be found at https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_convert_model_tf_specific_Convert_Object_Detection_API_Models.html
[ ERROR ]  Exception occurred during running replacer "REPLACEMENT_ID" (): Something bad has happened with graph!
Data node "Preprocessor/mul" has 2 producers
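The log's `[ ANALYSIS INFO ]` block points at the likely cause: the TensorFlow specific parameters show `Use the config file: None`, i.e. the subgraph-replacement config (`--tensorflow_use_custom_operations_config`) was never passed, so the Object Detection API preprocessor subgraph is not replaced cleanly and the graph check fails. A re-invocation with the flags the analysis block lists, using the paths from this log, would look roughly like the sketch below. The `ssd_v2_support.json` filename is an assumption based on the SSD Inception V2 model in the pipeline path; pick the support file that matches your model and Model Optimizer version from the MO `extensions/front/tf/` directory.

```shell
# Sketch of the documented conversion command for a TF Object Detection API
# model. Paths are taken from the log above; ssd_v2_support.json is an
# assumed filename -- substitute the support config matching your model.
python3 mo_tf.py \
  --input_model /home/ubuntu/tensorflow/exported_graphs/frozen_inference_graph.pb \
  --output_dir /home/ubuntu/openvino/ir/ \
  --tensorflow_object_detection_api_pipeline_config /home/ubuntu/tensorflow/trained_model/ssd_inception_v2_coco_2018_01_28/pipeline.config \
  --tensorflow_use_custom_operations_config extensions/front/tf/ssd_v2_support.json \
  --reverse_input_channels
```

`--reverse_input_channels` is only needed if the IR will be fed BGR frames by the Inference Engine sample applications, as the analysis hint notes.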