Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Bug: R5 Model Optimizer fails on frozen TensorFlow model


I'm trying to use the R5 Model Optimizer to convert a frozen TensorFlow model (text-format .pb). I've attached the graph as a zip file.

This is the bug report that was generated:

(tf112_mkl_p36) [bduser@merlin-param01 frozen_tensorflow_model]$ python /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/ --input_model unet_model_for_inference_dice08771.pb --input_model_is_text
Model Optimizer arguments:
Common parameters:
	- Path to the Input Model: 	/home/bduser/tony/unet/single-node/frozen_tensorflow_model/unet_model_for_inference_dice08771.pb
	- Path for generated IR: 	/home/bduser/tony/unet/single-node/frozen_tensorflow_model/.
	- IR output name: 	unet_model_for_inference_dice08771
	- Log level: 	ERROR
	- Batch: 	Not specified, inherited from the model
	- Input layers: 	Not specified, inherited from the model
	- Output layers: 	Not specified, inherited from the model
	- Input shapes: 	Not specified, inherited from the model
	- Mean values: 	Not specified
	- Scale values: 	Not specified
	- Scale factor: 	Not specified
	- Precision of IR: 	FP32
	- Enable fusing: 	True
	- Enable grouped convolutions fusing: 	True
	- Move mean values to preprocess section: 	False
	- Reverse input channels: 	False
TensorFlow specific parameters:
	- Input model in text protobuf format: 	True
	- Offload unsupported operations: 	False
	- Path to model dump for TensorBoard: 	None
	- List of shared libraries with TensorFlow custom layers implementation: 	None
	- Update the configuration file with input/output node names: 	None
	- Use configuration file used to generate the model with Object Detection API: 	None
	- Operations to offload: 	None
	- Patterns to offload: 	None
	- Use the config file: 	None
Model Optimizer version:
[ ERROR ]  -------------------------------------------------
[ ERROR ]  ----------------- INTERNAL ERROR ----------------
[ ERROR ]  Unexpected exception happened.
[ ERROR ]  Please contact Model Optimizer developers and forward the following information:
[ ERROR ]  Traceback (most recent call last):
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/", line 325, in main
    return driver(argv)
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/", line 267, in driver
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/pipeline/", line 256, in tf2nx
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/middle/passes/", line 218, in partial_infer
    control_flow_infer(graph, n)
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/middle/passes/", line 74, in control_flow_infer
    node.cf_infer(node, is_executable, mark_executability)
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/extensions/ops/", line 67, in control_flow_infer
    assert 1 <= len(switch_data_0_port_node_id) + len(switch_data_1_port_node_id) <= 2

[ ERROR ]  ---------------- END OF BUG REPORT --------------
[ ERROR ]  -------------------------------------------------
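In case it helps with diagnosis: the failing assertion is in the Switch op's control_flow_infer, so I wrote a quick check (my own helper, not part of the toolkit) to see whether the frozen graph still contains TensorFlow control-flow ops such as Switch/Merge, which can be left behind when a graph is frozen with training-mode conditionals (e.g. dropout or batch norm) still present. It just scans the text-format GraphDef for `op: "..."` lines, so it's a rough textual check, not a real protobuf parse:

```python
import re
from collections import Counter

# TensorFlow control-flow ops; Switch is the op whose control_flow_infer
# raises the assertion in the log above.
CONTROL_FLOW_OPS = {"Switch", "Merge", "Enter", "Exit", "NextIteration"}

def count_control_flow_ops(pbtxt: str) -> Counter:
    """Rough textual scan of a text-format GraphDef for control-flow ops."""
    ops = re.findall(r'op:\s*"([^"]+)"', pbtxt)
    return Counter(op for op in ops if op in CONTROL_FLOW_OPS)

# Example on a tiny inline GraphDef fragment:
sample = '''
node { name: "cond/Switch" op: "Switch" }
node { name: "add" op: "Add" }
node { name: "cond/Merge" op: "Merge" }
'''
print(count_control_flow_ops(sample))  # Counter({'Switch': 1, 'Merge': 1})
```

To check the real model, read the .pbtxt contents and pass them to `count_control_flow_ops`. If Switch/Merge nodes do show up, I'm guessing re-freezing the graph in inference mode (so training-only conditionals are folded away) might be worth trying, but I'd appreciate confirmation.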


I'm not sure what to do next. Could someone help?

Thanks so much.


