I was trying to convert a model which uses tf.tile and tf.slice operations after a convolution layer, and got an error while generating the IR:
python3 mo_tf.py --input_model frozen_out/frozen_graph.pb --output_dir out_IR

Model Optimizer arguments
    Batch: 1
    Precision of IR: FP32
    Enable fusing: True
    Enable gfusing: True
    Names of input layers: inherited from the model
    Path to the Input Model: frozen_out/frozen_graph.pb
    Input shapes: inherited from the model
    Log level: ERROR
    Mean values: ()
    IR output name: inherited from the model
    Names of output layers: inherited from the model
    Path for generated IR: out_IR
    Reverse input channels: False
    Scale factor: None
    Scale values: ()
    Version: 0.3.61.37271eb9
    Input model in text protobuf format: False
    Offload unsupported operations: False
    Path to model dump for TensorBoard: None
    Update the configuration file with input/output node names: None
    Operations to offload: None
    Patterns to offload: None
    Use the config file: None
[ ERROR ] Shape is not defined for output 0 of "output".
[ ERROR ] Cannot infer shapes or values for node "output".
[ ERROR ] Not all output shapes were inferred or fully defined for node "output". For more information please refer to Model Optimizer FAQ, question #40.
I looked at FAQ #40; it suggests the --input_shape option as a fix, but all the input shapes feeding the tensor “output” are valid.
My environment is Ubuntu 17.10 x64 with Python 3.6, TensorFlow 1.8, and CV-SDK 2018.0.234.
I attached both the frozen graph and the TensorFlow code that was used to create it, as well as the full debug log of the error.
According to the Deployment Toolkit Developer Guide, all of my model's operations should be supported.
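For reference, the graph structure that triggers the error can be sketched in NumPy terms. The conv output shape and the tile multiples below are my assumptions chosen to match the shapes that appear later in the debug log, not the exact attached code:

```python
import numpy as np

# Hypothetical stand-in for the convolution layer's output.
conv_out = np.zeros((1, 4, 4, 10), dtype=np.float32)

# tf.tile with multiples [1, 1, 1, 2]: last axis 10 -> 20.
tiled = np.tile(conv_out, (1, 1, 1, 2))

# tf.slice with begin [0, 0, 0, 0] and size [1, 4, 4, 10]:
# equivalent to taking the first 10 channels back.
sliced = tiled[:, :, :, :10]

print(tiled.shape)   # (1, 4, 4, 20)
print(sliced.shape)  # (1, 4, 4, 10)
```

Both output shapes are fully static, which is why I would expect shape inference to succeed.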
Any help would be appreciated.
Hi Daniel,
It seems your shape is not defined. Please try the --input_shape option to override the model's input shape:
--input_shape [N,H,W,C] for TensorFlow models. Please follow the developer guide to learn more. Thank you.
Best regards,
Fiona
Thanks Fiona.
But as you can see in the full log I attached, the input shape is well defined:
[ 2018-05-29 09:41:18,172 ] [ DEBUG ] [ infer:71 ] Partial infer for input
[ 2018-05-29 09:41:18,172 ] [ DEBUG ] [ infer:72 ] Op: Placeholder
[ 2018-05-29 09:41:18,172 ] [ DEBUG ] [ infer:77 ] Inputs:
[ 2018-05-29 09:41:18,172 ] [ DEBUG ] [ infer:79 ] Outputs:
[ 2018-05-29 09:41:18,172 ] [ DEBUG ] [ infer:39 ] output[0]: shape = [ 1  4  4 20], value = <UNKNOWN>
As you suggested, I tried to set it with the --input_shape option, but the error is the same.
(command: python3 mo_tf.py --input_model frozen_out/frozen_graph.pb --output_dir out_IR --input_shape=[1,4,4,20])
Furthermore, as I mentioned before, all the input shapes of the tensor “output” are valid:
[ 2018-05-29 09:41:18,178 ] [ DEBUG ] [ infer:72 ] Op: Slice
[ 2018-05-29 09:41:18,178 ] [ DEBUG ] [ infer:77 ] Inputs:
[ 2018-05-29 09:41:18,178 ] [ DEBUG ] [ infer:39 ] input[0]: shape = [ 1  4  4 20], value = <UNKNOWN>
[ 2018-05-29 09:41:18,178 ] [ DEBUG ] [ infer:39 ] input[1]: shape = [4], value = [0 0 0 0]
[ 2018-05-29 09:41:18,179 ] [ DEBUG ] [ infer:39 ] input[2]: shape = [4], value = [ 1  4  4 10]
[ 2018-05-29 09:41:18,179 ] [ DEBUG ] [ infer:79 ] Outputs:
[ 2018-05-29 09:41:18,179 ] [ DEBUG ] [ infer:39 ] output[0]: shape = <UNKNOWN>, value = <UNKNOWN>
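Note that the begin/size values in this log fully determine the Slice output shape. A plain NumPy equivalent of tf.slice, with the values copied from the log (tensor contents are unknown, so zeros stand in), shows the shape that inference should have produced:

```python
import numpy as np

# Stand-in input matching the logged shape [ 1  4  4 20].
x = np.zeros((1, 4, 4, 20), dtype=np.float32)

begin = [0, 0, 0, 0]   # input[1] from the log
size = [1, 4, 4, 10]   # input[2] from the log

# tf.slice(x, begin, size) is equivalent to this basic slice.
out = x[tuple(slice(b, b + s) for b, s in zip(begin, size))]
print(out.shape)  # (1, 4, 4, 10)
```

So the output shape is statically computable from the inputs, yet the optimizer reports it as <UNKNOWN>.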
Hi Daniel,
It seems that you may want to supply the command-line parameter --input <node name> so that the Model Optimizer knows which node you are providing the shape for.
Kind Regards,
Monique Jones
Hi Monique,
The --input option is not relevant in this case: the input node for the optimizer defaults to the first node in the TensorFlow graph, so there is no need to set --input.
In addition, as you can see in my previous post, the input placeholder is well defined:
[ 2018-05-29 09:41:18,172 ] [ DEBUG ] [ infer:71 ] Partial infer for input
[ 2018-05-29 09:41:18,172 ] [ DEBUG ] [ infer:72 ] Op: Placeholder
[ 2018-05-29 09:41:18,172 ] [ DEBUG ] [ infer:77 ] Inputs:
[ 2018-05-29 09:41:18,172 ] [ DEBUG ] [ infer:79 ] Outputs:
[ 2018-05-29 09:41:18,172 ] [ DEBUG ] [ infer:39 ] output[0]: shape = [ 1  4  4 20], value = <UNKNOWN>
Nevertheless, I added the input node explicitly; as expected, the error is the same.
(python3 mo_tf.py --input_model frozen_out/frozen_graph.pb --input=input --input_shape=[1,4,4,20] --output_dir out_IR)
Thank you, we will investigate to see if the problem is in the Slice layer.
Fiona