
Input 0 of node up_conv_1/crop_and_concat/floordiv_1 was passed int64 from up_conv_1/crop_and_concat/sub_1_port_0_ie_placeholder

We're trying to convert the TensorFlow U-Net model from https://github.com/jakeret/tf_unet

I've attached the protobuf text file for the frozen model.  It seems to be complaining about an input being type INT64. However, there is no INT64 datatype in the protobuf (all are INT32).  Can someone help me with the next steps to diagnose the issue?

Here's what I get from the debug log:


[ 2019-05-29 19:42:59,830 ] [ DEBUG ] [ infer:127 ]  --------------------
[ 2019-05-29 19:42:59,830 ] [ DEBUG ] [ infer:128 ]  Partial infer for up_conv_1/crop_and_concat/sub_1/add_
[ 2019-05-29 19:42:59,830 ] [ DEBUG ] [ infer:129 ]  Op: Add
[ 2019-05-29 19:42:59,830 ] [ DEBUG ] [ infer:140 ]  Inputs:
[ 2019-05-29 19:42:59,831 ] [ DEBUG ] [ infer:34 ]  input[0]: shape = [], value = 280
[ 2019-05-29 19:42:59,831 ] [ DEBUG ] [ infer:34 ]  input[1]: shape = [], value = -272
[ 2019-05-29 19:42:59,831 ] [ DEBUG ] [ infer:142 ]  Outputs:
[ 2019-05-29 19:42:59,831 ] [ DEBUG ] [ infer:34 ]  output[0]: shape = [], value = 8
[ 2019-05-29 19:42:59,831 ] [ DEBUG ] [ infer:127 ]  --------------------
[ 2019-05-29 19:42:59,831 ] [ DEBUG ] [ infer:128 ]  Partial infer for up_conv_1/crop_and_concat/floordiv_1
[ 2019-05-29 19:42:59,831 ] [ DEBUG ] [ infer:129 ]  Op: FloorDiv
[ INFO ]  Called "tf_native_tf_node_infer" for node "up_conv_1/crop_and_concat/floordiv_1"
[ 2019-05-29 19:42:59,833 ] [ DEBUG ] [ tf:222 ]  Added placeholder with name 'up_conv_1/crop_and_concat/sub_1_port_0_ie_placeholder'
[ 2019-05-29 19:42:59,834 ] [ DEBUG ] [ tf:222 ]  Added placeholder with name 'up_conv_1/crop_and_concat/floordiv_1/y_port_0_ie_placeholder'
[ 2019-05-29 19:42:59,834 ] [ DEBUG ] [ tf:235 ]  update_input_in_pbs: replace input 'up_conv_1/crop_and_concat/sub_1' with input 'up_conv_1/crop_and_concat/sub_1_port_0_ie_placeholder'
[ 2019-05-29 19:42:59,834 ] [ DEBUG ] [ tf:243 ]  Replacing input '0' of the node 'up_conv_1/crop_and_concat/floordiv_1' with placeholder 'up_conv_1/crop_and_concat/sub_1_port_0_ie_placeholder'
[ 2019-05-29 19:42:59,834 ] [ DEBUG ] [ tf:235 ]  update_input_in_pbs: replace input 'up_conv_1/crop_and_concat/floordiv_1/y' with input 'up_conv_1/crop_and_concat/floordiv_1/y_port_0_ie_placeholder'
[ 2019-05-29 19:42:59,834 ] [ DEBUG ] [ tf:243 ]  Replacing input '1' of the node 'up_conv_1/crop_and_concat/floordiv_1' with placeholder 'up_conv_1/crop_and_concat/floordiv_1/y_port_0_ie_placeholder'
[ ERROR ]  Cannot infer shapes or values for node "up_conv_1/crop_and_concat/floordiv_1".
[ ERROR ]  Input 0 of node up_conv_1/crop_and_concat/floordiv_1 was passed int64 from up_conv_1/crop_and_concat/sub_1_port_0_ie_placeholder:0 incompatible with expected int32.
[ ERROR ]
[ ERROR ]  It can happen due to bug in custom shape infer function <function tf_native_tf_node_infer at 0x7fa14a36d7b8>.
[ ERROR ]  Or because the node inputs have incorrect values/shapes.
[ ERROR ]  Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ 2019-05-29 19:42:59,836 ] [ DEBUG ] [ infer:194 ]  Node "up_conv_1/crop_and_concat/floordiv_1" attributes: {'pb': name: "up_conv_1/crop_and_concat/floordiv_1"
op: "FloorDiv"
input: "up_conv_1/crop_and_concat/sub_1_port_0_ie_placeholder"
input: "up_conv_1/crop_and_concat/floordiv_1/y_port_0_ie_placeholder"
attr {
  key: "T"
  value {
    type: DT_INT32
  }
}
, '_in_ports': {0, 1}, '_out_ports': {0}, 'kind': 'op', 'name': 'up_conv_1/crop_and_concat/floordiv_1', 'op': 'FloorDiv', 'precision': 'FP32', 'infer': <function tf_native_tf_node_infer at 0x7fa14a36d7b8>, 'is_output_reachable': True, 'is_undead': False, 'is_const_producer': False, 'is_partial_inferred': False}
[ ERROR ]  Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "up_conv_1/crop_and_concat/floordiv_1" node.
 For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #38.
[ 2019-05-29 19:42:59,837 ] [ DEBUG ] [ main:318 ]  Traceback (most recent call last):
  File "/home/bduser/anaconda3/envs/decathlon/lib/python3.7/site-packages/tensorflow/python/framework/importer.py", line 426, in import_graph_def
    graph._c_graph, serialized, options)  # pylint: disable=protected-access
tensorflow.python.framework.errors_impl.InvalidArgumentError: Input 0 of node up_conv_1/crop_and_concat/floordiv_1 was passed int64 from up_conv_1/crop_and_concat/sub_1_port_0_ie_placeholder:0 incompatible with expected int32.
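For what it's worth, the values in the log line up with the usual centre-crop offset arithmetic that crop_and_concat performs before slicing (a plain-Python sketch; the function name is my own, not from tf_unet):

```python
# Centre-crop offset: how far to shift the larger feature map so the
# smaller one sits in its middle. In the log above, the Add node computes
# 280 + (-272) = 8, and that result feeds the FloorDiv (by 2) that fails
# with the int64/int32 mismatch.
def center_crop_offset(large_dim, small_dim):
    return (large_dim - small_dim) // 2

print(center_crop_offset(280, 272))
```

So the graph arithmetic itself is trivial; the failure is purely about the dtype the Model Optimizer assigns to the intermediate placeholder.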

Thanks.

7 Replies

Forgot to add my model optimizer command line:

python ~/intel/openvino/deployment_tools/model_optimizer/mo_tf.py --input_model protobuf.pbtxt  --input_shape [1,572,572,1] --input_model_is_text --log_level DEBUG

Employee

Dear G Anthony R. 

What version of OpenVino are you using? It seems old, because when I try your command

C:\Users\sdramani\Downloads\protobuf>python "c:\Program Files (x86)\IntelSWTools\openvino_2019.1.148\deployment_tools\model_optimizer\mo_tf.py"  --input_model protobuf.pbtxt  --input_shape [1,572,572,1] --input_model_is_text --log_level DEBUG

with your frozen protobuf.pbtxt on the latest OpenVino Release 2019R1.1 I get a much more tractable error:

[ ERROR ]  List of operations that cannot be converted to Inference Engine IR:
[ ERROR ]      RandomUniform (10)
[ ERROR ]          down_conv_0/conv2d/dropout/random_uniform/RandomUniform
[ ERROR ]          down_conv_0/conv2d_1/dropout/random_uniform/RandomUniform
[ ERROR ]          down_conv_1/conv2d/dropout/random_uniform/RandomUniform
[ ERROR ]          down_conv_1/conv2d_1/dropout/random_uniform/RandomUniform
[ ERROR ]          down_conv_2/conv2d/dropout/random_uniform/RandomUniform
[ ERROR ]          down_conv_2/conv2d_1/dropout/random_uniform/RandomUniform
[ ERROR ]          up_conv_1/conv2d/dropout/random_uniform/RandomUniform
[ ERROR ]          up_conv_1/conv2d_1/dropout/random_uniform/RandomUniform
[ ERROR ]          up_conv_0/conv2d/dropout/random_uniform/RandomUniform
[ ERROR ]          up_conv_0/conv2d_1/dropout/random_uniform/RandomUniform
[ ERROR ]  Part of the nodes was not converted to IR. Stopped.
 For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #24.

The MO FAQ #24 says this:

24. What does the message "Part of the nodes was not translated to IE. Stopped" mean?

Some of the layers are not supported by the Model Optimizer and cannot be translated to an Intermediate Representation. You can extend the Model Optimizer by adding new primitives. For more information, refer to Extending the Model Optimizer with New Primitives page.

That calls for custom layers.

Another way to handle it, if you don't want to write custom layers, is offloading the sub-graph to TensorFlow, but I don't find that feature easy to use; custom layers are a cleaner solution.
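To illustrate why dropout drags RandomUniform into the frozen graph: tf.nn.dropout draws a random keep-mask as part of the graph, so the op survives freezing even though it does nothing useful at inference time. A plain-Python sketch of inverted dropout (my own illustration, not tf_unet's code):

```python
import random

def inverted_dropout(xs, keep_prob, rng=random):
    """Inverted dropout: zero each element with probability (1 - keep_prob)
    and scale survivors by 1/keep_prob. The random draw here is what the
    RandomUniform nodes in the frozen graph correspond to."""
    return [x / keep_prob if rng.random() < keep_prob else 0.0 for x in xs]

# At inference keep_prob == 1.0, so the output equals the input --
# which is why re-exporting the network without the dropout ops
# loses nothing for deployment.
print(inverted_dropout([1.0, 2.0, 3.0], 1.0))
```

That is also why re-freezing the model with the dropout layers removed entirely is often the simplest way to sidestep the unsupported RandomUniform ops.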

Hope it helps.

Thanks,

Shubha


Looks like I'm on:

/home/bduser/intel/openvino_2019.1.094
Employee

Dear G Anthony R. (Intel),

Wow. That's ancient in OpenVino years. You should always keep up with the latest releases, because improvements and bug fixes land constantly. My guess is that you stumbled on a bug that has since been fixed in 2019R1.1.

Shubha


Thanks. I updated to the latest version from May (2019.1.1-83-g28dfbfd). I'm still seeing the same error:

(openvino) [bduser@param01 hello]$ python /opt/intel/openvino/deployment_tools/model_optimizer/mo_tf.py --input_model protobuf.pbtxt  --input_shape [1,572,572,1] --input_model_is_text
Model Optimizer arguments:
Common parameters:
    - Path to the Input Model:     /home/bduser/tony/tf_unet/hello/protobuf.pbtxt
    - Path for generated IR:     /home/bduser/tony/tf_unet/hello/.
    - IR output name:     protobuf
    - Log level:     ERROR
    - Batch:     Not specified, inherited from the model
    - Input layers:     Not specified, inherited from the model
    - Output layers:     Not specified, inherited from the model
    - Input shapes:     [1,572,572,1]
    - Mean values:     Not specified
    - Scale values:     Not specified
    - Scale factor:     Not specified
    - Precision of IR:     FP32
    - Enable fusing:     True
    - Enable grouped convolutions fusing:     True
    - Move mean values to preprocess section:     False
    - Reverse input channels:     False
TensorFlow specific parameters:
    - Input model in text protobuf format:     True
    - Path to model dump for TensorBoard:     None
    - List of shared libraries with TensorFlow custom layers implementation:     None
    - Update the configuration file with input/output node names:     None
    - Use configuration file used to generate the model with Object Detection API:     None
    - Operations to offload:     None
    - Patterns to offload:     None
    - Use the config file:     None
Model Optimizer version:     2019.1.1-83-g28dfbfd
[ WARNING ]
Detected not satisfied dependencies:
    test-generator: installed: 0.1.2, required: 0.1.1

Please install required versions of components or use install_prerequisites script
/opt/intel/openvino_2019.1.144/deployment_tools/model_optimizer/install_prerequisites/install_prerequisites_tf.sh
Note that install_prerequisites scripts may install additional components.
[ ERROR ]  Cannot infer shapes or values for node "up_conv_1/crop_and_concat/floordiv_1".
[ ERROR ]  Input 1 of node up_conv_1/crop_and_concat/floordiv_1 was passed int32 from up_conv_1/crop_and_concat/floordiv_1/y_port_0_ie_placeholder:0 incompatible with expected int64.
[ ERROR ]
[ ERROR ]  It can happen due to bug in custom shape infer function <function tf_native_tf_node_infer at 0x7fc77de7b400>.
[ ERROR ]  Or because the node inputs have incorrect values/shapes.
[ ERROR ]  Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ]  Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ]  Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "up_conv_1/crop_and_concat/floordiv_1" node.
 For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #38.

Employee

Dear G Anthony R,

Wow. Now that is really odd. I'm using Windows and you are using Linux, but that shouldn't make a difference; I've never seen OS-specific bugs like this in OpenVino. One difference I notice is that I invoke Model Optimizer from the versioned openvino_2019.1.148 directory, while you invoke it from openvino (which is correct, by the way). Can you kindly try invoking Model Optimizer from the versioned directory instead? If that works, then this is clearly a bug, because the openvino path should absolutely work.

If the above doesn't work, please completely uninstall all existing old OpenVino installations (by running install.sh) and do a clean install.

Sorry that this is happening to you -

Post back here regarding your findings.

Shubha


I'll give that a try.

Thanks so much.
