Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Error while running MO on Universal Sentence Encoder (Cannot infer shapes or values for node )

KishorGandham
Beginner

Hi All,

I got the TensorFlow-based Universal Sentence Encoder model from https://tfhub.dev/google/universal-sentence-encoder/4
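For context, this is roughly how the model ends up in the local cache directory I pass to Model Optimizer below (a minimal sketch assuming tensorflow_hub was used for the download; the hash-named subfolder will differ per machine):

import os
import tensorflow_hub as hub

# Tell tensorflow_hub where to cache downloaded modules; it unpacks each
# module into a hash-named subdirectory of this folder.
os.environ["TFHUB_CACHE_DIR"] = "/home/ubuntu/tfcache"

# Download (or reuse) the Universal Sentence Encoder v4 SavedModel and
# print the local directory, which is what --saved_model_dir points at.
local_dir = hub.resolve("https://tfhub.dev/google/universal-sentence-encoder/4")
print(local_dir)  # e.g. /home/ubuntu/tfcache/<hash>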

I have the Intel® Distribution of OpenVINO™ toolkit 2021.3.394 installed on Ubuntu 20.04 LTS.

 

I ran the following command to convert the TF2 SavedModel to IR format:

python3 mo_tf.py --saved_model_dir /home/ubuntu/tfcache/063d866c06683311b44b4992fd46003be952409c/ --output_dir /tmp/ --log_level DEBUG

 

The attached file has the full debug log and the error I am getting. Any help here would be appreciated.

Error message:

[ ERROR ] Cannot infer shapes or values for node "StatefulPartitionedCall/text_preprocessor/add_bigrams/SparseToDense/CastDefaultValue".
[ ERROR ] invalid literal for int() with base 10: ''
[ ERROR ]
[ ERROR ] It can happen due to bug in custom shape infer function <function Cast.infer at 0x7fafeebcfa60>.
[ ERROR ] Or because the node inputs have incorrect values/shapes.
[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ 2021-05-22 06:54:52,019 ] [ DEBUG ] [ infer:197 ] Node "StatefulPartitionedCall/text_preprocessor/add_bigrams/SparseToDense/CastDefaultValue" attributes: {'kind': 'op', 'op': 'Cast', 'type': 'Convert', 'version': 'opset1', 'infer': <function Cast.infer at 0x7fafeebcfa60>, 'type_infer': <function Cast.type_infer at 0x7fafeebcf9d0>, 'dst_type': <class 'numpy.int32'>, 'in_ports_count': 1, 'out_ports_count': 1, 'name': 'StatefulPartitionedCall/text_preprocessor/add_bigrams/SparseToDense/CastDefaultValue', 'dim_attrs': ['channel_dims', 'spatial_dims', 'batch_dims', 'axis'], 'shape_attrs': ['window', 'stride', 'output_shape', 'pad', 'shape'], 'IE': [('layer', [('id', <function Op.substitute_ie_attrs.<locals>.<lambda> at 0x7faf9a524670>), 'name', 'type', 'version'], [('data', [('destination_type', <function Cast.backend_attrs.<locals>.<lambda> at 0x7faf9a524550>)], []), '@ports', '@consts'])], '_in_ports': {0: {}}, '_out_ports': {0: {}}, 'is_output_reachable': True, 'is_undead': False, 'is_const_producer': False, 'is_partial_inferred': False}
[ ERROR ] Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "StatefulPartitionedCall/text_preprocessor/add_bigrams/SparseToDense/CastDefaultValue" node.
For more information please refer to Model Optimizer FAQ, question #38. (https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html?question=38#question-38)
[ 2021-05-22 06:54:52,020 ] [ DEBUG ] [ main:361 ] Traceback (most recent call last):
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/middle/passes/infer.py", line 135, in partial_infer
node.infer(node)
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/extensions/ops/Cast.py", line 58, in infer
new_blob, finite_match_count, zero_match_count = convert_blob(node.in_node(0).value, dst_type)
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/middle/passes/convert_data_type.py", line 95, in convert_blob
converted_blob = blob.astype(dtype=dst_type, casting="unsafe")
ValueError: invalid literal for int() with base 10: ''

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 288, in apply_transform
replacer.find_and_replace_pattern(graph)
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/extensions/middle/PartialInfer.py", line 33, in find_and_replace_pattern
partial_infer(graph)
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/middle/passes/infer.py", line 198, in partial_infer
raise Error('Stopped shape/value propagation at "{}" node. '.format(node.soft_get('name')) +
mo.utils.error.Error: Stopped shape/value propagation at "StatefulPartitionedCall/text_preprocessor/add_bigrams/SparseToDense/CastDefaultValue" node.
For more information please refer to Model Optimizer FAQ, question #38. (https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html?question=38#question-38)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/main.py", line 345, in main
ret_code = driver(argv)
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/main.py", line 309, in driver
ret_res = emit_ir(prepare_ir(argv), argv)
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/main.py", line 252, in prepare_ir
graph = unified_pipeline(argv)
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/pipeline/unified.py", line 25, in unified_pipeline
class_registration.apply_replacements(graph, [
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 340, in apply_replacements
apply_replacements_list(graph, replacers_order)
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 326, in apply_replacements_list
apply_transform(
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/utils/logger.py", line 124, in wrapper
function(*args, **kwargs)
File "/opt/intel/openvino_2021.3.394/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 306, in apply_transform
raise Error('Exception occurred during running replacer "{}" ({}): {}'.format(
mo.utils.error.Error: Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "StatefulPartitionedCall/text_preprocessor/add_bigrams/SparseToDense/CastDefaultValue" node.
For more information please refer to Model Optimizer FAQ, question #38. (https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html?question=38#question-38)
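From the traceback, my reading is that the Cast ("Convert") node is being fed the SparseToDense default value, which is an empty string, and NumPy cannot cast an empty string to int32. A minimal reproduction of just that cast (my guess at the root cause, not verified inside Model Optimizer itself):

import numpy as np

# The failing Cast node calls convert_blob(), which boils down to
# blob.astype(dst_type). An empty-string element cannot be parsed as an
# integer, which matches "invalid literal for int() with base 10: ''".
blob = np.array('')                              # 0-d string array holding ''
blob.astype(dtype=np.int32, casting="unsafe")    # raises ValueError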

 

Thanks

Kishor

Iffa_Intel
Moderator

Hi,


FYI, you need to ensure that your model's topology is supported by OpenVINO.

You may refer to this link: https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_TensorFlow.html


If it's not listed there, the model is not supported and the conversion failure is expected.
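If you would like to see exactly which operations your SavedModel uses before comparing against that list, a sketch along these lines may help (assuming the model exposes a serving_default signature; ops inside nested functions such as StatefulPartitionedCall bodies are collected from the graph's function library):

import tensorflow as tf

# Load the SavedModel and list the distinct op types it uses, so they can
# be compared against the Model Optimizer supported-operations list.
model = tf.saved_model.load(
    "/home/ubuntu/tfcache/063d866c06683311b44b4992fd46003be952409c/")
fn = model.signatures["serving_default"]      # assumed signature name
graph_def = fn.graph.as_graph_def()

ops = {node.op for node in graph_def.node}
for func in graph_def.library.function:       # ops in nested functions
    ops.update(node.op for node in func.node_def)

print(sorted(ops))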



Sincerely,

Iffa


Iffa_Intel
Moderator

Greetings,


Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question. 


Sincerely,

Iffa

