
Error while converting model from onnx to IR

bondarenko__mikhail

I am trying to convert an ONNX model to IR:

python3 mo.py --input_model /home/www/frompytorchtoonnx/output/ep_180_sim_autoencoder.onnx
Model Optimizer arguments:
Common parameters:
	- Path to the Input Model: 	/home/www/frompytorchtotensorflow/output/ep_180_sim_autoencoder.onnx
	- Path for generated IR: 	/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/.
	- IR output name: 	ep_180_sim_autoencoder
	- Log level: 	ERROR
	- Batch: 	Not specified, inherited from the model
	- Input layers: 	Not specified, inherited from the model
	- Output layers: 	Not specified, inherited from the model
	- Input shapes: 	Not specified, inherited from the model
	- Mean values: 	Not specified
	- Scale values: 	Not specified
	- Scale factor: 	Not specified
	- Precision of IR: 	FP32
	- Enable fusing: 	True
	- Enable grouped convolutions fusing: 	True
	- Move mean values to preprocess section: 	False
	- Reverse input channels: 	False
ONNX specific parameters:
Model Optimizer version: 	1.5.12.49d067a0

The virtualenv contains these libraries:

decorator==4.3.0
networkx==2.2
numpy==1.16.0
onnx==1.1.2
pkg-resources==0.0.0
protobuf==3.6.1
six==1.12.0
typing==3.6.6
typing-extensions==3.7.2

And I get an error:

[ ERROR ]  -------------------------------------------------
[ ERROR ]  ----------------- INTERNAL ERROR ----------------
[ ERROR ]  Unexpected exception happened.
[ ERROR ]  Please contact Model Optimizer developers and forward the following information:
[ ERROR ]  local variable 'new_attrs' referenced before assignment
[ ERROR ]  Traceback (most recent call last):
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/front/extractor.py", line 601, in extract_node_attrs
    supported, new_attrs = extractor(Node(graph, node))
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/pipeline/onnx.py", line 102, in <lambda>
    extract_node_attrs(graph, lambda node: onnx_op_extractor(node, check_for_duplicates(onnx_op_extractors)))
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/front/onnx/extractor.py", line 72, in onnx_op_extractor
    attrs = onnx_op_extractors[op](node)
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/front/onnx/extractors/constant.py", line 26, in onnx_constant_ext
    value = numpy_helper.to_array(pb_value)
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/venv/lib/python3.5/site-packages/onnx/numpy_helper.py", line 33, in to_array
    raise ValueError("The data type is not defined.")
ValueError: The data type is not defined.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/main.py", line 325, in main
    return driver(argv)
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/main.py", line 302, in driver
    mean_scale_values=mean_scale)
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/pipeline/onnx.py", line 102, in driver
    extract_node_attrs(graph, lambda node: onnx_op_extractor(node, check_for_duplicates(onnx_op_extractors)))
  File "/opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/front/extractor.py", line 607, in extract_node_attrs
    new_attrs['name'] if 'name' in new_attrs else '<UNKNOWN>',
UnboundLocalError: local variable 'new_attrs' referenced before assignment

[ ERROR ]  ---------------- END OF BUG REPORT --------------
[ ERROR ]  -------------------------------------------------
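
The underlying exception at the bottom of that traceback ("The data type is not defined.") comes from onnx's numpy_helper.to_array, which raises it when the TensorProto it receives has no data_type set. A minimal sketch of that behaviour (my assumption about the mechanism: the Constant node's value attribute holds plain ints rather than a tensor, so the extractor ends up handing to_array an empty TensorProto):

from onnx import TensorProto, numpy_helper

empty = TensorProto()           # data_type defaults to UNDEFINED
numpy_helper.to_array(empty)    # raises ValueError: The data type is not defined.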

 

But this message is itself caused by a bug inside the exception handler: in extract_node_attrs in /opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer/mo/front/extractor.py (line 607 in the traceback above), the local variable 'new_attrs' is referenced before assignment:

try:
    supported, new_attrs = extractor(Node(graph, node))
except Exception as e:
    raise Error(
        'Unexpected exception happened during extracting attributes for node {}.' +
        '\nOriginal exception message: {}',
        new_attrs['name'] if 'name' in new_attrs else '<UNKNOWN>',
        str(e)
    ) from e
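
Just as an illustration (a sketch only, not the official Model Optimizer fix): the handler could take the node name from the graph attributes instead of from new_attrs, which never gets assigned when extractor() raises:

try:
    supported, new_attrs = extractor(Node(graph, node))
except Exception as e:
    # new_attrs is never assigned if extractor() raised, so fall back to the
    # node's own attribute dict (it carries 'name', see the dump below)
    node_name = graph.node[node].get('name', '<UNKNOWN>')
    raise Error(
        'Unexpected exception happened during extracting attributes for node {}.' +
        '\nOriginal exception message: {}',
        node_name,
        str(e)
    ) from e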

 

If we print some of the variables inside that except block:

Variable "graph" contains:

ep_180_sim_autoencoder

Variable "node" contains:

213

Variable "e" contains:

The data type is not defined.

Variable "graph.node[node]" contains:

{'kind': 'op', 'IE': [('layer', [('id', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7840>), 'name', 'precision', 'type'], [('data', ['auto_pad', 'epsilon', 'min', 'max', ('axis', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f78c8>), 'tiles', ('dim', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7950>), 'num_axes', ('pool-method', 'pool_method'), 'group', ('rounding-type', 'rounding_type'), ('exclude-pad', 'exclude_pad'), 'operation', 'out-size', 'power', 'shift', 'alpha', 'beta', 'coords', 'classes', 'num', ('local-size', 'local_size'), 'region', 'knorm', 'num_classes', 'keep_top_k', 'variance_encoded_in_target', 'code_type', 'share_location', 'nms_threshold', 'confidence_threshold', 'background_label_id', 'top_k', 'eta', 'visualize', 'visualize_threshold', 'save_file', 'output_directory', 'output_name_prefix', 'output_format', 'label_map_file', 'name_size_file', 'num_test_image', 'prob', 'resize_mode', 'height', 'width', 'height_scale', 'width_scale', 'pad_mode', 'pad_value', 'interp_mode', 'img_size', 'img_h', 'img_w', 'step', 'step_h', 'step_w', ('offset', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f79d8>), 'variance', 'flip', 'clip', ('min_size', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7a60>), ('max_size', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7ae8>), ('aspect_ratio', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7b70>), 'decrease_label_id', 'normalized', 'scale_all_sizes', ('type', 'norm_type'), 'eps', 'across_spatial', 'channel_shared', 'negative_slope', 'engine', 'num_filter', ('type', 'sample_type'), ('order', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7bf8>), 'pooled_h', 'pooled_w', 'spatial_scale', 'cls_threshold', 'max_num_proposals', 'iou_threshold', 'min_bbox_size', 'feat_stride', 'pre_nms_topn', 'post_nms_topn', ('type', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7c80>), ('value', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7d08>), ('output', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7d90>), ('input_nodes_names', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7e18>), ('output_tensors_names', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7ea0>), ('real_input_dims', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f7f28>), ('protobuf', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f9048>), {'custom_attributes': None}, ('strides', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f90d0>), ('kernel', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f9158>), ('dilations', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f91e0>), ('pads_begin', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f9268>), ('pads_end', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f92f0>), ('scale', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f9378>), 'crop_width', 'crop_height', 'write_augmented', 'max_multiplier', 'augment_during_test', 'recompute_mean', 'write_mean', 'mean_per_pixel', 'mode', 'bottomwidth', 'bottomheight', 'chromatic_eigvec', 'kernel_size', 'max_displacement', 'stride_1', 'stride_2', 'single_direction', 'do_abs', 'correlation_type', 'antialias', 'resample_type', 'factor', 'coeff', ('ratio', <function update_ie_fields.<locals>.<lambda> at 0x7f61480f9400>)], []), '@ports', '@consts'])], 'is_undead': False, 'precision': 'FP32', 'is_output_reachable': True, 'dim_attrs': ['batch_dims', 'axis', 'channel_dims', 'spatial_dims'], 'op': 
'Constant', 'name': '148', 'is_const_producer': False, 'shape_attrs': ['stride', 'shape', 'pad', 'output_shape', 'window'], 'pb': output: "148"
op_type: "Constant"
attribute {
  name: "value"
  ints: 1
  ints: 1
  type: INTS
}
domain: "org.pytorch.prim"
}
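
So the failing node is a Constant exported from PyTorch (domain "org.pytorch.prim") whose value attribute holds ints (type: INTS) rather than a tensor, which appears to be exactly the case numpy_helper.to_array cannot handle. A quick way to list every such constant in the model (a sketch using the onnx Python package; adjust the path to your file):

import onnx
from onnx import AttributeProto

model = onnx.load('/home/www/frompytorchtoonnx/output/ep_180_sim_autoencoder.onnx')
for node in model.graph.node:
    if node.op_type != 'Constant':
        continue
    for attr in node.attribute:
        # a normal Constant stores its value as a TENSOR attribute;
        # anything else (here INTS) trips the Model Optimizer extractor
        if attr.name == 'value' and attr.type != AttributeProto.TENSOR:
            print(node.output[0], node.domain, AttributeProto.AttributeType.Name(attr.type))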

 
