Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

OpenVINO Conversion Error ONNX | Torch Model

pankajrawat
Novice

During conversion of an ONNX model to OpenVINO we are facing some issues. Can you please help us here?

 

(cenv) u47404@s099-n004:~/intelmac$ python /opt/intel/openvino/deployment_tools/model_optimizer/mo.py --input_model models_stats/torch/timeseries_enode_tcnn_l6.onnx --output_dir models_stats/torch --input_shape [128,80,15]
Model Optimizer arguments:
Common parameters:
        - Path to the Input Model:      /home/u47404/intelmac/models_stats/torch/timeseries_enode_tcnn_l6.onnx
        - Path for generated IR:        /home/u47404/intelmac/models_stats/torch
        - IR output name:       timeseries_enode_tcnn_l6
        - Log level:    ERROR
        - Batch:        Not specified, inherited from the model
        - Input layers:         Not specified, inherited from the model
        - Output layers:        Not specified, inherited from the model
        - Input shapes:         [128,80,15]
        - Mean values:  Not specified
        - Scale values:         Not specified
        - Scale factor:         Not specified
        - Precision of IR:      FP32
        - Enable fusing:        True
        - Enable grouped convolutions fusing:   True
        - Move mean values to preprocess section:       None
        - Reverse input channels:       False
ONNX specific parameters:
Model Optimizer version:        2021.1.0-1237-bece22ac675-releases/2021/1
2020-10-18 23:32:18.068320: W tensorflow/stream_executor/platform/default/dso_loader.cc:59] Could not load dynamic library 'libcudart.so.10.1'; dlerror: libcudart.so.10.1: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /opt/intel/openvino/data_processing/dl_streamer/lib:/opt/intel/openvino/data_processing/gstreamer/lib:/opt/intel/openvino/opencv/lib:/opt/intel/openvino/deployment_tools/ngraph/lib:/opt/intel/openvino/deployment_tools/inference_engine/external/hddl_unite/lib:/opt/intel/openvino/deployment_tools/inference_engine/external/hddl/lib:/opt/intel/openvino/deployment_tools/inference_engine/external/gna/lib:/opt/intel/openvino/deployment_tools/inference_engine/external/mkltiny_lnx/lib:/opt/intel/openvino/deployment_tools/inference_engine/external/tbb/lib:/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64:/opt/intel/openvino/data_processing/dl_streamer/lib:/opt/intel/openvino/data_processing/gstreamer/lib:/opt/intel/openvino/opencv/lib:/opt/intel/openvino/deployment_tools/ngraph/lib:/opt/intel/openvino/deployment_tools/inference_engine/external/hddl_unite/lib:/opt/intel/openvino/deployment_tools/inference_engine/external/hddl/lib:/opt/intel/openvino/deployment_tools/inference_engine/external/gna/lib:/opt/intel/openvino/deployment_tools/inference_engine/external/mkltiny_lnx/lib:/opt/intel/openvino/deployment_tools/inference_engine/external/tbb/lib:/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64
2020-10-18 23:32:18.068367: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
[ ERROR ]  -------------------------------------------------
[ ERROR ]  ----------------- INTERNAL ERROR ----------------
[ ERROR ]  Unexpected exception happened.
[ ERROR ]  Please contact Model Optimizer developers and forward the following information:
[ ERROR ]  Exception occurred during running replacer "fusing (<class 'extensions.middle.fusings.Fusing'>)": 
[ ERROR ]  Traceback (most recent call last):
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 286, in apply_transform
    replacer.find_and_replace_pattern(graph)
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/extensions/middle/fusings.py", line 57, in find_and_replace_pattern
    for_graph_and_each_sub_graph_recursively(graph, fuse_pad)
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/middle/pattern_match.py", line 58, in for_graph_and_each_sub_graph_recursively
    func(graph)
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/middle/passes/conv.py", line 73, in fuse_pad
    action=pad_op_transform
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/middle/pattern_match.py", line 95, in apply_pattern
    action(graph, match)
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/middle/passes/conv.py", line 45, in pad_op_transform
    if pads[get_features_dim(op.graph.graph['layout'], input_tensor_dims)] != 0 or \
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/front/common/layout.py", line 84, in get_features_dim
    assert 4 <= shape_len <= 5
AssertionError

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/main.py", line 298, in main
    ret_code = driver(argv)
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/main.py", line 265, in driver
    ret_res = emit_ir(prepare_ir(argv), argv)
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/main.py", line 234, in prepare_ir
    graph = unified_pipeline(argv)
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/pipeline/unified.py", line 29, in unified_pipeline
    class_registration.ClassType.BACK_REPLACER
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 334, in apply_replacements
    apply_replacements_list(graph, replacers_order)
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 324, in apply_replacements_list
    num_transforms=len(replacers_order))
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/utils/logger.py", line 124, in wrapper
    function(*args, **kwargs)
  File "/opt/intel/openvino_2021.1.110/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 312, in apply_transform
    )) from err
Exception: Exception occurred during running replacer "fusing (<class 'extensions.middle.fusings.Fusing'>)": 

[ ERROR ]  ---------------- END OF BUG REPORT --------------
[ ERROR ]  -------------------------------------------------

 

The issue seems to be due to the following line. The Pad operation it produces runs on a 3-D input (our --input_shape is [128,80,15]), while the Model Optimizer pad-fusion pass appears to assume a 4-D or 5-D layout, hence the assert 4 <= shape_len <= 5:

self.pad = torch.nn.ZeroPad2d((padding, 0, 0, 0))
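For context, this is the usual causal-padding pattern in a TCN block. A minimal sketch of that pattern (hypothetical and simplified, not our exact model) that ends up exporting a standalone Pad node followed by a Conv node on a 3-D tensor:

import torch
import torch.nn as nn

# Hypothetical sketch of a causal TCN block (not the actual model):
# ZeroPad2d((padding, 0, 0, 0)) left-pads only the last (time) axis of a
# 3-D (batch, channels, time) tensor before a Conv1d, so the ONNX export
# contains a separate Pad op feeding the Conv op.
class CausalConvBlock(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size, dilation=1):
        super().__init__()
        padding = (kernel_size - 1) * dilation
        self.pad = nn.ZeroPad2d((padding, 0, 0, 0))  # (left, right, top, bottom)
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):  # x: (batch, channels, time)
        return self.conv(self.pad(x))

x = torch.randn(128, 80, 15)  # matches --input_shape [128,80,15]
print(CausalConvBlock(80, 64, kernel_size=3)(x).shape)  # torch.Size([128, 64, 15])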

 

Can you please help?

Iffa_Intel
Moderator

Greetings,


First and foremost, before proceeding further, please ensure that your topology and PyTorch/Paddle models are listed here: https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_ONNX.html


The models listed there are validated by OpenVINO, and you should be good to go if those requirements are fulfilled.


Please also check whether your framework layers are supported: https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_Supported_Frameworks_Layers.html
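To see which operator types your ONNX file actually contains (so they can be compared against that page), here is a minimal sketch using the onnx Python package:

import onnx
from collections import Counter

# List the ONNX operator types used in the exported model so they can be
# checked against the supported-layers documentation above.
model = onnx.load("models_stats/torch/timeseries_enode_tcnn_l6.onnx")
op_counts = Counter(node.op_type for node in model.graph.node)
for op_type, count in sorted(op_counts.items()):
    print(op_type, count)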


Sincerely,

Iffa


Iffa_Intel
Moderator

Greetings,


Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question.


Sincerely,

Iffa

