Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Arbitrary code on MyriadX?

pmx
Novice

Env: OpenVINO 2021.2 (the same problem occurs on 2021.3), Ubuntu 18.04

Hi, my goal is to run arbitrary converted PyTorch code on the MyriadX.

In the basic example below, I create a PyTorch module that multiplies two matrices given as inputs. The module is exported to ONNX format (Test.onnx):

import torch
import torch.nn as nn

class Test(nn.Module):
    def __init__(self):
        super(Test, self).__init__()
        
    def forward(self, x, y):
        res = torch.mm(x,y)
        return res

def export_onnx():
    """
    Exports the model to an ONNX file.
    """
    model = Test()
    X = torch.randn(3, 4, dtype=torch.float)
    Y = torch.randn(4, 2, dtype=torch.float)
    print(model(X,Y))
    torch.onnx.export(
        model,
        (X, Y),
        'Test.onnx',
        opset_version=9,
        do_constant_folding=True,
        input_names=['X', 'Y'],
        verbose=True
    )

if __name__ == "__main__":
    export_onnx()

Then I convert Test.onnx to OpenVINO IR format with mo_onnx.py, and finally compile the IR files into a blob with compile_tool:

source /opt/intel/openvino_2021/bin/setupvars.sh

convert_model () {
    model_onnx=$1
    model_name=$(basename -s .onnx $model_onnx)

    # Convert the ONNX model to OpenVINO IR (FP16)
    python3 /opt/intel/openvino_2021/deployment_tools/model_optimizer/mo_onnx.py \
        --input_model $model_onnx --data_type half \
        --input "X,Y" \
        --input_shape "[3,4],[4,2]"

    # Compile the IR into a MYRIAD blob
    /opt/intel/openvino_2021/deployment_tools/tools/compile_tool/compile_tool -d MYRIAD \
        -m ${model_name}.xml \
        -VPU_NUMBER_OF_SHAVES 4 \
        -VPU_NUMBER_OF_CMX_SLICES 4 \
        -o ${model_name}.blob
}
convert_model Test.onnx

 

When I run the compile_tool command, I get the following error message:

Network inputs:
    X : FP16 / NC
    Y : FP16 / NC
Network outputs:
    3 : FP16 / NC

[Warning][VPU][Config] Deprecated option was used : VPU_MYRIAD_PLATFORM
Failed to reshape Network: Check 'Dimension::merge(merged_dimension, arg0_col_dim, arg1_row_dim) || arg0_col_dim.is_dynamic() || arg1_row_dim.is_dynamic()' failed at ngraph/core/src/op/matmul.cpp:118:
Incompatible MatMul matrix dimension. First input dimension=4 at COL_INDEX_DIM=1 doesn't match the second input dimension=1 at ROW_INDEX_DIM=0

 

Do you know where this error comes from? Does compile_tool expect only inputs whose layout is NCHW or NHWC?
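
In case it helps, here is how I inspect the shapes actually recorded in the generated IR with the Inference Engine Python API (a minimal sketch; it assumes the IR files produced above are named Test.xml and Test.bin):

from openvino.inference_engine import IECore

# Minimal check: print the input/output shapes stored in the IR,
# to see whether X and Y kept their [3,4] and [4,2] shapes after mo_onnx.py.
ie = IECore()
net = ie.read_network(model="Test.xml", weights="Test.bin")
for name, info in net.input_info.items():
    print("input :", name, info.input_data.shape)
for name, data in net.outputs.items():
    print("output:", name, data.shape)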

Thank you.

2 Replies
IntelSupport
Moderator

Hi pmx,

Thanks for reaching out. That might happen because the ONNX network is not supported by the Inference Engine MYRIAD plugin.

Check out the following page to find the networks supported by the MYRIAD plugin:

https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_supported_plugins_MYRIAD.html

Meanwhile, the error might also come from how the input shapes are inferred during the model loading and infer_network phase. Have you tried this with CPU or GPU? Does it compile successfully there?
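
For example, a quick way to check is to load the same IR on CPU with the Inference Engine Python API (a minimal sketch, assuming the IR files are named Test.xml and Test.bin):

from openvino.inference_engine import IECore

# Minimal check: if loading on CPU also fails at the reshape step,
# the problem is in the model/IR rather than in the MYRIAD plugin.
ie = IECore()
net = ie.read_network(model="Test.xml", weights="Test.bin")
exec_net = ie.load_network(network=net, device_name="CPU")
print("Compiled on CPU, inputs:", list(net.input_info.keys()))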

 

Furthermore, according to the OpenVINO documentation for the compile tool, if the input layout is NCHW the output layout will be NHWC. To read the ONNX model directly, refer to the ONNX format support section of the OpenVINO™ documentation.
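
For reference, recent OpenVINO releases can also read an ONNX file directly, without converting it to IR first (a minimal sketch, assuming the Test.onnx exported above):

from openvino.inference_engine import IECore

# Minimal check: read the ONNX model directly and print the input shapes
# the Inference Engine assigns to it.
ie = IECore()
net = ie.read_network(model="Test.onnx")
for name, info in net.input_info.items():
    print(name, info.input_data.shape)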

 

Regards,

Aznie


IntelSupport
Moderator

Hi pmx,

This thread will no longer be monitored since we have provided a solution. If you need any additional information from Intel, please submit a new question.


Regards,

Aznie

