Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Cannot forward() a net using OpenVINO Intermediate Representation files, but can use the ONNX file the IR was made from

Dutta_Roy__Souptik

I am stuck on a curious problem with the OpenVINO Model Optimizer (MO). I have a custom ONNX network which I would like to optimize using the MO, and I am using the OpenCV build that ships with OpenVINO to perform the final inference.

First, I convert my ONNX network to its equivalent IR representation:

`python mo.py --input_model E:\cv_align.dll --framework onnx --output_dir E:\models\b1 --log_level DEBUG > log.txt`

The output looks fine.

```
Model Optimizer arguments:
Common parameters:
    - Path to the Input Model:     E:\cv_align.dll
    - Path for generated IR:     E:\models\b1
    - IR output name:     cv_align
    - Log level:     DEBUG
    - Batch:     Not specified, inherited from the model
    - Input layers:     Not specified, inherited from the model
    - Output layers:     Not specified, inherited from the model
    - Input shapes:     Not specified, inherited from the model
    - Mean values:     Not specified
    - Scale values:     Not specified
    - Scale factor:     Not specified
    - Precision of IR:     FP32
    - Enable fusing:     True
    - Enable grouped convolutions fusing:     True
    - Move mean values to preprocess section:     False
    - Reverse input channels:     False
ONNX specific parameters:
Model Optimizer version:     2019.2.0-436-gf5827d4

[ SUCCESS ] Generated IR model.
[ SUCCESS ] XML file: E:\models\b1\cv_align.xml
[ SUCCESS ] BIN file: E:\models\b1\cv_align.bin
[ SUCCESS ] Total execution time: 30.65 seconds.
```

I take the generated XML/BIN pair and load it into my C++ program, which crashes when I forward the model with dummy inputs. If I instead build the network directly from the ONNX file, it works as expected.

Like so,

```
static cv::dnn::Net alignNet;

int main()
{
    //initialise
    //auto out = Align_init("E:\\cv_align.dll", 1);
    auto out = Align_init("E:\\models\\b1\\cv_align.xml",
        "E:\\models\\b1\\cv_align.bin", 1);

    return 0;
}

///THIS ONE CRASHES AT POINT SHOWN
ALIGN_OUT Align_init(std::string xmlPath, std::string binPath, int batch_size)
{
    assert(std::experimental::filesystem::exists(xmlPath));
    assert(std::experimental::filesystem::exists(binPath));

    alignNet = cv::dnn::readNetFromModelOptimizer(xmlPath, binPath);
    alignNet.dumpToFile("E:\\models\\dump.dmp");

    alignNet.setPreferableBackend(cv::dnn::DNN_BACKEND_INFERENCE_ENGINE);
    alignNet.setPreferableTarget(cv::dnn::DNN_TARGET_OPENCL);

    //initialise using dummy vars
    std::cout << "  --  Dummy image -- " << std::endl;
    Image dummyImage1 = generateRandMat();
    Image dummyImage2 = generateRandMat();

    Image cImage = getCombinedImage(dummyImage1, dummyImage2);
    auto dummyInput = imgToBlob(cImage);
    alignNet.setInput(dummyInput);
    auto dummyProb = alignNet.forward();  // <== throws a Microsoft C++ exception: InferenceEngine::details::InferenceEngineException


    return ALIGN_OUT();
}

///THIS ONE'S FINE
ALIGN_OUT Align_init(std::string onnx_path, int batch_size)
{
    assert(batch_size == 1);
    //TODO: other batch sizes if required

    alignNet = cv::dnn::readNetFromONNX(onnx_path);
    alignNet.dumpToFile("E:\\models\\dump1.dmp");

    alignNet.setPreferableBackend(cv::dnn::DNN_BACKEND_INFERENCE_ENGINE);
    alignNet.setPreferableTarget(cv::dnn::DNN_TARGET_OPENCL);

    //initialise using dummy vars
    std::cout << "  --  Dummy image -- " << std::endl;
    Image dummyImage1 = generateRandMat();
    Image dummyImage2 = generateRandMat();

    Image cImage = getCombinedImage(dummyImage1, dummyImage2);
    auto dummyInput = imgToBlob(cImage);
    alignNet.setInput(dummyInput);
    auto dummyProb = alignNet.forward();  // <== this one is fine

    return ALIGN_OUT();
}
```


Also, just in case:
```
typedef cv::Mat Image;
typedef std::vector<Image> Images;
```
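The helpers used above (generateRandMat, getCombinedImage, imgToBlob) are not shown in the post; here is a minimal sketch of what they might look like, assuming a 3-channel float input, a side-by-side image combination, and a placeholder resolution (none of these details are taken from the original code):

```
// Hypothetical helpers, only to make the snippets above self-contained.
// The 224x224 resolution and the horizontal concatenation are assumptions.
#include <opencv2/core.hpp>
#include <opencv2/dnn.hpp>

static Image generateRandMat()
{
    Image img(224, 224, CV_32FC3);
    cv::randu(img, cv::Scalar::all(0), cv::Scalar::all(1));  // random values in [0, 1)
    return img;
}

static Image getCombinedImage(const Image& a, const Image& b)
{
    Image combined;
    cv::hconcat(a, b, combined);  // stack the two images side by side
    return combined;
}

static cv::Mat imgToBlob(const Image& img)
{
    // NCHW blob with batch size 1, no resizing, mean subtraction, or scaling
    return cv::dnn::blobFromImage(img);
}
```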

Any idea if I am doing something wrong?


Thanks.

----------------------------------------


More info from the dumps:

ONNX dump -
```
digraph G {
    "261" [label="261\nSlice\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "262" [label="262\nConvolution\nkernel_size (HxW): 7 x 7\lstride (HxW): 2 x 2\ldilation (HxW): 1 x 1\lpad (HxW): (3, 3) x (3, 3)\lgroup: 1\lOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "263" [label="263\nBatchNorm\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "264" [label="264\nRelu\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "265" [label="265\nPooling\nkernel_size (HxW): 3 x 3\lstride (HxW): 2 x 2\lpad (HxW): (1, 1) x (1, 1)\lpool: MAX\lOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "266" [label="266\nConvolution\nkernel_size (HxW): 3 x 3\lstride (HxW): 1 x 1\ldilation (HxW): 1 x 1\lpad (HxW): (1, 1) x (1, 1)\lgroup: 1\lOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "267" [label="267\nBatchNorm\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "268" [label="268\nRelu\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "269" [label="269\nConvolution\nkernel_size (HxW): 3 x 3\lstride (HxW): 1 x 1\ldilation (HxW): 1 x 1\lpad (HxW): (1, 1) x (1, 1)\lgroup: 1\lOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "270" [label="270\nBatchNorm\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "271" [label="271\nEltwise\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "272" [label="272\nRelu\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "273" [label="273\nConvolution\nkernel_size (HxW): 3 x 3\lstride (HxW): 1 x 1\ldilation (HxW): 1 x 1\lpad (HxW): (1, 1) x (1, 1)\lgroup: 1\lOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "274" [label="274\nBatchNorm\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "275" [label="275\nRelu\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "276" [label="276\nConvolution\nkernel_size (HxW): 3 x 3\lstride (HxW): 1 x 1\ldilation (HxW): 1 x 1\lpad (HxW): (1, 1) x (1, 1)\lgroup: 1\lOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "277" [label="277\nBatchNorm\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
    "278" [label="278\nEltwise\nOCV/CPU\n" fillcolor="#ffffb3" style=filled shape=box]
....
...
many more lines
```

IR dump -
```
digraph G {
    "545" [label="545\n\nDLIE/CPU\n" fillcolor="#fdb462" style=filled shape=box]

    "_input" -> "545"
}
```

Seems like it just skips every layer in the middle somehow.

 

Also posted on https://stackoverflow.com/questions/58338846/cannot-forward-a-net-using-openvino-intermediate-representation-files-but-can

Shubha_R_Intel
Employee

Dear Dutta Roy, Souptik,

If you are using OpenCV for inference, you are better off posting your issue here:

https://github.com/opencv/opencv/issues

This forum is concerned with OpenVINO, which includes the Model Optimizer and the Inference Engine Core API. I hope you understand. If you can reproduce this error using the Inference Engine Core API, then that would be appropriate for this forum.

Thanks,

Shubha

 

Dutta_Roy__Souptik

It's an OpenCV issue. Tested with the Inference Engine API, the IR works fine.
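For reference, a minimal sketch of the kind of check that confirms the IR itself loads and runs through the Inference Engine directly, assuming the OpenVINO 2019 R2 C++ API (CNNNetReader was the IR reading path in that release); the CPU device choice and running on unset input blobs are assumptions, not necessarily what was actually tested:

```
#include <inference_engine.hpp>
#include <iostream>

int main()
{
    try {
        // Read the generated IR directly with the Inference Engine API
        InferenceEngine::CNNNetReader reader;
        reader.ReadNetwork("E:\\models\\b1\\cv_align.xml");
        reader.ReadWeights("E:\\models\\b1\\cv_align.bin");
        InferenceEngine::CNNNetwork network = reader.getNetwork();

        // Load the network onto a device (CPU here; "GPU" would mirror DNN_TARGET_OPENCL)
        InferenceEngine::Core core;
        auto execNet = core.LoadNetwork(network, "CPU");
        auto request = execNet.CreateInferRequest();

        // Run a forward pass on the default input blobs, just to check that inference succeeds
        request.Infer();
        std::cout << "IR forward pass succeeded" << std::endl;
    } catch (const std::exception& e) {
        std::cerr << "Inference Engine error: " << e.what() << std::endl;
        return 1;
    }
    return 0;
}
```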
