Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

[Bug report] Invalid IR generated from ONNX containing Deconvolution

Lundgaard__Thomas

Hi,

When I use the model optimizer on the attached ONNX model, the generated intermediate representation is invalid. When loading the network into the inference engine I get

Exception: Error reading network: Error of validate layer: Convolution2 with type: Deconvolution. Cannot parse parameter pads_end  from IR for layer Convolution2. Value -1,-1 cannot be casted to int.
../include/ie_layers.h:338
..\src\inference_engine\ie_layer_valida
c:\intel\computer_vision_sdk\deployment_tools\inference_engine\include\details\ie_exception_conversion.hpp:71

I have OpenVINO version 2018.5.456 on Windows 10 and use Python 3.6.8. The ONNX model was exported from the latest version of Microsoft CNTK.
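
For reference, the loading step that triggers the exception looks roughly like this (a minimal sketch against the 2018 R5 Python API; the file names upsampler.xml/upsampler.bin are placeholders for the IR produced by the Model Optimizer):

from openvino.inference_engine import IENetwork, IEPlugin

model_xml = "upsampler.xml"   # IR generated by the Model Optimizer
model_bin = "upsampler.bin"

# Reading and validating the IR is where the exception above is raised
# (on earlier releases, IENetwork.from_ir(model=..., weights=...) instead)
net = IENetwork(model=model_xml, weights=model_bin)

plugin = IEPlugin(device="CPU")
exec_net = plugin.load(network=net)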

Regards,

Thomas

Shubha_R_Intel
Employee

Hello Thomas. This is a bug. 

Model Optimizer generated IR successfully from your ONNX model. However, within the XML file, please see the following:

<data dilations="1,1" group="1" kernel="3,3" output="10" pads_begin="1,1" pads_end="-1,-1" strides="2,2"/>

The OpenVINO documentation at https://software.intel.com/en-us/articles/OpenVINO-ModelOptimizer, under the sidebar entry Advanced Topics about the Model Optimizer Internals / Intermediate Representation Notation Reference Catalog / Convolution Layer, gives this description of pad:

Parameter name: pad (pad-x, pad-y)

Description: pad (pad-x, pad-y) is a number of pixels to add to the left (top) of the input. For example, pad (pad-x, pad-y) equal 1 (1, 1) means adding 1 pixel to the left of the input. Right and bottom padding should be calculated from the expected output width (height).

Range of values: integer values starting from 0

The issue is those pads_end="-1,-1" values.
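
One quick way to confirm this is to scan the generated IR for negative padding values, for example with a small script like the following (a minimal sketch; the file name upsampler.xml is assumed for your IR):

import xml.etree.ElementTree as ET

tree = ET.parse("upsampler.xml")
for layer in tree.getroot().iter("layer"):
    data = layer.find("data")
    if data is None:
        continue
    pads_end = data.get("pads_end", "")
    # Report any layer whose pads_end contains a negative value
    if any(int(v) < 0 for v in pads_end.split(",") if v.strip()):
        print(layer.get("name"), layer.get("type"), pads_end)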

Another way to see this is to build the Visual Studio 2017 samples and run classification_sample.exe as follows (just pass in any folder containing images):

.\classification_sample.exe -i C:\Intel\other-models\customlayer\pics -m C:\Intel\computer_vision_sdk_2018.5.445\deployment_tools\model_optimizer\upsampler.xml

The error below is the same as the one described above.

[ INFO ] InferenceEngine:
        API version ............ 1.4
        Build .................. 19154
[ INFO ] Parsing input parameters
[ INFO ] Files were added: 2
[ INFO ]     C:\Intel\other-models\customlayer\pics/cat.bmp
[ INFO ]     C:\Intel\other-models\customlayer\pics/dog.bmp
[ INFO ] Loading plugin

        API version ............ 1.5
        Build .................. win_20181005
        Description ....... MKLDNNPlugin
[ INFO ] Loading network files:
        C:\Intel\computer_vision_sdk_2018.5.445\deployment_tools\model_optimizer\upsampler.xml
        C:\Intel\computer_vision_sdk_2018.5.445\deployment_tools\model_optimizer\upsampler.bin
[ ERROR ] Error reading network: Error of validate layer: Convolution2 with type: Deconvolution. Cannot parse parameter pads_end  from IR for layer Convolution2. Value -1,-1 cannot be casted to int.

Lundgaard__Thomas

Hi Shubha,

Yes, that is the same behaviour I am observing. I am able to run the Model Optimizer on the ONNX model, but the generated XML has those negative padding values. As far as I can tell, this is a bug in the Model Optimizer? (Or the ONNX exported from CNTK could also be invalid, but I have no way of checking that.)

The documentation at the link you provided also does not correspond to the IR XML that is actually generated. For example, the documentation mentions the attributes "pad-x" and "pad-y", while the generated XML has the attributes "pads_begin" and "pads_end".

Thanks for looking into this,
Thomas

Shubha_R_Intel
Employee

Thomas, thanks for replying. I have inquired about a workaround for this issue. Thanks for your patience.

Shubha_R_Intel
Employee

Hi Thomas. I have edited my original statement from "This is not a bug" to "This is a bug". The dev team acknowledged that this is a bug, and it has already been fixed for the next release. Thank you for using OpenVINO!

 

Shubha

Lundgaard__Thomas

Great to hear that this is already fixed. Thanks!
