Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

How to implement ONNX Operators in the extensions

huang__grady
Beginner
Shubha_R_Intel
Employee

Dear Grady, take a look at the _ext.py files here:

https://github.com/opencv/dldt/tree/2018/model-optimizer/extensions/front/onnx

Take a look at the ops files here:

https://github.com/opencv/dldt/tree/2018/model-optimizer/mo/ops

Once you have an _ext.py file to extract the attributes and an op.py file for the operation (an important role of op.py is inferring outgoing shapes from incoming shapes), you should be able to create IR using the Model Optimizer. Next, however, you will need to create matching Inference Engine CPU and GPU extensions.
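To make the division of labor concrete, here is a hedged, stand-alone sketch of what the two Model Optimizer pieces do (the real files subclass FrontExtractorOp and Op from the model-optimizer package; the op name and attributes below are hypothetical, purely for illustration):

```python
# Hedged sketch only: an "extractor" (the _ext.py role) copies attributes
# from the framework layer definition into a dict, and an "op" (the op.py
# role) provides an infer function that derives output shapes from input
# shapes. "MyCustomOp", "alpha", and "axis" are invented for this example.

def extract_attrs(layer_param):
    """Mimics an _ext.py: pull the attributes the op needs into a dict."""
    return {
        'op': 'MyCustomOp',                      # hypothetical op name
        'alpha': layer_param.get('alpha', 1.0),  # hypothetical attribute
        'axis': layer_param.get('axis', 0),
    }

def infer_shapes(input_shapes, attrs):
    """Mimics the op.py infer function: compute output shapes.
    This hypothetical op is element-wise, so shapes pass through."""
    return [list(shape) for shape in input_shapes]

# Usage: extract attributes, then infer the output shape for one input.
attrs = extract_attrs({'alpha': 0.5})
out_shapes = infer_shapes([[1, 3, 224, 224]], attrs)
```

The key point is that without a correct infer function, the Model Optimizer cannot propagate shapes through your custom layer and IR generation will fail.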

Look here for CPU extensions:

https://github.com/opencv/dldt/tree/2018/inference-engine/src/extension

and here for GPU extensions:

C:\Intel\computer_vision_sdk_2018.5.456\inference_engine\bin\intel64\Release\cldnn_global_custom_kernels

For an example, study the argmax Caffe example:

http://caffe.berkeleyvision.org/tutorial/layers/argmax.html

here is argmax_ext.py

https://github.com/opencv/dldt/blob/2018/model-optimizer/extensions/front/caffe/argmax_ext.py

here is the argmax op:

https://github.com/opencv/dldt/blob/2018/model-optimizer/extensions/ops/argmax.py
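The heart of that argmax op file is its shape inference. A hedged, self-contained sketch of what ArgMax shape inference looks like, assuming the Caffe ArgMax semantics from the tutorial linked above (with axis set, that dimension becomes top_k; without axis, the output is [N, 1 or 2, top_k], where 2 applies when out_max_val also returns the values):

```python
def argmax_infer(input_shape, top_k=1, axis=None, out_max_val=False):
    """Sketch of ArgMax output-shape inference per Caffe semantics.

    - With an explicit axis, that dimension is replaced by top_k.
    - Without an axis, trailing dimensions are flattened and the result
      is [N, 2 if out_max_val else 1, top_k].
    """
    shape = list(input_shape)
    if axis is not None:
        shape[axis] = top_k
        return shape
    return [shape[0], 2 if out_max_val else 1, top_k]

# Usage
argmax_infer([8, 10], top_k=3)            # no axis -> [8, 1, 3]
argmax_infer([8, 10, 5], top_k=1, axis=1) # axis=1  -> [8, 1, 5]
```

This is the kind of logic the op.py must supply so the Model Optimizer can build a consistent graph around the custom layer.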

and here is the argmax CPU extension:

https://github.com/opencv/dldt/blob/2018/inference-engine/src/extension/ext_argmax.cpp

Basically you need an _ext.py and an op.py on the Model Optimizer side, a *.cpp extension on the Inference Engine side for CPU, and an OpenCL kernel (*.cl) for GPU.

Unfortunately, today we do not support custom layers on Myriad.

Hope it helps, and thank you for using OpenVINO.

Shubha
