Hi all,
How do I convert an ONNX model with custom layers? I couldn't find this in the documentation.
Thanks
Dear Grady, take a look at the _ext files under here:
https://github.com/opencv/dldt/tree/2018/model-optimizer/extensions/front/onnx
Take a look at the ops files under here:
https://github.com/opencv/dldt/tree/2018/model-optimizer/mo/ops
Once you have an _ext.py file for extracting attributes and an op.py file for the operation (an important role of op.py is inferring output shapes from input shapes), you should be able to create IR using the Model Optimizer. Next, however, you will need to create matching Inference Engine CPU and GPU extensions.
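To make the shape-inference role of op.py concrete, here is a minimal standalone sketch. The real op.py would subclass the Model Optimizer's `Op` class and register an `infer` function on the node; the function name and ArgMax-style semantics below are invented for illustration only.

```python
# Hypothetical sketch of the shape-inference logic an op.py provides.
# For an ArgMax-style reduction: the reduced axis either collapses to
# size 1 (keepdims) or disappears from the output shape entirely.

def infer_argmax_like_shape(input_shape, axis, keepdims):
    """Compute the output shape for an ArgMax-style reduction.

    input_shape: list of ints, e.g. [1, 3, 224, 224]
    axis: dimension to reduce over
    keepdims: if True, the reduced axis stays with size 1
    """
    out = list(input_shape)
    if keepdims:
        out[axis] = 1      # axis survives with size 1
    else:
        out.pop(axis)      # axis is removed from the shape
    return out
```

In the real op.py, the Model Optimizer calls the registered infer function during graph transformation so that every downstream node knows its incoming shapes.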
Look here for CPU extensions:
https://github.com/opencv/dldt/tree/2018/inference-engine/src/extension
and here for GPU extensions:
C:\Intel\computer_vision_sdk_2018.5.456\inference_engine\bin\intel64\Release\cldnn_global_custom_kernels
For an example, study the argmax Caffe example:
http://caffe.berkeleyvision.org/tutorial/layers/argmax.html
here is argmax_ext.py
https://github.com/opencv/dldt/blob/2018/model-optimizer/extensions/front/caffe/argmax_ext.py
here is the argmax op:
https://github.com/opencv/dldt/blob/2018/model-optimizer/extensions/ops/argmax.py
and here is the argmax CPU extension:
https://github.com/opencv/dldt/blob/2018/inference-engine/src/extension/ext_argmax.cpp
Basically, you need an _ext.py and an op.py for the Model Optimizer, a *.cpp extension on the Inference Engine side for CPU, and an OpenCL extension (*.cl) for GPU.
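For the _ext.py side, the job is to pull the custom layer's attributes off the ONNX node and fill in defaults. A real extractor subclasses the Model Optimizer's front extractor class; this is a simplified standalone sketch, and the attribute names and validation rule are hypothetical.

```python
# Hypothetical sketch of attribute extraction, as an _ext.py would do it
# before handing the attributes to the op. The node's attributes arrive
# here as a plain dict for illustration.

def extract_custom_attrs(node_attrs):
    """Return the attribute dict the op needs, applying defaults
    and basic validation (simplified stand-in for a real extractor)."""
    attrs = {
        'axis': int(node_attrs.get('axis', 0)),        # default: axis 0
        'keepdims': bool(node_attrs.get('keepdims', 1)),  # default: keep
    }
    if not -4 <= attrs['axis'] <= 3:
        raise ValueError('axis out of range for a 4D input')
    return attrs
```

The extracted dict is what ends up serialized as the layer's parameters in the IR XML, which is why the extractor and the op must agree on attribute names.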
Unfortunately, we do not currently support custom layers for Myriad.
Hope this helps, and thank you for using OpenVINO.
Shubha
