How do I convert an ONNX model with custom layers? I could not find any documentation for this.
Dear Grady, take a look at the extractor (_ext) files under here:
and the ops files under here:
Once you have an _ext.py file to extract the layer's attributes and an op.py file for the operation itself (an important role of op.py is inferring output shapes from input shapes), you should be able to create the IR with the Model Optimizer. Next, however, you will need to create matching Inference Engine extensions for CPU and GPU.
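To make the op.py role concrete, here is a hedged sketch of the shape-inference idea, written as plain Python rather than against the real Model Optimizer Op base class. It uses a hypothetical argmax-style operation that reduces one axis:

```python
# Sketch of op.py-style shape inference (not the real Model Optimizer API).
# An argmax-like op drops the reduced axis, or keeps it with size 1
# when keepdims is requested.
def infer_argmax_shape(input_shape, axis, keepdims=False):
    axis = axis % len(input_shape)  # normalize negative axes
    if keepdims:
        return input_shape[:axis] + [1] + input_shape[axis + 1:]
    return input_shape[:axis] + input_shape[axis + 1:]
```

In a real op.py this logic would live in the op's `infer` function, which the Model Optimizer calls while building the IR so that every downstream node knows its input shapes.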
Look here for CPU extensions:
and here for GPU extensions:
For a concrete example, study the Caffe ArgMax layer:
here is argmax_ext.py:
here is the argmax op:
and here is the argmax CPU extension:
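For intuition on what the _ext.py side of the ArgMax example does, here is a hedged, dependency-free sketch: the extractor reads the layer's parameters from the framework definition and normalizes them into a plain attribute dict for the IR. The dict-based `layer_params` input is an illustrative stand-in, not the real Caffe proto object:

```python
# Sketch of an *_ext.py attribute extractor (illustrative, not the real API).
# Defaults mirror Caffe's ArgMax parameters: top_k=1, out_max_val=False.
def extract_argmax_attrs(layer_params):
    return {
        'op': 'ArgMax',
        'axis': int(layer_params.get('axis', -1)),
        'top_k': int(layer_params.get('top_k', 1)),
        'out_max_val': bool(layer_params.get('out_max_val', False)),
    }
```

The real extractor registers itself with the Model Optimizer so that the attributes it returns end up on the node and are serialized into the IR XML.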
Basically, you need an _ext.py and an op.py on the Model Optimizer side, and on the Inference Engine side a *.cpp extension for CPU and an OpenCL kernel (*.cl) for GPU.
Unfortunately, we do not currently support custom layers on Myriad.
Hope this helps, and thank you for using OpenVINO.