Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all computer vision-related topics on Intel® platforms.

Custom GPU Operations opset name

Senfter__Thomas
Beginner

Hi,

I created a custom operation for the Inference Engine (https://github.com/accessio-gmbh/arivo_custom_openvino_layers) following https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_Extensibility_DG_Intro.html (without the Model Optimizer parts).

While the CPU extension works fine, the GPU version fails with:
"Cannot create WarpAffine layer stn_loc id:151 from unsupported opset: custom_layers"

The layer in the .xml:

<layer id="503" name="warp" precision="FP32" type="WarpAffine" version="custom_layers">

Loading the GPU extension:

iie_core_.SetConfig({{InferenceEngine::PluginConfigParams::KEY_CONFIG_FILE, "/opt/iie/custom_layers.xml"}}, "GPU");

My fix was to load the CPU extension into the CPU plugin, even though I do not use the CPU plugin at all.

auto extension_ptr = InferenceEngine::make_so_pointer<InferenceEngine::IExtension>("/opt/iie/libcustom_cpu_extensions.so");
instance_->iie_core_.AddExtension(extension_ptr, "CPU");


Is this the correct way to do it? Or can I specify the opset somewhere in the custom layer .xml?

Thanks
Thomas

Iffa_Intel
Moderator

Greetings,


The process is described in the "How to Implement Custom GPU Operations" article: https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_Extensibility_DG_GPU_Kernel.html

Please ensure that you follow all the required steps there.


In addition, here is further information on operation sets: https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_IR_and_opsets.html



Sincerely,

Iffa


Senfter__Thomas
Beginner

Hi,

I checked your links, but those steps are not the problem. The problem is that nGraph does not recognize my "custom_layers" opset unless the extension is loaded. I figured out that I can also load the extension into the GPU plugin using

auto extension_ptr = InferenceEngine::make_so_pointer<InferenceEngine::IExtension>("/opt/iie/libcustom_cpu_extensions.so");
instance_->iie_core_.AddExtension(extension_ptr, "GPU");

So I guess the extension has to be added to some plugin so that the opset becomes known to nGraph?

Thanks
Thomas

Iffa_Intel
Moderator

If you take a look at the CPU extensibility section, you will find that:

  1. All custom kernels for the CPU plugin should be inherited from the InferenceEngine::ILayerExecImpl interface.
  2. Next, the implementation constructor checks the parameters of the nGraph operation, stores the needed attributes, and stores an error message in case of an error.
  3. Then, the custom kernel implementation is registered in the Extension class, and the AddExtension method of the general plugin interface is used to load your primitives.
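As a rough illustration of step 3, an extension that makes a custom opset name visible to nGraph might look like the sketch below. It targets the 2021-era Inference Engine API; the WarpAffineOp class name is assumed (your repository defines the actual op), and the other virtual methods of IExtension (getImplTypes, getImplementation) are omitted for brevity:

```cpp
// Minimal sketch, not a verified implementation.
#include <inference_engine.hpp>
#include <ngraph/opsets/opset.hpp>

class CustomLayersExtension : public InferenceEngine::IExtension {
public:
    // getOpSets() is what lets the IR reader resolve version="custom_layers":
    // the map key must match the "version" attribute in the layer's .xml.
    std::map<std::string, ngraph::OpSet> getOpSets() override {
        std::map<std::string, ngraph::OpSet> opsets;
        ngraph::OpSet opset;
        opset.insert<WarpAffineOp>();     // register the custom nGraph op (assumed class)
        opsets["custom_layers"] = opset;  // the opset name used in the IR
        return opsets;
    }

    void GetVersion(const InferenceEngine::Version*& versionInfo) const noexcept override {
        static const InferenceEngine::Version version{{2, 1}, "1.0", "custom_layers_ext"};
        versionInfo = &version;
    }

    void Unload() noexcept override {}
};
```

This would explain the behavior you observed: the opset registration lives in the extension library, so it only becomes known to the Core once the library is loaded via AddExtension, regardless of which device you pass.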



Hence, to answer your question: yes, you need to add the extension to the required plugin.


Make sure to check out the CPU section after the GPU one: https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_Extensibility_DG_CPU_Kernel.html


and the custom ngraph section: https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_Extensibility_DG_AddingNGraphOps.html
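Putting the pieces together, the loading side might look like the following sketch (the library and config paths mirror the ones from the question; this assumes the 2021-era Inference Engine API and is not a verified snippet):

```cpp
#include <inference_engine.hpp>

InferenceEngine::Core core;

// 1. Load the extension library so nGraph learns the "custom_layers" opset.
//    The device name affects which plugin gets the kernel implementations;
//    the opsets returned by getOpSets() are registered with the Core either way.
auto extension_ptr = InferenceEngine::make_so_pointer<InferenceEngine::IExtension>(
    "/opt/iie/libcustom_cpu_extensions.so");
core.AddExtension(extension_ptr, "GPU");

// 2. Point the GPU plugin at the OpenCL kernel descriptions for the custom ops.
core.SetConfig({{InferenceEngine::PluginConfigParams::KEY_CONFIG_FILE,
                 "/opt/iie/custom_layers.xml"}}, "GPU");
```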



Sincerely,

Iffa


Senfter__Thomas
Beginner
Iffa_Intel
Moderator

Hi,


Glad that helped.

If you have no other inquiries, shall I close this thread?



Sincerely,

Iffa


Iffa_Intel
Moderator

Greetings,


Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question.



Sincerely,

Iffa

