Mehta__Anushka
Beginner

Add custom extensions before converting an ONNX model to OpenVINO format

I am trying to convert an ONNX model with LogSoftmax in the output layer to an XML file by running the Model Optimizer script, mo.py.

However, since LogSoftmax is not among the default supported layers, I get the error: "There is no registered "infer" function for node "170" with op = "LogSoftmax". Please implement this function in the extensions"

I am aware that there are C++ implementations of these custom layers in the extensions folder, and I rebuilt the CPU extensions to get the libcpu_extension.so file.

But there is not enough information on how to use these extensions when running mo.py. How can this be done?

Shubha_R_Intel
Employee

Dear Mehta, Anushka,

You need to create an _ext (extractor) file under:

C:\Program Files (x86)\IntelSWTools\openvino_2019.2.242\deployment_tools\model_optimizer\extensions\front\onnx
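These extractor files are plain Python. Below is a minimal sketch of what such a file could look like for LogSoftmax. The file name and the exact helper signatures (onnx_attr, update_node_stat, get_op_class_by_name) are assumptions patterned on the ONNX extractors bundled with Model Optimizer, so verify them against your install; the classes in the except branch are only stand-ins that let the sketch run outside an OpenVINO environment.

```python
try:
    # Real imports inside an OpenVINO Model Optimizer install
    from mo.front.extractor import FrontExtractorOp
    from mo.front.onnx.extractors.utils import onnx_attr
    from mo.ops.op import Op
except ImportError:
    # Lightweight stand-ins so this sketch runs standalone
    class FrontExtractorOp:
        pass

    class Op:
        registered_ops = {}

        @classmethod
        def get_op_class_by_name(cls, name):
            return cls.registered_ops[name]

    def onnx_attr(node, name, dst_type, default=None):
        # The real helper decodes the attribute from the ONNX protobuf
        # node; this stand-in just falls back to the default.
        return getattr(node, name, default)


class LogSoftmaxFrontExtractor(FrontExtractorOp):
    op = 'LogSoftmax'   # must match the op name in the MO error message
    enabled = True

    @staticmethod
    def extract(node):
        # ONNX LogSoftmax has a single 'axis' attribute (default 1).
        axis = onnx_attr(node, 'axis', 'i', default=1)
        # Hand the attributes to the matching Op class (registered under
        # mo/ops), which attaches them - and the infer function - to the node.
        Op.get_op_class_by_name('LogSoftmax').update_node_stat(node, {'axis': axis})
        return LogSoftmaxFrontExtractor.enabled
```

Model Optimizer discovers these classes automatically by scanning the extensions folders, so no extra mo.py flag is needed once the file is in place.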

and an ops file under:

C:\Program Files (x86)\IntelSWTools\openvino_2019.2.242\deployment_tools\model_optimizer\mo\ops

with the infer method fully defined in the ops file.
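A matching sketch of the ops file follows; its infer method is the piece the error message asks for. The names are patterned on the bundled ops (e.g. mo/ops/softmax.py) rather than copied from the OpenVINO sources, and the except branch is again just a stand-in for running the sketch standalone. For LogSoftmax, infer only needs to propagate the shape, because the output tensor has exactly the input tensor's shape.

```python
try:
    from mo.ops.op import Op  # real base class inside an OpenVINO install
except ImportError:
    class Op:  # stand-in: merges the mandatory properties with user attrs
        def __init__(self, graph, mandatory_props, attrs=None):
            self.attrs = dict(mandatory_props)
            self.attrs.update(attrs or {})


class LogSoftmax(Op):
    op = 'LogSoftmax'

    def __init__(self, graph, attrs):
        super().__init__(graph, {
            'op': LogSoftmax.op,
            'type': LogSoftmax.op,
            'axis': 1,                  # ONNX default axis
            'infer': LogSoftmax.infer,  # the function MO said was missing
            'in_ports_count': 1,
            'out_ports_count': 1,
        }, attrs)

    @staticmethod
    def infer(node):
        # LogSoftmax is shape-preserving: output shape == input shape.
        node.out_node().shape = node.in_node().shape.copy()
```

With both files in place, rerunning mo.py on the same model should get past the "no registered infer function" error for LogSoftmax.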

Hope it helps,

Thanks,

Shubha
