Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

Unsupported primitive of type: Resample name: up_sampling2d_4/ResizeNearestNeighbor

Smistad__Erik
Beginner

Hi

I am using the UpSampling2D layer in Keras, which uses the ResizeNearestNeighbor layer in TensorFlow. The model optimizer is able to convert it, but when I execute with OpenVINO in C++ it gives me the following error:

Unsupported primitive of type: Resample name: up_sampling2d_4/ResizeNearestNeighbor

I find this strange, as this layer is listed as supported on this page (see no. 49): https://software.intel.com/en-us/articles/OpenVINO-Using-TensorFlow#tensorflow-supported-layers
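
For reference, a minimal tf.keras snippet along these lines produces a ResizeNearestNeighbor node in the exported graph (a toy model for illustration only, not the actual network from this thread; standalone Keras with the TensorFlow backend behaves the same for this layer):

from tensorflow import keras

# Toy model for illustration only - UpSampling2D uses nearest-neighbour
# interpolation by default, which TensorFlow exports as ResizeNearestNeighbor
model = keras.Sequential([
    keras.layers.Conv2D(8, 3, padding="same", activation="relu", input_shape=(64, 64, 3)),
    keras.layers.UpSampling2D(size=(2, 2)),
])
model.summary()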

I am using Ubuntu Linux on an Intel i7-6700HQ, with the 2018 R5 release (2018.5.445) of the OpenVINO SDK.

Any assistance on this matter would be appreciated.

 

Monique_J_Intel
Employee

Hi Erik,

Can you upload the following:

  • the command used for the model optimizer conversion
  • a zip file with your pre-trained model files and the IR model files (.xml + .bin)
  • the application you are running, or if it is a sample from the package, please specify the sample and the device you are targeting (CPU, iGPU, etc.)

Kind Regards,

Monique Jones

Smistad__Erik
Beginner

* Command for model optimizer conversion:

python3 mo_tf.py --input_model /xxx/transfer_learning_model_april_2019.pb --batch 1 --output_dir /xxx/models/

* Files are attached

* The application I'm running is FAST (https://github.com/smistad/fast); the OpenVINO-specific code is here: https://github.com/smistad/FAST/blob/ca95829b949d72d5a5cc34bd4e58deefc10097b6/source/FAST/Algorithms/NeuralNetwork/OpenVINOEngine.cpp

* The error only occurs when the device is CPU; with the GPU it works (see the quick check sketched below).
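
For reference, the supported-layers check from the Python samples of that release can confirm exactly which layer the stock CPU plugin rejects before any extension is loaded (this uses the Python API rather than the C++ one FAST uses; a rough sketch with placeholder IR file names):

from openvino.inference_engine import IENetwork, IEPlugin

# Placeholder file names - point these at the IR produced by the model optimizer
net = IENetwork(model="model.xml", weights="model.bin")

plugin = IEPlugin(device="CPU")
supported = plugin.get_supported_layers(net)
unsupported = [name for name in net.layers if name not in supported]
print(unsupported)  # the Resample layer should show up here until a CPU extension is added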

om77
New Contributor I

Erik,

For CPU mode, did you also try loading the CPU extension plugin (libcpu_extension_....so)?

Shubha_R_Intel
Employee

Dear Erik,

Please modify one of the OpenVINO samples to resemble the code below:

https://github.com/smistad/FAST/blob/ca95829b949d72d5a5cc34bd4e58deefc10097b6/source/FAST/Algorithms/NeuralNetwork/OpenVINOEngine.cpp

Or maybe one of our existing OpenVINO samples will work out of the box (but I doubt it).

If you can supply a main.cpp that contains the code logic, reproduces the issue, and works within the OpenVINO sample infrastructure, that would expedite debugging.

Thanks!

Shubha

Shubha_R_Intel
Employee

Dear Erik, what om77 says also often does the trick for the 'Unsupported Primitive' error. Just add -l <PATH_TO cpu_extension.dll> (or cpu_extension.so).

Hope it helps,

Thanks,

Shubha

 

 

任__鹏飞
Beginner

I have the same question.

Were you able to solve it?

任__鹏飞
Beginner

Hey, I solved this problem using Shubha R. (Intel)'s method.

Find the CPU extension DLL in the installation directory. I installed to the default path:

cpu_dll_path = r"C:\Program Files (x86)\IntelSWTools\openvino_2019.1.087\deployment_tools\inference_engine\bin\intel64\Release\cpu_extension_avx2.dll"

Then call the plugin method to load the extension:

plugin.add_cpu_extension(cpu_dll_path)

After that, plugin.load() no longer gives an error.

 

Smistad__Erik
Beginner

I was able to fix it with Shubha's suggestion as well:

#ifdef WIN32
// Load the CPU extension library, which implements layers (such as Resample)
// that are not built into the core CPU plugin
auto extension_ptr = make_so_pointer<::InferenceEngine::IExtension>("libcpu_extension.dll");
#else
auto extension_ptr = make_so_pointer<::InferenceEngine::IExtension>("libcpu_extension.so");
#endif
// Register the extension with the plugin before loading the network
m_inferencePlugin->AddExtension(extension_ptr);

Thanks!

Now I have another issue though: on the GPU this network model gives the wrong answer, while on the CPU it now works correctly.

azmat
Beginner

Any thoughts on how to fix this in Ubuntu Linux (either 16 or 18)?

I have `OpenVINO_2019.3.334 (R3)` installed, and under `/opt/intel/openvino_fpga_2019.3.334/deployment_tools/inference_engine/lib/intel64` I have `libcpu_extension_avx2.so`, `libcpu_extension_avx512.so`, and `libcpu_extension_sse4.so`. I've tried adding this path to my `$LD_LIBRARY_PATH`, but to no avail. I've also tried re-calling the plugin to reload the plugin path, as follows:

self.plugin = IEPlugin('CPU')
cpu_ext_path = r"/opt/intel/openvino_fpga_2019.3.334/deployment_tools/inference_engine/lib/intel64/libcpu_extension_avx2.so"
self.plugin.add_cpu_extension(cpu_ext_path)

Unfortunately, I still get this error:

`RuntimeError: Unsupported primitive of type: Resample name: ssh_c3_up`

The original layer is type "Upsampling", which gets renamed to "Resample", per the remapping semantics for MXNet described in:
https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Supported_Frameworks_Layers.html#mxnet_supported_symbols_and_the_mapping_to_the_intermediate_representation_layers
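
One way to narrow this down might be to query the supported layers after calling add_cpu_extension, to check whether the extension is actually registering the Resample primitive (a rough sketch with placeholder IR file names, assuming get_supported_layers is still available on IEPlugin in 2019 R3):

from openvino.inference_engine import IENetwork, IEPlugin

plugin = IEPlugin('CPU')
plugin.add_cpu_extension("/opt/intel/openvino_fpga_2019.3.334/deployment_tools/inference_engine/lib/intel64/libcpu_extension_avx2.so")

# Placeholder IR file names
net = IENetwork(model="model.xml", weights="model.bin")

supported = plugin.get_supported_layers(net)
print([name for name in net.layers if name not in supported])
# If "ssh_c3_up" still appears, the extension loads but does not cover that layer;
# if add_cpu_extension itself raises, the .so (or one of its dependencies) is not being found.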

Thanks,

Azmat
