Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

Mask R-CNN on Intel Neural Compute Stick 2 Error

ShadiAndisheh
Beginner

Hi

I am trying to run Mask R-CNN R-50-FPN.onnx on the NCS2. I converted it successfully to IR format (both FP32 and FP16) and ran inference on the CPU with correct results. However, when I set the device to MYRIAD, the following error occurs:

" Traceback (most recent call last):
File "inference.py", line 123, in <module>
exec_net_onnx = ie.load_network(network = net, device_name = 'MYRIAD')
File "ie_api.pyx", line 372, in openvino.inference_engine.ie_api.IECore.load_network
File "ie_api.pyx", line 390, in openvino.inference_engine.ie_api.IECore.load_network
RuntimeError: [ GENERAL_ERROR ]
/home/jenkins/agent/workspace/private-ci/ie/build-linux-ubuntu18/b/repos/openvino/inference-engine/src/vpu/common/src/ngraph/transformations/dynamic_to_static_shape.cpp:199 DynamicToStaticShape transformation encountered dynamic node ROIFeatureExtractor_2 of type ExperimentalDetectronROIFeatureExtractor ver. 6, but only [Ceiling ver. 0, Clamp ver. 0, Concat ver. 0, Convert ver. 0, Exp ver. 0, ExpGatherElements ver. 0, Floor ver. 0, Log ver. 0, MatMul ver. 0, Relu ver. 0, ...] types are supported for dynamic nodes "

I now have two questions:

1. Does this problem occur because Mask R-CNN computations are incompatible with the VPU?

2. Or is it caused by the size of the model? If so, would optimizing the Mask R-CNN model make it work? I knew the model was big, but my main goal was to accelerate single-image inference on the NCS2, so I even implemented it with a custom dataset and a quantized backbone in PyTorch (inheriting some parts from the PyTorch Mask R-CNN repo and modifying other parts myself). It is very important for me to know whether I can run inference on the NCS2 in the end, since this is part of my thesis and my time is limited. Many people have told me to switch to another model such as SSD, but I cannot change the Mask R-CNN model, because the goal was to run this model, modify the ROI head, and so on, and accelerate it. I have spent more than three weeks so far on a lot of searching and trying, but I am still not successful. I would really appreciate any help, hints, or advice on this issue.
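As a side note, one thing I did while debugging was to list which layer types my IR contains, so I could spot ops the MYRIAD plugin rejects (like ExperimentalDetectronROIFeatureExtractor from the error above). This is just a rough sketch that parses the model XML with Python's standard library; the inline IR fragment below is a toy stand-in for a real model.xml:

```python
import xml.etree.ElementTree as ET
from collections import Counter

def layer_type_counts(ir_xml: str) -> Counter:
    """Count how many layers of each type appear in an OpenVINO IR XML document."""
    root = ET.fromstring(ir_xml)
    return Counter(layer.get("type") for layer in root.iter("layer"))

# Toy IR fragment standing in for a real model.xml:
toy_ir = """
<net name="toy" version="10">
  <layers>
    <layer id="0" name="input" type="Parameter"/>
    <layer id="1" name="conv1" type="Convolution"/>
    <layer id="2" name="roi" type="ExperimentalDetectronROIFeatureExtractor"/>
    <layer id="3" name="out" type="Result"/>
  </layers>
</net>
"""

counts = layer_type_counts(toy_ir)
for layer_type, n in sorted(counts.items()):
    print(f"{layer_type}: {n}")
```

Running this against the real model.xml showed the ExperimentalDetectronROIFeatureExtractor layers that the MYRIAD plugin complains about.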

Thank you so much in advance

3 Replies
Wan_Intel
Moderator

Hi ShadiAndisheh,

Thank you for reaching out to us.

 

Are you using the latest version of Intel® Distribution of OpenVINO™ Toolkit?

 

Referring to Public Pre-Trained Models Device Support, we regret to inform you that Mask R-CNN is supported only by the CPU plugin and the GPU plugin; it is not supported by the MYRIAD plugin.

 

On another note, the supported models for the MYRIAD plugin are listed on the following pages:

·      https://docs.openvino.ai/latest/omz_models_public_device_support.html#public-pre-trained-models-devi...

·      https://docs.openvino.ai/latest/omz_models_intel_device_support.html#intel-s-pre-trained-models-devi...

 

 

Regards,

Wan


Wan_Intel
Moderator

Hi ShadiAndisheh,

 

Could you please share the following information with us so that we can further assist you?

 

·      The conversion command that you used to convert the ONNX model to Intermediate Representation (IR).

·      The IR files (both XML and BIN files)

·      Which demo application are you using? Can you share your scripts with us?

 

 

Regards,

Wan


Wan_Intel
Moderator

Hi ShadiAndisheh,

Thank you for your question.

 

If you need any additional information from Intel, please submit a new question as this thread is no longer being monitored.

 

 

Regards,

Wan

