Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Error when using my classification model on RPi3 B+: The plugin does not support networks with MIXED format.

sakata__atsuya
Beginner

Hi,

I'm trying to use my own trained network on a Raspberry Pi 3 B+, but it isn't working.

On Ubuntu 18.04, I fine-tuned the pre-trained VGG16 model in Keras and converted the model (.h5) into a frozen TensorFlow graph (.pb). Then I ran the Model Optimizer to convert the .pb to IR as below:

python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo_tf.py \
--input_model tf_model.pb \
--batch 1 \
--output_dir ../deploy/FP16 \
--data_type FP16 \
--generate_deprecated_IR_V2
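
For reference, the .h5 to .pb conversion was the usual TensorFlow 1.x graph freeze, roughly like this (the file name and output node name are just placeholders for my actual model):

from tensorflow.keras import backend as K
from tensorflow.keras.models import load_model
from tensorflow.python.framework import graph_util, graph_io

K.set_learning_phase(0)                    # inference mode (no dropout/BN updates)
model = load_model('finetuned_vgg16.h5')   # placeholder file name

sess = K.get_session()
output_node = model.output.op.name         # e.g. 'dense_2/Softmax'
frozen = graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), [output_node])
graph_io.write_graph(frozen, '.', 'tf_model.pb', as_text=False)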

But when I ran simple inference code (sketched below) using the model on the Raspberry Pi, the following error was raised:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
cv2.error: OpenCV(4.1.0-openvino) /home/jenkins/workspace/OpenCV/OpenVINO/build/opencv/modules/dnn/src/op_inf_engine.cpp:747: error: (-215:Assertion failed) in function 'initPlugin'
> Failed to initialize Inference Engine backend: The plugin does not support networks with MIXED format.
> Supported format: FP32 and FP16.
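
The inference code itself is basically the standard OpenCV DNN flow with the Inference Engine backend; a simplified sketch (paths, image, and preprocessing are placeholders):

import cv2
import numpy as np

# Load the IR produced by the Model Optimizer above
net = cv2.dnn.readNet('tf_model.xml', 'tf_model.bin')
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_INFERENCE_ENGINE)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_MYRIAD)

img = cv2.imread('test.jpg')                          # placeholder image
blob = cv2.dnn.blobFromImage(img, size=(224, 224))    # VGG16 input size
net.setInput(blob)
out = net.forward()                                   # the error above is raised here
print(np.argmax(out))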

I don't know why the converted network has MIXED format even though I specified --data_type FP16.

Has anyone faced or solved a similar issue?

System info:

Ubuntu 18.04

Tensorflow 1.13.1

Python 3.7.2 (Ubuntu), 3.5.3 (RPi)

Thank you very much,

Sakata

3 Replies
Shubha_R_Intel
Employee

Dear sakata, atsuya,

Why are you using --generate_deprecated_IR_V2? What version of OpenVINO are you using? We just released OpenVINO 2019 R2; please try it, as it fixes a lot of issues. I assume you are trying to run inference on a MYRIAD device (NCS2) plugged into your RPi3 B+? While the RPi3 B+ will work fine as a host, OpenVINO doesn't have an ARM plugin.

Looking forward to hearing your response,

Shubha

sakata__atsuya
Beginner

Dear Shubha,

I'm sorry for the late reply.

It worked just by updating OpenVINO to 2019 R2!

I was trying to run inference on a MYRIAD device (NCS2), but I had installed OpenVINO 2019 R1 on the RPi3 B+ by mistake. That is why I added --generate_deprecated_IR_V2 to the Model Optimizer arguments.
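
For anyone hitting the same error: with 2019 R2 the deprecated flag should not be needed, so the conversion is just the original command without it:

python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo_tf.py \
--input_model tf_model.pb \
--batch 1 \
--output_dir ../deploy/FP16 \
--data_type FP16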

Thank you very much for your comment.

Sakata

Shubha_R_Intel
Employee

Dear sakata, atsuya,

Thanks for following up, and I'm glad OpenVINO 2019 R2 fixed things up for you!

Shubha

 
