Hi,
I'm trying to use my own trained network on a Raspberry Pi 3 B+, but it doesn't work well.
On Ubuntu 18.04, I fine-tuned the pre-trained VGG16 model in Keras and converted the model (.h5) into a TensorFlow model (.pb). Then I ran the Model Optimizer to convert the .pb to IR as below:
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo_tf.py \
    --input_model tf_model.pb \
    --batch 1 \
    --output_dir ../deploy/FP16 \
    --data_type FP16 \
    --generate_deprecated_IR_V2
But when I ran a simple inference code using the model on the Raspberry Pi, the following error was raised:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
cv2.error: OpenCV(4.1.0-openvino) /home/jenkins/workspace/OpenCV/OpenVINO/build/opencv/modules/dnn/src/op_inf_engine.cpp:747: error: (-215:Assertion failed) in function 'initPlugin'
> Failed to initialize Inference Engine backend: The plugin does not support networks with MIXED format.
> Supported format: FP32 and FP16.
I don't know why the converted network has MIXED format even though I specified --data_type FP16.
Has anyone faced or solved a similar issue?
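In case it helps anyone debugging the same error: the IR .xml lists a precision attribute per layer, so you can check directly whether the Model Optimizer really emitted a single data type. This is only a sketch with a made-up minimal IR snippet (SAMPLE_IR and ir_precisions are my own illustration, not part of the OpenVINO toolkit):

```python
# Sketch: inspect which precisions appear in an IR .xml file.
# If more than one precision shows up, the plugin will see the
# network as MIXED. SAMPLE_IR below is a hypothetical minimal IR.
import xml.etree.ElementTree as ET

SAMPLE_IR = """<?xml version="1.0"?>
<net name="tf_model" version="2">
  <layers>
    <layer id="0" name="input" type="Input" precision="FP16"/>
    <layer id="1" name="conv1" type="Convolution" precision="FP16"/>
    <layer id="2" name="const1" type="Const" precision="FP32"/>
  </layers>
</net>
"""

def ir_precisions(xml_text):
    """Return the set of layer precisions found in an IR xml string."""
    root = ET.fromstring(xml_text)
    return {layer.get("precision")
            for layer in root.iter("layer")
            if layer.get("precision")}

if __name__ == "__main__":
    # Two distinct precisions here, so this network would be MIXED.
    print(sorted(ir_precisions(SAMPLE_IR)))
```

Running this against the real .xml (read the file instead of SAMPLE_IR) shows at a glance which layers kept FP32.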
System info:
Ubuntu 18.04
Tensorflow 1.13.1
Python 3.7.2 (Ubuntu), 3.5.3 (RPi)
Thank you very much,
Sakata
Dear sakata, atsuya,
Why are you using --generate_deprecated_IR_V2? What version of OpenVINO are you using? We just released OpenVINO 2019 R2 - please try it, it fixes a lot of issues. I assume you are trying to run inference on a MYRIAD device (NCS2) plugged into your RPi3 B+? While the RPi3 B+ works fine as a host, OpenVINO doesn't have an ARM plugin.
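For reference, on 2019 R2 the deprecated IR v2 flag can simply be dropped; a re-run of the Model Optimizer with the same paths and filenames as in the original post would look like this (assuming the default 2019 R2 install location):

```shell
# Re-run the Model Optimizer without --generate_deprecated_IR_V2,
# letting it emit the current IR version for the MYRIAD plugin.
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo_tf.py \
    --input_model tf_model.pb \
    --batch 1 \
    --output_dir ../deploy/FP16 \
    --data_type FP16
```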
Looking forward to hearing your response,
Shubha
Dear Shubha,
I'm sorry for the late reply.
It worked just by updating OpenVINO to 2019 R2!
I was trying to run inference on a MYRIAD device (NCS2), but I had mistakenly installed OpenVINO R1 on the RPi3 B+. That is why I added --generate_deprecated_IR_V2 to the Model Optimizer arguments.
Thank you very much for your comment.
Sakata
Dear sakata, atsuya,
Thanks for following up, and I'm glad OpenVINO 2019 R2 fixed things up for you!
Shubha