Hello,
I am working with a Google AIY Vision Kit, which has an Intel® Movidius™ Myriad 2 MA2450 VPU. I am trying to compile a vision model to run on this kit, but the compiler from Google has some issues, so I am trying to use OpenVINO instead.
I am pretty new to the OpenVINO toolkit. I installed the latest version on my laptop using the Docker container, and I managed to use the model optimizer (mo.py) to optimize the model and produce the .xml and .bin files, as well as perform inference on the CPU via a Python script.
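For context, the CPU inference script I am using is roughly along these lines (the model path and input shape are placeholders for my actual model):

import numpy as np
from openvino.runtime import Core

# Load the IR produced by mo.py; the .bin file is picked up automatically
# from the same folder as the .xml file.
core = Core()
model = core.read_model("model.xml")
compiled_model = core.compile_model(model, "CPU")

# Dummy input just to exercise the pipeline; the shape must match the model's input.
input_tensor = np.random.rand(1, 3, 224, 224).astype(np.float32)
result = compiled_model([input_tensor])[compiled_model.output(0)]
print(result.shape)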
Since the vision kit uses a Raspberry Pi, it would be ideal to be able to compile the model on my laptop and then pass just the executable to the Pi to perform inference on the VPU. However, as I understand it, the recommended way to do this is by using the Python library of OpenVINO: https://docs.openvino.ai/2024/openvino-workflow/running-inference/integrate-openvino-with-your-application.html
The Pi Zero W is pretty underpowered, and it is also really difficult to install OpenVINO on it. For the installation, I saw in a forum comment that it is recommended to use the archive, but after installing the dependencies and setting the environment variables, I cannot import the Python library.
So, to sum up my questions: is there a way to compile the model on my laptop and run it on the VPU without needing to install the OpenVINO toolkit on the Pi? If not, what approach should I take to run the model on the VPU?
Thanks in advance.
Hi Alexandros-Stavrop,
Thank you for reaching out.
The latest version of OpenVINO that supports the VPU is OpenVINO 2022.3.2 LTS. Since you have optimized your model to the Intermediate Representation format (.xml and .bin), you will most likely need to use the OpenVINO runtime for inference as well.
Besides that, I'm unable to recommend any method other than OpenVINO.
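Once the 2022.3.2 runtime is installed on the device, switching inference to the VPU is typically just a matter of selecting the MYRIAD device when compiling the model. A minimal sketch (the model path and input shape are placeholders):

import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("model.xml")
# "MYRIAD" selects the Movidius VPU plugin, which is available up to OpenVINO 2022.3.x.
compiled_model = core.compile_model(model, "MYRIAD")

input_tensor = np.random.rand(1, 3, 224, 224).astype(np.float32)
result = compiled_model([input_tensor])[compiled_model.output(0)]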
Regards,
Zul
Hi Zul,
Thank you for your answer. Regarding inference, do you suggest running the model using the Python library? Also, how do you suggest running OpenVINO on the Raspberry Pi Zero?
Best regards,
Alexandros
Hi Alexandros-Stavrop,
The OpenVINO toolkit for Raspbian OS has only been validated on the Raspberry Pi 3 and 4. You can try to build OpenVINO from source, but note that the Raspberry Pi Zero has not been tested; it has limited resources and I cannot guarantee this will work.
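If you do manage to build and install it on the Pi, a quick way to confirm that the runtime and the VPU plugin are usable is to list the available devices. A minimal sketch:

from openvino.runtime import Core

core = Core()
# If the build and the Movidius device setup are correct,
# "MYRIAD" should appear alongside "CPU" in this list.
print(core.available_devices)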
Regards,
Zul
This thread will no longer be monitored since we have provided information on the VPU. If you need any additional information from Intel, please submit a new question.
