Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

OpenVINO: Model Optimizer and Inference Engine for an Arm embedded system

Wang__Jimmy
Beginner

I am working on a project that uses an NCS2 on an Arm64 board.

Besides cross-compilation issues, I am also confused about the relationship between the Model Optimizer and the Inference Engine.

Since there is no Model Optimizer in the release for the Raspberry Pi, does that mean the Model Optimizer output is platform-independent, and that I can run the model optimization on an Intel PC and save the result for the embedded system?

Thanks

Sahira_Intel
Moderator

Hi Jimmy,

Yes, you can convert your model to IR using the Model Optimizer on a separate system, then transfer the resulting .xml and .bin files to your Raspberry Pi and run inference there. Just make sure that the Model Optimizer and the Inference Engine are the same version.
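For reference, here is a minimal sketch of that workflow, assuming an OpenVINO 2020.x-era install and placeholder model file names (my_model.pb, ir/my_model.xml): convert on the PC with the Model Optimizer, requesting FP16 since the NCS2 (MYRIAD plugin) executes FP16 models, then copy the IR to the Pi and load it with the Inference Engine Python API.

    # On the Intel PC: convert the trained model to IR (FP16 for the NCS2).
    #   python3 mo.py --input_model my_model.pb --data_type FP16 --output_dir ir/
    # Then copy ir/my_model.xml and ir/my_model.bin to the Raspberry Pi.

    # On the Raspberry Pi: load the IR and run inference on the NCS2.
    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()
    net = ie.read_network(model="ir/my_model.xml", weights="ir/my_model.bin")
    exec_net = ie.load_network(network=net, device_name="MYRIAD")  # NCS2 device

    input_name = next(iter(net.input_info))
    input_shape = net.input_info[input_name].input_data.shape  # e.g. [1, 3, H, W]
    dummy = np.zeros(input_shape, dtype=np.float32)            # stand-in input
    result = exec_net.infer(inputs={input_name: dummy})
    print(list(result.keys()))                                 # output blob names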

Please let me know if you have any further questions!

Best Regards,

Sahira 
