Wang__Jimmy
Beginner

OpenVINO: Model Optimizer and Inference Engine for an Arm embedded system

I am working on a project that uses an NCS2 on an Arm64 board.

Besides cross-compilation issues, I am also confused about the relationship between the Model Optimizer and the Inference Engine.

Since there is no Model Optimizer in the release for the Raspberry Pi, does that mean the Model Optimizer's output is platform independent,

and I can run the Model Optimizer on an Intel PC and copy the result to the embedded system?

Thanks

Sahira_Intel
Moderator

Hi Jimmy,

Yes, you can convert your model to IR using the model optimizer on a separate system, then transfer the resulting .xml and .bin files to your Raspberry Pi and run inference there. Just make sure that the Model Optimizer and Inference Engine are the same version. 
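The workflow above can be sketched roughly as follows. This is a minimal, hedged example: the model filename, output directory, and host names are placeholders, and the exact path to `mo.py` depends on your OpenVINO installation. `--data_type FP16` is used because the NCS2 (MYRIAD plugin) runs FP16 models.

```shell
# On the Intel PC (full OpenVINO toolkit with Model Optimizer installed):
# convert the trained model to IR, producing model.xml and model.bin
python3 mo.py --input_model model.onnx --data_type FP16 --output_dir ir/

# Copy the resulting IR files to the Raspberry Pi (hostname is a placeholder)
scp ir/model.xml ir/model.bin pi@raspberrypi:~/models/

# On the Pi, the Inference Engine loads the IR and runs it on the NCS2
# via the MYRIAD device plugin.
```

No Model Optimizer is needed on the Pi itself; only the Inference Engine runtime from the Raspberry Pi release has to be installed there.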

Please let me know if you have any further questions!

Best Regards,

Sahira 
