I am doing a project that uses the NCS2 on an Arm64 board.
Besides cross-compilation issues, I am also confused about the relationship between the Model Optimizer and the Inference Engine.
Since there is no Model Optimizer in the release for the Pi, does that mean the Model Optimizer's output is platform-independent,
and that I can run the Model Optimizer on an Intel PC and save the result for the embedded system?
Thanks
1 Reply
Hi Jimmy,
Yes, you can convert your model to IR (Intermediate Representation) with the Model Optimizer on a separate system, then transfer the resulting .xml and .bin files to your Raspberry Pi and run inference there. Just make sure that the Model Optimizer and the Inference Engine are the same version.
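
As a rough sketch of that workflow (the file names and Pi hostname here are placeholders, not part of the original thread): run the Model Optimizer on the Intel PC, then copy the IR files over. Note that the NCS2 (MYRIAD plugin) expects FP16 weights, so converting with `--data_type FP16` is the usual choice.

```shell
# On the Intel PC with the full OpenVINO toolkit installed
# (model.onnx and paths below are assumptions for illustration):
python3 mo.py --input_model model.onnx --data_type FP16

# Copy the resulting IR pair to the Raspberry Pi:
scp model.xml model.bin pi@raspberrypi:~/models/
```

On the Pi, the Inference Engine then loads the .xml/.bin pair directly and runs it on the NCS2 via the MYRIAD device plugin; the Model Optimizer itself is never needed on the target.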
Please let me know if you have any further questions!
Best Regards,
Sahira