Hi There,
From what I have read, model loading and inference are handled internally, but is there any way I can target a specific Movidius device for each of my models? I need to distribute the inference targets, and I have three models to work with. On my CPU host I loaded all three models onto 2 NCS2 sticks and ran inference successfully, but the same code throws NC_ERROR when running on a Raspberry Pi 3B. I am really confused about why that is happening. To debug this, I need to know which model is loaded on which device so I can see what went wrong. Please help.
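For reference, my loading code looks roughly like the sketch below (model file names are placeholders, and it assumes a recent OpenVINO release where IECore.read_network is available):

```python
# Minimal sketch: all three networks are loaded with a plain "MYRIAD" target,
# so the plugin decides internally which stick each network ends up on.
from openvino.inference_engine import IECore

ie = IECore()

# Placeholder IR paths; replace with your own models.
model_files = ["model_a.xml", "model_b.xml", "model_c.xml"]

exec_nets = []
for xml_path in model_files:
    net = ie.read_network(model=xml_path, weights=xml_path.replace(".xml", ".bin"))
    # "MYRIAD" without a suffix lets OpenVINO pick any available NCS2 stick.
    exec_nets.append(ie.load_network(network=net, device_name="MYRIAD"))
```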
Hi Saurav,
Can you please confirm that you have followed the instructions to add the USB rules for the Raspberry Pi?
Best Regards,
Surya
Yes, I did.
I solved it. First I plugged in one NCS2, let OpenVINO load two models onto it, and started inference; then I plugged in the second NCS2 and did the same for the third model. It is running successfully.
I would still like to know how that happens internally, and whether I can configure which NCS2 each model is loaded on.
Thanks.
Hi Saurav,
Please refer to Multiple NCS devices on OpenVINO.
You may also refer to a similar thread.
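As a rough illustration (the device name suffixes, e.g. "MYRIAD.1.2-ma2480", depend on your OpenVINO version and USB topology, so treat this as a sketch rather than exact code), you can enumerate the sticks and pin each network to one explicitly:

```python
# Sketch: list the available MYRIAD devices and load each network onto a
# specific stick instead of the generic "MYRIAD" target.
from openvino.inference_engine import IECore

ie = IECore()
myriad_devices = [d for d in ie.available_devices if d.startswith("MYRIAD")]
print("Detected MYRIAD devices:", myriad_devices)

# Placeholder IR paths; replace with your own models.
models = ["model_a.xml", "model_b.xml", "model_c.xml"]

exec_nets = []
for i, xml_path in enumerate(models):
    net = ie.read_network(model=xml_path, weights=xml_path.replace(".xml", ".bin"))
    # Round-robin the three networks over however many sticks were found.
    device = myriad_devices[i % len(myriad_devices)]
    exec_nets.append(ie.load_network(network=net, device_name=device))
    print(f"Loaded {xml_path} on {device}")
```

Printing which device each network was loaded on should also help you debug which model is on which stick.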
Best Regards,
Surya