Hello:
I have two sticks, one an Intel Neural Compute Stick and the other an Intel Neural Compute Stick 2, and I want to use both sticks to run inference. The first scenario is to use the two sticks to infer a single CNN model, with the system dispatching the computation load to each stick automatically.
The other scenario is to run a complete CNN model on each stick respectively, for example, model1 on one stick and model2 on the other stick. Can I manually dispatch that work to each stick?
Can these two scenarios be realized, and how do I do that in Python? Can you offer some sample code?
In the OpenVINO samples, I only found Python code that infers one model on one stick.
Hi, TangChu,
Both use cases are supported by OpenVINO. Basically, multiple ExecutableNetwork instances need to be created, one instance per device. Note, though, that nothing in the API ties an ExecutableNetwork to one particular device; the application decides how to dispatch work across them.
For Python sample code, you can refer to multiple_device_ncs2_async.py, which shows how to run one CNN model on multiple NCS2 sticks via multiple threads.
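For reference, here is a minimal sketch of that pattern using the classic openvino.inference_engine (IECore) Python API. The model paths, the 1x3x224x224 input shape, and the exact MYRIAD device names are assumptions and will differ on your machine; API details such as read_network/input_info also vary slightly between OpenVINO releases.

```python
# Minimal sketch, assuming the classic openvino.inference_engine (IECore) API
# and a model already converted to IR (model.xml / model.bin are placeholders).
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()

# Enumerate the plugged-in Myriad devices. Names such as "MYRIAD.1.2-ma2450"
# (NCS) or "MYRIAD.1.4-ma2480" (NCS2) are typical, but the exact identifiers
# depend on your USB topology.
myriad_devices = [d for d in ie.available_devices if d.startswith("MYRIAD")]
print("Found devices:", myriad_devices)

net = ie.read_network(model="model.xml", weights="model.bin")
input_name = next(iter(net.input_info))

# One ExecutableNetwork per stick: the same IR is loaded onto each device.
exec_nets = [ie.load_network(network=net, device_name=d) for d in myriad_devices]

# Dispatch inputs round-robin across the sticks; the application (not the
# runtime) decides which stick handles each request in this pattern.
frames = [np.random.rand(1, 3, 224, 224).astype(np.float32) for _ in range(8)]
for i, frame in enumerate(frames):
    exec_net = exec_nets[i % len(exec_nets)]
    result = exec_net.infer(inputs={input_name: frame})
```

Newer OpenVINO releases also offer a MULTI device plugin (for example, device_name="MULTI:MYRIAD.1.2-ma2450,MYRIAD.1.4-ma2480"), which balances inference requests across the listed devices automatically, and this works best with asynchronous requests.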
Thanks, Yuanyuan, but I still have a question: can Python infer multiple models in OpenVINO, or do I have to write C++ to infer multiple models?
Hi, Yuanyuan, I couldn't open the URL you gave for "multiple_device_ncs2_async.py". If it is convenient, could you send the sample code "multiple_device_ncs2_async.py" to my email? My email is chu.tang@anu.edu.au. Thanks again.
Hi, Yuanyuan, I have found "multiple_device_ncs2_async.py" on GitHub, thanks. Just wondering: can Python infer multiple models in OpenVINO?
@tang, chu
> Just wondering: can Python infer multiple models in OpenVINO?
Yes, OpenVINO can infer multiple models on the same device, or one model per device. If you infer multiple models on one NCS2, you may hit its memory limitation.
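As a sketch of the one-model-per-stick case in Python (model paths are placeholders, and the device ordering depends on how IECore enumerates your sticks):

```python
# Minimal sketch: pin each model to its own Myriad stick by naming the
# device explicitly. model1.* / model2.* are placeholder IR paths.
from openvino.inference_engine import IECore

ie = IECore()
myriad_devices = [d for d in ie.available_devices if d.startswith("MYRIAD")]
assert len(myriad_devices) >= 2, "expected two Neural Compute Sticks"

net1 = ie.read_network(model="model1.xml", weights="model1.bin")
net2 = ie.read_network(model="model2.xml", weights="model2.bin")

# Each ExecutableNetwork is loaded onto a specific stick.
exec1 = ie.load_network(network=net1, device_name=myriad_devices[0])
exec2 = ie.load_network(network=net2, device_name=myriad_devices[1])

# exec1.infer(...) and exec2.infer(...) can now be called independently,
# e.g. from separate threads, so each stick runs its own model.
```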
Thanks, Yuanyuan. Now I want to use Python to infer multiple models. I have looked at all the Python samples in OpenVINO, but every sample infers only one model. Where can I find a Python sample that infers multiple models?
@tang, chu
There is not much difference between the C++ and Python interfaces when it comes to inferring multiple models. For C++, you can take a look at crossroad_camera_demo.hpp for details.
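In Python the pattern is simply to read and load each network separately and chain the infer calls. Below is a rough two-model sketch loosely following the crossroad_camera_demo structure (a detector followed by a second network); the model paths, image path, and the passing of the whole frame to the second model are placeholders, since parsing the detector output depends on your model's output format.

```python
# Minimal two-model sketch. detector.* / classifier.* / image.jpg are
# placeholder paths; both networks here go to "MYRIAD", but you can pass
# specific device names (e.g. "MYRIAD.1.2-ma2480") to pin them to sticks.
import cv2
from openvino.inference_engine import IECore

def preprocess(image, info):
    # Resize to the network's NCHW input shape and reorder HWC -> CHW.
    n, c, h, w = info.input_data.shape
    blob = cv2.resize(image, (w, h)).transpose(2, 0, 1)
    return blob.reshape(n, c, h, w)

ie = IECore()
det_net = ie.read_network(model="detector.xml", weights="detector.bin")
cls_net = ie.read_network(model="classifier.xml", weights="classifier.bin")
det_exec = ie.load_network(network=det_net, device_name="MYRIAD")
cls_exec = ie.load_network(network=cls_net, device_name="MYRIAD")

det_in = next(iter(det_net.input_info))
cls_in = next(iter(cls_net.input_info))

frame = cv2.imread("image.jpg")
det_out = det_exec.infer(inputs={det_in: preprocess(frame, det_net.input_info[det_in])})
# In a real pipeline you would crop the detected region out of `frame` based
# on det_out before feeding the second model; the whole frame is reused here
# only to keep the sketch short.
cls_out = cls_exec.infer(inputs={cls_in: preprocess(frame, cls_net.input_info[cls_in])})
```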
OK, thanks, Yuanyuan.
tang, chu wrote:
> Hi, Yuanyuan, I have found "multiple_device_ncs2_async.py" on GitHub, thanks. Just wondering: can Python infer multiple models in OpenVINO?
Where did you find it? I'm coming up empty searching GitHub and Google.