I have two sticks: an Intel Neural Compute Stick and an Intel Neural Compute Stick 2, and I want to use both sticks to run inference. The first scenario is using the two sticks to infer one CNN model, with the operating system dispatching the computation load to each stick automatically.
The other scenario is to infer a complete CNN model on each stick respectively: for example, one stick infers model1 and the other stick infers model2. Can I manually dispatch these tasks to each stick?
Can these two scenarios be realized, and how do I do that in Python? Can you offer some sample code?
In the OpenVINO samples, I only found Python code that infers one model on one stick.
Both use cases are supported by OpenVINO. Basically, you create multiple ExecutableNetwork instances, one instance per device. Note, though, that nothing in the API ties an ExecutableNetwork to a particular device.
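To make the "one instance per device, dispatched manually" idea concrete, here is a minimal sketch of the threading pattern. The two stub functions stand in for the real OpenVINO ExecutableNetwork.infer calls (in real code each would come from something like `ie.load_network(net, device_name="MYRIAD.x")`); the function names and placeholder results are assumptions for illustration only.

```python
import threading

# Stand-ins for two ExecutableNetwork instances, one loaded per stick.
# With the real OpenVINO Python API these would be created roughly as:
#   ie = IECore()
#   exec_net1 = ie.load_network(net1, device_name="MYRIAD.x")  # first stick
#   exec_net2 = ie.load_network(net2, device_name="MYRIAD.y")  # second stick
def infer_on_stick1(inputs):
    return {"model1_output": sum(inputs)}   # placeholder for exec_net1.infer(...)

def infer_on_stick2(inputs):
    return {"model2_output": max(inputs)}   # placeholder for exec_net2.infer(...)

results = {}

def worker(name, infer_fn, inputs):
    # Each thread drives one device, so both sticks stay busy concurrently.
    results[name] = infer_fn(inputs)

t1 = threading.Thread(target=worker, args=("stick1", infer_on_stick1, [1, 2, 3]))
t2 = threading.Thread(target=worker, args=("stick2", infer_on_stick2, [4, 5, 6]))
t1.start(); t2.start()
t1.join(); t2.join()

print(results["stick1"], results["stick2"])
```

The same structure covers both scenarios from the question: for scenario two, each thread gets its own model; for scenario one, both threads share one model and you split the input batch between them.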
For Python sample code, you can refer to this repo: multiple_device_ncs2_async.py, which shows how to run one CNN model on multiple NCS2 devices via multiple threads.
Hi Yuanyuan, I couldn't open the URL you gave, "multiple_device_ncs2_async.py". If it is convenient for you, could you send the sample code "multiple_device_ncs2_async.py" to my e-mail? My email is: email@example.com. Thanks again.
> Just wondering, can Python infer multiple models in OpenVINO?
Yes, OpenVINO can infer multiple models on the same device, or infer one model per device. If you infer multiple models on one NCS2, you may hit its memory limit.
Thanks Yuanyuan. Now I want to use Python to infer multiple models. I have looked through all the Python samples in OpenVINO, but every sample infers just one model. Where can I find a Python sample that infers multiple models?
There is not much difference between the C++ and Python interfaces for inferring multiple models. For C++, you can take a look at crossroad_camera_demo.hpp for details.
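In Python the pattern is the same as in that C++ demo: load one network per model and chain the inference calls, feeding the output of the first model into the second. Below is a rough sketch with plain callables standing in for the loaded networks; the function names (`detector`, `classifier`) and their toy logic are illustrative assumptions, not OpenVINO API.

```python
# Two models loaded on the same device, represented here by plain callables.
# With the real OpenVINO Python API each would be an ExecutableNetwork, e.g.:
#   detector   = ie.load_network(det_net, "MYRIAD")
#   classifier = ie.load_network(cls_net, "MYRIAD")
def detector(frame):
    # Pretend to find two regions of interest in the frame.
    return [frame[0:2], frame[2:4]]

def classifier(crop):
    # Pretend to classify one cropped region.
    return "person" if sum(crop) > 3 else "background"

frame = [1, 2, 3, 4]
# Chain the two models: detector output feeds the classifier.
labels = [classifier(roi) for roi in detector(frame)]
print(labels)
```

This is the same model-chaining structure the crossroad camera demo uses (a detection network followed by attribute networks), just reduced to stubs so the flow is visible.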
tang, chu wrote:
Hi Yuanyuan, I have found "multiple_device_ncs2_async.py" on GitHub, thanks. Just wondering, can Python infer multiple models in OpenVINO?
Where did you find it? I'm coming up empty searching GitHub and Google.