Intel® Distribution of OpenVINO™ Toolkit

Is it possible to use one model for multi-threaded inference?

shirleyliu
New Contributor I

Hello, my question is whether it is possible to use the same loaded model for inference from multiple threads (each thread having different input data).

Iffa_Intel
Moderator

Hi,


It should be possible, but it all depends on your software design. It also needs to match your hardware capabilities (make sure your hardware is able to handle the workload you want to run).

You'll need to design the multithreading properly so that the model can serve multiple inference requests without the threads halting or disturbing each other's operations.
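
As a minimal sketch of what I mean (this assumes the OpenVINO 2022+ Python API, a hypothetical model.xml, and a placeholder input shape you would replace with your model's real one), the model is read and compiled once, and each worker thread creates its own InferRequest from the shared CompiledModel:

```python
# Minimal sketch (hypothetical model path and input shape; OpenVINO 2022+ Python API):
# the model is read and compiled once, then every worker thread creates its own
# InferRequest from the shared CompiledModel and runs inference on its own data.
import threading

import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("model.xml")               # hypothetical model path
compiled_model = core.compile_model(model, "CPU")  # compiled once, shared by all threads

def worker(thread_id, input_data):
    # One InferRequest per thread; the single CompiledModel is shared between threads.
    request = compiled_model.create_infer_request()
    results = request.infer({compiled_model.input(0): input_data})
    output = results[compiled_model.output(0)]
    print(f"thread {thread_id}: output shape {output.shape}")

# Hypothetical input shape (1x3x224x224); replace with your model's actual input shape.
threads = [
    threading.Thread(target=worker,
                     args=(i, np.random.rand(1, 3, 224, 224).astype(np.float32)))
    for i in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The important point is that each InferRequest is used by only one thread at a time, while the compiled model itself is loaded and compiled only once.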



Sincerely,

Iffa




shirleyliu
New Contributor I

Hello Iffa,

Is there any example for this scenario?

Iffa_Intel
Moderator

There are OpenVINO sample applications covering multiple models in one inference pipeline, multi-device execution, and so on.

You may refer here.

 

However, there is none specific to multithreaded inference with a single model.

You may refer to these Threading utilities instead.

Another thing to note is this discussion.
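
As a rough sketch of a built-in alternative to managing threads yourself (assuming the OpenVINO 2022+ Python API, a hypothetical model.xml, and a placeholder input shape), AsyncInferQueue keeps a pool of infer requests for one compiled model and runs the jobs in parallel:

```python
# Rough sketch (hypothetical model path and input shape; OpenVINO 2022+ Python API):
# AsyncInferQueue maintains a pool of infer requests for a single compiled model
# and dispatches inference jobs to them concurrently.
import numpy as np
from openvino.runtime import AsyncInferQueue, Core

core = Core()
compiled_model = core.compile_model(core.read_model("model.xml"), "CPU")

# Pool of 4 infer requests sharing the same compiled model.
infer_queue = AsyncInferQueue(compiled_model, 4)

def on_done(request, userdata):
    # Called when a job finishes; userdata is the job index passed to start_async.
    output = request.get_output_tensor(0).data
    print(f"job {userdata}: output shape {output.shape}")

infer_queue.set_callback(on_done)

# Hypothetical batch of inputs; replace the shape with your model's actual input shape.
inputs = [np.random.rand(1, 3, 224, 224).astype(np.float32) for _ in range(8)]
for i, data in enumerate(inputs):
    infer_queue.start_async({compiled_model.input(0): data}, userdata=i)

infer_queue.wait_all()
```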

 

 

Sincerely,

Iffa

 

Iffa_Intel
Moderator

Greetings,


Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question. 


Sincerely,

Iffa

