Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Is it possible to use one model for multi-threaded inference?

shirleyliu
New Contributor I

Hello, my question is whether it is possible to use the same loaded model for inference from multiple threads (each thread has different input data).

Iffa_Intel
Moderator

Hi,


It should be possible, but it all depends on your software design. It also needs to match your hardware's capabilities (make sure your hardware can handle the workload you intend to run).

You'll need to design the multithreading properly so that the model can serve multiple inference requests without the threads halting or disturbing each other's operations.
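For illustration, here is a rough sketch of that pattern using the OpenVINO 2.x Python API (the model path, device name, and random input data below are placeholders, not something from this thread): the model is read and compiled once, and each thread then creates its own infer request from the shared compiled model.

import threading
import numpy as np
from openvino.runtime import Core

core = Core()
# Read and compile the model once; the compiled model is shared by every thread.
compiled_model = core.compile_model("model.xml", "CPU")     # placeholder path and device
input_shape = list(compiled_model.input(0).shape)           # assumes a static input shape

def worker(thread_id):
    # Each thread owns its own InferRequest, so requests do not disturb each other.
    request = compiled_model.create_infer_request()
    data = np.random.rand(*input_shape).astype(np.float32)  # stand-in for real per-thread input
    results = request.infer({0: data})
    output = results[compiled_model.output(0)]
    print("thread", thread_id, "done, output shape", output.shape)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()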



Sincerely,

Iffa




shirleyliu
New Contributor I

Hello Iffa,

Is there any example for this scenario?

Iffa_Intel
Moderator

There are OpenVINO sample applications for running multiple models in one inference pipeline, multi-device execution, etc.

You may refer here.

 

However, there is no sample specific to multithreading and running multiple inferences with one model.

 

You may refer to the Threading utilities documentation instead.

Another thing to note is this discussion.
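As a side note (my own suggestion rather than something covered by the samples above), OpenVINO can also manage the parallelism for you through AsyncInferQueue, so you do not have to create the threads yourself. A rough sketch with the same placeholder model:

import numpy as np
from openvino.runtime import Core, AsyncInferQueue

core = Core()
compiled_model = core.compile_model("model.xml", "CPU")      # placeholder path and device
input_shape = list(compiled_model.input(0).shape)            # assumes a static input shape

# A pool of 4 infer requests scheduled on OpenVINO's own worker threads.
infer_queue = AsyncInferQueue(compiled_model, 4)
results = {}

def on_done(request, job_id):
    # Callback fires when a request completes; keep the first output per job.
    results[job_id] = request.get_output_tensor(0).data.copy()

infer_queue.set_callback(on_done)

for i in range(16):
    data = np.random.rand(*input_shape).astype(np.float32)  # stand-in for real input
    infer_queue.start_async({0: data}, userdata=i)

infer_queue.wait_all()
print("collected", len(results), "results")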

 

 

Sincerely,

Iffa

 

Iffa_Intel
Moderator

Greetings,


Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question. 


Sincerely,

Iffa

