Hello, my question is whether it is possible to use the same loaded model for multiple inferences using multiple threads (each thread having different input data).
Hi,
it should be possible, but it all depends on your software design. It also needs to match your hardware design (make sure your hardware is capable of the operations you want to perform).
You'll need to design the multithreading sequence properly so that the model can serve multiple inferences without the threads halting or disturbing each other's operations.
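To illustrate the point above, here is a minimal sketch of one shared, already-loaded model serving several threads, each with its own input. Note that `fake_model` is only a stand-in for a real inference call (e.g. an OpenVINO compiled model); the actual API names are not shown here, and real engines may impose their own thread-safety rules.

```python
# Minimal sketch: one shared model, multiple threads, per-thread inputs.
# `fake_model` is a placeholder for a real forward pass; because it is a
# pure function with no shared mutable state, it is safe to call from
# several threads at once.
from concurrent.futures import ThreadPoolExecutor

def fake_model(x):
    # Stand-in for the loaded model's inference call.
    return [v * 2 for v in x]

def worker(input_data):
    # Each thread receives its own input and returns its own output,
    # so no locking is needed around the call itself.
    return fake_model(input_data)

inputs = [[1, 2], [3, 4], [5, 6]]
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(worker, inputs))

print(results)  # [[2, 4], [6, 8], [10, 12]]
```

The key design choice is that threads share only read-only state (the model) and each owns its input and output, which is what avoids the threads disturbing each other.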
Sincerely,
Iffa
Hello Iffa,
Is there any example for this scenario?
There are OpenVINO sample applications for running multiple models in one inference, multi-device execution, etc.
You may refer here.
However, there is no sample specific to multithreading with multiple inferences of one model.
You may refer to these Threading utilities instead.
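Building on the threading-utilities suggestion, here is a sketch of the common "one loaded model, one request context per thread" pattern. The `Request` class below is a hypothetical stand-in for a real per-thread inference context (such as an OpenVINO infer request); the names are illustrative only.

```python
# Sketch: shared read-only weights, one request object per thread.
# Each Request owns its own output buffer, so concurrent inferences
# never write into each other's results.
import threading

SHARED_WEIGHTS = [0.5, 1.5]  # loaded once, read-only afterwards

class Request:
    """Per-thread inference context: owns its own output buffer."""
    def __init__(self):
        self.output = None

    def infer(self, x):
        # Reads the shared weights but writes only its own buffer.
        self.output = sum(w * v for w, v in zip(SHARED_WEIGHTS, x))

requests = [Request() for _ in range(4)]
inputs = [(1, 2), (3, 4), (5, 6), (7, 8)]
threads = [threading.Thread(target=r.infer, args=(x,))
           for r, x in zip(requests, inputs)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print([r.output for r in requests])  # [3.5, 7.5, 11.5, 15.5]
```

Whether the real engine allows this without extra locking depends on its documented thread-safety guarantees, which is why the linked threading utilities and discussion are worth reviewing first.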
Another thing to note is this discussion.
Sincerely,
Iffa
Greetings,
Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question.
Sincerely,
Iffa