It should be possible, but it all depends on your software design. It also needs to be relevant to your hardware design (make sure your hardware is capable of the operations you want to perform).
You'll need to design the multithreading sequence properly so that the model can be used by multiple inference requests without them halting or disturbing each other's operations.
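As a rough illustration of that design, the usual pattern is one shared, read-only compiled model plus one independent inference request per thread, so the threads never share mutable input/output buffers. The sketch below mimics that split with plain Python stand-ins; the `CompiledModel` and `InferRequest` classes here are illustrative toys, not the actual OpenVINO API, and the "inference" is just a dot product.

```python
# Hedged sketch of the shared-model / per-thread-request pattern.
# CompiledModel and InferRequest are hypothetical stand-ins, not OpenVINO classes.
from concurrent.futures import ThreadPoolExecutor

class CompiledModel:
    """Shared and immutable after loading, so concurrent reads are safe."""
    def __init__(self, weights):
        self._weights = tuple(weights)

    def create_infer_request(self):
        # Each caller gets its own request with private buffers.
        return InferRequest(self._weights)

class InferRequest:
    """Owned by a single thread; holds that thread's input/output state."""
    def __init__(self, weights):
        self._weights = weights
        self.output = None

    def infer(self, inputs):
        # Toy "inference": dot product of the inputs with the shared weights.
        self.output = sum(w * x for w, x in zip(self._weights, inputs))
        return self.output

model = CompiledModel([1, 2, 3])

def worker(sample):
    req = model.create_infer_request()  # one request per thread, never shared
    return req.infer(sample)

samples = [[1, 1, 1], [2, 0, 1], [0, 5, 0]]
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(worker, samples))
print(results)  # [6, 5, 10]
```

In real OpenVINO code the same idea applies: compile the model once, then create a separate inference request for each concurrent stream of work rather than sharing one request across threads.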
There are OpenVINO sample applications for running multiple models in one inference pipeline, multi-device execution, etc.
You may refer here.
However, there is none specific to multithreading with multiple inferences of one model.
You may refer to these Threading utilities instead.
Another thing to note is this discussion.
Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question.