Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

OpenVINO inference time becomes slow when starting multiple processes

User1578037094245524

Hi,

I use OpenVINO to run inference with a ResNet50 model. When I start only one process, the inference time is about 300 ms.

But when I start a second process (just a copy of the first), the inference time of both processes becomes about 600 ms.

Does the inference time of different processes affect each other? Could you give me some information about this?

Thanks a lot!

 


3 Replies
IntelSupport
Community Manager

Hi Lei Yan,

Could you provide details of the program you are running? Which sample or demo of the OpenVINO Toolkit are you using? Also, please share your machine's environment and the version of the OpenVINO Toolkit.

 

Regards,

Aznie


User1578037094245524

Hi Aznie,

Thanks a lot for your reply.

I fixed the issue with SetConfig({ { CONFIG_KEY(CPU_BIND_THREAD), "NO" } }, "CPU"). It seems the default for CPU_BIND_THREAD is "YES". That works well on Windows, but on Linux the two processes appear to bind to the same CPU cores, so the inference time doubles. After I set it to "NO" in the code, both Windows and Linux work well.
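For reference, here is a minimal sketch of where that call fits in the classic Inference Engine C++ API. The model path "resnet50.xml" is a placeholder; substitute your own IR files:

```cpp
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;

    // Disable CPU thread pinning before loading the network.
    // With the default ("YES"), each process pins its inference threads
    // to cores independently, so on Linux two processes can end up
    // pinned to the same physical cores and contend with each other.
    core.SetConfig({{CONFIG_KEY(CPU_BIND_THREAD), CONFIG_VALUE(NO)}}, "CPU");

    // Placeholder model path -- replace with your own IR.
    auto network = core.ReadNetwork("resnet50.xml");
    auto executable = core.LoadNetwork(network, "CPU");
    auto request = executable.CreateInferRequest();
    request.Infer();
    return 0;
}
```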

Regards, 


IntelSupport
Community Manager

 

Hi Lei Yan,

Thank you for sharing the solution. For more information, you can also run your model through the benchmark_app and experiment with its parameters to compare inference times. This thread will no longer be monitored since the issue has been resolved. If you need any additional information from Intel, please submit a new question.
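As one possible experiment (the model path is a placeholder), benchmark_app's -pin option controls the same thread-pinning behavior, so you can compare latency with pinning on and off:

```shell
# Run with CPU thread pinning enabled (the default)
./benchmark_app -m resnet50.xml -d CPU -pin YES

# Run with pinning disabled, as in the fix above
./benchmark_app -m resnet50.xml -d CPU -pin NO
```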

 

Regards,

Aznie

