Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Factors affecting inference time with the GPU plugin

liang__heng
Beginner
454 Views

When doing inference with the GPU plugin, I tested my program on different PCs and found that the time cost differs a lot. I want to find the major factors affecting the inference time. Can anyone give me some advice?

0 Kudos
2 Replies
nikos1
Valued Contributor I
454 Views

Hello Liang Heng, 

What are the GPU models in the systems you are testing? 

I think you will find that in most cases GPU inference will be faster on GPUs with more EUs and/or a higher GPU clock, but there are many other factors too.

For details on GPU EU counts and clock speeds, please refer to https://ark.intel.com/content/www/us/en/ark.html
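
If you are not sure which GPU is in each system, a quick way to check (a minimal sketch, assuming the openvino.inference_engine Python API is installed and the GPU plugin loads on that machine) is to ask the Inference Engine for the device name and then look it up on ark:

from openvino.inference_engine import IECore

# Print the devices the Inference Engine sees and the full GPU name,
# so you can look up the EU count / clock of that part on ark.intel.com.
ie = IECore()
print("Available devices:", ie.available_devices)
print("GPU:", ie.get_metric("GPU", "FULL_DEVICE_NAME"))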

Cheers,

Nikos

0 Kudos
Shubha_R_Intel
Employee
454 Views

Dear liang__heng,

In your experiments I assume that you're using the same image(s) each time. What nikos said is correct, but it also depends on the model you are using and the image sizes you are passing in: the GPU plugin optimizes models for certain ideal kernel sizes, so it's best to feed in an image size that is optimal for the model.
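
For example (a rough sketch, not your exact code: it assumes the openvino.inference_engine Python API from the 2020+ releases and placeholder file names model.xml / model.bin / input.jpg), you can read back the input shape the IR expects and resize your image to it before inference:

import cv2
from openvino.inference_engine import IECore

# Read the IR, find the input shape it was converted with, and resize the
# image to match before sending it to the GPU plugin.
ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")
input_blob = next(iter(net.input_info))
n, c, h, w = net.input_info[input_blob].input_data.shape  # NCHW layout assumed

image = cv2.imread("input.jpg")
image = cv2.resize(image, (w, h))      # match the width/height the IR expects
image = image.transpose((2, 0, 1))     # HWC -> CHW
image = image.reshape((n, c, h, w))    # add the batch dimension (assumes n == 1)

exec_net = ie.load_network(network=net, device_name="GPU")
result = exec_net.infer(inputs={input_blob: image})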

You can also use the benchmark_app to perform experiments.
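
For instance, benchmark_app -m model.xml -d GPU measures throughput and latency on the GPU plugin. If you prefer a quick check from Python, a rough timing loop along these lines (same API assumptions and placeholder names as above) also works:

import time
import numpy as np
from openvino.inference_engine import IECore

# Run a number of synchronous inferences on the GPU plugin and report the
# average latency. Uses random data shaped like the model's input.
ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")
input_blob = next(iter(net.input_info))
shape = net.input_info[input_blob].input_data.shape
dummy = np.random.rand(*shape).astype(np.float32)

exec_net = ie.load_network(network=net, device_name="GPU")
exec_net.infer(inputs={input_blob: dummy})   # warm-up: the first run compiles kernels

n_iter = 100
start = time.perf_counter()
for _ in range(n_iter):
    exec_net.infer(inputs={input_blob: dummy})
elapsed = time.perf_counter() - start
print("Average latency: %.2f ms" % (elapsed / n_iter * 1000))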

Hope it helps,

Thanks,

Shubha

0 Kudos