Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision related on Intel® platforms.

Inference speed with an INT8 model on an older CPU

timosy
New Contributor I

I prepared an INT8 model and tested it on a new CPU, and the inference speed got faster, which is fine. But when I ran inference with the same model on a somewhat older CPU, I found that it does not get faster. How should I check whether a CPU supports INT8 models or not?

2 Replies
IntelSupport
Moderator

Hi Timosy,

Thanks for reaching out.

Slower inference on an old CPU is expected: the INT8 speed-up depends on the hardware configuration and on the layers in the model. Refer to the Intel® Distribution of OpenVINO™ toolkit Benchmark Results for the inference performance on a specific hardware configuration.
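
If you want to measure the difference yourself rather than rely on the published numbers, the benchmark_app tool shipped with the toolkit (benchmark_app -m model.xml -d CPU) is the usual way. Below is also a minimal timing sketch using the OpenVINO 2022+ openvino.runtime Python API; the IR paths model_fp32.xml and model_int8.xml are placeholders for your own models, and a static input shape is assumed.

```python
import time
import numpy as np
from openvino.runtime import Core

core = Core()

def throughput(xml_path, n_iters=100):
    # Compile the IR for the CPU plugin.
    compiled = core.compile_model(core.read_model(xml_path), "CPU")
    request = compiled.create_infer_request()
    inp = compiled.input(0)                    # assumes a static input shape
    data = np.random.rand(*inp.shape).astype(np.float32)
    request.infer({inp: data})                 # warm-up run, excluded from timing
    start = time.perf_counter()
    for _ in range(n_iters):
        request.infer({inp: data})
    return n_iters / (time.perf_counter() - start)

# Placeholder paths: compare the FP32 and quantized IRs on the same machine.
for path in ("model_fp32.xml", "model_int8.xml"):
    print(f"{path}: {throughput(path):.1f} inferences/s")
```
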
Apart from that, INT8 is supported by the CPU plugin. Check out the Supported Devices, Supported Model Formats, and Supported Layers documentation for more details.
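
To answer the original question of how to tell whether a given CPU can actually accelerate INT8, one option is to query the CPU plugin's optimization capabilities. This is a sketch assuming the OpenVINO 2022+ Python API (older releases expose the same metric via ie.get_metric()):

```python
from openvino.runtime import Core

core = Core()
# Lists the precisions this CPU plugin can accelerate, e.g. ['FP32', 'INT8', ...].
caps = core.get_property("CPU", "OPTIMIZATION_CAPABILITIES")
print(caps)

if "INT8" not in caps:
    # Typical of older CPUs without AVX-512 VNNI (Intel DL Boost): the
    # quantized model still runs, but without a hardware int8 speed-up.
    print("No native INT8 acceleration on this CPU.")
```

If "INT8" is absent, or the CPU lacks VNNI/DL Boost instructions (on Linux, look for the avx512_vnni flag in /proc/cpuinfo), the quantized model will still execute, but typically without the speed-up you saw on the newer CPU.
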

Regards,

Aznie


IntelSupport
Moderator

Hi Timosy,


This thread will no longer be monitored since we have provided the information. If you need any additional information from Intel, please submit a new question.



Regards,

Aznie

