Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.
6434 Discussions

Inference speed with INT8 model on a bit old CPU

timosy
New Contributor I

I prepared an INT8 model and tested it on a new CPU. I found that inference gets faster, which is fine. But when I ran inference with the same model on a somewhat older CPU, I found that it does not get faster. How can I check whether a CPU supports INT8 models or not?

1 Solution
IntelSupport
Community Manager

Hi Timosy,

 

Thanks for reaching out.

 

Slow INT8 inference on an older CPU is expected; the speedup depends on the hardware configuration and the layers in the model. Refer to the Intel® Distribution of OpenVINO™ toolkit Benchmark Results for inference performance on specific hardware configurations.

 

Apart from that, INT8 is supported by the CPU plugin. Check out the Supported Devices, Supported Model Formats, and Supported Layers documentation for more details.
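One practical way to check whether a CPU can actually accelerate INT8 inference is to look for the Intel® DL Boost (VNNI) instruction-set flags: older CPUs without `avx512_vnni` (or at least AVX2) typically show little or no INT8 speedup. The sketch below is a Linux-only example (it assumes `/proc/cpuinfo` is available); the helper name `parse_cpu_flags` is our own, not part of any library.

```python
from pathlib import Path

def parse_cpu_flags(cpuinfo_text):
    """Extract the CPU feature-flag set from /proc/cpuinfo contents."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # Line looks like: "flags\t\t: fpu vme ... avx2 avx512_vnni ..."
            return set(line.split(":", 1)[1].split())
    return set()

# On Linux, inspect the live CPU; INT8 speedups need VNNI (or at least AVX2).
cpuinfo = Path("/proc/cpuinfo")
if cpuinfo.exists():
    flags = parse_cpu_flags(cpuinfo.read_text())
    print("VNNI (fast INT8):", "avx512_vnni" in flags or "avx_vnni" in flags)
    print("AVX2:", "avx2" in flags)
```

Alternatively, the OpenVINO Python API can report what the CPU plugin itself supports: query `Core().get_property("CPU", "OPTIMIZATION_CAPABILITIES")` (or `IECore().get_metric(...)` in older releases) and check whether `INT8` appears in the returned list.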

 

 

Regards,

Aznie

2 Replies
IntelSupport
Community Manager

Hi Timosy,


This thread will no longer be monitored since we have provided the requested information. If you need any additional information from Intel, please submit a new question.



Regards,

Aznie

