Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

Inference speed with INT8 model on a bit old CPU

timosy
New Contributor I

I prepared an INT8 model and tested it on a new CPU: inference got faster, as expected. But when I ran the same model on a somewhat older CPU, inference did not get any faster. How can I check whether a CPU supports INT8 models?

1 Solution
IntelSupport
Community Manager

Hi Timosy,

 

Thanks for reaching out.

 

Slower inference on an older CPU is expected; performance depends on the hardware configuration and on the layers in the model. Refer to the Intel® Distribution of OpenVINO™ toolkit Benchmark Results for inference performance on specific hardware configurations.

 

Apart from that, INT8 is supported by the CPU plugin. Check out the Supported Devices, Supported Model Formats, and Supported Layers documentation for more details.
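As a quick hardware-level sanity check (a sketch, not an official Intel procedure): the CPU plugin can run INT8 models on most x86 CPUs, but the INT8 speedup generally comes from VNNI instructions (e.g. AVX512_VNNI on Cascade Lake and later server CPUs, AVX_VNNI on some newer client CPUs). Without VNNI the model still executes, just without the expected acceleration. On Linux you can look for those flags in /proc/cpuinfo; the helper names below are ours, not part of the OpenVINO API:

```python
# Sketch: check whether the CPU advertises instructions that accelerate
# INT8 inference. Function names here are illustrative, not OpenVINO APIs.

def supports_fast_int8(cpu_flags: str) -> bool:
    """Return True if the flag string contains a VNNI variant.

    VNNI (Vector Neural Network Instructions) is what typically gives
    INT8 models their speedup; without it, INT8 networks still run,
    just not faster than FP32.
    """
    flags = set(cpu_flags.lower().split())
    return bool(flags & {"avx512_vnni", "avx_vnni"})


def read_linux_cpu_flags(path: str = "/proc/cpuinfo") -> str:
    """Pull the first 'flags' line from /proc/cpuinfo (Linux only)."""
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return line.split(":", 1)[1]
    return ""


if __name__ == "__main__":
    print("fast INT8 support:", supports_fast_int8(read_linux_cpu_flags()))
```

On recent OpenVINO releases you can also ask the runtime directly: `Core().get_property("CPU", "OPTIMIZATION_CAPABILITIES")` returns a list that includes `'INT8'` when the plugin supports INT8 execution, though that reflects plugin support rather than whether the hardware accelerates it.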

 

 

Regards,

Aznie


2 Replies
IntelSupport
Community Manager

Hi Timosy,


This thread will no longer be monitored since we have provided a solution. If you need any additional information from Intel, please submit a new question.



Regards,

Aznie

