Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

OpenVINO benchmark_app: how to run INT8 precision?

Bhuvaneshwara
Beginner

OpenVINO benchmark_app: this tool gives two options to specify precision.

-infer_precision Optional. Specifies the inference precision. Example #1: '-infer_precision bf16'. Example #2: '-infer_precision CPU:bf16,GPU:f32'

The other one is under Preprocessing options:
-op <value> Optional. Specifies precision for all output layers of the model.

Which one is appropriate to run INT8 precision?
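
For reference, this is roughly how I would pass each option on the command line (model.xml is just a placeholder IR name, and the values shown are only illustrative):

benchmark_app -m model.xml -infer_precision bf16
benchmark_app -m model.xml -op f32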

Zulkifli_Intel
Moderator

Hi Bhuvaneshwara,

Thank you for reaching out.

You can run benchmark_app without specifying the model precision option. You can try running the benchmark_app with this command:

benchmark_app -m model_name.xml

The Benchmark Tool demonstrates how to use the benchmark_app to estimate deep learning inference performance on supported devices.
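
For example, you can also pick a target device explicitly with -d (CPU here is only an illustrative choice); the tool will then report latency and throughput for the model as it is:

benchmark_app -m model_name.xml -d CPU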

Regards,

Zul


Zulkifli_Intel
Moderator

This thread will no longer be monitored since we have provided a solution. If you need any additional information from Intel, please submit a new question.

