Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

How to set options for inference

jjjjjjjjjjjjjjjjjj

1. Can I use OpenVINO to set the number of CPU cores used for inference?

2. Can I set profiler options when running inference?

3. Is there a profiler tool for OpenVINO? Where can I find instructions?

 

Where can I see the options I can set during inference?

Wan_Intel
Moderator

Hi Jjjjjjjjjjjjjjjjjj,

Thank you for reaching out to us.


You can use KEY_CPU_THREADS_NUM to specify the number of threads that the CPU plugin should use for inference. Zero (the default) means using all (logical) cores.
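As a minimal sketch (assuming the 2021.x Inference Engine Python API; model.xml / model.bin are placeholder paths for your IR model), the thread count can be passed to the CPU plugin like this:

```python
from openvino.inference_engine import IECore

ie = IECore()

# "CPU_THREADS_NUM" is the string form of KEY_CPU_THREADS_NUM;
# the default value 0 means "use all logical cores".
ie.set_config({"CPU_THREADS_NUM": "4"}, "CPU")

# Placeholder IR model paths.
net = ie.read_network(model="model.xml", weights="model.bin")
exec_net = ie.load_network(network=net, device_name="CPU")
```

The same key can also be passed through the config argument of load_network if you prefer to set it per network rather than per device.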


Supported configuration parameters for CPU-specific settings are available on the following page:

https://docs.openvino.ai/latest/openvino_docs_IE_DG_supported_plugins_CPU.html#supported-configuration-parameters


On another note, OpenVINO has powerful capabilities for performance analysis of key stages such as read time and load time. Most of the modules and features have been tagged with Intel ITT counters, which allows us to measure the performance of these components.


For more information, please refer to Performance analysis using ITT counters.
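Separately from the ITT-based analysis, the Inference Engine also exposes per-layer performance counters at run time, which may help with the profiler question. A minimal sketch, assuming the 2021.x Python API, a CPU device, and placeholder model paths:

```python
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()

# Ask the CPU plugin to collect per-layer performance counters.
ie.set_config({"PERF_COUNT": "YES"}, "CPU")

# Placeholder IR model paths.
net = ie.read_network(model="model.xml", weights="model.bin")
exec_net = ie.load_network(network=net, device_name="CPU")

# Run one inference with dummy data matching the network's input shape.
input_name = next(iter(net.input_info))
input_shape = net.input_info[input_name].input_data.shape
exec_net.infer({input_name: np.zeros(input_shape, dtype=np.float32)})

# Print per-layer execution status and time (microseconds).
for layer, stats in exec_net.requests[0].get_perf_counts().items():
    print(layer, stats["status"], stats["real_time"], "us")
```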



Regards,

Wan


Wan_Intel
Moderator

Hi Jjjjjjjjjjjjjjjjjj,

Thanks for your question.


This thread will no longer be monitored since we have provided suggestions. 

If you need any additional information from Intel, please submit a new question.


 

Regards,

Wan

