Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

How to set options for inference

jjjjjjjjjjjjjjjjjj

1. Can I use OpenVINO to set the number of CPU cores used for inference?

2. Can I set profiler options when running inference?

3. Is there a profiler tool for OpenVINO? Where can I find instructions?

 

Where can I see the options I can set during inference?

Wan_Intel
Moderator

Hi Jjjjjjjjjjjjjjjjjj,

Thank you for reaching out to us.


You can use KEY_CPU_THREADS_NUM to specify the number of threads the CPU plugin should use for inference. Zero (the default) means all (logical) cores are used.


Supported Configuration Parameters for CPU-specific settings are available at the following page:

https://docs.openvino.ai/latest/openvino_docs_IE_DG_supported_plugins_CPU.html#supported-configurati...


On another note, OpenVINO has built-in capabilities for performance analysis of key stages such as read time and load time. Most modules and features are tagged with Intel ITT counters, which makes it possible to measure the performance of these components.


For more information, please refer to Performance analysis using ITT counters.



Regards,

Wan


Wan_Intel
Moderator

Hi Jjjjjjjjjjjjjjjjjj,

Thanks for your question.


This thread will no longer be monitored since we have provided suggestions. 

If you need any additional information from Intel, please submit a new question.


 

Regards,

Wan

