Employee

How to check if VNNI (or AVX512) is actually used by OpenVino toolkit?

I just wanted to make sure that VNNI or AVX512 are actually used.

I am comparing performance between CPUs. I see an improvement, but I am not sure how to verify that VNNI and AVX512 are actually used.

I am running the 'human_pose_estimation' demo on Windows.

I looked for relevant OpenVINO logs, but had no luck.

Is there any other way to check it?

Many thanks in advance.
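So far the only thing I know how to check is whether the CPU supports these instructions at all (this proves support, not that OpenVINO actually used them). On Linux the kernel exposes the feature flags in /proc/cpuinfo:

```shell
# List the relevant CPU feature flags (Linux; flag names are as
# reported by the kernel in /proc/cpuinfo).
for flag in avx512f avx512_vnni; do
    if grep -qw "$flag" /proc/cpuinfo; then
        echo "$flag: supported"
    else
        echo "$flag: not supported"
    fi
done
```

On Windows, a tool such as Sysinternals Coreinfo can report the same feature bits.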

4 Replies
Employee

Also, disassembling the code seems very complicated to me, and even if I found a VNNI instruction in the binary, there might be run-time logic that skips that code path and never executes it.

Moderator

Greetings,


I'm not sure whether you are aware of this official documentation; in any case, here it is:

https://software.intel.com/content/www/us/en/develop/articles/get-started-with-intel-deep-learning-b...


This article describes how the Intel Distribution of OpenVINO toolkit, together with Vector Neural Network Instructions (VNNI) and Intel® Advanced Vector Extensions 512 (Intel® AVX-512), can accelerate your workload. You can see a clear performance boost with Intel DL Boost on inference workloads.


If you are using a custom model, you need to convert it into IR using the Model Optimizer.


For logs, you can add --log_level=DEBUG to the command that you run in your terminal or cmd.
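Another option worth trying (an assumption on my side that the CPU plugin in your OpenVINO build honors it): oneDNN, the library behind the CPU plugin (formerly MKL-DNN), has a verbose mode that prints one line per executed primitive, including the instruction set it JIT-compiled for:

```shell
# Enable oneDNN/MKL-DNN verbose output, then run the demo as usual.
# (DNNL_VERBOSE for newer releases, MKLDNN_VERBOSE for older ones;
# on Windows cmd use "set" instead of "export".)
export DNNL_VERBOSE=1
export MKLDNN_VERBOSE=1
./human_pose_estimation_demo -i movie.mp4 -m human-pose-estimation-0001.xml -d CPU
# In the verbose lines, ISA tags such as "avx512_core" or
# "avx512_core_vnni" indicate which kernels actually ran.
```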


Sincerely,

Iffa


Moderator

Greetings,


Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question.



Sincerely,

Iffa


Employee

@Iffa_Intel 

Thank you for your reply.

I have already worked with OpenVINO and am familiar with it.

I am running OpenVINO inference, for example launching the C/C++ 'human pose estimation' demo provided with OpenVINO:

C:\Users\Frame\Documents\Intel\OpenVINO\omz_demos_build\intel64\Release\human_pose_estimation_demo.exe -i C:\movie.mp4 -m C:\human-pose-estimation-0001.xml -d CPU

I am running on Windows, but it could also be Linux.

I have a specific question about inference with OpenVINO. How can I find out whether this demo uses CPU acceleration such as VNNI (Vector Neural Network Instructions) or AVX-512 (Advanced Vector Extensions)? These instructions give a big boost to OpenVINO inference, so it would be great to know whether they are being used. Is this information available somewhere in the logs? Or can the user turn usage of VNNI/AVX-512 by OpenVINO on and off? I believe OpenVINO generally uses them, but I am interested in my specific inference run. I tried to find this information, but had no luck.
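One idea I had for the on/off part (untested, and it assumes the oneDNN build inside OpenVINO was compiled with this feature enabled): oneDNN supports capping the instruction set it may use via the DNNL_MAX_CPU_ISA environment variable, which would allow an A/B comparison:

```shell
# Baseline run: let oneDNN pick the best ISA (AVX-512/VNNI if available).
./human_pose_estimation_demo -i movie.mp4 -m human-pose-estimation-0001.xml -d CPU

# Capped run: forbid anything above AVX2, so no AVX-512/VNNI kernels.
DNNL_MAX_CPU_ISA=AVX2 \
./human_pose_estimation_demo -i movie.mp4 -m human-pose-estimation-0001.xml -d CPU
# A clear FPS difference between the two runs would suggest the
# AVX-512/VNNI paths were used in the baseline.
```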

Could you help me to find out if my run of 'human pose estimation' demo is using VNNI/AVX-512 CPU instructions?

Thank you in advance.
