Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

OpenVINO using MKL

Malhar
Employee

I have built the Inference Engine source code of OpenVINO (https://github.com/opencv/dldt/tree/2018/inference-engine) with GEMM=MKL in CMake. When I run the sample application and check the performance counts, it shows JIT_AVX2_I8, whereas I was expecting MKL instructions to be executed.
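For reference, my configure step looked roughly like this (the paths are placeholders for my machine, and MKLROOT is my assumption about how the MKL install location gets passed to the build):

    # Configure the dldt inference-engine build with MKL as the GEMM backend
    cd dldt/inference-engine && mkdir build && cd build
    # GEMM=MKL selects MKL for GEMM; MKLROOT (assumed variable) points at the MKL install
    cmake -DCMAKE_BUILD_TYPE=Release -DGEMM=MKL -DMKLROOT=/opt/intel/mkl ..
    make -j$(nproc)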

2 Replies
Shubha_R_Intel
Employee

Hi Malhar. I assume that you're concerned about not seeing AVX512 instructions?

Have you tried MKL Verbose Mode?

https://software.intel.com/en-us/mkl-windows-developer-guide-using-intel-mkl-verbose-mode
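Roughly, enabling it on Linux looks like this (the sample name and paths below are just placeholders for whatever you are running):

    # MKL_VERBOSE=1 makes MKL print a log line to stdout for every MKL call it executes
    export MKL_VERBOSE=1
    # Run your sample with per-layer performance counts (-pc) and watch the output for MKL_VERBOSE lines
    ./classification_sample -m /path/to/model.xml -i /path/to/image.png -pc

If no MKL_VERBOSE lines show up in the output, MKL is not actually being called for those layers.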

You can also post MKL questions on this forum:

https://software.intel.com/en-us/forums/intel-math-kernel-library

Malhar
Employee

Hi Shubha R.,

Yes, I set MKL_VERBOSE and then checked. I still see JIT_AVX2_I8 when I run the sample application with an INT8 model loaded and the -pc flag.

I have also built the dldt inference engine code base with GEMM specified to be built using MKL; I am not building GEMM with JIT.
