Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

What is the minimal installation set needed for deploying models on Windows machines for inference only?

Silbert__Ohad
Beginner

Hi,

I want to run a custom model I built and pre-trained inside an application. My application runs on machines without a GPU (Intel CPU only) running Windows 7 or 10. What should I install on the user's computer? I want the minimal installation set, since I would like to avoid heavy installations on users' computers.

Thanks
Ohad

4 Replies
Kulecz__Walter
New Contributor I

I'd like to know the answer too.

I can answer this question for Python apps using the NCS v1 SDK on Ubuntu or Raspbian, but for OpenVINO I have no idea.

One problem I foresee with minimal binary installations of OpenVINO is "optimizations". I installed OpenVINO R5 on Ubuntu 16.04 on an i7 4500U, cloned the hard drive, and moved it to an i5 4300U; everything worked. But when I moved the drive to a "weaker" i5 540M, the example code failed with "illegal instruction". I had to repeat the installation on the 540M to get the examples to run, and the inference rate went from ~40 fps to ~9 fps, presumably because the 540M lacks AVX and AVX2.

Are there build options to make a "universal" library binary where it falls back to software emulation when hardware features are missing from the CPU?
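One partial workaround on the application side is to check the CPU's feature flags at startup and refuse to run the AVX-optimized code path on machines that lack it, instead of crashing with "illegal instruction" later. A rough sketch, assuming a GCC/Clang build (on MSVC you would use __cpuid for the same check):

// Rough startup check for AVX/AVX2 support (GCC/Clang builtins).
// If the features are missing, bail out with a clear message instead of
// crashing later with "illegal instruction" inside an optimized library.
#include <cstdio>
#include <cstdlib>

int main() {
    __builtin_cpu_init();
    const bool has_avx  = __builtin_cpu_supports("avx");
    const bool has_avx2 = __builtin_cpu_supports("avx2");

    if (!has_avx || !has_avx2) {
        std::fprintf(stderr,
                     "This CPU lacks AVX/AVX2; the optimized inference build "
                     "cannot run here.\n");
        return EXIT_FAILURE;
    }

    std::printf("AVX and AVX2 available, continuing.\n");
    // ... load the inference library / plugin here ...
    return 0;
}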

nikos1
Valued Contributor I

>  deploying models on windows machines for inference only

C++ or Python?

For C++ CPU inference the following DLLs are enough; you may or may not need the CPU extension DLL, depending on the topology:

inference_engine.dll
MKLDNNPlugin.dll
mkl_tiny_omp.dll

cpu_extension.dll 

In addition, you would also need the obvious: the IR files (.bin and .xml) and your application .exe.
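For reference, a minimal sketch of what that application could look like with the Inference Engine C++ Core API (newer releases; model.xml / model.bin are just placeholder names):

// Minimal CPU-only inference sketch using the Inference Engine C++ API.
#include <inference_engine.hpp>
#include <iostream>
#include <string>

int main() {
    using namespace InferenceEngine;

    Core ie;

    // Read the IR produced by the Model Optimizer.
    CNNNetwork network = ie.ReadNetwork("model.xml", "model.bin");

    // Compile the network for the CPU plugin (this is where MKLDNNPlugin.dll is loaded).
    ExecutableNetwork executable = ie.LoadNetwork(network, "CPU");

    // One inference request; fill the input blob with your data before Infer().
    InferRequest request = executable.CreateInferRequest();

    std::string inputName  = network.getInputsInfo().begin()->first;
    std::string outputName = network.getOutputsInfo().begin()->first;

    Blob::Ptr input = request.GetBlob(inputName);
    // ... copy your preprocessed image/tensor data into 'input' here ...

    request.Infer();

    Blob::Ptr output = request.GetBlob(outputName);
    std::cout << "Output blob size: " << output->size() << std::endl;
    return 0;
}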

Cheers,

Nikos

 

Kulecz__Walter
New Contributor I

Thanks. What also needs to be added to the set for Python, and what is needed for both on Linux?

Is there a "universal" version of the CPU DLL?

I ask because I installed OpenVINO on Linux on an i5 4200U, later "cloned" the drive to a "weaker" i5 M540, and got illegal instruction errors running the demos, presumably from the lack of AVX and AVX2 instructions. Re-installing OpenVINO on the M540 solved the problem.

I can see a real issue with binary only installations/distributions here.

 

Silbert__Ohad
Beginner

Thank you for the answer. 
