Community
livne__moshe
Beginner
114 Views

minimal installation for inference

Hi,

What are the minimal installation requirements for inference? The whole installation is really big, and I presume not all of it is needed.

The inference itself is done in Python.

Sorry if this is documented somewhere; I searched but only found unrelated material.

Best regards,

Moshe

3 Replies
nikos1
Valued Contributor I

Hi Moshe,

> the whole installation is really big and I presume not all of it is needed.

You are right. For Python (and C++) you only need to

source ~/intel/computer_vision_sdk/bin/setupvars.sh

Studying that script will give you a good idea of how PATH, LD_LIBRARY_PATH, and PYTHONPATH are set, and what the minimal setup is that you actually need.
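For instance, a stripped-down environment, instead of sourcing the whole script, might look like the sketch below. All paths are assumptions based on a default 2018 R5 install layout, so verify them against your own setupvars.sh.

```shell
# Minimal environment for Python inference -- a sketch, not the
# official setup. The install prefix and subdirectories below are
# assumptions; check setupvars.sh for the real paths on your machine.
INSTALL_DIR="$HOME/intel/computer_vision_sdk"
IE_LIBS="$INSTALL_DIR/deployment_tools/inference_engine/lib/ubuntu_16.04/intel64"

# Prepend only if the variable is already set, to avoid a trailing colon.
export LD_LIBRARY_PATH="$IE_LIBS${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export PYTHONPATH="$INSTALL_DIR/python/python3.5${PYTHONPATH:+:$PYTHONPATH}"

echo "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
echo "PYTHONPATH=$PYTHONPATH"
```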

It really depends on how much of OpenVINO you need to support: all plug-ins or just one, CPU extensions, format readers, OpenCV, etc.

In the case of C++, for example, you could copy just libinference_engine.so, libMKLDNNPlugin.so, and libmkl_tiny_omp.so to your current directory and run minimal CPU inference.
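That copy step could be sketched as below. The source directory is an assumption based on the default 2018 R5 layout; the loop simply reports anything it cannot find rather than failing.

```shell
# Collect only the three CPU-inference libraries into ./deploy so a
# binary can run without the full install. SRC is an assumption
# (default 2018 R5 layout) -- adjust it for your machine.
SRC="$HOME/intel/computer_vision_sdk/deployment_tools/inference_engine/lib/ubuntu_16.04/intel64"
mkdir -p deploy
for lib in libinference_engine.so libMKLDNNPlugin.so libmkl_tiny_omp.so; do
    if [ -f "$SRC/$lib" ]; then
        cp "$SRC/$lib" deploy/
    else
        echo "not found: $SRC/$lib (adjust SRC for your install)"
    fi
done
ls deploy
```

Run your binary from that directory (or point LD_LIBRARY_PATH at it); note this covers the CPU plug-in only, and other devices would need their own plug-in libraries.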

cheers,

nikos

livne__moshe
Beginner

Thank you for your reply!

This is a tedious hit-and-miss process, trimming based on the environment variables, and not very "official".

I would say there should be an option to install only the inference-related parts: no samples, no RPMs.

In the silent install there is a components section, but there is no documentation for the components.

It's a really great product, but deployment to smallish end devices is nearly impossible. My Docker image was huge, most of it because of OpenVINO.

 

 

livne__moshe
Beginner

Ahhh, I found the comment at the top of silent.cfg.

The components are:


 

m@dl4:~/l_openvino_toolkit_p_2018.5.445$ ./install.sh --list_components
intel-inference_engine_sdk__noarch, version: 2018.5
intel-inference_engine_cpu__noarch, version: 2018.5
intel-inference_engine_gpu__noarch, version: 2018.5
intel-inference_engine_vpu__noarch, version: 2018.5
intel-inference_engine_gna__noarch, version: 2018.5
intel-inference_engine_hddl__noarch, version: 2018.5
intel-model_optimizer__noarch, version: 2018.5
intel-opencv_ubuntu_16_rel__noarch, version: 2018.5
intel-openvx_ubuntu__noarch, version: 2018.5
intel-models__noarch, version: 2018.5
intel-algorithms_ubuntu__noarch, version: 2018.5

 

So, for inference on CPU, will these be enough?

intel-inference_engine_sdk__noarch, version: 2018.5
intel-inference_engine_cpu__noarch, version: 2018.5
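Trimming silent.cfg down to those two components could be sketched like this. The COMPONENTS key name and the semicolon separator are assumptions taken from the comment mentioned above, so verify them against your own silent.cfg before running the installer.

```shell
# Sketch: build a CPU-only install config from silent.cfg.
# ASSUMPTIONS: the key is named COMPONENTS and takes a
# semicolon-separated list -- confirm this in the comment at the
# top of silent.cfg for your toolkit version.
CFG=cpu_only.cfg
if [ -f silent.cfg ]; then
    cp silent.cfg "$CFG"
else
    echo 'COMPONENTS=DEFAULTS' > "$CFG"   # stub so the sketch runs anywhere
fi
sed -i 's/^COMPONENTS=.*/COMPONENTS=intel-inference_engine_sdk__noarch;intel-inference_engine_cpu__noarch/' "$CFG"
grep '^COMPONENTS=' "$CFG"
# then: ./install.sh --silent cpu_only.cfg
# (the silent-install flag name should be confirmed with ./install.sh --help)
```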
 

 
