Build OpenVINO Inference Engine Statically

Hi,

We need to compile the OpenVINO Inference Engine statically so it can be packaged with our application (we are building native add-ons with the N-API wrapper, for use in an Electron application) and deployed on other target machines.

We tried the dldt Inference Engine from this repo: https://github.com/opencv/dldt

The CMake options are:

cmake .. -DCMAKE_BUILD_TYPE=Release -DBUILD_SHARED_LIBS=OFF  -DENABLE_MKL_DNN=ON -DENABLE_CLDNN=ON -DENABLE_NGRAPH=ON -DENABLE_OPENCV=OFF -DNGRAPH_STATIC_LIB_ENABLE=ON 

However, after the build, the library files are not static (see deployment_tools/inference_engine/lib/intel64).
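One quick way to verify what a build actually produced is to count static archives (*.a) versus shared objects (*.so) in the output directory. Below is a minimal sketch; the directory name and the placeholder files are hypothetical stand-ins for the real output path (deployment_tools/inference_engine/lib/intel64):

```shell
# Hypothetical stand-in for the real build output directory.
libdir="demo_lib_check"
mkdir -p "$libdir"
# Placeholder files standing in for real build artifacts.
touch "$libdir/libinference_engine.so" "$libdir/libinference_engine.a"

# Static libraries end in .a; shared objects end in .so (possibly versioned).
shared=$(find "$libdir" -name '*.so*' | wc -l)
static=$(find "$libdir" -name '*.a' | wc -l)
echo "shared libraries: $shared"
echo "static archives: $static"
```

If the count of .so files is nonzero after a -DBUILD_SHARED_LIBS=OFF build, the flag was not honored for those targets.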

1. Is there any way to make the build static?

2. Our application will make use of both GPU and CPU. Is it necessary to set ENABLE_MKL_DNN=ON?

3. Even with ENABLE_NGRAPH=OFF, the build still shows dependencies on nGraph files. Is this because of other CMake options?

Thanks,

Santhiya

1 Reply
SIRIGIRI_V_Intel
Employee

Hi Santhiya,

Please find the following answers:

  1. It seems that the OpenVINO Inference Engine currently does not support static linking of its libraries.
  2. Yes, it is necessary to set ENABLE_MKL_DNN=ON for CPU and ENABLE_CLDNN=ON for GPU.
  3. nGraph may depend on other components as well. I tried removing the -DENABLE_NGRAPH=OFF argument when building, and it works fine.
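Putting the answers above together, the CMake invocation would look roughly like the following. This is only a sketch based on the flags discussed in this thread, not an officially supported recipe; BUILD_SHARED_LIBS is left at its default since static linking is reportedly unsupported, and the nGraph flag is simply omitted rather than set to OFF:

```shell
# CPU plugin (MKL-DNN) and GPU plugin (clDNN) enabled;
# nGraph flag omitted; shared libraries left at the default.
cmake .. -DCMAKE_BUILD_TYPE=Release \
         -DENABLE_MKL_DNN=ON \
         -DENABLE_CLDNN=ON \
         -DENABLE_OPENCV=OFF
```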

Feel free to ask any other questions.

Thanks,

Ram prasad
