Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

How to use OpenVINO library in Qt5.9.1 on Ubuntu

Hank__Li
Beginner

Environment:
*OS: Ubuntu 16.04
*Qt: 5.9.1
*OpenVINO: toolkit for Linux 2018 R5
*OpenCV: 4.0.1

Hi,
I want to use the Intel OpenVINO library in Qt 5.9.1.
My Qt .pro file settings are:
INCLUDEPATH += /usr/local/include \
               /usr/local/include/opencv4 \
               /usr/local/include/opencv4/opencv2 \
               /usr/local/include/opencv4/opencv2/videoio \
               /opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/inference_engine/include \
               /opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/inference_engine/include/cpp

LIBS += /usr/local/lib/libopencv_*.so \
        /usr/local/lib/libopencv_highgui.so \
        /usr/local/lib/libopencv_core.so \
        /usr/local/lib/libopencv_imgproc.so \
        /usr/local/lib/libopencv_imgcodecs.so \
        /usr/local/lib/libopencv_video.so \
        /usr/local/lib/libopencv_videoio.so \
        /opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/inference_engine/lib/ubuntu_16.04/intel64/*.so \
        /home/hank/inference_engine_samples_build/intel64/Release/lib/*.so  \
        -ldl -lpthread -fopenmp
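I suspect the `-fopenmp` in LIBS is related, since the `@VERSION`-suffixed symbols look like they come from Intel's versioned OpenMP runtime (libiomp5) rather than GCC's libgomp. Would something like the sketch below be the right way to link it? The `external/omp` path is a guess based on what the 2018 R5 package seems to ship; I have not confirmed it on my install:

```qmake
# Hypothetical fix sketch -- the external/omp path below is an assumption;
# verify it exists in your computer_vision_sdk_2018.5.445 install.

# Pass -fopenmp at both compile and link time (LIBS only affects linking).
QMAKE_CXXFLAGS += -fopenmp
QMAKE_LFLAGS   += -fopenmp

# Link Intel's OpenMP runtime (libiomp5) bundled with OpenVINO,
# which provides the versioned omp_* / GOMP_* symbols.
IE_ROOT = /opt/intel/computer_vision_sdk_2018.5.445/deployment_tools/inference_engine
LIBS += -L$$IE_ROOT/external/omp/lib -liomp5
```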

And my code is:
#include <inference_engine.hpp>

using namespace InferenceEngine;

    // load network from IR
    CNNNetReader netReader;
    netReader.ReadNetwork(PATH_TO_IR_XML);
    netReader.ReadWeights(PATH_TO_IR_BIN);
    // set maximum batch size to be used
    netReader.getNetwork().setBatchSize(1);
    CNNNetwork network = netReader.getNetwork();
    // instantiate a plugin for a target hardware
    InferencePlugin plugin = PluginDispatcher({""}).getPluginByDevice("CPU");
    // create executable network and infer request
    ExecutableNetwork executable_network = plugin.LoadNetwork(network, {});
    InferRequest infer_request = executable_network.CreateInferRequest();

But I got these linker errors:
undefined reference to 'omp_get_thread_num@VERSION'  libinference_engine.so
undefined reference to 'omp_get_max_threads@VERSION'  libinference_engine.so
undefined reference to 'GOMP_parallel@VERSION'  libinference_engine.so
undefined reference to 'omp_get_num_threads@VERSION'  libinference_engine.so

Has anyone experienced this?
Or does anyone know how to write a Qt .pro file that lets OpenVINO work?

Thanks.
