Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

Human Pose Estimation Demo use RealSense as InputDevice

Pironet__Don
Beginner

I'm trying to run the Human Pose Estimation Demo with the RealSense Depth Camera D435 on macOS.
 sudo ./human_pose_estimation_demo -m /Users/mars/intel/openvino_2019.3.376/deployment_tools/open_model_zoo/tools/downloader/intel/human-pose-estimation-0001/FP32/human-pose-estimation-0001.xml -i /dev/bus/usb/020/017
Password:
InferenceEngine:
	API version ............ 2.1
	Build .................. 32974
	Description ....... API
Parsing input parameters
[ ERROR ] Failed to create plugin /Users/mars/intel/openvino_2019.3.376/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib for device CPU
Please, check your environment
Cannot load library '/Users/mars/intel/openvino_2019.3.376/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib': dlopen(/Users/mars/intel/openvino_2019.3.376/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib, 1): Library not loaded: @rpath/libmkl_tiny_tbb.dylib
  Referenced from: /Users/mars/intel/openvino_2019.3.376/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib
  Reason: image not found

I found the input device name via the RealSense Viewer:

[Attachment: Screenshot 2019-12-18 at 16.02.22.png]

How can I use the camera as input device?

1 Reply
Yurdan__Muhammet
Beginner

That's because the program cannot find libmkl_tiny_tbb.dylib, which is needed to load libMKLDNNPlugin.dylib. I believe the easiest way to fix this is to copy the file next to the other libraries. Hopefully, the following command fixes the problem.

Do not forget to execute the command after initializing the environment variables (i.e., after sourcing setupvars.sh, so that $INTEL_OPENVINO_DIR is set):

sudo cp $INTEL_OPENVINO_DIR/deployment_tools/inference_engine/external/mkltiny_mac/lib/libmkl_tiny_tbb.dylib $INTEL_OPENVINO_DIR/deployment_tools/inference_engine/lib/intel64/
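To see what the copy accomplishes, here is a minimal sketch using throwaway temp directories that mimic the two OpenVINO locations (the real paths live under $INTEL_OPENVINO_DIR; the empty file here is just a stand-in for the real dylib):

```shell
#!/bin/sh
# Stand-in directories mimicking the 2019.3 install layout (assumption).
ROOT=$(mktemp -d)
SRC="$ROOT/deployment_tools/inference_engine/external/mkltiny_mac/lib"
DST="$ROOT/deployment_tools/inference_engine/lib/intel64"
mkdir -p "$SRC" "$DST"
: > "$SRC/libmkl_tiny_tbb.dylib"   # placeholder for the real library

# The fix: place the missing library next to libMKLDNNPlugin.dylib,
# so the dynamic loader's @rpath lookup can resolve it at dlopen time.
cp "$SRC/libmkl_tiny_tbb.dylib" "$DST/"

test -f "$DST/libmkl_tiny_tbb.dylib" && echo "library in place"
```

Alternatively, instead of copying you could add the library's original directory to DYLD_LIBRARY_PATH before launching the demo; copying is simply the least fragile option when running the binary with sudo.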