Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

python3 object_detection_demo_ssd_async.py not working

Supra__Morne_
Beginner

Hi

I have been trying for days to resolve the problem below, and I would really appreciate some guidance in resolving it:

1. The output below shows a successful run of the compiled object_detection_demo_ssd_async demo:

msupra@AI3:~/inference_engine_samples_build/intel64/Release$ ./object_detection_demo_ssd_async -i cam -m /opt/intel/computer_vision_sdk/inference_engine/samples/python_samples/frozen_inference_graph.xml -d CPU
InferenceEngine: 
    API version ............ 1.4
    Build .................. 19154
[ INFO ] Parsing input parameters
[ INFO ] Reading input

(object_detection_demo_ssd_async:2041): GStreamer-CRITICAL **: gst_element_get_state: assertion 'GST_IS_ELEMENT (element)' failed
[ INFO ] Loading plugin

    API version ............ 1.5
    Build .................. lnx_20181004
    Description ....... MKLDNNPlugin
[ INFO ] Loading network files
[ INFO ] Batch size is forced to  1.
[ INFO ] Checking that the inputs are as the demo expects
[ INFO ] Checking that the outputs are as the demo expects
[ INFO ] Loading model to the plugin
[ INFO ] Start inference 
Only 2 proposals found

2. When I run the equivalent Python script with the same model, I get the following two errors when choosing device CPU and device MYRIAD:

msupra@AI3:/opt/intel/computer_vision_sdk/inference_engine/samples/python_samples$ python3 object_detection_demo_ssd_async.py -i cam -m frozen_inference_graph.xml -d CPU
[ INFO ] Initializing plugin for CPU device...
[ INFO ] Reading IR...
[ ERROR ] Following layers are not supported by the plugin for specified device CPU:
 PriorBoxClustered_5, PriorBoxClustered_4, PriorBoxClustered_3, PriorBoxClustered_2, PriorBoxClustered_1, PriorBoxClustered_0, DetectionOutput
[ ERROR ] Please try to specify cpu extensions library path in demo's command line parameters using -l or --cpu_extension command line argument


msupra@AI3:/opt/intel/computer_vision_sdk/inference_engine/samples/python_samples$ python3 object_detection_demo_ssd_async.py -i cam -m frozen_inference_graph.xml -d MYRIAD
[ INFO ] Initializing plugin for MYRIAD device...
[ INFO ] Reading IR...
[ INFO ] Loading IR to the plugin...
Traceback (most recent call last):
  File "object_detection_demo_ssd_async.py", line 182, in <module>
    sys.exit(main() or 0)
  File "object_detection_demo_ssd_async.py", line 77, in main
    exec_net = plugin.load(network=net, num_requests=2)
  File "ie_api.pyx", line 389, in openvino.inference_engine.ie_api.IEPlugin.load
  File "ie_api.pyx", line 400, in openvino.inference_engine.ie_api.IEPlugin.load
RuntimeError: [VPU] Unsupported network precision : FP32
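
From what I understand, the MYRIAD (VPU) plugin only accepts FP16 IRs, so I suspect the model has to be re-converted with the Model Optimizer at FP16 precision, roughly along these lines (I have left out the extra flags that were used for the original SSD conversion, since I am not sure of the exact set):

python3 /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/mo_tf.py --input_model frozen_inference_graph.pb --data_type FP16

As far as I can tell, the --data_type FP16 switch is the part that matters for MYRIAD, but please correct me if that is the wrong direction.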


If I can get some guidance on how to resolve these issues, I will finally be getting somewhere with the Python API.


Regards

Morne

Shubha_R_Intel
Employee

It looks like you are on Linux; for reference, the path to the CPU extensions DLL on Windows is:

C:\Users\<User>\Documents\Intel\OpenVINO\inference_engine_samples_2017\intel64\Release\cpu_extension.dll

On Linux, look for cpu_extension.so and pass it in with the -l switch. However, you must build the inference_engine samples first to get cpu_extension.so (or the .dll on Windows).
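
For example, the command would look roughly like this on Linux (the exact location and name of the built library depend on your build output, so treat the path below as a placeholder):

python3 object_detection_demo_ssd_async.py -i cam -m frozen_inference_graph.xml -d CPU -l ~/inference_engine_samples_build/intel64/Release/lib/libcpu_extension.so

The layers listed in your error (PriorBoxClustered, DetectionOutput) are implemented in that extension library, which is why the CPU plugin reports them as unsupported until it is loaded with -l.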
