python3 not working


I have been trying for days to resolve the problem below, and I would really appreciate some guidance:

1. Running the object_detection_demo_ssd_async demo succeeds, as shown below:

msupra@AI3:~/inference_engine_samples_build/intel64/Release$ ./object_detection_demo_ssd_async -i cam -m /opt/intel/computer_vision_sdk/inference_engine/samples/python_samples/frozen_inference_graph.xml -d CPU
    API version ............ 1.4
    Build .................. 19154
[ INFO ] Parsing input parameters
[ INFO ] Reading input

(object_detection_demo_ssd_async:2041): GStreamer-CRITICAL **: gst_element_get_state: assertion 'GST_IS_ELEMENT (element)' failed
[ INFO ] Loading plugin

    API version ............ 1.5
    Build .................. lnx_20181004
    Description ....... MKLDNNPlugin
[ INFO ] Loading network files
[ INFO ] Batch size is forced to  1.
[ INFO ] Checking that the inputs are as the demo expects
[ INFO ] Checking that the outputs are as the demo expects
[ INFO ] Loading model to the plugin
[ INFO ] Start inference 
Only 2 proposals found

2. When running the example Python scripts with the same model, I get the following two errors, one when choosing device CPU and one when choosing device MYRIAD:

msupra@AI3:/opt/intel/computer_vision_sdk/inference_engine/samples/python_samples$ python3 -i cam -m frozen_inference_graph.xml -d CPU
[ INFO ] Initializing plugin for CPU device...
[ INFO ] Reading IR...
[ ERROR ] Following layers are not supported by the plugin for specified device CPU:
 PriorBoxClustered_5, PriorBoxClustered_4, PriorBoxClustered_3, PriorBoxClustered_2, PriorBoxClustered_1, PriorBoxClustered_0, DetectionOutput
[ ERROR ] Please try to specify cpu extensions library path in demo's command line parameters using -l or --cpu_extension command line argument


msupra@AI3:/opt/intel/computer_vision_sdk/inference_engine/samples/python_samples$ python3 -i cam -m frozen_inference_graph.xml -d MYRIAD
[ INFO ] Initializing plugin for MYRIAD device...
[ INFO ] Reading IR...
[ INFO ] Loading IR to the plugin...
Traceback (most recent call last):
  File "", line 182, in <module>
    sys.exit(main() or 0)
  File "", line 77, in main
    exec_net = plugin.load(network=net, num_requests=2)
  File "ie_api.pyx", line 389, in openvino.inference_engine.ie_api.IEPlugin.load
  File "ie_api.pyx", line 400, in openvino.inference_engine.ie_api.IEPlugin.load
RuntimeError: [VPU] Unsupported network precision : FP32
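The MYRIAD plugin accepts only FP16 models, so the usual fix (not confirmed in this thread) is to regenerate the IR with the Model Optimizer using --data_type FP16. A minimal sketch, where the mo_tf.py path assumes the default computer_vision_sdk install and the frozen graph name is illustrative:

```shell
# Sketch only: MO path assumes the default OpenVINO install location, and
# frozen_inference_graph.pb is assumed to be the original TensorFlow model.
MO=/opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/mo_tf.py
if [ -f "$MO" ]; then
    # --data_type FP16 emits an FP16 IR, which the MYRIAD plugin can load.
    python3 "$MO" \
        --input_model frozen_inference_graph.pb \
        --data_type FP16
else
    echo "mo_tf.py not found -- check your OpenVINO install path"
fi
```

A TensorFlow Object Detection API SSD model may also need the pipeline-config related Model Optimizer flags used when the original FP32 IR was generated; reuse whatever flags produced the working XML, adding only --data_type FP16.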


If I can get some guidance on how to resolve these issues, I will finally be getting somewhere with the Python API.




1 Reply

It looks like you are on Linux, but the path to the CPU extensions DLL on Windows is here:


Look for the equivalent library on your system and pass it in with the -l switch. Note that you must build inference_engine\samples first, however, to get the DLL.
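On Linux the CPU extensions library is a shared object rather than a DLL. A sketch of the full invocation, where both the Python demo filename and the library path are assumptions based on the default sample build layout:

```shell
# Sketch: both paths are guesses from the default OpenVINO sample layout;
# libcpu_extension.so only exists after the samples have been built.
EXT=$HOME/inference_engine_samples_build/intel64/Release/lib/libcpu_extension.so
if [ -f "$EXT" ]; then
    # -l loads the extensions library so PriorBoxClustered and
    # DetectionOutput layers resolve on the CPU plugin.
    python3 object_detection_demo_ssd_async.py \
        -i cam -m frozen_inference_graph.xml -d CPU \
        -l "$EXT"
else
    echo "libcpu_extension.so not found -- build the samples first"
fi
```

Note that the original python3 commands above also omit the demo script filename, which is why the tracebacks show File ""; the script must be named on the command line before its arguments.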