N_K__Kameshwari
Beginner

Running Async Inference with Python

We are using OpenVINO_2019.2.242

We are trying to perform DL inferences on HDDL-R in async mode. Our requirement is to run multiple infer-requests in a pipeline. 

The requirement is similar to the async security barrier C++ demo in the OpenVINO example programs (/opt/intel/openvino/deployment_tools/open_model_zoo/demos/security_barrier_camera_demo). That program uses multi-threading. Would the same security barrier demo behave differently if it were implemented with Python multi-threading?

1. Is there example code available for async inference in Python?

2. The OpenVINO documentation mentions callback functions that are invoked after inference completes in async mode. Does this remove the need to run the post-inference code in a separate thread? Is example code for this available?

3. All OpenVINO example programs for multi-node pipelines using async inference are in C++. Can any Python examples be shared?

Regards,
Kameshwari.

1 Reply
Shubha_R_Intel
Employee

Dear N K, Kameshwari

We have an async Python classification sample (classification_sample_async). Would that work for you?

Keep in mind that because of the GIL, Python threads cannot execute Python bytecode in parallel; for CPU-bound parallelism, Python programs typically use processes instead. C++ handles native threads directly. For OpenVINO's purposes, however, Python calls into the Inference Engine through the OpenVINO Python API, and the Inference Engine itself is multi-threaded: its native worker threads are not constrained by the GIL.
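To illustrate the pipelining the question asks about, here is a minimal, runnable Python sketch of the rotating "N requests in flight" pattern that the async demos use. It is a stand-in, not OpenVINO code: `fake_infer` and the thread pool simulate the device, and the comments note the hypothetical mapping onto the 2019.2-era Python API calls (`exec_net.start_async(request_id=...)` and `exec_net.requests[i].wait(-1)`); model loading and frame capture are omitted.

```python
# Sketch of a multi-request async pipeline. A ThreadPoolExecutor stands in
# for the accelerator; in real OpenVINO code each "slot" would correspond to
# exec_net.requests[i], submission to exec_net.start_async(request_id=i, ...),
# and collection to exec_net.requests[i].wait(-1).
from concurrent.futures import ThreadPoolExecutor
import time

NUM_REQUESTS = 4  # analogous to num_requests= when loading the network


def fake_infer(frame):
    """Simulated inference: small latency, trivial 'result'."""
    time.sleep(0.01)
    return frame * 2


def run_pipeline(frames, num_requests=NUM_REQUESTS):
    """Keep num_requests inferences in flight; return results in frame order."""
    results = {}
    slots = [None] * num_requests  # slots[i] ~ exec_net.requests[i]
    with ThreadPoolExecutor(max_workers=num_requests) as pool:
        for i, frame in enumerate(frames):
            slot = i % num_requests
            if slots[slot] is not None:
                # Slot busy: wait for its previous inference (~ wait(-1)).
                idx, fut = slots[slot]
                results[idx] = fut.result()
            # Submit the next frame on this slot (~ start_async).
            slots[slot] = (i, pool.submit(fake_infer, frame))
        # Drain the requests still in flight at end of input.
        for entry in slots:
            if entry is not None:
                idx, fut = entry
                results[idx] = fut.result()
    return [results[i] for i in sorted(results)]
```

The key point is that while one frame is being inferred, the host thread is free to submit the next frames to the other request slots, so device and host work overlap even though the driving loop is single-threaded Python.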

Please take a look at the OpenVINO Python API documentation and the "Integrate the Inference Engine" guide. I think your questions should be answered by those documents, but let me know if you have further questions.

Thanks,

Shubha