
async mode not working with ssd_mobilenet

Hello, 

I have a problem trying to use the Inference Engine in async mode (with an SSD MobileNet v1).

My code worked with the previous release. I compared it with samples/python/object_detection_demo_ssd_async.py, and my code looks very similar.

# Run inference
if self.is_async_mode:
    self.exec_net.start_async(request_id=self.next_request_id,
                              inputs={self.input_blob: in_frame})
else:
    self.exec_net.start_async(request_id=self.cur_request_id,
                              inputs={self.input_blob: in_frame})
if self.exec_net.requests[self.cur_request_id].wait(-1) == 0:
    # Parse detection results of the current request
    res = self.exec_net.requests[self.cur_request_id].outputs[self.out_blob]

 

When 'self.is_async_mode' is set to False, it works; when it is set to True, it doesn't.

I get the following error:

   ....

  File "/media/tameimpala/Data/Logos/src/OV_utils.py", line 63, in detect_objects
    inputs={self.input_blob: in_frame})
  File "ie_api.pyx", line 138, in inference_engine.ie_api.ExecutableNetwork.start_async
  File "ie_api.pyx", line 166, in inference_engine.ie_api.InferRequest.async_infer
  File "ie_api.pyx", line 170, in inference_engine.ie_api.InferRequest.async_infer
RuntimeError: [REQUEST_BUSY] 
/teamcity/work/scoring_engine_build/releases_openvino-2018-r4/ie_bridges/python/inference_engine/ie_api_impl.cpp:334
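For reference, the async demo I compared against alternates between two infer requests and swaps cur_request_id and next_request_id after each frame, so a request is only waited on after it was started on the previous iteration. Below is a minimal sketch of that ping-pong pattern in plain Python — a thread pool stands in for the infer requests and fake_infer for the network, so none of the names are OpenVINO API calls:

```python
# Hypothetical sketch (no OpenVINO): mimics the demo's two-request
# "ping-pong" pattern, with a thread pool standing in for infer requests.
from concurrent.futures import ThreadPoolExecutor

def fake_infer(frame):
    # Stand-in for the network: just echo the frame id back as a "detection".
    return {"detections": frame}

pool = ThreadPoolExecutor(max_workers=2)
requests = [None, None]          # two in-flight slots, like exec_net.requests
cur_request_id, next_request_id = 0, 1

results = []
for frame in range(4):
    # Start the new frame on the *next* request slot...
    requests[next_request_id] = pool.submit(fake_infer, frame)
    # ...then wait on the *current* one (started on the previous iteration).
    if requests[cur_request_id] is not None:
        results.append(requests[cur_request_id].result()["detections"])
    # Swap IDs so the slot we just waited on is reused for the next frame.
    cur_request_id, next_request_id = next_request_id, cur_request_id

# Drain the last in-flight request after the loop.
results.append(requests[cur_request_id].result()["detections"])
print(results)  # frames come out in order: [0, 1, 2, 3]
```

Without the swap at the end of each iteration, the same slot would receive start_async again while still in flight, which is the kind of situation the REQUEST_BUSY error describes.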

I didn't find anything about this online; is it a known issue?

Thank you.

1 Reply
Severine_H_Intel
Employee

Dear Jean-Charles, 

I could reproduce your issue; I will escalate it to our dev team so that it gets solved.

Best, 

Severine
