idata
Community Manager

Queuing multiple inferences in NCSDK v2

Hi,

 

I'm trying to queue multiple images for inference on the NCS with the NCSDK v2 APIs. My assumption is that multiple images are processed as a batch and the outputs are returned together. At present, images are sent for inference sequentially in a loop, like this:

 

for img in list_img:
    graph.queue_inference_with_fifo_elem(fifoIn, fifoOut, img, 'user object')

 

My concerns are:

  • Can you please confirm whether there is a better way to use the v2 APIs for batch processing?

  • I'm assuming that if a single image takes 500 ms to infer and we submit 5 images to the queue, inference should still complete in ~500-600 ms (since it is batch processing). Is my assumption correct?

 

Please provide any comments or suggestions in this regard.
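For context, the full v2 pattern I'm using queues each tensor into the input FIFO and reads each result back from the output FIFO. Below is a hardware-free sketch of that write-then-read pattern; the FakeGraph/FakeFifo classes are hypothetical stand-ins I made up for illustration (the real calls would be mvncapi's graph.queue_inference_with_fifo_elem() and fifo_out.read_elem()):

```python
from collections import deque

class FakeFifo:
    """Hypothetical stand-in for an mvncapi Fifo: a plain FIFO queue."""
    def __init__(self):
        self._q = deque()
    def write_elem(self, item):
        self._q.append(item)
    def read_elem(self):
        return self._q.popleft()

class FakeGraph:
    """Hypothetical stand-in for an mvncapi Graph (no real inference here)."""
    def queue_inference_with_fifo_elem(self, fifo_in, fifo_out, tensor, user_obj):
        fifo_in.write_elem(tensor)
        # Pretend the device consumed the input and produced an output.
        fifo_out.write_elem((f"output-for-{tensor}", user_obj))

graph, fifo_in, fifo_out = FakeGraph(), FakeFifo(), FakeFifo()
list_img = ["img0", "img1", "img2"]

# Queue every tensor, then drain the results.
for img in list_img:
    graph.queue_inference_with_fifo_elem(fifo_in, fifo_out, img, "user object")

# Results come back in the same order the tensors were queued.
results = [fifo_out.read_elem() for _ in list_img]
print(results[0])  # ('output-for-img0', 'user object')
```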

 

Thanks

 

Madhusudhan S
2 Replies
idata
Community Manager

@madhusudhan_s It's a queue, so if you send in five images, it processes each image one at a time. We don't support batch image processing at the moment. I don't know of any workarounds with the v2 API, because one device can only process one inference at a time with the current NCSDK 2.04.00.06.
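To make the consequence of sequential processing concrete, here is a back-of-envelope check of the timing assumption from the question (the 500 ms and 600 ms figures are the ones assumed there, not measured values):

```python
# With one-at-a-time processing, total time scales linearly with queue depth.
single_inference_ms = 500   # per-image latency assumed in the question
num_images = 5

# What the NCS actually does: each queued inference runs back to back.
sequential_total_ms = num_images * single_inference_ms

# What the question hoped batching would give.
batched_hope_ms = 600

print(sequential_total_ms)  # 2500, not ~600
```

So queuing five images takes roughly 5 × 500 ms ≈ 2500 ms, not the ~500-600 ms a true batch mode would give.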

idata
Community Manager

Thanks for the clarification, @Tome_at_Intel!
