Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.
6575 Discussions

Can LoadTensor accept a multidimensional array?

idata
Employee
1,203 Views

Hi,

 

I would like to process a batch of images at once. Is this possible with the Movidius NCS, or can it only process one image at a time?

 

Thank you

6 Replies
idata
Employee
978 Views

@klm LoadTensor doesn’t have any special knowledge of multidimensional arrays. LoadTensor passes a single list of floating point values to the NCS for inference. If the network expects multiple dimensions or multiple images then you would have to pass what the network expects. From LoadTensor's perspective, it’s a single list of floating point values that is passed for a single inference.
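As a minimal sketch of the point above (the tiny 2x2 RGB "image" and the `flatten_image` helper are made up for illustration; the actual NCSDK call takes the preprocessed values as a single array per inference):

```python
# A hypothetical 2x2 RGB image as a nested H x W x C list, as image
# preprocessing might produce. Values are arbitrary.
image = [
    [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]],
    [[0.7, 0.8, 0.9], [1.0, 1.1, 1.2]],
]

def flatten_image(img):
    """Flatten an H x W x C nested list into the single flat list of
    floating point values that one LoadTensor call would consume."""
    return [float(c) for row in img for pixel in row for c in pixel]

tensor = flatten_image(image)
print(len(tensor))  # 2 * 2 * 3 = 12 values for a single inference
```

Whatever layout the network expects, the values end up as one flat sequence for one inference; there is no separate batch dimension on the LoadTensor side.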

 

As for batch processing, the NCS doesn't currently support it. A possible option, depending on your use case, is to thread your app so it preprocesses subsequent images while waiting for the NCS to return the inference results. You can refer to stream_infer.py in the Python examples included with the NCS SDK, located at: https://ncsforum.movidius.com/categories/downloads
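A minimal sketch of that overlap pattern using Python's standard threading and queue modules; `preprocess` and `infer` below are stand-ins for your real preprocessing and for the device calls (e.g. LoadTensor/GetResult on the graph object):

```python
import queue
import threading

def preprocess(frame):
    # Stand-in for real image preprocessing (resize, normalize, ...).
    return [v / 255.0 for v in frame]

def infer(tensor):
    # Stand-in for submitting a tensor to the NCS and reading the result.
    return sum(tensor)

frames = [[10, 20, 30], [40, 50, 60], [70, 80, 90]]
tensors = queue.Queue(maxsize=2)
SENTINEL = object()

def producer():
    # Preprocess upcoming frames while the main thread waits on inference.
    for frame in frames:
        tensors.put(preprocess(frame))
    tensors.put(SENTINEL)

threading.Thread(target=producer).start()

results = []
while True:
    tensor = tensors.get()
    if tensor is SENTINEL:
        break
    results.append(infer(tensor))

print(results)
```

The bounded queue keeps preprocessing at most a couple of frames ahead, so CPU-side work hides behind device inference time instead of running serially after it.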

idata
Employee
978 Views

@Tome_at_Intel are there any plans to add batch support to NCS?

idata
Employee
978 Views

@macsz Thank you for voicing your interest on the Intel Movidius NCS forum. We understand that some of our forum members are interested in this feature. To answer your question: batch image processing is a feature we are considering for a future release; however, we cannot provide an ETA for it at the moment.

idata
Employee
978 Views

@Tome_at_Intel Do you have any news about the batch image processing?

idata
Employee
978 Views

Bump… I'm also interested in batch processing.

idata
Employee
978 Views

@pkfuncs @albertcliment Thanks for your input. We're considering all requests regarding future features for the NCSDK. Although there isn't support for batch processing yet, we have added FIFO queues to smooth the flow when processing a large number of inferences, and to allow multiple graph files/models on one device. You should be able to queue up multiple inferences, and they will be processed in FIFO order.
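A toy model of that FIFO behavior (not the NCSDK API itself; `fake_infer` stands in for the device, and the real calls have different names): inferences are queued up without waiting on each result, and results come back in submission order.

```python
import queue

def fake_infer(tensor):
    # Stand-in for the device running one inference on a queued tensor.
    return max(tensor)

input_fifo = queue.Queue()
output_fifo = queue.Queue()

# Queue several inferences up front, without blocking on each result.
for tensor in ([1, 5, 2], [9, 3, 4], [0, 7, 6]):
    input_fifo.put(tensor)

# Drain: results arrive in the same order the inputs were submitted.
while not input_fifo.empty():
    output_fifo.put(fake_infer(input_fifo.get()))

results = [output_fifo.get() for _ in range(3)]
print(results)  # [5, 9, 7]
```

This is not batching (each inference still runs one at a time), but it keeps the device fed and preserves ordering, which covers many of the same throughput use cases.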
