Hi,
I would like to process a batch of images at once. Is this possible with the Movidius NCS, or can it only handle one image at a time?
Thank you
@klm LoadTensor has no special knowledge of multidimensional arrays; it passes a single, flat list of floating-point values to the NCS for one inference. If the network expects multiple dimensions (or multiple images), you have to shape the data the way the network expects before passing it. From LoadTensor's perspective, it is always a single list of floating-point values submitted for a single inference.
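For reference, a minimal single-inference sketch with the NCSDK 1.x Python API might look like the following. The graph file name, image path, and 224x224 input size are placeholders, and any model-specific mean/scale preprocessing is omitted:

```python
import numpy
import cv2
from mvnc import mvncapi as mvnc

# Open the first attached NCS device.
devices = mvnc.EnumerateDevices()
device = mvnc.Device(devices[0])
device.OpenDevice()

# Load a compiled graph file onto the device (placeholder file name).
with open('graph', mode='rb') as f:
    graph_buffer = f.read()
graph = device.AllocateGraph(graph_buffer)

# Preprocess one image into the flat float16 tensor the network expects
# (224x224x3 is assumed here; adjust to your model).
img = cv2.imread('image.jpg')
img = cv2.resize(img, (224, 224)).astype(numpy.float16)

# LoadTensor submits exactly one inference; GetResult blocks until it is done.
graph.LoadTensor(img, 'image.jpg')
output, user_obj = graph.GetResult()

graph.DeallocateGraph()
device.CloseDevice()
```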
As for batch processing, the NCS currently doesn't support batch image processing. Depending on your use case, one option is to thread your application so it preprocesses the next image while waiting for the NCS to return the current inference result (see the sketch below). You can also refer to stream_infer.py in the Python examples included with the NCS SDK, available at: https://ncsforum.movidius.com/categories/downloads
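A rough illustration of that threading idea, assuming the NCSDK 1.x graph object from the sketch above. This is not the stream_infer.py code itself; the paths, input size, and helper names are made up for the example:

```python
import queue
import threading
import numpy
import cv2

def preprocess_worker(image_paths, out_q):
    """Resize/convert images on the CPU and hand them to the inference loop."""
    for path in image_paths:
        img = cv2.resize(cv2.imread(path), (224, 224)).astype(numpy.float16)
        out_q.put((path, img))
    out_q.put(None)  # sentinel: no more work

def run(graph, image_paths):
    # Small bounded queue so preprocessing stays a few frames ahead of the NCS.
    q = queue.Queue(maxsize=4)
    threading.Thread(target=preprocess_worker,
                     args=(image_paths, q), daemon=True).start()
    while True:
        item = q.get()
        if item is None:
            break
        path, tensor = item
        graph.LoadTensor(tensor, path)        # submit one inference to the NCS
        output, user_obj = graph.GetResult()  # worker keeps preprocessing meanwhile
        print(path, output.argmax())
```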
@Tome_at_Intel are there any plans to add batch support to NCS?
@macsz Thank you for voicing your interest on the Intel Movidius NCS forum. We understand that some forum members are interested in this feature. To answer your question: batch image processing is a feature we are considering for a future release; however, we cannot provide an ETA for it at the moment.
@Tome_at_Intel Do you have any news about batch image processing?
Bump… again interested in batch processing
@pkfuncs @albertcliment Thanks for your input. We're considering all requests regarding future features for the NCSDK. Although there isn't support for batch processing yet, we have added FIFO queues to smooth the flow when processing a large number of inferences, and you can now use multiple graph files/models with one device. You can queue up multiple inferences and they will be processed in FIFO order.
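As a rough sketch of that FIFO flow, assuming the NCSDK 2.x Python API (the graph file name, image paths, 224x224 input size, and the number of queued images are placeholders):

```python
import numpy
import cv2
from mvnc import mvncapi

# Open the first attached NCS device.
device_list = mvncapi.enumerate_devices()
device = mvncapi.Device(device_list[0])
device.open()

# Load a compiled graph file and create its input/output FIFO queues.
with open('graph', mode='rb') as f:
    graph_buffer = f.read()
graph = mvncapi.Graph('my_graph')
input_fifo, output_fifo = graph.allocate_with_fifos(device, graph_buffer)

# Queue several single-image inferences back to back; the device works
# through them in FIFO order.
image_paths = ['a.jpg', 'b.jpg', 'c.jpg']
for path in image_paths:
    img = cv2.resize(cv2.imread(path), (224, 224)).astype(numpy.float32)
    graph.queue_inference_with_fifo_elem(input_fifo, output_fifo, img, path)

# Read the results back in the same order they were queued.
for _ in image_paths:
    output, user_obj = output_fifo.read_elem()
    print(user_obj, output.argmax())

input_fifo.destroy()
output_fifo.destroy()
graph.destroy()
device.close()
```

Note that each queued element is still a single inference; the FIFOs overlap host-side work with device-side work rather than running a true batched forward pass.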