Hello, does OpenVINO support dynamic batching?
On other platforms such as TensorFlow, when you pass a batch of images you get a batch of outputs. In OpenVINO the equivalent option is enabled via dynamic batching, but dynamic batching doesn't work for all kinds of topologies.
My question is: how do I use batch prediction for models like MobileNet-SSD, Faster R-CNN, etc.?
Hello Prateek.
The OpenVINO toolkit supports dynamic batching, but only on CPU and GPU devices, and only for topologies that contain specific supported layers - please find more details here: https://docs.openvinotoolkit.org/latest/_docs_IE_DG_DynamicBatching.html
So please make sure the model you use doesn't contain unsupported layers.
Best regards, Max.
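To make this concrete, here is a minimal sketch of enabling dynamic batching with the (older) Inference Engine Python API that the linked docs describe. The model paths (`model.xml`/`model.bin`), `MAX_BATCH`, and the input blob are hypothetical placeholders. The `pad_batch` helper is also a common workaround for topologies where dynamic batching itself is unsupported: compile the network at a fixed maximum batch, pad partial batches, and discard the extra outputs.

```python
# Sketch of dynamic batching with OpenVINO's Inference Engine Python API.
# model.xml / model.bin and MAX_BATCH are assumed placeholders.

def pad_batch(images, max_batch):
    """Pad a partial batch with copies of the first image so its length
    equals max_batch. Returns (padded_list, real_count) so that the caller
    can drop the outputs corresponding to the filler frames."""
    real = len(images)
    if real > max_batch:
        raise ValueError("batch larger than the network's max batch")
    padded = list(images) + [images[0]] * (max_batch - real)  # filler frames
    return padded, real

if __name__ == "__main__":
    try:
        from openvino.inference_engine import IECore
    except ImportError:
        IECore = None  # OpenVINO not installed; skip the device part

    if IECore is not None:
        MAX_BATCH = 8
        ie = IECore()
        net = ie.read_network(model="model.xml", weights="model.bin")
        net.batch_size = MAX_BATCH  # compile for the maximum batch size
        exec_net = ie.load_network(
            network=net, device_name="CPU",
            config={"DYN_BATCH_ENABLED": "YES"})  # turn on dynamic batching
        request = exec_net.requests[0]
        request.set_batch(4)  # this call will process only 4 of the 8 slots
        # request.infer({...}); then read back only the first 4 outputs
```

Note that `set_batch` may only shrink the batch at inference time; the network must be loaded with the largest batch you ever intend to use.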
Hello,
Regarding batch prediction: in the benchmarking tool we can define a batch size, and the output is the model's latency at that batch size. How does that work? Can you please shed some light?
Hi @pkhan10
Usually, batching improves throughput; however, a high batch size comes with a latency penalty. Depending on your inference device, to achieve the best results we recommend trying different batch size values in combination with other parameters (such as -nstreams) in order to find a sweet spot.
Please see more details about it in the following articles:
Performance Topics - https://docs.openvinotoolkit.org/latest/_docs_IE_DG_Intro_to_Performance.html
Optimization Guide - https://docs.openvinotoolkit.org/latest/_docs_optimization_guide_dldt_optimization_guide.html
Hope this helps.
Best regards, Max.