Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

Dynamic batch inference in Python using OpenVINO 2021.3

nnain1
New Contributor I

I would like to have dynamic batch input for my TensorFlow model.

When I convert the TensorFlow model with the Model Optimizer, I get an error if I pass

--input_shape [-1,24,94,3]

So input_shape is set to batch size 1 instead.
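
For reference, since Model Optimizer in 2021.3 rejects -1 in --input_shape, a conversion with a fixed maximum batch instead of -1 would look roughly like the line below (the .pb file name and the batch of 8 are just placeholders):

python mo_tf.py --input_model model.pb --input_shape [8,24,94,3]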

 

Then, inside the program, I tried to enable dynamic batching with

 

rec_exec_net = rec_ie.load_network(rec_net, args.device, {"DYN_BATCH_ENABLED": "YES"})

 

But when I run the program, I get this error:

ValueError: could not broadcast input array from shape (2,3,24,94) into shape (1,3,24,94)

at the line

request_wrap.execute("async", {rec_input_blob: pimages})

 

How can I do dynamic batching with OpenVINO in Python?

4 Replies
Vladimir_Dudnik
Employee

Please refer to the Using Dynamic Batching article in the OpenVINO online documentation.
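
For reference, that article's flow, translated to the 2021.3 Python Inference Engine API, would look roughly like the sketch below. It is only a sketch: the IR file names and MAX_BATCH are placeholders, and whether InferRequest exposes set_batch the way the C++ API exposes SetBatch is an assumption, not something confirmed in this thread. The broadcast error above happens because the network was loaded with batch size 1, so it must first be reshaped to the largest batch it will ever see.

from openvino.inference_engine import IECore
import numpy as np

MAX_BATCH = 8  # placeholder upper bound on the batch size

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")  # placeholder IR names
input_blob = next(iter(net.input_info))
out_blob = next(iter(net.outputs))

# Reshape the network to the largest batch it will ever be asked to run.
net.batch_size = MAX_BATCH

exec_net = ie.load_network(network=net, device_name="CPU",
                           config={"DYN_BATCH_ENABLED": "YES"}, num_requests=1)
request = exec_net.requests[0]

def infer_batch(images):
    # images: numpy array of shape (n, 3, 24, 94) with n <= MAX_BATCH
    n = images.shape[0]
    request.input_blobs[input_blob].buffer[:n] = images  # fill only the first n slots
    request.set_batch(n)  # assumption: Python counterpart of the C++ SetBatch call
    request.infer()
    return request.output_blobs[out_blob].buffer[:n]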

nnain1
New Contributor I

Yes, I know that one. That is a C++ sample, and I am looking for a Python sample.

IntelSupport
Community Manager

Hi nnain1,

Thanks for reaching out. Unfortunately, we do not have anything for dynamic batching in Python yet; it is only available through the C++ configuration at the moment.

 

Regards,

Aznie
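
A common fallback in that situation, sketched here under the same placeholder names as in the earlier sketch and not taken from this thread, is to skip DYN_BATCH_ENABLED entirely: keep the IR at a fixed maximum batch, pad each smaller batch up to that size, and discard the padded outputs.

def infer_padded(exec_net, input_blob, out_blob, images, max_batch=8):
    # images: (n, 3, 24, 94) with n <= max_batch; the network was loaded with batch max_batch
    n = images.shape[0]
    padded = np.zeros((max_batch,) + images.shape[1:], dtype=np.float32)
    padded[:n] = images
    results = exec_net.infer({input_blob: padded})  # synchronous inference on the padded batch
    return results[out_blob][:n]  # keep only the rows that correspond to real inputs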


IntelSupport
Community Manager

Hi nnain1,

This thread will no longer be monitored since we have provided a solution. If you need any additional information from Intel, please submit a new question.


Regards,

Aznie

