338 Views

Dynamic batch size

Hi

In OpenVINO R3, I can see that there is an option to change the batch size dynamically (C++). This can be done just before loading the network into the plugin.

if (enable_dynamic_batch)
{
    config[PluginConfigParams::KEY_DYN_BATCH_ENABLED] = PluginConfigParams::YES;
}

I want to know if there is any similar functionality available in the Python API. Could you please let me know the correct way to do it in Python?

Thanks

Indrajit

9 Replies
Mark_L_Intel1
Moderator

Hi Indrajit,

It looks like we should, based on the following document:

The plugin's load method does accept a config dictionary as an input parameter, and I would expect it to include the setting you need.

I checked the Python samples and other documentation but didn't find it, so let me check internally.

Mark


Hi Mark

I tried setting it in my python script like this:

exec_net = plugin.load(network=net,config={'KEY_DYN_BATCH_ENABLED':'YES'})

But I got the following error:

RuntimeError: [NOT_FOUND] Unsupported property KEY_DYN_BATCH_ENABLED by CPU plugin

I want to know if that is the correct way to set it in Python.

Thanks

Indrajit

Mikhail_T_Intel
Employee

Hi Indrajit,

To set this config property for the plugin, please pass the key without the 'KEY_' prefix, like this:

exec_net = plugin.load(network=net, config={'DYN_BATCH_ENABLED':'YES'})

Another option to change the batch size is to use the batch_size property of the IENetwork instance, or to use the reshape function (which is preferable in some cases, such as object detection networks), before loading the network to the plugin:

# Setting batch using batch_size
net.batch_size = 8
# Setting batch using reshape
example_shape = (8, 3, 224, 224)
net.reshape({"input_layer_name" : example_shape})
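As a usage sketch of the first option: once the network's batch is set to 8, the input blob expects exactly eight NCHW images, so the host data has to be stacked to that shape before calling exec_net.infer(). The helper below is hypothetical (it is not part of the IE API), assuming float32 NCHW inputs of shape (3, 224, 224):

```python
import numpy as np

def stack_batch(images, batch, shape=(3, 224, 224)):
    """Stack `images` (each of shape C,H,W) into a (batch, C, H, W) array,
    zero-padding the trailing slots if fewer than `batch` images are given."""
    blob = np.zeros((batch, *shape), dtype=np.float32)
    for i, img in enumerate(images[:batch]):
        blob[i] = img
    return blob

# Five images stacked into the batch-8 blob the reshaped network expects.
imgs = [np.ones((3, 224, 224), dtype=np.float32) for _ in range(5)]
blob = stack_batch(imgs, batch=8)   # shape (8, 3, 224, 224)
```

Note that with this static approach the padded slots are still inferred, so their outputs should simply be discarded.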

Regards,

Mikhail


Hi Mikhail

Thanks for your response. Now there is no issue loading the network into the plugin.

For my application, I would load the model once and run inference multiple times, with a different batch size each time. So I want to be able to change the batch size after loading the network to the plugin.

With your suggestion I'm able to load the network without any issue, but I get this error when I send a batch of 2 images instead of one:

ValueError: could not broadcast input array from shape (2,3,72,72) into shape (1,3,72,72)

Shouldn't the config property 'DYN_BATCH_ENABLED': 'YES' allow the network to accept a batch of any size?

Please clarify on this.

Thanks

Indrajit

Mark_L_Intel1
Moderator

Hi Mikhail,

Do we have a document to describe the Python API?

This is the only document I found:

https://software.intel.com/en-us/articles/OpenVINO-InferEngine

Mark

Caspi__Itai
Beginner

Was the 'DYN_BATCH_ENABLED' config name changed in R5?

Using the code described above I now get the following error when using MYRIAD:

RuntimeError: [NOT_FOUND] DYN_BATCH_ENABLED key does not exist

Nikolay_L_Intel1
Employee

Hi, Itai!

Myriad doesn't support dynamic batching. It's supported by the CPU and GPU plugins.

Caspi__Itai
Beginner

Got it. Thanks Nikolay!


I'm running into the same issue as the original poster. How do you enable dynamic batching in 2019_R2, please?

I've tried setting `config={'DYN_BATCH_ENABLED':'YES'}` when loading the network, but I get the error "ValueError: could not broadcast input array from shape (2,3,227,227) into shape (1,3,227,227)".
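For reference, the broadcast error in both posts above arises because the executable network's input blob keeps the batch size the network was loaded with. A minimal NumPy sketch of the usual dynamic-batch workaround, assuming the (2, 3, 72, 72) shapes from earlier in the thread, and treating set_batch as the release-dependent request-level API (check the Python API docs for your IE version):

```python
import numpy as np

# Assumed setup: the network is loaded with dynamic batching enabled and a
# maximum batch of 8, but only 2 images are available for this inference.
max_batch = 8
images = np.random.rand(2, 3, 72, 72).astype(np.float32)

# The input blob is preallocated at the loaded (maximum) batch size, so a
# smaller batch is copied into the front of the buffer rather than assigned
# directly -- direct assignment is what raises the broadcast ValueError.
input_blob = np.zeros((max_batch, 3, 72, 72), dtype=np.float32)
input_blob[: images.shape[0]] = images

# With 'DYN_BATCH_ENABLED': 'YES', the request is then told the effective
# batch before inference, e.g. (names here are assumptions, not verified
# against 2019_R2):
#   request = exec_net.requests[0]
#   request.set_batch(images.shape[0])
#   request.infer({'input_layer_name': input_blob})
```

Only the first images.shape[0] output slots are then meaningful; the rest of the buffer is ignored.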
