Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Allowed batch size for Dynamic Batch Size confusing

Blasutig__Robert
Beginner

I've just started using dynamic batch sizes using Python and OpenVINO 2020.3.  My Python version is 3.7.6, and my numpy version is 1.18.1.  I have Googled this error and some solutions pointed to numpy, which is why I bring that up.

If I have a network that I initialize to a max batch size of 4, and specify a batch size of 2 at runtime, I get an error like this:

ValueError: cannot copy sequence with size 2 to array axis with dimension 4

Batch sizes of 1 and 4 work, but nothing in between.

I initialize as follows:

device_type="CPU"
batch_size = 4
...
ie.set_config(config={"DYN_BATCH_ENABLED": "YES"}, device_name=device_type)
net.batch_size = batch_size
n, c, h, w = net.inputs[input_blob].shape
net.reshape({input_blob : (batch_size, c, h, w)})

And at runtime when I want to change the batch size:

exec_net.requests[0].set_batch(2)

Is this known behavior, or is there a key step missing?  I have left out a lot of code, if that is required to help troubleshoot, let me know.
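For what it's worth, the same kind of shape mismatch can be reproduced in plain numpy (this is only an illustration of the broadcasting failure, not OpenVINO's actual internals; the 227x227 shape is just an example):

```python
import numpy as np

# Input blob preallocated for the max batch size (4 here), and a
# partial batch of 2 frames that fails to broadcast into it.
input_buffer = np.zeros((4, 3, 227, 227), dtype=np.float32)
frames = np.ones((2, 3, 227, 227), dtype=np.float32)

try:
    input_buffer[:] = frames  # (2, ...) cannot broadcast into (4, ...)
except ValueError as err:
    print(err)

# Filling only the leading slots succeeds:
input_buffer[:frames.shape[0]] = frames
```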

7 Replies
Iffa_Intel
Moderator

Greetings,

You get the error because your program is supplying fewer than 4 items of data.

The input buffer has 4 slots to fill (because you hardcoded batch_size = 4), so it needs 4 items, which are not there.

You tried to override this by calling .set_batch(2), but it does not seem to work.

To change the batch size dynamically during execution, you need to build that into your program's logic.

Instead of declaring batch_size = 4, try to:

global batch_size
def set_batch(x):
    return x
    batch_size = x
    print(str(batch_size))

This logic would return any value passed into the set_batch() function and assign it to the batch_size variable. Adjust it to fit your own functions afterward.

Hope my answer helps!

Sincerely,

Iffa

Blasutig__Robert
Beginner

I'm not doing anything unusual, and the sample code doesn't suggest any tricks are necessary. There is a unit test in OpenVINO that does essentially the same thing I am doing.

def test_set_batch_size(device):
    ie_core = ie.IECore()
    ie_core.set_config({"DYN_BATCH_ENABLED": "YES"}, device)
    net = ie_core.read_network(test_net_xml, test_net_bin)
    net.batch_size = 10
    data = np.zeros(shape=net.input_info['data'].input_data.shape)
    exec_net = ie_core.load_network(net, device)
    data[0] = read_image()[0]
    request = exec_net.requests[0]
    request.set_batch(1)
    request.infer({'data': data})
    assert np.allclose(int(round(request.output_blobs['fc_out'].buffer[0][2])), 1), "Incorrect data for 1st batch"
    del exec_net
    del ie_core
    del net

This unit test sets the batch size dynamically to 1. 

In the documentation for InferRequest.set_batch, the specified batch_size assigned to IENetwork is implied to be a MAX size, not hardcoded.

https://docs.openvinotoolkit.org/latest/ie_python_api/classie__api_1_1InferRequest.html#a7598a35081e9beb4a67175acb371dd3c

ie = IECore()
net = ie.read_network(model=path_to_xml_file, weights=path_to_bin_file)
# Set max batch size
net.batch = 10
ie.set_config(config={"DYN_BATCH_ENABLED": "YES"}, device_name=device)
exec_net = ie.load_network(network=net, device_name=device)
# Set batch size for certain network.
# NOTE: Input data shape will not be changed, but will be used partially in inference which increases performance
exec_net.requests[0].set_batch(2)

What I've observed is that this sample with batch size of 2 will not work.  Only batch size of 1 or the MAX batch size will work.  The unit test will pass, but this code example will not.  

By the way, I don't understand how your sample code can execute any logic after the return statement.

Iffa_Intel
Moderator

Pardon,

I thought exec_net.requests[0].set_batch(2) was being executed in the terminal. The code I gave is just an example of a set_batch(x) function, where x is whatever the user enters in the console.

By the way,

Since the dynamic call works for the first case, why don't you try the same pattern:

ie = IECore()
net = ie.read_network(model=path_to_xml_file, weights=path_to_bin_file)
# Set max batch size
net.batch = 10
ie.set_config(config={"DYN_BATCH_ENABLED": "YES"}, device_name=device)
exec_net = ie.load_network(network=net, device_name=device)
# Set batch size for certain network.
# NOTE: Input data shape will not be changed, but will be used partially in inference which increases performance
request = exec_net.requests[0]
request.set_batch(4)
Sincerely,

Iffa

Blasutig__Robert
Beginner

I'm quite confident dynamic batching has a bug.  I've attached a sample program to demonstrate.

Contents:

  • app.py - the application that demonstrates dynamic batch sizes in python
  • cat.jpg - a cat used as the input image
  • imagenet.txt - the list of imagenet classes
  • squeezenet1.0.bin + xml - a relatively small network to demonstrate this issue

The app has a defined max batch size of 4.  Batch sizes of 1 & 4 will work.  2 & 3 will show the error I originally shared:

cannot copy sequence with size 2 to array axis with dimension 4

The code can be modified to select between two batch sequences:

batch_sizes = [1,2,3,4]
# batch_sizes = [1,4,1,4]

The second option will work, but being limited to a batch size of 1 or the max is not optimal.

The app will iterate over all batch sizes to show success and error.  Inference time will also be shown to demonstrate different processing time as batch size gets changed.

All I ask is for dynamic batching to work with sizes other than min and max.  This appears to be a useful feature to get more performance out of edge devices with varying workloads.
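In the meantime, a possible workaround (just a sketch in plain numpy, assuming the outputs for the padded slots can simply be discarded; pad_batch is a hypothetical helper, not an OpenVINO API) is to zero-pad a partial batch up to the max batch size and keep only the real results:

```python
import numpy as np

MAX_BATCH = 4  # the max batch size the network was reshaped to

def pad_batch(images, max_batch=MAX_BATCH):
    """Zero-pad a partial batch so its first dimension equals max_batch.

    Returns the padded batch and the number of real images, so the
    caller can slice the inference outputs back down to `real` items.
    """
    real = len(images)
    padded = np.zeros((max_batch,) + images[0].shape, dtype=images[0].dtype)
    padded[:real] = images
    return padded, real

# Two 3x227x227 frames padded up to a 4-frame batch:
frames = [np.ones((3, 227, 227), dtype=np.float32)] * 2
batch, real = pad_batch(frames)
# exec_net.infer({input_blob: batch}) would then run on the full batch,
# and only the first `real` output rows would be meaningful.
```

This trades wasted compute on the padding for correctness, so it's only a stopgap until intermediate batch sizes work.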

Sahira_Intel
Moderator

Hi,

We are currently looking into this bug and I will get back to you as soon as I have more information.

Thanks so much for your patience!


Sincerely,

Sahira


YonatanF
Beginner

Hello,

I have also run into this problem.

self.model.infer({self.input_name: x})
File "ie_api.pyx", line 815, in openvino.inference_engine.ie_api.ExecutableNetwork.infer
File "ie_api.pyx", line 1087, in openvino.inference_engine.ie_api.InferRequest.infer
File "ie_api.pyx", line 1089, in openvino.inference_engine.ie_api.InferRequest.infer
File "ie_api.pyx", line 1235, in openvino.inference_engine.ie_api.InferRequest._fill_inputs
ValueError: could not broadcast input array from shape (10,3,224,224) into shape (64,3,224,224)

It works with batchsize=64 and batchsize=1, but anything in between will fail.

Any update on whether this was fixed in v.2021, or on what the cause of the bug is?

Thanks!

Ramasabarinathan
Beginner

Hello,

Please let us know the status of this bug: setting the dynamic batch size between the min and max batch size limits. Will there be a fix in the near future?
