Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

Python method to unload a network

TerryEss
Beginner
507 Views

I am using the Python interface to OpenVINO 2021.4 LTS.  I would like to unload a network, but I have checked the API documentation and can find no method to do this, although the C API has ie_exec_network_free() for that purpose.

4 Replies
Wan_Intel
Moderator
470 Views

Hi TerryEss,

Thank you for reaching out to us and thank you for using Intel® Distribution of OpenVINO™ Toolkit!

 

The Inference Engine C API provides the ie_exec_network_free() method to free the memory of an executable network. Unfortunately, this method is not available in the Inference Engine Python API or the Inference Engine C++ API.

 

On another note, load_network from the Inference Engine Python API loads a network that was read from the Intermediate Representation (IR) to the plugin with the specified device name and returns an ExecutableNetwork object.

 

You can create as many networks as you need and use them simultaneously (up to the limits of your hardware resources).

 

Usage example:

ie = IECore()
net = ie.read_network(model=path_to_xml_file, weights=path_to_bin_file)
exec_net = ie.load_network(network=net, device_name="CPU", num_requests=2)
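Since the Python API exposes no explicit free call, a common workaround is to drop every Python reference to the ExecutableNetwork (e.g. with del) and let garbage collection reclaim the plugin resources. The sketch below demonstrates only the reference-dropping pattern, using a stand-in class so it runs without OpenVINO installed; whether the runtime actually releases device memory at that point is an assumption about the OpenVINO implementation:

```python
import gc

class FakeExecutableNetwork:
    """Stand-in for the object returned by ie.load_network();
    records its name when it is garbage-collected."""
    released = []

    def __init__(self, name):
        self.name = name

    def __del__(self):
        FakeExecutableNetwork.released.append(self.name)

exec_net = FakeExecutableNetwork("domain_a_detector")
# ... run inference with exec_net ...
del exec_net   # drop the only reference to the network
gc.collect()   # make collection deterministic across interpreters
print(FakeExecutableNetwork.released)  # prints ['domain_a_detector']
```

With a real ExecutableNetwork the same pattern applies: make sure no variable, list, or closure still holds the object, then delete it.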

 

 

Regards,

Wan

 

TerryEss
Beginner
444 Views

I am working on autonomous robotics that can work across multiple "domains". Each domain can have different object detection requirements, which in turn require multiple models. I need to load the model set for a domain and then unload it when done due to hardware limitations. I would expect this requirement to be common to any application of sufficient complexity, so can ie_exec_network_free() be added to the Python API?
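The domain-switching requirement above can be approximated today with reference management: keep the current domain's executable networks in a container and clear it before loading the next set. DomainModelManager and load_fn below are hypothetical names for illustration; load_fn stands in for the ie.read_network()/ie.load_network() pair, and the sketch assumes that dropping the stored references lets the runtime free the associated memory:

```python
class DomainModelManager:
    """Hypothetical helper that keeps at most one domain's model set loaded."""

    def __init__(self, load_fn):
        self._load_fn = load_fn   # e.g. wraps ie.read_network + ie.load_network
        self._domain = None
        self._models = {}

    def activate(self, domain, model_names):
        """Load the model set for `domain`, releasing the previous set first."""
        if domain == self._domain:
            return self._models
        self._models = {}   # drop old references so they can be collected
        self._models = {name: self._load_fn(name) for name in model_names}
        self._domain = domain
        return self._models

# Demo with a dummy loader instead of a real IECore:
mgr = DomainModelManager(load_fn=lambda name: f"exec_net<{name}>")
indoor = mgr.activate("indoor", ["person_detect", "door_detect"])
outdoor = mgr.activate("outdoor", ["car_detect"])  # indoor set is released first
print(sorted(outdoor))  # prints ['car_detect']
```

The design choice here is that the manager owns the only long-lived references, so switching domains is the single place where memory is reclaimed.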

Wan_Intel
Moderator
424 Views

Hi TerryEss,

Thanks for your patience.


We appreciate your suggestion to enable this feature for Inference Engine Python API.


On another note, I recommend you submit a pull request here so you can collaborate directly with our development team to enable this feature for the community.


Hope this helps.



Regards,

Wan


Wan_Intel
Moderator
364 Views

Hi TerryEss,

Thanks for your question!

 

This thread will no longer be monitored since we have provided a recommendation.

If you need any additional information from Intel, please submit a new question.

 

Regards,

Wan

 
