souravmondal13
Beginner

How to decrease inference time?

Initially I used a Keras HDF5 (.h5) file for inference, and it took around 54 s to produce results on a 5-second, 30 fps video input file. Then I used the OpenVINO toolkit for optimization and inference, but inference now takes around 50 s, so I haven't gained much. Since I am new to OpenVINO I don't know much about optimization. Please help.

Thanks,

Sourav

samontab
Valued Contributor II

It really depends on the network you're using; maybe it's already as optimised as it can be. The OpenVINO documentation has some explanations of how the toolkit optimises a model.

Other alternatives are to use another model, or to modify your current network to make it lighter at the cost of some accuracy.
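One thing worth checking is the precision of the converted IR. As a sketch (the paths and model name here are hypothetical, and the exact flags depend on your OpenVINO version), the Model Optimizer can emit FP16 weights, and the bundled benchmark tool measures raw inference speed independently of your video pipeline:

```shell
# Hypothetical paths -- adjust to your install location and model.
# 1. Convert a frozen TensorFlow graph (exported from the Keras model)
#    to OpenVINO IR with FP16 weights:
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo_tf.py \
    --input_model frozen_model.pb \
    --data_type FP16 \
    --output_dir ir_fp16/

# 2. Benchmark the IR on the CPU to see what the network itself costs,
#    separate from video decoding and pre/post-processing:
python3 /opt/intel/openvino/deployment_tools/tools/benchmark_tool/benchmark_app.py \
    -m ir_fp16/frozen_model.xml -d CPU
```

If benchmark_app reports a much higher FPS than your end-to-end script achieves, the bottleneck is likely in your frame reading or pre-processing code rather than in the network.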

souravmondal13
Beginner

One more thing: I saw in a couple of examples that we can use CPU extensions such as SSE4 and AVX2. According to the latest release, these extensions have been moved into the plugin. I don't know how to use the CPU plugin either; can you help me with this?

Thanks again,

Sourav

Iffa_Intel
Moderator

Greetings,

 

In the latest version of OpenVINO, the CPU extensions have been moved into the plugin, and they are loaded automatically when the plugin is loaded.


You can refer here for our latest documentation:

https://docs.openvinotoolkit.org/latest/_docs_IE_DG_supported_plugins_CPU.html

 

Sincerely,

Iffa