Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Obtain layer-by-layer memory consumption for optimized models

nat98
New Contributor I

Hello,

 

I have a few optimized models that can be run in the OpenVINO environment and also imported to the DL Workbench.

 

Currently, I can get the total weights and the minimum and maximum memory consumption of my models via DL Workbench. However, I would like to know the memory demand/consumption and computing power for each layer of my optimized models.

 

May I know of any OpenVINO tools that I can use to view these parameters for my optimized models?

 

Thank you.

Zulkifli_Intel
Moderator

Hi Nat98.


Thank you for reaching out to us.


You can take a look at Model Analyzer, which estimates theoretical information about the layers of deep learning models.
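For example, a per-layer report can be generated by running the tool with the --per-layer-mode flag. The script name and the -m (model) option below are assumptions on my side, so please check the tool's --help output for your version:

    python3 model_analyzer.py -m <path_to_your_model>.xml --per-layer-mode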


Sincerely,

Zulkifli


nat98
New Contributor I

Hi Zulkifli,

 

Thanks for your reply.

 

I had a look at the Model Analyzer and, by running it with the --per-layer-mode argument, obtained two .csv files storing the results. I have noticed that using or omitting additional arguments such as --ignore-unknown-layers, --sparsity-ignore-first-conv, and --sparsity-ignore-fc yields the same results.

 

My original model has a few MaxPool layers, and they have now disappeared. May I know whether OpenVINO optimized the model by removing those layers, or whether the Model Analyzer ignored them because it couldn't analyze them?

 

According to the .csv file generated with --per-layer-mode, it seems that GFLOPs is the computing work for each layer in units of 10^9 floating-point operations. What does MParams mean?

 

How can I get/calculate each individual layer's memory demand and the memory required to store it?

 

Thank you.

Zulkifli_Intel
Moderator

Hi Nat98,

 

Sorry for the delayed response. Currently, OpenVINO doesn't have a tool that estimates the memory demand of each layer of a deep learning model.
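To partially answer your earlier question: MParams in the Model Analyzer report is typically the layer's parameter count in millions, so the storage needed for each layer's weights can be roughly approximated from the per-layer .csv. Below is a minimal sketch that assumes FP32 weights (4 bytes per parameter); the file name and the "LayerName"/"MParams" column names are placeholders, so adjust them to match the report produced by --per-layer-mode:

    import csv

    BYTES_PER_PARAM = 4  # assuming FP32 weights; use 2 for FP16 or 1 for INT8

    # "per_layer_report.csv", "LayerName" and "MParams" are placeholders;
    # adjust them to the file produced by --per-layer-mode.
    with open("per_layer_report.csv", newline="") as f:
        for row in csv.DictReader(f):
            mparams = float(row["MParams"])  # parameters, in millions
            weight_bytes = mparams * 1e6 * BYTES_PER_PARAM
            print(f'{row["LayerName"]}: ~{weight_bytes / 1024:.1f} KiB of weights')

Note that this only accounts for weight storage; the memory needed for activations during inference also depends on the input shape and on how the device plugin reuses buffers, so it is not captured by this estimate.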

 

Model Analyzer only supports a limited number of layer types; the list of supported layers can be found here. Also, the OpenVINO Model Optimizer optimizes the model by removing layers that are not needed for inference and would only increase inference time; such layers are mostly used during model training.
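If you want to check which operations remain in your optimized IR (for example, whether the MaxPool layers were removed or fused), you can list them with the OpenVINO Python API. A minimal sketch using the 2022.1+ API is below; "model.xml" is a placeholder for your IR, and older releases expose a different API:

    from openvino.runtime import Core

    core = Core()
    model = core.read_model("model.xml")  # placeholder path to your optimized IR

    # Print every operation's type and friendly name so you can look for MaxPool
    for op in model.get_ops():
        print(op.get_type_name(), op.get_friendly_name())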

 

Sincerely,

Zulkifli 


nat98
New Contributor I

Hi Zulkifli,

 

Thanks for your reply.

 

Can the values in the GFLOPs column of the attached .csv file be reported in units of 10^6 (or smaller) instead of 10^9?

 

This is because the GFLOPs of the last three layers of my model are reported as 0.0000, and perhaps the values are too small to be recorded in units of 10^9.

 

Thank you.

Zulkifli_Intel
Moderator

Hi Nat98,

 

Sorry for the inconvenience. Currently, there is no available method to adjust the scale of the reported values. For your information, a GFLOPs value of 0 is also possible due to some limitations in the modules themselves.
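For reference, with four decimal places a value printed as 0.0000 GFLOPs only indicates that the layer requires fewer than 0.00005 × 10^9 = 50,000 floating-point operations. Because that precision is lost when the report is written, rescaling the column afterwards (for example, multiplying by 1000 to obtain MFLOPs) would not recover those values.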

 

Sincerely,

Zulkifli


Zulkifli_Intel
Moderator

Thank you for your question. If you need any additional information from Intel, please submit a new question as this thread is no longer being monitored.

