Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Issues while compiling prototxt with Scale layer

idata
Employee

Hello, I am trying to compile a prototxt model of DenseNet-121 (whose Netscope architecture can be seen here), which includes BatchNorm and Scale layers. The prototxt file is here.
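For reference, each BatchNorm + Scale pair in the prototxt follows the standard Caffe pattern, roughly like this (a sketch; the layer names below are illustrative, not copied from my file):

layer {
  name: "conv1/bn"
  type: "BatchNorm"
  bottom: "conv1"
  top: "conv1"
  batch_norm_param {
    use_global_stats: true   # use the stored mean/variance at inference time
  }
}
layer {
  name: "conv1/scale"
  type: "Scale"
  bottom: "conv1"
  top: "conv1"
  scale_param {
    bias_term: true   # learn a per-channel bias together with the scale
  }
}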

 

The problem is this: when I run mvNCProfile, the generated model shown in output_report.html is really messed up; the outputs of the Scale layers are completely wrong and end up attached to the final output layer.

 

Please let me know how to overcome this,

 

Thanks!

idata
Employee

@antoniocappiello I was unable to run the prototxt file you provided. When running mvNCProfile on it, I encounter an error involving the ceil_mode pooling parameter in the pool1 Pooling layer. Upon further inspection, it seems that ceil_mode isn't supported in the master version of Caffe. If you are running mvNCProfile with a modified prototxt file, please provide that file for testing. Thanks.
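For reference, the parameter I am referring to sits inside the pooling_param of the pool1 layer, roughly like this (a sketch; the exact kernel and stride values may differ in your file):

layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
    ceil_mode: false   # this field is not defined in the master version of Caffe
  }
}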

idata
Employee

You can simply remove the line with ceil_mode.

idata
Employee

@antoniocappiello I tried that as well and ran into another problem. It would save me a lot of time in reproducing this issue if you could provide the exact file you used during your test. Thanks.

idata
Employee

Hello, you can find the prototxt file I used here.

 

However, I had to add "Reshape" layers between consecutive "Concat" layers because it seems CaffeParser.py doesn't support consecutive concats. Can you check this issue as well? Thanks a lot.
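The workaround looks roughly like this in the prototxt (a sketch; layer names and reshape dimensions are illustrative, not copied from my file):

layer {
  name: "concat_2_1"
  type: "Concat"
  bottom: "pool1"
  bottom: "conv2_1/x2"
  top: "concat_2_1"
}
layer {
  name: "concat_2_1/reshape"   # inserted so that two Concat layers are never chained directly
  type: "Reshape"
  bottom: "concat_2_1"
  top: "concat_2_1/reshape"
  reshape_param {
    shape { dim: 0 dim: -1 dim: 56 dim: 56 }   # dim 0 keeps the batch size, -1 infers the channel count
  }
}
layer {
  name: "concat_2_2"
  type: "Concat"
  bottom: "concat_2_1/reshape"
  bottom: "conv2_2/x2"
  top: "concat_2_2"
}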

idata
Employee

@antoniocappiello I was able to reproduce the same issue you are facing. There is a bug in the mvNCProfile tool in NCSDK version 1.12 that causes errors in the network visualizer. Once there is a release that fixes this issue, I will let you know via PM or in this thread. Thanks.

idata
Employee

@Tome_at_Intel another question: to see how the network behaves before the first "Concat" layer, I generated three graphs, one for the pooling layer, one for the convolution (the two inputs of the concat), and one for the Concat layer itself. The outputs of the pooling and convolution graphs made sense, but the output of the Concat graph was really bad. My question is: is the "Concat" layer doing the concatenation incorrectly? I also tried changing the axis by adding a concat_param, trying every option from 0 to 3, but the output doesn't change.
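For reference, this is roughly how I set the axis on a concat (a sketch; layer names are illustrative, and I tried axis values from 0 to 3 here):

layer {
  name: "concat_2_1"
  type: "Concat"
  bottom: "pool1"
  bottom: "conv2_1/x2"
  top: "concat_2_1"
  concat_param {
    axis: 1   # 1 is the channel axis for NCHW blobs, the usual choice for DenseNet concatenation
  }
}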

 

Thanks for the support. Let me know if there is anything you can do about the Concat layer.

idata
Employee

@antoniocappiello Can you provide a little more detail about the concat output you saw and post the log here? Thanks.
