Community Manager

Does the Deconvolution layer support BatchNorm folding on Movidius?

It looks like it does not.
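For context, BatchNorm folding rewrites the normalization that follows a (de)convolution into adjusted weights and a bias, so no separate BatchNorm layer has to run at inference. A minimal sketch of the per-channel algebra, using made-up scalar values (the numbers are illustrative, not from any real model):

```python
import math

# BatchNorm computes y = gamma * (z - mean) / sqrt(var + eps) + beta,
# where z = w*x + b is one output value of the preceding (de)convolution.
# Folding rewrites this as y = w'*x + b' with:
#   w' = w * gamma / sqrt(var + eps)
#   b' = (b - mean) * gamma / sqrt(var + eps) + beta

w, b = 0.7, 0.2            # hypothetical deconv weight and bias for one channel
gamma, beta = 1.3, -0.4    # hypothetical BN scale and shift
mean, var, eps = 0.1, 0.9, 1e-5

scale = gamma / math.sqrt(var + eps)
w_folded = w * scale
b_folded = (b - mean) * scale + beta

# The folded layer matches deconv + BN for any input.
for x in (-1.0, 0.0, 2.5):
    y_bn = gamma * ((w * x + b) - mean) / math.sqrt(var + eps) + beta
    y_folded = w_folded * x + b_folded
    assert abs(y_bn - y_folded) < 1e-9
```

In a real network the same fold is applied per output channel of the deconvolution, which is what a toolchain has to support for this layer ordering.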

8 Replies
Community Manager

@Dmitry Have you tried using NCSDK v2.08.01? I was able to get passing results from mvNCCheck with a simple network using conv, deconv, then batchnorm.

Community Manager

Hello Tome! Thank you for your answer. I use SDK 1.12.00.01. I am sharing my network with you. As you can see in the console_log text file, the results after the deconv layer are invalid. I trained the Deconv layer with bias (in PyTorch) and exported to Caffe, also with bias enabled for the Deconvolution layer. However, the results of layer ConvNdBackward68 do not match Caffe's output, while the previous layer's (ConvNdBackward65) results do match.

 

https://drive.google.com/open?id=1Gs7iTc3ar6jac9CbHf7OrpKCjDEKC69d

 

In the end I worked around the issue by removing the bias from the Deconv layer in the model architecture. Still, I am interested: is this a bug, or an undocumented erratum?

 

I looked at the release notes for SDK 2.x; they did not mention any fixes for layers, so I did not port to it. Porting is a bit of a burden on the C++ side.

 

Thank you for your prompt answer!
Community Manager

@Tome_at_Intel Would appreciate your help!

Community Manager

@Dmitry With NCSDK v1.12, batchnorm after deconvolution is not supported. NCSDK v1.12 is considered a legacy release and is no longer receiving new features. I understand that it is not easy to move your application from one API version to the next, but I strongly recommend updating your applications to NCSDK v2.08.01. Information on migrating your C++ application can be found at https://movidius.github.io/ncsdk/ncapi/c_api_migration.html.

Community Manager

@Tome_at_Intel Okay, I have installed SDK v2, at least to check the compatibility of my model with your latest SDK. Now I am trying to convert my model #2, which includes a Concat layer. Please see the link below for the prototxt + caffemodel and the console log of mvNCCheck. The comparison passes on ConvNdBackward65 but already fails on ConvNdBackward68. This model has Concat and 1x1 Conv layers, and one of these seems to have a bug or to be unsupported.

 

https://drive.google.com/open?id=19mZSFUfiDsjjQPx4BTumSNxfG4mUstwb

 

Thank you for your advice!
Community Manager

@Dmitry Thanks for reporting and thanks for providing your model.

Community Manager

@Tome_at_Intel Any ideas why the Concat layer does not work for the model that I have shared?

Community Manager

@Dmitry SSD Mobilenet in Caffe has 1x1 conv with Concat layers and we do support that model. For your network, it seems like the NCSDK deconvolution layer is producing inaccurate results. I'll do more digging and see what I can find regarding deconvolution and this model.

 

For Concat, I believe the NCSDK doesn't support concatenating two tensors along a specific axis (like below):

 

layer {
  name: "CatBackward71"
  type: "Concat"
  bottom: "ConvNdBackward68"
  bottom: "MaxPool2DBackward54"
  top: "CatBackward71"
  concat_param { axis: 1 }
}
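For reference, Concat along axis 1 stacks the bottoms' channels in Caffe's NCHW layout and requires every other dimension to match. A shape-level sketch of that rule (the helper function and the example shapes are hypothetical, not NCSDK code):

```python
def concat_output_shape(shapes, axis):
    """Output shape of a Caffe-style Concat along `axis`; all other
    dimensions of the bottoms must agree (hypothetical helper)."""
    base = list(shapes[0])
    for s in shapes[1:]:
        for dim, (x, y) in enumerate(zip(base, s)):
            if dim != axis and x != y:
                raise ValueError("non-concat dimensions must match")
        base[axis] += s[axis]  # channels accumulate along the concat axis
    return tuple(base)

# CatBackward71 concatenates ConvNdBackward68 and MaxPool2DBackward54
# along axis 1 (channels); made-up NCHW shapes for illustration:
out = concat_output_shape([(1, 64, 28, 28), (1, 32, 28, 28)], axis=1)
assert out == (1, 96, 28, 28)
```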