radjaradja
Beginner
146 Views

Batch normalization example

Hello,

This is my first program using the MKL library, and I want to include a simple batch normalization call after a convolution. This is my code for BN.
Does anyone have an idea what I did wrong and why the execution of my code fails?

 

/*** BN1 section ***/

CHECK_ERR(dnnBatchNormalizationCreateForward_F64(&bn1, attributes, lt_conv1_output, 0), err);

resBn1[dnnResourceSrc] = resConv1[dnnResourceDst];

CHECK_ERR(dnnLayoutCreateFromPrimitive_F64(&lt_bn1_output, bn1, dnnResourceDst), err);

CHECK_ERR(dnnAllocateBuffer_F64((void **)&resBn1[dnnResourceDst], lt_bn1_output), err);

CHECK_ERR(init_conversion(&cv_relu1_to_user_output, &user_o, lt_user_output, lt_bn1_output, resBn1[dnnResourceDst]), err);

...............

CHECK_ERR(dnnExecute_F64(bn1, (void *)resBn1), err);

 

3 Replies
Ying_H_Intel
Employee

Hello Radjaradja, 

Thank you for submitting the question about Deep Learning batch normalization. Do you have a small test case to reproduce the issue?

On the other hand, as you may know, considering the strong demand for deep learning compute, we actually provide a dedicated open-source library, MKL-DNN, for that. Its GitHub address is https://github.com/intel/mkl-dnn
For example, you can find a complete sample at https://github.com/intel/mkl-dnn/blob/master/examples/simple_net.c. You can add the batch normalization part and let us know if it reproduces the problem.

Best Regards,

Ying 


radjaradja
Beginner

Thank you for your answer.

This is the whole example of my network.

I tried to use MKL-DNN for my application, but I read that MKL is more optimized and faster, so I am trying to rewrite my code using it.

Ying_H_Intel
Employee

Hi Radjaradja, 

Actually, the Deep Neural Network (DNN) component in Intel MKL is deprecated and will be removed in a future release, and the open-source MKL-DNN has the same or better performance. For example, Batch Normalization + ReLU is fused into the convolution in MKL-DNN.

I ran your case, and it returns:

ok
err (-1)
FAILED
Press any key to continue . . .

As you can see, the BN's inputs, such as the mean, variance, and weights (scale/shift), are required before executing the BN.
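In the deprecated MKL DNN API the original code uses, those inputs live in the workspace (mean/variance) and scale-shift resources. A hedged sketch of the allocations that appear to be missing, mirroring how lt_bn1_output and resBn1[dnnResourceDst] were set up; the names lt_bn1_ws, lt_bn1_scaleshift, and nChannels are illustrative, not from the original post:

```c
/* Sketch only: allocate the extra resources the forward BN primitive
   expects. Names lt_bn1_ws, lt_bn1_scaleshift, nChannels are
   hypothetical placeholders for the caller's own variables. */

/* workspace buffer: holds the computed mean and variance */
CHECK_ERR(dnnLayoutCreateFromPrimitive_F64(&lt_bn1_ws, bn1, dnnResourceWorkspace), err);
CHECK_ERR(dnnAllocateBuffer_F64((void **)&resBn1[dnnResourceWorkspace], lt_bn1_ws), err);

/* scale-shift buffer: scale (gamma) for each channel, then shift (beta) */
CHECK_ERR(dnnLayoutCreateFromPrimitive_F64(&lt_bn1_scaleshift, bn1, dnnResourceScaleShift), err);
CHECK_ERR(dnnAllocateBuffer_F64((void **)&resBn1[dnnResourceScaleShift], lt_bn1_scaleshift), err);

double *ss = (double *)resBn1[dnnResourceScaleShift];
for (size_t c = 0; c < nChannels; ++c) {
    ss[c] = 1.0;              /* scale (gamma): identity by default */
    ss[nChannels + c] = 0.0;  /* shift (beta): zero by default      */
}
```

With these buffers filled in before the dnnExecute_F64 call, the primitive has all the resources it needs; without them, execution fails as shown above.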

So maybe something goes wrong in BN, but since we are removing the component, I suggest looking at https://github.com/intel/mkl-dnn and https://intel.github.io/mkl-dnn/structmkldnn_1_1batch__normalization__forward.html

Best Regards,

Ying