I am learning how to use DAAL and trying to understand the training process of neural networks.
Looking at the samples in the installation directory, I experimented with the neural_net_dense_batch project.
I have noticed that changing the values of the following two lines has no impact on the weights/biases of a trained network:
sgdAlgorithm->parameter.nIterations = 100;
sgdAlgorithm->parameter.accuracyThreshold = 1.0e-5;
I retrieve the weights/biases as follows:
NumericTablePtr wbTrained = net.getResult()->get(training::model)->getWeightsAndBiases();
printNumericTable(wbTrained, "Trained", 4, 4);
According to the Intel Data Analytics Acceleration Library 2017 Update 1 Developer Guide (page 237), the compute() method should "Finish the computation if the error is less than the threshold or the number of iterations done equals the value of nIterations that you specified."
Do these parameters have an impact on the training process of a neural network? If yes, why do I not get different weights/biases after changing them?
Thank you for your interest in Intel® DAAL.
The present version of the library does not use parameters such as the number of iterations and the accuracy threshold for neural network training.
The number of iterations performed by the compute() method in neural network training is derived from the number of data samples (assumed to be the first dimension of the input data tensor) and the batch size. Thus, the training always uses the whole dataset.
If you need to process fewer batches than the total number of batches available in your input data set, please reduce the size of the input data set accordingly.
The quoted passage from the Intel DAAL documentation does not reflect the behavior of the library, and we plan to correct it in one of the upcoming releases.
We are also investigating options for adding support for the number of iterations and the accuracy threshold to the neural network training algorithm, to make its behavior more configurable in future releases of the library.
Please let us know if this answers your question.