Sunwoo_L_
Beginner

maximum_pooling2d layer

Hi,

This is another really basic beginner question. I built a training net with one convolution layer and one max-pooling layer. Without the max-pooling layer it works perfectly, but whenever I attach any additional layer, trainingNet.initialize() causes a segmentation fault.

Collection<LayerDescriptor> configureSimpleNet()
{
   Collection<LayerDescriptor> configuration;
/* 1. first conv. */
   SharedPtr<layers::convolution2d::Batch<> > conv2d(new layers::convolution2d::Batch<>);
   conv2d->parameter.kernelSize = layers::convolution2d::KernelSize(5,5);
   conv2d->parameter.nKernels = 1;
   conv2d->parameter.spatialDimensions = layers::convolution2d::SpatialDimensions(28,28);
   configuration.push_back(LayerDescriptor(0, conv2d, NextLayers()));

/* 2. first max-pool. */
   //pool1->parameter = maximum_pooling2d::Parameter(2,3,2,2,2,2,0,0);
   SharedPtr<layers::maximum_pooling2d::Batch<> > pool2d(new layers::maximum_pooling2d::Batch<>(4));
   configuration.push_back(LayerDescriptor(1, pool2d, NextLayers()));

   return configuration;
}

int main(int argc, char *argv[])
{
    //checkArguments(argc, argv, 1, &datasetFileName);

        float* testData = new float[94*28*28];
        float* testData2 = new float[1];
        testData2[0] = 1;
        for (int i=0; i < 1; i++)
                for (int j = 0; j < 28; j++)
                    for (int k = 0; k < 28; k++)
                        testData[i*28*28 + j*28 + k] = float(j) * float(k) + i;

        size_t nDim = 4, dims[] = {1, 1, 28, 28};
        size_t dims2[] = {1};

        SharedPtr<Tensor> bogusData(new HomogenTensor<float>(nDim, dims, (float*)testData));
        SharedPtr<Tensor> bogusData2(new HomogenTensor<float>(1,dims2,(float*)testData2));

        training::Batch<> trainingNet;

        Collection<LayerDescriptor> layersConfiguration = configureSimpleNet();
        trainingNet.initialize(bogusData->getDimensions(), layersConfiguration);

        trainingNet.input.set(training::data, bogusData);
        trainingNet.input.set(training::groundTruth, bogusData2);

        trainingNet.parameter.optimizationSolver->parameter.learningRateSequence =
                SharedPtr<NumericTable>(new HomogenNumericTable<>(1, 1, NumericTable::doAllocate, 0.001));

        trainingNet.parameter.nIterations = 10;

        trainingNet.compute();
      
        return 0;
}

I don't see any example code that uses multiple layers. Could you help me figure out what's wrong with this code?

Thank you.

2 Replies
Sunwoo_L_
Beginner

I have figured out the reason. When you create a LayerDescriptor and push it into the collection of layer descriptors, pay attention to the last constructor argument, NextLayers(): only the last layer should construct NextLayers() with no arguments, while every other layer must pass the index of the layer that follows it. After I revised the code as below, there is no segmentation fault anymore. Maybe this sample code can help some other newbies.

Collection<LayerDescriptor> configureSimpleNet()
{
   Collection<LayerDescriptor> configuration;
/* 1. first conv. */
   SharedPtr<layers::convolution2d::Batch<> > conv2d(new layers::convolution2d::Batch<>);
   conv2d->parameter.kernelSize = layers::convolution2d::KernelSize(5,5);
   conv2d->parameter.nKernels = 1;
   conv2d->parameter.spatialDimensions = layers::convolution2d::SpatialDimensions(28,28);
   configuration.push_back(LayerDescriptor(0, conv2d, NextLayers(1))); /* not the last layer: point to layer index 1 */

/* 2. first max-pool. */
   SharedPtr<layers::maximum_pooling2d::Batch<> > pool2d(new layers::maximum_pooling2d::Batch<>(4)); /* 4: number of dimensions of the input tensor */
   /* argument order assumed: pooled dims (2,3), 2x2 kernel, 2x2 strides, zero paddings */
   pool2d->parameter = maximum_pooling2d::Parameter(2,3,2,2,2,2,0,0);
   configuration.push_back(LayerDescriptor(1, pool2d, NextLayers())); /* last layer: empty NextLayers() */

   return configuration;
}


Andrey_N_Intel
Employee

Thank you, Sunwoo, we will extend the documentation with the required description. Andrey