Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

[Bug?] How to use a non-Softmax output layer?

idata (Employee):

I am trying to use a non-Softmax (specifically, Sigmoid) output layer but can't get it working.

 

First of all, it seems that the last layer NEEDS to be named 'prob'; otherwise the Movidius toolkit ignores it. For example, if I have an output layer like this:

 

layer {
  name: "sigmoid4"
  type: "Sigmoid"
  bottom: "fc4"
  top: "sigmoid4"
}

 

it is silently ignored by the Movidius software: it does not show up in the profiler, for example.
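A possible workaround instead of renaming, going by the NCSDK usage text, would be to point mvNCCheck at the output layer explicitly with its -in/-on node flags. I have not verified that this helps; 'model.prototxt' and 'model.caffemodel' stand in for the real files:

    mvNCCheck model.prototxt -w model.caffemodel -in data -on sigmoid4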

 

Secondly, renaming the layer to 'prob' makes the toolkit see it again, but the output is weird:

 

mvNCCheck v02.00, Copyright @ Movidius Ltd 2016
USB: Transferring Data…
USB: Myriad Execution Finished
USB: Myriad Connection Closing.
USB: Myriad Connection Closed.
Result: (1, 8)
1) 7 1.0
2) 6 1.0
3) 5 1.0
4) 4 1.0
5) 3 1.0
Expected: (8,)
1) 6 1.0
2) 3 1.0
3) 5 0.89453
4) 4 0.71387
5) 7 0.47974
Obtained values
Obtained Min Pixel Accuracy: 99.87506866455078% (max allowed=2%), Fail
Obtained Average Pixel Accuracy: 24.002671241760254% (max allowed=1%), Fail
Obtained Percentage of wrong values: 50.0% (max allowed=0%), Fail
Obtained Pixel-wise L2 error: 41.250226424780664% (max allowed=1%), Fail
Obtained Global Sum Difference: 1.9202136993408203

 

Any advice? Or is this a bug?

 

By the way, I checked this network in Caffe and there it works fine.
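For reference, the Caffe-side check was along these lines (a pycaffe sketch; the file names are placeholders, since I can't share the real model):

    import numpy as np
    import caffe

    # Load the deploy prototxt and trained weights in test mode.
    net = caffe.Net('model.prototxt', 'model.caffemodel', caffe.TEST)

    # Random input of whatever shape the 'data' blob expects.
    x = np.random.randn(*net.blobs['data'].data.shape).astype(np.float32)
    net.blobs['data'].data[...] = x

    # Forward pass; with the output layer renamed, the output blob is 'prob'.
    out = net.forward()
    print(out['prob'].flatten())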

idata (Employee):

@Bug80 Hi Bug80. Thank you for bringing this to our attention. Can you provide me with the name of (or a link to) the Caffe network that you are trying to work with? Thanks.

idata (Employee):

Unfortunately, I cannot share the original network with you, but I managed to create a 'dummy' network that more or less shows the same issues.

 

The classification network has a simple task: sum all the inputs (sum(x)) and compute y = int(sum(x)) mod 10. The result (0, 1, …, 9) equals the class number.

 

The layout is as follows:

• Input layer (3000 inputs)
• 3 hidden layers (100 nodes each, TanH)
• Output layer (10 nodes, either sigmoid or softmax activation)

The networks were trained with random numbers drawn from a normal distribution (mu=0, std=1). Both networks (.prototxt + .caffemodel) can be downloaded here: https://www.dropbox.com/s/x63gsl9fz6pmgze/networks.zip?dl=0
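Roughly how the training data was generated (a numpy sketch; the sample count here is made up):

    import numpy as np

    N, D = 10000, 3000  # number of samples (illustrative), inputs per sample

    # Inputs drawn from a standard normal distribution (mu=0, std=1).
    X = np.random.randn(N, D).astype(np.float32)

    # Class label: y = int(sum(x)) mod 10, giving classes 0..9.
    y = np.array([int(x.sum()) % 10 for x in X])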

 

When I run the sigmoid network through mvNCCheck, I get the following result:

 

==============================================
mvNCCheck v02.00, Copyright @ Movidius Ltd 2016
USB: Transferring Data…
USB: Myriad Execution Finished
USB: Myriad Connection Closing.
USB: Myriad Connection Closed.
Result: (1, 10)
1) 6 0.94824
2) 4 0.22693
3) 9 0.00090933
4) 8 0.0
5) 7 0.0
Expected: (10,)
1) 6 0.1925
2) 4 0.052948
3) 9 0.0039024
4) 8 0.00091887
5) 0 0.00021613
Obtained values
Obtained Min Pixel Accuracy: 392.58084297180176% (max allowed=2%), Fail
Obtained Average Pixel Accuracy: 48.51687550544739% (max allowed=1%), Fail
Obtained Percentage of wrong values: 20.0% (max allowed=0%), Fail
Obtained Pixel-wise L2 error: 127.3932585535493% (max allowed=1%), Fail
Obtained Global Sum Difference: 0.9339735507965088
==============================================

 

And this is the result for the softmax network:

 

==============================================
mvNCCheck v02.00, Copyright @ Movidius Ltd 2016
USB: Transferring Data…
USB: Myriad Execution Finished
USB: Myriad Connection Closing.
USB: Myriad Connection Closed.
Result: (1, 10)
1) 6 0.99609
2) 9 0.0041351
3) 8 0.0
4) 7 0.0
5) 5 0.0
Expected: (10,)
1) 6 0.94971
2) 9 0.045166
3) 4 0.0051384
4) 5 0.00019598
5) 1 4.53e-06
Obtained values
Obtained Min Pixel Accuracy: 4.884318634867668% (max allowed=2%), Fail
Obtained Average Pixel Accuracy: 0.976765900850296% (max allowed=1%), Pass
Obtained Percentage of wrong values: 20.0% (max allowed=0%), Fail
Obtained Pixel-wise L2 error: 2.069187123386144% (max allowed=1%), Fail
Obtained Global Sum Difference: 0.09276413917541504
==============================================

 

So it looks like the softmax output sometimes fails as well!
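For reference, the two networks differ only in the final activation, and the 'Expected' columns above are what Caffe computes on the host. A minimal sketch of the two activations (made-up logits standing in for the fc4 outputs):

    import numpy as np

    def sigmoid(z):
        # Element-wise; the ten outputs are independent and need not sum to 1.
        return 1.0 / (1.0 + np.exp(-z))

    def softmax(z):
        # Shifted by the max for numerical stability; the outputs sum to 1.
        e = np.exp(z - z.max())
        return e / e.sum()

    logits = np.random.randn(10)  # stand-in for the fc4 outputs
    print(sigmoid(logits))
    print(softmax(logits))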

idata (Employee):

@Tome_at_Intel Have you had time to take a look at my test networks and see what is going wrong here?

idata (Employee):

@Bug80 Sorry for the late reply. To answer your question: yes, I've downloaded your test network files, tested them, and reproduced the problem, and we are currently working on a fix. Thank you for your patience.

idata (Employee):

@Bug80 Looks like your tanh layers may need a little adjustment. I changed the top of each tanh layer so that it feeds into the next InnerProduct layer. Here are the adjustments I made to your sample network:

 

name: "net_sigmoid"
input: "data"
input_shape {
  dim: 1
  dim: 1
  dim: 1
  dim: 3000
}
layer {
  name: "fc1"
  type: "InnerProduct"
  bottom: "data"
  top: "fc1"
  inner_product_param {
    num_output: 100
  }
}
layer {
  name: "tanh1"
  type: "TanH"
  bottom: "fc1"
  top: "tanh1"
}
layer {
  name: "fc2"
  type: "InnerProduct"
  bottom: "tanh1"
  top: "fc2"
  inner_product_param {
    num_output: 100
  }
}
layer {
  name: "tanh2"
  type: "TanH"
  bottom: "fc2"
  top: "tanh2"
}
layer {
  name: "fc3"
  type: "InnerProduct"
  bottom: "tanh2"
  top: "fc3"
  inner_product_param {
    num_output: 100
  }
}
layer {
  name: "tanh3"
  type: "TanH"
  bottom: "fc3"
  top: "tanh3"
}
layer {
  name: "fc4"
  type: "InnerProduct"
  bottom: "tanh3"
  top: "fc4"
  inner_product_param {
    num_output: 10
  }
}
layer {
  name: "result"
  type: "Sigmoid"
  bottom: "fc4"
  top: "result"
}
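With the tops adjusted this way, the graph should compile and run end to end. A rough sketch of exercising it with the NCSDK v1 Python API, assuming a graph file produced by mvNCCompile from the adjusted prototxt:

    import numpy as np
    from mvnc import mvncapi as mvnc

    # Find and open the first attached Movidius device.
    devices = mvnc.EnumerateDevices()
    device = mvnc.Device(devices[0])
    device.OpenDevice()

    # Load a graph blob previously produced by mvNCCompile.
    with open('graph', 'rb') as f:
        graph = device.AllocateGraph(f.read())

    # The Myriad works in fp16; feed one random 3000-element sample.
    x = np.random.randn(3000).astype(np.float16)
    graph.LoadTensor(x, 'user object')
    output, _ = graph.GetResult()
    print(output)  # ten sigmoid activations

    graph.DeallocateGraph()
    device.CloseDevice()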