Hello,
I have been trying to run a 3-layer TensorFlow neural network on the Movidius NCS but haven't been able to get the desired output. I would like to report the issue and would appreciate any support available.
The details are as follows:
The model was built and trained in TensorFlow. A TensorFlow checkpoint was generated, which was then compiled into a graph file using mvNCCompile, and a Python script was written to load the graph and run the model on the Movidius NCS with the test data. The output received was an array of NaNs. (A rough sketch of the inference step is included below for reference.)
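For context, the graph file was compiled from the checkpoint's .meta file along the lines of `mvNCCompile model.meta -s 12 -in input -on output/add -o graph` (the input node name here is an assumption; the output node name is taken from the parsed graph below). The inference script follows roughly this pattern; this is a minimal sketch using the NCSDK v1 mvnc Python API, and the graph file name and `test_sample` input are placeholders rather than the exact script, which I can share on request:

```python
from mvnc import mvncapi as mvnc
import numpy as np

# Open the first attached NCS device
devices = mvnc.EnumerateDevices()
device = mvnc.Device(devices[0])
device.OpenDevice()

# Load the graph file produced by mvNCCompile
with open('graph', 'rb') as f:
    graph_buffer = f.read()
graph = device.AllocateGraph(graph_buffer)

# Run one test sample (the NCS works in FP16)
test_sample = np.zeros((1, 9), dtype=np.float16)  # placeholder for real test data
graph.LoadTensor(test_sample, 'sample')
output, _ = graph.GetResult()
print(output)  # comes back as an array of NaNs

graph.DeallocateGraph()
device.CloseDevice()
```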
To debug the issue, I parsed the generated graph file. It looks like the Movidius TensorFlow parser is not converting the tap (weight/bias) values used in the model correctly, which could be why the output is an array of NaNs. The parsed graph shows that the tap matrix dimensions for the Sum operations in certain layers are not as expected. For example, in layer 1 the Sum stage's tap dimensions should be 1 x 1 x 50, but the graph file reports 1 x 50 x 50 (see the TapDim values of the Sum stages in the output details below; a quick shape check follows this paragraph).
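To make the expected shapes concrete, here is a quick NumPy shape check for layer 1 (this only illustrates the dimensions the model defines, not the NCS computation itself):

```python
import numpy as np

x = np.zeros((1, 9))    # layer_1 input: 1 x 1 x 9 in the graph file
W = np.zeros((9, 50))   # layer_1/MatMul taps (weights): 1 x 9 x 50
b = np.zeros((1, 50))   # layer_1 bias: the Sum taps should be 1 x 1 x 50

y = np.maximum(x.dot(W) + b, 0)  # MatMul -> Sum (bias add) -> Relu
print(y.shape)                   # (1, 50), i.e. 1 x 1 x 50

# The parsed graph instead reports a 1 x 50 x 50 tap matrix for the
# layer_1/Relu Sum stage, which does not match the bias defined in the model.
```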
Here are the graph parser output details:
Name: b'TensorFlow Network'
Number of Stages: 8
Offset: 2080
Shaves 0 to 11
0
Name: b'layer_1/MatMul'
Op: FullyConnected
InputDim: 1 x 1 x 9
TapDim: 1 x 9 x 50
OutputDum: 1 x 1 x 50
PostOp: NoOp
bias pointer: 0
bias index: 0
opparam pointer: 0
opparam index: 0
input pointer: 0
input index: 1
taps pointer: 0
taps index: 3
output pointer: 400
output index: 4
1
Name: b'layer_1/Relu'
Op: Sum
InputDim: 1 x 1 x 50
TapDim: 1 x 50 x 50
OutputDum: 1 x 1 x 50
PostOp: Relu
bias pointer: 0
bias index: 0
opparam pointer: 0
opparam index: 0
input pointer: 400
input index: 4
taps pointer: 0
taps index: 1
output pointer: 1360
output index: 5
2
Name: b'layer_2/MatMul'
Op: FullyConnected
InputDim: 1 x 1 x 50
TapDim: 1 x 50 x 100
OutputDum: 1 x 1 x 100
PostOp: NoOp
bias pointer: 0
bias index: 0
opparam pointer: 0
opparam index: 0
input pointer: 1360
input index: 5
taps pointer: 960
taps index: 3
output pointer: 2720
output index: 6
3
Name: b'layer_2/Relu'
Op: Sum
InputDim: 1 x 1 x 100
TapDim: 1 x 100 x 100
OutputDum: 1 x 1 x 100
PostOp: Relu
bias pointer: 0
bias index: 0
opparam pointer: 0
opparam index: 0
input pointer: 2720
input index: 6
taps pointer: 0
taps index: 1
output pointer: 4576
output index: 7
4
Name: b'layer_3/MatMul'
Op: FullyConnected
InputDim: 1 x 1 x 100
TapDim: 1 x 100 x 50
OutputDum: 1 x 1 x 50
PostOp: NoOp
bias pointer: 0
bias index: 0
opparam pointer: 0
opparam index: 0
input pointer: 4576
input index: 7
taps pointer: 11008
taps index: 3
output pointer: 6032
output index: 8
5
Name: b'layer_3/Relu'
Op: Sum
InputDim: 1 x 1 x 50
TapDim: 1 x 50 x 50
OutputDum: 1 x 1 x 50
PostOp: Relu
bias pointer: 0
bias index: 0
opparam pointer: 0
opparam index: 0
input pointer: 6032
input index: 8
taps pointer: 0
taps index: 1
output pointer: 6992
output index: 9
6
Name: b'output/MatMul'
Op: FullyConnected
InputDim: 1 x 1 x 50
TapDim: 1 x 50 x 1
OutputDum: 1 x 1 x 1
PostOp: NoOp
bias pointer: 0
bias index: 0
opparam pointer: 0
opparam index: 0
input pointer: 6992
input index: 9
taps pointer: 21056
taps index: 3
output pointer: 7560
output index: 10
7
Name: b'output/add'
Op: Sum
InputDim: 1 x 1 x 1
TapDim: 1 x 1 x 1
OutputDum: 1 x 1 x 1
PostOp: NoOp
bias pointer: 0
bias index: 0
opparam pointer: 0
opparam index: 0
input pointer: 7560
input index: 10
taps pointer: 0
taps index: 1
output pointer: 0
output index: 2
In conclusion, it appears that the tap dimensions reported for the Sum (bias add) stages are wrong for some reason. For reference, a minimal sketch of the network structure, reconstructed from the dimensions above, is included below. I can provide the Python scripts and data used to run the model on the NCS so the issue can be reproduced on the support end if needed.
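The network corresponds roughly to the following structure (a sketch reconstructed from the layer dimensions above using the TensorFlow 1.x API; the variable names are illustrative, not the exact code from my script):

```python
import tensorflow as tf

# Input: 9 features per sample (matches the 1 x 1 x 9 input dimension above)
x = tf.placeholder(tf.float32, shape=[None, 9], name='input')

def fc_layer(inputs, units, name, activation=None):
    """Fully connected layer: MatMul + bias add (Sum), optional ReLU."""
    with tf.variable_scope(name):
        in_dim = inputs.get_shape().as_list()[-1]
        W = tf.get_variable('weights', shape=[in_dim, units])
        b = tf.get_variable('bias', shape=[units])
        out = tf.matmul(inputs, W) + b
        return activation(out) if activation is not None else out

h1 = fc_layer(x, 50, 'layer_1', tf.nn.relu)    # layer_1: 9 -> 50, ReLU
h2 = fc_layer(h1, 100, 'layer_2', tf.nn.relu)  # layer_2: 50 -> 100, ReLU
h3 = fc_layer(h2, 50, 'layer_3', tf.nn.relu)   # layer_3: 100 -> 50, ReLU
y = fc_layer(h3, 1, 'output')                  # output: 50 -> 1, no activation
```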
Thanks,
snh
- Tags:
- Tensorflow