barinov__dmitriy

Poor UP AI Core X precision

Hello!

I am seeing significant precision degradation when comparing results from the GPU (Intel® Atom™ x7-E3950) and MYRIAD (UP AI Core X with Myriad X 2485) backends. The GPU produces reasonable results, while MYRIAD quickly accumulates precision loss when the output is compared layer by layer. Both backends use FP16 precision and exactly the same network model (please find the model files attached).

Here is how the experiment was performed: I took the original .bin and .xml files with the full network configuration, removed all layers from the "<layers>" and "<edges>" sections, and then added them back one by one. For each such configuration I printed the first 10 elements of the last layer. Below you can find the outputs for several layer counts. As you can see, the difference accumulates quickly, and by the last layers the values diverge greatly. The question is: is this working as intended, so one should expect such large differences, or is something going wrong?
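For reference, the per-layer comparison itself can be reproduced with a short pure-Python snippet (the values are copied from the 1-layer dumps below; the helper name `compare_first_n` is my own, not part of any OpenVINO API):

```python
def compare_first_n(ref, test, n=10):
    """Element-wise absolute and relative difference of the first n outputs."""
    rows = []
    for i, (a, b) in enumerate(zip(ref[:n], test[:n])):
        abs_d = abs(a - b)
        rel_d = abs_d / abs(a) if a != 0 else float("inf")
        rows.append((i, a, b, abs_d, rel_d))
    return rows

# First 10 elements of the last layer, 1-layer configuration (from the dumps below)
gpu_1 = [-31.656250] + [-44.0] * 9
myriad_1 = [-31.640625] + [-44.0] * 9

for i, a, b, abs_d, rel_d in compare_first_n(gpu_1, myriad_1):
    print(f"{i:02d}: gpu={a:.6f} myriad={b:.6f} abs={abs_d:.6f} rel={rel_d:.6f}")
```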

Thanks in advance,

Dmitriy Barinov

GPU 1 layer
00: -31.656250
01: -44.000000
02: -44.000000
03: -44.000000
04: -44.000000
05: -44.000000
06: -44.000000
07: -44.000000
08: -44.000000
09: -44.000000

MYRIAD 1 layer
00: -31.640625
01: -44.000000
02: -44.000000
03: -44.000000
04: -44.000000
05: -44.000000
06: -44.000000
07: -44.000000
08: -44.000000
09: -44.000000
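One observation on the 1-layer numbers above: the difference on element 00 (-31.656250 vs -31.640625) is exactly one FP16 step at that magnitude, i.e. the smallest difference two distinct FP16 values near 32 can have. A quick sketch confirming this (`fp16_spacing` is my own helper, assuming normalized FP16 with a 10-bit mantissa):

```python
import math

def fp16_spacing(x):
    """Spacing between adjacent normal FP16 values at magnitude |x|."""
    exp = math.floor(math.log2(abs(x)))
    return 2.0 ** (exp - 10)  # FP16 stores a 10-bit mantissa

gap = abs(-31.656250 - (-31.640625))
print(gap, fp16_spacing(31.65))  # both are 0.015625: exactly one FP16 step
```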


GPU 2 layers
00: -3.164063
01: -4.398438
02: -4.398438
03: -4.398438
04: -4.398438
05: -4.398438
06: -4.398438
07: -4.398438
08: -4.398438
09: -4.398438

MYRIAD 2 layers
00: -3.166016
01: -4.402344
02: -4.402344
03: -4.402344
04: -4.402344
05: -4.402344
06: -4.402344
07: -4.402344
08: -4.402344
09: -4.402344

GPU 3 layers
00: -0.316162
01: -0.615234
02: -0.615234
03: -0.615234
04: -0.615234
05: -0.615234
06: -0.615234
07: -0.615234
08: -0.615234
09: -0.615234


MYRIAD 3 layers
00: -0.318848
01: -0.620117
02: -0.620117
03: -0.620117
04: -0.620117
05: -0.620117
06: -0.620117
07: -0.620117
08: -0.620117
09: -0.620117

GPU 6 layers
00: 6.343750
01: 6.015625
02: 6.015625
03: 6.015625
04: 6.015625
05: 6.015625
06: 6.015625
07: 6.015625
08: 6.015625
09: 6.015625

MYRIAD 6 layers
00: 6.121094
01: 5.808594
02: 5.808594
03: 5.808594
04: 5.808594
05: 5.808594
06: 5.808594
07: 5.808594
08: 5.808594
09: 5.808594

GPU 12 layers
00: 4.640625
01: 2.626953
02: 1.486328 <<<<<<<<<
03: -0.224121
04: -0.277588
05: -0.392334
06: -0.174927
07: -0.156372
08: 1.488281
09: 0.386230

MYRIAD 12 layers
00: 4.445313
01: 2.416016
02: 0.615234 <<<<<<<<<< huge difference here
03: -0.200684
04: -0.280273
05: -0.402832
06: -0.176758
07: -0.147949
08: 1.469727
09: 0.458984


GPU 13 layers
00: 0.019714
01: 0.267334
02: -0.327393
03: -0.429688
04: -0.414063
05: 0.583008
06: 0.056091
07: -0.686035
08: 0.830566
09: -0.478516

MYRIAD 13 layers
00: -0.063965
01: 0.024414
02: -0.545898
03: -0.479492
04: -0.519043
05: 0.334961
06: -0.130371
07: -0.627441
08: 0.593262
09: -0.629395
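To quantify how the drift grows with depth, here is a small script computing the relative difference on element 00 at each layer count, with the values copied from the dumps above (the growth from well under 1% at 1-2 layers to several hundred percent at 13 layers is what prompts my question):

```python
# Element 00 of the last layer's output at each layer count (from the dumps above)
gpu    = {1: -31.656250, 2: -3.164063, 3: -0.316162, 6: 6.343750, 12: 4.640625, 13: 0.019714}
myriad = {1: -31.640625, 2: -3.166016, 3: -0.318848, 6: 6.121094, 12: 4.445313, 13: -0.063965}

for depth in sorted(gpu):
    rel = abs(gpu[depth] - myriad[depth]) / abs(gpu[depth])
    print(f"{depth:2d} layers: relative diff on element 00 = {rel:.4%}")
```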
