Hi, I converted a PyTorch ONNX model to IR .xml and .bin files with the Model Optimizer, and I use classification_sample_async to run the inference, but I get different results. Is this a bug? Should I create the inference app from scratch? The problem is that the same image is classified into a different category depending on the device: on CPU it comes out as a bad image, and on MYRIAD as a good one.
Thanks.
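One thing worth checking before suspecting a bug: the MYRIAD plugin runs the network in FP16, while the CPU plugin defaults to FP32, so small per-class score differences between the two devices are expected, and a borderline image can flip its top-1 class. A minimal sketch (the score vectors below are made up, not from your model) for comparing the per-class outputs dumped from each device:

```python
# Compare per-class score vectors from two devices to see whether the
# divergence looks like FP16 rounding noise or a real mismatch.
# The example scores are hypothetical placeholders.

def compare_outputs(cpu_scores, myriad_scores):
    """Return (top1_match, max_abs_diff) for two score vectors."""
    top_cpu = max(range(len(cpu_scores)), key=cpu_scores.__getitem__)
    top_myr = max(range(len(myriad_scores)), key=myriad_scores.__getitem__)
    max_diff = max(abs(a - b) for a, b in zip(cpu_scores, myriad_scores))
    return top_cpu == top_myr, max_diff

# Hypothetical scores: same top-1 class, ~0.02 max difference
# (consistent with FP16 precision loss, not a conversion bug).
match, diff = compare_outputs([0.10, 0.70, 0.20], [0.12, 0.68, 0.20])
print(match, round(diff, 3))  # → True 0.02
```

If the top-1 class flips even when the raw score gap is large (not just a few hundredths on a near-tie), that points to a real conversion or plugin problem rather than precision.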
Hi Benjamin,
No need to create the inference app from scratch. Could you PM me the model and I'll take a look at this?
I can't, or I don't know how to, send you a PM. My code is just classification_sample_async.exe run from the command line; I didn't change anything. My problem is with the inference: I get different results on MYRIAD and CPU.