Hi, I converted a PyTorch ONNX model to IR .xml and .bin files with the Model Optimizer, and I use classification_sample_async to run the inference, but I get different results. Is this a bug? Should I create the inference app from scratch? The problem is that the same image is classified into a different category depending on the device: on CPU it comes out as a bad image, and on MYRIAD as a good one.
Thanks.
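
In case it helps to show the comparison concretely, here is a minimal sketch of running the same IR on both devices and printing the top-5 classes with the pre-2022 Inference Engine Python API. The file names model.xml, model.bin and image.jpg are placeholders, and this uses the synchronous infer() call rather than the async path the sample uses, so it only illustrates the device comparison, not the sample itself.

```python
from openvino.inference_engine import IECore
import cv2
import numpy as np

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")
input_blob = next(iter(net.input_info))
out_blob = next(iter(net.outputs))
n, c, h, w = net.input_info[input_blob].input_data.shape

# Load and preprocess one image to the network's input layout (NCHW).
image = cv2.imread("image.jpg")
image = cv2.resize(image, (w, h)).transpose((2, 0, 1))

for device in ("CPU", "MYRIAD"):
    exec_net = ie.load_network(network=net, device_name=device)
    res = exec_net.infer(inputs={input_blob: image[np.newaxis, ...]})
    probs = res[out_blob].squeeze()
    top5 = probs.argsort()[-5:][::-1]
    print(device, top5, probs[top5])
```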
Another issue with MYRIAD is that it does not let me run inference on a batch; on CPU I don't have that problem.
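
For reference, a sketch of how the batch size is normally set on the IR before loading it to a device with the same pre-2022 Inference Engine Python API; the file names are placeholders, and whether the MYRIAD plugin actually accepts a batch greater than 1 depends on the topology and OpenVINO version, so this only shows where the setting goes.

```python
from openvino.inference_engine import IECore
import numpy as np

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")

# Set the batch size on the network before load_network().
# MYRIAD support for batch > 1 is limited and may be rejected here.
net.batch_size = 4

input_blob = next(iter(net.input_info))
n, c, h, w = net.input_info[input_blob].input_data.shape  # n is now 4

exec_net = ie.load_network(network=net, device_name="MYRIAD")
batch = np.zeros((n, c, h, w), dtype=np.float32)  # fill with real images
res = exec_net.infer(inputs={input_blob: batch})
```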
Hi Benjamin,
There's no need to create the inference app from scratch. Could you PM me the model and I'll take a look at this?
I can't, or I don't know how to, send you a PM. My code is classification_sample_async.exe, run from the command line; I didn't change anything. I have problems with the inference: I get different results on MYRIAD and CPU.