Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

Unable to understand outputs from IR model after converting it from PyTorch

vkana3
Beginner

Hi

I'm Vishnu. I created a skin cancer detection PyTorch model and successfully converted it to ONNX and then to an IR model.

However, when I perform inference on the IR model with an image to predict the class, I get the output shown below:

(Attached screenshot of the raw inference output: Screenshot (70)_LI.jpg)

For reference, please use the links below:

The GitHub link for this project: https://github.com/KVishnuVardhanR/skin_cancer_detection

The Colab notebook for the PyTorch model: https://colab.research.google.com/drive/11-jb9W0FWbXZWAlzZluiLOwEMSHOY_CY#scrollTo=85XZfpL9qC5z

 

Can anyone please help me resolve this issue?

Iffa_Intel
Moderator

Greetings,


First and foremost, could you clarify what kind of output you were expecting and the format you expect to see from PyTorch?


This is to ensure that it is possible to achieve your target output with OpenVINO.


Sincerely,

Iffa



vkana3
Beginner

Hi,

In PyTorch, after passing the input to the model, we use torch.max(F.softmax(out, dim=1), dim=1)[1] to get our output (the predicted class index).

In OpenVINO, since we have converted our model, we have to do something like torch.max(F.softmax(out, dim=1), dim=1)[1] to post-process our outputs, but in NumPy.

Can you please tell me how to do that in NumPy?

 

Iffa_Intel
Moderator

Generally, torch.max returns the maximum value of all elements in the input tensor; when called with a dim argument it also returns the indices of those maxima, which is what the [1] in your expression selects.


You can refer to the following page for the NumPy equivalents of torch.max and related functions:

https://github.com/torch/torch7/wiki/Torch-for-Numpy-users


*Refer to the calculation section of that page.
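
As an illustration only, a minimal NumPy post-processing sketch could look like the following. It assumes the IR model's raw output is a [1, num_classes] logits array, here named res (for example, the value taken from the dictionary returned by the Inference Engine for your output blob); that variable name is just a placeholder for your own code.

import numpy as np

# res is assumed to be the raw [1, num_classes] output of the IR model
# (e.g. the array you read from the inference results for your output blob).
logits = res

# Softmax over the class dimension (equivalent to F.softmax(out, dim=1)).
exp = np.exp(logits - np.max(logits, axis=1, keepdims=True))
probs = exp / np.sum(exp, axis=1, keepdims=True)

# Index of the highest probability (equivalent to torch.max(..., dim=1)[1]).
predicted_class = np.argmax(probs, axis=1)
print(predicted_class)

Note that because softmax is monotonic, np.argmax on the raw logits gives the same predicted class index; the softmax step is only needed if you also want the class probabilities.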


Hope this helps!

Sincerely,

Iffa


Iffa_Intel
Moderator

Greetings,


Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question.


Sincerely,

Iffa

