Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

How to get inference results from a custom model in C++

Yeoh__Ru_Sern
Novice

Hi,

I have trained my own classification model and converted it to OpenVINO IR format. Previously, I was using the OpenCV DNN API for inference, with the Inference Engine as the backend.

I would like to use the OpenVINO API directly now, but I am having trouble reading the results. I am following the hello_classification sample, and I understand how to read the network, load the input, and so on. However, the hello_classification sample appears to use a custom helper function to read the results. How can I read and manipulate the inference results myself?

For example, in the OpenCV DNN API, the inference output is returned as a Mat object, which I can read and manipulate directly, formatting it to get my desired results, e.g. the most confident predicted class and its probability. How can I do the same through the OpenVINO C++ API? A rough sketch of the pattern I use now is below.
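This is roughly my current OpenCV DNN code; the model/image file names and the 224x224 input size are placeholders for my actual setup:

```cpp
#include <opencv2/dnn.hpp>
#include <opencv2/imgcodecs.hpp>
#include <iostream>

int main() {
    // Placeholder file names; the IR was produced by the Model Optimizer
    cv::dnn::Net net = cv::dnn::readNetFromModelOptimizer("model.xml", "model.bin");
    net.setPreferableBackend(cv::dnn::DNN_BACKEND_INFERENCE_ENGINE);

    cv::Mat image = cv::imread("input.jpg");
    net.setInput(cv::dnn::blobFromImage(image, 1.0, cv::Size(224, 224)));

    // forward() returns the class scores as a 1xN cv::Mat
    cv::Mat prob = net.forward();

    // Most confident class index and its probability
    cv::Point class_id;
    double confidence;
    cv::minMaxLoc(prob.reshape(1, 1), nullptr, &confidence, nullptr, &class_id);
    std::cout << "class " << class_id.x << ", probability " << confidence << std::endl;
    return 0;
}
```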

Thanks.

Max_L_Intel
Moderator

Hi Yeoh Ru Sern,

Please refer to this documentation section for more information on how to use the Inference Engine C++ API with your application: https://docs.openvinotoolkit.org/latest/_docs_IE_DG_Integrate_with_customer_application_new_API.html

It also links to the Hello Classification sample, which covers this use case. A sketch of the flow it describes follows below.
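As a rough, untested sketch of that flow (the model path, the CPU device, and the assumption of a single FP32 output are placeholders for your own setup), reading the most confident class from the output blob looks roughly like this:

```cpp
#include <inference_engine.hpp>
#include <algorithm>
#include <iostream>
#include <iterator>
#include <string>

int main() {
    InferenceEngine::Core ie;

    // Placeholders: your IR file and target device
    InferenceEngine::CNNNetwork network = ie.ReadNetwork("model.xml");
    InferenceEngine::ExecutableNetwork exec_net = ie.LoadNetwork(network, "CPU");
    InferenceEngine::InferRequest request = exec_net.CreateInferRequest();

    // ... fill the input blob here, as shown in the Hello Classification sample ...

    request.Infer();

    // A classification network typically has a single output
    std::string output_name = network.getOutputsInfo().begin()->first;
    InferenceEngine::Blob::Ptr output = request.GetBlob(output_name);

    // Map the output blob to host memory to read the raw FP32 scores,
    // much like reading the cv::Mat returned by net.forward()
    InferenceEngine::MemoryBlob::CPtr moutput =
        InferenceEngine::as<InferenceEngine::MemoryBlob>(output);
    auto holder = moutput->rmap();
    const float* scores = holder.as<const float*>();
    size_t num_classes = output->getTensorDesc().getDims().back();

    // Most confident class and its probability
    const float* best = std::max_element(scores, scores + num_classes);
    std::cout << "class " << std::distance(scores, best)
              << ", probability " << *best << std::endl;
    return 0;
}
```

From the raw pointer you can sort, take the top-k, or wrap the data in a cv::Mat if you prefer to keep your existing post-processing code.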

Thanks.
Best regards, Max. 
