I have trained my own classification model and converted it to the IR format in OpenVINO. Previously, I was using the OpenCV DNN API for inference, with the Inference Engine as the backend.
I would like to use the OpenVINO API directly now, but I am having trouble reading the results. I am following the hello_classification example, and I understand how to read the network, load the input, etc. However, it seems the hello_classification example uses a custom function to read the results. How can I read and manipulate the inference results myself?
For example, with the OpenCV DNN API, the inference output is returned as a Mat object, which I can read and manipulate directly, formatting it to get my desired results, e.g. the most confident predicted class and its probability. How can I do the same through the OpenVINO C++ API?
Hi, Yeoh, Ru Sern.
Please refer to this documentation section for more information on how to use the Inference Engine C++ API in your application: https://docs.openvinotoolkit.org/latest/_docs_IE_DG_Integrate_with_customer_application_new_API.html
That page also links to the Hello Classification sample for a worked use case.
Best regards, Max.