Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

detection model's output data

deeplearner
Beginner

I read this code snippet:

for (auto &item : output_info) {
    auto output_name = item.first;
    auto output = infer_request.GetBlob(output_name);
    {
        auto const memLocker = output->cbuffer(); // use const memory locker
        // output_buffer is valid as long as the lifetime of memLocker
        const float *output_buffer = memLocker.as<const float *>();
        /** output_buffer[] - accessing output blob data **/
    }
}
 
Sorry, but I can't understand the last comment, "output_buffer[] - accessing output blob data".
The returned object is a pointer to an array, and I need to know the size of the array (the number of detected objects), but I cannot find a way to get it. Please help me!
 
Vladimir_Dudnik
Employee

@deeplearner  The InferRequest object allows you to obtain an output blob through the GetBlob function (as used in your code snippet). The Blob object in turn allows you to obtain a TensorDesc object, which holds the precision, shape, and dimensions of the data that you access through the pointer obtained by locking the Blob memory buffer.
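
For example, here is a minimal sketch of reading the output dimensions with the Inference Engine API. It assumes an SSD-like detection model whose output shape is {1, 1, N, 7}; infer_request and output_name are taken from the snippet in the question, and the other names are illustrative:

#include <inference_engine.hpp>

// Sketch only: walk the detections in an SSD-like output blob of shape
// {1, 1, max_detections, 7}.
InferenceEngine::Blob::Ptr output = infer_request.GetBlob(output_name);

// The TensorDesc carries the precision and shape of the blob's data.
const InferenceEngine::TensorDesc &desc = output->getTensorDesc();
const InferenceEngine::SizeVector &dims = desc.getDims();
const size_t max_detections = dims[2]; // N in {1, 1, N, 7}
const size_t object_size    = dims[3]; // 7 values per detection

auto const memLocker = output->cbuffer();
const float *output_buffer = memLocker.as<const float *>();

for (size_t i = 0; i < max_detections; ++i) {
    const float *det = output_buffer + i * object_size;
    if (det[0] < 0) break; // a negative image_id marks the end of valid detections
    const int   label      = static_cast<int>(det[1]);
    const float confidence = det[2];
    // det[3]..det[6] hold the normalized x_min, y_min, x_max, y_max coordinates
}

Note that output->size() also returns the total number of elements in the blob, so the blob itself tells you how large output_buffer is; the number of reported objects is at most dims[2], and a negative image_id terminates the list early.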

 

It might be better to look at a working OpenVINO sample or an Open Model Zoo demo to see how this is usually implemented in applications.

 

Regards,
  Vladimir

deeplearner
Beginner

Thank you, Mr. Dudnik, for your kind reply! I fully understood what you said, and it solved my problem. If you will allow me another question: how can I see the code of the working OpenVINO samples you mentioned? The site only shows how to run the samples.

Syamimi_Intel
Moderator

Hi sinsik jin,

After installation of the Intel® Distribution of OpenVINO™ toolkit, the C, C++, and Python* sample applications are available in the following directories, respectively:

 

  • <INSTALL_DIR>/inference_engine/samples/c
  • <INSTALL_DIR>/inference_engine/samples/cpp
  • <INSTALL_DIR>/inference_engine/samples/python

 

You can also refer to the following link for the Open Model Zoo demos:

https://github.com/openvinotoolkit/open_model_zoo/tree/master/demos

 

 

Regards,

Syamimi


deeplearner
Beginner

Thank you so much for your kind support!!!

 

Vladimir_Dudnik
Employee

Actually, the OpenVINO online documentation describes the location of the demos and samples, and it is sad to see that you could not find this information even with direct links that carry it right at the top of the samples page.

 

After installation of the Intel® Distribution of OpenVINO™ toolkit, the C, C++, and Python* sample applications are available in the following directories, respectively:

  • <INSTALL_DIR>/inference_engine/samples/c
  • <INSTALL_DIR>/inference_engine/samples/cpp
  • <INSTALL_DIR>/inference_engine/samples/python

 

And for the demos:

 

For the Intel® Distribution of OpenVINO™ toolkit, the demos are available after installation in the following directory: <INSTALL_DIR>/deployment_tools/open_model_zoo/demos. The demos can also be obtained from the Open Model Zoo GitHub repository. The C++, C++ G-API, and Python versions are located in the cpp, cpp_gapi, and python subdirectories, respectively.

 

Syamimi_Intel
Moderator

Hi sinsik jin,

This thread will no longer be monitored since this issue has been resolved. If you need any additional information from Intel, please submit a new question.


Regards,

Syamimi

