I read the code snippet;
@deeplearner The InferRequest object allows you to obtain the output blob through the GetBlob function (also used in your code snippet). The Blob object in turn provides a TensorDesc object, which carries the precision, shape, and dimensions of the data; the data itself is available through a pointer obtained by locking the Blob's memory buffer.
It might be better to look at a workable OpenVINO sample or Open Model Zoo demo to see how this is usually implemented in applications.
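The accessor chain described above can be sketched roughly as follows, against the legacy Inference Engine C++ API. This is a minimal sketch, not a complete application: `infer_request` and `output_name` are assumed to come from your existing network-loading code, and the output is assumed to hold FP32 data.

```cpp
#include <inference_engine.hpp>
#include <string>

// Sketch: read an output blob's description and data after inference.
// `infer_request` and `output_name` are placeholders from your own setup.
void read_output(InferenceEngine::InferRequest& infer_request,
                 const std::string& output_name) {
    // 1. Obtain the output Blob from the InferRequest.
    InferenceEngine::Blob::Ptr output = infer_request.GetBlob(output_name);

    // 2. The Blob's TensorDesc carries precision, layout, and shape.
    const InferenceEngine::TensorDesc& desc = output->getTensorDesc();
    const InferenceEngine::Precision precision = desc.getPrecision();
    const InferenceEngine::SizeVector& dims = desc.getDims();
    (void)precision;  // e.g. Precision::FP32

    // 3. Lock the memory buffer to get a pointer to the data.
    auto moutput = InferenceEngine::as<InferenceEngine::MemoryBlob>(output);
    if (!moutput) return;
    auto holder = moutput->rmap();  // read-only lock, released on destruction
    const float* data = holder.as<const float*>();
    (void)data;
    // ... iterate over `data` according to `dims` ...
}
```

The `rmap()` holder releases the lock automatically when it goes out of scope, so no explicit unlock call is needed.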
Regards,
Vladimir
Thank you, Mr. Dudnik, for your kind reply! I fully understood what you said, and it solved my problem. If I may ask another question: how can I see the "code" of the 'workable OpenVINO sample' that you mentioned? The site only shows how to run the samples.
Hi sinsik jin,
After installation of the Intel® Distribution of OpenVINO™ toolkit, the C, C++, and Python* sample application code is available in the following directories, respectively:
- <INSTALL_DIR>/inference_engine/samples/c
- <INSTALL_DIR>/inference_engine/samples/cpp
- <INSTALL_DIR>/inference_engine/samples/python
You can also refer to the following link for the Open Model Zoo demos:
https://github.com/openvinotoolkit/open_model_zoo/tree/master/demos
Regards,
Syamimi
Thank you so much for your kind support!!!
Actually, the OpenVINO online documentation describes the location of the demos and samples, and it is sad to see that you could not find this information even with direct links, since it appears right at the top of the samples page:
After installation of the Intel® Distribution of OpenVINO™ toolkit, C, C++, and Python* sample applications are available in the following directories, respectively:
- <INSTALL_DIR>/inference_engine/samples/c
- <INSTALL_DIR>/inference_engine/samples/cpp
- <INSTALL_DIR>/inference_engine/samples/python
and for the demos:
For the Intel® Distribution of OpenVINO™ toolkit, the demos are available after installation in the following directory: <INSTALL_DIR>/deployment_tools/open_model_zoo/demos. The demos can also be obtained from the Open Model Zoo GitHub repository. The C++, C++ G-API, and Python versions are located in the cpp, cpp_gapi, and python subdirectories, respectively.
Hi sinsik jin,
This thread will no longer be monitored since this issue has been resolved. If you need any additional information from Intel, please submit a new question.
Regards,
Syamimi
