
Model Caching with HDDL

taka

Hello,

I used model caching with HDDL, following the documentation at the URL below:
https://docs.openvino.ai/latest/openvino_docs_OV_UG_Model_caching_overview.html
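
For reference, this is a minimal sketch of how caching is enabled in the sample (the model path, cache directory, and device name here are placeholders, not the exact values from the attached program):
--------------------------------------------------
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;

    // Enable model caching: the compiled blob is written to ./model_cache
    // on the first run and reused on subsequent runs.
    core.set_property(ov::cache_dir("model_cache"));

    // Placeholder model path and device name.
    std::shared_ptr<ov::Model> model = core.read_model("model.xml");
    ov::CompiledModel compiled_model = core.compile_model(model, "HDDL");

    ov::InferRequest infer_request = compiled_model.create_infer_request();
    return 0;
}
--------------------------------------------------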

On the first run of the attached program, the model cache is written to the current directory and the program completes successfully.
On the second run, a core dump occurs.
The error occurs at line 64 of the attached program:

> infer_request.set_input_tensor(input_tensor);
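
For reference, a minimal diagnostic sketch (assuming the sample's compiled_model, infer_request, and input_tensor variables, plus <iostream>) to print how many inputs the compiled model reports on the cached run, since the exception in the log below indicates it is no longer exactly one:
--------------------------------------------------
// Check how many inputs the (possibly cached) compiled model reports
// before calling the single-input convenience overload.
const auto& inputs = compiled_model.inputs();
std::cout << "compiled model reports " << inputs.size() << " input(s)" << std::endl;

if (inputs.size() == 1) {
    infer_request.set_input_tensor(input_tensor);
} else if (!inputs.empty()) {
    // Address the first input port explicitly instead of relying on
    // set_input_tensor(), which requires exactly one input.
    infer_request.set_tensor(inputs[0], input_tensor);
}
--------------------------------------------------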


Both the first and second runs succeed when the inference device is "CPU" or "GPU".
I also tried an NCS2 in a Windows environment, and the NCS2 also runs successfully.

I also checked whether HDDL supports model caching with the following code, and it returns "true".
Therefore, we believe that model caching is supported by HDDL.
--------------------------------------------------
// Get list of supported device capabilities
std::vector<std::string> caps = core.get_property(deviceName, ov::device::capabilities);

// Find 'EXPORT_IMPORT' capability in supported capabilities
bool cachingSupported = std::find(caps.begin(), caps.end(), ov::device::capability::EXPORT_IMPORT) != caps.end();
--------------------------------------------------
Please let us know how to deal with this.


[environment]
OpenVINO 2022.1
OS: Ubuntu 20.04 LTS
CPU: Atom x7-E3950
Device: HDDL plugin

[operation]
root@b07fbfbf6fdb:~/openvino_api2.0_sample# make
root@b07fbfbf6fdb:~/openvino_api2.0_sample# ./run.sh
[setupvars.sh] OpenVINO environment initialized
5462.4ms

classid probability label
------ ------ ------
657: 56.0547% : minivan
437: 2.04468% : beach wagon
469: 1.74255% : cab
818: 1.66779% : sports car
655: 1.2558% : minibus
root@b07fbfbf6fdb:~/openvino_api2.0_sample# ./run.sh
[setupvars.sh] OpenVINO environment initialized
1997.21ms

terminate called after throwing an instance of 'ov::Exception'
what(): Check 'inputs.size() == 1' failed at inference/src/cpp/ie_infer_request.cpp:309:
set_input_tensor() must be called on a function with exactly one parameter.

./run.sh: line 5: 3545 Aborted (core dumped) ./sample
root@b07fbfbf6fdb:~/openvino_api2.0_sample#


Regards,
