Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Model Caching with HDDL

taka
Beginner

Hello,

I used model caching with HDDL, following this guide:
https://docs.openvino.ai/latest/openvino_docs_OV_UG_Model_caching_overview.html
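For reference, caching is enabled roughly as in the guide above. A minimal sketch (the model path, device name, and cache directory here are placeholders, not the exact contents of the attached program):

```cpp
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;

    // Write compiled-model blobs to ./model_cache; the cache is created on
    // the first run and loaded back on subsequent runs.
    core.set_property(ov::cache_dir("model_cache"));

    // On the second run this should import the cached blob instead of
    // recompiling the model.
    ov::CompiledModel compiled = core.compile_model("model.xml", "HDDL");

    ov::InferRequest infer_request = compiled.create_infer_request();
    return 0;
}
```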

On the first run of the attached program, the model cache is written to the current directory and the program terminates successfully.
On the second run, the program aborts with a core dump.
The error occurs at line 64 of the program:

> infer_request.set_input_tensor(input_tensor);


Both the first and second runs succeed when the inference device is "CPU" or "GPU".
I also tried an NCS2 in a Windows environment, and it also runs successfully.

I also checked whether HDDL supports model caching with the following code, which returns "true".
Therefore, we believe that model caching is supported by HDDL.
--------------------------------------------------
// Get list of supported device capabilities
std::vector<std::string> caps = core.get_property(deviceName, ov::device::capabilities);

// Find 'EXPORT_IMPORT' capability in supported capabilities
bool cachingSupported = std::find(caps.begin(), caps.end(), ov::device::capability::EXPORT_IMPORT) != caps.end();
--------------------------------------------------
Please let us know how to deal with this.

[environment]
OpenVINO 2022.1
OS: Ubuntu 20.04 LTS
CPU: Atom x7-E3950
Device: HDDL plugin

[operation]
root@b07fbfbf6fdb:~/openvino_api2.0_sample# make
root@b07fbfbf6fdb:~/openvino_api2.0_sample# ./run.sh
[setupvars.sh] OpenVINO environment initialized
5462.4ms

classid probability label
------ ------ ------
657: 56.0547% : minivan
437: 2.04468% : beach wagon
469: 1.74255% : cab
818: 1.66779% : sports car
655: 1.2558% : minibus
root@b07fbfbf6fdb:~/openvino_api2.0_sample# ./run.sh
[setupvars.sh] OpenVINO environment initialized
1997.21ms

terminate called after throwing an instance of 'ov::Exception'
what(): Check 'inputs.size() == 1' failed at inference/src/cpp/ie_infer_request.cpp:309:
set_input_tensor() must be called on a function with exactly one parameter.

./run.sh: line 5: 3545 Aborted (core dumped) ./sample
root@b07fbfbf6fdb:~/openvino_api2.0_sample#


Regards,

2 Replies
Peh_Intel
Moderator

Hi Taka,


We are very sorry for the late response.


Thank you for bringing this matter to our attention.


I was able to reproduce the issue. I also tried using the Compile Tool to generate a blob file and importing the blob directly into the application on HDDL, but I encountered the same error.
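For reference, the direct-import path I tried looks roughly like this (file names are placeholders; this is a sketch, not the exact application code):

```cpp
#include <fstream>
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;

    // Open a blob previously produced by the Compile Tool, e.g.:
    //   ./compile_tool -m model.xml -d HDDL -o model.blob
    std::ifstream blob("model.blob", std::ios::binary);

    // Import the precompiled blob directly, bypassing model compilation.
    ov::CompiledModel compiled = core.import_model(blob, "HDDL");

    ov::InferRequest infer_request = compiled.create_infer_request();
    return 0;
}
```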


We will investigate this issue and get back to you.



Regards,

Peh


Peh_Intel
Moderator

Hi Taka,


Thank you for reporting this issue. Unfortunately, due to product development priorities, engineering is unable to fix this bug. Since we cannot commit to a fix, we recommend closing this issue.



Regards,

Peh

