Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

FP16 model unable to run in OpenVINO

DarkHorse
Employee

Hi,

 

My customer has trained their AI model with the PyTorch framework using Intel Sonoma Creek, and they want to run AI inference on a VPU accelerator card.

 

They managed to convert their model from:

 

PyTorch -> ONNX -> FP16 IR Format

 

I have tested their FP16 model on my OpenVINO 2021 installation, and it shows the same error message:

[ INFO ] Parsing input parameters
[ INFO ] Reading input
[ INFO ] Loading Inference Engine
[ INFO ] Device info:
[ INFO ] CPU
MKLDNNPlugin version ......... 2021.4
Build ........... 0
Loading network files
[ ERROR ] Unknown model format! Cannot find reader for model format: xml and read the model: model_barehand.xml. Please check that reader library exists in your PATH.

 

The customer has no issue running the FP32 model in OpenVINO 2022 on CPU.

The customer also tried Intel's FP16 person-detection-retail-0002 model, and it works fine.

Can someone look into this?

 

Thanks.

 

 

 

 

5 Replies
Peh_Intel
Moderator

Hi DarkHorse,

 

For your information, the error message has changed in the latest version of OpenVINO™: earlier versions explicitly reported that the found IR version is not supported, whereas it now fails with "Unknown model format! Cannot find reader for model format".

In short, OpenVINO™ 2022 can load both IRv10 (converted by Model Optimizer 2021.4) and IRv11 (converted by Model Optimizer 2022.1) into the Inference Engine, whereas OpenVINO™ 2021.4 can only load IRv10 and produces the above error when loading IRv11.
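One quick way to check which IR version a model was converted to is to read the `version` attribute on the `<net>` root element of the IR `.xml` file. A minimal sketch (the inline XML string below is an illustrative stand-in, not the customer's actual model file):

```python
import xml.etree.ElementTree as ET

# Illustrative stand-in for an OpenVINO IR .xml header; a real IR file
# also contains <layers> and <edges> describing the network graph.
sample_ir = '<net name="model_barehand" version="11"><layers/><edges/></net>'

root = ET.fromstring(sample_ir)
ir_version = int(root.get("version"))
print(f"IR version: {ir_version}")

# IRv10 loads in both OpenVINO 2021.4 and 2022; IRv11 needs OpenVINO 2022.1+.
if ir_version >= 11:
    print("This model requires OpenVINO 2022.1 or newer")
```

For a real model, replace `ET.fromstring(sample_ir)` with `ET.parse("model_barehand.xml").getroot()` and inspect the printed version.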

 

 

Regards,

Peh


DarkHorse
Employee

Hello Peh,

 

My customer is seeing the same error message even when running inference in OpenVINO 2022.

Attached are the log files. Can you have a look?

 

Thanks.

Wan_Intel
Moderator

Hi DarkHorse,

Referring to this article, could you please check if both inference_engine_ir_reader.dll and inference_engine.dll are located in the following directory:

<INSTALL_DIR>\deployment_tools\inference_engine\bin\intel64\Release
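As a quick check, a short script can report whether both DLLs are present. The install root below is a placeholder assumption; substitute your actual `<INSTALL_DIR>`:

```python
from pathlib import Path

# Hypothetical install root -- replace with your actual OpenVINO <INSTALL_DIR>.
install_dir = Path(r"C:\Program Files (x86)\Intel\openvino_2021")
bin_dir = install_dir.joinpath(
    "deployment_tools", "inference_engine", "bin", "intel64", "Release"
)

# Report whether each required reader/engine DLL exists in the Release folder.
for dll in ("inference_engine_ir_reader.dll", "inference_engine.dll"):
    status = "found" if (bin_dir / dll).exists() else "MISSING"
    print(f"{dll}: {status}")
```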

 

 

Regards,

Wan


Peh_Intel
Moderator

Hi DarkHorse,


Thank you for your question. If you need any additional information from Intel, please submit a new question, as this thread is no longer being monitored.



Regards,

Peh


Farhâd
Novice

I see the same problem on the Raspberry Pi 4B. Please let me know the solution.
