Hi,
Can models created using the Intel 2022 OpenVINO Model Optimizer run on the 2023 OpenVINO toolkit inference engine?
Can models created using the Intel 2022.0 OpenVINO Model Optimizer run on newer versions of the 2022 OpenVINO toolkit inference engine? (i.e. I created a model using the 2022.3.0 Model Optimizer, but want to use it with the 2022.3.1 LTS engine.)
What is the rule for backwards compatibility? Are major releases (2020, 2021, etc.) incompatible with the following year's release (2021, 2022, etc.)?
Thx
Hi eddie_patton,
Thank you for reaching out to us.
Here are the answers to your questions:
- Yes. Models created with OpenVINO Model Optimizer version 2022 onwards are in the Intermediate Representation v11 (IR v11) format, which can run on OpenVINO 2023 using Inference Engine API 2.0.
- Yes, for the same reason as the first question.
- Regarding backwards compatibility, Inference Engine API 2.0 is backward compatible with OpenVINO IR v10 models. If you have OpenVINO IR v10 files, they can also be fed to OpenVINO Runtime. However, API 1.0 is not compatible with IR v11.
On a side note, the best practice is to use the same version of OpenVINO Model Optimizer and Inference Engine to avoid any compatibility issues.
Refer to the OpenVINO™ API 2.0 Transition Guide for detailed information regarding OpenVINO API and OpenVINO IR model format compatibility.
Regards,
Hairul
Thanks Hairul. This is great info to know.
Is there info in the model IR xml file that tells us which IR version was used, i.e. IR v10 or IR v11?
I reviewed the ov::Model docs https://docs.openvino.ai/2023.0/classov_1_1Model.html and there are APIs to get the name, element type, and shape, but nothing about the model version (like which API it uses or which version of the toolkit was used to create the IR). This info would be really helpful to have.
To confirm, the 2022 and 2023 releases are API 2.0, and anything prior is API 1.0?
I dug through the release info and it doesn't state which API each release uses, other than the LTS release notes stating that API 2.0 was introduced.
Releases · openvinotoolkit/openvino · GitHub
Cheers
Hi eddie_patton,
For your information, you can get the IR format version by viewing the model's .xml file. The IR version is written in the root element as follows:
<net name="model_name" version="IR_version">
For example, an alexnet model converted with OpenVINO 2023.0 has version="11" (IR v11) in its .xml header.
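Since the runtime API does not expose the IR version directly, one way to read it is to parse the `version` attribute of the `<net>` root element with the Python standard library. This is a minimal sketch; the file path is a hypothetical placeholder, and the inline sample string stands in for a real Model Optimizer output:

```python
import xml.etree.ElementTree as ET

def get_ir_version(xml_path):
    """Return the IR version declared in an OpenVINO model .xml file."""
    root = ET.parse(xml_path).getroot()  # root element is <net ... version="11">
    return int(root.attrib["version"])

# Demonstration with an in-memory IR header instead of a file on disk:
sample = '<net name="model_name" version="11"></net>'
print(int(ET.fromstring(sample).attrib["version"]))  # prints 11
```

For a real model you would call `get_ir_version("alexnet.xml")` (path assumed) on the .xml produced by Model Optimizer.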
Yes, the OpenVINO 2022 and 2023 releases use API 2.0. OpenVINO API 2.0 was first introduced in release 2022.1, and subsequent releases continue to use it. Anything prior to 2022.1 uses the Inference Engine API.
Refer to Introduction of API 2.0 and Changes to Inference Pipeline in OpenVINO API v2 for more information.
Regards,
Hairul
Hi eddie_patton,
This thread will no longer be monitored since this issue has been resolved. If you need any additional information from Intel, please submit a new question.
Regards,
Hairul