Hi,
I see a lot of documentation on converting an ONNX file into IR format (.mapping, .xml, and .bin) using the Model Optimizer tool.
Is there a way to convert from IR format back to ONNX file format?
Thanks,
Ryan
Hi Ryan,
Greetings to you.
The OpenVINO workflow does not support converting from IR format back to ONNX file format. Model Optimizer loads a model into memory, reads it, builds the internal representation of the model, optimizes it, and produces the Intermediate Representation (IR). IR is the only format that the Inference Engine accepts.
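As a minimal sketch of how the Inference Engine consumes the IR (assuming the legacy openvino.inference_engine Python API and placeholder file names model.xml / model.bin):

from openvino.inference_engine import IECore

ie = IECore()
# Read the IR pair produced by Model Optimizer (file names here are placeholders)
net = ie.read_network(model="model.xml", weights="model.bin")
# Compile the network for a target device, e.g. CPU
exec_net = ie.load_network(network=net, device_name="CPU")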
For your information, once the ONNX model is converted, the IR files are generated in a new output folder, while the original ONNX model remains untouched in its original directory.
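For example, a typical conversion command looks roughly like this (model and folder names are placeholders, and depending on your OpenVINO version the entry point may be mo or python mo.py):

mo --input_model model.onnx --output_dir ir_output

After it finishes, the ir_output folder contains the .xml, .bin, and .mapping files, and model.onnx is left unchanged in its original location.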
Regards,
Peh
Hi Ryan,
This thread will no longer be monitored since this issue has been resolved. If you need any additional information from Intel, please submit a new question.
Regards,
Peh