Ram_N_
Beginner

Support tflite to IR conversion natively

Hi Folks,

Could you please extend the Model Optimizer to accept precompiled TFLite models, such as the ones from mediapipe.dev? Since we don't have access to the underlying TensorFlow models, packages like tflite2onnx don't work because of various limitations (quantization, etc.).
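For context on why these precompiled models are opaque: a `.tflite` file is a FlatBuffer binary carrying the file identifier `TFL3` at byte offset 4, with no original TensorFlow graph inside to hand to existing converters. A minimal stdlib-only sketch of that format check (the helper name is mine, not from any library):

```python
import struct

def is_tflite_flatbuffer(data: bytes) -> bool:
    # A FlatBuffer file starts with a 4-byte root-table offset; TensorFlow Lite
    # models then carry the 4-byte file identifier b"TFL3" at bytes 4..8.
    return len(data) >= 8 and data[4:8] == b"TFL3"

# Hypothetical minimal header: little-endian root offset followed by "TFL3".
fake_header = struct.pack("<I", 8) + b"TFL3"
print(is_tflite_flatbuffer(fake_header))  # → True
```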


It would be great if mo.py could accept a TFLite model with FP16 quantization and convert it directly to IR. Since platforms like MediaPipe are opening up, this would help Intel reach a wider audience with networks that already run efficiently on iOS, Android, and desktop.
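The requested workflow might look something like this (a hypothetical invocation: the model file name is made up, and mo.py does not accept `.tflite` input today, which is exactly the ask):

```shell
# Hypothetical: convert a precompiled TFLite model straight to IR
python mo.py --input_model face_detection.tflite \
             --data_type FP16 \
             --output_dir ./ir_models
```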

Thanks,

Ram

2 Replies
Munesh_Intel
Moderator

Hi Ram,

Thanks for reaching out to us. I've forwarded your feature request to our development team for consideration.


Regards,

Munesh


Munesh_Intel
Moderator

Hi Ram,

We have forwarded your request for TF Lite support to our development team. However, we are unable to comment on future support or any planned enhancements, as they are subject to change.


This thread will no longer be monitored since we have provided an update. If you need any additional information from Intel, please submit a new question.


Regards,

Munesh