Hi, I'm trying to convert a TensorFlow model, but when I freeze it I can't choose my output node correctly because one of my layers isn't supported. How can I handle "tf.image.encode_png" with the Model Optimizer? I'd like to know if it's possible to freeze the model just before the "encoding" step, so that in my application the result of inference (after the infer request with OpenVINO) becomes the input to the last TensorFlow layer, tf.image.encode_png("my output").
You can try to implement tf.image.encode_png as a custom layer.
Alternatively, you can drop the last layer, generate the IR from the rest of the model (for example by passing the name of the node just before the encoding step to the Model Optimizer's --output option), and then feed the inference output to tf.image.encode_png yourself in your application.
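A minimal sketch of the second approach. The inference result is simulated here with a random array; the output name, layout (NCHW), and scaling are assumptions for illustration and will depend on your actual model:

```python
import numpy as np
import tensorflow as tf  # TF 2.x assumed

# Stand-in for the OpenVINO inference result (hypothetical NCHW float output).
infer_result = np.random.rand(1, 3, 64, 64).astype(np.float32)

# tf.image.encode_png expects an integer image of shape [height, width, channels],
# so convert CHW -> HWC and rescale floats in [0, 1] to uint8.
image = np.transpose(infer_result[0], (1, 2, 0))
image_u8 = np.clip(image * 255.0, 0.0, 255.0).astype(np.uint8)

# Run only the dropped last layer in TensorFlow.
png_bytes = tf.image.encode_png(image_u8).numpy()

with open("result.png", "wb") as f:
    f.write(png_bytes)
```

In a real application you would replace the random array with the output blob returned by the infer request.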