
Deprecation of TensorFlow CudnnRNNV3 on conversion to IR Format

Xanph
Beginner

OS: Ubuntu 24.04.2 LTS

OpenVINO Version: 2025.0.0

CUDA Version: V12.5.82

cuDNN Version: 9.8.0

TensorFlow Version: 2.19.0

Keras: 3.9.0

 

I'm attempting to convert my Keras model into the IR format. To do so I ran the following code:

import keras
import openvino as ov

# `version` is defined earlier in the notebook
model_name = f"{version}_best_motionx_model"
model_path = f'models/best_keras/{model_name}.keras'
model = keras.models.load_model(model_path)

# Save the model in the SavedModel format
saved_model_dir = f'models/tensorflow/{model_name}_tf'
model.export(saved_model_dir, format='tf_saved_model')

# Convert the SavedModel to OpenVINO IR format with multiple outputs
ir_model = ov.convert_model(
    saved_model_dir,
)

# Save the converted IR model
output_dir = "models/intermediate_representation"
ov.save_model(ir_model, f"{output_dir}/ir_motionX{version}.xml")

 

On the ov.convert_model call, I got this error saying that a translator cannot be found for CudnnRNNV3. Looking into this in the TensorFlow docs, the op appears to be deprecated.

 

OpConversionFailure:

---------------------------------------------------------------------------
OpConversionFailure Traceback (most recent call last)
Cell In[3], line 15
12 model.export(saved_model_dir, format='tf_saved_model')
14 # Convert the SavedModel to OpenVINO IR format with multiple outputs
---> 15 ir_model = ov.convert_model(
16 saved_model_dir,
17 )
19 # Save the converted IR model
20 output_dir = "models/intermediate_representation"

File /media/xanph/Data A/Development/motionx/motionx/lib/python3.12/site-packages/openvino/tools/ovc/convert.py:105, in convert_model(input_model, input, output, example_input, extension, verbose, share_weights)
103 logger_state = get_logger_state()
104 cli_parser = get_all_cli_parser()
--> 105 ov_model, _ = _convert(cli_parser, params, True)
106 restore_logger_state(logger_state)
107 return ov_model

File /media/xanph/Data A/Development/motionx/motionx/lib/python3.12/site-packages/openvino/tools/ovc/convert_impl.py:565, in _convert(cli_parser, args, python_api_used)
563 send_conversion_result('fail')
564 if python_api_used:
--> 565 raise e
566 else:
567 return None, argv

File /media/xanph/Data A/Development/motionx/motionx/lib/python3.12/site-packages/openvino/tools/ovc/convert_impl.py:505, in _convert(cli_parser, args, python_api_used)
499 if argv.framework is None and get_pytorch_decoder_for_model_on_disk(argv, args):
500 # try to load a model from disk as TorchScript or ExportedProgram
501 # TorchScriptPythonDecoder or TorchFXPythonDecoder object will be assigned to argv.input_model
502 # saved TorchScript and ExportedModel model can be passed to both ovc tool and Python convert_model
503 pytorch_model_on_disk = True
--> 505 ov_model = driver(argv, {"conversion_parameters": non_default_params})
507 if pytorch_model_on_disk:
508 # release memory allocated for temporal object
509 del argv.input_model

File /media/xanph/Data A/Development/motionx/motionx/lib/python3.12/site-packages/openvino/tools/ovc/convert_impl.py:249, in driver(argv, non_default_params)
245 def driver(argv: argparse.Namespace, non_default_params: dict):
246 # Log dictionary with non-default cli parameters where complex classes are excluded.
247 log.debug(str(non_default_params))
--> 249 ov_model = moc_emit_ir(prepare_ir(argv), argv)
251 return ov_model

File /media/xanph/Data A/Development/motionx/motionx/lib/python3.12/site-packages/openvino/tools/ovc/convert_impl.py:195, in prepare_ir(argv)
193 for extension in filtered_extensions(argv.extension):
194 moc_front_end.add_extension(extension)
--> 195 ov_model = moc_pipeline(argv, moc_front_end)
196 return ov_model
198 if not argv.input_model:

File /media/xanph/Data A/Development/motionx/motionx/lib/python3.12/site-packages/openvino/tools/ovc/moc_frontend/pipeline.py:293, in moc_pipeline(argv, moc_front_end)
289 input_model.set_partial_shape(place, ov_shape)
291 input_model.set_tensor_value(place, value)
--> 293 ov_model = moc_front_end.convert(input_model)
295 return ov_model

File /media/xanph/Data A/Development/motionx/motionx/lib/python3.12/site-packages/openvino/frontend/frontend.py:18, in FrontEnd.convert(self, model)
17 def convert(self, model: Union[Model, InputModel]) -> Model:
---> 18 converted_model = super().convert(model)
19 if isinstance(model, InputModel):
20 return Model(converted_model)

OpConversionFailure: Check 'is_conversion_successful' failed at src/frontends/tensorflow/src/frontend.cpp:478:
FrontEnd API failed with OpConversionFailure:
[TensorFlow Frontend] Internal error, no translator found for operation(s): CudnnRNNV3
To facilitate the conversion of unsupported operations, refer to Frontend Extension documentation: https://docs.openvino.ai/latest/openvino_docs_Extensibility_UG_Frontend_Extensions.html
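
To double-check that the CudnnRNNV3 node is actually baked into the exported SavedModel (rather than being introduced by the converter), a quick graph inspection along these lines should work. This is only a sketch, and it assumes the default serving_default signature produced by the Keras export:

import tensorflow as tf

# Load the exported SavedModel and list the op types in its inference graph.
loaded = tf.saved_model.load(saved_model_dir)
infer = loaded.signatures["serving_default"]  # default export signature (assumed)
op_types = sorted({op.type for op in infer.graph.get_operations()})

print("CudnnRNNV3" in op_types)  # True if the cuDNN-fused RNN kernel is in the graph
print(op_types)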

 

For context, I was able to run this conversion back in November. Since then, I think the only thing I have changed is moving to Keras 3 and TensorFlow 2.19.0.

 

What do you recommend I do?

 

Many thanks

Peh_Intel
Moderator

Hi Xanph,


Thanks for reaching out to us.


I would suggest downgrading your TensorFlow version to 2.16.1 and converting the Keras model into the IR format again.
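
For reference, after pinning the older TensorFlow (for example, pip install "tensorflow==2.16.1" in the same environment), the Keras model should be re-exported and re-converted with the same script as before. A minimal sketch, reusing the paths and the version variable from your post:

import keras
import openvino as ov

model = keras.models.load_model(f'models/best_keras/{version}_best_motionx_model.keras')
saved_model_dir = f'models/tensorflow/{version}_best_motionx_model_tf'
model.export(saved_model_dir, format='tf_saved_model')  # re-export with the downgraded TensorFlow

ir_model = ov.convert_model(saved_model_dir)
ov.save_model(ir_model, f"models/intermediate_representation/ir_motionX{version}.xml")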



Regards,

Peh


Xanph
Beginner

Hi Peh,

 

Thank you for that - it worked.

 

Does OpenVINO support TF 2.17? I ask because I need to get CUDA 12.3 working on Ubuntu 24.04, which currently seems to be a bit tricky.

 

Many thanks

Peh_Intel
Moderator

Hi Xanph,


Yes, OpenVINO has fully validated support for TensorFlow versions from 1.15.5 to 2.17. This information can be found in the System Requirements, under the DL framework versions section of Operating systems and developer environment.
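
If it helps, a quick sanity check of the installed versions against that validated range can be done before converting. A minimal sketch (the 2.17 upper bound here is simply the figure from the System Requirements page):

import tensorflow as tf
import openvino as ov

# Print the versions actually picked up by the environment.
print("TensorFlow:", tf.__version__)
print("OpenVINO:", ov.get_version())
# Validated TensorFlow range for this OpenVINO release: 1.15.5 up to 2.17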



Regards,

Peh


Peh_Intel
Moderator

Hi Xanph,


This thread will no longer be monitored since this issue has been resolved. If you need any additional information from Intel, please submit a new question. 



Regards,

Peh

