Hi,
I am trying to convert a machine translation model to IR with the latest version of OpenVINO, but it throws exceptions as follows.
Case 1: Using the frozen .pb file
- Command:
python3 ../openvino_2020.1.023/deployment_tools/model_optimizer/mo_tf.py --input_model model.2019-08-16-back-to-tf-1-13-translate.frozen.pb
- Error log:
[ ANALYSIS INFO ] It looks like there are input nodes of boolean type:
model.0/load_decoder_states_from_cache
model.0_1/decoder/transformer/use_terms
If this input node is as switch between the training and an inference mode, then you need to freeze this input with value True or False.
In order to do this run the Model Optimizer with the command line parameter:
--input "model.0/load_decoder_states_from_cache->False" or --input "model.0/load_decoder_states_from_cache->True"
--input "model.0_1/decoder/transformer/use_terms->False" or --input "model.0_1/decoder/transformer/use_terms->True"
to switch graph to inference mode.
[ ERROR ] Exception occurred during running replacer "None" (<class 'extensions.front.tf.assign_elimination.AssignElimination'>): Data flow edge coming out of Assign node decoder/transformer/layer-2/feed-forward/ffn/bias/fg_apply
Case 2: Using the checkpoints (meta-graph)
- Command:
python3 ../openvino_2020.1.023/deployment_tools/model_optimizer/mo_tf.py --input_meta_graph model.2019-08-16-back-to-tf-1-13.ckpt.meta
- Error log:
[ ANALYSIS INFO ] It looks like there are input nodes of boolean type:
model.0/load_decoder_states_from_cache
model.0_1/decoder/transformer/use_terms
If this input node is as switch between the training and an inference mode, then you need to freeze this input with value True or False.
In order to do this run the Model Optimizer with the command line parameter:
--input "model.0/load_decoder_states_from_cache->False" or --input "model.0/load_decoder_states_from_cache->True"
--input "model.0_1/decoder/transformer/use_terms->False" or --input "model.0_1/decoder/transformer/use_terms->True"
to switch graph to inference mode.
[ ERROR ] Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Graph contains a cycle. Cannot proceed.
For more information please refer to Model Optimizer FAQ (https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html), question #97.
This custom model seems to be transformer-based, like a BERT model, since its layers include encoders, decoders, and self-attention layers. Judging by the error log, the graph contains a cycle, and as per the OpenVINO documentation, the Model Optimizer supports only straightforward models without cycles. Is there any way to convert this model to IR, either from the .pb file or from the checkpoint?
Following are the environment and package details:
OS: Ubuntu 18.04
OpenVINO: 2020.1.023
TensorFlow: 1.14.0
Do you have any suggestions on this issue?
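For reference, the Model Optimizer log itself suggests freezing the boolean switch inputs. A command along these lines (freezing both inputs to False is an assumption; your model may need True for one or both) should at least get past the boolean-input analysis:

```shell
# Freeze the boolean switch inputs to inference-mode values.
# False for both inputs is an assumption; the model may require True.
python3 ../openvino_2020.1.023/deployment_tools/model_optimizer/mo_tf.py \
    --input_model model.2019-08-16-back-to-tf-1-13-translate.frozen.pb \
    --input "model.0/load_decoder_states_from_cache->False,model.0_1/decoder/transformer/use_terms->False"
```

Note that this only addresses the boolean-input warning; the Assign-node and graph-cycle errors reported afterwards would still need to be resolved separately.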
Hi Chithra,
Yes, the OpenVINO Model Optimizer supports only straightforward models without cycles. You can try to eliminate the cycle using one of the methods below.
- Replace the cycle-containing sub-graph in the Model Optimizer
- Extend the Model Optimizer with new primitives (building on the first step)
- Edit the network in the original framework to exclude the cycle
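For background on the cycle error itself: the Model Optimizer relies on a topological ordering of the graph, so any back edge (for example, a decoder feeding its cached state back into itself) makes conversion fail. A minimal sketch of such a check, using hypothetical toy graphs in place of the real model:

```python
def find_cycle(edges):
    """Return True if the directed graph given as {node: [successors]} has a cycle."""
    WHITE, GRAY, BLACK = 0, 1, 2  # unvisited / on the current DFS path / finished
    color = {n: WHITE for n in edges}

    def dfs(node):
        color[node] = GRAY
        for succ in edges.get(node, []):
            if color.get(succ, WHITE) == GRAY:  # back edge found: cycle
                return True
            if color.get(succ, WHITE) == WHITE and dfs(succ):
                return True
        color[node] = BLACK
        return False

    return any(color[n] == WHITE and dfs(n) for n in list(edges))

# Hypothetical graphs: an acyclic encoder chain, and a decoder that feeds
# its cached state back into itself (the kind of loop MO rejects).
acyclic = {"input": ["encoder"], "encoder": ["decoder"], "decoder": ["output"], "output": []}
cyclic = {"decoder": ["cache"], "cache": ["decoder"]}
print(find_cycle(acyclic))  # False
print(find_cycle(cyclic))   # True
```

Editing the network in the original framework (the third option) effectively means cutting such back edges, e.g. by freezing the graph for inference so the cached-state loop is removed.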
Regards,
Ram prasad