ChithraJ_Intel
Moderator

Machine translation model conversion -Failed

Hi,

I am trying to convert a machine translation model to IR with the latest version of OpenVINO, but it throws the following exceptions.
Case 1: Using the frozen .pb file

  • Command: 
python3 ../openvino_2020.1.023/deployment_tools/model_optimizer/mo_tf.py --input_model model.2019-08-16-back-to-tf-1-13-translate.frozen.pb
  • Error log: 

        [ ANALYSIS INFO ]  It looks like there are input nodes of boolean type:
                model.0/load_decoder_states_from_cache
                model.0_1/decoder/transformer/use_terms
        If this input node is as switch between the training and an inference mode, then you need to freeze this input with value True or False.
        In order to do this run the Model Optimizer with the command line parameter:
                --input "model.0/load_decoder_states_from_cache->False" or --input "model.0/load_decoder_states_from_cache->True"
                --input "model.0_1/decoder/transformer/use_terms->False" or --input "model.0_1/decoder/transformer/use_terms->True"
        to switch graph to inference mode.
        [ ERROR ]  Exception occurred during running replacer "None" (<class 'extensions.front.tf.assign_elimination.AssignElimination'>): Data flow edge coming out of Assign node decoder/transformer/layer-2/feed-forward/ffn/bias/fg_apply

Case 2: Using the checkpoints (meta-graph)

  • Command: 
python3 ../openvino_2020.1.023/deployment_tools/model_optimizer/mo_tf.py --input_meta_graph model.2019-08-16-back-to-tf-1-13.ckpt.meta
  • Error log: 

[ ANALYSIS INFO ]  It looks like there are input nodes of boolean type:
                model.0/load_decoder_states_from_cache
                model.0_1/decoder/transformer/use_terms
        If this input node is as switch between the training and an inference mode, then you need to freeze this input with value True or False.
        In order to do this run the Model Optimizer with the command line parameter:
                --input "model.0/load_decoder_states_from_cache->False" or --input "model.0/load_decoder_states_from_cache->True"
                --input "model.0_1/decoder/transformer/use_terms->False" or --input "model.0_1/decoder/transformer/use_terms->True"
        to switch graph to inference mode.
        [ ERROR ]  Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Graph contains a cycle. Cannot proceed.
        For more information please refer to Model Optimizer FAQ (https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html), question #97.

This custom model appears to be transformer-based, like a BERT model, since its layers include encoders, decoders and self-attention layers. Judging by the error log, the graph contains a cycle, and according to the OpenVINO documentation the Model Optimizer supports only straightforward models without cycles. Is there any way to convert this model to IR, either from the .pb file or from the checkpoint?
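As a first step, the two `--input` hints from the [ ANALYSIS INFO ] output can be combined into a single Model Optimizer invocation. The comma-separated freeze syntax inside one `--input` argument is an assumption to verify against the Model Optimizer documentation; a minimal sketch of building that command:

```python
# Sketch: assemble the mo_tf.py call that freezes both boolean
# train/inference switch inputs to their inference-mode values.
# The comma-separated "name->value" list in one --input flag is an
# assumption to check against the MO docs for your OpenVINO version.
import shlex

mo_script = "../openvino_2020.1.023/deployment_tools/model_optimizer/mo_tf.py"
cmd = [
    "python3", mo_script,
    "--input_model", "model.2019-08-16-back-to-tf-1-13-translate.frozen.pb",
    "--input",
    "model.0/load_decoder_states_from_cache->False,"
    "model.0_1/decoder/transformer/use_terms->False",
]
print(" ".join(shlex.quote(part) for part in cmd))
```

Freezing `load_decoder_states_from_cache` to `False` may also help with the cycle, since decoder-state caching is a common source of feedback edges in translation graphs.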
        
Following are the environment & package details: 

OS: Ubuntu 18.04
OpenVINO: 2020.1.023
TensorFlow: 1.14.0

Do you have any suggestions on this issue?

 

SIRIGIRI_V_Intel
Employee

Hi Chithra,

Yes, OpenVINO supports only straightforward models without cycles. You can try to eliminate the cycle using one of the methods below.

  1. Replace the cycle-containing sub-graph in the Model Optimizer
  2. Extend the Model Optimizer with new primitives from the first step
  3. Edit the network in the original framework to exclude the cycle
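To see concretely what FAQ question #97 refers to, here is a toy sketch (hypothetical node names, not the actual model graph) of the kind of back edge a cached decoder introduces, and the depth-first check that detects it:

```python
# Toy illustration of why partial inference fails on this graph:
# a decoder that feeds its cached states back into itself forms a
# cycle, found here via DFS with an "on the current path" marker.
def has_cycle(graph):
    WHITE, GRAY, BLACK = 0, 1, 2      # unvisited / on path / done
    color = {node: WHITE for node in graph}

    def visit(node):
        color[node] = GRAY
        for succ in graph.get(node, []):
            if color[succ] == GRAY:   # back edge -> cycle
                return True
            if color[succ] == WHITE and visit(succ):
                return True
        color[node] = BLACK
        return False

    return any(color[n] == WHITE and visit(n) for n in graph)

# Acyclic encoder stack vs. a decoder whose cache feeds back into it:
encoder = {"embed": ["enc_0"], "enc_0": ["enc_1"], "enc_1": []}
decoder = {"dec_in": ["dec_0"], "dec_0": ["cache"], "cache": ["dec_0"]}
```

Method 3 amounts to cutting such a `cache -> dec_0` edge in the original TensorFlow graph (for example by disabling decoder-state caching) before re-freezing and re-running the Model Optimizer.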

Regards,

Ram prasad
