V_Marco
Beginner
182 Views

Conversion Error TFmodel -> model Optimizer

Hi, I'm trying to convert a custom TensorFlow model that I've already frozen. I ran the Model Optimizer in debug mode, and the error it shows me is:

[ ERROR ]  Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.front.output_cut.OutputCut'>): Graph contains 0 node after executing <class 'extensions.front.output_cut.OutputCut'>. It considered as error because resulting IR will be empty which is not usual
[ 2020-02-21 15:18:04,261 ] [ DEBUG ] [ main:324 ]  Traceback (most recent call last):
  File "/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 295, in apply_transform
    for_graph_and_each_sub_graph_recursively(graph, lambda _: graph.check_empty_graph(replacer_cls))
  File "/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo/middle/pattern_match.py", line 58, in for_graph_and_each_sub_graph_recursively
    func(graph)
  File "/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 295, in <lambda>
    for_graph_and_each_sub_graph_recursively(graph, lambda _: graph.check_empty_graph(replacer_cls))
  File "/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo/graph/graph.py", line 719, in check_empty_graph
    "empty which is not usual".format(len(self.nodes()), description))
mo.utils.error.Error: Graph contains 0 node after executing <class 'extensions.front.output_cut.OutputCut'>. It considered as error because resulting IR will be empty which is not usual

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo/main.py", line 314, in main
    return driver(argv)
  File "/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo/main.py", line 281, in driver
    ret_res = emit_ir(prepare_ir(argv), argv)
  File "/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo/main.py", line 226, in prepare_ir
    graph = mo_tf.driver(argv)
  File "/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo/pipeline/tf.py", line 111, in driver
    class_registration.ClassType.BACK_REPLACER
  File "/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 324, in apply_replacements
    num_transforms=len(replacers_order))
  File "/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo/utils/logger.py", line 124, in wrapper
    function(*args, **kwargs)
  File "/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 303, in apply_transform
    )) from err
mo.utils.error.Error: Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.front.output_cut.OutputCut'>): Graph contains 0 node after executing <class 'extensions.front.output_cut.OutputCut'>. It considered as error because resulting IR will be empty which is not usual

I attached the frozen model; the original network is in this link:

 

https://github.com/FilippoAleotti/Dwarf-Tensorflow

Could you please try to reproduce the error, or give me some advice on whether and how it is possible to convert the model to IR representation? (I've already converted custom models successfully, so I have read all the posts in the forums and on your website.)

7 Replies
Luis_at_Intel
Moderator

Hi V Marco,

Thanks for reaching out. I am attempting to convert the model you attached (frozen_models.zip) and am running into issues. Could you please provide the MO command you used to convert the model? I'm not sure if it will be possible to convert it to IR, but we can certainly give it a try.

Regards,

Luis

V_Marco
Beginner

Thanks for your reply. I used this one: "python3 '/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo_tf.py' --input_model '/home/marco/OTHERS/DWARF-Tensorflow-master/frozen_dwarf.pb' --log_level DEBUG"

Analyzing the logs, the error starts from the node get_next_batch/IteratorGetNext, which is the data loader (using "from tensorflow.data import Dataset, Iterator", with the function "self.iterator = self.dataset.make_one_shot_iterator()"). I suspect these operations are not supported. Could you confirm that?
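If those data-loader ops are indeed unsupported, one common workaround is to cut them out of the graph by telling Model Optimizer to treat the output of IteratorGetNext as the model input, via the --input and --input_shape options. A minimal sketch (the node name is taken from the log above; the input shape [1,256,512,3] is a hypothetical placeholder, not the model's actual shape):

```python
# Hypothetical sketch: re-run Model Optimizer so it cuts the graph at the
# data-loader node, treating the output of IteratorGetNext as the model input.
# The shape [1,256,512,3] is a placeholder; substitute the model's real shape.
import subprocess

mo_tf = '/opt/intel/openvino_2020.1.023/deployment_tools/model_optimizer/mo_tf.py'
subprocess.run([
    'python3', mo_tf,
    '--input_model', '/home/marco/OTHERS/DWARF-Tensorflow-master/frozen_dwarf.pb',
    '--input', 'get_next_batch/IteratorGetNext',
    '--input_shape', '[1,256,512,3]',
    '--log_level', 'DEBUG',
], check=True)
```

With --input set, Model Optimizer generates the IR starting from that tensor and drops everything upstream of it, including the tf.data iterator ops.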

Luis_at_Intel
Moderator

You can find the list of TF* supported operations here; if your operation is not listed there, then it's likely not supported.

Regards,

Luis

V_Marco
Beginner

Thanks to your advice, I modified some variables and now the Model Optimizer works and generates the .xml.

Now, when loading the network into the IE (I'm using the latest version of OpenVINO), it shows me the following error:

  File "/home/marco/OTHERS/DWARF-Tensorflow-master/kittievalFrozenTF .py", line 38, in <module>
    exec_net = ie.load_network(network=net, device_name="GPU")
  File "ie_api.pyx", line 134, in openvino.inference_engine.ie_api.IECore.load_network
  File "ie_api.pyx", line 141, in openvino.inference_engine.ie_api.IECore.load_network
RuntimeError: Failed to find reference implementation for `dwarf/decoder/block_5/warp_features/align_present_right_features/image_sampling/MatMul` Layer with `FullyConnected` Type on constant propagation

But I checked, and MatMul is a supported layer, so I don't understand why it gives me this error. Could you give me some information? I attached the model.
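One way to narrow this down before calling load_network is to ask the device plugin which layers it actually supports, via IECore.query_network. A minimal sketch (the IR file names below are hypothetical placeholders for the converted model):

```python
# Minimal sketch: ask the device plugin which layers it supports, to find
# the layer(s) behind a "Failed to find reference implementation" error.
# The IR file names are placeholders; use the files Model Optimizer produced.
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model='frozen_dwarf.xml', weights='frozen_dwarf.bin')
supported = ie.query_network(network=net, device_name='GPU')
unsupported = [name for name in net.layers if name not in supported]
print('Layers not supported by the GPU plugin:', unsupported)
```

Any layer printed here is one the plugin cannot execute natively; that is often a more precise pointer than the load_network error, since MO can legally emit a layer (e.g. a MatMul converted to FullyConnected) that a particular device plugin still lacks an implementation for.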

 

Klinke__Addison
Beginner

Hi V Marco,

Did you ever get the "constant propagation" error resolved? I successfully converted an ONNX model using the OpenVINO optimizer and ran into a similar error using IECore.load_network. In my case, it was:

Traceback (most recent call last):
  File "python test.py", line 4, in <module>
    exec_net = ie.load_network(network=net, device_name='CPU')
  File "ie_api.pyx", line 134, in openvino.inference_engine.ie_api.IECore.load_network
  File "ie_api.pyx", line 141, in openvino.inference_engine.ie_api.IECore.load_network
RuntimeError: Failed to find reference implementation for `395/new` Layer with `Convolution` Type on constant propagation

It seems strange to me that IENetwork can load the IR model, but the inference engine doesn't work with it. Since Intel's code creates the IR, why doesn't the rest of their toolkit function properly when using it?

Thank you,

Addison

DLuon2
Beginner

Hi V Marco and Addison,

Have you resolved the "Layer with `FullyConnected` Type on constant propagation" error? I got the same error with the MatMul op and can't find any answer. Could you give me some information?

Thank you,

Tuan Luong

l__sw
Beginner

Hi V Marco,

I have the same problem as you. Can you tell me how you solved this error? You mentioned that you modified some variables, after which the Model Optimizer worked and generated the .xml; can you tell me how you did that?

Thank you