Beginner
219 Views

Problems with conversion of TF model

Hi,

I just trained a model with the Object Detection API on my own dataset and want to deploy it on an AI Edge device.

Using https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_convert_model_Convert_Model_From_T... as a guide, I tried to convert the model with the MetaGraph method:

python mo_tf.py --input_meta_graph ...\dataset\model.ckpt-21280.meta

and i got this as the output:

WARNING: Logging before flag parsing goes to stderr.
E0814 14:43:57.873081  1280 main.py:307] Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.front.output_cut.OutputCut'>): Graph contains 0 node after executing <class 'extensions.front.output_cut.OutputCut'>. It considered as error because resulting IR will be empty which is not usual

Any ideas on how to fix this? I also tried freezing the model first, but that conversion failed as well.

Thanks for your answers.

CD

6 Replies
Beginner

Just want to add that I did everything as described at https://github.com/datitran/raccoon_dataset to build the model.

Employee

Dear CD,

For the Tensorflow Object Detection API, your Model Optimizer command is incorrect. Please have a look at the Model Optimizer Tensorflow Object Detection API doc. You are missing some required switches in your mo_tf.py command.

Hope it helps,

Thanks,

Shubha

Beginner

Hi Shubha,

I already tried it with this command:

python mo_tf.py
--input_model .../dataset/inference_graph/frozen_inference_graph.pb
--tensorflow_use_custom_operations_config extensions/front/tf/ssd_v2_support.json
--tensorflow_object_detection_api_pipeline_config ...\dataset\inference_graph\pipeline.config

 

but it throws this error:

The Preprocessor block has been removed. Only nodes performing mean value subtraction and scaling (if applicable) are kept.
WARNING: Logging before flag parsing goes to stderr.
E0816 11:14:24.682615 12920 replacement.py:89] Failed to match nodes from custom replacement description with id 'ObjectDetectionAPISSDPostprocessorReplacement':
It means model and custom replacement description are incompatible.
Try to correct custom replacement description according to documentation with respect to model node names
E0816 11:14:33.846275 12920 infer.py:180] Cannot infer shapes or values for node "Postprocessor/Cast_1".
E0816 11:14:33.846275 12920 infer.py:181] 0
E0816 11:14:33.847273 12920 infer.py:182]
E0816 11:14:33.847273 12920 infer.py:183] It can happen due to bug in custom shape infer function <function Cast.infer at 0x000001CF5E343400>.
E0816 11:14:33.847273 12920 infer.py:184] Or because the node inputs have incorrect values/shapes.
E0816 11:14:33.847273 12920 infer.py:185] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
E0816 11:14:33.849267 12920 infer.py:194] Run Model Optimizer with --log_level=DEBUG for more information.
E0816 11:14:33.849267 12920 main.py:307] Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "Postprocessor/Cast_1" node.
 For more information please refer to Model Optimizer FAQ (https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html), question #38.

Can you help me with the command? I attached my frozen model to this post.

Thanks,

CD

 

Employee

Dear CD,

Please use ssd_support_api_v1.14.json, but first edit the file to change "Postprocessor/Cast" to "Postprocessor/Cast_1". Then it should work.

Let me know!

Thanks,

Shubha
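The one-line edit Shubha describes can also be scripted, which is handy if you reinstall OpenVINO and need to re-apply it. A minimal sketch (the file path is an assumption; point it at your own copy of ssd_support_api_v1.14.json under extensions/front/tf/):

```python
from pathlib import Path

def patch_support_config(path):
    """Rename the quoted node id "Postprocessor/Cast" to "Postprocessor/Cast_1"
    in a Model Optimizer custom-replacement config."""
    cfg = Path(path)
    text = cfg.read_text()
    # Replace the quoted token only, so an already-patched file is left unchanged.
    patched = text.replace('"Postprocessor/Cast"', '"Postprocessor/Cast_1"')
    cfg.write_text(patched)
    return patched

# Example (hypothetical path):
# patch_support_config("extensions/front/tf/ssd_support_api_v1.14.json")
```

After patching, rerun the mo_tf.py command from the earlier reply with --tensorflow_use_custom_operations_config pointing at the patched ssd_support_api_v1.14.json instead of ssd_v2_support.json.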

Beginner

Dear Shubha,

Thank you very much for your help! It's working perfectly.

CD

Employee

Dearest CD,

I'm extremely happy to hear it! Thanks for sharing with the OpenVino community!

Shubha

 
