Intel® Distribution of OpenVINO™ Toolkit

Converting SSD300 Keras Model to IR

Delattre__Benjamin
629 Views

Hi everyone!

I would like to convert a custom SSD300 model from Keras to IR (.bin and .xml), but I can't get it to work.

I know there are pretrained SSD models compatible with OpenVINO, but the point here is to use a custom model.

First of all, I trained my Keras model based on this repo, which is an SSD300 with a VGG16 feature extractor and the original Caffe layer names.

Then I converted my .h5 model to a TensorFlow .pb file using the tensorflow.python.framework.graph_io.write_graph() function, which worked well (I managed to run inference with it).
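For what it's worth, the freezing step described above usually looks like the sketch below. The tiny graph here is a minimal stand-in, not your SSD300; with a Keras model you would load the .h5, grab the backend session, and pass the model's output op names instead. It is written against the TF 1.x graph API (through tf.compat.v1 so it also runs on TF 2.x installs):

```python
# Sketch of freezing a TF 1.x graph into a self-contained .pb.
# The placeholder/Variable graph below is a stand-in for illustration; with a
# Keras .h5 you would use load_model(), K.get_session(), and
# [out.op.name for out in model.outputs] as the output node names.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # TF 1.x-style graph mode, as in the original workflow

x = tf.placeholder(tf.float32, [None, 4], name="input")
w = tf.Variable(tf.ones([4, 2]), name="w")
y = tf.matmul(x, w, name="preds")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Fold variables into constants so the graph file carries its own weights
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), ["preds"])
    tf.train.write_graph(frozen, "./", "frozen_model.pb", as_text=False)
```

The key point is that convert_variables_to_constants must receive the real output node names of your model, otherwise everything downstream of the last named node is pruned from the frozen graph.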

After that I tried to use the mo_tf.py script to optimize my model, but it fails:

python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo_tf.py --log_level DEBUG --input_model "./ssd300_OID_plates_frozen_model_training.pb" --output_dir "./" --input_shape [1,300,300,3]  --data_type FP32  2>&1 | tee -a "log_openvino.log"

This generated the following error:

I0529 16:06:28.643168 139922966857472 op.py:204] Start running infer function for individual op node with attributes: {'op': 'Reshape', 'name': 'input_channel_swap/strided_slice_1/Reshape_shrink', 'IE': [('layer', [('id', <function Op.substitute_ie_attrs.<locals>.<lambda> at 0x7f421f74d7b8>), 'name', 'precision', 'type'], [('data', [], []), '@ports', '@consts'])], 'in_ports_count': 2, 'shape_attrs': ['window', 'pad', 'output_shape', 'shape', 'stride'], 'type': 'Reshape', '_out_ports': {0}, 'precision': 'FP32', 'nchw_layout': True, 'out_ports_count': 1, 'infer': <function Reshape.__init__.<locals>.<lambda> at 0x7f421f6f8488>, '_in_ports': {0, 1}, 'kind': 'op', 'dim': array([  1, 300, 300]), 'dim_attrs': ['axis', 'batch_dims', 'channel_dims', 'spatial_dims']}
E0529 16:06:28.643693 139922966857472 main.py:317] Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.ConvertGroupedStridedSlice.ConvertGroupedStridedSlice'>): Number of elements in input [1 1] and output [1, 300, 300] of reshape node input_channel_swap/strided_slice_1/Reshape_shrink mismatch
I0529 16:06:28.644914 139922966857472 main.py:318] Traceback (most recent call last):
  File "/opt/intel/openvino_2019.1.133/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 167, in apply_replacements
    replacer.find_and_replace_pattern(graph)
  File "/opt/intel/openvino_2019.1.133/deployment_tools/model_optimizer/extensions/middle/ConvertGroupedStridedSlice.py", line 162, in find_and_replace_pattern
    self.add_reshape_for_shrink(graph, node)
  File "/opt/intel/openvino_2019.1.133/deployment_tools/model_optimizer/extensions/middle/ConvertGroupedStridedSlice.py", line 227, in add_reshape_for_shrink
    data_nodes=[out_node])
  File "/opt/intel/openvino_2019.1.133/deployment_tools/model_optimizer/mo/ops/op.py", line 205, in create_node_with_data
    new_op_node.infer(new_op_node)
  File "/opt/intel/openvino_2019.1.133/deployment_tools/model_optimizer/mo/ops/reshape.py", line 39, in <lambda>
    lambda node: np.reshape(node.in_node().value,
  File "/opt/intel/openvino_2019.1.133/deployment_tools/model_optimizer/mo/front/common/partial_infer/elemental.py", line 19, in single_output_infer
    node.out_node(0).shape = shape_infer(node)
  File "/opt/intel/openvino_2019.1.133/deployment_tools/model_optimizer/mo/front/common/partial_infer/reshape.py", line 69, in tf_reshape_shape_infer
    node.name))
mo.utils.error.Error: Number of elements in input [1 1] and output [1, 300, 300] of reshape node input_channel_swap/strided_slice_1/Reshape_shrink mismatch

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/intel/openvino_2019.1.133/deployment_tools/model_optimizer/mo/main.py", line 312, in main
    return driver(argv)
  File "/opt/intel/openvino_2019.1.133/deployment_tools/model_optimizer/mo/main.py", line 263, in driver
    is_binary=not argv.input_model_is_text)
  File "/opt/intel/openvino_2019.1.133/deployment_tools/model_optimizer/mo/pipeline/tf.py", line 128, in tf2nx
    class_registration.apply_replacements(graph, class_registration.ClassType.MIDDLE_REPLACER)
  File "/opt/intel/openvino_2019.1.133/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 184, in apply_replacements
    )) from err
mo.utils.error.Error: Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.ConvertGroupedStridedSlice.ConvertGroupedStridedSlice'>): Number of elements in input [1 1] and output [1, 300, 300] of reshape node input_channel_swap/strided_slice_1/Reshape_shrink mismatch

Do you have any idea what I should do?

I read a bit about --tensorflow_use_custom_operations_config and --tensorflow_object_detection_api_pipeline_config, but I don't understand how to use these parameters.

Thanks!

Regards

5 Replies
Shubha_R_Intel
Employee

OpenVINO does support several varieties of SSD, as seen in this TensorFlow list, but they are TensorFlow models, not Keras models.

Converting Tensorflow Object Detection API models should answer your questions about how to generate IR, though these are technically instructions for TensorFlow models (not Keras). Since you converted your trained Keras model into a TensorFlow frozen .pb, the instructions should hopefully work.

These models can also be custom-trained.

For --tensorflow_use_custom_operations_config, you will find the appropriate .json file (for instance ssd_v2_support.json) under:

C:\Program Files (x86)\IntelSWTools\openvino_2019.1.148\deployment_tools\model_optimizer\extensions\front\tf

For --tensorflow_object_detection_api_pipeline_config, you will find the *.config files in the Tensorflow Object Detection API repo.
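Putting the two flags together, an invocation might look like the sketch below. The frozen model name, the choice of ssd_v2_support.json, and pipeline.config are illustrative placeholders; a VGG16-based SSD that was not trained with the TF Object Detection API may need a different replacer file (or none at all):

```shell
# Illustrative only: file names below are placeholders, not a tested recipe.
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo_tf.py \
    --input_model frozen_model.pb \
    --tensorflow_use_custom_operations_config \
        /opt/intel/openvino/deployment_tools/model_optimizer/extensions/front/tf/ssd_v2_support.json \
    --tensorflow_object_detection_api_pipeline_config pipeline.config \
    --input_shape [1,300,300,3] \
    --data_type FP32 \
    --output_dir ./
```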

Hope this helps, and please make sure you're using the latest version of OpenVINO, 2019 R1.1.

Thanks,

Shubha
Delattre__Benjamin

Thank you for your reply! (It was the Ascension holiday in France.)

Indeed, IR conversion does not accept Keras models directly, which is why I converted mine to a .pb file.

Shubha R. (Intel) wrote:

Converting Tensorflow Object Detection API models should answer your questions about how to generate IR, though these are technically instructions for TensorFlow models (not Keras). Since you converted your trained Keras model into a TensorFlow frozen .pb, the instructions should hopefully work.

These models can be custom-trained also.

For --tensorflow_use_custom_operations_config, you will find the appropriate .json file (for instance ssd_v2_support.json) under:

C:\Program Files (x86)\IntelSWTools\openvino_2019.1.148\deployment_tools\model_optimizer\extensions\front\tf

For --tensorflow_object_detection_api_pipeline_config, you will find the *.config files in the Tensorflow Object Detection API repo.

I've already seen this page and tried to use ssd_v2_support.json; I even tried changing the output layer names in the file, but it did not work.
The documentation explains how to use known pre-trained models with already-created .config and .json files, but it does not explain how to use a custom model or how to create those .config and .json files. Where could I find documentation about that?

Thanks.
Shubha_R_Intel
Employee

Dear Delattre, Benjamin,

Unfortunately we don't have documentation on how to create your own supporting *.json files. But if you look at them, they all have a similar structure: "custom_attributes", "instances", "match_kind". The Model Optimizer code is fully open source; my recommendation would be to study the code and see how it's done.
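For illustration only, a sub-graph replacer file generally follows the skeleton below. The id, node names, and attribute shown here are made-up placeholders; a genuine example with real values is ssd_v2_support.json:

```json
[
  {
    "id": "MyCustomSSDPostprocessorReplacement",
    "match_kind": "points",
    "instances": {
      "start_points": ["some/subgraph/entry_node"],
      "end_points": ["some/subgraph/exit_node"]
    },
    "custom_attributes": {
      "confidence_threshold": 0.01
    }
  }
]
```

With "match_kind": "points", the Model Optimizer cuts out the sub-graph between the listed start and end nodes and hands it to the replacer class whose id matches; "custom_attributes" are passed through to that class.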

Sorry that I don't have a better answer for you. 

Thanks!

Shubha

Shubha_R_Intel
Employee

Dear Delattre, Benjamin,

I apologize for my previous incorrect answer regarding the *.json files for the Tensorflow Object Detection API. We do actually have documentation. What you are interested in is called "sub-graph replacers", and we certainly do address it in the documentation!

There is documentation about converting the TF OD API models, including a description of how the model is modified: https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_convert_model_tf_specific_Convert_Object_Detection_API_Models.html
It also links to documentation on how the *.json sub-graph replacers work: https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_customize_model_optimizer_Subgraph_Replacement_Model_Optimizer.html

If you run into trouble or have further questions, of course please post back here.

Thanks,

Shubha

Delattre__Benjamin

Thank you for these replies.

I think the second link covers what I'm looking for:

Shubha R. (Intel) wrote:

There is documentation about converting the TF OD API models, including a description of how the model is modified: https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_convert_model_tf_specific_Convert_Object_Detection_API_Models.html
It also links to documentation on how the *.json sub-graph replacers work: https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_customize_model_optimizer_Subgraph_Replacement_Model_Optimizer.html

Unfortunately I will not have time to work on this in the next two weeks, but I will get back to you as soon as I try it.

Regards,

Benjamin
