
- Intel Community
- Software
- Software Development SDKs and Libraries
- Intel® Distribution of OpenVINO™ Toolkit
- Model Optimizer did not work for CenterNet


Zhiming

Beginner


06-25-2019
02:04 PM

171 Views

Model Optimizer did not work for CenterNet

Hi,

I have been running the Keras version of CenterNet available here: https://github.com/see--/keras-centernet. Now I am trying to run the model on the Intel Neural Compute Stick 2, but when I tried to convert the frozen TensorFlow model to the IR, I got the error 'Graph contains cycle'.
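For reference, the Model Optimizer reports 'Graph contains cycle' when a topological sort of the frozen graph runs into a back edge. A minimal stdlib sketch of that check (toy graph with hypothetical node names, not actual MO code):

```python
def has_cycle(graph):
    """DFS three-color cycle check. graph maps node -> list of downstream nodes."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {n: WHITE for n in graph}

    def visit(n):
        color[n] = GRAY                    # node is on the current DFS path
        for m in graph.get(n, []):
            if color.get(m, WHITE) == GRAY:  # back edge -> cycle
                return True
            if color.get(m, WHITE) == WHITE and visit(m):
                return True
        color[n] = BLACK                   # fully explored, no cycle through here
        return False

    return any(color[n] == WHITE and visit(n) for n in graph)

acyclic = {'input': ['conv'], 'conv': ['output'], 'output': []}
looped = {'a': ['b'], 'b': ['a']}
assert not has_cycle(acyclic)
assert has_cycle(looped)
```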

Here are some steps to reproduce the error.

1. Git clone the project

2. Replace the **_process_sample** function in *keras-centernet/keras_centernet/models/decode.py* with the following, to avoid data-type mismatches in the Model Optimizer:

```python
def _process_sample(args):
    _hm, _reg, _wh = args
    _scores, _inds = tf.math.top_k(_hm, k=k, sorted=True)
    _classes = K.cast(_inds % cat, 'float64')
    #width = K.cast(width, 'float32')
    _inds = K.cast(_inds / cat, 'float64')
    _xs = K.cast(_inds % width, 'float64')
    _ys = K.cast(K.cast(_inds / width, 'float64'), 'float64')
    _inds = K.cast(_inds, 'int32')
    _xs = K.cast(_xs, 'float32')
    _ys = K.cast(_ys, 'float32')
    _classes = K.cast(_classes, 'float32')
    _wh = K.gather(_wh, _inds)
    _reg = K.gather(_reg, _inds)
    _xs = _xs + _reg[..., 0]
    _ys = _ys + _reg[..., 1]
    _x1 = _xs - _wh[..., 0] / 2
    _y1 = _ys - _wh[..., 1] / 2
    _x2 = _xs + _wh[..., 0] / 2
    _y2 = _ys + _wh[..., 1] / 2
    # rescale to image coordinates
    _x1 = output_stride * _x1
    _y1 = output_stride * _y1
    _x2 = output_stride * _x2
    _y2 = output_stride * _y2
    _detection = K.stack([_x1, _y1, _x2, _y2, _scores, _classes], -1)
    return _detection

detections = K.map_fn(_process_sample, [hm_flat, reg_flat, wh_flat], dtype=K.floatx())
return detections
```
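The index arithmetic in `_process_sample` (`ind % cat` recovers the class, `ind / cat` the spatial position, which then splits into x and y via `% width` and `/ width`) can be sanity-checked in plain Python. A sketch with hypothetical values for `cat` and `width` (integer `//` stands in for the float division plus int cast in the Keras code):

```python
cat = 80     # hypothetical number of classes (e.g. COCO)
width = 128  # hypothetical feature-map width (512 input / output stride 4)

def decode_index(ind):
    """Split a flattened top-k heatmap index back into (class, x, y)."""
    cls = ind % cat    # class is interleaved fastest
    pos = ind // cat   # spatial position in the feature map
    xs = pos % width
    ys = pos // width
    return cls, xs, ys

# position (y=2, x=5), class 7, round-tripped through the flattened index
ind = (2 * width + 5) * cat + 7
assert decode_index(ind) == (7, 5, 2)
```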

3. In **keras_centernet/bin/ctdet_image.py**, generate the frozen TensorFlow model using the following code:

```python
num_output = 1
predictions = [None] * num_output
prediction_node_names = [None] * num_output
for i in range(num_output):
    prediction_node_names[i] = 'output_node' + str(i)
    predictions[i] = tf.identity(model.outputs[i], name=prediction_node_names[i])
sess = K.get_session()
constant_graph = graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), prediction_node_names)
infer_graph = graph_util.remove_training_nodes(constant_graph)
graph_io.write_graph(infer_graph, '.', 'tf_model_new2', as_text=False)
```

4. Run keras-centernet with the following command to produce the frozen TensorFlow model:

*PYTHONPATH=. python keras_centernet/bin/ctdet_image.py --fn assets/demo2.jpg --inres 512,512* (Note: you can also run this in Docker.)

5. Use the following Model Optimizer command to generate the IR:

*sudo python3 mo_tf.py --input_model ~/keras-centernet/model/tf_model_new2.pb --log_level=DEBUG --input_shape=[1,512,512,3] --data_type FP16*

*Kindly let me know if you need more information. Thanks!*


3 Replies

Shubha_R_Intel

Employee


06-26-2019
03:55 PM


Dear Hu, Zhiming,

CenterNet is not one of the supported (validated) Model Optimizer models. You do have some options for this error, however; please see MO FAQ #97 and check out Offloading Sub-Graph Inference to TensorFlow.

Thanks,

Shubha

Zhiming

Beginner


06-27-2019
06:57 AM


Shubha R. (Intel) wrote: Dear Hu, Zhiming,

CenterNet is not one of the supported (validated) Model Optimizer models. You do have some options for this error, however; please see MO FAQ #97 and check out Offloading Sub-Graph Inference to TensorFlow.

Thanks,

Shubha

Thanks very much for the reply. I have looked at the two options: --tensorflow_subgraph_patterns and --tensorflow_operation_patterns. Since there are no error messages indicating which operators are unsupported, perhaps I should use the first command-line option. Is that correct?

Another question: I have no idea which sub-graphs should be offloaded to TensorFlow, since there are so many cycles in the model's architecture. I have attached the graph for the model. Could you please take a look and offer some suggestions? Thanks again for your time and effort!
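One way to locate the loops: the cyclic clusters are exactly the strongly connected components of the graph that contain more than one node. A stdlib sketch (Kosaraju's algorithm on a toy graph with hypothetical node names) that could be adapted to the node/edge list from the DEBUG log:

```python
def sccs(graph):
    """Strongly connected components via Kosaraju. graph: node -> successor list."""
    order, seen = [], set()

    def dfs1(n):  # first pass: record finishing order
        seen.add(n)
        for m in graph.get(n, []):
            if m not in seen:
                dfs1(m)
        order.append(n)

    for n in graph:
        if n not in seen:
            dfs1(n)

    rev = {n: [] for n in graph}       # second pass works on the reversed graph
    for n, ms in graph.items():
        for m in ms:
            rev.setdefault(m, []).append(n)

    comps, assigned = [], set()
    for n in reversed(order):
        if n in assigned:
            continue
        group, stack = set(), [n]
        while stack:
            v = stack.pop()
            if v in assigned:
                continue
            assigned.add(v)
            group.add(v)
            stack.extend(rev.get(v, []))
        comps.append(group)
    return comps

# toy graph: a decode loop (b <-> c) hanging off a feed-forward trunk
g = {'in': ['a'], 'a': ['b'], 'b': ['c'], 'c': ['b', 'out'], 'out': []}
loops = [s for s in sccs(g) if len(s) > 1]
assert loops == [{'b', 'c'}]
```

The multi-node components are the sub-graphs one would consider offloading with --tensorflow_subgraph_patterns.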

Shubha_R_Intel

Employee


06-28-2019
03:39 PM


Dear Zhiming,

I haven't had a chance to investigate this yet, but please run with --log_level DEBUG in your Model Optimizer command. Does the offending cyclic node occur at the end of the model? Somewhere in the middle? If it's at the beginning or end, you can cut it off using the model cutting technique. With model cutting you define different entry and exit points for your model, and inference may still work as long as you are not cutting off crucial parts.
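Conceptually, cutting at a new exit node keeps only that node and its ancestors, which is what the Model Optimizer's --output option does (and --input does the mirror image for entry points). A stdlib sketch on a toy graph with hypothetical node names:

```python
def cut_at_output(graph, new_output):
    """Keep only new_output and its ancestors. graph maps node -> its input nodes."""
    keep, stack = set(), [new_output]
    while stack:
        node = stack.pop()
        if node in keep:
            continue
        keep.add(node)
        stack.extend(graph.get(node, []))  # walk upstream through the inputs
    return keep

# toy graph: a cyclic decode tail hanging off a clean trunk
toy = {
    'input': [],
    'conv1': ['input'],
    'decode_loop': ['conv1'],  # the part we want to drop
    'heatmap': ['conv1'],
}
assert cut_at_output(toy, 'heatmap') == {'input', 'conv1', 'heatmap'}
```

In practice the cut would be something like *mo_tf.py --input_model tf_model_new2.pb --output &lt;node before the cycle&gt;*, with the dropped decode logic reimplemented as post-processing on the host.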

Thanks,

Shubha

