Hi,
I have a pre-trained TensorFlow model that contains a PReLU layer, implemented as below:
------------------------------------------------------------------------
with tf.variable_scope(name) as vs:
    alphas = tf.get_variable(name='alphas', shape=w_shape, initializer=a_init, **a_init_args)
    # prelu(x) = relu(x) - alphas * relu(-x)
    self.outputs = tf.nn.relu(self.inputs) - tf.multiply(alphas, tf.nn.relu(-self.inputs))
------------------------------------------------------------------------
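For reference, the decomposition above is mathematically equivalent to the usual piecewise PReLU definition. Here is a minimal NumPy sketch (not from the model, values are illustrative) that checks the two forms agree:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def prelu_decomposed(x, alphas):
    # Same formulation as the layer above: relu(x) - alphas * relu(-x)
    return relu(x) - alphas * relu(-x)

def prelu_piecewise(x, alphas):
    # Textbook definition: x if x > 0, else alphas * x
    return np.where(x > 0, x, alphas * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
alphas = 0.25
assert np.allclose(prelu_decomposed(x, alphas), prelu_piecewise(x, alphas))
```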
I can't convert the model with mo_tf.py successfully: it generates the .bin and .xml files, but it prints some weird messages like "Cannot apply broadcast".
Then, when I run the Inference Engine, it fails with an exception like "prelu_D0/Mul input port 0 is not connected to any data".
I took a look at "port 0" of that layer, and it corresponds to the "alphas" of the PReLU.
Did I do something wrong here? mo_tf.py translates the layer into some "Power/Eltwise/ReLU" layers. I assume the weights of each layer are stored in the .bin file, but what about the "alphas"?
Thanks
Hi Jing,
Could you share your model so that I can reproduce the issue on my side?
Kind Regards,
Monique Jones