JING_L_Intel
Employee
mo_tf with PRelu in tensorflow

Hi, I have a pre-trained TensorFlow model which contains a PReLU layer, implemented as below:

------------------------------------------------------------------------
with tf.variable_scope(name) as vs:
    alphas = tf.get_variable(name='alphas',
                             shape=w_shape,
                             initializer=a_init,
                             **a_init_args)
    self.outputs = tf.nn.relu(self.inputs) - tf.multiply(alphas, tf.nn.relu(self.inputs * -1))
------------------------------------------------------------------------

But I can't convert the model with mo_tf.py successfully: it generates the .bin and the .xml, but prints some weird messages ("Cannot apply broadcast"). Running the Inference Engine then fails with an exception like "prelu_D0/Mul input port 0 is not connected to any data". I took a look at "port 0" in that layer; it is the "alphas" of the PReLU.

Did I do something wrong here? mo_tf.py translates the layer into some "Power/Eltwise/ReLU" layers. I assume the weights of each layer are stored in the .bin file, but what about the "alphas"?

Thanks
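For reference, the TensorFlow expression above is a standard way to write PReLU in terms of ReLU. A minimal NumPy sketch (not from the thread; the names `prelu_via_relu` and `prelu_reference` are made up for illustration) checking that relu(x) - alpha * relu(-x) matches the piecewise PReLU definition:

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: max(x, 0)
    return np.maximum(x, 0.0)

def prelu_via_relu(x, alpha):
    # Same form as the TensorFlow snippet:
    # tf.nn.relu(x) - tf.multiply(alpha, tf.nn.relu(x * -1))
    return relu(x) - alpha * relu(-x)

def prelu_reference(x, alpha):
    # Piecewise definition: x if x > 0, alpha * x otherwise
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
alpha = 0.25
print(prelu_via_relu(x, alpha))
print(np.allclose(prelu_via_relu(x, alpha), prelu_reference(x, alpha)))
```

For x > 0 the second term vanishes and the output is x; for x < 0 the first term vanishes and the output is alpha * x, which is exactly PReLU.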
Monique_J_Intel
Employee
Hi Jing,

Could you share your model so that I can reproduce the issue on my side?

Kind Regards,

Monique Jones
