nikos1
Valued Contributor I

Multilayer perceptron support


In one of our applications we need to execute MLP inference. Is MLP a supported network type in OpenVINO, or are only CNNs supported? We have a frozen TensorFlow model, but it fails to go through the Model Optimizer.

Severine_H_Intel
Employee

Hi Nikos, 

An MLP is a network of fully connected layers with ReLU activations, so we should support it. OpenVINO can do more than just CNNs. Can you describe the error you are facing when importing from TF?

Best, 

Severine
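The structure Severine describes (fully connected layers plus ReLU activations) can be sketched as a minimal NumPy forward pass; the layer sizes and variable names here are illustrative, not taken from the thread:

```python
import numpy as np

def relu(z):
    # ReLU activation: elementwise max(0, z)
    return np.maximum(z, 0.0)

def mlp_forward(x, weights, biases):
    """Forward pass through fully connected layers, with ReLU on
    every hidden layer (no activation on the output layer)."""
    h = x
    for i, (W, b) in enumerate(zip(weights, biases)):
        h = h @ W + b              # fully connected layer
        if i < len(weights) - 1:   # hidden layers only
            h = relu(h)
    return h

# Illustrative shapes: 4 inputs -> 8 hidden units -> 3 outputs
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 8)), rng.standard_normal((8, 3))]
biases = [np.zeros(8), np.zeros(3)]
out = mlp_forward(rng.standard_normal((2, 4)), weights, biases)
print(out.shape)  # (2, 3)
```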



nikos1
Valued Contributor I

Hi Severine,

Thank you for confirming MLP support in OpenVINO. The MLP model optimization process works fine now.

For the record, the issue was that tf.float64 was used for the weights/biases, causing the error below. Changing to the default TF dtype tf.float32, retraining, and rerunning the Model Optimizer worked fine.

[ ERROR ] Cannot convert type of placeholder "x" because not all of its outputs are "Cast" to float operations: ['MatMul'].

For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #49.
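As a rough illustration of the dtype pitfall described above (using NumPy as a stand-in, since the original TF graph isn't shown in the thread; the shapes are hypothetical):

```python
import numpy as np

# NumPy, like tf.float64 weights, can silently produce 64-bit tensors:
# np.random.randn returns float64 by default, mirroring the weights
# that tripped the Model Optimizer in this thread.
w_bad = np.random.randn(784, 128)
print(w_bad.dtype)  # float64

# The fix is analogous to switching to the default TF dtype tf.float32:
# create weights/biases explicitly as 32-bit floats.
w_ok = np.random.randn(784, 128).astype(np.float32)
b_ok = np.zeros(128, dtype=np.float32)
print(w_ok.dtype, b_ok.dtype)  # float32 float32
```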

Thanks,

Nikos
