Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Multilayer perceptron support

nikos1
Valued Contributor I

In one of our applications we need to execute MLP inference. Is this network type supported by OpenVINO, or are only CNNs supported? We have a frozen TensorFlow model, but it fails to go through the Model Optimizer.

1 Solution
Severine_H_Intel
Employee

Hi Nikos, 

An MLP is a network of fully connected layers with ReLU activations, so we should support it. OpenVINO can do more than just CNNs. Can you describe here the error you are seeing when importing from TF?
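For readers unfamiliar with the term, the structure Severine describes (fully connected layers followed by ReLU activations) can be sketched in a few lines of NumPy. The layer sizes and names below are illustrative only, not taken from Nikos's model:

```python
import numpy as np

def relu(x):
    # ReLU activation: max(0, x) elementwise
    return np.maximum(x, 0.0)

def mlp_forward(x, weights, biases):
    # An MLP is a stack of fully connected (dense) layers;
    # ReLU is applied after every layer except the last.
    for i, (w, b) in enumerate(zip(weights, biases)):
        x = x @ w + b
        if i < len(weights) - 1:
            x = relu(x)
    return x

# Illustrative 4 -> 8 -> 3 network with float32 parameters
# (float32 is the default dtype TensorFlow and OpenVINO expect).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 8)).astype(np.float32),
           rng.standard_normal((8, 3)).astype(np.float32)]
biases = [np.zeros(8, dtype=np.float32), np.zeros(3, dtype=np.float32)]
out = mlp_forward(np.ones((1, 4), dtype=np.float32), weights, biases)
print(out.shape)  # (1, 3)
```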

Best, 

Severine

nikos1
Valued Contributor I

Hi Severine,

Thank you for confirming MLP support in OpenVINO. The MLP now goes through model optimization fine.

JFTR, the issue was that tf.float64 was used for the weights/biases, causing the error below. Changing to the default TF dtype tf.float32, retraining, and re-running the Model Optimizer worked fine.

[ ERROR ] Cannot convert type of placeholder "x" because not all of its outputs are "Cast" to float operations: ['MatMul'].

For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #49.
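The fix Nikos describes can be sketched in NumPy (the same dtype rules apply to the TensorFlow graph the Model Optimizer consumes): parameters created as float64 must be declared or cast to float32, TensorFlow's default dtype, before the graph is frozen. The array names below are illustrative, not from the original model:

```python
import numpy as np

# Parameters created with a float64 dtype -- analogous to using
# tf.float64 in the TensorFlow graph, the root cause of the MO error.
w_bad = np.random.standard_normal((4, 8))  # float64 by default
print(w_bad.dtype)  # float64

# The fix: use float32, the default TensorFlow dtype, for all
# weights/biases before freezing the graph for the Model Optimizer.
w_good = w_bad.astype(np.float32)
print(w_good.dtype)  # float32

# With float32 inputs and float32 weights, the MatMul stays float32
# end to end, which is what the Model Optimizer expects.
x = np.ones((1, 4), dtype=np.float32)
print((x @ w_good).dtype)  # float32
```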

Thanks,

Nikos
