In one of our applications we need to execute MLP inference. Is this a supported OpenVINO network, or are only CNNs supported? We have a frozen TensorFlow model, but it fails to go through the Model Optimizer.
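For reference, the conversion we attempt looks roughly like the command below (the model file name and input shape are placeholders, not the exact values from our setup):

```
python3 mo_tf.py --input_model frozen_mlp.pb --input_shape [1,784]
```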
Hi Nikos,
An MLP is a network of fully connected layers and ReLU activations, so we should support it. OpenVINO can do more than just CNNs. Can you describe here the error you are facing when importing from TF?
Best,
Severine
Hi Severine,
Thank you for confirming MLP support in OpenVINO. The Model Optimizer now processes the MLP model fine.
JFTR, the issue was that tf.float64 was used for the weights/biases, which caused the error below. Changing to the default TF dtype tf.float32, retraining, and re-running the Model Optimizer worked fine; a minimal sketch of the change follows the error message.
[ ERROR ] Cannot convert type of placeholder "x" because not all of its outputs are "Cast" to float operations: ['MatMul'].
For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #49.
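In case it helps anyone else, here is a minimal sketch of the kind of change involved (the layer sizes and variable names are hypothetical, not from our actual model); the point is simply that the placeholder and every variable use tf.float32 rather than tf.float64:

```python
import tensorflow as tf  # TF 1.x API; on TF 2.x use tf.compat.v1

# Key fix: use the default tf.float32 everywhere (previously tf.float64).
x = tf.placeholder(tf.float32, shape=[None, 784], name="x")
W1 = tf.Variable(tf.random_normal([784, 128], dtype=tf.float32))
b1 = tf.Variable(tf.zeros([128], dtype=tf.float32))
h1 = tf.nn.relu(tf.matmul(x, W1) + b1)  # fully connected + ReLU
W2 = tf.Variable(tf.random_normal([128, 10], dtype=tf.float32))
b2 = tf.Variable(tf.zeros([10], dtype=tf.float32))
logits = tf.add(tf.matmul(h1, W2), b2, name="logits")
```

With float32 throughout, the Model Optimizer no longer needs to insert Cast operations after the placeholder, and the conversion succeeds.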
Thanks,
Nikos