I am trying to convert a fine-tuned BERT-Base model to OpenVINO Intermediate Representation (IR) using the following command line:
python mo.py --input_model C:\Users\my.user\Documents\data\BERT\imdb\v1_frozen_graph.pb --input "input_mask_ids{i32}[1 512],input_type_ids{i32}[1 512],input_word_ids{i32}[1 512]" --output_dir C:\Users\my.user\Documents\data\BERT\imdb\v1_openvino
The conversion failed with errors like this for every layer:
[ ERROR ] List of operations that cannot be converted to Inference Engine IR:
[ ERROR ] Einsum (96)
...
[ ERROR ] StatefulPartitionedCall/model/bert_encoder_1/transformer/layer_5/self_attention/key/einsum/Einsum
[ ERROR ] StatefulPartitionedCall/model/bert_encoder_1/transformer/layer_5/self_attention/query/einsum/Einsum
[ ERROR ] StatefulPartitionedCall/model/bert_encoder_1/transformer/layer_5/self_attention/einsum/Einsum
[ ERROR ] StatefulPartitionedCall/model/bert_encoder_1/transformer/layer_5/self_attention/value/einsum/Einsum
[ ERROR ] StatefulPartitionedCall/model/bert_encoder_1/transformer/layer_5/self_attention/einsum_1/Einsum
[ ERROR ] StatefulPartitionedCall/model/bert_encoder_1/transformer/layer_5/self_attention/attention_output/einsum/Einsum
[ ERROR ] StatefulPartitionedCall/model/bert_encoder_1/transformer/layer_5/intermediate/einsum/Einsum
[ ERROR ] StatefulPartitionedCall/model/bert_encoder_1/transformer/layer_5/output/einsum/Einsum
...
I was wondering how it can be that Einsum is not supported (according to the documentation), whereas BERT models (TF 1.15 checkpoints from the original repo) are listed as successfully converted. Or am I missing something?
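For context, the original TF 1.x BERT implements its query/key/value projections with plain matmuls, while the TF2 Keras BERT encoders express the same projections with tf.einsum; that is presumably why Einsum nodes appear in this graph but not when converting the original TF 1.15 checkpoints. A minimal NumPy sketch of the equivalence (shapes are illustrative, not taken from the model above):

```python
import numpy as np

# Hypothetical shapes: batch B, sequence length S, hidden width W
B, S, W = 2, 4, 8
x = np.random.rand(B, S, W).astype(np.float32)    # input activations
kernel = np.random.rand(W, W).astype(np.float32)  # projection kernel

# TF2 BERT encoders express a dense projection as an einsum,
# e.g. tf.einsum("abc,cd->abd", x, kernel) ...
proj_einsum = np.einsum("abc,cd->abd", x, kernel)

# ... which is numerically the same projection the TF 1.x BERT
# writes as a plain matmul over the last axis:
proj_matmul = x @ kernel

assert np.allclose(proj_einsum, proj_matmul, atol=1e-5)
```

So the two graphs compute the same thing; they just serialize to different TensorFlow ops, and only the matmul form was convertible at the time.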
To create my model I executed the following steps:
- Downloaded a TF 1.15 BERT-Base checkpoint from the original BERT repo (stated as supported in the documentation).
- Converted this checkpoint to TF 2.5.
- Added a dense layer on top of BERT for a dummy sentiment-classification task on IMDB and reached reasonable accuracy.
- Exported the model as a frozen graph.
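The last two steps can be sketched roughly as follows, assuming a TF 2.x Keras setup; the tiny stand-in encoder, names, and shapes here are illustrative only, not the actual code used above:

```python
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

# Stand-in encoder: in the real workflow this would be the converted
# BERT-Base encoder; a tiny embedding + pooling stack keeps the sketch runnable.
seq_len = 512
inputs = tf.keras.Input(shape=(seq_len,), dtype=tf.int32, name="input_word_ids")
h = tf.keras.layers.Embedding(100, 16)(inputs)
h = tf.keras.layers.GlobalAveragePooling1D()(h)
# The extra dense head for binary (IMDB-style) sentiment classification:
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(h)
model = tf.keras.Model(inputs, outputs)

# Export as a frozen graph (constant-folded GraphDef) for mo.py:
concrete = tf.function(model).get_concrete_function(
    tf.TensorSpec([1, seq_len], tf.int32)
)
frozen = convert_variables_to_constants_v2(concrete)
tf.io.write_graph(frozen.graph.as_graph_def(), ".", "v1_frozen_graph.pb",
                  as_text=False)
```

The resulting .pb is what the mo.py command above points at.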
Thank you very much,
Greetings,
Currently, Einsum is not supported in OpenVINO 2021.3, and a PR has been initiated to enable it: Extend MO for operation Einsum-7 by rkazants · Pull Request #5401 · openvinotoolkit/openvino (github.com)
Development is in progress and is targeted for release in OpenVINO 2021.4.
Sincerely,
Iffa
Greetings,
Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question.
Sincerely,
Iffa