Intel® Distribution of OpenVINO™ Toolkit
Community assistance with the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

omz_converter not working

sarathfv
Beginner

(openvino_env) sarath@sarath:~/coredla_work/demo/open_model_zoo$ omz_converter --name resnet-50-tf \
> --download_dir $COREDLA_WORK/demo/models/ \
> --output_dir $COREDLA_WORK/demo/models/
========== Converting resnet-50-tf to IR (FP16)
Conversion command: /usr/bin/python3 -- /home/sarath/.local/bin/mo --framework=tf --output_dir=/home/sarath/coredla_work/demo/models/public/resnet-50-tf/FP16 --model_name=resnet-50-tf --input=map/TensorArrayStack/TensorArrayGatherV3 '--mean_values=[123.68,116.78,103.94]' --output=softmax_tensor --input_model=/home/sarath/coredla_work/demo/models/public/resnet-50-tf/resnet_v1-50.pb --reverse_input_channels '--layout=map/TensorArrayStack/TensorArrayGatherV3(NHWC)' '--input_shape=[1, 224, 224, 3]' --compress_to_fp16=True

[ ERROR ] -------------------------------------------------
[ ERROR ] ----------------- INTERNAL ERROR ----------------
[ ERROR ] Unexpected exception happened.
[ ERROR ] Please contact Model Optimizer developers and forward the following information:
[ ERROR ] load(): incompatible function arguments. The following argument types are supported:
1. (self: openvino._pyopenvino.FrontEnd, path: object) -> ov::frontend::InputModel

Invoked with: <FrontEnd 'tf'>, '/home/sarath/coredla_work/demo/models/public/resnet-50-tf/resnet_v1-50.pb', False
[ ERROR ] Traceback (most recent call last):
File "/home/sarath/.local/lib/python3.8/site-packages/openvino/tools/mo/convert_impl.py", line 896, in _convert
ov_model, legacy_path = driver(argv, {"conversion_parameters": non_default_params})
File "/home/sarath/.local/lib/python3.8/site-packages/openvino/tools/mo/convert_impl.py", line 559, in driver
graph, ngraph_function = prepare_ir(argv)
File "/home/sarath/.local/lib/python3.8/site-packages/openvino/tools/mo/convert_impl.py", line 413, in prepare_ir
ngraph_function = moc_pipeline(argv, moc_front_end)
File "/home/sarath/.local/lib/python3.8/site-packages/openvino/tools/mo/moc_frontend/pipeline.py", line 56, in moc_pipeline
input_model = moc_front_end.load(argv.input_model, share_weights)
TypeError: load(): incompatible function arguments. The following argument types are supported:
1. (self: openvino._pyopenvino.FrontEnd, path: object) -> ov::frontend::InputModel

Invoked with: <FrontEnd 'tf'>, '/home/sarath/coredla_work/demo/models/public/resnet-50-tf/resnet_v1-50.pb', False

[ ERROR ] ---------------- END OF BUG REPORT --------------
[ ERROR ] -------------------------------------------------
[ INFO ] You can also try to use legacy TensorFlow Frontend by using argument --use_legacy_frontend.

FAILED:
resnet-50-tf
WARNING:root:Failed to send event with the following error: Object of type type is not JSON serializable
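The "Invoked with: <FrontEnd 'tf'>, '…resnet_v1-50.pb', False" line is the tell: mo passed an extra `share_weights` argument that the installed runtime's `FrontEnd.load()` does not accept, which usually means the `openvino-dev` (Model Optimizer) package is newer than the `openvino` runtime. A minimal sketch of a version-alignment check, assuming only that both packages expose standard pip metadata (the helper names here are hypothetical):

```python
# Hypothetical helper: the TypeError above typically indicates that
# openvino-dev (which provides mo) and the openvino runtime come from
# different releases, so mo calls FrontEnd.load() with an argument the
# older runtime does not know about. This sketch only compares version
# strings; it does not call any OpenVINO API.
from importlib.metadata import version, PackageNotFoundError

def same_release(v1: str, v2: str) -> bool:
    """Compare the release part (e.g. '2023.0') of two version strings."""
    return v1.split(".")[:2] == v2.split(".")[:2]

def check_openvino_alignment() -> bool:
    try:
        runtime = version("openvino")
        dev = version("openvino-dev")
    except PackageNotFoundError:
        return False  # one of the two packages is missing entirely
    return same_release(runtime, dev)

# Mixing, say, a 2022.3 runtime with 2023.0 tools would be flagged:
print(same_release("2022.3.0", "2023.0.2"))  # False
print(same_release("2023.0.2", "2023.0.1"))  # True
```

If the releases differ, reinstalling both packages pinned to the same release (e.g. `pip install "openvino==2023.0.*" "openvino-dev==2023.0.*"`) is the usual fix.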

4 Replies
Zulkifli_Intel
Moderator

Hi sarathfv,

Thank you for reaching out to us.

Have you run the following command to install TensorFlow 1:

pip install openvino-dev[tensorflow]

Also, which OpenVINO version did you use? I tried on my side using OpenVINO 2023.0.2 and was able to convert the model to IR format.

(screenshot attached: tempsnip.png)

Regards,
Zulkifli
muhammad_ahmad_hayat

I am having a similar issue:
(openvino_env) PS C:\Users\ahmad\Downloads\pythonCV\mobilenetssd_C\open_model_zoo-master\tools\model_tools> omz_converter --name ssd_mobilenet_v3_large_coco_2020_01_14.pbtxt
>>
WARNING:root:Failed to send event with the following error: Object of type type is not JSON serializable
No matching models: "C:\Users\ahmad\Downloads\pythonCV\mobilenetssd_C\ssd_mobilenet_v3_large_coco_2020_01_14.pbtxt"

The version of TensorFlow I am using:

Name: tensorflow
Version: 2.14.1
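One likely contributor to the "No matching models" message above: `omz_converter --name` expects an Open Model Zoo model name, not a filename, so the `.pbtxt` path matches nothing in the model list. A purely illustrative sketch of the difference (the bare name must of course still exist in the OMZ catalog, which `omz_downloader --print_all` can list):

```python
# Illustrative only: --name takes an OMZ model *name*, so a Windows path
# ending in .pbtxt will never match. Stripping the directories and the
# extension shows what a bare name would look like.
from pathlib import Path

arg = r"C:\Users\ahmad\Downloads\pythonCV\mobilenetssd_C\ssd_mobilenet_v3_large_coco_2020_01_14.pbtxt"
bare = Path(arg.replace("\\", "/")).stem  # strip directories and extension
print(bare)  # ssd_mobilenet_v3_large_coco_2020_01_14
```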

muhammad_ahmad_hayat

I solved the problem by installing:

pip install protobuf==3.20.0

It seems I had a protobuf compatibility issue.

(openvino_env) PS C:\Users\ahmad\Downloads\pythonCV\mobilenetssd_C\openvino_env\Lib\site-packages\openvino\tools\mo> pip install protobuf==3.20.0
Collecting protobuf==3.20.0
Downloading protobuf-3.20.0-py2.py3-none-any.whl (162 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 162.1/162.1 kB 441.5 kB/s eta 0:00:00
Attempting uninstall: protobuf
Found existing installation: protobuf 4.25.2
Successfully uninstalled protobuf-4.25.2
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
tensorflow-intel 2.14.1 requires protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<5.0.0dev,>=3.20.3, but you have protobuf 3.20.0 which is incompatible.
Successfully installed protobuf-3.20.0
(openvino_env) PS C:\Users\ahmad\Downloads\pythonCV\mobilenetssd_C\openvino_env\Lib\site-packages\openvino\tools\mo> pip install protobuf==3.20.0
>>
Requirement already satisfied: protobuf==3.20.0 in c:\users\ahmad\downloads\pythoncv\mobilenetssd_c\openvino_env\lib\site-packages (3.20.0)
(openvino_env) PS C:\Users\ahmad\Downloads\pythonCV\mobilenetssd_C\openvino_env\Lib\site-packages\openvino\tools\mo> set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python
(openvino_env) PS C:\Users\ahmad\Downloads\pythonCV\mobilenetssd_C\openvino_env\Lib\site-packages\openvino\tools\mo> python mo.py --model_name ssdmobilev3_F32 --input_model "C:\Users\ahmad\Downloads\pythonCV\mobilenetssd_C\MobileNetSSD_deploy.caffemodel" --output_dir "C:\Users\ahmad\Downloads\pythonCV\mobilenetssd_C\model"
>>
Please expect that Model Optimizer conversion might be slow. You are currently using Python protobuf library implementation.
Check that your protobuf package version is aligned with requirements_caffe.txt.


For more information please refer to Model Conversion API FAQ, question #80. (https://docs.openvino.ai/2023.0/openvino_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html?question=80#question-80)
[ INFO ] Generated IR will be compressed to FP16. If you get lower accuracy, please consider disabling compression explicitly by adding argument --compress_to_fp16=False.
Find more information about compression to FP16 at https://docs.openvino.ai/2023.0/openvino_docs_MO_DG_FP16_Compression.html
[ INFO ] The model was converted to IR v11, the latest model format that corresponds to the source DL framework input/output format. While IR v11 is backwards compatible with OpenVINO Inference Engine API v1.0, please use API v2.0 (as of 2022.1) to take advantage
of the latest improvements in IR v11.
Find more information about API v2.0 and IR v11 at https://docs.openvino.ai/2023.0/openvino_2_0_transition_guide.html
[ SUCCESS ] Generated IR version 11 model.
[ SUCCESS ] XML file: C:\Users\ahmad\Downloads\pythonCV\mobilenetssd_C\model\ssdmobilev3_F32.xml
[ SUCCESS ] BIN file: C:\Users\ahmad\Downloads\pythonCV\mobilenetssd_C\model\ssdmobilev3_F32.bin
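One side note on the transcript above: `set NAME=value` is cmd.exe syntax and does not export an environment variable in PowerShell; the PowerShell form would be `$env:PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION = "python"`. The variable can also be set from Python itself, as long as it is set before `google.protobuf` is first imported. A minimal sketch:

```python
# The protobuf pure-Python fallback is selected via an environment variable
# that must be set *before* google.protobuf is first imported. Setting it
# from Python avoids shell-syntax differences between cmd.exe and PowerShell.
import os

os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"
print(os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"])  # python
```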

Zulkifli_Intel
Moderator

Hi, this thread will no longer be monitored since this issue has been resolved. If you need any additional information from Intel, please submit a new question.

