Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Model Optimizer problem with AlexNet

Tom929
Beginner

Hello,

I'm trying to generate IR for AlexNet, but it fails with a "Killed" message.
I tested squeezenet1.1 too; it shows the same warning about protobuf but succeeds in generating the IR.
Attached is the log file produced with the --log_level DEBUG option.
OS: Ubuntu 18.04
SDK: 2020.3.194

$ /opt/intel/openvino/deployment_tools/open_model_zoo/tools/downloader/downloader.py --name alexnet --output_dir /home/contec/openvino_models/ir
$ cd /opt/intel/openvino/deployment_tools/model_optimizer
$ cd install_prerequisites
$ ./install_prerequisites_caffe.sh
$ cd ..
$ python3 mo_caffe.py --input_model /home/contec/openvino_models/ir/public/alexnet/alexnet.caffemodel --output_dir /home/contec/openvino_models/ir/public/alexnet/
Model Optimizer arguments:
Common parameters:
- Path to the Input Model: /home/contec/openvino_models/ir/public/alexnet/alexnet.caffemodel
- Path for generated IR: /home/contec/openvino_models/ir/public/alexnet/
- IR output name: alexnet
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: Not specified, inherited from the model
- Mean values: Not specified
- Scale values: Not specified
- Scale factor: Not specified
- Precision of IR: FP32
- Enable fusing: True
- Enable grouped convolutions fusing: True
- Move mean values to preprocess section: False
- Reverse input channels: False
Caffe specific parameters:
- Path to Python Caffe* parser generated from caffe.proto: mo/front/caffe/proto
- Enable resnet optimization: True
- Path to the Input prototxt: /home/contec/openvino_models/ir/public/alexnet/alexnet.prototxt
- Path to CustomLayersMapping.xml: Default
- Path to a mean file: Not specified
- Offsets for a mean file: Not specified
Model Optimizer version:
[ WARNING ]
Detected not satisfied dependencies:
protobuf: installed: 3.0.0, required: == 3.6.1

Please install required versions of components or use install_prerequisites script
/opt/intel/openvino_2020.3.194/deployment_tools/model_optimizer/install_prerequisites/install_prerequisites_caffe.sh
Note that install_prerequisites scripts may install additional components.
Killed

Thank you.

IntelSupport
Community Manager

Hi Tom929,


Thanks for reaching out! You will need to update the protobuf version on your system to 3.6.1. I believe the install_prerequisites.sh script will not install protobuf if a previous version is already installed. Try manually removing the protobuf 3.0.0 that appears to be installed on your system and run the install_prerequisites_caffe.sh script once more. If that does not work, please try the steps mentioned in the following thread.


https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/protobuf-not-installed-required-3-6-1/td-p/1154807
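
For reference, a minimal way to do the removal and reinstall manually, assuming protobuf was installed through pip (your setup may differ), could look like this:

# assumes the old protobuf came from pip; adjust if it was installed via apt or another source
python3 -m pip uninstall -y protobuf
python3 -m pip install protobuf==3.6.1

After that, re-running install_prerequisites_caffe.sh should pick up the correct version.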


Regards,

Jesus


Tom929
Beginner

Hello,

Thank you for your reply.
I installed protobuf 3.6.1 and the warning disappeared, but the result is the same.
Attached is the log file produced with the --log_level DEBUG option.

$ python3 mo_caffe.py --input_model /home/test/work/03/public/alexnet/alexnet.caffemodel --output_dir /home/test/work/03/
Model Optimizer arguments:
Common parameters:
- Path to the Input Model: /home/test/work/03/public/alexnet/alexnet.caffemodel
- Path for generated IR: /home/test/work/03/
- IR output name: alexnet
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: Not specified, inherited from the model
- Mean values: Not specified
- Scale values: Not specified
- Scale factor: Not specified
- Precision of IR: FP32
- Enable fusing: True
- Enable grouped convolutions fusing: True
- Move mean values to preprocess section: False
- Reverse input channels: False
Caffe specific parameters:
- Path to Python Caffe* parser generated from caffe.proto: mo/front/caffe/proto
- Enable resnet optimization: True
- Path to the Input prototxt: /home/test/work/03/public/alexnet/alexnet.prototxt
- Path to CustomLayersMapping.xml: Default
- Path to a mean file: Not specified
- Offsets for a mean file: Not specified
Model Optimizer version:
Killed

Thanks

IntelSupport
Community Manager

Hi Tom929,


From the details you provided, everything seems to be done correctly. The Model Optimizer command usually needs more flags, but your way should work too. Could you share a log file containing the output of the install_prerequisites_caffe.sh script?


cd /opt/intel/openvino/deployment_tools/model_optimizer/install_prerequisites

sudo ./install_prerequisites_caffe.sh


You mentioned being able to convert the squeezenet1.1 model; could you share the command and the log file?


Also, please make sure all external dependencies and environment variables are set.

https://docs.openvinotoolkit.org/latest/_docs_install_guides_installing_openvino_linux.html#install-external-dependencies
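
As a quick check, assuming the default installation path, the environment variables can be set for the current shell with:

# default install location; adjust if OpenVINO is installed elsewhere
source /opt/intel/openvino/bin/setupvars.sh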


Regards,

Jesus



Tom929
Beginner

Yes, there were no problems generating squeezenet1.1; please refer to squeezenet1.1.log.
Also attached is the output of install_prerequisites_caffe.sh.

All external dependencies and environment variables are set, and I can build and run the demos and samples.

Thanks.

IntelSupport
Community Manager

Hi Tom929,

This is very strange. Could you try manually downloading the model and converting it with the following command?

wget https://raw.githubusercontent.com/BVLC/caffe/88c96189bcbf3853b93e2b65c7b5e4948f9d5f67/models/bvlc_alexnet/deploy.prototxt 
wget http://dl.caffe.berkeleyvision.org/bvlc_alexnet.caffemodel

python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo.py \
--input_shape [1,3,227,227] \
--input data \
--mean_values data[104.0,117.0,123.0] \
--output prob \
--input_model bvlc_alexnet.caffemodel \
--input_proto deploy.prototxt
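
For context (based on the standard BVLC AlexNet deploy.prototxt), [1,3,227,227] is the expected input shape (batch 1, 3 channels, 227x227 pixels), the mean_values are the per-channel BGR means subtracted during preprocessing, and prob is the name of the final softmax output layer.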

 

 

 

Regards,

Jesus
Tom929
Beginner

It works well; the IR files are generated and I can use them in the Hello Classification sample.
I also tested the command below, using the AlexNet files I got from the Model Downloader, and it worked too.

So was the cause the parameters passed to the Model Optimizer?

python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo.py \
--input_shape [1,3,227,227] \
--input data \
--mean_values data[104.0,117.0,123.0] \
--output prob \
--input_model alexnet.caffemodel \
--input_proto alexnet.prototxt
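
For reference, the generated alexnet.xml and alexnet.bin can then be passed to the sample roughly like this (the binary location, test image, and device are placeholders for whatever is built and available locally):

# run from the directory where the hello_classification sample was built
./hello_classification alexnet.xml car.png CPU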

IntelSupport
Community Manager

Hi Tom929,


I'm glad that worked for you! I don't believe it was your Model Optimizer command, as I ran the same command on my system and it works. It's possible that the .caffemodel you were originally using was corrupted.


Feel free to reach out again if you have additional questions.


Regards,

Jesus

