Hi,
When I run the Model Optimizer on a Faster R-CNN model (Windows 10, Caffe), the conversion fails. Please help me solve the problem, thank you very much.
C:\Intel\computer_vision_sdk_2018.4.420\deployment_tools\inference_engine\samples\object_detection_demo>python %MO_ROOT_PATH%/mo.py --input_model ZF_faster_rcnn_final.caffemodel --input_proto test.prototxt
Model Optimizer arguments:
Common parameters:
- Path to the Input Model: C:\Intel\computer_vision_sdk_2018.4.420\deployment_tools\inference_engine\samples\object_detection_demo\ZF_faster_rcnn_final.caffemodel
- Path for generated IR: C:\Intel\computer_vision_sdk_2018.4.420\deployment_tools\inference_engine\samples\object_detection_demo\.
- IR output name: ZF_faster_rcnn_final
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: Not specified, inherited from the model
- Mean values: Not specified
- Scale values: Not specified
- Scale factor: Not specified
- Precision of IR: FP32
- Enable fusing: True
- Enable grouped convolutions fusing: True
- Move mean values to preprocess section: False
- Reverse input channels: False
Caffe specific parameters:
- Enable resnet optimization: True
- Path to the Input prototxt: C:\Intel\computer_vision_sdk_2018.4.420\deployment_tools\inference_engine\samples\object_detection_demo\test.prototxt
- Path to CustomLayersMapping.xml: C:\Intel\computer_vision_sdk_2018.4.420\deployment_tools\model_optimizer\extensions\front\caffe\CustomLayersMapping.xml
- Path to a mean file: Not specified
- Offsets for a mean file: Not specified
Model Optimizer version: 1.4.292.6ef7232d
Please expect that Model Optimizer conversion might be slow. You are currently using the Python protobuf library implementation.
However, you can use the C++ protobuf implementation that is supplied with the OpenVINO toolkit, or build the protobuf library from sources.
Navigate to "install_prerequisites" folder and run: python -m easy_install protobuf-3.5.1-py($your_python_version)-win-amd64.egg
set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp
For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #80.
[ ERROR ] Cannot infer shapes or values for node "rpn_conv/3x3".
[ ERROR ] -1
[ ERROR ]
[ ERROR ] It can happen due to bug in custom shape infer function <function Convolution.infer at 0x00000282D83BB400>.
[ ERROR ] Or because the node inputs have incorrect values/shapes.
[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ] Stopped shape/value propagation at "rpn_conv/3x3" node.
For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #38.
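As the log itself suggests, rerunning with `--log_level=DEBUG` is the first step to pinpoint why shape propagation stops at `rpn_conv/3x3`. Since the error also mentions that embedded input shapes may be wrong, passing an explicit `--input_shape` is worth trying. A minimal sketch of the rerun, assuming a typical Faster R-CNN input of 1x3x600x1000 (this shape is an assumption for illustration, not taken from your prototxt; adjust it to your model's data layer):

```shell
:: Re-run Model Optimizer with debug logging and an explicit input shape.
:: NOTE: [1,3,600,1000] is an assumed example for a typical Faster R-CNN
:: test.prototxt; replace it with the shape your data layer actually expects.
python %MO_ROOT_PATH%/mo.py ^
    --input_model ZF_faster_rcnn_final.caffemodel ^
    --input_proto test.prototxt ^
    --input_shape [1,3,600,1000] ^
    --log_level=DEBUG
```

The DEBUG log prints the inferred shape of every node up to the failure point, which usually shows whether the convolution received an empty or negative input dimension.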