Intel® Distribution of OpenVINO™ Toolkit

Model Optimizer with OpenVINO + FPGA


Hi All,

I am trying to generate the .xml and .bin files using the Model Optimizer, with the command

"python3 /home/datadrive1/intel/computer_vision_sdk_2018.4.420/deployment_tools/model_optimizer/mo_tf.py --input_model 0818_pspnet_1.0_713_resnet_v1/frozen_inference_graph_opt.pb"

 

but it always fails with the log below:

Model Optimizer version: 1.4.292.6ef7232d

[ ERROR ] Cannot infer shapes or values for node "pad_to_bounding_box/Pad".

[ ERROR ] Input tensor shape [ 1 1 256 256 3] and pads values (4, 2) do not match for Pad node pad_to_bounding_box/Pad

[ ERROR ] 

[ ERROR ] It can happen due to bug in custom shape infer function <function Pad.infer at 0x7f0994516f28>.

[ ERROR ] Or because the node inputs have incorrect values/shapes.

[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).

[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.

[ ERROR ] Stopped shape/value propagation at "pad_to_bounding_box/Pad" node. 

 For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #38.
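
From the error it looks like the input tensor of the Pad node is 5-D ([ 1 1 256 256 3]) while the pads values are 4x2, so I suspect the shape embedded in the graph is not what the Model Optimizer expects and an explicit 4-D shape has to be passed. Is something like the command below the right way to do that? (The 713x713 size is only my guess from the model name, so please correct me if the flag format or the values are wrong.)

"python3 /home/datadrive1/intel/computer_vision_sdk_2018.4.420/deployment_tools/model_optimizer/mo_tf.py --input_model 0818_pspnet_1.0_713_resnet_v1/frozen_inference_graph_opt.pb --input_shape [1,713,713,3]"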

 

 

I also tried with another model, but got a similar error log;

cmd: python3 /home/datadrive1/intel/computer_vision_sdk_2018.4.420/deployment_tools/model_optimizer/mo_tf.py --input_model ICNet-model.pb

 

[ ERROR ] Shape is not defined for output 0 of "split".

[ ERROR ] Shape is not defined for output 1 of "split".

[ ERROR ] Shape is not defined for output 2 of "split".

[ ERROR ] Cannot infer shapes or values for node "split".

[ ERROR ] Not all output shapes were inferred or fully defined for node "split". 

 For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #40. 

[ ERROR ]  

[ ERROR ] It can happen due to bug in custom shape infer function <function tf_split_infer at 0x7f0cae157d08>.

[ ERROR ] Or because the node inputs have incorrect values/shapes.

[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).

[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.

[ ERROR ] Stopped shape/value propagation at "split" node.

 

Please help me understand how to decide on and pass the --input_shape value to the Model Optimizer for a given model.
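
Is inspecting the Placeholder nodes of the frozen graph the right way to find that shape? Below is a minimal sketch of what I have in mind (assuming a TensorFlow 1.x frozen GraphDef; the file name is just the ICNet model from above):

import tensorflow as tf

# Load the frozen GraphDef and list its Placeholder (input) nodes.
graph_def = tf.GraphDef()
with open("ICNet-model.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

for node in graph_def.node:
    if node.op == "Placeholder":
        # Dimensions of -1 here mean the size is unknown in the graph,
        # which is presumably why an explicit --input_shape is needed.
        print(node.name, node.attr["shape"].shape)

If the printed shape contains unknown (-1) dimensions, is the expectation that I fill them in manually via --input_shape?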

Ref:

model download link:

https://drive.google.com/uc?id=13E8KvN43QZKi4uf42ECdaReT_FYvdY-m&export=download

 

Thanks & Regards,

Sudhir
