Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Model Optimizer with Openvino + FPGA

MuhammadAr_U_Intel

Hi,

I am opening this thread for a user question that landed on the Intel FPGA forums. The question relates to the Model Optimizer.

Here is a direct link to the customer's question:

https://forums.intel.com/s/question/0D50P00004AUkXQSA1/model-optimizer-with-openvino-fpga?language=en_US

 

Hi All,

I am trying to generate the .xml and .bin files using the Model Optimizer with the command

"python3 /home/datadrive1/intel/computer_vision_sdk_2018.4.420/deployment_tools/model_optimizer/mo_tf.py --input_model 0818_pspnet_1.0_713_resnet_v1/frozen_inference_graph_opt.pb"

 

but it always fails with the log below:

Model Optimizer version: 1.4.292.6ef7232d

[ ERROR ] Cannot infer shapes or values for node "pad_to_bounding_box/Pad".

[ ERROR ] Input tensor shape [ 1 1 256 256 3] and pads values (4, 2) do not match for Pad node pad_to_bounding_box/Pad

[ ERROR ] 

[ ERROR ] It can happen due to bug in custom shape infer function <function Pad.infer at 0x7f0994516f28>.

[ ERROR ] Or because the node inputs have incorrect values/shapes.

[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).

[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.

[ ERROR ] Stopped shape/value propagation at "pad_to_bounding_box/Pad" node. 

 For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #38.


I tried another model but got a similar error log:

cmd: python3 /home/datadrive1/intel/computer_vision_sdk_2018.4.420/deployment_tools/model_optimizer/mo_tf.py --input_model ICNet-model.pb

 

[ ERROR ] Shape is not defined for output 0 of "split".

[ ERROR ] Shape is not defined for output 1 of "split".

[ ERROR ] Shape is not defined for output 2 of "split".

[ ERROR ] Cannot infer shapes or values for node "split".

[ ERROR ] Not all output shapes were inferred or fully defined for node "split". 

 For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #40. 

[ ERROR ]  

[ ERROR ] It can happen due to bug in custom shape infer function <function tf_split_infer at 0x7f0cae157d08>.

[ ERROR ] Or because the node inputs have incorrect values/shapes.

[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).

[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.

[ ERROR ] Stopped shape/value propagation at "split" node.

 
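Both failures are shape-inference errors, and the FAQ entries the log points to (#38 and #40) concern undefined or mismatched input shapes in the frozen graph. One common workaround, sketched below, is to rerun the Model Optimizer with an explicit 4-D input shape and debug logging (both `--input_shape` and `--log_level` appear in the log output above). Note that `[1,713,713,3]` is only a guess inferred from the "713" in the PSPNet model name, not a value confirmed by the thread:

```shell
# Retry conversion with an explicit input shape so shape inference does not
# depend on a dynamic or undefined input placeholder in the frozen graph.
# ASSUMPTION: [1,713,713,3] is guessed from the model name; replace it with
# the network's real NHWC input resolution.
python3 /home/datadrive1/intel/computer_vision_sdk_2018.4.420/deployment_tools/model_optimizer/mo_tf.py \
    --input_model 0818_pspnet_1.0_713_resnet_v1/frozen_inference_graph_opt.pb \
    --input_shape [1,713,713,3] \
    --log_level=DEBUG
```

If the error persists, the DEBUG log should show which node the propagated shape reaches before inference stops, which narrows down whether the problem is the input shape or an unsupported operation.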

I will link this post on the Intel FPGA forum as well.

1 Reply
MMoha9

Did you solve the problem?

 
