Application Acceleration With FPGAs
Programmable Acceleration Cards (PACs), DCP, FPGA AI Suite, Software Stack, and Reference Designs

Model Optimizer with OpenVINO + FPGA

SDumb
Beginner

Hi All,

I am trying to generate the .xml and .bin files with Model Optimizer, using the command

"python3 /home/datadrive1/intel/computer_vision_sdk_2018.4.420/deployment_tools/model_optimizer/mo_tf.py --input_model 0818_pspnet_1.0_713_resnet_v1/frozen_inference_graph_opt.pb"

 

but it always fails with the log below:

Model Optimizer version: 1.4.292.6ef7232d
[ ERROR ] Cannot infer shapes or values for node "pad_to_bounding_box/Pad".
[ ERROR ] Input tensor shape [ 1 1 256 256 3] and pads values (4, 2) do not match for Pad node pad_to_bounding_box/Pad
[ ERROR ]
[ ERROR ] It can happen due to bug in custom shape infer function <function Pad.infer at 0x7f0994516f28>.
[ ERROR ] Or because the node inputs have incorrect values/shapes.
[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ] Stopped shape/value propagation at "pad_to_bounding_box/Pad" node.
 For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #38.
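From FAQ question #38 I understand the usual workaround is to pass --input_shape explicitly. The value below is only my guess (the model file name mentions 713, so I assumed a 1x713x713x3 NHWC input), and I am not sure it is the right shape:

"python3 /home/datadrive1/intel/computer_vision_sdk_2018.4.420/deployment_tools/model_optimizer/mo_tf.py --input_model 0818_pspnet_1.0_713_resnet_v1/frozen_inference_graph_opt.pb --input_shape [1,713,713,3]"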

 

 

I also tried some other models, but got the same kind of error log:

cmd: python3 /home/datadrive1/intel/computer_vision_sdk_2018.4.420/deployment_tools/model_optimizer/mo_tf.py --input_model ICNet-model.pb

 

[ ERROR ] Shape is not defined for output 0 of "split".
[ ERROR ] Shape is not defined for output 1 of "split".
[ ERROR ] Shape is not defined for output 2 of "split".
[ ERROR ] Cannot infer shapes or values for node "split".
[ ERROR ] Not all output shapes were inferred or fully defined for node "split".
 For more information please refer to Model Optimizer FAQ (<INSTALL_DIR>/deployment_tools/documentation/docs/MO_FAQ.html), question #40.
[ ERROR ]
[ ERROR ] It can happen due to bug in custom shape infer function <function tf_split_infer at 0x7f0cae157d08>.
[ ERROR ] Or because the node inputs have incorrect values/shapes.
[ ERROR ] Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ ERROR ] Run Model Optimizer with --log_level=DEBUG for more information.
[ ERROR ] Stopped shape/value propagation at "split" node.

 

Please help me understand how to decide on and pass the --input_shape value to Model Optimizer for a given model.
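For context, the only approach I can think of is to inspect the Placeholder nodes in the frozen graph, read the shapes recorded there, and then supply any undefined dimensions via --input_shape. A minimal sketch of that (assumes TensorFlow 1.x and that the graph has a plain Placeholder image input; the file name is just the PSPNet graph from above):

import tensorflow as tf

# Load the frozen graph definition.
graph_def = tf.GraphDef()
with tf.gfile.GFile("0818_pspnet_1.0_713_resnet_v1/frozen_inference_graph_opt.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# Print the name and recorded shape of every Placeholder (graph input).
for node in graph_def.node:
    if node.op == "Placeholder":
        dims = [d.size for d in node.attr["shape"].shape.dim]
        print(node.name, dims)  # -1 means the dimension is undefined in the graph

If any dimension prints as -1 (undefined), my understanding is that it has to be fixed explicitly via --input_shape, but I am not sure which values are correct for these models.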

Ref (model download link): https://drive.google.com/uc?id=13E8KvN43QZKi4uf42ECdaReT_FYvdY-m&export=download

 

Thanks & Regards,

Sudhir

Shawn_S_Intel
Employee

Hi Sudhir,

 

Since this question is specific to Model Optimizer and largely independent of the FPGA, you should post it to the OpenVINO public forum. You will find OpenVINO experts there who can address your question.

https://software.intel.com/en-us/forums/computer-vision

 

 

MuhammadAr_U_Intel

Hi Sudhir,

 

I have created a post on the OpenVINO forum for your question. Please follow up there:

 

https://software.intel.com/en-us/forums/computer-vision/topic/802213

 

Thanks,

Arslan
