Steps I followed:
- Saved the TensorFlow model using the `saved_model` function provided by TF.
- Ran the OpenVINO Model Optimizer for TF with the following command:
python3 mo_tf.py --saved_model_dir $PATH_TO_SAVED_MODEL --output_dir $OUTPUT_PATH --input name_input_layer_1,name_input_layer_2 --input_shape [1,30,180,320,3],[1,30,180,320,3] --model_name model1
- Imported the `.xml` and `.bin` files from `$OUTPUT_PATH` in the code:
ie = IECore()
net = ie.read_network(model='OUTPUT_PATH/model1.xml', weights='OUTPUT_PATH/model1.bin')
exec_net = ie.load_network(network=net, device_name="CPU")
- Predicted a result from the model:
exec_net.infer({ "name_input_layer_1": a_sample, "name_input_layer_2": b_sample })
When the code reaches the infer line, it raises the following error:
ValueError: could not broadcast input array from shape (1,30,180,320,3) into shape (1,3,30,180,320)
I tried giving the input shape when I ran the optimizer, but it did not work. I also tried adding a batch number instead, and that did not work either.
I know TensorFlow works with channels-last by default, but for some reason when I run the prediction OpenVINO still changes the order. Am I missing something? Any help would be appreciated.
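The error message suggests the converted IR expects a channels-first layout (N, C, D, H, W) for these 5D inputs, while the samples are still in TensorFlow's channels-last layout (N, D, H, W, C). A minimal sketch of a workaround is to transpose each sample before calling `infer`; this assumes the inputs are NumPy arrays, and the dummy array below stands in for the real `a_sample`:

```python
import numpy as np

# Stand-in for a_sample in TensorFlow's channels-last layout: (N, D, H, W, C)
a_sample = np.zeros((1, 30, 180, 320, 3), dtype=np.float32)

# Move the channel axis from position 4 to position 1 to get
# the channels-first layout (N, C, D, H, W) the IR reports expecting.
a_sample_nchw = np.transpose(a_sample, (0, 4, 1, 2, 3))

print(a_sample_nchw.shape)  # (1, 3, 30, 180, 320)
```

The transposed arrays can then be passed in the same `infer` call, e.g. `exec_net.infer({"name_input_layer_1": a_sample_nchw, ...})`.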
Hello Leopoldo Vargas,
Greetings to you.
Please share with us the model and the steps to reproduce the issue.
Sincerely,
Zulkifli
Hello Leopoldo Vargas,
Thank you for your question. If you need any additional information from Intel, please submit a new question, as this thread is no longer being monitored.
Sincerely,
Zulkifli