Environment Details
===================
openvino_fpga_2020.4.287 -- Intel cloud environment
onnx==1.7.0
onnxconverter-common==1.7.0
onnxruntime==1.3.0
torch==1.3.1
Details
=======
It's a custom LSTM model developed using PyTorch. The model runs fine on both PyTorch and ONNX Runtime; however, when the model is converted from ONNX to OpenVINO IR, we get a runtime error.
Torch model - runs and gives results
ONNX model - runs and gives results
OpenVINO - error while loading the IR converted from ONNX
Target device: CPU
[ INFO ] Loading network files:
models/torch/timeseries_enode.xml
models/torch/timeseries_enode.bin
Traceback (most recent call last):
File "openvino_model.py", line 96, in <module>
openvino_model(model_xml)
File "openvino_model.py", line 50, in openvino_model
net = ie.read_network(model=model_xml, weights=model_bin)
File "ie_api.pyx", line 261, in openvino.inference_engine.ie_api.IECore.read_network
File "ie_api.pyx", line 293, in openvino.inference_engine.ie_api.IECore.read_network
RuntimeError: Check 'shape_size(get_input_shape(0)) == shape_size(output_shape)' failed at /home/jenkins/agent/workspace/private-ci/ie/build-linux-ubuntu18/b/repos/openvino/ngraph/src/ngraph/op/reshape.cpp:290:
While validating node 'v1::Reshape Reshape_774(Constant_767[0]:f32{1,512,128}, Constant_773[0]:i64{2}) -> (dynamic?)':
Requested output shape Shape{1, 128} is incompatible with input shape Shape{1, 512, 128}
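The check that fails is a simple element-count comparison: a Reshape is only valid when the total number of elements in the input equals that of the requested output, and 1 x 512 x 128 = 65536 elements cannot be reshaped into 1 x 128 = 128 elements. (As a side note, 512 = 4 x 128 is the layout ONNX uses to pack the four LSTM gate weights for a hidden size of 128, so the constant being reshaped is plausibly an LSTM weight tensor; that is an inference from the shapes, not something the log confirms.) A minimal sketch of the failing check, using NumPy shapes that are hypothetical stand-ins for the tensors in the error:

```python
import numpy as np

# Shapes taken from the error message above.
input_shape = (1, 512, 128)   # shape of Constant_767, the Reshape input
output_shape = (1, 128)       # output shape requested by Reshape_774

# OpenVINO's check: shape_size(get_input_shape(0)) == shape_size(output_shape),
# i.e. the element counts must match for the Reshape to be valid.
in_size = int(np.prod(input_shape))    # 1 * 512 * 128 = 65536
out_size = int(np.prod(output_shape))  # 1 * 128 = 128

print(in_size, out_size, in_size == out_size)
# The counts differ, which is exactly why the Reshape node fails validation.
```

This points to a shape mismatch introduced somewhere in the PyTorch -> ONNX -> IR chain (for example, an export traced with a different input shape than the one used at load time), rather than a problem in the inference script itself.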
1 Reply
Hi Pankaj,
I am closing this thread, as it is a duplicate of another thread.
All further correspondence on this issue will be posted in the thread "OpenVino | Pytorch Error LSTM".
Regards,
Munesh
