Hi, I am trying to convert a UNet TensorFlow model. However, when I try to load it for inference, I get this error:
exec_net = ie.load_network(network=net, device_name="CPU")
File "ie_api.pyx", line 178, in openvino.inference_engine.ie_api.IECore.load_network
File "ie_api.pyx", line 187, in openvino.inference_engine.ie_api.IECore.load_network
RuntimeError: Interpolate operation should be converted to Interp
I converted the model using:
python "%openvino_dir%\deployment_tools\model_optimizer\mo_tf.py" --input_model ".\temp\inference_graph.pb" --log_level=ERROR --input_shape "(1,512,512,3)"
This is the code I am using to infer:
from openvino.inference_engine import IECore

model_xml = './inference_graph.xml'
model_bin = './inference_graph.bin'
ie = IECore()
net = ie.read_network(model=model_xml, weights=model_bin)
input_blob = next(iter(net.inputs))
out_blob = next(iter(net.outputs))
exec_net = ie.load_network(network=net, device_name="CPU")
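As a side note, IR models produced by mo_tf.py typically expect NCHW input layout, while TensorFlow models (and images read with OpenCV) are HWC, so the input usually needs a transpose before calling infer. A minimal sketch, assuming a dummy 512x512 RGB input; the exec_net/input_blob names come from the snippet above:

```python
import numpy as np

# Dummy HWC image standing in for a real 512x512 RGB frame
# (in practice this would come from cv2.imread + cv2.resize).
image_hwc = np.zeros((512, 512, 3), dtype=np.float32)

# Transpose HWC -> CHW, then add the batch dimension to get NCHW.
image_chw = image_hwc.transpose((2, 0, 1))
batch = np.expand_dims(image_chw, axis=0)
assert batch.shape == (1, 3, 512, 512)

# With the network loaded as in the snippet above, inference would be:
# result = exec_net.infer(inputs={input_blob: batch})
```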
3 Replies
Greetings,
If it's possible, could you provide the full error that you are getting?
Sincerely,
Iffa
I think the issue is that OpenVINO does not support the Upsampling layer in TensorFlow models.
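For context, the upsampling layer in question (e.g. Keras UpSampling2D in nearest mode) simply repeats each pixel along the spatial axes, and Model Optimizer maps it to an Interpolate-type operation, which is what the error message complains about. A rough NumPy sketch of what the op computes (the function name is my own, just for illustration):

```python
import numpy as np

def upsample_nearest(x, factor=2):
    """Nearest-neighbor upsampling of an HWC tensor, as Keras
    UpSampling2D (interpolation='nearest') computes it:
    repeat each row and each column `factor` times."""
    x = np.repeat(x, factor, axis=0)  # repeat rows
    x = np.repeat(x, factor, axis=1)  # repeat columns
    return x

a = np.array([[[1.0], [2.0]],
              [[3.0], [4.0]]])       # 2x2x1 feature map
b = upsample_nearest(a)
assert b.shape == (4, 4, 1)
assert b[0, 0, 0] == 1.0 and b[3, 3, 0] == 4.0
```

A workaround people sometimes use when an Interpolate variant is unsupported by a plugin is to rebuild the decoder with Conv2DTranspose layers instead of UpSampling2D and re-convert the model, though whether that applies here depends on the OpenVINO version.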
Yes, that is possible. You may refer here for the operations that OpenVINO supports:
Sincerely,
Iffa
