dopeuser
Novice

UNet Error during inference. Interpolate operation should be converted to Interp

Hi, I am trying to convert a UNet TensorFlow model. However, when I try to load it for inference, I get this error:

exec_net = ie.load_network(network=net, device_name="CPU")
File "ie_api.pyx", line 178, in openvino.inference_engine.ie_api.IECore.load_network
File "ie_api.pyx", line 187, in openvino.inference_engine.ie_api.IECore.load_network
RuntimeError: Interpolate operation should be converted to Interp

 

I converted the model using:

python "%openvino_dir%\deployment_tools\model_optimizer\mo_tf.py" --input_model ".\temp\inference_graph.pb" --log_level=ERROR --input_shape "(1,512,512,3)"
This is the code I am using to infer:
from openvino.inference_engine import IECore

model_xml = './inference_graph.xml'
model_bin = './inference_graph.bin'

ie = IECore()
net = ie.read_network(model=model_xml, weights=model_bin)
input_blob = next(iter(net.inputs))      # name of the input blob
out_blob = next(iter(net.outputs))       # name of the output blob
exec_net = ie.load_network(network=net, device_name="CPU")   # fails here
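
Once load_network succeeds, I plan to run inference roughly like this (dummy stands in for a preprocessed NCHW image):

import numpy as np

n, c, h, w = net.inputs[input_blob].shape           # IR input is NCHW, e.g. (1, 3, 512, 512)
dummy = np.zeros((n, c, h, w), dtype=np.float32)    # placeholder for a real image
result = exec_net.infer(inputs={input_blob: dummy})
mask = result[out_blob]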
3 Replies
Iffa_Intel
Moderator

Greetings,


If it's possible, could you provide the full error that you are getting?



Sincerely,

Iffa


dopeuser
Novice

I think the issue is due to OpenVINO not supporting the Upsampling layer in TensorFlow models.
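
If that is the case, a workaround I am considering is to rebuild the decoder with Conv2DTranspose instead of UpSampling2D, since a transposed convolution should be converted to a plain Deconvolution layer. A rough sketch (up_block is illustrative, not my exact code), which would of course require retraining:

from tensorflow.keras import layers

def up_block(x, filters):
    # replaces: x = layers.UpSampling2D(size=(2, 2))(x)
    x = layers.Conv2DTranspose(filters, kernel_size=2, strides=2, padding='same')(x)
    return layers.ReLU()(x)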

Iffa_Intel
Moderator

Yes, that is possible. You may refer here for the layers that OpenVINO supports:

https://docs.openvinotoolkit.org/latest/openvino_docs_MO_DG_prepare_model_convert_model_Convert_Mode...
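
As a quick check, you could also query the CPU plugin to see exactly which layers it cannot handle, roughly like this (reusing the ie and net objects from your script; this assumes your OpenVINO version still exposes net.layers):

supported = ie.query_network(network=net, device_name="CPU")
unsupported = [layer for layer in net.layers if layer not in supported]
print("Unsupported layers:", unsupported)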


Sincerely,

Iffa

