First, I optimized my model for CPU with `mo_tf.py` and `--data_type FP32`. Running inference with the OpenVINO `IEPlugin` on device `"CPU"` works fine. But when I optimized the model for the NCS2 with `--data_type FP16` and ran inference with `IEPlugin` on device `"MYRIAD"`, it gave the following error:
```
exec_net = plugin.load(network=net)
  File "ie_api.pyx", line 395, in openvino.inference_engine.ie_api.IEPlugin.load
  File "ie_api.pyx", line 406, in openvino.inference_engine.ie_api.IEPlugin.load
RuntimeError: AssertionFailed: weights->desc().totalDimSize() >= kernelSizeX * kernelSizeY * (input->desc().dim(Dim::C) / groupSize) * output->desc().dim(Dim::C)
```
Can someone tell whether the issue is with the optimization step or with the inference? I was expecting `--data_type FP16` in the Model Optimizer to take care of preparing the model for the NCS2. Do I have to do something else?
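For reference, the Model Optimizer commands I used were roughly like the following (the model path and input shape are placeholders, not my exact values):

```
# CPU build (works with IEPlugin("CPU")):
python3 mo_tf.py --input_model frozen_model.pb --data_type FP32

# NCS2 build (fails at plugin.load() with IEPlugin("MYRIAD")):
python3 mo_tf.py --input_model frozen_model.pb --data_type FP16
```

The only difference between the two runs is the `--data_type` flag, which is why I suspect either the FP16 conversion or the MYRIAD plugin itself.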