
Failed to run inference with NCS2 for custom model

Singh__Harish

I am using a TensorFlow sequential convolutional neural network to test the NCS2 on a binary classification task. Below is the information about my network:

    network inputs  = {'conv2d_1_input_3': <openvino.inference_engine.ie_api.InputInfo object at 0x7fcf54c55918>}
                      [<tf.Tensor 'conv2d_1_input_3:0' dtype=float32>]
    network outputs = {'activation_6_3/Sigmoid': <openvino.inference_engine.ie_api.OutputInfo object at 0x7fcf54a53468>}
                      [<tf.Tensor 'activation_6_3/Sigmoid:0' dtype=float32>]
    network input shape  = [1, 3, 26, 34]
    network output shape = [1, 1]
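
(For context, the shapes above were printed from the IR with roughly the snippet below; the model and weights paths are placeholders for my actual files.)

    from openvino.inference_engine import IENetwork

    # Load the IR produced by the Model Optimizer (paths are placeholders)
    net = IENetwork(model="model.xml", weights="model.bin")

    print("network inputs  =", net.inputs)
    print("network outputs =", net.outputs)

    input_blob = next(iter(net.inputs))
    output_blob = next(iter(net.outputs))
    print("network input shape  =", net.inputs[input_blob].shape)
    print("network output shape =", net.outputs[output_blob].shape)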

First, I optimized my model for the CPU with "mo_tf.py" and "--data_type FP32"; running inference through the OpenVINO IEPlugin with "CPU" works fine. But when I optimized the model for the NCS2 with "--data_type FP16" and ran inference through the IEPlugin with "MYRIAD", I got the following error:

    exec_net = plugin.load(network=net)
  File "ie_api.pyx", line 395, in openvino.inference_engine.ie_api.IEPlugin.load
  File "ie_api.pyx", line 406, in openvino.inference_engine.ie_api.IEPlugin.load
RuntimeError: AssertionFailed: weights->desc().totalDimSize() >= kernelSizeX * kernelSizeY * (input->desc().dim(Dim::C) / groupSize) * output->desc().dim(Dim::C)
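
For reference, the conversion and load steps look roughly like this (a minimal sketch; the frozen-graph name, IR paths, and input shape are placeholders for my actual model):

    # Model Optimizer invocation, run from the command line:
    #   python3 mo_tf.py --input_model frozen_model.pb --data_type FP16 \
    #       --input_shape [1,26,34,3]

    from openvino.inference_engine import IENetwork, IEPlugin

    net = IENetwork(model="model.xml", weights="model.bin")
    plugin = IEPlugin(device="MYRIAD")

    # This is the call that raises the RuntimeError above
    exec_net = plugin.load(network=net)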

Can someone tell me whether the issue is with the optimization or with the inference? I was expecting "--data_type FP16" in the Model Optimizer to take care of the optimization for the NCS2, or do I have to do something else?
