Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

RuntimeError: Unsupported primitive of type: Unsqueeze name: conv1d_1/conv1d/ExpandDims

Anjam__Khayam
Beginner

Hi, I am seeing the following error when trying to run the Inference Engine:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-19-303a7a27e763> in <module>
     24 out_blob = next(iter(net.outputs))
     25 # Load network to the plugin
---> 26 exec_net = plugin.load(network=net)
     27 del net
     28 

ie_api.pyx in openvino.inference_engine.ie_api.IEPlugin.load()

ie_api.pyx in openvino.inference_engine.ie_api.IEPlugin.load()

RuntimeError: Unsupported primitive of type: Unsqueeze name: conv1d_1/conv1d/ExpandDims

This is how I am running the Inference Engine:

import sys

from PIL import Image
import numpy as np
try:
    from openvino import inference_engine as ie
    from openvino.inference_engine import IENetwork, IEPlugin
except Exception as e:
    exception_type = type(e).__name__
    print("The following error happened while importing Python API module:\n[ {} ] {}".format(exception_type, e))
    sys.exit(1)
    
plugin_dir = None
model_xml = './output/frozen-keras-helloworld.xml'
model_bin = './output/frozen-keras-helloworld.bin'

# Devices: GPU (intel), CPU, MYRIAD
plugin = IEPlugin("CPU", plugin_dirs=plugin_dir)
# Read IR
net = IENetwork.from_ir(model=model_xml, weights=model_bin)
assert len(net.inputs.keys()) == 1
assert len(net.outputs) == 1

input_blob = next(iter(net.inputs))
out_blob = next(iter(net.outputs))
# Load network to the plugin
exec_net = plugin.load(network=net)
del net

res = exec_net.infer(inputs={input_blob: X_test})
# Access the results and get the index of the highest confidence score
output_node_name = list(res.keys())[0]
res = res[output_node_name]

My Keras model is as follows:

model.add(tf.keras.layers.Conv1D(filters=64, kernel_size=2, activation='relu', input_shape=(X.shape[1], X.shape[2])))
model.add(tf.keras.layers.MaxPooling1D(pool_size=2))
model.add(tf.keras.layers.Flatten())
model.add(tf.keras.layers.Dense(50, activation='relu'))
model.add(tf.keras.layers.Dense(1))
model.compile(optimizer='adam', loss='mse')  

What is the issue here?

Shubha_R_Intel
Employee

Dear Anjam Khayam,

I don't see anything like the following in your code:

if CPU_DEVICE_NAME in device_name:
    if args.path_to_extension:
        ie.add_cpu_extension(extension_path=args.path_to_extension, device_name=CPU_DEVICE_NAME)


Please take a look at C:\Program Files (x86)\IntelSWTools\openvino_2019.2.242\inference_engine\samples\python_samples\benchmark_app\benchmark\benchmark.py

Very often when you get an error like:

RuntimeError: Unsupported primitive of type: Unsqueeze name: conv1d_1/conv1d/ExpandDims

in Python OpenVINO code, it's because you're not adding the CPU extensions. All of the Python samples and demos do this via the "-l" switch.
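For example, with the IEPlugin API you are already using, something along these lines should work. This is only a minimal sketch: the extension library path below is an assumption based on a default Linux install of OpenVINO 2019, and the exact file name varies by OS and CPU instruction set (e.g. SSE4 vs. AVX2; on Windows it is a .dll under the inference_engine bin folder).

from openvino.inference_engine import IENetwork, IEPlugin

plugin = IEPlugin("CPU", plugin_dirs=None)
# Register the CPU extensions library BEFORE loading the network, so that
# layers such as Unsqueeze, which the CPU plugin alone does not provide,
# can be resolved. NOTE: example path for a Linux install; adjust to your setup.
plugin.add_cpu_extension("/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64/libcpu_extension_avx2.so")

# Read the IR and load it on the CPU as before
net = IENetwork.from_ir(model='./output/frozen-keras-helloworld.xml',
                        weights='./output/frozen-keras-helloworld.bin')
exec_net = plugin.load(network=net)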

Thanks,

Shubha
