Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Can't run inference on my model with the Python API

Rizhiy__Andriy
Beginner

macOS 10.14.5

I successfully optimized my model with OpenVINO and created the .xml and .bin files. But I can't run inference with this Python code:

import os
from openvino.inference_engine import IENetwork, IEPlugin

# Paths to the IR files produced by the Model Optimizer
model_xml = args.frozen_model_filename
model_bin = os.path.splitext(model_xml)[0] + ".bin"

# Create a plugin for the target device (e.g. CPU)
plugin = IEPlugin(device=args.device)
if args.cpu_extension and 'CPU' in args.device:
    plugin.add_cpu_extension(args.cpu_extension)
if args.performance:
    plugin.set_config({"PERF_COUNT": "YES"})

# Read the IR into an IENetwork
net = IENetwork(model=model_xml, weights=model_bin)
input_blob = next(iter(net.inputs))

# Load the network onto the device
exec_net = plugin.load(network=net)

I get a bus error when running plugin.load. What am I doing wrong?

Kenneth_C_Intel
Employee

It looks like there may be a problem with your inputs.

Print the variables you are trying to load to the console to make sure they are being parsed correctly.

I would also test with the paths hardcoded to see whether it is a user error or an issue with the Inference Engine, as in the sketch below.
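Something like this, for example (a minimal sketch using the same IEPlugin/IENetwork calls as your snippet; the hardcoded paths are placeholders for your own IR files):

import os
from openvino.inference_engine import IENetwork, IEPlugin

# Confirm the parsed paths actually point at existing files
print("XML:", model_xml, os.path.isfile(model_xml))
print("BIN:", model_bin, os.path.isfile(model_bin))

# Then try hardcoded paths to rule out argument-parsing problems
model_xml = "/path/to/your/model.xml"  # placeholder path
model_bin = "/path/to/your/model.bin"  # placeholder path

plugin = IEPlugin(device="CPU")
net = IENetwork(model=model_xml, weights=model_bin)
exec_net = plugin.load(network=net)

If the hardcoded version loads fine, the problem is in how the arguments are parsed; if it still crashes, that points at the Inference Engine or the IR files themselves.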

Also, what version of Python are you using?

Regards,

Kenneth
