Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

[OpenVINO][HDDLPlugin] Running a sample application on VPU: open /dev/ion failed

XI_Q_Intel
Employee

Hi.

I'm running the SqueezeNet classification sample on the VPU (HDDL). The output reports successful execution, but I also get a complaint that opening /dev/ion failed, pasted below. What is the role of this memory allocation step, and how critical is the failure?

Thank you very much!

desk2:~/inference_engine_samples_build/intel64/Release$ ./classification_sample -i /opt/intel/openvino/deployment_tools/demo/car.png -m ~/squeezenet1.1_FP16/squeezenet1.1.xml -d HDDL
[ INFO ] InferenceEngine: 
	API version ............ 1.6
	Build .................. custom_releases/2019/R1.1_28dfbfdd28954c4dfd2f94403dd8dfc1f411038b
[ INFO ] Parsing input parameters
[ INFO ] Files were added: 1
[ INFO ]     /opt/intel/openvino/deployment_tools/demo/car.png
[ INFO ] Loading plugin

	API version ............ 1.6
	Build .................. 23780
	Description ....... HDDLPlugin
[ INFO ] Loading network files:
	/home/xi/squeezenet1.1_FP16/squeezenet1.1.xml
	/home/xi/squeezenet1.1_FP16/squeezenet1.1.bin
[ INFO ] Preparing input blobs
[ WARNING ] Image is resized from (787, 259) to (227, 227)
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the plugin
[16:23:58.6413][13610]I[ServiceStarter.cpp:40] Info: Waiting for HDDL Service getting ready ...
[16:23:58.6414][13610]I[ServiceStarter.cpp:45] Info: Found HDDL Service is running.
[HDDLPlugin] [16:23:58.6414][13610]I[ConfigParser.cpp:176] Config file '/opt/intel/openvino_2019.1.144/deployment_tools/inference_engine/external/hddl/config/hddl_api.config' has been loaded
Hddl api version:2.2
[HDDLPlugin] [16:23:58.6415][13610]I[HddlClient.cpp:259] Info: Create Dispatcher2.
[HDDLPlugin] [16:23:58.6421][13615]I[Dispatcher2.cpp:148] Info: SenderRoutine starts.
[HDDLPlugin] [16:23:58.6421][13610]I[HddlClient.cpp:270] Info: RegisterClient HDDLPlugin.
[16:23:58.6424][13500]I[ClientManager.cpp:159] client(id:2) registered: clientName=HDDLPlugin socket=2
Client Id:2
[16:23:59.0975][13501]I[GraphManager.cpp:486] Load graph success, graphId=2 graphName=squeezenet1.1
[HDDLPlugin] [16:23:59.0989][13610]I[HddlBlob.cpp:165] Info: HddlBlob initialize ion ...
open /dev/ion failed!
[HDDLPlugin] [16:23:59.0990][13610]W[HddlBlob.cpp:169] Warn: ion_open failed (errno=2). All buffers will use ShareMemory.
[HDDLPlugin] [16:23:59.0992][13610]I[HddlBlob.cpp:165] Info: HddlBlob initialize ion ...
open /dev/ion failed!
[HDDLPlugin] [16:23:59.0992][13610]W[HddlBlob.cpp:169] Warn: ion_open failed (errno=2). All buffers will use ShareMemory.
[ INFO ] Starting inference (1 iterations)
[ INFO ] Processing output blobs

Top 10 results:

Image /opt/intel/openvino/deployment_tools/demo/car.png

classid probability label
------- ----------- -----
817     0.8295898   sports car, sport car
511     0.0961304   convertible
479     0.0439453   car wheel
751     0.0101318   racer, race car, racing car
436     0.0074234   beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon
656     0.0042267   minivan
586     0.0029869   half track
717     0.0018148   pickup, pickup truck
864     0.0013924   tow truck, tow car, wrecker
581     0.0006595   grille, radiator grille


total inference time: 10.3774788
Average running time of one iteration: 10.3774788 ms

Throughput: 96.3625193 FPS

[16:23:59.1121][13500]I[ClientManager.cpp:189] client(id:2) unregistered: clientName=HDDLPlugin socket=2
[HDDLPlugin] [16:23:59.1126][13616]I[Dispatcher2.cpp:235] Info: Other side of pipe is closed, shut down socket.
[HDDLPlugin] [16:23:59.1129][13610]I[Dispatcher2.cpp:81] Info: Client dispatcher exit.
[HDDLPlugin] [16:23:59.1140][13610]I[HddlClient.cpp:203] Info: Hddl client unregistered.
[ INFO ] Execution successful
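For reference, errno=2 in that warning is ENOENT, i.e. /dev/ion does not exist, which usually means the ION kernel module installed with the VPU (HDDL) driver package is not loaded; the plugin then falls back to plain shared memory, exactly as the log states. A minimal sketch of how to check this on the host (the module name is deliberately not hard-coded, since it comes from the driver install and may differ per release):

# Does the ION device node exist at all?
ls -l /dev/ion

# Is an ion-related kernel module loaded? (exact module name is an
# assumption; it is provided by the VPU driver package)
lsmod | grep -i ion

# Is the HDDL service running? (the log above says it found one)
ps -ef | grep hddldaemon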

 

Shubha_R_Intel
Employee

Dear QIN, XI,

You are using a very old version of OpenVINO. OpenVINO 2019 R2 was just released; please download and try it, as many issues have been fixed.
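If it helps, a quick way to confirm which release the environment actually points at after installing the new package (a sketch assuming the default /opt/intel install layout visible in your log):

# List installed releases; /opt/intel/openvino is normally a symlink
# to the most recently installed one
ls -ld /opt/intel/openvino*

# Re-source the environment so the samples build and run against the new release
source /opt/intel/openvino/bin/setupvars.sh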

Thanks,

Shubha

XI_Q_Intel
Employee

Hi, Shubha.

Thank you for your suggestion! I've installed the latest 2019 R2 and reinstalled the drivers and IVAD_VPU dependencies as well. The ion initialization and the inference now work well. However, there are "Cannot find key" warnings:

[16:57:02.3220][4280]W[ConfigParser.cpp:269] Warning: Cannot find key, path=scheduler_config.max_graph_per_device subclass=0, use default value: 1.
[16:57:02.3220][4280]W[ConfigParser.cpp:292] Warning: Cannot find key, path=scheduler_config.use_sgad_by_default subclass=0, use default value: false.

What caused this? Thank you so much!
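The warnings themselves indicate that the parser simply falls back to defaults (1 and false) when those keys are absent. A hedged way to check whether the keys exist anywhere in the HDDL configuration files (the directory is taken from the hddl_api.config path in the first log; assuming the service configuration lives in the same place):

# Config directory as shown in the earlier hddl_api.config log line
HDDL_CONF_DIR=/opt/intel/openvino/deployment_tools/inference_engine/external/hddl/config

# Look for the scheduler keys the warnings refer to
grep -n -E "scheduler_config|max_graph_per_device|use_sgad_by_default" "$HDDL_CONF_DIR"/*.config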

Shubha_R_Intel
Employee

Dear QIN, XI,

Some VPU configuration parameters are mentioned in the VPU Plugins documentation, but max_graph_per_device and use_sgad_by_default are not among them. Let me ask about this and get back to you on this forum.

Thanks for your patience,

Shubha
