Hi,
I am using OpenVINO v2019_R1.1 on an IEI Tank with a Mustang-V100-MX8 card, which has 8 Myriad X VPUs. I am using the Python API for the Inference Engine.
As shown here and here, I try to use the "KEY_VPU_HDDL_GRAPH_TAG" config parameter while initializing my IE plugin, using the set_config() method in Python (3.5) as shown below:
plugin = IEPlugin(device=fd_device, plugin_dirs=intel_plugin_dir)
plugin.set_config({"KEY_VPU_HDDL_GRAPH_TAG": "FD_tag"})
plugin.set_config({"KEY_PERF_COUNT": 'YES'})
exec_net = plugin.load(network=net)
I also changed my hddl_service.config file as follows:
"graph_tag_map":{ "FD_tag" : 1 },
Although the HDDL service does reserve a device for the tag scheduler, my Python application throws a RuntimeError saying that the tags are not supported for VPU:
Looking for dlls in
[22:59:34.3252][20513]I[ServiceStarter.cpp:40] Info: Waiting for HDDL Service getting ready ...
[22:59:34.3254][20513]I[ServiceStarter.cpp:45] Info: Found HDDL Service is running.
[HDDLPlugin] [22:59:34.3257][20513]I[ConfigParser.cpp:176] Config file '/opt/intel/openvino_2019.1.133/deployment_tools/inference_engine/external/hddl/config/hddl_api.config' has been loaded
Hddl api version:2.2
[HDDLPlugin] [22:59:34.3257][20513]I[HddlClient.cpp:259] Info: Create Dispatcher2.
[HDDLPlugin] [22:59:34.3259][20554]I[Dispatcher2.cpp:148] Info: SenderRoutine starts.
[HDDLPlugin] [22:59:34.3260][20513]I[HddlClient.cpp:270] Info: RegisterClient HDDLPlugin. Client Id:1
RuntimeError: [NOT_FOUND] KEY_VPU_HDDL_GRAPH_TAG key is not supported for VPU
Exception ignored in: 'openvino.inference_engine.ie_api.IEPlugin.set_config'
RuntimeError: [NOT_FOUND] KEY_VPU_HDDL_GRAPH_TAG key is not supported for VPU
RuntimeError: [NOT_FOUND] KEY_PERF_COUNT key is not supported for VPU
Exception ignored in: 'openvino.inference_engine.ie_api.IEPlugin.set_config'
RuntimeError: [NOT_FOUND] KEY_PERF_COUNT key is not supported for VPU
[22:59:34.3311][20510]I[ServiceStarter.cpp:40] Info: Waiting for HDDL Service getting ready ...
[22:59:34.3312][20510]I[ServiceStarter.cpp:45] Info: Found HDDL Service is running.
[22:59:34.3312][20512]I[ServiceStarter.cpp:40] Info: Waiting for HDDL Service getting ready ...
[22:59:34.3312][20512]I[ServiceStarter.cpp:45] Info: Found HDDL Service is running.
[HDDLPlugin] [22:59:34.3313][20510]I[ConfigParser.cpp:176] Config file '/opt/intel/openvino_2019.1.133/deployment_tools/inference_engine/external/hddl/config/hddl_api.config' has been loaded
[HDDLPlugin] [22:59:34.3313][20512]I[ConfigParser.cpp:176] Config file '/opt/intel/openvino_2019.1.133/deployment_tools/inference_engine/external/hddl/config/hddl_api.config' has been loaded
Hddl api version:2.2
But this page says that HDDL supports this particular configuration parameter. Please provide assistance with this issue. Does the Python API support the tag used, or do I need to configure the plugin differently?
Thanks
Also, the device scheduler in v2019_R1.1 seems to scale the device requirements incorrectly, assigning the lightest network to the largest number of devices rather than scheduling according to load. The 2018 R5 version does not have this problem.
Attached below is a screenshot of the HDDL service log showing the problem. As you can see, the graph with the smallest memory requirement and inference time is assigned 4 devices while the heavier graphs are starved.
Thanks.
Hi,
You have probably already fixed this, but for reference, I had the same problem and used
ie.set_config({"VPU_HDDL_GRAPH_TAG": "tagDetect"}, "HDDL")
Note the key is VPU_HDDL_GRAPH_TAG rather than KEY_VPU_HDDL_GRAPH_TAG: the Python API expects the key string without the KEY_ prefix.
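For reference, here is a minimal end-to-end sketch using the IECore API, assuming OpenVINO 2019 R1.x; the model paths and tag name are placeholders, and the tag must match an entry in graph_tag_map in hddl_service.config:

from openvino.inference_engine import IECore, IENetwork

ie = IECore()
# Device-level config for the HDDL plugin; keys are passed without the "KEY_" prefix
ie.set_config({"VPU_HDDL_GRAPH_TAG": "FD_tag"}, "HDDL")
ie.set_config({"PERF_COUNT": "YES"}, "HDDL")

net = IENetwork(model="model.xml", weights="model.bin")  # placeholder IR paths
exec_net = ie.load_network(network=net, device_name="HDDL")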
