The customer is trying the YOLOv10 model on the Meteor Lake platform (Ubuntu 24.04/Ubuntu 22.04).
The model converts to IR successfully and runs inference on CPU and GPU,
but on NPU it fails with the errors below.
Could you help provide any comment on this?
[ ERROR ] Exception from src/inference/src/cpp/core.cpp:104:
Exception from src/inference/src/dev/plugin.cpp:53:
Exception from src/plugins/intel_npu/src/plugin/src/plugin.cpp:672:
Exception from src/plugins/intel_npu/src/plugin/src/compiled_model.cpp:61:
Check 'result == ZE_RESULT_SUCCESS' failed at src/plugins/intel_npu/src/compiler/src/zero_compiler_in_driver.cpp:803:
Failed to compile network. L0 createGraph result: ZE_RESULT_ERROR_UNKNOWN, code 0x7ffffffe. [NOT IMPLEMENTED] Unsupported operation __module.model.9.m/aten::max_pool2d/MaxPool with type MaxPool. Try to update the driver to the latest version. If the error persists, please submit a bug report in https://github.com/openvinotoolkit/openvino/issues
Failed to create executable
Traceback (most recent call last):
File "/home/synnex-fae/openvino_env/lib/python3.12/site-packages/openvino/tools/benchmark/main.py", line 408, in main
compiled_model = benchmark.core.compile_model(model, benchmark.device, device_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/synnex-fae/openvino_env/lib/python3.12/site-packages/openvino/runtime/ie_api.py", line 543, in compile_model
super().compile_model(model, device_name, {} if config is None else config),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Exception from src/inference/src/cpp/core.cpp:104:
Exception from src/inference/src/dev/plugin.cpp:53:
Exception from src/plugins/intel_npu/src/plugin/src/plugin.cpp:672:
Exception from src/plugins/intel_npu/src/plugin/src/compiled_model.cpp:61:
Check 'result == ZE_RESULT_SUCCESS' failed at src/plugins/intel_npu/src/compiler/src/zero_compiler_in_driver.cpp:803:
Failed to compile network. L0 createGraph result: ZE_RESULT_ERROR_UNKNOWN, code 0x7ffffffe. [NOT IMPLEMENTED] Unsupported operation __module.model.9.m/aten::max_pool2d/MaxPool with type MaxPool. Try to update the driver to the latest version. If the error persists, please submit a bug report in https://github.com/openvinotoolkit/openvino/issues
Failed to create executable
It seems that OpenVINO 2024.2.0 (and 2024.1.0) cannot compile the YOLOv10 model for NPU.
I also found a similar issue on GitHub: [Bug]: Mish activation function crashes openvino model benchmark on NPU · Issue #24461 · openvinotoolkit/openvino.
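For reference, the failure can be reproduced outside benchmark_app with a short script. This is a hedged sketch: `yolov10n.xml` is a placeholder IR path (not a file from this thread), and the behavior depends on the installed driver and compiler versions.

```python
# Hedged repro sketch for the NPU compile failure described above.
# "yolov10n.xml" is a placeholder path to the converted IR, not a file
# from this thread.
try:
    import openvino as ov
except ImportError:
    ov = None
    print("openvino is not installed in this environment")

if ov is not None:
    core = ov.Core()
    print("Available devices:", core.available_devices)
    try:
        model = core.read_model("yolov10n.xml")  # placeholder IR path
        core.compile_model(model, "NPU")
        print("NPU compile OK")
    except Exception as err:
        # On affected driver/compiler versions this surfaces the
        # "[NOT IMPLEMENTED] Unsupported operation ... MaxPool" error.
        print("NPU compile failed:", err)
```

If the NPU compiler rejects individual layers, compiling with the device string `HETERO:NPU,CPU` is one possible (untested here) way to let unsupported operations fall back to the CPU.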
Hi Tchio,
Thanks for reaching out. What Linux NPU driver version are you using? Can you try the latest NPU driver version?
You may refer to the Linux NPU Driver page for the installation instructions.
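One way to check the installed driver version from a terminal is to list the NPU driver packages. This is a hedged sketch that assumes a Debian-based system with the packages from Intel's linux-npu-driver releases installed; package names can vary between releases.

```python
import shutil
import subprocess

# Hedged check for installed Linux NPU driver packages (assumes a
# Debian-based system; the relevant package names are an assumption
# and can vary between driver releases).
if shutil.which("dpkg"):
    listing = subprocess.run(["dpkg", "-l"], capture_output=True, text=True).stdout
    npu_lines = [line for line in listing.splitlines() if "npu" in line.lower()]
    print("\n".join(npu_lines) if npu_lines else "no NPU driver packages found")
else:
    print("dpkg is not available on this system")
```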
Regards,
Aznie
Hi Aznie,
Thanks for your help. We have run YOLOv8 with no problem. Please help check the YOLOv10 issue.
We also have one more question: how can we check the NPU load, like the Windows Task Manager performance view shown in the photo below?
Hi Tchio,
Could you share the YOLOv10 model files and the source of your model for us to validate on our end? The NPU plugin will not be visible in Task Manager. Can you check whether the NPU is detected in Device Manager?
Regards,
Aznie
Hi Aznie,
Sorry, an update: YOLOv8 has the same issue, but let's focus on YOLOv10 first.
Please download the model from the URL below:
URL: https://drive.google.com/file/d/1Gx14REzGLXZkWAdt193suC-ITB2vBaGn/view?usp=sharing
Yes, the NPU is detected in Device Manager:
benchmark_app -h | grep Available
> Available target devices: CPU GPU NPU
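Since there is no Task Manager NPU view on Linux, a basic check is to look for the accel device node created by the intel_vpu kernel driver. This is a hedged sketch: the device node name and the sysfs busy-time counter path are assumptions that may differ per system and driver version.

```python
import glob

# Basic NPU presence check on Linux: the intel_vpu kernel driver creates
# an accel device node (the exact node name is an assumption and may
# differ per system).
print("accel nodes:", glob.glob("/dev/accel/accel*"))

# Some driver versions expose a cumulative busy-time counter in sysfs,
# which can be sampled over an interval to estimate NPU load (the path
# pattern is an assumption, not confirmed in this thread):
for counter in glob.glob("/sys/devices/pci*/*/npu_busy_time_us"):
    with open(counter) as f:
        print(counter, "=", f.read().strip(), "us")
```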
Hi Tchio,
For your information, we have submitted a feature request to our developers to enable YOLOv10 on the NPU.
In the meantime, we noticed that you have also filed the same issue as an IPS case. Hence, we would like to close this case here and continue the conversation through IPS.
Regards,
Peh
