
NPU not working on my Windows 11 device - Ultra 7 155H

iso-uran
Novice

Windows 11 is installed on my computer with an Ultra 7 155H processor. Although the 24H2 update is installed and my device is on the latest version, the NPU does not work. Even when I watch Task Manager for a long time, it does not go above 0%. Can I get support on this issue?

[Attached screenshot: Task Manager showing the NPU at 0% utilization]

 

PC1997
New Contributor I

I'm glad you finally got it to work! I have a new motherboard and an Intel Arrow Lake CPU (which has an NPU) that I haven't installed yet, so I was not able to test the NPU functionality of Geekbench AI on my computer. Sorry for the confusion.

iso-uran
Novice

However, it does not work in processes involving AI applications other than this one. How can I solve this?

PC1997
New Contributor I

Unfortunately, unless applications support it, there is nothing you can do.

Copilot requires an NPU capable of 40 TOPS. From the developers' point of view, when a much more powerful GPU is available there is no reason to write their code for an NPU. For laptops, and especially phones, where battery usage is a concern, however, an NPU is much more power efficient, and it has valid uses for certain AI workloads. Its use is much more common on newer phones that support such technologies.

DhannielM_Intel
Moderator

Hello iso-uran,


It appears that the NPU is functioning correctly, but it is only utilized when OpenVINO is used. I would like to thank @PC1997 for sharing your knowledge with @iso-uran in this thread. I will investigate this matter internally to conduct a more thorough analysis.
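
For reference, here is a minimal Python sketch (assuming the openvino package and the Intel NPU driver are installed; the model file name is only a placeholder) that checks whether OpenVINO can see the NPU and compiles a model to it:

import openvino as ov

core = ov.Core()
# An Ultra 7 155H with a working NPU driver should list "NPU" among the available devices
print(core.available_devices)

# Read a model (OpenVINO IR or ONNX; the file name here is a placeholder) and compile it for the NPU
model = core.read_model("model.xml")
compiled_model = core.compile_model(model, device_name="NPU")
print(compiled_model)

Running inference through a model compiled this way is what makes the NPU graph in Task Manager rise above 0%.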


Best regards,


Dhanniel M.

Intel Customer Support Technician


DhannielM_Intel
Moderator

Hello iso-uran,


Thank you for your patience. Based on my investigation, I concur with the remarks made by @PC1997. The utilization of the Neural Processing Unit (NPU) can vary depending on how an application is programmed to leverage either the NPU or the GPU. Since you are currently working on a language model, I recommend consulting the community or support resources of the AI framework you are using. They can provide guidance on how to effectively utilize the NPU within your framework.
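
As one possible illustration only (not an official recommendation), here is a hedged sketch using the openvino-genai package, assuming the language model has already been exported to OpenVINO IR format (for example with optimum-cli); the model directory name is a placeholder:

import openvino_genai as ov_genai

# Point the pipeline at a directory containing an OpenVINO-exported LLM (placeholder path)
pipe = ov_genai.LLMPipeline("./llm_openvino_model", "NPU")  # "CPU" or "GPU" can also be used as the device
print(pipe.generate("What does an NPU do?", max_new_tokens=64))

Whether the NPU is actually used still depends on the framework's and driver's support for that particular model.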

If you have any further questions, please feel free to create a new thread, as this inquiry will now be closed and no longer monitored.


Best regards,


Dhanniel M.

Intel Customer Support Technician


iso-uran
Novice

Thank you for your support. As far as I understand, this is caused by the applications' settings. Do I understand correctly? Are these settings purely on the software side, or are they something I can change myself? How can I change the settings so that the NPU is used actively?

PC1997
New Contributor I

Currently, Microsoft's Copilot+ features require the Arm-based Snapdragon X Elite chip. It is one of the CPU+NPU processors that meets the minimum requirement of 40 TOPS.

Support for other NPUs from Intel and AMD that meet the minimum processing performance will be added in the future.

Here is an excellent resource on how to program applications to use the Neural Processing Unit on Copilot+ PCs:

https://learn.microsoft.com/en-us/windows/ai/npu-devices/
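
As a rough sketch of the kind of code that guide covers, here is one hedged example using ONNX Runtime with the OpenVINO execution provider. It assumes the onnxruntime-openvino build is installed; the "NPU" device_type value and the model path are assumptions, so check the provider documentation for your version:

import onnxruntime as ort

# Show which execution providers this onnxruntime build offers
print(ort.get_available_providers())

# Prefer the Intel NPU through the OpenVINO execution provider, falling back to the CPU provider
session = ort.InferenceSession(
    "model.onnx",  # placeholder model path
    providers=["OpenVINOExecutionProvider", "CPUExecutionProvider"],
    provider_options=[{"device_type": "NPU"}, {}],
)
print(session.get_providers())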

Thanks to @DhannielM_Intel for the mention.

PC1997
New Contributor I

Finally, an actual full solution is available:

 

https://www.extremetech.com/computing/microsoft-copilot-for-windows-is-transitioning-to-a-native-app

 

"The app is designed to use the device's Neural Processing Unit (NPU) for task completion and query responses, which could enhance speed and efficiency. Currently, this is limited to Windows Insiders, with a broader release anticipated later this year."
