AI Tools from Intel
Intel NPU Acceleration Library does not support GGUF format file?

Martin_HZK
Novice

I am trying to use the intel-npu-acceleration-library to run a local LLM inference task. However, it seems that the library does not support parsing GGUF files?
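For context, a minimal sketch of the workflow the library's README documents: it starts from a Hugging Face / PyTorch checkpoint and compiles it for the NPU, rather than loading a GGUF file (the model id below is only an example):

import torch
import intel_npu_acceleration_library
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example model id; any Hugging Face causal-LM checkpoint would do.
model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).eval()

# compile() takes the PyTorch model and offloads it to the Intel NPU
# (int8 weight quantization shown here); GGUF files are not parsed.
model = intel_npu_acceleration_library.compile(model, dtype=torch.int8)

inputs = tokenizer("What is an NPU?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))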

(Screenshot attached: Martin_HZK_0-1741231367552.png)

 

Ying_H_Intel
Moderator

Hi Martin_HZK, 

Since the intel/intel-npu-acceleration-library (Intel® NPU Acceleration Library) is not supported here, could you please post your question to the Intel® Distribution of OpenVINO™ Toolkit - Intel Community forum

or to Issues · openvinotoolkit/openvino on GitHub.

FYI, I have also noticed some developers trying Ollama with OpenVINO, for example zhaohb/ollama_ov: Add genai backend for ollama to run generative AI models using OpenVINO Runtime. You may want to try that; a rough sketch of the general OpenVINO path follows below.
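A minimal sketch of that route, assuming the standard optimum-intel export flow and the OpenVINO GenAI LLMPipeline API (the model id, output directory, and int4 weight format are only illustrative; NPU support for a given model depends on your driver and OpenVINO version):

# 1) Export the original Hugging Face model (not the GGUF file) to OpenVINO IR,
#    e.g. with optimum-cli:
#        optimum-cli export openvino --model TinyLlama/TinyLlama-1.1B-Chat-v1.0 \
#            --weight-format int4 TinyLlama-1.1B-ov
# 2) Run the exported model with OpenVINO GenAI, targeting the NPU device:
import openvino_genai

pipe = openvino_genai.LLMPipeline("TinyLlama-1.1B-ov", "NPU")
print(pipe.generate("What is an NPU?", max_new_tokens=64))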

 

Thanks

 
