<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Inferencing - llama.cpp crash on 8626, Windows on 8735 in Graphics</title>
    <link>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746499#M151309</link>
    <description>&lt;P&gt;The post before should have been posted this morning, but I guess I missed the "Post Reply" button 🤷‍&lt;span class="lia-unicode-emoji" title=":male_sign:"&gt;♂️&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Meanwhile I wasn't able to update Windows to 26H1, as it seems to require a fresh install, at least until Microsoft officially makes it available via Windows Update.&lt;/P&gt;&lt;P&gt;But I'm now on Windows 11 Build 26200.8328.&lt;/P&gt;&lt;P&gt;I've also updated the Intel drivers to '32.0.101.8250' and I don't have any issues with this version.&lt;/P&gt;&lt;P&gt;Inferencing with llama.cpp runs flawlessly.&lt;/P&gt;</description>
    <pubDate>Fri, 01 May 2026 16:40:20 GMT</pubDate>
    <dc:creator>f4nt4</dc:creator>
    <dc:date>2026-05-01T16:40:20Z</dc:date>
    <item>
      <title>Inferencing - llama.cpp crash on 8626, Windows on 8735</title>
      <link>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746417#M151291</link>
      <description>&lt;P&gt;Hi&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I just came across an issue while running the above mentioned graphics driver versions.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;My specs:&lt;/P&gt;&lt;P&gt;HP OmniBook Ultra Flip Laptop 14-fh0xxx&lt;/P&gt;&lt;P&gt;Intel Core Ultra 7 258V 32GB&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;It started a few weeks ago with random crashes of llama.cpp, which I ignored for the time being. Today, while working on a local AI pipeline, it crashed every few requests.&lt;/P&gt;&lt;P&gt;After then upgrading today to the latest&amp;nbsp;8735 version, Windows instantly started to crash.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm using the official builds of llama.cpp from GitHub, regardless of version (currently, at the time of writing,&amp;nbsp;b8990) and regardless of the build target (Vulkan or SYCL). The CPU-based build works flawlessly.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The Windows event log sometimes shows the following events:&lt;/P&gt;&lt;LI-CODE lang="xml"&gt;- &amp;lt;Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"&amp;gt;
- &amp;lt;System&amp;gt;
  &amp;lt;Provider Name="Microsoft-Windows-WHEA-Logger" Guid="{c26c4f3c-3f66-4e99-8f8a-39405cfed220}" /&amp;gt; 
  &amp;lt;EventID&amp;gt;1&amp;lt;/EventID&amp;gt; 
  &amp;lt;Version&amp;gt;0&amp;lt;/Version&amp;gt; 
  &amp;lt;Level&amp;gt;2&amp;lt;/Level&amp;gt; 
  &amp;lt;Task&amp;gt;0&amp;lt;/Task&amp;gt; 
  &amp;lt;Opcode&amp;gt;0&amp;lt;/Opcode&amp;gt; 
  &amp;lt;Keywords&amp;gt;0x8000000000000002&amp;lt;/Keywords&amp;gt; 
  &amp;lt;TimeCreated SystemTime="2026-04-30T19:07:44.9763869Z" /&amp;gt; 
  &amp;lt;EventRecordID&amp;gt;34522&amp;lt;/EventRecordID&amp;gt; 
  &amp;lt;Correlation ActivityID="{13a087ad-7a4a-4100-b0b8-29d7729c99f6}" /&amp;gt; 
  &amp;lt;Execution ProcessID="6772" ThreadID="8040" /&amp;gt; 
  &amp;lt;Channel&amp;gt;System&amp;lt;/Channel&amp;gt; 
  &amp;lt;Computer&amp;gt;omni&amp;lt;/Computer&amp;gt; 
  &amp;lt;Security UserID="S-1-5-19" /&amp;gt; 
  &amp;lt;/System&amp;gt;
- &amp;lt;EventData&amp;gt;
  &amp;lt;Data Name="Length"&amp;gt;8160&amp;lt;/Data&amp;gt; 
  &amp;lt;Data Name="RawData"&amp;gt;435045521002FFFFFFFF03000100000002000000E01F0000210713001E041A140000000000000000000000000000000000000000000000000000000000000000BDC407CF89B7184EB3C41F732CB5713166A4613D40AB9A40A698F362D464B38F1D345997D4D8DC0102000000000000000000000000000000000000000000000058010000200E00000003000000000000962A2181ED09964994718D729C8E69ED00000000000000000000000000000000010000000000000000000000000000000000000000000000780F0000480000000003000000000000962A2181ED09964994718D729C8E69ED00000000000000000000000000000000010000000000000000000000000000000000000000000000C00F0000201000000003000000000000962A2181ED09964994718D729C8E69ED000000000000000000000000000000000100000000000000000000000000000000000000000000000202000000000000000000000000000011F3878F98C99E4DA0C46065518C4F6D11230701D501000054D69A0A0000000000003D0005000000770300805E2B0800FFFFFFFFE8AB0900080000C7047F6F0180F1FF0B400C8E1C0F006000110000200100400000000000C00000000000812854101ACF0A005D03000100000400000000000100000000001001010001058F080000001010010010B50F50030000000002E9008007000000000000000000000000000000000C00000000408000081800204000000020000000000000000008000000000000000000000000000000080000000000000000B2B01C0040000E1147BC0982EA800EBEB6FFBF2647C78EFF57BFB9A7FFC5FF3AB0340002000E71410380C0C000000000000000000000000000000000A0000404000200088000000002000008B8B0181C20000E930000800224802C0000004080080810002040000000200000002400800004260000200000000000000020000020010300B81201220740000000000A1000000000000000323F1300080900000880A3C902405C00400000000007640310091000603D3D0000000F008000304A02300000000000000004008F0200010004000000000000000200000000000000000000F060050000000000000800000000220001FE057F0000000000000000010100000001000100000000000000000000001009000000808620C0F4B7203466F519D13E0000000000000000000000001F00000000000000000000000000000000000000000000007201000000000001FF0F000010000000000000000000000000000000000000000000000000000000000000000000000000000000000000001F000000000000000000000000000000000000000000000
07201000000000001FF0F000010000000000000000000000000000000000000000000000000000000000000000000000000000000000000001F00000000000000000000000000000000000000000000007201000000000001FF0F000010000000000000000000000000000000000000000000000000000000000000000000000000000000000000001F00000000000000000000000000000000000000000000007201000000000001FF0F000010000000000000000000000000000000000000000000000000000000000000000000000000000000000000001F0000000000000000000000000000000000000000000000720100002D0038332C002C0010000000000000000000000000000000000000000000000000000000000000000000000000000000000000001F00000000000000000000000000000000000000000000007201000000000001FF0F00001000000000000000000000000000000000000000000000000000000000000000009002EFBEADDEEFBEADDEEFBEADDEEFBEADDE000100000500008000011F302000001000000000031FF60200002040EFBEADDEEFBEADDE000091FF002B000400000000000000C06A0200000680EFBEADDEEFBEEFBEADDEEFBEEFBEADDEEFBEEFBEDEEFBEDEEFBEDEEFDEEFBEDEEFBEADDEEFEFBEADDEEFBEADDEEFBEADEFEFBEEFBEDEEFBEDEEFBEDEEFBEDEEFBEDEEFBEDEEFBEDEEFBEDEEFBEDEEFBEDEEFBEADDEEFBEDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADEFBEADDEEFBEEFBEADDEEFBEADDEEFEFBEADDEEFBEADEFBEADEFBEADDEEFBEADDEADDEEFBEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADEFBEADEFBEADEFBEADDEEFBEEFBEEFBEADDEEFBEADEFBEADDEADDEEFBEADDEEFDEEFBEADADDEEFBEADDEEFBEADDEEFBEEFBEEFBEADDEEFBEADEFBEADDEADDEEFBEADDEEFDEEFBEADADDEEFBEADDEEFBEADDEEFBEEFBEEFBEADDEEFBEADEFBEADDEADDEEFBEADDEEFDEEFBEADADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFB
EADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDE00807F0800008080000000000000000038060000350C000000000000EFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFB
EADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDEEFBEADDE0202000000000000000000000000000011F3878F98C
99E4DA0C46065518C4F6D012107170A0000000000000000000000000000000800000000000000EFBEADDEEFBEADDEEFBEADDE0202000000000000000000000000000011F3878F98C99E4DA0C46065518C4F6D012107110004000000000000A101000000000000930770002116000049087000311600004A08700021160000340C700031160000360C7000211600001F1070003116000021107000211600000A147000311600000C14700021160000F517700031160000F717700021160000E01B700031160000E11B700021160000CB1F700031160000CD1F700021160000B623700031160000B823700030000000240000000A0000002C07000043000001D4237000812B0000D6237000B12B0000D8237000C12B0000D9237000D12B0000DA23700030000000490A000043000001DC23700021160000DF23700031160000E023700021160000A127700031160000A3277000211600008C2B7000311600008E2B700021160000892E7000311600008A2E7000219D0000A32E700021160000A52E700031160000A62E700021160000772F700031160000792F700021160000623370003116000064337000211600004D377000311600004E37700021160000383B700031160000393B700021160000233F700031160000243F7000211600000E437000311600000F43700021160000F946700031160000FA46700030000000240000000A0000002C07000043000001EA4A7000812B0000EC4A7000B12B0000ED4A7000C12B0000EE4A7000D12B0000F04A700030000000490A000043000001F24A700021160000F44A700031160000F64A700021160000CF4E700031160000D04E700021160000BA52700031160000BB527000211600009C557000311600009D557000219D0000B655700021160000B855700031160000B955700021160000A556700031160000A756700021160000905A700031160000925A7000211600007B5E7000311600007C5E70002116000066627000311600006762700021160000516670003116000052667000211600003C6A7000311600003E6A700021160000276E700031160000296E700030000000240000000A0000002C07000043000001FC717000812B0000FE717000B12B000000727000C12B000001727000D12B00000272700030000000490A000043000001047270002116000019727000311600001A7270002116000003767000311600000576700021160000EE79700031160000F079700021160000AF7C700031160000B07C7000219D0000C97C700021160000CB7C700031160000CC7C700021160000D97D700031160000DA7D700021160000C481700031160000C681700021160000AF85700031160000B1857000211600009A897000311600009B8
9700021160000858D700031160000878D700021160000709170003116000072917000413E000007957000513E000008957000F1BC00000B957000613E00000C957000B14300001295700001BD00001495700081BD00001695700091BD000017957000B1BD00001A957000A12C00001C957000C12C00001D957000D12C00001E95700081A700002095700091A7000021957000A1A7000023957000B1A7000024957000B162000025957000D1A0000026957000019B000028957000E12C000029957000F12C00002A957000C1A700002B957000D1A700002D957000012D00002E957000812600002F957000112D000030957000F13B000032957000211A000033957000E102000035957000E1A700007495700000000000ACF200000600000008010000430000017A95700000000000ADF200000600000008010000430000018795700000000000ADF200000600000008010000430000019395700000000000ADF20000070000000801000043000001A0957000000000003903000043000001AA95700001A40000AC95700061130000AE95700021A50000AF95700031A50000B095700021B20000B295700021660000B395700051430000B695700031660000B795700000000000ACF20000060000000801000043000001BC95700000000000ADF20000060000000801000043000001C995700000000000ADF20000060000000801000043000001D695700041660000E195700061430000E2957000212D0000E5957000312D0000E9957000412D0000EA957000F1A70000F1957000512D0000F2957000F1360000F495700001370000F5957000D1360000F6957000E1360000F895700011370000F995700021370000FA95700031370000FB95700021050000FC957000310500006D96700061AF00006E9670000000000069F3000006000000080100004300000173967000915800007C967000D14200007F967000C158000080967000B1BF000082967000C1BF000083967000D1BF000085967000E1BF000089967000F1BF00008A967000A19800008B967000050000006E09000043000001F4967000240000000A0000002C0700004300000129C46E00812B00002BC46E00B12B00002DC46E00C12B00002EC46E00D12B00002FC46E0030000000490A00004300000131C46E002116000034C46E003116000035C46E002116000003C76E003116000005C76E0021160000EECA6E0031160000F0CA6E0021160000DFCE6E0031160000E0CE6E00219D0000F6CE6E0021160000F8CE6E0031160000F9CE6E0021160000C4D26E0031160000C6D26E0021160000AFD66E0031160000B0D66E00211600009ADA6E00311600009BDA6E002116000085DE6E003116000086DE6E002116000070E26E00311
6000071E26E00211600005BE66E00311600005CE66E002116000046EA6E003116000048EA6E0030000000240000000A0000002C070000430000013CEB6E00812B00003EEB6E00B12B000040EB6E00C12B000041EB6E00D12B000042EB6E0030000000490A00004300000144EB6E002116000047EB6E003116000048EB6E002116000031EE6E003116000033EE6E00211600001CF26E00311600001DF26E0021160000F1F56E0031160000F2F56E00219D00000BF66E00211600000EF66E00311600000FF66E0021160000F2F96E0031160000F4F96E0021160000DDFD6E0031160000DEFD6E0021160000C8016F0031160000C9016F0021160000B3056F0031160000B4056F00211600009E096F0031160000A0096F0021160000890D6F00311600008A0D6F002116000074116F003116000075116F0030000000240000000A0000002C070000430000014F126F00812B000051126F00B12B000053126F00C12B000054126F00D12B000055126F0030000000490A00004300000157126F00211600005A126F00311600005B126F00211600005F156F003116000061156F00211600004A196F00311600004C196F0021160000041D6F0031160000051D6F00219D00001E1D6F0021160000201D6F0031160000211D6F002116000020216F003116000022216F00211600000B256F00311600000C256F0021160000F6286F0031160000F8286F0021160000E12C6F0031160000E32C6F0021160000CC306F0031160000CD306F0021160000B7346F0031160000B8346F0021160000A2386F0031160000A3386F0030000000240000000A0000002C0700004300000162396F00812B000064396F00B12B000066396F00C12B000067396F00D12B000068396F0030000000490A0000430000016A396F00211600006D396F00311600006E396F00211600008D3C6F00311600008F3C6F002116000078406F003116000079406F002116000017446F003116000018446F00219D000031446F002116000033446F003116000034446F002116000063446F003116000065446F00211600004E486F003116000050486F0021160000394C6F00311600003A4C6F002116000024506F003116000026506F00211600000F546F003116000011546F0021160000FA576F0031160000FC576F0021160000E55B6F0031160000E65B6F0021160000D05F6F0031160000D25F6F0030000000240000000A0000002C0700004300000175606F00812B000077606F00B12B000079606F00C12B00007A606F00D12B00007B606F0030000000490A0000430000017D606F002116000080606F003116000081606F0021160000BB636F0031160000BD636F0021160000A6676F0031160000A7676F00211600002A6B6F00311
600002B6B6F00219D0000446B6F0021160000466B6F0031160000476B6F0021160000916B6F0031160000936B6F00211600007C6F6F00311600007E6F6F002116000067736F003116000068736F002116000052776F003116000053776F00211600003D7B6F00311600003E7B6F0021160000287F6F0031160000297F6F002116000013836F003116000015836F0021160000FE866F0031160000FF866F0030000000240000000A0000002C0700004300000188876F00812B00008A876F00B12B00008C876F00C12B00008D876F00D12B00008E876F0030000000490A00004300000190876F002116000093876F003116000094876F0021160000E98A6F0031160000EA8A6F0021160000D48E6F0031160000D68E6F00211600003D926F00311600003E926F00219D000057926F002116000059926F00311600005A926F0021160000BF926F0031160000C1926F0021160000AA966F0031160000AC966F0021160000959A6F0031160000979A6F0021160000809E6F0031160000829E6F00211600006BA26F00311600006CA26F002116000056A66F003116000057A66F002116000041AA6F003116000043AA6F00211600002CAE6F00311600002DAE6F0030000000240000000A0000002C070000430000019BAE6F00812B00009DAE6F00B12B00009FAE6F00C12B0000A0AE6F00D12B0000A1AE6F0030000000490A000043000001A3AE6F0021160000A6AE6F0031160000A7AE6F002116000017B26F003116000018B26F002116000002B66F003116000003B66F002116000050B96F003116000051B96F00219D00006AB96F00211600006CB96F00311600006DB96F0021160000EDB96F0031160000EFB96F0021160000D8BD6F0031160000DABD6F0021160000C3C16F0031160000C5C16F0021160000AEC56F0031160000AFC56F002116000099C96F00311600009AC96F002116000084CD6F003116000086CD6F00211600006FD16F003116000071D16F00211600005AD56F00311600005CD56F0030000000240000000A0000002C07000043000001AED56F00812B0000B0D56F00B12B0000B2D56F00C12B0000B3D56F00D12B0000B4D56F0030000000490A000043000001B6D56F0021160000B9D56F0031160000BAD56F002116000045D96F003116000046D96F002116000030DD6F003116000031DD6F002116000063E06F003116000064E06F00219D00007DE06F00211600007FE06F003116000080E06F00211600001BE16F00311600001CE16F002116000006E56F003116000007E56F0021160000F1E86F0031160000F2E86F0021160000DCEC6F0031160000DDEC6F0021160000C7F06F0031160000C8F06F0021160000B2F46F0031160000B3F46F00211600009DF86F00311
600009EF86F002116000088FC6F003116000089FC6F0030000000240000000A0000002C07000043000001C1FC6F00812B0000C3FC6F00B12B0000C5FC6F00C12B0000C6FC6F00D12B0000C7FC6F0030000000490A000043000001C9FC6F0021160000CCFC6F0031160000CDFC6F0021160000730070003116000075007000211600005E047000311600005F04700021160000760770003116000077077000219D000090077000211600009207700031160000&amp;lt;/Data&amp;gt; 
  &amp;lt;/EventData&amp;gt;
  &amp;lt;/Event&amp;gt;&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm happy to provide you with more info if you need it!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;- nebula&lt;/P&gt;</description>
      <pubDate>Thu, 30 Apr 2026 22:36:54 GMT</pubDate>
      <guid>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746417#M151291</guid>
      <dc:creator>f4nt4</dc:creator>
      <dc:date>2026-04-30T22:36:54Z</dc:date>
    </item>
    <item>
      <title>Re: Inferencing - llama.cpp crash on 8626, Windows on 8735</title>
      <link>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746442#M151297</link>
      <description>&lt;P&gt;Hi f4nt4,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;I'd really like to help you get this sorted out. To better understand what's going on with your system, could you help me out with some details?&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;About your system: What version of Windows are you running, and is everything up to date? Also, have you updated your BIOS recently, and if so, do you know which version you're currently on?&lt;/LI&gt;&lt;LI&gt;The graphics driver 8735 is an Intel driver; have you tried using &lt;A href="https://support.hp.com/in-en/drivers/hp-omnibook-ultra-flip-14-inch-2-in-1-laptop-next-gen-ai-pc-14-fh0000/2102254404" rel="noopener noreferrer" target="_blank"&gt;HP Graphics Drivers&lt;/A&gt;? And when you mention crashes - does this happen with other GPU-intensive tasks like games or benchmarks, or just with your AI workloads?&lt;/LI&gt;&lt;LI&gt;Regarding your llama.cpp setup: What kind of models are you running (size, settings, etc.), and how much GPU memory does it typically use before things go sideways? Does it crash right away or only after you've been using it for a while? Are you using any custom settings or modifications?&lt;/LI&gt;&lt;LI&gt;Are you seeing any specific error messages? And have you tried other apps that use Vulkan or SYCL to see if it's just llama.cpp that is crashing?&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Best regards,&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Louie Jay J.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Intel Customer Support Technician&amp;nbsp;&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Fri, 01 May 2026 02:44:06 GMT</pubDate>
      <guid>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746442#M151297</guid>
      <dc:creator>Jay_Intel</dc:creator>
      <dc:date>2026-05-01T02:44:06Z</dc:date>
    </item>
    <item>
      <title>Re: Inferencing - llama.cpp crash on 8626, Windows on 8735</title>
      <link>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746461#M151300</link>
      <description>&lt;P&gt;Hi Louie,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for reaching out to me! &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Here is the requested info:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Windows: Win 11 25H2 - Build 26200.8246 (will try to update to 26H1 while you guys are sleeping &lt;span class="lia-unicode-emoji" title=":nerd_face:"&gt;🤓&lt;/span&gt;)&lt;/P&gt;&lt;P&gt;BIOS:&amp;nbsp;HP W75 Ver. 01.04.02, 09.12.2025 (latest HP is providing atm)&lt;/P&gt;&lt;P&gt;HP is providing&amp;nbsp;32.0.101.7026 Rev.B, which, according to the Windows Device Manager, is the identical version I've downgraded to:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="f4nt4_0-1777619490064.png" style="width: 400px;"&gt;&lt;img src="https://community.intel.com/t5/image/serverpage/image-id/72419i11E3B50D692C36A3/image-size/medium?v=v2&amp;amp;px=400&amp;amp;whitelist-exif-data=Orientation%2CResolution%2COriginalDefaultFinalSize%2CCopyright" role="button" title="f4nt4_0-1777619490064.png" alt="f4nt4_0-1777619490064.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;I've tried to reproduce the issue by running Furmark (v2.10.2 - latest version) and&amp;nbsp;also the VRAM stress test with OCCT (v16.1.9 - latest version).&amp;nbsp;I additionally ran the built-in RAM test of the HP BIOS and memtest86+ (v8.00).&lt;/P&gt;&lt;P&gt;None of them triggered a crash or found RAM-related issues.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;These are my llama.cpp settings:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;llama-server.exe \
--port 5801 \
--threads 4 \
--no-mmap \
--mlock \
--parallel 1 \
--cache-type-k q8_0 \
--cache-type-v q8_0 \
--model w:\llm\unsloth_gemma-4-E2B-it-UD-Q4_K_XL.gguf \
--mmproj W:\llm\gemma-4-E2B\mmproj-BF16.gguf \
-ngl 999 \
--ctx-size 3000 \
-ub 512 \
--reasoning off \
--reasoning-budget 0 \
--cache-ram 0 \
--n-predict 2048 \
--no-cache-prompt&lt;/LI-CODE&gt;&lt;P&gt;So basically Gemma4 - E2B in a low-quantized version, which is not really (v)RAM heavy with these settings, but I've tried many settings.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I also tried different RAM &amp;lt;-&amp;gt; vRAM allocations through the Intel Graphics Software, but that didn't help either.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;On driver version&amp;nbsp;8626, llama.cpp crashes every few inferencing tasks. Sometimes sooner, sometimes later.&lt;/P&gt;&lt;P&gt;On&amp;nbsp;8735 the screen immediately goes black, with a visible mouse cursor, but Windows becomes unresponsive right away and needs to be hard rebooted by holding the power button for 5+ seconds.&lt;/P&gt;&lt;P&gt;It happens regardless of whether it's a basic prompt like "write a 1000 word story" or something like audio transcription.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Memory consumption only goes up during model loading (~5GB) and a bit (~200MB) during prompt processing or inferencing. Again, nothing really (v)RAM heavy.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I haven't tested any other tools besides llama.cpp, but I could try LM Studio. But as LM Studio uses the llama.cpp engine, I'm expecting the same result.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I sadly don't see any error message before the crashes, neither in the debug logs of llama.cpp nor in the Windows event log, but I'm happy to take suggestions on how to get proper logs in such cases &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Hope this helps for a start!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Best&lt;/P&gt;&lt;P&gt;- nebula&lt;/P&gt;</description>
      <pubDate>Fri, 01 May 2026 07:35:19 GMT</pubDate>
      <guid>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746461#M151300</guid>
      <dc:creator>f4nt4</dc:creator>
      <dc:date>2026-05-01T07:35:19Z</dc:date>
    </item>
    <item>
      <title>Re: Inferencing - llama.cpp crash on 8626, Windows on 8735</title>
      <link>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746496#M151308</link>
      <description>&lt;P&gt;Hi Louie,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Thanks for reaching out to me! &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Here is the requested info:&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Windows: Win 11 25H2 - Build 26200.8246 (will try to update to 26H1 while you guys are sleeping &lt;span class="lia-unicode-emoji" title=":nerd_face:"&gt;🤓&lt;/span&gt;)&lt;/P&gt;&lt;P&gt;BIOS: HP W75 Ver. 01.04.02, 09.12.2025 (latest HP is providing atm)&lt;/P&gt;&lt;P&gt;HP is providing 32.0.101.7026 Rev.B, which, according to the Windows Device Manager, is the identical version I've downgraded to:&lt;/P&gt;&lt;P&gt;I've tried to reproduce the issue by running Furmark (v2.10.2 - latest version) and also the VRAM stress test with OCCT (v16.1.9 - latest version). I additionally ran the built-in RAM test of the HP BIOS and memtest86+ (v8.00).&lt;/P&gt;&lt;P&gt;None of them triggered a crash or found RAM-related issues.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;These are my llama.cpp settings:&lt;/P&gt;&lt;LI-CODE lang="bash"&gt;llama-server.exe \
--port 5801 \
--threads 4 \
--no-mmap \
--mlock \
--parallel 1 \
--cache-type-k q8_0 \
--cache-type-v q8_0 \
--model w:\llm\unsloth_gemma-4-E2B-it-UD-Q4_K_XL.gguf \
--mmproj W:\llm\gemma-4-E2B\mmproj-BF16.gguf \
-ngl 999 \
--ctx-size 3000 \
-ub 512 \
--reasoning off \
--reasoning-budget 0 \
--cache-ram 0 \
--n-predict 2048 \
--no-cache-prompt&lt;/LI-CODE&gt;&lt;P&gt;So basically Gemma4 - E2B in a low-quantized version, which is not really (v)RAM heavy with these settings, but I've tried many settings.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;I also tried different RAM &amp;lt;-&amp;gt; vRAM allocations through the Intel Graphics Software, but that didn't help either.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;On driver version 8626, llama.cpp crashes every few inferencing tasks. Sometimes sooner, sometimes later.&lt;/P&gt;&lt;P&gt;On 8735 the screen immediately goes black, with a visible mouse cursor, but Windows becomes unresponsive right away and needs to be hard rebooted by holding the power button for 5+ seconds.&lt;/P&gt;&lt;P&gt;It happens regardless of whether it's a basic prompt like "write a 1000 word story" or something like audio transcription.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Memory consumption only goes up during model loading (~5GB) and a bit (~200MB) during prompt processing or inferencing. Again, nothing really (v)RAM heavy.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;I haven't tested any other tools besides llama.cpp, but I could try LM Studio. But as LM Studio uses the llama.cpp engine, I'm expecting the same result.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;I sadly don't see any error message before the crashes, neither in the debug logs of llama.cpp nor in the Windows event log, but I'm happy to take suggestions on how to get proper logs in such cases &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Hope this helps for a start!&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Best&lt;/P&gt;&lt;P&gt;- nebula&lt;/P&gt;</description>
      <pubDate>Fri, 01 May 2026 16:34:52 GMT</pubDate>
      <guid>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746496#M151308</guid>
      <dc:creator>f4nt4</dc:creator>
      <dc:date>2026-05-01T16:34:52Z</dc:date>
    </item>
    <item>
      <title>Re: Inferencing - llama.cpp crash on 8626, Windows on 8735</title>
      <link>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746499#M151309</link>
      <description>&lt;P&gt;The post before should have been posted this morning, but I guess I missed the "Post Reply" button 🤷‍&lt;span class="lia-unicode-emoji" title=":male_sign:"&gt;♂️&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Meanwhile I wasn't able to update Windows to 26H1, as it seems to require a fresh install, at least until Microsoft officially makes it available via Windows Update.&lt;/P&gt;&lt;P&gt;But I'm now on Windows 11 Build 26200.8328.&lt;/P&gt;&lt;P&gt;I've also updated the Intel drivers to '32.0.101.8250' and I don't have any issues with this version.&lt;/P&gt;&lt;P&gt;Inferencing with llama.cpp runs flawlessly.&lt;/P&gt;</description>
      <pubDate>Fri, 01 May 2026 16:40:20 GMT</pubDate>
      <guid>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746499#M151309</guid>
      <dc:creator>f4nt4</dc:creator>
      <dc:date>2026-05-01T16:40:20Z</dc:date>
    </item>
    <item>
      <title>Re: Inferencing - llama.cpp crash on 8626, Windows on 8735</title>
      <link>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746501#M151310</link>
      <description>&lt;P&gt;OK, it seems like there is some moderation going on. My previous post doesn't show up. Here's a screenshot of it:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="f4nt4_0-1777653734770.png" style="width: 400px;"&gt;&lt;img src="https://community.intel.com/t5/image/serverpage/image-id/72433i5BDD82CFD0ED7E47/image-size/medium?v=v2&amp;amp;px=400&amp;amp;whitelist-exif-data=Orientation%2CResolution%2COriginalDefaultFinalSize%2CCopyright" role="button" title="f4nt4_0-1777653734770.png" alt="f4nt4_0-1777653734770.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 01 May 2026 16:42:35 GMT</pubDate>
      <guid>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746501#M151310</guid>
      <dc:creator>f4nt4</dc:creator>
      <dc:date>2026-05-01T16:42:35Z</dc:date>
    </item>
    <item>
      <title>Re: Inferencing - llama.cpp crash on 8626, Windows on 8735</title>
      <link>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746534#M151316</link>
      <description>&lt;P&gt;Hi f4nt4,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thank you for your response. Just to confirm: you were previously using version 8735, which was causing an issue with llama.cpp, and the 32.0.101.8250 graphics driver is the latest version from HP, correct? And with this updated driver, you're no longer experiencing any issues?&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Best regards,&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Louie Jay J.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Intel Customer Support Technician&amp;nbsp;&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Sat, 02 May 2026 03:20:55 GMT</pubDate>
      <guid>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746534#M151316</guid>
      <dc:creator>Jay_Intel</dc:creator>
      <dc:date>2026-05-02T03:20:55Z</dc:date>
    </item>
    <item>
      <title>Re: Re: Inferencing - llama.cpp crash on 8626, Windows on 8735</title>
      <link>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746582#M151327</link>
      <description>&lt;P&gt;Hi Louie / Jay,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Yes, I was previously using&amp;nbsp;8735.&lt;/P&gt;&lt;P&gt;The latest driver HP is offering is&amp;nbsp;7026.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Here's an overview of the versions I've tested and how each behaves:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;TABLE border="1" width="100%"&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD width="50%" height="25px"&gt;Version&lt;/TD&gt;&lt;TD width="50%" height="25px"&gt;Behavior&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD width="50%" height="25px"&gt;7026&lt;/TD&gt;&lt;TD width="50%" height="25px"&gt;No issues&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;8250&lt;/TD&gt;&lt;TD&gt;No issues&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;8626&lt;/TD&gt;&lt;TD&gt;llama.cpp crashes&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;8735&lt;/TD&gt;&lt;TD&gt;Screen goes black + Windows freezes&lt;/TD&gt;&lt;/TR&gt;&lt;TR&gt;&lt;TD&gt;8737 (released yesterday)&lt;/TD&gt;&lt;TD&gt;Either same as&amp;nbsp;8626 or&amp;nbsp;8735&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I've dug around in the Windows event log a bit and obtained some logs from the last llama.cpp crash on&amp;nbsp;8737. Please find them attached.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Hope this helps!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Best&lt;/P&gt;&lt;P&gt;- f4nt4&lt;/P&gt;</description>
      <pubDate>Sun, 03 May 2026 09:35:32 GMT</pubDate>
      <guid>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746582#M151327</guid>
      <dc:creator>f4nt4</dc:creator>
      <dc:date>2026-05-03T09:35:32Z</dc:date>
    </item>
    <item>
      <title>Re: Inferencing - llama.cpp crash on 8626, Windows on 8735</title>
      <link>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746620#M151340</link>
      <description>&lt;P&gt;Hi f4nt4,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-size: inherit;"&gt;Thank you for sharing all of this information; it will certainly help as we move forward with our investigation. Could you please also check whether the logs were attached correctly in the ZIP file? Upon review, it appears that the ZIP file does not currently contain any log files.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-size: inherit;"&gt;Additionally, to give us a better understanding of your system configuration and components, please generate a complete copy of the System Support Utility (SSU) report. Please follow the instructions here and send us the report -&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://www.intel.com/content/www/us/en/support/articles/000057926/memory-and-storage.html" rel="noopener noreferrer" target="_blank" style="font-size: inherit;"&gt;&lt;EM&gt;How to get the Intel® System Support Utility Logs on Windows*&lt;/EM&gt;&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;We look forward to your response.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Best regards,&lt;/P&gt;&lt;P&gt;Jed G.&lt;/P&gt;&lt;P&gt;Intel Customer Support Technician&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Mon, 04 May 2026 01:46:47 GMT</pubDate>
      <guid>https://community.intel.com/t5/Graphics/Inferencing-llama-cpp-crash-on-8626-Windows-on-8735/m-p/1746620#M151340</guid>
      <dc:creator>JedG_Intel</dc:creator>
      <dc:date>2026-05-04T01:46:47Z</dc:date>
    </item>
  </channel>
</rss>

