Intel® Optimized AI Frameworks
Receive community support for questions related to PyTorch* and TensorFlow* frameworks.

Error: llama runner process has terminated: exit status 0xc0000135

Balaji173
Beginner


I followed the documents below to run an Ollama model on an Intel GPU using IPEX-LLM:

https://github.com/intel-analytics/ipex-llm/blob/main/docs/mddocs/Quickstart/ollama_quickstart.md

https://www.intel.com/content/www/us/en/content-details/826081/running-ollama-with-open-webui-on-intel-hardware-platform.html

However, I couldn't get any inference from the model. It fails with:

Error: llama runner process has terminated: exit status 0xc0000135

Can anyone help resolve this issue?
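For context, exit status 0xc0000135 is the Windows NTSTATUS code STATUS_DLL_NOT_FOUND, which usually means the llama runner could not locate a required runtime DLL (for example, the Intel GPU driver or oneAPI runtime libraries). The GPU setup from the linked quickstart looks roughly like the following (a sketch for Windows, assuming a conda environment; the environment name is illustrative):

```
:: Create an environment and install IPEX-LLM with llama.cpp/Ollama support
conda create -n llm-cpp python=3.11
conda activate llm-cpp
pip install --pre --upgrade ipex-llm[cpp]

:: Initialize Ollama in the current directory (creates symlinks to the runner)
init-ollama.bat

:: Offload all layers to the Intel GPU and enable SYCL caching
set OLLAMA_NUM_GPU=999
set ZES_ENABLE_SYSMAN=1
set SYCL_CACHE_PERSISTENT=1

:: Start the Ollama server from this same directory
ollama serve
```

If any of these steps were skipped, or if `ollama serve` is launched from a directory other than the one where `init-ollama.bat` was run, the runner may fail to find its DLLs and exit with 0xc0000135.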

1 Solution
Srii
Employee

Please open an issue on the ipex-llm GitHub page: https://github.com/intel-analytics/ipex-llm/issues

