Intel® Optimized AI Frameworks
Receive community support for questions related to PyTorch* and TensorFlow* frameworks.

Error: llama runner process has terminated: exit status 0xc0000135

Balaji173
Beginner
7,766 Views


I followed the documents below to run an Ollama model on GPU using Intel IPEX-LLM:

https://github.com/intel-analytics/ipex-llm/blob/main/docs/mddocs/Quickstart/ollama_quickstart.md

https://www.intel.com/content/www/us/en/content-details/826081/running-ollama-with-open-webui-on-intel-hardware-platform.html

However, I couldn't get any inference from the model. It fails with:

Error: llama runner process has terminated: exit status 0xc0000135

Can anyone help resolve this issue?
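For context, exit status 0xc0000135 is the Windows NTSTATUS code STATUS_DLL_NOT_FOUND, which usually means a runtime DLL the llama runner depends on could not be located. One thing worth trying before filing a bug is repeating the quickstart setup in a clean environment. A minimal sketch of the Windows steps, based on the linked ipex-llm quickstart (package name, `init-ollama.bat`, and the environment variables are taken from that doc and may have changed since):

```shell
:: Sketch of the ipex-llm Ollama setup from the linked quickstart (Windows cmd).
:: Create and activate a fresh environment so stale DLLs don't interfere.
conda create -n llm-cpp python=3.11
conda activate llm-cpp

:: Install the llama.cpp-backed ipex-llm package.
pip install --pre --upgrade ipex-llm[cpp]

:: Create the ollama launcher links in the current directory.
init-ollama.bat

:: Offload all model layers to the Intel GPU and enable device enumeration.
set OLLAMA_NUM_GPU=999
set ZES_ENABLE_SYSMAN=1
set SYCL_CACHE_PERSISTENT=1

:: Start the server, then run a model from a second terminal with `ollama run`.
ollama serve
```

If the same 0xc0000135 error persists in a clean environment, it points to a missing system-level dependency rather than a setup mistake, which is worth mentioning in the bug report.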

1 Solution
Srii
Employee
7,589 Views

Please open an issue on the ipex-llm GitHub page. Here is the link: https://github.com/intel-analytics/ipex-llm/issues

