Graphics
Intel® graphics drivers and software, compatibility, troubleshooting, performance, and optimization

Arc B580 and llm-scaler

mathz
Beginner

Does llm-scaler support Arc B580?

I'd like to add another GPU to my B60 to get a bit more VRAM, and the B580 seems to be the best value option, if it works, that is 😊

Mike_Intel
Moderator

Hello mathz,


Thank you for posting in the Intel Community Forum.


To better understand and diagnose the issue, could you please answer the following questions? This information will help me isolate the problem and determine the most appropriate course of action.


  1. May I know why you want to use this configuration?
  2. Does your board support this configuration?


If you have questions, please let us know. Thank you.


Best regards,

Michael L.

Intel Customer Support Technician


mathz
Beginner

Hi Mike,

 

Thank you for your response.

Yes, my motherboard easily supports multiple GPUs (I've got an ASRock Rack ROMED8-2T), and I'd like to use it for inference.

 

Best regards,

Mathz

 

mathz
Beginner

Forgot to mention: I'm using llm-scaler at the moment, so my main concern is whether llm-scaler can recognize the B580.

JedG_Intel
Moderator

Hi mathz,


Thank you for sharing all this information. I'll check all the details and I'll get back to you as soon as possible.


Best regards,

Jed G.

Intel Customer Support Technician



Mike_Intel
Moderator

Hello mathz,


Thank you so much for patiently waiting for our update.


As for your inquiry, here is my update:


Support for using two Arc GPUs for LLM inferencing is available through IPEX-LLM. For further details, you may also check these links:


Multi Intel GPUs selection — IPEX-LLM latest documentation

IPEX-LLM — IPEX-LLM latest documentation

ipex-llm/python/llm/example/GPU/Deepspeed-AutoTP/README.md at main · intel/ipex-llm · GitHub
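
For illustration, a minimal sketch of picking a specific Intel GPU with IPEX-LLM might look like the snippet below. This is only a sketch: it assumes ipex-llm is installed with Intel XPU support, and the model path is a placeholder rather than a recommendation.

# Minimal sketch: list the Intel GPUs PyTorch can see, then run a small
# generation on a chosen one. Assumes ipex-llm with Intel XPU support;
# the model path is only an illustrative placeholder.
import torch
import intel_extension_for_pytorch  # noqa: F401  (registers the "xpu" device backend)
from transformers import AutoTokenizer
from ipex_llm.transformers import AutoModelForCausalLM

# Enumerate the visible XPU devices (e.g. a B60 and a B580 would both appear here).
for i in range(torch.xpu.device_count()):
    print(f"xpu:{i} -> {torch.xpu.get_device_name(i)}")

model_path = "meta-llama/Llama-2-7b-chat-hf"  # placeholder model

# Load the model with 4-bit weights and place it on the second GPU ("xpu:1").
model = AutoModelForCausalLM.from_pretrained(model_path, load_in_4bit=True)
model = model.to("xpu:1")
tokenizer = AutoTokenizer.from_pretrained(model_path)

inputs = tokenizer("Hello from two Arcs", return_tensors="pt").to("xpu:1")
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))

Which devices are visible, and in which order, can also be restricted with the ONEAPI_DEVICE_SELECTOR environment variable described on the multi-GPU selection page above.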


If you have questions, please let us know. Thank you.


Best regards,

Michael L.

Intel Customer Support Technician


mathz
Beginner

Hi Michael,

 

Thank you for your response.

Unfortunately, it does not answer my question.

I'm aware that llm-scaler can utilize multiple GPUs. The question is whether it is viable to combine a B60 and a B580 in one setup.

Since both share the same architecture, the relevant difference is their VRAM capacity, so let me rephrase the question:


Can you use the entire VRAM of both cards if they have different capacities, or does the software fall back to the lowest common denominator (in this case 12 GB per card, i.e. 24 GB in total instead of the available 36 GB)?
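
To make the question concrete, here is the kind of check I have in mind (just a sketch, assuming a PyTorch build with Intel XPU support, i.e. intel_extension_for_pytorch, as used by ipex-llm):

# Sketch: enumerate the visible Intel GPUs and print their properties, to see
# whether the B60 (24 GB) and the B580 (12 GB) are both exposed with their
# full VRAM. Assumes PyTorch with Intel XPU support.
import torch
import intel_extension_for_pytorch  # noqa: F401  (registers the "xpu" device backend)

for i in range(torch.xpu.device_count()):
    # The device properties include the device name and its total memory.
    print(f"xpu:{i}:", torch.xpu.get_device_properties(i))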

 

Best regards,

Mathz

Mike_Intel
Moderator

Hi mathz,

 

Thank you for the clarification. Let me check this again, and I will update this thread once I have more information.

 

If you have questions, please let us know. Thank you.

 

Best regards,

Michael L.

Intel Customer Support Technician

 

Mike_Intel
Moderator

Hi mathz,

 

I hope you are having a good day.


Upon further checking, I am sorry to say that we have very limited information regarding this configuration.


However, the experts in this domain can be reached via the GitHub pages below.

ipex-llm/python/llm/example/GPU/Deepspeed-AutoTP/README.md at main · intel/ipex-llm · GitHub

Intel® Corporation · GitHub
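
For orientation, the AutoTP example linked above roughly follows the pattern sketched below. This is a simplified illustration, not the exact script: it assumes deepspeed and ipex-llm with Intel XPU support are installed, uses a placeholder model path, and expects to be launched by a multi-process launcher that sets LOCAL_RANK and WORLD_SIZE.

# Simplified sketch of the DeepSpeed AutoTP pattern from the linked ipex-llm
# example: shard one model across several Intel GPUs with tensor parallelism.
# Illustration only, not the exact example script.
import os
import torch
import deepspeed
from transformers import AutoModelForCausalLM
from ipex_llm import optimize_model

local_rank = int(os.environ.get("LOCAL_RANK", "0"))
world_size = int(os.environ.get("WORLD_SIZE", "2"))

model_path = "meta-llama/Llama-2-13b-chat-hf"  # placeholder model

# Load the full model on CPU first, then let DeepSpeed split its weights
# across the ranks (automatic tensor parallelism, no kernel injection).
model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype=torch.float16)
model = deepspeed.init_inference(model,
                                 mp_size=world_size,
                                 dtype=torch.float16,
                                 replace_with_kernel_inject=False)

# Apply ipex-llm low-bit optimization to this rank's shard and move it to
# this rank's Intel GPU.
model = optimize_model(model.module.to("cpu"), low_bit="sym_int4")
model = model.to(f"xpu:{local_rank}")

The README above covers the actual launch command and environment setup, which this sketch does not reproduce.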

 

Since you are now being directed to GitHub, I need to close this inquiry.

If you need further assistance, please post a new question as this thread will no longer be monitored. 


Thank you and have a great day. 


Best regards,

Michael L.

Intel Customer Support Technician

