Intel® ARC™ Graphics

The Arc GPU is having a huge problem right now

Suzie1818
New Contributor I

@RonaldM_Intel 

 

After numerous tests, it is confirmed that the current Arc drivers deliver the GPU's performance mostly in proportion to the paired CPU's performance. This creates a huge problem: all the reviews and claims about Arc GPUs become meaningless and invalid, because most reviewers used the latest top-end CPUs in their tests. The advertising claim that the A750 offers better performance per dollar than the RTX 3060 borders on fraud, because when paired with a mainstream or even budget CPU, the Arc GPU's performance drops drastically compared to the top-end-CPU test rigs in your lab. Please note that this behavior is abnormal and typically does not occur on current Nvidia or AMD GPUs.
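To illustrate what I mean, here is a minimal sketch of how per-frame driver CPU overhead would produce exactly this scaling. All numbers are hypothetical, not measurements of any real system:

```python
# Minimal bottleneck model (all numbers hypothetical). A frame ships only
# when both the GPU work and the CPU-side work (game logic plus driver
# submission cost) for that frame are done, so the effective frame time
# is roughly the slower of the two.

def effective_fps(gpu_ms: float, game_cpu_ms: float, driver_cpu_ms: float) -> float:
    """Delivered FPS given per-frame GPU and CPU-side costs in milliseconds."""
    frame_ms = max(gpu_ms, game_cpu_ms + driver_cpu_ms)
    return 1000.0 / frame_ms

GPU_MS = 10.0  # the GPU itself can sustain 100 FPS

# Thin driver (~1 ms/frame): fast and slow CPUs both stay GPU-bound.
print(effective_fps(GPU_MS, game_cpu_ms=4.0, driver_cpu_ms=1.0))  # 100.0
print(effective_fps(GPU_MS, game_cpu_ms=8.0, driver_cpu_ms=1.0))  # 100.0

# Heavy driver (~6 ms/frame): the slower CPU's total CPU-side cost now
# exceeds the GPU time, so FPS drops and begins tracking CPU speed.
print(effective_fps(GPU_MS, game_cpu_ms=4.0, driver_cpu_ms=6.0))  # 100.0
print(effective_fps(GPU_MS, game_cpu_ms=8.0, driver_cpu_ms=6.0))  # ~71.4
```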

 

I believe this should be your first-priority issue to fix. Right now not many people are aware of it, but if the big-name online reviewers discover this problem and expose it in articles or videos, Intel's reputation will be terribly stained and your journey of developing future Arc GPUs will become very tough.

Hugo_Intel
Moderator

Hello Suzie1818


Thank you for posting on the Intel® ARC™ Graphics Communities.


We really appreciate your feedback and your bringing this topic to our attention. We would like to pass this information on to the corresponding team, along with the information already provided by @AjrAlves. First, we would like to get more information from you: could you please share links to where you saw this information, or short videos if you performed these tests yourself?


Best Regards,


Hugo O.

Intel Customer Support Technician.


Suzie1818
New Contributor I

@Hugo_Intel  

@RonaldM_Intel 

 

All the test results and evidence have already been posted in this thread. Please read it through thoroughly.

 

Other evidence comes from discussions among many people on Reddit (r/IntelArc). It's not possible for me to dig out everything there and give you links. If you don't care enough to check these major communities yourselves every day, I have nothing more to say, and I will only believe more firmly that you lack passion and enthusiasm for your own work and product.

RonaldM_Intel
Moderator

@Suzie1818 thanks for the feedback. As I noted in this thread, we are continuously working to improve performance in DirectX 9 and DirectX 11 titles, as they have shown plenty of room for improvement since the Arc Graphics cards launched back in October 2022.

I would prefer not to generalize the situation but rather to track issues title by title, since platform, software, and hardware differ from user to user.

Rest assured that we are cooking up new driver updates that will bring significant improvements; please stay tuned over the coming months.

 

Thank you for supporting Intel products.

 

Best Regards,

Ronald M. 

Suzie1818
New Contributor I

@RonaldM_Intel 

 

Right now the problem is not confined to DX9 and DX11. Hopefully you have read all the evidence I provided in the other thread thoroughly. Even the scenario most favorable to Arc, the 3DMark Time Spy (DX12) synthetic benchmark, in which the Arc A770 already achieves performance equal to an RTX 3070, shows the same signature and proves that Arc's driver is infected with the disease of CPU overhead: GPU performance stays dependent on, roughly in proportion to, CPU performance even in a 100% GPU-bound scenario. You must take this seriously, because an issue of this kind IS very serious when it reaches this extent.

Suzie1818
New Contributor I

@RonaldM_Intel 

 

From my point of view, this is a deadly poison that has penetrated deep into your bones. You must set everything else aside and seek the remedy to get rid of it as soon as possible.

Suzie1818
New Contributor I

@RonaldM_Intel 

 

News: Raja Koduri, the chief architect behind the Arc GPUs, has just left Intel.

 

This has led me to wonder, and even suspect, that this issue of GPU performance depending on CPU performance is an architectural mistake made at the very beginning of the design, one that may never be fixable in software or drivers.

 

If my suspicion turns out to be true, the first generation of Arc GPUs will go down as the worst joke in GPU history.

Hugo_Intel
Moderator

Hello Suzie1818


Thank you for your feedback. I see that @RonaldM_Intel has already provided important input on the feedback you are sharing with us. I will pass this information to the corresponding team and post back in case there is more information we can share with you.


Best Regards,


Hugo O.

Intel Customer Support Technician.


Suzie1818
New Contributor I

@RonaldM_Intel 

@Hugo_Intel 

 

Just one more piece of obvious evidence for you. Please see the attached screenshots. They show the built-in benchmark of "Call of Duty: Modern Warfare 2", the game bundled with my Arc GPU purchase.

 

This time I simply moved my A770 between my two PCs and used the same settings for both tests. The performance difference caused by the different CPUs was quite obvious. Both systems use DDR4-3400 RAM.

 

You can try running this benchmark with your own i9-13900K + A770 combo; I think the resulting score will be ironically high. Keep in mind that this is a 99% GPU-bound scenario, as you can clearly see in the screenshots. It is very unreasonable for an Arc GPU to deliver such different performance with different CPUs. There must be something terribly wrong, either in the driver or in the macro-architecture of the GPU's processing pipeline.
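For anyone who wants to double-check this kind of claim from a frametime capture instead of the in-game overlay, here is a rough sketch. The CSV column names vary between capture tools (PresentMon, CapFrameX, and so on), so the column defaults and file names below are only placeholders, not real exports:

```python
# Rough sketch: from a per-frame CSV capture, compute average FPS and the
# fraction of each frame the GPU was busy. Column names differ between
# capture tools, so the defaults below are placeholders -- adjust them
# to whatever your capture actually exports.
import csv

def summarize(path: str, frame_col: str = "msBetweenPresents",
              gpu_col: str = "msGPUActive") -> tuple[float, float]:
    """Return (average FPS, average GPU-busy fraction) for one capture."""
    frame_ms, gpu_ms = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            frame_ms.append(float(row[frame_col]))
            gpu_ms.append(float(row[gpu_col]))
    avg_frame = sum(frame_ms) / len(frame_ms)
    avg_gpu = sum(gpu_ms) / len(gpu_ms)
    return 1000.0 / avg_frame, avg_gpu / avg_frame

# If both machines report a GPU-busy fraction near 1.0 but very different
# FPS with the same GPU, the missing time is CPU-side (driver) overhead.
for capture in ["i5-12400_a770.csv", "i9-13900K_a770.csv"]:  # hypothetical files
    fps, busy = summarize(capture)
    print(capture, round(fps, 1), f"{busy:.0%} GPU busy")
```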

Suzie1818
New Contributor I

@RonaldM_Intel 

@Hugo_Intel 

 


According to the test results with AC: Origins in the other thread, I am guessing you could probably get a final score of 93 fps in this COD: MW2 benchmark with an i9-13900K. Do you see what I mean? It means the A770's hardware is actually capable of rendering 93 fps (average) in this benchmark (if my guess turns out to be correct), and theoretically you should get that many fps regardless of which CPU you use, as long as the scenario remains 99% GPU-bound, including with my i5-12400 and i5-13500 and even lower-end parts like the i3-12100. This is simply how it works on Nvidia and AMD GPUs. It should look like this, this, and this.
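To make the arithmetic behind that expectation concrete, here is a small worked example in the same spirit as the bottleneck sketch I posted earlier. Every number except the 93 fps guess is hypothetical:

```python
# Worked example for the 93 FPS guess above (all numbers hypothetical).
# If the A770 needs ~10.75 ms of GPU work per frame (1000/93), any CPU
# whose total per-frame cost stays under 10.75 ms should deliver the
# same ~93 FPS. A heavy fixed driver cost breaks that expectation.

GPU_MS = 1000.0 / 93  # ~10.75 ms of GPU work per frame

def fps(cpu_side_ms: float) -> float:
    return 1000.0 / max(GPU_MS, cpu_side_ms)

# Guessed per-frame game-logic costs for three CPUs.
for name, game_ms in [("i9-13900K", 4.0), ("i5-13500", 6.0), ("i3-12100", 8.0)]:
    thin = fps(game_ms + 1.0)   # thin driver: ~1 ms/frame
    heavy = fps(game_ms + 5.0)  # heavy driver: ~5 ms/frame
    print(f"{name}: thin driver {thin:.1f} FPS, heavy driver {heavy:.1f} FPS")
    # thin driver:  93.0 FPS for all three, as it should be
    # heavy driver: 93.0 / 90.9 / 76.9 FPS -- FPS now tracks the CPU
```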

 

The ultimate question from potential Arc GPU buyers: "Why should I get lower fps just because my CPU is slower than an i9-13900K? It doesn't make any sense! If that's the case, I don't want to buy it anymore, because it's a cheat!" See? This is the problem!

 

Conversely, when you want to benchmark a CPU's gaming performance, you need to create a CPU-bound scenario: use a top-end GPU such as an RTX 4090 or RX 7900 XTX (or an RTX 3090 Ti / RX 6900 XT from the previous generation), set a low resolution such as 1920×1080 (or even 1280×720), and use the lowest graphics settings, so that the GPU frametime falls far below the CPU frametime and the CPU becomes the bottleneck. However, that is absolutely not the normal gaming scenario we are talking about here.
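In the same hypothetical model, that flip looks like this:

```python
# Same hypothetical model, flipped for CPU benchmarking: lowering the
# resolution and settings shrinks the GPU cost per frame until the
# CPU-side cost dominates, so the score measures the CPU, not the GPU.

def fps(gpu_ms: float, cpu_side_ms: float) -> float:
    return 1000.0 / max(gpu_ms, cpu_side_ms)

print(fps(gpu_ms=15.0, cpu_side_ms=6.0))  # ~66.7 FPS: GPU-bound, measures the GPU
print(fps(gpu_ms=2.0, cpu_side_ms=6.0))   # ~166.7 FPS: CPU-bound, measures the CPU
```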
