Intel® ARC™ Graphics

Game bug: Assassin's Creed Origins

Suzie1818
New Contributor I

Assassin's Creed Origins is still unplayable with the latest 4032 driver. It seems the Intel dev team hasn't paid any attention to it since the launch of the Arc GPUs. Ironically, this game is one of the few games properly listed with a title picture in the Arc Control Game Library.

37 Replies
Jean_Intel
Employee

Hello Suzie1818,

 

Thanks for waiting for a response.

 

We will try to improve Vulkan's performance; however, we can't commit to any particular outcome or ETA. For now, it is best if you run the game on DX11.

 

Best regards,

Jean O.

Intel Customer Support Technician


Suzie1818
New Contributor I

Hello @Jean_Intel 

 

Thanks for continuing to follow up on this thread.

 

Unfortunately, I have just tried the latest BETA driver version 4123 on this game, and there is no improvement (DX11) at all.

Jean_Intel
Employee

Hello Suzie1818,

 

We appreciate the heads-up and the constant feedback you have shared on our communities.

 

We will look further into this scenario and respond as soon as possible.

 

Best regards,

Jean O.

Intel Customer Support Technician


Jean_Intel
Employee

Hello Suzie1818,

 

This time we would like to recommend that you run this game on DX11 with driver 101.4091. Note that in driver 101.4091, we introduced major fixes to improve the performance of this API; here is a video from our lab that shows the game as playable with high performance. We used the game's own benchmark.

 

Video link: https://youtu.be/F9pJgMABglw

 

Best regards,

Jean O.

Intel Customer Support Technician


Suzie1818
New Contributor I

@Jean_Intel  wrote:

 

Note that in driver 101.4091, we introduced major fixes to improve the performance of this API;

 

here is a video from our Lab that shows the game as playable with high performance. We used the game's own benchmark. 

 

Video link: https://youtu.be/F9pJgMABglw

 


Hey ma'am,

 

Please see the attached screenshots. No improvement, okay? I just did it with the "High" graphic quality preset.

 

In the video you just shared, we see it got only a 65 fps average, and you call that "high performance"??? Besides, the video gave the game away: how come the CPU turned out to be an i9-13900K? What the $&#**^%_^#((......! Who on earth would pair a $600 top-end CPU with a $350 mid-range GPU? The benchmark does not reflect real-world use and is therefore meaningless.

 

Even with the 13900K, you still got a 9 ms average CPU frame time, which is way too high for a top-end CPU. That obviously indicates your driver has a severe overhead problem and is dragging down the CPU. That is why Arc's performance is so CPU-dependent, and why many Arc users with matching mid-range CPUs have been suffering performance below expectations.

 

In addition, as I have mentioned earlier,

I had used an RTX 3060, and it scored 83 average fps in this game at the "Very High" setting; I had also used an RTX 3070, and it scored 98 average fps at the "Ultra High" setting.

Hey, those scores were achieved with exactly the same i5-12400 I am using right now, a mere i5-12400, okay? With the only 65 fps you've got, the A770 cannot even match an RTX 3050, let alone compare to an RTX 3060. Do you think Intel lives up to the promises it made to us early adopters/supporters of your first-generation Arc?

Suzie1818
New Contributor I

@Jean_Intel wrote:

 

Note that in driver 101.4091, we introduced major fixes to improve the performance of this API



I think the driver dev team does deserve some praise, as there has indeed been some apparent improvement, as shown in the attached screenshots.

 

Nevertheless, you guys still have a long way to go before the driver reaches truly satisfactory performance.

Suzie1818
New Contributor I
Jean_Intel wrote:

Hello Suzie1818,

 

This time we would like to recommend that you run this game on DX11 with driver 101.4091. Note that in driver 101.4091, we introduced major fixes to improve the performance of this API; here is a video from our lab that shows the game as playable with high performance. We used the game's own benchmark.

 

Video link: https://youtu.be/F9pJgMABglw

 


@RonaldM_Intel 

 

With the same graphics settings and the same GPU (A770), you got an average of 65 fps with an i9-13900K, but I got an average of 50 fps with an i5-12400F. (65-50)/50 = 30%, which is very close to the actual single-core performance difference between the two CPUs, and it is frankly ridiculous for a GPU's performance to be this CPU-dependent. This just shows how badly the current driver is programmed. You guys really need to dig into the problem: what on earth is causing such an obvious overhead issue?
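The percentage arithmetic above can be written out as a minimal Python sketch (the fps figures are the ones quoted in this exchange):

```python
def relative_gain(slow_fps: float, fast_fps: float) -> float:
    """Fractional fps difference between two CPUs driving the same GPU."""
    return (fast_fps - slow_fps) / slow_fps

# Figures from this thread: the same A770 averaged 65 fps on an
# i9-13900K but only 50 fps on an i5-12400F.
gap = relative_gain(50.0, 65.0)
print(f"{gap:.0%}")  # prints "30%"
```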

 

Recently I read a review here, and they managed to get a 15141 3DMark Time Spy graphics score with only a moderate overclock of the A770's boost clock to 2580 MHz, with maximum core power consumption hitting only 230 W throughout their benchmark run. I have managed at most 14638 in the same benchmark with an extreme overclock at a 2750 MHz boost clock and a constant 252 W core power draw throughout (my A770 is also Acer's BiFrost variant). The only difference is that they used an i9-12900K, and my CPU is an i5-12400. This again shows that even in the best-case scenario, the 3DMark Time Spy benchmark, Arc's driver still obviously suffers from the CPU overhead problem. This kind of thing just doesn't happen on an Nvidia or AMD GPU.

Jean_Intel
Employee

Hello Suzie1818,

 

Once again, we appreciate the feedback you provide us.

 

We will discuss the information you shared and check this matter internally.

 

Best regards,

Jean O.

Intel Customer Support Technician


Suzie1818
New Contributor I

To show the problem more clearly, I did some more tests.

Please see the attached screenshots taken from the in-game benchmark results. The PC running the tests had an i5-12400F CPU and an RTX 3080 GPU.

As you can see, the CPU frame time, even with only a 12400, is much lower than the result with your 13900K (+ Arc) combo.

Suzie1818
New Contributor I

I happened to have done some experiments the other day and got some findings that clearly show the overhead problem of the Arc driver.

What I did was swap the CPUs between my two PCs. One has an i5-12400F and the other an i5-13500.

The discrete GPUs installed in the two PCs are an RTX 3080 and an A770, respectively.

Please see the attached screenshots, which were taken with Horizon Zero Dawn's in-game benchmark. As you can see, on the machine with the RTX 3080, the performance was basically identical regardless of CPU. In contrast, with the A770, the performance showed great dependency on the CPU. Also, you will notice that the CPU frame time (= 1000 ÷ fps milliseconds) became unreasonably high with the A770 compared to the RTX 3080. I think this is quite clear evidence that Arc's driver has a serious overhead problem, and it suggests why many Arc users with mid-range CPUs have been seeing performance below expectations.
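For reference, the frame-time conversion used above (CPU frame time = 1000 ÷ fps milliseconds) as a minimal Python sketch; the example fps value is illustrative:

```python
def frame_time_ms(fps: float) -> float:
    """Average frame time in milliseconds for a given average fps."""
    return 1000.0 / fps

# At a 65 fps average, each frame takes roughly 15.4 ms overall; a CPU
# frame time close to that number means the CPU, not the GPU, is the
# bottleneck.
overall = frame_time_ms(65.0)
print(round(overall, 1))  # prints 15.4
```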

RonaldM_Intel
Moderator

@Suzie1818 

I see your findings, and I'll proceed to re-test this with an i5 processor (just ordered some to be shipped to me) and my A750 and A770.

Once I have more data, I'll post it here (bear with me, as I'm juggling other issues, so approximately 1-2 weeks).

 

Best Regards,

Ronald M.

Suzie1818
New Contributor I

@RonaldM_Intel 

 

I just did another test for you. I ran the in-game benchmark of "Shadow of the Tomb Raider" on my two PCs. As mentioned above, one is an i5-12400F/A770 and the other an i5-13500/RTX 3080.

Please see the attached screenshots. Both environments used the same graphics settings, as shown.

Compared with "Horizon Zero Dawn," "Shadow of the Tomb Raider" is obviously much less CPU-intensive, but the A770's graphics score in this game was still only ~52% of the RTX 3080's, which is only comparable to an RTX 3060. Besides, the CPU frame time (= 1000 ÷ fps milliseconds) on the two test machines showed an abnormally big difference, which again indicates that Arc's driver has a severe overhead problem consuming excessive CPU resources. In the previous test with "Horizon Zero Dawn," a particularly CPU-intensive game, the A770 paired with an i5-12400 got only ~43% of the RTX 3080's graphics score, which is only comparable to an RTX 2060. So pathetic.

Please note that "Horizon Zero Dawn" and "Shadow of the Tomb Raider" are both DX12 games, which should be what the Arc GPU is really good at. (The A770 can perform much worse in DX11 games such as the topic of this thread, Assassin's Creed Origins.) However, even under these favorable conditions, the benchmark scores still show that Arc's driver leaves much to be desired.

Allow me to reiterate: when you are testing results while developing new drivers, you should not use a top-end CPU like the i9-13900K, which only conceals the driver's real shortcomings and keeps you from seeing clearly which weak spot should be fortified first. Besides, it is simply unrealistic to pair an Arc A7 GPU with that kind of CPU when building a PC in real life. You can do it for fun in your lab, but not in real life.

Also, as I have seen in the Reddit IntelArc community, many people consider an Arc A7 GPU as an upgrade option for their old PC. Please take this into consideration. If Arc's driver remains as CPU-dependent as it currently is, you can imagine the disaster it will bring to old systems, and people will turn to other GPU brands because they can deliver nearly the same performance on old CPUs.

RonaldM_Intel
Moderator

@Suzie1818 I apologize for the late reply.

I was able to set up 2 systems (i9-13900K and i5-13400) and saw the performance difference in Assassin's Creed Origins that you pointed out.

I have already reported this to the driver debug team for further investigation. We are working diligently to improve the performance of our driver (as noted by multiple specialized tech media over the past 4 months), and rest assured that more improvements are on their way.

 

Thank you for the time you have taken performing multiple tests and reporting back to us.

 

Best Regards,

Ronald M.

 

Suzie1818
New Contributor I

@RonaldM_Intel 

 

Thank you for your diligence in repeating the same experiment yourself.

 

As far as I understand, DX11 is a big headache because it requires per-title optimization in the driver. Intel got the short end of the stick because, unlike Nvidia or AMD, Intel had been outside the field of mainstream gaming GPUs for a couple of decades. It is unfair to ask you guys to catch up in mere months with the work they have done over so many years.

 

In fact, I really won't mind if you end up deciding to give up on this title, AC: Origins. It is just one single game, after all, and there are countless old games out there; it might take another decade to optimize the driver to fit them all, which just sounds bizarre and impractical. It is impossible to please everybody, isn't it?

 

Right now, what I actually care much more about is a general problem I pointed out in the other thread. I do hope you guys can recognize the issue and fix it as soon as possible.

RonaldM_Intel
Moderator

Hello @Suzie1818 

This issue is now fixed with driver 31.0.101.5379. Please try it out and let me know your feedback.

 

Best Regards,

Ronald M.

Suzie1818
New Contributor I

Hello @RonaldM_Intel

Thank you for keeping me posted, even though it's a year after the discussion.

I hope you are doing well.

 

I have tried the latest driver v5379. Its performance is quite good. Not perfect yet, though. I have already given some feedback on IGCIT.

 

Driver efficiency is now one of the biggest challenges for the Arc GPUs. I appreciate the driver team's persistence and hard work.

Freakezoit
Beginner

Sorry, but I have to say it is better but not fully fixed. There is still CPU overhead and some performance issues.

 

I have done 2 separate tests (Origins internal benchmark):

 

1. 1920x1080 - Ultra details, DX11, with CapFrameX

Average fps: 76.7

Between 40 and 70 seconds after the benchmark starts, there is a massive drop in average fps.

 

2. 1920x1080 - Ultra details, DXVK (2.3), with CapFrameX

Average fps: 95.1

Between 40 and 70 seconds after the benchmark starts, there is no drop in fps here!

Overall, fps are more even.
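For anyone wanting to reproduce the DXVK run: DXVK translates the game's D3D11 calls to Vulkan via drop-in DLLs. A sketch of the usual setup, assuming a hypothetical game path and the extracted DXVK 2.3 release archive:

```shell
# Hypothetical paths - adjust to your own install. DXVK ships
# replacement d3d11.dll / dxgi.dll; placing the 64-bit versions next
# to the game's executable makes it render through Vulkan instead.
cp dxvk-2.3/x64/d3d11.dll "/c/Games/Assassins Creed Origins/"
cp dxvk-2.3/x64/dxgi.dll  "/c/Games/Assassins Creed Origins/"
```

Deleting the two DLLs again restores the native D3D11 path.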

 

Test system:

MSI B550 Gaming Plus - beta BIOS 1.83 (with ReBAR support) - latest chipset drivers installed.

AMD 5900X (PBO +200)

Intel Arc A770 LE - driver 5379 - BIOS 1068 (ReBAR enabled)

GPU clock @ load: 2478 MHz

32 GB DDR4-3800, dual rank, CL18

Acer 2 TB industrial NVMe PCIe 3.0 x4 SSD (game drive)

OS: Win10 Home - latest updates installed.

 

The attached 7z files contain the internal benchmark file and the CapFrameX JSON.

 

 

 
