When will Intel release drivers for the GMA X3000 with HW T&L and Vertex Shader 2.0 or 3.0 support, please?
In February?
prashu162: ohh, so I guess we have to wait much longer for the final GMA X3000 drivers
Wait, what do you mean by "final"? Driver updates happen all the time to fix bugs and the like. "Final" would mean "no more improvements", and I don't think you want that.
If you mean "final" as in "officially supports HW T&L", you've already got it: 14.31 and 15.6 and onwards.
Agent_J: do you (or does anyone) know what the time frame is for the 14.32/15.7 drivers to be released?
Will 15.7 finally give X3000/X3100 Vista users HW T&L/Vertex Shader support?
There is a bug in the 15.6 drivers, fixed in 15.7, concerning whether HW T&L gets reported properly to some applications:
- Bug ID: 2446955
- Summary: Add VertexProcessingCaps (FFTnL caps) to reported caps for Windows Vista*
- Component: D3D
- OS: Windows Vista*
- Hardware: Intel G965 Express Chipset, Intel GM965 Express Chipset
This is due to some issues with how the Microsoft Direct3D9 runtime reports HW capabilities to querying applications, and I suspect it addresses your concern.
As for the "when", the roadmaps I've seen imply mid-to-late October, +/- a couple weeks.
My understanding is that, per the bug listed above, software which queries the capabilities of the graphics chipset through the IDirect3D9::GetDeviceCaps function call gets back a response indicating the HW T&L capability doesn't exist, so software emulation is enabled instead. For the IDirect3DDevice9::GetDeviceCaps call, the response is accurate.
Software which uses the Device variant of the call would, presumably, not exhibit this behavior. Having the capability but not reporting that it exists is probably indistinguishable from not having the capability at all in many if not most situations, so this is definitely a problem.
I'd follow the RSS feed (http://feeds.downloadcenter.intel.com/rss/?p=2842&lang=eng) and wait for 15.7 to show up. Tedious, yes, but unless a 15.6.3 is announced, it sounds like the next one available.
I sure hope they fix whatever is wrong with the new drivers, because it would suck to stay on old drivers with no T&L; it makes a lot of games unplayable.
Whatever changes were made in the 15.6.1 Vista drivers seem to completely destroy gaming performance BIG TIME! Here are my benchmarks:
3DMark2001SE
15.4.3 5562
15.6 beta 4208
15.6 final 4310
15.6.1 2872
48.4% decrease in performance from 15.4.3
3dmark03
15.4.3 1652
15.6 beta 1269
15.6 final 1710
15.6.1 963
41.7% decrease in performance from 15.4.3
3dmark05
15.4.3 867
15.6 beta 671 *Force software vertex shader off
15.6 final 882 *Force software vertex shader on
15.6.1 459 *Force software vertex shader on
47.1% decrease in performance from 15.4.3
3dmark06
15.4.3 529
15.6 beta 425 *Force software vertex shader off
15.6 final 493 *Force software vertex shader on
15.6.1 359 *Force software vertex shader on
32.1% decrease in performance from 15.4.3
I'm starting to think I screwed myself by going Celeron with the X3100, because the CPU sits at 100% and gives me 1 FPS in EVE Online when there are 30 ships to render. I hope 15.6.2 will help the FPS with T&L; if not, it might at least be easier to sell this laptop when I can say it supports T&L, hehe.
Will there be a release this month? :-(
Agent_J: I know another Vista user (imperial from VR-Zone) who gets higher scores in 3DMark06 than you do, btw. Much higher.
http://www.computerbase.de/forum/showthread.php?t=309411
The users in that thread also get much higher 3DMark06 scores than you do. Maybe you want to check something out :P.
JayPizzle: I'm on a Celeron 530, but I do not have any problems with Warcraft 3 on the 15.4.4 drivers, so that is not the problem; the drivers that Intel releases, or right now is not releasing, are the problem :(
How about on XP? You said you had much lower performance even with the XP drivers. I sure don't have that. I know the Vista drivers have a problem, but you also mentioned XP.
Agent_J, are you sure you are running a Core 2 Duo laptop? I have posted numerous times that your 3DMark scores are way too low, even for Vista. Like I said, you only get around 350 in 3DMark06 while other users with a Core 2 Duo and Vista report 500+. I would suggest you check out what is up with your system first.
Warcraft 3 had no changes in graphics quality but ran a lot smoother than it does on Vista with 15.4.4 and 15.6.1, where the 15.6.1 driver makes it unplayable, as mentioned earlier. EVE Online ran the same way as Warcraft 3.
I'm starting to think that on XP, hardware T&L is enabled and the Celeron isn't powerful enough to keep up, so graphics quality increases but the FPS drops. From what I've heard, there are problems with the drivers not triggering hardware/software T&L support correctly depending on what will run best, so I'll stick it out for the new drivers; otherwise I think I'll sell this and buy something with a real GPU :)
JayPizzle: I tested Black and White 2, Medieval Total War 2, Warcraft 3, and EVE Online on Windows XP with the newest X3100 drivers for XP. With Black and White 2 and Medieval, the games' graphics were much better, even at the lowest settings. No matter what I did, the graphics quality would remain good and the FPS would be very low. When I ran them in Vista with the 15.4.4 driver, the graphics were very bad but the FPS was a lot higher. The 15.6.1 driver for Vista did not increase graphics quality but made the games run with just as low FPS as on XP.
Warcraft 3 had no changes in graphics quality but ran a lot smoother than it does on Vista with 15.4.4 and 15.6.1, where the 15.6.1 driver makes it unplayable, as mentioned earlier. EVE Online ran the same way as Warcraft 3.
I'm starting to think that on XP, hardware T&L is enabled and the Celeron isn't powerful enough to keep up, so graphics quality increases but the FPS drops. From what I've heard, there are problems with the drivers not triggering hardware/software T&L support correctly depending on what will run best, so I'll stick it out for the new drivers; otherwise I think I'll sell this and buy something with a real GPU :)
Actually, I think this is why it's slower on XP and looks better.
It looks better on XP because, as Archibael said, the Vista driver might have a bug in its hardware mode support (hardware T&L/vertex shaders).
There are a couple of reasons why XP runs slower on your system than Vista.
1. Remember reading about the driver having the ability to switch between hardware and software mode depending on the game? Well, read on.
2. Intel enabled the ability to switch between the two modes because Core 2 Duos are superior to the GMA X3000's hardware T&L; on games that would benefit from hardware T&L, it runs the work on the CPU instead. However, on games that are shader intensive, like Battlefield 2, the X3000 performs better than the CPU.
3. However, it doesn't look like the driver is advanced enough to adapt to a different CPU (or Intel purposely did not optimize it, so Core 2 Duo users have an advantage over a user with, say, a Celeron M).
4. Therefore, people who do not have mid-to-high-end Core 2 Duo CPUs will suffer when the driver enables software mode, simply because the GMA X3000 is faster than their CPU.
That is THE reason why P4 users on XP with the GMA X3000 report bad scores while Core 2 Duo users like me don't. Because the drivers switch between software and hardware per game, running in software mode costs performance on slower CPUs.
BTW, head here for Crysis screenshots on the GMA X3000 with some performance data: http://forums.vr-zone.com/showthread.php?t=129343&page=32
archibael: Hmmm... not sure how "dynamic" it is. IIRC, there's a registry setting for each game which is run in SW mode. Check the .INF files that come with the drivers for details. I think the registry entry is the name of the game's executable with an underscore in front of it, and you turn software decode on or off with a 1 or a 0 in the entry.
Yeah, what I am saying is that Intel says it is dynamic, but I am suggesting the wide difference in performance between systems with the same OS/GPU is because it has a fixed value depending on the game. Like the programmers put in a check saying this game gets software mode, etc. Similar to your thinking, I guess. The games that benefit from running on a Core 2 Duo won't benefit from running on a Celeron, for example. If it were truly dynamic, it would run hardware mode on the Celeron all the time.
But the graphics are corrupted: gun textures and suit textures are not rendered correctly.
Yeah, what I am saying is that Intel says it is dynamic, but I am suggesting the wide difference in performance between systems with the same OS/GPU is because it has a fixed value depending on the game. Like the programmers put in a check saying this game gets software mode, etc. Similar to your thinking, I guess. The games that benefit from running on a Core 2 Duo won't benefit from running on a Celeron, for example. If it were truly dynamic, it would run hardware mode on the Celeron all the time.
True. What I was trying to get across is that, in the absence of Intel making it truly dynamic, you can go in and hack the registry yourself and try to run in HW mode on a Celeron.
I've found that there are something like 25 locations in the registry that need to be added to if you want to force HW or SW vertex shaders. Search for "_age.exe" to find them all. You need to add values named after the executable, preceded by an underscore.
I ended up creating a .reg file which I use, but unfortunately the actual location in the registry depends on how many services, etc., you have installed ...
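A sketch of what such a .reg file might look like. The key path is illustrative only: the display-adapter class GUID is standard, but the numbered subkey varies per system (as noted above), "_game.exe" is a placeholder for the real executable name, and the DWORD type is an assumption for a 1/0 flag.

```
Windows Registry Editor Version 5.00

; Placeholder path: adjust the 0000 subkey to match your system.
; "_game.exe" stands in for the actual executable name, prefixed
; with an underscore as described above.
; 1 = force software vertex processing, 0 = force hardware.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"_game.exe"=dword:00000001
```

As the poster says, the same value apparently has to be repeated in each of the ~25 locations for the override to take effect everywhere.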
Due to the massively different performance of HW vs. SW shaders on different CPUs, the Intel graphics driver configuration really needs a simple way to specify HW or SW shaders per application: a list of "forced" applications, where anything not in the list uses the driver defaults.
I've got a T7100, and SW vertex shaders are much faster than HW for me (e.g. with identical settings, Guild Wars maxes at around 25 FPS with HW but 60 FPS with SW). SW also has better compatibility: The Temple of Elemental Evil, for example, has lighting problems with HW shaders but runs perfectly with SW shaders. Vampire the Masquerade: Bloodlines, OTOH, won't run without HW shaders ...
For me, the best option would be for the driver to report HW shaders, but actually use SW shaders unless forced otherwise.