Software Archive
Read-only legacy content

Driver for GMA X3000 (HW T&L)

directcpp
Beginner
Hi.
Please, when will Intel release drivers for the GMA X3000 (with HW T&L and Vertex Shader 2.0 or 3.0)?
In February?

prashu162
Beginner
October has ended and there was no driver release that month; it's a shame.
Wake up, Intel... will you ever provide fully functional drivers for the GMA X3000?
davidc1
Beginner
tcdelaney: Do you have Vista? On Vista, Guild Wars is listed among the games for which software processing is faster.

prashu162: What is your CPU? If you have a slow CPU, I think you can forget about the better-performance part and just expect better compatibility.

Some juicy GMA X3x00 info: http://www.4gamer.net/games/038/G003822/20070926019/
tcdelaney
Beginner

Nope - XP. The difference between something like a Celeron M and a T7xxx is so great that I don't see how a "standard" set of games can be found that works better on SW vs. HW, unless you just settle for what works better with a Celeron M (which is what I think has been done).

Because there can be such a huge difference in performance, I think there needs to be a graphical way to set these properties.

prashu162
Beginner
DavidC1: Thank you very much for your info, and I hope the drivers will be released soon.
Info regarding my system:
Processor: Intel Core 2 Duo E6400 2.13 GHz
Motherboard: DG965RY
Graphics: GMA X3000
RAM: 2 GB
Can I hope for better performance with this configuration, or should I think about a new GPU?

I am able to play Far Cry, but the textures aren't good, and they haven't gotten any better with the new drivers. The water is not visible at all; only a blue stretch appears, and the ground textures have the same problem.
But Far Cry is listed in the GMA X3000 game compatibility list.
Can anybody help? Is anyone else having the same problem with Far Cry?
davidc1
Beginner

You'll see it if you run the demo version of Far Cry or, as I heard, play the older unpatched version. Update your Far Cry patches, as they should fix the graphical errors; early patches even report graphics problems with Nvidia cards.

Take a look here: http://media.ubi.com/emea/farcry/Readme_133.txt

The Far Cry patch notes show quite a lot of fixes related to graphical glitches alone.

I want to test it, but I think I'll just wait for you to test it out. Not many people with G965/GMA X3000 have a CPU as fast as mine (E6600), so my experience will probably turn out better than others'.

As for other games like FIFA and Spider-Man, I can't say because I have not run them, but I can assure you that Intel's drivers have a LONG way to go before they are like ATI's/Nvidia's. Actually, it's not all because Intel's drivers are bad compared to ATI/Nvidia; it's also because ATI and Nvidia make exceptional drivers. I don't think Intel will catch up to the quality of ATI's/Nvidia's drivers, if ever. Drivers were the reason SiS with their Xabre and Matrox with their Parhelia couldn't even touch those two manufacturers.

We all wish Intel could make high-quality drivers, but from my experience with their desktop boards and such, they are a hardware company, not a software company. Certain things won't come easily even for a company like Intel; pouring resources into driver development won't make Intel's drivers like ATI's/Nvidia's. They will need experience and time.

There are rumors that Intel will have problems with their high-end graphics project, Larrabee (on which, unlike integrated graphics, they'll be using many more programmers/engineers). Still, if anything, Intel isn't doing too badly; I am glad it's such an advancement over the older GMA and Extreme series. Just don't expect them to become the equal of ATI/Nvidia.

My experience is very positive. I am sure that current flaws, like some games not running, will be fixed in the next version.

tcdelaney: "Nope - XP. The difference between something like a Celeron M and a T7xxx is so great that I don't see how a 'standard' set of games can be found that works better on SW vs. HW, unless you just settle for what works better with a Celeron M (which is what I think has been done)."

This is done purposely, of course. It's called market segmentation. Core 2 Duo systems should be faster than Celeron systems, and Intel is a CPU manufacturer. I don't think they'll change it.

(BTW, from the naming, the 15.7 Vista driver sounds like it'll be a DX10-enabling driver. It'll be fun waiting for proper DX10 drivers, that's for sure :P)

prashu162
Beginner
Does anybody know when Intel will be releasing the next version of its drivers for Windows XP?
Aaron_B_Intel
Employee

It was supposed to be this past week. They just reported to OEMs that it was pushed back a week or so because they had some critical "display issues".

15.7 is not DirectX 10. That's not showing up until '08.

tcdelaney
Beginner

Hi DavidC1,

I think you've misunderstood me. Because HW is the default in 14.31.x for games not explicitly forced to SW vertex shaders, T7xxx CPUs are limited by the X3100 to performance similar to a Celeron M in most reasonably recent games. This is why a lot of people saw reduced performance going from the 14.29 to the 14.31.x drivers: in 14.29, SW was the default (and only) option.

This isn't market segmentation. It would be more akin to market segmentation if Intel engineers had gone through games and classified which work better with SW and which with HW on each architecture, and put that into the drivers (though that's still not really market segmentation, just being able to use the hardware you have). I'm not suggesting that this is what they should do (it would take a lot of time that could be better spent elsewhere), but rather that users be given a simple way to do this classification themselves.

It would also be good if the driver could (at the user's option) interact with a database of such classifications, including whether SW or HW is required for correct behaviour: see ToEE for a game that needs SW to work correctly with 14.31.1, and Bloodlines for a game that needs HW. This would assist users (select a game, look up the optimal settings for your CPU type, or try HW/SW manually and add the results to the database for other people) and could be used by Intel engineers to flag incompatibilities.
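To make it concrete, such a database could be as simple as a lookup table keyed by game executable. The sketch below is purely illustrative: the type, the executable names, and the layout are my own invention, not part of any Intel driver.

// Illustrative only: a user-maintained table mapping game executables
// to the vertex-processing mode reported to work best. The entries
// mirror reports in this thread (ToEE needs SW on 14.31.1, Bloodlines
// needs HW); the executable names are assumptions.
#include <iostream>
#include <map>
#include <string>

enum class VertexMode { Hardware, Software };

int main() {
    std::map<std::string, VertexMode> gameModes = {
        {"toee.exe",    VertexMode::Software},  // Temple of Elemental Evil
        {"vampire.exe", VertexMode::Hardware},  // Bloodlines
    };

    std::string exe = "toee.exe";
    auto it = gameModes.find(exe);
    if (it == gameModes.end())
        std::cout << exe << ": no entry, use the driver default\n";
    else
        std::cout << exe << ": force "
                  << (it->second == VertexMode::Software ? "SW" : "HW")
                  << " vertex processing\n";
}

A shared, user-editable table of this shape would let each CPU/GPU combination accumulate its own known-good settings over time.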

davidc1
Beginner
tcdelaney:

I think you've misunderstood me. Because HW is the default in 14.31.x for games not explicitly forced to SW vertex shaders, T7xxx CPUs are limited by the X3100 to performance similar to a Celeron M in most reasonably recent games. [...]

Recent games like Far Cry? Actually, I haven't noticed any situations where recent games had reduced performance compared to the 14.29 drivers. Older games rely on pixel throughput, which is why the CPU runs them faster, since the GMA X3000 lacks pixel processing power. In games that need more shader performance, though, the GMA X3000 will beat a Core 2 Duo; Far Cry is 2x faster in hardware mode, for example.

Now, games like Guild Wars and World of Warcraft are different from the first-person shooters. They claim to be DX9 games but are really souped-up DX7/8 games pushing those versions to the max.

I understand that games like Guild Wars run in hardware mode by default on XP and that this is wrong, but the vast majority of games run fine. What they really need is a graphics driver that does dynamic performance profiling of games, finding out which games would run faster in which mode on a given piece of hardware, so everyone would gain the benefit.
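In rough terms, the profiling could work like the sketch below: time a burst of frames in each vertex-processing mode and keep the faster one. This is only an outline of the idea; renderFrame() is a placeholder, and no real driver interface is shown.

// Outline of per-game profiling: measure average frame time in each
// vertex-processing mode, then keep whichever mode is faster.
#include <chrono>
#include <iostream>

enum class VertexMode { Hardware, Software };

// Placeholder for drawing one frame of the running game in a given mode.
void renderFrame(VertexMode /*mode*/) { /* ... */ }

double msPerFrame(VertexMode mode, int frames = 100) {
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < frames; ++i) renderFrame(mode);
    std::chrono::duration<double, std::milli> elapsed =
        std::chrono::steady_clock::now() - start;
    return elapsed.count() / frames;
}

int main() {
    double hw = msPerFrame(VertexMode::Hardware);
    double sw = msPerFrame(VertexMode::Software);
    std::cout << "Profile says: keep "
              << (hw <= sw ? "HW" : "SW") << " vertex processing\n";
}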

You should compare it against other integrated chipsets like the AMD 690G on XP, and you'll see the X3000 is not as bad as you think. That makes sense, considering that's the kind of product it's meant to be compared with.
davidc1
Beginner
archibael:

It was supposed to be this past week. They just reported to OEMs that it was pushed back a week or so because they had some critical "display issues".

15.7 is not DirectX 10. That's not showing up until '08.

Damn. Anyway, you mentioned a 15.6.2 driver earlier. What is the 15.6.2 driver, and what is 15.7? If 15.7 is coming very soon, does that mean 15.6.2 is being skipped? That's why I assumed 15.7 was the DX10 driver.

Very bad news on the DX10 driver front. It's delayed again; it's not Q1 2008:

http://www.4gamer.net/games/038/G003822/20070926019/

DX10/SM4.0 in Q2 '08
OpenGL 2.0 in Q3 '08

They are surely going to have driver problems with their discrete graphics project if they are having this hard a time with integrated. And it looks like they might completely change the hardware again, meaning whole new drivers. For Larrabee they should just expand on the GMA X3000/X3100 (more unified shaders, more texture cache, etc.) rather than build a new architecture, so they have some driver foundation to work from.
prashu162
Beginner
My configuration again:
Processor: Intel Core 2 Duo E6400 2.13 GHz
Motherboard: DG965RY
Graphics: GMA X3000
RAM: 2 GB

DavidC1, you didn't reply to me: can I hope for better performance with this configuration, or should I think about a new GPU?
whitehat1
Beginner
Hi, new X3100 user here. Laptop specs:

Celeron 1.86 GHz
1 GB RAM
X3100 with the 14.31.1 driver

3DMark03: 1217
3DMark05: 537

AOE3 is highly playable on High (note: High, not Very High) with everything on except vsync; I would say 20-25 fps.

FlightGear is also good, but only in windowed mode.

Since we are waiting for the new drivers, is it possible that Intel is facing major problems due to the hardware and is just not able to make it do what they want? For an integrated chip which only costs a couple of dollars more on top of the motherboard, this thing is quite amazing, but as far as developing it to its full potential goes, should we be worried?

I mean, a change of architecture for the upcoming discrete GPU does not sound good for the life of the X3000/X3100, etc.
Aaron_B_Intel
Employee

15.6.2 contains bug fixes for some video issues, for some refresh rates at certain resolutions, and for persistence of the color correction settings. It has been released to OEMs, but I have not seen it show up on Intel's Download Center. Perhaps Intel didn't feel it was an important enough release to put out in the wild unless users were specifically in need?

15.7 has a number of other bug fixes, many of which are gaming-related, including proper reporting of the fixed-function T&L capabilities. Again, I'm not clear on why it was delayed past the original release date (last week), but it's evident something serious was found in beta.
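For context on why the caps reporting matters: games typically choose between their HW and SW paths by querying the Direct3D 9 device caps at startup, so if the driver misreports fixed-function T&L or the shader version, a game can wrongly fall back to software or refuse to run. A minimal check of the relevant bits looks roughly like this (standard D3D9 API; error handling trimmed):

// Query the Direct3D 9 caps the way a game would at startup.
// Build against the DirectX 9 SDK (link d3d9.lib).
#include <d3d9.h>
#include <stdio.h>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                     D3DDEVTYPE_HAL, &caps))) {
        // Fixed-function hardware transform & lighting.
        printf("HW T&L:  %s\n",
               (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) ? "yes" : "no");
        // Vertex shader model exposed by the driver.
        printf("VS 2.0+: %s\n",
               caps.VertexShaderVersion >= D3DVS_VERSION(2, 0) ? "yes" : "no");
    }
    d3d->Release();
    return 0;
}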

As for Larrabee, it's a completely different team on a completely different architecture for a completely different market. I'm not sure how many of the learnings in one would be valid in the other, but I'm sure the teams are in communication where it counts. Regardless, the basic graphics driver architecture for chipsets is not slated to change for at least the next several years, so the X3xxx appears to be in the support model for some time to come.

prashu162
Beginner
How do you people know that the new drivers have been released to OEMs?
davidc1
Beginner
archibael:

As for Larrabee, it's a completely different team on a completely different architecture for a completely different market. I'm not sure how many of the learnings in one would be valid in the other, but I'm sure the teams are in communication where it counts.

There are already rumors that Larrabee might have driver trouble. I saw it on the Fudzilla site, so I don't know how true it is, but they do have a bit of legitimacy, so it may be true.

Archibael is an Intel guy; not necessarily working on the GMA project, but a hardware guy regardless :P. That's how he knows a little bit more than the rest of us.

prashu162
Beginner
That's cool!!!
So, archibael, I want to know whether the forthcoming drivers will enable AGP texture acceleration or not.
Aaron_B_Intel
Employee
DavidC1:

There are already rumors that Larrabee might have driver trouble. I saw it on the Fudzilla site, so I don't know how true it is, but they do have a bit of legitimacy, so it may be true.

I have serious doubts that Larrabee has any driver troubles whatsoever as of 11/08/07. But I will say no more. :)


Archibael is an Intel guy; not necessarily working on the GMA project, but a hardware guy regardless :P. That's how he knows a little bit more than the rest of us.

Definitely not working on GMA. I would love to, but I am allergic to Hillsboro and Folsom. :)

Aaron_B_Intel
Employee

prashu162:
That's cool!!!
So, archibael, I want to know whether the forthcoming drivers will enable AGP texture acceleration or not.

It's not in any of the documentation, so I have no clue. Sorry!

davidc1
Beginner
archibael:

I have serious doubts that Larrabee has any driver troubles whatsoever as of 11/08/07. But I will say no more. :) [...]

Well, that's good; at least they can make drivers. Though I have to tell you, nobody said anything about the GMA X3000 having driver problems until a month before release, so I still have my doubts. But they had better do something about their integrated graphics.

Prashu162: I don't know why AGP texture acceleration matters. Look here for the explanation: http://www.intel.com/support/graphics/sb/CS-009689.htm

People are underestimating how powerful CPUs are. Dedicated processing doesn't mean it's always faster. Modern CPUs like the Core 2 Duo are very powerful, even when running specialized code like graphics; a simple integrated card will have a hard time beating them. You won't get the performance of even a last-generation budget dedicated GPU out of integrated graphics, though.
clairvoyant
Beginner
archibael, I see you're the guy to go to if you want to get some info from Intel, so my question is:
HAS INTEL ABANDONED DRIVER DEVELOPMENT FOR THE X3000/X3100?
Which would be SO lame of them [:@]
Are the rumors true or not?

If not, when do you think new drivers should be out?
davidc1
Beginner
Clairvoyant:

HAS INTEL ABANDONED DRIVER DEVELOPMENT FOR THE X3000/X3100? [...] If not, when do you think new drivers should be out?

From what Archibael said earlier in this thread, they haven't abandoned driver development (in fact, nowhere near it), but they have so few resources and so little manpower to develop it quickly that it almost looks abandoned :P.