Graphics
Intel® graphics drivers and software, compatibility, troubleshooting, performance, and optimization

Intel HD 3000 driver (i7-2640M): no OpenGL 3.3 or OpenCL 1.2 CPU runtime support

idata
Employee
55,355 Views

I have an i7-2640M CPU in my laptop with the integrated Intel HD 3000 graphics (IGD). I am on Windows 8.0 x64.

I am opening this thread because I feel that Intel has forgotten about supporting Intel HD 3000 owners with two newer API versions: OpenGL 3.3 and OpenCL 1.2 (CPU).

1. I am desperately trying to find an OpenCL 1.2 CPU runtime.

I have tried this runtime, but my hardware is not supported: intel_sdk_for_ocl_applications_2013_r3_x64_setup.exe

Can Intel produce a compatible Intel SDK for OpenCL Applications runtime (x64) with CPU support for Intel HD 3000 systems? Currently the supported OpenCL version on the CPU is 1.1.
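For reference, here is a minimal sketch in plain C of how I check what the installed runtime actually reports (my own illustration, not Intel tooling; assumes the OpenCL headers and ICD loader are installed, link with -lOpenCL or OpenCL.lib). On this configuration the Intel platform comes back as OpenCL 1.1:

    /* Query every OpenCL platform and device for the version it reports. */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void)
    {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;
        if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS) {
            fprintf(stderr, "no OpenCL platforms found\n");
            return 1;
        }
        for (cl_uint i = 0; i < num_platforms; ++i) {
            char name[256], version[256];
            clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof name, name, NULL);
            clGetPlatformInfo(platforms[i], CL_PLATFORM_VERSION, sizeof version, version, NULL);
            printf("Platform: %s (%s)\n", name, version); /* e.g. "OpenCL 1.1" */

            cl_device_id devices[8];
            cl_uint num_devices = 0;
            if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_ALL, 8, devices,
                               &num_devices) != CL_SUCCESS)
                continue;
            for (cl_uint j = 0; j < num_devices; ++j) {
                char devver[256];
                clGetDeviceInfo(devices[j], CL_DEVICE_VERSION, sizeof devver, devver, NULL);
                printf("  Device reports: %s\n", devver);
            }
        }
        return 0;
    }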

2. I also feel that Intel has forgotten to add the missing features needed to fully support OpenGL 3.3 (and 3.2).

The hardware can support up to OpenGL 3.3, but the driver currently exposes only 3.1. Can you please update the graphics driver?
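Similarly for OpenGL, a small C sketch of how to see what the driver reports (assumes an OpenGL context is already current on the calling thread, created through GLUT/GLFW/WGL or similar; on Windows include <windows.h> before <GL/gl.h>). On this setup GL_VERSION comes back as 3.1:

    #include <stdio.h>
    #include <GL/gl.h>

    /* Assumes an OpenGL context is already current on this thread. */
    void print_gl_strings(void)
    {
        printf("GL_VERSION : %s\n", (const char *)glGetString(GL_VERSION));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL_VENDOR  : %s\n", (const char *)glGetString(GL_VENDOR));
    }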

PLEASE, Intel, HELP. I need the Intel HD 3000 driver to support the latest API versions the hardware can handle: OpenCL 1.2 (CPU) and OpenGL 3.3.

54 Replies
mcons2
Novice
5,951 Views

hey Panayiotis

so... not sure what you mean at 1. doesn't that laptop motherboard have a PCI Express 2.0 slot? you buy a fitting graphics card, install it, and done. I for one have a desktop computer with a Xeon inside and four such slots at my disposal, so I could install one graphics card and a further three envy-dia (shhhh... Intel competitors!) Tesla cards and start crunching numbers while having OpenGL 4.4 ready.

about OpenCL, I haven't studied it yet, but if I remember correctly it is used for heterogeneous parallel computing, so in other words you don't really know what's behind there (CPU, GPU, etc.), which can be good but also bad. in CUDA's case you know it's a bunch of GPU cores, because you have to program directly against them.
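that said, a quick C sketch of what I mean (untested by me; assumes the OpenCL headers and library are installed): you can request CPU or GPU devices explicitly instead of taking CL_DEVICE_TYPE_ALL, so you can find out what's behind there if you care to:

    #include <stdio.h>
    #include <CL/cl.h>

    /* List the devices of one explicit type (CPU or GPU) on a platform. */
    static void list_devices(cl_platform_id p, cl_device_type type, const char *label)
    {
        cl_device_id devs[8];
        cl_uint n = 0;
        if (clGetDeviceIDs(p, type, 8, devs, &n) != CL_SUCCESS)
            return; /* no devices of this type on this platform */
        for (cl_uint i = 0; i < n; ++i) {
            char name[256];
            clGetDeviceInfo(devs[i], CL_DEVICE_NAME, sizeof name, name, NULL);
            printf("%s device: %s\n", label, name);
        }
    }

    int main(void)
    {
        cl_platform_id platforms[8];
        cl_uint n = 0;
        if (clGetPlatformIDs(8, platforms, &n) != CL_SUCCESS)
            return 1;
        for (cl_uint i = 0; i < n; ++i) {
            list_devices(platforms[i], CL_DEVICE_TYPE_CPU, "CPU");
            list_devices(platforms[i], CL_DEVICE_TYPE_GPU, "GPU");
        }
        return 0;
    }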

in our case with the HD 3000, they say CPU and GPU are on the same chip, but they are still two separate entities, right? in other words the GPU should behave, as far as programming it goes, exactly like the GPU of a standalone graphics card, or otherwise the whole setup is just rubbish and everything is make-believe.

I'm going to get a graphics card with a decent number of CUDA cores from the competition and see the end of this. Intel will never update the Windows driver to OpenGL 3.3, and OpenCL seems foggy and misty at best on these obscure so-called integrated graphics cards.

cheers

Marius

0 Kudos
idata
Employee
5,951 Views

The Intel HD 3000 is integrated on the 2nd gen CPU die. Don't laugh, it is. That is why when I refer to the Intel HD 3000 I mean the 2nd gen CPU too, as they are one unit. They work independently, but the upsides are that they share fast memory and have low power consumption. 2D performance is fast but 3D is slow; not recommended for gaming. It is a laptop, so the only way to add a graphics card is through a V8.0 EXP GDC Beast laptop external independent video card dock (ExpressCard, from eBay), similar to PCIe 2.0 in terms of bandwidth. so not bad performance-wise, but it becomes a hybrid. I will probably buy it with an NVIDIA GT 720, a low-power silent graphics card. Then I will keep this laptop for another 7 years.

mcons2
Novice
5,951 Views

I know the CPU & GPU share the same chip; it's unclear to me why exactly, and I didn't really care until recently, when it hit me that I couldn't run OpenGL 3.2 features. I also usually don't like it when things get mixed together for who knows what reason, but I wanted the Xeon. I somehow believe a standalone card will always beat an integrated one, no matter what memory the latter may share with the CPU. I guess Intel offered this as a low-cost graphics option to those interested in servers rather than gaming stations, but nowadays the GPU is already a powerful computing tool in its own right, so this CPU & GPU together approach is, I think, wrong.

I set my target on an NVIDIA GeForce GT 730 2 GB GDDR5 64-bit, as it is the most powerful card still on PCIe 2; beyond that point you have to have PCIe 3, which I don't and also don't really need. enough CUDA cores to attempt interesting stuff, it supports OpenGL 4.4 and OpenCL, and it has at least good performance in games (which for me counts for nothing, as I quit playing them 20 years ago).

my honest advice is to buy yourself such a card and be done with it. even if Intel patches the Windows driver, it won't be long until you hit another limitation.

cheers

idata
Employee
5,951 Views

I agree with everything you said, except that I would go for an MSI NVIDIA GeForce GT 720 2 GB GDDR5 64-bit because of the low power consumption (19-23 W), though I cannot find one.

I will not give up on the Intel HD 3000 yet. I need OpenGL 3.3 support.

0 Kudos
mcons2
Novice
5,951 Views

according to NVIDIA, the GT 730 GDDR5 series only consumes 25 W, see here -> http://www.geforce.com/hardware/desktop-gpus/geforce-gt-730/specifications

check eBay, maybe you can find one there? your version will also almost surely cost more than the desktop one, but I think you cannot escape that.

yep, keep pressing them. I'll keep an eye on this thread myself, but I've already ordered my GT 730 today. it will set me back another 60-70 euro, but what can you do?

cheers

0 Kudos
idata
Employee
5,951 Views

Hello,

I am looking for a silent graphics card.

I do have a Sapphire HD 7750 Ultimate (silent edition, 1 GB GDDR5, PCIe 3.0), which I removed from my mini-ITX HTPC because its 2D performance did not play well with that board's PCI Express 1.0a slot (the HTPC still performs very well with an NVIDIA GT 220). However, the HD 7750 Ultimate is a hungry beast at 65 W, so I am thinking of replacing it with a GT 730 for my laptop, as per your suggestion.

Which one did you choose? What was the price? For me it will be an expensive purchase, since I need to buy three components: the external ExpressCard graphics dock, the graphics card, and a power supply.

Power supply: 35 euro

External graphics card dock: 65 euro

Graphics card: 100??

That sums to about 200 euro. Is the upgrade worth it? I have thought about it, and it does not make any sense.

I will continue pressuring Intel to improve the Intel HD 3000 driver with just OpenGL 3.3.

Thanks

0 Kudos
mcons2
Novice
5,951 Views

well I went for this exact product -> http://www.gigabyte.com/products/product-page.aspx?pid=5113

because it has a large memory bandwidth (40 GB/s), 384 CUDA cores, 2 GB GDDR5 (not DDR3!), and a 902 MHz clock. it costs 65 euro here where I am. the only shortcoming is that it has a 64-bit memory interface width (instead of the 128 bits usual today), which I hope is made up for by the other features listed above. I need it for software development, so it should be more than enough anyway.

power consumption was not an issue on my list since I have a desktop, but as for running quietly, all the reviews I've read attested to that: no sound whatsoever.

yep, I guess you're looking at a different cost list than me, although I might find that I need to upgrade my power supply as well; I forgot to check, so I'll do it when I have the card installed. eh, it had to happen sometime anyway, so might as well do it on this occasion.

if you only need OpenGL 3.3 and this will be the case for the foreseeable future, then I wouldn't spend the 200 euro. but if in one or two years' time you will need OpenGL 4, then the HD 3000 will clearly be out of its league, you won't find external graphics docks to fit your laptop anymore, and then you're stuck.

I guess if you look hard enough you may find a second-hand laptop for close to 300 euro, which is better than what you have to begin with.

so you'll have to do the math on this, but by all means keep banging on Intel's door

cheers

0 Kudos
idata
Employee
5,951 Views

The way I see it, Intel did not keep their promise of support. Their lack of support will cost me 200 euros, so they owe me 200 euros, plus the frustration, disappointment, and time lost writing here to convince them to do what they should have done already.

DEVELOP DRIVER SUPPORT FOR THE APIS THE HARDWARE SUPPORTS, AND SUPPORT YOUR CLIENTS.

What is their motto? Is it to develop excellent hardware and then support it for only 3-4 years? Is the client irrelevant? Are the hardware-supported API versions irrelevant?

Intel, will you support OpenGL 3.3, which you should have done a long time ago? Please respond.

idata
Employee
5,951 Views

I have one question.

Why only an x8 bus interface? They used to make them with the better x16 bus interface.

0 Kudos
idata
Employee
5,951 Views

Dear EstebanC_Intel,

I have sent you the information you requested, but I have not received any response since then.

When will I have a formal reply from the OpenGL development team?

Thanks

Regards

Panayiotis

0 Kudos
EstebanA_C_Intel
Employee
5,951 Views

Hello, palmiris:

The team is currently reviewing the case, there is no ETA so far for the possible support for this.

As soon as I get their outcome, I will certainly let you know.

Regards,

Esteban C

0 Kudos
mcons2
Novice
5,951 Views

I think the _Intel guys stopped reading this thread days ago! or are you still there, guys? if so, what's cooking with our HD 3000 Windows driver?

as I was saying, I don't think they'll ever do it, and legally we'd have no case against them, as they never promised OpenGL 3.3; one _Intel guy was showing you the specifications at some point. it was more or less by accident that you noticed the Linux driver supports 3.3 and then showed them that reality, which they finally had to accept. but if you look at it, it's hard to tell what the actual truth of the matter is: is the Linux driver supporting too much, or the Windows driver too little? if Intel never promised OpenGL 3.3 in the Windows driver in their specs, then it's dead & done; they'll never do it.

yeah, 200 euro just to get OpenGL 3.3 is ridiculous, but maybe you get lucky on eBay? try ebay.de; there are plenty of sellers there, and shipping out of Germany is affordable, about 15 euro. well yes, that will add to the bill, but maybe you'll get the parts cheap.

so this all makes us two frustrated angry-birds customers, but that's it for them, they don't really care. or do you, Intel? nah, you don't!

0 Kudos
idata
Employee
5,951 Views

Thanks for making me change my mood.

They will care, because otherwise, in the long term, they will lose customers and developer support. Won't you, Intel??

What makes a good company is not the product itself but the service received too. Maybe Intel has to be reminded a few times; too much pressure to develop drivers for newer products has caused developers to behave like robots: just do what is necessary and nothing else, with no human touch or sensitivity.

0 Kudos
mcons2
Novice
5,951 Views

I asked myself this, and I don't know why the GT 730 only has PCIe 2 x8 instead of x16. the 128-bit versions are x16 and the 64-bit ones are x8, but the latter feature the more powerful processor, more CUDA cores, more memory bandwidth, etc., and in my view all of these weigh more than x16 vs. x8, so overall the 64-bit/x8 card should, presumably, behave better than the 128-bit/x16 one.

I guess this would make a good question on the NVIDIA forum, if they have one.

I don't think that dissatisfying us on this subject will affect Intel that much. we were customers a few years ago when we bought their products; now we're just nagging them.

but I hope they update the driver, hence making us buy more Intel in the future.

cheers

0 Kudos
idata
Employee
5,951 Views

I concur with all of it, especially the last comment regarding Intel: "hope they update the driver hence making us buy more Intel in the future."

I was thinking that maybe I can use GLEW 1.13 to reach the OpenGL 3.3 features through the extensions the Windows driver already exposes. Maybe this can be done. I have to do more reading on the subject, but I am not holding my breath for a positive outcome.
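If I try it, the check would look roughly like this minimal C sketch (assuming GLEW 1.13 is installed and an OpenGL context is already current; note GLEW only reflects what the driver advertises, it cannot add features the driver lacks):

    #include <stdio.h>
    #include <GL/glew.h>

    /* Print whether the 3.3-level features are reachable even though the
     * context version string says 3.1. Call with a context current. */
    void check_gl33_support(void)
    {
        if (glewInit() != GLEW_OK) {
            fprintf(stderr, "glewInit failed\n");
            return;
        }
        printf("GL 3.3 core:                  %d\n", (int)GLEW_VERSION_3_3);
        printf("ARB_timer_query:              %d\n", (int)GLEW_ARB_timer_query);
        printf("ARB_occlusion_query2:         %d\n", (int)GLEW_ARB_occlusion_query2);
        printf("ARB_explicit_attrib_location: %d\n", (int)GLEW_ARB_explicit_attrib_location);
        printf("ARB_texture_multisample:      %d\n", (int)GLEW_ARB_texture_multisample);
    }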

0 Kudos
idata
Employee
5,951 Views

Just to show you the other side of support: OpenGL 3.3 is included in the iMac Intel HD 3000 driver too (as well as on Linux, as already discussed). Intel, please support the Intel HD 3000 graphics driver on Windows with OpenGL 3.3; better late than never. I understand that you want to support newer technologies, but you should also support your "older" products with the API features the hardware supports.

Model | GPU | OpenGL | OpenCL

Mac mini (Mid 2011) | Intel HD 3000 | 3.3 | N/A

MacBook Pro (13-inch, Early 2011) | Intel HD Graphics 3000 | (rest of row truncated in original)

MacBook Pro (13-inch, Late 2011) | Intel HD Graphics 3000 | (rest of row truncated in original)
0 Kudos
idata
Employee
5,951 Views

What do you think of this? You would expect Intel to have done the most development on the Intel HD 3000 driver for Windows compared to other operating systems. Well, no: the Intel development team did the least development on the Windows driver, and still will not give me a positive reply on OpenGL 3.3 driver support for Windows. Apple says here that it supports OpenGL 4.0 on the Intel HD 3000. Wow.

In summary, the article below says that by giving up on feature support, basically at the release of an architecture, Intel has been able to provide pretty good drivers on the newer architectures, thus punishing older clients. Is this punishment because they did not buy this year, having bought Intel products a couple of years ago instead? An interesting assumption, but in reality Intel will lose the clients who bought Intel products and were not supported with new hardware-supported API features and technologies, such as updating OpenGL 3.1 to OpenGL 3.3, supporting the Windows 8.1 driver with WDDM 1.3, or having WiDi compatibility in Windows 10, a feature that is on the Intel processor and advertised with it.

Why would someone who bought a mobile Intel CPU with an IGP such as the Intel HD 3000 buy the same configuration again if Intel's developers will not support the driver with new features? They would either buy a mobile CPU without graphics or buy from the competition instead. Intel, is this what you want to accomplish, to drive customers away? I hope you recognize this mistake before you start losing clients.

Sandy Bridge (HD 2000 / HD 3000) is 9.8% of Unity editor users and one of the most used GPU architectures.

http://www.g-truc.net/doc/OpenGL%204%20Hardware%20Matrix%202015-06.pdf

Exposed extensions on all OpenGL 4 hardware, on the latest drivers available to date

and here is the rest of the article, source: http://www.g-truc.net/post-0729.html (June 2015 OpenGL hardware matrix: Intel Sandy Bridge)

In practice, Sandy Bridge supports OpenGL 3.1 with all the OpenGL 3.2 extensions except ARB_texture_multisample (https://www.opengl.org/registry/specs/ARB/texture_multisample.txt).

Additionally, Sandy Bridge supports six OpenGL 3.3 extensions: ARB_vertex_type_2_10_10_10_rev (https://www.opengl.org/registry/specs/ARB/vertex_type_2_10_10_10_rev.txt), ARB_timer_query (https://www.opengl.org/registry/specs/ARB/timer_query.txt), ARB_texture_rgb10_a2ui (https://www.opengl.org/registry/specs/ARB/texture_rgb10_a2ui.txt), ARB_shader_bit_encoding (https://www.opengl.org/registry/specs/ARB/shader_bit_encoding.txt), ARB_occlusion_query2 (https://www.opengl.org/registry/specs/ARB/occlusion_query2.txt) and ARB_explicit_attrib_location (https://www.opengl.org/registry/specs/ARB/explicit_attrib_location.txt).

Finally, Sandy Bridge supports three OpenGL 4.0 extensions: ARB_texture_query_lod (https://www.opengl.org/registry/specs/ARB/texture_query_lod.txt), ARB_texture_buffer_object_rgb32 (https://www.opengl.org/registry/specs/ARB/texture_buffer_object_rgb32.txt) and ARB_draw_buffers_blend (https://www.opengl.org/registry/specs/ARB/draw_buffers_blend.txt).

By giving up on feature support, basically at the release of an architecture, Intel has been able to provide pretty good drivers on the latest architectures (Haswell / Broadwell), both in terms of quality and feature set (OpenGL 4.3).

However, it is very unfortunate that anything older is more complex to support than a good old GeForce 8 (http://delphigl.de/glcapsviewer/gl_generatereport.php?reportID=603), released 9 years ago. Sadly, it can get worse! Architectures older than Sandy Bridge, that is GMA (OpenGL 1.4), GMA HD (OpenGL 2.0) and Ironlake HD Graphics (OpenGL 2.1), represent 12.9% of the editor users.

Lastly, Intel GPU names are dreadful: it was awful back in the GMA days, and it is still just as bad. For example, "Intel HD Graphics" (without any numbering) has been used for Clarkdale, Arrandale, Sandy Bridge, Ivy Bridge, Haswell, Bay Trail, Broadwell and Braswell.
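To verify this matrix against one's own driver, here is a small C sketch (my own illustration, not from the article; assumes a 3.x context is current and that a loader such as GLEW provides glGetStringi, which is core since OpenGL 3.0):

    #include <stdio.h>
    #include <string.h>
    #include <GL/glew.h>

    /* Walk the driver's extension list and look for one specific name. */
    int has_extension(const char *wanted)
    {
        GLint n = 0;
        glGetIntegerv(GL_NUM_EXTENSIONS, &n);
        for (GLint i = 0; i < n; ++i) {
            const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
            if (ext && strcmp(ext, wanted) == 0)
                return 1;
        }
        return 0;
    }

    /* Per the matrix above, on Sandy Bridge under Windows one would expect
     * has_extension("GL_ARB_timer_query") == 1 and
     * has_extension("GL_ARB_texture_multisample") == 0. */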

0 Kudos
idata
Employee
5,951 Views

I checked, and all the GT 730 cards are x8; I could not find an x16 GT 730.

An x8 PCIe 2.0 link is the same as an x16 PCIe 1.0 (or 1.1) link: a 1.x lane carries 250 MB/s and a 2.0 lane 500 MB/s, so both work out to about 4 GB/s. This is what puzzles me. I guess the card cannot usefully go in a PCIe 1.1 slot, as it would then behave like an x4 PCIe 2.0 card.

I did buy the ExpressCard external dock today. For testing I will use the Sapphire HD 7750 (PCIe 3.0, 1 GB GDDR5) that I purchased a couple of years ago when I tried to replace the GT 220 in my home theater PC (a mini-ITX with a mobile P9600 CPU and mobile RAM modules) but hit compatibility issues, because that motherboard has a PCIe x16 1.0a slot. I will send the bill to Intel, as advised by my lawyer, if they fail to support me. (I also bought a 6 A / 72 W DC adapter for the external PCIe graphics dock.)

Edit:

The ASUS GT730-2GD5-BRK (GT 730, 2 GB GDDR5) has all the pins, making it a full x16 PCIe 2.0 card. It consumes 25 W. However, its fan has only 2 wires instead of 3, which means the fan speed cannot be controlled by software or adjusted automatically as the temperature rises.

As for x16 vs. x8 on PCIe 2.0, it does not make much difference in real benchmark scenarios; however, I think in a PCIe 1.0 slot the difference might be noticeable.

Regards

0 Kudos
mcons2
Novice
5,630 Views

hey palmiris

so, any news yet?

it's been a week now that I have the Gigabyte NVIDIA GT 730 GDDR5 PCIe 2 x8, and it works without a glitch. it runs every piece of OpenGL software quietly and nicely. I have yet to use it for OpenCL/CUDA etc., but I don't expect any problems. x8 or x16, it's all the same to me; I bought it for software development, and I needed CUDA cores and a decent GPU frequency, which this card has plenty of (as opposed to the x16 GT 730 variants, which are very low on such resources). so hallelujah!

how does it look on your end? how much money have you spent so far?

cheers

ps

I think we should just ignore and forget about Intel when it comes to video stuff. they seem to be newbies on this subject!

0 Kudos
idata
Employee
5,630 Views

I have not had a reply from Intel yet. It seems the development team is not prepared to diversify their goals by supporting previous technologies. If that is their stance, they have lost me as a client.

Regarding alternatives, yes, you were right.

I bought the following, as per your GT 730 suggestion:

PCIe graphics card dock

AC/DC power supply for the dock, 6 A 12 V

Graphics card: ASUS GT 730, 2 GB GDDR5, 64-bit, PCIe x16, max power consumption 25 W

I will have the graphics card tomorrow, since it was ordered locally, but I am still waiting for the rest, which was ordered from China.

Total cost: 65 (dock) + 10 (power supply) + 90 (ASUS GT 730)

Total Intel bill: 165 euro. I will send them the bill if they reject OpenGL 3.3 support.

0 Kudos
mcons2
Novice
5,630 Views

well yes, you invested a bit of money, but at least you can move on with your work, because if you wait for Intel to change anything... you'll be 100 and still waiting! they won't do it!

you went for ASUS, and they're quite a bit pricier than Gigabyte, but in my case it made no difference. I think I looked for an ASUS myself here and couldn't find the exact same configuration; it was, I think, the same GT 730 but with 1 GB of memory instead of 2. you also say yours is x16 but on PCIe 1; mine is x8 on PCIe 2, so as you said, probably the same in the end.

if you do send them the bill, let me know, as I'll add the 65 euro I had to pay for my card.

out of curiosity, are you in Greece?

cheers

0 Kudos
Reply