Developing Games on Intel Graphics

Big OpenGL performance drop when porting from fixed-function to shaders

Andrew_McDonald
Beginner
I have some fixed-function OpenGL code that I'm upgrading to use shaders. The code work is done, but I'm seeing a huge performance drop: the frame rate in a typical scene has halved. Some parts of the scene still use the fixed-function pipeline, though. Are there any obvious gotchas that could account for this?

Off the top of my head, the code looks comparable. I've replaced the fixed-function setup, which used one directional light with ambient and diffuse only, with a simple vertex shader doing per-vertex directional ambient/diffuse lighting and a trivial pixel shader doing one texture lookup modulated with the interpolated colour. Previously the code called glInterleavedArrays; now I make three calls to glVertexAttribPointer instead (not using vertex array objects). The vertex and index data are stored in buffer objects. The rendering looks identical.
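To make that concrete, the new path boils down to something like the following (an illustrative reconstruction with made-up identifiers, not the actual game code; it assumes GLEW or a similar loader has provided the GL 2.0 entry points):

#include <GL/glew.h>

/* Attribute indices, pinned with glBindAttribLocation before linking. */
enum { ATTR_POSITION = 0, ATTR_NORMAL = 1, ATTR_TEXCOORD = 2 };

/* Vertex shader: one directional light, ambient + diffuse, per vertex. */
static const char *vs_src =
    "#version 120\n"
    "attribute vec3 a_position;\n"
    "attribute vec3 a_normal;\n"
    "attribute vec2 a_texcoord;\n"
    "uniform mat4 u_mvp;\n"
    "uniform mat3 u_normal_matrix;\n"
    "uniform vec3 u_light_dir;\n"        /* unit vector, eye space */
    "uniform vec4 u_ambient;\n"
    "uniform vec4 u_diffuse;\n"
    "varying vec4 v_colour;\n"
    "varying vec2 v_texcoord;\n"
    "void main() {\n"
    "    vec3 n = normalize(u_normal_matrix * a_normal);\n"
    "    float ndotl = max(dot(n, u_light_dir), 0.0);\n"
    "    v_colour = u_ambient + u_diffuse * ndotl;\n"
    "    v_texcoord = a_texcoord;\n"
    "    gl_Position = u_mvp * vec4(a_position, 1.0);\n"
    "}\n";

/* Pixel shader: one texture lookup modulated with the interpolated colour. */
static const char *fs_src =
    "#version 120\n"
    "uniform sampler2D u_texture;\n"
    "varying vec4 v_colour;\n"
    "varying vec2 v_texcoord;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(u_texture, v_texcoord) * v_colour;\n"
    "}\n";

/* Replaces glInterleavedArrays(GL_T2F_N3F_V3F, 0, 0): three attribute
   pointers into the same interleaved VBO, with no vertex array object. */
static void bind_vertex_format(GLuint vbo)
{
    const GLsizei stride = 8 * sizeof(GLfloat);  /* 2 tex + 3 normal + 3 pos */
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableVertexAttribArray(ATTR_TEXCOORD);
    glVertexAttribPointer(ATTR_TEXCOORD, 2, GL_FLOAT, GL_FALSE, stride,
                          (const void *)(0 * sizeof(GLfloat)));
    glEnableVertexAttribArray(ATTR_NORMAL);
    glVertexAttribPointer(ATTR_NORMAL, 3, GL_FLOAT, GL_FALSE, stride,
                          (const void *)(2 * sizeof(GLfloat)));
    glEnableVertexAttribArray(ATTR_POSITION);
    glVertexAttribPointer(ATTR_POSITION, 3, GL_FLOAT, GL_FALSE, stride,
                          (const void *)(5 * sizeof(GLfloat)));
}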

The test machine is an Intel Clarkdale Core i5-660 running Windows 7 Ultimate x64 SP1, using the Clarkdale integrated GPU. The driver version is 8.15.10.2559. I don't see the same issue on another test machine with a GeForce card, so I think it's something to do with Intel's GPU or drivers.
Deepak_V_Intel
Employee
Hi Andrew,
Could you share the test application with us so we can replicate the issue and test it out? Have you tried some other applications with shaders to see whether you get the same issue?
Thanks
-deepak
Andrew_McDonald
Beginner
Thanks for the reply, Deepak (a bit late, though; it would be great if Intel could monitor these forums more closely!)

I can't share code as it's for a full game. But by chance another user had posted about an issue that sounds the same, around the same time I did:

http://software.intel.com/en-us/forums/showthread.php?t=102479&o=a&s=lr

He had made a sample to demonstrate it, so could you try testing that? If not, I'll try to make a sample of my own.
SergeyKostrov
Valued Contributor II
I can't share code as it's for a full game. But by chance another user had posted about an issue that sounds the same, around the same time I did:

http://software.intel.com/en-us/forums/showthread.php?t=102479&o=a&s=lr

[SergeyK] Where did you see a "...sample to demonstrate..."?

He had made a sample to demonstrate it, so could you try testing that? If not, I'll try to make a sample of my own.


It would be nice to have a test case.

Andrew_McDonald
Beginner
Where did you see a "...sample to demonstrate..."?


Most of the post is about that. The test scene with the sphere, in the screenshot...? The code and binary weren't actually posted, but I'm assuming he'd be up for sharing it with Intel if they asked.

Andrew_McDonald
Beginner
Hi, is there any progress on this? Was 'survivorx' able to share his test case?
Deepak_V_Intel
Employee
Hi Andrew,
We do not have the test application yet. We will follow up to get it and replicate the issue on our systems.
Thanks
-deepak
adamredwoods
Beginner
I'd like to add that I see similar results on a Dell Inspiron with the Intel G41 Express Chipset and the latest drivers (8.15.10).
I've tested a basic shader using OpenGL's GLSL and the vertex shader performance is slow; it seems as if the shader is being processed on the CPU. I've compared the exact same shader on Google Chrome's WebGL and saw high frame rates, similar to fixed-function pipeline rates.
I'm dealing with about 70,000 vertices, which is no problem in the fixed-function pipeline, but performance is badly degraded with GLSL (from 150 fps to 22 fps).
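Stripped down, my comparison looks roughly like this (a hypothetical reduction, not my exact code; now_seconds() stands in for whatever high-resolution timer you have):

#include <GL/glew.h>

extern double now_seconds(void);  /* assumed timer helper, e.g. wrapping QueryPerformanceCounter */

/* Pass-through shaders using the built-in attributes, so the identical
   vertex arrays feed both the fixed-function and GLSL paths. */
static const char *vs_min =
    "#version 120\n"
    "void main() {\n"
    "    gl_FrontColor = gl_Color;\n"
    "    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n"
    "}\n";
static const char *fs_min =
    "#version 120\n"
    "void main() { gl_FragColor = gl_Color; }\n";

/* Time the same mesh with a program bound (GLSL path) or with program 0
   (fixed-function path). Assumes the VBO/IBO are already bound. */
static double seconds_per_frame(GLuint program, GLsizei index_count, int frames)
{
    glUseProgram(program);
    glFinish();                  /* settle before starting the clock */
    double t0 = now_seconds();
    for (int i = 0; i < frames; ++i)
        glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_SHORT, 0);
    glFinish();                  /* drain the queue before stopping it */
    return (now_seconds() - t0) / frames;
}

Sticking to the built-in gl_Vertex/gl_Color attributes keeps the submitted vertex data identical between runs, so the bound program is the only variable.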
I've tried changing the 3D settings in the display control panel, but there is no noticeable effect.
The only immediate solutions I can think of are to use the ANGLE library, which translates OpenGL ES calls to Direct3D, or to use a non-Intel video card.