Graphics
Intel® graphics drivers and software, compatibility, troubleshooting, performance, and optimization
22,680 Discussions

Integer texture support in Intel HD 4600

MJone15
Beginner
3,606 views

I am having a problem allocating 64x64 OpenGL textures of the format ALPHA16I_EXT (internal) / ALPHA_INTEGER_EXT.
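For context, the allocation in question looks roughly like this (a hypothetical sketch; the data pointer and the filtering setup are assumptions about typical usage):

```c
/* Hypothetical sketch of the failing 64x64 integer-alpha allocation,
 * using the GL_EXT_texture_integer tokens. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
/* Integer textures cannot be filtered; NEAREST is required. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_ALPHA16I_EXT,        /* internal format */
             64, 64, 0,
             GL_ALPHA_INTEGER_EXT,   /* pixel format */
             GL_SHORT,               /* data type */
             data);                  /* 64*64 GLshort values */
```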

GPU caps lists the GL_EXT_texture_integer extension as being available. The driver is 10.18.10.3345 (10-31-2013).

The machine is a Dell with a hybrid NVidia/Intel configuration.

My question is more a driver question than a dev question: should the HD 4600 support this texture format? Is there limited support?

Thanks.

1 Solution
ROBERT_U_Intel
Employee
1,910 views

Hi Michael

The latest version of the OpenGL spec recommends against using the alpha formats when only one component is needed.

Here's the list of formats, with the deprecated ones highlighted in red (they are no longer part of the core version of the spec).

So my recommendation would be to use R16I/RED_INTEGER instead.

Thanks

Robert


4 Replies
ROBERT_U_Intel
Employee
1,910 views

Hi Michael

I will check with our OGL developers and get back to you.

Thanks

Robert


MJone15
Beginner
1,910 views

Thanks. Offhand, I think that should work for us.

I will give it a shot and flag as answered if it works out.

MJone15
Beginner
1,910 views

That worked. Thanks. The following specifics do the trick:

Data format: GL_SHORT

Texture format: GL_RED_INTEGER[_EXT]

Internal format: GL_R16I

Also, I did the same thing with a 32-bit luminance-alpha format, using an RGBA16UI texture instead.
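Put together, the working allocation is roughly this (a sketch; the dimensions, data pointer, and GLSL snippet are illustrative, not my exact code):

```c
/* Sketch of the replacement single-channel integer allocation. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 2);  /* rows of tightly packed GLshort */
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_R16I,          /* internal format */
             64, 64, 0,
             GL_RED_INTEGER,   /* pixel format */
             GL_SHORT,         /* data type */
             data);
/* In the shader, sample through an isampler2D and read .r where the
 * old alpha-based code read .a, e.g.:
 *   uniform isampler2D u_tex;
 *   int v = texelFetch(u_tex, ivec2(gl_FragCoord.xy), 0).r;
 */
```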
