Graphics
Intel® graphics drivers and software, compatibility, troubleshooting, performance, and optimization
22668 Discussions

Integer texture support in Intel HD 4600

MJone15
Beginner
3,593 Views

I am having a problem allocating 64x64 OpenGL textures of the format ALPHA16I_EXT (internal) / ALPHA_INTEGER_EXT.

GPU caps lists the GL_EXT_texture_integer extension as being available. The driver is 10.18.10.3345 (10-31-2013).

The machine is a Dell with a hybrid NVidia/Intel configuration.

My question is more a driver question than a dev question: should the HD 4600 support this texture format? Is there limited support?
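For context, the failing allocation would look roughly like the sketch below. This is not the poster's actual code; it simply spells out the EXT_texture_integer enums named above:

```c
/* Sketch of the problematic allocation using the
 * GL_EXT_texture_integer enums; 'data' is a GLshort[64*64]. */
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_ALPHA16I_EXT,      /* internal format (alpha integer) */
             64, 64, 0,
             GL_ALPHA_INTEGER_EXT, /* pixel format */
             GL_SHORT,             /* data type */
             data);
```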

Thanks.

0 Kudos
1 Solution
ROBERT_U_Intel
Employee
1,897 Views

Hi Michael

The latest versions of the OpenGL specification recommend against using the alpha formats when only a single component is needed.

Here's the list of formats with the deprecated ones highlighted in red (they are no longer part of the core version of the spec).

So my recommendation would be to use R16I/RED_INTEGER instead.

Thanks

Robert


4 Replies
ROBERT_U_Intel
Employee
1,897 Views

Hi Michael

I will check with our OGL developers and get back to you.

Thanks

Robert


MJone15
Beginner
1,897 Views

Thanks. Offhand, I think that should work for us.

I will give it a shot and flag as answered if it works out.

MJone15
Beginner
1,897 Views

That worked. Thanks. The following specifics do the trick:

Data format: GL_SHORT

Texture format: GL_RED_INTEGER[_EXT]

Internal format: GL_R16I

Also, I did the same thing with a 32-bit alpha/luminance format, and used an RGBA16UI texture instead.
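Putting those three parameters together, the working allocation would look roughly like this (a sketch, assuming a current OpenGL 3.0+ context; the handle name is illustrative):

```c
/* Allocate a 64x64 single-channel 16-bit signed integer texture
 * with the format triple confirmed above. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
/* Integer textures require unfiltered (NEAREST) sampling. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_R16I,          /* internal format */
             64, 64, 0,
             GL_RED_INTEGER,   /* texture (pixel) format */
             GL_SHORT,         /* data format */
             pixels);          /* GLshort[64*64] */
```

Note that in the shader such a texture must be sampled through an integer sampler (`isampler2D`), e.g. with `texelFetch`, which returns an `ivec4` rather than a `vec4`.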
