
glVertexArrayAttribIFormat not working correctly on Intel HD Graphics 530

Vadim54

Hello Intel Community,

I'm encountering an issue with glVertexArrayAttribIFormat on my Intel HD Graphics 530 GPU.

System details:

  • GPU: Intel(R) HD Graphics 530

  • Driver version: 31.0.101.2111

  • OpenGL version: 4.6.0 - Build 31.0.101.2111

  • GLSL version: 4.60 - Build 31.0.101.2111

  • OS: Windows 10 64-bit

Problem:

I'm using the following input in my vertex shader:

layout (location = 5) in uint inst_inTransformIndex;
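
The index selects a per-instance transform; simplified, it is consumed roughly like this (sketch only; the storage block and names here are placeholders, not my exact shader):

layout (std430, binding = 0) readonly buffer Transforms {
    mat4 transforms[];
};

void main() {
    // On HD Graphics 530 this index reads back as 0 or garbage.
    mat4 model = transforms[inst_inTransformIndex];
    // ... transform the vertex with 'model' ...
}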

The corresponding VAO setup uses glVertexArrayAttribIFormat:

glVertexArrayAttribBinding(vao, 5, 1);                      // attribute 5 reads from binding index 1
glVertexArrayAttribIFormat(vao, 5, 1, GL_UNSIGNED_INT, 24); // one GL_UNSIGNED_INT at relative offset 24
glEnableVertexArrayAttrib(vao, 5);

The buffer contains valid uint32_t data, and the attribute is bound to binding index 1 with a divisor of 1:

glVertexArrayVertexBuffer(vao, 1, instanceBuffer, 0, stride); // bind the instance data to binding index 1
glVertexArrayBindingDivisor(vao, 1, 1);                       // advance once per instance
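
The buffer itself is created and filled along these lines (sketch; the upload call and usage hint are assumptions, and instanceCount/instanceData are placeholders):

GLuint instanceBuffer;
glCreateBuffers(1, &instanceBuffer);
glNamedBufferData(instanceBuffer,
                  (GLsizeiptr)(instanceCount * stride), // stride >= 28 to cover the uint at offset 24
                  instanceData,                         // tightly packed per-instance records
                  GL_DYNAMIC_DRAW);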

However, the attribute value seen in the shader is either 0 or garbage.

If I instead switch the shader input to:

layout (location = 5) in float inst_inTransformIndex;

...and use glVertexArrayAttribFormat with the same data, the values work correctly.
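
Concretely, this is the variant that works (a sketch of the fallback described above; only the format call and the shader input type change):

glVertexArrayAttribBinding(vao, 5, 1);
glVertexArrayAttribFormat(vao, 5, 1, GL_UNSIGNED_INT, GL_FALSE, 24); // integer data converted to float during vertex fetch
glEnableVertexArrayAttrib(vao, 5);

In the shader I then recover the index with uint(inst_inTransformIndex), which is exact only while the index stays below 2^24.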

Even more interestingly:

  • On a system with an NVIDIA RTX 3060 Laptop GPU, the same code works perfectly, including the uint input with glVertexArrayAttribIFormat.
  • If I attach RenderDoc to the application, the issue disappears: the data arrives correctly even on the Intel HD Graphics 530 (a debug-output check is sketched after this list).
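
Since RenderDoc masks the problem, a standard OpenGL debug-output callback (core since 4.3) is the other obvious check; a minimal sketch:

#include <stdio.h>
/* GL declarations and APIENTRY come from the loader (e.g. glad or GLEW). */

static void APIENTRY DebugCallback(GLenum source, GLenum type, GLuint id,
                                   GLenum severity, GLsizei length,
                                   const GLchar *message, const void *userParam)
{
    fprintf(stderr, "GL debug: %s\n", message);
}

/* After context creation: */
glEnable(GL_DEBUG_OUTPUT);
glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS);
glDebugMessageCallback(DebugCallback, NULL);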

Full OpenGL VAO setup (captured from RenderDoc):

glCreateVertexArrays(1, VertexArray_FilledRectangles);

glEnableVertexArrayAttrib(VertexArray_FilledRectangles, 0);
glVertexArrayAttribFormat(VertexArray_FilledRectangles, 0, 2, GL_FLOAT, GL_FALSE, 0);
glVertexArrayAttribBinding(VertexArray_FilledRectangles, 0, 0);

glVertexArrayVertexBuffer(VertexArray_FilledRectangles, 0, GeometryBuffer_FilledRectangles, 0, stride);
glVertexArrayBindingDivisor(VertexArray_FilledRectangles, 0, 0);

glEnableVertexArrayAttrib(VertexArray_FilledRectangles, 1);
glVertexArrayAttribFormat(VertexArray_FilledRectangles, 1, 2, GL_FLOAT, GL_FALSE, 0);
glVertexArrayAttribBinding(VertexArray_FilledRectangles, 1, 1);

glEnableVertexArrayAttrib(VertexArray_FilledRectangles, 2);
glVertexArrayAttribFormat(VertexArray_FilledRectangles, 2, 1, GL_FLOAT, GL_FALSE, 8);
glVertexArrayAttribBinding(VertexArray_FilledRectangles, 2, 1);

glEnableVertexArrayAttrib(VertexArray_FilledRectangles, 3);
glVertexArrayAttribFormat(VertexArray_FilledRectangles, 3, 2, GL_FLOAT, GL_FALSE, 12);
glVertexArrayAttribBinding(VertexArray_FilledRectangles, 3, 1);

glEnableVertexArrayAttrib(VertexArray_FilledRectangles, 4);
glVertexArrayAttribFormat(VertexArray_FilledRectangles, 4, 4, GL_UNSIGNED_BYTE, GL_TRUE, 20);
glVertexArrayAttribBinding(VertexArray_FilledRectangles, 4, 1);

glEnableVertexArrayAttrib(VertexArray_FilledRectangles, 5);
glVertexArrayAttribIFormat(VertexArray_FilledRectangles, 5, 1, GL_UNSIGNED_INT, 24); // <- the failing attribute
glVertexArrayAttribBinding(VertexArray_FilledRectangles, 5, 1);

glVertexArrayVertexBuffer(VertexArray_FilledRectangles, 1, InstancesBuffer_FilledRectangles, 0, stride);
glVertexArrayBindingDivisor(VertexArray_FilledRectangles, 1, 1);

glVertexArrayElementBuffer(VertexArray_FilledRectangles, IndexBuffer_FilledRectangles);
glObjectLabel(GL_VERTEX_ARRAY, VertexArray_FilledRectangles, -1, "VertexArray_FilledRectangles");
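
For reference, the per-instance layout implied by those formats (field names are my own; only the offsets and types follow from the capture):

#include <stdint.h>

typedef struct Instance {
    float    f0[2];          /* location 1: 2 x GL_FLOAT,         offset 0  */
    float    f1;             /* location 2: 1 x GL_FLOAT,         offset 8  */
    float    f2[2];          /* location 3: 2 x GL_FLOAT,         offset 12 */
    uint8_t  color[4];       /* location 4: 4 x GL_UNSIGNED_BYTE, offset 20, normalized */
    uint32_t transformIndex; /* location 5: 1 x GL_UNSIGNED_INT,  offset 24 -- the failing attribute */
} Instance;                  /* tightly packed size = 28, so stride >= 28 */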

Question:

Is there a known issue or limitation with glVertexArrayAttribIFormat on Intel HD Graphics 530 with this driver version? Is there a recommended workaround, or would using float instead of uint be the only reliable alternative?

Thanks in advance for any help or insight!
