glVertexArrayAttribFormat incorrectly requires that vao be bound

System Setup Information:

System Used: ASUS Zenbook UX305CA
CPU SKU: Intel Core m3-6Y30
GPU SKU: Intel HD Graphics 515
Processor Line:
System BIOS Version: UX305CA.201
CMOS settings:
Graphics Driver Version:
GOP/VBIOS Version:
Operating System: Windows 10 Home 64-bit
OS Version: 1803 (OS Build 17134.472)
Occurs on non-Intel GPUs?: No


Steps to Reproduce:
1. Create a vertex array object with direct state access function glCreateVertexArrays(1, &foo)
2. Enable vertex array object attribute with glEnableVertexArrayAttrib(foo, ...)
3. Initialize vertex array object attribute with glVertexArrayAttribFormat(foo, ...)
4. Set vertex array object binding with glVertexArrayAttribBinding(foo, ...)
5. Bind required vertex buffer(s) with glVertexArrayVertexBuffer(foo, ...)
6. Bind required index buffer with glVertexArrayElementBuffer(foo, ...)
7. Bind vertex array object with glBindVertexArray(foo)
8. Draw primitives with any indexed draw call
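The steps above correspond to a setup along these lines (a minimal sketch, not the actual application code; the buffer names `vbo`/`ebo`, the attribute index 0, and the vec3 float layout are placeholder assumptions):

```c
/* Sketch of the repro steps using direct state access (OpenGL 4.5 /
 * ARB_direct_state_access). Assumes a current GL context and that
 * vbo/ebo are previously created buffer objects. */
GLuint vao;
glCreateVertexArrays(1, &vao);                                    /* step 1 */
glEnableVertexArrayAttrib(vao, 0);                                /* step 2 */
glVertexArrayAttribFormat(vao, 0, 3, GL_FLOAT, GL_FALSE, 0);      /* step 3 */
glVertexArrayAttribBinding(vao, 0, 0);                            /* step 4 */
glVertexArrayVertexBuffer(vao, 0, vbo, 0, 3 * sizeof(GLfloat));   /* step 5 */
glVertexArrayElementBuffer(vao, ebo);                             /* step 6 */
glBindVertexArray(vao);                                           /* step 7 */
glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, NULL);  /* step 8 */
```

Note that the VAO is not bound until step 7; per the spec, the DSA calls in steps 2–6 operate on `vao` directly via the `vaobj` parameter and should not require it to be bound.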


Expected Results:

Geometry visible on screen, or an error reported by any of the above functions.


Actual Results:

No geometry is visible, no errors are reported.


Additional Information:

The same code path works without issue on an NVIDIA GeForce GTX 1070. The problem appears to occur only with multiple vertex array objects; the first VAO initialized by my application renders correctly. I was able to work around the issue by binding the vertex array object just around the call to glVertexArrayAttribFormat. Passing an unused VAO name for the vaobj parameter, or the same (valid) VAO name for all vertex array objects, changes the behavior, so the vaobj parameter does appear to be used to some extent.
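For reference, the workaround mentioned above looks roughly like this (the attribute index and format parameters are placeholders):

```c
/* Workaround: wrap the glVertexArrayAttribFormat call in a bind/unbind
 * of the same VAO it is configuring. The DSA call should not require
 * this, but it avoids the missing geometry on the affected driver. */
glBindVertexArray(vao);
glVertexArrayAttribFormat(vao, 0, 3, GL_FLOAT, GL_FALSE, 0);
glBindVertexArray(0);
```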

Unfortunately I don't have a minimal repro, but I can take some time to produce one if you have difficulty reproducing the issue.
