Hello,
I am experiencing a bug when trying to create shader resource views for a TextureCubeArray in DirectX 11. I am creating a shader resource view for each cubemap of the array like this:
```cpp
UINT slices = tex2DarraySize / 6; // independent cube maps
for (UINT i = 0; i < slices; ++i)
{
    shaderResourceViewDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURECUBEARRAY;
    shaderResourceViewDesc.TextureCubeArray.First2DArrayFace = i * 6;
    shaderResourceViewDesc.TextureCubeArray.NumCubes = 1;
    shaderResourceViewDesc.TextureCubeArray.MostDetailedMip = 0;        // from most detailed...
    shaderResourceViewDesc.TextureCubeArray.MipLevels = (UINT)-1;       // ...to least detailed
    pTexture->additionalSRVs_DX11.push_back(nullptr);
    hr = device->CreateShaderResourceView(pTexture->texture2D_DX11,
                                          &shaderResourceViewDesc,
                                          &pTexture->additionalSRVs_DX11.back());
    assert(SUCCEEDED(hr) && "ShaderResourceView creation failed!");
}
```
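For context, the underlying resource is a Texture2DArray created as a cube array. A rough sketch of that creation follows; the sizes, formats, and bind flags here are placeholders rather than my exact values:

```cpp
// Sketch of the base resource creation; actual values in my application differ.
D3D11_TEXTURE2D_DESC texDesc = {};
texDesc.Width            = 512;                             // placeholder size
texDesc.Height           = 512;
texDesc.MipLevels        = 0;                               // 0 = allocate a full mip chain
texDesc.ArraySize        = slices * 6;                      // six faces per cube
texDesc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;      // placeholder format
texDesc.SampleDesc.Count = 1;
texDesc.Usage            = D3D11_USAGE_DEFAULT;
texDesc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;
texDesc.MiscFlags        = D3D11_RESOURCE_MISC_TEXTURECUBE; // required for cube(-array) views
hr = device->CreateTexture2D(&texDesc, nullptr, &pTexture->texture2D_DX11);
```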
The problem is with the First2DArrayFace = i * 6; assignment. On this hardware the field does not appear to index slices of the underlying Texture2DArray (the base resource type); instead it behaves as if it indexed whole cubes. If I write First2DArrayFace = i; instead, the results are correct. Unfortunately this workaround is not acceptable, because other GPUs, such as NVIDIA's, behave differently and comply with the first form. The name of the field also suggests that it should refer to a 2D array slice, as does the documentation on MSDN. This bug occurs for me on an Intel HD 620 integrated laptop GPU. I hope this is useful feedback.
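To make the difference explicit, here are the two assignments side by side. This is only a sketch of the observed behavior; the comments state my interpretation, not any documented Intel behavior:

```cpp
// Per the MSDN docs, First2DArrayFace is the index of the first 2D texture
// (face) in the underlying Texture2DArray, so cube i starts at face i * 6.
// This form works as expected on NVIDIA GPUs:
shaderResourceViewDesc.TextureCubeArray.First2DArrayFace = i * 6;

// On the Intel HD 620 driver the field appears to be treated as a cube
// index instead, so only this form gives correct results there:
shaderResourceViewDesc.TextureCubeArray.First2DArrayFace = i;
```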
Kind Regards,
Janos