Developing Games on Intel Graphics

Direct3D process handle leak (sample code included)

Steve_Browne
Beginner

I originally posted this in the regular support forum, but I found this forum and thought it might get more attention. The original post can be found here:
Intel 945G chipset with latest drivers (6.14.10.4926)
Windows XP SP3
I have an application that is experiencing a problem similar to the one described in the thread I found here:
The application I have is also for video surveillance, but I'm seeing it with a smaller number of HD resolution (1920x1080) cameras. I was able to debug the application and trace the leaks down to some Direct3D calls, both when allocating memory and when releasing some Direct3D objects.
I wrote a quick sample app that reproduces the issue. Basically I just keep creating 1920x1080 swap chains until one fails (typically due to not enough video memory); at that point 10 process handles have been created (counted with SysInternals handle.exe). After releasing all the swap chains there are 16 process handles. These process handles remain for the duration of the application even though everything was properly released. This doesn't happen only when the app runs out of memory and gets D3DERR_OUTOFVIDEOMEMORY: several calls before the out-of-memory failure you can see process handles being created that are never released. For instance, in my case the 6th call to CreateAdditionalSwapChain succeeds but creates a process handle, every call after it does the same, and the 12th call fails and creates 5 process handles.
Please let me know if there is any additional information I can provide.
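(For reference, the handle counts above came from running the Sysinternals handle utility against the test process after each stage, along the lines of:

handle.exe -p MyTestApp.exe

where MyTestApp.exe stands in for whatever your test process is called.)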
Here is the sample code I am able to reproduce the issue with:
#include <windows.h>
#include <d3d9.h>
#include <atlbase.h>  // CComPtr
#include <dxerr.h>    // DXTRACE_ERR (dxerr9.h on older DirectX SDKs)

void PerformDirect3DTest(HWND hWnd)
{
    HRESULT hr = S_OK;
    CComPtr<IDirect3D9> d3d;
    CComPtr<IDirect3DDevice9> d3d_device;
    d3d.Attach(Direct3DCreate9(D3D_SDK_VERSION));
    if (!d3d)
        return;

    // See what we can do with the graphics hardware before we try to do anything
    D3DCAPS9 dxcaps;
    D3DDEVTYPE devtype = D3DDEVTYPE_HAL;
    DWORD dwFlags = D3DCREATE_MULTITHREADED | D3DCREATE_FPU_PRESERVE;
    hr = d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, devtype, &dxcaps);

    // If the device doesn't support HW T&L or doesn't support 1.1 vertex
    // shaders in HW, then switch to SWVP.
    if (((dxcaps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) == 0) ||
        (dxcaps.VertexShaderVersion < D3DVS_VERSION(1, 1)))
        dwFlags |= D3DCREATE_SOFTWARE_VERTEXPROCESSING;
    else
        dwFlags |= D3DCREATE_HARDWARE_VERTEXPROCESSING;

    // Try to create the D3D device we need
    D3DPRESENT_PARAMETERS d3dParam;
    memset(&d3dParam, 0, sizeof(D3DPRESENT_PARAMETERS));
    d3dParam.Windowed = TRUE;
    // Since we use an additional swap chain we don't need the default backbuffer to be very big
    d3dParam.BackBufferWidth = 1;
    d3dParam.BackBufferHeight = 1;
    d3dParam.BackBufferCount = 1;
    d3dParam.SwapEffect = D3DSWAPEFFECT_DISCARD;
    d3dParam.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;
    // This is the window handle of the renderer's video window
    d3dParam.hDeviceWindow = hWnd;

    // Try to create a device to use based on the type and behaviour we figured out above
    hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, devtype, NULL, dwFlags, &d3dParam, &d3d_device);
    if (FAILED(hr))
        return;

    // Turn off alpha blending
    if (FAILED(hr = d3d_device->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE)))
        DXTRACE_ERR(TEXT("DirectX call SetRenderState failed"), hr);

    // Set a larger backbuffer size for creating our swap chains
    d3dParam.BackBufferWidth = 1920;
    d3dParam.BackBufferHeight = 1080;

    // Create swap chains until we run out of memory
    CComPtr<IDirect3DSwapChain9> *pSwapChains = new CComPtr<IDirect3DSwapChain9>[256];
    for (int i = 0; i < 256; i++)
    {
        // Create a new swap chain to use
        hr = d3d_device->CreateAdditionalSwapChain(&d3dParam, &pSwapChains[i]);
        if (FAILED(hr))
        {
            DXTRACE_ERR(TEXT("DirectX call CreateAdditionalSwapChain failed"), hr);
            break;
        }
    }

    // Delete our array of swap chains, which will also release them
    delete [] pSwapChains;
    pSwapChains = NULL;

    // Not necessary since we're going out of scope, but we'll release anyway
    d3d_device.Release();
    d3d.Release();
}
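For completeness, a minimal host for the test function might look like this (illustrative scaffolding on my part, not from our real app):

#include <windows.h>

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE, LPSTR, int)
{
    // Register a dummy window class and create a small window to render into
    WNDCLASS wc = {0};
    wc.lpfnWndProc   = DefWindowProc;
    wc.hInstance     = hInst;
    wc.lpszClassName = TEXT("D3DLeakTest");
    RegisterClass(&wc);

    HWND hWnd = CreateWindow(TEXT("D3DLeakTest"), TEXT("D3D leak test"),
                             WS_OVERLAPPEDWINDOW, 0, 0, 64, 64,
                             NULL, NULL, hInst, NULL);

    PerformDirect3DTest(hWnd);

    // Everything has been released at this point, so any process handles still
    // reported by handle.exe here are the leaked ones
    MessageBox(hWnd, TEXT("Test finished - check handle counts now"),
               TEXT("D3D leak test"), MB_OK);

    DestroyWindow(hWnd);
    return 0;
}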

Doraisamy_G_Intel
Unfortunately, the driver for the platform you are using (945G) is no longer actively supported by Intel, so it is not feasible to issue a driver patch for this bug at this time. However, the bug is being looked into by our driver engineers and will be fixed in a future release of the driver.

Meanwhile, have you tried using a newer generation of Intel graphics hardware to see if your issue goes away?
Steve_Browne
Beginner
Well, I'm not overly concerned about getting it fixed on the 945G chipset. It would be nice, but hopefully we'll be upgrading hardware soon. The main reason I was looking into this is that we have a customer who I believe is experiencing the same issue on a G31 chipset. I don't have access to that box to verify whether it's the same issue, but I can confirm that it doesn't happen on a G35. However, I would imagine some of the same driver code is used or shared between chipsets, so there is potential for this to affect more than just the 945G.
misterfinster
Beginner
I've discovered similar behaviour with OpenGL on Intel chipsets (we have a lot of laptops with Intel video chipsets (945GM, 965GM, HD) and even some desktops with HD chipsets). Our application needs to be able to create and destroy OpenGL contexts without polluting the process space, but on Intel chipsets OpenGL leaks handles and/or memory.
There is a large (~2-3MB) memory leak on all of the Intel chipsets I've tested when using GL_ARB_vertex_program (or GL_ARB_fragment_program): the call to glDeleteProgramsARB does not free the memory that was previously allocated by glGenProgramsARB/glProgramStringARB.
I've attached my test source file. I built it with VS2008 and it links against GLEW 1.5.8. It contains both OpenGL and DirectX tests, and both leak handles and/or memory on every Intel chipset I've tried so far. They do not leak with ATI or nVidia drivers. I also noticed that when I updated my 945GM driver to 4926 the handle leaks went away, but the shader leak remains.
If anyone looks at the test code and spots something I'm doing wrong, please point it out.
thanks
EDIT - actual memory allocation occurs in glProgramStringARB
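To make that concrete, here is a minimal sketch of the leaking sequence (my illustrative reduction, assuming a current GL context and an initialized GLEW; the program string is a trivial placeholder):

#include <GL/glew.h>

void LeakCycleARB()
{
    // Trivial ARB vertex program; the content doesn't matter for the leak
    static const char kProg[] =
        "!!ARBvp1.0\n"
        "MOV result.position, vertex.position;\n"
        "END\n";

    GLuint prog = 0;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog);
    // The driver-side allocation happens here, in glProgramStringARB...
    glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)(sizeof(kProg) - 1), kProg);
    // ...and on the affected Intel drivers this delete never returns that memory
    glDeleteProgramsARB(1, &prog);
}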
Steve_Browne
Beginner
I don't know if this helps, but one of the workarounds I found, at least with the G31 and G35 chipsets, was to just re-use the Direct3D device where possible and never release it. This worked out for me because the application only uses one Direct3D device anyway. The problem was that when we were done with the device we would release it, and that release wouldn't free up all the resources associated with it. I'm positive all of our resources are cleaned up properly before the release, and the same code works fine with non-Intel graphics cards.
The only thing to be wary of: if you're using a static auto-reference variable (e.g. a CComPtr) to hold the Direct3D device, you'll want to manually detach it in your DLL unload. Otherwise its destructor will try to release the device, and freeing those resources while the loader lock is held causes deadlocks that prevent your application from shutting down properly. This was our situation, so we had to account for it; see the sketch below.
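A rough sketch of what we ended up with (all names here are illustrative, not our actual code):

#include <d3d9.h>
#include <atlbase.h>

static CComPtr<IDirect3D9> s_d3d;
static CComPtr<IDirect3DDevice9> s_device;

// Create the device once and cache it for the life of the module
IDirect3DDevice9* GetSharedDevice(HWND hWnd)
{
    if (!s_device)
    {
        s_d3d.Attach(Direct3DCreate9(D3D_SDK_VERSION));
        if (!s_d3d)
            return NULL;

        D3DPRESENT_PARAMETERS pp = {0};
        pp.Windowed = TRUE;
        pp.BackBufferWidth = 1;
        pp.BackBufferHeight = 1;
        pp.BackBufferCount = 1;
        pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
        pp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;
        pp.hDeviceWindow = hWnd;

        s_d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                            D3DCREATE_MULTITHREADED | D3DCREATE_FPU_PRESERVE |
                            D3DCREATE_HARDWARE_VERTEXPROCESSING,
                            &pp, &s_device);
    }
    return s_device;  // never released during normal operation
}

BOOL WINAPI DllMain(HINSTANCE hinst, DWORD reason, LPVOID reserved)
{
    if (reason == DLL_PROCESS_DETACH)
    {
        // Deliberately leak instead of letting the CComPtr destructors call
        // Release() while the loader lock is held
        s_device.Detach();
        s_d3d.Detach();
    }
    return TRUE;
}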
Philip_T_Intel1
Employee
If you run your app on the debug runtime and examine the debug output in the Output window of Visual Studio, does the runtime report anything that corresponds to what you see?
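As an aside, one thing that may squeeze more out of the debug runtime (assuming it is selected in the DirectX Control Panel): defining D3D_DEBUG_INFO before including d3d9.h exposes extra debug state on D3D9 objects in the debugger:

// Must be defined before d3d9.h is included (debug runtime required)
#define D3D_DEBUG_INFO
#include <d3d9.h>
// With this in place, D3D9 devices and resources carry extra members (such as
// the creation call stack) visible in the Visual Studio watch window.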
misterfinster
Beginner
When I step through the code I provided (in Visual Studio 2008, for the OpenGL case), I watch the process through Process Explorer as well and notice that the memory is allocated on the call to glProgramStringARB(). I also test for any GL errors after the call with glGetError(), and none are reported. When I continue to step through the tear-down of the context, the memory is not freed on the corresponding call to glDeleteProgramsARB(). Nothing is output to the debug window in Visual Studio during this sequence. On the particular GPU I'm debugging on (Intel 945GM), WGL_ARB_create_context is not available, so I can't create a context with the debug attribute enabled.
I also added some checks to the code after glDeleteProgramsARB() just now and there are no errors or Visual Studio debug outputs that I can see there either.
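For reference, the checks are nothing more elaborate than this (the helper name is mine):

#include <GL/glew.h>
#include <stdio.h>

static void CheckGL(const char *where)
{
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        fprintf(stderr, "GL error 0x%04X after %s\n", err, where);
    // On the 945GM this never fires after glProgramStringARB or glDeleteProgramsARB
}

// usage: CheckGL("glProgramStringARB"); CheckGL("glDeleteProgramsARB");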
I have almost immediate access to 4 different Intel GPU chipsets, although I'm currently only set up to debug on the 945GM. If you need more information, please let me know.
On a side note, I downloaded and compiled the MesaGL software reference 'OpenGL' ICD and ran the test code with it and the memory leaks (and handle leaks) go away.
thanks