Is there a good way to determine what is causing an executable to have huge numbers of page faults? Or a surefire way to eliminate them?
Dave
I believe that Intel VTune can identify accesses in your program that lead to page faults, but my suggestion for a first approach would be to look at how much virtual memory your application is using and compare that with the amount of available physical memory on your system.
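If you'd rather get a first number from inside the program itself, you can tally your big allocations and compare the total against physical RAM. A minimal sketch (`grid` and `rhs` are hypothetical stand-ins for your own data):

```fortran
program footprint
  implicit none
  real(4), allocatable :: grid(:,:,:), rhs(:)   ! hypothetical stand-ins
  integer(8) :: bytes

  allocate(grid(512,512,512))                   ! 512 MB of REAL*4
  allocate(rhs(512**3))                         ! another 512 MB

  ! 4 bytes per REAL*4 element; compare the total against the
  ! physical RAM on the machine (2.5 GB in your case).
  bytes = 4_8*size(grid, kind=8) + 4_8*size(rhs, kind=8)
  print '(a,f6.2,a)', 'Approximate footprint: ', real(bytes,8)/2.0_8**30, ' GiB'
end program footprint
```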
I use Process Monitor to get most of the values below. In a 4-minute run it had about 6,000,000 page faults. Way too many to be normal. I have 2.5 GB of memory; less than 1 GB was in use when running the executable. I have a 4 GB swap file. I get this running one particular solution routine within the executable. When I switch to another, it doesn't generate more than a few thousand page faults, or about what I figure is normal for loading. So I need to determine what is causing the page faults.
BTW, I thought I had edited my post to add more info, but I see that it isn't there; not sure why. In any case, I set /stack:100000000 hoping that might be where it was getting hit, but it didn't change anything. The solution routine is part of a third-party .lib, so I don't have the source for it, and the vendor doesn't think the problem is his. Will VTune work on that if I don't have a debug version of it?
Changing the stack size has no effect on this.
Can you tell me what types of programming constructs are likely to cause paging? If it isn't the stack size, that would seem to rule out how local variables are placed on the stack. I'm at a loss to understand why the operating system and compiler defaults are not letting the executable take advantage of what is available on my machine. With 2.5 GB of memory, I would think somewhere along the line it should just put it all in memory and run. Since that doesn't happen, I need to force it, but I don't know where to look to make that happen.
Dave
You'll get excessive faulting if there is not enough physical memory available to satisfy the majority of access requests. Rather than trying to "force" things, you would do better to understand why the available physical memory is not being used to the fullest extent. I can't really help with that.
You CAN lock pages into physical memory, but you shouldn't have to for most applications. It could be that your swapfile is too small - even if you have lots of RAM, a too-small swapfile will restrict things. By default, Windows dynamically enlarges and shrinks the swapfile. I prefer to set it to a large fixed size and, if possible, create a swapfile on a separate drive from the boot drive.
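If you do decide you need locking, here is a hypothetical sketch written against the documented Win32 signature, BOOL VirtualLock(LPVOID, SIZE_T), assuming a 64-bit build so that BIND(C) matches the Win64 calling convention (on 32-bit Windows the stdcall decoration makes this hand-written interface wrong; use the IFWIN declarations there instead):

```fortran
program lock_pages
  use iso_c_binding
  implicit none
  interface
     ! Hand-written against the documented C signature:
     !   BOOL VirtualLock(LPVOID lpAddress, SIZE_T dwSize)
     function VirtualLock(lpAddress, dwSize) bind(C, name='VirtualLock')
       import :: c_ptr, c_size_t, c_int
       type(c_ptr), value :: lpAddress
       integer(c_size_t), value :: dwSize
       integer(c_int) :: VirtualLock
     end function VirtualLock
  end interface
  real(4), allocatable, target :: buf(:)
  integer(c_int) :: ok

  allocate(buf(25*1024*1024))          ! 100 MB buffer to pin in RAM
  ok = VirtualLock(c_loc(buf(1)), int(4_8*size(buf, kind=8), c_size_t))
  if (ok == 0) print *, 'VirtualLock failed - raise the working-set quota first'
end program lock_pages
```

Keep in mind that VirtualLock fails if the request exceeds the process working-set quota, so a lock this large usually needs a SetProcessWorkingSetSize call first.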
There's a discussion on virtual memory going on today at Slashdot - I haven't read the whole thing, but there may be interesting info there (and the usual load of noise).
It's likely the third-party library is doing something extravagant, such as allocating a REAL*4 array dimensioned (1000,1000,1000). That's going to take 4 GB. Since few computers have that much real RAM, the poor OS will try to fake it on disk. If the code accesses the array semi-randomly, it can cause a whole lot of page faults.
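A back-of-envelope check of that example (hypothetical code, just to show the arithmetic):

```fortran
program piggish
  implicit none
  real(4), allocatable :: big(:,:,:)
  integer(8) :: bytes

  bytes = 4_8 * 1000_8 * 1000_8 * 1000_8    ! 4,000,000,000 bytes (~3.7 GiB)
  print '(a,f5.2,a)', 'One array needs ', real(bytes,8)/2.0_8**30, ' GiB'

  allocate(big(1000,1000,1000))   ! far more than 2.5 GB of RAM, so the OS
  big = 0.0                       ! pages it out; semi-random access then
                                  ! becomes a steady stream of hard faults
end program piggish
```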
Try looking at the Task Manager "Mem Usage" column for the running program. If it goes up into the gigabytes, that's a sure sign the library code is being a bit piggish.
BTW, you aren't passing huge arrays to the library routines, are you? If you're passing in a 100 MB array, the lib routines may have to make a copy or two of the data to do their busywork - and the compiler can quietly make copies at the call site too, as in the sketch below.
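For instance, a non-contiguous array section passed to an old-style (implicit-interface) routine forces the compiler to build a copy-in/copy-out temporary. A sketch, with `lib_solve` as a hypothetical stand-in for the third-party routine:

```fortran
program copy_demo
  implicit none
  real(4), allocatable :: a(:,:)

  allocate(a(5000,5000))                     ! ~100 MB of REAL*4
  a = 0.0

  call lib_solve(a, size(a))                 ! whole array: by reference, no copy
  call lib_solve(a(:,1), 5000)               ! contiguous column: still no copy
  call lib_solve(a(1,:), 5000)               ! strided row: temporary copy made
  call lib_solve(a(1:5000:2,:), size(a)/2)   ! strided section: ~50 MB copied in
                                             ! and back out on every call
end program copy_demo

! Stand-in for the third-party routine (explicit-shape, F77-style
! interface, as most commercial .lib routines are).
subroutine lib_solve(x, n)
  implicit none
  integer :: n
  real(4) :: x(n)
  x(1) = x(1) + 1.0
end subroutine lib_solve
```

Big temporaries like that last one can also land on the stack, which is a separate failure mode from paging.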
I'd start looking around for an open-source version of that library. At least then you can FIND the places where the inefficiency lies and try to improve it.
Is your application actually running slower on the system with high page faults? (Adjust your expectations by processor architecture.)
I recall seeing an article on microsoft.com relating to page faults being counted for certain system calls without causing any disk paging. The following link may help:
MS uses soft page faults (no I/O) under some circumstances, and I think the circumstances differ depending on the version of the MS OS.
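You can see a pure soft-fault storm with nothing but a first touch of freshly allocated (demand-zero) pages. A minimal sketch (the sizes are arbitrary):

```fortran
program soft_faults
  implicit none
  real(4), allocatable :: buf(:)
  integer(8) :: i

  allocate(buf(64*1024*1024))           ! 256 MB; pages not materialized yet
  do i = 1, size(buf, kind=8), 1024     ! touch one REAL*4 per 4 KB page
     buf(i) = 0.0                       ! each first touch is a soft
  end do                                ! (demand-zero) fault - no disk I/O
end program soft_faults
```

Task Manager's fault counter climbs by tens of thousands here even though the disk never moves.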
Jim Dempsey
