I have a program that runs fine on some machines and very slowly on others. Essentially, the program reads a set of numerical data, interpolates another data set from the first, and writes the new data set. The arrays involved can be quite large, and the interpolation, which involves some fairly complex transformations, is numerically intensive. Using Sysinternals' Process Explorer (from Microsoft) to monitor the program on different machines, it appears that the difference between a 'slow' and a 'fast' machine is the parameter labeled 'I/O Other' (i.e. not a read or a write). Once the program starts number crunching, no other parameter changes, e.g. no page faults and no disk activity. An example I tried today required 30,028,766 of whatever this 'I/O Other' is during the run, at a rate of about 66,700 per second, which took 450 s. The same problem on a 'slow' machine (a recent computer with, I believe, a quad-core Xeon in it) could only manage a rate of about 700 per second, which would have taken about 12 hours to complete.
Anybody have any ideas? I know this is a bit vague, but I want to try to get a handle on what is happening. The program is compiled with IVF v10.0.027 with the debug option. What is the process doing when it is number crunching and accessing memory without page faulting that requires all this 'I/O Other'?
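For what it's worth, you can sample the same counters from inside the program and see whether the 'Other' count tracks what Process Explorer reports. Below is a minimal sketch, assuming IVF's IFWIN module exposes the Win32 GetProcessIoCounters call with a T_IO_COUNTERS derived type (those binding names and the BOOL kind are my assumption from current IFWIN sources; a v10 installation may differ, so check your ifwin module):

    ! Sketch: sample the process I/O counters around the number-crunching
    ! section.  Member names follow the Win32 IO_COUNTERS structure;
    ! verify the IFWIN binding against your IVF version.
    program show_io_other
        use ifwin
        implicit none
        type(T_IO_COUNTERS) :: before, after
        integer(BOOL) :: ok

        ok = GetProcessIoCounters(GetCurrentProcess(), before)
        call crunch()                 ! stand-in for the interpolation work
        ok = GetProcessIoCounters(GetCurrentProcess(), after)

        print '(A,I0)', 'Reads : ', after%ReadOperationCount  - before%ReadOperationCount
        print '(A,I0)', 'Writes: ', after%WriteOperationCount - before%WriteOperationCount
        print '(A,I0)', 'Other : ', after%OtherOperationCount - before%OtherOperationCount
    contains
        subroutine crunch()           ! placeholder for the real work
        end subroutine crunch
    end program show_io_other

If the delta printed for 'Other' matches the Process Explorer number, the operations are being issued from inside your own process (your code or the IVF runtime), which narrows the search considerably.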
OK, procexp doesn't seem to lie about page faults, so that's probably not it. But something is hammering an I/O counter, and it is hammering it with a lot more overhead on some systems than on others. Or maybe the I/O is a red herring.
My next move would be to run it under procmon, procexp's evil twin, and see whether you can identify those I/O operations or spot something else that might be responsible.
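If it helps, procmon can be scripted so you only capture the crunching phase rather than wading through a full-session trace. A sketch using Process Monitor's command-line switches (the trace file name is mine; /AcceptEula, /Quiet, /Minimized, /BackingFile and /Terminate are documented options):

    procmon /AcceptEula /Quiet /Minimized /BackingFile crunch.pml
    rem ... run the interpolation program to completion ...
    procmon /Terminate

Then open crunch.pml, filter on the process name, and look at the Operation column to see what those 30 million events actually are.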
Is this a single-process or multi-process application? IOW, do you have a memory-mapped file or pipe involved (or MPI, etc.)?
And what OS is on the slow system? What OS is on the fast one?
I have a simulation application that generates a lot of data (30 GB is not unusual). The usual box it runs on has a Q6600 (quad-core) processor. When Vista Ultimate x64 was on the system and the journal file got to around 4 GB, the system ground to a very slow crawl. The application has a Pause button, and if I paused the application, waited 10 minutes, and then resumed, the throughput would come back to "normal". My guess is the high volume of sequential writes was flushing the program's portion of held memory (the file cache crowding out the working set) and thus forcing excessive paging. When I went back to Win XP Pro x64, the symptoms of excessive paging mostly went away.
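Tangentially, if the sheer volume of sequential writes is part of the stress, IVF lets you cut the number of underlying system write calls by enlarging the runtime's buffer on the OPEN. A sketch using IVF's BUFFERED and BLOCKSIZE extensions (unit number, file name, and sizes are illustrative only, and this eases runtime overhead per WRITE rather than curing file-cache eviction):

    ! Sketch: larger RTL buffering on a write-heavy unit, so many small
    ! WRITEs coalesce into fewer, larger system calls.  BUFFERED and
    ! BLOCKSIZE are IVF extensions to OPEN.
    open (unit=20, file='journal.dat', form='unformatted', &
          access='sequential', status='replace', &
          buffered='YES', blocksize=1048576)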
Jim Dempsey