Hello,
I have a program that works with large allocatable arrays, and I manage to stay within the available memory until I try to save an array to the hard drive. When I try to open a file for write access, I get an "insufficient virtual memory" error; if IOSTAT= is passed, it takes the value 41. I guess opening a file for write access needs some additional memory proportional to the object being saved, but I would like to understand how much additional memory it needs, and whether there is any way around this issue. For example, is there some way to optimise the OPEN statement for memory use at the expense of speed?
This is an x64 program on Windows 10. Googling indicated that ulimit would solve this issue on Linux, but I don't know of, and could not find, an equivalent concept on Windows.
Below is a minimal replicating example. Exactly how large an array is required to exhaust virtual memory will depend on the machine but, to emphasise, what I would like to understand is why the program crashes on the OPEN statement when it has not crashed on the ALLOCATE statement for an appropriately sized array.
Thanks,
Jamie
program scratch
    implicit none
    real(kind=8), allocatable :: policy(:,:,:,:,:)
    integer :: requiredl, ios

    allocate (policy(30,8,1287,11,120))
    policy = 0.0
    inquire (iolength=requiredl) policy
    open (unit=201, form='unformatted', file='outfile', status='unknown', &
          recl=requiredl, action='write', iostat=ios)
    write (201) policy
    close (unit=201)
end program
Quite probably merely a side issue, but your use of RECL= for a sequential file causes a runtime error which is hidden by IOSTAT= (because its value is not checked).
On my system your example worked, but I think I can see why it might fail if you are near the limits: an unformatted sequential file has markers for the start and end of each record. Their values have to be computed, and that may not be trivial; it could lead to temporary arrays or I/O buffers. I am not sure, as I do not know the exact mechanism used.
You might try a stream-access file instead: add ACCESS='stream' and drop the RECL=.
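For instance, the example could be rewritten roughly like this (a sketch, not tested on the original machine; stream access has no record markers, so no RECL= is needed):

```fortran
program scratch
    implicit none
    real(kind=8), allocatable :: policy(:,:,:,:,:)
    integer :: ios

    allocate (policy(30,8,1287,11,120))
    policy = 0.0

    ! ACCESS='stream': bytes are written as-is, with no record-length
    ! markers, so the runtime should not need a record-sized buffer.
    open (unit=201, form='unformatted', access='stream', file='outfile', &
          status='unknown', action='write', iostat=ios)
    if (ios /= 0) stop 'open failed'
    write (201) policy
    close (unit=201)
end program
```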
Thank you, that worked. I'm surprised, though, by how much additional memory is being used when I don't use ACCESS='stream': I am able to work with arrays at least three times larger without exceeding the memory limits.
I would think the buffers needed would be at least 2 × RECL, to allow one record to be written while a second is being buffered. Your RECL is huge: the array is 30 × 8 × 1287 × 11 × 120 ≈ 408 million elements of 8 bytes each, so each record is roughly 3.3 GB.
Just dropping the RECL= seems to be sufficient to solve the memory issue.
