I have to write and read double precision arrays (hundreds or thousands of numbers). Access to the file is "direct" (this is essential, because I use pointers to the different array locations)... and the array is written sequentially to the file.
What is the fastest method to achieve this?
Thanks.
Unfortunately my files are huge, and a file mapping would use too much memory. I just need to use something like
DO I=1,n !n=70000
READ(unit,...) array(I)
ENDDO
but I don't think this is a fast method. Is there something faster (or fastest)?
An array and a direct access file are two different things. The array is in memory, and the file is on disk. Can you clarify what you're doing?
Mike
The fastest way to read and write data is to read or write the entire array at once, not one element at a time in a DO loop. However, if the array is very large, this can cause a problem with memory usage, so sometimes reading a section of the array:
read (1) (array(i),i=1,1000)
is better - the compiler knows how to turn this style into a single transfer.
Note that unformatted I/O is MUCH faster than formatted I/O.
Steve
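To make this concrete, here is a minimal sketch of that advice applied to an unformatted direct-access file, since direct access is what the original question requires. The unit number, file name and block size are illustrative only, and the RECL value assumes the Intel/Compaq default of 4-byte units for unformatted record lengths (with /assume:byterecl it would be in bytes).
PROGRAM block_io
  IMPLICIT NONE
  INTEGER, PARAMETER :: nblk = 1000        ! doubles per record (illustrative)
  REAL(8) :: buf(nblk)
  INTEGER :: ios
  ! One direct-access record holds one whole block of doubles.
  ! 1000 REAL(8) values = 8*nblk bytes = 2*nblk default RECL units.
  OPEN (UNIT=10, FILE='blocks.dat', ACCESS='DIRECT', FORM='UNFORMATTED', &
        RECL=2*nblk, ACTION='READWRITE', IOSTAT=ios)
  buf = 1.0d0
  WRITE (10, REC=1, IOSTAT=ios) buf        ! whole block in a single transfer
  READ  (10, REC=1, IOSTAT=ios) buf        ! likewise on input, no DO loop
  CLOSE (10)
END PROGRAM block_io
Because each record is a whole block of doubles rather than a single value, positioning by REC= still gives direct access to any part of the data without an element-by-element loop.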
Of course, they are different things. I am trying to write the array (which is in memory) to a file (on the hard disk) and, later, read the file back into another array.
This process is quite slow (using big arrays and huge files). I am looking for the fastest method.
Thanks.
Thanks, Steve, but I get an IOSTAT error of 67.
What I am doing is:
Opening the file:
OPEN(UNIT = uni ,&
FILE = fname ,&
CONVERT= 'BIG_ENDIAN' ,&
ACCESS = 'DIRECT' ,&
FORM = 'UNFORMATTED' ,&
ACTION = 'READWRITE' ,&
RECL = 1 ,&
BUFFERED ='YES' ,&
BLOCKSIZE = 16384 ,&
BUFFERCOUNT= 1 ,&
IOSTAT = Io)
and reading the file:
READ(UNIT=uni,REC=ir,IOSTAT=io) (vect(I), I=1,leng)
where leng=20000 and vect is declared...
INTEGER vect(*) !(...from a big enough array)
What is wrong?
Thanks
RECL=1? It's not going to be too happy about that!
I think I see where you're trying to go with this, so let me suggest the following:
Open the file with FORM='BINARY' and ACCESS='SEQUENTIAL' (leave off RECL). To position within the file, call FSEEK (see the on-disk documentation index). Then do unformatted sequential READs of however much data you want.
Note that the CONVERT='BIG_ENDIAN' is going to slow you down a lot. But if that's what you need, you can't avoid it.
Steve
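A minimal sketch of this suggestion, assuming the IFPORT module (DFPORT on older compilers) for FSEEK; the unit number, file name and array size are illustrative:
PROGRAM binary_seek
  USE IFPORT                        ! provides the FSEEK portability function
  IMPLICIT NONE
  INTEGER, PARAMETER :: leng = 20000
  REAL(8) :: vect(leng)
  INTEGER :: ios
  INTEGER(4) :: offset
  ! FORM='BINARY' is the DEC/Intel extension described above: no record
  ! markers, so FSEEK positions the file by byte offset.
  OPEN (UNIT=11, FILE='data.bin', ACCESS='SEQUENTIAL', FORM='BINARY', &
        ACTION='READWRITE', IOSTAT=ios)
  vect = 0.0d0
  WRITE (11, IOSTAT=ios) vect       ! one unformatted transfer of the block
  offset = 0                        ! byte offset from the start of the file
  ios = FSEEK(11, offset, 0)        ! 0 = seek relative to the beginning
  READ (11, IOSTAT=ios) vect        ! read the same block back
  CLOSE (11)
END PROGRAM binary_seek
On compilers with Fortran 2003 support, the standard equivalent is ACCESS='STREAM' with POS= on the READ/WRITE statements, which avoids both FSEEK and the nonstandard FORM='BINARY'.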
Hi, Steve, thanks. I think I get it. But I am having trouble writing at the end of the file. For example, on an empty file:
offset=0
Io=FSEEK(uni,offset,0) !beginning of the file
WRITE(UNIT=uni, IOSTAT=io) variable !variable = 4
It doesn't write anything.
What to do?
Thanks
An example I tried, based on your fragment, worked ok. You asked about writing to "the end of the file" but the code you showed wrote to the beginning. Is that what you want?
Steve
Just a note regarding the size of this data: if you are talking about 70000 doubles, that is easily small enough for mapped memory. In any case, whether you need any extra speed beyond what you can get through the language RTL depends on your performance goals.
James
You are right. But at the beginning, the end of the file is the beginning. ;o)
My problem was with the BUFFERED specifier. I was trying to read data before it had been written. Is there any way to force the buffer to be written out?
About file mapping: I guess that if I want to read something at the 128 KB position, the mapped view of the file must cover at least 128 KB. My file can be 500 or 1000 MB, which is quite big for memory.
Thanks
Javier.
Ah, I didn't try BUFFERED. The whole point of BUFFERED is to allow the RTL to decide when the data should be written. You can close the file and re-open it if you wish, but you may want to try the DFPORT routine "FLUSH" to see if that works (I'm not sure if it would.)
Steve
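A minimal sketch of that suggestion, assuming the IFPORT/DFPORT FLUSH routine; the subroutine name and unit handling are illustrative, and as noted there is no guarantee it helps with BUFFERED='YES':
SUBROUTINE flush_then_read(uni, vect, leng)
  USE IFPORT                   ! provides the FLUSH and FSEEK portability routines
  IMPLICIT NONE
  INTEGER, INTENT(IN)  :: uni, leng
  REAL(8), INTENT(OUT) :: vect(leng)
  INTEGER :: ios
  CALL FLUSH (uni)             ! ask the RTL to push buffered data out to the file
  ios = FSEEK(uni, 0, 0)       ! reposition to the start before reading back
  READ (uni, IOSTAT=ios) vect
END SUBROUTINE flush_then_read
The other reliable option, as mentioned above, is simply to CLOSE the unit and re-OPEN it before reading.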
Again, the file sizes that you are talking about are not a problem. Think about it, this is just a different way of using the same amount of memory.
James
