Permission to access file denied

dboggs
New Contributor I

My application deals with an unformatted direct access file. A condensed outline is

OPEN (60, FILE = 'dynbmrts.ts', FORM = 'unformatted', ACCESS = 'direct', RECL = ...)
Write data to file
Read data from file
CLOSE (60)

The above is in a loop that executes 36 times.

Sometimes the program runs OK. More often, it produces the runtime error 'forrtl: severe (9): permission to access file denied, unit 60, file = path\dynbmrts.ts'. This error seems to occur randomly on any trip through the loop. There may be about five other files open at the same time, but they don't seem to affect the occurrence of this error. My system is 64-bit Windows 8.1. The program is a console application, and it runs fine on an XP machine in command mode.

Any ideas what could be causing this behavior?

Steven_L_Intel1
Employee

We've seen that Windows sometimes takes a while to release all its handles after a file is closed. The usual recommendation is to put a delay of half a second after the close (use SLEEPQQ). Do you really need to close and reopen it? Could you use FLUSH instead? Since it's direct access, you don't have to worry about file position.
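Something along these lines (a sketch only, assuming the record length can stay fixed across the loop; reclen is a placeholder for your RECL expression):

OPEN (60, FILE = 'dynbmrts.ts', FORM = 'unformatted', &
      ACCESS = 'direct', RECL = reclen)
DO ipass = 1, 36
   ! ... WRITE (60, REC = ...) and READ (60, REC = ...) as before ...
   FLUSH (60)    ! standard F2003 statement; forces buffered records to disk
END DO
CLOSE (60)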

dboggs
New Contributor I

Thanks for the suggestions, Steve. Very interesting.

Yes, the program does need to close and reopen the file on every trip through the loop, because the record length (of this direct-access file) depends on certain parameters that may change from one trip to the next.

As a further test, I have tried running the same exe file on some other computers. The program runs reliably on a couple of computers under Windows 7, but it fails randomly on the computer running Windows 8.1. It doesn't seem to matter which computer the exe was built on. (They have different versions of IVF, but I don't have the details handy right now.)

This makes me wonder--could it be that IVF is not yet fully "qualified" on Windows 8.1? Is there any reason to suspect this might be the case? Anybody have any similar experiences?

I will continue to explore what happens by inserting a small delay in the loop, but I hate to think this is the solution, because the program runs very fast right now--approx. 0.5 sec per loop iteration--and even a small delay would make it slower than my users are willing to tolerate! I will try putting in a trap for those (hopefully) rare occasions when the file access fails, and take appropriate action like simply retrying.

Steven_L_Intel1
Employee

Nope - has nothing to do with "qualified". This symptom has been reported for years against many versions of Windows. It isn't reliably reproducible, as you found. You could check for an error on the OPEN and then wait a half second and try again.
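For example (a sketch only; the retry cap is an added safeguard, and reclen is a placeholder for your RECL expression):

USE IFPORT, ONLY: SLEEPQQ    ! SLEEPQQ is declared in IFPORT

INTEGER :: ierr, itry
DO itry = 1, 10              ! give up after 10 attempts rather than loop forever
   OPEN (60, FILE = 'dynbmrts.ts', FORM = 'unformatted', &
         ACCESS = 'direct', RECL = reclen, IOSTAT = ierr)
   IF (ierr == 0) EXIT       ! open succeeded
   CALL SLEEPQQ (500)        ! wait 500 ms for Windows to release the handle
END DO
IF (ierr /= 0) STOP 'file still locked after 10 attempts'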

dboggs
New Contributor I

For what it's worth: the existence of this problem under Windows 8.1 is pretty reliable: it almost always occurs, although at different times through my loop. Under Windows 7 (and on different machines) the program runs very reliably: the problem has NEVER occurred.

andrew_4619
Honored Contributor III

dboggs wrote:

For what it's worth: the existence of this problem under Windows 8.1 is pretty reliable: it almost always occurs, although at different times through my loop. Under Windows 7 (and on different machines) the program runs very reliably: the problem has NEVER occurred.

It could just be a question of timing, though: the 8.1 machine being faster or slower, or some attribute of 8.1 that makes some I/O actions faster or slower. As Steve suggested, if you test for failure on the OPEN and add a short delay before a retry (20 ms has been sufficient in my experience), does the problem go away? That would more conclusively show that it is a timing issue and not some other issue that has not been considered...

dboggs
New Contributor I

Yes, it COULD be a timing problem, because the Win 8.1 machine is definitely faster than my home Win 7 machine, and possibly faster (but only slightly) than the office Win 7 machine I have tried. Also, I put a trap on the OPEN statement, and that appears to work. Hopefully a trap on the preceding CLOSE statement will also work.

Nevertheless, this seems like a very bad situation. If this problem is simply related to Windows' common failure to release file handles promptly, it can only be expected to get worse. And even if the solution is a simple trap around the CLOSE statement, should I as a user be expected to implement that every time I close a file, especially if I expect my program to continue to run reliably on future computers of ever-increasing speed? Makes me want to write a substitute called CLOSE_REALLY and put it in our library.

But wait...why doesn't Intel do that in their standard CLOSE statement so the rest of us don't have to deal with it?

Steven_L_Intel1
Employee

Because there's no issue with the CLOSE. It's the subsequent OPEN that fails.

andrew_4619
Honored Contributor III

The 'issue' is Windows. On closing a file there is quite a bit of tidying up to do, which often involves slow mechanical devices such as hard drives. The OS designers had two choices: make your application wait for this to complete before allowing it to proceed, or complete the task as a separate activity and let your program continue without delay. The latter, I think, is a good choice. Occasionally there is a "downside" if you try to open the file before the closing process has completed, but it isn't much of a downside, as all you need are some simple checks on every file open. I find file-open operations have so many ways of failing regularly that it is inconceivable to me to have any file open without some error trapping. The alternative is programs that crash.

dboggs
New Contributor I

Great discussion guys, thanks.

OK, so I understand it is the file OPEN that is at risk, so I should always have a trap. The problem is, for this particular behavior--which is apparently common and likely to get worse--I need the trap to do something as simple and routine as waiting and then retrying, in a loop, until it succeeds. Should I put this kind of routine in all of my OPEN commands? It sounds like it is needed, yet not really practical. I feel like I'm in a quandary. We have more than 30 years of history with routines like this, from Microsoft Fortran 5.0 through all of its descendants, on a wide array of compilers and computers (but always DOS or Windows), and we NEVER had any trouble like this--and now it is routine just because we are using a faster computer? There is something wrong with this picture.

andrew_4619
Honored Contributor III

There is a huge gulf between the speed of hard drives and that of CPUs and memory, and it has only gotten bigger as time has marched on, so I would expect the occurrence of such issues to increase.

It would be best, when designing programs, not to rely on opening and closing a file repeatedly, as that is the main cause of this type of problem. There isn't an easy, no-work-involved solution.

You could, for example, write a subroutine called "myopen" with a whole load of optional parameters, so that it looks like OPEN but with a couple of extra parameters such as the number of retries on failure and the delay between retries. It is then a one-line edit where you have problem opens.
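
A bare-bones sketch of the idea (the name myopen, its argument list, and the defaults are all just illustrative):

SUBROUTINE myopen (unit, file, recl, iostat, retries, delay_ms)
    USE IFPORT, ONLY: SLEEPQQ
    IMPLICIT NONE
    INTEGER,      INTENT(IN)           :: unit, recl
    CHARACTER(*), INTENT(IN)           :: file
    INTEGER,      INTENT(OUT)          :: iostat
    INTEGER,      INTENT(IN), OPTIONAL :: retries, delay_ms
    INTEGER :: itry, ntries, idelay
    ntries = 5                          ! default number of attempts
    idelay = 100                        ! default delay between attempts, ms
    IF (PRESENT(retries))  ntries = retries
    IF (PRESENT(delay_ms)) idelay = delay_ms
    DO itry = 1, ntries
        OPEN (unit, FILE = file, FORM = 'unformatted', ACCESS = 'direct', &
              RECL = recl, ACTION = 'readwrite', IOSTAT = iostat)
        IF (iostat == 0) RETURN         ! success
        CALL SLEEPQQ (idelay)           ! wait, then retry
    END DO                              ! on exit, iostat holds the last error
END SUBROUTINE myopen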

RETRIES would be a useful extension to OPEN ..... 

FortranFan
Honored Contributor III

dboggs wrote:

My application deals with an unformatted direct access file. A condensed outline is

OPEN (60, FILE = 'dynbmrts.ts', FORM = 'unformatted', ACCESS = 'direct', RECL = ...)
Write data to file
Read data from file
CLOSE (60)

The above is in a loop that executes 36 times.

Sometimes the program runs OK. More often, it produces the runtime error 'forrtl: severe (9): permission to access file denied, unit 60, file = path\dynbmrts.ts'. This error seems to occur randomly on any trip through the loop. There may be about five other files open at the same time, but they don't seem to affect the occurrence of this error. My system is 64-bit Windows 8.1. The program is a console application, and it runs fine on an XP machine in command mode.

Any ideas what could be causing this behavior?

dboggs wrote:

... I feel like I'm in a quandary. We have more than 30 years of history with routines like this, from Microsoft Fortran 5.0 through all of its descendants, on a wide array of compilers and computers (but always DOS or Windows), and we NEVER had any trouble like this--and now it is routine just because we are using a faster computer? There is something wrong with this picture.

Given all the facilities in the modern Fortran language for creating highly flexible and extensible data structures (derived types with allocatable components), would you consider rearchitecting your code not to use files to write and read data inside a loop? Instead, you could work with all the data in memory, e.g., with derived types and type-bound procedures, and then, if required, write the data out to a file at the end of all processing. I'm sure you realize the code design you provided in the original post dates back to a time when memory was precious and limited. Why constrain yourself with such a design, and with the inherent limitations of file I/O operations, in this day and age?
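
To illustrate (a rough sketch only; the type and component names are invented, and nsample/nchan stand in for whatever currently determines your record length):

TYPE :: pass_data
    REAL, ALLOCATABLE :: samples(:,:)        ! replaces one set of direct-access records
END TYPE pass_data

TYPE(pass_data) :: pass(36)
INTEGER :: i

DO i = 1, 36
    ALLOCATE (pass(i)%samples(nsample, nchan))   ! sized per trip, as RECL was
    ! ... fill and consume pass(i)%samples entirely in memory ...
END DO
! If a file is still required, write everything out once, after the loop.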

dboggs
New Contributor I

Yes, I fully realize that the architecture of this inherited program is outmoded. I do realize that it will now fit into memory, and I could restructure it to do that and solve the problem thusly.

However, I still admire the old coding practices of efficient use of memory (and speed too, for that matter), because these issues never completely go away, at least in programs that are intended to be long-lived: today's solution may become problematic in another decade. Who's to say that the available memory then may again be too small, or too slow? The style of today's programmers is to not worry about such things--they take a short-term view of everything--but I disagree with that (at least for some of our programs), and that's why I use Fortran instead of, say, Matlab.

It just seems to me that some of the onus of getting this to run right should be on Intel. It's hard to explain to a superior, who was against the continued use of Intel Fortran in the first place, that "the program doesn't run reliably without a lot more time spent redesigning it, or putting new work-arounds in our code library, because our computers are too fast."

dboggs
New Contributor I

Just to add some more info to document this problem:

My error trap usually solves the "access denied" problem on the OPEN statement, but not always: I occasionally get the error "attempt to write to a read-only file." This appears to be caused by the presence of the switch /fpscomp:ioformat combined with my omission of the ACTION specifier, as follows. According to the IVF documentation, the default OPEN ACTION specifier is 'READWRITE' (what I need), but not with this switch. With the switch, if the attempt to open the file fails, the runtime tries to open the file again, first using READ, then using WRITE. Apparently, Windows occasionally (at random) releases the file handle exactly between these two attempts, resulting in a read-only file.

So far, it looks like the solution is to explicitly state the desired action in the OPEN statement:

100 CONTINUE
     OPEN (10, FILE = 'DYNBMRTS.TS', FORM = 'UNFORMATTED', STATUS = 'REPLACE', &
           ACCESS = 'DIRECT', RECL = 4*nsample, ACTION = 'READWRITE', IOSTAT = ierr)
     IF (ierr /= 0) THEN        ! Windows probably hasn't released the handle from a previous CLOSE
          CALL SLEEPQQ (500)    ! wait 0.5 seconds,
          GO TO 100             ! then try again
     END IF
     :

FortranFan
Honored Contributor III

dboggs wrote:

Yes, I fully realize that the architecture of this inherited program is outmoded. I do realize that it will now fit into memory, and I could restructure it to do that and solve the problem thusly.

However, I still admire the old coding practices of efficient use of memory (and speed too, for that matter), because these issues never completely go away, at least in programs that are intended to be long-lived: today's solution may become problematic in another decade. Who's to say that the available memory then may again be too small, or too slow? The style of today's programmers is to not worry about such things--they take a short-term view of everything--but I disagree with that (at least for some of our programs), and that's why I use Fortran instead of, say, Matlab.

It just seems to me that some of the onus of getting this to run right should be on Intel. It's hard to explain to a superior, who was against the continued use of Intel Fortran in the first place, that "the program doesn't run reliably without a lot more time spent redesigning it, or putting new work-arounds in our code library, because our computers are too fast."

".. today's solution may become problematic in another decade. Who's to say that the available memory then may again be too small, or too slow? .. " - anything is possible, of course, but the sky may fall first!

Having "ported" a significant amount of legacy code with supposedly "memory-efficient" design from three different industries to "modern Fortran", my impression is that while many of the pieces (usually solvers for matrix operations) worked well with minimal use of memory, the overall packages couldn't be termed particularly "efficient": almost all of these apps involved statically allocated arrays that were dimensioned to the maximum expected problem size, with the code needing to do a lot of "bookkeeping" for the actual problem size, which was often 1/4 to 1/3 of the declared sizes. A DLL code package that I finished porting this fall to modern Fortran with a fully object-oriented design was shown to use one-third the memory during execution compared to the "legacy" DLL, which was based originally on IBM mainframe code and later ported to Microsoft Fortran PowerStation and then Digital/Compaq Visual Fortran.