I am developing a Fortran 90 program on a Linux parallel cluster using Intel Fortran (v10). There are several large (double complex) arrays declared with fixed sizes at the start (i.e., not allocated later). Beyond a certain size (8000 x 8000) I get errors at link time like:
abc.f90:(.text+0x118) relocation truncated to fit: R_X86_64_PC32 against '.hss'
ending with "additional relocation overflows omitted from the output".
There is plenty of memory on each node (4 Gbytes). Can anyone explain this? I can't find anything in the docs.
Thanks!
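For context, a minimal sketch of the kind of declaration that can trigger this (the array names here are hypothetical, not from the original program). An 8000 x 8000 double complex array is 8000 * 8000 * 16 bytes, roughly 1 GB, so two or three such arrays exceed the 2 GB that the default small memory model on x86-64 Linux allows for static code and data:

```fortran
! Hypothetical sketch: fixed-size arrays of this magnitude typically
! land in the static data segment, and a couple of them blow past the
! 2 GB limit of the default (small) x86-64 memory model at link time.
program big_static
  implicit none
  complex(kind=8) :: a(8000, 8000)   ! ~1 GB of static storage
  complex(kind=8) :: b(8000, 8000)   ! another ~1 GB
  a = (0.0d0, 0.0d0)
  b = a
end program big_static
```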
OK, I found it works OK if I allocate the memory from within the program. Why can't the error messages give poor beginners like me more of a hint as to the problem?
I found some references to this, or similar, problems in the release notes and past forum posts, but using -shared and/or -fpic as suggested just gives a different set of errors.
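The workaround described above can be sketched like this (a minimal example, not the poster's actual code): allocating at run time puts the array on the heap rather than in the static data segment, so the 2 GB static limit no longer applies.

```fortran
! Sketch of the allocatable workaround: the array lives on the heap,
! limited only by available memory, not by the static-data size limit.
program big_alloc
  implicit none
  complex(kind=8), allocatable :: a(:,:)
  integer :: istat
  allocate(a(8000, 8000), stat=istat)   ! ~1 GB on the heap
  if (istat /= 0) stop 'allocation failed'
  a = (0.0d0, 0.0d0)
  deallocate(a)
end program big_alloc
```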
This particular error comes from having more than 2GB of static code and data on 64-bit Linux. You can do this, but the compiler and linker have to do some extra work to make it happen. The best solution is to use allocatable rather than static arrays for the big arrays.
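If the static arrays must stay, the "extra work" mentioned above is selecting a larger memory model at compile time. A sketch of the compile line, assuming the Intel Fortran `-mcmodel=medium` and `-shared-intel` options documented for 64-bit Linux (adjust the source file name to your own):

```shell
# Medium memory model: code stays below 2GB, but static data may exceed it.
# -shared-intel links the Intel runtime dynamically, which the medium
# model requires. abc.f90 here stands in for your own source file.
ifort -mcmodel=medium -shared-intel abc.f90 -o abc
```

Note that every object file and library holding the large data must be built with the same memory model, or the same relocation errors reappear at link time.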