Hello!
It is well known that using modules is a more modern approach than relying on global subroutines. But in a large project with many files, a change to a module triggers recompilation of every file that uses it, which can affect a large part of the project. With global routines, any change means recompiling just one file and relinking, which takes little time.
Does this mean that modules are less efficient?
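To illustrate (file and procedure names here are made up): every file that USEs a module depends on that module's .mod file, so touching the module source recompiles them all:

```fortran
! physics_mod.f90 -- any edit here regenerates physics_mod.mod,
! and every file that USEs the module is then recompiled,
! even when the change is purely internal.
module physics_mod
  implicit none
contains
  subroutine step(x)
    real, intent(inout) :: x
    x = x + 1.0   ! implementation detail only
  end subroutine step
end module physics_mod

! main.f90 -- one of the many dependent files that must rebuild
program main
  use physics_mod
  implicit none
  real :: x = 0.0
  call step(x)
  print *, x
end program main
```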
This is a known problem, especially with older build tools such as Unix/Linux make. One common issue is that changes confined to a module's implementation, leaving its interfaces unchanged, may still cause every source that USEs the module to be recompiled. Metcalf, Reid and Cohen's book, Modern Fortran Explained, contains a few pages on how to address this.
There are a couple of utilities that help you build makefiles that try to minimize the number of needless recompilations. If, however, you are using Visual Studio, you may have to provide a sample project with sources and a description of the problem so that the issue can be investigated.
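One traditional workaround, sketched below with invented names (and not necessarily the exact recipe from the book), is to keep the interfaces in a module of their own and write the procedures as external subprograms; the interface module rarely changes, so editing a procedure body recompiles only that one file:

```fortran
! physics_iface.f90 -- interface-only module; this file (and its
! .mod) changes only when a procedure's signature changes.
module physics_iface
  implicit none
  interface
    subroutine step(x)
      real, intent(inout) :: x
    end subroutine step
  end interface
end module physics_iface

! step.f90 -- external procedure; editing the body here touches
! only this file, so callers that USE physics_iface are untouched.
subroutine step(x)
  real, intent(inout) :: x
  x = x + 1.0
end subroutine step
```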
Use of "submodules" removes that problem. Read https://software.intel.com/en-us/blogs/2015/07/07/doctor-fortran-in-we-all-live-in-a-yellow-submodule
However, I think you might have to wait for Update 1 of the version 16 compiler to see that benefit in Visual Studio, as the dependency checking isn't working correctly at the moment.
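A minimal sketch of the submodule approach (names invented): the parent module carries only the interface, the submodule carries the implementation, so editing the implementation no longer invalidates the .mod file that the USErs depend on:

```fortran
! physics_mod.f90 -- parent module: interface only. Edits here
! still force dependents to recompile, but signatures change rarely.
module physics_mod
  implicit none
  interface
    module subroutine step(x)
      real, intent(inout) :: x
    end subroutine step
  end interface
end module physics_mod

! physics_impl.f90 -- submodule: implementation only. Edits here
! do not alter physics_mod.mod, so files that USE physics_mod
! are not recompiled.
submodule (physics_mod) physics_impl
contains
  module subroutine step(x)
    real, intent(inout) :: x
    x = x + 1.0
  end subroutine step
end submodule physics_impl
```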
