I got rid of the Portland Group compilers and am now trying to compile MM5 with Intel 11.1. Are there any suggested settings for the configure.user file? So far I have had only failures.
Our OS is CentOS 5.3 (Rocks).
Below is what I have tried (based on suggestions found via Google):
RUNTIME_SYSTEM = "linux"
MPP_TARGET=$(RUNTIME_SYSTEM)
### edit the following definition for your system
LINUX_MPIHOME = /opt/intel/impi/3.2.2 # change 'bin' to 'bin64' below
MFC = $(LINUX_MPIHOME)/bin64/mpif77
MCC = $(LINUX_MPIHOME)/bin64/mpicc
MLD = $(LINUX_MPIHOME)/bin64/mpif77
FCFLAGS = -I${LINUX_MPIHOME}/include -I. \
-convert big_endian -O2 -ip -fno-alias -safe-cray-ptr \
-mp1 -no-ftz -openmp -DDEC_ALPHA -static
LDOPTIONS = $(FCFLAGS)
LOCAL_LIBRARIES = -L$(LINUX_MPIHOME)/lib64 # for intel compilers
CPP = /lib/cpp -C -P -traditional
CFLAGS = -O -DMPI -DDEC_ALPHA -static
CPPFLAGS = -w -I${LINUX_MPIHOME}/include -DDEC_ALPHA -static
----------------------------
Thanks!
Hi Nick,
I'm moving this to the Intel Fortran Compilers forums, where the Fortran experts can help you. Could you also provide the error you're seeing? That would be helpful.
I would also suggest using mpiifort instead of mpif77 and mpiicc instead of mpicc. The scripts in your Makefile (mpif77 and mpicc) use the GNU compilers by default.
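As a sketch, the relevant configure.user lines with the Intel-compiler wrappers would look something like this (the bin64 paths are carried over from the fragment posted above):

```make
# configure.user sketch: point the MM5 build at the Intel-compiler
# MPI wrappers instead of the GNU-backed mpif77/mpicc scripts.
LINUX_MPIHOME = /opt/intel/impi/3.2.2
MFC = $(LINUX_MPIHOME)/bin64/mpiifort   # Fortran wrapper around ifort
MCC = $(LINUX_MPIHOME)/bin64/mpiicc     # C wrapper around icc
MLD = $(LINUX_MPIHOME)/bin64/mpiifort   # link with the Fortran wrapper
```

You can confirm which underlying compiler a wrapper invokes by running it with the `-show` option, which prints the full compile command line without executing it.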
Regards,
~Gergana
I'm trying to build with Intel MPI (impi) version 3.2.2.
Separately, I have successfully compiled MM5 using MPICH2 (built with the Intel compilers), but MM5 hangs when I try to run it. WRF had the same hanging problem with MPICH2, yet WRF runs perfectly fine with impi.
Sorry, I do see your impi 3.2.2 setting.
I always source the mpivars and ifortvars scripts in my Makefile and cluster job submission, rather than trying to find all the individual path settings. What problems do you encounter?
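A minimal job-script fragment for this approach might look like the following; the compiler build-number directory (059 here) and the `intel64` argument are assumptions for a typical 64-bit Intel 11.1 / Intel MPI 3.2.2 installation and may differ on your cluster:

```shell
#!/bin/bash
# Job-script sketch: set up the compiler and MPI environments before
# building or launching mm5.mpp, instead of hard-coding paths.
source /opt/intel/Compiler/11.1/059/bin/ifortvars.sh intel64
source /opt/intel/impi/3.2.2/bin64/mpivars.sh

make mm5.mpp            # or: mpirun -np 8 ./mm5.mpp
```

Sourcing these scripts sets PATH, LD_LIBRARY_PATH, and the MPI include/library locations consistently, which is exactly what hand-maintained configure.user entries tend to get wrong.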
Tim,
A sample error, right away in the build:
/opt/intel/impi/3.2.2/bin/mpiicc -c -I/opt/intel/impi/3.2.2/include -DMPI -DRSL_SYNCIO -Dlinux -DSWAPBYTES -O -DIMAX_MAKE= -DJMAX_MAKE= -DMAXDOM_MAKE=6 -DMAXPROC_MAKE=256 -DHOST_NODE=0 -DMON_LOW=1 -DALLOW_RSL_168PT=1 set_padarea.c
/opt/intel/impi/3.2.2/include/mpi.h(35): catastrophic error: #error directive: A wrong version of mpi.h file was included. Check include path.
-----------------
It looks like it may want the include64 directory instead, but when I add " -I${LINUX_MPIHOME}/include64 " to the configure.user file, it has no effect; somehow the build still looks in the /include directory rather than include64.
I will try sourcing the *vars.sh files within the Makefile to see if that has an effect. Thanks.
Yes, it looks as if an mpi.h from the CentOS installation may have been found. Sourcing mpivars.sh should correct that when using mpiicc et al.
I've been sourcing mpivars.sh and I still get the error.
One way to check which mpi.h is coming in is to save the pre-processed source code with
ifort -E
or
ifort -keep .......
It should show the complete source path and expansion.
I've been bitten by this kind of thing. Whichever MPI came with CentOS used conflicting definitions for the MPI data types, so the Intel MPI wrapper is doing us a favor by pointing out the error. The Intel MPI wrapper sets up the correct include path, but your Makefile, for example, might override it and cause this problem.
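For the C file that actually failed, the same preprocessor check can be run through the MPI wrapper; the file name and include path below are taken from the failing command earlier in the thread, and `-E` is the standard preprocess-only option:

```shell
# Preprocess only (-E) and inspect the linemarkers to see exactly
# which mpi.h the compiler picked up.
mpiicc -E -I/opt/intel/impi/3.2.2/include64 set_padarea.c | grep 'mpi\.h'
```

If the output shows a path such as /usr/include/mpi.h rather than one under /opt/intel/impi, a system MPI is shadowing the Intel headers.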
If you never use the CentOS MPI, you might find and remove it by some tactic like
rpm -qa | grep -i mpi
then run (as root)
rpm -e (list of rpms to remove)
It's really not good to have any MPI components present on default paths. When you build an open-source MPI yourself, you can easily avoid the problem by configuring it to install in a specific place, like
/usr/local/openmpi1_4/
The Intel Cluster Ready clck enforces MPI path installation separation, among other things. That's definitely back to the cluster/MPC forum territory where you started.
Despite specifying "include64" in configure.user, make still seems to use "include". When I compile the following file manually, it succeeds:
/opt/intel/impi/3.2.2/bin/mpiicc -c -I/opt/intel/impi/3.2.2/include64 -DMPI -DRSL_SYNCIO -Dlinux -DSWAPBYTES -O -DIMAX_MAKE= -DJMAX_MAKE= -DMAXDOM_MAKE=6 -DMAXPROC_MAKE=256 -DHOST_NODE=0 -DMON_LOW=1 -DALLOW_RSL_168PT=1 set_padarea.c
I think part of the problem is that MM5 mostly predates 64-bit systems, so the scripts and makefiles were never set up to pass the 64-bit directories through. I'll try to tunnel through the layers of scripts to get the 64-bit directories passed along correctly.
I can now get a successful compile. The trick was to specify bin64, include64, and lib64 in the configure.user file, and to add "include64" to the 7th line of /MPP/RSL/RSL/makefile.linux.
However, I now have the same problem of mm5.mpp hanging when I try to run it (just as when I tried MPICH2 with the Intel compilers). Sigh.
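Putting the fix together, the 64-bit locations in configure.user end up looking roughly like this (variable names follow the fragment posted at the start of the thread; the mpiifort/mpiicc wrappers follow Gergana's suggestion, and the "..." stands for the remaining compile flags, which are unchanged):

```make
# configure.user sketch: use the 64-bit Intel MPI directories throughout.
LINUX_MPIHOME   = /opt/intel/impi/3.2.2
MFC             = $(LINUX_MPIHOME)/bin64/mpiifort
MCC             = $(LINUX_MPIHOME)/bin64/mpiicc
MLD             = $(LINUX_MPIHOME)/bin64/mpiifort
FCFLAGS         = -I$(LINUX_MPIHOME)/include64 -I. ...
LOCAL_LIBRARIES = -L$(LINUX_MPIHOME)/lib64
```

The separate edit to line 7 of MPP/RSL/RSL/makefile.linux is still needed because that makefile carries its own include path rather than taking everything from configure.user.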