Intel® MPI Library

Problem launching a Fortran executable and a Python file with mpiexec on a Linux cluster

helires
Beginner
Hello,

I am having trouble launching a Fortran executable and a Python file with mpiexec on a Linux cluster. The executable is built with ifort, linking the MSC Marc code with a Fortran user subroutine; the Python code uses mpi4py (linking Python with MPI).
The cluster produces the following output (the complete output is at the end):

Fatal error in MPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(394).....: Initialization failed
MPID_Init(118)............: channel initialization failed
MPIDI_CH3_Init(43)........:
MPID_nem_init(202)........:
MPIDI_CH3I_Seg_commit(363): PMI_KVS_Get returned -1
Killed (signal 9)
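
For context, here is roughly what the Python side does at startup (a minimal sketch, not the actual MasterMarc3D_simple.py; with mpi4py, the import itself calls MPI_Init_thread, which is exactly the call that fails above):

from mpi4py import MPI                                    # the import triggers MPI_Init_thread
comm = MPI.COMM_WORLD                                     # under the MPMD launch, Marc and Python share this communicator
print "Python side: rank %d of %d" % (comm.Get_rank(), comm.Get_size())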

Let me explain more about the codes used:

(1) the MSC Marc code is linked with the MPI shipped by MSC Software, which is essentially Intel MPI;

(2) the Python code uses MPICH, which I built myself. According to the MPICH2 FAQ, it should interoperate with code based on Intel MPI. I deliberately took an old version of MPICH, version 1.2, to avoid incompatibilities with Intel MPI (the mpif.h headers are compatible).
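
One way to check which MPI library each side actually resolves at run time is to run ldd on the two binaries (the Python path is the one from my installation; the Marc binary path below is hypothetical, since script.marc is only a wrapper script):

ldd /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi | grep -i mpi
ldd /wrk3/helires/bin/marc/marc2010.2/bin/marc | grep -i mpi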

A digression about the objective of this procedure: MSC Marc is a structural code; by writing a Python interface code, it is possible to simulate fluid-structure interaction. The CFD code also has a Python interface. Exchange between the heterogeneous codes is handled by Python, along the lines sketched just below. End of digression.
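
The exchange pattern is roughly the following (a schematic sketch only; the ranks, tags, and data are invented for illustration and are not the real interface):

from mpi4py import MPI

comm = MPI.COMM_WORLD
if comm.Get_size() >= 2:                          # only meaningful in an MPMD/MPI run
    if comm.Get_rank() == 0:                      # e.g. the structural side
        comm.send({"disp": [0.0, 0.0]}, dest=1, tag=1)   # send displacements
        loads = comm.recv(source=1, tag=2)               # receive fluid loads back
    else:                                         # e.g. the fluid side
        disp = comm.recv(source=0, tag=1)
        comm.send({"load": [1.0, 0.0]}, dest=0, tag=2)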

The above procedure worked on an HP cluster, where the MPI code is the same for MSC Marc and for the Python code, namely HP-MPI. Some tests have been made of the installation on the new Linux cluster:

(1) the following instruction works:
mpiexec -np 1 small-fortran (executable) : -np 1 python small-python-code.py

(2) same for:
mpiexec -np 1 script.marc (executable of MSC Marc)

(3) same for:
mpiexec -np 1 script.marc : -np 1 script.marc

(4) same for:
mpiexec -np 1 small-fortran : -np 1 small-fortran

So what is wrong with

mpiexec -np 1 script.marc : -np 1 fortran-executable

I don't have the slightest idea. If anybody has an idea, I would be indebted to hear it.
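
For reference, the same MPMD launch can also be written with a configuration file (a sketch only; I am assuming Hydra's -configfile option here, and config.txt is a hypothetical file name):

# config.txt: one line of mpiexec arguments per executable
-np 1 script.marc
-np 1 fortran-executable

mpiexec -configfile config.txt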

Regards,



======================== Complete output

mpiexec --verbose -genv I_MPI_FALLBACK_DEVICE 0 -np 1 /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm -dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes : -np 1 /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py
[mpiexec@bigblue]
[mpiexec@bigblue] =================================================
[mpiexec@bigblue] mpiexec options:
[mpiexec@bigblue] ----------------
[mpiexec@bigblue] Base path: /wrk3/helires/bin/mpich2-install/bin/
[mpiexec@bigblue] Proxy port: 9899
[mpiexec@bigblue] Bootstrap server: ssh
[mpiexec@bigblue] Debug level: 1
[mpiexec@bigblue] Enable X: -1
[mpiexec@bigblue] Working dir: /wrk3/helires/CFD_ADM
[mpiexec@bigblue] Host file: HYDRA_USE_LOCALHOST
[mpiexec@bigblue]
[mpiexec@bigblue] Global environment:
[mpiexec@bigblue] -------------------
[mpiexec@bigblue] REMOTEHOST=nanopus.frlab
[mpiexec@bigblue] HOSTNAME=bigblue
[mpiexec@bigblue] MSC_LICENSE_NOQUEUE=yes
[mpiexec@bigblue] HOST=bigblue
[mpiexec@bigblue] TERM=dtterm
[mpiexec@bigblue] SHELL=/bin/csh
[mpiexec@bigblue] SSH_CLIENT=125.1.5.218 62405 22
[mpiexec@bigblue] MSC_LICENCE_FILE=1700@adriatic
[mpiexec@bigblue] QTDIR=/usr/lib64/qt-3.3
[mpiexec@bigblue] QTINC=/usr/lib64/qt-3.3/include
[mpiexec@bigblue] SSH_TTY=/dev/pts/4
[mpiexec@bigblue] GROUP=DADS
[mpiexec@bigblue] USER=helires
[mpiexec@bigblue] LD_LIBRARY_PATH=/wrk3/helires/bin/mpich2-install/lib:/opt/intel/cce/11.1/073/lib/intel64:/opt/intel/fce/10.1.026/lib:/wrk3/helires/bin/marc/marc2010.2/intelmpi/linux64/lib64/:/wrk3/helires/bin/marc/marc2010.2/lib/linux64i8:/wrk3/helires/bin/marc/marc2010.2/lib_shared/linux64:/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/lib64:/usr/lib64/:
[mpiexec@bigblue] LS_COLORS=no
[mpiexec@bigblue] HOSTTYPE=x86_64-linux
[mpiexec@bigblue] KDEDIR=/usr
[mpiexec@bigblue] MAIL=/var/spool/mail/helires
[mpiexec@bigblue] PATH=.:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/Python-2.7_install/bin:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/util:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin
[mpiexec@bigblue] INPUTRC=/etc/inputrc
[mpiexec@bigblue] PWD=/wrk3/helires/CFD_ADM
[mpiexec@bigblue] LANG=fr_FR.UTF-8
[mpiexec@bigblue] KDE_IS_PRELINKED=1
[mpiexec@bigblue] PS1=`hostname`>>
[mpiexec@bigblue] LM_LICENSE_FILE=1700@adriatic
[mpiexec@bigblue] SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
[mpiexec@bigblue] SHLVL=3
[mpiexec@bigblue] HOME=/wrk3/helires
[mpiexec@bigblue] OSTYPE=linux
[mpiexec@bigblue] CFLAGS=-fPIC
[mpiexec@bigblue] VENDOR=unknown
[mpiexec@bigblue] PYTHONPATH=/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/plat-linux2:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-tk:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-dynload:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/site-packages
[mpiexec@bigblue] MACHTYPE=x86_64
[mpiexec@bigblue] LOGNAME=helires
[mpiexec@bigblue] QTLIB=/usr/lib64/qt-3.3/lib
[mpiexec@bigblue] SSH_CONNECTION=125.1.5.218 62405 125.1.6.45 22
[mpiexec@bigblue] LESSOPEN=|/usr/bin/lesspipe.sh %s
[mpiexec@bigblue] DISPLAY=hudson:0
[mpiexec@bigblue] G_BROKEN_FILENAMES=1
[mpiexec@bigblue] OLDPWD=/wrk3/helires
[mpiexec@bigblue] _=/bin/csh
[mpiexec@bigblue]
[mpiexec@bigblue] User set environment:
[mpiexec@bigblue] ---------------------
[mpiexec@bigblue] I_MPI_FALLBACK_DEVICE=0
[mpiexec@bigblue]

[mpiexec@bigblue] Executable information:
[mpiexec@bigblue] **********************
[mpiexec@bigblue] Executable ID: 1
[mpiexec@bigblue] -----------------
[mpiexec@bigblue] Process count: 1
[mpiexec@bigblue] Executable: /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm -dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes
[mpiexec@bigblue]
[mpiexec@bigblue] Executable ID: 2
[mpiexec@bigblue] -----------------
[mpiexec@bigblue] Process count: 1
[mpiexec@bigblue] Executable: /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py
[mpiexec@bigblue]
[mpiexec@bigblue] Partition information:
[mpiexec@bigblue] *********************
[mpiexec@bigblue] Partition ID: 1
[mpiexec@bigblue] -----------------
[mpiexec@bigblue] Partition name: localhost
[mpiexec@bigblue] Process count: 1
[mpiexec@bigblue]
[mpiexec@bigblue] Partition segment list:
[mpiexec@bigblue] .......................
[mpiexec@bigblue] Start PID: 0; Process count: 1
[mpiexec@bigblue]
[mpiexec@bigblue] Partition exec list:
[mpiexec@bigblue] ....................
[mpiexec@bigblue] Exec: /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc; Process count: 1
[mpiexec@bigblue] Exec: /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi; Process count: 1
[mpiexec@bigblue]
[mpiexec@bigblue] =================================================

[mpiexec@bigblue] Timeout set to -1 (-1 means infinite)
[mpiexec@bigblue] Got a PMI port string of bigblue:39414
[mpiexec@bigblue] Got a proxy port string of bigblue:51839
Arguments being passed to proxy 0:
--global-core-count 1 --wdir /wrk3/helires/CFD_ADM --pmi-port-str bigblue:39414 --binding HYDRA_NULL HYDRA_NULL --bindlib plpa --ckpointlib none --ckpoint-prefix HYDRA_NULL --global-inherited-env 41 'REMOTEHOST=nanopus.frlab' 'HOSTNAME=bigblue' 'MSC_LICENSE_NOQUEUE=yes' 'HOST=bigblue' 'TERM=dtterm' 'SHELL=/bin/csh' 'SSH_CLIENT=125.1.5.218 62405 22' 'MSC_LICENCE_FILE=1700@adriatic' 'QTDIR=/usr/lib64/qt-3.3' 'QTINC=/usr/lib64/qt-3.3/include' 'SSH_TTY=/dev/pts/4' 'GROUP=DADS' 'USER=helires' 'LD_LIBRARY_PATH=/wrk3/helires/bin/mpich2-install/lib:/opt/intel/cce/11.1/073/lib/intel64:/opt/intel/fce/10.1.026/lib:/wrk3/helires/bin/marc/marc2010.2/intelmpi/linux64/lib64/:/wrk3/helires/bin/marc/marc2010.2/lib/linux64i8:/wrk3/helires/bin/marc/marc2010.2/lib_shared/linux64:/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/lib64:/usr/lib64/:' 'LS_COLORS=no' 'HOSTTYPE=x86_64-linux' 'KDEDIR=/usr' 'MAIL=/var/spool/mail/helires' 'PATH=.:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/Python-2.7_install/bin:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/util:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin' 'INPUTRC=/etc/inputrc' 'PWD=/wrk3/helires/CFD_ADM' 'LANG=fr_FR.UTF-8' 'KDE_IS_PRELINKED=1' 'PS1=`hostname`>> ' 'LM_LICENSE_FILE=1700@adriatic' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'SHLVL=3' 'HOME=/wrk3/helires' 'OSTYPE=linux' 'CFLAGS=-fPIC' 'VENDOR=unknown' 'PYTHONPATH=/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/plat-linux2:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-tk:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-dynload:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/site-packages' 'MACHTYPE=x86_64' 'LOGNAME=helires' 'QTLIB=/usr/lib64/qt-3.3/lib' 'SSH_CONNECTION=125.1.5.218 62405 125.1.6.45 22' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'DISPLAY=hudson:0' 'G_BROKEN_FILENAMES=1' 'OLDPWD=/wrk3/helires' '_=/bin/csh' --global-user-env 1 'I_MPI_FALLBACK_DEVICE=0' --global-system-env 0 --genv-prop 1 --segment --segment-start-pid 0 --segment-proc-count 1 --exec --exec-proc-count 1 --exec-local-env 0 --exec-env-prop 0 /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm -dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes --exec --exec-proc-count 1 --exec-local-env 0 --exec-env-prop 0 /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py

[mpiexec@bigblue] Launching process: /usr/bin/ssh -x localhost /wrk3/helires/bin/mpich2-install/bin/pmi_proxy --launch-mode 1 --proxy-port bigblue:51839 --debug --bootstrap ssh --partition-id 0
helires@localhost's password:
Marc mod4_rotor_adm begins execution

(c) COPYRIGHT 2011 MSC.Software Corporation, all rights reserved


VERSION: Marc, Version, Build, Date



Date: Fri Jul 22 15:02:20 2011

Marc execution begins
Date: Fri Jul 22 15:02:21 2011
MSC Id: 0017a4770030 (ethernet) (Linux)
Hostname: bigblue (user helires, display )
License files: 1700@adriatic
CEID: 77F66039-BC7GFC45
User: helires
Display:
LAPI Version: LAPI 8.3.1-2041 (FLEXlm 10.8.6.0)
Acquired 160 licenses for Group CAMPUS (Marc) from license server on host adriatic


general memory initially set to = 25 MByte

maximum available memory set to = 5000 MByte

general memory increasing from 25 MByte to 106 MByte

MSC Customer Entitlement ID
77F66039-BC7GFC45

wall time = 2.52

wall time = 3.60

general memory increasing from 106 MByte to 552 MByte
Appel SB UBGINC!
flag= F ;ierr= 0
Fatal error in MPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(394).....: Initialization failed
MPID_Init(118)............: channel initialization failed
MPIDI_CH3_Init(43)........:
MPID_nem_init(202)........:
MPIDI_CH3I_Seg_commit(363): PMI_KVS_Get returned -1
Killed (signal 9)
[helires@bigblue ~/CFD_ADM]$ mpiexec --verbose -np 1 /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm -dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes : -np 1 /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py
[mpiexec@bigblue]
[mpiexec@bigblue] =================================================
[mpiexec@bigblue] mpiexec options:
[mpiexec@bigblue] ----------------
[mpiexec@bigblue] Base path: /wrk3/helires/bin/mpich2-install/bin/
[mpiexec@bigblue] Proxy port: 9899
[mpiexec@bigblue] Bootstrap server: ssh
[mpiexec@bigblue] Debug level: 1
[mpiexec@bigblue] Enable X: -1
[mpiexec@bigblue] Working dir: /wrk3/helires/CFD_ADM
[mpiexec@bigblue] Host file: HYDRA_USE_LOCALHOST
[mpiexec@bigblue]
[mpiexec@bigblue] Global environment:
[mpiexec@bigblue] -------------------
[mpiexec@bigblue] REMOTEHOST=nanopus.frlab
[mpiexec@bigblue] HOSTNAME=bigblue
[mpiexec@bigblue] MSC_LICENSE_NOQUEUE=yes
[mpiexec@bigblue] HOST=bigblue
[mpiexec@bigblue] TERM=dtterm
[mpiexec@bigblue] SHELL=/bin/csh
[mpiexec@bigblue] SSH_CLIENT=125.1.5.218 62405 22
[mpiexec@bigblue] MSC_LICENCE_FILE=1700@adriatic
[mpiexec@bigblue] QTDIR=/usr/lib64/qt-3.3
[mpiexec@bigblue] QTINC=/usr/lib64/qt-3.3/include
[mpiexec@bigblue] SSH_TTY=/dev/pts/4
[mpiexec@bigblue] GROUP=DADS
[mpiexec@bigblue] USER=helires
[mpiexec@bigblue] LD_LIBRARY_PATH=/wrk3/helires/bin/mpich2-install/lib:/opt/intel/cce/11.1/073/lib/intel64:/opt/intel/fce/10.1.026/lib:/wrk3/helires/bin/marc/marc2010.2/intelmpi/linux64/lib64/:/wrk3/helires/bin/marc/marc2010.2/lib/linux64i8:/wrk3/helires/bin/marc/marc2010.2/lib_shared/linux64:/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/lib64:/usr/lib64/:
[mpiexec@bigblue] LS_COLORS=no
[mpiexec@bigblue] HOSTTYPE=x86_64-linux
[mpiexec@bigblue] KDEDIR=/usr
[mpiexec@bigblue] MAIL=/var/spool/mail/helires
[mpiexec@bigblue] PATH=.:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/Python-2.7_install/bin:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/util:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin
[mpiexec@bigblue] INPUTRC=/etc/inputrc
[mpiexec@bigblue] PWD=/wrk3/helires/CFD_ADM
[mpiexec@bigblue] LANG=fr_FR.UTF-8
[mpiexec@bigblue] KDE_IS_PRELINKED=1
[mpiexec@bigblue] PS1=`hostname`>>
[mpiexec@bigblue] LM_LICENSE_FILE=1700@adriatic
[mpiexec@bigblue] SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
[mpiexec@bigblue] SHLVL=3
[mpiexec@bigblue] HOME=/wrk3/helires
[mpiexec@bigblue] OSTYPE=linux
[mpiexec@bigblue] CFLAGS=-fPIC
[mpiexec@bigblue] VENDOR=unknown
[mpiexec@bigblue] PYTHONPATH=/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/plat-linux2:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-tk:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-dynload:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/site-packages
[mpiexec@bigblue] MACHTYPE=x86_64
[mpiexec@bigblue] LOGNAME=helires
[mpiexec@bigblue] QTLIB=/usr/lib64/qt-3.3/lib
[mpiexec@bigblue] SSH_CONNECTION=125.1.5.218 62405 125.1.6.45 22
[mpiexec@bigblue] LESSOPEN=|/usr/bin/lesspipe.sh %s
[mpiexec@bigblue] DISPLAY=hudson:0
[mpiexec@bigblue] G_BROKEN_FILENAMES=1
[mpiexec@bigblue] OLDPWD=/wrk3/helires
[mpiexec@bigblue] _=/bin/csh
[mpiexec@bigblue]

[mpiexec@bigblue] Executable information:
[mpiexec@bigblue] **********************
[mpiexec@bigblue] Executable ID: 1
[mpiexec@bigblue] -----------------
[mpiexec@bigblue] Process count: 1
[mpiexec@bigblue] Executable: /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm -dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes
[mpiexec@bigblue]
[mpiexec@bigblue] Executable ID: 2
[mpiexec@bigblue] -----------------
[mpiexec@bigblue] Process count: 1
[mpiexec@bigblue] Executable: /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py
[mpiexec@bigblue]
[mpiexec@bigblue] Partition information:
[mpiexec@bigblue] *********************
[mpiexec@bigblue] Partition ID: 1
[mpiexec@bigblue] -----------------
[mpiexec@bigblue] Partition name: localhost
[mpiexec@bigblue] Process count: 1
[mpiexec@bigblue]
[mpiexec@bigblue] Partition segment list:
[mpiexec@bigblue] .......................
[mpiexec@bigblue] Start PID: 0; Process count: 1
[mpiexec@bigblue]
[mpiexec@bigblue] Partition exec list:
[mpiexec@bigblue] ....................
[mpiexec@bigblue] Exec: /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc; Process count: 1
[mpiexec@bigblue] Exec: /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi; Process count: 1
[mpiexec@bigblue]
[mpiexec@bigblue] =================================================

[mpiexec@bigblue] Timeout set to -1 (-1 means infinite)
[mpiexec@bigblue] Got a PMI port string of bigblue:33253
[mpiexec@bigblue] Got a proxy port string of bigblue:55923
Arguments being passed to proxy 0:
--global-core-count 1 --wdir /wrk3/helires/CFD_ADM --pmi-port-str bigblue:33253 --binding HYDRA_NULL HYDRA_NULL --bindlib plpa --ckpointlib none --ckpoint-prefix HYDRA_NULL --global-inherited-env 41 'REMOTEHOST=nanopus.frlab' 'HOSTNAME=bigblue' 'MSC_LICENSE_NOQUEUE=yes' 'HOST=bigblue' 'TERM=dtterm' 'SHELL=/bin/csh' 'SSH_CLIENT=125.1.5.218 62405 22' 'MSC_LICENCE_FILE=1700@adriatic' 'QTDIR=/usr/lib64/qt-3.3' 'QTINC=/usr/lib64/qt-3.3/include' 'SSH_TTY=/dev/pts/4' 'GROUP=DADS' 'USER=helires' 'LD_LIBRARY_PATH=/wrk3/helires/bin/mpich2-install/lib:/opt/intel/cce/11.1/073/lib/intel64:/opt/intel/fce/10.1.026/lib:/wrk3/helires/bin/marc/marc2010.2/intelmpi/linux64/lib64/:/wrk3/helires/bin/marc/marc2010.2/lib/linux64i8:/wrk3/helires/bin/marc/marc2010.2/lib_shared/linux64:/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/lib64:/usr/lib64/:' 'LS_COLORS=no' 'HOSTTYPE=x86_64-linux' 'KDEDIR=/usr' 'MAIL=/var/spool/mail/helires' 'PATH=.:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/Python-2.7_install/bin:/wrk3/helires/bin/mpich2-install/bin:/wrk3/helires/bin/util:/usr/lib64/qt-3.3/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin' 'INPUTRC=/etc/inputrc' 'PWD=/wrk3/helires/CFD_ADM' 'LANG=fr_FR.UTF-8' 'KDE_IS_PRELINKED=1' 'PS1=`hostname`>> ' 'LM_LICENSE_FILE=1700@adriatic' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'SHLVL=3' 'HOME=/wrk3/helires' 'OSTYPE=linux' 'CFLAGS=-fPIC' 'VENDOR=unknown' 'PYTHONPATH=/wrk3/helires/bin/Python-2.7_install/lib/python2.7:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/plat-linux2:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-tk:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/lib-dynload:/wrk3/helires/bin/Python-2.7_install/lib/python2.7/site-packages' 'MACHTYPE=x86_64' 'LOGNAME=helires' 'QTLIB=/usr/lib64/qt-3.3/lib' 'SSH_CONNECTION=125.1.5.218 62405 125.1.6.45 22' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'DISPLAY=hudson:0' 'G_BROKEN_FILENAMES=1' 'OLDPWD=/wrk3/helires' '_=/bin/csh' --global-user-env 0 --global-system-env 0 --genv-prop 1 --segment --segment-start-pid 0 --segment-proc-count 1 --exec --exec-proc-count 1 --exec-local-env 0 --exec-env-prop 0 /wrk3/helires/CFD_ADM/CouplingScript3D_simple.marc -jid mod4_rotor_adm -dirjid /wrk3/helires/CFD_ADM -maxnum 1000000 -nthread 1 -dirjob /wrk3/helires/CFD_ADM -ml 5000 -ci yes -cr yes --exec --exec-proc-count 1 --exec-local-env 0 --exec-env-prop 0 /wrk3/helires/bin/Python-2.7_install/bin/python2.7-mpi /wrk3/helires/CFD_ADM/MasterMarc3D_simple.py

[mpiexec@bigblue] Launching process: /usr/bin/ssh -x localhost /wrk3/helires/bin/mpich2-install/bin/pmi_proxy --launch-mode 1 --proxy-port bigblue:55923 --debug --bootstrap ssh --partition-id 0
helires@localhost's password:
Marc mod4_rotor_adm begins execution

(c) COPYRIGHT 2011 MSC.Software Corporation, all rights reserved


VERSION: Marc, Version, Build, Date



Date: Fri Jul 22 15:02:39 2011

Marc execution begins
Date: Fri Jul 22 15:02:39 2011
MSC Id: 0017a4770030 (ethernet) (Linux)
Hostname: bigblue (user helires, display )
License files: 1700@adriatic
CEID: 77F66039-BC7GFC45
User: helires
Display:
LAPI Version: LAPI 8.3.1-2041 (FLEXlm 10.8.6.0)
Acquired 160 licenses for Group CAMPUS (Marc) from license server on host adriatic


general memory initially set to = 25 MByte

maximum available memory set to = 5000 MByte

general memory increasing from 25 MByte to 106 MByte

MSC Customer Entitlement ID
77F66039-BC7GFC45

wall time = 2.60

wall time = 3.67

general memory increasing from 106 MByte to 552 MByte
Appel SB UBGINC!
flag= F ;ierr= 0
Fatal error in MPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(394).....: Initialization failed
MPID_Init(118)............: channel initialization failed
MPIDI_CH3_Init(43)........:
MPID_nem_init(202)........:
MPIDI_CH3I_Seg_commit(363): PMI_KVS_Get returned -1
Killed (signal 9)

2 Replies
Dmitry_K_Intel2
Employee
Hello,
Does Marc work with the default (Intel MPI) library?

>According to the FAQ of MPICH2, it would work with code based on Intel MPI.
Well, it may work, and it may not.
The error you see comes from the Hydra (process manager) communication layer. To support the Intel MPI Library, this part of the process manager was modified, and it is very likely that this part of the library is incompatible with the MPICH2 version of mpiexec.
I am not sure how we can help you resolve this issue.
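
If it is an option for you, the cleanest way out is usually to put both executables on a single MPI stack: rebuild mpi4py against the Intel MPI Library and launch everything with Intel MPI's own mpiexec, so that one process manager provides PMI to both sides. A rough sketch (the Intel MPI path is illustrative for your system; I am assuming mpi4py's standard setup.py build option):

cd mpi4py-<version>
python setup.py build --mpicc=/opt/intel/impi/<version>/bin64/mpicc
python setup.py install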

Regards!
Dmitry
helires
Beginner
Hello Dmitry,

Thanks for your reply.

>Does Marc work with the default (Intel MPI) library?

Yes, MSC.Marc uses Intel MPI.

>>According to the FAQ of MPICH2, it would work with code based on Intel MPI.
>Well, it may work, and it may not.
>The error you see comes from the Hydra (process manager) communication layer. To support the Intel MPI Library, this part of the process manager was modified, and it is very likely that this part of the library is incompatible with the MPICH2 version of mpiexec.
>I am not sure how we can help you resolve this issue.

Obviously, it doesn't work.

Regards,