############################################################
Begin Prologue: Tue May 22 16:19:23 EDT 2012
PBS Job ID:       645
PBS User Name:    userfoo
Fair Share Group: generalGrp
PBS Account Name:
PBS Job Name:     trivial
PBS Queue:        rhel6
Resource Request: cput=01:20:00,mem=16gb,neednodes=2:ppn=8,nodes=2:ppn=8,pmem=1gb,walltime=00:10:00
Nodes Assigned:   bc07bl01,bc07bl02
Creating /scratch/645 on bc07bl01
Creating /scratch/645 on bc07bl02
End Prologue: Tue May 22 16:19:23 EDT 2012
############################################################
Warning: no access to tty (Bad file descriptor).
Thus no job control in this shell.
tput: No value for $TERM and no -T specified

==================================================================================================
mpiexec options:
----------------
  Base path: /rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/
  Bootstrap server: ssh
  Debug level: 1
  Enable X: -1

  Global environment:
  -------------------
    I_MPI_PERHOST=allcores
    MKLROOT=/rhel6/opt/intel/ics2012/mkl
    MKL_MOD=/rhel6/opt/intel/ics2012/mkl/include/intel64/lp64
    MANPATH=/rhel6/opt/intel/ics2012/inspector_xe/man:/rhel6/opt/intel/ics2012/vtune_amplifier_xe/man:/rhel6/opt/intel/ics2012/itac/8.0.3.007/man:/rhel6/opt/intel/ics2012/impi/4.0.3.008/man:/home/userfoo/man:/usr/share/man/overrides:/usr/share/man:/usr/local/share/man
    AR=xiar
    VT_MPI=impi4
    HOSTNAME=bc07bl01
    intel_mpirun=/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/mpirun
    PBS_VERSION=TORQUE-2.5.7
    I_MPI_F77=ifort
    IPPROOT=/rhel6/opt/intel/ics2012/ipp
    INTEL_LICENSE_FILE=/rhel6/opt/intel/ics2012/licenses:/opt/intel/licenses:/home/userfoo/intel/licenses
    HOST=bc07bl01
    SHELL=/bin/tcsh
    MAUI_PREFIX=/opt/maui
    PBS_JOBNAME=trivial
    LIBRARY_PATH=/rhel6/opt/intel/ics2012/ipp/lib/intel64:/rhel6/opt/intel/ics2012/composerxe/lib/intel64:/rhel6/opt/intel/ics2012/mkl/lib/intel64:/rhel6/opt/intel/ics2012/tbb/lib/intel64//cc4.1.0_libc2.4_kernel2.6.16.21
    intel_mpiexec=/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/mpiexec
    NCARG_FONTCAPS=/usr/lib64/ncarg/fontcaps
    FPATH=/rhel6/opt/intel/ics2012/mkl/include:/rhel6/opt/intel/ics2012/mkl/include/intel64/lp64
    PBS_ENVIRONMENT=PBS_BATCH
    QTDIR=/usr/lib64/qt-3.3
    QTINC=/usr/lib64/qt-3.3/include
    PBS_O_WORKDIR=/home/tests/intelmpi/TrivialMPI
    GROUP=generalGrp
    PBS_TASKNUM=1
    USER=userfoo
    LD_LIBRARY_PATH=/rhel6/opt/intel/ics2012/composerxe/lib/intel64:/rhel6/opt/intel/ics2012/mkl/lib/intel64:/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/slib_impi4:/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/lib:/rhel6/opt/intel/ics2012/tbb/lib/intel64//cc4.1.0_libc2.4_kernel2.6.16.21:/rhel6/opt/intel/ics2012/composerxe/debugger/lib/intel64:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.10.319/mpirt/lib/intel64
    LS_COLORS=(null)
    PBS_O_HOME=/home/userfoo
    VTUNEROOT=/rhel6/opt/intel/ics2012/vtune_amplifier_xe
    CPATH=/rhel6/opt/intel/ics2012/tbb/include
    HOSTTYPE=x86_64-linux
    PBS_GPUFILE=/var/lib/torque/aux//645
    PBS_MOMPORT=15003
    NCARG_GRAPHCAPS=/usr/lib64/ncarg/graphcaps
    CPP=icc -E
    PBS_O_QUEUE=rhel6
    NLSPATH=/rhel6/opt/intel/ics2012/composerxe/debugger/intel64/locale/%l_%t/%N
    VT_ADD_LIBS=-ldwarf -lelf -lvtunwind -lnsl -lm -ldl -lpthread
    INSPXEROOT=/rhel6/opt/intel/ics2012/inspector_xe
    MAIL=/var/spool/mail/userfoo
    PBS_O_LOGNAME=userfoo
    PATH=/rhel6/opt/mpiexec/0.84-intel-2012/bin:/rhel6/opt/intel/ics2012/bin:/rhel6/opt/intel/ics2012/inspector_xe/bin64:/rhel6/opt/intel/ics2012/vtune_amplifier_xe/bin64:/rhel6/opt/intel/ics2012/itac/8.0.3.007/bin:/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin:/home/userfoo/bin:/usr/lib64/qt-3.3/bin:/bin:/usr/bin:/usr/local/bin:/usr/lpp/mmfs/bin:/opt/maui/bin:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.10.319/mpirt/bin/intel64
    PBS_O_LANG=en_US.UTF-8
    PBS_JOBCOOKIE=C4EB75184B5D828AEB30F2C195CF619F
    TBBROOT=/rhel6/opt/intel/ics2012/tbb
    NLS_PATH=/rhel6/opt/intel/ics2012/composerxe/lib/intel64/locale/%l_%t/%N:/rhel6/opt/intel/ics2012/ipp/lib/intel64/locale/%l_%t/%N
    F90=ifort
    PWD=/home/tests/intelmpi/TrivialMPI
    _LMFILES_=/usr/share/Modules/modulefiles/use.own:/home/userfoo/privatemodules/compilers/intel-2012-lp64-userfoo
    NCARG_ROOT=/usr
    EDITOR=vim
    F95=ifort
    KDE_IS_PRELINKED=1
    PBS_NODENUM=0
    LANG=C
    NCARG_DATABASE=/usr/lib64/ncarg/database
    MODULEPATH=/usr/share/Modules/modulefiles:/etc/modulefiles:/home/userfoo/privatemodules
    GPFSDIR=/usr/lpp/mmfs
    VT_LIB_DIR=/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/lib_impi4
    LOADEDMODULES=use.own:compilers/intel-2012-lp64-userfoo
    KDEDIRS=/usr
    PBS_NUM_NODES=2
    I_MPI_F90=ifort
    F77=ifort
    PBS_O_SHELL=/bin/tcsh
    VT_ROOT=/rhel6/opt/intel/ics2012/itac/8.0.3.007
    I_MPI_CC=icc
    PBS_SERVER=rhel6pbs
    LM_LICENSE_FILE=7496@flexlm
    PBS_JOBID=645
    MKL_LP64_ILP64=lp64
    CXX=icpc
    NCARG_LIB=/usr/lib64/ncarg
    ENVIRONMENT=BATCH
    NCARG_NCARG=/usr/share/ncarg
    SHLVL=2
    HOME=/home/userfoo
    I_MPI_CXX=icpc
    OSTYPE=linux
    PBS_O_HOST=rhel6head2
    VT_SLIB_DIR=/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/slib_impi4
    FC=ifort
    I_MPI_FC=ifort
    VENDOR=unknown
    PBS_VNODENUM=0
    MACHTYPE=x86_64
    LOGNAME=userfoo
    VISUAL=vim
    QTLIB=/usr/lib64/qt-3.3/lib
    CLASSPATH=/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/lib_impi4
    PBS_QUEUE=rhel6
    MODULESHOME=/usr/share/Modules
    LESSOPEN=|/usr/bin/lesspipe.sh %s
    PBS_O_MAIL=/var/spool/mail/userfoo
    CC=icc
    PBS_NP=16
    PBS_NUM_PPN=8
    INCLUDE=/rhel6/opt/intel/ics2012/ipp/include
    G_BROKEN_FILENAMES=1
    PBS_NODEFILE=/var/lib/torque/aux//645
    I_MPI_ROOT=/rhel6/opt/intel/ics2012/impi/4.0.3.008
    IDBROOT=/rhel6/opt/intel/ics2012/composerxe/debugger
    PBS_O_PATH=/rhel6/opt/mpiexec/0.84-intel-2012/bin:/rhel6/opt/intel/ics2012/bin:/rhel6/opt/intel/ics2012/inspector_xe/bin64:/rhel6/opt/intel/ics2012/vtune_amplifier_xe/bin64:/rhel6/opt/intel/ics2012/itac/8.0.3.007/bin:/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.6.233/bin/intel64:/rhel6/opt/intel/ics2012/vtune_amplifier_xe_2011/bin64:/home/userfoo/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/lpp/mmfs/bin:/opt/maui/bin:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.6.233/mpirt/bin/intel64:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.10.319/mpirt/bin/intel64
    _=/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/mpiexec.hydra
    OLDPWD=/home/userfoo

  Proxy information:
  *********************
    Proxy ID:  1
    -----------------
      Proxy name: bc07bl01
      Process count: 8
      Start PID: 0

      Proxy exec list:
      ....................
        Exec: ./hello_mpi; Process count: 8

    Proxy ID:  2
    -----------------
      Proxy name: bc07bl02
      Process count: 8
      Start PID: 8

      Proxy exec list:
      ....................
        Exec: ./hello_mpi; Process count: 8

==================================================================================================

[mpiexec@bc07bl01] Timeout set to -1 (-1 means infinite)
[mpiexec@bc07bl01] Got a control port string of bc07bl01:38382

Proxy launch args: /rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/pmi_proxy --control-port bc07bl01:38382 --debug --pmi-connect lazy-cache --pmi-aggregate -s 0 --bootstrap ssh --demux poll --pgid 0 --enable-stdin 1 --proxy-id

[mpiexec@bc07bl01] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1

Arguments being passed to proxy 0:
--version 1.3 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname bc07bl01 --global-core-count 16 --global-process-count 16 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_11431_0 --pmi-process-mapping (vector,(0,2,8)) --bindlib ipl --ckpoint-num -1 --global-inherited-env 108 'I_MPI_PERHOST=allcores' 'MKLROOT=/rhel6/opt/intel/ics2012/mkl' 'MKL_MOD=/rhel6/opt/intel/ics2012/mkl/include/intel64/lp64' 'MANPATH=/rhel6/opt/intel/ics2012/inspector_xe/man:/rhel6/opt/intel/ics2012/vtune_amplifier_xe/man:/rhel6/opt/intel/ics2012/itac/8.0.3.007/man:/rhel6/opt/intel/ics2012/impi/4.0.3.008/man:/home/userfoo/man:/usr/share/man/overrides:/usr/share/man:/usr/local/share/man' 'AR=xiar' 'VT_MPI=impi4' 'HOSTNAME=bc07bl01' 'intel_mpirun=/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/mpirun' 'PBS_VERSION=TORQUE-2.5.7' 'I_MPI_F77=ifort' 'IPPROOT=/rhel6/opt/intel/ics2012/ipp' 'INTEL_LICENSE_FILE=/rhel6/opt/intel/ics2012/licenses:/opt/intel/licenses:/home/userfoo/intel/licenses' 'HOST=bc07bl01' 'SHELL=/bin/tcsh' 'MAUI_PREFIX=/opt/maui' 'PBS_JOBNAME=trivial' 'LIBRARY_PATH=/rhel6/opt/intel/ics2012/ipp/lib/intel64:/rhel6/opt/intel/ics2012/composerxe/lib/intel64:/rhel6/opt/intel/ics2012/mkl/lib/intel64:/rhel6/opt/intel/ics2012/tbb/lib/intel64//cc4.1.0_libc2.4_kernel2.6.16.21' 'intel_mpiexec=/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/mpiexec' 'NCARG_FONTCAPS=/usr/lib64/ncarg/fontcaps'
'FPATH=/rhel6/opt/intel/ics2012/mkl/include:/rhel6/opt/intel/ics2012/mkl/include/intel64/lp64' 'PBS_ENVIRONMENT=PBS_BATCH' 'QTDIR=/usr/lib64/qt-3.3' 'QTINC=/usr/lib64/qt-3.3/include' 'PBS_O_WORKDIR=/home/tests/intelmpi/TrivialMPI' 'GROUP=generalGrp' 'PBS_TASKNUM=1' 'USER=userfoo' 'LD_LIBRARY_PATH=/rhel6/opt/intel/ics2012/composerxe/lib/intel64:/rhel6/opt/intel/ics2012/mkl/lib/intel64:/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/slib_impi4:/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/lib:/rhel6/opt/intel/ics2012/tbb/lib/intel64//cc4.1.0_libc2.4_kernel2.6.16.21:/rhel6/opt/intel/ics2012/composerxe/debugger/lib/intel64:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.10.319/mpirt/lib/intel64' 'LS_COLORS=' 'PBS_O_HOME=/home/userfoo' 'VTUNEROOT=/rhel6/opt/intel/ics2012/vtune_amplifier_xe' 'CPATH=/rhel6/opt/intel/ics2012/tbb/include' 'HOSTTYPE=x86_64-linux' 'PBS_GPUFILE=/var/lib/torque/aux//645.rhel6pbsgpu' 'PBS_MOMPORT=15003' 'NCARG_GRAPHCAPS=/usr/lib64/ncarg/graphcaps' 'CPP=icc -E' 'PBS_O_QUEUE=rhel6' 'NLSPATH=/rhel6/opt/intel/ics2012/composerxe/debugger/intel64/locale/%l_%t/%N' 'VT_ADD_LIBS=-ldwarf -lelf -lvtunwind -lnsl -lm -ldl -lpthread' 'INSPXEROOT=/rhel6/opt/intel/ics2012/inspector_xe' 'MAIL=/var/spool/mail/userfoo' 'PBS_O_LOGNAME=userfoo' 'PATH=/rhel6/opt/mpiexec/0.84-intel-2012/bin:/rhel6/opt/intel/ics2012/bin:/rhel6/opt/intel/ics2012/inspector_xe/bin64:/rhel6/opt/intel/ics2012/vtune_amplifier_xe/bin64:/rhel6/opt/intel/ics2012/itac/8.0.3.007/bin:/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin:/home/userfoo/bin:/usr/lib64/qt-3.3/bin:/bin:/usr/bin:/usr/local/bin:/usr/lpp/mmfs/bin:/opt/maui/bin:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.10.319/mpirt/bin/intel64' 'PBS_O_LANG=en_US.UTF-8' 'PBS_JOBCOOKIE=C4EB75184B5D828AEB30F2C195CF619F' 'TBBROOT=/rhel6/opt/intel/ics2012/tbb' 'NLS_PATH=/rhel6/opt/intel/ics2012/composerxe/lib/intel64/locale/%l_%t/%N:/rhel6/opt/intel/ics2012/ipp/lib/intel64/locale/%l_%t/%N' 'F90=ifort' 'PWD=/home/tests/intelmpi/TrivialMPI' 
'_LMFILES_=/usr/share/Modules/modulefiles/use.own:/home/userfoo/privatemodules/compilers/intel-2012-lp64-userfoo' 'NCARG_ROOT=/usr' 'EDITOR=vim' 'F95=ifort' 'KDE_IS_PRELINKED=1' 'PBS_NODENUM=0' 'LANG=C' 'NCARG_DATABASE=/usr/lib64/ncarg/database' 'MODULEPATH=/usr/share/Modules/modulefiles:/etc/modulefiles:/home/userfoo/privatemodules' 'GPFSDIR=/usr/lpp/mmfs' 'VT_LIB_DIR=/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/lib_impi4' 'LOADEDMODULES=use.own:compilers/intel-2012-lp64-userfoo' 'KDEDIRS=/usr' 'PBS_NUM_NODES=2' 'I_MPI_F90=ifort' 'F77=ifort' 'PBS_O_SHELL=/bin/tcsh' 'VT_ROOT=/rhel6/opt/intel/ics2012/itac/8.0.3.007' 'I_MPI_CC=icc' 'PBS_SERVER=rhel6pbs' 'LM_LICENSE_FILE=7496@flexlm' 'PBS_JOBID=645.rhel6pbs' 'MKL_LP64_ILP64=lp64' 'CXX=icpc' 'NCARG_LIB=/usr/lib64/ncarg' 'ENVIRONMENT=BATCH' 'NCARG_NCARG=/usr/share/ncarg' 'SHLVL=2' 'HOME=/home/userfoo' 'I_MPI_CXX=icpc' 'OSTYPE=linux' 'PBS_O_HOST=rhel6head2' 'VT_SLIB_DIR=/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/slib_impi4' 'FC=ifort' 'I_MPI_FC=ifort' 'VENDOR=unknown' 'PBS_VNODENUM=0' 'MACHTYPE=x86_64' 'LOGNAME=userfoo' 'VISUAL=vim' 'QTLIB=/usr/lib64/qt-3.3/lib' 'CLASSPATH=/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/lib_impi4' 'PBS_QUEUE=rhel6' 'MODULESHOME=/usr/share/Modules' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'PBS_O_MAIL=/var/spool/mail/userfoo' 'CC=icc' 'PBS_NP=16' 'PBS_NUM_PPN=8' 'INCLUDE=/rhel6/opt/intel/ics2012/ipp/include' 'G_BROKEN_FILENAMES=1' 'PBS_NODEFILE=/var/lib/torque/aux//645.rhel6pbs' 'I_MPI_ROOT=/rhel6/opt/intel/ics2012/impi/4.0.3.008' 'IDBROOT=/rhel6/opt/intel/ics2012/composerxe/debugger' 
'PBS_O_PATH=/rhel6/opt/mpiexec/0.84-intel-2012/bin:/rhel6/opt/intel/ics2012/bin:/rhel6/opt/intel/ics2012/inspector_xe/bin64:/rhel6/opt/intel/ics2012/vtune_amplifier_xe/bin64:/rhel6/opt/intel/ics2012/itac/8.0.3.007/bin:/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.6.233/bin/intel64:/rhel6/opt/intel/ics2012/vtune_amplifier_xe_2011/bin64:/home/userfoo/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/lpp/mmfs/bin:/opt/maui/bin:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.6.233/mpirt/bin/intel64:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.10.319/mpirt/bin/intel64' '_=/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/mpiexec.hydra' 'OLDPWD=/home/userfoo' --global-user-env 0 --global-system-env 0 --start-pid 0 --proxy-core-count 8 --exec --exec-appnum 0 --exec-proc-count 8 --exec-local-env 0 --exec-wdir /home/tests/intelmpi/TrivialMPI --exec-args 1 ./hello_mpi

[mpiexec@bc07bl01] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1

Arguments being passed to proxy 1:
--version 1.3 --interface-env-name MPICH_INTERFACE_HOSTNAME --hostname bc07bl02 --global-core-count 16 --global-process-count 16 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_11431_0 --pmi-process-mapping (vector,(0,2,8)) --bindlib ipl --ckpoint-num -1 --global-inherited-env 108 'I_MPI_PERHOST=allcores' 'MKLROOT=/rhel6/opt/intel/ics2012/mkl' 'MKL_MOD=/rhel6/opt/intel/ics2012/mkl/include/intel64/lp64' 'MANPATH=/rhel6/opt/intel/ics2012/inspector_xe/man:/rhel6/opt/intel/ics2012/vtune_amplifier_xe/man:/rhel6/opt/intel/ics2012/itac/8.0.3.007/man:/rhel6/opt/intel/ics2012/impi/4.0.3.008/man:/home/userfoo/man:/usr/share/man/overrides:/usr/share/man:/usr/local/share/man' 'AR=xiar' 'VT_MPI=impi4' 'HOSTNAME=bc07bl01' 'intel_mpirun=/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/mpirun' 'PBS_VERSION=TORQUE-2.5.7' 'I_MPI_F77=ifort' 'IPPROOT=/rhel6/opt/intel/ics2012/ipp'
'INTEL_LICENSE_FILE=/rhel6/opt/intel/ics2012/licenses:/opt/intel/licenses:/home/userfoo/intel/licenses' 'HOST=bc07bl01' 'SHELL=/bin/tcsh' 'MAUI_PREFIX=/opt/maui' 'PBS_JOBNAME=trivial' 'LIBRARY_PATH=/rhel6/opt/intel/ics2012/ipp/lib/intel64:/rhel6/opt/intel/ics2012/composerxe/lib/intel64:/rhel6/opt/intel/ics2012/mkl/lib/intel64:/rhel6/opt/intel/ics2012/tbb/lib/intel64//cc4.1.0_libc2.4_kernel2.6.16.21' 'intel_mpiexec=/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/mpiexec' 'NCARG_FONTCAPS=/usr/lib64/ncarg/fontcaps' 'FPATH=/rhel6/opt/intel/ics2012/mkl/include:/rhel6/opt/intel/ics2012/mkl/include/intel64/lp64' 'PBS_ENVIRONMENT=PBS_BATCH' 'QTDIR=/usr/lib64/qt-3.3' 'QTINC=/usr/lib64/qt-3.3/include' 'PBS_O_WORKDIR=/home/tests/intelmpi/TrivialMPI' 'GROUP=generalGrp' 'PBS_TASKNUM=1' 'USER=userfoo' 'LD_LIBRARY_PATH=/rhel6/opt/intel/ics2012/composerxe/lib/intel64:/rhel6/opt/intel/ics2012/mkl/lib/intel64:/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/slib_impi4:/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/lib:/rhel6/opt/intel/ics2012/tbb/lib/intel64//cc4.1.0_libc2.4_kernel2.6.16.21:/rhel6/opt/intel/ics2012/composerxe/debugger/lib/intel64:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.10.319/mpirt/lib/intel64' 'LS_COLORS=' 'PBS_O_HOME=/home/userfoo' 'VTUNEROOT=/rhel6/opt/intel/ics2012/vtune_amplifier_xe' 'CPATH=/rhel6/opt/intel/ics2012/tbb/include' 'HOSTTYPE=x86_64-linux' 'PBS_GPUFILE=/var/lib/torque/aux//645.rhel6pbsgpu' 'PBS_MOMPORT=15003' 'NCARG_GRAPHCAPS=/usr/lib64/ncarg/graphcaps' 'CPP=icc -E' 'PBS_O_QUEUE=rhel6' 'NLSPATH=/rhel6/opt/intel/ics2012/composerxe/debugger/intel64/locale/%l_%t/%N' 'VT_ADD_LIBS=-ldwarf -lelf -lvtunwind -lnsl -lm -ldl -lpthread' 'INSPXEROOT=/rhel6/opt/intel/ics2012/inspector_xe' 'MAIL=/var/spool/mail/userfoo' 'PBS_O_LOGNAME=userfoo' 
'PATH=/rhel6/opt/mpiexec/0.84-intel-2012/bin:/rhel6/opt/intel/ics2012/bin:/rhel6/opt/intel/ics2012/inspector_xe/bin64:/rhel6/opt/intel/ics2012/vtune_amplifier_xe/bin64:/rhel6/opt/intel/ics2012/itac/8.0.3.007/bin:/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin:/home/userfoo/bin:/usr/lib64/qt-3.3/bin:/bin:/usr/bin:/usr/local/bin:/usr/lpp/mmfs/bin:/opt/maui/bin:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.10.319/mpirt/bin/intel64' 'PBS_O_LANG=en_US.UTF-8' 'PBS_JOBCOOKIE=C4EB75184B5D828AEB30F2C195CF619F' 'TBBROOT=/rhel6/opt/intel/ics2012/tbb' 'NLS_PATH=/rhel6/opt/intel/ics2012/composerxe/lib/intel64/locale/%l_%t/%N:/rhel6/opt/intel/ics2012/ipp/lib/intel64/locale/%l_%t/%N' 'F90=ifort' 'PWD=/home/tests/intelmpi/TrivialMPI' '_LMFILES_=/usr/share/Modules/modulefiles/use.own:/home/userfoo/privatemodules/compilers/intel-2012-lp64-userfoo' 'NCARG_ROOT=/usr' 'EDITOR=vim' 'F95=ifort' 'KDE_IS_PRELINKED=1' 'PBS_NODENUM=0' 'LANG=C' 'NCARG_DATABASE=/usr/lib64/ncarg/database' 'MODULEPATH=/usr/share/Modules/modulefiles:/etc/modulefiles:/home/userfoo/privatemodules' 'GPFSDIR=/usr/lpp/mmfs' 'VT_LIB_DIR=/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/lib_impi4' 'LOADEDMODULES=use.own:compilers/intel-2012-lp64-userfoo' 'KDEDIRS=/usr' 'PBS_NUM_NODES=2' 'I_MPI_F90=ifort' 'F77=ifort' 'PBS_O_SHELL=/bin/tcsh' 'VT_ROOT=/rhel6/opt/intel/ics2012/itac/8.0.3.007' 'I_MPI_CC=icc' 'PBS_SERVER=rhel6pbs' 'LM_LICENSE_FILE=7496@flexlm' 'PBS_JOBID=645.rhel6pbs' 'MKL_LP64_ILP64=lp64' 'CXX=icpc' 'NCARG_LIB=/usr/lib64/ncarg' 'ENVIRONMENT=BATCH' 'NCARG_NCARG=/usr/share/ncarg' 'SHLVL=2' 'HOME=/home/userfoo' 'I_MPI_CXX=icpc' 'OSTYPE=linux' 'PBS_O_HOST=rhel6head2' 'VT_SLIB_DIR=/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/slib_impi4' 'FC=ifort' 'I_MPI_FC=ifort' 'VENDOR=unknown' 'PBS_VNODENUM=0' 'MACHTYPE=x86_64' 'LOGNAME=userfoo' 'VISUAL=vim' 'QTLIB=/usr/lib64/qt-3.3/lib' 'CLASSPATH=/rhel6/opt/intel/ics2012/itac/8.0.3.007/itac/lib_impi4' 'PBS_QUEUE=rhel6' 'MODULESHOME=/usr/share/Modules' 
'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'PBS_O_MAIL=/var/spool/mail/userfoo' 'CC=icc' 'PBS_NP=16' 'PBS_NUM_PPN=8' 'INCLUDE=/rhel6/opt/intel/ics2012/ipp/include' 'G_BROKEN_FILENAMES=1' 'PBS_NODEFILE=/var/lib/torque/aux//645.rhel6pbs' 'I_MPI_ROOT=/rhel6/opt/intel/ics2012/impi/4.0.3.008' 'IDBROOT=/rhel6/opt/intel/ics2012/composerxe/debugger' 'PBS_O_PATH=/rhel6/opt/mpiexec/0.84-intel-2012/bin:/rhel6/opt/intel/ics2012/bin:/rhel6/opt/intel/ics2012/inspector_xe/bin64:/rhel6/opt/intel/ics2012/vtune_amplifier_xe/bin64:/rhel6/opt/intel/ics2012/itac/8.0.3.007/bin:/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.6.233/bin/intel64:/rhel6/opt/intel/ics2012/vtune_amplifier_xe_2011/bin64:/home/userfoo/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/lpp/mmfs/bin:/opt/maui/bin:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.6.233/mpirt/bin/intel64:/rhel6/opt/intel/ics2012/composer_xe_2011_sp1.10.319/mpirt/bin/intel64' '_=/rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/mpiexec.hydra' 'OLDPWD=/home/userfoo' --global-user-env 0 --global-system-env 0 --start-pid 8 --proxy-core-count 8 --exec --exec-appnum 0 --exec-proc-count 8 --exec-local-env 0 --exec-wdir /home/tests/intelmpi/TrivialMPI --exec-args 1 ./hello_mpi

[mpiexec@bc07bl01] Launch arguments: /rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/pmi_proxy --control-port bc07bl01:38382 --debug --pmi-connect lazy-cache --pmi-aggregate -s 0 --bootstrap ssh --demux poll --pgid 0 --enable-stdin 1 --proxy-id 0
[mpiexec@bc07bl01] Launch arguments: /usr/bin/ssh -x -q bc07bl02 /rhel6/opt/intel/ics2012/impi/4.0.3.008/intel64/bin/pmi_proxy --control-port bc07bl01:38382 --debug --pmi-connect lazy-cache --pmi-aggregate -s 0 --bootstrap ssh --demux poll --pgid 0 --enable-stdin 1 --proxy-id 1
[mpiexec@bc07bl01] STDIN will be redirected to 1 fd(s): 9
[proxy:0:0@bc07bl01] Start PMI_proxy 0
[proxy:0:0@bc07bl01] STDIN will be redirected to 1 fd(s): 12
[proxy:0:0@bc07bl01] got pmi command (from 17): init pmi_version=1 pmi_subversion=1
[proxy:0:0@bc07bl01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@bc07bl01] got pmi command (from 17): get_maxes
[proxy:0:0@bc07bl01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@bc07bl01] got pmi command (from 17): barrier_in
[proxy:0:1@bc07bl02] Start PMI_proxy 1
[proxy:0:0@bc07bl01] got pmi command (from 23): init pmi_version=1 pmi_subversion=1
[proxy:0:0@bc07bl01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@bc07bl01] got pmi command (from 23): get_maxes
[proxy:0:0@bc07bl01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@bc07bl01] got pmi command (from 23): barrier_in
[proxy:0:0@bc07bl01] got pmi command (from 29): init pmi_version=1 pmi_subversion=1
[proxy:0:0@bc07bl01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@bc07bl01] got pmi command (from 29): get_maxes
[proxy:0:0@bc07bl01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@bc07bl01] got pmi command (from 29): barrier_in
[proxy:0:0@bc07bl01] got pmi command (from 6): init pmi_version=1 pmi_subversion=1
[proxy:0:0@bc07bl01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@bc07bl01] got pmi command (from 6): get_maxes
[proxy:0:0@bc07bl01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@bc07bl01] got pmi command (from 6): barrier_in
[proxy:0:0@bc07bl01] got pmi command (from 20): init pmi_version=1 pmi_subversion=1
[proxy:0:0@bc07bl01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@bc07bl01] got pmi command (from 20): get_maxes
[proxy:0:0@bc07bl01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@bc07bl01] got pmi command (from 20): barrier_in
[proxy:0:0@bc07bl01] got pmi command (from 10): init pmi_version=1 pmi_subversion=1
[proxy:0:0@bc07bl01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@bc07bl01] got pmi command (from 26): init pmi_version=1 pmi_subversion=1
[proxy:0:0@bc07bl01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@bc07bl01] got pmi command (from 9): init pmi_version=1 pmi_subversion=1
[proxy:0:0@bc07bl01] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@bc07bl01] got pmi command (from 10): get_maxes
[proxy:0:0@bc07bl01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@bc07bl01] got pmi command (from 26): get_maxes
[proxy:0:0@bc07bl01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@bc07bl01] got pmi command (from 9): get_maxes
[proxy:0:0@bc07bl01] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@bc07bl01] got pmi command (from 26): barrier_in
[proxy:0:0@bc07bl01] got pmi command (from 9): barrier_in
[proxy:0:0@bc07bl01] got pmi command (from 10): barrier_in
[mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=barrier_in
[proxy:0:0@bc07bl01] forwarding command (cmd=barrier_in) upstream
[proxy:0:1@bc07bl02] got pmi command (from 4): init pmi_version=1 pmi_subversion=1
[proxy:0:1@bc07bl02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@bc07bl02] got pmi command (from 4): get_maxes
[proxy:0:1@bc07bl02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@bc07bl02] got pmi command (from 4): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 19): init pmi_version=1 pmi_subversion=1
[proxy:0:1@bc07bl02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@bc07bl02] got pmi command (from 19): get_maxes
[proxy:0:1@bc07bl02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@bc07bl02] got pmi command (from 19): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 5): init pmi_version=1 pmi_subversion=1
[proxy:0:1@bc07bl02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@bc07bl02] got pmi command (from 5): get_maxes
[proxy:0:1@bc07bl02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@bc07bl02] got pmi command (from 5): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 22): init pmi_version=1 pmi_subversion=1
[proxy:0:1@bc07bl02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@bc07bl02] got pmi command (from 22): get_maxes
[proxy:0:1@bc07bl02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@bc07bl02] got pmi command (from 22): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 16): init pmi_version=1 pmi_subversion=1
[proxy:0:1@bc07bl02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@bc07bl02] got pmi command (from 16): get_maxes
[proxy:0:1@bc07bl02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@bc07bl02] got pmi command (from 16): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 7): init pmi_version=1 pmi_subversion=1
[proxy:0:1@bc07bl02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@bc07bl02] got pmi command (from 7): get_maxes
[proxy:0:1@bc07bl02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@bc07bl02] got pmi command (from 10): init pmi_version=1 pmi_subversion=1
[proxy:0:1@bc07bl02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@bc07bl02] got pmi command (from 7): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 10): get_maxes
[proxy:0:1@bc07bl02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@bc07bl02] got pmi command (from 10): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 13): init pmi_version=1 pmi_subversion=1
[proxy:0:1@bc07bl02] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@bc07bl02] got pmi command (from 13): get_maxes
[proxy:0:1@bc07bl02] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@bc07bl01] PMI response to fd 6 pid 13: cmd=barrier_out
[mpiexec@bc07bl01] PMI response to fd 0 pid 13: cmd=barrier_out
[proxy:0:1@bc07bl02] got pmi command (from 13): barrier_in
[proxy:0:1@bc07bl02] forwarding command (cmd=barrier_in) upstream
[proxy:0:0@bc07bl01] PMI response: cmd=barrier_out
[proxy:0:0@bc07bl01] PMI response: cmd=barrier_out
[proxy:0:0@bc07bl01] PMI response: cmd=barrier_out
[proxy:0:0@bc07bl01] PMI response: cmd=barrier_out
[proxy:0:0@bc07bl01] PMI response: cmd=barrier_out
[proxy:0:0@bc07bl01] PMI response: cmd=barrier_out
[proxy:0:0@bc07bl01] PMI response: cmd=barrier_out
[proxy:0:0@bc07bl01] PMI response: cmd=barrier_out
[proxy:0:1@bc07bl02] PMI response: cmd=barrier_out
[proxy:0:1@bc07bl02] PMI response: cmd=barrier_out
[proxy:0:1@bc07bl02] PMI response: cmd=barrier_out
[proxy:0:1@bc07bl02] PMI response: cmd=barrier_out
[proxy:0:1@bc07bl02] PMI response: cmd=barrier_out
[proxy:0:1@bc07bl02] PMI response: cmd=barrier_out
[proxy:0:1@bc07bl02] PMI response: cmd=barrier_out
[proxy:0:1@bc07bl02] PMI response: cmd=barrier_out
[proxy:0:1@bc07bl02] got pmi command (from 4): get_ranks2hosts
[proxy:0:1@bc07bl02] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:0@bc07bl01] got pmi command (from 17): get_ranks2hosts
[proxy:0:0@bc07bl01] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:0@bc07bl01] got pmi command (from 20): get_ranks2hosts
[proxy:0:0@bc07bl01] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:0@bc07bl01] got pmi command (from 6): get_ranks2hosts
[proxy:0:0@bc07bl01] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:0@bc07bl01] got pmi command (from 9): get_ranks2hosts
[proxy:0:0@bc07bl01] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:0@bc07bl01] got pmi command (from 10): get_ranks2hosts
[proxy:0:0@bc07bl01] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:0@bc07bl01] got pmi command (from 29): get_ranks2hosts
[proxy:0:0@bc07bl01] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:0@bc07bl01] got pmi command (from 23): get_ranks2hosts
[proxy:0:0@bc07bl01] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:0@bc07bl01] got pmi command (from 26): get_ranks2hosts
[proxy:0:0@bc07bl01] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:0@bc07bl01] got pmi command (from 6): get_appnum
[proxy:0:0@bc07bl01] PMI response: cmd=appnum appnum=0
[proxy:0:0@bc07bl01] got pmi command (from 9): get_appnum
[proxy:0:0@bc07bl01] PMI response: cmd=appnum appnum=0
[proxy:0:0@bc07bl01] got pmi command (from 17): get_appnum
[proxy:0:0@bc07bl01] PMI response: cmd=appnum appnum=0
[proxy:0:0@bc07bl01] got pmi command (from 20): get_appnum
[proxy:0:0@bc07bl01] PMI response: cmd=appnum appnum=0
[proxy:0:0@bc07bl01] got pmi command (from 29): get_appnum
[proxy:0:0@bc07bl01] PMI response: cmd=appnum appnum=0
[proxy:0:1@bc07bl02] got pmi command (from 4): get_appnum
[proxy:0:1@bc07bl02] PMI response: cmd=appnum appnum=0
[proxy:0:1@bc07bl02] got pmi command (from 5): get_ranks2hosts
[proxy:0:1@bc07bl02] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:1@bc07bl02] got pmi command (from 7): get_ranks2hosts
[proxy:0:1@bc07bl02] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:1@bc07bl02] got pmi command (from 10): get_ranks2hosts
[proxy:0:1@bc07bl02] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:1@bc07bl02] got pmi command (from 13): get_ranks2hosts
[proxy:0:1@bc07bl02] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:1@bc07bl02] got pmi command (from 16): get_ranks2hosts
[proxy:0:1@bc07bl02] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:1@bc07bl02] got pmi command (from 19): get_ranks2hosts
[proxy:0:1@bc07bl02] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:1@bc07bl02] got pmi command (from 22): get_ranks2hosts
[proxy:0:1@bc07bl02] PMI response: put_ranks2hosts 64 2 8 bc07bl01 0,1,2,3,4,5,6,7, 8 bc07bl02 8,9,10,11,12,13,14,15,
[proxy:0:1@bc07bl02] got pmi command (from 5): get_appnum
[proxy:0:1@bc07bl02] PMI response: cmd=appnum appnum=0
[proxy:0:1@bc07bl02] got pmi command (from 7): get_appnum
[proxy:0:1@bc07bl02] PMI response: cmd=appnum appnum=0
[proxy:0:1@bc07bl02] got pmi command (from 16): get_appnum
[proxy:0:1@bc07bl02] PMI response: cmd=appnum appnum=0
[proxy:0:1@bc07bl02] got pmi command (from 5): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 7): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 19): get_appnum
[proxy:0:1@bc07bl02] PMI response: cmd=appnum appnum=0
[proxy:0:1@bc07bl02] got pmi command (from 16): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 6): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 10): get_appnum
[proxy:0:1@bc07bl02] PMI response: cmd=appnum appnum=0
[proxy:0:0@bc07bl01] got pmi command (from 9): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 17): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 20): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 23): get_appnum
[proxy:0:1@bc07bl02] got pmi command (from 19): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 5): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] PMI response: cmd=appnum appnum=0
[proxy:0:0@bc07bl01] got pmi command (from 26): get_appnum
[proxy:0:0@bc07bl01] PMI response: cmd=appnum appnum=0
[proxy:0:0@bc07bl01] got pmi command (from 29): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 13): get_appnum
[proxy:0:1@bc07bl02] PMI response: cmd=appnum appnum=0
[proxy:0:1@bc07bl02] got pmi command (from 22): get_appnum
[proxy:0:1@bc07bl02] PMI response: cmd=appnum appnum=0
[proxy:0:1@bc07bl02] got pmi command (from 10): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 17): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 23): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 6): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 10): get_appnum
[proxy:0:0@bc07bl01] PMI response: cmd=appnum appnum=0
[proxy:0:0@bc07bl01] got pmi command (from 10): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 4): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 22): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 26): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 13): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 16): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 19): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 13): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 10): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 7): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 20): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 29): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 7): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 4): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 10): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 22): get_my_kvsname
[proxy:0:1@bc07bl02] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 19): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 16): barrier_in
[proxy:0:0@bc07bl01] got pmi command (from 9): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 23): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] got pmi command (from 5): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 13): barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 22): barrier_in
[mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_11431_0 key=sharedFilename[8] value=/dev/shm/Intel_MPI_YxNnxQ
[mpiexec@bc07bl01] PMI response to fd 0 pid 4: cmd=put_result rc=0 msg=success
[proxy:0:1@bc07bl02] got pmi command (from 4): put kvsname=kvs_11431_0 key=sharedFilename[8] value=/dev/shm/Intel_MPI_YxNnxQ
[proxy:0:1@bc07bl02] forwarding command (cmd=put kvsname=kvs_11431_0 key=sharedFilename[8] value=/dev/shm/Intel_MPI_YxNnxQ) upstream
[proxy:0:0@bc07bl01] got pmi command (from 26): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:1@bc07bl02] we don't understand the response put_result; forwarding downstream
[mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=barrier_in
[proxy:0:1@bc07bl02] got pmi command (from 4): barrier_in
[proxy:0:1@bc07bl02] forwarding command (cmd=barrier_in) upstream
[proxy:0:0@bc07bl01] got pmi command (from 17): barrier_in
[proxy:0:0@bc07bl01] got pmi command (from 20): barrier_in
[proxy:0:0@bc07bl01] got pmi command (from 29): barrier_in
[proxy:0:0@bc07bl01] got pmi command (from 10): get_my_kvsname
[proxy:0:0@bc07bl01] PMI response: cmd=my_kvsname kvsname=kvs_11431_0
[proxy:0:0@bc07bl01] got pmi command (from 9): barrier_in
[proxy:0:0@bc07bl01] got pmi command (from 6): put kvsname=kvs_11431_0 key=sharedFilename[0] value=/dev/shm/Intel_MPI_oQALK5
[mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_11431_0 key=sharedFilename[0] value=/dev/shm/Intel_MPI_oQALK5
[mpiexec@bc07bl01] PMI response to fd 6 pid 6: cmd=put_result rc=0 msg=success [proxy:0:0@bc07bl01] forwarding command (cmd=put kvsname=kvs_11431_0 key=sharedFilename[0] value=/dev/shm/Intel_MPI_oQALK5) upstream [proxy:0:0@bc07bl01] we don't understand the response put_result; forwarding downstream [proxy:0:0@bc07bl01] got pmi command (from 10): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 23): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 26): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 6): barrier_in [proxy:0:0@bc07bl01] forwarding command (cmd=barrier_in) upstream [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@bc07bl01] PMI response to fd 6 pid 6: cmd=barrier_out [mpiexec@bc07bl01] PMI response to fd 0 pid 6: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] got pmi command (from 9): get kvsname=kvs_11431_0 key=sharedFilename[0] [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_oQALK5 [proxy:0:0@bc07bl01] got pmi command (from 10): get kvsname=kvs_11431_0 key=sharedFilename[0] [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_oQALK5 [proxy:0:0@bc07bl01] got pmi command (from 17): get kvsname=kvs_11431_0 key=sharedFilename[0] [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_oQALK5 [proxy:0:0@bc07bl01] got pmi command (from 20): get kvsname=kvs_11431_0 key=sharedFilename[0] [proxy:0:0@bc07bl01] PMI 
response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_oQALK5 [proxy:0:0@bc07bl01] got pmi command (from 23): get kvsname=kvs_11431_0 key=sharedFilename[0] [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_oQALK5 [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] got pmi command (from 5): get kvsname=kvs_11431_0 key=sharedFilename[8] [proxy:0:0@bc07bl01] got pmi command (from 26): get kvsname=kvs_11431_0 key=sharedFilename[0] [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_oQALK5 [proxy:0:0@bc07bl01] got pmi command (from 29): get [proxy:0:1@bc07bl02] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_YxNnxQ kvsname=kvs_11431_0 key=sharedFilename[0] [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_oQALK5 [proxy:0:1@bc07bl02] got pmi command (from 7): get kvsname=kvs_11431_0 key=sharedFilename[8] [proxy:0:1@bc07bl02] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_YxNnxQ [proxy:0:1@bc07bl02] got pmi command (from 10): get kvsname=kvs_11431_0 key=sharedFilename[8] [proxy:0:1@bc07bl02] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_YxNnxQ [proxy:0:1@bc07bl02] got pmi command (from 13): get kvsname=kvs_11431_0 key=sharedFilename[8] [proxy:0:1@bc07bl02] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_YxNnxQ [proxy:0:1@bc07bl02] got pmi command (from 16): get kvsname=kvs_11431_0 key=sharedFilename[8] [proxy:0:1@bc07bl02] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_YxNnxQ [proxy:0:1@bc07bl02] got pmi command (from 19): get kvsname=kvs_11431_0 
key=sharedFilename[8] [proxy:0:1@bc07bl02] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_YxNnxQ [proxy:0:1@bc07bl02] got pmi command (from 22): get kvsname=kvs_11431_0 key=sharedFilename[8] [proxy:0:1@bc07bl02] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_YxNnxQ [proxy:0:0@bc07bl01] got pmi command (from 9): barrier_in [proxy:0:1@bc07bl02] got pmi command (from 5): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 6): put kvsname=kvs_11431_0 key=DAPL_PROVIDER value= [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=put kvsname=kvs_11431_0 key=DAPL_PROVIDER value= [mpiexec@bc07bl01] PMI response to fd 6 pid 6: cmd=put_result rc=0 msg=success [proxy:0:0@bc07bl01] forwarding command (cmd=put kvsname=kvs_11431_0 key=DAPL_PROVIDER value=) upstream [proxy:0:0@bc07bl01] we don't understand the response put_result; forwarding downstream [proxy:0:0@bc07bl01] got pmi command (from 6): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 10): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 23): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 20): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 29): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 17): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 26): barrier_in [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:0@bc07bl01] forwarding command (cmd=barrier_in) upstream [proxy:0:1@bc07bl02] got pmi command (from 16): barrier_in [proxy:0:1@bc07bl02] got pmi command (from 22): barrier_in [proxy:0:1@bc07bl02] got pmi command (from 10): barrier_in [proxy:0:1@bc07bl02] got pmi command (from 7): barrier_in [proxy:0:1@bc07bl02] got pmi command (from 19): barrier_in [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@bc07bl01] PMI response to fd 6 pid 13: cmd=barrier_out [mpiexec@bc07bl01] PMI response to fd 0 pid 13: cmd=barrier_out [proxy:0:1@bc07bl02] got pmi command (from 4): barrier_in [proxy:0:1@bc07bl02] 
got pmi command (from 13): barrier_in [proxy:0:1@bc07bl02] forwarding command (cmd=barrier_in) upstream [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] got pmi command (from 17): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value= [proxy:0:0@bc07bl01] got pmi command (from 9): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value= [proxy:0:0@bc07bl01] got pmi command (from 23): get [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_PROVIDER [mpiexec@bc07bl01] PMI response to fd 0 pid 5: cmd=get_result rc=0 msg=success value= key=DAPL_PROVIDER [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_PROVIDER [mpiexec@bc07bl01] PMI response to fd 0 pid 7: cmd=get_result rc=0 msg=success value= key=DAPL_PROVIDER [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] got pmi command (from 5): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_PROVIDER) upstream [proxy:0:1@bc07bl02] got pmi command (from 7): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:1@bc07bl02] forwarding 
command (cmd=get kvsname=kvs_11431_0 key=DAPL_PROVIDER) upstream [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_PROVIDER [mpiexec@bc07bl01] PMI response to fd 0 pid 10: cmd=get_result rc=0 msg=success value= key=DAPL_PROVIDER [proxy:0:1@bc07bl02] got pmi command (from 10): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_PROVIDER) upstream [proxy:0:1@bc07bl02] got pmi command (from 13): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_PROVIDER) upstream [proxy:0:1@bc07bl02] got pmi command (from 16): get kvsname=kvs_11431_0 [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_PROVIDER [mpiexec@bc07bl01] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success value= key=DAPL_PROVIDER kvsname=kvs_11431_0 key=DAPL_PROVIDER key=DAPL_PROVIDER [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_PROVIDER) upstream [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_PROVIDER [mpiexec@bc07bl01] PMI response to fd 0 pid 16: cmd=get_result rc=0 msg=success value= key=DAPL_PROVIDER [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value= [proxy:0:1@bc07bl02] got pmi command (from 19): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:1@bc07bl02] PMI response: cmd=get_result rc=0 msg=success value= [proxy:0:1@bc07bl02] got pmi command (from 22): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:1@bc07bl02] PMI response: cmd=get_result rc=0 msg=success value= [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [proxy:0:1@bc07bl02] got pmi command (from 5): barrier_in [proxy:0:1@bc07bl02] got pmi command (from 19): barrier_in [proxy:0:1@bc07bl02] we don't understand the 
response get_result; forwarding downstream [proxy:0:1@bc07bl02] got pmi command (from 7): barrier_in [proxy:0:1@bc07bl02] got pmi command (from 22): barrier_in [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [proxy:0:1@bc07bl02] got pmi command (from 10): barrier_in [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [proxy:0:0@bc07bl01] got pmi command (from 6): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 9): barrier_in [proxy:0:1@bc07bl02] got pmi command (from 13): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 10): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value= [proxy:0:0@bc07bl01] got pmi command (from 17): barrier_in [proxy:0:1@bc07bl02] got pmi command (from 16): barrier_in [proxy:0:1@bc07bl02] got pmi command (from 4): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:1@bc07bl02] PMI response: cmd=get_result rc=0 msg=success value= [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:0@bc07bl01] got pmi command (from 26): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value= [proxy:0:0@bc07bl01] got pmi command (from 29): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value= [proxy:0:0@bc07bl01] got pmi command (from 10): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 23): barrier_in [proxy:0:1@bc07bl02] got pmi command (from 4): barrier_in [proxy:0:1@bc07bl02] forwarding command (cmd=barrier_in) upstream [proxy:0:0@bc07bl01] got pmi command (from 26): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 29): barrier_in [proxy:0:0@bc07bl01] got pmi command (from 20): get kvsname=kvs_11431_0 key=DAPL_PROVIDER [proxy:0:0@bc07bl01] PMI response: cmd=get_result rc=0 msg=success value= [proxy:0:0@bc07bl01] got pmi command (from 20): barrier_in 
[proxy:0:0@bc07bl01] forwarding command (cmd=barrier_in) upstream [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@bc07bl01] PMI response to fd 6 pid 20: cmd=barrier_out [mpiexec@bc07bl01] PMI response to fd 0 pid 20: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] PMI response: cmd=barrier_out [proxy:0:0@bc07bl01] got pmi command (from 6): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:0@bc07bl01] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 6 pid 6: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:0@bc07bl01] got pmi command (from 17): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:0@bc07bl01] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:0@bc07bl01] got pmi command (from 20): get [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 6 pid 17: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 0 pid 4: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:0@bc07bl01] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:0@bc07bl01] got pmi command (from 23): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:0@bc07bl01] forwarding command (cmd=get 
kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:0@bc07bl01] we don't understand the response get_result; forwarding downstream [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 6 pid 20: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 0 pid 5: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:0@bc07bl01] got pmi command (from 9): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:0@bc07bl01] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 6 pid 23: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 0 pid 7: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:0@bc07bl01] got pmi command (from 10): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:0@bc07bl01] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:0@bc07bl01] got pmi command (from 26): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:0@bc07bl01] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 6 pid 9: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 0 pid 10: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:0@bc07bl01] got pmi command (from 29): get 
kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:0@bc07bl01] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:0@bc07bl01] we don't understand the response get_result; forwarding downstream [proxy:0:0@bc07bl01] we don't understand the response get_result; forwarding downstream [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 6 pid 10: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 0 pid 13: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:0@bc07bl01] we don't understand the response get_result; forwarding downstream [proxy:0:0@bc07bl01] we don't understand the response get_result; forwarding downstream [proxy:0:0@bc07bl01] we don't understand the response get_result; forwarding downstream [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 6 pid 26: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 0 pid 16: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:0@bc07bl01] we don't understand the response get_result; forwarding downstream [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 6 pid 29: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 0 pid 19: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:0@bc07bl01] we don't understand the response get_result; forwarding downstream 
[mpiexec@bc07bl01] [pgid: 0] got PMI command: cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH [mpiexec@bc07bl01] PMI response to fd 0 pid 22: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] PMI response: cmd=barrier_out [proxy:0:1@bc07bl02] got pmi command (from 4): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:1@bc07bl02] got pmi command (from 5): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:1@bc07bl02] got pmi command (from 7): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:1@bc07bl02] got pmi command (from 10): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:1@bc07bl02] got pmi command (from 13): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:1@bc07bl02] got pmi command (from 16): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:1@bc07bl02] got pmi command (from 19): get kvsname=kvs_11431_0 key=DAPL_MISMATCH [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:1@bc07bl02] got pmi command (from 22): get kvsname=kvs_11431_0 
key=DAPL_MISMATCH [proxy:0:1@bc07bl02] forwarding command (cmd=get kvsname=kvs_11431_0 key=DAPL_MISMATCH) upstream [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [proxy:0:1@bc07bl02] we don't understand the response get_result; forwarding downstream [proxy:0:1@bc07bl02] got crush from 7, 0 [proxy:0:1@bc07bl02] got crush from 16, 0 [proxy:0:0@bc07bl01] got crush from 10, 0 [proxy:0:1@bc07bl02] got crush from 19, 0 [proxy:0:1@bc07bl02] HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:70): assert (!(pollfds[i].revents & ~POLLIN & ~POLLOUT & ~POLLHUP)) failed [proxy:0:1@bc07bl02] main (./pm/pmiserv/pmip.c:387): demux engine error waiting for event [proxy:0:0@bc07bl01] got crush from 26, 0 [proxy:0:0@bc07bl01] got crush from 6, 0 [proxy:0:0@bc07bl01] got crush from 17, 0 [proxy:0:0@bc07bl01] got crush from 29, 0 [proxy:0:0@bc07bl01] got crush from 20, 0 [proxy:0:0@bc07bl01] got crush from 9, 0 [mpiexec@bc07bl01] HYDT_bscu_wait_for_completion (./tools/bootstrap/utils/bscu_wait.c:101): one of the processes terminated badly; aborting [mpiexec@bc07bl01] HYDT_bsci_wait_for_completion (./tools/bootstrap/src/bsci_wait.c:18): bootstrap device returned error waiting for completion [mpiexec@bc07bl01] HYD_pmci_wait_for_completion (./pm/pmiserv/pmiserv_pmci.c:521): bootstrap server returned error waiting for completion [mpiexec@bc07bl01] main (./ui/mpich/mpiexec.c:548): process manager error waiting for completion 
############################################################
Begin Epilogue: Tue May 22 16:19:26 EDT 2012
PBS Job ID: 645.rhel6pbs
PBS User Name: userfoo
Fair Share Group: generalGrp
PBS Account Name:
PBS Job Name: trivial
PBS Queue: rhel6
Resource Request: cput=01:20:00,mem=16gb,neednodes=2:ppn=8,nodes=2:ppn=8,pmem=1gb,walltime=00:10:00
Resource Consumed: cput=00:00:00,mem=772kb,vmem=13616kb,walltime=00:00:01
Nodes Assigned: bc07bl01,bc07bl02
Session ID: 11203
Job exit code: 645.rhel6pbs0
Cleaning up /scratch/645.rhel6pbs on bc07bl01
Cleaning up /scratch/645.rhel6pbs on bc07bl02
End Epilogue: Tue May 22 16:19:26 EDT 2012
############################################################