host: compute-0-0
host: compute-0-1
==================================================================================================
mpiexec options:
----------------
Base path: /opt/intel/impi/4.1.0.024/intel64/bin/
Launcher: ssh
Debug level: 1
Enable X: -1
Global environment:
-------------------
I_MPI_PERHOST=allcores
LD_LIBRARY_PATH=/opt/intel/itac/8.0.3.007/itac/slib_impi4:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/mic/coi/host-linux-release/lib:/opt/intel/mic/myo/lib:/opt/intel/composer_xe_2013.0.079/mpirt/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/../compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/lib/intel64:/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/mkl/lib/intel64:/opt/intel/composer_xe_2013.0.079/tbb/lib/intel64:/opt/gridengine/lib/lx26-amd64:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/impi/4.1.0.024/mic/lib
SET_HOST_TYPE= -x
MKLROOT=/opt/intel/composer_xe_2013.0.079/mkl
MANPATH=/opt/intel/itac/8.0.3.007/man:/opt/intel/impi/4.1.0.024/man:/opt/intel/composer_xe_2013.0.079/man/en_US:/opt/intel/composer_xe_2013.0.079/man/en_US:/opt/intel/impi/4.1.0.024/man:/usr/kerberos/man:/usr/java/latest/man:/usr/local/share/man:/usr/share/man/en:/usr/share/man:/opt/ganglia/man:/opt/rocks/man:/opt/condor/man:/opt/tripwire/man:/opt/openmpi/share/man:/opt/sun-ct/man:/opt/gridengine/man::/opt/intel/vtune_amplifier_xe_2013/man
PDSHROOT=/opt/pdsh
SELINUX_INIT=YES
CONSOLE=/dev/console
VT_MPI=impi4
HOSTNAME=compute-0-1.local
SGE_INFOTEXT_MAX_COLUMN=5000
INTEL_LICENSE_FILE=/opt/intel/licenses:/opt/intel/composer_xe_2013.0.079/licenses:/opt/intel/licenses:/home/ivasan/intel/licenses
IPPROOT=/opt/intel/composer_xe_2013.0.079/ipp
SGE_TASK_STEPSIZE=undefined
TERM=vt100
SHELL=/bin/bash
ECLIPSE_HOME=/opt/eclipse
HISTSIZE=1000
NHOSTS=2
CONDOR_IDS=407.407
SSH_CLIENT=157.88.111.46 1365 22
TMPDIR=/tmp/509.1.all.q
SGE_O_WORKDIR=/home/ivasan/test_impi/IMB
LIBRARY_PATH=/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/../compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/lib/intel64:/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/mkl/lib/intel64:/opt/intel/composer_xe_2013.0.079/tbb/lib/intel64
SGE_O_HOME=/home/ivasan
SGE_CELL=default
SGE_ARCH=lx26-amd64
MPICH_PROCESS_GROUP=no
MIC_LD_LIBRARY_PATH=/opt/intel/composer_xe_2013.0.079/compiler/lib/mic:/opt/intel/mic/coi/device-linux-release/lib:/opt/intel/mic/myo/lib:/opt/intel/composer_xe_2013.0.079/compiler/lib/mic:/opt/intel/composer_xe_2013.0.079/mkl/lib/mic:/opt/intel/composer_xe_2013.0.079/tbb/lib/mic
ROCKSROOT=/opt/rocks/share/devel
SSH_TTY=/dev/pts/2
RESTARTED=0
ANT_HOME=/opt/rocks
ARC=lx26-amd64
USER=ivasan
LS_COLORS=no=00:fi=00:di=01;34:ln=01;36:pi=40;33:so=01;35:bd=40;33;01:cd=40;33;01:or=01;05;37;41:mi=01;05;37;41:ex=01;32:*.cmd=01;32:*.exe=01;32:*.com=01;32:*.btm=01;32:*.bat=01;32:*.sh=01;32:*.csh=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.gz=01;31:*.bz2=01;31:*.bz=01;31:*.tz=01;31:*.rpm=01;31:*.cpio=01;31:*.jpg=01;35:*.gif=01;35:*.bmp=01;35:*.xbm=01;35:*.xpm=01;35:*.png=01;35:*.tif=01;35:
INIT_VERSION=sysvinit-2.86
SGE_TASK_LAST=undefined
ROCKS_ROOT=/opt/rocks
QUEUE=all.q
CPATH=/opt/intel/composer_xe_2013.0.079/mkl/include:/opt/intel/composer_xe_2013.0.079/tbb/include
SGE_TASK_ID=undefined
NLSPATH=/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64/locale/%l_%t/%N:/opt/intel/composer_xe_2013.0.079/ipp/lib/intel64/locale/%l_%t/%N:/opt/intel/composer_xe_2013.0.079/mkl/lib/intel64/locale/%l_%t/%N:/opt/intel/composer_xe_2013.0.079/debugger/intel64/locale/%l_%t/%N
PATH=/tmp/509.1.all.q:/opt/intel/vtune_amplifier_xe_2013/bin64:/opt/intel/itac/8.0.3.007/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/mpirt/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64_mic:/opt/intel/composer_xe_2013.0.079/debugger/gui/intel64:/opt/intel/impi/4.1.0.024/intel64/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/maven/bin:/opt/pdsh/bin:/opt/rocks/bin:/opt/rocks/sbin:/opt/condor/bin:/opt/condor/sbin:/opt/gridengine/bin/lx26-amd64:/usr/sbin:/home/ivasan/bin:/home/ivasan/binvasp
VT_ADD_LIBS=-ldwarf -lelf -lvtunwind -lnsl -lm -ldl -lpthread
MAVEN_HOME=/opt/maven
MAIL=/var/spool/mail/ivasan
SGE_BINARY_PATH=/opt/gridengine/bin/lx26-amd64
RUNLEVEL=3
TBBROOT=/opt/intel/composer_xe_2013.0.079/tbb
CONDOR_CONFIG=/opt/condor/etc/condor_config
SGE_STDERR_PATH=/home/ivasan/test_impi/IMB/test.impi.IMB.o509
PWD=/home/ivasan/test_impi/IMB
INPUTRC=/etc/inputrc
JAVA_HOME=/usr/java/latest
SGE_EXECD_PORT=537
SGE_ACCOUNT=sge
SGE_STDOUT_PATH=/home/ivasan/test_impi/IMB/test.impi.IMB.o509
LANG=en_US.iso885915
SGE_QMASTER_PORT=536
JOB_NAME=test.impi.IMB
JOB_SCRIPT=/opt/gridengine/default/spool/compute-0-1/job_scripts/509
SGE_ROOT=/opt/gridengine
SGE_NOMSG=1
VT_LIB_DIR=/opt/intel/itac/8.0.3.007/itac/lib_impi4
CONDOR_ROOT=/opt/condor
PREVLEVEL=N
VT_ROOT=/opt/intel/itac/8.0.3.007
REQNAME=test.impi.IMB
VTUNE_AMPLIFIER_XE_2013_DIR=/opt/intel/vtune_amplifier_xe_2013
SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
ENVIRONMENT=BATCH
SGE_JOB_SPOOL_DIR=/opt/gridengine/default/spool/compute-0-1/active_jobs/509.1
PE_HOSTFILE=/opt/gridengine/default/spool/compute-0-1/active_jobs/509.1/pe_hostfile
HOME=/home/ivasan
SHLVL=3
NQUEUES=2
SGE_CWD_PATH=/home/ivasan/test_impi/IMB
SGE_O_LOGNAME=ivasan
ROLLSROOT=/opt/rocks/share/devel/src/roll
VT_SLIB_DIR=/opt/intel/itac/8.0.3.007/itac/slib_impi4
SGE_O_MAIL=/var/spool/mail/ivasan
LOGNAME=ivasan
JOB_ID=509
TMP=/tmp/509.1.all.q
CVS_RSH=ssh
CLASSPATH=/opt/intel/itac/8.0.3.007/itac/lib_impi4
SSH_CONNECTION=157.88.111.46 1365 157.88.111.174 22
PE=impi
I_MPI_HYDRA_BOOTSTRAP=sge
SGE_TASK_FIRST=undefined
LESSOPEN=|/usr/bin/lesspipe.sh %s
SGE_O_PATH=/opt/intel/vtune_amplifier_xe_2013/bin64:/opt/intel/itac/8.0.3.007/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/mpirt/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64_mic:/opt/intel/composer_xe_2013.0.079/debugger/gui/intel64:/opt/intel/impi/4.1.0.024/intel64/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/maven/bin:/opt/pdsh/bin:/opt/rocks/bin:/opt/rocks/sbin:/opt/condor/bin:/opt/condor/sbin:/opt/gridengine/bin/lx26-amd64:/usr/sbin:/home/ivasan/bin:/home/ivasan/binvasp
SGE_CLUSTER_NAME=gamma
SGE_O_SHELL=/bin/bash
SGE_O_HOST=gamma
REQUEST=test.impi.IMB
INCLUDE=/opt/intel/composer_xe_2013.0.079/mkl/include
NSLOTS=24
G_BROKEN_FILENAMES=1
SGE_STDIN_PATH=/dev/null
I_MPI_ROOT=/opt/intel/impi/4.1.0.024
_=/opt/intel/impi/4.1.0.024/intel64/bin/mpiexec.hydra
Hydra internal environment:
---------------------------
MPICH_ENABLE_CKPOINT=1
GFORTRAN_UNBUFFERED_PRECONNECTED=y
User set environment:
---------------------
I_MPI_DEBUG=5
Proxy information:
*********************
[1] proxy: compute-0-0 (12 cores)
Exec list: IMB-MPI1 (12 processes);
[2] proxy: compute-0-1 (12 cores)
Exec list: IMB-MPI1 (12 processes);
==================================================================================================
[mpiexec@compute-0-1.local] Timeout set to -1 (-1 means infinite)
[mpiexec@compute-0-1.local] Got a control port string of compute-0-1:49467
Proxy launch args: /opt/intel/impi/4.1.0.024/intel64/bin/pmi_proxy --control-port compute-0-1:49467 --debug --pmi-connect lazy-cache --pmi-aggregate -s 0 --rmk sge --launcher ssh --demux poll --pgid 0 --enable-stdin 1 --retries 10 --proxy-id
[mpiexec@compute-0-1.local] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1
Arguments being passed to proxy 0:
--version 1.4.1p1 --iface-ip-env-name MPICH_INTERFACE_HOSTNAME --hostname compute-0-0 --global-core-map 0,12,12 --filler-process-map 0,12,12 --global-process-count 24 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_30666_0 --pmi-process-mapping (vector,(0,2,12)) --topolib ipl --ckpointlib blcr --ckpoint-prefix /tmp --ckpoint-preserve -1 --ckpoint off --ckpoint-num -1 --global-inherited-env 103 'I_MPI_PERHOST=allcores' 'LD_LIBRARY_PATH=/opt/intel/itac/8.0.3.007/itac/slib_impi4:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/mic/coi/host-linux-release/lib:/opt/intel/mic/myo/lib:/opt/intel/composer_xe_2013.0.079/mpirt/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/../compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/lib/intel64:/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/mkl/lib/intel64:/opt/intel/composer_xe_2013.0.079/tbb/lib/intel64:/opt/gridengine/lib/lx26-amd64:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/impi/4.1.0.024/mic/lib' 'SET_HOST_TYPE= -x ' 'MKLROOT=/opt/intel/composer_xe_2013.0.079/mkl' 'MANPATH=/opt/intel/itac/8.0.3.007/man:/opt/intel/impi/4.1.0.024/man:/opt/intel/composer_xe_2013.0.079/man/en_US:/opt/intel/composer_xe_2013.0.079/man/en_US:/opt/intel/impi/4.1.0.024/man:/usr/kerberos/man:/usr/java/latest/man:/usr/local/share/man:/usr/share/man/en:/usr/share/man:/opt/ganglia/man:/opt/rocks/man:/opt/condor/man:/opt/tripwire/man:/opt/openmpi/share/man:/opt/sun-ct/man:/opt/gridengine/man::/opt/intel/vtune_amplifier_xe_2013/man' 'PDSHROOT=/opt/pdsh' 'SELINUX_INIT=YES' 'CONSOLE=/dev/console' 'VT_MPI=impi4' 'HOSTNAME=compute-0-1.local' 'SGE_INFOTEXT_MAX_COLUMN=5000' 'INTEL_LICENSE_FILE=/opt/intel/licenses:/opt/intel/composer_xe_2013.0.079/licenses:/opt/intel/licenses:/home/ivasan/intel/licenses' 'IPPROOT=/opt/intel/composer_xe_2013.0.079/ipp' 'SGE_TASK_STEPSIZE=undefined' 'TERM=vt100' 'SHELL=/bin/bash' 'ECLIPSE_HOME=/opt/eclipse' 'HISTSIZE=1000' 'NHOSTS=2' 'CONDOR_IDS=407.407' 'SSH_CLIENT=157.88.111.46 1365 22' 'TMPDIR=/tmp/509.1.all.q' 'SGE_O_WORKDIR=/home/ivasan/test_impi/IMB'
'LIBRARY_PATH=/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/../compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/lib/intel64:/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/mkl/lib/intel64:/opt/intel/composer_xe_2013.0.079/tbb/lib/intel64' 'SGE_O_HOME=/home/ivasan' 'SGE_CELL=default' 'SGE_ARCH=lx26-amd64' 'MPICH_PROCESS_GROUP=no' 'MIC_LD_LIBRARY_PATH=/opt/intel/composer_xe_2013.0.079/compiler/lib/mic:/opt/intel/mic/coi/device-linux-release/lib:/opt/intel/mic/myo/lib:/opt/intel/composer_xe_2013.0.079/compiler/lib/mic:/opt/intel/composer_xe_2013.0.079/mkl/lib/mic:/opt/intel/composer_xe_2013.0.079/tbb/lib/mic' 'ROCKSROOT=/opt/rocks/share/devel' 'SSH_TTY=/dev/pts/2' 'RESTARTED=0' 'ANT_HOME=/opt/rocks' 'ARC=lx26-amd64' 'USER=ivasan' 'LS_COLORS=no=00:fi=00:di=01;34:ln=01;36:pi=40;33:so=01;35:bd=40;33;01:cd=40;33;01:or=01;05;37;41:mi=01;05;37;41:ex=01;32:*.cmd=01;32:*.exe=01;32:*.com=01;32:*.btm=01;32:*.bat=01;32:*.sh=01;32:*.csh=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.gz=01;31:*.bz2=01;31:*.bz=01;31:*.tz=01;31:*.rpm=01;31:*.cpio=01;31:*.jpg=01;35:*.gif=01;35:*.bmp=01;35:*.xbm=01;35:*.xpm=01;35:*.png=01;35:*.tif=01;35:' 'INIT_VERSION=sysvinit-2.86' 'SGE_TASK_LAST=undefined' 'ROCKS_ROOT=/opt/rocks' 'QUEUE=all.q' 'CPATH=/opt/intel/composer_xe_2013.0.079/mkl/include:/opt/intel/composer_xe_2013.0.079/tbb/include' 'SGE_TASK_ID=undefined' 'NLSPATH=/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64/locale/%l_%t/%N:/opt/intel/composer_xe_2013.0.079/ipp/lib/intel64/locale/%l_%t/%N:/opt/intel/composer_xe_2013.0.079/mkl/lib/intel64/locale/%l_%t/%N:/opt/intel/composer_xe_2013.0.079/debugger/intel64/locale/%l_%t/%N' 'PATH=/tmp/509.1.all.q:/opt/intel/vtune_amplifier_xe_2013/bin64:/opt/intel/itac/8.0.3.007/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/mpirt/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64_mic:/opt/intel/composer_xe_2013.0.079/debugger/gui/intel64:/opt/intel/impi/4.1.0.024/intel64/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/maven/bin:/opt/pdsh/bin:/opt/rocks/bin:/opt/rocks/sbin:/opt/condor/bin:/opt/condor/sbin:/opt/gridengine/bin/lx26-amd64:/usr/sbin:/home/ivasan/bin:/home/ivasan/binvasp' 'VT_ADD_LIBS=-ldwarf -lelf -lvtunwind -lnsl -lm -ldl -lpthread' 'MAVEN_HOME=/opt/maven' 'MAIL=/var/spool/mail/ivasan' 'SGE_BINARY_PATH=/opt/gridengine/bin/lx26-amd64' 'RUNLEVEL=3' 'TBBROOT=/opt/intel/composer_xe_2013.0.079/tbb' 'CONDOR_CONFIG=/opt/condor/etc/condor_config' 'SGE_STDERR_PATH=/home/ivasan/test_impi/IMB/test.impi.IMB.o509' 'PWD=/home/ivasan/test_impi/IMB' 'INPUTRC=/etc/inputrc' 'JAVA_HOME=/usr/java/latest' 'SGE_EXECD_PORT=537' 'SGE_ACCOUNT=sge' 'SGE_STDOUT_PATH=/home/ivasan/test_impi/IMB/test.impi.IMB.o509' 'LANG=en_US.iso885915' 'SGE_QMASTER_PORT=536' 'JOB_NAME=test.impi.IMB' 'JOB_SCRIPT=/opt/gridengine/default/spool/compute-0-1/job_scripts/509' 'SGE_ROOT=/opt/gridengine' 'SGE_NOMSG=1' 'VT_LIB_DIR=/opt/intel/itac/8.0.3.007/itac/lib_impi4' 'CONDOR_ROOT=/opt/condor' 'PREVLEVEL=N' 'VT_ROOT=/opt/intel/itac/8.0.3.007' 'REQNAME=test.impi.IMB' 'VTUNE_AMPLIFIER_XE_2013_DIR=/opt/intel/vtune_amplifier_xe_2013' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'ENVIRONMENT=BATCH' 
'SGE_JOB_SPOOL_DIR=/opt/gridengine/default/spool/compute-0-1/active_jobs/509.1' 'PE_HOSTFILE=/opt/gridengine/default/spool/compute-0-1/active_jobs/509.1/pe_hostfile' 'HOME=/home/ivasan' 'SHLVL=3' 'NQUEUES=2' 'SGE_CWD_PATH=/home/ivasan/test_impi/IMB' 'SGE_O_LOGNAME=ivasan' 'ROLLSROOT=/opt/rocks/share/devel/src/roll' 'VT_SLIB_DIR=/opt/intel/itac/8.0.3.007/itac/slib_impi4' 'SGE_O_MAIL=/var/spool/mail/ivasan' 'LOGNAME=ivasan' 'JOB_ID=509' 'TMP=/tmp/509.1.all.q' 'CVS_RSH=ssh' 'CLASSPATH=/opt/intel/itac/8.0.3.007/itac/lib_impi4' 'SSH_CONNECTION=157.88.111.46 1365 157.88.111.174 22' 'PE=impi' 'I_MPI_HYDRA_BOOTSTRAP=sge' 'SGE_TASK_FIRST=undefined' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'SGE_O_PATH=/opt/intel/vtune_amplifier_xe_2013/bin64:/opt/intel/itac/8.0.3.007/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/mpirt/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64_mic:/opt/intel/composer_xe_2013.0.079/debugger/gui/intel64:/opt/intel/impi/4.1.0.024/intel64/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/maven/bin:/opt/pdsh/bin:/opt/rocks/bin:/opt/rocks/sbin:/opt/condor/bin:/opt/condor/sbin:/opt/gridengine/bin/lx26-amd64:/usr/sbin:/home/ivasan/bin:/home/ivasan/binvasp' 'SGE_CLUSTER_NAME=gamma' 'SGE_O_SHELL=/bin/bash' 'SGE_O_HOST=gamma' 'REQUEST=test.impi.IMB' 'INCLUDE=/opt/intel/composer_xe_2013.0.079/mkl/include' 'NSLOTS=24' 'G_BROKEN_FILENAMES=1' 'SGE_STDIN_PATH=/dev/null' 'I_MPI_ROOT=/opt/intel/impi/4.1.0.024' '_=/opt/intel/impi/4.1.0.024/intel64/bin/mpiexec.hydra' --global-user-env 1 'I_MPI_DEBUG=5' --global-system-env 2 'MPICH_ENABLE_CKPOINT=1' 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 12 --exec --exec-appnum 0 --exec-proc-count 12 --exec-local-env 0 --exec-wdir /home/ivasan/test_impi/IMB --exec-args 1 IMB-MPI1 [mpiexec@compute-0-1.local] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1 Arguments being passed to proxy 1: --version 1.4.1p1 --iface-ip-env-name MPICH_INTERFACE_HOSTNAME --hostname compute-0-1 --global-core-map 12,12,0 --filler-process-map 12,12,0 --global-process-count 24 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_30666_0 --pmi-process-mapping (vector,(0,2,12)) --topolib ipl --ckpointlib blcr --ckpoint-prefix /tmp --ckpoint-preserve -1 --ckpoint off --ckpoint-num -1 --global-inherited-env 103 'I_MPI_PERHOST=allcores' 'LD_LIBRARY_PATH=/opt/intel/itac/8.0.3.007/itac/slib_impi4:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/mic/coi/host-linux-release/lib:/opt/intel/mic/myo/lib:/opt/intel/composer_xe_2013.0.079/mpirt/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/../compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/lib/intel64:/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/mkl/lib/intel64:/opt/intel/composer_xe_2013.0.079/tbb/lib/intel64:/opt/gridengine/lib/lx26-amd64:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/impi/4.1.0.024/intel64/lib:/opt/intel/impi/4.1.0.024/mic/lib' 'SET_HOST_TYPE= -x ' 'MKLROOT=/opt/intel/composer_xe_2013.0.079/mkl' 
'MANPATH=/opt/intel/itac/8.0.3.007/man:/opt/intel/impi/4.1.0.024/man:/opt/intel/composer_xe_2013.0.079/man/en_US:/opt/intel/composer_xe_2013.0.079/man/en_US:/opt/intel/impi/4.1.0.024/man:/usr/kerberos/man:/usr/java/latest/man:/usr/local/share/man:/usr/share/man/en:/usr/share/man:/opt/ganglia/man:/opt/rocks/man:/opt/condor/man:/opt/tripwire/man:/opt/openmpi/share/man:/opt/sun-ct/man:/opt/gridengine/man::/opt/intel/vtune_amplifier_xe_2013/man' 'PDSHROOT=/opt/pdsh' 'SELINUX_INIT=YES' 'CONSOLE=/dev/console' 'VT_MPI=impi4' 'HOSTNAME=compute-0-1.local' 'SGE_INFOTEXT_MAX_COLUMN=5000' 'INTEL_LICENSE_FILE=/opt/intel/licenses:/opt/intel/composer_xe_2013.0.079/licenses:/opt/intel/licenses:/home/ivasan/intel/licenses' 'IPPROOT=/opt/intel/composer_xe_2013.0.079/ipp' 'SGE_TASK_STEPSIZE=undefined' 'TERM=vt100' 'SHELL=/bin/bash' 'ECLIPSE_HOME=/opt/eclipse' 'HISTSIZE=1000' 'NHOSTS=2' 'CONDOR_IDS=407.407' 'SSH_CLIENT=157.88.111.46 1365 22' 'TMPDIR=/tmp/509.1.all.q' 'SGE_O_WORKDIR=/home/ivasan/test_impi/IMB' 'LIBRARY_PATH=/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/../compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/ipp/lib/intel64:/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64:/opt/intel/composer_xe_2013.0.079/mkl/lib/intel64:/opt/intel/composer_xe_2013.0.079/tbb/lib/intel64' 'SGE_O_HOME=/home/ivasan' 'SGE_CELL=default' 'SGE_ARCH=lx26-amd64' 'MPICH_PROCESS_GROUP=no' 'MIC_LD_LIBRARY_PATH=/opt/intel/composer_xe_2013.0.079/compiler/lib/mic:/opt/intel/mic/coi/device-linux-release/lib:/opt/intel/mic/myo/lib:/opt/intel/composer_xe_2013.0.079/compiler/lib/mic:/opt/intel/composer_xe_2013.0.079/mkl/lib/mic:/opt/intel/composer_xe_2013.0.079/tbb/lib/mic' 'ROCKSROOT=/opt/rocks/share/devel' 'SSH_TTY=/dev/pts/2' 'RESTARTED=0' 'ANT_HOME=/opt/rocks' 'ARC=lx26-amd64' 'USER=ivasan' 'LS_COLORS=no=00:fi=00:di=01;34:ln=01;36:pi=40;33:so=01;35:bd=40;33;01:cd=40;33;01:or=01;05;37;41:mi=01;05;37;41:ex=01;32:*.cmd=01;32:*.exe=01;32:*.com=01;32:*.btm=01;32:*.bat=01;32:*.sh=01;32:*.csh=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.gz=01;31:*.bz2=01;31:*.bz=01;31:*.tz=01;31:*.rpm=01;31:*.cpio=01;31:*.jpg=01;35:*.gif=01;35:*.bmp=01;35:*.xbm=01;35:*.xpm=01;35:*.png=01;35:*.tif=01;35:' 'INIT_VERSION=sysvinit-2.86' 'SGE_TASK_LAST=undefined' 'ROCKS_ROOT=/opt/rocks' 'QUEUE=all.q' 'CPATH=/opt/intel/composer_xe_2013.0.079/mkl/include:/opt/intel/composer_xe_2013.0.079/tbb/include' 'SGE_TASK_ID=undefined' 'NLSPATH=/opt/intel/composer_xe_2013.0.079/compiler/lib/intel64/locale/%l_%t/%N:/opt/intel/composer_xe_2013.0.079/ipp/lib/intel64/locale/%l_%t/%N:/opt/intel/composer_xe_2013.0.079/mkl/lib/intel64/locale/%l_%t/%N:/opt/intel/composer_xe_2013.0.079/debugger/intel64/locale/%l_%t/%N' 'PATH=/tmp/509.1.all.q:/opt/intel/vtune_amplifier_xe_2013/bin64:/opt/intel/itac/8.0.3.007/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/mpirt/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64_mic:/opt/intel/composer_xe_2013.0.079/debugger/gui/intel64:/opt/intel/impi/4.1.0.024/intel64/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/maven/bin:/opt/pdsh/bin:/opt/rocks/bin:/opt/rocks/sbin:/opt/condor/bin:/opt/condor/sbin:/opt/gridengine/bin/lx26-amd64:/usr/sbin:/home/ivasan/bin:/home/ivasan/binvasp' 
'VT_ADD_LIBS=-ldwarf -lelf -lvtunwind -lnsl -lm -ldl -lpthread' 'MAVEN_HOME=/opt/maven' 'MAIL=/var/spool/mail/ivasan' 'SGE_BINARY_PATH=/opt/gridengine/bin/lx26-amd64' 'RUNLEVEL=3' 'TBBROOT=/opt/intel/composer_xe_2013.0.079/tbb' 'CONDOR_CONFIG=/opt/condor/etc/condor_config' 'SGE_STDERR_PATH=/home/ivasan/test_impi/IMB/test.impi.IMB.o509' 'PWD=/home/ivasan/test_impi/IMB' 'INPUTRC=/etc/inputrc' 'JAVA_HOME=/usr/java/latest' 'SGE_EXECD_PORT=537' 'SGE_ACCOUNT=sge' 'SGE_STDOUT_PATH=/home/ivasan/test_impi/IMB/test.impi.IMB.o509' 'LANG=en_US.iso885915' 'SGE_QMASTER_PORT=536' 'JOB_NAME=test.impi.IMB' 'JOB_SCRIPT=/opt/gridengine/default/spool/compute-0-1/job_scripts/509' 'SGE_ROOT=/opt/gridengine' 'SGE_NOMSG=1' 'VT_LIB_DIR=/opt/intel/itac/8.0.3.007/itac/lib_impi4' 'CONDOR_ROOT=/opt/condor' 'PREVLEVEL=N' 'VT_ROOT=/opt/intel/itac/8.0.3.007' 'REQNAME=test.impi.IMB' 'VTUNE_AMPLIFIER_XE_2013_DIR=/opt/intel/vtune_amplifier_xe_2013' 'SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass' 'ENVIRONMENT=BATCH' 'SGE_JOB_SPOOL_DIR=/opt/gridengine/default/spool/compute-0-1/active_jobs/509.1' 'PE_HOSTFILE=/opt/gridengine/default/spool/compute-0-1/active_jobs/509.1/pe_hostfile' 'HOME=/home/ivasan' 'SHLVL=3' 'NQUEUES=2' 'SGE_CWD_PATH=/home/ivasan/test_impi/IMB' 'SGE_O_LOGNAME=ivasan' 'ROLLSROOT=/opt/rocks/share/devel/src/roll' 'VT_SLIB_DIR=/opt/intel/itac/8.0.3.007/itac/slib_impi4' 'SGE_O_MAIL=/var/spool/mail/ivasan' 'LOGNAME=ivasan' 'JOB_ID=509' 'TMP=/tmp/509.1.all.q' 'CVS_RSH=ssh' 'CLASSPATH=/opt/intel/itac/8.0.3.007/itac/lib_impi4' 'SSH_CONNECTION=157.88.111.46 1365 157.88.111.174 22' 'PE=impi' 'I_MPI_HYDRA_BOOTSTRAP=sge' 'SGE_TASK_FIRST=undefined' 'LESSOPEN=|/usr/bin/lesspipe.sh %s' 'SGE_O_PATH=/opt/intel/vtune_amplifier_xe_2013/bin64:/opt/intel/itac/8.0.3.007/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/impi/4.1.0.024/intel64/bin:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/mpirt/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64:/opt/intel/composer_xe_2013.0.079/bin/intel64_mic:/opt/intel/composer_xe_2013.0.079/debugger/gui/intel64:/opt/intel/impi/4.1.0.024/intel64/bin:/usr/kerberos/bin:/usr/java/latest/bin:/usr/local/bin:/bin:/usr/bin:/opt/eclipse:/opt/ganglia/bin:/opt/ganglia/sbin:/opt/maven/bin:/opt/pdsh/bin:/opt/rocks/bin:/opt/rocks/sbin:/opt/condor/bin:/opt/condor/sbin:/opt/gridengine/bin/lx26-amd64:/usr/sbin:/home/ivasan/bin:/home/ivasan/binvasp' 'SGE_CLUSTER_NAME=gamma' 'SGE_O_SHELL=/bin/bash' 'SGE_O_HOST=gamma' 'REQUEST=test.impi.IMB' 'INCLUDE=/opt/intel/composer_xe_2013.0.079/mkl/include' 'NSLOTS=24' 'G_BROKEN_FILENAMES=1' 'SGE_STDIN_PATH=/dev/null' 'I_MPI_ROOT=/opt/intel/impi/4.1.0.024' '_=/opt/intel/impi/4.1.0.024/intel64/bin/mpiexec.hydra' --global-user-env 1 'I_MPI_DEBUG=5' --global-system-env 2 'MPICH_ENABLE_CKPOINT=1' 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 12 --exec --exec-appnum 0 --exec-proc-count 12 --exec-local-env 0 --exec-wdir /home/ivasan/test_impi/IMB --exec-args 1 IMB-MPI1 [mpiexec@compute-0-1.local] Launch arguments: /usr/bin/ssh -x -q compute-0-0 /opt/intel/impi/4.1.0.024/intel64/bin/pmi_proxy --control-port compute-0-1:49467 --debug --pmi-connect lazy-cache --pmi-aggregate -s 0 --rmk sge --launcher ssh --demux poll --pgid 0 --enable-stdin 1 --retries 10 --proxy-id 0 [mpiexec@compute-0-1.local] Launch arguments: /opt/intel/impi/4.1.0.024/intel64/bin/pmi_proxy --control-port compute-0-1:49467 --debug --pmi-connect lazy-cache --pmi-aggregate -s 0 --rmk sge --launcher ssh --demux poll --pgid 0 --enable-stdin 1 --retries 10 
--proxy-id 1 [mpiexec@compute-0-1.local] STDIN will be redirected to 1 fd(s): 7 [proxy:0:1@compute-0-1.local] Start PMI_proxy 1 [proxy:0:1@compute-0-1.local] got pmi command (from 7): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 11): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 15): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 18): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 21): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 24): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 27): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 18): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 21): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 24): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 18): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 21): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 27): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 30): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 11): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 24): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 27): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 11): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 30): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 30): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 7): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 15): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 7): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 15): 
barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 33): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 33): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 33): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 36): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 36): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 36): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 39): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 39): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 39): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 42): init pmi_version=1 pmi_subversion=1 [proxy:0:1@compute-0-1.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:1@compute-0-1.local] got pmi command (from 42): get_maxes [proxy:0:1@compute-0-1.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:1@compute-0-1.local] got pmi command (from 42): barrier_in [proxy:0:1@compute-0-1.local] forwarding command (cmd=barrier_in) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:0@compute-0-0.local] Start PMI_proxy 0 [proxy:0:0@compute-0-0.local] STDIN will be redirected to 1 fd(s): 9 [proxy:0:0@compute-0-0.local] got pmi command (from 6): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@compute-0-0.local] got pmi command (from 7): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@compute-0-0.local] got pmi command (from 8): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@compute-0-0.local] got pmi command (from 13): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@compute-0-0.local] got pmi command (from 16): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@compute-0-0.local] got pmi command (from 19): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@compute-0-0.local] got pmi command (from 6): get_maxes [proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 7): get_maxes [proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 8): get_maxes 
[proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 13): get_maxes [proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 16): get_maxes [proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 19): get_maxes [proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 25): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@compute-0-0.local] got pmi command (from 6): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 7): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 8): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 13): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 16): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 19): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 25): get_maxes [proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 25): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 28): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@compute-0-0.local] got pmi command (from 28): get_maxes [proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 28): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 31): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@compute-0-0.local] got pmi command (from 31): get_maxes [proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 31): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 22): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@compute-0-1.local] PMI response to fd 0 pid 34: cmd=barrier_out [mpiexec@compute-0-1.local] PMI response to fd 6 pid 34: cmd=barrier_out [proxy:0:0@compute-0-0.local] got pmi command (from 22): get_maxes [proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 22): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 37): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@compute-0-0.local] got pmi command (from 34): init pmi_version=1 pmi_subversion=1 [proxy:0:0@compute-0-0.local] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@compute-0-0.local] got pmi command (from 37): get_maxes [proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 34): 
get_maxes [proxy:0:0@compute-0-0.local] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@compute-0-0.local] got pmi command (from 37): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 34): barrier_in [proxy:0:0@compute-0-0.local] forwarding command (cmd=barrier_in) upstream [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] got pmi command (from 7): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 11): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 15): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 18): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 21): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 24): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 27): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 30): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 33): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 7): get_appnum [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-1.local] got pmi command (from 11): get_appnum [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-1.local] got pmi command (from 15): get_appnum [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-1.local] 
got pmi command (from 18): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] got pmi command (from 6): get_ranks2hosts [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-1.local] got pmi command (from 21): get_appnum [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-1.local] got pmi command (from 24): get_appnum [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-1.local] got pmi command (from 27): get_appnum [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 7): get_ranks2hosts [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:0@compute-0-0.local] got pmi command (from 8): get_ranks2hosts [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:0@compute-0-0.local] got pmi command (from 13): get_ranks2hosts [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:0@compute-0-0.local] got pmi command (from 16): get_ranks2hosts [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:0@compute-0-0.local] got pmi command (from 19): get_ranks2hosts [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:0@compute-0-0.local] got pmi command (from 22): get_ranks2hosts [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 30): get_appnum [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-1.local] got pmi command (from 36): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 39): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 42): get_ranks2hosts [proxy:0:1@compute-0-1.local] PMI response: 
put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:1@compute-0-1.local] got pmi command (from 7): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 11): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 15): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 18): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 21): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 24): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 27): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 30): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 33): get_appnum [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-1.local] got pmi command (from 36): get_appnum [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-1.local] got pmi command (from 39): get_appnum [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-1.local] got pmi command (from 7): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 11): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 25): get_ranks2hosts [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:0@compute-0-0.local] got pmi command (from 28): get_ranks2hosts [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:0@compute-0-0.local] got pmi command (from 6): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 7): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 8): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 13): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 16): get_appnum [proxy:0:1@compute-0-1.local] got pmi command (from 15): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 18): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 21): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 
[proxy:0:1@compute-0-1.local] got pmi command (from 24): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 27): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 30): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 33): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 36): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 42): get_appnum [proxy:0:1@compute-0-1.local] PMI response: cmd=appnum appnum=0 [proxy:0:1@compute-0-1.local] got pmi command (from 7): put kvsname=kvs_30666_0 key=sharedFilename[12] value=/dev/shm/Intel_MPI_ig7vy7 [proxy:0:1@compute-0-1.local] forwarding command (cmd=put kvsname=kvs_30666_0 key=sharedFilename[12] value=/dev/shm/Intel_MPI_ig7vy7) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=put kvsname=kvs_30666_0 key=sharedFilename[12] value=/dev/shm/Intel_MPI_ig7vy7 [mpiexec@compute-0-1.local] PMI response to fd 6 pid 7: cmd=put_result rc=0 msg=success [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 19): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 22): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 25): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 28): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 31): get_ranks2hosts [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:0@compute-0-0.local] got pmi command (from 34): get_ranks2hosts [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:0@compute-0-0.local] got pmi command (from 37): get_ranks2hosts [proxy:0:1@compute-0-1.local] got pmi command (from 11): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 15): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 21): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 24): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 27): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 33): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 39): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] we don't understand the response put_result; forwarding downstream [proxy:0:1@compute-0-1.local] got pmi command (from 18): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 30): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 36): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 
[proxy:0:1@compute-0-1.local] got pmi command (from 42): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 7): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 33): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 39): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 42): get_my_kvsname [proxy:0:1@compute-0-1.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 36): barrier_in [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=put kvsname=kvs_30666_0 key=sharedFilename[0] value=/dev/shm/Intel_MPI_yrXxJf [mpiexec@compute-0-1.local] PMI response to fd 0 pid 6: cmd=put_result rc=0 msg=success [proxy:0:0@compute-0-0.local] PMI response: put_ranks2hosts 96 2 11 compute-0-0 0,1,2,3,4,5,6,7,8,9,10,11, 11 compute-0-1 12,13,14,15,16,17,18,19,20,21,22,23, [proxy:0:0@compute-0-0.local] got pmi command (from 6): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 7): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 8): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 13): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 16): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 19): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 22): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 25): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 28): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 31): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 6): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 7): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 8): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 13): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:1@compute-0-1.local] got pmi command (from 39): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 42): barrier_in [proxy:0:1@compute-0-1.local] forwarding command (cmd=barrier_in) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 16): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname 
kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 19): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 22): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 25): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 28): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@compute-0-1.local] PMI response to fd 0 pid 37: cmd=barrier_out [mpiexec@compute-0-1.local] PMI response to fd 6 pid 37: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] got pmi command (from 34): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 37): get_appnum [proxy:0:0@compute-0-0.local] PMI response: cmd=appnum appnum=0 [proxy:0:0@compute-0-0.local] got pmi command (from 6): put kvsname=kvs_30666_0 key=sharedFilename[0] value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:0@compute-0-0.local] forwarding command (cmd=put kvsname=kvs_30666_0 key=sharedFilename[0] value=/dev/shm/Intel_MPI_yrXxJf) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 7): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 8): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 13): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 16): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 22): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 25): barrier_in [proxy:0:0@compute-0-0.local] [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] got pmi command (from 11): get kvsname=kvs_30666_0 key=sharedFilename[12] [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_ig7vy7 got pmi command (from 28): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 31): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 34): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 19): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 31): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 37): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] we don't understand the response put_result; forwarding downstream 
[proxy:0:0@compute-0-0.local] got pmi command (from 34): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 31): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 37): get_my_kvsname [proxy:0:0@compute-0-0.local] PMI response: cmd=my_kvsname kvsname=kvs_30666_0 [proxy:0:0@compute-0-0.local] got pmi command (from 6): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 34): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 37): barrier_in [proxy:0:0@compute-0-0.local] forwarding command (cmd=barrier_in) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 15): get kvsname=kvs_30666_0 key=sharedFilename[12] [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_ig7vy7 [proxy:0:1@compute-0-1.local] got pmi command (from 18): get kvsname=kvs_30666_0 key=sharedFilename[12] [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_ig7vy7 [proxy:0:1@compute-0-1.local] got pmi command (from 21): get kvsname=kvs_30666_0 key=sharedFilename[12] [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_ig7vy7 [proxy:0:1@compute-0-1.local] got pmi command (from 24): get kvsname=kvs_30666_0 key=sharedFilename[12] [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_ig7vy7 [proxy:0:1@compute-0-1.local] got pmi command (from 27): get kvsname=kvs_30666_0 key=sharedFilename[12] [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_ig7vy7 [proxy:0:1@compute-0-1.local] got pmi command (from 30): get kvsname=kvs_30666_0 key=sharedFilename[12] [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_ig7vy7 [proxy:0:1@compute-0-1.local] got pmi command (from 33): get kvsname=kvs_30666_0 key=sharedFilename[12] [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=sharedFilename[0] [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=sharedFilename[0] [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_ig7vy7 [proxy:0:1@compute-0-1.local] got pmi command (from 39): get kvsname=kvs_30666_0 key=sharedFilename[12] [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_ig7vy7 [proxy:0:1@compute-0-1.local] got pmi command (from 36): get 
kvsname=kvs_30666_0 key=sharedFilename[12] [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_ig7vy7 [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=sharedFilename[0] [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=sharedFilename[0] [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=sharedFilename[0] [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=sharedFilename[0] [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=sharedFilename[0] [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=sharedFilename[0] [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:1@compute-0-1.local] got pmi command (from 42): get kvsname=kvs_30666_0 key=sharedFilename[12] [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_ig7vy7 [18] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [22] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [17] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [0] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [12] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [3] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [4] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [15] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [6] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [10] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [1] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [23] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [20] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [8] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [9] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [19] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [2] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [16] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [7] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [14] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [11] DAPL startup(): trying to open default DAPL provider from dat 
registry: ofa-v2-mlx4_0-1 [13] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [21] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [5] DAPL startup(): trying to open default DAPL provider from dat registry: ofa-v2-mlx4_0-1 [9] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [18] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [4] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [proxy:0:1@compute-0-1.local] got pmi command (from 27): barrier_in [3] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [6] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [1] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [12] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [proxy:0:1@compute-0-1.local] got pmi command (from 7): barrier_in [22] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [2] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [proxy:0:1@compute-0-1.local] got pmi command (from 39): barrier_in [17] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [5] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [8] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [proxy:0:1@compute-0-1.local] got pmi command (from 24): barrier_in [15] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [proxy:0:1@compute-0-1.local] got pmi command (from 18): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=sharedFilename[0] [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=sharedFilename[0] [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=sharedFilename[0] [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_yrXxJf [proxy:0:0@compute-0-0.local] got pmi command (from 31): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 16): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 13): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 22): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 7): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 8): barrier_in [0] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=put kvsname=kvs_30666_0 key=DAPL_PROVIDER value=ofa-v2-mlx4_0-1(v2.0) [mpiexec@compute-0-1.local] PMI response to fd 0 pid 6: cmd=put_result rc=0 msg=success [14] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [20] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [13] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [7] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [proxy:0:1@compute-0-1.local] got pmi command (from 11): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 15): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 33): barrier_in [10] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [11] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in [16] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [proxy:0:0@compute-0-0.local] got pmi command (from 19): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 28): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 6): put kvsname=kvs_30666_0 key=DAPL_PROVIDER value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:0@compute-0-0.local] forwarding command (cmd=put kvsname=kvs_30666_0 
key=DAPL_PROVIDER value=ofa-v2-mlx4_0-1(v2.0)) upstream [proxy:0:0@compute-0-0.local] we don't understand the response put_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 6): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 25): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 34): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 21): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 37): barrier_in [proxy:0:0@compute-0-0.local] forwarding command (cmd=barrier_in) upstream [21] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [19] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [proxy:0:1@compute-0-1.local] got pmi command (from 36): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 30): barrier_in [23] MPI startup(): DAPL provider ofa-v2-mlx4_0-1 [proxy:0:1@compute-0-1.local] got pmi command (from 42): barrier_in [proxy:0:1@compute-0-1.local] forwarding command (cmd=barrier_in) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@compute-0-1.local] PMI response to fd 0 pid 42: cmd=barrier_out [mpiexec@compute-0-1.local] PMI response to fd 6 pid 42: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER [mpiexec@compute-0-1.local] PMI response to fd 6 pid 7: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] got pmi command (from 11): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 15): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 18): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER [mpiexec@compute-0-1.local] PMI response to fd 6 pid 11: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) key=DAPL_PROVIDER forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 21): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER 
[mpiexec@compute-0-1.local] PMI response to fd 6 pid 15: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] got pmi command (from 24): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER [mpiexec@compute-0-1.local] PMI response to fd 6 pid 18: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] got pmi command (from 27): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 30): get [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER [mpiexec@compute-0-1.local] PMI response to fd 6 pid 21: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) key=DAPL_PROVIDER kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 33): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER [mpiexec@compute-0-1.local] PMI response to fd 6 pid 24: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 36): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER [mpiexec@compute-0-1.local] PMI response to fd 6 pid 27: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] got pmi command (from 39): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER [mpiexec@compute-0-1.local] PMI response to fd 6 pid 30: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER [mpiexec@compute-0-1.local] PMI response to fd 6 pid 33: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) key=DAPL_PROVIDER [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out 
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] got pmi command (from 6): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:0@compute-0-0.local] [proxy:0:1@compute-0-1.local] got pmi command (from 42): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER [mpiexec@compute-0-1.local] PMI response to fd 6 pid 36: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) key=DAPL_PROVIDER got pmi command (from 8): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:1@compute-0-1.local] got pmi command (from 7): barrier_in [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_PROVIDER [mpiexec@compute-0-1.local] PMI response to fd 6 pid 39: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) key=DAPL_PROVIDER [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:1@compute-0-1.local] got pmi command (from 11): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 42): barrier_in [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:1@compute-0-1.local] got pmi command (from 15): barrier_in [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:1@compute-0-1.local] got pmi command (from 18): barrier_in [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 21): barrier_in [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:1@compute-0-1.local] got pmi command (from 24): barrier_in [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:1@compute-0-1.local] got pmi command (from 27): barrier_in [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:1@compute-0-1.local] got pmi command (from 30): barrier_in [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 
key=DAPL_PROVIDER [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:1@compute-0-1.local] got pmi command (from 33): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 36): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 39): barrier_in [proxy:0:1@compute-0-1.local] forwarding command (cmd=barrier_in) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@compute-0-1.local] PMI response to fd 0 pid 39: cmd=barrier_out [mpiexec@compute-0-1.local] PMI response to fd 6 pid 39: cmd=barrier_out [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:0@compute-0-0.local] got pmi command (from 7): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 8): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 13): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 16): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 19): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 22): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 25): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 28): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 31): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=DAPL_PROVIDER [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=ofa-v2-mlx4_0-1(v2.0) [proxy:0:0@compute-0-0.local] got pmi command (from 34): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 37): barrier_in [proxy:0:0@compute-0-0.local] forwarding command (cmd=barrier_in) upstream [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 7: cmd=get_result rc=-1 
msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:1@compute-0-1.local] got pmi command (from 11): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 15): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 18): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 11: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 6: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 21): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 24): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 27): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 30): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 33): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 15: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 7: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 36): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 39): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 18: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 8: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown 
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] got pmi command (from 42): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 21: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 24: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 16: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 27: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 19: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 
key=DAPL_MISMATCH) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 30: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 22: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 33: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 25: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 36: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 28: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 39: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 31: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream 
[proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 6 pid 42: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 34: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [12] MPI startup(): shm and dapl data transfer modes [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH [mpiexec@compute-0-1.local] PMI response to fd 0 pid 37: cmd=get_result rc=-1 msg=key_DAPL_MISMATCH_not_found value=unknown [proxy:0:1@compute-0-1.local] we don't understand the response get_result; forwarding downstream [13] MPI startup(): shm and dapl data transfer modes [0] MPI startup(): shm and dapl data transfer modes [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=DAPL_MISMATCH [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=DAPL_MISMATCH) upstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [14] MPI startup(): shm and dapl data transfer modes [1] MPI startup(): shm and dapl data transfer modes [15] MPI startup(): shm and dapl data transfer modes [2] MPI startup(): shm and dapl data transfer modes [16] MPI startup(): shm and dapl data transfer modes [3] MPI startup(): shm and dapl data transfer modes [17] MPI startup(): shm and dapl data transfer modes [4] MPI startup(): shm and dapl data transfer modes [18] MPI startup(): shm and dapl data transfer modes [5] MPI startup(): shm and dapl data transfer modes [19] MPI startup(): shm and dapl data transfer modes [6] MPI startup(): shm and dapl data transfer modes [20] MPI startup(): shm and dapl data transfer modes [7] MPI startup(): shm and dapl data transfer modes [proxy:0:1@compute-0-1.local] got pmi command (from 7): put kvsname=kvs_30666_0 key=P12-businesscard-0 [21] MPI startup(): shm and dapl data transfer modes [8] MPI startup(): shm and dapl data transfer modes [22] MPI startup(): shm and dapl data transfer modes [9] MPI startup(): shm and dapl data transfer modes [23] MPI startup(): shm and dapl data transfer modes [10] MPI startup(): shm and dapl data transfer modes [11] MPI startup(): shm and dapl data transfer modes value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] got pmi command (from 11): put kvsname=kvs_30666_0 key=P13-businesscard-0 value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] 
got pmi command (from 18): put kvsname=kvs_30666_0 key=P15-businesscard-0 value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] got pmi command (from 21): put kvsname=kvs_30666_0 key=P16-businesscard-0 value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] got pmi command (from 27): put kvsname=kvs_30666_0 key=P18-businesscard-0 value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] got pmi command (from 30): put kvsname=kvs_30666_0 key=P19-businesscard-0 value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] got pmi command (from 33): put kvsname=kvs_30666_0 key=P20-businesscard-0 value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] got pmi command (from 39): put kvsname=kvs_30666_0 key=P22-businesscard-0 value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] got pmi command (from 42): put kvsname=kvs_30666_0 key=P23-businesscard-0 value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] got pmi command (from 15): put kvsname=kvs_30666_0 key=P14-businesscard-0 value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] got pmi command (from 24): put kvsname=kvs_30666_0 key=P17-businesscard-0 value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] got pmi command (from 36): put kvsname=kvs_30666_0 key=P21-businesscard-0 value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P12-businesscard-0 value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P13-businesscard-0 value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P15-businesscard-0 value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P16-businesscard-0 value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P18-businesscard-0 value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 
key=P19-businesscard-0 value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P20-businesscard-0 value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P22-businesscard-0 value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P23-businesscard-0 value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P14-businesscard-0 value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P17-businesscard-0 value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P21-businesscard-0 value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 8): put kvsname=kvs_30666_0 key=P2-businesscard-0 value=rdma_port0#24273$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 13): put kvsname=kvs_30666_0 key=P3-businesscard-0 value=rdma_port0#24274$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 16): put kvsname=kvs_30666_0 key=P4-businesscard-0 value=rdma_port0#24275$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): put kvsname=kvs_30666_0 key=P8-businesscard-0 value=rdma_port0#24279$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): put kvsname=kvs_30666_0 key=P10-businesscard-0 value=rdma_port0#24281$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): put kvsname=kvs_30666_0 key=P0-businesscard-0 
value=rdma_port0#24271$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): put kvsname=kvs_30666_0 key=P1-businesscard-0 value=rdma_port0#24272$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 19): put kvsname=kvs_30666_0 key=P5-businesscard-0 value=rdma_port0#24276$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): put kvsname=kvs_30666_0 key=P6-businesscard-0 value=rdma_port0#24277$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): put kvsname=kvs_30666_0 key=P7-businesscard-0 value=rdma_port0#24278$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P2-businesscard-0 value=rdma_port0#24273$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P3-businesscard-0 value=rdma_port0#24274$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P4-businesscard-0 value=rdma_port0#24275$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P8-businesscard-0 value=rdma_port0#24279$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P10-businesscard-0 value=rdma_port0#24281$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P0-businesscard-0 value=rdma_port0#24271$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P1-businesscard-0 value=rdma_port0#24272$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P5-businesscard-0 value=rdma_port0#24276$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P6-businesscard-0 value=rdma_port0#24277$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 
0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P7-businesscard-0 value=rdma_port0#24278$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P9-businesscard-0 value=rdma_port0#24280$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [mpiexec@compute-0-1.local] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_30666_0 key=P11-businesscard-0 value=rdma_port0#24282$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] reply: cmd=put_result rc=0 msg=success [proxy:0:1@compute-0-1.local] got pmi command (from 7): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 11): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 18): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 21): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 27): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 30): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 33): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 42): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 15): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 24): barrier_in [proxy:0:1@compute-0-1.local] got pmi command (from 36): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 31): put kvsname=kvs_30666_0 key=P9-businesscard-0 value=rdma_port0#24280$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 37): put kvsname=kvs_30666_0 key=P11-businesscard-0 value=rdma_port0#24282$rdma_host0#2:0:0:10:1:255:254:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:1@compute-0-1.local] got pmi command (from 39): barrier_in [proxy:0:1@compute-0-1.local] forwarding command (cmd=barrier_in) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@compute-0-1.local] PMI response to fd 0 pid 28: cmd=barrier_out [mpiexec@compute-0-1.local] PMI response to fd 6 pid 28: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P12-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P12-businesscard-0 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 
key=P12-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 16: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P12-businesscard-0 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P13-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P13-businesscard-0 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P13-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 16: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P13-businesscard-0 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P14-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P14-businesscard-0 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P15-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P15-businesscard-0 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P16-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] got pmi command (from 6): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 7): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 8): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 13): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 16): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 19): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 34): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 22): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 25): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 31): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 37): barrier_in [proxy:0:0@compute-0-0.local] got pmi command (from 28): barrier_in [proxy:0:0@compute-0-0.local] forwarding command (cmd=barrier_in) upstream [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P17-businesscard-0 [mpiexec@compute-0-1.local] PMI 
response to fd 0 pid 13: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P12-businesscard-0) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P12-businesscard-0) upstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P13-businesscard-0) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P13-businesscard-0) upstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P14-businesscard-0) upstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P15-businesscard-0) upstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P16-businesscard-0) upstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 
key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P17-businesscard-0) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P18-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P18-businesscard-0) upstream [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P19-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P19-businesscard-0 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P20-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 16: cmd=get_result rc=0 msg=success value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P20-businesscard-0 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P20-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success 
value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P19-businesscard-0) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success 
value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P20-businesscard-0) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P20-businesscard-0) upstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get 
kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P21-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P21-businesscard-0) upstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=P12-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30669$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success 
value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P22-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 16: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P22-businesscard-0 [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P22-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P22-businesscard-0) upstream [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P22-businesscard-0) upstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P13-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30670$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ 
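Every request/response pair in this trace is a PMI "get" of a P<rank>-businesscard-0 key, and every answer carries the same kind of business card: key#value fields separated by "$", holding what appears to be the DAPL endpoint (rdma_port0, rdma_host0) plus the fabrics_list (shm_and_dapl). As a rough aid for following the trace, the minimal Python sketch below splits one such value into its fields; the layout is inferred only from the lines above, not from any official PMI or Intel MPI specification.

# Minimal sketch: split a business-card value from this trace into fields.
# Assumes the "key#value$key#value$...$" layout visible in the log above.
def parse_business_card(value):
    fields = {}
    for token in value.strip("$").split("$"):
        key, _, val = token.partition("#")
        fields[key] = val
    return fields

card = ("rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0"
        "$fabrics_list#shm_and_dapl$")
print(parse_business_card(card))
# -> {'rdma_port0': '30674',
#     'rdma_host0': '2:0:0:10:1:255:253:0:0:0:0:0:0:0:0',
#     'fabrics_list': 'shm_and_dapl'}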
[proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=get kvsname=kvs_30666_0 key=P23-businesscard-0 [mpiexec@compute-0-1.local] PMI response to fd 0 pid 13: cmd=get_result rc=0 msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 13): get kvsname=kvs_30666_0 key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] forwarding command (cmd=get kvsname=kvs_30666_0 key=P23-businesscard-0) upstream [proxy:0:0@compute-0-0.local] we don't understand the response get_result; forwarding downstream [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P14-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30671$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 16): get kvsname=kvs_30666_0 key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 
msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P15-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30672$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] PMI 
response: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P16-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30673$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P17-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30674$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=P20-businesscard-0 
[proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 19): get kvsname=kvs_30666_0 key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P18-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30675$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 7): get kvsname=kvs_30666_0 key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 8): get 
kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P19-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30676$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] 
got pmi command (from 28): get kvsname=kvs_30666_0 key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P22-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P20-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30677$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 6): get kvsname=kvs_30666_0 key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 34): get kvsname=kvs_30666_0 key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 28): get kvsname=kvs_30666_0 key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 37): get kvsname=kvs_30666_0 key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 25): get kvsname=kvs_30666_0 key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 8): get kvsname=kvs_30666_0 key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 22): get kvsname=kvs_30666_0 key=P23-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$ [proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P21-businesscard-0 [proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success 
value=rdma_port0#30678$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$
[proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P22-businesscard-0
[proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30679$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$
[0] MPI startup(): Rank    Pid      Node name          Pin cpu
[0] MPI startup(): 0       24271    compute-0-0.local  0
[0] MPI startup(): 1       24272    compute-0-0.local  2
[0] MPI startup(): 2       24273    compute-0-0.local  4
[0] MPI startup(): 3       24274    compute-0-0.local  6
[0] MPI startup(): 4       24275    compute-0-0.local  8
[0] MPI startup(): 5       24276    compute-0-0.local  10
[0] MPI startup(): 6       24277    compute-0-0.local  1
[0] MPI startup(): 7       24278    compute-0-0.local  3
[0] MPI startup(): 8       24279    compute-0-0.local  5
[0] MPI startup(): 9       24280    compute-0-0.local  7
[0] MPI startup(): 10      24281    compute-0-0.local  9
[0] MPI startup(): 11      24282    compute-0-0.local  11
[0] MPI startup(): 12      30669    compute-0-1.local  0
[0] MPI startup(): 13      30670    compute-0-1.local  2
[0] MPI startup(): 14      30671    compute-0-1.local  4
[0] MPI startup(): 15      30672    compute-0-1.local  6
[0] MPI startup(): 16      30673    compute-0-1.local  8
[0] MPI startup(): 17      30674    compute-0-1.local  10
[0] MPI startup(): 18      30675    compute-0-1.local  1
[0] MPI startup(): 19      30676    compute-0-1.local  3
[0] MPI startup(): 20      30677    compute-0-1.local  5
[0] MPI startup(): 21      30678    compute-0-1.local  7
[0] MPI startup(): 22      30679    compute-0-1.local  9
[0] MPI startup(): 23      30680    compute-0-1.local  11
[0] MPI startup(): I_MPI_DEBUG=5
[0] MPI startup(): I_MPI_PIN_MAPPING=12:0 0,1 2,2 4,3 6,4 8,5 10,6 1,7 3,8 5,9 7,10 9,11 11
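The startup table above shows the 24 ranks split across the two hosts (ranks 0-11 on compute-0-0, ranks 12-23 on compute-0-1) and pinned to the even-numbered cores first and then the odd ones; I_MPI_PIN_MAPPING encodes the same per-node placement in a single string. The Python sketch below expands that string into a rank-to-core map; the "<ranks-per-node>:<rank> <cpu>,<rank> <cpu>,..." layout is read off this log, not taken from Intel MPI documentation.

# Minimal sketch: expand the I_MPI_PIN_MAPPING string printed above into
# a {rank: cpu} map.  Format assumed from this log: "<n>:<rank> <cpu>,...".
def parse_pin_mapping(mapping):
    nranks, _, pairs = mapping.partition(":")
    pin = {int(r): int(c) for r, c in (p.split() for p in pairs.split(","))}
    assert len(pin) == int(nranks)
    return pin

print(parse_pin_mapping("12:0 0,1 2,2 4,3 6,4 8,5 10,6 1,7 3,8 5,9 7,10 9,11 11"))
# -> {0: 0, 1: 2, 2: 4, 3: 6, 4: 8, 5: 10, 6: 1, 7: 3, 8: 5, 9: 7, 10: 9, 11: 11}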
#---------------------------------------------------
# Intel (R) MPI Benchmark Suite V3.2.4, MPI-1 part
#---------------------------------------------------
# Date : Mon Oct 22 18:20:59 2012
# Machine : x86_64
# System : Linux
# Release : 2.6.18-238.19.1.el5
# Version : #1 SMP Fri Jul 15 07:31:24 EDT 2011
# MPI Version : 2.2
# MPI Thread Environment:
# New default behavior from Version 3.2 on:
# the number of iterations per message size is cut down
# dynamically when a certain run time (per message size sample)
# is expected to be exceeded. Time limit is defined by variable
# "SECS_PER_SAMPLE" (=> IMB_settings.h)
# or through the flag => -time
# Calling sequence was:
# IMB-MPI1
# Minimum message length in bytes: 0
# Maximum message length in bytes: 4194304
#
# MPI_Datatype : MPI_BYTE
# MPI_Datatype for reductions : MPI_FLOAT
# MPI_Op : MPI_SUM
#
#
# List of Benchmarks to run:
# PingPong
# PingPing
# Sendrecv
# Exchange
# Allreduce
# Reduce
# Reduce_scatter
# Allgather
# Allgatherv
# Gather
# Gatherv
# Scatter
# Scatterv
# Alltoall
# Alltoallv
# Bcast
# Barrier
#---------------------------------------------------
# Benchmarking PingPong
# #processes = 2
# ( 22 additional processes waiting in MPI_Barrier)
#---------------------------------------------------
#bytes #repetitions t[usec] Mbytes/sec
0 1000 0.29 0.00
1 1000 0.34 2.84
2 1000 0.34 5.54
4 1000 0.35 10.95
8 1000 0.36 21.31
16 1000 0.33 45.68
32 1000 0.38 79.48
64 1000 0.38 159.15
128 1000 0.40 307.88
256 1000 0.42 581.98
512 1000 0.47 1043.30
1024 1000 0.55 1790.21
2048 1000 0.72 2697.40
4096 1000 1.07 3662.87
[proxy:0:0@compute-0-0.local] got pmi command (from 31): get kvsname=kvs_30666_0 key=P23-businesscard-0
[proxy:0:0@compute-0-0.local] PMI response: cmd=get_result rc=0 msg=success value=rdma_port0#30680$rdma_host0#2:0:0:10:1:255:253:0:0:0:0:0:0:0:0$fabrics_list#shm_and_dapl$
8192 1000 1.81 4311.30
16384 1000 3.31 4716.18
32768 1000 5.77 5420.57
65536 640 9.42 6633.01
131072 320 17.72 7055.92
262144 160 36.66 6819.04
524288 80 68.15 7336.87
1048576 40 292.97 3413.26
2097152 20 581.37 3440.14
4194304 10 1163.05 3439.22
#---------------------------------------------------
# Benchmarking PingPing
# #processes = 2
# ( 22 additional processes waiting in MPI_Barrier)
#---------------------------------------------------
#bytes #repetitions t[usec] Mbytes/sec
0 1000 0.49 0.00
1 1000 0.50 1.90
2 1000 0.50 3.78
4 1000 0.49 7.85
8 1000 0.49 15.53
16 1000 0.49 31.33
32 1000 0.50 61.16
64 1000 0.50 123.08
128 1000 0.57 215.31
256 1000 0.57 425.96
512 1000 0.63 770.21
1024 1000 0.68 1442.76
2048 1000 0.85 2308.26
4096 1000 1.22 3199.38
8192 1000 2.02 3869.63
16384 1000 3.86 4044.93
32768 1000 6.85 4563.47
65536 640 18.68 3345.01
131072 320 36.89 3388.10
262144 160 73.41 3405.71
524288 80 136.85 3653.65
1048576 40 584.27 1711.52
2097152 20 1162.76 1720.05
4194304 10 2325.08 1720.37
#-----------------------------------------------------------------------------
# Benchmarking Sendrecv
# #processes = 2
# ( 22 additional processes waiting in MPI_Barrier)
#-----------------------------------------------------------------------------
#bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] Mbytes/sec
0 1000 0.38 0.38 0.38 0.00
1 1000 0.51 0.51 0.51 3.75
2 1000 0.54 0.54 0.54 7.13
4 1000 0.53 0.53 0.53 14.29
8 1000 0.53 0.53 0.53 28.79
16 1000 0.53 0.53 0.53 57.58
32 1000 0.53 0.53 0.53 115.16
64 1000 0.51 0.51 0.51 237.92
128 1000 0.55 0.55 0.55 445.60
256 1000 0.58 0.58 0.58 844.54
512 1000 0.61 0.61 0.61 1611.96
1024 1000 0.68 0.68 0.68 2876.40
2048 1000 0.86 0.86 0.86 4563.79
4096 1000 1.24 1.24 1.24 6320.99
8192 1000 1.98 1.98 1.98 7875.03
16384 1000 3.65 3.65 3.65 8571.28
32768 1000 6.79 6.79 6.79 9207.40
65536 640 18.41 18.41 18.41 6789.37
131072 320 35.53 35.54 35.53 7034.77
262144 160 72.50 72.53 72.51 6894.13
524288 80 135.74 135.81 135.77 7363.11
1048576 40 580.78 582.37 581.57 3434.22
2097152 20 1158.31 1161.50 1159.91 3443.81
4194304 10 2316.00 2322.41 2319.20 3444.70
#-----------------------------------------------------------------------------
# Benchmarking Sendrecv
#
#processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #----------------------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] Mbytes/sec 0 1000 0.44 0.44 0.44 0.00 1 1000 0.50 0.50 0.50 3.85 2 1000 0.51 0.51 0.51 7.51 4 1000 0.50 0.50 0.50 15.32 8 1000 0.51 0.51 0.51 29.70 16 1000 0.50 0.50 0.50 60.66 32 1000 0.51 0.51 0.51 119.46 64 1000 0.51 0.51 0.51 239.81 128 1000 0.54 0.54 0.54 452.10 256 1000 0.57 0.57 0.57 860.87 512 1000 0.60 0.60 0.60 1619.61 1024 1000 0.70 0.70 0.70 2805.48 2048 1000 0.88 0.88 0.88 4453.38 4096 1000 1.30 1.30 1.30 5991.59 8192 1000 2.50 2.50 2.50 6247.47 16384 1000 4.21 4.21 4.21 7419.45 32768 1000 7.97 7.97 7.97 7842.75 65536 640 20.89 20.90 20.90 5979.90 131072 320 53.94 53.96 53.95 4633.37 262144 160 107.49 107.61 107.56 4646.27 524288 80 215.02 215.19 215.11 4647.11 1048576 40 753.53 756.07 754.84 2645.25 2097152 20 1498.34 1503.35 1500.82 2660.73 4194304 10 3016.81 3028.11 3022.53 2641.91 #----------------------------------------------------------------------------- # Benchmarking Sendrecv # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #----------------------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] Mbytes/sec 0 1000 0.66 0.66 0.66 0.00 1 1000 0.73 0.73 0.73 2.60 2 1000 0.74 0.74 0.74 5.18 4 1000 0.73 0.73 0.73 10.46 8 1000 0.74 0.74 0.74 20.73 16 1000 0.73 0.73 0.73 41.91 32 1000 0.74 0.74 0.74 82.79 64 1000 0.72 0.73 0.73 167.92 128 1000 0.73 0.73 0.73 333.55 256 1000 0.74 0.74 0.74 659.79 512 1000 0.85 0.85 0.85 1142.22 1024 1000 1.01 1.01 1.01 1927.98 2048 1000 1.27 1.27 1.27 3073.92 4096 1000 1.97 1.98 1.97 3951.76 8192 1000 3.57 3.57 3.57 4372.86 16384 1000 6.53 6.54 6.53 4780.51 32768 1000 13.58 13.60 13.59 4593.94 65536 640 34.52 34.60 34.56 3613.05 131072 320 91.37 91.60 91.53 2729.27 262144 160 183.89 184.79 184.36 2705.72 524288 80 364.66 368.59 367.28 2713.07 1048576 40 879.38 886.63 883.71 2255.74 2097152 20 1888.11 1901.21 1896.45 2103.92 4194304 10 4670.91 5120.30 4907.67 1562.41 #----------------------------------------------------------------------------- # Benchmarking Sendrecv # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #----------------------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] Mbytes/sec 0 1000 0.83 0.85 0.84 0.00 1 1000 0.73 0.73 0.73 2.61 2 1000 0.72 0.72 0.72 5.28 4 1000 0.71 0.71 0.71 10.76 8 1000 0.72 0.72 0.72 21.14 16 1000 0.84 0.85 0.84 36.03 32 1000 0.73 0.74 0.74 82.69 64 1000 0.75 0.76 0.76 161.01 128 1000 0.79 0.80 0.79 306.68 256 1000 1.12 1.12 1.12 435.19 512 1000 1.16 1.17 1.16 837.46 1024 1000 1.31 1.32 1.31 1482.98 2048 1000 1.66 1.67 1.66 2344.59 4096 1000 2.47 2.48 2.48 3148.95 8192 1000 4.19 4.20 4.20 3718.35 16384 1000 8.71 8.73 8.72 3579.64 32768 1000 21.62 21.78 21.72 2869.76 65536 640 38.13 38.34 38.25 3260.24 131072 320 81.29 82.27 81.87 3038.83 262144 160 165.02 168.21 166.83 2972.42 524288 80 329.73 336.71 333.55 2969.89 1048576 40 840.37 889.97 873.50 2247.25 2097152 20 1757.00 1988.95 1925.10 2011.11 4194304 10 3487.52 4731.51 4207.63 1690.79 #----------------------------------------------------------------------------- # Benchmarking Sendrecv # #processes = 24 #----------------------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] Mbytes/sec 0 1000 0.77 0.79 0.78 0.00 1 1000 0.71 
0.72 0.71 2.66 2 1000 0.83 0.83 0.83 4.57 4 1000 0.72 0.73 0.72 10.52 8 1000 0.72 0.73 0.73 20.87 16 1000 0.72 0.73 0.72 42.09 32 1000 0.74 0.75 0.74 81.92 64 1000 0.75 0.75 0.75 162.13 128 1000 0.79 0.80 0.80 304.76 256 1000 1.11 1.11 1.11 438.26 512 1000 1.14 1.15 1.14 851.38 1024 1000 1.26 1.26 1.26 1545.37 2048 1000 1.59 1.60 1.60 2437.01 4096 1000 2.37 2.38 2.37 3284.02 8192 1000 4.26 4.27 4.27 3656.53 16384 1000 8.81 8.83 8.82 3537.51 32768 1000 21.28 21.58 21.44 2896.46 65536 640 38.31 38.54 38.42 3243.32 131072 320 82.59 83.80 83.14 2983.41 262144 160 166.24 168.46 167.59 2968.13 524288 80 332.34 351.02 340.45 2848.81 1048576 40 874.58 907.05 891.41 2204.96 2097152 20 1871.05 2042.00 1981.23 1958.87 4194304 10 4132.58 4693.79 4393.07 1704.38 #----------------------------------------------------------------------------- # Benchmarking Exchange # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #----------------------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] Mbytes/sec 0 1000 0.95 0.95 0.95 0.00 1 1000 1.12 1.12 1.12 3.41 2 1000 1.08 1.08 1.08 7.04 4 1000 1.11 1.11 1.11 13.72 8 1000 1.08 1.08 1.08 28.15 16 1000 1.12 1.12 1.12 54.26 32 1000 1.07 1.07 1.07 114.08 64 1000 1.17 1.17 1.17 208.85 128 1000 1.14 1.14 1.14 428.99 256 1000 1.24 1.24 1.24 788.15 512 1000 1.29 1.29 1.29 1511.72 1024 1000 1.44 1.44 1.44 2714.83 2048 1000 1.75 1.75 1.75 4451.57 4096 1000 2.49 2.49 2.49 6267.79 8192 1000 3.78 3.78 3.78 8258.06 16384 1000 6.83 6.84 6.84 9142.86 32768 1000 12.92 12.92 12.92 9671.78 65536 640 40.14 40.14 40.14 6227.62 131072 320 73.69 73.70 73.69 6784.22 262144 160 146.39 146.42 146.41 6829.72 524288 80 272.20 272.29 272.24 7345.22 1048576 40 1167.15 1168.80 1167.97 3422.32 2097152 20 2327.19 2330.35 2328.77 3432.96 4194304 10 4677.41 4683.71 4680.56 3416.10 #----------------------------------------------------------------------------- # Benchmarking Exchange # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #----------------------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] Mbytes/sec 0 1000 1.08 1.09 1.08 0.00 1 1000 1.02 1.02 1.02 3.73 2 1000 1.07 1.07 1.07 7.14 4 1000 1.03 1.03 1.03 14.85 8 1000 1.01 1.01 1.01 30.07 16 1000 1.05 1.05 1.05 58.08 32 1000 1.05 1.05 1.05 115.81 64 1000 1.00 1.00 1.00 243.64 128 1000 1.07 1.07 1.07 456.84 256 1000 1.11 1.11 1.11 877.46 512 1000 1.24 1.24 1.24 1572.66 1024 1000 1.43 1.43 1.43 2737.51 2048 1000 1.89 1.89 1.89 4122.80 4096 1000 2.75 2.75 2.75 5683.46 8192 1000 4.25 4.25 4.25 7356.57 16384 1000 8.08 8.08 8.08 7734.23 32768 1000 15.69 15.70 15.69 7963.79 65536 640 47.50 47.52 47.51 5260.88 131072 320 112.81 112.83 112.82 4431.50 262144 160 218.46 218.54 218.50 4575.75 524288 80 431.52 431.70 431.63 4632.86 1048576 40 1515.42 1518.37 1516.97 2634.40 2097152 20 3079.35 3084.96 3082.27 2593.23 4194304 10 6193.18 6205.99 6199.94 2578.16 #----------------------------------------------------------------------------- # Benchmarking Exchange # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #----------------------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] Mbytes/sec 0 1000 1.45 1.45 1.45 0.00 1 1000 1.38 1.38 1.38 2.75 2 1000 1.38 1.39 1.38 5.50 4 1000 1.39 1.39 1.39 10.99 8 1000 1.40 1.40 1.40 21.78 16 1000 1.37 1.38 1.38 44.32 32 1000 1.38 1.39 1.39 87.96 64 1000 1.39 1.39 1.39 175.40 128 
1000 1.58 1.59 1.59 307.69 256 1000 1.57 1.58 1.58 618.45 512 1000 1.86 1.86 1.86 1048.37 1024 1000 2.05 2.05 2.05 1906.45 2048 1000 2.58 2.59 2.59 3018.70 4096 1000 4.15 4.16 4.15 3758.66 8192 1000 7.03 7.04 7.04 4437.70 16384 1000 13.50 13.52 13.51 4624.17 32768 1000 27.70 27.72 27.71 4509.73 65536 640 69.21 69.31 69.27 3606.84 131072 320 173.03 173.51 173.32 2881.69 262144 160 347.26 348.63 348.16 2868.35 524288 80 767.69 772.96 771.39 2587.45 1048576 40 1998.85 2016.88 2010.10 1983.26 2097152 20 4399.60 4455.60 4430.20 1795.49 4194304 10 9051.99 9400.39 9249.42 1702.06 #----------------------------------------------------------------------------- # Benchmarking Exchange # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #----------------------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] Mbytes/sec 0 1000 2.13 2.14 2.14 0.00 1 1000 2.18 2.20 2.19 1.74 2 1000 2.17 2.19 2.18 3.49 4 1000 2.15 2.16 2.16 7.06 8 1000 2.20 2.21 2.20 13.82 16 1000 2.23 2.25 2.24 27.18 32 1000 2.28 2.29 2.29 53.21 64 1000 2.33 2.34 2.33 104.33 128 1000 2.54 2.56 2.55 191.10 256 1000 3.97 3.99 3.98 244.51 512 1000 4.26 4.28 4.27 455.92 1024 1000 5.02 5.05 5.03 773.96 2048 1000 6.33 6.36 6.34 1228.19 4096 1000 7.92 7.96 7.94 1963.21 8192 1000 10.99 11.03 11.01 2831.91 16384 1000 18.55 18.64 18.60 3353.55 32768 1000 36.71 36.88 36.80 3389.28 65536 640 75.19 75.48 75.35 3312.28 131072 320 169.40 170.35 169.96 2935.13 262144 160 331.24 334.66 333.63 2988.09 524288 80 710.19 729.36 720.85 2742.11 1048576 40 1910.42 2013.43 1974.05 1986.66 2097152 20 4017.35 4507.20 4328.83 1774.94 4194304 10 7220.60 9323.81 8529.52 1716.04 #----------------------------------------------------------------------------- # Benchmarking Exchange # #processes = 24 #----------------------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] Mbytes/sec 0 1000 2.05 2.06 2.06 0.00 1 1000 2.16 2.17 2.16 1.76 2 1000 2.17 2.18 2.17 3.50 4 1000 2.12 2.13 2.13 7.16 8 1000 2.20 2.21 2.20 13.83 16 1000 2.23 2.25 2.24 27.15 32 1000 2.29 2.31 2.30 52.94 64 1000 2.33 2.34 2.33 104.23 128 1000 2.54 2.55 2.54 191.26 256 1000 3.92 3.94 3.93 247.66 512 1000 4.22 4.25 4.23 459.45 1024 1000 4.87 4.90 4.89 797.00 2048 1000 6.39 6.43 6.41 1214.62 4096 1000 8.01 8.07 8.04 1936.47 8192 1000 11.57 11.62 11.59 2689.82 16384 1000 19.66 19.73 19.69 3167.44 32768 1000 37.97 38.17 38.06 3275.00 65536 640 79.43 79.75 79.60 3134.73 131072 320 166.82 167.83 167.34 2979.17 262144 160 335.05 340.73 338.06 2934.91 524288 80 710.76 726.85 719.10 2751.60 1048576 40 1930.65 1998.98 1975.87 2001.02 2097152 20 4230.25 4601.20 4449.88 1738.68 4194304 10 8172.11 9370.09 8954.45 1707.56 #---------------------------------------------------------------- # Benchmarking Allreduce # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 4 1000 0.78 0.78 0.78 8 1000 0.79 0.79 0.79 16 1000 0.77 0.77 0.77 32 1000 0.75 0.75 0.75 64 1000 0.76 0.76 0.76 128 1000 0.85 0.85 0.85 256 1000 0.90 0.91 0.90 512 1000 1.00 1.00 1.00 1024 1000 1.08 1.08 1.08 2048 1000 1.29 1.29 1.29 4096 1000 1.86 1.86 1.86 8192 1000 3.65 3.65 3.65 16384 1000 6.20 6.20 6.20 32768 1000 11.02 11.02 11.02 65536 640 21.06 21.06 21.06 131072 320 57.00 57.00 57.00 262144 160 121.93 121.94 121.93 524288 80 244.69 244.76 244.72 1048576 
40 480.93 481.05 480.99 2097152 20 1665.79 1668.20 1667.00 4194304 10 3419.28 3426.98 3423.13 #---------------------------------------------------------------- # Benchmarking Allreduce # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 4 1000 1.39 1.39 1.39 8 1000 1.24 1.24 1.24 16 1000 1.23 1.23 1.23 32 1000 1.23 1.23 1.23 64 1000 1.22 1.22 1.22 128 1000 1.35 1.35 1.35 256 1000 1.47 1.47 1.47 512 1000 1.62 1.62 1.62 1024 1000 1.81 1.81 1.81 2048 1000 2.36 2.36 2.36 4096 1000 3.51 3.51 3.51 8192 1000 5.62 5.62 5.62 16384 1000 9.54 9.54 9.54 32768 1000 17.45 17.45 17.45 65536 640 34.30 34.30 34.30 131072 320 119.66 119.70 119.68 262144 160 248.79 248.91 248.84 524288 80 490.00 490.33 490.18 1048576 40 1082.35 1082.69 1082.55 2097152 20 4074.85 4079.50 4077.20 4194304 10 9717.01 9750.51 9734.81 #---------------------------------------------------------------- # Benchmarking Allreduce # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 4 1000 2.60 2.60 2.60 8 1000 2.33 2.34 2.33 16 1000 2.31 2.31 2.31 32 1000 2.31 2.31 2.31 64 1000 2.31 2.31 2.31 128 1000 2.68 2.68 2.68 256 1000 2.78 2.78 2.78 512 1000 2.99 2.99 2.99 1024 1000 3.61 3.61 3.61 2048 1000 4.98 4.98 4.98 4096 1000 7.68 7.68 7.68 8192 1000 10.88 10.88 10.88 16384 1000 16.89 16.89 16.89 32768 1000 30.12 30.12 30.12 65536 640 58.93 58.94 58.93 131072 320 181.08 181.24 181.19 262144 160 375.28 375.55 375.45 524288 80 801.96 803.21 802.93 1048576 40 2541.15 2547.25 2545.45 2097152 20 7344.89 7368.54 7360.76 4194304 10 16814.71 16981.51 16949.17 #---------------------------------------------------------------- # Benchmarking Allreduce # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 4 1000 4.93 4.93 4.93 8 1000 4.99 4.99 4.99 16 1000 5.00 5.00 5.00 32 1000 5.13 5.14 5.14 64 1000 5.28 5.28 5.28 128 1000 5.92 5.93 5.93 256 1000 7.90 7.90 7.90 512 1000 8.78 8.79 8.78 1024 1000 10.82 10.82 10.82 2048 1000 16.29 16.30 16.29 4096 1000 20.39 20.39 20.39 8192 1000 24.90 24.91 24.91 16384 1000 35.66 35.67 35.66 32768 1000 56.44 56.45 56.44 65536 640 102.26 102.28 102.27 131072 320 309.54 309.77 309.66 262144 160 585.68 586.26 586.01 524288 80 1017.84 1020.21 1019.60 1048576 40 3772.57 3786.80 3784.24 2097152 20 7941.81 7992.05 7976.11 4194304 10 15729.40 15973.40 15897.86 #---------------------------------------------------------------- # Benchmarking Allreduce # #processes = 24 #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 4 1000 6.21 6.23 6.22 8 1000 6.36 6.37 6.36 16 1000 6.42 6.42 6.42 32 1000 6.56 6.57 6.57 64 1000 6.77 6.78 6.78 128 1000 7.49 7.49 7.49 256 1000 10.52 10.52 10.52 512 1000 11.46 11.46 11.46 1024 1000 13.85 13.86 13.86 2048 1000 24.70 24.71 24.71 4096 1000 31.16 31.17 31.16 8192 1000 38.98 38.99 38.98 16384 1000 53.59 53.61 53.60 32768 1000 82.82 82.85 82.83 65536 640 148.99 149.06 149.03 131072 320 311.10 311.31 311.23 262144 160 582.84 583.41 583.25 524288 80 1114.58 1117.55 1116.76 1048576 40 3899.65 3915.90 3912.97 2097152 20 7977.95 8018.15 
8010.24 4194304 10 15813.61 16001.30 15967.71 #---------------------------------------------------------------- # Benchmarking Reduce # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.08 0.08 0.08 4 1000 0.51 0.51 0.51 8 1000 0.52 0.52 0.52 16 1000 0.53 0.53 0.53 32 1000 0.58 0.58 0.58 64 1000 0.61 0.61 0.61 128 1000 0.62 0.62 0.62 256 1000 0.66 0.66 0.66 512 1000 0.73 0.73 0.73 1024 1000 0.83 0.83 0.83 2048 1000 1.07 1.07 1.07 4096 1000 1.51 1.51 1.51 8192 1000 2.49 2.49 2.49 16384 1000 5.10 5.11 5.11 32768 1000 8.18 8.19 8.19 65536 640 14.14 14.15 14.15 131072 320 28.57 28.62 28.59 262144 160 55.70 55.84 55.77 524288 80 109.71 110.30 110.00 1048576 40 217.40 219.57 218.48 2097152 20 506.45 516.61 511.53 4194304 10 1502.61 1558.71 1530.66 #---------------------------------------------------------------- # Benchmarking Reduce # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 4 1000 0.96 0.97 0.97 8 1000 0.97 0.97 0.97 16 1000 0.99 0.99 0.99 32 1000 1.12 1.12 1.12 64 1000 1.10 1.10 1.10 128 1000 1.21 1.21 1.21 256 1000 1.27 1.27 1.27 512 1000 1.39 1.39 1.39 1024 1000 1.62 1.62 1.62 2048 1000 2.13 2.13 2.13 4096 1000 3.15 3.15 3.15 8192 1000 5.25 5.26 5.26 16384 1000 9.57 9.58 9.57 32768 1000 13.92 13.94 13.93 65536 640 24.19 24.23 24.22 131072 320 51.46 51.58 51.54 262144 160 101.96 102.30 102.20 524288 80 207.19 208.41 208.08 1048576 40 558.02 564.93 563.10 2097152 20 1706.95 1745.70 1735.80 4194304 10 3521.70 3663.21 3627.57 #---------------------------------------------------------------- # Benchmarking Reduce # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 4 1000 1.30 1.30 1.30 8 1000 1.29 1.29 1.29 16 1000 1.31 1.31 1.31 32 1000 1.42 1.42 1.42 64 1000 1.43 1.44 1.43 128 1000 1.61 1.61 1.61 256 1000 1.68 1.68 1.68 512 1000 1.88 1.88 1.88 1024 1000 2.13 2.13 2.13 2048 1000 2.74 2.75 2.74 4096 1000 4.04 4.06 4.05 8192 1000 6.79 6.81 6.81 16384 1000 12.53 12.56 12.54 32768 1000 20.04 20.08 20.06 65536 640 35.37 35.45 35.42 131072 320 76.85 77.03 76.97 262144 160 166.52 166.94 166.81 524288 80 348.38 349.88 349.52 1048576 40 1216.52 1232.12 1229.47 2097152 20 3146.21 3236.40 3219.17 4194304 10 6220.79 6511.71 6464.65 #---------------------------------------------------------------- # Benchmarking Reduce # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 4 1000 1.50 1.51 1.50 8 1000 1.49 1.50 1.49 16 1000 1.52 1.53 1.53 32 1000 1.65 1.66 1.66 64 1000 1.66 1.67 1.67 128 1000 1.80 1.81 1.81 256 1000 2.28 2.30 2.29 512 1000 2.41 2.43 2.42 1024 1000 2.76 2.78 2.78 2048 1000 3.45 3.48 3.47 4096 1000 4.90 4.94 4.92 8192 1000 8.05 8.11 8.08 16384 1000 14.52 14.61 14.57 32768 1000 23.26 23.38 23.33 65536 640 41.38 41.56 41.48 131072 320 91.05 91.42 91.27 262144 160 196.07 196.96 196.63 524288 80 402.13 404.96 404.19 1048576 40 1404.93 1451.13 1436.54 2097152 20 3741.15 3850.90 3831.81 4194304 10 6454.59 7741.19 7475.25 
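The collective benchmarks (Allreduce, Reduce, Reduce_scatter, Allgather, ...) report per-message-size timings as t_min/t_max/t_avg in microseconds rather than a bandwidth column, so comparing the 2/4/8/16/24-process runs usually means pulling the t_avg column out of each block. A rough Python sketch for doing that on text in this layout follows; it simply matches the five-column rows described by the "#bytes #repetitions t_min[usec] t_max[usec] t_avg[usec]" headers above and is not part of the IMB suite.

# Rough sketch: extract (message size, t_avg) pairs from one IMB collective
# block, assuming the five-column row layout shown in the headers above.
import re

def collective_rows(block_text):
    rows = []
    pattern = r"(\d+)\s+(\d+)\s+(\d+\.\d+)\s+(\d+\.\d+)\s+(\d+\.\d+)"
    for match in re.finditer(pattern, block_text):
        nbytes, _reps, _tmin, _tmax, tavg = match.groups()
        rows.append((int(nbytes), float(tavg)))
    return rows

sample = "0 1000 0.07 0.07 0.07 4 1000 6.21 6.23 6.22 8 1000 6.36 6.37 6.36"
print(collective_rows(sample))
# -> [(0, 0.07), (4, 6.22), (8, 6.36)]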
#---------------------------------------------------------------- # Benchmarking Reduce # #processes = 24 #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 4 1000 1.42 1.43 1.42 8 1000 1.41 1.42 1.42 16 1000 1.51 1.52 1.52 32 1000 1.59 1.60 1.60 64 1000 1.59 1.60 1.60 128 1000 1.72 1.73 1.72 256 1000 2.20 2.22 2.21 512 1000 2.33 2.35 2.34 1024 1000 2.65 2.67 2.66 2048 1000 3.37 3.40 3.39 4096 1000 4.82 4.88 4.85 8192 1000 8.14 8.23 8.19 16384 1000 15.03 15.19 15.12 32768 1000 24.79 24.98 24.91 65536 640 43.04 43.42 43.31 131072 320 93.92 94.78 94.48 262144 160 197.01 199.88 199.26 524288 80 409.06 415.26 413.84 1048576 40 1415.10 1443.40 1437.67 2097152 20 3593.05 3920.01 3880.33 4194304 10 7432.10 7916.50 7836.36 #---------------------------------------------------------------- # Benchmarking Reduce_scatter # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.15 0.15 0.15 4 1000 0.85 0.99 0.92 8 1000 1.04 1.04 1.04 16 1000 1.07 1.07 1.07 32 1000 1.05 1.05 1.05 64 1000 1.06 1.06 1.06 128 1000 1.20 1.20 1.20 256 1000 1.27 1.28 1.27 512 1000 1.34 1.34 1.34 1024 1000 1.43 1.43 1.43 2048 1000 1.66 1.66 1.66 4096 1000 1.96 1.96 1.96 8192 1000 2.98 2.98 2.98 16384 1000 5.08 5.08 5.08 32768 1000 8.77 8.77 8.77 65536 640 23.56 23.57 23.57 131072 320 46.22 46.22 46.22 262144 160 88.95 88.96 88.95 524288 80 138.99 139.04 139.01 1048576 40 334.90 335.03 334.96 2097152 20 847.15 850.70 848.93 4194304 10 2274.11 2279.90 2277.00 #---------------------------------------------------------------- # Benchmarking Reduce_scatter # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.15 0.16 0.16 4 1000 0.77 1.05 0.90 8 1000 1.28 1.46 1.37 16 1000 1.70 1.70 1.70 32 1000 1.58 1.58 1.58 64 1000 1.57 1.57 1.57 128 1000 1.74 1.74 1.74 256 1000 1.78 1.78 1.78 512 1000 1.92 1.92 1.92 1024 1000 2.04 2.04 2.04 2048 1000 2.34 2.34 2.34 4096 1000 2.73 2.73 2.73 8192 1000 3.91 3.91 3.91 16384 1000 6.63 6.63 6.63 32768 1000 12.21 12.21 12.21 65536 640 32.79 32.80 32.79 131072 320 66.89 66.92 66.91 262144 160 138.09 138.22 138.17 524288 80 264.67 264.81 264.75 1048576 40 626.95 628.85 628.07 2097152 20 2234.29 2243.49 2239.97 4194304 10 4673.60 4722.79 4703.04 #---------------------------------------------------------------- # Benchmarking Reduce_scatter # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.16 0.22 0.17 4 1000 0.77 1.25 1.08 8 1000 1.86 2.57 2.19 16 1000 1.97 2.97 2.48 32 1000 2.71 2.71 2.71 64 1000 2.72 2.72 2.72 128 1000 2.82 2.82 2.82 256 1000 2.97 2.98 2.98 512 1000 3.28 3.28 3.28 1024 1000 3.37 3.37 3.37 2048 1000 3.93 3.93 3.93 4096 1000 4.90 4.90 4.90 8192 1000 6.97 6.97 6.97 16384 1000 11.49 11.49 11.49 32768 1000 20.52 20.52 20.52 65536 640 59.68 59.69 59.68 131072 320 106.78 106.82 106.80 262144 160 207.98 208.19 208.09 524288 80 382.66 382.94 382.81 1048576 40 1719.80 1723.83 1722.05 2097152 20 4206.35 4226.35 4218.17 4194304 10 8294.32 8378.51 8341.53 #---------------------------------------------------------------- # Benchmarking Reduce_scatter # #processes = 16 # ( 8 
additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.18 0.22 0.19 4 1000 0.74 1.25 1.15 8 1000 3.47 6.27 5.03 16 1000 3.07 3.96 3.49 32 1000 3.00 4.88 4.42 64 1000 5.61 5.62 5.61 128 1000 6.27 6.27 6.27 256 1000 6.23 6.24 6.23 512 1000 7.36 7.37 7.37 1024 1000 8.80 8.81 8.80 2048 1000 10.04 10.04 10.04 4096 1000 12.75 12.76 12.75 8192 1000 18.13 18.13 18.13 16384 1000 29.93 29.94 29.93 32768 1000 77.05 77.06 77.05 65536 640 113.30 113.33 113.32 131072 320 171.33 171.42 171.37 262144 160 285.70 286.06 285.86 524288 80 856.34 857.38 856.71 1048576 40 2044.27 2052.97 2047.71 2097152 20 4925.05 4954.35 4937.86 4194304 10 9671.09 9762.91 9709.19 #---------------------------------------------------------------- # Benchmarking Reduce_scatter # #processes = 24 #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.22 0.25 0.22 4 1000 0.85 1.40 1.26 8 1000 3.93 6.37 5.37 16 1000 2.67 4.23 3.51 32 1000 3.12 4.37 4.03 64 1000 5.25 6.03 5.94 128 1000 7.29 7.30 7.29 256 1000 8.35 8.36 8.36 512 1000 10.26 10.26 10.26 1024 1000 11.52 11.53 11.53 2048 1000 13.59 13.59 13.59 4096 1000 17.53 17.53 17.53 8192 1000 26.81 26.82 26.82 16384 1000 42.95 42.95 42.95 32768 1000 98.44 98.46 98.45 65536 640 157.83 157.87 157.85 131072 320 226.02 226.14 226.07 262144 160 348.68 349.10 348.87 524288 80 878.21 879.30 878.55 1048576 40 2421.80 2442.97 2427.07 2097152 20 5182.10 5232.80 5199.84 4194304 10 9923.39 10058.78 9978.86 #---------------------------------------------------------------- # Benchmarking Allgather # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.06 0.06 1 1000 0.76 0.76 0.76 2 1000 0.76 0.76 0.76 4 1000 0.76 0.76 0.76 8 1000 0.75 0.75 0.75 16 1000 0.76 0.76 0.76 32 1000 0.76 0.76 0.76 64 1000 0.83 0.83 0.83 128 1000 0.86 0.86 0.86 256 1000 0.81 0.81 0.81 512 1000 0.96 0.96 0.96 1024 1000 1.06 1.06 1.06 2048 1000 1.01 1.01 1.01 4096 1000 1.43 1.43 1.43 8192 1000 2.59 2.59 2.59 16384 1000 4.64 4.64 4.64 32768 1000 8.93 8.93 8.93 65536 640 24.86 24.87 24.87 131072 320 50.73 50.75 50.74 262144 160 111.69 111.71 111.70 524288 80 206.33 206.39 206.36 1048576 40 705.58 707.15 706.36 2097152 20 1412.65 1415.80 1414.23 4194304 10 3361.51 3367.40 3364.46 #---------------------------------------------------------------- # Benchmarking Allgather # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.07 0.06 1 1000 1.18 1.19 1.19 2 1000 1.29 1.29 1.29 4 1000 1.24 1.24 1.24 8 1000 1.22 1.22 1.22 16 1000 1.18 1.18 1.18 32 1000 1.29 1.29 1.29 64 1000 1.32 1.32 1.32 128 1000 1.34 1.34 1.34 256 1000 1.46 1.46 1.46 512 1000 1.65 1.65 1.65 1024 1000 1.98 1.98 1.98 2048 1000 2.64 2.64 2.64 4096 1000 4.07 4.07 4.07 8192 1000 7.16 7.16 7.16 16384 1000 13.63 13.63 13.63 32768 1000 36.80 36.81 36.80 65536 640 99.55 99.56 99.56 131072 320 201.04 201.06 201.05 262144 160 380.07 380.21 380.14 524288 80 840.92 841.32 841.13 1048576 40 2867.75 2870.45 2869.16 2097152 20 6166.85 6172.20 6169.59 4194304 10 12623.72 12633.20 12628.47 #---------------------------------------------------------------- # Benchmarking Allgather # #processes = 
8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.06 0.06 1 1000 2.51 2.51 2.51 2 1000 2.55 2.55 2.55 4 1000 2.70 2.70 2.70 8 1000 2.49 2.49 2.49 16 1000 3.08 3.08 3.08 32 1000 2.70 2.70 2.70 64 1000 2.81 2.81 2.81 128 1000 2.97 2.97 2.97 256 1000 3.31 3.31 3.31 512 1000 4.15 4.15 4.15 1024 1000 6.03 6.04 6.03 2048 1000 8.86 8.86 8.86 4096 1000 14.03 14.03 14.03 8192 1000 24.41 24.41 24.41 16384 1000 49.60 49.61 49.60 32768 1000 164.06 164.10 164.08 65536 640 342.65 342.73 342.69 131072 320 740.06 740.36 740.27 262144 160 1837.58 1839.16 1838.44 524288 80 4438.87 4444.59 4442.74 1048576 40 9037.63 9047.28 9043.57 2097152 20 18316.40 18381.11 18363.57 4194304 10 37692.90 38048.89 37936.62 #---------------------------------------------------------------- # Benchmarking Allgather # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.07 0.07 1 1000 4.64 4.64 4.64 2 1000 4.69 4.69 4.69 4 1000 6.79 6.80 6.79 8 1000 7.71 7.71 7.71 16 1000 5.18 5.19 5.19 32 1000 7.43 7.43 7.43 64 1000 8.80 8.81 8.80 128 1000 11.09 11.09 11.09 256 1000 15.96 15.96 15.96 512 1000 16.84 16.84 16.84 1024 1000 18.96 18.97 18.96 2048 1000 24.42 24.42 24.42 4096 1000 35.84 35.85 35.84 8192 1000 64.35 64.37 64.36 16384 1000 159.97 160.01 160.00 32768 1000 369.96 370.09 370.03 65536 640 751.63 751.94 751.79 131072 320 1700.02 1701.25 1700.71 262144 160 4193.10 4198.79 4196.37 524288 80 8799.40 8824.36 8814.37 1048576 40 17779.67 17876.17 17838.73 2097152 20 35526.55 35903.20 35757.28 4194304 10 71247.01 72761.42 72179.58 #---------------------------------------------------------------- # Benchmarking Allgather # #processes = 24 #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.07 0.06 1 1000 7.43 7.43 7.43 2 1000 7.41 7.41 7.41 4 1000 10.78 10.79 10.78 8 1000 12.10 12.11 12.11 16 1000 9.61 9.61 9.61 32 1000 12.12 12.13 12.12 64 1000 13.68 13.68 13.68 128 1000 16.57 16.58 16.58 256 1000 24.32 24.33 24.32 512 1000 25.24 25.25 25.25 1024 1000 28.06 28.06 28.06 2048 1000 36.02 36.03 36.02 4096 1000 54.57 54.58 54.58 8192 1000 109.09 109.12 109.11 16384 1000 259.22 259.31 259.27 32768 1000 567.21 567.47 567.35 65536 640 1168.46 1168.98 1168.72 131072 320 2984.71 2985.52 2985.18 262144 160 6279.37 6282.99 6281.21 524288 80 12895.59 12909.62 12904.38 1048576 40 26761.90 26845.48 26802.94 2097152 20 53564.14 53785.55 53689.54 4194304 10 108390.31 109396.79 108972.31 #---------------------------------------------------------------- # Benchmarking Allgatherv # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 1 1000 0.81 0.81 0.81 2 1000 0.78 0.78 0.78 4 1000 0.78 0.78 0.78 8 1000 0.77 0.77 0.77 16 1000 0.78 0.78 0.78 32 1000 0.80 0.80 0.80 64 1000 0.85 0.85 0.85 128 1000 0.65 0.65 0.65 256 1000 0.67 0.67 0.67 512 1000 0.75 0.75 0.75 1024 1000 0.83 0.83 0.83 2048 1000 1.04 1.04 1.04 4096 1000 1.52 1.52 1.52 8192 1000 2.72 2.72 2.72 16384 1000 4.65 4.65 4.65 32768 1000 8.99 8.99 8.99 65536 640 25.18 25.18 25.18 131072 320 50.63 50.64 50.64 262144 160 111.76 111.78 111.77 524288 80 207.56 207.65 207.61 
1048576 40 704.47 706.07 705.27 2097152 20 1424.10 1427.30 1425.70 4194304 10 3386.59 3393.29 3389.94 #---------------------------------------------------------------- # Benchmarking Allgatherv # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.08 0.08 0.08 1 1000 1.38 1.38 1.38 2 1000 1.41 1.41 1.41 4 1000 1.48 1.48 1.48 8 1000 1.36 1.36 1.36 16 1000 1.39 1.39 1.39 32 1000 1.46 1.46 1.46 64 1000 1.48 1.48 1.48 128 1000 1.58 1.58 1.58 256 1000 1.65 1.65 1.65 512 1000 1.84 1.84 1.84 1024 1000 2.09 2.09 2.09 2048 1000 2.71 2.71 2.71 4096 1000 4.17 4.17 4.17 8192 1000 7.32 7.32 7.32 16384 1000 13.76 13.76 13.76 32768 1000 28.77 28.77 28.77 65536 640 99.55 99.55 99.55 131072 320 201.16 201.17 201.17 262144 160 379.61 379.74 379.68 524288 80 854.34 854.74 854.55 1048576 40 2863.80 2866.48 2865.18 2097152 20 6178.06 6183.40 6180.78 4194304 10 12632.20 12642.12 12637.19 #---------------------------------------------------------------- # Benchmarking Allgatherv # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.09 0.10 0.10 1 1000 2.70 2.70 2.70 2 1000 2.90 2.91 2.90 4 1000 2.90 2.90 2.90 8 1000 2.95 2.95 2.95 16 1000 2.96 2.96 2.96 32 1000 3.07 3.07 3.07 64 1000 3.20 3.20 3.20 128 1000 5.22 5.22 5.22 256 1000 5.37 5.37 5.37 512 1000 6.25 6.26 6.25 1024 1000 6.89 6.89 6.89 2048 1000 9.09 9.09 9.09 4096 1000 14.21 14.21 14.21 8192 1000 24.67 24.67 24.67 16384 1000 50.84 50.85 50.84 32768 1000 136.01 136.04 136.03 65536 640 342.09 342.16 342.14 131072 320 704.06 704.34 704.25 262144 160 1795.79 1797.38 1796.66 524288 80 4509.52 4515.49 4513.45 1048576 40 9026.85 9037.17 9033.06 2097152 20 18240.30 18303.80 18286.56 4194304 10 37155.29 37365.20 37307.55 #---------------------------------------------------------------- # Benchmarking Allgatherv # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.12 0.12 0.12 1 1000 5.74 5.74 5.74 2 1000 6.15 6.16 6.16 4 1000 6.21 6.21 6.21 8 1000 6.39 6.40 6.40 16 1000 24.92 24.92 24.92 32 1000 8.11 8.12 8.11 64 1000 9.53 9.54 9.54 128 1000 11.49 11.49 11.49 256 1000 16.22 16.22 16.22 512 1000 17.17 17.17 17.17 1024 1000 19.25 19.25 19.25 2048 1000 24.70 24.71 24.70 4096 1000 36.08 36.09 36.08 8192 1000 65.37 65.38 65.37 16384 1000 160.27 160.31 160.29 32768 1000 400.87 401.03 400.95 65536 640 727.00 727.31 727.15 131072 320 1694.90 1695.94 1695.50 262144 160 4173.21 4177.66 4176.18 524288 80 8614.27 8637.39 8628.45 1048576 40 17764.55 17848.00 17817.51 2097152 20 35529.99 35900.34 35758.65 4194304 10 71096.21 72718.60 72050.05 #---------------------------------------------------------------- # Benchmarking Allgatherv # #processes = 24 #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.14 0.15 0.14 1 1000 8.77 8.78 8.78 2 1000 8.64 8.64 8.64 4 1000 9.00 9.00 9.00 8 1000 9.30 9.30 9.30 16 1000 10.03 10.03 10.03 32 1000 13.17 13.17 13.17 64 1000 14.68 14.69 14.69 128 1000 17.24 17.24 17.24 256 1000 25.31 25.32 25.32 512 1000 26.17 26.17 26.17 1024 1000 29.10 29.10 29.10 2048 1000 36.74 36.75 36.75 4096 1000 55.53 55.54 55.53 8192 1000 111.44 111.46 
111.45 16384 1000 258.51 258.58 258.56 32768 1000 569.73 569.97 569.86 65536 640 1192.52 1193.03 1192.77 131072 320 2958.93 2959.76 2959.40 262144 160 6274.03 6278.38 6276.35 524288 80 12981.31 13021.38 13005.19 1048576 40 26623.00 26704.23 26663.82 2097152 20 53795.10 54059.79 53939.01 4194304 10 108473.42 109467.20 109038.23 #---------------------------------------------------------------- # Benchmarking Gather # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 1 1000 0.51 0.51 0.51 2 1000 0.50 0.50 0.50 4 1000 0.51 0.51 0.51 8 1000 0.52 0.52 0.52 16 1000 0.54 0.54 0.54 32 1000 0.57 0.57 0.57 64 1000 0.58 0.58 0.58 128 1000 0.58 0.58 0.58 256 1000 0.62 0.62 0.62 512 1000 0.69 0.69 0.69 1024 1000 0.76 0.76 0.76 2048 1000 0.95 0.95 0.95 4096 1000 1.31 1.31 1.31 8192 1000 2.01 2.01 2.01 16384 1000 3.42 3.42 3.42 32768 1000 5.98 5.98 5.98 65536 640 11.77 11.77 11.77 131072 320 25.99 26.00 25.99 262144 160 56.13 56.16 56.14 524288 80 108.18 108.25 108.21 1048576 40 348.27 349.87 349.07 2097152 20 743.76 746.95 745.36 4194304 10 1749.49 1756.50 1753.00 #---------------------------------------------------------------- # Benchmarking Gather # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 1 1000 0.49 0.49 0.49 2 1000 0.49 0.50 0.50 4 1000 0.49 0.49 0.49 8 1000 0.50 0.50 0.50 16 1000 0.51 0.51 0.51 32 1000 0.52 0.53 0.53 64 1000 0.53 0.53 0.53 128 1000 0.59 0.59 0.59 256 1000 0.61 0.61 0.61 512 1000 0.65 0.65 0.65 1024 1000 0.74 0.74 0.74 2048 1000 0.93 0.93 0.93 4096 1000 1.42 1.43 1.42 8192 1000 2.59 2.60 2.59 16384 1000 3.75 3.76 3.75 32768 1000 8.09 8.11 8.10 65536 640 24.45 24.50 24.48 131072 320 47.09 47.24 47.18 262144 160 94.42 94.93 94.73 524288 80 221.30 223.47 222.62 1048576 40 622.43 639.68 632.75 2097152 20 1319.85 1383.65 1358.74 4194304 10 2874.30 3118.90 3025.17 #---------------------------------------------------------------- # Benchmarking Gather # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.07 0.07 1 1000 0.64 0.64 0.64 2 1000 0.63 0.63 0.63 4 1000 0.64 0.64 0.64 8 1000 0.64 0.64 0.64 16 1000 0.65 0.65 0.65 32 1000 0.70 0.71 0.70 64 1000 0.70 0.70 0.70 128 1000 0.76 0.77 0.77 256 1000 0.84 0.84 0.84 512 1000 0.92 0.92 0.92 1024 1000 1.10 1.11 1.10 2048 1000 1.45 1.46 1.45 4096 1000 2.43 2.44 2.44 8192 1000 4.49 4.51 4.50 16384 1000 9.31 9.35 9.33 32768 1000 21.90 21.98 21.95 65536 640 39.88 40.08 40.00 131072 320 88.45 89.20 88.89 262144 160 301.64 305.38 303.93 524288 80 624.66 640.82 634.49 1048576 40 950.83 1015.50 988.55 2097152 20 2441.05 2674.85 2578.95 4194304 10 5353.21 6261.80 5873.66 #---------------------------------------------------------------- # Benchmarking Gather # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.07 0.06 1 1000 2.96 2.98 2.97 2 1000 3.01 3.03 3.03 4 1000 3.03 3.05 3.04 8 1000 3.27 3.29 3.28 16 1000 3.46 3.49 3.48 32 1000 3.65 3.68 3.67 64 1000 3.85 3.88 3.87 128 1000 4.16 4.19 4.18 256 1000 4.64 4.67 4.66 512 1000 
5.68 5.73 5.71 1024 1000 3.62 3.66 3.64 2048 1000 5.79 5.86 5.83 4096 1000 9.61 9.72 9.67 8192 1000 17.17 17.35 17.27 16384 1000 31.74 32.02 31.89 32768 1000 80.63 81.08 80.93 65536 640 137.34 138.33 138.01 131072 320 294.64 303.34 301.08 262144 160 645.97 674.93 665.59 524288 80 1459.43 1528.14 1517.65 1048576 40 2441.70 2691.53 2581.51 2097152 20 4833.84 5673.69 5338.92 4194304 10 7139.71 10444.09 9080.67 #---------------------------------------------------------------- # Benchmarking Gather # #processes = 24 #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.07 0.06 1 1000 3.40 3.43 3.42 2 1000 3.49 3.53 3.52 4 1000 3.56 3.60 3.59 8 1000 3.79 3.83 3.82 16 1000 4.06 4.11 4.10 32 1000 4.27 4.32 4.31 64 1000 4.45 4.50 4.49 128 1000 4.77 4.83 4.81 256 1000 5.49 5.55 5.54 512 1000 7.11 7.20 7.18 1024 1000 13.15 13.35 13.26 2048 1000 16.22 16.50 16.39 4096 1000 21.99 22.32 22.19 8192 1000 36.30 36.85 36.64 16384 1000 63.29 64.21 63.86 32768 1000 128.05 128.60 128.38 65536 640 234.69 236.93 236.24 131072 320 550.43 561.37 558.37 262144 160 997.82 1062.08 1044.15 524288 80 1978.58 2134.55 2082.94 1048576 40 3857.51 4336.60 4170.06 2097152 20 6203.40 8955.74 8081.92 4194304 10 6404.40 17363.10 13952.92 #---------------------------------------------------------------- # Benchmarking Gatherv # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.19 0.20 0.19 1 1000 0.56 0.56 0.56 2 1000 0.56 0.56 0.56 4 1000 0.57 0.57 0.57 8 1000 0.58 0.58 0.58 16 1000 0.57 0.57 0.57 32 1000 0.59 0.59 0.59 64 1000 0.59 0.59 0.59 128 1000 0.61 0.61 0.61 256 1000 0.65 0.65 0.65 512 1000 0.70 0.70 0.70 1024 1000 0.78 0.78 0.78 2048 1000 0.97 0.97 0.97 4096 1000 1.31 1.31 1.31 8192 1000 2.07 2.08 2.08 16384 1000 3.47 3.48 3.47 32768 1000 6.02 6.03 6.03 65536 640 11.80 11.81 11.80 131072 320 26.00 26.02 26.01 262144 160 55.64 55.67 55.66 524288 80 108.89 108.96 108.93 1048576 40 348.22 349.83 349.02 2097152 20 745.14 748.35 746.74 4194304 10 1749.01 1755.31 1752.16 #---------------------------------------------------------------- # Benchmarking Gatherv # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.15 0.15 0.15 1 1000 0.59 0.59 0.59 2 1000 0.58 0.58 0.58 4 1000 0.58 0.58 0.58 8 1000 0.59 0.59 0.59 16 1000 0.59 0.59 0.59 32 1000 0.61 0.61 0.61 64 1000 0.61 0.61 0.61 128 1000 0.64 0.65 0.65 256 1000 0.67 0.67 0.67 512 1000 0.73 0.73 0.73 1024 1000 0.80 0.81 0.80 2048 1000 1.01 1.01 1.01 4096 1000 1.47 1.48 1.48 8192 1000 2.70 2.71 2.70 16384 1000 3.84 3.84 3.84 32768 1000 8.28 8.30 8.29 65536 640 24.49 24.54 24.52 131072 320 47.04 47.18 47.12 262144 160 94.67 95.21 95.00 524288 80 221.39 223.56 222.71 1048576 40 624.90 642.05 635.15 2097152 20 1321.45 1384.89 1360.12 4194304 10 2922.80 3177.69 3081.42 #---------------------------------------------------------------- # Benchmarking Gatherv # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.13 0.14 0.14 1 1000 0.74 0.74 0.74 2 1000 0.74 0.75 0.74 4 1000 0.73 0.73 0.73 8 1000 0.73 0.74 0.73 16 1000 0.75 0.75 0.75 32 1000 0.79 0.79 0.79 64 1000 0.78 
0.79 0.79 128 1000 0.83 0.84 0.84 256 1000 0.87 0.88 0.88 512 1000 0.95 0.95 0.95 1024 1000 1.12 1.12 1.12 2048 1000 1.49 1.49 1.49 4096 1000 2.45 2.46 2.45 8192 1000 4.54 4.56 4.55 16384 1000 9.35 9.38 9.36 32768 1000 22.06 22.14 22.10 65536 640 39.98 40.19 40.10 131072 320 88.76 89.48 89.18 262144 160 304.91 308.65 307.21 524288 80 629.51 645.81 639.43 1048576 40 980.12 1044.85 1017.79 2097152 20 2301.94 2554.20 2454.09 4194304 10 4985.59 5955.20 5558.15 #---------------------------------------------------------------- # Benchmarking Gatherv # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.13 0.14 0.14 1 1000 2.92 2.95 2.94 2 1000 2.92 2.96 2.94 4 1000 2.93 2.96 2.94 8 1000 2.97 3.01 2.99 16 1000 2.93 2.97 2.95 32 1000 3.10 3.14 3.12 64 1000 3.08 3.13 3.11 128 1000 3.57 3.62 3.59 256 1000 4.20 4.26 4.23 512 1000 4.52 4.57 4.55 1024 1000 5.38 5.45 5.42 2048 1000 7.43 7.52 7.48 4096 1000 11.40 11.53 11.47 8192 1000 19.69 19.91 19.81 16384 1000 36.80 37.18 37.01 32768 1000 182.44 183.68 183.15 65536 640 277.45 282.98 281.02 131072 320 444.85 462.63 454.09 262144 160 857.58 929.07 892.21 524288 80 1400.89 1480.66 1461.65 1048576 40 2338.90 2650.25 2514.69 2097152 20 4534.10 5485.45 5065.11 4194304 10 6945.01 10144.02 8800.70 #---------------------------------------------------------------- # Benchmarking Gatherv # #processes = 24 #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.13 0.13 0.13 1 1000 11.01 11.20 11.12 2 1000 11.02 11.20 11.12 4 1000 11.03 11.22 11.14 8 1000 10.46 10.64 10.56 16 1000 11.53 11.71 11.64 32 1000 10.69 10.90 10.81 64 1000 10.63 10.81 10.73 128 1000 10.48 10.66 10.58 256 1000 11.56 11.78 11.68 512 1000 11.95 12.17 12.07 1024 1000 12.32 12.56 12.45 2048 1000 16.82 17.15 17.00 4096 1000 23.68 24.14 23.93 8192 1000 38.71 39.46 39.13 16384 1000 70.69 72.04 71.45 32768 1000 409.99 415.01 412.81 65536 640 614.90 625.45 620.98 131072 320 948.43 1015.67 1004.02 262144 160 1646.64 1799.35 1739.53 524288 80 2333.61 2561.31 2486.69 1048576 40 4328.88 4863.08 4657.07 2097152 20 6470.86 9682.75 8597.10 4194304 10 6930.21 18903.59 14729.66 #---------------------------------------------------------------- # Benchmarking Scatter # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 1 1000 0.48 0.49 0.49 2 1000 0.49 0.49 0.49 4 1000 0.48 0.48 0.48 8 1000 0.48 0.48 0.48 16 1000 0.50 0.50 0.50 32 1000 0.53 0.53 0.53 64 1000 0.55 0.55 0.55 128 1000 0.56 0.56 0.56 256 1000 0.61 0.61 0.61 512 1000 0.66 0.66 0.66 1024 1000 0.75 0.75 0.75 2048 1000 0.96 0.96 0.96 4096 1000 1.41 1.41 1.41 8192 1000 2.47 2.47 2.47 16384 1000 4.63 4.63 4.63 32768 1000 8.23 8.23 8.23 65536 640 15.31 15.32 15.32 131072 320 28.65 28.66 28.65 262144 160 59.19 59.21 59.20 524288 80 112.73 112.80 112.76 1048576 40 380.05 381.23 380.64 2097152 20 831.45 834.14 832.80 4194304 10 2100.30 2106.40 2103.35 #---------------------------------------------------------------- # Benchmarking Scatter # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 1 1000 0.72 0.72 0.72 2 1000 0.71 
0.71 0.71 4 1000 0.72 0.72 0.72 8 1000 0.71 0.71 0.71 16 1000 0.74 0.74 0.74 32 1000 0.81 0.81 0.81 64 1000 0.83 0.84 0.84 128 1000 0.85 0.85 0.85 256 1000 0.95 0.95 0.95 512 1000 1.02 1.02 1.02 1024 1000 1.23 1.23 1.23 2048 1000 1.75 1.75 1.75 4096 1000 1.60 1.61 1.60 8192 1000 2.85 2.86 2.86 16384 1000 4.85 4.85 4.85 32768 1000 8.87 8.87 8.87 65536 640 25.32 25.35 25.34 131072 320 54.19 54.29 54.25 262144 160 102.22 102.61 102.46 524288 80 225.66 227.40 226.74 1048576 40 798.62 809.77 805.38 2097152 20 1696.21 1749.25 1728.75 4194304 10 3241.30 3463.89 3379.32 #---------------------------------------------------------------- # Benchmarking Scatter # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 1 1000 1.06 1.06 1.06 2 1000 1.08 1.08 1.08 4 1000 1.08 1.08 1.08 8 1000 1.17 1.17 1.17 16 1000 1.18 1.18 1.18 32 1000 1.22 1.22 1.22 64 1000 1.28 1.29 1.28 128 1000 1.37 1.37 1.37 256 1000 1.56 1.56 1.56 512 1000 1.88 1.88 1.88 1024 1000 2.58 2.58 2.58 2048 1000 4.26 4.26 4.26 4096 1000 2.74 2.75 2.75 8192 1000 5.04 5.05 5.05 16384 1000 10.14 10.16 10.16 32768 1000 20.61 20.65 20.64 65536 640 44.92 44.98 44.96 131072 320 99.53 99.76 99.68 262144 160 283.41 284.36 284.02 524288 80 598.79 605.16 602.69 1048576 40 1428.55 1469.35 1451.96 2097152 20 2653.00 2866.51 2780.99 4194304 10 4987.50 5904.41 5525.93 #---------------------------------------------------------------- # Benchmarking Scatter # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.07 0.07 1 1000 1.78 1.79 1.78 2 1000 1.75 1.76 1.76 4 1000 1.78 1.78 1.78 8 1000 1.86 1.86 1.86 16 1000 1.91 1.91 1.91 32 1000 2.02 2.03 2.02 64 1000 2.19 2.20 2.20 128 1000 2.55 2.56 2.56 256 1000 3.03 3.04 3.04 512 1000 3.85 3.86 3.86 1024 1000 5.75 5.76 5.76 2048 1000 9.65 9.67 9.66 4096 1000 10.04 10.06 10.05 8192 1000 17.24 17.28 17.26 16384 1000 31.75 31.81 31.79 32768 1000 85.84 86.16 86.00 65536 640 153.81 155.17 154.85 131072 320 305.43 309.15 308.39 262144 160 646.46 650.87 648.94 524288 80 1257.90 1326.69 1317.23 1048576 40 2624.33 2777.13 2712.36 2097152 20 4986.00 5730.81 5446.34 4194304 10 6095.29 9577.89 8317.25 #---------------------------------------------------------------- # Benchmarking Scatter # #processes = 24 #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.07 0.07 0.07 1 1000 2.16 2.16 2.16 2 1000 2.13 2.14 2.14 4 1000 2.26 2.27 2.27 8 1000 2.34 2.34 2.34 16 1000 2.53 2.54 2.54 32 1000 2.66 2.66 2.66 64 1000 2.78 2.79 2.78 128 1000 3.20 3.20 3.20 256 1000 3.86 3.87 3.87 512 1000 5.27 5.28 5.27 1024 1000 8.00 8.02 8.01 2048 1000 13.92 13.94 13.93 4096 1000 21.37 21.47 21.43 8192 1000 33.89 34.03 33.97 16384 1000 58.91 59.14 59.05 32768 1000 256.07 256.94 256.60 65536 640 305.34 307.33 306.56 131072 320 540.24 549.51 546.60 262144 160 1001.16 1020.53 1014.20 524288 80 2022.70 2188.65 2135.73 1048576 40 4080.00 4467.15 4335.84 2097152 20 6610.45 9110.64 8300.14 4194304 10 6034.09 18255.90 14702.80 #---------------------------------------------------------------- # Benchmarking Scatterv # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions 
t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.19 0.19 0.19 1 1000 0.58 0.58 0.58 2 1000 0.59 0.59 0.59 4 1000 0.60 0.60 0.60 8 1000 0.60 0.60 0.60 16 1000 0.61 0.61 0.61 32 1000 0.60 0.60 0.60 64 1000 0.60 0.60 0.60 128 1000 0.62 0.62 0.62 256 1000 0.65 0.65 0.65 512 1000 0.73 0.73 0.73 1024 1000 0.81 0.81 0.81 2048 1000 0.98 0.98 0.98 4096 1000 1.37 1.37 1.37 8192 1000 2.31 2.31 2.31 16384 1000 4.09 4.10 4.09 32768 1000 7.36 7.36 7.36 65536 640 15.43 15.44 15.43 131072 320 28.55 28.56 28.56 262144 160 59.05 59.07 59.06 524288 80 113.03 113.09 113.06 1048576 40 378.90 380.07 379.49 2097152 20 842.81 845.35 844.08 4194304 10 2097.20 2103.21 2100.21 #---------------------------------------------------------------- # Benchmarking Scatterv # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.14 0.15 0.15 1 1000 0.76 0.76 0.76 2 1000 0.75 0.75 0.75 4 1000 0.76 0.76 0.76 8 1000 0.76 0.76 0.76 16 1000 0.76 0.76 0.76 32 1000 0.76 0.76 0.76 64 1000 0.74 0.75 0.74 128 1000 0.80 0.80 0.80 256 1000 0.83 0.83 0.83 512 1000 0.92 0.92 0.92 1024 1000 1.03 1.03 1.03 2048 1000 1.31 1.31 1.31 4096 1000 2.00 2.00 2.00 8192 1000 3.38 3.38 3.38 16384 1000 5.84 5.85 5.85 32768 1000 10.80 10.80 10.80 65536 640 27.74 27.76 27.75 131072 320 54.95 55.05 55.01 262144 160 102.48 102.87 102.72 524288 80 227.03 228.74 228.08 1048576 40 802.43 813.60 809.19 2097152 20 1685.15 1738.45 1717.72 4194304 10 3248.60 3472.90 3387.47 #---------------------------------------------------------------- # Benchmarking Scatterv # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.13 0.14 0.13 1 1000 1.47 1.47 1.47 2 1000 1.49 1.49 1.49 4 1000 1.45 1.46 1.45 8 1000 1.45 1.46 1.46 16 1000 1.46 1.46 1.46 32 1000 1.46 1.47 1.47 64 1000 1.45 1.46 1.45 128 1000 1.59 1.59 1.59 256 1000 1.64 1.64 1.64 512 1000 1.83 1.83 1.83 1024 1000 2.12 2.13 2.13 2048 1000 2.71 2.71 2.71 4096 1000 4.02 4.03 4.03 8192 1000 6.71 6.72 6.71 16384 1000 12.99 13.01 13.00 32768 1000 26.07 26.09 26.09 65536 640 45.13 45.19 45.17 131072 320 95.65 95.88 95.80 262144 160 282.49 283.40 283.07 524288 80 607.36 613.66 611.18 1048576 40 1441.77 1483.60 1465.96 2097152 20 2669.95 2884.76 2798.91 4194304 10 4998.21 5909.30 5532.41 #---------------------------------------------------------------- # Benchmarking Scatterv # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.11 0.14 0.11 1 1000 3.56 3.57 3.57 2 1000 3.57 3.57 3.57 4 1000 3.53 3.54 3.53 8 1000 3.45 3.45 3.45 16 1000 3.33 3.33 3.33 32 1000 3.41 3.41 3.41 64 1000 3.42 3.43 3.43 128 1000 3.74 3.75 3.74 256 1000 4.22 4.23 4.23 512 1000 4.77 4.78 4.78 1024 1000 5.48 5.49 5.48 2048 1000 7.55 7.57 7.55 4096 1000 12.33 12.36 12.34 8192 1000 20.85 20.89 20.87 16384 1000 37.70 37.76 37.73 32768 1000 84.87 85.20 85.03 65536 640 156.53 157.92 157.60 131072 320 301.44 304.54 303.82 262144 160 641.33 646.51 644.20 524288 80 1234.50 1302.62 1292.91 1048576 40 2622.75 2775.00 2711.28 2097152 20 4980.00 5731.95 5445.08 4194304 10 6093.10 9566.50 8306.00 #---------------------------------------------------------------- # Benchmarking Scatterv # #processes = 24 
#---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.11 0.11 0.11 1 1000 10.52 10.56 10.54 2 1000 10.49 10.50 10.50 4 1000 10.52 10.54 10.53 8 1000 10.56 10.58 10.56 16 1000 11.56 11.57 11.56 32 1000 10.74 10.76 10.75 64 1000 10.43 10.45 10.44 128 1000 10.69 10.72 10.71 256 1000 11.91 11.94 11.92 512 1000 13.04 13.07 13.06 1024 1000 13.44 13.46 13.45 2048 1000 17.10 17.12 17.11 4096 1000 23.99 24.02 24.00 8192 1000 39.70 39.77 39.73 16384 1000 72.14 72.27 72.20 32768 1000 250.43 252.09 251.14 65536 640 306.03 307.80 307.05 131072 320 541.53 551.00 548.03 262144 160 1001.92 1021.25 1015.08 524288 80 2011.78 2177.79 2124.52 1048576 40 4045.22 4499.98 4350.65 2097152 20 6613.80 9116.10 8303.95 4194304 10 6030.80 18276.38 14665.27 #---------------------------------------------------------------- # Benchmarking Alltoall # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.06 0.06 1 1000 1.21 1.21 1.21 2 1000 1.18 1.18 1.18 4 1000 1.20 1.20 1.20 8 1000 1.19 1.19 1.19 16 1000 1.17 1.17 1.17 32 1000 1.20 1.20 1.20 64 1000 1.19 1.19 1.19 128 1000 1.15 1.15 1.15 256 1000 1.24 1.24 1.24 512 1000 1.35 1.35 1.35 1024 1000 3.03 3.03 3.03 2048 1000 1.57 1.57 1.57 4096 1000 1.52 1.52 1.52 8192 1000 3.35 3.35 3.35 16384 1000 4.91 4.91 4.91 32768 1000 9.70 9.70 9.70 65536 640 28.64 28.64 28.64 131072 320 52.94 52.96 52.95 262144 160 105.36 105.38 105.37 524288 80 199.73 199.79 199.76 1048576 40 698.33 699.93 699.13 2097152 20 1639.50 1642.55 1641.02 4194304 10 3995.01 4001.12 3998.06 #---------------------------------------------------------------- # Benchmarking Alltoall # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.06 0.06 1 1000 2.07 2.07 2.07 2 1000 2.07 2.07 2.07 4 1000 2.00 2.00 2.00 8 1000 2.05 2.05 2.05 16 1000 2.09 2.09 2.09 32 1000 2.00 2.00 2.00 64 1000 2.01 2.01 2.01 128 1000 2.11 2.11 2.11 256 1000 2.19 2.19 2.19 512 1000 2.36 2.36 2.36 1024 1000 6.50 6.50 6.50 2048 1000 3.41 3.42 3.42 4096 1000 4.28 4.28 4.28 8192 1000 7.74 7.74 7.74 16384 1000 14.71 14.71 14.71 32768 1000 38.32 38.33 38.33 65536 640 103.59 103.61 103.60 131072 320 196.45 196.53 196.50 262144 160 402.66 402.91 402.77 524288 80 1478.48 1479.26 1479.03 1048576 40 3052.82 3055.43 3054.12 2097152 20 6274.75 6281.95 6278.59 4194304 10 12699.51 12708.09 12703.75 #---------------------------------------------------------------- # Benchmarking Alltoall # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.06 0.06 1 1000 5.86 5.87 5.86 2 1000 5.96 5.96 5.96 4 1000 5.98 5.98 5.98 8 1000 5.89 5.89 5.89 16 1000 5.84 5.84 5.84 32 1000 5.85 5.85 5.85 64 1000 5.89 5.89 5.89 128 1000 6.47 6.47 6.47 256 1000 6.88 6.89 6.89 512 1000 7.67 7.67 7.67 1024 1000 16.66 16.66 16.66 2048 1000 11.81 11.81 11.81 4096 1000 20.75 20.75 20.75 8192 1000 32.35 32.36 32.35 16384 1000 85.77 85.77 85.77 32768 1000 180.59 180.60 180.59 65536 640 358.74 358.80 358.78 131072 320 1212.72 1213.08 1212.99 262144 160 3388.01 3391.17 3389.88 524288 80 6728.37 6734.03 6732.49 1048576 40 10541.90 10569.28 10564.85 2097152 20 20693.76 20757.90 
20735.97 4194304 10 40874.79 41150.40 41074.33 #---------------------------------------------------------------- # Benchmarking Alltoall # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.07 0.06 1 1000 13.62 13.62 13.62 2 1000 13.61 13.61 13.61 4 1000 13.75 13.75 13.75 8 1000 13.94 13.94 13.94 16 1000 14.51 14.51 14.51 32 1000 17.22 17.23 17.22 64 1000 18.02 18.03 18.02 128 1000 19.80 19.80 19.80 256 1000 23.64 23.65 23.64 512 1000 30.93 30.94 30.94 1024 1000 113.74 113.78 113.76 2048 1000 79.49 79.51 79.50 4096 1000 126.87 126.90 126.88 8192 1000 212.70 212.75 212.72 16384 1000 368.50 368.57 368.54 32768 1000 985.19 985.39 985.29 65536 640 1918.82 1919.14 1918.99 131072 320 3889.70 3890.25 3889.94 262144 160 7837.26 7841.18 7839.13 524288 80 15084.49 15096.49 15089.91 1048576 40 26202.33 26225.07 26214.45 2097152 20 52012.21 52081.80 52044.25 4194304 10 103580.50 103787.02 103702.73 #---------------------------------------------------------------- # Benchmarking Alltoall # #processes = 24 #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.06 0.06 0.06 1 1000 18.10 18.10 18.10 2 1000 18.29 18.29 18.29 4 1000 18.42 18.42 18.42 8 1000 18.89 18.89 18.89 16 1000 20.17 20.17 20.17 32 1000 23.94 23.94 23.94 64 1000 25.33 25.34 25.33 128 1000 28.79 28.80 28.79 256 1000 40.88 40.89 40.88 512 1000 64.16 64.19 64.18 1024 1000 109.20 109.25 109.23 2048 1000 300.23 300.30 300.27 4096 1000 374.07 374.17 374.13 8192 1000 594.70 594.82 594.76 16384 1000 1069.68 1069.85 1069.77 32768 1000 2721.12 2721.27 2721.20 65536 640 5456.17 5456.48 5456.35 131072 320 9123.14 9123.75 9123.47 262144 160 17243.67 17247.48 17245.66 524288 80 30838.92 30852.01 30846.95 1048576 40 58251.50 58279.28 58262.90 2097152 20 115973.51 116078.15 116047.46 4194304 10 232326.39 233073.52 232758.07 #---------------------------------------------------------------- # Benchmarking Alltoallv # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.26 0.28 0.27 1 1000 1.13 1.13 1.13 2 1000 1.17 1.17 1.17 4 1000 1.15 1.15 1.15 8 1000 1.15 1.15 1.15 16 1000 1.15 1.15 1.15 32 1000 1.18 1.18 1.18 64 1000 1.18 1.18 1.18 128 1000 1.15 1.15 1.15 256 1000 1.20 1.20 1.20 512 1000 1.25 1.25 1.25 1024 1000 1.35 1.35 1.35 2048 1000 1.62 1.62 1.62 4096 1000 2.15 2.15 2.15 8192 1000 3.38 3.38 3.38 16384 1000 5.55 5.55 5.55 32768 1000 10.49 10.49 10.49 65536 640 29.07 29.07 29.07 131072 320 53.32 53.33 53.33 262144 160 105.51 105.54 105.52 524288 80 203.01 203.08 203.05 1048576 40 699.03 700.60 699.81 2097152 20 1634.20 1637.70 1635.95 4194304 10 3995.01 4001.90 3998.46 #---------------------------------------------------------------- # Benchmarking Alltoallv # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.30 0.31 0.31 1 1000 2.22 2.22 2.22 2 1000 2.15 2.16 2.15 4 1000 2.15 2.15 2.15 8 1000 2.14 2.14 2.14 16 1000 2.12 2.12 2.12 32 1000 2.12 2.12 2.12 64 1000 2.11 2.11 2.11 128 1000 2.18 2.18 2.18 256 1000 2.35 2.35 2.35 512 1000 2.50 2.50 2.50 1024 1000 2.80 2.80 2.80 2048 1000 3.46 3.46 3.46 4096 1000 4.74 4.74 4.74 8192 1000 7.84 
7.84 7.84 16384 1000 15.50 15.50 15.50 32768 1000 40.18 40.20 40.19 65536 640 103.71 103.73 103.72 131072 320 196.47 196.54 196.51 262144 160 403.31 403.78 403.56 524288 80 1473.09 1474.35 1473.76 1048576 40 3046.70 3049.45 3048.11 2097152 20 6265.64 6272.65 6269.37 4194304 10 12696.70 12705.59 12701.09 #---------------------------------------------------------------- # Benchmarking Alltoallv # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.37 0.47 0.40 1 1000 6.03 6.03 6.03 2 1000 6.05 6.05 6.05 4 1000 5.96 5.97 5.97 8 1000 6.04 6.04 6.04 16 1000 5.98 5.98 5.98 32 1000 5.98 5.98 5.98 64 1000 5.99 5.99 5.99 128 1000 6.51 6.52 6.51 256 1000 14.33 14.33 14.33 512 1000 7.79 7.80 7.79 1024 1000 9.11 9.11 9.11 2048 1000 11.91 11.91 11.91 4096 1000 18.92 18.92 18.92 8192 1000 32.49 32.50 32.50 16384 1000 83.20 83.21 83.21 32768 1000 183.32 183.37 183.35 65536 640 359.73 359.78 359.75 131072 320 1211.81 1212.31 1212.13 262144 160 3391.06 3395.31 3394.28 524288 80 6637.94 6651.80 6648.10 1048576 40 10556.25 10577.57 10570.40 2097152 20 20717.55 20780.60 20763.06 4194304 10 40825.01 41127.71 40976.69 #---------------------------------------------------------------- # Benchmarking Alltoallv # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.50 0.61 0.53 1 1000 15.75 15.76 15.76 2 1000 15.62 15.63 15.62 4 1000 15.50 15.50 15.50 8 1000 15.94 15.94 15.94 16 1000 16.11 16.12 16.11 32 1000 16.77 16.78 16.78 64 1000 17.48 17.48 17.48 128 1000 19.71 19.72 19.71 256 1000 34.19 34.20 34.20 512 1000 38.35 38.36 38.36 1024 1000 48.36 48.37 48.36 2048 1000 79.72 79.74 79.73 4096 1000 126.79 126.82 126.80 8192 1000 211.50 211.55 211.52 16384 1000 367.93 368.01 367.96 32768 1000 996.61 996.75 996.70 65536 640 1840.40 1840.81 1840.67 131072 320 3807.13 3808.53 3807.81 262144 160 8074.01 8080.38 8077.69 524288 80 15598.12 15620.77 15614.72 1048576 40 27058.92 27113.68 27087.95 2097152 20 52679.79 52836.44 52802.47 4194304 10 104415.92 104942.70 104675.66 #---------------------------------------------------------------- # Benchmarking Alltoallv # #processes = 24 #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.65 0.74 0.67 1 1000 293.39 293.44 293.43 2 1000 288.98 289.07 289.05 4 1000 286.20 286.27 286.25 8 1000 287.02 287.07 287.06 16 1000 289.71 289.75 289.73 32 1000 331.31 331.39 331.36 64 1000 292.07 292.14 292.12 128 1000 291.39 291.43 291.42 256 1000 286.44 286.50 286.48 512 1000 270.12 270.19 270.16 1024 1000 254.18 254.23 254.21 2048 1000 300.85 300.91 300.88 4096 1000 373.58 373.66 373.62 8192 1000 594.37 594.52 594.45 16384 1000 1068.46 1068.74 1068.63 32768 1000 3580.25 3580.46 3580.39 65536 640 6129.04 6129.85 6129.41 131072 320 9652.28 9654.76 9653.57 262144 160 17818.87 17829.06 17824.97 524288 80 32538.07 32553.89 32548.30 1048576 40 61332.12 61385.62 61363.46 2097152 20 118996.00 119199.14 119122.90 4194304 10 232128.00 232712.20 232498.88 #---------------------------------------------------------------- # Benchmarking Bcast # #processes = 2 # ( 22 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.05 
0.06 0.06 1 1000 0.38 0.38 0.38 2 1000 0.40 0.40 0.40 4 1000 0.40 0.40 0.40 8 1000 0.40 0.40 0.40 16 1000 0.40 0.40 0.40 32 1000 0.44 0.44 0.44 64 1000 0.46 0.46 0.46 128 1000 0.48 0.48 0.48 256 1000 0.50 0.50 0.50 512 1000 0.56 0.56 0.56 1024 1000 0.65 0.65 0.65 2048 1000 0.83 0.84 0.83 4096 1000 1.19 1.19 1.19 8192 1000 1.94 1.94 1.94 16384 1000 3.47 3.47 3.47 32768 1000 6.13 6.13 6.13 65536 640 10.04 10.05 10.04 131072 320 18.33 18.34 18.33 262144 160 37.16 37.18 37.17 524288 80 68.81 68.89 68.85 1048576 40 291.17 292.78 291.98 2097152 20 579.00 582.90 580.95 4194304 10 1155.50 1162.00 1158.75 #---------------------------------------------------------------- # Benchmarking Bcast # #processes = 4 # ( 20 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.05 0.05 0.05 1 1000 0.61 0.61 0.61 2 1000 0.62 0.62 0.62 4 1000 0.61 0.62 0.62 8 1000 0.63 0.63 0.63 16 1000 0.65 0.65 0.65 32 1000 0.69 0.69 0.69 64 1000 0.70 0.70 0.70 128 1000 0.76 0.76 0.76 256 1000 0.82 0.82 0.82 512 1000 0.90 0.90 0.90 1024 1000 1.02 1.03 1.03 2048 1000 1.34 1.35 1.35 4096 1000 2.00 2.00 2.00 8192 1000 3.17 3.18 3.17 16384 1000 5.49 5.50 5.50 32768 1000 9.96 9.98 9.97 65536 640 21.82 21.83 21.83 131072 320 46.93 46.96 46.94 262144 160 92.13 92.21 92.18 524288 80 176.58 176.83 176.71 1048576 40 697.95 701.62 699.97 2097152 20 1385.75 1392.29 1389.29 4194304 10 2732.68 2745.70 2739.64 #---------------------------------------------------------------- # Benchmarking Bcast # #processes = 8 # ( 16 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.05 0.05 0.05 1 1000 1.09 1.09 1.09 2 1000 1.09 1.09 1.09 4 1000 1.09 1.09 1.09 8 1000 1.10 1.10 1.10 16 1000 1.11 1.11 1.11 32 1000 1.22 1.23 1.22 64 1000 1.24 1.24 1.24 128 1000 1.40 1.41 1.40 256 1000 1.42 1.42 1.42 512 1000 1.59 1.59 1.59 1024 1000 1.84 1.84 1.84 2048 1000 2.28 2.29 2.29 4096 1000 3.37 3.38 3.38 8192 1000 5.43 5.45 5.44 16384 1000 10.31 10.34 10.33 32768 1000 16.47 16.51 16.49 65536 640 29.66 29.74 29.72 131072 320 58.67 58.89 58.84 262144 160 136.46 137.48 137.29 524288 80 309.12 314.99 314.09 1048576 40 683.35 708.48 705.03 2097152 20 2121.54 2164.79 2154.88 4194304 10 4588.60 4743.91 4718.64 #---------------------------------------------------------------- # Benchmarking Bcast # #processes = 16 # ( 8 additional processes waiting in MPI_Barrier) #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 1000 0.05 0.05 0.05 1 1000 1.28 1.29 1.29 2 1000 1.27 1.28 1.27 4 1000 1.27 1.28 1.28 8 1000 1.28 1.29 1.28 16 1000 1.28 1.29 1.29 32 1000 1.37 1.38 1.37 64 1000 1.38 1.39 1.39 128 1000 1.61 1.62 1.62 256 1000 1.97 1.99 1.98 512 1000 2.07 2.08 2.08 1024 1000 2.37 2.39 2.38 2048 1000 2.99 3.01 3.00 4096 1000 4.25 4.29 4.27 8192 1000 6.92 6.98 6.95 16384 1000 12.55 12.62 12.60 32768 1000 20.16 20.25 20.21 65536 640 36.27 36.45 36.39 131072 320 70.90 71.37 71.22 262144 160 166.30 167.46 167.17 524288 80 361.59 363.95 363.35 1048576 40 830.90 839.05 837.31 2097152 20 2515.75 2572.85 2562.82 4194304 10 4931.21 5855.89 5633.50 #---------------------------------------------------------------- # Benchmarking Bcast # #processes = 24 #---------------------------------------------------------------- #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec] 0 
1000 0.05 0.06 0.05
1 1000 1.20 1.22 1.21
2 1000 1.20 1.21 1.20
4 1000 1.20 1.22 1.21
8 1000 1.21 1.22 1.22
16 1000 1.22 1.23 1.23
32 1000 1.29 1.31 1.30
64 1000 1.31 1.32 1.31
128 1000 1.52 1.54 1.53
256 1000 1.88 1.90 1.89
512 1000 1.96 1.98 1.97
1024 1000 2.26 2.29 2.27
2048 1000 2.87 2.90 2.89
4096 1000 4.19 4.23 4.21
8192 1000 7.15 7.23 7.19
16384 1000 13.16 13.28 13.23
32768 1000 22.67 22.82 22.77
65536 640 38.16 38.47 38.36
131072 320 75.05 75.51 75.32
262144 160 169.04 171.03 170.55
524288 80 362.73 367.06 365.88
1048576 40 838.55 847.42 845.51
2097152 20 2483.84 2562.25 2538.52
4194304 10 5485.99 6022.41 5951.07
#---------------------------------------------------
# Benchmarking Barrier
# #processes = 2
# ( 22 additional processes waiting in MPI_Barrier)
#---------------------------------------------------
#repetitions t_min[usec] t_max[usec] t_avg[usec]
1000 0.55 0.55 0.55
#---------------------------------------------------
# Benchmarking Barrier
# #processes = 4
# ( 20 additional processes waiting in MPI_Barrier)
#---------------------------------------------------
#repetitions t_min[usec] t_max[usec] t_avg[usec]
1000 0.94 0.94 0.94
#---------------------------------------------------
# Benchmarking Barrier
# #processes = 8
# ( 16 additional processes waiting in MPI_Barrier)
#---------------------------------------------------
#repetitions t_min[usec] t_max[usec] t_avg[usec]
1000 1.73 1.73 1.73
#---------------------------------------------------
# Benchmarking Barrier
# #processes = 16
# ( 8 additional processes waiting in MPI_Barrier)
#---------------------------------------------------
#repetitions t_min[usec] t_max[usec] t_avg[usec]
1000 4.55 4.55 4.55
#---------------------------------------------------
# Benchmarking Barrier
# #processes = 24
#---------------------------------------------------
#repetitions t_min[usec] t_max[usec] t_avg[usec]
1000 5.57 5.57 5.57
# All processes entering MPI_Finalize
[proxy:0:1@compute-0-1.local] got pmi command (from 42): barrier_in
[proxy:0:1@compute-0-1.local] got pmi command (from 21): barrier_in
[proxy:0:1@compute-0-1.local] got pmi command (from 24): barrier_in
[proxy:0:1@compute-0-1.local] got pmi command (from 27): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 34): barrier_in
[proxy:0:1@compute-0-1.local] got pmi command (from 36): barrier_in
[proxy:0:1@compute-0-1.local] got pmi command (from 33): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 37): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 6): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 7): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 31): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 16): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 25): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 28): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 22): barrier_in
[proxy:0:1@compute-0-1.local] got pmi command (from 11): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 8): barrier_in
[proxy:0:1@compute-0-1.local] got pmi command (from 18): barrier_in
[proxy:0:1@compute-0-1.local] got pmi command (from 30): barrier_in
[proxy:0:1@compute-0-1.local] got pmi command (from 39): barrier_in
[mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in
[proxy:0:1@compute-0-1.local] got pmi command (from 15): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 19): barrier_in
[proxy:0:0@compute-0-0.local] got pmi command (from 13): barrier_in
[proxy:0:0@compute-0-0.local] forwarding command (cmd=barrier_in) upstream
[proxy:0:1@compute-0-1.local] got pmi command (from 7): barrier_in
[mpiexec@compute-0-1.local] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@compute-0-1.local] PMI response to fd 0 pid 7: cmd=barrier_out
[mpiexec@compute-0-1.local] PMI response to fd 6 pid 7: cmd=barrier_out
[proxy:0:1@compute-0-1.local] forwarding command (cmd=barrier_in) upstream
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:0@compute-0-0.local] PMI response: cmd=barrier_out
[proxy:0:1@compute-0-1.local] got pmi command (from 7): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 22): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
[proxy:0:1@compute-0-1.local] got pmi command (from 11): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:1@compute-0-1.local] got pmi command (from 15): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 7): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 8): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
[proxy:0:1@compute-0-1.local] got pmi command (from 18): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:1@compute-0-1.local] got pmi command (from 24): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:1@compute-0-1.local] got pmi command (from 27): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:1@compute-0-1.local] got pmi command (from 42): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 16): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 6): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 25): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
[proxy:0:1@compute-0-1.local] got pmi command (from 21): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:1@compute-0-1.local] got pmi command (from 30): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 31): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 37): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 28): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 34): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 13): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
[proxy:0:1@compute-0-1.local] got pmi command (from 36): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:1@compute-0-1.local] got pmi command (from 33): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:1@compute-0-1.local] got pmi command (from 39): finalize
[proxy:0:1@compute-0-1.local] PMI response: cmd=finalize_ack
[proxy:0:0@compute-0-0.local] got pmi command (from 19): finalize
[proxy:0:0@compute-0-0.local] PMI response: cmd=finalize_ack
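The barrier_in/barrier_out and finalize/finalize_ack messages above are the PMI shutdown handshake between the two hydra proxies and mpiexec as the 24 MPI ranks finalize; they indicate the job ended cleanly. For each collective in this log, the benchmark was run on sub-groups of 2, 4, 8, 16 and finally all 24 processes, with the headers noting that the ranks outside the active sub-group sit in a barrier while the measurement runs. Below is a minimal sketch of one way to set up that pattern with MPI_Comm_split; it is illustrative only (IMB's own group handling is more involved, and the Allreduce call merely stands in for whichever collective is being timed):

#include <mpi.h>

/* Illustrative only: run a collective on the first 'active' ranks while the
 * remaining ranks wait, mirroring the "#processes = N ( M additional
 * processes waiting in MPI_Barrier)" headers in the log above. */
static void run_subset(int active, MPI_Comm world)
{
    int rank, nranks;
    MPI_Comm_rank(world, &rank);
    MPI_Comm_size(world, &nranks);
    if (active > nranks)
        active = nranks;

    /* Ranks below 'active' form the measuring communicator; the rest get
     * MPI_COMM_NULL and fall straight through to the barrier. */
    MPI_Comm sub;
    MPI_Comm_split(world, rank < active ? 0 : MPI_UNDEFINED, rank, &sub);

    if (sub != MPI_COMM_NULL) {
        double x = 1.0, sum;
        /* ... timed repetitions of the collective under test go here ... */
        MPI_Allreduce(&x, &sum, 1, MPI_DOUBLE, MPI_SUM, sub);
        MPI_Comm_free(&sub);
    }

    /* Active and idle ranks meet here before the next sub-group size. */
    MPI_Barrier(world);
}

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    const int sizes[] = { 2, 4, 8, 16, 24 };
    for (int i = 0; i < (int)(sizeof sizes / sizeof sizes[0]); i++)
        run_subset(sizes[i], MPI_COMM_WORLD);
    MPI_Finalize();
    return 0;
}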