host: ip-10-252-65-194.us-west-2.compute.internal
host: ip-10-252-67-17.us-west-2.compute.internal

==================================================================================================
mpiexec options:
----------------
  Base path: /shared/home/derek/MD/Linux-x86_64/IntelMPI4/bin/
  Launcher: sge
  Debug level: 1
  Enable X: -1

  Global environment:
  -------------------
    I_MPI_PERHOST=allcores
    MANPATH=I_MPI_SUBSTITUTE_INSTALLDIR/man:/sched/sge/sge-2011.11/man:/sched/sge/sge-2011.11/man:/sched/sge/sge-2011.11/man:/sched/sge/sge-2011.11/man:/usr/man:$/opt/ganglia/current/share/man:/usr/java/default/man:$/opt/ganglia/current/share/man:/usr/java/default/man
    SGE_INFOTEXT_MAX_COLUMN=5000
    HISTSIZE=1000
    CYCLECLOUD_BOOTSTRAP=/opt/cycle/jetpack/system/bootstrap
    LC_ALL=C
    LD_LIBRARY_PATH=I_MPI_SUBSTITUTE_INSTALLDIR/intel64/lib:/shared/home/derek/MD/Linux-x86_64/IntelMKL/lib:/shared/home/derek/MD/Linux-x86_64/IntelMPI4/lib:/shared/home/derek/MD/Linux-x86_64/IntelMKL/lib:/shared/home/derek/MD/Linux-x86_64/IntelMPI4/lib:/sched/sge/sge-2011.11/lib/linux-x64:/sched/sge/sge-2011.11/lib/linux-x64
    MAIL=/var/spool/mail/rahul
    PWD=/shared/home/rahul/DFT-MD/single_atom/Si
    JAVA_HOME=/usr/java/default
    SGE_EXECD_PORT=538
    SGE_RSH_COMMAND=/usr/bin/ssh -X
    SGE_ACCOUNT=sge
    SGE_NOMSG=1
    REQNAME=struc1
    GRIDLOCALEDIR=/sched/sge/sge-2011.11/locale
    HISTCONTROL=ignoredups
    SHLVL=9
    SGE_CWD_PATH=/shared/home/rahul/DFT-MD/single_atom/Si
    I_MPI_HYDRA_BOOTSTRAP=sge
    LESSOPEN=||/usr/bin/lesspipe.sh %s
    CYCLECLOUD_HOME=/opt/cycle/jetpack
    SGE_CLUSTER_NAME=grid1
    G_BROKEN_FILENAMES=1
    I_MPI_ROOT=I_MPI_SUBSTITUTE_INSTALLDIR
    _=/shared/home/derek/MD/Linux-x86_64/IntelMPI4/bin/mpiexec.hydra

  Hydra internal environment:
  ---------------------------
    MPICH_ENABLE_CKPOINT=1
    GFORTRAN_UNBUFFERED_PRECONNECTED=y

  User set environment:
  ---------------------
    I_MPI_DEBUG=5

  Proxy information:
  *********************
    [1] proxy: ip-10-252-65-194.us-west-2.compute.internal (8 cores)
    Exec list:
    /shared/home/derek/programs/vasp/vasp5/vasp_parallel (8 processes);
    [2] proxy: ip-10-252-67-17.us-west-2.compute.internal (8 cores)
    Exec list:
    /shared/home/derek/programs/vasp/vasp5/vasp_parallel (8 processes);

==================================================================================================

[mpiexec@ip-10-252-65-194] Timeout set to -1 (-1 means infinite)
[mpiexec@ip-10-252-65-194] Got a control port string of ip-10-252-65-194.us-west-2.compute.internal:41140
Proxy launch args: /shared/home/derek/MD/Linux-x86_64/IntelMPI4/bin/pmi_proxy --control-port ip-10-252-65-194.us-west-2.compute.internal:41140 --debug --pmi-connect lazy-cache --pmi-aggregate --preload libVTmc.so -s 0 --rmk sge --launcher sge --demux poll --pgid 0 --enable-stdin 1 --retries 10 --control-code 1053606612 --proxy-id
[mpiexec@ip-10-252-65-194] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1
Arguments being passed to proxy 0: --version 1.4.1p1 --iface-ip-env-name MPICH_INTERFACE_HOSTNAME --hostname ip-10-252-65-194.us-west-2.compute.internal --global-core-map 0,8,8 --filler-process-map 0,8,8 --global-process-count 16 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_77005_0 --pmi-process-mapping (vector,(0,2,8)) --topolib ipl --ckpointlib blcr --ckpoint-prefix /tmp --ckpoint-preserve 1 --ckpoint off --ckpoint-num -1 --global-inherited-env 26 'I_MPI_PERHOST=allcores' 'MANPATH=I_MPI_SUBSTITUTE_INSTALLDIR/man:/sched/sge/sge-2011.11/man:/sched/sge/sge-2011.11/man:/sched/sge/sge-2011.11/man:/sched/sge/sge-2011.11/man:/usr/man:$/opt/ganglia/current/share/man:/usr/java/default/man:$/opt/ganglia/current/share/man:/usr/java/default/man' 'SGE_INFOTEXT_MAX_COLUMN=5000' 'HISTSIZE=1000' 'CYCLECLOUD_BOOTSTRAP=/opt/cycle/jetpack/system/bootstrap' 'LC_ALL=C'
'LD_LIBRARY_PATH=I_MPI_SUBSTITUTE_INSTALLDIR/intel64/lib:/shared/home/derek/MD/Linux-x86_64/IntelMKL/lib:/shared/home/derek/MD/Linux-x86_64/IntelMPI4/lib:/shared/home/derek/MD/Linux-x86_64/IntelMKL/lib:/shared/home/derek/MD/Linux-x86_64/IntelMPI4/lib:/sched/sge/sge-2011.11/lib/linux-x64:/sched/sge/sge-2011.11/lib/linux-x64' 'MAIL=/var/spool/mail/rahul' 'PWD=/shared/home/rahul/DFT-MD/single_atom/Si' 'JAVA_HOME=/usr/java/default' 'SGE_EXECD_PORT=538' 'SGE_RSH_COMMAND=/usr/bin/ssh -X' 'SGE_ACCOUNT=sge' 'SGE_NOMSG=1' 'REQNAME=struc1' 'GRIDLOCALEDIR=/sched/sge/sge-2011.11/locale' 'HISTCONTROL=ignoredups' 'SHLVL=9' 'SGE_CWD_PATH=/shared/home/rahul/DFT-MD/single_atom/Si' 'I_MPI_HYDRA_BOOTSTRAP=sge' 'LESSOPEN=||/usr/bin/lesspipe.sh %s' 'CYCLECLOUD_HOME=/opt/cycle/jetpack' 'SGE_CLUSTER_NAME=grid1' 'G_BROKEN_FILENAMES=1' 'I_MPI_ROOT=I_MPI_SUBSTITUTE_INSTALLDIR' '_=/shared/home/derek/MD/Linux-x86_64/IntelMPI4/bin/mpiexec.hydra' --global-user-env 1 'I_MPI_DEBUG=5' --global-system-env 2 'MPICH_ENABLE_CKPOINT=1' 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 8 --exec --exec-appnum 0 --exec-proc-count 8 --exec-local-env 0 --exec-wdir /shared/home/rahul/DFT-MD/single_atom/Si --exec-args 1 /shared/home/derek/programs/vasp/vasp5/vasp_parallel
[mpiexec@ip-10-252-65-194] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1
Arguments being passed to proxy 1: --version 1.4.1p1 --iface-ip-env-name MPICH_INTERFACE_HOSTNAME --hostname ip-10-252-67-17.us-west-2.compute.internal --global-core-map 8,8,0 --filler-process-map 8,8,0 --global-process-count 16 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_77005_0 --pmi-process-mapping (vector,(0,2,8)) --topolib ipl --ckpointlib blcr --ckpoint-prefix /tmp --ckpoint-preserve 1 --ckpoint off --ckpoint-num -1 --global-inherited-env 26 'I_MPI_PERHOST=allcores'
'MANPATH=I_MPI_SUBSTITUTE_INSTALLDIR/man:/sched/sge/sge-2011.11/man:/sched/sge/sge-2011.11/man:/sched/sge/sge-2011.11/man:/sched/sge/sge-2011.11/man:/usr/man:$/opt/ganglia/current/share/man:/usr/java/default/man:$/opt/ganglia/current/share/man:/usr/java/default/man' 'SGE_INFOTEXT_MAX_COLUMN=5000' 'HISTSIZE=1000' 'CYCLECLOUD_BOOTSTRAP=/opt/cycle/jetpack/system/bootstrap' 'LC_ALL=C' 'LD_LIBRARY_PATH=I_MPI_SUBSTITUTE_INSTALLDIR/intel64/lib:/shared/home/derek/MD/Linux-x86_64/IntelMKL/lib:/shared/home/derek/MD/Linux-x86_64/IntelMPI4/lib:/shared/home/derek/MD/Linux-x86_64/IntelMKL/lib:/shared/home/derek/MD/Linux-x86_64/IntelMPI4/lib:/sched/sge/sge-2011.11/lib/linux-x64:/sched/sge/sge-2011.11/lib/linux-x64' 'MAIL=/var/spool/mail/rahul' 'PWD=/shared/home/rahul/DFT-MD/single_atom/Si' 'JAVA_HOME=/usr/java/default' 'SGE_EXECD_PORT=538' 'SGE_RSH_COMMAND=/usr/bin/ssh -X' 'SGE_ACCOUNT=sge' 'SGE_NOMSG=1' 'REQNAME=struc1' 'GRIDLOCALEDIR=/sched/sge/sge-2011.11/locale' 'HISTCONTROL=ignoredups' 'SHLVL=9' 'SGE_CWD_PATH=/shared/home/rahul/DFT-MD/single_atom/Si' 'I_MPI_HYDRA_BOOTSTRAP=sge' 'LESSOPEN=||/usr/bin/lesspipe.sh %s' 'CYCLECLOUD_HOME=/opt/cycle/jetpack' 'SGE_CLUSTER_NAME=grid1' 'G_BROKEN_FILENAMES=1' 'I_MPI_ROOT=I_MPI_SUBSTITUTE_INSTALLDIR' '_=/shared/home/derek/MD/Linux-x86_64/IntelMPI4/bin/mpiexec.hydra' --global-user-env 1 'I_MPI_DEBUG=5' --global-system-env 2 'MPICH_ENABLE_CKPOINT=1' 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 8 --exec --exec-appnum 0 --exec-proc-count 8 --exec-local-env 0 --exec-wdir /shared/home/rahul/DFT-MD/single_atom/Si --exec-args 1 /shared/home/derek/programs/vasp/vasp5/vasp_parallel
[mpiexec@ip-10-252-65-194] Launch arguments: /sched/sge/sge-2011.11/bin/linux-x64/qrsh -inherit -V ip-10-252-65-194.us-west-2.compute.internal /shared/home/derek/MD/Linux-x86_64/IntelMPI4/bin/pmi_proxy --control-port ip-10-252-65-194.us-west-2.compute.internal:41140 --debug --pmi-connect lazy-cache --pmi-aggregate --preload libVTmc.so -s 0 --rmk sge
--launcher sge --demux poll --pgid 0 --enable-stdin 1 --retries 10 --control-code 1053606612 --proxy-id 0
[mpiexec@ip-10-252-65-194] Launch arguments: /sched/sge/sge-2011.11/bin/linux-x64/qrsh -inherit -V ip-10-252-67-17.us-west-2.compute.internal /shared/home/derek/MD/Linux-x86_64/IntelMPI4/bin/pmi_proxy --control-port ip-10-252-65-194.us-west-2.compute.internal:41140 --debug --pmi-connect lazy-cache --pmi-aggregate --preload libVTmc.so -s 0 --rmk sge --launcher sge --demux poll --pgid 0 --enable-stdin 1 --retries 10 --control-code 1053606612 --proxy-id 1
[mpiexec@ip-10-252-65-194] STDIN will be redirected to 1 fd(s): 7
[proxy:0:0@ip-10-252-65-194] Start PMI_proxy 0
[proxy:0:1@ip-10-252-67-17] Start PMI_proxy 1
[proxy:0:0@ip-10-252-65-194] STDIN will be redirected to 1 fd(s): 9
[proxy:0:0@ip-10-252-65-194] got pmi command (from 6): init pmi_version=1 pmi_subversion=1
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 7): init pmi_version=1 pmi_subversion=1
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 8): init pmi_version=1 pmi_subversion=1
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 13): init pmi_version=1 pmi_subversion=1
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 16): init pmi_version=1 pmi_subversion=1
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 19): init pmi_version=1 pmi_subversion=1
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 22): init
pmi_version=1 pmi_subversion=1
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 25): init pmi_version=1 pmi_subversion=1
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 6): get_maxes
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@ip-10-252-65-194] got pmi command (from 7): get_maxes
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@ip-10-252-65-194] got pmi command (from 8): get_maxes
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@ip-10-252-65-194] got pmi command (from 13): get_maxes
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@ip-10-252-65-194] got pmi command (from 16): get_maxes
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@ip-10-252-65-194] got pmi command (from 19): get_maxes
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@ip-10-252-65-194] got pmi command (from 6): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 7): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 8): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 13): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 25): get_maxes
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:0@ip-10-252-65-194] got pmi command (from 16): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 19): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 25): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 22):
get_maxes
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[mpiexec@ip-10-252-65-194] [pgid: 0] got PMI command: cmd=barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 22): barrier_in
[proxy:0:0@ip-10-252-65-194] forwarding command (cmd=barrier_in) upstream
[proxy:0:1@ip-10-252-67-17] got pmi command (from 6): init pmi_version=1 pmi_subversion=1
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 7): init pmi_version=1 pmi_subversion=1
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 9): init pmi_version=1 pmi_subversion=1
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 12): init pmi_version=1 pmi_subversion=1
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 15): init pmi_version=1 pmi_subversion=1
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 18): init pmi_version=1 pmi_subversion=1
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 21): init pmi_version=1 pmi_subversion=1
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 7): get_maxes
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@ip-10-252-67-17] got pmi command (from 9): get_maxes
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@ip-10-252-67-17]
got pmi command (from 12): get_maxes
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@ip-10-252-67-17] got pmi command (from 15): get_maxes
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@ip-10-252-67-17] got pmi command (from 18): get_maxes
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@ip-10-252-67-17] got pmi command (from 24): init pmi_version=1 pmi_subversion=1
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 7): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 9): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 12): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 15): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 18): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 21): get_maxes
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@ip-10-252-67-17] got pmi command (from 24): get_maxes
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0:1@ip-10-252-67-17] got pmi command (from 21): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 24): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 6): get_maxes
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024
[mpiexec@ip-10-252-65-194] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@ip-10-252-65-194] PMI response to fd 0 pid 6: cmd=barrier_out
[mpiexec@ip-10-252-65-194] PMI response to fd 6 pid 6: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] got pmi command (from 6): barrier_in
[proxy:0:1@ip-10-252-67-17] forwarding command (cmd=barrier_in) upstream
[proxy:0:0@ip-10-252-65-194] PMI response:
cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] got pmi command (from 6): get_ranks2hosts
[proxy:0:0@ip-10-252-65-194] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:0@ip-10-252-65-194] got pmi command (from 7): get_ranks2hosts
[proxy:0:0@ip-10-252-65-194] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:0@ip-10-252-65-194] got pmi command (from 8): get_ranks2hosts
[proxy:0:0@ip-10-252-65-194] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:0@ip-10-252-65-194] got pmi command (from 13): get_ranks2hosts
[proxy:0:0@ip-10-252-65-194] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:0@ip-10-252-65-194] got pmi command (from 6): get_appnum
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=appnum appnum=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 7): get_appnum
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=appnum appnum=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 8): get_appnum
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=appnum appnum=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 16): get_ranks2hosts
[proxy:0:0@ip-10-252-65-194] PMI
response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:0@ip-10-252-65-194] got pmi command (from 19): get_ranks2hosts
[proxy:0:0@ip-10-252-65-194] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:0@ip-10-252-65-194] got pmi command (from 22): get_ranks2hosts
[proxy:0:0@ip-10-252-65-194] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:0@ip-10-252-65-194] got pmi command (from 25): get_ranks2hosts
[proxy:0:0@ip-10-252-65-194] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] got pmi command (from 6): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 7): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 8): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 13): get_appnum
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=appnum appnum=0
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response:
cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] got pmi command (from 6): get_ranks2hosts
[proxy:0:1@ip-10-252-67-17] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:1@ip-10-252-67-17] got pmi command (from 7): get_ranks2hosts
[proxy:0:1@ip-10-252-67-17] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:1@ip-10-252-67-17] got pmi command (from 9): get_ranks2hosts
[proxy:0:1@ip-10-252-67-17] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:1@ip-10-252-67-17] got pmi command (from 12): get_ranks2hosts
[proxy:0:1@ip-10-252-67-17] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:0@ip-10-252-65-194] got pmi command (from 16): get_appnum
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=appnum appnum=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 19): get_appnum
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=appnum appnum=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 6): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 7): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 18): get_ranks2hosts
[proxy:0:1@ip-10-252-67-17] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:1@ip-10-252-67-17] got pmi command (from 6): get_appnum
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=appnum appnum=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 7): get_appnum
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=appnum appnum=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 12): get_appnum
[proxy:0:0@ip-10-252-65-194] got pmi command (from 8): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 13): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 16): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 25): get_appnum
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=appnum appnum=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 13): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 19): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=appnum appnum=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 15): get_ranks2hosts
[proxy:0:1@ip-10-252-67-17] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:1@ip-10-252-67-17] got pmi command (from 21): get_ranks2hosts
[proxy:0:1@ip-10-252-67-17] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:1@ip-10-252-67-17] got pmi command (from 24): get_ranks2hosts
[proxy:0:1@ip-10-252-67-17] PMI response: put_ranks2hosts 135 2 43 ip-10-252-65-194.us-west-2.compute.internal 0,1,2,3,4,5,6,7, 42 ip-10-252-67-17.us-west-2.compute.internal 8,9,10,11,12,13,14,15,
[proxy:0:1@ip-10-252-67-17] got pmi command (from 6): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 7): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 9): get_appnum
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=appnum appnum=0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 16): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 25): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 12): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 18): get_appnum
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=appnum appnum=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 6): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 7): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 9): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[mpiexec@ip-10-252-65-194] [pgid: 0] got PMI command: cmd=put kvsname=kvs_77005_0 key=sharedFilename[0] value=/dev/shm/Intel_MPI_IbCwMG
[mpiexec@ip-10-252-65-194] PMI response to fd 0 pid 6: cmd=put_result rc=0 msg=success
[proxy:0:0@ip-10-252-65-194] got pmi command (from 7): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 19): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 6): put kvsname=kvs_77005_0 key=sharedFilename[0]
value=/dev/shm/Intel_MPI_IbCwMG
[proxy:0:0@ip-10-252-65-194] forwarding command (cmd=put kvsname=kvs_77005_0 key=sharedFilename[0] value=/dev/shm/Intel_MPI_IbCwMG) upstream
[proxy:0:0@ip-10-252-65-194] got pmi command (from 8): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 13): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 25): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 15): get_appnum
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=appnum appnum=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 21): get_appnum
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=appnum appnum=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 24): get_appnum
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=appnum appnum=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 9): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 12): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 15): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 18): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] we don't understand the response put_result; forwarding downstream
[proxy:0:0@ip-10-252-65-194] got pmi command (from 16): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 19): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 6): barrier_in
[proxy:0:0@ip-10-252-65-194] got pmi command (from 25): barrier_in
[mpiexec@ip-10-252-65-194] [pgid: 0] got PMI command: cmd=put kvsname=kvs_77005_0 key=sharedFilename[8] value=/dev/shm/Intel_MPI_lnni7c
[mpiexec@ip-10-252-65-194] PMI response to fd 6 pid 6: cmd=put_result rc=0 msg=success
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname
kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 21): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 6): put kvsname=kvs_77005_0 key=sharedFilename[8] value=/dev/shm/Intel_MPI_lnni7c
[proxy:0:1@ip-10-252-67-17] forwarding command (cmd=put kvsname=kvs_77005_0 key=sharedFilename[8] value=/dev/shm/Intel_MPI_lnni7c) upstream
[proxy:0:0@ip-10-252-65-194] got pmi command (from 22): get_appnum
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=appnum appnum=0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 7): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 15): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 24): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 9): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 12): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 18): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 21): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:0@ip-10-252-65-194] got pmi command (from 22): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] got pmi command (from 24): get_my_kvsname
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=my_kvsname kvsname=kvs_77005_0
[proxy:0:1@ip-10-252-67-17] we don't understand the response put_result; forwarding downstream
[proxy:0:1@ip-10-252-67-17] got pmi command (from 15): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 18): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 21): barrier_in
[proxy:0:1@ip-10-252-67-17] [proxy:0:0@ip-10-252-65-194]
got pmi command (from 22): get_my_kvsname
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=my_kvsname kvsname=kvs_77005_0 got pmi command (from 6): barrier_in
[proxy:0:1@ip-10-252-67-17] got pmi command (from 24): barrier_in
[proxy:0:1@ip-10-252-67-17] forwarding command (cmd=barrier_in) upstream
[mpiexec@ip-10-252-65-194] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@ip-10-252-65-194] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@ip-10-252-65-194] PMI response to fd 0 pid 22: cmd=barrier_out
[mpiexec@ip-10-252-65-194] PMI response to fd 6 pid 22: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] got pmi command (from 22): barrier_in
[proxy:0:0@ip-10-252-65-194] forwarding command (cmd=barrier_in) upstream
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=barrier_out
[proxy:0:0@ip-10-252-65-194] got pmi command (from 7): get kvsname=kvs_77005_0 key=sharedFilename[0]
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_IbCwMG
[proxy:0:0@ip-10-252-65-194] got pmi command (from 8): get kvsname=kvs_77005_0 key=sharedFilename[0]
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_IbCwMG
[proxy:0:0@ip-10-252-65-194] got pmi command (from 13): get kvsname=kvs_77005_0 key=sharedFilename[0]
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_IbCwMG
[proxy:0:0@ip-10-252-65-194] got pmi command (from 16): get kvsname=kvs_77005_0 key=sharedFilename[0]
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=get_result rc=0 msg=success
value=/dev/shm/Intel_MPI_IbCwMG
[proxy:0:0@ip-10-252-65-194] got pmi command (from 19): get kvsname=kvs_77005_0 key=sharedFilename[0]
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_IbCwMG
[proxy:0:0@ip-10-252-65-194] got pmi command (from 22): get kvsname=kvs_77005_0 key=sharedFilename[0]
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_IbCwMG
[proxy:0:0@ip-10-252-65-194] got pmi command (from 25): get kvsname=kvs_77005_0 key=sharedFilename[0]
[proxy:0:0@ip-10-252-65-194] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_IbCwMG
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=barrier_out
[proxy:0:1@ip-10-252-67-17] got pmi command (from 7): get kvsname=kvs_77005_0 key=sharedFilename[8]
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_lnni7c
[proxy:0:1@ip-10-252-67-17] got pmi command (from 9): get kvsname=kvs_77005_0 key=sharedFilename[8]
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_lnni7c
[proxy:0:1@ip-10-252-67-17] got pmi command (from 12): get kvsname=kvs_77005_0 key=sharedFilename[8]
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_lnni7c
[proxy:0:1@ip-10-252-67-17] got pmi command (from 18): get kvsname=kvs_77005_0 key=sharedFilename[8]
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_lnni7c
[proxy:0:1@ip-10-252-67-17] got pmi command (from
15): get kvsname=kvs_77005_0 key=sharedFilename[8]
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_lnni7c
[proxy:0:1@ip-10-252-67-17] got pmi command (from 21): get kvsname=kvs_77005_0 key=sharedFilename[8]
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_lnni7c
[proxy:0:1@ip-10-252-67-17] got pmi command (from 24): get kvsname=kvs_77005_0 key=sharedFilename[8]
[proxy:0:1@ip-10-252-67-17] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_lnni7c
[4] MPI startup(): RLIMIT_MEMLOCK too small
[5] MPI startup(): RLIMIT_MEMLOCK too small
[0] MPI startup(): RLIMIT_MEMLOCK too small
[1] MPI startup(): RLIMIT_MEMLOCK too small
[3] MPI startup(): RLIMIT_MEMLOCK too small
[6] MPI startup(): RLIMIT_MEMLOCK too small
[7] MPI startup(): RLIMIT_MEMLOCK too small
[2] MPI startup(): RLIMIT_MEMLOCK too small
[10] MPI startup(): RLIMIT_MEMLOCK too small
[14] MPI startup(): RLIMIT_MEMLOCK too small
[15] MPI startup(): RLIMIT_MEMLOCK too small
[8] MPI startup(): RLIMIT_MEMLOCK too small
[9] MPI startup(): RLIMIT_MEMLOCK too small
[12] MPI startup(): RLIMIT_MEMLOCK too small
[13] MPI startup(): RLIMIT_MEMLOCK too small
[11] MPI startup(): RLIMIT_MEMLOCK too small
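The closing warnings mean every rank found its locked-memory resource limit (RLIMIT_MEMLOCK) too small for registered-memory communication, so Intel MPI will fall back to a slower path. A minimal sketch of how to see the limit the ranks are complaining about, assuming Python is available on the compute nodes (the limit itself is the standard POSIX resource limit, not anything Intel-specific):

```python
# Print the locked-memory limit that triggers
# "MPI startup(): RLIMIT_MEMLOCK too small" above.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_MEMLOCK)

def fmt(v):
    # RLIM_INFINITY corresponds to "ulimit -l unlimited"
    return "unlimited" if v == resource.RLIM_INFINITY else "%d bytes" % v

print("RLIMIT_MEMLOCK soft=%s hard=%s" % (fmt(soft), fmt(hard)))
```

The usual remedy is to raise the limit (e.g. a `memlock unlimited` entry in `/etc/security/limits.conf`, or the equivalent for your distro) on every node; note that since Hydra starts the proxies via `qrsh`, the ranks inherit the limit of the SGE execd environment, so the execd daemons typically need to be restarted after the change for it to take effect.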