Software Archive
Read-only legacy content

hybrid application on the Xeon Phi

Miah__Wadud
Beginner

I would like to run a hybrid application (CP2K) on the Xeon Phi. The application is MPI + OpenMP, and I set up the environment in the following manner:

$ export OMP_NUM_THREADS=15
$ export I_MPI_PIN_PROCESSOR_LIST=$(seq -s "," 1 $OMP_NUM_THREADS 240)
$ echo $I_MPI_PIN_PROCESSOR_LIST
1,16,31,46,61,76,91,106,121,136,151,166,181,196,211,226
$ mpirun -n $(expr 240 / $OMP_NUM_THREADS) cp2k.psmp.epcc H2O-64.inp

However, the application runs awfully slowly. Running "top" shows only the 16 MPI processes, none of the threads, and reports the Phi as just 6.2% user busy (16 / 240 * 100). It seems the threads are not running. Any help will be greatly appreciated. Thanks in advance.
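A possible factor worth checking (an assumption, not confirmed here): with an explicit I_MPI_PIN_PROCESSOR_LIST, each rank is pinned to the single logical CPU listed for it, and the OpenMP threads it spawns inherit that one-CPU affinity mask, which would match "top" showing only 16 busy hardware threads. A minimal alternative sketch, assuming an Intel MPI version that supports I_MPI_PIN_DOMAIN:

$ export OMP_NUM_THREADS=15
# the domain size is taken from OMP_NUM_THREADS, giving 16 domains of
# 15 logical CPUs; each rank is confined to one domain and its OpenMP
# threads can spread out inside it
$ export I_MPI_PIN_DOMAIN=omp
# optional: have the OpenMP runtime place threads within the domain
$ export KMP_AFFINITY=compact
$ mpirun -n 16 cp2k.psmp.epcc H2O-64.inp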

6 Replies
James_T_Intel
Moderator

That does seem odd.  Try running with -verbose and attaching the output.  It is possible something in the environment isn't getting passed correctly, and -verbose will show the environment mpirun sees.
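For example, something along these lines will capture the whole run for attaching (assuming tee is available on the card):

$ mpirun -verbose -n 16 cp2k.psmp.epcc H2O-64.inp 2>&1 | tee mpirun-verbose.log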

Miah__Wadud
Beginner
Hi James, it is producing a lot of output:

[eew918@dn150-mic0 benchmark]$ mpirun -verbose -n $(expr 240 / $OMP_NUM_THREADS) cp2k.psmp.epcc H2O-64.inp
host: dn150-mic0

==================================================================================================
mpiexec options:
----------------
  Base path: /home/eew918/impi/bin/
  Launcher: ssh
  Debug level: 1
  Enable X: -1

  Global environment:
  -------------------
    USER=eew918
    MAIL=/var/mail/eew918
    SSH_CLIENT=172.31.1.254 43738 22
    LD_LIBRARY_PATH=/home/eew918/impi/lib
    HOME=/home/eew918
    OLDPWD=/home/eew918/cp2k
    SSH_TTY=/dev/pts/0
    LOGNAME=eew918
    TERM=xterm
    I_MPI_PIN_PROCESSOR_LIST=1,16,31,46,61,76,91,106,121,136,151,166,181,196,211,226
    PATH=/usr/bin:/usr/sbin:/bin:/sbin:/home/eew918/impi/bin:/home/eew918/cp2k/exe
    SHELL=/bin/sh
    I_MPI_DEBUG=4
    PWD=/home/eew918/cp2k/tests/QS/benchmark
    SSH_CONNECTION=172.31.1.254 43738 172.31.1.1 22
    OMP_NUM_THREADS=15

  Hydra internal environment:
  ---------------------------
    MPICH_ENABLE_CKPOINT=1
    GFORTRAN_UNBUFFERED_PRECONNECTED=y

  Proxy information:
  *********************
    [1] proxy: dn150-mic0 (60 cores)
    Exec list: cp2k.psmp.epcc (16 processes);

==================================================================================================

[mpiexec@dn150-mic0] Timeout set to -1 (-1 means infinite)
[mpiexec@dn150-mic0] Got a control port string of dn150-mic0:47637

Proxy launch args: /home/eew918/impi/bin/pmi_proxy --control-port dn150-mic0:47637 --debug --pmi-connect lazy-cache --pmi-aggregate -s 0 --rmk slurm --launcher ssh --demux poll --pgid 0 --enable-stdin 1 --retries 10 --control-code 1616450218 --proxy-id

[mpiexec@dn150-mic0] PMI FD: (null); PMI PORT: (null); PMI ID/RANK: -1

Arguments being passed to proxy 0:
--version 1.4.1p1 --iface-ip-env-name MPICH_INTERFACE_HOSTNAME --hostname dn150-mic0 --global-core-map 0,60,0 --filler-process-map 0,60,0 --global-process-count 16 --auto-cleanup 1 --pmi-rank -1 --pmi-kvsname kvs_9989_0 --pmi-process-mapping (vector,(0,1,60)) --binding omp=15;map=1,16,31,46,61,76,91,106,121,136,151,166,181,196,211,226 --topolib ipl --ckpointlib blcr --ckpoint-prefix /tmp --ckpoint-preserve 1 --ckpoint off --ckpoint-num -1 --global-inherited-env 16 'USER=eew918' 'MAIL=/var/mail/eew918' 'SSH_CLIENT=172.31.1.254 43738 22' 'LD_LIBRARY_PATH=/home/eew918/impi/lib' 'HOME=/home/eew918' 'OLDPWD=/home/eew918/cp2k' 'SSH_TTY=/dev/pts/0' 'LOGNAME=eew918' 'TERM=xterm' 'I_MPI_PIN_PROCESSOR_LIST=1,16,31,46,61,76,91,106,121,136,151,166,181,196,211,226' 'PATH=/usr/bin:/usr/sbin:/bin:/sbin:/home/eew918/impi/bin:/home/eew918/cp2k/exe' 'SHELL=/bin/sh' 'I_MPI_DEBUG=4' 'PWD=/home/eew918/cp2k/tests/QS/benchmark' 'SSH_CONNECTION=172.31.1.254 43738 172.31.1.1 22' 'OMP_NUM_THREADS=15' --global-user-env 0 --global-system-env 2 'MPICH_ENABLE_CKPOINT=1' 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 60 --exec --exec-appnum 0 --exec-proc-count 16 --exec-local-env 0 --exec-wdir /home/eew918/cp2k/tests/QS/benchmark --exec-args 2 cp2k.psmp.epcc H2O-64.inp

[mpiexec@dn150-mic0] Launch arguments: /home/eew918/impi/bin/pmi_proxy --control-port dn150-mic0:47637 --debug --pmi-connect lazy-cache --pmi-aggregate -s 0 --rmk slurm --launcher ssh --demux poll --pgid 0 --enable-stdin 1 --retries 10 --control-code 1616450218 --proxy-id 0
[mpiexec@dn150-mic0] STDIN will be redirected to 1 fd(s): 9
[proxy:0:0@dn150-mic0] Start PMI_proxy 0
[proxy:0:0@dn150-mic0] STDIN will be redirected to 1 fd(s): 15
[proxy:0:0@dn150-mic0] got pmi command (from 10): init pmi_version=1 pmi_subversion=1
[proxy:0:0@dn150-mic0] PMI response:
cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 12): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 14): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 10): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 12): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 19): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 10): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 14): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 12): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 19): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 22): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 14): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 19): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 22): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 22): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 25): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 25): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 25): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 31): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 31): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 31): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 34): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 28): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 34): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 28): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 34): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 28): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 37): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: 
cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 40): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 37): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 40): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 43): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 37): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 40): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 43): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 46): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 49): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 43): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 46): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 49): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 52): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 46): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 55): init pmi_version=1 pmi_subversion=1 [proxy:0:0@dn150-mic0] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0 [proxy:0:0@dn150-mic0] got pmi command (from 49): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 52): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 52): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 55): get_maxes [proxy:0:0@dn150-mic0] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0:0@dn150-mic0] got pmi command (from 55): barrier_in [proxy:0:0@dn150-mic0] forwarding command (cmd=barrier_in) upstream [mpiexec@dn150-mic0] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@dn150-mic0] PMI response to fd 6 pid 55: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI 
response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] got pmi command (from 10): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 12): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 14): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 19): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 22): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 25): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 28): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 31): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 34): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 37): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 40): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 43): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 46): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 49): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 52): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 10): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 12): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 14): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 19): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 22): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 25): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 28): 
get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 31): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 34): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 37): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 40): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 43): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 55): get_ranks2hosts [proxy:0:0@dn150-mic0] PMI response: put_ranks2hosts 55 1 10 dn150-mic0 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15, [proxy:0:0@dn150-mic0] got pmi command (from 10): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 12): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 14): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 19): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 22): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 25): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 28): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 31): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 34): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 37): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 40): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 46): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 49): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 52): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 10): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 12): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 14): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 19): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 22): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 25): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 28): 
get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 31): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 34): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 37): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 40): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 43): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 46): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 55): get_appnum [proxy:0:0@dn150-mic0] PMI response: cmd=appnum appnum=0 [proxy:0:0@dn150-mic0] got pmi command (from 10): put kvsname=kvs_9989_0 key=sharedFilename[0] value=/dev/shm/Intel_MPI_QmjGme [mpiexec@dn150-mic0] [pgid: 0] got PMI command: cmd=put kvsname=kvs_9989_0 key=sharedFilename[0] value=/dev/shm/Intel_MPI_QmjGme [mpiexec@dn150-mic0] PMI response to fd 6 pid 10: cmd=put_result rc=0 msg=success [proxy:0:0@dn150-mic0] forwarding command (cmd=put kvsname=kvs_9989_0 key=sharedFilename[0] value=/dev/shm/Intel_MPI_QmjGme) upstream [proxy:0:0@dn150-mic0] got pmi command (from 12): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 14): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 19): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 22): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 25): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 28): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 31): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 34): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 49): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 52): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] we don't understand the response put_result; forwarding downstream [proxy:0:0@dn150-mic0] got pmi command (from 37): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 40): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 43): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 46): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 55): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 10): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 49): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 52): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 55): get_my_kvsname [proxy:0:0@dn150-mic0] PMI response: cmd=my_kvsname kvsname=kvs_9989_0 [proxy:0:0@dn150-mic0] got pmi command (from 43): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 46): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 49): barrier_in [proxy:0:0@dn150-mic0] got pmi 
command (from 52): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 55): barrier_in [proxy:0:0@dn150-mic0] forwarding command (cmd=barrier_in) upstream [mpiexec@dn150-mic0] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@dn150-mic0] PMI response to fd 6 pid 55: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out [proxy:0:0@dn150-mic0] got pmi command (from 12): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 14): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 19): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 22): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 25): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 28): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 31): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 34): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 37): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 40): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 43): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 46): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 49): get 
kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 52): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [proxy:0:0@dn150-mic0] got pmi command (from 55): get kvsname=kvs_9989_0 key=sharedFilename[0] [proxy:0:0@dn150-mic0] PMI response: cmd=get_result rc=0 msg=success value=/dev/shm/Intel_MPI_QmjGme [0] MPI startup(): shm data transfer mode [7] MPI startup(): shm data transfer mode [11] MPI startup(): shm data transfer mode [3] MPI startup(): shm data transfer mode [2] MPI startup(): shm data transfer mode [12] MPI startup(): shm data transfer mode [1] MPI startup(): shm data transfer mode [4] MPI startup(): shm data transfer mode [5] MPI startup(): shm data transfer mode [6] MPI startup(): shm data transfer mode [8] MPI startup(): shm data transfer mode [9] MPI startup(): shm data transfer mode [10] MPI startup(): shm data transfer mode [13] MPI startup(): shm data transfer mode [14] MPI startup(): shm data transfer mode [15] MPI startup(): shm data transfer mode [proxy:0:0@dn150-mic0] got pmi command (from 10): put kvsname=kvs_9989_0 key=P0-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 12): put kvsname=kvs_9989_0 key=P1-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 14): put kvsname=kvs_9989_0 key=P2-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 19): put kvsname=kvs_9989_0 key=P3-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 22): put kvsname=kvs_9989_0 key=P4-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 25): put kvsname=kvs_9989_0 key=P5-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 28): put kvsname=kvs_9989_0 key=P6-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 31): put kvsname=kvs_9989_0 key=P7-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 34): put kvsname=kvs_9989_0 key=P8-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 37): put kvsname=kvs_9989_0 key=P9-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 40): put kvsname=kvs_9989_0 key=P10-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 43): put kvsname=kvs_9989_0 key=P11-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 46): put kvsname=kvs_9989_0 key=P12-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 49): put kvsname=kvs_9989_0 key=P13-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 55): put kvsname=kvs_9989_0 key=P15-businesscard-0 value=fabrics_list#shm$ [proxy:0:0@dn150-mic0] got pmi command (from 52): put kvsname=kvs_9989_0 key=P14-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P0-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P1-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] 
[pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P2-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P3-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P4-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P5-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P6-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P7-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P8-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P9-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P10-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P11-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P12-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P13-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P15-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [mpiexec@dn150-mic0] [pgid: 0] got aggregated PMI command (part of it): cmd=put kvsname=kvs_9989_0 key=P14-businesscard-0 value=fabrics_list#shm$ [mpiexec@dn150-mic0] reply: cmd=put_result rc=0 msg=success [proxy:0:0@dn150-mic0] got pmi command (from 10): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 12): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 14): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 19): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 22): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 25): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 28): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 31): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 34): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 37): barrier_in [proxy:0:0@dn150-mic0] got pmi command (from 40): barrier_in [proxy:0:0@dn150-mic0] got 
pmi command (from 43): barrier_in
[proxy:0:0@dn150-mic0] got pmi command (from 46): barrier_in
[proxy:0:0@dn150-mic0] got pmi command (from 49): barrier_in
[proxy:0:0@dn150-mic0] got pmi command (from 52): barrier_in
[proxy:0:0@dn150-mic0] got pmi command (from 55): barrier_in
[mpiexec@dn150-mic0] [pgid: 0] got PMI command: cmd=barrier_in
[mpiexec@dn150-mic0] PMI response to fd 6 pid 55: cmd=barrier_out
[proxy:0:0@dn150-mic0] forwarding command (cmd=barrier_in) upstream
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[proxy:0:0@dn150-mic0] PMI response: cmd=barrier_out
[0] MPI startup(): Rank Pid Node name Pin cpu
[0] MPI startup(): 0 9991 dn150-mic0 1
[0] MPI startup(): 1 9992 dn150-mic0 16
[0] MPI startup(): 2 9993 dn150-mic0 31
[0] MPI startup(): 3 9994 dn150-mic0 46
[0] MPI startup(): 4 9995 dn150-mic0 61
[0] MPI startup(): 5 9996 dn150-mic0 76
[0] MPI startup(): 6 9997 dn150-mic0 91
[0] MPI startup(): 7 9998 dn150-mic0 106
[0] MPI startup(): 8 9999 dn150-mic0 121
[0] MPI startup(): 9 10000 dn150-mic0 136
[0] MPI startup(): 10 10001 dn150-mic0 151
[0] MPI startup(): 11 10002 dn150-mic0 166
[0] MPI startup(): 12 10003 dn150-mic0 181
[0] MPI startup(): 13 10004 dn150-mic0 196
[0] MPI startup(): 14 10005 dn150-mic0 211
[0] MPI startup(): 15 10006 dn150-mic0 226
James_T_Intel
Moderator

I know it gives a lot of output (you actually have very little output here for what it typically gives).  That's why I asked you to attach it.

From what I see here, it looks like everything is being passed as expected.  Can you try running a single rank outside of MPI (just run "cp2k.psmp.epcc H2O-64.inp" directly) with OMP_NUM_THREADS set?
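For example (a sketch; it assumes pidof and /proc behave as on a stock Linux system):

$ export OMP_NUM_THREADS=15
$ cp2k.psmp.epcc H2O-64.inp &
# once an OpenMP region is entered, the Threads: count should be close
# to OMP_NUM_THREADS (the Intel runtime may add a monitor thread)
$ grep Threads /proc/$(pidof cp2k.psmp.epcc)/status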

Miah__Wadud
Beginner
Hi James, unfortunately the application segfaulted and produced the following output:

------------------------------
[eew918@dn150-mic0 benchmark]$ echo $OMP_NUM_THREADS
15
[eew918@dn150-mic0 benchmark]$ cp2k.psmp.epcc H2O-64.inp
[-1] MPI startup(): Imported environment partly inaccesible. Map=0 Info=89fd800
[0] MPI startup(): shm data transfer mode
[0] MPI startup(): Rank Pid Node name Pin cpu
[0] MPI startup(): 0 10256 dn150-mic0 +1
[ application output ]
Segmentation fault
------------------------------

Does the Xeon Phi need to be configured in a way that allows multi-threaded applications to run? This exact binary ran much quicker on another Xeon Phi.
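One thing that may be worth ruling out (an assumption, not something the output above proves): threaded Fortran codes such as CP2K often need larger per-thread stacks, and an undersized stack is a common cause of segfaults. A quick sketch:

$ ulimit -s unlimited        # raise the main stack limit
$ export OMP_STACKSIZE=64M   # per-thread stack for the OpenMP runtime
$ cp2k.psmp.epcc H2O-64.inp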
James_T_Intel
Moderator

The segfault could be due to the application expecting to be run with more than 1 rank.

I don't think there is any special configuration needed to enable threading.  There could be other issues at play, such as different compiler versions on the two systems (resulting in different runtime libraries) or different MPSS versions.  I'm going to move this thread to the Intel® Many Integrated Core Architecture (Intel MIC Architecture) forum; you should be able to get better answers there.
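Two quick checks along those lines (sketches; micinfo assumes the MPSS tools are installed on the host):

# does a small multi-rank run still segfault?
$ mpirun -n 2 cp2k.psmp.epcc H2O-64.inp
# compare the MPSS versions of the two systems (run on each host)
$ micinfo | grep -i version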

Miah__Wadud
Beginner
Hi James, the application was built with -static-intel and only dynamically links with the Intel MPI libraries. Thanks for transferring it to the MIC forum. Hopefully someone can figure it out there.
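For what it's worth, the linkage can be double-checked on the card (assuming ldd is available in the coprocessor environment); with -static-intel, only the MPI libraries and the base system libraries should show up as dynamic:

$ ldd cp2k.psmp.epcc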