Intel® MPI Library
Get help with building, analyzing, optimizing, and scaling high-performance computing (HPC) applications.

Test program goes wrong with mpirun

WangWJ
Novice

While testing whether Intel MPI was installed successfully, I found that the test program compiled from the provided code fails under mpirun.

Here is the code provided by Intel; the file is /intel/oneapi/mpi/latest/opt/mpi/test/test.f90:

        program main
        use mpi
        implicit none

        integer i, size, rank, namelen, ierr
        character (len=MPI_MAX_PROCESSOR_NAME) :: name
        integer stat(MPI_STATUS_SIZE)

        call MPI_INIT (ierr)

        call MPI_COMM_SIZE (MPI_COMM_WORLD, size, ierr)
        call MPI_COMM_RANK (MPI_COMM_WORLD, rank, ierr)
        call MPI_GET_PROCESSOR_NAME (name, namelen, ierr)

        if (rank.eq.0) then

            print *, 'Hello world: rank ', rank, ' of ', size, ' running on ', name

            do i = 1, size - 1
                call MPI_RECV (rank, 1, MPI_INTEGER, i, 1, MPI_COMM_WORLD, stat, ierr)
                call MPI_RECV (size, 1, MPI_INTEGER, i, 1, MPI_COMM_WORLD, stat, ierr)
                call MPI_RECV (namelen, 1, MPI_INTEGER, i, 1, MPI_COMM_WORLD, stat, ierr)
                name = ''
                call MPI_RECV (name, namelen, MPI_CHARACTER, i, 1, MPI_COMM_WORLD, stat, ierr)
                print *, 'Hello world: rank ', rank, ' of ', size, ' running on ', name
            enddo

        else

            call MPI_SEND (rank, 1, MPI_INTEGER, 0, 1, MPI_COMM_WORLD, ierr)
            call MPI_SEND (size, 1, MPI_INTEGER, 0, 1, MPI_COMM_WORLD, ierr)
            call MPI_SEND (namelen, 1, MPI_INTEGER, 0, 1, MPI_COMM_WORLD, ierr)
            call MPI_SEND (name, namelen, MPI_CHARACTER, 0, 1, MPI_COMM_WORLD, ierr)

        endif

        call MPI_FINALIZE (ierr)

        end

I compile it with the following:

source /opt/intel/oneapi/setvars.sh
mpiifx ./test.f90 
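
As a sanity check, one way to confirm that the resulting a.out is linked against the Intel MPI runtime:

ldd ./a.out | grep libmpi

This should list libmpi.so.12 (and libmpifort.so.12) from the oneAPI installation.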

Then I run it with the following:

mpirun -n 4 ./a.out
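
With a healthy installation, this should print one "Hello world" line per rank, along these lines (the hostname will differ):

Hello world: rank 0 of 4 running on <hostname>
Hello world: rank 1 of 4 running on <hostname>
Hello world: rank 2 of 4 running on <hostname>
Hello world: rank 3 of 4 running on <hostname>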

Instead, it fails with the error below:

[Screenshot of the mpirun error output]

Here is more information:

OS: Fedora 41

oneAPI Base Toolkit: 2025.2.0

oneAPI HPC Toolkit: 2025.2.0

Intel MPI: Version 2021.16, Build 20250513
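
(For reference, these version strings can typically be queried with commands such as mpirun -V and mpiifx --version; the exact output format may vary.)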

 

 

TobiasK
Moderator

@WangWJ, can you please include more details, including the hardware (HW) of the system where you see this error?

Can you also please add the output with I_MPI_DEBUG=10 and I_MPI_HYDRA_DEBUG=1 set?
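
For example, one way to set both for a single run:

I_MPI_DEBUG=10 I_MPI_HYDRA_DEBUG=1 mpirun -n 4 ./a.out

(Passing them via mpirun's -genv option, e.g. mpirun -genv I_MPI_DEBUG 10 -genv I_MPI_HYDRA_DEBUG 1 -n 4 ./a.out, should be equivalent.)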

Best

Tobias

WangWJ
Novice

HW:

CPU: Intel(R) Xeon(R) Gold 6248R CPU @ 3.00GHz *32

MEM: 128GB

 

Output with I_MPI_DEBUG=10 and I_MPI_HYDRA_DEBUG=1 set:

[mpiexec@host-**bleep**-**bleep**-**bleep**-xx] Launch arguments: /opt/intel/oneapi/mpi/2021.16/bin//hydra_bstrap_proxy --upstream-host host-**bleep**-**bleep**-**bleep**-xx --upstream-port 34519 --pgid 0 --launcher ssh --launcher-number 0 --base-path /opt/intel/oneapi/mpi/2021.16/bin/ --tree-width 16 --tree-level 1 --time-left -1 --launch-type 2 --debug --proxy-id 0 --node-id 0 --subtree-size 1 --upstream-fd 7 /opt/intel/oneapi/mpi/2021.16/bin//hydra_pmi_proxy --usize -1 --auto-cleanup 1 --abort-signal 9 
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] 
  -----------------------------------------
            Topology parameters
 -----------------------------------------
 Number of packages     : 2
 Number of NUMA nodes   : 2
 Number of L3 caches    : 2
 Number of L2 caches    : 16
 Number of L1 caches    : 16
 Number of cores        : 16
 Number of threads      : 32
 Number of hwloc groups : 0
 Topology restricted    : no
 -----------------------------------------
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 10: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 15: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 9: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 10: cmd=get_maxes
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=4096
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 15: cmd=get_maxes
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=4096
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 9: cmd=get_maxes
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=4096
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 10: cmd=get_appnum
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=appnum appnum=0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 9: cmd=get_appnum
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=appnum appnum=0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 15: cmd=get_appnum
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=appnum appnum=0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 6: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=response_to_init pmi_version=1 pmi_subversion=1 rc=0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 9: cmd=get_my_kvsname
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=my_kvsname kvsname=kvs_13052_0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 10: cmd=get_my_kvsname
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=my_kvsname kvsname=kvs_13052_0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 6: cmd=get_maxes
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=maxes kvsname_max=256 keylen_max=64 vallen_max=4096
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 15: cmd=get_my_kvsname
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=my_kvsname kvsname=kvs_13052_0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 6: cmd=get_appnum
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=appnum appnum=0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 10: cmd=get kvsname=kvs_13052_0 key=PMI_process_mapping
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,4))
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 6: cmd=get_my_kvsname
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=my_kvsname kvsname=kvs_13052_0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 6: cmd=get kvsname=kvs_13052_0 key=PMI_process_mapping
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,4))
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 15: cmd=get kvsname=kvs_13052_0 key=PMI_process_mapping
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,4))
[0] MPI startup(): Intel(R) MPI Library, Version 2021.16  Build 20250513 (id: a7c135c)
[0] MPI startup(): Copyright (C) 2003-2025 Intel Corporation.  All rights reserved.
[0] MPI startup(): library kind: release
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 9: cmd=get kvsname=kvs_13052_0 key=PMI_process_mapping
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=get_result rc=0 msg=success value=(vector,(0,1,4))
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 6: cmd=put kvsname=kvs_13052_0 key=-bcast-1-0 value=2F6465762F73686D2F496E74656C5F4D50495F4D6D6376584E
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=put_result rc=0 msg=success
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 9: cmd=barrier_in
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 6: cmd=barrier_in
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 10: cmd=barrier_in
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 15: cmd=barrier_in
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=barrier_out
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=barrier_out
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=barrier_out
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=barrier_out
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 9: cmd=get kvsname=kvs_13052_0 key=-bcast-1-0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=get_result rc=0 msg=success value=2F6465762F73686D2F496E74656C5F4D50495F4D6D6376584E
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 10: cmd=get kvsname=kvs_13052_0 key=-bcast-1-0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=get_result rc=0 msg=success value=2F6465762F73686D2F496E74656C5F4D50495F4D6D6376584E
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 15: cmd=get kvsname=kvs_13052_0 key=-bcast-1-0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=get_result rc=0 msg=success value=2F6465762F73686D2F496E74656C5F4D50495F4D6D6376584E
[0] MPI startup(): libfabric loaded: libfabric.so.1 
[0] MPI startup(): libfabric version: 2.1.0-impi
[0] MPI startup(): max number of MPI_Request per vci: 67108864 (pools: 1)
[0] MPI startup(): libfabric provider: tcp
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 9: cmd=barrier_in
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 10: cmd=barrier_in
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 15: cmd=barrier_in
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 6: cmd=put kvsname=kvs_13052_0 key=bc-0 value=mpi#020093CFC0A8CD2A0000000000000000$
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=put_result rc=0 msg=success
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 6: cmd=barrier_in
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=barrier_out
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=barrier_out
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=barrier_out
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=barrier_out
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] pmi cmd from fd 15: cmd=get kvsname=kvs_13052_0 key=bc-0
[proxy:0:0@host-**bleep**-**bleep**-**bleep**-xx] PMI response: cmd=get_result rc=0 msg=success value=mpi#020093CFC0A8CD2A0000000000000000$
[0] MPI startup(): shm segment size (801 MB per rank) * (4 local ranks) = 3207 MB total
forrtl: severe (168): Program Exception - illegal instruction
Image              PC                Routine            Line        Source             
libc.so.6          00007F1291426490  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F12924E5BA2  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F12922F444D  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F129235A4E6  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F1291F692B1  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F129225E611  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F129225E47E  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F1291DCAFD0  MPI_Init              Unknown  Unknown
libmpifort.so.12.  00007F129BF2857F  MPI_INIT              Unknown  Unknown
a.out              00000000004022F4  Unknown               Unknown  Unknown
a.out              000000000040229D  Unknown               Unknown  Unknown
libc.so.6          00007F129140F488  Unknown               Unknown  Unknown
libc.so.6          00007F129140F54B  __libc_start_main     Unknown  Unknown
a.out              00000000004021B5  Unknown               Unknown  Unknown
forrtl: severe (168): Program Exception - illegal instruction
Image              PC                Routine            Line        Source             
libc.so.6          00007F5047626490  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F50486E5BA2  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F50484F444D  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F504855A4E6  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F50481692B1  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F504845E611  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F504845E47E  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F5047FCAFD0  MPI_Init              Unknown  Unknown
libmpifort.so.12.  00007F505212857F  MPI_INIT              Unknown  Unknown
a.out              00000000004022F4  Unknown               Unknown  Unknown
a.out              000000000040229D  Unknown               Unknown  Unknown
libc.so.6          00007F504760F488  Unknown               Unknown  Unknown
libc.so.6          00007F504760F54B  __libc_start_main     Unknown  Unknown
a.out              00000000004021B5  Unknown               Unknown  Unknown
forrtl: severe (168): Program Exception - illegal instruction
Image              PC                Routine            Line        Source             
libc.so.6          00007F7004826490  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F70058E5BA2  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F70056F444D  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F700575A4E6  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F70053692B1  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F700565E611  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F700565E47E  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F70051CAFD0  MPI_Init              Unknown  Unknown
libmpifort.so.12.  00007F700F32857F  MPI_INIT              Unknown  Unknown
a.out              00000000004022F4  Unknown               Unknown  Unknown
a.out              000000000040229D  Unknown               Unknown  Unknown
libc.so.6          00007F700480F488  Unknown               Unknown  Unknown
libc.so.6          00007F700480F54B  __libc_start_main     Unknown  Unknown
a.out              00000000004021B5  Unknown               Unknown  Unknown
forrtl: severe (168): Program Exception - illegal instruction
Image              PC                Routine            Line        Source             
libc.so.6          00007F3BCD226490  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F3BCE2E5BA2  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F3BCE0F444D  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F3BCE15A4E6  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F3BCDD692B1  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F3BCE05E611  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F3BCE05E47E  Unknown               Unknown  Unknown
libmpi.so.12.0.0   00007F3BCDBCAFD0  MPI_Init              Unknown  Unknown
libmpifort.so.12.  00007F3BD7D2857F  MPI_INIT              Unknown  Unknown
a.out              00000000004022F4  Unknown               Unknown  Unknown
a.out              000000000040229D  Unknown               Unknown  Unknown
libc.so.6          00007F3BCD20F488  Unknown               Unknown  Unknown
libc.so.6          00007F3BCD20F54B  __libc_start_main     Unknown  Unknown
a.out              00000000004021B5  Unknown               Unknown  Unknown

 
