Intel® Fortran Compiler

fpe0 and mpi_init fails

dr_jfloyd
Beginner

We have a large MPI application (a CFD model) where we use -fpe0 when running our verification suite in debug. After updating to the current oneAPI compilers, the following program fails at MPI_INIT with either ifort or ifx if -fpe0 is used as a compiler option. This means no verification cases can be run in debug, as they all require initializing MPI (the actual application uses MPI_INIT_THREAD, but that also fails). The error message follows the source code.

program test_mpi
use mpi_f08
implicit none

integer i, size, rank, namelen, ierr
character (len=MPI_MAX_PROCESSOR_NAME) :: name
type(mpi_status) :: stat

call MPI_INIT (ierr)

call MPI_COMM_SIZE (MPI_COMM_WORLD, size, ierr)
call MPI_COMM_RANK (MPI_COMM_WORLD, rank, ierr)
call MPI_GET_PROCESSOR_NAME (name, namelen, ierr)

if (rank.eq.0) then

   print *, 'Hello world: rank ', rank, ' of ', size, ' running on ', name

   do i = 1, size - 1
      call MPI_RECV (rank, 1, MPI_INTEGER, i, 1, MPI_COMM_WORLD, stat, ierr)
      call MPI_RECV (size, 1, MPI_INTEGER, i, 1, MPI_COMM_WORLD, stat, ierr)
      call MPI_RECV (namelen, 1, MPI_INTEGER, i, 1, MPI_COMM_WORLD, stat, ierr)
      name = ''
      call MPI_RECV (name, namelen, MPI_CHARACTER, i, 1, MPI_COMM_WORLD, stat, ierr)
      print *, 'Hello world: rank ', rank, ' of ', size, ' running on ', name
   enddo

else

   call MPI_SEND (rank, 1, MPI_INTEGER, 0, 1, MPI_COMM_WORLD, ierr)
   call MPI_SEND (size, 1, MPI_INTEGER, 0, 1, MPI_COMM_WORLD, ierr)
   call MPI_SEND (namelen, 1, MPI_INTEGER, 0, 1, MPI_COMM_WORLD, ierr)
   call MPI_SEND (name, namelen, MPI_CHARACTER, 0, 1, MPI_COMM_WORLD, ierr)

endif

call MPI_FINALIZE (ierr)

end


Image PC Routine Line Source
libc.so.6 000014D70BA54DB0 Unknown Unknown Unknown
libucp.so.0.0.0 000014D70C2E717E ucp_proto_perf_en Unknown Unknown
libucp.so.0.0.0 000014D70C2E881D ucp_proto_init_pa Unknown Unknown
libucp.so.0.0.0 000014D70C2EFDF7 ucp_proto_common_ Unknown Unknown
libucp.so.0.0.0 000014D70C2F15F5 ucp_proto_multi_i Unknown Unknown
libucp.so.0.0.0 000014D70C316404 Unknown Unknown Unknown
libucp.so.0.0.0 000014D70C2F2D42 Unknown Unknown Unknown
libucp.so.0.0.0 000014D70C2F4561 ucp_proto_select_ Unknown Unknown
libucp.so.0.0.0 000014D70C2F4A25 ucp_proto_select_ Unknown Unknown
libucp.so.0.0.0 000014D70C2E9A89 ucp_worker_get_ep Unknown Unknown
libucp.so.0.0.0 000014D70C33F39C ucp_wireup_init_l Unknown Unknown
libucp.so.0.0.0 000014D70C2D19CE ucp_ep_create_to_ Unknown Unknown
libucp.so.0.0.0 000014D70C2D2B33 ucp_ep_create Unknown Unknown
libmlx-fi.so 000014D709A08460 Unknown Unknown Unknown
libmpi.so.12.0.0 000014D70CB7295E Unknown Unknown Unknown
libmpi.so.12.0.0 000014D70C71C60A Unknown Unknown Unknown
libmpi.so.12.0.0 000014D70C9E414E Unknown Unknown Unknown
libmpi.so.12.0.0 000014D70C9E396B MPI_Init Unknown Unknown
libmpifort.so.12. 000014D70E0B90A6 mpi_init_f08_ Unknown Unknown
a.out 00000000004052BF Unknown Unknown Unknown
a.out 000000000040521D Unknown Unknown Unknown
libc.so.6 000014D70BA3FEB0 Unknown Unknown Unknown
libc.so.6 000014D70BA3FF60 __libc_start_main Unknown Unknown
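In the meantime, the only interim workaround we can think of is something along the lines of the sketch below. It is untested against this failure and rests on two assumptions: that the floating invalid is raised inside the MPI/UCX initialization rather than in our own code, and that it is acceptable to suspend the -fpe0 traps just for the duration of MPI_INIT. It uses the standard ieee_exceptions module to save, disable, and restore the halting modes:

program init_no_trap
use mpi_f08
use, intrinsic :: ieee_exceptions
implicit none

integer :: ierr
logical :: halting(size(ieee_usual))

! -fpe0 enables halting on invalid, divide-by-zero, and overflow.
! Save those halting modes, then disable them so an exception raised
! inside the MPI library initialization does not abort the run.
call ieee_get_halting_mode (ieee_usual, halting)
call ieee_set_halting_mode (ieee_usual, .false.)

call MPI_INIT (ierr)

! Discard any flags raised during initialization and restore the
! original trapping behavior for the rest of the application.
call ieee_set_flag (ieee_usual, .false.)
call ieee_set_halting_mode (ieee_usual, halting)

! ... application code ...

call MPI_FINALIZE (ierr)

end

The real application would need the same wrapping around MPI_INIT_THREAD, and of course this only hides the trap during initialization rather than explaining why it occurs.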

Steve_Lionel
Honored Contributor III

I suggest that you ask this in the Intel® HPC Toolkit forum on the Intel Community, as that's where the Intel MPI experts live. When you do, please also show the error message; all you showed here was a stack trace.

dr_jfloyd
Beginner

Thanks, I'll try there. Also, thanks for pointing out that I missed copying the floating invalid error message.
