Intel® MPI Library

Intel MPI 2021.14 works, 2021.15 doesn't

Matt_Thompson

All,

 

I recently downloaded oneAPI HPC 2025.1 and set about trying to build my codes. Eventually I hit an internal compiler error (ICE) building FMS, but I had other things I could work on that didn't need it. However, I found that when I tried other tests, they *all* failed. Eventually I worked my way down to hello world and found that even it fails.

 

The code is boring:

program hello_world

   use iso_fortran_env
   use mpi

   implicit none

   integer :: comm
   integer :: myid, npes, ierror
   integer :: name_length

   character(len=MPI_MAX_PROCESSOR_NAME) :: processor_name

   call mpi_init(ierror)
   comm = MPI_COMM_WORLD
   call MPI_Comm_Rank(comm, myid, ierror)
   call MPI_Comm_Size(comm, npes, ierror)
   call MPI_Get_Processor_Name(processor_name, name_length, ierror)
   write (output_unit,'(A,1X,I4,1X,A,1X,I4,1X,A,1X,A)') "Process", myid, "of", npes, "is on", trim(processor_name)
   call MPI_Finalize(ierror)

end program hello_world

 

When I build and run with Intel MPI 2021.14, all is well:

> mpiifx --version && mpirun --version
ifx (IFX) 2025.1.0 20250317
Copyright (C) 1985-2025 Intel Corporation. All rights reserved.

Intel(R) MPI Library for Linux* OS, Version 2021.14 Build 20241121 (id: e7829d6)
Copyright 2003-2024, Intel Corporation.
> mpiifx -o helloWorld.simple.exe helloWorld.simple.F90 && mpirun -np 4 ./helloWorld.simple.exe
Process    2 of    4 is on gs6101-bucy.gsfc.nasa.gov
Process    3 of    4 is on gs6101-bucy.gsfc.nasa.gov
Process    1 of    4 is on gs6101-bucy.gsfc.nasa.gov
Process    0 of    4 is on gs6101-bucy.gsfc.nasa.gov


But when I use Intel MPI 2021.15:

> mpiifx --version && mpirun --version
ifx (IFX) 2025.1.0 20250317
Copyright (C) 1985-2025 Intel Corporation. All rights reserved.

Intel(R) MPI Library for Linux* OS, Version 2021.15 Build 20250213 (id: d233448)
Copyright 2003-2025, Intel Corporation.
> mpiifx -o helloWorld.simple.exe helloWorld.simple.F90 && mpirun -np 4 ./helloWorld.simple.exe
[mpiexec@gs6101-bucy.gsfc.nasa.gov] Error: Downstream from host gs6101-bucy.gsfc.nasa.gov exited abnormally
[mpiexec@gs6101-bucy.gsfc.nasa.gov] Trying to close other downstreams
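
In case it helps with diagnosis, I believe the standard knobs for getting a more verbose startup trace out of the launcher and the library are the Intel MPI and libfabric debug variables below. I haven't captured that output here; this is just the kind of rerun I'd do next, against the same binary as above:

> I_MPI_HYDRA_DEBUG=1 I_MPI_DEBUG=10 mpirun -np 4 ./helloWorld.simple.exe   # verbose Hydra launcher + MPI startup trace
> FI_LOG_LEVEL=debug mpirun -np 4 ./helloWorld.simple.exe                   # libfabric provider selection details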

 

Does anyone have any ideas about this? I control my MPI via a Lua modulefile, and the *only* difference between the 2021.14 and 2021.15 modulefiles is the version number:

> diff /ford1/share/gmao_SIteam/lmodulefiles/Compiler/ifx-2025.1/intelmpi/2021.14.lua /ford1/share/gmao_SIteam/lmodulefiles/Compiler/ifx-2025.1/intelmpi/2021.15.lua
1c1
< --[[ Lua modulefile for Intel MPI 2021.14 using Intel oneAPI 2025.1
---
> --[[ Lua modulefile for Intel MPI 2021.15 using Intel oneAPI 2025.1
9c9
< local version = "2021.14"
---
> local version = "2021.15"
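
For completeness, these are the sanity checks I'd run to confirm the 2021.15 module really switches everything over (standard tools only; nothing here is specific to our modulefiles):

> which mpirun mpiexec.hydra                      # both should resolve under the 2021.15 install
> echo $I_MPI_ROOT                                # should point at the 2021.15 directory
> ldd ./helloWorld.simple.exe | grep -i libmpi    # which libmpi.so the binary resolves at run time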

 

I felt fine doing that since the official Intel modulefiles didn't seem to change:

> diff -s /ford1/share/gmao_SIteam/intel/oneapi/2025/mpi/2021.14/etc/modulefiles/mpi/2021.14 /ford1/share/gmao_SIteam/intel/oneapi/2025/mpi/2021.15/etc/modulefiles/mpi/2021.15
Files /ford1/share/gmao_SIteam/intel/oneapi/2025/mpi/2021.14/etc/modulefiles/mpi/2021.14 and /ford1/share/gmao_SIteam/intel/oneapi/2025/mpi/2021.15/etc/modulefiles/mpi/2021.15 are identical
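
Since this is all on a single node, the other experiments I have in mind (again, just documented Intel MPI controls, not yet run) are launching a non-MPI program to separate the Hydra launcher from the library itself, and forcing the shared-memory fabric:

> mpirun -np 4 hostname                                    # exercises the launcher without touching libmpi
> I_MPI_FABRICS=shm mpirun -np 4 ./helloWorld.simple.exe   # force the shared-memory fabric, bypassing OFI provider selection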