I'm hitting a segfault with some coarray code when running with 1 image. I've reduced it to this test case:
  integer :: x(10), y(10)
  call foo(x, y)
contains
  subroutine foo(src, dest)
    integer, intent(in), target :: src(:)
    integer, intent(inout) :: dest(:)
    type box
      integer, pointer :: data(:)
    end type
    type(box), allocatable :: buffer[:]
    allocate(buffer[*])
    ! image 1 points its coarray component at the (target) dummy argument
    if (this_image() == 1) buffer%data => src
    sync all
    ! every image reads the data through image 1's pointer component
    dest = buffer[1]%data
  end subroutine
end
I'm compiling with ifort 2021.5.0 and these options: -coarray -coarray-num-images=1
I'm also setting
export I_MPI_DEVICE=shm
export I_MPI_FABRICS=shm
per this Intel page, to keep coarray executables from hammering the loopback network device and giving very poor performance (I'm running on a single shared-memory node).
With these settings the executable segfaults; if I leave the I_MPI variables unset, the test case runs without error.
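For completeness, this is roughly my build-and-run sequence; the source file name coarray_repro.f90 is just a placeholder:
export I_MPI_DEVICE=shm
export I_MPI_FABRICS=shm
ifort -coarray -coarray-num-images=1 coarray_repro.f90 -o repro
./repro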
I duplicated your issue. I'll keep looking into it.
There are two workarounds that I see:
- Compile with -coarray=shared.
- Create a config file (sketched below). The article has some details. It's written for distributed environments, but the concepts are the same for shared.
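To make those concrete, here is a rough sketch. The file names are placeholders, and the exact contents of the config file are my assumption; the article has the authoritative format:
# Workaround 1: build for shared memory instead of relying on the MPI fabric settings
ifort -coarray=shared repro.f90 -o repro
# Workaround 2: point the build at a coarray config file
ifort -coarray -coarray-config-file=cafconfig.txt repro.f90 -o repro
# cafconfig.txt holds mpiexec-style launch options, for example:
#   -genvall -genv I_MPI_FABRICS=shm:ofi -n 1 ./repro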
I learned that there is a known issue in MPI that applies to your case, along with a workaround:
- Single-process run with I_MPI_FABRICS=shm may lead to crash. Please use out-of-box settings or I_MPI_FABRICS=shm:ofi instead.
It worked for me with your reproducer.
This will be documented in the MPI Library Release Notes in the next release.
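Concretely, for the shell settings from the original post, that means either leaving the fabric variables unset or switching to the layered fabric:
unset I_MPI_DEVICE              # fall back to the out-of-box settings
unset I_MPI_FABRICS
# or
export I_MPI_FABRICS=shm:ofi    # shm within the node, OFI between nodes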