Intel® MPI Library

IRECV/SSEND crashes for Intel MPI Library 2019

John_Young
New Contributor I

Hi,

I noticed that one of our MPI codes began crashing after installing Intel Parallel Studio XE 2019 (Intel MPI Library 2019 Update 1) on Windows.  I tracked the issue down to a combination of SSEND/IRECV when the transferred data reaches a certain size.  Test code exhibiting the crash is attached.  The code does not crash when using Intel Parallel Studio XE 2018 (Intel MPI Library 2018 Update 3).
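The attached file isn't reproduced in this post, but the failing pattern boils down to something like the minimal sketch below (the dimension n = 365 is picked from the failing range described next, and the attachment's MPI_MPI_INTEGER_TYPE/MPI_SYS_INTEGER_TYPE preprocessor switches are omitted for brevity):

program test
  use mpi
  implicit none
  integer, parameter :: n = 365          ! dimension in the failing range
  double precision   :: a(n, n)          ! n*n = 133225 elements
  integer            :: rank, ierr, req
  integer            :: stat(MPI_STATUS_SIZE)

  call MPI_INIT(ierr)
  call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)

  if (rank == 0) then
     a = 0.0d0
     ! post the non-blocking receive, then wait for completion
     call MPI_IRECV(a, n*n, MPI_DOUBLE_PRECISION, 1, 0, &
                    MPI_COMM_WORLD, req, ierr)
     call MPI_WAIT(req, stat, ierr)
  else if (rank == 1) then
     a = 1.0d0
     ! synchronous send: does not complete until the receive is matched
     call MPI_SSEND(a, n*n, MPI_DOUBLE_PRECISION, 0, 0, &
                    MPI_COMM_WORLD, ierr)
  end if

  call MPI_FINALIZE(ierr)
end program test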

In particular, the 2019 library crashes when the double precision (square) matrix being transferred has a dimension of around 360-365, in the vicinity of 135K total elements.  The crash occurs for both the 4-byte and 8-byte MPI integer interfaces.  My compile and launch commands are

mpiifort -fpp -DMPI_MPI_INTEGER_TYPE=4 -DMPI_SYS_INTEGER_TYPE=4 test.F90
mpiexec -n 2 ./test.exe

for the 4-byte interface and

mpiifort -ilp64 -i8 -fpp -DMPI_MPI_INTEGER_TYPE=8 -DMPI_SYS_INTEGER_TYPE=8 test.F90
mpiexec -n 2 ./test.exe

for the 8-byte interface.

Any help or suggested workaround is much appreciated.

Thanks,

John

James_T_Intel
Moderator

A fix for this has been implemented in Intel® MPI Library 2019 Update 2.  Until that update is released, you can run with I_MPI_FABRICS=ofi to work around the issue.
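For example, set the variable before launching (shown here with the same two-rank launch as in the original post):

set I_MPI_FABRICS=ofi
mpiexec -n 2 ./test.exe

or, equivalently, pass it through the launcher with -genv:

mpiexec -genv I_MPI_FABRICS ofi -n 2 ./test.exe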

John_Young
New Contributor I

Great. Thanks for letting us know.

John

John_Young
New Contributor I

I have updated to Intel MPI Library 2019 Update 2 on Windows, and the bug still occurs.  The originally attached program still crashes at the same place unless I_MPI_FABRICS is set to ofi.

Is there any ETA on when this bug will be fixed?

Thanks,
John

James_T_Intel
Moderator

We had an unexpected change regarding Update 2, so fixes planned for Update 2 were moved to Update 3.  I apologize for the delay; the fix should be available soon.
