Intel® MPI Library
Get help with building, analyzing, optimizing, and scaling high-performance computing (HPC) applications.

Bug Report: MPI_Ibsend leads to a segfault

jprotze
Beginner

When relying on shm for single-node execution, a call to MPI_Ibsend leads to a segfault (see attached minimal reproducer):

```
$ mpicc testIbsend.c
$ mpiexec --version
Intel(R) MPI Library for Linux* OS, Version 2021.6 Build 20220227
$ env I_MPI_FABRICS=shm mpiexec -hosts=localhost -n 2 ./a.out
Ready: 1 of 2 tasks.
Ready: 0 of 2 tasks.
Signing off: 1 of 2 tasks.

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= RANK 0 PID 2573295 RUNNING AT localhost
= KILLED BY SIGNAL: 11 (Segmentation fault)
===================================================================================

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= RANK 1 PID 2573296 RUNNING AT localhost
= KILLED BY SIGNAL: 9 (Killed)
===================================================================================
```

The second process successfully receives the message and signs off.

 

gdb points to a null-pointer dereference in:

https://github.com/pmodels/mpich/blob/3.4.x/src/mpi/pt2pt/ibsend.c#L38
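
For reference, the reproducer boils down to something like the following sketch (the attached testIbsend.c is authoritative; the exact buffer size, tag, and message contents here are illustrative):

```
/* Sketch of a testIbsend-style reproducer: rank 0 posts a buffered
 * non-blocking send, rank 1 receives. With I_MPI_FABRICS=shm the
 * MPI_Ibsend call on rank 0 is where the segfault is observed. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    int rank, size, msg = 42;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("Ready: %d of %d tasks.\n", rank, size);

    if (rank == 0) {
        /* MPI_Ibsend requires a user-attached buffer large enough for
         * the message plus the bsend bookkeeping overhead. */
        int bufsize = (int)sizeof(int) + MPI_BSEND_OVERHEAD;
        char *buf = malloc(bufsize);
        MPI_Buffer_attach(buf, bufsize);

        MPI_Request req;
        MPI_Ibsend(&msg, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req); /* crashes here */
        MPI_Wait(&req, MPI_STATUS_IGNORE);

        /* Detach blocks until the buffered message has been delivered. */
        void *detached;
        int detached_size;
        MPI_Buffer_detach(&detached, &detached_size);
        free(detached);
    } else if (rank == 1) {
        MPI_Recv(&msg, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    }

    printf("Signing off: %d of %d tasks.\n", rank, size);
    MPI_Finalize();
    return 0;
}
```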

1 Solution
RabiyaSK_Intel

Hi,

Thanks for posting in Intel Communities.

To resolve this issue, please upgrade to the latest version of Intel MPI (2021.9); we did not encounter this problem with the latest version.

Could you also try compiling the MPI program with the Intel compilers (i.e., mpiicc instead of mpicc)?

Please see the attached screenshot:

RabiyaSK_Intel_0-1685098749303.png

If the problem persists even after upgrading to the latest Intel MPI 2021.9, please provide the specifications of the hardware and operating system you are currently using.

Thanks & Regards,

Shaik Rabiya


3 Replies
jprotze
Beginner

Hi,

As suggested, I tested with the latest Intel MPI and could not reproduce the issue. I'll ask our sysadmins to roll out the latest version of Intel MPI on our cluster.

Thanks,

Joachim

RabiyaSK_Intel

Hi,

Thanks for accepting our solution. If you need any additional information, please raise a new question, as this thread will no longer be monitored by Intel.

Thanks & Regards,

Shaik Rabiya

