Intel® MPI Library
Get help with building, analyzing, optimizing, and scaling high-performance computing (HPC) applications.

Cannot use jemalloc with IntelMPI

Eloi_Gaudry
Beginner

Hi,

I've been benchmarking several memory allocators on Linux (64-bit), namely ptmalloc2, tcmalloc, and jemalloc, with an application linked against Intel MPI (4.1.3.049).

Launching any application linked with both jemalloc and Intel MPI causes the execution to abort with signal 11 (SIGSEGV). The same application runs without any issue when it is not linked against Intel MPI.

Is Intel MPI doing its own malloc/free?
How can this issue be overcome?

Thanks,
Eloi

 

Eloi_Gaudry
Beginner

Here is the backtrace I got within gdb:

Program received signal SIGSEGV, Segmentation fault.
0x00007ffff0c8241c in free (ptr=<error reading variable: Cannot access memory at address 0x7fffff7feff0>) at ../../i_rtc_hook.c:45
45	../../i_rtc_hook.c: No such file or directory.
(gdb) where
#0  0x00007ffff0c8241c in free (ptr=<error reading variable: Cannot access memory at address 0x7fffff7feff0>) at ../../i_rtc_hook.c:45
#1  0x00007ffff0c82480 in free (ptr=0x7fffe7839080) at ../../i_rtc_hook.c:65
#2  0x00007ffff0c82480 in free (ptr=0x7fffe7839080) at ../../i_rtc_hook.c:65
#3  0x00007ffff0c82480 in free (ptr=0x7fffe7839080) at ../../i_rtc_hook.c:65
#4  0x00007ffff0c82480 in free (ptr=0x7fffe7839080) at ../../i_rtc_hook.c:65
#5  0x00007ffff0c82480 in free (ptr=0x7fffe7839080) at ../../i_rtc_hook.c:65

 

 

Gergana_S_Intel
Employee

Hi Eloi,

Thanks for getting in touch.  Yes, Intel MPI does implement its own malloc, although it's mostly a wrapper around the OS's malloc call (with some very slight modifications).

If you're looking to replace the memory management subsystem used within Intel MPI, that's easy enough to do.  The Unified Memory Management chapter in our Reference Manual gives you some examples.  Give that a try and let me know how it goes.

Regards,
~Gergana

Gergana_S_Intel
Employee

For those of you keeping track at home, the real issue was the link order when using a different memory allocator.  Intel MPI already wraps the malloc, realloc, and free routines.  If you're simply trying to replace the malloc library used (as a "drop-in" solution) from the default glibc one to something else (e.g. jemalloc), make sure it's linked *before* libmpi when linking your application.
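In command-line terms, that link order could look like the sketch below. The file names, the library path, and the process count are placeholders for your own build; only the relative order of -ljemalloc and -lmpi is the point:

```shell
# Link jemalloc before libmpi so its malloc/free win symbol resolution
# (object file and output names are placeholders):
mpicc -o myapp myapp.o -ljemalloc -lmpi

# Alternatively, preload jemalloc at run time without relinking
# (library path is a placeholder for your installation):
mpirun -n 4 -genv LD_PRELOAD /usr/lib64/libjemalloc.so ./myapp
```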

~Gergana
