Hi,
I've tried to benchmark several memory allocators on Linux (64-bit), such as ptmalloc2, tcmalloc, and jemalloc, with an application linked against IntelMPI (4.1.3.049).
Launching any application linked with jemalloc causes the execution to abort with signal 11 (SIGSEGV), but the same application works without any issue when not linked with IntelMPI.
Is IntelMPI doing its own malloc/free?
How can this issue be overcome?
Thanks,
Eloi
Here is the backtrace I got within gdb:
Program received signal SIGSEGV, Segmentation fault.
0x00007ffff0c8241c in free (ptr=<error reading variable: Cannot access memory at address 0x7fffff7feff0>) at ../../i_rtc_hook.c:45
45      ../../i_rtc_hook.c: No such file or directory.
(gdb) where
#0  0x00007ffff0c8241c in free (ptr=<error reading variable: Cannot access memory at address 0x7fffff7feff0>) at ../../i_rtc_hook.c:45
#1  0x00007ffff0c82480 in free (ptr=0x7fffe7839080) at ../../i_rtc_hook.c:65
#2  0x00007ffff0c82480 in free (ptr=0x7fffe7839080) at ../../i_rtc_hook.c:65
#3  0x00007ffff0c82480 in free (ptr=0x7fffe7839080) at ../../i_rtc_hook.c:65
#4  0x00007ffff0c82480 in free (ptr=0x7fffe7839080) at ../../i_rtc_hook.c:65
#5  0x00007ffff0c82480 in free (ptr=0x7fffe7839080) at ../../i_rtc_hook.c:65
Hi Eloi,
Thanks for getting in touch. Yes, Intel MPI does implement its own malloc although it's mostly a wrapper around the OS's malloc call (with some very slight modifications).
If you're looking to replace the memory management subsystem used within Intel MPI, that's easy enough to do. The Unified Memory Management chapter in our Reference Manual gives you some examples. Give that a try and let me know how it goes.
Regards,
~Gergana
For those of you keeping track at home, the real issue was the linking order when using a different memory allocator. Intel MPI already provides its own malloc, realloc, and free routines. If you're simply trying to replace the malloc library used (as a "drop-in" solution) from the default glibc to something else (e.g. jemalloc), make sure it's linked *before* libmpi in your application's link line.
~Gergana