Intel® Fortran Compiler

severe (174): SIGSEGV - libc.so.6

nansal
Beginner
1,519 Views

Hi,

I am using the Intel Fortran compiler for Linux 10.1.015 and Intel MKL 10.0.2. When I run my executable program (TESTKMAT) with small matrix dimensions, there is NO ERROR. But when I increase the matrix dimensions, I get this message when running the executable:

forrtl: severe (174): SIGSEGV, segmentation fault occurred

Image       PC        Routine  Line     Source
TESTKMAT    0804B554  Unknown  Unknown  Unknown
TESTKMAT    0804AD91  Unknown  Unknown  Unknown
libc.so.6   0071E390  Unknown  Unknown  Unknown
TESTKMAT    0804ACA1  Unknown  Unknown  Unknown

I tried to increase the stack size in the terminal:

$ source /opt/intel/fc/10.1.015/bin/ifortvars.sh

$ source /opt/intel/mkl/10.0.2.018/tools/environment/mklvars32.sh

$ ulimit -s unlimited

and used the compiler option '-heap-arrays', as below:

$ ifort -heap-arrays TESTKMAT.for -L LD_LIBRARY_PATH -I LD_INCLUDE_PATH -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_lapack -lmkl_ia32 -lmkl_core -lguide -lpthread

But there was no improvement, and the same message appeared.

I really appreciate your help.

Thanks
5 Replies
Ron_Green
Moderator
compile and link with -g -traceback -fp-stack-check -fpe0 -check all

this may help isolate the problem.

ron
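For example, the diagnostic flags can simply be added to the compile line from the original post (a sketch only; the library list is the one nansal used, so adjust it and the paths to your own install):

```shell
# Rebuild with runtime diagnostics enabled; link flags as before.
$ ifort -g -traceback -fp-stack-check -fpe0 -check all TESTKMAT.for \
      -lmkl_lapack -lmkl_ia32 -lmkl_core -lguide -lpthread
# Then rerun: -check all reports out-of-bounds array accesses with a
# source line, and -traceback replaces the "Unknown" entries in the
# crash traceback with routine names and line numbers.
```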
nansal
Beginner
Quoting - Ron_Green

compile and link with -g -traceback -fp-stack-check -fpe0 -check all

this may help isolate the problem.

ron

Thank you so much Ron,

I used those compiler options.

My problem was in a line where I assigned a real value to an element past the end of an array that has only 3 elements (I defined it with 'DIMENSION A(3)')!

In my program, the array A is always fixed and has only 3 elements, but other matrices can be large, depending on the number of atoms. When the number of atoms is 3718 I have no problem, but when I set my program to 4950 atoms, this problem occurs. However, the array A is unchanged.

I really don't know what has happened!

Thanks in advance!
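What nansal describes is a classic out-of-bounds write: the store past A(3) lands in whatever happens to sit next to A in memory, so the symptom depends on the surrounding layout, not on A itself. A minimal sketch of the pattern (hypothetical names; the loop bound overruns the array on purpose):

```fortran
      PROGRAM OVERRUN
C     A has 3 elements, but the loop writes A(1)..A(4).  The store to
C     A(4) silently corrupts whatever is adjacent to A in memory, so
C     whether the program crashes depends on the layout of the OTHER
C     arrays -- which is why changing their size (3718 vs 4950 atoms)
C     changes the symptom even though A itself never changes.
      DIMENSION A(3)
      DO 10 I = 1, 4
         A(I) = 1.0
   10 CONTINUE
      END
```

With '-check all' the runtime stops at the A(4) store and reports the line; without it the program may run, crash, or produce wrong results depending on memory layout.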


joey_hylton
Beginner
Quoting - nansal

Thank you so much Ron,

I used those compiler options.

My problem was in a line where I assigned a real value to an element past the end of an array that has only 3 elements (I defined it with 'DIMENSION A(3)')!

In my program, the array A is always fixed and has only 3 elements, but other matrices can be large, depending on the number of atoms. When the number of atoms is 3718 I have no problem, but when I set my program to 4950 atoms, this problem occurs. However, the array A is unchanged.

I really don't know what has happened!

Thanks in advance!


Did you get any solutions?
I ran into the same problem, but I traced it to MPI functions, which have no problem with the data size.
Thanks,

J.L
Ron_Green
Moderator
Quoting - nansal

Thank you so much Ron,

I used those compiler options.

My problem was in a line where I assigned a real value to an element past the end of an array that has only 3 elements (I defined it with 'DIMENSION A(3)')!

In my program, the array A is always fixed and has only 3 elements, but other matrices can be large, depending on the number of atoms. When the number of atoms is 3718 I have no problem, but when I set my program to 4950 atoms, this problem occurs. However, the array A is unchanged.

I really don't know what has happened!

Thanks in advance!



There are a few other diagnostics to try:

-g -traceback -fp-stack-check -gen-interfaces -warn interfaces

There is a bug in -gen-interfaces -warn interfaces in the 11.0.083 compiler, so if you have 11.0.081 or older, use that compiler.

ron
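As an illustration of the kind of mistake these interface checks catch (hypothetical routine names, not from the original program): a call whose actual argument type disagrees with the dummy argument.

```fortran
      PROGRAM IFACE
C     Hypothetical example of a mismatch that '-gen-interfaces
C     -warn interfaces' flags at compile time: SCALE expects a REAL
C     dummy argument but is passed an INTEGER actual argument.
      INTEGER N
      N = 10
      CALL SCALE(N)
      END

      SUBROUTINE SCALE(X)
      REAL X
      X = X * 2.0
      END
```

Without generated interfaces this compiles silently and SCALE reinterprets the INTEGER bits as a REAL, which can corrupt data or crash elsewhere.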
Ron_Green
Moderator
Quoting - joey_hylton
Did you get any solutions?
I ran into the same problem, but I traced it to MPI functions, which have no problem with the data size.
Thanks,

J.L

When a call to an MPI routine seg faults, I look very carefully at the arguments. Make sure you are passing variables that have the correct MPI data types. Another source could be that the drivers for your underlying fabric (Myrinet, InfiniBand) are out of date, or that your adapter firmware does not match your DAPL drivers. If you are using a non-Ethernet interconnect, check all your adapters for the correct (and matching) firmware versions, and check that your drivers are up to date AND are supported on your OS. People sometimes rush to install the latest version of their Linux distro and forget that their fabric drivers are not supported on the newest release.

ron
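A sketch of the argument checking Ron suggests (hypothetical buffer and sizes; 'mpif.h' assumed available): the declared kind of the buffer must agree with the MPI datatype argument, or the library will read or write the wrong number of bytes.

```fortran
      PROGRAM MPIDT
      INCLUDE 'mpif.h'
      DOUBLE PRECISION BUF(100)
      INTEGER IERR, RANK
      CALL MPI_INIT(IERR)
      CALL MPI_COMM_RANK(MPI_COMM_WORLD, RANK, IERR)
C     The buffer is DOUBLE PRECISION, so the datatype argument must be
C     MPI_DOUBLE_PRECISION.  Passing MPI_REAL here would make the
C     library transfer half as many bytes per element -- a common
C     cause of seg faults that appear to be inside the MPI routines.
      CALL MPI_BCAST(BUF, 100, MPI_DOUBLE_PRECISION, 0,
     &               MPI_COMM_WORLD, IERR)
      CALL MPI_FINALIZE(IERR)
      END
```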