<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic "yes, rank 0 and task 0 are the same thing" in Intel® Fortran Compiler</title>
    <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073474#M119925</link>
    <description>&lt;P&gt;yes, rank 0 and task 0 are the same thing.&lt;/P&gt;

&lt;P&gt;no, it doesn't tell you much: just that all the other ranks wait in an MPI_Barrier, presumably until rank 0 finishes its particular work. Of course, they'll wait forever (or until the timeout), because rank 0 exited with an error.&lt;/P&gt;</description>
    <pubDate>Wed, 18 Nov 2015 21:01:34 GMT</pubDate>
    <dc:creator>John_D_6</dc:creator>
    <dc:date>2015-11-18T21:01:34Z</dc:date>
    <item>
      <title>error on many cores run</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073467#M119918</link>
      <description>&lt;P&gt;Hi&lt;/P&gt;

&lt;P&gt;I have a code that works on a cluster when I use 6&lt;SUP class="moz-txt-sup"&gt;&lt;SPAN style="display:inline-block;width:0;height:0;overflow:hidden"&gt;^&lt;/SPAN&gt;3&lt;/SUP&gt; = 216 cores, but it crashes when I try to run it at a higher resolution on 12&lt;SUP class="moz-txt-sup"&gt;&lt;SPAN style="display:inline-block;width:0;height:0;overflow:hidden"&gt;^&lt;/SPAN&gt;3&lt;/SUP&gt; = 1728 cores (all the parameters are the same except the grid spacing and the number of processors on which the code works).&lt;BR /&gt;
	&lt;BR /&gt;
	We tried to see if it is a memory issue, but even running the job with 16 tasks per node (108 nodes) didn't help.&lt;BR /&gt;
	I cannot debug the program with something like TotalView because of the limit on the number of processes such debuggers can manage.&lt;/P&gt;

&lt;P&gt;I tried compiling the program with -O0 -g -traceback to get better information in the error message.&lt;BR /&gt;
	With these options, even though the program still crashes, it runs until the time I requested on the cluster expires.&lt;BR /&gt;
	&lt;BR /&gt;
	In this case I get:&lt;BR /&gt;
	srun.slurm: Job step aborted: Waiting up to 2 seconds for job step to finish.&lt;BR /&gt;
	slurmstepd-borgt091: *** JOB 5787356 CANCELLED AT 2015-11-02T11:17:00 DUE TO TIME LIMIT on borgt091 ***&lt;BR /&gt;
	slurmstepd-borgt091: *** STEP 5787356.0 CANCELLED AT 2015-11-02T11:17:00 DUE TO TIME LIMIT on borgt091 ***&lt;BR /&gt;
	forrtl: error (78): process killed (SIGTERM)&lt;BR /&gt;
	Image&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; PC&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Routine&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Line&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Source&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp;&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 000000000088C169&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 000000000088AA3E&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 0000000000848F32&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 0000000000815663&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 0000000000819219&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	libpthread.so.0&amp;nbsp;&amp;nbsp;&amp;nbsp; 00002AAAAB669810&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	libpthread.so.0&amp;nbsp;&amp;nbsp;&amp;nbsp; 00002AAAAB6663D0&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	forrtl: error (78): process killed (SIGTERM)&lt;BR /&gt;
	Image&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; PC&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Routine&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Line&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Source&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp;&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 000000000088C169&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 000000000088AA3E&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 0000000000848F32&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 0000000000815663&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 0000000000819219&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	libpthread.so.0&amp;nbsp;&amp;nbsp;&amp;nbsp; 00002AAAAB669810&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 0000000000819140&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	libpthread.so.0&amp;nbsp;&amp;nbsp;&amp;nbsp; 00002AAAAB669810&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	libmlx5-rdmav2.so&amp;nbsp; 00002AAAACE3F4BB&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;/P&gt;

&lt;P&gt;Stack trace terminated abnormally.&lt;/P&gt;

&lt;P&gt;(more similar lines...)&lt;/P&gt;
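As a sketch of the run setup described above, assuming a Slurm batch script (the job name, executable path, and time limit here are illustrative, not from the post; the node and task counts are the ones reported), the 1728-rank debug run with 16 tasks per node could be requested as:

```shell
#!/bin/bash
# Hypothetical Slurm script for the run described above:
# 108 nodes x 16 tasks/node = 1728 MPI ranks, with a generous time
# limit so the -O0 -g -traceback build can reach its actual crash.
#SBATCH --job-name=3dpic_debug
#SBATCH --nodes=108
#SBATCH --ntasks-per-node=16
#SBATCH --time=12:00:00

srun ./3dpic_full_mpi.ex
```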

&lt;P&gt;I attach the complete error file (JOBID 5787356)&lt;BR /&gt;
	&lt;BR /&gt;
	However, when I run the same simulation without those compiler options, I get a different error and the job breaks down earlier:&lt;BR /&gt;
	forrtl: severe (174): SIGSEGV, segmentation fault occurred&lt;BR /&gt;
	Image&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; PC&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Routine&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Line&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Source&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &amp;nbsp;&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 0000000000869189&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 0000000000867A5E&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 0000000000825B72&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 00000000007F2633&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 00000000007F621B&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	libpthread.so.0&amp;nbsp;&amp;nbsp;&amp;nbsp; 00002AAAAB669810&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	libc.so.6&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 00002AAAAC126C52&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 00000000005389A2&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 00000000004A6643&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 0000000000462106&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 000000000041B72F&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 00000000004165C6&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	libc.so.6&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; 00002AAAAC02FC36&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	3dpic_full_mpi.ex&amp;nbsp; 00000000004164B9&amp;nbsp; Unknown&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Unknown&amp;nbsp; Unknown&lt;BR /&gt;
	srun.slurm: error: borgo015: task 0: Exited with exit code 174&lt;BR /&gt;
	MPT ERROR: borgo021 has had continuous IB fabric problems for 10&lt;BR /&gt;
	&amp;nbsp;&amp;nbsp; &amp;nbsp;(MPI_WATCHDOG_TIMER) minutes trying to reach borgo015. Aborting.&lt;BR /&gt;
	MPT ERROR: borgo020 has had continuous IB fabric problems for 10&lt;BR /&gt;
	&amp;nbsp;&amp;nbsp; &amp;nbsp;(MPI_WATCHDOG_TIMER) minutes trying to reach borgo015. Aborting.&lt;BR /&gt;
	MPT: Global rank 32 is aborting with error code 0.&lt;BR /&gt;
	&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; Process ID: 12240, Host: borgo021, Program: /gpfsm/dnb32/gbrambil/Kcode/pulsarSILOF/3dpic_full_mpi.exe&lt;/P&gt;

&lt;P&gt;(other stuff later)&lt;/P&gt;

&lt;P&gt;I attach the error file of this job too (JOBID 5991137)&lt;BR /&gt;
	&lt;BR /&gt;
	Do you have any idea of what the problem could be? I saw this topic &lt;A href="https://software.intel.com/en-us/forums/intel-fortran-compiler-for-linux-and-mac-os-x/topic/558488" target="_blank"&gt;https://software.intel.com/en-us/forums/intel-fortran-compiler-for-linux-and-mac-os-x/topic/558488&lt;/A&gt;; does the same approach work for my case too (I cannot use a debugger the way that poster did)?&lt;/P&gt;

&lt;P&gt;P.S.: this line appears in the error file: rm: cannot remove `pcrimth.dat': No such file or directory. Don't worry about it; it always appears, but the code still runs.&lt;/P&gt;

&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Wed, 18 Nov 2015 14:44:07 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073467#M119918</guid>
      <dc:creator>Gabriele_B_</dc:creator>
      <dc:date>2015-11-18T14:44:07Z</dc:date>
    </item>
    <item>
      <title>This question appears to be</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073468#M119919</link>
      <description>&lt;P&gt;This question appears to be related to MPI, but I don't see even a clue about which implementation of MPI. Each major implementation of MPI has its own email help list, except for Intel MPI (which I guess you aren't using) it would be the clusters and HPC companion forum to this one.&lt;/P&gt;</description>
      <pubDate>Wed, 18 Nov 2015 15:25:44 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073468#M119919</guid>
      <dc:creator>TimP</dc:creator>
      <dc:date>2015-11-18T15:25:44Z</dc:date>
    </item>
    <item>
      <title>I use SGI-MPT MPI.</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073469#M119920</link>
      <description>&lt;P&gt;I use SGI-MPT MPI.&lt;/P&gt;

&lt;P&gt;I wrote on this forum because I saw that a similar question had been posed here successfully (the link I inserted in my post).&lt;/P&gt;

&lt;P&gt;Thanks&lt;/P&gt;

&lt;P&gt;GB&lt;/P&gt;</description>
      <pubDate>Wed, 18 Nov 2015 15:41:36 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073469#M119920</guid>
      <dc:creator>Gabriele_B_</dc:creator>
      <dc:date>2015-11-18T15:41:36Z</dc:date>
    </item>
    <item>
      <title>It seems slightly related in</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073470#M119921</link>
      <description>&lt;P&gt;It seems slightly related, in that your cluster watchdog timer has killed your job, but the message explains that it had been waiting far too long to reach a specified node. If that node was allocated to you by your cluster manager but is unavailable, that is a problem for your sysadmin.&lt;/P&gt;</description>
      <pubDate>Wed, 18 Nov 2015 15:49:00 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073470#M119921</guid>
      <dc:creator>TimP</dc:creator>
      <dc:date>2015-11-18T15:49:00Z</dc:date>
    </item>
    <item>
      <title>But what about the fact that</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073471#M119922</link>
      <description>&lt;P&gt;But what about the fact that when I compile the program without -O0 -g -traceback I get a SIGSEGV error?&lt;/P&gt;

&lt;P&gt;Which of the two is the "correct" error? Why, with these compile options, doesn't the cluster kill the job until the time expires?&lt;/P&gt;

&lt;P&gt;Is the WATCHDOG killing the processes when the time expires?&lt;/P&gt;

&lt;P&gt;Thanks&lt;/P&gt;

&lt;P&gt;GB&lt;/P&gt;</description>
      <pubDate>Wed, 18 Nov 2015 16:22:39 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073471#M119922</guid>
      <dc:creator>Gabriele_B_</dc:creator>
      <dc:date>2015-11-18T16:22:39Z</dc:date>
    </item>
    <item>
      <title>indeed your job 5991137 shows</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073472#M119923</link>
      <description>&lt;P&gt;indeed your job &lt;SPAN class="file"&gt;&lt;A href="https://community.intel.com/legacyfs/online/drupal_files/managed/ea/92/error.5991137.txt"&gt;5991137&lt;/A&gt;&lt;/SPAN&gt; shows the 'real' error, in the sense that the segfault on rank 0 is what needs to be fixed. The other ranks then just stop, because they can't contact that node (borgo015) anymore. Just extend the walltime of your job as far as your cluster allows and rerun with debug info and -O0. If you can't wait that long, you can compile with -O2 -g -traceback and rerun. Note that in that case the printed location of the error is probably (much) less accurate.&lt;/P&gt;</description>
      <pubDate>Wed, 18 Nov 2015 20:07:10 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073472#M119923</guid>
      <dc:creator>John_D_6</dc:creator>
      <dc:date>2015-11-18T20:07:10Z</dc:date>
    </item>
    <item>
      <title>Hi John,</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073473#M119924</link>
      <description>&lt;P&gt;Hi John,&lt;/P&gt;

&lt;P&gt;Thanks. Is task 0 equivalent to rank 0? If so, it gives me a hint toward the solution (the rank 0 job is doing something particular).&lt;/P&gt;

&lt;P&gt;&lt;EM&gt;Just extend the walltime of your job as long as is allowed on your cluster and rerun the job with debug info and -O0.&lt;/EM&gt;&lt;/P&gt;

&lt;P&gt;I tried, but when I run the job that way (by debug info you mean -g -traceback, correct?), no matter how much time I request, the simulation uses all of it but in reality remains blocked at the same point.&lt;/P&gt;

&lt;P&gt;I'll try -O2 -g -traceback.&lt;/P&gt;

&lt;P&gt;Is there anything in the part of the error message below that can tell me what is happening to the job that has the problem?&lt;/P&gt;
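For intuition about what the traces below show (every surviving rank parked inside a barrier that one rank never reaches), here is a minimal stand-in sketch using Python's stdlib threading.Barrier in place of the real MPI_Barrier; the names and counts are hypothetical, not from the job:

```python
# Stand-in for the MPI hang, using stdlib threading.Barrier instead of
# MPI_Barrier (illustration only, not the real MPI API). "Rank 0" fails
# before reaching the barrier, so every other "rank" blocks until timeout.
import threading

NRANKS = 4
barrier = threading.Barrier(NRANKS)
timed_out = []

def worker(rank):
    if rank == 0:
        return  # rank 0 "crashes" before the barrier, like the segfault
    try:
        barrier.wait(timeout=0.5)  # the other ranks block here
    except threading.BrokenBarrierError:
        timed_out.append(rank)     # ...and eventually give up

threads = [threading.Thread(target=worker, args=(r,)) for r in range(NRANKS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(timed_out))  # every rank except 0 timed out waiting
```

On the cluster there is no such short timeout: the waiting ranks sit in PMPI_Barrier until the walltime or the MPT watchdog kills them, which is exactly the pattern in the gdb tracebacks below.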

&lt;PRE&gt;MPT: --------stack traceback-------
MPT: Attaching to program: /proc/18777/exe, process 18777
MPT: Try: zypper install -C "debuginfo(build-id)=d4191084441e39a7b480fc4b41f67083812e9811"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=682e4f7a27ee294a58f17249a0717861db546f2d"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=32c1c7a7a20b54ac3af6b2f436b3375ffeb12f0b"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=2f51a06469a025d507534fe292dcf4e02235bd18"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=91334fd8105f0b62c0bdbbec14b45a9fd043f4c3"
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: Try: zypper install -C "debuginfo(build-id)=4aee0c3923838575483ebd16be6db85ecb6f0b75"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=3b149eccd897f1f37dce50ad22614043eba757a2"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=a95c0ce9f9752baf052d3b55b3b8ce19f662d2eb"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=719375f80fd84b85b905db2c20ec70e8805b36e5"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=4f3dc8efbe18b50a6abe70d8b2f862ce185542d6"
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: (gdb) #0  0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaabd6f32c in mpi_sgi_system (command=&amp;lt;optimized out&amp;gt;) at sig.c:99
MPT: #2  MPI_SGI_stacktraceback (header=&amp;lt;optimized out&amp;gt;) at sig.c:319
MPT: #3  0x00002aaaabcc4a4a in print_traceback (ecode=0) at abort.c:197
MPT: #4  0x00002aaaabcc4b3e in MPI_SGI_abort () at abort.c:85
MPT: #5  0x00002aaaabd7ff4f in try_repush (gap=&amp;lt;optimized out&amp;gt;, 
MPT:     now=2886902.8844944318) at ud.c:1599
MPT: #6  0x00002aaaabd80eb6 in MPI_SGI_ud_progress () at ud.c:1769
MPT: #7  0x00002aaaabd57f62 in MPI_SGI_progress_devices () at progress.c:118
MPT: #8  MPI_SGI_progress () at progress.c:241
MPT: #9  0x00002aaaabd690d5 in MPI_SGI_request_wait (request=0x7fffffffa35c, 
MPT:     status=0x2aaaac010970 &amp;lt;mpi_sgi_status_ignore&amp;gt;, set=0x7fffffffa364, 
MPT:     gen_rc=0x7fffffffa360) at req.c:1703
MPT: #10 0x00002aaaabcd6d32 in MPI_SGI_barrier_basic (comm=6) at barrier.c:61
MPT: #11 0x00002aaaabcd6e65 in MPI_SGI_barrier (comm=6) at barrier.c:204
MPT: #12 0x00002aaaabcd70f5 in MPI_SGI_barrier (comm=&amp;lt;optimized out&amp;gt;)
MPT:     at barrier.c:166
MPT: #13 0x00002aaaabcd7193 in PMPI_Barrier (comm=1) at barrier.c:344
MPT: #14 0x00002aaaabcd72df in pmpi_barrier__ ()
MPT:    from /usr/local/sgi/mpi/mpt-2.12/opt/sgi/mpt/mpt-2.12/lib/libmpi.so
MPT: #15 0x000000000041bd07 in MAIN__ ()
MPT: #16 0x00000000004165c6 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 18777] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/18777/exe, process 18777
MPT: Attaching to program: /proc/5232/exe, process 5232
MPT: Try: zypper install -C "debuginfo(build-id)=d4191084441e39a7b480fc4b41f67083812e9811"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=682e4f7a27ee294a58f17249a0717861db546f2d"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=32c1c7a7a20b54ac3af6b2f436b3375ffeb12f0b"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=2f51a06469a025d507534fe292dcf4e02235bd18"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=91334fd8105f0b62c0bdbbec14b45a9fd043f4c3"
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: Try: zypper install -C "debuginfo(build-id)=4aee0c3923838575483ebd16be6db85ecb6f0b75"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=3b149eccd897f1f37dce50ad22614043eba757a2"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=a95c0ce9f9752baf052d3b55b3b8ce19f662d2eb"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=719375f80fd84b85b905db2c20ec70e8805b36e5"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=4f3dc8efbe18b50a6abe70d8b2f862ce185542d6"
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: (gdb) #0  0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaabd6f32c in mpi_sgi_system (command=&amp;lt;optimized out&amp;gt;) at sig.c:99
MPT: #2  MPI_SGI_stacktraceback (header=&amp;lt;optimized out&amp;gt;) at sig.c:319
MPT: #3  0x00002aaaabcc4a4a in print_traceback (ecode=0) at abort.c:197
MPT: #4  0x00002aaaabcc4b3e in MPI_SGI_abort () at abort.c:85
MPT: #5  0x00002aaaabd7ff4f in try_repush (gap=&amp;lt;optimized out&amp;gt;, 
MPT:     now=2886904.607016339) at ud.c:1599
MPT: #6  0x00002aaaabd80eb6 in MPI_SGI_ud_progress () at ud.c:1769
MPT: #7  0x00002aaaabd57f62 in MPI_SGI_progress_devices () at progress.c:118
MPT: #8  MPI_SGI_progress () at progress.c:241
MPT: #9  0x00002aaaabd690d5 in MPI_SGI_request_wait (request=0x7fffffffa3dc, 
MPT:     status=0x2aaaac010970 &amp;lt;mpi_sgi_status_ignore&amp;gt;, set=0x7fffffffa3e4, 
MPT:     gen_rc=0x7fffffffa3e0) at req.c:1703
MPT: #10 0x00002aaaabcd6d32 in MPI_SGI_barrier_basic (comm=6) at barrier.c:61
MPT: #11 0x00002aaaabcd6e65 in MPI_SGI_barrier (comm=6) at barrier.c:204
MPT: #12 0x00002aaaabcd70f5 in MPI_SGI_barrier (comm=&amp;lt;optimized out&amp;gt;)
MPT:     at barrier.c:166
MPT: #13 0x00002aaaabcd7193 in PMPI_Barrier (comm=1) at barrier.c:344
MPT: #14 0x00002aaaabcd72df in pmpi_barrier__ ()
MPT:    from /usr/local/sgi/mpi/mpt-2.12/opt/sgi/mpt/mpt-2.12/lib/libmpi.so
MPT: #15 0x000000000041bd07 in MAIN__ ()
MPT: #16 0x00000000004165c6 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 5232] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/5232/exe, process 5232
MPT: Attaching to program: /proc/31976/exe, process 31976
MPT: Try: zypper install -C "debuginfo(build-id)=d4191084441e39a7b480fc4b41f67083812e9811"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=682e4f7a27ee294a58f17249a0717861db546f2d"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=32c1c7a7a20b54ac3af6b2f436b3375ffeb12f0b"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=2f51a06469a025d507534fe292dcf4e02235bd18"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=91334fd8105f0b62c0bdbbec14b45a9fd043f4c3"
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: Try: zypper install -C "debuginfo(build-id)=4aee0c3923838575483ebd16be6db85ecb6f0b75"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=3b149eccd897f1f37dce50ad22614043eba757a2"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=a95c0ce9f9752baf052d3b55b3b8ce19f662d2eb"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=719375f80fd84b85b905db2c20ec70e8805b36e5"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=4f3dc8efbe18b50a6abe70d8b2f862ce185542d6"
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: (gdb) #0  0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaabd6f32c in mpi_sgi_system (command=&amp;lt;optimized out&amp;gt;) at sig.c:99
MPT: #2  MPI_SGI_stacktraceback (header=&amp;lt;optimized out&amp;gt;) at sig.c:319
MPT: #3  0x00002aaaabcc4a4a in print_traceback (ecode=0) at abort.c:197
MPT: #4  0x00002aaaabcc4b3e in MPI_SGI_abort () at abort.c:85
MPT: #5  0x00002aaaabd7ff4f in try_repush (gap=&amp;lt;optimized out&amp;gt;, 
MPT:     now=2886904.7364443871) at ud.c:1599
MPT: #6  0x00002aaaabd80eb6 in MPI_SGI_ud_progress () at ud.c:1769
MPT: #7  0x00002aaaabd57f62 in MPI_SGI_progress_devices () at progress.c:118
MPT: #8  MPI_SGI_progress () at progress.c:241
MPT: #9  0x00002aaaabd690d5 in MPI_SGI_request_wait (request=0x7fffffffa3dc, 
MPT:     status=0x2aaaac010970 &amp;lt;mpi_sgi_status_ignore&amp;gt;, set=0x7fffffffa3e4, 
MPT:     gen_rc=0x7fffffffa3e0) at req.c:1703
MPT: #10 0x00002aaaabcd6d32 in MPI_SGI_barrier_basic (comm=6) at barrier.c:61
MPT: #11 0x00002aaaabcd6e65 in MPI_SGI_barrier (comm=6) at barrier.c:204
MPT: #12 0x00002aaaabcd70f5 in MPI_SGI_barrier (comm=&amp;lt;optimized out&amp;gt;)
MPT:     at barrier.c:166
MPT: #13 0x00002aaaabcd7193 in PMPI_Barrier (comm=1) at barrier.c:344
MPT: #14 0x00002aaaabcd72df in pmpi_barrier__ ()
MPT:    from /usr/local/sgi/mpi/mpt-2.12/opt/sgi/mpt/mpt-2.12/lib/libmpi.so
MPT: #15 0x000000000041bd07 in MAIN__ ()
MPT: #16 0x00000000004165c6 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 31976] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/31976/exe, process 31976
MPT: Attaching to program: /proc/24838/exe, process 24838
MPT: Try: zypper install -C "debuginfo(build-id)=d4191084441e39a7b480fc4b41f67083812e9811"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=682e4f7a27ee294a58f17249a0717861db546f2d"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=32c1c7a7a20b54ac3af6b2f436b3375ffeb12f0b"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=2f51a06469a025d507534fe292dcf4e02235bd18"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=91334fd8105f0b62c0bdbbec14b45a9fd043f4c3"
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: Try: zypper install -C "debuginfo(build-id)=4aee0c3923838575483ebd16be6db85ecb6f0b75"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=3b149eccd897f1f37dce50ad22614043eba757a2"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=a95c0ce9f9752baf052d3b55b3b8ce19f662d2eb"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=719375f80fd84b85b905db2c20ec70e8805b36e5"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=4f3dc8efbe18b50a6abe70d8b2f862ce185542d6"
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: (gdb) #0  0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaabd6f32c in mpi_sgi_system (command=&amp;lt;optimized out&amp;gt;) at sig.c:99
MPT: #2  MPI_SGI_stacktraceback (header=&amp;lt;optimized out&amp;gt;) at sig.c:319
MPT: #3  0x00002aaaabcc4a4a in print_traceback (ecode=0) at abort.c:197
MPT: #4  0x00002aaaabcc4b3e in MPI_SGI_abort () at abort.c:85
MPT: #5  0x00002aaaabd7ff4f in try_repush (gap=&amp;lt;optimized out&amp;gt;, 
MPT:     now=2886899.4397963542) at ud.c:1599
MPT: #6  0x00002aaaabd80eb6 in MPI_SGI_ud_progress () at ud.c:1769
MPT: #7  0x00002aaaabd57f62 in MPI_SGI_progress_devices () at progress.c:118
MPT: #8  MPI_SGI_progress () at progress.c:241
MPT: #9  0x00002aaaabd690d5 in MPI_SGI_request_wait (request=0x7fffffffa324, 
MPT:     status=0x2aaaac010970 &amp;lt;mpi_sgi_status_ignore&amp;gt;, set=0x7fffffffa320, 
MPT:     gen_rc=0x7fffffffa31c) at req.c:1703
MPT: #10 0x00002aaaabd7324d in MPI_SGI_recv (buf=&amp;lt;optimized out&amp;gt;, 
MPT:     count=&amp;lt;optimized out&amp;gt;, type=&amp;lt;optimized out&amp;gt;, des=&amp;lt;optimized out&amp;gt;, 
MPT:     tag=&amp;lt;optimized out&amp;gt;, comm=&amp;lt;optimized out&amp;gt;, 
MPT:     status=0x2aaaac010970 &amp;lt;mpi_sgi_status_ignore&amp;gt;) at sugar.c:40
MPT: #11 0x00002aaaabcd6df3 in MPI_SGI_barrier_basic (comm=6) at barrier.c:74
MPT: #12 0x00002aaaabcd6e65 in MPI_SGI_barrier (comm=6) at barrier.c:204
MPT: #13 0x00002aaaabcd70f5 in MPI_SGI_barrier (comm=&amp;lt;optimized out&amp;gt;)
MPT:     at barrier.c:166
MPT: #14 0x00002aaaabcd7193 in PMPI_Barrier (comm=1) at barrier.c:344
MPT: #15 0x00002aaaabcd72df in pmpi_barrier__ ()
MPT:    from /usr/local/sgi/mpi/mpt-2.12/opt/sgi/mpt/mpt-2.12/lib/libmpi.so
MPT: #16 0x000000000041bd07 in MAIN__ ()
MPT: #17 0x00000000004165c6 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 24838] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/24838/exe, process 24838
MPT: Attaching to program: /proc/12467/exe, process 12467
MPT: Try: zypper install -C "debuginfo(build-id)=d4191084441e39a7b480fc4b41f67083812e9811"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=682e4f7a27ee294a58f17249a0717861db546f2d"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=32c1c7a7a20b54ac3af6b2f436b3375ffeb12f0b"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=2f51a06469a025d507534fe292dcf4e02235bd18"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=91334fd8105f0b62c0bdbbec14b45a9fd043f4c3"
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: Try: zypper install -C "debuginfo(build-id)=4aee0c3923838575483ebd16be6db85ecb6f0b75"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=3b149eccd897f1f37dce50ad22614043eba757a2"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=a95c0ce9f9752baf052d3b55b3b8ce19f662d2eb"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=719375f80fd84b85b905db2c20ec70e8805b36e5"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=4f3dc8efbe18b50a6abe70d8b2f862ce185542d6"
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: (gdb) #0  0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaabd6f32c in mpi_sgi_system (command=&amp;lt;optimized out&amp;gt;) at sig.c:99
MPT: #2  MPI_SGI_stacktraceback (header=&amp;lt;optimized out&amp;gt;) at sig.c:319
MPT: #3  0x00002aaaabcc4a4a in print_traceback (ecode=0) at abort.c:197
MPT: #4  0x00002aaaabcc4b3e in MPI_SGI_abort () at abort.c:85
MPT: #5  0x00002aaaabd7ff4f in try_repush (gap=&amp;lt;optimized out&amp;gt;, 
MPT:     now=2886904.2856587609) at ud.c:1599
MPT: #6  0x00002aaaabd80eb6 in MPI_SGI_ud_progress () at ud.c:1769
MPT: #7  0x00002aaaabd57f62 in MPI_SGI_progress_devices () at progress.c:118
MPT: #8  MPI_SGI_progress () at progress.c:241
MPT: #9  0x00002aaaabd690d5 in MPI_SGI_request_wait (request=0x7fffffffa35c, 
MPT:     status=0x2aaaac010970 &amp;lt;mpi_sgi_status_ignore&amp;gt;, set=0x7fffffffa364, 
MPT:     gen_rc=0x7fffffffa360) at req.c:1703
MPT: #10 0x00002aaaabcd6d32 in MPI_SGI_barrier_basic (comm=6) at barrier.c:61
MPT: #11 0x00002aaaabcd6e65 in MPI_SGI_barrier (comm=6) at barrier.c:204
MPT: #12 0x00002aaaabcd70f5 in MPI_SGI_barrier (comm=&amp;lt;optimized out&amp;gt;)
MPT:     at barrier.c:166
MPT: #13 0x00002aaaabcd7193 in PMPI_Barrier (comm=1) at barrier.c:344
MPT: #14 0x00002aaaabcd72df in pmpi_barrier__ ()
MPT:    from /usr/local/sgi/mpi/mpt-2.12/opt/sgi/mpt/mpt-2.12/lib/libmpi.so
MPT: #15 0x000000000041bd07 in MAIN__ ()
MPT: #16 0x00000000004165c6 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 12467] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/12467/exe, process 12467
MPT: Attaching to program: /proc/12240/exe, process 12240
MPT: Try: zypper install -C "debuginfo(build-id)=d4191084441e39a7b480fc4b41f67083812e9811"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=682e4f7a27ee294a58f17249a0717861db546f2d"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=32c1c7a7a20b54ac3af6b2f436b3375ffeb12f0b"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=2f51a06469a025d507534fe292dcf4e02235bd18"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=91334fd8105f0b62c0bdbbec14b45a9fd043f4c3"
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: Try: zypper install -C "debuginfo(build-id)=4aee0c3923838575483ebd16be6db85ecb6f0b75"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=3b149eccd897f1f37dce50ad22614043eba757a2"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=a95c0ce9f9752baf052d3b55b3b8ce19f662d2eb"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=719375f80fd84b85b905db2c20ec70e8805b36e5"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=4f3dc8efbe18b50a6abe70d8b2f862ce185542d6"
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: (gdb) #0  0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaabd6f32c in mpi_sgi_system (command=&amp;lt;optimized out&amp;gt;) at sig.c:99
MPT: #2  MPI_SGI_stacktraceback (header=&amp;lt;optimized out&amp;gt;) at sig.c:319
MPT: #3  0x00002aaaabcc4a4a in print_traceback (ecode=0) at abort.c:197
MPT: #4  0x00002aaaabcc4b3e in MPI_SGI_abort () at abort.c:85
MPT: #5  0x00002aaaabd7ff4f in try_repush (gap=&amp;lt;optimized out&amp;gt;, 
MPT:     now=2886905.1025811359) at ud.c:1599
MPT: #6  0x00002aaaabd80eb6 in MPI_SGI_ud_progress () at ud.c:1769
MPT: #7  0x00002aaaabd57f62 in MPI_SGI_progress_devices () at progress.c:118
MPT: #8  MPI_SGI_progress () at progress.c:241
MPT: #9  0x00002aaaabd690d5 in MPI_SGI_request_wait (request=0x7fffffffa3dc, 
MPT:     status=0x2aaaac010970 &amp;lt;mpi_sgi_status_ignore&amp;gt;, set=0x7fffffffa3e4, 
MPT:     gen_rc=0x7fffffffa3e0) at req.c:1703
MPT: #10 0x00002aaaabcd6d32 in MPI_SGI_barrier_basic (comm=6) at barrier.c:61
MPT: #11 0x00002aaaabcd6e65 in MPI_SGI_barrier (comm=6) at barrier.c:204
MPT: #12 0x00002aaaabcd70f5 in MPI_SGI_barrier (comm=&amp;lt;optimized out&amp;gt;)
MPT:     at barrier.c:166
MPT: #13 0x00002aaaabcd7193 in PMPI_Barrier (comm=1) at barrier.c:344
MPT: #14 0x00002aaaabcd72df in pmpi_barrier__ ()
MPT:    from /usr/local/sgi/mpi/mpt-2.12/opt/sgi/mpt/mpt-2.12/lib/libmpi.so
MPT: #15 0x000000000041bd07 in MAIN__ ()
MPT: #16 0x00000000004165c6 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 12240] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/12240/exe, process 12240
MPT: Attaching to program: /proc/15596/exe, process 15596
MPT: Try: zypper install -C "debuginfo(build-id)=d4191084441e39a7b480fc4b41f67083812e9811"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=682e4f7a27ee294a58f17249a0717861db546f2d"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=32c1c7a7a20b54ac3af6b2f436b3375ffeb12f0b"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=2f51a06469a025d507534fe292dcf4e02235bd18"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=91334fd8105f0b62c0bdbbec14b45a9fd043f4c3"
MPT: (no debugging symbols found)...done.
MPT: [Thread debugging using libthread_db enabled]
MPT: Using host libthread_db library "/lib64/libthread_db.so.1".
MPT: Try: zypper install -C "debuginfo(build-id)=4aee0c3923838575483ebd16be6db85ecb6f0b75"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=3b149eccd897f1f37dce50ad22614043eba757a2"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=a95c0ce9f9752baf052d3b55b3b8ce19f662d2eb"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=719375f80fd84b85b905db2c20ec70e8805b36e5"
MPT: (no debugging symbols found)...done.
MPT: Try: zypper install -C "debuginfo(build-id)=4f3dc8efbe18b50a6abe70d8b2f862ce185542d6"
MPT: (no debugging symbols found)...done.
MPT: 0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: (gdb) #0  0x00002aaaab66937f in waitpid () from /lib64/libpthread.so.0
MPT: #1  0x00002aaaabd6f32c in mpi_sgi_system (command=&amp;lt;optimized out&amp;gt;) at sig.c:99
MPT: #2  MPI_SGI_stacktraceback (header=&amp;lt;optimized out&amp;gt;) at sig.c:319
MPT: #3  0x00002aaaabcc4a4a in print_traceback (ecode=0) at abort.c:197
MPT: #4  0x00002aaaabcc4b3e in MPI_SGI_abort () at abort.c:85
MPT: #5  0x00002aaaabd7ff4f in try_repush (gap=&amp;lt;optimized out&amp;gt;, 
MPT:     now=2886904.6996221012) at ud.c:1599
MPT: #6  0x00002aaaabd80eb6 in MPI_SGI_ud_progress () at ud.c:1769
MPT: #7  0x00002aaaabd57f62 in MPI_SGI_progress_devices () at progress.c:118
MPT: #8  MPI_SGI_progress () at progress.c:241
MPT: #9  0x00002aaaabd690d5 in MPI_SGI_request_wait (request=0x7fffffffa35c, 
MPT:     status=0x2aaaac010970 &amp;lt;mpi_sgi_status_ignore&amp;gt;, set=0x7fffffffa364, 
MPT:     gen_rc=0x7fffffffa360) at req.c:1703
MPT: #10 0x00002aaaabcd6d32 in MPI_SGI_barrier_basic (comm=6) at barrier.c:61
MPT: #11 0x00002aaaabcd6e65 in MPI_SGI_barrier (comm=6) at barrier.c:204
MPT: #12 0x00002aaaabcd70f5 in MPI_SGI_barrier (comm=&amp;lt;optimized out&amp;gt;)
MPT:     at barrier.c:166
MPT: #13 0x00002aaaabcd7193 in PMPI_Barrier (comm=1) at barrier.c:344
MPT: #14 0x00002aaaabcd72df in pmpi_barrier__ ()
MPT:    from /usr/local/sgi/mpi/mpt-2.12/opt/sgi/mpt/mpt-2.12/lib/libmpi.so
MPT: #15 0x000000000041bd07 in MAIN__ ()
MPT: #16 0x00000000004165c6 in main ()
MPT: (gdb) A debugging session is active.
MPT: 
MPT: 	Inferior 1 [process 15596] will be detached.
MPT: 
MPT: Quit anyway? (y or n) [answered Y; input not from terminal]
MPT: Detaching from program: /proc/15596/exe, process 15596

MPT: -----stack traceback ends-----

MPT: -----stack traceback ends-----

MPT: -----stack traceback ends-----

MPT: -----stack traceback ends-----

MPT: -----stack traceback ends-----

MPT: -----stack traceback ends-----
&lt;/PRE&gt;
</description>
      <pubDate>Wed, 18 Nov 2015 20:40:40 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073473#M119924</guid>
      <dc:creator>Gabriele_B_</dc:creator>
      <dc:date>2015-11-18T20:40:40Z</dc:date>
    </item>
    <item>
      <title>yes, rank 0 and task 0 are</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073474#M119925</link>
      <description>&lt;P&gt;yes, rank 0 and task 0 are the same thing.&lt;/P&gt;

&lt;P&gt;no, it doesn't tell you too much: just that all other ranks wait in an MPI_Barrier, presumably until rank 0 finishes its particularities. Of course, they'll wait forever (or until the timeout), because rank 0 exited with an error..&lt;/P&gt;</description>
      <pubDate>Wed, 18 Nov 2015 21:01:34 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073474#M119925</guid>
      <dc:creator>John_D_6</dc:creator>
      <dc:date>2015-11-18T21:01:34Z</dc:date>
    </item>
    <item>
      <title>Hey John,</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073475#M119926</link>
      <description>&lt;P&gt;Hey John,&lt;/P&gt;

&lt;P&gt;I tried this with -O2 and it gave me the line where the SIGSEGV happens:&lt;/P&gt;

&lt;P&gt;forrtl: severe (174): SIGSEGV, segmentation fault occurred&lt;/P&gt;

&lt;PRE&gt;Image              PC                Routine  Line     Source
3dpic_full_mpi.ex  0000000000862859  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  000000000086112E  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  000000000081F242  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  00000000007EBD03  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  00000000007EF8EB  Unknown  Unknown  Unknown
libpthread.so.0    00002AAAAB669810  Unknown  Unknown  Unknown
libc.so.6          00002AAAAC126C52  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  0000000000532072  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  000000000049FD13  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  000000000045B7D6  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  000000000041B001  MAIN__   545      3dpic_full_mpi.f
3dpic_full_mpi.ex  00000000004165C6  Unknown  Unknown  Unknown
libc.so.6          00002AAAAC02FC36  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  00000000004164B9  Unknown  Unknown  Unknown
srun.slurm: error: borgo010: task 0: Exited with exit code 174&lt;/PRE&gt;

&lt;P&gt;That's what I wanted. Thanks.&lt;/P&gt;

&lt;P&gt;GB&lt;/P&gt;

</description>
      <pubDate>Thu, 19 Nov 2015 15:47:07 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073475#M119926</guid>
      <dc:creator>Gabriele_B_</dc:creator>
      <dc:date>2015-11-19T15:47:07Z</dc:date>
    </item>
    <item>
      <title>Hello Gabriele,</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073476#M119927</link>
      <description>&lt;P&gt;Hello Gabriele,&lt;/P&gt;

&lt;P&gt;Good to see that you got a location back. Note that the location is inexact because of optimization, so the real statement might be some lines earlier or later. Furthermore, it seems that you only compiled the main routine with debugging info; there appear to be 3 more frames in the call stack that are part of your program:&lt;/P&gt;

&lt;PRE&gt;3dpic_full_mpi.ex  0000000000532072  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  000000000049FD13  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  000000000045B7D6  Unknown  Unknown  Unknown
3dpic_full_mpi.ex  000000000041B001  MAIN__   545      3dpic_full_mpi.f&lt;/PRE&gt;

&lt;P&gt;I suggest recompiling all routines with debugging info and rerunning your simulation so those frames are resolved as well.&lt;/P&gt;</description>
      <pubDate>Thu, 19 Nov 2015 19:46:37 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073476#M119927</guid>
      <dc:creator>John_D_6</dc:creator>
      <dc:date>2015-11-19T19:46:37Z</dc:date>
    </item>
    <item>
      <title>What do you mean for</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073477#M119928</link>
      <description>&lt;P&gt;What do you mean for compiling all the routines? at the moment I'm compiling the code in this way&lt;/P&gt;

&lt;P&gt;set -x&lt;/P&gt;

&lt;P&gt;mpif90 -I/usr/local/other/SLES11.3/ncl/gcc-4.3.4/6.3.0-static/include -I/usr/local/other/SLES11.3/silo/4.10.2/include -extend-source -r8 -c -O2 -g -traceback 3dpic_full_mpi.f&lt;/P&gt;

&lt;P&gt;mpif90 -o 3dpic_full_mpi.exe 3dpic_full_mpi.o -L/usr/local/other/SLES11.3/ncl/gcc-4.3.4/6.3.0-static/lib -L/usr/local/other/SLES11.3/silo/4.10.2/lib -lsiloh5 -lhdf5_hl -lhdf5 -lsz -lz -lm -lrt -ldl -lstdc++ -O2 -g -traceback&lt;/P&gt;
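For what it's worth, if the code were ever split into several source files, the compile flags (notably -g -traceback) would have to go on every compile line, not just the link step. A hypothetical make fragment, with invented file names (solver.f, diagnostics.f) and the include/library search paths from the commands above omitted:

```makefile
# Hypothetical multi-file layout; flags copied from the commands above.
FC     = mpif90
FFLAGS = -extend-source -r8 -O2 -g -traceback
OBJS   = 3dpic_full_mpi.o solver.o diagnostics.o
LIBS   = -lsiloh5 -lhdf5_hl -lhdf5 -lsz -lz -lm -lrt -ldl -lstdc++

3dpic_full_mpi.exe: $(OBJS)
	$(FC) -o $@ $(OBJS) $(LIBS)

# Every object gets the debug flags, so every frame can be resolved.
%.o: %.f
	$(FC) $(FFLAGS) -c $*.f
```

With a single .f file that includes the others, as below, the existing two commands already cover everything.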

&lt;P&gt;I have many subroutines in the code. There are only 2 other files, but they contain constants and variables, and I include them in the .f file shown above.&lt;/P&gt;

&lt;P&gt;Thanks&lt;/P&gt;

&lt;P&gt;GB&lt;/P&gt;</description>
      <pubDate>Thu, 19 Nov 2015 20:18:30 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073477#M119928</guid>
      <dc:creator>Gabriele_B_</dc:creator>
      <dc:date>2015-11-19T20:18:30Z</dc:date>
    </item>
    <item>
      <title>my hunch is that your code</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073478#M119929</link>
      <description>&lt;P&gt;my hunch is that your code crashes in a call to the silo or hdf5 library, since those are not compiled with debug info.&lt;/P&gt;</description>
      <pubDate>Thu, 19 Nov 2015 20:36:41 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073478#M119929</guid>
      <dc:creator>John_D_6</dc:creator>
      <dc:date>2015-11-19T20:36:41Z</dc:date>
    </item>
    <item>
      <title>I think you are quiet right,</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073479#M119930</link>
      <description>&lt;P&gt;I think you are quiet right, and I should have found the error (I know my code, so I know that's SILO the critical part).&lt;/P&gt;

&lt;P&gt;-O2 this time was very precise: that is the exact line where the code breaks.&lt;/P&gt;

&lt;P&gt;How does -O2 work? How can I know how much to trust it? You said it has a 10 lines tolerance, but if in the future I insert some useless lines with dum operations does it still maintain this 10 lines sensibility?&lt;/P&gt;

&lt;P&gt;thanks&lt;/P&gt;

&lt;P&gt;GB&lt;/P&gt;

</description>
      <pubDate>Thu, 19 Nov 2015 21:04:53 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073479#M119930</guid>
      <dc:creator>Gabriele_B_</dc:creator>
      <dc:date>2015-11-19T21:04:53Z</dc:date>
    </item>
    <item>
      <title>the 10 lines I mentioned is</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073480#M119931</link>
      <description>&lt;P&gt;the 10 lines I mentioned is just a number I made up: it could be more or less, just don't rely on it being exact. The compiler could move around statements, so there is only a loose correspondence between the order in your source code and the order in the executable.&lt;/P&gt;

&lt;P&gt;In this case the compiler can't optimize across the call into an external library, which is why you see the exact location.&lt;/P&gt;</description>
      <pubDate>Thu, 19 Nov 2015 21:16:45 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073480#M119931</guid>
      <dc:creator>John_D_6</dc:creator>
      <dc:date>2015-11-19T21:16:45Z</dc:date>
    </item>
    <item>
      <title>GB,</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073481#M119932</link>
      <description>&lt;P&gt;GB,&lt;/P&gt;

&lt;P&gt;I gather from the name of the compute node (borg...) that you are running on the Discover cluster at NASA/Goddard. You should contact the NCCS support group at support@nccs.nasa.gov.&lt;/P&gt;

&lt;P&gt;Dan&lt;/P&gt;</description>
      <pubDate>Tue, 01 Dec 2015 19:55:26 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073481#M119932</guid>
      <dc:creator>dkokron</dc:creator>
      <dc:date>2015-12-01T19:55:26Z</dc:date>
    </item>
    <item>
      <title>Dan,</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073482#M119933</link>
      <description>&lt;P&gt;Dan,&lt;/P&gt;

&lt;P&gt;I had already done that, but John's suggestions were more helpful.&lt;/P&gt;

&lt;P&gt;GB&lt;/P&gt;</description>
      <pubDate>Tue, 01 Dec 2015 20:19:51 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/error-on-many-cores-run/m-p/1073482#M119933</guid>
      <dc:creator>Gabriele_B_</dc:creator>
      <dc:date>2015-12-01T20:19:51Z</dc:date>
    </item>
  </channel>
</rss>

