<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: SIGFPE with mpiexec.hydra for Intel MPI 2019 update 7 in Intel® MPI Library</title>
    <link>https://community.intel.com/t5/Intel-MPI-Library/SIGFPE-with-mpiexec-hydra-for-Intel-MPI-2019-update-7/m-p/1204631#M7065</link>
    <description>Intel® MPI Library community thread: a SIGFPE in mpiexec.hydra with Intel MPI 2019 update 7 under Slurm, the reported workarounds, and Intel's recommendation to set I_MPI_PMI_LIBRARY explicitly when the native process manager is not used.</description>
    <pubDate>Thu, 27 Aug 2020 12:40:18 GMT</pubDate>
    <dc:creator>DrAmarpal_K_Intel</dc:creator>
    <dc:date>2020-08-27T12:40:18Z</dc:date>
    <item>
      <title>SIGFPE with mpiexec.hydra for Intel MPI 2019 update 7</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/SIGFPE-with-mpiexec-hydra-for-Intel-MPI-2019-update-7/m-p/1168132#M6488</link>
      <description>&lt;P&gt;If I use Intel MPI 2019 update 7 in a Slurm configuration on two cores on two separate nodes, I get a SIGFPE here (according to gdb on the generated core file):&lt;/P&gt;&lt;P&gt;#0 0x00000000004436ed in ipl_create_domains (pi=0x0, scale=4786482) at ../../../../../src/pm/i_hydra/../../intel/ipl/include/../src/ipl_service.c:2240&lt;/P&gt;&lt;P&gt;This happens only with mpirun / mpiexec.hydra, e.g. with "mpirun -n 2 ./test".&lt;/P&gt;&lt;P&gt;I know of three workarounds, any of which lets me run successfully, but I thought you or others should know about this crash:&lt;/P&gt;&lt;P&gt;1. Set I_MPI_PMI_LIBRARY=libpmi2.so and use "srun -n 2 ./test" (with Slurm configured to use pmi2).&lt;/P&gt;&lt;P&gt;2. Use I_MPI_HYDRA_TOPOLIB=ipl.&lt;/P&gt;&lt;P&gt;3. Use the "legacy" mpiexec.hydra.&lt;/P&gt;</description>
      <pubDate>Tue, 21 Apr 2020 23:02:42 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/SIGFPE-with-mpiexec-hydra-for-Intel-MPI-2019-update-7/m-p/1168132#M6488</guid>
      <dc:creator>Bart_O_</dc:creator>
      <dc:date>2020-04-21T23:02:42Z</dc:date>
    </item>
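    <!--
      A shell sketch of the three workarounds reported in the post above. The
      environment variable names and values are taken from the post itself; the
      legacy mpiexec.hydra install path shown is a hypothetical example and
      varies by installation.

        # 1) Bypass Hydra: point Intel MPI at Slurm's PMI2 library and launch
        #    with srun (assumes Slurm is configured to use pmi2).
        export I_MPI_PMI_LIBRARY=libpmi2.so
        srun -n 2 ./test

        # 2) Keep mpirun, but switch Hydra's topology detection backend to "ipl".
        I_MPI_HYDRA_TOPOLIB=ipl mpirun -n 2 ./test

        # 3) Fall back to the "legacy" mpiexec.hydra binary (path is assumed).
        "$I_MPI_ROOT/intel64/bin/legacy/mpiexec.hydra" -n 2 ./test
    -->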
    <item>
      <title>Some more details:</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/SIGFPE-with-mpiexec-hydra-for-Intel-MPI-2019-update-7/m-p/1168133#M6489</link>
      <description>&lt;P&gt;Some more details:&lt;/P&gt;&lt;P&gt;OS: CentOS 7.7, Linux blg8616.int.ets1.calculquebec.ca 3.10.0-1062.12.1.el7.x86_64 #1 SMP Tue Feb 4 23:02:59 UTC 2020 x86_64 Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz GenuineIntel GNU/Linux&lt;/P&gt;&lt;P&gt;If I run the test program (built with "mpiicc ${I_MPI_ROOT}/test/test.c -g -o test") with "I_MPI_HYDRA_TOPOLIB=ipl" set, I get this:&lt;/P&gt;&lt;P&gt;[oldeman@blg8616 test]$ I_MPI_DEBUG=5 mpirun -n 2 ./test&lt;BR /&gt;[0] MPI startup(): libfabric version: 1.10.0a1-impi&lt;BR /&gt;[0] MPI startup(): libfabric provider: mlx&lt;BR /&gt;[0] MPI startup(): Rank  Pid     Node name                         Pin cpu&lt;BR /&gt;[0] MPI startup(): 0     173711  blg8616.int.ets1.calculquebec.ca  37&lt;BR /&gt;[0] MPI startup(): 1     220728  blg8621.int.ets1.calculquebec.ca  29&lt;BR /&gt;[0] MPI startup(): I_MPI_CC=icc&lt;BR /&gt;[0] MPI startup(): I_MPI_CXX=icpc&lt;BR /&gt;[0] MPI startup(): I_MPI_FC=ifort&lt;BR /&gt;[0] MPI startup(): I_MPI_F90=ifort&lt;BR /&gt;[0] MPI startup(): I_MPI_F77=ifort&lt;BR /&gt;[0] MPI startup(): I_MPI_ROOT=/cvmfs/soft.computecanada.ca/easybuild/software/2019/avx2/Compiler/intel2020/intelmpi/2019.7.217&lt;BR /&gt;[0] MPI startup(): I_MPI_LINK=opt&lt;BR /&gt;[0] MPI startup(): I_MPI_MPIRUN=mpirun&lt;BR /&gt;[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=ipl&lt;BR /&gt;[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm&lt;BR /&gt;[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default&lt;BR /&gt;[0] MPI startup(): I_MPI_DEBUG=5&lt;BR /&gt;Hello world: rank 0 of 2 running on blg8616.int.ets1.calculquebec.ca&lt;BR /&gt;Hello world: rank 1 of 2 running on blg8621.int.ets1.calculquebec.ca&lt;/P&gt;&lt;P&gt;but without that variable set, it gives me this (and likewise with just "hostname"):&lt;/P&gt;&lt;P&gt;[oldeman@blg8616 test]$ I_MPI_DEBUG=5 mpirun -n 2 ./test&lt;BR /&gt;srun: error: blg8621: task 1: Floating point exception (core dumped)&lt;BR /&gt;srun: error: blg8616: task 0: Floating point exception (core dumped)&lt;/P&gt;&lt;P&gt;[mpiexec@blg8616.int.ets1.calculquebec.ca] wait_proxies_to_terminate (../../../../../src/pm/i_hydra/mpiexec/intel/i_mpiexec.c:528): downstream from host blg8616 exited with status 136&lt;BR /&gt;[mpiexec@blg8616.int.ets1.calculquebec.ca] main (../../../../../src/pm/i_hydra/mpiexec/mpiexec.c:2114): assert (exitcodes != NULL) failed&lt;/P&gt;</description>
      <pubDate>Wed, 22 Apr 2020 00:11:21 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/SIGFPE-with-mpiexec-hydra-for-Intel-MPI-2019-update-7/m-p/1168133#M6489</guid>
      <dc:creator>Bart_O_</dc:creator>
      <dc:date>2020-04-22T00:11:21Z</dc:date>
    </item>
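    <!--
      A minimal reproduction sketch distilled from the post above. The compile
      and run lines come from the post; the two-node Slurm allocation command
      is an illustrative assumption.

        # build the MPI test program bundled with Intel MPI, with debug info
        mpiicc "${I_MPI_ROOT}/test/test.c" -g -o test

        # from inside a two-node Slurm allocation (e.g. "salloc -N 2 -n 2",
        # illustrative), run one rank per node with debug output enabled;
        # with Intel MPI 2019 update 7 this crashed with SIGFPE in ipl_create_domains
        I_MPI_DEBUG=5 mpirun -n 2 ./test
    -->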
    <item>
      <title>Hi Bart,</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/SIGFPE-with-mpiexec-hydra-for-Intel-MPI-2019-update-7/m-p/1168134#M6490</link>
      <description>&lt;P&gt;Hi Bart,&lt;/P&gt;&lt;P&gt;Thanks for reaching out to us.&lt;/P&gt;&lt;P&gt;We will investigate this issue further and will get back to you soon.&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;&lt;P&gt;Prasanth&lt;/P&gt;</description>
      <pubDate>Wed, 22 Apr 2020 10:47:49 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/SIGFPE-with-mpiexec-hydra-for-Intel-MPI-2019-update-7/m-p/1168134#M6490</guid>
      <dc:creator>PrasanthD_intel</dc:creator>
      <dc:date>2020-04-22T10:47:49Z</dc:date>
    </item>
    <item>
      <title>Re: SIGFPE with mpiexec.hydra for Intel MPI 2019 update 7</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/SIGFPE-with-mpiexec-hydra-for-Intel-MPI-2019-update-7/m-p/1194234#M6959</link>
      <description>&lt;P&gt;Hi Bart,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Intel MPI Library 2019 Update 8 has just been released. Could you please rerun your experiments with mpiexec.hydra and report your findings?&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Best regards,&lt;/P&gt;&lt;P&gt;Amar&lt;/P&gt;</description>
      <pubDate>Thu, 23 Jul 2020 07:51:05 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/SIGFPE-with-mpiexec-hydra-for-Intel-MPI-2019-update-7/m-p/1194234#M6959</guid>
      <dc:creator>DrAmarpal_K_Intel</dc:creator>
      <dc:date>2020-07-23T07:51:05Z</dc:date>
    </item>
    <item>
      <title>Re: SIGFPE with mpiexec.hydra for Intel MPI 2019 update 7</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/SIGFPE-with-mpiexec-hydra-for-Intel-MPI-2019-update-7/m-p/1204631#M7065</link>
      <description>&lt;P&gt;Hi Bart,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Having not received a response from you for over a month, I am closing this thread. Whenever Intel MPI Library's native process manager is not used, we recommend setting the PMI library explicitly, using the I_MPI_PMI_LIBRARY environment variable.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;For more details, please refer to the following links:&lt;/P&gt;&lt;P&gt;[1] &lt;A href="https://software.intel.com/content/www/us/en/develop/articles/how-to-use-slurm-pmi-with-the-intel-mpi-library-for-linux.html" rel="noopener noreferrer" target="_blank"&gt;https://software.intel.com/content/www/us/en/develop/articles/how-to-use-slurm-pmi-with-the-intel-mpi-library-for-linux.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;[2] &lt;A href="https://slurm.schedmd.com/mpi_guide.html" rel="noopener noreferrer" target="_blank"&gt;https://slurm.schedmd.com/mpi_guide.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&lt;I&gt;This issue will be treated as resolved and we will no longer respond to this thread. If you require additional assistance from Intel, please start a new thread. Any further interaction in this thread will be considered community only.&lt;/I&gt;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Best regards,&lt;/P&gt;&lt;P&gt;Amar&lt;/P&gt;</description>
      <pubDate>Thu, 27 Aug 2020 12:40:18 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/SIGFPE-with-mpiexec-hydra-for-Intel-MPI-2019-update-7/m-p/1204631#M7065</guid>
      <dc:creator>DrAmarpal_K_Intel</dc:creator>
      <dc:date>2020-08-27T12:40:18Z</dc:date>
    </item>
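    <!--
      A hedged sketch of the recommendation in the closing post: when launching
      under Slurm rather than Intel MPI's native process manager, point
      I_MPI_PMI_LIBRARY at Slurm's PMI library before starting ranks. The
      /usr/lib64 path is a common location on CentOS but is an assumption;
      check where your Slurm installation puts libpmi2.so.

        # tell Intel MPI which PMI library Slurm provides (path is assumed)
        export I_MPI_PMI_LIBRARY=/usr/lib64/libpmi2.so

        # launch through Slurm (assumes Slurm is configured to use pmi2)
        srun -n 2 ./test
    -->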
  </channel>
</rss>

