<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: mpirun hangs up  up in Intel® MPI Library</title>
    <link>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1433869#M10092</link>
    <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;We have not heard back from you. Could you please provide an update on your issue?&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;&lt;P&gt;Varsha&lt;/P&gt;&lt;BR /&gt;</description>
    <pubDate>Tue, 29 Nov 2022 06:07:36 GMT</pubDate>
    <dc:creator>VarshaS_Intel</dc:creator>
    <dc:date>2022-11-29T06:07:36Z</dc:date>
    <item>
      <title>mpirun hangs up  up</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1430867#M10051</link>
      <description>&lt;P&gt;Hi,&lt;BR /&gt;While trying to run Intel MPI on a single node, the application gets stuck.&lt;BR /&gt;&lt;BR /&gt;Here is the mpirun version -&amp;nbsp;&lt;BR /&gt;Intel(R) MPI Library for Linux* OS, Version 2019 Update 4 Build 20190430 (id: cbdd16069)&lt;BR /&gt;Copyright 2003-2019, Intel Corporation.&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;Here are the logs when I trigger mpirun -&amp;nbsp;&lt;BR /&gt;I_MPI_DEBUG=16 I_MPI_HYDRA_DEBUG=on FI_LOG_LEVEL=debug /usr/diags/mpi/impi/2019.4.243/intel64/bin/mpirun -np 1 /bin/date&lt;BR /&gt;[mpiexec@cf-icex-82-1] Launch arguments: /usr/diags/mpi/impi/2019.4.243//intel64/bin//hydra_bstrap_proxy --upstream-host cf-icex-82-1 --upstream-port 45883 --pgid 0 --launcher ssh --launcher-number 0 --base-path /usr/diags/mpi/impi/2019.4.243//intel64/bin/ --tree-width 16 --tree-level 1 --time-left -1 --collective-launch 1 --debug --proxy-id 0 --node-id 0 --subtree-size 1 --upstream-fd 7 /usr/diags/mpi/impi/2019.4.243//intel64/bin//hydra_pmi_proxy --usize -1 --auto-cleanup 1 --abort-signal 9&lt;BR /&gt;&lt;BR /&gt;I have to terminate mpirun by issuing Ctrl+C, and&amp;nbsp;&lt;/P&gt;
 &lt;P&gt;Here's what I see in top -&amp;nbsp;&lt;BR /&gt;PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND&lt;BR /&gt;77177 root 20 0 17228 3056 2812 R 100.0 0.006 42:57.39 hydra_pmi_proxy&lt;BR /&gt;&lt;BR /&gt;OS -&amp;nbsp;SUSE Linux Enterprise Server 15 SP4&lt;BR /&gt;&lt;BR /&gt;Tried with FI_PROVIDER=TCP and verbs; the result is the same.&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 17 Nov 2022 08:59:36 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1430867#M10051</guid>
      <dc:creator>psing51</dc:creator>
      <dc:date>2022-11-17T08:59:36Z</dc:date>
    </item>
    <item>
      <title>Re: mpirun hangs up  up</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1430881#M10053</link>
      <description>&lt;P&gt;Here's what I have in strace -&amp;nbsp;&lt;BR /&gt;access("/usr/diags/mpi/impi/2019.4.243/intel64/bin/mpiexec.hydra", R_OK) = 0&lt;BR /&gt;stat("/usr/diags/mpi/impi/2019.4.243/intel64/bin/mpiexec.hydra", {st_mode=S_IFREG|0755, st_size=2972372, ...}) = 0&lt;BR /&gt;stat("/usr/diags/mpi/impi/2019.4.243/intel64/bin/mpiexec.hydra", {st_mode=S_IFREG|0755, st_size=2972372, ...}) = 0&lt;BR /&gt;geteuid() = 0&lt;BR /&gt;getegid() = 0&lt;BR /&gt;getuid() = 0&lt;BR /&gt;getgid() = 0&lt;BR /&gt;access("/usr/diags/mpi/impi/2019.4.243/intel64/bin/mpiexec.hydra", X_OK) = 0&lt;BR /&gt;stat("/usr/diags/mpi/impi/2019.4.243/intel64/bin/mpiexec.hydra", {st_mode=S_IFREG|0755, st_size=2972372, ...}) = 0&lt;BR /&gt;geteuid() = 0&lt;BR /&gt;getegid() = 0&lt;BR /&gt;getuid() = 0&lt;BR /&gt;getgid() = 0&lt;BR /&gt;access("/usr/diags/mpi/impi/2019.4.243/intel64/bin/mpiexec.hydra", R_OK) = 0&lt;BR /&gt;rt_sigprocmask(SIG_BLOCK, [INT CHLD], [], 8) = 0&lt;BR /&gt;rt_sigprocmask(SIG_BLOCK, [CHLD], [INT CHLD], 8) = 0&lt;BR /&gt;rt_sigprocmask(SIG_SETMASK, [INT CHLD], NULL, 8) = 0&lt;BR /&gt;lseek(255, -515, SEEK_CUR) = 3270&lt;BR /&gt;clone(child_stack=NULL, flags=CLONE_CHILD_CLEARTID|CLONE_CHILD_SETTID|SIGCHLD, child_tidptr=0x7f6fb33626d0) = 21419&lt;BR /&gt;rt_sigprocmask(SIG_SETMASK, [], NULL, 8) = 0&lt;BR /&gt;rt_sigprocmask(SIG_BLOCK, [CHLD], [], 8) = 0&lt;BR /&gt;rt_sigprocmask(SIG_SETMASK, [], NULL, 8) = 0&lt;BR /&gt;rt_sigprocmask(SIG_BLOCK, [CHLD], [], 8) = 0&lt;BR /&gt;rt_sigaction(SIGINT, {sa_handler=0x55d1f7cb8d10, sa_mask=[], sa_flags=SA_RESTORER, sa_restorer=0x7f6fb294cd50}, {sa_handler=SIG_DFL, sa_mask=[], sa_flags=SA_RESTORER, sa_restorer=0x7f6fb294cd50}, 8) = 0&lt;BR /&gt;wait4(-1,&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 17 Nov 2022 09:52:56 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1430881#M10053</guid>
      <dc:creator>psing51</dc:creator>
      <dc:date>2022-11-17T09:52:56Z</dc:date>
    </item>
    <item>
      <title>Re: mpirun hangs up  up</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1430882#M10054</link>
      <description>&lt;P&gt;When I run mpiexec.hydra -np 1 hostname, here's what I get -&amp;nbsp;&lt;BR /&gt;write(6, "\2\0\0\0\6\0\0\0\0\0\0\0\0\0\0\0\20\314\32\320", 20) = 20&lt;BR /&gt;write(6, "/root\0", 6) = 6&lt;BR /&gt;write(6, "\3\0\0\0\31\0\0\0\1\0\0\0\25\0\0\0\4\0\0\0", 20) = 20&lt;BR /&gt;write(6, "\1\0\0\0\r\0\0\0/bin/hostname\0\0\0\0", 25) = 25&lt;BR /&gt;write(6, "\10\0\0\0\21\0\0\0\360&amp;gt;m\0\0\0\0\0 m\0", 20) = 20&lt;BR /&gt;write(6, "(vector,(0,1,1))\0", 17) = 17&lt;BR /&gt;getpid() = 21442&lt;BR /&gt;write(6, "\4\0\0\0\f\0\0\0\37\242@\0\0\0\0\0\10\0\0\0", 20) = 20&lt;BR /&gt;write(6, "kvs_21442_0\0", 12) = 12&lt;BR /&gt;write(6, "\5\0\0\0\0\0\0\0\37\242@\0\0\0\0\0\10\0\0\0", 20) = 20&lt;BR /&gt;poll([{fd=4, events=POLLIN}, {fd=6, events=POLLIN}, {fd=10, events=POLLIN}, {fd=12, events=POLLIN}, {fd=0, events=POLLIN}], 5, -1&lt;BR /&gt;) = 1 ([{fd=0, revents=POLLIN}])&lt;BR /&gt;read(0, "\n", 16384) = 1&lt;BR /&gt;write(6, "\20\0\0\0\1\0\0\0\320\24\22\2\0\0\0\0@\23\22\2", 20) = 20&lt;BR /&gt;write(6, "\n", 1) = 1&lt;BR /&gt;poll([{fd=4, events=POLLIN}, {fd=6, events=POLLIN}, {fd=10, events=POLLIN}, {fd=12, events=POLLIN}, {fd=0, events=POLLIN}], 5, -1&lt;BR /&gt;) = 1 ([{fd=0, revents=POLLIN}])&lt;BR /&gt;read(0, "\n", 16384) = 1&lt;BR /&gt;write(6, "\20\0\0\0\1\0\0\0\320\24\22\2\0\0\0\0@\23\22\2", 20) = 20&lt;BR /&gt;write(6, "\n", 1) = 1&lt;BR /&gt;poll([{fd=4, events=POLLIN}, {fd=6, events=POLLIN}, {fd=10, events=POLLIN}, {fd=12, events=POLLIN}, {fd=0, events=POLLIN}], 5, -1&lt;BR /&gt;) = 1 ([{fd=0, revents=POLLIN}])&lt;BR /&gt;read(0, "\n", 16384) = 1&lt;BR /&gt;write(6, "\20\0\0\0\1\0\0\0\320\24\22\2\0\0\0\0@\23\22\2", 20) = 20&lt;BR /&gt;write(6, "\n", 1) = 1&lt;BR /&gt;poll([{fd=4, events=POLLIN}, {fd=6, events=POLLIN}, {fd=10, events=POLLIN}, {fd=12, events=POLLIN}, {fd=0, events=POLLIN}], 5, -1&lt;BR /&gt;) = 1 ([{fd=0, revents=POLLIN}])&lt;BR /&gt;read(0, "\n", 16384) = 1&lt;BR /&gt;write(6, "\20\0\0\0\1\0\0\0\320\24\22\2\0\0\0\0@\23\22\2", 20) = 20&lt;BR /&gt;write(6, "\n", 1) = 1&lt;BR /&gt;poll([{fd=4, events=POLLIN}, {fd=6, events=POLLIN}, {fd=10, events=POLLIN}, {fd=12, events=POLLIN}, {fd=0, events=POLLIN}], 5, -1&lt;BR /&gt;) = 1 ([{fd=0, revents=POLLIN}])&lt;BR /&gt;read(0, "\n", 16384) = 1&lt;BR /&gt;write(6, "\20\0\0\0\1\0\0\0\320\24\22\2\0\0\0\0@\23\22\2", 20) = 20&lt;BR /&gt;write(6, "\n", 1) = 1&lt;BR /&gt;poll([{fd=4, events=POLLIN}, {fd=6, events=POLLIN}, {fd=10, events=POLLIN}, {fd=12, events=POLLIN}, {fd=0, events=POLLIN}], 5, -1&lt;BR /&gt;) = 1 ([{fd=0, revents=POLLIN}])&lt;BR /&gt;read(0, "\n", 16384) = 1&lt;BR /&gt;write(6, "\20\0\0\0\1\0\0\0\320\24\22\2\0\0\0\0@\23\22\2", 20) = 20&lt;BR /&gt;write(6, "\n", 1) = 1&lt;BR /&gt;poll([{fd=4, events=POLLIN}, {fd=6, events=POLLIN}, {fd=10, events=POLLIN}, {fd=12, events=POLLIN}, {fd=0, events=POLLIN}], 5, -1&lt;BR /&gt;) = 1 ([{fd=0, revents=POLLIN}])&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 17 Nov 2022 09:54:59 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1430882#M10054</guid>
      <dc:creator>psing51</dc:creator>
      <dc:date>2022-11-17T09:54:59Z</dc:date>
    </item>
    <item>
      <title>Re: mpirun hangs up  up</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1431276#M10060</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks for posting in Intel Communities.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Could you please provide us with the complete debug log after running the below command at your end?&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;I_MPI_DEBUG=30 I_MPI_HYDRA_DEBUG=on FI_LOG_LEVEL=debug /usr/diags/mpi/impi/2019.4.243/intel64/bin/mpirun -n 2 ./hello&lt;/LI-CODE&gt;
 &lt;P&gt;Please find the hello_mpi.cpp code attached below.&lt;/P&gt;
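 &lt;P&gt;&amp;nbsp;&lt;/P&gt;
 &lt;P&gt;(The attached file itself is not carried in this feed; the sketch below is a minimal MPI hello-world of the kind typically used for such a check. Compile it with an Intel MPI wrapper, e.g. mpiicpc hello_mpi.cpp -o hello, so that ./hello matches the run command above.)&lt;/P&gt;
 &lt;LI-CODE lang="cpp"&gt;// hello_mpi.cpp -- illustrative sketch only; not the actual attachment from the original post.
#include &amp;lt;mpi.h&amp;gt;
#include &amp;lt;cstdio&amp;gt;

int main(int argc, char** argv) {
    MPI_Init(&amp;amp;argc, &amp;amp;argv);               // start the MPI runtime

    int rank = 0, size = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &amp;amp;rank);   // rank of this process
    MPI_Comm_size(MPI_COMM_WORLD, &amp;amp;size);   // total number of ranks

    char name[MPI_MAX_PROCESSOR_NAME];
    int len = 0;
    MPI_Get_processor_name(name, &amp;amp;len);     // host this rank runs on

    std::printf("Hello from rank %d of %d on %s\n", rank, size, name);

    MPI_Finalize();                               // shut down the MPI runtime
    return 0;
}&lt;/LI-CODE&gt;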
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;
&lt;P&gt;Varsha&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 18 Nov 2022 13:45:49 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1431276#M10060</guid>
      <dc:creator>VarshaS_Intel</dc:creator>
      <dc:date>2022-11-18T13:45:49Z</dc:date>
    </item>
    <item>
      <title>Re: mpirun hangs up  up</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1433869#M10092</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;We have not heard back from you. Could you please provide an update on your issue?&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;&lt;P&gt;Varsha&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Tue, 29 Nov 2022 06:07:36 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1433869#M10092</guid>
      <dc:creator>VarshaS_Intel</dc:creator>
      <dc:date>2022-11-29T06:07:36Z</dc:date>
    </item>
    <item>
      <title>Re: mpirun hangs up  up</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1436094#M10107</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;We have not heard back from you. This thread will no longer be monitored by Intel. If you need further assistance, please post a new question.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;&lt;P&gt;Varsha&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Tue, 06 Dec 2022 12:14:53 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/mpirun-hangs-up-up/m-p/1436094#M10107</guid>
      <dc:creator>VarshaS_Intel</dc:creator>
      <dc:date>2022-12-06T12:14:53Z</dc:date>
    </item>
  </channel>
</rss>

