<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Coarray runtime bug in Intel® Fortran Compiler</title>
    <link>https://community.intel.com/t5/Intel-Fortran-Compiler/Coarray-runtime-bug/m-p/1374152#M160881</link>
    <description>Forum thread: a reduced coarray test case built with ifort 2021.5.0 segfaults when I_MPI_DEVICE and I_MPI_FABRICS are set to shm, with workarounds from Intel.</description>
    <pubDate>Mon, 04 Apr 2022 18:09:57 GMT</pubDate>
    <dc:creator>NCarlson</dc:creator>
    <dc:date>2022-04-04T18:09:57Z</dc:date>
    <item>
      <title>Coarray runtime bug</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/Coarray-runtime-bug/m-p/1374152#M160881</link>
      <description>&lt;P&gt;I'm hitting a segfault with some coarray code when running with 1 image.&amp;nbsp; I've reduced it to this test case:&lt;/P&gt;
&lt;LI-CODE lang="fortran"&gt;integer :: x(10), y(10)
call foo(x, y)
contains
  subroutine foo(src, dest)
    integer, intent(in), target :: src(:)
    integer, intent(inout) :: dest(:)
    type box
      integer, pointer :: data(:)
    end type
    type(box), allocatable :: buffer[:]
    allocate(buffer[*])
    if (this_image() == 1) buffer%data =&amp;gt; src   ! image 1 publishes a pointer to src
    sync all
    dest = buffer[1]%data   ! all images copy back through image 1's pointer
  end subroutine
end
&lt;/LI-CODE&gt;
&lt;P&gt;I'm using ifort 2021.5.0 with these compiler options: -coarray -coarray-num-images=1&lt;/P&gt;
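&lt;P&gt;For concreteness, the full build-and-run line looks like this (the source file name is just a placeholder):&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;ifort -coarray -coarray-num-images=1 testcase.f90 -o testcase
./testcase&lt;/LI-CODE&gt;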
&lt;P&gt;I'm also setting&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;export I_MPI_DEVICE=shm
export I_MPI_FABRICS=shm&lt;/LI-CODE&gt;
&lt;P&gt;per this &lt;A href="https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/optimization-and-programming-guide/coarrays-1/using-coarrays.html" target="_blank" rel="noopener"&gt;Intel page&lt;/A&gt;, to keep coarray executables from pounding on the loopback network device and getting really poor performance (I'm running on a single shared-memory node).&lt;/P&gt;
&lt;P&gt;With this setup the executable segfaults; however, if I don't set those I_MPI variables, the test case runs without error.&lt;/P&gt;</description>
      <pubDate>Mon, 04 Apr 2022 18:09:57 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/Coarray-runtime-bug/m-p/1374152#M160881</guid>
      <dc:creator>NCarlson</dc:creator>
      <dc:date>2022-04-04T18:09:57Z</dc:date>
    </item>
    <item>
      <title>Re: Coarray runtime bug</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/Coarray-runtime-bug/m-p/1374520#M160894</link>
      <description>&lt;P&gt;I duplicated your issue. I'll keep looking into it.&lt;/P&gt;
&lt;P&gt;There are two workarounds that I see:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Compile with&amp;nbsp;&lt;SPAN&gt;-coarray=shared.&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Create a config file; a sketch follows this list. &lt;A href="https://www.intel.com/content/www/us/en/developer/articles/technical/distributed-memory-coarray-fortran-with-the-intel-fortran-compiler-for-linux-essential.html" target="_blank" rel="noopener"&gt;This article&lt;/A&gt; has the details. It's written for distributed environments, but the concepts are the same for shared memory.&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/OL&gt;
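&lt;P&gt;Roughly, the two look like this (file and program names are placeholders; see the article for the full config-file details):&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;# Workaround 1: build for shared memory explicitly
ifort -coarray=shared -coarray-num-images=1 testcase.f90 -o testcase

# Workaround 2: point the compiler at a config file of mpiexec-style
# launch options, e.g. a cafconfig.txt containing a line such as
#   -genvall -n 1 ./testcase
ifort -coarray -coarray-config-file=cafconfig.txt testcase.f90 -o testcase&lt;/LI-CODE&gt;</description>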
      <pubDate>Tue, 05 Apr 2022 20:36:00 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/Coarray-runtime-bug/m-p/1374520#M160894</guid>
      <dc:creator>Barbara_P_Intel</dc:creator>
      <dc:date>2022-04-05T20:36:00Z</dc:date>
    </item>
    <item>
      <title>Re: Coarray runtime bug</title>
      <link>https://community.intel.com/t5/Intel-Fortran-Compiler/Coarray-runtime-bug/m-p/1375661#M160929</link>
      <description>&lt;P&gt;I learned that there is a known MPI issue that applies to your case, along with a workaround:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;SPAN&gt;A single-process run with I_MPI_FABRICS=shm may lead to a crash. Use the out-of-the-box settings or I_MPI_FABRICS=shm:ofi instead.&lt;/SPAN&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;That worked for me with your reproducer.&lt;/P&gt;&lt;P&gt;This will be documented in the MPI Library Release Notes in the next release.&lt;/P&gt;
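&lt;P&gt;In other words (the executable name is just a placeholder):&lt;/P&gt;&lt;LI-CODE lang="bash"&gt;# either drop the fabric overrides entirely (out-of-the-box settings)...
unset I_MPI_DEVICE I_MPI_FABRICS
# ...or let shm fall back to ofi for single-process runs
export I_MPI_FABRICS=shm:ofi
./testcase&lt;/LI-CODE&gt;</description>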
      <pubDate>Fri, 08 Apr 2022 18:10:28 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-Fortran-Compiler/Coarray-runtime-bug/m-p/1375661#M160929</guid>
      <dc:creator>Barbara_P_Intel</dc:creator>
      <dc:date>2022-04-08T18:10:28Z</dc:date>
    </item>
  </channel>
</rss>

