<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: "external32" Data Representation of MPI in Intel® MPI Library</title>
    <link>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1486475#M10605</link>
    <description>&lt;P&gt;Thank you for your reply. I will keep an eye&amp;nbsp;&lt;SPAN&gt;on this thread.&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Tue, 16 May 2023 05:09:50 GMT</pubDate>
    <dc:creator>Site</dc:creator>
    <dc:date>2023-05-16T05:09:50Z</dc:date>
    <item>
      <title>"external32" Data Representation of MPI</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1484437#M10587</link>
      <description>&lt;P&gt;Intel MPI writes incorrect output for double and complex numbers when the data representation is set to "external32". According to the &lt;A href="https://www.mpi-forum.org/docs/mpi-3.1/mpi31-report/node333.htm" target="_blank" rel="noopener"&gt;MPI standard&lt;/A&gt;, 'All floating point values are in big-endian IEEE format &lt;SPAN&gt;of the appropriate size&lt;/SPAN&gt;'. However, on an x86_64 machine, Intel MPI treats one 8-byte double as two 4-byte floats and swaps the endianness of each half separately. Moreover, the output for a complex number is gibberish.&lt;/P&gt;
&lt;P&gt;The following example, "a.c", writes 1.0 + 2.0i to the file "test.out", first as two doubles and then as one double complex, in "native" and "external32" mode respectively:&lt;/P&gt;
&lt;LI-CODE lang="cpp"&gt;#include &amp;lt;complex.h&amp;gt;
#include "mpi.h"

int main(int argc, char *argv[]) {
    int myrank;
    double buf[2] = {1.0, 2.0};
    double complex num = 1.0 + 2.0 * I;
    MPI_File fh;
    MPI_Status status;

    MPI_Init(&amp;amp;argc, &amp;amp;argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &amp;amp;myrank);

    MPI_File_open(MPI_COMM_WORLD, "test.out", MPI_MODE_WRONLY | MPI_MODE_CREATE, MPI_INFO_NULL, &amp;amp;fh);

    if (myrank == 0) {
        /* write the two doubles in buf */
        /* little-endian */
        MPI_File_set_view(fh, 0, MPI_DOUBLE, MPI_DOUBLE, "native", MPI_INFO_NULL);
        MPI_File_write(fh, buf, 2, MPI_DOUBLE, &amp;amp;status);
        /* big-endian */
        MPI_File_set_view(fh, 2*8, MPI_DOUBLE, MPI_DOUBLE, "external32", MPI_INFO_NULL);
        MPI_File_write(fh, buf, 2, MPI_DOUBLE, &amp;amp;status);
        /* write the complex double num */
        /* little-endian */
        MPI_File_set_view(fh, 4*8, MPI_C_DOUBLE_COMPLEX, MPI_C_DOUBLE_COMPLEX, "native", MPI_INFO_NULL);
        MPI_File_write(fh, &amp;amp;num, 1, MPI_C_DOUBLE_COMPLEX, &amp;amp;status);
        /* big-endian */
        MPI_File_set_view(fh, 6*8, MPI_C_DOUBLE_COMPLEX, MPI_C_DOUBLE_COMPLEX, "external32", MPI_INFO_NULL);
        MPI_File_write(fh, &amp;amp;num, 1, MPI_C_DOUBLE_COMPLEX, &amp;amp;status);
    }

    MPI_File_close(&amp;amp;fh);

    return 0;
}&lt;/LI-CODE&gt;
&lt;P&gt;Compiling and running it gives&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;mpiicc a.c
mpirun -np 1 ./a.out
xxd test.out
0000000: 0000 0000 0000 f03f 0000 0000 0000 0040  .......?.......@ #native
0000010: 0000 0000 3ff0 0000 0000 0000 4000 0000  ....?.......@... #external32
0000020: 0000 0000 0000 f03f 0000 0000 0000 0040  .......?.......@ #native
0000030: 504d 495f 4644 0000 0000 0000 0000 0000  PMI_FD.......... #external32&lt;/LI-CODE&gt;
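&lt;P&gt;The "external32" rows above are consistent with byte-swapping each 4-byte half of the 8-byte double on its own. A minimal Python sketch (my reconstruction of the observed dump, not Intel MPI internals) reproduces the first faulty word:&lt;/P&gt;

```python
import struct

# Correct big-endian ("external32") encoding of 1.0
big = struct.pack(">d", 1.0)      # 3f f0 00 00 00 00 00 00
# Little-endian layout, as a "native" write produces on x86_64
little = big[::-1]                # 00 00 00 00 00 00 f0 3f
# Reproduce the faulty output: byte-swap each 4-byte half independently,
# as if the 8-byte double were two 4-byte floats
faulty = little[:4][::-1] + little[4:][::-1]
print(faulty.hex())               # 000000003ff00000, matching the dump above
```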
&lt;P&gt;An equivalent Python program, "a.py", is&lt;/P&gt;
&lt;LI-CODE lang="python"&gt;import numpy as np

buf = np.array([1.0, 2.0])
num = np.array([1.0 + 2.0j])

with open('test_py.out', 'wb') as f:  # binary mode for tofile
    buf.tofile(f)
    buf.astype('&amp;gt;f8').tofile(f)
    num.tofile(f)
    num.astype('&amp;gt;c16').tofile(f)&lt;/LI-CODE&gt;
&lt;P&gt;and the output is&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;python a.py
xxd test_py.out
00000000: 0000 0000 0000 f03f 0000 0000 0000 0040  .......?.......@
00000010: 3ff0 0000 0000 0000 4000 0000 0000 0000  ?.......@.......
00000020: 0000 0000 0000 f03f 0000 0000 0000 0040  .......?.......@
00000030: 3ff0 0000 0000 0000 4000 0000 0000 0000  ?.......@.......&lt;/LI-CODE&gt;
&lt;P&gt;I think the "external32" outputs of the c code should be the same as the python code outputs. Did I misunderstand the external32 MPI standard?&lt;/P&gt;</description>
      <pubDate>Tue, 09 May 2023 08:49:26 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1484437#M10587</guid>
      <dc:creator>Site</dc:creator>
      <dc:date>2023-05-09T08:49:26Z</dc:date>
    </item>
    <item>
      <title>Re: "external32" Data Representation of MPI</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1484820#M10588</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thanks for posting in Intel Communities.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Could you please provide the following details:&lt;/P&gt;&lt;P&gt;1. Intel oneAPI Toolkit version along with Intel MPI version&lt;/P&gt;&lt;P&gt;2. OS and hardware details&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;&lt;P&gt;Shaik Rabiya&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Wed, 10 May 2023 09:19:39 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1484820#M10588</guid>
      <dc:creator>RabiyaSK_Intel</dc:creator>
      <dc:date>2023-05-10T09:19:39Z</dc:date>
    </item>
    <item>
      <title>Re: "external32" Data Representation of MPI</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1484876#M10589</link>
      <description>&lt;P&gt;OK. I tested on&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;System:    Kernel: 5.14.21-150400.24.18-default x86_64 bits: 64 Desktop: N/A Distro: openSUSE Leap 15.4
Machine:   Type: Server System: Sugon product: I620-G30 v: Purley serial: &amp;lt;superuser required&amp;gt;
           Mobo: Sugon model: 60P24-US v: 24001539 serial: &amp;lt;superuser required&amp;gt; UEFI-[Legacy]: American Megatrends v: 0JGST025
           date: 12/08/2017
CPU:       Info: 2x 10-Core model: Intel Xeon Silver 4114 bits: 64 type: MCP SMP cache: L2: 27.5 MiB
           Speed: 1586 MHz min/max: 800/3000 MHz Core speeds (MHz): 1: 1586 2: 943 3: 2503 4: 1301 5: 1303 6: 2500 7: 2497
           8: 2500 9: 1636 10: 2200 11: 1601 12: 1199 13: 1500 14: 1101 15: 2422 16: 2201 17: 2200 18: 2226 19: 1542 20: 2162&lt;/LI-CODE&gt;
&lt;P&gt;&lt;SPAN&gt;with Intel oneAPI Toolkit 2021.1 and Intel MPI 2021.1.1.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;I also tested on&lt;/SPAN&gt;&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;System:    Kernel: 3.10.0-1160.88.1.el7.x86_64 x86_64 bits: 64 Desktop: N/A
           Distro: CentOS Linux release 7.9.2009 (Core)
Machine:   Type: Server System: Dell product: PowerEdge R740 v: N/A serial: &amp;lt;superuser required&amp;gt;
           Mobo: Dell model: 06WXJT v: A01 serial: &amp;lt;superuser required&amp;gt; BIOS: Dell v: 2.8.2 date: 08/27/2020
CPU:       Info: 2x 8-Core model: Intel Xeon Silver 4208 bits: 64 type: MT MCP SMP cache: L2: 22 MiB
           Speed: 800 MHz min/max: 800/3200 MHz Core speeds (MHz): 1: 800 2: 800 3: 800 4: 800 5: 800 6: 897 7: 803 8: 800
           9: 800 10: 824 11: 801 12: 800 13: 800 14: 801 15: 926 16: 800 17: 800 18: 800 19: 806 20: 801 21: 800 22: 800
           23: 801 24: 800 25: 800 26: 804 27: 800 28: 800 29: 800 30: 800 31: 879 32: 800&lt;/LI-CODE&gt;
&lt;P&gt;&lt;SPAN&gt;with Intel oneAPI Toolkit 2022.0 and Intel MPI 2021.5.0.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 10 May 2023 13:32:57 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1484876#M10589</guid>
      <dc:creator>Site</dc:creator>
      <dc:date>2023-05-10T13:32:57Z</dc:date>
    </item>
    <item>
      <title>Re: "external32" Data Representation of MPI</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1486460#M10604</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;We were able to reproduce your issue and have informed the development team. We will get back to you soon.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;&lt;P&gt;Shaik Rabiya&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Tue, 16 May 2023 03:28:43 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1486460#M10604</guid>
      <dc:creator>RabiyaSK_Intel</dc:creator>
      <dc:date>2023-05-16T03:28:43Z</dc:date>
    </item>
    <item>
      <title>Re: "external32" Data Representation of MPI</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1486475#M10605</link>
      <description>&lt;P&gt;Thank you for your reply. I will keep an eye&amp;nbsp;&lt;SPAN&gt;on this thread.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 16 May 2023 05:09:50 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1486475#M10605</guid>
      <dc:creator>Site</dc:creator>
      <dc:date>2023-05-16T05:09:50Z</dc:date>
    </item>
    <item>
      <title>Re: "external32" Data Representation of MPI</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1583152#M11637</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.intel.com/t5/user/viewprofilepage/user-id/229477"&gt;@Site&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;This bug has been fixed in Intel MPI 2021.12, which will become publicly available on March 28th (as part of Intel HPC Toolkit 2024.1).&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Cheers!&lt;/P&gt;&lt;P&gt;Rafael&lt;/P&gt;</description>
      <pubDate>Mon, 25 Mar 2024 14:56:38 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/quot-external32-quot-Data-Representation-of-MPI/m-p/1583152#M11637</guid>
      <dc:creator>Rafael_L_Intel</dc:creator>
      <dc:date>2024-03-25T14:56:38Z</dc:date>
    </item>
  </channel>
</rss>

