<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re:MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy(42)......Message truncated; 4028 bytes received b in Intel® MPI Library</title>
    <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1273254#M8079</link>
    <description>&lt;P&gt;&lt;I style="font-family: Calibri, sans-serif; font-size: 11pt;"&gt;This issue has been resolved and we will no longer respond to this thread.&amp;nbsp;If you require additional assistance from Intel, please start a new thread.&amp;nbsp;Any further interaction in this thread will be considered community only&lt;/I&gt;&lt;/P&gt;&lt;BR /&gt;</description>
    <pubDate>Tue, 13 Apr 2021 15:32:53 GMT</pubDate>
    <dc:creator>Michael_Intel</dc:creator>
    <dc:date>2021-04-13T15:32:53Z</dc:date>
    <item>
      <title>MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy, 4028 bytes received but buffer size is 4008</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1261385#M7874</link>
      <description>&lt;P&gt;OpenFOAM is the Open Source CFD Toolbox.&lt;/P&gt;
&lt;P&gt;I can run the OpenFOAM test case successfully with Intel Parallel Studio 2018u3.&lt;/P&gt;
&lt;P&gt;However, there is a fatal error when I use OneAPI&amp;nbsp;HPCKit_p_2021.1.0.2684.&lt;/P&gt;
&lt;P&gt;Why does OneAPI report this error? How can I fix it?&lt;/P&gt;
&lt;P&gt;The fatal error is:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="none"&gt;Abort(740365582) on node 0 (rank 0 in comm 0): Fatal error in PMPI_Allgatherv: Message truncated, error stack:
PMPI_Allgatherv(437)..........................: MPI_Allgatherv(sbuf=0x21a8280, scount=1007, MPI_INT, rbuf=0x21c0820, rcounts=0x265dc00, displs=0x265dc10, datatype=MPI_INT, comm=comm=0x84000003) failed
MPIDI_Allgatherv_intra_composition_alpha(1764):
MPIDI_NM_mpi_allgatherv(394)..................:
MPIR_Allgatherv_intra_recursive_doubling(75)..:
MPIR_Localcopy(42)............................: Message truncated; 4028 bytes received but buffer size is 4008&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;icc -v&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="none"&gt;icc version 2021.1 (gcc version 7.3.0 compatibility)&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;mpirun --version&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="none"&gt;Intel(R) MPI Library for Linux* OS, Version 2021.1 Build 20201112 (id: b9c9d2fc5)
Copyright 2003-2020, Intel Corporation.&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;mpi debug info&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="none"&gt;[0] MPI startup(): Intel(R) MPI Library, Version 2021.1  Build 20201112 (id: b9c9d2fc5)
[0] MPI startup(): Copyright (C) 2003-2020 Intel Corporation.  All rights reserved.
[0] MPI startup(): library kind: release
[0] MPI startup(): Size of shared memory segment (112 MB per rank) * (8 local ranks) = 902 MB total
[0] MPI startup(): libfabric version: 1.11.0-impi
[0] MPI startup(): libfabric provider: verbs;ofi_rxm
[0] MPI startup(): Rank    Pid      Node name  Pin cpu
[0] MPI startup(): 0       1610443  Hnode5     {0,1,2,3,7,8}
[0] MPI startup(): 1       1610444  Hnode5     {12,13,14,18,19,20}
[0] MPI startup(): 2       1610445  Hnode5     {4,5,6,9,10,11}
[0] MPI startup(): 3       1610446  Hnode5     {15,16,17,21,22,23}
[0] MPI startup(): 4       1610447  Hnode5     {24,25,26,27,31,32}
[0] MPI startup(): 5       1610448  Hnode5     {36,37,38,42,43,44}
[0] MPI startup(): 6       1610449  Hnode5     {28,29,30,33,34,35}
[0] MPI startup(): 7       1610450  Hnode5     {39,40,41,45,46,47}
[0] MPI startup(): I_MPI_ROOT=/Oceanfile/kylin/Intel-One-API/mpi/2021.1.1
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_DEBUG=10&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 04 Mar 2021 11:19:18 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1261385#M7874</guid>
      <dc:creator>Xiaoqiang</dc:creator>
      <dc:date>2021-03-04T11:19:18Z</dc:date>
    </item>
    <item>
      <title>Re: MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy, 4028 bytes received but buffer size is 4008</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1262268#M7890</link>
      <description>&lt;P&gt;Hi Chen,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;It looks like there is a mismatch between the received message size and the receive buffer size.&lt;/P&gt;
&lt;P&gt;The receive buffer may have been allocated only 4008 bytes, while the incoming data amounts to 4028 bytes.&lt;/P&gt;
&lt;P&gt;Please recheck that the receive buffer size matches global_size*count*sizeof(int).&lt;/P&gt;
&lt;P&gt;In your code, scount is 1007 and 4028 bytes were received (1007 ints * 4 bytes per int), so rbuf should be allocated as:&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;I&gt;rbuf = (int *)malloc(global_size*1007*sizeof(int));&lt;/I&gt;&lt;/STRONG&gt;&lt;/P&gt;
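As a quick sanity check, the byte counts in the error message can be reproduced with shell arithmetic. Only scount=1007 and the two byte totals come from the log; the mismatched recvcounts entry of 1002 is an assumption for illustration:

```shell
# sizeof(MPI_INT) is 4 bytes on this platform.
# Rank 0 sends scount=1007 ints, i.e. 1007*4 = 4028 bytes, but its slot in
# the receive buffer holds only 1002 ints (a hypothetical recvcounts entry),
# i.e. 1002*4 = 4008 bytes -- hence "Message truncated".
scount=1007
recvcount=1002   # hypothetical mismatched recvcounts entry
sent_bytes=$((scount * 4))
slot_bytes=$((recvcount * 4))
echo "sent=${sent_bytes} slot=${slot_bytes}"
```

If the numbers line up this way in your code, the recvcounts array passed to MPI_Allgatherv disagrees with the scount actually sent by at least one rank.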
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Could you please provide us with the code snippet containing this MPI_Allgatherv call?&lt;/P&gt;
&lt;P&gt;That would help us a lot in debugging the error.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Regards&lt;/P&gt;
&lt;P&gt;Prasanth&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 08 Mar 2021 07:50:55 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1262268#M7890</guid>
      <dc:creator>PrasanthD_intel</dc:creator>
      <dc:date>2021-03-08T07:50:55Z</dc:date>
    </item>
    <item>
      <title>Re:MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy(42)......Message truncated; 4028 bytes received b</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1263377#M7903</link>
      <description>&lt;P&gt;Hi Chen,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;We haven't heard back from you.&lt;/P&gt;&lt;P&gt;Let us know if your issue is resolved and the given workaround fixed the issue.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Regards&lt;/P&gt;&lt;P&gt;Prasanth&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Thu, 11 Mar 2021 10:40:24 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1263377#M7903</guid>
      <dc:creator>PrasanthD_intel</dc:creator>
      <dc:date>2021-03-11T10:40:24Z</dc:date>
    </item>
    <item>
      <title>Re: Re:MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy(42)......Message truncated; 4028 bytes rec</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1264772#M7919</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;!--WeLinkPC--&gt;&lt;SPAN&gt;Prasanth，&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Sorry for the late reply.&amp;nbsp;I spent a lot of time trying to solve this problem.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;The OpenFOAM source code is complex and contains some third-party dependent libraries.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Fortunately, a third-party library called &lt;EM&gt;scotch_6.0.6&lt;/EM&gt; was found to cause the problem.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;I think the root cause is that there are some overly aggressive optimizations in &lt;EM&gt;&lt;STRONG&gt;mpiicc&lt;/STRONG&gt;&lt;/EM&gt; with &lt;EM&gt;&lt;STRONG&gt;O3&lt;/STRONG&gt;&lt;/EM&gt;&amp;nbsp;option.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Here are some tests to recompile &lt;EM&gt;scotch_6.0.6&lt;/EM&gt;:&lt;/SPAN&gt;&lt;/P&gt;
&lt;TABLE border="1" width="100%"&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD width="25%" height="47px"&gt;Software stack&lt;/TD&gt;
&lt;TD width="25%" height="47px"&gt;Compiler&lt;/TD&gt;
&lt;TD width="25%" height="47px"&gt;Compilation Options&lt;/TD&gt;
&lt;TD width="25%" height="47px"&gt;Whether this&amp;nbsp;&lt;!--WeLinkPC--&gt;&lt;SPAN&gt;issue&lt;/SPAN&gt; occurs&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="25%" height="25px"&gt;gcc9.3 and openmpi4.0.3&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;mpicc&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;-O3&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;no&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="25%" height="25px"&gt;&lt;!--WeLinkPC--&gt;&lt;SPAN&gt;OneAPI&amp;nbsp;HPCKit_p_2021.1.0.2684&lt;/SPAN&gt;&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;mpicc&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;-O3&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;no&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="25%" height="25px"&gt;&lt;SPAN&gt;&lt;!--WeLinkPC--&gt;OneAPI&amp;nbsp;HPCKit_p_2021.1.0.2684&lt;/SPAN&gt;&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;mpiicc&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;-O1&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;no&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="25%" height="25px"&gt;&lt;SPAN&gt;&lt;!--WeLinkPC--&gt;OneAPI&amp;nbsp;HPCKit_p_2021.1.0.2684&lt;/SPAN&gt;&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;mpiicc&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;-O3&lt;/TD&gt;
&lt;TD width="25%" height="25px"&gt;yes&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Intel MPI provides both &lt;EM&gt;&lt;STRONG&gt;mpiicc&lt;/STRONG&gt;&lt;/EM&gt; and &lt;EM&gt;&lt;STRONG&gt;mpicc&lt;/STRONG&gt;&lt;/EM&gt;.&lt;/P&gt;
&lt;P&gt;My question is, what is the difference between them?&lt;/P&gt;
&lt;P&gt;When using the &lt;EM&gt;&lt;STRONG&gt;-O3&lt;/STRONG&gt;&lt;/EM&gt; option, which &lt;EM&gt;&lt;STRONG&gt;mpiicc&lt;/STRONG&gt;&lt;/EM&gt; optimization might be causing this error?&lt;/P&gt;
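As an aside, the wrappers themselves can answer part of this. A sketch using the Intel MPI '-show' option, which prints the underlying command line without compiling (it requires an Intel MPI environment, so the loop is guarded to be a no-op elsewhere):

```shell
# Print what each Intel MPI compiler wrapper would actually invoke:
# mpicc shows a gcc command line, mpiicc an icc command line.
for wrapper in mpicc mpiicc; do
  if command -v "$wrapper" >/dev/null 2>/dev/null; then
    printf '%s: ' "$wrapper"
    "$wrapper" -show
  fi
done
```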
&lt;P&gt;Thanks for your help.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Regards&lt;/P&gt;
&lt;P&gt;Chen&lt;/P&gt;</description>
      <pubDate>Tue, 16 Mar 2021 11:55:50 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1264772#M7919</guid>
      <dc:creator>Xiaoqiang</dc:creator>
      <dc:date>2021-03-16T11:55:50Z</dc:date>
    </item>
    <item>
      <title>Re:MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy(42)......Message truncated; 4028 bytes received b</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1265150#M7932</link>
      <description>&lt;P&gt;Hi Chen,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;So you are facing the issue only when O3 optimization is enabled.&lt;/P&gt;&lt;P&gt;As you have said the error is maybe due to some optimization. We will look into the issue of what might be causing this error.&lt;/P&gt;&lt;P&gt;The difference between mpicc and mpiicc is that mpiicc uses Intel compilers (icc ) and mpicc uses gnu compilers (gcc).&lt;/P&gt;&lt;P&gt;Could you also please test with O2 once that would be helpful.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Regards&lt;/P&gt;&lt;P&gt;Prasanth&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Wed, 17 Mar 2021 12:08:03 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1265150#M7932</guid>
      <dc:creator>PrasanthD_intel</dc:creator>
      <dc:date>2021-03-17T12:08:03Z</dc:date>
    </item>
    <item>
      <title>Re: Re:MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy(42)......Message truncated; 4028 bytes rec</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1266435#M7959</link>
      <description>&lt;P&gt;&lt;!--WeLinkPC--&gt;&lt;SPAN&gt;Hi&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;Prasanth，&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Unfortunately,&amp;nbsp;&amp;nbsp;using &lt;STRONG&gt;mpiicc&lt;/STRONG&gt; with O2 option still cause this error.&lt;/P&gt;
&lt;P&gt;For better performance, I currently use &lt;STRONG&gt;mpicc&lt;/STRONG&gt; to compile programs.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Regards&lt;/P&gt;
&lt;P&gt;Chen&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 22 Mar 2021 04:35:01 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1266435#M7959</guid>
      <dc:creator>Xiaoqiang</dc:creator>
      <dc:date>2021-03-22T04:35:01Z</dc:date>
    </item>
    <item>
      <title>Re:MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy(42)......Message truncated; 4028 bytes received b</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1268736#M7991</link>
      <description>&lt;P&gt;Hi Chen,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;We have tried to reproduce the issue.&lt;/P&gt;&lt;P&gt;I have downloaded Thirdparty-7 from the OpenFoam repository and build scotch6.0.6.&lt;/P&gt;&lt;P&gt;I have replaced the Makefile.inc with Makefile.inc.x86-64_pc_linux2.icc.impi and build the scotch.&lt;/P&gt;&lt;P&gt;It ran fine and the optimizations used were all -O0.&lt;/P&gt;&lt;P&gt;Could you let me know how to reproduce your error? As the error seems to be with the received buffer but you were able to compile it with different optimizations and with GCC+OpenMPI.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Regards&lt;/P&gt;&lt;P&gt;Prasanth&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Mon, 29 Mar 2021 11:34:31 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1268736#M7991</guid>
      <dc:creator>PrasanthD_intel</dc:creator>
      <dc:date>2021-03-29T11:34:31Z</dc:date>
    </item>
    <item>
      <title>Re: Re:MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy(42)......Message truncated; 4028 bytes rec</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1269197#M7995</link>
      <description>&lt;P&gt;&lt;!--WeLinkPC--&gt;&lt;SPAN&gt;Hi&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;Prasanth，&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;You can reproduce the problem by performing the following steps:&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Download the OpenFOAM and third-party library source code&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;A href="https://altushost-swe.dl.sourceforge.net/project/openfoam/v1906/OpenFOAM-v1906.tgz" target="_self"&gt;https://altushost-swe.dl.sourceforge.net/project/openfoam/v1906/OpenFOAM-v1906.tgz&lt;/A&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;A href="https://altushost-swe.dl.sourceforge.net/project/openfoam/v1906/ThirdParty-v1906.tgz" target="_self"&gt;https://altushost-swe.dl.sourceforge.net/project/openfoam/v1906/ThirdParty-v1906.tgz&lt;/A&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Decompress the source code package to the same directory&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Xiaoqiang_1-1617101775047.png" style="width: 400px;"&gt;&lt;img src="https://community.intel.com/t5/image/serverpage/image-id/15997iD6682D3A03573A2A/image-size/medium/is-moderation-mode/true?v=v2&amp;amp;px=400&amp;amp;whitelist-exif-data=Orientation%2CResolution%2COriginalDefaultFinalSize%2CCopyright" role="button" title="Xiaoqiang_1-1617101775047.png" alt="Xiaoqiang_1-1617101775047.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Configure the Intel OneAPI environment variables&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Xiaoqiang_2-1617101930338.png" style="width: 400px;"&gt;&lt;img src="https://community.intel.com/t5/image/serverpage/image-id/15998iF099FB7548762F2A/image-size/medium/is-moderation-mode/true?v=v2&amp;amp;px=400&amp;amp;whitelist-exif-data=Orientation%2CResolution%2COriginalDefaultFinalSize%2CCopyright" role="button" title="Xiaoqiang_2-1617101930338.png" alt="Xiaoqiang_2-1617101930338.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Configure the OpenFOAM environment variables&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;cd OpenFOAM-v1906
vim etc/bashrc&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;Modify the configuration file to use the Intel OneAPI.&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Xiaoqiang_3-1617102589907.png" style="width: 400px;"&gt;&lt;img src="https://community.intel.com/t5/image/serverpage/image-id/15999i6C33A6605218DA20/image-size/medium/is-moderation-mode/true?v=v2&amp;amp;px=400&amp;amp;whitelist-exif-data=Orientation%2CResolution%2COriginalDefaultFinalSize%2CCopyright" role="button" title="Xiaoqiang_3-1617102589907.png" alt="Xiaoqiang_3-1617102589907.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;After the modification, load environment variables.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;source etc/bashrc&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;When you load environment variables for the first time, a message is displayed indicating that the software is not installed. Rest assured, don't worry about this tip.&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Compile the program using the built-in script.&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;./Allwmake -j 8 -s -k -q&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;It takes a long time to compile, which may take up to 2 hours.&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;After the compilation is complete, load environment variables again.&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;Check whether the OpenFOAM is successfully installed.&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;(scotch6.0.6 is compiled through mpiicc with O3 option)&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Xiaoqiang_4-1617103318570.png" style="width: 400px;"&gt;&lt;img src="https://community.intel.com/t5/image/serverpage/image-id/16000i1EF061019020904E/image-size/medium/is-moderation-mode/true?v=v2&amp;amp;px=400&amp;amp;whitelist-exif-data=Orientation%2CResolution%2COriginalDefaultFinalSize%2CCopyright" role="button" title="Xiaoqiang_4-1617103318570.png" alt="Xiaoqiang_4-1617103318570.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Xiaoqiang_5-1617103372034.png" style="width: 400px;"&gt;&lt;img src="https://community.intel.com/t5/image/serverpage/image-id/16001i691488AC1623516F/image-size/medium/is-moderation-mode/true?v=v2&amp;amp;px=400&amp;amp;whitelist-exif-data=Orientation%2CResolution%2COriginalDefaultFinalSize%2CCopyright" role="button" title="Xiaoqiang_5-1617103372034.png" alt="Xiaoqiang_5-1617103372034.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Run test cases to reproduce the problem&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;cd tutorials/incompressible/pisoFoam/LES/motorBike/motorBike/
./Allclean
./Allrun&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;You can see the error information in the &lt;EM&gt;log.snappyHexMesh&lt;/EM&gt; file.&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Recompile scotch_6.0.6&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;cd ThirdParty-v1906/scotch_6.0.6/src&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;Modify the Makefile.inc file and use mpicc.&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Xiaoqiang_6-1617104348173.png" style="width: 400px;"&gt;&lt;img src="https://community.intel.com/t5/image/serverpage/image-id/16002i85419D8527E5DDD5/image-size/medium/is-moderation-mode/true?v=v2&amp;amp;px=400&amp;amp;whitelist-exif-data=Orientation%2CResolution%2COriginalDefaultFinalSize%2CCopyright" role="button" title="Xiaoqiang_6-1617104348173.png" alt="Xiaoqiang_6-1617104348173.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;Compiling Library Files,&amp;nbsp;and add the newly generated library to the environment variable.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;make clean &amp;amp;&amp;amp; make libptscotch -j
cd libscotch
export LD_LIBRARY_PATH=/home/openfoam/ThirdParty-v1906/scotch_6.0.6/src/libscotch:$LD_LIBRARY_PATH&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Run test cases&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;cd OpenFOAM-v1906/tutorials/incompressible/pisoFoam/LES/motorBike/motorBike
./Allclean
./Allrun&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;If all goes well, the test cases can run properly.&lt;/P&gt;
&lt;P&gt;I hope this case is helpful to Intel OneAPI.&lt;/P&gt;</description>
      <pubDate>Tue, 30 Mar 2021 12:02:15 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1269197#M7995</guid>
      <dc:creator>Xiaoqiang</dc:creator>
      <dc:date>2021-03-30T12:02:15Z</dc:date>
    </item>
    <item>
      <title>Re:MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy(42)......Message truncated; 4028 bytes received b</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1271207#M8046</link>
      <description>&lt;P&gt;Hi Chen,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thanks for providing the steps. They were very helpful.&lt;/P&gt;&lt;P&gt;We are looking into it and will get back to you.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Regards&lt;/P&gt;&lt;P&gt;Prasanth&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Tue, 06 Apr 2021 12:13:05 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1271207#M8046</guid>
      <dc:creator>PrasanthD_intel</dc:creator>
      <dc:date>2021-04-06T12:13:05Z</dc:date>
    </item>
    <item>
      <title>Re:MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy(42)......Message truncated; 4028 bytes received b</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1272086#M8060</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;I can reproduce the issue but it looks like an issue in the application code of ptscotch.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Please feel free to attach our ITAC message checker tool (export LD_PRELOAD=libVTmc.so:libmpi.so), which will report the following.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;[0] WARNING: LOCAL:MEMORY:OVERLAP: warning&lt;/P&gt;&lt;P&gt;[0] WARNING:&amp;nbsp;&amp;nbsp;Data transfer addresses the same bytes at address 0x195a3f4&lt;/P&gt;&lt;P&gt;[0] WARNING:&amp;nbsp;&amp;nbsp;in the receive buffer multiple times, which is only&lt;/P&gt;&lt;P&gt;[0] WARNING:&amp;nbsp;&amp;nbsp;allowed for send buffers.&lt;/P&gt;&lt;P&gt;[0] WARNING:&amp;nbsp;&amp;nbsp;Control over new buffer is about to be transferred to MPI at:&lt;/P&gt;&lt;P&gt;[0] WARNING:&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;MPI_Allgatherv(*sendbuf=0x1911b04, sendcount=191, sendtype=MPI_INT, *recvbuf=0x195a3f4, *recvcounts=0x33353f8, *displs=0x33353e0, recvtype=MPI_INT, comm=0xffffffffc4000000 SPLIT COMM_WORLD&amp;nbsp;[0:3])&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Best regards,&lt;/P&gt;&lt;P&gt;Michael&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Thu, 08 Apr 2021 16:02:48 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1272086#M8060</guid>
      <dc:creator>Michael_Intel</dc:creator>
      <dc:date>2021-04-08T16:02:48Z</dc:date>
    </item>
    <item>
      <title>Re:MPI Fatal error in PMPI_Allgatherv: MPIR_Localcopy(42)......Message truncated; 4028 bytes received b</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1273254#M8079</link>
      <description>&lt;P&gt;&lt;I style="font-family: Calibri, sans-serif; font-size: 11pt;"&gt;This issue has been resolved and we will no longer respond to this thread.&amp;nbsp;If you require additional assistance from Intel, please start a new thread.&amp;nbsp;Any further interaction in this thread will be considered community only&lt;/I&gt;&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Tue, 13 Apr 2021 15:32:53 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/MPI-Fatal-error-in-PMPI-Allgatherv-MPIR-Localcopy-4028-bytes/m-p/1273254#M8079</guid>
      <dc:creator>Michael_Intel</dc:creator>
      <dc:date>2021-04-13T15:32:53Z</dc:date>
    </item>
  </channel>
</rss>

