<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Fatal error in MPI_Recv: Invalid tag - Intel® oneAPI Math Kernel Library</title>
    <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001273#M18587</link>
    <description>Forum thread: ScaLAPACK/BLACS routines from Intel MKL (pdgetrf/pdgetri, pdgesv) fail on Cray XC30 systems with "Fatal error in MPI_Recv: Invalid tag, value is 5000000".</description>
    <pubDate>Fri, 27 Jun 2014 08:48:00 GMT</pubDate>
    <dc:creator>Jolyon_A_</dc:creator>
    <dc:date>2014-06-27T08:48:00Z</dc:date>
    <item>
      <title>Fatal error in MPI_Recv: Invalid tag</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001263#M18577</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;

&lt;P&gt;I am developing new software using the ScaLAPACK/BLACS implementation in MKL. On my small testing cluster everything worked fine. However, since moving to the 'big' cluster to start computations, the program sometimes fails with the following error (repeated for different ranks):&lt;/P&gt;

&lt;P&gt;Rank 38 [Thu Apr 17 16:08:18 2014] [c7-1c1s8n3] Fatal error in MPI_Recv: Invalid tag, error stack:&lt;BR /&gt;
	MPI_Recv(192): MPI_Recv(buf=0x3847640, count=64, MPI_INT, src=37, tag=5000000, comm=0x84000004, status=0x7fffffff7418) failed&lt;BR /&gt;
	MPI_Recv(113): Invalid tag, value is 5000000&lt;/P&gt;

&lt;P&gt;I am using the compiler and libraries from Intel Composer XE 2013 SP1. The cluster is based on the Cray XC30 series and uses Cray's own MPI implementation. I have read that this MPI implementation has its own limit on the 'tag' parameter (I tried different versions), but from what I have read the tag limit should be higher than 5000000.&lt;/P&gt;
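On the question of tag limits: the MPI standard only guarantees tags up to MPI_TAG_UB, which must be at least 32767; each implementation chooses the actual bound, and it can be queried with MPI_Comm_get_attr. A minimal sketch of the validity check an MPI library performs on a tag, assuming a hypothetical 21-bit bound of 2097151 (reportedly what cray-mpich of this era used, but an assumption here):

```python
def is_valid_tag(tag, tag_ub):
    """Return True if tag is a legal MPI point-to-point tag.

    MPI requires tags to lie in 0 .. MPI_TAG_UB inclusive.
    """
    # range() membership is O(1) in Python 3, so this works for huge bounds.
    return tag in range(tag_ub + 1)

# Hypothetical 21-bit bound, assumed for cray-mpich of this era.
CRAY_TAG_UB = 2097151

print(is_valid_tag(5000000, CRAY_TAG_UB))  # the tag from the error stack; prints False
```

Under such a bound, a tag of 5000000 is rejected, which matches the "Invalid tag, value is 5000000" message in the error stack above.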

&lt;P&gt;Has anybody received the same error? Is it related to the MKL implementation or to the MPI implementation?&lt;/P&gt;

&lt;P&gt;Many thanks!&lt;/P&gt;</description>
      <pubDate>Thu, 24 Apr 2014 10:34:56 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001263#M18577</guid>
      <dc:creator>Oriol_C_</dc:creator>
      <dc:date>2014-04-24T10:34:56Z</dc:date>
    </item>
    <item>
      <title>Hello, Oriol C.</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001264#M18578</link>
      <description>&lt;P&gt;Hello, Oriol C.&lt;/P&gt;

&lt;P&gt;Cluster components of MKL use MPI functions through BLACS. For binary compatibility with MPI implementations there are several libs in MKL: libmkl_blacs_intelmpi_lp64 for Intel MPI, libmkl_blacs_openmpi_lp64 for OpenMPI, etc.&lt;/P&gt;

&lt;P&gt;As far as I know, Cray clusters use MVAPICH MPI (with some modifications). MVAPICH2 should be compatible with MPICH2, as is Intel MPI. So I suppose that on the XC30 you should link your app with libmkl_blacs_intelmpi_lp64 or libmkl_blacs_intelmpi_ilp64.&lt;/P&gt;
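For concreteness, the BLACS adapter is selected purely at link time. A sketch of a dynamic link line, assuming a typical MKL install under /opt/intel/mkl (the path and object files are illustrative, not from the thread):

```shell
# Illustrative only - swap the single BLACS adapter library to match
# the MPI ABI in use (intelmpi for Intel MPI / MPICH2-compatible MPIs,
# openmpi for OpenMPI, sgimpt for SGI MPT).
MKLROOT=/opt/intel/mkl
mpicxx app.o -L$MKLROOT/lib/intel64 \
    -lmkl_scalapack_lp64 -lmkl_intel_lp64 -lmkl_core -lmkl_intel_thread \
    -lmkl_blacs_intelmpi_lp64 -liomp5 -lpthread -lm
```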

&lt;P&gt;What MPI do you use in your testing cluster? How do you link BLACS? Do you recompile your application on XC30 or just copy executable files?&lt;/P&gt;</description>
      <pubDate>Fri, 25 Apr 2014 02:35:24 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001264#M18578</guid>
      <dc:creator>Evarist_F_Intel</dc:creator>
      <dc:date>2014-04-25T02:35:24Z</dc:date>
    </item>
    <item>
      <title>Hello Evarist,</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001265#M18579</link>
      <description>&lt;P&gt;Hello Evarist,&lt;/P&gt;

&lt;P&gt;Thanks for your reply.&lt;/P&gt;

&lt;P&gt;I recompile my application on the XC30. On my testing cluster I am using OpenMPI with the GNU compiler, and I link the OpenMPI BLACS library (mkl_blacs_openmpi_lp64) dynamically:&lt;/P&gt;

&lt;P&gt;mpic++ -g -I /opt/intel/mkl/include -fopenmp -m64 -DUNIX -o executable file1.o file2.o ... -L/opt/intel/mkl/lib/intel64/ -lmkl_scalapack_lp64 -lmkl_intel_lp64 -lmkl_core -lmkl_gnu_thread -lmkl_blacs_openmpi_lp64 -ldl -lpthread -lm -Wl,-rpath=/opt/intel/mkl/lib/intel64/&lt;/P&gt;

&lt;P&gt;On the XC30, I am using the Intel compiler and I link libmkl_blacs_intelmpi_lp64 statically:&lt;/P&gt;

&lt;P&gt;CC -I /opt/intel/composerxe/mkl/include -openmp -O3 -DUNIX -o executable file1.o file2.o ... /opt/intel/composerxe/mkl/lib/intel64/libmkl_scalapack_lp64.a -Wl,--start-group /opt/intel/composerxe/mkl/lib/intel64/libmkl_intel_lp64.a /opt/intel/composerxe/mkl/lib/intel64/libmkl_core.a /opt/intel/composerxe/mkl/lib/intel64/libmkl_intel_thread.a -Wl,--end-group /opt/intel/composerxe/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread -lm&lt;/P&gt;

&lt;P&gt;If I understand the documentation correctly, the XC30 uses MPT, which has its own implementation of MPI.&lt;/P&gt;</description>
      <pubDate>Fri, 25 Apr 2014 09:16:16 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001265#M18579</guid>
      <dc:creator>Oriol_C_</dc:creator>
      <dc:date>2014-04-25T09:16:16Z</dc:date>
    </item>
    <item>
      <title>Thank you, Oriol, for sharing</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001266#M18580</link>
      <description>&lt;P&gt;Thank you, Oriol, for sharing information.&lt;/P&gt;

&lt;P&gt;And I am sorry, I did not quite catch it - does recompiling solve the problem?&lt;/P&gt;</description>
      <pubDate>Fri, 25 Apr 2014 09:32:57 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001266#M18580</guid>
      <dc:creator>Evarist_F_Intel</dc:creator>
      <dc:date>2014-04-25T09:32:57Z</dc:date>
    </item>
    <item>
      <title>I apologize, sometimes I am</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001267#M18581</link>
      <description>&lt;P&gt;I apologize, sometimes I am not very good explaining in english.&lt;/P&gt;

&lt;P&gt;I always recompiled on both clusters (i.e. recompilation does not solve the problem).&lt;/P&gt;

&lt;P&gt;Many thanks&lt;/P&gt;</description>
      <pubDate>Fri, 25 Apr 2014 09:45:33 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001267#M18581</guid>
      <dc:creator>Oriol_C_</dc:creator>
      <dc:date>2014-04-25T09:45:33Z</dc:date>
    </item>
    <item>
      <title>Hmm, unfortunately I have no</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001268#M18582</link>
      <description>&lt;P&gt;Hmm, unfortunately I have no ideas why this happens. MKL Scalapack should not use such a big tag number.&lt;/P&gt;

&lt;P&gt;Could you please try to link with other blacs libraries? Probably sgimpt is what I would try first...&lt;/P&gt;</description>
      <pubDate>Fri, 25 Apr 2014 11:18:58 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001268#M18582</guid>
      <dc:creator>Evarist_F_Intel</dc:creator>
      <dc:date>2014-04-25T11:18:58Z</dc:date>
    </item>
    <item>
      <title>Ok, many thanks for your help</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001269#M18583</link>
      <description>&lt;P&gt;Ok, many thanks for your help!&lt;/P&gt;

&lt;P&gt;I will first try libmkl_blacs_sgimpt_lp64 (and then mkl_blacs_openmpi_lp64). I will post whether this solves the problem.&lt;/P&gt;</description>
      <pubDate>Fri, 25 Apr 2014 11:36:00 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001269#M18583</guid>
      <dc:creator>Oriol_C_</dc:creator>
      <dc:date>2014-04-25T11:36:00Z</dc:date>
    </item>
    <item>
      <title>I am seeing the same invalid</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001270#M18584</link>
      <description>&lt;P&gt;I am seeing the same invalid tag value with scalapack's PDGESV&amp;nbsp;on cray XC30,&lt;/P&gt;

&lt;P&gt;MPI_Recv(192): MPI_Recv(buf=0x30a0d50, count=2, MPI_INT, src=8, tag=5000000, comm=0x84000004, status=0x7fffffff0d38) failed&lt;BR /&gt;
	MPI_Recv(113): Invalid tag, value is 5000000&lt;/P&gt;

&lt;P&gt;The situation is very similar. On our small cluster I compile with mpiifort and -mkl=cluster, and on the XC30 I use the Cray ftn wrapper for ifort, also with -mkl=cluster. It never crashed like this on my home cluster, and it seems to affect only certain calculations, maybe the ones where pdgesv is used instead of pzgesv...&lt;/P&gt;

&lt;P&gt;I do not think that compiling with a different BLACS library will solve my problem.&lt;/P&gt;</description>
      <pubDate>Mon, 05 May 2014 08:46:00 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001270#M18584</guid>
      <dc:creator>Ariel_B_1</dc:creator>
      <dc:date>2014-05-05T08:46:00Z</dc:date>
    </item>
    <item>
      <title>Hi,</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001271#M18585</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;

&lt;P&gt;I have tried other BLACS libraries without success (I am unable to use many of them).&lt;/P&gt;

&lt;P&gt;After contacting the XC30 support staff, I provided them with a "simple" test version. In my case, the problem seems to appear when the program inverts a matrix using the pdgetrf and pdgetri functions. They told me that it seems to be a bug in MKL and recommended that I switch to their own libraries (libsci). However, I am using some MKL-specific functions and I would like to continue using MKL.&lt;/P&gt;
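For reference, pdgetrf followed by pdgetri is the distributed analogue of LAPACK's dgetrf/dgetri: factor P A = L U with partial pivoting, then build the inverse column by column. A serial pure-Python sketch of that factor-then-invert pattern (illustrative only, not the MKL code path):

```python
def lu_invert(A):
    """Invert a square matrix via LU factorization with partial pivoting.

    Serial sketch of the pattern pdgetrf + pdgetri implement for
    distributed matrices: factor P A = L U, then solve for each column.
    """
    n = len(A)
    # Work on a copy; M holds both L (strictly below diagonal, unit
    # diagonal implied) and U (on and above the diagonal).
    M = [row[:] for row in A]
    piv = list(range(n))
    for k in range(n):
        # Partial pivoting: bring the largest entry in column k to row k.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        if M[p][k] == 0:
            raise ValueError("matrix is singular")
        M[k], M[p] = M[p], M[k]
        piv[k], piv[p] = piv[p], piv[k]
        for i in range(k + 1, n):
            M[i][k] /= M[k][k]
            for j in range(k + 1, n):
                M[i][j] -= M[i][k] * M[k][j]
    # Solve L U x = P e_col for each unit vector to build the inverse.
    inv = [[0.0] * n for _ in range(n)]
    for col in range(n):
        # Row-permuted unit vector P e_col.
        b = [1.0 if piv[i] == col else 0.0 for i in range(n)]
        # Forward substitution with unit-diagonal L.
        y = b[:]
        for i in range(n):
            for j in range(i):
                y[i] -= M[i][j] * y[j]
        # Back substitution with U.
        x = y[:]
        for i in reversed(range(n)):
            for j in range(i + 1, n):
                x[i] -= M[i][j] * x[j]
            x[i] /= M[i][i]
        for i in range(n):
            inv[i][col] = x[i]
    return inv
```

The point of the two-phase design is that one factorization amortizes over all n right-hand sides, which is why the crash surfacing inside pdgetrf affects both the pure solve (pdgesv) and the explicit-inverse (pdgetri) paths.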

</description>
      <pubDate>Mon, 05 May 2014 17:21:31 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001271#M18585</guid>
      <dc:creator>Oriol_C_</dc:creator>
      <dc:date>2014-05-05T17:21:31Z</dc:date>
    </item>
    <item>
      <title>Thank you Oriol and Ariel for</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001272#M18586</link>
      <description>&lt;P&gt;Thank you Oriol and Ariel for pointing the functions!&lt;/P&gt;

&lt;P&gt;I will try to reproduce the problem using these functions and come back as soon as I will get any updates!&lt;/P&gt;</description>
      <pubDate>Thu, 08 May 2014 02:44:28 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001272#M18586</guid>
      <dc:creator>Evarist_F_Intel</dc:creator>
      <dc:date>2014-05-08T02:44:28Z</dc:date>
    </item>
    <item>
      <title>Hi,</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001273#M18587</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;

&lt;P&gt;We have also been experiencing this issue. Has any progress been made?&lt;/P&gt;

&lt;P&gt;Many thanks,&lt;/P&gt;

&lt;P&gt;Joly&lt;/P&gt;

&lt;P&gt;Info:&lt;/P&gt;

&lt;P&gt;Cray XC30&lt;/P&gt;

&lt;P&gt;MPICH2 (cray-mpich/6.3.1)&lt;/P&gt;

&lt;P&gt;Intel Compiler (intel/14.0.1.106) and MKL&lt;/P&gt;

&lt;P&gt;Rank 39 [Fri Jun 27 10:29:28 2014] [c0-1c2s12n0] Fatal error in MPI_Recv: Invalid tag, error stack:&lt;BR /&gt;
	MPI_Recv(192): MPI_Recv(buf=0x51f8de0, count=1, MPI_INT, src=38, tag=5000000, comm=0xc4000001, status=0x7fffffff12b8) failed&lt;BR /&gt;
	MPI_Recv(113): Invalid tag, value is 5000000&lt;/P&gt;

&lt;P&gt;Backtrace:&lt;/P&gt;

&lt;P&gt;...&lt;/P&gt;

&lt;P&gt;...&lt;/P&gt;

&lt;P&gt;Dense invert : line 1138&lt;/P&gt;

&lt;P&gt;pdgetrf&lt;/P&gt;

&lt;P&gt;mpl_lu&lt;/P&gt;

&lt;P&gt;mpl_lu_nb&lt;/P&gt;

&lt;P&gt;mpl_pivot_comm&lt;/P&gt;

&lt;P&gt;MKL_RECV&lt;/P&gt;

&lt;P&gt;PMPI_Recv&lt;/P&gt;

</description>
      <pubDate>Fri, 27 Jun 2014 08:48:00 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001273#M18587</guid>
      <dc:creator>Jolyon_A_</dc:creator>
      <dc:date>2014-06-27T08:48:00Z</dc:date>
    </item>
    <item>
      <title>As long as I know, no</title>
      <link>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001274#M18588</link>
      <description>&lt;P&gt;As long as I know, no progress has been made.&lt;/P&gt;

&lt;P&gt;On Cray XC30, I have to use libsci replacement when I need these functions.&lt;/P&gt;

</description>
      <pubDate>Tue, 19 Aug 2014 14:37:25 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-oneAPI-Math-Kernel-Library/Fatal-error-in-MPI-Recv-Invalid-tag/m-p/1001274#M18588</guid>
      <dc:creator>Oriol_C_</dc:creator>
      <dc:date>2014-08-19T14:37:25Z</dc:date>
    </item>
  </channel>
</rss>

