<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Also, if you don't have Intel in Intel® MPI Library</title>
    <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168161#M6497</link>
    <description>&lt;P&gt;Also, if you don't have Intel® MPI Library&amp;nbsp;already installed, visit&amp;nbsp;https://software.intel.com/en-us/mpi-library and follow the Choose &amp;amp; Download button to obtain it.&lt;/P&gt;</description>
    <pubDate>Mon, 13 Jan 2020 15:28:24 GMT</pubDate>
    <dc:creator>James_T_Intel</dc:creator>
    <dc:date>2020-01-13T15:28:24Z</dc:date>
    <item>
      <title>Why does my Octopus 9.1's build show an extremely different performance with different MPI implementations?</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168155#M6491</link>
      <description>&lt;P&gt;I am compiling a scientific program package called Octopus 9.1 on a cluster, specifying the BLAS library with&lt;/P&gt;
&lt;PRE class="brush:bash; class-name:dark;"&gt;-L${MKL_DIR} -Wl,-Bstatic -Wl,--start-group -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -Wl,--end-group -Wl,-Bdynamic -lpthread -lm -ldl&lt;/PRE&gt;

&lt;P&gt;BLACS with&lt;/P&gt;

&lt;PRE class="brush:bash; class-name:dark;"&gt;-L${MKL_DIR} -Wl,-Bstatic -lmkl_scalapack_lp64&lt;/PRE&gt;

&lt;P&gt;and ScaLAPACK with&lt;/P&gt;

&lt;PRE class="brush:bash; class-name:dark;"&gt;-L${MKL_DIR} -Wl,-Bstatic -lmkl_scalapack_lp64&lt;/PRE&gt;

&lt;P&gt;All of those options and flags are what the Intel Link Line Advisor suggests for my computer architecture. The compilers are Open MPI's mpif90 and mpicc, built with the Intel 18.0.0 compilers. The program works fine and runs fast; there is nothing to worry about except a few segfaults during the test run, which I suspect can be remedied by ulimit -s unlimited. But I would like to know why -lmkl_intel_lp64, -lmkl_sequential, and -lmkl_core, as well as the BLACS and ScaLAPACK libraries, have to be statically linked. For instance, when those -Wl,-Bstatic and -Wl,-Bdynamic flags are removed, I get a segfault at runtime for any calculation I launch. Octopus's manual says nothing about which Intel libraries should be linked statically or dynamically; in fact, those Intel-advised compiler options are architecture-dependent.&lt;/P&gt;&lt;P&gt;Moreover, if I switch compilers to MPICH, whose wrappers also drive the same Intel compilers, the program runs significantly slower (in one calculation, 50 seconds with Open MPI vs. 1 hour with MPICH), and the -Wl,-Bstatic and -Wl,-Bdynamic options have to be absent, otherwise it segfaults. This is really bugging me: how can a mere difference in MPI implementation lead to such a huge difference in performance and linking behavior? Any thoughts on this?&lt;/P&gt;</description>
      <pubDate>Thu, 09 Jan 2020 04:42:16 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168155#M6491</guid>
      <dc:creator>efnacy</dc:creator>
      <dc:date>2020-01-09T04:42:16Z</dc:date>
    </item>
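A quick way to check how the -Wl,-Bstatic and -Wl,-Bdynamic split actually came out in the finished build is to list the binary's dynamic dependencies; anything linked statically will not appear. This is a sketch only, and the binary name ./octopus is an assumption, so substitute your real executable:

```shell
# List the dynamic MKL/MPI dependencies of the built executable.
# Statically linked MKL components will be absent from this list.
# "./octopus" is a hypothetical binary name; adjust to your build output.
ldd ./octopus 2>/dev/null | grep -E 'mkl|mpi' || echo "no dynamic MKL/MPI deps found"
```

Comparing this output between the Open MPI and MPICH builds can show whether the two wrapper sets resolved different MKL or MPI libraries at link time.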
    <item>
      <title>Hi Efnacy,</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168156#M6492</link>
      <description>&lt;P&gt;Hi Efnacy,&lt;/P&gt;&lt;P&gt;We are looking into your concern and will get back to you soon.&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Pradeep&lt;/P&gt;</description>
      <pubDate>Thu, 09 Jan 2020 06:28:39 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168156#M6492</guid>
      <dc:creator>Pradeep_G_Intel</dc:creator>
      <dc:date>2020-01-09T06:28:39Z</dc:date>
    </item>
    <item>
      <title>Any news?</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168157#M6493</link>
      <description>&lt;P&gt;Any news?&lt;/P&gt;</description>
      <pubDate>Sat, 11 Jan 2020 00:28:22 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168157#M6493</guid>
      <dc:creator>efnacy</dc:creator>
      <dc:date>2020-01-11T00:28:22Z</dc:date>
    </item>
    <item>
      <title>I see two concerns in your</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168158#M6494</link>
      <description>&lt;P&gt;I see two concerns in your post.&amp;nbsp; One is about the requirement to link Intel® MKL statically rather than dynamically.&amp;nbsp; I don't know why Octopus requires this, but I would suggest that you post that question in our forum for Intel® MKL (https://software.intel.com/en-us/forums/intel-math-kernel-library).&lt;/P&gt;&lt;P&gt;Regarding the differences in performance between MPI implementations, there are usually significant differences in MPI implementations, especially around various optimizations.&amp;nbsp; I would actually suggest testing with Intel® MPI Library as well to determine what is best for your system.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2020 14:53:41 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168158#M6494</guid>
      <dc:creator>James_T_Intel</dc:creator>
      <dc:date>2020-01-13T14:53:41Z</dc:date>
    </item>
    <item>
      <title>Hi James,</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168159#M6495</link>
      <description>&lt;P&gt;Hi James,&lt;/P&gt;&lt;P&gt;thank you for your reply. Which options should I use in order to invoke Intel MPI library?&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2020 15:14:37 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168159#M6495</guid>
      <dc:creator>efnacy</dc:creator>
      <dc:date>2020-01-13T15:14:37Z</dc:date>
    </item>
    <item>
      <title>If you have it installed on</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168160#M6496</link>
      <description>&lt;P&gt;If you have it installed on your system, run the following:&lt;/P&gt;
&lt;PRE class="brush:bash; class-name:dark;"&gt;source &amp;lt;IMPI_install_path&amp;gt;/&amp;lt;version&amp;gt;/intel64/bin/mpivars.sh&lt;/PRE&gt;

&lt;P&gt;This will set up your environment such that Intel® MPI Library will be the first found by PATH and LD_LIBRARY_PATH.&amp;nbsp; Then, you can use the same mpif90 and mpicc scripts you normally use.&amp;nbsp; If you want to ensure you are using the Intel compilers as well, set the following.&amp;nbsp; You only need to set the ones relevant for what you are using.&lt;/P&gt;

&lt;PRE class="brush:bash; class-name:dark;"&gt;I_MPI_CC=icc
I_MPI_CXX=icpc
I_MPI_FC=ifort
I_MPI_F77=ifort
I_MPI_F90=ifort&lt;/PRE&gt;

&lt;P&gt;At runtime, use mpirun as normal and the environment should default to Intel® MPI Library.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2020 15:25:57 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168160#M6496</guid>
      <dc:creator>James_T_Intel</dc:creator>
      <dc:date>2020-01-13T15:25:57Z</dc:date>
    </item>
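The steps above can be sketched as one setup snippet. The IMPI_ROOT variable and its value are assumptions, so point it at your actual Intel MPI installation:

```shell
# Sketch only: IMPI_ROOT is an assumed variable; point it at your real
# Intel MPI install directory (for Parallel Studio XE 2018 this would be
# something like .../compilers_and_libraries_2018/linux/mpi).
IMPI_ROOT=/opt/intel/impi/2018.0.128

# Source mpivars.sh, if present, so PATH and LD_LIBRARY_PATH prefer Intel MPI.
if [ -f "$IMPI_ROOT/intel64/bin/mpivars.sh" ]; then
    . "$IMPI_ROOT/intel64/bin/mpivars.sh"
fi

# Make the generic wrapper scripts drive the Intel compilers instead of GNU.
export I_MPI_CC=icc I_MPI_CXX=icpc I_MPI_FC=ifort I_MPI_F77=ifort I_MPI_F90=ifort

# Confirm which mpif90 will be picked up before rebuilding Octopus.
command -v mpif90 || true
```

After this, rebuilding with the usual mpif90/mpicc commands and launching with mpirun should use Intel MPI Library throughout.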
    <item>
      <title>Also, if you don't have Intel</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168161#M6497</link>
      <description>&lt;P&gt;Also, if you don't have Intel® MPI Library&amp;nbsp;already installed, visit&amp;nbsp;https://software.intel.com/en-us/mpi-library and follow the Choose &amp;amp; Download button to obtain it.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2020 15:28:24 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168161#M6497</guid>
      <dc:creator>James_T_Intel</dc:creator>
      <dc:date>2020-01-13T15:28:24Z</dc:date>
    </item>
    <item>
      <title>Really appreciate the help! I</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168162#M6498</link>
      <description>&lt;P&gt;Really appreciate the help! I checked my cluster and the following directory exists:&lt;/P&gt;&lt;P&gt;/home/compilers/Intel/parallel_studio_xe_2018.0/compilers_and_libraries_2018/linux/mpi/lib64/&lt;/P&gt;&lt;P&gt;I think that is where the MPI libraries reside, am I wrong? And in post #6, just to be sure, are the values of those variables indeed the base Intel compilers, that is, not the MPI wrappers?&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2020 15:37:40 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168162#M6498</guid>
      <dc:creator>efnacy</dc:creator>
      <dc:date>2020-01-13T15:37:40Z</dc:date>
    </item>
    <item>
      <title>The environment variables</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168163#M6499</link>
      <description>&lt;P&gt;The environment variables listed in post #6 instruct the compiler wrappers which compiler to use.&amp;nbsp; The default for the Intel® MPI Library general wrapper scripts on Linux* is to use the GNU* compilers, hence you need to specify the Intel compilers.&amp;nbsp; There is a separate set of wrapper scripts (mpiifort, mpiicc, mpiicpc) that will only use the Intel compilers, but this requires modifying your existing compile methods, rather than just setting a few environment variables.&lt;/P&gt;&lt;P&gt;The version you have installed is very old.&amp;nbsp; I strongly recommend getting the latest version, it is freely available.&amp;nbsp; Follow the instructions in post #7 to get it.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2020 15:44:19 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168163#M6499</guid>
      <dc:creator>James_T_Intel</dc:creator>
      <dc:date>2020-01-13T15:44:19Z</dc:date>
    </item>
    <item>
      <title>I am sorry but 2018 is</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168164#M6500</link>
      <description>&lt;P&gt;I am sorry, but is 2018 really considered very old already? I will see whether there is still enough space in my cluster account, though, because I only have 5 GB of storage and it is already taken up by other libraries and files.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jan 2020 15:55:23 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1168164#M6500</guid>
      <dc:creator>efnacy</dc:creator>
      <dc:date>2020-01-13T15:55:23Z</dc:date>
    </item>
    <item>
      <title>Re:Why does my Octopus 9.1's build show an extreme...</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1269299#M7999</link>
      <description>&lt;P&gt;This thread is closed for Intel support.  Any further replies will be considered community only.  If you need Intel support, please start a new thread.&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Tue, 30 Mar 2021 17:38:25 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Why-does-my-Octopus-9-1-s-build-show-an-extremely-different/m-p/1269299#M7999</guid>
      <dc:creator>James_T_Intel</dc:creator>
      <dc:date>2021-03-30T17:38:25Z</dc:date>
    </item>
  </channel>
</rss>

