<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Unable to run mpi program with more than 1 core in Intel® MPI Library</title>
    <link>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1520316#M10905</link>
    <description>&lt;P&gt;Sorry for the delay. The issue has been resolved. It was due to the architecture not supporting Intel MPI.&lt;/P&gt;</description>
    <pubDate>Mon, 04 Sep 2023 14:54:58 GMT</pubDate>
    <dc:creator>Pavan9</dc:creator>
    <dc:date>2023-09-04T14:54:58Z</dc:date>
    <item>
      <title>Unable to run mpi program with more than 1 core</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1517126#M10878</link>
      <description>&lt;P&gt;The following is the stack trace when running with 2 cores; when running with 1 core, the code works.&lt;BR /&gt;The following environment variables are set:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;ONEAPI_ROOT&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;/opt/intel/oneapi&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;SETVARS_VARS_PATH&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;/opt/intel/oneapi/mpi/latest/env/vars.sh&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;I_MPI_FABRICS&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;shm&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;I_MPI_PERHOST&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;2&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;I_MPI_ROOT&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;/opt/intel/oneapi/mpi/2021.3.1&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;FI_PROVIDER_PATH&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;/opt/intel/oneapi/mpi/2021.3.1//libfabric/lib/prov:/usr/lib64/libfabric&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;PWD&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;/home/mpiuser&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;MANPATH&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;/opt/intel/oneapi/mpi/2021.3.1/man::&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;I_MPI_HYDRA_BOOTSTRAP_EXEC_EXTRA_ARGS&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;-o&lt;/SPAN&gt; 
&lt;SPAN&gt;ConnectionAttempts&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;10&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;HOME&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;/root&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;I_MPI_HYDRA_HOST_FILE&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;/etc/mpi/hostfile&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;SETVARS_COMPLETED&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;1&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;TERM&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;xterm&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;SHLVL&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;1&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;I_MPI_PIN_ORDER&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;compact&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;CPATH&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;/opt/intel/oneapi/mpi/2021.3.1//include&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;_&lt;/SPAN&gt;&lt;SPAN&gt;=&lt;/SPAN&gt;&lt;SPAN&gt;/usr/bin/env&lt;/SPAN&gt;&lt;/DIV&gt;&lt;/DIV&gt;&lt;LI-CODE lang="markup"&gt;[0] MPI startup(): Run 'pmi_process_mapping' nodemap algorithm
[0] MPI startup(): Intel(R) MPI Library, Version 2021.3.1  Build 20210719 (id: 48425b416)
[0] MPI startup(): Copyright (C) 2003-2021 Intel Corporation.  All rights reserved.
[0] MPI startup(): library kind: release
impi_mbind_local(): mbind(p=0x7f497eef8000, size=1073741824) error=1 "Operation not permitted"

impi_mbind_local(): mbind(p=0x7f07374fd000, size=1073741824) error=1 "Operation not permitted"

Assertion failed in file ../../src/mpid/ch4/shm/posix/eager/include/intel_transport.c at line 354: absolute_l3_cache_id &amp;lt;= max
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(MPL_backtrace_show+0x1c) [0x7f844d5776dc]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(MPIR_Assert_fail+0x21) [0x7f844ce40171]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(+0x48168d) [0x7f844cdbd68d]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(+0x9da9a2) [0x7f844d3169a2]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(+0x606b3b) [0x7f844cf42b3b]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(+0x6f5a26) [0x7f844d031a26]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(+0x1cb93c) [0x7f844cb0793c]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(PMPI_Init_thread+0xe0) [0x7f844cda6760]
/root/miniconda3/lib/python3.10/site-packages/mpi4py/MPI.cpython-310-x86_64-linux-gnu.so(+0x2fbc3) [0x7f844dd62bc3]
python(PyModule_ExecDef+0x70) [0x5976e0]
python() [0x598a69]
python() [0x4fdc1b]
python(_PyEval_EvalFrameDefault+0x5a6f) [0x4f443f]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python(_PyEval_EvalFrameDefault+0x4b4e) [0x4f351e]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python(_PyEval_EvalFrameDefault+0x731) [0x4ef101]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python(_PyEval_EvalFrameDefault+0x31f) [0x4eecef]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python(_PyEval_EvalFrameDefault+0x31f) [0x4eecef]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python() [0x4fddb4]
python(_PyObject_CallMethodIdObjArgs+0x137) [0x50c987]
python(PyImport_ImportModuleLevelObject+0x537) [0x50bcf7]
python() [0x517854]
python() [0x4fe1a7]
python(PyObject_Call+0x209) [0x50a8b9]
python(_PyEval_EvalFrameDefault+0x5a6f) [0x4f443f]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python(_PyEval_EvalFrameDefault+0x31f) [0x4eecef]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
Abort(1) on node 0: Internal error
Assertion failed in file ../../src/mpid/ch4/shm/posix/eager/include/intel_transport.c at line 354: absolute_l3_cache_id &amp;lt;= max
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(MPL_backtrace_show+0x1c) [0x7f38d72106dc]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(MPIR_Assert_fail+0x21) [0x7f38d6ad9171]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(+0x48168d) [0x7f38d6a5668d]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(+0x9da902) [0x7f38d6faf902]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(+0x606b3b) [0x7f38d6bdbb3b]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(+0x6f5a26) [0x7f38d6ccaa26]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(+0x1cb93c) [0x7f38d67a093c]
/opt/intel/oneapi/mpi/2021.3.1//lib/release/libmpi.so.12(PMPI_Init_thread+0xe0) [0x7f38d6a3f760]
/root/miniconda3/lib/python3.10/site-packages/mpi4py/MPI.cpython-310-x86_64-linux-gnu.so(+0x2fbc3) [0x7f38d79fbbc3]
python(PyModule_ExecDef+0x70) [0x5976e0]
python() [0x598a69]
python() [0x4fdc1b]
python(_PyEval_EvalFrameDefault+0x5a6f) [0x4f443f]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python(_PyEval_EvalFrameDefault+0x4b4e) [0x4f351e]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python(_PyEval_EvalFrameDefault+0x731) [0x4ef101]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python(_PyEval_EvalFrameDefault+0x31f) [0x4eecef]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python(_PyEval_EvalFrameDefault+0x31f) [0x4eecef]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python() [0x4fddb4]
python(_PyObject_CallMethodIdObjArgs+0x137) [0x50c987]
python(PyImport_ImportModuleLevelObject+0x537) [0x50bcf7]
python() [0x517854]
python() [0x4fe1a7]
python(PyObject_Call+0x209) [0x50a8b9]
python(_PyEval_EvalFrameDefault+0x5a6f) [0x4f443f]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
python(_PyEval_EvalFrameDefault+0x31f) [0x4eecef]
python(_PyFunction_Vectorcall+0x6f) [0x4fe5ef]
Abort(1) on node 1: Internal error&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 24 Aug 2023 05:35:45 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1517126#M10878</guid>
      <dc:creator>Pavan9</dc:creator>
      <dc:date>2023-08-24T05:35:45Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to run mpi program with more than 1 core</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1517554#M10883</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you for posting in the Intel community.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you please provide the following details so that we can check whether your setup is supported:&lt;/P&gt;&lt;P&gt; 1. OS details&amp;nbsp;&lt;/P&gt;&lt;P&gt; 2. CPU details&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you also please try with I_MPI_FABRICS=ofi instead of 'shm', since 'shm' is intended for single-node runs.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;NOTE&lt;/U&gt;&lt;/STRONG&gt;: Could you please check with the latest version of Intel MPI (2021.10) and let us know if the issue persists? If so, please provide a sample reproducer and the steps to reproduce it at our end.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks and regards,&lt;/P&gt;&lt;P&gt; Aishwarya&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Fri, 25 Aug 2023 07:11:52 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1517554#M10883</guid>
      <dc:creator>AishwaryaCV_Intel</dc:creator>
      <dc:date>2023-08-25T07:11:52Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to run mpi program with more than 1 core</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1520261#M10901</link>
      <description>&lt;P&gt;Hi,&amp;nbsp;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;We haven't heard back from you. Could you please provide the details requested in the previous response and let us know whether your issue is resolved with the latest version (2021.10)? If it is, please accept that reply as the solution.&amp;nbsp;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thank you and best regards,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Aishwarya&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Mon, 04 Sep 2023 11:36:19 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1520261#M10901</guid>
      <dc:creator>AishwaryaCV_Intel</dc:creator>
      <dc:date>2023-09-04T11:36:19Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to run mpi program with more than 1 core</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1520316#M10905</link>
      <description>&lt;P&gt;Sorry for the delay. The issue has been resolved. It was due to the architecture not supporting Intel MPI.&lt;/P&gt;</description>
      <pubDate>Mon, 04 Sep 2023 14:54:58 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1520316#M10905</guid>
      <dc:creator>Pavan9</dc:creator>
      <dc:date>2023-09-04T14:54:58Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to run mpi program with more than 1 core</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1521396#M10909</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Glad to know that your issue is resolved. If you need any additional information, please post a new question, as this thread will no longer be monitored by Intel.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thanks and regards,&lt;/P&gt;&lt;P&gt;Aishwarya&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Thu, 07 Sep 2023 05:58:52 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1521396#M10909</guid>
      <dc:creator>AishwaryaCV_Intel</dc:creator>
      <dc:date>2023-09-07T05:58:52Z</dc:date>
    </item>
    <item>
      <title>Re: Unable to run mpi program with more than 1 core</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1682114#M12123</link>
      <description>&lt;P&gt;For anyone encountering this issue with an older version of IMPI, the suggestion above of setting &lt;SPAN&gt;I_MPI_FABRICS=ofi worked for me.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 10 Apr 2025 19:55:15 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/Unable-to-run-mpi-program-with-more-than-1-core/m-p/1682114#M12123</guid>
      <dc:creator>Juan_Colmenares</dc:creator>
      <dc:date>2025-04-10T19:55:15Z</dc:date>
    </item>
  </channel>
</rss>