<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: use I_MPI_DEBUG=6 There is no print binding relationship in Intel® MPI Library</title>
    <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1276589#M8162</link>
    <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks for reaching out to us.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Could you try increasing the debug level and check again?&lt;/P&gt;
&lt;P&gt;It also seems that you are getting a message about an incorrect Gather result.&lt;/P&gt;
&lt;P&gt;You can use the -check_mpi option from the Intel Trace Analyzer and Collector (ITAC) to check the correctness of an MPI program.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Could you please try the command below and provide us with the output log?&lt;/P&gt;
&lt;P&gt;I_MPI_DEBUG=30 mpiexec -check_mpi -genv -n 6 -ppn 3 -hosts lico-c1,head ./test1.o&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;
&lt;P&gt;Santosh&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Mon, 26 Apr 2021 10:49:34 GMT</pubDate>
    <dc:creator>SantoshY_Intel</dc:creator>
    <dc:date>2021-04-26T10:49:34Z</dc:date>
    <item>
      <title>use I_MPI_DEBUG=6 There is no print binding relationship</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1276389#M8157</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;I ran the MPI program with the following command:&lt;/P&gt;
&lt;P&gt;[host test]# mpiexec -genv I_MPI_DEBUG=6 -n 6 -ppn 3 -hosts lico-c1,head ./test1.o&lt;BR /&gt;[0] MPI startup(): Intel(R) MPI Library, Version 2021.1 Build 20201112 (id: b9c9d2fc5)&lt;BR /&gt;[0] MPI startup(): Copyright (C) 2003-2020 Intel Corporation. All rights reserved.&lt;BR /&gt;[0] MPI startup(): library kind: release&lt;BR /&gt;[0] MPI startup(): libfabric version: 1.11.0-impi&lt;BR /&gt;[0] MPI startup(): libfabric provider: psm2&lt;BR /&gt;[0] MPI startup(): I_MPI_ROOT=/opt/intel/oneapi/mpi/2021.1.1&lt;BR /&gt;[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc&lt;BR /&gt;[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default&lt;BR /&gt;[0] MPI startup(): I_MPI_DEBUG=6&lt;BR /&gt;Hello world: rank 0 of 6 running on lico-C1&lt;BR /&gt;Hello world: rank 1 of 6 running on lico-C1&lt;BR /&gt;Hello world: rank 2 of 6 running on lico-C1&lt;BR /&gt;Hello world: rank 3 of 6 running on head&lt;BR /&gt;Hello world: rank 4 of 6 running on head&lt;BR /&gt;Hello world: rank 5 of 6 running on head&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The program runs normally, but it does not output the binding information.&lt;/P&gt;
&lt;P&gt;I would like to know why.&lt;/P&gt;
&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Sun, 25 Apr 2021 02:52:39 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1276389#M8157</guid>
      <dc:creator>侯玉山</dc:creator>
      <dc:date>2021-04-25T02:52:39Z</dc:date>
    </item>
    <item>
      <title>Re: use I_MPI_DEBUG=6 There is no print binding relationship</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1276404#M8159</link>
      <description>&lt;P&gt;Printed this information：&lt;/P&gt;
&lt;P&gt;[0] MPI startup(): Incorrect Gather result in I_MPI_Pinning_printing&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sun, 25 Apr 2021 10:44:26 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1276404#M8159</guid>
      <dc:creator>侯玉山</dc:creator>
      <dc:date>2021-04-25T10:44:26Z</dc:date>
    </item>
    <item>
      <title>Re: use I_MPI_DEBUG=6 There is no print binding relationship</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1276589#M8162</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks for reaching out to us.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Could you try increasing the debug level and check again?&lt;/P&gt;
&lt;P&gt;It also seems that you are getting a message about an incorrect Gather result.&lt;/P&gt;
&lt;P&gt;You can use the -check_mpi option from the Intel Trace Analyzer and Collector (ITAC) to check the correctness of an MPI program.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Could you please try the command below and provide us with the output log?&lt;/P&gt;
&lt;P&gt;I_MPI_DEBUG=30 mpiexec -check_mpi -genv -n 6 -ppn 3 -hosts lico-c1,head ./test1.o&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;
&lt;P&gt;Santosh&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 26 Apr 2021 10:49:34 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1276589#M8162</guid>
      <dc:creator>SantoshY_Intel</dc:creator>
      <dc:date>2021-04-26T10:49:34Z</dc:date>
    </item>
    <item>
      <title>Re: use I_MPI_DEBUG=6 There is no print binding relationship</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1276885#M8166</link>
      <description>&lt;P&gt;hi,&lt;/P&gt;
&lt;P&gt;Here is the information output from the new command(&lt;SPAN&gt;I_MPI_DEBUG=30 mpiexec -check_mpi -genv -n 6 -ppn 3 -hosts lico-c1,head ./test1.o&lt;/SPAN&gt;).&lt;/P&gt;
&lt;P&gt;This is a problem on multiple nodes, and binding information can be output on a single node.&lt;/P&gt;
&lt;P&gt;[root@head mpi_dir]# I_MPI_DEBUG=30 mpiexec -check_mpi -genv -n 6 -ppn 3 -hosts lico-c1,head ./test1.o&lt;BR /&gt;[0] MPI startup(): Intel(R) MPI Library, Version 2021.1 Build 20201112 (id: b9c9d2fc5)&lt;BR /&gt;[0] MPI startup(): Copyright (C) 2003-2020 Intel Corporation. All rights reserved.&lt;BR /&gt;[0] MPI startup(): library kind: release&lt;BR /&gt;[0] MPI startup(): Size of shared memory segment (374 MB per rank) * (3 local ranks) = 1122 MB total&lt;BR /&gt;[0] MPI startup(): libfabric version: 1.11.0-impi&lt;BR /&gt;libfabric:1122027:core:core:ofi_hmem_init():202&amp;lt;info&amp;gt; Hmem iface FI_HMEM_CUDA not supported&lt;BR /&gt;libfabric:1122027:core:core:ofi_hmem_init():202&amp;lt;info&amp;gt; Hmem iface FI_HMEM_ROCR not supported&lt;BR /&gt;libfabric:1122027:core:core:ofi_hmem_init():202&amp;lt;info&amp;gt; Hmem iface FI_HMEM_ZE not supported&lt;BR /&gt;libfabric:1122027:core:core:ofi_hmem_init():202&amp;lt;info&amp;gt; Hmem iface FI_HMEM_CUDA not supported&lt;BR /&gt;libfabric:1122027:core:core:ofi_hmem_init():202&amp;lt;info&amp;gt; Hmem iface FI_HMEM_ROCR not supported&lt;BR /&gt;libfabric:1122027:core:core:ofi_hmem_init():202&amp;lt;info&amp;gt; Hmem iface FI_HMEM_ZE not supported&lt;BR /&gt;libfabric:1122027:core:core:ofi_register_provider():427&amp;lt;info&amp;gt; registering provider: verbs (111.0)&lt;BR /&gt;libfabric:1122027:core:core:ofi_register_provider():427&amp;lt;info&amp;gt; registering provider: tcp (111.0)&lt;BR /&gt;libfabric:1122027:core:core:ofi_register_provider():427&amp;lt;info&amp;gt; registering provider: sockets (111.0)&lt;BR /&gt;libfabric:1122027:core:core:ofi_register_provider():427&amp;lt;info&amp;gt; registering provider: shm (111.0)&lt;BR /&gt;libfabric:1122027:core:core:ofi_register_provider():427&amp;lt;info&amp;gt; registering provider: ofi_rxm (111.0)&lt;BR /&gt;[3] MPI startup(): Size of shared memory segment (406 MB per rank) * (3 local ranks) = 1218 MB total&lt;BR 
/&gt;libfabric:1122027:core:core:ofi_register_provider():427&amp;lt;info&amp;gt; registering provider: psm2 (111.0)&lt;BR /&gt;libfabric:1122027:core:core:ofi_register_provider():427&amp;lt;info&amp;gt; registering provider: mlx (1.4)&lt;BR /&gt;libfabric:1122027:core:core:ofi_register_provider():427&amp;lt;info&amp;gt; registering provider: ofi_hook_noop (111.0)&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1117&amp;lt;info&amp;gt; Found provider with the highest priority psm2, must_use_util_prov = 0&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1144&amp;lt;info&amp;gt; Since psm2 can be used, mlx has been skipped. To use mlx, please, set FI_PROVIDER=mlx&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1144&amp;lt;info&amp;gt; Since psm2 can be used, verbs has been skipped. To use verbs, please, set FI_PROVIDER=verbs&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1144&amp;lt;info&amp;gt; Since psm2 can be used, tcp has been skipped. To use tcp, please, set FI_PROVIDER=tcp&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1144&amp;lt;info&amp;gt; Since psm2 can be used, sockets has been skipped. To use sockets, please, set FI_PROVIDER=sockets&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1144&amp;lt;info&amp;gt; Since psm2 can be used, shm has been skipped. To use shm, please, set FI_PROVIDER=shm&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1117&amp;lt;info&amp;gt; Found provider with the highest priority psm2, must_use_util_prov = 0&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1144&amp;lt;info&amp;gt; Since psm2 can be used, mlx has been skipped. To use mlx, please, set FI_PROVIDER=mlx&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1144&amp;lt;info&amp;gt; Since psm2 can be used, verbs has been skipped. To use verbs, please, set FI_PROVIDER=verbs&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1144&amp;lt;info&amp;gt; Since psm2 can be used, tcp has been skipped. 
To use tcp, please, set FI_PROVIDER=tcp&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1144&amp;lt;info&amp;gt; Since psm2 can be used, sockets has been skipped. To use sockets, please, set FI_PROVIDER=sockets&lt;BR /&gt;libfabric:1122027:core:core:fi_getinfo_():1144&amp;lt;info&amp;gt; Since psm2 can be used, shm has been skipped. To use shm, please, set FI_PROVIDER=shm&lt;BR /&gt;[0] MPI startup(): libfabric provider: psm2&lt;BR /&gt;[0] MPI startup(): detected psm2 provider, set device name to "psm2"&lt;BR /&gt;libfabric:1122027:core:core:fi_fabric_():1397&amp;lt;info&amp;gt; Opened fabric: psm2&lt;BR /&gt;[0] MPI startup(): max_ch4_vcis: 1, max_reg_eps 1, enable_sep 0, enable_shared_ctxs 0, do_av_insert 1&lt;BR /&gt;libfabric:1122027:core:core:ofi_shm_map():137&amp;lt;warn&amp;gt; shm_open failed&lt;BR /&gt;[0] MPI startup(): addrnamelen: 16&lt;BR /&gt;libfabric:1122027:core:core:ofi_ns_add_local_name():370&amp;lt;warn&amp;gt; Cannot add local name - name server uninitialized&lt;BR /&gt;[0] MPI startup(): Load tuning file: "/opt/intel/oneapi/mpi/2021.1.1/etc/tuning_skx_shm-ofi.dat"&lt;BR /&gt;[2] MPI startup(): Incorrect Gather result in I_MPI_Pinning_printing&lt;BR /&gt;[0] MPI startup(): Incorrect Gather result in I_MPI_Pinning_printing&lt;BR /&gt;[0] MPI startup(): I_MPI_ROOT=/opt/intel/oneapi/mpi/2021.1.1&lt;BR /&gt;[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc&lt;BR /&gt;[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default&lt;BR /&gt;[0] MPI startup(): I_MPI_DEBUG=30&lt;/P&gt;
&lt;P&gt;[0] INFO: CHECK LOCAL:EXIT:SIGNAL ON&lt;BR /&gt;[0] INFO: CHECK LOCAL:EXIT:BEFORE_MPI_FINALIZE ON&lt;BR /&gt;[0] INFO: CHECK LOCAL:MPI:CALL_FAILED ON&lt;BR /&gt;[0] INFO: CHECK LOCAL:MEMORY:OVERLAP ON&lt;BR /&gt;[0] INFO: CHECK LOCAL:MEMORY:ILLEGAL_MODIFICATION ON&lt;BR /&gt;[0] INFO: CHECK LOCAL:MEMORY:INACCESSIBLE ON&lt;BR /&gt;[0] INFO: CHECK LOCAL:MEMORY:ILLEGAL_ACCESS OFF&lt;BR /&gt;[0] INFO: CHECK LOCAL:MEMORY:INITIALIZATION OFF&lt;BR /&gt;[0] INFO: CHECK LOCAL:REQUEST:ILLEGAL_CALL ON&lt;BR /&gt;[0] INFO: CHECK LOCAL:REQUEST:NOT_FREED ON&lt;BR /&gt;[0] INFO: CHECK LOCAL:REQUEST:PREMATURE_FREE ON&lt;BR /&gt;[0] INFO: CHECK LOCAL:DATATYPE:NOT_FREED ON&lt;BR /&gt;[0] INFO: CHECK LOCAL:BUFFER:INSUFFICIENT_BUFFER ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:DEADLOCK:HARD ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:DEADLOCK:POTENTIAL ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:DEADLOCK:NO_PROGRESS ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:MSG:DATATYPE:MISMATCH ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:MSG:DATA_TRANSMISSION_CORRUPTED ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:MSG:PENDING ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:COLLECTIVE:DATATYPE:MISMATCH ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:COLLECTIVE:DATA_TRANSMISSION_CORRUPTED ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:COLLECTIVE:OPERATION_MISMATCH ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:COLLECTIVE:SIZE_MISMATCH ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:COLLECTIVE:REDUCTION_OPERATION_MISMATCH ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:COLLECTIVE:ROOT_MISMATCH ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:COLLECTIVE:INVALID_PARAMETER ON&lt;BR /&gt;[0] INFO: CHECK GLOBAL:COLLECTIVE:COMM_FREE_MISMATCH ON&lt;BR /&gt;[0] INFO: maximum number of errors before aborting: CHECK-MAX-ERRORS 1&lt;BR /&gt;[0] INFO: maximum number of reports before aborting: CHECK-MAX-REPORTS 0 (= unlimited)&lt;BR /&gt;[0] INFO: maximum number of times each error is reported: CHECK-SUPPRESSION-LIMIT 10&lt;BR /&gt;[0] INFO: timeout for deadlock detection: DEADLOCK-TIMEOUT 60s&lt;BR /&gt;[0] INFO: timeout 
for deadlock warning: DEADLOCK-WARNING 300s&lt;BR /&gt;[0] INFO: maximum number of reported pending messages: CHECK-MAX-PENDING 20&lt;/P&gt;
&lt;P&gt;Hello world: rank 0 of 6 running on lico-C1&lt;BR /&gt;Hello world: rank 1 of 6 running on lico-C1&lt;BR /&gt;Hello world: rank 2 of 6 running on lico-C1&lt;BR /&gt;Hello world: rank 3 of 6 running on head&lt;BR /&gt;Hello world: rank 4 of 6 running on head&lt;BR /&gt;Hello world: rank 5 of 6 running on head&lt;/P&gt;
&lt;P&gt;[0] INFO: Error checking completed without finding any problems.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks &amp;amp; Regards&lt;/P&gt;</description>
      <pubDate>Tue, 27 Apr 2021 05:50:58 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1276885#M8166</guid>
      <dc:creator>侯玉山</dc:creator>
      <dc:date>2021-04-27T05:50:58Z</dc:date>
    </item>
    <item>
      <title>Re:use I_MPI_DEBUG=6 There is no print binding relationship</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1277343#M8180</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thanks for your response.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Could u please provide the details of &lt;I&gt;cpuinfo&lt;/I&gt; of both the nodes?&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;&lt;P&gt;Santosh&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Wed, 28 Apr 2021 14:29:32 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1277343#M8180</guid>
      <dc:creator>SantoshY_Intel</dc:creator>
      <dc:date>2021-04-28T14:29:32Z</dc:date>
    </item>
    <item>
      <title>Re: Re:use I_MPI_DEBUG=6 There is no print binding relationship</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1277503#M8182</link>
      <description>&lt;P&gt;hi,&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;cpuinfo:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;head:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;[root@head ~]# lscpu&lt;BR /&gt;Architecture: x86_64&lt;BR /&gt;CPU op-mode(s): 32-bit, 64-bit&lt;BR /&gt;Byte Order: Little Endian&lt;BR /&gt;CPU(s): 72&lt;BR /&gt;On-line CPU(s) list: 0-71&lt;BR /&gt;Thread(s) per core: 2&lt;BR /&gt;Core(s) per socket: 18&lt;BR /&gt;Socket(s): 2&lt;BR /&gt;NUMA node(s): 4&lt;BR /&gt;Vendor ID: GenuineIntel&lt;BR /&gt;CPU family: 6&lt;BR /&gt;Model: 85&lt;BR /&gt;Model name: Intel(R) Xeon(R) Gold 6150 CPU @ 2.70GHz&lt;BR /&gt;Stepping: 4&lt;BR /&gt;CPU MHz: 1400.235&lt;BR /&gt;CPU max MHz: 2701.0000&lt;BR /&gt;CPU min MHz: 1200.0000&lt;BR /&gt;BogoMIPS: 5400.00&lt;BR /&gt;Virtualization: VT-x&lt;BR /&gt;L1d cache: 32K&lt;BR /&gt;L1i cache: 32K&lt;BR /&gt;L2 cache: 1024K&lt;BR /&gt;L3 cache: 25344K&lt;BR /&gt;NUMA node0 CPU(s): 0-2,5,6,9,10,14,15,36-38,41,42,45,46,50,51&lt;BR /&gt;NUMA node1 CPU(s): 3,4,7,8,11-13,16,17,39,40,43,44,47-49,52,53&lt;BR /&gt;NUMA node2 CPU(s): 18-20,23,24,27,28,32,33,54-56,59,60,63,64,68,69&lt;BR /&gt;NUMA node3 CPU(s): 21,22,25,26,29-31,34,35,57,58,61,62,65-67,70,71&lt;BR /&gt;Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch cpuid_fault epb cat_l3 cdp_l3 invpcid_single pti intel_ppin ssbd mba ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust bmi1 hle avx2 smep bmi2 erms invpcid rtm cqm mpx rdt_a avx512f avx512dq rdseed adx smap clflushopt clwb intel_pt avx512cd avx512bw avx512vl xsaveopt xsavec xgetbv1 xsaves cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local dtherm ida arat pln pts pku ospke md_clear flush_l1d&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;lico-C1:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;[root@lico-C1 ~]# lscpu&lt;BR /&gt;Architecture: x86_64&lt;BR /&gt;CPU op-mode(s): 32-bit, 64-bit&lt;BR /&gt;Byte Order: Little Endian&lt;BR /&gt;CPU(s): 48&lt;BR /&gt;On-line CPU(s) list: 0-47&lt;BR /&gt;Thread(s) per core: 2&lt;BR /&gt;Core(s) per socket: 12&lt;BR /&gt;Socket(s): 2&lt;BR /&gt;NUMA node(s): 2&lt;BR /&gt;Vendor ID: GenuineIntel&lt;BR /&gt;CPU family: 6&lt;BR /&gt;Model: 85&lt;BR /&gt;Model name: Intel(R) Xeon(R) Gold 6126 CPU @ 2.60GHz&lt;BR /&gt;Stepping: 4&lt;BR /&gt;CPU MHz: 2299.978&lt;BR /&gt;CPU max MHz: 2601.0000&lt;BR /&gt;CPU min MHz: 1000.0000&lt;BR /&gt;BogoMIPS: 5200.00&lt;BR /&gt;Virtualization: VT-x&lt;BR /&gt;L1d cache: 32K&lt;BR /&gt;L1i cache: 32K&lt;BR /&gt;L2 cache: 1024K&lt;BR /&gt;L3 cache: 19712K&lt;BR /&gt;NUMA node0 CPU(s): 0-11,24-35&lt;BR /&gt;NUMA node1 CPU(s): 12-23,36-47&lt;BR /&gt;Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch cpuid_fault epb cat_l3 cdp_l3 invpcid_single pti intel_ppin ssbd mba ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust bmi1 hle avx2 smep bmi2 erms invpcid rtm cqm mpx rdt_a avx512f avx512dq rdseed adx smap clflushopt clwb intel_pt avx512cd avx512bw avx512vl xsaveopt xsavec xgetbv1 xsaves cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local dtherm ida arat pln pts pku ospke md_clear flush_l1d&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you for your help.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 29 Apr 2021 01:28:47 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1277503#M8182</guid>
      <dc:creator>侯玉山</dc:creator>
      <dc:date>2021-04-29T01:28:47Z</dc:date>
    </item>
    <item>
      <title>Re:use I_MPI_DEBUG=6 There is no print binding relationship</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1277587#M8184</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Could you please provide some more information so that we can investigate your issue well?&lt;/P&gt;&lt;P&gt;Could you please provide&lt;I&gt; &lt;/I&gt;the&lt;I&gt; OS and schedule&lt;/I&gt;r details of both nodes?&lt;/P&gt;&lt;P&gt;And Do you find the same issue after executing other sample MPI programs too?&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Awaiting your reply.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;&lt;P&gt;Santosh&lt;/P&gt;&lt;BR /&gt;</description>
      <pubDate>Thu, 29 Apr 2021 07:16:05 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1277587#M8184</guid>
      <dc:creator>SantoshY_Intel</dc:creator>
      <dc:date>2021-04-29T07:16:05Z</dc:date>
    </item>
    <item>
      <title>Re: Re:use I_MPI_DEBUG=6 There is no print binding relationship</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1277892#M8193</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Thanks for your help.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;I collected the cpuinfo using $ cat /proc/cpuinfo:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;[root@head ~]# cat /proc/cpuinfo&lt;BR /&gt;processor : 0&lt;BR /&gt;vendor_id : GenuineIntel&lt;BR /&gt;cpu family : 6&lt;BR /&gt;model : 85&lt;BR /&gt;model name : Intel(R) Xeon(R) Gold 6150 CPU @ 2.70GHz&lt;BR /&gt;stepping : 4&lt;BR /&gt;microcode : 0x2000065&lt;BR /&gt;cpu MHz : 1371.976&lt;BR /&gt;cache size : 25344 KB&lt;BR /&gt;physical id : 0&lt;BR /&gt;siblings : 36&lt;BR /&gt;core id : 0&lt;BR /&gt;cpu cores : 18&lt;BR /&gt;apicid : 0&lt;BR /&gt;initial apicid : 0&lt;BR /&gt;fpu : yes&lt;BR /&gt;fpu_exception : yes&lt;BR /&gt;cpuid level : 22&lt;BR /&gt;wp : yes&lt;BR /&gt;flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch cpuid_fault epb cat_l3 cdp_l3 invpcid_single pti intel_ppin ssbd mba ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust bmi1 hle avx2 smep bmi2 erms invpcid rtm cqm mpx rdt_a avx512f avx512dq rdseed adx smap clflushopt clwb intel_pt avx512cd avx512bw avx512vl xsaveopt xsavec xgetbv1 xsaves cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local dtherm ida arat pln pts pku ospke md_clear flush_l1d&lt;BR /&gt;bugs : cpu_meltdown spectre_v1 spectre_v2 spec_store_bypass l1tf mds swapgs taa itlb_multihit&lt;BR /&gt;bogomips : 5400.00&lt;BR /&gt;clflush size : 64&lt;BR /&gt;cache_alignment : 64&lt;BR /&gt;address sizes : 46 bits physical, 48 bits virtual&lt;BR /&gt;power management:&lt;/P&gt;
&lt;P&gt;processor : 1&lt;BR /&gt;vendor_id : GenuineIntel&lt;BR /&gt;cpu family : 6&lt;BR /&gt;model : 85&lt;BR /&gt;model name : Intel(R) Xeon(R) Gold 6150 CPU @ 2.70GHz&lt;BR /&gt;stepping : 4&lt;BR /&gt;microcode : 0x2000065&lt;BR /&gt;cpu MHz : 1232.666&lt;BR /&gt;cache size : 25344 KB&lt;BR /&gt;physical id : 0&lt;BR /&gt;siblings : 36&lt;BR /&gt;core id : 1&lt;BR /&gt;cpu cores : 18&lt;BR /&gt;apicid : 2&lt;BR /&gt;initial apicid : 2&lt;BR /&gt;fpu : yes&lt;BR /&gt;fpu_exception : yes&lt;BR /&gt;cpuid level : 22&lt;BR /&gt;wp : yes&lt;BR /&gt;flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch cpuid_fault epb cat_l3 cdp_l3 invpcid_single pti intel_ppin ssbd mba ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust bmi1 hle avx2 smep bmi2 erms invpcid rtm cqm mpx rdt_a avx512f avx512dq rdseed adx smap clflushopt clwb intel_pt avx512cd avx512bw avx512vl xsaveopt xsavec xgetbv1 xsaves cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local dtherm ida arat pln pts pku ospke md_clear flush_l1d&lt;BR /&gt;bugs : cpu_meltdown spectre_v1 spectre_v2 spec_store_bypass l1tf mds swapgs taa itlb_multihit&lt;BR /&gt;bogomips : 5400.00&lt;BR /&gt;clflush size : 64&lt;BR /&gt;cache_alignment : 64&lt;BR /&gt;address sizes : 46 bits physical, 48 bits virtual&lt;BR /&gt;power management:&lt;/P&gt;
&lt;P&gt;processor : 2&lt;BR /&gt;vendor_id : GenuineIntel&lt;BR /&gt;cpu family : 6&lt;BR /&gt;model : 85&lt;/P&gt;
&lt;P&gt;...&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I also added "export I_MPI_PLATFORM=auto", and then I got the binding information. But I don't know why "I_MPI_DEBUG=4" alone doesn't work.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 30 Apr 2021 06:17:06 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1277892#M8193</guid>
      <dc:creator>侯玉山</dc:creator>
      <dc:date>2021-04-30T06:17:06Z</dc:date>
    </item>
    <item>
      <title>Re: use I_MPI_DEBUG=6 There is no print binding relationship</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1279058#M8223</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;I&gt;&amp;gt;&amp;gt;"I added "export I_MPI_PLATFORM=auto", Then I get the binding information."&lt;/I&gt;&lt;/P&gt;
&lt;P&gt;Glad to know that your issue is resolved.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;gt;&amp;gt; "&lt;I&gt;I don't know why "I_ MPI_ DEBUG = 4" doesn't work&lt;/I&gt;"&lt;/P&gt;
&lt;P&gt;We see the binding information in multinode systems using I_MPI_DEBUG, having similar CPU skews without setting I_MPI_PLATFORM.&lt;/P&gt;
&lt;P&gt;Having different skews of CPU might be resulting in this kind of behavior i.e not printing binding information.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;For more information regarding I_MPI_PLATFORM please refer to the link below:&lt;/P&gt;
&lt;P&gt;&lt;A href="https://software.intel.com/content/www/us/en/develop/documentation/mpi-developer-reference-linux/top/environment-variable-reference/other-environment-variables.html#other-environment-variables_GUID-19D7A430-A85C-42AE-940C-868191EBE017" target="_blank" rel="noopener noreferrer"&gt;https://software.intel.com/content/www/us/en/develop/documentation/mpi-developer-reference-linux/top/environment-variable-reference/other-environment-variables.html#other-environment-variables_GUID-19D7A430-A85C-42AE-940C-868191EBE017&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Let us know if there is anything else we can help you with.&lt;/P&gt;
&lt;P&gt;If no, please confirm whether we can close this thread from our end.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Awaiting your reply.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks &amp;amp; Regards,&lt;/P&gt;
&lt;P&gt;Santosh&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 05 May 2021 14:40:17 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1279058#M8223</guid>
      <dc:creator>SantoshY_Intel</dc:creator>
      <dc:date>2021-05-05T14:40:17Z</dc:date>
    </item>
    <item>
      <title>Re: use I_MPI_DEBUG=6 There is no print binding relationship</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1279248#M8228</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;
&lt;P&gt;Our problem has been solved, and this thread can be closed.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thank you again for your support and help.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 06 May 2021 02:35:37 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1279248#M8228</guid>
      <dc:creator>侯玉山</dc:creator>
      <dc:date>2021-05-06T02:35:37Z</dc:date>
    </item>
    <item>
      <title>Re: use I_MPI_DEBUG=6 There is no print binding relationship</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1279298#M8233</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks for the confirmation!&lt;/P&gt;
&lt;P&gt;As this issue has been resolved, we will no longer respond to this thread. If you require any additional assistance from Intel, please start a new thread. Any further interaction in this thread will be considered community only.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Have a good day.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Thanks &amp;amp; Regards&lt;/P&gt;
&lt;P&gt;Santosh&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 07 May 2021 09:29:21 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/use-I-MPI-DEBUG-6-There-is-no-print-binding-relationship/m-p/1279298#M8233</guid>
      <dc:creator>SantoshY_Intel</dc:creator>
      <dc:date>2021-05-07T09:29:21Z</dc:date>
    </item>
  </channel>
</rss>

