<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Hi Choi, in Intel® MPI Library</title>
    <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015943#M4006</link>
<description>&lt;P&gt;Hi Choi,&lt;BR /&gt;
	What about passwordless SSH between the host and the MIC cards for the root account?&lt;/P&gt;</description>
    <pubDate>Thu, 23 Jul 2015 12:18:22 GMT</pubDate>
    <dc:creator>Artem_R_Intel1</dc:creator>
    <dc:date>2015-07-23T12:18:22Z</dc:date>
    <item>
      <title>How to run WRF Conus12km on Intel® Xeon Phi™ Coprocessors and Intel® Xeon® Processors</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015926#M3989</link>
<description>&lt;P&gt;Hi,&lt;/P&gt;

&lt;P&gt;I tried to run WRF on Intel Xeon and Intel Xeon Phi with Intel MPI.&lt;/P&gt;

&lt;P&gt;I was referring to &lt;A href="https://software.intel.com/en-us/articles/wrf-conus12km-on-intel-xeon-phi-coprocessors-and-intel-xeon-processors" target="_blank"&gt;https://software.intel.com/en-us/articles/wrf-conus12km-on-intel-xeon-phi-coprocessors-and-intel-xeon-processors&lt;/A&gt;.&lt;/P&gt;

&lt;P&gt;And I got the following output:&lt;/P&gt;

&lt;P&gt;OMP: Warning #234: KMP_AFFINITY: granularity=fine will be used.&lt;BR /&gt;
	[0] MPI startup(): Multi-threaded optimized library&lt;BR /&gt;
	[0] MPI startup(): RDMA, shared memory, and socket data transfer modes&lt;BR /&gt;
	[1] MPI startup(): RDMA, shared memory, and socket data transfer modes&lt;BR /&gt;
	[0] MPI startup(): can not open dynamic library libdat2.so.2&lt;BR /&gt;
	[0] MPI startup(): can not open dynamic library libdat2.so&lt;BR /&gt;
	[0] MPI startup(): can not open dynamic library libdat.so.1&lt;BR /&gt;
	[0] MPI startup(): can not open dynamic library libdat.so&lt;BR /&gt;
	[0] MPI startup(): dapl fabric is not available and fallback fabric is not enabled&lt;BR /&gt;
	[1] DAPL startup(): trying to open first DAPL provider from I_MPI_DAPL_PROVIDER_LIST: ofa-v2-mlx4_0-1u&lt;BR /&gt;
	phi-test-mic2: UCM: 2c8d: dd0ceb40: 205 us (205 us): open_hca: ibv_get_device_list() failed&lt;BR /&gt;
	[1] DAPL startup(): failed to open DAPL provider ofa-v2-mlx4_0-1u&lt;BR /&gt;
	[1] MPI startup(): dapl fabric is not available and fallback fabric is not enabled&lt;/P&gt;

&lt;P&gt;I'm waiting for your help.&lt;BR /&gt;
	Thank you.&lt;/P&gt;</description>
      <pubDate>Fri, 17 Jul 2015 06:21:29 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015926#M3989</guid>
      <dc:creator>choi_w_</dc:creator>
      <dc:date>2015-07-17T06:21:29Z</dc:date>
    </item>
    <item>
      <title>Hello,</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015927#M3990</link>
<description>&lt;P&gt;Hello,&lt;BR /&gt;
	Check&amp;nbsp;that the OFED software is installed and the ofed-mic service&amp;nbsp;is running on the host (refer to the Intel MPSS User's Guide for details).&lt;BR /&gt;
	By the way, which version of the Intel MPI Library are you using?&lt;/P&gt;</description>
      <pubDate>Fri, 17 Jul 2015 13:24:05 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015927#M3990</guid>
      <dc:creator>Artem_R_Intel1</dc:creator>
      <dc:date>2015-07-17T13:24:05Z</dc:date>
    </item>
    <item>
      <title>Hi Artem R.</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015928#M3991</link>
      <description>&lt;P&gt;Hi Artem R.&lt;/P&gt;

&lt;P&gt;After installing OFED, the results were as follows:&lt;/P&gt;

&lt;P&gt;[mpiexec@phi-test] hostlist_fn (ui/mpich/utils.c:372): duplicate host file or host list setting&lt;BR /&gt;
	[mpiexec@phi-test] match_arg (utils/args/args.c:152): match handler returned error&lt;BR /&gt;
	[mpiexec@phi-test] HYDU_parse_array (utils/args/args.c:174): argument matching returned error&lt;BR /&gt;
	[mpiexec@phi-test] parse_args (ui/mpich/utils.c:1596): error parsing input array&lt;BR /&gt;
	[mpiexec@phi-test] HYD_uii_mpx_get_parameters (ui/mpich/utils.c:1648): unable to parse user arguments&lt;BR /&gt;
	[mpiexec@phi-test] main (ui/mpich/mpiexec.c:153): error parsing parameters&lt;/P&gt;

&lt;P&gt;And this is my MPI version:&lt;/P&gt;

&lt;P&gt;Intel(R) MPI Library for Linux* OS, Version 5.0 Update 3 Build 20150128 (build id: 11250)&lt;BR /&gt;
	Copyright (C) 2003-2015, Intel Corporation. All rights reserved.&lt;/P&gt;

&lt;P&gt;&lt;SPAN style="font-size: 13.0080003738403px; line-height: 19.5120010375977px;"&gt;[mpiexec@phi-test] which mpirun&lt;/SPAN&gt;&lt;/P&gt;

&lt;P&gt;/opt/intel/impi/5.0.3.048/intel64/bin/mpirun&lt;/P&gt;

&lt;P&gt;Thank you&lt;/P&gt;</description>
      <pubDate>Mon, 20 Jul 2015 00:24:00 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015928#M3991</guid>
      <dc:creator>choi_w_</dc:creator>
      <dc:date>2015-07-20T00:24:00Z</dc:date>
    </item>
    <item>
      <title>Hi Choi,</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015929#M3992</link>
      <description>&lt;P&gt;Hi Choi,&lt;BR /&gt;
	Could you please provide your MPI settings (I_MPI_* environment variables, command line options)?&lt;/P&gt;</description>
      <pubDate>Mon, 20 Jul 2015 07:21:06 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015929#M3992</guid>
      <dc:creator>Artem_R_Intel1</dc:creator>
      <dc:date>2015-07-20T07:21:06Z</dc:date>
    </item>
    <item>
      <title>Hi Artem R.</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015930#M3993</link>
<description>&lt;P&gt;Hi Artem R.&lt;/P&gt;

&lt;P&gt;I'm sorry, I don't recall having heard of that command.&lt;/P&gt;

&lt;P&gt;I had the following exports:&lt;/P&gt;

&lt;P&gt;export MIC_ULIMIT_STACKSIZE=365536&lt;BR /&gt;
	export I_MPI_DEVICE=rdssm&lt;BR /&gt;
	export I_MPI_MIC=1&lt;BR /&gt;
	export DAPL_DBG_TYPE=0&lt;BR /&gt;
	export I_MPI_DAPL_PROVIDER_LIST=ofa-v2-mlx4_0-1u,ofa-v2-scif0&lt;BR /&gt;
	export I_MPI_PIN_MODE=pm&lt;BR /&gt;
	export I_MPI_PIN_DOMAIN=auto&lt;/P&gt;

&lt;P&gt;Now this error occurs:&lt;/P&gt;

&lt;P&gt;[test1@phi-test CONUS12_rundir]$ ./MIC.sh&lt;BR /&gt;
	OMP: Warning #234: KMP_AFFINITY: granularity=fine will be used.&lt;BR /&gt;
	[0] MPI startup(): Multi-threaded optimized library&lt;BR /&gt;
	[0] MPI startup(): RDMA, shared memory, and socket data transfer modes&lt;BR /&gt;
	[1] MPI startup(): RDMA, shared memory, and socket data transfer modes&lt;BR /&gt;
	[0] DAPL startup(): trying to open first DAPL provider from I_MPI_DAPL_PROVIDER_LIST: ofa-v2-mlx4_0-1u&lt;BR /&gt;
	[0] DAPL startup(): failed to open DAPL provider ofa-v2-mlx4_0-1u&lt;BR /&gt;
	[0] MPI startup(): dapl fabric is not available and fallback fabric is not enabled&lt;BR /&gt;
	[1] DAPL startup(): trying to open first DAPL provider from I_MPI_DAPL_PROVIDER_LIST: ofa-v2-mlx4_0-1u&lt;BR /&gt;
	[1] DAPL startup(): failed to open DAPL provider ofa-v2-mlx4_0-1u&lt;BR /&gt;
	[1] MPI startup(): dapl fabric is not available and fallback fabric is not enabled&lt;/P&gt;

&lt;P&gt;Thank you.&lt;/P&gt;</description>
      <pubDate>Mon, 20 Jul 2015 08:38:00 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015930#M3993</guid>
      <dc:creator>choi_w_</dc:creator>
      <dc:date>2015-07-20T08:38:00Z</dc:date>
    </item>
    <item>
      <title>The specified environment</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015931#M3994</link>
<description>&lt;P&gt;The specified environment variables look correct.&lt;BR /&gt;
	Given the mentioned error message, I'd suspect the command-line options. Could you please provide your mpiexec.hydra/mpirun command line?&lt;/P&gt;</description>
      <pubDate>Mon, 20 Jul 2015 09:00:25 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015931#M3994</guid>
      <dc:creator>Artem_R_Intel1</dc:creator>
      <dc:date>2015-07-20T09:00:25Z</dc:date>
    </item>
    <item>
      <title>HI Artem R.</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015932#M3995</link>
<description>&lt;P&gt;Hi Artem R.&lt;/P&gt;

&lt;P&gt;&amp;lt;MIC.sh&amp;gt;&lt;/P&gt;

&lt;P&gt;-----------------------------------------------------------------------------------------------------&lt;/P&gt;

&lt;P&gt;#source /opt/intel/impi/5.0.3.048/mic/bin/mpivars.sh&lt;BR /&gt;
	source /opt/intel/composer_xe_2015.3.187/bin/compilervars.sh intel64&lt;/P&gt;

&lt;P&gt;ulimit -s unlimited&lt;BR /&gt;
	#ulimit -l 1&lt;BR /&gt;
	export MIC_ULIMIT_STACKSIZE=365536&lt;BR /&gt;
	export I_MPI_DEVICE=rdssm&lt;BR /&gt;
	export I_MPI_MIC=1&lt;BR /&gt;
	export DAPL_DBG_TYPE=0&lt;BR /&gt;
	export I_MPI_DAPL_PROVIDER_LIST=ofa-v2-mlx4_0-1u,ofa-v2-scif0&lt;BR /&gt;
	#export I_MPI_OFA_PROVIDER_LIST=ofa-v2-mlx4_0-1u,ofa-v2-scif0&lt;BR /&gt;
	export I_MPI_PIN_MODE=pm&lt;BR /&gt;
	export I_MPI_PIN_DOMAIN=auto&lt;/P&gt;

&lt;P&gt;#export I_MPI_FABRICS=shm:dspl&lt;BR /&gt;
	#export I_MPI_DAPL_UD=enable&lt;BR /&gt;
	#export I_MPI_DAPL_UD_PROVIDER=ofa-v2-mlx4_0-1u&lt;BR /&gt;
	./run.symmetric&lt;/P&gt;

&lt;P&gt;&lt;SPAN style="font-size: 12px; line-height: 18px;"&gt;-----------------------------------------------------------------------------------------------------&lt;/SPAN&gt;&lt;/P&gt;

&lt;P&gt;&lt;SPAN style="font-size: 12px; line-height: 18px;"&gt;mpiexec.hydra command line:&lt;/SPAN&gt;&lt;/P&gt;

&lt;P&gt;mpiexec.hydra -host phi-test -n 1 -env WRF_NUM_TILES 20 -env KMP_AFFINITY scatter -env OMP_NUM_THREADS 2 -env KMP_LIBRARY=turnaround -env OMP_SCHEDULE=static -env KMP_STACKSIZE=190M -env I_MPI_DEBUG 5 /home/test1/WRF_0715/WRF-XEON/WRFV3/CONUS12_rundir/x86/wrf.exe : -host phi-test-mic1 -n 1 -env KMP_AFFINITY balanced -env OMP_NUM_THREADS 30 -env KMP_LIBRARY=turnaround -env OMP_SCHEDULE=static -env KMP_STACKSIZE=190M -env I_MPI_DEBUG 5 /home/test1/WRF_0715/WRF-XEON/WRFV3/CONUS12_rundir/mic/wrf.sh&lt;/P&gt;

&lt;P&gt;Thank you.&lt;/P&gt;</description>
      <pubDate>Mon, 20 Jul 2015 09:11:59 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015932#M3995</guid>
      <dc:creator>choi_w_</dc:creator>
      <dc:date>2015-07-20T09:11:59Z</dc:date>
    </item>
    <item>
      <title>Regarding to the following</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015933#M3996</link>
<description>&lt;P&gt;Regarding the following error message:&lt;/P&gt;

&lt;BLOCKQUOTE&gt;
	&lt;P&gt;[&lt;A href="mailto:test1@phi"&gt;&lt;U&gt;&lt;FONT color="#0066cc"&gt;test1@phi&lt;/FONT&gt;&lt;/U&gt;&lt;/A&gt;-test CONUS12_rundir]$ ./MIC.sh&lt;BR /&gt;
		OMP: Warning #234: KMP_AFFINITY: granularity=fine will be used.&lt;BR /&gt;
		[0] MPI startup(): Multi-threaded optimized library&lt;BR /&gt;
		[0] MPI startup(): RDMA, shared memory, and socket data transfer modes&lt;BR /&gt;
		[1] MPI startup(): RDMA, shared memory, and socket data transfer modes&lt;BR /&gt;
		[0] DAPL startup(): trying to open first DAPL provider from I_MPI_DAPL_PROVIDER_LIST: ofa-v2-mlx4_0-1u&lt;BR /&gt;
		[0] DAPL startup(): failed to open DAPL provider ofa-v2-mlx4_0-1u&lt;BR /&gt;
		[0] MPI startup(): dapl fabric is not available and fallback fabric is not enabled&lt;BR /&gt;
		[1] DAPL startup(): trying to open first DAPL provider from I_MPI_DAPL_PROVIDER_LIST: ofa-v2-mlx4_0-1u&lt;BR /&gt;
		[1] DAPL startup(): failed to open DAPL provider ofa-v2-mlx4_0-1u&lt;BR /&gt;
		[1] MPI startup(): dapl fabric is not available and fallback fabric is not enabled&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;

&lt;P&gt;Do you have any InfiniBand* adapters (IBA)&amp;nbsp;on your system? The ofa-v2-mlx4_0-1u DAPL provider is for IBA only.&lt;/P&gt;

&lt;P&gt;If you don't have any IBA on the system, you can use the following Intel MPI settings (DAPL fabric over SCIF):&lt;BR /&gt;
	export I_MPI_FABRICS=shm:dapl (use it instead of the obsolete 'export I_MPI_DEVICE=rdssm')&lt;BR /&gt;
	export I_MPI_DAPL_PROVIDER=ofa-v2-scif0 (use it instead of 'export I_MPI_DAPL_PROVIDER_LIST=ofa-v2-mlx4_0-1u,ofa-v2-scif0')&lt;/P&gt;

&lt;P&gt;Or even try the default Intel MPI fabrics settings (without I_MPI_DEVICE/I_MPI_FABRICS/I_MPI_DAPL_PROVIDER/I_MPI_DAPL_PROVIDER_LIST). The Intel MPI Library should detect the fabric settings&amp;nbsp;automatically in this case (you can monitor this via the debug information).&lt;/P&gt;

&lt;P&gt;As another option, you can try I_MPI_FABRICS=shm:tcp. It doesn't require OFED at all, but the performance may be worse than with DAPL+SCIF.&lt;/P&gt;</description>
      <pubDate>Mon, 20 Jul 2015 09:46:02 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015933#M3996</guid>
      <dc:creator>Artem_R_Intel1</dc:creator>
      <dc:date>2015-07-20T09:46:02Z</dc:date>
    </item>
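The fabric alternatives suggested in the post above can be sketched as a small shell snippet. This is a hedged sketch, not part of the original thread: the variable names and values (I_MPI_FABRICS, I_MPI_DAPL_PROVIDER) are taken from the post, and the shm:tcp alternative is shown commented out.

```shell
# Option 1 (from the post above): DAPL fabric over SCIF.
# Replaces the obsolete 'export I_MPI_DEVICE=rdssm' form.
export I_MPI_FABRICS=shm:dapl
export I_MPI_DAPL_PROVIDER=ofa-v2-scif0

# Option 2 (alternative from the post): TCP fabric, no OFED required,
# but likely slower than DAPL+SCIF.
# export I_MPI_FABRICS=shm:tcp

# Show what the MPI startup would pick up from the environment.
echo "$I_MPI_FABRICS $I_MPI_DAPL_PROVIDER"
```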
    <item>
      <title>HI Artem R.</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015934#M3997</link>
<description>&lt;P&gt;Hi Artem R.&lt;/P&gt;

&lt;P&gt;Excellent! Thank you. Now MPI works.&lt;/P&gt;

&lt;P&gt;First, I tried:&lt;BR /&gt;
	export I_MPI_FABRICS=shm:dapl&lt;BR /&gt;
	export I_MPI_DAPL_PROVIDER=ofa-v2-scif0&lt;BR /&gt;
	But I got an error:&lt;/P&gt;

&lt;P&gt;[test1@phi-test CONUS12_rundir]$ ./MIC.sh&lt;BR /&gt;
	[0] MPI startup(): Multi-threaded optimized library&lt;BR /&gt;
	[0] DAPL startup(): trying to open DAPL provider from I_MPI_DAPL_PROVIDER: ofa-v2-scif0&lt;BR /&gt;
	[0] DAPL startup(): failed to open DAPL provider ofa-v2-scif0&lt;BR /&gt;
	[0] MPI startup(): dapl fabric is not available and fallback fabric is not enabled&lt;BR /&gt;
	[1] DAPL startup(): trying to open DAPL provider from I_MPI_DAPL_PROVIDER: ofa-v2-scif0&lt;BR /&gt;
	[1] DAPL startup(): failed to open DAPL provider ofa-v2-scif0&lt;BR /&gt;
	[1] MPI startup(): dapl fabric is not available and fallback fabric is not enabled&lt;/P&gt;

&lt;P&gt;Second, I changed it to:&lt;BR /&gt;
	export I_MPI_FABRICS=shm:tcp&lt;BR /&gt;
	&amp;nbsp;&lt;BR /&gt;
	[test1@phi-test CONUS12_rundir]$ ./MIC.sh&lt;BR /&gt;
	[0] MPI startup(): Multi-threaded optimized library&lt;BR /&gt;
	[0] MPI startup(): shm and tcp data transfer modes&lt;BR /&gt;
	[1] MPI startup(): shm and tcp data transfer modes&lt;BR /&gt;
	[0] MPI startup(): Rank Pid Node name Pin cpu&lt;BR /&gt;
	[0] MPI startup(): 0 48134 phi-test {0,1,2,...,47}&lt;BR /&gt;
	[0] MPI startup(): 1 12476 phi-test-mic1 {0,1,2,...,227}&lt;BR /&gt;
	[0] MPI startup(): I_MPI_DEBUG=5&lt;BR /&gt;
	[0] MPI startup(): I_MPI_FABRICS=shm:tcp&lt;BR /&gt;
	[0] MPI startup(): I_MPI_INFO_NUMA_NODE_DIST=10,21,21,10&lt;BR /&gt;
	[0] MPI startup(): I_MPI_INFO_NUMA_NODE_MAP=mlx4_0:0,mic0:0,mic1:0,mic2:0,mic3:0,mic4:0,mic5:0,mic6:0,mic7:0&lt;BR /&gt;
	[0] MPI startup(): I_MPI_INFO_NUMA_NODE_NUM=2&lt;BR /&gt;
	[0] MPI startup(): I_MPI_MIC=1&lt;BR /&gt;
	[0] MPI startup(): I_MPI_PIN_MAPPING=1:0 0&lt;BR /&gt;
	&amp;nbsp;starting wrf task 0 of 2&lt;BR /&gt;
	&amp;nbsp;starting wrf task 1 of 2&lt;/P&gt;

&lt;P&gt;real 2m26.205s&lt;BR /&gt;
	user 1m6.966s&lt;BR /&gt;
	sys 1m18.145s&lt;/P&gt;

&lt;P&gt;real 2m25.750s&lt;BR /&gt;
	user 71m35.660s&lt;BR /&gt;
	sys 1m12.200s&lt;/P&gt;

&lt;P&gt;It works successfully, but it is slow.&lt;BR /&gt;
	In my opinion, symmetric mode should be faster, shouldn't it?&lt;/P&gt;

&lt;P&gt;Is the problem that I'm using TCP?&lt;BR /&gt;
	Is something wrong with my InfiniBand setup?&lt;BR /&gt;
	Or are my options wrong?&lt;/P&gt;

&lt;P&gt;Thanks for your effort and kindness.&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2015 00:59:06 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015934#M3997</guid>
      <dc:creator>choi_w_</dc:creator>
      <dc:date>2015-07-21T00:59:06Z</dc:date>
    </item>
    <item>
      <title>Hi Choi,</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015935#M3998</link>
<description>&lt;P&gt;Hi Choi,&lt;BR /&gt;
	Regarding the DAPL+SCIF issue, could you please specify your OS, MPSS, and OFED versions?&lt;BR /&gt;
	Also, please provide the output of the 'ibv_devices'/'ibv_devinfo' utilities and the contents of /etc/dat.conf.&lt;BR /&gt;
	And please check that the ofed-mic service is running.&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2015 07:49:16 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015935#M3998</guid>
      <dc:creator>Artem_R_Intel1</dc:creator>
      <dc:date>2015-07-21T07:49:16Z</dc:date>
    </item>
    <item>
      <title>Hi Artem R.</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015936#M3999</link>
      <description>&lt;P&gt;Hi Artem R.&lt;/P&gt;

&lt;P&gt;System Info&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; HOST OS: Linux&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; OS Version: 2.6.32-358.el6.x86_64&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; Driver Version: 3.5.1-1&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; MPSS Version: 3.5.1&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; OFED: 1.5.3&lt;/P&gt;

&lt;P&gt;[root@phi-test /]# ./usr/bin/ibv_devinfo&lt;BR /&gt;
	hca_id: mlx4_0&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; transport: InfiniBand (0)&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; fw_ver: 2.32.5100&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; node_guid: 0cc4:7aff:ff5f:2228&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; sys_image_guid: 0cc4:7aff:ff5f:222b&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; vendor_id: 0x02c9&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; vendor_part_id: 4099&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; hw_ver: 0x0&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; board_id: SM_2301000001000&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; phys_port_cnt: 1&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; port: 1&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; state: PORT_DOWN (1)&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; max_mtu: 4096 (5)&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; active_mtu: 4096 (5)&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; sm_lid: 0&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; port_lid: 0&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; port_lmc: 0x00&lt;BR /&gt;
	&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; link_layer: InfiniBand&lt;/P&gt;

&lt;P&gt;And I'll also attach these:&lt;BR /&gt;
	ofed_info, dat.conf&lt;/P&gt;

&lt;P&gt;Thank you!&lt;/P&gt;

&lt;P&gt;&lt;A href="https://community.intel.com/legacyfs/online/drupal_files/469310"&gt;469310&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;&lt;A href="https://community.intel.com/legacyfs/online/drupal_files/469311"&gt;469311&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2015 08:09:08 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015936#M3999</guid>
      <dc:creator>choi_w_</dc:creator>
      <dc:date>2015-07-21T08:09:08Z</dc:date>
    </item>
    <item>
      <title>Hi Choi,</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015937#M4000</link>
<description>&lt;P&gt;Hi Choi,&lt;BR /&gt;
	According to the provided ibv_devinfo output, the 'scif0' device is missing.&lt;BR /&gt;
	Please check that the openibd/ofed-mic services are running.&lt;BR /&gt;
	You mentioned OFED 1.5.3, but according to ofed_info OFED 3.5-2-MIC is installed - please ensure that OFED was installed according to the Intel MPSS User's Guide.&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2015 08:25:30 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015937#M4000</guid>
      <dc:creator>Artem_R_Intel1</dc:creator>
      <dc:date>2015-07-21T08:25:30Z</dc:date>
    </item>
    <item>
      <title>Hi Artem R.</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015938#M4001</link>
      <description>&lt;P&gt;Hi Artem R.&lt;/P&gt;

&lt;P&gt;Oops, I'm sorry.&lt;BR /&gt;
	I checked again, and OFED version 3.5.2 is right.&lt;/P&gt;

&lt;P&gt;I'm sorry, but I don't know whether ofed-mic is working.&lt;BR /&gt;
	How can I check?&lt;/P&gt;

&lt;P&gt;And I think I installed OFED 3.5.2-MIC properly.&lt;/P&gt;

&lt;P&gt;I'm sorry for the continued questions.&lt;/P&gt;

&lt;P&gt;Thank you.&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2015 09:16:31 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015938#M4001</guid>
      <dc:creator>choi_w_</dc:creator>
      <dc:date>2015-07-21T09:16:31Z</dc:date>
    </item>
    <item>
      <title>You can start openibd/ofed</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015939#M4002</link>
<description>&lt;P&gt;You can start the openibd/ofed-mic services with the 'service &amp;lt;service_name&amp;gt; start' command (root or sudo permissions are required).&lt;BR /&gt;
	To check a service's status, you can use 'service &amp;lt;service_name&amp;gt; status'.&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2015 09:54:12 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015939#M4002</guid>
      <dc:creator>Artem_R_Intel1</dc:creator>
      <dc:date>2015-07-21T09:54:12Z</dc:date>
    </item>
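The check/start sequence described above can be sketched as a loop over the services named in the thread (openibd, ofed-mic, mpss). The commands are printed rather than executed, since 'service' requires root permissions and a SysV-init system (as on the CentOS 6 host in this thread).

```shell
# Hedged sketch: print the status/start commands for each service
# mentioned in the thread, instead of running them directly.
for svc in openibd ofed-mic mpss; do
  echo "service $svc status   # check whether $svc is running"
  echo "service $svc start    # start $svc (root or sudo required)"
done
```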
    <item>
      <title>Hi Artem R.</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015940#M4003</link>
<description>&lt;P&gt;Hi Artem R.&lt;/P&gt;

&lt;P&gt;I tried the service command.&lt;/P&gt;

&lt;P&gt;[root@phi-test test1]# service ofed-mic status&lt;BR /&gt;
	Status of OFED Stack:&lt;BR /&gt;
	host [ OK ]&lt;BR /&gt;
	mic0 Password: [ OK ]&lt;BR /&gt;
	mic1 Password: [ OK ]&lt;BR /&gt;
	mic2 Password: [ OK ]&lt;BR /&gt;
	mic3 Password: [ OK ]&lt;BR /&gt;
	mic4 Password: [ OK ]&lt;BR /&gt;
	mic5 Password: [ OK ]&lt;BR /&gt;
	mic6 Password: [ OK ]&lt;BR /&gt;
	mic7 Password: [ OK ]&lt;/P&gt;

&lt;P&gt;[root@phi-test test1]# service mpss status&lt;BR /&gt;
	mpss is running&lt;BR /&gt;
	[root@phi-test test1]# service mpss start&lt;BR /&gt;
	Starting Intel(R) MPSS:&lt;BR /&gt;
	mic0: online (mode: linux image: /usr/share/mpss/boot/bzImage-knightscorner)&lt;BR /&gt;
	mic1: online (mode: linux image: /usr/share/mpss/boot/bzImage-knightscorner)&lt;BR /&gt;
	mic2: online (mode: linux image: /usr/share/mpss/boot/bzImage-knightscorner)&lt;BR /&gt;
	mic3: online (mode: linux image: /usr/share/mpss/boot/bzImage-knightscorner)&lt;BR /&gt;
	mic4: online (mode: linux image: /usr/share/mpss/boot/bzImage-knightscorner)&lt;BR /&gt;
	mic5: online (mode: linux image: /usr/share/mpss/boot/bzImage-knightscorner)&lt;BR /&gt;
	mic6: online (mode: linux image: /usr/share/mpss/boot/bzImage-knightscorner)&lt;BR /&gt;
	mic7: online (mode: linux image: /usr/share/mpss/boot/bzImage-knightscorner)&lt;/P&gt;

&lt;P&gt;What should I do now to obtain the fastest speed?&lt;/P&gt;

&lt;P&gt;Thank you~!&lt;/P&gt;

</description>
      <pubDate>Tue, 21 Jul 2015 23:51:00 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015940#M4003</guid>
      <dc:creator>choi_w_</dc:creator>
      <dc:date>2015-07-21T23:51:00Z</dc:date>
    </item>
    <item>
      <title>Hi Choi,</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015941#M4004</link>
      <description>&lt;P&gt;Hi Choi,&lt;BR /&gt;
	Could you please provide the output of the following commands (some may require root permissions):&lt;BR /&gt;
	service openibd status&lt;BR /&gt;
	rpm -qa | grep scif&lt;BR /&gt;
	lsmod | grep scif&lt;/P&gt;
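In addition to those commands, passwordless root SSH from the host to each card can be probed with a loop like the following sketch (mic0/mic1 naming as shown in your status output; extend the list to cover all eight cards):

```shell
#!/bin/sh
# Probe passwordless root SSH to each MIC card from the host.
# BatchMode=yes makes ssh fail instead of prompting for a password.
for card in mic0 mic1; do
  if ssh -o BatchMode=yes -o ConnectTimeout=5 root@"$card" true 2>/dev/null; then
    echo "$card: passwordless ssh OK"
  else
    echo "$card: passwordless ssh NOT configured"
  fi
done
```

Any card reporting NOT configured would be consistent with the "Password:" prompts seen in the ofed-mic output.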

&lt;P&gt;Also, there are suspicious lines ("Password:") in the 'service ofed-mic status' output. I'm not sure how critical this is, but could you please check that passwordless SSH between the host and the MIC cards is configured for the root user (try to connect to the MIC cards from the host via ssh)?&lt;/P&gt;</description>
      <pubDate>Wed, 22 Jul 2015 08:56:21 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015941#M4004</guid>
      <dc:creator>Artem_R_Intel1</dc:creator>
      <dc:date>2015-07-22T08:56:21Z</dc:date>
    </item>
    <item>
      <title>Hi Artem R.</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015942#M4005</link>
      <description>&lt;P&gt;&lt;SPAN style="font-size: 12px; line-height: 18px;"&gt;Hi Artem R.&lt;/SPAN&gt;&lt;/P&gt;

&lt;P&gt;[root@phi-test x86]# service openibd status&lt;/P&gt;

&lt;P&gt;&amp;nbsp; HCA driver loaded&lt;/P&gt;

&lt;P&gt;Configured IPoIB devices:&lt;BR /&gt;
	ib0&lt;/P&gt;

&lt;P&gt;Currently active IPoIB devices:&lt;BR /&gt;
	ib0&lt;/P&gt;

&lt;P&gt;The following OFED modules are loaded:&lt;/P&gt;

&lt;P&gt;&amp;nbsp; rdma_ucm&lt;BR /&gt;
	&amp;nbsp; rdma_cm&lt;BR /&gt;
	&amp;nbsp; ib_addr&lt;BR /&gt;
	&amp;nbsp; ib_ipoib&lt;BR /&gt;
	&amp;nbsp; mlx4_core&lt;BR /&gt;
	&amp;nbsp; mlx4_ib&lt;BR /&gt;
	&amp;nbsp; mlx4_en&lt;BR /&gt;
	&amp;nbsp; mlx5_core&lt;BR /&gt;
	&amp;nbsp; mlx5_ib&lt;BR /&gt;
	&amp;nbsp; ib_mthca&lt;BR /&gt;
	&amp;nbsp; ib_uverbs&lt;BR /&gt;
	&amp;nbsp; ib_umad&lt;BR /&gt;
	&amp;nbsp; ib_sa&lt;BR /&gt;
	&amp;nbsp; ib_cm&lt;BR /&gt;
	&amp;nbsp; ib_mad&lt;BR /&gt;
	&amp;nbsp; ib_core&lt;BR /&gt;
	&amp;nbsp; iw_cxgb3&lt;BR /&gt;
	&amp;nbsp; iw_cxgb4&lt;BR /&gt;
	&amp;nbsp; iw_nes&lt;BR /&gt;
	&amp;nbsp; ib_qib&lt;/P&gt;

&lt;P&gt;[root@phi-test x86]# rpm -qa | grep scif&lt;BR /&gt;
	mpss-sciftutorials-3.5.1-1.glibc2.12.2.x86_64&lt;BR /&gt;
	intel-mic-ofed-libibscif-1.0.0-0.x86_64&lt;BR /&gt;
	libscif-doc-3.5.1-1.glibc2.12.2.x86_64&lt;BR /&gt;
	libscif0-3.5.1-1.glibc2.12.2.x86_64&lt;BR /&gt;
	libscif-dev-3.5.1-1.glibc2.12.2.x86_64&lt;BR /&gt;
	intel-mic-ofed-libibscif-devel-1.0.0-0.x86_64&lt;BR /&gt;
	mpss-sciftutorials-doc-3.5.1-1.glibc2.12.2.x86_64&lt;/P&gt;

&lt;P&gt;[root@phi-test x86]# lsmod | grep scif&lt;BR /&gt;
	ibscif &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 84477 &amp;nbsp;0&lt;BR /&gt;
	ib_core &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;73628 &amp;nbsp;18 ibscif,ibp_server,rdma_ucm,rdma_cm,iw_cm,ib_ipoib,ib_cm,ib_sa,ib_uverbs,ib_umad,iw_nes,iw_cxgb4,iw_cxgb3,ib_qib,mlx5_ib,mlx4_ib,ib_mthca,ib_mad&lt;BR /&gt;
	mic &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 588847 &amp;nbsp;49 ibscif,ibp_sa_server,ibp_cm_server,ibp_server,ib_qib&lt;BR /&gt;
	compat &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; 16629 &amp;nbsp;27 ibscif,ibp_sa_server,ibp_cm_server,ibp_server,rdma_ucm,rdma_cm,iw_cm,ib_addr,ib_ipoib,ib_cm,ib_sa,ib_uverbs,ib_umad,iw_nes,iw_cxgb4,cxgb4,iw_cxgb3,cxgb3,ib_qib,mlx5_ib,mlx5_core,mlx4_en,mlx4_ib,ib_mthca,ib_mad,ib_core,mlx4_core&lt;/P&gt;

&lt;P&gt;Thank you!!&lt;/P&gt;</description>
      <pubDate>Wed, 22 Jul 2015 09:37:00 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015942#M4005</guid>
      <dc:creator>choi_w_</dc:creator>
      <dc:date>2015-07-22T09:37:00Z</dc:date>
    </item>
    <item>
      <title>Hi Choi,</title>
      <link>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015943#M4006</link>
      <description>&lt;P&gt;Hi Choi,&lt;BR /&gt;
	What about passwordless SSH between the host and the MIC cards for the root account?&lt;/P&gt;</description>
      <pubDate>Thu, 23 Jul 2015 12:18:22 GMT</pubDate>
      <guid>https://community.intel.com/t5/Intel-MPI-Library/How-to-run-WRF-Conus12km-on-Intel-Xeon-Phi-Coprocessors-and/m-p/1015943#M4006</guid>
      <dc:creator>Artem_R_Intel1</dc:creator>
      <dc:date>2015-07-23T12:18:22Z</dc:date>
    </item>
  </channel>
</rss>

