C:\Program Files (x86)\DHI\2012\MIKE Zero\Examples\MIKE_ZERO\AutoCal\MIKE_SHE\Example1>mpiexec -hosts 2 apc145 1 apc789 6 -genv I_MPI_DEBUG 100 "MPI Test.exe"
MPI_INIT
MPI_INIT
MPI_INIT
MPI_INIT
MPI_INIT
MPI_INIT
MPI_INIT
[0] MPI startup(): Intel(R) MPI Library, Version 4.0 Update 2 Build 20110428
[0] MPI startup(): Copyright (C) 2003-2011 Intel Corporation. All rights reserved.
[0] MPI startup(): fabric dapl failed: will try use tcp fabric
[0] MPI startup(): shm and tcp data transfer modes
[4] MPI startup(): fabric dapl failed: will try use tcp fabric
[1] MPI startup(): fabric dapl failed: will try use tcp fabric
[5] MPI startup(): fabric dapl failed: will try use tcp fabric
[2] MPI startup(): fabric dapl failed: will try use tcp fabric
[1] MPI startup(): shm and tcp data transfer modes
[4] MPI startup(): shm and tcp data transfer modes
[5] MPI startup(): shm and tcp data transfer modes
[2] MPI startup(): shm and tcp data transfer modes
[6] MPI startup(): fabric dapl failed: will try use tcp fabric
[3] MPI startup(): fabric dapl failed: will try use tcp fabric
[6] MPI startup(): shm and tcp data transfer modes
[3] MPI startup(): shm and tcp data transfer modes
[1] MPI startup(): Internal info: pinning initialization already done
[0] MPI startup(): Internal info: pinning initialization already done
[0] MPI startup(): set domain to {0,1} on node apc145
[6] MPI startup(): Internal info: pinning initialization already done
[6] MPI startup(): set domain to {10,11} on node apc789
[6] MPI startup(): Recognition level=1. Platform code=2. Device=5
[6] MPI startup(): Parent configuration:(intra=1 inter=6 flags=0), (code=2 ppn_idx=2)
[5] MPI startup(): Internal info: pinning initialization already done
[5] MPI startup(): set domain to {8,9} on node apc789
[5] MPI startup(): Recognition level=1. Platform code=2. Device=5
[5] MPI startup(): Parent configuration:(intra=1 inter=6 flags=0), (code=2 ppn_idx=2)
[3] MPI startup(): Internal info: pinning initialization already done
[3] MPI startup(): set domain to {4,5} on node apc789
[3] MPI startup(): Recognition level=1. Platform code=2. Device=5
[3] MPI startup(): Parent configuration:(intra=1 inter=6 flags=0), (code=2 ppn_idx=2)
[2] MPI startup(): Internal info: pinning initialization already done
[2] MPI startup(): set domain to {2,3} on node apc789
[2] MPI startup(): Recognition level=1. Platform code=2. Device=5
[2] MPI startup(): Parent configuration:(intra=1 inter=6 flags=0), (code=2 ppn_idx=2)
[4] MPI startup(): Internal info: pinning initialization already done
[4] MPI startup(): set domain to {6,7} on node apc789
[4] MPI startup(): Recognition level=1. Platform code=2. Device=5
[4] MPI startup(): Parent configuration:(intra=1 inter=6 flags=0), (code=2 ppn_idx=2)
[0] MPI startup(): Recognition level=1. Platform code=1. Device=5
[0] MPI startup(): Parent configuration:(intra=1 inter=6 flags=0), (code=1 ppn_idx=2)
Device_reset_idx=1
[0] MPI startup(): Allgather: 1: 0-2048 & 0-4
[0] MPI startup(): Allgather: 1: 524288-2147483647 & 0-4
[1] MPI startup(): set domain to {0,1} on node apc789
[1] MPI startup(): Recognition level=1. Platform code=2. Device=5
[1] MPI startup(): Parent configuration:(intra=1 inter=6 flags=0), (code=2 ppn_idx=2)
[0] MPI startup(): Allgather: 1: 0-256 & 5-8
[0] MPI startup(): Allgather: 4: 0-4 & 9-16
[0] MPI startup(): Allgather: 1: 4-256 & 9-16
[0] MPI startup(): Allgather: 4: 0-8 & 17-32
[0] MPI startup(): Allgather: 1: 8-256 & 17-32
[0] MPI startup(): Allgather: 4: 0-4 & 33-64
[0] MPI startup(): Allgather: 1: 4-256 & 33-64
[0] MPI startup(): Allgather: 1: 0-16 & 65-2147483647
[0] MPI startup(): Allgather: 4: 16-128 & 65-2147483647
[0] MPI startup(): Allgather: 3: 0-2147483647 & 0-2147483647
[0] MPI startup(): Allgatherv: 0: 0-2147483647 & 0-2147483647
[0] MPI startup(): Allreduce: 1: 0-2048 & 0-4
[0] MPI startup(): Allreduce: 3: 2097152-2147483647 & 0-4
[0] MPI startup(): Allreduce: 1: 0-4 & 5-8
[0] MPI startup(): Allreduce: 5: 4-32 & 5-8
[0] MPI startup(): Allreduce: 1: 32-1024 & 5-8
[0] MPI startup(): Allreduce: 1: 0-1024 & 9-16
[0] MPI startup(): Allreduce: 3: 1048576-2147483647 & 9-16
[0] MPI startup(): Allreduce: 1: 0-1024 & 17-32
[0] MPI startup(): Allreduce: 3: 2097152-2147483647 & 17-32
[0] MPI startup(): Allreduce: 1: 0-1024 & 33-64
[0] MPI startup(): Allreduce: 3: 1048576-2147483647 & 33-64
[0] MPI startup(): Allreduce: 1: 0-4 & 65-2147483647
[0] MPI startup(): Allreduce: 5: 4-8 & 65-2147483647
[0] MPI startup(): Allreduce: 1: 8-128 & 65-2147483647
[0] MPI startup(): Allreduce: 5: 128-512 & 65-2147483647
[0] MPI startup(): Allreduce: 2: 0-2147483647 & 0-2147483647
[0] MPI startup(): Alltoall: 3: 0-2 & 0-4
[0] MPI startup(): Alltoall: 2: 2-4 & 0-4
[0] MPI startup(): Alltoall: 3: 4-32 & 0-4
[0] MPI startup(): Alltoall: 2: 32-2097152 & 0-4
[0] MPI startup(): Alltoall: 3: 2097152-2147483647 & 0-4
[0] MPI startup(): Alltoall: 1: 64-128 & 5-8
[0] MPI startup(): Alltoall: 2: 128-131072 & 5-8
[0] MPI startup(): Alltoall: 3: 131072-262144 & 5-8
[0] MPI startup(): Alltoall: 2: 262144-1048576 & 5-8
[0] MPI startup(): Alltoall: 2: 2097152-2147483647 & 5-8
[0] MPI startup(): Alltoall: 2: 256-8192 & 9-16
[0] MPI startup(): Alltoall: 3: 8192-32768 & 9-16
[0] MPI startup(): Alltoall: 2: 524288-1048576 & 9-16
[0] MPI startup(): Alltoall: 2: 256-4096 & 17-32
[0] MPI startup(): Alltoall: 3: 65536-2147483647 & 17-32
[0] MPI startup(): Alltoall: 2: 512-1024 & 33-64
[0] MPI startup(): Alltoall: 3: 2048-32768 & 33-64
[0] MPI startup(): Alltoall: 3: 65536-131072 & 33-64
[0] MPI startup(): Alltoall: 2: 131072-2147483647 & 33-64
[0] MPI startup(): Alltoall: 1: 0-4 & 65-2147483647
[0] MPI startup(): Alltoall: 3: 2048-4096 & 65-2147483647
[0] MPI startup(): Alltoall: 3: 32768-131072 & 65-2147483647
[0] MPI startup(): Alltoall: 4: 0-2147483647 & 0-2147483647
[0] MPI startup(): Alltoallv: 2: 0-2147483647 & 65-2147483647
[0] MPI startup(): Alltoallw: 0: 0-2147483647 & 0-2147483647
[0] MPI startup(): Barrier: 2: 0-2147483647 & 0-4
[0] MPI startup(): Barrier: 3: 0-2147483647 & 5-32
[0] MPI startup(): Barrier: 4: 0-2147483647 & 0-2147483647
[0] MPI startup(): Bcast: 1: 0-65536 & 0-4
[0] MPI startup(): Bcast: 1: 262144-1048576 & 0-4
[0] MPI startup(): Bcast: 4: 0-1048576 & 5-8
[0] MPI startup(): Bcast: 4: 0-2048 & 9-16
[0] MPI startup(): Bcast: 4: 4-8 & 17-32
[0] MPI startup(): Bcast: 4: 16-64 & 17-32
[0] MPI startup(): Bcast: 4: 128-1024 & 17-32
[0] MPI startup(): Bcast: 4: 256-512 & 33-64
[0] MPI startup(): Bcast: 7: 0-2147483647 & 0-2147483647
[0] MPI startup(): Exscan: 0: 0-2147483647 & 0-2147483647
[0] MPI startup(): Gather: 3: 0-1024 & 0-4
[0] MPI startup(): Gather: 3: 8192-16384 & 9-16
[0] MPI startup(): Gather: 3: 131072-262144 & 9-16
[0] MPI startup(): Gather: 3: 1048576-2147483647 & 9-16
[0] MPI startup(): Gather: 3: 2048-4096 & 17-32
[0] MPI startup(): Gather: 3: 32768-262144 & 17-32
[0] MPI startup(): Gather: 3: 524288-2097152 & 17-32
[0] MPI startup(): Gather: 3: 2048-4096 & 33-64
[0] MPI startup(): Gather: 3: 32768-131072 & 33-64
[0] MPI startup(): Gather: 3: 262144-1048576 & 33-64
[0] MPI startup(): Gather: 1: 1048576-2147483647 & 33-64
[0] MPI startup(): Gather: 3: 1024-2048 & 65-2147483647
[0] MPI startup(): Gather: 3: 16384-32768 & 65-2147483647
[0] MPI startup(): Gather: 3: 262144-524288 & 65-2147483647
[0] MPI startup(): Gather: 1: 524288-2147483647 & 65-2147483647
[0] MPI startup(): Gather: 2: 0-2147483647 & 0-2147483647
[0] MPI startup(): Gatherv: 1: 0-2147483647 & 0-2147483647
[0] MPI startup(): Reduce_scatter: 2: 4096-2147483647 & 0-4
[0] MPI startup(): Reduce_scatter: 2: 16384-131072 & 5-8
[0] MPI startup(): Reduce_scatter: 4: 131072-2147483647 & 5-8
[0] MPI startup(): Reduce_scatter: 4: 32768-2147483647 & 9-16
[0] MPI startup(): Reduce_scatter: 4: 32768-2147483647 & 17-32
[0] MPI startup(): Reduce_scatter: 4: 65536-131072 & 33-64
[0] MPI startup(): Reduce_scatter: 2: 131072-262144 & 33-64
[0] MPI startup(): Reduce_scatter: 4: 262144-2147483647 & 33-64
[0] MPI startup(): Reduce_scatter: 5: 32768-524288 & 65-2147483647
[0] MPI startup(): Reduce_scatter: 4: 524288-2147483647 & 65-2147483647
[0] MPI startup(): Reduce_scatter: 1: 0-2147483647 & 0-2147483647
[0] MPI startup(): Reduce: 1: 0-2147483647 & 0-4
[0] MPI startup(): Reduce: 1: 65536-2147483647 & 5-8
[0] MPI startup(): Reduce: 1: 0-512 & 9-16
[0] MPI startup(): Reduce: 1: 1024-4096 & 9-16
[0] MPI startup(): Reduce: 1: 16384-1048576 & 9-16
[0] MPI startup(): Reduce: 1: 0-512 & 17-32
[0] MPI startup(): Reduce: 1: 1024-4096 & 17-32
[0] MPI startup(): Reduce: 1: 16384-65536 & 17-32
[0] MPI startup(): Reduce: 1: 131072-262144 & 17-32
[0] MPI startup(): Reduce: 1: 0-512 & 33-64
[0] MPI startup(): Reduce: 1: 1024-4096 & 33-64
[0] MPI startup(): Reduce: 1: 0-256 & 65-2147483647
[0] MPI startup(): Reduce: 3: 0-2147483647 & 0-2147483647
[0] MPI startup(): Scan: 0: 0-2147483647 & 0-2147483647
[0] MPI startup(): Scatter: 1: 0-0 & 0-4
[0] MPI startup(): Scatter: 2: 1024-2147483647 & 0-4
[0] MPI startup(): Scatter: 2: 0-256 & 5-8
[0] MPI startup(): Scatter: 2: 0-128 & 9-16
[0] MPI startup(): Scatter: 2: 131072-262144 & 9-16
[0] MPI startup(): Scatter: 1: 0-64 & 17-32
[0] MPI startup(): Scatter: 2: 16384-262144 & 17-32
[0] MPI startup(): Scatter: 1: 0-64 & 33-64
[0] MPI startup(): Scatter: 1: 32768-65536 & 33-64
[0] MPI startup(): Scatter: 1: 0-32 & 65-2147483647
[0] MPI startup(): Scatter: 2: 2048-32768 & 65-2147483647
[0] MPI startup(): Scatter: 3: 0-2147483647 & 0-2147483647
[0] MPI startup(): Scatterv: 1: 0-2147483647 & 0-2147483647
Getting MPI_COMM_SIZE
MPI_COMM_SIZE 7
Getting MPI_COMM_RANK
MPI_COMM_RANK 1
Getting MPI_GET_PROCESSOR_NAME
MPI_GET_PROCESSOR_NAME apc789.dhi.dk rank: 1
Calling MPI_Barrier on rank 1
Getting MPI_COMM_SIZE
Getting MPI_COMM_SIZE
MPI_COMM_SIZE 7
Getting MPI_COMM_RANK
MPI_COMM_RANK 4
Getting MPI_GET_PROCESSOR_NAME
MPI_GET_PROCESSOR_NAME apc789.dhi.dk rank: 4
Calling MPI_Barrier on rank 4
Getting MPI_COMM_SIZE
MPI_COMM_SIZE 7
Getting MPI_COMM_RANK
MPI_COMM_RANK 3
Getting MPI_GET_PROCESSOR_NAME
MPI_GET_PROCESSOR_NAME apc789.dhi.dk rank: 3
Calling MPI_Barrier on rank 3
Getting MPI_COMM_SIZE
MPI_COMM_SIZE 7
Getting MPI_COMM_RANK
MPI_COMM_RANK 2
Getting MPI_GET_PROCESSOR_NAME
MPI_GET_PROCESSOR_NAME apc789.dhi.dk rank: 2
Calling MPI_Barrier on rank 2
Getting MPI_COMM_SIZE
MPI_COMM_SIZE 7
Getting MPI_COMM_RANK
MPI_COMM_RANK 6
Getting MPI_GET_PROCESSOR_NAME
MPI_GET_PROCESSOR_NAME apc789.dhi.dk rank: 6
Calling MPI_Barrier on rank 6
MPI_COMM_SIZE 7
Getting MPI_COMM_RANK
MPI_COMM_RANK 5
Getting MPI_GET_PROCESSOR_NAME
MPI_GET_PROCESSOR_NAME apc789.dhi.dk rank: 5
Calling MPI_Barrier on rank 5
[0]  Rank  Pid    Node name  Pin cpu
[0]  0     16844  apc145     {0,1}
[0]  1     7396   apc789     {0,1}
[0]  2     7212   apc789     {2,3}
[0]  3     7420   apc789     {4,5}
[0]  4     3276   apc789     {6,7}
[0]  5     6936   apc789     {8,9}
[0]  6     5084   apc789     {10,11}
[0] MPI startup(): I_MPI_DEBUG=100
[0] MPI startup(): I_MPI_PIN_INFO=x0,1
[0] MPI startup(): NUMBER_OF_PROCESSORS=2
[0] MPI startup(): PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 23 Stepping 6, GenuineIntel
Getting MPI_COMM_SIZE
MPI_COMM_SIZE 7
Getting MPI_COMM_RANK
MPI_COMM_RANK 0
Getting MPI_GET_PROCESSOR_NAME
MPI_GET_PROCESSOR_NAME apc145.dhi.dk rank: 0
Calling MPI_Barrier on rank 0
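
The command line at the top starts seven processes in total: the argument list "-hosts 2 apc145 1 apc789 6" requests two hosts, with 1 process on apc145 and 6 on apc789, which matches the reported MPI_COMM_SIZE of 7, and "-genv I_MPI_DEBUG 100" raises the Intel MPI debug level so that the startup, fabric-selection, collective-algorithm and pinning messages above are printed. The "fabric dapl failed: will try use tcp fabric" lines mean that no DAPL fabric is available and the library falls back to shared memory and TCP ("shm and tcp data transfer modes"); as the rest of the log shows, the run still completes. Each rank then reports its communicator size, rank and processor name, and finally synchronizes on a barrier.

The source of "MPI Test.exe" is not reproduced here, but a minimal C sketch of a test program with the same structure, assuming the printouts roughly match the messages in the log, could look like this:

/* Minimal MPI test sketch: every rank prints the communicator size, its rank
 * and the host name, then synchronizes on a barrier. This is an illustration
 * only, not the actual source of the MPI Test.exe shipped with MIKE Zero. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int size, rank, name_len;
    char name[MPI_MAX_PROCESSOR_NAME];

    printf("MPI_INIT\n");
    MPI_Init(&argc, &argv);

    printf("Getting MPI_COMM_SIZE\n");
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("MPI_COMM_SIZE %d\n", size);

    printf("Getting MPI_COMM_RANK\n");
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    printf("MPI_COMM_RANK %d\n", rank);

    printf("Getting MPI_GET_PROCESSOR_NAME\n");
    MPI_Get_processor_name(name, &name_len);
    printf("MPI_GET_PROCESSOR_NAME %s rank: %d\n", name, rank);

    printf("Calling MPI_Barrier on rank %d\n", rank);
    MPI_Barrier(MPI_COMM_WORLD);

    MPI_Finalize();
    return 0;
}

If the barrier completes on all ranks, as it does in the log above, basic MPI communication between the two hosts is working, which is the purpose of running a small test like this before attempting a distributed simulation.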