Intel® MPI Library
Get help with building, analyzing, optimizing, and scaling high-performance computing (HPC) applications.

Problem with MPICH on a Windows HPC cluster

parsa_banihashemi
Hi,

I have a cluster running Windows HPC Server 2007 Service Pack 1, and I cannot run any MPI communication calls on it.

When I run the following simple code on 2 nodes, using one core on each:
--------------------
program barrier_test
use mpi
implicit none
integer :: my_id, num_procs, ierr

call MPI_Init(ierr)
call MPI_Comm_rank(MPI_COMM_WORLD, my_id, ierr)
call MPI_Comm_size(MPI_COMM_WORLD, num_procs, ierr)

print *, my_id, num_procs

! this is the call that fails
call MPI_Barrier(MPI_COMM_WORLD, ierr)

print *, my_id, num_procs

call MPI_Finalize(ierr)
end program barrier_test
--------------------



It does the first print, but on MPI_Barrier it returns the error: rank 0 unable to connect to rank 1 using business card (33709, ...).

I have the same problem with all communicating subroutines: MPI_Bcast, MPI_Reduce, and so on. But if I only have it print the ranks, or have each CPU do some work on its own, it runs fine. I have run this code with the same version of MPICH on a Windows XP PC communicating over Ethernet with a Windows Vista laptop, and it worked there. Please help me out.
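To narrow this down, here is a minimal point-to-point sketch I could also try (this test is my own addition, not from the run above): if a plain MPI_Send/MPI_Recv between ranks 0 and 1 fails with the same "business card" error, the problem is inter-node connectivity (e.g. a firewall blocking the dynamically chosen port), not MPI_Barrier itself.

```fortran
! Point-to-point connectivity check (diagnostic sketch):
! rank 0 sends one integer to rank 1, which prints it.
program p2p_test
use mpi
implicit none
integer :: my_id, num_procs, ierr, msg
integer :: status(MPI_STATUS_SIZE)

call MPI_Init(ierr)
call MPI_Comm_rank(MPI_COMM_WORLD, my_id, ierr)
call MPI_Comm_size(MPI_COMM_WORLD, num_procs, ierr)

if (my_id == 0 .and. num_procs > 1) then
   msg = 42
   call MPI_Send(msg, 1, MPI_INTEGER, 1, 0, MPI_COMM_WORLD, ierr)
else if (my_id == 1) then
   call MPI_Recv(msg, 1, MPI_INTEGER, 0, 0, MPI_COMM_WORLD, status, ierr)
   print *, 'rank 1 received ', msg
end if

call MPI_Finalize(ierr)
end program p2p_test
```

If this hangs or fails only when the two ranks are on different nodes, that points to the network or firewall configuration rather than the MPI code.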

Regards
Parsa