Intel® MPI Library
Get help with building, analyzing, optimizing, and scaling high-performance computing (HPC) applications.

MPI_Sendrecv problem

diedro
Beginner
884 Views

Dear all,

I have a problem with MPI_Sendrecv, probably because I do not fully understand it.

In my program I create a Cartesian topology:

   periods = .FALSE.
   CALL MPI_CART_CREATE (MPI_COMM_WORLD,ndims,dims,periods,.TRUE.,COMM_CART,MPI%iErr)
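For context, the arguments feeding MPI_CART_CREATE are typically prepared along these lines (a sketch; `ndims = 2` and the variable `nCPU` are assumptions, not diedro's actual code):

```fortran
INTEGER :: ndims, nCPU, iErr
INTEGER :: dims(2)
LOGICAL :: periods(2)

ndims   = 2
dims    = 0            ! 0 lets MPI_DIMS_CREATE choose the grid shape
periods = .FALSE.      ! non-periodic in both directions
CALL MPI_COMM_SIZE(MPI_COMM_WORLD, nCPU, iErr)
CALL MPI_DIMS_CREATE(nCPU, ndims, dims, iErr)
```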

and I find all the neighbors:

 ! Find neighbors
 CALL MPI_CART_SHIFT(COMM_CART,0,1,source,RCPU,MPI%iErr)  ! x-dir, right
 CALL MPI_CART_SHIFT(COMM_CART,0,-1,source,LCPU,MPI%iErr) ! x-dir, left
 CALL MPI_CART_SHIFT(COMM_CART,1,1,source,TCPU,MPI%iErr)  ! y-dir, top
 CALL MPI_CART_SHIFT(COMM_CART,1,-1,source,BCPU,MPI%iErr) ! y-dir, bottom
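As a side note, each MPI_CART_SHIFT call already returns both neighbours along a dimension (the source and the destination of the shift), so the four calls above can be reduced to two:

```fortran
! With disp = +1, rank_source is the lower neighbour and
! rank_dest is the upper neighbour along that dimension.
CALL MPI_CART_SHIFT(COMM_CART, 0, 1, LCPU, RCPU, MPI%iErr)  ! x: left, right
CALL MPI_CART_SHIFT(COMM_CART, 1, 1, BCPU, TCPU, MPI%iErr)  ! y: bottom, top
```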

I want each processor to send some data to the processor below it, so I decided to use MPI_Sendrecv:

   !BCPU = destination (the processor below)
   !TCPU = source (the processor above)
   ! Note: these neighbour ranks were obtained on COMM_CART, which was
   ! created with reorder=.TRUE., so COMM_CART must be used here rather
   ! than MPI_COMM_WORLD (the ranks may differ between the two).
   CALL MPI_Sendrecv(P_SEND,nptSend,MPI_PARTICLE_TYPE,BCPU,201,&
                     P_RECV,nptSend,MPI_PARTICLE_TYPE,TCPU,201,COMM_CART,MPI_STATUS_IGNORE,MPI%iErr)

The problem is that the processor below receives only some of the data. I mean, if I set nptSend=100, only the first 20 values of P_RECV are non-zero.

There is something I am missing, and I do not know what.

Can someone help me, please?

1 Reply
diedro

Dear all,

I got it: I had a problem with my MPI_PARTICLE_TYPE definition.

Sorry for taking your time.
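For future readers: this symptom (only part of P_RECV arriving) often comes from a derived datatype whose displacements or extent do not match the Fortran type, or from a missing MPI_TYPE_COMMIT. A sketch of building such a type safely (the particle fields and names here are assumptions, not diedro's actual code; assumes `USE mpi`):

```fortran
! Hypothetical particle: one integer id plus three reals.
TYPE particle
   INTEGER          :: id
   DOUBLE PRECISION :: x, y, z
END TYPE particle

TYPE(particle) :: pp(2)
INTEGER :: tmp_type, MPI_PARTICLE_TYPE, iErr
INTEGER :: blocklen(2), types(2)
INTEGER(KIND=MPI_ADDRESS_KIND) :: disp(2), base, extent

blocklen = (/ 1, 3 /)
types    = (/ MPI_INTEGER, MPI_DOUBLE_PRECISION /)

! Measure displacements instead of guessing them (compiler padding!)
CALL MPI_GET_ADDRESS(pp(1),    base,    iErr)
CALL MPI_GET_ADDRESS(pp(1)%id, disp(1), iErr)
CALL MPI_GET_ADDRESS(pp(1)%x,  disp(2), iErr)
disp = disp - base

CALL MPI_TYPE_CREATE_STRUCT(2, blocklen, disp, types, tmp_type, iErr)

! Resize to the true extent of one array element so that sending
! nptSend particles walks the array with the correct stride.
CALL MPI_GET_ADDRESS(pp(2), extent, iErr)
extent = extent - base
CALL MPI_TYPE_CREATE_RESIZED(tmp_type, 0_MPI_ADDRESS_KIND, extent, &
                             MPI_PARTICLE_TYPE, iErr)
CALL MPI_TYPE_COMMIT(MPI_PARTICLE_TYPE, iErr)   ! easy to forget!
```

If the committed type's extent is smaller than the real array stride, a receive of nptSend elements fills only the leading part of P_RECV, which matches the symptom described above.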
