
Hi Fiona,

Thank you for your response. Sorry for the incomplete message earlier; I noticed a mistake and couldn't figure out how to temporarily delete the post before getting back to it today.

Here is the code. First, the value of nb is mysteriously set to zero after the call to Blacs_Gridinfo. Then the values of l_nrows and l_ncols returned by numroc are both equal to the reset value of nb. Any help you can provide is greatly appreciated. Thank you - Phil.

!Compile options:
! -i8 -I${MKLROOT}/include/intel64/ilp64 -I${MKLROOT}/include
!
!Link options:
! ${MKLROOT}/lib/intel64/libmkl_blas95_ilp64.a ${MKLROOT}/lib/intel64/libmkl_lapack95_ilp64.a ${MKLROOT}/lib/intel64/libmkl_scalapack_ilp64.a -Wl,--start-group ${MKLROOT}/lib/intel64/libmkl_intel_ilp64.a ${MKLROOT}/lib/intel64/libmkl_core.a ${MKLROOT}/lib/intel64/libmkl_sequential.a ${MKLROOT}/lib/intel64/libmkl_blacs_openmpi_ilp64.a -Wl,--end-group -lpthread -lm -ldl

Program CheckIntelNumroc
   use mpi
   implicit none
   integer :: NPROW=4, NPCOL=6, NB=4
   integer :: nprocs, iam, IERROR, MYPROC, MYROW, MYCOL, CONTEXT, n
   integer :: l_nrows, l_ncols
   integer, external :: numroc

   CALL MPI_INIT(IERROR)
   CALL MPI_COMM_SIZE(MPI_COMM_WORLD, NPROCS, IERROR)
   CALL MPI_COMM_RANK(MPI_COMM_WORLD, MYPROC, IERROR)
   CALL BLACS_PINFO(IAM, nprocs)
   call blacs_get(0, 0, context)
   call blacs_gridinit(context, 'R', nprow, npcol)
   print *, nb   ! nb returns the correct value here

   CALL BLACS_GRIDINFO(context, nprow, npcol, myrow, mycol)
   print *, nb, iam, myrow, mycol, nprow, npcol   ! NOTE THAT NB=0 in the printout! Why is this?

   nb = 16
   n = 10000
   call blacs_barrier(context, 'a')
   l_nrows = numroc(n, nb, myrow, 0, NPROW)
   l_ncols = numroc(n, nb, mycol, 0, NPCOL)
   ! l_ncols = l_nrows = nb in the printout, which is not correct. Other values seem correct.
   print *, nb, iam, myrow, mycol, nprow, npcol, l_nrows, l_ncols

   Call Blacs_Gridexit(context)
   Call MPI_Finalize(ierror)
end program CheckIntelNumroc


Hi,

Could you please provide more info, including all parameters you pass to numroc_? The process's row/column coordinates in the process grid may also depend on how you call BLACS_GRIDINIT and BLACS_GRIDINFO, so please provide a reproducer/sample code. Thanks.

Best regards,

Fiona


Hi,

I tested with Intel MPI, and there is no problem with the values of l_nrows and l_ncols. For example, suppose we distribute a 10000*10000 matrix over a 4*6 process grid (NPROW=4, NPCOL=6) with block size NB=16, so that there are 4 processes along the row dimension. numroc then does the following work:

! Figure PROC's distance from source process
MYDIST = MOD( NPROCS+IPROC-ISRCPROC, NPROCS )
! Figure the total number of whole NB blocks N is split up into
NBLOCKS = N / NB
! Figure the minimum number of rows/cols a process can have
NUMROC = (NBLOCKS/NPROCS) * NB
! See if there are any extra blocks
EXTRABLKS = MOD( NBLOCKS, NPROCS )
! If I have an extra block
IF( MYDIST.LT.EXTRABLKS ) THEN
   NUMROC = NUMROC + NB
! If I have the last block, it may be a partial block
ELSE IF( MYDIST.EQ.EXTRABLKS ) THEN
   NUMROC = NUMROC + MOD( N, NB )
END IF

That means for the first row of local matrices (myrow=0, 0<=mycol<6), l_nrows is (n/nb/nprow)*nb + nb, i.e. (10000/16/4)*16 + 16 = 2512, and for the remaining rows (myrow/=0, 0<=mycol<6) l_nrows is always 2496 (columns are computed the same way with NPCOL). I saw you tried to print the result, but there is one point to pay attention to: the processes run concurrently, so their print statements are not completed one by one, and the l_nrows of the first local matrix may appear at the end of the output. If you want to see the results in rank order, you could print rank by rank by adding the following code:

! i needs to be declared, e.g. integer :: i; loop over all NPROW*NPCOL = 24 ranks
DO i = 0, 23
   CALL MPI_BARRIER(MPI_COMM_WORLD, IERROR)
   IF (MYPROC == i) THEN
      write (*,*) "after numroc, nb=", nb, ", iam=", iam, ", myrow=", myrow, &
                  ", mycol=", mycol, ", nprow=", nprow, ", npcol=", npcol, &
                  ", l_nrows=", l_nrows, ", l_ncols=", l_ncols
   END IF
END DO
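As a cross-check of the arithmetic above, here is a small Python transcription of the quoted NUMROC formula (the function name and script are just an illustration, not part of ScaLAPACK). It reproduces the expected per-row counts for n=10000, nb=16, nprow=4:

```python
def numroc(n, nb, iproc, isrcproc, nprocs):
    """Python transcription of ScaLAPACK's NUMROC: number of rows/cols of an
    n-element dimension owned by process iproc under block-cyclic distribution
    with block size nb, source process isrcproc, and nprocs processes."""
    mydist = (nprocs + iproc - isrcproc) % nprocs  # distance from source process
    nblocks = n // nb                              # number of whole NB blocks
    count = (nblocks // nprocs) * nb               # minimum every process gets
    extrablks = nblocks % nprocs                   # leftover whole blocks
    if mydist < extrablks:
        count += nb                                # one extra whole block
    elif mydist == extrablks:
        count += n % nb                            # possibly a partial last block
    return count

# 10000 rows, NB=16, NPROW=4: row 0 of the grid gets 2512, rows 1-3 get 2496
print([numroc(10000, 16, r, 0, 4) for r in range(4)])  # [2512, 2496, 2496, 2496]
```

Note that the four counts sum to 10000, so every row of the global matrix is owned by exactly one process row.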

Best regards,

Fiona


Hi Fiona,

Thank you for your response. The problem appears to have been a linker issue, and it is fixed now.

Cheers,

Phil
