
MPICH3 and libmkl_blacs_mpich_

Jesse_Q_
Beginner

While attempting to run siesta-4.1-b3 built with MPICH3 (3.1.4) and Intel MKL 2018 (libmkl_blacs_mpich), I received the following error:

[Jesses-iMac-2:98886] *** An error occurred in MPI_Comm_rank
[Jesses-iMac-2:98886] *** reported by process [128122881,206158430208]
[Jesses-iMac-2:98886] *** on communicator MPI_COMM_WORLD
[Jesses-iMac-2:98886] *** MPI_ERR_COMM: invalid communicator
[Jesses-iMac-2:98886] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[Jesses-iMac-2:98886] ***    and potentially your MPI job)

I believe this is an error related to mpi.h. Is MPICH3 supported, or do I need to install MPICH2 (1.5-rc3 being the latest MPICH2 version I am aware of)? My operating system is macOS High Sierra.

 

Ying_H_Intel
Employee

Hi Jesse,

Here are the MKL system requirements for your reference: https://software.intel.com/en-us/articles/intel-math-kernel-library-intel-mkl-2018-system-requirements
macOS* 10.12 (Xcode 8.x) and macOS* 10.13 (Xcode 9.x) (Intel® 64) are supported.
MPICH version 3.1 (http://www-unix.mcs.anl.gov/mpi/mpich) is supported.

Which compiler are you using on that machine?
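For reference, a quick way to check (assuming the Intel compiler and the MPICH wrappers are already on your PATH) is to run:

ifort --version
mpichversion

mpichversion is a small utility installed alongside MPICH, and mpif90 -show will also report which underlying compiler the wrapper invokes.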

One more thing: the zip file you shared seems to be empty.

Best Regards,
Ying

P.S. The MKL Link Line Advisor (https://software.intel.com/en-us/articles/intel-mkl-link-line-advisor/) will help you find the exact MKL libraries needed when you link your application.

For example:
-qopenmp $(MKL)/lib/libmkl_blas95_ilp64.a $(MKL)/lib/libmkl_lapack95_ilp64.a $(INTEL_LIBS) -Wl,-rpath,$(MKL)/lib \
-lmkl_scalapack_ilp64 -lmkl_intel_ilp64 -lmkl_intel_thread -lmkl_core -lmkl_blacs_mpich_ilp64 -liomp5 -lpthread -lm -ldl

Note that this link line uses the ILP64 interface (64-bit integers). Please make sure your SIESTA build actually asks for 64-bit integers.
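For reference, here is a sketch of the corresponding LP64 (32-bit integer) link line, assuming a default SIESTA build with 32-bit integers; the only change is swapping each ilp64 library for its lp64 counterpart:

-qopenmp $(MKL)/lib/libmkl_blas95_lp64.a $(MKL)/lib/libmkl_lapack95_lp64.a $(INTEL_LIBS) -Wl,-rpath,$(MKL)/lib \
-lmkl_scalapack_lp64 -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -lmkl_blacs_mpich_lp64 -liomp5 -lpthread -lm -ldl

Mixing the two interfaces (for example, linking ILP64 MKL libraries into an application built with default 32-bit integers) typically causes crashes at run time.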

 

 

Jesse_Q_
Beginner

Hi Ying,

My Intel compilers are version 18.0.2, and Xcode is 9.3.

I will redo all compilations (hdf5, netcdf, netcdf-fortran) with MPICH 3.1 (mpich-3.1.tar.gz) and will try linking with both 32-bit and 64-bit integers to see if that is also a problem. I will get back to you shortly.

Regards,

Jesse_Q_
Beginner

OK.

Compiling with MPICH 3.1, with either 64-bit or 32-bit integers, I get the following error:

[jessequinn@Jesses-iMac-2 Tests]$ make MPI="mpirun -np 4 /Users/jessequinn/Packages/tmp/siesta-4.1-b3/Obj/siesta"
/Applications/Xcode.app/Contents/Developer/usr/bin/make -C ag completed
>>>> Running ag test...
    ==> Copying pseudopotential file for Ag...
    ==> Running SIESTA as mpirun -np 4 /Users/jessequinn/Packages/tmp/siesta-4.1-b3/Obj/siesta ../../../siesta -fdf XML.Write ../ag.fdf
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image              PC                Routine            Line        Source
libifcoremt.dylib  000000010D6D5254  for__signal_handl     Unknown  Unknown
libsystem_platfor  00007FFF7D032F5A  _sigtramp             Unknown  Unknown
libmpi.40.dylib    000000010D167204  MPI_Comm_rank         Unknown  Unknown
libpmpich.12.dyli  000000010D25356B  mpi_comm_rank_        Unknown  Unknown
siesta             0000000102BAE3B5  mpi_siesta_mp_mym     Unknown  Unknown
siesta             000000010247A6B2  m_siesta_init_mp_     Unknown  Unknown
siesta             0000000102ABDA70  MAIN__                Unknown  Unknown
siesta             0000000102234DAE  main                  Unknown  Unknown
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image              PC                Routine            Line        Source
libifcoremt.dylib  0000000115E07254  for__signal_handl     Unknown  Unknown
libsystem_platfor  00007FFF7D032F5A  _sigtramp             Unknown  Unknown
libmpi.40.dylib    00000001112CC204  MPI_Comm_rank         Unknown  Unknown
libpmpich.12.dyli  00000001159CE56B  mpi_comm_rank_        Unknown  Unknown
siesta             000000010B20B3B5  mpi_siesta_mp_mym     Unknown  Unknown
siesta             000000010AAD76B2  m_siesta_init_mp_     Unknown  Unknown
siesta             000000010B11AA70  MAIN__                Unknown  Unknown
siesta             000000010A891DAE  main                  Unknown  Unknown
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image              PC                Routine            Line        Source
libifcoremt.dylib  000000010BD78254  for__signal_handl     Unknown  Unknown
libsystem_platfor  00007FFF7D032F5A  _sigtramp             Unknown  Unknown
libmpi.40.dylib    000000010AD75204  MPI_Comm_rank         Unknown  Unknown
libpmpich.12.dyli  000000010B93F56B  mpi_comm_rank_        Unknown  Unknown
siesta             00000001011743B5  mpi_siesta_mp_mym     Unknown  Unknown
siesta             0000000100A406B2  m_siesta_init_mp_     Unknown  Unknown
siesta             0000000101083A70  MAIN__                Unknown  Unknown
siesta             00000001007FADAE  main                  Unknown  Unknown
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image              PC                Routine            Line        Source
libifcoremt.dylib  00000001148D9254  for__signal_handl     Unknown  Unknown
libsystem_platfor  00007FFF7D032F5A  _sigtramp             Unknown  Unknown
libmpi.40.dylib    000000011438A204  MPI_Comm_rank         Unknown  Unknown
libpmpich.12.dyli  000000011446756B  mpi_comm_rank_        Unknown  Unknown
siesta             00000001106A83B5  mpi_siesta_mp_mym     Unknown  Unknown
siesta             000000010FF746B2  m_siesta_init_mp_     Unknown  Unknown
siesta             00000001105B7A70  MAIN__                Unknown  Unknown
siesta             000000010FD2EDAE  main                  Unknown  Unknown
make[1]: *** [completed_work] Error 174
make: *** [ag] Error 2
[jessequinn@Jesses-iMac-2 Tests]$

Any idea?
Ying_H_Intel
Employee

Hi Jesse,

Do you have another machine, e.g. a Linux box? The combination of this MPI and macOS may still be the issue.

From the system requirements, it seems no MPI implementation is listed as validated for macOS.

MPI implementations that Intel® MKL for Windows* OS has been validated against:

  • Intel® MPI Library Version 5.1 (Intel® 64) (http://www.intel.com/go/mpi)
  • Intel® MPI Library Version 2018 (Intel® 64) (http://www.intel.com/go/mpi)
  • Intel® MPI Library Version 2017 (Intel® 64) (http://www.intel.com/go/mpi)
  • MPICH2 version 1.5 (http://www-unix.mcs.anl.gov/mpi/mpich)
  • MS MPI, CCE or HPC 2012 on Intel® 64 (http://www.microsoft.com/downloads)
  • Open MPI 1.8.x (Intel® 64) (http://www.open-mpi.org)

MPI implementations that Intel® MKL for Linux* OS has been validated against:

  • Intel® MPI Library Version 5.1 (Intel® 64) (http://www.intel.com/go/mpi)
  • Intel® MPI Library Version 2018 (Intel® 64) (http://www.intel.com/go/mpi)
  • Intel® MPI Library Version 2017 (Intel® 64) (http://www.intel.com/go/mpi)
  • MPICH2 version 1.5 (Intel® 64) (http://www-unix.mcs.anl.gov/mpi/mpich)
  • MPICH version 3.1  (http://www-unix.mcs.anl.gov/mpi/mpich)
  • MPICH version 3.2  (http://www-unix.mcs.anl.gov/mpi/mpich)
  • Open MPI 1.8.x (Intel® 64) (http://www.open-mpi.org)


Alternatively, since the environment and dependencies seem complex, could you submit your question to the Online Service Center (supporttickets.intel.com), where our engineers can look into the details?

Best Regards,

Ying

 

Jesse_Q_
Beginner

According to the Link Line Advisor, MPICH is the only MPI implementation supported on macOS; however, which version, I do not know. And no, I do not have a Linux machine.

Konstantin_A_Intel

Hi Jesse,

As far as I can see, SIESTA crashes at the initialization stage, inside MPI_Comm_rank. That indicates you have an issue with MPI itself; it is hardly connected to MKL. So I recommend you either try MPICH2 or run a simple MPI test first, such as the sketch below.
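A minimal sketch of such a test, assuming the MPICH wrappers (mpif90, mpirun) point at the same compilers and MPI installation used to build SIESTA; it does nothing except initialize MPI and query the rank, which is exactly the call that crashes in your traceback:

program mpi_test
  use mpi
  implicit none
  integer :: ierr, rank, nprocs
  call MPI_Init(ierr)
  ! The same call that fails during SIESTA's initialization
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
  write(*,*) 'Hello from rank', rank, 'of', nprocs
  call MPI_Finalize(ierr)
end program mpi_test

Build and run it with the same MPICH, for example mpif90 -o mpi_test mpi_test.f90 followed by mpirun -np 4 ./mpi_test. If this already fails, the problem lies in the MPI installation itself rather than in MKL or SIESTA. It may also be worth checking which MPI libraries the siesta binary actually picks up at run time (on macOS, otool -L ./siesta lists them), to make sure only one MPI implementation is involved.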

Regards,

Konstantin
