Intel® oneAPI Math Kernel Library
Ask questions and share information with other developers who use Intel® Math Kernel Library.

Using Intel MKL with PETSc

ngolubev
Beginner
1,896 Views
I tried to configure petsc-3.0.0-p7 with MKL 10.2.3.029 on a Core 2 machine
(Fedora 11)
./config/configure.py PETSC_ARCH=linux-intel --with-debugging=0 --with-cc=gcc
--with-fc=gfortran COPTFLAGS='-O3 -march=p4 -mtune=p4' FOPTFLAGS='-O3 -qarch=p4
-qtune=p4' --download-mpich=1
--with-blas-lapack-dir=/opt/intel/mkl/10.2.3.029/lib/em64t

As a result I got this message:
UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
-----------------------------------------------------------------
----------------------
You set a value for --with-blas-lapack-dir=, but
/opt/intel/mkl/10.2.3.029/lib/em64t cannot be used

I found a lot of errors in the log file. For example, here is part of it:

Possible ERROR while running linker: /usr/bin/ld: cannot find -lmkl_lapack
............................
/opt/intel/mkl/10.2.3.029/lib/em64t/libmkl_lapack.so: undefined reference to
`mkl_pds_sp_blkslv_pardiso'
/opt/intel/mkl/10.2.3.029/lib/em64t/libmkl_lapack.so: undefined reference to
`mkl_lapack_zgetrf'
/opt/intel/mkl/10.2.3.029/lib/em64t/libmkl_lapack.so: undefined reference to
`mkl_serv_xerbla'
/opt/intel/mkl/10.2.3.029/lib/em64t/libmkl_def.so: undefined reference to
`mkl_pdepl_d_inv_ft_dd_nd'
/opt/intel/mkl/10.2.3.029/lib/em64t/libmkl_lapack.so: undefined reference to
`mkl_lapack_cunmqr'

Both libmkl_lapack.so and libmkl_def.so exist in that directory.

How can I check why this happens? Is it because of the new versions of PETSc and MKL,
or did I just install MKL incorrectly?
PETSc works with the standard BLAS, so the problem seems to be in MKL.

Is there a more detailed description of how to connect PETSc to MKL than
http://software.intel.com/en-us/articles/mkl-blas-lapack-with-petsc/ ?
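In case it helps to diagnose this, one way to see where the undefined references should come from is to inspect the libraries directly. This is just a sketch: the paths are taken from the configure line above, and mkl_serv_xerbla is one of the symbols from the error messages.

```shell
# Show which shared libraries libmkl_lapack.so declares as dependencies
cd /opt/intel/mkl/10.2.3.029/lib/em64t
ldd libmkl_lapack.so

# List a few symbols libmkl_lapack.so needs but does not define itself;
# these must be resolved by some other MKL library on the link line
nm -D libmkl_lapack.so | grep ' U mkl_' | head

# Find which library in this directory actually defines one of the
# missing symbols (mkl_serv_xerbla as an example)
for lib in libmkl_*.so; do
  if nm -D --defined-only "$lib" 2>/dev/null | grep -q ' mkl_serv_xerbla$'; then
    echo "$lib defines mkl_serv_xerbla"
  fi
done
```

If the defining library is not on the link line PETSc's configure generates, that would explain the undefined references.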



4 Replies
Artem_V_Intel
Employee

Hello,

I tried to build petsc-3.0.0-p10 with MKL 10.2.3.029 on an Intel 64-bit dual-core machine. I used the following commands to build it:

$ PETSC_DIR=$PWD

$ export PETSC_DIR

$ ./configure --with-blas-lapack-dir=/opt/intel/mkl/10.2.3.029/lib/em64t

$ make all

$ make test

I didn't run into any issues during compilation, and all tests ran successfully.

Thanks,

Art

TimP
Honored Contributor III

The document


http://software.intel.com/en-us/articles/mkl-blas-lapack-with-petsc/


refers specifically to MKL 10.1. I suspect you may be running into the change in 10.2:
the libmkl_lapack.so you have may no longer be a linker script that pulls in a particular choice of the other
layers, such as the lp64, core, and thread libraries. It may require an expert on PETSc and MKL
to update the procedure so as to take care of these dependencies.
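For what it's worth, here is a sketch of what the layered 10.2 link line looks like with gcc (sequential threading shown; the exact set of libraries for your configuration should be confirmed with the MKL link line advisor, and myprog.o is just a placeholder):

```shell
MKLROOT=/opt/intel/mkl/10.2.3.029

# Interface layer + threading layer + core layer, instead of -lmkl_lapack
gcc myprog.o \
  -L$MKLROOT/lib/em64t \
  -lmkl_intel_lp64 -lmkl_sequential -lmkl_core \
  -lpthread -lm

# For PETSc, passing the explicit list via --with-blas-lapack-lib
# instead of --with-blas-lapack-dir may keep configure from guessing wrong:
./config/configure.py \
  --with-blas-lapack-lib="-L$MKLROOT/lib/em64t -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread"
```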

I just completed a project where I had to give up on the shared Intel libraries and use not only the static
library setup prescribed by the MKL link advisor, but also add static libraries from the Intel compiler
library directory. The icpc -static-intel link command took care only of the libraries that directly
satisfy dependencies of the C++ source files, not those brought in indirectly by MKL or by Fortran
source files. One reason given for not providing a fully automatic static-link procedure is that
people would make mistakes with OpenMP support when generating .so files.
ngolubev
Beginner

Thanks, Artem,

I reduced the number of configure options and have now configured PETSc with MKL.

Unfortunately, I have a new problem now. If I compare results for a typical model (Poisson equation, 220×220×220 cells, a pretty sparse matrix, KSPBCGS), I don't see any improvement. So I need advice from experienced people. Can MKL improve the performance of sparse matrix solves? Did anybody get such an improvement with PETSc?

For dense matrices, using a processor-optimized library (like ATLAS) gives code that is several times faster than plain Fortran-compiled BLAS.

The new link line uses

/opt/intel/mkl/10.2.3.029/lib/em64t -L/opt/intel/mkl/10.2.3.029/lib/em64t -lflapack -lfblas -lnsl -lrt -lm

-L/opt/intel/mkl/10.2.3.029/lib/em64t -ldl -lmpich -lpthread -lrt -lgcc_s -lmpichf90 -lgfortranbegin -lgfortran -lm

compared to the standard one:

-L/home/Nik/soft/petsc-3.0.0-p7/linux-gnu-c-debug/lib -lflapack -lfblas -lnsl -lrt -lm

-L/lib64 -ldl -lmpich -lpthread -lrt -lgcc_s -lmpichf90 -lgfortranbegin -lgfortran -lm
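One thing worth checking: the "MKL" link line above still lists -lflapack -lfblas rather than any libmkl_* names, so the build may in fact still be linking PETSc's own Fortran BLAS/LAPACK, which would explain seeing no difference. A quick way to confirm which BLAS the binary actually resolves at run time (the executable name here is hypothetical):

```shell
# If MKL is really in use, some libmkl_*.so should appear in the output
ldd ./my_petsc_app | grep -iE 'mkl|lapack|blas'
```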

Any advice on how to improve performance is appreciated.

Nik

ngolubev
Beginner
Quoting ngolubev
Unfortunately, I have a new problem now. If I compare results for a typical model (Poisson equation, 220×220×220 cells, a pretty sparse matrix, KSPBCGS), I don't see any improvement. So I need advice from experienced people. Can MKL improve the performance of sparse matrix solves? Did anybody get such an improvement with PETSc?
For dense matrices, using a processor-optimized library (like ATLAS) gives code that is several times faster than plain Fortran-compiled BLAS.

I have bad news: I got a response from the PETSc developers. They don't expect any improvement from using MKL for my problem.

Anyway, any information on using MKL with PETSc is appreciated.

Nik
