Intel® oneAPI Math Kernel Library

Problem in subsequent calls to PARDISO MKL

OGUZ_UMUT_S_
Beginner

I am using Intel MKL PARDISO to solve the Laplace equation. However, the matrix becomes more sparse every time I call PARDISO due to my physical problem. PARDISO works perfectly fine until it fails during the phase = 11 stage. I suspected this was caused by the very sparse structure of the matrix after some iterations, so I saved the matrix for which PARDISO failed and restarted my program with that saved matrix. PARDISO then works perfectly fine again until it fails after a random number of calls, which may be 1000 calls, or more, or fewer.

 

Is there any known issue related to this?

 

 

Thanks

 

 

mecej4
Honored Contributor III

 However, the Matrix becomes more sparse every time I call PARDISO due to my physical problem

That does not strike me as reasonable. If you are solving the Laplace equation and are not changing the domain, the grid, or the nature of the boundary conditions, you should be able to do the factorization once and repeat the solution phase any number of times. I suspect that you are either not calling PARDISO in the correct sequence or not calling it with the correct parameters.

If you need to troubleshoot your issue further, you should provide a working example that exhibits the behavior you described. If you make repeated calls with phase = 11, you also need to issue calls with phase = 0 or phase = -1 in order to free memory that is no longer needed.
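
For reference, here is a minimal sketch (not taken from your code) of that call sequence with the MKL C interface: analysis and factorization once, repeated solves, and a final release of the handle. The function and variable names are illustrative, and the zero-based CSR arrays are assumed to be filled by the caller.

#include <mkl_pardiso.h>
#include <mkl_types.h>

/* Factor once, solve many times, then release PARDISO's internal memory.
   The zero-based CSR arrays ia, ja, a and the vectors b, x are assumed
   to be set up elsewhere. */
void factor_once_solve_many(MKL_INT n, MKL_INT *ia, MKL_INT *ja, double *a,
                            double *b, double *x, int nsolves)
{
    void    *pt[64];                       /* PARDISO internal handle        */
    MKL_INT  iparm[64];
    MKL_INT  mtype = -2;                   /* real symmetric indefinite      */
    MKL_INT  maxfct = 1, mnum = 1, nrhs = 1, msglvl = 0, error = 0;
    MKL_INT  phase, idum;
    double   ddum;

    pardisoinit(pt, &mtype, iparm);        /* zero the handle, set defaults  */
    iparm[34] = 1;                         /* zero-based ia/ja indexing      */

    phase = 11;                            /* reordering, symbolic factoring */
    pardiso(pt, &maxfct, &mnum, &mtype, &phase, &n, a, ia, ja,
            &idum, &nrhs, iparm, &msglvl, &ddum, &ddum, &error);

    phase = 22;                            /* numerical factorization        */
    pardiso(pt, &maxfct, &mnum, &mtype, &phase, &n, a, ia, ja,
            &idum, &nrhs, iparm, &msglvl, &ddum, &ddum, &error);

    for (int k = 0; k < nsolves && error == 0; ++k) {
        phase = 33;                        /* solve, reusing the factors     */
        pardiso(pt, &maxfct, &mnum, &mtype, &phase, &n, a, ia, ja,
                &idum, &nrhs, iparm, &msglvl, b, x, &error);
    }

    phase = -1;                            /* release all internal memory    */
    pardiso(pt, &maxfct, &mnum, &mtype, &phase, &n, &ddum, ia, ja,
            &idum, &nrhs, iparm, &msglvl, &ddum, &ddum, &error);
}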

Olaf Schenk provides an example C code at http://pardiso-project.org/index.php?p=manual for solving the Laplace equation. I modified the code to run with the MKL version of PARDISO using zero-based indexing (see the attached file; you will also need the file laplace.h from the Pardiso examples download page). The program solves the Laplace equation and checks that the residuals (the elements of the vector A x - b) are all less than 10^-10 times the norm of b; a sketch of such a check appears at the end of this reply. Running with a 100 x 100 grid (about 10,000 unknowns and 30,000 nonzero entries in the symmetric matrix) and 10,000 sets of right-hand sides gave me the following results.

S:\lang\Pard>laplace 100 100 100
n, nnz = 10000, 29800

Reordering completed ...
Number of nonzeros in factors  = 219383
Number of factorization MFLOPS = 14
Factorization completed ...
Factorization time   0.298
Step   10, solution time =   1.770
Step   20, solution time =   1.160
Step   30, solution time =   0.955
Step   40, solution time =   0.804
Step   50, solution time =   0.701
Step   60, solution time =   0.698
Step   70, solution time =   0.716
Step   80, solution time =   0.729
Step   90, solution time =   0.736
Step  100, solution time =   0.712

I made runs using (i) the current 32-bit versions of Intel C (14.0.0.103) and MKL 11, and (ii) the 64-bit versions of MS C V.17 and MKL 10.3.2.154. 
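
For completeness, here is a small sketch of the kind of residual check mentioned above, assuming the full matrix (not just its upper triangle) is stored in zero-based CSR arrays and taking the norm of b as its largest entry; all names are illustrative rather than taken from the attached code.

#include <math.h>

/* Return 1 if max_i |(A x - b)_i| <= 1e-10 * max_i |b_i| for a matrix
   stored in zero-based CSR format (ia, ja, a), 0 otherwise. */
int residual_ok(int n, const int *ia, const int *ja, const double *a,
                const double *x, const double *b)
{
    double rmax = 0.0, bnorm = 0.0;
    for (int i = 0; i < n; ++i) {
        double ax = 0.0;
        for (int k = ia[i]; k < ia[i + 1]; ++k)    /* row i of A times x */
            ax += a[k] * x[ja[k]];
        double r = fabs(ax - b[i]);
        if (r > rmax)           rmax  = r;
        if (fabs(b[i]) > bnorm) bnorm = fabs(b[i]);
    }
    return rmax <= 1.0e-10 * bnorm;                /* tolerance from the post */
}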

OGUZ_UMUT_S_
Beginner

Hi,

Thanks for your message. I solve the Laplace equation to study a discrete fuse network. Every time the current through a fuse reaches a threshold, that fuse is removed from the system. So the problem starts as the discrete Laplace equation and becomes something else as fuses break under the external loading (applied voltage).

However, I solved the problem. The code crashes if you use METIS, which is the default for nonsymmetric matrices, instead of the minimum degree option ("preprocessing with multiple minimum degree, tree height"). Basically, I set iparm(2) = 0 in Fortran. I also think that one should check iparm(13), but I did not test that.

As I mentioned before, if I relaunch the code with the matrix that crashed it, it works correctly again with iparm(2) = 2 until it crashes once more. So I am pretty sure there is an internal memory problem in the Intel PARDISO METIS reordering.
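
For anyone reading along, a minimal sketch of that workaround with the MKL C interface (where iparm is zero-based, so Fortran iparm(2) corresponds to iparm[1]); the function name and the choice of mtype are illustrative only.

#include <mkl_pardiso.h>
#include <mkl_types.h>

/* Initialize a PARDISO handle and select the minimum degree reordering
   instead of the METIS default.  Fortran iparm(2) is iparm[1] in C. */
void init_with_min_degree(void *pt[64], MKL_INT iparm[64])
{
    MKL_INT mtype = 11;   /* real, structurally nonsymmetric (fuse network) */

    pardisoinit(pt, &mtype, iparm);
    iparm[1] = 0;         /* 0 = minimum degree ordering (the workaround)   */
    /* iparm[1] = 2;         2 = METIS nested dissection, the default that
                             was reported to crash here                     */
}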

 

All the Best

 

 

 

Gennady_F_Intel
Moderator

Here are the results I get with the latest version, 11.1, on 64-bit Windows.

n, nnz = 10000, 29800

 iparm(2) == 0

Reordering completed ...
Number of nonzeros in factors  = 203391
Number of factorization MFLOPS = 12
Factorization completed ...
Factorization time   0.143
Step   10, solution time =   1.452
Step   20, solution time =   1.421
Step   30, solution time =   1.293
Step   40, solution time =   1.284
Step   50, solution time =   1.290
Step   60, solution time =   1.281
Step   70, solution time =   1.304
Step   80, solution time =   1.340
Step   90, solution time =   1.287
Step  100, solution time =   1.409
Press any key to continue . . .

+++++++++++++++++++++

n, nnz = 10000, 29800

 iparm(2) == 2

Reordering completed ...
Number of nonzeros in factors  = 219383
Number of factorization MFLOPS = 14
Factorization completed ...
Factorization time   0.180
Step   10, solution time =   1.241
Step   20, solution time =   1.362
Step   30, solution time =   1.244
Step   40, solution time =   1.219
Step   50, solution time =   1.261
Step   60, solution time =   1.212
Step   70, solution time =   1.252
Step   80, solution time =   1.339
Step   90, solution time =   1.212
Step  100, solution time =   3.467
Press any key to continue . . .

 

 

 

OGUZ_UMUT_S_
Beginner

Hello,

You are solving the Laplace equation, which has a symmetric matrix. In my case, I start from the Laplace equation as a discrete fuse model. When the current through a fuse reaches a threshold due to the voltage loading on the top boundary, that fuse is removed from the system. Therefore, the matrix becomes nonsymmetric.

 

With iparm(2) = 0 the issue is solved, but it persists with iparm(2) = 2, which uses the METIS algorithm. As I mentioned before, when I relaunch the code with the matrix that crashed PARDISO, it again works correctly until it crashes once more.

I think there is an internal memory problem with the METIS algorithm.

 

All the Best

 

Gennady_F_Intel
Moderator

It is not clear to me how to reproduce the problem. Can you give an exact example that shows this failure?

regards, Gennady
