I have come across a problem with FEAST in MKL 11.2, whereby it does not return all the eigenvalues in the specified search range. The problem type is Generalized Sparse (feast_scsrgv).
I use default values in fpm. I provide an initial guess subspace size, say m0 = 500. The solver returns with info = 0 and a returned number of eigenvalues m = 496. At this stage one might think all is fine; however, if I increase the initial subspace guess to m0 = 600, the returned values are info = 0 and m = 555. Further increases in m0 show that m remains equal to 555. Hence there were 59 missing eigenvalues in the first solve with m0 = 500, in which case I would have expected info = 3 (subspace size too small). Furthermore, the 59 missing eigenvalues were at the lower end of the range, which is the more critical part for my scenario. The problem size n was 1346.
This is not an isolated case; we have observed the same behavior with several problems.
It is most disconcerting, as this affects a product used by many engineers.
I look forward to your response.
The attached file contains an f90 project and an eigeninout.dat file with the matrix data.
If you change m0size to 10 and run, you get info = 0 and m = 9; the smallest eigenvalue found is 14.43.
If you change m0size to 11 and run, you get info = 0 and m = 10; the smallest eigenvalue found is 3.72.
With m0size = 10 I would expect info = 3, since not all eigenvalues in the specified range have been found.
Thank you for your help,
I have carefully investigated your problem and must say that this is expected behavior of the algorithm. Eigenvalues close to the edge of the search interval can be lost due to the limited precision of the spectral projector near the endpoints. To avoid this problem you can improve accuracy by increasing the number of contour points Ne (set fpm(2) = 16), or, what I would recommend instead, extend the interval and use an initial guess for the subspace dimension m0 that is 1.5 times bigger, to ensure that all of the eigenvalues are found.
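To see why endpoint eigenvalues are fragile, here is a toy model of the idea behind FEAST's spectral projector (my own illustration with a simple trapezoidal quadrature on a circular contour; the actual MKL implementation uses a different, more sophisticated quadrature). The exact projector equals 1 for an eigenvalue inside the contour and 0 outside; a finite number of quadrature points approximates this well mid-interval but poorly near the edge, and more points only slowly improve the edge behavior.

```python
import cmath

def filter_value(t, ne):
    # Trapezoidal-rule approximation of the spectral projector
    #   rho(t) = (1 / (2*pi*i)) * contour integral of dz / (z - t)
    # over the unit circle, evaluated at a scaled eigenvalue t.
    # The exact projector is 1 for |t| < 1 and 0 for |t| > 1.
    total = 0j
    for k in range(ne):
        z = cmath.exp(1j * cmath.pi * (2 * k + 1) / ne)  # ne nodes on the circle
        total += z / (z - t)
    return (total / ne).real  # imaginary parts cancel in conjugate pairs

# t is the eigenvalue position scaled so the interval edge sits at t = 1.
for ne in (8, 16):
    mid = abs(filter_value(0.5, ne) - 1)    # eigenvalue well inside the interval
    edge = abs(filter_value(0.95, ne) - 1)  # eigenvalue close to the edge
    print(f"ne={ne}: mid-interval error {mid:.2e}, near-edge error {edge:.2e}")
```

Running this shows the mid-interval error shrinking rapidly as ne grows, while the near-edge error stays large, which is consistent with losing eigenvalues at the ends of the range.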
Many thanks. I have tried the increased number of contour points on the sample project and it works. Will increasing Ne from 8 to 16 significantly impact the performance of the solver, e.g. will it double the time?
That is correct: doubling the number of contour points can result in up to 2 times worse performance. That is why I would recommend extending the interval or using a bigger initial guess for the subspace dimension m0, since these two approaches only slightly increase the time.
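The benefit of extending the interval can be shown with the same kind of toy quadrature model of the spectral projector (again my own sketch, not MKL's actual contour): widening [Emin, Emax] rescales a near-edge eigenvalue toward the middle of the contour, so the default number of contour points resolves it far better without any extra linear solves per iteration.

```python
import cmath

def filter_value(t, ne):
    # Toy spectral projector: trapezoidal quadrature of
    # (1 / (2*pi*i)) * contour integral of dz / (z - t) over the unit circle.
    # Exact value: 1 for |t| < 1 (inside the interval), 0 for |t| > 1.
    total = 0j
    for k in range(ne):
        z = cmath.exp(1j * cmath.pi * (2 * k + 1) / ne)
        total += z / (z - t)
    return (total / ne).real

NE = 8    # default number of contour points (fpm(2) = 8 in MKL)
t = 0.95  # eigenvalue sitting close to the interval edge (edge at t = 1)

err_original = abs(filter_value(t, NE) - 1)
# Extending [Emin, Emax] by 20% rescales the same eigenvalue away from
# the contour edge, so the default quadrature resolves it much better.
err_extended = abs(filter_value(t / 1.2, NE) - 1)
print(f"near-edge error: {err_original:.3f} -> {err_extended:.3f} after extending")
```

In this model the filter error for the near-edge eigenvalue drops severalfold after a 20% extension, at the same contour-point count, which is why extending the interval (with a correspondingly larger m0) is the cheaper fix.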