Hi:
I have found that cluster_sparse_solver does not release physical memory for the rank=0 process.
I use two MPI processes with two threads each to solve an mtype=6 (complex and symmetric) matrix, with the distributed assembled matrix input format and distributed RHS elements. The full example is in the attached file.
My test repeats the same calculation in a loop. The result is correct in every iteration, and the physical memory for rank=1 stays the same from loop to loop. However, the physical memory for rank=0 keeps climbing. My computer has 16 GB of memory, and the physical memory occupation is shown below:
loop   rank=0, phase 11 (%)   rank=0, phase 23 (%)   rank=1 (%)
  0            4.7                   6.5                4.6
  1            5.7                   7.4                4.6
  2            6.6                   8.3                4.6
  3            7.5                   9.3                4.6
  4            8.4                  10.2                4.6
  5            9.4                  11.2                4.6
I compile with: mpic++ -cxx=icpc -std=c++1y -mkl -xHost plain.cpp
and run with: mpiexec -n 2 ./a.out
I am using MPICH 3.1, MKL 11.2, and icpc 15.0.0 on 64-bit Linux.
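The attachment is not reproduced here, so the following is only a minimal sketch of the kind of loop being described: each iteration runs the analysis phase (11) and the factorization + solve phase (23), then releases all solver memory with phase -1 before the next iteration. All names and the iparm settings are assumptions, not the actual attached code; consult the MKL reference for the full list of required iparm entries for the distributed format (e.g. iparm[40]/iparm[41] for the local row range).

```cpp
#include <mpi.h>
#include <mkl_cluster_sparse_solver.h>

// Hypothetical driver for the reported test (the real code is in the
// attached plain.cpp).  mtype = 6: complex symmetric matrix.
void run_loop(MKL_INT n, MKL_INT *ia, MKL_INT *ja,
              MKL_Complex16 *a, MKL_Complex16 *b, MKL_Complex16 *x,
              int n_loops)
{
    void   *pt[64]    = {0};   // solver handle; must be zeroed before first call
    MKL_INT iparm[64] = {0};
    MKL_INT perm      = 0;     // dummy permutation (unused with default iparm)
    MKL_INT maxfct = 1, mnum = 1, mtype = 6;
    MKL_INT nrhs = 1, msglvl = 0, error = 0, phase;
    int comm = MPI_Comm_c2f(MPI_COMM_WORLD);  // MKL expects a Fortran communicator

    iparm[0]  = 1;   // do not use solver defaults
    iparm[39] = 2;   // distributed A, RHS, and solution (matches the report;
                     // iparm[40]/iparm[41] must also give the local row range)

    for (int loop = 0; loop < n_loops; ++loop) {
        phase = 11;  // reordering and symbolic factorization
        cluster_sparse_solver(pt, &maxfct, &mnum, &mtype, &phase, &n,
                              a, ia, ja, &perm, &nrhs, iparm, &msglvl,
                              b, x, &comm, &error);

        phase = 23;  // numerical factorization, solve, iterative refinement
        cluster_sparse_solver(pt, &maxfct, &mnum, &mtype, &phase, &n,
                              a, ia, ja, &perm, &nrhs, iparm, &msglvl,
                              b, x, &comm, &error);

        phase = -1;  // release all internal memory for this handle; with the
                     // reported leak, rank 0's resident memory still grows
                     // across iterations despite this call
        cluster_sparse_solver(pt, &maxfct, &mnum, &mtype, &phase, &n,
                              a, ia, ja, &perm, &nrhs, iparm, &msglvl,
                              b, x, &comm, &error);
    }
}
```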
Hi!
I see the leak on my side too, although it is not as large as what you observed. Nevertheless, we will check what is going on with this test and get back to you soon.
--Gennady
I tried MKL 11.2 Update 1, but the problem remains. Is there a plan to fix it in MKL?
Yes. The preliminary plan is to include the fix in the next update. We will let you know when the fix is released.
Hi! The problem has been fixed in 11.2 Update 2, which has been officially released. Please check the problem on your side and let us know the results. Thanks.