I have a generalized eigenvalue problem, and I am calling LAPACKE_dsygst eight times, once on each block of my block-diagonal matrix. I seem to have a memory leak. I am calling:
_CrtMemCheckpoint(&s1);
ch = LAPACKE_dsygst(LAPACK_ROW_MAJOR, 1, 'U', _basisPop, &gmat[addresses], R, &emat[addresses], R);
_CrtMemCheckpoint(&s2);
if (_CrtMemDifference(&s3, &s1, &s2))
    _CrtMemDumpStatistics(&s3);
Here _CrtMemCheckpoint records the state of the heap, _CrtMemDifference computes the difference between the two states, and _CrtMemDumpStatistics dumps that difference. As far as I know, LAPACKE_dsygst shouldn't be permanently allocating any memory, yet I see a difference of 1141840 bytes (for an 858 by 858 matrix of doubles) after the first call. What is really puzzling is that I call this eight times, once on each block, but the memory difference shows up only on the first call on the first block: on the other seven calls there is no difference between the before and after states.
Ideas? I'm not a real programmer, I'm a physicist, so I really don't know what I'm doing here.