Hi,
I have observed some strange behaviour in my code, and it seems to be connected to the number of allocated pointers.
I'm using Fortran 9.030 on Windows 2003 Compute Cluster Edition (Beta) EM64T, running in parallel (MPI) on 24 machines with 8 GB of memory each.
The code is also linked against PETSc libraries compiled with cl 14.00.40310.41.
In the code (a 3D electromagnetic field computation) I allocate a pointer for each face in the grid when meshing structures. The code runs well under 32-bit, and also under EM64T as long as the problems remain small (approximately 6 million allocated pointers). But when I try to solve larger problems (~30 million allocated pointers) the code begins to crash. The crashes (segmentation faults) happen inside the PETSc library but are not fully predictable. Which processor crashes depends on the memory utilization: changing the number of grid cells, pre-allocating stack, and any operation that changes the allocated memory affects the site of the crash.
I managed to work around the allocations for now and the code runs fine, but I will need to use the pointers in the future.
Is there a limit to the number of pointers, or is the error located somewhere else?
1 Reply
There is no limit that I know of. If you have an example that shows the problem, please submit it to Intel Premier Support.