Abstract:
Compilers offer varying levels of optimization, but it is often impractical to apply the most aggressive levels to every file of a large commercial application. Applying high optimization levels to functions that consume little execution time increases compile time and adds risk, without even the possibility of a measurable performance gain. Additionally, some compilers can issue optimization reports, but the size of such reports for an entire project makes them unusable.
A solution to these dilemmas is to apply aggressive compiler optimizations only to those source files that contribute significantly to total elapsed time. To this end, using a performance analyzer to guide makefile modifications lets a developer quickly focus on just those code sections where the effort will be rewarded, as sketched below.
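As a minimal sketch of this approach, assuming GNU Make and gcc: a profiler such as gprof or Intel VTune identifies the hot source files, and a target-specific variable overrides the optimization flags for only those objects. The file names solver.c and kernels.c are hypothetical placeholders for whatever the profiling run identifies.

```make
# Default: moderate optimization for the bulk of the project.
CC     := gcc
CFLAGS := -O1

# Hot files identified by the profiler; hypothetical placeholders here.
HOT_SRCS := solver.c kernels.c
HOT_OBJS := $(HOT_SRCS:.c=.o)

# Target-specific variable: only the hot objects get aggressive flags,
# so compile time and optimization risk stay confined to these files.
$(HOT_OBJS): CFLAGS := -O3 -funroll-loops

OBJS := $(patsubst %.c,%.o,$(wildcard *.c))

app: $(OBJS)
	$(CC) -o $@ $^

%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@
```

Because the override is scoped to the listed objects, the rest of the project keeps building quickly at -O1, and any optimization report the compiler emits covers only the files that matter.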
This article discusses the coordinated use of performance analysis data to drive optimizing-compiler usage, with the aim of improving performance while minimizing effort, time, and risk.