I'm working on a power optimization problem, but I've been unable to find any information explicitly showing CPU power consumption as a function of CPU temperature.
A number of papers and presentations discuss how CPU leakage current increases with temperature, but concrete data is never listed. One IEEE paper written by Intel engineers (The Effect of Data Center Temperature on Energy Efficiency) estimates that 50% of power usage is due to leakage current and that it increases by about 2% per degree Celsius.
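For what it's worth, here is the back-of-the-envelope model I've been working from. The reference temperature, the 50% leakage fraction, and the 2%/°C growth rate are my reading of the paper's figures, not measured data:

```python
# Rough model of the paper's estimate (assumptions, not measurements):
# at a reference temperature, leakage is ~50% of total package power,
# and leakage grows ~2% per degree Celsius above that reference.

def package_power(total_at_ref_w, temp_c, ref_temp_c=50.0,
                  leakage_fraction=0.5, leakage_growth_per_c=0.02):
    """Estimate package power at temp_c from power measured at ref_temp_c."""
    leakage_ref = total_at_ref_w * leakage_fraction
    dynamic = total_at_ref_w - leakage_ref   # treated as temperature-independent
    leakage = leakage_ref * (1.0 + leakage_growth_per_c * (temp_c - ref_temp_c))
    return dynamic + leakage

if __name__ == "__main__":
    # e.g. a 95 W part measured at 50 C
    for t in (50, 60, 70, 80):
        print(f"{t} C: {package_power(95.0, t):.1f} W")
```

Under those assumptions a 95 W package at 50 °C would draw roughly 104.5 W at 60 °C and 114 W at 70 °C, which is the magnitude of effect I was hoping to confirm with real data.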
I've tried reproducing the experimental results with some older Xeon processors with little success, though I'll admit to poor experimental control.
I realize there is little doubt that leakage current is affected by temperature, but does anyone actually have specific data? I'm particularly interested in server CPUs. Thank you for the help.
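For context, this is roughly how I've been pairing power and temperature readings on Linux. Note the RAPL energy counter only exists on Sandy Bridge and later parts, so it won't help on the older Xeons above, the sysfs paths may differ per system, and reading them may require root:

```python
# Minimal Linux sketch pairing package power (RAPL) with package
# temperature (coretemp) via the standard powercap/hwmon sysfs files.
import glob
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package 0

def coretemp_path():
    """Find the coretemp hwmon device and return its package temp file."""
    for name_file in glob.glob("/sys/class/hwmon/hwmon*/name"):
        if open(name_file).read().strip() == "coretemp":
            # temp1_input is the package temperature in millidegrees C
            return name_file.replace("name", "temp1_input")
    raise RuntimeError("coretemp hwmon not found")

def sample(interval_s=1.0):
    """Return (temperature C, average package watts) over one interval."""
    temp_file = coretemp_path()
    e0 = int(open(RAPL_ENERGY).read())
    time.sleep(interval_s)
    e1 = int(open(RAPL_ENERGY).read())
    temp_c = int(open(temp_file).read()) / 1000.0
    # energy_uj counts microjoules; this ignores counter wraparound
    watts = (e1 - e0) / 1e6 / interval_s
    return temp_c, watts

if __name__ == "__main__":
    while True:
        t, w = sample()
        print(f"{t:5.1f} C  {w:6.1f} W")
```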
1 Reply
Quoting - gearheadcb
Hello Gearheadcb,
Unfortunately, this forum supports Active Management Technology and the developers who are writing software for it. I looked through our other forums and didn't see one that really fits your question. I would try submitting your question through our support email.
