
Four principles for writing energy and carbon-efficient software


Intel is known for its industry-leading hardware solutions, and for decades we have been a strong environmental steward. This stewardship has ranged from accelerating sustainability through increasingly efficient products to developing innovative energy-conserving features for our processors and platforms. With this work, Intel is leading the enablement of a more sustainable data center and compute industry. With global data center energy consumption accounting for 0.9-1.3% of total global energy demand (IEA, Sept 2022), that’s a great thing.

But substantial global impact is best achieved by looking at all areas in compute that can provide sustainability value. According to Intel estimates, infrastructure and software inefficiency account for over 50% of greenhouse gas (GHG) emissions in the data center. This shows that it’s not only what’s in your data center that matters, but how you use it. For example, infrastructure inefficiency can be addressed through greater server utilization.

For this blog, though, I will focus on software inefficiency as there are ways to make your software more carbon and energy efficient without compromising its functionality or performance. In fact, making your software more energy and carbon efficient is likely the fastest way to improve the environmental impact of your IT operations. Here are four principles that can guide you in designing and developing software that minimizes its energy consumption and carbon footprint.

Principle 1: If you cannot measure, you cannot improve.

The first principle is to measure the energy efficiency of your software. You cannot improve what you do not know, so you need reliable data and tools to track and analyze how much energy your software consumes and how much carbon it emits.

There are several approaches to measuring the energy consumption of your applications.

One approach is to use an energy meter. These sit between your device and the wall socket and report the actual energy consumption of your device. Since multiple applications can be running on your device at one time, the challenge is attributing how much of that energy is consumed by your one application. This solution is therefore best suited to applications that consume many system resources or are long-running.

One method we are applying within Intel is called Software Carbon Intensity (SCI), developed by The Green Software Foundation. SCI measures the amount of carbon emissions per unit of value, for example, carbon per minute for a video streaming application. It considers both the operational emissions, from the energy consumed, and the embodied emissions, from manufacturing the hardware your software runs on. Leveraging this method can provide a strong foundation for your carbon measurement practices.
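The SCI calculation itself is simple arithmetic. The following is a minimal sketch of the formula as specified by The Green Software Foundation, SCI = (E × I + M) / R; the example numbers are illustrative, not real measurements:

```python
def software_carbon_intensity(energy_kwh, grid_intensity_g_per_kwh,
                              embodied_g, functional_units):
    """SCI = (E * I + M) / R, per The Green Software Foundation.

    energy_kwh              -- operational energy consumed (E)
    grid_intensity_g_per_kwh-- carbon intensity of that energy (I), gCO2e/kWh
    embodied_g              -- embodied emissions amortized to this workload (M)
    functional_units        -- units of value delivered (R), e.g. minutes streamed
    """
    return (energy_kwh * grid_intensity_g_per_kwh + embodied_g) / functional_units

# Illustrative example: 0.5 kWh of energy, a grid intensity of 440 gCO2e/kWh,
# 30 g of amortized embodied emissions, and 120 minutes of video streamed.
sci = software_carbon_intensity(0.5, 440, 30, 120)
print(f"{sci:.2f} gCO2e per minute")  # ≈ 2.08
```

The key design point of SCI is the denominator: tying emissions to a functional unit lets you compare releases of the same application even as traffic grows.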

Another solution is Intel's RAPL (Running Average Power Limit). RAPL provides similar information through a software interface that gives you access to power registers on x86 CPUs. Because it is software-based, it is easier to extend than an external piece of hardware like an energy meter. RAPL provides convenient access to energy consumption information at the socket level; however, you will still need to find a way to attribute that socket-level energy consumption back to a process running on your device. For more information on Intel's RAPL, refer to section 15.10 of Intel's Software Developer's Manual.
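On Linux, the kernel's powercap driver exposes the RAPL package counter as a plain sysfs file, so no special tooling is needed to sample it. The sketch below assumes that sysfs layout (the path and wraparound limit are typical, not guaranteed; real code should read `max_energy_range_uj` from sysfs):

```python
# Sketch: sample the package-level RAPL energy counter exposed by the Linux
# powercap driver and convert two samples into average power.
RAPL_DIR = "/sys/class/powercap/intel-rapl:0"  # package 0 on typical systems

def read_energy_uj(path=RAPL_DIR):
    """Read the cumulative energy counter, in microjoules."""
    with open(f"{path}/energy_uj") as f:
        return int(f.read())

def avg_power_watts(e_start_uj, e_end_uj, seconds, max_range_uj=2**32):
    """Average power between two counter samples, handling counter wraparound.

    max_range_uj is a stand-in here; in practice read max_energy_range_uj
    from the same sysfs directory.
    """
    delta = e_end_uj - e_start_uj
    if delta < 0:               # the counter wrapped around
        delta += max_range_uj
    return (delta / 1_000_000) / seconds  # microjoules -> joules -> watts

# Usage (requires readable powercap files on a Linux x86 machine):
# import time
# e1 = read_energy_uj(); time.sleep(1.0); e2 = read_energy_uj()
# print(avg_power_watts(e1, e2, 1.0), "W at the socket level")
```

Note that this still reports socket-level energy; attributing it to a single process remains your job, for example by sampling around the process's run or by weighting by CPU time.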

Finally, a third solution, if you are using Windows, is to use the PowerCfg utility. Where Intel’s RAPL gives you direct energy consumption at the socket level through telemetry and counters built into the chips, PowerCfg uses a model to estimate energy consumption from other factors like CPU utilization. The advantage of PowerCfg is that it gives you an estimate of the energy consumption per process and hardware component.

Another thing to understand about measurement is that energy and carbon goals may differ but are closely related. Most, though not all, practices in this space start by measuring the energy consumed over a period by the hardware running your software. Converting this value to a carbon impact requires a conversion factor called the carbon intensity of the energy, which is a measure of how much carbon is emitted per kWh; the global average for 2022 was 440 g/kWh.
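The energy-to-carbon conversion is a single multiplication; the sketch below uses the 2022 global average intensity quoted above, though in practice you would use the intensity of the grid your hardware actually runs on:

```python
GLOBAL_AVG_INTENSITY_2022 = 440  # gCO2e per kWh (global average for 2022)

def energy_to_carbon_g(energy_kwh, intensity_g_per_kwh=GLOBAL_AVG_INTENSITY_2022):
    """Convert measured energy into estimated carbon emissions (grams CO2e)."""
    return energy_kwh * intensity_g_per_kwh

# A workload that consumed 1.2 kWh on an average 2022 grid:
print(energy_to_carbon_g(1.2), "g CO2e")  # 528.0 g CO2e
```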

Principle 2: Standardize your practices.

There are different methods and metrics for measuring the energy efficiency of software, such as power consumption, CPU utilization, memory usage, and network traffic. However, these metrics may capture only part of the picture of your software's environmental impact. For example, power consumption may vary depending on the hardware configuration, operating system, and network conditions. Therefore, it is essential to use standardized and comprehensive methods that account for all these factors.

Beyond standards for attributing energy and carbon to software, there is a more significant need to understand how to attribute carbon across data center environments. Through the Open Compute Project, Intel is collaborating on a Sustainability Initiative to develop a standard that enables comparisons across data center environments. This includes manufacturing, operations, software, and end-of-life. Join us in the OCP workstream if you want to be part of this exciting work.

Using these methods and tools, you can gain insights into how your software affects the environment and identify opportunities for improvement.

Principle 3: Meet developers where they are.

The third principle is to meet developers where they are. You should not expect all developers to have the same knowledge or interest in sustainability issues. You must provide them with easy-to-use and accessible tools that integrate with their existing workflows and environments. You don't have to reinvent the wheel whenever you start a new project or update an existing one. Instead, you can adopt best practices that have been proven effective by other developers or organizations.

One practice is called Green Software Engineering (GSE), an emerging discipline that applies principles and techniques from climate science, software engineering, hardware engineering, energy markets, and data center design to create software that minimizes its environmental impact while maximizing its value. GSE covers various topics such as energy efficiency, hardware efficiency, and carbon awareness. Intel has been working through The Green Software Foundation to develop a set of common patterns that can be applied to optimize your software for different scenarios and trade-offs and leverage opportunities for innovation and differentiation.

Another practice is to build functionality in the open-source frameworks developers are already using. Intel’s Global Extensible Open Power Manager (GEOPM) is a collaborative framework for exploring power and energy optimizations on heterogeneous platforms. GEOPM is open-source and developed to enable efficient power management and performance optimizations. With this powerful tool, users are able to monitor their system's energy consumption and safely optimize system hardware settings to help achieve energy efficiency or performance objectives.

Intel’s Granulate helps software teams reduce energy and carbon emissions through automatic and continuous workload optimizations. Granulate passively learns how your application runs and then applies optimizations automatically with no code changes involved. Customers report Granulate can improve application performance by up to 40% (as measured by response time).

By using these tools and following these practices, you can empower your developers to make informed decisions about the environmental impact of their software and encourage them to adopt sustainability practices.


Principle 4: Focus on high impact.

The fourth principle is to focus on high-impact areas where your software can significantly reduce carbon emissions and energy consumption. You do not have to tackle every aspect of your software at once but prioritize those with the most potential for improvement and value creation.

One such area is Artificial Intelligence (AI), one of today's most influential and pervasive technologies. AI has many applications that can help solve some of humanity's most pressing challenges, such as climate change, health care, and education. However, AI also consumes a great deal of energy and generates carbon emissions due to its intensive computational requirements and data processing needs.

Therefore, it is essential to design and develop AI systems that are not only intelligent but also efficient and responsible. This means using techniques such as model compression, quantization, pruning, and distillation to reduce the size and complexity of AI models while preserving their performance and accuracy.

TensorFlow has recently released support for Intel's oneDNN, an open-source deep learning library that enables users to optimize their AI workloads. This library optimizes computation and memory utilization on Intel processor architectures. It also provides several other features, such as automatic optimization of data layout and AI algorithm selection based on hardware characteristics. With this new release, developers can leverage these advanced optimizations to improve the performance of their TensorFlow-based AI applications, with an expectation of up to 3x performance for AI operations, along with gains in energy efficiency.
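A sketch of how the oneDNN optimizations are toggled: TensorFlow reads the `TF_ENABLE_ONEDNN_OPTS` environment variable, which must be set before the library is imported (in recent TensorFlow releases on x86 Linux the optimizations are enabled by default, so this is mainly useful for older versions or for explicit control):

```python
import os

# Must happen before TensorFlow is imported; set to "0" to disable instead.
os.environ["TF_ENABLE_ONEDNN_OPTS"] = "1"

# import tensorflow as tf       # imported only after setting the variable;
# model = tf.keras.Sequential() # eligible ops now run through oneDNN primitives

print("oneDNN opts:", os.environ["TF_ENABLE_ONEDNN_OPTS"])
```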

In addition, scikit-learn-intelex (sklearnex) is an open-source extension package designed by Intel to accelerate the Scikit-learn library. This acceleration is achieved via patching, so it is very easy to use, and to integrate into any existing machine learning project, with minimal disruption. The package is also automatically included in the Anaconda Distribution. Using the package, Intel-optimized software can achieve up to 8.5x and 7x smaller footprints on CPU and DRAM, respectively.
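The patching flow looks like the sketch below; the try/except keeps it runnable on machines where scikit-learn-intelex is not installed, and the scikit-learn usage is only indicated in comments:

```python
# Sketch of the sklearnex patching flow. Requires the scikit-learn-intelex
# package; falls back gracefully where it is unavailable.
try:
    from sklearnex import patch_sklearn
    patch_sklearn()          # swaps in Intel-optimized implementations
    accelerated = True
except ImportError:
    accelerated = False

# Import scikit-learn estimators *after* patching so the accelerated
# versions are picked up -- the API is unchanged:
# from sklearn.cluster import KMeans
# KMeans(n_clusters=8).fit(X)

print("accelerated:", accelerated)
```

Because patching happens at import time, existing projects need no code changes beyond these two lines at the top of the entry point.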

Being more energy efficient isn't the only way to reduce the carbon emissions of AI. It's possible to consume the same amount of energy but emit less carbon by using the cleanest energy possible. The measure of how clean or dirty electricity is is called its carbon intensity, and it varies by region and over time. When the wind is blowing or the sun is shining, more of the electricity you consume comes from lower-carbon renewable sources. Architecting your software to take advantage of these lower-carbon sources of electricity is called carbon-aware computing.
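The simplest carbon-aware pattern is time-shifting: defer a flexible batch job to the hour when the grid is forecast to be cleanest. The sketch below uses made-up forecast numbers; a real system would query a grid-intensity API for its region:

```python
# Hypothetical carbon-aware scheduling sketch. The forecast data is
# illustrative only: hour of day -> forecast grid intensity in gCO2e/kWh.
forecast = {
    0: 380, 3: 310, 6: 290, 9: 210, 12: 180, 15: 200, 18: 340, 21: 400,
}

def greenest_hour(intensity_by_hour):
    """Pick the hour when electricity is forecast to be cleanest."""
    return min(intensity_by_hour, key=intensity_by_hour.get)

hour = greenest_hour(forecast)
print(f"Schedule the batch job at {hour:02d}:00 ({forecast[hour]} gCO2e/kWh)")
```

The same data can drive spatial shifting too, by choosing among regions rather than hours, provided the workload's latency and data-residency constraints allow it.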

Applying these techniques can create more carbon/energy-efficient AI systems that deliver more value to your customers and society.


To recap, here are four principles that can help you write carbon/energy-efficient software:

  • If you cannot measure, you cannot improve: Provide the best data and tooling to measure energy efficiency.
  • Standardize your practices: Adopt frameworks and methodologies that define common requirements for measuring and reporting carbon emissions and energy consumption.
  • Meet developers where they are: Provide easy-to-use and accessible tools that integrate with existing workflows and environments.
  • Focus on high impact: Prioritize areas where your software can significantly reduce carbon emissions.

Sustainability is a challenge, but together we can make a difference.

About the Author
Jen M. Huffstetler is Chief Product Sustainability Officer and VP/GM of Intel Future Platform Strategy and Sustainability. In this role, she is responsible for driving the integration and execution of the corporate-wide Intel Platform technologies & business strategies to drive future growth and corporate-level product strategy and action for Sustainability. Jen joined Intel in 1996 as a fab process engineer and has spent most of her career applying her extensive technical and business experience to lead strategy, product management, and product marketing efforts for a number of core Intel businesses. Most recently she led Data Center Platform Strategy, responsible for building and executing cross-corporate Cloud to Edge technical and business strategies, incubating new services, and driving Xeon Business Management and Operations. Jen holds a bachelor’s degree in chemical engineering from MIT, and an MBA from Babson College, F.W. Olin Graduate School in Corporate Entrepreneurship.