What is it about?
Rapid growth in cloud technologies such as machine learning has given people around the world unprecedented access to computational power, but the computational demands of these workloads can carry a high energy cost. That energy translates into a real-world carbon footprint: so-called operational carbon emissions arise whenever the electricity powering a data center is not carbon-free, so they depend on the carbon intensity of the grid that supplies it. Carbon intensity is sensitive to small changes in carbon-intensive generation, and it varies by both location and time. Each data center region draws on a different mix of energy sources, so intensity can differ widely between regions. It also varies by hour, day, and season as electricity demand, low-carbon generation (wind, solar, hydro, nuclear, biomass), and conventional hydrocarbon generation change. As a result, there are many opportunities to shift computing resources to capitalize on these variations; this is the part of Green Software Engineering known as 'carbon-aware computing'. Users can make their workloads more carbon-aware through decisions such as choosing a geographic region, deciding when to run a training job, or pausing and resuming workloads in response to grid carbon conditions. Knowing which actions are possible, and what impact each has, helps users make informed decisions about reducing the carbon footprint of their workloads.
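One of these tactics, deferring a flexible job to a cleaner window, can be sketched in a few lines. This is an illustrative example only, not code from the paper: the forecast values are made up, and the function name is our own. It scans an hourly carbon-intensity forecast and picks the start hour whose run window has the lowest average intensity.

```python
# Hypothetical sketch of carbon-aware scheduling: given an hourly
# carbon-intensity forecast (gCO2eq/kWh), choose the start hour that
# minimizes the average intensity over the job's duration.
# The forecast below is illustrative, not real grid data.

def best_start_hour(forecast, duration_hours):
    """Return (start index, mean intensity) of the cleanest window."""
    best_start, best_mean = 0, float("inf")
    for start in range(len(forecast) - duration_hours + 1):
        window = forecast[start:start + duration_hours]
        mean = sum(window) / duration_hours
        if mean < best_mean:
            best_start, best_mean = start, mean
    return best_start, best_mean

# Illustrative 12-hour forecast: intensity dips mid-day as solar ramps up.
forecast = [450, 430, 410, 380, 300, 220, 180, 190, 260, 340, 400, 440]
start, mean = best_start_hour(forecast, duration_hours=4)
```

With this forecast, a 4-hour job is cheapest starting at hour 5, when the solar-heavy window brings the average intensity down to about half the overnight level. A real scheduler would pull its forecast from a grid-data service rather than a hard-coded list.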
Featured Image
Photo by israel palacio on Unsplash
Why is it important?
AI has the potential to accelerate progress, for example by helping integrate renewable energy into electricity grids or by reducing the cost of carbon capture. At the same time, the technology itself needs to be sustainable. We don’t have an accurate measure of AI’s overall carbon footprint because the infrastructure to track and report it is still in its infancy. Without a consistent framework to measure operational carbon emissions at a granular level, users and cloud providers cannot take effective action. We provide a framework for measuring the operational carbon intensity of an ML workload, and three simple tactics that let users mitigate it.
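The core of such accounting can be stated compactly: operational emissions are the energy a workload draws in each interval multiplied by the grid's carbon intensity during that interval, summed over the run. The sketch below is our own illustration of that idea, with made-up interval readings, not the paper's implementation.

```python
# Hedged sketch of operational-carbon accounting: sum, over time intervals,
# energy drawn (kWh) times the grid's carbon intensity (gCO2eq/kWh).
# All numbers below are invented for illustration.

def operational_emissions(energy_kwh, intensity_g_per_kwh):
    """Return total emissions in grams of CO2-equivalent."""
    if len(energy_kwh) != len(intensity_g_per_kwh):
        raise ValueError("series must align interval-by-interval")
    return sum(e * c for e, c in zip(energy_kwh, intensity_g_per_kwh))

# A 4-interval training job: 2 kWh per interval, intensity falling over time.
energy = [2.0, 2.0, 2.0, 2.0]
intensity = [400, 350, 300, 250]  # gCO2eq/kWh
grams = operational_emissions(energy, intensity)  # 2*(400+350+300+250) = 2600 g
```

Measuring per-interval rather than using an annual-average intensity is what makes the time-shifting and pause-and-resume tactics visible in the numbers: the same energy use yields different emissions depending on when it occurs.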
Read the Original
This page is a summary of: Measuring the Carbon Intensity of AI in Cloud Instances, June 2022, ACM (Association for Computing Machinery), DOI: 10.1145/3531146.3533234.