
Decarbonizing The Internet



Mara Grunbaum, Yasmin Tayag
Jun 8, 2023

The internet as we know it is inextricable from the cloud—the ethereal space through which all e-mails, Zooms, and Instagram posts pass. As many of us well know, however, this nebulous concept is anchored to the Earth by sprawling warehouses that crunch and store data in remote places. Their energy demands are enormous and increasing exponentially: One model predicts they will use up to 13 percent of the world’s power by 2030, compared to just 3 percent in 2010. Gains in computing efficiency have helped matters, says University of Massachusetts Amherst assistant professor of informatics and computer science Mohammad Hajiesmaili, but those improvements do little to reduce the centers’ impact on the environment.

“If the power supply is coming from fuel sources, it’s not carbon optimized,” explains Hajiesmaili. But renewable power is sporadic, given its reliance on sun and wind, and geographically constrained, since it’s only harvested in certain places. This is the puzzle Hajiesmaili is working to solve: How can data centers run on carbon-free energy 24/7?

The answer involves designing systems that organize their energy use around a zero-carbon goal. Several approaches are in the works. The simplest relies on schedulers that time computing tasks to coincide with the availability of renewable energy. But that fix can’t work on its own, given the unpredictability of bright sunlight and gusts of wind—and the fact that the cloud doesn’t sleep. Another strategy is “geographical load balancing,” which involves moving tasks from one data center to another based on local access to clean power. It, too, has drawbacks: Transferring data from one place to another still requires energy, Hajiesmaili notes, and, “if you’re not careful, this overhead might be substantial.”
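To make that trade-off concrete, here is a minimal, purely illustrative Python sketch of the geographical load balancing idea: a job is routed to whichever data center minimizes estimated emissions once the energy cost of moving the data is counted. The site names, carbon intensities, and transfer-overhead figures are hypothetical and are not drawn from Hajiesmaili’s systems.

def pick_site(job_kwh, sites, origin, transfer_kwh_per_hop):
    # Return (name, grams of CO2) for the site with the lowest estimated
    # emissions for one job. `sites` maps a site name to (grid carbon
    # intensity in gCO2/kWh, network hops from the origin). Moving the job
    # costs extra energy in proportion to distance, so a cleaner but faraway
    # grid can still lose to a dirtier local one.
    best_site, best_gco2 = None, float("inf")
    for name, (intensity, hops) in sites.items():
        transfer_energy = 0.0 if name == origin else hops * transfer_kwh_per_hop
        emissions = (job_kwh + transfer_energy) * intensity
        if emissions < best_gco2:
            best_site, best_gco2 = name, emissions
    return best_site, best_gco2

# Hypothetical snapshot of three regional grids (gCO2/kWh, hops from origin).
sites = {
    "us-east": (450, 0),   # fossil-heavy local grid, no transfer overhead
    "nordics": (60, 4),    # wind- and hydro-heavy grid, four hops away
    "us-west": (250, 2),
}
print(pick_site(job_kwh=5.0, sites=sites, origin="us-east", transfer_kwh_per_hop=0.1))

In this toy snapshot the distant, cleaner grid still wins despite the transfer overhead, but shrinking the gap in carbon intensity or raising the per-hop energy cost flips the decision—exactly the balance the overhead warning refers to.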

An ideal solution, and the focal point of much of his work these days, involves equipping data centers with batteries that store renewable energy as a reserve to tap, say, at night. “Whenever the carbon intensity of the grid is high,” he says, “you can just discharge from the battery instead of consuming local high-carbon energy sources.” Even though batteries that are big enough, or cheap enough, to fully power data centers don’t exist yet, Hajiesmaili is already developing algorithms to control when future devices will charge and discharge—using carbon optimization as their guiding principle. This “carbon-aware” battery use is just one of many ways in which Hajiesmaili thinks cloud design should be overhauled; ultimately, the entire system must shift to put carbon use front and center.
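The battery idea can likewise be sketched as a simple threshold policy: charge when the grid is clean, discharge when it is dirty. The thresholds, the 24-hour carbon-intensity trace, and the battery size below are invented for illustration; a real controller of the kind Hajiesmaili describes would optimize against forecasts rather than fixed cutoffs.

def carbon_aware_battery(intensity_trace, load_kwh, capacity_kwh,
                         charge_below=150, discharge_above=350):
    # Threshold policy: in clean hours, serve the load from the grid and top
    # up the battery; in dirty hours, cover as much of the load as possible
    # from storage. Returns total grid emissions in grams of CO2.
    stored = 0.0
    grid_gco2 = 0.0
    for intensity in intensity_trace:  # gCO2/kWh, one value per hour
        if intensity <= charge_below and stored < capacity_kwh:
            charge = min(load_kwh, capacity_kwh - stored)
            stored += charge
            grid_gco2 += (load_kwh + charge) * intensity
        elif intensity >= discharge_above and stored > 0:
            discharge = min(load_kwh, stored)
            stored -= discharge
            grid_gco2 += (load_kwh - discharge) * intensity
        else:
            grid_gco2 += load_kwh * intensity
    return grid_gco2

# Invented 24-hour carbon-intensity trace: cleaner at midday, dirtier at night.
trace = [400, 420, 430, 410, 380, 300, 200, 120, 90, 80, 85, 100,
         110, 130, 180, 260, 340, 420, 460, 480, 470, 450, 430, 410]
no_battery = sum(10 * i for i in trace)
with_battery = carbon_aware_battery(trace, load_kwh=10, capacity_kwh=40)
print(f"no battery: {no_battery:.0f} gCO2, carbon-aware battery: {with_battery:.0f} gCO2")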

Most big technology companies have pledged to become carbon-neutral—or negative, in Microsoft’s case—in the coming decades. Historically, they have pursued those goals by buying controversial offset credits, but interest in carbon-intelligent computing is mounting. Google, for one, already uses geographical load balancing and is continuing to fine-tune it with Hajiesmaili’s input, and cloud-computing company VMware has its own carbon-cutting projects in the works. In his view, though, the emerging field of computational decarbonization has applications far beyond the internet. All aspects of society—agriculture, transportation, housing—could someday optimize their energy use through the same approach. “It’s just the beginning,” he says. “It’s going to be huge.”—Yasmin Tayag

Publication: Sohaib Ahmad et al., Learning from Optimal: Energy Procurement Strategies for Data Centers, ACM Digital Library (2023). DOI: 10.1145/3307772.3328308

Original Story Source: University of Arizona
