Artificial Intelligence Uses Extreme Amount of Electricity

The amount of energy data centers consume is expected to increase 10 times over the next several years.

October 25, 2023


By Greg Zimmerman, senior contributing editor


The boom in artificial intelligence (AI) has its pros and cons. While AI can certainly be a benefit for facility managers, helping them make data-driven decisions and generate creative solutions to difficult problems (among several other benefits detailed in this FacilitiesNet article), there is one notable drawback: the computing power required to run AI uses a LOT of electricity.

According to Scientific American, data centers currently use about 1 to 1.5 percent of all electricity consumed globally. But that share could increase dramatically in the coming years, rising as much as tenfold to up to 21 percent, according to AI and data center expert Alex de Vries. His analysis shows that if the expected 1.5 million NVIDIA AI servers ship over the next four years, those machines alone could consume 85.4 terawatt-hours of electricity. That's more than the total energy consumption of some small countries, he says.
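
The 85.4 terawatt-hour figure can be roughly reproduced with back-of-the-envelope arithmetic. The sketch below is an illustration only: the 6.5-kilowatt per-server power draw and the round-the-clock, full-load operation are assumptions made for the sake of the estimate, not figures stated in this article.

```python
# Rough back-of-the-envelope estimate of annual electricity use for
# 1.5 million AI servers. The 6.5 kW per-server draw and 24/7 full-load
# operation are illustrative assumptions, not figures from the article.
servers = 1_500_000           # projected NVIDIA AI server shipments
power_per_server_kw = 6.5     # assumed average draw per server (kW)
hours_per_year = 8_760        # 24 hours x 365 days

total_power_gw = servers * power_per_server_kw / 1e6          # kW -> GW
annual_energy_twh = total_power_gw * hours_per_year / 1_000   # GWh -> TWh

print(f"Total power draw: {total_power_gw:.2f} GW")       # ~9.75 GW
print(f"Annual energy use: {annual_energy_twh:.1f} TWh")   # ~85.4 TWh
```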

Here’s more data: A McKinsey report, as covered in Business Insider, shows that data centers in the US currently demand about 17 gigawatts of power, and that figure is expected to more than double to 35 gigawatts by 2030.
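
For comparison with the terawatt-hour figure above, a gigawatt is a rate of power rather than an annual total. A hedged conversion, assuming the data centers draw that power continuously (an illustrative assumption), looks like this:

```python
# Convert continuous power demand (GW) to annual energy use (TWh),
# assuming round-the-clock operation (an illustrative assumption).
hours_per_year = 8_760

for demand_gw in (17, 35):
    annual_twh = demand_gw * hours_per_year / 1_000   # GWh -> TWh
    print(f"{demand_gw} GW continuous is roughly {annual_twh:.0f} TWh per year")
# 17 GW is roughly 149 TWh per year; 35 GW is roughly 307 TWh per year
```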

Researchers are working on more efficient computing methods to run AI. The Lincoln Laboratory at the Massachusetts Institute of Technology is one of the first to spearhead efforts to study and design energy-efficient supercomputers. But for now, data center experts, computer scientists, and environmentalists are sounding the alarm about the heavy toll that AI's growing electricity demand will take.

Greg Zimmerman is senior contributing editor for FacilitiesNet.com and Building Operating Management magazine. 
