AI drives sustainability despite concerns over data center energy consumption
A common critique of the rise of artificial intelligence is the amount of energy required to power the technology, but solutions might lie within the tech itself.
The International Energy Agency expects data centers to double their electricity use by 2026 compared with 2022, when they accounted for 2% of the world’s electricity. However, research finds that the adoption of AI is creating energy efficiencies in other industries, which might more than offset power-hungry AI.
“Even if the predictions that data centers will soon account for 4% of global energy consumption become a reality, AI is having a major impact on reducing the remaining 96% of energy consumption,” according to research by The Lisbon Council.
Also, accelerated computing, which uses the parallel processing of Nvidia (NASDAQ:NVDA) GPUs, allows much more work to be done in less time. That results in lower energy consumption than relying on CPUs, which are built to tackle tasks sequentially.
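The energy argument here comes down to energy = power x time: an accelerator can draw more power than a CPU yet still use less total energy if it finishes the job much faster. A minimal sketch, using purely hypothetical power and runtime figures (none are from the article):

```python
# Toy illustration of the accelerated-computing energy argument.
# The wattages and runtimes below are hypothetical, chosen only to
# show how a faster, higher-power device can use less total energy.

def job_energy_kwh(power_watts: float, hours: float) -> float:
    """Total energy for a job: power (W) x time (h), converted to kWh."""
    return power_watts * hours / 1000

# Same hypothetical workload on two kinds of hardware.
cpu_energy = job_energy_kwh(power_watts=300, hours=100)  # slower, lower power
gpu_energy = job_energy_kwh(power_watts=700, hours=10)   # faster, higher power

print(f"CPU-only: {cpu_energy} kWh, GPU-accelerated: {gpu_energy} kWh")
# 300 W for 100 h = 30 kWh vs. 700 W for 10 h = 7 kWh
```

The point is that wall-clock speedup, not peak power draw, dominates the energy bill for a fixed amount of work.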
“While energy consumption for training large language models has increased substantially, hardware and software innovation such as accelerated computing is progressing so fast that the overall energy consumption is growing much less quickly than computing requirements and performances,” the study finds.
“By transitioning from CPU-only operations to GPU-accelerated systems, HPC and AI workloads can save over 40 terawatt-hours of energy annually, equivalent to the electricity needs of nearly 5 million U.S. homes,” according to a blog post by Nvidia.
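The quoted equivalence is easy to sanity-check: dividing 40 TWh by 5 million homes gives the per-home annual usage the comparison implies, which lands in the same ballpark as typical U.S. residential consumption (EIA figures put the average near 10,000 kWh a year):

```python
# Back-of-the-envelope check on Nvidia's claim that 40 TWh of annual
# savings equals the electricity needs of nearly 5 million U.S. homes.

TWH_TO_KWH = 1e9               # 1 TWh = 1 billion kWh
savings_kwh = 40 * TWH_TO_KWH  # 40 TWh expressed in kWh
homes = 5_000_000

implied_kwh_per_home = savings_kwh / homes
print(f"{implied_kwh_per_home:,.0f} kWh per home per year")  # 8,000
```

An implied 8,000 kWh per home per year is somewhat below the U.S. average, which is consistent with the hedge "nearly 5 million" in the original claim.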
Another study, by the Center for Data Innovation, found that some of the rhetoric about AI's excessive energy use is unfounded. The study cites a Forbes article from the 1990s that predicted the internet would consume half of the electric grid's output within a decade.
“With the recent surge in interest in artificial intelligence, people are once again raising questions about the energy use of an emerging technology,” the study reads. “However, as with past technologies, many of the early claims about the consumption of energy by AI have proven to be inflated and misleading.”
Indeed, many of the big tech companies behind the rise of AI are pursuing carbon-neutral goals. The list includes tech giants such as Apple (NASDAQ:AAPL), Microsoft (NASDAQ:MSFT), IBM (NYSE:IBM), Dell Technologies (NYSE:DELL), Google (NASDAQ:GOOG)(NASDAQ:GOOGL), Meta (NASDAQ:META) and Intel (NASDAQ:INTC).
Wind and solar accounted for 14.1% of U.S. electricity generation last year, according to the U.S. Energy Information Administration. That is not enough to help tech companies with high energy usage achieve carbon-neutral goals, prompting some to look to nuclear power.
Amazon Web Services (NASDAQ:AMZN) earlier this year bought a data center powered by Talen Energy's nuclear power plant in Salem Township, Pa. NextEra Energy (NYSE:NEE) is also considering restarting a retired nuclear power plant in Iowa, in part due to rising power demand from AI data centers.
And of course, tech companies themselves are finding ways to increase the efficiency of their own products. Nvidia said its GPUs have become 45,000 times more efficient at running large language models over the past eight years.
“If the efficiency of cars improved as much as NVIDIA has advanced the efficiency of AI on its accelerated computing platform, cars would get 280,000 miles per gallon,” Nvidia said.
Super Micro Computer (NASDAQ:SMCI) is also advancing liquid-cooled AI superclusters with GPU producers such as Nvidia, AMD (NASDAQ:AMD) and Intel. These have the potential to reduce data center energy bills by as much as 40%.