
Almost all leading AI developers are focused on building AI models that mimic the way humans think, but new research shows that these sophisticated systems can be far more energy-hungry than their conventional counterparts, raising concerns about AI's pressure on power grids.
Reasoning AI models used 30 times more energy on average to respond to 1,000 written prompts than counterparts that either lacked reasoning ability or had it disabled, according to a study released Thursday. The work was carried out through the AI Energy Score project, led by Hugging Face research scientist Sasha Luccioni and Salesforce Inc. Head of AI Sustainability Boris Gamazaychikov.
The researchers evaluated 40 open, freely available AI models, including software from OpenAI, Alphabet Inc.'s Google, and Microsoft. Some models showed far greater swings in power consumption than others, including one from Chinese startup DeepSeek. A distilled version of DeepSeek's R1 used just 50 watt-hours to respond to the prompts with reasoning turned off, roughly the energy needed to power a 50-watt light bulb for an hour. With reasoning enabled, the same model required 7,626 watt-hours to complete the tasks.
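To put those DeepSeek figures in perspective, the gap works out as follows. The two watt-hour values come from the study; the ratio and the extended light-bulb conversion are our own back-of-the-envelope arithmetic:

```python
# Back-of-the-envelope comparison of the DeepSeek R1 figures reported
# in the study. Only the two watt-hour values come from the study;
# the conversions below are simple arithmetic.
reasoning_off_wh = 50      # watt-hours with reasoning disabled
reasoning_on_wh = 7_626    # watt-hours with reasoning enabled

ratio = reasoning_on_wh / reasoning_off_wh
bulb_hours = reasoning_on_wh / 50   # hours a 50-watt bulb could run

print(f"Reasoning used about {ratio:.0f}x more energy")           # ~153x
print(f"Enough to run a 50 W bulb for about {bulb_hours:.0f} h")  # ~153 hours
```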
The growing power needs of artificial intelligence are coming under increasing scrutiny. As technology companies race to build more and larger data centers to support AI, industry observers have raised concerns about the strain on electricity grids and rising energy costs for consumers. A Bloomberg investigation published in September found that wholesale electricity prices had risen as much as 267% over the past five years in areas near data centers. There are also environmental downsides: Microsoft, Google, and Amazon.com Inc. have previously acknowledged that the data center build-out is complicating their long-term climate goals.
More than a year ago, OpenAI released the first reasoning model, called o1. While its previous software responded to queries almost instantly, o1 spent more time computing an answer before responding. Since then, several other AI companies have released similar systems, aiming to solve more complex, multi-step problems in areas such as science, mathematics, and programming.
Although reasoning systems are quickly becoming an industry standard for carrying out more complex tasks, there has been little research into their energy requirements. The researchers said much of the increase in energy consumption comes from reasoning models generating more text when they respond.
Luccioni said the new report aims to better understand how artificial intelligence's energy needs will evolve. She also hopes it will help people understand that different types of AI models suit different tasks, and that not every query requires the most computationally intensive reasoning systems.
“We have to be smarter about the way we use artificial intelligence,” Luccioni said. “Choosing the right model for the right task is important.”
To test the difference in energy use, the researchers ran all the models on the same computers and fed each the same prompts, ranging from simple questions, like asking which team won the Super Bowl in a given year, to more complex math problems. They also used a software tool called CodeCarbon to track how much energy was consumed in real time.
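CodeCarbon is an open-source Python library, so a setup in the spirit of the study's methodology can be sketched in a few lines. This is an illustrative sketch, not the researchers' actual test harness; the model and prompt are stand-ins:

```python
# Minimal sketch of measuring a model call's footprint with CodeCarbon.
# The model and prompt below are placeholders, not the study's setup.
from codecarbon import EmissionsTracker
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")  # stand-in model

tracker = EmissionsTracker(project_name="reasoning-energy-test")
tracker.start()
try:
    output = generator("Which team won the Super Bowl in 2014?",
                       max_new_tokens=64)
finally:
    emissions = tracker.stop()  # returns estimated kg of CO2-equivalent

print(output[0]["generated_text"])
print(f"Estimated emissions: {emissions:.6f} kg CO2eq")
# Energy consumed (in kWh) is also logged to CodeCarbon's emissions.csv.
```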
The results varied widely. The researchers found that one of Microsoft’s Phi 4 reasoning models used 9,462 watt-hours with reasoning turned on, compared with about 18 watt-hours with reasoning off. Meanwhile, OpenAI’s larger gpt-oss model showed a less pronounced difference: it used 8,504 watt-hours at the “high” setting, considered the most computationally intensive, and 5,313 watt-hours with the setting turned down to “low.”
OpenAI, Microsoft, Google, and DeepSeek did not immediately respond to a request for comment.
Internal Google research released in August estimated that the average text prompt sent to the Gemini AI service uses 0.24 watt-hours of energy, roughly equivalent to watching TV for less than nine seconds. Google said this figure is “significantly lower than many public estimates.”
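Google's TV comparison checks out under a plausible assumption about the set's power draw. The 0.24 watt-hour figure is Google's; the television wattage below is our assumption, since Google didn't specify one:

```python
# Sanity check of Google's TV-time comparison. The 0.24 Wh figure is
# Google's estimate; the 100 W television power draw is our assumption.
prompt_energy_wh = 0.24   # Google's per-prompt estimate for Gemini
tv_power_w = 100          # assumed power draw of a television

seconds_of_tv = prompt_energy_wh / tv_power_w * 3600
print(f"About {seconds_of_tv:.1f} seconds of TV")  # ~8.6 s, i.e. under nine
```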
Much of the discussion about AI power consumption has focused on the large-scale facilities built to train AI systems. However, technology companies are increasingly diverting more resources to inference, the process of running AI systems after they have been trained. The push toward reasoning models is a big part of that shift, because these systems lean more heavily on inference-time computation.
Recently, some technology leaders have acknowledged that AI’s power use must be taken into account. In a November interview, Microsoft CEO Satya Nadella said the industry should earn “social permission to consume energy” for AI data centers. To do that, he said, the technology must be used to do good and to promote broad economic growth.