One of the questions behind the rush to implement generative artificial intelligence (AI) tools is how to manage their massive energy footprint.
That footprint is expected to double over the next decade, and a recent panel discussion heard how some energy companies are addressing the issue.
“There is a huge amount of power growth coming in the next decade,” Colin Guldimann, director of sustainable finance at RBC, said during the AI vs. the grid panel at the MaRS Climate Impact summit, held Dec. 3-4 in Toronto.
The impact of electrical demand cannot be ignored, one panellist said.
“When we talk about data centres, the whole story is really around electricity. A lot of electricity goes in, what comes out is the internet and computing and heat, so it really is a story about electricity. Therefore data centres don’t talk about square footage of their spaces or the number of servers in their spaces, they talk about the number of megawatts of their space,” Kathleen Kauth, COO of climate consultancy Mantle Developments, said.
Kauth cited a recent report by Infrastructure Masons, a group of digital infrastructure professionals.
“They came out with their 2024 state of the industry report a few months ago, and they said we’re going to need 38 gigawatts of new AI infrastructure globally by 2028,” she said. “To put that in perspective, the theoretical gigawatt capacity of Ontario is roughly 38: we don’t use that much, but it’s theoretically what we have available. So, over the next three years, the global data centre industry is going to build the capacity of Ontario globally.”
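As a rough sanity check on Kauth’s comparison, the figures below come straight from her quote; the arithmetic itself is a back-of-envelope illustration, not from the report:

```python
# Back-of-envelope check on the scale cited in the Infrastructure Masons report.
# Both figures come from the quote above; the arithmetic is illustrative only.

new_ai_capacity_gw = 38    # projected new AI infrastructure globally by 2028
ontario_capacity_gw = 38   # Ontario's theoretical generating capacity, per Kauth

# The planned global build-out roughly equals one Ontario's worth of capacity
ratio = new_ai_capacity_gw / ontario_capacity_gw
print(f"Planned AI build-out ~ {ratio:.1f}x Ontario's theoretical capacity")

# Spread over the roughly three years to 2028, that works out to:
per_year_gw = new_ai_capacity_gw / 3
print(f"~ {per_year_gw:.1f} GW of new capacity per year")
```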
Satisfying the thirst for power
In the U.S., this thirst for computing power might become even more dramatic, according to Kauth.
“Lawrence Berkeley National Laboratory, in a report, is saying we expect that to triple again by 2030, reaching upwards of 10 per cent of all electrical consumption in the U.S.”
A number of power solutions, both new and old, have been proposed; Microsoft, for example, has discussed re-opening a nuclear reactor at Three Mile Island to power its own AI needs.
Google needs help with its efforts around AI as well.
“Google’s big and wealthy, but they’re not big and wealthy enough to build their own nuclear power plants,” Kauth said.
Harnessing the power of waste heat
Generating more power is one answer to growing energy needs, but waste heat can also be harnessed, according to another panellist.
Grid electricity is delivered as AC, or alternating current, but the servers inside data centres run on DC, or direct current.
“That conversion process is part of the heat that’s generated in the data centre. It’s also part of the efficiency loss,” Bolis Ibrahim, president of Cence Power, which provides DC power to data centres, said. “The Lawrence Berkeley National Laboratory, they ran a big multi-year study, and they found that if the data centres were distributing direct current power within the data centre, that they could increase the efficiency by up to 10 per cent in some cases . . .
“Really, the technology is distributing direct current power directly to the servers.”
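To see where a gain of that size could come from, the sketch below multiplies out the conversion stages in a conventional AC power chain versus a DC one. The per-stage efficiency figures are hypothetical round numbers chosen for illustration; only the “up to 10 per cent” headline gain comes from the LBNL study Ibrahim cites:

```python
# Illustrative sketch of why DC distribution can cut conversion losses.
# The per-stage efficiencies below are hypothetical placeholders, not
# measurements; only the ~10% headline gain is from the cited LBNL study.

def delivered_fraction(stage_efficiencies):
    """Multiply per-stage efficiencies to get end-to-end delivered power."""
    out = 1.0
    for eff in stage_efficiencies:
        out *= eff
    return out

# Hypothetical AC chain: UPS (AC->DC->AC), distribution, server PSU (AC->DC)
ac_chain = delivered_fraction([0.95, 0.98, 0.93])

# Hypothetical DC chain: one rectification stage, then DC straight to servers
dc_chain = delivered_fraction([0.97, 0.985])

relative_gain = (dc_chain - ac_chain) / ac_chain
print(f"AC chain delivers {ac_chain:.1%}, DC chain {dc_chain:.1%} "
      f"(~{relative_gain:.0%} relative gain)")
```

With these placeholder numbers the DC chain comes out roughly 10 per cent ahead, consistent with the range Ibrahim describes; real gains depend on the specific equipment in each facility.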
Due to the way they operate, data centres are part of the problem, according to an energy CEO.
“Data centres are very, very inefficient. Producing power is very inefficient in itself. So just to give you some numbers, only 20 per cent of the energy that goes into a data centre is actually used for computing, and (a) proportion of that is lost as waste heat. But it’s low-grade waste heat,” Ibraheem Khan, CEO of Extract Energy, said. His company converts waste heat back into energy.
“That’s really where we’re seeing a lot of interest, which is that low-grade waste heat that these inefficient systems produce, really are just kind of wasted and if we can provide our solution, we can increase the efficiency pretty significantly,” Khan added.
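Khan’s 20 per cent figure implies a large pool of low-grade heat. The sketch below works through that arithmetic for a hypothetical facility; the facility size and the heat-to-power recovery efficiency are assumptions for illustration, not figures from the panel:

```python
# Back-of-envelope on Khan's figure: if only ~20% of a data centre's input
# energy does computing, most of the rest ends up as low-grade waste heat.
# The facility size and recovery efficiency are hypothetical placeholders.

input_power_mw = 100       # hypothetical facility draw
compute_fraction = 0.20    # per Khan: ~20% actually used for computing

waste_heat_mw = input_power_mw * (1 - compute_fraction)

# Low-grade heat-to-power conversion is inefficient; assume 10% to illustrate
recovery_efficiency = 0.10
recovered_mw = waste_heat_mw * recovery_efficiency

print(f"Waste heat: {waste_heat_mw:.0f} MW; "
      f"recoverable at 10% efficiency: {recovered_mw:.0f} MW")
```

Even at a modest assumed recovery efficiency, a facility of this hypothetical size would yield several megawatts back, which is the opportunity Khan’s company is targeting.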
While the technology industry seems to be waking up to the enormous amount of power future growth will demand, groups such as the Open Compute Project are developing “blueprints” to manage the process, another panellist said.
“I think at the end of the day, we can consider that having the magic secret recipe could be a very competitive advantage, but energy efficiency is also related to cost, and so we all want to reduce our environmental impact,” Germain Masse, global AI product marketing manager at cloud provider OVHcloud, said.
“We are doing a lot of work on improving the efficiency of data centres, but at some point, it will not be enough. We definitely also need to question the usage of AI and the usage we have, because resources are constrained,” he said.
Tradeoff between technology and energy
For some, the technology that companies are relying upon may become part of the solution.
“AI was here before generative AI, and it was already solving issues we can have in developing the best, efficient environment, but generative AI will have impact and will increase efficiency in many things, and in climate as well, in research and so on,” Masse said.
The technology can also help companies make better raw materials that don’t lose as much energy, according to Khan.
“AI can play a very important role in determining various compositions that can potentially support that, or processes to make materials that can allow that.”