Telcos risk complicity in AI's energy crimes

Operators hope AI will make networks more energy efficient, but consumption by AI data centers could massively outweigh those benefits.

Iain Morris, International Editor

May 10, 2024

5 Min Read
(Image: Microsoft data center)
The forecasts for energy consumption by data centers are scary. (Source: Microsoft)

On the surface, it looks as green as photosynthesis: An AI-managed, self-driving network that automatically powers down basestations when only bats and badgers are awake, ticking all the right net-zero boxes. Yet behind this power-efficient, climate-friendly wonder is a dirty great AI machine whose cogs have a wolfish appetite for energy that renewables may struggle to sate.

It is like the promotion of open, virtualized radio access network technology as a competitive spur when it meant relying solely on Intel for chips (fortunately, that's started to change). Worse, and more pertinently hypocritical, it is like Norway, whose green image and domestic religiosity about renewables end in the gas and oil fields of the North Sea, where it drills and exports with the gusto of any Texan energy tycoon.

There is now a real prospect that energy consumption by AI data centers will outweigh any network energy savings derived from AI. Measuring this would be tough because those AI data centers will probably underpin dozens if not hundreds of other AI applications. But the standalone numbers and forecasts are scary enough. Among them is one reportedly shared by Rene Haas, the CEO of chip designer Arm, who thinks AI data centers may consume a quarter of all US power by 2030, up from 2% today.

Numerous telcos have raced into partnerships with AWS, Google Cloud and Microsoft Azure, the Big Tech giants whose data centers are to blame for most of that energy use. Microsoft, which has yet to publish an update for 2023, guzzled 18,645 gigawatt-hours across the group in 2022, about one-and-a-half times as much as Deutsche Telekom did across its European and US operations. Annual energy use was relatively unchanged at the German telco between 2020 and 2022. At Microsoft, it rose two-thirds.
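Those ratios can be sanity-checked with some back-of-envelope arithmetic. This is a sketch based only on the figures quoted above; the Deutsche Telekom and Microsoft 2020 totals are inferred from the stated ratios, not taken from company reports.

```python
# Back-of-envelope check of the 2022 consumption figures cited above.
MICROSOFT_2022_GWH = 18_645  # Microsoft group energy use in 2022

# The article says this is about one-and-a-half times Deutsche Telekom's
# consumption across its European and US operations, implying roughly:
dt_2022_gwh = MICROSOFT_2022_GWH / 1.5

# "At Microsoft, it rose two-thirds" between 2020 and 2022:
ms_2020_gwh = MICROSOFT_2022_GWH / (1 + 2 / 3)
two_year_increase_gwh = MICROSOFT_2022_GWH - ms_2020_gwh

print(f"Implied Deutsche Telekom 2022 use: ~{dt_2022_gwh:,.0f} GWh")
print(f"Implied Microsoft 2020 use:        ~{ms_2020_gwh:,.0f} GWh")
print(f"Microsoft's two-year increase:     ~{two_year_increase_gwh:,.0f} GWh")
```

On these assumptions, Microsoft's two-year increase alone (roughly 7,500 GWh) is more than half of Deutsche Telekom's entire annual consumption.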

The power bottleneck

Liberty Global, which has an AI partnership with Microsoft, is one telco drawn to AI as a potential energy-saving tool in the network. Enrique Rodriguez, its chief technology officer, outlined the potential opportunities at a briefing with journalists this week. "These are things like using AI to save power on mobile towers," he said. "These are hard-core engineering problems where AI can provide a significant improvement."

Like Microsoft, Liberty Global has not yet published energy data for 2023. However, its consumption rose 4% in 2022, to about 474 gigawatt-hours, and it complained about higher energy costs at its UK Virgin Media O2 subsidiary in its just-published annual report for last year. Nevertheless, Rodriguez acknowledged that the impact of AI on data traffic and energy use is felt not in networks but in data centers.

He has firsthand experience with this, too, because Liberty Global remains a 50% owner of AtlasEdge, a European data center specialist. "We are seeing the requirements for data centers increase dramatically and we are seeing probably the first signs that the actual bottleneck is more on power infrastructure," Rodriguez said. By contrast, AI is so far having no discernible effect on levels of network traffic.

The hope seems to be that AI will be less voracious when the balance tilts from training large language models (LLMs) in hyperscaler facilities to the usage of AI applications – what some refer to as the "inference" model. "The LLMs are very power hungry mainly on the development side," said Rodriguez. "I would expect the balance to change in the next few years where less and less of the pie goes into the generation side and a lot more of the pie goes into the usage side, and I think that will be more manageable."

His other observation is that no other initiative compares favorably with AI's energy-saving promise. "We did an early test on our mobile networks and within a week we were saving about 10% of power and we have no other initiative that comes even close," he said.

Are these forecasts about AI data centers by the likes of Arm's CEO – a man who presumably knows what he is talking about – very wide of the mark, then? Data suggests not. The energy bill for training GPT-4, the well-known LLM from Microsoft-backed OpenAI, was the sort faced by countries, according to analysis carried out by Harvey Lewis, a partner at EY. He estimates it took about 25,000 Nvidia A100 graphics processing units running for about 100 days, which would have consumed enough energy to power 30 million UK homes for around 18 months.

The coming AGI frenzy

Lewis, though, shares Rodriguez's optimism about inference being far less greedy. He also thinks algorithms are becoming more efficient. "Microsoft last week released Phi-3, which is an open-source model, and it is not a GPT-4 class model, but it is not too far from it and its footprint is massively reduced," he said. "One reason is if you curate the data when you train that model and only train it on data it needs to fulfill the task, you get massive energy gains."

Others will remain skeptical. Even if Liberty Global cut its entire energy consumption by 10%, it would save only about 47 gigawatt-hours annually, based on data for 2022. That equals just 1% of the increase in energy consumption by Microsoft that year. Implicit in the talk about inference is an assumption that LLM training will stop. Yet AI's overlords are now eyeing what they call artificial general intelligence (AGI). OpenAI founder Sam Altman is reportedly unconcerned about the expense.
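The scale mismatch in that paragraph can be worked through with the 2022 figures cited earlier. This is a sketch: the even split of Microsoft's 2020-2022 growth across the two years is an assumption for illustration, not a reported figure.

```python
# Liberty Global's hypothetical saving vs Microsoft's growth, per 2022 figures.
liberty_2022_gwh = 474                 # Liberty Global consumption, 2022
saving_gwh = liberty_2022_gwh * 0.10   # a 10% cut, roughly 47 GWh

# Microsoft's use rose two-thirds from 2020 to 2022; assume the growth was
# spread evenly across the two years (an assumption, not a reported figure).
ms_2022_gwh = 18_645
ms_2020_gwh = ms_2022_gwh / (1 + 2 / 3)
yearly_increase_gwh = (ms_2022_gwh - ms_2020_gwh) / 2

print(f"Hypothetical saving: ~{saving_gwh:.0f} GWh")
print(f"Share of Microsoft's yearly increase: "
      f"{saving_gwh / yearly_increase_gwh:.0%}")
```

Under those assumptions the saving comes out at roughly 1% of Microsoft's annual increase, matching the comparison in the text.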

"Whether we burn $500 million a year or $5 billion – or $50 billion a year – I don't care, I genuinely don't," he said at Stanford University this month. Of course, telcos could always spurn use of whatever AGI turns out to be – and the future LLMs that precede it – citing energy concerns. But does anyone think that's realistic?


About the Author(s)

Iain Morris

International Editor, Light Reading

Iain Morris joined Light Reading as News Editor at the start of 2015 -- and we mean, right at the start. His friends and family were still singing Auld Lang Syne as Iain started sourcing New Year's Eve UK mobile network congestion statistics. Prior to boosting Light Reading's UK-based editorial team numbers (he is based in London, south of the river), Iain was a successful freelance writer and editor who had been covering the telecoms sector for the past 15 years. His work has appeared in publications including The Economist (classy!) and The Observer, besides a variety of trade and business journals. He was previously the lead telecoms analyst for the Economist Intelligence Unit, and before that worked as a features editor at Telecommunications magazine. Iain started out in telecoms as an editor at consulting and market-research company Analysys (now Analysys Mason).

