What Meta's expensive hunt for AI gold means for telcos

Meta's spending splurge on AI, and the likelihood hyperscaler results will show the same, should alarm the world's telcos.

Iain Morris, International Editor

April 25, 2024

Mark Zuckerberg's investments in the metaverse have yet to pay off. (Source: Meta)

Artificial intelligence (AI) is seemingly on the rampage in Mark Zuckerberg's pockets. Last year, Meta, the company formerly known as Facebook, threw about $28 billion at capital expenditure. This year, Zuckerberg's creation plans to spend between $35 billion and $40 billion. That's up from a previous forecast of between $30 billion and $37 billion, and it looks like AI is to blame. Meta, in short, is spending colossal sums on the infrastructure to support AI and the products that go with it. Next year, it expects to invest an even bigger amount – although it's not saying how much.

The investor response to this was to send Meta's share price down about 15% in pre-market trading on April 25. At close of business the day before, it remained, nonetheless, up 43% since the start of the year. Most of the metrics for the recently ended first quarter would be envied by just about every other company bar Nvidia.

Despite Facebook's image as a fusty old website for Generation Xers posting holiday snaps, and the view of Zuckerberg as a devious control freak, business is booming. First-quarter sales rose 27%, to $36.5 billion, compared with the year-earlier quarter. Net income rocketed 117%, to about $12.4 billion. The measure of daily active people on Meta properties was up 7% year-over-year in March, to 3.24 billion. More people appear to have stopped worrying about data privacy – if they ever did. Other social media sites, Elon Musk's X included, seem barely relevant.

But the apparent costs of AI are scaring investors, and there is a lack of observable AI revenues outside Nvidia, a company dubbed "monopolistic" by some telecom operators because it so clearly dominates sales of the graphics processing units (GPUs) used in AI. It's not just capex that is on the increase at Meta, either. Last year, its overall operating costs amounted to about $88 billion. Under previous guidance it had expected to spend between $94 billion and $99 billion this year. The range has now been tightened up to between $96 billion and $99 billion.

Reality Labs gets unreal

Zuckerberg, naturally, did the usual thing of insisting one has to spend money to make money. "As we're scaling capex and energy expenses for AI, we'll continue focusing on operating the rest of our company efficiently," he said. "But realistically, even with shifting many of our existing resources to focus on AI, we'll still grow our investment envelope meaningfully before we make much revenue from some of these new products."

Meta, though, has form when it comes to not being able to monetize investments. The big one is WhatsApp, the free-to-use messaging service that Facebook acquired for $19 billion in 2014. More recently, it has failed to get people spending on the metaverse, its vision of an extended reality universe and the rationale for its company name change from Facebook to Meta in 2021. Reality Labs, the part of Meta charged with metaverse development, has spent $58.5 billion since 2020 and yet made just $8 billion in revenues over the past four years, according to Rohit Kulkarni, an analyst with Roth Capital Partners.

To another prominent analyst, the latest projections are confusing and suggest Meta has gone back to its old profligate ways. Substantial expenses are being recorded at Reality Labs, with Zuckerberg attributing this to the unit's bigger role in AI activities. Richard Windsor, an analyst with Radio Free Mobile, said this "doesn't quite make sense" in his latest blog.

If resources were focused on AI, they should really have been moved out of Reality Labs, he notes. "Hence, on the face of it, it looks like a badly disguised return to the old bad habit of profligate spending under the guise of AI in the hope that the market will not care because it is for AI." The drop in Meta's share price proves the market "wasn't fooled," he said.

He is also concerned because Llama 3, Meta's latest AI large language model (LLM), is no bigger in terms of parameters than Llama 2, its predecessor, and should therefore demand less investment in cloud-computing resources than its rivals do. "Instead, Meta should be focusing on using its smaller model to run at a far lower cost than its rivals and encouraging developers to use its model as their services will be much cheaper to run," said Windsor.

To Kulkarni, Meta's splurge is a possible sign AI is "getting too expensive already." In a research note issued today, he said the "uptick hints at likely similar announcements coming" from Amazon, Google and Microsoft. "All in, we interpret these investments as Big 'Nets creating deeper AI moats for 2025 and beyond."

Open sores

This should make telcos, among others involved with Meta and the hyperscalers, somewhat uncomfortable. "Moat" is a fashionable term in the technology market for a competitive advantage that can be sustained, but companies often build these moats by making it hard for customers to escape their products. In the case of Nvidia, the moat is arguably not the chips themselves but CUDA, the proprietary software platform that goes with them.

With Meta and the "Big 'Nets," a moat could involve making telcos (among others) dependent on specific AI platforms and other technologies. Some big telcos are conscious of the danger. "If you want to use multiple LLMs and they were too tightly coupled to the hyperscaler that provides the LLM, you would have to replicate the data for every hyperscaler you wanted to use," said Scott Petty, Vodafone's chief technology officer, at a recent press briefing. "We have to have a model where you choose to put the data where you best want to put it and have openness in the way that LLMs work."

Meta, of course, is not a hyperscaler selling public cloud services to operators, and some telcos have been drawn to Llama because of its open source credentials, championed by Zuckerberg on his earnings call with analysts this week. But these look bogus to many observers. Last year, the Open Source Initiative (OSI), a Californian non-profit that describes itself as "the steward of the open source definition," slammed Meta for alleged deception.

"The license for the Llama LLM is very plainly not an 'open source' license," wrote Stefano Maffulli, the OSI's executive director, on the organization's website. One disqualifying clause in the Llama 2 community license agreement reads: "If, on the Llama 2 version release date, the monthly active users of the products or services made available by or for Licensee, or Licensee's affiliates, is greater than 700 million monthly active users in the preceding calendar month, you must request a license from Meta, which Meta may grant to you in its sole discretion, and you are not authorized to exercise any of the rights under this Agreement unless or until Meta otherwise expressly grants you such rights."

No telco outside China serves more than 700 million customers, but this clause would obviously restrict Amazon, Google and Microsoft, along with a few other tech platforms. The training of Llama, meanwhile, has happened inside Meta's own data centers at considerable expense. Whatever it says about open source, Meta seems unlikely to have made that investment unless it planned to be the main beneficiary.

Of equal concern to any telco should be AI's impact on energy use. Consumption by major European telcos analysed in this piece has been relatively flat in recent years. The same cannot be said of Meta, whose energy consumption rose from 7,521 gigawatt hours in 2020 to 11,822 in 2022. Its carbon dioxide emissions, including the Scope 3 category blamed on suppliers and customers, grew from about 8.6 million metric tons in 2020 to more than 14 million in 2022.

Figures are not yet available for 2023, but no one expects a decrease. Rene Haas, the CEO of UK chip designer Arm, reportedly expects AI data centers to consume up to a quarter of US power by 2030, compared with just 4% today. For telcos agog about AI while professing their environmental friendliness, that is what a former US presidential candidate would have called inconvenient.


About the Author(s)

Iain Morris

International Editor, Light Reading

Iain Morris joined Light Reading as News Editor at the start of 2015 -- and we mean, right at the start. His friends and family were still singing Auld Lang Syne as Iain started sourcing New Year's Eve UK mobile network congestion statistics. Prior to boosting Light Reading's UK-based editorial team numbers (he is based in London, south of the river), Iain was a successful freelance writer and editor who had been covering the telecoms sector for the past 15 years. His work has appeared in publications including The Economist (classy!) and The Observer, besides a variety of trade and business journals. He was previously the lead telecoms analyst for the Economist Intelligence Unit, and before that worked as a features editor at Telecommunications magazine. Iain started out in telecoms as an editor at consulting and market-research company Analysys (now Analysys Mason).
