The 15th edition of Huawei’s Mobile Broadband Forum (MBBF) took place in Istanbul late last month.

I’ve attended the majority of these events, making the annual trip to understand what the telecoms powerhouse is up to and what it thinks will drive the market forward, and to learn from the industry stakeholders it convenes.

This year, those learnings were bolstered by an opportunity to moderate the event’s 5.5G x AI Summit.

With 5G-Advanced arriving this year and AI dominating almost every industry conversation, the summit’s focus on the intersection of AI innovation and 5G evolution aligned well with the research we are doing at GSMA Intelligence.

But what did we learn?

AI progress and implications
There is almost no need to recount the incredible progress with AI development and innovation over the past few years. For most telcos, AI has long been a tool used in network planning and operations. The commercialisation of OpenAI’s ChatGPT almost two years ago, however, kicked off a wave of AI democratisation, leading to competing product developments and a near-constant flow of new use cases for consumer and enterprise customers.

If you’ve attended any industry event over the past year, you know just how much AI has dominated the messaging (and thinking) of suppliers, operators and all stakeholders across the mobile ecosystem.

The impact of AI on business, societies and everyday life is still a work in progress. Bringing together analysts, enterprise AI consumers and a leading telecom network supplier in Huawei, the 5.5G x AI Summit focused instead on how AI will likely impact networks. Two key implications were suggested.

More data. Dario Talmesio, research director for service provider strategy and regulation with Omdia, began the discussion of AI’s impact on networks with a presentation recapping the company’s views of how AI will drive network traffic.

Building on expected growth in generative and core AI applications, Omdia sees network traffic growing solidly in the near term.

Perhaps more importantly, the expectation is that much of the traffic across mobile networks will come from new AI applications or those enhanced by the technology, though new device and silicon capabilities that put processing in the hands of users will also play a role in generating this new traffic.

New data. The topic of AI-centric devices inevitably leads to thoughts of smartphones or PCs, though Chang Lin, CEO of Leju Robot, showcased how robots fit into the mix.

Whether for household or industrial use, robots will carry significant on-board AI processing, but will also need to connect to networks to manage dynamic, real-time scenarios indoors and out, while maximising battery life.

Remote control and human-robot interaction scenarios, meanwhile, will require low-latency, reliable connectivity to succeed and to avoid potential liability risks.

Implications for networks
More data, low-latency access and highly reliable connectivity: the likely impacts of AI on network traffic are not particularly surprising.

The objective of the summit, however, was to look at what these impacts would mean for networks, and three requirements drove the discussion.

Added capacity needs are an obvious implication of new traffic demands: more AI traffic will require networks which can scale to accommodate it. Perhaps less obvious is that uplink expansion will be particularly important. Beyond the uploading of AI-generated content, AI workloads processed off-device will need to send data to cloud resources.

On the topic of cloud and AI workloads, the key question is where they will run. If on-device processing dominates, the impact on networks may be minimal: traffic increases might even be moderated, though device energy consumption would rise. If cloud processing dominates, the impact on network traffic could be immense, to the extent that the ability of current networks to support large-scale AI industrialisation would need to be verified.

As always, the reality will likely be somewhere in between the two extremes.

Huawei, for example, presented data about the toll AI takes on device battery and memory resources, limiting how much of the AI load on-device processing can reasonably handle. At the same time, the need for secure, low-latency, highly available and scalable AI processing ensures workloads will live on-device, in the cloud and everywhere in between.

AI will require scalable networks to support the new traffic demands it generates. More fundamentally, however, AI requires network, business and personal data to generate useful results, depending on the applications and use cases.

One thing will remain constant: the need for data to be secure.

We see this reflected in GSMA Intelligence research highlighting network and end-user security as a top network transformation priority for operators.

5G-Advanced
From the very first keynote of MBBF 2024, Huawei executives highlighted AI for networks versus networks for AI as a key point of tension for the industry.

GSMA Intelligence customers, and anyone who has paid attention to our AI coverage, know what this means. AI has been, and will continue to be, successfully used for network operations, but that cannot obscure the fact that we also need to build, evolve and upgrade networks with support for AI workloads in mind.

While speaking about the potential of AI to revolutionise the services operators offer, nearly every keynote speaker returned to this concept.

In many cases, the networks for AI discussion centred on IT and data infrastructure, including the importance of optimised cloud resources and clean, holistic data assets.

The 5.5G x AI Summit, then, delivered an important reminder of the role 5G-Advanced connectivity can play in bringing AI aspirations to reality.

From network and uplink capacity expansions to low-cost IoT and latency improvements, 5G-Advanced’s role in collecting and transporting AI-centric data is clear, potentially explaining why GSMA Intelligence shows so much interest in 5G-Advanced deployment.

Once that data is connected to distributed cloud processing resources, the potential is just as clear. How exactly AI processing will be orchestrated across distributed nodes, and what role operators will play in the process, is still unclear. I am hopeful we will have ideas and examples around MBBF 2025.

– Peter Jarich – head, GSMA Intelligence

The editorial views expressed in this article are solely those of the author and do not necessarily reflect the views of the GSMA, its Members or Associate Members.