PARTNER INTERVIEW: Exponential growth in data volumes and rising network complexity are pushing mobile operators around the world, with support from their network partners, to shift computing closer to where data is used, an approach the industry increasingly regards as effective.
With the emergence of 5G core networks, virtual content delivery networks (CDNs) and various cloud-native applications, cloud computing technologies now face a number of obstacles, including significant performance overheads, security risks, and new operations and maintenance (O&M) challenges.
The vast number of devices connected to the internet generates a high volume of data for traditional data centre infrastructure to handle. Smartphone adoption is expected to continue climbing, reaching 92 per cent by 2030, up from 76 per cent today, GSMA Intelligence data showed.
In addition, the boom in AI applications, such as ChatGPT, is a key factor in putting increased stress on data centres, along with growing use of social media, online gaming and streaming video services.
AI applications require more power-intensive computations from servers and storage systems than traditional workloads.
Fairfield Market Research forecasts the AI market to grow by about 30 per cent annually until 2030.
Meanwhile, edge devices are expected to take on more AI computation, with the global edge computing market growing 39 per cent annually until 2030, Polaris Market Research predicts.
Boosting efficiency
Rising demand for computing power is creating a widening gap between computing supply and data processing needs, ZTE VP Chen Xinyu (pictured) said in an interview with Mobile World Live. To close this mismatch, the company developed a new computing network infrastructure based on software and hardware coordination, centred on data processing units (DPUs).
Accelerated by the DPU, data centre architecture shifts from computing-centric to data-centric. Advantages include freeing up CPU cores, enhanced performance, security isolation and greater flexibility.
Key applications of ZTE’s DPU-centric architecture cover IT data centres, intelligent computing, telco cloud and edge computing.
ZTE’s next-generation cloud infrastructure offloads the cloud platform’s management-control and hypervisor virtualisation layers from servers to its NEO (native enhanced-cloud orchestration) DPU card, Chen explained.
The acceleration card belongs to a broad class of devices with dedicated hardware accelerators or programmable processors, known as function accelerator cards (FACs), used to speed up tasks such as networking, security and storage. Different types of FACs are introduced into the system for different requirements: smart network interface cards (NICs) for faster packet forwarding, GPUs for image processing and DPUs for data processing.
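To make that taxonomy concrete, the short Python sketch below simply restates the examples above as a lookup from workload requirement to card type; the dictionary entries and helper function are illustrative assumptions, not part of any ZTE software.

```python
# Illustrative mapping of function accelerator card (FAC) types to the
# class of work each one offloads from the host CPU, restating the
# examples in the article. Names and descriptions are illustrative only.

FAC_OFFLOADS = {
    "smart_nic": "packet forwarding and virtual switching",
    "gpu": "image and parallel compute processing",
    "dpu": "data-path processing: network, storage and security offload",
}

def pick_fac(requirement: str) -> str:
    """Return the FAC type whose offload description mentions the requirement."""
    for card, offload in FAC_OFFLOADS.items():
        if requirement.lower() in offload:
            return card
    raise ValueError(f"No FAC listed for requirement: {requirement}")

if __name__ == "__main__":
    print(pick_fac("storage"))     # -> dpu
    print(pick_fac("forwarding"))  # -> smart_nic
```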
The benefits of adopting ZTE’s NEO card are many.
First, offloading virtual switches to the intelligent cloud card improves network performance and reduces CPU demand by 20 per cent, Chen stated.
In addition, storage performance is improved to 3 million IOPS, and by offloading the virtualisation layer the server needs only one virtual CPU instead of as many as 15. Offloading cloud management reduces CPU consumption further, he added.
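As a rough illustration of the resource arithmetic Chen describes, the short Python sketch below estimates how many vCPUs a fleet could reclaim if virtualisation-layer overhead drops from 15 vCPUs to one per server; the fleet size is a hypothetical placeholder, not a ZTE figure.

```python
# Back-of-the-envelope estimate of vCPUs reclaimed when the virtualisation
# layer is offloaded to a DPU card, using the figures quoted in the
# interview (overhead of 15 vCPUs per server falling to 1). The fleet size
# below is a hypothetical example, not ZTE data.

OVERHEAD_BEFORE = 15  # vCPUs consumed by the virtualisation layer per server
OVERHEAD_AFTER = 1    # vCPUs still needed once the layer runs on the card

def reclaimed_vcpus(servers: int) -> int:
    """Return the total vCPUs freed across a fleet of servers."""
    return servers * (OVERHEAD_BEFORE - OVERHEAD_AFTER)

if __name__ == "__main__":
    fleet = 200  # hypothetical edge-site fleet size
    print(f"{fleet} servers free up {reclaimed_vcpus(fleet)} vCPUs for workloads")
```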
Security hardening is another advantage. The NEO card is physically isolated from the host, with unidirectional control, and its dedicated hardware is encrypted to keep service and management physically separate. Even if a virtual machine (VM) is attacked or the host is compromised, the management and control planes cannot be reached. The system is therefore more secure and can be deployed in a zero-trust environment.
NEO card
Through the NEO card’s integrated software and hardware architecture, the computing, network and storage modules are offloaded from the server to the card, simplifying cloud platform deployment.
The company unveiled the card at MWC21 Shanghai.
The cloud-based product is designed to improve server resource utilisation and performance. The plug-and-play card can be inserted directly into a customer’s existing server to deploy that system at the edge. The cloud platform is more secure and can be deployed in any zero-trust environment, Chen said.
The card can be deployed widely in public and private clouds, and with the cloud acceleration card installed, a new server can quickly be brought onto the cloud.
Based on the software-hardware coordination architecture, the NEO card can flexibly offload and accelerate network, storage and security services for maximum performance. It also works with cloud management to provide unified control of VMs, containers and bare metal services, improving O&M efficiency.
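The unified-control idea can be pictured with a small Python sketch: a single provisioning entry point behind which VM, container and bare metal back ends sit. The class and method names here are hypothetical illustrations and do not reflect ZTE’s actual software.

```python
# Hypothetical sketch of unified workload control in the spirit of
# DPU-assisted cloud management: one interface hides whether the workload
# lands on a VM, a container or a bare metal server. All names are
# illustrative assumptions, not ZTE's API.
from abc import ABC, abstractmethod

class WorkloadBackend(ABC):
    @abstractmethod
    def provision(self, name: str, vcpus: int, memory_gb: int) -> str: ...

class VMBackend(WorkloadBackend):
    def provision(self, name, vcpus, memory_gb):
        return f"VM '{name}' created with {vcpus} vCPUs / {memory_gb} GB"

class ContainerBackend(WorkloadBackend):
    def provision(self, name, vcpus, memory_gb):
        return f"Container '{name}' scheduled with {vcpus} vCPUs / {memory_gb} GB"

class BareMetalBackend(WorkloadBackend):
    def provision(self, name, vcpus, memory_gb):
        return f"Bare metal node '{name}' allocated ({vcpus} cores / {memory_gb} GB)"

class UnifiedController:
    """Single control point for VMs, containers and bare metal services."""
    def __init__(self):
        self._backends = {
            "vm": VMBackend(),
            "container": ContainerBackend(),
            "baremetal": BareMetalBackend(),
        }

    def provision(self, kind: str, name: str, vcpus: int, memory_gb: int) -> str:
        return self._backends[kind].provision(name, vcpus, memory_gb)

if __name__ == "__main__":
    ctrl = UnifiedController()
    print(ctrl.provision("container", "cdn-cache", vcpus=4, memory_gb=8))
```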
The key advantages are enhanced security, high performance and ease of use, he noted. A hypervisor virtualisation layer is pre-integrated on the cloud-based card, enabling on-site deployment in minutes and reducing O&M overhead.
The NEO card also improves cost control and server performance: resource utilisation rises without increasing the overall cost to the customer, and the equipment room footprint can shrink, lowering total cost.
Success cases
Edge hyper-convergence: Guangdong Mobile deployed ZTE’s hyper-converged edge computing platform for network edge services, increasing the integration level by 20 per cent and significantly reducing the space requirements of equipment rooms.
The supplier explained that space in edge equipment rooms is limited, resources are scarce and computing power density needs to increase. The hypervisor and virtual infrastructure management are offloaded to the NEO control card, while the acceleration card offloads and accelerates networking and storage, providing a highly integrated, secure and plug-and-play environment.
Cloud-based bare metal management: Hebei Mobile implemented ZTE’s data-centric architecture using DPUs, resulting in a flexible supply of VMs, containers and bare metal servers, providing a consistent service experience and reducing O&M complexity.
Bare metal is traditionally difficult to manage, so cloud networking and cloud storage could not be used in this case. ZTE’s NEO cloud card offloads bare metal and container management to the card, delivering the flexibility of VMs alongside the performance of physical machines.