"Applying scalable and practical machine learning for real-time programmable optimization in 5G cellular networks "


"Applying scalable and practical machine learning for real-time programmable optimization in 5G cellular networks "

0 views
  • Share

In traditional mobile networks, the cellular Radio Access Network (RAN) and core networking components interoperate with each other based on protocols and standards, rather than on data and intelligence. Driven by advances in ML/AI techniques, open-source technologies, SDN/NFV, and the network edge cloud, there has been recent interest in the telecom industry in moving from native, closed-source solutions on vendor-proprietary hardware to intelligent, data-driven, open solutions on third-party commercial off-the-shelf platforms. This has opened opportunities for multiple players in the telecom space, fostering an innovative and competitive third-party ecosystem. In the past, however, the closed architecture of cellular networks made it challenging to incorporate such solutions in operational mobile networks. With the advent of 5G, disaggregated RAN and Open RAN architectures, and edge-cloud APIs, there has been significant momentum in building intelligent solutions on operational 5G networks.
In this talk, we cover the application of scalable and practical ML/AI techniques in 5G cellular networks under two key aspects: (i) real-time optimization of the cellular RAN, and (ii) end-to-end network and application optimization. We leverage SDN/NFV, the Open RAN-architected RAN Intelligent Controller (RIC), and edge-cloud APIs to achieve this. In particular, we discuss how ML/AI can be used to predict and optimize RAN latency in real time for LTE-NR dual-connected 5G users by leveraging the RIC. We also discuss how to optimize IP packet sizes in the core network based on ML/AI-driven RAN latency prediction, improving the goodput and latency of the end-user application and the throughput of the network by leveraging network edge-cloud APIs. We further discuss the relevance of these techniques in the context of network slicing for enhanced mobile broadband (eMBB) and ultra-reliable low-latency communication (URLLC) applications.
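To make the second idea concrete, the sketch below is a minimal, hypothetical illustration of coupling an ML-based RAN latency predictor to an IP packet-size decision. The KPI features, synthetic training data, model choice (a gradient-boosted regressor), latency thresholds, and the pick_packet_size policy are all assumptions for illustration; they are not the models or policies presented in the talk.

```python
# Illustrative sketch only: a toy regressor predicts per-UE RAN latency from a
# few hypothetical RAN KPIs, and the prediction is mapped to an IP packet size.
# Feature set, thresholds, and the mapping policy are assumptions, not the
# speaker's method.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic KPI samples: [PRB utilization (%), CQI, active UEs, NR traffic share]
X = rng.uniform([10, 1, 1, 0.0], [95, 15, 200, 1.0], size=(2000, 4))
# Synthetic latency label (ms): grows with load, shrinks with channel quality
y = 2 + 0.08 * X[:, 0] + 0.02 * X[:, 2] - 0.3 * X[:, 1] + rng.normal(0, 0.5, 2000)
y = np.clip(y, 0.5, None)

model = GradientBoostingRegressor().fit(X, y)

def pick_packet_size(kpis, mtu=1500, min_size=576):
    """Shrink the IP packet size when predicted RAN latency is high, so packets
    fit into fewer scheduling opportunities (assumed policy for illustration)."""
    predicted_ms = float(model.predict(np.asarray(kpis).reshape(1, -1))[0])
    if predicted_ms < 5:
        return mtu, predicted_ms
    if predicted_ms < 15:
        return 1200, predicted_ms
    return min_size, predicted_ms

size, latency = pick_packet_size([80, 6, 150, 0.7])
print(f"predicted RAN latency ~ {latency:.1f} ms -> packet size {size} bytes")
```

In a deployed setting, such a predictor would be trained on RAN KPIs exposed through the RIC, and the packet-size decision would be applied in the core via edge-cloud APIs, following the workflow outlined above.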



Rajarajan Sivaraj, Mavenir
