Telecommunication networks are complex systems, and they are only growing more so as they evolve to support more demanding 5G use cases. Operators are increasingly looking to advances in artificial intelligence (AI) to help manage this growing complexity, though hurdles remain.
Subhankar Pal, senior director for innovation at Capgemini Engineering, explained in an interview with SDxCentral that one highly touted use case is using AI to better manage network resources. This involves actively managing the usage and power requirements of a cell site.
“Switching off a cell or switching off some of the antennas at night when they’re [not] required, or even when we are speaking, maybe there are pauses, intermediate pauses, and can some of the time slots or the communication be shut off at the time,” Pal said.
This might not seem like a significant benefit, but operators typically have tens of thousands of large cell sites on their network, with ongoing deployment of smaller cell sites designed for more targeted and localized coverage needs. Having these all running when not needed is a huge drain on energy and financial resources.
As an example, Pal said larger carriers can spend up to $1.5 billion per year just on energy.
“If you can reduce 10% of this energy, that’s a huge cost saving and of course also helping your net-zero targets,” Pal said. “Energy saving is a big area for this.”
This sentiment was echoed by Chris Sambar, EVP for technology at AT&T, during a keynote speech at the recent Brooklyn 6G Summit where he talked about the use of AI to manage tower resources.
“I just think that’s the coolest thing,” Sambar said. “Basically, we tell the network if it’s not being used right now … just shut off the radios that aren’t being used. … And then if somebody starts using it, it fires back up. That has saved us the equivalent of 14,000 homes worth of energy in any given year. We like innovative technologies … that we can put into the network and save energy.”
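The behavior Sambar describes, powering radios down when idle and waking them when demand returns, can be sketched as a simple control loop. The sketch below is illustrative only; the class, field names, and zero-user threshold are hypothetical, not any vendor's actual interface, and real systems weigh predicted traffic, coverage guarantees, and wake-up latency.

```python
from dataclasses import dataclass

@dataclass
class Radio:
    """A single radio unit at a cell site (hypothetical model)."""
    radio_id: str
    active_users: int
    powered_on: bool = True

def manage_radios(radios: list[Radio], idle_threshold: int = 0) -> None:
    """Power down radios with no traffic; wake any that see demand again."""
    for radio in radios:
        if radio.active_users <= idle_threshold and radio.powered_on:
            radio.powered_on = False   # shut off the radio that isn't being used
        elif radio.active_users > idle_threshold and not radio.powered_on:
            radio.powered_on = True    # "fires back up" when somebody starts using it

site = [Radio("sector-a", 12), Radio("sector-b", 0)]
manage_radios(site)
print([(r.radio_id, r.powered_on) for r in site])
# → [('sector-a', True), ('sector-b', False)]
```

An AI-driven version would replace the fixed threshold with a traffic forecast, so radios sleep through predicted lulls rather than reacting after the fact.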
AI needs quality 5G data
Pal also noted that one of the biggest hurdles facing this AI evolution in 5G networks is gaining access to quality data that can be used with machine learning (ML) to better train AI models.
Spain-based telecommunications giant Telefónica labeled these data sources as “data foundations.”
“Data foundations are important for AI because they provide the raw material that AI algorithms use to learn and make predictions,” Richard Benjamins, chief responsible AI officer for the carrier, noted in a recent blog post. “Without access to high-quality, diverse and relevant data, AI systems would not be able to perform their tasks effectively. Data foundations also play a critical role to ensure the scalability and maintainability of AI systems over time. Relevant topics include privacy, data anonymization and synthetic data.”
However, gaining access to this data is challenging: telecom operators often sit on a treasure trove of legally gathered user data that they are hesitant to share with outside parties for competitive and legal reasons.
“If I want to get data from the radio network, the operator may not like to share their user information, the mobile numbers, or their mobility patterns with exact locations,” Pal said. “If you don’t have very high-quality data, then the predictions will not be very high quality.”
Benjamins’ reference to anonymized data points to one way around the privacy challenge, but privacy remains a larger issue for multinational operators that must manage diverse regulatory environments.
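One common form of the anonymization Benjamins mentions is pseudonymization: replacing subscriber identifiers with salted one-way hashes before data leaves the operator's domain. The sketch below illustrates the idea under stated assumptions; the field names and truncation length are hypothetical, and a production scheme would add safeguards such as salt rotation and location coarsening.

```python
import hashlib
import secrets

# The salt stays inside the operator's boundary; without it, the
# mapping from pseudonym back to the real identifier is impractical.
SALT = secrets.token_bytes(16)

def pseudonymize(subscriber_id: str) -> str:
    """One-way mapping from a real identifier to a stable pseudonym."""
    return hashlib.sha256(SALT + subscriber_id.encode()).hexdigest()[:16]

# Hypothetical radio-network record: the mobile number is replaced,
# but traffic fields stay usable for ML training.
record = {"msisdn": "+34600123456", "cell": "site-0042", "bytes_down": 1_830_224}
safe_record = {**record, "msisdn": pseudonymize(record["msisdn"])}
```

Because the same subscriber always maps to the same pseudonym (for a given salt), mobility patterns remain analyzable for model training without exposing the mobile numbers themselves, the concern Pal raises about sharing radio-network data.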
“If I need huge compute resources to run my machine learning training algorithms, that may reside in cloud resources from an operator network,” Pal said of this challenge. “If I am working with an operator that is in southern Europe and the cloud data is somewhere in northern Europe, is it even possible to get that data? Even if it’s the same operator this is becoming a challenge to rectify due to privacy issues.”
Sourced from SDxCentral