The introduction of fifth-generation (5G) and sixth-generation (6G) networks has brought new possibilities, but it also requires dynamic radio resource management (RRM). These networks underpin advanced technologies such as drones, virtual reality, and augmented reality; supporting them, however, requires the ability to track and predict key network metrics.
Researchers are increasingly using artificial intelligence (AI) and machine learning (ML) algorithms to accurately predict mobile network profiles. Applying AI and ML in 5G networks enables effective and streamlined network planning and management. A prominent application of ML in 5G and 6G networks is network traffic prediction, which monitors user requests and analyzes user behavior in applications.
Against this backdrop, researchers at RUDN University recently studied traffic forecasting. They investigated two popular time series analysis models: the Holt-Winters model and the Seasonal Autoregressive Integrated Moving Average (SARIMA) model. They used a dataset from Portuguese mobile operators that aggregates hourly download and upload traffic statistics. One of the researchers highlighted that the growth in connected devices has led to a sharp increase in traffic volume, causing problems such as network congestion, reduced quality of service, delays, data loss, and blocking of new connections. The network architecture must therefore adapt to the increasing amount of traffic and accommodate several types of traffic with different requirements.
The researchers found that both models performed well, accurately predicting traffic for the next hour. SARIMA excelled at predicting user-to-base-station (uplink) traffic, with an average error of only 11.2%. The researchers highlighted that its ability to capture temporal patterns allows accurate tracking of fluctuations in mobile network traffic. In contrast, the Holt-Winters model performed better when estimating traffic from base stations to users (downlink), with an error of only about 4%. The researchers attribute the Holt-Winters model's performance to its ability to handle the complex seasonality and trend components of some traffic datasets.
The researchers measured model performance with several criteria: mean squared error (MSE), root mean squared error (RMSE), mean absolute error (MAE), mean absolute percentage error (MAPE), and mean squared logarithmic error (MSLE). They emphasized that although both models performed well, performance could be further improved by fine-tuning certain hyperparameters, and that no single method can be universally applied to all situations. The researchers plan to combine statistical models with AI and ML techniques to obtain more sophisticated predictions and to detect anomalies quickly.
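The five reported metrics are straightforward to implement. The sketch below defines them with NumPy and evaluates them on a small hypothetical sample (the numbers are illustrative, not the paper's data); MAPE is expressed as a percentage, and MSLE uses `log1p` so zero-traffic hours do not break the logarithm.

```python
import numpy as np

def mse(y, yhat):
    # Mean squared error
    return np.mean((y - yhat) ** 2)

def rmse(y, yhat):
    # Root mean squared error, in the same units as the traffic itself
    return np.sqrt(mse(y, yhat))

def mae(y, yhat):
    # Mean absolute error
    return np.mean(np.abs(y - yhat))

def mape(y, yhat):
    # Mean absolute percentage error (requires y != 0)
    return 100 * np.mean(np.abs((y - yhat) / y))

def msle(y, yhat):
    # Mean squared logarithmic error; log1p tolerates zero values
    return np.mean((np.log1p(y) - np.log1p(yhat)) ** 2)

y_true = np.array([100.0, 120.0, 80.0, 90.0])   # observed traffic
y_pred = np.array([110.0, 115.0, 85.0, 95.0])   # model forecasts

for name, fn in [("MSE", mse), ("RMSE", rmse), ("MAE", mae),
                 ("MAPE", mape), ("MSLE", msle)]:
    print(f"{name}: {fn(y_true, y_pred):.4f}")
```

MAPE is the metric behind the headline figures (11.2% and ~4%), which is why the paper can compare errors across uplink and downlink series of very different magnitudes.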
In conclusion, this study showed that by using AI and ML algorithms, 5G and 6G network providers can effectively predict and respond to evolving traffic dynamics. The work matters as researchers focus on improving the efficiency of their approach and on user satisfaction. Efforts to maximize the efficiency of 5G and 6G networks continue, with cutting-edge techniques pursuing greater accuracy in network traffic prediction and anomaly detection.
Check out the paper. All credit for this study goes to the researchers of this project.
Rachit Ranjan is a consulting intern at MarktechPost. He is currently pursuing a bachelor's degree at the Indian Institute of Technology (IIT) Patna. He is actively building a career in artificial intelligence and data science and is passionate about exploring these fields.