Prediction of 5G New Radio Wireless Channel Path Gains and Delays Using Machine Learning and CSI Feedback

Author(s):  
Ben Earle ◽  
Ala'a Al-Habashna ◽  
Gabriel Wainer ◽  
Xingliang Li ◽  
Guoqiang Xue

2020 ◽  
Vol 14 (22) ◽  
pp. 4116-4126
Author(s):  
Alaa ElHelaly ◽  
Mai Kafafy ◽  
Ahmed H. Mehanna ◽  
Mohamed M. Khairy

2020 ◽  
Vol 10 (18) ◽  
pp. 6164
Author(s):  
Luis Diez ◽  
Alfonso Fernández ◽  
Muhammad Khan ◽  
Yasir Zaki ◽  
Ramón Agüero

It is well known that transport protocol performance is severely hindered by wireless channel impairments. We study the applicability of Machine Learning (ML) techniques to predict the congestion status of 5G access networks, in particular mmWave links. We use realistic traces, generated with the 3GPP channel models, that are not affected by legacy congestion-control solutions. We start by identifying the transport-layer metrics that might be exploited to learn the congestion state: delay and inter-arrival time. We formally study their correlation with the perceived congestion, which we ascertain from buffer-length variation. Then, we conduct an extensive analysis of various unsupervised solutions, together with supervised solutions that serve as a benchmark. The results show that unsupervised ML solutions can detect a large percentage of congestion situations, and they could thus open interesting possibilities when designing congestion-control solutions for next-generation transport protocols.
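A minimal sketch of the unsupervised detection idea above: cluster per-packet delay and inter-arrival time into two states and label the higher-delay cluster as congested. The synthetic trace, the two-cluster KMeans setup, and the standardization step are illustrative assumptions, not the paper's exact pipeline.

```python
# Sketch: unsupervised congestion detection from the two transport-layer
# metrics named in the abstract (delay, inter-arrival time).
# The data generator and clustering choices below are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for trace data: (delay_ms, inter_arrival_ms) per packet.
uncongested = rng.normal(loc=[20.0, 1.0], scale=[2.0, 0.2], size=(500, 2))
congested = rng.normal(loc=[80.0, 4.0], scale=[10.0, 1.0], size=(500, 2))
X = np.vstack([uncongested, congested])

# Standardize so the millisecond-scale delay does not dominate the distance
# metric, then cluster the packets into two latent states.
Xs = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)

# Interpret the cluster with the higher mean delay as "congested".
congested_cluster = np.argmax([X[labels == k][:, 0].mean() for k in (0, 1)])
print(f"packets flagged as congested: {(labels == congested_cluster).sum()}")
```

Because the clusters are unlabeled, the final step has to map them back to a congestion meaning; using mean delay for that mapping is one simple convention.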


2021 ◽  
Author(s):  
Abdelfatteh Haidine ◽  
Fatima Zahra Salmam ◽  
Abdelhak Aqqal ◽  
Aziz Dahbi

The deployment of 4G/LTE (Long Term Evolution) mobile networks has solved the major challenge of high capacity, making real broadband mobile Internet possible. This was achieved mainly through a very strong physical layer and a flexible network architecture. However, bandwidth-hungry services, such as virtual reality (VR) and augmented reality (AR), have developed at an unprecedented pace. Furthermore, mobile networks are facing new services with extreme demands for reliability and near-zero-latency performance, such as vehicular communications or the Internet of Vehicles (IoV). Using a new radio interface based on massive MIMO, 5G has overcome some of these challenges. In addition, the adoption of software-defined networking (SDN) and network function virtualization (NFV) has added a higher degree of flexibility, allowing operators to support very demanding services from different vertical markets. However, network operators are forced to build a higher level of intelligence into their networks, in order to learn the operating environment and users' behaviors and needs deeply and accurately. It is also important to forecast their evolution, so as to build a proactively and efficiently (self-)updatable network. In this chapter, we describe the role of artificial intelligence and machine learning in 5G and beyond in building cost-effective, adaptable, and high-performing next-generation mobile networks. Some practical use cases of AI/ML in the network life cycle are discussed.


Author(s):  
Pejman Ghasemzadeh ◽  
Subharthi Banerjee ◽  
Michael Hempel ◽  
Hamid Sharif ◽  
Tarek Omar

Automatic Modulation Classification (AMC) is becoming an essential component in receiver designs for next-generation communication systems, such as Cognitive Radios (CR). AMC enables receivers to classify an intercepted signal's modulation scheme without any prior information about the signal. This is becoming increasingly vital due to the combination of congested frequency bands and geographically disparate frequency licensing for the railroad industry across North America. Thus, a radio technology is needed that allows train systems to adapt automatically and intelligently to changing locations and the corresponding RF environment fluctuations. Three AMC approaches have been proposed in the scientific literature. The performance of these approaches depends strongly on the particular environment in which the classifiers are employed. In this work, the authors present a performance evaluation of the feature-based AMC approach, as it is the most promising for real-time AMC operation in railroads, under various wireless channel environments. This is the first such study for railroad applications to consider different environment models, including non-Gaussian Class A noise, multipath fast fading, and their combination. The evaluation is conducted for signals using a series of QAM modulation schemes. The authors selected the signal's cumulant statistical features for the feature-extraction stage, coupled with three different machine learning classifiers: Support Vector Machine (SVM), Deep Neural Network (DNN), and Recurrent Neural Network (RNN) with long short-term memory (LSTM), in order to maintain control over the classifiers' accuracy and computational complexity, especially in the non-linear cases. Our results indicate that when the signal model's noise shows more non-linear behavior, the RNN classifier on average achieves higher classification accuracy than the other classifiers.
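As a rough illustration of the feature-based pipeline described above, the sketch below computes common higher-order cumulant features (C20, C21, C40, C42) from complex baseband QAM bursts and trains an SVM on them. The specific cumulant set, SNR, and burst generator are assumptions made for illustration, not the authors' reported configuration.

```python
# Sketch: cumulant features from complex baseband samples, classified by an
# SVM, in the spirit of feature-based AMC. Standard zero-mean cumulant
# formulas; the QAM generator and 10 dB SNR are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

def cumulant_features(x):
    """C20, C21, |C40|, |C42| of a zero-mean complex signal x."""
    c20 = np.mean(x ** 2)
    c21 = np.mean(np.abs(x) ** 2)
    c40 = np.mean(x ** 4) - 3 * c20 ** 2
    c42 = np.mean(np.abs(x) ** 4) - np.abs(c20) ** 2 - 2 * c21 ** 2
    return np.array([np.abs(c20), c21, np.abs(c40), np.abs(c42)])

def qam_burst(order, n=1024, snr_db=10.0, rng=None):
    """Random unit-power square-QAM burst (4/16/64-QAM) with AWGN."""
    if rng is None:
        rng = np.random.default_rng()
    m = int(np.sqrt(order))
    levels = 2 * np.arange(m) - (m - 1)
    sym = rng.choice(levels, n) + 1j * rng.choice(levels, n)
    sym = sym / np.sqrt(np.mean(np.abs(sym) ** 2))
    noise_std = np.sqrt(10 ** (-snr_db / 10) / 2)
    return sym + noise_std * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

rng = np.random.default_rng(1)
orders = [4, 16, 64]
X = np.array([cumulant_features(qam_burst(o, rng=rng))
              for o in orders for _ in range(200)])
y = np.repeat(orders, 200)

clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```

Cumulants are attractive features here because, for square QAM, |C40| and |C42| take distinct theoretical values per constellation order, so even a shallow classifier can separate them at moderate SNR.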


Electronics ◽  
2021 ◽  
Vol 10 (24) ◽  
pp. 3114
Author(s):  
Abdallah Mobark Aldosary ◽  
Saud Alhajaj Aldossari ◽  
Kwang-Cheng Chen ◽  
Ehab Mahmoud Mohamed ◽  
Ahmed Al-Saman

The exploitation of higher millimeter-wave (mmWave) frequencies is promising for wireless communication systems. The goal of machine learning (ML), and of its deep learning subcategory, in beyond-5G (B5G) networks is to learn from data and make predictions or decisions, rather than relying on classical procedures, to enhance the wireless design. The new wireless generation should be proactive and predictive, avoiding the drawbacks of previous wireless generations, to meet the pillars of the 5G target services. One aspect of Ultra-Reliable Low-Latency Communications (URLLC) is moving data-processing tasks to the cellular base stations. With the rapid growth in the use of wireless communication devices, base stations are required to execute tasks and make decisions that ensure communication reliability. In this paper, an efficient new ML methodology is applied to assist base stations in predicting frequency bands and path loss in a data-driven manner. The ML algorithms used and compared are Multilayer Perceptrons (MLP), a branch of neural networks, and Random Forests. Systems that consume different bands, such as telecommunication base stations with uplink and downlink transmissions and other Internet of Things (IoT) devices, need fast responses between devices when switching bands to maintain the requirements of the New Radio (NR). Thus, ML techniques are needed to learn and assist a base station in switching between different bands based on a data-driven system. To test the proposed idea, we compare the analysis with other deep learning methods. Furthermore, to validate the proposed models, we apply these techniques to different case studies to ensure the success of the proposed work. To enhance the accuracy of supervised learning, we modify the Random Forests by combining an unsupervised algorithm with the learning process. Ultimately, the proposed ML approach demonstrated strong accuracy, at 90.24%.
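The sketch below illustrates, under stated assumptions, the paper's idea of combining an unsupervised algorithm with supervised learning: a KMeans cluster label is appended to the features before training a Random Forest to predict path loss. The feature set (distance, mmWave band) and the log-distance path-loss generator used as ground truth are hypothetical, not the paper's dataset.

```python
# Sketch: data-driven path-loss prediction with a Random Forest whose inputs
# are augmented by an unsupervised (KMeans) cluster id. All data below is
# synthetic; the path-loss model and bands are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Synthetic samples: link distance in meters, carrier frequency in GHz.
d = rng.uniform(10, 500, 2000)
f = rng.choice([28.0, 39.0, 60.0], 2000)          # candidate mmWave bands
X = np.column_stack([d, f])

# Assumed ground truth: log-distance path loss (exponent 3) with shadowing.
pl = 32.4 + 20 * np.log10(f) + 30 * np.log10(d) + rng.normal(0, 4, 2000)

# Unsupervised step: append a KMeans cluster id as an extra feature.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
X_aug = np.column_stack([X, clusters])

X_tr, X_te, y_tr, y_te = train_test_split(X_aug, pl, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out samples: {model.score(X_te, y_te):.3f}")
```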

