A Novel Atmosphere-informed Data-driven Predictive Channel Modeling for B5G/6G Satellite-terrestrial Wireless Communication Systems at Q-band

Author(s): Lu Bai, Qian Xu, Shangbin Wu, Spiros Ventouras, George Goussetis
2020, pp. 31-54

Author(s): Caslav Stefanovic, Danijel Djosic, Stefan Panic, Dejan Milic, Mihajlo Stefanovic
2021, pp. 228-235

Author(s): Sarun Duangsuwan

Channel modeling for swarm unmanned aerial vehicle (swarm UAV) wireless communication systems in various environments remains a challenge. In this paper, we characterize path loss for the air-to-air (A2A) channel using both measurement and a prediction model. The channel model is based on an extended A2A Two-Ray (A2AT-R) path loss model, while the prediction model uses an artificial neural network (ANN) trained on the measured dataset. To evaluate the measurement results, we compare the path loss given by the A2AT-R model with that of the prediction model, and show that the ANN-based prediction model fits the measured A2A channel data well. The results are discussed in terms of the prediction error metrics mean absolute error (MAE), root mean square error (RMSE), and the coefficient of determination (R2).
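
As a concrete illustration (a sketch under stated assumptions, not the authors' code), the snippet below implements a textbook two-ray ground-reflection path loss curve, of the kind the A2AT-R model extends, generates synthetic "measurements" from it, and fits an ANN regressor, reporting the three metrics named in the abstract. The carrier frequency, UAV altitudes, shadowing level, and dataset size are all illustrative assumptions.

# Sketch: two-ray (A2AT-R-style) path loss baseline plus an ANN fit.
# All parameters (2.4 GHz carrier, UAV altitudes, shadowing) are assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

C = 3e8                       # speed of light (m/s)
FC = 2.4e9                    # carrier frequency (assumed, Hz)
LAM = C / FC                  # wavelength (m)
HT, HR = 50.0, 60.0           # TX/RX UAV altitudes (assumed, m)

def two_ray_path_loss_db(d):
    """Classical two-ray ground-reflection path loss (reflection coeff. -1)."""
    d_los = np.sqrt(d**2 + (HT - HR)**2)        # direct (line-of-sight) path
    d_ref = np.sqrt(d**2 + (HT + HR)**2)        # ground-reflected path
    dphi = 2 * np.pi * (d_ref - d_los) / LAM    # phase difference of the rays
    amp = np.abs(1 - np.exp(-1j * dphi)) * LAM / (4 * np.pi * d_los)
    return -20 * np.log10(np.maximum(amp, 1e-12))   # path loss in dB

# Synthetic "measured" dataset: two-ray trend plus Gaussian shadowing in dB.
rng = np.random.default_rng(0)
d = rng.uniform(10.0, 500.0, 2000)              # horizontal distance (m)
pl = two_ray_path_loss_db(d) + rng.normal(0.0, 3.0, d.size)

# Train an ANN (MLP regressor) to predict path loss from distance.
X = d.reshape(-1, 1)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 32),
                                 max_iter=2000, random_state=0))
ann.fit(X, pl)
pred = ann.predict(X)

# The three error metrics reported in the abstract.
print(f"MAE : {mean_absolute_error(pl, pred):.2f} dB")
print(f"RMSE: {np.sqrt(mean_squared_error(pl, pred)):.2f} dB")
print(f"R^2 : {r2_score(pl, pred):.3f}")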


Electronics, 2021, Vol. 10 (24), pp. 3114
Author(s): Abdallah Mobark Aldosary, Saud Alhajaj Aldossari, Kwang-Cheng Chen, Ehab Mahmoud Mohamed, Ahmed Al-Saman

The exploitation of higher millimeter wave (mmWave) bands is promising for wireless communication systems. The goal of machine learning (ML), and of its deep learning subcategory, in systems beyond 5G (B5G) is to learn from data and make predictions or decisions, rather than relying on classical procedures, in order to enhance wireless design. The new wireless generation should be proactive and predictive, avoiding the drawbacks of existing generations, so as to meet the pillar services targeted by 5G. One aspect of Ultra-Reliable Low-Latency Communications (URLLC) is moving data processing tasks to the cellular base stations. With the rapid growth in wireless communication devices, base stations are required to make decisions that ensure communication reliability. In this paper, an efficient new ML methodology is applied to assist base stations in predicting frequency bands and path loss in a data-driven manner. The ML algorithms used and compared are Multilayer Perceptrons (MLP), a branch of neural networks, and Random Forests. Systems that operate across different bands, such as telecommunication base stations with uplink and downlink transmissions and other Internet of Things (IoT) devices, need to respond quickly when switching bands in order to maintain the requirements of the new radio (NR). ML techniques are therefore needed to enable a base station to learn to move between bands in a data-driven way; a sketch of such a pipeline is given below. To test the proposed idea, we compare the analysis with other deep learning methods, and to validate the proposed models, we apply these techniques to different case studies. To improve the accuracy of supervised learning, we modify the Random Forests by combining an unsupervised algorithm into the learning process. Overall, the ML approach demonstrated strong performance, with an accuracy of 90.24%.
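
As an illustration of the band-prediction pipeline (a minimal sketch under assumptions, not the paper's implementation), the snippet below compares an MLP classifier against a Random Forest whose feature set is augmented with k-means cluster labels, one plausible way of combining an unsupervised algorithm with the supervised learner as the abstract describes. The features, labels, and label-generating rule are synthetic placeholders.

# Sketch: MLP vs. Random Forest for frequency-band prediction, with the
# Random Forest additionally fed k-means cluster ids as an extra feature.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n = 3000
# Assumed features: distance to user (m), received power (dBm), blockage flag.
X = np.column_stack([
    rng.uniform(10.0, 300.0, n),
    rng.normal(-80.0, 10.0, n),
    rng.integers(0, 2, n).astype(float),
])
# Assumed label: preferred band (0 = sub-6 GHz, 1 = mmWave), generated by a
# noisy rule so that the task is learnable but not trivial.
y = ((X[:, 0] < 120.0) & (X[:, 2] == 0.0) &
     (X[:, 1] + rng.normal(0.0, 5.0, n) > -90.0)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

# Neural-network branch: a Multilayer Perceptron classifier.
mlp = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(64, 32),
                                  max_iter=1000, random_state=1))
mlp.fit(X_tr, y_tr)

# Unsupervised step: cluster the inputs and append the cluster id as a
# feature before training the Random Forest.
km = KMeans(n_clusters=4, n_init=10, random_state=1).fit(X_tr)
X_tr_aug = np.column_stack([X_tr, km.predict(X_tr)])
X_te_aug = np.column_stack([X_te, km.predict(X_te)])

rf = RandomForestClassifier(n_estimators=200, random_state=1)
rf.fit(X_tr_aug, y_tr)

print(f"MLP accuracy         : {accuracy_score(y_te, mlp.predict(X_te)):.3f}")
print(f"RF + k-means accuracy: {accuracy_score(y_te, rf.predict(X_te_aug)):.3f}")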

