Performance optimization of QoS-supported dense WLANs using machine-learning-enabled enhanced distributed channel access (MEDCA) mechanism

2019 ◽  
Vol 32 (17) ◽  
pp. 13107-13115 ◽  
Author(s):  
Rashid Ali ◽  
Ali Nauman ◽  
Yousaf Bin Zikria ◽  
Byung-Seo Kim ◽  
Sung Won Kim
2020 ◽  
Vol 6 (1) ◽  
Author(s):  
Pei Liu ◽  
Haiyou Huang ◽  
Stoichko Antonov ◽  
Cheng Wen ◽  
Dezhen Xue ◽  
...  

2017 ◽  
Vol 31 (4) ◽  
pp. 381-395 ◽  
Author(s):  
Mehmet Ufuk Caglayan

This paper introduces a special issue of this journal (Probability in the Engineering and Informational Sciences) that is devoted to G(elenbe)-Networks and their Applications. The special issue is based on revised versions of some of the papers that were presented at a workshop held in early January 2017 at the Séminaire Saint-Paul in Nice (France). It includes contributions in several research directions that followed from the introduction of the G-Network in the late 1980s. The papers present original theoretical developments, as well as applications of G-Networks to Machine Learning, to the performance optimization of energy systems via the novel Energy Packet Networks formalism for systems that operate with renewable and intermittent energy sources, and to packet network routing and Cloud management over the Internet. We introduce these contributions from the perspective of an overview of recent work based on G-Networks.


Cognitive radio (CR) is a promising candidate for resolving the spectrum scarcity problem in wireless communication. The main motive of CR technology is to create communication opportunities by sensing and learning from the external environment, thereby improving spectrum usage efficiency. Integrating link adaptation and MIMO-OFDM with CR technology yields the best performance. In this paper, we propose a hybrid (interweave-underlay) channel access scheme for a MIMO-OFDM-based adaptive CR system and investigate its performance alongside that of the underlay channel access scheme. A binary stochastic model is used to reflect the primary user (PU) activity. The performance is evaluated for different channel detection probabilities under both perfect and imperfect (false-alarm) spectrum sensing. The results show a significant improvement in system throughput in proportion to the channel detection probability under perfect spectrum sensing, establishing hybrid channel access as a technique for optimizing system performance and underscoring the importance of optimal channel sensing and selection. To quantify this improvement, we compare the hybrid scheme against a CR system based on the conventional underlay channel access scheme. The proposed scheme can be used to improve the performance of the entire CR network.
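The throughput comparison described in this abstract can be sketched with a small expected-value model. All parameter values below (SNRs, interference level) are hypothetical, and a missed detection is treated as a collision that yields no throughput; the paper's actual system model may differ:

```python
import math

def shannon(sinr):
    """Shannon spectral efficiency in bit/s/Hz for a linear SINR."""
    return math.log2(1.0 + sinr)

def expected_throughput(p_busy, p_d, p_f, hybrid=True,
                        snr_full=20.0, snr_low=2.0, inr=5.0):
    """
    Expected SU throughput (bit/s/Hz) per slot under a binary
    (busy/idle) PU activity model.

    p_busy -- probability that the PU occupies the channel in a slot
    p_d    -- detection probability when the PU is busy
    p_f    -- false-alarm probability when the PU is idle
    snr_full / snr_low -- SU SNR at interweave (full) and underlay
                          (reduced) transmit power (hypothetical values)
    inr    -- PU-to-SU interference-to-noise ratio when the PU is active
    """
    c_full = shannon(snr_full)                     # interweave, clear channel
    c_low = shannon(snr_low)                       # underlay, clear channel
    c_low_interf = shannon(snr_low / (1.0 + inr))  # underlay beneath active PU

    if not hybrid:
        # Conventional underlay: always transmit at reduced power.
        return (1 - p_busy) * c_low + p_busy * c_low_interf

    # Hybrid interweave/underlay:
    #   idle, no false alarm -> interweave at full power
    #   idle, false alarm    -> fall back to underlay
    #   busy, detected       -> underlay beneath the PU
    #   busy, missed         -> collision with the PU, no throughput
    return ((1 - p_busy) * ((1 - p_f) * c_full + p_f * c_low)
            + p_busy * p_d * c_low_interf)
```

With these illustrative numbers the hybrid scheme outperforms pure underlay for moderate false-alarm rates, and its throughput grows with the detection probability, mirroring the trend the abstract reports.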


Information ◽  
2021 ◽  
Vol 12 (2) ◽  
pp. 53
Author(s):  
Massimo Giovannozzi ◽  
Ewen Maclean ◽  
Carlo Emilio Montanari ◽  
Gianluca Valentino ◽  
Frederik F. Van der Veken

A Machine Learning approach to scientific problems has been in use in Science and Engineering for decades. High-energy physics provided a natural application domain for Machine Learning, profiting from these powerful tools for the advanced analysis of particle-collider data. However, Machine Learning has been applied to Accelerator Physics only recently, with several laboratories worldwide deploying intense efforts in this domain. At CERN, Machine Learning techniques have been applied to beam dynamics studies related to the Large Hadron Collider and its luminosity upgrade, in domains including beam measurements and machine performance optimization. In this paper, the recent applications of Machine Learning to the analyses of numerical simulations of nonlinear beam dynamics are presented and discussed in detail. The key concept of dynamic aperture provides a number of topics that have been selected to probe Machine Learning. Indeed, the research presented here aims to devise efficient algorithms to identify outliers and to improve the quality of the fitted models expressing the time evolution of the dynamic aperture.
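The outlier identification and model fitting mentioned above can be illustrated with the well-known inverse-logarithm scaling law for the dynamic aperture, D(N) = b / (ln N)^kappa. The sketch below is illustrative only (the paper's actual models and outlier criteria are not reproduced here): it fits the law by linear regression in log-transformed coordinates, drops points whose residual exceeds a sigma cut, and refits:

```python
import math

def fit_da_model(turns, da, sigma_cut=2.0):
    """
    Fit the inverse-logarithm dynamic-aperture scaling law
        D(N) = b / (ln N)**kappa
    by linear regression in transformed coordinates
        ln D = ln b - kappa * ln(ln N),
    then drop points whose residual exceeds `sigma_cut` standard
    deviations and refit on the remaining points.

    Returns (b, kappa, kept_indices).
    """
    def linfit(xs, ys):
        # Ordinary least squares for y = slope * x + intercept.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        slope = sxy / sxx
        return slope, my - slope * mx

    x = [math.log(math.log(n)) for n in turns]
    y = [math.log(d) for d in da]

    # First pass: fit everything, compute residual spread.
    slope, intercept = linfit(x, y)
    resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    sigma = (sum(r * r for r in resid) / len(resid)) ** 0.5

    # Second pass: refit without the outliers.
    keep = [i for i, r in enumerate(resid) if abs(r) <= sigma_cut * sigma]
    slope, intercept = linfit([x[i] for i in keep], [y[i] for i in keep])

    return math.exp(intercept), -slope, keep
```

On synthetic data lying exactly on the law with one corrupted point, the corrupted point is rejected and the parameters b and kappa are recovered.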


2021 ◽  
Vol 135 ◽  
pp. 103691
Author(s):  
Haoyu Wang ◽  
J. Thomas Gruenwald ◽  
James Tusar ◽  
Richard Vilim

2021 ◽  
Vol 11 (7) ◽  
pp. 3235
Author(s):  
Hyejeong Choi ◽  
Sejin Park

Recently, the machine learning research trend has expanded into system performance optimization, a field where solutions are still largely proposed based on researchers' intuitions and heuristics. Compared to established machine learning areas such as image or speech recognition, machine-learning-based system performance optimization is at an early stage. However, recent papers show that this approach is promising and has significant potential. This paper reviews 11 machine-learning-based system performance optimization approaches from nine recent papers, built on well-known machine learning models such as the perceptron, LSTM, and RNN. The survey details the design of each approach and summarizes its model, input, output, and prediction method. It covers system performance areas ranging from data structures to essential components of a computer system, including index structures, branch predictors, sorting, and cache management. The results show that machine-learning-based system performance optimization holds important potential for future research. We expect that this paper demonstrates the wide applicability of machine learning technology and provides a new perspective on system performance optimization.
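One of the approaches this kind of survey covers, the perceptron branch predictor, can be sketched as follows. The table size is illustrative and the training threshold follows the commonly cited Jiménez-Lin heuristic; this is a simplified model, not any surveyed paper's exact design:

```python
class PerceptronPredictor:
    """
    Minimal perceptron branch predictor: one weight vector per
    (hashed) branch PC, a global history of the last `hist_len`
    outcomes encoded as +1/-1, and training on a misprediction or
    when the output magnitude falls below a threshold.
    """
    def __init__(self, n_entries=256, hist_len=8):
        self.n_entries = n_entries
        self.hist_len = hist_len
        # weights[i][0] is the bias; weights[i][1:] pair with history bits
        self.weights = [[0] * (hist_len + 1) for _ in range(n_entries)]
        self.history = [1] * hist_len           # +1 = taken, -1 = not taken
        self.theta = int(1.93 * hist_len + 14)  # Jimenez-Lin threshold

    def _output(self, pc):
        w = self.weights[pc % self.n_entries]
        return w[0] + sum(wi * hi for wi, hi in zip(w[1:], self.history))

    def predict(self, pc):
        """Return True to predict taken, False to predict not taken."""
        return self._output(pc) >= 0

    def update(self, pc, taken):
        """Train on the actual outcome, then shift the global history."""
        y = self._output(pc)
        t = 1 if taken else -1
        if (y >= 0) != taken or abs(y) <= self.theta:
            w = self.weights[pc % self.n_entries]
            w[0] += t
            for i, hi in enumerate(self.history):
                w[i + 1] += t * hi
        self.history = self.history[1:] + [t]
```

After a short warm-up the predictor learns history-correlated patterns (e.g. a strictly alternating branch) that a simple saturating counter cannot capture.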

