Power Consumption Modeling of Discontinuous Reception for Cellular Machine Type Communications

Sensors ◽  
2019 ◽  
Vol 19 (3) ◽  
pp. 617 ◽  
Author(s):  
Yasir Mehmood ◽  
Lei Zhang ◽  
Anna Förster

Machine-type communication (MTC) is an emerging communication trend in which intelligent machines communicate with each other without human intervention. Mobile cellular networks, with their wide coverage, high data rates, and continuously decreasing costs, offer a good infrastructure for implementing it. However, power consumption is a major issue, which the 3GPP (3rd Generation Partnership Project) has recently addressed by defining power-saving mechanisms. In this paper, we address the problem of modeling these power-saving mechanisms. Existing modeling schemes do not consider the full range of states in the discontinuous reception (DRX) mechanism of LTE-A networks. We propose a semi-Markov-based analytical model that closes this gap and predicts performance evaluation metrics, such as the power-saving factor and wake-up latency of MTC devices, in very good agreement with simulation experiments. Furthermore, we evaluate the DRX parameters and their impact on the power consumption of MTC devices.
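As a rough illustration of the quantities such a model predicts, the sketch below estimates the power-saving factor and mean wake-up latency for a single fixed DRX cycle by Monte Carlo. The timer values are assumptions for illustration; the paper's semi-Markov model covers the full set of DRX states, which this sketch does not.

# Monte Carlo sketch of two DRX metrics for one fixed cycle: the
# power-saving factor (fraction of time the receiver is off) and the
# mean wake-up latency of a packet arriving at a uniformly random
# phase of the cycle. Timer values are illustrative assumptions.
import random

def drx_metrics(t_on=10.0, t_cycle=320.0, n_packets=100_000, seed=1):
    random.seed(seed)
    power_saving = (t_cycle - t_on) / t_cycle   # sleep fraction of each cycle
    delays = []
    for _ in range(n_packets):
        arrival = random.uniform(0.0, t_cycle)  # packet arrival phase (ms)
        # During the on-duration the packet is served immediately;
        # otherwise it waits for the start of the next cycle.
        delays.append(0.0 if arrival < t_on else t_cycle - arrival)
    return power_saving, sum(delays) / len(delays)

print(drx_metrics())   # ~ (0.969, ~150 ms mean wake-up latency)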

2018 ◽  
Vol 7 (4.30) ◽  
pp. 562
Author(s):  
Wafi A. Mabrouk ◽  
M. F. L. Abdullah ◽  
M. S. M. Gismalla

Free-space optical (FSO) technology has gained considerable popularity across a variety of telecommunication fields. It offers a wide range of advantages that place it at the frontier of high-data-rate applications and of solutions to the last-mile and bottleneck problems. It is preferred for its ease of deployment without fiber cables, freedom from spectrum licensing fees, cost-effectiveness, and efficiency, and it excels in performance when compared to contemporary RF technology. At the same time, there is growing demand for alternative rail communication solutions that deliver safer, more reliable, and faster internet access. In this paper, the performance of a ground-to-train free-space optical (G2T FSO) communication link was evaluated. The system was simulated at a 2.5 Gb/s link rate under several weather conditions, with receiver and geometrical losses included. Performance was evaluated in terms of received power, Q factor, BER, and eye diagram. The link showed substantial vulnerability to severe fog attenuation, although the system was able to operate with an acceptable eye height and a minimum BER of 10⁻³⁸.
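To make the loss terms concrete, the following sketch computes the received power of a G2T FSO link from transmit power, beam-spreading geometric loss, and weather attenuation, plus BER from the Q factor. All parameter values are assumptions for illustration, not the article's simulation settings.

# Rough FSO link-budget sketch (illustrative, not the paper's simulation
# setup). Parameter values are assumptions, not taken from the article.
import math

def fso_received_power_dbm(p_tx_dbm, div_mrad, link_km, rx_dia_m,
                           atten_db_per_km, sys_loss_db=2.0):
    """Received power for a ground-to-train FSO link (dBm)."""
    beam_dia_m = div_mrad * 1e-3 * link_km * 1e3   # beam diameter at receiver
    # Geometric loss: the receiver aperture captures only part of the
    # widened beam once the beam outgrows the aperture.
    geo_loss_db = (20 * math.log10(beam_dia_m / rx_dia_m)
                   if beam_dia_m > rx_dia_m else 0.0)
    path_loss_db = atten_db_per_km * link_km       # weather attenuation
    return p_tx_dbm - geo_loss_db - path_loss_db - sys_loss_db

def ber_from_q(q):
    """Standard BER estimate from the Q factor for OOK detection."""
    return 0.5 * math.erfc(q / math.sqrt(2))

# Clear air (~0.2 dB/km) vs. heavy fog (~100 dB/km) over a 1 km link:
print(fso_received_power_dbm(10, 1.0, 1.0, 0.2, 0.2))    # ~ -6.2 dBm
print(fso_received_power_dbm(10, 1.0, 1.0, 0.2, 100.0))  # ~ -106 dBm
print(ber_from_q(6.0))                                   # ~ 1e-9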


2014 ◽  
Vol 556-562 ◽  
pp. 2076-2080
Author(s):  
Xiang Yu ◽  
Yao Song

With the rapid development of emerging applications and rising data transmission rates in the LTE (Long Term Evolution) system, terminal power consumption is becoming a serious concern. In LTE, the discontinuous reception (DRX) mechanism serves as an important means of saving terminal energy. This paper presents a detailed analysis of the DRX principle in terms of the operation of its various timers and, on this basis, proposes an adjustable DRX long cycle. Building on the DRX semi-Markov process and the ETSI data model, formulas for the power-saving factor and wake-up delay of adjustable LTE DRX are derived. Comparisons between a fixed DRX cycle and the adjustable DRX cycle in terms of power saving and wake-up delay show that the optimized mechanism clearly reduces terminal power consumption.
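The adaptation rule below is a minimal sketch of an adjustable long cycle in the same spirit: exponential back-off of the cycle length during idle wake-ups, reset on traffic. The back-off rule and the cycle bounds are assumptions, not the formulas derived in the paper.

# Illustrative adjustable DRX long cycle: lengthen the cycle while the
# UE keeps waking to no traffic, shorten it as soon as traffic appears.
# The doubling rule and bounds are assumptions for illustration.
def adjust_long_cycle(cycle_ms, had_traffic, c_min=80, c_max=2560):
    """Double the long DRX cycle after an idle wake-up, reset on traffic."""
    if had_traffic:
        return c_min                     # traffic: fall back to shortest cycle
    return min(cycle_ms * 2, c_max)      # idle: back off exponentially

cycle = 80
for traffic in [False, False, False, True, False]:
    cycle = adjust_long_cycle(cycle, traffic)
    print(cycle)   # 160, 320, 640, 80, 160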


2020 ◽  
Author(s):  
Syed Hashim Ali Shah ◽  
Sundar Aditya ◽  
Sundeep Rangan

Discontinuous reception (DRX), wherein a user equipment (UE) temporarily disables its receiver, is a critical power saving feature in modern cellular systems. DRX is likely to be aggressively used at mmWave and sub-THz frequencies due to the high front-end power consumption. A key challenge for DRX at these frequencies is blockage-induced link outages: A UE will likely need to track many directional links to ensure reliable multi-connectivity, thereby increasing the power consumption. In this paper, we explore reinforcement learning-based link tracking policies in connected mode DRX that reduce power consumption by tracking only a fraction of the available links, but without adversely affecting the outage and throughput performance. Through detailed, system level simulations at 28 GHz (5G) and 140 GHz (6G), we observe that even sub-optimal link tracking policies can achieve considerable power savings with relatively little degradation in outage and throughput performance, especially with digital beamforming at the UE. In particular, we show that it is feasible to reduce power consumption by 75% and still achieve up to 95% (80%) of the maximum throughput using digital beamforming at 28 GHz (140 GHz), subject to an outage probability of at most 1%.
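The toy below sketches the reinforcement-learning flavor of this approach with tabular Q-learning: the agent picks how many of the available directional links to track, trading a per-link power cost against the risk that every tracked link is blocked. The state, reward, blockage probability, and power cost are all invented for illustration and are far simpler than the paper's system-level setup.

# Toy tabular Q-learning for choosing how many directional links to
# track per DRX cycle. All numbers are assumptions, not the paper's.
import random

N_LINKS = 4
ACTIONS = range(1, N_LINKS + 1)          # track 1..4 of the available links
P_BLOCK, POWER_COST = 0.3, 0.15          # per-link blockage prob., tracking cost
Q = {(s, a): 0.0 for s in range(N_LINKS + 1) for a in ACTIONS}

def step(n_tracked):
    """Reward = 1 if at least one tracked link survives blockage, minus power."""
    alive = sum(random.random() > P_BLOCK for _ in range(n_tracked))
    return alive, (1.0 if alive > 0 else 0.0) - POWER_COST * n_tracked

random.seed(0)
state, eps, alpha, gamma = N_LINKS, 0.1, 0.1, 0.9
for _ in range(20000):
    a = (random.choice(list(ACTIONS)) if random.random() < eps
         else max(ACTIONS, key=lambda x: Q[(state, x)]))   # epsilon-greedy
    nxt, r = step(a)
    Q[(state, a)] += alpha * (r + gamma * max(Q[(nxt, b)] for b in ACTIONS)
                              - Q[(state, a)])             # Q-learning update
    state = nxt

# Learned policy: best number of links to track in each state.
print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_LINKS + 1)})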


2018 ◽  
Vol 8 (2) ◽  
pp. 2864-2868 ◽  
Author(s):  
M. Azhar ◽  
A. Shabbir

Building energy-efficient networks that still meet high data-rate requirements is a major concern for network operators, whose services must ensure consumer satisfaction. Small cells are deployed to provide high data rates with good signal quality, but they can increase energy consumption if they are not equipped with an intelligent power-saving or power-distribution mechanism. In this paper, a previously tested small-cell sleeping-mode scheme is compared with a newly proposed scheme that reduces power during low- and normal-traffic hours. The proposed scheme provided a 13-15% increase in energy efficiency, produced beneficial simulated outcomes, and can be applied to mitigate the energy consumption issue.
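A back-of-the-envelope comparison of the two ideas, cell sleeping versus reducing power in low-traffic hours, fits in a few lines; the linear power model and traffic profile below are assumptions, not the paper's simulation parameters.

# Crude daily-energy comparison of small-cell strategies (illustrative
# only; the power model and traffic profile are assumptions).
P_IDLE, P_TX = 6.8, 1.0   # W: static circuit power, max RF transmit power

def daily_energy(hours_by_load, scheme):
    """Energy (Wh) over a day, given hours spent at each load in [0, 1]."""
    total = 0.0
    for load, hours in hours_by_load.items():
        if scheme == "sleep" and load == 0.0:
            p = 0.5                       # deep-sleep floor (assumed)
        elif scheme == "scale":
            p = P_IDLE + P_TX * load      # transmit power tracks traffic load
        else:
            p = P_IDLE + P_TX             # always-on at full power
        total += p * hours
    return total

profile = {0.0: 6, 0.3: 10, 1.0: 8}       # assumed traffic hours per day
for s in ("always_on", "sleep", "scale"):
    print(s, round(daily_energy(profile, s), 1), "Wh")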


Electronics ◽  
2019 ◽  
Vol 8 (7) ◽  
pp. 778 ◽  
Author(s):  
Mudasar Latif Memon ◽  
Mukesh Kumar Maheshwari ◽  
Navrati Saxena ◽  
Abhishek Roy ◽  
Dong Ryeol Shin

5G is expected to handle high data rates for different types of wireless traffic. To enable these rates, 5G employs a beam-searching operation to align the best beam pairs, which, along with high-order modulation techniques, exhausts the battery power of the user equipment (UE). LTE networks use discontinuous reception (DRX) with fixed sleep cycles to save UE energy, but LTE-DRX in its current form cannot work in a 5G network, as it does not consider multi-beam communication and its sleep-cycle length is fixed. Artificial intelligence (AI), on the other hand, can learn and predict packet arrival times from real wireless traffic traces. In this paper, we present an AI-based DRX (AI-DRX) mechanism for energy efficiency in 5G-enabled devices. We propose the AI-DRX algorithm for multi-beam communication, enabling dynamic short and long sleep cycles in DRX. AI-DRX saves UE energy while respecting the delay requirements of different services. We train a recurrent neural network (RNN) on two real wireless traces, achieving a minimum root mean square error (RMSE) of 5 ms for trace 1 and 6 ms for trace 2, and then use the trained RNN model in the AI-DRX algorithm to set dynamic short or long sleep cycles. Compared to LTE-DRX, AI-DRX achieves 69% higher energy efficiency on trace 1 and 55% higher on trace 2. On trace 2, AI-DRX attains a 70% improvement in energy efficiency compared with a Poisson packet-arrival model with λ = 1/20.
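A minimal sketch of the AI-DRX idea follows: train a small LSTM to predict the next packet inter-arrival time, then choose a short or long sleep cycle from the prediction. The architecture, synthetic data, and 20 ms decision threshold are assumptions; the paper trains its RNN on real traffic traces.

# Sketch of the AI-DRX idea: predict the next inter-arrival gap with a
# small LSTM, then pick a sleep-cycle length. Synthetic data; the model
# and threshold are assumptions, not the paper's trained RNN.
import torch
import torch.nn as nn

class ArrivalRNN(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                   # x: (batch, seq_len, 1)
        out, _ = self.rnn(x)
        return self.head(out[:, -1, :])     # predict next inter-arrival (ms)

torch.manual_seed(0)
seqs = torch.rand(256, 20, 1) * 40          # synthetic inter-arrival history (ms)
targets = seqs.mean(dim=1)                  # stand-in target for the next gap

model, loss_fn = ArrivalRNN(), nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(seqs), targets)
    loss.backward()
    opt.step()

pred_gap = model(seqs[:1]).item()
cycle = "long" if pred_gap > 20 else "short"   # assumed 20 ms decision threshold
print(f"predicted gap {pred_gap:.1f} ms -> {cycle} DRX cycle")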


2008 ◽  
Vol 6 ◽  
pp. 325-330 ◽  
Author(s):  
M. Schämann ◽  
M. Bücker ◽  
S. Hessel ◽  
U. Langmann

Abstract. High data rates combined with high mobility represent a challenge for the design of cellular devices. Advanced algorithms are required, which result in higher complexity, more chip area, and increased power consumption; this conflicts with the limited power supply of mobile devices. This paper discusses an HSDPA receiver that has been optimized for power consumption, with a focus on the algorithmic and architectural levels. On the algorithmic level, the Rake combiner, Prefilter-Rake equalizer, and MMSE equalizer are compared with respect to their BER performance. Both equalizer approaches provide a significant increase in performance at high data rates compared to the Rake combiner, which is commonly used at lower data rates. For both equalizer approaches, several adaptive algorithms are available that differ in complexity and convergence properties. To identify the algorithm that achieves the required performance with the lowest power consumption, the algorithms have been investigated with SystemC models regarding their performance and arithmetic complexity. Additionally, for the Prefilter-Rake equalizer, power estimates for a modified Griffith (LMS) and a Levinson (RLS) algorithm have been compared using the ORINOCO tool supplied by ChipVision. The accuracy of this tool has been verified with a scalable architecture of the UMTS channel estimation, described both in SystemC and VHDL and targeting a 130 nm CMOS standard cell library. An architecture that combines all three approaches with an adaptive control unit is presented. The control unit monitors the current condition of the propagation channel and adjusts receiver parameters such as filter size and oversampling ratio to minimize power consumption while maintaining the required performance. The optimization strategies reduce the number of arithmetic operations by up to 70% for single components, which leads to an estimated power reduction of up to 40% while the BER performance is unaffected. This work uses SystemC and ORINOCO for a first estimation of power consumption at an early step of the design flow, so that algorithms can be compared in different operating modes, including the effects of control units. Here, an algorithm with higher peak complexity and power consumption but greater flexibility consumed less power in normal operating modes than the algorithm optimized for peak performance.
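For readers unfamiliar with the adaptive algorithms compared above, the sketch below implements a plain LMS equalizer (the simpler of the two adaptation families considered) on a toy multipath channel. The channel taps, filter length, and step size are assumptions; this is not the authors' SystemC model.

# Minimal LMS adaptive equalizer on a toy BPSK multipath channel
# (illustrative assumptions throughout, not the paper's receiver).
import numpy as np

rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=5000)          # BPSK training symbols
channel = np.array([0.9, 0.4, -0.2])                  # assumed multipath taps
received = np.convolve(symbols, channel)[: len(symbols)]
received += 0.05 * rng.standard_normal(len(symbols))  # additive noise

n_taps, mu = 11, 0.01                                 # filter length, LMS step
w = np.zeros(n_taps)
delay = n_taps // 2                                   # decision delay
for n in range(n_taps, len(symbols)):
    x = received[n - n_taps:n][::-1]                  # regressor (newest first)
    err = symbols[n - delay] - w @ x                  # training error
    w += mu * err * x                                 # LMS weight update

# Crude post-convergence check on the last 1000 symbols:
est = np.array([w @ received[n - n_taps:n][::-1]
                for n in range(len(symbols) - 1000, len(symbols))])
ref = symbols[len(symbols) - 1000 - delay : len(symbols) - delay]
print("symbol error rate:", np.mean(np.sign(est) != ref))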


Author(s):  
John Maynard Smith ◽  
Eörs Szathmáry

Over the history of life there have been several major changes in the way genetic information is organized and transmitted from one generation to the next. These transitions include the origin of life itself, the first eukaryotic cells, reproduction by sexual means, the appearance of multicellular plants and animals, the emergence of cooperation and of animal societies, and the unique language ability of humans. This ambitious book provides the first unified discussion of the full range of these transitions. The authors highlight the similarities between different transitions--between the union of replicating molecules to form chromosomes and of cells to form multicellular organisms, for example--and show how understanding one transition sheds light on others. They trace a common theme throughout the history of evolution: after a major transition some entities lose the ability to replicate independently, becoming able to reproduce only as part of a larger whole. The authors investigate this pattern and why selection between entities at a lower level does not disrupt selection at more complex levels. Their explanation encompasses a compelling theory of the evolution of cooperation at all levels of complexity. Engagingly written and filled with numerous illustrations, this book can be read with enjoyment by anyone with an undergraduate training in biology. It is ideal for advanced discussion groups on evolution and includes accessible discussions of a wide range of topics, from molecular biology and linguistics to insect societies.

