Performance Analysis of Uplink Scheduling Algorithms in LTE Networks

Author(s):  
Shafinaz Bt Ismail ◽  
Darmawaty Bt Mohd Ali ◽  
Norsuzila Ya’acob

Scheduling refers to the process of allocating resources to User Equipment by scheduling algorithms located at the LTE base station. Various algorithms have been proposed for this task, and the choice of scheduling algorithm remains an open issue in the Long Term Evolution (LTE) standard. This paper studies and compares the performance of three well-known uplink schedulers, namely Maximum Throughput (MT), First Maximum Expansion (FME), and Round Robin (RR). The evaluation considers a single cell with interference and three traffic flows, namely best effort, video and VoIP, in a pedestrian environment using the LTE-SIM network simulator. The performance evaluation is conducted in terms of system throughput, fairness index, delay and packet loss ratio (PLR). The simulation results show that the RR algorithm always reaches the lowest PLR and delivers the highest throughput for video and VoIP flows among all the strategies. Thus, RR is the most suitable scheduling algorithm for VoIP and video flows, while MT and FME are appropriate for best-effort flows in LTE networks.
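The two extremes compared above can be sketched as per-TTI selection metrics. This is a minimal illustration, not the paper's implementation: the UE names and rate figures are assumed values.

```python
# Sketch of the user-selection idea behind two of the compared uplink
# disciplines. UE names and rate values below are illustrative only.

def rr_pick(last_served_tti):
    """Round Robin: serve the user idle longest, i.e. the one with the
    smallest TTI index of its last grant."""
    return min(last_served_tti, key=last_served_tti.get)

def mt_pick(achievable_rate):
    """Maximum Throughput: serve the user with the best instantaneous
    achievable rate, regardless of fairness."""
    return max(achievable_rate, key=achievable_rate.get)

last_served_tti = {"ue1": 40, "ue2": 38, "ue3": 12}     # last grant per UE
achievable_rate = {"ue1": 1.2, "ue2": 2.5, "ue3": 0.4}  # Mbit/s estimates

print(rr_pick(last_served_tti))  # ue3: waited longest
print(mt_pick(achievable_rate))  # ue2: best channel
```

The contrast makes the paper's trade-off visible: RR ignores channel state (good for delay-sensitive VoIP/video), while MT ignores waiting time (good for raw best-effort throughput).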

Long Term Evolution-Advanced (LTE-A) networks were introduced in the Third Generation Partnership Project (3GPP) Release 10 specifications, with the objective of obtaining a high data rate for cell-edge users, higher spectral efficiency and high quality of service for multimedia services at cell-edge and indoor areas. A heterogeneous network (HetNet) in LTE-A is a network consisting of high-power macro nodes and low-power micro nodes with different cell coverage capabilities. As a result, non-desired signals acting as interference exist between the micro and macro nodes and their users. Interference is broadly classified as cross-tier and co-tier interference. Cross-tier interference can be reduced by controlling the base station transmit power, while co-tier interference can be reduced by proper resource allocation among the users. Scheduling is the process of optimal allocation of resources to the users. For proper resource allocation, scheduling is done at the main base station (eNodeB). Some LTE-A downlink scheduling algorithms are based on transmission channel quality feedback given by user equipment in uplink transmission. Various scheduling algorithms have been developed and evaluated using network simulators. This paper presents the performance evaluation of an adaptive hybrid LTE-A downlink scheduling algorithm. The evaluation is done in terms of parameters such as user throughput (peak, average, and edge), average user spectral efficiency and fairness index. The results of the proposed algorithm are compared with existing downlink scheduling algorithms such as Round Robin, Proportional Fair and Best Channel Quality Indicator (CQI) using a network simulator. The comparison results show the effectiveness of the proposed adaptive hybrid algorithm in improving cell-edge user throughput as well as the fairness index.


2021 ◽  
Vol 7 ◽  
pp. e546
Author(s):  
Khuram Ashfaq ◽  
Ghazanfar Ali Safdar ◽  
Masood Ur-Rehman

Background. Wireless links are fast becoming the key communication mode. However, compared to wired links, their characteristics make traffic prone to time- and location-dependent signal attenuation, noise, fading, and interference, resulting in time-varying channel capacities and link error rates. Scheduling algorithms play an important role in wireless links to guarantee quality of service (QoS) parameters such as throughput, delay, jitter, fairness and packet loss rate. The scheduler has vital importance in current as well as future cellular communications, since it assigns resource blocks (RBs) to different users for transmission. A scheduling algorithm makes decisions based on information about the link state, number of sessions, reserved rates and status of the session queues. The information required by a scheduler implemented in the base station can easily be collected from the downlink transmission. Methods. This paper reflects on the importance of schedulers for future wireless communications, taking LTE-A networks as a case study. It compares the performance of four well-known scheduling algorithms: round robin (RR), best channel quality indicator (BCQI), proportional fair (PF), and fractional frequency reuse (FFR). The performance of these four algorithms is evaluated in terms of throughput, fairness index, spectral efficiency and overall effectiveness. System-level simulations have been performed using the MATLAB-based LTE-A Vienna downlink simulator. Results. The results show that the FFR scheduler is the best performer among the four tested algorithms. It also exhibits flexibility and adaptability in radio resource assignment.
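Of the compared disciplines, proportional fair is the one defined by a simple closed-form metric: each RB goes to the user maximising instantaneous rate over smoothed past throughput. A minimal sketch, with illustrative numbers (the smoothing factor 0.02 is a common textbook choice, not taken from this study):

```python
# Proportional fair (PF) selection sketch: metric m_i = r_i / R_i, where
# r_i is the instantaneous achievable rate and R_i the exponentially
# smoothed delivered throughput. All values below are illustrative.

def pf_select(inst_rate, avg_thr):
    """Return the user with the largest PF metric r_i / R_i."""
    return max(inst_rate, key=lambda u: inst_rate[u] / avg_thr[u])

def ema_update(avg, served, beta=0.02):
    """Exponential moving average update of delivered throughput R_i."""
    return (1 - beta) * avg + beta * served

inst_rate = {"ue1": 2.0, "ue2": 1.0}  # Mbit/s this TTI
avg_thr   = {"ue1": 4.0, "ue2": 0.5}  # ue2 has been starved so far

print(pf_select(inst_rate, avg_thr))  # ue2: starvation outweighs channel
```

This is why PF typically sits between RR (pure fairness) and BCQI (pure throughput) in the fairness-index results reported here.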


2012 ◽  
Vol 3 (2) ◽  
pp. 39-57 ◽  
Author(s):  
Ioan Sorin Comsa ◽  
Mehmet Aydin ◽  
Sijing Zhang ◽  
Pierre Kuonen ◽  
Jean–Frédéric Wagen

The use of an intelligent packet scheduling process is absolutely necessary to make radio resource usage more efficient in recent high-bit-rate-demanding radio access technologies such as Long Term Evolution (LTE). The packet scheduling procedure works with various dispatching rules with different behaviors. In the literature, a scheduling discipline is applied for the entire transmission session, and the scheduler performance strongly depends on the exploited discipline. The method proposed in this paper shows how a straightforward schedule can be provided within the transmission time interval (TTI) sub-frame using a mixture of dispatching disciplines per TTI instead of a single rule adopted across the whole transmission. The goal is to maximize the system throughput while assuring the best user fairness. This requires a policy for how to mix the rules and a refinement procedure to call the best rule each time. Two scheduling policies are proposed for mixing the rules, and a Q-learning algorithm is used to refine them. Simulation results indicate that the proposed methods outperform existing scheduling techniques by maximizing system throughput without harming user fairness.
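The core idea of learning which dispatching rule to call per TTI can be sketched with a stateless tabular Q-learner. This is a deliberately simplified stand-in: the rule set, the epsilon-greedy policy and the scalar reward are assumptions for illustration, not the paper's actual state and reward design.

```python
import random

# Toy Q-learning rule selector: one Q-value per candidate dispatching
# rule, updated from a per-TTI reward (e.g. a throughput/fairness
# trade-off). Rule names and parameters are illustrative assumptions.

rules = ["PF", "MT", "EXP"]      # candidate disciplines per TTI
q = {r: 0.0 for r in rules}      # stateless Q-table for simplicity
alpha = 0.1                      # learning rate

def pick_rule(eps=0.1):
    """Epsilon-greedy: mostly exploit the best-known rule."""
    if random.random() < eps:
        return random.choice(rules)
    return max(q, key=q.get)

def learn(rule, reward):
    """Move Q(rule) toward the reward observed after applying it."""
    q[rule] += alpha * (reward - q[rule])

learn("PF", 1.0)   # PF gave a good throughput/fairness outcome
learn("MT", 0.3)   # MT gave a worse one
print(pick_rule(eps=0.0))  # PF: highest Q after these updates
```

Per TTI the scheduler would call `pick_rule()`, build that rule's metric for every user, then feed the measured outcome back through `learn()`.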


Author(s):  
Johann Max Hofmann Magalhães ◽  
Saulo Henrique da Mata ◽  
Paulo Roberto Guardieiro

The design of a scheduling algorithm for LTE networks is a complex task, and it has proven to be one of the main challenges for LTE systems. Many issues must be addressed in order to obtain high spectral efficiency and to meet the applications' QoS requirements. In this context, this chapter presents a study of the resource allocation process in LTE networks. The study starts with an overview of the main concepts involved in LTE resource allocation, and then presents two new scheduling algorithm proposals, for the downlink and the uplink, respectively. Simulations are used to compare the performance of these proposals with other schedulers widely known and explored in the literature.


2021 ◽  
Vol 8 (2) ◽  
pp. 23-34
Author(s):  
Olawale Oluwasegun Ogunrinola ◽  
Isaiah Opeyemi Olaniyi ◽  
Segun A. Afolabi ◽  
Gbemiga Abraham Olaniyi ◽  
Olushola Emmanuel Ajeigbe

Modern radio communication services transmit signals from an earth station to a high-altitude station, space station or space radio system via a feeder link, while in Global Systems for Mobile Communication (GSM) and computer networks, the radio uplink transmits from cell phones to the base station, linking the network core to the communication interface via an upstream facility. Single-Carrier Frequency Division Multiple Access (SC-FDMA) has been adopted by the 3GPP for uplink access in the Long-Term Evolution (LTE) scheme. In this paper, LTE uplink radio resource allocation is addressed as an optimization problem, where the desired solution is the mapping of schedulable UEs to schedulable Resource Blocks (RBs) that maximizes the proportional fairness metric. Particle swarm optimization (PSO) is employed for this research. PSO is very easy to implement for real-time optimization problems and has fewer parameters to adjust than other evolutionary algorithms. The proposed scheme was found to outperform First Maximum Expansion (FME) and Recursive Maximum Expansion (RME) in terms of simulation time and fairness while maintaining throughput.
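The PSO machinery the abstract relies on can be sketched compactly. This is a toy sketch on a continuous objective: in the paper each particle would instead encode a UE-to-RB mapping scored by the proportional-fairness metric; the sphere function, swarm size and inertia/acceleration constants here are illustrative assumptions.

```python
import random

# Minimal particle swarm optimization (PSO) sketch minimising f.
# Each particle tracks a position, velocity, and personal best; the
# swarm shares a global best. Parameters are common defaults, not the
# paper's tuned values.

def pso(f, dim, n=12, iters=60, w=0.7, c1=1.5, c2=1.5, seed=0):
    rnd = random.Random(seed)
    pos = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]          # personal best positions
    pval = [f(p) for p in pos]           # personal best values
    gi = min(range(n), key=lambda i: pval[i])
    gbest, gval = pbest[gi][:], pval[gi]  # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pval[i]:
                pbest[i], pval[i] = pos[i][:], v
                if v < gval:
                    gbest, gval = pos[i][:], v
    return gbest, gval

best, val = pso(lambda x: sum(c * c for c in x), dim=2)
```

For the scheduling problem, `f` would be replaced by the negated proportional-fairness score of the RB assignment a particle encodes, with a rounding or repair step to keep positions feasible.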


Long-Term Evolution (LTE) technologies are currently developing toward an advanced network content delivery infrastructure with sharing capabilities for different users and operators. An efficient open infrastructure for content delivery in Position-orthogonal multiple access (POMA) based LTE networks is therefore crucial to design and implement at this stage of LTE research. The main contribution of this paper is the design and implementation of such an open infrastructure for LTE network users, enabling content delivery and sharing capabilities to be performed in virtual iterative precoding architectures. The proposed user content delivery infrastructure provides this service and sharing to the components of the LTE architecture. A channel spectrum scheduling algorithm has been implemented and validated with test-level experiments conducted to estimate the compatibility of LTE infrastructure virtualization in the proposed open infrastructure. The proposed algorithm can reduce the operational cost for LTE mobile users.


Author(s):  
Mariyam Ouaissa ◽  
Abdallah Rhattoy

<span lang="EN-US">The introduction of Machine-to-Machine (M2M) communications in cellular networks creates a new set of challenges because of the unique service requirements and features of M2M devices. One of these challenges is the management of radio resources, especially on the uplink, because of the unfairness and poor performance that occur when allocating resources to users. Long Term Evolution (LTE) and LTE-Advanced (LTE-A) are excellent candidates for supporting M2M communications because of their native IP connectivity and scalability for a variety of devices. Therefore, LTE schedulers should be able to meet the needs of M2M devices, such as time constraints and specific Quality of Service (QoS) requirements. In this paper, these constraints are studied and analyzed, focusing on three schedulers: Round Robin (RR), First Maximum Expansion (FME) and Maximum Throughput (MT). These methods do not provide QoS to users who use different types of traffic flows. The solution proposed in this work is a hybrid model combining two schedulers, one being the best scheduling solution for real-time services and the other for non-real-time services, in order to meet QoS criteria by maximizing throughput and minimizing packet loss. Video and VoIP were selected as real-time traffic and best effort as non-real-time traffic. The simulation results show that the proposed scheduler reaches the lowest Packet Loss Rate (PLR), delivering the highest throughput and goodput among the compared strategies.</span>
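The hybrid idea of routing each flow class to the discipline that suits it can be sketched as a class-conditional metric. The specific pairing (RR-style for real-time, MT-style for best effort) follows the class split described above, but the function names, fields and numbers are illustrative assumptions, not the paper's design.

```python
# Sketch of a hybrid dispatcher: real-time flows (video, VoIP) get a
# waiting-time metric (RR-like), best-effort flows get a rate metric
# (MT-like). Flow fields and values are illustrative only.

REAL_TIME = {"video", "voip"}

def rr_metric(flow, tti):
    """RR-like: priority grows with time since the flow was last served."""
    return tti - flow["last_served"]

def mt_metric(flow, tti):
    """MT-like: priority is the instantaneous achievable rate.
    (tti is unused; kept for a uniform rule signature.)"""
    return flow["rate"]

def hybrid_metric(flow, tti):
    rule = rr_metric if flow["kind"] in REAL_TIME else mt_metric
    return rule(flow, tti)

flows = [
    {"kind": "voip", "last_served": 2, "rate": 0.1},
    {"kind": "best_effort", "last_served": 9, "rate": 3.0},
]
for f in flows:
    print(f["kind"], hybrid_metric(f, tti=10))
```

A real scheduler would normalise the two metrics before ranking mixed flows on one RB; here they are kept separate to show the routing step only.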


This paper proposes a QoS-aware radio resource management (QOC-RRM) method for future-generation Long Term Evolution (LTE) networks. In the QOC-RRM scheme, a hybrid Recurrent Deep Neural Network (RDNN) technique is presented to differentiate operators by priority based on multiple constraints, and it controls the resources allocated by base stations. A chaotic weed optimization (CWO) algorithm is proposed for sharing queuing-criterion data with other base stations for routing. Once this information is received, each BS schedules resources for priority users first. The proposed QOC-RRM scheme is implemented in the Network Simulator (NS3) tool, and its performance is better than conventional RRM schemes in terms of minimum data rate requirement, maximum number of active users and radio spectrum utilization.


Author(s):  
Yusmardiah Yusuf ◽  
Darmawaty Mohd Ali ◽  
Norsuzila Ya’acob

Scheduling is the process of allocating radio resources to User Equipment (UE) that transmits different flows at the same time. It is performed by the scheduling algorithm implemented in the Long Term Evolution base station, the Evolved Node B. Most of the proposed algorithms do not focus on handling real-time and non-real-time traffic simultaneously. Thus, a UE with bad channel quality may starve because no resources are allocated to it for quite a long time. To solve this problem, the Exponential Blind Equal Throughput (EXP-BET) algorithm is proposed. The user with the highest priority metric, calculated using the EXP-BET metric equation, is allocated resources first. This study investigates the implementation of the EXP-BET scheduling algorithm on an FPGA platform. The metric equation of EXP-BET is modelled and simulated using System Generator, and the design utilizes only 10% of the available resources on the FPGA. Fixed values are used for all inputs to the scheduler. System verification is performed through hardware co-simulation of the EXP-BET metric computation. The output of the hardware co-simulation shows that the EXP-BET metric values match those from the Simulink environment. Thus, the algorithm is ready for prototyping, and the Virtex-6 FPGA is chosen as the platform.
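A floating-point reference for an EXP-BET style metric can clarify what the FPGA datapath has to compute. The form below (an exponential head-of-line-delay term multiplied by the Blind Equal Throughput term 1/R_i) is the common shape of EXP-rule/BET combinations in the literature; the exact equation, coefficients and all numeric values here are assumptions, not the paper's fixed-point design.

```python
import math

# EXP-BET style metric sketch: exp((a_i*d_i - chi) / (1 + sqrt(chi))) / R_i
# where a_i is a QoS weight, d_i the head-of-line delay, chi the mean of
# a_j*d_j over active users, and R_i the past average throughput.
# All values are illustrative assumptions.

def exp_bet_metric(users):
    """users: list of dicts with QoS weight 'a', HOL delay 'd' (s) and
    average past throughput 'R' (bit/s). Returns one metric per user."""
    chi = sum(u["a"] * u["d"] for u in users) / len(users)
    return [math.exp((u["a"] * u["d"] - chi) / (1 + math.sqrt(chi))) / u["R"]
            for u in users]

users = [
    {"a": 6.0, "d": 0.08, "R": 2.0e5},   # long-delayed, low-throughput UE
    {"a": 6.0, "d": 0.01, "R": 1.0e6},   # recently served UE
]
m = exp_bet_metric(users)
print(m.index(max(m)))  # 0: the delayed, starved user is served first
```

Such a reference model is what a hardware co-simulation, like the one described above, would be checked against before committing to a fixed-point FPGA implementation.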

