Enhanced ABSF Algorithm with a Relay Function in LTE Heterogeneous Networks

Electronics ◽  
2020 ◽  
Vol 9 (9) ◽  
pp. 1343
Author(s):  
Lehung Nguyen ◽  
Sungoh Kwon

In this study, we enhance the almost blank subframe (ABSF) algorithm in a Long Term Evolution (LTE) heterogeneous network (HetNet) by providing a relay function. The ABSF is a technique proposed by the Third Generation Partnership Project to reduce interference in a HetNet. Although the ABSF effectively mitigates intercell interference, it has two major disadvantages. First, the ABSF algorithm alters the scheduling policy of macro base stations. Second, it degrades the capacity of users served by femto base stations. Our proposed enhanced algorithm applies a relay function to assist victim macro user equipment (VMUE) and reduces the side effects caused by the ABSF algorithm. Taking resource allocation and power control into account, the relay function assists VMUE in such a way that interference with other users is minimized. Simulation results show that the proposed algorithm improves system throughput by 18% and user satisfaction by 8% compared with the conventional ABSF.
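As a rough illustration of the idea, the sketch below pairs an ABSF muting pattern with a relay fallback for victim macro UEs; the subframe pattern, the SINR threshold for declaring a VMUE, and the nearest-relay selection rule are illustrative assumptions rather than the authors' algorithm.

```python
import math

# Illustrative sketch of ABSF muting with a relay fallback for victim macro UEs (VMUEs).
# The ABSF pattern, SINR threshold, and positions below are assumptions, not the paper's settings.

ABSF_PATTERN = [1, 0, 0, 0, 1, 0, 0, 0]   # 1 = almost blank subframe (macro data muted)
SINR_VICTIM_THRESHOLD_DB = 0.0            # below this, a macro UE is treated as a VMUE

def schedule_subframe(sf_index, macro_ues, relays):
    """Return (UEs served by the macro, (UE, relay) pairs served via relay) for one subframe."""
    blank = ABSF_PATTERN[sf_index % len(ABSF_PATTERN)] == 1
    by_macro, by_relay = [], []
    for ue in macro_ues:
        victim = ue["sinr_db"] < SINR_VICTIM_THRESHOLD_DB
        if blank and victim:
            # During an ABSF the macro stays silent; the closest relay assists the VMUE
            relay = min(relays, key=lambda r: math.dist(r["pos"], ue["pos"]))
            by_relay.append((ue["id"], relay["id"]))
        elif not blank:
            by_macro.append(ue["id"])
    return by_macro, by_relay

ues = [{"id": "ue1", "sinr_db": -3.0, "pos": (50, 20)},
       {"id": "ue2", "sinr_db": 12.0, "pos": (10, 5)}]
relays = [{"id": "rn1", "pos": (45, 25)}, {"id": "rn2", "pos": (0, 0)}]
print(schedule_subframe(0, ues, relays))   # ABSF subframe: VMUE ue1 is handed to relay rn1
print(schedule_subframe(1, ues, relays))   # normal subframe: the macro serves both UEs
```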

Sensors ◽  
2021 ◽  
Vol 22 (1) ◽  
pp. 117
Author(s):  
Haitao Wang ◽  
Xiaoyong Lyu ◽  
Kefei Liao

Passive radars based on long-term evolution (LTE) signals suffer from severe interference. The interference comes not only from the base station used as the illuminator of opportunity (BS-IoO), but also from other co-channel base stations (CCBSs) operating at the same frequency as the BS-IoO. Because the reference signals of the co-channel interferences are difficult to obtain, cancellation performance degrades seriously when traditional interference suppression methods are applied to LTE-based passive radar. This paper proposes a cascaded cancellation method based on spatial spectrum cognition of the interference. It consists of several cancellation loops. In each loop, the spatial spectrum of the strong interferences is first recognized by using the cyclostationary characteristic of the LTE signal and the compressed sensing technique. A clean reference signal for each interference is then reconstructed according to the spatial spectrum previously obtained. With this reference signal, the interferences are cancelled. At the end of each loop, the energy of the interference residual is estimated; if the residual is still strong, the cancellation loop continues, otherwise the procedure terminates. The proposed method achieves good cancellation performance with a small antenna array. Theoretical and simulation results demonstrate the effectiveness of the proposed method.
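To make the cascaded loop structure concrete, the simplified sketch below scans a conventional beamforming spectrum to locate the strongest interferer, projects it out, and repeats until the residual energy is small; the real method's cyclostationarity-based cognition and compressed-sensing reconstruction are not reproduced here, and the loop count and residual threshold are assumptions.

```python
import numpy as np

# Simplified sketch of a cascaded interference-cancellation loop over an antenna array.
# The "spatial spectrum cognition" step is reduced to a conventional beamforming scan and
# the cancellation step to an orthogonal projection, purely for illustration.

def steering_vector(n_ant, theta_rad, d=0.5):
    k = np.arange(n_ant)
    return np.exp(1j * 2 * np.pi * d * k * np.sin(theta_rad))

def cascaded_cancellation(x, n_loops=5, residual_thresh=1e-2):
    """x: (n_ant, n_snapshots) array snapshots. Returns data with strong interferers projected out."""
    n_ant = x.shape[0]
    angles = np.deg2rad(np.linspace(-90, 90, 181))
    for _ in range(n_loops):
        R = x @ x.conj().T / x.shape[1]                   # sample covariance
        spectrum = [np.real(steering_vector(n_ant, a).conj() @ R @ steering_vector(n_ant, a))
                    for a in angles]
        a_hat = steering_vector(n_ant, angles[int(np.argmax(spectrum))])[:, None]
        P = np.eye(n_ant) - a_hat @ a_hat.conj().T / (a_hat.conj().T @ a_hat)  # projection
        x = P @ x                                         # cancel the strongest interferer
        if np.mean(np.abs(x) ** 2) < residual_thresh:     # residual-energy check ends the cascade
            break
    return x
```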


2017 ◽  
Vol 2017 ◽  
pp. 1-25 ◽  
Author(s):  
Arthi Murugadass ◽  
Arulmozhivarman Pachiyappan

The densification of serving nodes is one of the potential solutions to maximize the spectral efficiency per unit area. This is impractical with conventional base stations (BSs), for which site procurement is costly. Long Term Evolution-Advanced (LTE-A) defines the idea of heterogeneous networks (HetNets), where BSs with different coverage and capacity are utilized to guarantee the quality of service (QoS) requirements of the users. To maximize the transmission quality of users in coverage holes, LTE-A also defines multihop relay (MHR) networks, where relay stations (RSs) are placed along with the BSs. Unfortunately, the placement approaches for HetNet and MHR serving nodes are not standardized. In this work, two approaches, site selection with maximum service coverage (SSMSC) and site selection with minimum placement cost (SSMPC), are proposed; they identify the required number of serving nodes, their types, and their placement locations to maximize coverage while keeping the placement cost (PC) within the total placement budget. The simulation results demonstrate that the proposed approaches are computationally less complex and offer enhanced performance in terms of aggregate PC, coverage, and power proportion compared to other conventional approaches.
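A greedy, budget-constrained placement sketch in the same spirit is shown below; the node types, costs, coverage radii, and the "users covered per unit cost" selection rule are invented for illustration and do not reproduce the SSMSC/SSMPC formulations.

```python
import math

# Hedged greedy sketch of budget-constrained serving-node placement, in the spirit of
# site selection with minimum placement cost; all node types, radii, and costs are assumptions.

NODE_TYPES = {"macro": {"cost": 100, "radius": 500},
              "pico":  {"cost": 20,  "radius": 150},
              "relay": {"cost": 10,  "radius": 80}}

def covered(site, node, users):
    return {u for u in users if math.dist(site, u) <= NODE_TYPES[node]["radius"]}

def greedy_placement(candidate_sites, users, budget):
    """candidate_sites, users: iterables of (x, y) tuples. Returns (plan, cost spent, uncovered users)."""
    plan, uncovered, spent = [], set(users), 0
    while uncovered:
        best = None
        for site in candidate_sites:
            for node, spec in NODE_TYPES.items():
                gain = len(covered(site, node, uncovered))
                if gain and spent + spec["cost"] <= budget:
                    score = gain / spec["cost"]           # users covered per unit cost
                    if best is None or score > best[0]:
                        best = (score, site, node)
        if best is None:
            break                                         # budget exhausted or no further gain
        _, site, node = best
        plan.append((site, node))
        uncovered -= covered(site, node, uncovered)
        spent += NODE_TYPES[node]["cost"]
    return plan, spent, uncovered
```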


2015 ◽  
Vol 2015 ◽  
pp. 1-10 ◽  
Author(s):  
Jenhui Chen ◽  
Ching-Yang Sheng

This paper deals with the problem of triggering the handoff procedure at an appropriate point in time to reduce the ping-pong effect in the long-term evolution advanced (LTE-A) network. We also propose a dynamic handoff threshold scheme, named adaptive measurement report period and handoff threshold (AMPHT), based on the variation of the user equipment's (UE's) reference signal received quality (RSRQ) and the UE's moving velocity. AMPHT reduces the probability of unnecessarily premature handoff decisions and also avoids handoff failures caused by decisions made too late when the UE moves at high velocity. AMPHT is achieved with two critical parameters: (1) a dynamic RSRQ threshold for handoff decisions and (2) a dynamic interval for the UE's RSRQ reporting. The performance of AMPHT is validated by comparing numerical experiments (MATLAB) with simulation results (the ns-3 LENA module). Our experiments show that AMPHT reduces the premature handoff probability by up to 34% at low moving velocities and reduces the handoff failure probability by 25% at high moving velocities. Additionally, AMPHT eliminates a large number of unnecessary handoff overheads and can be easily implemented because it uses the original control messages of 3GPP E-UTRA.
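The following sketch captures the flavor of a velocity-adaptive handoff rule: faster UEs get a more relaxed RSRQ threshold and a shorter reporting period. All constants and the linear speed mapping are assumptions, not the AMPHT parameters from the paper.

```python
# Illustrative sketch of a velocity-adaptive handoff rule in the spirit of AMPHT:
# faster UEs trigger earlier and report more often, so the handoff is neither
# premature nor too late. All constants below are assumptions.

BASE_RSRQ_THRESHOLD_DB = -14.0   # nominal serving-cell RSRQ trigger
BASE_REPORT_PERIOD_MS = 480      # nominal measurement-report interval

def ampht_parameters(velocity_kmh):
    """Return (rsrq_threshold_db, report_period_ms) adapted to UE speed."""
    speed_factor = min(velocity_kmh / 120.0, 1.0)
    threshold = BASE_RSRQ_THRESHOLD_DB + 3.0 * speed_factor             # trigger earlier when fast
    period = int(BASE_REPORT_PERIOD_MS * (1.0 - 0.75 * speed_factor))   # report more often when fast
    return threshold, max(period, 120)

def should_trigger_handoff(serving_rsrq_db, neighbor_rsrq_db, velocity_kmh, hysteresis_db=2.0):
    threshold, _ = ampht_parameters(velocity_kmh)
    return serving_rsrq_db < threshold and neighbor_rsrq_db > serving_rsrq_db + hysteresis_db

print(ampht_parameters(3))     # pedestrian UE: strict threshold, long report period
print(ampht_parameters(120))   # vehicular UE: relaxed threshold, short report period
```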


2017 ◽  
Vol 2017 ◽  
pp. 1-11 ◽  
Author(s):  
Matías Toril ◽  
Rocío Acedo-Hernández ◽  
Almudena Sánchez ◽  
Salvador Luna-Ramírez ◽  
Carlos Úbeda

In cellular networks, spectral efficiency is a key parameter when designing network infrastructure. Despite the existence of theoretical models for this parameter, experience shows that real spectral efficiency is influenced by multiple factors that vary greatly in space and time and are difficult to characterize. In this paper, an automatic method for deriving the real spectral efficiency curves of a Long Term Evolution (LTE) system on a per-cell basis is proposed. The method is based on a trace processing tool that makes the most of the detailed network performance measurements collected by base stations. The method is conceived as a centralized scheme that can be integrated into commercial network planning tools. Method assessment is carried out with a large dataset of connection traces taken from a live LTE system. Results show that spectral efficiency curves differ largely from cell to cell.
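As a rough sketch of how per-cell curves could be built from traces, the code below bins each connection record by reported SINR and averages an efficiency proxy per cell; the trace field names and the bits-per-PRB proxy are assumptions, not the tool's actual processing.

```python
import numpy as np
from collections import defaultdict

# Sketch of deriving per-cell spectral-efficiency curves from connection traces by binning
# reported SINR and averaging a measured-efficiency proxy. The trace field names
# ("cell_id", "sinr_db", "tb_bits", "prbs") are assumptions.

def spectral_efficiency_curves(traces, bin_edges_db=np.arange(-10, 31, 2)):
    """traces: iterable of dicts. Returns {cell_id: (bin_centers_db, mean_efficiency_bps_per_hz)}."""
    acc = defaultdict(lambda: [[] for _ in range(len(bin_edges_db) - 1)])
    for t in traces:
        b = np.searchsorted(bin_edges_db, t["sinr_db"]) - 1
        if 0 <= b < len(bin_edges_db) - 1 and t["prbs"] > 0:
            # efficiency proxy: transport-block bits per PRB (180 kHz x 1 ms)
            acc[t["cell_id"]][b].append(t["tb_bits"] / (t["prbs"] * 180e3 * 1e-3))
    centers = (bin_edges_db[:-1] + bin_edges_db[1:]) / 2
    return {cell: (centers, np.array([np.mean(v) if v else np.nan for v in bins]))
            for cell, bins in acc.items()}
```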


2013 ◽  
Vol 765-767 ◽  
pp. 611-614
Author(s):  
Qin Zhu ◽  
Xiao Wen Li

In the long-term evolution (LTE) system, channel equalization compensates for the channel to restore the original signal. This paper puts forward an iterative sphere decoding algorithm that combines traditional sphere decoding with an improved QR-based detection algorithm derived from the conventional QR decomposition. The combination effectively reduces system complexity. For QPSK and 16QAM, simulation results show that the improved QR iterative sphere decoding algorithm performs better at higher SNR in an AWGN channel.
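To illustrate the QR building block (not the paper's full iterative sphere decoder), the sketch below triangularizes a small MIMO channel with a QR decomposition and detects QPSK symbols by back-substitution with slicing; the channel size and noise level are arbitrary.

```python
import numpy as np

# Minimal sketch of QR-decomposition-based detection for a small MIMO system with QPSK:
# y = H s + n is triangularized by H = Q R and detected layer by layer with symbol slicing.

QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def qr_detect(H, y):
    Q, R = np.linalg.qr(H)
    z = Q.conj().T @ y
    n = H.shape[1]
    s_hat = np.zeros(n, dtype=complex)
    for i in range(n - 1, -1, -1):                      # detect from the last layer upwards
        resid = z[i] - R[i, i + 1:] @ s_hat[i + 1:]     # cancel already-detected symbols
        est = resid / R[i, i]
        s_hat[i] = QPSK[np.argmin(np.abs(QPSK - est))]  # slice to the nearest QPSK point
    return s_hat

rng = np.random.default_rng(0)
H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
s = QPSK[rng.integers(0, 4, 4)]
y = H @ s + 0.05 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))
print(np.allclose(qr_detect(H, y), s))                  # expected: True at this high SNR
```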


Device-to-Device (D2D) communication is expected to be a key technology of the forthcoming mobile communication networks because of its benefits in terms of spectral efficiency, energy efficiency, and system capacity. To mitigate frequency collisions and reduce the effects of co-channel interference between users' connections, we propose an interference-aware coordinated access control (IaCAC) mechanism for heterogeneous cellular D2D communication networks with dense deployment of user equipment (UEs). In the proposed network setting, we consider the co-existence of both macro base stations (MBSs) and small-cell base stations (SBSs). In the proposed IaCAC mechanism, MBSs and SBSs are coordinated to perform access control for their UEs, while MBSs allocate bandwidth parts dynamically to SBSs based on the interference levels measured at the SBSs. In addition, to reduce D2D-to-cellular interference, device user equipments (DUEs) can perform power control autonomously. Simulation results show that the proposed IaCAC provides higher system throughput and user throughput than the network-assisted device-decided scheme proposed in [21]. Moreover, simulation results also reveal that the proposed IaCAC significantly improves the SINR of MUEs' and SUEs' uplink connections.
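A hedged sketch of the two coordination ingredients described above is given below: a macro BS splits its bandwidth parts among small cells in inverse proportion to the interference they report, and a D2D transmitter caps its power against the measured cellular interference floor. The proportional-split rule, protection margin, and power cap are assumptions, not the IaCAC mechanism itself.

```python
# Hedged sketch of interference-aware coordination: the macro BS divides bandwidth parts
# among small cells according to the interference each small cell reports, and D2D users
# autonomously limit their transmit power. All constants and rules are assumptions.

def allocate_bandwidth_parts(total_bwps, sbs_interference_dbm):
    """Give more bandwidth parts to SBSs that measure less interference."""
    weights = {sbs: 1.0 / (10 ** (i_dbm / 10.0)) for sbs, i_dbm in sbs_interference_dbm.items()}
    total_w = sum(weights.values())
    return {sbs: max(1, round(total_bwps * w / total_w)) for sbs, w in weights.items()}

def d2d_tx_power_dbm(path_loss_db, cellular_interference_dbm,
                     max_power_dbm=23.0, protection_margin_db=10.0):
    """Autonomous D2D power control: stay protection_margin_db below the cellular interference floor."""
    cap = cellular_interference_dbm - protection_margin_db + path_loss_db
    return min(max_power_dbm, cap)

print(allocate_bandwidth_parts(8, {"sbs1": -90.0, "sbs2": -80.0}))   # quieter SBS gets more BWPs
print(d2d_tx_power_dbm(path_loss_db=95.0, cellular_interference_dbm=-100.0))
```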


Author(s):  
Tsung-Hui Chuang ◽  
Guan-Hong Chen ◽  
Meng-Hsun Tsai ◽  
Chun-Lung Lin

In the LTE-Advanced network, some femtocells are deployed within a macrocell to improve the throughput of indoor user equipments (UEs), which are referred to as femtocell UEs (FUEs). Cross-tier interference is an important issue in this deployment, as it may significantly degrade the signal quality between Macrocell Base Stations (MBSs) and Macrocell User Equipments (MUEs), especially for MUEs near a femtocell. To relieve this problem, the Third Generation Partnership Project Long Term Evolution-Advanced (3GPP LTE-Advanced) defined the cognitive radio enhanced femtocell to coordinate interference in the LTE-Advanced network. Cognitive radio femtocells have the ability to sense the radio environment and obtain radio parameters. In this paper, we investigate the performance of existing schemes based on fractional frequency reuse and propose a scheme that uses cognitive radio technology to improve the fractional frequency reuse scheme. Simulation results show that our scheme can effectively enhance the average downlink throughput of FUEs as well as the total downlink throughput in LTE-Advanced networks.
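As a toy illustration of combining sensing with fractional frequency reuse, the sketch below has the femtocell pick the sub-band with the least sensed macro power for its FUEs; the sub-band layout and the RSSI-based sensing representation are assumptions, not the proposed scheme.

```python
# Sketch of a cognitive-radio twist on fractional frequency reuse (FFR): the femtocell senses
# per-sub-band macro activity and serves its FUEs on the quietest sub-band, reducing
# cross-tier interference to nearby MUEs. Sub-band names and sensed values are illustrative.

FFR_SUBBANDS = ["center", "edge_A", "edge_B", "edge_C"]

def pick_femto_subband(sensed_rssi_dbm):
    """sensed_rssi_dbm: {subband: sensed macro power in dBm}. Returns the least-interfered sub-band."""
    return min(FFR_SUBBANDS, key=lambda sb: sensed_rssi_dbm[sb])

sensed = {"center": -60.0, "edge_A": -95.0, "edge_B": -72.0, "edge_C": -88.0}
print(pick_femto_subband(sensed))   # "edge_A": the quietest sub-band for the femtocell
```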


Author(s):  
Prof. Muhamad Angriawan

In this article, a bi-planar antenna is designed for LTE mobile base stations. The proposed antenna consists of two parts, one for the upper band and another for the lower band. It can also be realized without additional folds. The lower band uses a pair of printed dipoles with a few parasitic elements for bandwidth enhancement, while the upper band contains several folded dipoles. The microstrip line and the dipoles are etched on the same substrate, and the upper-band elements are enclosed within the lower-band elements, forming a compact structure. The bi-planar antenna achieves a bandwidth of around 2 GHz, so the designed antenna can be implemented in mobile base stations. The antenna gain is around 12 dBi, which is suitable for mobile communication base station systems.


2019 ◽  
Vol 11 (1) ◽  
pp. 19 ◽  
Author(s):  
Djorwé Témoa ◽  
Anna Förster ◽  
Kolyang ◽  
Serge Doka Yamigno

Long Term Evolution (LTE) networks, which are cellular networks, are subject to many impairments due to the nature of the transmission channel used, i.e., the air. Intercell interference is the main impairment faced by LTE networks because they use a frequency reuse-one scheme, in which the whole bandwidth is used in each cell. In this paper, we propose a fully dynamic intercell interference coordination scheme with no bandwidth partitioning for downlink LTE networks. We use a reinforcement learning approach. The proposed scheme jointly performs resource allocation and power allocation, and its purpose is to minimize intercell interference in LTE networks. The performance of the proposed scheme shows quality-of-service improvements in terms of SINR, packet loss, and delay compared with other algorithms.
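A minimal Q-learning sketch of joint resource-block and power-level selection per cell is shown below, as one possible reinforcement-learning formulation; the state encoding, action set, reward definition, and hyperparameters are assumptions and not the scheme proposed in the paper.

```python
import random
from collections import defaultdict

# Minimal Q-learning sketch for joint resource-block and power-level selection per cell,
# in the spirit of a reinforcement-learning ICIC scheme. The state (e.g., a quantized
# interference level), action set, reward (e.g., SINR gain), and hyperparameters are assumptions.

ACTIONS = [(rb, p) for rb in range(6) for p in (10, 20, 30)]   # (resource block, tx power in dBm)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

q_table = defaultdict(float)    # maps (state, action) -> estimated long-term value

def choose_action(state):
    if random.random() < EPSILON:
        return random.choice(ACTIONS)                           # explore
    return max(ACTIONS, key=lambda a: q_table[(state, a)])      # exploit the best-known action

def update(state, action, reward, next_state):
    best_next = max(q_table[(next_state, a)] for a in ACTIONS)
    q_table[(state, action)] += ALPHA * (reward + GAMMA * best_next - q_table[(state, action)])
```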

