Improving Cell Edge Performance for LTE Network Using 0.8 GHz and 2.6 GHz Frequency Bands
Date
2021-03
Publisher
KNUST
Abstract
To provide maximum Physical Downlink Shared Channel (PDSCH) capacity for cell-edge Long Term Evolution (LTE) customers, the received signal strength from the Base Station (BS) to the User Equipment (UE) must be high. However, increasing the transmit power at the BS to ensure signal availability for users at the cell edge introduces Inter-Cell Interference (ICI) into the LTE network, which severely degrades the Quality of Service (QoS). To curb this problem, researchers have adopted techniques such as the Geometric Factor Model (GFM), Heterogeneous Networks (HN), and the Power Variation System (PVS), which assign a carrier frequency to each user based on distance from the base station to boost the signal at the cell edge. However, these methods do not account for the instantaneous channel conditions of a multipath fading environment, which are the main factor that must be taken into consideration. It is therefore imperative to investigate the performance of the LTE network while modelling the impairments of the wireless channel, so that their effect on user performance can be observed; including these impairments gives a realistic picture of the network's behaviour and of the means to improve it for the optimum benefit of users. This work evaluates the performance of cell-edge users in the LTE network on the 0.8 GHz low band and the 2.6 GHz high band through a signal propagation experiment in the MATLAB/Simulink environment based on a Markov channel model. Carrier aggregation allows LTE users to access the network on multiple frequency bands, which differ in penetration loss, while providing enlarged bandwidth. The low- and high-band frequencies used in this work help the cell-edge user achieve coverage and throughput simultaneously in the LTE network. The simulated signal-to-noise ratio on the two bands yields performance metrics sufficient for customers to experience good internet service. The average bit error rate (BER) for users on 0.8 GHz was 5.50e-5, while that for users on 2.6 GHz was 1.98e-4. Cell-edge users on the 0.8 GHz carrier experienced an average throughput of 92%, while those on 2.6 GHz experienced an average of 70%. These results provide a more realistic and reliable approach to mitigating cell-edge challenges for LTE users than earlier methods such as GFM, HN, and PVS.
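
As a rough illustration of the band comparison described above, the MATLAB sketch below simulates QPSK transmission through a two-state (Gilbert-Elliott) Markov channel and compares the resulting BER on 0.8 GHz and 2.6 GHz. The cell-edge distance, transmit power, noise floor, fade depth, and transition probabilities are illustrative assumptions, not the thesis's calibrated Simulink parameters, and path loss follows the free-space model rather than the thesis's propagation setup.

% Minimal sketch: two-state (Gilbert-Elliott) Markov channel, QPSK, two bands.
% All link-budget numbers below are illustrative assumptions.
rng(1);                           % reproducible noise and state sequence
fc      = [0.8e9, 2.6e9];         % carrier frequencies under comparison (Hz)
d       = 2000;                   % assumed cell-edge distance (m)
Ptx_dBm = 43;  N0_dBm = -95;      % assumed BS transmit power and noise floor
fade_dB = 32;                     % assumed extra loss in the "bad" state
pGB = 0.01;  pBG = 0.10;          % assumed good->bad and bad->good probabilities
nBits = 2e6;                      % bits per band (even, for QPSK)

bits = randi([0 1], nBits, 1);
sym  = ((1 - 2*bits(1:2:end)) + 1i*(1 - 2*bits(2:2:end))) / sqrt(2);  % unit-energy QPSK
nSym = numel(sym);

% Generate one Markov state sequence (1 = good, 0 = bad),
% reused for both bands so the comparison is fair.
state = ones(nSym, 1);
for n = 2:nSym
    if state(n-1) == 1
        state(n) = rand > pGB;    % stay in the good state w.p. 1 - pGB
    else
        state(n) = rand < pBG;    % recover from the bad state w.p. pBG
    end
end

for k = 1:numel(fc)
    fspl   = 20*log10(4*pi*d*fc(k)/3e8);                   % free-space path loss (dB)
    snr_dB = Ptx_dBm - fspl - N0_dBm - fade_dB*(1 - state); % per-symbol SNR (dB)
    sigma  = sqrt(1 ./ (2*10.^(snr_dB/10)));               % noise std per dimension
    r      = sym + sigma .* (randn(nSym,1) + 1i*randn(nSym,1));
    bhat   = zeros(nBits, 1);
    bhat(1:2:end) = real(r) < 0;                           % hard-decision demapping
    bhat(2:2:end) = imag(r) < 0;
    fprintf('%.1f GHz: mean SNR %.1f dB, BER = %.2e\n', ...
            fc(k)/1e9, mean(snr_dB), mean(bhat ~= bits));
end

Running this sketch reproduces the qualitative trend reported in the abstract (the 0.8 GHz band sees a markedly lower BER than the 2.6 GHz band because of its smaller path loss), though the absolute numbers depend entirely on the assumed link budget and Markov parameters.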