Optimized Downlink Scheduling over Long-Term Evolution Network

Long-Term Evolution (LTE) technology is widely used for wireless broadband communication on mobile devices. It offers flexible bandwidth and carrier frequencies with high speeds and peak data rates. Optimizing resource allocation is vital for improving the performance of the LTE system and meeting users' quality of service (QoS) needs. Poor resource distribution during video streaming degrades LTE network performance, reducing network fairness, increasing delay, and lowering data throughput.

Keywords: Long-Term Evolution; downlink scheduling; artificial neural network; artificial intelligence; machine learning

1. Introduction

The 3rd Generation Partnership Project (3GPP) developed the LTE standard to meet the increasing demand for wireless networks. Radio resource management scheduling algorithms in LTE assign radio resources to end users according to various quality of service (QoS) criteria. However, the design of downlink scheduling algorithms emerged as a major problem in LTE resource allocation. Different scheduling techniques were proposed to address this issue, and downlink algorithms attracted considerable research interest during LTE deployment; many researchers shifted their focus to packet scheduling over LTE, regarding it as a rapidly growing technology that could significantly shape the future of wireless networks. In LTE downlink scheduling based on QoS class identifiers, the radio resource allocation procedure uses QoS specifications and channel condition reports to determine the order in which users are served. Inefficient resource allocation, however, degrades network performance: it reduces data throughput and the network fairness index and increases the average delay.
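As a minimal sketch of how a channel-aware downlink scheduler uses reported channel conditions to order users, consider the classic proportional-fair rule below. This is not code from any of the surveyed papers, and all rates and parameters are hypothetical:

```python
# Minimal proportional-fair downlink scheduling sketch (hypothetical values).
# Each transmission interval, every resource block (RB) goes to the user with
# the highest ratio of instantaneous achievable rate (from CQI reports) to
# long-term average throughput.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_rbs, n_ttis = 4, 6, 200
avg_rate = np.full(n_users, 1e-3)   # long-term throughput per user (Mbit/s)
alpha = 0.05                        # moving-average factor

for _ in range(n_ttis):
    # Hypothetical per-user, per-RB achievable rates derived from CQI reports.
    inst_rate = rng.uniform(0.1, 2.0, size=(n_users, n_rbs))
    served = np.zeros(n_users)
    for rb in range(n_rbs):
        metric = inst_rate[:, rb] / avg_rate     # proportional-fair metric
        winner = int(np.argmax(metric))
        served[winner] += inst_rate[winner, rb]
    avg_rate = (1 - alpha) * avg_rate + alpha * served

print("long-term user rates:", np.round(avg_rate, 2))
```

The ANN-based schedulers surveyed below effectively learn such ranking rules from data rather than hard-coding them.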
A better communication system can be developed by integrating artificial intelligence (AI) concepts and machine learning (ML) techniques into the scheduling of end-user devices and the design of wireless architectures. AI was adopted in wireless communication systems because of the massive growth in wireless traffic and the emergence of new wireless services, ranging from general multimedia to video-based applications. The central challenge in the evolution of wireless networks over the past decade has been the growing need for networks that deliver communication services with higher reliability, lower latency, and higher data rates [1].

2. Using Artificial Neural Network Models to Enhance the Performance of the Downlink Communication System in LTE Networks

Many researchers have used artificial neural network (ANN) models to enhance the performance of the downlink communication system in LTE networks. Different models have been used: some studies considered channel estimation, while others investigated user device conditions or mobile location, and still others conducted predictive analyses of various ANN models for predicting and classifying data over the LTE network. Several studies have reported significant improvements in performance metrics such as throughput, fairness, and quality of service when using ANNs for downlink scheduling. For example, the study in [2] showed that an ANN-based downlink scheduling algorithm achieved up to 30% higher throughput than a conventional rule-based algorithm.
Charrada and Samet [3] proposed an accurate channel estimation technique using ANN and support vector machine regression (SVR) models for the standardized signal structure of the LTE downlink system. The technique was evaluated under impulsive, non-linear noise interfering with the reference signals and under high-mobility conditions. Simulation results showed that the SVR estimator outperformed the decision feedback (DF), least squares (LS), and ANN algorithms. Another study [4] presented a method that exploits the reference symbol information to estimate the total frequency response of the channel in two steps: first, channel variations were tracked by ANN-based learning methods trained with a genetic algorithm (ANN-GA); second, the channel matrix was estimated to improve LTE network performance. The authors validated the proposed algorithm against various ANN-based estimators, including the feed-forward neural network, the layered recurrent neural network, the least squares (LS) algorithm, and the cascade-forward neural network, for closed-loop spatial multiplexing (CLSM) single-user multiple-input multiple-output (MIMO) transmission. The comparison indicated that the proposed ANN-GA algorithm was more accurate than the alternatives.
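The authors' code is not available, so the following sketch only illustrates the general idea behind SVR-based channel estimation as in [3]: the channel frequency response is observed noisily at pilot subcarriers, and SVR interpolates it across all subcarriers, with real and imaginary parts regressed separately. The channel model, pilot spacing, and SVR hyperparameters are all assumptions:

```python
# SVR-based channel estimation sketch on synthetic data (in the spirit of [3],
# not the authors' code). The channel is known at pilot subcarriers; SVR
# interpolates it over the full band, one regressor per complex component.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n_sub = 128                                   # subcarriers in one OFDM symbol
freq = np.arange(n_sub)
# Synthetic frequency-selective channel: a few complex multipath taps.
taps = rng.standard_normal(4) + 1j * rng.standard_normal(4)
delays = np.array([0, 3, 7, 12])
H = (taps[None, :] * np.exp(-2j * np.pi * freq[:, None] * delays[None, :]
                            / n_sub)).sum(axis=1)

pilots = freq[::8]                            # pilot positions (every 8th)
noisy = H[pilots] + 0.05 * (rng.standard_normal(pilots.size)
                            + 1j * rng.standard_normal(pilots.size))

X = pilots.reshape(-1, 1).astype(float)
svr_re = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, noisy.real)
svr_im = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, noisy.imag)

X_all = freq.reshape(-1, 1).astype(float)
H_hat = svr_re.predict(X_all) + 1j * svr_im.predict(X_all)
print(f"channel estimation MSE: {np.mean(np.abs(H - H_hat) ** 2):.4f}")
```

Splitting the complex response into two real regressions keeps the estimator within standard scikit-learn tooling; [3] additionally addresses impulsive noise and high mobility, which this toy setup ignores.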
Furthermore, the significant increase in network subscribers led to resource allocation issues. To resolve this problem, one study proposed a downlink algorithm that provides an effective and faster resource allocation solution for real-time video applications [5]. The solution used an ANN that performs resource allocation based on the UE conditions. The authors noted that AI techniques for resource allocation in the LTE network produced accurate results, but the ANN training process could take a long time; hence, dynamic resource allocation can be achieved by rerunning the ANN training daily whenever the eNodeB load is intense.
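Since [5] does not publish its network, the following is only a schematic of the idea: an MLP (a stand-in for the paper's ANN) scores users from UE-condition features, and each resource block is granted to the highest-scoring user. The feature set and the hand-crafted training target, which stands in for historical scheduler decisions, are assumptions:

```python
# Schematic ANN-based resource allocation in the spirit of [5] (synthetic
# data, hypothetical features): an MLP learns a priority score per user.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
# Features per user: [CQI, queue length (kB), head-of-line delay (ms)].
X = rng.uniform([1, 0, 0], [15, 500, 100], size=(5000, 3))
# Hand-crafted priority rule used as the training target (favors good
# channels and urgent, backlogged traffic).
y = 0.5 * X[:, 0] / 15 + 0.2 * X[:, 1] / 500 + 0.3 * X[:, 2] / 100

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                     random_state=0).fit(X, y)

# Allocation step: score the currently active users, schedule the best one.
users = rng.uniform([1, 0, 0], [15, 500, 100], size=(4, 3))
scores = model.predict(users)
print("scores:", np.round(scores, 3), "-> schedule user", int(np.argmax(scores)))
```

Once trained offline (the slow step the authors flag), the forward pass used at scheduling time is cheap, which is what makes the approach attractive for real-time video traffic.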
Many researchers have applied ANN models to predict and classify data over the LTE network. In an earlier study [6], the researchers investigated the performance of two ANN training algorithms for prediction (Levenberg-Marquardt and Bayesian regularization), focusing on integrating an ANN into the LTE network during the mobile handover start-up phase; they compared the received signal strength (RSS) and hysteresis fringe parameters for an adaptive neural hysteresis fringe reduction algorithm aimed at the channel estimation problem in LTE networks. In [7], the researchers assessed the adaptive learning and predictive ability of three ANN models, namely the radial basis function neural network (RBFNN), the generalized regression neural network (GRNN), and the multilayer perceptron neural network (MLPNN), using spatial radio signal datasets from commercial LTE cellular networks. They verified the efficiency and accuracy of the adaptive prediction system over attenuating and oscillating landscapes to determine the radio signal strength propagated in an LTE urban micro-cell topography. Their results indicated that the ANN prediction techniques could adapt to measurement errors in the attenuation of LTE radio signals, and a performance comparison showed that all of the models could predict the transmitted LTE radio signals, albeit with varying degrees of error. A recent study by Ojo et al. [8] aimed to overcome the limitations of existing (empirical and deterministic) models by implementing ML-based algorithms to predict path loss in LTE networks. They developed RBFNN and MLPNN models with measured data as the input variable and compared the output to the measured path loss, finding that the RBFNN was more accurate, with lower root mean squared errors (RMSEs) than the MLPNN. An ANN-aided scheduling scheme for mobile UEs in dynamic LTE heterogeneous networks (HetNets) was proposed in another study [9], which developed a faster enhanced inter-cell interference coordination (eICIC) reconfiguration technique that performs within a marginal gap of the centralized solution. Using historical data, the technique trains an RBFNN to learn the relationship between the surrounding environment (channel and UE deployment), the optimal cell range extension, and the almost blank subframe pattern. Simulations of the throughput and utility of the proposed algorithm showed that optimal resource assignment allowed it to track rapidly varying HetNets with little performance degradation. In [10], the researchers studied a probabilistic GRNN model for modelling and estimating spatial signal power loss. The model was trained on power loss measurements collected at three different outdoor propagation locations on a commercial LTE network interface, and it produced better results than conventional least squares regression modelling.
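The measured drive-test data behind [8] are not public; the sketch below reproduces only the shape of the comparison, fitting a tiny hand-rolled RBF network and a scikit-learn MLP to synthetic log-distance path-loss data and reporting RMSE for each. The propagation constants and shadowing level are assumptions:

```python
# RBFNN vs. MLPNN path-loss prediction sketch in the spirit of [8], on
# synthetic log-distance data with log-normal shadowing (not the paper's
# measured LTE data; all constants are assumptions).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
d = rng.uniform(50, 2000, size=400)                        # Tx-Rx distance (m)
path_loss = 34.0 + 35.0 * np.log10(d) + rng.normal(0, 6, d.size)  # dB
X = np.log10(d).reshape(-1, 1)                             # log-distance feature
Xtr, Xte, ytr, yte = X[:300], X[300:], path_loss[:300], path_loss[300:]

def rbf_fit_predict(Xtr, ytr, Xte, n_centers=15, width=0.3):
    """Tiny RBF network: fixed Gaussian centres, linear output weights."""
    centers = np.linspace(Xtr.min(), Xtr.max(), n_centers).reshape(1, -1)
    design = lambda X: np.exp(-((X - centers) ** 2) / (2 * width ** 2))
    w, *_ = np.linalg.lstsq(design(Xtr), ytr, rcond=None)
    return design(Xte) @ w

rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
y_rbf = rbf_fit_predict(Xtr, ytr, Xte)
y_mlp = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                     random_state=0).fit(Xtr, ytr).predict(Xte)
print(f"RBFNN RMSE: {rmse(yte, y_rbf):.2f} dB  "
      f"MLPNN RMSE: {rmse(yte, y_mlp):.2f} dB")
```

Because the RBF network's output weights are obtained by a single linear least-squares solve, it trains much faster than the iteratively optimized MLP, one practical reason the RBFNN family recurs throughout these studies.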
For classification, some researchers proposed an ANN-based system with high accuracy and performance for fingerprint images [11]. In [12], the researchers used this technique to classify bacteria, and the results indicated that the ANN was effective and feasible. In [13], the researchers classified aerial photographs using an ANN, noting that it was suitable for classifying remotely sensed data, exceeded maximum likelihood classification in accuracy, and showed a positive effect. In [14], the researchers used an ANN to classify spoken letters, achieving 100% (training) and 93% (testing) classification accuracy. All of these studies indicate that the ANN can be used for classification owing to its strong performance. To improve machine-type communication (MTC) security, some researchers [15] studied the lack of authentication requirements and attack detection for LTE-based MTC devices and introduced an improved NN model for detecting attacks. The results demonstrated the model's efficiency in detecting attacks and improving the system's security.
ANN models can be used to optimize resource allocation algorithms for communication networks, and ANN technology can overcome the resource allocation problems noted in the LTE network. Common ANN activation functions include the sigmoidal, binary, and hyperbolic tangent functions, applied in RBFNN, MLPNN, recurrent neural network, and perceptron models; all of these have been used in developing network communication. Backpropagation and gradient descent are typically used as training algorithms for the ANN. ANNs serve different tasks, such as function approximation, pattern classification, prediction, and clustering; however, algorithm performance is significantly affected by data preparation and the chosen NN structure. ANN and mathematical models are also used to evaluate and validate experimental data [16][17]. ANN techniques are used in the GRNN-RBFNN model and other proposed methods to optimize downlink scheduling over LTE networks; these models use historical data to learn and predict optimal scheduling policies for different users.
However, some challenges still need to be addressed in LTE network scheduling. One is ensuring fairness among users with different QoS requirements: the scheduling algorithms used in these models may not always provide equal QoS to all users, leading to a lower fairness index for some, so further research is needed on scheduling algorithms that ensure fairness among users with varying QoS requirements. Another challenge is the dynamic nature of the LTE network, where fluctuations in network conditions and user demands make it difficult to predict optimal scheduling policies accurately; adaptive scheduling algorithms are therefore needed that adjust to changing channel conditions and user demands in real time based on the data observed over the network. The proposed methods have shown promise for optimizing downlink scheduling on LTE networks, but the problems of fairness and changing network conditions still require study.
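The fairness index discussed above is commonly quantified with Jain's index, which equals 1.0 for a perfectly even throughput allocation and approaches 1/n as a single user dominates. A minimal computation, with hypothetical user throughputs:

```python
# Jain's fairness index over user throughputs: (sum x)^2 / (n * sum x^2).
# Equals 1.0 when all users get the same rate, 1/n when one user gets all.
import numpy as np

def jain_index(throughputs):
    x = np.asarray(throughputs, dtype=float)
    return float(x.sum() ** 2 / (x.size * (x ** 2).sum()))

print(jain_index([5.0, 5.1, 4.9, 5.0]))   # close to 1.0: fair
print(jain_index([9.0, 0.5, 0.3, 0.2]))   # well below 1.0: unfair
```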
The approach of using ANNs for downlink scheduling in LTE networks is thus well suited to the critical problem of optimizing the allocation of radio resources to users: ANNs are powerful machine-learning models that can learn from historical data to identify patterns and relationships.

References

  1. Chen, M.; Challita, U.; Saad, W.; Yin, C.; Debbah, M. Artificial neural networks-based machine learning for wireless networks: A tutorial. IEEE Commun. Surv. Tutor. 2019, 21, 3039–3071.
  2. Ghalut, T.; Larijani, H. Content-aware and QoE optimization of video stream scheduling over LTE networks using genetic algorithms and random neural networks. J. Ubiquitous Syst. Pervasive Netw. 2018, 9, 21–33.
  3. Charrada, A.; Samet, A. Support vector machine regression and artificial neural network for channel estimation of LTE downlink in high-mobility environments. Trans. Mach. Learn. Artif. Intell. 2016, 4, 36.
  4. Reshamwala, N.S.; Suratia, P.S.; Shah, S.K. Artificial neural network trained by genetic algorithm for smart MIMO channel estimation for downlink LTE-advance system. Int. J. Comput. Netw. Inf. Secur. 2014, 6, 10–19.
  5. Yigit, T.; Ersoy, M. Resource allocation using ANN in LTE. In Proceedings of the International Conference of Numerical Analysis and Applied Mathematics (ICNAAM 2016), Rhodes, Greece, 19–25 September 2016; p. 250005.
  6. Ekong, E.; Adewale, A.; Ben-Obaje, A.; Alalade, A.; Ndujiuba, C. Performance comparison of ANN training algorithms for hysteresis determination in LTE networks. J. Phys. Conf. Ser. 2019, 1378, 42094.
  7. Isabona, J.; Osaigbovo, A.I. Investigating predictive capabilities of RBFNN, MLPNN and GRNN models for LTE cellular network radio signal power datasets. FUOYE J. Eng. Technol. 2019, 4, 155–159.
  8. Ojo, S.; Imoize, A.; Alienyi, D. Radial basis function neural network path loss prediction model for LTE networks in multi-transmitter signal propagation environments. Int. J. Commun. Syst. 2021, 34, e4680.
  9. Li, H.; Liang, Z.; Ascheid, G. Artificial neural network aided dynamic scheduling for eICIC in LTE HetNets. In Proceedings of the 2016 IEEE 17th International Workshop on Signal Processing Advances in Wireless Communications (SPAWC), Edinburgh, UK, 3–6 July 2016; pp. 1–5.
  10. Obahiagbon, K.; Isabona, J. Generalized regression neural network: An alternative approach for reliable prognostic analysis of spatial signal power loss in cellular broadband networks. Int. J. Adv. Res. Phys. Sci. 2018, 5, 35–42.
  11. Naim, N.F.; Yassin, A.I.M.; Zakaria, N.B.; Wahab, N.A. Classification of thumbprint using artificial neural network (ANN). In Proceedings of the 2011 IEEE International Conference on System Engineering and Technology, Shah Alam, Malaysia, 27–28 June 2011; pp. 231–234.
  12. Fei, Y.; Huyin, Z.; Shiming, T. Semi-supervised classification via full-graph attention neural networks. Neurocomputing 2022, 476, 63–74.
  13. Olugboja, A.; Zenghui, W. Intelligent Waste Classification System Using Deep Learning Convolutional Neural Network. Procedia Manuf. 2019, 35, 607–612.
  14. Daud, M.S.; Yassin, I.M.; Zabidi, A.; Johari, M.; Salleh, M. Investigation of MFCC feature representation for classification of spoken letters using multi-layer perceptrons (MLP). In Proceedings of the 2011 IEEE International Conference on Computer Applications and Industrial Electronics (ICCAIE), Penang, Malaysia, 4–7 December 2011; pp. 16–20.
  15. Jyothi, K.K.; Chaudhari, S. Optimized neural network model for attack detection in LTE network. Comput. Electr. Eng. 2020, 88, 106879.
  16. Yousif, J.H.; Kazem, H.A.; Al-Balushi, H.; Abuhmaidan, K.; Al-Badi, R. Artificial neural network modelling and experimental evaluation of dust and thermal energy impact on monocrystalline and polycrystalline photovoltaic modules. Energies 2022, 15, 4138.
  17. Yousif, J.H.; Abdalgader, K. Experimental and mathematical models for real-time monitoring and auto watering using IoT architecture. Computers 2022, 11, 7.