The 3rd Generation Partnership Project (3GPP) introduced Long Term Evolution (LTE) to meet the growing demand for wireless network capacity. In LTE, radio resource management scheduling algorithms assign radio resources to end users according to different quality of service (QoS) criteria. However, the design of downlink scheduling algorithms was identified as a major problem in LTE resource allocation. Different scheduling techniques were proposed to address this issue, and the investigation of downlink algorithms attracted considerable research interest during the LTE roll-out, as many researchers turned to packet scheduling over LTE, regarding it as a rapidly growing technology that could significantly shape the future of wireless networks. In the LTE downlink algorithms based on QoS class identifiers, the radio resource allocation steps use the QoS specifications and channel condition reports to determine the users' transmission order. Inefficient resource allocation in LTE networks, however, leads to poor network performance: it reduces the data throughput and the network fairness index and increases the average delay.
2. Using Artificial Neural Network Models to Enhance the Performance of the Downlink Communication System in LTE Networks
Many researchers have used artificial neural network (ANN) models to enhance the performance of the downlink communication system in LTE networks. Different models were used: some studies considered channel estimation, while others investigated user equipment (UE) conditions or the mobile location. Other studies analyzed various ANN models for predicting and classifying data over the LTE network. Several studies have reported significant improvements in performance metrics such as throughput, fairness, and quality of service when using ANNs for downlink scheduling. For example, the study in [5] showed that an ANN-based downlink scheduling algorithm achieved up to 30% higher throughput than a conventional rule-based algorithm.
Charrada [6] proposed an accurate channel estimation technique using ANN and support vector machine regression (SVR) models for the standardized reference-signal structure of the LTE downlink system. The technique was designed for impulsive, non-linear noise interfering with the reference signals under high-mobility conditions. Using simulation results, he compared the SVR and ANN performances; the SVR estimator outperformed the decision feedback (DF), least squares (LS), and ANN algorithms.
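To make the idea concrete, the following is a minimal sketch of SVR-based channel estimation on a synthetic LTE-like subcarrier grid: noisy least-squares estimates available only at pilot positions are smoothed and interpolated by an SVR and compared against plain linear interpolation. The subcarrier count, pilot spacing, noise level, and SVR hyperparameters are illustrative assumptions, not the configuration used in the cited work.

```python
# Minimal sketch: SVR-based LTE downlink channel estimation from pilot subcarriers.
# Parameter values (subcarrier count, pilot spacing, noise level) are illustrative only.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n_sc = 72                          # number of subcarriers considered (illustrative)
k = np.arange(n_sc)

# Synthetic frequency-selective channel: a few multipath taps seen in the frequency domain.
delays = np.array([0, 3, 7])
gains = np.array([1.0, 0.5, 0.3]) * np.exp(1j * rng.uniform(0, 2 * np.pi, 3))
h_true = (gains[None, :] * np.exp(-2j * np.pi * np.outer(k, delays) / n_sc)).sum(axis=1)

# Noisy least-squares estimates are available only at pilot positions (every 6th subcarrier).
pilots = k[::6]
h_ls = h_true[pilots] + 0.15 * (rng.standard_normal(pilots.size)
                                + 1j * rng.standard_normal(pilots.size))

# Fit one SVR per real/imaginary component, then predict the full channel response.
X_p, X_all = pilots.reshape(-1, 1).astype(float), k.reshape(-1, 1).astype(float)
svr_re = SVR(kernel="rbf", C=10.0, gamma=0.01, epsilon=0.01).fit(X_p, h_ls.real)
svr_im = SVR(kernel="rbf", C=10.0, gamma=0.01, epsilon=0.01).fit(X_p, h_ls.imag)
h_svr = svr_re.predict(X_all) + 1j * svr_im.predict(X_all)

# Baseline: linear interpolation of the LS pilot estimates.
h_lin = np.interp(k, pilots, h_ls.real) + 1j * np.interp(k, pilots, h_ls.imag)

mse = lambda h: np.mean(np.abs(h - h_true) ** 2)
print(f"SVR MSE: {mse(h_svr):.4f}  |  LS + linear interpolation MSE: {mse(h_lin):.4f}")
```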
Another study [7] presented a method relying on the reference symbol information, which was used to estimate the total frequency response of the channel. The technique comprised two steps: first, the channel variations were tracked by an ANN-based learning method trained with a genetic algorithm (ANN-GA); second, the channel matrix was estimated to improve the performance of LTE networks. The authors validated the proposed algorithm against various ANN-based estimators, such as the feed-forward neural network, the layered recurrent neural network, the least squares (LS) algorithm, and the cascade-forward neural network, for closed-loop spatial multiplexing (CLSM) single-user multiple-input multiple-output transmission. The results of this comparison indicated that the proposed ANN-GA algorithm was more accurate than the others.
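The general ANN-GA idea of optimizing a small feed-forward network's weights with a genetic algorithm can be sketched as follows, here on a simple channel-response regression task. The network architecture, GA settings, and synthetic pilot data are assumptions made for illustration and do not reproduce the cited algorithm.

```python
# Minimal sketch: training a small feed-forward network's weights with a genetic
# algorithm (GA) on a channel-response regression task. Architecture, GA settings,
# and the synthetic data are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic task: learn channel magnitude vs. normalized subcarrier index from noisy pilots.
x = np.linspace(0.0, 1.0, 12).reshape(-1, 1)                 # pilot positions
y = np.abs(np.sin(4 * np.pi * x) + 0.3) + 0.05 * rng.standard_normal(x.shape)

N_HID = 8                                                    # hidden units (tanh)
N_W = 1 * N_HID + N_HID + N_HID * 1 + 1                      # total weights + biases

def forward(w, x):
    """Evaluate the 1-8-1 tanh network for a flat weight vector w."""
    W1 = w[:N_HID].reshape(1, N_HID)
    b1 = w[N_HID:2 * N_HID]
    W2 = w[2 * N_HID:3 * N_HID].reshape(N_HID, 1)
    b2 = w[3 * N_HID]
    return np.tanh(x @ W1 + b1) @ W2 + b2

def fitness(w):
    """Higher is better: negative mean squared error on the pilot data."""
    return -np.mean((forward(w, x) - y) ** 2)

pop = rng.standard_normal((40, N_W))                         # initial population
for gen in range(200):
    scores = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(scores)[-10:]]                    # keep the 10 best
    # Offspring: uniform crossover between random elite parents, plus Gaussian mutation.
    pa = elite[rng.integers(0, 10, size=30)]
    pb = elite[rng.integers(0, 10, size=30)]
    mask = rng.random((30, N_W)) < 0.5
    children = np.where(mask, pa, pb) + 0.1 * rng.standard_normal((30, N_W))
    pop = np.vstack([elite, children])

best = pop[np.argmax([fitness(w) for w in pop])]
print("Final training MSE:", -fitness(best))
```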
Furthermore, the significant increase in network subscribers led to resource allocation issues. To resolve this problem, researchers proposed a downlink algorithm that provided an effective and faster resource allocation solution for real-time video applications [8]. The solution used an ANN that allocated resources according to the UE conditions. They noted that the AI techniques used for resource allocation in the LTE network produced accurate results, but that the ANN training process could take a long time. Hence, dynamic resource allocation can be achieved by running the ANN training process daily whenever the eNodeB load becomes intense.
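As a rough illustration of this kind of condition-aware allocation, the sketch below trains an ANN to map per-UE condition reports to a scheduling priority. The chosen features (CQI, head-of-line delay, buffer size) and the teacher rule used to generate training targets are hypothetical, not taken from the cited study.

```python
# Minimal sketch: an ANN mapping UE conditions to a downlink scheduling priority.
# The feature set and the teacher rule used for training targets are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

# Synthetic per-UE condition reports: CQI (1-15), head-of-line delay (ms), buffer (kB).
n = 2000
X = np.column_stack([rng.integers(1, 16, n),
                     rng.uniform(0, 100, n),
                     rng.uniform(0, 500, n)]).astype(float)

# Teacher rule (assumed): priority grows with CQI and delay, saturates with buffer size.
y = 0.5 * X[:, 0] / 15 + 0.4 * X[:, 1] / 100 + 0.1 * np.tanh(X[:, 2] / 100)

ann = MLPRegressor(hidden_layer_sizes=(16, 8), activation="tanh",
                   max_iter=2000, random_state=0).fit(X, y)

# At each TTI, the eNodeB would score every active UE and serve the highest-priority one.
ue_reports = np.array([[12, 5.0, 40.0],     # good channel, short delay
                       [6, 80.0, 300.0],    # poor channel, long delay
                       [15, 2.0, 10.0]])
priorities = ann.predict(ue_reports)
print("Predicted priorities:", np.round(priorities, 3))
print("UE scheduled first:", int(np.argmax(priorities)))
```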
Many researchers have also applied ANN models to predict and classify data over the LTE network. In an earlier study [9], the researchers investigated the prediction performance of ANN models trained with two algorithms (i.e., Levenberg-Marquardt and Bayesian regularization). They focused primarily on integrating an ANN into the LTE network during the mobile handover start-up phase, comparing the received signal strength (RSS) and the hysteresis margin parameters of an adaptive neural hysteresis margin reduction algorithm. The study aimed to resolve the channel estimation problem noted in LTE networks.
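The role of the hysteresis margin in such handover schemes can be illustrated with a small sketch: a handover is triggered only when the target cell's RSS exceeds the serving cell's RSS by at least the margin, which suppresses ping-pong handovers. The margin and RSS values below are illustrative and unrelated to the cited algorithm.

```python
# Minimal sketch: how a hysteresis margin on received signal strength (RSS)
# suppresses ping-pong handovers. Margin and RSS values are illustrative only.
def handover_decision(rss_serving_dbm, rss_target_dbm, hysteresis_db=3.0):
    """Trigger a handover only if the target cell exceeds the serving cell
    by at least the hysteresis margin (an A3-style entering condition)."""
    return rss_target_dbm > rss_serving_dbm + hysteresis_db

# Example: the target fluctuates around the serving level; without hysteresis
# (margin = 0 dB) the UE bounces between cells, with a 3 dB margin it does not.
samples = [(-95.0, -94.0), (-95.0, -96.0), (-95.0, -93.5), (-95.0, -91.0)]
for serving, target in samples:
    print(serving, target,
          "no hysteresis:", handover_decision(serving, target, 0.0),
          "3 dB hysteresis:", handover_decision(serving, target, 3.0))
```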
In [10], the researchers examined the adaptive learning and predictive ability of three ANN models, i.e., the RBFNN, GRNN, and MLPNN, using a spatial radio signal dataset derived from commercial LTE cellular networks. In this way, they could verify the efficiency and accuracy of the adaptive prediction system across attenuation and oscillation landscapes when estimating the radio signal strength propagated in an LTE urban micro-cell topography. Their results indicated that the ANN prediction techniques could adapt to the measurement errors in the attenuation of the LTE radio signals. A comparison of the different techniques indicated that all ANN models could predict the transmitted LTE radio signals, albeit with some prediction error.
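A GRNN is essentially Gaussian-kernel regression over the stored training samples, which the following minimal sketch shows for predicting signal strength from distance; the synthetic log-distance data and the smoothing factor are assumptions for illustration.

```python
# Minimal sketch: a general regression neural network (GRNN), i.e. Gaussian-kernel
# Nadaraya-Watson regression, predicting LTE signal strength from distance.
# The synthetic log-distance data and the smoothing factor sigma are assumptions.
import numpy as np

rng = np.random.default_rng(3)

# Training data: RSRP-like measurements (dBm) versus distance from the eNodeB (m).
d_train = rng.uniform(50, 1500, 200)
rsrp_train = -60 - 35 * np.log10(d_train / 50) + 4 * rng.standard_normal(200)

def grnn_predict(d_query, d_train, y_train, sigma=100.0):
    """GRNN output: kernel-weighted average of the training targets."""
    diff = d_query[:, None] - d_train[None, :]
    w = np.exp(-(diff ** 2) / (2 * sigma ** 2))           # pattern-layer activations
    return (w * y_train).sum(axis=1) / w.sum(axis=1)      # summation/output layers

d_query = np.array([100.0, 500.0, 1200.0])
print("Predicted RSRP (dBm):", np.round(grnn_predict(d_query, d_train, rsrp_train), 1))
```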
A recent study by Ojo [11] aimed to resolve the issues of the existing (experimental and deterministic) models by implementing ML-based algorithms to predict path loss in LTE networks. The authors developed RBFNN and multilayer perceptron neural network (MLPNN) models with the measured data as input and compared the predictions to the measured path loss. They noted that the RBFNN was more accurate, as it showed lower root mean squared errors (RMSEs) than the MLPNN.
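The kind of RMSE comparison described above can be sketched as follows, with a small RBF network (k-means centres plus a linear output layer) and an MLP fitted to synthetic log-distance path-loss data; the data model and hyperparameters are illustrative assumptions, not those of the cited study.

```python
# Minimal sketch: comparing an RBF network and an MLP for path-loss prediction by RMSE.
# The synthetic log-distance path-loss data and hyperparameters are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
d = rng.uniform(100, 3000, 400)                                            # Tx-Rx distance (m)
pl = 128.1 + 37.6 * np.log10(d / 1000) + 6 * rng.standard_normal(d.size)  # path loss (dB)
X = np.log10(d).reshape(-1, 1)                                             # log-distance feature
X_tr, X_te, y_tr, y_te = train_test_split(X, pl, random_state=0)

# RBFNN: k-means centres, Gaussian hidden layer, linear output weights via least squares.
centres = KMeans(n_clusters=12, n_init=10, random_state=0).fit(X_tr).cluster_centers_
sigma = 0.2
phi = lambda Z: np.exp(-((Z - centres.T) ** 2) / (2 * sigma ** 2))
design = lambda Z: np.hstack([phi(Z), np.ones((len(Z), 1))])
w, *_ = np.linalg.lstsq(design(X_tr), y_tr, rcond=None)
rbf_pred = design(X_te) @ w

# MLPNN baseline on the same feature.
mlp = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0).fit(X_tr, y_tr)

rmse = lambda y, p: float(np.sqrt(np.mean((y - p) ** 2)))
print("RBFNN RMSE (dB):", round(rmse(y_te, rbf_pred), 2))
print("MLPNN RMSE (dB):", round(rmse(y_te, mlp.predict(X_te)), 2))
```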
Also, an ANN-aided scheduling scheme for UEs with mobility in dynamic LTE HetNets was proposed in another study [12]. The authors devised a faster enhanced inter-cell interference coordination (eICIC) reconfiguration technique for LTE HetNets, which achieved only a marginal performance gap compared with the centralized solution. Using historical data, the proposed technique trained an RBFNN to learn the relationship between the surrounding environment (channel and UE deployment), the optimal cell range extension, and the almost blank subframe pattern. The researchers evaluated the throughput and utility of their algorithm in simulations and noted that the optimal resource assignment allowed the HetNet to adapt to rapid variations with little performance degradation.
In [13], the researchers studied a probabilistic GRNN model for modelling and estimating spatial signal power loss data. Commercial data were collected from outdoor locations of the LTE network, covering three different signal propagation sites, and the GRNN model was trained with these spatial signal power loss measurements. The proposed model produced better results than the conventional least-squares regression modelling process. Other researchers proposed an ANN-based classification system with high accuracy and performance for fingerprint images [14].
In [15], the researchers used this technique for classifying bacteria; the results indicated that the ANN was effective and feasible. In [16], the researchers classified aerial photographs using an ANN and noted that it was suitable for classifying remotely sensed data, exceeded maximum likelihood classification in classification accuracy, and had a positive effect. In [17], the researchers used an ANN for classifying spoken letters and reported 100% (training) and 93% (testing) classification accuracy. All these studies stated that the ANN could be used for classification owing to its good performance. To improve machine-type communication (MTC) security, some researchers [18] studied the lack of authentication requirements and attack detection for LTE-based MTC devices. As a result, they introduced an improved NN model for detecting attacks; the results demonstrated the model's efficiency in detecting attacks and improving the system's security.
ANN models can be used to optimize the resource allocation algorithms of communication networks, and ANN technology can help overcome the resource allocation problems noted in the LTE network. Common ANN activation functions include the logistic sigmoid, binary (threshold), and hyperbolic tangent sigmoid functions, applied in RBFNN, MLPNN, recurrent neural network, and perceptron models; all of these have been used in developing network communication systems. Backpropagation and gradient descent are typically used as training algorithms for the ANN. ANNs are applied to different tasks, such as function approximation, pattern classification, prediction, and clustering. However, the performance of these algorithms is significantly affected by the data preparation and the chosen NN structure. ANN and mathematical models are also used to evaluate and validate experimental data [19,20].
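For concreteness, the sketch below shows the activation functions mentioned above and a single backpropagation/gradient-descent update for one sigmoid neuron; the example values are arbitrary.

```python
# Minimal sketch: common activation functions and one gradient-descent
# (backpropagation) update for a single sigmoid neuron. Values are illustrative.
import numpy as np

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))          # logistic sigmoid
tanh = np.tanh                                        # hyperbolic tangent sigmoid
step = lambda z: (z >= 0).astype(float)               # binary (threshold) activation

# One training example and a single neuron y_hat = sigmoid(w.x + b).
x, y = np.array([0.5, -1.2, 0.3]), 1.0
w, b, lr = np.zeros(3), 0.0, 0.1

z = w @ x + b
y_hat = sigmoid(z)
loss = 0.5 * (y_hat - y) ** 2                         # squared-error loss

# Backpropagation: chain rule dL/dw = (y_hat - y) * sigmoid'(z) * x.
grad_z = (y_hat - y) * y_hat * (1.0 - y_hat)
w -= lr * grad_z * x                                  # gradient-descent step
b -= lr * grad_z
print("loss before update:", float(loss), "| updated weights:", w)
```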
Artificial neural network (ANN) techniques are used in the GRNN-RBFNN model and other proposed methods to optimize downlink scheduling over LTE networks. These models learn from historical data to predict optimal scheduling policies for different users. However, some challenges still need to be addressed in LTE network scheduling. One such problem is ensuring fairness among users with different QoS requirements: the scheduling algorithms used in these models may not always provide equal QoS to all users, leading to a lower fairness index for some of them. Further research is therefore needed to develop scheduling algorithms that ensure fairness among users with varying QoS requirements. Another challenge is the dynamic nature of the LTE network, in which fluctuating network conditions and user demands make it difficult to predict optimal scheduling policies accurately. Adaptive scheduling algorithms are therefore needed that can adjust in real time to changes in channel conditions and user demands based on the data observed over the network. The proposed methods have shown promise for optimizing downlink scheduling in LTE networks, but the problems of fairness and changing network conditions still need to be studied to improve LTE scheduling performance.
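One common way to quantify the fairness mentioned here is Jain's fairness index over per-user throughputs, sketched below; this is an assumed choice of metric, and the cited works may use a different definition of the fairness index.

```python
# Minimal sketch: quantifying scheduling fairness with Jain's fairness index,
# one common choice for the "fairness index" mentioned above (an assumption).
import numpy as np

def jain_fairness(throughputs):
    """Jain's index: 1/N (worst) up to 1.0 (perfectly equal allocation)."""
    t = np.asarray(throughputs, dtype=float)
    return float(t.sum() ** 2 / (t.size * (t ** 2).sum()))

print(jain_fairness([5.0, 5.0, 5.0, 5.0]))   # 1.0  -> equal per-UE throughput
print(jain_fairness([12.0, 4.0, 2.0, 2.0]))  # ~0.6 -> throughput concentrated on one UE
```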
Using artificial neural networks (ANNs) for downlink scheduling in LTE networks is thus a suitable approach to the critical problem of optimizing the allocation of radio resources to users. ANNs are powerful machine-learning models that can learn patterns and relationships from historical data, and, as the studies reviewed above show, they have yielded significant improvements in throughput, fairness, and quality of service when applied to channel estimation, UE condition and location estimation, data prediction and classification, and downlink scheduling in LTE networks.