Steel Plates Faults Prediction Methods: Comparison

Fault prediction is a vital task to decrease the costs of equipment maintenance and repair, as well as to improve the quality level of products and production efficiency. Steel plate fault prediction is a significant materials science problem, and solving it helps prevent the progression of abnormal events.

  • fault prediction
  • machine learning
  • logistic model tree
  • classification
  • artificial intelligence
  • steel plates

1. Introduction

A fault is defined as an unexpected, abnormal, and undesirable situation, behavior, or imperfection at the equipment, component, or sub-system level, which may cause a failure. Faults influence the wear and corrosion resistance of the product, reduce the production quality, and produce non-usable materials in the worst case. Such a physical malfunction can lead to unavoidable crashes and stop the system from working properly. Fault prediction is the process of identifying fault-prone components related to specific domains based on predictive analytics. In other words, it predicts different deviations in materials from their expected or normal states. Determining fault types in an effective way can reduce unexpected waste, maintenance, repair, or replacement costs, as well as improve the quality level of products and production efficiency. Fault prediction helps extend equipment lifetime and improve asset utilization in various industrial environments. Moreover, it helps avoid a long-term decline in the total profits of the related system and the erosion of customer confidence. The higher the quality level a product requires, the better the fault prediction techniques the industry must develop. In this context, intelligent systems, derived from research on machine learning, have been established to handle this issue correctly and quickly.
The steel industry has been shown to be one of the primary industries that require fault prediction to produce materials in the most meticulous way. From making machines to beautiful artworks, steel plates are commonly used in a diverse range of applications, namely in industrial machinery, building construction, automobile chassis construction, bridge structures, and shipbuilding. Given such widespread applications, high-accuracy control of steel plate surfaces is important for meeting strict quality requirements. However, flat steel sheet manufacturing has always been considered difficult in the industry because of the tendency to deform, often caused by the steel surface coming into contact with different machines during manufacturing steps such as casting, drawing, pressing, cutting, and folding. Consequently, this industry aims to recognize the types of defects that steel plates have. One of the traditional ways is the manual inspection of steel plates by human experts to detect defects. However, this practice is time-consuming, inaccurate, and costly; it requires considerable human effort, and defects can still be overlooked. Therefore, automation of fault prediction is necessary to reduce costs and minimize the time needed for monitoring. Here, machine learning plays an important role by analyzing past data to find hidden patterns and then constructing models to predict faults. Machine learning-based fault prediction methods facilitate precautionary maintenance and help avoid material quality problems through more accurate and efficient decisions.
Machine learning (ML) draws inferences to predict future outcomes by finding patterns in historical data. It provides computers with the ability to learn by utilizing different algorithms and builds predictive models for artificial intelligence-enabled systems. As one of the ML methods, the logistic model tree (LMT) [1] is a decision tree-based model that fits logistic regression functions at its leaves. The competitive advantages of LMT are its efficient construction and the simplicity of its interpretation. LMT builds a single compact tree by means of effective pruning mechanisms. In addition, the key features of the LMT algorithm include its ability to handle binary and numeric attributes, nominal attributes, and missing values, which has enabled it to achieve the best results in many studies [2][3][4][5][6][7].
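
As a rough illustration of this idea, the following Python sketch fits a plain logistic regression model inside each leaf of a shallow decision tree. It is only a simplified stand-in for LMT: the actual algorithm of Landwehr et al. builds the leaf models incrementally with LogitBoost and applies cost-complexity pruning, neither of which is shown here, and the class name SimpleLMT, its parameters, and the assumption of integer class labels are all illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier


class SimpleLMT:
    """Toy logistic model tree: logistic regression models in the leaves
    of a shallow decision tree (LogitBoost and pruning are omitted)."""

    def __init__(self, max_depth=2):
        self.tree = DecisionTreeClassifier(max_depth=max_depth, random_state=0)
        self.leaf_models = {}

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.tree.fit(X, y)
        leaf_ids = self.tree.apply(X)            # leaf index of each training sample
        for leaf in np.unique(leaf_ids):
            mask = leaf_ids == leaf
            if len(np.unique(y[mask])) > 1:      # mixed leaf: fit a logistic model
                self.leaf_models[leaf] = LogisticRegression(max_iter=1000).fit(X[mask], y[mask])
            else:                                # pure leaf: constant class label
                self.leaf_models[leaf] = int(y[mask][0])
        return self

    def predict(self, X):
        X = np.asarray(X)
        leaf_ids = self.tree.apply(X)
        preds = []
        for x, leaf in zip(X, leaf_ids):
            model = self.leaf_models[leaf]
            if isinstance(model, LogisticRegression):
                preds.append(model.predict(x.reshape(1, -1))[0])
            else:
                preds.append(model)
        return np.array(preds)
```

Usage follows the familiar fit/predict pattern, e.g. `SimpleLMT(max_depth=2).fit(X_train, y_train).predict(X_test)`.
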
Although LMT usually provides high classification performance and strong generalization ability [8][9][10][11][12][13][14][15][16][17][18][19], building a single tree classifier may not be enough and may lead to less accurate predictions. In ensemble learning, on the other hand, the weaknesses of one classifier can be compensated by the strengths of the others: even when several classifiers in the ensemble produce incorrect outputs, the remaining classifiers may be able to correct those errors.
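
A small worked example shows why voting helps, under the strong and idealized assumption that the classifiers make errors independently: three classifiers that are each correct 70% of the time yield a majority vote that is correct about 78% of the time.

```python
from math import comb

# Assumption for illustration only: three classifiers, each with accuracy
# p = 0.7, making independent errors. The majority vote is correct whenever
# at least two of the three individual classifiers are correct.
p = 0.7
majority = sum(comb(3, k) * p ** k * (1 - p) ** (3 - k) for k in (2, 3))
print(f"single classifier: {p:.3f}  |  majority vote of three: {majority:.3f}")
# single classifier: 0.700  |  majority vote of three: 0.784
```
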

2. Machine Learning-Based Fault Prediction

Machine learning-based fault prediction has been investigated with real-time monitoring in manufacturing environments. In [20], random forest (RF) classification was employed for the prediction of input data issues, and NoSQL MongoDB as a big data technique was applied to the environmental dataset collected from Internet of Things (IoT) sensors in an automotive manufacturing production line. Moreover, blockchain technology was utilized to cover system security. In another work [21], the utilization of machine learning models in the battery management system of a lithium-ion battery for the prediction of faults in the remaining useful life (RUL), charge state, and health state was presented by means of a neural network (NN) with a support vector machine (SVM), a genetic algorithm back-propagation neural network (GA-BPNN), RF, Gaussian process regression (GPR), logistic regression (LR), and a long short-term memory recurrent neural network (LSTM-RNN). In another study [22], the authors focused on a bearing fault prediction method for electric motors by applying a medium Gaussian support vector machine (MG-SVM) to a motor bearing dataset. The application of deep learning methods to predict faults has been investigated in various studies [23][24][25][26]. In [23], a fault prediction workflow based on deep learning for seismic data was developed, in which convolutional neural networks (CNNs) for image recognition, the U-Net architecture for image segmentation, random forest for identifying the most important attributes, and a GAN-based reconstruction approach for clarifying fault locations were used on the seismic data. As a result, the “discontinuity along dip” feature was identified as the most important seismic attribute, and the prediction accuracy of fault probability maps was improved. Similarly, in another work [24], the authors proposed a structure-based data augmentation framework to boost the variety of a semi-real-semi-synthetic seismic dataset collected from various work areas in the Tarim Basin of China, improving fault prediction and identification on the basis of deep neural networks and U-Net, respectively. In another work [25], fault prediction and cause identification approaches based on deep learning in complex industrial processes were reported. The authors utilized deep learning to predict fault events, long short-term memory (LSTM) to adapt to branch structures, and an attention mechanism for fault detection and cause identification on sensor-based data in a production line considering various fault types. Yang and Kim [26] detected recurrent and accumulative fault situations and calculated anomaly scores in the data by using the LSTM method. Fault prediction in wind turbines has been investigated in previous studies [27][28] since it is a critical issue for maintaining the reliability and safety of energy systems. In [27], a novel solution for predictive maintenance of wind turbine generators was developed by means of supervisory control and data acquisition (SCADA) systems to control the state of operations in the generators. Principal component analysis (PCA), SVM, NN, K-nearest neighbors (KNN), and naive Bayes (NB) classifiers were used to discriminate the various statuses of wind turbine generators.
The synthetic minority oversampling technique (SMOTE) was applied to manage the imbalanced dataset for the wind power plants, consisting of numerous wind turbines located in China (a short SMOTE illustration is given after this paragraph). Low deployment costs were achieved in the presented work while diagnosing specific types of generator faults with high accuracy. In another study [28], the authors focused on a stacking gearbox fault prediction model for wind turbines on the basis of the SCADA data of wind turbines in a wind farm. The main techniques applied were recursive feature elimination (RFE) for selecting appropriate features, and RF, extreme gradient boosting (XGBoost), and gradient boosting decision tree (GBDT) for describing the normal operating conditions of the wind turbines. The results revealed that the RF, GBDT, and XGBoost approaches outperformed KNN, SVM, decision tree (DT), and AdaBoost according to their high R2 scores and low mean absolute error (MAE) and root mean square error (RMSE) metrics for various turbine types. Wan et al. [29] presented a model based on the Dempster–Shafer (DS) evidence theory and a quantum particle swarm optimization back-propagation (QPSO-BP) neural network for the prediction of rolling bearing fault types under different operating conditions. They found the optimal initial weights and thresholds of the neural network. The authors used a rolling bearing dataset and achieved high accuracy with the presented method in comparison to SVM-DS, DT-DS, RF-DS, KNN-DS, and K-means-DS regarding the macro area under the curve (AUC) metric. Yang and Li [30] developed a fault prediction method for wind energy conversion systems to improve the performance of the fault prediction model, shorten the time of fault prediction, and reduce the deviation between the actual fault value and the predicted fault value. The superiority of the presented method was demonstrated based on the kurtosis factor in comparison with reported results for fault prediction in different wind energy conversion systems. In another work [31], the performances of various machine learning approaches were reported for forecasting heating appliance failures with the aim of predictive maintenance. In the mentioned work, the necessary data were collected from installed sensors of boiler appliances in homes. The results indicated that the LSTM models achieved higher accuracy than the DT, NN, and weighted NN models based on different metrics for the no fault, light fault, and severe fault states. In another study [32], a smart machinery monitoring system based on machine learning was implemented to simulate the operating state of machinery for fault detection with a reduced volume of transmitted information in an industrial IoT. The accuracy obtained from the non-linear SVM algorithm was higher than the results of the NB, RF, DT, KNN, and AdaBoost algorithms. Syafrudin et al. [33] introduced a hybrid prediction model which includes a real-time monitoring system for automotive manufacturing on the basis of IoT sensors and big data processing. Various approaches, namely Apache Storm as a real-time processing engine, Apache Kafka as a message queue, MongoDB for storage of the sensor data, density-based spatial clustering of applications with noise (DBSCAN) for outlier detection and removal, and RF for classification, were used in the mentioned study.
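
The SMOTE step mentioned above can be sketched with the imbalanced-learn library on synthetic data (not the wind turbine data of the cited study); the class weights and random seeds below are illustrative only.

```python
from collections import Counter

from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification

# Synthetic, imbalanced stand-in data (roughly 95% majority, 5% minority class).
X, y = make_classification(n_samples=1000, weights=[0.95, 0.05],
                           n_informative=4, random_state=42)
print("class counts before SMOTE:", Counter(y))

# SMOTE creates synthetic minority-class samples by interpolating between
# existing minority samples and their nearest neighbours.
X_res, y_res = SMOTE(random_state=42).fit_resample(X, y)
print("class counts after SMOTE: ", Counter(y_res))
```
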
In another study [34], a fault prediction method was proposed to accelerate the speed of alarm processing and to improve the accuracy of the energy management system of microgrids via online monitoring, failure prejudging, and optimized SVM analysis. An early warning time and the high success rate of the proposed method were the outcomes of their study. In another work [35], fault prediction for in-orbit spacecraft was investigated based on deep machine learning and massive telemetry and fault data. Algorithms such as least squares support vector regression (LS-SVR), auto-regressive integrated moving average (ARIMA), and wavelet NN were utilized to determine the best model regarding the normalized mean square error (NMSE). Haneef and Venkataraman [36] employed LSTM, RNN, and a computation memory and power (CRP) rule-based network policy for predicting fog device faults. They collected the related data by running Internet of Things applications on different fog nodes. Their proposed method outperformed the traditional LSTM, SVM, and LR methods in terms of improved accuracy, lower processing time, minimal delay, and faster fault prediction rates. In another work [37], the authors developed a machine learning-enabled method for fault prediction in centrifugal pumps in the gas and oil industry through multi-layer perceptron (MLP) and SVM techniques. They gathered the related data from the process and equipment sensors of centrifugal pumps to generate fault prediction alerts properly in decision support systems for operatives. In another study [38], the authors reported a fault prediction model with the aim of real-time tracking of sensor data in an IoT-enabled cloud environment for a hospital by machine learning. They applied the DT, KNN, NB, and RF techniques for controlling unanticipated losses produced by different faults. In another work [39], a real-time fault prediction recommendation model was developed by machine learning for a sensor-based smart office environment by means of a fault dataset retrieved from the sensors of office appliances. In their study, KNN, DT, NB, and RF were compared, and as a result, the RF algorithm revealed the highest accuracy among the others.

3. The Application of the LMT Algorithm

LMT is a classification algorithm in the machine learning field that uses decision tree and logistic regression approaches to build a classifier as a special tree, taking advantage of both tree and regression concepts. In other words, it builds a tree with a logistic regression model at the nodes. LMT has been considered an effective alternative to decision tree-based machine learning algorithms. The major benefits of LMT include its ability to handle binary and numeric attributes, nominal attributes, and missing values. In addition, LMT avoids overfitting the data as a result of its combined regression and classification techniques. Despite the advantages of LMT, building a single tree classifier may not be enough and may lead to less accurate predictions. Therefore, in the current research, the researchers present an ensemble method, the logistic model tree forest, which builds many LMT trees and combines them to make a final prediction (a minimal sketch of this ensemble idea follows this paragraph). LMT has been applied in various fields such as health [5][6][14][15][17][18], forensic science [19], environmental work [7], earthquakes [3][8], agriculture [13], and transportation [16]. For example, in [11], flash flood susceptibility maps were analyzed by the use of different machine learning algorithms, including LMT, multinomial NB, radial basis function classifier (RBFC), and kernel LR, for solving the flood problem in Vietnam. The dataset consisted of flash flood features such as river density, land use, flow direction, and so on. The validity of the methods was measured in terms of AUC, and the best performance was achieved by the LMT algorithm. Their work was suggested for flash flood management, relying on the high accuracy of the model to identify flood-susceptible areas. LMT was regarded as the best method among its counterparts in many studies [2][3][4][5][6][7][8][9][10][11][12][13][14][15][16][17][18][19][40][41]. For example, in [9], a landslide detection and susceptibility model for the Cameron Highlands of Malaysia was reported, in which the RF, LR, and LMT algorithms were applied to various databases such as soil maps, digital elevation models, geological maps, and satellite imagery. The results revealed the superiority of LMT over LR and RF based on the AUC metric. In another study [10], the authors constructed a trustworthy map of shallow landslide susceptibility for Bijar City in Iran using different machine learning algorithms, including LR, LMT, NB, SVM, and NN. The reliability of the models was tested according to various metrics (i.e., MAE, RMSE). The superiority of LMT over the other mentioned algorithms was demonstrated. Thus, the authors recommended the utilization of LMT for shallow landslide phenomena to reduce the related damage. The LMT algorithm has been used in various studies to suggest solutions for machine learning-based problems due to its high accuracy in terms of different evaluation metrics. For example, in [13], the biochemical features of oil palm plants were monitored by using spectroradiometer, machine learning, SMOTE, and unmanned aerial vehicle (UAV) techniques. In addition, three types of imbalanced datasets (leaf-raw band, canopy-VI, and canopy-raw band) were utilized to analyze nutrients in plants optimally and ensure their health and harvest. The outperformance of LMT-SMOTEBoost over the alternatives was reported.
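
The ensemble mechanism behind a "forest of trees" can be sketched with bootstrap aggregation (bagging). The snippet below is illustrative only: it uses plain decision trees as stand-ins for LMT base learners, because scikit-learn provides no built-in logistic model tree, and synthetic data in place of the steel plates dataset; it shows the general pattern of training many base learners on bootstrap samples and voting, not the researchers' exact logistic model tree forest.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic multi-class data standing in for a steel plate fault dataset.
X, y = make_classification(n_samples=500, n_classes=3, n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# A single tree versus a bagged ensemble of trees trained on bootstrap samples,
# whose predictions are combined by voting.
single = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
forest = BaggingClassifier(DecisionTreeClassifier(random_state=0),
                           n_estimators=50, random_state=0).fit(X_tr, y_tr)

print("single tree accuracy:    ", round(single.score(X_te, y_te), 3))
print("bagged ensemble accuracy:", round(forest.score(X_te, y_te), 3))
```
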
In another work [14], LMT was applied to the medical field to predict miRNA-disease associations (LMTRDA) by combining various information sources such as miRNA functional similarity, miRNA sequences, disease semantic similarity, and known miRNA-disease associations. Their model achieved high accuracy regarding both the sensitivity and AUC metrics on the dataset. Edited nearest neighbor (ENN) is a useful under-sampling technique focused on eliminating noisy samples [42]. It aims to select a subset of the training instances belonging to the majority class in order to make the classifier more robust and improve computational efficiency [43]. Previous studies [44][45] showed that the ENN method allowed for an improvement in classification performance in terms of accuracy.
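
The following is a minimal sketch of ENN under-sampling with the imbalanced-learn library, run on synthetic data rather than the datasets of the cited studies; the class weights, noise level, and neighbor count below are illustrative assumptions.

```python
from collections import Counter

from imblearn.under_sampling import EditedNearestNeighbours
from sklearn.datasets import make_classification

# Synthetic, imbalanced, slightly noisy stand-in data (flip_y injects label noise).
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1],
                           n_informative=4, flip_y=0.05, random_state=1)
print("class counts before ENN:", Counter(y))

# ENN removes samples whose nearest neighbours mostly carry a different label,
# i.e., likely noisy or borderline instances of the majority class.
X_res, y_res = EditedNearestNeighbours(n_neighbors=3).fit_resample(X, y)
print("class counts after ENN: ", Counter(y_res))
```
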

References

  1. Landwehr, N.; Hall, M.; Frank, E. Logistic Model Trees. Mach. Learn. 2005, 59, 161–205.
  2. Kamali Maskooni, E.; Naghibi, S.A.; Hashemi, H.; Berndtsson, R. Application of Advanced Machine Learning Algorithms to Assess Groundwater Potential Using Remote Sensing-Derived Data. Remote Sens. 2020, 12, 2742.
  3. Debnath, P.; Chittora, P.; Chakrabarti, T.; Chakrabarti, P.; Leonowicz, Z.; Jasinski, M.; Gono, R.; Jasińska, E. Analysis of Earthquake Forecasting in India Using Supervised Machine Learning Classifiers. Sustainability 2021, 13, 971.
  4. Zhao, X.; Chen, W. Optimization of Computational Intelligence Models for Landslide Susceptibility Evaluation. Remote Sens. 2020, 12, 2180.
  5. Davis, J.D.; Wang, S.; Festa, E.K.; Luo, G.; Moharrer, M.; Bernier, J.; Ott, B.R. Detection of Risky Driving Behaviors in the Naturalistic Environment in Healthy Older Adults and Mild Alzheimer’s Disease. Geriatrics 2018, 3, 13.
  6. Lee, S.-W.; Kung, H.-C.; Huang, J.-F.; Hsu, C.-P.; Wang, C.-C.; Wu, Y.-T.; Wen, M.-S.; Cheng, C.-T.; Liao, C.-H. The Clinical Application of Machine Learning-Based Models for Early Prediction of Hemorrhage in Trauma Intensive Care Units. J. Pers. Med. 2022, 12, 1901.
  7. Reyes-Bueno, F.; Loján-Córdova, J. Assessment of Three Machine Learning Techniques with Open-Access Geographic Data for Forest Fire Susceptibility Monitoring—Evidence from Southern Ecuador. Forests 2022, 13, 474.
  8. Han, J.; Nur, A.S.; Syifa, M.; Ha, M.; Lee, C.-W.; Lee, K.-Y. Improvement of Earthquake Risk Awareness and Seismic Literacy of Korean Citizens through Earthquake Vulnerability Map from the 2017 Pohang Earthquake, South Korea. Remote Sens. 2021, 13, 1365.
  9. Nhu, V.-H.; Mohammadi, A.; Shahabi, H.; Ahmad, B.B.; Al-Ansari, N.; Shirzadi, A.; Geertsema, M.R.; Kress, V.; Karimzadeh, S.; Valizadeh Kamran, K.; et al. Landslide Detection and Susceptibility Modeling on Cameron Highlands (Malaysia): A Comparison between Random Forest, Logistic Regression and Logistic Model Tree Algorithms. Forests 2020, 11, 830.
  10. Nhu, V.-H.; Shirzadi, A.; Shahabi, H.; Singh, S.K.; Al-Ansari, N.; Clague, J.J.; Jaafari, A.; Chen, W.; Miraki, S.; Dou, J.; et al. Shallow Landslide Susceptibility Mapping: A Comparison between Logistic Model Tree, Logistic Regression, Naïve Bayes Tree, Artificial Neural Network, and Support Vector Machine Algorithms. Int. J. Environ. Res. Public Health 2020, 17, 2749.
  11. Pham, B.T.; Phong, T.V.; Nguyen, H.D.; Qi, C.; Al-Ansari, N.; Amini, A.; Ho, L.S.; Tuyen, T.T.; Yen, H.P.H.; Ly, H.-B.; et al. A Comparative Study of Kernel Logistic Regression, Radial Basis Function Classifier, Multinomial Naïve Bayes, and Logistic Model Tree for Flash Flood Susceptibility Mapping. Water 2020, 12, 239.
  12. Charton, E.; Meurs, M.-J.; Jean-Louis, L.; Gagnon, M. Using Collaborative Tagging for Text Classification: From Text Classification to Opinion Mining. Informatics 2014, 1, 32–51.
  13. Amirruddin, A.D.; Muharam, F.M.; Ismail, M.H.; Tan, N.P.; Ismail, M.F. Synthetic Minority Over-Sampling TEchnique (SMOTE) and Logistic Model Tree (LMT)-Adaptive Boosting Algorithms for Classifying Imbalanced Datasets of Nutrient and Chlorophyll Sufficiency Levels of Oil Palm (Elaeis Guineensis) Using Spectroradiometers and Unmanned Aerial Vehicles. Comput. Electron. Agric. 2022, 193, 106646.
  14. Wang, L.; You, Z.-H.; Chen, X.; Li, Y.-M.; Dong, Y.-N.; Li, L.-P.; Zheng, K. LMTRDA: Using Logistic Model Tree to Predict MiRNA-Disease Associations by Fusing Multi-Source Information of Sequences and Similarities. PLoS Comput. Biol. 2019, 15, e1006865.
  15. Kabir, E.; Siuly; Zhang, Y. Epileptic Seizure Detection from EEG Signals Using Logistic Model Trees. Brain Inf. 2016, 3, 93–100.
  16. Cheng, C.-H.; Yang, J.-H.; Liu, P.-C. Rule-Based Classifier Based on Accident Frequency and Three-Stage Dimensionality Reduction for Exploring the Factors of Road Accident Injuries. PLoS ONE 2022, 17, e0272956.
  17. Jha, S.K.; Ahmad, Z. An Effective Feature Generation and Selection Approach for Lymph Disease Recognition. Comput. Model. Eng. Sci. 2021, 129, 567–594.
  18. Ayyappan, G.; Babu, R.V. Knowledge Construction on NIV of COVID-19 for Managing the Patients by ML Techniques. Indian J. Comput. Sci. Eng. 2023, 14, 117–129.
  19. Gorka, M.; Thomas, A.; Bécue, A. Differentiating Individuals through the Chemical Composition of Their Fingermarks. Forensic Sci. Int. 2023, 346, 111645.
  20. Shahbazi, Z.; Byun, Y.-C. Smart Manufacturing Real-Time Analysis Based on Blockchain and Machine Learning Approaches. Appl. Sci. 2021, 11, 3535.
  21. Samanta, A.; Chowdhuri, S.; Williamson, S.S. Machine Learning-Based Data-Driven Fault Detection/Diagnosis of Lithium-Ion Battery: A Critical Review. Electronics 2021, 10, 1309.
  22. Lin, S.-L. Application of Machine Learning to a Medium Gaussian Support Vector Machine in the Diagnosis of Motor Bearing Faults. Electronics 2021, 10, 2266.
  23. Jiang, F.; Norlund, P. Seismic attribute-guided automatic fault prediction by deep learning. In Proceedings of the EAGE 2020 Annual Conference Exhibition, Online, 8–11 December 2020; European Association of Geoscientists & Engineers: Utrecht, The Netherlands, 2020; Volume 2020, pp. 1–5.
  24. Wang, S.; Si, X.; Cai, Z.; Cui, Y. Structural Augmentation in Seismic Data for Fault Prediction. Appl. Sci. 2022, 12, 9796.
  25. Li, Y. A Fault Prediction and Cause Identification Approach in Complex Industrial Processes Based on Deep Learning. Comput. Intell. Neurosci. 2021, 2021, 6612342.
  26. Yang, H.-S.; Kim, Y.-S. Design and Implementation of Machine Learning-Based Fault Prediction System in Cloud Infrastructure. Electronics 2022, 11, 3765.
  27. Zhao, Y.; Li, D.; Dong, A.; Kang, D.; Lv, Q.; Shang, L. Fault Prediction and Diagnosis of Wind Turbine Generators Using SCADA Data. Energies 2017, 10, 1210.
  28. Yuan, T.; Sun, Z.; Ma, S. Gearbox Fault Prediction of Wind Turbines Based on a Stacking Model and Change-Point Detection. Energies 2019, 12, 4224.
  29. Wan, L.; Li, H.; Chen, Y.; Li, C. Rolling Bearing Fault Prediction Method Based on QPSO-BP Neural Network and Dempster–Shafer Evidence Theory. Energies 2020, 13, 1094.
  30. Yang, J.; Li, J.-D. Fault Prediction Algorithm for Offshore Wind Energy Conversion System Based on Machine Learning. In Proceedings of the International Conference on High Performance Big Data and Intelligent Systems (HPBD&IS), Macau, China, 5–7 December 2021; pp. 291–296.
  31. Fernandes, S.; Antunes, M.; Santiago, A.R.; Barraca, J.P.; Gomes, D.; Aguiar, R.L. Forecasting Appliances Failures: A Machine-Learning Approach to Predictive Maintenance. Information 2020, 11, 208.
  32. Tsai, M.-F.; Chu, Y.-C.; Li, M.-H.; Chen, L.-W. Smart Machinery Monitoring System with Reduced Information Transmission and Fault Prediction Methods Using Industrial Internet of Things. Mathematics 2020, 9, 3.
  33. Syafrudin, M.; Alfian, G.; Fitriyani, N.; Rhee, J. Performance Analysis of IoT-Based Sensor, Big Data Processing, and Machine Learning Model for Real-Time Monitoring System in Automotive Manufacturing. Sensors 2018, 18, 2946.
  34. Yuan, H.; Zhang, Z.; Yuan, P.; Wang, S.; Wang, L.; Yuan, Y. A Microgrid Alarm Processing Method Based on Equipment Fault Prediction and Improved Support Vector Machine Learning. J. Phys. Conf. Ser. 2020, 1639, 012041.
  35. Zhang, X.; Wang, X.; Tian, H. Spacecraft in Orbit Fault Prediction Based on Deep Machine Learning. J. Phys. Conf. Ser. 2020, 1651, 012107.
  36. Haneef, S.; Venkataraman, N. Proactive Fault Prediction of Fog Devices Using LSTM-CRP Conceptual Framework for IoT Applications. Sensors 2023, 23, 2913.
  37. Orrù, P.F.; Zoccheddu, A.; Sassu, L.; Mattia, C.; Cozza, R.; Arena, S. Machine Learning Approach Using MLP and SVM Algorithms for the Fault Prediction of a Centrifugal Pump in the Oil and Gas Industry. Sustainability 2020, 12, 4776.
  38. Uppal, M.; Gupta, D.; Juneja, S.; Sulaiman, A.; Rajab, K.; Rajab, A.; Elmagzoub, M.A.; Shaikh, A. Cloud-Based Fault Prediction for Real-Time Monitoring of Sensor Data in Hospital Environment Using Machine Learning. Sustainability 2022, 14, 11667.
  39. Uppal, M.; Gupta, D.; Mahmoud, A.; Elmagzoub, M.A.; Sulaiman, A.; Reshan, M.S.A.; Shaikh, A.; Juneja, S. Fault Prediction Recommender Model for IoT Enabled Sensors Based Workplace. Sustainability 2023, 15, 1060.
  40. Colkesen, I.; Kavzoglu, T. The Use of Logistic Model Tree (LMT) for Pixel- and Object-Based Classifications Using High-Resolution WorldView-2 Imagery. Geocarto Int. 2016, 32, 71–86.
  41. Nithya, R.; Santhi, B. Decision Tree Classifiers for Mass Classification. Int. J. Signal Imaging Syst. Eng. 2015, 8, 39.
  42. Wilson, D.L. Asymptotic Properties of Nearest Neighbor Rules Using Edited Data. IEEE Trans. Syst. Man Cybern. 1972, SMC-2, 408–421.
  43. Alejo, R.; Sotoca, J.M.; Valdovinos, R.M.; Toribio, P. Edited Nearest Neighbor Rule for Improving Neural Networks Classifications. In Proceedings of the 7th International Symposium on Neural Networks (ISNN 2010), Shanghai, China, 6–9 June 2010; pp. 303–310.
  44. Oyewola, D.O.; Dada, E.G.; Misra, S.; Damaševičius, R. Predicting COVID-19 Cases in South Korea with All K-Edited Nearest Neighbors Noise Filter and Machine Learning Techniques. Information 2021, 12, 528.
  45. Blachnik, M.; Kordos, M. Comparison of Instance Selection and Construction Methods with Various Classifiers. Appl. Sci. 2020, 10, 3933.