Cite
Mohammed, S.I.; Hussein, N.K.; Haddani, O.; Aljohani, M.; Alkahya, M.A.; Qaraad, M. Salp Swarm Algorithm. Encyclopedia. Available online: https://encyclopedia.pub/entry/54371 (accessed on 19 June 2024).
Salp Swarm Algorithm

The Salp Swarm Algorithm (SSA) is a bio-inspired metaheuristic optimization technique that mimics the collective chaining behavior of salps foraging in the ocean. While it demonstrates competitive performance on benchmark problems, the SSA, like many population-based algorithms, suffers from slow convergence and a tendency to become trapped in local optima.

swarm intelligence; Salp swarm optimizer; global optimization; locally weighted approach

1. Introduction

Machine learning has emerged as a promising solution to automate cardiovascular disease (CVD) risk prediction from electronic health records at a population scale. A variety of supervised algorithms have been explored, including logistic regression, decision trees, support vector machines, naïve Bayes classifiers, and ensemble methods [1]. Among these, ensemble techniques such as random forests and gradient boosting machines have achieved state-of-the-art performance due to their ability to handle complex interactions between risk variables [2]. In particular, extreme gradient boosting (XGBoost) has proven highly effective for medical applications through its efficient implementation of regularized model fitting [3]. Nonetheless, machine learning models remain limited by the improper selection of hyperparameters that determine model complexity, learning, and regularization strategies [4]. Conventional grid searches exhaustively evaluate a fixed set of predefined parameter combinations but scale poorly with dimensions and fail to explore interactions between settings effectively [5]. Random searches offer superior coverage of the search space but evaluate many suboptimal configurations without exploiting promising regions [5]. To address these limitations, metaheuristic techniques have emerged as automated approaches to nonlinear, multimodal hyperparameter optimization problems [6][7][8][9][10]. The class of optimization methods known as metaheuristic algorithms has seen widespread adoption. These methods can be broken down into nine classes, with contributions from fields as varied as biology, physics, sociology, music, chemistry, sports, mathematics, collective behaviors (swarm-based), and hybrid approaches that combine elements from several of these classes [11]. In particular, metaheuristic algorithms are useful for solving high-dimensional, nonlinear, constrained optimization problems that arise in the real world. 
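The trade-off described above can be made concrete with a toy sketch: a grid search exhaustively scores every combination of a predefined grid, while a random search samples configurations from the space. The objective here is a hypothetical stand-in for a validation loss (minimized at lr = 0.1, depth = 4), not a real model; the helper names are illustrative assumptions.

```python
import itertools
import random

def toy_validation_loss(params):
    # Hypothetical surrogate for a model's validation loss;
    # minimized at lr = 0.1, depth = 4.
    return (params["lr"] - 0.1) ** 2 + (params["depth"] - 4) ** 2

def grid_search(objective, grid):
    """Exhaustively score every combination of a fixed, predefined grid."""
    combos = [dict(zip(grid, values))
              for values in itertools.product(*grid.values())]
    return min(combos, key=objective)

def random_search(objective, sampler, n_trials=50, seed=0):
    """Score n_trials randomly sampled configurations instead."""
    rng = random.Random(seed)
    return min((sampler(rng) for _ in range(n_trials)), key=objective)

best_grid = grid_search(toy_validation_loss,
                        {"lr": [0.01, 0.1, 1.0], "depth": [2, 4, 6]})
best_rand = random_search(toy_validation_loss,
                          lambda rng: {"lr": 10 ** rng.uniform(-3, 0),
                                       "depth": rng.randint(1, 8)})
```

The grid evaluates all 9 combinations regardless of promise, which is exactly the exhaustive scaling problem noted above; the random search covers the continuous learning-rate range but exploits nothing, which is the gap metaheuristics aim to fill.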
Metaheuristic algorithms have gained substantial recognition and utilization in diverse fields of research and practical applications. One area where these algorithms have shown promise is in the domain of cardiovascular disease (CVD) risk assessment, a critical healthcare concern worldwide. The use of metaheuristic algorithms in CVD risk assessment presents an opportunity to enhance the accuracy and efficiency of predictive models, enabling healthcare professionals to better identify individuals at high risk of developing cardiovascular diseases. Recent studies have explored the application of metaheuristic algorithms, such as Genetic Algorithms, Particle Swarm Optimization, and Artificial Neural Networks, in developing CVD risk prediction models [12]. These algorithms can effectively process large datasets containing diverse risk factors, biomarkers, and clinical parameters, thereby providing a more comprehensive assessment of an individual’s risk profile [13]. The use of these algorithms helps not only with predicting the risk of CVD but also in identifying the key contributing factors and their complex interactions, which can be instrumental in tailoring personalized prevention and treatment strategies [14]. Recent research has also emphasized the importance of integrating data from various sources, including clinical records, medical imaging, and genomic information. Metaheuristic algorithms can aid in fusing these heterogeneous data sources to build more holistic and accurate CVD risk prediction models [15]. Furthermore, considering the rapid advancements in medical technology and the increasing availability of health-related data, the application of metaheuristic algorithms in CVD risk assessment is expected to evolve. This evolution includes the incorporation of deep learning and hybrid models that combine the strengths of different metaheuristic techniques to enhance prediction accuracy and robustness [16].
Bio-inspired swarm intelligence algorithms take inspiration from social behaviors observed in nature to robustly solve complex optimization tasks. Research in this area is growing rapidly, with an increasing number of scholars combining machine learning techniques with swarm intelligence optimizers to improve the efficacy and efficiency of machine learning methods [17]. Every year, numerous swarm intelligence algorithms are introduced to address a wide range of optimization problems, including Particle Swarm Optimization (PSO), originally introduced by Kennedy and Eberhart [18], Genetic Algorithms (GAs) [19], the Salp Swarm Algorithm (SSA) [20], Artificial Bee Colony (ABC) [21], Ant Colony Optimization (ACO) [22], Differential Evolution (DE) [23], and others. Further examples rooted in physics or chemistry include Black Holes (BHs) [24], Thermal Exchange Optimization (TEO) [25], and Chemical Reaction Optimization (CRO) [26]. More sophisticated examples draw on human social behaviors, such as Teaching–Learning-Based Optimization (TLBO) [27] and the Imperialist Competitive Algorithm (ICA) [28], or on music, such as Melody Search [29].
The Salp Swarm Algorithm (SSA) constitutes a notable advancement in bio-inspired swarm intelligence. Inspired by the collective behavior of salps, a type of marine organism, the SSA has garnered considerable attention within the academic community. The algorithm addresses complex optimization problems with a population of virtual salps that emulate the biological characteristics and interactions observed in their natural counterparts, and it leverages these principles to guide the search process toward optimal solutions across a spectrum of domains. Its distinctive features, including an adaptive control coefficient and the capacity to cope with dynamic environments, make it a compelling subject of investigation and application in various scientific disciplines [7][30][31][32][33]. Nonetheless, the standard SSA faces limitations that hinder its search abilities. Like many population-based metaheuristics, it can become stuck in local optima and exhibit slow convergence [34]. This is due in part to the lack of fine-grained neighborhood information in the salps' movement updates, which focus exploration globally without sufficient local refinement [35]. Recent studies have also shown that the SSA's performance degrades on highly multimodal problems with many local optima, where it tends to converge prematurely to suboptimal solutions [36]. Additionally, its fixed control parameters favor exploitation but constrain exploration over time, restricting flexibility on diverse problem landscapes [37]. To address these drawbacks, localized adaptations that guide salps towards high-quality neighboring solutions have been explored as an effective means of balancing exploration and exploitation [35].
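The leader/follower mechanics of the SSA can be sketched directly from the original description by Mirjalili et al. [20]: leaders sample around the best solution found so far (the "food source") with a coefficient c1 that decays over iterations, while each follower averages its position with its predecessor in the chain. The sketch below is a minimal illustration under those equations, not a tuned implementation; the test function and population sizes are arbitrary choices.

```python
import numpy as np

def salp_swarm(fitness, lb, ub, n_salps=30, n_iter=200, seed=0):
    """Minimal Salp Swarm Algorithm sketch (after Mirjalili et al., 2017)."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = len(lb)
    pos = rng.uniform(lb, ub, size=(n_salps, dim))       # initial salp chain
    fits = np.array([fitness(p) for p in pos])
    best = fits.argmin()
    food, food_fit = pos[best].copy(), fits[best]        # best solution so far

    for l in range(1, n_iter + 1):
        # c1 decays over iterations, shifting from exploration to exploitation.
        c1 = 2.0 * np.exp(-((4.0 * l / n_iter) ** 2))
        for i in range(n_salps):
            if i < n_salps // 2:                         # leaders sample around the food
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                pos[i] = np.where(c3 >= 0.5, food + step, food - step)
            else:                                        # followers average with predecessor
                pos[i] = 0.5 * (pos[i] + pos[i - 1])
        pos = np.clip(pos, lb, ub)
        fits = np.array([fitness(p) for p in pos])
        if fits.min() < food_fit:
            food_fit = fits.min()
            food = pos[fits.argmin()].copy()
    return food, food_fit

# Usage: minimize the 2-D sphere function over [-5, 5]^2.
best_pos, best_fit = salp_swarm(lambda x: float((x ** 2).sum()),
                                lb=[-5.0, -5.0], ub=[5.0, 5.0])
```

Note how the shortcomings discussed above are visible in the sketch: the leader update uses only the single global food source (no neighborhood information), and the c1 schedule is fixed rather than adaptive.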
To address these drawbacks, in the past few years, multiple modified forms of the SSA have been developed by researchers with the aim of enhancing its performance, rectifying its shortcomings, and augmenting its capabilities.

2. Salp Swarm Algorithm and Its Variants

Algorithms inspired by nature have attracted the interest of researchers in many fields because they can be applied to a wide variety of problems. At the same time, the No Free Lunch Theorem states that no single optimizer performs best across all optimization problems [38], so creating novel optimization algorithms suited to the requirements of practical applications remains a challenging task. As a result, interest has risen in developing new optimization techniques by combining existing, simpler metaheuristics. Hybridization is gaining popularity because it merges the best features of different algorithms into one, yielding new methods with improved efficiency and accuracy. Since its introduction in 2017, the Salp Swarm Algorithm (SSA) has been improved iteratively through many adjustments proposed by researchers, chosen to strengthen the algorithm's flexibility and efficiency across a wide range of problem domains by targeting specific optimization issues. In [39], the performance of the conventional SSA was improved by using the operators of the Differential Evolution (DE) algorithm to avoid becoming stuck in local optima and to hasten convergence to the global optimum. The SSAGWO approach [31] adjusts the positions of salp followers using the Grey Wolf Optimizer (GWO) search strategy. Differential Evolution and the SSA are integrated in [40] to improve accuracy and convergence rate. El-Shorbagy et al. improved the SSA by utilizing a collection of chaotic functions to expand the algorithm's capacity to search across wide regions for the best solutions [41].
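The chaotic-function idea used by El-Shorbagy et al. [41] can be illustrated with the logistic map, one common choice in chaotic variants of the SSA; the specific set of maps used in [41] is not reproduced here, and the helper names below are assumptions for illustration. The map's values, scaled into the search bounds, replace uniform random draws when spreading the initial salp chain.

```python
def logistic_map_sequence(n, x0=0.7, r=4.0):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k).

    For r = 4 and a non-fixed-point seed in (0, 1), the sequence is
    chaotic and stays inside [0, 1].
    """
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def chaotic_init(n_salps, dim, lb, ub, x0=0.7):
    """Spread an initial salp chain over [lb, ub] using logistic-map values
    instead of uniform random numbers (hypothetical helper for illustration)."""
    seq = logistic_map_sequence(n_salps * dim, x0=x0)
    return [[lb + (ub - lb) * seq[i * dim + j] for j in range(dim)]
            for i in range(n_salps)]

chain = chaotic_init(n_salps=5, dim=3, lb=-5.0, ub=5.0)
```

The appeal of chaotic sequences here is that they are deterministic yet non-repeating and cover the unit interval densely, which the cited work uses to widen the regions the algorithm searches.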
QSSALEO [30] is a hybrid that improves the salp swarm by combining a Local Escape Operator (LEO) with Quadratic Interpolation (QI): the QI technique enhances the algorithm's exploitation capacity and the precision of the optimal solution, while the LEO is applied at the QI-refined best search agent to escape local optima. TBLSBCL [32] is a search technique that addresses population diversity, imbalanced exploitation and exploration, and premature convergence in the SSA. Its hybridization process has two stages. In the first stage, a time-based dynamic leadership structure represents the hierarchy of leaders and followers in the SSA: the number of leaders increases and the number of followers decreases linearly, and the leaders' positions are updated using the SSA's effective exploitation capabilities. In the second stage, a competitive learning strategy revises the followers' positions by allowing them to learn from the leading members. SSALEO [33] tackles the same issues of population diversity, unequal exploration and exploitation, and premature convergence; its Local Escape Operator streamlines the search procedure of the Salp Swarm Algorithm and boosts the local search effectiveness of the swarm members. The SSA has been widely applied across many domains due to its notable optimization potential. Table 1 presents a comprehensive compilation of modifications and hybridizations of the SSA. In the majority of these works, the original SSA was suitably enhanced prior to implementation; it was uncommon for the unmodified SSA to be applied directly.
The No Free Lunch (NFL) theorem underlines this observation and serves as an incentive for researchers to keep enhancing the SSA.
Table 1. SSA modifications and hybridizations.
An analysis of the published material leads to the conclusion that a significant body of work is devoted to the Salp Swarm Algorithm. Although the SSA has proven its efficacy in a variety of contexts by outperforming conventional optimizers, it is not immune to local optima. In response, numerous distinct variants of the SSA have been devised; these versions modify its mechanisms to improve the rate of convergence, strike a balance between exploration and exploitation, and steer clear of locally optimal solutions. Moreover, diversity-preserving techniques are used in complex optimization algorithms to increase search quality and counter the loss of diversity caused by genetic drift in bio-inspired algorithms, thereby maximizing the algorithms' potential.

References

  1. Ward, A.; Sarraju, A.; Chung, S.; Li, J.; Harrington, R.; Heidenreich, P.; Palaniappan, L.; Scheinker, D.; Rodriguez, F. Machine Learning and Atherosclerotic Cardiovascular Disease Risk Prediction in a Multi-Ethnic Population. npj Digit. Med. 2020, 3, 125.
  2. Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016.
  3. Ke, G.; Meng, Q.; Finley, T.; Wang, T.; Chen, W.; Ma, W.; Ye, Q.; Liu, T.-Y. Lightgbm: A Highly Efficient Gradient Boosting Decision Tree. In Proceedings of the Advances in Neural Information Processing Systems 30 (NIPS 2017), Long Beach, CA, USA, 4–9 December 2017.
  4. Probst, P.; Boulesteix, A.; Bischl, B. Tunability: Importance of Hyperparameters of Machine Learning Algorithms. J. Mach. Learn. Res. 2019, 20, 1934–1965.
  5. Bergstra, J.; Bengio, Y. Random Search for Hyper-Parameter Optimization. J. Mach. Learn. Res. 2012, 13, 281–305.
  6. Qaraad, M.; Amjad, S.; Hussein, N.K.; Farag, M.A.; Mirjalili, S.; Elhosseini, M.A. Quadratic Interpolation and a New Local Search Approach to Improve Particle Swarm Optimization: Solar Photovoltaic Parameter Estimation. Expert Syst. Appl. 2024, 236, 121417.
  7. Qaraad, M.; Aljadania, A.; Elhosseini, M. Large-Scale Competitive Learning-Based Salp Swarm for Global Optimization and Solving Constrained Mechanical and Engineering Design Problems. Mathematics 2023, 11, 1362.
  8. Qaraad, M.; Amjad, S.; Hussein, N.K.; Badawy, M.; Mirjalili, S.; Elhosseini, M.A. Photovoltaic Parameter Estimation Using Improved Moth Flame Algorithms with Local Escape Operators. Comput. Electr. Eng. 2023, 106, 108603.
  9. Qaraad, M.; Amjad, S.; Hussein, N.K.; Mirjalili, S.; Elhosseini, M.A. An Innovative Time-Varying Particle Swarm-Based Salp Algorithm for Intrusion Detection System and Large-Scale Global Optimization Problems. Artif. Intell. Rev. 2022, 56, 8325–8392.
  10. Ojha, V.K.; Abraham, A.; Snášel, V. Metaheuristic Design of Feedforward Neural Networks: A Review of Two Decades of Research. Eng. Appl. Artif. Intell. 2017, 60, 97–116.
  11. Akyol, S.; Alatas, B. Plant Intelligence Based Metaheuristic Optimization Algorithms. Artif. Intell. Rev. 2017, 47, 417–462.
  12. Sharma, S.; Kumar, V. Application of Genetic Algorithms in Healthcare: A Review. Stud. Comput. Intell. 2022, 1039, 75–86.
  13. Kumar, S.; Sahoo, G. A Random Forest Classifier Based on Genetic Algorithm for Cardiovascular Diseases Diagnosis (RESEARCH NOTE). Int. J. Eng. 2017, 30, 1723–1729.
  14. Amma, N.G.B. Cardiovascular Disease Prediction System Using Genetic Algorithm and Neural Network. In Proceedings of the 2012 International Conference on Computing, Communication and Applications, Dindigul, India, 22–24 February 2012.
  15. Ay, Ş.; Ekinci, E.; Garip, Z. A Comparative Analysis of Meta-Heuristic Optimization Algorithms for Feature Selection on ML-Based Classification of Heart-Related Diseases. J. Supercomput. 2023, 79, 11797–11826.
  16. Sheeba, P.T.; Roy, D.; Syed, M.H. A Metaheuristic-Enabled Training System for Ensemble Classification Technique for Heart Disease Prediction. Adv. Eng. Softw. 2022, 174, 103297.
  17. Tharwat, A.; Schenck, W. A Conceptual and Practical Comparison of PSO-Style Optimization Algorithms. Expert Syst. Appl. 2021, 167, 114430.
  18. Okwu, M.O.; Tartibu, L.K. Particle Swarm Optimisation. Stud. Comput. Intell. 2021, 927, 5–13.
  19. Tang, K.S.; Man, K.F.; Kwong, S.; He, Q. Genetic Algorithms and Their Applications. IEEE Signal Process. Mag. 1996, 13, 22–37.
  20. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A Bio-Inspired Optimizer for Engineering Design Problems. Adv. Eng. Softw. 2017, 114, 163–191.
  21. Karaboga, D.; Basturk, B. A Powerful and Efficient Algorithm for Numerical Function Optimization: Artificial Bee Colony (ABC) Algorithm. J. Glob. Optim. 2007, 39, 459–471.
  22. Dorigo, M.; Birattari, M. Ant Colony Optimization. IEEE Comput. Intell. Mag. 2006, 1, 28–39.
  23. Grandgirard, J.; Poinsot, D.; Krespi, L.; Nénon, J.P.; Cortesero, A.M. Costs of Secondary Parasitism in the Facultative Hyperparasitoid Pachycrepoideus Dubius: Does Host Size Matter? Entomol. Exp. Appl. 2002, 103, 239–248.
  24. Hatamlou, A. Black Hole: A New Heuristic Optimization Approach for Data Clustering. Inf. Sci. 2013, 222, 175–184.
  25. Kaveh, A.; Dadras, A. A Novel Meta-Heuristic Optimization Algorithm: Thermal Exchange Optimization. Adv. Eng. Softw. 2017, 110, 69–84.
  26. Lam, A.Y.S.; Member, S.; Li, V.O.K. Chemical-Reaction-Inspired Metaheuristic for Optimization. IEEE Trans. Evol. Comput. 2009, 14, 381–399.
  27. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching–Learning-Based Optimization: A Novel Method for Constrained Mechanical Design Optimization Problems. Comput. Des. 2011, 43, 303–315.
  28. Abdollahi, M.; Isazadeh, A.; Abdollahi, D. Imperialist Competitive Algorithm for Solving Systems of Nonlinear Equations. Comput. Math. Appl. 2013, 65, 1894–1908.
  29. Ashrafi, S.M.; Dariane, A.B. A Novel and Effective Algorithm for Numerical Optimization: Melody Search (MS). In Proceedings of the 2011 11th International Conference on Hybrid Intelligent Systems (HIS), Melacca, Malaysia, 5–8 December 2011; pp. 109–114.
  30. Qaraad, M.; Amjad, S.; Hussein, N.K.; Elhosseini, M.A. An Innovative Quadratic Interpolation Salp Swarm-Based Local Escape Operator for Large-Scale Global Optimization Problems and Feature Selection. Neural Comput. Appl. 2022, 34, 17663–17721.
  31. Qaraad, M.; Amjad, S.; Hussein, N.K.; Elhosseini, M.A. Large Scale Salp-Based Grey Wolf Optimization for Feature Selection and Global Optimization. Neural Comput. Appl. 2022, 34, 8989–9014.
  32. Qaraad, M.; Amjad, S.; Hussein, N.K.; Elhosseini, M.A. Addressing Constrained Engineering Problems and Feature Selection with a Time-Based Leadership Salp-Based Algorithm with Competitive Learning. J. Comput. Des. Eng. 2022, 9, 2235–2270.
  33. Qaraad, M.; Amjad, S.; Hussein, N.K.; Mirjalili, S.; Halima, N.B.; Elhosseini, M.A. Comparing SSALEO as a Scalable Large Scale Global Optimization Algorithm to High-Performance Algorithms for Real-World Constrained Optimization Benchmark. IEEE Access 2022, 10, 95658–95700.
  34. Chen, J.; Luo, Q.; Zhou, Y.; Huang, H. Firefighting Multi Strategy Marine Predators Algorithm for the Early-Stage Forest Fire Rescue Problem. Appl. Intell. 2023, 53, 15496–15515.
  35. Abualigah, L.; Shehab, M.; Alshinwan, M.; Alabool, H. Salp Swarm Algorithm: A Comprehensive Survey. Neural Comput. Appl. 2020, 32, 11195–11215.
  36. Cuevas, E. An Optimization Algorithm Inspired by the States of Matter That Improves the Balance between Exploration and Exploitation. Appl. Intell. 2014, 40, 256–272.
  37. Castelli, M.; Manzoni, L.; Mariot, L.; Nobile, M.S.; Tangherloni, A. Salp Swarm Optimization: A Critical Review. Expert Syst. Appl. 2022, 189, 116029.
  38. Wolpert, D.H.; Macready, W.G. No Free Lunch Theorems for Optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82.
  39. Fathi, R.; Tousi, B.; Galvani, S. Allocation of Renewable Resources with Radial Distribution Network Reconfiguration Using Improved Salp Swarm Algorithm. Appl. Soft Comput. 2023, 132, 109828.
  40. Zhang, H.; Liu, T.; Ye, X.; Asghar, A.; Guoxi, H.; Huiling, L.; Zhifang, C. Differential Evolution-Assisted Salp Swarm Algorithm with Chaotic Structure for Real-World Problems; Springer: London, UK, 2022.
  41. El-Shorbagy, M.A.; Eldesoky, I.M.; Basyouni, M.M.; Nassar, I.; El-Refaey, A.M. Chaotic Search-Based Salp Swarm Algorithm for Dealing with System of Nonlinear Equations and Power System Applications. Mathematics 2022, 10, 1368.
  42. Nautiyal, B.; Prakash, R.; Vimal, V.; Liang, G.; Chen, H. Improved Salp Swarm Algorithm with Mutation Schemes for Solving Global Optimization and Engineering Problems. Eng. Comput. 2021, 38, 3927–3949.
  43. Kansal, V.; Dhillon, J.S. Emended Salp Swarm Algorithm for Multiobjective Electric Power Dispatch Problem. Appl. Soft Comput. 2020, 90, 106172.
  44. Zhang, H.; Wang, Z.; Chen, W.; Heidari, A.A.; Wang, M.; Zhao, X.; Liang, G.; Chen, H.; Zhang, X. Ensemble Mutation-Driven Salp Swarm Algorithm with Restart Mechanism: Framework and Fundamental Analysis. Expert Syst. Appl. 2021, 165, 113897.
  45. Wang, C.; Xu, R.-Q.; Ma, L.; Zhao, J.; Wang, L.; Xie, N.-G.; Cheong, K.H. An Efficient Salp Swarm Algorithm Based on Scale-Free Informed Followers with Self-Adaption Weight. Appl. Intell. 2023, 53, 1759–1791.
  46. Ren, H.; Li, J.; Chen, H.; Li, C.Y. Adaptive Levy-Assisted Salp Swarm Algorithm: Analysis and Optimization Case Studies. Math. Comput. Simul. 2021, 181, 380–409.
  47. Tawhid, M.A.; Ibrahim, A.M. Improved Salp Swarm Algorithm Combined with Chaos. Math. Comput. Simul. 2022, 202, 113–148.
  48. Zhang, X.; Wang, S.; Zhao, K.; Wang, Y. A Salp Swarm Algorithm Based on Harris Eagle Foraging Strategy. Math. Comput. Simul. 2023, 203, 858–877.
  49. Neggaz, N.; Ewees, A.A.; Elaziz, M.A.; Mafarja, M. Boosting Salp Swarm Algorithm by Sine Cosine Algorithm and Disrupt Operator for Feature Selection. Expert Syst. Appl. 2020, 145, 113103.
  50. Si, T.; Miranda, P.B.C.; Bhattacharya, D. Novel Enhanced Salp Swarm Algorithms Using Opposition-Based Learning Schemes for Global Optimization Problems. Expert Syst. Appl. 2022, 207, 117961.
  51. Abbassi, A.; Abbassi, R.; Heidari, A.A.; Oliva, D.; Chen, H.; Habib, A.; Jemli, M.; Wang, M. Parameters Identification of Photovoltaic Cell Models Using Enhanced Exploratory Salp Chains-Based Approach. Energy 2020, 198, 117333.
  52. Gupta, S.; Deep, K.; Heidari, A.A.; Moayedi, H.; Chen, H. Harmonized Salp Chain-Built Optimization. Eng. Comput. 2021, 37, 1049–1079.