Alsayyed, O.; Hamadneh, T.; Al-Tarawneh, H.; Alqudah, M.; Gochhait, S.; Leonova, I.; Malik, O.P.; Dehghani, M. Giant Armadillo Optimization. Encyclopedia. Available online: (accessed on 19 April 2024).
Giant Armadillo Optimization

A new bio-inspired metaheuristic algorithm called Giant Armadillo Optimization (GAO) is introduced, which imitates the natural behavior of giant armadillos in the wild. The fundamental inspiration for the design of GAO is the hunting strategy of giant armadillos: moving toward prey positions and digging into termite mounds.

optimization; bio-inspired; metaheuristic; giant armadillo

1. Introduction

There are many problems in mathematics, science, and real-world applications that have more than one feasible solution. Such problems are known as optimization problems, and the process of finding the best feasible solution among all existing solutions is called optimization [1]. Every optimization problem is mathematically modeled using three main parts: decision variables, problem constraints, and an objective function. The goal of optimization is to assign values to the decision variables such that the objective function is optimized while the problem constraints are respected [2]. Numerous optimization problems in science, mathematics, engineering, technology, industry, and real-world applications must be solved using optimization techniques. Techniques for solving optimization problems fall into two classes: deterministic and stochastic approaches [3]. Deterministic approaches, divided into gradient-based and non-gradient-based categories, are effective for linear, convex, continuous, differentiable, and low-dimensional problems [4]. However, as optimization problems become more complex, and especially as their dimensionality grows, deterministic approaches lose effectiveness and become trapped in local optima [5]. This is a serious limitation, because many practical optimization problems are non-linear, non-convex, non-differentiable, non-continuous, and high-dimensional. These shortcomings of deterministic approaches in solving practical optimization problems have driven researchers to design stochastic approaches [6].
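To make the three-part model concrete, here is a minimal sketch of an optimization problem in Python. The sphere objective, dimensionality, and bound values are illustrative choices, not part of the original text:

```python
import numpy as np

# Illustrative optimization problem with the three parts named above:
# decision variables x, bound constraints [lb, ub], and an objective function.
def objective(x):
    """Sphere function: minimized at x = 0, where its value is 0."""
    return float(np.sum(np.asarray(x) ** 2))

dim = 5                      # number of decision variables
lb = np.full(dim, -10.0)     # lower bounds (problem constraints)
ub = np.full(dim, 10.0)      # upper bounds

print(objective(np.zeros(dim)))   # -> 0.0 at the global optimum
```

Optimization then amounts to searching the box [lb, ub] for the x that minimizes `objective`.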
Metaheuristic algorithms are among the most efficient and well-known stochastic approaches for dealing with optimization problems. They provide suitable solutions through random search of the problem-solving space, relying on random operators and trial-and-error processes. The optimization mechanism in a metaheuristic algorithm starts with the random generation of a set of candidate solutions, known as the algorithm's population. These candidate solutions are then improved over successive iterations according to the algorithm's population-update steps. Once the algorithm has run to completion, the best candidate solution found is returned as the solution to the problem [7]. Because of the stochastic nature of the search, metaheuristic algorithms offer no guarantee of reaching the global optimum. However, because they typically lie close to the global optimum, the solutions obtained are acceptable as pseudo-optimal [8]. Researchers' desire to obtain solutions ever closer to the global optimum has led to the design of numerous metaheuristic algorithms [9]. These algorithms have been used to tackle optimization problems in various fields, such as static optimization problems [10], green product design [11], feature selection [12], design for disassembly [13], image segmentation [14], and wireless sensor network applications [15].
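The generic mechanism described above (random initialization, iterative population updates, best candidate returned at the end) can be sketched as follows. The particular update rule here, a random step toward the current best solution with greedy acceptance, is a common illustrative choice for population-based algorithms, not the GAO formulation:

```python
import numpy as np

def metaheuristic_sketch(objective, lb, ub, pop_size=30, iters=200, seed=0):
    """Generic population-based metaheuristic skeleton (illustrative only)."""
    rng = np.random.default_rng(seed)
    dim = lb.size
    # Step 1: random generation of the algorithm's population in [lb, ub].
    pop = lb + rng.random((pop_size, dim)) * (ub - lb)
    fitness = np.array([objective(x) for x in pop])
    best = pop[fitness.argmin()].copy()
    # Step 2: improve candidates over successive iterations.
    for _ in range(iters):
        for i in range(pop_size):
            # Random step toward the best member found so far.
            step = rng.random(dim) * (best - rng.integers(1, 3) * pop[i])
            cand = np.clip(pop[i] + step, lb, ub)
            f = objective(cand)
            if f < fitness[i]:            # greedy acceptance of improvements
                pop[i], fitness[i] = cand, f
        best = pop[fitness.argmin()].copy()
    # Step 3: return the best candidate found as the solution.
    return best, float(fitness.min())

lb, ub = np.full(5, -10.0), np.full(5, 10.0)
best, fbest = metaheuristic_sketch(lambda x: float(np.sum(x ** 2)), lb, ub)
print(fbest)   # small value near 0 for this convex test function
```

On a convex test function such as the sphere, this skeleton converges close to the optimum; on multimodal problems, the quality of the result depends on the balance between exploration and exploitation discussed next.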
Metaheuristic algorithms can achieve effective solutions for optimization problems only when they search the problem-solving space well at both the global and local levels. Global search reflects the exploration power of the algorithm: an extensive sweep of the problem-solving space aimed at discovering the region containing the global optimum and preventing the algorithm from getting stuck in local optima. Local search reflects the exploitation power of the algorithm: a precise search near the promising regions of the problem-solving space and the solutions already discovered. Beyond the exploration and exploitation abilities themselves, what makes a metaheuristic algorithm succeed in providing a suitable solution is the balance it strikes between the two during the search process [16].
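One simple way many algorithms balance the two phases is to shrink the permissible step size as iterations progress, so early moves are large and global (exploration) while late moves are small and local (exploitation). The linear decay below is an illustrative assumption, not a rule taken from any specific algorithm in this survey:

```python
import numpy as np

def step_radius(t, max_iters, lb, ub):
    """Search radius decays linearly from the full variable range to zero,
    shifting the search from exploration (early) to exploitation (late)."""
    return (1.0 - t / max_iters) * (ub - lb)

lb, ub = np.full(2, -10.0), np.full(2, 10.0)
print(step_radius(0, 100, lb, ub))    # full range: pure exploration
print(step_radius(90, 100, lb, ub))   # ~10% of the range: exploitation
```

Nonlinear schedules (exponential, cosine) are also common; the key property is only that the radius is large early and small late.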

2. Giant Armadillo Optimization

Metaheuristic algorithms have been developed with inspiration from natural phenomena, the behaviors of living organisms in the wild, genetics, biology, physics, game rules, human interactions, and other evolutionary processes. Based on the main idea behind their design, metaheuristic algorithms are classified into five groups: swarm-based, evolutionary-based, physics-based, human-based, and game-based approaches.
Swarm-based metaheuristic algorithms are inspired by the lifestyles of animals, birds, insects, aquatic animals, reptiles, and other living creatures in the wild. The most well-known algorithms in this group are Particle Swarm Optimization (PSO) [17], Ant Colony Optimization (ACO) [18], Artificial Bee Colony (ABC) [19], and the Firefly Algorithm (FA) [20]. PSO is inspired by the group movement of flocks of birds and schools of fish toward food sources. ACO is inspired by the ability of ants to discover the optimal path between the colony and a food source. ABC is inspired by the foraging activities of bee colonies. FA is inspired by the optical communication between fireflies. The Grey Wolf Optimizer (GWO) is a swarm-based metaheuristic algorithm inspired by the hierarchical leadership structure and social hunting behavior of gray wolves [21]. Green Anaconda Optimization (GAO) is inspired by the ability of male green anacondas to detect the position of females during the mating season, as well as by the anacondas' hunting strategy [22]. Among the natural behaviors of living organisms in the wild, foraging, hunting, digging, migration, and chasing are the most prominent and have been employed in the design of algorithms such as the Honey Badger Algorithm (HBA) [23], African Vultures Optimization Algorithm (AVOA), Whale Optimization Algorithm (WOA) [24], Orca Predation Algorithm (OPA) [25], Reptile Search Algorithm (RSA) [26], Kookaburra Optimization Algorithm (KOA) [27], Mantis Search Algorithm (MSA) [28], Liver Cancer Algorithm (LCA) [29], Marine Predators Algorithm (MPA) [30], Tunicate Swarm Algorithm (TSA) [31], White Shark Optimizer (WSO) [32], and Golden Jackal Optimization (GJO) [33].
Evolutionary-based metaheuristic algorithms are designed with inspiration from genetics, biology, the concepts of natural selection and survival of the fittest, Darwin's theory of evolution, and evolutionary operators. The Genetic Algorithm (GA) [34] and Differential Evolution (DE) [35] are the most famous algorithms in this group; both were developed with inspiration from the reproduction process, genetic and biological concepts, and the evolutionary random operators of selection, crossover, and mutation. Artificial Immune Systems (AISs) are inspired by the mechanisms of the human immune system against microbes and diseases [36]. Other evolutionary-based metaheuristic algorithms include Genetic Programming (GP) [37], the Cultural Algorithm (CA) [38], and Evolution Strategies (ES) [39].
Physics-based metaheuristic algorithms are designed with inspiration from the phenomena, forces, transformations, laws, and concepts of physics. Simulated Annealing (SA), one of the most widely used algorithms in this group, is inspired by the annealing of metals, in which a metal is first melted under heat and then slowly cooled with the aim of achieving an ideal crystal. Physical forces and Newton's laws of motion have driven the design of algorithms such as the Momentum Search Algorithm (MSA) [40], based on momentum; the Gravitational Search Algorithm (GSA) [41], based on gravitational attraction; and the Spring Search Algorithm (SSA) [42], based on the elastic force of springs and Hooke's law. Cosmological concepts underlie algorithms such as the Multi-Verse Optimizer (MVO) [43] and the Black Hole Algorithm (BHA) [44]. Other physics-based metaheuristic algorithms include the Archimedes Optimization Algorithm (AOA) [45], Water Cycle Algorithm (WCA) [46], Artificial Chemical Process (ACP) [47], Chemotherapy Science Algorithm (CSA) [48], Nuclear Reaction Optimization (NRO) [49], Henry Gas Optimization (HGO) [50], Electro-Magnetism Optimization (EMO) [51], Lichtenberg Algorithm (LA) [52], Thermal Exchange Optimization (TEO) [53], and Equilibrium Optimizer (EO) [54].
Human-based metaheuristic algorithms are designed with inspiration from the thoughts, choices, decisions, communication, interactions, and other activities of humans in individual and social life. Teaching-Learning-Based Optimization (TLBO), one of the most famous human-based metaheuristic algorithms, is inspired by educational interaction in the classroom: the exchange of knowledge between the teacher and students and among the students themselves [55]. The Mother Optimization Algorithm (MOA) is based on modeling a mother's (Eshrat's) care for her children [56]. Doctor and Patient Optimization (DPO) models the process of doctors treating patients [57]. Sewing Training-Based Optimization (STBO) is inspired by instructors teaching sewing skills to students in sewing schools [58]. Ali Baba and the Forty Thieves (AFT) models the strategies of the forty thieves in their search for Ali Baba [59]. Other human-based metaheuristic algorithms include the Election-Based Optimization Algorithm (EBOA) [60], Coronavirus Herd Immunity Optimizer (CHIO) [61], Group Teaching Optimization Algorithm (GTOA) [62], Ebola Optimization Search Algorithm (EOSA) [63], Driving Training-Based Optimization (DTBO) [5], Gaining-Sharing Knowledge-Based Algorithm (GSK) [64], and War Strategy Optimization (WSO) [65].
Game-based metaheuristic algorithms are inspired by the rules governing individual and team games and by the strategies of players, coaches, referees, and other people who influence these games. The Darts Game Optimizer (DGO), one of the most well-known game-based metaheuristic algorithms, is inspired by the strategy and skill of players in throwing darts to collect points [66]. The Hide Object Game Optimizer (HOGO) simulates players' strategies for finding a hidden object on the playing field [67]. The Orientation Search Algorithm (OSA) models how players change position on the playing field in response to the referee's commands [68]. Other game-based metaheuristic algorithms include Ring Toss Game-Based Optimization (RTGBO) [69], Football Game Based Optimization (FGBO) [70], the Archery Algorithm (AA) [6], the Golf Optimization Algorithm (GOA) [71], and Volleyball Premier League (VPL) [72].
Some other recently proposed metaheuristic algorithms are: Monarch Butterfly Optimization (MBO) [73], Slime Mould Algorithm (SMA) [74], Moth Search Algorithm (MSA) [75], Hunger Games Search (HGS) [76], Runge Kutta method (RUN) [77], Colony Predation Algorithm (CPA) [78], weighted mean of vectors (INFO) [79], Harris Hawks Optimization (HHO) [80], and Rime optimization algorithm (RIME) [81].


  1. Zhao, S.; Zhang, T.; Ma, S.; Chen, M. Dandelion Optimizer: A nature-inspired metaheuristic algorithm for engineering applications. Eng. Appl. Artif. Intell. 2022, 114, 105075.
  2. Sergeyev, Y.D.; Kvasov, D.; Mukhametzhanov, M. On the efficiency of nature-inspired metaheuristics in expensive global optimization with limited budget. Sci. Rep. 2018, 8, 1–9.
  3. Liberti, L.; Kucherenko, S. Comparison of deterministic and stochastic approaches to global optimization. Int. Trans. Oper. Res. 2005, 12, 263–285.
  4. Alshanti, W.G.; Batiha, I.M.; Hammad, M.M.A.; Khalil, R. A novel analytical approach for solving partial differential equations via a tensor product theory of Banach spaces. Partial Differ. Equ. Appl. Math. 2023, 8, 100531.
  5. Dehghani, M.; Trojovská, E.; Trojovský, P. A new human-based metaheuristic algorithm for solving optimization problems on the base of simulation of driving training process. Sci. Rep. 2022, 12, 9924.
  6. Zeidabadi, F.-A.; Dehghani, M.; Trojovský, P.; Hubálovský, Š.; Leiva, V.; Dhiman, G. Archery Algorithm: A Novel Stochastic Optimization Algorithm for Solving Optimization Problems. Comput. Mater. Contin. 2022, 72, 399–416.
  7. de Armas, J.; Lalla-Ruiz, E.; Tilahun, S.L.; Voß, S. Similarity in metaheuristics: A gentle step towards a comparison methodology. Nat. Comput. 2022, 21, 265–287.
  8. Dehghani, M.; Montazeri, Z.; Dehghani, A.; Malik, O.P.; Morales-Menendez, R.; Dhiman, G.; Nouri, N.; Ehsanifar, A.; Guerrero, J.M.; Ramirez-Mendoza, R.A. Binary spring search algorithm for solving various optimization problems. Appl. Sci. 2021, 11, 1286.
  9. Jakšić, Z.; Devi, S.; Jakšić, O.; Guha, K. A Comprehensive Review of Bio-Inspired Optimization Algorithms Including Applications in Microelectronics and Nanophotonics. Biomimetics 2023, 8, 278.
  10. Gölcük, İ.; Ozsoydan, F.B. Q-learning and hyper-heuristic based algorithm recommendation for changing environments. Eng. Appl. Artif. Intell. 2021, 102, 104284.
  11. Huang, Z.; Zhang, H.; Wang, D.; Yu, H.; Wang, L.; Yu, D.; Peng, Y. Preference-based multi-attribute decision-making method with spherical-Z fuzzy sets for green product design. Eng. Appl. Artif. Intell. 2023, 126, 106767.
  12. Gharehchopogh, F.S.; Maleki, I.; Dizaji, Z.A. Chaotic vortex search algorithm: Metaheuristic algorithm for feature selection. Evol. Intell. 2022, 15, 1777–1808.
  13. Zhang, H.; Huang, Z.; Tian, G.; Wang, W.; Li, Z. A Hybrid QFD-Based Human-Centric Decision Making Approach of Disassembly Schemes Under Interval 2-Tuple q-Rung Orthopair Fuzzy Sets. IEEE Trans. Autom. Sci. Eng. 2023, 1–12.
  14. Dinkar, S.K.; Deep, K.; Mirjalili, S.; Thapliyal, S. Opposition-based Laplacian Equilibrium Optimizer with application in Image Segmentation using Multilevel Thresholding. Expert Syst. Appl. 2021, 174, 114766.
  15. Mohar, S.S.; Goyal, S.; Kaur, R. Localization of sensor nodes in wireless sensor networks using bat optimization algorithm with enhanced exploration and exploitation characteristics. J. Supercomput. 2022, 78, 11975–12023.
  16. Trojovská, E.; Dehghani, M.; Trojovský, P. Zebra Optimization Algorithm: A New Bio-Inspired Optimization Algorithm for Solving Optimization Algorithm. IEEE Access 2022, 10, 49445–49473.
  17. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; IEEE: Perth, WA, Australia, 1995; Volume 4, pp. 1942–1948.
  18. Dorigo, M.; Maniezzo, V.; Colorni, A. Ant system: Optimization by a colony of cooperating agents. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 1996, 26, 29–41.
  19. Karaboga, D.; Basturk, B. Artificial bee colony (ABC) optimization algorithm for solving constrained optimization problems. In Proceedings of the International Fuzzy Systems Association World Congress, Cancun, Mexico, 18–21 June 2007; Springer: Berlin/Heidelberg, Germany, 2007; pp. 789–798.
  20. Yang, X.-S. Firefly algorithm, stochastic test functions and design optimisation. Int. J. Bio-Inspired Comput. 2010, 2, 78–84.
  21. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
  22. Dehghani, M.; Trojovský, P.; Malik, O.P. Green Anaconda Optimization: A New Bio-Inspired Metaheuristic Algorithm for Solving Optimization Problems. Biomimetics 2023, 8, 121.
  23. Hashim, F.A.; Houssein, E.H.; Hussain, K.; Mabrouk, M.S.; Al-Atabany, W. Honey Badger Algorithm: New metaheuristic algorithm for solving optimization problems. Math. Comput. Simul. 2022, 192, 84–110.
  24. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
  25. Jiang, Y.; Wu, Q.; Zhu, S.; Zhang, L. Orca predation algorithm: A novel bio-inspired algorithm for global optimization problems. Expert Syst. Appl. 2022, 188, 116026.
  26. Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158.
  27. Dehghani, M.; Montazeri, Z.; Bektemyssova, G.; Malik, O.P.; Dhiman, G.; Ahmed, A.E. Kookaburra Optimization Algorithm: A New Bio-Inspired Metaheuristic Algorithm for Solving Optimization Problems. Biomimetics 2023, 8, 470.
  28. Abdel-Basset, M.; Mohamed, R.; Zidan, M.; Jameel, M.; Abouhawwash, M. Mantis Search Algorithm: A novel bio-inspired algorithm for global optimization and engineering design problems. Comput. Methods Appl. Mech. Eng. 2023, 415, 116200.
  29. Houssein, E.H.; Oliva, D.; Samee, N.A.; Mahmoud, N.F.; Emam, M.M. Liver Cancer Algorithm: A novel bio-inspired optimizer. Comput. Biol. Med. 2023, 165, 107389.
  30. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377.
  31. Kaur, S.; Awasthi, L.K.; Sangal, A.L.; Dhiman, G. Tunicate Swarm Algorithm: A new bio-inspired based metaheuristic paradigm for global optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541.
  32. Braik, M.; Hammouri, A.; Atwan, J.; Al-Betar, M.A.; Awadallah, M.A. White Shark Optimizer: A novel bio-inspired meta-heuristic algorithm for global optimization problems. Knowl.-Based Syst. 2022, 243, 108457.
  33. Chopra, N.; Ansari, M.M. Golden Jackal Optimization: A Novel Nature-Inspired Optimizer for Engineering Applications. Expert Syst. Appl. 2022, 198, 116924.
  34. Goldberg, D.E.; Holland, J.H. Genetic Algorithms and Machine Learning. Mach. Learn. 1988, 3, 95–99.
  35. Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359.
  36. De Castro, L.N.; Timmis, J.I. Artificial immune systems as a novel soft computing paradigm. Soft Comput. 2003, 7, 526–544.
  37. Koza, J.R. Genetic Programming: On the Programming of Computers by Means of Natural Selection; MIT Press: Cambridge, MA, USA, 1992; Volume 1.
  38. Reynolds, R.G. An introduction to cultural algorithms. In Proceedings of the Third Annual Conference on Evolutionary Programming, San Diego, CA, USA, 24–26 February 1994; World Scientific: Singapore, 1994; pp. 131–139.
  39. Beyer, H.-G.; Schwefel, H.-P. Evolution strategies—A comprehensive introduction. Nat. Comput. 2002, 1, 3–52.
  40. Dehghani, M.; Samet, H. Momentum search algorithm: A new meta-heuristic optimization algorithm inspired by momentum conservation law. SN Appl. Sci. 2020, 2, 1720.
  41. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248.
  42. Dehghani, M.; Montazeri, Z.; Dhiman, G.; Malik, O.; Morales-Menendez, R.; Ramirez-Mendoza, R.A.; Dehghani, A.; Guerrero, J.M.; Parra-Arroyo, L. A spring search algorithm applied to engineering optimization problems. Appl. Sci. 2020, 10, 6173.
  43. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2016, 27, 495–513.
  44. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184.
  45. Hashim, F.A.; Hussain, K.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W. Archimedes optimization algorithm: A new metaheuristic algorithm for solving optimization problems. Appl. Intell. 2021, 51, 1531–1551.
  46. Eskandar, H.; Sadollah, A.; Bahreininejad, A.; Hamdi, M. Water cycle algorithm—A novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput. Struct. 2012, 110, 151–166.
  47. Irizarry, R. LARES: An artificial chemical process approach for optimization. Evol. Comput. 2004, 12, 435–459.
  48. Salmani, M.H.; Eshghi, K. A metaheuristic algorithm based on chemotherapy science: CSA. J. Optim. 2017, 2017, 3082024.
  49. Wei, Z.; Huang, C.; Wang, X.; Han, T.; Li, Y. Nuclear reaction optimization: A novel and powerful physics-based algorithm for global optimization. IEEE Access 2019, 7, 66084–66109.
  50. Hashim, F.A.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W.; Mirjalili, S. Henry gas solubility optimization: A novel physics-based algorithm. Future Gener. Comput. Syst. 2019, 101, 646–667.
  51. Cuevas, E.; Oliva, D.; Zaldivar, D.; Pérez-Cisneros, M.; Sossa, H. Circle detection using electro-magnetism optimization. Inf. Sci. 2012, 182, 40–55.
  52. Pereira, J.L.J.; Francisco, M.B.; Diniz, C.A.; Oliver, G.A.; Cunha, S.S., Jr.; Gomes, G.F. Lichtenberg algorithm: A novel hybrid physics-based meta-heuristic for global optimization. Expert Syst. Appl. 2021, 170, 114522.
  53. Kaveh, A.; Dadras, A. A novel meta-heuristic optimization algorithm: Thermal exchange optimization. Adv. Eng. Softw. 2017, 110, 69–84.
  54. Faramarzi, A.; Heidarinejad, M.; Stephens, B.; Mirjalili, S. Equilibrium optimizer: A novel optimization algorithm. Knowl.-Based Syst. 2020, 191, 105190.
  55. Rao, R.V.; Savsani, V.J.; Vakharia, D. Teaching–learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput.-Aided Des. 2011, 43, 303–315.
  56. Matoušová, I.; Trojovský, P.; Dehghani, M.; Trojovská, E.; Kostra, J. Mother optimization algorithm: A new human-based metaheuristic approach for solving engineering optimization. Sci. Rep. 2023, 13, 10312.
  57. Dehghani, M.; Mardaneh, M.; Guerrero, J.M.; Malik, O.P.; Ramirez-Mendoza, R.A.; Matas, J.; Vasquez, J.C.; Parra-Arroyo, L. A new “Doctor and Patient” optimization algorithm: An application to energy commitment problem. Appl. Sci. 2020, 10, 5791.
  58. Dehghani, M.; Trojovská, E.; Zuščák, T. A new human-inspired metaheuristic algorithm for solving optimization problems based on mimicking sewing training. Sci. Rep. 2022, 12, 17387.
  59. Braik, M.; Ryalat, M.H.; Al-Zoubi, H. A novel meta-heuristic algorithm for solving numerical optimization problems: Ali Baba and the forty thieves. Neural Comput. Appl. 2022, 34, 409–455.
  60. Trojovský, P.; Dehghani, M. A new optimization algorithm based on mimicking the voting process for leader selection. PeerJ Comput. Sci. 2022, 8, e976.
  61. Al-Betar, M.A.; Alyasseri, Z.A.A.; Awadallah, M.A.; Abu Doush, I. Coronavirus herd immunity optimizer (CHIO). Neural Comput. Appl. 2021, 33, 5011–5042.
  62. Zhang, Y.; Jin, Z. Group teaching optimization algorithm: A novel metaheuristic method for solving global optimization problems. Expert Syst. Appl. 2020, 148, 113246.
  63. Oyelade, O.N.; Ezugwu, A.E.-S.; Mohamed, T.I.; Abualigah, L. Ebola optimization search algorithm: A new nature-inspired metaheuristic optimization algorithm. IEEE Access 2022, 10, 16150–16177.
  64. Mohamed, A.W.; Hadi, A.A.; Mohamed, A.K. Gaining-sharing knowledge based algorithm for solving optimization problems: A novel nature-inspired algorithm. Int. J. Mach. Learn. Cybern. 2020, 11, 1501–1529.
  65. Ayyarao, T.L.; RamaKrishna, N.; Elavarasam, R.M.; Polumahanthi, N.; Rambabu, M.; Saini, G.; Khan, B.; Alatas, B. War Strategy Optimization Algorithm: A New Effective Metaheuristic Algorithm for Global Optimization. IEEE Access 2022, 10, 25073–25105.
  66. Dehghani, M.; Montazeri, Z.; Givi, H.; Guerrero, J.M.; Dhiman, G. Darts game optimizer: A new optimization technique based on darts game. Int. J. Intell. Eng. Syst. 2020, 13, 286–294.
  67. Dehghani, M.; Montazeri, Z.; Saremi, S.; Dehghani, A.; Malik, O.P.; Al-Haddad, K.; Guerrero, J.M. HOGO: Hide objects game optimization. Int. J. Intell. Eng. Syst. 2020, 13, 216–225.
  68. Dehghani, M.; Montazeri, Z.; Malik, O.P.; Ehsanifar, A.; Dehghani, A. OSA: Orientation search algorithm. Int. J. Ind. Electron. Control Optim. 2019, 2, 99–112.
  69. Doumari, S.A.; Givi, H.; Dehghani, M.; Malik, O.P. Ring Toss Game-Based Optimization Algorithm for Solving Various Optimization Problems. Int. J. Intell. Eng. Syst. 2021, 14, 545–554.
  70. Dehghani, M.; Mardaneh, M.; Guerrero, J.M.; Malik, O.; Kumar, V. Football game based optimization: An application to solve energy commitment problem. Int. J. Intell. Eng. Syst. 2020, 13, 514–523.
  71. Montazeri, Z.; Niknam, T.; Aghaei, J.; Malik, O.P.; Dehghani, M.; Dhiman, G. Golf Optimization Algorithm: A New Game-Based Metaheuristic Algorithm and Its Application to Energy Commitment Problem Considering Resilience. Biomimetics 2023, 8, 386.
  72. Moghdani, R.; Salimifard, K. Volleyball premier league algorithm. Appl. Soft Comput. 2018, 64, 161–185.
  73. Wang, G.-G.; Deb, S.; Cui, Z. Monarch butterfly optimization. Neural Comput. Appl. 2019, 31, 1995–2014.
  74. Li, S.; Chen, H.; Wang, M.; Heidari, A.A.; Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323.
  75. Wang, G.-G. Moth search algorithm: A bio-inspired metaheuristic algorithm for global optimization problems. Memetic Comput. 2018, 10, 151–164.
  76. Yang, Y.; Chen, H.; Heidari, A.A.; Gandomi, A.H. Hunger games search: Visions, conception, implementation, deep analysis, perspectives, and towards performance shifts. Expert Syst. Appl. 2021, 177, 114864.
  77. Ahmadianfar, I.; Heidari, A.A.; Gandomi, A.H.; Chu, X.; Chen, H. RUN beyond the metaphor: An efficient optimization algorithm based on Runge Kutta method. Expert Syst. Appl. 2021, 181, 115079.
  78. Tu, J.; Chen, H.; Wang, M.; Gandomi, A.H. The Colony Predation Algorithm. J. Bionic Eng. 2021, 18, 674–710.
  79. Ahmadianfar, I.; Heidari, A.A.; Noshadian, S.; Chen, H.; Gandomi, A.H. INFO: An efficient optimization algorithm based on weighted mean of vectors. Expert Syst. Appl. 2022, 195, 116516.
  80. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872.
  81. Su, H.; Zhao, D.; Heidari, A.A.; Liu, L.; Zhang, X.; Mafarja, M.; Chen, H. RIME: A physics-based optimization. Neurocomputing 2023, 532, 183–214.
Update Date: 18 Mar 2024