Pufferfish Optimization Algorithm: History

Optimization problems are problems that have more than one feasible solution. Accordingly, optimization is the process of finding the best solution among all feasible solutions of an optimization problem.

  • optimization
  • bio-inspired
  • metaheuristic
  • pufferfish
  • exploration
  • exploitation

1. Introduction

Optimization problems are problems that have more than one feasible solution. Accordingly, optimization is the process of finding the best solution among all feasible solutions of an optimization problem [1]. From a mathematical point of view, any optimization problem can be modeled using three parts: decision variables, constraints, and the objective function of the problem. The main goal in optimization is to assign values to the decision variables so that the objective function is optimized while the constraints of the problem are respected [2]. There are numerous optimization problems in science, engineering, mathematics, technology, industry, and real-world applications that must be optimized using appropriate techniques. Techniques for solving optimization problems are classified into two groups: deterministic and stochastic approaches [3]. Deterministic approaches, divided into gradient-based and non-gradient-based classes, perform effectively in optimizing convex, linear, continuous, differentiable, and low-dimensional problems [4]. However, as problems become more complex and, in particular, higher-dimensional, deterministic approaches become inefficient because they tend to get stuck in local optima [5]. Moreover, many practical optimization problems are non-convex, non-linear, discontinuous, non-differentiable, and high-dimensional. The disadvantages and ineffectiveness of deterministic approaches in solving practical optimization problems with such characteristics have led researchers to develop stochastic approaches [6].
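As a reference point for the discussion that follows, a generic constrained optimization problem built from these three parts can be stated in the standard form below. This is a conventional textbook formulation, not a notation taken from the cited works.

```latex
\begin{aligned}
\min_{x \in \mathbb{R}^{m}} \quad & f(x) \\
\text{subject to} \quad & g_{i}(x) \le 0, \quad i = 1, \dots, p, \\
& h_{j}(x) = 0, \quad j = 1, \dots, q, \\
& lb_{k} \le x_{k} \le ub_{k}, \quad k = 1, \dots, m,
\end{aligned}
```

where x = (x_1, ..., x_m) is the vector of decision variables, f is the objective function, g_i and h_j are the inequality and equality constraints, and lb_k and ub_k are the lower and upper bounds of the variables. A maximization problem can be handled in the same form by minimizing -f(x).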
Metaheuristic algorithms are among the most effective stochastic approaches for solving optimization problems; they reach suitable solutions by randomly searching the problem-solving space using random operators and a trial-and-error process. The optimization process in a metaheuristic algorithm starts with a set of candidate solutions, called the population, initialized randomly in the problem-solving space. These candidate solutions are then improved through the algorithm's population-update steps over successive iterations. Once the algorithm has run to completion, the best candidate solution found during the iterations is returned as the solution to the problem [7]. Because of the random nature of this search, metaheuristic algorithms provide no guarantee of reaching the global optimum; nevertheless, the solutions they obtain are acceptable as quasi-optimal because they are close to the global optimum. Achieving more effective solutions, closer to the global optimum, has motivated researchers to design numerous metaheuristic algorithms [8].
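This general procedure (random initialization, iterative stochastic updates, and returning the best solution found) can be summarized with a minimal sketch. The skeleton below illustrates only the shared structure of population-based metaheuristics; it is not the Pufferfish Optimization Algorithm or any specific method from the literature, and the placeholder update rule, hyperparameters, and sphere objective are assumptions made for the example.

```python
import numpy as np

def sphere(x):
    """Placeholder objective function (minimization)."""
    return float(np.sum(x ** 2))

def generic_metaheuristic(objective, dim, lb, ub, pop_size=30, iterations=200, seed=0):
    """Minimal sketch of a population-based metaheuristic: random initialization,
    iterative stochastic updates, and returning the best solution found."""
    rng = np.random.default_rng(seed)
    # 1) Random initialization of the population in [lb, ub]^dim.
    pop = rng.uniform(lb, ub, size=(pop_size, dim))
    fitness = np.array([objective(x) for x in pop])
    best_idx = int(np.argmin(fitness))
    best_x, best_f = pop[best_idx].copy(), fitness[best_idx]

    for t in range(iterations):
        for i in range(pop_size):
            # 2) Stochastic update: move candidate i toward the best solution
            #    with a small random perturbation (placeholder update rule).
            r = rng.random(dim)
            candidate = pop[i] + r * (best_x - pop[i]) + 0.01 * rng.standard_normal(dim)
            candidate = np.clip(candidate, lb, ub)
            f = objective(candidate)
            # 3) Greedy acceptance: keep the new position only if it improves.
            if f < fitness[i]:
                pop[i], fitness[i] = candidate, f
                if f < best_f:
                    best_x, best_f = candidate.copy(), f
    # 4) Return the best candidate found across all iterations.
    return best_x, best_f

best_x, best_f = generic_metaheuristic(sphere, dim=10, lb=-5.0, ub=5.0)
print(best_f)
```

Individual metaheuristics differ mainly in step 2, i.e., in how the candidate solutions are updated in each iteration.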
To carry out an effective search and reach a suitable solution to an optimization problem, a metaheuristic algorithm must be able to search the problem-solving space well at both the global and the local level. Global search, referred to as exploration, aims to scan the problem-solving space comprehensively in order to avoid getting stuck in local optima and to discover the region containing the global optimum. Local search, referred to as exploitation, aims to scan promising areas of the problem-solving space accurately and with small steps in order to reach better solutions closer to the global optimum. In addition to strong exploration and exploitation abilities, balancing the two over the algorithm's iterations is the key to the success of a metaheuristic algorithm [9].
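A common way to realize this balance, seen in many published metaheuristics, is to scale random moves by a coefficient that decays over the iterations, so that the search gradually shifts from large exploratory jumps to small exploitative steps around the best solution found so far. The snippet below is only a generic illustration of that idea; the linear decay schedule, the step sizes, and the sphere objective are assumptions and do not correspond to any particular algorithm cited here.

```python
import numpy as np

rng = np.random.default_rng(1)
dim, lb, ub, iterations = 10, -5.0, 5.0, 200

def objective(x):
    return float(np.sum(x ** 2))             # placeholder objective (minimization)

best = rng.uniform(lb, ub, dim)              # best solution found so far

for t in range(iterations):
    # Exploration weight decays linearly from 1 (mostly global) to 0 (mostly local).
    a = 1.0 - t / iterations
    if rng.random() < a:
        # Exploration: a large random jump anywhere in the search space.
        candidate = rng.uniform(lb, ub, dim)
    else:
        # Exploitation: a small step around the best solution, shrinking with a.
        candidate = best + 0.1 * (ub - lb) * a * rng.standard_normal(dim)
        candidate = np.clip(candidate, lb, ub)
    if objective(candidate) < objective(best):
        best = candidate

print(objective(best))
```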

2. Pufferfish Optimization Algorithm

Metaheuristic algorithms have been developed by taking inspiration from various natural phenomena, the lifestyles of living organisms, concepts from biology, genetics, and physics, the rules of games, human interactions, and other evolutionary phenomena. Based on the source of inspiration employed in their design, metaheuristic algorithms can be placed in five groups: swarm-based, evolutionary-based, physics-based, human-based, and game-based approaches.
Swarm-based metaheuristic algorithms are developed with inspiration from the natural behavior and strategies of animals, insects, birds, reptiles, aquatic creatures, and other living creatures in the wild. Particle Swarm Optimization (PSO) [10], Ant Colony Optimization (ACO) [11], Artificial Bee Colony (ABC) [12], and the Firefly Algorithm (FA) [13] are among the most well-known swarm-based metaheuristic algorithms. PSO is designed based on modeling the movement of flocks of birds and schools of fish searching for food (see the sketch after this paragraph). ACO is proposed based on modeling the ability of ants to discover the shortest path between a food source and the colony. ABC is introduced based on modeling the hierarchical activities of honeybees in their attempt to reach new food sources. FA is designed with inspiration from the optical communication between fireflies. Pelican Optimization (PO) is another swarm-based metaheuristic algorithm, which is inspired by the hunting strategy of pelicans [14]. Among the natural behaviors of living organisms in the wild, hunting, foraging, chasing, digging, and migration are especially prominent and have served as sources of inspiration in the design of swarm-based metaheuristic algorithms such as the Snake Optimizer (SO) [15], Sea Lion Optimization (SLnO) [16], Flying Foxes Optimization (FFO) [17], the Mayfly Algorithm (MA) [18], the White Shark Optimizer (WSO) [19], the African Vultures Optimization Algorithm (AVOA) [20], the Grey Wolf Optimizer (GWO) [21], the Reptile Search Algorithm (RSA) [22], the Whale Optimization Algorithm (WOA) [23], Golden Jackal Optimization (GJO) [24], the Honey Badger Algorithm (HBA) [25], the Marine Predators Algorithm (MPA) [26], the Orca Predation Algorithm (OPA) [27], and the Tunicate Swarm Algorithm (TSA) [28].
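As a concrete example of how a swarm-based algorithm updates its population, the snippet below sketches the canonical PSO velocity and position update in the spirit of [10]. The inertia weight, acceleration coefficients, bounds, and sphere objective are illustrative assumptions rather than settings from the original paper (the original 1995 formulation did not include an inertia weight).

```python
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))              # placeholder objective (minimization)

rng = np.random.default_rng(0)
dim, n, iters = 10, 30, 200
lb, ub = -5.0, 5.0
w, c1, c2 = 0.7, 1.5, 1.5                     # inertia and acceleration coefficients (assumed values)

x = rng.uniform(lb, ub, (n, dim))             # particle positions
v = np.zeros((n, dim))                        # particle velocities
pbest = x.copy()                              # personal best positions
pbest_f = np.array([sphere(p) for p in x])
gbest = pbest[np.argmin(pbest_f)].copy()      # global best position

for t in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    # Canonical PSO update: inertia plus attraction to personal and global bests.
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lb, ub)
    f = np.array([sphere(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print(sphere(gbest))
```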
Evolutionary-based metaheuristic algorithms are developed with inspiration from the concepts of biology and genetics, natural selection, survival of the fittest, and Darwin's evolutionary theory. The Genetic Algorithm (GA) [29] and Differential Evolution (DE) [30] are the most well-known algorithms of this group; their designs are inspired by the reproduction process, genetic concepts, and the use of random mutation, crossover, and selection operators. The Artificial Immune System (AIS) is introduced based on simulating the mechanism of the body's defense system against diseases and microbes [31]. Some other evolutionary-based metaheuristic algorithms are the Cultural Algorithm (CA) [32], Genetic Programming (GP) [33], and Evolution Strategy (ES) [34].
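To make the mutation, crossover, and selection operators mentioned above concrete, the snippet below sketches the classic DE/rand/1/bin scheme of Differential Evolution [30]. The scale factor F, the crossover rate CR, the bounds, and the sphere objective are assumed values chosen for illustration.

```python
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))              # placeholder objective (minimization)

rng = np.random.default_rng(0)
dim, n, iters = 10, 30, 200
lb, ub = -5.0, 5.0
F, CR = 0.8, 0.9                              # scale factor and crossover rate (assumed values)

pop = rng.uniform(lb, ub, (n, dim))
fit = np.array([sphere(p) for p in pop])

for t in range(iters):
    for i in range(n):
        # Mutation (DE/rand/1): combine three distinct random individuals.
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        mutant = pop[r1] + F * (pop[r2] - pop[r3])
        # Binomial crossover: mix components of the mutant and the target vector.
        mask = rng.random(dim) < CR
        mask[rng.integers(dim)] = True        # ensure at least one mutant component
        trial = np.clip(np.where(mask, mutant, pop[i]), lb, ub)
        # Greedy selection: the fitter of target and trial vectors survives.
        f = sphere(trial)
        if f < fit[i]:
            pop[i], fit[i] = trial, f

print(fit.min())
```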
Physics-based metaheuristic algorithms are developed with inspiration from laws, transformations, processes, phenomena, forces, and other concepts in physics. Simulated Annealing (SA) is one of the most well-known physics-based metaheuristic algorithms and was developed based on modeling the metal-annealing process, in which, with the aim of obtaining an ideal crystal, a metal is first melted under heat and then cooled slowly [35]. Physical forces and Newton's laws of motion have been fundamental inspirations in designing algorithms such as the Gravitational Search Algorithm (GSA), based on gravitational attraction [36], the Momentum Search Algorithm (MSA) [37], based on momentum, and the Spring Search Algorithm (SSA) [38], based on the elastic force of a spring. The Water Cycle Algorithm (WCA) is proposed based on modeling the physical transformations in the natural water cycle [39]. Some other physics-based metaheuristic algorithms are Fick's Law Algorithm (FLA) [40], Prism Refraction Search (PRS) [41], Henry Gas Optimization (HGO) [42], the Black Hole Algorithm (BHA) [43], Nuclear Reaction Optimization (NRO) [44], the Equilibrium Optimizer (EO) [45], the Multi-Verse Optimizer (MVO) [46], the Lichtenberg Algorithm (LA) [47], the Archimedes Optimization Algorithm (AOA) [48], Thermal Exchange Optimization (TEO) [49], and Electro-Magnetism Optimization (EMO) [50].
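The annealing analogy translates into a simple acceptance rule: an improving move is always accepted, while a worsening move is accepted with a probability that shrinks as an artificial "temperature" is lowered. The snippet below is a minimal sketch of that rule in the spirit of SA [35]; the geometric cooling schedule, the neighbor move, and the objective function are assumptions made for the example.

```python
import math
import random

def sphere(x):
    return sum(v * v for v in x)              # placeholder objective (minimization)

random.seed(0)
dim, lb, ub = 10, -5.0, 5.0
x = [random.uniform(lb, ub) for _ in range(dim)]
fx = sphere(x)
T, alpha = 1.0, 0.99                          # initial temperature and cooling rate (assumed)

for t in range(5000):
    # Neighbor move: small random perturbation of one coordinate, kept inside the bounds.
    y = x[:]
    i = random.randrange(dim)
    y[i] = min(ub, max(lb, y[i] + random.gauss(0.0, 0.1)))
    fy = sphere(y)
    # Always accept improvements; accept worse moves with probability exp(-delta / T).
    if fy < fx or random.random() < math.exp(-(fy - fx) / T):
        x, fx = y, fy
    T *= alpha                                # slowly "cool" the system

print(fx)
```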
Human-based metaheuristic algorithms are developed with inspiration from the thoughts, choices, decisions, interactions, communications, and other activities of humans in society or personal life. Teaching–Learning-Based Optimization (TLBO) is one of the most widely used human-based metaheuristic algorithms; its design is inspired by the educational communication and knowledge exchange between a teacher and students, as well as among the students themselves [51]. The Mother Optimization Algorithm (MOA) is introduced with inspiration from Eshrat's care of her children [6]. The Election-Based Optimization Algorithm (EBOA) is proposed based on modeling the process of voting and holding elections in society [8]. The Chef-Based Optimization Algorithm (CHBO) is designed based on simulating how chefs teach cooking skills to applicants in culinary schools [52]. The Teamwork Optimization Algorithm (TOA) is developed with inspiration from the collaboration among team members working together to achieve specified team goals [53]. Some other human-based metaheuristic algorithms are Driving Training-Based Optimization (DTBO) [5], War Strategy Optimization (WSO) [54], Ali Baba and the Forty Thieves (AFT) [55], the Gaining Sharing Knowledge-based Algorithm (GSK) [56], and the Coronavirus Herd Immunity Optimizer (CHIO) [57].
Game-based metaheuristic algorithms are developed by taking inspiration from the rules of games as well as the behavior of players, coaches, referees, and other influential participants in individual and team games. The Darts Game Optimizer (DGO) is one of the most well-known algorithms of this group and is proposed based on modeling the competition among players who throw darts and collect points in order to win the game [58]. The Golf Optimization Algorithm (GOA) is introduced based on simulating players hitting the ball in order to place it in the holes [59]. The Puzzle Algorithm (PA) is designed based on modeling the strategy of players who put puzzle pieces together in order to complete the puzzle according to the pattern [60]. Some other game-based metaheuristic algorithms are the Volleyball Premier League (VPL) algorithm [61], the Running City Game Optimizer (RCGO) [62], and Tug of War Optimization (TWO) [63].

This entry is adapted from the peer-reviewed paper 10.3390/biomimetics9020065

References

  1. Zhao, S.; Zhang, T.; Ma, S.; Chen, M. Dandelion Optimizer: A nature-inspired metaheuristic algorithm for engineering applications. Eng. Appl. Artif. Intell. 2022, 114, 105075.
  2. Sergeyev, Y.D.; Kvasov, D.; Mukhametzhanov, M. On the efficiency of nature-inspired metaheuristics in expensive global optimization with limited budget. Sci. Rep. 2018, 8, 453.
  3. Liberti, L.; Kucherenko, S. Comparison of deterministic and stochastic approaches to global optimization. Int. Trans. Oper. Res. 2005, 12, 263–285.
  4. Alshanti, W.G.; Batiha, I.M.; Hammad, M.A.; Khalil, R. A novel analytical approach for solving partial differential equations via a tensor product theory of Banach spaces. Partial Differ. Equ. Appl. Math. 2023, 8, 100531.
  5. Dehghani, M.; Trojovská, E.; Trojovský, P. A new human-based metaheuristic algorithm for solving optimization problems on the base of simulation of driving training process. Sci. Rep. 2022, 12, 9924.
  6. Matoušová, I.; Trojovský, P.; Dehghani, M.; Trojovská, E.; Kostra, J. Mother optimization algorithm: A new human-based metaheuristic approach for solving engineering optimization. Sci. Rep. 2023, 13, 10312.
  7. de Armas, J.; Lalla-Ruiz, E.; Tilahun, S.L.; Voß, S. Similarity in metaheuristics: A gentle step towards a comparison methodology. Nat. Comput. 2022, 21, 265–287.
  8. Trojovský, P.; Dehghani, M. A new optimization algorithm based on mimicking the voting process for leader selection. PeerJ Comput. Sci. 2022, 8, e976.
  9. Zhao, W.; Wang, L.; Zhang, Z.; Fan, H.; Zhang, J.; Mirjalili, S.; Khodadadi, N.; Cao, Q. Electric eel foraging optimization: A new bio-inspired optimizer for engineering applications. Expert Syst. Appl. 2024, 238, 122200.
  10. Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the ICNN'95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; IEEE: Perth, WA, Australia, 1995; Volume 4, pp. 1942–1948.
  11. Dorigo, M.; Maniezzo, V.; Colorni, A. Ant system: Optimization by a colony of cooperating agents. IEEE Trans. Syst. Man Cybern. Part B 1996, 26, 29–41.
  12. Karaboga, D.; Basturk, B. Artificial Bee Colony (ABC) Optimization Algorithm for Solving Constrained Optimization Problems. In Proceedings of the International Fuzzy Systems Association World Congress, Cancun, Mexico, 18–21 June 2007; Springer: Berlin/Heidelberg, Germany, 2007; pp. 789–798.
  13. Yang, X.-S. Firefly algorithm, stochastic test functions and design optimisation. Int. J. Bio-Inspired Comput. 2010, 2, 78–84.
  14. Trojovský, P.; Dehghani, M. Pelican Optimization Algorithm: A Novel Nature-Inspired Algorithm for Engineering Applications. Sensors 2022, 22, 855.
  15. Hashim, F.A.; Hussien, A.G. Snake Optimizer: A novel meta-heuristic optimization algorithm. Knowl.-Based Syst. 2022, 242, 108320.
  16. Masadeh, R.; Mahafzah, B.A.; Sharieh, A. Sea lion optimization algorithm. Int. J. Adv. Comput. Sci. Appl. 2019, 10, 388–395.
  17. Zervoudakis, K.; Tsafarakis, S. A global optimizer inspired from the survival strategies of flying foxes. Eng. Comput. 2023, 39, 1583–1616.
  18. Zervoudakis, K.; Tsafarakis, S. A mayfly optimization algorithm. Comput. Ind. Eng. 2020, 145, 106559.
  19. Braik, M.; Hammouri, A.; Atwan, J.; Al-Betar, M.A.; Awadallah, M.A. White Shark Optimizer: A novel bio-inspired meta-heuristic algorithm for global optimization problems. Knowl.-Based Syst. 2022, 243, 108457.
  20. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. Comput. Ind. Eng. 2021, 158, 107408.
  21. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
  22. Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158.
  23. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
  24. Chopra, N.; Ansari, M.M. Golden Jackal Optimization: A Novel Nature-Inspired Optimizer for Engineering Applications. Expert Syst. Appl. 2022, 198, 116924.
  25. Hashim, F.A.; Houssein, E.H.; Hussain, K.; Mabrouk, M.S.; Al-Atabany, W. Honey Badger Algorithm: New metaheuristic algorithm for solving optimization problems. Math. Comput. Simul. 2022, 192, 84–110.
  26. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377.
  27. Jiang, Y.; Wu, Q.; Zhu, S.; Zhang, L. Orca predation algorithm: A novel bio-inspired algorithm for global optimization problems. Expert Syst. Appl. 2022, 188, 116026.
  28. Kaur, S.; Awasthi, L.K.; Sangal, A.L.; Dhiman, G. Tunicate Swarm Algorithm: A new bio-inspired based metaheuristic paradigm for global optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541.
  29. Goldberg, D.E.; Holland, J.H. Genetic Algorithms and Machine Learning. Mach. Learn. 1988, 3, 95–99.
  30. Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359.
  31. De Castro, L.N.; Timmis, J.I. Artificial immune systems as a novel soft computing paradigm. Soft Comput. 2003, 7, 526–544.
  32. Reynolds, R.G. An Introduction to Cultural Algorithms. In Proceedings of the Third Annual Conference on Evolutionary Programming, San Diego, CA, USA, 24–26 February 1994; World Scientific: Singapore, 1994; pp. 131–139.
  33. Koza, J.R. Genetic Programming: On the Programming of Computers by Means of Natural Selection; MIT Press: Cambridge, MA, USA, 1992; Volume 1.
  34. Beyer, H.-G.; Schwefel, H.-P. Evolution strategies–a comprehensive introduction. Nat. Comput. 2002, 1, 3–52.
  35. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680.
  36. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248.
  37. Dehghani, M.; Samet, H. Momentum search algorithm: A new meta-heuristic optimization algorithm inspired by momentum conservation law. SN Appl. Sci. 2020, 2, 1720.
  38. Dehghani, M.; Montazeri, Z.; Dhiman, G.; Malik, O.; Morales-Menendez, R.; Ramirez-Mendoza, R.A.; Dehghani, A.; Guerrero, J.M.; Parra-Arroyo, L. A spring search algorithm applied to engineering optimization problems. Appl. Sci. 2020, 10, 6173.
  39. Eskandar, H.; Sadollah, A.; Bahreininejad, A.; Hamdi, M. Water cycle algorithm–A novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput. Struct. 2012, 110, 151–166.
  40. Hashim, F.A.; Mostafa, R.R.; Hussien, A.G.; Mirjalili, S.; Sallam, K.M. Fick’s Law Algorithm: A physical law-based algorithm for numerical optimization. Knowl.-Based Syst. 2023, 260, 110146.
  41. Kundu, R.; Chattopadhyay, S.; Nag, S.; Navarro, M.A.; Oliva, D. Prism refraction search: A novel physics-based metaheuristic algorithm. J. Supercomput. 2024.
  42. Hashim, F.A.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W.; Mirjalili, S. Henry gas solubility optimization: A novel physics-based algorithm. Future Gener. Comput. Syst. 2019, 101, 646–667.
  43. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184.
  44. Wei, Z.; Huang, C.; Wang, X.; Han, T.; Li, Y. Nuclear reaction optimization: A novel and powerful physics-based algorithm for global optimization. IEEE Access 2019, 7, 66084–66109.
  45. Faramarzi, A.; Heidarinejad, M.; Stephens, B.; Mirjalili, S. Equilibrium optimizer: A novel optimization algorithm. Knowl.-Based Syst. 2020, 191, 105190.
  46. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2016, 27, 495–513.
  47. Pereira, J.L.J.; Francisco, M.B.; Diniz, C.A.; Oliver, G.A.; Cunha Jr, S.S.; Gomes, G.F. Lichtenberg algorithm: A novel hybrid physics-based meta-heuristic for global optimization. Expert Syst. Appl. 2021, 170, 114522.
  48. Hashim, F.A.; Hussain, K.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W. Archimedes optimization algorithm: A new metaheuristic algorithm for solving optimization problems. Appl. Intell. 2021, 51, 1531–1551.
  49. Kaveh, A.; Dadras, A. A novel meta-heuristic optimization algorithm: Thermal exchange optimization. Adv. Eng. Softw. 2017, 110, 69–84.
  50. Cuevas, E.; Oliva, D.; Zaldivar, D.; Pérez-Cisneros, M.; Sossa, H. Circle detection using electro-magnetism optimization. Inf. Sci. 2012, 182, 40–55.
  51. Rao, R.V.; Savsani, V.J.; Vakharia, D. Teaching–learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput.-Aided Des. 2011, 43, 303–315.
  52. Trojovská, E.; Dehghani, M. A new human-based metahurestic optimization method based on mimicking cooking training. Sci. Rep. 2022, 12, 14861.
  53. Dehghani, M.; Trojovský, P. Teamwork Optimization Algorithm: A New Optimization Approach for Function Minimization/Maximization. Sensors 2021, 21, 4567.
  54. Ayyarao, T.L.; RamaKrishna, N.; Elavarasam, R.M.; Polumahanthi, N.; Rambabu, M.; Saini, G.; Khan, B.; Alatas, B. War Strategy Optimization Algorithm: A New Effective Metaheuristic Algorithm for Global Optimization. IEEE Access 2022, 10, 25073–25105.
  55. Braik, M.; Ryalat, M.H.; Al-Zoubi, H. A novel meta-heuristic algorithm for solving numerical optimization problems: Ali Baba and the forty thieves. Neural Comput. Appl. 2022, 34, 409–455.
  56. Mohamed, A.W.; Hadi, A.A.; Mohamed, A.K. Gaining-sharing knowledge based algorithm for solving optimization problems: A novel nature-inspired algorithm. Int. J. Mach. Learn. Cybern. 2020, 11, 1501–1529.
  57. Al-Betar, M.A.; Alyasseri, Z.A.A.; Awadallah, M.A.; Abu Doush, I. Coronavirus herd immunity optimizer (CHIO). Neural Comput. Appl. 2021, 33, 5011–5042.
  58. Dehghani, M.; Montazeri, Z.; Givi, H.; Guerrero, J.M.; Dhiman, G. Darts game optimizer: A new optimization technique based on darts game. Int. J. Intell. Eng. Syst. 2020, 13, 286–294.
  59. Montazeri, Z.; Niknam, T.; Aghaei, J.; Malik, O.P.; Dehghani, M.; Dhiman, G. Golf Optimization Algorithm: A New Game-Based Metaheuristic Algorithm and Its Application to Energy Commitment Problem Considering Resilience. Biomimetics 2023, 8, 386.
  60. Zeidabadi, F.A.; Dehghani, M. POA: Puzzle Optimization Algorithm. Int. J. Intell. Eng. Syst. 2022, 15, 273–281.
  61. Moghdani, R.; Salimifard, K. Volleyball premier league algorithm. Appl. Soft Comput. 2018, 64, 161–185.
  62. Ma, B.; Hu, Y.; Lu, P.; Liu, Y. Running city game optimizer: A game-based metaheuristic optimization algorithm for global optimization. J. Comput. Des. Eng. 2023, 10, 65–107.
  63. Kaveh, A.; Zolghadr, A. A Novel Meta-Heuristic Algorithm: Tug of War Optimization. Int. J. Optim. Civ. Eng. 2016, 6, 469–492.