4. Challenges and Future Directions
Improving the performance of NIOAs is a complex problem that is influenced jointly by parameter tuning, topological structure and learning strategy. In this study, we draw some preliminary conclusions in order to provide a reference framework for the selection and improvement of NIOAs. In the past 30 years, a large number of meta-heuristic algorithms (MHAs; more than 120 in our statistics) and their variants have been proposed to provide efficient and effective solutions to optimization problems in the field of AI. Although great progress has been made and NIOAs have been widely and successfully applied in various fields, challenging problems remain, mainly in the following four aspects.
1. The first challenge is the lack of sufficient research on the fundamental theories and tools of NIOAs. From our observation, fundamental research on NIOAs faces the following four difficulties.
(1) The biological or natural mechanisms imitated by NIOAs are not yet fully understood. Most NIOAs are proposed by experts in psychology or in computer science and engineering; close collaboration with biologists is therefore essential to deeply understand and abstract these mechanisms and functions, so that NIOAs can reasonably and effectively reflect nature, biology and the real environment.
(2) It is also necessary to lay a solid mathematical foundation for NIOAs. Examples include rigorous time complexity analysis and convergence proofs, deep analysis of the topological structures of the various NIOAs, a suitable and comprehensive theoretical account of how to balance premature convergence to local optima against slow convergence speed, and an in-depth study of automatic parameter tuning methods to address the parameter-dependence problem. Specifically, in classic fundamental works [27][28][29] that achieved results in time complexity analysis and convergence proofs, the researchers list future research directions, which we summarize as follows. For topology analysis, it has been indicated that a local neighborhood topology is more suitable for complex problems for some specific algorithms [30], and an investigation of the PSO paradigm found that the effect of the population topology interacts with the function being optimized [31]. Although these efforts recommend particular population topologies, they have not precisely identified the topological factors that lead to the best performance across a range of functions [31] (a minimal code illustration of this topology question is given at the end of this first challenge). Automatic parameter tuning is usually computationally expensive, especially for real-world application problems; it is therefore desirable to have a benchmark test suite and an easy-to-use tuning toolbox for NIOAs [32]. Due to the lack of a solid mathematical foundation, almost all NIOAs operate in a black-box mode; as a result, researchers keep proposing so-called "novel" algorithms and declaring that their optimizers find better solutions than other NIOAs [33].
(3) Research on extending basic continuous NIOAs to other classes of optimization problems, including combinatorial optimization problems (COPs) and multi-objective optimization problems (MOOPs), is insufficient. The study presented here of the different learning strategies and topological structures of more than 120 MHAs can provide diverse solutions to COPs and MOOPs. In fact, current research on the mathematical theories mentioned in (2) and on problem extensions focuses mainly on a few NIOAs, including GA, PSO and DE, so further research on the remaining NIOAs is required.
(4) Another problem is the lack of visualization platforms for NIOA research. From our observation, there is little discussion of this aspect beyond an early, simple attempt [34]. In addition, few benchmark test suites target specific optimization problems such as automatic parameter tuning [32]. Owing to the insufficient theoretical investigation of NIOAs, it is quite difficult to clearly distinguish the characteristics of the different NIOAs (most of the algorithms look very similar), and this in itself becomes another optimization problem: selecting the best NIOA for a given problem. This is also one motivation for our attempt to compare and analyze 11 common NIOAs both theoretically and experimentally.
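As a minimal, self-contained illustration of the topology question raised in point (2), the following Python sketch contrasts a global-best topology with a ring (local) neighborhood in a bare-bones PSO on the multi-modal Rastrigin function. It is our own illustrative code, not the implementation studied in [30] or [31]; the swarm size, coefficient values and iteration budget are assumptions chosen only for readability.

```python
# Minimal PSO sketch contrasting a global (gbest) topology with a ring (lbest)
# neighborhood on the multi-modal Rastrigin function. Swarm size, coefficients
# and iteration budget are illustrative assumptions, not tuned recommendations.
import numpy as np

def rastrigin(x):
    return 10 * x.size + float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

def pso(topology="global", dim=10, swarm=30, iters=500, seed=0,
        w=0.729, c1=1.49445, c2=1.49445, bound=5.12):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-bound, bound, (swarm, dim))
    vel = np.zeros((swarm, dim))
    pbest = pos.copy()
    pbest_val = np.array([rastrigin(p) for p in pos])
    for _ in range(iters):
        if topology == "global":
            # every particle is attracted by the single best personal best
            social = np.tile(pbest[np.argmin(pbest_val)], (swarm, 1))
        else:
            # ring topology: particle i only sees neighbors i-1, i, i+1
            idx = np.arange(swarm)
            neigh = np.stack([(idx - 1) % swarm, idx, (idx + 1) % swarm], axis=1)
            best_neigh = neigh[idx, np.argmin(pbest_val[neigh], axis=1)]
            social = pbest[best_neigh]
        r1, r2 = rng.random((swarm, dim)), rng.random((swarm, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (social - pos)
        pos = np.clip(pos + vel, -bound, bound)
        vals = np.array([rastrigin(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    return pbest_val.min()

print("global topology:", pso("global"))
print("ring topology  :", pso("ring"))
```

In runs of this kind, the ring neighborhood usually exchanges slower early convergence for a lower risk of premature stagnation, which is precisely the trade-off that a principled topology analysis would have to quantify.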
2. The second challenge is that NIOAs have limited capability for solving continuous optimization problems in complex environments. Real environments are complicated: the optimization problems can be high-dimensional, large-scale, multi-modal and multi-objective; the optimization environments can be dynamic, highly constrained and uncertain; and the fitness evaluations may be noisy, imprecise and time-consuming, or even non-deterministic. The complexity of real environments poses a great challenge to NIOAs. Although some efforts [35][36][37] have been made to address these problems, handling them remains very difficult.
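As a small illustration of one standard remedy for the noisy-fitness case mentioned above, the sketch below averages repeated evaluations (explicit resampling) inside a simple (1+1)-style evolution strategy. The objective, the additive Gaussian noise model and the sample count are our own assumptions for illustration; this is not the approach taken in [35][36][37].

```python
# Sketch of a common remedy for noisy fitness evaluations: average several
# re-evaluations (explicit resampling) inside a simple (1+1)-style evolution
# strategy. The objective, Gaussian noise level and sample count are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def noisy_sphere(x, sigma=0.5):
    # true objective plus additive Gaussian observation noise
    return float(np.sum(x ** 2)) + float(rng.normal(0.0, sigma))

def averaged_fitness(x, samples=5):
    # resampling: spend extra evaluations to obtain a lower-variance estimate
    return float(np.mean([noisy_sphere(x) for _ in range(samples)]))

def one_plus_one_es(dim=10, iters=2000, step=0.3, samples=5):
    x = rng.uniform(-5, 5, dim)
    fx = averaged_fitness(x, samples)
    for _ in range(iters):
        candidate = x + rng.normal(0.0, step, dim)
        f_cand = averaged_fitness(candidate, samples)
        if f_cand < fx:  # accept only apparent improvements
            x, fx = candidate, f_cand
    return x

best = one_plus_one_es()
print("distance of final point from the true optimum:", np.linalg.norm(best))
```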
3. The third challenge is that NIOAs are rarely combined with other related disciplines. NIOAs intrinsically have a parallel and distributed architecture, yet little attention has been paid to combining them with parallel and distributed technologies, including GPU-based hardware, robot swarms and cloud platforms. Only a few works [38][39][40] address these issues, and such interdisciplinary research holds great potential for NIOAs.
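The distributed-parallel structure mentioned above can be exploited directly. In the hedged sketch below, the fitness of an entire population is evaluated concurrently with a standard Python process pool, while the (omitted) variation step would remain sequential; the objective function, population size and worker count are illustrative assumptions.

```python
# Sketch of exploiting the distributed-parallel structure of NIOAs: the fitness
# of a whole population is evaluated concurrently in a process pool, while the
# (omitted) variation step would remain sequential. The objective and all sizes
# are illustrative assumptions.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def expensive_fitness(x):
    # stand-in for a costly, simulation-based evaluation
    return float(np.sum(x ** 2) + np.sum(np.sin(3 * x)))

def evaluate_population(population, workers=4):
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return np.array(list(pool.map(expensive_fitness, population)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    population = rng.uniform(-5, 5, (64, 30))  # 64 candidates, 30 dimensions
    fitness = evaluate_population(population)
    print("best fitness in the population:", fitness.min())
```

The same pattern maps naturally onto GPU batching or cloud-based workers, which is where the combination with distributed technologies becomes most attractive.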
4. The fourth challenge is that little effort has been made to adapt NIOAs to specific application domains. On the one hand, no single NIOA can suit all application problems; on the other hand, a certain kind of NIOA may be more effective for certain kinds of problems [41]. Existing enhancement methods for NIOAs (for example, combinations of different NIOAs) lack an in-depth, targeted discussion of why the enhancements are adopted. Furthermore, various NIOAs have been applied to the same application problem, but it is often unclear why a particular NIOA was chosen (researchers simply happened to use it).
Consequently, we believe that researchers should tackle the following four problems in future NIOA research; these problems indicate four research directions for the study of NIOAs.
1. Strengthening the theoretical analysis of NIOAs. Some theoretical problems, such as time complexity analysis, convergence proofs, topology analysis, automatic parameter tuning and the mechanisms of exploitation and exploration, have only been studied for specific NIOAs (for example, PSO). Many problems remain open in the existing work [27][28][29], and the theoretical analysis of other NIOAs needs to be carried out in depth. COPs and MOOPs should be studied further by extending and combining the existing NIOAs. Furthermore, it is necessary to develop a visualization platform for deconstructing, modeling and simulating the interactions of components in NIOAs, so as to make it convenient to study the mechanisms of self-organization, direct/indirect communication and intelligent emergence in various swarm systems and application cases. It is also necessary to establish benchmark test suites and easy-to-use algorithm toolboxes for different problems, for example automatic parameter tuning and the aforementioned problems in complex environments.
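As a minimal, self-contained example of what an automatic parameter-tuning step might look like (our own sketch, not the toolbox envisioned in [32]), the following code random-searches the F and CR parameters of a bare-bones DE/rand/1/bin on the Rastrigin function and averages the result over a few seeds to reduce stochastic bias. All ranges, budgets and the choice of DE and Rastrigin are assumptions made purely for illustration.

```python
# Self-contained sketch of naive automatic parameter tuning: random search over
# the F and CR parameters of a bare-bones DE/rand/1/bin on the Rastrigin
# function. Parameter ranges, budgets and the averaging over seeds are
# illustrative assumptions.
import numpy as np

def rastrigin(x):
    return 10 * x.size + float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

def de(F, CR, dim=10, pop_size=20, iters=200, seed=0, bound=5.12):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-bound, bound, (pop_size, dim))
    fit = np.array([rastrigin(p) for p in pop])
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True          # guarantee one mutated gene
            trial = np.clip(np.where(cross, mutant, pop[i]), -bound, bound)
            f_trial = rastrigin(trial)
            if f_trial < fit[i]:                     # greedy selection
                pop[i], fit[i] = trial, f_trial
    return fit.min()

def tune(n_trials=15, seeds=(0, 1, 2)):
    rng = np.random.default_rng(42)
    best_cfg, best_score = None, np.inf
    for _ in range(n_trials):
        cfg = {"F": rng.uniform(0.3, 0.9), "CR": rng.uniform(0.1, 1.0)}
        score = float(np.mean([de(seed=s, **cfg) for s in seeds]))
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

print(tune())
```

Even this trivial tuner makes the computational-cost concern of [32] visible: each configuration requires several full optimization runs, which is exactly why shared benchmark suites and reusable tuning toolboxes matter.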
2. Designing novel NIOAs to solve complicated optimization problems. Many real-world optimization problems are very complex, such as multi-modal and multi-objective problems, constrained or uncertain problems, large-scale optimization problems, and optimization problems with noisy, imprecise or time-varying fitness evaluations. Designing more targeted and effective NIOAs for these problems is another important direction.
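For the constrained case mentioned in this direction, one common baseline is static penalization, sketched below: infeasible candidates are penalized in proportion to their constraint violation, after which any unconstrained NIOA can be applied (a plain random search stands in for the optimizer here). The toy problem, penalty weight and evaluation budget are illustrative assumptions only.

```python
# Sketch of static penalty handling for a constrained problem: infeasible
# candidates are penalized in proportion to their constraint violation, after
# which any unconstrained NIOA can be applied (plain random search stands in
# for the optimizer). Problem, penalty weight and budget are illustrative.
import numpy as np

def objective(x):
    return float(np.sum((x - 1.0) ** 2))

def violation(x):
    # single inequality constraint g(x) = sum(x) - 1 <= 0
    return max(0.0, float(np.sum(x) - 1.0))

def penalized(x, weight=1e3):
    return objective(x) + weight * violation(x) ** 2

rng = np.random.default_rng(0)
best, best_val = None, np.inf
for _ in range(20000):                # placeholder optimizer: pure random search
    candidate = rng.uniform(-2, 2, 5)
    value = penalized(candidate)
    if value < best_val:
        best, best_val = candidate, value

print("best penalized value:", round(best_val, 4), "violation:", round(violation(best), 4))
```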
3. Deep fusion with other related disciplines. To improve the performance of current NIOAs, it is indispensable to combine them with other related disciplines or directions, such as distributed and parallel computing, machine learning, quantum computation and robot engineering. More concretely, because NIOAs are by nature distributed and parallel, it is easier and more natural to implement them in distributed and parallel environments such as cloud platforms and GPU-based hardware. Furthermore, for some large-scale optimization problems, a robot swarm that combines NIOAs with robot engineering can be a good solution. With support from machine learning methods, NIOAs can handle multi-modal multi-objective optimization problems more efficiently; conversely, NIOAs can provide optimization support for machine learning tasks such as clustering and association rule mining.
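As an illustration of the NIOA-for-machine-learning direction, the sketch below encodes k cluster centroids as one flat real vector and evolves it with a simple (1+5) mutation scheme to minimize the within-cluster sum of squares. The synthetic data, the number of clusters and the mutation scale are assumptions for illustration.

```python
# Sketch of a NIOA-style approach to a machine-learning task: k cluster
# centroids are encoded as one flat real vector and evolved with a simple
# (1+5) mutation scheme to minimize the within-cluster sum of squares.
# The synthetic data, k and the mutation scale are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(c, 0.3, (50, 2))            # toy 2-D data around
                  for c in ([0, 0], [3, 3], [0, 3])])     # three true centers
k, dim = 3, data.shape[1]

def wcss(flat_centroids):
    # within-cluster sum of squares for centroids given as a flat vector
    centers = flat_centroids.reshape(k, dim)
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return float(np.sum(dists.min(axis=1) ** 2))

best = rng.uniform(data.min(), data.max(), k * dim)
best_val = wcss(best)
for _ in range(2000):
    # (1+5) scheme: five mutated offspring, keep the best one if it improves
    offspring = best + rng.normal(0.0, 0.2, (5, k * dim))
    values = np.array([wcss(o) for o in offspring])
    if values.min() < best_val:
        best, best_val = offspring[values.argmin()], values.min()

print("final within-cluster sum of squares:", round(best_val, 2))
print("recovered centroids:\n", best.reshape(k, dim).round(2))
```

Any population-based NIOA could replace the simple mutation scheme used here.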
4. Combination with specific applications. It is necessary to design customized NIOAs for specific application problems; the topological structure, learning strategy and parameter-selection method of a customized NIOA can be matched to a specific problem, which can yield good convergence speed and optimization performance. Existing applications rarely include such targeted designs; most use NIOAs directly or cannot explain the design rationale for the specific problem.
5. Conclusions
Nature-Inspired Optimization Algorithms (NIOAs) can provide satisfactory solutions to NP-hard problems, which are difficult and sometimes even impossible for traditional optimization methods to handle. Thus, NIOAs have been widely applied, both in theory and in practice, to various fields; examples include function optimization problems (convex or concave, high- or low-dimensional, single-peak or multi-peak), combinatorial optimization problems (the traveling salesman problem (TSP), knapsack problem, bin-packing problem, layout-optimization problem, graph-partitioning problem and production-scheduling problem), automatic control problems (control system optimization, robot structure and trajectory planning), image-processing problems (image recognition, restoration and edge-feature extraction), and data-mining problems (feature selection, classification, association rule mining and clustering).
Many NIOAs and their variants have been proposed in the last 30 years. However, for specific optimization problems, researchers tend to choose NIOAs based on narrow experience or biased knowledge because an overall, systematic comparison and analysis of these NIOAs has been lacking. This study aims to bridge that gap, and the contributions of this paper are fourfold. First, we present a uniform formal description of NIOAs and analyze the similarities and differences among 11 common NIOAs. Second, we comprehensively compare the performance of the 11 NIOAs, which reflects the essential characteristics of each algorithm. Third, we present a relatively comprehensive list of the NIOAs proposed so far, the first attempt to systematically summarize existing NIOAs. Fourth, we discuss the challenges and future directions of the NIOA field as a whole, which can serve as a reference for further research on NIOAs. We are not aiming to find a super algorithm that can solve all problems in different fields once and for all (an impossible task); instead, we provide a useful reference to help researchers choose suitable algorithms for different application scenarios and thereby make full use of the different NIOAs. We believe that, with this survey, more novel problem-oriented NIOAs will emerge, and we hope that this work can serve as a reference and handbook for NIOA innovation and applications.
Undoubtedly, it is necessary and meaningful to make a comprehensive comparison of the common NIOAs, and we believe that more effort is required to extend this review in the future. First, the state-of-the-art variants of the 11 common NIOAs should be compared and analyzed comprehensively, discussing their convergence, topological structures, learning strategies, parameter-tuning methods and application fields. Second, there are more than 120 MHAs with various topological structures and learning strategies; for example, the recently proposed chicken swarm optimization (CSO) and spider monkey optimization (SMO) algorithms have hierarchical topological structures and grouping/regrouping learning strategies. A comprehensive analysis of the various topological structures and learning strategies of NIOAs is therefore another direction for future work.