An enhanced adaptive global‐best harmony search algorithm for continuous optimization problems

This paper presents an enhanced adaptive global-best harmony search (EAGHS) algorithm to solve global continuous optimization problems. The global-best HS (GHS) is one of the strongest versions of the classical HS algorithm, hybridizing the concepts of swarm intelligence and conventional HS. However, randomized selection of harmonies in the permissible interval diverts the GHS algorithm from the global optimum. To address this issue, the proposed EAGHS method introduces a dynamic coefficient into the GHS algorithm to increase the search power in early iterations. Various complex and extensively applied benchmark functions are used to validate the developed EAGHS algorithm. The results indicate that the EAGHS algorithm offers faster convergence and better accuracy than the standard HS, GHS, and other similar algorithms. Further analysis evaluates the sensitivity of the proposed method to changes in parameters such as the harmony memory consideration rate and harmony memory size, as well as to larger problem dimensions.


INTRODUCTION
The computational effort required by traditional numerical optimization methods for an exhaustive search of the design space is often infeasible for complex engineering optimization problems. In the last two decades, there has been growing interest in developing alternatives to the traditional, mathematically derived, gradient-based optimization methods. 1,2 In this context, heuristic optimization algorithms have proved to be viable approaches for solving complex engineering optimization tasks (eg, References 3,4). Heuristic optimization algorithms are computational procedures that determine an optimal solution by iteratively trying to improve a candidate solution with regard to a given measure of quality. These techniques are derived from natural optimization processes and incorporate structured randomness. Few or no assumptions are made by these algorithms about the problem being optimized. Accordingly, they can search large spaces of candidate solutions and find optimal or near-optimal solutions at a reasonable computational cost. 5 However, heuristic algorithms tend to get stuck easily in local optima. 6 To improve on these algorithms, a new generation of optimization algorithms called metaheuristic algorithms has emerged. Metaheuristic algorithms can explicitly or implicitly manage the trade-off between search diversification (exploration) and intensification (exploitation). 6 As a result, they have shown considerable success in solving a variety of complex optimization problems. The difference between heuristic and metaheuristic methods is not significant. 7 Heuristic algorithms are designed to solve a specific problem without the possibility of generalization or application to other similar problems. 8 Metaheuristics, on the other hand, represent higher-level heuristics in the sense that they are problem-independent and can be applied to a broad range of problems. 3

Extensive research has been carried out to develop single-solution, population-based, hybrid, nature-inspired, and metaphor-based metaheuristic methods. The genetic algorithm (GA), 9-11 ant colony optimization, 12,13 simulated annealing, 14,15 the colonial competition algorithm, 16,17 particle swarm optimization, 18,19 gray wolf optimization, 20 monarch butterfly optimization (MBO), 21 the earthworm optimization algorithm (EWA), 22 elephant herding optimization (EHO), 23 and the moth search (MS) algorithm 24 are some of the efficient metaheuristic methods that have been successfully applied to real-world optimization problems.

Harmony search (HS) is a well-known population-based metaheuristic optimization algorithm introduced by Geem et al. 25 Its advantages include low requirements in terms of mathematical and derivative knowledge and startup settings. The HS method also uses random parameters to increase the exploration and accuracy of each member, resulting in lower computational and time costs compared to other metaheuristic algorithms. 26 The performance of the HS algorithm has been verified in many practical applications. 27-29 The capabilities of the HS algorithm have motivated researchers to improve its performance, especially for solving high-dimensional problems as well as functions with a large number of local optima. This paper presents an enhanced HS version based on one of its variants, the global-best HS (GHS). 30 Specifically, we report an enhanced adaptive global-best harmony search (EAGHS) method that expands the GHS algorithm.

A dynamic coefficient is introduced into the GHS algorithm to increase the search power of EAGHS in early iterations, and two new equations are introduced for the harmony vector and bandwidth in the search process. Compared to its counterparts, EAGHS performs more efficiently and accurately in global optimization. The rest of the paper is organized as follows: Section 2 introduces the principle of the HS algorithm. Section 3 presents the enhanced HS algorithms and the proposed EAGHS method. Section 4 discusses the performance of EAGHS by comparing it with the standard HS algorithm and its other robust variants. Section 5 summarizes the main findings of this study.

HARMONY SEARCH ALGORITHM
The HS algorithm is a population-based method inspired by music composition. When composing, musicians collaborate with different instruments to create a melodious piece of music. Creating a song is a process of tuning and improving to achieve an ideal, more harmonic piece. Generally, musicians attempt to evolve at every stage of the creation, which results in better coordination among them. 25 Over time, musicians create music by trying various harmonies, and after playing several rounds they remember the harmonies of each trial. Assuming that k harmonies are made by n musicians, the memory size of the musicians, or harmony memory size (HMS), is k harmonies. These are stored in a matrix with k rows (the number of harmonies that the musicians remember) and n + 1 columns, in which the first n columns hold the variables affecting the problem and the last column stores the cost-function value of each solution; this matrix is called the harmony memory (HM). Musicians use different musical instruments to create a new piece, and the beauty of the music arises from the harmony among the different musical notes. In that manner, the musicians play each instrument within its possible range of steps to form a harmony vector, and the formed and remembered vectors are considered when producing the next vector. Musicians usually have two playing modes for each instrument: play a step taken from the memory vectors, or play a step for that instrument randomly within the permissible range.

The HS algorithm likewise starts with an initialized population whose members are independent in each dimension and lie within the permissible range. In each iteration of the algorithm, only one new member is created. Each dimension of the new point is generated from the solutions in the HM using a memory consideration rule and a pitch adjustment factor, or by random re-initialization within the authorized range of that dimension. The newly generated solution is compared with the population member that has the maximum cost-function value and replaces it if the new solution possesses a lower cost. This procedure is repeated until one of the termination criteria is met. The HS algorithm sequence can be described as follows: 25

1. Define the cost function f(x) that needs to be minimized to achieve the goal of the algorithm.

2. Initialize the parameter set and the harmony memory:

$$\mathbf{HM} = \begin{bmatrix} x_1^1 & \cdots & x_n^1 & f(\mathbf{x}^1) \\ \vdots & \ddots & \vdots & \vdots \\ x_1^{\mathrm{HMS}} & \cdots & x_n^{\mathrm{HMS}} & f(\mathbf{x}^{\mathrm{HMS}}) \end{bmatrix}$$

In the above equation, the harmony memory size (HMS) is the population number and n is the number of variable dimensions. All population member dimensions are set randomly within the authorized range. The harmony memory consideration rate (HMCR) and pitch adjustment rate (PAR) are assigned constant initial values within (0, 1) and (0.3, 0.5), respectively.

3. Generate a new point x_new as follows. For each of the n dimensions, the corresponding value is taken from a randomly chosen member of the HM with probability HMCR; otherwise, the value is set randomly within the authorized range:

$$x_{\mathrm{new},j} = \begin{cases} x_j^i, \; i \in \{1,\dots,\mathrm{HMS}\} \text{ chosen at random}, & \text{with probability HMCR},\\ \mathrm{LB}_j + r\,(\mathrm{UB}_j - \mathrm{LB}_j), \; r \sim U(0,1), & \text{otherwise}. \end{cases}$$

With probability PAR, a memory-considered value is further pitch-adjusted by a step of size bw, that is, $x_{\mathrm{new},j} \leftarrow x_{\mathrm{new},j} \pm r \cdot bw$.

4. Compare the new harmony vector x_new to the worst member of the population and replace that member if x_new possesses a better cost.

5. Check the termination criteria; return to step 3 if they are not met, otherwise the optimum point has been found.

The pseudo-code of the HS algorithm is depicted in Figure 1, 25 and a minimal runnable sketch is given below.
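To make the five steps concrete, the following is a minimal Python sketch of the classical HS loop described above. It is only an illustration: the sphere cost function, the bounds, and the parameter values are assumptions chosen for demonstration, not settings taken from the paper.

```python
import random

def harmony_search(f, n, lb, ub, hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=5000):
    """Minimal classical HS: memory consideration, pitch adjustment, random selection."""
    # Steps 1-2: initialize the harmony memory (HM) randomly within the bounds.
    hm = [[random.uniform(lb, ub) for _ in range(n)] for _ in range(hms)]
    costs = [f(x) for x in hm]
    for _ in range(iters):
        # Step 3: improvise one new harmony, dimension by dimension.
        x_new = []
        for j in range(n):
            if random.random() < hmcr:
                # Memory consideration: copy dimension j from a random HM member.
                xj = hm[random.randrange(hms)][j]
                if random.random() < par:
                    # Pitch adjustment: perturb within the bandwidth bw.
                    xj += random.uniform(-1, 1) * bw
                xj = min(max(xj, lb), ub)
            else:
                # Random selection within the permissible range.
                xj = random.uniform(lb, ub)
            x_new.append(xj)
        # Step 4: replace the worst member if the new harmony is better.
        worst = max(range(hms), key=lambda i: costs[i])
        c_new = f(x_new)
        if c_new < costs[worst]:
            hm[worst], costs[worst] = x_new, c_new
    # Step 5 here is simply a fixed iteration budget.
    best = min(range(hms), key=lambda i: costs[i])
    return hm[best], costs[best]

# Illustrative usage on the sphere function (optimum at the origin).
if __name__ == "__main__":
    sphere = lambda x: sum(v * v for v in x)
    x_best, f_best = harmony_search(sphere, n=10, lb=-100, ub=100)
    print(f_best)
```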

ENHANCED HS ALGORITHMS
There have been many efforts to improve the original HS algorithm. The PAR and HMCR parameters used in step 3 of the algorithm are important because they determine the accuracy and speed of the algorithm. Adjusting these parameters is crucial, and therefore most of the updated HS versions have focused on quantifying these two parameters; the HMCR and PAR values strongly influence the population diversity and solution accuracy, respectively. 31-33 Improvements to nearly all optimization algorithms are accomplished either by adjusting the algorithm parameters or by combining the algorithm with others. Dynamic parameter schedules are derived from the cost-function behavior and the optimization routine.

Improved harmony search
The improved harmony search (IHS) algorithm, one of the recent versions of the HS method, has attracted research interest in recent years. 32 The pseudo-code of IHS is shown in Figure 2.

The IHS algorithm adapts the PAR and bandwidth (bw) values at the tth iteration. In the standard IHS formulation, 32 PAR grows linearly while bw decays exponentially over the NI total iterations:

$$\mathrm{PAR}(t) = \mathrm{PAR}_{\min} + \frac{\mathrm{PAR}_{\max} - \mathrm{PAR}_{\min}}{NI}\, t$$

$$bw(t) = bw_{\max} \exp\!\left(\frac{\ln\left(bw_{\min}/bw_{\max}\right)}{NI}\, t\right)$$

where PAR_min, PAR_max, bw_min, and bw_max are the minimum and maximum pitch adjustment rates and bandwidths, respectively.
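These two schedules can be written compactly. The sketch below assumes the standard IHS forms reconstructed above, with illustrative default values for the minima and maxima.

```python
import math

def ihs_par(t, ni, par_min=0.01, par_max=0.99):
    # PAR grows linearly from par_min to par_max over the NI iterations.
    return par_min + (par_max - par_min) * t / ni

def ihs_bw(t, ni, bw_min=1e-4, bw_max=1.0):
    # bw decays exponentially from bw_max to bw_min over the NI iterations.
    return bw_max * math.exp(math.log(bw_min / bw_max) * t / ni)
```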

Global-best harmony search
The GHS algorithm is one of the most robust versions of the HS algorithm; it emphasizes the use of the best member for generating a new point in the iterative procedure of the HS algorithm. 30 The general GHS implementation procedure is illustrated in Figure 3. The GHS algorithm improves the classical HS algorithm by utilizing all individuals' experiences and exploiting the best experience. However, the algorithm can still generate each dimension randomly within the permissible range, which results in slow convergence.
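The key difference from classical HS lies in the pitch adjustment step: in GHS, as described in reference 30, pitch adjustment is replaced by copying a randomly chosen dimension of the best harmony in memory. The sketch below illustrates one improvisation under that reading; all names and default values are illustrative assumptions.

```python
import random

def ghs_improvise(hm, costs, lb, ub, hmcr, par):
    """One GHS improvisation step (sketch of the scheme in reference 30)."""
    hms, n = len(hm), len(hm[0])
    best = min(range(hms), key=lambda i: costs[i])
    x_new = []
    for j in range(n):
        if random.random() < hmcr:
            # Memory consideration, as in classical HS.
            xj = hm[random.randrange(hms)][j]
            if random.random() < par:
                # "Global-best" adjustment: copy a random dimension of the best member.
                xj = hm[best][random.randrange(n)]
        else:
            # Random selection in the permissible range: the source of slow convergence.
            xj = random.uniform(lb, ub)
        x_new.append(xj)
    return x_new
```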

Self-adaptive global-best harmony search
The self-adaptive global-best harmony search (SGHS) algorithm is based on the GHS algorithm. 34 The pseudo-code for new member generation in each iteration is shown in Figure 4. HMCR, PAR, and bw are dynamically quantified in this algorithm. The bw parameter decreases linearly during the run: a large bw value increases the search range of the algorithm, whereas a small bw value increases the depth of the search. The value of bw at each iteration is determined by the following relation:

$$bw(t) = \begin{cases} bw_{\max} - \dfrac{bw_{\max} - bw_{\min}}{NI}\, 2t, & t < NI/2,\\ bw_{\min}, & t \ge NI/2, \end{cases}$$

where bw(t) corresponds to iteration t, NI is the total number of iterations, and bw_min and bw_max are the minimum and maximum bandwidths. The HMCR parameter determines the usage rate of the members of the population: a large HMCR value increases the convergence rate and local search power, whereas a small value increases diversity. In SGHS, these two parameters are not treated as constants. After every learning period of LP iterations, they are resampled from normal distributions, with an initial HMCR mean of 0.98 (standard deviation 0.01) and an initial PAR mean of 0.9 (standard deviation 0.05).
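A compact sketch of these SGHS parameter schedules follows, assuming the piecewise-linear bw relation reconstructed above; the bw_min and bw_max defaults are illustrative assumptions.

```python
import random

def sghs_bw(t, ni, bw_min=0.0005, bw_max=0.05):
    # Linear decrease over the first half of the run, then constant at bw_min.
    if t < ni / 2:
        return bw_max - (bw_max - bw_min) * 2.0 * t / ni
    return bw_min

def sghs_sample_parameters(hmcr_mean=0.98, par_mean=0.9):
    # Every LP iterations, HMCR and PAR are redrawn from normal distributions.
    return random.gauss(hmcr_mean, 0.01), random.gauss(par_mean, 0.05)
```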

Figure 3: The GHS algorithm.
Figure 4: New member production routine per iteration in the SGHS algorithm.
Figure 5: New member production procedure in NGHS.

Novel global-best harmony search
More recently, a newer algorithm called the novel global harmony search (NGHS) algorithm was proposed by Zou et al. 33 The NGHS pseudo-code is depicted in Figure 5. In NGHS, new points are generally produced in the neighborhood of the current best point. The algorithm discards the HMCR parameter, which can increase the probability of falling into local optima.
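Under the position-updating scheme described in reference 33, one NGHS improvisation can be sketched as below. The genetic mutation probability pm is an assumed illustrative value; it is not specified in this excerpt.

```python
import random

def nghs_improvise(hm, costs, lb, ub, pm=0.1):
    """One NGHS improvisation step (sketch of the scheme in reference 33).

    Each dimension is drawn between the worst member and a point mirrored
    past the best member; a genetic mutation (probability pm, an assumed
    value) re-randomizes dimensions to reduce premature convergence.
    """
    hms, n = len(hm), len(hm[0])
    best = min(range(hms), key=lambda i: costs[i])
    worst = max(range(hms), key=lambda i: costs[i])
    x_new = []
    for j in range(n):
        # Step toward (and slightly beyond) the best member.
        x_r = 2.0 * hm[best][j] - hm[worst][j]
        x_r = min(max(x_r, lb), ub)
        xj = hm[worst][j] + random.random() * (x_r - hm[worst][j])
        if random.random() <= pm:
            xj = random.uniform(lb, ub)  # genetic mutation
        x_new.append(xj)
    return x_new
```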

Enhanced adaptive global-best harmony search
In this study, we develop an enhanced adaptive GHS (EAGHS) method to overcome the limitations of the existing algorithms. In particular, new update equations for x_new and bw are introduced in the search process; the pseudo-code of the proposed EAGHS algorithm, including these equations, is shown in Figure 6. The value of bw_max is set to 3 and bw_min is assigned 0.05 to increase the search power in early iterations; as the run approaches the final iterations, the bandwidth shrinks to enhance the accuracy of the algorithm. Because of the overwhelming use of the best population member's experience, the probability of becoming trapped at the beginning of the run is high in GHS; the proposed dynamic coefficient resolves this weakness. The PAR value is determined using Equation (3).
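The paper's exact EAGHS update equations appear in Figure 6 and are not reproduced in this excerpt. The sketch below is therefore only an illustration of the stated behavior, namely a bandwidth that shrinks from bw_max = 3 toward bw_min = 0.05 as iterations progress; the exponential-decay form is an assumption borrowed from the IHS schedule earlier, not the authors' equation.

```python
import math

def eaghs_bw(t, ni, bw_min=0.05, bw_max=3.0):
    # Illustrative only: bandwidth decays from bw_max = 3 (strong early
    # exploration) to bw_min = 0.05 (fine-grained late search). The true
    # EAGHS update rule is given in the paper's Figure 6.
    return bw_max * math.exp(math.log(bw_min / bw_max) * t / ni)
```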

Comparison of HS variations
To evaluate the proposed EAGHS algorithm, its performance is compared with HS, IHS, 32 SGHS, 34 and NGHS 33 using the 10 complex benchmark functions reported in References 35,36. The optimal points of these functions lie at the origin of the coordinate system. The average, maximum (worst), minimum (best), and standard deviation of the algorithms over 50 independent runs with the same starting point are used to eliminate the effect of fortunate or unfortunate coincidences. The specifications of the functions are listed in Table 1, and 2D representations of the benchmark mathematical functions are shown in Figure 7.

Table 1: Benchmark function names and search ranges (eg, Sphere, Step, and Ackley's).

The following parameters are used for each algorithm:

The convergence graphs corresponding to each algorithm in 10- and 30-dimensional spaces are shown in Figures 8 and 9, respectively. It can be seen that the proposed algorithm notably improves the convergence. Table 2 summarizes the results of the HS variations on the 10 benchmark functions; note that 30 independent simulations are performed by each HS variation. In Table 2, the best solutions (lowest is best) are highlighted in bold.

Effect of HMCR
In this section, we investigate the effect of the HMCR parameter on the performance of the HS variants. The results for the same 10 benchmark functions using various HMCR values (ie, 0.5, 0.7, and 0.99) are summarized in Table 3. The results are reported in terms of the averages and standard deviations of 20 independent experimental replications. In general, the performance of the HS variations improves as the HMCR value increases. A larger HMCR value means a higher probability of using the HM, and therefore exploration is decreased; in contrast, a small HMCR value increases diversity and slows convergence. In the case of HMCR = 0.99, the diversity is almost lost and the HS variations easily get stuck in local minima.

Scalability study: results for 100-dimensional problems
In this section, the effect of larger dimensions (ie, N = 100) on the performance of the HS variations is investigated using the same 10 functions. The same parameter values as for N = 30 are used. The results shown in Table 4 imply that the HS variations achieve almost the same relative performance for N = 100 as for N = 30, although the solutions are not better than those produced for the 30-dimensional problems. In general, increasing the number of decision variables (ie, the dimensionality) adversely affects the results produced.

CONCLUSIONS
This paper presents an improved algorithm called EAGHS for global optimization problems. In the proposed algorithm, a dynamic parameter is added to the GHS algorithm to avoid falling into local optima. Several test functions are used to test the proposed algorithm, and a comparative study is conducted to provide insight into the performance of EAGHS relative to other robust variants of the HS algorithm. The results show better accuracy and faster convergence of EAGHS compared to the other algorithms. In addition, the EAGHS algorithm incurs a lower time penalty than its counterparts. Future research can focus on developing binary and multi-objective EAGHS algorithms; for example, EAGHS can be adapted into the Non-dominated Sorting Genetic Algorithm II (NSGA-II) framework for solving multi-objective problems. One limitation of the EAGHS algorithm is that its overall performance depends heavily on fine-tuning its parameters; however, different chaotic maps can be utilized to tune the performance of the EAGHS algorithm. Furthermore, future studies can harness the power of the EAGHS algorithm for real-world applications such as robotics and automation, motion planning, worker scheduling, the vehicle routing problem, assembly line balancing, shortest sequence planning, sensor placement, unmanned aerial vehicle (UAV) communication, and so on. EAGHS can also be considered a viable tool for tackling optimization problems with incomplete or imperfect information or limited computation capacity.