Article
EBBA: An Enhanced Binary Bat Algorithm Integrated with
Chaos Theory and Lévy Flight for Feature Selection
Jinghui Feng 1,2 , Haopeng Kuang 1 and Lihua Zhang 1, *
1 Academy for Engineering & Technology, Fudan University, Shanghai 200433, China;
jhfeng20@fudan.edu.cn (J.F.); hpkuang19@fudan.edu.cn (H.K.)
2 Academy for Electromechanical, Changchun Polytechnic, Changchun 130033, China
* Correspondence: lihuazhang@fudan.edu.cn
Abstract: Feature selection can efficiently improve classification accuracy and reduce the dimension
of datasets. However, feature selection is a challenging and complex task that requires a high-
performance optimization algorithm. In this paper, we propose an enhanced binary bat algorithm
(EBBA) which is originated from the conventional binary bat algorithm (BBA) as the learning algo-
rithm in a wrapper-based feature selection model. First, we model the feature selection problem and
then transfer it as a fitness function. Then, we propose an EBBA for solving the feature selection
problem. In EBBA, we introduce the Lévy flight-based global search method, population diversity
boosting method and chaos-based loudness method to improve the BA and make it more applica-
ble to feature selection problems. Finally, the simulations are conducted to evaluate the proposed
EBBA and the simulation results demonstrate that the proposed EBBA outmatches other comparison
benchmarks. Moreover, we also illustrate the effectiveness of the proposed improved factors by tests.
Keywords: feature selection; bat algorithm; optimization; chaos theory; Lévy flight
1. Introduction

Communication networks, computers, and artificial intelligence technologies provide a vast array of tools and techniques to improve efficiency. With the development of these technologies and tools, huge amounts of data are generated, stored, and utilized [1]. For example, a large number of IoT devices monitor, sense, and generate continuous data from the edge [2]. In addition, operators retain large amounts of historical user data in various transaction and computing platforms. Then again, the generation of rich media data such as short videos and motion pictures also makes the amount of data in the network grow exponentially [3]. Machine learning can effectively use these data and learn rules and patterns from them to help people make predictions and decisions. Machine learning algorithms have been successfully applied to various fields of life, such as medicine, materials science, and physics. Specifically, machine learning algorithms can extract features from various types of data and use the features to train models for classification, regression, and clustering operations [4].

Despite the advantages of the mentioned machine learning algorithms in terms of effectiveness, wide application, and malleability, there are still some challenges and urgent issues that machine learning algorithms need to address. First, the training of machine learning algorithms is a time-consuming, computationally intensive, and energy-intensive process, which can lead to limited applications of machine learning algorithms [5]. Second, the training datasets of machine learning algorithms are derived from the features they extract, which are mostly extracted using automated tools and human experience, and have many repetitive, meaningless, or even misleading features [6]. These features can slow down the training process of machine learning algorithms even more and reduce the effectiveness of classification, clustering, and regression of machine learning algorithms.
Therefore, the elimination of these useless and redundant features is important to improve
the performance of machine learning algorithms and reduce their training consumption.
Feature selection is an effective means to solve the above problem. Feature selection
can eliminate useless and redundant features in the dataset, thus reducing the number of
features and improving the classification accuracy of machine learning algorithms. For
a dataset with N features, there are 2^N feature selection schemes available, producing a
combinatorial explosion. Therefore, selecting a subset of features with high classification
accuracy and a low number of features can be regarded as an optimization problem. On
the other hand, the feature selection problem has also been proven to be NP-hard. It is
important to select and propose a suitable algorithm to solve the feature selection problem.
In general, feature selection methods can be classified into three categories, namely
filter-based methods, wrapper-based methods, and embedded methods. Specifically, filter-
based methods use a statistical measure that gives a score to the relevance of each feature
in the dataset, by which the importance of the feature can be quantified. Subsequently, the
decision-maker can set a threshold to remove features with scores below the threshold,
thus achieving a reduction in the number of features. However, such methods do not
consider the complementarity and mutual exclusivity among features, and therefore the
classification accuracy obtained by the subset of features selected by such methods is
low [7]. Embedded-based methods are a special kind of wrapper-based method, so they are
not discussed in this paper. The wrapper-based methods introduce classifiers and learning
algorithms. The learning algorithm continuously generates new feature subsets, while the
classifier evaluates the generated feature subsets and selects the optimal one in continuous
iterations [8]. In this type of method, the classification accuracy of feature selection is high, but more time is consumed due to the introduction of classifiers. On the other hand, the performance of this type of method depends heavily on the learning algorithm.
Akin to some previous works [7–9], we aim to adopt the swarm intelligence algorithm
as the learning algorithm in the wrapper-based feature selection method. Specifically, a
swarm intelligence algorithm studies the complex behaviors of a swarm consisting of several simple agents. By iteratively updating the swarm, the agents gradually achieve more powerful performance, so that the algorithm can provide a sub-optimal solution. Swarm
intelligence has the benefits of high convergence and powerful solving ability. Moreover,
swarm intelligence can also handle NP-hard problems such as feature selection. Thus, it can
be seen as an effective method to overcome the challenges of feature selection. For instance,
some well-known swarm intelligence algorithms, i.e., the genetic algorithm (GA) [10], particle swarm optimization (PSO) [11], the dragonfly algorithm (DA) [12], the ant-lion optimizer (ALO) [13], and the grey wolf optimizer (GWO) [14], have been applied in feature selection.
Bat algorithm (BA) and binary BA (BBA) are promising forms of swarm intelligence
and have been demonstrated to be better than other algorithms in some applications due
to their effectiveness and high performance. However, as suggested by the no-free-lunch (NFL) theorem, no single algorithm can suitably solve all optimization problems. In addition, BA also has some shortcomings in solving feature selection problems. Thus, we
aim to enhance the performance of BA for solving feature selection. The contributions of
this work are summarized as follows:
• We show that feature selection is a multi-objective optimization problem, and we present the decision variables and the optimization goals of the feature selection problem.
• We propose an enhanced binary BA (EBBA) for solving the feature selection problem. In EBBA, we propose a Lévy flight-based global search method, which enables the algorithm to jump out of local optima. Moreover, we propose a population diversity boosting method so that the exploration capability of the algorithm can be further enhanced. In addition, we use a recently proposed chaotic mapping to assign values to a key parameter of the algorithm, thus enhancing the exploitation capability of the algorithm.
• Simulations are conducted based on open datasets from the UC Irvine Machine Learning Repository to verify the solving ability of the proposed EBBA. First, we introduce some benchmark algorithms for comparison. Then, we show the effectiveness of the proposed improved factors.
The rest of this work is arranged as follows. Section 2 reviews some key related works
about swarm intelligence algorithms and feature selection. Section 3 gives the model of
feature selection. Section 4 proposes the EBBA and details the improved factors. Section 5
provides simulation results and Section 6 concludes this work.
2. Related Works
In this work, we aim to use one of the swarm intelligence algorithms, i.e., BA, to solve
the feature selection problem, and thus some key related works are briefly introduced in
this section.
Li et al. [29] proposed an improved binary GWO (IBGWO) algorithm for solving
feature selection problems, in which an enhanced opposition-based learning (E-OBL) ini-
tialization and a local search strategy were proposed for improving the performance of
the algorithm. Kale et al. [30] presented four different improved versions of the sine
cosine algorithm (SCA), in which the improvements and innovations focus on the updating mechanism of SCA. Ouadfel et al. [31] proposed a hybrid feature selection approach based
on the ReliefF filter method and equilibrium optimizer (EO), which is composed of two
phases and tested in some open datasets. Abdel-Basset et al. [14] proposed three vari-
ants of BGWO in addition to the standard variant, applying different transfer functions
to tackle the feature selection problem. In [32], two different wrapper feature selection
approaches were proposed based on the farmland fertility algorithm (FFA), which are denoted as BFFAS and BFFAG, and these methods are effective in solving feature selection problems. On the other hand, BA and some of its variants have been adopted for solving feature
selection problems. Varma et al. [33] proposed a bat optimization algorithm for wrapper-
based feature selection and conducted simulations based on the CICInvesAndMal2019
benchmark dataset. Naik et al. [34] proposed a feature selection method to identify the
relevant subset of features for the machine-learning task using the wrapper approach via
BA. Rodrigues et al. [35] presented a wrapper feature selection approach based on bat
algorithm (BA) and optimum-path forest (OPF). In [36], the authors proposed an improved
BPSO algorithm as an essential pre-processing tool for solving classification problems, in which a new updating mechanism for calculating Pbest and Gbest was proposed. Moreover, the authors in [37] proposed a binary DA (BDA) and used it to solve feature selection problems. Likewise, Nakamura et al. [38] proposed a binary version of the bat algorithm, i.e., BBA, and evaluated its performance in solving feature selection problems.
In addition, in [39], a new hybrid feature selection method was proposed by combining the sine cosine algorithm (SCA) and the genetic algorithm (GA) for solving feature selection problems. Furthermore, Nagpal et al. [40] proposed a feature selection method via the binary gravitational search algorithm (BGSA) for medical datasets, which can reduce the number of features by an average of 66% and enhance the accuracy of prediction.
The aforementioned methods can solve feature selection problems in various appli-
cations. However, according to NFL theory, different swarm intelligence algorithms may
have different performances in various applications. Therefore, the existing methods are
insufficient to solve all feature selection problems, which motivates us to propose an EBBA
to handle more feature selection problems in this work.
3. Feature Selection Model

In the wrapper-based feature selection model, a candidate feature subset is encoded as a binary decision vector

X = (x_1, x_2, ..., x_{N_dim}), x_j ∈ {0, 1}, (1)

where N_dim is the number of features, in other words, the dimension number of the dataset, and x_j = 1 indicates that the jth feature is selected while x_j = 0 indicates that it is discarded. Under this model, there are two main objectives of the feature selection, i.e., to reduce the classification error rate of the obtained feature subsets, and to reduce the feature number of
feature subsets. Thus, the feature selection problem is a multi-objective problem in which the first objective can be expressed as follows:

(Objective 1) f_1 = 1 − f_acc, (2)

where f_acc is the classification accuracy of the obtained feature subsets. Note that we introduce KNN as the classifier to evaluate the feature subsets, and the reasons are analyzed in the following section. Moreover, the second objective of this work is to reduce the feature number of feature subsets, which can be expressed as follows:
(Objective 2) f_2 = N'_dim / N_dim, (3)

where N'_dim is the feature number of the selected feature subsets. To simultaneously account for the aforementioned objectives, we introduce the fitness function as follows:

f_fit = a × f_1 + b × f_2, (4)
where a ∈ [0, 1] and b = 1 − a are constants that denote the weights of the two objectives f 1
and f 2 , respectively. Specifically, we can increase a to obtain a higher classification accuracy
or increase b to obtain a smaller dimensional feature subset.
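For clarity, the following minimal Python sketch shows how such a fitness function can be evaluated for a candidate binary feature mask. It assumes a scikit-learn KNN classifier with k = 5 and the weights a = 0.99 and b = 0.01 used later in Section 5; the names X_data, y, and mask are placeholders.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

def fitness(mask, X_data, y, a=0.99, b=0.01):
    """Weighted fitness of a binary feature mask: a * error rate + b * subset ratio."""
    if mask.sum() == 0:                      # an empty subset cannot be evaluated
        return float("inf")
    X_sel = X_data[:, mask.astype(bool)]
    X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.2, random_state=0)
    knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
    f1 = 1.0 - knn.score(X_te, y_te)         # Objective 1: classification error rate
    f2 = mask.sum() / mask.size              # Objective 2: fraction of selected features
    return a * f1 + b * f2
```

A lower fitness value therefore corresponds to a better feature subset, which is the convention used by the algorithm below.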
4. Proposed Algorithm
Based on the aforementioned feature selection model, we can optimize the decision
variables shown in Equation (1) to obtain a better fitness function shown in Equation (4).
Accordingly, we propose an EBBA in this section for solving the feature selection problem.
4.1. Conventional BA
BA is a swarm intelligence algorithm for global optimization, which is inspired by
the echolocation behavior of bats. Specifically, bats look for prey by flying at a random
velocity V_i at a random position X_i with a fixed frequency f_min, varying wavelength λ, and loudness A_0. Depending on the proximity of their target, these bats can autonomously modify the wavelength (in other words, the frequency) of their generated pulses as well as the rate of pulse emission r in the range of [0, 1]. The corresponding mathematical model of BA can be detailed as follows. In the tth iteration, the frequency f_i of the ith bat is expressed as follows:
f_i = f_min + (f_max − f_min) × β, (5)

where f_max and f_min are upper and lower bounds on the frequencies of all bats, respectively, and β is a random number between [0, 1].
Moreover, the velocity of the ith bat v_i can be modeled as follows:

v_i^{t+1} = v_i^t + (X_i^t − X*) × f_i, (6)

where X* is the bat with the highest fitness function value of the swarm. In addition, the position of the ith bat is updated as follows:

X_i^{t+1} = X_i^t + v_i^{t+1}, (7)

where X_i^t is the position of the ith bat in the tth iteration.
Furthermore, BA also enhances search ability through local random walks. Specifically,
BA asks the best bat in the swarm to conduct a local search with a certain probability, which
can be expressed as follows:
X_N = X* + ε × A^t, (8)

where X_N is the newly generated bat after the random walk, A^t is the average loudness of all bats in the tth iteration, and ε is a random number in [−1, 1].
Additionally, the loudness A_i and the rate r_i of pulse emission are also updated as the iterations proceed, which is shown as follows:

A_i^{t+1} = α × A_i^t, r_i^{t+1} = r_i^0 × [1 − exp(−γ × t)], (9)

where α ∈ (0, 1) and γ > 0 are constants, and r_i^0 is the initial pulse emission rate of the ith bat.
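As an illustration of these continuous operators, the sketch below performs one BA update step following Equations (5)–(8). The frequency bounds, loudness, and pulse rate values are placeholder assumptions, and the local-search trigger follows the condition used later in Algorithm 1.

```python
import numpy as np

def ba_step(X, V, X_best, f_min=0.0, f_max=2.0, A=0.9, r=0.5, rng=None):
    """One update step of the conventional (continuous) bat algorithm.

    X, V: (n_bats, n_dim) positions and velocities; X_best: current best bat (n_dim,).
    f_min, f_max, A (loudness) and r (pulse rate) are placeholder values.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_bats, n_dim = X.shape
    beta = rng.random((n_bats, 1))
    f = f_min + (f_max - f_min) * beta            # Eq. (5): random frequency per bat
    V_new = V + (X - X_best) * f                  # Eq. (6): velocity pulled toward the best bat
    X_new = X + V_new                             # Eq. (7): position update
    for i in range(n_bats):
        if rng.random() < r:                      # local-search trigger (condition as in Algorithm 1)
            eps = rng.uniform(-1.0, 1.0, n_dim)
            X_new[i] = X_best + eps * A           # Eq. (8): random walk around the best bat
    return X_new, V_new
```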
4.2. BBA
To make BA able to handle the binary solution space of feature selection, Mirjalili et al. [41] introduced a binary operator. Specifically, the authors introduced a V-shaped transfer function to map the continuous velocities into the binary solution space, which can be expressed as follows:
V(v_{i,j}^t) = |(2/π) × arctan((π/2) × v_{i,j}^t)|, (10)

x_{i,j}^{t+1} = (x_{i,j}^t)^{−1}, if rand < V(v_{i,j}^{t+1}),
x_{i,j}^{t+1} = x_{i,j}^t, if rand ≥ V(v_{i,j}^{t+1}), (11)

where x_{i,j}^t and v_{i,j}^t indicate the position and velocity of the ith individual at the tth iteration in the jth dimension, and (x_{i,j}^t)^{−1} is the complement of x_{i,j}^t. As such, BBA can handle and update the binary decision variables reasonably.
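A compact sketch of the transfer and flip rules in Equations (10) and (11), as reconstructed above; the vectorized NumPy form is an implementation choice.

```python
import numpy as np

def v_transfer(v):
    """V-shaped transfer function of Eq. (10): maps a velocity to a flip probability in [0, 1)."""
    return np.abs((2.0 / np.pi) * np.arctan((np.pi / 2.0) * v))

def binary_update(x, v, rng=None):
    """Eq. (11): complement each bit whose transfer value exceeds a uniform random draw."""
    rng = np.random.default_rng() if rng is None else rng
    flip = rng.random(x.shape) < v_transfer(v)
    return np.where(flip, 1 - x, x)

# Example: binary_update(np.array([0, 1, 1, 0]), np.array([0.2, -3.0, 0.1, 2.5]))
```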
4.3. EBBA
Conventional BA may confront some key challenges in solving feature selection problems. First, when dealing with the large solution space of the feature selection problem, BA may lack exploration ability, which may make the algorithm fall into local optima. Second, the bats of BA are guided only by the best bat of the swarm, i.e., X*, which means that the population diversity is low for large-scale datasets. Third, the exploration and exploitation abilities of BA should be further balanced. Finally, BA is proposed for continuous problems, whereas feature selection problems have a binary solution space. Thus, these reasons motivate us to enhance BA for better feature selection performance. The
main steps of the proposed EBBA are detailed in Algorithm 1, and the corresponding improved factors are as follows:
Algorithm 1 EBBA
1 Define the related parameters: population size N_pop, bat dimension N_dim, maximum iteration t_max, and fitness function, etc.;
2 Initialize the bat population, pulse frequencies, pulse rates and loudness;
3 for t = 1 to t_max do
4   Generate new bats by adjusting frequencies, and updating velocities and locations by using Equations (5), (6) and (11);
5   if N_rand < r_i then
6     if N_rand < r_i' then
7       Select the second-best bat;
8     end
9     else
10      Select the best bat;
11    end
12    Generate a new bat around the selected bat by using Equation (8) or (12);
13  end
14  if N_rand < A_i and f_fit(X_i) < f_fit(X* or X**) then
15    Accept the new bats and update r_i and A_i by using Equations (9) and (15);
16  end
17  Rank the bats and find the current best X*;
18  Generate X_new by using Equation (12);
19  if f_fit(X*) > f_fit(X_new) then
20    X* = X_new;
21  end
22 end
23 Return X*; // X* is the final feature selection result of the EBBA
where α is a parameter and its value is often assigned according to applications. Moreover,
Lévy flight is taken from the Lévy distribution, which can be expressed as follows:
Second, the newly generated bat X_new is evaluated to obtain its fitness value, and then we compare X_new with the best bat X* based on their fitness function values. If X_new outmatches X*, then X* = X_new. By using this method, the best bat of the swarm can more easily jump out of local optima, thereby enhancing the exploration ability of EBBA.
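Since Equations (12) and (13) are not reproduced here, the sketch below only illustrates the general idea: it draws Lévy-distributed steps with Mantegna's algorithm, a common way of implementing Lévy flight in swarm optimizers, and perturbs the best bat with them. The step scale and the exponent β = 1.5 are assumptions, not the paper's exact settings.

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(n_dim, beta=1.5, rng=None):
    """Draw one Lévy-flight step via Mantegna's algorithm (stability exponent beta, assumed 1.5)."""
    rng = np.random.default_rng() if rng is None else rng
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2)
               / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, n_dim)
    v = rng.normal(0.0, 1.0, n_dim)
    return u / np.abs(v) ** (1 / beta)

def levy_global_search(x_best, scale=0.1, rng=None):
    """Generate a candidate around the best bat using a Lévy-flight perturbation (assumed scale)."""
    return x_best + scale * levy_step(x_best.size, rng=rng)
```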
X_N2 = X** + ε × A^t, (14)

where X_N2 is the newly generated bat around the second-best bat X**, and which of the two methods is used is determined by a parameter r', which can be expressed as follows:

r_i^{t'} = r_i^t / 2, (15)

By using this method, the bat swarm is simultaneously guided by the best bat and the second-best bat, thereby enhancing the population diversity of the algorithm.
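The following sketch illustrates this branching between the best and second-best bats (Equations (8), (14) and (15)); the outer pulse-rate trigger from Algorithm 1 is omitted for brevity, and the loudness value is a placeholder.

```python
import numpy as np

def local_walk(x_best, x_second, r_i, A_t, rng=None):
    """Local search guided by the best bat X* (Eq. (8)) or the second-best bat X** (Eq. (14)).

    With probability r_i' = r_i / 2 (Eq. (15)) the walk is centred on the second-best bat.
    """
    rng = np.random.default_rng() if rng is None else rng
    r_prime = r_i / 2.0                                  # Eq. (15)
    centre = x_second if rng.random() < r_prime else x_best
    eps = rng.uniform(-1.0, 1.0, x_best.size)
    return centre + eps * A_t
```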
A_i^t = C^t, (16)

where C^t is the value of the fractional one-dimensional chaotic map at the tth iteration, which can be expressed as follows:
C^{t+1} = f(C^t) =
  2/(C^t + α) − 1 − βC^t,   if C^t ∈ [0, 1/α],
  −2/(C^t + α) − 1 − βC^t,  if C^t ∈ [−1/α, 0),   (17)
where α and β are two real parameters, and they are assigned as 0.001 and 0.9 in this
work, respectively. By using the high chaotic behavior of the method, the exploitation and
exploration abilities of EBBA can be balanced.
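A sketch of the chaos-based loudness assignment follows. It implements the piecewise fractional map exactly as reconstructed in Equation (17) (the exact form in [42] may differ), with α = 0.001 and β = 0.9 as stated above and an assumed initial value C^0 = 0.7.

```python
def fractional_chaotic_map(c, alpha=0.001, beta=0.9):
    """One iteration of the piecewise fractional 1-D map as reconstructed in Eq. (17)."""
    if c >= 0.0:
        return 2.0 / (c + alpha) - 1.0 - beta * c
    return -2.0 / (c + alpha) - 1.0 - beta * c

def chaotic_loudness(t_max, c0=0.7, alpha=0.001, beta=0.9):
    """Generate the loudness value A^t = C^t (Eq. (16)) for every iteration t."""
    values, c = [], c0
    for _ in range(t_max):
        c = fractional_chaotic_map(c, alpha, beta)
        values.append(c)
    return values
```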
5. Simulations
In this section, we conduct the simulations to evaluate the proposed EBBA. First, the
datasets and setups are presented. Second, we compare the EBBA with some benchmark
algorithms. Third, we illustrate the effectiveness of the improved factors.
Table 1. Datasets.
Moreover, the CPU used is an 11th Gen Intel(R) Core(TM) i7-11700 @ 2.50 GHz and the RAM is 16 GB. We use Python to implement the simulation codes and adopt KNN (k = 5). Note that we use the KNN classifier since it is simple, efficient, and highly accurate; it is also insensitive to outliers and requires no data input assumptions. Moreover, using a simple and relatively cheap classification algorithm in a wrapper approach can obtain a good feature subset that is also suitable for complex classification algorithms. In contrast, if an advanced classification algorithm is used for wrapper-based feature selection, the obtained feature subset may fail for simple classification algorithms. The reason is that when using advanced classification algorithms, the learning algorithm of the wrapper approach (e.g., the proposed EBBA) will capture the characteristics of the classification algorithm instead of the relationships among different features. In addition, a and b in the fitness function are set to 0.99 and 0.01, respectively. Furthermore, in this paper, binary PSO (BPSO) [36], BGWO [43], BDA [37], and BBA [41] are introduced as the benchmarks, and the key parameters of these algorithms are shown in Table 2. Note that the population size and iteration number of EBBA and the other benchmarks are set as 24 and 100, respectively. Additionally, to avoid random bias in the experiments, each algorithm is run 30 times independently on the selected datasets, as suggested by the central limit theorem. Moreover, 80% of the instances are utilized for training, while the remaining 20% are used for testing [7,44,45]. Note that the classifier only provides feedback to the EBBA, which means that overfitting caused by the division of the dataset will not have too much impact on the feature selection method based on the wrapper and swarm intelligence.
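The evaluation protocol described above can be summarized by the following sketch; run_ebba is a placeholder for the optimizer, and only the KNN classifier with k = 5, the 80/20 split, the 30 independent runs, the population size of 24, 100 iterations, and a = 0.99, b = 0.01 reflect the settings stated in this section.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

N_RUNS, POP_SIZE, MAX_ITER = 30, 24, 100        # settings stated in this section

def evaluate_dataset(X_data, y, run_ebba):
    """Run the wrapper-based selection N_RUNS times and report the mean fitness (a = 0.99, b = 0.01)."""
    scores = []
    for seed in range(N_RUNS):
        X_tr, X_te, y_tr, y_te = train_test_split(
            X_data, y, test_size=0.2, random_state=seed)      # 80% training / 20% testing
        mask = run_ebba(X_tr, y_tr, pop_size=POP_SIZE, max_iter=MAX_ITER, seed=seed)  # boolean mask
        knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr[:, mask], y_tr)
        acc = knn.score(X_te[:, mask], y_te)
        scores.append(0.99 * (1.0 - acc) + 0.01 * mask.mean())
    return float(np.mean(scores))
```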
Table 3. Optimization results achieved by various algorithms. (The best values are highlighted
in bold).
EBBA, EBBA-IF1, EBBA-IF2, and EBBA-IF3 outperform conventional BBA, which means
that the proposed improved factors are non-trivial and effective.
Table 4. Optimization results achieved by various EBBA versions. (The best values are highlighted
in bold).
Table 5. Optimization results obtained by different algorithms under decision tree. (The best values
are highlighted in bold).
Table 6. Optimization results obtained by different algorithms under random forest. (The best values
are highlighted in bold).
6. Conclusions
In this paper, the feature selection problem, which can enhance classification and reduce data dimension, is studied. First, we model the feature selection problem and transform it into a fitness function. Then, we propose an EBBA for solving the feature selection problem. In EBBA, we introduce a Lévy flight-based global search method, a population diversity boosting method, and a chaos-based loudness method to improve BA and make it more applicable to feature selection problems. Finally, simulations are conducted to evaluate the proposed EBBA, and the simulation results demonstrate that the proposed EBBA outmatches the other comparison benchmarks. Moreover, the effectiveness of the proposed improved factors is illustrated. In the future, we intend to use more realistic datasets to
evaluate the proposed EBBA.
Author Contributions: Data curation, H.K.; Supervision, L.Z.; Writing—original draft, J.F. All authors
have read and agreed to the published version of the manuscript.
Funding: This research was supported by National Natural Science Foundation of China under Grant
82090052, and Shanghai Municipal Science and Technology Major Project 2021SHZDZX0103.
Data Availability Statement: Not applicable.
References
1. Xia, Q.; Zhou, L.; Ren, W.; Wang, Y. Proactive and intelligent evaluation of big data queries in edge clouds with materialized
views. Comput. Netw. 2022, 203, 108664. [CrossRef]
2. Berger, C.; Eichhammer, P.; Reiser, H.P.; Domaschka, J.; Hauck, F.J.; Habiger, G. A Survey on Resilience in the IoT: Taxonomy,
Classification, and Discussion of Resilience Mechanisms. ACM Comput. Surv. 2022, 54, 147. [CrossRef]
3. Xu, L.; Tolmochava, T.; Zhou, X. Search History Visualization for Collaborative Web Searching. Big Data Res. 2021, 23, 100180.
[CrossRef]
4. Notash, A.Y.; Bayat, P.; Haghighat, S.; Notash, A.Y. Evolutionary ensemble feature selection learning for image-based assessment
of lymphedema arm volume. Concurr. Comput. Pract. Exp. 2022, 34, e6334. [CrossRef]
5. Abdulla, M.; Khasawneh, M.T. Integration of aggressive bound tightening and Mixed Integer Programming for Cost-sensitive
feature selection in medical diagnosis. Expert Syst. Appl. 2022, 187, 115902. [CrossRef]
6. Alsahaf, A.; Petkov, N.; Shenoy, V.; Azzopardi, G. A framework for feature selection through boosting. Expert Syst. Appl. 2022,
187, 115895. [CrossRef]
7. Li, J.; Kang, H.; Sun, G.; Feng, T.; Li, W.; Zhang, W.; Ji, B. IBDA: Improved Binary Dragonfly Algorithm With Evolutionary
Population Dynamics and Adaptive Crossover for Feature Selection. IEEE Access 2020, 8, 108032–108051. [CrossRef]
8. Ji, B.; Lu, X.; Sun, G.; Zhang, W.; Li, J.; Xiao, Y. Bio-Inspired Feature Selection: An Improved Binary Particle Swarm Optimization
Approach. IEEE Access 2020, 8, 85989–86002. [CrossRef]
9. Agrawal, P.; Ganesh, T.; Oliva, D.; Mohamed, A.W. S-shaped and V-shaped gaining-sharing knowledge-based algorithm for
feature selection. Appl. Intell. 2022, 52, 81–112. [CrossRef]
10. Lappas, P.Z.; Yannacopoulos, A.N. A machine learning approach combining expert knowledge with genetic algorithms in feature
selection for credit risk assessment. Appl. Soft Comput. 2021, 107, 107391. [CrossRef]
11. Li, A.; Xue, B.; Zhang, M. Improved binary particle swarm optimization for feature selection with new initialization and search
space reduction strategies. Appl. Soft Comput. 2021, 106, 107302. [CrossRef]
12. Too, J.; Mirjalili, S. A Hyper Learning Binary Dragonfly Algorithm for Feature Selection: A COVID-19 Case Study. Knowl. Based
Syst. 2021, 212, 106553. [CrossRef]
13. Wang, M.; Wu, C.; Wang, L.; Xiang, D.; Huang, X. A feature selection approach for hyperspectral image based on modified ant
lion optimizer. Knowl. Based Syst. 2019, 168, 39–48. [CrossRef]
14. Abdel-Basset, M.; Sallam, K.M.; Mohamed, R.; Elgendi, I.; Munasinghe, K.S.; Elkomy, O.M. An Improved Binary Grey-Wolf
Optimizer With Simulated Annealing for Feature Selection. IEEE Access 2021, 9, 139792–139822. [CrossRef]
15. Bacanin, N.; Bezdan, T.; Venkatachalam, K.; Zivkovic, M.; Strumberger, I.; Abouhawwash, M.; Ahmed, A.B. Artificial Neural
Networks Hidden Unit and Weight Connection Optimization by Quasi-Refection-Based Learning Artificial Bee Colony Algorithm.
IEEE Access 2021, 9, 169135–169155. [CrossRef]
16. Zhao, H.; Zhang, C.; Zheng, X.; Zhang, C.; Zhang, B. A decomposition-based many-objective ant colony optimization algorithm
with adaptive solution construction and selection approaches. Swarm Evol. Comput. 2022, 68, 100977. [CrossRef]
17. Meng, X.; Liu, Y.; Gao, X.Z.; Zhang, H. A New Bio-inspired Algorithm: Chicken Swarm Optimization. In Advances in Swarm
Intelligence; Tan, Y., Shi, Y., Coello, C.A.C., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2014;
Volume 8794, pp. 86–94. [CrossRef]
18. Yang, X.S.; Gandomi, A.H. Bat algorithm: A novel approach for global engineering optimization. Eng. Comput. 2012, 29, 464–483.
[CrossRef]
19. Jiang, Q.; Wang, L.; Hei, X. Parameter identification of chaotic systems using artificial raindrop algorithm. J. Comput. Sci. 2015, 8,
20–31. [CrossRef]
20. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray Optimization. Comput. Struct. 2012, 112–113, 283–294. [CrossRef]
21. Chandrashekar, G.; Sahin, F. A survey on feature selection methods. Comput. Electr. Eng. 2014, 40, 16–28. [CrossRef]
22. Doshi, M. Correlation based feature selection (CFS) technique to predict student Perfromance. Int. J. Comput. Netw. Commun.
2014, 6, 197. [CrossRef]
23. Senliol, B.; Gulgezen, G.; Yu, L.; Cataltepe, Z. Fast Correlation Based Filter (FCBF) with a different search strategy. In Proceedings
of the 2008 23rd International Symposium on Computer and Information Sciences, Istanbul, Turkey, 27–29 October 2008; pp. 1–4.
24. Subramani, P.; Sahu, R.; Verma, S. Feature selection using Haar wavelet power spectrum. BMC Bioinform. 2006, 7, 432. [CrossRef]
[PubMed]
25. Azhagusundari, B.; Thanamani, A.S. Feature selection based on information gain. Int. J. Innov. Technol. Explor. Eng. (IJITEE) 2013,
2, 18–21.
26. Spolaôr, N.; Cherman, E.A.; Monard, M.C.; Lee, H.D. ReliefF for Multi-label Feature Selection. In Proceedings of the Brazilian
Conference on Intelligent Systems, BRACIS 2013, Fortaleza, Brazil, 19–24 October 2013; pp. 6–11. [CrossRef]
27. Rostami, M.; Berahmand, K.; Nasiri, E.; Forouzandeh, S. Review of swarm intelligence-based feature selection methods. Eng.
Appl. Artif. Intell. 2021, 100, 104210. [CrossRef]
28. Dhal, P.; Azad, C. A comprehensive survey on feature selection in the various fields of machine learning. Appl. Intell. 2022, 52,
4543–4581. [CrossRef]
29. Li, W.; Kang, H.; Feng, T.; Li, J.; Yue, Z.; Sun, G. Swarm Intelligence-Based Feature Selection: An Improved Binary Grey Wolf
Optimization Method. In Knowledge Science, Engineering and Management; Qiu, H., Zhang, C., Fei, Z., Qiu, M., Kung, S., Eds.;
Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2021; Volume 12817, pp. 98–110. [CrossRef]
30. Kale, G.A.; Yüzgeç, U. Advanced strategies on update mechanism of Sine Cosine Optimization Algorithm for feature selection in
classification problems. Eng. Appl. Artif. Intell. 2022, 107, 104506. [CrossRef]
31. Ouadfel, S.; Elaziz, M.A. Efficient high-dimension feature selection based on enhanced equilibrium optimizer. Expert Syst. Appl.
2022, 187, 115882. [CrossRef]
32. Hosseinalipour, A.; Gharehchopogh, F.S.; Masdari, M.; Khademi, A. A novel binary farmland fertility algorithm for feature
selection in analysis of the text psychology. Appl. Intell. 2021, 51, 4824–4859. [CrossRef]
33. Varma, P.R.K.; Mallidi, S.K.R.; Jhansi, S.J.; Dinne, P.L. Bat optimization algorithm for wrapper-based feature selection and
performance improvement of android malware detection. IET Netw. 2021, 10, 131–140. [CrossRef]
34. Naik, A.K.; Kuppili, V.; Edla, D.R. Efficient feature selection using one-pass generalized classifier neural network and binary bat
algorithm with a novel fitness function. Soft Comput. 2020, 24, 4575–4587. [CrossRef]
35. Rodrigues, D.; Pereira, L.A.M.; Nakamura, R.Y.M.; Costa, K.A.P.; Yang, X.; de Souza, A.N.; Papa, J.P. A wrapper approach for
feature selection based on Bat Algorithm and Optimum-Path Forest. Expert Syst. Appl. 2014, 41, 2250–2258. [CrossRef]
36. Huda, R.K.; Banka, H. A group evaluation based binary PSO algorithm for feature selection in high dimensional data. Evol. Intell.
2021, 14, 1949–1963. [CrossRef]
37. Mafarja, M.M.; Eleyan, D.; Jaber, I.; Hammouri, A.; Mirjalili, S. Binary dragonfly algorithm for feature selection. In Proceedings
of the 2017 International Conference on New Trends in Computing Sciences (ICTCS), Amman, Jordan, 11–13 October 2017; pp.
12–17.
38. Nakamura, R.Y.M.; Pereira, L.A.M.; Rodrigues, D.; Costa, K.A.P.; Papa, J.P.; Yang, X.S. Binary bat algorithm for feature selection.
In Swarm Intelligence and Bio-Inspired Computation; Elsevier: Amsterdam, The Netherlands, 2013; pp. 225–237.
39. Abualigah, L.M.; Dulaimi, A.J. A novel feature selection method for data mining tasks using hybrid Sine Cosine Algorithm and
Genetic Algorithm. Clust. Comput. 2021, 24, 2161–2176. [CrossRef]
40. Nagpal, S.; Arora, S.; Dey, S.; Shreya. Feature selection using gravitational search algorithm for biomedical data. Procedia Comput.
Sci. 2017, 115, 258–265. [CrossRef]
41. Mirjalili, S.; Mirjalili, S.M.; Yang, X.S. Binary bat algorithm. Neural Comput. Appl. 2014, 25, 663–681. [CrossRef]
42. Talhaoui, M.Z.; Wang, X. A new fractional one dimensional chaotic map and its application in high-speed image encryption. Inf.
Sci. 2021, 550, 13–26. [CrossRef]
43. Emary, E.; Zawbaa, H.M.; Hassanien, A.E. Binary grey wolf optimization approaches for feature selection. Neurocomputing 2016,
172, 371–381. [CrossRef]
44. Mafarja, M.M.; Aljarah, I.; Heidari, A.A.; Hammouri, A.I.; Faris, H.; Al-Zoubi, A.M.; Mirjalili, S. Evolutionary Population
Dynamics and Grasshopper Optimization approaches for feature selection problems. Knowl. Based Syst. 2018, 145, 25–45.
[CrossRef]
45. Tubishat, M.; Ja’afar, S.; Alswaitti, M.; Mirjalili, S.; Idris, N.; Ismail, M.A.; Omar, M.S. Dynamic Salp swarm algorithm for feature
selection. Expert Syst. Appl. 2021, 164, 113873. [CrossRef]