Journal of Ambient Intelligence and Humanized Computing

https://doi.org/10.1007/s12652-018-0924-y

ORIGINAL RESEARCH

A hybrid crow search algorithm based on rough searching scheme for solving engineering optimization problems

Aboul Ella Hassanien1 · Rizk M. Rizk‑Allah2 · Mohamed Elhoseny3

Received: 3 April 2018 / Accepted: 20 June 2018
© Springer-Verlag GmbH Germany, part of Springer Nature 2018

Abstract
In this paper, a hybrid intelligent algorithm, named rough crow search algorithm (RCSA), which combines the crow search algorithm (CSA) with a rough searching scheme (RSS), is presented for solving engineering optimization problems. RCSA integrates the merits of the CSA and RSS to intensify the search in the promising region where the global solution resides. In terms of the robustness and efficiency of the available optimization algorithms, some algorithms may not be in a position to specify the global optimal solution precisely, but can rather specify it in a 'rough sense'. Thus, the main reason for incorporating the RSS is to handle the impreciseness and roughness of the available information about the global optimum, particularly for problems with high dimensionality. Through the upper and lower approximations of rough set theory (RST), the promising region is brought under siege; this accelerates the optimum-seeking operation and achieves the global optimum at a low computational cost. The proposed RCSA algorithm is validated on 30 benchmark problems of IEEE CEC 2005 and IEEE CEC 2010 and on 4 engineering design problems. The results obtained by RCSA are compared with those of different algorithms from the literature. The comparisons demonstrate that RCSA outperforms the other algorithms on almost all benchmark problems in terms of solution quality, based on statistical measures and the Wilcoxon signed-ranks test.

Keywords Crow search algorithm · Rough set theory · Nonlinear programming problems

Aboul Ella Hassanien, Rizk M. Rizk-Allah, Mohamed Elhoseny: Scientific Research Group in Egypt, http://www.egyptscience.net

* Mohamed Elhoseny
  mohamed_elhoseny@mans.edu.eg

1 Faculty of Computers and Information, Cairo University, Cairo, Egypt
2 Faculty of Engineering, Menoufia University, Shibīn al-Kawm, Egypt
3 Faculty of Computers and Information, Mansoura University, Mansoura, Egypt

1 Introduction

Large-scale nonlinear programming arises in a wide variety of scientific and engineering applications, including structural optimization, engineering design, very-large-scale cell layout design, economics, resource allocation and many other applications (Bartholomew-Biggs 2008; Rao 2009). Many of these optimization problems involve attributes such as high dimensionality and multimodality, so solving them is usually a complex task. Moreover, in many instances, complex optimization problems present peaks, channels and/or valleys which make traditional deterministic methods inefficient at finding the global solutions.

The traditional optimization methods (TOMs) (Rao 2009), such as the Newton and steepest-descent methods, always rely on gradient information about the objective function and on the goodness of the initial solution. These methods can perform well for small-scale problems. Challenges appear when coping with complex tasks that are characterized by non-linearity, high dimensionality, multimodality, prohibited regions induced by constraints and large search areas. Handling such complex tasks using TOMs is almost impossible or requires notable computational effort (Xiaohui et al. 2017; Rizk-Allah et al. 2018a, b).

Alternatively, metaheuristic algorithms (MAs) have exhibited promising performance when dealing with complex tasks that are extremely nonlinear, high-dimensional and multimodal (Rizk-Allah 2018; Tharwat et al. 2018; Yang 2008). The MAs have several features, which include the capability of searching within a wide search area for the global
A. E. Hassanien et al.

or near-global solutions; they are established on probabilistic transition rules that preserve the diversity; they have no reliance on the derivatives of the objective function; and they are independent of the problem's nature, thus having the flexibility to be applicable to a great assortment of complex tasks. However, by the NFL (No Free Lunch) theorem (Yang 2008), no metaheuristic algorithm can be suited to deal with all optimization problems, so developing a new algorithm or modifying existing ones is undoubtedly a true challenge.

Crow search algorithm (CSA) is an efficient metaheuristic algorithm that was developed by Askarzadeh (Alireza 2016) for solving global optimization problems. It was inspired by the intelligent behavior of crows in nature. Crows are greedy birds, since they follow each other to obtain better sources of food. They have a very strong memory for storing and retrieving food across seasons, which is superior to that of other birds. The process of finding a food source hidden by a crow is not an easy task. The crow has an intelligent ruse: it tries to cheat another crow by going to another position in the environment if it finds another one following it. The CSA has been applied to some real-world problems such as feature selection (Sayed et al. 2017), fractional optimization (Rizk-Allah et al. 2018a, b), and nonlinear optimization problems (Mohit et al. 2017). However, as a new algorithm, CSA has some disadvantages. The first is that the updating mechanism employs a unidirectional search, which deteriorates the diversity of solutions and can lead to getting stuck in a local solution. The second is that no sieging strategy regarding the promising region is utilized, which may lead to running without improvement in the quality of the solution.

This paper presents a hybrid intelligent algorithm, named rough crow search algorithm (RCSA), for solving the IEEE CEC 2005 and IEEE CEC 2010 benchmark problems and engineering design problems. The proposed methodology operates in two phases: in the first phase, an improved version of the CSA is introduced based on two modifications, namely changing the crow flight length dynamically and searching in the opposite direction. Despite the effective performance of CSA in solving optimization problems, it cannot perform well on all test problems. Thus, the second phase incorporates the RSS, which is inspired by Pawlak's rough set theory, to avoid trapping in the local optimum. Meanwhile, this phase breaks new regions in the search space to improve the exploration search. Furthermore, these regions are shrunk with the iterations to obtain a precise optimal solution. Therefore, embedding the RSS phase in the CSA is a prudent way to prevent premature convergence of the swarm and avoid local solutions.

The rest of the paper is arranged as follows. Section 2 introduces the related work of this study. Section 3 provides the basics of the crow search algorithm and rough set theory. Section 4 presents the proposed rough crow search algorithm in detail. In Sect. 5, the results and discussions are provided, and finally the conclusions and future work are presented in Sect. 6.

2 Related work

This section provides the preliminaries of the nonlinear programming problem (NLPP). Also, the mechanisms of metaheuristic techniques and their challenges are discussed. Eventually, the motivation and main contribution of the proposed work are shown.

2.1 Problem formulation

A nonlinear programming problem (NLPP) is stated as follows (Rao 2009):

MinΩ f(𝐱) = f(x1, x2, …, xn)
Subject to: 𝐱 ∈ Ω,  (1)

Ω = {𝐱 | gj(𝐱) ≤ 0, j = 1, …, q; hj(𝐱) = 0, j = q + 1, …, m; LBi ≤ xi ≤ UBi, i = 1, …, n}  (2)

where f(𝐱) is the objective function, 𝐱 = (x1, x2, …, xn) is a vector of n decision variables from some universe Ω, Ω contains all possible 𝐱 that can be used to satisfy an evaluation of f(𝐱) and its constraints, and LBi and UBi represent the lower and upper bounds of the decision variable xi, respectively. There are q inequality constraints gj(𝐱) and (m − q) equality constraints hj(𝐱).

The method for finding the global optimum of any function (which may not be unique) is referred to as global optimization. In general, the global minimum of a single-objective problem is presented in Definition 1 (Rao 2009):

Definition 1 (The global minimum) Given a function f : Ω ⊆ ℝn → ℝ, Ω ≠ ∅, for 𝐱* ∈ Ω the value f* ≜ f(𝐱*) > −∞ is called a global minimum if and only if:

∀ 𝐱 ∈ Ω : f(𝐱*) ≤ f(𝐱)  (3)

where 𝐱* is by definition the global minimum solution, f(·) is the objective function, ℝn is an n-dimensional real space and the set Ω is the feasible region of 𝐱. The goal of determining the global minimum solution is called the global optimization problem for a single-objective problem.
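The membership test 𝐱 ∈ Ω of Eq. (2) can be sketched as a small predicate. This is an illustration only: the function name is ours, and the tolerance for the equality constraints is our addition, since Eq. (2) states exact equality.

```python
def is_feasible(x, g_list, h_list, lb, ub, tol=1e-6):
    """Membership test for the feasible region Omega of Eq. (2):
    g_j(x) <= 0, h_j(x) = 0 (within tol) and box bounds LB <= x_i <= UB."""
    in_box = all(lb[i] <= x[i] <= ub[i] for i in range(len(x)))
    ineq_ok = all(g(x) <= tol for g in g_list)     # inequality constraints g_j
    eq_ok = all(abs(h(x)) <= tol for h in h_list)  # equality constraints h_j
    return in_box and ineq_ok and eq_ok
```

For instance, with g1(𝐱) = x1 + x2 − 2 and h1(𝐱) = x1 − x2 on the box [0, 5]², the point (1, 1) is feasible while (3, 3) violates the inequality constraint.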

2.2 Literature review

Researchers have relied on metaheuristic algorithms (Yang 2008) because of their superior abilities over the traditional optimization methods, especially for complicated optimization problems. The main reasons behind their superior abilities are the following features: flexibility, simplicity, derivative-free approaches and the ability to avoid local optima. They can be divided into two main categories: evolutionary computational algorithms (ECAs) and swarm optimization algorithms (SOAs). ECAs mimic the biological evolutionary mechanism to solve optimization problems. The most well-known paradigms of evolutionary algorithms include the genetic algorithm (GA) (Rubén et al. 2015) and differential evolution (DE) (Xiang and Wang 2015). SOAs generally imitate the collective behavior of animals such as birds, ants and bees. Particle swarm optimization (PSO) (Chijun et al. 2016), ant colony optimization (ACO) (Mousa et al. 2011) and the firefly algorithm (FA) (Rizk-Allah et al. 2013) are the most famous paradigms of the SOAs. These algorithms outperform the traditional numerical methods in providing better solutions for some difficult and complicated real-world optimization problems (Rizk-Allah et al. 2017a, b, 2018a, b; Elhoseny et al. 2018a; Metawa et al. 2017).

According to the NFL (No Free Lunch) theorem (Yang 2008), no metaheuristic algorithm is appropriately suited to solving all optimization problems: it can achieve very promising results on one set of problems and show poor performance on a different set. Therefore, integrations with some strategies are established to obtain effective performance. In this regard, we integrate rough set theory (RST), which was proposed by Pawlak (1982), with the crow search algorithm (CSA). RST presents an extension of classical set theory and differs from fuzzy set theory in its independence from any prior knowledge. It expresses vagueness not by means of a membership relationship but by employing the boundary region of a set. RST relies on replacing any vague concept, namely a subset of the universe, by two crisp concepts, called the upper approximation and the lower approximation of the vague concept. The upper one represents the maximal crisp set while the lower one represents the minimal crisp set of the vague concept. The boundary region is the difference between the upper and the lower approximations. Naturally, the presence of ill-posed data in real-world problems is inevitable, so applying the RST methodology is a vital step. Since its inception by Pawlak (1982), RST has attracted the attention of scientists and researchers in many fields such as feature selection (Shu and Shen 2014), knowledge discovery (Li et al. 2009) and attribute reduction (Xiuyi et al. 2016), among others (Jie et al. 2017; Rizk-Allah 2016).

2.3 Motivation and contribution of the study

The theoretical research on optimization algorithms in the literature has mainly been concerned with two directions: improving current techniques and hybridizing different algorithms. The goal of these directions is mainly to improve diversity, prevent premature convergence and increase the convergence rate. However, despite the successful testing of these algorithms on some optimization problems and their high convergence speeds, these algorithms suffer from premature convergence and weak diversity, particularly when handling highly nonlinear optimization problems and/or when the optimum solution resides in a tiny subset of the search space. Further, the conflict between precision and computing time means these methods often yield an unsatisfactory solution characterized by lack of precision and slow convergence. These disadvantages are the main motivations of this work.

This paper is motivated by several features that distinguish the proposed methodology from the existing ones in the literature. First, a modified variant of the crow search algorithm (CSA) based on opposite-direction search is introduced to preserve the exploration ability. Second, the integration between the rough searching scheme (RSS), inspired by Pawlak's rough set theory, and the crow search algorithm (CSA) for solving global optimization problems has not been studied yet. Third, the RSS-based evolved and shrunken regions can overcome the drawbacks of many algorithms. Lastly, solving large-scale optimization problems has not received adequate attention yet; hence, solving these problems to optimality undoubtedly remains a true challenge.

The main contributions of this study are as follows:

1. RCSA is proposed for solving large-scale optimization tasks; it integrates the merits of two phases, namely the crow search algorithm (CSA) and the rough searching scheme (RSS).
2. The CSA phase exhibits a dynamic flight length to adjust the tendency of approaching the optimal solution.
3. An opposition-based learning is adopted for updating the solutions to improve their diversity.
4. RSS is proposed to siege the promising regions, which refines the quality of the solution and avoids local solutions.
5. The effectiveness of RCSA is investigated and validated through comprehensive experiments and comparisons on the IEEE CEC 2005 and IEEE CEC 2010 benchmark problems and engineering design problems.

3 Methods and materials

This section describes the basics of the crow search algorithm (CSA) and the preliminaries of rough set theory.

3.1 Crow search algorithm (CSA)

Crow search algorithm (CSA) is a novel metaheuristic algorithm proposed by Askarzadeh (Alireza 2016) for solving optimization problems. It is inspired by the cleverness of crows in finding food sources. The classical CSA consists of three consecutive phases. Firstly, the position of the hiding place of each crow is created randomly and the memory of each crow is initialized with this position as the best experience. Secondly, each crow evaluates the quality of its position according to the objective function. Finally, each crow randomly selects one of the flock crows and follows it to discover the position of the food hidden by this crow. If the found food position is tasty, the crow updates its position; otherwise, the crow stays in its current position and does not move to the generated position. The procedure of the CSA is summarized as follows (Alireza 2016).

Step 1: Initialize a swarm of crows within the n-dimensional search space, where the algorithm assigns a random vector 𝐱i = (xi,1, xi,2, …, xi,n) to the ith crow, i = 1, 2, …, N. Furthermore, each crow of the swarm is characterized by its memory 𝐦i = (mi,1, mi,2, …, mi,n); initially, the memory of each crow is filled with its initial position, since the crows have no experience of the food sources.

Step 2: Each crow is evaluated according to the quality of its position, which is related to the desired objective function.

Step 3: Crows create new positions in the search space as follows: crow i selects one of the flock crows randomly, say crow j, and follows it to discover the position of the food hidden by this crow, where the new position of crow i is generated as follows:

𝐱i,Iter+1 = { 𝐱i,Iter + ri × fli,Iter (𝐦j,Iter − 𝐱i,Iter)   if aj ≥ APj,Iter
           { a random position                           otherwise        (4)

where ri and aj are random numbers with uniform distribution between 0 and 1, APj,Iter denotes the awareness probability of crow j at iteration Iter, fli,Iter denotes the flight length of crow i at iteration Iter, and 𝐦j,Iter denotes the memory of crow j at iteration Iter.

Figure 1 shows the effect of the parameter fl on the search capability: a small value (fl ≤ 1) leads to exploring a new crow position that lies on the dashed line between 𝐦j,Iter and 𝐱i,Iter, as in Fig. 1a, while Fig. 1b shows that, if fl is selected greater than 1, the new crow position lies on the dashed line outside the line segment between 𝐦j,Iter and 𝐱i,Iter.

Fig. 1 Searching mechanism by the crow in the two states: a fl ≤ 1 and b fl > 1

Step 4: After generating the crows' positions, the new positions are evaluated and each crow updates its memory as follows:

𝐦i,Iter+1 = { 𝐱i,Iter+1   if f(𝐱i,Iter+1) ≻ f(𝐦i,Iter)
           { 𝐦i,Iter     otherwise                     (5)

where f(·) denotes the objective function value and ≻ denotes 'better than'.

The natural behavior of crows is characterized by memorizing the positions of hidden food places and retrieving them across seasons. In this regard, it is assumed that each crow memorizes the position of its hidden places in a memory denoted by 𝐦; thus, at iteration Iter, the position of the hidden place of crow j is denoted by 𝐦j,Iter. In the initialization step, the memory 𝐦j,Iter of crow j is initialized with its initial position 𝐱j,Iter; this memory is then updated at each iteration by Eq. (5) to retain the best position of the food source (hidden place). Equation (5) operates by filling the memory of the crow with its new position if it is better than the stored one.

Step 5: End the algorithm if the maximum number of generations is met; the best position in the memory in terms of the objective function value is reported as the solution of the optimization problem. Otherwise, go back to Step 3.
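The five steps above can be sketched in Python as follows. This is a minimal sketch, not the authors' implementation: '≻' in Eq. (5) is read as 'smaller objective value' (minimization), a single scalar box [lb, ub] and global fl and AP values are assumed, and clamping to the bounds is our addition; the helper name csa_step is ours.

```python
import random

def csa_step(x, mem, fl, ap, f, lb, ub):
    """One iteration of the classical CSA update (Eqs. 4 and 5).
    x, mem: lists of N positions/memories, each a list of floats."""
    n_crows, dim = len(x), len(x[0])
    new_x = []
    for i in range(n_crows):
        j = random.randrange(n_crows)      # crow i follows a randomly chosen crow j
        if random.random() >= ap:          # crow j unaware: move toward its memory (Eq. 4)
            r = random.random()
            pos = [x[i][k] + r * fl * (mem[j][k] - x[i][k]) for k in range(dim)]
        else:                              # crow j aware: fly to a random position
            pos = [lb + random.random() * (ub - lb) for _ in range(dim)]
        new_x.append([min(max(p, lb), ub) for p in pos])
    # Memory update (Eq. 5): keep whichever of the new position / old memory is better.
    new_mem = [new_x[i] if f(new_x[i]) < f(mem[i]) else mem[i]
               for i in range(n_crows)]
    return new_x, new_mem
```

Because the memory is only replaced by a better position, the best stored objective value can never worsen across iterations.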

3.2 Rough set theory (RST)

The basic concept of the RST is the indiscernibility relation, which is generated by the information about the objects (Pawlak 1982). Because discerning knowledge is lacking, one cannot identify some objects based on the available information. The indiscernibility relation expresses this fact by considering granules of indiscernible objects as a fundamental basis. Some relevant concepts of the RST are presented in Pawlak (1982) (see Appendix 1).

4 The proposed RCSA algorithm

The metaheuristic algorithms have been devised to overcome the computational drawbacks of existing numerical algorithms, such as complex derivatives, sensitivity to initial values and the large amount of enumeration memory required. Their conventional procedures for finding the optimal solution are iteration-based and depend on randomness to imitate natural phenomena. In the process of searching for the global optimal solution, some metaheuristic algorithms fail to find a precise value for the global optimum but can obtain an approximate, or rough-sense, value. In such situations, it is desirable to provide more exploration in finding the global solution and to prevent premature convergence of the swarm. Toward this objective, we focus in this study on the hybridization between RSS and CSA, called the rough crow search algorithm (RCSA). The proposed RCSA operates in two phases: in the first one, CSA is implemented as a global optimization system to find an approximate solution of the global optimization problem. In the second phase, RSS is introduced to improve the solution quality through the roughness of the optimal solution obtained so far. In this way, the roughness of the obtained optimal solution can be represented as a pair of precise concepts based on the lower and upper approximations, which are used to constitute the interval of the boundary region. After that, new solutions are randomly generated inside this region to enhance the diversity of solutions and achieve an effective exploration that avoids premature convergence of the swarm. We start the explanation of the RCSA as follows.

4.1 Algorithm initialization

Instead of fixing the parameters, fine-tuning the algorithm parameters may have a major influence on faster convergence and the final outcome. As a first improvement, RCSA introduces a new modification on the flight length parameter such that it changes dynamically with the iteration number, instead of taking a fixed value. The second improvement considers searching in the opposite direction. In addition, RSS is introduced to define bounds around the optimal solution obtained so far; new solutions are then generated randomly inside these bounds.

4.2 Rough CSA

Phase 1: CSA
In CSA, a crucial influence on algorithm performance comes from the calculation of the hiding places of a crow. The basic implementation of this metaheuristic technique assumes a fixed value of the flight length, which cannot be changed during iterations. The main drawback of this technique appears in the flight length value that the algorithm needs to cover the overall search space for finding the optimal solution: a small value of the flight length explores solutions inside the line segment, while a large value explores solutions outside it. Thus, a fixed value is often not a good choice, especially when dealing with more complex nonlinear and multimodal problems. In order to accelerate convergence, eliminate the drawbacks caused by fixed values of the flight length, and balance exploration and exploitation, the flight length is changed dynamically with the iteration number, as shown in Fig. 2, using the following equation (Pan et al. 2014):

flIter = flmax ⋅ exp( log(flmin ∕ flmax) ⋅ (Iter ∕ Itermax) )  (6)

where flIter is the flight length at each iteration, flmin is the minimum flight length, flmax is the maximum flight length, Iter is the iteration number and Itermax is the maximum iteration number.

Fig. 2 Representation of fl versus iterations
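The decay schedule of Eq. (6) is straightforward to compute; the following sketch uses the illustrative default values flmax = 2 and flmin = 0.5, which the paper does not fix at this point.

```python
import math

def flight_length(it, it_max, fl_min=0.5, fl_max=2.0):
    """Dynamic flight length of Eq. (6):
    fl_Iter = fl_max * exp(log(fl_min / fl_max) * Iter / Iter_max)."""
    return fl_max * math.exp(math.log(fl_min / fl_max) * it / it_max)
```

The schedule starts at flmax when Iter = 0, decreases monotonically, and reaches flmin exactly at Iter = Itermax, which matches the curve sketched in Fig. 2.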

In addition, we modified Eq. (4) so that it involves new positions inside and outside the segment, which is extended equally on both sides as determined by a specified value of flmax:

𝐱i,Iter+1 = { 𝐱i,Iter + ri × fli,Iter (𝐦j,Iter − 𝐱i,Iter)   if aj ≥ APj,Iter
           { 𝐱i,Iter − ri × fli,Iter (𝐦j,Iter − 𝐱i,Iter)   else if d = i
           { LB + rand × (UB − LB)                        otherwise,   i = 1, 2, …, N  (7)

where d ∈ {1, 2, …, N} is a randomly chosen index.

To further enhance the intensive search, we force the crows to search in the opposite direction (Seif and Ahmadi 2015) in a random manner through the use of the negative sign, which is inspired by the fooling process. Equation (7) presents a new modification of Eq. (4); it is introduced with the aim of generating a new crow position and its opposite, instead of generating a single position as in Eq. (4). In this context, the flight length flIter in Eq. (7) is evolved by Eq. (6) instead of using a fixed value as in the traditional CSA.

Phase 2: Rough searching scheme (RSS)
Since the optimization procedures of the CSA phase yield an approximated optimal solution 𝐱* = (x1*, x2*, …, xn*) that is not specified in precise crisp terms, a rough-searching-scheme-based local search is proposed in this paper to guide CSA toward the global optimal solution, where the approximated optimal solution is converted into a rough number. Afterwards, the rough interval is obtained through the upper and lower approximations for each variable, and new offspring are generated inside the obtained rough interval. The detailed description of this phase is as follows:

Step 1: Information system
This step uses the swarm solutions as a key function for implementing the information system. The information system is denoted as the ordered pair (U, A), where each individual is treated as an object (solution) of a non-empty finite set U. The attribute set A = {d1, d2, …, dn} is a non-empty finite set of attributes (i.e., the dimensions of the candidate problem, where each dimension is represented as a conditional attribute).

Step 2: Rough approximations
The ordered pair S = (U, C) is called an approximation space generated by C on U, where U is a non-empty finite set of solutions per dimension (i.e., solutions obtained from the CSA phase, where each solution is represented as a class) and C is a reflexive relation on U that partitions U into N classes, i.e., U∕C = {{x1}, {x2}, …, {xi}, …, {xN}}, where {xi} is the ith class and all classes are expressed in increasing order such that {x1} ≤ {x2} ≤ ⋯ ≤ {xN}. Additionally, {xi} ≤ {xj} if and only if xi ≤ xj. For each dimension dj, j = 1, 2, …, n, the lower approximation of xi, i = 1, 2, …, N, is denoted by Apr_(xi) and can be defined as follows:

Apr_(xi) = ∪{y ∈ U∕C | y ≤ xi}.  (8)

The upper approximation Apr¯(xi) of xi can be defined as follows:

Apr¯(xi) = ∪{y ∈ U∕C | y ≥ xi}.  (9)

Accordingly, the boundary region of xi is given by

BN(xi) = ∪{y ∈ U∕C | y ≠ xi} = {y ∈ U∕C | y > xi} ∪ {y ∈ U∕C | y < xi}.  (10)

In classical RST, any subset X ⊆ U is described by its lower and upper approximations, i.e., B̲X ⊆ X ⊆ B̄X. As a counterpart, when dealing with crisp values, less than or equal (≤) and greater than or equal (≥) are used instead of the subset (⊆) and superset (⊇) relations, respectively.

For example, if we have U∕C = {{4}, {5}, {7}}, then the lower and upper approximations for each class, using Eqs. (8) and (9), are obtained as follows:

Apr_(4) = {4}        Apr¯(4) = {4, 5, 7}   BN(4) = {5, 7}
Apr_(5) = {4, 5}     Apr¯(5) = {5, 7}      BN(5) = {4, 7}
Apr_(7) = {4, 5, 7}  Apr¯(7) = {7}         BN(7) = {4, 5}

Step 2 (rough approximations) makes use of the upper and lower approximations to describe the proposed process; it is extended from the basics of rough set theory described in Sect. 3.2. The obtained solutions for the ith dimension are represented as degrees under an increasing-order condition (i.e., x1 < x2 < ⋯ < xN); the subset (⊆) and superset (⊇) relations are then analogous to less than or equal (≤) and greater than or equal (≥), respectively. In simple words, for a degree xi in a set of ordered degrees, the lower approximation of xi contains all the solutions (degrees) in the information system that have values equal to or less than xi. The upper approximation of xi contains all the solutions (degrees) in the same information system that have values equal to or greater than xi; and the boundary region of xi contains all the degrees in the information table that have values different from xi.

Step 3: Rough interval
Based on the approximations of a class defined above, the so-called rough number can be defined as follows: the degree xi of the ith dimension can be represented by its rough number RN(xi), composed of the lower bound (xiLB) and the upper bound (xiUB). Mathematically,

xiLB = (1 ∕ NLB) ∑ { y | y ∈ Apr_(xi) }  (11)

xiUB = (1 ∕ NUB) ∑ { y | y ∈ Apr¯(xi) }  (12)

where NLB is the number of objects contained in the lower approximation of xi and NUB is the number of objects contained in the upper approximation of xi.

The interval between the lower bound (xiLB) and the upper bound (xiUB) is known as the rough boundary interval, denoted RBI(xi):

RBI(xi) = xiUB − xiLB.  (13)

Definition 6 Any vague class is characterized by a so-called rough number (RN(xi)) consisting of the lower bound and the upper bound of the said class as follows:

RN(xi) = [xiLB, xiUB].  (14)

Remark According to the above example, Eqs. (11) and (12) give:

x1LB = 4     x1UB = 5.33,  RN(x1) = [4, 5.33]
x2LB = 4.5   x2UB = 6,     RN(x2) = [4.5, 6]
x3LB = 5.33  x3UB = 7,     RN(x3) = [5.33, 7]

By using Eq. (14) we can generate new solutions randomly inside each interval and the unified interval. The unified interval, i.e., the rough solution of the ith dimension, is computed as follows:

xiLB = (xi1LB + xi2LB + ⋯ + xiNLB) ∕ N  (15)

xiUB = (xi1UB + xi2UB + ⋯ + xiNUB) ∕ N.  (16)

Briefly, the lower bound of the ith dimension is the mean value of the degrees contained in its lower approximation, whereas the upper bound of the ith dimension is the mean value of the degrees contained in its upper approximation. The rough boundary interval of the ith dimension is the difference between its upper and lower bounds, which describes the vagueness of the said class. A class with a larger rough boundary interval is said to be more vague, or less precise.

Therefore, the unified rough intervals, or the regions of interest over all dimensions, are represented as follows:

RI(𝐱) = {[x1LB, x1UB], [x2LB, x2UB], …, [xnLB, xnUB]}.  (17)

Definition 7 The optimal solution is a rough number denoted by 𝐱*, whose lower and upper bounds are denoted by 𝐱LB and 𝐱UB, respectively.

Remark 1 If 𝐱UB = 𝐱LB, then the optimal solution 𝐱* is exact (crisp); otherwise 𝐱* is inexact (rough).

Step 4: Generation
In this step, new solutions are generated randomly inside the new intervals, and these solutions are then evolved through the use of the RCSA phase.

Step 5: Evaluation
In this step, the solutions are evaluated to judge whether the best fitness is superior to the previous one; if so, the best fitness value is updated and, at this moment, the best location is updated.

Step 6: Stopping criterion
This phase is terminated when either the maximum number of generations has been produced or the obtained optimal solution is exact (crisp). A schematic diagram of the rough searching scheme phase is demonstrated in Fig. 3.

Fig. 3 Schematic diagram of rough searching scheme phase

In summary, the first phase of the proposed method, CSA, is responsible for delivering a population of solutions to the second phase, RSS, where each dimension is represented by a vector of N degrees. To effectively create reliable regions in the search space, two concepts are introduced, namely rough set approximations and the rough interval, with the aim of achieving a precise optimal solution. The rough set approximations consist of lower and upper approximations, where the lower approximation of a certain degree is defined by all degrees that are less than or equal to this degree, while the upper approximation of a certain degree is defined by all degrees that are greater than or equal to it. Afterwards, the concept of the rough interval is carried out based on these approximations, where the rough interval is represented by the lower bound and the upper bound. The lower (upper) bound for any degree is the mean of all degrees in its lower (upper) approximation. Through these two concepts, the search is concentrated in the reliable region, and therefore this can achieve more accurate solutions as well as save computational time.

Figure 4 shows the general architecture of the proposed RCSA algorithm, where the highlighted boxes represent the modifications introduced on the original algorithm. Figure 4 starts with initial positions associated with initial

A. E. Hassanien et al.

[Figure: flowchart of the proposed RCSA. The CSA phase initializes the algorithm parameters and the position and memory of each crow, applies the new flight length using Eq. (6), generates new positions based on the opposite direction using Eq. (7), regenerates any positions that do not satisfy all constraints, evaluates the new positions and updates the memory if the new fitness is better than the previous one, until the CSA stopping condition is met. The best position x* from the CSA phase then enters the RSS phase, which formulates the information system (U, A) (U: finite set of solutions; A: finite set of dimensions), partitions U using the reflexive relation to obtain U/C, determines the lower and upper approximations of each class using Eqs. (8) and (9), determines the interval of each class, generates and evaluates new positions inside the new intervals, and updates the best position until the RSS stopping condition, and finally the overall stopping condition, is satisfied; the output is the optimal solution.]

Fig. 4  Architecture of the proposed RCSA algorithm
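The interval construction used in the RSS boxes above [Eqs. (15) and (16)] reduces to averaging the degrees of each approximation set. A minimal sketch (not the authors' code; the degree sets below are made-up values):

```python
# Illustrative sketch: the rough bounds of one dimension, per Eqs. (15) and
# (16), are the means of the degrees in its lower and upper approximations.
def rough_interval(lower_approx, upper_approx):
    x_lb = sum(lower_approx) / len(lower_approx)   # Eq. (15)
    x_ub = sum(upper_approx) / len(upper_approx)   # Eq. (16)
    return x_lb, x_ub

lo = [1.0, 2.0, 3.0]            # lower approximation of a dimension (made up)
hi = [1.0, 2.0, 3.0, 6.0]       # upper approximation of the same dimension
x_lb, x_ub = rough_interval(lo, hi)
print(x_lb, x_ub)  # 2.0 3.0
```

New candidate positions are then sampled inside [x_lb, x_ub] for each dimension, which is what concentrates the search in the reliable region.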

A hybrid crow search algorithm based on rough searching scheme for solving engineering…

CSA phase
Input: parameters N, Iter_max, AP, fl_max and fl_min
Initialize the positions of the crows randomly in the search space
Evaluate the initial positions
Initialize the memory of the crows with the initial positions
// Looping
while Iter <= Iter_max do
    fl^Iter = fl_max · exp( log(fl_min / fl_max) · Iter / Iter_max )
    for i = 1 to N
        d = a random integer in the range [1, N]
        x_{i,Iter+1} =
            x_{i,Iter} + r_i · fl_{i,Iter} · (m_{j,Iter} − x_{i,Iter})   if a_j >= AP_{j,Iter}
            x_{i,Iter} − r_i · fl_{i,Iter} · (m_{j,Iter} − x_{i,Iter})   else if d = i, i = 1, 2, …, N
            LB + rand · (UB − LB)                                         otherwise
        if x_{i,Iter+1} > UB, then x_{i,Iter+1} = UB
        if x_{i,Iter+1} < LB, then x_{i,Iter+1} = LB
    end for
    Evaluate the obtained positions of the crows
    Update the memory of the crows
end while
x* = best position from the CSA phase

RSS phase
Formulate the information system
Rank the positions obtained from the CSA phase
Calculate the rough interval of the ith dimension as follows:
    xi^LB = (xi1^LB + ⋯ + xiN^LB)/N  and  xi^UB = (xi1^UB + ⋯ + xiN^UB)/N
Repeat
    Generate new positions randomly:
    for i = 1 to N
        x'_i = x^LB + rand · (x* − x^LB)   if rand >= 0.5
        x'_i = x* + rand · (x^UB − x*)     if rand < 0.5
    end for
    Evaluate the obtained positions
    Update the best position
Until Iter = Iter_max
Output: x*

Fig. 5  The pseudo code of the proposed RCSA algorithm
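Fig. 5 can be transcribed into a short runnable sketch. This is an illustrative Python re-implementation (the paper's code is MATLAB), with two simplifications that are our assumptions, not the paper's: the unaware crow is simply relocated at random, and the RSS bounds are taken as the per-dimension minima and maxima of the memories rather than the approximation means of Eqs. (15)-(16). The sphere function stands in for a benchmark objective.

```python
import math
import random

def rcsa(f, lb, ub, n_crows=20, iters=200, ap=0.1, fl_min=1e-5, seed=1):
    rng = random.Random(seed)
    dim = len(lb)
    fl_max = (ub[0] - lb[0]) / 2.0                       # Table 2 setting
    X = [[rng.uniform(lb[d], ub[d]) for d in range(dim)] for _ in range(n_crows)]
    mem = [x[:] for x in X]                              # crow memories
    for it in range(1, iters + 1):
        fl = fl_max * math.exp(math.log(fl_min / fl_max) * it / iters)  # Eq. (6)
        for i in range(n_crows):
            j = rng.randrange(n_crows)                   # crow being followed
            if rng.random() >= ap:                       # follow crow j's memory
                X[i] = [X[i][d] + rng.random() * fl * (mem[j][d] - X[i][d])
                        for d in range(dim)]
            else:                                        # random relocation
                X[i] = [rng.uniform(lb[d], ub[d]) for d in range(dim)]
            X[i] = [min(max(X[i][d], lb[d]), ub[d]) for d in range(dim)]
            if f(X[i]) < f(mem[i]):                      # memory update
                mem[i] = X[i][:]
    best = min(mem, key=f)                               # CSA-phase result x*
    # RSS phase: greedy sampling between the (simplified) rough bounds and x*
    x_lb = [min(m[d] for m in mem) for d in range(dim)]
    x_ub = [max(m[d] for m in mem) for d in range(dim)]
    for _ in range(iters):
        if rng.random() >= 0.5:
            cand = [x_lb[d] + rng.random() * (best[d] - x_lb[d]) for d in range(dim)]
        else:
            cand = [best[d] + rng.random() * (x_ub[d] - best[d]) for d in range(dim)]
        if f(cand) < f(best):
            best = cand
    return best

sphere = lambda x: sum(v * v for v in x)
best = rcsa(sphere, [-5.0] * 3, [5.0] * 3)
print(sphere(best))   # a small value near zero
```

The exponential schedule makes fl shrink from half the search range to fl_min, so early iterations explore widely while late iterations refine; the RSS loop only ever accepts improvements, so it can never degrade the CSA result.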


memory; then the updating mechanism of these positions and of the memory is performed according to Eqs. (7) and (5), respectively, until the stopping condition of the CSA phase is satisfied. Afterwards, the rough searching scheme (RSS) operates until its own stopping condition is satisfied: the RSS receives the solutions from the CSA phase with the aim of constructing the rough interval by means of the upper and lower approximations of each variable. A new offspring is then generated inside the obtained rough interval, and the best one among the CSA and RSS phases survives. The pseudo code of the proposed RCSA is summarized in Fig. 5.

5 Experiments and results

5.1 Test functions and parameter settings

In this section, the performance of the proposed RCSA algorithm is tested on 25 CEC'2005 benchmark functions (Suganthan et al. 2005; Sedlaczek and Eberhard 2005). The mathematical descriptions of these functions, their types and their numbers of dimensions (Suganthan et al. 2005) are given in Table 1. The robustness and effectiveness of the proposed RCSA are validated by comparing it with prominent algorithms from the literature. The algorithm is coded in MATLAB 7, running on a computer with an Intel Core i5 (1.8 GHz) processor and 4 GB of RAM.

Additionally, the parameter configurations of the RCSA and CSA algorithms, including the population size, the number of iterations and the awareness probability, are based on the suggestions in the corresponding literature (Alireza 2016), while the flight length is introduced in a new manner so that it changes dynamically [i.e., see Eq. (6)], where the maximum flight length is considered as the maximum search radius and the minimum flight length as the minimum radius (i.e., see Table 2).

Table 2  Parameter settings of RCSA

Population size                  50
Number of iterations             500
Awareness probability (AP)       0.1
Maximum flight length (fl_max)   (UB − LB)/2
Minimum flight length (fl_min)   10^−5
Flight length (fl)               changes dynamically

5.2 Performance analysis using the statistical measures

To completely evaluate the performance of the proposed RCSA algorithm, the comparison between the global solution and the solution obtained by the RCSA for each test function is reported in Table 3, where in each tested case the solution is shown before and after incorporating the RSS phase. From Table 3, we can note that the solutions obtained after incorporating the RSS phase are more accurate and converge better to the optimal values than those obtained without it. As indicated in Table 3, the proposed RCSA algorithm gives the exact optimum results for test functions 1–12 both with and without the RSS phase, while it performs better than the algorithm without the RSS phase on test functions 13–25.

Beside the comparison with the global optimal solution, we additionally use statistical measures (see Table 4): the best, mean, median and worst objective values, as well as their standard deviations and the average time, obtained over 50 independent runs for each test problem.
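The entries of Table 4 are standard summary statistics over independent runs; since all problems are minimizations, the "best" is the smallest objective value. A small illustration with made-up run values (not taken from the paper):

```python
import statistics

# How the per-function rows of Table 4 can be computed from repeated runs.
runs = [-449.99, -450.00, -449.95, -450.00, -449.98]   # objective per run (made up)

summary = {
    "best":   min(runs),                 # minimization: best = smallest value
    "mean":   statistics.mean(runs),
    "median": statistics.median(runs),
    "worst":  max(runs),
    "sd":     statistics.stdev(runs),    # sample standard deviation
}
print(summary)
```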
Table 1  Benchmark functions

ID   Function name                                                D    C    ID        Function name                                          D    C
F1   Shifted Sphere Function                                      10   U    F9        Shifted Rastrigin's Function                           10   M
F2   Shifted Schwefel's Problem 1.2                               10   U    F10       Shifted Rotated Rastrigin's Function                   10   M
F3   Shifted Rotated High Conditioned Elliptic Function           10   U    F11       Shifted Rotated Weierstrass Function                   10   M
F4   Shifted Schwefel's Problem 1.2 with Noise in Fitness         10   U    F12       Schwefel's Problem 2.13                                10   M
F5   Schwefel's Problem 2.6 with Global Optimum on Bounds         10   U    F13       Expanded Extended Griewank's plus Rosenbrock's (F8F2)  10   M
F6   Shifted Rosenbrock's Function                                10   M    F14       Shifted Rotated Expanded Scaffers F6                   10   M
F7   Shifted Rotated Griewank Function without Bounds             10   M    F15:F25   Hybrid functions, each composed from the previous functions (different in each case)  10   M
F8   Shifted Rotated Ackley's Function with Global Optimum on Bounds  10   M

C characteristic, U unimodal, M multimodal, D dimension n


Table 3  Comparison between the global solution and the RCSA solution without and with the RSS phase

Function  Global solution  RCSA without RSS  RCSA with RSS    Function  Global solution  RCSA without RSS  RCSA with RSS
F1        −450.0000        −449.5842         −450.0000        F14       −300.0000        −299.7108         −299.9610
F2        −450.0000        −450.0000         −450.0000        F15       120.0000         120.0000          120.0000
F3        −450.0000        −450.0000         −450.0000        F16       120.0000         122.9352          122.9287
F4        −450.0000        −450.0000         −450.0000        F17       120.0000         128.0572          121.0399
F5        −310.0000        −309.9890         −310.0000        F18       10.0000          300.0409          300.0342
F6        390.0000         390.0000          390.0000         F19       10.0000          300.1341          300.1185
F7        −180.0000        −179.9994         −180.0000        F20       10.0000          300.3235          300.3200
F8        −140.0000        −139.99952        −140.0000        F21       360.0000         500.0374          500.0278
F9        −330.0000        −330.0000         −330.0000        F22       360.0000         700.8376          532.1342
F10       −330.0000        −330.0000         −330.0000        F23       360.0000         463.0758          463.0758
F11       90.0000          90.0000           90.0000          F24       260.0000         293.7459          265.4191
F12       −460.0000        −460.0000         −460.0000        F25       260.0000         394.8046          375.7108
F13       −130.0000        −129.6139         −129.9901

Table 4  Statistical results of RCSA for the overall test functions

Function  Best        Mean        Median      Worst       SD           Ave. time (s)
F1        −450.0000   −449.9974   −450.0000   −449.9574   8.1340E−3    0.6928095
F2        −450.0000   −450.0000   −450.0000   −449.99912  1.69680E−4   0.8602075
F3        −450.0000   −449.8758   −449.9964   −448.2176   3.6265E−1    1.7043
F4        −450.0000   −450.0000   −450.0000   −450.0000   1.5951E−5    0.9525
F5        −310.0000   −309.9935   −309.8324   −306.2820   8.2210E−1    1.7133
F6        390.0000    −399.4495   −389.1946   −384.1468   1.4290       0.9599
F7        −180.0000   −180.0000   −180.0000   −179.9279   1.4658E−2    0.8569
F8        −140.0000   −140.0000   −140.0000   −139.9978   4.2917E−4    0.5137
F9        −330.0000   −330.0000   −330.0000   −330.0000   0.0000       0.5671
F10       −330.0000   −330.0000   −330.0000   −330.0000   0.0000       0.8693
F11       90.0000     90.0000     90.0000     90.0000     0.0000       0.5613
F12       −460.0000   −460.0000   −460.0000   −460.0000   0.0000       0.6504
F13       −129.9901   −129.7794   −129.7607   −129.6294   8.9215E−2    0.7215
F14       −299.9610   −299.9289   −299.9461   −299.8408   3.9559E−2    1.2306
F15       120.0000    120.0000    120.0000    120.0000    0.0000       0.9582
F16       122.9287    122.9287    122.9287    122.9287    2.9032E−14   0.9965
F17       121.0399    121.1665    121.0399    128.1268    9.4701E−1    1.1208
F18       300.0342    300.0423    300.0423    300.1388    1.9347E−2    0.8273
F19       300.1185    300.1185    300.1185    300.1185    0.0000       0.6134
F20       300.3200    300.3209    300.32008   300.3235    1.7073E−3    1.2885
F21       500.0278    5017.2094   500.0707    5039.4986   2.0850       1.8769
F22       532.1342    580.0254    532.1342    651.8622    60.7131      1.8652
F23       463.0758    463.0758    463.0758    463.0758    6.9618E−014  0.6875
F24       265.4191    269.8181    265.4191    298.5889    10.5783      0.9856
F25       375.7108    383.7492    384.0652    391.2666    1.9513       1.5698


Table 5  Comparison of RCSA with other recent algorithms (best results are given in bold; each cell shows the average error with its rank in parentheses)

Function  PSO            IPOP-CMA-ES    CHC            SSGA           SS-BLX         SS-Arit        DE-Bin         DE-Exp         SaDE           Proposed RCSA
F1   1.234E−4 (6)   0.000 (1)      2.464 (8)      8.420E−9 (5)   3.402E+1 (9)   1.064 (7)      7.716E−9 (2)   8.260E−9 (3)   8.416E−9 (4)   0.000 (1)
F2   2.595E−2 (7)   0.000 (1)      1.180E−2 (6)   8.719E−5 (5)   1.730 (8)      5.282 (9)      8.342E−9 (4)   8.181E−9 (2)   8.208E−9 (3)   0.000 (1)
F3   5.174E+4 (5)   0.000 (1)      2.699E+5 (9)   7.948E+4 (6)   1.844E+5 (7)   2.535E+5 (8)   4.233E+1 (2)   9.935E+1 (3)   6.560E+3 (4)   0.000 (1)
F4   2.488 (6)      2.932E+3 (10)  9.190E+1 (9)   2.585E−3 (5)   6.228 (8)      5.755 (7)      7.686E−9 (2)   8.350E−9 (4)   8.087E−9 (3)   0.000 (1)
F5   4.095E+2 (10)  8.104E−10 (2)  2.641E+2 (9)   1.343E+2 (8)   2.185 (7)      1.443E+1 (6)   8.608E−9 (4)   8.514E−9 (3)   8.640E−9 (5)   0.000 (1)
F6   7.310E+2 (8)   0.000 (1)      1.416E+6 (9)   6.171 (5)      1.145E+2 (6)   4.945E+2 (7)   7.956E−9 (2)   8.391E−9 (3)   1.612E−2 (4)   0.000 (1)
F7   2.678E+1 (2)   1.267E+3 (6)   1.269E+3 (7)   1.271E+3 (8)   1.966E+3 (10)  1.908E+3 (9)   1.266E+3 (5)   1.265E+3 (4)   1.263E+3 (3)   0.000 (1)
F8   2.043E+1 (10)  2.001E+1 (2)   2.034E+1 (5)   2.037E+1 (8)   2.035E+1 (6)   2.036E+1 (7)   2.033E+1 (4)   2.038E+1 (9)   2.032E+1 (3)   0.000 (1)
F9   1.438E+1 (9)   2.841E+1 (10)  5.886 (7)      7.286E−9 (2)   4.195 (5)      5.960 (8)      4.546 (6)      8.151E−9 (3)   8.330E−9 (4)   0.000 (1)
F10  1.404E+1 (6)   2.327E+1 (10)  7.123 (2)      1.712E+1 (8)   1.239E+1 (5)   2.179E+1 (9)   1.228E+1 (4)   1.118E+1 (3)   1.548E+1 (7)   0.000 (1)
F11  5.590 (9)      1.343 (2)      1.599 (3)      3.255 (8)      2.929 (7)      2.858 (6)      2.434 (5)      2.067 (4)      6.796 (10)     0.000 (1)
F12  6.362E+2 (9)   2.127E+2 (6)   7.062E+2 (10)  2.794E+2 (8)   1.506E+2 (5)   2.411E+2 (7)   1.061E+2 (4)   6.309E+1 (3)   5.634E+1 (2)   0.000 (1)
F13  1.503 (3)      1.134 (2)      8.297E+1 (10)  6.713E+1 (8)   3.245E+1 (5)   5.479E+1 (6)   1.573 (4)      6.403E+1 (7)   7.070E+1 (9)   0.0099 (1)
F14  3.304 (8)      3.775 (10)     2.073 (2)      2.264 (3)      2.796 (4)      2.970 (5)      3.073 (6)      3.158 (7)      3.415 (9)      0.0390 (1)
F15  3.398E+2 (9)   1.934E+2 (5)   2.751E+2 (6)   2.920E+2 (7)   1.136E+2 (3)   1.288E+2 (4)   3.722E+2 (10)  2.940E+2 (8)   8.423E+1 (2)   0.000 (1)
F16  1.333E+2 (10)  1.170E+2 (8)   9.729E+1 (2)   1.053E+2 (4)   1.041E+2 (3)   1.134E+2 (7)   1.117E+2 (5)   1.125E+2 (6)   1.227E+2 (9)   2.9287 (1)
F17  1.497E+2 (9)   3.389E+2 (10)  1.045E+2 (2)   1.185E+2 (4)   1.183E+2 (3)   1.279E+2 (5)   1.421E+2 (8)   1.312E+2 (6)   1.387E+2 (7)   1.0399 (1)
F18  8.512E+2 (9)   5.570E+2 (5)   8.799E+2 (10)  8.063E+2 (8)   7.668E+2 (7)   6.578E+2 (6)   5.097E+2 (3)   4.482E+2 (2)   5.320E+2 (4)   2.9003E+2 (1)
F19  8.497E+2 (8)   5.292E+2 (5)   8.798E+2 (9)   8.899E+2 (10)  7.555E+2 (7)   7.010E+2 (6)   5.012E+2 (3)   4.341E+2 (2)   5.195E+2 (4)   2.9011E+2 (1)
F20  8.509E+2 (8)   5.264E+2 (5)   8.960E+2 (10)  8.893E+2 (9)   7.463E+2 (7)   6.411E+2 (6)   4.928E+2 (4)   4.188E+2 (2)   4.767E+2 (3)   2.9032E+2 (1)
F21  9.138E+2 (10)  4.420E+2 (2)   8.158E+2 (8)   8.522E+2 (9)   4.851E+2 (3)   5.005E+2 (4)   5.240E+2 (6)   5.420E+2 (7)   5.140E+2 (5)   1.4002E+2 (1)
F22  8.071E+2 (10)  7.647E+2 (5)   7.742E+2 (9)   7.519E+2 (4)   6.828E+2 (2)   6.941E+2 (3)   7.715E+2 (7)   7.720E+2 (8)   7.655E+2 (6)   1.7213E+2 (1)
F23  1.028E+3 (9)   8.539E+2 (7)   1.075E+3 (10)  1.004E+3 (8)   5.740E+2 (2)   5.828E+2 (4)   6.337E+2 (5)   5.824E+2 (3)   6.509E+2 (6)   1.0307E+2 (1)
F24  4.120E+2 (9)   6.101E+2 (10)  2.959E+2 (8)   2.360E+2 (6)   2.513E+2 (7)   2.011E+2 (3)   2.060E+2 (5)   2.020E+2 (4)   2.000E+2 (2)   0.5419E+1 (1)


Table 5  (continued)

Function  PSO            IPOP-CMA-ES    CHC            SSGA           SS-BLX         SS-Arit        DE-Bin         DE-Exp         SaDE           Proposed RCSA
F25  5.099E+2 (2)   1.818E+3 (10)  1.764E+3 (7)   1.747E+3 (6)   1.794E+3 (8)   1.804E+3 (9)   1.744E+3 (5)   1.742E+3 (4)   1.738E+3 (3)   1.1571E+2 (1)

Table 6  Saving of fitness for the CEC 2005 test problems

Function  Saving rate %   Function  Saving rate %
F1        0.092485        F14       0.08348
F2        0               F15       0
F3        0               F16       0.005287
F4        0               F17       5.479817
F5        0.003549        F18       0.002233
F6        0               F19       0.005198
F7        0.000333        F20       0.001165
F8        0.000343        F21       0.00192
F9        0               F22       24.07168
F10       0               F23       0
F11       0               F24       9.643301
F12       0               F25       4.836266
F13       0.290247

Table 7  Wilcoxon test for comparison results in Table 5

Algorithm 1  Algorithm 2    R−    R+   ρ-value    Best method
RCSA         PSO            276   0    0.000027   RCSA
RCSA         IPOP-CMA-ES    210   0    0.000089   RCSA
RCSA         CHC            300   0    0.000018   RCSA
RCSA         SSGA           231   0    0.000060   RCSA
RCSA         SS-BLX         325   0    0.000012   RCSA
RCSA         SS-Arit        325   0    0.000012   RCSA
RCSA         DE-Bin         210   0    0.000089   RCSA
RCSA         DE-Exp         190   0    0.000132   RCSA
RCSA         SaDE           190   0    0.000132   RCSA

6 Results and comparisons

In addition to the above measures and results, the proposed RCSA algorithm is compared with recently developed state-of-the-art methods such as PSO (Kennedy and Eberhart 1995), IPOP-CMA-ES (Auger and Hansen 2005), CHC (Eshelman 1991; Eshelman and Schaffer 1993), SSGA (Fernandes and Rosa 2001; Mülenbein and Schlierkamp-Voosen 1993), SS-BLX (Herrera et al. 2006), SS-Arit (Laguna and Marti 2003), DE-Bin (Price et al. 2005) and SaDE (Qin and Suganthan 2005), as in Table 5. The comparisons indicate that the RCSA outperforms all the other algorithms in terms of the average error except for F24.

Furthermore, the rank of the average error of the different algorithms on each test function is reported, where the best value for a test function takes rank 1, the worst value takes rank 10, and the other values are ranked between 1 and 10. As indicated in Table 5, the proposed RCSA algorithm surpasses all the other algorithms on average.

Beside the statistical measures used for validation, such as the comparison with other recent algorithms in terms of the average error and of the best, mean and median results, we additionally compute the saving of fitness, S_fitness, obtained by incorporating the RSS phase:

S_fitness = ((F^O − F^O_RSS) / F^O) × 100   (18)

where F^O_RSS and F^O are the optimal objective values with and without the RSS phase, respectively.

Table 6 demonstrates that there is a significant saving for the functions F17, F22, F24 and F25 and a slight saving for the functions F1, F5, F7, F8, F13, F14, F16 and F18−F21. We therefore conclude that incorporating the RSS phase improves the performance of the proposed RCSA algorithm, achieving a reduction in the optimal objective value of 44.5173% compared with the proposed approach without the RSS phase.

In this subsection, a comparative study has been carried out to evaluate the performance of the proposed RCSA algorithm concerning hybridization, closeness to the optimal solution and computational time. On the one hand, pure algorithms can fail to reach an optimal solution in a reasonable time, and some of them sink into premature convergence. Consequently, our hybrid algorithm has a twofold feature: avoiding premature convergence and enclosing the optimum solution through the RSS phase by means of the lower and upper approximations. On the other hand, the proposed RCSA algorithm is highly competitive with the other methods in terms of the statistical measures. So the use of the hybrid approach has great potential for solving global optimization problems.

6.1 Performance assessment

This section is devoted to assessing the performance of the proposed algorithm using the Wilcoxon signed ranks test. The Wilcoxon signed ranks test is a nonparametric procedure used in hypothesis testing situations involving a design with two samples (Joaquín et al. 2001). It is a pair-wise test that aims to detect significant differences between the behaviors of two methods. It is associated with the ρ-value, where ρ is the probability of the null hypothesis being true; ρ < 0.05 indicates a rejection of the null hypothesis, while ρ > 0.05 indicates a failure to reject it. R+ is the sum of positive ranks, while R− is the sum of negative ranks. In Table 7, we present the results of the Wilcoxon signed-rank test for RCSA compared against PSO, IPOP-CMA-ES, CHC, SSGA, SS-BLX, SS-Arit, DE-Bin, DE-Exp and SaDE. We can conclude from Table 7 that the proposed RCSA is significantly better than the other algorithms.

6.2 Large-scale test functions

To assess the performance of the proposed algorithm further, we apply it to large-scale test functions of 1000 dimensions. The large-scale test functions were proposed in the IEEE CEC 2010 (Tang et al. 2009) and also used in IEEE CEC 2012. Due to space limitations, the proposed algorithm is tested on five test functions of the IEEE CEC 2010.

On the other hand, the proposed algorithm is compared with seven different algorithms, whose results are taken from Gaoji et al. (2016). The selected algorithms are: the joint operations algorithm (JOA) (Gaoji et al. 2016), free search (FS) (Penev 2014), the social-based algorithm (SBA) (Ramezani and Lotfi 2013), the particle swarm optimizer with a diversity enhancing mechanism and neighborhood search strategies (DNSPSO) (Hui et al. 2013), the dynamic multi-swarm particle swarm optimizer with a cooperative learning strategy (D-PSO-C) (Xu et al. 2015), dynamic group-based differential evolution (GDE) (Han et al. 2013) and sinusoidal differential evolution (SinDE) (Draa et al. 2015).

The results for the large-scale test functions are reported in Table 8, where the best, worst, mean and standard deviation (Std.) over 20 runs are given. From Table 8, it can be noted that RCSA outperforms the other algorithms in view of the statistical measures.

6.3 Engineering design problems

This section is devoted to validating the proposed RCSA for solving engineering design problems. Since these design problems involve different constraints, the penalty function method is employed (Coello Coello 2002). By employing the penalty method, the constrained optimization problem is converted to an unconstrained one, and then the proposed RCSA algorithm can be implemented.
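The R+ and R− quantities of Table 7 are the rank sums of the positive and negative paired differences. A minimal sketch of that bookkeeping (not a full hypothesis test; the error values below are hypothetical):

```python
def signed_rank_sums(errors_a, errors_b):
    """R+ and R- of the paired differences d = a - b (zero differences are
    dropped, and tied |d| share the average rank, as in the standard test)."""
    diffs = [a - b for a, b in zip(errors_a, errors_b) if a != b]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        for k in range(i, j + 1):          # average rank over a tie group
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    r_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    r_minus = sum(r for r, d in zip(ranks, diffs) if d < 0)
    return r_plus, r_minus

# Hypothetical RCSA errors vs. a competitor's errors on four functions:
r_plus, r_minus = signed_rank_sums([0.0, 0.0, 0.1, 0.2], [1.0, 2.0, 0.05, 0.5])
print(r_plus, r_minus)  # 1.0 9.0
```

With this d = a − b convention, a method whose errors are almost always smaller accumulates a large R− and an R+ near zero, which is exactly the pattern of the RCSA rows in Table 7.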

Table 8  Comparison among different algorithms on CEC 2010 functions

Fun.  Metric  FS       SBA      DNSPSO   D-PSO-C  GDE      SinDE    JOA       RCSA
f1    Best    1.04E9   2.13E6   7.24E5   2.19E5   5.71E5   6.88E−7  1.54E−21  4.0389E−7
      Worst   1.58E9   2.80E6   9.37E6   1.02E6   1.86E7   3.16E−1  4.87E−17  3.1648E−5
      Mean    1.28E9   2.43E6   3.18E6   6.12E5   5.07E6   2.18E−2  3.45E−19  1.0374E−5
      Std.    8.44E7   1.92E5   2.69E6   1.88E5   5.18E6   8.14E−2  1.26E−18  1.2513E−5
f2    Best    9.02E3   7.92E3   6.01E3   1.26E3   6.45E3   1.20E3   7.51E2    4.7726E−4
      Worst   1.38E4   8.99E3   6.85E3   2.14E3   7.20E3   1.41E3   9.63E2    1.6704
      Mean    1.01E4   8.26E3   6.46E3   1.68E3   6.93E3   1.31E3   8.42E2    0.3580
      Std.    3.42E2   1.86E2   1.67E2   2.40E2   2.32E2   7.16E1   6.62E1    7.345E1
f3    Best    2.06E1   1.90E1   1.83E1   1.14E1   1.93E1   2.25E0   2.16E0    7.5E−3
      Worst   2.11E1   1.96E1   1.95E1   1.59E1   2.02E1   2.76E0   2.72E0    3.42E−2
      Mean    2.09E1   1.95E1   1.93E1   1.33E1   1.96E1   2.48E0   2.47E0    1.85E−2
      Std.    1.94E−2  4.36E−2  1.30E−1  1.30E0   2.52E−1  3.43E−1  3.16E−1   1.19E−2
f4    Best    1.48E12  2.08E11  9.76E11  3.72E12  7.60E11  1.18E12  1.41E11   8.3322E10
      Worst   2.69E12  6.10E11  8.99E12  1.96E13  1.82E12  3.25E12  3.23E11   5.6982E11
      Mean    2.04E12  3.71E11  2.46E12  8.16E12  1.34E12  1.71E12  2.45E11   3.1231E11
      Std.    2.85E11  1.12E11  2.02E12  5.19E12  3.32E11  6.29E11  7.58E10   1.4524E11
f5    Best    9.15E7   2.74E8   1.49E8   5.21E7   7.36E7   4.08E7   3.88E7    1.2018E3
      Worst   1.46E8   4.03E8   3.99E8   3.51E8   2.02E8   6.67E7   6.87E7    6.3036E3
      Mean    1.11E8   3.32E8   2.63E8   2.95E8   1.26E8   5.44E7   4.85E7    4.5977E3
      Std.    1.42E7   4.43E7   5.82E7   9.70E7   4.03E7   7.43E6   1.17E7    2.9409E3


6.3.1 Himmelblau's design problem

The Himmelblau design problem was originally proposed by Himmelblau (1972) and has been considered a benchmark non-linear constrained optimization problem. Many authors have also tested another variation of this problem (named version II) (Omran and Salman 2009), where the parameter 0.0006262 is replaced by 0.00026 (typeset bold in the constraint g1). These problems can be formally defined as follows:

Version I:

Min F(x) = 5.3578547 x3² + 0.8356891 x1 x5 + 37.293239 x1 − 40792.141
subject to:
g1(x) = 85.334407 + 0.0056858 x2 x5 + 0.0006262 x1 x4 − 0.002205 x3 x5
g2(x) = 80.51249 + 0.0071317 x2 x5 + 0.0029955 x1 x2 + 0.0021813 x3²     (19)
g3(x) = 9.300961 + 0.0047026 x3 x5 + 0.0012547 x1 x3 + 0.00190853 x3 x4
0 ≤ g1(x) ≤ 92, 90 ≤ g2(x) ≤ 110, 20 ≤ g3(x) ≤ 25,
78 ≤ x1 ≤ 102, 33 ≤ x2 ≤ 45, 27 ≤ xi ≤ 45, i = 3, 4, 5.

Version II:

Min F(x) = 5.3578547 x3² + 0.8356891 x1 x5 + 37.293239 x1 − 40792.141
subject to:
g1(x) = 85.334407 + 0.0056858 x2 x5 + 0.00026 x1 x4 − 0.002205 x3 x5
g2(x) = 80.51249 + 0.0071317 x2 x5 + 0.0029955 x1 x2 + 0.0021813 x3²     (20)
g3(x) = 9.300961 + 0.0047026 x3 x5 + 0.0012547 x1 x3 + 0.00190853 x3 x4
0 ≤ g1(x) ≤ 92, 90 ≤ g2(x) ≤ 110, 20 ≤ g3(x) ≤ 25,
78 ≤ x1 ≤ 102, 33 ≤ x2 ≤ 45, 27 ≤ xi ≤ 45, i = 3, 4, 5.

The proposed RCSA algorithm has been tested on both versions of this problem by finding the best, median, mean and worst values. Furthermore, the proposed RCSA algorithm is compared with the prominent algorithms reported in Deb (2000), He et al. (2004), Lee and Geem (2005), Dimopoulos (2007), Gandomi et al. (2013) and Mehta and Dasgupta (2012) for the first version, and in Omran and Salman (2009), Coello (2000), Fesanghary et al. (2008) and Hu et al. (2003) for the second version, as in Table 9. Table 9 presents the statistical results for the two versions in terms of the values of best, median,

Table 9  Statistical results for the Himmelblau's problem

Version  Method                      Best             Median           Mean             Worst            Std.
I        Deb (2000)                  −30,665.537      −30,665.535      NA               −29,846.654      NA
         He et al. (2004)            −30,665.539      NA               −30,643.989      NA               70.043
         Lee and Geem (2005)         −30,665.500      NA               NA               NA               NA
         Dimopoulos (2007)           −30,665.54       NA               NA               NA               NA
         Gandomi et al. (2013)       −30,665.2327     NA               NA               NA               11.6231
         Mehta and Dasgupta (2012)   −30,665.538741   NA               NA               NA               NA
         Proposed RCSA               −30,665.545314   −30,665.545314   −30,665.544377   −30,665.542970   0.001283
II       Omran and Salman (2009)     −31,025.55626    NA               −31,025.556264   NA               NA
         Coello (2000)               −31,020.859      −31,017.21369    −30,984.240703   −30,792.407737   73.633536
         Fesanghary et al. (2008)    −31,024.3166     NA               NA               NA               NA
         Hu et al. (2003)            −31,025.56142    NA               −31,025.561420   NA               0
         Proposed RCSA               −31,025.568575   −31,025.568512   −31,025.559812   −31,025.507752   0.018780

NA not available
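Eqs. (19)/(20) and the penalty conversion mentioned in Sect. 6.3 can be sketched as follows; the quadratic penalty form and the weight `mu` are illustrative assumptions, not taken from the paper:

```python
def himmelblau(x, c1=0.0006262):          # c1 = 0.00026 gives version II
    x1, x2, x3, x4, x5 = x
    f = 5.3578547 * x3**2 + 0.8356891 * x1 * x5 + 37.293239 * x1 - 40792.141
    g1 = 85.334407 + 0.0056858 * x2 * x5 + c1 * x1 * x4 - 0.002205 * x3 * x5
    g2 = 80.51249 + 0.0071317 * x2 * x5 + 0.0029955 * x1 * x2 + 0.0021813 * x3**2
    g3 = 9.300961 + 0.0047026 * x3 * x5 + 0.0012547 * x1 * x3 + 0.00190853 * x3 * x4
    return f, (g1, g2, g3)

def penalized(x, mu=1e6):
    """Unconstrained surrogate: add a quadratic penalty for range violations."""
    f, (g1, g2, g3) = himmelblau(x)
    viol = (max(0.0, g1 - 92) + max(0.0, 0.0 - g1)      # 0  <= g1 <= 92
            + max(0.0, g2 - 110) + max(0.0, 90 - g2)    # 90 <= g2 <= 110
            + max(0.0, g3 - 25) + max(0.0, 20 - g3))    # 20 <= g3 <= 25
    return f + mu * viol**2

# Reported version-I optimum (Table 9):
x_star = [78.000122, 33.000129, 29.995023, 44.999358, 36.776087]
f_star, g_star = himmelblau(x_star)
print(f_star)   # close to the reported -30665.545
```

A point that violates a constraint range receives a large positive penalty, so a minimizer is pushed back toward the feasible region.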


[Figure: two convergence curves of the objective value versus iteration, (a) version I and (b) version II; the objective axes are scaled by 10^4.]
Fig. 6  Convergence curves for the Himmelblau's design problem

Fig. 7  Ranking of the best solutions for the Himmelblau's design problem

mean, worst and the standard deviation (Std.) obtained over 30 runs, where the optimal solution for version I is x = [78.000122, 33.000129, 29.995023, 44.999358, 36.776087] with an objective function value of −30,665.545314. On the other hand, the optimal solution for version II is x = [78.000094, 33.000075, 27.070838, 44.999957, 44.969327] with an objective function value of −31,025.568575. Based on the comparisons depicted in Table 9, we can see that the proposed RCSA algorithm outperforms the other methods, giving better solutions for the two versions than the other algorithms.

Figure 6 illustrates the convergence curves of the best objective value obtained by the proposed RCSA algorithm for the two versions of the Himmelblau design problem. It can be seen that the curves converge rapidly to

Fig. 8  Architecture of three-bar truss design problem


Table 10  Comparison between the proposed RCSA and different algorithms for the three-bar truss design problem

Algorithm             Best           Mean           Median         Worst          SD
Proposed RCSA         263.895843376  263.895843377  263.895843378  263.895843378  8.0468107E−010
Hui et al. (2010)     263.89584338   263.89584338   NA             263.89584338   4.5E−10
Ray and Liew (2003)   263.89584654   263.90335672   NA             263.96975638   1.3E−02

NA not available

the optimal solution in less than 300 iterations for version I and less than 200 iterations for version II.

Figure 7 provides the ranks of the different algorithms, where the best value takes order one, the second best takes order two, and so on. As shown in Fig. 7, the proposed RCSA algorithm obtains a better rank than all the other algorithms.

6.3.2 Three-bar truss design problem

The three-bar truss design problem is to minimize the volume of a statically loaded three-bar truss as the objective function, subject to stress (σ) constraints on each of the truss members, by adjusting the cross-sectional areas (x1 and x2). The schematic of the three-bar truss design problem is depicted in Fig. 8. This optimization problem is defined as follows:

Min F(x) = (2√2 x1 + x2) × l
subject to:
g1(x) = ((√2 x1 + x2)/(√2 x1² + 2 x1 x2)) P − σ ≤ 0
g2(x) = (x2/(√2 x1² + 2 x1 x2)) P − σ ≤ 0                                 (21)
g3(x) = (1/(x1 + √2 x2)) P − σ ≤ 0
0 ≤ x1, x2 ≤ 1,  l = 100 cm,  P = 2 kN/cm²,  σ = 2 kN/cm².
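The reported optimum can be checked directly against Eq. (21); the sketch below (an illustrative verification, not the authors' code) reproduces the objective and constraint values quoted in the text:

```python
import math

def truss(x, l=100.0, P=2.0, sigma=2.0):
    """Objective and constraints of the three-bar truss problem, Eq. (21)."""
    x1, x2 = x
    f = (2 * math.sqrt(2) * x1 + x2) * l
    denom = math.sqrt(2) * x1**2 + 2 * x1 * x2
    g1 = (math.sqrt(2) * x1 + x2) / denom * P - sigma
    g2 = x2 / denom * P - sigma
    g3 = 1.0 / (x1 + math.sqrt(2) * x2) * P - sigma
    return f, (g1, g2, g3)

f, g = truss([0.7886751333, 0.4082482940])
print(f)   # about 263.8958, matching Table 10
print(g)   # about (0, -1.4641016, -0.5358984), matching the text
```

Note that g1 is active (equal to zero) at the optimum, which is why the stress constraint on the first member drives the design.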
The results of the proposed algorithm for the three-bar truss design problem are obtained as follows. The proposed RCSA yields the optimal solution and constraint values x = [0.7886751333, 0.4082482940] and g(x) = [0, −1.4641016110, −0.5358983889]. In addition, the comparisons between the proposed RCSA algorithm and other algorithms (i.e., PSO-DE; Hui et al.

[Figure: convergence curve of the objective value versus iteration.]
Fig. 9  Convergence behavior of the RCSA for the three-bar truss design problem

Fig. 10  Ranking of the optimum solutions for the three-bar truss design problem

Fig. 11  Architecture of pressure vessel design problem


Table 11  Statistical results of different methods for the pressure vessel

Method                        Best         Mean         Worst        SD         Median
Sandgren (1988)               8129.1036    NA           NA           NA         NA
Kannan and Kramer (1994)      7198.0428    NA           NA           NA         NA
Deb and Gene (1997)           6410.3811    NA           NA           NA         NA
Coello (2000)                 6288.7445    6293.8432    6308.1497    7.4133     NA
Coello and Montes (2002)      6059.9463    6177.2533    6469.3220    130.9297   NA
He and Wang (2007)            6061.0777    6147.1332    6363.8041    86.4545    NA
Montes and Coello (2008)      6059.7456    6850.0049    7332.8798    426.0000   NA
Kaveh and Talatahari (2010)   6059.7258    6081.7812    6150.1289    67.2418    NA
Kaveh and Talatahari (2009)   6059.0925    6075.2567    6135.3336    41.6825    NA
Gandomi et al. (2013)         6059.714     6447.7360    6495.3470    502.693    NA
Cagnina et al. (2008)         6059.714335  6092.0498    NA           12.1725    NA
Coello Coello et al. (2010)   6059.7208    6440.3786    7544.4925    448.4711   6257.5943
He et al. (2004)              6059.7143    6289.92881   NA           305.78     NA
Akay and Karaboga (2012)      6059.714339  6245.308144  NA           205        NA
Garg (2014)                   5885.403282  5887.557024  5895.126804  2.745290   5886.14928
Proposed RCSA                 6059.606944  6059.844857  6061.034418  0.0582763  6059.606944

NA not available

Table 12  The optimal design variables with their objective values

Method                        x1         x2         x3          x4           F(x)
Sandgren (1988)               1.125000   0.625000   47.700000   117.701000   8129.1036
Kannan and Kramer (1994)      1.125000   0.625000   58.291000   43.690000    7198.0428
Deb and Gene (1997)           0.937500   0.500000   48.329000   112.679000   6410.3811
Coello (2000)                 0.812500   0.437500   40.323900   200.000000   6288.7445
Coello and Montes (2002)      0.812500   0.437500   42.097398   176.654050   6059.946
He and Wang (2007)            0.812500   0.437500   42.091266   176.746500   6061.0777
Montes and Coello (2008)      0.812500   0.437500   42.098087   176.640518   6059.7456
Kaveh and Talatahari (2010)   0.812500   0.437500   42.103566   176.573220   6059.0925
Kaveh and Talatahari (2009)   0.812500   0.437500   42.098353   176.637751   6059.7258
Zhang and Wang (1993)         1.125000   0.625000   58.290000   43.693000    7197.7000
Cagnina et al. (2008)         0.812500   0.437500   42.098445   176.636595   6059.714335
Coello Coello et al. (2010)   0.812500   0.437500   42.098400   176.637200   6059.7208
He et al. (2004)              0.812500   0.437500   42.098445   176.636595   6059.7143
Lee and Geem (2005)           1.125000   0.625000   58.278900   43.754900    7198.433
Montes et al. (2007)          0.812500   0.437500   42.098446   176.636047   6059.701660
Hu et al. (2003)              0.812500   0.437500   42.098450   176.636600   6059.7151717976
Gandomi et al. (2013)         0.812500   0.437500   42.0984456  176.6365958  6059.7143348
Akay and Karaboga (2012)      0.812500   0.437500   42.098446   176.636596   6059.714339
Garg (2014)                   0.7781977  0.3846656  40.3210545  199.9802367  5885.4032828
Proposed RCSA                 0.812500   0.437500   42.100204   176.614800   6059.606944
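The cost function itself is not restated in this excerpt; the expression below is the standard pressure vessel cost from Sandgren (1988) used by the compared studies. Evaluating it at the RCSA design variables of Table 12 approximately reproduces the reported objective value:

```python
def vessel_cost(x):
    """Standard pressure vessel cost (material, forming and welding)."""
    x1, x2, x3, x4 = x   # Ts, Th, R, L
    return (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
            + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)

cost = vessel_cost([0.812500, 0.437500, 42.100204, 176.614800])
print(cost)   # close to the 6059.606944 reported in Table 12 for RCSA
```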

2010; Ray and Liew 2003) are presented in Table 10. Table 10 demonstrates that the proposed algorithm outperforms the other algorithms in terms of the quality of the obtained solution, where the best solution among all algorithms is highlighted in boldface. Therefore, it can be concluded that the proposed RCSA algorithm is a good alternative algorithm for design problems. Further, the convergence behavior of the RCSA for obtaining the best


13000 Beside these results, it can see that the proposed RCSA
RCSA
provides the better result than all other algorithms and then
12000
its result comes in the first rank, while PSO-DE provides
11000 the second best solution as the second rank result, then
Ray and Liew comes in third rank. In general, the results
Objective value

10000 of these algorithms are depicted according these ranks


9000
in Fig. 10. Figure 10 illustrates that the proposed RCSA
algorithm finds the better rank over all other algorithms.

E
8000
6.3.3 Pressure vessel design problem
7000

CL
6000
The goal of the pressure vessel design is to minimize the
0 100 200 300 400 500 total cost (i.e., the cost of material, forming and welding)
Iteration
(Sandgren 1988) of a cylindrical vessel that is capped at both
ends by hemi-spherical heads as shown in Fig. 11. Using
Fig. 12  Convergence behavior of the pressure vessel design problem
rolled steel plate, the shell is made in two halves that are

TI
joined by two longitudinal welds to form a cylinder. There
objective value of the three-bar truss design problem is are four design variable associated with it, namely the thick-
illustrated in Fig. 9. We can see that the proposed algo- ness of the pressure vessel, Ts = x1 , thickness of the head,

AR
rithm converges rapidly to the optimal solution in less than Th = x2 , inner radius of the vessel, R = x3 , and length of the
40 iterations. vessel without heads, L = x4 , i.e., the variable vectors are
Fig. 13  Ranking of the optimum solutions for the pressure vessel design problem

Fig. 14  Schematic of the speed reducer design problem

Table 13  Comparison of the speed reducer design problem results of RCSA with other algorithms

Algorithm                  Best           Mean           Median         Worst          SD
Proposed RCSA              2994.381855    2994.381855    2994.381855    2994.381855    0.0000
Hui et al. (2010)          2996.348167    2996.348174    NA             2996.348204    6.4E−06
Ray and Liew (2003)        2994.744241    3001.758264    NA             3009.964736    4.0E+00
Rao and Xiong (2005)       3000.959715    NA             NA             NA             NA
Cagnina et al. (2008)      2996.347849    NA             NA             NA             NA
Tosserams et al. (2007)    2996.645783    NA             NA             NA             NA
Lu and Kim (2010)          3019.583365    NA             NA             NA             NA

NA not available
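The columns of Table 13 are the usual summary statistics over the independent runs; a minimal Python sketch with illustrative (made-up) run values, since the individual 30 run results are not listed:

```python
import statistics

# Hypothetical objective values from repeated runs (illustrative only;
# the actual 30 RCSA run values are not given in the paper)
runs = [2994.381855, 2994.381855, 2994.381856, 2994.381855]

best   = min(runs)
worst  = max(runs)
mean   = statistics.mean(runs)
median = statistics.median(runs)
sd     = statistics.pstdev(runs)  # population SD; stdev() would also be a defensible choice

print(best, mean, median, worst, sd)
```

A near-zero SD over all runs, as reported for the proposed RCSA, indicates that the algorithm reaches essentially the same objective value on every run.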
Fig. 15  Convergence behavior of the speed reducer design problem

Fig. 16  Ranking of the optimum solutions for the speed reducer design problem
given (in inches) by 𝐱 = (Ts, Th, R, L) = (x1, x2, x3, x4). Then, the problem is formulated mathematically as follows:

Min F(𝐱) = 0.6224 x1 x3 x4 + 1.7781 x2 x3² + 3.1661 x1² x4 + 19.84 x1² x3
subject to:
g1(𝐱) = −x1 + 0.0193 x3 ≤ 0,
g2(𝐱) = −x2 + 0.00954 x3 ≤ 0,
g3(𝐱) = −π x3² x4 − (4/3) π x3³ + 1,296,000 ≤ 0,
g4(𝐱) = x4 − 240 ≤ 0,
1 × 0.0625 ≤ x1, x2 ≤ 99 × 0.0625, 10 ≤ x3, x4 ≤ 200.    (22)

The optimal solution obtained by implementing the proposed RCSA algorithm is 𝐱 = (0.812500, 0.437500, 42.100204, 176.614800), with corresponding function value f(𝐱) = 6059.606944; in addition, the constraint values are [g1 g2 g3 g4] = [3.394885E−5, −0.037548, −0.000278, −63.385199]. Table 11 outlines the statistical results of the proposed RCSA algorithm in terms of the best, mean, worst, standard deviation and median over 30 runs. Moreover, these results are compared with other algorithms (Sandgren 1988; Kannan and Kramer 1994; Deb and Gene 1997; Coello and Montes 2002; He and Wang 2007; Montes and Coello 2008; Kaveh and Talatahari 2010, 2009; Cagnina et al. 2008; Coelho 2010; Akay and Karaboga 2012; Garg 2014; Zhang and Wang 1993; Montes et al. 2007). Table 11 shows that the proposed RCSA algorithm outperforms the other optimization algorithms. Although the solution produced by Garg (2014) is better than that of the proposed RCSA algorithm, its design variables violate the variable restrictions (i.e., x1 is discrete).

Table 12 compares the proposed RCSA algorithm with the other algorithms in terms of the optimal design variables and their corresponding function value. We can see that the design variables produced by Garg (2014) violate the variable restrictions, whereas the proposed RCSA algorithm provides a better objective function value than those reported in the literature. Table 12 indicates that the proposed RCSA algorithm is more robust than the other methods.

The convergence curve of the proposed RCSA algorithm for the pressure vessel design problem is provided in Fig. 12. The convergence behavior indicates that the proposed

Fig. 17  Definitions regarding rough set approximations
RCSA algorithm finds the optimal solution in less than 150 iterations.

From Fig. 13, we see that the proposed RCSA algorithm takes the third order, while Garg (2014) and Kaveh and Talatahari (2009) take the first and second orders, respectively. On the other hand, Garg (2014) violates the bound restriction for the variable x1, and Kaveh and Talatahari (2009) provides a constraint value of g1 = 9.9E−5, which is greater than that provided by the proposed RCSA algorithm. Consequently, the proposed RCSA algorithm is still better for this problem.

6.3.4 Speed reducer design problem

The main goal of the speed reducer design problem is to minimize the total weight of the speed reducer while satisfying some constraints. This design problem has a feasible space that is rather difficult to detect, as reported in Golinski (1973), where the constraints include limitations on the bending stress of the gear teeth, surface stress, transverse deflections of shafts 1 and 2 due to the transmitted force, and stresses in shafts 1 and 2. Figure 14 shows the schematic shape of the speed reducer design problem, in which the seven design variables are shown: the face width, b = x1, module of teeth, m = x2, number of teeth on the pinion, z = x3, length of shaft 1 between bearings, l1 = x4, length of shaft 2 between bearings, l2 = x5, diameter of shaft 1, d1 = x6, and diameter of shaft 2, d2 = x7, i.e., the variable vector is given by 𝐱 = (b, m, z, l1, l2, d1, d2) = (x1, x2, x3, x4, x5, x6, x7). Then, the problem is formulated mathematically as follows:

Min F(𝐱) = 0.7854 x1 x2² (3.3333 x3² + 14.9334 x3 − 43.0934) − 1.508 x1 (x6² + x7²) + 7.4777 (x6³ + x7³) + 0.7854 (x4 x6² + x5 x7²)
subject to:
g1(𝐱) = 27/(x1 x2² x3) − 1 ≤ 0,
g2(𝐱) = 397.5/(x1 x2² x3²) − 1 ≤ 0,
g3(𝐱) = 1.93 x4³/(x2 x3 x6⁴) − 1 ≤ 0,
g4(𝐱) = 1.93 x5³/(x2 x3 x7⁴) − 1 ≤ 0,
g5(𝐱) = √((745 x4/(x2 x3))² + 16.9 × 10⁶)/(110 x6³) − 1 ≤ 0,
g6(𝐱) = √((745 x5/(x2 x3))² + 157.5 × 10⁶)/(85 x7³) − 1 ≤ 0,
g7(𝐱) = x2 x3/40 − 1 ≤ 0,
g8(𝐱) = 5 x2/x1 − 1 ≤ 0,
g9(𝐱) = x1/(12 x2) − 1 ≤ 0,
g10(𝐱) = (1.5 x6 + 1.9)/x4 − 1 ≤ 0,
g11(𝐱) = (1.1 x7 + 1.9)/x5 − 1 ≤ 0,
2.6 ≤ x1 ≤ 3.6, 0.7 ≤ x2 ≤ 0.8, 17 ≤ x3 ≤ 28, 7.3 ≤ x4, x5 ≤ 8.3, 2.9 ≤ x6 ≤ 3.9, 5 ≤ x7 ≤ 5.5.    (23)
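The formulation in Eq. (23) can likewise be checked numerically at the best RCSA solution reported in the text. A minimal Python sketch (the function name is ours):

```python
import math

def speed_reducer(x):
    """Objective and constraints g1..g11 of the speed reducer problem (Eq. 23)."""
    x1, x2, x3, x4, x5, x6, x7 = x
    f = (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
         - 1.508 * x1 * (x6**2 + x7**2)
         + 7.4777 * (x6**3 + x7**3)
         + 0.7854 * (x4 * x6**2 + x5 * x7**2))
    g = [
        27.0 / (x1 * x2**2 * x3) - 1.0,                                   # bending stress of gear teeth
        397.5 / (x1 * x2**2 * x3**2) - 1.0,                               # surface stress
        1.93 * x4**3 / (x2 * x3 * x6**4) - 1.0,                           # deflection of shaft 1
        1.93 * x5**3 / (x2 * x3 * x7**4) - 1.0,                           # deflection of shaft 2
        math.sqrt((745.0 * x4 / (x2 * x3))**2 + 16.9e6) / (110.0 * x6**3) - 1.0,   # stress in shaft 1
        math.sqrt((745.0 * x5 / (x2 * x3))**2 + 157.5e6) / (85.0 * x7**3) - 1.0,   # stress in shaft 2
        x2 * x3 / 40.0 - 1.0,
        5.0 * x2 / x1 - 1.0,
        x1 / (12.0 * x2) - 1.0,
        (1.5 * x6 + 1.9) / x4 - 1.0,
        (1.1 * x7 + 1.9) / x5 - 1.0,
    ]
    return f, g

# Best RCSA solution as reported in the text
f, g = speed_reducer((3.500006, 0.700001, 17.0, 7.300562, 7.715339, 3.350260, 5.286657))
print(f, max(g))  # all eleven constraints should be non-positive at this point
```

Several constraints (g5, g6, g8, g11) evaluate to within about 1e−4 of zero here, i.e., they are active at the optimum, which is typical for this benchmark.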


The best objective value obtained by the proposed RCSA algorithm is F(𝐱) = 2994.381855, and the optimal design variables are 𝐱 = (3.500006, 0.700001, 17.000000, 7.300562, 7.715339, 3.350260, 5.286657). Table 13 provides the best solution obtained by RCSA for the speed reducer design problem over 30 independent runs, where the statistical results obtained by RCSA are compared with those of other algorithms. As we can see, the results show that the proposed RCSA algorithm produces promising results in comparison with the other methods for the speed reducer design problem in terms of the best, mean, median and worst values. In addition, the minimal value of the standard deviation (SD) denotes the high robustness of RCSA. Figure 15 demonstrates the convergence behavior of the proposed RCSA algorithm, which finds the optimal solution in less than 200 iterations. On the other hand, Fig. 16 compares the ranks of the different algorithms; the proposed algorithm attains the best rank and thus outperforms the other algorithms.

7 Conclusions

We conclude that the integrated RCSA improves the quality of the found solutions and also guarantees faster convergence to the optimal solution. In the RCSA, the CSA phase is presented in the first stage to provide the initial optimal solution of the optimization problem, while the RSS phase is introduced as a second stage to enhance the exploitation search. Because the flight length of the traditional CSA is fixed, it may produce unsatisfactory solutions; hence, a dynamic flight-length behavior is introduced, with the aim of eliciting values from the interval [flmin, flmax] to enhance the exploration process. The proposed RCSA algorithm is investigated on 30 benchmark problems of IEEE CEC 2005 and IEEE CEC 2010 and 4 engineering design problems. The results obtained by RCSA are compared with different algorithms from the literature. The simulations showed that the incorporation of the RSS provides an important modification of the CSA. In comparison with the classical CSA and other algorithms from the literature, it seems that the RCSA performed significantly well. The superior results of RCSA on the benchmark problems and engineering design problems showed its applicability to complex real-world problems. The main reason for the superior performance of RCSA lies behind the RSS, which helps in discovering new promising regions by means of the lower and upper approximations; thus it can refine the convergence rate of the algorithm and avoid getting stuck in local optima. Therefore, we can conclude that the proposed RCSA can handle engineering optimization problems efficiently and effectively.

Future work will focus on applying the methodology of RCSA to solve multi-objective problems, mixed-type problems, and discrete optimization problems in smart and complex applications (Abdelaziz et al. 2018; Darwish et al. 2017; Elhoseny et al. 2018b, c; Sajjad et al. 2017; Shehab et al. 2018).

Appendix 1: Rough set theory definitions

Definition A.1 (Information system) An information system (IS) is denoted as a triplet T = (U, A, f), where U is a non-empty finite set of objects and A is a non-empty finite set of attributes. An information function f maps an object to its attribute value, i.e., fa: U → Va for every a ∈ A, where Va is the value set of attribute a. A posteriori knowledge (denoted by d) is expressed by one distinguished attribute. A decision system is an IS of the form DT = (U, A ∪ {d}, f), where d ∉ A is used for supervised learning. The elements of A are called conditional attributes.

Definition A.2 (Indiscernibility) For an attribute set B ⊆ A, the equivalence relation induced by B is called a B-indiscernibility relation, i.e., INDT(B) = {(x, y) ∈ U² | ∀ a ∈ B, fa(x) = fa(y)}. The equivalence classes of the B-indiscernibility relation are denoted by IB(x).

Definition A.3 (Set approximation) Let X ⊆ U and B ⊆ A in an IS. The B-lower approximation of X is the set of objects that belong to X with certainty, i.e., B̲X = {x ∈ U | IB(x) ⊆ X}. The B-upper approximation is the set of objects that possibly belong to X, i.e., B̄X = {x ∈ U | IB(x) ∩ X ≠ ∅}.

Definition A.4 (Reducts) If X¹DT, X²DT, …, XʳDT are the decision classes of DT, the set POSB(d) = B̲X¹ ∪ B̲X² ∪ ⋯ ∪ B̲Xʳ is the B-positive region of DT. A subset B ⊆ A is a set of relative reducts of DT if and only if POSB(d) = POSC(d) and POSB−{b}(d) ≠ POSC(d) for all b ∈ B. In the same way, POSB(X), BNB(X) and NEGB(X) are defined as follows (see Fig. 17):

• POSB(X) = B̲X ⇒ certainly a member of X
• NEGB(X) = U − B̄X ⇒ certainly not a member of X
• BNB(X) = B̄X − B̲X ⇒ possibly a member of X
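Definitions A.2 and A.3 can be illustrated on a toy decision table. The following Python sketch (with made-up data of our own) computes the lower and upper approximations and the derived boundary and negative regions:

```python
from collections import defaultdict

# Toy decision table: object id -> (condition attribute value, decision)
U = {1: ('a', 'yes'), 2: ('a', 'no'), 3: ('b', 'yes'), 4: ('b', 'yes'), 5: ('c', 'no')}

# Equivalence classes of the indiscernibility relation (Definition A.2):
# objects with equal condition values are indiscernible.
classes = defaultdict(set)
for obj, (cond, _) in U.items():
    classes[cond].add(obj)

# Target set X: objects whose decision is 'yes'
X = {o for o, (_, d) in U.items() if d == 'yes'}

lower = set().union(*(c for c in classes.values() if c <= X))   # B-lower: I_B(x) ⊆ X
upper = set().union(*(c for c in classes.values() if c & X))    # B-upper: I_B(x) ∩ X ≠ ∅
boundary = upper - lower                                        # BN_B(X): possibly in X
negative = set(U) - upper                                       # NEG_B(X): certainly not in X

print(lower, upper, boundary, negative)
# lower = {3, 4}; upper = {1, 2, 3, 4}; boundary = {1, 2}; negative = {5}
```

Objects 1 and 2 share the same condition value but differ in decision, so they fall into the boundary region: exactly the "rough" membership that the RSS exploits to bracket the promising region.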


References

Abdelaziz A, Elhoseny M, Salama AS, Riad AM (2018) A machine learning model for improving healthcare services on cloud computing environment. Measurement 119:117–128
Akay B, Karaboga D (2012) Artificial bee colony algorithm for large-scale problems and engineering design optimization. J Intell Manuf 23(4):1001–1014
Alireza A (2016) A novel metaheuristic method for solving constrained engineering optimization problems: crow search algorithm. Comput Struct 169:1–12
Auger A, Hansen N (2005) A restart CMA evolution strategy with increasing population size. In: Proceedings of the 2005 IEEE congress on evolutionary computation, pp 1769–1776
Bartholomew-Biggs M (2008) Nonlinear optimization with engineering applications. Springer Optim Appl 19:1–14
Cagnina LC, Esquivel SC, Coello Coello CA (2008) Solving engineering optimization problems with the simple constrained particle swarm optimizer. Informatica 32(3):319–326
Chijun Z, Yongjian Y, Zhanwei D, Chuang M (2016) Particle swarm optimization algorithm based on ontology model to support cloud computing applications. J Ambient Intell Humaniz Comput 7(5):633–638
Coelho LS (2010) Gaussian quantum-behaved particle swarm optimization approaches for constrained engineering design problems. Expert Syst Appl 37(2):1676–1683
Coello CAC (2000) Use of a self-adaptive penalty approach for engineering optimization problems. Comput Ind 41(2):113–127
Coello Coello CA (2002) Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art. Comput Methods Appl Mech Eng 191(11):1245–1287
Coello Coello CA, Dhaenens C, Jourdan L (2010) Multi-objective combinatorial optimization: problematic and context. In: Coello Coello CA, Dhaenens C, Jourdan L (eds) Advances in multi-objective nature inspired computing. Studies in computational intelligence, vol 272. Springer, Berlin, Heidelberg, pp 1–21
Coello CAC, Montes EM (2002) Constraint-handling in genetic algorithms through the use of dominance-based tournament selection. Adv Eng Inf 16(3):193–203
Darwish A, Hassanien AE, Elhoseny M, Sangaiah AK, Muhammad K (2017) The impact of the hybrid platform of internet of things and cloud computing on healthcare systems: opportunities, challenges, and open problems. J Ambient Intell Humaniz Comput. https://doi.org/10.1007/s12652-017-0659-1
Deb K (2000) An efficient constraint handling method for genetic algorithms. Comput Methods Appl Mech Eng 186:311–338
Deb K, Gene AS (1997) A robust optimal design technique for mechanical component design. In: Dasgupta D, Michalewicz Z (eds) Evolutionary algorithms in engineering applications. Springer, Berlin, pp 497–514
Dimopoulos GG (2007) Mixed-variable engineering optimization based on evolutionary and social metaphors. Comput Methods Appl Mech Eng 196(4–6):803–817
Draa A, Bouzoubia S, Boukhalfa I (2015) A sinusoidal differential evolution algorithm for numerical optimisation. Appl Soft Comput 27:99–126
Elhoseny M, Tharwat A, Hassanien AE (2018a) Bezier curve based path planning in a dynamic field using modified genetic algorithm. J Comput Sci 25:339–350
Elhoseny M, Abdelaziz A, Salama AS, Riad AM, Muhammad K, Sangaiah AK (2018b) A hybrid model of Internet of Things and cloud computing to manage big data in health services applications. Future Gener Comput Syst 86:1383–1394
Elhoseny M, Ramírez-González G, Abu-Elnasr OM, Shawkat SA, Arunkumar N, Farouk A (2018c) Secure medical data transmission model for IoT-based healthcare systems. IEEE Access. https://doi.org/10.1109/ACCESS.2018.2817615
Eshelman LJ (1991) The CHC adaptive search algorithm: how to have safe search when engaging in nontraditional genetic recombination. In: Rawlins GJE (ed) Foundations of genetic algorithms. Morgan Kaufmann, San Mateo, pp 265–283
Eshelman LJ, Schaffer JD (1993) Real-coded genetic algorithms and interval schemata. In: Whitley D (ed) Foundations of genetic algorithms. Morgan Kaufmann, San Mateo, pp 187–202
Fernandes C, Rosa A (2001) A study of non-random matching and varying population size in genetic algorithm using a royal road function. In: Proceedings of the 2001 congress on evolutionary computation, pp 60–66
Fesanghary M, Mahdavi M, Minary-Jolandan M, Alizadeh Y (2008) Hybridizing harmony search algorithm with sequential quadratic programming for engineering optimization problems. Comput Methods Appl Mech Eng 197(33–40):3080–3091
Gandomi A, Yang XS, Alavi A (2013) Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Eng Comput 29(1):17–35
Gaoji S, Ruiqing Z, Yanfei L (2016) Joint operations algorithm for large-scale global optimization. Appl Soft Comput 38:1025–1039
Garg H (2014) Solving structural engineering design optimization problems using an artificial bee colony algorithm. J Ind Manag Optim 10(3):777–794
Golinski J (1973) An adaptive optimization system applied to machine synthesis. Mech Mach Theory 8(4):419–436
Han MF, Liao SH, Chang JY, Lin CT (2013) Dynamic group-based differential evolution using a self-adaptive strategy for global optimization problems. Appl Intell 39(1):41–56
He Q, Wang L (2007) An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng Appl Artif Intell 20(1):89–99
He S, Prempain E, Wu QH (2004) An improved particle swarm optimizer for mechanical design optimization problems. Eng Optim 36(5):585–605
Herrera F, Lozano M, Molina D (2006) Continuous scatter search: an analysis of the integration of some combination methods and improvement strategies. Eur J Oper Res 169(2):450–476
Himmelblau DM (1972) Applied nonlinear programming. McGraw-Hill, New York
Hu XH, Eberhart RC, Shi YH (2003) Engineering optimization with particle swarm. In: Proceedings of the 2003 IEEE swarm intelligence symposium, pp 53–57
Hui L, Zixing C, Yong W (2010) Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization. Appl Soft Comput 10(2):629–640
Hui W, Hui S, Changhe L, Shahryar R, Jeng-shyang P (2013) Swarm optimization with neighborhood search. Inf Sci 223:119–135
Jie H, Tianrui L, Chuan L, Hamido F, Yan Y (2017) Incremental fuzzy cluster ensemble learning based on rough set theory. Knowl Based Syst 132(15):144–155
Joaquín D, Salvador G, Daniel M, Francisco H (2001) A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol Comput 1(1):3–18
Kannan BK, Kramer SN (1994) An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. J Mech Des 116(2):318–320
Kaveh A, Talatahari S (2009) Engineering optimization with hybrid particle swarm and ant colony optimization. Asian J Civ Eng (Build Hous) 10(6):611–628


Kaveh A, Talatahari S (2010) An improved ant colony optimization for constrained engineering design problems. Eng Comput 27(1):155–182
Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of IV IEEE international conference on neural networks, pp 1942–1948
Laguna M, Marti R (2003) Scatter search: methodology and implementation in C. Kluwer Academic Publishers, Dordrecht
Lee KS, Geem ZW (2005) A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice. Comput Methods Appl Mech Eng 194(36–38):3902–3933
Li Y, Liao X, Zhao W (2009) A rough set approach to knowledge discovery in analyzing competitive advantages of firms. Ann Oper Res 168(1):205–223
Lu S, Kim HM (2010) A regularized inexact penalty decomposition algorithm for multidisciplinary design optimization problems with complementarity constraints. J Mech Des 132(4):1–12
Mehta VK, Dasgupta B (2012) A constrained optimization algorithm based on the simplex search method. Eng Optim 44(5):537–550
Metawa N, Hassana MK, Elhoseny M (2017) Genetic algorithm based model for optimizing bank lending decisions. Expert Syst Appl 80:75–82
Mohit J, Asha R, Vijander S (2017) An improved Crow Search Algorithm for high-dimensional problems. J Intell Fuzzy Syst 33:3597–3614
Montes EM, Coello CAC (2008) An empirical study about the usefulness of evolution strategies to solve constrained optimization problems. Int J Gen Syst 37(4):443–473
Montes EM, Coello CAC, Reyes JV, Davila LM (2007) Multiple trial vectors in differential evolution for engineering design. Eng Optim 39(5):567–589
Mousa AA, Abd El-Wahed WF, RizkAllah RM (2011) A hybrid ant colony optimization approach based local search scheme for multiobjective design optimizations. Electr Power Syst Res 81:1014–1023
Mülenbein H, Schlierkamp-Voosen D (1993) Predictive models for the breeding genetic algorithm in continuous parameter optimization. Evol Comput 1(1):25–49
Omran MGH, Salman A (2009) Constrained optimization using CODEQ. Chaos Solitons Fractals 42(2):662–668
Pan QK, Sang HY, Duan JH, Gao L (2014) An improved fruit fly optimization algorithm for continuous function optimization problems. Knowl Based Syst 62:69–83
Pawlak Z (1982) Rough sets. Int J Comput Inform Sci 11:341–356
Penev K (2014) Free search-comparative analysis 100. Int J Metaheuristics 3(2):118–132
Price KV, Rainer M, Lampinen JA (2005) Differential evolution: a practical approach to global optimization. Springer, Berlin
Qin AK, Suganthan PN (2005) Self-adaptive differential evolution algorithm for numerical optimization. In: Proceedings of the 2005 IEEE congress on evolutionary computation, vol 2, pp 1785–1791
Ramezani F, Lotfi S (2013) Social-based algorithm (SBA). Appl Soft Comput 13:2837–2856
Rao SS (2009) Engineering optimization-theory and practice. Wiley, New York
Rao SS, Xiong Y (2005) A hybrid genetic algorithm for mixed discrete design optimization. J Mech Des 127(6):1100–1112
Ray T, Liew KM (2003) Society and civilization: an optimization algorithm based on the simulation of social behavior. IEEE Trans Evol Comput 7(4):386–396
Rizk-Allah RM (2016) Fault diagnosis of the high-voltage circuit breaker based on granular reduction approach. Eur J Sci Res 138(1):29–37
Rizk-Allah RM (2018) Hybridizing sine cosine algorithm with multi-orthogonal search strategy for engineering design problems. J Comput Des Eng 5:249–273
Rizk-Allah RM, Zaki EM, El-Sawy AA (2013) Hybridizing ant colony optimization with firefly algorithm for unconstrained optimization problems. Appl Math Comput 224:473–483
Rizk-Allah RM, Abdel-Mageed HM, El-Sehiemy RA, Abdel-Aleem SH, El-Shahat A (2017a) A new sine cosine optimization algorithm for solving combined non-convex economic and emission power dispatch problems. Int J Energy Convers 5(6):180–192
Rizk-Allah RM, El-Sehiemy RA, Deb S, Wang GG (2017b) A novel fruit fly framework for multi-objective shape design of tubular linear synchronous motor. J Supercomput 73(3):1235–1256
Rizk-Allah RM, El-Sehiemy RA, Wang GG (2018a) A novel parallel hurricane optimization algorithm for secure emission/economic load dispatch solution. Appl Soft Comput 63:206–222
Rizk-Allah RM, Hassanien AE, Bhattacharyya S (2018b) Chaotic crow search algorithm for fractional optimization problems. Appl Soft Comput. https://doi.org/10.1016/j.asoc.2018.03.019
Rubén AR, Manuel VR, Rodríguez-Ortiz JJ (2015) Genetic algorithms and Darwinian approaches in financial applications: a survey. Expert Syst Appl 42(21):7684–7697
Sajjad M, Nasir M, Muhammad K, Khan S, Jan Z, Sangaiah AK et al (2017) Raspberry Pi assisted face recognition framework for enhanced law-enforcement services in smart cities. Future Gener Comput Syst. https://doi.org/10.1016/j.future.2017.11.013
Sandgren E (1988) Nonlinear integer and discrete programming in mechanical design. In: Proceedings of the ASME design technology conference, F.L. Kissimine, pp 95–105
Sayed GI, Hassanien AE, Azar AT (2017) Feature selection via a novel chaotic crow search algorithm. Neural Comput Appl. https://doi.org/10.1007/s00521-017-2988-6
Sedlaczek K, Eberhard P (2005) Constrained particle swarm optimization of mechanical systems. In: 6th world congresses of structural and multidisciplinary optimization, Rio de Janeiro, Brazil, pp 1–10
Seif Z, Ahmadi MB (2015) An opposition-based algorithm for function optimization. Eng Appl Artif Intell 37:293–306
Shehab A, Elhoseny M, Muhammad K, Sangaiah AK, Yang P, Huang H, Hou G (2018) Secure and robust fragile watermarking scheme for medical images. IEEE Access 6:10269–10278
Shu WH, Shen H (2014) Incremental feature selection based on rough set in dynamic incomplete data. Pattern Recognit 47(12):3890–3906
Suganthan P, Hansen N, Liang J, Deb K, Chen Y, Auger A, Tiwari S (2005) Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization. Nanyang Technological University, Singapore
Tang K, Li X, Suganthan PN, Yang Z, Weise T (2009) Benchmark functions for the CEC'2010 special session and competition on large-scale global optimization. Nature Inspired Computation and Applications Laboratory, Hefei
Tharwat A, Elhoseny M, Hassanien AE, Gabel T, Kumar NA (2018) Intelligent Beziér curve-based path planning model using Chaotic Particle Swarm Optimization algorithm. Cluster Comput. https://doi.org/10.1007/s10586-018-2360-3
Tosserams S, Etman LFP, Rooda JE (2007) An augmented Lagrangian decomposition method for quasi-separable problems in MDO. Struct Multidiscip Optim 34(3):211–227
Xiang L, Wang G (2015) Optimal band selection for hyperspectral data with improved differential evolution. J Ambient Intell Humaniz Comput 6(5):675–688
Xiaohui Y, Elhoseny M, Hamdy KE, Alaa MR (2017) A genetic algorithm-based, dynamic clustering method towards improved WSN longevity. J Netw Syst Manag 25(1):21–46
Xiuyi J, Lin S, Bing Z, Yiyu Y (2016) Generalized attribute reduction in rough set theory. Knowl Based Syst 91:204–218


Xu X, Tang Y, Li J, Hua CC, Guan XP (2015) Dynamic multi-swarm particle swarm optimizer with cooperative learning strategy. Appl Soft Comput 29:169–183
Yang XS (2008) Nature-inspired metaheuristic algorithms. Luniver Press, Bristol
Zhang C, Wang HP (1993) Mixed-discrete nonlinear optimization with simulated annealing. Eng Optim 17(3):263–280

Publisher's Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
