Rajani David Bagul, Prof. Dr. B. D. Phulpagar
RESEARCH ARTICLE
OPEN ACCESS
ABSTRACT
An ensemble of classifiers is an approach in which multiple classifiers are learned from the same dataset and these trained classifiers are then used to predict unlabelled data. The performance of an ensemble of classifiers is better than that of a single classifier. Bagging, which stands for bootstrap aggregation, is one ensemble approach. In bagging, a number of bags (n) are formed, where each bag contains a number of instances (m) taken from the training examples. The formation of bags is referred to as bootstrap sampling, in which m instances are selected randomly from the training data and instances can be repeated. Once the n bags are formed, a number of classifiers (c) are trained using the n bags, and these c classifiers are used for further prediction. Bagging has some drawbacks: all classifiers are considered to be of equal importance, and there is no method for optimal bag creation. The same problems were faced by the random subspace classifier based ensemble (RSCE) and were solved by applying the hybrid adaptive ensemble learning (HAEL) framework. In the proposed work, the HAEL framework is applied to solve these issues in bagging. Experimental results show that bagging with HAEL improves classification accuracy over simple bagging.
Keywords:- Ensemble Learning, Bagging, Random Subspace.
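As a rough illustration of the bagging procedure summarised in the abstract, the following Python sketch builds n bags by bootstrap sampling m instances with repetition and trains one classifier per bag. It is only a minimal sketch under assumed names (the decision-tree base learner and the functions train_bagging_ensemble and predict_bagging are placeholders), not the implementation used in this work.

import numpy as np
from sklearn.tree import DecisionTreeClassifier  # assumed base learner, for illustration only

def train_bagging_ensemble(X, y, n_bags=10, m=None, seed=None):
    """Train n_bags classifiers, each on a bootstrap sample of m instances."""
    rng = np.random.default_rng(seed)
    m = len(X) if m is None else m
    classifiers = []
    for _ in range(n_bags):
        # Bootstrap sampling: m instances drawn randomly WITH repetition.
        idx = rng.integers(0, len(X), size=m)
        classifiers.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return classifiers

def predict_bagging(classifiers, X):
    """Plain bagging: every classifier has equal importance (majority vote)."""
    votes = np.array([clf.predict(X) for clf in classifiers])  # shape (n_bags, n_samples)
    # Majority vote per instance; assumes integer class labels.
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)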
I. INTRODUCTION
Applications of the Ensemble
In paper [13], a non-parallel plane proximal classifier ensemble is described, which gives higher accuracy than a single non-parallel plane proximal classifier. The ensemble is applied to classify unknown tissue samples using known gene expressions as training data. A genetic algorithm based scheme is used to train the non-parallel plane proximal classifiers, and classifiers with positive performance are selected to predict the unlabelled data. To aggregate the prediction results, a minimum average proximity-based decision combiner is used. Comparison with SVM shows that the system gives comparable accuracy with less average training time. Takemura et al. [14] designed combined heterogeneous classifier ensembles using a kappa statistic diversity measure and applied them to electromyography signal datasets; they also used the classifier ensemble approach to identify breast tumours in ultrasonic images. In the area of data mining, Windeatt et al. [15] applied MLP ensembles to perform feature ranking. Rasheed et al.
Fig. 2. Adaptive process of base classifier competition in SAEL.
The initial weight of each classifier is W_i = 1/n. The first step of this process is to assign this initial weight to each classifier. In the next step, T training instances are selected randomly from the main training data (Dtr). Each selected instance is classified by every classifier, and each classifier keeps its predicted value for each instance. An indicator vector IV of length T is generated for each classifier; the i-th value in the indicator vector denotes whether the i-th predicted value was correct or not, where 0 indicates a correct prediction and 1 indicates an error. The instances S that are predicted correctly by some classifiers and wrongly by others are considered for the further procedure.
The weight of a classifier is increased if it predicts the instances in S correctly and is reduced otherwise. The predicted error of the classifiers for the i-th training sample is calculated as follows:

E_i = (1/N) Σ_{n=1}^{N} IV_in        (1)

where IV_in is the value in the indicator vector of the n-th classifier for the i-th instance and N is the number of classifiers.
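The step just described can be sketched in Python as follows. This is only an illustrative sketch that follows the notation of the text (W_i, T, IV, S); the helper name evaluate_classifiers and the sampling without replacement are assumptions, not published code.

import numpy as np

def evaluate_classifiers(classifiers, X_tr, y_tr, T, seed=None):
    """Assign initial weights, sample T instances from Dtr, and build the
    indicator vectors IV (0 = correct prediction, 1 = error)."""
    rng = np.random.default_rng(seed)
    N = len(classifiers)
    weights = np.full(N, 1.0 / N)                  # W_i = 1/n initial weight
    sel = rng.choice(len(X_tr), size=T, replace=False)
    Xs, ys = X_tr[sel], y_tr[sel]

    # IV[i, n] = 0 if classifier n predicts instance i correctly, 1 otherwise.
    IV = np.array([(clf.predict(Xs) != ys).astype(int)
                   for clf in classifiers]).T      # shape (T, N)

    E_i = IV.sum(axis=1) / N                       # Eq. (1): per-instance error

    # S: instances predicted correctly by some classifiers and wrongly by others.
    disagreements = IV.sum(axis=1)
    S = np.where((disagreements > 0) & (disagreements < N))[0]
    return weights, IV, E_i, S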
The cumulative error over the samples is calculated as

E = Σ_{i=1}^{T'} E_i        (2)

where T' are the instances whose prediction varies across the classifiers, with T' < T. Local environments are created as described in [5], and each classifier is assigned to a local environment. Three operators are used for the weight calculation of the classifiers: 1) the competition among classifiers in the same local environment (CSLE), 2) the competition among classifiers in different local environments (CDLE), and 3) the random competition (RC); the three operators are executed one by one.
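Continuing the previous sketch, the cumulative error of Eq. (2) is simply the sum of E_i over the disagreement instances. The weight-update step shown below is a simplified stand-in, since the exact CSLE, CDLE and RC operators follow [5]; the step size eta is an assumption made only for illustration.

# weights, IV, E_i, S come from evaluate_classifiers(...) in the previous sketch.
# Eq. (2): cumulative error over the T' instances whose prediction varies
# across the classifiers (the set S).
E = E_i[S].sum()

# Simplified weight adjustment (illustrative only): a classifier that is mostly
# correct on the disagreement instances gains weight, otherwise it loses weight.
eta = 0.1  # assumed step size
for n in range(len(weights)):
    err_on_S = IV[S, n].mean() if len(S) > 0 else 0.0
    weights[n] *= (1 + eta) if err_on_S < 0.5 else (1 - eta)
weights /= weights.sum()  # renormalise the classifier weights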
(Figure: cooperation and competition process among the SAELs, with classifier weight updates and random exchange repeated until a termination condition is reached.)
The above three operators are executed until the termination condition specified by the user is reached. The output of this step is the set of ensembles with updated weights, and the ensemble of classifiers with the minimum cumulative error is selected as the optimal ensemble.
4) Testing
In this step, instances are tested using the ensemble of classifiers obtained in the above process.
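A weighted majority vote is one natural way to combine the selected ensemble's predictions at test time; the sketch below assumes that combiner (the text does not spell out the combination rule), and the function name predict_weighted is a placeholder.

import numpy as np

def predict_weighted(classifiers, weights, X):
    """Weighted majority vote of the selected (minimum cumulative error) ensemble."""
    preds = np.array([clf.predict(X) for clf in classifiers])   # shape (N, n_samples)
    classes = np.unique(preds)
    scores = np.zeros((len(classes), preds.shape[1]))
    # Accumulate each classifier's weight on the class it votes for.
    for n, w in enumerate(weights):
        for c_idx, c in enumerate(classes):
            scores[c_idx] += w * (preds[n] == c)
    return classes[scores.argmax(axis=0)]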
Classification accuracy of HBBE and BBE:

Dataset    HBBE      BBE
-          85.50%    84.04%
Bupa       66.81%    60.68%
V. CONCLUSIONS
This paper describes the effectiveness of ensemble learning, various approaches for ensemble learning, the problems in random subspace ensemble learning, and a method to overcome them. We observed that bagging based ensembles face the same problems as RSCE and proposed the HBBE method to solve these problems in bagging based ensembles. Applying the HAEL approach to the bagging based ensemble results in increased prediction accuracy.
ACKNOWLEDGMENT
Rajani Bagul received the B.E in
A) Experimental Setup
The goal of the experimental evaluation is to check the effectiveness of the proposed approach. In the proposed work, HAEL is applied to the bagging based ensemble. The accuracy, time and memory requirements of the bagging based ensemble with HAEL (HBBE) and the existing bagging based ensemble (BBE) are compared. Random subspace based ensemble with
REFERENCES
[1]
[2]
[3]
[4]
[5]
[6]
[7]
[8]
[9]
[10]
[11]
[12]
[13]
[14]
[15]
[16]
[17]