Emotion classification of EEG brain signal using SVM and KNN

Conference Paper · June 2015


DOI: 10.1109/ICMEW.2015.7169786



EMOTION CLASSIFICATION OF EEG BRAIN SIGNAL USING SVM AND KNN

Raja Majid Mehmood (1) and Hyo Jong Lee (1, 2, *)

(1) Division of Computer Science and Engineering
(2) Center for Advanced Image and Information Technology
(*) Corresponding author
Chonbuk National University, Jeonju, SOUTH KOREA
rmeex07@gmail.com, hlee@chonbuk.ac.kr

ABSTRACT

The affective computing research field is growing rapidly alongside the development of human-computer applications. These applications use information on the mental or affective condition of subjects to train on their brain responses. Facial expressions, text, physiological and vocal signals, and other information about the subjects are used in the classification algorithms. However, classification frameworks for EEG brain signals have been used infrequently due to the lack of a complete theoretical framework. We therefore present an analysis of two different classification methods, SVM and KNN. Four different types of emotional stimulus were presented to each subject. After preprocessing the raw EEG data, we employed Hjorth parameters for feature extraction on all EEG channels at each epoch. Five male subjects participated in this experiment. Our results show that emotion recognition from EEG brain signals may be possible.

Index Terms- EEG, emotion, Hjorth, SVM, KNN

1. INTRODUCTION

Brain signal processing technologies are opening windows to new ways of looking at emotions and other affective states. Categorical and dimensional models have long been debated in psychology. In the categorical view, a discrete set of emotions (e.g., 'excited') can be detected or recognized through behavioral changes such as physiological measures or facial actions [1, 2]. The dimensional view instead assumes an essential set of variables, usually considered to be two: arousal and valence. Valence ranges from very positive to very negative feelings, and arousal, also called activation, ranges from sleep to excitement.

Chanel et al. [3] performed EEG-based emotion classification with two emotional classes at the arousal level. They used 64 electrodes at 1024 Hz for EEG recording. The authors presented results for two classifiers, Fisher Discriminant Analysis (FDA) and Naive Bayes, with accuracies of 70% and 72%, respectively. Khalili et al. [4] explored physiological signals through EEG recording at the arousal and valence levels. The authors performed recognition analysis for three emotional classes; their results showed accuracies of 40% for Linear Discriminant Analysis (LDA) and 61% for K-Nearest Neighbor (KNN). Horlings et al. [5] used the electroencephalogram (EEG) to classify five different kinds of emotion on two affective dimensions (valence and arousal, separately). They used the training dataset from the Enterface project database [6] and extended it with their own data. They employed ten subjects for EEG acquisition using a Truescan32 system. Emotion elicitation was achieved using the International Affective Picture System (IAPS) protocol [7, 8]. Subjects were instructed to rate their level of emotion on a 2D arousal and valence scale according to the Self-Assessment Manikin (SAM) [9]. They conducted two recording sessions of 25 to 35 trials each, with a rest time of 5 minutes between sessions. Five pictures were presented in each trial, and each picture was shown for two and a half seconds. The EEG data were preprocessed and band-pass filtered between 2-30 Hz to remove artifacts and noise, and the baseline value was removed from each EEG signal. They extracted about 114 features, such as frequency band power, cross-correlation between EEG band powers, peak frequency in the alpha band, and Hjorth parameters, and selected the best 40 features for each of the valence and arousal dimensions using the max-relevance min-redundancy (mRMR) algorithm [10]. They used two classifiers to train on the feature dataset, with a separate classifier for each dimension (arousal and valence). According to the authors' results, accuracies of 32% for valence and 37% for arousal were achieved with an SVM classifier under 3-fold cross-validation.
Previous research has commonly employed KNN and SVM as classifiers of human brain signals, and both classifiers showed good promise for further analysis in our research. The aim of our study is to classify four emotions in the two dimensions of arousal and valence. We employed the SVM [11] and KNN [12] classifiers for emotion recognition, with Hjorth parameters as the feature extraction method. This study provides new data on EEG-based emotion recognition, and presents a performance comparison of KNN and SVM using an adaptive classification technique. Section 2 discusses the materials and methods used in this work, Section 3 presents the results and discussion, and Section 4 presents our conclusions.

2. MATERIALS AND METHODS

Our experiment aimed to elicit emotional responses from subjects while they watched emotional stimuli. The IAPS database, which was specifically designed for emotion experiments in the two-dimensional domain of arousal and valence, was used for stimulus presentation. We employed a common method of inducing distinct emotions in subjects by displaying emotion-related stimuli [13-16]. We defined four emotional states: happy, calm, sad, and scared. On the basis of the IAPS arousal/valence ratings, 180 stimuli (45 stimuli x 4 states) were selected from equally distributed groups along the arousal-valence axes. The picture ratings are displayed in Fig. 1, and our selection of emotional stimuli is shown with red circles.

Fig. 1. Scatter plot of the International Affective Picture System (IAPS) image database, based on the valence-arousal model.

The EEG signal data were recorded with an Emotiv EPOC headset. The Emotiv EPOC has 14 EEG channels plus 2 reference channels offering optimal positioning for accurate spatial resolution. The device uses the international 10/20 electrode location system for electrode placement. Fig. 2 shows the EEG channel placement in our experiment.

Fig. 2. Emotiv EPOC headset 14-channel placement (all 14 electrode locations shown).

We recorded from 14 electrodes (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4), with CMS/DRL references in the P3/P4 locations. The Emotiv EPOC uses sequential sampling at a rate of 128 samples per second (SPS).

In this paper, we present the results of the five male subjects who participated in this experiment. All subjects were middle school students of the same institution, aged from 12 to 14 years. They were informed about the purpose of our experiment, given a simple presentation about its stages, and signed consent forms after this introduction.

Each stimulus was presented randomly for 1.5 seconds, followed by 0.5 seconds of a blank (black) image. The blank image was used to release the emotional feeling generated in the subject by the previous stimulus. We used a fixation cross projected for four seconds exactly in the center of the screen to attract the subject's attention. Fig. 3 shows the timing diagram of the experiment; the total EEG recording time was 368 seconds per subject.

The recorded EEG brain signals were preprocessed using the EEGLAB toolbox from the SCCN Lab [17], which runs under Matlab. We cleaned each subject's EEG signal data using Independent Component Analysis (ICA) and manual rejection of artifacts such as eye blinks, eye movements, muscle movements, and bad channels. The raw signal of one subject is presented in Fig. 4(a); some noise artifacts produced during the recording session are marked with a red oval. We applied our artifact rejection method, and Fig. 4(b) presents the cleaned EEG signal (green oval) after preprocessing of the raw data.

Hjorth parameters are statistical methods available in the time and frequency domains [18]. These parameters capture the characteristics of an EEG signal and can be used as features for emotion classification [19]. The Hjorth parameters are defined as normalized slope descriptors (NSDs) comprising activity, mobility, and complexity, and are derived by means of the first and second derivatives of the signal. Their computational cost is comparatively much lower than that of other methods.
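The three descriptors can be computed directly from the first and second differences of the signal. The following is a minimal illustrative sketch (our own code, not the authors' implementation):

```python
import math

def _variance(xs):
    """Population variance of a sequence of samples."""
    mean = sum(xs) / len(xs)
    return sum((v - mean) ** 2 for v in xs) / len(xs)

def hjorth_parameters(signal):
    """Hjorth parameters of a 1-D EEG signal (time domain).

    activity   = var(x)                     -- signal power
    mobility   = sqrt(var(x') / var(x))     -- dominant-frequency estimate
    complexity = mobility(x') / mobility(x) -- deviation from a pure sine
    where x' is the first finite difference of x.
    """
    dx = [b - a for a, b in zip(signal, signal[1:])]    # 1st derivative
    ddx = [b - a for a, b in zip(dx, dx[1:])]           # 2nd derivative
    activity = _variance(signal)
    mobility = math.sqrt(_variance(dx) / activity)
    complexity = math.sqrt(_variance(ddx) / _variance(dx)) / mobility
    return activity, mobility, complexity
```

For a pure sinusoid the complexity is 1 by construction; only differences and variances are required, which is why these features are cheap compared with spectral estimates.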

Fig. 3. Timing diagram of emotional stimuli.

Fig. 4. EEG signal preprocessing: (a) raw EEG signal data with noise artifacts; (b) cleaned EEG signal data after preprocessing.
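To make this feature pipeline concrete, the sketch below (illustrative only; the helper names are ours, and the paper itself uses WEKA rather than hand-written classifiers) assembles the three Hjorth parameters of each of the 14 channels into one flat feature vector per epoch, and labels a vector by a simple k = 3 nearest-neighbour majority vote:

```python
import math
from collections import Counter

def _variance(xs):
    mean = sum(xs) / len(xs)
    return sum((v - mean) ** 2 for v in xs) / len(xs)

def hjorth_parameters(x):
    """Activity, mobility, complexity of one channel (time domain)."""
    dx = [b - a for a, b in zip(x, x[1:])]
    ddx = [b - a for a, b in zip(dx, dx[1:])]
    act = _variance(x)
    mob = math.sqrt(_variance(dx) / act)
    comp = math.sqrt(_variance(ddx) / _variance(dx)) / mob
    return [act, mob, comp]

def epoch_features(epoch):
    """epoch: 14 channel signal lists -> 42-value feature vector
    (3 Hjorth parameters x 14 channels)."""
    return [f for channel in epoch for f in hjorth_parameters(channel)]

def knn_predict(train_X, train_y, x, k=3):
    """Label of x by majority vote among its k nearest training vectors."""
    nearest = sorted(range(len(train_X)),
                     key=lambda i: math.dist(train_X[i], x))[:k]
    return Counter(train_y[i] for i in nearest).most_common(1)[0][0]
```

In practice the training vectors would be the per-epoch Hjorth features and the labels the four emotional states (happy, calm, sad, scared); the vote then mirrors the KNN (k = 3) configuration evaluated in the results section.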


We used a total of 42 features in our proposed method. The extracted feature set for each emotional class was provided to the classifier. These features consisted of the three Hjorth parameters over the 14 EEG channels. We considered only a single frequency range, 0.5-30 Hz, and the extraction window is the first 1500 milliseconds of each epoch. All EEG signal patterns were obtained at the i-th EEG channel and the j-th epoch:

    [I42]_j = F_hp(i, j)    (1)

where i and j are indices for EEG channels and epochs, respectively. The function F_hp computes the 42-feature set using the Hjorth parameters at the i-th EEG channel for every j-th epoch, and returns [I42]_j as the instance of epoch j, containing the three parameter values (activity, mobility, and complexity) for each EEG channel. These features were then prepared for WEKA [20], which processes the feature dataset [I42]_j with SVM. Each instance [I]_j contains the epoch type as a class value for the classifier. We adopted WEKA for the classification analysis of the extracted feature dataset. The classifier was trained to recognize four different emotions in the arousal-valence domain. We employed the default SVM parameters available in WEKA, and 10-fold cross-validation was used for classification. Classification accuracy is presented in the following section.

3. RESULTS AND DISCUSSION

The classification results for the five subjects are presented in Fig. 5, with subjects on the x-axis and accuracy on the y-axis; the KNN and SVM results are identified by blue and green bars, respectively, for each subject. The highest accuracy, 61.1%, was obtained with KNN at k = 3.

Fig. 5. SVM vs. KNN classification results:

Classifier   S1     S2     S3     S4     S5
SVM (%)      38.9   33.3   33.3   38.9   38.9
KNN (%)      44.4   50.0   38.9   44.4   61.1

The main objective of our experiment was to evaluate our selected feature extraction method with SVM and KNN. Our results show that it is not trivial to process and classify the data accurately for every subject; the classification results for the four emotions vary across subjects because emotional responses are subject-dependent.

According to the previous work [5] discussed in the introduction, SVM classified affect in the valence and arousal dimensions with accuracies of 32% and 37%, respectively. Our SVM accuracy is comparable to that earlier result, but KNN outperformed SVM for every subject in our experiment.

4. CONCLUSION

We proposed a novel method of emotion recognition from brain signals using 14 EEG channels. The results of our research show that emotion recognition from EEG channels may be possible. Despite the lack of a strong emotion-related physiological indicator to correlate with brain activity at the cortical level, our proposed feature extraction method indicates the possibility of emotion recognition. This study mainly focused on feature extraction and classification techniques that could be used for EEG signal processing. Our results show higher accuracy for KNN than for SVM across all selected subjects. Future work will look at using a dynamic approach for recognizing emotion in a real-time automated system.

ACKNOWLEDGMENT

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MEST) (No. 2012RIA2A2A03).

REFERENCES

[1] R. W. Picard, E. Vyzas, and J. Healey, "Toward machine emotional intelligence: Analysis of affective physiological state," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, pp. 1175-1191, 2001.
[2] R. Du, R. M. Mehmood, and H.-J. Lee, "Alpha Activity during Emotional Experience Revealed by ERSP," Journal of Internet Technology, vol. 15, pp. 775-782, 2014.
[3] G. Chanel, J. Kronegg, D. Grandjean, and T. Pun, "Emotion assessment: Arousal evaluation using EEG's and peripheral physiological signals," in Multimedia Content Representation, Classification and Security. Springer, 2006, pp. 530-537.
[4] Z. Khalili and M. Moradi, "Emotion detection using brain and peripheral signals," in Biomedical Engineering Conference, 2008. CIBEC 2008. Cairo International, 2008, pp. 1-4.
[5] R. Horlings, D. Datcu, and L. J. Rothkrantz, "Emotion recognition using brain activity," in Proceedings of the 9th International Conference on Computer Systems and Technologies and Workshop for PhD Students in Computing, 2008, p. 6.
[6] A. Savran, K. Ciftci, G. Chanel, J. C. Mota, L. H. Viet, B. Sankur, L. Akarun, A. Caplier, and M. Rombaut, "Emotion Detection in the Loop from Brain Signals and Facial Images," 2006.
[7] P. J. Lang, M. M. Bradley, and B. N. Cuthbert, "International affective picture system (IAPS): Instruction manual and affective ratings," The Center for Research in Psychophysiology, University of Florida, 1999.
[8] M. M. Bradley and P. J. Lang, The International Affective Digitized Sounds (IADS): Stimuli, Instruction Manual and Affective Ratings. NIMH Center for the Study of Emotion and Attention, 1999.
[9] M. M. Bradley and P. J. Lang, "Measuring emotion: the self-assessment manikin and the semantic differential," Journal of Behavior Therapy and Experimental Psychiatry, vol. 25, pp. 49-59, 1994.
[10] H. Peng, F. Long, and C. Ding, "Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, pp. 1226-1238, 2005.
[11] R.-E. Fan, P.-H. Chen, and C.-J. Lin, "Working set selection using second order information for training support vector machines," The Journal of Machine Learning Research, vol. 6, pp. 1889-1918, 2005.
[12] D. W. Aha, D. Kibler, and M. K. Albert, "Instance-based learning algorithms," Machine Learning, vol. 6, pp. 37-66, 1991.
[13] L. Aftanas, A. Varlamov, S. Pavlov, V. Makhnev, and N. Reva, "Event-related synchronization and desynchronization during affective processing: emergence of valence-related time-dependent hemispheric asymmetries in theta and upper alpha band," International Journal of Neuroscience, vol. 110, pp. 197-219, 2001.
[14] L. Aftanas, A. Varlamov, S. Pavlov, V. Makhnev, and N. Reva, "Affective picture processing: event-related synchronization within individually defined human theta band is modulated by valence dimension," Neuroscience Letters, vol. 303, pp. 115-118, 2001.
[15] L. I. Aftanas, A. A. Varlamov, S. V. Pavlov, V. P. Makhnev, and N. V. Reva, "Time-dependent cortical asymmetries induced by emotional arousal: EEG analysis of event-related synchronization and desynchronization in individually defined frequency bands," International Journal of Psychophysiology, vol. 44, pp. 67-82, 2002.
[16] Y. Ogino, H. Nemoto, K. Inui, S. Saito, R. Kakigi, and F. Goto, "Inner experience of pain: imagination of pain while viewing images showing painful events forms subjective pain representation in human brain," Cerebral Cortex, vol. 17, pp. 1139-1146, 2007.
[17] A. Delorme and S. Makeig, "EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis," Journal of Neuroscience Methods, vol. 134, pp. 9-21, 2004.
[18] B. Hjorth, "EEG analysis based on time domain properties," Electroencephalography and Clinical Neurophysiology, vol. 29, pp. 306-310, 1970.
[19] R. M. Mehmood and H. J. Lee, "Exploration of Prominent Frequency Wave in EEG Signals from Brain Sensors Network," International Journal of Distributed Sensor Networks, 2015.
[20] M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I. H. Witten, "The WEKA data mining software: an update," ACM SIGKDD Explorations Newsletter, vol. 11, pp. 10-18, 2009.
