
International Journal of Trend in Scientific Research and Development (IJTSRD)

Volume 4 Issue 4, June 2020 Available Online: www.ijtsrd.com e-ISSN: 2456 – 6470

EEG Based Classification of Emotions with CNN and RNN


S. Harshitha¹, Mrs. A. Selvarani²
¹Student, ²Associate Professor
¹,²Electronics and Communication Engineering, Panimalar Institute of Technology, Chennai, Tamil Nadu, India

ABSTRACT
Emotions are biological states associated with the nervous system, especially the brain, brought on by neurophysiological changes. They are variously associated with thoughts, feelings, behavioural responses, and a degree of pleasure or displeasure, and they exist everywhere in daily life. Evaluating human behaviour, which is primarily driven by emotions, is a significant research topic in the development of artificial intelligence. In this paper, deep learning classifiers are applied to the SJTU Emotion EEG Dataset (SEED) to classify human emotions from EEG using Python. The accuracies of the respective classifiers, that is, the performance of emotion classification using a Convolutional Neural Network (CNN) and a Recurrent Neural Network (RNN), are then compared. The experimental results show that the RNN is better than the CNN in solving sequence prediction problems.

KEYWORDS: Emotion classification, SEED, EEG, CNN, RNN, Confusion matrix

How to cite this paper: S. Harshitha | Mrs. A. Selvarani, "EEG Based Classification of Emotions with CNN and RNN", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-4 | Issue-4, June 2020, pp. 1289-1293, URL: www.ijtsrd.com/papers/ijtsrd30374.pdf (paper ID IJTSRD30374)

Copyright © 2020 by author(s) and International Journal of Trend in Scientific Research and Development Journal. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0) (http://creativecommons.org/licenses/by/4.0)
INTRODUCTION
Emotions play a significant role in how we think and behave in daily life. They can compel us to take action and influence the decisions we make about our lives, both large and small. As technology and the understanding of emotions progress, there are growing prospects for automatic emotion recognition systems. Successful breakthroughs already exist in recognising emotions from facial expressions or gestures. As an advanced machine learning technique for emotion classification, a neural network can be regarded as a machine that simulates how the brain performs a specific task. It treats the brain as a complex, nonlinear and parallel computer that can evaluate complex functions depending on various factors. The new direction this paper takes is EEG-based technology for automatic emotion recognition, which is becoming less intrusive and more affordable, opening the way to ubiquitous adoption in healthcare applications. In this paper we focus on classifying user emotions from raw Electroencephalogram (EEG) signals using neural network models and advanced techniques. We particularly explore end-to-end deep learning approaches, CNN and RNN, for emotion classification directly from the SEED dataset. Convolutional neural networks are feed-forward neural networks, generally including a primary pattern extraction layer and a feature mapping layer, and they can learn local patterns in data through an operation called convolution. CNNs [1] have been found effective, and deep neural models are preferable for emotion classification and estimation in a machine-computer interface system that models brain signals for emotions. An RNN, however, remembers information through time. It is useful in time series prediction because it can remember previous inputs very well; this capability is the basis of Long Short-Term Memory (LSTM) networks. Recent developments in machine learning also show that neural networks provide considerable accuracy in a variety of tasks, such as text analysis, image recognition and speech analysis.

Related Works
Automated classification of human emotion using EEG signals has been researched meticulously by various scholars. Related studies have proposed many strategies for emotion classification from EEG signals, ranging from finding physiological patterns of emotions and identifying remarkable features to combinations of those strategies.

In [1], the research provides a concise evaluation of a CNN classifier that was reasonably accurate in identifying the two criteria of pleasure (Valence) and arousal degree (Arousal), with average accuracies of 84.3% and 81.2% respectively; for an individual subject, however, the average classification accuracy was only 63%.

In [2], the paper showed that, for Valence and Arousal, a DNN reaches maximum classification accuracies of 75.78% and 73.28% respectively, while their CNN model achieved classification accuracies of 81.41% and 73.35% respectively.

In [4], the research showed that convolutional networks, in comparison with classic machine learning algorithms, demonstrated better performance in emotion detection from physiological signals, despite being conceived for object recognition in images.

For the classification of emotion, the popular K-Nearest Neighbour algorithm [5] achieved 62.3% overall accuracy using wavelet energy and entropy features. The results showed 78.7±2.6% sensitivity, 82.8±6.3% specificity and 62.3±1.1% accuracy on the DEAP database.

In [6], the paper showed that the Hierarchical Structure-Adaptive RNN (HSA-RNN) for video summarization tasks can adaptively exploit the video structure and generate the summary simultaneously.

In [7], the results showed that the proposed framework improves the accuracy and efficiency of EEG signal classification compared with traditional methods, including support vector machine (SVM), artificial neural network (ANN) and standard CNN, by 74.2%.

In [8], the research showed that TranVGG-19 obtains a good result, with a mean prediction accuracy of 99.175%.

In comparison to SqueezeNet, the Residual Squeeze VGG16 of [9] can be more easily adapted and fully integrated with residual learning for compressing other contemporary deep learning CNN models, since their model was 23.86% faster and 88.4% smaller in size than the original VGG16.

In paper [10], it was shown that the mean feature gives the highest contribution to classification with the Naïve Bayes classifier, with the best classification result reaching 87.5% accuracy for emotion recognition.

Proposed Work
A. Database Agglomeration and Description
The proposed method uses the SJTU Emotion EEG Dataset (SEED), a collection of EEG data provided by the Brain-like Computing and Machine Intelligence (BCMI) laboratory. The dataset contains the subjects' EEG signals recorded while they were watching film clips. The film clips are prudently selected to induce different types of emotion, namely neutral, sad, fear and happy. 15 subjects (7 males and 8 females; mean age 23.27, STD 2.37) watched 15 video clips in the experiments. Each subject repeated the experiment three times over about one week. The down-sampled, pre-processed and segmented versions of the EEG data in Matlab (.mat) format were obtained [2]. The detailed order of the 62 channels is included in the dataset. The EEG cap layout according to the international 10-20 system for 62 channels is shown in figure 1.

Fig.1 EEG cap according to the international 10-20 system for 62 channels

The differences and ratios between the differential entropy (DE) features of 27 pairs of hemispheric asymmetry electrodes give the differential asymmetry (DASM) and rational asymmetry (RASM) features. All the features are further smoothed using the conventional moving average and linear dynamic systems (LDS) approaches.
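As a rough illustration of how the pre-processed recordings and the DE-based asymmetry features described above could be handled, the sketch below loads one subject's .mat file and derives DASM and RASM. The file name, the variable key de_LDS1, the array layout and the electrode-pair indices are assumptions made for illustration; the actual names and shapes follow the SEED release.

```python
# Minimal sketch: load one pre-processed SEED recording and derive asymmetry features.
import numpy as np
from scipy.io import loadmat

mat = loadmat("seed_subject01_session1.mat")   # hypothetical file name
de = np.asarray(mat["de_LDS1"])                # assumed layout: (62 channels, windows, bands)

# Hypothetical subset of the 27 left/right hemispheric electrode pairs;
# the real pairing follows the 62-channel 10-20 layout in figure 1.
left_idx = [0, 3, 5]
right_idx = [2, 4, 7]

dasm = de[left_idx] - de[right_idx]            # differential asymmetry (DASM)
rasm = de[left_idx] / de[right_idx]            # rational asymmetry (RASM)
print(dasm.shape, rasm.shape)
```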

B. Methodology
The project performs emotion recognition from EEG signals to detect four emotions, namely happiness, sadness, neutral and fear, using the SEED dataset. Two different neural models are used as classifiers: a simple Convolutional Neural Network (CNN) and a Recurrent Neural Network (RNN). Both models are augmented with contemporary deep learning techniques such as Dropout, and Rectified Linear Units are used to introduce non-linearity into the models. Both models are implemented using the Keras libraries in Python. The overall pipeline is shown in figure 2.

Fig.2 Proposed Method
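A minimal sketch of what the two Keras classifiers could look like is given below. The paper only specifies Keras, Dropout and ReLU; the layer sizes, kernel widths, input shape (time steps × features), optimizer and loss are illustrative assumptions.

```python
# Illustrative Keras definitions of the two classifiers described above.
from tensorflow.keras import layers, models

n_steps, n_features, n_classes = 200, 310, 4   # hypothetical dimensions

def build_cnn():
    return models.Sequential([
        layers.Conv1D(32, kernel_size=5, activation="relu",
                      input_shape=(n_steps, n_features)),   # pattern extraction layer
        layers.MaxPooling1D(2),                              # feature mapping layer
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dropout(0.5),                                 # Dropout regularisation
        layers.Dense(n_classes, activation="softmax"),
    ])

def build_rnn():
    return models.Sequential([
        layers.SimpleRNN(128, activation="relu",             # ReLU hidden units
                         input_shape=(n_steps, n_features)),
        layers.Dropout(0.5),
        layers.Dense(n_classes, activation="softmax"),
    ])

cnn, rnn = build_cnn(), build_rnn()
for model in (cnn, rnn):
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
```

Either model can then be trained on the normalised SEED features with model.fit, as sketched later in the results section.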

An RNN is a type of neural network which transforms a sequence of inputs into a sequence of outputs. An RNN can learn and detect an event, such as the presence of a particular expression, irrespective of the time at which it occurs in a sequence. Hence, it naturally deals with a variable number of frames.


Fig.3 RNN Architecture


At each time step t, a hidden state h_t is computed from the hidden state at time t-1 and the input x_t at time t:

h_t = σ(W_in x_t + W_rec h_(t-1))

where W_in is the input weight matrix, W_rec is the recurrent weight matrix and σ is the hidden activation function. Each time step also computes an output based on the current hidden state:

y_t = f(W_out h_t)

where W_out is the output weight matrix and f is the output activation function.

We use a simple RNN with Rectified Linear hidden Units (ReLUs). We train the RNN to classify the emotions in the videos agglomerated in the SEED dataset. Then, using the confusion matrix, we measure the performance of each classifier in terms of accuracy, specificity, sensitivity, and precision.
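The recurrence above can be transcribed almost directly into code. The short sketch below uses NumPy, with ReLU as the hidden activation σ (as stated above) and softmax as an assumed choice for the output activation f; the layer sizes and random weights are illustrative only.

```python
# Minimal NumPy transcription of h_t = σ(W_in x_t + W_rec h_(t-1)) and y_t = f(W_out h_t).
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

n_in, n_hidden, n_out = 310, 128, 4               # hypothetical sizes
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.01, size=(n_hidden, n_in))
W_rec = rng.normal(scale=0.01, size=(n_hidden, n_hidden))
W_out = rng.normal(scale=0.01, size=(n_out, n_hidden))

def rnn_forward(x_sequence):
    """Run the simple RNN over a sequence of input vectors."""
    h = np.zeros(n_hidden)
    outputs = []
    for x_t in x_sequence:
        h = relu(W_in @ x_t + W_rec @ h)          # hidden state update
        outputs.append(softmax(W_out @ h))        # per-step output
    return np.array(outputs)

y = rnn_forward(rng.normal(size=(20, n_in)))      # 20 dummy time steps
print(y.shape)                                    # (20, 4)
```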

C. Confusion Matrix Parameters
A confusion matrix is a performance measurement table for machine learning classification where the output can be two or more classes. It is composed of four different combinations of predicted and actual values (figure 4).

Fig.4 Confusion Matrix

True Positive (TP): the prediction is positive and it is correct.
True Negative (TN): the prediction is negative and it is correct.
False Positive (FP) – Type 1 Error: the prediction is positive and it is incorrect.
False Negative (FN) – Type 2 Error: the prediction is negative and it is incorrect.

From these counts, the four performance measures are defined as:

Accuracy = (TP + TN) / (TP + TN + FP + FN)
Specificity = TN / (TN + FP)
Sensitivity = TP / (TP + FN)
Precision = TP / (TP + FP)
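These four measures are straightforward to compute in code. The helper below takes the per-class (one-vs-rest) counts, which is how the per-emotion values in Table 1 would be obtained; the example counts are arbitrary.

```python
# Confusion-matrix parameters for a single class (one-vs-rest counts).
def confusion_metrics(tp, tn, fp, fn):
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    specificity = tn / (tn + fp)
    sensitivity = tp / (tp + fn)     # also called recall
    precision = tp / (tp + fp)
    return accuracy, specificity, sensitivity, precision

# Example with arbitrary counts:
print(confusion_metrics(tp=80, tn=240, fp=15, fn=10))
```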

Experimental results and analysis


The RNN model is applied to construct EEG-based emotion detection for four classes, namely Neutral, Sad, Fear and Happy. The dataset used here is SEED. The 62-channel EEG signals were recorded from 15 subjects while they watched emotional film clips, with a total of 24 trials. This dataset is available with extracted features such as the moving average and PSD of the differential entropy, LDS-smoothed features, differential and rational asymmetries, and caudal differential entropy. The feature data belonging to the alpha and beta frequency bands are normalised with mean values, and these have been used here.
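The normalisation step can be sketched as follows. The exact scheme is not specified in the paper, so the mean-and-range scaling below is one plausible reading of "normalised with mean values", and the feature-matrix shape is hypothetical.

```python
# One plausible reading of the mean normalisation applied to the alpha/beta band features.
import numpy as np

def mean_normalise(x):
    """Centre each feature column on its mean and scale by its range."""
    mu = x.mean(axis=0)
    span = x.max(axis=0) - x.min(axis=0)
    return (x - mu) / np.where(span == 0, 1.0, span)

alpha_beta_features = np.random.rand(1000, 124)   # hypothetical (samples x features) matrix
x_norm = mean_normalise(alpha_beta_features)
```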

Table 1. Confusion matrix parameters for CNN and RNN (all values in %)

                          CNN                                  RNN
Parameter       Neutral    Sad     Fear    Happy     Neutral    Sad     Fear    Happy
Accuracy         95.53    88.13   88.01   87.89       96.50    89.88   90.85   89.61
Specificity      98.56    98.29   57.44   55.52       93.44    98.11   74.18   77.17
Sensitivity      94.54    83.57   96.68   97.20       98.29    84.01   98.14   98.34
Precision        85.50    72.85   83.07   85.08       94.38    81.37   90.43   88.74

With these features of the pre-processed EEG data, the existing CNN and the proposed RNN models are trained, and classification is then performed using the trained models. The experimental results show that the pool of features chosen achieves relatively good classification performance across all the experiments of different subjects. From Table 1 we see that the RNN achieves the highest accuracies of 96.50%, 89.88%, 90.85% and 89.61% for Neutral, Sad, Fear and Happy, while the CNN-based classification gives 95.53%, 88.13%, 88.01% and 87.89% for the respective emotions.

A. CNN
For the CNN, figure 5 shows the exponential behaviour of the accuracy during training and testing over the 20 epochs. Similarly, figure 6 displays the loss values during training and testing, which decrease with each epoch. The confusion matrix shows the prediction results for the four classes (figure 7).

Fig.5 Accuracy result for CNN
Fig.6 Loss result for CNN
Fig.7 Confusion matrix of CNN for the prediction of four emotions

B. RNN
For the RNN, figure 8 shows the exponential behaviour of the accuracy during training and testing over the 20 epochs. Similarly, figure 9 displays the loss values during training and testing, which decrease with each epoch. The confusion matrix shows the prediction results for the four classes (figure 10).

Fig.8 Accuracy result for RNN
Fig.9 Loss result for RNN
Fig.10 Confusion matrix of RNN for the prediction of four emotions
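For completeness, a rough sketch of how the accuracy and loss curves (figures 5, 6, 8 and 9) and the confusion matrices (figures 7 and 10) could be produced is given below. It assumes the cnn and rnn models from the earlier methodology sketch and hypothetical pre-split arrays x_train, y_train, x_test and y_test with one-hot labels; only the 20-epoch setting comes from the text.

```python
# Sketch: train both models for 20 epochs, plot accuracy/loss curves,
# and print a 4x4 confusion matrix per model (Neutral/Sad/Fear/Happy).
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix

for name, model in [("CNN", cnn), ("RNN", rnn)]:
    history = model.fit(x_train, y_train, epochs=20,
                        validation_data=(x_test, y_test), verbose=0)

    plt.figure()
    plt.plot(history.history["accuracy"], label="train")
    plt.plot(history.history["val_accuracy"], label="test")
    plt.title(f"{name} accuracy per epoch")
    plt.legend()

    plt.figure()
    plt.plot(history.history["loss"], label="train")
    plt.plot(history.history["val_loss"], label="test")
    plt.title(f"{name} loss per epoch")
    plt.legend()

    y_pred = np.argmax(model.predict(x_test), axis=1)
    y_true = np.argmax(y_test, axis=1)
    print(name, confusion_matrix(y_true, y_pred))

plt.show()
```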

Hence, this paper shows that the RNN (figure 10) produces better predictions for the four classes than the CNN (figure 7).

Conclusion:
Based on the existing research in the field of emotion recognition, this study explores the effectiveness of the recurrent neural network. The experimental results show that the RNN model achieves higher precision in classifying emotions than the CNN-based approach. The reliability of the classification performance suggests that specific emotional states can be identified from brain activity. The learning by the RNN suggests that neural signatures associated with the Neutral, Sad, Fear and Happy emotions do exist and that they share commonality across individuals. Thus, the RNN provides an appealing framework for propagating information over a sequence using a continuous-valued hidden-layer representation.

References
[1] Guolu Cao, Yuliang Ma, Xiaofei Meng, Yunyuan Gao, Ming Meng, "Emotion Recognition Based on CNN", Proceedings of the 38th Chinese Control Conference, 2019.
[2] Wei-Long Zheng, Jia-Yi Zhu, Bao-Liang Lu, "Identifying Stable Patterns over Time for Emotion Recognition from EEG", IEEE Transactions on Affective Computing, 2017.
[3] Samarth Tripathi, Shrinivas Acharya, Ranti Dev Sharma, "Using Deep and Convolutional Neural Networks for Accurate Emotion Classification on DEAP Dataset", Proceedings of the Twenty-Ninth AAAI Conference on Innovative Applications of Artificial Intelligence (IAAI-17), 2017.
[4] Luz Santamaria-Granados, Mario Munoz-Organero, Gustavo Ramirez-Gonzalez, Enas Abdulhay, N. Arunkumar, "Using Deep Convolutional Neural Network for Emotion Detection on a Physiological Signals Dataset (AMIGOS)", New Trends in Brain Signal Processing and Analysis, IEEE Access (Volume 7), 2018.
[5] Md. Rabiul Islam, Mohiuddin Ahmad, "Wavelet Analysis Based Classification of Emotion from EEG Signal", International Conference on Electrical, Computer and Communication Engineering, 2019.
[6] Bin Zhao, Xuelong Li, Xiaoqiang Lu, "HSA-RNN: Hierarchical Structure-Adaptive RNN for Video Summarization", 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2018.
[7] Gaowei Xu, Xiaoang Shen, Sirui Chen, Yongshuo Zong, Canyang Zhang, Hongyang Yue, Min Liu, Fei Chen, Wenliang Che, "A Deep Transfer Convolutional Neural Network Framework for EEG Signal Classification", IEEE Data-Enabled Intelligence for Digital Health, 2019.
[8] Long Wen, X. Li, Xinyu Li, Liang Gao, "A New Transfer Learning Based on VGG-19 Network for Fault Diagnosis", Proceedings of the 2019 IEEE 23rd International Conference on Computer Supported Cooperative Work in Design, 2019.
[9] Hussam Qassim, Abhishek Verma, David Feinzimer, "Compressed Residual-VGG16 CNN Model for Big Data Places Image Recognition", 2018 IEEE 8th Annual Computing and Communication Workshop and Conference (CCWC), 2018.
[10] Nur Yusuf Oktavia, Adhi Dharma Wibawa, Evi Septiana Pane, Mauridhi Hery Purnomo, "Human Emotion Classification Based on EEG Signals Using Naive Bayes Method", International Seminar on Application for Technology of Information and Communication, 2019.

