Review On Sensors For Emotion Recognition
Abstract Emotions play an important role in our everyday lives. They matter not only in routine human activity but also in decision making, and they shape our view of the world around us. Individual feelings are often dismissed at first as meaningless, yet a slight change in emotion can produce a large change in behavior. Nowadays, emotion detection with the assistance of physiological signals is an active area of research. This paper is based on a wide-ranging review of biosignal-based emotion recognition. Many methodologies have been proposed in various papers to infer human emotional states automatically. Physiological signals such as galvanic skin response (GSR), electrocardiogram (ECG), electroencephalogram (EEG), electromyogram (EMG), photoplethysmogram (PPG), respiration, and skin temperature are often used. In this article, on the basis of these physiological signals, the researchers present a thorough analysis of emotion detection and suggest a workflow for classifying multiple emotional states from several physiological signals, with the goal of improving precision and performance.
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2021
K. A. Reddy et al. (eds.), Data Engineering and Communication Technology,
Lecture Notes on Data Engineering and Communications Technologies 63,
https://doi.org/10.1007/978-981-16-0081-4_57
572 S. Dutta et al.
1 Introduction
2 Emotion Recognition
2.1 Electroencephalography
When neurons fire, small voltage changes occur. Although the EEG is clearly not an exact measurement, it still gives significant insight into the electrical activity of the cortex. Recorded EEG patterns are characterized by their frequency and amplitude. The frequency range is typically 1–80 Hz (divided into the alpha, beta, theta, and delta bands), with amplitudes of 10–100 µV [5].

Alpha (α) (8–16 Hz) is mostly associated with relaxation and creativity. Alpha waves generally appear during wakeful relaxation with the eyes closed; this is the state in which the brain and mind are at rest [6, 7]. Beta (β) (16–32 Hz) is mostly associated with vigilance and attention. These waves are produced when a person is in an alert or anxious state; in this state the brain can readily analyze and organize information, produce solutions, and generate new ideas [6, 7]. Theta (θ) (4–8 Hz) is for the most part associated with deep relaxation and meditation. Adults mainly generate these waves during light sleep or dreaming, and the band is closely related to the release of tension and to memory recall [6, 7]. Finally, delta (δ) (0.5–4 Hz) is associated with a deep sense of empathy and intuition; delta waves transmitted in the waking state indicate a potential for subconscious behavior to be accessed [6, 7]. The identification of human emotions using a neural network is described in [8, 9].
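In practice, these bands are used by estimating per-band spectral power. The following sketch (ours, not from the cited works; it assumes NumPy, and the function and variable names are illustrative) computes the mean power in each band quoted above for a simulated "relaxed, eyes closed" trace dominated by a 10 Hz alpha rhythm, so the alpha band comes out strongest.

```python
import numpy as np

def band_power(signal, fs, bands):
    """Mean power of `signal` within each named frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in bands.items()}

# Band limits as quoted in the text above.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 16), "beta": (16, 32)}

fs = 256                                   # Hz, a typical EEG sampling rate
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
# Simulated relaxed-state trace: dominant 10 Hz alpha rhythm plus noise.
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))

powers = band_power(eeg, fs, BANDS)
dominant = max(powers, key=powers.get)
print(dominant)  # alpha
```

A real pipeline would window the recording and use a smoothed spectral estimate (e.g., Welch's method) rather than a single raw FFT.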
2.2 Electrocardiography
The heart is one of the most vital organs in the body, and electrocardiography (ECG) is an excellent diagnostic tool that is often used to assess the operating state of the heart. As a biosignal, ECG is the usual technique for non-invasive, continuous recording of the electrical activity of the heart [10]. Since cardiac activity is connected with the central nervous system, ECG is valuable not only for analyzing the heart's action; it can also be used for emotion recognition [11]. A few simple parameters are commonly used to analyze ECG signals [12]: the P wave (an effect of atrial contraction), the PR interval (measured from the beginning of the P wave to the beginning of the Q wave), the QRS complex (measured from the beginning of the Q wave to the end of the S wave), and the QT/QTc interval (measured from the beginning of the Q wave to the end of the T wave). Generally, these parameters are investigated for clinical purposes, to characterize irregular heart activity and to quantify its deviation. In the majority of studies, the QRS complex is used for the identification of emotions, since it reflects cardiac activation relevant to the human emotional state and is an effective marker for perceiving key feelings. However, because this indicator responds weakly to certain specific emotions, emotion detection from it faces additional difficulties. Study findings by Cai et al. [13] suggest that sadness can be interpreted more easily and unambiguously than happiness. Several of the ECG-related studies have measured and classified QRS amplitudes as well as the durations between the component waves, and another set of studies examines QT/QTc dispersion [14]. This provides evidence that this interval is related to the degree of stress and can be used as a marker of extreme frustration. The key drawback of the 12-lead ECG is that it produces immense volumes of data, particularly when recorded for long hours [15]. When used for automated emotion detection, an ECG system requires sophisticated signal-processing techniques that locate and extract the appropriate features from the raw signal. Because ECG signal analysis is complex in practical applications, ECG is frequently combined with other sensors for emotion detection [3].
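The QRS-based analysis described above ultimately rests on locating R peaks. A deliberately minimal sketch (ours, not from [12] or [13]) finds R peaks in a toy ECG-like trace by thresholding local maxima with a 200 ms refractory period, then derives the mean heart rate from the RR intervals; real detectors such as Pan-Tompkins add filtering and adaptive thresholds.

```python
import numpy as np

def detect_r_peaks(ecg, fs, thresh_ratio=0.6):
    """Naive R-peak detector: local maxima above a fraction of the global
    maximum, separated by at least a 200 ms refractory period."""
    thresh = thresh_ratio * ecg.max()
    refractory = int(0.2 * fs)
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        if (ecg[i] > thresh and ecg[i] >= ecg[i - 1]
                and ecg[i] > ecg[i + 1] and i - last >= refractory):
            peaks.append(i)
            last = i
    return np.array(peaks)

fs = 250  # Hz
t = np.arange(0, 10, 1.0 / fs)
# Toy ECG: a sharp Gaussian "R wave" every 0.8 s (75 bpm) on a flat baseline.
ecg = sum(np.exp(-((t - r) ** 2) / (2 * 0.01 ** 2))
          for r in np.arange(0.4, 10, 0.8))

peaks = detect_r_peaks(ecg, fs)
rr = np.diff(peaks) / fs             # RR intervals in seconds
print(round(60 / rr.mean()))         # mean heart rate: 75 bpm
```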
2.3 Galvanic Skin Response

The GSR works on the basis of skin conductance: it is a measurement of certain electrical parameters of the skin that are not under conscious human control [16]. According to the traditional theory, it depends on sweating, which reflects changes in the sympathetic nervous system (SNS) [17]. If there is an emotional change, noticeable sweat appears on the palms, fingers, and soles. Sweat varies the salt content of the skin, which in turn changes the skin's resistance [18]; in that case, the conductance increases. Thus, when an environmental change causes an adjustment in our mood or emotions, the sweat glands, more specifically the eccrine glands, increase their activity. Diverse kinds of feelings can cause higher arousal and, with it, a rise in skin conductance [19–21]. The sensor has two electrodes that are placed on the fingers; it sends the data through the network to a coordinator that forwards it to a computer system [16].
3 Related Work
In [22], the authors provided a review of different physiological signals for emotion recognition. In [23], the researchers proposed a model for multisubject emotion classification. The novelty of that work is the extraction of high-level features with the help of a deep learning model: a convolutional neural network (CNN) was used to automatically abstract the correlation information present between multiple channels and thereby construct more abstract, discriminative features, namely high-level features. The average accuracy over 32 subjects was 87.27%. The authors of [24] used DWT features with varying window widths (1–60 s), computing the entropy of the detail coefficients corresponding to the alpha, beta, and gamma bands. With SVM classification, the arousal classification accuracy reached 65.33% for window lengths of 3–10 s, while the valence classification accuracy reached 65.13% for window lengths of 3–12 s. There are various experiments and analyses of human emotions based on biosignals, and most methods collect data through multiple physiological signals. The study in [25] classifies evoked emotions on the basis of two kinds of short-term physiological signals, ECG and GSR, and also records and analyzes the estimated recognition time. First, the authors performed an experiment, with the help of experts, to elicit target emotions including anger, fear, happiness, grief, and calmness; the experiment recorded ECG and GSR signals. They applied a wavelet transform to process the truncated ECG data and a Butterworth filter to process the truncated GSR signal. Finally, with the help of an artificial neural network (ANN) (Fig. 1), they classified five different emotion types. The average classification accuracy achieved in the experiment was 89.14% for ECG data and 82.29% for GSR data, and the total time required for feature extraction and emotion classification did not exceed 0.15 s for either signal.
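The windowed entropy features of [24] can be sketched in a simplified form. Instead of wavelet detail coefficients, this stand-in example (an assumption of ours, not the authors' method) takes the Shannon entropy of a window's normalized power spectrum, which likewise distinguishes ordered, rhythm-dominated windows from disordered ones.

```python
import numpy as np

def spectral_entropy(window):
    """Shannon entropy (bits) of the window's normalized power spectrum."""
    psd = np.abs(np.fft.rfft(window)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]                           # drop empty bins before the log
    return float(-(p * np.log2(p)).sum())

fs = 128
t = np.arange(0, 3, 1.0 / fs)              # a 3 s window, within the 3-10 s range in [24]
pure_tone = np.sin(2 * np.pi * 10 * t)     # highly ordered -> low entropy
rng = np.random.default_rng(0)
noise = rng.standard_normal(len(t))        # disordered -> high entropy
print(spectral_entropy(pure_tone) < spectral_entropy(noise))  # True
```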
It can be seen from this related work that emotion recognition has become a useful technique for evaluating different emotional states and predicting behavior. In the development of various human-machine interaction systems, emotion recognition and sentiment evaluation have played a pivotal role in recent years. Although it is a known fact that particular emotions produce particular bodily reactions, some ambiguity still exists in the methods used to analyze emotions. These problems of uncertainty are addressed by several of the research outcomes discussed in this section. Most systems classify only specific emotional states, since the studies are limited. Classification accuracies calculated and compared across different biosignal-based emotion recognition systems may not be reliable, as the comparisons are made between different methodologies using different datasets involving a variety of biosignals. It is observed that combining the maximum number of physiological signals, such as ECG, GSR, PPG, and EEG, should give significant results in the arena of emotion recognition.
4 Proposed Work
In this work, we reviewed several papers on techniques for analyzing and predicting a user's emotions from physiological signals collected by various non-invasive sensors. We came across several limitations in the research works discussed in this paper. To address them, we propose a workflow model for identifying and analyzing different emotions from various physiological signals and for deriving a best-fit algorithm for an automatic emotional-state detection system. To analyze a person's emotion, we first need to implement emotion analysis based on several physiological signals.
The major steps in identifying emotions through physiological signals are: (1) preprocessing, (2) feature extraction, and (3) regression. Figure 2 represents the flow of events in the workflow model. Here, the inputs are the physiological signals gathered from sensors such as EEG, ECG, PPG, EMG, and GSR in an automated way, based on several human emotion recognition techniques. In the proposed method, we first preprocess the training dataset to handle any missing values or other errors. Once the signals are extracted into training datasets, they are validated using the data validation model shown in Fig. 2.
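The missing-value handling mentioned above can be sketched as follows (a minimal, library-free example; the function name and the choice of linear interpolation are ours, not part of the proposed model):

```python
import math

def clean_signal(samples):
    """Replace missing samples (None/NaN) by linear interpolation between
    the nearest valid neighbours; a minimal preprocessing step."""
    vals = [float("nan") if v is None else float(v) for v in samples]
    n = len(vals)
    for i, v in enumerate(vals):
        if math.isnan(v):
            lo = next((j for j in range(i - 1, -1, -1) if not math.isnan(vals[j])), None)
            hi = next((j for j in range(i + 1, n) if not math.isnan(vals[j])), None)
            if lo is not None and hi is not None:
                vals[i] = vals[lo] + (vals[hi] - vals[lo]) * (i - lo) / (hi - lo)
            else:  # missing value at an edge: copy the nearest valid sample
                vals[i] = vals[lo if lo is not None else hi]
    return vals

print(clean_signal([1.0, None, 3.0, None, None, 6.0]))
# [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
```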
Since emotion classification through physiological signals is vital, it can be useful for knowing a person's feeling in a given situation. Several supervised learning algorithms analyze and classify data, and they perform well when classifying human emotion. In this context, we propose a classifier method in our work to identify a person's emotion in a situation. The training and test datasets are classified using a machine learning classifier model that can generate better output in terms of performance, reliability, and accuracy. The feature extraction stage goes further by finding the more descriptive parts of the signal for emotion detection. Once the test datasets are classified, they are analyzed through the proposed data regression model and further verified and validated using a proposed algorithm based on machine learning concepts. Many methods are used to understand and identify the different types of emotion being expressed; however, the final output depends on how accurate the algorithm is. Moreover, another issue must be considered: if the algorithm predicts equal probabilities for several emotions, it becomes difficult to decide which emotion is present. We need to improve the accuracy of our algorithm in order to classify emotions correctly. Thus, the workflow identifies emotions based on physiological signals as discussed in this paper.
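One possible instance of such a supervised classifier is sketched below: a deliberately minimal k-nearest-neighbours vote over labelled feature vectors. The two-dimensional features (mean GSR conductance, mean heart rate) and the emotion labels are hypothetical illustrations, not outputs of the proposed model.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote of its k nearest training points
    under Euclidean distance; a minimal supervised classifier."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical feature vectors: (mean GSR conductance in µS, mean heart rate in bpm),
# labelled with the emotion recorded during acquisition.
train = [
    ((2.0, 65.0), "calm"), ((2.1, 68.0), "calm"), ((1.9, 63.0), "calm"),
    ((3.5, 95.0), "fear"), ((3.6, 98.0), "fear"), ((3.4, 92.0), "fear"),
]
print(knn_predict(train, (3.5, 94.0)))  # fear
```

With ties in the vote (the equal-probability problem noted above), a real system would need a tie-breaking rule or a rejection option rather than an arbitrary choice.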
This workflow model can be applied in various domains, such as healthcare applications, fraud detection, and social media, for example by detecting whether a person is nervous, which expresses his or her fear. When classification is performed on a smaller subset of highly distinguishable emotions, such as anger, happiness, and fear, the accuracy is high. However, accuracy drops when we classify larger subsets, or small subsets of less distinct emotions, such as anger and disgust.
5 Conclusions
Detecting and recognizing human emotion is a big challenge in computer vision and artificial intelligence. Emotions are a large part of human communication, and much of our communication takes place through them. The main aim of this paper is to study various systems that can detect and recognize human emotion from a live feed. Some feelings are universal to all human beings, such as anger, sadness, happiness, surprise, fear, disgust, and a neutral state. The various systems apply different algorithms to human emotions, and the emotions are then verified and recognized using several deep neural network techniques. It has been observed that emotions can be predicted by taking the maximum number of physiological signals, as done by various systems using different hybrid models. Here, the researchers have proposed a workflow for identifying several emotional analyses to find the best-fit algorithm using different physiological signals, making accuracy and performance much better.
References
19. https://www.seeedstudio.com/depot/Grove
20. Dutta S, Dash S, Mitra A (2020) A model of socially connected things for emotion detection. In:
2020 international conference on computer science, engineering and applications (ICCSEA).
IEEE, pp 1–3
21. Dutta S, Dash S, Padhy N (2020) Analysis of human emotions based data using M.I.O.T.
technique, (book chapter in) medical internet of things (M.I.O.T.): recent techniques, practices
and applications. CRC, Taylor and Francis (accepted and under publication)
22. Shu L, Xie J, Yang M, Li Z, Li Z, Liao D, Xu X, Yang X (2018) A review of emotion recognition
using physiological signals. Sensors 18(7):2074
23. Qiao R, Qing C, Zhang T, Xing X, Xu X (2017) A novel deep-learning based framework
for multi-subject emotion recognition. In: 2017 4th international conference on information,
cybernetics and computational social systems (ICCSS). IEEE, pp 181–185
24. Candra H, Yuwono M, Chai R, Handojoseno A, Elamvazuthi I, Nguyen HT, Su S (2015)
Investigation of window size in classification of EEG-emotion signal with wavelet entropy and
support vector machine. In: 2015 37th annual international conference of the IEEE engineering
in medicine and biology society (EMBC). IEEE, pp 7250–7253
25. Zhang S, Liu G, Lai X (2015) Classification of evoked emotions using an artificial neural
network based on single, short-term physiological signals. J Adv Comput Intell Intell Inf
19(1):118–126