TABLE I
Statistical Analysis Results of PSD Features (SEED Dataset) Across 58 Electrode Channels in the Five Frequency Bands. The Red Color Highlights the Results With Statistical Significance (p ≤ 0.05)
Fig. 3. The positions of the six selected sets of electrode channels. (The orange circles in each subgraph represent the selected channels.) (a) Set 1: 58 electrode channels. (b) Set 2: 55 electrode channels. (c) Set 3: 48 electrode channels. (d) Set 4: 37 electrode channels. (e) Set 5: 26 electrode channels. (f) Set 6: 9 electrode channels.
C. Activation Status of Brain Regions in Different Emotions for Different Subjects

To further investigate the relationship between the nine selected channels and human emotions, topo maps of the PSD features across the three emotions from different subjects are created, which can demonstrate the power distribution of EEG [41], [42]. The PSD features from the same subject in the same emotional state are averaged over time. Thus, for each subject, three sets of emotion data (positive, neutral, and negative) are obtained for each frequency band (i.e., delta, theta, alpha, beta, and gamma). The topo maps of each emotion are then visualized and compared across different subjects. The corresponding results of the PSD features from three evenly selected subjects are shown in Fig. 4.

Fig. 4. PSD feature topo maps of subjects in the five frequency bands for different emotions. The subjects are evenly selected.
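As a rough illustration of this procedure, the following minimal sketch computes segment-wise Welch PSDs, averages them over time, and draws one topo map per frequency band with MNE-Python [39]. The band edges, sampling rate, montage, and array layout are illustrative assumptions rather than the exact pipeline used here.

import numpy as np
import matplotlib.pyplot as plt
import mne
from mne.time_frequency import psd_array_welch

# Assumed SEED-style band edges (Hz); the exact edges may differ from the paper's.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def emotion_topomaps(segments, ch_names, sfreq=200.0):
    # segments: (n_segments, n_channels, n_times) EEG of one subject
    # recorded in one emotional state.
    psds, freqs = psd_array_welch(segments, sfreq=sfreq, fmin=1.0, fmax=50.0)
    mean_psd = psds.mean(axis=0)  # average over time segments -> (n_channels, n_freqs)
    info = mne.create_info(list(ch_names), sfreq, ch_types="eeg")
    info.set_montage("standard_1020", on_missing="ignore")
    fig, axes = plt.subplots(1, len(BANDS), figsize=(15, 3))
    for ax, (band, (lo, hi)) in zip(axes, BANDS.items()):
        sel = (freqs >= lo) & (freqs < hi)
        # Band power per channel, drawn as a scalp topography.
        mne.viz.plot_topomap(mean_psd[:, sel].mean(axis=1), info, axes=ax, show=False)
        ax.set_title(band)
    plt.show()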
The illustrated results show that: (1) the channels activated by human emotions are mainly distributed in the temporal lobe regions, especially the left temporal lobe. As shown in Fig. 3(f), the nine EEG channels selected in this study are mainly distributed in the temporal lobe regions, which explains why the nine selected channels can effectively help recognize human emotions with even higher accuracies. (2) Large individual differences can be observed across subjects in each emotion. Therefore, normalization across subjects would diminish the sensitivity of features in recognizing emotions for a specific subject. Thus, within-subject normalization methods are needed to improve subject-independent emotion recognition performance.

D. Batch Normalization

Previous studies have reported that there exist individual differences in human behavior and physiological responses [32]. Fig. 5 illustrates the PSD feature topo maps from different experimental sessions of the same subject, which shows the existence of individual differences in SEED. To reduce the impact of individual differences on emotion recognition, an experiment-level batch normalization (BN) method is proposed. Specifically, the features in each frequency band of the nine selected channels are normalized within each experimental session.
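The exact normalization formula is not reproduced in this excerpt; as a minimal sketch of the idea, the snippet below z-scores each channel-band feature using statistics computed only within one experimental session, so that session-specific offsets are removed before data are pooled across subjects and sessions. The z-score form and the array layout are assumptions for illustration.

import numpy as np

def experiment_level_bn(features, eps=1e-8):
    # features: (n_samples, n_features) feature matrix of ONE experiment
    # (one subject, one session); columns are (channel, band) PSD/DE features.
    mu = features.mean(axis=0, keepdims=True)
    sigma = features.std(axis=0, keepdims=True)
    # Standardize with this experiment's own statistics only.
    return (features - mu) / (sigma + eps)

# Usage: normalize each experiment separately, then pool for training, e.g.
# pooled = np.vstack([experiment_level_bn(x) for x in experiments])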
TABLE II
Statistical Analysis Results of PSD Features (SEED IV Dataset) Across the Nine Selected Channels in the Five Frequency Bands. The Red Color Highlights the Results With Statistical Significance (p ≤ 0.05)
TABLE III
Emotion Recognition Results Across Individuals Based on Different Sets of Selected Electrode Channels. The Mean Accuracy Is the Average of All the Classifiers. The Best Accuracy of Each Classifier When Using PSD or DE Features Is Highlighted in Bold Red, the Second-Best in Bold Blue, and the Third-Best in Bold
TABLE IV
The Recognition Results of the PSD and DE Features From the Five Frequency Bands in 9 Electrode Channels, With 80% of the Feature Data in Each Experiment From 15 Subjects as the Training Set and the Remaining 20% as the Testing Set
Using features with the proposed BN for emotion recognition achieves obviously better accuracies than using features without BN, no matter which classifier is used. When comparing the performances of the different classifiers, SVM and LR are superior to the other classifiers, which is consistent with the results shown in Tables III, IV, and V.
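For reference, a minimal sketch of this evaluation protocol is given below: SVM and LR classifiers trained on 80% of the feature samples and tested on the remaining 20%, matching the split described for Tables IV-VI. The placeholder arrays, kernel, and hyperparameters are assumptions, not the settings used in the paper.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def evaluate_classifiers(X, y, seed=0):
    # X: (n_samples, n_features) BN-normalized PSD/DE features;
    # y: (n_samples,) emotion labels (e.g., 0=negative, 1=neutral, 2=positive).
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=seed)
    for name, clf in [("SVM", SVC(kernel="rbf", C=1.0)),
                      ("LR", LogisticRegression(max_iter=1000))]:
        clf.fit(X_tr, y_tr)
        print(f"{name} test accuracy: {clf.score(X_te, y_te):.4f}")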
TABLE V
Emotion Recognition Results Across Individuals Based on DE Features (With BN) From the Selected 9 Electrode Channels. The Best Accuracy of Each Classifier Is Highlighted in Bold Red, the Second-Best in Bold Blue, and the Third-Best in Bold
All the above results are obtained from each individual separately. To further examine the effectiveness of our proposed method on emotion recognition, the data from all 15 subjects are combined, with 80% of the combined samples used for training and 20% for testing. Almost all the recognition accuracies when using the proposed BN are greater than the accuracies without it (see Table VI). The greatest recognition accuracy is achieved with the SVM and LR classifiers when using DE features with the proposed BN. This highest number is 11.85% and 16.30% greater than the corresponding numbers without BN for the SVM and LR classifiers, respectively. These results indicate that our proposed BN can also perform well when using the combined data from all the subjects.

In addition, as shown in Table V, the recognition accuracies of SVM and LR when using 80% of the data for training are 77.85% and 77.89%, respectively. The corresponding numbers in Table VI are about 12% higher. This is probably because the general characteristics of a person's emotional EEG responses can be learned from the data of that person in the training set. Therefore, the emotion recognition models can better recognize the emotion samples from the same person in the test set.

IV. DISCUSSION

A. The Relationship Between the Nine Selected Channels and Human Emotion

Fig. 3 and Fig. 4 show that the nine selected channels are mainly in the temporal lobe regions, indicating their connection with emotion. Similarly, Liu et al. [50] found that the temporal lobes were sensitive to emotional activities in the human brain, and [51] and [52] reported similar findings. An earlier study, based on anatomical connectivity experiments, suggested that the temporal region affected visceral emotional responses to evocative stimuli [53], indicating that the temporal region where the nine selected channels are located is closely related to human emotional responses. Furthermore, other researchers proposed that perception-emotion linkages could be stored in the temporal lobe, which would react when emotions were perceived or imagined [54]. They speculated that the dorsal portions of the temporal pole were responsible for coupling visceral emotional responses with representations of complex auditory stimuli, while the ventral portions coupled visceral emotional responses to complex visual stimuli. This may explain why the nine selected channels in the temporal region are sensitive to human emotions elicited by film clips (i.e., auditory and visual stimuli). However, the researchers in [50] and [51] also reported that the frontal lobe was sensitive to human emotions as well. A previous study also proposed that patients with frontal lobe brain damage might change their emotional behavior [55]. But our results in Table I show that the PSD features from the channels in the frontal lobe are not statistically significant among the examined emotions. This may be because the PSD feature, as a single index, cannot totally reflect all the changes of the brain under emotional stimuli. This indicates that the mechanism of brain EEG responses to human emotions should be further investigated [56].

B. The Potential of Using Only Nine Channels for Emotion Recognition

The results in Table VI show that the recognition accuracy can reach 89.63% when using only the nine selected channels. The corresponding numbers in [57] and [31] are 82.87% and 83.99% when using all 62 electrode channels, respectively.
TABLE VI
Emotion Recognition Results Across Individuals Based on PSD Features (With BN) From the Selected 9 Electrode Channels. The Best Accuracy of Each Classifier Is Highlighted in Bold Red, the Second-Best in Bold Blue, and the Third-Best in Bold
TABLE VII
Statistical Analysis Results of PSD Features in the Five Frequency Bands After Applying BN. The Red Color Highlights the Results With Statistical Significance (p ≤ 0.05)
Based on the EEG signals from the 62 channels in SEED, the authors of [58] developed a single-task DNN (deep neural network), a multi-task DNN, and an adversarial DNN for emotion recognition, and the accuracies were 59.05%, 62.15%, and 75.31%, respectively. The accuracy when using a convolutional neural network (CNN) was 84.35% in [59]. When using a linear formulation of the DE feature extractor and a bidirectional long short-term memory (BiLSTM) network, the method proposed in [60] achieved a recognition accuracy of 80.64%. These results show that using only the nine channels selected in this study can achieve recognition results competitive with those of previous studies using all 62 channels.

Previous studies also used subsets of EEG channels for emotion recognition based on SEED. For example, 16 electrode channels were selected to classify human emotions using SVM in [61], and the reported maximal recognition accuracy was 74.06%. Researchers in [62] applied 13 electrode channels and achieved a recognition accuracy of 70.10%. When using features from 12 selected channels for emotion recognition in [31], the accuracy reached up to 86.65%. Compared with these previous studies, our proposed method based on only nine electrode channels has the best performance, with an accuracy of 89.63%, superior to the accuracies based on other subsets of EEG channels.

C. The Effect of Experiment-Level BN on Feature Sensitivity to Human Emotions

To further investigate how the experiment-level BN contributes to the recognition accuracy improvement, significance analysis is applied to examine the differences in the PSD features, with and without the experiment-level BN, in the five frequency bands across channels. Comparing the results in Tables I and VII, the number of channels with statistically significant differences is increased when involving the experiment-level BN, and the number of
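This excerpt does not name the statistical test used; as a stand-in sketch, the snippet below applies a per-channel one-way ANOVA across the three emotion classes and flags channels at the p ≤ 0.05 threshold reported in Tables I and VII.

import numpy as np
from scipy.stats import f_oneway

def significant_channels(band_psd, labels, alpha=0.05):
    # band_psd: (n_samples, n_channels) PSD features of one frequency band;
    # labels: (n_samples,) emotion class of each sample (0/1/2).
    groups = [band_psd[labels == c] for c in np.unique(labels)]
    significant = []
    for ch in range(band_psd.shape[1]):
        _, p = f_oneway(*[g[:, ch] for g in groups])
        if p <= alpha:  # this channel discriminates the emotions in this band
            significant.append(ch)
    return significant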
applications. (5) The numbers of participants in the SEED and SEED IV datasets are limited. More experimental data should be collected in future studies.

V. CONCLUSION

Six sets of EEG electrode channels are selected from the source channels based on the public SEED dataset. The best set, with only nine channels mainly from the temporal lobes, is then determined based upon the emotion recognition accuracies. An experiment-level BN method is developed to reduce the effect of individual differences so as to improve feature sensitivity for emotion recognition. Our results show that the recognition performance achieved based on the experiment-level BN features from the nine selected channels is better than the results when using the signals from all the source channels. Our proposed method has the potential to facilitate the deployment of emotion recognition applications based on cost-effective devices with fewer EEG channels.

REFERENCES

[1] A. Toisoul, J. Kossaifi, A. Bulat, G. Tzimiropoulos, and M. Pantic, "Estimation of continuous valence and arousal levels from faces in naturalistic conditions," Nature Mach. Intell., vol. 3, no. 1, pp. 42–50, Jan. 2021.
[2] B. Ko, "A brief review of facial emotion recognition based on visual information," Sensors, vol. 18, no. 2, p. 401, Jan. 2018.
[3] D. Y. Liliana, "Emotion recognition from facial expression using deep convolutional neural network," J. Phys., Conf., vol. 1193, Apr. 2019, Art. no. 012004.
[4] N. Mehendale, "Facial emotion recognition using convolutional neural networks (FERC)," Social Netw. Appl. Sci., vol. 2, no. 3, pp. 1–8, Mar. 2020.
[5] M. Mohammadpour, H. Khaliliardali, S. M. R. Hashemi, and M. M. AlyanNezhadi, "Facial emotion recognition using deep convolutional networks," in Proc. IEEE 4th Int. Conf. Knowl. Eng. Innov. (KBEI), Dec. 2017, pp. 17–21.
[6] G. Li, Y. Yang, X. Qu, D. Cao, and K. Li, "A deep learning based image enhancement approach for autonomous driving at night," Knowl. Syst., vol. 213, Feb. 2021, Art. no. 106617.
[7] M. S. Özerdem and H. Polat, "Emotion recognition based on EEG features in movie clips with channel selection," Brain Inf., vol. 4, no. 4, pp. 241–252, 2017.
[8] J. X. Chen, P. W. Zhang, Z. J. Mao, Y. F. Huang, D. M. Jiang, and Y. N. Zhang, "Accurate EEG-based emotion recognition on combined features using deep convolutional neural networks," IEEE Access, vol. 7, pp. 44317–44328, 2019.
[9] K. Giannakaki, G. Giannakakis, C. Farmaki, and V. Sakkalis, "Emotional state recognition using advanced machine learning techniques on EEG data," in Proc. IEEE 30th Int. Symp. Comput. Med. Syst. (CBMS), Jun. 2017, pp. 337–342.
[10] R. K. Jeevan, V. M. R. S.P., P. S. Kumar, and M. Srivikas, "EEG-based emotion recognition using LSTM-RNN machine learning algorithm," in Proc. 1st Int. Conf. Innov. Inf. Commun. Technol. (ICIICT), Apr. 2019, pp. 1–4.
[11] C. Qing, R. Qiao, X. Xu, and Y. Cheng, "Interpretable emotion recognition using EEG signals," IEEE Access, vol. 7, pp. 94160–94170, 2019.
[12] M. Z. Soroush, K. Maghooli, S. K. Setarehdan, and A. M. Nasrabadi, "Emotion classification through nonlinear EEG analysis using machine learning methods," Int. Clin. Neurosci. J., vol. 5, no. 4, pp. 135–149, Dec. 2018.
[13] A. Hassouneh, A. M. Mutawa, and M. Murugappan, "Development of a real-time emotion recognition system using facial expressions and EEG based on machine learning and deep neural network methods," Informat. Med. Unlocked, vol. 20, 2020, Art. no. 100372.
[14] V. Doma and M. Pirouz, "A comparative analysis of machine learning methods for emotion recognition using EEG and peripheral physiological signals," J. Big Data, vol. 7, no. 1, pp. 1–21, Dec. 2020.
[15] O. Bazgir, Z. Mohammadi, and S. A. H. Habibi, "Emotion recognition with machine learning using EEG signals," in Proc. 25th Nat. 3rd Int. Iranian Conf. Biomed. Eng. (ICBME), Nov. 2018, pp. 1–5.
[16] H. Dabas, C. Sethi, C. Dua, M. Dalawat, and D. Sethia, "Emotion classification using EEG signals," in Proc. 2nd Int. Conf. Comput. Sci. Artif. Intell. (CSAI), 2018, pp. 380–384.
[17] J. Li, S. Qiu, Y.-Y. Shen, C.-L. Liu, and H. He, "Multisource transfer learning for cross-subject EEG emotion recognition," IEEE Trans. Cybern., vol. 50, no. 7, pp. 3281–3293, Jul. 2020.
[18] T. Song, W. Zheng, P. Song, and Z. Cui, "EEG emotion recognition using dynamical graph convolutional neural networks," IEEE Trans. Affect. Comput., vol. 11, no. 3, pp. 532–541, Jul./Sep. 2020.
[19] Y. Yin, X. Zheng, B. Hu, Y. Zhang, and X. Cui, "EEG emotion recognition using fusion model of graph convolutional neural networks and LSTM," Appl. Soft Comput., vol. 100, Mar. 2021, Art. no. 106954.
[20] D. Maheshwari, S. K. Ghosh, R. K. Tripathy, M. Sharma, and U. R. Acharya, "Automated accurate emotion recognition system using rhythm-specific deep convolutional neural network technique with multi-channel EEG signals," Comput. Biol. Med., vol. 134, Jul. 2021, Art. no. 104428.
[21] Y. Li, W. Zheng, L. Wang, Y. Zong, and Z. Cui, "From regional to global brain: A novel hierarchical spatial-temporal neural network model for EEG emotion recognition," IEEE Trans. Affect. Comput., early access, Jun. 14, 2019, doi: 10.1109/TAFFC.2019.2922912.
[22] Z. Liang, S. Oba, and S. Ishii, "An unsupervised EEG decoding system for human emotion recognition," Neural Netw., vol. 116, pp. 257–268, Aug. 2019.
[23] W. Wang, Y. Peng, and W. Kong, "EEG-based emotion recognition via joint domain adaptation and semi-supervised RVFL network," in Proc. Int. Conf. Intell. Automat. Soft Comput. Cham, Switzerland: Springer, pp. 413–422.
[24] S. Liu, X. Wang, L. Zhao, J. Zhao, Q. Xin, and S.-H. Wang, "Subject-independent emotion recognition of EEG signals based on dynamic empirical convolutional neural network," IEEE/ACM Trans. Comput. Biol. Bioinf., vol. 18, no. 5, pp. 1710–1721, Sep. 2021.
[25] A. Topic and M. Russo, "Emotion recognition based on EEG feature maps through deep learning network," Eng. Sci. Technol., Int. J., vol. 24, no. 6, pp. 1442–1454, Dec. 2021.
[26] H. Cui, A. Liu, X. Zhang, X. Chen, K. Wang, and X. Chen, "EEG-based emotion recognition using an end-to-end regional-asymmetric convolutional neural network," Knowl. Syst., vol. 205, Oct. 2020, Art. no. 106243.
[27] M. R. Islam et al., "Emotion recognition from EEG signal focusing on deep learning and shallow learning techniques," IEEE Access, vol. 9, pp. 94601–94624, 2021.
[28] Y. Li, W. Zheng, L. Wang, Y. Zong, and Z. Cui, "From regional to global brain: A novel hierarchical spatial-temporal neural network model for EEG emotion recognition," IEEE Trans. Affect. Comput., early access, Jun. 14, 2019, doi: 10.1109/TAFFC.2019.2922912.
[29] Z. Wang, S. Hu, and H. Song, "Channel selection method for EEG emotion recognition using normalized mutual information," IEEE Access, vol. 7, pp. 143303–143311, 2019.
[30] G. Brown, A. Pocock, M.-J. Zhao, and M. Luján, "Conditional likelihood maximisation: A unifying framework for information theoretic feature selection," J. Mach. Learn. Res., vol. 13, no. 1, pp. 27–66, Jan. 2012.
[31] W.-L. Zheng and B.-L. Lu, "Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks," IEEE Trans. Auton. Mental Develop., vol. 7, no. 3, pp. 162–175, Sep. 2015.
[32] J. Li, S. Qiu, C. Du, Y. Wang, and H. He, "Domain adaptation for EEG emotion recognition based on latent representation similarity," IEEE Trans. Cognit. Develop. Syst., vol. 12, no. 2, pp. 344–353, Jun. 2020.
[33] M. Li, H. Xu, X. Liu, and S. Lu, "Emotion recognition from multi-channel EEG signals using K-nearest neighbor classification," Technol. Health Care, vol. 26, no. S1, pp. 509–519, Jul. 2018.
[34] L. Yao, M. Wang, Y. Lu, H. Li, and X. Zhang, "EEG-based emotion recognition by exploiting fused network entropy measures of complex networks across subjects," Entropy, vol. 23, no. 8, p. 984, Jul. 2021.
[35] C. Tan, M. Šarlija, and N. Kasabov, "NeuroSense: Short-term emotion recognition and understanding based on spiking neural network modelling of spatio-temporal EEG patterns," Neurocomputing, vol. 434, pp. 137–148, Apr. 2021.
[36] R. Ning, C. L. P. Chen, and T. Zhang, "Cross-subject EEG emotion recognition using domain adaptive few-shot learning networks," in Proc. IEEE Int. Conf. Bioinf. Biomed. (BIBM), Dec. 2021, pp. 1468–1472.
[37] Y. Lu, M. Wang, W. Wu, Y. Han, Q. Zhang, and S. Chen, "Dynamic entropy-based pattern learning to identify emotions from EEG signals across individuals," Measurement, vol. 150, Jan. 2020, Art. no. 107003.
[38] P. Philippot, "Inducing and assessing differentiated emotion-feeling states in the laboratory," Cognition Emotion, vol. 7, no. 2, pp. 171–193, Mar. 1993.
[39] A. Gramfort et al., "MEG and EEG data analysis with MNE-Python," Frontiers Neurosci., vol. 7, p. 267, Dec. 2013.
[40] G. F. González, G. Žarić, J. Tijms, M. Bonte, L. Blomert, and M. W. van der Molen, "Brain-potential analysis of visual word recognition in dyslexics and typically reading children," Frontiers Hum. Neurosci., vol. 8, p. 474, Jun. 2014.
[41] O. Dressler, G. Schneider, G. Stockmanns, and E. F. Kochs, "Awareness and the EEG power spectrum: Analysis of frequencies," Brit. J. Anaesthesia, vol. 93, no. 6, pp. 806–809, Dec. 2004.
[42] S. A. Unde and R. Shriram, "PSD based coherence analysis of EEG signals for stroop task," Int. J. Comput. Appl., vol. 95, no. 16, pp. 1–5, Jun. 2014.
[43] J. W. Gibbs, Elementary Principles in Statistical Mechanics: Developed With Especial Reference to the Rational Foundation of Thermodynamics. New Haven, CT, USA: Yale Univ. Press, 1914.
[44] R.-N. Duan, J.-Y. Zhu, and B.-L. Lu, "Differential entropy feature for EEG-based emotion classification," in Proc. 6th Int. IEEE/EMBS Conf. Neural Eng. (NER), Nov. 2013, pp. 81–84.
[45] K. Yang, L. Tong, J. Shu, N. Zhuang, B. Yan, and Y. Zeng, "High gamma band EEG closely related to emotion: Evidence from functional network," Frontiers Hum. Neurosci., vol. 14, p. 89, Mar. 2020.
[46] W.-L. Zheng, W. Liu, Y. Lu, B.-L. Lu, and A. Cichocki, "EmotionMeter: A multimodal framework for recognizing human emotions," IEEE Trans. Cybern., vol. 49, no. 3, pp. 1110–1122, Mar. 2019.
[47] F. Pereira, T. Mitchell, and M. Botvinick, "Machine learning classifiers and fMRI: A tutorial overview," NeuroImage, vol. 45, no. 1, pp. S199–S209, Mar. 2009.
[48] M. M. R. Mamun, O. Sharif, and M. M. Hoque, "Classification of textual sentiment using ensemble technique," Social Netw. Comput. Sci., vol. 3, no. 1, p. 49, Jan. 2022.
[49] C. A. Ul Hassan, M. S. Khan, and M. A. Shah, "Comparison of machine learning algorithms in data classification," in Proc. 24th Int. Conf. Autom. Comput. (ICAC), Sep. 2018, pp. 1–6.
[50] X. Liu et al., "Emotion recognition and dynamic functional connectivity analysis based on EEG," IEEE Access, vol. 7, pp. 143293–143302, 2019.
[51] H. Jiang, Z. Wang, X. Gui, and G. Yang, "Correlation study of emotional brain areas induced by video," in Proc. Int. Conf. Testbeds Res. Infrastruct. Cham, Switzerland: Springer, 2019, pp. 199–212.
[52] P. A. Kragel and K. S. LaBar, "Decoding the nature of emotion in the brain," Trends Cogn. Sci., vol. 20, no. 6, pp. 444–455, 2016.
[53] H. Kondo, K. S. Saleem, and J. L. Price, "Differential connections of the temporal pole with the orbital and medial prefrontal networks in macaque monkeys," J. Comparative Neurol., vol. 465, no. 4, pp. 499–523, Oct. 2003.
[54] I. R. Olson, A. Plotzker, and Y. Ezzyat, "The enigmatic temporal pole: A review of findings on social and emotional processing," Brain, vol. 130, no. 7, pp. 1718–1731, May 2007.
[55] E. T. Rolls, J. Hornak, D. Wade, and J. McGrath, "Emotion-related learning in patients with social and emotional changes associated with frontal lobe damage," J. Neurol., Neurosurgery Psychiatry, vol. 57, no. 12, pp. 1518–1524, Dec. 1994.
[56] J. Zhang, S. Zhao, W. Huang, and S. Hu, "Brain effective connectivity analysis from EEG for positive and negative emotion," in Proc. Int. Conf. Neural Inf. Process. Cham, Switzerland: Springer, 2017, pp. 851–857.
[57] Z. Wang, R. Jiao, and H. Jiang, "Emotion recognition using WT-SVM in human-computer interaction," J. New Media, vol. 2, no. 3, pp. 121–130, 2020.
[58] S. Hwang, M. Ki, K. Hong, and H. Byun, "Subject-independent EEG-based emotion recognition using adversarial learning," in Proc. 8th Int. Winter Conf. Brain-Comput. Interface (BCI), Feb. 2020, pp. 1–4.
[59] Z. Wang, Y. Tong, and X. Heng, "Phase-locking value based graph convolutional neural networks for emotion recognition," IEEE Access, vol. 7, pp. 93711–93722, 2019.
[60] V. M. Joshi and R. B. Ghongade, "EEG based emotion detection using fourth order spectral moment and deep learning," Biomed. Signal Process. Control, vol. 68, Jul. 2021, Art. no. 102755.
[61] R. Nivedha, M. Brinda, D. Vasanth, M. Anvitha, and K. V. Suma, "EEG based emotion recognition using SVM and PSO," in Proc. Int. Conf. Intell. Comput., Instrum. Control Technol. (ICICICT), Jul. 2017, pp. 1597–1600.
[62] L. Tong, J. Zhao, and W. Fu, "Emotion recognition and channel selection based on EEG signal," in Proc. 11th Int. Conf. Intell. Comput. Technol. Autom. (ICICTA), Sep. 2018, pp. 101–105.
[63] A. Al-Nafjan, M. Hosny, Y. Al-Ohali, and A. Al-Wabil, "Review and classification of emotion recognition based on EEG brain-computer interface system research: A systematic review," Appl. Sci., vol. 7, no. 12, p. 1239, Dec. 2017.
[64] G. Li et al., "Influence of traffic congestion on driver behavior in post-congestion driving," Accident Anal. Prevention, vol. 141, Jun. 2020, Art. no. 105508.
[65] X. Hu et al., "SAfeDJ: A crowd-cloud codesign approach to situation-aware music delivery for drivers," ACM Trans. Multimedia Comput., Commun., Appl., vol. 12, no. 1s, pp. 1–24, Oct. 2015.
[66] D. Nemrodov, M. Niemeier, A. Patel, and A. Nestor, "The neural dynamics of facial identity processing: Insights from EEG-based pattern analysis and image reconstruction," Eneuro, vol. 5, no. 1, Jan. 2018, Art. no. e0358-17.
[67] G. Li, W. Yan, S. Li, X. Qu, W. Chu, and D. Cao, "A temporal-spatial deep learning approach for driver distraction detection based on EEG signals," IEEE Trans. Autom. Sci. Eng., early access, Jun. 24, 2021, doi: 10.1109/TASE.2021.3088897.

Guofa Li (Member, IEEE) received the Ph.D. degree in mechanical engineering from Tsinghua University, Beijing, China, in 2016. He is currently an Associate Research Professor with the College of Mechatronics and Control Engineering, Shenzhen University, Guangdong, China. He has published more than 60 papers in his research areas. His research interests include environment perception, driver behavior analysis, human-like decision-making based on artificial intelligence technologies in autonomous vehicles, and intelligent transportation systems. He was a recipient of the Young Elite Scientists Sponsorship Program in China and the Best Paper Awards from the China Association for Science and Technology (CAST) and the Automotive Innovation journal. He serves as an Associate Editor for IEEE SENSORS JOURNAL and a Lead Guest Editor for IEEE Intelligent Transportation Systems Magazine and Automotive Innovation.

Delin Ouyang received the bachelor's degree from Shenzhen University, Shenzhen, China, in 2021, where he is currently pursuing the master's degree in mechanical engineering with the College of Mechatronics and Control Engineering. His research interests include the application of brain–computer interfaces (BCI) and affective computing.

Yufei Yuan received the bachelor's degree from the Hebei University of Technology, China, in 2021. He is currently pursuing the master's degree in mechanical engineering with the College of Mechatronics and Control Engineering, Shenzhen University, Shenzhen, China. His research interests include the application of brain–computer interfaces (BCI) and distraction detection.

Wenbo Li received the B.S., M.Sc., and Ph.D. degrees in automotive engineering from Chongqing University, Chongqing, China, in 2014, 2017, and 2021, respectively. He is currently a Postdoctoral Fellow with Tsinghua University. He was also a visiting Ph.D. student with the Waterloo Cognitive Autonomous Driving (CogDrive) Laboratory, University of Waterloo, Canada, from 2018 to 2020. His research interests include smart cockpits, intelligent vehicles, human emotion, driver emotion detection, affective computing, emotion regulation, human–machine interaction, and brain–computer interfaces.

Xingda Qu received the Ph.D. degree in human factors and ergonomics from Virginia Tech, Blacksburg, VA, USA, in 2008. He is currently a Professor with the Institute of Human Factors and Ergonomics, Shenzhen University, Shenzhen, China. His research interests include transportation safety, occupational safety and health, and human–computer interaction.