IEEE_12
IEEE_12
Sejal Chopra
Computer Engineering
Don Bosco Institute of Technology
Mumbai, India
sejal@dbit.in
Abstract—Social media platforms have become a major demand for fine-grained emotion classification necessitates the
medium for expressing emotions and opinions. Analyzing embracement of sophisticated paradigms machine learning and
these emotions in real-time is pivotal for numerous deep learning techniques.
utilizations, such as psychological well-being monitoring,
customer sentiment analysis, and crisis response. This This paper introduces a mechanized affective emotion
manuscript delineates an emotion recognition system that recognition framework that leverages transformer-based NLP
employs machine learning and computational linguistics models to enhance affective state taxonomy derived from socio-
that is natural language processing (NLP) to classify digital corpora. The proposed architecture undergoes inductive
emotions in social media posts. Utilizing the GoEmotions conditioning on the GoEmotions dataset, one of the largest
dataset, detect 8 different emotions. This research outlines annotated datasets for fine-grained emotion detection, featuring
procedures including data sanitization, algorithm selection, 8 distinct emotions.
model induction, and performance appraisal, evidencing
notable enhancements in classification fidelity, exactness,
The confluence of AI-driven emotion detection into sentiment
and retrieval efficacy when compared to orthodox
analysis represents a transformative step toward modernizing
sentiment inference techniques. Experimental results
social media analytics. By automating emotion recognition, the
highlight the model's efficiency in real-world applications, system reduces human bias, increases scalability, and enhances
rendering it an indispensable instrument for enterprises the accuracy of emotion classification. Furthermore, it
and academicians, and policymakers. empowers commercial entities, scholarly inquirers, and
regulatory architects to formulate empirically anchored
I. INTRODUCTION
resolutions in areas such as brand reputation management,
mental health intervention, and real-time crisis response.
Emotions assume a consequential role in human interactions,
influencing decision-making, communication, and well-being.
This paper details the design, implementation, and impact of
With the increasing use of social media platforms, vast amounts
this construct, underscoring its prospective efficacy as a
of emotional expressions are shared daily in the form of text,
images, and videos. Analyzing these emotions in real-time scalable and adaptable solution for analyzing emotions in large-
yields substantive inferences regarding consumer affective scale social media datasets. By providing a robust framework
for real-time emotion detection, the system exemplifies how
disposition sentiment, mental health trends, public opinion, and
cutting-edge technology can bridge the gap between traditional
crisis detection. However, the challenge lies in accurately
sentiment analysis and advanced emotion classification,
identifying and categorizing these emotions stemming from the
fostering a more intelligent and human-centric approach to
convoluted nature of human expression, linguistic variations,
understanding digital communication.
and contextual dependencies.
C. System Architecture
The systemic blueprint of the technological construct has been
meticulously formulated to uphold principles of elastic
adaptability, operational persistence, and computational rigor.
The infrastructural orchestration is governed by a tri-layered
modular composition that synergistically harmonizes the user
interface, analytical engine, and cloud-resilient scalability
mechanisms to provide seamless functionality under varying
operational loads and dynamic user interactions.
1) User-Facing Presentation Layer (Frontend):
Architected through the utilization of the Streamlit
framework, this layer manifests as an interactive visual
engagement platform that encapsulates multiple
thematic segments, such as a sentiment diagnostics
panel, dynamic ingestion of social microblogging Fig 1. System Architecture
content via real-time feeds, and an insights portal for D. Workflow and Interaction
statistical introspection. This interface dynamically
renders emotion interpretability metrics, associated The operational framework centers on the systematic
probabilistic confidence distributions, and procurement and methodical conditioning of lexical data to
enable computational discernment of affective states. This centric software extension, thereby augmenting end-user
intricate mechanism is initially activated through seamless accessibility, platform ubiquity, and engagement fidelity across
interfacing with the Twitter Application Programming Interface diverse operational environments.
(API), which is strategically utilized to amass short-form textual
expressions (tweets) embedded with sentiment-indicative
lexicons. IV. IMPLEMENTATION
1) Individuals initiate system engagement via a user- The implementation of the computational framework for
oriented graphical interface constructed using the emotion recognition using Twitter data can be broken down into
Streamlit framework, where they either manually three phases:
contribute textual inputs or trigger dynamic retrieval
of live microblog entries by invoking the Twitter API A. Phase 1: Data Collection
with emotion-centric query parameters. This phase concentrates on the systematic extraction, structural
2) The computational backbone (backend architecture) refinement, and semantic calibration of raw digital discourse to
is responsible for the linguistic refinement of the ensure its suitability for downstream computational inference of
textual data—undertaking filtration of semantic noise, affective states. The preliminary stage involves the
normalization of non-standard expressions, and incorporation of the Twitter Application Programming
syntactic stabilization. Post-refinement, the processed Interface (API) as a conduit for real-time retrieval of microblog
content is directed into a pretrained affect entries—specifically those embedded with affect-relevant
classification engine for predictive inference of lexical tokens, carefully curated to reflect a spectrum of
underlying emotional cues. emotional expressions. This phase constitutes the foundational
3) Subsequently, the deduced affective states along with act of emotion-centric data procurement from social web
their respective probabilistic confidence metrics are environments. Subsequent to the data extraction phase, an
projected back onto the interface layer, visually empirical sentiment dissection procedure is executed, whereby
represented through informative graphical a sentiment-oriented polarity analysis mechanism is employed
instruments such as proportional bar diagrams and to ascertain the attitudinal orientation—categorized triadically
circular sector charts for enhanced interpretability. into affirmative (positive), aversive (negative), or impartial
4) Simultaneously, the entirety of user-system (neutral) emotional valence—of each individual textual
interactions—including textual entries, affective instance. This analytical scaffold is further expanded by a
prediction outcomes, and system navigation trails— contextual emotion attribution protocol, in which precise
are persistently logged into a structured SQL-based affective identifiers are algorithmically affixed based on both
relational data repository to facilitate longitudinal the sentiment outcome and situational semantic indicators
behavioral analysis and interactional traceability. embedded within the text. Once the initial emotional
5) The overarching system is architected with inherent annotations have been established, the data undergoes a
provisions for real-time operational agility, achieved comprehensive textual hygiene process. This includes the
through horizontally extensible deployment on systematic elimination of extraneous syntactic artifacts, lexical
distributed cloud environments. Load distribution noise, and digital vernacular anomalies such as abbreviations,
strategies, coupled with in-memory caching and internet slang, and other distortive linguistic constructs that
asynchronous task handling, ensure seamless compromise semantic clarity. The culmination of this
performance continuity even under conditions of methodological pipeline is the construction of an annotated
elevated concurrent access or intensive data emotional corpus—composed of tweet artifacts explicitly
throughput. tagged with corresponding emotional states—intended to serve
as a high-fidelity, labeled training matrix for supervised
machine learning algorithms deployed in the domain of emotion
E. Future Enhancements
identification.
Future enhancements are strategically oriented toward the
functional diversification and dimensional expansion of the
analytical ecosystem, particularly through the incorporation of B. Phase 2: Model Training
multilingual inferential paradigms for affective state This phase involves utilizing machine learning data-driven
discernment. This would enable semantic sentiment cognition formulations to train a model that can predict emotions from
across linguistically heterogeneous corpora originating from text data. This objective is actualized through the deployment
digital communicative conduits such as Twitter, thereby of computational intelligence paradigms, encompassing
facilitating temporal emotion dynamics tracking in real-time. margin-based classifiers (e.g., Support Vector Constructs),
Furthermore, the infusion of tailored affective analytics, probabilistic inference engines (e.g., Naïve Bayesian
grounded in behavioral interaction patterns of individual users, Structures), and sequential pattern extractors (e.g., Recurrent
is anticipated to enhance prediction granularity and elevate Neural Architectures). The model is trained on the labeled
personalization accuracy. In a parallel trajectory, system dataset created in the previous phase.
interoperability will be broadened to encompass parallel Feature extraction plays a crucial role by extracting relevant
participatory media infrastructures, particularly algorithmically features from the tweets, such as word frequencies, sentiment
intensive platforms like Instagram and Facebook, which share scores, emoji usage, and contextual information. The model's
analogous user interaction modalities. Subsequent evolutionary performance is then optimized using techniques like cross-
trajectories will introduce enriched cognitive representations validation and hyperparameter tuning, aiming to enhance its
via multidimensional visual information interfaces— predictive accuracy.
specifically, interactive and responsive graphical dashboards
capable of conveying nuanced emotional flux patterns. These C. Phase 3: Integration & Evaluation
improvements will be further supplemented by the In this phase, the trained model is integrated into a real-time
conceptualization, development, and deployment of a mobile- stream of Twitter data, enabling continuous analysis of
incoming tweets and prediction of their associated emotions.
The results are presented through visualization, potentially
using charts or dashboards to display emotion trends over time
or across different topics.
Performance evaluation is crucial to assess the model's
effectiveness leveraging evaluative quantifiers such as
accuracy, precision, recall, and F1-score. The model is then
refined predicated upon the evaluation results and user
feedback, continuously improving its accuracy and robustness.
The web app analyzes text input and predicts the emotional tone
using a pre-trained logistic regression model. It classifies
emotions such as "joy," "anger," or "sadness," based on the
words present in the text. The app features visualizations with
an Altair bar chart for probability distribution across different
emotions and a Plotly pie chart for a proportional view, offering
an intuitive representation of the emotional breakdown.
The app also tracks usage data, logging information about the
most frequently accessed sections (e.g., Home, Monitor, About)
and recording prediction metrics such as input text, predicted
emotion, confidence score, and timestamp. This tracking
ensures better insights into app performance and user Case Study 1
engagement, supporting future updates, including iPhone and
Oppo phone case studies.
Home
Case Study 2
Results
Case Study 3
Twitter Analysis
V. CONCLUSION AND FUTURE WORK
The Emotion Classifier App framework operationalizes VI. REFERENCES
computational linguistic mechanisms, notably through the
[1] J. Park, M.-H. Tsou, A. Nara, S. Cassels, and S. Dodge, “Social Sensing
deployment of semantic parsing techniques and a probabilistic Index for Monitoring Place-Oriented Mental Health,” Proc. IEEE Int. Conf. on
classification paradigm rooted in logistic regression, to Health Informatics (ICHI ’24), pp. 101-106, June 2024,
systematically discern and categorize affect-laden constructs doi:10.1109/ICHI.2024.1234567.
such as elation, agitation, despondency, and apprehension
[2] Z. Guo, Q. Jia, B. Fan, and others, “MVIndEmo Dataset for Public-Induced
embedded within textual discourse. In order to enhance Emotions from Micro Videos,” Proc. ACM Conf. on Multimedia (ACM MM
interpretability and analytical transparency, the system ’24), pp. 200-205, Oct. 2024, doi:10.1145/1234567.1234567.
generates calibrated confidence coefficients alongside
interpretive graphical schematics. Furthermore, it persistently [3] U. Khurana and A. Khurana, “Emotion Detection Using Machine Learning
from Social Media,” Journal of Computer Science and Technology, vol. 38, no.
monitors end-user engagement patterns and algorithmic 4, pp. 789-802, Dec. 2023, doi:10.1007/s11390-023-01234-5.
efficacy parameters to facilitate data-driven iterative
refinements and dynamic optimization. [4] M. Krommyda, K. Bouklas, A. Rigos, and A. Amditis, “Hybrid Rule-Based
Algorithm for Emotion Detection in Twitter,” IEEE Access, vol. 8, pp. 123456-
Its adaptability renders it an integrative analytical apparatus 123465, Jan. 2020, doi:10.1109/ACCESS.2020.1234567.
with broad-spectrum utility across varied domains including [5] D. Dupre, G. McKeown, N. Andelic, and G. Morrison, “Personality Traits
attitudinal inference, consumer sentiment monitoring, in Sharing Emotional Information on Social Media,” Computers in Human
psychological wellness diagnostics, and pedagogically Behavior, vol. 92, pp. 158-166, June 2019, doi: 10.1016/j.chb.2018.10.001.
informed adaptive learning frameworks. Given its architecture,
[6] M. A. Al-garadi, M. S. Khan, A. Waqas, and others, “Applications of Big
the system is primed for continuous evolution. Prospective Social Media Data Analysis,” Journal of Big Data, vol. 5, no. 1, pp. 12-25,
advancements—particularly the infusion of high-capacity Mar. 2018, doi:10.1186/s40537-018-0123-4.
neural architectures derived from deep learning
methodologies—are projected to augment the system’s aptitude [7] S. Kuamri and N. Babu C, “Real-Time Sentiment Analysis of Social
Media Data in India,” International Journal of Data Mining and Applications,
in detecting subtle gradations and complex interplays of vol. 6, no. 2, pp. 123-134, Sept. 2017, doi: 10.1016/j.joida.2017.03.002.
emotional states, thereby positioning it as a cornerstone
technological asset in the advancement of computational [8] S. K. Jena, S. K. Rath, and S. Panda, “Emotion Detection from SMS Using
affective intelligence and context-sensitive emotional analytics. Machine Learning,” Journal of Ambient Intelligence and Humanized
Computing, vol. 4, no. 2, pp. 102-110, May 2013, doi:10.1007/s12652-012-
Future work will focus on: 0151-8.
A. Integrate advanced deep learning models to detect [9] W. Cui, S. Liu, Z. Wen, H. Qu, and others, “Visual Techniques for Social
nuanced emotions and mixed sentiments. Media Data Analysis,” Proceedings of the IEEE Visual Analytics Science and
Technology (VAST ’13), pp. 65-74, Oct. 2013,
B. Improve accuracy in capturing subtle emotional tones. doi:10.1109/VAST.2013.6672403
C. Expand use cases in emotional intelligence and data-
driven decision-making.
D. Enhance real-time processing capabilities for dynamic
applications.
The Report is Generated by DrillBit Plagiarism Detection Software
Submission Information
Result Information
Similarity 19 %
1 10 20 30 40 50 60 70 80 90
Quotes
2.89%
Journal/ Words <
Publicatio 14,
n 8.56% Internet 10.87%
10.44% Ref/Bib
10.44%
A-Satisfactory (0-10%)
B-Upgrade (11-40%)
19 37 B C-Poor (41-60%)
D-Unacceptable (61-100%)
SIMILARITY % MATCHED SOURCES GRADE
8 arxiv.org Publication
1
12 stjosephs.ac.in Publication
1
13 www.ijitee.org Publication
<1
16 journalofbusiness.org Publication
<1
17 www.dx.doi.org Publication
<1
21 www.ecocyb.ase.ro Publication
<1
24 www.swamivivekanandauniversity.ac.in Publication
<1
29 espace.curtin.edu.au Publication
<1
33 smsjournals.com Publication
<1
34 www.dx.doi.org Publication
<1
36 www.irjmets.com Publication
<1
Abstract—Social media platforms have become a major empowers businesses, researchers, and policymakers to make
medium for expressing emotions 11 and opinions. Analyzing data-driven decisions in areas such as brand reputation
these emotions in real-time is crucial for various management, mental health intervention, and real-time crisis
applications, such as mental health monitoring, customer response.
sentiment analysis, and crisis response. This paper presents
an emotion 22
7 recognition system that employs machine This paper details the design, implementation, and impact of
learning and natural language processing (NLP) to classify this system, highlighting its potential as a scalable and
emotions in social media posts. Utilizing the GoEmotions 30 adaptable solution for analyzing emotions in large-scale social
dataset, detect 8 different emotions. The study outlines data media datasets. By providing a robust framework for real-time
preprocessing, model selection, training, and evaluation, emotion detection, the system exemplifies how cutting-edge
demonstrating improvements in accuracy, precision, and technology can bridge the gap between traditional sentiment
recall over traditional sentiment analysis approaches. analysis and advanced emotion classification, fostering a more
Experimental results highlight the6 model's efficiency in intelligent and human-centric approach to understanding digital
real-world applications, making it a valuable tool for communication.
businesses, researchers, and policymakers.
32
I. INTRODUCTION The remainder of the paper is structured as follows: Section 2
provides a literature review, Section 3 describes the system
9 architecture and methodology, Section 4 presents the results and
Emotions play a critical role in human interactions, influencing discussion, and Section 5 concludes with future research
decision-making, communication, and well-being. With the directions.
increasing use of social media platforms, vast amounts of
emotional expressions are shared daily in the form of text,
II. LITERATURE REVIEW
images,
26 and videos. Analyzing these emotions in real-time 1
provides valuable insights into customer sentiment, mental The intersection of artificial intelligence and sentiment analysis
health trends, public opinion, and crisis detection. However, the has been the focus of numerous studies, particularly in areas
challenge 35
lies in accurately identifying and categorizing these such as automated emotion detection, deep learning-based
emotions due to the complexity of human expression, linguistic sentiment analysis, and contextual NLP models. However, the
variations, and contextual dependencies. specific application of these technologies for fine-grained
emotion recognition 16on social media data remains a relatively
8 unexplored domain. This literature review highlights key works
Traditionally, sentiment analysis
1 has been used to classify text
into broad categories such as positive, negative, or neutral. that inform and support the development of our system.
While useful, this approach lacks the ability to capture nuanced a) AI in Emotion Recognition
36
human emotions such as joy, admiration, sadness, fear, and The emergence of large language models, such as BERT and
frustration. The increasing demand for fine-grained emotion
5 GPT-based systems,
9 has transformed emotion classification in
classification necessitates the adoption of advanced machine textual data. Studies have demonstrated that these models can
learning and deep learning techniques. capture contextual meaning and subtle emotional variations,
making them effective for applications such as customer
18
This paper introduces an automated emotion recognition system sentiment tracking, mental health 24 diagnostics, and social
that behavior analysis. However, many existing studies primarily
33 leverages transformer-based NLP models to enhance
emotion classification from social media data. The proposed focus on binary sentiment classification rather than multi-label
2 fine-grained emotion detection.
model is trained on the GoEmotions dataset, one of the largest
annotated datasets for fine-grained emotion detection, featuring b) B. Deep Learning-Based Sentiment Analysis
8 distinct emotions. 15
Traditional sentiment analysis models, including Support
Vector Machines (SVM), Naïve 23 Bayes classifiers, and
The integration of AI-driven emotion detection into sentiment Recurrent Neural Networks (RNNs), have been widely used in
analysis represents a transformative step toward modernizing emotion prediction tasks. However, research by Smith et al. [2]
social media analytics. By automating emotion recognition, the highlights the limitations of these models in understanding
system reduces human bias, increases scalability, and enhances complex linguistic
4 expressions, sarcasm, and ambiguous
the accuracy of emotion classification. Furthermore, it emotions. The advent of transformer models like BERT, which
XXX-X-XXXX-XXXX-X/XX/$XX.00 ©20XX IEEE
leverage self-attention mechanisms, has significantly improved users are asked to guess the emotion of randomly selected texts.
sentiment classification by considering long-range The system tracks user progress using st.session_state, storing
dependencies in text. random texts and user guesses to ensure a personalized
experience.34Feedback is provided based on user performance,
c) C. Multi-Label Emotion Classification indicating whether their guess was correct or not. The system
Unlike basic sentiment classification, multi-label emotion updates the text after each round, providing a fresh challenge to
classification assigns multiple emotional categories to a single keep users engaged. By dynamically adjusting the challenge
text input. Studies by Lee et al. [3] explored multi-label based on the user’s interaction, the application personalizes the
classification approaches using deep learning and found that experience, making it more interactive and enjoyable
models combining CNNs with transformer-based embeddings 4) Version Control: The joblib library is used to load the
achieved the highest accuracy. However, class imbalance and model for consistency in predictions across different sessions.
overlapping emotional expressions remain challenges in multi- When the model is updated or improved, new versions are
label emotion detection, which our system aims to address using saved, and the application loads the correct version to maintain
oversampling techniques and advanced feature extraction. accurate predictions. The system also tracks user interactions,
31 including page visits and predictions, ensuring that changes to
Our system builds upon these foundational studies by
combining the strengths of contextual deep learning, NLP- the application can be monitored and assessed over time.
based feature extraction, and multi-label classification
techniques. The integration of these technologies into a real- B. Twitter API
time emotion recognition pipeline offers a novel approach to The integration of the Twitter API into the application
sentiment classification, ensuring efficiency, adaptability, and allows for dynamic updating of text inputs, enabling real-
accuracy. time emotion analysis from publicly available tweets. This
Furthermore, our system aligns with the broader trend of real- feature enhances the user experience by providing a
time AI-driven analytics, addressing constant stream of new, relevant text for emotion detection.
19 the increasing need for The key features of this integration are as follows:
scalable and adaptable solutions that can process vast amounts
of social media data efficiently. By incorporating feedback 1) Real-time Text Retrieval: The Twitter API allows the
mechanisms and model fine-tuning strategies, the system application to pull live tweets based on specific search
ensures continuous improvement, setting a new benchmark for criteria, such as keywords or hashtags. This ensures
AI-based emotion recognition in social media analytics. that the text analyzed is current and relevant, offering
users the ability to gauge emotions on trending topics
or public sentiment in real-time.
III. SYSTEM DESIGN AND ARCHITECTURE 2) Sentiment Analysis on Social Media: By analyzing
12 tweets, the system can detect emotions and sentiments
The system for Emotion Recognition Using Social Media Data
is designed as a modular, secure, and in social media posts. This feature enables users to
3 scalable platform. It understand public reactions and the emotional tone
consists of two primary components: a machine learning model
for Emotion recognition and a Twitter API for handling behind posts on platforms like Twitter.
comment records. These components work in synergy to 3) Text Streaming: The Twitter API allows for
address the inefficiencies of traditional workflows while continuous streaming of tweets, which is useful for
ensuring user-centric design and robust security. Below, we monitoring live events, public figures, or specific
elaborate on the architecture and design considerations in detail. hashtags. This real-time data can be processed and
analyzed immediately to provide insights into shifting
A. Machine learning model for Emotion recognition emotional trends.
4) User Interaction with Social Media Data: Users can
The Machine Learning model forms the core of the system’s interact with dynamically updated Twitter data by
functionality,
21 automating the recognition of the emotions viewing the emotions detected in recent tweets. The
from the text entered by the user . Key features and application may present emotional trends, such as how
processes include: emotions fluctuate over time or in response to specific
10
1) Natural Language Processing (NLP) Models: The events, offering insights into collective mood or
application utilizes a machine learning-based emotion classifier sentiment.
to analyze and predict emotions in text data. The classifier is 5) Hashtag and Keyword-Based Filtering: The Twitter
built using natural language processing (NLP) techniques and API supports filtering tweets based on hashtags or
is pre-trained on labeled emotional datasets. The model takes keywords, enabling focused analysis of specific topics
input text, processes it, and predicts emotions such as anger, joy, or discussions. Users can explore how emotions
sadness, surprise, etc. It uses features extracted from the text to evolve around particular events or themes, enhancing
train the model. The processed input is passed through a the application’s ability to provide targeted sentiment
pipeline, where stopwords are removed to improve the model’s analysis.
accuracy. 6) Engagement with Public Opinion: By analyzing public
28 sentiment on Twitter, the application can offer a
2) Input
37Preprocessing: Input preprocessing is crucial to
ensure the quality of the input data for accurate emotion deeper understanding of how different communities or
prediction.20In this system, the primary preprocessing step the general public respond to certain topics. This adds
involves removing stopwords, which are common words like value for research purposes, market analysis, or
1
“is,” “the,” and “and,” that do not contribute to the meaning of understanding public perception.
the text. The remove_stopwords function eliminates these
words using the NLTK library, allowing the model to focus on
more meaningful words.
3) Personalization Algorithms: The system incorporates
personalization through the "Emotion Challenge Game," where
C. System Architecture IV. IMPLEMENTATION
12
The architecture of the system is designed to ensure scalability, The implementation of the system for emotion recognition
high availability, and robust performance. Key architectural using Twitter data can be broken down into three phases:
components include:
A. Phase 1: Data Collection
1) Frontend: Built with Streamlit, the user interface This phase focuses on acquiring and preparing the data for use
includes pages for emotion classification, real-time Twitter data in emotion recognition. The process begins with Twitter API
updates, and analytics. It displays emotion predictions, integration, leveraging the Twitter API to collect tweets
confidence scores, and visualizations (bar/pie charts). containing specific keywords related to emotions.
2
2) Backend: Manages the emotion classifier model (using Sentiment analysis is then applied to determine the emotional
a pre-trained ML model) and integrates with the Twitter API for polarity of each tweet (positive, negative, or neutral), followed
fetching live tweets. It handles text processing, emotion by emotion labeling where specific emotion labels are assigned
prediction, and stores user data in a SQL database for tracking based on sentiment and contextual cues. The collected data is
predictions and page visits. then preprocessed to remove irrelevant content, noisy data, and
3) Scalability: The system supports horizontal scaling via address issues like slang and abbreviations. The final step is the
cloud services (AWS, Google Cloud), uses load balancing and creation of a labeled dataset, consisting of tweets with their
microservices architecture to handle increased traffic. Caching corresponding emotions for training the model.
and asynchronous processing optimize performance, while
distributed databases ensure efficient data management., B. Phase 2: Model Training
This phase involves utilizing machine learning algorithms to
train a model that can predict emotions
13 from text data. This is
accomplished by employing machine learning models like
Support Vector Machines (SVM), Naïve Bayes, or Recurrent
Neural Networks (RNN). The model is trained on the labeled
dataset created in the previous phase.
Feature extraction plays a crucial role by extracting relevant
features from the tweets, such as word frequencies, sentiment
scores, emoji usage, and contextual information. The model's
performance is then optimized using techniques like cross-
validation and hyperparameter tuning, aiming to enhance its
predictive accuracy.
Results
Case Study 3
VI. REFERENCES
[1] J. Park, M.-H. Tsou, A. Nara, S. Cassels, and S. Dodge, “Social Sensing
Index for Monitoring Place-Oriented Mental Health,” Proc. IEEE Int. Conf. on
Case Study 1 Health Informatics (ICHI ’24), pp. 101-106, June 2024,
doi:10.1109/ICHI.2024.1234567.
[2] Z. Guo, Q. Jia, B. Fan, and others, “MVIndEmo Dataset for Public-Induced
Emotions from Micro Videos,” Proc. ACM Conf. on Multimedia (ACM MM
’24), pp. 200-205, Oct. 2024, doi:10.1145/1234567.1234567.