Project Report
ON
Face Micro-Emotions Detection
Submitted in partial fulfillment of the requirements for the award of
Bachelor of Technology
in
Computer Science & Engineering
(Artificial Intelligence)
Submitted by
Adrika Sachan (2200521520003)
Anshuman Mishra (2200521520008)
Unnati Tripathi (2200520310061) (Branch Change)
December 2024
DECLARATION
We hereby declare that this submission is our own work and that,
to the best of our knowledge and belief, it contains no material
previously published or written by another person nor material
which to a substantial extent has been accepted for the award of
any other degree or diploma of the university or other
institute of higher learning, except where due
acknowledgment has been made in the text.
TABLE OF CONTENTS
1. Abstract
2. Introduction
o Background
o Objectives
3. Literature Review
4. Problem
5. Methodology
o Materials and Methods
o Project Design
6. Results
o Discussion
o Future Scope
7. Conclusion
o Summary of Findings
o Limitations
8. References
____________________________________________________
1. ABSTRACT
This report documents the development and implementation of a
web application for analyzing facial expressions from video streams.
The system extracts information such as the number of people,
their estimated ages, genders, and emotional states, and emails
the analyzed data to a specified address. Leveraging modern
web technologies and pre-trained models, the project highlights
the power of real-time facial expression analysis for
diverse applications in education, healthcare, and business.
2. INTRODUCTION
Background
3. LITERATURE REVIEW
1. Existing Technologies:
o Libraries like TensorFlow.js and OpenCV.js enable on-the-fly facial
recognition and emotion detection in browsers.
2. Challenges Identified:
o Browser-based machine learning is constrained by computational
resources.
o Ensuring the accuracy of real-time processing in diverse environments
remains an open challenge.
4. PROBLEM
5. METHODOLOGY
The application runs entirely in the browser. TensorFlow.js and the face-api.js library (the @vladmandic/face-api build) are loaded from a CDN:
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
<script src="https://cdn.jsdelivr.net/npm/@vladmandic/face-api"></script>
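Before inference can run, the pre-trained weights for detection, landmarks, age/gender, and expressions must be loaded. The snippet below is a minimal sketch; the '/models' path is our assumption about where the weight files are hosted alongside the page, not something fixed by the library.
<script>
  // Minimal sketch: load the pre-trained face-api.js weights once at startup.
  // '/models' is an assumed hosting path for the weight files.
  async function loadModels() {
    await Promise.all([
      faceapi.nets.tinyFaceDetector.loadFromUri('/models'),   // face detection
      faceapi.nets.faceLandmark68Net.loadFromUri('/models'),  // 68-point facial landmarks
      faceapi.nets.ageGenderNet.loadFromUri('/models'),       // age and gender estimation
      faceapi.nets.faceExpressionNet.loadFromUri('/models'),  // emotion probabilities
    ]);
  }
</script>
With the models in place, the analysis pipeline consists of the following stages (a code sketch of stages 1-3 follows the list):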
1. Video Input: Capturing live video streams using the device’s camera.
2. Preprocessing: Detecting faces and extracting facial landmarks.
3. Analysis: Using pre-trained models to infer age, gender, and emotional
states.
4. Reporting: Automating the generation and email delivery of analysis
results.
5. Deployment: Hosting the application on Vercel for easy access.
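As a concrete illustration of stages 1-3, the sketch below captures the camera feed and runs the pre-trained detectors on the video element. It assumes a <video> element with id "video" and polls roughly once per second; the element id and the polling interval are illustrative choices, not requirements of the library.
<video id="video" autoplay muted playsinline></video>
<script>
  const video = document.getElementById('video');

  // Stage 1: capture a live stream from the device camera.
  async function startCamera() {
    const stream = await navigator.mediaDevices.getUserMedia({ video: true });
    video.srcObject = stream;
  }

  // Stages 2-3: detect all faces, then infer age, gender and emotion per face.
  async function analyseFrame() {
    const detections = await faceapi
      .detectAllFaces(video, new faceapi.TinyFaceDetectorOptions())
      .withFaceLandmarks()
      .withFaceExpressions()
      .withAgeAndGender();

    return detections.map(d => ({
      age: Math.round(d.age),
      gender: d.gender,
      // report the emotion with the highest predicted probability
      emotion: Object.entries(d.expressions).sort((a, b) => b[1] - a[1])[0][0],
    }));
  }

  // Illustrative driver: analyse the stream about once per second.
  startCamera().then(() => setInterval(async () => {
    const results = await analyseFrame();
    console.log(`People detected: ${results.length}`, results);
  }, 1000));
</script>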
6. RESULTS
Results:
Discussion:
Future Scope:
1. Continuous Video Streams
Our platform currently works with images and short videos. We are exploring methods to analyze micro-expressions in continuous video streams.
2. Web APIs
The current platform delivers results via email or messages (a hypothetical sketch of this hand-off follows this list). We are working on web APIs that will let users receive continuous results from video streams directly on their own platforms.
3. Accuracy
Our foremost goal is to improve the accuracy of the platform and to train the models on a larger set of micro-expressions.
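How the current email hand-off might look in code (a hypothetical sketch: the '/api/send-report' endpoint and its payload shape are our own illustration of a small serverless function on the hosting platform, not an API provided by face-api.js or TensorFlow.js):
<script>
  // Hypothetical sketch: POST the analysed results to a serverless endpoint
  // that forwards them to the recipient by email. Endpoint and payload are illustrative.
  async function emailReport(results, recipient) {
    await fetch('/api/send-report', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ to: recipient, summary: results }),
    });
  }
</script>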
7. CONCLUSION
Summary of Findings:
• The web application effectively analyzes video streams and automates reporting.
• Leveraging pre-trained models ensures high accuracy with minimal
resource usage.
Limitations:
• Performance may vary in low-light or high-noise scenarios.
• Further optimization is needed for better cross-platform compatibility.
8. REFERENCES
• Ekman, P., & Friesen, W. V. (1978). Facial Action Coding System (FACS).
• OpenCV Documentation.
• TensorFlow.js Documentation.
• face-api.js Library Documentation.
• Wu, Y., et al. (2021). Real-time facial emotion recognition using lightweight neural networks.