Language Android Application For People With Deaf Disabilities
A Thesis Presented to
The Faculty of The College of Computer Studies
Pamantasan ng Lungsod ng Pasig
______________________________________
In Partial Fulfillment
of the Requirements for the Degree of
Bachelor of Science in Information Technology
______________________________________
by
Balbaboco, Dionisio B.
Merin, John Carlo A.
Taal, Redick Jake D.
October, 2021
Chapter I
A. Introduction
Deafness is a disability that impairs hearing and makes a person unable to hear, but deaf people can still do most other things. The only thing that separates deaf people from hearing people is communication, which is a fundamental part of every human being. For people who are mute or hearing impaired, communication is a big challenge. To communicate with and understand them, one has to learn their language through sign language, hand signals, or gestures. Several projects have already been developed for hearing-impaired persons, using different software to create systems that can help them. If there were an easy way for hearing people and deaf people to communicate, deaf people could live more easily alongside everyone else, and the primary communication tool for many deaf and hard-of-hearing people is sign language. Sign languages are the native languages of the Deaf community and provide full access to communication, both with hearing people and among themselves. Hearing people tend to ignore the importance of sign language unless they have loved ones or relatives who are deaf. One solution for communicating with deaf people is to use the services of a sign language interpreter, but an interpreter can be costly. Therefore, the group wanted to find a way for deaf people to communicate and express their feelings, which led to the decision to create an application. This application aims to help hearing people learn sign language and to enhance the ability of deaf people to use it.
The application will give importance to people with deaf disability, showing that they are also part of the community. It will also benefit people with no disability who want to learn sign language.
B. Objectives of the Study
The research aims to develop a mobile application that uses illustrations and pictures to guide both deaf and hearing users in learning sign language.
In line with this, the project aims to achieve the following specific objectives:
To identify images accurately and provide a sign language guide and translation with the use of the application.
To apply the cognitive learning approach in the Language Android application for people with deaf disabilities so that it is useful. To create a virtual machine through an emulator, one of the features of an Android IDE, for testing the application.
To create features such as letters, numbers, words, colors, text to speech, and random quizzes (a brief sketch of the quiz feature appears after this list).
To test the learning assessment so that users can communicate without the help of an assistant.
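The sketch referenced in the feature objective above is shown here. It is a minimal, hypothetical illustration in plain Java of how one multiple-choice item of the random quiz feature might be modeled and drawn at random; the class names (QuizItem, RandomQuiz) and the sign image names are assumptions for illustration only and are not taken from the actual application.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;
    import java.util.Random;

    // Hypothetical model for one multiple-choice question in the random quiz.
    class QuizItem {
        final String signImage;      // name of the sign photo or illustration to display
        final String correctAnswer;  // the letter or word that the sign represents
        final List<String> choices;  // shuffled answer choices shown to the learner

        QuizItem(String signImage, String correctAnswer, List<String> distractors) {
            this.signImage = signImage;
            this.correctAnswer = correctAnswer;
            this.choices = new ArrayList<>(distractors);
            this.choices.add(correctAnswer);
            Collections.shuffle(this.choices);
        }
    }

    public class RandomQuiz {
        private static final Random RNG = new Random();

        // Builds a tiny alphabet quiz; a full app would also cover numbers,
        // words, and colors as listed in the objectives.
        static List<QuizItem> buildAlphabetQuiz() {
            return Arrays.asList(
                new QuizItem("sign_a", "A", Arrays.asList("B", "C")),
                new QuizItem("sign_b", "B", Arrays.asList("A", "D")),
                new QuizItem("sign_c", "C", Arrays.asList("E", "A")));
        }

        // Picks one question at random, as the "random quizzes" feature suggests.
        static QuizItem nextQuestion(List<QuizItem> quiz) {
            return quiz.get(RNG.nextInt(quiz.size()));
        }

        public static void main(String[] args) {
            QuizItem q = nextQuestion(buildAlphabetQuiz());
            System.out.println("Show image: " + q.signImage + ", choices: " + q.choices);
        }
    }

Running the main method prints the image to display and the shuffled answer choices for one randomly selected question.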
C. Significance of the Study
For deaf people, this application helps boost their confidence and gives them the chance to be regarded as part of the community like any other person. It opens many possibilities for people to learn and adopt sign language communication. For hearing people, the application helps them avoid being ignorant of such signs and educates them about sign language.
In the global context, the study will give importance to deaf people.
In the societal context, the study will give deaf people the confidence to communicate with other people.
In the environmental context, the study will let them feel that they are also part of the community.
D. Scope and Limitation
Scope
This study focuses mainly on the design and development of an Android application, a sign language guide for people with deaf disability, that will help hearing-impaired persons communicate with others. The study covers the following:
It focuses on deaf persons only; it does not tackle other kinds of disabilities.
The application consists only of sign language content and of text that is converted into speech (a brief sketch of this text-to-speech conversion appears at the end of this section).
Limitations
This study is limited to sign language images that are converted into text and to text that is converted into speech. The respondents of the study are deaf persons only; it does not tackle other kinds of disabilities. Converting voice into sign language and using a motion sensor to convert sign language into text are beyond the scope of this study.
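To make the text-to-speech part of this scope concrete, the following is a rough sketch using Android's standard TextToSpeech API. The activity name (SpeakActivity) and the way the speak method is triggered are assumptions for illustration only, not the application's actual code.

    import android.app.Activity;
    import android.os.Bundle;
    import android.speech.tts.TextToSpeech;
    import java.util.Locale;

    // Hypothetical activity sketch: converts typed text into speech
    // using the platform TextToSpeech engine.
    public class SpeakActivity extends Activity implements TextToSpeech.OnInitListener {

        private TextToSpeech tts;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // The engine is initialized asynchronously; onInit reports the result.
            tts = new TextToSpeech(this, this);
        }

        @Override
        public void onInit(int status) {
            if (status == TextToSpeech.SUCCESS) {
                tts.setLanguage(Locale.ENGLISH);
            }
        }

        // Called, for example, from a button's onClick with the typed message.
        void speak(String typedText) {
            tts.speak(typedText, TextToSpeech.QUEUE_FLUSH, null, "utterance-1");
        }

        @Override
        protected void onDestroy() {
            // Release the engine when the screen is closed.
            tts.stop();
            tts.shutdown();
            super.onDestroy();
        }
    }

Initializing the engine in onCreate and releasing it in onDestroy is the usual pattern, since the engine is created asynchronously and holds system resources.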
E. Conceptual Framework
Figure 1 is the conceptual framework, which presents the process of the system development. The study is presented using the three dimensions of the conceptual paradigm: input, process, and output.
The first frame is the Input stage, which involves the user requirements, such as knowledge of hand gestures, text input, and sign language, that are needed in the development of the system. System requirements were also considered, including the hardware requirements and software specifications needed to develop the system. Related literature, related studies, and internet sources were also gathered as inputs.
The second frame is the Process stage. In this study, the proponents adopted the Agile Model, a conceptual model used in project management that describes the stages involved in an application system development project, from an initial feasibility study through maintenance of the completed application. In addition, data gathering is also part of the process of completing the research. The Output stage is the developed application system entitled "Android Application: Sign Language Guide for People with Deaf Disability".
F. Definition of Terms
Deaf disabilities: The condition of the researchers' main target users, whom the mobile application aims to help.
AIDE application: The tool the researchers used to design and build the application.
Sign language: The language that deaf people use to communicate with other people.
Chapter II
Assistive Android Application for Hearing Impaired People Using Sign Language - Joshi, A., et al. "Assistive Android Application for Hearing Impaired People Using Sign Language." Advances in Natural and Applied Sciences, vol. 11, no. 7, May 2017, pp. 166+. Hearing-impaired people rarely used mobile phones before the introduction of SMS/MMS. Now texting allows both deaf and hearing people to communicate with each other. Mobile video chat may one day replace texting but is not yet suitable for hearing-impaired callers. Sign language is the primary means of communication in the deaf community. The problem arises when deaf people try to express themselves to other people with the help of sign language grammar. The proposed system provides a learning as well as an interactive application. It enables both hearing and deaf people to learn sign language and also provides communication between them. The proposed system has several advantages but does not act as a translator; hence, in future work the application can be developed into one that consists of both the existing features and a translator.
Cognitive learning theory focuses on learners. Cognitive theories view students as active in “an internal learning process that involves
memory, thinking, reflection, abstraction, motivation, and meta-cognition” (Ally, 2008).
Students organize old knowledge, scripts, and schema, find relationships, and link new
information to old (Cognitive Theories of Learning, n.d.). Ertmer and Newby (1993) note that
“learning is a change in the state of knowledge, and is a mental activity where an active learner
internally codes and structures knowledge” (p. 58). They believe that “the real focus of the
cognitive approach is on changing the learner by encouraging him/her to use appropriate learning strategies”. In a related study by Mertens and Rabiu (1992), a group of teachers participated. The researchers used a pre-post design to measure the learning and attitude changes
of the teachers. The results of the analysis of pretests and posttests of cognitive outcomes
indicated that significant learning occurred as a result of the computer-based instruction. Also,
the majority of the students reacted positively to the quality of the lessons. The results also
suggest that applying an appropriate theoretical framework to the design of instruction offers an
avenue for meaningfully addressing the appropriate use of technology. (Mertens, D. M., &
Rabiu, J. (1992). Combining Cognitive Learning Theory and Computer Assisted Instruction for Deaf Learners. http://www.jstor.org/stable/44400994).
Android Studio is the official integrated development environment for Android software and incorporates IntelliJ IDEA's code editing and developer tools. Hagos (2019) provides a quick reference that demonstrates the usage of the Android Studio IDE to build an Android mobile app step by step. You won't find any technical jargon, bloated samples, drawn-out history lessons, or witty stories in the book; what you will find is a reference that is concise, to the point, and highly accessible. (Hagos, Ted. (2019). Android Studio IDE Quick Reference: A Pocket Guide to Android Studio Development.)
The application can interpret whether the addressee wants to use the text-to-speech function or the speech-to-text function of the application. The addressee can choose between the two features, and they can be used one after another. Every sentence completed by the hearing-impaired person is recorded in the conversation window, which serves as a guide to what they are talking about. Conversations can be saved and viewed for future reference.
Figure 2.3. Speech to Text & Text to Speech
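As a hedged sketch of how the speech-to-text function and the conversation window described above could be wired together on Android, the snippet below launches the platform speech recognizer and appends each recognized sentence to a simple in-memory log. The activity name, request code, and conversationLog field are illustrative assumptions, not the reviewed application's actual code.

    import android.app.Activity;
    import android.content.Intent;
    import android.speech.RecognizerIntent;
    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical sketch: launches the platform speech recognizer and appends
    // each recognized sentence to a simple conversation log.
    public class ConversationActivity extends Activity {

        private static final int REQUEST_SPEECH = 100;
        private final List<String> conversationLog = new ArrayList<>();

        // Called, for example, when the hearing person taps a microphone button.
        void startSpeechToText() {
            Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
            intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                    RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
            startActivityForResult(intent, REQUEST_SPEECH);
        }

        @Override
        protected void onActivityResult(int requestCode, int resultCode, Intent data) {
            super.onActivityResult(requestCode, resultCode, data);
            if (requestCode == REQUEST_SPEECH && resultCode == RESULT_OK && data != null) {
                ArrayList<String> results =
                        data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
                if (results != null && !results.isEmpty()) {
                    // Keep every recognized sentence so the conversation
                    // can be reviewed (and later saved), as described above.
                    conversationLog.add(results.get(0));
                }
            }
        }
    }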
American Sign Language Translator Application. Firstly, the hand gestures of the signer are always placed in the middle of the image frame so that the whole hand fits in. Application of appropriate scale-invariant techniques eliminates the need to normalize the size of the hand gesture. The system framework has been successfully implemented on smartphone platforms, and experimental results show that it is able to recognize and translate 16 different American Sign Language gestures with an overall accuracy of 97.13% (Pansare, 2015). The app works in two parts: text to sign and a dictionary (of sorts). If the user is trying to turn an English sentence into an ASL sentence, the app can translate up to 50 words at once. The user types in what they want to say, and ASL Translator provides a series of videos and pictures of the correct signs. However, the app states that it “generates sentences in ‘English word order’” and “improves the translation [of ‘signed exact English’] with our Smart Translation Algorithm.” More than 30,000 words and 1,400 idioms are programmed into the app.
Figure 2.4. Demonstration of the American Sign Language Translator Application
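The text-to-sign part of such an application can be pictured as a dictionary lookup from English words to stored sign videos or pictures. The following plain-Java sketch illustrates that idea under stated assumptions; the SignDictionary class and the clip file names are hypothetical and are not taken from the ASL Translator app.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch of a word-to-sign lookup table.
    public class SignDictionary {

        private final Map<String, String> wordToSignClip = new HashMap<>();

        public SignDictionary() {
            // In a real app these entries would come from a bundled database
            // of sign videos or pictures.
            wordToSignClip.put("hello", "sign_hello.mp4");
            wordToSignClip.put("thank", "sign_thank.mp4");
            wordToSignClip.put("you", "sign_you.mp4");
        }

        // Translates a short English sentence into an ordered list of sign clips,
        // skipping words that are not in the dictionary.
        public List<String> translate(String sentence) {
            List<String> clips = new ArrayList<>();
            for (String word : sentence.toLowerCase().split("\\s+")) {
                String clip = wordToSignClip.get(word);
                if (clip != null) {
                    clips.add(clip);
                }
            }
            return clips;
        }

        public static void main(String[] args) {
            System.out.println(new SignDictionary().translate("Thank you"));
            // Prints: [sign_thank.mp4, sign_you.mp4]
        }
    }

For the sample sentence "Thank you", the lookup returns the clips for "thank" and "you" in order, mirroring the word-by-word translation described above.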
Another related study developed a mobile application named NEU-CEIT and gathered students' opinions about the mobile learning environment and the educational and sharing structure of the developed application. A total of 27 students participated; they were asked to upload the developed application and examine its content. Following the application, the students were administered an environment evaluation questionnaire. The results showed that mobile applications will support education and increase motivation. This study supports the idea that mobile applications improve academic achievement (Kocakoyun, Şenay & Bicen, Huseyin).
A study of a mobile app-supported learning system compared the effects of two different learning approaches on students' health-related physical fitness (HRPF) achievements, self-efficacy, and system usability. The experimental results showed that a mobile app-supported learning approach could improve the students' HRPF achievements. Furthermore, the study found that self-efficacy and system operations affect the students' HRPF achievements. (Cheng & Chen. (2018). Developing a Mobile APP-Supported Learning System for Evaluating Health-Related Physical Fitness.)
Chapter III