WJARR-2022-0565
Department of Computer Science and Engineering, Meenakshi Sundararajan Engineering College, Chennai, India.
Publication history: Received on 13 May 2022; revised on 14 June 2022; accepted on 17 June 2022
Abstract
In today's world, everyone prefers fast interaction with complex systems that guarantee a quick response. With increasing improvements in technology, response time and ease of operation are therefore key concerns. This is where human-computer interaction comes into play. This interaction is unrestricted and challenges the commonly used input devices such as the keyboard and mouse. Gesture recognition has been gaining considerable attention. Gestures are instinctive and are frequently used in everyday interactions, so communicating with computers through gestures creates a whole new style of interaction. In this project, with the help of computer vision and deep learning techniques, user hand movements (gestures) are used in real time to control the media player. Seven gestures are defined to control media players using hand gestures. The proposed web application enables users to use their local device camera to recognize their gestures and execute control over the media player and similar applications, without any additional hardware. It increases efficiency and makes interaction convenient by letting users control their laptop or desktop from a distance.
Keywords: Computer Vision; Deep Learning; Hand Gestures Recognition; Media Player Control
1. Introduction
The demand for sophistication is what has driven the development of technology. Everyone depends on computers to carry out most of their tasks, and the main input devices are the keyboard and mouse. However, constant and continuous work with the computer causes various kinds of health problems for many people. Direct use of the hands as an input device is an appealing approach to human-computer interaction: since hand gestures are a completely natural form of communication, they do not adversely affect the operator's health the way excessive use of the keyboard and mouse can. The user interface has a good understanding of human hand gestures, and gestures can also be used to express feelings and thoughts.
Users typically use hand gestures to express their feelings and to signal their intentions. In hand gesture recognition, hand gestures and hand postures are associated with the human hands. In this paper, we present software that uses dynamic hand gestures as input to control the Windows media player. We consider single-handed gestures, and the directional motion of the hand defines a gesture for the software. In this application, image acquisition is performed using a webcam. Some functions in the Windows media player are used more frequently, so controls for those functions are applied to the media player using predefined gestures.
* Corresponding author: Vaishnavii Raghavendran
Department of Computer Science and Engineering, Meenakshi Sundararajan Engineering College, Chennai, India.
Copyright © 2022 Author(s) retain the copyright of this article. This article is published under the terms of the Creative Commons Attribution License 4.0.
1.1.1 A Dynamic hand gesture recognition system for controlling VLC media player
1.1.3 Controlling Media Player with Hand Gestures using Convolutional Neural Network
With improvements in technology, response time and ease of operation are the main concerns. This is where human-computer interaction comes into play. This interaction is unrestricted and challenges conventional input devices such as the keyboard and mouse. Gesture recognition has been gaining much attention. Gestures are instinctive and are frequently
used in day-to-day interactions. Therefore, communicating using gestures with computers creates a whole new
standard of interaction. In this project, with the help of computer vision and deep learning techniques, user hand
movements (gestures) are used in real-time to control the media player. In this project, seven gestures are defined
to control the media players using hand gestures. The proposed web application enables the user to use their local
device camera to identify their gesture and execute the control over the media player and similar applications
(without any additional hardware). It increases efficiency and makes interaction effortless by letting the user control
his/her laptop/desktop from a distance.
1.1.4 Using Real-Time Gesture to Automotive Control
Shilpa Chaman, Jay Jani, Henson Fernandes, Rahila Dhuka, Dhanvin Mehta (2018, IEEE)
A Real-Time Gesture to Automotive Control (G2AC) system is developed in this paper, using hand gesture recognition to operate the media player in a vehicle. The system offers fast gesture recognition with low-complexity algorithms for controlling real-time media in automotive systems. In the proposed vision-based system, two modules are interconnected: one module recognizes the hand gesture in the region of interaction, and the other selects music from the media player using a Raspberry Pi. The proposed G2AC system can recognize real-time hand gestures with 98 percent accuracy, demonstrated on a hand gesture dataset collected under different settings of illumination variation, hand orientation, and occlusion.
1.1.5 Human-computer interface using hand gesture recognition based on neural network
H. Jalab, H. K. Omer (2015, IEEE)
Gestures are one of the most vivid and dramatic ways of communication between humans and computers. Hence,
there has been a growing interest in creating easy-to-use interfaces by directly utilizing the natural communication
and management skills of humans. This paper presents a hand gesture interface for controlling a media player using
a neural network. The proposed algorithm recognizes a set of four specific hand gestures, namely: Play, Stop,
Forward, and Reverse. Our algorithm is based on four phases: image acquisition, hand segmentation, feature extraction, and classification. A frame from the webcam is captured, and then skin detection is used to
segment skin regions from background pixels. A new image is created containing the hand boundary. Hand shape
features extraction is used to describe the hand gesture. An artificial neural network has also been utilized as a
gesture classifier. 120 gesture images have been used for training. The obtained average classification rate is 95%.
The proposed algorithm develops an alternative input device to control the media player, and also offers different
gesture commands, which can be useful in real-time applications. Comparisons with other hand gesture recognition
systems have revealed that our system shows better performance in terms of accuracy. The automatic vision-based recognition of hand gestures for sign language and for controlling electronic devices, such as digital TVs and play stations, has recently been considered a hot research topic. However, the general problems in these works arise from many issues, such as complex backgrounds, skin color, and the nature of static and dynamic hand gestures.
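To make the skin-detection phase concrete, the sketch below segments skin-coloured pixels in a captured frame using a fixed HSV range; the threshold values and morphological clean-up are illustrative assumptions, not the thresholds used in the cited work:

```python
import cv2
import numpy as np

def segment_hand(frame_bgr):
    """Return a binary mask of skin-coloured pixels (assumed HSV thresholds)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)
    upper = np.array([25, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Morphological opening/closing removes small noise and fills holes
    # before the hand boundary is extracted for feature extraction.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask
```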
1.1.6 System application control based on Hand gesture using Deep learning
The Human-Computer Interaction progresses toward interfaces that seem to be natural and intuitive to use rather
than the customary usage of keyboard and mouse. A hand gesture recognition system is one of the crucial techniques
to build user-friendly interfaces, because of its diversified application and the potential of interacting with machines
proficiently. Hand gestures, including movements of the hands, fingers, or arms, are well suited for interaction. Hand gestures range from static gestures to dynamic gestures against complex backgrounds, through which the communication of human intent with computers succeeds. The proposed solution is
framed by the identification of hand gestures as it possesses the perk of being used effortlessly and does not require
an intervening medium. The existing system for the application access is inflexible and arduous for people with
blindness and hand deformity regarding the human-computer interaction. A deep convolutional neural network
(DCNN) is put forward in this paper, to use hand gesture recognition and immediately classify them by preserving
even the not-hand area without any detection or segmentation process. Hence the proposed objective is to use
different hand gestures via an integrated webcam with the aid of deep learning concepts beneficial for the visually
impaired and people with a hand disability. The two approaches are static hand gestures and dynamic hand gestures. In the static hand gesture method, a predetermined gesture is recognized as a whole, whereas in the dynamic method of gesture recognition the meaning of the gesture is revealed through its movement. The static gesture, in contrast to the dynamic gesture, is less practical, though it has the advantage of being a method with fewer difficulties.
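To make the DCNN idea concrete, here is a minimal Keras sketch of a small convolutional classifier applied directly to gesture frames without a segmentation step; the input size, layer sizes, and class count are illustrative assumptions, not the architecture of the cited paper:

```python
import tensorflow as tf

NUM_GESTURES = 7  # assumed number of gesture classes

def build_gesture_cnn(input_shape=(64, 64, 3)):
    """A small convolutional classifier over raw gesture frames."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
    ])

model = build_gesture_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```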
Figure 1 Modules
3. Capture image
Python provides various libraries for image and video processing. One of them is OpenCV, a large library that offers diverse functions for image and video operations. With OpenCV, we can capture video from the camera. It lets you create a video capture object that is used to capture video through a webcam, on which you can then perform the desired operations. To capture a video, use cv2.VideoCapture() to get a video capture object for the camera.
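A minimal sketch of this capture step, assuming the default device camera at index 0; the window name and the Esc exit key are illustrative choices, not taken from the original implementation:

```python
import cv2

# Open the default device camera (index 0) as a video capture object.
cap = cv2.VideoCapture(0)

while cap.isOpened():
    # Read one frame from the webcam; ret is False if the frame could not be grabbed.
    ret, frame = cap.read()
    if not ret:
        break

    # Mirror the frame so on-screen movement matches the user's hand movement.
    frame = cv2.flip(frame, 1)
    cv2.imshow("Gesture input", frame)

    # Stop capturing when the Esc key (ASCII 27) is pressed.
    if cv2.waitKey(1) & 0xFF == 27:
        break

cap.release()
cv2.destroyAllWindows()
```

Each captured frame can then be passed to the gesture classification stage described in the architecture below.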
Figure 2 Architecture
The system architecture consists of the user presenting gestures to the webcam, which captures them using OpenCV. The system then classifies the gestures according to the number of angles detected; specific numbers of angles trigger specific actions. PyAutoGUI is used to drive the media player from the Python code. The aim of this project is to detect hand gestures and control the media player without touching the keyboard.
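The classification-by-angles step can be illustrated with contour convexity defects, a common OpenCV technique for counting the valleys between extended fingers. The sketch below is an assumption about how that step might look, not the authors' exact code, and it presumes a binary hand mask produced by a prior segmentation or prediction stage:

```python
import math
import cv2
import numpy as np

def count_finger_angles(mask):
    """Count sharp valleys between extended fingers in a binary hand mask."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)          # largest contour = the hand
    hull = cv2.convexHull(hand, returnPoints=False)    # hull as contour indices
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0

    angles = 0
    for i in range(defects.shape[0]):
        s, e, f, _ = defects[i, 0]
        start, end, far = hand[s][0], hand[e][0], hand[f][0]
        # Triangle between two fingertip points (start, end) and the valley point (far).
        a = np.linalg.norm(end - start)
        b = np.linalg.norm(far - start)
        c = np.linalg.norm(end - far)
        # Law of cosines; valleys sharper than ~90 degrees are counted as finger gaps.
        cos_angle = np.clip((b**2 + c**2 - a**2) / (2 * b * c + 1e-6), -1.0, 1.0)
        if math.acos(cos_angle) <= math.pi / 2:
            angles += 1
    return angles
```

The returned count can then be mapped to one of the seven defined gestures and the corresponding media-player action.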
5.2. Outcomes
The model weights are loaded to predict the hand gestures. PyAutoGUI, which is used to map hand gestures to keyboard keys, and Streamlit, which is used to create the user interface, are also imported. The number of presses is set to 1, so whenever a gesture is predicted the associated control function is executed exactly once. The user can exit the system by pressing the Escape key. The video frame shows the predicted gesture and the action being performed whenever the user is using the system to control the media player. The following conditional statements are used to perform the actions, as shown in Table 1.
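A hedged sketch of the kind of conditional mapping referred to above: the gesture labels and key bindings are illustrative assumptions (common media-player shortcuts), not the exact entries of Table 1, and the Streamlit user-interface code is omitted:

```python
import pyautogui

# Illustrative gesture-to-key bindings (assumed shortcuts, not the authors' Table 1).
GESTURE_ACTIONS = {
    "play_pause": "space",
    "volume_up": "up",
    "volume_down": "down",
    "forward": "right",
    "rewind": "left",
    "mute": "m",
    "fullscreen": "f",
}

def perform_action(predicted_gesture: str) -> None:
    """Press the key bound to the predicted gesture exactly once (presses=1)."""
    key = GESTURE_ACTIONS.get(predicted_gesture)
    if key is not None:
        pyautogui.press(key, presses=1)
```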
6. Conclusion
In the modern world, many facilities are available for providing input to software with or without physical contact (speech, hand gestures, and so on). A gesture serves as a direct command for operations such as playing or pausing a video, based on the gesture the user presents to the display. The user provides a gesture as input corresponding to the desired function. Hand gesture recognition is moving at an incredible pace toward futuristic products and services, and major companies are developing technologies based on hand gesture devices.
Future enhancement
Hand gesture recognition is moving at a remarkable pace toward futuristic products and services, and major companies are developing technology based on hand gesture devices; these include companies such as Microsoft, Samsung, and Sony, and devices such as laptops, handheld devices, and professional and LED lighting. The verticals in which gesture technology is, and will be, most visible include the entertainment, artificial intelligence, education, medical, and automation fields. With further research and development in the field of gesture recognition, its use and adoption will become more cost-effective and affordable. It is a remarkable capability: turning information into features through a blend of technology and a human wave of the hand.
Acknowledgments
I would like to thank my teacher, Mrs. M. Sumithra, Assistant Professor, for helping us with this project and for allowing me to work on it. I would also like to wholeheartedly thank Mr. S. Vickram, Designated Partner of AiViREX Innovations LLP, for guiding us in this project.
References
[1] Gaurav Sharma, Manuj Paliwal (2020), "A Dynamic hand gesture recognition system for controlling VLC media player", IEEE 2020 18[4]: 52-57.
[2] Xiaoming Liu and Tsuhan Chen, "Video-Based Face Recognition using Adaptive Hidden Markov Model", Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, PA, 15213, U.S.A.
[3] Stella Nadar, Simran Nazareth, Kevin Paulson, Nilambri Narka, "Controlling Media Player with Hand Gestures using Convolutional Neural Network", IEEE 2021 5[2]: 1-6.
[4] Shilpa Chaman, Jay Jani, Henson Fernandes, Rahila Dhuka, Dhanvin Mehta, "Using Real-Time Gesture to Automotive Control", IEEE 2018 3[5]: 90-95.
[5] H. Jalab, H. K. Omer, "Human-computer interface using hand gesture recognition based on the neural network", IEEE 2015 2[4]: 3-7.
[6] V Niranjani, R Keerthana, B Mohana Priya, K Nekalya, Anantha Krishnan Padmanabhan, "System application control based on Hand gesture using Deep learning", IEEE 2021 6[1]: 42-56.