
AI VIRTUAL MOUSE

A PROJECT REPORT
Submitted by
KIRAN KUMAR B (408CS19010)
BHARATH KUMAR V (408CS19003)
HARSHITHA S (408CS19006)
KARTHIK SHETTY K (408CS19008)
KOUSHIK P (408CS19012)
in partial fulfilment for the award of the diploma

of
DIPLOMA IN COMPUTER SCIENCE & ENGINEERING
PROGRAMME
IN
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

HINDUSTAN ELECTRONICS ACADEMY POLYTECHNIC

DEPARTMENT OF TECHNICAL EDUCATION


BENGALURU -560001

Year of Submission: August 2022


A PROJECT REPORT
ON
AI VIRTUAL MOUSE
Submitted for partial fulfilment of the requirements for the award of
the
DIPLOMA IN COMPUTER SCIENCE AND
ENGINEERING

BY
KIRAN KUMAR B (408CS19010)
BHARATH KUMAR V (408CS19003)
HARSHITHA S (408CS19006)
KARTHIK SHETTY K (408CS19008)
KOUSHIK P (408CS19012)

Under the guidance of


Mrs. ANITTA MATHEW
Lecturer
Department of Computer Science and Engineering

HINDUSTAN ELECTRONICS ACADEMY POLYTECHNIC


Chinnapanahalli, Marathahalli Post, Bengaluru- 560037
CANDIDATE’S DECLARATION

We, KIRAN KUMAR B, BHARATH KUMAR V, HARSHITHA S, KARTHIK SHETTY K
and KOUSHIK P, students of the Diploma in Computer Science and Engineering Department
bearing Register Numbers 408CS19010, 408CS19003, 408CS19006, 408CS19008 and
408CS19012 of HINDUSTAN ELECTRONICS ACADEMY Polytechnic, hereby declare
that we own full responsibility for the information, results and conclusions provided in
this project work titled "AI VIRTUAL MOUSE", submitted to the Board of Technical
Examinations, Government of Karnataka, for the award of Diploma in Computer
Science and Engineering. To the best of our knowledge, this project work has not been
submitted in part or in full elsewhere in any other institution/organization for the award of
any certificate/diploma/degree. We have taken complete care in acknowledging the
contribution of others in this academic work. We further declare that in case of any
violation of intellectual property rights or particulars declared, found at any stage, we,
as the candidates, will be solely responsible for the same.

Date:

Place: Bengaluru

Name & Signature of Candidates

Name: KIRAN KUMAR B


Reg No: 408CS19010

Name: BHARATH KUMAR V Name: HARSHITHA S


Reg No: 408CS19003 Reg No: 408CS19006

Name: KARTHIK SHETTY K Name: KOUSHIK P


Reg No: 408CS19008 Reg No: 408CS19012

HINDUSTAN ELECTRONICS ACADEMY
POLYTECHNIC
Diploma in Computer Science and Engineering

BONAFIDE CERTIFICATE

Certified that this project report "AI VIRTUAL MOUSE" is the
bonafide work of "KIRAN KUMAR B, BHARATH KUMAR V, HARSHITHA S,
KARTHIK SHETTY K, KOUSHIK P", bearing Register Nos. "408CS19010,
408CS19003, 408CS19006, 408CS19008, 408CS19012" of this institution, who
carried out the project work under my supervision.

SIGNATURE SIGNATURE

Mrs. Anitta Mathew Mrs.Poornima Manjunath

Guide Head of Department


Computer Science and Engineering Computer Science and Engineering
HEA Polytechnic, HEA Polytechnic,
Chinnapanahalli, Marathahalli, Chinnapanahalli, Marathahalli,
Bangalore- 560037 Bangalore- 560037

DEPARTMENT OF TECHNICAL EDUCATION
HINDUSTAN ELECTRONICS ACADEMY
POLYTECHNIC
Chinnapanahalli, Marathahalli Post, Bengaluru-560037
Department of Computer Science and Engineering

CERTIFICATE
Certified that the project report entitled "AI VIRTUAL MOUSE", which is
being submitted by KIRAN KUMAR B, BHARATH KUMAR V, HARSHITHA S,
KARTHIK SHETTY K and KOUSHIK P, Reg. Nos. 408CS19010,
408CS19003, 408CS19006, 408CS19008 and 408CS19012, bonafide students of
Hindustan Electronics Academy Polytechnic, in partial fulfilment for the award of
Diploma in Computer Science and Engineering during the year 2021-2022, is a record
of the students' own work carried out under my/our guidance. It is certified that all
corrections/suggestions indicated for Internal Assessment have been incorporated in the
report, and one copy of it has been deposited in the polytechnic library.

The project report has been approved as it satisfies the academic requirements in respect
of Project work prescribed for the said diploma.

It is further understood that by this certificate the undersigned do not endorse or approve
any statement made, opinion expressed or conclusion drawn there in but approve the
project only for the purpose for which it is submitted.

Mrs. ANITTA MATHEW          Mrs. POORNIMA MANJUNATH          Mr. S. RAMAKRISHNA REDDY

Guide                        Head of Department                Principal

Name and signature of Examiner

1. 2.
ACKNOWLEDGEMENT

We are extremely grateful to our beloved Principal Mr. S. Ramakrishna Reddy, HEA
Polytechnic for his kind co-operation.

We are extremely grateful to Mrs. Poornima Manjunath, HOD, Department of


Computer Science and Engineering, HEA Polytechnic for her kind co-operation and
encouragement.

We thank our project co-ordinator Mrs. Poornima Manjunath, HOD of Computer


Science Department, HEA Polytechnic for her valuable guidance and continuous
support to fulfil the project successfully.

We thank our project guide Mrs. Anitta Mathew, Lecturer, Department of Computer
Science and Engineering, HEA Polytechnic, for her valuable guidance during the
course of the project and her continuous support in completing it successfully.

Last but not least, we thank our parents, family members and friends for their great
support and encouragement throughout this project work.

KIRAN KUMAR B (408CS19010)


BHARATH KUMAR V (408CS19003)
HARSHITHA S (408CS19006)
KARTHIK SHETTY K (408CS19008)
KOUSHIK P (408CS19012)

LIST OF FIGURES

Figure No.   Title

Fig. 5.1     AI Virtual Mouse System Architecture
Fig. 5.2.1   Methodology
Fig. 5.3.1   The Camera Used in the AI Virtual Mouse System
Fig. 5.3.3   Virtual Screen Matching
Fig. 5.3.4   Detecting Which Finger Is Up
Fig. 5.3.6   For the Mouse Cursor Moving around the Computer Window
Fig. 5.3.11  For No Action to be Performed on the Screen
Fig. 5.4     Data Flow Diagram
Fig. 5.5     UML Diagram
Fig. 5.5.2   Use Case Diagram
Fig. 5.5.4   Sequence Diagram

ABSTRACT

The mouse is one of the wonderful inventions of Human-Computer Interaction
(HCI) technology. Currently, even a wireless or Bluetooth mouse is not completely
device-free, since it uses a battery for power and a dongle to connect to the PC. In the
proposed AI virtual mouse system, this limitation is overcome by employing a webcam
or a built-in camera to capture hand gestures and detect hand tips using computer
vision. The system makes use of machine learning algorithms: based on the hand
gestures, the computer can be controlled virtually and can perform left-click,
right-click and scrolling functions, as well as cursor movement, without the use of a
physical mouse. The hand-detection algorithm is based on deep learning. Hence, the
proposed system can also help avoid the spread of COVID-19 by eliminating human
contact with shared devices used to control the computer.

Keywords: Human-Computer Interaction (HCI), wireless mouse, webcam

TABLE OF CONTENTS
CANDIDATE DECLARATION
PROJECT GUIDE CERTIFICATE
CERTIFICATE
ACKNOWLEDGEMENT
LIST OF FIGURES
ABSTRACT

Chapter 1 - Introduction
  1.1 Problem Statement
  1.2 Requirement Analysis

Chapter 2 - Objectives

Chapter 3 - Tools/Environment Used
  3.1 Hardware Requirements
  3.2 Software Requirements

Chapter 4 - System Analysis
  4.1 Existing System
    4.1.1 Disadvantages of Existing System
  4.2 Proposed System
    4.2.1 Advantages of Proposed System

Chapter 5 - System Design
  5.1 Introduction
  5.2 System Architecture
  5.3 Working Principle
  5.4 Data Flow Diagram
  5.5 UML Diagram
    5.5.1 Goals
    5.5.2 Use Case Diagram
    5.5.4 Sequence Diagram

Chapter 6 - Program Code
  python.py

Chapter 7 - Testing
  7.1 Overview
  7.2 A/B Testing
  7.3 Beta Testing
  7.4 White Box Testing
  7.5 Black Box Testing
  7.6 Positive Testing
  7.7 Negative Testing
  7.8 Unit Testing
  7.9 Functional Testing
  7.10 Integration Testing
  7.11 System Testing
  7.12 Acceptance Testing

Chapter 8 - Input and Output Screens
  8.1 Snapshot of the Project and Description
    8.1.1 Left Click
    8.1.2 Right Click
    8.1.3 Double Click
    8.1.4 Grab Function
    8.1.5 Volume Control
    8.1.6 Brightness Control
    8.1.7 Scroll Function

Chapter 9 - Limitation of the Project

Chapter 10 - Future Application of the Project

References and Bibliography



CHAPTER 1
INTRODUCTION

A virtual mouse is software that allows users to give mouse inputs to a system without using
an actual mouse. At a stretch, it can even be considered hardware, because it uses an ordinary
web camera. A virtual mouse can usually be operated with multiple input devices, which may
include an actual mouse or a computer keyboard. A virtual mouse that uses a web camera works
with the help of different image processing techniques.

In this approach, the hand movements of a user are mapped into mouse inputs. A web camera is
set to take images continuously. Most laptops today are equipped with webcams, which have
recently been used in security applications utilizing face recognition. In order to harness the
full potential of a webcam, it can be used for vision-based cursor control, which would
effectively eliminate the need for a computer mouse or mouse pad. The usefulness of a webcam
can also be greatly extended to other HCI applications such as a sign language database or
motion controller. Over the past decades there have been significant advancements in HCI
technologies for gaming purposes, such as the Microsoft Kinect and Nintendo Wii. These gaming
technologies provide a more natural and interactive means of playing video games. Motion
control is the future of gaming, and it has tremendously boosted the sales of video games: the
Nintendo Wii, for example, sold over 50 million consoles within a year of its release. HCI
using hand gestures is very intuitive and effective for one-to-one interaction with computers,
and it provides a Natural User Interface (NUI). There has been extensive research towards
novel devices and techniques for cursor control using hand gestures. Besides HCI, hand gesture
recognition is also used in sign language recognition, which makes hand gesture recognition
even more significant.


1.1 PROBLEM STATEMENT


• To design a motion-tracking mouse that detects finger movements and gestures instead of a
physical mouse.
• To design an application (.exe file) with a user-friendly interface that provides access to
the motion-tracking mouse feature.
• The camera should detect all the motions of the hand and perform the operations of a mouse.
• To implement code in which the motion-tracking mouse has a drag-and-drop feature along with
a scrolling feature.
• The user interface must be simple and easy to understand.
• A physical mouse is subject to mechanical wear and tear.
• A physical mouse requires special hardware and a surface to operate.
• A physical mouse is not easily adaptable to different environments, and its performance
varies depending on the environment.
• A mouse has limited functions even in present operational environments.
• All wired and wireless mice have their own lifespan.
• To implement code in which the camera can recognize each and every finger movement and
respond accordingly.

1.2 REQUIREMENT ANALYSIS


1. Problem Recognition: To implement an application which can access the camera of the system
and start detecting the motion of a finger, so that the finger acts as the cursor of a mouse
and can handle the whole system.
2. Evaluation & Synthesis:
• The problem statement is divided into two parts: one is the implementation of the user
interface, and the second is the implementation of the code which can access the camera and
track the movements of a finger.
• After working on both parts separately, they are integrated at the end so that the result is
neater and more polished.
• At the end, all this code is converted into an .exe file so that it is sharable and everyone
can use it.
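A common way to produce such an .exe (an assumption on our part, since the report does not
name a specific tool) is PyInstaller, e.g. running pyinstaller --onefile python.py, which
bundles the script and its dependencies into a single executable.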


3. Modeling:
• Data Design: Here we have to collect all the required data, such as how many finger inputs
we require and how many fingers we require for motion tracking. We also have to decide how
many parts we want to display on the UI.
• System Models: The system should have web camera support to trace the motion of the hand.
It should also have the required Python libraries preinstalled so that they are easy to access.
• UI Design: To implement the user interface, first of all we have to decide the environment
required for the development of the application. After that, according to the architecture, we
start implementing the design.
4. Specification:
• Use cases: After developing this application, the user should be able to access their system
through the Motion Tracker application.
• Users are able to use this application to maintain eye distance between their device and
themselves. Also, during streaming, they can access their system without any movement of
their body.
5. Review: We are developing an application which is a combination of AI and the web. After
completing this project, the user can access their system with the help of their fingers by
using the system's camera.


CHAPTER 2
OBJECTIVES

• To create an application which makes use of AI.
• To design it to operate with the help of a webcam.
• The user should be able to install it easily on their computer.
• The user should be able to use the drag-and-drop feature.
• It must also have a scrolling feature.
• To design a virtual input that can operate on any surface.
• To convert hand gestures/motion into mouse input mapped to a particular screen
position.
• The UI of the application should be easy to use.
• The program should run as fast as possible, without any lag.
• There should be no heavy task which can disturb the user.


CHAPTER 3

TOOLS/ ENVIRONMENT USED

This chapter describes the hardware requirements, the software functional and non-functional
requirements, and the deployment environment.

3.1 HARDWARE REQUIREMENTS


 Processor: i3 or i5 (Core2Duo minimum)

 Main Memory: 2 GB RAM (minimum)

 Hard Disk: 512 GB (minimum)

 Display: 14" monitor (for more comfort)

3.2 SOFTWARE REQUIREMENTS

 Operating system : Windows 10

 Coding Language : Python

 IDE : Visual Studio

 Python: To access the camera and track all hand motion, Python is very easy and
accurate to use. Python comes with lots of built-in libraries, which makes the code
short and easily understandable. The Python version required for building this
application is 3.7.

 OpenCV Library: OpenCV is also included in the making of this program.
OpenCV (Open Source Computer Vision) is a library of programming functions
for real-time computer vision. OpenCV has utilities that can read image pixel
values, and it also supports real-time eye tracking and blink detection. (A short
capture sketch follows this list.)

 Tkinter: The tkinter package is the standard Python interface to the Tk GUI
toolkit. (A launcher sketch also follows this list.)
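As a quick illustration of the capture flow used throughout this project, the following is a
minimal sketch (our illustration, not the project's full code, which appears in Chapter 6) of
creating the OpenCV video capture object and reading frames from the default webcam:

import cv2

cap = cv2.VideoCapture(0)                    # 0 selects the default webcam
while True:
    success, frame = cap.read()              # grab one frame per iteration
    if not success:
        continue                             # skip empty frames
    cv2.imshow("Webcam", frame)              # preview window
    if cv2.waitKey(1) & 0xFF == ord('q'):    # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()

Similarly, a minimal Tkinter launcher for the application could look like the sketch below;
the window title, labels and the start_mouse/stop_mouse handlers are illustrative assumptions,
since the report does not specify the UI layout:

import tkinter as tk

def start_mouse():
    status.set("Virtual mouse running...")   # would start the gesture controller

def stop_mouse():
    status.set("Virtual mouse stopped.")     # would stop the gesture controller

root = tk.Tk()
root.title("AI Virtual Mouse")
status = tk.StringVar(value="Idle")
tk.Label(root, textvariable=status).pack(pady=5)
tk.Button(root, text="Start", command=start_mouse).pack(side="left", padx=10, pady=10)
tk.Button(root, text="Stop", command=stop_mouse).pack(side="right", padx=10, pady=10)
root.mainloop()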

CHAPTER 4

SYSTEM ANALYSIS

4.1 EXISTING SYSTEM

The existing system consists of a mouse that can be either wireless or wired to control the
cursor; now we can use hand gestures to operate the system instead. The existing virtual
mouse control system performs simple mouse operations using coloured fingertips for
detection, which are captured by a webcam: the coloured fingers act as objects whose colour
(red, green or blue) the webcam senses in order to operate the system. The proposed system,
in contrast, can perform basic mouse operations like minimize, drag, scroll up, scroll down,
left-click and right-click using hand gestures without any coloured finger, because a
skin-colour recognition system is more flexible than the existing one. The existing system
uses static hand recognition, such as fingertip identification, hand shape and number of
fingers, to define each action explicitly, which makes the system more complex to understand
and difficult to use.

4.1.1 DISADVANTAGES OF EXISTING SYSTEM

 A physical mouse is subject to mechanical wear and tear.

 A physical mouse requires special hardware and a surface to operate.
 A physical mouse is not easily adaptable to different environments, and its performance
varies depending on the environment.
 A mouse has limited functions even in present operational environments.
 All wired and wireless mice have their own lifespan.


4.2 PROPOSED SYSTEM

The system works by identifying the colour of the hand and decides the position of the
cursor accordingly, but there are different conditions and scenarios which make it
difficult for the algorithm to run in a real environment, for the following reasons:
● Noise in the environment.
● Lighting conditions in the environment.
● Different textures of skin.
● Background objects of the same colour as skin.

Fig. 1 Input Processing

So it becomes very important that the colour-determining algorithm works accurately. The
proposed system can work for a skin tone of any colour and can work accurately in any
lighting condition. For the purpose of clicking, the user needs to create a 15-degree angle
between two fingers. The proposed system can easily replace the traditional mouse, as well
as the algorithms that require coloured tapes for controlling the mouse. This work can be a
pioneer in its field and can be a source of further research in the corresponding field. The
project can be developed at "zero cost" and can easily integrate with an existing system.

4.2.1 ADVANTAGES OF PROPOSED SYSTEM

 The virtual mouse using hand gesture recognition allows users to control the mouse
with the help of hand gestures.
 The system's webcam is used for tracking hand gestures.
 Computer vision techniques are used for gesture recognition.
 OpenCV provides a video capture interface which is used to capture data from a
live video.
 The main thing we need to identify is the set of applications the model is going to
support, namely the development of mouse movement without using the system mouse.


CHAPTER 5
SYSTEM DESIGN

5.1 Introduction

System design is a modelling process. It can be defined as a transition from a user's view to
a programmer's (developer's) view. It concentrates on transferring the requirement
specification to a design specification. The design phase acts as a bridge between the
requirements specification and the implementation phase. In this stage, the complete
description of our project was understood, and all possible combinations to be implemented
were considered.

The design activity is divided into two separate phases:

 High level design

 Low level design

 High Level Design

High-level design involves decomposing the system into modules and representing the
interfaces and invocation relationships among modules. A high-level design is referred to as
the software architecture. A high-level design document will usually include a high-level
architecture diagram depicting the components, and may also depict or otherwise refer to the
workflow between the component systems.

 Low Level Design

Low-level design involves the design of the internal logic of the individual modules, and
involves deciding on the class structure and algorithms to be used in the system. The
detailed design determines how the components identified in the high-level design can be
implemented in software.


5.2 System Architecture


Fig. 5.1 AI virtual mouse System Architecture

5.3 Working principle

The AI virtual mouse system makes use of the transformational algorithm, and it
converts the co-ordinates of fingertip from the webcam screen to the computer window
full screen for controlling the mouse.

Methodology

The various functions and conditions used in the system are explained in the flowchart of the
real-time AI virtual mouse system (Fig. 5.2.1).


Fig. 5.2.1 Methodology

5.3.1 The Camera Used in the AI Virtual Mouse System

The proposed AI virtual mouse system is based on the frames that have been captured by the
webcam in a laptop or PC. By using the Python computer vision library OpenCV, the video
capture object is created and the web camera will start capturing video. The web camera
captures and passes the frames to the AI virtual system.

Fig. 5.3.1 The Camera Used in the AI Virtual Mouse System

5.3.2 Capturing the Video and Processing

The AI virtual mouse system uses the webcam where each frame is captured till the
termination of the program. The video frames are processed from the BGR to the RGB colour
space to find the hands in the video frame by frame, as shown in the following code:

def findHands(self, img, draw=True):
    imgRGB = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
    self.results = self.hands.process(imgRGB)

5.3.3 (Virtual Screen Matching) Rectangular Region for Moving through


the Window

The AI virtual mouse system makes use of the transformational algorithm, and it converts the
co-ordinates of fingertip from the webcam screen to the computer window full screen for
controlling the mouse. When the hands are detected and when we find which finger is up for
performing the specific mouse function, a rectangular box is drawn with respect to the
computer window in the webcam region where we move throughout the window using the
mouse cursor.


Fig. 5.3.3 Virtual Screen Matching
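One common way to implement this coordinate conversion is linear interpolation; the sketch
below is our illustration (not taken from the project code), assuming a 640x480 webcam frame,
a 1920x1080 screen, and an illustrative margin frameR for the rectangular region:

import numpy as np

wCam, hCam = 640, 480      # webcam frame size (assumed)
wScr, hScr = 1920, 1080    # computer screen size (assumed)
frameR = 100               # border of the rectangular region in the frame (assumed)

def to_screen(x, y):
    # interpolate fingertip pixels inside the rectangle to full-screen pixels
    sx = np.interp(x, (frameR, wCam - frameR), (0, wScr))
    sy = np.interp(y, (frameR, hCam - frameR), (0, hScr))
    return sx, sy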

5.3.4. Detecting Which Finger Is Up and Performing the Particular Mouse


Function

In this stage, we detect which finger is up using the tip Id of the respective finger found
using MediaPipe, together with the respective co-ordinates of the fingers that are up;
according to that, the particular mouse function is performed.

Fig. 5.3.4 Detecting Which Finger Is Up
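The tip-Id check can be sketched as follows, assuming lm is the list of 21 MediaPipe hand
landmarks for one hand (lm[8] is the index fingertip, and so on); this is an illustrative
reconstruction of the idea, not the exact project code:

tipIds = [4, 8, 12, 16, 20]            # thumb, index, middle, ring, pinky tips

def fingers_up(lm):
    fingers = []
    # thumb: tip to the side of the joint below it (for a mirrored right hand)
    fingers.append(1 if lm[tipIds[0]].x < lm[tipIds[0] - 1].x else 0)
    # other fingers: tip above (smaller y than) the middle joint means "up"
    for i in range(1, 5):
        fingers.append(1 if lm[tipIds[i]].y < lm[tipIds[i] - 2].y else 0)
    return fingers                      # e.g. [0, 1, 1, 0, 0] = index + middle up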

5.3.5. Mouse Functions Depending on the Hand Gestures and Hand Tip
Detection Using Computer Vision

5.3.6. For the Mouse Cursor Moving around the Computer Window

If the index finger is up with tip Id = 1 or both the index finger with tip Id = 1 and the middle
finger with tip Id = 2 are up, the mouse cursor is made to move around the window of the
computer using the AutoPy package of Python.


Fig. 5.3.6 For the Mouse Cursor Moving around the Computer Window
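Moving the cursor with AutoPy can be sketched as below, assuming (sx, sy) comes from the
screen mapping shown in Section 5.3.3 (the example coordinates are illustrative):

import autopy

wScr, hScr = autopy.screen.size()       # screen size in points
sx, sy = 960.0, 540.0                   # example mapped coordinates (assumed)
# clamp to the screen so autopy does not raise for out-of-range values
autopy.mouse.move(min(sx, wScr - 1), min(sy, hScr - 1))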

5.3.7. For the Mouse to Perform Left Button Click

If both the index finger with tip Id = 1 and the thumb with tip Id = 0 are up and the
distance between the two fingers is less than 30 px, the computer is made to perform the left
mouse button click using the pynput Python package.

5.3.8. For the Mouse to Perform Right Button Click

If both the index finger with tip Id = 1 and the middle finger with tip Id = 2 are up and the
distance between the two fingers is less than 40 px, the computer is made to perform the
right mouse button click using the pynput Python package.
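A minimal sketch of issuing these clicks with pynput, once the finger-distance condition
described above is met (the helper names are our illustration):

from pynput.mouse import Button, Controller as MouseController

mouse = MouseController()

def left_click():
    mouse.click(Button.left, 1)         # single left click

def right_click():
    mouse.click(Button.right, 1)        # single right click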

5.3.9. For the Mouse to Perform Scroll up Function

If both the index finger with tip Id = 1 and the middle finger with tip Id = 2 are up, the
distance between the two fingers is less than 10 px, and the two fingers are moved up the
page, the computer is made to perform the scroll-up mouse function using the PyAutoGUI
Python package.

5.3.10. For the Mouse to Perform Scroll down Function

If both the index finger with tip Id = 1 and the middle finger with tip Id = 2 are up, the
distance between the two fingers is less than 10 px, and the two fingers are moved down
the page, the computer is made to perform the scroll-down mouse function using the
PyAutoGUI Python package.
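The scroll actions with PyAutoGUI can be sketched as follows; 120 units corresponds to one
standard wheel step on Windows:

import pyautogui

def scroll_up():
    pyautogui.scroll(120)               # positive values scroll up

def scroll_down():
    pyautogui.scroll(-120)              # negative values scroll down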

5.3.11. For No Action to be Performed on the Screen

If all the fingers are up, with tip Ids 0, 1, 2, 3 and 4, the computer is made not to perform
any mouse events on the screen.


Fig. 5.3.11 For No Action to be Performed on the Screen


5.4 Data flow diagram

 A data-flow diagram is a way of representing the flow of data through a process or a
system. The DFD also provides information about the outputs and inputs of each
entity and of the process itself.

Fig. 5.4 Data flow diagram


5.5 UML Diagram

 UML stands for Unified Modelling Language. UML is a standardized general-purpose
modelling language in the field of object-oriented software engineering. The standard
is managed, and was created by, the Object Management Group.
 The goal is for UML to become a common language for creating models of
object-oriented computer software. In its current form, UML comprises two major
components: a meta-model and a notation. In the future, some form of method or
process may also be added to, or associated with, UML.
 The Unified Modelling Language is a standard language for specifying, visualizing,
constructing and documenting the artifacts of software systems, as well as for
business modelling and other non-software systems.
 The UML represents a collection of best engineering practices that have proven
successful in the modelling of large and complex systems.
 The UML is a very important part of developing object-oriented software and the
software development process. The UML uses mostly graphical notations to express
the design of software projects.

Fig.5.5 UML diagram


5.5.1 GOALS

The primary goals in the design of the UML are as follows:

 Provide users with a ready-to-use, expressive visual modelling language so that they
can develop and exchange meaningful models.

 Provide extensibility and specialization mechanisms to extend the core concepts.

 Be independent of particular programming languages and development processes.

 Provide a formal basis for understanding the modelling language.

 Encourage the growth of the OO tools market.

 Support higher-level development concepts such as collaborations, frameworks,
patterns and components.

5.5.2 Use Case Diagram

A use case diagram in the Unified Modelling Language (UML) is a type of behavioural
diagram defined by and created from a Use-case analysis. Its purpose is to present a
graphical overview of the functionality provided by a system in terms of actors, their
goals (represented as use cases), and any dependencies between those use cases. The
main purpose of a use case diagram is to show what system functions are performed for
which actor. Roles of the actors in the system can be depicted.

Fig.5.5.2 Use Case Diagram


5.5.4 Sequence Diagram


A sequence diagram in Unified Modelling Language (UML) is a kind of interaction
diagram that shows how processes operate with one another and in what order. It is a
construct of a Message Sequence Chart. Sequence diagrams are sometimes called event
diagrams, event scenarios, and timing diagrams.

Fig. 5.5.4 Sequence Diagram


CHAPTER 6

PROGRAM CODE

python.py

# Imports
import cv2
import mediapipe as mp
import pyautogui
import math
from enum import IntEnum
from ctypes import cast, POINTER
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume
from google.protobuf.json_format import MessageToDict
import screen_brightness_control as sbcontrol

pyautogui.FAILSAFE = False
mp_drawing = mp.solutions.drawing_utils
mp_hands = mp.solutions.hands

# Gesture Encodings
class Gest(IntEnum):
    # Binary Encoded
    FIST = 0
    PINKY = 1
    RING = 2
    MID = 4
    LAST3 = 7
    INDEX = 8
    FIRST2 = 12
    LAST4 = 15
    THUMB = 16
    PALM = 31

    # Extra Mappings
    V_GEST = 33
    TWO_FINGER_CLOSED = 34
    PINCH_MAJOR = 35
    PINCH_MINOR = 36

# Multi-handedness Labels
class HLabel(IntEnum):
    MINOR = 0
    MAJOR = 1

# Convert Mediapipe Landmarks to recognizable Gestures
class HandRecog:

    def __init__(self, hand_label):
        self.finger = 0
        self.ori_gesture = Gest.PALM
        self.prev_gesture = Gest.PALM
        self.frame_count = 0
        self.hand_result = None
        self.hand_label = hand_label

    def update_hand_result(self, hand_result):
        self.hand_result = hand_result

    # Signed distance between two landmarks (sign encodes vertical order)
    def get_signed_dist(self, point):
        sign = -1
        if self.hand_result.landmark[point[0]].y < self.hand_result.landmark[point[1]].y:
            sign = 1
        dist = (self.hand_result.landmark[point[0]].x - self.hand_result.landmark[point[1]].x)**2
        dist += (self.hand_result.landmark[point[0]].y - self.hand_result.landmark[point[1]].y)**2
        dist = math.sqrt(dist)
        return dist*sign

    # Euclidean distance between two landmarks in the image plane
    def get_dist(self, point):
        dist = (self.hand_result.landmark[point[0]].x - self.hand_result.landmark[point[1]].x)**2
        dist += (self.hand_result.landmark[point[0]].y - self.hand_result.landmark[point[1]].y)**2
        dist = math.sqrt(dist)
        return dist

    # Absolute depth difference between two landmarks
    def get_dz(self, point):
        return abs(self.hand_result.landmark[point[0]].z - self.hand_result.landmark[point[1]].z)

    # Function to find Gesture Encoding using current finger_state.
    # Finger_state: 1 if finger is open, else 0
    def set_finger_state(self):
        if self.hand_result is None:
            return

        points = [[8, 5, 0], [12, 9, 0], [16, 13, 0], [20, 17, 0]]
        self.finger = 0
        self.finger = self.finger | 0  # thumb
        for idx, point in enumerate(points):
            dist = self.get_signed_dist(point[:2])
            dist2 = self.get_signed_dist(point[1:])
            try:
                ratio = round(dist/dist2, 1)
            except ZeroDivisionError:
                ratio = round(dist/0.01, 1)

            self.finger = self.finger << 1
            if ratio > 0.5:
                self.finger = self.finger | 1

    # Handling fluctuations due to noise
    def get_gesture(self):
        if self.hand_result is None:
            return Gest.PALM

        current_gesture = Gest.PALM
        if self.finger in [Gest.LAST3, Gest.LAST4] and self.get_dist([8, 4]) < 0.05:
            if self.hand_label == HLabel.MINOR:
                current_gesture = Gest.PINCH_MINOR
            else:
                current_gesture = Gest.PINCH_MAJOR
        elif Gest.FIRST2 == self.finger:
            point = [[8, 12], [5, 9]]
            dist1 = self.get_dist(point[0])
            dist2 = self.get_dist(point[1])
            ratio = dist1/dist2
            if ratio > 1.7:
                current_gesture = Gest.V_GEST
            else:
                if self.get_dz([8, 12]) < 0.1:
                    current_gesture = Gest.TWO_FINGER_CLOSED
                else:
                    current_gesture = Gest.MID
        else:
            current_gesture = self.finger

        if current_gesture == self.prev_gesture:
            self.frame_count += 1
        else:
            self.frame_count = 0
        self.prev_gesture = current_gesture

        if self.frame_count > 4:
            self.ori_gesture = current_gesture
        return self.ori_gesture

# Executes commands according to detected gestures
class Controller:
    tx_old = 0
    ty_old = 0
    trial = True
    flag = False
    grabflag = False
    pinchmajorflag = False
    pinchminorflag = False
    pinchstartxcoord = None
    pinchstartycoord = None
    pinchdirectionflag = None
    prevpinchlv = 0
    pinchlv = 0
    framecount = 0
    prev_hand = None
    pinch_threshold = 0.3

    def getpinchylv(hand_result):
        dist = round((Controller.pinchstartycoord - hand_result.landmark[8].y)*10, 1)
        return dist

    def getpinchxlv(hand_result):
        dist = round((hand_result.landmark[8].x - Controller.pinchstartxcoord)*10, 1)
        return dist

    def changesystembrightness():
        currentBrightnessLv = sbcontrol.get_brightness()/100.0
        currentBrightnessLv += Controller.pinchlv/50.0
        if currentBrightnessLv > 1.0:
            currentBrightnessLv = 1.0
        elif currentBrightnessLv < 0.0:
            currentBrightnessLv = 0.0
        sbcontrol.fade_brightness(int(100*currentBrightnessLv), start=sbcontrol.get_brightness())

    def changesystemvolume():
        devices = AudioUtilities.GetSpeakers()
        interface = devices.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
        volume = cast(interface, POINTER(IAudioEndpointVolume))
        currentVolumeLv = volume.GetMasterVolumeLevelScalar()
        currentVolumeLv += Controller.pinchlv/50.0
        if currentVolumeLv > 1.0:
            currentVolumeLv = 1.0
        elif currentVolumeLv < 0.0:
            currentVolumeLv = 0.0
        volume.SetMasterVolumeLevelScalar(currentVolumeLv, None)

    def scrollVertical():
        pyautogui.scroll(120 if Controller.pinchlv > 0.0 else -120)

    def scrollHorizontal():
        pyautogui.keyDown('shift')
        pyautogui.keyDown('ctrl')
        pyautogui.scroll(-120 if Controller.pinchlv > 0.0 else 120)
        pyautogui.keyUp('ctrl')
        pyautogui.keyUp('shift')

    # Locate Hand to get Cursor Position
    # Stabilize cursor by Dampening
    def get_position(hand_result):
        point = 9
        position = [hand_result.landmark[point].x, hand_result.landmark[point].y]
        sx, sy = pyautogui.size()
        x_old, y_old = pyautogui.position()
        x = int(position[0]*sx)
        y = int(position[1]*sy)
        if Controller.prev_hand is None:
            Controller.prev_hand = x, y
        delta_x = x - Controller.prev_hand[0]
        delta_y = y - Controller.prev_hand[1]

        distsq = delta_x**2 + delta_y**2
        ratio = 1
        Controller.prev_hand = [x, y]

        if distsq <= 25:
            ratio = 0
        elif distsq <= 900:
            ratio = 0.07 * (distsq ** (1/2))
        else:
            ratio = 2.1
        x, y = x_old + delta_x*ratio, y_old + delta_y*ratio
        return (x, y)

    def pinch_control_init(hand_result):
        Controller.pinchstartxcoord = hand_result.landmark[8].x
        Controller.pinchstartycoord = hand_result.landmark[8].y
        Controller.pinchlv = 0
        Controller.prevpinchlv = 0
        Controller.framecount = 0

    # Hold final position for 5 frames to change status
    def pinch_control(hand_result, controlHorizontal, controlVertical):
        if Controller.framecount == 5:
            Controller.framecount = 0
            Controller.pinchlv = Controller.prevpinchlv

            if Controller.pinchdirectionflag == True:
                controlHorizontal()  # x
            elif Controller.pinchdirectionflag == False:
                controlVertical()  # y

        lvx = Controller.getpinchxlv(hand_result)
        lvy = Controller.getpinchylv(hand_result)

        if abs(lvy) > abs(lvx) and abs(lvy) > Controller.pinch_threshold:
            Controller.pinchdirectionflag = False
            if abs(Controller.prevpinchlv - lvy) < Controller.pinch_threshold:
                Controller.framecount += 1
            else:
                Controller.prevpinchlv = lvy
                Controller.framecount = 0
        elif abs(lvx) > Controller.pinch_threshold:
            Controller.pinchdirectionflag = True
            if abs(Controller.prevpinchlv - lvx) < Controller.pinch_threshold:
                Controller.framecount += 1
            else:
                Controller.prevpinchlv = lvx
                Controller.framecount = 0

    def handle_controls(gesture, hand_result):
        x, y = None, None
        if gesture != Gest.PALM:
            x, y = Controller.get_position(hand_result)

        # flag reset
        if gesture != Gest.FIST and Controller.grabflag:
            Controller.grabflag = False
            pyautogui.mouseUp(button="left")
        if gesture != Gest.PINCH_MAJOR and Controller.pinchmajorflag:
            Controller.pinchmajorflag = False
        if gesture != Gest.PINCH_MINOR and Controller.pinchminorflag:
            Controller.pinchminorflag = False

        # implementation
        if gesture == Gest.V_GEST:
            Controller.flag = True
            pyautogui.moveTo(x, y, duration=0.1)
        elif gesture == Gest.FIST:
            if not Controller.grabflag:
                Controller.grabflag = True
                pyautogui.mouseDown(button="left")
            pyautogui.moveTo(x, y, duration=0.1)
        elif gesture == Gest.MID and Controller.flag:
            pyautogui.click()
            Controller.flag = False
        elif gesture == Gest.INDEX and Controller.flag:
            pyautogui.click(button='right')
            Controller.flag = False
        elif gesture == Gest.TWO_FINGER_CLOSED and Controller.flag:
            pyautogui.doubleClick()
            Controller.flag = False
        elif gesture == Gest.PINCH_MINOR:
            if Controller.pinchminorflag == False:
                Controller.pinch_control_init(hand_result)
                Controller.pinchminorflag = True
            Controller.pinch_control(hand_result, Controller.scrollHorizontal, Controller.scrollVertical)
        elif gesture == Gest.PINCH_MAJOR:
            if Controller.pinchmajorflag == False:
                Controller.pinch_control_init(hand_result)
                Controller.pinchmajorflag = True
            Controller.pinch_control(hand_result, Controller.changesystembrightness, Controller.changesystemvolume)

'''
Main Class
Entry point of Gesture Controller
'''
class GestureController:
    gc_mode = 0
    cap = None
    CAM_HEIGHT = None
    CAM_WIDTH = None
    hr_major = None  # Right Hand by default
    hr_minor = None  # Left hand by default
    dom_hand = True

    def __init__(self):
        GestureController.gc_mode = 1
        GestureController.cap = cv2.VideoCapture(0)
        GestureController.CAM_HEIGHT = GestureController.cap.get(cv2.CAP_PROP_FRAME_HEIGHT)
        GestureController.CAM_WIDTH = GestureController.cap.get(cv2.CAP_PROP_FRAME_WIDTH)

    def classify_hands(results):
        left, right = None, None
        try:
            handedness_dict = MessageToDict(results.multi_handedness[0])
            if handedness_dict['classification'][0]['label'] == 'Right':
                right = results.multi_hand_landmarks[0]
            else:
                left = results.multi_hand_landmarks[0]
        except Exception:
            pass  # fewer than one hand detected

        try:
            handedness_dict = MessageToDict(results.multi_handedness[1])
            if handedness_dict['classification'][0]['label'] == 'Right':
                right = results.multi_hand_landmarks[1]
            else:
                left = results.multi_hand_landmarks[1]
        except Exception:
            pass  # fewer than two hands detected

        if GestureController.dom_hand == True:
            GestureController.hr_major = right
            GestureController.hr_minor = left
        else:
            GestureController.hr_major = left
            GestureController.hr_minor = right

    def start(self):
        handmajor = HandRecog(HLabel.MAJOR)
        handminor = HandRecog(HLabel.MINOR)

        with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5,
                            min_tracking_confidence=0.5) as hands:
            while GestureController.cap.isOpened() and GestureController.gc_mode:
                success, image = GestureController.cap.read()
                if not success:
                    print("Ignoring empty camera frame.")
                    continue

                image = cv2.cvtColor(cv2.flip(image, 1), cv2.COLOR_BGR2RGB)
                image.flags.writeable = False
                results = hands.process(image)
                image.flags.writeable = True
                image = cv2.cvtColor(image, cv2.COLOR_RGB2BGR)

                if results.multi_hand_landmarks:
                    GestureController.classify_hands(results)
                    handmajor.update_hand_result(GestureController.hr_major)
                    handminor.update_hand_result(GestureController.hr_minor)

                    handmajor.set_finger_state()
                    handminor.set_finger_state()
                    gest_name = handminor.get_gesture()

                    if gest_name == Gest.PINCH_MINOR:
                        Controller.handle_controls(gest_name, handminor.hand_result)
                    else:
                        gest_name = handmajor.get_gesture()
                        Controller.handle_controls(gest_name, handmajor.hand_result)

                    for hand_landmarks in results.multi_hand_landmarks:
                        mp_drawing.draw_landmarks(image, hand_landmarks, mp_hands.HAND_CONNECTIONS)
                else:
                    Controller.prev_hand = None

                cv2.imshow('Gesture Controller', image)
                if cv2.waitKey(5) & 0xFF == 13:  # Enter key quits
                    break

        GestureController.cap.release()
        cv2.destroyAllWindows()

# Run the controller directly
gc1 = GestureController()
gc1.start()



CHAPTER 7

TESTING

7.1 Overview

The purpose of testing is to discover errors. Testing is the process of trying to discover
every conceivable fault or weakness in a work product. It provides a way to check the
functionality of components, sub-assemblies, assemblies and/ or a finished product. It
is the process of exercising software with the intent of ensuring that the Software system
meets its requirements and user expectations and does not fail in an unacceptable
manner. There are various types of tests; each test type addresses a specific testing
requirement.

7.2 A/B Testing

The case for A/B testing is very strong: when any changes are being made in a design,
performing tests along the way means you back up design decisions with data.

The following occurrences are very common:

 Members of a design team disagree on which is the best path to pursue

 Client and designer disagree as to which variation of an interface will work
better
 Business or marketing team and design team disagree on which will work
better

Apart from being a platform for everyone to raise sometimes heated personal opinions
and biases, discussions like these usually lead nowhere other than hour-long,
heavily loaded meetings. Data is, by far, the best way to settle these debates. A client wouldn't
be arguing that a blue button is better than a red one if he knew the red variation would
increase his revenue by 0.5% (or, let's say, $500/day).


A design team wouldn't argue over which imagery to use if they knew that a certain
variation increases retention. A/B testing helps teams deliver better work, more
efficiently.
Going further, it also allows you to improve key business metrics. Testing, especially
if it's conducted continually, enables you to optimize your interface and make sure
your website is delivering the best results possible. Picture for a moment an ecommerce
store.

The goal is to increase the number of checkouts. A/B testing only the listing page would
have a really small effect on the total number of checkouts. It wouldn't move the needle
significantly, and neither would optimizing just the homepage header, for example.
However, running tests to optimize all areas, from the menus all the way to the
checkout confirmation, will result in a compound effect that makes more of an
impact.


7.3 Beta Testing

A type of User Acceptance Testing, beta testing, also known as "field testing", is done
in the customer's environment. Beta testing is commonly used for brand-new features
and products. The purpose of beta testing is to provide access to users, who then provide
feedback which helps improve the application. Beta testing often involves a limited
number of users.
7.4 White Box Testing
White box testing is a method of testing software in which the internal workings (code,
architecture, design, etc.) are known to the tester. White box testing validates the internal
structure and therefore often focuses primarily on improving security and making the
flow of inputs/outputs more efficient and optimized. In white box testing, the tester is
often testing for internal security holes and broken or poorly structured coding paths.
The term "white box" is used because in this type of testing you have visibility into the
internal workings. Because of this, white box testing usually requires a more technical
person. Types of white box testing include unit testing and integration testing.

7.5 Black Box Testing

Black box testing is a method of testing software in which the internal workings, (code,
architecture, design, etc), are NOT known to the tester. Black box testing focuses on
the behaviour of the software and involves testing from an external or end-user
perspective. With black box testing, the tester is testing the functionality of the software

without looking at the code or having any knowledge of the application’s internal flows.
Inputs and outputs are tested and compared to the expected output and if the actual
output doesn’t match the expected output, a bug has been found.

The term “black box” is used because in this type of testing, you don’t look inside of
the application. For this reason, non-technical people often conduct black box testing.
Types of black box testing include functional testing, system testing, usability testing,
and regression testing.


7.6 Positive Testing

Positive testing is the type of testing that can be performed on the system by providing
valid data as input. It checks whether an application behaves as expected with
positive inputs. This test is done to check that the application does what it is supposed
to do.

7.7 Negative Testing

Negative testing is a variant of testing that can be performed on the system by
providing invalid data as input. It checks whether an application behaves as expected
with negative inputs. This is to test that the application does not do anything that it is
not supposed to do.

7.8 Unit Testing

Unit testing involves the design of test cases that validate that the internal program logic
is functioning properly and that program inputs produce valid outputs. All decision
branches and internal code flow should be validated. It is the testing of individual
software units of the application. It is done after the completion of an individual unit,
before integration. This is a structural testing that relies on knowledge of the unit's
construction and is invasive. Unit tests perform basic tests at component level and test
a specific business process, application, and/or system configuration. Unit tests ensure
that each unique path of a business process performs accurately to the documented
specification and contains clearly defined inputs and expected results. Unit testing is
usually conducted as part of a combined code and unit test phase of the software
lifecycle, although it is not uncommon for coding and unit testing to be conducted as
two distinct phases.
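As an illustration, a minimal unit test for the get_dist() helper from Chapter 6 could look
like the sketch below; it assumes the Chapter 6 code has been saved as an importable module
(here the hypothetical virtual_mouse.py, without the auto-start lines at the bottom):

import unittest
from types import SimpleNamespace

from virtual_mouse import HandRecog, HLabel   # hypothetical module name

class GetDistTest(unittest.TestCase):
    def test_get_dist_is_euclidean(self):
        # two stub landmarks forming a scaled 3-4-5 right triangle
        lm = [SimpleNamespace(x=0.0, y=0.0, z=0.0),
              SimpleNamespace(x=0.3, y=0.4, z=0.0)]
        recog = HandRecog(HLabel.MAJOR)
        recog.update_hand_result(SimpleNamespace(landmark=lm))
        self.assertAlmostEqual(recog.get_dist([0, 1]), 0.5)

if __name__ == "__main__":
    unittest.main()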


Test strategy and approach

Field testing will be performed manually and functional tests will be written in detail.

Test objectives

 All gestures must work properly.

 The camera must identify the fingers.

 The movements must not be delayed.

Features to be tested

 Is it performing all the actions?

 Is it fast?


7.9 Functional Testing

Functional tests provide systematic demonstrations that functions tested are available
as specified by the business and technical requirements, system documentation, and
user manuals.
Functional testing is centred on the following items:

Valid Input : Identified classes of valid input must be accepted.


Invalid Input : Identified classes of invalid input must be rejected.
Functions : Identified functions must be exercised.
Output : Identified classes of application outputs must be exercised.
System/ Procedures : Interfacing systems or procedures must be exercised.

Organization and preparation of functional tests is focused on requirements, key functions,
or special test cases.

In addition, systematic coverage pertaining to identified business process flows,
data fields, predefined processes, and successive processes must be considered for
testing.

Before functional testing is complete, additional tests are identified and the effective
value of the current tests is determined.


7.10 Integration Testing

Software integration testing is the incremental integration testing of two or more
integrated software components on a single platform, to expose failures caused by
interface defects.

The task of the integration test is to check that components or software, e.g. components
in a software system or, one step up, software applications at the company level,
interact without error.
7.11 System Testing

System testing ensures that the entire integrated software system meets requirements.
It tests a configuration to ensure known and predictable results. An example of system
testing is based on process description and flows, emphasizing pre-driven process links
and integration points.
7.12 Acceptance Testing

User Acceptance Testing is a critical phase of any project and requires significant
participation by the end user. It also ensures that the system meets the functional
requirements.

Test Results

The test cases listed below were executed; all of them passed except the dark-environment
case (test case 3), which failed as noted.


Test Cases:

Test case 1
Scenario: Used in normal environment.
Boundary value: >90%
Expected result: In a normal environment, hand gestures can be recognized easily.
Actual result: Hand gestures were easily recognized and work properly.
Status: Passed

Test case 2
Scenario: Used in bright environment.
Boundary value: >60%
Expected result: In a brighter environment the software should work fine, as it easily
detects the hand movements, but in much brighter conditions it may not detect the hand
gestures as expected.
Actual result: In bright conditions the software works very well.
Status: Passed

Test case 3
Scenario: Used in dark environment.
Boundary value: <30%
Expected result: In a dark environment, it should work properly.
Actual result: In a dark environment, the software didn't work properly in detecting hand
gestures.
Status: Failed

Test case 4
Scenario: Used at a near distance (15 cm) from the webcam.
Boundary value: >80%
Expected result: At this distance, the software should perform perfectly.
Actual result: It works fine and all features work properly.
Status: Passed

Test case 5
Scenario: Used at a far distance (35 cm) from the webcam.
Boundary value: >95%
Expected result: At this distance, the software should work fine.
Actual result: At this distance, it is working properly.
Status: Passed

Test case 6
Scenario: Used at a farther distance (60 cm) from the webcam.
Boundary value: >60%
Expected result: At this distance, there will be some problems in detecting hand gestures,
but it should work fine.
Actual result: At this distance, the functions of the software work properly.
Status: Passed


CHAPTER 8
INPUT AND OUTPUT SCREENS
8.1 Snapshot of the Project and Description

Fig 8.1.1 Left click

Fig 8.1.2 Right click


Fig 8.1.3 Double click

Fig 8.1.4 Grab function


Fig 8.1.5 Volume control

Fig 8.1.6 Brightness control


Fig 8.1.7 Scroll function


CHAPTER 9
LIMITATION OF THE PROJECT

 Not as fast as existing system

 Not suitable for dark environment

 Not as efficient as physical mouse

 The proposed AI virtual mouse has some limitations, such as a small decrease in the
accuracy of the right-click mouse function.

 The model has some difficulties in executing clicking and dragging to select
text.


CHAPTER 10

FUTURE APPLICATION OF THE PROJECT

There are several features and improvements needed in order for the program to be more
user-friendly, accurate, and flexible in various environments.
The following describes the improvements and features required:

 Smart Movement: Because the current recognition process is limited to a 25 cm
radius, an adaptive zoom-in/out function is required to improve the covered distance;
it could automatically adjust the focus rate based on the distance between the
user and the webcam.

 Better Accuracy & Performance: The response time relies heavily on the
hardware of the machine, including the processing speed of the processor, the size
of the available RAM, and the capabilities of the webcam. Therefore, the program
may perform better when running on a decent machine with a webcam that performs
well under different types of lighting.

 Mobile Application: In future, this application could also be used on Android
devices, where the touchscreen concept is replaced by hand gestures.


REFERENCES AND BIBLIOGRAPHY


[1] International Journal of Computer Trends and Technology (IJCTT), volume 9,
number 1, Mar 2014, ISSN: 2231-2803, www.internationaljournalssrg.org, page 15:
Mouse Control using a Web Camera based on Colour Detection.
[2] K N. Shah, K R. Rathod and S. J. Agravat, "A survey on Human Computer
Interaction Mechanism Using Finger Tracking", International Journal of
Computer Trends and Technology, 7(3), 2014, 174-177.
[3] Tutorialspoint.com, (n.d.). SDLC - Agile Model. [online] Available at:
http://www.tutorialspoint.com/sdlc/sdlc_agile_model.htm
[4] Python GUI Programming With Tkinter, https://realpython.com/python-gui-tkinter/
[5] Python NumPy, https://numpy.org/
[6] The Python Standard Library,
https://python.readthedocs.io/en/latest/library/index.html
[7] The MATLAB website. [Online]. Available:
http://www.mathworks.com/matlabcentral/fileexchange/28757-tracking-red-color-objects-using-matlab
[8] PyCharm, https://www.jetbrains.com/pycharm
[9] http://anikettatipamula.blogspot.in/2012/02/hand-gesture-using-opencv.html
[10] MSDN Microsoft Developer Network, www.msdn.microsoft.com
[11] Code Project, www.codeproject.com/Articles/498193/Mouse-Control-via-Webcam
[12] Aniket Tatipamula's Blog,
http://anikettatipamula.blogspot.in/2012/02/hand-gesture-using-opencv.html
[13] Microsoft Research Paper,
http://research.microsoft.com/enus/um/people/awf/bmvc02/project.pdf
[14] Banerjee, A., Ghosh, A., Bharadwaj, K., & Saikia, H. (2014). Mouse control
using a web camera based on colour detection. arXiv preprint arXiv:1403.4722.


[15] Chu-Feng, L. (2008). Portable Vision-Based HCI. [online] Available at:
http://www.csie.ntu.edu.tw/~p93007/projects/vision/vision_hci_p93922007.pdf
[Accessed 25 Aug. 2015].
[16] Park, H. (2008). A method for controlling mouse movement using a real-time
camera. Brown University, Providence, RI, USA, Department of Computer Science.
[17] Kumar N, M. (2011). Manual Testing: Agile software development. [online]
Manojforqa.blogspot.com. Available at:
http://manojforqa.blogspot.com/2011/09/agile-software-development.html
[Accessed 27 Aug. 2015].
[18] Niyazi, K. (2012). Mouse Simulation Using Two Coloured Tapes. IJIST, 2(2),
pp. 57-63.
[19] Sekeroglu, K. (2010). Virtual Mouse Using a Webcam. [online] Available at:
http://www.ece.lsu.edu/ipl/SampleStudentProjects/ProjectKazim/Virtual%20Mouse%20Using%20a%20Webcam_Kazim_Sekeroglu.pdf
[Accessed 29 Aug. 2015].
[20] Tutorialspoint.com, (n.d.). SDLC - Agile Model. [online] Available at:
http://www.tutorialspoint.com/sdlc/sdlc_agile_model.htm [Accessed 27 Aug. 2015].
[21] Tabernae.com, (n.d.). Software Life Cycle | Web Development Outsourcing |
IT Offshore Outsourcing. [online] Available at: http://www.tabernae.com/process.aspx
[Accessed 28 Aug. 2015].
[22] Zhengyou, Z., Ying, W. and Shafer, S. (2001). Visual Panel: Virtual Mouse,
Keyboard and 3D Controller with an Ordinary Piece of Paper. [online] Available at:
http://research.microsoft.com/en-us/um/people/zhang/Papers/PUI2001-VisualPanel.pdf
[Accessed 25 Aug. 2015].
