Vyshnavi Project
Submitted by:
GAJULAPALLI NAGA VYSHNAVI (22091F0029)
(ESTD – 1995)
CERTIFICATE
This is to certify that GAJULAPALLI NAGA VYSHNAVI (22091F0029), of MCA III semester, has carried out the mini-project work entitled “FINGERPRINT BASED ATM SYSTEM” under the supervision and guidance of Mr. V. V. NAGENDRA KUMAR, Assistant Professor, MCA Department, in partial fulfillment of the requirements for the award of the Degree of Master of Computer Applications from Rajeev Gandhi Memorial College of Engineering & Technology (Autonomous), Nandyal, and that it is a bonafide record of the work done by her during 2023-2024.
Dept. of MCA
I express my gratitude to Dr. K. Subba Reddy garu, Head of the Department of Computer Science Engineering & MCA, and to all the teaching and non-teaching staff of the Computer Science Engineering department of Rajeev Gandhi Memorial College of Engineering and Technology for providing continuous encouragement and cooperation at various stages of my project.
At the outset, I thank our honorable Chairman Dr. M. SanthiRamudu garu for providing us with exceptional faculty and moral support throughout the course.
Finally, I extend my sincere thanks to all the staff members of the MCA & CSE departments who have cooperated with and encouraged me in making my project successful.
Whatever one does, whatever one achieves, the first credit goes to one's parents; were it not for their love and affection, nothing would have been possible. I see their love and blessings in every good thing that happens to us.
BY
GAJULAPALLI NAGA VYSHNAVI (22091F0029)
CONTENTS
CHAPTER PAGE NO.
1. INTRODUCTION 1
1.1. Purpose 1
1.2. Scope 1
2. LITERATURE SURVEY 3
2.4.1. Disadvantages 4
2.5.1. Advantages 4
3.1.1 Algorithms 5
3.2 Modules 7
4. IMPLEMENTATION 20
4. Programming Language 20
4.4 Normalization 30
6. FUTURE ENHANCEMENT 37
7. CONCLUSION 38
Appendix: Screenshots 39
Bibliography 44
LIST OF FIGURES
1. Data dictionary 25
2. Data table 25
3. 1NF 31
4. 2NF 31
5. User details 32
6. User details 32
CHAPTER-I
INTRODUCTION
1.1 Purpose
Biometrics is the science and technology of measuring and analyzing biological data. The term refers to technologies that measure and analyze human body characteristics, such as DNA, fingerprints, eye retinas and irises, voice patterns, and facial patterns, for authentication purposes. Biometric identification provides several advantages over the traditional methods currently used in daily life, and it basically serves two functions: identification and verification. A modern ATM is typically made up of devices such as a CPU to control the user interface and the transaction devices, a magnetic or chip card reader to identify the customer, a PIN pad, and a secure crypto-processor, generally housed within a secure enclosure.
It also includes a display used by the customer to perform the transaction, function key buttons, a receipt printer to provide the customer with a record of the transaction, a vault to store the parts of the machinery requiring restricted access, a housing for aesthetics, and sensors and indicators. In this modern era many people use ATMs, and the rapid development of banking has brought various advantages and disadvantages.
1.2 Scope
The main objective of this system is to develop an embedded system that is used for ATM security applications. In this system, bankers collect the customer's fingerprints while opening the account, and only that customer can then access the ATM machine. The ATM works as follows: when the customer places a finger on the fingerprint module, it displays the name of the customer on the LCD connected to the microcontroller. If the user does not have an account activated by a fingerprint, the system does not allow the user to perform transactions.
Nowadays, the ATM (Automatic Teller Machine), which provides customers with convenient banknote transactions, is very common. However, financial crime has risen repeatedly in recent years; many criminals tamper with ATM terminals and steal users' card details and passwords by illegal means. Once a user's bank card is lost and the password is stolen, the criminal can withdraw all the cash in a very short time, which causes enormous financial losses to the customer. How to verify the identity of the customer has therefore become the focus of the financial sector.
We use ATMs in our country for all our banking activities. An automated teller machine (ATM) is an electronic telecommunications device that enables customers of financial institutions to perform financial transactions, such as cash withdrawals, deposits, fund transfers, or obtaining account information, at any time and without the need for direct interaction with bank staff. On most modern ATMs, the customer is identified by inserting a plastic ATM card (or some other acceptable payment card) into the ATM, with authentication provided by the customer entering a personal identification number (PIN), which must match the PIN stored in the chip on the card (if the card is so equipped) or in the issuing financial institution's database. Using an ATM, customers can access their bank deposit or credit accounts in order to make a variety of financial transactions such as withdrawing cash, checking balances, or crediting mobile phones. ATMs can also be used to withdraw cash in a foreign country; if the currency being withdrawn from the ATM is different from that in which the bank account is denominated, the money is converted at the financial institution's exchange rate.
Most biometric technology systems use the same basic principles of operation. First, a person
must be registered, or enrolled, on the biometric system.
CHAPTER -II
LITERATURE REVIEW
2.1 A Review on Securing ATM System Using Fingerprint:
Biometric systems have over time served as robust security mechanisms in various domains. Fingerprints are the oldest and most widely used form of biometric identification; their use for identification has been employed in law enforcement for about a century. A much broader application of fingerprints is personal authentication, for instance to access a computer, a network, an ATM machine, a car or a home. The method uses 3D analysis of the finger for tracking and identification purposes. This recorded biometric information is then stored for future use. Companies have used this type of biometrics for attendance tracking and for controlling access to secure entrances.
The system records the fingerprint of each user who uses the machine. Using biometrics, it verifies/identifies the fingerprint and gives an accurate result as to whether it is valid or not. In this way we can try to control ATM-related crime and make the machine more secure. In this scheme, a fingerprint biometric technique is fused with the ATM for person authentication in order to raise the security level.
The existing ATM system authenticates transactions via a card and PIN-based system. It then grants bank customers access to several services such as cash withdrawals and deposits, account-to-account transfers, balance enquiries, top-up purchases and utility bill payments. The ATM system compares the PIN entered against the stored authorization PIN for every ATM user. If there is a match, the system authenticates the user and grants access to all the services available via the ATM. If there is a mismatch, on the other hand, user authentication fails and the user is given two more opportunities to enter a correct PIN. If an incorrect PIN is entered for the third time, the card is blocked and retained by the ATM.
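The retry-and-block flow just described can be summarized in a short, hedged sketch. The function and constant names below are illustrative assumptions, not code from any real ATM software.

# Illustrative sketch of the PIN retry/blocking flow described above.
# All names here (verify_pin, MAX_ATTEMPTS) are hypothetical.
MAX_ATTEMPTS = 3  # a third wrong PIN blocks and retains the card

def verify_pin(entered_pin: str, stored_pin: str, attempts_used: int) -> str:
    """Return 'granted', 'retry', or 'card_blocked' for one PIN attempt."""
    if entered_pin == stored_pin:
        return "granted"                  # access to all ATM services
    if attempts_used + 1 >= MAX_ATTEMPTS:
        return "card_blocked"             # card retained by the ATM
    return "retry"                        # user may try again

if __name__ == "__main__":
    stored = "4321"
    attempts = 0
    for guess in ("1111", "2222", "4321"):   # two wrong attempts, then correct
        result = verify_pin(guess, stored, attempts)
        print(guess, "->", result)
        if result != "retry":
            break
        attempts += 1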
2.4.1 Disadvantages:
• Weak authentication: the card-and-PIN system alone provides a lower level of security.
2.5.1 Advantages:
CHAPTER-III
SYSTEM DESIGN
3.1 System Narration:
The current banking system is very popular for offering customers high-quality service 24 hours a day, but with low-quality security for transactions.
The traditional method of authentication at the ATM, the personal identification number (PIN), has stood the test of time, mainly due to its speed and low storage requirements, but it carries greater risk for customers and the bank. ATM security has often been compromised, hence the need to secure ATM transactions using a biometric fingerprint.
This project proposes to secure transactions at ATMs using a biometric fingerprint. The proposed system is an improvement on the existing system through the use of a biometric fingerprint and a BVN (Bank Verification Number) to secure transactions at ATMs.
The proposed new system will also be profitable, as it is based on the existing system.
Fig-1:System Architecture.
3.1.1 Algorithms:
1. Linear Regression.
2. Decision tree.
Linear Regression is a supervised machine learning algorithm where the predicted output is continuous and has a constant slope. It is used to predict values within a continuous range rather than trying to classify them into categories.
Decision trees are a non-parametric supervised learning method used for classification and regression. The goal is to create a model that predicts the value of a target variable by learning simple decision rules inferred from the data features.
A decision tree is drawn upside down with its root at the top. In the figure below, the bold text in black represents a condition/internal node, based on which the tree splits into branches/edges. The end of a branch that does not split any further is the decision/leaf; in this example, whether the passenger died or survived, represented as red and green text respectively.
Fig-3:Decision tree.
Root Node: It represents the entire population or sample, and it further gets divided into two or more homogeneous sets.
Decision Node: When a sub-node splits into further sub-nodes, it is called a decision node.
Leaf / Terminal Node: Nodes that do not split further are called leaf or terminal nodes.
Pruning: Removing sub-nodes of a decision node is called pruning; it can be seen as the opposite of splitting.
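To make the two algorithms above concrete, here is a minimal, hedged sketch using scikit-learn on invented toy data. The library choice and the data values are illustrative assumptions and are not part of the project code.

# Minimal illustration of Linear Regression and a Decision Tree (scikit-learn).
# The toy data below is invented purely for demonstration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeClassifier

# Linear regression: predict a continuous value from one feature.
X = np.array([[1], [2], [3], [4]])
y = np.array([2.0, 4.1, 6.0, 8.2])
reg = LinearRegression().fit(X, y)
print("slope:", reg.coef_[0], "prediction for 5:", reg.predict([[5]])[0])

# Decision tree: classify samples by learning simple decision rules.
features = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
labels = np.array([0, 1, 1, 0])          # XOR-like toy labels
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(features, labels)
print("predicted class for [1, 0]:", clf.predict([[1, 0]])[0])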
3.2 Modules
Most biometric technology systems use the same basic principles of operation. First, a person
must be registered, or enrolled, on the biometric system.
1. Enrollment:
The process by which a user's biometric data is initially acquired, accessed, processed, and
stored in the form of a template for ongoing use in a biometric system is called enrollment.
Subsequent verification and identification attempts are conducted against the template(s)
generated during enrollment.
2. Presentation:
The process by which a user provides biometric data to an acquisition device, for example by placing a finger on a fingerprint scanner, is called presentation.
3. Biometric data:
The biometric data a user provides is an unprocessed image or recording of a characteristic. This unprocessed data is also referred to as raw biometric data or as a biometric sample. Raw biometric data cannot be used to perform biometric matches. Instead, the biometric data provided by the user during enrollment and verification is used to generate biometric templates, and in almost every system it is discarded thereafter. Thus biometric systems do not store raw biometric data; they use the data only for template creation. Enrollment requires the creation of an identifier such as a username or ID. This identifier is normally generated by the user or administrator during entry of personal data. When the user returns to verify, he or she enters the identifier and then provides biometric data. Once biometric data has been acquired, biometric templates can be created by a process of feature extraction.
4. Feature extraction:
The automated process of locating and encoding distinctive characteristics from biometric data in order to generate a template is called feature extraction. Feature extraction takes place during enrollment and verification, i.e., any time a template is created. The feature extraction process includes filtering and optimization of images and data in order to accurately locate features. For example, voice-scan technologies generally filter certain frequencies and patterns, and finger-scan technologies often thin the ridges present in a fingerprint image to the width of a single pixel. Since the quality of feature extraction directly affects a system's ability to generate templates, it is extremely important to the performance of a biometric system.
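As an illustration of the enroll-then-verify template idea described above, the following hedged sketch uses OpenCV's ORB keypoint detector as a stand-in for a real minutiae-based fingerprint algorithm. The file names and the match threshold are assumptions, not part of the project.

# Illustrative enroll/verify sketch. ORB keypoints stand in for real
# fingerprint minutiae; file names and threshold are placeholders.
import cv2

def extract_template(image_path):
    """Feature extraction: locate and encode distinctive points as a template."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)
    orb = cv2.ORB_create()
    _keypoints, descriptors = orb.detectAndCompute(img, None)
    return descriptors                      # the stored "template"

def match_templates(enrolled, presented, threshold=40):
    """Verification: compare a fresh template against the enrolled one."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(enrolled, presented)
    return len(matches) >= threshold        # crude accept/reject decision

if __name__ == "__main__":
    enrolled = extract_template("enrolled_finger.png")    # captured at enrollment
    presented = extract_template("presented_finger.png")  # captured at the ATM
    print("verified" if match_templates(enrolled, presented) else "rejected")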
SDLC stands for software development life cycle. It is a process followed for software building
within a software organization. SDLC consists of a precise plan that describes how to develop,
maintain, replace, and enhance specific software. The life cycle defines a method for
improving the quality of software and the all-around development process.
SDLC is a cost-effective and time-efficient process that development teams use to design and build high-quality software.
Fig-5:Spiral model
The feasibility of the project is analyzed in this phase, and a business proposal is put forward with a very general plan for the project and some cost estimates. During system analysis, the feasibility study of the proposed system is carried out. This is to ensure that the proposed system is not a burden to the organization. For feasibility analysis, some understanding of the major requirements of the system is essential.
1. Economic Feasibility
2. Technical Feasibility
3. Social Feasibility
Economic feasibility: this study is carried out to check the economic impact the system will have on the organization. The amount of funds that the organization can put into the research and development of the system is limited, and the expenditures must be justified. The developed system is well within budget, which was achieved because most of the technologies used are freely available; only the customized products had to be purchased.
Technical feasibility: this study is carried out to check the technical feasibility, that is, the technical requirements of the system. Any system developed must not place a high demand on the available technical resources, as this would lead to high demands being placed on the client. The developed system must have modest requirements, since only minimal or no changes should be needed to implement it.
Social feasibility: this aspect of the study examines the level of acceptance of the system by the users. This includes the process of training the users to use the system efficiently. Users must not feel threatened by the system; instead, they must accept it as a necessity. The level of acceptance by the users depends solely on the methods employed to educate them about the system and to make them familiar with it. Their level of confidence must be raised so that they are also able to offer constructive criticism, which is welcomed, as they are the final users of the system. Social feasibility is a detailed study of how one interacts with others within a system or an organization. Social impact analysis is an exercise aimed at identifying and analyzing such impacts in order to understand the scale and reach of the project's social impacts. Cost-benefit analysis (CBA), however, is only one aspect of economic evaluation; the evaluation should ask broader questions to address socio-economic impacts overall.
Functional requirements describe what the system should do. The functional requirements can
be further categorized as follows:
The input design is the link between the information system and the user. It comprises developing specifications and procedures for data preparation and the steps necessary to put transaction data into a usable form for processing, which can be achieved by having the computer read data from a written or printed document or by having people key the data directly into the system. The design of input focuses on controlling the amount of input required, controlling errors, avoiding delays, avoiding extra steps, and keeping the process simple. The input is designed in such a way that it provides security and ease of use while retaining privacy.
Non-functional requirements (NFRs) describe the qualities the system must exhibit rather than specific behaviors. Common categories include:
1. Scalability
2. Reliability
3. Regulatory
4. Maintainability
5. Serviceability
6. Utility
7. Availability
8. Usability
9. Interoperability
10. Environmental
Fingerprint Icon:
Modeling: Describe how the fingerprint is captured and processed for verification.
Database Interaction:
Modeling: Explain the interaction between the fingerprint scanner and the database for
matching fingerprints.
Authorization Status:
Visualization: Incorporate visual cues like a green checkmark for successful authentication
and a red cross for unsuccessful attempts.
Modeling: Outline the logic behind determining and displaying the authorization status
based on fingerprint matching.
Transaction Process:
Visualization: Include images or representations of ATM components like card slots and
cash dispensers.
Modeling: Detail the steps involved in a typical ATM transaction, emphasizing the role of
fingerprint authentication in securing transactions.
Security Measures:
Visualization: Integrate security symbols such as locks or shields to convey the secure
nature of the fingerprint authentication process.
Modeling: Explain the use of encryption and secure protocols in communication between the fingerprint scanner and the ATM system to ensure data security (a brief sketch appears after this list).
User Feedback:
Visualization: Display feedback messages on the ATM screen to inform users of the
authentication status.
Modeling: Describe how the system provides response messages based on the success or
failure of each fingerprint authentication step.
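As flagged under Security Measures above, the following is a minimal sketch of one possible way to encrypt a fingerprint template before it travels from the scanner to the ATM back end, using the Python cryptography package's Fernet recipe. The key handling shown and the placeholder data are simplifying assumptions, not the project's actual design.

# Simplified illustration of encrypting a fingerprint template before it is
# sent from the scanner to the ATM back end (real key management is omitted).
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice the key would be provisioned securely
cipher = Fernet(key)

template_bytes = b"placeholder fingerprint template bytes"   # invented data
encrypted = cipher.encrypt(template_bytes)    # what travels over the network
decrypted = cipher.decrypt(encrypted)         # only the key holder can read it
assert decrypted == template_bytes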
The UML diagrams are categorized into structural diagrams, behavioral diagrams, and interaction diagrams. The diagrams are hierarchically classified in the following figure.
A use case diagram in the Unified Modeling Language (UML) is a type of behavioral diagram defined by and created from a use-case analysis. Its purpose is to present a graphical overview of the functionality provided by a system in terms of actors, their goals (represented as use cases), and any dependencies between those use cases. The main purpose of a use case diagram is to show which system functions are performed for which actor, and the roles of the actors in the system can be depicted.
Fig-7:Class Diagram
Fig-8:Sequence Diagram.
Activity diagrams describe the activities of a class. They are similar to state transition diagrams
and use similar conventions, but activity diagrams describe the behavior/states of a class in
response to internal processing rather than external events.
Fig-9:Activity Diagram.
A component diagram is used to break a large object-oriented system down into smaller components, so as to make them more manageable. It shows the physical view of a system, for example the executables, files, libraries, and so on that reside within a node.
It visualizes the relationships and the organization between the components present in the system and helps in forming an executable system. A component is a single unit of the system, which is replaceable and executable. The implementation details of a component are hidden, and it requires an interface to execute its functionality. It is like a black box whose behavior is explained by the provided and required interfaces.
Fig-10:Component Diagram.
The deployment diagram visualizes the physical hardware on which the software will be deployed. It depicts the static deployment view of a system and includes the nodes and their relationships.
It shows how the software is deployed on the hardware. It maps the software architecture created in design to the physical system architecture where the software will be executed as a node. Since it involves many nodes, the relationships are shown using communication paths.
Fig-11:Deployment diagram.
CHAPTER-IV
OVERVIEW OF TECHNOLOGIES
4 Programming Language:
Python is currently the most widely used multi-purpose, high-level programming language. It allows programming in both object-oriented and procedural paradigms. Python programs are generally smaller than those written in other programming languages such as Java. Programmers have to type relatively little, and the indentation requirements of the language keep the code readable. Python is used by almost all tech-giant companies such as Google, Amazon, Facebook, Instagram, Dropbox, Uber, etc.
Python concepts
If you are not interested in the hows and whys of Python, feel free to skip to the next section. This section tries to explain why Python is one of the best languages available and why it is a great one to start programming with. It is an open-source, general-purpose language.
Python is a high-level, interpreted, interactive and object-oriented scripting language. Python is designed to be highly readable: it uses English keywords frequently where other languages use punctuation, and it has fewer syntactical constructions than other languages.
Python is interactive − you can sit at a Python prompt and interact with the interpreter directly to write your programs.
Python is a beginner's language − Python is a great language for beginner-level developers and supports the development of a wide range of applications, from simple text processing to WWW browsers to games.
History of Python
Python was developed by Guido van Rossum in the late eighties and early nineties at the National Research Institute for Mathematics and Computer Science in the Netherlands.
Python is derived from many other languages, including ABC, Modula-3, C, C++, Algol-68, Smalltalk, the UNIX shell, and other scripting languages.
Python is copyrighted. Like Perl, Python source code is now available under the GNU General Public License (GPL).
1. Easy to learn − Python has few keywords, a simple structure, and a clearly defined syntax. This allows a student to pick up the language quickly.
2. Easy to read − Python code is clearly defined and visible to the eyes.
3. Easy to maintain − Python's source code is fairly easy to maintain.
4. A broad standard library − the bulk of Python's library is very portable and cross-platform compatible on UNIX, Windows, and Macintosh.
5. Interactive mode − Python has support for an interactive mode which allows interactive testing and debugging of snippets of code.
6. Portable − Python can run on a wide variety of hardware platforms and has the same interface on all platforms.
7. Extendable − you can add low-level modules to the Python interpreter. These modules enable programmers to add to or customize their tools to be more efficient.
8. Databases − Python provides interfaces to all major commercial databases.
9. GUI programming − Python supports GUI applications that can be created and ported to many system calls, libraries, and windowing systems, such as Windows MFC, Macintosh, and the X Window system of Unix.
10. Scalable − Python provides a better structure and support for large programs than shell scripting.
1. Extensive Libraries:
Python ships with an extensive library containing code for various purposes like regular expressions, documentation generation, unit testing, web browsers, threading, databases, CGI, email, image manipulation, and more. So we don't have to write the complete code for these tasks manually.
2. Extensible:
As we have seen earlier, Python can be extended to other languages. You can write some
of your code in languages like C++ or C. This comes in handy, especially in projects.
3. Embeddable:
Complimentary to extensibility, Python is embeddable as well. You can put your Python
code in your source code of a different language, like C++. This lets us add scripting
capabilities to our code in the other language.
4. Improved Productivity:
5. IOT Opportunities:
Since Python forms the basis of new platforms like Raspberry Pi, it finds the future bright
for the Internet of Things. This is a way to connect the language with the real world.
6. Readable:
Because it is not such a verbose language, reading Python is much like reading English.
This is the reason why it is so easy to learn, understand, and code. It also does not need
curly braces to define blocks, and indentation is mandatory. This further aids the readability
of the code.
7. Object-Oriented:
This language supports both the procedural and object-oriented programming paradigms.
While functions help us with code reusability, classes and objects let us model the real
world. A class allows the encapsulation of data and functions into one.
8. Free and Open-Source:
As we said earlier, Python is freely available. Not only can you download Python for free, but you can also download its source code, make changes to it, and even distribute it. It comes with an extensive collection of libraries to help you with your tasks.
9. Portable:
When you code your project in a language like C++, you may need to make some changes to
it if you want to run it on another platform. But it isn’t the same with Python.
You need to code only once, and you can run it anywhere. This is called Write Once Run Anywhere (WORA). However, you need to be careful not to include any system-dependent features.
10. Interpreted:
Lastly, we will say that it is an interpreted language. Since statements are executed one by
one, debugging is easier than in compiled languages.
So far, we’ve seen why Python is a great choice for your project. But if you choose it, you
should be aware of its consequences as well. Let’s now see the downsides of choosing
Python over another language.
1. Speed Limitations:
We have seen that Python code is executed line by line. But since Python is interpreted, it
often results in slow execution. This, however, isn’t a problem unless speed is a focal point
for the project. In other words, unless high speed is a requirement, the benefits offered by
Python are enough to distract us from its speed limitations.
3. Design Restrictions:
As you know, Python is dynamically typed. This means that you don't need to declare the type of a variable while writing the code; it uses duck typing. But wait, what's that? Well, it just means that if it looks like a duck, it must be a duck. While this is easy on the programmer during coding, it can raise run-time errors (a short illustrative sketch follows at the end of this list).
4. Underdeveloped Database Access Layers:
Compared to more widely used technologies like JDBC (Java Database Connectivity) and ODBC (Open Database Connectivity), Python's database access layers are a bit underdeveloped. Consequently, it is less often applied in huge enterprises.
5. Simple:
Python's simplicity can indeed be a problem. Take my example: I don't do Java, I'm more of a Python person. To me, its syntax is so simple that the verbosity of Java code seems unnecessary.
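As flagged under Design Restrictions above, here is a tiny hedged example of duck typing and the run-time error it can defer; the class and function names are invented for illustration.

# Duck typing: the function works for any object with the method it calls,
# but a mismatch is only discovered at run time.
class Duck:
    def quack(self):
        return "quack"

class Dog:
    def bark(self):
        return "woof"

def make_it_quack(animal):
    return animal.quack()          # no declared type; assumed to "look like a duck"

print(make_it_quack(Duck()))       # works
try:
    print(make_it_quack(Dog()))    # raises AttributeError at run time
except AttributeError as err:
    print("run-time error:", err)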
The data stored in memory can be of many types. For example, a person's age is stored as a numeric value and their address is stored as alphanumeric characters. Python has various standard data types that are used to define the operations possible on them and the storage method for each of them. Python has five standard data types −
• Numbers
• String
• List
• Tuple
• Dictionary
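A short illustration of the five standard data types listed above; the values are invented for demonstration.

# The five standard data types in one place (illustrative values only).
age = 23                                  # Number
name = "user1"                            # String
balances = [1500.0, 2300.5, 800.0]        # List (mutable sequence)
account = ("22091F0029", "savings")       # Tuple (immutable sequence)
user = {"name": name, "age": age}         # Dictionary (key-value mapping)
print(type(age), type(name), type(balances), type(account), type(user))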
Machine Learning:
The study of machine learning arose from research in artificial intelligence, but in the data science application of machine learning methods, it is more helpful to think of machine learning as a means of building models of data.
At the most fundamental level, machine learning can be categorized into two main types:
supervised learning and unsupervised learning.
Unsupervised learning involves modeling the features of a dataset without reference to any
label, and is often described as "letting the dataset speak for itself." These models include
tasks such as clustering and dimensionality reduction. Clustering algorithms identify distinct
groups of data, while dimensionality reduction algorithms search for more succinct
representations of the data.
Reinforcement Learning – This involves learning optimal actions through trial and error.
So the next action is decided by learning behaviors that are based on the current state and
that will maximize the reward in the future.
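To make the unsupervised tasks mentioned above concrete, here is a minimal, hedged scikit-learn sketch of clustering and dimensionality reduction on synthetic data; the data and parameter choices are illustrative assumptions, not part of the project.

# Clustering and dimensionality reduction on synthetic data (illustration only).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(5, 1, (50, 4))])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(data)
reduced = PCA(n_components=2).fit_transform(data)
print("cluster sizes:", np.bincount(labels))   # two groups of 50 points each
print("reduced shape:", reduced.shape)         # 100 points in 2 dimensions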
Human beings are, at this moment, the most intelligent and advanced species on earth because they can think, evaluate and solve complex problems. AI, on the other hand, is still in its initial stage and has not surpassed human intelligence in many aspects. The question, then, is: what is the need to make machines learn? The most suitable reason for doing this is "to make decisions, based on data, with efficiency and scale".
Machine Learning can review large volumes of data and discover specific trends and
patterns that would not be apparent to humans. For instance, for an e-commerce website like
Amazon, it serves to understand the browsing behaviors and purchase histories of its users
to help cater to the right products, deals, and reminders relevant to them.
With ML, you don't need to babysit your project every step of the way. Since it means giving machines the ability to learn, it lets them make predictions and also improve the algorithms on their own. A common example of this is anti-virus software, which learns to filter new threats as they are recognized. ML is also good at recognizing spam.
3. Continuous Improvement:
As ML algorithms gain experience, they keep improving in accuracy and efficiency. This
lets them make better decisions. Say you need to make a weather forecast model. As the
amount of data you have keeps growing, your algorithms learn to make more accurate
predictions faster.
4. Handling Multi-Dimensional and Multi-Variety Data:
Machine Learning algorithms are good at handling data that are multi-dimensional and multi-variety, and they can do this in dynamic or uncertain environments.
5. Wide Applications:
You could be an e-tailer or a healthcare provider and make ML work for you. Where it does
apply, it holds the capability to help deliver a much more personal experience to customers
while also targeting the right customers.
1. Data Acquisition:
Machine Learning requires massive data sets to train on, and these should be
inclusive/unbiased, and of good quality. There can also be times where they must wait for
new data to be generated.
2. Time and Resources:
ML needs enough time for the algorithms to learn and develop enough to fulfill their purpose with a considerable amount of accuracy and relevancy. It also needs massive resources to function, which can mean additional requirements for computing power.
3. Interpretation of Results:
Another major challenge is the ability to accurately interpret results generated by the
algorithms. You must also carefully choose the algorithms for your purpose.
4. High error-susceptibility:
Machine Learning is autonomous but highly susceptible to errors. Suppose you train an
algorithm with data sets small enough to not be inclusive. You end up with biased predictions
coming from a biased training set. This leads to irrelevant advertisements being displayed to
customers. In the case of ML, such blunders can set off a chain of errors that can go undetected
for long periods of time. And when they do get noticed, it takes quite some time to recognize
the source of the issue, and even longer to correct it.
1.Tensorflow:
TensorFlow is a free and open-source software library for dataflow and differentiable
programming across a range of tasks. It is a symbolic math library, and is also used
for machine learning applications such as neural networks. It is used for both research and
production at Google.
TensorFlow was developed by the Google Brain team for internal Google use. It was
released under the Apache 2.0 open-source license on November 9, 2015.
2.Numpy:
It is the fundamental package for scientific computing with Python. It contains various
features including these important ones:
3.Pandas:
4.Matplotlib:
For simple plotting, the pyplot module provides a MATLAB-like interface, particularly when combined with IPython. For the power user, you have full control of line styles, font properties, axes properties, etc., via an object-oriented interface or via a set of functions familiar to MATLAB users.
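A brief, hedged example tying NumPy, Pandas and Matplotlib together; the match-score numbers and the output file name are invented for illustration.

# Tiny end-to-end use of NumPy, Pandas and Matplotlib with invented data.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

scores = np.array([62, 71, 80, 88, 93])               # NumPy array of match scores
df = pd.DataFrame({"attempt": range(1, 6), "score": scores})
print(df.describe())                                   # Pandas summary statistics

plt.plot(df["attempt"], df["score"], marker="o")       # Matplotlib (pyplot) line plot
plt.xlabel("attempt")
plt.ylabel("match score")
plt.title("Fingerprint match score per attempt")
plt.savefig("match_scores.png")                        # write the figure to disk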
5. Scikit-learn:
4.4 Normalization
Data dependencies are logical; all related data items are stored together. Normalization is important for many reasons, but chiefly because it allows databases to take up as little disk space as possible, resulting in increased performance.
Table:3 1NF
Tables in 2NF must be in 1NF and must not have any partial dependency (i.e., every non-prime attribute must be fully dependent on the table's primary key).
Table:4 2NF
2 12 ******** Sign_up
3 13 ******** Sign_up
4 14 ******** Sign_up
5 15 ******** Sign_up
Tables in 3NF must be in 2NF and have no transitive functional dependencies on the primary key. The following two normal forms also exist but are only rarely used.
A stricter version of 3NF, the Boyce-Codd Normal Form (BCNF) is used to address the anomalies that can result if more than one candidate key exists. Also called 3.5 Normal Form, a table in BCNF must be in 3NF, and in every functional dependency (X → Y), X must be a super key.
For a table to be in 4NF, it must be in BCNF and must not have any multi-valued dependency.
The first three normal forms were derived in the 1970s by the father of the relational data model, E. F. Codd. Almost all of the current relational database engines use his rules.
Insertion Anomaly: an insertion anomaly occurs when one cannot insert a new tuple into a relation due to a lack of data.
Deletion Anomaly: a deletion anomaly refers to the situation where the deletion of data results in the unintended loss of some other important data.
Update Anomaly: an update anomaly occurs when an update of a single data value requires multiple rows of data to be updated.
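To illustrate the normalization and anomaly discussion above, here is a hedged sketch using Python's built-in sqlite3 module. The table and column names are invented for illustration and are not the project's actual MySQL schema.

# Illustration of normalized tables using Python's built-in sqlite3 module.
# Splitting user data and login events avoids the insert/update/delete
# anomalies described above. Table and column names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE users (
                   user_id INTEGER PRIMARY KEY,
                   username TEXT UNIQUE NOT NULL,
                   fingerprint_hash TEXT NOT NULL)""")
cur.execute("""CREATE TABLE logins (
                   login_id INTEGER PRIMARY KEY,
                   user_id INTEGER NOT NULL REFERENCES users(user_id),
                   action TEXT NOT NULL)""")

cur.execute("INSERT INTO users (username, fingerprint_hash) VALUES (?, ?)",
            ("user1", "placeholder_hash"))
cur.execute("INSERT INTO logins (user_id, action) VALUES (?, ?)", (1, "Sign_up"))
conn.commit()

# A username change now touches exactly one row, avoiding update anomalies.
cur.execute("UPDATE users SET username = ? WHERE user_id = ?", ("user1_new", 1))
print(cur.execute("SELECT * FROM users").fetchall())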
CHAPTER-V
Testing
5.1 Testing Methodologies
The following are the Testing Methodologies:
Unit Testing.
Integration Testing.
User Acceptance Testing.
Output Testing.
Validation Testing.
Integration testing addresses the issues associated with the dual problems of verification and program construction. After the software has been integrated, a set of high-order tests is conducted. The main objective of this testing process is to take unit-tested modules and build a program structure that has been dictated by the design.
1. Top-Down Integration
This method is an incremental approach to the construction of the program structure. Modules are integrated by moving downward through the control hierarchy, beginning with the main program module. The modules subordinate to the main program module are incorporated into the structure in either a depth-first or breadth-first manner. The software is tested from the main module, and individual stubs are replaced as the test proceeds downwards.
2. Bottom-Up Integration
This method begins construction and testing with the modules at the lowest level of the program structure. Since the modules are integrated from the bottom up, the processing required for modules subordinate to a given level is always available and the need for stubs is eliminated. The bottom-up integration strategy may be implemented with the following steps:
1. The low-level modules are combined into clusters that perform a specific software sub-function.
2. A driver (i.e., a control program for testing) is written to coordinate test case input and output.
3. The cluster is tested.
4. Drivers are removed and clusters are combined, moving upward in the program structure.
5. The modular approach tests each module individually, and then each module is integrated with a main module and tested for functionality.
User acceptance of a system is the key factor for the success of any system. The system under consideration is tested for user acceptance by constantly keeping in touch with the prospective system users at the time of development, making changes wherever required. The developed system provides a friendly user interface that can easily be understood even by a person new to the system.
After performing the validation testing, the next step is output testing of the proposed system, since no system can be useful if it does not produce the required output in the specified format. The outputs generated or displayed by the system under consideration are tested by asking the users about the format they require. Hence the output format is considered in two ways: one on screen and the other in printed form.
Validation Checking
A strategy for software testing integrates software test case design methods into a well-planned series of steps that result in the successful construction of software. A strategy for software testing must accommodate low-level tests that are necessary to verify that a small source code segment has been correctly implemented, as well as high-level tests that validate major system functions against customer requirements.
System Testing:
Software, once validated, must be combined with other system elements (for example hardware, people, and databases). System testing verifies that all the elements mesh properly and that overall system function and performance is achieved.
A test case is a document which has a set of test data, preconditions, expected results and postconditions, developed for a particular test scenario in order to verify compliance with a specific requirement. A test case acts as the starting point for test execution; after applying a set of input values, the application has a definitive outcome and leaves the system at some end point, also known as the execution postcondition.
Table:7 Test case results

S. No. | Test Input | Expected Behavior | Observed Behavior | Status (P = Passed, F = Failed)
1 | Login as user or admin with correct login details | Administrator or user home page should be displayed | Got expected result | P
2 | Login as user or admin with wrong login details | Error message should be displayed | Got expected result | P
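The two cases in Table 7 can also be expressed as automated tests. The sketch below is hedged: it runs against a stubbed authenticate() helper invented for illustration, not against the project's actual Flask application.

# The two cases from Table 7 written as automated tests against a stubbed
# authenticate() helper. The helper and its data are hypothetical.
import unittest

USERS = {"admin": ("secret", "fp_hash_1")}   # username -> (password, fingerprint hash)

def authenticate(username, password, fp_hash):
    stored = USERS.get(username)
    return stored is not None and stored == (password, fp_hash)

class LoginTests(unittest.TestCase):
    def test_correct_login_details(self):
        # Expected behaviour: home page shown, i.e. authentication succeeds.
        self.assertTrue(authenticate("admin", "secret", "fp_hash_1"))

    def test_wrong_login_details(self):
        # Expected behaviour: error message shown, i.e. authentication fails.
        self.assertFalse(authenticate("admin", "wrong", "fp_hash_1"))

if __name__ == "__main__":
    unittest.main()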
CHAPTER-VI
FUTURE ENHANCEMENT
Future Work
Improvement of the existing system through the use of a biometric fingerprint to secure transactions at ATMs. This system will also be profitable, as it is based on the existing system and thus reduces the cost of research work.
Adding a card reader for people who are still using traditional methods.
Encryption: Ensure that fingerprint data is encrypted during transmission and storage
to prevent unauthorized access.
Alerts and Notifications: Implement real-time alerts for unusual activities or multiple
failed authentication attempts to enhance security monitoring.
CHAPTER-VII
CONCLUSION
In today's modern world, autonomous systems play an important role in our day-to-day life. As social computerization and automation have drastically increased, this can be seen clearly in the way the number of ATM centers is increasing rapidly. Most people use ATMs regularly, for example for financial transactions and easy money exchange, so security becomes an important factor.
The security features were enhanced largely for the stability and reliability of owner
recognition. The whole system was built on the fingerprint technology which makes the
system safer, reliable and easy to use.
The implementation of ATM protection using fingerprints also retains the traditional verification methods: the client's fingerprint is enrolled, forwarded by the administrator, and checked correctly.
APPENDIX
SCREENSHOTS
Sample Screens:
All existing banking applications authenticate users based on a PIN or password, but this technique is not secure, so in the proposed online banking application we authenticate the user based on his or her fingerprint. To implement this project we have designed the following modules:
1) Signup: using this module the user can sign up with the application by providing a username, password and fingerprint image. All signup details are saved in the MySQL database.
2) Login: using this module the user can log in to the application by entering the username, password and the fingerprint image given at signup time to authenticate himself or herself.
3) Deposit: after successful authentication the user can deposit an amount, and it is added to his or her account.
4) Withdraw: using this module the user can withdraw an amount if sufficient balance is available.
5) View Balance: using this module the user can view the available balance.
First create the database in MySQL by copying the contents of 'DB.txt' and pasting them into MySQL.
To run the project, double-click the 'run.bat' file to start the Python Flask server.
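Before the screenshots, here is a minimal, hedged sketch of the signup/login idea behind the modules above: Flask routes that store and compare a SHA-256 hash of the uploaded fingerprint image. An in-memory dictionary stands in for the project's MySQL database, and the route and field names are assumptions rather than the project's exact code; note that exact-hash matching only succeeds when the very same stored image is presented again, unlike true fingerprint matching.

# Minimal sketch of the signup/login idea: Flask routes that store and compare
# a hash of the uploaded fingerprint image. An in-memory dict stands in for
# the project's MySQL database; route and field names are assumptions.
import hashlib
from flask import Flask, request

app = Flask(__name__)
users = {}   # username -> {"password": ..., "fp_hash": ...}

def fingerprint_hash(file_storage):
    return hashlib.sha256(file_storage.read()).hexdigest()

@app.route("/signup", methods=["POST"])
def signup():
    users[request.form["username"]] = {
        "password": request.form["password"],
        "fp_hash": fingerprint_hash(request.files["fingerprint"]),
    }
    return "Signup process completed"

@app.route("/login", methods=["POST"])
def login():
    user = users.get(request.form["username"])
    if (user and user["password"] == request.form["password"]
            and user["fp_hash"] == fingerprint_hash(request.files["fingerprint"])):
        return "Login successful"
    return "Login failed", 401

if __name__ == "__main__":
    app.run(port=5000)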
Screenshot:1 In the above screen the server has started; now open the browser, enter the URL 'http://localhost:5000/index' and press the enter key to get the screen below.
Screenshot:2 In the above screen, click on the 'Signup Here' link to get the screen below.
Screenshot:3 In the above screen, fill in all the signup details, then choose the fingerprint image and click the 'Open' button to load the image and get the screen below.
Screenshot:4 In the above screen, after pressing the 'Register' button we get the message 'Signup process completed'; now click on the 'Login Here' link to get the screen below.
Screenshot:5 In the above screen I am logging in and selecting the wrong fingerprint image '4.png', then clicking the 'Open' button to get the screen below.
Screenshot:6 In the above screen the image has loaded; now click the 'Login' button to get the output below.
Screenshot:7 In the above screen the login has failed; now log in with the correct image.
Screenshot:8 In the above screen I am uploading the correct image and pressing the 'Login' button to get the output below.
Screenshot:9 In the above screen the user login is successful and we get the deposit and withdraw options.
BIBLIOGRAPHY
[1] Ranjit Mane, Sagar Chavan, Trushali Birambole, Asmita Kamble, "Fingerprint Based ATM System", International Journal for Research Trends and Innovation (IJRTI), ISSN 2456-3315, Volume 4, Issue 4, 2017.
[2] Saima Rafat Bhandari, Zarina Begum K Mundargi, "A Review on Securing ATM System Using Fingerprint", International Journal of Scientific Research in Computer Science, Engineering and Information Technology (IJSRCSEIT), ISSN 2456-3307, Volume 3, Issue 2, 2018.
[3] Christiawan, Bayu Aji Sahar, Azel Fayyad Rahardian, Elvayandri Muchtar, "Fingershield ATM – ATM Security System Using Fingerprint Authentication", Issue 4.
[4] Vijayraj A, "A Survey on Cardless Cash Access Using Biometric ATM Security System", Scholars Journal of Engineering and Technology (SJET), ISSN 2347-9523, Issue 2.
[5] Moses Okechukwu Onyesolu, Ignatius M. Ezeani, "ATM Security Using Fingerprint Biometric Identifier: An Investigative Study", International Journal of Advanced Computer Science and Applications (IJACSA), 030412, 2012.
[6] Samayita Bhattacharya, Kalyani Mali, "Fingerprint Recognition Using Minutiae Extraction Method", International World Wide Web Conference Committee (IW3C2), published under Creative Commons CC BY 4.0, January 2011.