Batch No 14
ABSTRACT
Fake currency is money produced without the approval of the government, and its creation is considered a serious offence. Advances in colour printing technology have increased the rate of fake currency note printing on a very large scale. Years ago, such printing could only be done in a print house, but now anyone can print a currency note with considerable accuracy using a simple laser printer. As a result, the circulation of fake notes in place of genuine ones has increased greatly. It is one of the biggest problems faced by many countries, including India. Though banks and other large organizations have installed automatic machines to detect fake currency notes, it is genuinely difficult for an average person to distinguish between the two. This has contributed to the growth of corruption in our country, hindering the country's development. Some of the methods used to detect fake currency are watermarking, optically variable ink, security threads, latent images, and techniques such as counterfeit detection pens. We hereby propose an application for detecting fake currency in which image processing is used to detect fake notes. We detect the variation in the barcode between the real and the fake note, and we also find dissimilarities between the image under consideration and a genuine prototype. CNN classifiers are used to detect fake currency. The proposed app for fake currency detection is simple, accurate and easy to use.
TABLE OF CONTENTS
● Certificates
● Acknowledgement
● Abstract
● Figures/Tables
CHAPTER-1: INTRODUCTION
CHAPTER-2: LITERATURE SURVEY
CHAPTER-3: SYSTEM ANALYSIS
CHAPTER-4: SYSTEM REQUIREMENTS
CHAPTER-5: SYSTEM STUDY
CHAPTER-6: SYSTEM DESIGN
CHAPTER-7
CHAPTER-8: IMPLEMENTATION
8.1 MODULES
CHAPTER-9: SOFTWARE ENVIRONMENT
9.1 PYTHON
CHAPTER-10: RESULTS/DISCUSSIONS
CHAPTER-11: CONCLUSION
CHAPTER-12: REFERENCES/BIBLIOGRAPHY
LIST OF FIGURES
1 Architecture of system
2 Data flow diagram
3 UML diagrams
4 Use case
5 Class
6 Sequence
7 Collaboration
8 Activity
9 Unsupervised learning
10 CNN
11 Command prompt
12 HTTP request link
13 App in the browser
14 The web application front page
15 Python application installation
16 Python setup and success logs
17 Output screen 1
18 Output screen 2
19 Output screen 3
20 Output screen 4
EVALUATION OF MACHINE LEARNING ALGORITHMS FOR THE
DETECTION OF FAKE BANK CURRENCY
Bachelor of Technology
in
Computer Science & Engineering
by
DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING
CERTIFICATE
This is to certify that the dissertation entitled “EVALUATION OF MACHINE LEARNING ALGORITHMS FOR THE DETECTION OF FAKE BANK CURRENCY”, being submitted by Vangala Santhosh Reddy, bearing Roll No: 20X31A6249, to Jawaharlal Nehru Technological University Hyderabad in partial fulfillment of the requirements for the award of the degree of Bachelor of Technology in Computer Science & Engineering, is a record of bonafide work carried out by him. The results of the investigations enclosed in this report have been verified and found satisfactory. The results embodied in this dissertation have not been submitted to any other University or Institute for the award of any degree.
SRI INDU INSTITUTE OF ENGINEERING & TECHNOLOGY
(Affiliated to JNTUH, Hyderabad, Approved by AICTE, New Delhi)
Sheriguda (V), Ibrahimpatnam (M), R.R.Dist., Telangana- 501510.
DECLARATION
I hereby declare that the dissertation entitled “EVALUATION OF MACHINE LEARNING ALGORITHMS FOR THE DETECTION OF FAKE BANK CURRENCY”, carried out under the guidance of my project guide, is submitted to Jawaharlal Nehru Technological University Hyderabad in partial fulfillment of the requirements for the award of the degree of Bachelor of Technology in Computer Science & Engineering. This is a record of bonafide work carried out by me, and the results embodied in this dissertation have not been reproduced or copied from any source. The results embodied in this dissertation have not been submitted to any other University or Institute for the award of any other degree.
CHAPTER-1
INTRODUCTION
Computers and mobile phones have become an unavoidable part of our lives, and there are many things we can do with these technologies. With the rapid development of mobile phones and technologies come several services, such as application creation (the process of making application software for handheld and desktop devices such as mobile phones, personal computers and personal digital assistants). Through the use of apps, the user is provided with various features that enable him to fulfil his needs and much more; apps should be interactive for their users. Camera/webcam services include the use of camera services for processing various aspects of an image. Fake currency detection is a system that can be used to overcome the difficulty most people and institutions face in distinguishing counterfeit currency (imitation currency produced without the legal sanction of the state or government, usually in a deliberate attempt to imitate that currency and deceive its recipient) from real currency. The project makes use of the digital image processing domain: digital image processing is the use of computer algorithms to perform image processing on digital images.
CHAPTER-2
LITERATURE SURVEY
The paper titled “Fake Currency Detection using Basic Python Programming and Web Framework” (2020), presented by Prof. Chetan More, Monu Kumar, Rupesh Chandra and Raushan Singh, proposes a system that makes use of the Flask web framework (Flask is a micro web framework for Python) and is written in the Python programming language.
The paper titled “Detection of Counterfeit Indian Currency Note Using Image Processing”, presented by Vivek Sharan and Amandeep Kaur in 2019, describes the detection of counterfeit Indian currency notes using image processing. In this paper, three major features were taken into consideration: the latent image, the logo of the RBI, and the denomination numeral with the Rupee symbol in the colour part of the currency note. Using these three features, they applied an algorithm which detects counterfeit Indian currency notes.
The paper titled “Indian Paper Currency Detection”, presented by Aakash S. Patil in 2019, introduced a new technique to improve recognition ability and transaction speed in classifying Indian currency. It involved the OpenCV library of computer vision functions, mainly aimed at real-time computer vision and covering functions such as note identification, segmentation and recognition; the NumPy module of Python for numerical processing; argparse to parse command-line arguments; and cv2 for the OpenCV bindings.
The paper titled “Identification of Fake Notes and Denomination Recognition”, presented by Archana M R, Kalpitha C P, Prajwal S K and Pratiksha N in 2018, proposed identification of fake notes and denomination recognition to reduce human effort. The system is mainly divided into two halves: a currency recognition system and a conversion system. They made use of a software interface which could be utilized for different types of monetary standards.
The paper titled “Fake Currency Detection using Image Processing”, presented by S. Atchaya, K. Harini, G. Kaviarasi and B. Swathi in 2017, gave a technique called the performance matrix for fake currency detection using MATLAB image processing. Neural networks and model-based reasoning are the two methods behind this technique. Various methods such as watermarking, optically variable ink and fluorescence are used to detect fake currency in this paper.
CHAPTER -3
SYSTEM ANALYSIS
From the observation of the papers, we can say that certain stages are very important in the existing system architecture. The first is image acquisition: the input image is taken only through a scanner, and no digital camera is used to capture the image in a real-time system. In the existing architecture, only the front part of the note is taken into consideration, not the rear. The next step is pre-processing, which involves three to four sub-stages, such as grayscale conversion, edge detection and segmentation.
Existing fake bank currency detection systems have their limitations and disadvantages, which can include:
False Positives: One of the most significant drawbacks of fake currency detection systems
is the potential for false positives. Legitimate currency notes may be flagged as fake, causing inconvenience
to users and businesses.
False Negatives: Conversely, there is also the risk of false negatives, where counterfeit
notes go undetected by the system, leading to the circulation of counterfeit currency.
Limited Detection Methods: Many fake currency detection systems rely on a limited set of
detection methods, such as UV (Ultraviolet) and magnetic ink detection. Counterfeiters may use more
sophisticated methods, making it difficult for these systems to catch advanced counterfeit bills.
Cost: High-quality, multi-modal detection systems can be costly, especially for smaller
businesses. This cost may deter some from investing in effective counterfeit detection technology.
Maintenance: The systems require regular maintenance and calibration to ensure accuracy.
Neglecting maintenance can lead to decreased effectiveness over time.
3.2.1 ADVANTAGES
Detecting fake bank currency using machine learning algorithms offers several advantages:
Improved Accuracy: Machine learning algorithms can be trained on a large dataset of genuine and
counterfeit currency notes, enabling them to learn intricate patterns and features that may be difficult for
human operators or traditional detection methods to discern. This leads to higher accuracy in identifying
counterfeit notes.
Real-time Detection: Machine learning systems can process and analyze currency notes in real-time,
providing rapid results. This is particularly beneficial in high-volume environments like banks, retail stores,
and ATMs.
Adaptability: Machine learning algorithms can adapt to new counterfeit techniques and variations as they
encounter them. This adaptability makes them more robust against evolving counterfeit methods.
Reduced False Positives: ML algorithms can be fine-tuned to minimize false positives, reducing the chances
of genuine notes being flagged as counterfeit and causing inconvenience to users.
Scalability: Machine learning systems can be easily scaled to accommodate different currency types and
denominations, making them versatile for use in various countries and settings.
Multimodal Detection: ML algorithms can utilize multiple modalities for detection, including visual
analysis, infrared, ultraviolet, and more, enhancing their ability to spot counterfeit currency.
Continuous Improvement: As more data is collected and more counterfeit notes are detected, machine
learning algorithms can continuously improve their performance, refining their ability to identify fake
currency over time.
Reduced Human Error: ML-based systems are less prone to human error, making them more reliable in
detecting counterfeit currency consistently.
Integration with Existing Systems: Machine learning algorithms can be integrated with existing
point-of-sale (POS) systems, ATMs, and other financial equipment, making it easier for businesses to
implement counterfeit detection measures.
Cost Savings: Over time, machine learning systems can be cost-effective, as they require less human
intervention, resulting in potential cost savings for businesses.
CHAPTER-4
SYSTEM REQUIREMENTS
The functional requirements for detecting fake bank currency using machine learning algorithms should
encompass a range of features and capabilities to ensure effective counterfeit detection. Here are some key
functional requirements:
Data Collection: Ability to collect and maintain a comprehensive dataset of genuine and counterfeit currency samples for training and testing.
Training and Model Development: Capable of training machine learning models using the collected data to distinguish between genuine and counterfeit currency, with the ability to fine-tune and update models as new counterfeit techniques emerge.
Currency Compatibility: Support for various currency types, denominations and designs, ensuring versatility in different regions.
Multimodal Sensing: Ability to incorporate multiple modalities for detection, including visual analysis, ultraviolet (UV), infrared (IR), magnetic ink, and other relevant detection techniques.
Real-time Processing: Capability to process currency notes in real time to ensure swift detection and minimize delays in transactions.
Adaptability: Ability to adapt and evolve to new counterfeit techniques and variations over time.
Accuracy: High accuracy in distinguishing between genuine and counterfeit currency, with a low rate of false positives and false negatives.
User Interface: User-friendly interface for operators or end-users to interact with the system, providing clear feedback on detected currency authenticity.
Integration: Compatibility with existing point-of-sale (POS) systems, ATMs, and other financial equipment for seamless integration into banking and retail operations.
Scalability: Ability to handle high transaction volumes in busy environments, such as banks and retail stores.
Hardware requirements:
RAM - 4 GB (minimum)
Hard Disk - 500 GB
Processor - Intel i5
CHAPTER-5
SYSTEM STUDY
The feasibility of the project is analyzed in this phase and a business proposal is put forth with a very
general plan for the project and some cost estimates. During system analysis the feasibility study of the
proposed system is to be carried out. This is to ensure that the proposed system is not a burden to the
company. For feasibility analysis, some understanding of the major requirements for the system is essential.
Three key considerations involved in the feasibility analysis are,
● ECONOMIC FEASIBILITY
● TECHNICAL FEASIBILITY
● SOCIAL FEASIBILITY
ECONOMIC FEASIBILITY
This study is carried out to check the economic impact that the system will have on the organization. The amount of funds that the company can pour into the research and development of the system is limited, so the expenditures must be justified. The developed system is well within the budget, and this was achieved because most of the technologies used are freely available; only the customized products had to be purchased.
TECHNICAL FEASIBILITY
This study is carried out to check the technical feasibility, that is, the technical requirements of the system. Any system developed must not place a high demand on the available technical resources, as this would lead to high demands being placed on the client. The developed system must have modest requirements, needing only minimal or no changes for its implementation.
SOCIAL FEASIBILITY
This aspect of the study checks the level of acceptance of the system by the user. This includes the process of training the user to use the system efficiently. The user must not feel threatened by the system, but must instead accept it as a necessity. The level of acceptance by the users depends on the methods employed to educate the user about the system and to make him familiar with it. His level of confidence must be raised so that he is also able to offer constructive criticism, which is welcomed, as he is the final user of the system.
CHAPTER-6
SYSTEM DESIGN
6.2 UML DIAGRAMS
6.2.1 USE CASE DIAGRAM
A use case diagram in the Unified Modeling Language (UML) is a type of behavioral diagram defined by
and created from a Use-case analysis. Its purpose is to present a graphical overview of the functionality
provided by a system in terms of actors, their goals (represented as use cases), and any dependencies between
those use cases. The main purpose of a use case diagram is to show what system functions are performed for
which actor. Roles of the actors in the system can be depicted
6.2.2 CLASS DIAGRAM
In software engineering, a class diagram in the Unified Modeling Language (UML) is a type of static structure diagram that describes the structure of a system by showing the system's classes, their attributes, operations (or methods), and the relationships among the classes. It shows which class contains which information.
6.2.3 SEQUENCE DIAGRAM
A sequence diagram in Unified Modeling Language (UML) is a kind of interaction diagram that
shows how processes operate with one another and in what order. It is a construct of a Message
Sequence Chart. Sequence diagrams are sometimes called event diagrams, event scenarios, and timing
diagrams.
6.2.4 COLLABORATION DIAGRAM
In UML diagrams, a collaboration is a type of structured classifier in which roles and attributes co-operate to define the internal structure of a classifier. You use a collaboration when you want to define only the roles and connections that are required to accomplish a specific goal of the collaboration.
6.2.5 ACTIVITY DIAGRAM
Activity diagrams are graphical representations of workflows of stepwise activities and actions with support for choice, iteration and concurrency. In the Unified Modeling Language, activity diagrams can be used to describe the business and operational step-by-step workflows of components in a system. An activity diagram shows the overall flow of control.
CHAPTER-7
Designing a system to detect fake bank currency using machine learning algorithms involves multiple steps.
Here's an outline of the process, from data collection to model deployment:
1. Data Collection:
Gather a diverse dataset of both genuine and fake banknotes. This dataset should include images or data
points related to various features of the banknotes that can be used for analysis. These features might include
security features like watermarks, holograms, and UV patterns.
2. Data Preprocessing:
Clean and preprocess the data, ensuring that it is consistent and free from noise. Data preprocessing steps
may include resizing images, converting them to grayscale, normalizing pixel values, and extracting relevant
features.
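As an illustration of this step, the following minimal sketch (assuming OpenCV is available; the file path is purely hypothetical) resizes a note image, converts it to grayscale and normalizes pixel values:
import cv2

def preprocess(path, size=(64, 64)):
    img = cv2.imread(path)                        # read the banknote image (BGR)
    img = cv2.resize(img, size)                   # standardize dimensions
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # convert to grayscale
    gray = cv2.medianBlur(gray, 3)                # suppress salt-and-pepper noise
    return gray.astype('float32') / 255.0         # normalize pixel values to [0, 1]

sample = preprocess('dataset/real/note1.jpg')     # hypothetical path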
3. Feature Engineering:
Extract relevant features from the banknote images or data points. These features might include texture,
colour, shape, or statistical characteristics. You can use techniques like edge detection, texture analysis, or
colour histograms.
4. Data Splitting:
Divide the dataset into three subsets: training, validation, and test sets. Typically, an 80-20 or 70-30 split is
used, with the majority of the data allocated for training.
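One possible way to realize such a split with scikit-learn (X and y are assumed to be the prepared feature array and labels from the previous steps) is sketched below:
from sklearn.model_selection import train_test_split

# X: feature matrix, y: labels (0 = fake, 1 = genuine), assumed prepared earlier
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)   # 80-20 split
X_train, X_val, y_train, y_val = train_test_split(
    X_train, y_train, test_size=0.25, random_state=42, stratify=y_train)
# 0.25 of the remaining 80% yields a 60/20/20 train/validation/test split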
5. Model Selection:
Choose an appropriate machine learning algorithm for the task. Common choices include Support Vector
Machines (SVM), Random Forests, Convolutional Neural Networks (CNNs), or Gradient Boosting.
6. Model Training:
Train the selected model on the training data. Ensure that you use the appropriate loss function, and monitor
the model's performance on the validation set. You may need to adjust hyperparameters like learning rates
and regularisation terms.
7. Model Evaluation:
Assess the model's performance using various evaluation metrics, such as accuracy, precision, recall,
F1-score, and ROC-AUC. Additionally, consider confusion matrices to understand false positives and false
negatives.
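A sketch of this evaluation step with scikit-learn metrics (model, X_test and y_test are assumed from the earlier steps; the classifier is assumed to expose predict_proba) might look like:
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score, confusion_matrix)

y_pred = model.predict(X_test)                 # hard class predictions
y_score = model.predict_proba(X_test)[:, 1]    # probability of the positive class
print('Accuracy :', accuracy_score(y_test, y_pred))
print('Precision:', precision_score(y_test, y_pred))
print('Recall   :', recall_score(y_test, y_pred))
print('F1-score :', f1_score(y_test, y_pred))
print('ROC-AUC  :', roc_auc_score(y_test, y_score))
print(confusion_matrix(y_test, y_pred))        # rows: true class, columns: predicted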
8. Hyperparameter Tuning:
Fine-tune the model's hyperparameters based on the validation performance. This process may require
multiple iterations.
9. Model Testing:
Once the model's performance is satisfactory on the validation set, evaluate it on the test set to assess its
real-world performance.
10. Deployment:
Deploy the trained model in a production environment. This can be done through web applications, APIs, or
integration into banknote processing machines.
The primary objective of detecting fake bank currency using machine learning algorithms is to create a reliable and accurate system for distinguishing genuine currency from counterfeit currency. The following are specific objectives associated with this task:
1. High Accuracy:
Develop a machine learning model that can achieve a high level of accuracy in distinguishing genuine currency from counterfeit currency.
2. Real-time Detection:
Enable real-time detection of fake banknotes, ensuring that the system can quickly process and verify currency within a reasonable time frame.
3. Versatility:
Create a system that can detect various types of counterfeit techniques, including those involving image manipulation, printing errors, and fraudulent security features.
4. Generalization:
Ensure that the model can generalize well to detect fake banknotes from different countries and with various denominations.
5. Robustness:
Design a system that remains effective in the presence of variations in lighting, angles, and conditions commonly encountered in real-world scenarios.
6. Data Security:
Implement strong data security measures to protect sensitive information, ensuring that banknote images are not misused or compromised.
7. Scalability:
Develop a system that can be easily integrated into various platforms, such as ATMs, point-of-sale systems, and mobile apps.
8. Adaptability:
Allow for regular updates and adaptability to new counterfeit methods as counterfeiters continually evolve their techniques.
9. User-Friendly Interface:
Provide a user-friendly interface for operators and end-users to easily interact with the system and understand the results.
7.2 OUTPUT DESIGN
The output design for the detection of fake bank currency using machine learning algorithms involves the
presentation and communication of results to users, administrators, and relevant stakeholders. Here's how
you can design the output to effectively convey the findings of the currency authentication system:
1. Authentication Decision:
Clearly communicate the authentication decision for each banknote, indicating whether it is genuine or potentially fake. This can be a binary "Genuine/Fake" classification.
2. Confidence Score:
Provide a confidence score or probability associated with the decision. This score helps users understand the system's level of confidence in its decision.
3. Image Visualization:
Display the banknote image or a representation of it with any pertinent annotations, such as areas of suspicion or security features that have been examined.
4. User Feedback:
Provide feedback to end-users, such as a display message or indicator, indicating the outcome of the verification process. For example, "Authentic" or "Please contact a supervisor."
5. Audit Trail:
Maintain an auditable log and records of all authentication decisions, including timestamps, images, and related data, for accountability and auditing purposes.
6. Explanation of Decision:
Offer explanations of the system's decision, highlighting specific features or patterns that contributed to the verdict, to build trust and understanding.
7. Seamless Integration:
Ensure seamless integration of the output with existing systems used by financial institutions or businesses, facilitating data flow and real-time updates.
8. User-Friendly Interface:
Design a user-friendly interface with intuitive visuals and instructions that simplifies the interpretation of results for operators, cashiers, or end-users interacting with the system.
CHAPTER-8
IMPLEMENTATION
8.1 MODULES
Detecting fake banknotes using machine learning algorithms is a challenging yet important task. To create a
system that can evaluate the authenticity of banknotes, you will need various modules and techniques. Here's
an outline of the key modules and steps you might consider:
Data Collection and Preprocessing:
● Gather a dataset of genuine and counterfeit banknotes. Ensure that the dataset is diverse and includes
different denominations and variations of counterfeit banknotes.
● Preprocess the images, removing noise, standardizing dimensions, and enhancing the quality if
necessary.
Feature Extraction:
● Extract relevant features from the banknote images. Common features include texture, color, and
patterns.
● Use techniques like Gabor filters, Histogram of Oriented Gradients (HOG), and Local Binary Pattern
(LBP) to capture distinctive characteristics.
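As a hedged sketch of these two techniques using scikit-image (the parameter values are illustrative, not tuned; the input is assumed to be a 2-D grayscale array such as the output of the preprocessing step):
import numpy as np
from skimage.feature import hog, local_binary_pattern

def extract_features(gray):
    # HOG captures edge/gradient structure; LBP summarizes local texture
    hog_vec = hog(gray, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    lbp = local_binary_pattern(gray, P=8, R=1, method='uniform')
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    return np.concatenate([hog_vec, lbp_hist])   # one feature vector per note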
Machine Learning Model Selection:
● Choose an appropriate machine learning algorithm for classification. Some popular choices are
Support Vector Machines (SVM), Random Forest, Decision Trees, and Neural Networks.
Model Training and Validation:
● Split your dataset into training and validation sets. Train your model on the genuine and counterfeit
banknote samples.
● Employ techniques like cross-validation to ensure the model's robustness.
Model Evaluation Metrics:
● Use appropriate evaluation metrics such as accuracy, precision, recall, F1-score, and ROC-AUC to
assess the model's performance.
Hyperparameter Tuning:
● Fine-tune the model's hyperparameters to achieve better accuracy and generalization.
Data Augmentation:
● Augment the dataset to include variations of genuine and counterfeit banknotes. This can help
improve the model's ability to generalize to different scenarios.
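One possible augmentation setup, sketched with Keras' ImageDataGenerator (the parameter values below are illustrative choices, not taken from the project itself):
from keras.preprocessing.image import ImageDataGenerator

augmenter = ImageDataGenerator(
    rotation_range=10,              # notes may be slightly tilted when captured
    width_shift_range=0.1,          # small horizontal translations
    height_shift_range=0.1,         # small vertical translations
    zoom_range=0.1,                 # mild zoom in/out
    brightness_range=(0.8, 1.2))    # lighting variation
# augmenter.flow(X_train, Y_train, batch_size=16) then yields augmented batches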
Deployment:
● Integrate the trained model into a software application or system. You can use libraries like Tkinter, which can be used to implement a GUI.
User Interface:
● Create a user-friendly interface for users to input banknote images and receive results. This interface
may include features like uploading images, displaying results, and providing user feedback.
Real-time Processing:
● If needed, implement real-time processing by integrating the system with cameras or image capture
devices.
Preprocessing
It is very difficult to process an image. Before any image is processed, it is very significant to
remove unnecessary items it may hold. After removing unnecessary artifacts, the image can
be processed successfully. The initial step of image processing is Image Pre-Processing Pre-
processing involves processes like conversion to grayscale image, noise removal and image
reconstruction. Conversion to grey scale image is the most common pre-processing practice.
After the image is converted to grayscale, then remove excess noise using different filtering
methods.
Image segmentation
Segmentation of images is important because large numbers of images are generated during scanning, and it is impractical to divide these images manually in a reasonable time. Image segmentation refers to the partition of a given image into multiple non-overlapping regions. Segmentation represents the image as sets of pixels that are more significant and easier to analyse. It is applied to approximately locate the boundaries or objects in an image, and the resulting segments collectively cover the complete image. Segmentation algorithms work on one of two basic characteristics of image intensity: similarity and discontinuity.
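A minimal sketch of similarity-based segmentation, assuming OpenCV and an illustrative file name, using Otsu thresholding:
import cv2

gray = cv2.imread('note.jpg', cv2.IMREAD_GRAYSCALE)   # hypothetical input image
# Otsu's method picks a global threshold from the intensity histogram (similarity)
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
# contours approximate region boundaries; the segments cover the whole note
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)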
Feature extraction
Feature extraction is an important step in the construction of any pattern classifier and aims at extracting the relevant information that characterizes each class. In this process, relevant features are extracted from objects to form feature vectors. These feature vectors are then used by classifiers to associate the input unit with the target output unit. It becomes easier for the classifier to distinguish between different classes by looking at these features. Feature extraction is the process of retrieving the most important information from the raw data.
Classification
Classification is used to assign each item in a set of data to one of a predefined set of classes or groups. In other words, classification is an important technique used here to differentiate genuine and counterfeit note images. In this data analysis task, a model or classifier is constructed to predict categorical labels. Classification is a data mining function that assigns items in a collection to target categories or classes; its goal is to accurately predict the target class for each case in the data.
CHAPTER-9
SOFTWARE ENVIRONMENT
9.1 PYTHON
Python is a popular general-purpose programming language. It is used for:
● web development (server-side),
● software development,
● Mathematics,
● system scripting.
● Python can be used on a server to create web applications.
● Python can be used alongside software to create workflows.
● Python can connect to database systems. It can also read and modify files.
● Python can be used to handle big data and perform complex mathematics.
● Python can be used for rapid prototyping, or for production-ready software development.
● Python works on different platforms (Windows, Mac, Linux, Raspberry Pi, etc).
● Python has a simple syntax similar to the English language.
● Python has syntax that allows developers to write programs with fewer lines than some other
programming languages.
● Python runs on an interpreter system, meaning that code can be executed as soon as it is written.
This means that prototyping can be very quick.
Python relies on indentation, using whitespace, to define scope; such as the scope of loops, functions and
classes. Other programming languages often use curly-brackets for this purpose.
MACHINE LEARNING
Machine learning is a type of computer technology that allows computers to learn and make predictions or
decisions without being explicitly programmed. In simple terms, it's about teaching computers to learn from
data and use that knowledge to perform tasks or solve problems. Here's how it works:
Data Collection: First, you gather a lot of data related to the task you want the computer to perform. This data
can be anything from images and text to numbers and sensor readings.
Training: You feed this data into a machine learning algorithm. During the training phase, the algorithm looks
for patterns, relationships, or rules in the data. It tries to figure out how the input data (e.g., the features of an
image) relates to the desired output (e.g., whether the image contains a cat or a dog).
Learning: The machine learning algorithm adjusts its internal parameters based on the patterns it finds in the
training data. It essentially learns from the data.
Prediction or Decision: Once the algorithm has learned from the data, it can be used to make predictions or
decisions on new, unseen data. For example, it can classify new images as either cats or dogs based on what
it learned during training.
Here's a simple analogy: Think of machine learning like teaching a computer to recognize different fruits.
You show it a bunch of apples, oranges, and bananas, and it learns to distinguish them by their size, color,
and shape. Once it's learned, you can give it a new, unlabeled fruit, and it can tell you whether it's an apple,
orange, or banana based on what it learned from the training data.
In machine learning, tasks are generally classified into broad categories. These categories are based on
how learning is received or how feedback on the learning is given to the system developed.
Two of the most widely adopted machine learning methods are supervised learning which trains algorithms
based on example input and output data that is labeled by humans, and unsupervised learning which provides
the algorithm with no labeled data in order to allow it to find structure within its input data. Let’s explore
these methods in more detail.
Supervised Learning
In supervised learning, the computer is provided with example inputs that are labeled with their desired outputs. The purpose of this method is for the algorithm to be able to “learn” by comparing its actual output with the “taught” outputs to find errors, and modify the model accordingly. Supervised learning therefore uses patterns to predict label values on additional unlabeled data. For example, with supervised learning, an algorithm may be fed data with images of sharks labeled as fish and images of oceans labeled as water. By being trained on this data, the supervised learning algorithm should be able to later identify unlabeled shark images as fish and unlabeled ocean images as water. A common use case of supervised learning is to use historical data to predict statistically likely future events. It may use historical stock market information to anticipate upcoming fluctuations, or be employed to filter out spam emails. In supervised learning, tagged photos of dogs can be used as input data to classify untagged photos of dogs.
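A toy sketch of this idea with scikit-learn (the feature vectors and labels below are made up purely for illustration, echoing the fish/water example above):
from sklearn.svm import SVC

X = [[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.2]]   # made-up feature vectors
y = ['fish', 'fish', 'water', 'water']                  # human-provided labels
model = SVC().fit(X, y)                                 # learn from labeled examples
print(model.predict([[0.15, 0.85]]))                    # expected: ['fish']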
Unsupervised Learning
In unsupervised learning, data is unlabeled, so the learning algorithm is left to find commonalities among its input data. As unlabeled data is more abundant than labeled data, machine learning methods that facilitate unsupervised learning are particularly valuable. The goal of unsupervised learning may be as straightforward as discovering hidden patterns within a dataset, but it may also have a goal of feature learning, which allows the computational machine to automatically discover the representations that are needed to classify raw data.
Unsupervised learning is commonly used for transactional data. You may have a large dataset of customers
and their purchases, but as a human you will likely not be able to make sense of what similar attributes can
be drawn from customer profiles and their types of purchases. With this data fed into an unsupervised
learning algorithm, it may be determined that women of a certain age range who buy unscented soaps are
likely to be pregnant, and therefore a marketing campaign related to pregnancy and baby products can be
targeted to this audience in order to increase their number of purchases.
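A toy sketch of unsupervised grouping with k-means in scikit-learn (the purchase data below is invented for illustration; no labels are supplied):
import numpy as np
from sklearn.cluster import KMeans

purchases = np.array([[25, 2], [27, 3], [60, 15], [62, 14]])  # e.g. [age, items bought]
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(purchases)
print(km.labels_)   # two discovered groups, e.g. [0 0 1 1] or [1 1 0 0]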
Approaches
A convolutional neural network (CNN/ConvNet) is a class of deep neural networks most commonly applied to analyzing visual imagery. When we think of a neural network we usually think of matrix multiplications, but that is not the whole story for a ConvNet: it uses a special technique called convolution. In mathematics, convolution is an operation on two functions that produces a third function expressing how the shape of one is modified by the other.
CNN(CONVOLUTIONAL NEURAL NETWORK)
CNN stands for Convolutional Neural Network, which is a class of deep learning neural networks commonly
used for image and video analysis, as well as in various other applications like natural language processing.
CNNs are designed to automatically and adaptively learn patterns, features, and hierarchies from data,
particularly in the context of grid-like data, such as images.
Here are the key components and concepts associated with CNNs:
1. Convolutional Layer: Convolutional layers are the building blocks of CNNs. They consist of a set of
learnable filters (also known as kernels) that slide over the input data to perform convolution operations. This
operation extracts local patterns or features from the input data. Convolution helps the network recognize
spatial hierarchies and patterns in the data.
2. Pooling Layer: Pooling layers are used to reduce the spatial dimensions of the data while retaining
important information. Common pooling operations include max-pooling and average-pooling. Pooling helps
to make the network more robust to variations in the input data and reduces the number of parameters.
3. Activation Function: After each convolutional and pooling operation, an activation function is applied,
typically the Rectified Linear Unit (ReLU) function. This introduces non-linearity into the model, allowing it
to learn complex patterns and features.
4. Fully Connected Layer: CNNs often conclude with one or more fully connected layers, which act as
traditional neural network layers. These layers help combine high-level features and make predictions based
on the learned features. In the case of image classification, the final fully connected layer typically outputs
class probabilities.
5. Stride: Stride determines the step size at which the filter moves across the input data during convolution. A
larger stride reduces the spatial dimensions of the output feature maps.
6. Padding: Padding is the addition of extra rows and columns of zeros around the input data before
convolution. It helps control the spatial dimensions of the feature maps and prevent them from shrinking too
quickly.
7. Filters/Kernels: Filters are small, learnable matrices that are applied during convolution. These filters are
responsible for recognizing different features within the input data, such as edges, textures, or more complex
structures.
8. Hierarchical Feature Learning: CNNs are designed to learn hierarchical features, starting from low-level
features (e.g., edges and corners) in the early layers to more complex and abstract features in the deeper
layers. This ability to capture hierarchical features makes CNNs highly effective in image analysis tasks.
9. Transfer Learning: CNNs often benefit from transfer learning, which involves using a pre-trained network
(e.g., on a large dataset like ImageNet) as a starting point for a new task. This can save time and
computational resources, as the lower layers of the network have already learned useful features.
CNNs are widely used in various applications, including image classification, object detection, image
segmentation, facial recognition, medical image analysis, and more. They have revolutionized the field of
computer vision and have made significant contributions to the field of deep learning.
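Tying these components together, a hedged Keras sketch of a small CNN of the kind described above (the layer sizes are illustrative, not the project's exact architecture) could be:
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

cnn = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(64, 64, 3)),  # filters slide over the image
    MaxPooling2D(pool_size=(2, 2)),                                  # downsample feature maps
    Conv2D(64, (3, 3), activation='relu'),                           # deeper, more abstract features
    MaxPooling2D(pool_size=(2, 2)),
    Flatten(),
    Dense(128, activation='relu'),       # fully connected layer combines features
    Dense(2, activation='softmax')])     # class probabilities (genuine vs fake)
cnn.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])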
Python Installation
Many PCs and Macs will have Python already installed. To check if you have Python installed on a Windows PC, search in the Start bar for Python or run the following on the command line (cmd.exe):
C:\Users\Your Name>python --version
To check if you have Python installed on Linux or Mac, open the command line (Linux) or the Terminal (Mac) and type:
python --version
Download the Correct version into the system
Step 1: If you find that you do not have python installed on your computer, then you can download it for
free from the following website: https://www.python.org/
Now, check for the latest and the correct version for your operating system.
Step 3: You can either select the yellow "Download Python 3.7.4" button or scroll further down and click on the download corresponding to your version. Here, we are downloading the most recent Python version for Windows (3.7.4).
Step 4: Scroll down the page until you find the Files option.
Step 5: Here you see a different version of python along with the operating system.
• To download Windows 32-bit python, you can select any one from the three options: Windows x86
embeddable zip file, Windows x86 executable installer or Windows x86 web-based installer.
•To download Windows 64-bit python, you can select any one from the three options: Windows
x86-64 embeddable zip file, Windows x86-64 executable installer or Windows x86-64 web-based
installer.
Here we will use the Windows x86-64 web-based installer. The first part, choosing which version of Python to download, is now complete; we move on to the second part, the installation itself.
Installation of Python
Step 1: Go to Download and Open the downloaded python version to carry out the installation process.
Step 2: Before you click on Install Now, Make sure to put a tick on Add Python 3.7 to PATH.
Step 3: Click on Install Now. After the installation is successful, click on Close.
With the above three steps, you have successfully and correctly installed Python. Now is the time to verify the installation. Note: the installation process might take a couple of minutes.
Python Quickstart:
Python is an interpreted programming language. This means that as a developer you write Python (.py) files in a text editor and then put those files into the Python interpreter to be executed. Let's write our first Python file, called helloworld.py, which can be done in any text editor:
helloworld.py
print("Hello, World!")
Open your command line, navigate to the directory where you saved your file, and run:
C:\Users\Your Name>python helloworld.py
The output should read: Hello, World!
Congratulations, you have written and executed your first Python program.
The Python command line is useful for testing a short amount of code without writing it to a file; sometimes this is quickest and easiest, because Python can itself be run as a command line. Type the following on the Windows, Mac or Linux command line:
C:\Users\Your Name>python
Or, if the "python" command did not work, you can try "py":
C:\Users\Your Name>py
From there you can write any Python, including our hello world example from earlier in the tutorial:
Python 3.6.4 (v3.6.4:d48eceb, Dec 19 2017, 06:04:45) [MSC v.1900 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> print("Hello, World!")
Hello, World!
Whenever you are done in the Python command line, simply type the following to quit:
exit()
DJANGO
Django is a high-level Python Web framework that encourages rapid development and clean, pragmatic
design. Built by experienced developers, it takes care of much of the hassle of Web development, so you
can focus on writing your app without needing to reinvent the wheel. It’s free and open source. Django's
primary goal is to ease the creation of complex, database-driven websites. Django emphasizes reusability
and "pluggability" of components, rapid development, and the principle of don't repeat yourself. Python
is used throughout, even for settings files and data models.
PYTHON LIBRARIES
Tensorflow
TensorFlow is a free and open-source software library for dataflow and differentiable programming across a
range of tasks. It is a symbolic math library, and is also used for machine learning applications such as neural
networks. It is used for both research and production at Google. TensorFlow was developed by the Google
Brain team for internal Google use. It was released under the Apache 2.0 open-source license on November
9, 2015.
Numpy
Numpy is a general-purpose array-processing package. It provides a high-performance multidimensional array object and tools for working with these arrays. It is the fundamental package for scientific computing with Python. Besides its obvious scientific uses, Numpy can also be used as an efficient multi-dimensional container of generic data. Arbitrary data types can be defined, which allows Numpy to seamlessly and speedily integrate with a wide variety of databases.
Pandas
Pandas is an open-source Python library providing high-performance data manipulation and analysis tools using its powerful data structures. Python was previously used mainly for data munging and preparation and contributed little to data analysis; Pandas solved this problem. Using Pandas, we can accomplish five typical steps in the processing and analysis of data, regardless of its origin: load, prepare, manipulate, model, and analyze. Python with Pandas is used in a wide range of fields, including academic and commercial domains such as finance, economics, statistics and analytics.
Matplotlib
Matplotlib is a Python 2D plotting library which produces publication quality figures in a variety of hardcopy
formats and interactive environments across platforms. Matplotlib can be used in Python scripts, the Python
and IPython shells, the Jupyter Notebook, web application servers, and four graphical user interface toolkits.
Matplotlib tries to make easy things easy and hard things possible. You can generate plots, histograms, power
spectra, bar charts, error charts, scatter plots, etc., with just a few lines of code. For examples, see the sample
plots and thumbnail gallery. For simple plotting the pyplot module provides a MATLAB-like interface,
particularly when combined with IPython. For the power user, you have full control of line styles, font
properties, axes properties, etc, via an object-oriented interface or via a set of functions familiar to MATLAB
users.
Scikit – learn
Scikit-learn provides a range of supervised and unsupervised learning algorithms via a consistent interface in Python. It is licensed under a permissive simplified BSD license and is distributed with many Linux distributions, encouraging academic and commercial use.
Python
Python is an interpreted high-level programming language for general-purpose programming. Created by Guido van Rossum and first released in 1991, Python has a design philosophy that emphasizes code readability, notably using significant whitespace. Python features a dynamic type system and automatic memory management. It supports multiple programming paradigms, including object-oriented, imperative, functional and procedural, and has a large and comprehensive standard library.
• Python is Interpreted − Python is processed at runtime by the interpreter. You do not need to compile your
program before executing it. This is similar to PERL and PHP.
• Python is Interactive − you can actually sit at a Python prompt and interact with the interpreter directly to
write your programs.
Python also acknowledges that speed of development is important. Readable and terse code is part of this, and so is access to powerful constructs that avoid tedious repetition of code. Maintainability also ties into this; it may be an imperfect metric, but it does say something about how much code you have to scan, read and understand to troubleshoot problems or tweak behaviours. This speed of development, the ease with which a programmer of other languages can pick up basic Python skills, and the huge standard library are key to another area where Python excels: all its tools have been quick to implement, have saved a lot of time, and several of them have later been patched and updated by people with no Python background, without breaking.
9.2 SOURCE CODE
# Assumed imports; the original listing omitted them (the Tkinter GUI, Keras CNN,
# OpenCV, NumPy, pickle and Matplotlib are all used below)
import os
import pickle
import numpy as np
import cv2
import matplotlib.pyplot as plt
from tkinter import Tk, Text, Button, END, filedialog
from keras.models import Sequential, model_from_json
from keras.layers import Convolution2D, MaxPooling2D, Flatten, Dense

global filename
global classifier
global X_train, Y_train

def upload():
    # Ask for the dataset directory and report it in the text widget
    global filename
    filename = filedialog.askdirectory(initialdir=".")
    text.delete('1.0', END)
    text.insert(END, filename + ' Loaded\n')
    text.insert(END, "Dataset Loaded\n")

def processImages():
    # Load pre-extracted image features and labels saved as .npy files
    global X_train, Y_train
    text.delete('1.0', END)
    X_train = np.load('model/features.txt.npy')
    Y_train = np.load('model/labels.txt.npy')
    text.insert(END, 'Total images found in dataset for training = ' + str(X_train.shape[0]) + "\n\n")

def generateModel():
    # Reload a saved CNN if one exists, otherwise build and train a new one
    global classifier
    text.delete('1.0', END)
    if os.path.exists('model/model.json'):
        with open('model/model.json', "r") as json_file:
            loaded_model_json = json_file.read()
        classifier = model_from_json(loaded_model_json)
        classifier.load_weights("model/model_weights.h5")
        print(classifier.summary())
    else:
        # Two convolution/pooling blocks, then dense layers; the output layer
        # is a 2-unit softmax (fake/real) to match categorical_crossentropy,
        # and the input shape matches the 64x64x3 images used in predict()
        classifier = Sequential()
        classifier.add(Convolution2D(32, (3, 3), input_shape=(64, 64, 3), activation='relu'))
        classifier.add(MaxPooling2D(pool_size=(2, 2)))
        classifier.add(Convolution2D(32, (3, 3), activation='relu'))
        classifier.add(MaxPooling2D(pool_size=(2, 2)))
        classifier.add(Flatten())
        classifier.add(Dense(units=256, activation='relu'))
        classifier.add(Dense(units=2, activation='softmax'))
        print(classifier.summary())
        classifier.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
        hist = classifier.fit(X_train, Y_train, batch_size=16, epochs=10, shuffle=True, verbose=2)
        classifier.save_weights('model/model_weights.h5')
        model_json = classifier.to_json()
        with open("model/model.json", "w") as json_file:
            json_file.write(model_json)
        f = open('model/history.pckl', 'wb')
        pickle.dump(hist.history, f)
        f.close()
    # Report the accuracy reached in the final (10th) training epoch
    f = open('model/history.pckl', 'rb')
    data = pickle.load(f)
    f.close()
    acc = data['accuracy']
    accuracy = acc[9] * 100
    text.insert(END, "CNN Training Model Accuracy = " + str(accuracy) + "\n")

def predict():
    # Classify a single test image and overlay the verdict on it
    name = filedialog.askopenfilename(initialdir="testImages")
    img = cv2.imread(name)
    img = cv2.resize(img, (64, 64))
    im2arr = np.array(img).reshape(1, 64, 64, 3)
    XX = im2arr.astype('float32') / 255
    preds = classifier.predict(XX)
    print(str(preds) + " " + str(np.argmax(preds)))
    predicted = np.argmax(preds)      # class 0 = Fake, class 1 = Real
    img = cv2.imread(name)
    img = cv2.resize(img, (450, 450))
    msg = 'Fake' if predicted == 0 else 'Real'
    cv2.putText(img, msg, (10, 25), cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 255), 2)
    cv2.imshow(msg, img)
    cv2.waitKey(0)

def graph():
    # Plot the training accuracy and loss curves saved during fitting
    f = open('model/history.pckl', 'rb')
    data = pickle.load(f)
    f.close()
    accuracy = data['accuracy']
    loss = data['loss']
    plt.figure(figsize=(10, 6))
    plt.grid(True)
    plt.xlabel('Iterations')
    plt.ylabel('Accuracy/Loss')
    plt.plot(loss, 'o-', color='red')
    plt.plot(accuracy, 'o-', color='green')
    plt.legend(['Loss', 'Accuracy'], loc='upper left')
    plt.title('CNN Accuracy & Loss')
    plt.show()

# Minimal Tkinter window wiring the functions above to buttons; the original
# listing referenced `main` and `text` without creating them
main = Tk()
main.title("Fake Currency Detection")
main.geometry("1100x650")
Button(main, text="Upload Dataset", command=upload).place(x=10, y=10)
Button(main, text="Process Images", command=processImages).place(x=10, y=50)
Button(main, text="Generate CNN Model", command=generateModel).place(x=10, y=90)
Button(main, text="Predict", command=predict).place(x=10, y=130)
Button(main, text="Accuracy & Loss Graph", command=graph).place(x=10, y=170)
text = Text(main, height=20, width=130)
text.place(x=10, y=220)
main.config(bg='LightSteelBlue3')
main.mainloop()
CHAPTER-10
RESULTS/DISCUSSIONS
10.1 TESTING
Integration testing
Integration tests are designed to test integrated software components to determine whether they actually run as one program. Testing is event driven and is more concerned with the basic outcome of screens or fields. Integration tests demonstrate that although the components were individually satisfactory, as shown by successful unit testing, the combination of components is correct and consistent. Integration testing is specifically aimed at exposing the problems that arise from the combination of components.
Functional testing
Functional tests provide systematic demonstrations that functions tested are available as specified by the
business and technical requirements, system documentation, and user manuals.
Functional testing is centered on the following items:
Valid Input: identified classes of valid input must be accepted.
Invalid Input: identified classes of invalid input must be rejected.
Functions: identified functions must be exercised.
Output: identified classes of application outputs must be exercised.
Systems/Procedures: interfacing systems or procedures must be invoked.
Organization and preparation of functional tests is focused on requirements, key functions, or special test cases. In addition, systematic coverage pertaining to identifying business process flows, data fields, predefined processes, and successive processes must be considered for testing. Before functional testing is complete, additional tests are identified and the effective value of current tests is determined.
White Box Testing is testing in which the software tester has knowledge of the inner workings, structure and language of the software, or at least its purpose. It is used to test areas that cannot be reached from a black box level.
Black Box Testing is testing the software without any knowledge of the inner workings, structure or language of the module being tested. Black box tests, like most other kinds of tests, must be written from a definitive source document, such as a specification or requirements document. It is a test in which the software under test is treated as a black box: you cannot “see” into it. The test provides inputs and responds to outputs without considering how the software works.
Unit Testing
Unit testing is usually conducted as part of a combined code and unit test phase of the software
lifecycle, although it is not uncommon for coding and unit testing to be conducted as two distinct phases.
Test strategy and approach
Field testing will be performed manually and functional tests will be written in detail.
Test objectives
● All field entries must work properly.
● Pages must be activated from the identified link.
● The entry screen, messages and responses must not be delayed.
Features to be tested
● Verify that the entries are of the correct format
● No duplicate entries should be allowed
● All links should take the user to the correct page.
Integration Testing
Software integration testing is the incremental integration testing of two or more integrated software components on a single platform to produce failures caused by interface defects. The task of the integration test is to check that components or software applications, e.g. components in a software system or, one step up, software applications at the company level, interact without error.
Acceptance Testing
User Acceptance Testing is a critical phase of any project and requires significant participation by the end
user. It also ensures that the system meets the functional requirements. Test Results: All the test cases
mentioned above passed successfully. No defects encountered.
10.2 OUTPUT SCREENS
(Output screens 1 to 4 appear here as figures.)
CHAPTER-11
CONCLUSION
11.1 CONCLUSION
We commenced with a brief introduction to our system and discussed the scope and objectives of our project. During the literature survey we had the opportunity to look closely into the problem that people face in the current environment; we reviewed multiple research papers, out of which we tapered down to ten papers and selected five papers as our base research papers. We analyzed all the existing architectures of our base papers and, by understanding their working, discovered some flaws in the currently existing systems. We have kept the prime features of existing systems as the primary focus, with some additional features for our proposed system.
Many different adaptations, tests and innovations have been left for the future due to lack of time. Future work concerns deeper analysis of particular mechanisms, new proposals to try different methods, or simple curiosity.
1. In future we would be including a module for currency conversion.
2. We can implement the system for foreign currencies.
3. Tracking of device’s location through which the currency is scanned and maintaining the same in the
database.
CHAPTER-12
REFERENCES
[1] Prof. Chetan More, Monu Kumar, Rupesh Chandra, Raushan Singh, “Fake Currency Detection using Basic Python Programming and Web Framework”, International Research Journal of Engineering and Technology (IRJET), Volume 07, Issue 04, ISSN: 2395-0056 (April 2020).
[2] Vivek Sharan, Amandeep Kaur, “Detection of Counterfeit Indian Currency Note Using Image Processing”, International Journal of Engineering and Advanced Technology (IJEAT), Volume 09, Issue 01, ISSN: 2249-8958 (October 2019).
[3] Aakash S. Patil, “Indian Paper Currency Detection”, International Journal for Scientific Research & Development (IJSRD), Volume 7, Issue 06, ISSN: 2321-0613 (June 2019).
[4] Archana M R, Kalpitha C P, Prajwal S K, Pratiksha N, “Identification of Fake Notes and Denomination Recognition”, International Journal for Research in Applied Science & Engineering Technology (IJRASET), Volume 6, Issue V, ISSN: 2321-9653 (May 2018).
[5] S. Atchaya, K. Harini, G. Kaviarasi, B. Swathi, “Fake Currency Detection using Image Processing”, International Journal of Trend in Research and Development (IJTRD), ISSN: 2394-9333 (2017).