
KIET SCHOOL OF ENGINEERING & TECHNOLOGY

DEPARTMENT OF COMPUTER APPLICATION


PRESENTATION ON ANN & THEIR INTELLIGENT SECURITY APPLICATIONS

SUBMITTED BY
ROLL NO.: 0902914114   NAME: Navneet Singh   SEMESTER: VIth   SECTION: A

Artificial Neural Networks

Outline
What are Neural Networks?
Neural networks to the rescue
Where can neural network systems help
Learning
Learning performance
Where are NN used?
Applications
Strengths of a Neural Network
Advantages
Disadvantages

A new sort of computing
What are (everyday) computer systems good at... and not so good at?
Good at
Rule-based systems :
doing what the programmer wants them to do

Not so good at
Dealing with noisy data
Dealing with unknown environment data
Fault tolerance

Adapting to circumstances

What are Neural Networks?


Models of the brain and nervous system
Highly parallel: process information much more like the brain than a serial computer
Learning: very simple principles, very complex behaviours
Applications:
As powerful problem solvers
As biological models

Neural networks to the rescue


Neural network: information processing paradigm inspired by biological nervous systems, such as our brain
Structure: large number of highly interconnected processing elements (neurons) working together
Like people, they learn from experience (by example)

CONTINUED
Neural networks are configured for a specific application, such as pattern recognition or data classification, through a learning process
In a biological system, learning involves adjustments to the synaptic connections between neurons; the same holds for artificial neural networks (ANNs)

Where can neural network systems help


when we can't formulate an algorithmic solution
when we can get lots of examples of the behaviour we require (learning from experience)
when we need to pick out the structure from existing data

Inspiration from Neurobiology


A neuron: many-inputs / one-output unit
Output can be excited or not excited
Incoming signals from other neurons determine if the neuron shall excite ("fire")
Output is subject to attenuation in the synapses, which are the junction parts of the neuron

Mathematical representation
The neuron calculates a weighted sum of its inputs and compares it to a threshold. If the sum is higher than the threshold, the output is set to 1; otherwise it is set to -1 (see the sketch below).
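A minimal Python sketch of this thresholding neuron; the weights, inputs, and threshold values below are illustrative assumptions, not taken from the slides:

```python
# Threshold neuron sketch: weighted sum of inputs compared to a threshold.
# Weights, inputs, and threshold are illustrative values only.

def neuron_output(inputs, weights, threshold):
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum > threshold else -1

print(neuron_output([1.0, 0.5], [0.4, 0.6], threshold=0.5))  # 0.7 > 0.5 -> 1
```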

A simple perceptron
It's a single-unit network. Change the weight by an amount proportional to the difference between the desired output and the actual output:
ΔWi = η (D - Y) Ii   (Perceptron Learning Rule)

Example: A simple single unit adaptive network


The network has 2 inputs and one output. All are binary. The output is
1 if W0·I0 + W1·I1 + Wb > 0
0 if W0·I0 + W1·I1 + Wb ≤ 0
We want it to learn simple OR: output a 1 if either I0 or I1 is 1. Demo (a Python sketch follows below)
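A hedged Python sketch of the demo described above: a single perceptron trained with the rule ΔWi = η (D - Y) Ii to learn OR. The learning rate, initial weights, and number of epochs are assumptions for illustration:

```python
# Single perceptron learning OR with the rule  dW_i = eta * (D - Y) * I_i.
# Learning rate, initial weights, and epoch count are assumed for illustration.
eta = 0.1
w = [0.0, 0.0]   # input weights W0, W1
wb = 0.0         # bias weight Wb (bias input fixed at 1)

training = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # OR truth table

def output(i0, i1):
    return 1 if w[0] * i0 + w[1] * i1 + wb > 0 else 0

for epoch in range(20):                  # a few passes over the training data
    for (i0, i1), desired in training:
        y = output(i0, i1)
        w[0] += eta * (desired - y) * i0  # perceptron learning rule
        w[1] += eta * (desired - y) * i1
        wb   += eta * (desired - y) * 1   # bias input is always 1

print([output(i0, i1) for (i0, i1), _ in training])  # -> [0, 1, 1, 1]
```

With these settings the weights settle at small positive values (around W0 = W1 = 0.1, Wb = 0), which reproduce the OR truth table.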

Learning
From experience: examples / training data
Strength of connection between the neurons is stored as a weight-value for the specific connection
Learning the solution to a problem = changing the connection weights

Operation mode
Fix weights (unless in online learning)
Network simulation = input signals flow through the network to the outputs
Output is often a binary decision
Inherently parallel
Simple operations and thresholds: fast decisions and real-time response

Artificial Neural Networks


Adaptive interaction between individual neurons
Power: collective behavior of interconnected neurons

The hidden layer learns to recode (or to provide a representation of) the inputs: associative mapping

Evolving networks
Continuous process of:
Evaluate output
Adapt weights
Take new inputs

Evolving the ANN leads to a stable state of the weights, but the neurons continue working: the network has learned to deal with the problem

Learning performance
Network architecture
Learning method:
Unsupervised
Reinforcement learning
Backpropagation

Unsupervised learning
No help from the outside
No training data, no information available on the desired output
Learning by doing
Used to pick out structure in the input:
Clustering
Reduction of dimensionality (compression)

Example: Kohonen's Learning Law

Competitive learning: example


Example: Kohonen network
Winner takes all: only update the weights of the winning neuron (a sketch follows below)
Network topology
Training patterns
Activation rule
Neighbourhood
Learning
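A minimal winner-take-all sketch in Python, loosely following the competitive-learning idea above: only the winning unit's weight vector moves towards the input. The data, learning rate, and the omission of a neighbourhood function are simplifying assumptions:

```python
import random

# Winner-take-all competitive learning sketch: the unit closest to the input
# wins, and only its weights move towards that input. Learning rate, data,
# and the omitted neighbourhood function are simplifying assumptions.
random.seed(0)
eta = 0.2
units = [[random.random(), random.random()] for _ in range(3)]  # 3 weight vectors

def distance2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

inputs = [[0.1, 0.1], [0.9, 0.8], [0.15, 0.05], [0.85, 0.9]]

for _ in range(50):
    for x in inputs:
        winner = min(units, key=lambda w: distance2(w, x))  # closest unit wins
        for i in range(len(winner)):                        # move winner towards x
            winner[i] += eta * (x[i] - winner[i])

print(units)  # the weight vectors drift towards the two input clusters
```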

Reinforcement learning
Teacher: training data
The teacher scores the performance on the training examples
Use the performance score to shuffle the weights randomly (a sketch follows below)
Relatively slow learning due to randomness
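A hedged sketch of this "score, then shuffle the weights randomly" idea: a random perturbation of the weights is kept only if the teacher's score does not get worse. The task (OR again), the scoring function, and the step size are illustrative assumptions:

```python
import random

# Reinforcement-style sketch: randomly perturb the weights and keep the
# perturbation only if the teacher's score does not get worse.
# Task, scoring function, and step size are illustrative assumptions.
random.seed(1)

training = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # OR truth table

def score(w):
    """Teacher's score: number of correctly classified examples."""
    return sum(1 for (i0, i1), d in training
               if (1 if w[0] * i0 + w[1] * i1 + w[2] > 0 else 0) == d)

w = [0.0, 0.0, 0.0]          # two input weights plus a bias weight
best = score(w)
for _ in range(500):
    candidate = [wi + random.uniform(-0.5, 0.5) for wi in w]  # random shuffle
    s = score(candidate)
    if s >= best:            # keep the shuffle only if the score is not worse
        w, best = candidate, s

print(best, w)  # usually reaches 4/4 correct, but slowly and by chance
```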

Backpropagation
Desired output of the training examples is known
Error = difference between actual & desired output
Change weights relative to the error size
Calculate the output-layer error, then propagate it back to the previous layer
Improved performance, very common! (A small sketch follows below)
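A minimal backpropagation sketch: a 2-2-1 sigmoid network trained on XOR, with the output-layer error propagated back to the hidden layer. The architecture, learning rate, and initialisation are illustrative assumptions, not from the slides:

```python
import math, random

# Backpropagation sketch: 2 inputs, 2 sigmoid hidden units, 1 sigmoid output,
# trained on XOR. Architecture, learning rate, and initialisation are assumed.
random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

Wh = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # hidden weights (+bias)
Wo = [random.uniform(-1, 1) for _ in range(3)]                      # output weights (+bias)
eta = 0.5
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]          # XOR

def forward(x0, x1):
    h = [sigmoid(w[0] * x0 + w[1] * x1 + w[2]) for w in Wh]
    y = sigmoid(Wo[0] * h[0] + Wo[1] * h[1] + Wo[2])
    return h, y

for _ in range(10000):
    for (x0, x1), d in data:
        h, y = forward(x0, x1)
        delta_o = (d - y) * y * (1 - y)                 # output-layer error
        delta_h = [delta_o * Wo[i] * h[i] * (1 - h[i])  # propagated back
                   for i in range(2)]
        Wo[0] += eta * delta_o * h[0]                   # change weights relative to error
        Wo[1] += eta * delta_o * h[1]
        Wo[2] += eta * delta_o
        for i in range(2):
            Wh[i][0] += eta * delta_h[i] * x0
            Wh[i][1] += eta * delta_h[i] * x1
            Wh[i][2] += eta * delta_h[i]

# typically -> [0, 1, 1, 0]; with only two hidden units the network can
# occasionally stall in a local minimum for an unlucky initialisation
print([round(forward(a, b)[1]) for (a, b), _ in data])
```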

Hopfield law demo


If the desired and actual output are both active or both inactive, increment the connection weight by the learning rate; otherwise decrement the weight by the learning rate
Matlab demo of a simple linear separation
One perceptron can only separate linearly!
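The slide refers to a Matlab demo; as a stand-in, here is a minimal Python sketch of the Hopfield law exactly as stated above. The learning rate and example values are assumptions:

```python
# Hopfield law as stated above: if desired and actual output agree (both
# active or both inactive), increment the weight by the learning rate,
# otherwise decrement it. Values below are illustrative only.
learning_rate = 0.1

def hopfield_update(weight, desired_active, actual_active):
    if desired_active == actual_active:     # both active or both inactive
        return weight + learning_rate
    return weight - learning_rate

print(hopfield_update(0.5, True, True))     # agree    -> 0.6
print(hopfield_update(0.5, True, False))    # disagree -> 0.4
```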

Online / Offline
Offline
Weights fixed in operation mode
Most common

Online
System learns while in operation mode
Requires a more complex network architecture

Where are NN used?


Recognizing and matching complicated, vague, or incomplete patterns
Data is unreliable
Problems with noisy data
Prediction
Classification
Data association
Data conceptualization
Filtering
Planning

Applications
Prediction: learning from past experience
pick the best stocks in the market
predict weather
identify people with cancer risk

Classification
Image processing
Predict bankruptcy for credit card companies
Risk assessment

Continued
Recognition
Pattern recognition: SNOOPE (bomb detector in U.S. airports)
Character recognition
Handwriting: processing checks

Data association
Not only identify the characters that were scanned, but also identify when the scanner is not working properly

Continued
Data Conceptualization
Infer grouping relationships, e.g. extract from a database the names of those most likely to buy a particular product

Data Filtering
e.g. take the noise out of a telephone signal, signal smoothing

Planning
Unknown environments
Sensor data is noisy
Fairly new approach to planning

Strengths of a Neural Network


Power: model complex functions, nonlinearity built into the network
Ease of use:
Learn by example
Very little user domain-specific expertise needed

Intuitively appealing: based on model of biology, will it lead to genuinely intelligent computers/robots?
Neural networks cannot do anything that cannot be done using traditional computing techniques, BUT they can do some things which would otherwise be very difficult.

General Advantages
Advantages
Adapt to unknown situations
Robustness: fault tolerance due to network redundancy
Autonomous learning and generalization

Disadvantages
Not exact
Large complexity of the network structure

For motion planning?

Status of Neural Networks


Most of the reported applications are still in the research stage
No formal proofs, but they seem to have useful applications that work
