Course - Machine Learning

This document discusses machine learning and deep learning techniques. It provides examples of applications such as face recognition, character recognition, and medical diagnosis. It explains the machine learning framework where a prediction function is applied to image features to classify images into categories. Training involves estimating the prediction function to minimize errors on labeled examples, while testing applies the function to unlabeled examples. Feature extraction and different classifier models like k-nearest neighbor and neural networks are discussed. Deep learning can process diverse data with less preprocessing but requires more data and computation than traditional machine learning.

Uploaded by INTTIC

Classification, Machine Learning,
and Deep Learning
Traditional Programming

  Data + Program → Computer → Output

Machine Learning

  Data + Output → Computer → Program
Clustering Strategies
• K-means
  – Iteratively re-assign points to the nearest cluster center, then recompute each center as the mean of its assigned points.
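The two alternating steps above can be sketched in a few lines. This is a minimal illustration assuming numpy; the function name `kmeans` and the random initialisation are choices made here, not part of the original material.

```python
import numpy as np

def kmeans(points, k, n_iters=100, seed=0):
    """Minimal k-means sketch: alternate assignment and centroid update."""
    rng = np.random.default_rng(seed)
    # Initialise centroids by picking k distinct data points at random.
    centroids = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(n_iters):
        # Assignment step: re-assign each point to its nearest centroid.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points.
        new_centroids = centroids.copy()
        for j in range(k):
            members = points[labels == j]
            if len(members):
                new_centroids[j] = members.mean(axis=0)
        if np.allclose(new_centroids, centroids):
            break  # converged
        centroids = new_centroids
    return labels, centroids
```

Keeping an empty cluster's centroid in place (rather than averaging zero points) is one common way to avoid degenerate updates.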
Sample Applications
• Face recognition
• Character recognition
• Speech recognition
• Medical diagnosis
• Industrial applications
• Web search
• Space exploration
• Robotics
• Information extraction
• Social networks
Face Recognition

Training examples of a person

Test images
The machine learning framework
• Apply a prediction function to a feature representation of the image to get the desired output:

  f(image) = “apple”
  f(image) = “tomato”
  f(image) = “cow”
The machine learning framework

  y = f(x)

where y is the output, f is the prediction function, and x is the image feature.

• Training: given a training set of labeled examples {(x1, y1), …, (xN, yN)}, estimate the prediction function f by minimizing the prediction error on the training set
• Testing: apply f to a never-before-seen test example x and output the predicted value y = f(x)
Steps
• Training: Training Images → Image Features → Training (with Training Labels) → Learned model
• Testing: Test Image → Image Features → Learned model → Prediction
Features
• Raw pixels
• Histograms
• Other descriptors
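As an example of the second bullet, raw pixels can be summarised as an intensity histogram, a simple global descriptor. A minimal sketch assuming numpy (the function name and bin count are choices made here):

```python
import numpy as np

def grayscale_histogram(image, n_bins=8):
    """Turn raw pixels (2-D array, values 0..255) into a normalised
    intensity histogram -- a simple global image descriptor."""
    hist, _ = np.histogram(image, bins=n_bins, range=(0, 256))
    return hist / hist.sum()
```

The normalisation makes the descriptor invariant to image size, so images of different resolutions can be compared.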
Many classifiers to choose from

• K-nearest neighbor
• Neural networks
• SVM
• Deep Neural networks
• Etc.
Classifiers: Nearest neighbor

[Figure: training examples from class 1, training examples from class 2, and a test example.]

  f(x) = label of the training example nearest to x

• All we need is a distance function for our inputs
• No training required!
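The rule f(x) = label of the nearest training example fits in a few lines once a distance function is chosen. A minimal sketch assuming numpy and Euclidean distance (the function and argument names are this sketch's, not the slides'):

```python
import numpy as np

def nearest_neighbor_classify(x, train_X, train_y):
    """f(x) = label of the training example nearest to x (Euclidean)."""
    dists = np.linalg.norm(train_X - x, axis=1)  # distance to every example
    return train_y[dists.argmin()]               # label of the nearest one
```

Note there is no training step: all the work happens at prediction time, which is exactly the "no training required" point above.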
(Artificial) Neural Networks
• Motivation: human brain
  – massively parallel (10^11 neurons, ~20 types)
  – small computational units with simple low-bandwidth communication (10^14 synapses, 1–10 ms cycle time)
• Realization: neural network
  – units (≈ neurons) connected by directed weighted links
  – activation function from inputs to output
Neural Networks (continued)
• neural network = parameterized family of nonlinear functions
• types
  – feed-forward (acyclic): single-layer perceptrons, multi-layer networks
  – recurrent (cyclic): Hopfield networks, Boltzmann machines
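A feed-forward pass through one hidden layer shows what "parameterized family of nonlinear functions" means in code. This is a sketch assuming numpy and a sigmoid activation (the layer sizes and names W1, b1, W2, b2 are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """Feed-forward pass with one hidden layer: each unit applies an
    activation function to a weighted sum of its inputs."""
    h = sigmoid(W1 @ x + b1)     # hidden layer
    return sigmoid(W2 @ h + b2)  # output layer
```

The weights (W1, b1, W2, b2) are the parameters; changing them changes the function the network computes, which is what the next slide's learning rule exploits.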
Neural Network Learning
Key Idea: Adjusting the weights changes the function represented by the neural network (learning = optimization in weight space).

Iteratively adjust weights to reduce error (difference between network output and target output).

• Weight Update
  – backpropagation
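For a single sigmoid unit the weight-update loop above can be written out directly: compute the output, measure the error against the target, and move the weights down the error gradient via the chain rule. A minimal gradient-descent sketch assuming numpy and squared error (the function name, learning rate, and epoch count are choices of this sketch):

```python
import numpy as np

def train_neuron(X, y, lr=0.5, epochs=5000, seed=0):
    """Iteratively adjust weights to reduce the squared error between the
    unit's output and the target (gradient descent in weight space)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        out = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # forward pass (sigmoid)
        err = out - y                             # output minus target
        grad = err * out * (1 - out)              # chain rule (backprop step)
        w -= lr * (X.T @ grad)                    # weight update
        b -= lr * grad.sum()
    return w, b
```

For a multi-layer network, backpropagation repeats this chain-rule step layer by layer from the output back to the input.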
Deep Learning
• Deep learning (DL) is a subtype of machine learning (ML). DL can process a wider range of data sources, requires less data preprocessing by humans (e.g. feature labelling), and can sometimes produce more accurate results than traditional ML approaches (although it requires a larger amount of data to do so).
• However, it is more computationally expensive in execution time, hardware costs, and data requirements.
