
"Module-2 (Supervised Learning)"

Presented by
Dr. S. Russia
Professor / CSE
Velalar College of Engineering and Technology, Erode.
Date: 12.11.2024

12/10/2024 1
Agenda
1. Dimensionality Reduction:
a. Subset Selection
b. Principal Component Analysis (PCA)
2. Regression:
a. Linear Regression (one variable & multiple variables)
b. Linear methods for classification: LR, NB, DT
3. Case Study: Classifier for face detection
Module-2 (Supervised Learning)
1. Dimensionality Reduction
2. Subset selection
3. Principal Component Analysis
4. Regression
5. Linear regression with one variable
6. Linear regression with multiple variables
7. Solution using gradient descent algorithm and matrix method
8. Basic idea of overfitting in regression
9. Linear Methods for Classification - Logistic regression
10. Naive Bayes, Decision tree algorithm ID3
11. Case Study: Develop a classifier for face detection
1. Dimensionality Reduction

• Dimensionality Reduction: Reduce the number of input variables in a dataset.
• Why DR?
 Less computing or training time
 Redundancy is removed
 Reduced storage space
Introduction
• The complexity of any classifier or regressor
depends on the number of inputs.
• This determines both the time and space
complexity and the necessary number of
training examples to train such a classifier or
regressor.
• There are two main methods for reducing
dimensionality:
1. Feature selection
2. Feature extraction
Feature selection

– In feature selection, we are interested in finding k of the d dimensions that give us the most information, and we discard the other (d − k) dimensions.
– Feature selection methods choose a subset of important features, pruning the rest; feature extraction methods form fewer, new features from the original inputs.
• In subset selection, we are interested in finding the
best subset of the set of features. The best subset
contains the least number of dimensions that most
contribute to accuracy. We discard the remaining,
unimportant dimensions. Using a suitable error
function, this can be used in both regression and
classification problems.
• There are two approaches:
– Forward selection
– Backward selection
Forward selection

• In forward selection, we start with no variables and add them one by one, at each step adding the one that decreases the error the most, until any further addition does not decrease the error (or decreases it only slightly).
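The greedy procedure above can be sketched as follows. This is a minimal illustration, not from the slides: the toy data, the least-squares training-error criterion, and the stopping tolerance `tol` are all assumptions made for the example.

```python
import numpy as np

def mse_of_fit(X, y):
    """Least-squares fit on the given columns; return the training MSE."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ w
    return float(r @ r / len(y))

def forward_select(X, y, tol=1e-3):
    """Greedy forward selection: repeatedly add the feature that lowers
    the error the most; stop when no addition improves it by more than tol."""
    d = X.shape[1]
    selected, best_err = [], float("inf")
    while len(selected) < d:
        errs = {j: mse_of_fit(X[:, selected + [j]], y)
                for j in range(d) if j not in selected}
        j, e = min(errs.items(), key=lambda kv: kv[1])
        if best_err - e <= tol:     # no meaningful improvement: stop
            break
        selected.append(j)
        best_err = e
    return selected

# Toy data: y depends only on columns 0 and 2; column 1 is pure noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + 0.01 * rng.normal(size=200)
print(forward_select(X, y))   # the informative columns should be picked
```

In practice the error would be measured on a validation set rather than the training set, since training error never increases when a feature is added.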
Backward selection

• In backward selection, we start with all variables and remove them one by one, at each step removing the one whose removal decreases the error the most (or increases it only slightly), until any further removal increases the error significantly.
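The mirror-image procedure, backward elimination, can be sketched the same way. Again the data, least-squares error criterion, and tolerance are illustrative assumptions, not part of the original slides.

```python
import numpy as np

def mse_of_fit(X, y):
    """Least-squares fit on the given columns; return the training MSE."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ w
    return float(r @ r / len(y))

def backward_select(X, y, tol=1e-3):
    """Greedy backward elimination: repeatedly drop the feature whose
    removal hurts the least; stop when every removal raises the error
    by more than tol."""
    selected = list(range(X.shape[1]))
    base = mse_of_fit(X, y)
    while len(selected) > 1:
        trials = {j: mse_of_fit(X[:, [k for k in selected if k != j]], y)
                  for j in selected}
        j, e = min(trials.items(), key=lambda kv: kv[1])
        if e - base > tol:          # removing anything hurts too much: stop
            break
        selected.remove(j)
        base = e
    return selected

# Toy data: only columns 1 and 3 matter; columns 0 and 2 are noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = X[:, 1] + 3.0 * X[:, 3] + 0.01 * rng.normal(size=200)
print(sorted(backward_select(X, y)))
```

Backward selection is costlier per step (it refits on nearly all features each time) but can keep features that are only useful jointly, which forward selection may miss.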
Feature extraction

– In feature extraction, we are interested in finding a new set of k dimensions that are combinations of the original d dimensions.
– The best known and most widely used feature extraction methods are Principal Components Analysis (PCA) and Linear Discriminant Analysis (LDA), both of which are linear projection methods.
• Feature Selection (Subset Selection): selects a subset of the existing features.
• Feature Extraction (Transformation-based methods): creates new features by transforming the existing ones.
• The best known and most widely used feature extraction method is Principal Components Analysis (PCA).
PCA - What & Why
• PCA is a feature (dimensionality) reduction technique.
• Example: starting from 100 original features f1, f2, f3, …, f100,
• PCA derives components PC1 (80%), PC2 (15%), PC3 (3%), …; the first few together explain ~98% of the variance, so the rest can be dropped.
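The per-component percentages above are explained-variance ratios. A minimal numpy sketch of how they arise, on made-up 3-D data where most of the variance lies along one direction (the data and proportions are assumptions for illustration):

```python
import numpy as np

# Toy data: 100 samples in 3-D, dominated by a single latent direction t.
rng = np.random.default_rng(0)
t = rng.normal(size=(100, 1))
X = np.hstack([3.0 * t,
               1.0 * t + 0.3 * rng.normal(size=(100, 1)),
               0.1 * rng.normal(size=(100, 1))])

# PCA via SVD of the centered data matrix: singular values give the
# variance carried by each principal component.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)   # fraction of total variance per PC

print(np.round(explained, 3))     # first component should dominate
```

Keeping only the components whose ratios sum to, say, 95–98% gives the reduced representation, exactly as in the f1…f100 → PC1, PC2, PC3 example above.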
