
FOUNDATIONS OF MACHINE LEARNING

IV Semester: CSE(AI & ML)

Course Code: ACAC03          Category: Core
Hours / Week: L 3, T 1, P 0          Credits (C): 4
Maximum Marks: CIA 30, SEE 70, Total 100
Contact Classes: 45     Tutorial Classes: 15     Practical Classes: Nil     Total Classes: 60
Prerequisites: Linear Algebra and Calculus, Probability and Statistics, Python Programming
I. COURSE OVERVIEW:
The main emphasis of this course is on giving systems the ability to learn and improve automatically from
experience without being explicitly programmed. The course covers the fundamental concepts needed to build,
train, and make predictions with data models using machine learning (ML) algorithms. It provides a clear
understanding of supervised learning through decision trees, advanced techniques such as neural networks,
Naive Bayes, and the k-nearest neighbors algorithm, and introduces unsupervised and reinforcement learning.
Machine learning has transformed fields such as medicine, healthcare, manufacturing, and banking.

II. COURSE OBJECTIVES:


The students will learn:
I The fundamental concepts and techniques of machine learning.
II The underlying mathematical relationships within and across machine learning algorithms and the
paradigms of supervised and unsupervised learning.
III The skills of using machine learning software to solve practical problems.
IV How to choose suitable machine learning algorithms and evaluate their performance to provide
solutions for various real-world problems.

III. COURSE OUTCOMES:


After successful completion of the course, students should be able to:
CO 1 Demonstrate the characteristics of Machine Learning that make it useful for solving real-world problems. (Understand)
CO 2 Make use of Supervised Learning Algorithms for Classification Models and Decision Tree Learning. (Apply)
CO 3 Build a Prediction Model using Linear Regression Techniques and Ensemble Techniques. (Apply)
CO 4 Make use of Bayesian Learning for Classification Models and outline Unsupervised Learning Algorithms for determining hidden patterns in data. (Apply)
CO 5 Discuss the methodology of Neural Networks and Support Vector Machines to classify Linear and Non-Linear data. (Apply)
CO 6 Identify appropriate Machine Learning Algorithms depending on the nature of the Learning System. (Apply)

IV. SYLLABUS:
MODULE – I: INTRODUCTION TO MACHINE LEARNING (09)
Machine Learning Foundations: Introduction to machine learning, learning problems and scenarios, need for machine
learning, types of learning, standard learning tasks, the Statistical Learning Framework, Probably Approximately
Correct (PAC) learning.
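As a pointer for the PAC learning topic, the sketch below computes the standard sample-complexity bound for a finite, realizable hypothesis class, m >= (1/ε)(ln|H| + ln(1/δ)). It is illustrative only and not part of the prescribed syllabus; the function name and parameter choices are assumptions.

```python
import math

def pac_sample_complexity(hypothesis_count: int, epsilon: float, delta: float) -> int:
    """Number of samples sufficient to PAC-learn a finite hypothesis class.

    Uses the standard realizable-case bound m >= (1/eps) * (ln|H| + ln(1/delta)),
    guaranteeing error at most epsilon with probability at least 1 - delta.
    """
    m = (1.0 / epsilon) * (math.log(hypothesis_count) + math.log(1.0 / delta))
    return math.ceil(m)

# Example: 1000 hypotheses, 5% error tolerance, 95% confidence -> 199 samples.
print(pac_sample_complexity(1000, epsilon=0.05, delta=0.05))
```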

MODULE – II: SUPERVISED LEARNING ALGORITHMS (09)


Learning a Class from Examples, Linear, Non-linear, Multi-class and Multi-label classification, Decision Trees: ID3,
Classification and Regression Trees (CART), Regression: Linear Regression, Multiple Linear Regression, Logistic
Regression.
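The sketch below illustrates the Module II topics with a decision tree and logistic regression on a small multi-class dataset. scikit-learn is used only as an example toolkit; the syllabus does not prescribe a specific library.

```python
# Illustrative sketch (library and dataset choices assumed, not prescribed by the syllabus).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# CART-style decision tree for multi-class classification.
tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print("Decision tree accuracy:", tree.score(X_test, y_test))

# Logistic regression on the same multi-class problem.
logreg = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Logistic regression accuracy:", logreg.score(X_test, y_test))
```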

MODULE – III: ENSEMBLE AND PROBABILISTIC LEARNING (09)
Ensemble Learning: Model Combination Schemes, Voting, Error-Correcting Output Codes, Bagging: Random Forest
Trees, Boosting: AdaBoost, Stacking

Bayesian Learning, Bayes Optimal Classifier, Naïve Bayes Classifier, Bayesian Belief Networks, Mining
Frequent Patterns
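A minimal sketch of the ensemble and Bayesian learning topics, again using scikit-learn purely as an illustrative toolkit: bagging via a random forest, boosting via AdaBoost, and a Gaussian Naive Bayes classifier compared on one binary dataset.

```python
# Illustrative sketch (library and dataset choices assumed, not prescribed by the syllabus).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.naive_bayes import GaussianNB

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging (random forest), boosting (AdaBoost), and Naive Bayes, side by side.
for model in (RandomForestClassifier(n_estimators=100, random_state=0),
              AdaBoostClassifier(n_estimators=100, random_state=0),
              GaussianNB()):
    model.fit(X_train, y_train)
    print(type(model).__name__, "accuracy:", model.score(X_test, y_test))
```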

MODULE – IV: UNSUPERVISED LEARNING (09)


Introduction to clustering, Hierarchical: AGNES, DIANA, Partitional: K-means clustering, K-Mode Clustering, Self-
Organizing Map, Expectation Maximization, Gaussian Mixture Models, Principal Component Analysis (PCA),
Locally Linear Embedding (LLE), Factor Analysis
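The sketch below illustrates two of the Module IV topics, partitional clustering with k-means and dimensionality reduction with PCA. The dataset and parameter values are illustrative assumptions, and scikit-learn is not prescribed by the syllabus.

```python
# Illustrative sketch (library, dataset, and k value assumed, not prescribed by the syllabus).
from sklearn.datasets import load_iris
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)

# Partitional clustering with k-means (k = 3 chosen only for illustration).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("Cluster sizes:", [int((kmeans.labels_ == k).sum()) for k in range(3)])

# Principal Component Analysis: project onto the top two principal components.
pca = PCA(n_components=2).fit(X)
print("Explained variance ratio:", pca.explained_variance_ratio_)
```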

MODULE – V: ADVANCED SUPERVISED LEARNING (09)


Neural Networks: Introduction, Perceptron, Multilayer Perceptron, Support vector machines: Linear and Non-Linear,
Kernel Functions, K-Nearest Neighbors.
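The sketch below illustrates the Module V topics with a multilayer perceptron, a kernel (non-linear) support vector machine, and k-nearest neighbors on one dataset. As before, scikit-learn and the parameter values are illustrative assumptions rather than part of the syllabus.

```python
# Illustrative sketch (library and parameter choices assumed, not prescribed by the syllabus).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Multilayer perceptron, RBF-kernel SVM, and k-NN, compared on the same split.
for model in (MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
              SVC(kernel="rbf", gamma="scale"),
              KNeighborsClassifier(n_neighbors=5)):
    model.fit(X_train, y_train)
    print(type(model).__name__, "accuracy:", model.score(X_test, y_test))
```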

V. TEXT BOOKS:
1. Ethem Alpaydin, “Introduction to Machine Learning”, MIT Press, PHI, 3rd Edition, 2014.
2. Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar, “Foundations of Machine Learning”, MIT Press,
2nd Edition, 2018.

VI. REFERENCE BOOKS:


1. Tom M. Mitchell, “Machine Learning”, McGraw Hill, Indian Edition, 2017.
2. Shai Shalev-Shwartz, Shai Ben-David, “Understanding Machine Learning: From Theory to Algorithms”,
Cambridge University Press, 2014.
3. Christopher M. Bishop, “Pattern Recognition and Machine Learning”, Springer, 2010.
4. Trevor Hastie, Robert Tibshirani, Jerome Friedman, “The Elements of Statistical Learning: Data Mining,
Inference, and Prediction”, Springer, 2nd Edition, 2009.
5. Avrim Blum, John Hopcroft, Ravindran Kannan, “Foundations of Data Science”, Cambridge University Press,
2020.
4. Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani, “An Introduction to Statistical Learning: with
applications in R”, Springer Texts in Statistics, 2017.

VII. WEB REFERENCES:


1. https://onlinecourses.nptel.ac.in/noc19_cs52/preview
2. https://ece.iisc.ac.in/~parimal/2019/ml.html
3. https://www.springer.com/gp/book/9780387848570
4. https://www.cse.iitb.ac.in/~sunita/cs725/calendar.html
5. https://www.analyticsvidhya.com/blog/2018/12/guide-convolutional-neural-network-cnn/
6. https://cs.nyu.edu/~mohri/mlu11/

