Syllabus
3 0 0 3
COURSE OBJECTIVES:
To understand the rationale for software development process models
To understand why the architectural design of software is important
To understand the five important dimensions of dependability, namely, availability, reliability,
safety, security, and resilience
To understand the basic notions of a web service, web service standards, and service-oriented
architecture
To understand the different stages of testing, starting from testing during the development of a software system
SUGGESTED ACTIVITIES
1. Comparatively analysing different Agile methodologies.
2. Describing the scenarios where ‘Scrum’ and ‘Kanban’ are used.
3. Mapping the data flow into a suitable software architecture.
4. Developing behavioural representations for a class or component.
5. Implementing simple applications as RESTful service.
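Activity 5 asks for a simple application exposed as a RESTful service. One possible starting point, assuming Python with Flask is available, is the minimal sketch below; the /books resource, its fields, and the in-memory store are purely illustrative assumptions, not part of the prescribed activity.

```python
# Minimal RESTful service sketch using Flask (illustrative only; install with: pip install flask).
from flask import Flask, jsonify, request

app = Flask(__name__)

# A toy in-memory "database" acting as the resource collection (hypothetical data).
books = {1: {"title": "Software Engineering", "author": "Ian Sommerville"}}

@app.route("/books", methods=["GET"])
def list_books():
    # Return the whole collection as JSON.
    return jsonify(books)

@app.route("/books", methods=["POST"])
def add_book():
    # Create a new resource from the JSON body of the request.
    new_id = max(books) + 1 if books else 1
    books[new_id] = request.get_json()
    return jsonify({"id": new_id}), 201

if __name__ == "__main__":
    app.run(debug=True)
```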
COURSE OUTCOMES:
The students will be able to
CO1: Identify appropriate process models based on the project requirements
CO2: Understand the importance of having a good software architecture
CO3: Understand the five important dimensions of dependability, namely, availability,
reliability, safety, security, and resilience
CO4: Understand the basic notions of a web service, web service standards, and service-oriented architecture
CO5: Be familiar with various levels of software testing
TOTAL: 45 PERIODS
REFERENCES:
1. Software Engineering: A Practitioner's Approach, 9th Edition, Roger Pressman and Bruce
Maxim, McGraw-Hill, 2019.
2. Software Engineering, 10th Edition, Ian Sommerville, Pearson Education Asia, 2016.
3. Software Architecture in Practice, 3rd Edition, Len Bass, Paul Clements and Rick Kazman,
Pearson India, 2018.
4. An Integrated Approach to Software Engineering, 3rd Edition, Pankaj Jalote, Narosa
Publishing House, 2018.
5. Fundamentals of Software Engineering, 5th Edition, Rajib Mall, PHI Learning Private
Ltd, 2018.
CP4252 MACHINE LEARNING L T P C
3 0 2 4
COURSE OBJECTIVES:
To understand the concepts and mathematical foundations of machine learning and types of
problems tackled by machine learning
To explore the different supervised learning techniques including ensemble methods
To learn different aspects of unsupervised learning and reinforcement learning
To learn the role of probabilistic methods for machine learning
To understand the basic concepts of neural networks and deep learning
UNIT I INTRODUCTION AND MATHEMATICAL FOUNDATIONS 9
What is Machine Learning? Need –History – Definitions – Applications - Advantages,
Disadvantages & Challenges -Types of Machine Learning Problems – Mathematical Foundations -
Linear Algebra & Analytical Geometry -Probability and Statistics- Bayesian Conditional Probability -
Vector Calculus & Optimization - Decision Theory - Information theory
UNIT II SUPERVISED LEARNING 9
Introduction-Discriminative and Generative Models -Linear Regression - Least Squares -Under-fitting
/ Overfitting -Cross-Validation – Lasso Regression- Classification - Logistic Regression- Gradient
Linear Models -Support Vector Machines –Kernel Methods -Instance based Methods - K-Nearest
Neighbors - Tree based Methods –Decision Trees –ID3 – CART - Ensemble Methods –Random
Forest - Evaluation of Classification Algorithms
UNIT III UNSUPERVISED LEARNING AND REINFORCEMENT LEARNING 9
Introduction - Clustering Algorithms -K – Means – Hierarchical Clustering - Cluster Validity -
Dimensionality Reduction –Principal Component Analysis – Recommendation Systems - EM
algorithm. Reinforcement Learning – Elements -Model based Learning – Temporal Difference
Learning
UNIT IV PROBABILISTIC METHODS FOR LEARNING 9
Introduction -Naïve Bayes Algorithm -Maximum Likelihood -Maximum A Posteriori -Bayesian Belief
Networks -Probabilistic Modelling of Problems -Inference in Bayesian Belief Networks – Probability
Density Estimation - Sequence Models – Markov Models – Hidden Markov Models
SUGGESTED ACTIVITIES
2. Study at least 3 tools available for Machine Learning and discuss the pros & cons of each.
3. Take an example of a classification problem. Draw different decision trees for the example
and explain the pros and cons of each decision variable at each level of the tree
4. Outline 10 machine learning applications in healthcare
5. Give 5 examples where sequential models are suitable.
6. Give at least 5 recent applications of CNN
PRACTICAL EXERCISES: 30 PERIODS
1. Implement a Linear Regression with a Real Dataset
(https://www.kaggle.com/harrywang/housing). Experiment with different features in building
a model. Tune the model's hyperparameters.
2. Implement a binary classification model, that is, one that answers a binary question such as "Are
houses in this neighborhood above a certain price?" (use the data from Exercise 1). Modify the
classification threshold and determine how that modification influences the model. Experiment
with different classification metrics to determine your model's effectiveness.
3. Classification with Nearest Neighbors. In this exercise, you will use scikit-learn's KNN
classifier to classify real vs. fake news headlines. The aim of this exercise is for you to read the
scikit-learn API and get comfortable with training/validation splits. Use the California Housing
Dataset.
4. In this exercise, you'll experiment with validation sets and test sets using the dataset. Split
a training set into a smaller training set and a validation set. Analyze deltas between training set
and validation set results. Test the trained model with a test set to determine whether your
trained model is overfitting. Detect and fix a common training problem.
5. Implement the k-means algorithm using https://archive.ics.uci.edu/ml/datasets/Codon+usage
dataset
6. Implement the Naïve Bayes Classifier using
https://archive.ics.uci.edu/ml/datasets/Gait+Classification
dataset
7. Project - (in Pairs) Your project must implement one or more machine learning algorithms and
apply them to some data.
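Starter sketches for some of the exercises above follow; they are illustrative outlines under stated assumptions, not prescribed solutions. For Exercise 1, the sketch below assumes the Kaggle housing data has been saved locally as housing.csv and contains median_income and median_house_value columns; the feature choice and the grid of Ridge regularization strengths (used here as a simple stand-in for hyperparameter tuning) are assumptions to experiment with.

```python
# Linear regression on the housing data (sketch; file name and column names are assumed).
import pandas as pd
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error

df = pd.read_csv("housing.csv")                      # downloaded from the Kaggle dataset
X = df[["median_income"]]                            # try other feature subsets here
y = df["median_house_value"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Tune the regularization strength over a small grid with cross-validation.
search = GridSearchCV(Ridge(), {"alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
search.fit(X_train, y_train)

print("best alpha:", search.best_params_)
print("test MSE:", mean_squared_error(y_test, search.predict(X_test)))
```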
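For Exercise 2, one way to phrase the binary question and vary the classification threshold is sketched below; the price cutoff, the single feature, and the threshold values are assumptions, and the same housing.csv file as in the previous sketch is reused.

```python
# Binary classification with an adjustable decision threshold (sketch; cutoff and feature are assumed).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, precision_score, recall_score

df = pd.read_csv("housing.csv")
X = df[["median_income"]]
y = (df["median_house_value"] > 265000).astype(int)   # "above a certain price?" (cutoff assumed)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
clf = LogisticRegression().fit(X_train, y_train)

proba = clf.predict_proba(X_test)[:, 1]
for threshold in (0.3, 0.5, 0.7):                     # move the classification threshold
    pred = (proba >= threshold).astype(int)
    print(threshold,
          "acc=%.3f" % accuracy_score(y_test, pred),
          "prec=%.3f" % precision_score(y_test, pred),
          "rec=%.3f" % recall_score(y_test, pred))
```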
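For Exercise 4, the three-way split and the training-versus-validation comparison could be organised as below; it again assumes the housing.csv file and columns from the earlier sketches, and the tree depth is an arbitrary value to vary while checking for overfitting.

```python
# Train / validation / test split with a simple overfitting check (sketch; dataset and depth assumed).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error

df = pd.read_csv("housing.csv")
X, y = df[["median_income"]], df["median_house_value"]

# First carve out a held-out test set, then split the remainder into train/validation.
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(X_rest, y_rest, test_size=0.25, random_state=42)

model = DecisionTreeRegressor(max_depth=12).fit(X_train, y_train)

# A large gap between training and validation error suggests overfitting;
# reduce max_depth (or add data/regularization) and only then look at the test set.
print("train MSE:", mean_squared_error(y_train, model.predict(X_train)))
print("val   MSE:", mean_squared_error(y_val, model.predict(X_val)))
print("test  MSE:", mean_squared_error(y_test, model.predict(X_test)))
```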
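For Exercise 5, a k-means sketch is given below; it assumes the UCI codon-usage file has been downloaded as codon_usage.csv and that the codon frequency columns begin at the sixth column, which should be checked against the actual file. The number of clusters is an assumption to vary.

```python
# K-means clustering on the codon usage data (sketch; file name and column layout are assumed).
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

df = pd.read_csv("codon_usage.csv")                  # from the UCI repository
# Keep only the (assumed) codon frequency columns and drop rows that fail numeric conversion.
X = df.iloc[:, 5:].apply(pd.to_numeric, errors="coerce").dropna()

X_scaled = StandardScaler().fit_transform(X)
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X_scaled)

print("cluster sizes:", pd.Series(km.labels_).value_counts().to_dict())
print("silhouette:", silhouette_score(X_scaled, km.labels_))
```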
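For Exercise 6, a Gaussian Naïve Bayes sketch is shown below; the file name gait.csv and the label column name "class" are assumptions and need to be adapted to however the UCI gait data is actually laid out.

```python
# Gaussian Naïve Bayes classifier (sketch; file name and label column name are assumed).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import classification_report

df = pd.read_csv("gait.csv")                 # exported from the UCI gait data (assumed format)
X = df.drop(columns=["class"])               # feature columns
y = df["class"]                              # target label (assumed column name)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)
nb = GaussianNB().fit(X_train, y_train)

print(classification_report(y_test, nb.predict(X_test)))
```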
COURSE OUTCOMES:
Upon the completion of the course, students will be able to
CO1: Understand and outline problems for each type of machine learning
CO2: Design a Decision tree and Random forest for an application
CO3: Implement Probabilistic Discriminative and Generative algorithms for an application and
analyze the results.
CO4: Use a tool to implement typical Clustering algorithms for different types of applications.
CO5: Design and implement an HMM for a Sequence Model type of application and identify applications
suitable for different types of Machine Learning with suitable justification.
TOTAL:75 PERIODS
REFERENCES
1. Stephen Marsland, “Machine Learning: An Algorithmic Perspective”, Chapman & Hall/CRC, 2nd
Edition, 2014.
2. Kevin Murphy, “Machine Learning: A Probabilistic Perspective”, MIT Press, 2012
3. Ethem Alpaydin, “Introduction to Machine Learning”, Third Edition, Adaptive Computation and
Machine Learning Series, MIT Press, 2014
4. Tom M Mitchell, “Machine Learning”, McGraw Hill Education, 2013.
5. Peter Flach, “Machine Learning: The Art and Science of Algorithms that Make Sense of Data”, First
Edition, Cambridge University Press, 2012.
6. Shai Shalev-Shwartz and Shai Ben-David, “Understanding Machine Learning: From Theory to
Algorithms”, Cambridge University Press, 2015
7. Christopher Bishop, “Pattern Recognition and Machine Learning”, Springer, 2007.
8. Hal Daumé III, “A Course in Machine Learning”, 2017 (freely available online)
9. Trevor Hastie, Robert Tibshirani, Jerome Friedman, “The Elements of Statistical Learning”,
Springer, 2009 (freely available online)
10. Aurélien Géron, “Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts,
Tools, and Techniques to Build Intelligent Systems”, 2nd Edition, O'Reilly, 2017.
COURSE OBJECTIVES:
To understand the computational approaches to Modeling, Feature Extraction
To understand the need and application of Map Reduce
To understand the various search algorithms applicable to Big Data
To analyze and interpret streaming data
To learn how to handle large data sets in main memory and learn the various clustering
techniques applicable to Big Data
UNIT V CLUSTERING 9
Introduction to Clustering Techniques – Hierarchical Clustering – Algorithms – K-Means – CURE –
Clustering in Non-Euclidean Spaces – Streams and Parallelism – Case Study: Advertising on the
Web – Recommendation Systems.
TOTAL: 45 PERIODS
COURSE OUTCOMES:
Upon completion of this course, the students will be able to
CO1: Design algorithms by employing Map Reduce technique for solving Big Data problems.
CO2: Design algorithms for Big Data by deciding on the apt feature set.
CO3: Design algorithms for handling petabytes of datasets
CO4: Design algorithms and propose solutions for Big Data by optimizing main
memory consumption
CO5: Design solutions for problems in Big Data by suggesting appropriate clustering techniques.
REFERENCES:
1. Jure Leskovec, Anand Rajaraman, Jeffrey David Ullman, “Mining of Massive Datasets”,
Cambridge University Press, 3rd Edition, 2020.
2. Jiawei Han, Micheline Kamber, Jian Pei, “Data Mining Concepts and Techniques”, Morgan
Kaufman Publications, Third Edition, 2012.
3. Ian H. Witten, Eibe Frank, “Data Mining – Practical Machine Learning Tools and
Techniques”, Morgan Kaufman Publications, Third Edition, 2011.
4. David Hand, Heikki Mannila and Padhraic Smyth, “Principles of Data Mining”, MIT
Press, 2001.
WEB REFERENCES:
1. https://swayam.gov.in/nd2_arp19_ap60/preview
2. https://nptel.ac.in/content/storage2/nptel_data3/html/mhrd/ict/text/106104189/lec1.pdf
ONLINE RESOURCES:
1. https://examupdates.in/big-data-analytics/
2. https://www.tutorialspoint.com/big_data_analytics/index.htm
3. https://www.tutorialspoint.com/data_mining/index.htm
19MCN19 ADVANCED DIGITAL IMAGE PROCESSING L T P C
(Common to M.E - CN, AE & VLSI)
3 0 0 3
OBJECTIVES
COURSE OUTCOMES
SYLLABUS
UNIT II SEGMENTATION 9
Edge detection, Thresholding, Region growing, Fuzzy clustering, Watershed algorithm, Active
contour models, Texture feature based segmentation, Graph based segmentation, Wavelet based
Segmentation - Applications of image segmentation.
Overview of image fusion, pixel fusion, wavelet based fusion -region based fusion.
UNIT V 3D IMAGE VISUALIZATION 9
Sources of 3D Data sets, Slicing the Data set, Arbitrary section planes, The use of color,
Volumetric display, Stereo Viewing, Ray tracing, Reflection, Surfaces, Multiple connected
surfaces, Image processing in 3D, Measurements on 3D images.
TOTAL: 45 PERIODS
REFERENCES