AAI Module 1
These are the original notes with summarized content for Module 1 of AAI.
Uploaded by
ahmed.412052.cs
MODULE 1
CHAPTER 1: Generative and Probabilistic Models
Advanced Artificial Intelligence (MU-Sem 8, AI&DS), MU New Syllabus w.e.f. Academic Year 23-24

University Prescribed Syllabus
Introduction: Overview of generative models and their importance in AI, fundamentals of probability theory and generative modeling, introduction to GANs, VAEs and other generative models. Significance of generative models, challenges with generative models.
Probabilistic Models: Gaussian Mixture Models (GMMs), Hidden Markov Models (HMMs), Bayesian Networks, Markov Random Fields (MRFs), Probabilistic Graphical Models.

1.1 INTRODUCTION

1.1.1 Overview of Generative Models and their Importance in AI

Q1. What is a generative model? (2 Marks) What are the different types of generative models? (5 Marks) Describe the importance of generative models in AI. (5 Marks)

Generative models are a class of machine learning models used in artificial intelligence (AI) to create new data that mimics a given dataset. They have significant importance in AI due to their ability to generate new data, which is useful for a wide range of applications.

What Are Generative Models?
- Generative models are a type of machine learning model designed to capture the underlying distribution of a dataset and then generate new data samples that resemble the original data. They learn the probabilistic relationships between features in the data, allowing them to create new data points that share similarities with the training data.
- Generative models are essential in AI for several reasons, discussed below.

Types of Generative Models
There are various types of generative models, each with its own strengths and use cases:
1. Autoregressive Models: These models model the conditional probability of each data point given the previous ones. Examples include Recurrent Neural Networks (RNNs), Long Short-Term Memory networks (LSTMs), and Transformers.
2. Variational Autoencoders (VAEs): VAEs are a type of probabilistic autoencoder that can generate data by sampling from a learned latent space. They are used in tasks like image generation and denoising.
3. Generative Adversarial Networks (GANs): GANs consist of a generator and a discriminator network that compete with each other.
The generator tries to create realistic data, while the discriminator tries to distinguish between real and generated data. GANs are widely used for image generation, style transfer, and creating deepfakes.
4. Flow-Based Models: These models learn an invertible transformation between the data and a simple distribution (e.g., Gaussian). Normalizing flows and RealNVP are examples of such models.
5. PixelCNN and PixelRNN: These autoregressive models are used for generating high-resolution images by modeling the conditional distribution of each pixel given the previous pixels.
6. Boltzmann Machines: These are stochastic generative neural networks used for various tasks, including collaborative filtering and unsupervised feature learning.

Importance of Generative Models in AI
Generative models play a pivotal role in various AI applications:
- Data Augmentation: Generative models can create additional synthetic training data, increasing the amount of data available for training and improving model generalization.
- Image Generation: GANs, in particular, have been used to generate realistic images, artwork, and deepfakes.
- Anomaly Detection: By learning the distribution of normal data, generative models can flag low-likelihood data points as anomalies.
- Text Generation: Generative language models can produce coherent text, powering applications such as chatbots, machine translation, and natural language understanding.

Q2. What are Gaussian Mixture Models (GMMs), and how are they used in modelling probability distributions characterized by multiple Gaussian components?

Gaussian Mixture Models (GMMs)
- The Gaussian mixture model (GMM) is a probabilistic model that assumes the data points come from a limited set of Gaussian distributions with unknown parameters.
- Each individual Gaussian distribution is characterized by its mean and covariance matrix.
- As an extension of the k-means clustering technique, a GMM takes into account the data's covariance structure and the likelihood of each point being derived from each Gaussian distribution.
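To make the GMM idea concrete, here is a minimal sketch (not from the notes) that fits a two-component one-dimensional mixture with the EM algorithm using plain NumPy. The synthetic data, initial guesses, and iteration count are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D data drawn from two Gaussians (the "true" mixture):
# 30% of points near -2, 70% near 3.
data = np.concatenate([rng.normal(-2.0, 0.5, 300),
                       rng.normal(3.0, 1.0, 700)])

# Initial guesses for the K=2 components: means, variances, mixing weights.
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])
pi = np.array([0.5, 0.5])

def gaussian_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: responsibility of each component for each point.
    dens = pi * gaussian_pdf(data[:, None], mu, var)   # shape (N, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities.
    nk = resp.sum(axis=0)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    var = (resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk
    pi = nk / len(data)

# The recovered means and weights should lie near the true values.
print(np.round(np.sort(mu), 2), np.round(np.sort(pi), 2))
```

Unlike hard k-means assignments, each point here contributes fractionally to every component through its responsibility, which is exactly the "likelihood of each point being derived from each Gaussian" mentioned above.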
A Gaussian mixture function is composed of several Gaussians, each identified by k ∈ {1, ..., K}, where K is the number of clusters in our dataset. Each Gaussian k in the mixture is described by the following parameters:
- A mean μ that defines its centre.
- A covariance Σ that defines its width. This would be equivalent to the dimensions of an ellipsoid in a multivariate scenario.
- A mixing probability π that defines how big or small the Gaussian function will be.
Together these give the mixture density p(x) = Σ_{k=1}^{K} π_k N(x | μ_k, Σ_k), where the mixing probabilities satisfy Σ_k π_k = 1.

Limitations of GMMs
- GMMs assume each component follows a Gaussian distribution, which may not suit highly non-Gaussian data.
- Determining the optimal number of components K can be challenging.
- GMMs are sensitive to initialization.

Q3. Could you explain the basic concepts behind Hidden Markov Models (HMMs) and provide examples of real-world applications where they are employed?

Hidden Markov Models (HMMs)
- The Hidden Markov Model (HMM) is a statistical model that is used to describe the probabilistic relationship between a sequence of observations and a sequence of hidden states.
- It is often used in situations where the underlying system or process that generates the observations is unknown or hidden, hence the name "Hidden Markov Model."
- It is used to predict future observations or classify sequences, based on the underlying hidden process that generates the data.

An HMM consists of two types of variables: hidden states and observations.
1. The hidden states are the underlying variables that generate the observed data, but they are not directly observable.
2. The observations are the variables that are measured and observed.

1.2.2 Hidden Markov Models (HMMs)
An HMM model consists of these basic parts:
- hidden states
- observation symbols (observed states)
- initial-state probability distribution (the transition from the initial state to the first hidden state)
- terminal-state transition probability distribution (in most cases excluded from the model, because all such probabilities are assumed equal in the general case)
- state transition probability distribution
- state emission probability distribution

The next section explains these HMM parts in detail.

Hidden states and observation symbols
An HMM has two parts: hidden and observed. The hidden part consists of hidden states which are not directly observed; their presence is inferred through the observation symbols that the hidden states emit.

Example 1: You don't know what mood your friend is in (moods are the hidden states), but you observe their actions (the observable symbols), and from the actions you observe you make a guess about the hidden state they are in.

Example 2: You want to know your friend's activity, but you can only observe what the weather is like outside. Your friend's activities are the hidden states, which "emit" the observable symbols, the weather conditions. You might think it should be the other way around, with the weather conditions as hidden states and your friend's activities as observable symbols. But the key point is that you can observe the weather, while you cannot observe your friend's activity; that is what makes the activities the hidden states.

Note that in the mood example the observed symbols are actually emitted from the hidden states, whereas in the friend-activity example the observed symbols act more like a cause of your friend's activities. So observation symbols can be a direct consequence of the hidden states, or they can be like a reason for them; it can work both ways, and this is part of the flexibility of HMMs. In general, you choose as hidden states the things you cannot directly observe (mood, a friend's activity, etc.) and as observation symbols the things you can always observe (actions, weather conditions, etc.).

Hidden states and observation symbols, visualized for Example 2:
Hidden states (friend's activities): Basketball (B), Football (F), Videogames (G)
Observable symbols (weather): Sunny (S), Cloudy (C), Rainy (R)
Fig. 1.2.2: Hidden states and observable symbols
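The Example 2 setup can be turned into a runnable sketch. The notes give no numbers, so the initial, transition, and emission probabilities below are invented for illustration; the forward algorithm then computes the likelihood of a weather sequence under the model by summing over all hidden activity paths.

```python
import numpy as np

# Hidden states: friend's activities; observations: weather (Example 2).
states = ["Basketball", "Football", "Videogames"]
obs_symbols = ["Sunny", "Cloudy", "Rainy"]

# All probabilities below are made-up illustrative numbers.
start = np.array([0.4, 0.3, 0.3])            # initial state distribution
trans = np.array([[0.5, 0.3, 0.2],           # state transition matrix A
                  [0.3, 0.4, 0.3],
                  [0.2, 0.3, 0.5]])
emit = np.array([[0.7, 0.2, 0.1],            # emission matrix B: P(weather | activity)
                 [0.4, 0.4, 0.2],
                 [0.1, 0.3, 0.6]])

def forward(obs):
    """Likelihood of an observation sequence via the forward algorithm."""
    alpha = start * emit[:, obs[0]]          # alpha_1(i) = pi_i * b_i(o_1)
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o] # alpha_t(j) = sum_i alpha_{t-1}(i) A_ij * b_j(o_t)
    return alpha.sum()

seq = [0, 0, 2]   # Sunny, Sunny, Rainy
print(forward(seq))
```

The forward recursion costs O(T K^2) instead of the O(K^T) cost of enumerating every hidden path, which is why it is the standard evaluation procedure for HMMs.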
Q4. What are Bayesian Networks, and how do they represent probabilistic relationships among a set of random variables using a directed acyclic graph?

Bayesian Networks
- Bayesian networks are a widely-used class of probabilistic graphical models.
- They consist of two parts: a structure and parameters.
- The structure is a directed acyclic graph (DAG) that expresses conditional independencies and dependencies among random variables associated with nodes.
- The parameters consist of conditional probability distributions associated with each node.
- A Bayesian network is a compact, flexible and interpretable representation of a joint probability distribution.
- It is also a useful tool in knowledge discovery, as directed acyclic graphs allow representing causal relations between variables. Typically, a Bayesian network is learned from data.

Q5. How are Markov Random Fields (MRFs) different from other probabilistic models, and what advantages do they offer in capturing spatial dependencies in structured data such as images?

Markov Random Fields
As a motivating example, suppose that we are modelling voting preferences among persons A, B, C and D. Say (A, B), (B, C), (C, D) and (D, A) are friends, and friends tend to have similar voting preferences. These influences can be naturally represented by an undirected graph.

One way to define a probability over the joint voting decision of A, B, C, D is to assign a score to each assignment of these variables and then define a probability as a normalized score. A score can be any function that is non-negative; here we define it as a product of factors:

    score(A, B, C, D) = φ(A, B) φ(B, C) φ(C, D) φ(D, A)

where each pairwise factor φ(X, Y) assigns a higher value to assignments in which the two friends agree, for example φ(X, Y) = 10 if X = Y and 1 otherwise.

The final probability is then defined as

    p(A, B, C, D) = (1/Z) score(A, B, C, D)

where Z = Σ_{A,B,C,D} score(A, B, C, D) is a normalizing constant (the partition function) that ensures the distribution sums to one. The undirected graph, with one node per person and an edge for each friendship, gives a graphical representation of this distribution; each edge corresponds to one pairwise factor in the model.

Probabilistic Graphical Models (PGMs)
PGMs are models that encode complex joint multivariate probability distributions using graphs. In other words, PGMs capture conditional independence relationships between interacting random variables.
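Since the voting example has only 2^4 joint assignments, its distribution can be computed by direct enumeration. This sketch uses the illustrative factor φ(X, Y) = 10 if X = Y and 1 otherwise (the exact value 10 is a free choice, not fixed by the model), and builds the partition function Z explicitly.

```python
import itertools

# Pairwise factor from the voting example: agreement between friends is rewarded.
def phi(x, y):
    return 10.0 if x == y else 1.0

names = ["A", "B", "C", "D"]
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A")]   # friendship cycle

def score(assign):
    """Unnormalized score: product of the pairwise factors."""
    s = 1.0
    for u, v in edges:
        s *= phi(assign[u], assign[v])
    return s

# Partition function Z: sum of scores over all 2^4 joint assignments (votes 0/1).
Z = sum(score(dict(zip(names, votes)))
        for votes in itertools.product([0, 1], repeat=4))

def prob(assign):
    return score(assign) / Z

# Unanimous votes are far more likely than an alternating split.
print(prob({"A": 1, "B": 1, "C": 1, "D": 1}))
print(prob({"A": 1, "B": 0, "C": 1, "D": 0}))
```

Enumeration is only feasible for toy models; computing Z is exactly what becomes intractable for large MRFs, which motivates the approximate inference methods discussed later.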
- This is beneficial, since a lot of knowledge about graphs has been gathered over the years in various domains, especially on separating subsets, cliques, and functions on graphs.
- This knowledge can be reused in PGMs. Furthermore, one can easily visualize PGMs and get a quick overview of the model structure.
- By knowing the graph structure of a PGM, one can solve tasks such as inference (computing the marginal distribution of one or more random variables) or learning (estimating the parameters of the probability functions). One can even try to learn the structure of the graph itself, given some data.

Basics of Probabilistic Graphical Models
1. Graphical Representation: PGMs use a graphical structure to represent and visualize the relationships between variables. In these graphs, nodes represent variables, and edges represent probabilistic dependencies or interactions between variables.
2. Probabilistic Relationships: PGMs capture probabilistic relationships among variables, allowing us to model uncertainty and express conditional dependencies between variables.
3. Main Families: In Bayesian Networks, edges are directed and indicate conditional (often causal) relationships between variables. In Markov Random Fields (MRFs), edges have no direction, and they indicate pairwise interactions or conditional dependencies.

Components of PGMs
- Nodes (Vertices): Nodes in a PGM represent random variables. These variables can be discrete, continuous, or a combination of both.
- Edges: Edges in a PGM represent probabilistic dependencies or interactions between variables. In Bayesian Networks, edges indicate causal relationships, while in Markov Random Fields, they denote pairwise associations.
- Conditional Probability Distributions: Each node in a Bayesian Network has an associated conditional probability distribution that quantifies the likelihood of the node given its parent nodes. In Markov Random Fields, potential functions or energy functions describe the compatibility between variables in cliques.
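As a concrete illustration of conditional probability distributions attached to nodes, here is a sketch of the classic rain/sprinkler/wet-grass network. It is not taken from the notes, and the table values are the commonly quoted illustrative ones; inference by enumeration then answers a query such as P(Rain | GrassWet).

```python
# Tiny Bayesian network: Rain -> Sprinkler, and Rain, Sprinkler -> GrassWet.
# Each variable carries a conditional probability table given its parents.

p_rain = {True: 0.2, False: 0.8}
p_sprinkler = {True: {True: 0.01, False: 0.99},    # P(Sprinkler | Rain=True)
               False: {True: 0.4, False: 0.6}}     # P(Sprinkler | Rain=False)
p_wet = {(True, True): 0.99, (True, False): 0.9,   # P(Wet=True | Sprinkler, Rain)
         (False, True): 0.8, (False, False): 0.0}

def joint(r, s, w):
    """Chain-rule factorization: P(R) * P(S | R) * P(W | S, R)."""
    pw = p_wet[(s, r)]
    return p_rain[r] * p_sprinkler[r][s] * (pw if w else 1 - pw)

# Inference by enumeration: P(Rain=True | Wet=True).
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
print(round(num / den, 4))   # -> 0.3577
```

The compactness claim above is visible here: three CPTs with 1 + 2 + 4 entries specify the full 8-entry joint distribution, and every query reduces to sums over that factorization.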
Applications of Probabilistic Graphical Models
1. Pattern Recognition: PGMs are used in pattern recognition tasks, including image recognition, speech recognition, and gesture recognition.
2. Natural Language Processing: PGMs are applied to various NLP tasks like part-of-speech tagging, syntactic parsing, and machine translation.
3. Computer Vision: PGMs are used for tasks such as object recognition, image segmentation, and 3D scene reconstruction.

Limitations of PGMs
1. Scalability: PGMs can become computationally infeasible for large, high-dimensional problems due to the need for exact or approximate inference.
2. Complexity of Learning: Learning the structure and parameters of PGMs from data can be challenging, particularly for models with many variables.
3. Assumptions: PGMs make certain independence assumptions, and if these assumptions do not hold in the data, the model may not accurately represent the underlying distribution.