Template for UF Recherche Bib. Review (INSA Toulouse DGM)


INSA Toulouse - Semester 1 (2022-2023) Bib. Review

Title of your research

Prenom Nom 1 email1@insa-toulouse.fr
Department of Mechanical Engineering
INSA Toulouse
Toulouse, France

Prenom Nom 2 email2@insa-toulouse.fr
Department of Mechanical Engineering
INSA Toulouse
Toulouse, France

Tutors: E. Marenic, P. Oumaziz and A.C. Araujo

Abstract
This paper describes the mixtures-of-trees model, a probabilistic model for discrete multidimensional domains. Mixtures-of-trees generalize the probabilistic trees of Chow and Liu (1968) in a different and complementary direction to that of Bayesian networks. We present efficient algorithms for learning mixtures-of-trees models in maximum likelihood and Bayesian frameworks. We also discuss additional efficiencies that can be obtained when data are “sparse,” and we present data structures and algorithms that exploit such sparseness. Experimental results demonstrate the performance of the model for both density estimation and classification. We also discuss the sense in which tree-based classifiers perform an implicit form of feature selection, and demonstrate a resulting insensitivity to irrelevant attributes.
Keywords: Bayesian Networks, Mixture Models, Chow-Liu Trees

1. Introduction

Probabilistic inference has become a core technology in AI, largely due to developments in graph-theoretic methods for the representation and manipulation of complex probability distributions (Pearl, 1988). Whether in their guise as directed graphs (Bayesian networks) or as undirected graphs (Markov random fields), probabilistic graphical models have a number of virtues as representations of uncertainty and as inference engines. Graphical models allow a separation between qualitative, structural aspects of uncertain knowledge and the quantitative, parametric aspects of uncertainty...
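The tree-structured distributions that this paper builds on are learned with the procedure of Chow and Liu (1968): weight every pair of variables by its empirical mutual information and keep a maximum-weight spanning tree over the variables. A minimal sketch of that procedure follows; the toy dataset, the function names, and the choice of Kruskal's algorithm with union-find are illustrative assumptions, not the paper's implementation.

```python
from collections import Counter
from itertools import combinations
from math import log

def empirical_mi(col_a, col_b):
    """Empirical mutual information (in nats) between two columns of samples."""
    n = len(col_a)
    joint = Counter(zip(col_a, col_b))
    pa, pb = Counter(col_a), Counter(col_b)
    # sum_{a,b} p(a,b) * log( p(a,b) / (p(a) p(b)) ), with counts c/n as p(a,b)
    return sum((c / n) * log(c * n / (pa[a] * pb[b]))
               for (a, b), c in joint.items())

def chow_liu_edges(data):
    """Edges of a maximum-weight spanning tree over the variables,
    weighted by pairwise empirical mutual information (Chow & Liu, 1968).
    `data` is a list of columns, one per variable."""
    d = len(data)
    weights = sorted(((empirical_mi(data[i], data[j]), i, j)
                      for i, j in combinations(range(d), 2)), reverse=True)
    # Kruskal's algorithm: greedily add the heaviest edge that creates no cycle,
    # tracked with a union-find structure (path halving).
    parent = list(range(d))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    edges = []
    for _, i, j in weights:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            edges.append((i, j))
    return edges

# Toy data: three binary variables, one column per variable; x1 copies x0,
# so the edge (0, 1) carries the highest mutual information.
data = [
    [0, 0, 1, 1, 0, 1, 0, 1],   # x0
    [0, 0, 1, 1, 0, 1, 0, 1],   # x1 (identical to x0)
    [0, 1, 0, 1, 0, 1, 0, 1],   # x2
]
print(chow_liu_edges(data))     # a 2-edge spanning tree containing (0, 1)
```

A tree over d variables has d-1 edges, so the returned list always has one fewer entry than there are columns.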

2. Name of section 2

Let $u, v, w$ be discrete variables such that $v, w$ do not co-occur with $u$ (i.e., $u \neq 0 \Rightarrow v = w = 0$ in a given dataset $D$). Let $N_{v0}$, $N_{w0}$ be the number of data points for which $v = 0$ and $w = 0$ respectively, and let $I_{uv}$, $I_{uw}$ be the respective empirical mutual information values based on the sample $D$. Then

$$N_{v0} > N_{w0} \Rightarrow I_{uv} \leq I_{uw} \qquad (1)$$

with equality only if $u$ is identically 0. Entropies will be denoted by $H$. We aim to show that $\partial I_{uv} / \partial P_{v0} < 0$....

UF Recherche - 22/23 ©M.Rojas Date : 10.Jan.23
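Claim (1) can be checked numerically on a toy dataset that satisfies the co-occurrence condition (whenever $u \neq 0$, both $v$ and $w$ are 0). The dataset and the helper function below are illustrative assumptions, chosen only to exercise the inequality.

```python
from collections import Counter
from math import log

def empirical_mi(xs, ys):
    """Empirical mutual information I(X;Y) in nats from paired samples."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    # sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) )
    return sum((c / n) * log(c * n / (px[x] * py[y]))
               for (x, y), c in joint.items())

# Toy dataset D: v and w are nonzero only where u == 0,
# so the co-occurrence condition of the claim holds.
u = [1, 1, 0, 0, 0, 0, 0, 0]
v = [0, 0, 1, 0, 0, 0, 0, 0]   # N_v0 = 7
w = [0, 0, 1, 1, 1, 0, 0, 0]   # N_w0 = 5

Nv0, Nw0 = v.count(0), w.count(0)
Iuv = empirical_mi(u, v)
Iuw = empirical_mi(u, w)
print(Nv0 > Nw0, Iuv <= Iuw)   # -> True True, as claim (1) predicts
```

The variable with fewer zeros ($w$) shares more information with $u$ here, matching the direction of the inequality.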

3. Conclusions
Subjects in this study were able to ....

References
C. K. Chow and C. N. Liu. Approximating discrete probability distributions with dependence trees. IEEE Transactions on Information Theory, IT-14(3):462–467, 1968.

Judea Pearl. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann Publishers, San Mateo, CA, 1988.
