#### CS 798D: Algorithms for Bayesian Networks and Causality

###### Prerequisites:

Basics of Probability and Statistics, Data Structures and Algorithms, or consent of the instructor

###### Course Description:

Modern machine learning algorithms often deal with datasets having hundreds or thousands of features. In the unsupervised learning setting, such datasets are often modelled as random samples from a high-dimensional distribution that we wish to learn. Bayesian networks are a popular class of probabilistic graphical models that represent high-dimensional probability distributions succinctly by exploiting limited conditional dependence among the component random variables. These networks have been used to model large datasets in many application areas, including bioinformatics, medical diagnosis, information retrieval, and image processing. In this course, we undertake a formal algorithmic study of Bayesian networks. We will look at the central problem of efficiently learning an unknown Bayesian network from an optimal number of samples. Along the way, we will touch upon several information-theoretic tools that are useful in data science.
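To see how the factorization works, here is a minimal sketch (a hypothetical toy example, not part of the course material) of a three-node chain A → B → C over binary variables. The joint distribution factorizes as P(A, B, C) = P(A) · P(B | A) · P(C | B), so three small conditional probability tables (CPTs) stand in for the full 2³-entry joint table; for n variables with bounded in-degree, this saving becomes exponential.

```python
# Toy Bayes net A -> B -> C over binary variables (illustrative numbers).
# The joint factorizes as P(A, B, C) = P(A) * P(B | A) * P(C | B).

p_a = 0.6                       # P(A = 1)
p_b_given_a = {0: 0.2, 1: 0.7}  # P(B = 1 | A = a)
p_c_given_b = {0: 0.1, 1: 0.8}  # P(C = 1 | B = b)

def bern(p, x):
    """P(X = x) for a Bernoulli variable with success probability p."""
    return p if x == 1 else 1 - p

def joint(a, b, c):
    """P(A = a, B = b, C = c) via the Bayes-net factorization."""
    return (bern(p_a, a)
            * bern(p_b_given_a[a], b)
            * bern(p_c_given_b[b], c))

# The factorization defines a valid distribution: the eight entries sum to 1.
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
print(f"sum over all assignments = {total:.6f}")  # 1.000000
print(f"P(A=1, B=1, C=1) = {joint(1, 1, 1):.3f}")  # 0.6 * 0.7 * 0.8 = 0.336
```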

In the second part of the course, we will study causality using graphical models. Here we treat the probabilistic dependence of each node in the Bayesian network as an independent mechanism subject to manipulation. Specifically, the causal effect of one set of nodes on another is formalized by an intervention that fixes a set of nodes of the Bayes net to particular values, ignoring their parents, while all other nodes are still governed by the usual probabilistic dependences of the Bayes net. We will discuss several important problems, including learning causal effects and causal models from observational and interventional data.
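The distinction between conditioning and intervening can be made concrete with a minimal sketch (hypothetical numbers, not from the course). In the network A → B, A → C, B → C, the node A confounds B and C: conditioning on B = 1 lets information flow back through A, while the intervention do(B = 1) cuts the edge A → B, so in general P(C | B = 1) ≠ P(C | do(B = 1)).

```python
# Confounded Bayes net: A -> B, A -> C, B -> C (illustrative numbers).

p_a = 0.5                       # P(A = 1)
p_b_given_a = {0: 0.1, 1: 0.9}  # P(B = 1 | A = a)
p_c_given_ab = {                # P(C = 1 | A = a, B = b)
    (0, 0): 0.1, (0, 1): 0.3,
    (1, 0): 0.6, (1, 1): 0.9,
}

def bern(p, x):
    """P(X = x) for a Bernoulli variable with success probability p."""
    return p if x == 1 else 1 - p

# Observational: P(C = 1 | B = 1), summing out the confounder A.
num = sum(bern(p_a, a) * bern(p_b_given_a[a], 1) * p_c_given_ab[(a, 1)]
          for a in (0, 1))
den = sum(bern(p_a, a) * bern(p_b_given_a[a], 1) for a in (0, 1))
p_obs = num / den

# Interventional: do(B = 1) removes B's dependence on A, so by backdoor
# adjustment P(C = 1 | do(B = 1)) = sum_a P(a) * P(C = 1 | a, B = 1).
p_do = sum(bern(p_a, a) * p_c_given_ab[(a, 1)] for a in (0, 1))

print(f"P(C=1 | B=1)     = {p_obs:.3f}")  # 0.840
print(f"P(C=1 | do(B=1)) = {p_do:.3f}")   # 0.600
```

The gap between the two numbers is exactly the confounding bias that Pearl's do-calculus, covered in the lectures, is designed to account for.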

###### Course Contents:
| Title | Hours |
| --- | --- |
| **Bayesian networks** | |
| Revision of probability basics | 1 |
| Introduction to probabilistic graphical models | 1 |
| Introduction to Bayesian networks | 1 |
| Learning a Bayes net using hypothesis selection | 1.5 |
| Information-theoretic lower bound | 1.5 |
| Fixed-structure discrete Bayes net | 1 |
| Tree-structured discrete Bayes net | 1.5 |
| Fixed-structure Gaussian Bayes net | 1.5 |
| NP-hardness of learning Bayes nets | 1 |
| Learning algorithms: PC and GES | 3 |
| **Causal models** | |
| Introduction to causality | 2 |
| Pearl's formalism and do-calculus | 1 |
| Identifiability and complete conditions | 2 |
| Algorithms for learning causal models | 1 |
| Causality and machine learning | 2 |
| **Additional lectures** | |
| Ising models | 3 |
| Gaussian graphical models | 3 |
| Student presentations | 13 |
| **Total** | **39** |

###### References:
1. Probabilistic Graphical Models. MIT Press. Daphne Koller and Nir Friedman.
2. Causality. Cambridge University Press. Judea Pearl.
3. Probabilistic Machine Learning. MIT Press. Kevin Murphy.
4. Machine Learning: A Probabilistic Perspective. MIT Press. Kevin Murphy.