# Special Topics in Machine Learning: Probabilistic Graphical Models (CSCI-GA.3033-006)

**Instructor:** David Sontag, Computer Science
## Prerequisites

This is a graduate-level course. Students should previously have taken one of the following classes:
In addition, students should have a solid understanding of basic concepts from probability (e.g., Bayes' rule, multivariate distributions, conditional independence) and algorithms (e.g., dynamic programming, graphs, shortest paths, complexity).
These prerequisites may be waived in some cases (please e-mail the instructor).
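As a quick self-check of the probability background assumed, here is Bayes' rule applied to a standard diagnostic-test setting. The sensitivity, false-positive rate, and prevalence numbers below are purely illustrative, not course material:

```python
# Bayes' rule: P(disease | +) = P(+ | disease) P(disease) / P(+).
# All numbers are illustrative.
def posterior(sensitivity, false_positive_rate, prior):
    """Posterior probability of disease given a positive test."""
    evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / evidence

# A 99%-sensitive test with a 5% false-positive rate and 1% prevalence:
# the posterior is far below the sensitivity, because the prior is low.
print(posterior(0.99, 0.05, 0.01))
```

If working through why the result is only about 1/6 feels unfamiliar, reviewing the probability prerequisites before the course is recommended.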

## Book

*Probabilistic Graphical Models: Principles and Techniques* by Daphne Koller and Nir Friedman. MIT Press, 2009.

## (Draft) Syllabus

**Probabilistic reasoning and AI** (2 lectures)
- Introduction: uncertainty in artificial intelligence, machine learning
- Background on probability and statistics, loss functions
- Latent variable models, mixture models, classification
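As a taste of the loss-function material, the log loss (average negative log-likelihood) that underlies maximum-likelihood classification can be written in a few lines; the labels and predicted probabilities below are made up for illustration:

```python
from math import log

# Average negative log-likelihood for binary labels y and predicted
# probabilities p; the data here are illustrative.
def log_loss(y_true, p_pred):
    return -sum(y * log(p) + (1 - y) * log(1 - p)
                for y, p in zip(y_true, p_pred)) / len(y_true)

print(log_loss([1, 0, 1], [0.9, 0.2, 0.8]))
```

A predicted probability of 0.5 on a single example gives a loss of ln 2, the entropy of a fair coin, which is a useful sanity check.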

**Introduction to graphical models** (2 lectures)
- Directed models: generative models, Bayesian networks
- Undirected models: exponential family, Markov random fields, factor graphs
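As a preview of the directed case, a Bayesian network's joint distribution factorizes into a product of local conditional distributions. The sketch below uses the common rain/sprinkler/wet-grass structure with invented CPT numbers (not course material):

```python
from itertools import product

# CPTs for Rain -> Sprinkler, (Rain, Sprinkler) -> GrassWet.
# All probabilities are invented for illustration.
P_R = {True: 0.2, False: 0.8}                     # P(Rain = r)
P_S = {True: {True: 0.01, False: 0.99},
       False: {True: 0.4, False: 0.6}}            # P_S[r][s] = P(Sprinkler = s | Rain = r)
P_W = {(True, True): 0.99, (True, False): 0.8,
       (False, True): 0.9, (False, False): 0.0}   # P(Wet = True | Rain = r, Sprinkler = s)

def joint(r, s, w):
    """P(r, s, w) = P(r) P(s | r) P(w | r, s): the network's factorization."""
    pw = P_W[(r, s)]
    return P_R[r] * P_S[r][s] * (pw if w else 1.0 - pw)

# Marginal P(Wet = True) by summing the joint over the unobserved variables.
p_wet = sum(joint(r, s, True) for r, s in product([True, False], repeat=2))
print(p_wet)
```

Brute-force enumeration like this is exponential in the number of variables; the inference lectures cover algorithms that exploit the graph structure to do better.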

**Exact and approximate probabilistic inference** (5 lectures)
- Exact inference, junction tree algorithm
- Markov chain Monte Carlo, local search algorithms
- Variational methods
- Loopy belief propagation
- Mean field algorithms, TRW and convex upper bounds
- Combinatorial optimization
- LP relaxations, Lagrangian relaxation/dual decomposition
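As a preview of exact inference, sum-product variable elimination on a chain-structured MRF takes only a few lines; eliminating variables along a chain is the simplest instance of the junction tree algorithm. The potentials below are invented for illustration:

```python
# Chain MRF x1 - x2 - x3 with binary states; p(x) is proportional to
# prod_i phi[i][x_i] * prod_i psi[x_i][x_{i+1}].  Potentials are invented.
phi = [[1.0, 2.0], [1.0, 1.0], [3.0, 1.0]]   # unary potentials
psi = [[2.0, 1.0], [1.0, 2.0]]               # shared pairwise potential

# Forward elimination: m_new(b) = sum_a phi[i][a] * m(a) * psi[a][b].
m = [1.0, 1.0]
for i in range(2):
    m = [sum(phi[i][a] * m[a] * psi[a][b] for a in range(2)) for b in range(2)]

belief = [phi[2][b] * m[b] for b in range(2)]   # unnormalized marginal of x3
Z = sum(belief)                                  # partition function
marginal = [b / Z for b in belief]
print(marginal)
```

Each elimination step costs time quadratic in the state space, versus exponential for brute-force summation; on graphs with cycles this idea generalizes (at higher cost) to the junction tree algorithm, motivating the approximate methods listed above.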

**Learning from data** (5 lectures)
- Maximum likelihood, maximum entropy, and other formulations
- Missing data, expectation maximization (EM)
- Structured prediction, max-margin methods
- Pseudo-likelihood, approximate inference used within learning
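As a preview of EM, the classic two-coin mixture illustrates the E-step/M-step alternation: each session of coin flips comes from one of two biased coins, but the coin identity is unobserved. The data, initialization, and uniform prior over coins below are illustrative choices, not course material:

```python
from math import comb

# Each session is (#heads, #flips) from one of two coins with unknown
# biases; which coin generated each session is a latent variable.
data = [(5, 10), (9, 10), (8, 10), (4, 10), (7, 10)]

def likelihood(theta, h, n):
    """Binomial likelihood of h heads in n flips for bias theta."""
    return comb(n, h) * theta**h * (1 - theta)**(n - h)

theta_a, theta_b = 0.6, 0.5            # initial guesses
for _ in range(50):
    # E-step: posterior responsibility of coin A for each session
    # (assuming a uniform prior over the two coins, for simplicity).
    h_a = n_a = h_b = n_b = 0.0
    for h, n in data:
        la = likelihood(theta_a, h, n)
        lb = likelihood(theta_b, h, n)
        w = la / (la + lb)
        h_a += w * h
        n_a += w * n
        h_b += (1 - w) * h
        n_b += (1 - w) * n
    # M-step: re-estimate each coin's bias from its expected counts.
    theta_a, theta_b = h_a / n_a, h_b / n_b

print(theta_a, theta_b)
```

Each iteration is guaranteed not to decrease the observed-data likelihood, though EM can converge to a local optimum depending on initialization; the course treats this and the use of approximate inference inside learning in depth.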