Probabilistic Graphical Models
Spring 2012
Overview
A graphical model is a probabilistic model in which the conditional dependencies between the random variables are specified via a graph. Graphical models provide a flexible framework for modeling large collections of variables with complex interactions, as evidenced by their wide range of applications, including machine learning, computer vision, speech, and computational biology. This course will provide a comprehensive survey of learning and inference methods in graphical models.
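As a small illustration of the idea above (a sketch with made-up numbers, not course material): a three-variable Bayesian network Rain -> WetGrass, Rain -> Umbrella encodes the factorization P(R, W, U) = P(R) P(W | R) P(U | R), and inference can be done by brute-force enumeration over the joint.

```python
# Illustrative sketch: a tiny Bayesian network with invented probabilities.
# The graph Rain -> WetGrass, Rain -> Umbrella encodes the factorization
#   P(R, W, U) = P(R) * P(W | R) * P(U | R).
from itertools import product

p_rain = {True: 0.2, False: 0.8}                  # P(R)
p_wet_given_rain = {True: 0.9, False: 0.1}        # P(W=1 | R)
p_umbrella_given_rain = {True: 0.7, False: 0.05}  # P(U=1 | R)

def joint(r, w, u):
    """P(R=r, W=w, U=u), computed via the factorization the graph encodes."""
    pw = p_wet_given_rain[r] if w else 1 - p_wet_given_rain[r]
    pu = p_umbrella_given_rain[r] if u else 1 - p_umbrella_given_rain[r]
    return p_rain[r] * pw * pu

# Inference by enumeration: P(R=1 | W=1), marginalizing out U.
num = sum(joint(True, True, u) for u in (True, False))
den = sum(joint(r, True, u) for r, u in product((True, False), repeat=2))
posterior = num / den
print(round(posterior, 4))  # -> 0.6923
```

Enumeration is exponential in the number of variables; much of the course concerns inference algorithms (variable elimination, message passing, variational and Monte Carlo methods) that exploit the graph structure to do better.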

General information
Lecture: Thursday 5:00–6:50pm
Office hours: Tuesday 5–6pm and by appointment.
Location: 715 Broadway, 12th floor, Room 1204
Grading: problem sets (70%) + exam (30%). See the Problem Set policy below.
Book: Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and Nir Friedman, MIT Press (2009).
Mailing list: To subscribe to the class list, follow instructions here.
Schedule
Week  Date  Topic  Readings  Assignments 
1  Jan 26  Introduction, Bayesian networks [Slides] 
Chapters 1–3, Appendix A; How to write a spelling corrector (optional) 

2  Feb 2 
Undirected graphical models [Slides] 
Chapter 4 (except for 4.6); Introduction to Probabilistic Topic Models (optional) 

3  Feb 9  Dual decomposition and NLP applications
[Slides] (Guest lecture by Sasha Rush) 
Introduction to Dual Decomposition for Inference (sections 1.1–1.4); Dual Decomposition for NLP (optional) 

4  Feb 16 
Conditional random fields [Slides] 
Section 4.6; An Introduction to Conditional Random Fields (section 2); Original paper introducing CRFs (optional) 

5  Feb 23 
Exact inference [Slides] 
Sections 9–9.4, 9.6.1, 9.7–9.8; Chapter 10 

6  March 1 
Exact inference (continued) [Slides, Notes] 
Sections 13.1–13.3, 13.5–13.5.2, 13.7–13.9 (also relevant: readings from week 3) 

7  March 8 (no class March 15, Spring break) 
LP relaxations for MAP inference [Slides, Notes] 
Chapter 8; Introduction to Dual Decomposition for Inference (sections 1.5, 1.6) 

8  March 22 
Variational inference [Slides] 
Chapter 11 

9  March 29 
Monte Carlo methods for inference 
Chapter 12 

10  April 3, 7–9pm. Note special date, time, and location! Class will be in 719 Broadway, Room 1221 
Learning (Bayesian networks) [Slides] 
Chapters 16, 17; Section 18.1 

11  April 12  Learning (unobserved data, EM) [Slides] 
Sections 19.1, 19.2; The Expectation Maximization Algorithm: A short tutorial; Latent Dirichlet Allocation (sections A.3, A.4) 

12  April 19 
Learning (Markov networks) [Slides] 
Chapter 20 (except for 20.7); Notes on pseudolikelihood; An Introduction to Conditional Random Fields (section 4); Recent paper on approximate maximum entropy learning in MRFs (optional) 

13  April 26  Learning (structured prediction) [Slides] 


14  May 3 
Advanced topics (spectral algorithms) [Slides] 
Notes on spectral learning of hidden Markov models (optional); A Method of Moments for Mixture Models and Hidden Markov Models (optional) 

15 
May 10 
Final exam (in class) 
Prerequisites
This is a graduate-level course. Students should previously have taken one of the following classes:
These prerequisites may be waived in some cases (please email the instructor). 
Problem Set policy
I expect you to try solving each problem set on your own. However, if you get stuck on a problem, I encourage you to collaborate with other students in the class, subject to the following rules:
