
Advanced Machine Learning: Schedule


[ Course Homepage | Schedule and Course Material | Mailing List ]

This page contains the schedule, slides from the lectures, lecture notes, reading lists, assignments, and web links.

I urge you to download the DjVu viewer and view the DjVu versions of the documents below. They display faster, are of higher quality, and generally have smaller file sizes than the PS and PDF versions.

01/18: Introduction

01/26: Multi-Layer Learning

Early history of multilayer learning

Paper

02/01: Target Prop Algorithms

Papers

02/08: Unsupervised Feature Learning

Papers

  • Marc'Aurelio Ranzato: Symmetric Product of Experts.

02/15: Unsupervised Learning

Papers

02/22: Hinton Day

Talks

03/01: Graphical Models

Different types of graphical models (Yann)

  • Bayesian belief nets
  • directed graphical models
  • graphical models with loops are generally intractable
  • conditional probability tables are invertible with Bayes' rule: the directions of the arrows don't matter in principle (they do not express causality, just dependency).
  • undirected graphical models: the likelihood is a product of potential functions
  • Markov random fields: graphical models with local interactions
  • undirected graphical models with potential functions must be normalized explicitly. The partition function problem.
  • factor graphs: each potential function is explicitly represented (a slightly more general representation of graphical models)
  • logarithmic representation: the factors are additive energy functions. The likelihood is proportional to exp(-energy).
  • energy-based models: factor graphs without normalization (no partition function). They can be used when no explicit probabilities are required: only the relative values of the energies matter.
  • representing common models as factor graphs: for example, an HMM is a "comb".
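The contrast between the log (energy) representation, the partition function problem, and energy-based models can be sketched with a toy chain-structured factor graph. The three-variable model and its pairwise energies below are invented for illustration; they are not from the lecture.

```python
import itertools
import math

# Hypothetical chain factor graph x1 - x2 - x3, each variable binary.
# In the logarithmic representation each factor is an additive energy term,
# and the unnormalized likelihood of a configuration is exp(-total energy).

def pair_energy(a, b):
    # Illustrative pairwise factor: low energy (high score) when neighbors agree.
    return 0.0 if a == b else 1.0

def total_energy(x):
    x1, x2, x3 = x
    return pair_energy(x1, x2) + pair_energy(x2, x3)

def score(x):
    # Unnormalized likelihood, proportional to the probability.
    return math.exp(-total_energy(x))

# The partition function Z sums the score over ALL configurations.
# This brute-force enumeration is exponential in the number of variables,
# which is exactly the "partition function problem" above.
Z = sum(score(x) for x in itertools.product([0, 1], repeat=3))

# Normalized probability of one configuration.
p_all_zeros = score((0, 0, 0)) / Z

# An energy-based model skips Z entirely: comparing energies is enough
# to rank configurations, since only their relative values matter.
assert total_energy((0, 0, 0)) < total_energy((0, 1, 0))
```

Note that ranking configurations by energy needed no normalization at all; Z was only required to turn scores into actual probabilities.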

03/08: Independent Component Analysis, Source Separation

Papers

Links, additional info

03/15: Spring Break

03/22: Sequence Labeling

Tutorial / Review

Graph Transformer Networks: sequence labeling with energy-based factor graphs (see "Gradient-Based Learning Applied to Document Recognition", sections 4-7, from page 16 onward). (Yann)

Papers

03/29: Dynamic Graphical Models

Tutorial

04/05: no class

NO CLASS (Snowbird workshop)

04/12: Reinforcement Learning

Topics

Each group will study and explain one class of RL algorithm, with an application, as listed below. Much of the required information can be found in Sutton and Barto's book Reinforcement Learning: An Introduction. However, a number of other sources of introductory information are listed below.
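As a minimal sketch of one such class of algorithm (tabular Q-learning, covered in Sutton and Barto), the following toy is illustrative only: the five-state chain environment is invented here and is not one of the course applications.

```python
import random

# Hypothetical toy environment: a deterministic chain of states 0..4,
# actions 0 (left) and 1 (right), reward 1 only on reaching state 4.
N_STATES, N_ACTIONS = 5, 2

def step(s, a):
    s2 = max(0, s - 1) if a == 0 else min(N_STATES - 1, s + 1)
    done = (s2 == N_STATES - 1)
    return s2, (1.0 if done else 0.0), done

random.seed(0)
Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]
alpha, gamma, eps = 0.5, 0.9, 0.1  # learning rate, discount, exploration

for _ in range(500):
    s, done = 0, False
    while not done:
        # Epsilon-greedy action selection.
        if random.random() < eps:
            a = random.randrange(N_ACTIONS)
        else:
            a = max(range(N_ACTIONS), key=lambda a: Q[s][a])
        s2, r, done = step(s, a)
        # Q-learning update: bootstrap from the greedy value of the next state.
        target = r + (0.0 if done else gamma * max(Q[s2]))
        Q[s][a] += alpha * (target - Q[s][a])
        s = s2

# The learned greedy policy should move right from every non-terminal state.
policy = [max(range(N_ACTIONS), key=lambda a: Q[s][a]) for s in range(N_STATES - 1)]
```

Because the environment is deterministic, the Q-values settle near gamma**k for a state k steps from the goal, so the greedy policy heads right everywhere.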

Background Reading Material

04/19:

04/26:
