
Machine Learning and Pattern Recognition: Schedule


[ Course Homepage | Schedule and Course Material | Mailing List ]

This page contains the schedule, slides from the lectures, lecture notes, reading lists, assignments, and web links.

I urge you to download the DjVu viewer and view the DjVu version of the documents below. They display faster, are higher quality, and generally have smaller file sizes than the PS and PDF versions.

Full-text search is provided for the entire collection of slides and papers. Click here to search.

You can have a look at the schedule and class material for the version of this course taught during the Spring 2004 semester, but be warned that the new edition is significantly different.

09/06: Introduction and basic concepts

Subjects treated: Intro, types of learning, nearest neighbor, how biology does it, linear classifiers, the perceptron learning procedure, linear regression.

Slides: [DjVu | PDF | PS]

Recommended Reading:

  • Hastie/Tibshirani/Friedman: Chapter 2
  • Refresher on random variables and probabilities by Andrew Moore: (slides 1-27) [DjVu | PDF]
  • Refresher on joint probabilities, Bayes theorem by Chris Williams: [DjVu | PDF]
  • Refresher on statistics and probabilities by Sam Roweis: [DjVu | PS]
  • If you are interested in the early history of self-organizing systems and cybernetics, have a look at this book available from the Internet Archive's Million Book Project: Self-Organizing Systems, proceedings of a 1959 conference edited by Yovits and Cameron (DjVu viewer required for full text).
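
For concreteness, here is a minimal sketch of the perceptron learning procedure mentioned above, written in Python/NumPy rather than the Lush used for the course homework; the toy dataset and epoch limit are assumptions made purely for illustration.

    # Minimal sketch of the perceptron learning procedure (illustrative only;
    # the course homework uses Lush, not Python).
    import numpy as np

    def perceptron_train(X, y, epochs=100):
        """X: (n, d) inputs, y: (n,) labels in {-1, +1}. Returns weights and bias."""
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            errors = 0
            for xi, yi in zip(X, y):
                if yi * (np.dot(w, xi) + b) <= 0:   # misclassified (or on the boundary)
                    w += yi * xi                    # move the decision boundary toward xi
                    b += yi
                    errors += 1
            if errors == 0:                         # converged: training set separated
                break
        return w, b

    # Toy linearly separable data (made up for illustration)
    X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
    y = np.array([1, 1, -1, -1])
    w, b = perceptron_train(X, y)
    print(w, b)

The update only fires on misclassified examples; that is the entire algorithm.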

09/13: Energy-Based Models, Loss Functions, Linear Machines

Subjects treated: Energy-based models, minimum-energy machines, loss functions. Linear machines: perceptron, logistic regression. Linearly parameterized classifiers: Polynomial classifiers, basis function expansion, RBFs, Kernel-based expansion.

Slides: [DjVu | PDF | PS]
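As an illustration of training a linear machine by minimizing a loss function, here is a small Python/NumPy sketch of logistic regression fit by gradient descent on the negative log-likelihood. The learning rate, step count, and label encoding are assumptions for illustration; the course's own code is in Lush.

    # Sketch: logistic regression trained by gradient descent on the
    # negative log-likelihood loss (illustrative Python, not course code).
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train_logistic(X, y, lr=0.1, steps=1000):
        """X: (n, d) inputs, y: (n,) labels in {0, 1}."""
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(steps):
            p = sigmoid(X @ w + b)            # predicted P(y = 1 | x)
            grad_w = X.T @ (p - y) / len(y)   # gradient of the mean NLL w.r.t. w
            grad_b = np.mean(p - y)
            w -= lr * grad_w
            b -= lr * grad_b
        return w, b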

09/20: Gradient-Based Learning I, Multi-Module Architectures and Back-Propagation, Regularization

Subjects treated: Multi-Module learning machines. Vector modules and switches. Multilayer neural nets. Backpropagation Learning. Intro to Model Selection, structural risk minimization, regularization.

Slides on Regularization: [DjVu | PDF | PS]

Slides on Multi-Module Back-Propagation: [DjVu | PDF | PS]

Required Reading:

Gradient-based Learning Applied to Document Recognition by LeCun, Bottou, Bengio, and Haffner; pages 1 to the first column of page 18: [DjVu | .ps.gz ]
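The following is a minimal Python/NumPy sketch of back-propagation viewed as a chain of modules (linear, tanh, linear, squared error), where each module receives the gradient of the loss with respect to its output and passes back the gradient with respect to its input. The toy data, sizes, and step size are assumptions for illustration only.

    # Sketch of back-propagation through a two-layer net written as a chain
    # of modules: linear -> tanh -> linear -> squared error (illustrative Python).
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 3))          # 4 samples, 3 inputs (toy data)
    t = rng.normal(size=(4, 2))          # 4 targets, 2 outputs

    W1 = rng.normal(scale=0.1, size=(3, 5)); b1 = np.zeros(5)
    W2 = rng.normal(scale=0.1, size=(5, 2)); b2 = np.zeros(2)

    lr = 0.05
    for step in range(200):
        # forward pass, module by module
        a1 = x @ W1 + b1        # linear module 1
        h1 = np.tanh(a1)        # tanh module
        y  = h1 @ W2 + b2       # linear module 2
        loss = 0.5 * np.mean(np.sum((y - t) ** 2, axis=1))

        # backward pass: each module maps dE/d(output) to dE/d(input)
        dy  = (y - t) / x.shape[0]
        dW2 = h1.T @ dy;  db2 = dy.sum(axis=0)
        dh1 = dy @ W2.T
        da1 = dh1 * (1.0 - h1 ** 2)     # derivative of tanh
        dW1 = x.T @ da1;  db1 = da1.sum(axis=0)

        # gradient step on all parameters
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2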

Homework Assignments: Linear Classifier: implementing the Perceptron Algorithm, MSE Classifier (linear regression), Logistic Regression. Details and datasets below:

  • Download this tar.gz archive. It contains the datasets and the homework description.
  • Decompress it with "tar xvfz hw-linear.tgz" on Unix/Linux or with WinZip on Windows.
  • The file README.txt contains the questions and instructions.
  • Most of the necessary Lush code is provided.
  • Due Date is Tuesday October 11th, before the lecture.

09/27: Gradient-Based Learning II: Special Modules and Architectures

Subjects treated: Trainers; complex topologies; special modules; Cross-entropy and KL-divergence; RBF-nets, Mixtures of Experts; Parameter space transforms; weight sharing; convolution module; TDNN; Recurrent nets.

Slides: [DjVu | PDF | PS]

Homework Assignments: Computing Jacobians:

A pen-and-paper homework to get familiar with computing Jacobians. Click on one of these links to get the text of the homework: [DjVu | PDF | PS]

Due Date is Tuesday October 18th, before the lecture.
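As an illustration of the kind of computation this homework involves (this example is not taken from the assignment itself), consider a module that computes y = tanh(Wx). Its Jacobian with respect to the input x is

    y_i = \tanh\Big(\sum_k W_{ik}\, x_k\Big),
    \qquad
    \frac{\partial y_i}{\partial x_j} = (1 - y_i^2)\, W_{ij},
    \qquad\text{i.e.}\qquad
    J_{y,x} = \mathrm{diag}(1 - y \odot y)\, W.

Chaining such Jacobians with the chain rule is exactly what the multi-module back-propagation procedure from the previous lectures does.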

10/04: Convolutional Nets, Image Recognition

Subjects treated: Convolutional Networks; Image recognition, object detection, and other applications;

Slides: talk on object recognition with convolutional nets: [DjVu | PDF]

Required Reading:

If you haven't read it already: Gradient-based Learning Applied to Document Recognition by LeCun, Bottou, Bengio, and Haffner; pages 1 to the first column of page 18: [ DjVu | .ps.gz ]

Optional Reading: Fu-Jie Huang, Yann LeCun, Leon Bottou: "Learning Methods for Generic Object Recognition with Invariance to Pose and Lighting.", Proc. CVPR 2004. [DJVU, PDF, PS.GZ]
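For intuition about the convolution module at the heart of these networks, here is a small Python/NumPy sketch of a single-channel "valid" convolution (a cross-correlation, as conv nets usually implement it); the image and kernel are made up for illustration.

    # Sketch of the core operation of a convolutional layer: a 2-D "valid"
    # cross-correlation of one input map with one kernel (illustrative Python).
    import numpy as np

    def conv2d_valid(image, kernel):
        """image: (H, W), kernel: (kh, kw). Returns an (H-kh+1, W-kw+1) feature map."""
        H, W = image.shape
        kh, kw = kernel.shape
        out = np.zeros((H - kh + 1, W - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
        return out

    # The same kernel (shared weights) is applied at every position, which is
    # what gives convolutional nets their built-in shift invariance.
    img = np.arange(36, dtype=float).reshape(6, 6)
    k = np.ones((3, 3)) / 9.0          # a simple averaging kernel
    print(conv2d_valid(img, k).shape)  # (4, 4)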

10/11: More Applications to Vision and Speech

Slides: same as last week.

10/18: Probabilistic Learning, MLE, MAP, Bayesian Learning

Subjects treated: Refresher on probability theory; Bayesian Estimation, Maximum Likelihood Estimation, Maximum A Posteriori Estimation, Negative Log-Likelihood Loss Functions.

Slides: Refresher on Probability Theory: [DjVu | PDF | PS]

Slides: Bayesian Learning: [DjVu | PDF | PS]
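In the notation usually used for these topics (a summary sketch, not a substitute for the slides), the three estimation principles differ only in how they treat the parameter theta given the data D:

    \hat{\theta}_{ML}  = \arg\max_{\theta}\, P(D \mid \theta)
    \hat{\theta}_{MAP} = \arg\max_{\theta}\, P(D \mid \theta)\, P(\theta)
    P(y \mid x, D) = \int P(y \mid x, \theta)\, P(\theta \mid D)\, d\theta
    \qquad\text{(fully Bayesian prediction)}

Maximizing the likelihood is the same as minimizing the negative log-likelihood loss -\sum_i \log P(y_i \mid x_i, \theta), which connects this lecture back to the loss functions of the earlier ones.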

Required Reading:

10/25: Learning Theory, Bagging, Boosting, VC-Dim

Subjects treated: Ensemble methods, More on Bayesian Learning, Bagging, Boosting. Learning Theory, Bounds, VC-Dimension.

Slides: Ensemble Methods: [DjVu | PDF | PS]
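Here is a small Python/NumPy sketch of bagging: train several predictors on bootstrap resamples of the training set and average their outputs. The fit and predict arguments are hypothetical placeholders for whatever base learner is used; this is illustrative only.

    # Sketch of bagging (bootstrap aggregating). Illustrative Python;
    # fit() and predict() are hypothetical stand-ins for a base learner.
    import numpy as np

    def bagged_predict(X_train, y_train, X_test, fit, predict, n_models=10, seed=0):
        """fit(X, y) -> model; predict(model, X) -> predictions (hypothetical helpers)."""
        rng = np.random.default_rng(seed)
        n = len(X_train)
        preds = []
        for _ in range(n_models):
            idx = rng.integers(0, n, size=n)      # bootstrap sample, drawn with replacement
            model = fit(X_train[idx], y_train[idx])
            preds.append(predict(model, X_test))
        return np.mean(preds, axis=0)             # averaging reduces variance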

11/01: Efficient Optimization

Subjects treated: Optimization: Convergence of gradient-based optimization and acceleration techniques. Gauss-Newton, Levenberg-Marquardt, BFGS, Conjugate Gradient.

Slides:

Required Reading: Efficient Backprop, by LeCun, Bottou, Orr, and Muller: [ DjVu | .ps.gz ]
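As a one-line summary of the step rules involved (a sketch, not the reading's derivation): plain gradient descent, Newton's method, and the Gauss-Newton approximation used for squared-error losses are

    w \leftarrow w - \eta\, \nabla E(w)
    \qquad
    w \leftarrow w - H^{-1}\, \nabla E(w), \quad H = \nabla^2 E(w)
    \qquad
    H \approx J^\top J \;\;\text{(Gauss-Newton, with } J \text{ the Jacobian of the residuals)}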

11/08: Intro to Unsupervised Learning

Subjects treated: Unsupervised Learning: Clustering: K-Means; Principal Component Analysis, Auto-Encoders. Density Estimation: Parzen Windows. Gaussian Density Estimation. Latent variables.

Unsupervised learning: [DjVu | PDF | PS]
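Here is a minimal Python/NumPy sketch of K-means (an assignment step followed by an update step); the initialization and stopping rule are arbitrary choices for illustration, and the related homework later in the schedule uses Lush, not Python.

    # Sketch of K-means clustering (illustrative Python only).
    import numpy as np

    def kmeans(X, k, iters=100, seed=0):
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]   # init from data points
        for _ in range(iters):
            # assignment step: each point joins its nearest center
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            # update step: each center moves to the mean of its assigned points
            new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                    else centers[j] for j in range(k)])
            if np.allclose(new_centers, centers):                # converged
                break
            centers = new_centers
        return centers, labels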

Homework Assignments: Neural Nets and Backpropagation:

Click on this link to get the homework: hw-backprop.tgz.

Due Date is Tuesday November 22nd, before the lecture.

11/15: More on Unsupervised Learning, EM

Note: the guest lecture is cancelled.

Subjects treated: The Expectation-Maximization (EM) algorithm. Mixtures of Gaussians.

Unsupervised learning: [DjVu | PDF | PS]
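For reference, the two steps of EM for a mixture of Gaussians can be summarized as follows (a sketch of the standard updates, with r_{ik} the responsibility of component k for point x_i):

    \text{E-step:}\quad
    r_{ik} = \frac{\pi_k\, \mathcal{N}(x_i \mid \mu_k, \Sigma_k)}
                  {\sum_j \pi_j\, \mathcal{N}(x_i \mid \mu_j, \Sigma_j)}

    \text{M-step:}\quad
    \mu_k = \frac{\sum_i r_{ik}\, x_i}{\sum_i r_{ik}}, \qquad
    \Sigma_k = \frac{\sum_i r_{ik}\,(x_i - \mu_k)(x_i - \mu_k)^\top}{\sum_i r_{ik}}, \qquad
    \pi_k = \frac{1}{N}\sum_i r_{ik}

Each iteration increases (or leaves unchanged) the data log-likelihood; K-means is the limiting case of hard assignments with shared spherical covariances.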

11/22: Modeling Sequences: Hidden Markov Models, Graph Transformer Networks

Subjects treated: Modeling distributions over sequences. Learning machines that manipulate graphs. Finite-state transducers. Graph Transformer Networks. Introduction to Hidden Markov Models (HMM).

Required Reading:

Note: the slides on Transducers and GTNs used in class are not provided because the paper above covers the material.
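To make the HMM part concrete, here is a small Python/NumPy sketch of the forward algorithm, which computes the probability of an observation sequence by summing over hidden state paths; the transition, emission, and initial probabilities below are made-up numbers for illustration.

    # Sketch of the forward algorithm for a discrete HMM (illustrative Python).
    import numpy as np

    def hmm_forward(pi, A, B, obs):
        """pi: (S,) initial state probs, A: (S, S) transitions with A[i, j] = P(j | i),
        B: (S, V) emission probs, obs: sequence of observation indices."""
        alpha = pi * B[:, obs[0]]              # joint prob of state and first observation
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]      # propagate one step, weight by emission
        return alpha.sum()                     # probability of the whole sequence

    # Toy 2-state, 2-symbol model (numbers are assumptions for illustration)
    pi = np.array([0.6, 0.4])
    A = np.array([[0.7, 0.3], [0.4, 0.6]])
    B = np.array([[0.9, 0.1], [0.2, 0.8]])
    print(hmm_forward(pi, A, B, [0, 1, 0]))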

11/29: Intro to Graphical Models

Subjects treated: Intro to graphical models, Belief Networks and Factor Graphs, Inference, Belief Propagation, Boltzmann Machines.

Suggested Reading: David Mackay's book Information Theory, Inference, and Learning Algorithms. (available for free download in PDF and DjVu).
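For reference, the two message types in sum-product belief propagation on a factor graph can be written as follows (a standard textbook summary, not taken from the lecture slides), with N(.) denoting the neighbors of a node:

    \mu_{x \to f}(x) = \prod_{g \in N(x) \setminus \{f\}} \mu_{g \to x}(x)

    \mu_{f \to x}(x) = \sum_{\mathbf{x}_{N(f) \setminus \{x\}}}
        f\big(\mathbf{x}_{N(f)}\big) \prod_{y \in N(f) \setminus \{x\}} \mu_{y \to f}(y)

The marginal of a variable is obtained by multiplying all incoming factor-to-variable messages at that node; on tree-structured graphs this inference is exact.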

Homework Assignments: K-Means and Mixture of Gaussians Model:

Click on this link to get the homework: hw-unsup.tgz.

Due Date is Friday December 16th.

12/06: Support Vector Machines, kernel methods
