
CSC2515 Fall 2004  Lectures
Tentative Lecture Schedule
 Sept 14  Machine Learning:
Introduction to Machine Learning, Generalization and Capacity
(notes [ps]
[pdf])
 Sept 21 Classification 1:
KNN, linear discriminants, decision trees
(notes [ps]
[pdf])
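The nearest-neighbour classifier from this lecture can be sketched in a few lines. This is a minimal illustration on made-up 2-D data; the function name and toy points are my own, not course code.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distance to each training point
    nearest = np.argsort(dists)[:k]               # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]              # majority vote among the neighbours

# toy 2-D data: class 0 near the origin, class 1 near (5, 5)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [5., 5.], [5., 6.], [6., 5.]])
y = np.array([0, 0, 0, 1, 1, 1])
```

Note that KNN has no training step at all: all the work happens at query time, which is one of the trade-offs the lecture contrasts against parametric discriminants.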
 Sept 28  Classification 2:
probabilistic classifiers: class-conditional Gaussians,
naive Bayes, logistic regression, neural nets for classification
(notes [ps]
[pdf])
 Oct 5: Assignment 1 (Classification) posted
 Oct 5  Regression 1:
constant model, linear models, generalized additive models
(e.g. RBFs), locally weighted regression,
multilayer perceptrons/neural networks
(notes [ps]
[pdf])
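For the linear-model case from this lecture, the fit has a closed form. A minimal sketch on synthetic data (the data and coefficients here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
# synthetic targets: a linear function plus a constant offset and small noise
y = X @ np.array([2.0, -1.0]) + 3.0 + 0.01 * rng.normal(size=50)

# append a bias column so the constant model is a special case,
# then solve the least-squares problem
Xb = np.hstack([X, np.ones((50, 1))])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
# w recovers roughly [2, -1, 3]
```

Generalized additive models and RBF networks reuse exactly this machinery: replace the columns of `Xb` with nonlinear basis functions of the inputs and the fit is still linear least squares in the weights.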
 Oct 12  Objective Functions and Optimization:
error surfaces, weight space, gradient descent, stochastic gradient,
conjugate gradients, second order methods, convexity, enforcing constraints
(notes [ps]
[pdf])
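The basic gradient-descent loop from this lecture is tiny; the convex quadratic below is a toy objective of my choosing, where the minimizer can be checked by hand.

```python
import numpy as np

def gradient_descent(grad, w0, lr=0.05, steps=300):
    """Batch gradient descent: repeatedly step opposite the gradient."""
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

# convex quadratic f(w) = ||A w - b||^2, whose gradient is 2 A^T (A w - b)
A = np.array([[2., 0.], [0., 1.]])
b = np.array([4., 3.])
grad = lambda w: 2 * A.T @ (A @ w - b)
w_star = gradient_descent(grad, np.zeros(2))
# converges to the exact solution of A w = b, here w = [2, 3]
```

The stochastic variant replaces `grad(w)` with a gradient estimated from one (or a few) training cases per step; conjugate gradients and second-order methods replace the fixed step direction and size with smarter choices.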
 Oct 19: Assignment 1 due at the start of class
 Oct 19  Regression 2 and Supervised Mixtures:
credit assignment problem, neural networks, radial basis networks,
Kolmogorov's theorem,
backprop algorithm for efficiently computing gradients,
mixtures of experts, piecewise models
(notes [ps]
[pdf])
 Oct 26: Assignment 2 (Regression) posted
 Oct 26  Unsupervised Learning 1:
Trees & Clustering
k-means, hierarchical clustering (agglomerative and divisive),
maximum likelihood trees, optimal tree structure
(notes [ps]
[pdf])
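The two alternating steps of k-means from this lecture can be sketched directly. The deterministic initialisation and toy clusters below are simplifications for illustration (real k-means normally uses random restarts).

```python
import numpy as np

def kmeans(X, k, iters=20):
    centers = X[:k].copy()                        # simple deterministic initialisation
    for _ in range(iters):
        # assignment step: each point joins its nearest centre
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        # update step: each centre moves to the mean of its assigned points
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

# two well-separated toy clusters
X = np.array([[0., 0.], [0., 1.], [1., 0.], [5., 5.], [5., 6.], [6., 5.]])
centers, labels = kmeans(X, 2)
```

Each iteration can only decrease the total within-cluster squared distance, so the loop converges; agglomerative and divisive hierarchical clustering build a tree of clusterings instead of a single flat one.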
 Nov 2  Unsupervised Learning 2:
Mixture models and the EM Algorithm:
missing data, hidden variables,
Jensen's inequality, lower bound on marginal likelihood,
free energy interpretation, inference
(notes [ps]
[pdf])
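The E-step/M-step alternation for a two-component 1-D Gaussian mixture fits in a short function. This is a bare-bones sketch on synthetic data (component count, initialisation, and data are my own choices):

```python
import numpy as np

def em_gmm(x, iters=50):
    # crude initialisation: means at the data extremes, shared variance
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities = posterior over the hidden component
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from responsibility-weighted data
        n = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n
        pi = n / len(x)
    return pi, mu, var

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0., 1., 200), rng.normal(8., 1., 200)])
pi, mu, var = em_gmm(x)
```

Each iteration cannot decrease the marginal log-likelihood, which is exactly the lower-bound / free-energy argument via Jensen's inequality covered in the lecture.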
 Nov 9: Assignment 2 due
 Nov 9  Unsupervised Learning 3:
Continuous latent variable models, Factor Analysis, (Probabilistic)
PCA, Mixtures of Factor Analyzers, Independent Components Analysis
(notes [ps]
[pdf])
 Nov 16: Assignment 3 posted
 Nov 16  Time Series Models:
autoregressive/Markov models, hidden Markov models
(notes [ps]
[pdf])
 Nov 23  Capacity Control:
generalization and overfitting, No free lunch theorems,
high dimensional issues.
capacity control methods: weight decay,
early stopping, cross validation, model averaging, intro to Bayesianism
(notes [ps]
[pdf])
 Nov 30: Assignment 3 due at the start of class
 Nov 30  Meta-Learning Methods:
stacking, bagging, boosting
(notes [ps]
[pdf])
 Dec 7  Kernel Methods:
the kernel trick, support vector machines, kernel perceptrons,
sparsity, capacity control, dual problems
(notes [ps]
[pdf])
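The kernel trick from this lecture is easiest to see in the dual (kernel) perceptron: the weight vector is never formed explicitly, only kernel evaluations against training points. The RBF kernel and circle-shaped toy problem below are illustrative choices of mine, not course code.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    """Gaussian (RBF) kernel: an inner product in an implicit feature space."""
    return np.exp(-gamma * ((a - b) ** 2).sum())

def kernel_perceptron(X, y, epochs=10, gamma=1.0):
    """Dual perceptron: alpha[i] counts how often example i triggered an update."""
    alpha = np.zeros(len(X))
    K = np.array([[rbf(a, b, gamma) for b in X] for a in X])
    for _ in range(epochs):
        for i in range(len(X)):
            if y[i] * (K[i] @ (alpha * y)) <= 0:   # mistake-driven update
                alpha[i] += 1
    return alpha

def predict(alpha, X, y, x, gamma=1.0):
    s = sum(a * yi * rbf(xi, x, gamma) for a, yi, xi in zip(alpha, y, X))
    return 1 if s > 0 else -1

# toy labels no linear boundary can separate: +1 inside a circle, -1 outside
X = np.array([[0., 0.], [.1, 0.], [0., .1], [2., 0.], [0., 2.], [-2., 0.], [0., -2.]])
y = np.array([1, 1, 1, -1, -1, -1, -1])
alpha = kernel_perceptron(X, y)
```

The sparsity theme from the lecture shows up here too: only examples that caused mistakes get nonzero `alpha`, much as only support vectors get nonzero dual weights in an SVM.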
 Dec 20  projects due by email before 5pm
Send attachments or valid URL pointing to your report.
POSTSCRIPT or PDF only. DO NOT SUBMIT WORD, HTML OR OTHER FORMAT
FILES.
 Dec 20  all readings must be completed by 5pm.
email csc2515readings@cs
 Extra topics we may or may not have time for
 other kernel machines: Gaussian processes
 linear dynamical systems, Kalman filtering
 Approximate inference and learning:
sampling, variational approximations, loopy belief propagation
 Spectral Methods:
Isomap, LLE, spectral clustering
 MaxEnt models:
maximum entropy/energy-based models, iterative scaling,
products of experts, dependency nets
 Matrix Factorizations:
aspect models/LDA/plaids, non-negative matrix factorization
 Automatic Structure Learning:
sparsity priors, empirical Bayes,
automatic relevance determination (ML-II), structural EM
CSC2515  Machine Learning  www.cs.toronto.edu/~roweis/csc2515/
