Computer Science Colloquium

Applications of Random Matrices in Spectral Computations and Machine Learning

Dimitris Achlioptas
Microsoft

Friday, December 17, 2004 11:30 A.M.
Room 1302 Warren Weaver Hall
251 Mercer Street
New York, NY 10012-1185

Directions: http://cs.nyu.edu/csweb/Location/directions.html
Colloquium Information: http://cs.nyu.edu/csweb/Calendar/colloquium/index.html

Host:

Joel Spencer, spencer@cs.nyu.edu, (212) 998-3219

Abstract

We will show how carefully crafted random matrices can be used to enhance a wide variety of computational tasks, including dimensionality reduction, spectral computations, and kernel methods in machine learning. Two concrete examples are:

-- Imagine that we want to compute the top few eigenvectors of a matrix A, hoping to extract the "important" features in A. We will prove that either this is not worth doing, or that we can begin by randomly throwing away a large fraction of A's entries (a minimal code sketch of this idea appears after these examples).

-- A famous result of Johnson and Lindenstrauss asserts that any set of n points in d-dimensional Euclidean space can be embedded in k-dimensional Euclidean space, where k = O(log n), such that all pairwise distances are preserved up to an arbitrarily small relative error. We prove that to construct such an embedding it suffices to multiply the n x d matrix of points by a random d x k matrix whose entries are independently set to +1 or -1 with equal probability (see the second sketch after this list).
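To make the first example concrete, here is a minimal Python sketch of the keep-and-rescale idea behind random sparsification: keep each entry of A independently with some probability p and divide the survivors by p, so the sparsified matrix equals A in expectation. The function name, the planted-direction test matrix, and all numerical parameters below are illustrative assumptions, not the precise scheme or guarantees presented in the talk.

    import numpy as np

    def sparsify(A, p, rng=None):
        """Keep each entry of A independently with probability p and rescale
        the survivors by 1/p, so the result equals A in expectation."""
        rng = np.random.default_rng(rng)
        mask = rng.random(A.shape) < p
        return np.where(mask, A / p, 0.0)

    # Illustration: a matrix with one strong direction plus noise.
    rng = np.random.default_rng(0)
    n = 500
    u = rng.standard_normal(n)
    u /= np.linalg.norm(u)
    A = 200.0 * np.outer(u, u) + 0.5 * rng.standard_normal((n, n))

    A_sparse = sparsify(A, p=0.1, rng=1)   # roughly 90% of the entries become zero

    # The leading singular vector of the sparsified matrix still lines up
    # with the planted direction u.
    u_dense = np.linalg.svd(A)[0][:, 0]
    u_sparse = np.linalg.svd(A_sparse)[0][:, 0]
    print("dense alignment: ", abs(u_dense @ u))
    print("sparse alignment:", abs(u_sparse @ u))
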
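A similarly minimal sketch of the second example: project the n x d data matrix through a random d x k matrix of independent +1/-1 entries. The 1/sqrt(k) scaling and the parameter values below are illustrative assumptions; the distortion guarantees stated in the talk are not reproduced here.

    import numpy as np

    def random_sign_projection(X, k, rng=None):
        """Map the rows of X (n x d) to k dimensions by multiplying with a
        random d x k matrix whose entries are +1 or -1 with equal probability.
        The 1/sqrt(k) factor makes squared distances correct in expectation."""
        rng = np.random.default_rng(rng)
        d = X.shape[1]
        R = rng.choice([-1.0, 1.0], size=(d, k))
        return (X @ R) / np.sqrt(k)

    # Illustration: a pairwise distance before and after the projection.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 2000))          # n = 50 points in d = 2000 dimensions
    Y = random_sign_projection(X, k=300, rng=1)  # k = 300 target dimensions

    print("original distance: ", np.linalg.norm(X[0] - X[1]))
    print("projected distance:", np.linalg.norm(Y[0] - Y[1]))
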

The emphasis of the talk will be on Euclidean intuition. In particular, no prior knowledge of random matrices or linear algebra will be assumed.

