Title: Agnostically Learning Decision Trees
Speaker: Parikshit Gopalan (U. Washington)

We give a query algorithm for agnostically learning decision trees with respect to the uniform distribution on inputs. Given black-box access to an arbitrary binary function $f$ on the $n$-dimensional hypercube, our algorithm finds a function that agrees with $f$ on almost (within an $\varepsilon$ fraction) as many inputs as the best size-$t$ decision tree, in time $\mathrm{poly}(n,t,1/\varepsilon)$. This is the first polynomial-time algorithm for learning decision trees in a harsh noise model. The core of our learning algorithm is a procedure that implicitly solves a convex optimization problem over the $L_1$ ball in $2^n$ dimensions using an approximate gradient projection method. No prior knowledge of learning or optimization is necessary.

This is joint work with Adam Tauman Kalai and Adam Klivans.
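
To make the optimization ingredient concrete, here is a minimal sketch of gradient projection over an $L_1$ ball: take a gradient step, then project back onto the ball (using the standard sort-and-threshold projection). This is only an illustration of the generic technique on a small explicit problem; the algorithm in the talk works implicitly in $2^n$ dimensions with approximate gradients, which this sketch does not attempt to capture. All function names here are hypothetical.

```python
import numpy as np

def project_l1(v, radius=1.0):
    """Euclidean projection of v onto the L1 ball {x : ||x||_1 <= radius}.

    Uses the standard sort-and-threshold method: find the soft-threshold
    theta so that sum(max(|v_i| - theta, 0)) equals the radius.
    """
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]              # magnitudes, descending
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - radius))[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def projected_gradient(grad, x0, radius, steps=200, lr=0.1):
    """Minimize a convex function over the L1 ball via projected gradient descent."""
    x = project_l1(x0, radius)
    for _ in range(steps):
        x = project_l1(x - lr * grad(x), radius)
    return x

# Toy usage: minimize ||x - c||^2 over the unit L1 ball, with c outside the ball.
c = np.array([3.0, 0.0])
x_star = projected_gradient(lambda x: 2 * (x - c), np.zeros(2), radius=1.0)
```

In this toy instance the minimizer is simply the projection of $c$ onto the ball, i.e. $(1, 0)$, so the two routines can be checked against each other.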