Usual coordinates:
Friday, 2:00PM
Room 1314
Warren Weaver Hall
251 Mercer Street
Fall 2014 Schedule
Friday September 12
2:00PM
WWH 1314
Aravindan Vijayaraghavan (Courant) and Igor Shinkar (Courant)
Two short presentations
Aravindan Vijayaraghavan:
Title: Integrality of Stable Instances of Graph Partitioning problems
Abstract: We investigate the notion of stability that was proposed by Bilu and Linial
for studying practically interesting instances of graph partitioning and clustering problems.
Such instances have stable optimal solutions that do not change upon small perturbations of the
instance (multiplicative perturbations to the edge weights up to a factor of c). We obtain exact
polynomial-time algorithms for stable instances of Max Cut and Minimum Multiway Cut.
We show that c-stable instances of Max Cut can be solved efficiently for c = O(\sqrt{\log n} \log\log n).
This represents a significant improvement over previous results, which required the instances to be
c-stable for c = O(\sqrt{n}). We also give an algorithm for Multiway Cut instances that are 4-stable.
Our algorithms are based on a structural result showing that certain LP or SDP relaxations of
these problems become integral under sufficient stability.
Based on joint work with Konstantin Makarychev and Yury Makarychev.
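To make the stability notion concrete, here is a minimal brute-force sketch (illustrative only, not the algorithm from the talk) that checks c-stability of a tiny weighted Max Cut instance. It uses the simple observation that the worst c-perturbation against the optimal cut S, relative to a competitor cut T, multiplies by c exactly the edges cut by T but not by S.

    # Brute-force check of Bilu-Linial c-stability for Max Cut on a tiny graph.
    # A c-perturbation multiplies each edge weight by a factor in [1, c]; the
    # instance is c-stable if the optimal cut stays strictly optimal under
    # every such perturbation. Exponential time in |V|; illustration only.
    from itertools import combinations

    def cut_edges(S, edges):
        return {e for e in edges if (e[0] in S) != (e[1] in S)}

    def is_c_stable(vertices, weights, c):
        edges = list(weights)
        v0, rest = vertices[0], vertices[1:]
        # Enumerate cuts as subsets containing v0 (avoids double counting).
        cuts = [frozenset({v0}) | frozenset(sub)
                for r in range(len(rest) + 1)
                for sub in combinations(rest, r)]
        best = max(cuts, key=lambda S: sum(weights[e] for e in cut_edges(S, edges)))
        dS = cut_edges(best, edges)
        for T in cuts:
            dT = cut_edges(T, edges)
            if dT == dS:
                continue
            # Worst case for S: edges cut only by T get weight c*w, the rest stay.
            if sum(weights[e] for e in dS - dT) <= c * sum(weights[e] for e in dT - dS):
                return False
        return True

    w = {(0, 1): 3.0, (1, 2): 2.0, (0, 2): 1.0}
    print(is_c_stable([0, 1, 2], w, c=1.4))  # True
    print(is_c_stable([0, 1, 2], w, c=2.0))  # False: boosting (0,2) creates a tie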
-----------------------
Igor Shinkar:
Title: Bi-Lipschitz Bijection between the Boolean Cube and the Hamming Ball
Abstract:
We construct a bi-Lipschitz bijection from the Boolean cube to the Hamming ball of equal volume. More precisely, we show that
for all even n there exists an explicit bijection f from the n-dimensional Boolean cube to the Hamming ball of equal volume
embedded in the (n+1)-dimensional Boolean cube, such that for all x and y it holds that
distance(x,y)/5 <= distance(f(x),f(y)) <= 4*distance(x,y),
where distance(.,.) denotes the Hamming distance. In particular, this implies that the Hamming ball is
bi-Lipschitz transitive.
This result gives a strong negative answer to an open problem of Lovett and Viola (CC 2012), who raised the question in the
context of sampling distributions in low-level complexity classes. The conceptual implication is that the problem of proving
lower bounds in the context of sampling distributions will require some new ideas beyond the sensitivity-based structural
results of Boppana (IPL 97).
Joint work with Itai Benjamini and Gil Cohen.
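To see what is being measured, the sketch below computes the bi-Lipschitz distortion of a bijection between the cube {0,1}^n and a Hamming ball of equal volume for a tiny n. The naive lexicographic pairing used here is a stand-in (the explicit map f from the talk is more involved) and typically has far worse distortion than the constants 4 and 5 above.

    # Measure the distortion of a bijection from the Boolean cube {0,1}^n to
    # the Hamming ball of radius n/2 in {0,1}^(n+1), which has equal volume
    # when n is even. The pairing below is naive and purely illustrative.
    from itertools import combinations, product

    n = 4  # even, and tiny: the pairwise check is exponential in n

    cube = sorted(product((0, 1), repeat=n))
    ball = sorted(p for p in product((0, 1), repeat=n + 1) if sum(p) <= n // 2)
    assert len(cube) == len(ball)  # 2^n points each

    f = dict(zip(cube, ball))  # lexicographic pairing, not the talk's map

    def hamming(x, y):
        return sum(a != b for a, b in zip(x, y))

    ratios = [hamming(f[x], f[y]) / hamming(x, y) for x, y in combinations(cube, 2)]
    print("expansion:", max(ratios), "contraction:", 1 / min(ratios))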
Friday September 19
2:30PM
WWH 412
(Note the unusual time and location)
Pranjal Awasthi (Princeton)
Learning Halfspaces with Noise
Abstract: We study the problem of learning halfspaces in the malicious noise model of Valiant.
In this model, an adversary can corrupt an η fraction of both the label part and the feature
part of an example. We design a polynomial-time algorithm for learning halfspaces in R^d under
the uniform distribution with near-optimal noise tolerance.
Our results also imply the first active learning algorithm for learning halfspaces that can handle malicious noise.
Joint work with Nina Balcan and Phil Long.
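For readers unfamiliar with the model, here is a hedged sketch of how a sample arises under malicious noise: clean examples are drawn uniformly and labeled by an unknown halfspace, and with probability eta the adversary replaces the entire example, features and label alike. The adversary below is a crude placeholder; a real adversary may corrupt examples arbitrarily and adaptively.

    # Sketch of Valiant's malicious noise model for learning halfspaces.
    # With probability eta the adversary substitutes an arbitrary example;
    # otherwise the example is clean: x uniform on the sphere, y = sign(<w,x>).
    import numpy as np

    rng = np.random.default_rng(0)
    d, eta = 10, 0.05
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)  # unknown target halfspace

    def draw_example():
        if rng.random() < eta:
            # Placeholder adversary: mislabel a point far from the boundary.
            x = w + 0.1 * rng.standard_normal(d)
            return x / np.linalg.norm(x), -1
        x = rng.standard_normal(d)
        x /= np.linalg.norm(x)  # uniform on the unit sphere
        return x, 1 if w @ x >= 0 else -1

    sample = [draw_example() for _ in range(1000)]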
Friday September 26
2:00PM
WWH 1314
Omri Weinstein (Princeton)
Approximating the best Nash Equilibrium in n^{o(log n)}-time breaks the Exponential Time Hypothesis
Abstract: The celebrated PPAD hardness result for finding an exact Nash equilibrium in a two-player game
initiated a quest for finding *approximate* Nash equilibria efficiently, and is one of the major
open questions in algorithmic game theory.
We study the computational complexity of finding an \eps-approximate Nash equilibrium with good social
welfare. Hazan and Krauthgamer, and subsequent improvements, showed that finding an \eps-approximate
Nash equilibrium with good social welfare in a two-player game, and many variants of this problem, is at
least as hard as finding a planted clique of size O(log n) in the random graph G(n,1/2).
We show that any polynomial-time algorithm that finds an \eps-approximate Nash equilibrium with good
social welfare refutes the (worst-case) Exponential Time Hypothesis of Impagliazzo and Paturi. Specifically,
it would imply a 2^{O(\sqrt{n})}-time algorithm for SAT.
Our lower bound matches the quasi-polynomial time algorithm by Lipton, Markakis and Mehta for solving
the problem. Our key tool is a reduction from the PCP machinery to finding Nash equilibrium via free games,
the framework introduced in the recent work by Aaronson, Impagliazzo and Moshkovitz. Techniques developed
in the process may be useful for replacing planted clique hardness with ETH-hardness in other applications.
Joint work with Mark Braverman and Young Kun Ko.
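The object being searched for is easy to state in code. The sketch below (with hypothetical payoff matrices) checks whether a pair of mixed strategies is an \eps-approximate Nash equilibrium of a bimatrix game and reports its social welfare; the hardness result concerns finding such a pair whose welfare is close to the best achievable by any equilibrium.

    # Check the eps-approximate Nash condition in a two-player game (R, C):
    # neither player can gain more than eps by deviating to a pure strategy.
    import numpy as np

    def is_eps_nash(R, C, x, y, eps):
        # x, y: mixed strategies (probability vectors) of row/column player.
        return (max(R @ y) <= x @ R @ y + eps and
                max(x @ C) <= x @ C @ y + eps)

    def social_welfare(R, C, x, y):
        return x @ (R + C) @ y

    # Hypothetical 2x2 coordination game with a good and a bad equilibrium.
    R = C = np.array([[2.0, 0.0], [0.0, 1.0]])
    x = y = np.array([0.0, 1.0])              # the low-welfare equilibrium
    print(is_eps_nash(R, C, x, y, eps=0.0))   # True
    print(social_welfare(R, C, x, y))         # 2.0, versus 4.0 at the other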
Thursday October 2
2:00PM
WWH 1314
(Note the unusual day)
Anindya De (Rutgers)
Central limit theorem for Gaussian chaos and deterministic counting for polynomial threshold functions
Abstract: It is a well-known fact that linear combinations of Gaussians are Gaussians.
What about polynomial combinations? In this talk, we will see a new central limit theorem
for polynomials of Gaussians. The techniques will draw from the so-called "Stein's method"
in probability theory and Malliavin calculus. As the main application, we will give the first
deterministic polynomial-time algorithm for approximate counting of any constant-degree PTF with
subconstant error. This theorem will involve a new decomposition procedure for polynomials which
might be of interest in other applications as well.
Joint work with Rocco Servedio.
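A quick numerical illustration of both points (a hedged sketch with an arbitrary polynomial, not the deterministic algorithm from the talk): randomized approximate counting for a PTF sign(p(x)) is just Monte Carlo estimation of Pr[p(g) >= 0] over Gaussian inputs, and the excess kurtosis of p(g) shows that polynomials of Gaussians need not be Gaussian.

    # Monte Carlo baseline for PTF counting: estimate Pr[p(g) >= 0] for a
    # degree-2 polynomial p of i.i.d. Gaussians. The talk derandomizes this
    # task; sampling here is only the point of comparison.
    import numpy as np

    rng = np.random.default_rng(1)
    g = rng.standard_normal((10**6, 2))
    p = g[:, 0] * g[:, 1] + 0.5 * g[:, 0]**2 - 0.25  # arbitrary degree-2 poly

    print("Pr[p(g) >= 0] ~", np.mean(p >= 0))
    # A Gaussian has zero excess kurtosis; p(g) does not.
    z = (p - p.mean()) / p.std()
    print("excess kurtosis ~", np.mean(z**4) - 3)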
Friday October 10
2:00PM
WWH 1314
Mark Braverman (Princeton)
Small value parallel repetition for general games
Abstract: We prove a parallel repetition theorem for general games
with value tending to 0. Previously, Dinur and Steurer proved such a
theorem for the special case of projection games. Our proof is
elementary in that it only employs basic information theory and
probability theory. It also extends to the high-value regime
(value close to 1) and provides alternate proofs for the parallel
repetition theorems of Holenstein and Rao for general and projection
games respectively. Joint work with Ankit Garg.
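For orientation, here is the standard setup in symbols (a paraphrase, not taken verbatim from the paper). In the n-fold repeated game G^n the referee draws n independent question pairs and the players must win every coordinate, so val(G^n) >= val(G)^n always holds. Parallel repetition theorems bound the converse direction; Holenstein's theorem, for instance, has the shape

    val(G) <= 1 - \delta  ==>  val(G^n) <= (1 - \delta^3)^{\Omega(n / \log s)},

where s is the size of the answer alphabet. The small-value regime of this talk asks how fast val(G^n) decays when val(G) is already close to 0.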
Friday October 17
2:00PM
WWH 805
(Note the unusual location)
Zeev Dvir (Princeton)
Private Information Retrieval with 2 Servers and sub-polynomial communication
Abstract: A 2-server Private Information Retrieval (PIR) scheme allows a user to retrieve
the i'th bit of an n-bit database replicated among two servers (which do not communicate)
while not revealing any information about i to either server. The privacy of the user is
information theoretic and does not rely on any cryptographic assumptions. In this work we
construct a new 2-server PIR scheme with total communication cost sub-polynomial in n.
This improves over the currently known 2-server protocols which require n^{1/3} communication
and matches the communication cost of known 3-server PIR schemes. Our improvement comes from
reducing the number of servers in existing protocols, based on Matching Vector Codes,
from 3 or 4 servers to 2. This is achieved by viewing these protocols in an algebraic
way (using polynomial interpolation) and extending them using partial derivatives.
Joint work with Sivakanth Gopi (Princeton).
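To make the privacy requirement concrete, here is the classic textbook 2-server scheme with O(n) communication (a warm-up baseline, very far from the sub-polynomial protocol of the talk): each server sees only a uniformly random subset of indices, and hence learns nothing about i, yet the XOR of the two answers recovers the i'th bit.

    # Textbook 2-server PIR with O(n) communication (baseline only; the
    # talk's scheme is sub-polynomial). Each query is a uniformly random
    # subset of [n], so neither server alone learns anything about i.
    import secrets

    def pir_query(n, i):
        S1 = {j for j in range(n) if secrets.randbits(1)}
        S2 = S1 ^ {i}  # symmetric difference; also uniformly distributed
        return S1, S2

    def pir_answer(db, S):
        ans = 0
        for j in S:
            ans ^= db[j]  # XOR of the requested bits
        return ans

    db = [1, 0, 1, 1, 0, 0, 1, 0]  # replicated on both servers
    i = 5
    S1, S2 = pir_query(len(db), i)
    print(pir_answer(db, S1) ^ pir_answer(db, S2) == db[i])  # True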
Friday October 24
4:00PM
WWH 905
(Note the unusual time and location)
Abstract: I will give an overview of recent results on learning structured univariate
probability distributions. These results follow from a general approach to
density estimation based on piecewise polynomial approximations. The
key tool that enables this approach is a computationally efficient
algorithm for learning probability distributions that are well
approximated by piecewise polynomial density functions.
(Based in part on joint work with S. Chan, R. Servedio and X. Sun.)
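The simplest instance of the approach is degree-0 pieces on a fixed grid, i.e. a histogram; the hedged sketch below fits one (the approach described above uses higher-degree pieces and adaptively chosen intervals, which is where the stronger guarantees come from).

    # Degree-0 piecewise polynomial density estimation on [0, 1): partition
    # into k intervals and fit a constant density on each. Illustration of
    # the simplest case only, not the algorithm from the talk.
    import numpy as np

    rng = np.random.default_rng(2)
    samples = rng.beta(2, 5, size=5000)  # an unknown structured density

    k = 20
    counts, edges = np.histogram(samples, bins=k, range=(0.0, 1.0))
    density = counts / (len(samples) * (1.0 / k))  # constant per piece

    def estimate(x):
        return density[min(int(x * k), k - 1)]

    print(estimate(0.2))  # estimated density near the Beta(2,5) mode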
Friday November 7
2:00PM
WWH 1314
Ran Raz (Weizmann)
Exponential Separation of Information and Communication
Abstract: In profoundly influential works, Shannon and Huffman showed that if Alice
wants to send a message X to Bob, it suffices for her to send roughly H(X) bits
(in expectation), where H denotes Shannon's entropy function. In other words, the
message X can be compressed to roughly H(X) bits, the information content of the message.
Can one prove similar results in the interactive setting, where Alice and Bob engage in
an interactive communication protocol?
We show the first gap between communication complexity and information complexity, by giving
an explicit example of a partial Boolean function with information complexity O(k) and
distributional communication complexity at least 2^k. This shows that a communication protocol
cannot always be compressed to its internal information, answering (the standard formulation of)
the above question in the negative. By a result of Braverman, our example gives the largest possible gap.
By a result of Braverman and Rao, our example gives the first gap between communication complexity and
amortized communication complexity, implying that strong direct sum does not hold for distributional communication complexity.
Joint work with Anat Ganor and Gillat Kol.
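Concretely, the one-way statement referenced above is the source coding theorem, which Huffman coding achieves:

    H(X) <= E[|code(X)|] < H(X) + 1,

so a single message can be compressed to its information content up to one bit. The result of this talk shows that no analogous compression is possible in the interactive setting: internal information can be exponentially smaller than communication.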
Friday November 14
2:00PM
WWH 1314
Anand Louis (Princeton)
Partitioning Graphs and Hypergraphs
Abstract: Graph and hypergraph partitioning problems are a central topic of research
in the study of algorithms and complexity theory, with connections to sampling algorithms,
mixing times of Markov chains, and metric embeddings, among others. There has been a lot of
work studying the eigenvalues of graphs, most of which has focused on the second
largest eigenvalue of the adjacency matrix of the graph. However, very little is known
about the other eigenvalues of the graph. A spectral theory for hypergraphs has been
explored in the literature only briefly, with limited success.
I will define a Markov Operator for Hypergraphs which generalizes the Heat Kernel for
graphs (i.e. the Random Walk operator). Based on this, I will show generalizations of
many known results for graphs to hypergraphs, and present some new results relating
higher eigenvalues of graphs/hypergraphs to higher order expansion quantities.
Many of the results were obtained in joint work with Konstantin Makarychev, Yury Makarychev,
Prasad Raghavendra, Prasad Tetali, and Santosh Vempala.
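For the graph case, the operator in question and its spectrum are easy to compute directly. A minimal sketch (with a hypothetical small graph, chosen to have an obvious bottleneck):

    # Random walk operator W = D^{-1} A of a graph. The second-largest
    # eigenvalue reflects expansion/mixing; the talk relates the higher
    # eigenvalues, and a hypergraph analogue of W, to higher-order expansion.
    import numpy as np

    # Hypothetical graph: two triangles joined by a single bridge edge.
    edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
    n = 6
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0
    W = A / A.sum(axis=1, keepdims=True)  # D^{-1} A, row-stochastic

    eigs = sorted(np.linalg.eigvals(W).real, reverse=True)
    print(eigs[:3])  # 1.0 first; the bridge keeps the second eigenvalue large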