We consider a basic problem in unsupervised learning: learning an unknown Poisson Binomial Distribution. A Poisson Binomial Distribution (PBD) over {0,1,...,n} is the distribution of a sum of n independent Bernoulli random variables whose expectations may be arbitrary and, in particular, unequal. These distributions were first studied by S. Poisson in 1837 and are a natural n-parameter generalization of the familiar Binomial Distribution. Surprisingly, prior to our work this basic learning problem was poorly understood, and the known results for it were far from optimal. We essentially settle the complexity of the learning problem for this basic class of distributions: we give a computationally efficient algorithm that properly learns a PBD to accuracy eps (in total variation distance) using O~(1/eps^2) samples, independent of n. This is nearly optimal, since any algorithm must use Omega(1/eps^2) samples. We also give positive and negative results for some extensions of this learning problem. The talk will be based on joint work with Costis Daskalakis and Rocco Servedio.
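To make the object of study concrete, here is a minimal Python sketch of a PBD: each draw is the sum of n independent Bernoulli coin flips with (possibly unequal) biases p_1,...,p_n. The parameter list `ps` below is a made-up example, not one from the work described; this only illustrates the distribution, not the learning algorithm.

```python
import random

def sample_pbd(ps, rng=random):
    # One draw from the PBD with parameters ps = [p_1, ..., p_n]:
    # the sum of n independent Bernoulli(p_i) random variables.
    return sum(1 for p in ps if rng.random() < p)

# Hypothetical example: n = 4 Bernoullis with unequal expectations.
ps = [0.1, 0.5, 0.5, 0.9]

random.seed(0)
draws = [sample_pbd(ps) for _ in range(100_000)]
mean = sum(draws) / len(draws)
# The mean of this PBD is sum(ps) = 2.0, so the empirical mean
# of many draws should be close to 2.0.
```

With all p_i equal, this reduces to the familiar Binomial(n, p); allowing the p_i to differ gives the n-parameter family the abstract refers to.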