Computational Mathematics and Scientific Computing Seminar
Accelerated conditional gradient methods for continuous sparse optimization problems
Speaker: Konstantin Pieper, Florida State University
Location: Warren Weaver Hall 1302
Date: Nov. 3, 2017, 10 a.m.
Synopsis:
We consider a class of sparse minimization problems over a space of measures on a continuum, with solutions consisting of a finite sum of Dirac-delta functions. Such problems arise in inverse source location problems and in the context of optimal experimental design. For the algorithmic solution we consider a conditional gradient method, which iteratively inserts Dirac-delta functions and optimizes the corresponding coefficients. Under general assumptions, a sub-linear convergence rate in the objective functional is obtained, which is sharp in most cases. To improve efficiency, one can fully resolve the finite-dimensional sub-problems occurring in each step of the method. We provide an analysis for the resulting procedure: under a structural assumption on the optimal solution, a linear convergence rate of the form C·λ^k (with λ < 1) is obtained locally. Numerical experiments confirm the theoretical findings and the practical efficiency of the method.
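To make the insert-then-reoptimize structure concrete, below is a minimal, hypothetical Python sketch (not the speaker's code) of a fully corrective conditional gradient iteration for a toy one-dimensional source-location problem: each outer step inserts a Dirac-delta at the maximizer of the dual certificate and then fully resolves the finite-dimensional coefficient sub-problem on the current support. The Gaussian forward kernel, the regularization parameter alpha, and all data are illustrative assumptions.

```python
# Toy sketch (illustrative assumptions throughout): minimize over measures mu
#     0.5 * ||K mu - y||^2 + alpha * ||mu||_TV,
# with mu represented as a finite sum of Dirac-delta functions on [0, 1] and
# (K mu)(t_j) = sum_i c_i * k(t_j, x_i) for a smooth kernel k.

import numpy as np

def kernel(t, x):
    """Gaussian forward kernel k(t, x) evaluated on all pairs (assumed model)."""
    return np.exp(-0.5 * ((t[:, None] - x[None, :]) / 0.1) ** 2)

def solve_coefficients(A, y, alpha, n_iter=500):
    """Fully resolve the finite-dimensional sub-problem on a fixed support:
       min_c 0.5*||A c - y||^2 + alpha*||c||_1, via proximal gradient (ISTA)."""
    c = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = c - A.T @ (A @ c - y) / L
        c = np.sign(z) * np.maximum(np.abs(z) - alpha / L, 0.0)  # soft threshold
    return c

def conditional_gradient(t_obs, y, alpha, n_outer=20,
                         grid=np.linspace(0.0, 1.0, 2001)):
    """Insert one Dirac-delta per outer iteration at the maximizer of the dual
       certificate, then re-optimize all coefficients (fully corrective step)."""
    support, coeffs = np.empty(0), np.empty(0)
    for _ in range(n_outer):
        residual = kernel(t_obs, support) @ coeffs - y
        certificate = -kernel(t_obs, grid).T @ residual   # p(x) = -K* residual
        j = np.argmax(np.abs(certificate))
        if np.abs(certificate[j]) <= alpha * (1 + 1e-6):
            break                                          # optimality reached
        support = np.append(support, grid[j])              # insert new Dirac
        coeffs = solve_coefficients(kernel(t_obs, support), y, alpha)
        keep = np.abs(coeffs) > 1e-10                      # prune zero weights
        support, coeffs = support[keep], coeffs[keep]
    return support, coeffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t_obs = np.linspace(0.0, 1.0, 100)
    true_x, true_c = np.array([0.3, 0.7]), np.array([1.0, -0.5])
    y = kernel(t_obs, true_x) @ true_c + 0.01 * rng.standard_normal(t_obs.size)
    x_hat, c_hat = conditional_gradient(t_obs, y, alpha=0.05)
    print("recovered support:", x_hat, "coefficients:", c_hat)
```

The fully corrective coefficient solve is what distinguishes this variant from the plain conditional gradient step with a fixed step-size rule; in the talk's setting it is this resolution of the sub-problems, together with a structural assumption on the optimal solution, that yields the local linear rate.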