Variational Analysis of Sparsity Optimization
Many problems of interest admit more than one solution and seek, among these, the one that is sparsest. The objective that most directly accounts for sparsity, the $\ell_0$ metric, is usually avoided since it leads to a combinatorial optimization problem. The $\ell_0$ function is often viewed as the limit of the $\ell_p$ metrics as $p \searrow 0$. Naturally, there have been some attempts to use $\ell_p$ as an objective for small $p$, though this is a nonconvex function for $p < 1$. We propose instead a scaled and shifted Fermi--Dirac entropy with two parameters, one controlling the smoothness of the approximation and the other the steepness of the metric. Our proposed metric is a convex relaxation for which a strong duality theory holds, yielding dual methods for metrics approaching the desired $\ell_0$ function. Without smoothing, we propose a dynamically reweighted subdifferential descent method with ``exact'' line search that is finitely terminating for constraints that are well-separated. This algorithm is shown to recapture, in a special case, certain well-known ``greedy'' algorithms. Consequently, we are able to provide an explicit algorithm whose fixed point, under the appropriate assumptions, is the sparsest possible solution. The variational perspective yields general strategies for making the algorithm more robust.
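The limit alluded to above is the standard identity relating the $\ell_p$ metrics to the counting function: for $x \in \mathbb{R}^n$,
\[
\|x\|_0 \;=\; \lim_{p \searrow 0} \sum_{j=1}^{n} |x_j|^p ,
\]
so that $\sum_{j} |x_j|^p$ approaches the number of nonzero entries of $x$ as $p$ decreases, at the cost of nonconvexity whenever $p < 1$.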