Invexifying Regularization of Non-Linear Least-Squares Problems

arXiv preprint, 2021


Abstract

We consider regularization of non-convex optimization problems involving a non-linear least-squares objective. By introducing an auxiliary set of variables, we develop a novel regularization framework whose corresponding objective function is not only provably invex, but also satisfies the highly desirable Polyak–Łojasiewicz inequality for any choice of the regularization parameter. Although this framework is entirely different from classical ℓ2-regularization, we establish an interesting connection in the special case of under-determined linear least-squares: gradient descent applied to our regularized formulation converges to the same solution as the linear ridge-regression problem. Numerical experiments corroborate our theoretical results and demonstrate the method's performance in practical settings compared with standard ℓ2-regularization.
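For context, the Polyak–Łojasiewicz inequality referenced in the abstract is a standard condition; in the sketch below, g, z, μ, and g* are generic symbols rather than the paper's notation.

```latex
% Polyak–Lojasiewicz (PL) inequality: a differentiable function g with
% infimum g^* satisfies PL with constant \mu > 0 if, for all z,
\frac{1}{2}\,\lVert \nabla g(z) \rVert^2 \;\geq\; \mu \left( g(z) - g^* \right).
% Under PL, every stationary point is a global minimizer, and gradient
% descent with a suitable fixed step size converges linearly in function value.
```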
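For the under-determined linear special case, here is a minimal sketch of the ridge-regression baseline mentioned in the abstract; the symbols A, b, λ, and x are our own notation, not the paper's.

```latex
% Ridge regression (\ell_2-regularized linear least squares):
\min_{x}\; \lVert A x - b \rVert^2 + \lambda \lVert x \rVert^2, \qquad \lambda > 0,
% has the unique closed-form solution (via the push-through identity)
x_\lambda = (A^\top A + \lambda I)^{-1} A^\top b
          = A^\top (A A^\top + \lambda I)^{-1} b,
% where the second expression is the cheaper one in the under-determined
% regime, i.e., when A has fewer rows than columns.
```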
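As a small numerical illustration of that baseline (not of the paper's method), the following sketch shows gradient descent on the ridge objective recovering the closed-form solution in an under-determined problem; all names and parameter values are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 50                      # under-determined: fewer equations than unknowns
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam = 0.1                          # illustrative regularization parameter

# Closed-form ridge solution via the push-through identity,
# convenient when m < n: x = A^T (A A^T + lam I)^{-1} b.
x_closed = A.T @ np.linalg.solve(A @ A.T + lam * np.eye(m), b)

# Gradient descent on ||Ax - b||^2 + lam ||x||^2.
x = np.zeros(n)
L = np.linalg.norm(A, 2) ** 2 + lam   # the gradient is 2L-Lipschitz
step = 1.0 / (2 * L)
for _ in range(20000):
    grad = 2 * (A.T @ (A @ x - b) + lam * x)
    x -= step * grad

print(np.linalg.norm(x - x_closed))   # should be close to zero
```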