Robust Algorithms for Linear and Nonlinear Regression via Sparse Modeling Methods: Theory, Algorithms and Applications to Image Denoising
The task of robust regression is of particular importance in signal processing, statistics and machine learning. Ordinary estimators, such as the Least Squares (LS) estimator, fail to achieve sufficiently good performance in the presence of outliers. Although the problem was first addressed decades ago and several methods have since been established, it has recently attracted renewed attention in the context of sparse modeling and sparse optimization techniques. The latter is the line followed in the current dissertation. The reported research led to the development of a novel approach in the context of greedy algorithms. The model adopts the decomposition of the noise into two parts: a) the inlier noise and b) the outliers, which are explicitly modeled by employing sparse modeling arguments. Based on this rationale and inspired by the popular Orthogonal Matching Pursuit (OMP) algorithm, two novel efficient greedy algorithms are established, one for the linear and one for the nonlinear robust regression task. The proposed algorithm for the linear task, i.e., the Greedy Algorithm for Robust Denoising (GARD), alternates between an LS optimization step and an OMP selection step that identifies the outliers. The method is compared against state-of-the-art methods through extensive simulations and the results demonstrate that: a) it exhibits tolerance in the presence of outliers, i.e., robustness, b) it attains a very low approximation error and c) it has relatively low computational requirements. Moreover, due to the simplicity of the method, a number of related theoretical properties are derived. Initially, the convergence of the method in a finite number of iteration steps is established. Next, the focus of the theoretical analysis turns to the identification of the outliers.
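To make the alternating structure of GARD concrete, the following Python sketch illustrates one plausible reading of the scheme: an LS fit is alternated with an OMP-style selection step that flags the residual entry of largest magnitude as an outlier. The function name, the stopping threshold `eps`, and the cap `max_outliers` are illustrative assumptions, not the dissertation's exact formulation.

```python
import numpy as np

def gard(X, y, eps=1e-8, max_outliers=None):
    """Illustrative sketch of GARD: alternate a least-squares fit with
    an OMP-like selection of the largest residual entry as an outlier.
    The stopping rule (eps, max_outliers) is an assumption."""
    n, m = X.shape
    if max_outliers is None:
        max_outliers = n - m - 1
    A = X                 # active matrix: regressors plus selected e_j columns
    support = []          # sample indices flagged as outliers
    while True:
        z, *_ = np.linalg.lstsq(A, y, rcond=None)   # LS step
        r = y - A @ z
        if np.linalg.norm(r) <= eps or len(support) >= max_outliers:
            break
        j = int(np.argmax(np.abs(r)))               # OMP-like selection step
        A = np.hstack([A, np.eye(n)[:, [j]]])       # give sample j a free outlier variable
        support.append(j)
    return z[:m], z[m:], support   # coefficients, estimated outliers, support
```

Appending the standard-basis column e_j effectively assigns sample j its own free outlier variable, so the subsequent LS fit is no longer influenced by that measurement.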
The case where only outliers are present has been studied separately; this is mainly due to the following reasons: a) the simplification of technically demanding algebraic manipulations and b) the "articulation" of the method's interesting geometric properties. In particular, a bound based on the Restricted Isometry Property (RIP) constant guarantees that the recovery of the signal via GARD is exact (zero error). Finally, for the case where outliers and inlier noise coexist, and by assuming that the inlier noise vector is bounded, a similar condition that guarantees the recovery of the support of the sparse outlier vector is derived. If such a condition is satisfied, then it is shown that the approximation error is bounded, and thus the denoising estimator is stable. For the robust nonlinear regression task, it is assumed that the unknown nonlinear function belongs to a Reproducing Kernel Hilbert Space (RKHS). Due to the existence of outliers, common techniques such as Kernel Ridge Regression (KRR) or Support Vector Regression (SVR) turn out to be inadequate. Based on the aforementioned noise decomposition, sparse modeling arguments are employed so that the outliers are estimated according to the greedy approach. The proposed robust scheme, i.e., the Kernel Greedy Algorithm for Robust Denoising (KGARD), alternates between a KRR task and an OMP-like selection step. Theoretical results regarding the identification of the outliers are provided. Moreover, KGARD is compared against other state-of-the-art methods via extensive simulations, where its enhanced performance is demonstrated. Finally, the proposed robust estimation framework is applied to the task of image denoising, where the advantages of the proposed method are unveiled. The experiments verify that KGARD improves the denoising process significantly when outliers are present.
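In the same spirit, the nonlinear scheme can be sketched by replacing the LS step with a kernel ridge regression solve over an augmented system. The RBF kernel choice, the penalty lam * aᵀKa on the kernel coefficients (the RKHS norm of the estimate), and the stopping rule are all assumptions made for illustration; the helper names are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix via broadcasting
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kgard(X, y, lam=1.0, sigma=1.0, eps=1e-6, max_outliers=None):
    """Illustrative sketch of KGARD: alternate a kernel ridge regression
    solve with an OMP-like selection of the largest residual entry as an
    outlier. Kernel, penalty and stopping rule are assumptions."""
    n = X.shape[0]
    if max_outliers is None:
        max_outliers = n // 4
    K = rbf_kernel(X, X, sigma)
    E = np.empty((n, 0))          # columns e_j for the selected outliers
    support = []
    while True:
        A = np.hstack([K, E])
        R = np.zeros((A.shape[1], A.shape[1]))
        R[:n, :n] = lam * K       # penalize only the kernel part: lam * a^T K a
        z = np.linalg.solve(A.T @ A + R, A.T @ y)   # regularized KRR step
        r = y - A @ z
        if np.linalg.norm(r) <= eps or len(support) >= max_outliers:
            break
        j = int(np.argmax(np.abs(r)))               # OMP-like selection step
        E = np.hstack([E, np.eye(n)[:, [j]]])
        support.append(j)
    return z[:n], z[n:], support   # kernel coefficients, outlier values, support
```

With `E` empty, the first iteration reduces to standard KRR, i.e., solving (K + lam I) a = y; the augmented columns then let the fit absorb the flagged samples exactly, just as in the linear case.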
