Abstract

As many machine learning and signal processing problems are fundamentally nonconvex and too expensive or difficult to convexify, my research focuses on understanding the optimization landscapes of their nonconvex formulations. With that understanding, we can develop algorithms that efficiently navigate these landscapes and converge to global optimality. The main theme of this thesis is therefore optimization, with an emphasis on nonconvex optimization and on algorithmic developments for these popular optimization problems. The thesis is conceptually divided into four parts. Part 1: Convex Optimization. In the first part, we apply convex relaxations to several popular nonconvex problems in signal processing and machine learning (e.g. the line spectral estimation problem and ...
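Convex relaxation replaces an intractable nonconvex objective with a tractable convex surrogate. A textbook instance of this idea (an illustration, not a method taken from the thesis) is relaxing the combinatorial l0 sparsity penalty to the convex l1 norm and solving the resulting LASSO problem with iterative soft-thresholding (ISTA). A minimal NumPy sketch, with all dimensions and variable names hypothetical:

```python
import numpy as np

# Hypothetical illustration of convex relaxation: recover a sparse
# vector from few linear measurements by relaxing l0 to l1 and
# running ISTA (proximal gradient descent) on the LASSO objective
#   0.5 * ||A x - b||^2 + lam * ||x||_1.

rng = np.random.default_rng(0)
m, n, k = 40, 100, 3                    # measurements, dimension, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true

lam = 0.01                              # l1 penalty weight
L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the smooth part
x = np.zeros(n)
for _ in range(500):
    g = A.T @ (A @ x - b)               # gradient of 0.5 * ||A x - b||^2
    z = x - g / L                       # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)                          # small relative recovery error
```

Although the original problem (fewest nonzeros consistent with the data) is NP-hard, the relaxed l1 problem is convex, so this simple first-order method provably reaches its global optimum.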

Keywords: alternating minimization, atomic norm, Bregman distance, convex optimization, distributed optimization, nonconvex optimization, tensor optimization


Li, Qiuwei
Colorado School of Mines
Publication Year:
Upload Date: Oct. 1, 2019

