Abstract

In this thesis we investigate centralized and distributed variants of sparse Bayesian learning (SBL), an effective probabilistic regression method used in machine learning. Since inference in an SBL model is not tractable in closed form, approximations are needed. We focus on the variational Bayesian approximation, as opposed to others used in the literature, for three reasons. First, it is a flexible, general framework for approximate Bayesian inference that estimates probability densities, including point estimates as a special case. Second, it has guaranteed convergence properties. Third, it is a deterministic approximation concept that is applicable even to high-dimensional problems, where non-deterministic sampling methods may be prohibitive. We resolve some inconsistencies in the literature involved ...
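The abstract only names the approach, so as a rough, hypothetical illustration of what a variational Bayesian treatment of SBL can look like, here is a minimal NumPy sketch of the standard mean-field updates for a linear model with one precision hyperparameter per weight. The fixed noise precision beta, the hyperparameters a0 and b0, and all function names are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def vb_sbl(Phi, y, beta=100.0, a0=1e-6, b0=1e-6, n_iter=50):
    """Mean-field variational Bayes for a sparse Bayesian linear model.

    Assumed model: y = Phi @ w + noise,  noise ~ N(0, (1/beta) * I),
                   w_i ~ N(0, 1/alpha_i),  alpha_i ~ Gamma(a0, b0).
    Alternates the closed-form updates of q(w) = N(mu, Sigma) and
    q(alpha_i) = Gamma(a0 + 1/2, b0 + <w_i^2>/2).
    """
    M = Phi.shape[1]
    E_alpha = np.ones(M)                    # initial <alpha_i>
    for _ in range(n_iter):
        # q(w): Gaussian with precision matrix beta*Phi'Phi + diag(<alpha>)
        Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(E_alpha))
        mu = beta * Sigma @ (Phi.T @ y)
        # q(alpha_i): Gamma; <w_i^2> = mu_i^2 + Sigma_ii under q(w)
        E_w2 = mu**2 + np.diag(Sigma)
        E_alpha = (a0 + 0.5) / (b0 + 0.5 * E_w2)
    return mu, Sigma, E_alpha
```

Components whose expected precision E_alpha[i] grows large have posterior means driven toward zero and are effectively pruned, which is where the sparsity in SBL comes from. Every update is deterministic and available in closed form, illustrating the properties the abstract cites in favor of the variational approach.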

Keywords: sparse Bayesian learning, variational Bayes, sensor networks, distributed learning, consensus

Information

Author: Buchgraber, Thomas
Institution: Graz University of Technology
Supervisors:
Publication Year: 2013
Upload Date: June 30, 2014
