Abstract (truncated to 115 words)

Due to a variety of potential barriers to sample acquisition, many of the datasets encountered in important classification applications, ranging from tumor identification to facial recognition, are characterized by small samples of high-dimensional data. In such situations, linear classifiers are popular as they have less risk of overfitting while being faster and more interpretable than non-linear classifiers. They are also easier to understand and implement for the inexperienced practitioner. In this dissertation, several gaps in the literature regarding the analysis and design of linear classifiers for high-dimensional data are addressed using tools from the field of asymptotic Random Matrix Theory (RMT), which facilitate the derivation of limits of relevant quantities or distributions, such as the ...
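The regime the abstract describes, far fewer samples than features, can be reproduced with a small synthetic experiment. The sketch below is illustrative only and is not taken from the dissertation; it uses scikit-learn's LinearDiscriminantAnalysis with the 'lsqr' solver and Ledoit-Wolf shrinkage, a commonly used regularized variant of LDA when the number of features exceeds the number of samples.

    # Illustrative sketch (not from the dissertation): a regularized linear
    # discriminant classifier in the small-sample, high-dimensional regime.
    from sklearn.datasets import make_classification
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import train_test_split

    # Far fewer samples than features (n = 80, p = 500).
    X, y = make_classification(
        n_samples=80, n_features=500, n_informative=20,
        n_classes=2, random_state=0,
    )
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0,
    )

    # Plain LDA is ill-posed when p > n (the sample covariance is singular),
    # so the 'lsqr' solver with automatic (Ledoit-Wolf) shrinkage is used.
    clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
    clf.fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))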

Keywords: linear discriminant analysis; random matrix theory; high-dimensional data; small sample; classification

Information

Author
Niyazi, Lama
Institution
King Abdullah University of Science and Technology
Supervisors
Publication Year
2023
Upload Date
June 26, 2024
