Contributions to signal analysis and processing using compressed sensing techniques

Chapter 2 contains a short introduction to the fundamentals of compressed sensing theory, which is the larger context of this thesis. We start by introducing the key concepts of sparsity and sparse representations of signals. We discuss the central problem of compressed sensing, i.e. how to accurately recover sparse signals from a small number of measurements, as well as the multiple formulations of the reconstruction problem. A large part of the chapter is devoted to some of the most important conditions necessary and/or sufficient to guarantee accurate recovery. The aim is to introduce the reader to the basic results, without the burden of detailed proofs. In addition, we also present a few of the popular reconstruction and optimization algorithms that we use throughout the thesis. Chapter 3 presents an alternative sparsity model known as analysis sparsity, which offers similar recovery ...

Cleju, Nicolae — "Gheorghe Asachi" Technical University of Iasi
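As a toy illustration of the recovery problem described in the abstract above, a greedy method such as Orthogonal Matching Pursuit (OMP) can reconstruct a sparse vector from underdetermined random measurements. This is a generic textbook sketch, not code from the thesis; the dimensions, seed, and support below are illustrative:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily estimate a k-sparse x from y = A @ x."""
    m, n = A.shape
    residual = y.copy()
    support = []
    for _ in range(k):
        # Select the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit restricted to the selected support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat

# Recover a 3-sparse vector of length 50 from 20 random Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50)) / np.sqrt(20)
x = np.zeros(50)
x[[4, 17, 33]] = [1.5, -2.0, 0.7]
y = A @ x
x_hat = omp(A, y, k=3)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

With random Gaussian measurements and few enough nonzeros, OMP typically identifies the true support and the least-squares step then recovers the coefficients almost exactly; the recovery guarantees surveyed in Chapter 2 make this precise.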


Modulation Spectrum Analysis for Noisy Electrocardiogram Signal Processing and Applications

Advances in wearable electrocardiogram (ECG) monitoring devices have allowed for new cardiovascular applications to emerge beyond diagnostics, such as stress and fatigue detection, athletic performance assessment, sleep disorder characterization, mood recognition, activity surveillance, biometrics, and fitness tracking, to name a few. Such devices, however, are prone to artifacts, particularly due to movement, thus hampering heart rate and heart rate variability measurement and posing a serious threat to cardiac monitoring applications. To address these issues, this thesis proposes the use of a spectro-temporal signal representation called “modulation spectrum”, which is shown to accurately separate cardiac and noise components from the ECG signals, thus opening doors for noise-robust ECG signal processing tools and applications. First, an innovative ECG quality index based on the modulation spectral signal representation is proposed. The representation quantifies the rate-of-change of ECG spectral components, which are shown to ...

Tobon Vallejo, Diana Patricia — INRS-EMT


Mining the ECG: Algorithms and Applications

This research focuses on the development of algorithms to extract diagnostic information from the ECG signal, which can be used to improve automatic detection systems and home monitoring solutions. In the first part of this work, a generically applicable algorithm for model selection in kernel principal component analysis is presented, which was inspired by the derivation of respiratory information from the ECG signal. This method not only solves a problem in biomedical signal processing, but more importantly offers a solution to a long-standing problem in the field of machine learning. Next, a methodology to quantify the level of contamination in a segment of ECG is proposed. This level is used to detect artifacts, and to improve the performance of different classifiers, by removing these artifacts from the training set. Furthermore, an evaluation of three different methodologies to compute the ECG-derived ...

Varon, Carolina — KU Leuven


Linear Dynamical Systems with Sparsity Constraints: Theory and Algorithms

This thesis develops new mathematical theory and presents novel recovery algorithms for discrete linear dynamical systems (LDS) with sparsity constraints on either control inputs or initial state. The recovery problems in this framework manifest as the problem of reconstructing one or more sparse signals from a set of noisy underdetermined linear measurements. The goal of our work is to design algorithms for sparse signal recovery which can exploit the underlying structure in the measurement matrix and the unknown sparse vectors, and to analyze the impact of these structures on the efficacy of the recovery. We answer three fundamental and interconnected questions on sparse signal recovery problems that arise in the context of LDS. First, what are necessary and sufficient conditions for the existence of a sparse solution? Second, given that a sparse solution exists, what are good low-complexity algorithms that ...

Joseph, Geethu — Indian Institute of Science, Bangalore


Spatio-temporal characterization of the surface electrocardiogram for catheter ablation outcome prediction in persistent atrial fibrillation

Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia encountered in clinical practice, and one of the main causes of stroke. Despite the advances in the comprehension of its mechanisms, its thorough characterization and the quantification of its effects on the human heart are still an open issue. In particular, the choice of the most appropriate therapy is frequently a hard task. Radiofrequency catheter ablation (CA) is becoming one of the most popular solutions for the treatment of the disease. Yet, very little is known about its impact on heart substrate during AF, thus leading to an inaccurate selection of positive responders to therapy and a low success rate; hence, the need for advanced signal processing tools able to quantify AF impact on heart substrate and assess the effectiveness of the CA therapy in an objective and ...

Meo, Marianna — Université Nice Sophia Antipolis


Advanced tools for ambulatory ECG and respiratory analysis

The electrocardiogram or ECG is a relatively easy-to-record signal that contains an enormous amount of potentially useful information. It is currently mostly being used for screening purposes. For example, pre-participation cardiovascular screening of young athletes has been endorsed by both scientific organisations and sporting governing bodies. A typical cardiac examination is taken in a hospital environment and lasts 10 seconds. This is often sufficient to detect major pathologies, yet this small sample of the heart’s functioning can be deceptive when used to evaluate one’s general condition. A solution for this problem is to monitor the patient outside of the hospital, over a longer period of time. By extending the analysis period, the detection rate of cardiac events can be greatly increased compared to the in-hospital cardiac exam. However, it also increases the likelihood of ...

Moeyersons, Jonathan — KU Leuven


Heart rate variability: linear and nonlinear analysis with applications in human physiology

Cardiovascular diseases are a growing problem in today’s society. The World Health Organization (WHO) reported that these diseases make up about 30% of total global deaths and that heart diseases have no geographic, gender or socioeconomic boundaries. Therefore, early detection of cardiac irregularities and correct treatment are very important. However, this requires a good physiological understanding of the cardiovascular system. The heart is stimulated electrically by the brain via the autonomic nervous system, where sympathetic and vagal pathways are always interacting and modulating heart rate. Continuous monitoring of the heart activity is obtained by means of an electrocardiogram (ECG). Studying the fluctuations of heart beat intervals over time reveals a lot of information and is called heart rate variability (HRV) analysis. A reduction of HRV has been reported in several cardiological and noncardiological diseases. Moreover, HRV also has a prognostic ...

Vandeput, Steven — KU Leuven
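The interval-based analysis described above can be made concrete with standard time-domain HRV metrics (SDNN, RMSSD, pNN50) computed on a series of RR intervals. This is a minimal sketch of the generic textbook definitions, run here on a synthetic RR series; it is not code from the thesis:

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Basic time-domain HRV metrics from a series of RR intervals (in ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)
    return {
        "mean_rr": rr.mean(),                       # mean interbeat interval (ms)
        "sdnn": rr.std(ddof=1),                     # overall variability (ms)
        "rmssd": np.sqrt(np.mean(diffs ** 2)),      # short-term (vagally mediated) variability
        "pnn50": np.mean(np.abs(diffs) > 50) * 100, # % of successive diffs > 50 ms
    }

# Example: a short synthetic RR series around 800 ms (75 bpm).
rr = [790, 815, 800, 825, 780, 810, 795, 805]
metrics = hrv_time_domain(rr)
print(metrics)
```

A reduced SDNN or RMSSD on such a series is the kind of HRV reduction the abstract associates with cardiological and noncardiological disease; clinical use of course requires much longer recordings and artifact-free R-peak detection.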


Blind Source Separation of functional dynamic MRI signals via Dictionary Learning

Magnetic Resonance Imaging (MRI) constitutes a non-invasive medical imaging technique that allows the exploration of the inner anatomy, tissues, and physiological processes of the body. Among the different MRI applications, functional Magnetic Resonance Imaging (fMRI) has slowly become an essential tool for investigating brain behavior and, nowadays, it plays a fundamental role in clinical and neurophysiological research. Due to its particular nature, specialized signal processing techniques are required in order to analyze the fMRI data properly. Among the various related techniques that have been developed over the years, the General Linear Model (GLM) is one of the most widely used approaches, and it usually appears as a default in many specialized software toolboxes for fMRI. On the other hand, Blind Source Separation (BSS) methods constitute the most common alternative to GLM, especially when no prior information regarding the brain ...

Morante, Manuel — National and Kapodistrian University of Athens


Monitoring Infants by Automatic Video Processing

This work has, as its objective, the development of non-invasive and low-cost systems for monitoring and automatically diagnosing specific neonatal diseases by means of the analysis of suitable video signals. We focus on monitoring infants potentially at risk of diseases characterized by the presence or absence of rhythmic movements of one or more body parts. Seizures and respiratory diseases are specifically considered, but the approach is general. Seizures are defined as sudden neurological and behavioural alterations. They are age-dependent phenomena and the most common sign of central nervous system dysfunction. Neonatal seizures have onset within the 28th day of life in newborns at term and within the 44th week of conceptional age in preterm infants. Their main causes are hypoxic-ischaemic encephalopathy, intracranial haemorrhage, and sepsis. Studies indicate an incidence rate of neonatal seizures of 2‰ of live births, and 11‰ for preterm ...

Cattani, Luca — University of Parma (Italy)


Compressive Sensing Based Candidate Detector and its Applications to Spectrum Sensing and Through-the-Wall Radar Imaging

Signal acquisition is a central topic in signal processing. The well-known Shannon-Nyquist theorem lies at the heart of any conventional analog-to-digital converter, stating that a signal must be sampled at a constant rate of at least twice the highest frequency present in the signal in order to be perfectly recovered. However, the Shannon-Nyquist theorem provides a worst-case rate bound for any bandlimited data. In this context, Compressive Sensing (CS) is a new framework in which data acquisition and data processing are merged. CS allows the data to be compressed while it is sampled, by exploiting the sparsity present in many common signals. In so doing, it provides an efficient way to reduce the number of measurements needed for perfect recovery of the signal. CS has exploded in recent years with thousands of technical publications and applications ...

Lagunas, Eva — Universitat Politecnica de Catalunya
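The Nyquist criterion invoked above can be demonstrated numerically: a 100 Hz tone sampled at 1 kHz is recovered at the correct frequency, while sampling at 150 Hz (below the 200 Hz Nyquist rate) folds it down to a false, lower frequency. This is a generic illustration of aliasing, not code from the thesis:

```python
import numpy as np

F_SIG = 100.0  # tone frequency (Hz); Nyquist rate is 2 * F_SIG = 200 Hz

def sample_tone(f_samp, n):
    """Sample an F_SIG-Hz sine for n samples at rate f_samp (Hz)."""
    t = np.arange(n) / f_samp
    return np.sin(2 * np.pi * F_SIG * t)

def dominant_freq(x, f_samp):
    """Frequency (Hz) of the largest-magnitude bin in the real FFT."""
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / f_samp)
    return freqs[int(np.argmax(spectrum))]

# One second of data in each case.
print(dominant_freq(sample_tone(1000.0, 1000), 1000.0))  # faithful: ~100 Hz
print(dominant_freq(sample_tone(150.0, 150), 150.0))     # aliased: |100 - 150| = ~50 Hz
```

CS sidesteps this worst-case bound for sparse signals by replacing uniform sampling with far fewer incoherent measurements, as described in the abstract.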


Robust Methods for Sensing and Reconstructing Sparse Signals

Compressed sensing (CS) is a recently introduced signal acquisition framework that goes against the traditional Nyquist sampling paradigm. CS demonstrates that a sparse, or compressible, signal can be acquired using a low rate acquisition process. Since noise is always present in practical data acquisition systems, sensing and reconstruction methods are developed assuming a Gaussian (light-tailed) model for the corrupting noise. However, when the underlying signal and/or the measurements are corrupted by impulsive noise, commonly employed linear sampling operators, coupled with Gaussian-derived reconstruction algorithms, fail to recover a close approximation of the signal. This dissertation develops robust sampling and reconstruction methods for sparse signals in the presence of impulsive noise. To achieve this objective, we make use of robust statistics theory to develop appropriate methods addressing the problem of impulsive noise in CS systems. We develop a generalized Cauchy distribution (GCD) ...

Carrillo, Rafael — University of Delaware
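The motivation for robust statistics in the abstract above can be seen in a tiny experiment: the sample mean behaves well under light-tailed (Gaussian) noise but is unreliable under heavy-tailed (Cauchy) noise, whereas the median, a classic robust estimator, remains accurate in both cases. This is a generic illustration of the light- vs heavy-tailed distinction, not the thesis's GCD-based methods:

```python
import numpy as np

rng = np.random.default_rng(1)
true_val = 5.0
n = 1001

# Light-tailed (Gaussian) vs impulsive, heavy-tailed (standard Cauchy) noise.
gauss = true_val + rng.standard_normal(n)
cauchy = true_val + rng.standard_cauchy(n)

for name, x in [("gaussian", gauss), ("cauchy", cauchy)]:
    print(f"{name}: mean error = {abs(np.mean(x) - true_val):.4f}, "
          f"median error = {abs(np.median(x) - true_val):.4f}")
```

The Cauchy distribution has no finite mean, so the sample mean does not converge to the true value, while the median does; least-squares (Gaussian-derived) reconstruction inherits exactly this fragility under impulsive measurement noise.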


Robust Estimation and Model Order Selection for Signal Processing

In this thesis, advanced robust estimation methodologies for signal processing are developed and analyzed. The developed methodologies solve problems concerning multi-sensor data, robust model selection as well as robustness for dependent data. The work has been applied to solve practical signal processing problems in different areas of biomedical and array signal processing. In particular, for univariate independent data, a robust criterion is presented to select the model order with an application to corneal-height data modeling. The proposed criterion overcomes some limitations of existing robust criteria. For real-world data, it selects the radial model order of the Zernike polynomial of the corneal topography map in accordance with clinical expectations, even if the measurement conditions for the videokeratoscopy, which is the state-of-the-art method to collect corneal-height data, are poor. For multi-sensor data, robust model order selection criteria are proposed and applied ...

Muma, Michael — Technische Universität Darmstadt


Continuous respiratory rate monitoring to detect clinical deteriorations using wearable sensors

Acutely-ill hospitalised patients are at risk of clinical deteriorations in health leading to adverse events such as cardiac arrests. Deteriorations are currently detected by manually measuring physiological parameters every 4-6 hours. Consequently, deteriorations can remain unrecognised between assessments, delaying clinical intervention. It may be possible to provide earlier detection of deteriorations by using wearable sensors for continuous physiological monitoring. Respiratory rate (RR) is not commonly monitored by wearable sensors, despite being a sensitive marker of deteriorations. This thesis presents investigations to identify an algorithm suitable for estimating RR from two signals commonly acquired by wearable sensors: the electrocardiogram (ECG) and photoplethysmogram (PPG). A suitable algorithm was then used to estimate RRs retrospectively from a physiological dataset acquired from acutely-ill patients to assess the potential utility of wearable sensors for detecting deteriorations. Existing RR algorithms were identified through a systematic ...

Charlton, Peter — King's College London
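A minimal baseline for the RR-estimation task described above is to count breaths directly on a respiratory-modulated waveform; practical ECG/PPG-based algorithms first extract such a waveform from the cardiac signal before estimating the rate. The mean-crossing counter below is a simplified, hypothetical baseline on a synthetic waveform, not one of the algorithms evaluated in the thesis:

```python
import numpy as np

def resp_rate_bpm(x, fs):
    """Estimate respiratory rate (breaths/min) as positive-going mean-crossings per minute."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    # A breath is counted where the signal crosses its mean going upward.
    crossings = np.sum((x[:-1] < 0) & (x[1:] >= 0))
    duration_min = len(x) / fs / 60.0
    return crossings / duration_min

# Synthetic respiratory waveform: 0.25 Hz (nominally 15 breaths/min), 60 s at 10 Hz.
fs = 10.0
t = np.arange(0, 60, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t + 0.1)
print(resp_rate_bpm(resp, fs))
```

The breath straddling the window edge is missed, so on a 60 s window the estimate runs about one breath/min low; longer windows shrink this fence-post error, and noise robustness is exactly what distinguishes the algorithms assessed in the thesis from a naive counter like this one.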


Self-Organization and Data Compression in Wireless Sensor Networks of Extreme Scales: Application to Environmental Monitoring, Climatology and Bioengineering

Wireless Sensor Networks (WSNs) aim for accurate data gathering and representation of one or multiple physical variables from the environment, by means of sensor reading and wireless data packet transmission to a Data Fusion Center (DFC). There is no comprehensive common set of requirements for all WSNs, as they are application dependent. Moreover, due to specific node capabilities or energy consumption constraints, several tradeoffs have to be considered during the design; in particular, the price of the sensor nodes is a determining factor. The distinction between small- and large-scale WSNs does not only refer to the quantity of sensor nodes, but also establishes the main design challenges in each case. For example, the node organization is a key issue in large-scale WSNs, where many inexpensive nodes have to properly work in a coordinated manner. Regarding the amount of ...

Chidean, Mihaela I. — Rey Juan Carlos University


Ultra low-power biomedical signal processing: an analog wavelet filter approach for pacemakers

The purpose of this thesis is to describe novel signal processing methodologies and analog integrated circuit techniques for low-power biomedical systems. Physiological signals, such as the electrocardiogram (ECG), the electroencephalogram (EEG) and the electromyogram (EMG) are mostly non-stationary. The main difficulty in dealing with biomedical signal processing is that the information of interest is often a combination of features that are well localized temporally (e.g., spikes) and others that are more diffuse (e.g., small oscillations). This requires the use of analysis methods sufficiently versatile to handle events that can be at opposite extremes in terms of their time-frequency localization. The Wavelet Transform (WT) has been extensively used in biomedical signal processing, mainly due to the versatility of the wavelet tools. The WT has been shown to be a very efficient tool for local analysis of nonstationary and fast transient signals due ...

Haddad, Sandro Augusto Pavlík — Delft University of Technology
