Spatio-temporal characterization of the surface electrocardiogram for catheter ablation outcome prediction in persistent atrial fibrillation

Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia encountered in clinical practice and one of the main causes of stroke. Despite advances in the understanding of its mechanisms, its thorough characterization and the quantification of its effects on the human heart are still open issues. In particular, the choice of the most appropriate therapy is frequently a hard task. Radiofrequency catheter ablation (CA) is becoming one of the most popular treatments for the disease. Yet very little is known about its impact on the heart substrate during AF, leading to an inaccurate selection of positive responders to therapy and a low success rate; hence the need for advanced signal processing tools able to quantify the impact of AF on the heart substrate and to assess the effectiveness of CA therapy in an objective and ...

Meo, Marianna — Université Nice Sophia Antipolis


Contributions to signal analysis and processing using compressed sensing techniques

Chapter 2 contains a short introduction to the fundamentals of compressed sensing theory, which is the larger context of this thesis. We start by introducing the key concepts of sparsity and sparse representations of signals. We discuss the central problem of compressed sensing, i.e. how to adequately recover sparse signals from a small number of measurements, as well as the multiple formulations of the reconstruction problem. A large part of the chapter is devoted to some of the most important necessary and/or sufficient conditions that guarantee accurate recovery. The aim is to introduce the reader to the basic results without the burden of detailed proofs. In addition, we also present a few of the popular reconstruction and optimization algorithms that we use throughout the thesis. Chapter 3 presents an alternative sparsity model known as analysis sparsity, which offers similar recovery ...
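As a brief illustration of the reconstruction problem described above, the following minimal Python sketch recovers a sparse vector from random Gaussian measurements via the iterative soft-thresholding algorithm (ISTA), one simple proximal method in this family; the problem sizes, the sensing matrix and the regularization weight are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumed, not from the thesis):
n, m, k = 256, 80, 8          # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
y = A @ x_true                                  # compressed measurements

# ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1
lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1/L, L = Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - y)
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

Greedy alternatives such as orthogonal matching pursuit expose the same interface: measurements and sensing matrix in, sparse estimate out.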

Cleju, Nicolae — "Gheorghe Asachi" Technical University of Iasi


Mining the ECG: Algorithms and Applications

This research focuses on the development of algorithms to extract diagnostic information from the ECG signal, which can be used to improve automatic detection systems and home monitoring solutions. In the first part of this work, a generically applicable algorithm for model selection in kernel principal component analysis is presented, inspired by the derivation of respiratory information from the ECG signal. This method not only solves a problem in biomedical signal processing, but more importantly offers a solution to a long-standing problem in the field of machine learning. Next, a methodology to quantify the level of contamination in a segment of ECG is proposed. This level is used to detect artifacts and to improve the performance of different classifiers by removing these artifacts from the training set. Furthermore, an evaluation of three different methodologies to compute the ECG-derived ...
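For readers unfamiliar with kernel principal component analysis, the minimal sketch below computes the centered kernel eigen-spectrum on which model selection (choosing the kernel width and the number of components) operates; the RBF kernel, its width and the toy data are illustrative assumptions, not the selection criterion proposed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))          # toy data (assumed)

# RBF (Gaussian) kernel matrix; sigma is an illustrative choice
sigma = 1.0
sq = np.sum(X**2, axis=1)
K = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma**2))

# Center the kernel matrix in feature space
n = K.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J

# Eigen-spectrum: model selection amounts to choosing how many leading
# components to keep (and which sigma to use in the first place)
eigvals = np.linalg.eigvalsh(Kc)[::-1]
print("top 5 kernel principal values:", eigvals[:5])
```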

Varon, Carolina — KU Leuven


Heart rate variability : linear and nonlinear analysis with applications in human physiology

Cardiovascular diseases are a growing problem in today’s society. The World Health Organization (WHO) reported that these diseases account for about 30% of all global deaths and that heart disease knows no geographic, gender or socioeconomic boundaries. Early detection of cardiac irregularities and correct treatment are therefore very important, but require a good physiological understanding of the cardiovascular system. The heart is stimulated electrically by the brain via the autonomic nervous system, whose sympathetic and vagal pathways continuously interact to modulate the heart rate. Continuous monitoring of heart activity is obtained by means of an electrocardiogram (ECG). The study of fluctuations in heartbeat intervals over time, known as heart rate variability (HRV) analysis, reveals a great deal of information. Reduced HRV has been reported in several cardiological and noncardiological diseases. Moreover, HRV also has a prognostic ...
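As a hedged illustration of time-domain HRV analysis, the sketch below computes two standard indices, SDNN and RMSSD, from a series of R-peak times; the synthetic beat times are an assumption for demonstration only.

```python
import numpy as np

# Synthetic R-peak times in seconds (assumed data, not from the thesis)
r_peaks = np.cumsum(0.8 + 0.05 * np.random.default_rng(2).standard_normal(120))

rr = np.diff(r_peaks) * 1000.0            # RR intervals in ms

sdnn = np.std(rr, ddof=1)                 # overall variability
rmssd = np.sqrt(np.mean(np.diff(rr)**2))  # short-term (beat-to-beat) variability

print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```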

Vandeput, Steven — KU Leuven


Compressive Sensing Based Candidate Detector and its Applications to Spectrum Sensing and Through-the-Wall Radar Imaging

Signal acquisition is a main topic in signal processing. The well-known Shannon-Nyquist theorem lies at the heart of all conventional analog-to-digital converters, stating that a signal must be sampled at a constant rate of at least twice the highest frequency it contains in order to be recovered perfectly. However, the Shannon-Nyquist theorem provides a worst-case rate bound for arbitrary bandlimited data. In this context, Compressive Sensing (CS) is a new framework in which data acquisition and data processing are merged. CS allows data to be compressed as they are sampled by exploiting the sparsity present in many common signals, thereby providing an efficient way to reduce the number of measurements needed for perfect recovery of the signal. CS has exploded in recent years, with thousands of technical publications and applications ...
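The Nyquist bound can be illustrated numerically: a tone sampled below twice its frequency folds back to a lower (alias) frequency. In the minimal sketch below, the signal frequency and sampling rates are illustrative choices.

```python
import numpy as np

f_sig = 70.0                   # signal frequency in Hz
fs_ok, fs_low = 200.0, 100.0   # above and below the Nyquist rate (140 Hz)

def dominant_freq(fs, f, n=1024):
    """Sample a tone of frequency f at rate fs and locate the spectral peak."""
    t = np.arange(n) / fs
    x = np.sin(2 * np.pi * f * t)
    spec = np.abs(np.fft.rfft(x))
    return np.fft.rfftfreq(n, 1 / fs)[np.argmax(spec)]

print("fs = 200 Hz ->", dominant_freq(fs_ok, f_sig), "Hz")   # ~70 Hz, recovered
print("fs = 100 Hz ->", dominant_freq(fs_low, f_sig), "Hz")  # ~30 Hz alias (100 - 70)
```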

Lagunas, Eva — Universitat Politecnica de Catalunya


Self-Organization and Data Compression in Wireless Sensor Networks of Extreme Scales: Application to Environmental Monitoring, Climatology and Bioengineering

Wireless Sensor Networks (WSNs) aim at accurate data gathering and representation of one or multiple physical variables from the environment, by means of sensor readings and wireless transmission of data packets to a Data Fusion Center (DFC). There is no comprehensive common set of requirements for all WSNs, as these are application dependent. Moreover, due to specific node capabilities or energy consumption constraints, several tradeoffs have to be considered during the design; in particular, the price of the sensor nodes is a determining factor. The distinction between small and large scale WSNs refers not only to the number of sensor nodes, but also establishes the main design challenges in each case. For example, node organization is a key issue in large scale WSNs, where many inexpensive nodes have to work properly in a coordinated manner. Regarding the amount of ...

Chidean, Mihaela I. — Rey Juan Carlos University


Robust Estimation and Model Order Selection for Signal Processing

In this thesis, advanced robust estimation methodologies for signal processing are developed and analyzed. The developed methodologies solve problems concerning multi-sensor data, robust model selection and robustness for dependent data, and have been applied to practical signal processing problems in different areas of biomedical and array signal processing. In particular, for univariate independent data, a robust criterion is presented to select the model order, with an application to corneal-height data modeling. The proposed criterion overcomes some limitations of existing robust criteria. For real-world data, it selects the radial model order of the Zernike polynomial of the corneal topography map in accordance with clinical expectations, even if the measurement conditions for videokeratoscopy, the state-of-the-art method to collect corneal-height data, are poor. For multi-sensor data, robust model order selection criteria are proposed and applied ...
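As a generic illustration of robust model order selection (not the criterion developed in the thesis), the sketch below scores polynomial orders with a BIC-style criterion whose residual scale is estimated by the median absolute deviation, so that a few impulsive outliers do not inflate the selected order; the data and outlier pattern are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(-1, 1, 200)
y = 1 - 2 * t + 0.5 * t**3 + 0.1 * rng.standard_normal(t.size)
y[rng.choice(t.size, 10, replace=False)] += 5.0   # impulsive outliers (assumed)

def robust_bic(order):
    # Ordinary LS fit for brevity; a fully robust variant would also
    # use a robust fit rather than only a robust scale estimate
    coeffs = np.polyfit(t, y, order)
    res = y - np.polyval(coeffs, t)
    scale = 1.4826 * np.median(np.abs(res - np.median(res)))  # MAD scale
    n, k = t.size, order + 1
    return n * np.log(scale**2) + k * np.log(n)               # BIC-style score

best = min(range(1, 8), key=robust_bic)
print("selected order:", best)   # should match the cubic generating model
```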

Muma, Michael — Technische Universität Darmstadt


Bayesian Compressed Sensing using Alpha-Stable Distributions

During the last decades, information has been gathered and processed at an explosive rate. This fact gives rise to a very important issue: how to effectively and precisely describe the information content of a given source signal, or an ensemble of source signals, such that it can be stored, processed or transmitted within the limitations and capabilities of the available digital devices. One of the fundamental principles of signal processing for decades has been the Nyquist-Shannon sampling theorem, which states that the minimum number of samples needed to reconstruct a signal without error is dictated by its bandwidth. However, there are many cases in everyday life in which sampling at the Nyquist rate produces too much data, demanding increased processing power as well as storage. A mathematical theory that emerged ...
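The heavy tails that make alpha-stable distributions attractive as priors for impulsive data can be seen directly by sampling: for stability index alpha < 2 the tails decay algebraically, while alpha = 2 recovers the Gaussian. The parameters in this sketch are illustrative assumptions.

```python
import numpy as np
from scipy.stats import levy_stable, norm

rng = np.random.default_rng(4)

# alpha < 2 gives heavy (algebraic) tails; alpha = 2 recovers the Gaussian
alpha, beta = 1.5, 0.0   # illustrative stability / skewness parameters
stable = levy_stable.rvs(alpha, beta, size=10000, random_state=rng)
gauss = norm.rvs(size=10000, random_state=rng)

for name, s in [("alpha-stable", stable), ("Gaussian", gauss)]:
    # Fraction of samples with magnitude above 5: essentially zero for the
    # Gaussian, a visible share for the stable law
    print(f"{name:12s} fraction beyond |5|: {np.mean(np.abs(s) > 5):.4f}")
```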

Tzagkarakis, George — University of Crete


Ultra low-power biomedical signal processing: an analog wavelet filter approach for pacemakers

The purpose of this thesis is to describe novel signal processing methodologies and analog integrated circuit techniques for low-power biomedical systems. Physiological signals, such as the electrocardiogram (ECG), the electroencephalogram (EEG) and the electromyogram (EMG), are mostly non-stationary. The main difficulty in biomedical signal processing is that the information of interest is often a combination of features that are well localized temporally (e.g., spikes) and others that are more diffuse (e.g., small oscillations). This requires analysis methods versatile enough to handle events at opposite extremes of time-frequency localization. The wavelet transform (WT) has been extensively used in biomedical signal processing, mainly due to the versatility of wavelet tools. The WT has been shown to be a very efficient tool for local analysis of nonstationary and fast transient signals due ...
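Although the thesis concerns analog wavelet filters, the character of wavelet processing can be sketched with a digital analogue: a multilevel discrete wavelet transform with soft thresholding handles both a temporally localized spike and a diffuse oscillation in the same signal. The wavelet, decomposition level and threshold rule below are illustrative assumptions.

```python
import numpy as np
import pywt

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 1024)
clean = np.exp(-((t - 0.3) / 0.01) ** 2)   # well-localized "spike"
clean += 0.3 * np.sin(2 * np.pi * 5 * t)   # diffuse oscillation
noisy = clean + 0.1 * rng.standard_normal(t.size)

# Multilevel DWT, soft-threshold the detail coefficients, reconstruct
coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise estimate, finest level
thr = sigma * np.sqrt(2 * np.log(noisy.size))    # universal threshold
den = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
rec = pywt.waverec(den, "db4")

print("residual std before/after:",
      np.std(noisy - clean).round(3), np.std(rec - clean).round(3))
```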

Haddad, Sandro Augusto Pavlík — Delft University of Technology


Robust Methods for Sensing and Reconstructing Sparse Signals

Compressed sensing (CS) is a recently introduced signal acquisition framework that goes against the traditional Nyquist sampling paradigm. CS demonstrates that a sparse, or compressible, signal can be acquired using a low-rate acquisition process. Since noise is always present in practical data acquisition systems, sensing and reconstruction methods are typically developed assuming a Gaussian (light-tailed) model for the corrupting noise. However, when the underlying signal and/or the measurements are corrupted by impulsive noise, commonly employed linear sampling operators, coupled with Gaussian-derived reconstruction algorithms, fail to recover a close approximation of the signal. This dissertation develops robust sampling and reconstruction methods for sparse signals in the presence of impulsive noise. To achieve this objective, we make use of robust statistics theory to develop appropriate methods addressing the problem of impulsive noise in CS systems. We develop a generalized Cauchy distribution (GCD) ...
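As a hedged sketch of the robust-statistics idea (a generic Cauchy M-estimator, not the GCD-based methods the dissertation develops), the example below solves a regression problem by iteratively reweighted least squares, where Cauchy weights downweight impulsively corrupted measurements; the sizes, noise levels and scale parameter gamma are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
m, n = 100, 5
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
y = A @ x_true + 0.05 * rng.standard_normal(m)
y[rng.choice(m, 10, replace=False)] += 20.0   # impulsive corruption (assumed)

x_ls = np.linalg.lstsq(A, y, rcond=None)[0]   # plain least squares baseline

# IRLS with Cauchy weights w = 1 / (1 + (r/gamma)^2)
gamma = 0.1
x = x_ls.copy()
for _ in range(50):
    r = y - A @ x
    w = 1.0 / (1.0 + (r / gamma) ** 2)
    Aw = A * w[:, None]                       # row-weighted design matrix
    x = np.linalg.solve(A.T @ Aw, Aw.T @ y)   # weighted normal equations

print("LS error:    ", np.linalg.norm(x_ls - x_true))
print("robust error:", np.linalg.norm(x - x_true))
```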

Carrillo, Rafael — University of Delaware


Extraction and Denoising of Fetal ECG Signals

Congenital heart defects are the leading cause of birth defect-related deaths. The fetal electrocardiogram (fECG), which is believed to contain much more information than conventional sonographic methods provide, can be measured by placing electrodes on the mother’s abdomen. However, it has very low power and is mixed with several sources of noise and interference, including the strong maternal ECG (mECG). In previous studies, several methods have been proposed for the extraction of fECG signals recorded from the maternal body surface. However, these methods require a large number of sensors and are ineffective with only one or two sensors. In this study, state modeling, statistical and deterministic approaches are proposed for capturing weak traces of fetal cardiac signals. These three methods implement different models of the quasi-periodicity of the cardiac signal. In the first approach, the heart rate and its ...
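A classical first step in this setting, maternal template subtraction, can be sketched as follows: detect the strong maternal R peaks, average a beat template, and subtract it around each beat so that the weak fetal trace remains. The synthetic Gaussian "beats", peak detector settings and window length are illustrative assumptions, not the pipeline of the thesis.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(7)
fs = 500
t = np.arange(10 * fs) / fs

def beat_train(rate_hz, width, amp):
    """Train of Gaussian pulses as a crude stand-in for QRS complexes."""
    x = np.zeros(t.size)
    for t0 in np.arange(0.2, t[-1], 1.0 / rate_hz):
        x += amp * np.exp(-((t - t0) / width) ** 2)
    return x

mecg = beat_train(1.2, 0.015, 1.0)   # strong maternal beats (~72 bpm)
fecg = beat_train(2.2, 0.008, 0.1)   # weak fetal beats (~132 bpm)
abd = mecg + fecg + 0.01 * rng.standard_normal(t.size)

# Detect maternal R peaks, average a beat template, subtract per beat
peaks, _ = find_peaks(abd, height=0.5, distance=int(0.5 * fs))
half = int(0.1 * fs)
ok = [p for p in peaks if half <= p < abd.size - half]
template = np.mean([abd[p - half:p + half] for p in ok], axis=0)

residual = abd.copy()
for p in ok:
    residual[p - half:p + half] -= template

before = np.sum((abd - fecg) ** 2)        # maternal + noise energy
after = np.sum((residual - fecg) ** 2)    # what remains after subtraction
print("maternal interference removed: %.1f%%" % (100 * (1 - after / before)))
```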

Niknazar, Mohammad — University of Grenoble


Combining anatomical and spectral information to enhance MRSI resolution and quantification: Application to Multiple Sclerosis

Multiple sclerosis is a progressive autoimmune disease that affects young adults. Magnetic resonance (MR) imaging has become an integral part of monitoring multiple sclerosis. Conventional MR imaging sequences, such as fluid-attenuated inversion recovery imaging, have high spatial resolution and can visualise focal white matter brain lesions in multiple sclerosis. Manual delineation of these lesions on conventional MR images is time consuming and suffers from intra- and inter-rater variability. Among the advanced MR imaging techniques, MR spectroscopic imaging can offer complementary information on lesion characterisation compared to conventional MR images. However, MR spectroscopic images have low spatial resolution. Therefore, the aim of this thesis is to automatically segment multiple sclerosis lesions on conventional MR images and to use the information from high-resolution conventional MR images to enhance the resolution of MR spectroscopic images. Automatic single time ...

Jain, Saurabh — KU Leuven


Sparsity-Aware Wireless Networks: Localization and Sensor Selection

Wireless networks have revolutionized today’s world by providing real-time, cost-efficient service and connectivity. Yet even such an unprecedented level of service could not satisfy the modern world’s insatiable desire for more advanced technologies. As a result, a great deal of attention has been directed towards (mobile) wireless sensor networks (WSNs), which are composed of considerably cheap nodes that can cooperate to perform complex tasks in a distributed fashion in extremely harsh environments. Unique features of wireless environments, added complexity owing to mobility, the distributed nature of the network setup, and tight performance and energy constraints pose a challenge for researchers to devise systems that strike a proper balance between performance and resource utilization. We study some of the fundamental challenges of wireless (sensor) networks associated with resource efficiency, scalability and location-awareness. The pivotal point which distinguishes our studies from ...

Jamali-Rad, Hadi — TU Delft


Transformation methods in signal processing

This dissertation is concerned with the application of the theory of rational functions in signal processing and summarizes the corresponding results of the author’s research. Since systems of rational functions are defined by the collection of inverse poles with multiplicities, the following parameters have to be determined: the number, the positions and the multiplicities of the inverse poles. To this end, we develop hyperbolic variants of the Nelder–Mead and particle swarm optimization algorithms, the latter of which is integrated into a more general multi-dimensional framework. Furthermore, we perform a detailed stability and error analysis of these methods. We also propose an electrocardiogram signal generator based on spline interpolation, which turns out to be an efficient tool for testing and evaluating signal models, filtering techniques, etc. In this thesis, the synthesized heartbeats are used to test the diagnostic distortion ...
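In the spirit of the spline-based generator mentioned above, the following minimal sketch interpolates a single synthetic heartbeat through hand-picked anchor points with a cubic spline; the anchor points tracing the P wave, QRS complex and T wave are illustrative assumptions, not values from the thesis.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Illustrative anchor points (time in s, amplitude in mV) roughly tracing
# a P wave, QRS complex and T wave; values are assumed, not from the thesis
knots = np.array([0.00, 0.10, 0.16, 0.20, 0.23, 0.26, 0.30, 0.45, 0.55, 0.80])
amps  = np.array([0.00, 0.00, 0.12, 0.00, -0.1, 1.00, -0.2, 0.00, 0.25, 0.00])

beat = CubicSpline(knots, amps)       # smooth beat as a spline function
t = np.linspace(0, 0.8, 400)
ecg = beat(t)

print("peak R amplitude: %.2f mV at t = %.2f s" % (ecg.max(), t[np.argmax(ecg)]))
```

Concatenating such beats with varying RR intervals yields a controllable test signal for filtering and modeling experiments.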

Kovács, Péter — Eötvös L. University, Budapest, Hungary


EEG-Biofeedback and Epilepsy: Concept, Methodology and Tools for (Neuro)therapy Planning and Objective Evaluation

Objective diagnosis and therapy evaluation are still challenging tasks for many neurological disorders. This is highly related to the diversity of cases and the variety of treatment modalities available. Especially in the case of epilepsy, a complex disorder not well explained at the biochemical and physiological levels, there is a need to investigate novel features that can be extracted and quantified from electrophysiological signals in clinical practice. Neurotherapy is a complementary treatment applied in various disorders of the central nervous system, including epilepsy. The method is subsumed under behavioral medicine and is considered operant conditioning in psychological terms. Although the application areas of this promising unconventional approach are rapidly increasing, the method is strongly debated, since the neurophysiological underpinnings of the process are not yet well understood. Therefore, verification of the efficacy of the treatment is one ...

Kirlangic, Mehmet Eylem — Technische Universitaet Ilmenau
