Multimodal epileptic seizure detection: towards a wearable solution

Epilepsy is one of the most common neurological disorders, affecting almost 1% of the population worldwide. Anti-epileptic drugs provide adequate treatment for about 70% of epilepsy patients. The remaining 30% continue to have seizures, which drastically affects their quality of life. In order to obtain efficacy measures of therapeutic interventions for these patients, an objective way to count and document seizures is needed. In an outpatient setting, however, one of the major problems is that seizure diaries kept by patients are unreliable. Automated seizure detection systems could help to quantify seizures objectively. These detection systems are typically based on full scalp electroencephalography (EEG). In an outpatient setting, full scalp EEG is of limited use because patients will not tolerate wearing a full EEG cap for long periods during daily life. There is a need for ...

Vandecasteele, Kaat — KU Leuven


Learning from structured EEG and fMRI data supporting the diagnosis of epilepsy

Epilepsy is a neurological condition that manifests in epileptic seizures resulting from abnormal, synchronous activity of a large group of neurons. Depending on the affected brain regions, seizures produce various severe clinical symptoms. Epilepsy cannot be cured and in many cases is not controlled by medication either. Surgical resection of the region responsible for generating the epileptic seizures may offer a remedy for these patients. Electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) measure changes of brain activity over time at different locations in the brain. As such, they provide valuable information on the nature, the timing and the spatial origin of the epileptic activity. Unfortunately, both techniques also record the activity of other brain and artefact sources. Hence, EEG and fMRI signals are characterised by a low signal-to-noise ratio. Data quality and the vast amount ...

Hunyadi, Borbála — KU Leuven


Efficient parametric modeling, identification and equalization of room acoustics

Room acoustic signal enhancement (RASE) applications, such as digital equalization and acoustic echo and feedback cancellation, which are commonly found in communication devices and audio equipment, aim to process acoustic signals with the final goal of improving the perceived sound quality in rooms. To do so, signal processing algorithms require the acoustic response of the room to be represented by means of parametric models and to be identified from the input and output signals of the room acoustic system. In particular, a good model should be both accurate, capturing those features of room acoustics that are physically and perceptually most relevant, and efficient, so that it can be implemented as a digital filter and used in practical signal processing tasks. This thesis addresses the fundamental question in room acoustic signal processing concerning the appropriateness of different parametric ...
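
As an illustration of the identification step described above, here is a minimal sketch: estimating a room response from input and output signals by least squares. It uses a plain FIR model, which is simpler than the efficient parametric structures the thesis studies; the signals, model order and noise level are synthetic placeholders, not from the thesis.

```python
import numpy as np
from scipy.linalg import toeplitz

# Simulate a toy "room": a decaying random impulse response (illustrative only).
rng = np.random.default_rng(0)
L = 64                                   # FIR model order (assumed)
h_true = np.exp(-0.05 * np.arange(L)) * rng.standard_normal(L)

x = rng.standard_normal(4000)            # excitation (input) signal
y = np.convolve(x, h_true)[: len(x)]     # room output
y += 0.01 * rng.standard_normal(len(y))  # measurement noise

# Least-squares identification: y ~ X h, with X a convolution (Toeplitz) matrix.
X = toeplitz(x, np.r_[x[0], np.zeros(L - 1)])
h_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

print("relative error:", np.linalg.norm(h_hat - h_true) / np.linalg.norm(h_true))
```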

Vairetti, Giacomo — KU Leuven


Automated detection of epileptic seizures in pediatric patients based on accelerometry and surface electromyography

Epilepsy is one of the most common neurological diseases, manifesting in repetitive epileptic seizures as a result of abnormal, synchronous activity of a large group of neurons. Depending on the affected brain regions, seizures produce various severe clinical symptoms. There is no cure for epilepsy, and sometimes even medication and other therapies, such as surgery, vagus nerve stimulation or a ketogenic diet, do not control the number of seizures. In that case, long-term (home) monitoring and automatic seizure detection would enable tracking the evolution of the disease and improve objective insight into any responses to medical interventions or changes in medical treatment. Especially during the night, supervision is reduced; hence a large number of seizures are missed. In addition, an alarm should be integrated into the automated seizure detection algorithm for severe seizures in order to help the ...
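
To make the two modalities in the title concrete, here is a minimal illustration, not the thesis algorithm: windowed movement intensity from the accelerometer and RMS amplitude from the surface EMG, combined by a simple threshold rule. The window length, thresholds and synthetic data are all placeholders.

```python
import numpy as np

def window_features(acc, emg, fs, win_s=2.0):
    """Per-window movement intensity (accelerometer jerk energy) and sEMG RMS.

    acc: (N, 3) accelerometer samples, emg: (N,) surface EMG samples.
    Window length and all thresholds below are illustrative, not thesis values.
    """
    w = int(win_s * fs)
    feats = []
    for i in range(len(emg) // w):
        a = acc[i * w:(i + 1) * w]
        e = emg[i * w:(i + 1) * w]
        jerk = np.diff(a, axis=0)                 # rate of change of acceleration
        movement = np.sum(jerk ** 2) / w          # jerk energy per sample
        emg_rms = np.sqrt(np.mean(e ** 2))        # muscle activation level
        feats.append((movement, emg_rms))
    return np.array(feats)

# Toy usage with synthetic data and a placeholder decision rule.
rng = np.random.default_rng(1)
fs = 100
acc = rng.standard_normal((60 * fs, 3))
emg = rng.standard_normal(60 * fs)
f = window_features(acc, emg, fs)
alarm = (f[:, 0] > 2.0) & (f[:, 1] > 1.5)        # both modalities elevated
print("flagged windows:", np.flatnonzero(alarm))
```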

Milošević, Milica — KU Leuven


Mining the ECG: Algorithms and Applications

This research focuses on the development of algorithms to extract diagnostic information from the ECG signal, which can be used to improve automatic detection systems and home monitoring solutions. In the first part of this work, a generically applicable algorithm for model selection in kernel principal component analysis is presented, inspired by the derivation of respiratory information from the ECG signal. This method not only solves a problem in biomedical signal processing, but more importantly offers a solution to a long-standing problem in the field of machine learning. Next, a methodology to quantify the level of contamination in a segment of ECG is proposed. This level is used to detect artifacts and to improve the performance of different classifiers by removing these artifacts from the training set. Furthermore, an evaluation of three different methodologies to compute the ECG-derived ...
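
The following sketch illustrates the kernel PCA setting that motivated the model-selection work: extracting a slowly varying (respiration-like) component from a matrix of heartbeat segments. The kernel parameters are fixed by hand here, which is precisely the choice the thesis's model-selection method addresses; the beat data are synthetic.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Toy stand-in for a beat matrix: each row is one aligned heartbeat segment
# whose morphology is slowly modulated by respiration (synthetic, illustrative).
rng = np.random.default_rng(0)
n_beats, seg_len = 200, 120
t = np.linspace(0, 1, seg_len)
resp = np.sin(2 * np.pi * 0.05 * np.arange(n_beats))   # hidden respiratory drive
beats = np.array([(1 + 0.1 * r) * np.exp(-((t - 0.5) ** 2) / 0.002) for r in resp])
beats += 0.01 * rng.standard_normal(beats.shape)

# Kernel PCA with an RBF kernel; gamma and n_components are hand-picked here,
# whereas the thesis studies how to select them in a principled way.
kpca = KernelPCA(n_components=1, kernel="rbf", gamma=1.0)
score = kpca.fit_transform(beats)[:, 0]

# The leading component should correlate with the respiratory modulation.
print("corr with respiration:", np.corrcoef(score, resp)[0, 1])
```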

Varon, Carolina — KU Leuven


Development of an automated neonatal EEG seizure monitor

Brain function requires a continuous flow of oxygen and glucose. An insufficient supply for a few minutes during the first period of life may have severe consequences or even result in death. This happens in one to six infants per 1000 live term births. Therefore, there is a great need for a method that enables bedside brain monitoring to identify those neonates at risk so that treatment can be started in time. The most important currently available technology to continuously monitor brain function is electroencephalography (EEG). Unfortunately, visual EEG analysis requires particular skills which are not always available round the clock in the Neonatal Intensive Care Unit (NICU). Even when those skills are available, it is laborious to manually analyse many hours of EEG. The lack of time and skill is the main reason why EEG is ...

Deburchgraeve, Wouter — KU Leuven


Methods for Detection and Classification in ECG Analysis

The first part of the presented work focuses on the measurement of QT intervals. The QT interval can be an indicator of a patient's cardiovascular health and can reveal potential abnormalities. It is measured from the onset of the QRS complex to the end of the T wave. However, measurements of the end of the T wave are often highly subjective and the corresponding verification is difficult. Here we propose two methods for QT interval measurement: a wavelet-based method and a template matching method. The methods are compared with each other and tested on the standard QT database. The second part of the presented work focuses on modelling arrhythmias using McSharry's model, followed by classification using an artificial neural network. The proposed method pre-processes the signals with the Linear Approximation Distance Thresholding method and the Line Segment Clustering method ...
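
A minimal sketch of the template-matching idea (not the thesis implementation): a T-wave template is slid along the beat, the best normalized cross-correlation match locates the T wave, and a T-end offset annotated inside the template yields the QT interval. The beat, template, annotation offset and QRS onset are all hypothetical.

```python
import numpy as np

def match_template(beat, template):
    """Return offset and score of the best normalized cross-correlation match."""
    m = len(template)
    t = (template - template.mean()) / template.std()
    best, best_score = 0, -np.inf
    for i in range(len(beat) - m + 1):
        w = beat[i:i + m]
        s = w.std()
        if s == 0:
            continue
        score = np.dot((w - w.mean()) / s, t) / m
        if score > best_score:
            best, best_score = i, score
    return best, best_score

# Toy beat: QRS spike near sample 50, T wave centred near sample 150.
fs = 250
n = np.arange(300)
beat = np.exp(-((n - 50) ** 2) / 8.0) + 0.3 * np.exp(-((n - 150) ** 2) / 400.0)

# Hypothetical annotated template: a T wave whose end is known within it.
template = 0.3 * np.exp(-((np.arange(80) - 40) ** 2) / 400.0)
t_end_in_template = 75                       # annotated T-end offset (placeholder)

offset, score = match_template(beat, template)
qrs_onset = 45                               # assumed given by a QRS detector
qt_samples = offset + t_end_in_template - qrs_onset
print("QT interval: %.0f ms (score %.2f)" % (1000 * qt_samples / fs, score))
```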

Kicmerova, Dina — Brno University of Technology / Department of Biomedical Engineering


Detection of epileptic seizures based on video and accelerometer recordings

Epilepsy is one of the most common neurological diseases, especially in children. Although the majority of patients (70%-75%) can be treated through medication or surgery, a significant group of patients cannot. For this latter group it is advisable to follow the evolution of the disease. This can be done through long-term automatic monitoring, which gives an objective measure of the number of seizures that the patient has, for example during the night. Overnight, however, there is reduced social control, and parents or caregivers can miss some seizures. In severe seizures it is sometimes necessary to avoid dangerous situations during or after the seizure (e.g. the danger of suffocation caused by vomiting or a position that obstructs breathing, or the risk of injury during violent movements), and to comfort ...

Cuppens, Kris — Katholieke Universiteit Leuven


First-order Convex Optimization Methods for Signal and Image Processing

In this thesis we investigate the use of first-order convex optimization methods applied to problems in signal and image processing. First we give a general introduction to convex optimization, first-order methods and their iteration complexity. We then look at techniques that can be used with first-order methods, such as smoothing, Lagrange multipliers and proximal gradient methods. We continue by presenting different applications of convex optimization and notable convex formulations, with an emphasis on inverse problems and sparse signal processing. We also describe the multiple-description problem. Finally, we present the contributions of the thesis. The remaining parts of the thesis consist of five research papers. The first paper addresses non-smooth first-order convex optimization and the trade-off between the accuracy and the smoothness of the approximating smooth function. The second and third papers concern discrete linear inverse problems and reliable numerical reconstruction software. ...
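
To make one of the named techniques concrete, here is a minimal sketch of a proximal gradient method (ISTA) applied to the l1-regularized least-squares problem, a standard sparse signal processing formulation; the problem data and parameter values are synthetic, not from the thesis.

```python
import numpy as np

def ista(A, b, lam, n_iter=500):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)              # gradient of the smooth part
        z = x - g / L                      # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0)  # soft-threshold prox
    return x

# Synthetic sparse recovery problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 300))
x_true = np.zeros(300)
x_true[rng.choice(300, 10, replace=False)] = rng.standard_normal(10)
b = A @ x_true + 0.01 * rng.standard_normal(100)

x_hat = ista(A, b, lam=0.1)
print("support recovered:", np.flatnonzero(np.abs(x_hat) > 0.1))
```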

Jensen, Tobias Lindstrøm — Aalborg University


Contributions to signal analysis and processing using compressed sensing techniques

Chapter 2 contains a short introduction to the fundamentals of compressed sensing theory, which is the larger context of this thesis. We start by introducing the key concepts of sparsity and sparse representations of signals. We discuss the central problem of compressed sensing, i.e. how to adequately recover sparse signals from a small number of measurements, as well as the multiple formulations of the reconstruction problem. A large part of the chapter is devoted to some of the most important conditions, necessary and/or sufficient, that guarantee accurate recovery. The aim is to introduce the reader to the basic results without the burden of detailed proofs. In addition, we also present a few of the popular reconstruction and optimization algorithms that we use throughout the thesis. Chapter 3 presents an alternative sparsity model known as analysis sparsity, which offers similar recovery ...
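
As an example of the kind of reconstruction algorithm such a chapter surveys, here is a minimal sketch of orthogonal matching pursuit (OMP), a standard greedy sparse recovery method; the measurement matrix and sparse signal are synthetic and not tied to the thesis.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x from y = Ax."""
    r, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))            # atom most correlated with residual
        support.append(j)
        As = A[:, support]
        coef, *_ = np.linalg.lstsq(As, y, rcond=None)  # re-fit on current support
        r = y - As @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
m, n, k = 50, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)           # random Gaussian measurements
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

x_hat = omp(A, y, k)
print("max recovery error:", np.max(np.abs(x_hat - x_true)))
```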

Cleju, Nicolae — "Gheorghe Asachi" Technical University of Iasi


Decomposition methods with applications in neuroscience

The brain is the most important signal processing unit in the human body. It is responsible for receiving, processing and storing information. One possibility for studying brain function is to place electrodes on the scalp and record the synchronous neuronal activity of the brain. Such a recording measures a combination of active processes in the whole brain. Unfortunately, it is also contaminated by artifacts. By extracting and removing the artifacts, cleaned recordings can be investigated. Furthermore, it is easier to look at specific brain activities, like an epileptic seizure, than at a combination. In this thesis, we present different mathematical techniques that can be used to extract individual contributing sources from the measured signals for this purpose. We focus on Canonical Correlation Analysis (CCA), Independent Component Analysis (ICA) and Canonical/Parallel Factor Analysis (CPA). We show that ...
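
A minimal sketch of one of the named techniques, ICA, used for the artifact-removal task described above: unmix a synthetic multichannel recording, zero out the component most resembling a known artifact waveform, and re-mix. It uses scikit-learn's FastICA on toy data; real EEG pipelines are considerably more involved.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic example: two "brain" sources plus an eye-blink-like artifact,
# linearly mixed into four channels (illustrative, not real EEG).
rng = np.random.default_rng(0)
t = np.arange(2000) / 200.0
s1 = np.sin(2 * np.pi * 10 * t)                        # alpha-like rhythm
s2 = np.sign(np.sin(2 * np.pi * 3 * t))                # slower activity
blink = 5 * np.exp(-((t % 2.0) - 1.0) ** 2 / 0.001)    # periodic blink artifact
S = np.c_[s1, s2, blink]
A = rng.standard_normal((4, 3))                        # unknown mixing matrix
X = S @ A.T

ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(X)                      # estimated sources

# Zero out the component that most resembles the blink, then re-mix.
idx = np.argmax([abs(np.corrcoef(c, blink)[0, 1]) for c in components.T])
components[:, idx] = 0
X_clean = ica.inverse_transform(components)
print("removed component:", idx)
```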

De Vos, Maarten — Katholieke Universiteit Leuven


Full-Duplex Wireless: Self-interference Modeling, Digital Cancellation, and System Studies

In recent years, a significant portion of the research in the field of wireless communications has been motivated by two aspects: the constant increase in the number of wireless devices and the ever higher data rate requirements of individual applications. The undisputed outcome of these phenomena is heavy congestion of the suitable spectral resources. This has inspired many innovative solutions for improving the spectral efficiency of wireless communication systems by facilitating more simultaneous connections and higher data rates without requiring additional spectrum. These include technologies such as in-phase/quadrature (I/Q) modulation, multiple-input and multiple-output (MIMO) systems, and the orthogonal frequency-division multiplexing (OFDM) waveform, among others. Even though these existing solutions have greatly improved the spectral efficiency of wireless communications, even more advanced techniques are needed to fulfil the future data transfer requirements in the ultra high ...
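
The basic building block behind the thesis title can be sketched in a few lines: linear digital self-interference cancellation, where the self-interference channel is estimated by least squares from the known transmit signal and the regenerated interference is subtracted. Real cancellers, including those studied in this thesis, also model transceiver nonlinearities, which this sketch omits; all signal parameters are illustrative.

```python
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(0)
N, L = 5000, 8                                # samples, SI channel taps (assumed)

x = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)  # own TX signal
h_si = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) * 0.3      # SI channel
soi = 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))      # weak signal of interest
y = np.convolve(x, h_si)[:N] + soi            # received: strong SI + weak SOI

# Least-squares estimate of the SI channel from the known transmit signal.
X = toeplitz(x, np.r_[x[0], np.zeros(L - 1)])
h_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

y_clean = y - X @ h_hat                       # subtract regenerated self-interference
print("power before: %.1f dB, after: %.1f dB"
      % (10 * np.log10(np.mean(np.abs(y) ** 2)),
         10 * np.log10(np.mean(np.abs(y_clean) ** 2))))
```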

Korpi, Dani — Tampere University of Technology


Bayesian Compressed Sensing using Alpha-Stable Distributions

During the last decades, information has been gathered and processed at an explosive rate. This gives rise to a very important issue: how to effectively and precisely describe the information content of a given source signal, or an ensemble of source signals, such that it can be stored, processed or transmitted while taking into consideration the limitations and capabilities of the various digital devices. One of the fundamental principles of signal processing for decades has been the Nyquist-Shannon sampling theorem, which states that the minimum number of samples needed to reconstruct a signal without error is dictated by its bandwidth. However, there are many cases in everyday life in which sampling at the Nyquist rate results in too much data, demanding increased processing power as well as storage. A mathematical theory that emerged ...
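
A small illustration of why alpha-stable distributions fit this setting: compared with the Gaussian case (alpha = 2), a symmetric alpha-stable law with smaller alpha produces draws that are mostly small with occasional large values, a natural heavy-tailed prior for sparse coefficients. The sketch uses scipy's levy_stable; the specific alpha values are illustrative.

```python
import numpy as np
from scipy.stats import levy_stable

n = 10000

# Symmetric alpha-stable samples (beta = 0); alpha = 2 recovers the Gaussian,
# smaller alpha gives heavier tails.
for alpha in (2.0, 1.5, 1.0):
    x = levy_stable.rvs(alpha, 0.0, size=n, random_state=0)
    frac_small = np.mean(np.abs(x) < 0.5)     # mass near zero
    frac_large = np.mean(np.abs(x) > 5.0)     # occasional large outliers
    print("alpha=%.1f  |x|<0.5: %.2f   |x|>5: %.4f"
          % (alpha, frac_small, frac_large))
```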

Tzagkarakis, George — University of Crete


Stereoscopic depth map estimation and coding techniques for multiview video systems

The dissertation deals with the problems of stereoscopic depth estimation and coding in multiview video systems, which are vital for the development of next-generation three-dimensional television. Depth estimation algorithms known from the literature are discussed, along with their theoretical foundations. The problem of estimating depth maps of high quality, expressed in terms of accuracy, precision and temporal consistency, is stated. Next, original solutions are proposed. The author proposes a novel, theoretically founded approach to depth estimation which employs the maximum a posteriori probability (MAP) rule for modelling the cost function used in optimization algorithms. The proposal is presented along with a method for estimating the parameters of such a model. In order to attain that, an analysis of the noise present in multiview video and a study of the inter-view correlation of corresponding samples of pictures have been ...
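
The MAP idea can be sketched as follows: the cost minimized per pixel is a negative log-posterior, a data (matching likelihood) term plus a smoothness prior on neighbouring disparities. The toy below optimizes such a cost over one scanline with simple dynamic programming; the cost form, weights and optimizer are illustrative, not the dissertation's model.

```python
import numpy as np

def map_scanline_disparity(left, right, d_max, lam=0.1):
    """MAP-style disparity for one scanline: data cost + smoothness prior, via DP.

    Cost(d) = sum_x |L[x] - R[x-d[x]]| + lam * sum_x |d[x] - d[x-1]|  (illustrative).
    """
    n = len(left)
    # Data term: absolute intensity difference for each candidate disparity.
    data = np.full((n, d_max + 1), np.inf)
    for d in range(d_max + 1):
        data[d:, d] = np.abs(left[d:] - right[: n - d])
    # Viterbi-style dynamic programming over the scanline.
    cost = data[0].copy()
    back = np.zeros((n, d_max + 1), dtype=int)
    d_range = np.arange(d_max + 1)
    for x in range(1, n):
        trans = cost[None, :] + lam * np.abs(d_range[:, None] - d_range[None, :])
        back[x] = np.argmin(trans, axis=1)
        cost = data[x] + np.min(trans, axis=1)
    # Backtrack the minimum-cost disparity path.
    d = np.empty(n, dtype=int)
    d[-1] = int(np.argmin(cost))
    for x in range(n - 1, 0, -1):
        d[x - 1] = back[x, d[x]]
    return d

# Toy scanline pair with a constant true disparity of 3.
rng = np.random.default_rng(0)
right = rng.standard_normal(100)
left = np.r_[rng.standard_normal(3), right[:-3]]   # left shifted by 3 samples
print(map_scanline_disparity(left, right, d_max=8)[10:20])
```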

Stankiewicz, Olgierd — Poznan University of Technology


Discrete Quadratic Time-Frequency Distributions: Definition, Computation, and a Newborn Electroencephalogram Application

Most signal processing methods were developed for continuous signals. Digital devices, such as the computer, process only discrete signals. This dissertation proposes new techniques to accurately define and efficiently implement an important signal processing method, the time-frequency distribution (TFD), using discrete signals. The TFD represents a signal in the joint time-frequency domain. Because these distributions are a function of both time and frequency, they can, unlike traditional signal processing methods, display frequency content that changes over time. TFDs have been used successfully in many signal processing applications, as almost all real-world signals have time-varying frequency content. Although TFDs are well defined for continuous signals, defining and computing a TFD for discrete signals is problematic. This work overcomes these problems by making contributions to the definition, computation and application of discrete TFDs. The first contribution is a new discrete definition of TFDs. A ...
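
A minimal sketch of the kind of computation at issue, a naive discrete Wigner-Ville distribution: form the instantaneous autocorrelation of the analytic signal and take a DFT over the lag variable. This straightforward version exhibits exactly the definition and aliasing subtleties the dissertation addresses; it is not the alias-free discrete definition developed there.

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x):
    """Naive discrete Wigner-Ville distribution of a real signal.

    Returns a (time x frequency) array; a simple illustration only.
    """
    z = hilbert(x)                               # analytic signal
    n = len(z)
    W = np.zeros((n, n))
    for t in range(n):
        tau_max = min(t, n - 1 - t)              # symmetric lags available at time t
        kernel = np.zeros(n, dtype=complex)
        for tau in range(-tau_max, tau_max + 1):
            kernel[tau % n] = z[t + tau] * np.conj(z[t - tau])
        W[t] = np.real(np.fft.fft(kernel))       # DFT over the lag variable
    return W

# Linear chirp: its WVD should concentrate along a line in the time-frequency plane.
fs = 256
t = np.arange(fs) / fs
x = np.cos(2 * np.pi * (10 * t + 40 * t ** 2))   # instantaneous frequency sweeps upward
W = wigner_ville(x)
print("peak frequency bins over time:", np.argmax(W[::32], axis=1))
```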

O'Toole, John M. — University of Queensland
