Emotion assessment for affective computing based on brain and peripheral signals
Current Human-Machine Interfaces (HMI) lack "emotional intelligence": they are not able to identify human emotional states and take this information into account to decide on the proper actions to execute. The goal of affective computing is to fill this gap by detecting emotional cues occurring during Human-Computer Interaction (HCI) and synthesizing emotional responses. In recent decades, most studies on emotion assessment have focused on the analysis of facial expressions and speech to determine the emotional state of a person. Physiological activity also carries emotional information that can be used for emotion assessment, but it has received less attention despite its advantages (for instance, it is less easily faked than facial expressions).

This thesis reports on the use of two types of physiological activity to assess emotions in the context of affective computing: the activity of the central nervous system (i.e. the brain) and the activity of the peripheral nervous system. The central activity is monitored by recording electroencephalograms (EEGs). The peripheral activity is assessed with the following sensors: a Galvanic Skin Response (GSR) sensor to measure perspiration, a respiration belt to measure abdominal expansion, a plethysmograph to record Blood Volume Pulse (BVP), and a temperature sensor to record finger temperature.

The valence-arousal space is chosen to represent emotions, since it originates from the cognitive theory of emotions and is general enough to be usable in several applications. Several areas of the valence-arousal space are used as ground-truth classes. To detect those classes automatically from physiological signals, a computational model is needed that maps a given physiological pattern to one of the chosen classes; pattern recognition and machine learning algorithms are employed to infer such a model.

Three protocols that use different emotion elicitation methods (images, recall of past emotional episodes, and playing a video game at different difficulty levels) are designed to gather physiological signals during emotional stimulations. For each emotional stimulation, features are extracted from the EEG and peripheral signals. For EEG signals, energy features that are known to be related to emotional processes are computed; in addition, the Mutual Information (MI) computed between all pairs of electrodes is proposed as a new set of features. For peripheral signals, the computed features are chosen based on a review of the literature and are discussed with respect to temporal aspects.

Several classifiers (Naïve Bayes, Discriminant Analysis, Support Vector Machines (SVM), and Relevance Vector Machines (RVM)) are then trained on the resulting databases. The performance of these algorithms is evaluated according to the obtained accuracy for both intra-participant (first two protocols) and inter-participant (third protocol) classification. Feature selection methods are also employed to find the most relevant features for emotion assessment and to reduce the size of the original feature spaces. Finally, fusion of the central and peripheral information at the decision and feature levels is analyzed.

The results show that EEG signals are usable for emotion assessment, since the classification accuracy obtained with EEG features is well above chance level. Averaged across participants, the best accuracy is 68% for the detection of three emotional classes and around 80% for two classes.
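To make the EEG feature-extraction step above concrete, the following Python sketch computes the two feature types mentioned: band-energy features and the proposed pairwise Mutual Information between electrodes. It is an illustrative sketch, not the thesis code: the band edges, the histogram-based MI estimator, and the bin count are assumptions.

import numpy as np
from scipy.signal import welch

def band_energies(eeg, fs, bands=((4, 8), (8, 12), (12, 30))):
    """Energy per frequency band and electrode.

    eeg: array of shape (n_electrodes, n_samples); band edges in Hz
    are illustrative (theta/alpha/beta-like), not the thesis values.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=min(int(2 * fs), eeg.shape[1]))
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].sum(axis=1))  # one value per electrode
    return np.concatenate(feats)

def mutual_information(x, y, n_bins=16):
    """MI (in bits) between two signals, estimated from a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=n_bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    nz = pxy > 0                              # skip empty cells, avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def mi_features(eeg, n_bins=16):
    """MI computed between all pairs of electrodes, as proposed above."""
    n = eeg.shape[0]
    return np.array([mutual_information(eeg[i], eeg[j], n_bins)
                     for i in range(n) for j in range(i + 1, n)])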
The effectiveness of the MI feature set for emotion assessment is also demonstrated by the classification accuracies (for instance, 62% for the detection of three emotional classes). Moreover, the accuracy obtained with EEG features is found to be higher than that obtained with peripheral features when the features are computed over short time periods (6 to 30 seconds). Fusion of the peripheral and EEG features at the decision level significantly increases the accuracy (by 3% to 7%), encouraging further fusion with other sources of emotional information (facial expressions, speech, etc.). An application of the developed methods that automatically adapts game difficulty based on emotion assessment is proposed as an example of an affective HMI.
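As an illustration of decision-level fusion, the sketch below trains separate classifiers on the EEG and peripheral feature sets and combines their outputs with a simple sum rule over class posteriors. The sum rule and the use of Gaussian Naïve Bayes here are assumptions made for the example; the thesis evaluates several classifiers, and its exact fusion rule may differ.

import numpy as np
from sklearn.naive_bayes import GaussianNB

def fuse_decisions(X_eeg, X_per, y, X_eeg_test, X_per_test):
    """Decision-level fusion: one classifier per modality, sum rule."""
    clf_eeg = GaussianNB().fit(X_eeg, y)   # trained on EEG features
    clf_per = GaussianNB().fit(X_per, y)   # trained on peripheral features
    # Average the two posterior distributions and pick the class
    # with the highest fused probability.
    post = (clf_eeg.predict_proba(X_eeg_test)
            + clf_per.predict_proba(X_per_test)) / 2
    return clf_eeg.classes_[np.argmax(post, axis=1)]

Fusion of this kind only requires that each per-modality classifier output class probabilities, which is what makes it straightforward to add further sources of emotional information, such as facial expressions or speech, later on.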
