IEEE TRANSACTIONS ON INFORMATION TECHNOLOGY IN BIOMEDICINE, VOL. 15, NO. 5, SEPTEMBER 2011, 737

A Novel Emotion Elicitation Index Using Frontal Brain Asymmetry for Enhanced EEG-Based Emotion Recognition

Panagiotis C. Petrantonakis, Student Member, IEEE, and Leontios J. Hadjileontiadis, Senior Member, IEEE

Abstract—This paper aims at providing a novel method for evaluating the emotion elicitation procedures in an electroencephalogram (EEG)-based emotion recognition setup. By employing the frontal brain asymmetry theory, an index, namely the asymmetry index (AsI), is introduced in order to evaluate this asymmetry. This is accomplished by a multidimensional directed information analysis between different EEG sites from the two opposite brain hemispheres. The proposed approach was applied to three-channel (Fp1, Fp2, and F3/F4 10/20 sites) EEG recordings drawn from 16 healthy right-handed subjects. For the evaluation of the efficiency of the AsI, an extensive classification process was conducted using two feature-vector extraction techniques and an SVM classifier for six different classification scenarios in the valence/arousal space. This resulted in classification rates up to 62.58% for the user-independent case and 94.40% for the user-dependent one, confirming the efficacy of AsI as an index for the emotion elicitation evaluation.

Index Terms—Emotion elicitation, emotion recognition, electroencephalogram, frontal brain asymmetry, multidimensional directed information.

I. INTRODUCTION

Human-machine interaction (HMI) has gained intense attention in the last decade, as machines have dramatically influenced many aspects of our lives, such as communication, profession, and entertainment. It has lately been argued [1] that if machines could understand a person's affective state, HMI may become more intuitive, smoother, and more efficient, defining a new approach in the HMI area known as affective computing (AC).
AC is the research field that deals with the design of systems and devices that can recognize, interpret, and process human emotions, and would serve as the means of imbuing machines with the ability of acting emotionally.

Emotion recognition is the first step toward the abovementioned ultimate endeavor of AC. In order to recognize emotions, many approaches have been proposed, basically using facial expressions [2], [3], speech [4], [5], signals from the autonomous nervous system (ANS) (e.g., heart rate, galvanic skin response) [6], [7], or even a combination of them [8]. Lately, electroencephalogram-based emotion recognition (EEG-ER) [9]-[11] has been proposed, which offers great benefits with regard to its implementation, such as efficient time resolution, less intrusiveness, and signals captured from the origin of the emotion genesis, i.e., the central nervous system (CNS).

During an EEG-ER realization, three major steps have to be implemented. First of all, the emotion elicitation step deals with the problem of efficiently evoking emotions in the subjects who participate in the corresponding experiment. The second and third steps have to do with preprocessing and classification of the captured data, respectively. Their effectiveness, however, is highly dependent on the emotion elicitation. The latter is of major importance in an EEG-ER system.

Manuscript received November 6, 2010; revised April 13, 2011; accepted May 20, 2011. Date of publication May 27, 2011; date of current version September 2, 2011. The authors are with the Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, Thessaloniki GR-54124, Greece (e-mail: ppetrant@auth.gr; leontios@auth.gr). Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org. Digital Object Identifier 10.1109/TITB.2011.2157933
If the subjects have not effectively become emotionally aroused during the emotion elicitation step, the respective signals would not "carry" the corresponding emotional information, resulting in an incompetent emotion classification process.

Emotion elicitation procedures are mostly based on projections of videos or pictures that are assumed to evoke certain emotions. The majority of the studies that dealt with the emotion recognition problem (e.g., [11]) conducted emotion elicitation processes that consisted of such projections of multiple trials for each emotional state and used all of the signals corresponding to each one of the trials for the classification step. Considering that the situations depicted in the aforementioned videos/pictures are emotionally interpreted by the individuals in a way that is highly influenced by factors such as personality and personal experiences, it is significantly questionable whether all of the trials have the same emotional impact on all of the subjects. Different subjects may be emotionally affected by different videos/pictures.

In this work, a novel measure to evaluate the emotional impact of each emotion elicitation trial via the EEG signals is introduced. Consequently, by picking out the emotion elicitation trials that correspond to significant emotional responses, according to the introduced evaluation measure, the succeeding classification would achieve a considerable rate improvement. In order to define an emotion elicitation evaluation measure, the frontal brain asymmetry concept [12] was used, exploiting the neuronal bases of the emotion expression in the brain. To extract this evaluation measure in the form of an index, namely the asymmetry index (AsI), a multidimensional directed information (MDI) analysis [13] was adopted, resulting in a robust mathematical representation of the asymmetry concept, in a manner that efficiently handles the information flow (in bits) among multiple brain sites.
In order to proceed with the assessment of the effectiveness of the AsI, an extended classification process took place. A support vector machine (SVM) classifier and two feature-vector extraction techniques were employed for the classification process. EEG signals were acquired from 16 healthy subjects using three EEG channels, which are sufficient both for the representation of emotion expression in the brain (see Section II) and for the implementation of the MDI analysis [13]. Experimental results demonstrated the potential of AsI as a robust evaluation criterion of the emotion elicitation step of an EEG-ER system.

The rest of the paper is structured as follows. Section II gives some background material with regard to the emotion elicitation process and the frontal brain asymmetry concept, thoroughly describes the MDI method, and introduces the proposed approach. Section III discusses some implementation issues, such as the construction of the dataset and the classification setup, whereas Section IV presents the results. Section V provides some discussion on the overall evaluation of the proposed methodology. Finally, Section VI concludes the paper.

1089-7771/$26.00 © 2011 IEEE

II. METHOD

A. Background

1) Valence/Arousal-Based Emotion Elicitation: Psychologists do not present emotions as discrete states, but rather as continuous ones, and therefore demonstrate them in an n-dimensional (n-D) space; usually the 2-D valence/arousal space (VAS) is adopted. Valence stands for one's judgment about a situation as positive or negative, and arousal spans from calmness to excitement, expressing the degree of one's excitation. The most frequently used technique to evoke emotion during an EEG-ER procedure is to use pictures depicting situations that are supposed to elicit affective states lying in the VAS.
The International Affective Picture System (IAPS) [14] is a widely used dataset containing such pictures, which come with their individual values of valence and arousal. Consequently, the projection of the IAPS pictures to a subject with the simultaneous recording of EEG signals formulates an emotion elicitation process originating from the VAS theory.

2) Frontal Brain Asymmetry: The asymmetry between the left and right brain hemispheres forms the most prominent expression of emotion in brain signals. Davidson et al. [12] developed a model that related this asymmetric behavior with emotions, with the latter analyzed in the VAS. According to that model, emotions are: 1) organized around approach-withdrawal tendencies, and 2) differentially lateralized in the frontal region of the brain. The left frontal area is involved in the experience of positive emotions (high values of valence), such as joy or happiness (the experience of positive affect facilitates and maintains approach behaviors), whereas the right frontal region is involved in the experience of negative emotions (lower valence values), such as fear or disgust (the experience of negative affect facilitates and maintains withdrawal behaviors). In order to quantify the abovementioned asymmetry, Davidson used the asymmetry index

DI = (L - R) / (L + R),

where L and R are the power of specific bands of the left and right hemispheres, respectively. Later, the asymmetry concept was also confirmed by other studies. For example, Davidson et al. [15] tried to differentiate the emotions of happiness and disgust with EEG signals captured from the left and right frontal, central, anterior temporal, and parietal regions (F3, F4, C3, C4, T3, T4, P3, P4 positions according to the 10-20 system [16]).
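Davidson's DI can be computed directly from band-power estimates of a homologous left/right electrode pair. The sketch below is our own illustration under simple assumptions (a crude FFT-based band-power estimate and invented function names), not the paper's or Davidson's implementation:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Crude band-power estimate: sum of squared FFT magnitudes in [f_lo, f_hi] Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return spectrum[mask].sum()

def davidson_di(left, right, fs=128.0, band=(8.0, 12.0)):
    """DI = (L - R) / (L + R) over a band (alpha, 8-12 Hz, by default).
    `left`/`right` are single-channel EEG epochs from homologous sites."""
    L = band_power(left, fs, *band)
    R = band_power(right, fs, *band)
    return (L - R) / (L + R)
```

Since power scales with the square of amplitude, a left channel with twice the alpha amplitude of the right yields DI = (4 - 1)/(4 + 1) = 0.6, i.e., a left-dominant (positive-valence) asymmetry under the model.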
The results revealed a more right-sided activation, as far as the power of the alpha (8-12 Hz) band of the EEG signal is concerned, for the disgust condition in both the frontal and anterior temporal regions. Thus, the results enhanced the applicability of the aforementioned model, which is used by the method described in the succeeding sections to define the AsI as a metric of emotion arousal (see Section II.C), and confirmed the evidenced extensive anatomical reciprocity of both regions with limbic circuits that have been directly implicated in the control of emotion [17]. Furthermore, Davidson [18] cites several studies that have examined frequency bands other than alpha, including theta, beta, and gamma. In these studies, alpha power was also examined, and in a number of cases asymmetrical effects were found in bands other than alpha, while effects in the alpha band were absent. Moreover, Bos [11] examined the efficacy of alpha, beta, and their combination to discriminate emotions within the VAS, and concluded that both bands include important information for the aforementioned discrimination. This perspective is adopted in this study (see Section III.A).

B. Multidimensional Directed Information

Correlations among multiple time series are in general expected when they are simultaneously observed from an object. If a relation of temporal ordering is noted in the correlation relation among these time series, some are interpreted as causes and others as results, suggesting a cause-effect relation among the time series (causality analysis). When causality in such a sense is noted in multiple time series, the relation is defined as directed information [19].
There are methods suited to causality analysis, such as directed-coherence analysis [20], directed-information analysis [19], multidimensional directed information (MDI) analysis [13], Kaminski's method (DTF) [21], partial directed coherence [22], and Granger causality [23]. In this work, the MDI analysis was employed as a means to identify the causality between any two series while considering all acquired series. One of the main advantages of MDI is that the amount of information propagation is presented as an absolute value in bits and not as a correlation, which is a relative value (e.g., in directed-coherence analysis [20]). A description of the MDI follows.

Consider the simple case of two stationary time series X and Y of length N divided into n epochs of length L = N/n; each epoch of length L = P + 1 + M is written as a sequence of two sections of length P and M before and after the samples x_k and y_k of the time series X and Y at time k, respectively, i.e.,

X = x_{k-P} ... x_{k-1} x_k x_{k+1} ... x_{k+M} = X^P x_k X^M,    (1)
Y = y_{k-P} ... y_{k-1} y_k y_{k+1} ... y_{k+M} = Y^P y_k Y^M,    (2)

where X^P = x_{k-P} ... x_{k-1}, X^M = x_{k+1} ... x_{k+M}, Y^P = y_{k-P} ... y_{k-1}, and Y^M = y_{k+1} ... y_{k+M}.

The mutual information between the time series X and Y is written as

I(X; Y) = \sum_k I_k(X; Y),    (3)

where

I_k(X; Y) = I(x_k; Y^M | X^P Y^P y_k) + I(y_k; X^M | X^P Y^P x_k) + I(x_k; y_k | X^P Y^P).    (4)

The three terms on the right-hand side of (4) correspond to: the information shared by the sample x_k of X at time k and the future part Y^M of Y after time k (first term); the information shared by the sample y_k of Y at time k and the future part X^M of X after time k (second term); and the information that is not contained in the past parts X^P and Y^P of X and Y but is shared by x_k and y_k (third term). Since I_k(X; Y) represents mutual information, which is symmetric, we have I_k(X; Y) = I_k(Y; X), meaning that it contains no directivity, while the three terms on the right-hand side of (4) contain a temporal relation, which produces directivity. This directivity is defined [19] as directed information and is depicted using an arrow for clarification. For example, the first term on the right-hand side of (4) can be written as

I(x_k; Y^M | X^P Y^P y_k) = I(x_k -> Y^M | X^P Y^P y_k)    (5)

and analyzed as

I(x_k -> Y^M | X^P Y^P y_k) = \sum_{m=1}^{M} I(x_k -> y_{k+m} | X^P Y^P y_k),    (6)

where each term on the right-hand side of (6) can be interpreted as information that is first generated in X at time k and propagated with a time delay of m to Y, and can be calculated through the conditional mutual information as a sum of joint entropy functions:

I(x_k -> y_{k+m} | X^P Y^P y_k) = H(X^P Y^P x_k y_k) + H(X^P Y^P y_k y_{k+m}) - H(X^P Y^P y_k) - H(X^P Y^P x_k y_k y_{k+m}).    (7)
According to [13], the joint entropy H(z_1 ... z_n) of n Gaussian stochastic variables z_1, ..., z_n can be calculated using the covariance matrix R(z_1 ... z_n) as

H(z_1 ... z_n) = (1/2) log[(2 \pi e)^n |R(z_1 ... z_n)|],    (8)

where |.| denotes the determinant; by using (8), (7) can be written as

I(x_k -> y_{k+m} | X^P Y^P y_k) = (1/2) log { [ |R(X^P Y^P x_k y_k)| |R(X^P Y^P y_k y_{k+m})| ] / [ |R(X^P Y^P y_k)| |R(X^P Y^P x_k y_k y_{k+m})| ] }.    (9)

So far, the calculation of the information flow between two series has been presented. When the relations among three or more series are to be examined, however, the correct result of the analysis is not generally obtained if the aforementioned method is applied to each pair of series [13]. To better comprehend this, consider that there exist three series X, Y, and Z (instead of just the two, X and Y), with information flow from Z to both X and Y, but not between X and Y. When conventional directed-information analysis, based only on the relation between X and Y, is applied, an information flow would wrongly be identified, as if there exists a flow between X and Y, since they contain the common component from Z. To circumvent this ambiguity, the aforementioned method has to be expanded accordingly, in order to consider all measured time series, and the MDI, which represents the flow between two arbitrary series, must be defined. Consequently, the following expression for the MDI is obtained for the simple case of three interacting signals X, Y, Z:

I(x_k -> y_{k+m} | X^P Y^P Z^P y_k z_k) = (1/2) log { [ |R(X^P Y^P Z^P x_k y_k z_k)| |R(X^P Y^P Z^P y_k z_k y_{k+m})| ] / [ |R(X^P Y^P Z^P y_k z_k)| |R(X^P Y^P Z^P x_k y_k z_k y_{k+m})| ] }.    (10)

Using (10) and (6), the total amount of information, namely S, that is first generated in X and propagated to Y, taking into account the existence of Z, across the time-delay range is

S^{XY} := I(x_k -> Y^M | X^P Y^P Z^P y_k z_k) = \sum_{m=1}^{M} (1/2) log { [ |R(X^P Y^P Z^P x_k y_k z_k)| |R(X^P Y^P Z^P y_k z_k y_{k+m})| ] / [ |R(X^P Y^P Z^P y_k z_k)| |R(X^P Y^P Z^P x_k y_k z_k y_{k+m})| ] }.    (11)

In the subsequent paragraphs, (11) will be used to consolidate the AsI measure by estimating the mutual information shared between the left and right brain hemispheres, exploiting in that way the frontal brain asymmetry concept.

C. The Proposed Approach

According to the frontal brain asymmetry concept, the experience of negative emotions is related with an increased right frontal and prefrontal hemisphere activity, whereas positive emotions evoke an enhanced left-hemisphere activity. Let us assume that an EEG channel, i.e., Channel 1, recorded from the left hemisphere, another EEG channel, Channel 2, from the other hemisphere, and Channel 3, recorded from both hemispheres as a dipole channel, represent the signals X, Y, and Z, respectively, previously introduced in the MDI analysis. If we now consider the asymmetry concept, a measure that evaluates this asymmetry information in signals X and Y, taking into account the information propagated by signal Z to both of them (and as a result isolating the information shared between the X, Y pair more effectively), would introduce an index of how effectively an emotion has been elicited. Toward this, it is assumed that the total amount of information S [see (11)], hidden in the EEG signals and shared between the left and right hemispheres (signals X and Y, respectively), would become maximum when the subject is calm (information symmetry), whereas S would become minimum when the subject is emotionally aroused (information asymmetry).
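The Gaussian joint-entropy identity in (8) reduces each directed-information term in (9)/(10) to four covariance determinants. The sketch below is one straightforward reading of that computation under the Gaussian assumption; the function names and the stacking of samples into joint covariance matrices are ours, not the authors' code:

```python
import numpy as np

def gaussian_joint_entropy(R):
    """H(z_1 ... z_n) = 0.5 * log2((2*pi*e)^n * |R|) in bits, per (8).
    R is the (n x n) covariance matrix, assumed positive definite."""
    R = np.atleast_2d(R)
    n = R.shape[0]
    _, logdet = np.linalg.slogdet(R)  # log-determinant, natural log
    return 0.5 * (n * np.log2(2 * np.pi * np.e) + logdet / np.log(2))

def directed_info_term(samples_a, samples_b, samples_c, samples_d):
    """One term of (9)/(10): H(A) + H(B) - H(C) - H(D).
    Each argument is an (n_vars, n_obs) array of jointly observed samples,
    e.g. rows stacking the past sections X^P, Y^P with x_k, y_k, y_{k+m}."""
    def H(s):
        return gaussian_joint_entropy(np.cov(s))
    return H(samples_a) + H(samples_b) - H(samples_c) - H(samples_d)
```

For a single unit-variance Gaussian, this gives the familiar 0.5 log2(2 pi e), roughly 2.05 bits, and entropies of independent variables add, which is a quick sanity check on the base-2 (bits) convention the paper uses.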
In order to investigate the aforementioned assumption, two values were calculated according to the MDI analysis, i.e., the S_r and S_p values. S_r refers to the bidirectional information sharing between X and Y, taking into account Z, when the subject does not feel any emotion, hence, she/he is relaxed, i.e.,

S_r = S_r^{XY} + S_r^{YX},    (12)

whereas S_p is the same information sharing during the period in which she/he is supposed to feel an emotion, i.e.,

S_p = S_p^{XY} + S_p^{YX}.    (13)

According to what has already been discussed, S_p will presumably be smaller than S_r if the asymmetry concept holds. Finally, in order to directly define a measure of the emotion experience, the asymmetry index (AsI) is introduced, defined as the distance of the (S_p, S_r) point, corresponding to a specific picture, from the line S_r = S_p, i.e.,

AsI = (S_r - S_p) \times (\sqrt{2} / 2).    (14)

Later in this paper, AsI will serve as an index of the efficiency of the emotion elicitation of each trial during the corresponding experiment (see the subsequent section for the dataset construction) and will be further evaluated, with regard to its foundation as a metric for emotion arousal, through an extensive classification setup.

III. IMPLEMENTATION ISSUES

A. Dataset Construction

For the construction of the EEG dataset, a specifically designed experiment was conducted through an emotion elicitation process. In this experiment, 16 healthy right-handed volunteers (nine males and seven females) in the age group of 19-32 years participated. The whole experiment was designed to induce emotion within the VAS and, specifically, for four affective states, i.e., LALV (low arousal-low valence), LAHV (low arousal-high valence), HAHV (high arousal-high valence), and HALV (high arousal-low valence).^1
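Geometrically, (14) is the signed perpendicular distance of the point (S_p, S_r) from the diagonal S_r = S_p. A minimal sketch (the function name is ours):

```python
import math

def asymmetry_index(s_r, s_p):
    """AsI = (S_r - S_p) * sqrt(2)/2, per (14): signed perpendicular distance
    of the point (S_p, S_r) from the line S_r = S_p, in bits."""
    return (s_r - s_p) * math.sqrt(2) / 2
```

For example, a trial with relaxed-phase sharing S_r = 3.0 bits and picture-phase sharing S_p = 1.0 bits gives AsI = sqrt(2), about 1.41; under the paper's assumption, larger positive AsI indicates a more effective elicitation, while a point on the diagonal (S_p = S_r) gives AsI = 0.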
Ten pictures per affective state were selected from the IAPS database according to their arousal and valence values, as provided by the norms of the IAPS database. In particular, the low (L) grade previously mentioned was considered to be lower than 4 (<4) on a scale of 1-9, and the high (H) grade to be greater than 6 (>6) on the same scale. Moreover, the standard deviation of these values was required to be lower than 2.2. According to these criteria, 11, 47, 61, and 17 pictures were extracted for the LALV, LAHV, HAHV, and HALV affective states, respectively. From these pictures, ten pictures per affective state were randomly selected and constituted the 40-picture database used in the experiment. Fig. 1(a) shows the location of the finally selected pictures in the VAS.

The experimental protocol included the series of steps schematically depicted in Fig. 1(b). In particular, before the projection of each picture, an 11-s period took place, consisting sequentially of: 1) a 5-s black-screen period; 2) a 5-s period in which countdown frames (5 -> 1) were shown; and 3) a 1-s projection of a cross shape in the middle of the screen to attract the sight of the subject. The 5-s countdown phase was employed to accomplish a relaxation phase and an emotion reset [11], due to its naught emotional content before the projection of the new picture.

^1 Valence is commonly associated with positive/negative instead of high/low, but the latter is used here for the sake of simplicity.

Fig. 1. (a) Location of the selected pictures (big black dots) for the experiment's conduction, along with the rest of the pictures of the IAPS database (small black dots), in the arousal-valence space. (b) Schematic representation of the experimental protocol followed. (c) The Fp1, Fp2, F3, and F4 electrode positions in the 10-20 system (marked with black), used for the EEG acquisition.
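The selection criteria above (quadrant means below 4 or above 6 on the 1-9 scale, standard deviations below 2.2, then a random draw of ten) amount to a simple filter over the IAPS norm table. The sketch below is hypothetical: the record layout, field order, and function names are invented for illustration and do not match the real IAPS norms file:

```python
import random

def select_pictures(norms, valence_high, arousal_high, n=10, max_sd=2.2, seed=0):
    """Filter IAPS-style norm records for one affective-state quadrant and
    randomly sample n picture IDs.
    `norms` is a list of hypothetical tuples:
    (picture_id, valence_mean, valence_sd, arousal_mean, arousal_sd).
    'high' means mean > 6; 'low' means mean < 4 (1-9 scale)."""
    def grade_ok(mean, want_high):
        return mean > 6 if want_high else mean < 4

    pool = [
        pid for pid, v_m, v_sd, a_m, a_sd in norms
        if grade_ok(v_m, valence_high) and grade_ok(a_m, arousal_high)
        and v_sd < max_sd and a_sd < max_sd
    ]
    rng = random.Random(seed)  # fixed seed for a reproducible draw
    return rng.sample(pool, min(n, len(pool)))
```

With the paper's pool sizes (11, 47, 61, and 17 candidates), `n=10` is always satisfiable; the `min` guard only matters for smaller synthetic tables.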
During the experiment, the selected pictures were projected (in sequence: 10 for LALV, 10 for LAHV, 10 for HAHV, and 10 for HALV) for 5 s each. After the picture projection, a computerized 20-s self-assessment manikin (SAM) [24] procedure took place. The same 36-s procedure was repeated for each of the 40 pictures. The sequence of the projection of the affective states was chosen in order to show emotionally