Society for Neuroscience

Chicago, Illinois



October 2009

J. Onton¹ & S. Makeig¹
¹Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California San Diego, La Jolla, CA

Classification of emotional state from EEG power spectra

The ability to identify a person's emotional state from relatively easily acquired scalp electroencephalographic (EEG) data could be of clinical importance for anger management, for reducing depression, anxiety, or stress, and for relating to persons with communication disabilities. This study sought brain dynamic correlates of experienced emotional feeling. Normal young adult subjects listened to voice recordings suggesting that they recall and/or imagine scenarios in which they had felt, or might feel, each of a series of suggested emotions, and were encouraged to make each imagined scenario, including its associated somatic sensations, as vivid as possible. A total of 15 alternating positive- and negative-valence emotion suggestions were given in a self-paced paradigm, separated by brief periods of suggested relaxation. EEG data from 256 scalp leads were collected and linearly decomposed using independent component analysis (ICA) into maximally independent component (IC) EEG processes, thereby eliminating the confounding of the different EEG source signals mixed at the scalp electrode channels. To explore spectral modulations of these IC source signals during emotional experience, we first applied a wavelet transform to 50%-overlapping 1-s epochs drawn from the 1-5 min emotion imagination periods. The IC log spectrographic data for each subject were then decomposed using ICA into 100 maximally frequency- and/or source-independent modulator patterns (IMs). Next, from each emotion period, contiguous 10-s stretches of data were withheld at random for later classification. The remaining time-window weights (excluding 1-s buffers at either edge of the withheld classification data) were averaged for each IM, giving a template data matrix (IMs by 15 emotions). Contiguous 1-10 s stretches of the withheld data for each IM were then averaged for each emotion, and the resulting (IMs by 1) vectors were submitted to k-nearest-neighbor classification (MATLAB knnclassify()) to find the best-matching emotion vector in the template data matrix (IMs by 15 emotions). Despite considerable variability across emotions and subjects, even single 1-s windows from emotional periods were correctly classified with over 50% accuracy (chance being ~7%). These results imply that, for all or most subjects, changes in emotional state induced by guided imagery can be classified with far better than chance accuracy from moving-mean changes in EEG power spectra.
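For illustration only, the template-matching classification step might be sketched in MATLAB roughly as follows. All variable names, matrix sizes, and the synthetic data here are assumptions made for the sketch, not part of the reported analysis; only the use of knnclassify() is taken from the abstract.

% Illustrative sketch of the IM template-matching classification.
% Variable names and the synthetic data below are hypothetical.
nIMs = 100; nEmotions = 15; nWinPerEmo = 200;      % assumed dimensions
emoLabels = repmat(1:nEmotions, 1, nWinPerEmo);    % emotion index of each 1-s window
imWeights = randn(nIMs, numel(emoLabels));         % stand-in for real IM time-window weights
isTrain   = rand(1, numel(emoLabels)) > 0.1;       % windows not withheld for classification

% Template matrix (IMs by 15 emotions): mean IM weight per emotion,
% computed from the non-withheld windows
template = zeros(nIMs, nEmotions);
for e = 1:nEmotions
    template(:, e) = mean(imWeights(:, emoLabels == e & isTrain), 2);
end

% Average one withheld stretch of windows from a single emotion
% into an (IMs by 1) test vector
targetEmotion = 3;                                 % example held-out emotion
testVec = mean(imWeights(:, emoLabels == targetEmotion & ~isTrain), 2);

% Nearest-neighbor match of the test vector against the 15 emotion
% templates (knnclassify() expects observations in rows, hence the transposes)
predicted = knnclassify(testVec', template', (1:nEmotions)', 1);

In the study itself, this matching would be repeated over contiguous 1-10 s withheld stretches for every emotion and subject to obtain the reported classification accuracies.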
