Fourth International BCI Meeting 2010

Asilomar Conference Center, Carmel, California



May 31 - June 3, 2010

Scott Makeig1 & Julie Onton2
1 Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California San Diego, La Jolla, CA
2 Naval Health Research Center, San Diego, CA

Emotional BCI: Communicating emotion via EEG dynamics

Background and Objective: So far the brain-computer interface (BCI) field has focused in large part on developing the ability to make one or more discrete choices (yes/no, on/off, left/right, etc.) and to use written language (spelling out words). Yet our sense of personal fulfillment does not derive from this kind of agency and communication alone. Many of our most deeply felt and influential experiences involve communicating with others by gestures, facial expressions, eye movements, and nonverbal sounds. The inability to fluently communicate feeling to others burdens not only locked-in patients but also many other physically and mentally handicapped subjects (stroke, palsy, Alzheimer's disease, etc.). Is it possible to interpret EEG dynamics to learn what a subject is feeling? And if so, could we use EEG to communicate our feelings to others?

Methods: Onton & Makeig (Front. Hum. Neurosci., 2009) explored the first of these questions using guided imagery: 32 eyes-closed subjects were asked to imagine situations in which they would feel each of 15 suggested emotions, attending to their somatic sensations to intensify the emotional experience. Subjects pressed a hand-held thumb button when they first felt the indicated emotion and then attempted to hold the feeling as long as possible. When the experience waned, they pressed the button again, cueing delivery of relaxation instructions followed by the next emotion suggestion. In a debriefing survey, subjects generally indicated that they experienced most of the suggested emotions moderately to strongly. This gave us 256-channel EEG data from 15 labeled 1- to 5-minute emotion periods per subject.

Results: We looked for within-subject differences in EEG dynamics during these periods by performing (temporal) independent component analysis (ICA) followed by (spatial-log-spectral) ICA on the independent brain source process activities. The latter found multiplicative spectral modes, each characteristic of one or more source processes. These independent modulator processes included a large class of broadband high-frequency processes scaling power from near 30 Hz to 200 Hz or higher. Multidimensional scaling of mean broadband power levels during the 15 periods in the individually processed data from the 32 subjects yielded a two-dimensional 'emotion space' in which one dimension loaded strongly on emotional valence. Further, even a few seconds of data removed from each emotion period could, in most instances, be assigned its original emotion label by comparing its dynamics to those of the remaining data from the same subject.
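A minimal sketch of this two-stage decomposition and the 'emotion space' step is given below in Python. The stand-in data, array names (`eeg`, `period_of_window`), component counts, and the use of scikit-learn FastICA/MDS and SciPy spectrograms are illustrative assumptions for exposition, not the authors' actual pipeline or parameters.

```python
# Sketch (assumptions, not the authors' code): temporal ICA on the channel
# data, then ICA on log spectrograms of the source activations, then 2-D
# multidimensional scaling of mean broadband power across emotion periods.
import numpy as np
from scipy.signal import spectrogram
from sklearn.decomposition import FastICA
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
fs = 512                                          # sampling rate in Hz (assumed)
eeg = rng.standard_normal((32, fs * 300))         # random stand-in for the EEG recording

# Stage 1: temporal ICA unmixes the channel data into source activations.
sources = FastICA(n_components=16, random_state=0).fit_transform(eeg.T).T

# Log power spectrograms of each source process (one spectrum per short window).
f, t, S = spectrogram(sources, fs=fs, nperseg=fs)       # S: (comp, freq, window)
log_spec = np.log(S + 1e-12)

# Stage 2: ICA on the log spectra, flattened over (component, frequency),
# yields independent spectral 'modulator' modes shared across source processes.
X = log_spec.transpose(2, 0, 1).reshape(len(t), -1)     # windows x (comp * freq)
modulators = FastICA(n_components=8, random_state=0).fit_transform(X)

# Mean broadband (> 30 Hz) log power per source and per emotion period, then
# 2-D MDS of the 15 period profiles. (For simplicity the broadband summary is
# taken from the raw log spectra; period labels here are random placeholders,
# whereas in the study they came from the subjects' button presses.)
period_of_window = rng.integers(0, 15, size=len(t))
broadband = log_spec[:, f > 30, :].mean(axis=1)         # (comp, window)
period_profiles = np.array([broadband[:, period_of_window == p].mean(axis=1)
                            for p in range(15)])        # (15, comp)
emotion_space = MDS(n_components=2, random_state=0).fit_transform(period_profiles)
print(emotion_space.shape)                              # (15, 2) 'emotion space'
```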

Discussion and Conclusions: The first author next proposes to test whether this data modeling method can be applied to musician subjects who imagine different notes of the scale by imaginatively experiencing each note's emotional or affective character. If so, can we build a musical emotion BCI that allows musical subjects to select and play notes of the scale (slowly, but with 'heartfelt' intent) by EEG alone, thus producing a sense of emotional communication? We will show pilot data from experiments toward this goal, and plan to use the resulting system to direct the first public performance of an original 'concerto for brain and small chamber orchestra.' If successful, more general emotional BCIs might be developed for use by normal and handicapped subjects.
