Summer School on Music & Brain

Yosemite Valley and UC Merced, CA



May 18-23, 2014

Scott Makeig, Swartz Center for Computational Neuroscience, Institute for Neural Computation, University of California San Diego, La Jolla, CA

Studying the Brain Dynamics of Music, Movement, and Emotion

Our intentions and behavioral decisions, a major portion of 'who we are,' are now clearly understood to involve crucially a long scientifically under-explored aspect of cognition -- our affective or emotional perception, feeling, and awareness. Moreover, as social beings, we face the highly important problem of knowing (i.e., 'seeing,' 'hearing,' 'feeling,' or somehow sensing) the intentions of others, often from quite sparse and transient sensory cues. To 'see,' 'read,' or somehow 'intuit' the feelings of others (the primary source and determiner of their intentions), we must interpret subtle aspects of their behavior -- their facial expressions, their body position and gestural 'body language,' their vocal prosody and choice of words.

Modern neurophysiology has demonstrated the existence of brain support for such perception, first in the form of 'mirror' neurons that are activated either by one's own motivated movements or by the motivated movements of others. Note, however, that mirror neurons themselves cannot perform such 'reading' and interpretation -- unless their dynamics are understood as tiny, isolated portions of large and complex brain networks that must operate in real time, often based on quick, transiently perceivable cues. These discoveries suggest that we actually understand the actions of others by experiencing them 'as if' they were our own.

From our long-evolved and crucial ability to read the feelings of others from their postural expressions, movements, and voice qualities comes an important consequence: our ability to read human feeling expression from (or into) more abstract forms, movements, and sound streams. It is this that has allowed artistic feeling expression to become such an important and highly valued human activity. Music is such a large part of our culture today because of the emotional 'grounding' and intensification it can induce in listeners. Remarkably, both instrumental (wordless) music and song can produce a deep experience of sympathetic or affective perception -- in listeners who approach it with 'an open heart.'

Three major aspects by which music can communicate affectively are melody, harmony, and rhythm. Certainly we recognize the intimate relationship between a 'catchy' musical rhythm and physical dance. But perception of musical melodies is also widely recognized to have an 'embodied' aspect. For example, pianists playing an upward scale tend to lengthen each note ('climbing up' in pitch), whereas for downward scales they may use shorter notes ('bouncing back down'). The exact sense in which melodic contour is 'embodied' in an imaginative space with gravity has not been modeled; clearly, studies of expressive movements to music could prove valuable in constructing such a model.

At SCCN we have been working to build and disseminate methods for treating high-density EEG data as a true functional brain/cortical imaging modality in which some scales of activity within distributed cortical networks can be seen, measured, and modeled. We are also pioneering the development of a new brain/body imaging concept we call 'mobile brain/body imaging' or MoBI. This new paradigm combines high-density EEG and body motion capture with recording of eye gaze and task behavior in subjects who are performing actions -- and/or interacting -- in natural 3-D space.

Here, though, I want to focus more on the harmonic aspect of music, which uses a pitch scale and harmonic relationships within it that are remarkably universal. Of the highly developed musical cultures, perhaps only the gamelan music of Indonesia does not base its music on perfect fifths (3/2 frequency ratios) and major/minor thirds (5/4 and 6/5), as well as the octave (2/1) identity that is fundamental to all musical cultures -- as evidenced by men and women being heard, and said, to be singing 'in unison' when in reality their voices are an octave apart in pitch.
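These simple frequency ratios can be made concrete with a short calculation. The sketch below (illustrative only; not part of the talk) converts each just-intonation interval mentioned above into cents (1200 cents per octave) and compares it with its nearest twelve-tone equal-tempered neighbor:

```python
import math

def cents(ratio):
    """Size of a frequency ratio in cents (1200 cents = one 2/1 octave)."""
    return 1200 * math.log2(ratio)

# Just-intonation intervals discussed in the text
intervals = {
    "octave (2/1)": 2 / 1,
    "perfect fifth (3/2)": 3 / 2,
    "major third (5/4)": 5 / 4,
    "minor third (6/5)": 6 / 5,
}

for name, r in intervals.items():
    c = cents(r)
    # Nearest 12-tone equal-tempered interval is a multiple of 100 cents
    nearest_et = round(c / 100) * 100
    print(f"{name}: {c:.1f} cents "
          f"(nearest equal-tempered: {nearest_et}, "
          f"deviation {c - nearest_et:+.1f} cents)")
```

Running this shows why the pure fifth (about 702 cents) is so well approximated in equal temperament, while the pure thirds deviate from their tempered versions by a readily audible margin.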

To create affectively charged music, composers work simultaneously with all three elements of music -- melody, harmony, and rhythm. But their use of harmony, in particular, transcends particular musical style boundaries. I believe this is because the harmonic style and structure of Western ('common practice') music involves a deep and strong mapping between harmonic model space and affective model space -- a concept I first found in the writings of the controversial French Indophile Alain Daniélou, and have tested only in part.

Neurophysiology was long hindered (through the whole twentieth century, and still in large part today) by its fixation on single-cell dynamics, with human studies discounted because of the unavoidably invasive nature of single-neuron recording. Non-invasive electrical brain imaging (EEG), on the other hand, has long been hampered by a failure to use an electrical forward head model to model and visualize the recorded data within the brain space that is its actual origin -- measuring instead only the raw scalp signals, each of which (by biophysics) must mix contributions from many brain areas.

Our recent studies at the Swartz Center for Computational Neuroscience have shown that adequate EEG brain dynamic measures can reveal the ways in which brain electrical activity supports affective perception and imagination. But these experiments only touch the surface of a rich body of experimental evidence and investigation that needs to be brought to bear to develop a scientific understanding of affective perception, communication, and empathy -- and its brain dynamic support. I will propose studies to clarify the way our brains support our affective perception, imagination, and experience of both music and direct human communication.

Scott Makeig

Suggested Reading:

Makeig S and Balzano G, "Octave tuning -- two modes of perception," Proceedings of the Fifth Symposium on the Acoustics and Psychoacoustics of Music, Lawrence, Kansas: University of Kansas, 1982.

Leslie G, Ojeda A, and Makeig S, "Measuring musical engagement using expressive movement and EEG brain dynamics," Psychomusicology, 2014.

Makeig S, Gramann K, Jung T-P, Sejnowski TJ, and Poizner H, "Linking brain, mind and behavior: The promise of mobile brain/body imaging (MoBI)," International Journal of Psychophysiology, 2009.

Makeig S, Debener S, Onton J, and Delorme A, "Mining event-related brain dynamics," Trends in Cognitive Sciences, 2004.

Further reading:

Onton J and Makeig S, "High-frequency broadband modulations of EEG spectra," Frontiers in Human Neuroscience, 2009.

Makeig S, Leslie G, Mullen T, Sarma D, Bigdely-Shamlo N, and Kothe C, "First demonstration of a musical emotion BCI," in S. D'Mello et al. (Eds.), ACII 2011, Part II, LNCS 6975, pp. 487–496, Springer-Verlag, 2011.
