Fluxpoint (2008)
Tim Mullen, Gautam Agarwal, Richard Warp
(Mindchill 2.0)
Fluxpoint took place on Friday, June 20th, before a sold-out audience at the Chez Poulet Gallery in San Francisco. The event, principally organized by British electroacoustic/new classical composer Richard Warp, was a terrific mixture of experimental music and video, with several live performances, some pre-recorded compositions (combined with video), and a group improvisation session. Throughout the performance, audience members and performers were hooked up to Mindchill, which visualized their arousal level (as measured by galvanic skin response) in the form of dynamic changes in time-lapse photography of plant growth, crystal formation, and other natural processes, as well as some fractal image evolution. This was projected on a large display behind the main stage. Additionally, during a special session, audience members were able to manipulate both the visual display and a 5-instrument synth ensemble based on their arousal levels and breathing patterns (Rich Warp helped develop the Max/MSP interface for this).
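As a rough illustration of the arousal mapping described above, the sketch below (hypothetical code, not the original Mindchill implementation) normalizes a galvanic skin response stream into an arousal index and maps it to a playback rate for time-lapse footage; the sampling rate, percentiles, and rate bounds are assumptions.

```python
# Hypothetical sketch (not the original Mindchill code): derive an arousal
# index from a galvanic skin response (GSR) stream and map it to a playback
# rate for time-lapse footage.
import numpy as np

def arousal_index(gsr, baseline_samples=30 * 32):
    """gsr: 1-D array of skin-conductance samples (assumed ~32 Hz).
    Returns the latest sample normalized against a sliding baseline, in [0, 1]."""
    baseline = gsr[-baseline_samples:]
    lo, hi = np.percentile(baseline, [5, 95])
    return float(np.clip((gsr[-1] - lo) / (hi - lo + 1e-9), 0.0, 1.0))

def playback_rate(arousal, min_rate=0.25, max_rate=4.0):
    """Higher arousal -> faster time-lapse playback (log-linear mapping)."""
    return min_rate * (max_rate / min_rate) ** arousal
```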
Below are a couple of photos from the event. For more photos go here.
A string quartet performs Claire Singer's 4:8:1
A group improvisation with Mindchill (projected on screen) controlled by Rich Warp (on the theremin).
Group improvisation with Mindchill (plants); Jacob Wolkenhauer (guitar, center), Claire Singer (cello, right), Richard Warp (theremin/Mindchill, far right)
Below is the full program for the event.
In Tones: o / r / t / i -- Music for Online Performer (2010)
Tim Mullen, Richard Warp, Adam Jansch
In Tones: Organ/Radio/Television/Internet is a quartet of concerts/installations (16/01/2010) produced by Adam Jansch and Richard Glover at Phipps Hall and St. Paul's Hall at the University of Huddersfield, United Kingdom. The performances focused predominantly on three fundamental media of communication that defined the 20th century: Radio, Television, and the Internet. SF-based British electroacoustic/new classical composer Richard Warp and I created the Internet installation. Our work is, in part, inspired by Alvin Lucier's 1965 Music for Solo Performer, the first work in history to use brain waves to generate sound in an artistic fashion. In Solo Performer, alpha (8-12 Hz) "brainwaves," recorded from Lucier's brain using EEG, are transmitted to amplified loudspeakers that resonate percussion instruments placed around the hall. By modulating his alpha rhythm, Lucier can effect changes in the musical structure of the performance.
Our performance, Music for Online Performer, takes this idea a step further, exploring the interaction and cooperative control between a "brain musician," who attempts to manipulate a quartet of (acoustic) robotic instruments by modulating four fundamental brain rhythms, and a human "composer/conductor" (Richard Warp), who has created a composition that isolates certain combinations of instruments at different stages in the performance and who directs the musician, in real time, to increase or decrease the power of specific neural rhythms and thereby evolve the musical composition. Furthermore, the musician, composer, and quartet/audience are physically located thousands of miles apart (San Diego, USA; San Francisco, USA; Huddersfield, United Kingdom, respectively), connected only via the Internet. The entire performance was streamed live online, with cameras in all three geographic locations allowing people to connect from anywhere in the world and be part of the virtual audience. Online participants were encouraged to interact with the composer/conductor in real time via a chat room and suggest changes in the ongoing composition (e.g., "increase the cello pitch!").
Technical Details
Electrical signals recorded from the brain of a participant in San Diego, USA are used to manipulate acoustic instruments in front of a live audience at Phipps Hall at the University of Huddersfield, UK. Neural activity from the "musician" is measured continuously using electroencephalography (EEG). These signals are separated into quasi-independent components using a spatial filter previously learned by Independent Component Analysis (ICA). The activations of four informative components are selected and reduced to four variables representing changing aspects of neuronal activity (specifically spectral power) in four fundamental frequency bands: theta (4-8 Hz, over midline frontal cortex), alpha (8-12.5 Hz, over visual cortex), mu (10-12.5 Hz, over left-hand sensorimotor cortex), and beta (12.5-30 Hz, over right-hand motor cortex). These signals are streamed continuously to Phipps Hall, where they are converted to variations in pitch and percussive frequency through the mechanical manipulation of a four-piece robotic instrument ensemble (cello, tympani, cymbal, chimes/bells) using an Arduino board interface. Using Skype, the music is streamed back to the conductor (Warp) in San Francisco and the musician (Mullen) in San Diego, who uses this feedback (along with local visual feedback), combined with compositional instructions delivered by the conductor, to manipulate his brain rhythms and thereby inform the ongoing composition. Meanwhile, webcams in all three locations (SD, SF, UK) continuously transmit a live audio/visual stream of the performance to a global audience via a public Livestream channel. Internet audience members can interact with each other and the conductor via a chat room interface and thereby influence the evolving composition.
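A minimal sketch of the analysis chain described above is given below. It assumes a pre-learned ICA unmixing matrix and known component indices for each band, and uses a Welch estimate of spectral power; it is illustrative only and not the software used in the performance.

```python
# Hypothetical sketch of the signal chain described above (not the original
# DataRiver/Matriver code): apply a pre-learned ICA unmixing matrix to a
# window of EEG, then estimate band power for four selected components.
import numpy as np
from scipy.signal import welch

BANDS = {"theta": (4, 8), "alpha": (8, 12.5), "mu": (10, 12.5), "beta": (12.5, 30)}

def band_powers(eeg_window, unmixing, component_idx, fs=256.0):
    """eeg_window: (channels, samples); unmixing: ICA unmixing matrix;
    component_idx: dict mapping band name -> IC index (assumed known)."""
    activations = unmixing @ eeg_window          # IC activations (ICs, samples)
    powers = {}
    for band, (lo, hi) in BANDS.items():
        freqs, psd = welch(activations[component_idx[band]], fs=fs, nperseg=256)
        mask = (freqs >= lo) & (freqs <= hi)
        powers[band] = float(np.trapz(psd[mask], freqs[mask]))  # band power
    return powers  # four values streamed to the robotic-instrument controller
```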
Below is a short documentary created by Adam Jansch in which the artists discuss the motivations and challenges behind the four pieces which comprised In Tones (organ/radio/television/internet). Music for Online Performer is discussed towards the end of the film.
Performance Credits:
Thanks to Yijun Wang, Ph.D. for his invaluable assistance in setting up the EEG system. Thanks to the Swartz Center for Computational Neuroscience for providing the EEG hardware. Software for interfacing with the EEG hardware was adapted from the DataRiver/Matriver package developed by Andrey Vankov and Nima Bigdely Shamlo. Finally, a big thanks to Adam Jansch for his incredible assistance with (among many other things) setting up the robotic instruments in the UK and for inviting our participation in the In Tones series.
Just: A Suite for Violin, Cello, Flute and Brain (2010)
Scott Makeig, Grace Leslie, Tim Mullen, Alex Khalil, Christian Kothe
Just was composed by Scott Makeig and first performed by Grace Leslie, Alex Khalil, Scott Makeig, and Tim Mullen on June 2nd, 2010 at the Fourth International Brain-Computer Interface Meeting at the Asilomar Conference Center in Monterey, California. Mental state classification was done with Christian Kothe's BCILab software.
Final rehearsal prior to the performance of Just at the Fourth International BCI Meeting at Asilomar (Monterey, California, USA). Left to right: Grace Leslie (flute, percussion, Max/MSP), Alex Khalil ('cello), Scott Makeig (violin, composer), Tim Mullen ("brainist", neural interface), Christian Kothe (BCI), Dev Sarma (tech support).
In this video clip (starting around minute 23:00), Just (and the underlying BCI technology) is featured in UCSDTV's UCSD@50 series honoring UC San Diego's 50th anniversary.
The above highlight article discusses Just as well as other Brain-Machine Interface work being carried out at our lab at the Swartz Center for Computational Neuroscience.
MoodMixer
Grace Leslie and Tim Mullen
MoodMixer is an interactive installation in which participants collaboratively navigate a two-dimensional music space by manipulating their cognitive state and conveying this state via wearable Electroencephalography (EEG) technology. The participants can choose to actively manipulate or passively convey their cognitive state depending on their desired approach and experience level. A four-channel electronic music mixture continually conveys the participants' expressed cognitive states while a colored visualization of their locations on a two-dimensional projection of cognitive state attributes aids their navigation through the space. MoodMixer is a collaborative experience that incorporates aspects of both passive and active EEG sonification and performance art. In our NIME '11 paper below we discuss the technical design of the installation and place its collaborative sonification aesthetic design within the context of existing EEG-based music and art.
A depiction of the MoodMixer Installation in use.
Diagram of the installation hardware setup with the communication protocols between components. In the NIME version of the installation, index1 corresponds to “relaxation/meditation” and index2 to “attention/focus”
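One plausible way to realize the four-channel mixture from the two indices is a bilinear crossfade over the unit square, sketched below; the corner assignment and gain law are assumptions for illustration, not the original Max/MSP patch.

```python
# Hypothetical sketch of the two-dimensional mix described above: index1
# ("relaxation/meditation") and index2 ("attention/focus"), each normalized to
# [0, 1], pick a point in a unit square whose four corners hold the four music
# channels; channel gains fall off bilinearly with distance from each corner.
def corner_gains(index1, index2):
    """Return gains for the four channels at corners (0,0), (1,0), (0,1), (1,1)."""
    x = max(0.0, min(1.0, index1))
    y = max(0.0, min(1.0, index2))
    return {
        "channel_00": (1 - x) * (1 - y),
        "channel_10": x * (1 - y),
        "channel_01": (1 - x) * y,
        "channel_11": x * y,
    }

# Example: a relaxed (0.8) but unfocused (0.2) pair of participants
print(corner_gains(0.8, 0.2))  # channel_10 dominates the mix
```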
MoodMixer 2.0
Grace Leslie and Tim Mullen

While the first iteration of MoodMixer spatially remixed four pre-recorded electronic music samples, MoodMixer 2.0 (premiered in San Diego at Mozart and the Mind 2012) uses a new automatic music generator to produce a composition reminiscent of John Adams' piano piece Phrygian Gates (1977-8). The software randomly chooses notes from a set scale and repeats them to create slowly evolving loops, hallmarks of Adams' minimalist style, of which Phrygian Gates is a prototypical example. The scale begins in A Lydian, then shifts to A Phrygian, and then cycles around the circle of fifths, to E Lydian, E Phrygian, and so on. Several aspects of the music are manipulated to match the participants' cognitive states, primarily tempo and mode, as these are thought to be the features which most determine a piece of music's emotional expression [1]. One participant is able to increase and decrease the tempo of the piece based on their level of relaxation, while the other participant influences the overall texture of the piece by expanding and contracting note lengths based on their level of focused attention. Participants also have the option of jumping to the next key in the cycle by blinking their eyes. No two performances of the composition sound the same, given the unique contributions of each participant and the piece's continually evolving structure.

References
[1] Gabrielsson, A.; Lindstrom, E. (2001). "The influence of musical structure on emotional expression." Music and Emotion: Theory and Research: 223–243.
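The generative logic described above might be sketched roughly as follows; the scale construction, mode cycle, and mappings from relaxation, attention, and blinks are illustrative assumptions rather than the actual MoodMixer 2.0 implementation.

```python
# Hypothetical sketch (not the actual MoodMixer 2.0 generator): illustrative
# scale construction, Lydian/Phrygian key cycle, and EEG-to-music mappings.
import random

LYDIAN   = [0, 2, 4, 6, 7, 9, 11]   # semitone offsets from the tonic
PHRYGIAN = [0, 1, 3, 5, 7, 8, 10]

def next_key(tonic_midi, mode):
    """A Lydian -> A Phrygian -> E Lydian -> E Phrygian -> ... (up a fifth)."""
    if mode == LYDIAN:
        return tonic_midi, PHRYGIAN
    return tonic_midi + 7, LYDIAN    # advance around the circle of fifths

def next_event(tonic_midi, mode, relaxation, attention, blink):
    """relaxation and attention are EEG indices in [0, 1]; a blink jumps keys."""
    if blink:
        tonic_midi, mode = next_key(tonic_midi, mode)
    tempo_bpm = 60 + 80 * relaxation             # one participant's relaxation drives tempo
    duration_beats = 0.25 + 1.75 * attention     # the other's attention stretches note lengths
    pitch = random.choice([tonic_midi + s for s in mode])  # random note from the current scale
    return tonic_midi, mode, pitch, duration_beats, tempo_bpm
```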
EEG Ocean
Nima Bigdely-Shamlo and Tim Mullen
Background
The brain is a complex environment with many spatially separated subsystems interacting over time. Visualizing patterns of brain activity over long periods is generally impossible simply by inspecting long EEG time-series. As such, researchers usually apply reductive techniques that seek to extract meaningful patterns of activity from the time-series. For example, we might apply spectral decomposition techniques to identify frequency-specific structure within activity recorded from different parts of the brain. However, with these reductive techniques it can be difficult to get an intuitive sense of the overall pattern of ongoing brain activity across time and space.
EEG Ocean is a novel approach to visualizing complex spatiotemporal patterns of dynamic brain activity over long periods of time in an intuitive and aesthetically pleasing manner. EEG is thought to be the summed activity of multiple cortical generators or "sources." To approximately recover the activations of these sources, Independent Component Analysis (ICA), a blind source separation technique which separates a multi-channel EEG signal into maximally independent components (ICs), can be applied to EEG data collected over a long period of time. Sources that are interacting (non-independent) may preserve some residual mutual information after applying ICA. As such, ICs are clustered using Multidimensional Scaling (MDS) such that those with high residual mutual information are spatially proximal while those with little shared information are pushed apart. The activity of each IC is then displayed as "ripples" propagating out through time from its respective IC (neural "source"). With a high-resolution display this allows us to visualize the collective activity of many neural components (independent or interacting across time) over long periods of time (e.g., many minutes, rather than seconds). Event Related Potentials (ERPs) and oscillatory bursts appear as prominent ripple sequences which can be traced back to their respective sources. Relative timing of neural events between ICs can be easily inferred by comparing the respective radii of ripples emanating from different ICs. A single frame is a "snapshot" of all significant neural events over some period of time up to the present, where the length of time visualizable depends on the step size between frames and the resolution of the display. A more detailed description of the procedure is outlined below.
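The component layout step could be sketched as follows, assuming a precomputed matrix of residual pairwise mutual information between ICs; the conversion from mutual information to dissimilarity is an assumption, and this is not the original EEG Ocean code.

```python
# Hypothetical layout step for the visualization described above: place ICs in
# the plane with MDS so that pairs with high residual mutual information end
# up close together while weakly related pairs are pushed apart.
import numpy as np
from sklearn.manifold import MDS

def layout_components(mutual_info):
    """mutual_info: symmetric (n_ICs, n_ICs) matrix of residual pairwise MI."""
    mi = np.asarray(mutual_info, dtype=float)
    dissimilarity = mi.max() - mi            # high MI -> small distance
    np.fill_diagonal(dissimilarity, 0.0)
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    return mds.fit_transform(dissimilarity)  # (n_ICs, 2) screen positions
```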
Example

In the following demo, EEG Ocean is applied to 64-channel EEG of a representative subject performing a Rapid Serial Visual Presentation task (RSVP; many image chips displayed in rapid sequence (12 images per second) with a target chip randomly embedded in the sequence). EEG is collected at 256 Hz. ICA is applied to the data and selected IC power (absolute value of IC activations after band-passing with a 3-30 Hz zero-phase FIR filter) is visualized in approximately real time (with a rate of 48 ms of EEG per frame, each second of video corresponds to 1.1209 seconds of EEG). For this demo, we have a display width of 3200 pixels, so any given frame contains between 6.25 and 12.5 seconds of activity for a given IC (depending on the location of the IC). Note that with a resolution of 35,000 x 8,000 (e.g., maximum HiPerSpace resolution) and a step size of 100 ms of EEG per frame, a single EEG Ocean snapshot would visualize ~30-60 minutes of activity for all selected ICs. This provides a unique ability to gain an intuitive sense of ongoing brain dynamics and identify salient neural events occurring over very long periods of time, such as during sleep, resting state, etc.
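For illustration, the per-frame signal described above (rectified, band-passed IC activations stepped in 48 ms increments) might be computed as sketched below; the filter length and other details are assumptions, not the original implementation.

```python
# Hypothetical sketch of the demo's per-frame signal: band-pass the IC
# activations with a zero-phase FIR filter, rectify, and report how many
# samples the display advances per video frame.
import numpy as np
from scipy.signal import firwin, filtfilt

def framed_power(ic_activations, fs=256.0, frame_ms=48, numtaps=129):
    """ic_activations: (n_ICs, n_samples).
    Returns rectified, band-passed activations plus the per-frame step size."""
    b = firwin(numtaps, [3.0, 30.0], pass_zero=False, fs=fs)    # 3-30 Hz band-pass FIR
    power = np.abs(filtfilt(b, [1.0], ic_activations, axis=1))  # zero-phase filtering, rectified
    step = int(round(fs * frame_ms / 1000.0))                   # samples advanced per video frame
    return power, step
```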
The movie below is a reduced-resolution video (for web streaming) of EEG Ocean applied to 12.5 seconds of continuous EEG.
High-resolution screenshots of EEG Ocean from the above movie (click for larger image). An interactive multi-scale version is available here.
Screenshots from EEG Ocean on HiPerSpace at Calit2
Future Work

EEG Ocean is still in its infancy. Within certain resolution and framerate limitations, it is currently possible to use EEG Ocean in real time using SCCN's open-source DataRiver/Matriver software. We plan to add interactive control allowing the user to call up alternate views (e.g., 3D dipole displays and raw time-series) for selected components and obtain more detailed information about a given component or display section. Multi-scale imaging technology can be used to allow the user to change the temporal scale (zoom in) while preserving resolution. See this link (thanks to Nima Bigdely Shamlo) for a simple example of an interactive multi-scale implementation using Microsoft Seadragon Ajax.
A beta (older) version of EEG Ocean capable of real-time usage is included in the open-source Matriver package (Matlab) accessible here. Contact me or Nima for details on newer versions.
Ringing Minds (2014)

Ringing Minds premiered on May 31st, 2014, at Mainly Mozart's "Mozart and the Mind" festival in La Jolla, California. The piece explores collective brain responses interacting in a spontaneous musical landscape, aided by recent advances in sensing technology and powerful tools for analyzing electroencephalograms (EEGs) of multiple brains.
Ringing Minds is a collaboration with experimental music pioneer and CalArts Music Chair David Rosenboom and composer-performer and UCSD ethnomusicologist Alexander Khalil. In this unique fusion of performance and scientific experiment, Rosenboom and Khalil, on violin and lithoharp (an instrument made from sonorous stone), musically influence four audience members' brain states, measured by wearable EEG systems. A "hyperbrain," which simultaneously combines dynamical information from all four brains, is generated using algorithms developed during my Ph.D. research. The dynamical states of the hyperbrain, specifically resonant Principal Oscillation Patterns, are rendered audible using spatial sonification algorithms by Rosenboom, forming a musical "pond" into which sonic events are thrown, creating ripples, resonance, and an altogether unique musical experience.
Violin and Electronics: David Rosenboom
Lithoharp: Alexander Khalil
Hyperbrain: Tim Mullen
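For readers curious about the underlying technique, a generic Principal Oscillation Pattern decomposition of a multi-channel "hyperbrain" signal can be sketched as below; this is a textbook VAR(1)/eigenmode analysis and not the specific algorithms used in Ringing Minds.

```python
# Hypothetical sketch: Principal Oscillation Patterns (POPs) of a "hyperbrain"
# signal formed by stacking EEG channels from all participants. A first-order
# vector autoregressive model is fit, and its coefficient matrix is
# eigendecomposed; complex eigenmodes correspond to damped oscillations.
import numpy as np

def principal_oscillation_patterns(x, fs):
    """x: (n_channels, n_samples) zero-mean hyperbrain data; fs: sampling rate (Hz).
    Returns eigenvectors (spatial patterns), mode frequencies (Hz), and damping."""
    x0, x1 = x[:, :-1], x[:, 1:]
    A = x1 @ x0.T @ np.linalg.pinv(x0 @ x0.T)             # least-squares VAR(1) coefficients
    eigvals, eigvecs = np.linalg.eig(A)
    freqs = np.abs(np.angle(eigvals)) * fs / (2 * np.pi)  # oscillation frequency of each mode
    damping = np.abs(eigvals)                              # modulus < 1 -> decaying oscillation
    return eigvecs, freqs, damping
```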
** 2015 UPDATE ** A revised version of Ringing Minds was performed Saturday, May 23, 2015 at the Whitney Museum of American Art in Manhattan as part of a three-day retrospective of David Rosenboom's 50 years of pioneering work in experimental music.

Mozart and the Mind 2012
Please visit the MATM 2012 page for details.

Mozart and the Mind 2013
Please visit the MATM page for details.

Mozart and the Mind 2014
Please visit the MATM page for details.

Copyright (c) 2009 Tim Mullen