BioArt

On this page you can read about some of the art installations and projects I've worked on.



Yuri's Night Bay Area 2008 

Gautam Agarwal, Tim Mullen, Jonathan Toomim

In April 2008, two YNBA veterans (Gautam Agarwal and Jonathan Toomim) and I participated in Yuri's Night at the NASA Ames Research Center at Moffett Field with biofeedback-related art-tech projects we hoped would be both intellectually and aesthetically engaging. See below for more information on this amazing worldwide expo/art-show/party, as well as information on our two projects, Mindchill and Introscope.

 

[from http://ynba.org/2008/overview.php:]

Yuri's Night Bay Area is a massive celebration of space, science, art, music, and technology. Once a year in over a hundred places all over the world, Yuri's Night commemorates the anniversary of the launch of the first man in space, Yuri Gagarin, and the launch of the first Space Shuttle exactly twenty years later. Starting in 2007, Space Generation teamed up with the NASA Ames Research Center for the first time, along with a team of amazing volunteers, to host the largest Yuri's Night celebration ever held in a massive hangar on Moffett Field in Mountain View, CA.

 

Yuri's Night Bay Area taps into the San Francisco area's unique energy to bring together scientists, artists, technologists, musicians, and space enthusiasts in a fusion of celebration and education that is unlike anything else you've ever seen. In 2008 the event is growing to twice the size, bringing in more hot musicians, more brilliant scientists, more amazing artists, and the all-new Festival of Ideas.

 

Learn more about Yuri's Night Bay Area from NASA!

Mindchill and Introscope

by Infodelic Ectomorphs (i.e.)

(Gautam Agarwal, Tim Mullen, Jonathan Toomim)

 

This project addresses the YNBA '08 theme of "Radical Technology for a Sustainable Future".


Philosophy

Changes in the world are tightly coupled to changes in our being. While the rapid transformation of the earth becomes more evident, the concomitant acceleration of our thought processes remains invisible. The sensorium created by our technology tends toward finer spatiotemporal scales, making the potentially dire trajectory of the slow and vast processes that sustain life as we know it more an intellectual concern and less a salient experience. We hope to engage Yuri's Night attendees in broadening their scope of awareness through the use of biofeedback. The body is a microcosm of the planet, a complex web of interactions where one ebb leads to another flow. Through biofeedback, users will experience such a coupling between external events and internal states. Along with techniques such as meditation, biofeedback may be useful in exploring the intricate relation between the mental and physical worlds. We believe that developing an ecology of mind may be a crucial step in understanding and restructuring the immense energetic transaction we are engaged in with our planet today.


Description

Subjects' galvanic skin response (GSR) and brainwaves (EEG) will control an audiovisual environment that informs them of moment-to-moment fluctuations in their level of arousal (GSR) and general cognitive state (EEG).

 

Mindchill

Changes in sympathetic activity have been shown to correlate with changes in emotional arousal and stress levels. We will measure this change using a galvanic skin response (GSR) sensor. The user's skin conductance will be measured continuously, and their arousal will be represented as water in its solid, liquid, or gas states. As users become more aroused, they will cause water to thaw or boil; as they relax, they will cause it to condense or freeze. This will be attempted in real time using a thermoelectric junction, with close-ups of ice-crystal/gas-bubble formation streamed through a webcam and projected in real time on a screen in front of the user. Alternate feedback modes will include coupling the GSR to time-lapse films of plant growth, ice crystal formation, polar ice melt, or other engaging natural processes. Users have the option of being presented with a series of verbal or visual stimuli to trigger emotional responses.
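For readers curious about the plumbing, here is a minimal Python sketch of the kind of control loop Mindchill implements, not the actual implementation: it assumes a hypothetical GSR sensor that streams one ASCII conductance value per line over serial, and a hypothetical Peltier driver that accepts a single 0-255 power byte. Port names, smoothing constants, and the arousal mapping are all illustrative assumptions.

```python
# A minimal sketch (not the original implementation) of the Mindchill loop:
# skin conductance in, thermoelectric drive out. Assumes a hypothetical GSR
# sensor streaming one ASCII conductance value per line over serial, and a
# hypothetical Peltier driver accepting a single 0-255 power byte.
import serial  # pyserial

SMOOTH = 0.9      # fast smoothing of sensor noise
BASELINE = 0.995  # slow baseline tracking tonic skin conductance

def arousal_to_power(arousal):
    """Clamp normalized arousal to [0, 1] and map to a 0-255 drive byte:
    relaxed -> cool (freeze/condense), aroused -> heat (thaw/boil)."""
    return int(min(max(arousal, 0.0), 1.0) * 255)

gsr = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)      # sensor (assumed)
peltier = serial.Serial("/dev/ttyUSB1", 9600, timeout=1)  # driver (assumed)

smoothed = baseline = None
while True:
    raw = gsr.readline().decode(errors="ignore").strip()
    if not raw:
        continue
    try:
        conductance = float(raw)  # microsiemens
    except ValueError:
        continue
    if smoothed is None:
        smoothed = baseline = conductance
    smoothed = SMOOTH * smoothed + (1 - SMOOTH) * conductance
    baseline = BASELINE * baseline + (1 - BASELINE) * conductance
    # Phasic arousal: relative deviation of the fast signal above baseline
    arousal = (smoothed - baseline) / max(baseline, 1e-6)
    peltier.write(bytes([arousal_to_power(0.5 + 5.0 * arousal)]))
```

Separating a slow baseline from a fast smoothed signal approximates the phasic component of skin conductance, which tracks momentary arousal better than the raw conductance level.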



Introscope

Electroencephalography (EEG) is a method for measuring the electrical activity of the brain using electrodes placed on the scalp. This project will focus on coupling a user's brain dynamics to an audiovisual environment, affording the user visualization and enhanced control of the ongoing electrical activity of the brain. The EEG signal will have independent and simultaneous auditory and visual representations. Two separate auditory representations will be selectable. In the first, the EEG will be transformed in the frequency domain to a range audible to humans and then played through speakers and/or headphones. A number of filters may optionally be applied to increase the saliency of EEG features that correlate well with the quality of subjective experience. The second will analyze the EEG for certain discrete electrophysiological events, which will be used to guide a software-synthesizer-based music generator. Visual representations will be produced with an array of bright multicolored flashing LEDs. After nightfall, the high contrast between the LEDs' brightness and the average ambient brightness will reduce the saliency and visibility of the subject's physical surroundings, facilitating the fusion of the internal and external worlds and carrying the user through a journey of their own creation. Additional LED pods will be placed in the area, in the manner of dance lighting, for spectators' enjoyment; the algorithm used to control the audience LEDs will be adjustable independently of that of the subject's dedicated LED pod, and can be turned off selectively.
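The description above leaves the exact audio transformation unspecified; one simple way to shift EEG into the audible range, sketched below in Python, is to replay the signal at a much higher sample rate, which scales every frequency by the same factor (so a 10 Hz alpha rhythm becomes a 640 Hz tone at 64x). The sampling rate, speedup factor, band limits, and filename are illustrative assumptions.

```python
# Minimal sketch of an Introscope-style audio mode: shifting EEG into the
# audible range by replaying it at a higher sample rate. All parameters are
# illustrative assumptions, not values from the original installation.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, filtfilt

EEG_FS = 256   # EEG sampling rate (Hz), assumed
SPEEDUP = 64   # playback speedup: output rate = 16384 Hz, 10 Hz -> 640 Hz

def audify(eeg, fs=EEG_FS, speedup=SPEEDUP, band=(3.0, 30.0)):
    # Zero-phase bandpass to emphasize rhythms that track subjective state
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg)
    # Normalize to 16-bit range and "replay" at fs * speedup
    scaled = np.int16(0.9 * 32767 * filtered / np.max(np.abs(filtered)))
    wavfile.write("eeg_audified.wav", fs * speedup, scaled)

if __name__ == "__main__":
    t = np.arange(0, 60, 1.0 / EEG_FS)
    demo = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)  # "alpha"
    audify(demo)  # writes a ~0.94 s clip containing a 640 Hz tone
```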


Below are some photos of Mindchill 1.0 and its debut at NASA Ames Research Center for YNBA '08.


Version 1.0 of our home-grown GSR-thermocouple system
The Infodelic Ectomorphs testing Mindchill 
(left: Gautam Agarwal, right: Tim Mullen)

 Setup: Hangar A, wherein Mindchill was housed

Gautam discussing matters of the Mind with a crowd gathered around Mindchill

Mindchill (reverse of projection screen)

 Another Mindchill mode (chill out and you keep the polar ice caps from melting!)


Fluxpoint (2008)  

Tim Mullen, Gautam Agarwal, Richard Warp

(Mindchill 2.0)


Fluxpoint took place on Friday, June 20th, before a sold-out audience at the Chez Poulet Gallery in San Francisco. The event, principally organized by British electroacoustic/new-classical composer Richard Warp, was a terrific mixture of experimental music and video, with several live performances, some pre-recorded compositions (combined with video), and a group improvisation session. Throughout the performance, audience members and performers were hooked up to Mindchill, which visualized their arousal level (as measured by galvanic skin response) in the form of dynamic changes in time-lapse photography of plant growth, crystal formation, and other natural processes, as well as some fractal image evolution. This was projected on a large display behind the main stage. Additionally, during a special session, audience members were able to manipulate both the visual display and a five-instrument synth ensemble based on their arousal levels and breathing patterns (Rich Warp helped develop the Max/MSP interface for this).


Below are a couple of photos from the event. For more photos go here.



A string quartet performs Claire Singer's 4:8:1




A group improvisation with Mindchill (projected on screen) controlled by Rich Warp (on the theremin).




Group improvisation with Mindchill (plants); Jacob Wolkenhauer (guitar, center), Claire Singer (cello, right), Richard Warp (theremin/Mindchill, far right)
 


Below is the full program for the event:


You are invited to FLUXPOINT, an eclectic evening of experimental music and film in San Francisco...


HEATHER FRASCH


Heather Frasch is a composer of acoustic and electro-acoustic music, improviser, sound installation artist, and experimental flutist whose music has been performed in the US, Europe and Asia.


MINDCHILL


Mindchill, created by UC Berkeley brain hackers Gautam Agarwal and Tim Mullen, attempts to engage individuals in broadening their scope of awareness through the use of biofeedback. Fluctuations in the user's arousal and affect are measured continuously via GSR and represented as real-time changes in plant growth, ice crystal formation and other engaging and artistic natural processes.


CLAIRE M SINGER


Claire is an electroacoustic composer and performer from Scotland. Her compositional work includes fixed media (stereo and multi-channel), site-specific, multi-media, live electronics and collaborative work. Claire is also involved in 'cello and electronic improvisational work and has performed with various experimental music groups in London and throughout the rest of the UK.

In 2007, Claire was awarded the PRS Atom Award for New Music, which has funded her trip to San Francisco to develop her work with Max/MSP.


DAMON WAITKUS


Damon Waitkus was born in Boston, Massachusetts, in 1977, and earned an MA in composition from Mills College in 2006. Much of his recent work has been for recorded media, combining field recordings collected from various natural, domestic, and urban environments with passages for traditional instruments.


RICHARD WARP


A British electroacoustic/new-classical composer based in Berkeley, California, and an MMus graduate of Goldsmiths College, University of London. His work attempts to explore the musical mind/body schism between "cerebral" sonic architecture and instinctual, emotionally driven impulses.


JACOB WOLKENHAUER


A guitarist who plays unorthodox music to serve two completely different ends - to make the familiar become foreign, and to make the abstract become universal. Through the de-humanization of sampled text, he expands awareness of our most basic form of communication: speech. In combination with instrumental music, he takes abstract forms with no literal reference and gives them voice.


Please do not park in the parking lot next door, it does not belong to us and the neighbors are rabid. 

The Chez Poulet is 2 blocks from the 24th street BART train. Plenty of bike parking. 

Refreshments available. 

RSVP MANDATORY, as space is limited and the event will definitely sell out.

Our sliding scale is designed to accommodate students, artists in residence or on stipend, interns and monks. Please be generous. 

Cell phones ringing during the performance will be confiscated and blended to a puree.




In Tones: o / r / t / i -- Music for Online Performer (2010)

Tim Mullen, Richard Warp, Adam Jansch

In Tones: Organ/Radio/Television/Internet is a quartet of concerts/installations (16/01/2010) produced by Adam Jansch and Richard Glover at Phipps Hall and St. Paul's Hall at the University of Huddersfield, United Kingdom. The performances focused predominantly on three fundamental media of communication that defined the 20th century: radio, television, and the Internet. SF-based British electroacoustic/new-classical composer Richard Warp and I created the Internet installation. Our work is, in part, inspired by Alvin Lucier's 1965 Music for Solo Performer, the first work in history to use brain waves to generate sound in an artistic fashion. In Solo Performer, alpha (8-12 Hz) "brainwaves," recorded from Lucier's brain using EEG, are transmitted to amplified loudspeakers that are used to resonate percussion instruments placed around the hall. By modulating his alpha rhythm, Lucier can effect changes in the musical structure of the performance.

Our performance, Music for Online Performer, takes this idea a step further, exploring interaction and cooperative control between a "brain musician," who attempts to manipulate a quartet of (acoustic) robotic instruments by modulating four fundamental brain rhythms, and a human "composer/conductor" (Richard Warp), who has created a composition that isolates certain combinations of instruments at different stages in the performance and who directs the musician, in real time, to increase or decrease the power of specific neural rhythms and thereby evolve the musical composition. Furthermore, the musician, composer, and quartet/audience are physically located thousands of miles apart (San Diego, USA; San Francisco, USA; and Huddersfield, United Kingdom, respectively), connected only via the Internet. The entire performance was streamed live online, with cameras in all three geographic locations allowing people to connect from anywhere in the world and be part of the virtual audience. Online participants were encouraged to interact with the composer/conductor in real time via a chat room and suggest changes in the ongoing composition (e.g., "increase the cello pitch!").


 
 

A paper describing Online Performer was published in the Proceedings of the 2011 International Conference on New Interfaces for Musical Expression (NIME) in Oslo, Norway, and can be accessed by clicking the icon on the left. An extended version of the paper can be accessed here.



Technical Details

Electrical signals recorded from the brain of a participant in San Diego, USA are used to manipulate acoustic instruments in front of a live audience at Phipps Hall at the University of Huddersfield, UK. Neural activity from the "musician" is measured continuously using electroencephalography (EEG). These signals are separated into quasi-independent components using a spatial filter previously learned by Independent Component Analysis (ICA). The activations of four informative components are selected and reduced to four variables representing changing aspects of neuronal activity (specifically spectral power) in four fundamental frequency bands: theta (4-8 Hz, over midline frontal cortex), alpha (8-12.5 Hz, over visual cortex), mu (10-12.5 Hz, over left-hand sensorimotor cortex), and beta (12.5-30 Hz, over right-hand motor cortex). These signals are streamed continuously to Phipps Hall, where they are converted to variations in pitch and percussive frequency through the mechanical manipulation of a four-piece robotic instrument ensemble (cello, timpani, cymbal, chimes/bells) using an Arduino board interface. Using Skype, the music is streamed back to the conductor (Warp) in San Francisco and the musician (Mullen) in San Diego, who uses this feedback (along with local visual feedback), combined with compositional instructions delivered by the conductor, to manipulate his brain rhythms and thereby inform the ongoing composition. Meanwhile, webcams in all three locations (SD, SF, UK) continuously transmit a live audio/visual stream of the performance to a global audience via a public Livestream channel. Internet audience members can interact with each other and the conductor via a chat room interface and thereby influence the evolving composition.
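A minimal Python sketch of the control-signal extraction described above: apply a previously learned ICA unmixing matrix, then track band-limited spectral power for the four selected components. The unmixing matrix, channel count, component indices, and window length here are placeholders, not the values used in the performance.

```python
# Minimal sketch of the per-band control signals: ICA unmixing followed by
# band power estimation. All specific values are illustrative placeholders.
import numpy as np
from scipy.signal import welch

FS = 256  # EEG sampling rate (Hz), assumed for illustration
BANDS = {         # component index -> (band name, low Hz, high Hz)
    0: ("theta", 4.0, 8.0),      # midline frontal
    1: ("alpha", 8.0, 12.5),     # visual cortex
    2: ("mu",    10.0, 12.5),    # left-hand sensorimotor
    3: ("beta",  12.5, 30.0),    # right-hand motor
}

def band_controls(eeg_window, unmixing):
    """eeg_window: (channels, samples) raw EEG; unmixing: ICA weight matrix.
    Returns one scalar power value per controlled brain rhythm."""
    sources = unmixing @ eeg_window          # component activations
    controls = {}
    for ic, (name, lo, hi) in BANDS.items():
        freqs, psd = welch(sources[ic], fs=FS, nperseg=FS)
        mask = (freqs >= lo) & (freqs <= hi)
        controls[name] = float(psd[mask].mean())  # mean band power
    return controls  # streamed on to drive the robotic instruments

# Example with random data standing in for a 1 s window of 32-channel EEG:
rng = np.random.default_rng(0)
print(band_controls(rng.standard_normal((32, FS)), rng.standard_normal((32, 32))))
```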

Installation flowchart for Music for Online Performer. Globes represent Internet transmission.

Music For Online Performer



Below is a short documentary created by Adam Jansch in which the artists discuss the motivations and challenges behind the four pieces which comprised In Tones (organ/radio/television/internet). Music for Online Performer is discussed towards the end of the film. 



Performance Credits:

Thanks to Yijun Wang, Ph.D., for his invaluable assistance in setting up the EEG system. Thanks to the Swartz Center for Computational Neuroscience for providing the EEG hardware. Software for interfacing with the EEG hardware was adapted from the DataRiver/Matriver package developed by Andrey Vankov and Nima Bigdely-Shamlo. Finally, a big thanks to Adam Jansch for his incredible assistance with (among many other things) setting up the robotic instruments in the UK and for inviting our participation in the In Tones series.



Just: A Suite for Violin, Cello, Flute and Brain (2010)

Scott Makeig, Grace Leslie, Tim Mullen, Alex Khalil, Christian Kothe

Just was composed by Scott Makeig and first performed by Grace Leslie, Alex Khalil, Scott Makeig, and Tim Mullen on June 2nd, 2010 at the Fourth International Brain-Computer Interface Meeting at the Asilomar Conference Center in Monterey, California. Mental state classification was done with Christian Kothe's BCILAB software.

 
 

The full programme notes for Just are available here (click on the PDF icon).

They contain a description of the installation and bios of the performers.







Final rehearsal prior to the performance of Just at the Fourth International BCI Meeting at Asilomar (Monterey, California, USA). Left to right: Grace Leslie (flute, percussion, Max/MSP), Alex Khalil ('cello), Scott Makeig (violin, composer), Tim Mullen ("brainist", neural interface), Christian Kothe (BCI), Dev Sarma (tech support).


Performance of Just at the Sonic Diasporas Music Festival at UC San Diego.






In this video clip (starting around minute 23:00), Just (and the underlying BCI technology) is featured in UCSD-TV's UCSD@50 series honoring UC San Diego's 50th anniversary.


Research Explore Stories

The above highlight article discusses Just as well as other Brain-Machine Interface work being carried out at our lab at the Swartz Center for Computational Neuroscience.


MoodMixer

Grace Leslie and Tim Mullen

MoodMixer is an interactive installation in which participants collaboratively navigate a two-dimensional music space by manipulating their cognitive state and conveying this state via wearable electroencephalography (EEG) technology. The participants can choose to actively manipulate or passively convey their cognitive state, depending on their desired approach and experience level. A four-channel electronic music mixture continually conveys the participants' expressed cognitive states, while a colored visualization of their locations on a two-dimensional projection of cognitive-state attributes aids their navigation through the space. (A simplified sketch of the mixing scheme appears below.) MoodMixer is a collaborative experience that incorporates aspects of both passive and active EEG sonification and performance art. In our NIME '11 paper below we discuss the technical design of the installation and place its collaborative sonification aesthetic within the context of existing EEG-based music and art.


 
 

A paper describing MoodMixer was published in the Proceedings of the 2011 International Conference on New Interfaces for Musical Expression (NIME) in Oslo, Norway and can be accessed by clicking the icon on the left.
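Here is a minimal sketch of the two-dimensional mixing idea, with details assumed for illustration: two normalized cognitive indices position the listeners in a unit square whose corners hold the four tracks, and each track's gain is its bilinear corner weight, so the gains always sum to one. The specific gain law is an assumption, not necessarily the one used in the installation.

```python
# Minimal sketch of MoodMixer-style 2D mixing: two normalized indices
# (index1 = relaxation/meditation, index2 = attention/focus) select bilinear
# corner weights for four tracks. The gain law is an illustrative assumption.
import numpy as np

def corner_gains(x, y):
    """x, y in [0, 1] -> gains for tracks at corners (0,0), (1,0), (0,1), (1,1).
    Gains are non-negative and always sum to 1 (constant total level)."""
    x, y = np.clip(x, 0, 1), np.clip(y, 0, 1)
    return np.array([(1 - x) * (1 - y),  # low relaxation, low focus
                     x * (1 - y),        # high relaxation, low focus
                     (1 - x) * y,        # low relaxation, high focus
                     x * y])             # high relaxation, high focus

def mix(tracks, x, y):
    """tracks: (4, samples) array of audio; returns the weighted mono mix."""
    return corner_gains(x, y) @ tracks

# A low-relaxation, high-focus state weights the (0, 1) corner most heavily:
print(corner_gains(0.2, 0.9))  # -> [0.08 0.02 0.72 0.18]
```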


A depiction of the MoodMixer installation in use.





Diagram of the installation hardware setup with the communication protocols between components. In the NIME version of the installation, index1 corresponds to “relaxation/meditation” and index2 to “attention/focus”.




Grace and Tim and a laptop version of MoodMixer at CES 2011 in Las Vegas




MoodMixer 2.0

Grace Leslie and Tim Mullen

Kircher's 1650 "Arca Musarithmica" -- a precursor to modern algorithmic composition

While the first iteration of MoodMixer spatially remixed four pre-recorded electronic music samples, MoodMixer 2.0 (premiered in San Diego at Mozart and the Mind 2012) uses a new automatic music generator to produce a composition reminiscent of John Adams' piano piece Phrygian Gates (1977-8). The software randomly chooses notes from a set scale and repeats them to create slowly evolving loops, hallmarks of Adams' minimalist style, of which Phrygian Gates is a prototypical example. The scale begins in A Lydian, then shifts to A Phrygian, and then cycles around the circle of fifths, to E Lydian, E Phrygian, and so on. Several aspects of the music are manipulated to match the participants' cognitive states, primarily tempo and mode, as these are thought to be the features that most determine a piece of music's emotional expression [1]. One participant is able to increase and decrease the tempo of the piece based on their level of relaxation, while the other participant influences the overall texture of the piece by expanding and contracting note lengths based on their level of focused attention. Participants also have the option of jumping to the next key in the cycle by blinking their eyes. No two performances of the composition sound the same, given the unique contributions of each participant and the piece's continually evolving structure. A minimal sketch of such a generator appears below.

References

[1] Gabrielsson, A., & Lindström, E. (2001). "The influence of musical structure on emotional expression." In Music and Emotion: Theory and Research, 223–243.
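Below is a minimal Python sketch of the generator described above. The tempo and note-length mappings, loop length, and MIDI register are illustrative assumptions; only the scale cycle (Lydian/Phrygian tonics ascending by fifths) follows the description.

```python
# Minimal sketch of a MoodMixer 2.0-style generator: random notes from the
# current scale repeated into slowly evolving loops, with relaxation driving
# tempo and focus driving note length. Mappings are illustrative assumptions.
import random

LYDIAN   = [0, 2, 4, 6, 7, 9, 11]   # semitone offsets from the tonic
PHRYGIAN = [0, 1, 3, 5, 7, 8, 10]

def key_cycle(tonic=57):            # MIDI 57 = A3; A Lydian, A Phrygian, E ...
    while True:
        for mode in (LYDIAN, PHRYGIAN):
            yield [tonic + step for step in mode]
        tonic += 7                  # next key around the circle of fifths
        if tonic > 69:              # fold back into a comfortable octave
            tonic -= 12

def next_loop(scale, relaxation, focus, length=8):
    """One slowly evolving loop: a tempo plus (midi_note, beats) pairs.
    relaxation and focus are normalized [0, 1] values derived from EEG."""
    tempo_bpm = 50 + 70 * relaxation        # tempo follows relaxation
    beats = 0.25 + 1.75 * focus             # focus expands/contracts notes
    return tempo_bpm, [(random.choice(scale), beats) for _ in range(length)]

keys = key_cycle()
scale = next(keys)                          # start in A Lydian
tempo, loop = next_loop(scale, relaxation=0.6, focus=0.3)
# An eye-blink event advances the key: scale = next(keys)
```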


 
 


An abstract describing MoodMixer 2.0 for the Mozart and the Mind Festival 2012 can be downloaded by clicking the icon on the left.



EEG Ocean

Nima Bigdely-Shamlo and Tim Mullen

EEG Ocean is a method for low-dimensional (2D) visualization of the spatial topographies, temporal activations, and correlations between multiple quasi-independent components (sources) extracted from high-density EEG.

Background

The brain is a complex environment with many spatially distributed subsystems interacting over time. Visualizing patterns of brain activity over long periods of time is generally impossible by simply looking at long time-series of EEG. As such, researchers usually apply reductive techniques that seek to extract meaningful patterns of activity from the time-series. For example, we might apply spectral decomposition techniques to identify frequency-specific structure within activity recorded from different parts of the brain. However, with these reductive techniques it can be difficult to get an intuitive sense of the overall pattern of ongoing brain activity across time and space.

EEG Ocean is a novel approach to visualizing complex spatiotemporal patterns of dynamic brain activity over long periods of time in an intuitive and aesthetically pleasing manner. EEG is thought to be the summed activity of multiple cortical generators or "sources." To approximately recover the activations of these sources, Independent Component Analysis (ICA), a blind source separation technique which separates a multi-channel EEG signal into maximally independent components (ICs), can be applied to EEG data collected over a long period of time. Sources that are interacting (non-independent) may preserve some residual mutual information after applying ICA. As such, ICs are clustered using Multidimensional Scaling (MDS) such that those with high residual mutual information are spatially proximal while those with little shared information are pushed apart. The activity of these ICs is then displayed as "ripples" propagating out through time from their respective ICs (neural "sources"). With a high-resolution display this allows us to visualize the collective activity of many neural components (independent or interacting across time) over long periods of time (e.g., many minutes, rather than seconds). Event-Related Potentials (ERPs) and oscillatory bursts appear as prominent ripple sequences which can be traced back to their respective sources. The relative timing of neural events between ICs can be easily inferred by comparing the respective radii of ripples emanating from different ICs. A single frame is a "snapshot" of all significant neural events over some period of time up to the present, where the length of time visualizable depends on the step size between frames and the resolution of the display. A more detailed description of the procedure is outlined below.

Algorithm
  1. N-channel EEG is separated into N (maximally) independent components using Independent Component Analysis (ICA)
  2. A subset of informative Independent Components (ICs) are selected for display (this selection may vary depending on the goals of the researcher)
  3. (OPTIONAL) Selected ICs are projected onto a 2D image using metric multidimensional scaling on a relevant measure (such as mutual info or latency between two ICs in which mutual info is maximum).
  4. ICA weights are rendered as topographic plots ("islands") each depicting the spatial distribution of activation of a given IC across the scalp (nose points up)
  5. Select a reference time point (current time). For each pixel in the image, distance from the pixel to each IC location is calculated and mapped into latency w.r.t. the reference point (longer distance equals longer latency before current time point in EEG). A measure of IC activity (e.g., activation, abs. activity, power at certain band, or mutual information with class) is assigned to that pixel based on calculated latency and these values for different ICs are summed together.
  6. Apply a preferred color mapping to pixel values and render the image (or 3D surface height map).
  7. Repeat steps 5-6 for a range of current time points over a segment of EEG (of arbitrary length) with a specified step size (e.g., 50 ms); a minimal code sketch of the rendering loop appears below this list.
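Below is a minimal Python sketch of the per-pixel rendering in steps 5-6: each pixel's distance to an IC's island is read as latency into the past, the IC's activity at that latency is summed into the pixel, and the result is color-mapped. Grid size, the latency-per-pixel constant, and the choice of activity measure are illustrative assumptions.

```python
# Minimal sketch of EEG Ocean's rendering (steps 5-6 above). Parameters are
# illustrative; the real display runs at far higher resolution.
import numpy as np

FS = 256             # EEG sampling rate (Hz), assumed
MS_PER_PIXEL = 4.0   # ripple propagation: EEG latency represented per pixel

def render_frame(ic_power, ic_xy, t_now, width=320, height=160):
    """ic_power: (n_ics, samples) activity measure per IC; ic_xy: (n_ics, 2)
    pixel locations from MDS; t_now: current sample index. Returns an (H, W)
    image in which activity appears farther from an island the older it is."""
    ys, xs = np.mgrid[0:height, 0:width]
    frame = np.zeros((height, width))
    for power, (cx, cy) in zip(ic_power, ic_xy):
        dist = np.hypot(xs - cx, ys - cy)                 # pixels from island
        lag = (dist * MS_PER_PIXEL / 1000.0 * FS).astype(int)
        sample = np.clip(t_now - lag, 0, power.shape[0] - 1)
        frame += power[sample]            # sum contributions across ICs
    return frame  # apply any preferred colormap or 3D height-map rendering

# Repeating render_frame over successive t_now values (step 7) animates the
# ocean: each new frame pushes earlier activity one step farther outward.
```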

Example

In the following demo, EEG Ocean is applied to 64-channel EEG of a representative subject performing a Rapid Serial Visual Presentation task (RSVP; many image chips displayed in rapid sequence (12 images per second) with a target chip randomly embedded in the sequence). EEG is collected at 256 Hz. ICA is applied to the data and selected IC power (abs. value of IC activations after bandpassing with a 3-30 Hz zero-phase FIR filter) is visualized in approximately real time (with a rate of 48 ms of EEG per frame, each second of video corresponds to 1.1209 seconds of EEG). For this demo, we have a display width of 3200 pixels, so any given frame contains between 6.25 and 12.5 seconds of activity for a given IC (depending on the location of the IC). Note that with a resolution of 35,840 x 8,000 (the maximum HiPerSpace resolution) and a step size of 100 ms of EEG per frame, a single EEG Ocean snapshot would visualize roughly 30-60 minutes of activity for all selected ICs. This provides a unique ability to gain an intuitive sense of ongoing brain dynamics and identify salient neural events occurring over very long periods of time, such as during sleep, resting state, etc.


The movie below is a reduced-resolution video (for web streaming) of EEG Ocean applied to 12.5 seconds of continuous EEG.

EEG Ocean Movie Clip



High-resolution screenshots of EEG Ocean from the above movie (click for larger image). An interactive multi-scale version is available here.
 
 
 
 


In November 2008, EEG Ocean was displayed on the HiPerSpace wall at Calit2. Until 2009, HiPerSpace was the world's largest LCD display wall, with a maximum resolution of 35,840 x 8,000 pixels for a total of 286,720,000 pixels. I was assisted in this demo by Ramsin Khoshabeh and members of Falko Kuester's lab at Calit2. Below are video and images of EEG Ocean on HiPerSpace. Note that for this demo we used a display resolution of 3200 x 1600. As such, EEG Ocean is making use of only a fraction of the total HiPerSpace real estate.

Screenshots from EEG Ocean on HiPerSpace at Calit2
 
 
 
 


Future Work

EEG Ocean is still in its infancy. Within certain resolution and framerate limitations, it is currently possible to use EEG Ocean in real time using SCCN's open-source DataRiver/Matriver software. We plan to add interactive control, allowing the user to call up alternate views (e.g., 3D dipole displays and raw time-series) for selected components and obtain more detailed information about a given component or display section. Multi-scale imaging technology can be used to allow the user to change the temporal scale (zoom in) while preserving resolution. See this link (thanks to Nima Bigdely-Shamlo) for a simple example of an interactive multi-scale implementation using Microsoft Seadragon Ajax.

Software

A beta (older) version of EEG Ocean capable of real-time usage is included in the open-source Matriver package (Matlab), accessible here. Contact me or Nima for details on newer versions.



Ringing Minds (2014)


Ringing Minds premiered on May 31st, 2014, at Mainly Mozart's "Mozart and the Mind" festival in La Jolla, California. It explores collective brain responses interacting in a spontaneous musical landscape, aided by recent advances in sensing technology and powerful tools for analyzing the electroencephalograms (EEGs) of multiple brains.

Ringing Minds is a collaboration with experimental music pioneer and CalArts Music Chair David Rosenboom and composer-performer and UCSD ethnomusicologist Alexander Khalil. In this unique fusion of performance and scientific experiment, Rosenboom and Khalil, on violin and lithoharp (an instrument made from sonorous stone), musically influence four audience members' brain states, measured with wearable EEG systems. A "hyperbrain," which simultaneously combines dynamical information from all four brains, was generated using algorithms developed during my Ph.D. research. The dynamical states of the hyperbrain, specifically resonant Principal Oscillation Patterns, were rendered audible using spatial sonification algorithms by Rosenboom, forming a musical "pond" into which sonic events are thrown, creating ripples, resonance, and an altogether unique musical experience.
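For the technically inclined: Principal Oscillation Pattern analysis treats the multichannel (here, multi-brain) signal as a first-order vector autoregressive process and reads each complex eigenvalue of the fitted coefficient matrix as a damped oscillatory mode with a frequency and decay time that can be sonified. The Python sketch below shows only this core idea with placeholder data; the preprocessing and model details of the actual performance system are not shown.

```python
# Minimal sketch of Principal Oscillation Pattern (POP) analysis for a
# "hyperbrain": fit a VAR(1) model to all participants' stacked EEG channels
# and eigendecompose its coefficient matrix. Placeholder data only.
import numpy as np

def pop_modes(x, fs):
    """x: (channels, samples) hyperbrain signal (all brains stacked).
    Returns per-mode frequency (Hz), damping time (s), and spatial pattern."""
    past, present = x[:, :-1], x[:, 1:]
    A = present @ np.linalg.pinv(past)      # least-squares VAR(1) fit
    eigvals, eigvecs = np.linalg.eig(A)
    freqs = np.angle(eigvals) * fs / (2 * np.pi)      # mode frequency (Hz)
    damping = -1.0 / (np.log(np.abs(eigvals)) * fs)   # e-folding time (s)
    return freqs, damping, eigvecs

# Example: 4 participants x 8 channels = a 32-channel hyperbrain at 256 Hz
rng = np.random.default_rng(0)
f, d, v = pop_modes(rng.standard_normal((32, 256 * 10)), fs=256)
```

Lightly damped modes (long decay times) mark resonant, shared oscillations across the audience's brains, which is what makes them natural candidates for sonification.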

Violin and Electronics: David Rosenboom
Lithoharp: Alexander Khalil
Hyperbrain: Tim Mullen


** 2015 UPDATE ** A revised version of Ringing Minds was performed on Saturday, May 23, 2015 at the Whitney Museum of American Art in Manhattan, as part of a three-day retrospective of David Rosenboom's 50 years of pioneering work in experimental music.


 
https://drive.google.com/file/d/0B6Q34w15EJPIVmR4RHJ4NXd1bVU/view?usp=sharing
 
A book chapter describing Ringing Minds and other MATM installations has been published by Springer:

Mullen, T., et al. "MindMusic: Playful and Social Installations at the Interface Between Music and the Brain." In More Playful User Interfaces, edited by Anton Nijholt. Springer, 2015.




Mozart and the Mind 2012

Please visit the MATM 2012 page for details.



Mozart and the Mind 2013

Please visit the MATM page for details.



Mozart and the Mind 2014

Please visit the MATM page for details.

Copyright (c) 2009 Tim Mullen