The Glass Brain
The Glass Brain is a Unity3D brain visualization that displays source activity and connectivity, inferred in real time from high-density EEG using methods implemented in SIFT and BCILAB. It was developed by Tim Mullen and Christian Kothe at the Swartz Center for Computational Neuroscience, UC San Diego, and Syntrogi Labs, in collaboration with Adam Gazzaley and the Neuroscape Lab at UC San Francisco, with contributions from NVIDIA, StudioBee, and many others.
This is an anatomically realistic 3D brain visualization depicting real-time source-localized activity (power and “effective” connectivity) from EEG (electroencephalographic) signals. Each color represents inferred source power and connectivity in a different frequency band (theta, alpha, beta, gamma), and the golden lines are white matter anatomical fiber tracts. Estimated information transfer between brain regions is visualized as pulses of light flowing along the fiber tracts connecting the regions.
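How a single reconstructed source signal can be reduced to per-band power for display is straightforward to sketch. The Python snippet below is a minimal, hypothetical illustration, not the Glass Brain code: it band-pass filters the signal in each of the four bands and squares the Hilbert envelope. The sampling rate, band edges, and filter order are illustrative assumptions.

```python
# Hypothetical sketch of per-band source power (not the Glass Brain pipeline).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 256.0  # assumed sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 50)}

def band_power(source_ts, fs=FS):
    """Instantaneous power of one source time series in each frequency band."""
    powers = {}
    for name, (lo, hi) in BANDS.items():
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, source_ts)          # zero-phase band-pass
        envelope = np.abs(hilbert(filtered))          # amplitude envelope
        powers[name] = envelope ** 2                  # power ~ squared envelope
    return powers

# Example: a noisy 2-second trace with a dominant 10 Hz (alpha) rhythm
t = np.arange(0, 2, 1 / FS)
x = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
print({band: float(p.mean()) for band, p in band_power(x).items()})
```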
The modeling pipeline includes MRI (Magnetic Resonance Imaging) brain scanning to generate a high-resolution 3D model of an individual’s brain, skull, and scalp tissue; DTI (Diffusion Tensor Imaging) for reconstructing white matter tracts; and BCILAB / SIFT to remove artifacts and statistically reconstruct the locations and dynamics (amplitude and multivariate Granger-causal interactions) of multiple sources of activity inside the brain from signals measured at electrodes on the scalp. In this demo, we used a 64-channel “wet” mobile EEG system by Cognionics, Inc.
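The “multivariate Granger-causal interactions” mentioned above are, in SIFT, obtained from multivariate autoregressive (MVAR) models fit to the reconstructed source time series. The Python sketch below illustrates the idea only; it is not the SIFT (MATLAB) implementation. It fits an MVAR model by ordinary least squares and computes partial directed coherence (PDC), one frequency-domain Granger-causal measure of the family SIFT provides. The model order, sampling rate, and toy data are assumptions made for the example.

```python
# Hypothetical MVAR + partial directed coherence sketch (not SIFT itself).
import numpy as np

def fit_mvar(X, order):
    """X: (n_channels, n_samples). Returns lag matrices A_1..A_p, each (n_ch, n_ch)."""
    n_ch, n_s = X.shape
    Y = X[:, order:].T                                             # targets
    Z = np.hstack([X[:, order - k:n_s - k].T for k in range(1, order + 1)])
    coefs, *_ = np.linalg.lstsq(Z, Y, rcond=None)                  # least squares
    return [coefs[k * n_ch:(k + 1) * n_ch].T for k in range(order)]

def pdc(A, freqs, fs):
    """pdc[f, i, j]: normalized directed influence of source j on source i at frequency f."""
    n_ch = A[0].shape[0]
    out = np.zeros((len(freqs), n_ch, n_ch))
    for fi, f in enumerate(freqs):
        Af = np.eye(n_ch, dtype=complex)
        for k, Ak in enumerate(A, start=1):
            Af -= Ak * np.exp(-2j * np.pi * f * k / fs)
        out[fi] = np.abs(Af) / np.sqrt((np.abs(Af) ** 2).sum(axis=0))  # column-normalize
    return out

# Toy example: source 1 drives source 2 with a 3-sample lag
rng = np.random.default_rng(0)
x1 = rng.standard_normal(2000)
x2 = 0.8 * np.roll(x1, 3) + 0.5 * rng.standard_normal(2000)
A = fit_mvar(np.vstack([x1, x2]), order=5)
P = pdc(A, freqs=np.linspace(1, 45, 45), fs=256.0)
print(P.mean(axis=0))   # the [1, 0] entry (source 1 -> source 2) dominates the off-diagonals
```

Because measures like PDC are resolved over frequency, the estimated information flow can be attributed to specific bands, which is consistent with the band-specific coloring described above.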
The final visualization is rendered in Unity3D and allows the user to fly around and through the brain with a gamepad while viewing real-time brain activity from someone wearing an EEG cap.
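The description above does not name the transport that carries the processed estimates from the EEG pipeline into Unity3D. A common choice in the BCILAB ecosystem is the Lab Streaming Layer (LSL); the Python sketch below shows, purely as an assumed example, how per-band source power could be published over LSL for a rendering client to subscribe to. The stream name, channel layout, and update rate are hypothetical.

```python
# Hypothetical LSL outlet publishing per-band source power (assumed transport,
# not necessarily what the Glass Brain used).
import time
import numpy as np
from pylsl import StreamInfo, StreamOutlet

N_SOURCES = 8                                   # assumed number of visualized regions
BANDS = ["theta", "alpha", "beta", "gamma"]
RATE = 25.0                                     # assumed visualization update rate (Hz)

info = StreamInfo(name="GlassBrainPower",       # hypothetical stream name
                  type="SourcePower",
                  channel_count=N_SOURCES * len(BANDS),
                  nominal_srate=RATE,
                  channel_format="float32",
                  source_id="glassbrain-demo")
outlet = StreamOutlet(info)

for _ in range(10 * int(RATE)):                 # stream ~10 seconds of frames
    # In the real pipeline these values would come from the source-localization
    # and connectivity stage; random placeholders stand in here.
    sample = np.random.rand(N_SOURCES * len(BANDS)).astype(np.float32)
    outlet.push_sample(sample.tolist())
    time.sleep(1.0 / RATE)
```

A game-engine client could subscribe to the same stream with LSL's C# bindings and map each incoming channel to the brightness or color of a brain region.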
For additional details on the methods used here, please consult the following resources (available HERE):
- Panel talk by Mullen, Kothe, and Konigs at GPU Technology Conference 2014.
- Mullen, T., Kothe, C., Chi, Y.M., Ojeda, A., Kerth, T., Makeig, S., Cauwenberghs, G., Jung, T-P. (2013). Real-Time Modeling and 3D Visualization of Source Dynamics and Connectivity Using Wearable EEG. 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society.
- Mullen, T. R. (2014). The dynamic brain: Modeling neural dynamics and interactions from human electrophysiological recordings (Order No. 3639187). Available from Dissertations & Theses @ University of California; ProQuest Dissertations & Theses A&I. (1619637939). Retrieved from http://search.proquest.com/docview/1619637939?accountid=14524. See chapters 1 and 9, and the Supplementary Material.