Real-World Neuroimaging: Demo of the Escape Room (In-game assessment screen)


Real-world neuroimaging

The past twenty years have witnessed remarkable advances in both fundamental neuroscience research and next-generation neurotechnologies. However, nearly all neuroscience studies have been conducted in well-controlled laboratory settings. It has been argued that fundamental differences may exist between laboratory-based and naturalistic human behavior, and it remains unclear how well the current knowledge of human brain function translates to the highly dynamic real world (McDowell et al., 2014). There is therefore a need to study the brain in ecologically valid environments in order to truly understand how the human brain functions to optimally control behavior in the face of ever-changing physical and cognitive circumstances. To this end, our group has developed and validated a wearable multi-modal bio-sensing system capable of collecting, synchronizing, recording, and transmitting data from multiple bio-sensors on unconstrained, freely moving subjects in everyday environments. The figure below shows a prototype of our multi-modal bio-sensing platform (Siddharth et al., 2018). A video demo is available online.

Fig. 1 A wearable multi-modal bio-sensing headset features (A) World camera, (B) EEG Sensors, (C) Battery, (D) EEG Reference Electrode, (E) Eye Camera, (F) Earlobe PPG Sensor, (G) Headphone/speaker connector, and (H) Embedded System. [Figure reproduced from Siddharth et al., 2018]
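A core requirement of such a platform is aligning sensor streams that arrive at different sampling rates onto a common timeline. As a minimal illustration of the idea (not the actual system, whose synchronization is handled in its embedded hardware and software), the sketch below resamples a simulated earlobe-PPG stream onto an EEG clock by linear interpolation; all signals and rates here are invented for the example, assuming only NumPy.

```python
import numpy as np

def synchronize_streams(t_ref, t_other, x_other):
    """Resample a sensor stream onto a reference timeline by linear interpolation."""
    return np.interp(t_ref, t_other, x_other)

# Simulated example: EEG at 250 Hz and PPG at 60 Hz over 2 seconds.
t_eeg = np.arange(0, 2, 1 / 250.0)        # reference (EEG) timestamps
t_ppg = np.arange(0, 2, 1 / 60.0)         # PPG timestamps
ppg = np.sin(2 * np.pi * 1.2 * t_ppg)     # ~72 bpm pulse-like waveform

ppg_on_eeg_clock = synchronize_streams(t_eeg, t_ppg, ppg)
print(ppg_on_eeg_clock.shape)             # (500,) — one PPG sample per EEG sample
```

In a real deployment, per-stream hardware timestamps would replace the idealized time vectors above.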

Real-time EEG analysis and modeling

Recent advances in dry-electrode electroencephalography (EEG) and wearable/wireless data-acquisition systems have spurred new research on, and development of, EEG applications in real-world cognitive-state monitoring, affective computing, clinical diagnostics and therapeutics, and brain–computer interfaces (BCIs), among others. These practical applications of EEG call for further advances in signal processing and machine learning to improve real-time (and online) measurement and classification of brain and behavioral states from small samples of noisy EEG data (Mullen et al., 2015).

Our Center has recently developed a real-time software framework that includes adaptive artifact removal using artifact subspace reconstruction (ASR), cortical source localization, multivariate effective connectivity inference, data visualization, and cognitive-state classification from connectivity features using a constrained logistic regression approach (ProxConn) (Mullen et al., 2015). The figure below shows a schematic of the real-time data-processing pipeline. A video demo is available online.

dry-electrode EEG
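As a rough illustration of the final classification stage of such a pipeline, the sketch below trains a sparse (L1-penalized) logistic regression on synthetic "connectivity" features. It is a simplified stand-in for the constrained ProxConn approach, not the Center's actual implementation; the data dimensions and effect sizes are invented, and scikit-learn and NumPy are assumed.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "effective connectivity" features: n_trials x n_edges.
# In the real pipeline these would come from multivariate connectivity
# estimates on source-localized EEG; here they are random, with three
# edges made informative about a binary cognitive state.
n_trials, n_edges = 200, 45
X = rng.normal(size=(n_trials, n_edges))
y = rng.integers(0, 2, size=n_trials)
X[:, :3] += 1.5 * y[:, None]          # three discriminative connections

# Sparse (L1-penalized) logistic regression as a simplified stand-in
# for the constrained-logistic-regression idea described above.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
print(f"non-zero weights: {np.count_nonzero(clf.coef_)} of {n_edges}")
```

The sparsity penalty is the point of interest here: it drives most connectivity weights to exactly zero, leaving a small, interpretable set of discriminative connections.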

Real-time EEG Source-mapping Toolbox and its application to automatic artifact removal

SCCN graduate students Shawn Hsu and Luca Pion-Tonachini have recently developed a new MATLAB-based toolbox, the Real-time EEG Source-mapping Toolbox (REST), for real-time multi-channel EEG data analysis and visualization (Pion-Tonachini, Hsu et al., EMBC 2015; Pion-Tonachini, Hsu et al., 2018).

REST combines artifact subspace reconstruction (ASR), online recursive source-separation algorithms (e.g., ORICA; Hsu et al., 2015), and machine-learning approaches to automatically reject muscle, movement, and eye artifacts in EEG data in near real time (Pion-Tonachini, Hsu et al., EMBC 2015; Chang, Hsu et al., 2018; Pion-Tonachini, Hsu et al., 2018). Video demos are available online, and the open-source code of the toolbox is available on GitHub.

Brain-computer interfaces

Brain-computer interfaces (BCIs) allow users to translate their intentions into commands to control external devices, enabling an intuitive interface for disabled and/or nondisabled users (Wolpaw et al., 2002). Among the various neuroimaging modalities, electroencephalography (EEG) is one of the most popular for real-world BCI applications due to its non-invasiveness, low cost, and high temporal resolution (Wolpaw et al., 2002). Our group at the Swartz Center for Computational Neuroscience has developed state-of-the-art BCIs based on steady-state visual evoked potentials (SSVEPs), an intrinsic neural electrophysiological response to repetitive visual stimulation, and achieved a new record for BCI communication speed (an information transfer rate of 325 bits/min). The figure below shows a high-speed BCI speller adapted from our publication in PNAS (Chen et al., 2015). More recently, our BCI speller achieved spelling rates of up to 75 characters (~15 words) per minute (Nakanishi et al., 2018). Video demos can be found online.

Fig. 1. A closed-loop system design of the SSVEP-based BCI speller. (A) System diagram of the BCI speller, which consists of four main procedures: visual stimulation, EEG recording, real-time data processing, and feedback presentation. The 5 × 8 stimulation matrix includes the 26 letters of the English alphabet, 10 numbers, and 4 symbols (i.e., space, comma, period, and backspace). Real-time data analysis recognizes the attended target character through preprocessing, feature extraction, and classification. The image of the stimulation matrix is for illustration only.


Affective computing

Affective computing (sometimes called artificial emotional intelligence, or emotion AI) aims to recognize, interpret, process, and simulate human affect. Over the past decade, studies in affective computing have mainly classified human emotions using face images or videos. Recent developments in bio-sensing hardware have spurred growing use of physiological measurements such as electroencephalography (EEG), electrocardiography (ECG), galvanic skin response (GSR), and heart-rate variability (HRV) to detect human valence and arousal. However, the results of these studies are constrained by the limitations of each modality, such as the absence of physiological biomarkers in face-video analysis, the poor spatial resolution of EEG, and the poor temporal resolution of GSR. Scant research has compared the merits of these modalities or examined how best to use them individually and jointly (Siddharth et al., 2018). Our group has been developing novel signal-processing and deep-learning methods to detect human emotion from various bio-sensing and video-based modalities. We first evaluate the emotion-classification performance obtained by each modality individually. We then evaluate the performance obtained by fusing features across modalities and compare it with the single-modality results.
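As a schematic of that evaluation protocol, the sketch below compares cross-validated classification accuracy for two synthetic "modalities" separately and after simple feature-level fusion. The data, feature counts, and effect sizes are invented for illustration (scikit-learn and NumPy assumed); the published work uses real recordings and more sophisticated fusion.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 300
y = rng.integers(0, 2, size=n)              # low vs. high arousal (synthetic)

# Each modality carries partial, complementary information about the label.
eeg = rng.normal(size=(n, 10)); eeg[:, 0] += 0.8 * y   # e.g., EEG band-power features
hrv = rng.normal(size=(n, 4));  hrv[:, 0] += 0.8 * y   # e.g., HRV features

def cv_acc(X):
    """Mean 5-fold cross-validated accuracy of a logistic-regression classifier."""
    return cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()

print(f"EEG alone : {cv_acc(eeg):.2f}")
print(f"HRV alone : {cv_acc(hrv):.2f}")
print(f"fused     : {cv_acc(np.hstack([eeg, hrv])):.2f}")  # feature-level fusion
```

Because the two modalities here carry independent information, concatenating their features typically yields higher accuracy than either alone, mirroring the modality-fusion comparison described above.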

Related journal publications

  1. Lin, Y-P, Duann, J-R, Chen, J-H, Jung, T-P “Electroencephalographic Dynamics of Musical Emotion Perception Revealed by Independent Spectral Components,” NeuroReport, 21(6):410-415, 2010.
  2. Lin, Y-P, Wang, C-H, Jung, T-P, Wu, T-L, Jeng, S-K, Duann, J-R, and Chen, J-H. “EEG-based Emotion Recognition in Music Listening,” IEEE Transactions on Biomedical Engineering, 57(7), 1798–1806, 2010.
  3. Lin, Y-P, Duann, J-R, Feng, W., Chen, J-H, Jung, T-P, Revealing Spatio-spectral Electroencephalographic Dynamics of Musical Mode and Tempo Perception by Independent Component Analysis, Journal of NeuroEngineering and Rehabilitation, 11:18, 2014. doi:10.1186/1743-0003-11-18.
  4. Lin, Y-P, Yang, Y-H., Jung, T-P., Fusion of Electroencephalogram Dynamics and Musical Contents for Estimating Emotional Responses in Music Listening, Frontiers in Neuroscience, 8: article 94, 2014. doi: 10.3389/fnins.2014.00094.
  5. Lin, Y-P., and Jung, T-P., Improving EEG-Based Emotion Classification Using Conditional Transfer Learning, Frontiers in Human Neuroscience 11: article 334, 2017.

Book chapters

  1. Lin, Y-P., Jung, T-P, Wang, J., Onton, J. “Toward Affective Brain-Computer Interface: Fundamentals and Analysis of EEG-based Emotion Classification,” In: Advances in Emotion Recognition, Wiley-Blackwell, 2015.
  2. Lin, Y-P., Hsu, S-H, Jung, T-P, “Exploring Day-to-day Variability in the Relations between Emotion and EEG Signals,” in D. Schmorrow, C. Fidopiastis (Eds.): Foundations of Augmented Cognition, Lecture Notes in Artificial Intelligence, 9183: 461-9, 2015.

A VR-based BCI objectively assesses our vision

Assessment of loss of visual function is an essential component of the management of numerous conditions, including glaucoma and other retinal and neurological disorders. Current assessment of visual-field loss with standard automated perimetry (SAP) is limited by the subjectivity of patient responses and high test-retest variability, and frequently requires many tests to effectively detect change over time. Our group has collaborated with Dr. Felipe Medeiros of Duke University to develop a wearable device that uses a head-mounted display (HMD) integrated with wireless EEG, capable of objectively assessing visual-field deficits using multifocal steady-state visual evoked potentials (mfSSVEPs) (Nakanishi et al., 2017). In contrast to the transient event-related potentials elicited during conventional visual evoked potential (VEP) or multifocal VEP examinations, rapid flickering visual stimulation produces a brain response known as the steady-state visual evoked potential (SSVEP), characterized by a quasi-sinusoidal waveform whose frequency components are constant in amplitude and phase. We recently succeeded in employing the mfSSVEP technique to detect peripheral and localized visual-field losses (Nakanishi et al., 2017). The figure below shows our nGoggle prototype from our publication in JAMA Ophthalmology in 2017.

nGoggle prototype
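One common way to quantify an SSVEP response of this kind is a spectral signal-to-noise ratio: the amplitude at the stimulation frequency divided by the mean amplitude of neighboring frequency bins. The sketch below illustrates that measure on synthetic data; it is a generic illustration, not the nGoggle analysis pipeline, and assumes only NumPy.

```python
import numpy as np

def ssvep_snr(x, fs, f_stim, n_neighbors=4):
    """SNR at the stimulation frequency: amplitude at the stimulus bin
    divided by the mean amplitude of neighboring frequency bins."""
    amp = np.abs(np.fft.rfft(x)) / x.size
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    k = int(np.argmin(np.abs(freqs - f_stim)))
    neighbors = np.r_[amp[k - n_neighbors:k], amp[k + 1:k + 1 + n_neighbors]]
    return amp[k] / neighbors.mean()

# Simulated 4-s recording at 250 Hz: an 11 Hz SSVEP in white noise.
rng = np.random.default_rng(3)
fs, f_stim = 250.0, 11.0
t = np.arange(0, 4, 1 / fs)
x = 0.8 * np.sin(2 * np.pi * f_stim * t) + rng.normal(size=t.size)

print(f"SNR at {f_stim} Hz: {ssvep_snr(x, fs, f_stim):.1f}")
```

In a multifocal design, a measure like this would be computed per stimulation region (each tagged by its own frequency), so that a depressed SNR flags a local visual-field deficit.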


Real-world stress affects human performance and brain activity (funded by NSF; PI: Dr. Ying Wu, co-PI: T-P Jung)

Real-life stressors such as fatigue, anxiety, and stress can induce homeostatic changes in human behavior, the brain, and the body. For instance, associations have been found between performance on tests of memory and attention and everyday stressors such as poor sleep (Gamaldo et al., 2010) or minor stressful events (Neupert et al., 2006; Sliwinski et al., 2006). Most research paradigms in cognitive neuroscience have been compelled to dismiss the influence of such stressors as uncontrollable noise. However, with the advent of inexpensive wearable biomonitors and advances in wireless and cloud-based computing, it is becoming increasingly feasible to detect and record fluctuations in daily experience on a 24/7 basis and relate them to human performance across a broad range of perceptual, cognitive, and motor functions.

Our study aims to examine how everyday experience relates to intra-individual variability in performance on cognitively demanding tasks. Using low-cost, unobtrusive, wearable biosensors and mobile, cloud-based computing systems, a Daily Sampling System (DSS) measures day-to-day fluctuations along a number of parameters related to cognitive performance, including sleep quality, stress, and anxiety. Participants are scheduled for same-day experimental problem-solving sessions when their daily profile suggests optimal, fair, or unfavorable conditions for achievement. This approach affords a within-individual comparison of variability in performance measures as a function of daily state. During scheduled sessions, classic stress-induction techniques are integrated into an engaging, realistic Escape Room challenge implemented in virtual reality (VR) while continuous EEG, ECG, eye gaze, pupillometry, and HR/HRV are recorded and synchronized.
We hypothesize that problem-solving strategies and outcomes will be modulated to different degrees across individuals by recent experiences of stress or fatigue. Further, on the basis of the project leaders' recent work, we anticipate that vulnerable versus resilient individuals may exhibit different patterns of both resting-state and task-related EEG dynamics. From this line of findings, it becomes possible to begin modeling the impact of variations in daily experience on neurocognitive processes. The figure below shows the experimenter's view of the VR Escape Room and the subject's visual field through the HTC Vive HMD. A video demo is available online.


A Brain-Computer Interface for Cognitive-state monitoring

Most research in the field of brain-computer interfaces aims to restore function to patients with impaired nervous systems, such as patients who are paralyzed with no control over their muscles. However, a much larger population of ‘healthy’ people who suffer momentary, episodic, or progressive cognitive impairments in daily life has, ironically, been overlooked. For example, as the last seventy years of research on human vigilance and attention have shown, humans are not well suited to maintaining alertness and attention under monotonous conditions, particularly during the normal sleep phase of their circadian cycle. Catastrophic errors can result from momentary lapses in alertness and attention during periods of relative inactivity. Many people, not just patients, could benefit from a prosthetic system that continuously monitors fluctuations in cognitive state in the workplace and/or the home (Lin et al., 2008). Our research team has published several fundamental studies that led toward a practical BCI and demonstrated the feasibility of accurately estimating an individual’s cognitive state, as indexed by changes in task performance, by monitoring changes in EEG power spectra and other measures. We have developed apparatus and real-time signal-processing and machine-learning algorithms to mitigate cognitive fatigue and attentional lapses. Below is a list of publications.
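The studies listed below repeatedly relate changes in EEG band power (e.g., in the alpha and theta bands) to lapses in alertness. As a toy illustration of the basic measurement, the sketch below compares alpha-band power between simulated "alert" and "drowsy" epochs using Welch's method; the signals and effect sizes are invented, this is not any specific published pipeline, and NumPy and SciPy are assumed.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    """Average power of a signal in a frequency band, via Welch's method."""
    freqs, psd = welch(x, fs=fs, nperseg=int(2 * fs))
    band = (freqs >= lo) & (freqs < hi)
    return psd[band].mean()

# Simulated 2-s epochs at 250 Hz: drowsier epochs carry more alpha (8-12 Hz)
# power, the kind of spectral change the alertness-monitoring studies exploit.
rng = np.random.default_rng(4)
fs = 250.0
t = np.arange(0, 2, 1 / fs)

def epoch(drowsy):
    alpha_amp = 2.0 if drowsy else 0.5
    return alpha_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(size=t.size)

alert_alpha = [band_power(epoch(False), fs, 8, 12) for _ in range(20)]
drowsy_alpha = [band_power(epoch(True), fs, 8, 12) for _ in range(20)]
print(f"mean alpha power  alert: {np.mean(alert_alpha):.2f}  "
      f"drowsy: {np.mean(drowsy_alpha):.2f}")
```

A real-time monitor would compute such band-power features on a sliding window and feed them to a regression or classification model of the operator's performance level.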

Related journal publications

  1. Makeig, S. and Jung, T-P., “Changes in Alertness are a Principal Component of Variance in the EEG Spectrum,” NeuroReport, 7: 213-216, 1995.
  2. Makeig, S. and Jung, T-P., “Tonic, Phasic, and Transient EEG Correlates of Auditory Awareness in Drowsiness,” Cognitive Brain Research, 4: 15-25, 1996.
  3. Jung, T-P., Makeig, S., Stensmo, M., and Sejnowski, T.J., “Estimating Alertness from the EEG Power Spectrum,” IEEE Transactions on Biomedical Engineering, 44(1): 60-69, 1997.
  4. Makeig, S., Enghoff, S., Jung, T-P., and Sejnowski, T.J., “A Natural Basis for Efficient Brain-Actuated Control,” IEEE Transactions on Rehabilitation Engineering, 8: 208-211, 2000.
  5. Van Orden, K., Jung, T-P., and Makeig, S., “Eye Activity Correlates of Fatigue,” Biological Psychology, 52(3): 221-240, 2000.
  6. Makeig, S., Jung, T-P., and Sejnowski, T.J., “Awareness During Drowsiness: Dynamics and Electrophysiological Correlates,” Canadian Journal of Experimental Psychology, 54(4): 266-273, 2000.
  7. Van Orden, K., Makeig, S., Limbert, W., and Jung, T-P., “Eye Activity Correlates of Workload During a Visual Spatial Memory Task,” Human Factors, 43(1): 111-121, 2001.
  8. Lin, C-T., Wu, R-C, Jung, T-P., Liang, S-F, and Huang, T-Y. “Estimating Driving Performance Based on EEG Spectrum Analysis,” EURASIP Journal on Applied Signal Processing 19: 3165-74, 2005.
  9. Lin, C-T., Wu, R-C., Liang, S-F., Huang, T-Y., Chao, W-H., Chen, Y-J., and Jung, T-P., “EEG-based Drowsiness Estimation for Safety Driving Using Independent Component Analysis,” IEEE Transactions on Circuits and Systems I, 52(12): 2726-2738, 2005.
  10. Lin, C-T., Ko, L-W., Chung, I-F., Huang, T-Y.,  Chen, Y-C., Jung, T-P., and Liang, S-F,  “Adaptive EEG-based Alertness Estimation System by Using ICA-based Fuzzy Neural Networks,” IEEE Transactions on Circuits and Systems I, 53(11): 2469-76, 2006.
  11. Tsai, Y.F., Viirre, E., Strychacz, C., Chase, B, and Jung, T-P., “Task Performance and Eye Activity Relating to Cognitive Workload”, Aviation, Space, and Environmental Medicine, 78(5):B176-85, 2007.
  12. Huang, R-S, Jung, T-P., Delorme A, Makeig, S. “Tonic and phasic electroencephalographic dynamics during continuous compensatory tracking,” NeuroImage, 39:1896–1909, 2008.
  13. Lin, C-T., Ko, L-W, Chiou, J-C, Duann, J-R., Huang, R-S., Liang, S-F, Chiu, T-W., Jung, T-P. “Noninvasive neural prostheses using mobile and wireless EEG,” Proceedings of the IEEE, 96(7):1167-83, 2008.
  14. Pal, N.R., Chuang, C-Y., Ko, L-W., Chao, C-F., Jung, T-P., Liang, S-F., Lin, C-T. “EEG-based Subject- and Session-independent Drowsiness Detection: An Unsupervised Approach,” EURASIP Journal on Advances in Signal Processing, 2008.
  15. Lin, C-T, Huang, K-C, Chao, C-F, Chen, J-A, Chiu, T-W, Ko, L-W, Jung, T-P. “Tonic and phasic EEG and behavioral changes induced by arousing feedback,” NeuroImage, 52: 633–42, 2010.
  16. Chuang, S-W., Ko, L-W., Lin, Y-P, Huang, R-S., Jung, T-P., Lin, C-T., Co-modulatory Spectral Changes in Independent Brain Processes Are Correlated with Task Performance, NeuroImage, 62: 1469-77, 2012.
  17. Lin, C-T., Huang, K-C., Chuang, C-H., Ko, L-W., Jung, T-P., Can Arousing Feedback Rectify Lapses in Driving? Prediction from EEG Power Spectrum, Journal of Neural Engineering, 10, 2013.
  18. Chuang, C-H., Lin, Y-P., Ko, L-W., Jung, T-P., Lin, C-T. Independent Component Ensemble of EEG for Brain-Computer Interface, IEEE Trans Neural System and Rehabilitation Engineering, 22(2): 230-8, 2014.
  19. Chuang CH, Ko LW, Jung TP, Lin CT., Kinesthesia in a sustained-attention driving task, NeuroImage, 2014. doi: 10.1016/j.neuroimage.2014.01.015.
  20. Wang, Y-T., Huang, K-C., Wei, C-S., Huang, T-Y., Ko, L-W., Lin, C-T., Cheng, C-K., Jung, T-P. Developing an EEG-based On-line Closed-loop Lapse Detection and Mitigation System, Frontiers in Neuroscience, 8: article 321, 2014. doi: 10.3389/fnins.2014.00321.
  21. Lin, C-T, Wang, Y-K., Chen, S-A., Jung, T-P., “EEG-Based Attention Tracking during Distracted Driving,” IEEE Transactions on Neural Systems & Rehabilitation Engineering, 23(6):1085-94, 2015.
  22. Lin, C-T., Chuang, C-H., Kerick, S., Mullen, T., Jung, T-P., Ko, L-W., Chen, S-A., King, J-T., and McDowell, K. Mind-Wandering Tends to Occur under Low Perceptual Demands during Driving, Scientific Reports, 2016.
  23. Huang, K. C., Huang, T-Y., Chuang, C-H., King, J-T., Lin, C-T. & Jung, T-P.  “An EEG-based Fatigue Prediction and Mitigation System,” International Journal of Neural Systems, 2016.
  24. Li, J., Wang, Y., Zhang, L., Cichocki, A., Jung, T-P. “Decoding EEG in Cognitive Tasks with Time-Frequency and Connectivity Masks,”  IEEE Transactions on Cognitive and Developmental Systems, 2016. DOI 10.1109/TCDS.2016.2555952.
  25. Wu, D., Chin, Y-T., Chuang, C-H, Lin, C-T., Jung, T-P., Spatial Filtering for EEG-Based Regression Problems in Brain-Computer Interface (BCI), IEEE Transactions on Fuzzy Systems, in press.
  26. Hsu, S-H. and Jung, T-P., Monitoring alert and drowsy states by modeling EEG source nonstationarity, Journal of Neural Engineering, 14(5):056012, Oct. 2017.
  27. Ko, L-W., Komarov, O., Hairston, W.D., Jung, T-P., and Lin, C-T., Sustained Attention in Real Classroom Settings: An EEG Study, Frontiers in Human Neuroscience, 11:article 388, 2017.
  28. Wu, D., Lance, B.J., Lawhern, V.J., Gordon, S., Jung, T-P., Lin, C-T.,  EEG-Based User Reaction Time Estimation Using Riemannian Geometry Features, IEEE Trans. Neural Syst. Rehab. Eng., 25(11): 2157-68, 2017.
  29. Molina, E., Sanabria, D., Jung, T.-P., Correa, A. Electroencephalographic and peripheral temperature dynamics during a prolonged psychomotor vigilance task, Accident Analysis & Prevention, pii: S0001-4575(17)30370-6, 2017 Oct 20. DOI: 10.1016/j.aap.2017.10.014.
  30. Wei, C.-S, Wang, Y.-T., Lin, C.-T., Jung, T.-P., Toward Drowsiness Detection Using Non-Hair-Bearing EEG-Based Brain-Computer Interfaces, IEEE Trans. Neural Syst. Rehab. Eng., 26(2): 400-6, 2018.
  31. Xu, M., Xiao, X., Wang, Y., Jung, T.-P., Ming, D., A brain computer interface based on miniature event-related potentials induced by very small lateral visual stimuli, IEEE Trans. Biomed. Eng., 66(5): 1166-75, 2018.
  32. Wang, Y.-K., Jung, T.-P., Lin, C.-T. Theta and Alpha Oscillations in Attentional Interaction during Distracted Driving, Frontiers in Behavioral Neuroscience 12, 3, 2018.   
  33. Wei, C-S., Lin, Y-P., Wang, Y-T., Lin, C-T., Jung, T-P., Subject-Transfer Framework for Obviating Inter- and Intra-Subject Variability in EEG-Based Drowsiness Detection, NeuroImage, 174: 407-19, 2018.
  34. Hsu, S-H., Pion-Tonachini, L., Palmer, J., Miyakoshi, M., Makeig, S., Jung, T-P., Modeling brain dynamic state changes with adaptive mixture independent component analysis, NeuroImage, in press.