
EEG and fNIRS in Human Factors and User Experience Research

  • Writer: Mohsen Rafiei
  • Nov 11, 2025
  • 7 min read




Understanding how people think, feel, and perform while interacting with systems requires methods that can capture the brain and body in real time without disrupting the task itself. In Human Factors (HF) and User Experience (UX) research, tools like Electroencephalography (EEG), Functional Near-Infrared Spectroscopy (fNIRS), eye tracking, and Galvanic Skin Response (GSR) provide powerful ways to study attention, emotion, and cognitive load as users engage with technology.

Having worked extensively with these tools during my PhD, postdoctoral research, and now in my labs, I’ve seen firsthand how combining them can uncover layers of user behavior that are invisible to standard UX testing. I know this is not a feasible approach for 99.99% of UX studies: these methods are complex, expensive, and require technical expertise. But for those who can use them properly, they are game changers. When these physiological measures are combined with behavioral data and verbal reports such as interviews, they form a gold mine of insight into how people perceive, process, and emotionally respond to experiences.

These techniques are especially valuable in applied Human Factors and UX fields where understanding cognitive load, attention, and emotional engagement directly impacts safety and design. They have proven particularly practical in areas such as aviation, automotive interfaces, healthcare systems, virtual and augmented reality (VR/AR) environments, adaptive training, and high-stakes control rooms, where operator attention, stress, or fatigue can determine performance outcomes. In such settings, EEG and fNIRS allow researchers to evaluate how users perceive, decide, and adapt in real time, revealing cognitive and emotional states that behavioral data alone cannot capture.

These methods offer complementary windows into brain function: EEG measures electrical activity with millisecond precision, while fNIRS measures blood-oxygen changes that reflect localized cortical activation. Together, they allow researchers to move beyond subjective reports and observe the cognitive and emotional states that drive behavior.


What EEG Measures


EEG records the brain’s electrical activity through electrodes placed on the scalp. The signal primarily reflects postsynaptic potentials generated by synchronized populations of cortical neurons. Because electrical changes happen almost instantly, EEG provides exceptionally high temporal resolution, making it ideal for studying fast, transient cognitive events such as attention shifts, perception, or decision making.


One of the most powerful approaches in EEG research is the Event-Related Potential (ERP) method. ERPs are specific patterns of brain activity that occur in response to a defined stimulus or event. They are extracted by averaging EEG signals across many trials, aligning them in time to the stimulus onset. This averaging process filters out background electrical noise and isolates neural responses associated with particular cognitive operations. For instance, the P300 component, a positive voltage deflection occurring roughly 300 milliseconds after a target stimulus, is commonly interpreted as a neural signature of attention and decision making. Similarly, the N170 is linked to face perception, and the Late Positive Potential (LPP) indexes emotional engagement. By examining these ERP components, researchers can determine precisely when a cognitive process occurs and how its magnitude changes with task difficulty, fatigue, or design features.
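
To make the averaging idea concrete, here is a minimal Python/NumPy sketch of stimulus-locked epoch averaging. It assumes a continuous multichannel EEG array, a list of stimulus onset samples, and a known sampling rate; in practice, toolboxes such as MNE-Python handle epoching, baseline correction, and artifact rejection with far more care.

```python
import numpy as np

def extract_erp(eeg, onsets, fs, tmin=-0.2, tmax=0.8):
    """Average stimulus-locked epochs to estimate an ERP waveform.

    eeg    : array of shape (n_channels, n_samples), continuous EEG
    onsets : stimulus onset positions, in samples
    fs     : sampling rate in Hz
    tmin, tmax : epoch window around each onset, in seconds
    """
    pre, post = int(tmin * fs), int(tmax * fs)
    epochs = []
    for onset in onsets:
        start, stop = onset + pre, onset + post
        if start < 0 or stop > eeg.shape[1]:
            continue  # skip epochs that extend past the recording
        epoch = eeg[:, start:stop]
        # baseline-correct using the pre-stimulus interval
        baseline = epoch[:, :-pre].mean(axis=1, keepdims=True)
        epochs.append(epoch - baseline)
    # averaging across trials suppresses activity not time-locked to the event
    return np.mean(epochs, axis=0)

# With fs = 250 Hz and tmin = -0.2 s, the P300 window (~300 ms post-stimulus)
# sits near column int((0.3 - tmin) * fs) = 125 of the returned array.
```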


Beyond ERPs, EEG can also be analyzed in terms of oscillatory frequency bands, such as theta, alpha, beta, and gamma rhythms. Each band reflects a different aspect of brain function: theta activity often indicates working memory load, alpha suppression relates to focused attention, and beta rhythms are associated with sensorimotor activity or active engagement. Together, ERP and frequency-domain analyses provide a rich and detailed view of user cognition.
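
As an illustration of the frequency-domain side, the sketch below estimates band power with Welch's method from SciPy. The band boundaries shown are common conventions rather than fixed standards, and the single-channel signal and sampling rate are assumed inputs.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

# Conventional frequency bands (Hz); exact boundaries vary across labs.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs):
    """Estimate average power in each band for one EEG channel."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))  # 2-second segments
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = trapezoid(psd[mask], freqs[mask])  # integrate PSD over the band
    return powers
```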


The trade-off, however, is that EEG’s spatial resolution is limited. The skull and scalp spread electrical signals, so EEG cannot precisely locate the source of activity within the brain. The method is also sensitive to noise from eye movements, blinks, and muscle activity. Nonetheless, EEG’s ability to capture moment-to-moment neural dynamics makes it a cornerstone for assessing when cognitive processes occur.


What fNIRS Measures


fNIRS, on the other hand, is an optical imaging technique that measures how light absorption changes as blood oxygenation levels fluctuate in the cortex. When a brain region becomes active, it consumes more oxygen, and local blood flow increases to replenish it. fNIRS detects these changes by emitting near-infrared light into the scalp and measuring the light that returns to nearby detectors. From these readings, researchers infer changes in oxygenated (HbO) and deoxygenated hemoglobin (HbR) concentrations. These changes represent the hemodynamic response, a slower but spatially informative signal that complements EEG’s rapid electrical information.
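
The step from measured light changes to hemoglobin concentrations is usually done with the modified Beer-Lambert law. The sketch below shows the core calculation under simplifying assumptions: optical-density changes at two wavelengths, plus extinction coefficients and differential pathlength factors supplied by the user (real pipelines take these constants from published tables and account for channel geometry and signal quality).

```python
import numpy as np

def mbll(delta_od, ext_coeffs, distance_cm, dpf):
    """Modified Beer-Lambert law: convert optical-density changes at two
    wavelengths into changes in HbO and HbR concentration.

    delta_od    : array of shape (2, n_samples), delta OD per wavelength
    ext_coeffs  : 2x2 extinction-coefficient matrix,
                  rows = wavelengths, columns = (HbO, HbR); values come
                  from published spectra for the chosen wavelengths
    distance_cm : source-detector separation
    dpf         : differential pathlength factor per wavelength
    """
    pathlength = distance_cm * np.asarray(dpf)  # effective path per wavelength
    # delta_OD = E @ delta_c * pathlength  =>  delta_c = inv(E) @ (delta_OD / pathlength)
    delta_c = np.linalg.inv(np.asarray(ext_coeffs)) @ (delta_od / pathlength[:, None])
    delta_hbo, delta_hbr = delta_c
    return delta_hbo, delta_hbr
```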

While fNIRS cannot match EEG’s temporal resolution (its signal peaks several seconds after the underlying neural activity), it provides better spatial localization for surface cortical areas, such as the prefrontal cortex. Another advantage is its robustness to movement: unlike fMRI or high-density EEG, fNIRS systems tolerate head motion, making them suitable for more naturalistic tasks such as driving simulations, collaborative work, or classroom studies. However, fNIRS measures only the outer layers of the cortex, as near-infrared light cannot penetrate deeply into the brain.


Applications in Human Factors and UX


In Human Factors and UX research, EEG and fNIRS are valuable for studying how users engage cognitively and emotionally with interfaces, systems, or environments. They allow researchers to quantify mental states that are otherwise inferred from performance metrics or self-reports. For instance, cognitive workload, the mental effort required to perform a task, can be evaluated by monitoring EEG rhythms or fNIRS activation in the prefrontal cortex. When a user’s workload exceeds optimal levels, performance tends to degrade. Measuring these patterns helps designers balance task demands, interface complexity, and automation.
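
One widely used heuristic for EEG-based workload is the ratio of frontal theta to alpha power, which tends to rise as demands increase. The sketch below tracks that index over a sliding window; the channel choice, window length, and any threshold for "overload" are assumptions that would need validation for a specific task.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

def theta_alpha_ratio(frontal_eeg, fs):
    """Theta/alpha power ratio for one frontal EEG channel. Higher values
    are often read as higher cognitive workload, but this is a heuristic
    correlate, not a direct measure of mental effort."""
    freqs, psd = welch(frontal_eeg, fs=fs, nperseg=int(2 * fs))
    theta_band = (freqs >= 4) & (freqs < 8)
    alpha_band = (freqs >= 8) & (freqs < 13)
    theta = trapezoid(psd[theta_band], freqs[theta_band])
    alpha = trapezoid(psd[alpha_band], freqs[alpha_band])
    return theta / alpha

def workload_over_time(frontal_eeg, fs, window_s=10.0, step_s=5.0):
    """Compute the index over a sliding window to see how it evolves as
    task demands change during a session."""
    win, step = int(window_s * fs), int(step_s * fs)
    return [theta_alpha_ratio(frontal_eeg[i:i + win], fs)
            for i in range(0, len(frontal_eeg) - win + 1, step)]
```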


EEG is particularly powerful for capturing attentional dynamics. ERP components like the P300 reveal when a user detects an important signal, while other markers indicate lapses in attention or mind wandering. Such measures are invaluable in safety-critical domains like aviation, medical monitoring, or automotive design. In contrast, fNIRS excels in studying sustained attention or long-term mental engagement. By tracking gradual increases in oxygenation, researchers can assess how concentration fluctuates during extended operations or monotonous tasks.


Emotion and stress are also key dimensions of user experience that EEG and fNIRS can illuminate. EEG measures of frontal alpha asymmetry have long been associated with emotional valence (positive versus negative), while the Late Positive Potential (LPP) is sensitive to emotional intensity. fNIRS complements these findings by capturing sustained activation in prefrontal and temporal regions during emotional appraisal and regulation. In applied contexts, such as virtual reality environments or customer experience research, these measures help evaluate whether designs evoke the intended affective states, such as engagement, trust, or frustration.
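
A common way to quantify frontal alpha asymmetry is the difference in log alpha power between homologous right and left frontal electrodes (typically F4 and F3). The sketch below shows that calculation under those conventional assumptions.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

def frontal_alpha_asymmetry(left_frontal, right_frontal, fs):
    """Frontal alpha asymmetry: ln(right alpha power) - ln(left alpha power).

    Alpha power is inversely related to cortical activation, so more positive
    scores are conventionally read as relatively greater left-frontal
    activation, often linked to positive / approach-oriented affect.
    Electrode pairing (e.g., F3/F4) and band limits follow common conventions.
    """
    def alpha_power(x):
        freqs, psd = welch(x, fs=fs, nperseg=int(2 * fs))
        band = (freqs >= 8) & (freqs < 13)
        return trapezoid(psd[band], freqs[band])

    return np.log(alpha_power(right_frontal)) - np.log(alpha_power(left_frontal))
```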


Beyond individual users, fNIRS supports social and collaborative research through a method known as hyperscanning, in which two or more participants’ brain activity is recorded simultaneously. This approach allows researchers to examine interpersonal neural synchrony, a physiological marker of shared attention, empathy, or cooperation. It has proven particularly useful in studying teamwork, training, and human-AI collaboration. Meanwhile, EEG continues to play a central role in brain–computer interfaces (BCIs), enabling adaptive systems that adjust difficulty, feedback, or automation levels based on real-time cognitive states.
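
As a simplified illustration of how synchrony might be quantified, the sketch below computes windowed correlations between two participants' HbO time series from matched fNIRS channels. Many hyperscanning studies use wavelet transform coherence instead, so treat this only as a rough proxy.

```python
import numpy as np

def interbrain_synchrony(hbo_a, hbo_b, fs, window_s=30.0):
    """Windowed Pearson correlation between two participants' HbO series
    from homologous channels, as a simple synchrony proxy.

    hbo_a, hbo_b : 1D arrays of equal length, HbO from matched channels
    """
    win = int(window_s * fs)
    n = min(len(hbo_a), len(hbo_b))
    sync = []
    for start in range(0, n - win + 1, win):
        a = hbo_a[start:start + win]
        b = hbo_b[start:start + win]
        sync.append(np.corrcoef(a, b)[0, 1])
    return np.array(sync)  # one synchrony value per window
```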


Importantly, researchers are increasingly combining EEG and fNIRS to capture a more complete picture of human cognition and emotion. EEG’s millisecond precision reveals the timing of neural events, while fNIRS shows which cortical regions sustain the effort or emotion over time. This hybrid approach, sometimes called multimodal neuroimaging, has become a powerful strategy in neuroergonomics and UX research. For example, during a high-stakes decision task, EEG might detect when a user becomes overloaded, and fNIRS could identify which prefrontal areas are compensating to maintain performance. Together, they provide both temporal and spatial insight into how people think and feel while interacting with technology.


Questions EEG and fNIRS Can Answer


In practice, EEG and fNIRS can answer a wide range of HF and UX research questions:

  • How does task complexity affect cognitive load and fatigue over time?

  • Do users detect critical alerts or interface changes quickly enough to maintain safety?

  • Which interface design minimizes stress or frustration under time pressure?

  • Are team members neurally synchronized during collaboration or training?

  • How immersive and emotionally engaging is a virtual environment?


These questions move beyond subjective preferences to uncover how the human brain responds to design and interaction in real time. EEG helps determine when critical cognitive events occur, while fNIRS helps identify where in the cortex sustained engagement or emotional regulation takes place.


Limitations and Practical Challenges


Despite their promise, both modalities have limitations that must be carefully managed in applied research. EEG offers high temporal precision but poor spatial localization and is prone to contamination by muscle or eye artifacts. Its setup can also be time-consuming, requiring careful electrode placement and impedance checks. fNIRS, by contrast, provides stable, spatially localized signals but with slower timing and shallow depth penetration. It is influenced by physiological factors like scalp blood flow or respiration, which can mask cortical activity if not properly controlled.


Another common challenge in HF and UX contexts is balancing ecological validity with data quality. Real-world environments such as cockpits, vehicles, or workplaces introduce movement and noise that complicate data collection. Researchers must design protocols that preserve realism while maintaining analyzable signals. Combining EEG and fNIRS, sometimes referred to as a hybrid approach, can help overcome individual weaknesses by merging EEG’s rapid electrical signals with fNIRS’s spatially resolved hemodynamic information. This fusion improves classification accuracy in workload detection and adaptive interface studies but requires careful synchronization and more complex data processing.
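
A minimal sketch of feature-level fusion is shown below: band-power features from EEG and hemodynamic features from fNIRS are concatenated for each task epoch and fed to a cross-validated linear classifier using scikit-learn. The feature matrices and workload labels are assumed to come from an earlier processing stage; they are placeholders, not part of any particular toolbox.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def fused_workload_classifier(eeg_features, fnirs_features, labels):
    """Feature-level fusion: concatenate EEG and fNIRS features per epoch
    and estimate workload-classification accuracy with cross-validation.

    eeg_features   : (n_epochs, n_eeg_features), e.g., band powers per channel
    fnirs_features : (n_epochs, n_fnirs_features), e.g., mean HbO/HbR change
    labels         : (n_epochs,), e.g., 0 = low workload, 1 = high workload
    """
    X = np.hstack([eeg_features, fnirs_features])
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    scores = cross_val_score(clf, X, labels, cv=5)
    return scores.mean()  # average accuracy across folds
```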


Finally, both techniques require clear interpretation boundaries. Neural metrics are correlates, not direct measures, of mental states. In HF and UX research, it is crucial to triangulate physiological data with behavioral performance and subjective reports to draw valid conclusions. Proper experimental controls, baseline recordings, and artifact correction pipelines are essential to ensure reliability.


EEG and fNIRS have transformed the ability of HF and UX researchers to study the mind in motion. EEG captures the rapid electrical pulses that underlie perception, attention, and decision making, providing insight into the timing of cognitive events. fNIRS captures slower, spatially specific blood-oxygen changes linked to sustained effort, emotional engagement, and cortical activation patterns. Each tool addresses different questions: EEG answers when something happens, fNIRS answers where and for how long. Used together, they form a powerful multimodal framework for designing safer, more efficient, and more emotionally resonant systems. When combined with thoughtful task design and complementary behavioral data, these technologies bridge the gap between brain science and practical human-centered design.


 
 
 
