
Early brain responses to affective faces: A simultaneous EEG-fMRI study

Published on Sep 1, 2018 in NeuroImage (journal impact factor: 5.812) · DOI: 10.1016/j.neuroimage.2018.05.081
Miriam Müller-Bardorff (WWU: University of Münster), Maximilian Bruchmann (WWU: University of Münster), + 5 authors, Thomas Straube (WWU: University of Münster)
Abstract
The spatio-temporal neural basis of the earliest differentiation between emotional and neutral facial expressions is a matter of debate. The present study used concurrent electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) in order to investigate the ‘when’ and ‘where’ of the earliest prioritization of emotional over neutral expressions. We measured event-related potentials (ERPs) and blood oxygen level-dependent (BOLD) signal changes in response to facial expressions of varying emotional intensity and different valence categories. Facial expressions were presented with two horizontal bars superimposed on them, and participants performed a focal task on the bars (low load, high load) in order to manipulate the availability of attentional resources during face perception. EEG data revealed the earliest expression effects in the P1 range (76–128 ms) as a parametric function of stimulus arousal, independent of load conditions. Conventional fMRI data analysis also demonstrated significant modulations as a function of stimulus arousal, independent of load, in the amygdala, superior temporal sulcus, fusiform gyrus and lateral occipital cortex. Correspondingly, EEG-informed fMRI analysis revealed a significant positive correlation between single-trial P1 amplitudes and BOLD responses in the amygdala and lateral posterior occipital cortex. Our results are in line with the hypothesis of the amygdala as a fast-responding relevance detector, with corresponding effects in early visual face-processing areas across facial expressions and load conditions.
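To make the EEG-informed fMRI analysis concrete, the sketch below uses each trial's P1 amplitude as a parametric modulator in a voxel-wise general linear model. It is a minimal illustration, not the authors' pipeline: the trial onsets, single-trial P1 amplitudes, BOLD time series and the simple double-gamma HRF are all assumed toy values.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)

# --- Illustrative inputs (assumptions, not the study's data) ---
TR = 2.0                                    # repetition time (s)
n_scans = 300                               # number of fMRI volumes
onsets = np.arange(10.0, 580.0, 12.0)       # hypothetical trial onsets (s)
p1_amps = rng.standard_normal(onsets.size)  # hypothetical single-trial P1 amplitudes
bold = rng.standard_normal(n_scans)         # hypothetical voxel/ROI time series

# Canonical double-gamma HRF sampled at the TR (textbook parameters)
t = np.arange(0.0, 32.0, TR)
hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
hrf /= hrf.sum()

def onset_regressor(onsets, weights, n_scans, TR, hrf):
    """Weighted stick function at trial onsets, convolved with the HRF."""
    sticks = np.zeros(n_scans)
    sticks[np.round(onsets / TR).astype(int)] = weights
    return np.convolve(sticks, hrf)[:n_scans]

# Unmodulated task regressor plus a P1-weighted (mean-centred) parametric modulator
task = onset_regressor(onsets, np.ones(onsets.size), n_scans, TR, hrf)
p1_mod = onset_regressor(onsets, p1_amps - p1_amps.mean(), n_scans, TR, hrf)

# GLM: BOLD ~ intercept + task + P1 modulator (ordinary least squares)
X = np.column_stack([np.ones(n_scans), task, p1_mod])
betas, *_ = np.linalg.lstsq(X, bold, rcond=None)
print(f"P1-informed parametric beta: {betas[2]:.3f}")
```

Fitted voxel by voxel, the P1-modulator beta indexes how strongly trial-to-trial fluctuations in the P1 co-vary with the BOLD response, which is conceptually the correlation the abstract reports for the amygdala and lateral posterior occipital cortex.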
References (104)
M. Justin Kim, Alison M. Mattek, …, Paul J. Whalen (Dartmouth College); 6 authors
Human amygdala function has been traditionally associated with processing the affective valence (positive versus negative) of an emotionally charged event, especially those that signal fear or threat. However, this account of human amygdala function can be explained by alternative views, which posit that the amygdala might be tuned to either (a) more general emotional arousal (more relevant versus less relevant) or (b) more specific emotion categories (fear versus happy). Delineating the pure ef...
6 Citations
Wiebke Hammerschmidt, …, Annekathrin Schacht (GAU: University of Göttingen); 3 authors
Facial expressions of emotion have an undeniable processing advantage over neutral faces, discernible both at the behavioral level and in emotion-related modulations of several event-related potentials (ERPs). Recently, it was proposed that inherently neutral stimuli might also gain salience through associative learning mechanisms. The present study investigated whether acquired motivational salience leads to processing advantages similar to biologically determined origins of inherent emotio...
17 Citations
Roxane J. Itier, Karly N. Neath-Tavares (UW: University of Waterloo)
Task demands shape how we process environmental stimuli, but their impact on the early neural processing of facial expressions remains unclear. In a within-subject design, ERPs were recorded to the same fearful, happy and neutral facial expressions presented during a gender discrimination task, an explicit emotion discrimination task and an oddball detection task, the most studied tasks in the field. Using an eye tracker, fixation on the nose of the face was enforced using a gaze-contingent presentation...
12 Citations
Huiyan Lin, Miriam Mueller-Bardorff, …, Thomas Straube (WWU: University of Münster); 7 authors
For several stimulus categories (e.g., pictures, odors and words), the arousal of both negative and positive stimuli has been shown to modulate amygdalar activation. In contrast, previous studies did not observe similar amygdalar effects in response to negative and positive facial expressions of varying intensity. Reasons for this discrepancy may be related to analytical strategies, experimental design and stimuli. Therefore, the present study aimed at re-investigating wh...
5 Citations
Hannah M. Muench (University of Marburg), Stefan Westermann (University of Bern), …, Erik M. Mueller (University of Marburg); 5 authors
Anxiety states are characterized by attentional biases to threat and increased early brain responses to potentially threat-signaling stimuli. How such stimuli are processed further depends on prior learning experiences (e.g. conditioning and extinction) and the context in which a stimulus appears. Whether context information and prior learning experiences interact with early threat processing in humans is largely unknown. Here, EEG was recorded while healthy participants (N = 20) viewed...
7 Citations
Human intracranial amygdala recordings reveal fast-latency responses to broad and low, but not high, spatial frequency components of fearful, but not happy or neutral, faces, which are not observed with unpleasant scenes. Amygdala fearful face responses are faster than in fusiform cortex, supporting a phylogenetically old, subcortical pathway to human amygdala.
105 Citations
Miriam Müller-Bardorff, Claudia Schulz, …, Thomas Straube (WWU: University of Münster); 7 authors
Effects of emotional intensity and valence on visual event-related potentials (ERPs) are still poorly understood, in particular in the context of limited attentional resources. In the present EEG study, we investigated the effect of emotional intensity of different emotional facial expressions on P1, N170, early posterior negativity (EPN) and late positive potential (LPP) while varying the amount of available attentional resources. A new stimulus set comprising 90 full color pictures of neutral,...
12 Citations
Kristen A. Lindquist (UNC: University of North Carolina at Chapel Hill), Ajay B. Satpute (Pomona College), …, Lisa Feldman Barrett (NU: Northeastern University); 5 authors
The ability to experience pleasant or unpleasant feelings or to represent objects as "positive" or "negative" is known as representing hedonic "valence." Although scientists overwhelmingly agree that valence is a basic psychological phenomenon, debate continues about how to best conceptualize it scientifically. We used a meta-analysis of 397 functional magnetic resonance imaging (fMRI) and positron emission tomography studies (containing 914 experimental contrasts and 6827 participants) to test ...
153 Citations
Oshrit Arviv, Abraham Goldstein, …, Eva Gilboa-Schechtman (BIU: Bar-Ilan University); 6 authors
Deciphering the social meaning of facial displays is a highly complex neurological process. The M170, an event-related field component of MEG recordings, like its EEG counterpart, the N170, has repeatedly been shown to be associated with structural encoding of faces. However, the scope of information encoded during the M170 time window is still being debated. We investigated the neuronal origin of facial processing of integrated social rank cues (SRCs) and emotional facial expressions (EFEs) during the M17...
1 Citation
Josefien Huijgen, Vera Dinkelacker, …, Nathalie George (University of Paris); 9 authors
The amygdala is a key structure for monitoring the relevance of environmental stimuli. Yet, little is known about the dynamics of its response to primary social cues such as gaze and emotion. Here, we examined evoked amygdala responses to gaze and facial emotion changes in five epileptic patients with intracerebral electrodes. Patients first viewed a neutral face that would then convey social cues: it turned either happy or fearful with or without gaze aversion. This social cue was followed by a...
9 Citations
Cited By (7)
Maximilian Bruchmann, Sebastian Schindler, Thomas Straube (WWU: University of Münster)
Prioritized processing of fearful compared to neutral faces is reflected in behavioral advantages such as lower detection thresholds, but also in enhanced early and late event-related potentials (ERPs). Behavioral advantages have recently been associated with the spatial frequency spectrum of fearful faces, better fitting the human contrast sensitivity function than the spectrum of neutral faces. However, it is unclear whether and to which extent early and late ERP differences are due to low-lev...
Ilkka Muukkonen (UH: University of Helsinki), Kaisu Ölander (UH: University of Helsinki), …, Viljami Salmela (Aalto University); 4 authors
The temporal and spatial neural processing of faces has been investigated rigorously, but few studies have unified these dimensions to reveal the spatio-temporal dynamics postulated by the models of face processing. We used support vector machine decoding and representational similarity analysis to combine information from different locations (fMRI), time windows (EEG), and theoretical models. By correlating representational dissimilarity matrices (RDMs) derived from multiple pairwise c...
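For orientation, the EEG-fMRI fusion step described in this entry boils down to correlating representational dissimilarity matrices (RDMs) across modalities. The sketch below is a minimal, assumed-data illustration of that step (toy condition counts, feature dimensions and distance metrics; none of it is the authors' data or code):

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical response patterns: 12 face conditions x features
fmri_patterns = rng.standard_normal((12, 200))  # e.g. voxels in one region
eeg_patterns = rng.standard_normal((12, 64))    # e.g. electrodes in one time window

# RDM = pairwise dissimilarity (1 - Pearson r) between condition patterns,
# kept as the condensed upper-triangle vector returned by pdist
fmri_rdm = pdist(fmri_patterns, metric="correlation")
eeg_rdm = pdist(eeg_patterns, metric="correlation")

# Fusion: rank-correlate the two RDMs; repeating this over EEG time windows
# and fMRI regions yields a spatio-temporal correspondence map
rho, p = spearmanr(fmri_rdm, eeg_rdm)
print(f"EEG-fMRI RDM correlation: rho = {rho:.3f} (p = {p:.3f})")
```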
Ezgi Fide, Derya Durusu Emek-Savaş, …, Görsev Yener (Dokuz Eylül University); 6 authors
Objectives: The present study aims to evaluate the amplitude and latency of event-related potentials (ERPs) P100, N170, VPP and N230 in individuals with Alzheimer's disease (AD) compared to healthy elderly controls, using a passive viewing task of emotional facial expressions. Methods: Twenty-four individuals with mild to moderate AD and 23 demographically matched healthy elderly controls were included in the study. ERP P100, N170, VPP and N230 amplitude and latency values were compared b...
1 Citation
Nicolas Burra (University of Geneva), Inês Mares (Birkbeck, University of London), Atsushi Senju (Birkbeck, University of London)
Gaze or eye contact is one of the most important non-verbal social cues, which is fundamental to human social interactions. To achieve real time and dynamic face-to-face communication, our brain needs to process another person's gaze direction rapidly and without explicit instruction. In order to explain fast and spontaneous processing of direct gaze, the fast-track modulator model (Senju & Johnson, 2009) was proposed. Here, we review recent developments in gaze processing research in the last d...
Annett Schirmer, Maria Teresa Wijaya, …, Trevor B. Penney (CUHK: The Chinese University of Hong Kong); 4 authors
This pre-registered event-related potential study explored how vocal emotions shape visual perception as a function of attention and listener sex. Visual task displays occurred in silence or with a neutral or an angry voice. Voices were task-irrelevant in a single-task block, but had to be categorized by speaker sex in a dual-task block. In the single task, angry voices increased the occipital N2 component relative to neutral voices in women, but not men. In the dual task, angry voices relativ...
Elias Ebrahimzadeh (UT: University of Tehran), Hamid Soltanian-Zadeh (UT: University of Tehran), …, Jafar Mehvari Habibabadi (IUMS: Isfahan University of Medical Sciences); 5 authors
Background: Simultaneous EEG-fMRI experiments record the spatiotemporal dynamics of epileptic activity. A shortcoming of spike-based EEG-fMRI studies is their inability to provide information about the behavior of epileptic generators when no spikes are visible. New method: We extract time series of epileptic components identified on EEG and fit them with a Generalized Linear Model (GLM). This allows a precise and reliable localization of epileptic foci in addition to predicting the generator's b...
3 Citations
Sebastian Schindler, Maximilian Bruchmann, …, Thomas Straube (WWU: University of Münster); 4 authors
In neuroscientific studies, the naturalness of face presentation differs; a third of published studies makes use of close-up full-coloured faces, a third uses close-up grey-scaled faces and another third employs cutout grey-scaled faces. Whether and how these methodological choices affect emotion-sensitive components of the event-related brain potentials (ERPs) is yet unclear. Therefore, this pre-registered study examined ERP modulations to close-up full-coloured and grey-scaled faces as well ...
2 Citations
Fuqian Shi (WMU: Wenzhou Medical College), Nilanjan Dey (Techno India College of Technology), …, R. Simon Sherratt (University of Reading); 5 authors
Traditional KANSEI methodology is an important tool in psychology for comprehending concepts and meanings; it mainly focuses on semantic differential methods. Valence-arousal is regarded as a reflection of the KANSEI adjectives, which is the core concept in the theory of effective dimensions for brain recognition. Previous studies have found that brain fMRI datasets can contain significant information related to valence and arousal. In the current work, a valence-arous...
2 Citations
Ilkka Muukkonen, Kaisu Ölander, …, Viljami Salmela (UH: University of Helsinki); 4 authors
Temporal and spatial neural processing of faces has been investigated rigorously, but few studies have unified these dimensions to reveal the spatio-temporal dynamics postulated by the models of face processing. We used support vector machine decoding and representational similarity analysis to combine information from different locations (fMRI), timepoints (EEG), and theoretical models. By correlating information matrices derived from pair-wise decodings of neural responses to different facial...