Similar Literature
20 similar records retrieved.
1.
Functional magnetic resonance imaging indicates that observation of the human body induces a selective activation of a lateral occipitotemporal cortical area called extrastriate body area (EBA). This area is responsive to static and moving images of the human body and parts of it, but it is insensitive to faces and stimulus categories unrelated to the human body. With event-related repetitive transcranial magnetic stimulation, we tested the possible causal relation between neural activity in EBA and visual processing of body-related, nonfacial stimuli. Facial and noncorporeal stimuli were used as a control. Interference with neural activity in EBA induced a clear impairment, consisting of a significant increase in discriminative reaction time, in the visual processing of body parts. The effect was selective for stimulus type, because it affected responses to nonfacial body stimuli but not to noncorporeal and facial stimuli, and for locus of stimulation, because the effect from the interfering stimulation of EBA was absent during a corresponding stimulation of primary visual cortex. The results provide strong evidence that neural activity in EBA is not only correlated with but also causally involved in the visual processing of the human body and its parts, except the face.

2.
Many studies have linked the processing of different object categories to specific event-related potentials (ERPs) such as the face-specific N170. Despite reports showing that object-related ERPs are influenced by visual stimulus features, there is consensus that these components primarily reflect categorical aspects of the stimuli. Here, we re-investigated this idea by systematically measuring the effects of visual feature manipulations on ERP responses elicited by both structure-from-motion (SFM)-defined and luminance-defined object stimuli. SFM objects elicited a novel component at 200-250 ms (N250) over parietal and posterior temporal sites. We found, however, that the N250 amplitude was unaffected by restructuring SFM stimuli into meaningless objects based on identical visual cues. This suggests that this N250 peak was not uniquely linked to categorical aspects of the objects, but is strongly determined by visual stimulus features. We provide strong support for this hypothesis by parametrically manipulating the depth range of both SFM- and luminance-defined object stimuli and showing that the N250 evoked by SFM stimuli as well as the well-known N170 to static faces were sensitive to this manipulation. Importantly, this effect could not be attributed to compromised object categorization in low depth stimuli, confirming a strong impact of visual stimulus features on object-related ERP signals. As ERP components linked with visual categorical object perception are likely determined by multiple stimulus features, this creates an interesting inverse problem when deriving specific perceptual processes from variations in ERP components.

3.
The current study examined the time course of implicit processing of distinct facial features and the associated event-related potential (ERP) components. To this end, we used a masked priming paradigm to investigate implicit processing of the eyes and mouth in upright and inverted faces, using a prime duration of 33 ms. Two types of prime-target pairs were used: 1. congruent (e.g., open eyes only in both prime and target, or open mouth only in both prime and target); 2. incongruent (e.g., open mouth only in the prime and open eyes only in the target, or open eyes only in the prime and open mouth only in the target). The identity of the faces changed between prime and target. Participants pressed one button when the target face had the eyes open and another button when the target face had the mouth open. The behavioral results showed faster RTs for the eyes in upright faces than for the eyes in inverted faces and for the mouth in either upright or inverted faces. They also revealed a congruent priming effect for the mouth in upright faces. The ERP findings showed a face orientation effect across all ERP components studied (P1, N1, N170, P2, N2, P3) starting at about 80 ms, and a congruency/priming effect on late components (P2, N2, P3) starting at about 150 ms. Crucially, the results showed that the orientation effect was driven by the eye region (N170, P2) and that the congruency effect started earlier for the eyes (P2) than for the mouth (N2). These findings mark the time course of the processing of internal facial features and provide further evidence that the eyes are automatically processed and that they are very salient facial features that strongly affect the amplitude, latency, and distribution of neural responses to faces.

4.
Many people experience transient difficulties in recognizing faces, but only a small number of them cannot recognize their family members when meeting them unexpectedly. Such face blindness is associated with serious problems in everyday life. A better understanding of the neuro-functional basis of impaired face recognition may be achieved by a careful comparison with an equally unique object category and by adding a more realistic setting involving neutral faces as well as facial expressions. We used event-related functional magnetic resonance imaging (fMRI) to investigate the neuro-functional basis of perceiving faces and bodies in three developmental prosopagnosics (DP) and matched healthy controls. Our approach involved materials consisting of neutral faces and bodies as well as faces and bodies expressing fear or happiness. The first main result is that the presence of emotional information has a different effect in the patient vs. the control group in the fusiform face area (FFA). Neutral faces trigger lower activation in the DP group, compared to the control group, while activation for facial expressions is the same in both groups. The second main result is that compared to controls, DPs have increased activation for bodies in the inferior occipital gyrus (IOG) and for neutral faces in the extrastriate body area (EBA), indicating that body- and face-sensitive processes are less categorically segregated in DP. Taken together, our study shows the importance of using naturalistic emotional stimuli for a better understanding of developmental face deficits.

5.
Extensive research has demonstrated that several specialized cortical regions respond preferentially to faces. One such region, located in the inferior occipital gyrus, has been dubbed the occipital face area (OFA). The OFA is the first stage in two influential face-processing models, both of which suggest that it constructs an initial representation of a face, but how and when it does so remains unclear. The present study revealed that repetitive transcranial magnetic stimulation (rTMS) targeted at the right OFA (rOFA) disrupted accurate discrimination of face parts but had no effect on the discrimination of spacing between these parts. rTMS to left OFA had no effect. A matched part and spacing discrimination task that used house stimuli showed no impairment. In a second experiment, rTMS to rOFA replicated the face-part impairment but did not produce the same effect in an adjacent area, the lateral occipital cortex. A third experiment delivered double pulses of TMS separated by 40 ms at six periods after stimulus presentation during face-part discrimination. Accuracy dropped when pulses were delivered at 60 and 100 ms only. These findings indicate that the rOFA processes face-part information at an early stage in the face-processing stream.

6.
Face processing relies on a distributed, patchy network of cortical regions in the temporal and frontal lobes that respond disproportionately to face stimuli, other cortical regions that are not even primarily visual (such as somatosensory cortex), and subcortical structures such as the amygdala. Higher-level face perception abilities, such as judging identity, emotion and trustworthiness, appear to rely on an intact face-processing network that includes the occipital face area (OFA), whereas lower-level face categorization abilities, such as discriminating faces from objects, can be achieved without OFA, perhaps via the direct connections to the fusiform face area (FFA) from several extrastriate cortical areas. Some lesion, transcranial magnetic stimulation (TMS) and functional magnetic resonance imaging (fMRI) findings argue against a strict feed-forward hierarchical model of face perception, in which the OFA is the principal and common source of input for other visual and non-visual cortical regions involved in face perception, including the FFA, face-selective superior temporal sulcus and somatosensory cortex. Instead, these findings point to a more interactive model in which higher-level face perception abilities depend on the interplay between several functionally and anatomically distinct neural regions. Furthermore, the nature of these interactions may depend on the particular demands of the task. We review the lesion and TMS literature on this topic and highlight the dynamic and distributed nature of face processing.

7.
Transcranial Magnetic Stimulation (TMS) is an effective method for establishing a causal link between a cortical area and cognitive/neurophysiological effects. Specifically, by creating a transient interference with the normal activity of a target region and measuring changes in an electrophysiological signal, we can establish a causal link between the stimulated brain area or network and the electrophysiological signal that we record. If target brain areas are functionally defined with a prior fMRI scan, TMS can be used to link the fMRI activations with the recorded evoked potentials. However, conducting such experiments presents significant technical challenges, given the high-amplitude artifacts introduced into the EEG signal by the magnetic pulse and the difficulty of accurately targeting areas that were functionally defined by fMRI. Here we describe a methodology for combining these three common tools: TMS, EEG, and fMRI. We explain how to guide the stimulator's coil to the desired target area using anatomical or functional MRI data, how to record EEG during concurrent TMS, how to design an ERP study suitable for combined EEG-TMS, and how to extract reliable ERPs from the recorded data. We provide representative results from a previously published study, in which fMRI-guided TMS was used concurrently with EEG to show that the face-selective N1 and the body-selective N1 components of the ERP are associated with distinct neural networks in extrastriate cortex. This method allows us to combine the high spatial resolution of fMRI with the high temporal resolution of TMS and EEG and therefore obtain a comprehensive understanding of the neural basis of various cognitive processes.
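To make the last step concrete, here is a minimal NumPy sketch of one common way to extract an ERP from EEG recorded during concurrent TMS: a short window around each pulse is replaced by linear interpolation before epoching, baseline correction, and averaging. The variable names, window lengths, and interpolation approach are assumptions for illustration, not the pipeline described in the paper.

import numpy as np

def erp_with_tms_interpolation(data, events, fs, tmin=-0.2, tmax=0.5,
                               art_start=-0.002, art_end=0.010):
    """Average stimulus-locked epochs after linearly interpolating over the
    TMS pulse artifact. data: (n_channels, n_samples) EEG array; events:
    sample indices of stimulation/stimulus onsets (hypothetical inputs)."""
    data = data.copy()
    n_ch, n_samp = data.shape
    # Replace a short window around each pulse by linear interpolation.
    for ev in events:
        a = max(ev + int(art_start * fs), 1)
        b = min(ev + int(art_end * fs), n_samp - 2)
        t = np.arange(a, b)
        for ch in range(n_ch):
            data[ch, a:b] = np.interp(t, [a - 1, b], [data[ch, a - 1], data[ch, b]])
    # Epoch around each event, baseline-correct, and average.
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = np.stack([data[:, ev - pre:ev + post] for ev in events
                       if ev - pre >= 0 and ev + post <= n_samp])
    epochs = epochs - epochs[:, :, :pre].mean(axis=2, keepdims=True)
    return epochs.mean(axis=0)   # ERP, shape (n_channels, n_times)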

8.
There appears to be a significant disconnect between symptomatic and functional recovery in bipolar disorder (BD). Some evidence points to interepisode cognitive dysfunction. We tested the hypothesis that some of this dysfunction is related to emotional reactivity, which in euthymic bipolar subjects may affect cognitive processing. A modified emotional gender-categorization oddball task was used. The target was the gender (probability 25%) of faces with negative, positive, or neutral emotional expressions. The experiment had 720 trials (3 blocks × 240 trials each). Each stimulus was presented for 150 ms, and the EEG/ERP responses were recorded for 1,000 ms. The inter-trial interval was varied in the 1,100–1,500 ms range to avoid expectancy effects. The task took about 35 min to complete. There were 9 BD and 9 control subjects matched for age and gender. Reaction time (RT) was globally slower in BD subjects. The centro-parietal amplitudes of the N170, N200, P200, and P300 were generally smaller in the BD group compared to controls. Latencies were shorter to neutral and negative targets in BD. Frontal P200 amplitude was higher to emotional negative facial non-targets in BD subjects. The frontal N200 in response to positive facial emotion was less negative in BD subjects. The frontal P300 of BD subjects was lower to emotionally neutral targets. ERP responses to facial emotion in BD subjects differed significantly from those of normal controls. These variations are consistent with the common depressive symptomatology seen in long-term studies of bipolar subjects.
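For illustration only, the sketch below generates a trial list matching the numbers given above (3 blocks × 240 trials, 25% gender targets, 150 ms stimulus, 1,100–1,500 ms jittered inter-trial interval); the stimulus labels and randomization scheme are assumptions, not the authors' implementation.

import random

def build_block(n_trials=240, p_target=0.25, emotions=("negative", "positive", "neutral")):
    """One block of the emotional gender-categorization oddball task."""
    n_targets = int(n_trials * p_target)           # 25% rare gender targets
    is_target = [True] * n_targets + [False] * (n_trials - n_targets)
    random.shuffle(is_target)
    trials = []
    for target in is_target:
        trials.append({
            "target": target,                      # rare gender category or not
            "emotion": random.choice(emotions),    # facial expression (assumed equiprobable)
            "stim_ms": 150,                        # stimulus duration
            "iti_ms": random.randint(1100, 1500),  # jittered inter-trial interval
        })
    return trials

experiment = [build_block() for _ in range(3)]     # 3 x 240 = 720 trials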

9.
10.
Ten healthy volunteers underwent an auditory oddball event-related potential (ERP) paradigm. Single-trial 500 ms poststimulus ERPs (Pz, Cz, Fz, referenced to linked earlobes), along with the corresponding 1000 ms prestimulus EEG (O1-Cz), were stored. The EEG epochs were submitted to spectral analysis, and a slow-wave index (SWI = (delta + theta)/total power) was computed. Three selective ERP averages corresponding to low, medium, and high SWI were computed. N2 latency was longer and P3a amplitude was lower in high-SWI averages compared with low-SWI averages.
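Reading the index as the proportion of slow-wave (delta + theta) power in the prestimulus spectrum, a minimal sketch of the computation could look as follows; the band edges, sampling rate, and Welch parameters are assumptions for illustration, not values taken from the study.

import numpy as np
from scipy.signal import welch

def slow_wave_index(prestim_eeg, fs, delta=(0.5, 4.0), theta=(4.0, 8.0), total=(0.5, 30.0)):
    """SWI = (delta + theta power) / total power of a prestimulus epoch."""
    f, psd = welch(prestim_eeg, fs=fs, nperseg=min(len(prestim_eeg), int(fs)))
    def band_power(lo, hi):
        mask = (f >= lo) & (f < hi)
        return np.trapz(psd[mask], f[mask])
    return (band_power(*delta) + band_power(*theta)) / band_power(*total)

# Example: a 1000 ms prestimulus O1-Cz epoch sampled at 256 Hz (placeholder data).
fs = 256
print(slow_wave_index(np.random.randn(fs), fs))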

11.
Adaptation-related aftereffects (AEs) show how face perception can be altered by recent perceptual experiences. Along with contrastive behavioural biases, modulations of the early event-related potentials (ERPs) have typically been reported at the categorical level. Nevertheless, the role of the adaptor stimulus per se for face identity-specific AEs is not completely understood and was therefore investigated in the present study. Participants were adapted to faces (S1s) varying systematically on a morphing continuum between pairs of famous identities (identities A and B), or to Fourier phase-randomized faces, and had to match the subsequently presented ambiguous faces (S2s; 50/50% identity A/B) to one of the respective original faces. We found that S1s identical to or near the original identities led to strong contrastive biases, with more identity B responses following A adaptation and vice versa. In addition, the closer S1s were to the 50/50% S2 on the morphing continuum, the smaller the magnitude of the AE was. The relation between S1s and AE was, however, not linear. Additionally, stronger AEs were accompanied by faster reaction times. Analyses of the simultaneously recorded ERPs revealed categorical adaptation effects starting at 100 ms post-stimulus onset that were most pronounced at around 125–240 ms over occipito-temporal sites of both hemispheres. S1-specific amplitude modulations were found at around 300–400 ms. Response-specific analyses of ERPs showed reduced voltages starting at around 125 ms when the S1 biased perception in a contrastive way as compared to when it did not. Our results suggest that face identity AEs depend not only on physical differences between S1 and S2, but also on perceptual factors, such as the ambiguity of S1. Furthermore, short-term plasticity of face identity processing might work in parallel to object-category processing, and is reflected in the first 400 ms of the ERP.

12.
Basic emotions such as fear and anger prepare the brain to act adaptively. Hence, scenes representing emotional events are normally associated with characteristic adaptive behavior. Face and body representation areas in the brain are normally modulated by these emotions when they are expressed in the face or body. Here, we provide neuroimaging evidence (using functional magnetic resonance imaging) that the extrastriate body area (EBA) is highly responsive when subjects observe isolated faces presented in emotional scenes. This response of the EBA to threatening scenes in which no body is present gives rise to speculation about its function. We discuss the possibility that the brain reacts proactively to the emotional meaning of the scene.

13.
In children aged 7-8 and 9-10 years, ERP components were studied in a task comparing two non-verbalized visuo-spatial stimuli presented in succession with a 1.5-1.8 s interstimulus interval. We found age-related differences in the specific way, and the extent to which, cortical areas were involved in encoding the reference stimulus (the first stimulus in the pair) and in comparing the memory trace against the test stimulus. In both age groups, the sensory-specific N1 ERP component in the visual cortices had a larger amplitude during working memory than during free observation. Age-related differences in the processing of the sensory-specific parameters of a stimulus are most pronounced in the ERP to the test stimulus: in children aged 9-10, the amplitude of the N1 component increased significantly in all caudal leads, following an earlier increase in the P1 component in the inferior temporal and occipital areas. In children of that age, unlike children aged 7-8, an early involvement of the ventrolateral prefrontal cortex becomes apparent: in that area, an increase in positivity confined to 100-200 ms post-stimulus is observed. Substantial inter-group differences are observed in the late ERP components related to cognitive operations. In children aged 7-8, presenting either the reference or the test stimulus causes a significant increase in the amplitude of the late positive complex (LPC) in caudal leads, with the maximal increase observed in parietal areas at 300-800 ms post-stimulus. In children aged 9-10, some adult-like features of the late ERP components appear during the different stages of the working memory process: in fronto-central areas, the N400 component increases in response to the reference stimulus, whereas the LPC increases in response to the test stimulus. The data reported in this work show that an almost mature functional organization of working memory is already in place at the age of 9-10; however, the extent of prefrontal cortex involvement (especially of its dorsal areas) does not yet reach mature levels.

14.

Background

Adults with bipolar disorder (BD) have cognitive impairments that affect face processing and social cognition. However, it remains unknown whether these deficits in euthymic BD are accompanied by impaired brain markers of emotional processing.

Methodology/Principal Findings

We recruited twenty-six participants: 13 control subjects and an equal number of euthymic BD patients. We used an event-related potential (ERP) assessment of a dual valence task (DVT), in which faces (angry and happy), words (pleasant and unpleasant), and simultaneous face-word combinations are presented to test the effects of stimulus type (face vs. word) and valence (positive vs. negative). All participants received clinical, neuropsychological, and social cognition evaluations. ERP analysis revealed that both groups showed N170 modulation by stimulus type (face > word). BD patients exhibited reduced and enhanced N170 responses to facial and semantic valence, respectively. The estimated neural source of the N170 was a posterior section of the fusiform gyrus (FG), including the fusiform face area (FFA). Neural generators of the N170 for faces (FG and FFA) were reduced in BD. In these patients, N170 modulation was associated with social cognition (theory of mind).

Conclusions/Significance

This is the first report of euthymic BD patients exhibiting abnormal N170 emotional discrimination associated with theory of mind impairments.

15.
Rapid detection of evolutionarily relevant threats (e.g., fearful faces) is important for human survival. The ability to rapidly detect fearful faces exhibits high variability across individuals. The present study aimed to investigate the relationship between behavioral detection ability and brain activity, using both event-related potential (ERP) and event-related oscillation (ERO) measurements. Faces with fearful or neutral facial expressions were presented for 17 ms or 200 ms in a backward masking paradigm. Forty-two participants were required to discriminate the facial expressions of the masked faces. The behavioral sensitivity index d′ showed that the ability to detect rapidly presented and masked fearful faces varied across participants. ANOVAs showed that facial expression, hemisphere, and presentation duration affected the grand-mean ERP (N1, P1, and N170) and ERO (below 20 Hz, lasting from 100 ms to 250 ms post-stimulus, mainly in the theta band) brain activity. More importantly, the overall detection ability of the 42 subjects was significantly correlated with the emotion effect (i.e., fearful vs. neutral) on the ERP (r = 0.403) and ERO (r = 0.552) measurements. A higher d′ value corresponded to a larger emotional effect (i.e., fearful minus neutral) on N170 amplitude and a larger emotional effect on the specific ERO spectral power over the right hemisphere. The present results suggest a close link between behavioral detection ability and the N170 amplitude, as well as the ERO spectral power below 20 Hz, in individuals. The emotional effect size between fearful and neutral faces in brain activity may reflect the level of conscious awareness of fearful faces.
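For reference, the sensitivity index d′ from signal detection theory is the difference between the z-transformed hit and false-alarm rates. The sketch below applies a log-linear correction for extreme proportions, which is a common convention assumed here rather than a detail reported in the abstract.

from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate); 0.5 is added to each cell
    (log-linear correction, an assumed convention) to avoid infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Example: detecting masked fearful faces.
print(d_prime(hits=40, misses=10, false_alarms=15, correct_rejections=35))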

16.
When event-related potentials (ERPs) are used to study visual spatial attention, direct inspection of the ERP data shows that the main effect of spatial attention is a modulation of visual information processing. This modulation appears approximately 80-250 ms after stimulus onset and manifests mainly as a clear enhancement of the occipital P1, N1, and P2 waves without any change in their latencies. Using a synergetics-based spatiotemporal pattern decomposition method, the visual spatial attention ERP was decomposed into three pattern components. The results show that attention not only enhanced the first positive component (P11), the first negative component (N11), and the second positive component (P12) of pattern 1, but also shortened the latency of the first positive component (P31) of pattern 3. These phenomena are given a preliminary explanation with the spotlight model, indicating that this method is a promising new approach for studying attention-related ERPs.
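The synergetics-based decomposition itself is not specified in this abstract; as a rough stand-in that illustrates the idea of splitting an ERP into a small number of spatiotemporal pattern components, the sketch below uses a plain singular value decomposition on a channels × time matrix. It is an analogy for illustration, not the authors' method.

import numpy as np

def three_pattern_components(erp, n_components=3):
    """Split an ERP matrix (n_channels x n_times) into spatial patterns and
    their weighted time courses via SVD (a generic stand-in decomposition)."""
    u, s, vt = np.linalg.svd(erp, full_matrices=False)
    spatial = u[:, :n_components]                          # pattern topographies
    temporal = s[:n_components, None] * vt[:n_components]  # weighted time courses
    return spatial, temporal

# Example with placeholder data: 32 channels x 500 time samples.
spatial, temporal = three_pattern_components(np.random.randn(32, 500))
print(spatial.shape, temporal.shape)   # (32, 3) (3, 500)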

17.
Hietanen JK, Nummenmaa L. PLoS ONE. 2011;6(11):e24408.
Recent event-related potential studies have shown that the occipitotemporal N170 component--best known for its sensitivity to faces--is also sensitive to perception of human bodies. Considering that in the timescale of evolution clothing is a relatively new invention that hides the bodily features relevant for sexual selection and arousal, we investigated whether the early N170 brain response would be enhanced to nude over clothed bodies. In two experiments, we measured N170 responses to nude bodies, bodies wearing swimsuits, clothed bodies, faces, and control stimuli (cars). We found that the N170 amplitude was larger to opposite and same-sex nude vs. clothed bodies. Moreover, the N170 amplitude increased linearly as the amount of clothing decreased from full clothing via swimsuits to nude bodies. Strikingly, the N170 response to nude bodies was even greater than that to faces, and the N170 amplitude to bodies was independent of whether the face of the bodies was visible or not. All human stimuli evoked greater N170 responses than did the control stimulus. Autonomic measurements and self-evaluations showed that nude bodies were affectively more arousing compared to the other stimulus categories. We conclude that the early visual processing of human bodies is sensitive to the visibility of the sex-related features of human bodies and that the visual processing of other people's nude bodies is enhanced in the brain. This enhancement is likely to reflect affective arousal elicited by nude bodies. Such facilitated visual processing of other people's nude bodies is possibly beneficial in identifying potential mating partners and competitors, and for triggering sexual behavior.

18.
A key to understanding visual cognition is to determine when, how, and with what information the human brain distinguishes between visual categories. So far, the dynamics of information processing for categorization of visual stimuli has not been elucidated. By using an ecologically important categorization task (seven expressions of emotion), we demonstrate, in three human observers, that an early brain event (the N170 Event Related Potential, occurring 170 ms after stimulus onset) integrates visual information specific to each expression, according to a pattern. Specifically, starting 50 ms prior to the ERP peak, facial information tends to be integrated from the eyes downward in the face. This integration stops, and the ERP peaks, when the information diagnostic for judging a particular expression has been integrated (e.g., the eyes in fear, the corners of the nose in disgust, or the mouth in happiness). Consequently, the duration of information integration from the eyes down determines the latency of the N170 for each expression (e.g., with "fear" being faster than "disgust," itself faster than "happy"). For the first time in visual categorization, we relate the dynamics of an important brain event to the dynamics of a precise information-processing function.

19.
A recent functional magnetic resonance imaging (fMRI) study by our group demonstrated that dynamic emotional faces are more accurately recognized and evoke more widespread patterns of hemodynamic brain responses than static emotional faces. Based on this experimental design, the present study aimed at investigating the spatio-temporal processing of static and dynamic emotional facial expressions in 19 healthy women by means of multi-channel electroencephalography (EEG), event-related potentials (ERP) and fMRI-constrained regional source analyses. ERP analysis showed an increased amplitude of the LPP (late posterior positivity) over centro-parietal regions for static facial expressions of disgust compared to neutral faces. In addition, the LPP was more widespread and temporally prolonged for dynamic compared to static faces of disgust and happiness. fMRI-constrained source analysis on static emotional face stimuli indicated a spatio-temporal modulation of predominantly posterior regional brain activation related to the visual processing stream for both emotional valences, compared to the neutral condition, in the fusiform gyrus. The spatio-temporal processing of dynamic stimuli yielded enhanced source activity for emotional compared to neutral conditions in temporal (e.g., fusiform gyrus) and frontal regions (e.g., ventromedial prefrontal cortex, medial and inferior frontal cortex) in early and again in later time windows. The present data support the view that dynamic facial displays convey more information, reflected in complex neural networks, in particular because their changing features potentially trigger sustained activation related to a continuing evaluation of those faces. A combined fMRI and EEG approach thus provides advanced insight into the spatio-temporal characteristics of emotional face processing by also revealing additional neural generators not identifiable by the use of fMRI alone.

20.
The differential effect of stimulus inversion on face and object recognition suggests that inverted faces are processed by mechanisms for the perception of other objects rather than by face perception mechanisms. We investigated the face inversion effect using functional magnetic resonance imaging (fMRI). The principal effect of face inversion was an increased response in ventral extrastriate regions that respond preferentially to another class of objects (houses). In contrast, house inversion did not produce a similar change in face-selective regions. Moreover, stimulus inversion had equivalent, minimal effects for faces in face-selective regions and for houses in house-selective regions. The results suggest that the failure of face perception systems with inverted faces leads to the recruitment of processing resources in object perception systems, but this failure is not reflected by altered activity in face perception systems.
