Similar Articles
20 similar articles found.
1.
A four-dimensional spherical emotional space was obtained by multidimensional scaling of subjective differences between emotional expressions in sound samples (the words "Yes" and "No" pronounced in different emotional states). The Euclidean axes are interpreted as the following neural mechanisms. The first two dimensions relate to the estimation of the sign of the emotional state: dimension 1, pleasant/unpleasant (useful or not); dimension 2, the degree of information certainty. The third and fourth axes are associated with incentive: dimension 3 encodes active (anger) or passive (fear) defensive reaction, and dimension 4 corresponds to achievement. Three angles of the four-dimensional hypersphere (one between axes 1 and 2, a second between axes 3 and 4, and a third between these two planes) determine subjectively experienced emotion characteristics as described by Wundt: emotion modality (pleasure-unpleasure), excitation-quietness-suppression, and tension-relaxation, respectively. Thus, the first and second angles regulate the modality of ten basic emotions: five determined by the situation and five determined by personal activity. With another system of angular parameters (the angles between axes 4 and 1, between axes 3 and 2, and between the respective planes), another system of emotion classification can be realized, the one usually described in studies of facial expressions (Schlosberg's and Izmailov's circular system) and semantics (Osgood): emotion modality or sign (regulating six basic emotions), emotion activity or brightness (excitation-rest), and emotion saturation (strength of emotion expression).
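The scaling step behind such spaces can be illustrated with classical (Torgerson) multidimensional scaling, which embeds a matrix of pairwise subjective differences into Euclidean coordinates. This is a minimal numpy sketch, not the authors' actual analysis pipeline; the circle data are a toy stand-in for a real difference matrix.

```python
import numpy as np

def classical_mds(D, k):
    """Torgerson classical MDS: embed an n x n dissimilarity matrix D into k dims."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                   # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]              # keep the k largest
    scale = np.sqrt(np.clip(w[idx], 0.0, None))
    return V[:, idx] * scale                   # n x k coordinates

# toy check: 8 points on a circle are recovered from their pairwise distances
t = np.linspace(0, 2 * np.pi, 8, endpoint=False)
X = np.c_[np.cos(t), np.sin(t)]
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
Y = classical_mds(D, k=2)
D2 = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
assert np.allclose(D, D2, atol=1e-8)
```

When the dissimilarities are genuinely Euclidean, as in this toy case, the embedding reproduces them exactly; with behavioral data the eigenvalue spectrum of `B` indicates the number of dimensions worth keeping.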

2.
Changes in the activity of 83 neurons in the rabbit superior colliculus evoked by the pairwise replacement of eight color and eight achromatic stimuli were analyzed. It was found that the neurons displayed early and late phasic responses (within 50-90 and 120-300 ms after the replacement, respectively) and a long-lasting tonic response component that depended on stimulus intensity. Analysis of the phasic components revealed three neuronal groups. The first group (n=25, 30%), selected on the basis of the earliest component, was specialized to differentiate stimuli only by intensity; the perceptual spaces of these neurons, reconstructed from spike discharges in the earliest response, were two-dimensional. The second group (n=16, 19%), selected on the basis of the late phasic component, demonstrated a four-dimensional structure of perceptual space. Neurons of the third group (n=4, 5%) possessed a two-dimensional perceptual space when reconstructed from the early component, whereas analysis of the late response revealed a four-dimensional structure. We suggest that information about color and intensity differences between stimuli coming from cortical neurons is necessary for the reconstruction of the four-dimensional space. The structure of the perceptual spaces reconstructed from the phasic responses of neurons in the superior colliculus was similar to the spaces of neurons in the primary visual cortex and the lateral geniculate nucleus. It was also similar to the space calculated from the N85 component of the visual evoked potential recorded under similar conditions. This finding confirms the general principle of vector coding in the visual system.

3.
Human cortical visual evoked potentials (VEPs) were studied to obtain electrophysiological data on face discrimination and to compare them with the direct estimates of differences between faces obtained in previous publications. The schematic faces varied in the curvature of the mouth and/or the declination of the eyebrows; these features determined the emotional expression of the faces. We recorded the VEP in response to the instant replacement of one schematic face (referent stimulus) by another (test stimulus), rather than to the presentation of a single stimulus; thus we recorded direct electrophysiological differences between schematic faces. A characteristic feature of this approach was the use of a set of functionally connected test stimuli with monotonically increasing differences between the referent and test stimuli. The analysis described a complex of components, P120-N180-P230, at sites O1, O2, P3, P4, T5, and T6. Interpeak amplitudes of these components showed high correlations with subjective differences between the same pairs of stimuli, as well as with physical (configurational) differences between stimuli measured as the angles of the lines defining the curvature of the mouth and the declination of the eyebrows. The highest correlation with subjective estimates of emotional differences between faces was shown by the N180-P230 interpeak amplitudes at sites O1 and P3. At the same time, the P120-N180 interpeak amplitudes at sites O1 and T5 showed the highest correlation between configurational measures and subjective estimates of stimulus differences.

4.
A model explaining the protanopic deficiency of human vision and its biological prototype lacking the red-absorbing pigment (the rabbit) was constructed from neuron-like elements. Behavioral experiments and evoked-potential recordings showed that the rabbit's color space has a spherical four-dimensional structure with a reduced red-coding area. A similar spherical four-dimensional color space is characteristic of a group of protanopic human subjects. The perceptual space of another group of color-deficient subjects (protanomals) is characterized by a reduction of both poles of the red-green opponent axis. These disorders are reproduced in the model either by the loss of some color-coding elements (the absence of the red-absorbing pigment, as in protanopes) or by a shift of the spectral characteristics of the red pigment towards those of the green one (protanomals).

5.
The informational significance of human perceptual and semantic evoked potentials elicited by an abrupt change in non-verbal or verbal stimuli, respectively, is discussed. The amplitudes of perceptual and semantic evoked potentials were shown to correlate positively with subjective estimates of the differences between the stimuli. Multidimensional scaling of the amplitude matrices and of the subjective difference estimates after pairwise replacement of the stimuli showed that colors and color names were encoded by excitation vectors of equal length in a four-dimensional spherical color space. Color differences were shown to equal the absolute values of the differences between excitation vectors, whereas semantic differences between color names turned out to be determined by the absolute values of the vector differences between color memory traces represented as long-term memory excitation vectors. The data are summarized within the framework of a spherical cognitive model.
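The spherical vector-coding scheme described here is easy to state numerically: each stimulus is an excitation vector of fixed length (a point on a hypersphere), and perceived difference is the norm of the vector difference. A small illustrative sketch, with randomly generated vectors standing in for measured excitation patterns:

```python
import numpy as np

rng = np.random.default_rng(2)

def to_sphere(v):
    """Normalize excitation vectors to equal length (points on the unit 3-sphere)."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

colors = to_sphere(rng.normal(size=(8, 4)))      # 8 stimuli as 4-D unit vectors
# perceived difference between stimuli i and j = |v_i - v_j|
diff = np.linalg.norm(colors[:, None, :] - colors[None, :, :], axis=-1)
# equal vector lengths, and a symmetric difference matrix with zero diagonal
print(np.allclose(np.linalg.norm(colors, axis=1), 1.0), np.allclose(diff, diff.T))
```

Because all vectors have equal length, the difference matrix is fully determined by the angles between vectors, which is what makes the space "spherical".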

6.
Changes in the activity of 54 neurons in the rabbit visual cortex evoked by the pairwise replacement of eight color and eight achromatic stimuli were analyzed. Diffuse stimuli generated by a color SVGA monitor were used in the experiments. The earliest response of phasic neurons (50-90 ms after the replacement) was strongly correlated with the color or intensity differences between stimuli. This response ("the difference signal") was used as the basis of an 8 x 8 matrix constructed for each neuron; such matrices contained the mean numbers of spikes per second in responses to the replacement of different stimulus pairs. All matrices were subjected to factor analysis, which revealed the basic axes (main factors) of the sensory spaces. Sixteen neurons (30%) detected only achromatic differences between stimuli; their perceptual spaces were two-dimensional, with orthogonal brightness and darkness axes. The spaces of 12 neurons (22%) were four-dimensional, with two chromatic and two achromatic axes. The structure of the perceptual space reconstructed from neuronal spikes was similar to the space calculated from the early VEP components recorded under similar conditions and to a space reconstructed from the rabbits' instrumental learning. The fundamental coincidence of the color spaces revealed by different methods may reflect a general principle of vector coding in the visual system and suggests the coexistence of two independent cortical mechanisms for the detection of chromatic and achromatic differences.
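The dimensionality estimate in this kind of analysis comes from decomposing each neuron's 8 x 8 response matrix and counting the dominant axes. The sketch below uses a plain eigendecomposition (a PCA-style stand-in for the factor analysis actually used) on a synthetic matrix with a planted two-dimensional structure:

```python
import numpy as np

# hypothetical stand-in for one neuron's 8 x 8 "difference signal" matrix:
# responses generated from a planted two-dimensional (brightness/darkness) structure
basis = np.array([[1, 0], [0, 1], [1, 1], [1, -1],
                  [2, 0], [0, 2], [1, 2], [2, 1]], dtype=float)
M = basis @ basis.T                              # symmetric pairwise matrix, rank 2

# principal axes of the sensory space via eigendecomposition
w = np.linalg.eigvalsh(M)[::-1]                  # eigenvalues, descending
explained = np.abs(w) / np.abs(w).sum()
n_dims = int((explained > 0.05).sum())           # axes carrying >5% of the variance
print(n_dims)                                    # 2 -> a two-dimensional space
```

A four-dimensional neuron would simply show four eigenvalues above the threshold; the 5% cutoff is an illustrative choice, not the paper's criterion.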

7.
Facial expressions aid social transactions and serve as socialization tools, with smiles signaling approval and reward, and angry faces signaling disapproval and punishment. The present study examined whether the subjective experience of positive vs. negative facial expressions differs between children and adults. Specifically, we examined age-related differences in biases toward happy and angry facial expressions. Young children (5–7 years) and young adults (18–29 years) rated the intensity of happy and angry expressions as well as levels of experienced arousal. Results showed that young children—but not young adults—rated happy facial expressions as both more intense and arousing than angry faces. This finding, which we replicated in two independent samples, was not due to differences in the ability to identify facial expressions, and suggests that children are more tuned to information in positive expressions. Together these studies provide evidence that children see unambiguous adult emotional expressions through rose-colored glasses, and suggest that what is emotionally relevant can shift with development.

8.
Rigoulot S, Pell MD. PLoS ONE. 2012;7(1):e30740.
Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0-1250 ms], [1250-2500 ms], [2500-5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with the greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.

9.
Two experiments were run to examine the effects of dynamic displays of facial expressions of emotions on time judgments. The participants were given a temporal bisection task with emotional facial expressions presented in a dynamic or a static display. Two emotional facial expressions and a neutral expression were tested and compared. Each of the emotional expressions had the same affective valence (unpleasant), but one was high-arousing (expressing anger) and the other low-arousing (expressing sadness). Our results showed that time judgments are highly sensitive to movements in facial expressions and the emotions expressed. Indeed, longer perceived durations were found in response to the dynamic faces and the high-arousing emotional expressions compared to the static faces and low-arousing expressions. In addition, the facial movements amplified the effect of emotions on time perception. Dynamic facial expressions are thus interesting tools for examining variations in temporal judgments in different social contexts.

10.
Facial expressions of emotion play a key role in guiding social judgements, including deciding whether or not to approach another person. However, no research has examined how situational context modulates approachability judgements assigned to emotional faces, or the relationship between perceived threat and approachability judgements. Fifty-two participants provided approachability judgements to angry, disgusted, fearful, happy, neutral, and sad faces across three situational contexts: no context, when giving help, and when receiving help. Participants also rated the emotional faces for level of perceived threat and labelled the facial expressions. Results indicated that context modulated approachability judgements to faces depicting negative emotions. Specifically, faces depicting distress-related emotions (i.e., sadness and fear) were considered more approachable in the giving help context than both the receiving help and neutral context. Furthermore, higher ratings of threat were associated with the assessment of angry, happy and neutral faces as less approachable. These findings are the first to demonstrate the significant role that context plays in the evaluation of an individual’s approachability and illustrate the important relationship between perceived threat and the evaluation of approachability.

11.
The aim of the study was to investigate the influence of color hue saturation on the emotional state of humans. We used frontal EEG asymmetry to assess the subjects' emotional state. Our emotional stimuli evoked opposite dynamics of frontal EEG asymmetry: negative stimuli elicited a decrease, and positive stimuli an increase, of frontal EEG asymmetry in the fronto-polar and frontal leads. Such dynamics indicate an emotional experience consistent with stimulus valence. Blue and red color modification of the stimuli changed the dynamics of frontal EEG asymmetry both during and after the presentation of emotional stimuli. Since no subject reported noticing a color difference between stimuli during the experiment, we conclude that the influence of the color modification was unconscious. Our results show that unconsciously perceived color modification can affect the emotional state of humans.
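Frontal EEG asymmetry of the kind used here is conventionally computed as the difference of log alpha-band power between homologous right and left frontal electrodes. A minimal sketch on synthetic signals (the simple periodogram band-power estimate and the 8-13 Hz band are simplifying assumptions, not this paper's exact method):

```python
import numpy as np

def band_power(x, fs, lo=8.0, hi=13.0):
    """Mean periodogram power of signal x in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(x.size, 1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

def frontal_asymmetry(left, right, fs):
    """ln(right alpha power) - ln(left alpha power) for homologous frontal channels."""
    return np.log(band_power(right, fs)) - np.log(band_power(left, fs))

# synthetic two-channel demo: a 10 Hz alpha rhythm that is stronger on the left
fs = 250
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(1)
left = 3.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
right = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
print(frontal_asymmetry(left, right, fs))  # negative here: more alpha on the left
```

Because alpha power varies inversely with cortical activation, a higher index is usually read as relatively greater left-frontal activation.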

12.
We examined the processing of facial expressions of pain and anger in 8-month-old infants and adults by measuring event-related brain potentials (ERPs) and frontal EEG alpha asymmetry. The ERP results revealed that while adults showed a late positive potential (LPP) to emotional expressions that was enhanced to pain expressions, reflecting increased evaluation and emotional arousal to pain expressions, infants showed a negative component (Nc) to emotional expressions that was enhanced to angry expressions, reflecting increased allocation of attention to angry faces. Moreover, infants and adults showed opposite patterns in their frontal asymmetry responses to pain and anger, suggesting developmental differences in the motivational processes engendered by these facial expressions. These findings are discussed in the light of associated individual differences in infant temperament and adult dispositional empathy.

13.
Emotion processing has been shown to acquire priority by biasing the allocation of attentional resources. Aversive images and fearful expressions are processed quickly and automatically. Many existing findings suggested that the processing of emotional information is pre-attentive, largely immune from attentional control; other studies argued that attention gates the processing of emotion. To address this controversy, the current study examined whether and to what degree attention modulates the processing of emotion using a stimulus-response compatibility (SRC) paradigm. We conducted two flanker experiments using color-scale faces with neutral expressions or gray-scale faces with emotional expressions. We found SRC effects for all three dimensions (color, gender, and emotion), and SRC effects were larger when the conflicts were task-relevant than when they were task-irrelevant, suggesting that conflict processing of emotion is modulated by attention, similar to that of color and face identity (gender). However, task modulation of the color SRC effect was significantly greater than that of the gender or emotion SRC effect, indicating that the processing of salient information is modulated by attention to a lesser degree than the processing of non-emotional stimuli. We propose that emotion processing can be influenced by attentional control, but at the same time the salience of emotional information may bias toward bottom-up processing, rendering less top-down modulation than for non-emotional stimuli.

14.
It has been established that the recognition of facial expressions integrates contextual information. In this study, we aimed to clarify the influence of contextual odors. The participants were asked to match a target face varying in expression intensity with non-ambiguous expressive faces. Intensity variations in the target faces were designed by morphing expressive faces with neutral faces. In addition, the influence of verbal information was assessed by providing half the participants with the emotion names. Odor cues were manipulated by placing participants in a pleasant (strawberry), aversive (butyric acid), or no-odor control context. The results showed two main effects of the odor context. First, the minimum amount of visual information required to perceive an expression was lowered when the odor context was emotionally congruent: happiness was correctly perceived at lower intensities in the faces displayed in the pleasant odor context, and the same phenomenon occurred for disgust and anger in the aversive odor context. Second, the odor context influenced the false perception of expressions that were not used in target faces, with distinct patterns according to the presence of emotion names. When emotion names were provided, the aversive odor context decreased intrusions for disgust ambiguous faces but increased them for anger. When the emotion names were not provided, this effect did not occur and the pleasant odor context elicited an overall increase in intrusions for negative expressions. We conclude that olfaction plays a role in the way facial expressions are perceived in interaction with other contextual influences such as verbal information.
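Intensity variation by morphing, as described above, can be approximated in its simplest form as a pixelwise linear blend between a neutral and a fully expressive image (real facial-morphing software also warps feature geometry; this sketch deliberately ignores that):

```python
import numpy as np

def morph(neutral, expressive, intensity):
    """Linear morph between a neutral and a fully expressive face image.

    intensity=0.0 returns the neutral image, 1.0 the full expression.
    Both inputs are float arrays of identical shape.
    """
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must lie in [0, 1]")
    return (1.0 - intensity) * neutral + intensity * expressive

# a graded intensity series like the target faces in the study
# (tiny constant arrays stand in for pixel data)
neutral = np.zeros((4, 4))
expressive = np.ones((4, 4))
series = [morph(neutral, expressive, i) for i in (0.2, 0.4, 0.6, 0.8)]
print([float(s.mean()) for s in series])
```

The mean pixel value of each morph tracks the intensity parameter, which is the property exploited when constructing evenly spaced expression intensities.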

15.
Differences in oscillatory responses to emotional facial expressions were studied in 40 subjects (19 men and 21 women aged from 18 to 30 years) varying in severity of depressive symptoms. Compared with perception of angry and neutral faces, perception of happy faces was accompanied by lower Δ synchronization in subjects with a low severity of depressive symptoms (Group 2) and higher Δ synchronization in subjects with a high severity of depressive symptoms (Group 1). Because synchronization of Δ oscillations is usually observed in aversive states, it was assumed that happy faces were perceived as negative stimuli by the Group 1 subjects. Perception of angry faces was accompanied by α desynchronization in Group 2 and α synchronization in Group 1. Based on Klimesch’s theory, the effect was assumed to indicate that the Group 1 subjects were initially set up for perception of negative emotional information. The effect of the emotional stimulus category was significant in Group 2 and nonsignificant in Group 1, testifying that the recognition of emotional information is hindered in depression-prone individuals.

16.
Guo K. PLoS ONE. 2012;7(8):e42585.
Using faces representing exaggerated emotional expressions, recent behaviour and eye-tracking studies have suggested a dominant role of individual facial features in transmitting diagnostic cues for decoding facial expressions. Considering that in everyday life we frequently view low-intensity expressive faces in which local facial cues are more ambiguous, we probably need to combine expressive cues from more than one facial feature to reliably decode naturalistic facial affects. In this study we applied a morphing technique to systematically vary intensities of six basic facial expressions of emotion, and employed a self-paced expression categorization task to measure participants' categorization performance and associated gaze patterns. The analysis of pooled data from all expressions showed that increasing expression intensity would improve categorization accuracy, shorten reaction time and reduce number of fixations directed at faces. The proportion of fixations and viewing time directed at internal facial features (eyes, nose and mouth region), however, was not affected by varying levels of intensity. Further comparison between individual facial expressions revealed that although proportional gaze allocation at individual facial features was quantitatively modulated by the viewed expressions, the overall gaze distribution in face viewing was qualitatively similar across different facial expressions and different intensities. It seems that we adopt a holistic viewing strategy to extract expressive cues from all internal facial features in processing of naturalistic facial expressions.

17.
A correlation between characteristics of the visual evoked potentials and individual personality traits (on the Cattell scale) was revealed in 40 healthy subjects as they recognized facial expressions of anger and fear. Compared to emotionally stable subjects, emotionally unstable subjects had shorter evoked-potential latencies and suppressed late negativity in the occipital and temporal areas, whereas the amplitude of these waves in the frontal areas was increased. In the emotionally stable group, differences in the evoked potentials related to emotional expressions were evident throughout signal processing, beginning from the early sensory stage (the P1 wave); in the emotionally unstable group, such differences developed later. Sensitivity of the evoked potentials to the emotional salience of faces was also more pronounced in the emotionally stable group. The involvement of the frontal cortex, the amygdala, and the anterior cingulate cortex in the development of individual features of the recognition of facial expressions of anger and fear is discussed.

18.
There is a growing body of literature to show that color can convey information, owing to its emotionally meaningful associations. Most research so far has focused on negative hue–meaning associations (e.g., red) with the exception of the positive aspects associated with green. We therefore set out to investigate the positive associations of two colors (i.e., green and pink), using an emotional facial expression recognition task in which colors provided the emotional contextual information for the face processing. In two experiments, green and pink backgrounds enhanced happy face recognition and impaired sad face recognition, compared with a control color (gray). Our findings therefore suggest that because green and pink both convey positive information, they facilitate the processing of emotionally congruent facial expressions (i.e., faces expressing happiness) and interfere with that of incongruent facial expressions (i.e., faces expressing sadness). Data also revealed a positive association for white. Results are discussed within the theoretical framework of emotional cue processing and color meaning.

19.
Psychophysiological experiments were performed on 34 healthy subjects. We analyzed the accuracy and latency of motor responses in recognizing two types of complex visual stimuli, animals and objects, presented immediately after a brief presentation of face images with different emotional expressions: anger, fear, happiness, and a neutral expression. Response latency depended on the emotional expression of the masked face: it was lower when the test stimuli were preceded by angry or fearful faces than by happy or neutral faces. These effects depended on the type of stimulus and were more pronounced when recognizing objects than animals. We found that the effects of emotional faces were related to personal features that the subjects exhibited in the emotional and communicative blocks of Cattell’s test, and were more pronounced in more sensitive, anxious, and pessimistic introverts. The mechanisms of the effects of unconsciously perceived emotional information on human visual behavior are discussed.

20.
Neuropeptide B/W receptor-1 (NPBWR1) is expressed in discrete brain regions in rodents and humans, with particularly strong expression in the limbic system, including the central nucleus of the amygdala. Recently, Nagata-Kuroiwa et al. reported that Npbwr1(-/-) mice showed changes in social behavior, suggesting that NPBWR1 plays important roles in the emotional responses of social interactions. The human NPBWR1 gene has a single nucleotide polymorphism at nucleotide 404 (404A>T; SNP rs33977775). This polymorphism results in an amino acid change, Y135F, and an in vitro experiment demonstrated that this change alters receptor function. We investigated the effect of this variation on emotional responses to stimuli showing human faces with four categories of emotional expression (anger, fear, happiness, and neutral). Subjects' emotional responses to these faces were rated on scales of hedonic valence, emotional arousal, and dominance (V-A-D). A significant genotype difference was observed in the valence evaluation: the 404AT group perceived facial expressions more pleasantly than the 404AA group, regardless of the category of facial expression. Statistical analysis of each combination of V-A-D and facial expression also showed that the 404AT group tended to feel less submissive to an angry face than the 404AA group. Thus, a single nucleotide polymorphism of NPBWR1 seems to affect human behavior in a social context.

