20 similar articles found (search time: 15 ms)
1.
Previous studies have reported that trait anxiety (TA) affects decision making, but results remain largely inconsistent across studies. The aim of the current study was to further address the interaction between TA and decision making. A total of 304 subjects without depression, drawn from a sample of 642 participants, were grouped into high TA (HTA), medium TA (MTA) and low TA (LTA) groups based on their TA scores on the State-Trait Anxiety Inventory. All subjects were assessed with the Iowa Gambling Task (IGT), which measures decision making under ambiguity, and the Game of Dice Task (GDT), which measures decision making under risk. While the HTA and LTA groups performed worse on the IGT than the MTA group, GDT performance did not differ between the three groups. Furthermore, the LTA and HTA groups showed different individual deck-level preferences in the IGT: the former showed a preference for deck B, indicating that these subjects focused more on the magnitude of rewards, and the latter showed a preference for deck A, indicating significant decision-making impairment. Our findings suggest that trait anxiety has an effect on decision making under ambiguity but not on decision making under risk, and that different levels of trait anxiety relate differently to individual deck-level preferences in the IGT.
2.
Mkael Symmonds, Julian J. Emmanuel, Megan E. Drew, Rachel L. Batterham, Raymond J. Dolan 《PloS one》2010,5(6)
Background
Animals' attitudes to risk are profoundly influenced by metabolic state (hunger and baseline energy stores). Specifically, animals often express a preference for risky (more variable) food sources when below a metabolic reference point (hungry), and safe (less variable) food sources when sated. Circulating hormones report the status of energy reserves and acute nutrient intake to widespread targets in the central nervous system that regulate feeding behaviour, including brain regions strongly implicated in risk and reward based decision-making in humans. Despite this, physiological influences per se have not previously been considered to influence economic decisions in humans. We hypothesised that baseline metabolic reserves and alterations in metabolic state would systematically modulate decision-making and financial risk-taking in humans.
Methodology/Principal Findings
We used a controlled feeding manipulation and assayed decision-making preferences across different metabolic states following a meal. To elicit risk preference, we presented a sequence of 200 paired lotteries, the subjects' task being to select their preferred option from each pair. We also measured prandial suppression of circulating acyl-ghrelin (a centrally-acting orexigenic hormone signalling acute nutrient intake), and circulating leptin levels (providing an assay of energy reserves). We show both immediate and delayed effects on risky decision-making following a meal, and that these changes correlate with an individual's baseline leptin and changes in acyl-ghrelin levels, respectively.
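As a methodological aside: risk preference from paired-lottery choice data of this kind is typically summarized by fitting a utility model to the binary choices. The sketch below is a generic illustration of that idea on simulated data, assuming CRRA utility and a softmax choice rule; the utility form, parameters, and fitting details are my assumptions, not the authors' procedure.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def crra_utility(x, rho):
    """CRRA utility; rho > 0 is risk aversion (rho = 1 -> log utility)."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if abs(rho - 1) < 1e-8 else x ** (1 - rho) / (1 - rho)

def expected_utility(lottery, rho):
    probs, payoffs = lottery
    return np.dot(probs, crra_utility(payoffs, rho))

# Simulate 200 pairs of two-outcome lotteries and the choices of an agent
# with true rho = 0.7 and softmax temperature 1.0 (all values invented).
pairs = [((rng.dirichlet([1, 1]), rng.uniform(1, 100, 2)),
          (rng.dirichlet([1, 1]), rng.uniform(1, 100, 2))) for _ in range(200)]
true_rho, temp = 0.7, 1.0
choices = []
for a, b in pairs:
    d = expected_utility(a, true_rho) - expected_utility(b, true_rho)
    choices.append(rng.random() < 1 / (1 + np.exp(-d / temp)))  # P(choose a)

def neg_log_lik(params):
    rho, tau = params
    ll = 0.0
    for (a, b), chose_a in zip(pairs, choices):
        d = (expected_utility(a, rho) - expected_utility(b, rho)) / tau
        p = np.clip(1 / (1 + np.exp(-d)), 1e-12, 1 - 1e-12)
        ll += np.log(p if chose_a else 1 - p)
    return -ll

fit = minimize(neg_log_lik, x0=[0.5, 1.0], bounds=[(0.01, 3), (0.05, 10)])
print("estimated rho, tau:", fit.x)  # should recover roughly (0.7, 1.0)
```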
Conclusions/Significance
We show that human risk preferences are exquisitely sensitive to current metabolic state, in a direction consistent with ecological models of feeding behaviour but not predicted by normative economic theory. These substantive effects of state changes on economic decisions perhaps reflect shared evolutionarily conserved neurobiological mechanisms. We suggest that this sensitivity in human risk-preference to current metabolic state has significant implications for both real-world economic transactions and for aberrant decision-making in eating disorders and obesity.
3.
4.
5.
Daniel Bennett, Stefan Bode, Maja Brydevall, Hayley Warren, Carsten Murawski 《PLoS computational biology》2016,12(7)
In a dynamic world, an accurate model of the environment is vital for survival, and agents ought regularly to seek out new information with which to update their world models. This aspect of behaviour is not captured well by classical theories of decision making, and the cognitive mechanisms of information seeking are poorly understood. In particular, it is not known whether information is valued only for its instrumental use, or whether humans also assign it a non-instrumental intrinsic value. To address this question, the present study assessed preference for non-instrumental information among 80 healthy participants in two experiments. Participants performed a novel information preference task in which they could choose to pay a monetary cost to receive advance information about the outcome of a monetary lottery. Importantly, acquiring information did not alter lottery outcome probabilities. We found that participants were willing to incur considerable monetary costs to acquire payoff-irrelevant information about the lottery outcome. This behaviour was well explained by a computational cognitive model in which information preference resulted from aversion to temporally prolonged uncertainty. These results strongly suggest that humans assign an intrinsic value to information in a manner inconsistent with normative accounts of decision making under uncertainty. This intrinsic value may be associated with adaptive behaviour in real-world environments by producing a bias towards exploratory and information-seeking behaviour.
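A toy rendering of the "aversion to temporally prolonged uncertainty" idea (my illustration, with an invented aversion constant; the authors' computational model is fitted to choice data and richer than this): if waiting in uncertainty carries a subjective cost proportional to the lottery's entropy per unit time, advance information is worth paying for even though it cannot change the outcome.

```python
import numpy as np

def entropy_bits(probs):
    # Shannon entropy of the lottery outcome distribution, in bits
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def wtp_for_early_reveal(probs, delay_s, k=0.02):
    """Willingness to pay = k * entropy * seconds of uncertainty removed.
    k is a free aversion parameter, not an estimate from the paper."""
    return k * entropy_bits(probs) * delay_s

# A 50/50 lottery resolved 10 s from now: revealing the outcome immediately
# removes 10 s of 1-bit uncertainty.
print(wtp_for_early_reveal([0.5, 0.5], delay_s=10.0))  # 0.2 monetary units
print(wtp_for_early_reveal([0.9, 0.1], delay_s=10.0))  # less: lower entropy
```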
6.
JOHN R. SKALSKI, RICHARD L. TOWNSEND, BRIAN A. GILBERT 《The Journal of wildlife management》2007,71(4):1309-1316
Abstract: This paper illustrates how age-at-harvest data, when combined with hunter-effort information routinely collected by state game management agencies, can be used to estimate and monitor trends in big game abundance. Twenty-four years of age-at-harvest data for black-tailed deer (Odocoileus hemionus) were analyzed to produce abundance estimates ranging from 1,281 to 3,232 adult females on a 22,079-ha tree farm in Pierce County, Washington, USA. The annual natural survival probability for this female population was estimated to be 0.7293 (SE = 0.0097). The estimated abundance was highly correlated with an independent browse damage index (r = 0.8131, P < 0.001). A population reconstruction incorporating the browse index did not substantially improve the model fit but did provide an auxiliary model for predicting deer abundance. This population reconstruction illustrates a cost-effective alternative to expensive big game survey methods.
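For intuition, a bare-bones backward cohort ("virtual population") reconstruction from an age-at-harvest matrix is sketched below. The paper's method is a likelihood model joint over harvest and effort, so this is only a simplified relative; the harvest counts are made up, and the survival value echoes the estimate quoted above.

```python
import numpy as np

def reconstruct(C, S):
    """C[a, t]: harvest counts by age a and year t; S: annual natural survival.
    Terminal ages/years are seeded with the harvest itself, so those cells
    are minimum counts rather than estimates."""
    A, T = C.shape
    N = np.zeros_like(C, dtype=float)
    N[-1, :] = C[-1, :]          # oldest age: at least the animals harvested
    N[:, -1] = C[:, -1]          # final year: no later data to project back
    for t in range(T - 2, -1, -1):
        for a in range(A - 2, -1, -1):
            # animals alive at age a in year t = those harvested now plus the
            # survivors who reappear at age a+1 the following year
            N[a, t] = C[a, t] + N[a + 1, t + 1] / S
    return N

C = np.array([[120, 135, 110],    # illustrative counts, three ages x three years
              [ 80,  90,  85],
              [ 40,  45,  50]], dtype=float)
print(reconstruct(C, S=0.7293).round(1))
```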
7.
Samanta Simioni, Myriam Schluep, Nadège Bault, Giorgio Coricelli, Joerg Kleeberg, Renaud A. Du Pasquier, Markus Gschwind, Patrik Vuilleumier, Jean-Marie Annoni 《PloS one》2012,7(12)
Introduction
Deficits in decision making (DM) are commonly associated with prefrontal cortical damage, but may also occur in multiple sclerosis (MS). There are no data concerning the impact of MS on tasks evaluating DM under explicit risk, where different emotional and cognitive components can be distinguished.
Methods
We assessed 72 relapsing-remitting MS (RRMS) patients with mild to moderate disease and 38 healthy controls on two DM tasks involving risk with explicit rules: (1) the Wheel of Fortune (WOF), which probes the anticipated affective impact of decision outcomes on future choices; and (2) the Cambridge Gamble Task (CGT), which measures risk taking. Participants also underwent a neuropsychological and emotional assessment, and skin conductance responses (SCRs) were recorded.
Results
In the WOF, RRMS patients showed deficits in integrating positive counterfactual information (p<0.005) and greater risk aversion (p<0.001). They reported less negative affect than controls (disappointment: p = 0.007; regret: p = 0.01), although their implicit emotional reactions as measured by post-choice SCRs did not differ. In the CGT, RRMS patients differed from controls in quality of DM (p = 0.01) and deliberation time (p = 0.0002), the latter difference being correlated with attention scores. Such changes did not result in overall decreases in performance (total gains).
Conclusions
The quality of DM under risk was modified by MS in both tasks. The reduced expression of disappointment coexisted with increased risk aversion in the WOF and with alexithymia features. These concomitant emotional alterations may have implications for better understanding the components of explicit DM and for the clinical support of MS patients.
8.
9.
10.
Genetic studies of uniparental disomy (UPD) employing many markers have helped geneticists to gain a better understanding of the molecular mechanisms underlying nondisjunction. However, most existing methods cannot simultaneously analyze all genetic markers and consistently incorporate crossover interference; they thus fail to make full use of the genetic information in the data. In the present article, we describe a hidden Markov model for multilocus uniparental disomy data. This method is based on the chi-square model for the crossover process and can simultaneously incorporate all marker information, including untyped and uninformative markers. We then apply this novel method to analyze a set of UPD15 data.
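For orientation, the sketch below shows the forward pass of a generic marker-by-marker HMM of this flavour. Note one deliberate simplification: transitions here follow a no-interference (Haldane-style) Markov crossover model rather than the chi-square interference model used in the paper, and all states, rates, and emission values are invented.

```python
import numpy as np

def forward_loglik(emission_ll, trans_mats, init):
    """emission_ll: (M, K) log-likelihoods of each marker under K hidden states;
    trans_mats: (M-1, K, K) transition matrices between adjacent markers;
    init: (K,) initial state distribution. Returns the total log-likelihood."""
    alpha = np.log(init) + emission_ll[0]
    for m in range(1, emission_ll.shape[0]):
        # log-sum-exp over the previous state, then add this marker's emission
        a = alpha[:, None] + np.log(trans_mats[m - 1])
        alpha = emission_ll[m] + np.logaddexp.reduce(a, axis=0)
    return float(np.logaddexp.reduce(alpha))

# Two hidden states (e.g. one vs. both parental homologs shared at a locus);
# a recombination fraction theta between adjacent markers sets the switch rate.
M, theta = 5, 0.1
T = np.array([[1 - theta, theta], [theta, 1 - theta]])
trans = np.repeat(T[None, :, :], M - 1, axis=0)
emission_ll = np.log(np.random.default_rng(1).uniform(0.1, 1.0, size=(M, 2)))
print(forward_loglik(emission_ll, trans, init=np.array([0.5, 0.5])))
```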
11.
The Iowa Gambling Task (IGT) is widely used in investigations of decision making. A growing number of studies have linked performance on this task to personality differences, with the aim of explaining the large degree of variability in healthy individuals' performance of the task. However, this line of research has yielded inconsistent results. In the present study, we tested whether increasing the conflict between short-term and long-term gains in the IGT can clarify personality-related modulations of decision making. We assessed performance on the original IGT as a function of the personality traits typically involved in risky decision making (i.e., impulsivity, sensation seeking, sensitivity to reward and punishment). The impact of these same personality traits was also evaluated on a modified version of the task in which the difference in immediate reward magnitude between disadvantageous and advantageous decks was increased, while keeping the net gain fixed. The results showed that only in this latter IGT variant were highly impulsive individuals and high sensation seekers lured into making disadvantageous choices. The opposite seems to be the case for participants who were highly sensitive to punishment, although further data are needed to corroborate this finding. The present preliminary results suggest that the IGT variant used in this study could be more effective than the original task at identifying personality effects in decision making. Implications for dispositional and situational effects on decision making are discussed.
12.
Risk factors for gestational diabetes mellitus under the new diagnostic criteria
侯美芹, 王治洁, 周玲, 石礼红, 乔侨 《现代生物医学进展》2012,12(10):1916-1919
Objective: To investigate the incidence of gestational diabetes mellitus (GDM) in China under the new diagnostic criteria and to analyze the risk factors for GDM, providing a theoretical basis for early clinical management, diagnosis, and intervention in pregnant women under the new criteria. Methods: Clinical data were collected and retrospectively studied for all 1,152 pregnant women who registered for antenatal examination at our hospital from January 2011 to September 2011; 16 women with pre-gestational diabetes were excluded. GDM was diagnosed by the one-step procedure of the new diagnostic criteria. Thirteen candidate risk factors for GDM were recorded, including age, gravidity and parity, body mass index (BMI), family history of diabetes, and polycystic ovary syndrome, and analyzed jointly. Results: The detection rate of GDM under the new criteria was 10.39% (118/1136). Univariate analysis showed significant differences between the GDM and non-GDM groups for age ≥35 years (χ²=10.2814, P=0.0013), obesity (pre-pregnancy BMI≥28 kg/m²) (χ²=36.2384, P<0.0001), polycystic ovary syndrome (χ²=20.6725, P<0.0001), and family history of diabetes (χ²=7.8783, P=0.0050). Stepwise multivariate logistic regression identified obesity (OR=7.546, 95%CI=2.356-20.129, P=0.0002), polycystic ovary syndrome (OR=6.342, 95%CI=1.783-16.329, P=0.0019), age (OR=3.021, 95%CI=0.983-6.459, P=0.0108), and family history of diabetes (OR=2.438, 95%CI=0.612-5.231, P=0.0256) as risk factors for GDM. Conclusions: The detection rate of GDM under the new criteria was 10.39%. Obesity, polycystic ovary syndrome, age, and family history of diabetes are risk factors for GDM. Strengthening GDM screening and providing early diagnosis, intervention, and management for pregnant women with these risk factors can improve pregnancy outcomes and population health.
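The analysis pattern here, univariate chi-square screening followed by multivariate logistic regression, is standard; below is a compact sketch of that workflow on synthetic data. Variable names, prevalences, and effect sizes are invented, and plain (non-stepwise) logistic regression stands in for the stepwise procedure.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1136
df = pd.DataFrame({
    "age_ge_35":  rng.random(n) < 0.15,
    "obese":      rng.random(n) < 0.10,   # pre-pregnancy BMI >= 28 kg/m^2
    "pcos":       rng.random(n) < 0.05,
    "dm_history": rng.random(n) < 0.12,
}).astype(int)
# simulate outcome from an invented logistic model
logit = -2.6 + 2.0*df.obese + 1.8*df.pcos + 1.1*df.age_ge_35 + 0.9*df.dm_history
df["gdm"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Univariate screening: chi-square test of each factor against GDM status.
for col in ["age_ge_35", "obese", "pcos", "dm_history"]:
    chi2, p, _, _ = chi2_contingency(pd.crosstab(df[col], df["gdm"]))
    print(f"{col:11s} chi2={chi2:7.4f}  p={p:.4f}")

# Multivariate logistic regression; exponentiated coefficients are ORs.
X = sm.add_constant(df[["age_ge_35", "obese", "pcos", "dm_history"]])
fit = sm.Logit(df["gdm"], X).fit(disp=0)
print(np.exp(fit.params).rename("OR"))
print(np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"}))
```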
13.
Deepika Mohan, Derek C. Angus, Daniel Ricketts, Coreen Farris, Baruch Fischhoff, Matthew R. Rosengart, Donald M. Yealy, Amber E. Barnato 《PloS one》2014,9(8)
Background
Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity.
Methods
We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e. physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases.
Findings
We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL): 10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03).
Conclusions
We found that physicians made decisions consistent with actual practice, that we could manipulate cognitive load, and that load increased the use of heuristics, as predicted by cognitive theory.
14.
15.
Statistical Measures of DNA Sequence Dissimilarity under Markov Chain Models of Base Composition
In molecular biology, the issue of quantifying the similarity between two biological sequences is very important. Past research has shown that word-based search tools are computationally efficient and can find some new functional similarities or dissimilarities invisible to other algorithms like FASTA. Recently, under the independent model of base composition, Wu, Burke, and Davison (1997, Biometrics 53, 1431-1439) characterized a family of word-based dissimilarity measures that defined distance between two sequences by simultaneously comparing the frequencies of all subsequences of n adjacent letters (i.e., n-words) in the two sequences. Specifically, they introduced the use of Mahalanobis distance and standardized Euclidean distance into the study of DNA sequence dissimilarity. They showed that both distances had better sensitivity and selectivity than the commonly used Euclidean distance. The purpose of this article is to extend Mahalanobis and standardized Euclidean distances to Markov chain models of base composition. In addition, a new dissimilarity measure based on the Kullback-Leibler discrepancy between frequencies of all n-words in the two sequences is introduced. Applications to real data demonstrate that the Kullback-Leibler discrepancy gives a better performance than Euclidean distance. Moreover, under a Markov chain model of order k_Q for base composition, where k_Q is the estimated order based on the query sequence, standardized Euclidean distance performs very well. Under such a model, it performs as well as Mahalanobis distance and better than Kullback-Leibler discrepancy and Euclidean distance. Since standardized Euclidean distance is drastically faster to compute than Mahalanobis distance in a usual workstation/PC computing environment, the use of standardized Euclidean distance under the Markov chain model of order k_Q of base composition is generally recommended. However, if the user is very concerned with computational efficiency, then the use of the Kullback-Leibler discrepancy, which can be computed as fast as Euclidean distance, is recommended. This can significantly enhance the current technology in comparing large datasets of DNA sequences.
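To make the word-based measures concrete, here is a small sketch computing n-word frequency vectors and three of the distances discussed: Euclidean, a standardized Euclidean, and a symmetrized Kullback-Leibler discrepancy. The pseudocount and the pooled-variance proxy used for standardization are my simplifications; the paper's standardization under a Markov model of order k_Q is more involved.

```python
import numpy as np
from itertools import product

def nword_freqs(seq, n, pseudo=0.5):
    """Relative frequencies of all 4**n words of length n, with a pseudocount
    so that later logarithms and divisions are well defined."""
    words = ["".join(w) for w in product("ACGT", repeat=n)]
    counts = dict.fromkeys(words, pseudo)
    for i in range(len(seq) - n + 1):
        w = seq[i:i + n]
        if w in counts:
            counts[w] += 1
    v = np.array([counts[w] for w in words], dtype=float)
    return v / v.sum()

def euclidean(p, q):
    return float(np.sqrt(((p - q) ** 2).sum()))

def std_euclidean(p, q):
    # standardize each coordinate by a pooled variance proxy (a simplification)
    var = (p + q) / 2
    return float(np.sqrt(((p - q) ** 2 / var).sum()))

def kl_discrepancy(p, q):
    # symmetrized Kullback-Leibler divergence: KL(p||q) + KL(q||p)
    return float(((p - q) * np.log(p / q)).sum())

a = "ACGTACGTTGCAACGTGGGTACCA" * 20
b = "ACCTAGGTTGCTACGAGGGTCCCA" * 20
p, q = nword_freqs(a, 3), nword_freqs(b, 3)
print(euclidean(p, q), std_euclidean(p, q), kl_discrepancy(p, q))
```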
16.
17.
18.
Aristides T. Hatjimihail 《PloS one》2009,4(6)
Background
An open problem in clinical chemistry is the estimation of the optimal sampling time intervals for the application of statistical quality control (QC) procedures that are based on the measurement of control materials. This is a probabilistic risk assessment problem that requires reliability analysis of the analytical system, and the estimation of the risk caused by the measurement error.
Methodology/Principal Findings
Assuming that the states of the analytical system are the reliability state, the maintenance state, the critical-failure modes and their combinations, we can define risk functions based on the mean time of the states, their measurement error and the medically acceptable measurement error. Consequently, a residual risk measure rr can be defined for each sampling time interval. The rr depends on the state probability vectors of the analytical system, the state transition probability matrices before and after each application of the QC procedure, and the state mean time matrices. The optimal sampling time intervals can then be defined as those minimizing a QC-related cost measure while keeping the rr acceptable. I developed an algorithm that estimates the rr for any QC sampling time interval of a QC procedure applied to analytical systems with an arbitrary number of critical-failure modes, assuming any failure time and measurement error probability density function for each mode. Furthermore, given the acceptable rr, it can estimate the optimal QC sampling time intervals.
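A deliberately reduced illustration of the interval-optimization idea (not the author's algorithm): a two-state system that fails at a constant rate, QC that detects a failure with fixed probability at the end of each interval, and a crude residual-risk proxy. Every number, and the risk measure itself, is an invented stand-in for the paper's state-vector formulation.

```python
import numpy as np

def residual_risk(T, lam, p_det, horizon=1000.0):
    """Expected fraction of time spent in an undetected failed state when QC
    runs every T hours; lam is the failure rate, p_det the detection power."""
    n = int(horizon / T)
    p_fail_during = 1 - np.exp(-lam * T)   # failure somewhere in an interval
    undetected = 0.0                        # expected undetected-failure time
    p_carry = 0.0                           # P(enter interval already failed)
    for _ in range(n):
        # carried-over failures last the whole interval; new ones, half of it
        undetected += p_carry * T + (1 - p_carry) * p_fail_during * (T / 2)
        p_now = p_carry + (1 - p_carry) * p_fail_during
        p_carry = p_now * (1 - p_det)       # QC misses with prob 1 - p_det
    return undetected / horizon

def optimal_interval(candidates, lam, p_det, rr_max):
    # QC cost falls as T grows, so pick the largest acceptable interval
    ok = [T for T in candidates if residual_risk(T, lam, p_det) <= rr_max]
    return max(ok) if ok else None

grid = [1, 2, 4, 8, 12, 24]                 # hours between QC samples
print(optimal_interval(grid, lam=0.002, p_det=0.9, rr_max=0.01))  # -> 8
```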
Conclusions/Significance
It is possible to rationally estimate the optimal QC sampling time intervals of an analytical system so as to sustain an acceptable residual risk at the minimum QC-related cost. The optimization requires reliability analysis of the analytical system and risk analysis of the measurement error.
19.
Elisabeth G. Biesta-Peters, Martine W. Reij, Leon G. M. Gorris, Marcel H. Zwietering 《Applied and environmental microbiology》2010,76(17):5791-5801
A combination of multiple hurdles to limit microbial growth is frequently applied in foods to achieve an overall level of protection. Quantification of hurdle technology aims at identifying synergistic or multiplicative effects and is still being developed. The gamma hypothesis states that inhibitory environmental factors that limit microbial growth rates combine in a multiplicative manner rather than synergistically. Its validity was tested here with respect to the use of pH and various concentrations of undissociated acids, i.e., acetic, lactic, propionic, and formic acids, to control growth of Bacillus cereus in brain heart infusion broth. The key growth parameter considered was the maximum specific growth rate, μmax, as observed by determination of optical density. A variety of models from the literature describing the effects of various pH values and undissociated acid concentrations on μmax were fitted to experimental data sets and compared based on a predefined set of selection criteria, and the best models were selected. The cardinal model developed by Rosso (for pH dependency) and the model developed by Luong (for undissociated acid) were found to provide the best fit and were combined in a gamma model with good predictive performance. The introduction of synergy factors into the models was not able to improve the quality of the prediction. On the contrary, inclusion of synergy factors led to an overestimation of the growth boundary, with the inherent possibility of leading to underestimation of the risk under the conditions tested in this research.

Consumers expect safe and sufficiently stable food within the given shelf life of a food product or component. Several growth-limiting factors, collectively referred to as hurdles, can be used to ensure food stability and safety. Examples of such hurdles are low pH, low water activity, or low temperature (12). Combining hurdles to achieve food stability and safety, known as hurdle technology, can be used to achieve an overall level of protection in food while minimizing impacts on food quality (20). When a combination of hurdles is used, the intensity of each hurdle can generally be lower than when it is used individually, while exerting a comparable preservative effect (20). Three classes of interaction can be defined when applying hurdle technology: "no interaction," in which the effect of a combination is as expected from the response of the separate factors; "synergy," in which the effect is greater than expected; and "antagonism," in which the effect is less than expected (6).

Though the concept of hurdle technology is rather well established, the quantification of the combined impact of hurdles on growth of microorganisms is still being developed. One significant problem is that there are two opposite views of how antimicrobial factors combine. One view states that there are interactive effects between hurdles; when they are applied together, they give a protection significantly greater than that expected on the basis of the application of the individual hurdles (synergy). The alternative view considers that the combined effect may be complex but that there are no interactive effects culminating in synergy. The latter view is called the gamma hypothesis (41) and states that inhibitory environmental factors combine in a multiplicative manner to produce the observed overall microbial inhibition.
A major benefit of models based on the gamma hypothesis is a reduction in experimental work, since growth rates and, as a result, growth boundaries can be estimated upon evaluating single hurdles rather than their various combinations. This benefit can only be realized, however, when the gamma hypothesis is valid for the combination of hurdles considered. If the hypothesis is not valid and interactive effects are present, growth boundaries are estimated wrongly, which might result in fail-safe predictions.

Over the years, the gamma hypothesis has been confirmed by several studies (16, 17, 26, 34, 38) that concluded that the combined effect of hurdles on growth rates is multiplicative rather than synergistic. Contrarily, Rödel and Scheuer (30) concluded that interaction occurs when various hurdles are combined, stressing the occurrence of synergy. Both Le Marc et al. (21) and Augustin and Carlier (5) developed a synergy model to take account of synergy occurring when hurdles are combined. It is prudent to conclude that the effect of combinations of hurdles is best evaluated on a case-by-case basis in order to ensure appropriate utility of hurdle technology approaches in establishing food designs that are stable and safe.

This research aimed to validate or falsify the gamma hypothesis for two closely related hurdles often used in the food industry: the pH level and the undissociated acid concentration ([HA]). The approach chosen was to establish an overview of models for pH and undissociated acid from the literature. Based on predefined criteria, models were then selected to construct a new gamma model without synergy factors for the various hurdle combinations. The criteria were meant to enable evaluation of the fitting performance of all individual models to select the best-performing models for inclusion in the new gamma models. Finally, the validity of the gamma hypothesis was judged by comparing the predictive performance of the newly constructed gamma models with two gamma models including a synergy factor reported in the literature. Bacillus cereus F4810/72, relevant for both food spoilage and poisoning (14, 19), was used as the model microorganism. Maximum specific growth rates were determined by optical density measurements combined with time to detection. This method was selected after thorough investigation of three different methods to obtain parameters for growth, as recently published (8).
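A minimal sketch of the multiplicative (gamma) combination described above, pairing Rosso's cardinal pH model with Luong's undissociated-acid term. All parameter values below are invented for illustration, not the fitted B. cereus values from the paper.

```python
import numpy as np

def gamma_pH_rosso(pH, pH_min=4.5, pH_opt=7.0, pH_max=9.5):
    """Rosso cardinal pH model: 1 at pH_opt, 0 at or beyond pH_min/pH_max."""
    if not (pH_min < pH < pH_max):
        return 0.0
    num = (pH - pH_min) * (pH - pH_max)
    den = num - (pH - pH_opt) ** 2
    return num / den

def gamma_HA_luong(HA, MIC=30.0, alpha=1.0):
    """Luong-type term; HA and MIC in mM undissociated acid, growth stops at MIC."""
    return max(0.0, 1.0 - (HA / MIC) ** alpha)

def mu_max(pH, HA, mu_opt=2.0):
    # gamma hypothesis: the factors combine multiplicatively, no synergy term
    return mu_opt * gamma_pH_rosso(pH) * gamma_HA_luong(HA)

for pH, HA in [(7.0, 0.0), (5.5, 10.0), (5.0, 25.0)]:
    print(f"pH={pH}, [HA]={HA} mM -> mu_max={mu_max(pH, HA):.3f} 1/h")
```

Testing the hypothesis then amounts to asking whether such a product of single-hurdle terms predicts the observed μmax for combined hurdles, or whether an extra synergy factor is needed; the result above is that the synergy factor only overestimated the growth boundary.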
20.
The problem of combining information from separate trials is a key consideration when performing a meta-analysis or planning a multicentre trial. Although there is a considerable journal literature on meta-analysis based on individual patient data (IPD), i.e. a one-step IPD meta-analysis, versus analysis based on summary data, i.e. a two-step IPD meta-analysis, recent articles in the medical literature indicate that there is still confusion and uncertainty as to the validity of an analysis based on aggregate data. In this study, we address one of the central statistical issues by considering the estimation of a linear function of the mean, based on linear models for summary data and for IPD. The summary data from a trial are assumed to comprise the best linear unbiased estimator, or maximum likelihood estimator, of the parameter, along with its covariance matrix. The setup, which allows for the presence of random effects and covariates in the model, is quite general and includes many of the commonly employed models, for example, linear models with fixed treatment effects and fixed or random trial effects. For this general model, we derive a condition under which the one-step and two-step IPD meta-analysis estimators coincide, extending earlier work considerably. The implications of this result for the specific models mentioned above are illustrated in detail, both theoretically and in terms of two real data sets, and the roles of balance and heterogeneity are highlighted. Our analysis also shows that when covariates are present, which is typically the case, the two estimators coincide only under extra simplifying assumptions, which are somewhat unrealistic in practice.
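As a worked special case (my illustration of the issue, not the paper's general condition): for a fixed-effects linear model in which every trial estimates the same full parameter vector, the one-step and two-step estimators can be written and compared directly.

```latex
% Fixed-effects illustration. Trial i contributes IPD (y_i, X_i) with known
% covariance V_i, or the summary estimate \hat\theta_i with covariance \Sigma_i.
\begin{align*}
\text{one-step (IPD):}\quad
\hat\theta_{\mathrm{1S}} &= \Big(\sum_i X_i^\top V_i^{-1} X_i\Big)^{-1}
                            \sum_i X_i^\top V_i^{-1} y_i,\\
\text{two-step (summary):}\quad
\hat\theta_{\mathrm{2S}} &= \Big(\sum_i \Sigma_i^{-1}\Big)^{-1}
                            \sum_i \Sigma_i^{-1}\hat\theta_i,\qquad
\hat\theta_i = (X_i^\top V_i^{-1}X_i)^{-1}X_i^\top V_i^{-1}y_i .
\end{align*}
% With \Sigma_i = (X_i^\top V_i^{-1}X_i)^{-1}, substituting \hat\theta_i into
% \hat\theta_{2S} reproduces \hat\theta_{1S} exactly, so the two analyses
% coincide in this special case. With trial-level covariates or random
% effects this no longer holds automatically, which is where the paper's
% extra conditions enter.
```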