Similar Literature
20 similar documents found (search time: 15 ms)
1.
Theory of mind (ToM) is a great evolutionary achievement. It is a special intelligence that can assess not only one's own desires and beliefs, but also those of others. Whether it is uniquely human is controversial, but it is clear that humans are, at least, significantly better at ToM than any other animal. Economists and game theorists have developed sophisticated and powerful models of ToM, which we summarize in detail here. This economic ToM entails a hierarchy of beliefs: I know my preferences, and I have beliefs (a probability distribution) about your preferences, beliefs about your beliefs about my preferences, and so on. We then contrast this economic ToM with the theoretical approaches of neuroscience and with empirical data in general. Although this economic view provides a benchmark and makes useful suggestions about empirical tendencies, it does not always generate a close fit with the data. This provides an opportunity for a synergistic interdisciplinary production of a falsifiable theory of bounded rationality. In particular, a ToM founded on evolutionary biology might well be sufficiently structured to have predictive power while remaining quite general. We sketch two papers that represent preliminary steps in this direction.

2.
Much current vision research is predicated on the idea, and a rapidly growing body of evidence, that visual percepts are generated according to the empirical significance of light stimuli rather than their physical characteristics. As a result, an increasing number of investigators have asked how visual perception can be rationalized in these terms. Here, we compare two different theoretical frameworks for predicting what observers actually see in response to visual stimuli: Bayesian decision theory and empirical ranking theory. Deciding which of these approaches has greater merit is likely to determine how the statistical operations that apparently underlie visual perception are eventually understood.
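The Bayesian decision theory framework mentioned above can be sketched very simply: a prior over world states is combined with the likelihood of the observed stimulus under each state, and the percept is the state that maximizes the posterior. The states and numbers below are invented for illustration only, not taken from the paper.

```python
# Minimal sketch of Bayesian decision theory applied to perception.
# All states and probabilities are illustrative assumptions.

def posterior(prior, likelihood):
    """Combine a prior over world states with the likelihood of the
    observed stimulus under each state (Bayes' rule)."""
    unnorm = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

# Two candidate physical sources for the same retinal luminance:
prior = {"dim_light_white_surface": 0.7, "bright_light_gray_surface": 0.3}
likelihood = {"dim_light_white_surface": 0.2, "bright_light_gray_surface": 0.6}

post = posterior(prior, likelihood)
percept = max(post, key=post.get)  # MAP choice under 0-1 loss
```

Under this toy prior and likelihood, the less probable world state wins because it explains the stimulus much better; empirical ranking theory would instead assign the percept according to the rank of the stimulus in accumulated past experience.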

3.
Niche construction theory (NCT) has emerged as a promising theoretical tool for interpreting zooarchaeological material. However, its juxtaposition against more established frameworks like optimal foraging theory (OFT) has raised important criticism concerning the testability of NCT for interpreting hominin foraging behavior. Here, after a brief review of OFT and NCT, we present an optimization foraging model with NCT features designed to accommodate the destructive realities of the archaeological record. The model considers a forager's decision to exploit an environment given predation risk, mortality, and payoff ratios between different ecologies, such as more-open or more-forested environments. We then discuss how the model can be used with zooarchaeological data to infer environmental exploitation by a primitive hominin, Homo floresiensis, from the island of Flores in Southeast Asia. Our example demonstrates that NCT can be used in combination with OFT principles to generate testable foraging hypotheses suitable for zooarchaeological research.

4.
5.
Concepts from information theory can enhance our understanding of perceptual processes by providing a unified picture of the process of perception. A single equation is shown to embrace adaptation phenomena, stimulus-response relations, and differential thresholds. Sensory adaptation is regarded as representing a gain in information by the receptor. It is calculated that for constant stimuli in the form of step inputs, insects and arachnids obtain approximately the same amount of information per stimulus from their respective environments as do human beings.

6.
Paul Thompson, John Beatty, and Elisabeth Lloyd argue that attempts to resolve certain conceptual issues within evolutionary biology have failed because of a general adherence to the received view of scientific theories. They maintain that such issues can be clarified and resolved when one adopts a semantic approach to theories. In this paper, I argue that such conceptual issues are just as problematic under a semantic approach. They arise from the complexity involved in providing formal accounts of theoretical laws and scientific explanations, a complexity due to empirical and pragmatic considerations, not to adherence to any particular formal approach to theories. This analysis raises a broader question: how can any formal account properly represent the complex nature of empirical phenomena?

7.
The use of diploid sequence markers remains challenging despite the good quality of the information they provide. A problem is common to all sequencing approaches, whether traditional cloning and sequencing of PCR amplicons or next-generation sequencing (NGS): when no variation is found among the sequences from a given individual, homozygosity can never be asserted with certainty. As a consequence, sequence data from diploid markers are mostly analysed at the population level (not the individual level), particularly in animal studies. This study aims to contribute to solving this problem. Using Bayes' theorem and the binomial distribution, several useful results are derived, among which: (i) the number of sequence reads per individual (sequencing depth) required to ensure, at a given probability threshold, that heterozygotes are not erroneously scored as homozygotes, as a function of the observed heterozygosity (Ho) of the locus in the population; (ii) a way of estimating Ho from low-coverage NGS data; (iii) a way of testing the null hypothesis that a genetic marker corresponds to a single diploid locus, in the absence of data from controlled crosses; (iv) strategies for characterizing sequence genotypes in populations that minimize the average number of sequence reads per individual; and (v) a rationale for deciding which variations along the sequence need to be considered, as a function of the affordable sequencing depth, the level of polymorphism desired, and the risk of sequencing error. For traditional sequencing technology, optimal strategies appear surprisingly different from the usual empirical ones. The average number of sequence reads required to obtain 99% fully determined genotypes never exceeds six, this value corresponding to the worst case, when Ho equals 0.6. This threshold value of Ho is strikingly stable when the tolerated proportion of not fully resolved genotypes varies over a reasonable range. These results rely neither on the assumption of Hardy-Weinberg equilibrium nor on diallelism of nucleotide sites.
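The flavour of result (i) can be sketched with Bayes' theorem alone. The calculation below is an illustrative simplification, not the paper's exact derivation: if all n reads from an individual show the same allele, and a heterozygote yields either allele with probability 1/2 per read, the posterior probability that the individual is nonetheless a heterozygote depends on the population heterozygosity Ho.

```python
# Illustrative Bayes calculation for sequencing depth (a sketch of the
# kind of result described above, not the paper's exact formula).
# If all n reads are identical:
#   P(same | het) = 2 * (1/2)**n = 0.5**(n-1),  P(same | hom) = 1
#   P(het | same) = Ho * 0.5**(n-1) / (Ho * 0.5**(n-1) + (1 - Ho))

def min_depth(ho, alpha):
    """Smallest read depth n such that a run of n identical reads leaves
    at most probability alpha that a heterozygote was missed."""
    n = 1
    while True:
        p_same_het = 0.5 ** (n - 1)
        post_het = ho * p_same_het / (ho * p_same_het + (1 - ho))
        if post_het <= alpha:
            return n
        n += 1
```

As expected, more polymorphic loci (higher Ho) demand deeper sequencing before homozygosity can be called with confidence.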

8.
Purpose: Identification of biomarkers in major depressive disorder (MDD) has proceeded in an extemporised manner. No single biomarker has been identified with utility in screening, diagnosis, prognosis, or monitoring, and screening tests have different characteristics from tests used for the other functions. Using chaos, bifurcation, and perturbation (CBP) theories, the aim is to identify biomarkers to aid clinicians in screening for MDD.

Materials and methods: MDD is a complex disorder; consequently, a reductionist approach to characterize the complex system changes found in MDD will be inchoate and unreliable. A holistic approach is used to identify biomarkers reflecting the tipping points seen before the catastrophic bifurcation that results in MDD.

Results: Applying CBP theories revealed skew, resistance to change, flickering, and increased variance and autocorrelation as patterns of biomarkers. Integrals and differentials of extracellular and intracellular biomarkers were identified, focussed specifically on hypothalamic-pituitary-adrenal (HPA) axis dysfunction, metabolic dysfunction, inflammation and mitochondrial oxidative stress, and tryptophan metabolism.

Conclusions: Applying CBP theories to the dysfunctional complex biological systems in MDD led to the development of integrals and differentials of biomarkers that can be used in screening for MDD and in planning future biomarker research targeting intracellular and extracellular inflammation, HPA axis dysfunction, and tryptophan metabolism.


9.
An information-theoretical approach to phylogeography
Data analysis in phylogeographic investigations is typically conducted either in a qualitative manner or via the testing of null hypotheses. The former, where inferences about population processes are derived from geographical patterns of genetic variation, may be subject to confirmation bias and prone to overinterpretation. Testing the predictions of null hypotheses is arguably less prone to bias than qualitative approaches, but only if the tested hypotheses are biologically meaningful. As it is difficult to know a priori whether this is the case, there is a general need for additional methodological approaches in phylogeographic research. Here, we explore an alternative method for analysing phylogeographic data that uses information theory to quantify the probability of multiple hypotheses given the data. We accomplish this by augmenting the model-selection procedure implemented in IMa with calculations of Akaike Information Criterion (AIC) scores and model probabilities. We generate a ranking of 17 models, each representing a set of historical evolutionary processes that may have contributed to the evolution of Plethodon idahoensis, and then quantify the relative strength of support for each hypothesis given the data using metrics borrowed from information theory. Our results suggest that two models have high probability given the data. Each includes population divergence and estimates of ancestral θ that differ from estimates of descendant θ, inferences consistent with prior work in this system. However, the models disagree in that one includes migration as a parameter and one does not, suggesting that there are two regions of parameter space that produce model likelihoods of similar magnitude given our data. Results of a simulation study suggest that when data are simulated with migration, most of the optimal models include migration as a parameter, and that when all of the shared polymorphism results from incomplete lineage sorting, most of the optimal models do not. The results could also indicate a lack of precision, which may be a product of the amount of data we have collected. In any case, the information-theoretic metrics we have applied are as statistically rigorous as hypothesis-testing approaches, but move beyond the 'reject/fail to reject' dichotomy of conventional hypothesis testing in a manner that gives researchers considerably more flexibility.
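The model-ranking machinery described above is standard: each model's maximized log-likelihood and parameter count give an AIC score, and AIC differences are converted to Akaike weights (model probabilities). A minimal sketch, with made-up log-likelihoods standing in for the IMa output:

```python
import math

# Sketch of AIC-based model ranking: AIC = 2k - 2*lnL, then Akaike
# weights w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2).
# Model names and log-likelihoods are hypothetical placeholders.

def akaike_weights(models):
    """models: dict name -> (max_log_likelihood, n_parameters).
    Returns dict name -> (AIC, delta_AIC, Akaike weight)."""
    aic = {m: 2 * k - 2 * lnL for m, (lnL, k) in models.items()}
    best = min(aic.values())
    rel = {m: math.exp(-(a - best) / 2) for m, a in aic.items()}
    z = sum(rel.values())
    return {m: (aic[m], aic[m] - best, rel[m] / z) for m in models}

models = {
    "divergence_only":      (-1202.3, 4),
    "divergence_migration": (-1200.9, 6),
    "panmixia":             (-1215.0, 2),
}
ranked = akaike_weights(models)
```

With these toy numbers two models end up with comparable weight and one is effectively excluded, mirroring the "two high-probability models" pattern reported in the abstract.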

10.
11.
Economics and ecology both present us with a key challenge: scaling up from individual behaviour to community-level effects. As a result, biologists have frequently borrowed theories and frameworks from economics in their attempts to better understand animal behaviour. In the study of predator–prey interactions, we face a particularly difficult task: understanding how predator choices and strategies will impact the ecology and evolution not just of individual prey species, but of whole communities. A similar challenge has been encountered, and largely solved, in marketing, which has created frameworks that successfully predict human consumer behaviour at the community level. We argue that by applying these frameworks to non-human consumers, we can leverage this predictive power to understand the behaviour of these key ecological actors in shaping the communities they act upon. Here we use predator–prey interactions as a case study to demonstrate and discuss the potential of marketing and human-consumer theory to help bridge the gap from laboratory experiments to complex community dynamics.

12.
One of the most challenging problems in computational chemistry and in drug discovery is the accurate prediction of the binding energy between a ligand and a protein receptor. It is well known that the binding energy calculated with Hartree–Fock molecular orbital theory (HF) lacks the dispersion interaction energy, which significantly affects the accuracy of the total binding energy of a large molecular system. We propose a simple and efficient dispersion energy correction to HF theory (HF-Dtq). The performance of HF-Dtq was compared with that of several recently proposed dispersion-corrected density functional theory methods (DFT-Ds) on the binding energies of 68 small non-covalent complexes. The overall performance of HF-Dtq was found to be nearly equivalent to that of the more sophisticated B3LYP-D3. HF-Dtq should thus be a useful and powerful method for accurately predicting the binding energy between a ligand and a protein, even though it is a simple correction procedure based on HF.
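Pairwise dispersion corrections of this general family add a damped -C6/r^6 term over atom pairs on top of the electronic-structure energy. The sketch below shows the generic shape of such a correction; the coefficients, the Fermi-type damping function, and all parameter values are illustrative placeholders, not the actual HF-Dtq or DFT-D parametrization.

```python
import math

# Schematic pairwise dispersion correction:
#   E_disp = -s6 * sum_{i<j} f_damp(r_ij) * C6_ij / r_ij**6
# with a Fermi-type damping f(r) = 1 / (1 + exp(-gamma*(r/r0 - 1)))
# that switches the correction off at short range. All parameters are
# illustrative, not the published HF-Dtq values.

def dispersion_energy(coords, c6, s6=1.0, r0=3.0, gamma=8.0):
    """coords: list of (x, y, z) in bohr; c6[i][j]: pair coefficient (a.u.)."""
    e = 0.0
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(coords[i], coords[j])
            f = 1.0 / (1.0 + math.exp(-gamma * (r / r0 - 1.0)))
            e -= s6 * f * c6[i][j] / r ** 6
    return e

c6 = [[0.0, 10.0], [10.0, 0.0]]            # hypothetical C6 for one atom pair
e_near = dispersion_energy([(0, 0, 0), (4, 0, 0)], c6)
e_far = dispersion_energy([(0, 0, 0), (8, 0, 0)], c6)
```

The correction is attractive (negative) and decays as r^-6, which is exactly the long-range physics HF misses.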

13.
Applications of selective neutrality tests to molecular ecology
Ford MJ, Molecular Ecology (2002) 11(8): 1245-1262
This paper reviews how statistical tests of neutrality have been used to address questions in molecular ecology. The work consists of four major parts: a brief review of the current status of the neutral theory; a review of several particularly interesting examples of how statistical tests of neutrality have led to insight into ecological problems; a brief discussion of the pitfalls of assuming a strictly neutral model if it is false; and a discussion of some of the opportunities and problems that molecular ecologists face when using neutrality tests to study natural selection.
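One of the classic neutrality tests in this family is Tajima's D, which compares two estimators of the population mutation parameter: mean pairwise differences (pi) and the number of segregating sites scaled by a1. A minimal sketch using the standard Tajima (1989) constants, on an invented toy alignment:

```python
import math
from itertools import combinations

# Sketch of Tajima's D on a toy alignment (standard constants from
# Tajima 1989; the sequences are invented for illustration).

def tajimas_d(seqs):
    n = len(seqs)
    # pi: mean number of pairwise differences
    pairs = list(combinations(seqs, 2))
    pi = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs) / len(pairs)
    # S: number of segregating (polymorphic) sites
    s_seg = sum(len({s[i] for s in seqs}) > 1 for i in range(len(seqs[0])))
    if s_seg == 0:
        return 0.0
    a1 = sum(1 / i for i in range(1, n))
    a2 = sum(1 / i ** 2 for i in range(1, n))
    b1 = (n + 1) / (3 * (n - 1))
    b2 = 2 * (n ** 2 + n + 3) / (9 * n * (n - 1))
    c1 = b1 - 1 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1 = c1 / a1
    e2 = c2 / (a1 ** 2 + a2)
    return (pi - s_seg / a1) / math.sqrt(e1 * s_seg + e2 * s_seg * (s_seg - 1))

d = tajimas_d(["AAAA", "AAAT", "AATT", "ATTT"])  # toy data, D slightly > 0
```

Under strict neutrality D is expected to be near zero; strongly negative or positive values are the signal such tests use to flag selection (or demographic change, one of the pitfalls the review discusses).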

14.
15.
Ecologic systems, which are involved mainly in the processing of energy and materials, are nested one inside another: they are simultaneously parts and wholes. This fundamental hierarchical organization is easy to detect in nature but has been undervalued by ecologists, both as a source of new insights about the structure and development of ecosystems and as a means of understanding the crucial connections between ecologic processes and large-scale evolutionary patterns. These ecologic systems include individual organisms bundled into local populations, populations as functional components of local communities or ecosystems, local systems making up the working parts of larger regional ecosystems, and so on, right up to the entire biosphere. Systems at any level of organization can be described and interpreted based on aspects of scale (size, duration, and “membership” in more inclusive entities), integration (all the vital connections both at a particular focal level and across levels of hierarchical organization), spatiotemporal continuity (the “life history” of each system), and boundaries (membranes, skins, or some other kind of border criterion). Considering hierarchical organization as a general feature of ecologic systems could reinvigorate theoretical ecology, provide a realistic scaling framework for paleoecologic studies, and, most importantly, forge new and productive connections between ecology and evolutionary theory.

16.
A. Revonsuo (2000b) proposed an evolutionary theory of dreaming which holds that dreaming is a threat simulation mechanism that allowed early humans to rehearse threat perception and avoidance without biological cost. The present study aimed to establish the proportion of dreams containing physical threats to the dreamer, whether these represent realistic life-threatening events, and whether the dreamer successfully and realistically escapes. It also examined the incidence of threatening events in real life. A sample of most recent dreams was collected (N = 401). Only 8.48% of dreamers reported realistic life-threatening events in dreams, and a realistic escape subsequently occurred in only one third of these reports. Actual severe life-threatening events were experienced by 44.58% of the sample. These findings contradict key aspects of Revonsuo's theory.

17.
Indole-3-acetic acid (IAA) concentrations in mycorrhizal and non-mycorrhizal Scots pine roots under moderate- and high-nitrogen nutrition were assayed using mass spectrometry with an internal standard. Contrary to current theory, IAA was lower in mycorrhizal roots than in the controls, and higher during high-nitrogen nutrition.

18.
The branch of evolutionary theory known as signaling theory attempts to explain various forms of communication. Social scientists have explained many traditional rituals as forms of communication that promote cooperative social relationships among participants. Both evolutionists and social scientists have realized the importance of trust for the formation and maintenance of cooperative social relationships. These factors have led to attempts to apply signaling theory to traditional cultural rituals in various ways. This paper uses the traditional ritual of mumming in small Newfoundland fishing villages to evaluate alternative hypotheses about the connection between rituals, communication, trust, and cooperation. Mumming is found to be most consistent with the hypothesis that it is a ritual of trust wherein participants take a specific type of risk: the risk of harm at the hands of other participants. Individuals who take this risk actively signal their trust. Conversely, individuals who restrain themselves from inflicting harm on other participants actively signal their trustworthiness.

19.
This paper introduces a new model that enables researchers to conduct protein folding simulations. A two-step in silico process is used in the course of structural analysis of a set of fast-folding proteins. The model assumes an early stage (ES) that depends solely on the backbone conformation, as described by its geometrical properties, specifically by the V-angle between two sequential peptide bond planes (which determines the radius of curvature, also called the R-radius, according to a second-degree polynomial form). The agreement between the structure under consideration and the assumed model is measured by the magnitude of dispersion of both parameters with respect to idealized values. The second step, called late-stage folding (LS), is based on the “fuzzy oil drop” model, which involves an external hydrophobic force field described by a three-dimensional Gauss function. The degree of conformance between the structure under consideration and its idealized model is expressed quantitatively by means of the Kullback-Leibler entropy, a measure of disparity between the observed and expected hydrophobicity distributions. A set of proteins representative of the fast-folding group, specifically cold shock proteins, is shown to agree with the proposed model.
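The quantitative comparison in the late-stage step reduces to a Kullback-Leibler divergence between an observed per-residue hydrophobicity distribution O and the theoretical distribution T expected from the 3D Gaussian drop. A minimal sketch of that comparison; the hydrophobicity profiles below are invented, and the 1D profiles stand in for the paper's full 3D treatment:

```python
import math

# Kullback-Leibler divergence between observed (O) and theoretical (T)
# hydrophobicity distributions, as in the "fuzzy oil drop" comparison.
# The profiles are invented 1D stand-ins for a real per-residue profile.

def kl_divergence(observed, theoretical):
    """D_KL(O || T) = sum_i O_i * log(O_i / T_i); inputs are normalized
    to sum to 1 before comparison."""
    o_sum, t_sum = sum(observed), sum(theoretical)
    o = [x / o_sum for x in observed]
    t = [x / t_sum for x in theoretical]
    return sum(oi * math.log(oi / ti) for oi, ti in zip(o, t) if oi > 0)

theoretical = [0.05, 0.15, 0.30, 0.30, 0.15, 0.05]   # Gaussian-like profile
observed_good = [0.06, 0.14, 0.29, 0.31, 0.14, 0.06]  # close to the model
observed_bad = [0.30, 0.05, 0.10, 0.10, 0.05, 0.40]   # hydrophobic surface

d_good = kl_divergence(observed_good, theoretical)
d_bad = kl_divergence(observed_bad, theoretical)
```

A small divergence means the structure's hydrophobic core matches the idealized drop; a large one flags a mismatch, such as hydrophobic residues exposed at the surface.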

20.
Arnold B, Bomblies K, Wakeley J, Genetics (2012) 192(1): 195-204
We develop coalescent models for autotetraploid species with tetrasomic inheritance. We show that the ancestral genetic process in a large population without recombination may be approximated using Kingman's standard coalescent, with a coalescent effective population size of 4N. Numerical results suggest that this approximation is accurate for population sizes on the order of hundreds of individuals. Therefore, existing coalescent simulation programs can be adapted to study population history in autotetraploids simply by interpreting the timescale in units of 4N generations. We also consider the possibility of double reduction, a phenomenon unique to polysomic inheritance, and show that its effects on gene genealogies are similar to those of partial self-fertilization.
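The practical consequence, reinterpreting the coalescent timescale with effective size 4N, can be illustrated with a toy pairwise coalescence simulation. This is a schematic sketch under the abstract's approximation, not the authors' model: two lineages coalesce at rate 1/(4N) per generation, so waiting times are approximately exponential with mean 4N generations.

```python
import random

# Sketch of the 4N reinterpretation for autotetraploids: pairwise
# coalescence times drawn as exponential waiting times with mean 4N
# generations (the continuous approximation to a geometric wait with
# per-generation probability 1/(4N)). Parameters are arbitrary.

def pairwise_coalescence_times(n_individuals, n_reps, rng):
    four_n = 4 * n_individuals      # coalescent effective size (tetraploid)
    rate = 1.0 / four_n             # per-generation coalescence rate
    return [rng.expovariate(rate) for _ in range(n_reps)]

rng = random.Random(42)
times = pairwise_coalescence_times(100, 20000, rng)
mean_t = sum(times) / len(times)    # should be close to 4N = 400 generations
```

In other words, a standard diploid coalescent simulator can be reused as-is, with the time axis simply read in units of 4N generations.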


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号