Similar articles

20 similar articles found (search time: 187 ms)
1.
Maximum likelihood (ML) (Neyman, 1971) is an increasingly popular optimality criterion for selecting evolutionary trees. Finding optimal ML trees appears to be a very hard computational task; in particular, algorithms and heuristics for ML take longer to run than algorithms and heuristics for maximum parsimony (MP). However, while MP has been known to be NP-complete for over 20 years, no such hardness result has so far been obtained for ML. In this work we make a first step in this direction by proving that ancestral maximum likelihood (AML) is NP-complete. The input to this problem is a set of aligned sequences of equal length, and the goal is to find a tree and an assignment of ancestral sequences to all of that tree's internal vertices such that the likelihood of generating both the ancestral and contemporary sequences is maximized. Our NP-hardness proof follows that for MP given by Day, Johnson and Sankoff (1986) in that we use the same reduction from Vertex Cover; however, the proof of correctness for this reduction relative to AML is different and substantially more involved.
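A small sketch makes the AML objective concrete. Under one common formalization (a symmetric two-state model with the substitution probability optimized separately on each edge; this modelling choice is ours, the abstract does not fix it), an edge whose endpoint sequences differ at d of L sites contributes at most d*log(d/L) + (L-d)*log((L-d)/L) to the log-likelihood, and the AML score of a fixed tree with fixed ancestral labels is the sum over edges:

```python
import math

def edge_max_loglik(s1, s2):
    """Best log-likelihood of a single edge under a symmetric two-state
    model when the edge's substitution probability p is free: with d of
    L sites differing, the optimum is p = d/L."""
    L = len(s1)
    d = sum(a != b for a, b in zip(s1, s2))
    ll = 0.0
    if d > 0:
        ll += d * math.log(d / L)
    if d < L:
        ll += (L - d) * math.log((L - d) / L)
    return ll

def aml_score(edges, seqs):
    """AML objective for a FIXED topology and a FIXED assignment of
    ancestral sequences; AML asks to maximize this over both choices,
    which is the hard part."""
    return sum(edge_max_loglik(seqs[u], seqs[v]) for u, v in edges)

# Tiny illustrative example: a root r joined to two leaves a and b.
seqs = {"r": "0011", "a": "0011", "b": "0111"}
edges = [("r", "a"), ("r", "b")]
score = aml_score(edges, seqs)
```

Even in this toy form the combinatorial difficulty is visible: the ancestral sequence at each internal vertex ranges over all 2^L strings, and the tree topology must be chosen as well.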

2.
Abstract— The method of parsimony in phylogenetic inference is often taken to mean two things: (1) that one should favor the genealogical hypothesis that minimizes the required number of homoplasies (matchings of independently evolved derived character states), and (2) that symplesiomorphies (matchings of primitive character states) have little or no evidential value for phylogenetic relationship. This paper shows both theses to be false by undermining recent likelihood arguments for them and by providing a more secure likelihood proof of a new method, which is incompatible with both (1) and (2).

3.
Summary Studies are carried out on the uniqueness of the stationary point of the likelihood function for estimating molecular phylogenetic trees, proving that there exists at most one stationary point, i.e., the maximum point, in the parameter range for the one-parameter model of nucleotide substitution. The proof is simple yet applicable to any tree topology with an arbitrary number of operational taxonomic units (OTUs). It ensures that any valid approximation algorithm can reach the unique maximum point under the conditions mentioned above. An algorithm incorporating Newton's approximation method is then compared with the conventional one by means of computer simulation. The results show that the newly developed algorithm always requires less CPU time than the conventional one, while both algorithms lead to identical molecular phylogenetic trees, in accordance with the proof. Contribution No. 1780 from the National Institute of Genetics, Mishima 411, Japan
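The flavor of the result can be seen on the simplest one-parameter likelihood of this kind: the Jukes-Cantor distance between two aligned sequences. Because the log-likelihood has a single interior maximum, Newton's method converges to it and agrees with the closed-form estimate. This sketch is illustrative only and assumes the Jukes-Cantor model; it is not the paper's exact algorithm:

```python
import math

def jc_loglik(d, n, k):
    """Jukes-Cantor log-likelihood of divergence d, given k mismatches
    at n sites: per-site mismatch probability p(d) = 3/4*(1 - exp(-4d/3))."""
    p = 0.75 * (1.0 - math.exp(-4.0 * d / 3.0))
    return k * math.log(p) + (n - k) * math.log(1.0 - p)

def jc_mle_newton(n, k, d0=0.1, tol=1e-12, max_iter=50):
    """Maximize the log-likelihood by Newton's method with analytic
    first and second derivatives in the scalar parameter d."""
    d = d0
    for _ in range(max_iter):
        p = 0.75 * (1.0 - math.exp(-4.0 * d / 3.0))
        dp = math.exp(-4.0 * d / 3.0)              # dp/dd
        d2p = -4.0 / 3.0 * dp                      # d^2 p / dd^2
        g = (k / p - (n - k) / (1.0 - p)) * dp     # score
        h = (-k / p**2 - (n - k) / (1.0 - p)**2) * dp**2 \
            + (k / p - (n - k) / (1.0 - p)) * d2p  # second derivative
        step = g / h
        d -= step
        if abs(step) < tol:
            break
    return d

# The one-parameter case also has a closed form to check against:
# d_hat = -3/4 * log(1 - 4k/(3n)).
n, k = 1000, 120
d_hat = jc_mle_newton(n, k)
closed = -0.75 * math.log(1.0 - 4.0 * k / (3.0 * n))
```

The uniqueness of the stationary point is exactly what guarantees that the Newton iterate and the closed form agree regardless of the (reasonable) starting value.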

4.
Hjort & Claeskens (2003) developed an asymptotic theory for model selection, model averaging and subsequent inference using likelihood methods in parametric models, along with associated confidence statements. In this article, we consider a semiparametric version of this problem, wherein the likelihood depends on parameters and an unknown function, and model selection/averaging is to be applied to the parametric parts of the model. We show that all the results of Hjort & Claeskens hold in the semiparametric context, if the Fisher information matrix for parametric models is replaced by the semiparametric information bound for semiparametric models, and if maximum likelihood estimators for parametric models are replaced by semiparametric efficient profile estimators. Our methods of proof employ Le Cam's contiguity lemmas, leading to transparent results. The results also describe the behaviour of semiparametric model estimators when the parametric component is misspecified, and also have implications for pointwise-consistent model selectors.

5.
On Ewens' equivalence theorem for ascertainment sampling schemes
The usual likelihood formulations for segregation analysis of a genetic trait ignore both the at-risk but unobservable families and the demographic structure of the surrounding population. Families are not ascertained if, by chance, they have no affected members or if the affected members are not ascertained. Ewens has shown that likelihoods which take into explicit account both unobservable families and demographic parameters lead to the same maximum likelihood estimates of segregation and ascertainment parameters as the usual likelihoods. This paper provides an alternative proof of Ewens' theorem based on the Poisson distribution and simple continuous optimization techniques.

6.
A LIKELIHOOD JUSTIFICATION OF PARSIMONY
Abstract— A connection is established between maximally parsimonious cladograms and trees of highest likelihood. The assumptions needed to prove this are derivable from the structure of evolutionary theory and are independent of the frequency of homoplasy. The bearing of this justification on alternative methods of phylogenetic inference and on Felsenstein's (1978) proof that parsimony and other phylogenetic methods can be statistically inconsistent is discussed.

7.
This paper examines a quasi-equilibrium theory of rare alleles for subdivided populations that follow an island-model version of the Wright-Fisher model of evolution. All mutations are assumed to create new alleles. We present four results: (1) conditions for the theory to apply are formally established using properties of the moments of the binomial distribution; (2) approximations currently in the literature can be replaced with exact results that are in better agreement with our simulations; (3) a modified maximum likelihood estimator of migration rate exhibits the same good performance on island-model data and on data simulated from the multinomial mixed with the Dirichlet distribution; and (4) a connection between the rare-allele method and the Ewens Sampling Formula for the infinite-allele mutation model is made. This introduces a new and simpler proof for the expected number of alleles implied by the Ewens Sampling Formula.
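The quantity the new proof concerns, the expected number of distinct alleles under the Ewens Sampling Formula, is E[K] = sum over i = 0..n-1 of theta/(theta + i) for a sample of n gene copies with scaled mutation rate theta. A minimal sketch, with a Hoppe's-urn simulation as an independent check (the simulation is our illustration, not the paper's method):

```python
import random

def expected_num_alleles(theta, n):
    """E[K] under the Ewens Sampling Formula: the i-th (0-based) of n
    sequentially sampled gene copies is a novel allele with
    probability theta/(theta + i)."""
    return sum(theta / (theta + i) for i in range(n))

def simulate_num_alleles(theta, n, rng):
    """Hoppe's urn: draw n copies, counting how many were novel alleles."""
    k = 0
    for i in range(n):
        if rng.random() < theta / (theta + i):
            k += 1
    return k

# Monte Carlo check of the formula (seeded for reproducibility).
rng = random.Random(0)
theta, n, reps = 1.0, 10, 20000
mc = sum(simulate_num_alleles(theta, n, rng) for _ in range(reps)) / reps
exact = expected_num_alleles(theta, n)
```

Each summand is the probability that the i-th sampled copy founds a new allele, so the expectation drops out by linearity, which is essentially why a simple proof is possible.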

8.
The precautionary principle reflects an old adage: an ounce of prevention is worth a pound of cure. Its four central components are: taking preventive action in the face of uncertainty; shifting the burden of proof to proponents of an activity; exploring a wide range of alternatives to possibly harmful actions; and increasing public participation in decision processes. Scholars in a range of fields have identified U.S. environmental laws, regulations, and decisions exhibiting precaution de facto. This study moves beyond traditional treatments of the subject and its morass of definitions by systematizing precaution into its basic elements. It poses a further question: within the current legal system and existing laws, how might the precautionary principle be implemented by modifying aspects of a statute? By applying a conceptual legal precautionary framework to a specific example of technological risk management, Washington State's energy facility siting statute, we reveal deficiencies in four areas: compensation issues; burden of proof; Type I or Type II error preferences; and systematic comparisons. Remedying these would, in all likelihood, yield a more effective statute and process as well as an outcome consistent with legislative goals. However, were an explicit statement of the precautionary principle introduced, parties dissatisfied with an outcome would seek judicial review, and extensive litigation could counter the legislative mandate of abundant energy at a reasonable cost.

9.
There has been growing interest in the likelihood paradigm of statistics, where statistical evidence is represented by the likelihood function and its strength is measured by likelihood ratios. The available literature in this area has so far focused on parametric likelihood functions, though in some cases a parametric likelihood can be robustified. This focused discussion on parametric models, while insightful and productive, may have left the impression that the likelihood paradigm is best suited to parametric situations. This article discusses the use of empirical likelihood functions, a well-developed methodology in the frequentist paradigm, to interpret statistical evidence in nonparametric and semiparametric situations. A comparative review of the literature shows that, while an empirical likelihood is not a true probability density, it has the essential properties, namely consistency and local asymptotic normality, that unify and justify the various parametric likelihood methods for evidential analysis. Real examples are presented to illustrate and compare the empirical likelihood method and the parametric likelihood methods. These methods are also compared in terms of asymptotic efficiency by combining relevant results from different areas. It is seen that a parametric likelihood based on a correctly specified model is generally more efficient than an empirical likelihood for the same parameter. However, when the working model fails, a parametric likelihood either breaks down or, if a robust version exists, becomes less efficient than the corresponding empirical likelihood.
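For a concrete instance, the empirical likelihood ratio for a population mean mu (Owen's construction, a standard textbook case rather than an example taken from this article) maximizes the product of n*w_i over weights that sum to one and satisfy sum w_i*(x_i - mu) = 0; the maximizer is w_i = 1/(n*(1 + lam*(x_i - mu))), with lam the root of an estimating equation that bisection finds easily:

```python
import math

def el_log_ratio(x, mu, tol=1e-12):
    """Empirical log-likelihood ratio log R(mu) for the mean (Owen's EL).
    The optimal weights are w_i = 1/(n*(1 + lam*d_i)) with d_i = x_i - mu
    and lam the root of g(lam) = sum d_i/(1 + lam*d_i).
    Requires min(x) < mu < max(x)."""
    d = [xi - mu for xi in x]
    if min(d) >= 0 or max(d) <= 0:
        raise ValueError("mu must lie strictly inside the range of the data")
    # lam must keep every 1 + lam*d_i strictly positive
    lo = -1.0 / max(d) + 1e-10
    hi = -1.0 / min(d) - 1e-10

    def g(lam):  # strictly decreasing in lam, so bisection works
        return sum(di / (1.0 + lam * di) for di in d)

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return -sum(math.log(1.0 + lam * di) for di in d)

x = [1.0, 2.0, 3.0, 4.0, 5.0]
r_mean = el_log_ratio(x, 3.0)  # ~0: EL is maximized at the sample mean
r_off = el_log_ratio(x, 3.5)   # negative: evidence against mu = 3.5
```

Under Owen's theorem, -2*log R(mu) is asymptotically chi-squared with one degree of freedom at the true mean, which is the local-asymptotic-normality property that licenses evidential comparisons without a parametric model.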

10.
Aziz H, Zaas A, Ginsburg GS. Genomic Medicine 2007;1(3-4):105-112
Whole blood gene expression profiling has the potential to be informative about dynamic changes in disease states and to provide information on underlying disease mechanisms. Following proof-of-concept demonstrations in animal models, a number of studies have now tried to tackle the complexity of cardiovascular disease in human hosts to develop better diagnostic and prognostic indicators. These studies show that genomic signatures are capable of classifying patients with cardiovascular diseases into finer categories based on the molecular architecture of a patient's disease, and can predict the likelihood of a cardiovascular event more accurately than current techniques. To highlight the spectrum of potential applications of the whole blood gene expression profiling approach in cardiovascular science, we review the findings in a number of complex cardiovascular diseases such as atherosclerosis, hypertension and myocardial infarction, as well as thromboembolism, aortic aneurysm, and heart transplantation.

11.
Supplementary feeding of birds, particularly in urban areas, is often associated with increased population size and fecundity. In the UK, the non-native Grey Squirrel Sciurus carolinensis is common in rural and urban habitats. It exploits supplementary feeders and may induce interference competition by excluding birds, but empirical evidence of this is unavailable. Using controlled model presentation experiments, we demonstrate that Grey Squirrels could reduce bird use of supplementary feeders and induce interference competition. Total bird resource use was reduced by 98% and most species exhibited similar sensitivities. The likelihood and magnitude of interference competition will depend on how rapidly displaced birds find alternative food sources; it will be greatest where there are high Grey Squirrel densities and few supplementary feeders. Other studies suggest that supplementary feeding increases Grey Squirrel numbers, and the species is also predicted to expand its non-native range across most of Europe. Our data indicate that Grey Squirrels may eventually alter the net effect of supplementary feeding on bird populations across the European continent; increased use of squirrel-proof feeders may help to minimize such effects.

12.
Heinze G, Schemper M. Biometrics 2001;57(1):114-119
The phenomenon of monotone likelihood is observed in the fitting process of a Cox model if the likelihood converges to a finite value while at least one parameter estimate diverges to plus or minus infinity. Monotone likelihood primarily occurs in small samples with substantial censoring of survival times and several highly predictive covariates. Previous options to deal with monotone likelihood have been unsatisfactory. The solution we suggest is an adaptation of a procedure by Firth (1993, Biometrika 80, 27-38) originally developed to reduce the bias of maximum likelihood estimates. This procedure produces finite parameter estimates by means of penalized maximum likelihood estimation. Corresponding Wald-type tests and confidence intervals are available, but it is shown that penalized likelihood ratio tests and profile penalized likelihood confidence intervals are often preferable. An empirical study of the suggested procedures confirms satisfactory performance of both estimation and inference. The advantage of the procedure over previous options of analysis is finally exemplified in the analysis of a breast cancer study.
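The phenomenon, and Firth's fix, can be reproduced in miniature with logistic regression, where the penalty is most familiar (the Cox-model adaptation in the paper follows the same idea). With completely separated data the ordinary log-likelihood increases monotonically in the slope, so the MLE is infinite, while adding the Jeffreys-prior penalty 0.5*log I(beta) restores an interior maximum. The data and the crude grid search below are purely illustrative:

```python
import math

def loglik(beta, xs, ys):
    """Ordinary logistic log-likelihood for a single slope (no intercept)."""
    ll = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-beta * x))
        ll += y * math.log(p) + (1 - y) * math.log(1.0 - p)
    return ll

def firth_loglik(beta, xs, ys):
    """Firth-penalized version: add 0.5*log I(beta), where
    I(beta) = sum x_i^2 * p_i * (1 - p_i) is the Fisher information."""
    info = 0.0
    for x in xs:
        p = 1.0 / (1.0 + math.exp(-beta * x))
        info += x * x * p * (1.0 - p)
    return loglik(beta, xs, ys) + 0.5 * math.log(info)

# Completely separated toy data: y = 1 exactly when x > 0, the logistic
# analogue of monotone likelihood.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]

# The ordinary likelihood increases in beta without bound; the penalized
# one has a finite interior maximizer, found here by grid search.
grid = [b / 100.0 for b in range(1, 1500)]
beta_firth = max(grid, key=lambda b: firth_loglik(b, xs, ys))
```

As beta grows, the fitted probabilities approach 0 and 1 and the Fisher information collapses, so the penalty term diverges to minus infinity and caps the estimate at a finite value.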

13.
ABSTRACT: BACKGROUND: Linkage analysis is a useful tool for detecting genetic variants that regulate a trait of interest, especially genes associated with a given disease. Although penetrance parameters play an important role in determining gene location, they are assigned arbitrary values according to the researcher's intuition or are estimated by the maximum likelihood principle. Several methods exist for evaluating maximum likelihood estimates of penetrance, although not all of these are supported by software packages, and some are biased by marker genotype information even when disease development is due solely to the genotype of a single allele. FINDINGS: Programs for exploring maximum likelihood estimates of penetrance parameters were developed using the R statistical programming language supplemented by external C functions. The software returns a vector of polynomial coefficients of the penetrance parameters, representing the likelihood of the pedigree data. From the likelihood polynomial supplied by the proposed method, the likelihood value and its gradient can be computed precisely. To reduce the effect of the supplied dataset on the likelihood function, feasible parameter constraints can be introduced into the maximum likelihood estimates, enabling flexible exploration of the penetrance estimates. An auxiliary program generates a perspective plot allowing visual validation of the model's convergence. The functions are collectively available as the MLEP R package. CONCLUSIONS: Linkage analysis using penetrance parameters estimated by the MLEP package enables feasible localization of a disease locus. This is shown through a simulation study and by demonstrating how the package is used to explore maximum likelihood estimates. Although the input dataset tends to bias the likelihood estimates, the method yields accurate results superior to analysis using intuitive penetrance values for diseases with low allele frequencies. MLEP is part of the Comprehensive R Archive Network and is freely available at http://cran.r-project.org/web/packages/MLEP/index.html.
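The "vector of polynomial coefficients" idea can be seen in a toy case: for a sibship of n children from an Aa x Aa mating with a fully dominant disease allele of penetrance f, each child is affected with probability (3/4)*f, so the likelihood of observing k affected children is the polynomial C(n,k) * ((3/4)f)^k * (1 - (3/4)f)^(n-k) in f. This is our simplification for illustration; the actual package handles whole pedigrees and parameter constraints:

```python
import math

def sibship_likelihood_coeffs(n, k):
    """Coefficients c_j of L(f) = sum_j c_j * f**j, the likelihood of k
    affected among n sibs of an Aa x Aa cross with a dominant disease
    allele of penetrance f: L(f) = C(n,k)*(0.75 f)^k*(1 - 0.75 f)^(n-k)."""
    coeffs = [0.0] * (n + 1)
    front = math.comb(n, k) * 0.75 ** k
    for j in range(n - k + 1):  # binomial expansion of (1 - 0.75 f)^(n-k)
        coeffs[k + j] += front * math.comb(n - k, j) * (-0.75) ** j
    return coeffs

def eval_poly(coeffs, f):
    """Evaluate the likelihood polynomial at penetrance f."""
    return sum(c * f ** i for i, c in enumerate(coeffs))

# With the coefficient vector in hand, likelihood evaluation is cheap,
# and a grid scan locates the MLE (here 4k/(3n) by calculus).
coeffs = sibship_likelihood_coeffs(4, 1)
grid = [i / 1000.0 for i in range(1, 1001)]
f_hat = max(grid, key=lambda f: eval_poly(coeffs, f))
```

Representing the likelihood as an explicit polynomial is what lets the value and its gradient be computed exactly at any penetrance, which is the property the abstract emphasizes.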

14.
Viruses use a limited set of host pathways for infection. These pathways represent bona fide antiviral targets with a low likelihood of viral resistance. We identified the salicylanilide niclosamide as a broad-range antiviral agent targeting acidified endosomes. Niclosamide is approved for human use against helminthic infections, and has anti-neoplastic and antiviral effects. Its mode of action is unknown. Here, we show that niclosamide, which is a weak lipophilic acid, inhibited infection with pH-dependent human rhinoviruses (HRV) and influenza virus. Structure-activity studies showed that antiviral efficacy and endolysosomal pH neutralization co-tracked, and that acidification of the extracellular medium bypassed the virus entry block. Niclosamide did not affect the vacuolar H+-ATPase, but neutralized coated vesicles or synthetic liposomes, indicating a proton-carrier mode of action independent of any protein target. This report demonstrates that physico-chemical interference with host pathways can have broad-range antiviral effects, and provides a proof of concept for the development of host-directed antivirals.

15.
The aim of this article is to present the optimization of a proof test procedure for ceramic hip joint ball heads. The proof test rejects defective samples on the production line before they are implanted into the human body. In the test, a static load somewhat higher than the maximum physiological load is applied to every ceramic ball head. The magnitude of the applied load must not damage samples that are free of flaws in the high-stress area. The configuration of the proof test influences the stress distribution in the ball head, which should be similar to the physiological case. To determine the stress distribution, a non-linear finite element (FE) analysis was performed and the results were validated by measurements. With an iterative approach based on FE calculations, the proof test configuration was optimized so that the stress distribution in the ball head is similar to the stress distribution in vivo. In this study all ball heads showed very high fatigue resistance after being proof tested and fulfilled the requirements of the FDA (Food and Drug Administration, USA) described in the Guidance Document for the Preparation of Premarket Notifications for Ceramic Ball Hip System. The probability of fracture of an implanted ceramic ball head can be decreased by the presented optimized proof test procedure, which can thus improve the reliability of ceramic hip joint ball heads. The study was supported by the KTI (Commission for Technology and Innovation, Switzerland).

16.
The origin and evolution of the perianth remain enigmatic. While it seems likely that an undifferentiated perianth consisting of tepals arose early in angiosperm evolution, it is unclear when and how differentiated perianths consisting of distinct organs, such as petals and sepals, arose. Phylogenetic reconstructions of ancestral perianth states across angiosperms have traditionally relied on morphological data from extant species, but these analyses often produce equivocal results. Here we describe the use of developmental genetic data as an additional strategy for inferring the ancestral perianth character state of different angiosperm clades. By assessing functional data in combination with expression data in a maximum likelihood framework, we provide a novel approach for investigating the evolutionary history of the perianth. The results of this analysis provide new insights into perianth evolution and a proof of concept for incorporating developmental genetic data into character state reconstructions. As the assumptions outlined here are tested and more genetic data are generated, we hope that ancestral state reconstructions based on multiple lines of evidence will converge.

17.
Non-target impacts of poison baiting for predator control in Australia
1. Mammalian predators are controlled by poison baiting in many parts of the world, often to alleviate their impacts on agriculture or the environment. Although predator control can have substantial benefits, the poisons used may also be potentially harmful to other wildlife.
2. Impacts on non-target species must be minimized, but can be difficult to predict or quantify. Species and individuals vary in their sensitivity to toxins and their propensity to consume poison baits, while populations vary in their resilience. Wildlife populations can accrue benefits from predator control that outweigh the occasional deaths of non-target animals. We review recent advances in Australia, providing a framework for assessing non-target effects of poisoning operations and for developing techniques to minimize such effects. We also emphasize that weak or circumstantial evidence of non-target effects can be misleading.
3. Weak evidence that poison baiting presents a potential risk to non-target species comes from measuring the sensitivity of species to the toxin in the laboratory. More convincing evidence may be obtained by quantifying susceptibility in the field. This requires detailed information on the propensity of animals to locate and consume poison baits, as well as the likelihood of mortality if baits are consumed. Still stronger evidence may be obtained if predator baiting causes non-target mortality in the field (with toxin detected by post-mortem examination). Conclusive proof of a negative impact on populations of non-target species can be obtained only if any observed non-target mortality is followed by sustained reductions in population density.
4. Such proof is difficult to obtain, and the possibility of a population-level impact cannot be reliably confirmed or dismissed without rigorous trials. In the absence of conclusive evidence, wildlife managers should adopt a precautionary approach which seeks to minimize potential risk to non-target individuals, while clarifying population-level effects through continued research.

18.
19.
Bross ID. Biometrics 1985;41(3):785-793
On the basis of simple, generally accepted biostatistical and public health principles, it is shown that for environmental health hazards a proof of safety is much more difficult than a proof of hazard. The effective sample sizes required for proof of safety are orders of magnitude greater than what is feasible in biostatistical-epidemiological studies. Although many assurances of safety "in the name of science" have been issued by government agencies and others, few if any of these assurances are statistically valid.
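The orders-of-magnitude claim is easy to make concrete. To "prove safety", i.e., to rule out a true per-subject event rate above some small ceiling p after observing zero adverse events, one needs (1 - p)^n <= alpha, hence roughly n = -ln(alpha)/p (the "rule of three" gives n of about 3/p at alpha = 0.05). The arithmetic below is our illustration of the principle, not the author's calculation:

```python
import math

def n_for_proof_of_safety(risk_ceiling, alpha=0.05):
    """Smallest cohort size n such that zero observed adverse events is
    inconsistent, at level alpha, with a true per-subject event rate of
    risk_ceiling or more: minimal n with (1 - risk_ceiling)**n <= alpha."""
    return math.ceil(math.log(alpha) / math.log(1.0 - risk_ceiling))

# Ruling out a 1-in-a-million risk takes a zero-event cohort on the order
# of 3 million subjects; ruling out a coin-flip-sized risk takes a handful.
n_micro = n_for_proof_of_safety(1e-6)
n_half = n_for_proof_of_safety(0.5)
```

The required n scales like 1/p, which for the small risk ceilings relevant to environmental health hazards is far beyond any feasible epidemiological study; demonstrating an elevated hazard, by contrast, only requires enough events to stand out from background.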

20.