Similar Documents
20 similar documents found (search time: 31 ms)
1.
In most discussions of the Precautionary Principle, it is implicitly assumed that we are at a point near risk neutrality, so that the principle aims at moving away from risk neutrality in the direction of more risk-averse behavior. In this paper it is argued that actual decision-making in environmental issues is often on the opposite, risk-taking, side of risk neutrality. A minimal version of the Precautionary Principle consists in moving from such a position in the direction of risk neutrality. Some methods for achieving this are discussed, such as less consensus-seeking scientific procedures, requirements that scientific committees identify less probable but serious scenarios, interpretative presumptions, and supplementary statistical measures for type II errors.
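The "supplementary statistical measures for type II errors" can be made concrete with a small calculation (our illustrative sketch, not taken from the paper; a one-sided z-test with known variance and alpha = 0.05 is assumed, and all numbers are hypothetical):

```python
import math

def normal_cdf(x):
    # Standard normal CDF via the error function (math.erf is stdlib).
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def type_ii_error(effect, sigma, n):
    """Type II error rate (beta) of a one-sided z-test of H0: mu <= 0
    at alpha = 0.05, when the true effect size is `effect`, the (known)
    standard deviation is `sigma`, and the sample size is n."""
    z_alpha = 1.6449                        # upper 5% point of the standard normal
    shift = effect * math.sqrt(n) / sigma   # noncentrality of the test statistic
    return normal_cdf(z_alpha - shift)      # P(fail to reject | effect is real)

beta = type_ii_error(effect=0.2, sigma=1.0, n=50)  # roughly 0.59
```

With alpha fixed at 5%, a modest real hazard (effect 0.2 SD, n = 50) is missed roughly 59% of the time; reporting beta alongside alpha is one way to make visible the asymmetry that the minimal Precautionary Principle targets.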

2.
The classical multiple testing model remains an important practical area of statistics, with new approaches still being developed. In this paper we develop a new multiple testing procedure inspired by a method sometimes used in a problem with a different focus, namely the inference-after-model-selection problem. Solutions to that problem are often obtained by making use of a penalized likelihood function; a classic example is the Bayesian information criterion (BIC). In this paper we construct a generalized BIC method and evaluate its properties as a multiple testing procedure. The procedure is applicable to a wide variety of statistical models, including regression, contrasts, treatment versus control, change point, and others. Numerical work indicates that, in particular for sparse models, the new generalized BIC would be preferred over existing multiple testing procedures.
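How a BIC-type penalty acts as a multiple testing rule can be sketched in the simplest orthogonal setting (a toy illustration under our own assumptions, not the authors' generalized BIC): with independent z-statistics, minimizing the residual sum of squares plus a penalty per selected parameter decouples across hypotheses, so hypothesis j is flagged exactly when z_j squared exceeds the penalty.

```python
import math

def bic_multiple_test(z_stats, penalty):
    """Flag 'non-null' hypotheses by minimizing
    (residual sum of squares) + penalty * (number of selected parameters).
    With independent z-statistics the optimization decouples: including
    hypothesis j removes z_j**2 from the RSS at a cost of `penalty`,
    so j is selected iff z_j**2 > penalty."""
    return [j for j, z in enumerate(z_stats) if z * z > penalty]

n = 100                  # nominal sample size behind each statistic (hypothetical)
penalty = math.log(n)    # classical BIC penalty: log(n) per parameter
z = [0.3, -0.8, 4.1, 1.2, -3.5, 0.1]
flagged = bic_multiple_test(z, penalty)
```

Only the two large statistics survive (z squared > log 100, about 4.6). Raising the penalty acts like a stricter multiplicity correction, which is one way to see why sparse settings favor BIC-type rules.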

3.
The Precautionary Principle came out of European efforts to clean up and protect marine ecosystems in the 1980s. Since then, several North American initiatives have elaborated on this approach in U.S. environmental programs. Unfortunately, U.S. organizations and agencies have not developed strategies and guidelines for implementing the Precautionary Principle in either statutory or voluntary environmental programs. Recent interest in this approach from some members of the scientific, non-profit, and regulatory communities highlights the need to understand the history and conceptual basis of the Precautionary Principle. In this paper we address several of these issues. First, we summarize the pertinent U.S. history of the Precautionary Principle. Next, we describe the scientific framework for the principle. Finally, we make the case that this provides unique opportunities for scientists to find meaning in their work by fulfilling what has been called the new Social Contract.

4.
Qu A, Li R (2006) Biometrics 62(2):379-391
Nonparametric smoothing methods are used to model longitudinal data, but the challenge remains to incorporate correlation into nonparametric estimation procedures. In this article, we propose an efficient estimation procedure for varying-coefficient models for longitudinal data. The proposed procedure can easily take into account correlation within subjects and deal directly with both continuous and discrete response longitudinal data under the framework of generalized linear models. The proposed approach yields a more efficient estimator than the generalized estimating equation approach when the working correlation is misspecified. For varying-coefficient models, it is often of interest to test whether coefficient functions are time varying or time invariant. We propose a unified and efficient nonparametric hypothesis testing procedure, and further demonstrate that the resulting test statistics have an asymptotic chi-squared distribution. In addition, the goodness-of-fit test is applied to test whether the model assumption is satisfied. The corresponding test is also useful for choosing basis functions and the number of knots for regression spline models in conjunction with the model selection criterion. We evaluate the finite-sample performance of the proposed procedures with Monte Carlo simulation studies. The proposed methodology is illustrated by the analysis of an acquired immune deficiency syndrome (AIDS) data set.

5.
The European Commission has published a Communication on the Precautionary Principle and a White Book on Governance. These provide us (as research civil servants of the Commission) with an institutional framework for handling scientific information that is often incomplete, uncertain, and contested. But although the Precautionary Principle is intuitively straightforward to understand, there is no agreed way of applying it to real decision-making. To meet this perceived need, researchers have proposed a vast number of taxonomies. These include ignorance auditing, type one-two-three errors, a combination of uncertainty and decision stakes through post-normal science, and the plotting of ignorance of probabilities against ignorance of consequences. Any of these could be used to define a precautionary-principle region inside a multidimensional space and to position an issue within that region. The rôle of anticipatory research is clearly critical, but scientific input is only part of the picture. It is difficult to imagine an issue where the application of the Precautionary Principle would be non-contentious. From genetically modified food to electro-smog, from climate change to growth hormones in meat, it is clear that: 1) risk and cost-benefit are only part of the picture; 2) there are ethical issues involved; 3) there is a plurality of interests and perspectives that are often in conflict; 4) there will be losers and winners whatever decision is made. Operationalisation of the Precautionary Principle must preserve transparency. Only in this way will the incommensurable costs and benefits associated with different stakeholders be registered.

A typical decision will include the following sorts of considerations: 1) the commercial interests of companies and the communities that depend on them; 2) the worldviews of those who might want a greener, less consumerist society and/or who believe in the sanctity of human or animal life; 3) potential benefits, such as enabling the world's poor to improve their farming; 4) risks, such as pollution, gene flow, or the effects of climate change. In this paper we discuss the use of a combination of methods on which we have worked and that we consider useful for framing the debate and facilitating dialogue among stakeholders on where and how to apply the Precautionary Principle.

6.
While epidemiological data typically contain a multivariate response and often also multiple exposure parameters, current methods for safe dose calculations, including the widely used benchmark approach, rely on standard regression techniques. In practice, dose-response modeling and calculation of the exposure limit are often based on the seemingly most sensitive outcome. However, this procedure ignores other available data, is inefficient, and fails to account for multiple testing. Instead, risk assessment could be based on structural equation models, which can accommodate both a multivariate exposure and a multivariate response function. Furthermore, such models allow for measurement error in the observed variables, which is a requirement for unbiased estimation of the benchmark dose. This methodology is illustrated with data on neurobehavioral effects in children prenatally exposed to methylmercury, where results based on standard regression models cause an underestimation of the true risk.
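Why unmodeled measurement error biases the benchmark dose (BMD) can be seen in a back-of-the-envelope sketch (our illustration, assuming a linear dose-response and classical additive error; all numbers are hypothetical, and this is not the paper's structural equation model):

```python
def benchmark_dose(slope, bmr):
    """BMD under a linear dose-response: the dose at which the mean
    response shifts by the benchmark response (BMR)."""
    return bmr / slope

def attenuated_slope(slope, var_x, var_error):
    """Expected OLS slope when the exposure is measured with classical
    additive error: the true slope times the reliability ratio."""
    reliability = var_x / (var_x + var_error)
    return slope * reliability

true_slope = 0.5
bmd_true = benchmark_dose(true_slope, bmr=1.0)                    # 2.0
naive = attenuated_slope(true_slope, var_x=1.0, var_error=0.5)    # 0.5 * 2/3
bmd_naive = benchmark_dose(naive, bmr=1.0)                        # 3.0, too permissive
```

Attenuation shrinks the estimated slope, so the naive BMD comes out too high and the exposure limit is set too leniently; modeling the measurement error, as structural equation models do, removes this bias.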

7.
Scientific research is of proven value in protecting public health and the environment from current and future problems. We explore the extent to which the Precautionary Principle is a threat to this rôle for science and technology. Not surprisingly for a relatively simple yet still incompletely defined concept, supporters of the Precautionary Principle come from different viewpoints, including a viewpoint that is at least uneasy with the rôle of science, and particularly its use in risk assessment. There are also aspects of the Precautionary Principle that inherently restrict obtaining and using science. The Hazardous Air Pollutant (HAP) provisions in the US Clean Air Act Amendments are an example of the Precautionary Principle: they both shifted the burden of proof, so that the onus is now on showing that a listed compound is harmless, and required maximum achievable control technology (MACT) instead of a primarily risk-based approach to pollution control. Since its passage in 1990, there has been a decrease in research funding for studies of HAPs. Other potential problems include: that once MACT regulations are established, it may be difficult to develop new technological approaches that would further improve air pollution control; that by treating all regulated HAPs similarly, no distinction is made between those that pose higher or lower risk; and that there is a perverse incentive to use less well-studied agents that are not on the existing list. As acting on the Precautionary Principle inherently imposes significant costs for what is a potentially erroneous action, additional scientific study should be required to determine whether the precautionary action was successful. If we are to maximize the value of the Precautionary Principle to public health and the environment, it is crucial that its impact not adversely affect the potent preventive rôle of science and technology.

8.
Ecuador is one of the most biodiverse countries in Latin America. At the same time, its social and environmental problems are severe. Poverty, political and social instability, as well as issues such as aging transport systems, hazards imported from industrialized countries, lack of information, and weak health care systems form the framework of this situation. The most common problems are the use of heavy metals in many activities without safety and health protection, two decades of low-technology oil production, intensive use of pesticides in agriculture, and other chemical risks. A limited capacity to develop prevention strategies, reduced technical and scientific skills, and the absence of a reliable information and control system lead to a weak response mechanism. The Precautionary Principle could help to stimulate prevention and protection and provide a new tool to raise interest in environmental and health problems. Reinforcing the presence of international organizations like the WHO and ILO, establishing bridges between scientific organizations from developed and developing countries, and introducing the Precautionary Principle into legislation and the daily practices of industry and agriculture could lead to an improvement in our environment and health.

9.
Ryman N, Jorde PE (2001) Molecular Ecology 10(10):2361-2373
A variety of statistical procedures are commonly employed when testing for genetic differentiation. In a typical situation, two or more samples of individuals have been genotyped at several gene loci by molecular or biochemical means, and in a first step a statistical test for allele frequency homogeneity is performed at each locus separately, using, e.g., the contingency chi-square test, Fisher's exact test, or some modification thereof. In a second step the results from the separate tests are combined for evaluation of the joint null hypothesis that there is no allele frequency difference at any locus, corresponding to the important case where the samples would be regarded as drawn from the same statistical and, hence, biological population. Presently, there are two conceptually different strategies in use for testing the joint null hypothesis of no difference at any locus. One approach is based on the summation of chi-square statistics over loci. Another method is employed by investigators applying the Bonferroni technique (adjusting the P-value required for rejection to account for the elevated alpha errors when performing multiple tests simultaneously) to test whether the heterogeneity observed at any particular locus can be regarded as significant when considered separately. Under this approach the joint null hypothesis is rejected if one or more of the component single-locus tests is considered significant under the Bonferroni criterion. We used computer simulations to evaluate the statistical power and realized alpha errors of these strategies when evaluating the joint hypothesis after scoring multiple loci. We find that the 'extended' Bonferroni approach generally is associated with low statistical power and should not be applied in the current setting. Further, and contrary to what might be expected, we find that 'exact' tests typically behave poorly when combined in existing procedures for joint hypothesis testing. Thus, while exact tests are generally to be preferred over approximate ones when testing each particular locus, approximate tests such as the traditional chi-square seem preferable when addressing the joint hypothesis.
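The two combination strategies can be sketched for 2x2 allele-count tables (illustrative numbers of our own; the chi-square critical values are hardcoded from standard tables):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Five loci, each with the same mild allele-frequency difference.
tables = [(60, 40, 45, 55)] * 5
stats = [chi2_2x2(*t) for t in tables]            # each roughly 4.51

# Strategy 1: sum chi-square statistics over loci, df = 5.
CRIT_DF5_05 = 11.070                              # chi2 critical value, df=5, alpha=0.05
summed_rejects = sum(stats) > CRIT_DF5_05         # diffuse signal accumulates

# Strategy 2: Bonferroni, each locus at alpha = 0.05/5 = 0.01, df = 1.
CRIT_DF1_01 = 6.635                               # chi2 critical value, df=1, alpha=0.01
bonf_rejects = any(s > CRIT_DF1_01 for s in stats)  # no single locus is extreme
```

With a weak but consistent signal at every locus, the summed test rejects the joint null while the Bonferroni test does not, which illustrates one facet of the low power the simulations attribute to the 'extended' Bonferroni approach.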

10.
This essay attempts to provide an analytical apparatus which may be used for finding an authoritative formulation of the Precautionary Principle. Several formulations of the Precautionary Principle are examined. Four dimensions of the principle are identified: (1) the threat dimension, (2) the uncertainty dimension, (3) the action dimension, and (4) the command dimension. It is argued that the Precautionary Principle can be recast into the following if-clause, containing these four dimensions: “If there is (1) a threat, which is (2) uncertain, then (3) some kind of action (4) is mandatory.” The phrases expressing these dimensions may vary in (a) precision and (b) strength. It is shown that it is the dimension containing the weakest phrase that determines the strength of the entire principle. It is suggested that the four-dimensional if-clause be used as an analytical apparatus in negotiations of the Precautionary Principle.

11.
Aspects of the statistical modeling and assessment of hypotheses concerning quantitative traits in genetics research are discussed. It is suggested that a traditional approach to such modeling and hypothesis testing, whereby competing models are "nested" in an effort to simplify their probabilistic assessment, can be complemented by an alternative statistical paradigm: the separate-families-of-hypotheses approach to segregation analysis. Two bootstrap-based methods are described that allow testing of any two, possibly non-nested, parametric genetic hypotheses. These procedures utilize a strategy in which the unknown distribution of a likelihood ratio-based test statistic is simulated, thereby allowing the estimation of critical values for the test statistic. Though the focus of this paper concerns quantitative traits, the strategies described can be applied to qualitative traits as well. The conceptual advantages and computational ease of these strategies are discussed, and their significance levels and power are examined through Monte Carlo experimentation. It is concluded that the separate-families-of-hypotheses approach, when carried out with the methods described in this paper, not only possesses some favorable statistical properties but also is well suited for genetic segregation analysis.

12.
Ethics tells us: do good and do no harm; and it invokes the norms of justice, equity, and respect for autonomy in protecting and promoting health and well-being. The Precautionary Principle, a contemporary re-definition of Bradford Hill's case for action, gives us a common-sense rule for doing good by preventing harm to public health from delay: when in doubt about the presence of a hazard, there should be no doubt about its prevention or removal. It shifts the burden of proof from showing presence of risk to showing absence of risk, aims to do good by preventing harm, and subsumes the upstream strategies of the DPSEEA (Driving forces, Pressure, State, Exposure, Effect, Action) model and downstream strategies from molecular epidemiology for detection and prevention of risk. The Precautionary Principle has emerged because of the ethical import of delays in detection of risks to human health and the environment. Ethical principles, the Precautionary Principle, the DPSEEA model, and molecular epidemiology all imply re-emphasizing epidemiology's classic rôle of early detection and prevention. Delays in recognizing risks from past exposures and acting on the findings (e.g., cigarette smoking and lung cancer, asbestos, organochlorines and endocrine disruption, radiofrequency, raised travel speeds) were failures that were not only scientific but ethical, since they resulted in preventable harm to exposed populations. Such delays may result from, among other things, external and internal determinants of epidemiologic investigations of hazard and risk, including misuse of tests of statistical significance. Furthermore, applying the Precautionary Principle to ensure justice, equity, and respect for autonomy raises questions concerning the short-term costs of implementation to achieve long-term goals and the principles that should guide compensation.

13.
Todem D, Hsu WW, Kim K (2012) Biometrics 68(3):975-982
In many applications of two-component mixture models for discrete data, such as zero-inflated models, it is often of interest to conduct inferences for the mixing weights. Score tests derived from the marginal model that allows for negative mixing weights have been particularly useful for this purpose. But the existing testing procedures often rely on restrictive assumptions, such as the constancy of the mixing weights, and typically ignore the structural constraints of the marginal model. In this article, we develop a score test of homogeneity that overcomes the limitations of existing procedures. The technique is based on a decomposition of the mixing weights into terms that have an obvious statistical interpretation. We exploit this decomposition to lay the foundation of the test. Simulation results show that the proposed covariate-adjusted test statistic can greatly improve efficiency over test statistics based on constant mixing weights. A real-life example in dental caries research is used to illustrate the methodology.

14.
In epidemiologic studies, measurement error in the exposure variable can have a detrimental effect on the power of hypothesis testing for detecting the impact of exposure in the development of a disease. To adjust for misclassification in the hypothesis testing procedure involving a misclassified binary exposure variable, we consider a retrospective case–control scenario under the assumption of nondifferential misclassification. We develop a test under the Bayesian approach from a posterior distribution generated by an MCMC algorithm and a normal prior under realistic assumptions. We compared this test with an equivalent likelihood ratio test developed under the frequentist approach, using various simulated settings and in the presence or absence of validation data. In our simulations, we considered varying degrees of sensitivity, specificity, sample size, exposure prevalence, and proportions of unvalidated and validated data. In these scenarios, our simulation study shows that the adjusted model (with validation data) is always better than the unadjusted model (without validation data). However, we showed that an exception is possible in the fixed-budget scenario, where collection of the validation data requires a much higher cost. We also showed that both the Bayesian and frequentist hypothesis testing procedures reach the same conclusions for the scenarios under consideration. The Bayesian approach is, however, computationally more stable in rare exposure contexts. A real case–control study was used to show the application of the hypothesis testing procedures under consideration.
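The distortion that nondifferential misclassification introduces, and that validation data help undo, can be sketched with the standard prevalence-correction identity (a textbook Rogan-Gladen calculation with made-up numbers, not the paper's Bayesian test):

```python
def observed_prevalence(p, sens, spec):
    """Prevalence seen through a misclassifying binary measurement:
    true positives plus false positives."""
    return p * sens + (1 - p) * (1 - spec)

def corrected_prevalence(p_obs, sens, spec):
    """Invert the misclassification (Rogan-Gladen estimator).
    Requires sens + spec > 1; in practice the result may need
    truncation to [0, 1]."""
    return (p_obs + spec - 1) / (sens + spec - 1)

p_obs = observed_prevalence(0.30, sens=0.90, spec=0.80)    # 0.41
p_hat = corrected_prevalence(p_obs, sens=0.90, spec=0.80)  # recovers 0.30
```

In a case–control comparison the same distortion shrinks the observed exposure difference between cases and controls, which is why unadjusted tests lose power; the validation data in the paper serve to pin down the sensitivity and specificity that this inversion needs.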

15.
Long-term and extensive practice has shown that classical biological control, i.e., the introduction of natural enemies to control invasive alien weeds, is a practical and effective route to managing such weeds, but its potential ecological risks, namely direct or indirect adverse effects on native organisms, must not be overlooked. Traditional evaluation methods for predicting the ecological risk of candidate natural enemies have shortcomings, chiefly: (1) host-specificity testing relies too heavily on the results of physiological host-range tests conducted in the laboratory and pays insufficient attention to the ecological host range (the realized host range), i.e., the prediction of host use under the full set of physical and biological conditions of the new environment; (2) physiological host-range testing relies too heavily on whether the agent can complete its development, with insufficient attention to behavior, genetic traits, and phylogenetic relationships; and (3) risk assessment overemphasizes risks to economic crops while paying insufficient attention to risks to natural ecosystems. We therefore suggest: (1) encouraging retrospective follow-up studies of natural enemies that have already been released, so as to provide an ecological theoretical basis for weed biological control practice; (2) adopting the precautionary principle of "presumption of harm" when using biological control against invasive alien weeds, so as to avoid hasty decisions to release natural enemies in the face of major threats from invasive organisms; and (3) emphasizing the assessment of ecological effects when evaluating the risks of candidate natural enemies.

16.
In oncology studies with immunotherapies, populations of “super-responders” (patients in whom the treatment works particularly well) are often suspected to be related to biomarkers. In this paper, we explore various ways of confirmatory statistical hypothesis testing for joint inference on the subpopulation of putative “super-responders” and the full study population. A model-based testing framework is proposed, which allows one to define, up front, the strength of evidence required from both the full and subpopulations in terms of clinical efficacy. This framework is based on a two-way analysis of variance (ANOVA) model with an interaction, in combination with multiple comparison procedures. The ease of implementation of this model-based approach is emphasized, and details are provided for the practitioner who would like to adopt it. The discussion is exemplified by a hypothetical trial that uses an immune marker in oncology to define the subpopulation and tumor growth as the primary endpoint.
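The joint inference on subpopulation and full population can be sketched with cell means from a 2x2 treatment-by-biomarker layout (hypothetical numbers of our own; the paper's framework additionally adjusts for multiplicity across the two hypotheses):

```python
# Cell means of the tumor-growth endpoint by (treatment, biomarker status);
# lower is better, and the biomarker-positive stratum contains the
# putative super-responders.
mu = {
    ("ctrl", "neg"): 10.0, ("ctrl", "pos"): 10.0,
    ("trt",  "neg"):  9.0, ("trt",  "pos"):  4.0,
}
prev_pos = 0.3  # assumed prevalence of the biomarker-positive subpopulation

# Effect in the super-responder subpopulation: a single cell contrast.
effect_sub = mu[("ctrl", "pos")] - mu[("trt", "pos")]

# Effect in the full population: strata weighted by biomarker prevalence.
effect_full = (prev_pos * (mu[("ctrl", "pos")] - mu[("trt", "pos")])
               + (1 - prev_pos) * (mu[("ctrl", "neg")] - mu[("trt", "neg")]))
```

Here the subpopulation contrast (6.0) is much larger than the full-population contrast (2.5). Requiring pre-specified evidence on both contrasts, each at a multiplicity-adjusted level, is what the ANOVA-with-interaction framework formalizes.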

17.
Clinicians are often interested in the effect of covariates on survival probabilities at prespecified study times. Because different factors can be associated with the risk of short- and long-term failure, a flexible modeling strategy is pursued. Given a set of multiple candidate working models, an objective methodology is proposed that aims to construct consistent and asymptotically normal estimators of regression coefficients and average prediction error for each working model that are free of the nuisance censoring variable. It requires the conditional distribution of censoring given covariates to be modeled. The model selection strategy uses step-up or step-down multiple hypothesis testing procedures that control either the proportion of false positives or the generalized familywise error rate when comparing models based on estimates of average prediction error. The problem can in fact be cast as a missing data problem, where augmented inverse probability weighted complete-case estimators of regression coefficients and prediction error can be used (Tsiatis, 2006, Semiparametric Theory and Missing Data). A simulation study and an analysis of a recent AIDS trial are provided.

18.
In biostatistics, more and more complex models are being developed. This is particularly the case in systems biology. Fitting complex models can be very time-consuming, since many models often have to be explored. Among the possibilities are the introduction of explanatory variables and the determination of random effects. The score test is attractive for this exploration because it requires fitting only the null model. The particularity of its use here is that the null hypothesis is not itself very simple; typically, some random effects may be present under the null hypothesis. Moreover, the information matrix cannot be computed, but only an approximation based on the score. This article examines this situation with the specific example of HIV dynamics models. We examine score test statistics for testing the effect of explanatory variables and the variance of a random effect in this complex situation. We study the type I error and statistical power of these score test statistics, and we apply the score test approach to a real data set of HIV-infected patients.

19.
The Precautionary Principle is in sharp political focus today because (1) the nature of scientific uncertainty is changing and (2) there is increasing pressure to base governmental action on more “rational” schemes, such as cost-benefit analysis and quantitative risk assessment, the former being an embodiment of 'rational choice theory' promoted by the Chicago school of law and economics. The Precautionary Principle has been criticized as being both too vague and too arbitrary to form a basis for rational decision making. The assumption underlying this criticism is that any scheme not based on cost-benefit analysis and risk assessment is both irrational and without secure foundation in either science or economics. This paper contests that view and makes explicit the rational tenets of the Precautionary Principle within an analytical framework as rigorous as uncertainties permit, one that mirrors the democratic values embodied in regulatory, compensatory, and common law. Unlike other formulations that reject risk assessment, this paper argues that risk assessment can be used within the formalism of tradeoff analysis, a more appropriate alternative to traditional cost-benefit analysis and one that satisfies the need for well-grounded public policy decision making. The paper argues that the precautionary approach is the most appropriate basis for policy, even when large uncertainties do not exist, especially where the fairness of the distribution of costs and benefits of hazardous activities and products is a concern. Furthermore, it offers an approach to making decisions within an analytic framework, based on equity and justice, to replace the economic paradigm of utilitarian cost-benefit analysis.

20.
Mixture modeling provides an effective approach to the differential expression problem in microarray data analysis. Methods based on fully parametric mixture models are available, but lack of fit in some examples indicates that more flexible models may be beneficial. Existing, more flexible mixture models work at the level of one-dimensional gene-specific summary statistics, and so when there are relatively few measurements per gene these methods may not provide sensitive detectors of differential expression. We propose a hierarchical mixture model to provide methodology that is both sensitive in detecting differential expression and sufficiently flexible to account for the complex variability of normalized microarray data. EM-based algorithms are used to fit both parametric and semiparametric versions of the model. We restrict attention to the two-sample comparison problem; an experiment involving Affymetrix microarrays and yeast translation provides the motivating case study. Gene-specific posterior probabilities of differential expression form the basis of statistical inference; they define short gene lists and false discovery rates. Compared to several competing methodologies, the proposed methodology exhibits good operating characteristics in a simulation study, on the analysis of spike-in data, and in a cross-validation calculation.
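A stripped-down version of the mixture idea, a one-dimensional two-component Gaussian EM on gene-level statistics with a known common variance (far simpler than the hierarchical and semiparametric models in the paper, and with made-up data), shows how gene-specific posterior probabilities of differential expression arise:

```python
import math

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def em_two_component(xs, n_iter=100):
    """EM for xs ~ (1-pi1)*N(mu0, 1) + pi1*N(mu1, 1); the common variance is
    held fixed at 1 for brevity. Returns the per-gene posterior probability
    of the 'differential' component."""
    mu0, mu1, sd, pi1 = 0.0, max(xs), 1.0, 0.1   # crude initialization
    for _ in range(n_iter):
        # E-step: responsibility of the differential component for each gene.
        post = []
        for x in xs:
            p1 = pi1 * normal_pdf(x, mu1, sd)
            p0 = (1 - pi1) * normal_pdf(x, mu0, sd)
            post.append(p1 / (p0 + p1))
        # M-step: update the mixing weight and the component means.
        w1 = sum(post)
        w0 = len(xs) - w1
        pi1 = w1 / len(xs)
        mu1 = sum(p * x for p, x in zip(post, xs)) / w1
        mu0 = sum((1 - p) * x for p, x in zip(post, xs)) / w0
    return post

# Mostly null statistics near 0, a few differentially expressed genes near 5.
gene_stats = [-0.4, 0.2, 0.1, -0.8, 0.5, -0.1, 4.8, 5.3, 0.3, 5.1]
post = em_two_component(gene_stats)
short_list = [i for i, p in enumerate(post) if p > 0.9]   # candidate gene list
```

Thresholding the posterior gives the short gene list; averaging one minus the posterior over that list estimates its false discovery rate, which is the inferential currency the abstract describes.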


Copyright©北京勤云科技发展有限公司  京ICP备09084417号