Similar Literature
20 similar documents retrieved.
1.
This paper deals with a Cox proportional hazards regression model in which some covariates of interest are randomly right-censored. While methods for censored outcomes have become ubiquitous in the literature, methods for censored covariates have thus far received little attention and, for the most part, have dealt with the issue of limit of detection. For randomly censored covariates, an often-used approach is the inefficient complete-case analysis (CCA), which simply deletes censored observations from the analysis. When censoring is not completely independent, the CCA leads to biased and spurious results. Methods for missing covariate data, including those for type I and type II covariate censoring as well as limit of detection, do not readily apply because of the fundamentally different nature of randomly censored covariates. We develop a novel method for censored covariates that uses conditional mean imputation, based on either Kaplan–Meier estimates or a Cox proportional hazards model, to estimate the effects of these covariates on a time-to-event outcome. We evaluate the performance of the proposed method through simulation studies and show that it provides good bias reduction and statistical efficiency. Finally, we illustrate the method using data from the Framingham Heart Study to assess the relationship between offspring and parental age of onset of cardiovascular events.
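To make the imputation step concrete, here is a minimal sketch of Kaplan–Meier-based conditional mean imputation on simulated data; the data, the plain-NumPy KM estimator, and all variable names are illustrative assumptions rather than the authors' implementation. Each right-censored covariate value c is replaced by E[X | X > c], evaluated on the fitted survival step function and restricted to the largest observed event value:

```python
import numpy as np

def km_survival(x, observed):
    """Kaplan-Meier estimate for the covariate distribution.
    Returns the sorted unique event values and S(t) just after each."""
    event_vals = np.unique(x[observed == 1])
    s, surv = 1.0, []
    for v in event_vals:
        at_risk = np.sum(x >= v)
        events = np.sum((x == v) & (observed == 1))
        s *= 1.0 - events / at_risk
        surv.append(s)
    return event_vals, np.array(surv)

def impute_censored(x, observed):
    """Replace each right-censored value c with E[X | X > c] under the KM fit,
    restricted to the largest observed event value."""
    grid, surv = km_survival(x, observed)
    x_imp = x.astype(float).copy()
    for i in np.where(observed == 0)[0]:
        c = x[i]
        s_c = surv[grid <= c][-1] if np.any(grid <= c) else 1.0
        if s_c <= 0 or not np.any(grid > c):
            continue                      # nothing beyond c to average over
        knots = np.concatenate(([c], grid[grid > c]))
        s_vals = np.concatenate(([s_c], surv[grid > c]))
        # E[X | X > c] = c + (1 / S(c)) * integral_c^tmax S(u) du  (step function)
        x_imp[i] = c + np.sum(np.diff(knots) * s_vals[:-1]) / s_c
    return x_imp

rng = np.random.default_rng(1)
x_true = 20 + rng.weibull(1.5, 200) * 40       # hypothetical covariate (onset age)
cens = rng.uniform(25, 120, 200)               # random right-censoring times
observed = (x_true <= cens).astype(int)
x_obs = np.minimum(x_true, cens)
x_imputed = impute_censored(x_obs, observed)   # feed into the Cox outcome model
```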

2.
We discuss Bayesian log-linear models for incomplete contingency tables with both missing and interval censored cells, with the aim of obtaining reliable population size estimates. We also discuss use of external information on the censoring probability, which may substantially reduce uncertainty. We show in simulation that information on lower bounds and external information can each improve the mean squared error of population size estimates, even when the external information is not completely accurate. We conclude with an original example on estimation of prevalence of multiple sclerosis in the metropolitan area of Rome, where five out of six lists have interval censored counts. External information comes from mortality rates of multiple sclerosis patients.
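As a simplified illustration of how a log-linear model turns an incomplete contingency table into a population size estimate, the toy below uses a two-list independence model with hypothetical counts; bounding an interval-censored cell is likewise a crude stand-in for the paper's fully Bayesian treatment:

```python
# Two-list capture-recapture toy (hypothetical counts):
# n10 = on list A only, n01 = on list B only, n11 = on both lists.
n10, n01, n11 = 120, 150, 60

# Under the independence log-linear model the unobserved (0,0) cell is
n00_hat = n10 * n01 / n11
N_hat = n10 + n01 + n11 + n00_hat
print("estimated population size:", round(N_hat))    # 630

# An interval-censored cell supplies bounds instead of a count: if n01 is only
# known to lie in [130, 170], propagating the bounds brackets the estimate.
for n01_b in (130, 170):
    print("n01 =", n01_b, "->", round(n10 + n01_b + n11 + n10 * n01_b / n11))
```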

3.

Purpose

Results of life cycle assessments (LCAs) of power generation technologies are increasingly reported in terms of typical values and possible ranges. Extents of these ranges result from both variability and uncertainty. Uncertainty may be reduced via additional research. However, variability is a characteristic of supply chains as they exist; as such, it cannot be reduced without modifying existing systems. The goal of this study is to separately quantify uncertainty and variability in LCA.

Methods

In this paper, we present a novel method for differentiating uncertainty from variability in life cycle assessments of coal-fueled power generation, with a specific focus on greenhouse gas emissions. Individual coal supply chains were analyzed for 364 US coal power plants. Uncertainty in CO2 and CH4 emissions throughout these supply chains was quantified via Monte Carlo simulation. The method may be used to identify key factors that drive the range of life cycle emissions as well as the limits of precision of an LCA.

Results and discussion

Using this method, we statistically characterized the carbon footprint of coal power in the USA in 2009. Our method reveals that the average carbon footprint of coal power (100 year time horizon) ranges from 0.97 to 1.69 kg CO2eq/kWh of generated electricity (95 % confidence interval), primarily due to variability in plant efficiency. Uncertainty in the carbon footprints of individual plants spans a factor of 1.04 for the least uncertain plant footprint to a factor of 1.2 for the most uncertain plant footprint (95 % uncertainty intervals). The uncertainty in the total carbon footprint of all US coal power plants spans a factor of 1.05.

Conclusions

We have developed and successfully implemented a framework for separating uncertainty and variability in the carbon footprint of coal-fired power plants. Reduction of uncertainty will not substantially reduce the range of predicted emissions; that range can only be narrowed via substantial changes to the US coal power infrastructure. The finding that variability is larger than uncertainty cannot, of course, be generalized to other product systems and impact categories. Our framework can, however, be used to assess the relative influence of uncertainty and variability for a wide range of product systems and environmental impacts.
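The two-dimensional structure of such an analysis can be sketched as follows: the outer loop re-draws the uncertain parameters while the plant-to-plant variability dimension stays fixed. All values (efficiencies, the upstream CH4 term, the fuel-carbon bias) are hypothetical placeholders, not the study's inventory data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_outer, n_plants = 2000, 364          # uncertainty draws x plant fleet (US, 2009)

# Variability: fixed plant-to-plant differences (hypothetical efficiencies)
efficiency = rng.uniform(0.28, 0.42, n_plants)   # net plant efficiency
coal_c = 92.0                                     # kg CO2 per GJ of coal, nominal

footprints = np.empty((n_outer, n_plants))
for i in range(n_outer):
    # Uncertainty: re-draw uncertain parameters on each outer iteration
    ch4 = rng.lognormal(np.log(0.08), 0.4)        # upstream CH4, kg CO2eq/kWh
    fuel_bias = rng.normal(1.0, 0.02)             # uncertain coal carbon content
    footprints[i] = coal_c * fuel_bias * 3.6e-3 / efficiency + ch4

# Variability: spread of expected footprints across the fleet
plant_means = footprints.mean(axis=0)
print("fleet 95% range:", np.percentile(plant_means, [2.5, 97.5]))
# Uncertainty: spread across outer draws for one plant, expressed as a factor
p_lo, p_hi = np.percentile(footprints[:, 0], [2.5, 97.5])
print("plant 0 uncertainty factor:", p_hi / p_lo)
```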

4.
We explore the estimation of uncertainty in evolutionary parameters using a recently devised approach for resampling entire additive genetic variance–covariance matrices (G). Large-sample theory shows that maximum-likelihood estimates (including restricted maximum likelihood, REML) asymptotically have a multivariate normal distribution, with covariance matrix derived from the inverse of the information matrix, and mean equal to the estimated G. This suggests that sampling estimates of G from this distribution can be used to assess the variability of estimates of G, and of functions of G. We refer to this as the REML-MVN method. This has been implemented in the mixed-model program WOMBAT. Estimates of sampling variances from REML-MVN were compared to those from the parametric bootstrap and from a Bayesian Markov chain Monte Carlo (MCMC) approach (implemented in the R package MCMCglmm). We apply each approach to evolvability statistics previously estimated for a large, 20-dimensional data set for Drosophila wings. REML-MVN and MCMC sampling variances are close to those estimated with the parametric bootstrap. Both slightly underestimate the error in the best-estimated aspects of the G matrix. REML analysis supports the previous conclusion that the G matrix for this population is full rank. REML-MVN is computationally very efficient, making it an attractive alternative to both data resampling and MCMC approaches to assessing confidence in parameters of evolutionary interest.
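A minimal sketch of the REML-MVN idea, with a toy 4-trait G matrix and a diagonal stand-in for the REML sampling covariance (in practice this comes from the inverse average-information matrix reported by the mixed-model software):

```python
import numpy as np

rng = np.random.default_rng(42)
d = 4                                      # trait dimension (toy; the study used 20)
G_hat = np.array([[1.0, 0.3, 0.2, 0.1],
                  [0.3, 0.8, 0.25, 0.05],
                  [0.2, 0.25, 0.6, 0.15],
                  [0.1, 0.05, 0.15, 0.5]])

# vech(G): stack the lower triangle of the symmetric estimate
idx = np.tril_indices(d)
theta_hat = G_hat[idx]

# REML sampling covariance of vech(G); a small diagonal placeholder here
V_theta = np.eye(theta_hat.size) * 0.005

draws = rng.multivariate_normal(theta_hat, V_theta, size=10000)
evolvability = np.empty(len(draws))
for k, th in enumerate(draws):
    G = np.zeros((d, d))
    G[idx] = th
    G = G + G.T - np.diag(np.diag(G))      # rebuild the symmetric matrix
    evolvability[k] = np.trace(G) / d      # mean evolvability (average variance)

print("95% interval for mean evolvability:",
      np.percentile(evolvability, [2.5, 97.5]))
```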

5.
Trace elements in coal
Trace elements can have profound adverse effects on the health of people burning coal in homes or living near coal deposits, coal mines, and coal-burning power plants. Trace elements such as arsenic emitted from coal-burning power plants in Europe and Asia have been shown to cause severe health problems. Perhaps the most widespread health problems are caused by domestic coal combustion in developing countries where millions of people suffer from fluorosis and thousands from arsenism. Better knowledge of coal quality characteristics may help to reduce some of these health problems. For example, information on concentrations and distributions of potentially toxic elements in coal may help delineate areas of a coal deposit to be avoided. Information on the modes of occurrence of these elements and the textural relations of the minerals in coal may help to predict the behavior of the potentially toxic trace metals during coal cleaning, combustion, weathering, and leaching.

6.
Instrumental neutron activation analysis and photon-induced X-ray emission techniques were used to analyze size-fractionated atmospheric and combustion aerosols, together with other emission samples arising from fluidized-bed combustion of North Bohemian lignites; up to 42 elements were determined in all sample types. This allowed the evaluation of element enrichment, time trends, and interelement correlations, as well as factor analysis of various fractions of atmospheric aerosols. The data obtained on the mass and element size distributions of aerosols and emission samples from lignite combustion in an experimental-scale atmospheric fluidized-bed combustor, with and without added hydrated lime and limestone, were used to elucidate the mechanism by which toxic trace and matrix elements are abated from the flue gas.

7.
The traditional q1* methodology for constructing upper confidence limits (UCLs) for the low-dose slopes of quantal dose-response functions has two limitations: (i) it is based on an asymptotic statistical result that has been shown via Monte Carlo simulation not to hold in practice for small, real bioassay experiments (Portier and Hoel, 1983); and (ii) it assumes that the multistage model (which represents cumulative hazard as a polynomial function of dose) is correct. This paper presents an uncertainty analysis approach for fitting dose-response functions to data that does not require specific parametric assumptions or depend on asymptotic results. It has the advantage that the resulting estimates of the dose-response function (and uncertainties about it) no longer depend on the validity of an assumed parametric family nor on the accuracy of the asymptotic approximation. The method derives posterior densities for the true response rates in the dose groups, rather than deriving posterior densities for model parameters, as in other Bayesian approaches (Sielken, 1991), or resampling the observed data points, as in the bootstrap and other resampling methods. It does so by conditioning constrained maximum-entropy priors on the observed data. Monte Carlo sampling of the posterior (constrained, conditioned) probability distributions generates values of response probabilities that might be observed if the experiment were repeated with very large sample sizes. A dose-response curve is fit to each such simulated dataset. If no parametric model has been specified, then a generalized representation (e.g., a power-series or orthonormal polynomial expansion) of the unknown dose-response function is fit to each simulated dataset using "model-free" methods. The simulation-based frequency distribution of all the dose-response curves fit to the simulated datasets yields a posterior distribution function for the low-dose slope of the dose-response curve. An upper confidence limit on the low-dose slope is obtained directly from this posterior distribution. This "Data Cube" procedure is illustrated with a real dataset for benzene, and is seen to produce more policy-relevant insights than does the traditional q1* methodology. For example, it shows how far apart the 90%, 95%, and 99% limits are, and reveals how uncertainty about total and incremental risk varies with dose level (typically being dominated at low doses by uncertainty about the response of the control group, and at high doses by sampling variability). Strengths and limitations of the Data Cube approach are summarized, and potential decision-analytic applications to making better informed risk management decisions are briefly discussed.
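A heavily simplified sketch of the simulation logic: independent Beta posteriors with a monotonicity constraint stand in for the paper's constrained maximum-entropy priors, and a quadratic fit stands in for the "model-free" expansion. All counts are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)
doses = np.array([0.0, 1.0, 10.0, 100.0])       # hypothetical dose groups
x = np.array([1, 2, 5, 20])                     # responders per group (illustrative)
n = np.array([50, 50, 50, 50])                  # animals per group

slopes = []
while len(slopes) < 5000:
    # Posterior response rates per group (uniform prior -> Beta(x+1, n-x+1));
    # the paper's constrained maximum-entropy priors are more elaborate.
    p = rng.beta(x + 1, n - x + 1)
    if np.any(np.diff(p) < 0):
        continue                                # crude monotonicity constraint
    # Stand-in for the model-free fit: quadratic through the simulated rates
    coef = np.polyfit(doses, p, deg=2)
    slopes.append(np.polyval(np.polyder(coef), 0.0))   # slope at dose 0

slopes = np.array(slopes)
print("95% UCL on low-dose slope:", np.percentile(slopes, 95))
```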

8.
We use bootstrap simulation to characterize uncertainty in parametric distributions, including Normal, Lognormal, Gamma, Weibull, and Beta, commonly used to represent variability in probabilistic assessments. Bootstrap simulation enables one to estimate sampling distributions for sample statistics, such as distribution parameters, even when analytical solutions are not available. Using a two-dimensional framework for both uncertainty and variability, uncertainties in cumulative distribution functions were simulated. The mathematical properties of uncertain frequency distributions were evaluated in a series of case studies during which the parameters of each type of distribution were varied for sample sizes of 5, 10, and 20. For positively skewed distributions such as Lognormal, Weibull, and Gamma, the range of uncertainty is widest at the upper tail of the distribution. For symmetric unbounded distributions, such as Normal, the uncertainties are widest at both tails of the distribution. For bounded distributions, such as Beta, the uncertainties are typically widest in the central portions of the distribution. Bootstrap simulation enables complex dependencies between sampling distributions to be captured. The effects of uncertainty, variability, and parameter dependencies were studied for several generic functional forms of models, including models in which two-dimensional random variables are added, multiplied, and divided, to show the sensitivity of model results to different assumptions regarding model input distributions, ranges of variability, and ranges of uncertainty and to show the types of errors that may be obtained from mis-specification of parameter dependence. A total of 1,098 case studies were simulated. In some cases, counter-intuitive results were obtained. For example, the point value of the 95th percentile of uncertainty for the 95th percentile of variability of the product of four Gamma or Weibull distributions decreases as the coefficient of variation of each model input increases and, therefore, may not provide a conservative estimate. Failure to properly characterize parameter uncertainties and their dependencies can lead to orders-of-magnitude mis-estimates of both variability and uncertainty. In many cases, the numerical stability of two-dimensional simulation results was found to decrease as the coefficient of variation of the inputs increases. We discuss the strengths and limitations of bootstrap simulation as a method for quantifying uncertainty due to random sampling error.
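A minimal sketch of the bootstrap idea for one case, a lognormal fit to a sample of 10: each bootstrap replicate refits the distribution parameters, and the spread of the refitted quantiles gives the uncertainty band at each percentile of variability (widest at the upper tail for positively skewed families). The data and settings are illustrative:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
data = rng.lognormal(mean=1.0, sigma=0.8, size=10)    # small sample, n = 10

B = 2000
probs = np.linspace(0.05, 0.95, 19)                   # percentiles of variability
quants = np.empty((B, probs.size))
for b in range(B):
    boot = rng.choice(data, size=data.size, replace=True)
    mu, sd = np.log(boot).mean(), np.log(boot).std(ddof=1)
    quants[b] = np.exp(mu + sd * norm.ppf(probs))     # refitted lognormal quantiles

lo, hi = np.percentile(quants, [2.5, 97.5], axis=0)   # uncertainty band per percentile
widest = probs[np.argmax(hi - lo)]
print("uncertainty band is widest at the %.0fth percentile" % (widest * 100))
```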

9.
10.
Mathematical approaches are not well established for calculating the upper confidence limit (UCL) of the mean of a set of concentration values that have been measured using a count-based analytical approach such as is commonly used for asbestos in air. This is because the uncertainty around the sample mean is determined not only by the authentic between-sample variation (sampling error), but also by random Poisson variation that occurs in the measurement of sample concentrations (measurement error). This report describes a computer-based application, referred to as CB-UCL, that supports the estimation of UCL values for asbestos and other count-based sample sets, with special attention to datasets with relatively small numbers of samples and relatively low counts (including datasets with all-zero count samples). Evaluation of the performance of the application with a range of test datasets indicates the application is useful for deriving UCL estimates for datasets of this type.
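A naive two-stage bootstrap conveys the core problem: both between-sample variation and Poisson counting error contribute to uncertainty about the mean. This sketch is not the CB-UCL algorithm itself, which is designed to remain valid for very low-count and all-zero datasets; the counts and sensitivity below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)
counts = np.array([0, 1, 0, 3, 2])    # fiber counts per sample (hypothetical)
sens = 0.001                           # analytical sensitivity: concentration per count

B, n = 10000, counts.size
means = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, n)        # between-sample variation (sampling error)
    noisy = rng.poisson(counts[idx])   # within-sample Poisson counting error
    means[b] = noisy.mean() * sens

print("sample mean concentration:", counts.mean() * sens)
print("naive bootstrap 95% UCL:", np.percentile(means, 95))
```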

11.
Genome-scale data have greatly facilitated the resolution of recalcitrant nodes that Sanger-based datasets have been unable to resolve. However, phylogenomic studies continue to use traditional methods such as bootstrapping to estimate branch support, and high bootstrap values are still interpreted as providing strong support for the correct topology. Furthermore, relatively little attention has been given to assessing discordances between gene and species trees, and the underlying processes that produce phylogenetic conflict. We generated novel genomic datasets to characterize and determine the causes of discordance in Old World treefrogs (Family: Rhacophoridae)—a group that is fraught with conflicting and poorly supported topologies among major clades. Additionally, a suite of data filtering strategies and analytical methods was applied to assess their impact on phylogenetic inference. We showed that incomplete lineage sorting was detected at all nodes that exhibited high levels of discordance; those nodes were also associated with extremely short internal branches. We also clearly demonstrate that bootstrap values do not reflect uncertainty or confidence in the correct topology and, hence, should not be used as a measure of branch support in phylogenomic datasets. Overall, we showed that phylogenetic discordances in Old World treefrogs resulted from incomplete lineage sorting and that species tree inference can be improved using a multi-faceted, total-evidence approach, which uses the largest amount of data and considers results from different analytical methods and datasets.

12.
Evaluating the biotoxicity of chromium using the phototactic behavior of Daphnia carinata
A method is reported for evaluating the biotoxicity of chromium using changes in the phototactic behavior of the Daphnia carinata monoclone Dc42 as a biomonitor. The results show that the inhibition rate of phototactic behavior reflects the degree of chromium pollution in water well. In standard potassium dichromate toxicant solutions, the phototaxis index was highly significantly and negatively correlated with Cr6+ concentration (R2 = 0.8089, P < 0.001); the lower detection limit for Cr6+ was 0.056 mg·L-1, far below the LC50 and EC50, with an average precision of 5.46%. This indicates that the phototaxis-index method is a sensitive, accurate, and reliable way to monitor the biotoxicity of chemical substances.

13.
Quantification of the uncertainty associated with risk estimates is an important part of risk assessment. In recent years, the use of second-order distributions and two-dimensional simulations has been suggested for quantifying both variability and uncertainty. These approaches are better interpreted within the Bayesian framework. To help practitioners better use such methods and interpret the results, in this article we describe the propagation and interpretation of uncertainty in the Bayesian paradigm. We consider both the estimation problem, where some summary measures of the risk distribution (e.g., mean, variance, or selected percentiles) are to be estimated, and the prediction problem, where the risk values for some specific individuals are to be predicted. We discuss some connections and differences between uncertainties in estimation and prediction problems, and present an interpretation of a decomposition of total variability/uncertainty into variability and uncertainty in terms of the expected squared error of prediction and its reduction from perfect information. We also discuss the role of Monte Carlo methods in characterizing uncertainty. We explain the basic ideas using a simple example, and demonstrate Monte Carlo calculations using another example from the literature.
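The decomposition can be illustrated by a small Monte Carlo: by the law of total variance, the total spread of a predicted individual risk splits into expected variability given the parameters plus uncertainty about the parameters. The distributions and values below are illustrative, not from the article:

```python
import numpy as np

rng = np.random.default_rng(11)
M = 100_000

# Uncertainty: posterior draws of the population parameter (illustrative values)
mu = rng.normal(2.0, 0.3, M)        # uncertain log-scale population mean
sd = 0.5                            # known log-scale individual variability

# Prediction: one random individual's risk per posterior draw
y = rng.lognormal(mu, sd)

total = y.var()
variability = np.mean((np.exp(sd**2) - 1) * np.exp(2 * mu + sd**2))  # E[Var(Y|mu)]
uncertainty = np.var(np.exp(mu + sd**2 / 2))                          # Var(E[Y|mu])
print(total, variability + uncertainty)   # law of total variance: approximately equal
```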

14.
We investigate the uncertainties associated with modeling the potential health effects of mercury released to the atmosphere on piscivorous animals. The multimedia modeling system combines an atmospheric fate and transport model, an aquatic cycling model, and a terrestrial food web model. First, the modeling system is used to calculate point values of the animals' hazard quotients (i.e., measures of toxic dose). Next, we use a simplified version of the modeling system to conduct a probabilistic analysis for the Great Lakes region that takes into account input uncertainty, variability, and uncertainty and variability combined. The use of two different software packages for the combined uncertainty/variability analysis led to similar results except for high values (>90th percentile), where some differences were evident. A sensitivity study was performed on the combined uncertainty and variability analysis. Regional variability caused more than 70% of the variance in the results, with the fish bioaccumulation factor accounting for the majority of the variability. The major sources of uncertainty were the speciation of the mercury emissions, the lake pH, and the sediment burial rate.

15.
The purpose of this study was to evaluate the remediation potential and disturbance response indicators of Impatiens walleriana exposed to benzene and chromium. Numerous studies over the years have found abundant evidence of the carcinogenicity of benzene and chromium (VI) in humans. Benzene and chromium are two toxic industrial chemicals commonly found together at contaminated sites, and one of the most common management strategies employed in the recovery of sites contaminated by petroleum products and trace metals is in situ remediation. Given the increasing interest in the use of plants as depollution agents, direct injection tests and benzene misting were performed on I. walleriana to evaluate the remediation potential of this species. I. walleriana accumulated hexavalent chromium mainly in the root system (164.23 mg kg-1) rather than the aerial part (39.72 mg kg-1), and presented visible damage only at the highest concentration (30 mg L-1). Unlike chromium (VI), chromium (III) was retained almost entirely by the soil, leaving it available for removal by phytotechnology. After the contamination stopped, however, I. walleriana responded positively to the detoxification process, recovering its stem stiffness and leaf color. I. walleriana showed visible changes such as leaf chlorosis during the ten days of benzene contamination. When benzene is absorbed by the roots, it is translocated to and accumulated in the plant's aerial part; this mechanism ensures the plant's tolerance to the organic compound, enabling the species to survive and reproduce after treatment with benzene. Although I. walleriana accumulates only minor amounts of hexavalent chromium in the aerial part, this amount suffices to induce greater oxidative stress and a larger increase in hydrogen peroxide than benzene does. It was therefore concluded that I. walleriana possesses desirable characteristics for phytotechnology.

16.
Li XG, Lv Y, Ma BG, Jian SW, Tan HB. Bioresource Technology 2011, 102(20): 9783-9787
The thermal behavior of high-ash anthracite coal, tobacco residue, and their blends during combustion was investigated by thermogravimetric analysis (20 K min-1, from ambient temperature to 1273 K). The effects of the mixing proportion of coal and tobacco residue on the combustion process and on ignition and burnout characteristics were also studied. The results indicated that the combustion of tobacco residue was controlled by the release of volatile matter; the reaction regions were more complex for tobacco residue (four peaks) than for coal (two peaks). The blends showed integrative thermal profiles reflecting both tobacco residue and coal. Incorporating tobacco residue improved the combustion characteristics of the high-ash anthracite coal, especially the ignition and burnout characteristics, compared with burning tobacco residue and coal separately. Co-combustion of tobacco residue and high-ash anthracite coal is therefore a feasible fuel option.

17.
Research progress on models for quantifying carbon emissions from forest fires
Forest fires are an important disturbance factor in forest ecosystems and one of the major pathways by which vegetation and soil carbon stocks are reduced. Carbon-containing gases emitted by forest fires have an important influence on the atmospheric carbon balance and global climate change, and quantifying them scientifically and effectively is essential to understanding the role of forest fires in the global carbon cycle and carbon balance. This paper reviews progress on models for quantifying carbon emissions from forest fires from three perspectives: methods for quantifying the total carbon and carbon-containing gases emitted directly by forest fires; the influencing factors and parameters of such models; and the sources of uncertainty in quantifying fire carbon emissions. Finally, three approaches are proposed to improve quantification: using high-resolution remote sensing data and improved algorithms to raise the accuracy of burned-area estimates, combined with available-fuel models to improve fuel-load estimation; using high-resolution remote sensing imagery together with controlled laboratory experiments, field experiments, and burned-site surveys to determine combustion efficiency; and determining emission factors and emission ratios through extensive laboratory combustion experiments and airborne field sampling.
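The quantification methods surveyed by such reviews generally build on a first-order product of burned area, fuel load, combustion efficiency, and an emission factor (the Seiler–Crutzen form). A minimal sketch with hypothetical values:

```python
# Seiler-Crutzen-style first-order estimate (all values hypothetical):
# emissions = burned area * fuel load * combustion efficiency * emission factor
area_ha = 1200.0      # burned area (ha)
fuel_load = 45.0      # available fuel (t per ha)
comb_eff = 0.35       # fraction of fuel actually consumed
ef_co2 = 1.58         # t CO2 emitted per t of fuel burned
co2_t = area_ha * fuel_load * comb_eff * ef_co2
print("CO2 released: %.0f t" % co2_t)
```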

18.
Nitrous oxide (N2O) is a greenhouse gas with a global warming potential approximately 298 times that of CO2. In 2006, the Intergovernmental Panel on Climate Change (IPCC) estimated N2O emission due to synthetic and organic nitrogen (N) fertilization at 1% of applied N. We investigated the uncertainty in this estimate by fitting 13 different models to a published dataset of 985 N2O measurements. These models were characterized by (i) the presence or absence of the explanatory variable "applied N", (ii) the function relating N2O emission to applied N (exponential or linear), (iii) fixed or random background N2O emission (i.e., emission in the absence of N application), and (iv) a fixed or random applied-N effect. We calculated ranges of uncertainty in N2O emissions from a subset of these models and compared them with the uncertainty ranges currently used in the IPCC Tier 1 method. The exponential models outperformed the linear models, and models including one or two random effects outperformed those including fixed effects only. The use of an exponential rather than a linear function has an important practical consequence: the emission factor is not constant and increases with applied N. Emission factors estimated using the exponential function were lower than 1% when the amount of N applied was below 160 kg N ha−1. Our uncertainty analysis shows that the uncertainty range currently used by the IPCC Tier 1 method could be reduced.
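The practical consequence of the exponential form can be shown in a few lines; the coefficients below are hypothetical, chosen only so that the emission factor crosses 1% near 160 kg N ha−1, matching the threshold reported above:

```python
import numpy as np

# Exponential model E(N) = exp(a + b*N) versus linear E(N) = c + d*N.
# With the exponential form the emission factor is not constant:
#   EF(N) = (E(N) - E(0)) / N increases with the application rate.
a, b = np.log(1.0), 0.006     # hypothetical fitted coefficients
N = np.array([50.0, 100.0, 160.0, 250.0])      # applied N, kg N/ha
emission = np.exp(a + b * N)                    # kg N2O-N/ha
ef_percent = (emission - np.exp(a)) / N * 100   # emission factor, % of applied N
print(dict(zip(N, ef_percent.round(2))))        # ~0.70, 0.82, 1.01, 1.39
```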

19.
Area under the receiver operating characteristic curve (AROC) is commonly used to choose a biomechanical metric from which to construct an injury risk curve (IRC). However, AROC may not handle censored datasets adequately. Survival analysis creates robust estimates of IRCs that accommodate censored data. We present an observation-adjusted ROC (oaROC) which uses the survival-based IRC to estimate the AROC. We verified and evaluated this method using simulated datasets of different censoring statuses and sample sizes. For a dataset with 1000 left- and right-censored observations, the median AROC closely approached oaROC_True (the oaROC calculated using an assumed "true" IRC), differing by only 0.1%. Using simulated datasets with various censoring, we found that oaROC converged on oaROC_True in all cases. For datasets with right-censored and non-censored observations, AROC did not converge on oaROC_True. The oaROC for datasets with only non-censored observations converged fastest: for a dataset with 10 observations, the median oaROC differed from oaROC_True by 2.74%, while the corresponding median AROC with left- and right-censored data differed from oaROC_True by 9.74%. We also calculated the AROC and oaROC for a published side-impact dataset; differences between the two methods ranged from −24.08% to 24.55%, depending on the metric. Overall, compared with AROC, oaROC performs equivalently for doubly censored data, performs better for non-censored data, and can accommodate more types of data. While more validation is needed, the results indicate that oaROC is a viable alternative that can be incorporated into the metric selection process for IRCs.
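The survival-analysis step, fitting an IRC to doubly censored injury data, can be sketched as a censored maximum-likelihood Weibull fit; the simulated tests, the Weibull family, and the optimizer are illustrative assumptions, and the oaROC statistic itself is not reproduced here:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(2)

# Hypothetical doubly censored injury tests: injury at x -> left-censored
# (tolerance <= x); no injury at x -> right-censored (tolerance > x).
true_shape, true_scale = 3.0, 70.0
x_test = rng.uniform(20.0, 140.0, 80)             # tested metric values
tol = weibull_min.rvs(true_shape, scale=true_scale, size=80, random_state=rng)
x_left, x_right = x_test[x_test >= tol], x_test[x_test < tol]

def negloglik(params):
    shape, scale = np.exp(params)                  # keep both parameters positive
    ll = (np.sum(np.log(weibull_min.cdf(x_left, shape, scale=scale) + 1e-12))
          + np.sum(np.log(weibull_min.sf(x_right, shape, scale=scale) + 1e-12)))
    return -ll

res = minimize(negloglik, x0=np.log([2.0, 60.0]), method="Nelder-Mead")
shape_hat, scale_hat = np.exp(res.x)
print("fitted Weibull IRC: shape %.2f, scale %.1f" % (shape_hat, scale_hat))
print("P(injury | x=70):", weibull_min.cdf(70.0, shape_hat, scale=scale_hat))
```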

20.
Wood furniture is an important source of indoor air pollution. To date, the detection of harmful substances in wood furniture has relied on controlling a single component, formaldehyde; broader detection and evaluation of the pollutants released by wood furniture are therefore necessary. A novel method based on a cataluminescence (CTL) sensor system generated on the surface of nano-3TiO2–2BiVO4 was proposed for the simultaneous detection of pollutants released by wood furniture. Formaldehyde and benzene were selected as models to investigate the CTL-sensing properties of the sensor system. Field emission scanning electron microscopy (FESEM), transmission electron microscopy (TEM), and X-ray diffraction (XRD) were employed to characterize the as-prepared samples. The results showed that the test system exhibited outstanding CTL properties, including stable intensity, a high signal-to-noise ratio, and short response and recovery times. In addition, the limits of detection for formaldehyde and benzene were below the permitted standard concentrations, and the sensor system showed outstanding selectivity for formaldehyde and benzene compared with eight other common volatile organic compounds (VOCs). This performance may support the timely promulgation of emission limit standards for VOCs from furniture.
