Similar references (20 results)
1.
Hydrologic models are the foundation of water resource management and planning, and the conceptual model is an essential component of any groundwater model. Because understanding of natural hydrogeological conditions is limited, conceptual models are almost always constructed incompletely, so uncertainty in the output is inevitable when a natural groundwater field is simulated by a single groundwater model. Here, a synthetic groundwater model is built and regarded as the true model, and three alternative conceptual models are constructed under incomplete hydrogeological conditions. The outputs of these groundwater models (groundwater budget terms from the boundary conditions) are analyzed statistically. The results show that the closer a conceptual model is to the true hydrogeological conditions, the more the distribution of its outputs concentrates on the true outputs. Hence, the more reliable the structure of the conceptual model, the more reliable the output of the groundwater model. Moreover, uncertainty caused by the conceptual model cannot be compensated for by parameter uncertainty.

2.
A method is presented to statistically evaluate toxicity study designs for dose–response assessment, aimed at minimizing the uncertainty in the resulting benchmark dose (BMD) estimates. Although the BMD method has been accepted as a valuable tool for risk assessment, the traditional no observed adverse effect level (NOAEL)/lowest observed adverse effect level (LOAEL) approach is still the principal basis for toxicological study design. To develop similar protocols for experimental design for BMD estimation, methods are needed that account for variability in experimental outcomes and for uncertainty in dose–response model selection and model parameter estimates. Building on Bayesian model averaging (BMA) BMD estimation, this study identifies study design criteria that reduce the uncertainty in BMA BMD estimates, using a Monte Carlo pre-posterior analysis of BMA BMD predictions. The results suggest that (1) as more animals are tested, there is less uncertainty in BMD estimates; (2) one relatively high dose is needed, and the other doses can then be spread appropriately over the resulting dose scale; (3) placing different numbers of animals in different dose groups has very limited influence on improving BMD estimation; and (4) when the total number of animals is fixed, using more (but smaller) dose groups is the preferred strategy.
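
A minimal sketch of the BMA step may make the approach concrete: two hypothetical dose–response models are fit to quantal data, BIC-based weights stand in for posterior model probabilities, and the benchmark dose at 10% extra risk is averaged across models. The design, counts, and model forms below are illustrative assumptions, not the study's actual setup.

```python
# Hedged sketch of BMA-based BMD estimation; data and models are illustrative.
import numpy as np
from scipy.optimize import minimize, brentq
from scipy.stats import binom

doses = np.array([0.0, 1.0, 3.0, 10.0])   # assumed design
n = np.array([10, 10, 10, 10])
y = np.array([0, 1, 3, 7])                # hypothetical affected animals

models = {  # name -> (response probability function, initial guess)
    "logistic": (lambda d, a, b: 1 / (1 + np.exp(-(a + b * d))), [-2.0, 0.3]),
    "one-hit":  (lambda d, a, b: 1 - np.exp(-(a + b * d)),       [0.05, 0.1]),
}

def nll(theta, p_fun):
    p = np.clip(p_fun(doses, *theta), 1e-9, 1 - 1e-9)
    return -binom.logpmf(y, n, p).sum()

bics, bmds = {}, {}
for name, (p_fun, x0) in models.items():
    fit = minimize(nll, x0, args=(p_fun,), method="Nelder-Mead")
    bics[name] = 2 * fit.fun + len(x0) * np.log(n.sum())
    p0 = p_fun(0.0, *fit.x)
    # BMD at 10% extra risk: (p(d) - p0) / (1 - p0) = 0.10
    bmds[name] = brentq(lambda d: (p_fun(d, *fit.x) - p0) / (1 - p0) - 0.10,
                        1e-6, 100.0)

w = np.exp(-0.5 * (np.array(list(bics.values())) - min(bics.values())))
w /= w.sum()                              # approximate posterior model weights
bmd_bma = float(w @ np.array(list(bmds.values())))
print(dict(zip(bics, w.round(3))), f"BMA BMD ~ {bmd_bma:.2f}")
```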

3.
Purpose

Consequential life cycle assessment (C-LCA) aims to assess the environmental consequences of a decision. It differs from traditional LCA in that its inventory includes all the processes affected by the decision, which are identified by accounting for causal links (physical, economic, etc.). However, C-LCA results can be quite uncertain, which makes the interpretation phase harder, so strategies to assess and reduce uncertainty in C-LCA are needed. Part of the uncertainty in C-LCA is due to spatial variability, which can be reduced through regionalization. However, regionalization can be complex and time-consuming if applied straightforwardly to an entire LCA model.

Methods

The main purpose of this article is to prioritize regionalization efforts to enhance interpretation in C-LCA by assessing the spatial uncertainty of a case study built on a partial equilibrium economic model. Three specific objectives are derived: (1) perform a C-LCA case study of alternative transportation scenarios to investigate the benefits of implementing a public policy for energy transition in France by 2050, with an uncertainty analysis to test the strength of our conclusions; (2) perform global sensitivity analyses to identify and quantify the main sources of spatial uncertainty among the foreground inventory model (derived from partial equilibrium economic modeling), the background inventory model, and the characterization factors; and (3) propose a strategy to reduce the spatial uncertainty of our C-LCA case study by prioritizing regionalization.

Results and discussion

Results show that implementing alternative transport scenarios in compliance with the public policy for the energy transition in France is beneficial for some impact categories (ICs) (global warming, marine acidification, marine eutrophication, terrestrial acidification, thermally polluted water, photochemical oxidant formation, and particulate matter formation) at a 95% confidence level. For the other ICs, uncertainty reduction is required before conclusions can be drawn with a similar level of confidence. Input variables with spatial variability from the partial equilibrium economic model are significant contributors to the C-LCA spatial uncertainty and should be prioritized for spatial uncertainty reduction. In addition, characterization factors are significant contributors to the spatial uncertainty of the results for all regionalized ICs (except the land occupation IC).

Conclusions

Ways to reduce the spatial uncertainty arising from economic modeling should be explored. Uncertainty reduction to enhance the interpretation phase and decision-making should be prioritized according to the goal and scope of the LCA study. In addition, using regionalized characterization factors (CFs) in C-LCA appears to be relevant, and C-LCA calculation tools should be adapted accordingly.
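
To make the prioritization step concrete, the sketch below estimates first-order Sobol indices with a pick-freeze (Saltelli-style) scheme for a toy stand-in of the C-LCA model; the three inputs mimic the paper's uncertainty sources, and the multiplicative model and lognormal spreads are assumptions for illustration only.

```python
# Pick-freeze estimation of first-order Sobol indices (toy model, assumed spreads).
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
names = ["foreground_econ", "background_inventory", "charact_factors"]
sigmas = np.array([0.4, 0.15, 0.25])     # assumed spatial spreads per source

def f(x):  # toy stand-in for the C-LCA impact model (illustration only)
    return x[:, 0] * x[:, 1] * x[:, 2]

A = rng.lognormal(0.0, sigmas, size=(N, 3))
B = rng.lognormal(0.0, sigmas, size=(N, 3))
yA = f(A)

for i, name in enumerate(names):
    ABi = B.copy()
    ABi[:, i] = A[:, i]                  # "freeze" input i at A's values
    yABi = f(ABi)
    Si = (np.mean(yA * yABi) - np.mean(yA) * np.mean(yABi)) / np.var(yA)
    print(f"first-order Sobol index, {name}: {Si:.2f}")
# the input group with the largest index is the regionalization priority
```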


4.
Fu Yu (傅煜), Lei Yuancai (雷渊才), Zeng Weisheng (曾伟生). Acta Ecologica Sinica (《生态学报》), 2015, 35(23): 7738–7747
Using continuous observation data and biomass data for Chinese fir from systematically sampled permanent plots in Jiangxi Province, the total aboveground biomass of Chinese fir in Jiangxi Province was estimated by repeatedly simulating, with the Monte Carlo method, the process of scaling individual-tree biomass models up to regional aboveground biomass. Based on designs with different modeling sample sizes n and different coefficients of determination R², the effects of the variability of individual-tree model parameters and of model residuals on the uncertainty of the regional biomass estimate were studied separately. The results show that the 2009 estimate of aboveground biomass of Chinese fir in Jiangxi Province was (19.84 ± 1.27) t/hm², with the uncertainty amounting to about 6.41% of the estimate. The computation time needed for the biomass estimate and its uncertainty to stabilize decreased as the modeling sample size and R² increased, and residual variability had a smaller influence on the uncertainty than parameter variability.
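
The Monte Carlo scheme described above can be sketched as follows, assuming a power-law single-tree model B = a·D^b: parameters are drawn from an assumed covariance matrix and per-tree residual noise is toggled on or off, so the two variance sources can be compared. All numbers are placeholders, not the paper's data.

```python
# Parameter vs. residual variability in scaling tree models to a regional mean.
import numpy as np

rng = np.random.default_rng(1)
dbh = rng.gamma(shape=6.0, scale=3.0, size=2000)      # placeholder diameters (cm)

theta = np.array([0.08, 2.4])                          # a, b of B = a * D**b (assumed)
cov = np.array([[1e-4, -5e-5], [-5e-5, 4e-4]])         # assumed parameter covariance
resid_sd = 5.0                                         # assumed residual SD (kg/tree)

def mc(include_residual, m=5000):
    out = np.empty(m)
    for i in range(m):
        a, b = rng.multivariate_normal(theta, cov)     # parameter variability
        eps = rng.normal(0, resid_sd, dbh.size) if include_residual else 0.0
        out[i] = np.mean(a * dbh**b + eps)             # mean tree biomass (kg)
    return out

both, params_only = mc(True), mc(False)
print(f"param+residual SD {both.std():.2f} kg vs param-only SD {params_only.std():.2f} kg")
# residual noise averages out over many trees, so it adds little uncertainty
```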

5.
Purpose

Objective uncertainty quantification (UQ) of a product life cycle assessment (LCA) is a critical step for decision-making. Environmental impacts can be measured directly or estimated using models, in which underlying mathematical functions approximate the environmental impacts during various LCA stages. In this study, three possible uncertainty sources of a mathematical model were investigated: input variability, model parameter uncertainty (differentiated from inputs in this study), and model-form uncertainty. A simple, easy-to-implement method is proposed to quantify each source.

Methods

Various data analytics methods were used to conduct a thorough model uncertainty analysis: (1) interval analysis was used for input uncertainty quantification, with direct sampling by Monte Carlo (MC) simulation, and the results were compared with those of indirect nonlinear optimization as an alternative approach; a machine learning surrogate model was developed to perform both the direct MC sampling and the indirect nonlinear optimization. (2) Bayesian inference was adopted to quantify parameter uncertainty. (3) A recently introduced model correction method based on orthogonal polynomial basis functions was used to evaluate the model-form uncertainty. The methods were applied to a pavement LCA to propagate uncertainties through an energy and global warming potential (GWP) estimation model, using the case of a pavement section in the Chicago metropolitan area.

Results and discussion

Results indicate that each uncertainty source contributes to the overall energy and GWP output of the LCA. Input uncertainty was shown to have a significant impact on the overall GWP output; for the example case study, the GWP interval was around 50%. Parameter uncertainty results showed that an assumed ±10% uniform variation in the model parameter priors resulted in 28% variation in the GWP output. Model-form uncertainty had the lowest impact (less than 10% variation in the GWP), because the original energy model is relatively accurate in estimating energy. However, a sensitivity study of the model-form uncertainty showed that variations of up to 180% in the results can arise when the original model is less accurate.

Conclusions

Investigating each uncertainty source of the model demonstrated the importance of accurate characterization, propagation, and quantification of uncertainty. This study proposes independent and relatively easy-to-implement methods that provide robust grounds for objective model uncertainty analysis in LCA applications. Assumptions on inputs, parameter distributions, and model form need to be justified. Input uncertainty plays a key role in the overall pavement LCA output. The proposed model correction method, as well as the interval analysis, were relatively easy to implement. Research is still needed to develop a more generic and simplified MCMC simulation procedure that is fast to implement.
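
As a sketch of the interval-analysis step, the code below propagates assumed input intervals through a toy GWP model by direct Monte Carlo sampling and cross-checks the bounds with nonlinear optimization; the model g and its bounds are illustrative assumptions.

```python
# Interval analysis: direct MC sampling vs. indirect optimization of the bounds.
import numpy as np
from scipy.optimize import minimize

lo = np.array([0.9, 10.0, 0.02])     # assumed lower bounds of the inputs
hi = np.array([1.1, 14.0, 0.05])     # assumed upper bounds

def g(x):  # stand-in GWP model (kg CO2e): amount * distance * emission factor
    return x[0] * x[1] * x[2]

rng = np.random.default_rng(2)
X = rng.uniform(lo, hi, size=(200_000, 3))
y = g(X.T)                            # direct Monte Carlo sampling of the interval

x0 = (lo + hi) / 2
bounds = list(zip(lo, hi))
g_min = minimize(g, x0, bounds=bounds).fun              # indirect: optimize for min
g_max = -minimize(lambda x: -g(x), x0, bounds=bounds).fun
print(f"MC interval  [{y.min():.3f}, {y.max():.3f}]")
print(f"opt interval [{g_min:.3f}, {g_max:.3f}]")       # MC bounds sit just inside
```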


6.
Background: Recent approaches have sought to harness the potential of stem cells to regenerate bone lost as a consequence of trauma or disease. Bone marrow aspirate (BMA) provides an autologous source of osteoprogenitors for such applications. However, previous studies indicated that the concentration of osteoprogenitors present in BMA is lower than required for robust bone regeneration. We provide further evidence for the importance of BMA enrichment for skeletal tissue engineering strategies, using a novel acoustic wave-facilitated filtration strategy to concentrate BMA for osteoprogenitors that is clinically applicable for intraoperative orthopedic use.
Methods: Femoral BMA from 15 patients in an elderly cohort was concentrated for the nucleated cell fraction, against erythrocytes and excess plasma volume, via size exclusion filtration facilitated by acoustic agitation. The effect of aspirate concentration was assessed by assays for colony formation, flow cytometry, multilineage differentiation, and scaffold seeding efficiency.
Results: BMA was filtered to achieve a mean 4.2-fold reduction in volume with a corresponding enrichment of viable and functional osteoprogenitors, as indicated by flow cytometry and colony formation assays. Enhanced osteogenic and chondrogenic differentiation was observed using concentrated aspirate, and cell-seeding efficiency onto allogeneic bone graft was enhanced as an effect of osteoprogenitor concentration relative specifically to the concentration of erythrocytes in the aspirate.
Conclusions: These studies provide evidence for the importance of BMA nucleated cell concentration for both cell differentiation and cell seeding efficiency, and demonstrate the potential of this approach for intraoperative application to enhance bone healing.

7.
High-throughput experimentation has revolutionized data-driven experimental sciences and opened the door to the application of machine learning techniques. Nevertheless, the quality of any data analysis strongly depends on the quality of the data, and specifically on the degree to which random effects in the experimental data-generating process are quantified and accounted for. Accordingly, calibration, i.e., the quantitative association between observed quantities and measurement responses, is a core element of many workflows in experimental sciences.
Particularly in the life sciences, univariate calibration, often involving non-linear saturation effects, must be performed to extract quantitative information from measured data. At the same time, the estimation of uncertainty is inseparably connected to quantitative experimentation. Adequate calibration models are required that describe not only the input/output relationship of a measurement system but also its inherent measurement noise. Due to its mathematical nature, statistically robust calibration modeling remains a challenge for many practitioners, while at the same time being extremely beneficial for machine learning applications.
In this work, we present a bottom-up conceptual and computational approach that solves many problems of understanding and implementing non-linear, empirical calibration modeling for the quantification of analytes and for process modeling. The methodology is first applied to the optical measurement of biomass concentrations in a high-throughput cultivation system, then to the quantification of glucose by an automated enzymatic assay. We implemented the conceptual framework in two Python packages, calibr8 and murefi, with which we demonstrate how to make uncertainty quantification for various calibration tasks more accessible. Our software packages enable more reproducible and automatable data analysis routines than workflows commonly observed in the life sciences.
Subsequently, we combine the previously established calibration models with a hierarchical Monod-like ordinary differential equation model of microbial growth to describe multiple replicates of Corynebacterium glutamicum batch cultures. Key process model parameters are learned by both maximum likelihood estimation and Bayesian inference, highlighting the flexibility of the statistical and computational framework.
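
The sketch below shows the flavor of such a calibration model in plain SciPy (deliberately not the calibr8 API): a saturating mean curve with a mean-dependent noise model is fit by maximum likelihood and inverted to quantify an unknown sample. The curve form, noise model, and numbers are assumptions.

```python
# Non-linear calibration with heteroscedastic noise, fit by maximum likelihood.
import numpy as np
from scipy.optimize import minimize, brentq
from scipy.stats import norm

rng = np.random.default_rng(3)
conc = np.tile([0.0, 0.5, 1.0, 2.0, 5.0, 10.0], 3)        # standards (g/L), assumed
true = 0.05 + 2.0 * conc / (1.5 + conc)                    # toy saturating response
signal = true + rng.normal(0.0, 0.01 + 0.02 * true)        # noise grows with signal

def mu(c, top, km, bg):                                    # Monod/Michaelis-type mean
    return bg + top * c / (km + c)

def nll(theta):
    top, km, bg, s0, s1 = theta
    m = mu(conc, top, km, bg)
    s = s0 + s1 * m                                        # linear noise model
    if km <= 0 or np.any(s <= 0):
        return 1e9                                         # keep parameters valid
    return -norm.logpdf(signal, m, s).sum()

fit = minimize(nll, x0=[2.0, 1.0, 0.05, 0.01, 0.02], method="Nelder-Mead")
top, km, bg, s0, s1 = fit.x

obs = 1.2                                                  # a new raw measurement
c_hat = brentq(lambda c: mu(c, top, km, bg) - obs, 0.0, 100.0)
print(f"point estimate of concentration ~ {c_hat:.2f} g/L")
```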

8.
This article investigates an ensemble-based technique called Bayesian model averaging (BMA) to improve the performance of protein amino acid pKa predictions. Structure-based pKa calculations play an important role in the mechanistic interpretation of protein structure and are also used to determine a wide range of protein properties. A diverse set of methods currently exists for pKa prediction, ranging from empirical statistical models to ab initio quantum mechanical approaches. However, each of these methods is based on a set of conceptual assumptions that can affect the model's accuracy and generalizability for pKa prediction in complicated biomolecular systems. We use BMA to combine eleven diverse prediction methods that each estimate pKa values of amino acids in staphylococcal nuclease. These methods are based on work conducted for the pKa Cooperative, and the pKa measurements are based on experimental work conducted by the García-Moreno lab. Our cross-validation study demonstrates that the aggregated estimate obtained from BMA outperforms all individual prediction methods, with improvements ranging from 45% to 73% over other method classes. This study also compares BMA's predictive performance to that of other ensemble-based techniques and demonstrates that BMA can outperform these approaches, with improvements ranging from 27% to 60%. This work illustrates a new possible mechanism for improving the accuracy of pKa prediction and lays the foundation for future work on aggregate models that balance computational cost with prediction accuracy. Proteins 2014; 82:354–363. © 2013 Wiley Periodicals, Inc.

9.
MOTIVATION: Selecting a small number of relevant genes for accurate classification of samples is essential for the development of diagnostic tests. We present the Bayesian model averaging (BMA) method for gene selection and classification of microarray data. Typical gene selection and classification procedures ignore model uncertainty and use a single set of relevant genes (a single model) to predict the class. BMA accounts for the uncertainty about the best set to choose by averaging over multiple models (sets of potentially overlapping relevant genes). RESULTS: We show that BMA selects smaller numbers of relevant genes than other methods and achieves high prediction accuracy on three microarray datasets. Our BMA algorithm is applicable to microarray datasets with any number of classes and outputs posterior probabilities for the selected genes and models. The selected models typically consist of only a few genes. The combination of high accuracy, small numbers of genes, and posterior probabilities for the predictions should make BMA a powerful tool for developing diagnostics from expression data. AVAILABILITY: The source code and datasets used are available from our Supplementary website.

10.
Background: Current methods for estimating the timeliness of cancer diagnosis are not robust because the dates of key defining milestones, for example first presentation, are uncertain. This is exacerbated when patients have other conditions (multimorbidity), particularly those that share symptoms with cancer. Methods independent of this uncertainty are needed for accurate estimates of the timeliness of cancer diagnosis and to understand how multimorbidity impacts the diagnostic process.
Methods: Participants were diagnosed with oesophagogastric cancer between 2010 and 2019. Controls were matched on year of birth, sex, general practice, and multimorbidity burden calculated using the Cambridge Multimorbidity Score. Primary care data (Clinical Practice Research Datalink) were used to explore population-level consultation rates for up to two years before diagnosis across different multimorbidity burdens. Five approaches were compared on the timing of the consultation frequency increase (the inflection point) across different multimorbidity burdens, aggregated time periods, and sample sizes.
Results: We included 15,410 participants, of whom 13,328 (86.5%) had a measurable multimorbidity burden. Our new maximum likelihood estimation method found evidence that the inflection point in consultation frequency varied with multimorbidity burden, from 154 days (95% CI 131.8–176.2) before diagnosis for patients with no multimorbidity to 126 days (108.5–143.5) for patients with the greatest multimorbidity burden. Inflection points identified using the alternative methods were closer to diagnosis for up to three burden groups. Reducing the sample size and changing the aggregation period moved the inflection points closer to diagnosis, with the smallest change for the maximum likelihood method.
Discussion: Existing methods to identify changes in consultation rates can introduce substantial bias that depends on sample size and aggregation period. The direct maximum likelihood method was less prone to this bias than the other methods and offers a robust, population-level alternative for estimating the timeliness of cancer diagnosis.
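
A direct maximum-likelihood changepoint fit of the kind described can be sketched as follows, assuming Poisson daily consultation counts that are flat before an inflection point and rise linearly afterwards; the cohort size, rates, and dates are fabricated for illustration.

```python
# Profile-likelihood scan for the inflection point in pre-diagnosis consultations.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

rng = np.random.default_rng(4)
days = np.arange(-730, 0)                       # days before diagnosis
cohort = 500                                    # assumed number of patients
true_rate = 0.02 + np.where(days > -150, 0.0008 * (days + 150), 0.0)
counts = rng.poisson(cohort * true_rate)        # simulated daily consultation counts

def nll(theta, tau):
    base, slope = np.exp(theta)                 # enforce positive rates
    lam = cohort * (base + np.where(days > tau, slope * (days - tau), 0.0))
    return -poisson.logpmf(counts, lam).sum()

taus = np.arange(-400, -30, 5)
profile = [minimize(nll, x0=np.log([0.02, 1e-3]), args=(t,),
                    method="Nelder-Mead").fun for t in taus]
tau_hat = taus[int(np.argmin(profile))]
print(f"estimated inflection point ~ {abs(tau_hat)} days before diagnosis")
```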

11.
Abstract

The cumulative onset curves for smoking, drinking, and sexual intercourse have been tracked through adolescence with reasonable success by recursive equations positing an “epidemic” or contagious process. The gist of these models is that the likelihood of onset in the next time period is proportional to the prevalence of the behavior among an adolescent's peers in the current time period. The present paper extends this approach to official delinquency. The fits to the data (from the Philadelphia cohort studies) are extremely tight. Several conceptual mismatches between the theory underlying the model and the model itself are discussed.
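
In code, the recursion amounts to one line per period: the fraction of not-yet-onset adolescents times a hazard proportional to current peer prevalence. The contagion rate and seed prevalence below are assumed values.

```python
# Minimal "epidemic" onset recursion; beta and the seed prevalence are assumptions.
beta, prevalence = 0.35, 0.02                   # contagion rate, prevalence at age 11
for age in range(11, 19):
    print(f"age {age}: cumulative onset {prevalence:.3f}")
    hazard = beta * prevalence                  # onset hazard ~ peer prevalence
    prevalence += (1.0 - prevalence) * hazard   # susceptibles who start this period
```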

12.
In this paper, we propose a generalization of the mixture (binary) cure rate model, motivated by the existence of a zero-modified (inflated or deflated) distribution on the initial number of causes under a competing-cause scenario. This non-linear transformation cure rate model has the same form as models studied in the past; however, following our approach, we are able to give a realistic interpretation to a specific class of proper transformation functions for cure rate modeling. The estimation of the parameters is carried out using the maximum likelihood method along with a profile approach. A simulation study examines the accuracy of the proposed estimation method and model discrimination based on the likelihood ratio test. For illustrative purposes, analyses of two real-life datasets, one on recidivism and another on cutaneous melanoma, are also carried out.
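
For context, here is a minimal sketch of the classical mixture cure model that the paper generalizes: with cure fraction π and an exponential latency, the likelihood combines densities for observed events with population survival for censored cases. The data are simulated and the exponential latency is an assumption.

```python
# Maximum-likelihood fit of a mixture cure model to right-censored data.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(5)
n, pi_true, lam_true = 500, 0.3, 0.2
cured = rng.random(n) < pi_true                     # long-term survivors
t_event = rng.exponential(1 / lam_true, n)          # latency for the susceptible
t_cens = rng.uniform(0, 15, n)                      # administrative censoring
time = np.where(cured, t_cens, np.minimum(t_event, t_cens))
event = (~cured) & (t_event <= t_cens)

def nll(theta):
    pi, lam = expit(theta[0]), np.exp(theta[1])     # keep pi in (0,1), lam > 0
    dens = (1 - pi) * lam * np.exp(-lam * time)     # contribution of events
    surv = pi + (1 - pi) * np.exp(-lam * time)      # contribution of censored
    return -(np.log(dens[event]).sum() + np.log(surv[~event]).sum())

fit = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
print(f"cure fraction ~ {expit(fit.x[0]):.2f}, event rate ~ {np.exp(fit.x[1]):.2f}")
```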

13.
Abstract

A comprehensive study was conducted in a semi-arid part of Yavatmal District, Maharashtra, India, combining geochemical modeling with an assessment of its health consequences. The groundwater quality assessment shows that 55% of groundwater samples have fluoride concentrations above the desirable limit. The high Na+/Ca2+ ratio (>1.0) suggests the occurrence of cation exchange, which is further supported by Schoeller's chloro-alkaline indices. The geochemical modeling reveals the existence of CaCO3 precipitation and CaF2 in the groundwater. Simulation analysis indicates the dissolution of calcite, gypsum, and albite and the precipitation of dolomite, fluorite, halite, and K-feldspar, along with cation exchange, as the main water–rock interactions influencing the groundwater chemistry. This is further supported by the pollution index of groundwater (PIG): about 18% of the samples fall in the very high pollution zone, 3% in the high pollution zone, 8% in the moderate pollution zone, and 24% in the low pollution zone, while the remaining 47% show insignificant pollution. Of the subjects studied, 28% have skeletal fluorosis varying from mild to severe. Across the different pollution zones, the proportion of persons affected by dental fluorosis varies from 15% to 41%. Proper monitoring and treatment of high-fluoride water are required before its use for drinking and cooking.
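
The PIG computation referenced above typically takes the form sketched below: each parameter's relative weight multiplies its measured concentration divided by the drinking-water standard, and the sum is classified into pollution zones. Weights, standards, the sample values, and the zone cut-offs follow the commonly used scheme and are assumptions here, not the paper's data.

```python
# Pollution Index of Groundwater (PIG) for one well, with assumed inputs.
params   = ["TDS", "F", "NO3", "Cl", "SO4"]
weight   = {"TDS": 5, "F": 5, "NO3": 5, "Cl": 3, "SO4": 4}            # 1-5 scale
standard = {"TDS": 500, "F": 1.0, "NO3": 45, "Cl": 250, "SO4": 200}   # mg/L limits
sample   = {"TDS": 820, "F": 1.9, "NO3": 38, "Cl": 190, "SO4": 160}   # measured mg/L

wsum = sum(weight.values())
pig = sum((weight[p] / wsum) * (sample[p] / standard[p]) for p in params)
zone = ("insignificant" if pig < 1.0 else "low" if pig < 1.5 else
        "moderate" if pig < 2.0 else "high" if pig < 2.5 else "very high")
print(f"PIG = {pig:.2f} -> {zone} pollution")
```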

14.
Abstract

The effectiveness of the traditional risk analysis approach is enhanced by integrating fuzzy logic with multi-criteria decision-making (MCDM) methods. Human decisions are ambiguous and blurred and are poorly expressed with absolute numerical values, so it is more realistic to use verbal variables when modeling them. In this paper, a new fuzzy-based hazard evaluation approach is proposed for the risk assessment process. The proposed methodology combines MCDM with a fuzzy system in a hybrid structure consisting of the Pythagorean Fuzzy Analytic Hierarchy Process (PFAHP) with cosine similarity and the Neutrosophic Fuzzy Analytic Hierarchy Process (NFAHP), to handle uncertainty in the risk assessment of asphalt production, laying, and coating services, which are important to examine in terms of occupational health and safety. To the best of our knowledge, this study is the first to address uncertainty in hazard evaluation and risk assessment for asphalt production, laying, and coating services. As an outcome of the analysis, the PFAHP and NFAHP methodologies identify the criteria "manometer size" and "calibration", respectively, as the most critical factors.
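
Underlying both fuzzy variants is the classical AHP step of deriving criterion weights from a pairwise-comparison matrix; the crisp sketch below shows that step (principal eigenvector plus a consistency check). The matrix entries and the third criterion are illustrative assumptions; the Pythagorean and neutrosophic extensions replace the crisp entries with fuzzy numbers.

```python
# Crisp AHP priority vector and consistency ratio from a pairwise matrix.
import numpy as np

criteria = ["manometer size", "calibration", "temperature"]   # illustrative
A = np.array([[1.0, 1 / 2, 3.0],
              [2.0, 1.0, 4.0],
              [1 / 3, 1 / 4, 1.0]])        # assumed judgments, Saaty 1-9 scale

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                               # criterion weights (priority vector)

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)       # consistency index
cr = ci / 0.58                             # random index for n = 3 matrices
for name, wi in zip(criteria, w):
    print(f"{name}: weight {wi:.3f}")
print(f"consistency ratio {cr:.3f} (conventionally acceptable below 0.10)")
```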

15.
We recently introduced likelihood-based methods for fitting stochastic integrate-and-fire models to spike train data. The key component of this method involves the likelihood that the model will emit a spike at a given time t. Computing this likelihood is equivalent to computing a Markov first passage time density (the probability that the model voltage crosses threshold for the first time at time t). Here we detail an improved method for computing this likelihood, based on solving a certain integral equation. This integral equation method has several advantages over the techniques discussed in our previous work: in particular, the new method has fewer free parameters and is easily differentiable (for gradient computations). The new method is also easily adaptable to the case in which the model conductance, not just the input current, is time-varying. Finally, we describe how to incorporate large deviations approximations for very small likelihoods.
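
A sketch of the integral-equation idea for the simplest case (Brownian motion with drift crossing a fixed threshold, rather than the paper's integrate-and-fire dynamics): discretizing P(X_t > b) = ∫₀ᵗ f(s) P(X_t > b | X_s = b) ds yields a lower-triangular linear system for the first-passage density f, which can be checked against the known inverse-Gaussian density.

```python
# First-passage density via a discretized Fortet-type integral equation.
import numpy as np
from scipy.stats import norm

mu, sigma, b, x0 = 1.0, 1.0, 1.0, 0.0      # drift, noise, threshold, start
dt, T = 0.01, 3.0
t = np.arange(dt, T + dt, dt)

def above(tt, start):                      # P(X_tt > b | X_0 = start)
    return norm.sf((b - start - mu * tt) / (sigma * np.sqrt(tt)))

rhs = above(t, x0)
K = np.zeros((t.size, t.size))
for i in range(t.size):
    K[i, :i] = above(t[i] - t[:i], b) * dt  # crossings strictly before t[i]
    K[i, i] = 0.5 * dt                      # kernel limit at zero lag is 1/2
f = np.linalg.solve(K, rhs)                 # discretized first-passage density

# cross-check against the known inverse-Gaussian first-passage density
ig = (b - x0) / (sigma * np.sqrt(2 * np.pi * t**3)) * np.exp(
    -((b - x0 - mu * t) ** 2) / (2 * sigma**2 * t))
print(f"max deviation from analytic density: {np.abs(f - ig).max():.3f}")
```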

16.
17.
Abstract

Personalized bio-fixed implants require good modeling efficiency, matching, and stress distribution for optimal function. We performed three-dimensional reconstruction of the tibial implant using reverse and forward methods, performed finite element analysis, and then used the optimized model structure and the optimal node arrangement of the finite element method to design a biomechanical tibial implant. Next, we used selective laser melting (SLM) equipment for direct manufacturing and determined the mechanical properties of the completed implant unit structure. The results indicated that the finite element method allows good modeling, that the strain performance is equivalent to that of material produced using the traditional modeling method, and that the resulting product has a more even stress distribution. The porous structure formed by SLM showed a good forming effect for struts within 3 mm, with little powder adhesion on the surface and no obvious dross, suggesting the utility of this method for preparing personalized bio-fixed implants.

18.
Summary: Model-based estimation of the effect of an exposure on an outcome is generally sensitive to the choice of which confounding factors are included in the model. We propose a new approach, which we call Bayesian adjustment for confounding (BAC), to estimate the effect of an exposure of interest on the outcome while accounting for the uncertainty in the choice of confounders. Our approach is based on specifying two models: (1) the outcome as a function of the exposure and the potential confounders (the outcome model); and (2) the exposure as a function of the potential confounders (the exposure model). We consider Bayesian variable selection on both models and link the two by introducing a dependence parameter, ω, denoting the prior odds of including a predictor in the outcome model given that the same predictor is in the exposure model. In the absence of dependence (ω = 1), BAC reduces to traditional Bayesian model averaging (BMA). In simulation studies, we show that BAC with ω = ∞ estimates the exposure effect with smaller bias than traditional BMA, and with improved coverage. We then compare BAC, a recent approach of Crainiceanu, Dominici, and Parmigiani (2008, Biometrika 95, 635–651), and traditional BMA in a time series dataset of hospital admissions, air pollution levels, and weather variables in Nassau, NY, for the period 1999–2005. Using each approach, we estimate the short-term effects of PM2.5 on emergency admissions for cardiovascular diseases, accounting for confounding. This application illustrates the potentially significant pitfalls of misusing variable selection methods in the context of adjustment uncertainty.
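
A deliberately simplified sketch of the BAC idea follows (not the authors' MCMC algorithm): predictors selected by a crude exposure-model screen receive prior odds ω of entering the outcome model, BIC stands in for the marginal likelihood, and the exposure effect is averaged over all outcome models. The data and the screening rule are assumptions.

```python
# BAC-flavored model averaging vs. plain BMA on simulated confounded data.
import numpy as np
from itertools import product

rng = np.random.default_rng(6)
n, p = 500, 4
U = rng.normal(size=(n, p))                         # candidate confounders
x = 0.8 * U[:, 0] + rng.normal(size=n)              # exposure driven by U1
y = 1.0 * x + 0.15 * U[:, 0] + rng.normal(size=n)   # U1 weakly hits the outcome

def bic(X, target):
    beta = np.linalg.lstsq(X, target, rcond=None)[0]
    rss = np.sum((target - X @ beta) ** 2)
    return n * np.log(rss / n) + X.shape[1] * np.log(n), beta

# crude stand-in for exposure-model selection: per-predictor BIC screen
null_bic = bic(np.ones((n, 1)), x)[0]
in_exp = [bic(np.column_stack([np.ones(n), U[:, j]]), x)[0] < null_bic
          for j in range(p)]

def averaged_effect(omega):
    logw, eff = [], []
    for alpha in product([0, 1], repeat=p):          # all outcome models
        X = np.column_stack([np.ones(n), x] + [U[:, j] for j in range(p) if alpha[j]])
        b, beta = bic(X, y)
        lp = sum(np.log(omega) for j, a in enumerate(alpha) if a and in_exp[j])
        logw.append(lp - 0.5 * b)                    # prior odds x BIC weight
        eff.append(beta[1])                          # exposure coefficient
    w = np.exp(np.array(logw) - max(logw))
    return float(w @ np.array(eff) / w.sum())

print(f"BMA (omega=1): {averaged_effect(1.0):.3f}; "
      f"BAC-like (omega=50): {averaged_effect(50.0):.3f}; truth 1.0")
```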

19.
Purpose: Radiochromic films change color upon irradiation due to polymerization of the sensitive component embedded within the sensitive layer. However, agents other than the monitored radiation (temperature, humidity, UV light) can also change the color of the sensitive layer; this can be considered a background signal and removed from the actual measurement by using a control film piece. In this work, we investigate the impact of using control film pieces on both the accuracy and the uncertainty of doses measured using a radiochromic film based reference dosimetry protocol.
Methods: We irradiated "control" film pieces (EBT3 GafChromic™ film model) to known doses in the range 0.05–1 Gy, and five film pieces of the same size to 2, 5, 10, 15, and 20 Gy, treated as "unknown" doses. Depending on the dose range, two approaches to incorporating the control film piece were investigated: a signal-corrected and a dose-corrected method.
Results: For dose values greater than 10 Gy, the dose-corrected approach increased accuracy by 3% at the cost of a 5% loss in uncertainty. At lower doses, with signals of the order of 5%, the signal-corrected approach increased accuracy by 10% with an uncertainty loss below 1%.
Conclusions: Incorporating the signal registered by the control film piece into the dose measurement analysis should be a judgment call of the user, based on a tradeoff between the accuracy deemed necessary and the uncertainty acceptable for a given dose measurement.
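
The two correction strategies can be sketched with a generic netOD-to-dose calibration of the common form D = a·netOD + b·netOD^c; the coefficients and pixel values below are assumptions, not the authors' calibration.

```python
# Signal-corrected vs. dose-corrected use of a control film piece.
import numpy as np

a, b, c = 8.0, 30.0, 2.5                    # assumed calibration coefficients (Gy)

def dose(net_od):
    return a * net_od + b * net_od ** c

pv_before, pv_after = 40000.0, 26000.0               # scanned pixel values, measured film
pv_ctrl_before, pv_ctrl_after = 40000.0, 39200.0     # control film piece

net_od = np.log10(pv_before / pv_after)              # net optical density
net_od_ctrl = np.log10(pv_ctrl_before / pv_ctrl_after)

# (a) signal-corrected: remove the background signal, then convert to dose
d_signal = dose(net_od - net_od_ctrl)
# (b) dose-corrected: convert both to dose, then subtract the control "dose"
d_dose = dose(net_od) - dose(net_od_ctrl)
print(f"signal-corrected {d_signal:.3f} Gy vs dose-corrected {d_dose:.3f} Gy")
```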

20.
Abstract

Groundwater quality is defined by various water quality parameters. The aims of this research are to understand the relationships among different groundwater quality parameters and to trace the sources and controlling factors of groundwater pollution via statistical and multivariate statistical techniques. Thirty-six shallow groundwater samples collected from shallow pumping wells in Yan'an City were analyzed for various water quality parameters. Correlation analysis, principal component analysis (PCA), hierarchical cluster analysis (HCA), and multivariable linear regression (MLR) were used jointly to explore the sources and controlling factors of groundwater pollution. The study reveals that mineral dissolution/precipitation and anthropogenic input are the main sources of the physicochemical indices and trace elements in the groundwater. Groundwater chemistry is predominantly regulated by natural processes, such as dissolution of carbonates, silicates, and evaporites and soil leaching, with human activities as the second factor. Climatic factors and land use types are also important influences on groundwater chemistry. Cl is the greatest contributor to the overall groundwater quality according to the two regression models. The first model, which has eight dependent variables, is high in reliability and stability and is recommended for overall groundwater quality prediction. The study is helpful for understanding groundwater quality variation in urban areas.
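
A sketch of the joint PCA-plus-regression workflow on a water-quality matrix follows; the data are random placeholders and the quality index is a stand-in, so only the mechanics (standardize, extract components, inspect loadings, regress an overall index on the parameters) carry over.

```python
# PCA loadings and multivariable linear regression on a water-quality table.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
cols = ["pH", "TDS", "Cl", "SO4", "NO3", "F", "Ca", "Na"]
df = pd.DataFrame(rng.normal(size=(36, len(cols))), columns=cols)  # 36 wells

Z = StandardScaler().fit_transform(df)            # standardize before PCA
pca = PCA(n_components=3).fit(Z)
loadings = pd.DataFrame(pca.components_.T, index=cols,
                        columns=["PC1", "PC2", "PC3"])
print(loadings.round(2))                          # co-loading ions hint at a source
print("explained variance:", pca.explained_variance_ratio_.round(2))

weights = rng.uniform(0.5, 1.5, len(cols))
quality = df.to_numpy() @ weights                 # stand-in overall quality index
mlr = LinearRegression().fit(df, quality)
print(dict(zip(cols, mlr.coef_.round(2))))        # compare coefficient magnitudes
```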
