Similar Documents (20 results)
1.
Statistical methods allow the effects of uncertainty to be incorporated into finite element models. This has potential benefits for the analysis of biological systems, where natural variability can give rise to substantial uncertainty in both material and geometrical properties. In this study, a simple model of the intervertebral disc under compression was created and analysed as both a deterministic and a stochastic system. Factorial analysis was used to determine the important parameters to be included in the stochastic analysis. The predictions from the model were compared to experimental results from 21 sheep discs. The size and shape of the distribution of axial deformations predicted by the model were consistent with the experimental results, given that the number of model solutions far exceeded the number of experimental results. Stochastic models could be valuable in determining the range and most likely value of stress in a tissue or implant.
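The core idea here, propagating natural variability through an otherwise deterministic model with Monte Carlo sampling, can be sketched in a few lines. The deformation formula and every parameter distribution below are illustrative assumptions for a generic linear-elastic member, not the paper's actual disc model:

```python
import random
import statistics

def axial_deformation(force_n, length_mm, modulus_mpa, area_mm2):
    """Deterministic linear-elastic deformation: delta = F*L / (E*A)."""
    return force_n * length_mm / (modulus_mpa * area_mm2)

def stochastic_deformations(n_samples, seed=0):
    """Sample modulus and cross-sectional area from assumed normal
    distributions and push each draw through the deterministic model."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        modulus = rng.gauss(8.0, 1.0)    # MPa; hypothetical mean/sd
        area = rng.gauss(600.0, 50.0)    # mm^2; hypothetical mean/sd
        samples.append(axial_deformation(500.0, 10.0, modulus, area))
    return samples

deformations = stochastic_deformations(2000)
spread = statistics.stdev(deformations)
```

The distribution of `deformations` (not a single point estimate) is what gets compared against the experimental scatter.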

2.
Purpose

Despite the wide use of LCA for environmental profiling, the approach for determining the system boundary within LCA models continues to be subjective and lacking in mathematical rigor. As a result, life cycle models are often developed in an ad hoc manner, and are difficult to compare. Significant environmental impacts may be inadvertently left out. Overcoming this shortcoming can help elicit greater confidence in life cycle models and their use for decision making.

Methods

This paper describes a framework for hybrid life cycle model generation that selects activities based on their importance, parametric uncertainty, and contribution to network complexity. The importance of activities is determined by structural path analysis, which then guides the construction of life cycle models based on uncertainty and complexity indicators. Information about uncertainty comes from the available life cycle inventory; complexity is quantified by cost or granularity. The life cycle model is developed hierarchically by adding the most important activities until error requirements are satisfied or network complexity exceeds user-specified constraints.

Results and Discussion

The framework is applied to an illustrative example for building a hybrid LCA model. Since this is a constructed example, the results can be compared with the actual impact, to validate the approach. This application demonstrates how the algorithm sequentially develops a life cycle model of acceptable uncertainty and network complexity. Challenges in applying this framework to practical problems are discussed.

Conclusion

The presented algorithm designs system boundaries between scales of hybrid LCA models, includes or omits activities from the system based on path analysis of environmental impact contribution at upstream network nodes, and provides model quality indicators that permit comparison between different LCA models.
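The tier-by-tier logic behind structural path analysis can be sketched with a toy technology matrix: total impact is f·(I−A)⁻¹·d, and expanding the inverse as a power series gives the contribution of each upstream tier. The three-activity system, impact intensities, and coefficients below are entirely invented for illustration; a real application would draw them from a life cycle inventory database:

```python
# Toy 3-activity product system: A[i][j] = input of activity i
# required per unit output of activity j (all values invented).
A = [[0.0, 0.2, 0.1],
     [0.1, 0.0, 0.3],
     [0.0, 0.1, 0.0]]
f = [1.0, 5.0, 2.0]        # direct impact per unit of each activity
demand = [1.0, 0.0, 0.0]   # functional unit: one unit of activity 0

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def tier_contributions(n_tiers):
    """Impact added by each upstream tier k: f . A^k . demand.
    Summing tiers until they drop below an error requirement mirrors
    the hierarchical boundary-selection idea."""
    tiers, v = [], demand
    for _ in range(n_tiers):
        tiers.append(sum(fi * vi for fi, vi in zip(f, v)))
        v = mat_vec(A, v)
    return tiers

tiers = tier_contributions(20)
total = sum(tiers)
```

Truncating the series once a tier's contribution falls below a tolerance is one way to place the system boundary quantitatively rather than ad hoc.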


3.
Purpose

Objective uncertainty quantification (UQ) of a product life-cycle assessment (LCA) is a critical step for decision-making. Environmental impacts can be measured directly or estimated with models. Underlying mathematical functions describe a model that approximates the environmental impacts during various LCA stages. In this study, three possible uncertainty sources of a mathematical model were investigated: input variability, model parameter uncertainty (distinguished from input uncertainty in this study), and model-form uncertainty. A simple, easy-to-implement method is proposed to quantify each source.

Methods

Various data analytics methods were used to conduct a thorough model uncertainty analysis: (1) Interval analysis was used for input uncertainty quantification. Direct sampling using Monte Carlo (MC) simulation was used for interval analysis, and the results were compared to those of indirect nonlinear optimization as an alternative approach. A machine learning surrogate model was developed to perform both the direct MC sampling and the indirect nonlinear optimization. (2) Bayesian inference was adopted to quantify parameter uncertainty. (3) A recently introduced model correction method based on orthogonal polynomial basis functions was used to evaluate the model-form uncertainty. The methods were applied to a pavement LCA to propagate uncertainties through an energy and global warming potential (GWP) estimation model; a pavement section in the Chicago metropolitan area was used as the case study.

Results and discussion

Results indicate that each uncertainty source contributes to the overall energy and GWP output of the LCA. Input uncertainty was shown to have a significant impact on the overall GWP output; for the example case study, the GWP interval was around 50%. Parameter uncertainty results showed that an assumed ±10% uniform variation in the model parameter priors resulted in 28% variation in the GWP output. Model-form uncertainty had the lowest impact (less than 10% variation in the GWP), because the original energy model is relatively accurate in estimating the energy. However, a sensitivity study of the model-form uncertainty showed that variation of up to 180% in the results can arise when the original model is less accurate.

Conclusions

Investigating each uncertainty source of the model showed the importance of accurate characterization, propagation, and quantification of uncertainty. This study proposed independent and relatively easy-to-implement methods that provide robust grounds for objective model uncertainty analysis in LCA applications. Assumptions on inputs, parameter distributions, and model form need to be justified. Input uncertainty plays a key role in the overall pavement LCA output. The proposed model correction method, as well as interval analysis, was relatively easy to implement. Research is still needed to develop a more generic and simplified MCMC simulation procedure that is fast to implement.
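Direct-sampling interval analysis of the kind described above can be sketched as follows. The GWP model and the input ranges are hypothetical stand-ins, not the paper's pavement model:

```python
import random

def gwp_model(thickness_m, emission_factor):
    """Hypothetical GWP estimate: material quantity times emission factor."""
    return 1000.0 * thickness_m * emission_factor

def mc_interval(n, seed=1):
    """Direct Monte Carlo interval analysis: sample inputs uniformly
    from their ranges and record the min/max model output."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        t = rng.uniform(0.20, 0.30)    # pavement thickness, m (assumed range)
        ef = rng.uniform(0.8, 1.2)     # kg CO2-eq per unit quantity (assumed)
        outputs.append(gwp_model(t, ef))
    return min(outputs), max(outputs)

lo, hi = mc_interval(20000)
```

For a monotone model like this one, the indirect alternative (nonlinear optimization over the input box) would recover the exact corner values 160 and 360; the sampled interval approaches them from inside.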


4.
5.
ABSTRACT

Delay in viral production may have a significant impact on the early stages of infection. During the eclipse phase, the time from viral entry until active production of viral particles, no viruses are produced. This delay affects the probability that a viral infection becomes established and the timing of the peak viral load. Deterministic and stochastic models are formulated with either multiple latent stages or a fixed delay for the eclipse phase. The deterministic model with multiple latent stages approaches the model with a fixed delay in the limit as the number of stages approaches infinity. The deterministic model framework is used to formulate continuous-time Markov chain and stochastic differential equation models. The probability of a minor infection with rapid viral clearance, as opposed to a major full-blown infection with a high viral load, is estimated from a branching process approximation of the Markov chain model, and the results are confirmed through numerical simulations. In addition, parameter values for influenza A are used to numerically estimate the time to peak viral infection and the peak viral load for the deterministic and stochastic models. Although the average length of the eclipse phase is the same in each of the models, the numerical results show that as the number of latent stages increases, the time to viral peak and the peak viral load increase.
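The branching-process estimate of a minor versus major infection can be illustrated with a deliberately simple offspring distribution. Assuming Poisson-distributed secondary infections with mean r0 (an illustrative choice, not the paper's influenza parameterization), the extinction probability q is the smallest root of q = exp(r0(q−1)), found by fixed-point iteration:

```python
import math

def extinction_prob(r0, tol=1e-12):
    """Smallest fixed point of q = exp(r0*(q-1)) in [0, 1]: the
    extinction probability of a branching process with Poisson(r0)
    offspring, iterated from q = 0."""
    q = 0.0
    for _ in range(10000):
        q_new = math.exp(r0 * (q - 1.0))
        if abs(q_new - q) < tol:
            return q_new
        q = q_new
    return q

def prob_major_infection(r0, v0):
    """Probability that an infection seeded by v0 independent lineages
    takes off (a 'major' infection) rather than going extinct."""
    return 1.0 - extinction_prob(r0) ** v0

p1 = prob_major_infection(2.0, 1)
```

For a subcritical process (r0 < 1) the iteration converges to q = 1, so a major infection is essentially impossible, matching the intuition that establishment requires supercritical spread.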

6.

Mechanistic models are a powerful tool to gain insights into biological processes. The parameters of such models, e.g. kinetic rate constants, usually cannot be measured directly but need to be inferred from experimental data. In this article, we study dynamical models of the translation kinetics after mRNA transfection and analyze their parameter identifiability. That is, whether parameters can be uniquely determined from perfect or realistic data in theory and practice. Previous studies have considered ordinary differential equation (ODE) models of the process, and here we formulate a stochastic differential equation (SDE) model. For both model types, we consider structural identifiability based on the model equations and practical identifiability based on simulated as well as experimental data and find that the SDE model provides better parameter identifiability than the ODE model. Moreover, our analysis shows that even for those parameters of the ODE model that are considered to be identifiable, the obtained estimates are sometimes unreliable. Overall, our study clearly demonstrates the relevance of considering different modeling approaches and that stochastic models can provide more reliable and informative results.


7.
We present a novel application of a stochastic ecological model to the study and analysis of microbial growth dynamics as influenced by environmental conditions in an extensive experimental data set. The model proved to be useful in bridging the gap between theoretical ideas in ecology and an applied problem in microbiology. The data consisted of recorded growth curves of Escherichia coli grown in triplicate in a base medium with all 32 possible combinations of five supplements: glucose, NH4Cl, HCl, EDTA, and NaCl. The potential complexity of 2^5 = 32 experimental treatments and their effects was reduced to a 2^2 design, as the metal chelator EDTA, the presumed osmotic pressure imposed by NaCl, and the interaction between these two factors were enough to explain the variability seen in the data. The statistical analysis showed that the positive and negative effects of the five chemical supplements and their combinations were directly translated into an increase or decrease in the time required to attain stationary phase and the population size at which the stationary phase started. The stochastic ecological model proved to be useful, as it effectively explained and summarized the uncertainty seen in the recorded growth curves. Our findings have broad implications for both basic and applied research and illustrate how stochastic mathematical modeling coupled with rigorous statistical methods can be of great assistance in understanding basic processes in microbial ecology.
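The full-factorial design and its reduction can be enumerated directly. This is a generic sketch of the 2^5 treatment structure, not the authors' statistical analysis:

```python
from itertools import product

supplements = ["glucose", "NH4Cl", "HCl", "EDTA", "NaCl"]

# All 2^5 = 32 presence/absence combinations of the five supplements.
treatments = [
    dict(zip(supplements, combo))
    for combo in product([False, True], repeat=len(supplements))
]

# The reduced 2^2 view keeps only the two factors found to matter,
# EDTA and NaCl (their interaction is the fourth degree of freedom).
reduced = {(t["EDTA"], t["NaCl"]) for t in treatments}
```

Collapsing 32 treatments onto 4 cells is what makes the claim "two factors and their interaction explain the variability" testable with standard factorial ANOVA machinery.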

8.
Background

Mathematical modeling of biological processes is widely used to enhance quantitative understanding of biomedical phenomena. This quantitative knowledge can be applied in both clinical and experimental settings. Recently, many investigators began studying mathematical models of tumor response to radiation therapy. We developed a simple mathematical model to simulate the growth of tumor volume and its response to a single fraction of high-dose irradiation. The modeling study may provide clinicians with important insights into radiation therapy strategies through identification of biological factors that significantly influence treatment effectiveness.

Methods

We made several key assumptions in the model. Tumor volume is composed of proliferating (or dividing) cancer cells and non-dividing (or dead) cells. The tumor growth rate (or tumor volume doubling time) is proportional to the ratio of the volume of the tumor vasculature to that of the tumor. The vascular volume grows more slowly than the tumor, which is captured by a vascular growth retardation factor, θ. Upon irradiation, the proliferating cells gradually die over a fixed time period. Dead cells are cleared away over a characteristic cell clearance time. The model was applied to simulate the pre-treatment growth and post-treatment radiation response of rat rhabdomyosarcoma tumors and of metastatic brain tumors in five patients who were treated with Gamma Knife stereotactic radiosurgery (GKSRS).

Results

By selecting appropriate model parameters, we showed that the temporal variation of the tumors in both the rat experiment and the clinical GKSRS cases could be easily replicated by the simple model. Additionally, the application of our model to the GKSRS cases showed that the α-value, an indicator of radiation sensitivity in the LQ model, and the value of θ could be predictors of the post-treatment volume change.

Conclusions

The proposed model was successful in representing both the animal experimental data and the clinically observed tumor volume changes. We showed that the model can be used to find potential biological parameters that may be able to predict the treatment outcome. However, there is large statistical uncertainty in the result due to the small sample size. Therefore, a future clinical study with a larger number of patients is needed to confirm these findings.


9.
Positive autoregulation in gene regulation networks has been shown in the past to exhibit stochastic behavior, including stochastic bistability, in which an initially uniform cell population develops into two distinct subpopulations. However, positive autoregulation is often mediated by signal molecules, which have not been considered in prior stochastic analysis of these networks. Here we propose both a full model of such a network that includes a signal molecule, and a simplified model in which the signal molecules have been eliminated through the use of two simplifications. The simplified model is amenable to direct mathematical analysis, which shows that stochastic bistability is possible. We use stochastic Petri nets to simulate both types of models. The simulation results show that (1) the stochastic behavior of the two models is similar, and (2) the analytical steady-state distribution of the simplified model matches well the transient results at times equal to that of a cell generation. A discussion of the simplifications we used in the context of the results indicates the importance of the signal molecule number as a factor determining the presence of bistability. This is further supported by a deterministic steady-state analysis of the full model, which is shown to be a useful indicator of potential stochastic bistability. We use the regulation of SdiA in Escherichia coli as an example, due to the importance of this protein and of the signal molecule, a bacterial autoinducer, that is involved. However, the use of kinetic parameter values representing typical cellular activities makes the conclusions applicable to other signal-mediated positive autoregulation networks as well.
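Stochastic Petri net simulation of chemical kinetics is, in effect, a Gillespie-type stochastic simulation. As a minimal sketch of that machinery (a plain birth-death protein model with invented rates, not the SdiA network and not bistable):

```python
import random

def gillespie_birth_death(k_on, k_off, n0, t_end, seed=3):
    """Gillespie SSA for production at constant rate k_on and
    first-order degradation at rate k_off per molecule."""
    rng = random.Random(seed)
    t, n, traj = 0.0, n0, [n0]
    while t < t_end:
        birth = k_on            # propensity of production
        death = k_off * n       # propensity of degradation
        total = birth + death
        t += rng.expovariate(total)          # time to next event
        if rng.random() < birth / total:     # choose which event fires
            n += 1
        else:
            n -= 1
        traj.append(n)
    return traj

traj = gillespie_birth_death(k_on=20.0, k_off=1.0, n0=0, t_end=50.0)
```

Replacing the constant `k_on` with a copy-number-dependent (e.g. Hill-type) production rate is what turns this skeleton into an autoregulation model whose stationary distribution can become bimodal.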

10.
Predicting rice (Oryza sativa) productivity under future climates is important for global food security. Ecophysiological crop models in combination with climate model outputs are commonly used in yield prediction, but the uncertainties associated with crop models remain largely unquantified. We evaluated 13 rice models against multi-year experimental yield data at four sites with diverse climatic conditions in Asia and examined whether different modeling approaches to major physiological processes contribute to the uncertainty in predictions of field-measured yields and in the sensitivity to changes in temperature and CO2 concentration [CO2]. We also examined whether use of an ensemble of crop models can reduce the uncertainties. Individual models did not consistently reproduce both experimental and regional yields well, and uncertainty was larger at the warmest and coolest sites. The variation in yield projections was larger among crop models than the variation resulting from 16 global climate model-based scenarios. However, the mean of the predictions of all crop models reproduced the experimental data, with an uncertainty of less than 10% of measured yields. Using an ensemble of eight models calibrated only for phenology, or five models calibrated in detail, resulted in uncertainty equivalent to that of the measured yield in well-controlled agronomic field experiments. Sensitivity analysis indicates the necessity of improving the accuracy of predicting both biomass and harvest index in response to increasing [CO2] and temperature.

11.
Purpose

Product systems use the same unit process models to represent distinct but similar activities. This notably applies to activities in cyclic dependency relationships (or “feedback loops”) that are required an infinite number of times in a product system. This study aims to test the sensitivity of uncertainty results to the assumption made concerning these different instances of the same activities. The default assumption is homogeneous production, in which the same parameter values are sampled for all instances (e.g., there is one truck). The alternative assumption is that every instance is distinct, and parameter values are independently sampled for different instances of unit processes (e.g., there are infinitely many trucks). Intuitively, sampling the same values for each instance of a unit process should result in more uncertain results.

Methods

The results of uncertainty analyses carried out under either assumption are compared. To simulate models where each instance of a unit process is independent, we convert network models to acyclic LCI models (tree models). This is done three times: (1) for a very simple product system, to explain the methodology; (2) for a sample product system from the ecoinvent database, for illustrative purposes; and (3) for thousands of product systems from ecoinvent databases.

Results and discussion

The uncertainty of network models is indeed greater than that of the corresponding tree models. This is shown mathematically for the analytical approximation method for uncertainty propagation and is observed in Monte Carlo simulations with very large numbers of iterations. However, the magnitude of the difference in indicators of dispersion is, for the ecoinvent product systems, often less than a factor of 1.5. In a few extreme cases, indicators of dispersion differ by a factor of 4. Monte Carlo simulations with smaller numbers of iterations sometimes give the opposite result.

Conclusions

Given the small magnitude of the difference, we believe that breaking away from the default approach is generally not warranted. Indeed, (1) the alternative approach is not more robust, (2) the current default approach is conservative, and (3) there are more pressing challenges for the LCA community to meet. This being said, the study focused on ecoinvent, which should normally be used as a background database. The difference in dispersion between the two approaches may be important in some contexts, and calculating the uncertainty of tree models as a sensitivity analysis could be useful.
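The intuition that shared sampling is more uncertain than independent sampling comes down to Var(2X) = 4σ² versus Var(X₁+X₂) = 2σ². A minimal Monte Carlo sketch, with an invented unit process that appears twice in a product system:

```python
import random
import statistics

def simulate(n, independent, seed=42):
    """One unit process (say, a truck) appears twice in the system.
    Network model: sample its parameter once per iteration (shared).
    Tree model: sample it independently for each instance."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        if independent:
            total = rng.gauss(10.0, 2.0) + rng.gauss(10.0, 2.0)
        else:
            x = rng.gauss(10.0, 2.0)
            total = x + x
        totals.append(total)
    return totals

sd_network = statistics.stdev(simulate(50000, independent=False))
sd_tree = statistics.stdev(simulate(50000, independent=True))
```

With two perfectly correlated instances the standard deviation is 2σ = 4; with independent instances it is σ√2 ≈ 2.83, so the network (shared) model is indeed the more dispersed, conservative one.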


12.
Abstract

Many different models have been proposed to explain the complex binding behavior of insulin to its receptor, but a systematic comparison of models with experimental results has been lacking. We have used network thermodynamic computer simulations to compare models of insulin binding against the results of several experimental tests designed to differentiate between the models. Six models of insulin binding were tested (simple, diffusion-reaction, conversion, dissociation, heterogeneous site, and two-step intramembrane) against results reported in the literature for isolated rat adipocytes. Although still a matter of experimental controversy, the criteria selected for modeling were curvilinear Scatchard plots, bi- or multi-exponential dissociation, insulin-accelerated dissociation, lack of dependence of the overall dissociation constant on receptor number, and receptor reserve. Using a given set of parameter values most appropriate for each model, none was able to account for all of the observed experimental results. This indicates both the complexity of the binding reaction and the need for further model development. The approach of using computer simulations to systematically test models against experimental results affords not only insight into the critical features of a model that enable it to pass a test, but also indicates potential experiments that might differentiate between models.

13.
14.

Background

Aedes aegypti is one of the most important mosquito vectors of human disease. The development of spatial models for Ae. aegypti provides a promising start toward model-guided vector control and risk assessment, but this will only be possible if models make reliable predictions. The reliability of model predictions is affected by specific sources of uncertainty in the model.

Methodology/Principal Findings

This study quantifies uncertainties in the predicted mosquito population dynamics at the community level (a cluster of 612 houses) and at the individual-house level based on Skeeter Buster, a spatial model of Ae. aegypti, for the city of Iquitos, Peru. The study considers two types of uncertainty: (1) uncertainty in the estimates of 67 parameters that describe mosquito biology and life history, and (2) uncertainty due to environmental and demographic stochasticity. Our results show that the 95% prediction confidence intervals for pupal density and for female adult density at the community level range from 1,000 to 3,000 and from 700 to 5,000 individuals, respectively. The two parameters contributing most to the uncertainties in predicted population densities at both the individual-house and community levels are the female adult survival rate and a coefficient determining weight loss due to energy used in metabolism at the larval stage (i.e., metabolic weight loss). Compared to parametric uncertainty, stochastic uncertainty is relatively low for population density predictions at the community level (less than 5% of the overall uncertainty) but is substantially higher for predictions at the individual-house level (more than 40% of the overall uncertainty). Uncertainty in mosquito spatial dispersal has little effect on population density predictions at the community level but is important for the prediction of spatial clustering at the individual-house level.

Conclusion/Significance

This is the first systematic uncertainty analysis of a detailed Ae. aegypti population dynamics model and provides an approach for identifying those parameters for which more accurate estimates would improve model predictions.
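Splitting overall prediction uncertainty into a parametric part and a stochastic part, as above, is typically done with nested Monte Carlo and the law of total variance. The toy model and both distributions below are hypothetical, chosen only to make the decomposition visible:

```python
import random
import statistics

def model_output(param, rng):
    """Toy population prediction: a parametric mean plus demographic noise."""
    return param + rng.gauss(0.0, 1.0)

def variance_decomposition(n_param=200, n_rep=200, seed=7):
    """Outer loop: sample the uncertain parameter. Inner loop: run
    stochastic replicates. The law of total variance then gives
    Var(Y) = Var_param(E[Y|param]) + E_param(Var[Y|param])."""
    rng = random.Random(seed)
    means, variances = [], []
    for _ in range(n_param):
        param = rng.gauss(50.0, 3.0)       # parametric uncertainty (assumed)
        reps = [model_output(param, rng) for _ in range(n_rep)]
        means.append(statistics.mean(reps))
        variances.append(statistics.variance(reps))
    parametric = statistics.variance(means)   # between-parameter variance
    stochastic = statistics.mean(variances)   # average within-parameter variance
    return parametric, stochastic

parametric, stochastic = variance_decomposition()
```

Here the parametric component dominates (true values 9 vs 1), mirroring the paper's community-level finding that parameter uncertainty outweighs stochasticity.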

15.
Tumors are often heterogeneous, with tumor cells of different phenotypes having distinct properties. For scientific and clinical interests, it is of fundamental importance to understand these properties and the dynamic transitions among phenotypes, especially under radio- and/or chemotherapy. Currently there are two competing models describing tumor heterogeneity: the cancer stem cell (CSC) model and the stochastic model. To clarify the controversy, we measured the probabilities of different division types and transitions of cells via in situ immunofluorescence. Based on the experimental data, we constructed a model that combines the CSC and stochastic concepts, showing the existence of both distinctive CSC subpopulations and stochastic transitions from non-stem cancer cells (NSCCs) to CSCs. The results showed that the dynamic variations between CSCs and NSCCs can be simulated with the model. Further studies also showed that the model can be used to describe the dynamics of the two subpopulations after radiation treatment. More importantly, the analysis demonstrated that the experimentally detectable equilibrium CSC proportion can be achieved only when stochastic transitions from NSCCs to CSCs occur, indicating that tumor heterogeneity may exist in a model coordinating both the CSC and stochastic concepts. The mathematical model based on experimental parameters may contribute to a better understanding of tumor heterogeneity and provide a reference on the dynamics of the CSC subpopulation during radiotherapy.

16.
Purpose

The accuracy of biomechanical models is predicated on the realism with which they represent biological tissues. Unfortunately, most models use phenomenological ligament models that neglect the behaviour in the failure region. Therefore, the purpose of this investigation was to test whether a mechanistic model of ligamentous tissue portrays behaviour representative of actual ligament failure tests.

Model

The model tracks the time evolution of a population of collagen fibres in a theoretical ligament. Each collagen fibre is treated as an independent linear cable with constant stiffness. Model equations were derived by assuming these fibres act as a continuum and applying a conservation law akin to Huxley's muscle model. A breaking function models the rate of collagen fibre breakage at a given displacement and was chosen to be a linear function for this preliminary analysis.

Methods

The model was fitted to experimental average curves for the cervical anterior longitudinal ligament. In addition, the model was cyclically loaded to test whether the tissue model behaves similarly.

Results

The model agreed very well with experiment, with an RMS error of 14.23 N and an R2 of 0.995. Cyclic loading exhibited a reduction in force similar to experimental data.

Discussion and conclusion

The proposed model showcases behaviour reminiscent of actual ligaments being strained to failure and undergoing cyclic load. Future work could incorporate viscous effects or validate the model further by testing it in various loading conditions. Characterizing the breaking function more accurately would also lead to better results.

17.
Background

Isothermal titration calorimetry (ITC) is uniquely useful for characterizing binding thermodynamics, because it straightforwardly provides both the binding enthalpy and the binding free energy. However, the precision of the results depends on the experimental setup and on how thermodynamic results are obtained from the raw data.

Methods

Experiments and Monte Carlo analysis are used to study how uncertainties in injection heat and concentration propagate to binding enthalpies in various scenarios. We identify regimes in which it is preferable to fix the stoichiometry parameter, N, and evaluate the reliability of uncertainties provided by the least squares method.

Results

The noise in the injection heat is mainly proportional in character, with ~1% and ~3% uncertainty at 27 °C and 65 °C, respectively; concentration errors are ~1%. Simulations of experiments based on these uncertainties delineate how experimental design and curve fitting methods influence the uncertainty in the final results.

Conclusions

In most cases, experimental uncertainty is minimized by using more injections and by fixing N at its known value. With appropriate technique, the uncertainty in measured binding enthalpies can be kept below ~2% under many conditions, including low c values.

General significance

We quantify uncertainties in ITC data due to heat and concentration error and identify practices that minimize these uncertainties. The resulting guidelines are important when ITC data are used quantitatively, for example to test computer simulations of binding. Reproducibility and further study are supported by free distribution of the new software developed here.

18.
Moment closure approximations are used to provide analytic approximations to non-linear stochastic population models. They often provide insights into model behaviour and help validate simulation results. However, existing closure schemes typically fail in situations where the population distribution is highly skewed or extinctions occur. In this study we address these problems by introducing novel second- and third-order moment closure approximations, which we apply to the stochastic SI and SIS epidemic models. In the case of the SI model, which has a highly skewed distribution of infection, we develop a second-order approximation based on the beta-binomial distribution. In addition, a closure approximation based on a mixture distribution is developed in order to capture the behaviour of the stochastic SIS model around the threshold between persistence and extinction. This mixture approximation comprises a probability distribution designed to capture the quasi-equilibrium probabilities of the system and a probability mass at 0 which represents the probability of extinction. Two third-order versions of this mixture approximation are considered, in which the log-normal and the beta-binomial are used to model the quasi-equilibrium distribution. Comparison with simulation results shows that (1) the beta-binomial approximation is flexible in shape and matches the skewness predicted by simulation, as shown by the stochastic SI model, and (2) the mixture approximations are able to predict transient and extinction behaviour, as shown by the stochastic SIS model, in marked contrast with existing approaches. We also apply our mixture approximation to approximate a likelihood function and carry out point and interval parameter estimation.

19.

Background

The dynamics of biochemical networks can be modelled by systems of ordinary differential equations. However, these networks are typically large and contain many parameters. Therefore model reduction procedures, such as lumping, sensitivity analysis and time-scale separation, are used to simplify models. Although there are many different model reduction procedures, the evaluation of reduced models is difficult and depends on the parameter values of the full model. There is a lack of criteria for evaluating reduced models when the model parameters are uncertain.

Results

We developed a method to compare reduced models and select the model that results in similar dynamics and uncertainty as the original model. We simulated different parameter sets from the assumed parameter distributions. Then, we compared all reduced models across all parameter sets using cluster analysis. The clusters revealed which of the reduced models were similar to the original model in dynamics and variability. This allowed us to select the smallest reduced model that best approximated the full model. Through examples we showed that when parameter uncertainty was large, the model could be reduced further, and when parameter uncertainty was small, models should not be reduced much.

Conclusions

A method to compare different models under parameter uncertainty is developed. It can be applied to any model reduction method. We also showed that the amount of parameter uncertainty influences the choice of reduced models.

20.
Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with the model outputs. We present an integrated methodology to assess the propagation of uncertainty from both the inputs and the structure of HSI models to the model outputs (uncertainty analysis: UA) and the relative importance of uncertain model inputs and their interactions for the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. The ranking of input parameter sensitivities also varied spatially for both species, with high-quality habitat sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing a means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty. Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust decisions.
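A variance-based GSA of an HSI model can be sketched with a brute-force double-loop estimate of the Sobol first-order index, S_i = Var(E[Y|X_i]) / Var(Y). The two suitability curves, their parameter ranges, and the geometric-mean HSI below are invented for illustration and are not the Everglades SAV models:

```python
import random
import statistics

def hsi(salinity, depth):
    """Toy habitat suitability index: geometric mean of two
    hypothetical suitability curves (triangular and linear)."""
    s_sal = max(0.0, 1.0 - abs(salinity - 20.0) / 20.0)
    s_dep = max(0.0, 1.0 - depth / 3.0)
    return (s_sal * s_dep) ** 0.5

def first_order_index(which, n_outer=300, n_inner=300, seed=11):
    """Double-loop Monte Carlo: fix one input in the outer loop,
    average over the other in the inner loop, then compare the
    variance of the conditional means to the total variance."""
    rng = random.Random(seed)
    cond_means, all_y = [], []
    for _ in range(n_outer):
        fixed = rng.uniform(0.0, 40.0) if which == "salinity" else rng.uniform(0.0, 3.0)
        ys = []
        for _ in range(n_inner):
            sal = fixed if which == "salinity" else rng.uniform(0.0, 40.0)
            dep = fixed if which == "depth" else rng.uniform(0.0, 3.0)
            ys.append(hsi(sal, dep))
        cond_means.append(statistics.mean(ys))
        all_y.extend(ys)
    return statistics.variance(cond_means) / statistics.variance(all_y)

s_sal = first_order_index("salinity")
```

Production GSA codes use more efficient estimators (e.g. Saltelli sampling) than this double loop, but the quantity being estimated is the same.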
