Similar Articles
20 similar articles found.
1.
In this paper, known results on optimal intervention policies for the general stochastic epidemic model are extended to epidemic models with more general infection and removal rate functions. We consider first policies allowing for the isolation of any number of infectives from the susceptible population at any time, secondly policies allowing for the immunisation of the entire susceptible population at any time, and finally policies allowing for either of these interventions. In each case the costs of infection, isolation and immunisation are assumed to have a particular, rather simple, form. Sufficient conditions are given on the infection and removal rate functions of the model for the optimal policies to take the same simple form as in the case of the general stochastic epidemic model. More general costs are briefly discussed, and some numerical examples given. Finally, we discuss possible directions for further work. Received: 16 February 1999

2.
We present a stochastic programming framework for finding the optimal vaccination policy for controlling infectious disease epidemics under parameter uncertainty. Stochastic programming is a popular framework for including the effects of parameter uncertainty in a mathematical optimization model. The problem is initially formulated to find the minimum-cost vaccination policy under a chance-constraint. The chance-constraint requires that the probability that R(*)
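The chance-constrained idea can be prototyped with Monte Carlo. The sketch below is our own illustration, not the paper's model: it assumes vaccinating a fraction f scales an uncertain reproduction number R0 to R0(1 - f), that cost grows with coverage, and that the constraint requires sub-threshold transmission with probability at least 0.95. The distribution and all numbers are invented.

```python
import math
import random

random.seed(0)

# Hypothetical uncertain R0: lognormal with median 2 (illustrative values).
draws = [random.lognormvariate(math.log(2.0), 0.2) for _ in range(5000)]

def chance_constraint_met(f, level=0.95):
    # Fraction of parameter draws with post-vaccination R0 * (1 - f) < 1.
    hits = sum(r * (1 - f) < 1.0 for r in draws)
    return hits / len(draws) >= level

# Cost is proportional to coverage, so the cheapest feasible policy is the
# first f, scanning upward on a 1% grid, that satisfies the constraint.
coverage = next(f / 100 for f in range(101) if chance_constraint_met(f / 100))
print(f"minimum coverage meeting the 95% chance-constraint: {coverage:.2f}")
```

A real stochastic program would optimise over richer policies and solve the chance-constraint with dedicated techniques; the scan above only shows the feasibility check at the core of the formulation.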

3.
Gustafson P. Biometrics 2007, 63(1): 69-77
Yin and Ibrahim (2005a, Biometrics 61, 208-216) use a Box-Cox transformed hazard model to acknowledge uncertainty about how a linear predictor acts upon the hazard function of a failure-time response. In particular, additive and proportional hazards models arise for particular values of the transformation parameter. As is often the case, however, this added model flexibility is obtained at the cost of lessened parameter interpretability. In particular, the interpretation of the coefficients in the linear predictor is intertwined with the value of the transformation parameter. Moreover, some data sets contain very little information about this parameter. To shed light on the situation, we consider average effects based on averaging (over the joint distribution of the explanatory variables and the failure-time response) the partial derivatives of the hazard, or the log-hazard, with respect to the explanatory variables. First, we consider fitting models which do assume a particular form of covariate effects, for example, proportional hazards or additive hazards. In some such circumstances, average effects are seen to be inferential targets which are robust to the effect form being misspecified. Second, we consider average effects as targets of inference when using the transformed hazard model. We show that in addition to being more interpretable inferential targets, average effects can sometimes be estimated more efficiently than the corresponding regression coefficients.

4.
King R, Brooks SP. Biometrics 2008, 64(3): 816-824
Summary. We consider the estimation of the size of a closed population, often of interest for wild animal populations, using a capture–recapture study. The estimate of the total population size can be very sensitive to the choice of model fitted to the data. We consider a Bayesian approach, in which we consider all eight plausible models initially described by Otis et al. (1978, Wildlife Monographs 62, 1–135) within a single framework, including models containing an individual heterogeneity component. We show how we are able to obtain a model-averaged estimate of the total population, incorporating both parameter and model uncertainty. To illustrate the methodology we initially perform a simulation study and analyze two datasets where the population size is known, before considering a real example relating to a population of dolphins off northeast Scotland.

5.
We consider the problem of using time-series data to inform a corresponding deterministic model and introduce the concept of genetic algorithms (GA) as a tool for parameter estimation, providing instructions for an implementation of the method that does not require access to special toolboxes or software. We give as an example a model for cholera, a disease for which there is much mechanistic uncertainty in the literature. We use GA to find parameter sets using available time-series data from the introduction of cholera in Haiti and we discuss the value of comparing multiple parameter sets with similar performances in describing the data.
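A toolbox-free GA of the kind the authors describe might look like the following sketch. The objective is our own toy stand-in, not the paper's cholera model: recover the growth rate r and final size K of a logistic outbreak curve from synthetic data. The ranges, population size, and operator settings are all illustrative choices.

```python
import math
import random

random.seed(0)

# Synthetic "time-series data" from a logistic curve with known parameters,
# which the GA must recover (illustrative only).
TRUE_R, TRUE_K = 0.4, 1000.0

def logistic(t, r, K, c0=1.0):
    return K / (1.0 + (K / c0 - 1.0) * math.exp(-r * t))

data = [(t, logistic(t, TRUE_R, TRUE_K)) for t in range(0, 40, 2)]

def sse(params):
    # Fitness is (negative) sum of squared errors against the data.
    r, K = params
    return sum((logistic(t, r, K) - y) ** 2 for t, y in data)

def run_ga(pop_size=60, generations=120, mut_sd=(0.05, 50.0)):
    # Random initial population drawn from broad plausible ranges.
    pop = [(random.uniform(0.01, 1.0), random.uniform(100.0, 5000.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sse)
        elite = pop[: pop_size // 4]         # truncation selection + elitism
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            w = random.random()              # blend crossover
            child = [w * x + (1.0 - w) * y for x, y in zip(a, b)]
            # Gaussian mutation, clipped to keep parameters positive.
            child = tuple(max(1e-6, g + random.gauss(0.0, sd))
                          for g, sd in zip(child, mut_sd))
            children.append(child)
        pop = elite + children
    return min(pop, key=sse)

best = run_ga()
print("recovered (r, K):", best, " SSE:", sse(best))
```

Rerunning with different seeds yields the multiple near-equivalent parameter sets the abstract discusses; comparing them is a useful informal check on identifiability.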

6.

Background

The dynamics of biochemical networks can be modelled by systems of ordinary differential equations. However, these networks are typically large and contain many parameters. Therefore model reduction procedures, such as lumping, sensitivity analysis and time-scale separation, are used to simplify models. Although there are many different model reduction procedures, the evaluation of reduced models is difficult and depends on the parameter values of the full model. There is a lack of criteria for evaluating reduced models when the model parameters are uncertain.

Results

We developed a method to compare reduced models and select the model that results in similar dynamics and uncertainty as the original model. We simulated different parameter sets from the assumed parameter distributions. Then, we compared all reduced models for all parameter sets using cluster analysis. The clusters revealed which of the reduced models were similar to the original model in dynamics and variability. This allowed us to select the smallest reduced model that best approximated the full model. Through examples we showed that when parameter uncertainty was large, the model should be reduced further, and when parameter uncertainty was small, models should not be reduced much.

Conclusions

A method to compare different models under parameter uncertainty is developed. It can be applied to any model reduction method. We also showed that the amount of parameter uncertainty influences the choice of reduced models.
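One concrete way to implement the comparison step is sketched below, on a classical example of our own choosing: full mass-action enzyme kinetics versus its Michaelis-Menten quasi-steady-state reduction. Parameter ranges are invented, and where the paper clusters trajectories across models, this sketch shortcuts to a worst-case deviation score per sampled parameter set.

```python
import random

random.seed(5)

def full_model(k1, km1, k2, e0=1.0, s0=10.0, dt=1e-3, t_end=20.0):
    # Mass-action E + S <-> ES -> E + P, forward-Euler integration.
    s, es, p = s0, 0.0, 0.0
    out = []
    for n in range(int(t_end / dt)):
        e = e0 - es
        v_bind = k1 * e * s - km1 * es
        s += -v_bind * dt
        es += (v_bind - k2 * es) * dt
        p += k2 * es * dt
        if n % 1000 == 0:
            out.append(p)            # sample the product trajectory
    return out

def reduced_model(k1, km1, k2, e0=1.0, s0=10.0, dt=1e-3, t_end=20.0):
    # Michaelis-Menten quasi-steady-state reduction of the same system.
    km = (km1 + k2) / k1
    s, p = s0, 0.0
    out = []
    for n in range(int(t_end / dt)):
        v = k2 * e0 * s / (km + s)
        s -= v * dt
        p += v * dt
        if n % 1000 == 0:
            out.append(p)
    return out

def distance(a, b):
    # Worst-case deviation between sampled trajectories.
    return max(abs(x - y) for x, y in zip(a, b))

# Draw parameter sets from assumed (here: uniform) uncertainty distributions
# and score the reduced model against the full one for each draw.
dists = []
for _ in range(10):
    k1 = random.uniform(5.0, 15.0)
    km1 = random.uniform(5.0, 15.0)
    k2 = random.uniform(0.5, 1.5)
    dists.append(distance(full_model(k1, km1, k2), reduced_model(k1, km1, k2)))
print("worst deviation across sampled parameter sets:", max(dists))
```

Widening the sampling ranges plays the role of "large parameter uncertainty" in the abstract: if the worst-case deviation stays small across draws, the reduction remains acceptable under that uncertainty.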

7.
8.
Most biological models of intermediate size, and probably all large models, need to cope with the fact that many of their parameter values are unknown. In addition, it may not be possible to identify these values unambiguously on the basis of experimental data. This raises the question of how reliable predictions made using such models are. Sensitivity analysis is commonly used to measure the impact of each model parameter on its variables. However, the results of such analyses can be dependent on an exact set of parameter values due to nonlinearity. To mitigate this problem, global sensitivity analysis techniques are used to calculate parameter sensitivities in a wider parameter space. We applied global sensitivity analysis to a selection of five signalling and metabolic models, several of which incorporate experimentally well-determined parameters. Assuming these models represent physiological reality, we explored how the results could change under increasing amounts of parameter uncertainty. Our results show that parameter sensitivities calculated with the physiological parameter values are not necessarily the most frequently observed under random sampling, even in a small interval around the physiological values. Often multimodal distributions were observed. Unsurprisingly, the range of possible sensitivity coefficient values increased with the level of parameter uncertainty, though the amount of parameter uncertainty at which the pattern of control was able to change differed among the models analysed. We suggest that this level of uncertainty can be used as a global measure of model robustness. Finally, a comparison of different global sensitivity analysis techniques shows that, if high-throughput computing resources are available, then random sampling may actually be the most suitable technique.
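The core loop of such a sampling-based global analysis is simple to sketch. The model below is a toy one-reaction flux of our own choosing, not one of the five models in the study: sample parameters from widening intervals around "physiological" reference values and record the scaled sensitivity coefficient each time.

```python
import random

random.seed(1)

# Toy model: Michaelis-Menten flux v(S) = Vmax * S / (Km + S) at fixed S.
S = 2.0  # substrate concentration, arbitrary units

def flux(vmax, km):
    return vmax * S / (km + S)

def scaled_sensitivity(vmax, km, h=1e-6):
    # Normalised coefficient (Km / v) * dv/dKm, via central difference.
    v = flux(vmax, km)
    dv = (flux(vmax, km + h) - flux(vmax, km - h)) / (2 * h)
    return km * dv / v

# "Physiological" reference values (invented), with uncertainty around them.
VMAX0, KM0 = 1.0, 0.5

def sample_coefficients(rel_uncertainty, n=2000):
    coeffs = []
    for _ in range(n):
        vmax = VMAX0 * random.uniform(1 - rel_uncertainty, 1 + rel_uncertainty)
        km = KM0 * random.uniform(1 - rel_uncertainty, 1 + rel_uncertainty)
        coeffs.append(scaled_sensitivity(vmax, km))
    return coeffs

for u in (0.1, 0.5, 0.9):
    cs = sample_coefficients(u)
    print(f"uncertainty ±{u:.0%}: range {min(cs):.3f} .. {max(cs):.3f}")
```

For this toy flux the scaled coefficient reduces analytically to -Km / (Km + S), so the widening spread with uncertainty, the abstract's central observation, is easy to check by hand; real models additionally show the multimodality the authors report.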

9.
We address the problem of controlling an assembly system in which the processing times as well as the types of subassemblies are stochastic. The quality (or performance) of the final part depends on the characteristics of the subassemblies to be assembled, which are not constant. Furthermore, the processing time of a subassembly is random. We analyze the trade-off between the increase in the potential value of parts gained by delaying the assembly operation and the inventory costs caused by this delay. We also consider the effects of processing time uncertainty. Our problem is motivated by the assembly of passive and active plates in flat panel display manufacturing. We formulate the optimal control problem as a Markov decision process. However, the optimal policy is very complex, and we therefore develop simple heuristic policies. We report the results of a simulation study that tests the performance of our heuristics. The computational results indicate that the heuristics are effective for a wide variety of cases.

10.
The Human Toxicity Potential (HTP) is a quantitative toxic equivalency potential (TEP) that has been introduced previously to express the potential harm of a unit of chemical released into the environment. HTP includes both inherent toxicity and generic source-to-dose relationships for pollutant emissions. Three issues associated with the use of HTP in Life Cycle Impact Assessment (LCIA) are evaluated here. First is the use of regional multimedia models to define source-to-dose relationships for the HTP. Second is uncertainty and variability in source-to-dose calculations. Third is model performance evaluation for TEP models. Using the HTP as a case study, we consider important sources of uncertainty/variability in the development of source-to-dose models, including parameter variability/uncertainty, model uncertainty, and decision rule uncertainty. Once sources of uncertainty are made explicit, a model performance evaluation is appropriate and useful, and is thus introduced. Model performance evaluation can illustrate the relative value of increasing model complexity, assembling more data, and/or providing a more explicit representation of uncertainty. This work reveals that an understanding of the uncertainty in TEPs as well as a model performance evaluation are needed to a) refine and target the assessment process and b) improve decision making.

11.
Worldwide, we rely on introduced plants for the essentials of human life; however, intentional plant introductions for commercial benefit have resulted in invaders with negative environmental, economic or social impacts. We argue that plant species of low expected economic value should be less acceptable for introduction than species of high economic value if their other traits are similar; however, key traits such as likelihood of escape and costs of escape are often highly uncertain. Methods do not currently exist which allow decision makers to evaluate costs and benefits of introduction under uncertainty. We developed a cost-benefit analysis for determining plant introduction that incorporates probability of escape, expected economic costs after escape, expected commercial benefits, and the efficiency and cost of containment. We used a model to obtain optimal decisions for the introduction and containment of commercial plants while maximizing net benefit or avoiding losses. We also obtained conditions for robust decisions which take into account severe uncertainty in model parameters using information-gap decision theory. Optimal decisions for introduction and containment of commercial plants depended, not only on the probability of escape and subsequent costs incurred, but also on the anticipated commercial benefit, and the cost and efficiency of containment. When our objective is to maximize net benefit, increasing uncertainty in parameter values increased the likelihood of introduction; in contrast, if our objective is to avoid losses, more uncertainty decreased the likelihood of introduction.
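The flavour of the info-gap robustness calculation can be shown with a deliberately stripped-down net-benefit model (ours, not the paper's): under a loss-avoidance objective, introduce only if the net benefit B - p*C stays nonnegative for every escape probability p within an uncertainty horizon alpha of the point estimate. All symbols and numbers below are illustrative.

```python
# Toy info-gap robustness: B = expected commercial benefit, C = expected cost
# if the species escapes, p_hat = point estimate of escape probability.
# The worst case within horizon alpha is p_hat + alpha, so the introduction
# stays loss-free as long as B - (p_hat + alpha) * C >= 0.
def robustness(B, C, p_hat):
    return max(0.0, B / C - p_hat)

alpha = robustness(B=50.0, C=300.0, p_hat=0.1)
print(f"robustness horizon: {alpha:.3f}")
```

Here the decision tolerates the true escape probability exceeding its estimate by alpha before losses become possible; under a maximise-net-benefit objective the calculation differs, matching the paper's contrast between the two objectives.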

12.
Mathematical modeling is now frequently used in outbreak investigations to understand underlying mechanisms of infectious disease dynamics, assess patterns in epidemiological data, and forecast the trajectory of epidemics. However, the successful application of mathematical models to guide public health interventions lies in the ability to reliably estimate model parameters and their corresponding uncertainty. Here, we present and illustrate a simple computational method for assessing parameter identifiability in compartmental epidemic models. We describe a parametric bootstrap approach to generate simulated data from dynamical systems to quantify parameter uncertainty and identifiability. We calculate confidence intervals and mean squared error of estimated parameter distributions to assess parameter identifiability. To demonstrate this approach, we begin with a low-complexity SEIR model and work through examples of increasingly more complex compartmental models that correspond with applications to pandemic influenza, Ebola, and Zika. Overall, parameter identifiability issues are more likely to arise with more complex models (based on number of equations/states and parameters). As the number of parameters being jointly estimated increases, the uncertainty surrounding estimated parameters tends to increase, on average, as well. We found that, in most cases, R0 is often robust to parameter identifiability issues affecting individual parameters in the model. Despite large confidence intervals and higher mean squared error of other individual model parameters, R0 can still be estimated with precision and accuracy. Because public health policies can be influenced by results of mathematical modeling studies, it is important to conduct parameter identifiability analyses prior to fitting the models to available data and to report parameter estimates with quantified uncertainty. 
The method described is helpful in these regards and enhances the essential toolkit for conducting model-based inferences using compartmental dynamic models.
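The bootstrap recipe is easy to prototype. Below is a sketch on a discrete-time SIR model, simpler than the paper's SEIR examples; the noise level, grid ranges, and the crude grid-search fit are all our own simplifications.

```python
import random

random.seed(2)

N = 1000.0  # population size (illustrative)

def sir_incidence(beta, gamma, steps=60):
    # Discrete-time SIR; returns the incidence (new infections) series.
    s, i = N - 1.0, 1.0
    out = []
    for _ in range(steps):
        new_inf = beta * s * i / N
        s -= new_inf
        i += new_inf - gamma * i
        out.append(new_inf)
    return out

def fit(data):
    # Crude grid-search least squares; a real analysis would use an optimiser.
    best, best_err = None, float("inf")
    for b in [0.20 + 0.01 * k for k in range(21)]:      # beta in [0.20, 0.40]
        for g in [0.05 + 0.01 * k for k in range(11)]:  # gamma in [0.05, 0.15]
            pred = sir_incidence(b, g)
            err = sum((p - d) ** 2 for p, d in zip(pred, data))
            if err < best_err:
                best, best_err = (b, g), err
    return best

TRUE = (0.30, 0.10)
observed = [y + random.gauss(0, 2.0) for y in sir_incidence(*TRUE)]
point = fit(observed)

# Parametric bootstrap: simulate noisy datasets from the fitted model,
# refit each one, and summarise the spread of the re-estimates.
boot = []
for _ in range(30):
    fake = [y + random.gauss(0, 2.0) for y in sir_incidence(*point)]
    boot.append(fit(fake))

betas = sorted(b for b, _ in boot)
gammas = sorted(g for _, g in boot)
lo, hi = int(0.025 * len(boot)), int(0.975 * len(boot)) - 1
print("beta 95% CI:", (betas[lo], betas[hi]))
print("gamma 95% CI:", (gammas[lo], gammas[hi]))
# R0 = beta / gamma is often better identified than either parameter alone.
r0s = sorted(b / g for b, g in boot)
print("R0 95% CI:", (r0s[lo], r0s[hi]))
```

Wide or boundary-hugging bootstrap intervals for an individual parameter flag an identifiability problem, while a tight R0 interval illustrates the abstract's point that R0 can remain well estimated regardless.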

13.
Optimisation of fed-batch fermenters can substantially increase the profitability of these processes. Optimal control of a fed-batch fermenter is usually based on a nominal process model; parameter uncertainties are not taken into account. Simulation studies show that results obtained with fixed nominal model parameters can be quite sensitive to the uncertainty in parameter values. This paper presents a method for obtaining robust optimal control profiles in the presence of uncertainty in the model parameters. The proposed approach is illustrated with a case study. It is also shown that feedback controllers can reduce the effect of the uncertainties.

14.
We consider the model of invasion prevention in a system of lakes that are connected via traffic of recreational boats. It is shown that in the presence of an Allee effect, the general optimal control problem can be reduced to a significantly simpler stationary optimization problem of optimal invasion stopping. We consider possible values of model parameters for zebra mussels. The general N-lake control problem has to be solved numerically, and we show a number of typical features of solutions: the distribution of control efforts in space and optimal stopping configurations related to the clusters in the lake connection structure.

15.
Ford ED, Kennedy MC. Annals of Botany 2011, 108(6): 1043-1053

Background and Aims

Constructing functional–structural plant models (FSPMs) is a valuable method for examining how physiology and morphology interact in determining plant processes. However, such models always have uncertainty concerned with whether model components have been selected and represented effectively, with the number of model outputs simulated and with the quality of data used in assessment. We provide a procedure for defining uncertainty of an FSPM and how this uncertainty can be reduced.

Methods

An important characteristic of FSPMs is that typically they calculate many variables. These can be variables that the model is designed to predict and also variables that give indications of how the model functions. Together these variables are used as criteria in a method of multi-criteria assessment. Expected ranges are defined and an evolutionary computation algorithm searches for model parameters that achieve criteria within these ranges. Typically, different combinations of model parameter values provide solutions achieving different combinations of variables within their specified ranges. We show how these solutions define a Pareto Frontier that can inform about the functioning of the model.

Key Results

The method of multi-criteria assessment is applied to development of BRANCHPRO, an FSPM for foliage reiteration on old-growth branches of Pseudotsuga menziesii. A geometric model utilizing probabilities for bud growth is developed into a causal explanation for the pattern of reiteration found on these branches and how this pattern may contribute to the longevity of this species.

Conclusions

FSPMs should be assessed by their ability to simulate multiple criteria simultaneously. When different combinations of parameter values achieve different groups of assessment criteria effectively, a Pareto Frontier can be calculated and used to define the sources of model uncertainty.

16.
Dengue fever is currently the most important arthropod-borne viral disease in Brazil. Mathematical modeling of disease dynamics is a very useful tool for the evaluation of control measures. To be used in decision-making, however, a mathematical model must be carefully parameterized and validated with epidemiological and entomological data. In this work, we developed a simple dengue model to answer three questions: (i) which parameters are worth pursuing in the field in order to develop a dengue transmission model for Brazilian cities; (ii) how vector density spatial heterogeneity influences control efforts; (iii) with a degree of uncertainty, what is the invasion potential of dengue virus type 4 (DEN-4) in Rio de Janeiro city. Our model consists of an expression for the basic reproductive number (R0) that incorporates vector density spatial heterogeneity. To deal with the uncertainty regarding parameter values, we parameterized the model using a priori probability density functions covering a range of plausible values for each parameter. Using the Latin Hypercube Sampling procedure, values for the parameters were generated. We conclude that, even in the presence of vector spatial heterogeneity, the two most important entomological parameters to be estimated in the field are the mortality rate and the extrinsic incubation period. The spatial heterogeneity of the vector population increases the risk of epidemics and makes the control strategies more complex. Lastly, we conclude that Rio de Janeiro is at risk of a DEN-4 invasion. Finally, we stress the point that epidemiologists, mathematicians, and entomologists need to interact more to find better approaches to the measuring and interpretation of the transmission dynamics of arthropod-borne diseases.
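The parameterisation step, Latin Hypercube Sampling over a priori ranges, is straightforward to hand-roll. The sketch below uses the classical homogeneous Ross-Macdonald R0 expression, not the paper's heterogeneous one, and the parameter ranges are illustrative rather than field-calibrated.

```python
import math
import random

random.seed(3)

def lhs(n, ranges):
    # One stratified, independently shuffled column per parameter: each of
    # the n equal-probability strata is sampled exactly once.
    cols = []
    for lo, hi in ranges:
        strata = [(k + random.random()) / n for k in range(n)]
        random.shuffle(strata)
        cols.append([lo + u * (hi - lo) for u in strata])
    return list(zip(*cols))  # n parameter vectors

# Illustrative ranges: m mosquitoes/human, a bites/day, b*c transmission
# probabilities, mu mosquito mortality/day, tau extrinsic incubation (days),
# r human recovery rate/day.
ranges = [(0.5, 5.0),    # m
          (0.3, 1.0),    # a
          (0.25, 0.75),  # b*c
          (0.05, 0.25),  # mu
          (7.0, 14.0),   # tau
          (0.10, 0.25)]  # r

def r0(m, a, bc, mu, tau, r):
    # Classical vector-borne R0 (homogeneous form).
    return (m * a * a * bc * math.exp(-mu * tau)) / (r * mu)

samples = lhs(1000, ranges)
values = [r0(*s) for s in samples]
frac = sum(v > 1.0 for v in values) / len(values)
print(f"fraction of parameter draws with R0 > 1: {frac:.2f}")
```

Because mu appears both in the vector lifespan and in the survival-through-incubation term exp(-mu * tau), draws over these ranges spread R0 strongly, which is consistent with the abstract's conclusion that mortality rate and extrinsic incubation period are the key field parameters.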

17.
For the detailed analysis of sedimentation velocity data, the consideration of radial-dependent baseline offsets is indispensable. Two main approaches are data differencing (“delta-c” approach) and explicit inclusion of baseline parameters in the model (“direct boundary model” of the raw data). The current work aims to clarify the relationships between the two approaches. To this end, a simple model problem is examined. We show that the explicit consideration of the baseline in the model is equivalent to a differencing scheme where the average value is subtracted from all data points. Pairwise differencing in the delta-c approach always results in higher parameter uncertainty. For equidistant time points, the increase is smallest when the reference points are taken at intervals of 1/3 or 2/3 of the total number of time points. If the difference data are misinterpreted to be statistically independent samples, errors in the calculation of the parameter uncertainties can occur. Contrary to claims in the literature, we observe that there is no distinction in the approaches regarding their “model dependence”; both approaches arise from the integral or differential form of the same model, and both approaches can and should provide explicit estimates of the baseline values in the original data space for optimal discrimination between macromolecular sedimentation models.
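The equivalence between the explicit-baseline fit and average-value differencing can be verified on a toy model problem of our own: a signal scaled by an unknown amplitude plus an unknown constant baseline (names, signal shape, and noise level are invented, not the paper's sedimentation model).

```python
import random

random.seed(4)

# Toy data: y = A_TRUE * signal(t) + B_TRUE + noise, with unknown amplitude
# a and unknown constant baseline b to be estimated.
ts = [i * 0.1 for i in range(50)]
signal = [t * t for t in ts]   # stand-in for a known boundary shape
A_TRUE, B_TRUE = 2.0, 5.0
ys = [A_TRUE * s + B_TRUE + random.gauss(0, 0.05) for s in signal]

def mean(v):
    return sum(v) / len(v)

# (1) "Direct boundary model": least squares over (a, b) jointly.
sbar, ybar = mean(signal), mean(ys)
sxx = sum((s - sbar) ** 2 for s in signal)
sxy = sum((s - sbar) * (y - ybar) for s, y in zip(signal, ys))
a_direct = sxy / sxx
b_direct = ybar - a_direct * sbar   # explicit baseline estimate

# (2) Average-differencing: subtract the average from all data (and model)
# points, then fit the single amplitude with no intercept term.
ds = [s - sbar for s in signal]
dy = [y - ybar for y in ys]
a_diff = sum(d * e for d, e in zip(ds, dy)) / sum(d * d for d in ds)

print(a_direct, a_diff, b_direct)  # the two amplitude estimates coincide
```

For this linear least-squares case the two amplitude estimators are algebraically identical, which is the equivalence the abstract states; pairwise differencing against a single reference point would instead use a different, noisier estimator.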

18.
The Southern High Plains is anticipated to experience significant changes in temperature and precipitation due to climate change. These changes may influence the lesser prairie-chicken (Tympanuchus pallidicinctus) in positive or negative ways. We assessed the potential changes in clutch size, incubation start date, and nest survival for lesser prairie-chickens for the years 2050 and 2080 based on modeled predictions of climate change and reproductive data for lesser prairie-chickens from 2001–2011 on the Southern High Plains of Texas and New Mexico. We developed 9 a priori models to assess the relationship between reproductive parameters and biologically relevant weather conditions. We selected weather variable(s) with the most model support and then obtained future predicted values from climatewizard.org. We conducted 1,000 simulations using each reproductive parameter’s linear equation obtained from regression calculations, and the future predicted value for each weather variable to predict future reproductive parameter values for lesser prairie-chickens. There was a high degree of model uncertainty for each reproductive value. Winter temperature had the greatest effect size for all three parameters, suggesting a negative relationship between above-average winter temperature and reproductive output. The above-average winter temperatures are correlated to La Niña events, which negatively affect lesser prairie-chickens through resulting drought conditions. By 2050 and 2080, nest survival was predicted to be below levels considered viable for population persistence; however, our assessment did not consider annual survival of adults, chick survival, or the positive benefit of habitat management and conservation, which may ultimately offset the potentially negative effect of drought on nest survival.

19.
A management policy for sika deer based on sex-specific hunting
We consider here a management policy for a sika deer (Cervus nippon) population in the eastern part of Hokkaido. Deer populations are characterized by a large intrinsic rate of population increase, no significant density effects on population growth before population crash, and a relatively simple life history. Our goals of management for the deer population are (1) to avoid irruption with severe damage to agriculture and forestry, (2) to avoid the risk of extinction of the deer population, and (3) to maintain a sustainable yield of deer. To make a robust program on the basis of uncertain information about the deer population, we consider three levels of relative population size and four levels of hunting pressures. We also take into consideration a critical level for extinction, an optimal level, and an irruption level. The hunting pressure for females is set to increase with the population size. We also recommend catching males if the population size is between the critical and optimal levels and catching females and males if the population size is larger than the optimal level. We must avoid cases of irruption or threatened population under various sets of uncertain parameter values. The simulation results suggest that management based on sex-specific hunting is effective to diminish the annual variation in hunting yield. Received: April 8, 1998 / Accepted: December 25, 1998

20.
This paper describes a variational free-energy formulation of (partially observable) Markov decision problems in decision making under uncertainty. We show that optimal control can be cast as active inference. In active inference, both action and posterior beliefs about hidden states minimise a free energy bound on the negative log-likelihood of observed states, under a generative model. In this setting, reward or cost functions are absorbed into prior beliefs about state transitions and terminal states. Effectively, this converts optimal control into a pure inference problem, enabling the application of standard Bayesian filtering techniques. We then consider optimal trajectories that rest on posterior beliefs about hidden states in the future. Crucially, this entails modelling control as a hidden state that endows the generative model with a representation of agency. This leads to a distinction between models with and without inference on hidden control states; namely, agency-free and agency-based models, respectively.
