Similar Articles (20 results found in 31 ms)
1.
Managing natural resources in a sustainable way is a hard task, due to uncertainties, dynamics and conflicting objectives (ecological, social, and economic). We propose a stochastic viability approach to address such problems. We consider a discrete-time controlled dynamical model with uncertainties, representing a bioeconomic system. The sustainability of this system is described by a set of constraints, defined in practice by indicators - namely, state, control and uncertainty functions - together with thresholds. The approach aims at identifying decision rules such that this set of constraints, representing various objectives, is respected with maximal probability. Under appropriate monotonicity properties of the dynamics and constraints, which have economic and biological content, we characterize an optimal feedback rule. A connection is made between this approach and the so-called Management Strategy Evaluation framework used in fisheries. A numerical application to the sustainable management of the Bay of Biscay nephrops-hake mixed fishery is given.
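The core idea of the stochastic viability approach above - score a decision rule by the probability that sustainability constraints hold along uncertain trajectories - can be sketched in a few lines. This is a toy Monte Carlo illustration with made-up dynamics, thresholds, and harvest rules, not the paper's bioeconomic model.

```python
import random

def viability_probability(harvest_rate, b0=1.0, b_min=0.3, horizon=20,
                          n_sims=2000, seed=0):
    """Estimate the probability that biomass stays above b_min over the
    horizon under a constant-harvest-rate feedback rule."""
    rng = random.Random(seed)
    viable = 0
    for _ in range(n_sims):
        b = b0
        ok = True
        for _ in range(horizon):
            growth = 1.0 + rng.uniform(-0.1, 0.3)  # uncertain recruitment
            b = b * growth * (1.0 - harvest_rate)
            if b < b_min:   # sustainability constraint violated
                ok = False
                break
        if ok:
            viable += 1
    return viable / n_sims

# A cautious rule should be viable far more often than an aggressive one.
p_low = viability_probability(0.05)
p_high = viability_probability(0.5)
```

Maximizing this probability over a family of candidate rules is the viability analogue of choosing an optimal feedback.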

2.
Many countries are currently dealing with the COVID-19 epidemic and are searching for an exit strategy such that life in society can return to normal. To support this search, computational models are used to predict the spread of the virus and to assess the efficacy of policy measures before actual implementation. The model output has to be interpreted carefully though, as computational models are subject to uncertainties. These can stem from, e.g., limited knowledge about input parameter values or from the intrinsic stochastic nature of some computational models. They lead to uncertainties in the model predictions, raising the question of what distribution of values the model produces for key indicators of the severity of the epidemic. Here we show how to tackle this question using techniques for uncertainty quantification and sensitivity analysis. We assess the uncertainties and sensitivities of four exit strategies implemented in an agent-based transmission model with geographical stratification. The exit strategies are termed Flattening the Curve, Contact Tracing, Intermittent Lockdown and Phased Opening. We consider two key indicators of the ability of exit strategies to avoid catastrophic health care overload: the maximum number of prevalent cases in intensive care (IC), and the total number of IC patient-days in excess of IC bed capacity. Our results show that uncertainties not directly related to the exit strategies are secondary, although they should still be considered in comprehensive analyses intended to inform policy makers. The sensitivity analysis discloses the crucial role of the intervention uptake by the population and of the capability to trace infected individuals. Finally, we explore the existence of a safe operating space. For Intermittent Lockdown we find only a small region in the model parameter space where the key indicators of the model stay within safe bounds, whereas this region is larger for the other exit strategies.
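The workflow the abstract describes - sample uncertain inputs, run the transmission model, and relate input variation to a key severity indicator - can be illustrated with a toy compartmental model in place of the paper's agent-based model. All parameter ranges here are invented for illustration; the negative correlation between uptake and epidemic peak mirrors the kind of sensitivity finding reported.

```python
import random
import statistics

def sir_peak(beta, gamma=0.1, i0=0.001, steps=600, dt=0.5):
    """Peak infected fraction of a simple discrete-time SIR model."""
    s, i = 1.0 - i0, i0
    peak = i
    for _ in range(steps):
        new_inf = beta * s * i * dt
        new_rec = gamma * i * dt
        s -= new_inf
        i += new_inf - new_rec
        peak = max(peak, i)
    return peak

def pearson(xs, ys):
    """Pearson correlation, used here as a crude sensitivity measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

rng = random.Random(1)
uptakes, peaks = [], []
for _ in range(500):
    beta = rng.uniform(0.3, 0.6)          # uncertain transmission rate
    uptake = rng.uniform(0.5, 1.0)        # uncertain intervention uptake
    eff_beta = beta * (1 - 0.5 * uptake)  # full uptake halves contacts
    uptakes.append(uptake)
    peaks.append(sir_peak(eff_beta))

median_peak = statistics.median(peaks)      # uncertainty in the key indicator
sensitivity = pearson(uptakes, peaks)       # negative: more uptake, lower peak
```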

3.
Although variability in connective tissue parameters is widely reported and recognized, systematic examination of the effect of such parametric uncertainties on predictions derived from a full anatomical joint model is lacking. As such, a sensitivity analysis was performed to consider the behavior of a three-dimensional, non-linear, finite element knee model with connective tissue material parameters that varied within a given interval. The model included the coupled mechanics of the tibio-femoral and patello-femoral degrees of freedom. Seven primary connective tissues modeled as non-linear continua, articular cartilages described by a linear elastic model, and menisci modeled as transverse isotropic elastic materials were included. In this study, a multi-factorial global sensitivity analysis is proposed, which can detect the contribution of influential material parameters while maintaining the potential effect of parametric interactions. To illustrate the effect of material uncertainties on model predictions, exemplar loading conditions reported in a number of isolated experimental paradigms were used. Our findings illustrated that the inclusion of material uncertainties in a coupled tibio-femoral and patello-femoral model reveals biomechanical interactions that otherwise would remain unknown. For example, our analysis revealed that the effect of anterior cruciate ligament parameter variations on the patello-femoral kinematic and kinetic response sensitivities was significantly larger, over a range of flexion angles, when compared to variations associated with material parameters of tissues intrinsic to the patello-femoral joint. We argue that the systematic sensitivity framework presented herein will help identify key material uncertainties that merit further research and provide insight into those uncertainties that may be less relevant to a given response.

4.
This paper proposes a cost estimation method for chemical engineering construction projects that combines parallel Monte Carlo simulation with market investigation, accounting for both cost-estimation uncertainties and market drivers. The market investigation first identifies the critical items that most strongly affect the cost estimate. From these, the most important critical items are selected through a sensitivity analysis based on parallel Monte Carlo simulation combined with the Likert scale method. Relative Importance Indices and Normalized Importance Indices are derived from the experience of discipline and procurement experts in the relevant construction market. The resulting re-ranking of market drivers then guides the parallel Monte Carlo cost simulations, so that price inquiries for the important critical items can be made more efficiently. An illustrative example from a petrochemical Engineering-Procurement-Construction contracting project in Saudi Arabia demonstrates the validity and practicality of the proposed method.
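The two ingredients of such a method - Monte Carlo aggregation of uncertain cost items and a sensitivity-style ranking of which items matter most - can be sketched as follows. The line items, triangular spreads, and figures are hypothetical stand-ins, not values from the paper, and the ranking here is by simple dispersion rather than the paper's importance indices.

```python
import random
import statistics

def simulate_cost(n_sims=5000, seed=42):
    """Draw total project cost from uncertain line items (triangular spreads)."""
    rng = random.Random(seed)
    # (low, mode, high) in million USD -- illustrative numbers only
    items = {
        "equipment":     (40, 50, 70),
        "bulk material": (20, 25, 35),
        "construction":  (30, 35, 55),
        "engineering":   (10, 12, 15),
    }
    totals, draws = [], {name: [] for name in items}
    for _ in range(n_sims):
        total = 0.0
        for name, (lo, mode, hi) in items.items():
            c = rng.triangular(lo, hi, mode)
            draws[name].append(c)
            total += c
        totals.append(total)
    return totals, draws

totals, draws = simulate_cost()
totals_sorted = sorted(totals)
p10 = totals_sorted[len(totals) // 10]       # optimistic cost estimate
p90 = totals_sorted[9 * len(totals) // 10]   # conservative cost estimate
# Rank items by spread: the widest item deserves the closest market inquiry.
spread = {name: statistics.pstdev(vals) for name, vals in draws.items()}
critical = max(spread, key=spread.get)
```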

5.
Recent breakthroughs in CO2 fumigation methods using free-air CO2 enrichment (FACE) technology have prompted comparisons between FACE experiments and enclosure studies with respect to quantification of the effects of projected atmospheric CO2 concentrations on crop yields. On the basis of one such comparison, it was argued that model projections of future food supply (some of which are based on older enclosure data) may have significantly overestimated the positive effect of elevated CO2 concentration on crop yields and, by extension, food security. However, in the comparison, no effort was made to differentiate enclosure study methodologies with respect to maintaining projected CO2 concentration or to consider other climatic changes (e.g. warming) that could impact crop yields. In this review, we demonstrate that relative yield stimulations in response to future CO2 concentrations obtained using a number of enclosure methodologies are quantitatively consistent with FACE results for three crops of global importance: rice (Oryza sativa), soybean (Glycine max) and wheat (Triticum aestivum). We suggest that, instead of focusing on methodological disparities per se, improved projections of future food supply could be achieved by better characterization of the biotic/abiotic uncertainties associated with projected changes in CO2 and climate and incorporation of these uncertainties into current crop models.

6.
Studies report different findings concerning the climate benefits of bioenergy, in part due to varying scope and use of different approaches to define spatial and temporal system boundaries. We quantify carbon balances for bioenergy systems that use biomass from forests managed with long rotations, employing different approaches and boundary conditions. Two approaches to represent landscapes and quantify their carbon balances – expanding vs. constant spatial boundaries – are compared. We show that for a conceptual forest landscape, constructed by combining a series of time‐shifted forest stands, the two approaches sometimes yield different results. We argue that the approach that uses constant spatial boundaries is preferable because it captures all carbon flows in the landscape throughout the accounting period. The approach that uses expanding system boundaries fails to accurately describe the carbon fluxes in the landscape due to incomplete coverage of carbon flows and influence of the stand‐level dynamics, which in turn arise from the way temporal system boundaries are defined on the stand level. Modelling of profit‐driven forest management using location‐specific forest data shows that the implications for carbon balance of management changes across the landscape (which are partly neglected when expanding system boundaries are used) depend on many factors such as forest structure and forest owners’ expectations of market development for bioenergy and other wood products. Assessments should not consider forest‐based bioenergy in isolation but should ideally consider all forest products and how forest management planning as a whole is affected by bioenergy incentives – and how this in turn affects carbon balances in forest landscapes and forest product pools. Due to uncertainties, we modelled several alternative scenarios for forest products markets. 
We recommend that future work consider alternative scenarios for other critical factors, such as policy options and energy technology pathways.

7.
The aim of this study is to determine the effects of size deviations of brachytherapy seeds on the two-dimensional dose distributions around the seed. Although many uncertainties are well known, those stemming from the geometric features of radiation sources are rarely considered or predicted. Neither the TG-43 report, which is not yet fully agreed upon, nor individual Monte Carlo and experimental studies provide sufficient data on geometric uncertainties. The sizes of a seed and its components can vary within manufacturing tolerances, which also introduces geometric uncertainty. In this study, three seeds with different geometrical properties were modeled in full detail using the EGSnrc code package and its geometry package. Size deviations of 5% were assumed, and modified seeds were derived from each original seed by changing its dimensions by 5%. Normalized doses calculated for the three kinds of brachytherapy seed and their derivatives differed by about 3%-20%, showing that manufacturing variations in brachytherapy seeds cause considerable changes in the dose distribution.

8.
Setlow RB 《Mutation research》1999,430(2):774-175

9.
Human activities have severely disrupted the Lake Erie ecosystem. Recent changes in the structure of the lower trophic level associated with exotic species invasions and reduced nutrient loading have created ecological uncertainties for fisheries management. Decisions that naïvely assume certainty may be different and suboptimal compared to choices that consider uncertainty. Here we illustrate how multiobjective Bayesian decision analysis can recognize the multiple goals of management in evaluations of the effect of ecological uncertainties on management and the value of information from ecological research. Value judgments and subjective probabilities required by the decision analysis were provided by six Lake Erie fishery agency biologists. The Lake Erie Ecological Model was used to project the impacts of each combination of management actions and lower trophic level parameter values. The analysis shows that explicitly considering lower trophic level uncertainties can alter decisions concerning Lake Erie fishery harvests. Of the research projects considered, investigation of goby predation of zebra mussels (Dreissena sp.) and lakewide estimation of secondary production appear to have the greatest expected value for fisheries management. We also find that changes in the weights assigned to management goals affect decisions and value of information more than do changes in probability judgments.
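The value-of-information calculation at the heart of such a decision analysis can be sketched with a small hypothetical decision table: compare the expected payoff of the best action under current uncertainty with the expected payoff achievable if the ecological state were known before deciding. The states, priors, and payoffs below are invented for illustration.

```python
# Prior probabilities of lower-trophic-level productivity states (hypothetical).
priors = {"low": 0.3, "medium": 0.5, "high": 0.2}

# Payoff (e.g., long-term fishery value) of each harvest policy in each state.
payoff = {
    "low quota":  {"low": 8, "medium": 9,  "high": 10},
    "high quota": {"low": 2, "medium": 10, "high": 14},
}

def expected_value(action):
    """Expected payoff of an action under the prior state probabilities."""
    return sum(priors[s] * payoff[action][s] for s in priors)

best_now = max(payoff, key=expected_value)
ev_now = expected_value(best_now)

# With perfect information, the best action is chosen after the state is known.
ev_perfect = sum(priors[s] * max(payoff[a][s] for a in payoff) for s in priors)
evpi = ev_perfect - ev_now   # expected value of perfect information
```

A research project (e.g., estimating secondary production) is worth funding, on this logic, only if its cost is below the expected value of the information it delivers.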

10.
The stock‐driven dynamic material flow analysis (MFA) model is one of the prevalent tools to investigate the evolution and related material metabolism of the building stock. There exists substantial uncertainty inherent to input parameters of the stock‐driven dynamic building stock MFA model, which has not yet been comprehensively evaluated. In this study, a probabilistic, stock‐driven dynamic MFA model is established and China's urban housing stock is selected as the empirical case. This probabilistic dynamic MFA model can depict the future evolution pathway of China's housing stock and capture uncertainties in its material stock, inflow, and outflow. By means of probabilistic methods, a detailed and transparent estimation of China's housing stock and its material metabolism behavior is presented. Under a scenario with a saturation level of the population, urbanization, and living space, the median value of the urban housing stock area, newly completed area, and demolished area would peak at around 49, 2.2, and 2.2 billion square meters, respectively. The corresponding material stock and flows are 79, 3.5, and 3.3 billion tonnes, respectively. Uncertainties regarding housing stock and its material stock and flows are non‐negligible; relative uncertainties of the material stock and flows are above 50%. The uncertainty importance analysis demonstrates that the material intensity and the total population are the major contributors to the uncertainty. Policy makers in the housing sector should consider material efficiency as an essential policy to mitigate material flows of the urban building stock and to lower the risk of policy failures.
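The deterministic core of a stock-driven MFA model - solve for the construction inflow that reproduces a prescribed stock given a cohort survival curve, then obtain demolition outflow by mass balance - can be sketched as below. The stock series and survival curve are toy values, not China's housing data; the paper's probabilistic version would wrap this solver in Monte Carlo sampling of lifetimes and material intensities.

```python
def stock_driven_flows(stock, survival):
    """Solve for inflows such that surviving past cohorts reproduce the target
    stock, then derive outflows from mass balance (inflow minus stock change)."""
    n = len(stock)
    inflow, outflow = [0.0] * n, [0.0] * n
    for t in range(n):
        # Floor area carried over from earlier construction cohorts.
        carried = sum(inflow[tp] * survival[t - tp] for tp in range(t))
        inflow[t] = max(0.0, (stock[t] - carried) / survival[0])
        delta = stock[t] - (stock[t - 1] if t > 0 else 0.0)
        outflow[t] = inflow[t] - delta
    return inflow, outflow

# Toy case: floor area grows then saturates; cohorts survive three periods.
stock = [10.0, 20.0, 30.0, 30.0, 30.0]
survival = [1.0, 1.0, 1.0, 0.0, 0.0]  # fraction of a cohort alive after k periods
inflow, outflow = stock_driven_flows(stock, survival)
```

Once the stock saturates, construction continues only to replace demolished cohorts, which is exactly the saturation dynamic the abstract describes.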

11.
Reconstruction by data integration is an emerging trend to reconstruct large protein assemblies, but uncertainties on the input data yield average models whose quantitative interpretation is challenging. This article presents methods to probe fuzzy models of large assemblies against atomic-resolution models of subsystems. Consider a toleranced model (TOM) of a macromolecular assembly, namely a continuum of nested shapes representing the assembly at multiple scales. Also consider a template, namely an atomic-resolution 3D model of a subsystem (a complex) of this assembly. We present graph‐based algorithms performing a multi-scale assessment of the complexes of the TOM, by comparing the pairwise contacts which appear in the TOM against those of the template. We apply this machinery to a TOM derived from an average model of the nuclear pore complex, to explore the connections among members of its well‐characterized Y‐complex. Proteins 2013; 81:2034–2044. © 2013 Wiley Periodicals, Inc.

12.
Hernke MT  Podein RJ 《EcoHealth》2011,8(2):223-232
This article considers health concerns associated with lawn pesticide use and potential policy actions to address those concerns. We first briefly present the generations of pesticide technology, and then apply a sustainability lens to consider the dissipative use of persistent compounds. We enumerate uncertainties in available science and gaps in toxicity testing of pesticides, along with potential for exposure and evidence of harm from lawn pesticide exposure. We consider how a precautionary approach complements a sustainability perspective and detailed scientific findings, and then briefly present practical approaches to reducing use of lawn pesticides. Finally, we highlight factors pivotal for successful policy to limit lawn pesticide use.

13.

Purpose

Quantitative uncertainties are a direct consequence of averaging, a common procedure when building life cycle inventories (LCIs). This averaging can be amongst locations, times, products, scales or production technologies. To date, however, quantified uncertainties at the unit process level have largely been generated using a Numeral Unit Spread Assessment Pedigree (NUSAP) approach that often disregards inherent uncertainties (inaccurate measurements) and spread (variability around means).

Methods

A decision tree for primary and secondary data at the unit process level was initially created. Around this decision tree, a protocol was developed with the recognition that dispersions can be either results of inherent uncertainty, spread amongst data points or products of unrepresentative data. In order to estimate the characteristics of uncertainties for secondary data, a method for weighting means amongst studies is proposed. As for unrepresentativeness, the origin and adaptation of NUSAP to the field of life cycle assessment are discussed, and recommendations are given.

Results and discussion

By using the proposed protocol, cross-referencing of outdated data is avoided, and user influence on results is reduced. In the meantime, more accurate estimates can be made for horizontally averaged data with accompanying spread and inherent uncertainties, as these deviations often contribute substantially towards the overall dispersion.

Conclusions

In this article, we highlight the importance of including inherent uncertainties and spread alongside the NUSAP pedigree. As uncertainty data are often missing in the LCI literature, we describe a method for evaluating them by taking several reported values into account. While this protocol presents a practical way towards estimating overall dispersion, better reporting in the literature is encouraged in order to determine real uncertainty parameters.
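The Methods section mentions weighting means amongst studies; a common way to do this is inverse-variance pooling, sketched below. The three reported values and their standard errors are hypothetical, and the paper's exact weighting scheme may differ.

```python
def pooled_mean(values, std_errors):
    """Inverse-variance weighted mean of reported study means, plus the
    standard error of the pooled estimate (fixed-effect style pooling)."""
    weights = [1.0 / se ** 2 for se in std_errors]
    wsum = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / wsum
    se = (1.0 / wsum) ** 0.5   # pooled estimate is tighter than any input
    return mean, se

# Three reported unit-process values (hypothetical) with standard errors.
mean, se = pooled_mean([2.0, 2.4, 1.8], [0.2, 0.4, 0.1])
```

The pooled mean is pulled toward the most precise study, and its standard error is smaller than that of any single source, which is why combining several reported values beats cross-referencing a single (possibly outdated) one.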

14.

Purpose

In LCA, a multi-functionality problem exists whenever the environmental impacts of a multi-functional process have to be allocated between its multiple functions. Methods for fixing this multi-functionality problem are controversial because they involve ambiguous choices. To study the influence of these choices, the ISO standard requires a sensitivity analysis. This work presents an analytical method for analyzing sensitivities and uncertainties of LCA results with respect to the choices made when a multi-functionality problem is fixed.

Methods

The existing matrix algebra for LCA is expanded by explicit equations for methods that fix multi-functionality problems: allocation and avoided burden. For allocation, choices exist between alternative allocation factors. The expanded equations allow calculating LCA results as a function of allocation factors. For avoided burden, choices exist in selecting an avoided burden process from multiple candidates. This choice is represented by so-called aggregation factors. For avoided burden, the expanded equations calculate LCA results as a function of aggregation factors. The expanded equations are used to derive sensitivity coefficients for LCA results with respect to allocation factors and aggregation factors. Based on the sensitivity coefficients, uncertainties due to fixing a multi-functionality problem by allocation or avoided burden are analytically propagated. The method is illustrated using a virtual numerical example.

Results and discussion

The presented approach rigorously quantifies sensitivities of LCA results with respect to the choices made when multi-functionality problems are fixed with allocation and avoided burden. The uncertainties due to fixing multi-functionality problems are analytically propagated to uncertainties in LCA results using a first-order approximation. For uncertainties in allocation factors, the first-order approximation is exact if no loops of the allocated functional flows exist. The contribution of uncertainties due to fixing multi-functionality problems can be directly compared to the uncertainty contributions induced by uncertain process data or characterization factors. The presented method allows the computationally efficient study of uncertainties due to fixing multi-functionality problems and could be automated in software tools.

Conclusions

This work provides a systematic method for the sensitivity analysis required by the ISO standard in case choices between alternative allocation procedures exist. The resulting analytical approach includes contributions of uncertainties in process data and characterization factors and, extending existing methods, uncertainties due to fixing multi-functionality problems, all in a unifying rigorous framework. Based on the uncertainty contributions, LCA practitioners can select fields for data refinement to decrease the overall uncertainty in LCA results.
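The essence of propagating allocation-choice uncertainty can be shown with a deliberately scalar example rather than the paper's full matrix algebra: when the LCA result is linear in the allocation factor, the sensitivity coefficient is exact and first-order propagation is trivial. All numbers here (total impact, demand, allocation share) are hypothetical.

```python
def lca_result(alpha, process_total=100.0, demand=2.0):
    """Impact assigned to the studied function: the allocation factor alpha
    times the total process impact, scaled by the demanded functional units."""
    return alpha * process_total * demand

alpha = 0.6        # e.g., an economic allocation share (hypothetical)
sigma_alpha = 0.1  # uncertainty in the allocation choice

# The result is linear in alpha, so the sensitivity coefficient is exact.
sens = lca_result(1.0) - lca_result(0.0)   # ds/d(alpha)
sigma_s = abs(sens) * sigma_alpha          # first-order uncertainty propagation
```

In the paper's framework the same logic is applied through the LCA matrix equations, where loops of allocated functional flows can make the first-order result approximate rather than exact.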

15.
In this paper, the design problem of a state estimator for genetic regulatory networks (GRNs) with time delays and randomly occurring uncertainties is addressed by a delay decomposition approach. The norm-bounded uncertainties enter the GRNs in random ways, and such randomly occurring uncertainties (ROUs) obey mutually uncorrelated Bernoulli-distributed white noise sequences. Under these circumstances, the state estimator is designed to estimate the true concentrations of the mRNA and the protein of the uncertain GRNs. Delay-dependent stability criteria are obtained in terms of linear matrix inequalities (LMIs) by constructing a Lyapunov–Krasovskii functional and using several inequality techniques. Then, the desired state estimator, which ensures that the estimation error dynamics are globally asymptotically robustly stochastically stable, is designed from the solutions of the LMIs. Finally, a numerical example is provided to demonstrate the feasibility of the proposed estimation schemes.

16.
Multi-pathway risk assessments (MPRAs) of contaminant emissions to the atmosphere consider both direct exposures, via ambient air, and indirect exposures, via deposition to land and water. MPRAs embody numerous interconnected models and parameters. Concatenation of many multiplicative and incompletely defined assumptions and inputs can result in risk estimates with considerable uncertainties, which are difficult to quantify and elucidate. Here, three MPRA case-studies approach uncertainties in ways that better inform context-specific judgments of risk. In the first case, default values predicted implausibly large impacts; substitution of site-specific data within conservative methods resulted in reasonable and intuitive worst-case estimates. In the second, a simpler, clearly worst-case water quality model sufficed to demonstrate acceptable risks. In the third case, exposures were intentionally and transparently overestimated. Choices made within particular MPRAs depend on availability of data as suitable replacements for default assumptions, regulatory requirements, and thoughtful consideration of the concerns of interested stakeholders. Explicit consideration of the biases inherent in each risk assessment lends greater credibility to the assessment results, and can form the basis for evidence-based decision-making.

17.
Uncertainties in sediment quality assessments are discussed in five categories: (1) sediment sampling, transport and storage; (2) sediment chemistry; (3) ecotoxicology; (4) benthic community structure; and (5) data uncertainties and QA/QC. Three major exposure routes are considered: whole sediments, and waters in sediment pores and at the sediment-water interface. If these uncertainties are not recognized and addressed in the assessment process, then erroneous conclusions may result. Recommendations are provided for addressing the identified uncertainties in each of the key areas. The purpose of this paper is to improve the reporting of sediment quality assessments.

18.
A convenient method for evaluation of biochemical reaction rate coefficients and their uncertainties is described. The motivation for developing this method was the complexity of existing statistical methods for analysis of biochemical rate equations, as well as the shortcomings of linear approaches, such as Lineweaver-Burk plots. The nonlinear least-squares method provides accurate estimates of the rate coefficients and their uncertainties from experimental data. Linearized methods that involve inversion of data are unreliable since several important assumptions of linear regression are violated. Furthermore, when linearized methods are used, there is no basis for calculation of the uncertainties in the rate coefficients. Uncertainty estimates are crucial to studies involving comparisons of rates for different organisms or environmental conditions. The spreadsheet method uses weighted least-squares analysis to determine the best-fit values of the rate coefficients for the integrated Monod equation. Although the integrated Monod equation is an implicit expression of substrate concentration, weighted least-squares analysis can be employed to calculate approximate differences in substrate concentration between model predictions and data. An iterative search routine in a spreadsheet program is utilized to search for the best-fit values of the coefficients by minimizing the sum of squared weighted errors. The uncertainties in the best-fit values of the rate coefficients are calculated by an approximate method that can also be implemented in a spreadsheet. The uncertainty method can be used to calculate single-parameter (coefficient) confidence intervals, degrees of correlation between parameters, and joint confidence regions for two or more parameters. Example sets of calculations are presented for acetate utilization by a methanogenic mixed culture and trichloroethylene cometabolism by a methane-oxidizing mixed culture. 
An additional advantage of application of this method to the integrated Monod equation compared with application of linearized methods is the economy of obtaining rate coefficients from a single batch experiment or a few batch experiments rather than having to obtain large numbers of initial rate measurements. However, when initial rate measurements are used, this method can still be used with greater reliability than linearized approaches.
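The contrast the abstract draws with linearized (Lineweaver-Burk style) fitting can be illustrated by fitting the Monod rate expression directly by least squares. A coarse grid search stands in for the spreadsheet's iterative solver; the data are synthetic and noise-free, generated from assumed coefficients (vmax = 2.0, ks = 0.5), and the paper's weighted fit of the integrated Monod equation is more involved than this direct-rate sketch.

```python
def monod(s, vmax, ks):
    """Monod rate expression v = vmax * s / (ks + s)."""
    return vmax * s / (ks + s)

def fit_monod(s_data, v_data):
    """Least-squares fit of vmax and ks by a coarse grid search, standing in
    for an iterative spreadsheet solver."""
    best = None
    for i in range(1, 101):
        for j in range(1, 101):
            vmax, ks = i * 0.05, j * 0.05   # grid over 0.05 .. 5.0
            sse = sum((v - monod(s, vmax, ks)) ** 2
                      for s, v in zip(s_data, v_data))
            if best is None or sse < best[0]:
                best = (sse, vmax, ks)
    return best[1], best[2]

# Synthetic, noise-free rates generated with vmax = 2.0 and ks = 0.5.
s_data = [0.1, 0.25, 0.5, 1.0, 2.0, 5.0]
v_data = [monod(s, 2.0, 0.5) for s in s_data]
vmax_hat, ks_hat = fit_monod(s_data, v_data)
```

With real data one would weight the squared errors and derive confidence intervals from the curvature of the error surface, as the abstract describes; no such uncertainty estimate is available from a linearized double-reciprocal fit.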

19.
Patterns that resemble strongly skewed size distributions are frequently observed in ecology. A typical example represents tree size distributions of stem diameters. Empirical tests of ecological theories predicting their parameters have been conducted, but the results are difficult to interpret because the statistical methods that are applied to fit such decaying size distributions vary. In addition, binning of field data as well as measurement errors might potentially bias parameter estimates. Here, we compare three different methods for parameter estimation – the common maximum likelihood estimation (MLE) and two modified types of MLE correcting for binning of observations or random measurement errors. We test whether three typical frequency distributions, namely the power-law, negative exponential and Weibull distribution can be precisely identified, and how parameter estimates are biased when observations are additionally either binned or contain measurement error. We show that uncorrected MLE already loses the ability to discern functional form and parameters at relatively small levels of uncertainties. The modified MLE methods that consider such uncertainties (either binning or measurement error) are comparatively much more robust. We conclude that it is important to reduce binning of observations, if possible, and to quantify observation accuracy in empirical studies for fitting strongly skewed size distributions. In general, modified MLE methods that correct binning or measurement errors can be applied to ensure reliable results.
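The binning bias and its correction can be demonstrated for the negative exponential case, one of the three distributions the abstract names. Below, naive MLE applied to bin midpoints is biased, while a binned MLE that maximizes the multinomial likelihood of the bin counts (here via a simple grid search) recovers the true rate. The rate, bin width, and sample size are chosen for illustration only.

```python
import math
import random

rng = random.Random(7)
true_rate = 0.5
data = [rng.expovariate(true_rate) for _ in range(20000)]

# Bin observations, as field data often are (e.g., diameter classes of width 2).
width = 2.0
counts = {}
for x in data:
    j = int(x // width)
    counts[j] = counts.get(j, 0) + 1

# Naive MLE: treat bin midpoints as exact observations (rate = 1 / mean).
midpoint_mean = sum((j + 0.5) * width * n for j, n in counts.items()) / len(data)
naive_rate = 1.0 / midpoint_mean

# Binned MLE: maximise the multinomial likelihood of the bin counts.
def binned_loglik(rate):
    ll = 0.0
    for j, n in counts.items():
        # Probability mass of bin [j*width, (j+1)*width) under the exponential.
        p = math.exp(-rate * j * width) - math.exp(-rate * (j + 1) * width)
        ll += n * math.log(p)
    return ll

grid = [0.01 * k for k in range(10, 100)]   # candidate rates 0.10 .. 0.99
binned_rate = max(grid, key=binned_loglik)
```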

20.
The debate over chlorine in industrialized economies has become extremely polarized in the last decade. Environmental pressure groups are striving for a virtual phaseout of chlorine and chlorinated hydrocarbons (CHCs), because they are convinced that the risks cannot be managed. Industry argues this is neither necessary, because environmental risks can be controlled, nor feasible, because at least 60% of all firms use CHCs, products made with CHCs, or elemental chlorine. In an attempt to give this discussion a more factual basis, the Dutch minister of environment launched a strategic study on chlorine (see Kleijn et al. 1997; Tukker et al. 1995). Using all available knowledge about emissions and contemporary evaluation methods, the study found only a limited number of outstanding environmental issues related to the chlorine chain; however, it also found important uncertainties. This article describes the outstanding uncertainties in more detail. It defines which uncertainties have to be regarded as chlorine-specific and the extent to which additional research can resolve them. For the remaining uncertainties, the potential benefits of uncertainty reduction strategies are evaluated, relying mainly on the precautionary principle.
