Similar documents
20 similar documents found (search time: 15 ms)
1.
In the risk assessment methods for new and existing chemicals in the European Union (EU), environmental “risk” is characterized by the deterministic quotient of exposure and effects (PEC/PNEC). From a scientific viewpoint, the uncertainty in the risk quotient should be accounted for explicitly in decision making, which can be done in a probabilistic risk framework. To demonstrate the feasibility and benefits of such a framework, this paper presents a sample risk assessment for an existing chemical (dibutylphthalate, DBP). The example shows a probabilistic framework to be feasible with relatively little extra effort; such a framework also provides more relevant information. The deterministic risk quotients turned out to be worst-case values, generally higher than the 95th percentile of the probability distributions. Sensitivity analysis proves to be a powerful tool for identifying the main sources of uncertainty and thus for targeting further testing efficiently. The distributions assigned to the assessment factors (derivation of the PNEC) dominate the total uncertainty in the risk assessment; uncertainties in the release estimates come second. Large uncertainties are an inherent part of risk assessment that we have to deal with quantitatively. However, the most appropriate way to characterise effects and risks requires further attention. Recommendations for further study are identified.
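The probabilistic risk-quotient idea described above can be sketched in a few lines of Monte Carlo simulation. The lognormal distributions and their parameters below are purely illustrative assumptions, not values from the DBP assessment:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical lognormal uncertainty distributions for exposure and effects
# (medians and geometric standard deviations are illustrative only).
pec = rng.lognormal(mean=np.log(10.0), sigma=np.log(3.0), size=n)   # exposure, ug/L
pnec = rng.lognormal(mean=np.log(50.0), sigma=np.log(5.0), size=n)  # no-effect conc., ug/L

rq = pec / pnec                 # probabilistic risk quotient PEC/PNEC
p_exceed = np.mean(rq > 1.0)    # probability that the quotient indicates risk

print(f"median RQ      : {np.median(rq):.3f}")
print(f"95th percentile: {np.percentile(rq, 95):.3f}")
print(f"P(RQ > 1)      : {p_exceed:.3f}")
```

A deterministic assessment reports a single quotient; the distribution of `rq` shows where that single value sits (e.g., relative to the 95th percentile), which is exactly the comparison the abstract describes.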

2.
In the European Union, Directive 92/32/EEC and EC Council Regulation (EC) 793/93 require the risk assessment of industrial chemicals. In this framework, it is agreed to characterise the level of “risk” by means of the deterministic quotient of exposure and effects parameters. Decision makers require that the uncertainty in the risk assessment be accounted for as explicitly as possible. Therefore, this paper intends to show the advantages and possibilities of a probabilistic human health risk assessment of an industrial chemical, dibutylphthalate (DBP). The risk assessment is based on non-cancer endpoints assumed to have a threshold for toxicity. This example risk assessment shows that a probabilistic risk assessment in the EU framework covering both the exposure and the effects assessment is feasible with currently available techniques. It shows the possibility of comparing the various uncertainties involved in a typical risk assessment, including the uncertainty in the exposure estimate, the uncertainty in the effect parameter, and the uncertainty in assessment factors used in the extrapolation from experimental animals to sensitive human beings. The analysis did not confirm the reasonable worst-case character of the deterministic EU assessment of DBP. Sensitivity analysis revealed the extrapolation procedure in the human effects assessment to be the main source of uncertainty. Since the probabilistic approach allows determination of the range of possible outcomes and their likelihood, it better informs both risk assessors and risk managers.

3.
Recently, there has been a growing trend toward using stochastic (probabilistic) methods in ecological and public health risk assessment. These methods are favored because they overcome the problem of compounded conservatism and allow the systematic consideration of uncertainty and variability typically encountered in risk assessment. This article demonstrates a new methodology for the analysis of uncertainty in risk assessment using the first-order reliability method (FORM). The reliability method is formulated such that the probability that incremental lifetime cancer risk exceeds a predefined threshold level is calculated. Furthermore, the stochastic sensitivity of this probability with respect to the random variables is provided. The emphasis is on exploring the different types of probabilistic sensitivity obtained through the reliability analysis. The method is applied to a case study given by Thompson et al. (1992) on cancer risk resulting from dermal contact with benzo(a)pyrene (BaP)-contaminated soils. The reliability results matched those of the Monte Carlo simulation method. On average, the Monte Carlo simulation method required about 35 times as many function evaluations as FORM to calculate the probability of exceeding the target risk level. The analysis emphasizes the significant impact that the uncertainty in the cancer potency factor has on the probabilistic modeling results compared with the other parameters.
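As a toy illustration of the reliability idea (not the Thompson et al. case study), consider a multiplicative risk model with lognormal inputs. In standard normal space the limit state log(risk) = log(threshold) is linear, so the first-order reliability index gives the exceedance probability exactly, and a Monte Carlo estimate should agree. All parameter values below are hypothetical:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical multiplicative model: risk = dose * potency,
# with both factors lognormally distributed (illustrative values).
mu_d, sd_d = np.log(1e-4), 0.5      # log-space parameters of dose
mu_p, sd_p = np.log(2.0), 1.0       # log-space parameters of potency
threshold = 1e-3                    # target incremental risk level

# FORM: log(risk) ~ Normal(mu_d + mu_p, sqrt(sd_d^2 + sd_p^2)), so the
# reliability index beta is the standardized margin to the limit state
# and P(risk > threshold) = Phi(-beta).
mu = mu_d + mu_p
sd = np.hypot(sd_d, sd_p)
beta = (np.log(threshold) - mu) / sd
p_form = norm.sf(beta)

# Monte Carlo check: same answer, but it needs many function evaluations.
rng = np.random.default_rng(0)
risk = rng.lognormal(mu_d, sd_d, 1_000_000) * rng.lognormal(mu_p, sd_p, 1_000_000)
p_mc = np.mean(risk > threshold)

print(f"FORM        P(risk > threshold) = {p_form:.4f}")
print(f"Monte Carlo P(risk > threshold) = {p_mc:.4f}")
```

For nonlinear limit states FORM is only a first-order approximation, which is why the abstract's comparison against Monte Carlo is the relevant check.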

4.
5.
The selection of the most appropriate model for an ecological risk assessment depends on the application, the data and resources available, the knowledge base of the assessor, the relevant endpoints, and the extent to which the model deals with uncertainty. Since ecological systems are highly variable and our knowledge of model input parameters is uncertain, it is important that models include treatments of uncertainty and variability, and that results are reported in this light. In this paper we discuss treatments of variation and uncertainty in a variety of population models. In ecological risk assessments, the risk relates to the probability of an adverse event in the context of environmental variation. Uncertainty relates to ignorance about parameter values, e.g., measurement error and systematic error. An assessment of the full distribution of risks, under variability and parameter uncertainty, will give the most comprehensive and flexible endpoint. In this paper we present the rationale behind probabilistic risk assessment, identify the sources of uncertainty relevant for risk assessment and provide an overview of a range of population models. While all of the models reviewed have some utility in ecology, some have more comprehensive treatments of uncertainty than others. We identify the models that allow probabilistic assessments and sensitivity analyses, and we offer recommendations for further developments that aim towards more comprehensive and reliable ecological risk assessments for populations.

6.
Quantification of uncertainty associated with risk estimates is an important part of risk assessment. In recent years, the use of second-order distributions and two-dimensional simulations has been suggested for quantifying both variability and uncertainty. These approaches are better interpreted within the Bayesian framework. To help practitioners better use such methods and interpret the results, in this article, we describe propagation and interpretation of uncertainty in the Bayesian paradigm. We consider both the estimation problem, where some summary measures of the risk distribution (e.g., mean, variance, or selected percentiles) are to be estimated, and the prediction problem, where the risk values for some specific individuals are to be predicted. We discuss some connections and differences between uncertainties in estimation and prediction problems, and present an interpretation of a decomposition of total variability/uncertainty into variability and uncertainty in terms of expected squared error of prediction and its reduction from perfect information. We also discuss the role of Monte Carlo methods in characterizing uncertainty. We explain the basic ideas using a simple example, and demonstrate Monte Carlo calculations using another example from the literature.
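A minimal sketch of the two-dimensional (nested) simulation idea discussed above, with purely illustrative distributions: the outer loop samples the uncertain parameters, the inner loop samples inter-individual variability given those parameters, and the output is a distribution of candidate values for any variability statistic of interest:

```python
import numpy as np

rng = np.random.default_rng(7)
n_outer, n_inner = 200, 5_000   # uncertainty loop, variability loop

p95 = np.empty(n_outer)
for i in range(n_outer):
    # Outer loop: one realization of the uncertain parameters
    # (hypothetical priors on the population mean and sd of log-dose).
    mu = rng.normal(loc=0.0, scale=0.2)
    sd = rng.uniform(0.4, 0.8)
    # Inner loop: inter-individual variability given those parameters.
    doses = rng.lognormal(mu, sd, n_inner)
    p95[i] = np.percentile(doses, 95)  # one candidate 95th percentile

# Uncertainty about the variability statistic: a distribution of
# 95th percentiles rather than a single number.
lo, hi = np.percentile(p95, [5, 95])
print(f"95th-percentile dose: 90% uncertainty interval [{lo:.2f}, {hi:.2f}]")
```

Collapsing both loops into one (a one-dimensional simulation) would mix the two kinds of spread together; keeping them nested is what allows the decomposition into variability and uncertainty that the abstract describes.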

7.
This article evaluates selected sensitivity analysis methods applicable to risk assessment models with two-dimensional probabilistic frameworks, using a microbial food safety process risk model as a test-bed. Six sampling-based sensitivity analysis methods were evaluated, including Pearson and Spearman correlation, sample and rank linear regression, and sample and rank stepwise regression. In a two-dimensional risk model, the identification of key controllable inputs that can be priorities for risk management can be confounded by uncertainty. However, despite uncertainty, results show that key inputs can be distinguished from those that are unimportant, and inputs can be grouped into categories of similar importance. All selected methods are capable of identifying unimportant inputs, which is helpful in that efforts to collect data to improve the assessment or to focus risk management strategies can be prioritized elsewhere. Rank-based methods provided more robust insights with respect to the key sources of variability in that they produced narrower ranges of uncertainty for sensitivity results and clearer distinctions when comparing the importance of inputs or groups of inputs. Regression-based methods have advantages over correlation approaches because they can be configured to provide insight regarding interactions and nonlinearities in the model.
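The simplest of the sampling-based measures mentioned above, Pearson correlation on the samples and Spearman correlation on their ranks, can be sketched on a hypothetical nonlinear risk model (not the food safety model of the article). For the monotonic-but-nonlinear input, the rank-based measure is the more robust indicator of importance:

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical model: risk depends linearly on x1, exponentially on x2,
# and not at all on x3 (an "unimportant" input).
x1 = rng.normal(1.0, 0.3, n)
x2 = rng.normal(0.0, 1.0, n)
x3 = rng.uniform(0.0, 1.0, n)
risk = 2.0 * x1 + np.exp(x2)

for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    r_p = pearsonr(x, risk)[0]   # sample (linear) correlation
    r_s = spearmanr(x, risk)[0]  # rank correlation
    print(f"{name}: Pearson={r_p:+.2f}  Spearman={r_s:+.2f}")
```

Both measures flag x3 as unimportant; for x2 the rank correlation captures the strong monotone dependence that the linear correlation understates, which mirrors the robustness advantage of rank-based methods reported in the abstract.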

8.
A flexible framework for conducting nationwide multimedia, multipathway and multireceptor risk assessments (3MRA) under uncertainty was developed to estimate protective chemical concentration limits in a source area. The framework consists of two components: risk assessment and uncertainty analysis. The risk component utilizes linked source, fate/transport, exposure and risk assessment models to estimate the risk exposures for the receptors of concern. Both human and ecological receptors are included in the risk assessment framework. The flexibility of the framework is based on its ability to address problems varying in spatial scale from site-specific to regional and even national levels, and its ability to accommodate varying types of source, fate/transport, exposure and risk assessment models. The uncertainty component of the 3MRA framework is based on a two-stage Monte Carlo methodology. It allows the calculation of uncertainty in risk estimates, and the incorporation of the effects of uncertainty on the determination of regulatory concentration limits, as a function of variability and uncertainty in input data as well as potential errors in the fate/transport and exposure/risk models. The framework can be adapted to handle a wide range of multimedia risk assessment problems. Two examples are presented to illustrate its use, and to demonstrate how regulatory decisions can be structured to incorporate the uncertainty in risk estimates.

9.
Population variability and uncertainty are important features of biological systems that must be considered when developing mathematical models for these systems. In this paper we present probability-based parameter estimation methods that account for such variability and uncertainty. Theoretical results that establish well-posedness and stability for these methods are discussed. A probabilistic parameter estimation technique is then applied to a toxicokinetic model for trichloroethylene using several types of simulated data. Comparison with results obtained using a standard, deterministic parameter estimation method suggests that the probabilistic methods are better able to capture population variability and uncertainty in model parameters.

10.
We compared the effect of uncertainty in dose-response model form on health risk estimates to the effect of uncertainty and variability in exposure. We used three different dose-response models to characterize neurological effects in children exposed in utero to methylmercury, and applied these models to calculate risks to a native population exposed to potentially contaminated fish from a reservoir in British Columbia. Uncertainty in model form was explicitly incorporated into the risk estimates. The selection of dose-response model strongly influenced both mean risk estimates and distributions of risk, and had a much greater impact than altering exposure distributions. We conclude that incorporating uncertainty in dose-response model form is at least as important as accounting for variability and uncertainty in exposure parameters in probabilistic risk assessment.
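One simple way to incorporate model-form uncertainty, sketched below with purely hypothetical dose-response models and weights (not the methylmercury models of the study), is to evaluate each Monte Carlo sample under a candidate model drawn according to a model weight, so the spread across model forms propagates into the final risk distribution:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 30_000

# Variability/uncertainty in exposure (hypothetical lognormal dose, mg/kg-day).
dose = rng.lognormal(np.log(0.5), 0.6, n)

# Three candidate dose-response forms (all hypothetical).
models = {
    "linear":      lambda d: np.clip(0.10 * d, 0, 1),
    "power":       lambda d: np.clip(0.12 * d**0.7, 0, 1),
    "exponential": lambda d: 1.0 - np.exp(-0.11 * d),
}
weights = {"linear": 0.4, "power": 0.4, "exponential": 0.2}  # hypothetical

# Pool: each sample is evaluated under a model drawn with its weight,
# so model-form uncertainty shows up in the pooled risk distribution.
names = list(models)
choice = rng.choice(names, size=n, p=[weights[m] for m in names])
risk = np.array([models[m](d) for m, d in zip(choice, dose)])

for m in names:
    print(f"{m:12s} mean risk = {models[m](dose).mean():.4f}")
print(f"{'pooled':12s} mean risk = {risk.mean():.4f}")
```

Comparing the per-model means against the pooled distribution makes visible how much of the final spread comes from model form rather than from the exposure inputs, which is the comparison the abstract draws.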

11.
Application of uncertainty and variability in LCA
As yet, the application of an uncertainty and variability analysis is not common practice in LCAs. A proper analysis will be facilitated when it is clear which types of uncertainties and variabilities exist in LCAs and which tools are available to deal with them. Therefore, a framework is developed to classify types of uncertainty and variability in LCAs. Uncertainty is divided into (1) parameter uncertainty, (2) model uncertainty, and (3) uncertainty due to choices, while variability covers (4) spatial variability, (5) temporal variability, and (6) variability between objects and sources. A tool to deal with parameter uncertainty and variability between objects and sources in both the inventory and the impact assessment is probabilistic simulation. Uncertainty due to choices can be dealt with in a scenario analysis or reduced by standardisation and peer review. The feasibility of dealing with temporal and spatial variability is limited, implying model uncertainty in LCAs. Other model uncertainties can be reduced partly by more sophisticated modelling, such as the use of non-linear inventory models in the inventory and multimedia models in the characterisation phase.

12.
In recent years, risk assessors have increasingly been moving away from deterministic risk assessment approaches and are applying probabilistic approaches that incorporate distributions of possible values for each input parameter. This paper reviews several approaches that are being used, or that could potentially be used, to develop distributions for carcinogenic slope factors (CSFs). Based on the primary tool or framework that is applied, these approaches have been divided into the following three categories: the statistical framework, the decision analysis framework, and the biological framework. Work that has been done on each approach is summarized, and the aspects of variability and uncertainty that are incorporated into each approach are examined. The implications of the resulting distributional information for calculating risk estimates or risk-based concentrations are explored. The approaches differ in their stage of development, the degree to which they diverge from the U.S. Environmental Protection Agency's (EPA) current practice in establishing CSF values, their flexibility to accommodate varying data sets or theories of carcinogenicity, and their complexity of application. In some cases, wide ranges of potential potency estimates are indicated by these approaches. Such findings suggest widely divergent risk assessment implications and the need for additional evaluation of the goals of developing CSF distributions for use in risk assessment applications and the types of information that should be reflected in such distributions. Some combination of the features offered by these approaches may best support risk assessment and risk management decisions.

13.
The importance of fitting distributions to data for risk analysis continues to grow as regulatory agencies, like the Environmental Protection Agency (EPA), continue to shift from deterministic to probabilistic risk assessment techniques. The use of Monte Carlo simulation as a tool for propagating variability and uncertainty in risk requires specification of the risk model's inputs in the form of distributions or tables of data. Several software tools exist to support risk assessors in their efforts to develop distributions. However, users must keep in mind that these tools do not replace clear thought about judgments that must be made in characterizing the information from data. This overview introduces risk assessors to the statistical concepts and physical reasons that support important judgments about appropriate types of parametric distributions and goodness-of-fit. In the context of using data to improve risk assessment and ultimately risk management, this paper discusses issues related to the nature of the data (representativeness, quantity, quality, correlation in space and time, and the distinction between variability and uncertainty in a data set) and to matching data and distributions appropriately. All data analysis (whether “Frequentist” or “Bayesian” or oblivious to the distinction) requires the use of subjective judgment. The paper offers an iterative process for developing distributions using data to characterize variability and uncertainty for inputs to risk models that provides incentives for collecting better information when the value of information exceeds its cost. Risk analysts need to focus attention on characterizing the information appropriately for purposes of the risk assessment (and the risk management questions at hand), not on characterization for its own sake.
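The fit-then-check workflow this overview describes can be sketched with scipy: fit candidate parametric distributions by maximum likelihood, then apply a goodness-of-fit test before adopting one as a risk-model input. The data here are simulated purely for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Simulated "measured concentrations" (truly lognormal in this example).
data = rng.lognormal(mean=1.0, sigma=0.5, size=250)

# Fit candidate distributions by maximum likelihood.
ln_shape, ln_loc, ln_scale = stats.lognorm.fit(data, floc=0)
n_loc, n_scale = stats.norm.fit(data)

# Goodness-of-fit: Kolmogorov-Smirnov test against each fitted candidate.
# (Note: fitting the parameters from the same data makes the KS p-value
# only approximate; it is used here for ranking candidates, not as a
# strict hypothesis test.)
ks_ln = stats.kstest(data, "lognorm", args=(ln_shape, ln_loc, ln_scale))
ks_n = stats.kstest(data, "norm", args=(n_loc, n_scale))

print(f"lognormal: KS statistic = {ks_ln.statistic:.3f}")
print(f"normal   : KS statistic = {ks_n.statistic:.3f}")
```

As the overview stresses, the numerical test never replaces judgment: physical reasoning (e.g., concentrations are non-negative and often multiplicative in origin) should motivate the candidate families before any statistic is computed.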

14.
Finite element (FE) models of bone, developed from computed tomography (CT) scan data, are used to evaluate stresses and strains, load transfer and fixation of implants, and potential for fracture. The experimentally derived relationships used to transform CT scan data in Hounsfield units to modulus and strength contain substantial scatter. The scatter in these relationships has the potential to impact the results and conclusions of bone studies. The objectives of this study were to develop a computationally efficient probabilistic FE-based platform capable of incorporating uncertainty in bone property relationships, and to apply the model to a representative analysis; variability in stresses and fracture risk was predicted in five proximal femurs under stance loading conditions. Based on published variability in strength and modulus relationships derived in the proximal femur, the probabilistic analysis predicted the distributions of stress and risk. For the five femurs analyzed, the 1st and 99th percentile bounds varied by an average of 17.3 MPa for stress and by 0.28 for risk. In each femur, the predicted variability in risk was greater than 50% of the mean risk calculated, with obvious implications for clinical assessment. Results using the advanced mean value (AMV) method required only seven analysis trials (1 h) and differed by less than 2% when compared to a 1000-trial Monte Carlo simulation (400 h). The probabilistic modeling platform developed has broad applicability to bone studies and can be similarly implemented to investigate other loading conditions, structures, sources of uncertainty, or output measures of interest.

15.
This article reviews the status of comparative risk assessment within the context of environmental decision-making; evaluates its potential application as a decision-making framework for selecting alternative technologies for dredged material management; and makes recommendations for implementing such a framework. One of the most important points from this review for decision-making is that comparative risk assessment, however conducted, is an inherently subjective, value-laden process. There is some objection to this lack of total scientific objectivity (“hard version” of comparative risk assessment). However, the “hard versions” provide little help in suggesting a method that surmounts the psychology of choice in decision-making schemes. The application of comparative risk assessment in the decision-making process at dredged material management facilities will have an element of value and professional judgment in the process. The literature suggests that the best way to incorporate this subjectivity and still maintain a defensible comparative framework is to develop a method that is logically consistent and allows for uncertainty by comparing risks on the basis of more than one set of criteria, more than one set of categories, and more than one set of experts. It should incorporate a probabilistic approach where necessary and possible, based on management goals.

16.
A guideline is presented for selection of sensitivity analysis methods applied to microbial food safety process risk (MFSPR) models. The guideline provides useful boundaries and principles for selecting sensitivity analysis methods for MFSPR models. Although the guideline is predicated on a specific branch of risk assessment models related to food-borne diseases, the principles and recommendations provided are generally applicable to other types of risk models. Applicable situations include: prioritizing potential critical control points; identifying key sources of variability and uncertainty; and refinement, verification, and validation of a model. Based on the objective of the analysis, characteristics of the model under study, amount of detail expected from sensitivity analysis, and characteristics of the sensitivity analysis method, recommendations for selection of sensitivity analysis methods are provided. A decision framework for method selection is introduced. The decision framework can substantially facilitate the process of selecting a sensitivity analysis method.

17.
The results of quantitative risk assessments are key factors in a risk manager's decision of the necessity to implement actions to reduce risk. The extent of the uncertainty in the assessment will play a large part in the degree of confidence a risk manager has in the reported significance and probability of a given risk. The two main sources of uncertainty in such risk assessments are variability and incertitude. In this paper we use two methods, a second-order two-dimensional Monte Carlo analysis and probability bounds analysis, to investigate the impact of both types of uncertainty on the results of a food-web exposure model. We demonstrate how the full extent of uncertainty in a risk estimate can be portrayed in a way that is useful to risk managers. We show that probability bounds analysis is a useful tool for identifying the parameters that contribute the most to uncertainty in a risk estimate and how it can be used to complement established practices in risk assessment. We conclude by promoting the use of probability bounds analysis in conjunction with Monte Carlo analyses as a method for checking how plausible Monte Carlo results are in the full context of uncertainty.
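A minimal sketch of the parametric probability-bounds idea, assuming only that one input parameter is known to an interval rather than a point (all values below are illustrative): sweeping the parameter over its interval and taking the envelope of the resulting CDFs yields a p-box that brackets every distribution consistent with the stated incertitude.

```python
import numpy as np
from scipy.stats import lognorm

# Exposure concentration: lognormal with known sigma, but with the median
# known only to lie in the interval [2, 5] (hypothetical incertitude).
sigma = 0.4
medians = np.linspace(2.0, 5.0, 31)     # sweep the interval
x = np.linspace(0.1, 20.0, 400)

# Envelope of all CDFs consistent with the interval = a parametric p-box.
cdfs = np.array([lognorm.cdf(x, s=sigma, scale=m) for m in medians])
lower_cdf = cdfs.min(axis=0)
upper_cdf = cdfs.max(axis=0)

# Bounds on an exceedance probability, read directly off the p-box:
i = np.searchsorted(x, 8.0)
p_hi = 1.0 - lower_cdf[i]
p_lo = 1.0 - upper_cdf[i]
print(f"P(C > 8) lies in [{p_lo:.3f}, {p_hi:.3f}]")
```

A Monte Carlo result that falls outside such bounds signals an inconsistency with the stated incertitude, which is the kind of plausibility check the abstract advocates.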

18.
Four different probabilistic risk assessment methods were compared using the data from the Sangamo Weston/Lake Hartwell Superfund site. These were one-dimensional Monte Carlo, two-dimensional Monte Carlo considering uncertainty in the concentration term, two-dimensional Monte Carlo considering uncertainty in ingestion rate, and microexposure event analysis. Estimated high-end risks ranged from 2.0×10⁻⁴ to 3.3×10⁻³. Microexposure event analysis produced a lower risk estimate than any of the other methods due to incorporation of time-dependent changes in the concentration term.

19.
International harmonization of risk assessment approaches affords a number of opportunities and advantages. Overall, harmonization will lead to more efficient use of resources, but also will lead to better understanding amongst scientists and regulators worldwide. It is with these goals in mind that in 1994 the International Programme on Chemical Safety (IPCS) initiated its Project on the Harmonization of Approaches to the Assessment of Risk from Exposure to Chemicals (Harmonization Project). An ongoing activity under this project addresses uncertainty and variability in risk assessment. The goal of the overall activity is to promote harmonization of risk assessment methodologies for noncancer endpoints. However, given the common links in uncertainty and variability that apply across a range of end-point-specific activities, these links are identified wherever possible. This paper provides an overview of the IPCS Harmonization Project and reviews the activity and future plans related to uncertainty and variability.

20.
Key issues and prospects of regional ecological risk assessment
Regional ecological risk assessment is characterized by multiple risk factors, multiple risk receptors, multiple assessment endpoints, an emphasis on uncertainty, and spatial heterogeneity; it differs clearly from traditional ecological risk assessment in its risk sources, stressors, and assessment scales. A regional ecological risk assessment framework based on terrestrial ecosystems is proposed. In view of the current state of research, the key problems and difficulties in regional ecological risk assessment are identified as uncertainty analysis, the difficulty of scale extrapolation, non-uniform assessment indicators and assessment criteria, screening and prioritization of risk factors, combined pollution within a region, the transition from aquatic to terrestrial ecosystem risk assessment, and particular anthropogenic factors; the tools, techniques, and theoretical breakthroughs that may be needed to solve these problems are also suggested. Finally, five areas are identified as future research priorities: observation of regional ecological risk and the collection and processing of the corresponding data; unification and integration of regional ecological risk indicator systems; regional ecological risk assessment methodology; the spatial distribution and representation of regional ecological risk; and feedback and management mechanisms for regional ecological risk assessment.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号