Similar Documents (20 results)
1.
In the European Union, Directive 92/32/EEC and Council Regulation (EEC) 793/93 require the risk assessment of industrial chemicals. In this framework, the level of “risk” is characterised by the deterministic quotient of exposure and effects parameters. Decision makers require that the uncertainty in the risk assessment be accounted for as explicitly as possible. This paper therefore demonstrates the advantages and possibilities of a probabilistic human health risk assessment of an industrial chemical, dibutyl phthalate (DBP). The risk assessment is based on non-cancer endpoints assumed to have a threshold for toxicity. This example shows that a probabilistic risk assessment in the EU framework, covering both the exposure and the effects assessment, is feasible with currently available techniques. It makes it possible to compare the various uncertainties involved in a typical risk assessment, including the uncertainty in the exposure estimate, the uncertainty in the effect parameter, and the uncertainty in the assessment factors used to extrapolate from experimental animals to sensitive human beings. The analysis did not confirm the reasonable worst-case character of the deterministic EU assessment of DBP. Sensitivity analysis revealed the extrapolation procedure in the human effects assessment to be the main source of uncertainty. Because the probabilistic approach determines the range of possible outcomes and their likelihood, it better informs both risk assessors and risk managers.
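
A minimal sketch of such a probabilistic threshold-based assessment (not the paper's actual model; all distribution choices and parameter values are invented, not DBP-specific): give the exposure, the animal NOAEL, and the animal-to-human assessment factor each a distribution, so the risk characterisation ratio and its exceedance probability come out as distributions rather than a single quotient.

```python
# Sketch: replace the deterministic quotient exposure/(NOAEL/AF) with
# distributions and read off the likelihood of exceeding 1.
# All parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

exposure = rng.lognormal(np.log(0.01), 0.8, n)   # mg/kg bw/day, assumed
noael    = rng.lognormal(np.log(50.0), 0.3, n)   # animal NOAEL, assumed
af       = rng.lognormal(np.log(100.0), 0.6, n)  # animal-to-human assessment factor

rcr = exposure / (noael / af)                    # risk characterisation ratio
print(f"median RCR: {np.median(rcr):.3g}")
print(f"P(RCR > 1): {np.mean(rcr > 1):.1%}")
```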

2.
3.
In recent years, risk assessors have increasingly been moving away from deterministic risk assessment approaches and applying probabilistic approaches that incorporate distributions of possible values for each input parameter. This paper reviews several approaches that are being used, or could potentially be used, to develop distributions for carcinogenic slope factors (CSFs). Based on the primary tool or framework applied, these approaches have been divided into three categories: the statistical framework, the decision analysis framework, and the biological framework. Work done on each approach is summarized, and the aspects of variability and uncertainty incorporated into each approach are examined. The implications of the resulting distributional information for calculating risk estimates or risk-based concentrations are explored. The approaches differ in their stage of development, the degree to which they diverge from the U.S. Environmental Protection Agency's (EPA) current practice in establishing CSF values, their flexibility to accommodate varying data sets or theories of carcinogenicity, and their complexity of application. In some cases, these approaches indicate wide ranges of potential potency estimates. Such findings suggest widely divergent risk assessment implications and a need to further evaluate the goals of developing CSF distributions for use in risk assessment and the types of information such distributions should reflect. Some combination of the features offered by these approaches may best support risk assessment and risk management decisions.

4.
This study examined relationships between hazard quotients (HQs) and probabilistic estimates of aquatic ecological risk. Questions addressed included the magnitude at which an HQ equates to significant risk and the factors influencing the HQ-risk relationship. The analysis was based upon predicted exposure concentrations (PECs) for copper, hypothetical predicted-no-effect-concentration (PNEC) distributions, and measured PNEC data for aquatic species acutely and chronically exposed to copper, ammonia, cadmium, cyanide, dieldrin, DDT, phenanthrene, silver, and zinc. The cumulative PNEC and PEC distributions differed in slope and magnitude. The relationship between HQ and probabilistic risk, both computed using conventional techniques, depended on the slopes of the PNEC and PEC distributions. Hazard quotients equal to 1.0 affected about 5% of the species because they were based on PNECs intended to protect all but 5% of the species. The risk implied by HQs greater than 1.0 depended on the PNEC slope. For example, toxicants with the steeper PNEC distributions affected a large percentage of species (18 to 49%, depending on slope) at HQ = 2 to 3. Other factors (e.g., variability in both the PEC and PNEC data, and the use of arithmetic or geometric means or their confidence limits) had variable influences on the HQ-risk relationship.
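
The HQ-risk relationship described above can be reproduced in a small sketch, assuming a lognormal species sensitivity distribution (SSD) whose 5th percentile (HC5) serves as the PNEC; all numbers are hypothetical.

```python
# Sketch: PNEC = HC5 of a lognormal SSD, so HQ = 1 affects ~5% of
# species by construction; the fraction affected at HQ > 1 depends
# on the SSD slope (sigma). All values are illustrative.
import numpy as np
from scipy import stats

z05 = stats.norm.ppf(0.05)            # HC5 sits 1.645 SDs below the SSD median
for sigma in (0.5, 1.0, 2.0):         # small sigma = steep PNEC distribution
    for hq in (1.0, 2.0, 3.0):
        # PEC = HQ * HC5; fraction of species whose threshold lies below the PEC:
        frac = stats.norm.cdf(z05 + np.log(hq) / sigma)
        print(f"sigma={sigma:3.1f}  HQ={hq:.0f}  species affected ~{frac:5.1%}")
```

Under these assumptions the steepest curve (sigma = 0.5) affects roughly 40% of species at HQ = 2, while the flattest affects under 10%, mirroring the slope dependence reported above.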

5.
Ecological risk assessment has a short history but a framework similar to that of human health risk assessment. The Toxic Substances Control Act (TSCA) and the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) played a significant role in the development of the ecological risk process. Data developed and risk procedures used within TSCA and FIFRA have become generally standardized. Fundamental components of the risk process require data on the effects of chemicals in the form of concentration (or dose)-response profiles for species, and an exposure profile to quantify the magnitude and the spatial and temporal patterns of exposure relevant to the biological endpoints being studied. Risk characterization generally involves comparing exposure and effects using point estimates (e.g., the quotient method), but risk estimation is moving toward a probabilistic approach that compares distributions of values and gives more consideration to the sources of uncertainty. Ecological testing guidelines in TSCA and FIFRA are discussed along with the risk characterization process used under each statute.

6.
We use bootstrap simulation to characterize uncertainty in parametric distributions, including Normal, Lognormal, Gamma, Weibull, and Beta, commonly used to represent variability in probabilistic assessments. Bootstrap simulation enables one to estimate sampling distributions for sample statistics, such as distribution parameters, even when analytical solutions are not available. Using a two-dimensional framework for both uncertainty and variability, uncertainties in cumulative distribution functions were simulated. The mathematical properties of uncertain frequency distributions were evaluated in a series of case studies during which the parameters of each type of distribution were varied for sample sizes of 5, 10, and 20. For positively skewed distributions such as Lognormal, Weibull, and Gamma, the range of uncertainty is widest at the upper tail of the distribution. For symmetric unbounded distributions, such as Normal, the uncertainties are widest at both tails of the distribution. For bounded distributions, such as Beta, the uncertainties are typically widest in the central portions of the distribution. Bootstrap simulation enables complex dependencies between sampling distributions to be captured. The effects of uncertainty, variability, and parameter dependencies were studied for several generic functional forms of models, including models in which two-dimensional random variables are added, multiplied, and divided, to show the sensitivity of model results to different assumptions regarding model input distributions, ranges of variability, and ranges of uncertainty and to show the types of errors that may be obtained from mis-specification of parameter dependence. A total of 1,098 case studies were simulated. In some cases, counter-intuitive results were obtained. For example, the point value of the 95th percentile of uncertainty for the 95th percentile of variability of the product of four Gamma or Weibull distributions decreases as the coefficient of variation of each model input increases and, therefore, may not provide a conservative estimate. Failure to properly characterize parameter uncertainties and their dependencies can lead to orders-of-magnitude mis-estimates of both variability and uncertainty. In many cases, the numerical stability of two-dimensional simulation results was found to decrease as the coefficient of variation of the inputs increases. We discuss the strengths and limitations of bootstrap simulation as a method for quantifying uncertainty due to random sampling error.
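
A minimal parametric-bootstrap sketch of this two-dimensional idea (synthetic data and an illustrative sample size, not the study's cases): fit a Lognormal to a small sample, refit it to resampled datasets of the same size, and read the uncertainty about the 95th percentile of variability off the bootstrap replicates.

```python
# Parametric bootstrap for a Lognormal fitted to a small sample.
# Data and sample size are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(7)
data = rng.lognormal(mean=1.0, sigma=0.5, size=10)   # small "observed" sample, n=10

logx = np.log(data)
mu_hat, sd_hat = logx.mean(), logx.std(ddof=1)       # fitted Lognormal parameters

B = 5000
p95 = np.empty(B)
for b in range(B):
    boot = rng.lognormal(mu_hat, sd_hat, size=data.size)  # synthetic dataset
    lb = np.log(boot)
    # refit and store the 95th percentile of variability for this replicate
    p95[b] = np.exp(lb.mean() + 1.645 * lb.std(ddof=1))

lo, hi = np.percentile(p95, [2.5, 97.5])
print(f"point estimate of the 95th percentile: {np.exp(mu_hat + 1.645*sd_hat):.2f}")
print(f"95% bootstrap uncertainty interval: [{lo:.2f}, {hi:.2f}]")
```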

7.
There has been a trend in recent years toward the use of probabilistic methods for the analysis of uncertainty and variability in risk assessment. By developing a plausible distribution of risk, it is possible to obtain a more complete characterization of risk than is provided by either best estimates or upper limits. We describe in this paper a general framework for evaluating uncertainty and variability in risk estimation and outline how this framework can be used in the establishment of drinking water quality objectives. In addition to characterizing uncertainty and variability in risk, this framework also facilitates the identification of specific factors that contribute most to uncertainty and variability. The application of these probabilistic risk assessment methods is illustrated using tetrachloroethylene and trihalomethanes as examples.
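
One common way to identify the factors that contribute most, sketched below with invented inputs loosely styled on a drinking-water intake calculation (not the article's tetrachloroethylene or trihalomethane data), is to rank-correlate each Monte Carlo input with the output.

```python
# Rank-correlation sensitivity screen after a Monte Carlo run.
# All inputs and the toy dose model are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)
n = 20_000

conc   = rng.lognormal(np.log(5.0), 0.7, n)    # ug/L in tap water, assumed
intake = rng.lognormal(np.log(1.5), 0.3, n)    # L/day, assumed
bw     = rng.normal(70.0, 12.0, n).clip(30)    # kg body weight, assumed

dose = conc * intake / bw                      # ug/kg-day

for name, x in [("concentration", conc), ("intake", intake), ("body weight", bw)]:
    rho, _ = stats.spearmanr(x, dose)          # rank correlation with the output
    print(f"{name:13s} rank correlation with dose: {rho:+.2f}")
```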

8.
Quantification of the uncertainty associated with risk estimates is an important part of risk assessment. In recent years, the use of second-order distributions and two-dimensional simulations has been suggested for quantifying both variability and uncertainty. These approaches are better interpreted within the Bayesian framework. To help practitioners use such methods and interpret the results, this article describes the propagation and interpretation of uncertainty in the Bayesian paradigm. We consider both the estimation problem, where some summary measures of the risk distribution (e.g., mean, variance, or selected percentiles) are to be estimated, and the prediction problem, where the risk values for specific individuals are to be predicted. We discuss connections and differences between uncertainties in estimation and prediction problems, and present an interpretation of the decomposition of total variability/uncertainty into variability and uncertainty in terms of the expected squared error of prediction and its reduction under perfect information. We also discuss the role of Monte Carlo methods in characterizing uncertainty. We explain the basic ideas using a simple example and demonstrate Monte Carlo calculations using another example from the literature.
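
The decomposition can be illustrated with the law of total variance in a two-dimensional setting: an outer loop draws uncertain population parameters, and the variability moments conditional on each draw follow analytically. The sketch below assumes hypothetical lognormal inter-individual variability; it illustrates the general idea, not the article's example.

```python
# Law-of-total-variance split of total spread into a variability part
# E[Var] and an uncertainty-about-the-mean part Var[E].
# All distributions are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
n_unc = 20_000

mu  = rng.normal(np.log(1.0), 0.2, n_unc)        # uncertain log-scale mean
sig = np.abs(rng.normal(0.5, 0.05, n_unc))       # uncertain log-scale SD

cond_mean = np.exp(mu + sig**2 / 2)              # E[X | parameters]
cond_var  = (np.exp(sig**2) - 1) * cond_mean**2  # Var[X | parameters]

total = cond_var.mean() + cond_mean.var()        # E[Var] + Var[E]
print(f"variability share of total variance: {cond_var.mean() / total:.1%}")
print(f"uncertainty share of total variance: {cond_mean.var() / total:.1%}")
```

With perfect information about the parameters, the Var[E] term would vanish, which is one reading of the "reduction from perfect information" mentioned above.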

9.
The importance of fitting distributions to data for risk analysis continues to grow as regulatory agencies, such as the Environmental Protection Agency (EPA), shift from deterministic to probabilistic risk assessment techniques. The use of Monte Carlo simulation as a tool for propagating variability and uncertainty in risk requires specification of the risk model's inputs in the form of distributions or tables of data. Several software tools exist to support risk assessors in their efforts to develop distributions. However, users must keep in mind that these tools do not replace clear thought about the judgments that must be made in characterizing the information from data. This overview introduces risk assessors to the statistical concepts and physical reasons that support important judgments about appropriate types of parametric distributions and goodness-of-fit. In the context of using data to improve risk assessment and ultimately risk management, the paper discusses issues related to the nature of the data (representativeness; quantity and quality; correlation with space and time; and distinguishing between variability and uncertainty for a set of data) and to matching data and distributions appropriately. All data analysis (whether “Frequentist” or “Bayesian” or oblivious to the distinction) requires the use of subjective judgment. The paper offers an iterative process for developing distributions from data to characterize variability and uncertainty in risk model inputs, one that provides incentives for collecting better information when the value of information exceeds its cost. Risk analysts need to focus on characterizing the information appropriately for the purposes of the risk assessment (and the risk management questions at hand), not on characterization for its own sake.
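
As a sketch of the fit-and-check step (with synthetic stand-in data), candidate parametric distributions can be fitted by maximum likelihood and screened with a goodness-of-fit statistic, keeping in mind that the final choice still rests on judgment about the data's physical basis.

```python
# Fit candidate distributions by MLE and screen with a KS statistic.
# The data here are synthetic placeholders, not a real dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
data = rng.lognormal(mean=0.5, sigma=0.6, size=40)   # stand-in for measured values

for name, dist in [("lognormal", stats.lognorm), ("gamma", stats.gamma)]:
    params = dist.fit(data, floc=0)                  # MLE fit, location pinned at 0
    ks = stats.kstest(data, dist.cdf, args=params)   # Kolmogorov-Smirnov screen
    # note: KS p-values are optimistic when parameters were fitted to the same data
    print(f"{name:9s}  KS stat={ks.statistic:.3f}  p={ks.pvalue:.2f}")
```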

10.
Climate change impact assessments are plagued with uncertainties from many sources, such as climate projections or inadequacies in the structure and parameters of the impact model. Previous studies have tried to account for the uncertainty from one or two of these. Here, we developed a triple-ensemble probabilistic assessment using seven crop models, multiple sets of model parameters and eight contrasting climate projections together to comprehensively account for uncertainties from these three important sources. We demonstrated the approach by assessing the climate change impact on barley growth and yield at Jokioinen, Finland, in the Boreal climatic zone and Lleida, Spain, in the Mediterranean climatic zone, for the 2050s. We further quantified and compared the contributions of crop model structure, crop model parameters and climate projections to the total variance of the ensemble output using Analysis of Variance (ANOVA). Based on the triple-ensemble probabilistic assessment, the median simulated yield change was −4% and +16%, and the probability of decreasing yield was 63% and 31%, in the 2050s at Jokioinen and Lleida, respectively, relative to 1981–2010. The contribution of crop model structure to the total variance of the ensemble output was larger than that from downscaled climate projections and model parameters. The relative contributions of crop model parameters and downscaled climate projections to the total variance varied greatly among the seven crop models and between the two sites, although the contribution of downscaled climate projections was on average larger than that of crop model parameters. This information on the uncertainty from different sources can help model users decide where to put the most effort when preparing or choosing models or parameters for impact analyses. We conclude that a triple-ensemble probabilistic approach that accounts for uncertainties from multiple important sources provides more comprehensive information for quantifying uncertainties in climate change impact assessments than conventional approaches that are deterministic or account for only one or two of the uncertainty sources.
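
The ANOVA partitioning can be sketched for a reduced two-factor ensemble (crop model × climate projection; the parameter dimension is omitted for brevity). The ensemble values below are synthetic and merely chosen so that model structure dominates, echoing the qualitative finding; they are not the study's data.

```python
# Two-way ANOVA split of ensemble variance into main-effect shares.
# Synthetic yield-change values, not the study's simulations.
import numpy as np

rng = np.random.default_rng(5)
n_models, n_climates = 7, 8
model_eff   = rng.normal(0, 8, size=(n_models, 1))    # crop-model effects (larger)
climate_eff = rng.normal(0, 4, size=(1, n_climates))  # climate effects (smaller)
yield_change = model_eff + climate_eff + rng.normal(0, 2, (n_models, n_climates))

grand = yield_change.mean()
ss_model   = n_climates * ((yield_change.mean(axis=1) - grand) ** 2).sum()
ss_climate = n_models   * ((yield_change.mean(axis=0) - grand) ** 2).sum()
ss_total   = ((yield_change - grand) ** 2).sum()

print(f"crop-model share of variance: {ss_model / ss_total:.1%}")
print(f"climate share of variance:    {ss_climate / ss_total:.1%}")
print(f"interaction/residual share:   {1 - (ss_model + ss_climate)/ss_total:.1%}")
```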

11.
Efforts to model human exposures to chemicals are growing more sophisticated and encompass increasingly complex exposure scenarios. The scope of such analyses has increased, growing from assessments of single exposure pathways to complex evaluations of aggregate or cumulative chemical exposures occurring within a variety of settings and scenarios. In addition, quantitative modeling techniques have evolved from simple deterministic analyses using single point estimates for each necessary input parameter to more detailed probabilistic analyses that can accommodate distributions of input parameters and assessment results. As part of an overall effort to guide development of a comprehensive framework for modeling human exposures to chemicals, available information resources needed to derive input parameters for human exposure assessment models were compiled and critically reviewed. Ongoing research in the area of exposure assessment parameters was also identified. The results of these efforts are summarized and other relevant information that will be needed to apply the available data in a comprehensive exposure model is discussed. Critical data gaps in the available information are also identified. Exposure assessment modeling and associated research would benefit from the collection of additional data as well as by enhancing the accessibility of existing and evolving information resources.

12.
The Precautionary Principle is in sharp political focus today because (1) the nature of scientific uncertainty is changing and (2) there is increasing pressure to base governmental action on more “rational” schemes, such as cost-benefit analysis and quantitative risk assessment, the former being an embodiment of ‘rational choice theory’ promoted by the Chicago school of law and economics. The Precautionary Principle has been criticized as being both too vague and too arbitrary to form a basis for rational decision making. The assumption underlying this criticism is that any scheme not based on cost-benefit analysis and risk assessment is both irrational and without secure foundation in either science or economics. This paper contests that view and makes explicit the rational tenets of the Precautionary Principle within an analytical framework as rigorous as uncertainties permit, one that mirrors democratic values embodied in regulatory, compensatory, and common law. Unlike other formulations that reject risk assessment, this paper argues that risk assessment can be used within the formalism of tradeoff analysis, a more appropriate alternative to traditional cost-benefit analysis and one that satisfies the need for well-grounded public policy decision making. It further argues that the precautionary approach is the most appropriate basis for policy, even when large uncertainties do not exist, especially where the fairness of the distribution of costs and benefits of hazardous activities and products is a concern. Finally, it offers an approach to making decisions within an analytic framework, based on equity and justice, to replace the economic paradigm of utilitarian cost-benefit analysis.

13.
Since West Nile virus (WNV) was introduced to New York City in 1999, it has spread through the Americas, creating human and animal health risks. Our equine risk assessment focused on three pyrethroid insecticides (phenothrin, resmethrin, and permethrin), pyrethrins, and two organophosphate insecticides (malathion and naled). Piperonyl butoxide, a synergist commonly used with pyrethroids, was also assessed. The objective was to use deterministic and probabilistic risk assessment methodologies to evaluate health risks to horses from vector management tactics used for control of adult mosquitoes. Our exposure estimates were derived from the Kenaga nomogram for food deposition, AgDRIFT® for deposition onto soil and hair, AERMOD for ambient air concentrations, and PRZM-EXAMS for water concentrations. We used the risk quotient (RQ) method for our assessment, with the RQ level of concern (LOC) set at 1.0. RQs were determined by comparing exposure estimates to no-observable-effect levels (NOELs). Acute deterministic RQs ranged from 0.0004 for phenothrin to 0.2 for naled. Subchronic deterministic RQs ranged from 0.001 for phenothrin to 0.6 for naled. The probabilistic assessment revealed that the deterministic acute and subchronic RQ estimates were highly conservative. Our assessment revealed that risks to horses from insecticides used against adult mosquitoes are low and not likely to exceed the LOC.
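
A minimal sketch of the risk-quotient screen itself (the exposure and NOEL numbers below are invented placeholders, not the study's values):

```python
# RQ = exposure / NOEL, flagged against a level of concern of 1.0.
# All values are hypothetical.
NOEL = {"phenothrin": 500.0, "naled": 10.0}      # mg/kg bw/day, assumed
exposure = {"phenothrin": 0.2, "naled": 2.0}     # mg/kg bw/day, assumed
LOC = 1.0

for chem in NOEL:
    rq = exposure[chem] / NOEL[chem]
    flag = "exceeds LOC" if rq > LOC else "below LOC"
    print(f"{chem:10s} RQ={rq:.4f}  ({flag})")
```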

14.
Goal, Scope and Background

The paper describes different ecotoxicity effect indicator methods/approaches. The approaches cover three main groups, viz. PNEC approaches, PAF approaches and damage approaches. Ecotoxicity effect indicators used in life cycle impact assessment (LCIA) are typically modelled to the level of impact, indicating the potential impact on 'ecosystem health'. The few existing indicators that are modelled all the way to damage are poorly developed, and even though relevant alternatives from risk assessment exist (e.g. recovery time and mean extinction time), these are unfortunately at a very early stage of development, and only few attempts have been made to include them in LCIA.

Methods

The approaches are described and evaluated against a set of assessment criteria comprising compatibility with the methodological requirements of LCIA, environmental relevance, reproducibility, data demand, data availability, quantification of uncertainty, transparency and spatial differentiation.

Results and Discussion

The evaluation of the two impact approaches (i.e. PNEC and PAF) shows both pros and cons for each of them. The assessment-factor-based PNEC approaches have a low data demand and use only the lowest data (e.g. the lowest NOEC value). Because this approach was developed within tiered risk assessment, and hence makes use of conservative assessment factors, it is not optimal, in its present form, for the comparative framework of LCIA, where best estimates are sought. The PAF approaches have a higher data demand but use all data and can be based on effect data (PNEC is no-effect-based), making these approaches non-conservative and more suitable for LCIA. However, indiscriminate use of ecotoxicity data tends to make the PAF approaches no more environmentally relevant than the assessment-factor-based PNEC approaches. The PAF approaches can, however, at least in theory be linked to damage modelling. All the damage-modelling approaches included here have high environmental relevance but very low data availability, apart from the 'media recovery' approach, which depends directly on the fate model. They are all at a very early stage of development.

Conclusions, Recommendations and Outlook

An analysis of the different PAF approaches shows that the crucial point is according to which principles, and based on which data, the hazardous concentration for 50% of the included species (HC50) is estimated. The ability to calculate many characterisation factors for ecotoxicity is important for this impact category to be included properly in LCIA. However, access to effect data for the relevant chemicals is typically limited. So, besides the coupling to damage modelling, the main challenge in the further development and improvement of ecotoxicity effect indicators is to find an optimal method for estimating HC50 from little data.
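
A sketch of one common way to estimate the HC50 discussed above, assuming a handful of hypothetical per-species EC50 values: take the geometric mean of the log-transformed toxicity data, which equals the median of a lognormal SSD fitted to those data.

```python
# HC50 as the geometric mean of per-species toxicity values.
# The EC50s below are invented, not from a real dataset.
import numpy as np

ec50 = np.array([12.0, 45.0, 3.5, 150.0, 28.0])   # ug/L, hypothetical EC50s
hc50 = np.exp(np.log(ec50).mean())                # geometric mean = lognormal median
print(f"HC50 ~ {hc50:.1f} ug/L from {ec50.size} species")
```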

15.
Application of uncertainty and variability in LCA
As yet, the application of an uncertainty and variability analysis is not common practice in LCAs. A proper analysis will be facilitated when it is clear which types of uncertainty and variability exist in LCAs and which tools are available to deal with them. Therefore, a framework is developed to classify types of uncertainty and variability in LCAs. Uncertainty is divided into (1) parameter uncertainty, (2) model uncertainty, and (3) uncertainty due to choices, while variability covers (4) spatial variability, (5) temporal variability, and (6) variability between objects and sources. A tool to deal with parameter uncertainty and variability between objects and sources, in both the inventory and the impact assessment, is probabilistic simulation. Uncertainty due to choices can be dealt with in a scenario analysis or reduced by standardisation and peer review. The feasibility of dealing with temporal and spatial variability is limited, implying model uncertainty in LCAs. Other model uncertainties can be partly reduced by more sophisticated modelling, such as the use of non-linear inventory models in the inventory and multimedia models in the characterisation phase.
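
Probabilistic simulation of parameter uncertainty can be sketched with a toy two-process inventory model; all parameter values, distributions, and the impact factors below are invented for illustration.

```python
# Monte Carlo propagation of parameter uncertainty through a toy LCA
# inventory. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(9)
n = 50_000

electricity = rng.lognormal(np.log(2.0), 0.15, n)    # kWh per functional unit
ef_power    = rng.lognormal(np.log(0.5), 0.10, n)    # kg CO2-eq per kWh
transport   = rng.lognormal(np.log(10.0), 0.25, n)   # tkm per functional unit
ef_truck    = rng.lognormal(np.log(0.1), 0.20, n)    # kg CO2-eq per tkm

impact = electricity * ef_power + transport * ef_truck   # kg CO2-eq per FU
p5, p50, p95 = np.percentile(impact, [5, 50, 95])
print(f"GWP score: median {p50:.2f}, 90% interval [{p5:.2f}, {p95:.2f}] kg CO2-eq")
```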

16.
We compared the effect of uncertainty in dose-response model form on health risk estimates to the effect of uncertainty and variability in exposure. We used three different dose-response models to characterize neurological effects in children exposed in utero to methylmercury, and applied these models to calculate risks to a native population exposed to potentially contaminated fish from a reservoir in British Columbia. Uncertainty in model form was explicitly incorporated into the risk estimates. The selection of dose-response model strongly influenced both mean risk estimates and distributions of risk, and had a much greater impact than altering exposure distributions. We conclude that incorporating uncertainty in dose-response model form is at least as important as accounting for variability and uncertainty in exposure parameters in probabilistic risk assessment.
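
Folding model-form uncertainty into a Monte Carlo run can be sketched by letting each iteration draw both an exposure dose and one of several candidate dose-response forms. The three forms, their parameters, and the equal model weights below are illustrative assumptions, not the study's models.

```python
# Model-form uncertainty via random selection among candidate
# dose-response curves. All forms and parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(13)
n = 100_000

dose = rng.lognormal(np.log(1.0), 0.5, n)        # exposure dose, arbitrary units

models = [
    lambda d: 0.05 * d,                          # linear
    lambda d: 0.02 * d**1.5,                     # power
    lambda d: 0.3 * (1 - np.exp(-0.2 * d)),      # saturating exponential
]
pick = rng.integers(0, len(models), n)           # equal model weights (an assumption)
risk = np.empty(n)
for i, m in enumerate(models):
    sel = pick == i
    risk[sel] = m(dose[sel])

print(f"mean risk {risk.mean():.4f}, 95th percentile {np.percentile(risk, 95):.4f}")
```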

17.
Conventional risk assessment practices apply a tenfold uncertainty factor (UF) to extrapolate from the general human population to sensitive subgroups, such as children and geriatric patients. This study evaluated whether the tenfold UF can be reduced when pharmacokinetic and pharmacodynamic data for pharmaceuticals used by children and geriatric patients are incorporated into the risk assessment for human sensitivity. Composite factors (kinetics × dynamics) were calculated from data-derived values for bumetanide, furosemide, metoprolol, atenolol, naproxen, and ibuprofen. For the compounds examined, all of the composite factors were lower than 10. Furthermore, 8 of the 12 composite factors were less than 5.5. Incorporating human kinetic and dynamic data into risk assessment can help reduce the uncertainties associated with sensitive subgroups, and further study is encouraged.
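
The composite-factor arithmetic can be sketched directly; the kinetic and dynamic ratios below are invented placeholders, not the study's data-derived values for the named drugs.

```python
# Composite factor = kinetic ratio x dynamic ratio, compared with the
# default tenfold UF. All ratios are hypothetical.
factors = {                       # (kinetic, dynamic) sensitivity ratios, assumed
    "drug_A": (2.1, 1.8),
    "drug_B": (1.4, 2.5),
    "drug_C": (3.0, 1.2),
}
for drug, (kin, dyn) in factors.items():
    composite = kin * dyn
    verdict = "< 10 (default UF looks conservative)" if composite < 10 else ">= 10"
    print(f"{drug}: composite factor {composite:.1f}  {verdict}")
```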

18.
Modelling groundwater depths in floodplains and peatlands remains a basic approach to assessing the hydrological conditions of habitats. Groundwater flow models used to compute groundwater heads are known for their uncertainties, and the calibration of these models and the uncertainty assessment of parameters remain fundamental steps in providing reliable data. However, the elevation data used to determine the geometry of model domains are frequently considered deterministic and hence are seldom treated as a source of uncertainty in model-based groundwater level estimations. Knowing that even cutting-edge laser-scanning-based digital elevation models contain errors due to vegetation effects and scanning failures, we provide an assessment of the uncertainty of water level estimations, which remain basic data for wetland ecosystem assessment and management. We found that the uncertainty of the digital elevation model (DEM) significantly influenced the results of the assessment of the habitat’s hydrological conditions expressed as groundwater depths. In extreme cases, a habitat whose average habitat suitability index (HSI), assessed deterministically, was classed as ‘unsuitable’ reached a 40% probability of ‘optimum’ or ‘tolerant’ under the probabilistic (grid-cell-scale) approach. For the 24 habitats analysed, we revealed vast differences between HSI scores calculated for individual grid cells of the model and HSI scores computed as average values from the sets of grid cells located within the habitat patches. We conclude that groundwater-modelling-based decision support for wetland assessment can lead to incorrect management if the quality of the DEM is not addressed in studies referring to groundwater depths.
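
The grid-cell probabilistic idea can be sketched for a single cell: treat the DEM elevation as uncertain, recompute groundwater depth per realization, and report the probability that depth falls within a habitat's 'optimum' range. The elevation, head, error magnitude, and suitability range below are all invented.

```python
# Per-cell probabilistic groundwater depth under DEM uncertainty.
# All values are hypothetical.
import numpy as np

rng = np.random.default_rng(21)
n_real = 10_000

dem_value = 12.40          # m a.s.l., deterministic DEM elevation of one cell
dem_sigma = 0.15           # m, assumed DEM error (vegetation/scanning effects)
head      = 11.95          # m a.s.l., modelled groundwater head at the cell

depth = (dem_value + rng.normal(0.0, dem_sigma, n_real)) - head   # m below surface
optimum = (0.2 <= depth) & (depth <= 0.5)                         # assumed range

print(f"deterministic depth: {dem_value - head:.2f} m")
print(f"P(depth in optimum range): {optimum.mean():.1%}")
```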

19.

Purpose

Life cycle costing (LCC) is a state-of-the-art method to analyze investment decisions in infrastructure projects. However, uncertainties inherent in long-term planning call the credibility of LCC results into question. Previous research has not systematically linked sources of uncertainty to the methods that address them. Part I of this series develops a framework to collect and categorize different sources of uncertainty and addressing methods. This systematization is a prerequisite for further analyzing the suitability of methods and levels the playing field for part II.

Methods

Past reviews have dealt with selected issues of uncertainty in LCC. However, none has systematically collected uncertainties and linked them to methods that address them. No comprehensive categorization has been published to date. Part I addresses these two research gaps by conducting a systematic literature review. In a rigorous four-step approach, we first scrutinized major databases. Second, we performed a practical and methodological screening to identify a total of 115 relevant publications, mostly case studies. Third, we applied content analysis using MAXQDA. Fourth, we illustrated the results and drew conclusions on the research gaps.

Results and discussion

We identified 33 sources of uncertainty and 24 addressing methods. Sources of uncertainty were categorized according to (i) their origin, i.e., parameter, model, and scenario uncertainty, and (ii) the nature of uncertainty, i.e., aleatoric or epistemic uncertainty. The methods to address uncertainties were classified into deterministic, probabilistic, possibilistic, and other methods. Among the sources of uncertainty, lack of data and data quality were analyzed most often. Most of the uncertainties discussed were located in the use stage. Among the methods, sensitivity analyses were applied most widely, while more complex methods such as Bayesian models were used less frequently. Data availability and the individual expertise of the LCC practitioner foremost influence the selection of methods.

Conclusions

This article complements existing research by providing a thorough systematization of uncertainties in LCC. However, an unambiguous categorization of uncertainties is difficult, and overlaps occur. Such a systematizing approach is nevertheless necessary for further analyses and levels the playing field for readers not yet familiar with the topic. Part I concludes the following: First, an investigation of which methods are best suited to address a certain type of uncertainty is still outstanding. Second, an analysis of the types of uncertainty that have been insufficiently addressed in previous LCC cases is still missing. Part II will focus on these research gaps.

20.
This article reviews the status of comparative risk assessment within the context of environmental decision-making; evaluates its potential application as a decision-making framework for selecting alternative technologies for dredged material management; and makes recommendations for implementing such a framework. One of the most important points from this review for decision-making is that comparative risk assessment, however conducted, is an inherently subjective, value-laden process. There is some objection to this lack of total scientific objectivity (“hard version” of comparative risk assessment). However, the “hard versions” provide little help in suggesting a method that surmounts the psychology of choice in decision-making schemes. The application of comparative risk assessment in the decision-making process at dredged material management facilities will have an element of value and professional judgment in the process. The literature suggests that the best way to incorporate this subjectivity and still maintain a defensible comparative framework is to develop a method that is logically consistent and allows for uncertainty by comparing risks on the basis of more than one set of criteria, more than one set of categories, and more than one set of experts. It should incorporate a probabilistic approach where necessary and possible, based on management goals.
