Similar Documents
 A total of 20 similar documents were found.
1.
Recently, there has been a growing trend toward using stochastic (probabilistic) methods in ecological and public health risk assessment. These methods are favored because they overcome the problem of compounded conservatism and allow the systematic consideration of uncertainty and variability typically encountered in risk assessment. This article demonstrates a new methodology for the analysis of uncertainty in risk assessment using the first-order reliability method (FORM). The reliability method is formulated such that the probability that incremental lifetime cancer risk exceeds a predefined threshold level is calculated. Furthermore, the stochastic sensitivity of this probability with respect to the random variables is provided. The emphasis is on exploring the different types of probabilistic sensitivity obtained through the reliability analysis. The method is applied to a case study given by Thompson et al. (1992) on cancer risk resulting from dermal contact with benzo(a)pyrene (BaP)-contaminated soils. The reliability results matched those of the Monte Carlo simulation method. On average, the Monte Carlo simulation method required about 35 times as many function evaluations as FORM to calculate the probability of exceeding the target risk level. The analysis emphasizes the significant impact that the uncertainty in the cancer potency factor has on the probabilistic modeling results compared with other parameters.
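As a hedged illustration of the exceedance-probability calculation described above (not the paper's actual dermal-contact model), the sketch below assumes a purely multiplicative risk model with lognormal inputs, for which the FORM result P(risk > target) = Φ(−β) happens to be exact and can be checked against crude Monte Carlo. All input medians, geometric standard deviations, and the target risk level are invented for illustration.

```python
# Hedged sketch: lognormal-product risk model, so FORM's Phi(-beta) is exact here.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative lognormal inputs: (median, geometric standard deviation) -- assumptions
inputs = {"potency": (1e-1, 2.5), "exposure": (1e-4, 1.8), "absorption": (0.1, 1.5)}
target_risk = 1e-5

# Lognormal algebra: ln(risk) is normal with summed means and variances
mu = sum(np.log(med) for med, _ in inputs.values())
sigma = np.sqrt(sum(np.log(gsd) ** 2 for _, gsd in inputs.values()))

beta = (np.log(target_risk) - mu) / sigma      # reliability index
p_form = norm.sf(beta)                         # FORM exceedance probability (exact here)

# Crude Monte Carlo check
n = 100_000
risk = np.ones(n)
for med, gsd in inputs.values():
    risk *= rng.lognormal(np.log(med), np.log(gsd), n)
p_mc = np.mean(risk > target_risk)

print(f"P(exceed target) FORM = {p_form:.4f}, Monte Carlo = {p_mc:.4f}")
```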

2.
Groundwater modeling typically relies on hypotheses and approximations of reality, as real hydrologic systems are far more complex than we can mathematically characterize. Such model errors cannot be neglected when analyzing the uncertainty of model predictions in practical applications. As the scale and complexity increase, the associated uncertainties grow dramatically. In this study, a Bayesian uncertainty analysis method for a deterministic model's predictions is presented. The geostatistics of hydrogeologic parameters obtained from site characterization are treated as the prior parameter distribution in Bayes' theorem. The Markov chain Monte Carlo method is then used to generate the posterior statistical distribution of the model's predictions, conditional on the observed hydrologic system behaviors. Finally, a series of synthetic examples is given by applying this method to a MODFLOW pumping test model, to test its capability and efficiency in assessing various sources of the model's prediction uncertainty. The impacts of parameter sensitivity, model simplification, and observation errors on prediction uncertainty are evaluated, respectively. The results are analyzed statistically to provide deterministic predictions with associated prediction errors. Risk analysis is also derived from the Bayesian results to draw tradeoff curves for decision-making about the exploitation of groundwater resources.
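The sketch below is a minimal illustration of the Bayesian workflow described in this abstract, with a toy analytic forward model standing in for MODFLOW and a simple random-walk Metropolis sampler standing in for the full Markov chain Monte Carlo machinery. The prior, observation error, and synthetic observations are all assumptions made for the example.

```python
# Minimal Metropolis MCMC sketch; the toy forward model is a stand-in for MODFLOW.
import numpy as np

rng = np.random.default_rng(1)

def forward(log_K):
    """Toy forward model: predicted drawdowns at three wells for a given ln(K)."""
    return np.array([1.0, 0.6, 0.3]) / np.exp(log_K)

obs = forward(np.log(5.0)) + rng.normal(0, 0.02, 3)   # synthetic observations
sigma_obs = 0.02                                      # observation error (assumed)
prior_mu, prior_sd = np.log(3.0), 0.5                 # lognormal prior on K from "site data"

def log_post(log_K):
    resid = obs - forward(log_K)
    log_lik = -0.5 * np.sum((resid / sigma_obs) ** 2)
    log_prior = -0.5 * ((log_K - prior_mu) / prior_sd) ** 2
    return log_lik + log_prior

chain, cur = [], prior_mu
cur_lp = log_post(cur)
for _ in range(20_000):
    prop = cur + rng.normal(0, 0.05)                  # random-walk proposal
    prop_lp = log_post(prop)
    if np.log(rng.uniform()) < prop_lp - cur_lp:      # Metropolis acceptance
        cur, cur_lp = prop, prop_lp
    chain.append(cur)

post_K = np.exp(chain[5000:])                         # discard burn-in
pred = np.array([forward(np.log(k)) for k in post_K]) # posterior predictive drawdowns
print("posterior K: mean =", post_K.mean(), " 95% CI =", np.percentile(post_K, [2.5, 97.5]))
print("predicted drawdown at well 1, 95% interval:", np.percentile(pred[:, 0], [2.5, 97.5]))
```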

3.
The importance of fitting distributions to data for risk analysis continues to grow as regulatory agencies, like the Environmental Protection Agency (EPA), continue to shift from deterministic to probabilistic risk assessment techniques. The use of Monte Carlo simulation as a tool for propagating variability and uncertainty in risk requires specification of the risk model's inputs in the form of distributions or tables of data. Several software tools exist to support risk assessors in their efforts to develop distributions. However, users must keep in mind that these tools do not replace clear thought about the judgments that must be made in characterizing the information from data. This overview introduces risk assessors to the statistical concepts and physical reasons that support important judgments about appropriate types of parametric distributions and goodness-of-fit. In the context of using data to improve risk assessment and ultimately risk management, this paper discusses issues related to the nature of the data (representativeness, quantity and quality, correlation with space and time, and the distinction between variability and uncertainty for a set of data) and the appropriate matching of data and distributions. All data analysis (whether “Frequentist” or “Bayesian” or oblivious to the distinction) requires the use of subjective judgment. The paper offers an iterative process for developing distributions using data to characterize variability and uncertainty for inputs to risk models, one that provides incentives for collecting better information when the value of information exceeds its cost. Risk analysts need to focus attention on characterizing the information appropriately for the purposes of the risk assessment (and the risk management questions at hand), not on characterization for its own sake.
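A minimal sketch of the fitting-plus-goodness-of-fit step discussed here, assuming a hypothetical positive-valued concentration data set and three common candidate distributions; it shows only the mechanics, since (as the abstract stresses) the final choice of distribution also rests on judgment about the physical process and the representativeness of the data.

```python
# Hedged sketch: fit candidate distributions and compare goodness-of-fit statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.lognormal(mean=1.0, sigma=0.6, size=50)     # stand-in for measured data

candidates = {
    "lognorm": stats.lognorm,
    "gamma": stats.gamma,
    "weibull_min": stats.weibull_min,
}

for name, dist in candidates.items():
    params = dist.fit(data, floc=0)                    # fix location at zero for positive data
    # Note: KS p-values are optimistic when parameters are fit to the same data
    ks_stat, p_value = stats.kstest(data, name, args=params)
    aic = 2 * len(params) - 2 * np.sum(dist.logpdf(data, *params))
    print(f"{name:12s}  KS = {ks_stat:.3f}  p = {p_value:.2f}  AIC = {aic:.1f}")
```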

4.
The traditional q1* methodology for constructing upper confidence limits (UCLs) for the low-dose slopes of quantal dose-response functions has two limitations: (i) it is based on an asymptotic statistical result that has been shown via Monte Carlo simulation not to hold in practice for small, real bioassay experiments (Portier and Hoel, 1983); and (ii) it assumes that the multistage model (which represents cumulative hazard as a polynomial function of dose) is correct. This paper presents an uncertainty analysis approach for fitting dose-response functions to data that does not require specific parametric assumptions or depend on asymptotic results. It has the advantage that the resulting estimates of the dose-response function (and uncertainties about it) no longer depend on the validity of an assumed parametric family nor on the accuracy of the asymptotic approximation. The method derives posterior densities for the true response rates in the dose groups, rather than deriving posterior densities for model parameters, as in other Bayesian approaches (Sielken, 1991), or resampling the observed data points, as in the bootstrap and other resampling methods. It does so by conditioning constrained maximum-entropy priors on the observed data. Monte Carlo sampling of the posterior (constrained, conditioned) probability distributions generates values of response probabilities that might be observed if the experiment were repeated with very large sample sizes. A dose-response curve is fit to each such simulated dataset. If no parametric model has been specified, then a generalized representation (e.g., a power-series or orthonormal polynomial expansion) of the unknown dose-response function is fit to each simulated dataset using “model-free” methods. The simulation-based frequency distribution of all the dose-response curves fit to the simulated datasets yields a posterior distribution function for the low-dose slope of the dose-response curve. An upper confidence limit on the low-dose slope is obtained directly from this posterior distribution. This “Data Cube” procedure is illustrated with a real dataset for benzene, and is seen to produce more policy-relevant insights than does the traditional q1* methodology. For example, it shows how far apart the 90%, 95%, and 99% limits are and reveals how uncertainty about total and incremental risk varies with dose level (typically being dominated at low doses by uncertainty about the response of the control group, and at high doses by sampling variability). Strengths and limitations of the Data Cube approach are summarized, and potential decision-analytic applications to making better-informed risk management decisions are briefly discussed.
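The sketch below illustrates only the general resampling idea, not the Data Cube procedure itself: Beta posteriors with uniform priors stand in for the paper's constrained maximum-entropy priors, and a simple linear extra-risk fit stands in for its "model-free" dose-response expansion. The dose levels and tumour counts are invented for illustration.

```python
# Hedged sketch: sample plausible group response rates, fit a slope to each sample,
# and read upper confidence limits for the low-dose slope off the resulting distribution.
import numpy as np

rng = np.random.default_rng(3)
dose      = np.array([0.0, 1.0, 10.0, 50.0])     # hypothetical dose groups
n_animals = np.array([50, 50, 50, 50])
responses = np.array([1, 2, 6, 20])              # hypothetical tumour counts

slopes = []
for _ in range(10_000):
    # Beta posterior (uniform prior) on each group's true response rate
    p = rng.beta(responses + 1, n_animals - responses + 1)
    extra_risk = (p - p[0]) / (1.0 - p[0])       # extra risk over the control group
    slopes.append(np.polyfit(dose, extra_risk, 1)[0])   # low-dose slope of a linear fit

slopes = np.array(slopes)
print("median slope:", np.median(slopes))
print("90/95/99% upper limits:", np.percentile(slopes, [90, 95, 99]))
```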

5.
    
The selection of the most appropriate model for an ecological risk assessment depends on the application, the data and resources available, the knowledge base of the assessor, the relevant endpoints, and the extent to which the model deals with uncertainty. Since ecological systems are highly variable and our knowledge of model input parameters is uncertain, it is important that models include treatments of uncertainty and variability, and that results are reported in this light. In this paper we discuss treatments of variation and uncertainty in a variety of population models. In ecological risk assessments, the risk relates to the probability of an adverse event in the context of environmental variation. Uncertainty relates to ignorance about parameter values, e.g., measurement error and systematic error. An assessment of the full distribution of risks, under variability and parameter uncertainty, will give the most comprehensive and flexible endpoint. In this paper we present the rationale behind probabilistic risk assessment, identify the sources of uncertainty relevant for risk assessment and provide an overview of a range of population models. While all of the models reviewed have some utility in ecology, some have more comprehensive treatments of uncertainty than others. We identify the models that allow probabilistic assessments and sensitivity analyses, and we offer recommendations for further developments that aim towards more comprehensive and reliable ecological risk assessments for populations.

6.
Liu Changfeng, Hou Ying, Chen Weiping, Cui Haotian. Acta Ecologica Sinica, 2021, 41(9): 3343-3353
Rapid urbanization causes losses of ecosystem services in peri-urban areas and gives rise to ecological risk. Taking multiple types of ecosystem services as assessment endpoints, this study developed a method for characterizing the ecological risk of urbanizing regions based on the monetary value of ecosystem services, applied the method to Beijing as a case study, and performed uncertainty and parameter sensitivity analyses of the risk assessment results. The case study shows that in 2015 the overall ecological risk of Beijing was at a low level approaching a medium level: low-risk and very-low-risk areas accounted for more than 50% of the city and were mainly distributed in the western and northern parts of Beijing, while high-risk and very-high-risk areas accounted for about 20% and were mainly concentrated in the central urban districts. The spatial pattern of ecological risk indicates that the expansion of Beijing's urban area has reduced ecosystem services in the surrounding regions and thereby raised the level of ecological risk. The proposed ecological risk index has a significant linear relationship with the ecosystem service equivalent factor and can be used to estimate the value of ecosystem services. The uncertainty and parameter sensitivity analyses show that the calculated ecological risk index varies little and is therefore highly reliable. The method provides an integrated characterization of ecological risk in urbanizing regions, yields quantitative results that are easy for decision makers to understand, and is of practical value for risk assessment and management.

7.
An integrated simulation-assessment modeling approach for analyzing environmental risks of groundwater contamination is proposed in this paper. It incorporates an analytical groundwater solute transport model, an exposure dose model, and a fuzzy risk assessment model within a general framework. The transport model is used for predicting contaminant concentrations in the subsurface, and the exposure dose model is used for calculating contaminant ingestion during the exposure period under given exposure pathways. Both models are solved through the Monte Carlo simulation technique to reflect the associated uncertainties. Based on consideration of fuzzy relationships between exposure doses and cancer risks, risk levels of different exposure doses for each contaminant can be calculated to form a fuzzy relation matrix. The overall risks can then be quantified through further fuzzy synthesizing operations. Thus, probabilistic quantification of different risk levels (possibilities) can be realized. Results of the case study indicate that environmental risks at the waste landfill site can be effectively analyzed through the developed methodology and are useful for supporting related risk-management and remediation decisions.
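As a hedged sketch of the coupled Monte Carlo and fuzzy steps, the example below simulates exposure doses, maps each realization onto fuzzy risk levels through simple membership functions, and averages the memberships into possibilities for each level (one row of a fuzzy relation matrix). The dose distribution and membership breakpoints are assumptions, not values from the case study.

```python
# Hedged sketch of Monte Carlo exposure doses mapped to fuzzy risk-level possibilities.
import numpy as np

rng = np.random.default_rng(4)

def ramp_down(x, a, b):
    """Membership 1 below a, decreasing linearly to 0 at b."""
    return np.clip((b - x) / (b - a), 0.0, 1.0)

def ramp_up(x, a, b):
    """Membership 0 below a, increasing linearly to 1 at b."""
    return np.clip((x - a) / (b - a), 0.0, 1.0)

def tri(x, a, b, c):
    """Triangular membership with support [a, c] and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Monte Carlo exposure doses (mg/kg-day), e.g. from a transport + intake model (assumed)
dose = rng.lognormal(np.log(2e-3), 0.8, 50_000)

# Fuzzy risk levels defined over dose (illustrative breakpoints)
levels = {
    "low":    lambda d: ramp_down(d, 1e-3, 3e-3),
    "medium": lambda d: tri(d, 1e-3, 3e-3, 8e-3),
    "high":   lambda d: ramp_up(d, 3e-3, 8e-3),
}

# One contaminant's row of the fuzzy relation matrix: mean membership over realizations
possibilities = {name: float(np.mean(mf(dose))) for name, mf in levels.items()}
print(possibilities)
```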

8.
Quantitative uncertainty analysis has become a common component of risk assessments. In risk assessment models, the most robust method for propagating uncertainty is Monte Carlo simulation. Many software packages available today offer Monte Carlo capabilities while requiring minimal learning time, computational time, and/or computer memory. This paper presents an evaluation of six software packages in the context of risk assessment: Crystal Ball, @Risk, Analytica, Stella II, PRISM, and Susa-PC. Crystal Ball and @Risk are spreadsheet-based programs; Analytica and Stella II are multi-level, influence-diagram-based programs designed for the construction of complex models; PRISM and Susa-PC are both public-domain programs designed for incorporating uncertainty and sensitivity into any model written in Fortran. Each software package was evaluated on the basis of five criteria, with each criterion having several sub-criteria. A ‘User Preferences Table’ was also developed for an additional comparison of the software packages. The evaluations were based on nine weeks of experimentation with the software packages, including use of the associated user manuals and testing of the software on example problems. The results of these evaluations indicate that Stella II has the most extensive modeling capabilities and can handle linear differential equations. Crystal Ball has the best input scheme for entering uncertain parameters and the best reference materials. @Risk offers a slightly better standard output scheme and requires a little less learning time. Susa-PC has the most options for detailed statistical analysis of the results, such as multiple options for sensitivity analysis and sophisticated options for inputting correlations. Analytica is a versatile, menu- and graphics-driven package, while PRISM is a more specialized and less user-friendly program. When choosing between software packages for uncertainty and sensitivity analysis, the choice largely depends on the specifics of the problem being modeled. However, for risk assessment problems that can be implemented in a spreadsheet, Crystal Ball is recommended because it offers the best input options, a good output scheme, adequate uncertainty and sensitivity analysis, superior reference materials, and an intuitive spreadsheet basis while requiring very little memory.

9.
Many vadose zone models are available for environmental remediation, but few offer procedures for verifying model predictions with field data and for dealing with uncertainties associated with model input parameters. This article presents a modified model combining a one-dimensional vadose-zone transport model and a simple groundwater mixing model with a Monte Carlo simulation (MCS) function. The modified model is applied to determine soil remedial concentrations for methyl tertiary butyl ether (MTBE). The modified model generates a distribution of MTBE groundwater concentrations at the point of compliance. This distribution can be used to estimate the risk of exceeding the groundwater quality standard for given soil remedial concentrations. In a case study, the soil remedial concentration for MTBE is established at 5 µg/kg with a 95% probability, and at 10 µg/kg with a 50% probability, that the groundwater concentration will not exceed the water quality objective of 13 µg/L. Furthermore, this study uses MCS to investigate uncertainties in the model input parameter hydraulic conductivity (K). One set of data (K1) is based on the results of hydraulic conductivity laboratory tests, and the other (K2) is based on the results of slug tests conducted in the field. As expected, the laboratory data show smaller K values than the field data. The comparison of the MCS results obtained from the two sets of K data indicates that the MTBE groundwater concentrations calculated based on K1 are generally 160 to 625% greater than those calculated based on K2 at the same percentiles of the MCS distribution. Higher soil remedial concentrations of 9 µg/kg at the 95th percentile and 19 µg/kg at the 50th percentile are then calculated based on the MCS results from K2.

10.
Mavrodi D. V., Kovalenko N. P., Sokolov S. L., Parfenyuk V. G., Kosheleva I. A., Boronin A. M. Microbiology, 2003, 72(5): 597-604
The key genes nahAc and xylE of naphthalene catabolism in fluorescent Pseudomonas spp. were detected in total soil DNA samples by the polymerase chain reaction (PCR) technique. A collection of fluorescent Pseudomonas spp. was screened for the occurrence of these genes. The results obtained show the possibility of using this approach in the goal-directed search for plasmid-containing, naphthalene-degrading fluorescent pseudomonads in soil. The distribution of the naphthalene catabolism genes in soils contaminated with creosote and petroleum products was also studied.

11.
    
Mathematical modeling is an indispensable tool for research and development in biotechnology and bioengineering. The formulation of kinetic models of biochemical networks depends on knowledge of the kinetic properties of the enzymes of the individual reactions. However, kinetic data acquired from experimental observations bring along uncertainties due to various experimental conditions and measurement methods. In this contribution, we propose a novel way to model the uncertainty in the enzyme kinetics and to predict quantitatively the responses of metabolic reactions to the changes in enzyme activities under uncertainty. The proposed methodology accounts explicitly for mechanistic properties of enzymes and physico‐chemical and thermodynamic constraints, and is based on formalism from systems theory and metabolic control analysis. We achieve this by observing that kinetic responses of metabolic reactions depend: (i) on the distribution of the enzymes among their free form and all reactive states; (ii) on the equilibrium displacements of the overall reaction and that of the individual enzymatic steps; and (iii) on the net fluxes through the enzyme. Relying on this observation, we develop a novel, efficient Monte Carlo sampling procedure to generate all states within a metabolic reaction that satisfy imposed constraints. Thus, we derive the statistics of the expected responses of the metabolic reactions to changes in enzyme levels and activities, in the levels of metabolites, and in the values of the kinetic parameters. We present aspects of the proposed framework through an example of the fundamental three‐step reversible enzymatic reaction mechanism. We demonstrate that the equilibrium displacements of the individual enzymatic steps have an important influence on kinetic responses of the enzyme. Furthermore, we derive the conditions that must be satisfied by a reversible three‐step enzymatic reaction operating far away from equilibrium in order to respond to changes in metabolite levels according to irreversible Michaelis–Menten kinetics. The efficient sampling procedure allows easy, scalable implementation of this methodology for modeling of large‐scale biochemical networks. Biotechnol. Bioeng. 2011;108: 413–423. © 2010 Wiley Periodicals, Inc.

12.
The results of quantitative risk assessments are key factors in a risk manager's decision of the necessity to implement actions to reduce risk. The extent of the uncertainty in the assessment will play a large part in the degree of confidence a risk manager has in the reported significance and probability of a given risk. The two main sources of uncertainty in such risk assessments are variability and incertitude. In this paper we use two methods, a second-order two-dimensional Monte Carlo analysis and probability bounds analysis, to investigate the impact of both types of uncertainty on the results of a food-web exposure model. We demonstrate how the full extent of uncertainty in a risk estimate can be portrayed in a way that is useful to risk managers. We show that probability bounds analysis is a useful tool for identifying the parameters that contribute the most to uncertainty in a risk estimate and how it can be used to complement established practices in risk assessment. We conclude by promoting the use of probability bounds analysis in conjunction with Monte Carlo analyses as a method for checking how plausible Monte Carlo results are in the full context of uncertainty.
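A minimal sketch of a second-order (two-dimensional) Monte Carlo of the type compared in this paper: the outer loop samples uncertain distribution parameters (incertitude), the inner loop samples inter-individual variability conditional on them, and the spread of a variability percentile across the outer loop shows the effect of incertitude. The exposure equation and all parameter ranges are illustrative stand-ins for the food-web model.

```python
# Hedged sketch of a two-dimensional (second-order) Monte Carlo exposure analysis.
import numpy as np

rng = np.random.default_rng(5)
n_outer, n_inner = 200, 2_000

p95_of_dose = []
for _ in range(n_outer):
    # Outer loop: incertitude about the variability distributions (assumed ranges)
    gm_conc  = rng.uniform(0.5, 2.0)        # geometric mean tissue concentration, mg/kg
    gsd_conc = rng.uniform(1.5, 3.0)        # geometric standard deviation
    intake_mean = rng.uniform(0.1, 0.3)     # mean intake rate, kg/day

    # Inner loop: variability across individuals, conditional on the sampled parameters
    conc   = rng.lognormal(np.log(gm_conc), np.log(gsd_conc), n_inner)
    intake = rng.normal(intake_mean, 0.05, n_inner).clip(min=0)
    dose   = conc * intake / 70.0           # mg/kg-day for a 70 kg body weight

    p95_of_dose.append(np.percentile(dose, 95))

# Spread of the 95th-percentile dose across the outer loop = effect of incertitude
print("95th-percentile dose: median =", np.median(p95_of_dose),
      " 90% uncertainty interval =", np.percentile(p95_of_dose, [5, 95]))
```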

13.
Process‐based model analyses are often used to estimate changes in soil organic carbon (SOC), particularly at regional to continental scales. However, uncertainties are rarely evaluated, and so it is difficult to determine how much confidence can be placed in the results. Our objective was to quantify uncertainties across multiple scales in a process‐based model analysis, and provide 95% confidence intervals for the estimates. Specifically, we used the Century ecosystem model to estimate changes in SOC stocks for US croplands during the 1990s, addressing uncertainties in model inputs, structure and scaling of results from point locations to regions and the entire country. Overall, SOC stocks increased in US croplands by 14.6 Tg C yr−1 from 1990 to 1995 and 17.5 Tg C yr−1 during 1995 to 2000, and uncertainties were ±22% and ±16% for the two time periods, respectively. Uncertainties were inversely related to spatial scale, with median uncertainties at the regional scale estimated at ±118% and ±114% during the early and latter parts of the 1990s, and even higher at the site scale with estimates at ±739% and ±674% for the two time periods, respectively. This relationship appeared to be driven by the amount of the SOC stock change; changes in stocks that exceeded 200 Gg C yr−1 represented a threshold where uncertainties were always lower than ±100%. Consequently, the amount of uncertainty in estimates derived from process‐based models will partly depend on the level of SOC accumulation or loss. In general, the majority of uncertainty was associated with model structure in this application, and so attaining higher levels of precision in the estimates will largely depend on improving the model algorithms and parameterization, as well as increasing the number of measurement sites used to evaluate the structural uncertainty.

14.
Soil heavy metal contamination is a major environmental concern, and the health risk associated with heavy metals is not fully explored. A combination of spatial analysis and Monte Carlo simulation was successfully used to identify the possible sources and health risk of cadmium (Cd), arsenic (As), mercury (Hg), lead (Pb), chromium (Cr), and copper (Cu) in soils collected from a rapidly developing region of China. It was found that the mean concentrations of Cd (0.17 mg/kg), As (8.74 mg/kg), Hg (0.15 mg/kg), Pb (27.28 mg/kg), and Cu (33.32 mg/kg) were greater than the soil background values. Accumulation and spatial variability of heavy metals were significantly affected by anthropogenic activities and soil properties. The risk assessment indicated that the non-carcinogenic risk was not significant. However, 95% of the total cumulative carcinogenic risk for children was greater than 1 × 10−5, implying high potential carcinogenic risk, with As and Pb as the major contributors. Ingestion of heavy metals in the soils was the main exposure pathway, compared with inhalation and dermal exposure. The concentration of heavy metals in the soils, the particulate emission factor, and the dermal exposure ratio were the major parameters affecting health risk. This study highlights the importance of assessing the health risk of direct soil exposure in studying heavy metal exposures.
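The sketch below shows the standard soil-ingestion intake equation and the resulting hazard quotient and carcinogenic risk for a single metal (As) under Monte Carlo sampling. The exposure-factor distributions and the toxicity values are illustrative assumptions in the style of EPA guidance, not the study's actual inputs.

```python
# Hedged sketch of Monte Carlo soil-ingestion exposure and risk for arsenic (child receptor).
import numpy as np

rng = np.random.default_rng(6)
n = 100_000

conc   = rng.lognormal(np.log(8.74), 0.5, n)   # As in soil, mg/kg (illustrative spread)
ing_r  = rng.triangular(50, 100, 200, n)       # child soil ingestion rate, mg/day (assumed)
ef     = 350                                   # exposure frequency, days/yr
ed     = 6                                     # exposure duration, yr
bw     = rng.normal(16, 2, n).clip(min=10)     # child body weight, kg
at_nc  = ed * 365                              # averaging time, noncarcinogenic, days
at_ca  = 70 * 365                              # averaging time, carcinogenic, days

add_nc = conc * ing_r * 1e-6 * ef * ed / (bw * at_nc)   # average daily dose, mg/kg-day
add_ca = conc * ing_r * 1e-6 * ef * ed / (bw * at_ca)

rfd, sf = 3e-4, 1.5                            # commonly cited oral RfD and slope factor for As
hq = add_nc / rfd                              # hazard quotient
cr = add_ca * sf                               # incremental carcinogenic risk

print("P(HQ > 1)    =", np.mean(hq > 1))
print("P(CR > 1e-5) =", np.mean(cr > 1e-5))
print("95th %ile CR =", np.percentile(cr, 95))
```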

15.
    
ABSTRACT

This study aims to quantitatively assess the risk of pesticides (used in Irish agriculture) and their degradation products to groundwater and human health. The assessment uses a human health Monte Carlo risk-based approach that combines the leached quantity with an exposure estimate and the No Observed Adverse Effect Level (NOAEL) as a toxicity ranking endpoint, resulting in a chemical intake toxicity ratio statistic (R) for each pesticide. A total of 34 active substances and their metabolites registered and used in the agricultural field were evaluated. MCPA ranked highest (i.e., posed the greatest human health risk), followed by desethyl-terbuthylazine and deethylatrazine (with risk ratio values of 1.1 × 10−5, 9.5 × 10−6, and 5.8 × 10−6, respectively). A sensitivity analysis revealed that the soil organic carbon content and the soil sorption coefficient were the most important parameters affecting model predictions (correlation coefficients of −0.60 and −0.58, respectively), highlighting the importance of soil and pesticide properties in influencing risk estimates. The analysis highlights the importance of taking a risk-based approach when assessing pesticide risk. The model can help to prioritize pesticides with potentially negative human health effects for monitoring programs, as opposed to traditional approaches based on pesticide leaching potential.

16.
Considerable uncertainty exists about occupational risks, future environmental health and safety (EHS) standards, and associated production and compliance costs for single‐wall carbon nanotube (SWNT) manufacturing processes. We propose and illustrate the use of risk analysis Monte Carlo (MC) models to assess cost and exposure trade‐offs of the high‐pressure carbon monoxide (HiPco) SWNT manufacturing process given these uncertainties. Assumptions regarding the timing, frequency, magnitude, and expense of EHS standards are modeled as stochastic events and examined for their impact on the expected values, variances, and probability distributions of total production costs and occupational exposure. With a better understanding of associated risks, these models can help policy makers and manufacturers explore potential EHS benefits, consequences, and trade‐offs. For example, results suggest that voluntary implementation of a low level of protection (rather than none at all) can lead to reduced cost and exposure uncertainty with insignificant increases in production costs, as well as lowering total manufacturing and liability costs, depending on the assumptions made. Conversely, slower implementation rates of higher standards produce greater uncertainty in long‐term costs and exposure. More generally, the results of this study underscore three important observations: (1) Expected costs alone are insufficient for informed decision making; (2) the best level of standards, overall cost, and optimal voluntary standards are highly dependent on uncertain health effects; and (3) the resultant amount of uncertainty in total costs and exposure can be extreme.

17.
    
Miniaturized bioreactor (MBR) systems are routinely used in the development of mammalian cell culture processes. However, scale-up of process strategies obtained in MBRs to larger scales is challenging, mainly due to non-holistic scale-up approaches. In this study, a model-based workflow is introduced to quantify differences in process dynamics between bioreactor scales and thus enable a more knowledge-driven scale-up. The workflow is applied to two case studies with antibody-producing Chinese hamster ovary cell lines. With the workflow, model parameter distributions are first estimated under consideration of experimental variability at the different scales. Second, the obtained individual model parameter distributions are tested for statistical differences. In case of significant differences, model parameter distributions are transferred between the scales. In case study I, a fed-batch process in a microtiter plate (4 ml working volume) and a lab-scale bioreactor (3750 ml working volume) was mathematically modeled and evaluated. No significant differences were identified for the model parameter distributions reflecting process dynamics. Therefore, the microtiter plate can be applied as a scale-down tool for the lab-scale bioreactor. In case study II, a fed-batch process in a 24-deep-well plate (2 ml working volume) and a shake flask (40 ml working volume) with two feed media was investigated. Model parameter distributions showed significant differences. Thus, process strategies were mathematically transferred, and model predictions were simulated for a new shake flask culture setup and confirmed in validation experiments. Overall, the workflow enables a knowledge-driven evaluation of scale-up for more efficient bioprocess design and optimization.
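As a hedged illustration of the distribution-comparison step in this workflow, the sketch below bootstraps estimates of one kinetic parameter (a specific growth rate fitted to exponential-phase data) at two hypothetical scales and tests whether the two parameter distributions differ. The growth model, noise levels, and sampling times are invented for the example.

```python
# Hedged sketch: bootstrap parameter distributions at two scales and test for a difference.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
t = np.array([0, 12, 24, 36, 48, 60])                  # culture time, h (assumed)

def fit_mu(viable_cell_density):
    """Fit a specific growth rate (1/h) by log-linear regression over the batch."""
    slope, _ = np.polyfit(t, np.log(viable_cell_density), 1)
    return slope

def bootstrap_mu(mu_true, noise_sd, n_boot=2_000):
    x0 = 0.3e6                                         # seeding density, cells/mL (assumed)
    mus = []
    for _ in range(n_boot):
        vcd = x0 * np.exp(mu_true * t) * rng.lognormal(0, noise_sd, t.size)
        mus.append(fit_mu(vcd))
    return np.array(mus)

mu_mbr   = bootstrap_mu(0.030, 0.08)                   # miniaturized bioreactor (illustrative)
mu_bench = bootstrap_mu(0.028, 0.05)                   # lab-scale bioreactor (illustrative)

stat, p = stats.ks_2samp(mu_mbr, mu_bench)
print(f"KS statistic = {stat:.3f}, p = {p:.3g}")
print("parameter distributions differ" if p < 0.05 else "no significant difference")
```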

18.
ABSTRACT

High levels of heavy metals in Panax notoginseng (Sanchi), a valued traditional Chinese medicine, have drawn increasing concern regarding the safe usage of Sanchi preparations. Here, we measured the concentrations of six heavy metals in Sanchi samples from 20 major plantations, investigated the pharmaceutical processes and usages of Sanchi preparations, and assessed the associated potential health risks to consumers. The average concentrations of chromium (Cr), copper (Cu), nickel (Ni), zinc (Zn), lead (Pb), and arsenic (As) in the Sanchi samples were 2.7, 3.7, 6.2, 22.1, 2.0, and 1.4 mg/kg, respectively. The hazard quotients (HQs) for these six single metals and the hazard index (HI) of their combination were all far less than 1, indicating the absence of a non-carcinogenic health hazard to consumers. The carcinogenic risk of As was 2.1 × 10−6, which is higher than the allowable level suggested by the U.S. Environmental Protection Agency but less than the level suggested by the World Health Organization (WHO). The probabilities that consumers' exposure due to daily medicine consumption exceeds the allowable daily intake from medicine (ADIdrug, 1% of the ADI) suggested by the WHO were 0.0%, 0.1%, 0.1%, 0.0%, 1.6%, and 27.3% for Cr, Ni, Cu, Zn, Pb, and As, respectively.

19.
    
Inventory data and characterization factors in life cycle assessment (LCA) contain considerable uncertainty. The most common method of parameter uncertainty propagation to the impact scores is Monte Carlo simulation, which remains a resource‐intensive option—probably one of the reasons why uncertainty assessment is not a regular step in LCA. An analytical approach based on Taylor series expansion constitutes an effective means to overcome the drawbacks of the Monte Carlo method. This project aimed to test the approach on a real case study, and the resulting analytical uncertainty was compared with Monte Carlo results. The sensitivity and contribution of input parameters to output uncertainty were also analytically calculated. This article outlines an uncertainty analysis of the comparison between two case study scenarios. We conclude that the analytical method provides a good approximation of the output uncertainty. Moreover, the sensitivity analysis reveals that the uncertainty of the most sensitive input parameters was not initially considered in the case study. The uncertainty analysis of the comparison of two scenarios is a useful means of highlighting the effects of correlation on uncertainty calculation. This article shows the importance of the analytical method in uncertainty calculation, which could lead to a more complete uncertainty analysis in LCA practice.
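A minimal sketch of first-order Taylor (analytical) uncertainty propagation checked against Monte Carlo, for a toy impact score defined as emission factor times activity with independent normal inputs; it also reports each input's contribution to the output variance. All numbers are illustrative and not taken from the case study.

```python
# Hedged sketch: first-order Taylor error propagation versus a Monte Carlo check.
import numpy as np

rng = np.random.default_rng(7)

mx, sx = 10.0, 2.0        # mean and std. dev. of the activity level (assumed)
my, sy = 4.0, 1.5         # mean and std. dev. of the emission factor (assumed)

def score(x, y):
    return x * y          # toy impact-score model

# First-order Taylor expansion about the means: Var(f) ~ sum of (df/dxi)^2 * Var(xi)
dfdx, dfdy = my, mx                         # partial derivatives evaluated at the means
var_taylor = dfdx**2 * sx**2 + dfdy**2 * sy**2
contrib = {                                 # each input's share of the output variance
    "activity":        dfdx**2 * sx**2 / var_taylor,
    "emission factor": dfdy**2 * sy**2 / var_taylor,
}

# Monte Carlo check of the analytical approximation
mc = score(rng.normal(mx, sx, 200_000), rng.normal(my, sy, 200_000))
print("std. dev.: Taylor =", round(float(np.sqrt(var_taylor)), 2),
      " Monte Carlo =", round(float(mc.std()), 2))
print("variance contributions:", contrib)
```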

20.
Abstract

To assess the health risks caused by soil heavy metals in China's mining areas, concentration data for eight heavy metals in 77 mines were collected from previous literature. Monte Carlo simulation was used to analyze the corresponding carcinogenic and noncarcinogenic risks, and sensitivity analysis was carried out for each parameter. The results showed that among the different types of mining areas, multi-metal mines have the highest carcinogenic risk, followed by tungsten and antimony mines; their carcinogenic risk values are all greater than 10−4, which is unacceptable. Pb is the heavy metal with the highest noncarcinogenic risk: the log-transformed value is 3.2, which is much larger than the threshold of 0; Pb is followed by As and Hg. Therefore, Pb, As, and Hg are the heavy metals that should be controlled preferentially in polluted mining areas. Sensitivity analysis showed that the soil ingestion rate, exposure frequency, and pollutant concentration level are the factors that have the greatest impacts on health risks. More attention should be paid to these factors when addressing heavy metal pollution in mining areas. In addition, for the surveyed mines, children had a greater health risk than adults, so children should be given extra attention.
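The sketch below illustrates the Monte Carlo sensitivity step for a single metal (Pb) and a child receptor: sample the main exposure inputs, compute the noncarcinogenic hazard quotient, and rank the inputs by Spearman correlation with the output. All distributions, exposure factors, and the provisional Pb reference dose are assumptions for illustration.

```python
# Hedged sketch of Monte Carlo hazard-quotient calculation with rank-correlation sensitivity.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(8)
n = 50_000

samples = {
    "conc_Pb":  rng.lognormal(np.log(300), 0.8, n),   # soil Pb, mg/kg (illustrative)
    "ing_rate": rng.triangular(50, 100, 200, n),      # soil ingestion rate, mg/day
    "exp_freq": rng.triangular(180, 350, 365, n),     # exposure frequency, days/yr
    "body_wt":  rng.normal(16, 2, n).clip(min=10),    # child body weight, kg
}

ed = 6                                                # exposure duration, yr (children)
at = ed * 365                                         # averaging time, days (noncarcinogenic)
rfd_pb = 3.5e-3                                       # provisional oral RfD for Pb used in many soil studies

add = (samples["conc_Pb"] * samples["ing_rate"] * 1e-6 *
       samples["exp_freq"] * ed / (samples["body_wt"] * at))   # mg/kg-day
hq = add / rfd_pb

for name, x in samples.items():
    rho, _ = spearmanr(x, hq)                         # rank correlation as a sensitivity measure
    print(f"{name:10s}  Spearman rho = {rho:+.2f}")
print("P(HQ > 1) =", np.mean(hq > 1))
```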
