20 similar documents found; search took 0 ms
1.
A number of programs within the U.S. Environmental Protection Agency (USEPA) currently set less-than-lifetime exposure limits in addition to the chronic reference dose (RfD) and reference concentration (RfC). A review of procedures within the USEPA for setting reference values suggests that less-than-lifetime reference values should be more routinely developed and captured in the USEPA's online IRIS database, where chronic RfDs and RfCs, as well as cancer slope factors, are currently available. A review of standard testing study protocols was conducted to determine what data were available for setting acute, short-term, and longer-term reference values, as well as chronic values. This review was done from the point of view of endpoints assessed for specific organ systems (both structural and functional), life stages covered by exposure and outcome, durations of exposure covered and the outcomes evaluated for each, and evaluation of latency to response and/or reversibility of effects. This review revealed a number of data gaps and research needs, including the need for an acute and/or short-term testing protocol that can be used to set acute and short-term reference values; a strategy for when to conduct more extensive testing based on initial screening data or other information (e.g., chemical class, pharmacokinetics, mode of action); additional standard testing guideline protocols to allow more complete assessment of certain organ systems and life stages; development of pharmacokinetic data for different life stages; toxicity related to aging; and latency to response, particularly long-term latency as a result of developmental exposures. The implications of this review are discussed relative to characterizing hazard data for setting reference values, and the potential effects on uncertainty factors and low-dose extrapolation.
2.
The U.S. Environmental Protection Agency (USEPA) has been reviewing several approaches to testing and risk assessment related to implementation of the Food Quality Protection Act (FQPA) and the Amendments to the Safe Drinking Water Act (SDWA), both signed into law in 1996. Based on recommendations from a review of issues related to children's health protection under these laws, the USEPA established the RfD Technical Panel to evaluate in depth the current reference dose (RfD) and reference concentration (RfC) process in general, and in particular with respect to how well children and other potentially sensitive subpopulations are protected. The RfD Technical Panel also was asked to consider scientific issues that have become of greater concern in RfD and RfC derivation (e.g., neurotoxicity, immunotoxicity), and to raise issues that should be explored or developed further for application in the RfD/RfC process. This paper provides the current status of the activities of the RfD Technical Panel. The Technical Panel has recommended that acute, short-term, and intermediate reference values should be set for chemicals, where possible, and that these values should be incorporated into the USEPA's Integrated Risk Information System (IRIS) Database. A review of current testing procedures is underway, including the endpoints assessed, life stages covered by exposure and outcome evaluation, and information that can be derived from current protocols on various durations of exposure. Data gaps identified for risk assessment include the types of pharmacokinetic data that should be collected, especially for developmental toxicity studies, the impact of aging on toxic responses occurring after early exposure as well as concomitant with exposure in old age, and information available on latency to response. The implications of the RfD Technical Panel's recommendations for various uncertainty factors are also being explored.
3.
The improved accessibility to data that can be used in human health risk assessment (HHRA) necessitates advanced methods to optimally incorporate them in HHRA analyses. This article investigates the application of data fusion methods to handling multiple sources of data in HHRA and its components. This application can be performed at two levels, first, as an integrative framework that incorporates various pieces of information with knowledge bases to build an improved knowledge about an entity and its behavior, and second, in a more specific manner, to combine multiple values for a state of a certain feature or variable (e.g., toxicity) into a single estimation. This work first reviews data fusion formalisms in terms of architectures and techniques that correspond to each of the two mentioned levels. Then, by handling several data fusion problems related to HHRA components, it illustrates the benefits and challenges in their application.
4.
Based on imperfect data and theory, agencies such as the United States Environmental Protection Agency (USEPA) currently derive “reference doses” (RfDs) to guide risk managers charged with ensuring that human exposures to chemicals are below population thresholds. The RfD for a chemical is typically reported as a single number, even though it is widely acknowledged that there are significant uncertainties inherent in the derivation of this number. In this article, the authors propose a probabilistic alternative to the EPA's method that expresses the human population threshold as a probability distribution of values (rather than a single RfD value), taking into account the major sources of scientific uncertainty in such estimates. The approach is illustrated using much of the same data that USEPA uses to justify their current RfD procedure. Like the EPA's approach, our approach recognizes the four key extrapolations that are necessary to define the human population threshold based on animal data: animal to human, human heterogeneity, LOAEL to NOAEL, and subchronic to chronic. Rather than using available data to define point estimates of “uncertainty factors” for these extrapolations, the proposed approach uses available data to define a probability distribution of adjustment factors. These initial characterizations of uncertainty can then be refined when more robust or specific data become available for a particular chemical or class of chemicals. Quantitative characterization of uncertainty in noncancer risk assessment will be useful to risk managers who face complex trade-offs between control costs and protection of public health. The new approach can help decision-makers understand how much extra control cost must be expended to achieve a specified increase in confidence that the human population threshold is not being exceeded.
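As an illustration of the probabilistic alternative this abstract describes, the sketch below combines lognormal uncertainty distributions for the four extrapolations by Monte Carlo simulation. The geometric standard deviations and the animal NOAEL are hypothetical placeholders, not values from the paper.

```python
import math
import random

random.seed(0)

def sample_af(gsd):
    # One lognormal adjustment factor with geometric mean 1 and the
    # given geometric standard deviation (hypothetical values).
    return math.exp(random.gauss(0.0, math.log(gsd)))

# Hypothetical geometric standard deviations for the four extrapolations:
# animal-to-human, human heterogeneity, LOAEL-to-NOAEL, subchronic-to-chronic.
gsds = [3.0, 3.0, 2.0, 2.0]

animal_noael = 10.0  # mg/kg/day, hypothetical critical NOAEL

# Propagate uncertainty: each sampled human population threshold is the
# animal NOAEL divided by the product of sampled adjustment factors.
thresholds = []
for _ in range(10_000):
    af_product = 1.0
    for gsd in gsds:
        af_product *= sample_af(gsd)
    thresholds.append(animal_noael / af_product)

thresholds.sort()
# 5th percentile: a dose we are ~95% confident lies below the threshold.
p05 = thresholds[int(0.05 * len(thresholds))]
print(f"5th percentile of threshold distribution: {p05:.4f} mg/kg/day")
```

A risk manager can then pick a percentile of this distribution matching the desired confidence, instead of dividing by a single composite uncertainty factor.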
5.
Noncancer risk assessments are generally forced to rely on animal bioassay data to estimate a Tolerable Daily Intake or Reference Dose, as a proxy for the threshold of human response. In cases where animal bioassays are missing from a complete data base, the critical NOAEL (no-observed-adverse-effect level) needs to be adjusted to account for the impact of the missing bioassay(s). This paper presents two approaches for making such adjustments. One is based on regression analysis and seeks to provide a point estimate of the adjustment needed. The other relies on non-parametric analysis and is intended to provide a distributional estimate of the needed adjustment. The adjustment needed is dependent on the definition of a complete data base, the number of bioassays missing, the specific bioassays which are missing, and the method used for interspecies scaling. The results from either approach can be used in conjunction with current practices for computing the TDI or RfD, or as an element of distributional approaches for estimating the human population threshold.
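The distributional (non-parametric) flavor of the adjustment can be sketched by resampling an empirical set of adjustment ratios; the candidate ratios and the critical NOAEL below are hypothetical placeholders standing in for the ratios actually derived in the paper.

```python
import random
import statistics

random.seed(1)

# Hypothetical candidate ratios: critical NOAEL from a complete data base
# divided by the NOAEL observable when one bioassay is missing.
candidate_ratios = [1.0, 1.3, 1.6, 2.0, 2.5, 3.2, 4.0, 5.0]

critical_noael = 50.0  # mg/kg/day, hypothetical

# Resampling the empirical ratio distribution yields a distribution of
# adjusted NOAELs rather than a single point estimate.
adjusted = [critical_noael / random.choice(candidate_ratios)
            for _ in range(5_000)]
print(f"median adjusted NOAEL: {statistics.median(adjusted):.1f} mg/kg/day")
```

The resulting distribution can feed directly into a probabilistic threshold estimate, or be collapsed to a percentile for use with the conventional TDI/RfD calculation.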
6.
The public understands and supports the ethical use of human subjects in medical research, recognizing the unique role for this type of study in the development of new drugs and therapeutic strategies for treatment of disease. The use of data from human subjects can also be of value in understanding the circumstances under which individuals exposed to chemicals in the food supply, in the workplace, or in the environment might experience toxicity, i.e., in support of risk assessment. However, questions have been raised as to whether this latter type of research is ethical, or can be performed in an ethical manner. Under what circumstances is it acceptable to intentionally expose human subjects to potentially toxic agents? This is an extremely important issue for the risk assessment community to address, because it affects in a fundamental way the types of information that will be available to conduct human health risk assessments. Four papers in this issue offer viewpoints on the value of human data, the circumstances under which human subjects might be exposed to toxic chemicals for research purposes, the ethical problems associated with this research, and the role of human vs. animal data in the development of toxicity values for human health risk assessment.
7.
The default uncertainty factors used for risk assessment are applied either to allow for different aspects of extrapolation of the dose-response curve or to allow for database deficiencies. Replacement of toxicokinetic or toxicodynamic defaults by chemical-specific data allows the calculation of a chemical-specific “data-derived factor”, which is the product of chemical-specific values and default uncertainty factors. Such chemical-specific composite values will improve the scientific basis of the risk assessment of that chemical, but the necessary chemical-specific data are rarely available. Categorical defaults related to pathways of elimination and mechanisms of toxicity could be used when the overall fate or mechanism is known, but there are no chemical-specific data sufficient to allow replacement of the default, and the development of an overall data-derived factor. The development of pathway-related categorical defaults is being undertaken using data on selected probe substrates for which adequate data are available. The concept and difficulties of this approach are illustrated using data for CYP1A2.
8.
This article evaluates the health risk raised by exposure to naturally occurring radionuclides in soil around Khak-Sefid, Ramsar, Iran, which is an area of high natural background radiation. A high-purity germanium detector was used to determine levels of radionuclides in soil samples, and the cancer morbidity risk for a hypothetical resident farmer was evaluated using the RESidual RADioactivity (RESRAD) code. The average activity concentrations of 226Ra, 232Th, and 40K were found to be 13,201 ± 391, 27.9 ± 2.4, and 415.5 ± 16 Bq/kg, respectively. The maximum assessed cancer morbidity risks were calculated from external and internal exposure pathways as 4.73 × 10^-2 and 3.40 × 10^-2 for 226Ra, 1.41 × 10^-4 and 7.88 × 10^-5 for 232Th, and 1.3 × 10^-4 and 4.233 × 10^-4 for 40K. The RESRAD calculations also showed that total cancer morbidity risks from the external gamma and plant ingestion pathways were more important than those from other exposure pathways. A sensitivity analysis was also performed to assess the influence of input parameter values on the risk assessment. In general, given the high calculated risk of 226Ra compared with 232Th and 40K, it can be considered the major source of concern for human health in the study area.
9.
Epidemiological studies of workers in the nickel industry, animal exposure studies, and reports on the potential mechanisms of nickel-induced toxicity and carcinogenicity indicate that only crystalline sulfidic nickel compounds have been clearly established as carcinogenic or potentially carcinogenic in humans. This observation indicates the need to modify and update regulatory approaches for nickel to reflect noncancer toxicity values for some individual nickel species. Analysis of nickel compounds in residual oil fly ash (ROFA) indicates that sulfidic nickel compounds (e.g., nickel subsulfide, nickel sulfide) are not present. Thus, the potential for emission of carcinogenic nickel compounds from residual oil fly ash appears to be low. Preliminary reference concentrations (RfCs) for a number of nickel compounds, based on non-carcinogenic endpoints, are proposed on the basis of the benchmark dose approach in conjunction with NTP data for nickel species.
10.
The use of animal vs. human data for the purposes of establishing human risk was examined for four pharmaceutical compounds: acetylsalicylic acid (ASA), cyclophosphamide, indomethacin, and clofibric acid. Literature searches were conducted to identify preclinical and clinical data useful for the derivation of acceptable daily intakes (ADIs), from which a number of risk values including occupational exposure limits (OELs) could be calculated. OELs were calculated using human data and then again using animal data exclusively. For two compounds, ASA and clofibric acid, use of animal data alone led to higher OELs (not health protective), while for indomethacin and cyclophosphamide use of animal data resulted in OELs the same as or lower than those based on human data alone. In each case, arguments were made for why the use of human data was preferred. The results of the analysis support a basic principle of risk assessment: that all available data be considered.
11.
Understanding the spatial pattern of species distributions is fundamental in biogeography, and conservation and resource management applications. Most species distribution models (SDMs) require or prefer species presence and absence data for adequate estimation of model parameters. However, observations with unreliable or unreported species absences dominate and limit the implementation of SDMs. Presence-only models generally yield less accurate predictions of species distribution, and make it difficult to incorporate spatial autocorrelation. The availability of large amounts of historical presence records for freshwater fishes of the United States provides an opportunity for deriving reliable absences from data reported as presence-only, when sampling was predominantly community-based. In this study, we used boosted regression trees (BRT), logistic regression, and MaxEnt models to assess the performance of a historical metacommunity database with inferred absences for modeling fish distributions, thereby investigating the effect of model choice and data properties. With models of the distribution of 76 native, non-game fish species of varied traits and rarity attributes in four river basins across the United States, we show that model accuracy depends on data quality (e.g., sample size, location precision), species’ rarity, statistical modeling technique, and consideration of spatial autocorrelation. The cross-validation area under the receiver-operating-characteristic curve (AUC) tended to be high in the spatial presence-absence models at the highest level of resolution for species with large geographic ranges and small local populations. Prevalence affected training but not validation AUC. The key habitat predictors identified and the fish-habitat relationships evaluated through partial dependence plots corroborated most previous studies.
The community-based SDM framework broadens our capability to model species distributions by innovatively removing the constraint of lack of species absence data, thus providing a robust prediction of distribution for stream fishes in other regions where historical data exist, and for other taxa (e.g., benthic macroinvertebrates, birds) usually observed by community-based sampling designs.
12.
Cumulative risk assessments (CRAs) include the examination of risks posed by multiple stressors and include population-specific vulnerabilities and susceptibilities. In this case study, we assess potential hearing impairment hazard due to joint exposure from noise and volatile organic compounds (VOCs) in order to examine the strengths and limitations of using secondary data on exposure and health effects for a CRA. Block group-level noise categories were estimated using modeled street-level data. A quantile regression model of sociodemographic and personal predictors from the 1999–2000 U.S. National Health and Nutrition Examination Survey VOC dataset was used along with block group-level sociodemographic and personal variables to estimate VOC exposures. Hazard indices (HIs) for potential hearing impairment due to joint noise and VOC exposures were calculated. County-averaged HIs for hearing impairment ranged from 0.8 (10th total VOCs percentile and 45–60 dB) to 1.7 (90th total VOCs percentile and 71–75 dB). Limitations of the exposure and health effects data included issues combining heterogeneous data and a lack of established threshold levels for combined low-level exposures; yet, this case study illustrates that screening-level CRAs, including non-chemical stressors, can be accomplished with publicly available data and existing methods.
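The hazard index arithmetic underlying a screening-level CRA can be sketched as below. The exposure and reference levels are hypothetical placeholders, not values from the study, and treating a ratio of dB levels as a hazard quotient is a deliberate simplification for illustration only.

```python
# Sketch of a screening-level hazard index for joint noise + VOC exposure.
def hazard_quotient(exposure, reference_level):
    # A hazard quotient is the ratio of estimated exposure to a
    # reference (acceptable) level for a single stressor.
    return exposure / reference_level

def hazard_index(quotients):
    # The hazard index sums the quotients; values above 1 flag a
    # potential for combined adverse effect.
    return sum(quotients)

voc_hq = hazard_quotient(exposure=0.30, reference_level=0.50)    # unitless, hypothetical
noise_hq = hazard_quotient(exposure=68.0, reference_level=70.0)  # dB vs. a guideline level
hi = hazard_index([voc_hq, noise_hq])
print(f"HI = {hi:.2f}")
```

An HI computed this way is a screening signal, not a dose-response prediction: it says only whether the combined exposures merit a closer look.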
13.
Efforts to restore river ecosystems are needed not only in local habitats but also in the surrounding terrestrial regions. Large-scale assessment of human activities can be useful in integrated watershed management. In this study, we modified the Ecological Risk Index (ERI) by considering the spatial distribution of human activities in China's Haihe River Basin (HRB). The stressor factors of human activities included population, impervious surface, cattle, agricultural land use, industry, fertilizers, pesticides, water conservation facilities, and roads. A total of 423 assessment units in the HRB were created by combining watershed and administrative boundaries to analyze the spatial distribution of human activities. Two index options, the ERI (including all stressors) and the ERI-D (excluding reservoirs and sluices), were examined for different management objectives. All the stressors and both ecological risk indices (ERI and ERI-D) were ranked in four levels: low, moderate, high, and very high. Our study demonstrated that the ERI and ERI-D can provide an overview of the spatial pattern of human stressors related to river ecosystems across a large geographic region. The approach developed in this study is useful for prioritizing management actions in targeted areas.
14.
Human health risks from occupational exposures are managed by limiting exposures to acceptable levels established by the American Conference of Governmental Industrial Hygienists or another similar body. Acceptable environmental exposures are benchmarked by values such as U.S. Environmental Protection Agency's Reference Doses and Reference Concentrations. The approaches to establishing these values are different, as are the groups they are intended to protect, complicating direct comparisons. Occupational limits are based on a healthy workforce in a narrow age range and do not generally consider sensitive populations. Limits for environmental exposures consider sensitive populations. In this evaluation, physiologically based pharmacokinetic modeling was used to predict tissue doses from acceptable/safe exposures as established by different organizations and agencies. Internal doses calculated for an agency's acceptable/safe exposures via oral and inhalation routes may differ substantially, but are sometimes in excellent agreement. The finding that internal doses resulting from occupational exposures are almost uniformly greater than those from environmental exposures suggests different mindsets among these groups regarding how safe is “safe.”
15.
Toxicity tests are widely used to set “acceptable” levels of chemical exposure. Different organizations have identified a base set of tests specifying a mix of endpoints, durations, and species to be tested. A specific test and endpoint are chosen as the basis for calculation of human health risk values like reference doses (RfDs). This study empirically evaluates the data and choices made in setting acute and chronic RfDs for 352 conventional pesticides. The results suggest that for Acute, Acute Female-Specific, and Chronic RfDs one test is used far more than others. Ninety-six percent of the 116 Acute Female-Specific RfDs relied on a developmental toxicity test, and 78% of Chronic RfDs used the chronic bioassay. Tests in rats were used far more often than other species in all RfD calculations. For all types of RfDs, a total uncertainty factor of 100 was most common, although values as low as 1 and as high as 3000 were seen. These results provide insights not only into the science policy frameworks used, but also into ways toxicity testing and risk assessment may be streamlined and made more efficient.
16.
This article presents a risk assessment for human exposure to nonylphenol (NP). We critically reviewed and assessed all relevant full-text publications based on a variety of data quality attributes. Two categories of data, environmental monitoring and biomonitoring from exposed individuals, were used to estimate human exposure to NP. Environmental monitoring data included the measurement of NP in food, water, air, and dust. From these data and estimates of human intake rates for the sources, exposures were estimated from each source and source-specific Margins of Exposure (MOEs) calculated. However, the nature of the populations studied prevented the calculation of aggregate exposure calculations from these data. Rather, the most reliable estimates of aggregate exposure to NP were those derived from biomonitoring studies in exposed individuals. Using the daily absorbed dose estimates for NP, MOEs were calculated for these populations. The MOEs were based on the use of a No-Observed-Adverse-Effect-Level (NOAEL) for sensitive toxicological endpoints of interest, that is, systemic and reproductive toxicity from continuous feeding over more than 3.5 generations (13 mg/kg/day). The MOEs were all greater than 1000 (ranging from 2863 to 8.4 × 10^7), clearly indicating reasonable certainty of no harm for source-specific and aggregate (based on biomonitoring) exposures to NP.
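A minimal sketch of the Margin of Exposure calculation underlying this abstract: the NOAEL of 13 mg/kg/day is taken from the abstract, while the daily absorbed dose is a hypothetical value chosen to fall inside the reported MOE range.

```python
# Margin of Exposure: the ratio of the NOAEL to an estimated daily dose.
NOAEL = 13.0  # mg/kg/day, the systemic/reproductive NOAEL cited in the abstract

def margin_of_exposure(noael, daily_dose):
    return noael / daily_dose

# Hypothetical aggregate daily absorbed dose from biomonitoring (mg/kg/day).
dose = 0.0045
moe = margin_of_exposure(NOAEL, dose)
# MOEs above ~1000 are generally taken as indicating reasonable certainty of no harm.
print(f"MOE = {moe:.0f}")
```

Because the MOE scales inversely with dose, the very wide reported range (2863 to 8.4 × 10^7) simply reflects the wide spread of estimated doses across sources and populations.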
17.
Information Quality (IQ) is a critical factor for the success of many activities in the information age, including the development of data warehouses and implementation of data mining. The issue of IQ risk is recognized during the process of data mining; however, there is no formal methodological approach to dealing with such issues. Consequently, it is essential to measure the risk of IQ in a data warehouse to ensure success in implementing data mining. This article presents a methodology to determine three IQ risk characteristics: accuracy, comprehensiveness, and non-membership. The methodology provides a set of quantitative models to examine how the quality risks of source information affect the quality for information outputs produced using the relational algebra operations: Restriction, Projection, and Cubic product. It can be used to determine how quality risks associated with diverse data sources affect the derived data. The study also develops a data cube model and associated algebra to support IQ risk operations.
18.
Bioaccessibility measurements have the potential to improve the accuracy of risk assessments and reduce the potential costs of remediation when they reveal that the solubility of chemicals in a matrix (e.g., soil) differs markedly from that in the critical toxicity study (i.e., the key study from which a toxicological or toxicity reference value is derived). We aimed to apply this approach to a brownfield site contaminated with chromium, and found that the speciation was Cr(III), using a combination of alkaline digestion/diphenylcarbazide complexation and X-ray absorption near edge structure analysis. The bioaccessibility of Cr2O3, the compound on which a reference dose for Cr(III) is based, was substantially lower (<0.1%) than that of the Cr(III) in the soils, which was a maximum of 9%, giving relative bioaccessibility values of up to 13,000% in soil. This shows that the reference dose is based on an essentially insoluble compound, and thus we suggest that other compounds be considered for toxicity testing and derivation of the reference dose. Two possibilities are CrCl3·6H2O and KCr(SO4)2·12H2O, which have been used for derivation of ecological toxicity reference values and are soluble at a range of dosing levels in our bioaccessibility tests.
19.
Children, particularly neonates, can be biologically more sensitive to the same toxicant on a body weight basis than adults. Current understanding of the rates of maturation of metabolism and evidence from case studies indicate that human infants up to 6 months of age typically lack the capacity to detoxify and eliminate substances as readily as adults. For most chemicals, the infant physiologic systems usually produce higher blood levels for longer periods. The newborn's metabolic capacity rapidly matures and, by 6 months of age, children are usually not more sensitive than adults based on their pharmacokinetic competence. Whether children are at greater risk from chemical exposures is another question. Drawing conclusions about the ability of the U.S. Environmental Protection Agency's intraspecies (UFH) and database (UFD) uncertainty factors to protect children on the basis of the modest data available is challenging. However, virtually all studies available suggest that a high percentage of the population, including children, is protected by using a 10-fold UFH or by using a 3.16-fold factor each for toxicokinetic and toxicodynamic variability. Based on specific comparisons for newborns, infants, children, adults and those with severe disease, the population protected is between 60% and 100%, with the studies in larger populations that include sensitive individuals suggesting that the value is closer to 100%. UFD is likewise protective when used with databases that are missing substantive studies.