Similar Articles
20 similar articles retrieved.
1.
The default uncertainty factors used for risk assessment are applied either to allow for different aspects of extrapolation of the dose-response curve or to allow for database deficiencies. Replacement of toxicokinetic or toxicodynamic defaults by chemical-specific data allows the calculation of a chemical-specific “data-derived factor”, which is the product of chemical-specific values and default uncertainty factors. Such chemical-specific composite values will improve the scientific basis of the risk assessment of that chemical, but the necessary chemical-specific data are rarely available. Categorical defaults related to pathways of elimination and mechanisms of toxicity could be used when the overall fate or mechanism is known but the chemical-specific data are insufficient to allow replacement of the default and development of an overall data-derived factor. The development of pathway-related categorical defaults is being undertaken using data on selected probe substrates for which adequate data are available. The concept and difficulties of this approach are illustrated using data for CYP1A2.
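A minimal sketch of the composite “data-derived factor” idea described above, assuming the widely used kinetic/dynamic interspecies defaults of 4.0 and 2.5; the pathway-specific value of 6.2 is an invented placeholder, not a CYP1A2 result from the study.

```python
# Sketch: composing a data-derived factor by replacing only the toxicokinetic part
# of a default uncertainty factor with a pathway-related (categorical) value.
# All numeric values are illustrative placeholders, not CYP1A2 data.

DEFAULT_INTERSPECIES_TK = 4.0   # default kinetic subfactor (animal -> human)
DEFAULT_INTERSPECIES_TD = 2.5   # default dynamic subfactor (animal -> human)

def data_derived_factor(pathway_tk=None, pathway_td=None):
    """Product of pathway/chemical-specific values and the remaining defaults."""
    tk = pathway_tk if pathway_tk is not None else DEFAULT_INTERSPECIES_TK
    td = pathway_td if pathway_td is not None else DEFAULT_INTERSPECIES_TD
    return tk * td

print(data_derived_factor())                 # 10.0 -- both defaults retained
print(data_derived_factor(pathway_tk=6.2))   # 15.5 -- kinetic default replaced
```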

2.
A “safe” or sub-threshold dose is often estimated for oral toxicity of substances in order to protect humans from adverse health effects. This dose is referred to by several terms: reference dose (RfD), tolerable daily intake (TDI), and acceptable daily intake (ADI). Similarly, tolerable concentration (TC) and reference concentration (RfC) are commonly used terms for a “safe” concentration for inhalation. The process of deriving these doses generally involves identifying a no-observed or lowest-observed adverse effect level (NOAEL or LOAEL) in animals or humans, and applying uncertainty factors to account for the extrapolation from laboratory animals to humans and/or from an average human to a sensitive human. Public health agencies have begun to consider using a data-derived approach, which uses available toxicokinetic and toxicodynamic data in the determination of uncertainty factors, rather than relying on the standard default values. Recently, two different tolerable daily intake values were derived by two different World Health Organization (WHO) work groups. The International Programme on Chemical Safety and the Working Group on Chemical Substances in Drinking Water both used the approach developed by Renwick (1993); however, the two groups interpreted and used the available data differently. The result was a difference of over twofold in the total uncertainty factor used. This review compares and contrasts the two approaches used by these WHO work groups.
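A hedged sketch of how a TDI follows from a NOAEL divided by stacked uncertainty factors, and how two defensible readings of the same kinetic data can shift the total factor by more than twofold; the NOAEL of 10 mg/kg-day and the kinetic subfactors are invented placeholders, not the values used by either WHO work group.

```python
# Sketch: TDI (or RfD/ADI) = NOAEL / total uncertainty factor. Values are invented.

def tdi(noael_mg_per_kg_day, interspecies_uf, intraspecies_uf, database_uf=1.0):
    total_uf = interspecies_uf * intraspecies_uf * database_uf
    return noael_mg_per_kg_day / total_uf, total_uf

# Group A keeps the full 4.0 kinetic subfactor; group B replaces it with a
# data-derived 1.6 while keeping the 2.5 dynamic subfactor.
tdi_a, uf_a = tdi(10.0, interspecies_uf=4.0 * 2.5, intraspecies_uf=10.0)  # UF = 100
tdi_b, uf_b = tdi(10.0, interspecies_uf=1.6 * 2.5, intraspecies_uf=10.0)  # UF = 40
print(uf_a / uf_b)   # 2.5 -- a greater-than-twofold difference in the total UF
```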

3.
Investigations were performed on representative compounds from five different therapeutic classes to evaluate the use of categorical data-derived adjustment factors to account for interindividual variability. The five classes included antidepressants, angiotensin converting enzyme (ACE) inhibitors, nonsteroidal anti-inflammatory drugs (NSAIDs), cholesterol-lowering agents, and antibiotics. Each of the case studies summarized the mode of action of the class responsible for both the therapeutic and adverse effects and the key pharmacodynamic (PD) and pharmacokinetic (PK) parameters that determine the likelihood of these responses for individual compounds in the class. For each class, an attempt was made to identify the key factors that determine interindividual variability and whether there was a common basis for establishing a categorical default adjustment factor that could be applied across the class (or at least across specific subclasses within the class). Linking the PK and PD parameters to the critical endpoint used to establish a safe level of exposure was an important underlying theme throughout the investigations. Despite the wealth of PK and PD information in the published literature on the surrogate compounds representing these classes, it was difficult to derive a categorical adjustment factor that could be applied broadly within each class. The amount of information available may have hindered rather than helped the evaluations. Derivation of categorical defaults for different classes of “common” chemicals may be more straightforward if sufficient data are available. In a few cases (e.g., tricyclic antidepressants, ACE inhibitors, and selected anti-inflammatory agents) categorical defaults could be proposed, although it is unclear whether the reduction in uncertainty gained by applying them would be offset by the additional uncertainties their application may introduce. Residual uncertainties may remain depending on the level of confidence in the underlying assumptions used to support the categorical defaults. Regardless of the conclusions on the utility of categorical defaults, these investigations provided further support for the use of data-derived adjustment factors on a compound-specific basis.

4.
For assessing the risk to human health posed by chemicals that show threshold toxicity, there is an increasing need to move away from using the default approaches, which inherently incorporate uncertainty, towards more biologically defensible risk assessments. However, most chemical databases do not contain data of sufficient quantity or quality to replace either the interspecies or interindividual aspects of toxicokinetic and toxicodynamic uncertainty. The purpose of the current analysis was to evaluate the use of alternative, species-specific, pathway-related, “categorical” default values to replace the current interspecies toxicokinetic default uncertainty factor of 4.0. The extent of the difference in the internal dose of a compound, for each test species, could then be related to the specific route of metabolism in humans. This refinement would allow different categories of defaults to be used, provided that the metabolic fate of a toxicant is known in humans. Interspecies differences in metabolism, excretion, and bioavailability have been compared for probe substrates for four different human xenobiotic-metabolizing enzymes: CYP1A2 (caffeine, paraxanthine, theobromine, and theophylline), CYP3A4 (lidocaine), UDP-glucuronyltransferase (AZT), and esterases (aspirin). The results of this analysis showed that there are significant differences between humans and the four test species in the metabolic fate of the probe compounds, the enzymes involved, the route of excretion, and oral bioavailability, all of which are factors that can influence the extent of the difference between humans and a test species in the internal dose of a toxicant. The wide variability between both compounds and the individual species suggests that the categorical approach for species differences may be of limited use in refining the current default approach. However, future work incorporating a wider database of compounds that are metabolized extensively by a single pathway in humans is needed to provide more information on the extent to which the different test species are not covered by the default of 4.0. Ultimately, this work supports the need to remove uncertainty from the risk assessment process through the generation and use of compound-specific data.
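The comparison described above can be pictured as a simple check of internal-dose ratios against the 4.0 default; the sketch below uses invented clearance values and ignores bioavailability, so it illustrates the logic rather than reconstructing the published analysis.

```python
# Sketch: does an interspecies kinetic default of 4.0 cover the human/test-species
# difference in internal dose (AUC) for a probe substrate? For an oral dose,
# AUC is roughly dose / clearance, so AUC_human / AUC_animal = CL_animal / CL_human.
# Clearance values are invented placeholders.

DEFAULT_TK = 4.0
CL_HUMAN = 5.0

for species, cl_animal in {"rat": 18.0, "dog": 6.0, "rabbit": 30.0}.items():
    auc_ratio = cl_animal / CL_HUMAN            # human internal dose relative to animal
    covered = auc_ratio <= DEFAULT_TK
    print(f"{species}: human/animal AUC ratio = {auc_ratio:.1f}, covered by 4.0: {covered}")
```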

5.
The traditional “safety factor” method has been used for years to establish occupational exposure limits (OELs) for active ingredients used in drugs. In the past, a single safety factor was used to address all sources of uncertainty in the limit-setting process. The traditional 100-fold safety factor commonly used to derive an acceptable daily intake value incorporates a default factor of 10 each to account for interindividual variability and interspecies extrapolation. Use of these defaults can lead to overly conservative health-based limits, especially when they are combined with other (up to 10-fold) factors to adjust for inadequacies in the available database. In recent years, attempts have been made to quantitate individual sources of uncertainty and variability to improve the scientific basis for OELs. In this paper we discuss the science supporting reductions in the traditional default uncertainty factors. A number of workplace-specific factors also support reductions in these factors. Recently proposed alternative methodologies provide a framework to make maximum use of preclinical and clinical information, e.g., toxicokinetic and toxicodynamic data, to reduce uncertainties when establishing OELs for pharmaceutical active ingredients.
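One commonly cited general form for converting a point of departure into an airborne OEL is sketched below; the breathing volume, body weight, and factor values are generic defaults used here only to show how a smaller, data-derived composite factor raises the limit. This is an assumption-laden illustration, not the methodology of the paper.

```python
# Sketch: OEL (mg/m3) = POD (mg/kg-day) x body weight (kg) / (air breathed per
# workday (m3) x composite uncertainty factor). Parameter values are placeholders.

def oel_mg_per_m3(pod_mg_per_kg_day, composite_uf, body_weight_kg=70.0,
                  air_breathed_m3_per_day=10.0):
    return pod_mg_per_kg_day * body_weight_kg / (air_breathed_m3_per_day * composite_uf)

print(oel_mg_per_m3(1.0, composite_uf=100))  # 0.07 mg/m3 with the traditional 10 x 10
print(oel_mg_per_m3(1.0, composite_uf=25))   # 0.28 mg/m3 with a data-derived composite
```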

6.
Boron, which is ubiquitous in the environment, causes developmental and reproductive effects in experimental animals. This observation has led to efforts to establish a Tolerable Intake value for boron. Although risk assessors agree on the use of fetal weight decreases observed in rats as an appropriate critical effect, consensus on the adequacy of toxicokinetic data as a basis for replacement of default uncertainty factors remains to be reached. A critical analysis of the existing data on boron toxicokinetics was conducted to clarify the appropriateness of replacing default uncertainty factors (10-fold for interspecies differences and 10-fold for intraspecies differences) with data-derived values. The default uncertainty factor of 10-fold for variability in response from animals to humans (default values of 4-fold for kinetics and 2.5-fold for dynamics) was recommended, since clearance of boron is 3- to 4-fold higher in rats than in humans and data on dynamic differences that could be used to modify the default value are unavailable. A data-derived adjustment of 6-fold (1.8 for kinetics and 3.1 for dynamics) rather than the default uncertainty factor of 10-fold was considered appropriate for intrahuman variability, based on variability in glomerular filtration rate during pregnancy in humans and the lack of available data on dynamic differences. Additional studies to investigate the toxicokinetics of boron in rats would be useful to provide a stronger basis for replacement of default uncertainty factors for interspecies variation.
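The arithmetic behind the factors quoted above is restated below; the composite of 60 (versus the default 100) is simple multiplication of the values given in the abstract, not a figure quoted by the authors.

```python
# Worked arithmetic for the boron uncertainty factors described above.
interspecies_uf = 4.0 * 2.5       # kinetic x dynamic defaults retained = 10
intraspecies_uf = 1.8 * 3.1       # data-derived kinetic x dynamic = 5.58, applied as 6
default_composite = 10 * 10       # = 100
data_derived_composite = 10 * 6   # = 60
print(interspecies_uf, intraspecies_uf, default_composite, data_derived_composite)
```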

7.
Purpose: Radiochromic films change their color upon irradiation due to polymerization of the sensitive component embedded within the sensitive layer. However, agents other than the monitored radiation (temperature, humidity, UV light) can also change the color of the sensitive layer; this change can be treated as a background signal and removed from the actual measurement by using a control film piece. In this work, we investigate the impact of the use of control film pieces on both the accuracy and the uncertainty of dose measured using a radiochromic film-based reference dosimetry protocol. Methods: We irradiated “control” film pieces (EBT3 GafChromic™ film model) to known doses in the range of 0.05–1 Gy, and five film pieces of the same size to 2, 5, 10, 15 and 20 Gy, considered to be “unknown” doses. Depending on the dose range, two approaches to incorporating the control film piece were investigated: the signal-corrected and the dose-corrected method. Results: For dose values greater than 10 Gy, an increase in accuracy of 3% led to a loss in uncertainty of 5% when using the dose-corrected approach. At lower doses and signals of the order of 5%, we observed an increase in accuracy of 10% with a loss in uncertainty lower than 1% when using the signal-corrected approach. Conclusions: Incorporation of the signal registered by the control film piece into the dose measurement analysis should be a judgment call of the user based on a tradeoff between deemed accuracy and acceptable uncertainty for a given dose measurement.
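One plausible reading of the “signal-corrected” versus “dose-corrected” methods is sketched below: either subtract the control piece's net optical density before converting to dose, or convert both pieces to dose and then subtract. The net-OD definition, scanner readings, and calibration coefficients are assumptions for illustration, not the protocol or data from this work.

```python
# Sketch (assumed interpretation, not the paper's protocol): two ways to fold a
# control film piece into a dose measurement.
import math

def net_od(i_unexposed, i_exposed):
    """Net optical density from scanner transmission readings."""
    return math.log10(i_unexposed / i_exposed)

def dose_from_od(od, a=10.0, b=35.0):
    """Placeholder calibration D = a*OD + b*OD**2 (Gy); coefficients are invented."""
    return a * od + b * od ** 2

od_measured = net_od(40000, 15200)   # irradiated piece (invented scanner values)
od_control = net_od(40000, 38800)    # control piece exposed only to ambient agents

dose_signal_corrected = dose_from_od(od_measured - od_control)
dose_dose_corrected = dose_from_od(od_measured) - dose_from_od(od_control)
print(round(dose_signal_corrected, 2), round(dose_dose_corrected, 2))
```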

8.
Tenfold uncertainty factors have been used in risk assessment for about 40 years to allow for species differences and inter-individual variability. Each factor has to allow for toxicokinetic and toxicodynamic differences. Subdividing the 10-fold factors into kinetic and dynamic defaults, which when multiplied give a product of 10, offers a number of advantages. A major advantage is that chemical-specific data can be introduced to replace one or more of the default subfactors, hence contributing to a chemical-related overall factor. Subdivision of the 10-fold factors also facilitates analysis of the appropriateness of the overall 10-fold defaults, and the development of a more refined approach to the use of uncertainty factors.
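A short sketch of the subdivision described above, using the widely cited kinetic/dynamic splits (interspecies 4.0 × 2.5, intraspecies 3.16 × 3.16, each multiplying to about 10); the chemical-specific replacement value of 2.0 is an invented example.

```python
# Sketch: subdividing the two 10-fold defaults into kinetic and dynamic subfactors,
# then replacing one subfactor with chemical-specific data.

SUBFACTORS = {
    "interspecies_tk": 4.0,  "interspecies_td": 2.5,   # 4.0 * 2.5   = 10
    "intraspecies_tk": 3.16, "intraspecies_td": 3.16,  # 3.16 * 3.16 ~ 10
}

def composite(overrides=None):
    values = {**SUBFACTORS, **(overrides or {})}
    total = 1.0
    for v in values.values():
        total *= v
    return total

print(round(composite()))                          # ~100, the usual overall default
print(round(composite({"interspecies_tk": 2.0})))  # ~50 once chemical data replace one subfactor
```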

9.
Microbial risk assessors often make simplifying assumptions that lead to the selection of simple concave functions with low-dose linearity, consistent with no-threshold and single-hit hypotheses, as default dose–response model forms. However, as the “microbiome revolution” progresses, evidence is accumulating that challenges these assumptions and influences estimates of the nature and magnitude of the uncertainties associated with microbial risks. Scientific advances in the knowledge of the human “superorganism” (the hybrid consortium of human plus microbial communities that cooperatively regulates health and disease) enable the design of definitive studies to estimate the pathogen doses overcome by the innate defenses, including the protective microbiota. The systematic investigation of the events of non-typhoid salmonellosis in humans undertaken nearly two decades ago was updated to incorporate recent scientific advances in the understanding of the impact of the healthy superorganism, which strengthen and extend the biological motivations for sublinear or convex dose–response curves in microbial risk assessment. The knowledge of colonization resistance (innate protection of the human superorganism from low doses of ingested pathogens) and microbiota-mediated clearance is advancing mechanistically for many pathosystems. However, until more detailed mechanistic data become available for salmonellosis, the consideration of a variety of empirical model forms is essential for depicting the uncertainty of the “true” dose–response model.
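To make the contrast concrete, the sketch below compares a default low-dose-linear single-hit (exponential) model with a convex alternative that is sublinear at low doses, as colonization resistance would suggest; the parameter values are arbitrary illustrations, not fitted salmonellosis models.

```python
# Sketch: default single-hit (low-dose-linear) vs. a convex dose-response form.
import math

def exponential_single_hit(dose, r=1e-3):
    """P(ill) = 1 - exp(-r * dose); approximately r * dose at low doses."""
    return 1.0 - math.exp(-r * dose)

def convex_weibull(dose, q=1e-7, k=2.0):
    """P(ill) = 1 - exp(-q * dose**k); with k > 1 the curve is sublinear at low doses."""
    return 1.0 - math.exp(-q * dose ** k)

for d in (10, 100, 1000):
    print(f"dose {d}: single-hit {exponential_single_hit(d):.2e}, convex {convex_weibull(d):.2e}")
```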

10.
Life cycle inventory data have multiple sources of uncertainty. These data uncertainties are often modeled using probability density functions, and in the ecoinvent database the lognormal distribution is used by default to model exchange uncertainty values. The aim of this article is to systematically measure the effect of this default distribution by changing from the lognormal to several other distribution functions and examining how this change affects the uncertainty of life cycle assessment results. Using the ecoinvent 2.2 inventory database, data uncertainty distributions are switched from the lognormal distribution to the normal, triangular, and gamma distributions. The effect of the distribution switching is assessed both for impact assessment results of individual product systems and for comparisons between product systems. Impact assessment results are generated using 5,000 Monte Carlo iterations for each product system, using the Intergovernmental Panel on Climate Change (IPCC) 2001 (100-year time frame) method. When comparing the lognormal distribution to the alternative default distributions, the difference in the resulting median and standard deviation values ranges from slight to significant, depending on the distributions used by default. However, the switch shows practically no effect on product system comparisons. Yet, impact assessment results are sensitive to how the data uncertainties are defined. In this article, we followed what we believe to be ecoinvent standard practice and preserved the “most representative” value. Practitioners should recognize that the most representative value can depart from the average of a probability distribution. Consistent default distribution choices are necessary when performing product system comparisons.
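The kind of distribution switch examined in the article can be sketched as follows: sample the same exchange under a lognormal and under a gamma distribution matched to the same moments, then compare the resulting medians and standard deviations. The matching convention (moment matching around a preserved best estimate) is an assumption for illustration, not necessarily the article's procedure.

```python
# Sketch: lognormal vs. gamma uncertainty for one exchange, 5,000 Monte Carlo draws.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
best_estimate = 2.0   # the "most representative" exchange value (placeholder)
gsd = 1.5             # geometric standard deviation for the lognormal (placeholder)

lognormal = rng.lognormal(mean=np.log(best_estimate), sigma=np.log(gsd), size=n)

# Gamma matched to the lognormal's theoretical mean and variance
mean = best_estimate * np.exp(0.5 * np.log(gsd) ** 2)
var = (np.exp(np.log(gsd) ** 2) - 1.0) * mean ** 2
gamma = rng.gamma(shape=mean ** 2 / var, scale=var / mean, size=n)

for name, sample in (("lognormal", lognormal), ("gamma", gamma)):
    print(name, round(float(np.median(sample)), 3), round(float(sample.std()), 3))
```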

11.
Both performing and validating a detailed risk analysis of a complex system are costly and time-consuming undertakings. With the increased use of probabilistic risk analysis (PRA) in regulatory decision making, both regulated parties and regulators have generally favored the use of defaults, because they can greatly facilitate the process of performing a PRA in the first place, as well as the process of reviewing and verifying the PRA. The use of defaults may also ensure more uniform standards of PRA quality. However, regulatory agencies differ in their approaches to the use of default values, and the implications of these differences are not yet well understood. Moreover, large heterogeneity among licensees makes it difficult to set suitable defaults. This paper focuses on the effect of default values on estimates of risk. In particular, we explore the effects of different levels of conservatism in setting defaults, and their implications for the crafting of regulatory incentives. The results can help decision makers evaluate the levels of safety likely to result from their regulatory policies.

12.
This paper presents the results of deliberations from participants who met on the second day of the Fourth Annual Workshop on the Evaluation of Uncertainty/Safety Factors in Health Risk Assessment. The group reviewed the previous day's presentations and implications for improvement in risk assessment. After much discussion, the group concluded that, in the short term, significant improvements could be made in the pharmacokinetic component of the inter-species uncertainty factor and developed a series of default options for this factor. These defaults consider the route of exposure (oral or inhalation) and the form of the active compound (parent, metabolite, or very reactive metabolite). Several assumptions are key to this approach, such as a similar oral or inhalation bioavailability across species. We believe this method represents a useful default approach until more compound-specific information is available.

13.
The Reference Dose (RfD) is used in the risk assessment of non-carcinogenic chemicals. It is derived by dividing a point of departure by the product of the uncertainty factors (UFs) and modifying factors (MFs). Separate UFs are used for different variables (e.g., intraspecies variation) and, in general, each UF is an order of magnitude (10-fold). On the other hand, the MF is usually based on some known variable, such as differences in absorption of a chemical from food and water, and its default value is one. The USEPA's Integrated Risk Information System (IRIS) has 14 chemicals that have RfDs based on human studies. We examined those IRIS files to determine the rationale for setting the human intraspecies uncertainty factor (UFH). The first consideration was that the chemical had an adequate peer-reviewed human database; without such, it would not be possible to derive an RfD based on human data. Ten of the 14 chemicals had a UFH of 1 or 3; four of these were essential trace elements (ETEs). The rationales for using less than a 10-fold UFH for the ETEs included: (1) nutritional data, (2) large human exposure groups, (3) minimal effect levels, and/or (4) several studies with similar effect levels. For the other compounds, reasons included: (1) large human exposure groups, (2) a critical effect that was not adverse (cosmetic), (3) exposure of the most sensitive population, (4) presence on the FDA's “generally regarded as safe” (GRAS) list, (5) database uncertainties, and (6) less-than-lifetime exposure adjusted to 70 years of exposure. It is important to understand the reasons for selecting a UFH of 1 or 3, as they will apply to future chemicals considered by the USEPA and other agencies.
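The derivation described above reduces to a one-line calculation; the point of departure below is a placeholder, and the three UFH values simply mirror the 1, 3, and 10 discussed in the abstract.

```python
# Sketch: RfD = point of departure / (product of UFs x MF). Values are placeholders.

def rfd(point_of_departure, uncertainty_factors, modifying_factor=1.0):
    total_uf = 1.0
    for uf in uncertainty_factors:
        total_uf *= uf
    return point_of_departure / (total_uf * modifying_factor)

for uf_h in (1, 3, 10):
    print(f"UFH = {uf_h}: RfD = {rfd(0.9, [uf_h]):.2f} mg/kg-day")
```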

14.
The current guideline for risk assessment of chemicals having a toxic end point routinely uses the reference dose (RfD) approach based on uncertainty factors of 10. With this method the quality of individual risk assessment varies among chemicals, often resulting in either an over- or under-estimation of adverse health risk. The purpose of this investigation is to evaluate whether the magnitude of the 10× uncertainty factors has scientific merit when tested against data from published experimental studies. A compilation and comparison of ratios between LOAEL/NOAEL (Lowest Observed Adverse Effect Level/No Observed Adverse Effect Level), subchronic/chronic, and animal/human values were made. The results of the present investigation revealed that the use of default factors could be over-conservative or unprotective. More reasonable estimates of the risk to human health would result in a reduction of unnecessary and expensive over-regulation. In addition to the LOAEL-to-NOAEL and subchronic-to-chronic ratios, the adequacy of uncertainty factors for animal-to-human extrapolations was examined. Although a 10-fold uncertainty factor (UF) is most commonly used in the risk assessment process, an examination of the literature for the compounds presented here suggests that the use of different values is scientifically justifiable.
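The ratio compilation can be pictured as in the sketch below, which flags any compound whose observed ratio exceeds the corresponding 10-fold default; the three compounds and their ratios are invented examples, not the published data evaluated in the study.

```python
# Sketch: comparing observed ratios against the default 10-fold uncertainty factors.
ratios = {
    "compound A": {"LOAEL/NOAEL": 3.0,  "subchronic/chronic": 2.1,  "animal/human": 6.5},
    "compound B": {"LOAEL/NOAEL": 10.0, "subchronic/chronic": 4.8,  "animal/human": 1.2},
    "compound C": {"LOAEL/NOAEL": 2.0,  "subchronic/chronic": 15.0, "animal/human": 22.0},
}

for name, r in ratios.items():
    exceeded = [k for k, v in r.items() if v > 10.0]
    if exceeded:
        print(f"{name}: default of 10 may be unprotective for {', '.join(exceeded)}")
    else:
        print(f"{name}: default of 10 covers all three ratios (possibly over-conservative)")
```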

15.
REDD+ reference levels directly affect the benefits a country may receive. However, the existing “Compensation Reduction” (CR) and “Compensated Successful Efforts” (CSE) approaches each consider only a unilateral perspective of outputs or inputs. This paper combines the two approaches to estimate REDD+ reference levels through Zero-Sum-Gains Data Envelopment Analysis (ZSG-DEA). The agricultural labor force and agricultural land area are used as input variables, and the gross agricultural production and carbon emissions from deforestation are considered as output variables. The REDD+ reference levels of 89 countries are calculated and classified through the ZSG-DEA model. The results demonstrate that the REDD+ reference levels are estimated efficiently through the ZSG-DEA model, and that all countries with deforestation lie on the ZSG-DEA frontier, indicating that overall Pareto optimality has been achieved. The empirical results also indicate that the use of the ZSG-DEA model is more beneficial for Latin America and the Caribbean, while the countries that may see a revenue drop under REDD+ are in Africa, Asia and Oceania. Consequently, the final REDD+ reference levels should take into account both efficiency and fairness by selecting an appropriate fairness-efficiency weighting factor.
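As a starting point for the zero-sum-gains variant used in the paper, the sketch below computes a standard input-oriented CCR DEA efficiency score with the stated input and output variables; the three-country data set is invented, and treating deforestation emissions as an ordinary output (rather than reallocating them under a zero-sum constraint) is a deliberate simplification of the ZSG-DEA model.

```python
# Sketch: standard CCR DEA (multiplier form), not the ZSG-DEA extension. Data invented.
import numpy as np
from scipy.optimize import linprog

# rows = countries; inputs: [agricultural labour force, agricultural land area]
X = np.array([[5.0, 20.0], [3.0, 15.0], [8.0, 40.0]])
# outputs: [gross agricultural production, deforestation emissions (simplified)]
Y = np.array([[50.0, 10.0], [45.0, 4.0], [60.0, 30.0]])

def ccr_efficiency(o):
    n, m = Y.shape
    k = X.shape[1]
    c = np.concatenate([-Y[o], np.zeros(k)])             # maximise u . y_o
    A_ub = np.hstack([Y, -X])                            # u . y_j - v . x_j <= 0
    A_eq = np.concatenate([np.zeros(m), X[o]])[None, :]  # v . x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun

for o in range(len(X)):
    print(f"country {o}: CCR efficiency = {ccr_efficiency(o):.3f}")
```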

16.
In assessing environmental health hazards, the question has arisen of whether “safe”, “tolerable” or “permissible” levels of carcinogens, mutagens or teratogens can be derived by extrapolation of bioassays using rodents exposed for various periods to very high concentrations of chemicals or using cultured mammalian cell lines. Variations in susceptibility are only rarely taken into account, if at all, and doses which seem to be harmless to the average person may be harmful to susceptible people. The reduced capacity to repair ultraviolet-induced DNA lesions in xeroderma pigmentosum (XP) cells may exemplify a mechanism leading to an elevated neoplastic transformation rate in man [1–4]. The question arises as to whether cells with deficient repair synthesis respond to chemical carcinogens in the same manner as cells with adequate repair systems. We report here the levels of DNA repair synthesis in XP cells of five patients exposed to the carcinogenic [5,6] and mutagenic [7] compounds N-acetoxy- or N-hydroxy-2-acetylaminofluorene, which are ultimate and proximate carcinogenic forms of 2-acetylaminofluorene (AAF). We were particularly interested in comparing the different levels of DNA repair synthesis following ultraviolet irradiation with those following treatment with chemical carcinogens.

17.
18.
An “expansive” risk assessment approach is illustrated, characterizing dose–response relationships for salmonellosis in light of the full body of evidence for human and murine superorganisms. Risk assessments often require analysis of costs and benefits to support public health decisions. Decision-makers and the public need to understand uncertainty in such analyses for two reasons. Uncertainty analyses provide a range of possibilities within a framework of present scientific knowledge, thus helping to avoid undesirable consequences associated with the selected policies. They also encourage risk assessors to scrutinize all available data and models, thus helping to avoid subjective or systematic errors. Without a full analysis of uncertainty, decisions could be biased by judgments based solely on default assumptions, beliefs, and statistical analyses of selected correlative data. Alternative data and theories that incorporate variability and heterogeneity for the human and murine superorganisms, particularly colonization resistance, are emerging as major influences on microbial risk assessment. Salmonellosis risk assessments are often based on conservative default models derived from selected sets of outbreak data that overestimate illness. Consequently, the full extent of the uncertainty in estimates of the annual number of illnesses is not incorporated in risk assessments, and the presently used models may be incorrect.

19.
The need to identify “toxicologically equivalent” doses across different species is a major issue in toxicology and risk assessment. In this article, we describe an approach for establishing default cross-species extrapolation factors used to scale oral doses across species for non-carcinogenic endpoints. This work represents part of an ongoing effort to harmonize the way animal data are evaluated for carcinogenic and non-carcinogenic endpoints. In addition to considering default scaling factors, we also discuss how chemical-specific data (e.g., metabolic or mechanistic data) can be incorporated into the dose extrapolation process. After first examining the required properties of a default scaling methodology, we consider scaling approaches based on empirical relationships observed for particular classes of compounds and also more theoretical approaches based on general physiological principles (i.e., allometry). The available data suggest that the empirical and allometric approaches each provide support for the idea that toxicological risks are approximately equal when daily oral doses are proportional to body weight raised to the 3/4 power. We also discuss specific challenges for dose scaling related to different routes of exposure, acute versus chronic toxicity, and extrapolations related to particular life stages (e.g., childhood).
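Under the 3/4-power conclusion stated above, the mg/kg dose scales with body weight to the −1/4 power, which gives the familiar body-weight-based conversion sketched below; the body weights are typical defaults and the 10 mg/kg-day animal dose is an arbitrary example.

```python
# Sketch: cross-species scaling when toxicologically equivalent daily oral doses are
# proportional to body weight ** (3/4), i.e. mg/kg doses scale with BW ** (-1/4).

def human_equivalent_dose(animal_dose_mg_per_kg, bw_animal_kg, bw_human_kg=70.0):
    return animal_dose_mg_per_kg * (bw_animal_kg / bw_human_kg) ** 0.25

for species, bw in {"mouse": 0.025, "rat": 0.25, "dog": 10.0}.items():
    hed = human_equivalent_dose(10.0, bw)
    print(f"{species}: 10 mg/kg-day corresponds to about {hed:.1f} mg/kg-day in humans")
```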

20.
The objective of this study was to assess from a societal perspective the cost-effectiveness of the Active After-school Communities (AASC) program, a key plank of the former Australian Government's obesity prevention program. The intervention was modeled for a 1-year time horizon for Australian primary school children as part of the Assessing Cost-Effectiveness in Obesity (ACE-Obesity) project. Disability-adjusted life year (DALY) benefits (based on calculated effects on BMI post-intervention) and cost-offsets (consequent savings from reductions in obesity-related diseases) were tracked until the cohort reached the age of 100 years or death. The reference year was 2001, and a 3% discount rate was applied. Simulation-modeling techniques were used to present a 95% uncertainty interval around the cost-effectiveness ratio. An assessment of second-stage filter criteria (“equity,” “strength of evidence,” “acceptability to stakeholders,” “feasibility of implementation,” “sustainability,” and “side-effects”) was undertaken by a stakeholder Working Group to incorporate additional factors that impact on resource allocation decisions. The estimated number of children new to physical activity after school, and therefore receiving the intervention benefit, was 69,300. For 1 year, the intervention cost was Australian dollars (AUD) 40.3 million (95% uncertainty interval AUD 28.6 million; AUD 56.2 million) and resulted in an incremental saving of 450 (250; 770) DALYs. The resultant cost-offsets were AUD 3.7 million, producing a net cost per DALY saved of AUD 82,000 (95% uncertainty interval AUD 40,000; AUD 165,000). Although the program has intuitive appeal, it was not cost-effective under base-case modeling assumptions. To improve its cost-effectiveness credentials as an obesity prevention measure, a reduction in costs needs to be coupled with increases in the number of participating children and the amount of physical activity undertaken.
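The headline ratio follows directly from the figures quoted above; the short calculation below simply restates that arithmetic (a net cost of 36.6 million over 450 DALYs is roughly the reported AUD 82,000 per DALY).

```python
# Worked arithmetic for the base-case net cost per DALY saved reported above.
intervention_cost_aud = 40.3e6
cost_offsets_aud = 3.7e6
dalys_saved = 450

net_cost_per_daly = (intervention_cost_aud - cost_offsets_aud) / dalys_saved
print(round(net_cost_per_daly))   # ~81,300, reported as AUD 82,000 after rounding
```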

