Similar Documents
1.
The default uncertainty factors used for risk assessment are applied either to allow for different aspects of extrapolation of the dose-response curve or to allow for database deficiencies. Replacement of toxicokinetic or toxicodynamic defaults by chemical-specific data allows the calculation of a chemical-specific “data-derived factor”, which is the product of chemical-specific values and default uncertainty factors. Such chemical-specific composite values will improve the scientific basis of the risk assessment of that chemical, but the necessary chemical-specific data are rarely available. Categorical defaults related to pathways of elimination and mechanisms of toxicity could be used when the overall fate or mechanism is known but there are no chemical-specific data sufficient to allow replacement of the default and development of an overall data-derived factor. The development of pathway-related categorical defaults is being undertaken using data on selected probe substrates for which adequate data are available. The concept and difficulties of this approach are illustrated using data for CYP1A2.

2.
Regulatory authorities in North America, Europe and Australia use different approaches for estimating the exposure-reduction effectiveness of personal protective equipment (PPE) in registration processes for agrochemical pesticides. TNO has investigated current views and facts on the use of default values and has prepared a discussion paper that can serve as a starting point for an internationally harmonised set of PPE protection factors for regulatory use. For inhalation exposure loading, it is proposed to use the assigned protection factors (APF) derived by BSI (British Standards Institution) and ANSI (American National Standards Institute). Since these values are somewhat at variance, and since efficient control and proper training and education with respect to inhalation protection devices are generally absent in agricultural settings, it is prudent to err on the safe side and use the lower of the two values, if available. For dermal exposure loading, differentiations are made between operators and re-entry workers, and further between hand and body protection. In addition, the restrictions and framework for the use of the proposed defaults are very relevant. Oral exposure loading is considered only in special cases where dermal exposure may be relatively high and the hand-to-mouth shunt may lead to appreciable oral exposure loading. The presented defaults for PPE have been discussed with experts from regulatory authorities and industry, but a formal discussion still has to take place; this needs to be done at EU level between Member States. The current proposal is based on state-of-the-art knowledge and policy considerations, but further research is needed to better underpin the proposed values and/or to adapt them.

3.
For the risk to human health posed by chemicals that show threshold toxicity there is an increasing need to move away from using the default approaches, which inherently incorporate uncertainty, towards more biologically defensible risk assessments. However, most chemical databases do not contain data of sufficient quantity or quality that can be used to replace either the interspecies or interindividual aspects of toxicokinetic and toxicodynamic uncertainty. The purpose of the current analysis was to evaluate the use of alternative, species-specific, pathway-related, “categorical” default values to replace the current interspecies toxicokinetic default uncertainty factor of 4.0. The extent of the difference in the internal dose of a compound, for each test species, could then be related to the specific route of metabolism in humans. This refinement would allow different categories of defaults to be used, provided that the metabolic fate of a toxicant in humans was known. Interspecies differences in metabolism, excretion, and bioavailability have been compared for probe substrates of four different human xenobiotic-metabolizing enzymes: CYP1A2 (caffeine, paraxanthine, theobromine, and theophylline), CYP3A4 (lidocaine), UDP-glucuronyltransferase (AZT), and esterases (aspirin). The results of this analysis showed that there are significant differences between humans and the four test species in the metabolic fate of the probe compounds, the enzymes involved, the route of excretion and oral bioavailability, all of which are factors that can influence the extent of the difference between humans and a test species in the internal dose of a toxicant. The wide variability between both compounds and the individual species suggests that the categorical approach for species differences may be of limited use in refining the current default approach. However, future work could incorporate a wider database of compounds that are metabolized extensively by a single pathway in humans, to provide more information on the extent to which the different test species are not covered by the default of 4.0. Ultimately, this work supports the need to remove uncertainty from the risk assessment process through the generation and use of compound-specific data.

4.
Tenfold uncertainty factors have been used in risk assessment for about 40 years to allow for species differences and inter-individual variability. Each factor has to allow for toxicokinetic and toxicodynamic differences. Subdividing the 10-fold factors into kinetic and dynamic defaults, which when multiplied give a product of 10, offers a number of advantages. A major advantage is that chemical-specific data can be introduced to replace one or more of the default subfactors, hence contributing to a chemical-related overall factor. Subdivision of the 10-fold factors also facilitates analysis of the appropriateness of the overall 10-fold defaults, and the development of a more refined approach to the use of uncertainty factors.
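The subdivision described above can be sketched in a few lines. The split values below (4.0 × 2.5 for interspecies, 3.16 × 3.16 for inter-individual) follow the published Renwick/IPCS scheme; the chemical-specific replacement value of 2.0 is a hypothetical illustration, not data for any real chemical.

```python
# Composite uncertainty factor built from kinetic/dynamic subfactors.
# Default splits follow the Renwick/IPCS subdivision of the two 10-fold
# factors; any replacement value supplied is assumed to be chemical-specific.

DEFAULTS = {
    "interspecies_tk": 4.0,   # toxicokinetics (10 = 4.0 x 2.5)
    "interspecies_td": 2.5,   # toxicodynamics
    "intraspecies_tk": 3.16,  # inter-individual kinetics (10 = 3.16 x 3.16)
    "intraspecies_td": 3.16,  # inter-individual dynamics
}

def composite_factor(replacements=None):
    """Multiply subfactors, substituting chemical-specific data where available."""
    factors = dict(DEFAULTS)
    factors.update(replacements or {})
    product = 1.0
    for value in factors.values():
        product *= value
    return product

# Pure defaults reproduce the classical 100-fold factor (4.0*2.5*3.16*3.16 = 99.86).
print(round(composite_factor()))  # → 100
# Replacing the interspecies kinetic default with a hypothetical measured value:
print(composite_factor({"interspecies_tk": 2.0}))
```

Each replaced subfactor moves the overall factor from a pure default toward a data-derived value, which is exactly the advantage the abstract describes.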

5.
In the area of risk assessment associated with ecotoxicological and plant protection products, probabilistic risk assessment (PRA) methodologies have been developed that enable quantification of variability and uncertainty. Despite the potential advantages of these new methodologies, end-user and regulatory uptake has not been, to date, extensive. A case study, utilizing the Theory of Planned Behavior, was conducted in order to identify potential determinants of end-user adoption of probabilistic risk assessments associated with the ecotoxicological impact of pesticides. Seventy potential end-users, drawn from academia, government, industry, and consultancy organizations, were included in the study. The results indicated that end-user intention to adopt PRA varied across the different end-user groups. The regulatory acceptance of PRA was contingent on social acceptance across the regulatory community regarding the reliability and utility of the outputs. Training in interpretation of outputs is therefore highly relevant to regulatory acceptance. In other end-user sectors, a positive attitude toward PRA, “hands on” experience, and perceived capability of actually performing PRA are important determinants of end-user intention to adopt PRA. It is concluded that training programs targeted to the specific needs of different end-user sectors should be developed if end-user adoption of PRA is to be increased.

6.
Probabilistic risk assessment (PRA) represents an important step in the evolution of risk assessment methodology to assist decision-making at hazardous waste sites. Despite considerable progress in the development of PRA techniques, regulatory acceptance of PRA has been limited, in part because a number of practical issues in its use must yet be resolved. A recent workshop on PRA identified several areas to be addressed, including the need for: (1) better demonstration of the value of PRA in risk management; (2) PRA training and education opportunities; (3) the development of technical criteria for acceptability of a PRA; (4) policy decisions on acceptable risk distributions; (5) ways to deal with risk communication issues; and (6) a variety of technical issues, including ways to include estimates of variability and uncertainty associated with toxicity values. Solutions to many of these issues will require better dialog between risk assessors and risk managers than has existed in the past.

7.
State regulators in Florida recently approved a first-of-its-kind probabilistic risk assessment (PRA) for determining an alternative residential Soil Cleanup Target Level (SCTL) for dioxin (32 ng/kg TEQ). The default residential SCTL (7 ng/kg TEQ) is based on a single, deterministic calculation with numerous conservative assumptions, resulting in an overly conservative value far beyond the regulatory mandate (i.e., a 10⁻⁶ increase in cancer risk). Conversely, this PRA used a Monte Carlo simulation to estimate risk for all members of a large population using a combination of scientific data and professional judgment, with final details developed during negotiations with regulators. The simulation parameters were defined probabilistically and reflect the ranges of values for the following exposure variables: body weight, exposure duration, exposure frequency, fraction from contaminated source, soil ingestion rate, and relative bioavailability. Other variable and uncertain parameters were treated deterministically per direction from the regulators. The state also required that a pre-supposed high-risk subpopulation be analyzed separately from the full receptor population. Despite the conservativeness of the alternative SCTL, this PRA represents a significant step toward more realistic estimates of human health risks caused by environmental contaminant exposure.
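The exposure variables listed above lend themselves to a compact Monte Carlo sketch. Everything below is a hypothetical illustration: the distributions, the percentile choice, and the averaging time are assumed for demonstration and are not the parameters negotiated with the Florida regulators.

```python
import random

random.seed(1)

def simulate_add(n=10_000):
    """95th-percentile average daily dose (mg/kg-day) per unit soil
    concentration (mg/kg), sampled receptor by receptor. All distributions
    are illustrative assumptions."""
    doses = []
    for _ in range(n):
        bw  = max(random.gauss(70, 15), 10)    # body weight, kg
        ed  = random.uniform(5, 30)            # exposure duration, years
        ef  = random.uniform(200, 350)         # exposure frequency, days/year
        fc  = random.uniform(0.3, 1.0)         # fraction from contaminated source
        ir  = random.lognormvariate(3.4, 0.6)  # soil ingestion rate, mg/day
        rba = random.uniform(0.4, 1.0)         # relative bioavailability
        at  = 70 * 365                         # averaging time, days (lifetime)
        # Standard intake form: IR(kg/day) * EF * ED * FC * RBA / (BW * AT), annualized.
        dose = ir * 1e-6 * ef * ed * fc * rba / (bw * at) * 365
        doses.append(dose)
    doses.sort()
    return doses[int(0.95 * n)]                # pick the 95th percentile

print(f"dose per unit concentration: {simulate_add():.2e}")
```

Dividing a target risk by this percentile dose (times a slope factor) is the general shape of how a probabilistic cleanup level falls out of such a simulation.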

8.
Recreational reuse of contaminated lands can be a sustainable and cost-effective approach to address the increasing demand on land availability for community development as well as for ecologically valuable conservation areas. Conservatively, residential exposure defaults are often applied for evaluation of an alternative future land use. A risk-based fit-for-purpose approach is preferred in evaluating contaminated sites for reuse purposes. The consideration of exposure factors relevant to the anticipated land use is an important step in performing these assessments. Data from two time activity databases, the Consolidated Human Activity Database (CHAD) and the American Time Use Survey (ATUS), were analyzed for activities that might occur at recreational sites for two important exposure factors: duration and frequency of site activities. The information in these databases indicated that the majority of “doers” spent a total time less than 4–6 h per day on these activities. Only a very small percentage of the survey populations participated in these recreational activities. Limited information on activity frequency was reported in the two data sets. This analysis supports modification to the duration of activities at recreational reuse sites as compared to residential defaults, and provides a basis for the development of alternate values.

9.
This paper presents the results of deliberations from participants who met on the second day of the Fourth Annual Workshop on the Evaluation of Uncertainty/Safety Factors in Health Risk Assessment. The group reviewed the previous day's presentations and implications for improvement in risk assessment. After much discussion, the group concluded that, in the short term, significant improvements could be made in the pharmacokinetic component of the inter-species uncertainty factor and developed a series of default options for this factor. These defaults consider route of exposure (oral or inhalation), and the form of the active compound (parent, metabolite, or very reactive metabolite). Several assumptions are key to this approach, such as a similar oral or inhalation bioavailability across species. We believe this method represents a useful default approach until more compound-specific information is available.

10.
This paper studies P2P lending and the factors explaining loan default. This is an important issue because in P2P lending individual investors bear the credit risk, instead of financial institutions, which are experts in dealing with this risk. P2P lenders suffer a severe problem of information asymmetry, because they are at an informational disadvantage relative to the borrower. For this reason, P2P lending sites provide potential lenders with information about borrowers and their loan purpose. They also assign a grade to each loan. The empirical study is based on loan data collected from Lending Club (N = 24,449) from 2008 to 2014, first analyzed using univariate means tests and survival analysis. Factors explaining default are loan purpose, annual income, current housing situation, credit history and indebtedness. Secondly, a logistic regression model is developed to predict defaults. The grade assigned by the P2P lending site is the most predictive factor of default, but the accuracy of the model is improved by adding other information, especially the borrower’s debt level.
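The paper's second step, a logistic regression predicting default, can be sketched as follows. The data here are synthetic, and "grade" and "dti" are stand-ins for the Lending Club fields; the coefficients and the fitting method (plain gradient descent) are illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic loans: a grade (1 = A ... 7 = G) and a debt-to-income ratio,
# with a hypothetical "true" default process driven by both.
n = 2000
grade = rng.integers(1, 8, n)
dti = rng.uniform(0, 40, n)                       # debt-to-income, %
true_logit = -4.0 + 0.5 * grade + 0.05 * dti      # assumed relationship
default = rng.random(n) < 1 / (1 + np.exp(-true_logit))

# Fit logistic regression by gradient descent on the log-loss.
X = np.column_stack([np.ones(n), grade, dti])
w = np.zeros(3)
for _ in range(3000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.001 * X.T @ (p - default) / n          # average log-loss gradient

p = 1 / (1 + np.exp(-X @ w))
accuracy = np.mean((p > 0.5) == default)
print(f"in-sample accuracy = {accuracy:.2f}")
```

The paper's finding maps onto this sketch directly: the grade coefficient carries most of the signal, and adding the debt variable refines the prediction.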

11.
The Association for Environmental Health and Sciences Foundation has been collecting information on state-by-state petroleum cleanup levels (CULs) for soil since 1990, with the most recent survey in 2012. These data form the basis for this analysis, including a comparison of the CULs to U.S. Environmental Protection Agency (USEPA) regulatory values. The results illustrate the evolving complexity of state regulatory approaches to petroleum mixtures; benzene, toluene, ethylbenzene, and xylenes; and carcinogenic polycyclic aromatic hydrocarbons, as well as the use of multiple exposure scenarios and pathways to regulate petroleum in soil. Different fractionation approaches in use by various states and the USEPA are discussed, their strengths and limitations are reviewed, and their implications for site CULs are evaluated. Because of an increasing array of scenarios and pathways, CUL ranges have widened over time. As the regulatory environment for petroleum releases becomes more complex, it is increasingly important to develop a conceptual site model for fate, transport, land use assumptions, and exposure pathways at petroleum-contaminated sites to enable selection of the most appropriate CULs available.

12.
The potential application of categorical (i.e., species, pathway, or group specific) defaults for several components of uncertainty relevant to development of tolerable or reference concentrations/doses is considered, namely interspecies variation and adequacy of database. For the former, the adequacy of allometric scaling by body surface area as a species-specific default for oral tolerable or reference doses is considered. For the latter, the extent to which data from analyses of subchronic:chronic effect levels, LOAELs/NOAELs, and critical effect levels for complete versus incomplete datasets informs selection of defaults is examined. The relative role of categorical defaults for these aspects is considered in the context of the continuum of increasingly data-informed approaches to characterization of uncertainty and variability that range from default (“presumed protective”) to “biologically based predictive”.

13.
Federal and state regulatory water quality standards have existed in the US for more than a decade, while none existed for soil. More recently, the US Environmental Protection Agency (US EPA) developed a procedural model to determine minimum contaminant levels in soil that may require further investigation at the federal level. This model to determine federal soil screening levels (SSLs) by the US EPA has been slightly modified to determine state regulatory soil residual contaminant levels (RCLs) in Wisconsin. We present simplified equations for use on semivolatile compounds to evaluate these regulatory soil levels in residential settings, and in the process, we show where regulatory federal and state soil contaminant levels may differ. Establishing generic soil cleanup levels requires determining the smallest of the acceptable contaminant levels that are still considered protective of human health for all exposure pathways of concern. For the protection of direct human exposure pathways, the State of Wisconsin uses residential assumptions which are generally more conservative than federal defaults to determine acceptable levels. However, when the indirect ingestion pathway via leaching through groundwater is considered, the federal generic SSLs may be more conservative than Wisconsin's generic RCLs when an organic contaminant's sorption coefficient, Koc, falls between 4123 and 39,000 ml/g for non-carcinogens, and between 8568 and 45,525 ml/g for carcinogens. The simplified equations are used on several agricultural chemicals with current generic SSLs. Agricultural chemicals are unique because, unlike other compounds, they are designed for dispersal into the environment at legal application rates, and their generic cleanup levels may be developed based on their legal use rates. However, both the generic US EPA and Wisconsin models imply that if some agricultural chemicals are found at depth, even at use-rate levels, groundwater quality may be adversely affected.
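The role Koc plays in the leaching comparison can be seen in a sketch of the generic US EPA migration-to-groundwater screening form, SSL = Cw × DAF × [Kd + (θw + θa·H′)/ρb] with Kd = Koc × foc. The parameter values below are generic illustrative defaults, not the exact US EPA or Wisconsin inputs, and the example compound is hypothetical.

```python
def leaching_ssl(cw_mg_per_l, koc_ml_per_g, daf=20.0, foc=0.002,
                 theta_w=0.3, theta_a=0.13, henry=0.0, rho_b=1.5):
    """Soil screening level (mg/kg) protective of groundwater.

    cw_mg_per_l : target groundwater concentration (mg/L)
    koc_ml_per_g: organic-carbon sorption coefficient
    daf         : dilution-attenuation factor (generic default)
    foc, theta_w, theta_a, rho_b: soil properties (illustrative values)
    henry       : dimensionless Henry's law constant (0 for non-volatiles)
    """
    kd = koc_ml_per_g * foc                       # soil-water partition, L/kg
    return cw_mg_per_l * daf * (kd + (theta_w + theta_a * henry) / rho_b)

# Hypothetical non-volatile herbicide: 5 ug/L groundwater target, Koc of
# 10,000 ml/g (inside the range where the paper notes federal and state
# results can diverge).
print(round(leaching_ssl(0.005, 10_000), 2))
```

Because Kd scales linearly with Koc, the screening level grows with sorption strength, which is why the federal/state comparison flips only inside particular Koc windows.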

14.
Decision-makers at all levels of public health and transfusion medicine have always assessed the risks and benefits of their decisions. Decisions are usually guided by immediately available information and a significant amount of experience and judgment. For decisions concerning familiar situations and common problems, judgment and experience may work quite well, but this type of decision process can lack clarity and accountability. Public health challenges are changing as emerging diseases and expensive technologies complicate the decision-makers' task, confronting the decision-maker with new problems that include multiple potential solutions. Decisions regarding policies and adoption of technologies are particularly complex in transfusion medicine due to the scope of the field, implications for public health, and legal, regulatory and public expectations regarding blood safety. To assist decision-makers, quantitative risk assessment and cost-effectiveness analysis are now being more widely applied. This set of articles will introduce risk assessment and cost-effectiveness methodologies and discuss recent applications of these methods in transfusion medicine.

15.
Counterparty risk denotes the risk that a party defaults in a bilateral contract. This risk not only depends on the two parties involved, but also on the risk from various other contracts each of these parties holds. In rather informal markets, such as the OTC (over-the-counter) derivative market, institutions only report their aggregated quarterly risk exposure, but no details about their counterparties. Hence, little is known about the diversification of counterparty risk. In this paper, we reconstruct the weighted and time-dependent network of counterparty risk in the OTC derivatives market of the United States between 1998 and 2012. To proxy unknown bilateral exposures, we first study the co-occurrence patterns of institutions based on their quarterly activity and ranking in the official report. The network obtained this way is further analysed by a weighted k-core decomposition, to reveal a core-periphery structure. This allows us to compare the activity-based ranking with a topology-based ranking, to identify the most important institutions and their mutual dependencies. We also analyse correlations in these activities, to show strong similarities in the behavior of the core institutions. Our analysis clearly demonstrates the clustering of counterparty risk in a small set of about a dozen US banks. This not only increases the default risk of the central institutions, but also the default risk of peripheral institutions which have contracts with the central ones. Hence, all institutions indirectly have to bear (part of) the counterparty risk of all others, which needs to be better reflected in the price of OTC derivatives.
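The core-periphery analysis above rests on k-core decomposition. A minimal unweighted sketch is shown below (the authors use a weighted variant); the toy graph is hypothetical, not OTC data.

```python
def core_numbers(adj):
    """Iteratively peel the vertex of minimum remaining degree; the core
    number of a vertex is the largest k such that it survives in the
    k-core (every remaining vertex has degree >= k)."""
    degree = {v: len(nbrs) for v, nbrs in adj.items()}
    core = {}
    remaining = set(adj)
    k = 0
    while remaining:
        v = min(remaining, key=degree.get)   # lowest-degree vertex
        k = max(k, degree[v])                # core index never decreases
        core[v] = k
        remaining.remove(v)
        for u in adj[v]:
            if u in remaining:
                degree[u] -= 1
    return core

# Toy network: a triangle "core" (A, B, C) with pendant "periphery" nodes.
adj = {
    "A": {"B", "C", "x"}, "B": {"A", "C", "y"}, "C": {"A", "B"},
    "x": {"A"}, "y": {"B"},
}
cores = core_numbers(adj)
print(cores["A"], cores["x"])  # → 2 1
```

In the paper's setting, the dozen central US banks are precisely the vertices with the highest core number, and the pendant nodes play the role of peripheral institutions whose risk depends on the core.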

16.
Despite data gaps and information shortfalls, government agencies in the United States are expected to produce timely and defensible decisions to regulate pesticide use under the Federal Insecticide, Fungicide, and Rodenticide Act and in compliance with the Endangered Species Act. The decision to register a pesticide is predicated on a conclusion that no unreasonable effects will accrue to the environment, including threatened and endangered species. We recognize that the definition of acceptable risk is a policy judgment stemming from legislative language and judicial interpretation. However, a common risk assessment approach with similar technical underpinnings and a high degree of transparency used by all the agencies would be cost effective and more likely to achieve consensus among interested parties. Quantitative probabilistic risk assessment (PRA) methods can be used to develop risk estimates and to describe the level of confidence in these estimates. PRA methods can also differentiate among the contributions of natural stochasticity, measurement variability, and lack of knowledge. Because this approach enhances transparency and increases understanding of the implications of limited data sets and associated assumptions, we encourage the appropriate agencies to implement PRA methods as a means of reaching common ground when assessing risks of pesticides to listed species.

17.
Major sources of arsenic exposure for humans are foods, particularly aquatic organisms, which are called seafood in this report. Although seafood contains a variety of arsenicals, including inorganic arsenic, which is toxic and carcinogenic, and arsenobetaine, which is considered nontoxic, the arsenic content of seafood commonly is reported only as total arsenic. A goal of this literature survey is to determine if generalizable values can be derived for the percentage of total arsenic in seafood that is inorganic arsenic. Generalizable values for percent inorganic arsenic are needed for use as default values in U.S. human health risk assessments of seafood from arsenic-contaminated sites. Data from the worldwide literature indicate the percent of inorganic arsenic in marine/estuarine finfish does not exceed 7.3% and in shellfish can reach 25% in organisms from presumably uncontaminated areas, with few data available for freshwater organisms. However, percentages can be much higher in organisms from contaminated areas and in seaweed. U.S. site-specific data for marine/estuarine finfish and shellfish are similar to the worldwide data, and for freshwater finfish indicate that the average percent inorganic arsenic is generally < 10%, but ranges up to nearly 30%. Derivation of nationwide defaults for percent inorganic arsenic in fish, shellfish, and seaweed collected from arsenic-contaminated areas in the United States is not supported by the surveyed literature.
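The survey reports percent-inorganic values rather than a single nationwide default; in practice, an assessor applies such a percentage to a measured total-arsenic concentration. The calculation is trivial but worth making explicit; the fish concentration below is a hypothetical input, while the 7.3% figure is the marine/estuarine finfish upper bound reported above.

```python
def inorganic_arsenic(total_as_mg_per_kg, percent_inorganic):
    """Estimated inorganic arsenic (mg/kg) from a total-As measurement
    and an assumed percent-inorganic value."""
    return total_as_mg_per_kg * percent_inorganic / 100.0

# Hypothetical marine finfish sample: 2.0 mg/kg total As, using the 7.3%
# upper bound reported for uncontaminated marine/estuarine finfish.
print(inorganic_arsenic(2.0, 7.3))  # → 0.146 mg/kg
```

The paper's caution applies directly: at a contaminated site the appropriate percentage may be far higher, so a site-specific value should replace the literature bound where data exist.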

18.
Plasma renin activity (PRA) and aldosterone (PA) levels are characterized by a circadian rhythmicity (CR). The present study revealed that this rhythmicity is influenced by several factors, including posture, sodium intake and age. Time-qualified PRA and PA reference intervals can reduce the incidence of false positives and false negatives in a diagnostic work-up. The circadian rhythmicity of PRA and PA has been quantified in relation to posture, sodium intake and age. The cosinor procedure has been applied to quantify the properties of the circadian rhythmicity under these conditions.

Chronograms and circadian parameters can be used to optimize the use of PRA and PA measurements in clinical practice. The chronobiological specification of reference values for PRA and PA is of considerable value, since the assessment of PRA and PA circadian rhythmicity has diagnostic interest for certain types of clinical disorders. It should be noted that several studies have described circannual variations for renin and aldosterone. The next step in the optimization of laboratory time-qualified reference values is the assessment of changes induced by the deterministic factors on a circannual domain.
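The cosinor procedure mentioned above is a least-squares fit of Y(t) = M + A·cos(ωt + φ), usually linearized as M + a·cos(ωt) + b·sin(ωt). A minimal single-component sketch follows; the time series is synthetic, not clinical PRA data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 24-h rhythm: MESOR 1.5, amplitude 0.8, peak (acrophase) at 08:00.
t = np.arange(0, 24, 2.0)                      # sampling times, hours
w = 2 * np.pi / 24                             # angular frequency, 24-h period
y = 1.5 + 0.8 * np.cos(w * (t - 8)) + rng.normal(0, 0.05, t.size)

# Linearized cosinor: regress on [1, cos(wt), sin(wt)].
X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
mesor, a, b = np.linalg.lstsq(X, y, rcond=None)[0]
amplitude = np.hypot(a, b)
acrophase_h = (np.arctan2(b, a) / w) % 24      # hour of peak

print(round(mesor, 2), round(amplitude, 2), round(acrophase_h, 1))
```

The fitted MESOR, amplitude, and acrophase are exactly the "circadian parameters" from which time-qualified reference intervals can be built.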

19.
(Eco)toxicity studies conducted according to internationally standardized test guidelines are often considered reliable by default and preferred as key evidence in regulatory risk assessment. At the same time, regulatory agencies emphasize the use of all relevant (eco)toxicity data in the risk assessment process, including non-standard studies. However, there is a need to facilitate the use of such studies in regulatory risk assessment. Therefore, we propose a framework that facilitates a systematic and transparent evaluation of the reliability and relevance of (eco)toxicity in vivo studies for health and environmental risk assessment. The framework includes specific criteria to guide study evaluation, as well as a color-coding tool developed to aid the application of these criteria. In addition, we provide guidance intended for researchers on how to report non-standard studies to ensure that they meet regulatory requirements. The intention of the evaluating and reporting criteria is to increase the usability of all relevant data that may fill information gaps in chemical risk assessments. The framework is publicly available online, free of charge, at the Science in Risk Assessment and Policy (SciRAP) website: www.scirap.org. The aim of this article is to present the framework and resources available at the SciRAP website.

20.
State environmental regulatory agencies in the U.S. often establish a default background standard for naturally occurring elements in the soil, water, and air. The background standard is determined and then used as a benchmark across the entire jurisdiction. A variety of statistical techniques are used to determine this standard, but often ignore any inherent spatial dependencies within the jurisdiction. If the analysis indicates a specific site exceeds the default standard, additional background sampling and analysis must usually be performed. Frequently, this additional sampling is found to be unnecessary simply because the natural background levels were elevated for this particular site. Conversely, potential contamination may be overlooked in areas where the natural background levels are much lower. Thus, a single default background standard seems inadequate within this context.

This paper proposes the use of dissimilarity coefficients based on kriging estimates as a means to regionalize background standards. Along with cluster analysis techniques, these dissimilarity coefficients provide a means to stratify the population into geographic sub-areas. A regulatory agency may now define multiple default background standards based on geographic location. To illustrate, this paper examines a case study concerning residential soil arsenic for 83 Michigan counties.
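The final stratification step can be sketched by clustering county-level summary statistics so that each cluster receives its own background standard. A plain 1-D two-means split stands in for the paper's kriging-based dissimilarity coefficients, and the county arsenic means are hypothetical, not the Michigan data.

```python
def two_means(values, iters=20):
    """Split a 1-D list into a low and a high cluster (Lloyd-style 2-means)."""
    lo, hi = min(values), max(values)
    for _ in range(iters):
        groups = ([], [])
        for v in values:
            # index 1 (high cluster) when v is closer to the high centroid
            groups[abs(v - hi) < abs(v - lo)].append(v)
        lo, hi = (sum(g) / len(g) for g in groups)
    return groups

# Hypothetical county-mean soil arsenic (mg/kg): two apparent regimes.
county_mean_as = [4.1, 3.8, 4.5, 5.0, 11.2, 12.0, 10.8, 4.2]
low, high = two_means(county_mean_as)

# Each geographic sub-area could then carry its own default background
# standard, e.g. derived from its own cluster's upper values:
print(max(low), max(high))
```

A single jurisdiction-wide default (here, one value for all eight counties) would either flag the high-background region spuriously or mask contamination in the low-background one, which is exactly the problem the paper identifies.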

