Similar Articles
1.
The 2001 European Commission proposal for the Registration, Evaluation and Authorisation of Chemicals (REACH) aims to improve public and environmental health by assessing the toxicity of, and restricting exposure to, potentially toxic chemicals. The greatest benefits are expected to accrue from decreased cancer incidences. Hence, the accurate identification of chemical carcinogens must be a top priority for the REACH system. Due to a paucity of human clinical data, the identification of potential human carcinogens has conventionally relied on animal tests. However, our survey of the US Environmental Protection Agency's (EPA's) toxic chemicals database revealed that, for a majority of the chemicals of greatest public health concern (93/160, i.e. 58.1%), the EPA found animal carcinogenicity data inadequate to support classifications of probable human carcinogen or non-carcinogen. A wide variety of species were used, with rodents predominating; a wide variety of routes of administration were used; and a particularly wide variety of organ systems were affected. These factors raise serious biological obstacles that render accurate extrapolation to humans profoundly difficult. Furthermore, significantly different International Agency for Research on Cancer assessments of identical chemicals indicate that the true human predictivity of animal carcinogenicity data is even poorer than the EPA figures alone suggest. Consequently, we propose the replacement of animal carcinogenicity bioassays with a tiered combination of non-animal assays, which can be expected to yield a weight-of-evidence characterisation of carcinogenic risk with superior human predictivity. Additional advantages include substantial savings of financial, human and animal resources, and potentially greater insights into mechanisms of carcinogenicity.

2.
The idea that synthetic chemicals such as DDT are major contributors to human cancer has been inspired, in part, by Rachel Carson's passionate book, Silent Spring. This chapter discusses evidence showing why this is not true. We also review research on the causes of cancer, and show why much cancer is preventable. Epidemiological evidence indicates several factors likely to have a major effect on reducing rates of cancer: reduction of smoking, increased consumption of fruits and vegetables, and control of infections. Other factors are avoidance of intense sun exposure, increases in physical activity, and reduction of alcohol consumption and possibly red meat. Already, risks of many forms of cancer can be reduced, and the potential for further reductions is great. If lung cancer (which is primarily due to smoking) is excluded, cancer death rates are decreasing in the United States for all other cancers combined. Pollution appears to account for less than 1% of human cancer; yet public concern and resource allocation for chemical pollution are very high, in good part because of the use of animal cancer tests in cancer risk assessment. Animal cancer tests, which are done at the maximum tolerated dose (MTD), are being misinterpreted to mean that low doses of synthetic chemicals and industrial pollutants are relevant to human cancer. About half of the chemicals tested, whether synthetic or natural, are carcinogenic to rodents at these high doses. A plausible explanation for the high frequency of positive results is that testing at the MTD frequently can cause chronic cell killing and consequent cell replacement, a risk factor for cancer that can be limited to high doses. Ignoring this greatly exaggerates risks. Scientists must determine mechanisms of carcinogenesis for each substance and revise acceptable dose levels as understanding advances. The vast bulk of chemicals ingested by humans is natural.
For example, 99.99% of the pesticides we eat are naturally present in plants to ward off insects and other predators. Half of these natural pesticides tested at the MTD are rodent carcinogens. Reducing exposure to the 0.01% that are synthetic will not reduce cancer rates. On the contrary, although fruits and vegetables contain a wide variety of naturally occurring chemicals that are rodent carcinogens, inadequate consumption of fruits and vegetables doubles the human cancer risk for most types of cancer. Making them more expensive by reducing synthetic pesticide use will increase cancer. Humans also ingest large numbers of natural chemicals from cooking food. Over a thousand chemicals have been reported in roasted coffee: more than half of those tested (19/28) are rodent carcinogens. There are more rodent carcinogens in a single cup of coffee than potentially carcinogenic pesticide residues in the average American diet in a year, and there are still a thousand chemicals left to test in roasted coffee. This does not mean that coffee is dangerous, but rather that animal cancer tests and worst-case risk assessment build in enormous safety factors and should not be considered true risks. The reason humans can eat the tremendous variety of natural chemical "rodent carcinogens" is that humans, like other animals, are extremely well protected by many general defense enzymes, most of which are inducible (i.e., whenever a defense enzyme is in use, more of it is made). Since the defense enzymes are equally effective against natural and synthetic chemicals, one does not expect, nor does one find, a general difference between synthetic and natural chemicals in their ability to cause cancer in high-dose rodent tests. The idea that there is an epidemic of human cancer caused by synthetic industrial chemicals is false. In addition, there is a steady rise in life expectancy in the developed countries.
Linear extrapolation from the maximum tolerated dose in rodents to low-level exposure in humans has led to grossly exaggerated mortality forecasts. Such extrapolations cannot be verified by epidemiology. Furthermore, relying on such extrapolations for synthetic chemicals while ignoring the enormous natural background leads to an imbalanced perception of hazard and allocation of resources. It is the progress of scientific research and technology that will continue to lengthen human life expectancy. Zero exposure to rodent carcinogens cannot be achieved. Low levels of rodent carcinogens of natural origin are ubiquitous in the environment. It is thus impossible to obtain conditions totally free of exposure to rodent carcinogens or to background radiation. Major advances in analytical techniques enable the detection of extremely low concentrations of all substances, whether natural or synthetic, often thousands of times lower than could be detected 30 years ago. Risks compete with risks: society must distinguish between significant and trivial risks. Regulating trivial risks, or exposure to substances erroneously inferred to cause cancer at low doses, can harm health by diverting resources from programs that could be effective in protecting the health of the public. Moreover, wealth creates health: poor people have shorter life expectancies than wealthy people. When money and resources are wasted on trivial problems, society's wealth, and hence its health, is harmed.
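The linear low-dose extrapolation criticized above can be illustrated with a minimal sketch. All numbers here are invented for demonstration and are not taken from any bioassay:

```python
# Hypothetical illustration of linear extrapolation from a rodent MTD
# bioassay down to typical human exposure -- the practice the abstract
# argues produces grossly exaggerated risk estimates.

def linear_extrapolated_risk(excess_incidence_at_mtd: float,
                             mtd_mg_per_kg_day: float,
                             human_dose_mg_per_kg_day: float) -> float:
    """Assume excess tumor risk scales linearly with dose, with no threshold."""
    slope = excess_incidence_at_mtd / mtd_mg_per_kg_day  # risk per mg/kg/day
    return slope * human_dose_mg_per_kg_day

# Invented example: 30% excess tumor incidence at an MTD of 100 mg/kg/day,
# versus a typical human exposure of 0.001 mg/kg/day.
risk = linear_extrapolated_risk(0.30, 100.0, 0.001)
print(f"Extrapolated lifetime risk: {risk:.1e}")  # 3.0e-06
```

The entire calculation rests on the no-threshold assumption; if high-dose cell killing and compensatory cell division drive the rodent tumors, the true low-dose risk could be far smaller or zero, which is precisely the authors' point.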

3.
The regulation of human exposure to potentially carcinogenic chemicals constitutes society's most important use of animal carcinogenicity data. Environmental contaminants of greatest concern within the USA are listed in the Environmental Protection Agency's (EPA's) Integrated Risk Information System (IRIS) chemicals database. However, of the 160 IRIS chemicals lacking even limited human exposure data but possessing animal data that had received a human carcinogenicity assessment by 1 January 2004, we found that in most cases (58.1%; 93/160), the EPA considered animal carcinogenicity data inadequate to support a classification of probable human carcinogen or non-carcinogen. For the 128 chemicals with human or animal data also assessed by the World Health Organisation's International Agency for Research on Cancer (IARC), human carcinogenicity classifications were compatible with EPA classifications only for those 17 having at least limited human data (p = 0.5896). For those 111 primarily reliant on animal data, the EPA was much more likely than the IARC to assign carcinogenicity classifications indicative of greater human risk (p < 0.0001). The IARC is a leading international authority on carcinogenicity assessments, and its significantly different human carcinogenicity classifications of identical chemicals indicate that: 1) in the absence of significant human data, the EPA is over-reliant on animal carcinogenicity data; 2) as a result, the EPA tends to over-predict carcinogenic risk; and 3) the true predictivity for human carcinogenicity of animal data is even poorer than is indicated by EPA figures alone. The EPA policy of erroneously assuming that tumours in animals are indicative of human carcinogenicity is implicated as a primary cause of these errors.

4.
5.
Rosenkranz HS. Mutation Research 2003, 529(1-2): 117-127
The health risk manager and policy analyst must frequently make recommendations based upon incomplete toxicity data. This situation is often encountered in the evaluation of human carcinogenic risks, as animal cancer bioassay results are frequently unavailable. In this study, in order to assess the relevance of other possible indicators of carcinogenic risk, we used the "chemical diversity approach" to estimate the magnitude of human carcinogenic risk based upon Salmonella mutagenicity and systemic toxicity data for the "universe of chemicals" to which humans have the potential to be exposed. Analyses of the properties of 10,000 agents representative of the "universe of chemicals" suggest that chemicals that have genotoxic potential and exhibit greater systemic toxicity are more likely to be carcinogens than non-genotoxicants or agents that exhibit lesser toxicity. Since "genotoxic" carcinogenicity is a hallmark of recognized human carcinogens, these findings are relevant to human cancer risk assessment.

6.
Brazil is the biggest market for pesticides in the world. In the registration process, a pesticide must be authorized by the Institute of the Environment, the Health Surveillance Agency and the Ministry of Agriculture. Evaluations follow a package of toxicological studies submitted by the companies and are also based on Brazilian pesticide law. We compared data produced by private laboratories and submitted to the Institute of the Environment for registration with data on the mutagenicity, carcinogenicity and teratogenicity of pesticides obtained from scientific databases. All studies submitted by the companies were carried out by private laboratories. Of 247 pesticide formulations analyzed, none showed positive results for mutagenicity, carcinogenicity or teratogenicity. Of 574 articles in the scientific literature, 84% of those published by public laboratories showed positive results, while 79% of those showing negative results came from private laboratories. There is an ethical concern about a conflict of interest between public/independent laboratories and the private laboratories that produce data for registering pesticides. We demonstrated that there is a clear contradiction between public and private laboratories. Brazilian regulatory authorities have approved the registration of pesticides based almost exclusively on the monographs provided by the pesticide industry, because the use of scientific articles or information from the independent literature is strongly belittled by the industry. Pesticide companies argue that scientific articles cannot be trusted, and, according to the industry, pesticide registration cannot be refused based on results from scientific articles. Thus, the registration of pesticides with mutagenic, carcinogenic and teratogenic risks has been approved in Brazil.

7.
The Ames test has probably been the most common prescreening test for potential carcinogens. This system, however, occasionally presents false-positive or false-negative results for certain kinds of chemicals. We chose 24 agents, most of which showed a discrepancy in their results with the animal carcinogenicity test and the Ames test, and screened them by the DNA synthesis inhibition test using human fibroblasts. Among 22 agents, 13 (59%) showed consistent results, 6 (27%) showed discrepant results with those obtained by animal tests and 3 (14%) showed ambiguous results. The majority of the DNA synthesis inhibition test results for these chemicals were consistent with the results obtained by whole animal tests, suggesting that the DNA synthesis inhibition test with mammalian cells in culture is a reasonably reliable prescreening test for potential carcinogens.

8.
Biomarkers are becoming increasingly important in toxicology and human health. Many research groups are carrying out studies to develop biomarkers of exposure to chemicals and apply these for human monitoring. There is considerable interest in the use and application of biomarkers to identify the nature and amounts of chemical exposures in occupational and environmental situations. Major research goals are to develop and validate biomarkers that reflect specific exposures and permit the prediction of the risk of disease in individuals and groups. One important objective is to prevent human cancer. This review presents a commentary and consensus views about the major developments on biomarkers for monitoring human exposure to chemicals. A particular emphasis is on monitoring exposures to carcinogens. Significant developments in the areas of new and existing biomarkers, analytical methodologies, validation studies and field trials together with auditing and quality assessment of data are discussed. New developments in the relatively young field of toxicogenomics possibly leading to the identification of individual susceptibility to both cancer and non-cancer endpoints are also considered. The construction and development of reliable databases that integrate information from genomic and proteomic research programmes should offer a promising future for the application of these technologies in the prediction of risks and prevention of diseases related to chemical exposures. Currently adducts of chemicals with macromolecules are important and useful biomarkers especially for certain individual chemicals where there are incidences of occupational exposure. For monitoring exposure to genotoxic compounds protein adducts, such as those formed with haemoglobin, are considered effective biomarkers for determining individual exposure doses of reactive chemicals. 
For other organic chemicals, the excreted urinary metabolites can also give a useful and complementary indication of exposure for acute exposures. These methods have revealed 'backgrounds' in people not knowingly exposed to chemicals, and the sources and significance of these need to be determined, particularly in the context of their contribution to background health risks.

9.

10.
BACKGROUND: Toxicology studies utilizing animals and in vitro cellular or tissue preparations have been used to study the toxic effects and mechanisms of action of drugs and chemicals, to determine effective and safe doses of drugs in humans, and to assess the risk of toxicity from chemical exposures. Testing in animals could be improved if dosing on a mg/kg basis were abandoned and drugs and chemicals were instead administered to compare the effects of pharmacokinetically and toxicokinetically equivalent serum levels in the animal model and the human. Because alert physicians or epidemiology studies, not animal studies, have discovered most human teratogens and toxicities in children, animal studies play a minor role in discovering teratogens and agents that are deleterious to infants and children. In vitro studies play an even less important role, although they are helpful in describing the cellular or tissue effects of drugs or chemicals and their mechanisms of action. One cannot determine the magnitude of human risk from in vitro studies when they are the only source of toxicology data. METHODS: Toxicology studies on adult animals are carried out by pharmaceutical companies, chemical companies, the Food and Drug Administration (FDA), many laboratories at the National Institutes of Health, and scientific investigators in laboratories throughout the world. Although a vast number of animal toxicology studies have been carried out on pregnant and adult animals, there is a paucity of studies utilizing newborn, infant, and juvenile animals. This deficiency is compounded by the fact that very few toxicology studies are carried out in children. That is one reason why pregnant women and children are referred to as "therapeutic orphans." RESULTS: When animal studies are carried out with newborn and developing animals, the results demonstrate that generalizations are less applicable and less predictable than in toxicology studies on pregnant animals.
Although many studies show that infants and developing animals may have difficulty in metabolizing drugs and are more vulnerable to the toxic effects of environmental chemicals, there are exceptions that indicate that infants and developing animals may be less vulnerable and more resilient to some drugs and chemicals. In other words, the generalization that developing animals are always more sensitive to environmental toxicants is not valid. For animal toxicology studies to be useful, they have to utilize modern concepts of pharmacokinetics and toxicokinetics, as well as "mechanism of action" (MOA) studies, to determine whether animal data can be utilized for determining human risk. One example is the inability to determine carcinogenic risks in humans for some drugs and chemicals that produce tumors in rodents when the oncogenesis is the result of peroxisome proliferation, a reaction that is of diminished importance in humans. CONCLUSIONS: Scientists can utilize animal studies to study the toxicokinetic and toxicodynamic aspects of drugs and environmental toxicants. But they have to be carried out with the most modern techniques and interpreted with the highest level of scholarship and objectivity. Threshold exposures, no-observed-adverse-effect-level (NOAEL) exposures, and toxic effects can be determined in animals, but have to be interpreted with caution when applying them to the human. Adult problems in growth, endocrine dysfunction, neurobehavioral abnormalities, and oncogenesis may be related to exposures to drugs, chemicals, and physical agents during development, and may be fruitful areas for investigation. Maximum permissible exposures have to be based on data, not on generalizations that are applied to all drugs and chemicals. Epidemiology studies are still the best methodology for determining the human risk and the effects of environmental toxicants. Carrying out these focused studies in developing humans will be difficult. Animal studies may be our only alternative for answering many questions with regard to specific postnatal developmental vulnerabilities.

11.
Conventional animal carcinogenicity tests take around three years to design, conduct and interpret. Consequently, only a tiny fraction of the thousands of industrial chemicals currently in use have been tested for carcinogenicity. Despite the costs of hundreds of millions of dollars and millions of skilled personnel hours, as well as millions of animal lives, several investigations have revealed that animal carcinogenicity data lack human specificity (i.e. the ability to identify human non-carcinogens), which severely limits the human predictivity of the bioassay. This is due to the scientific inadequacies of many carcinogenicity bioassays, and numerous serious biological obstacles, which render profoundly difficult any attempts to accurately extrapolate animal data in order to predict carcinogenic hazards to humans. Proposed modifications to the conventional bioassays have included the elimination of mice as a second species, and the use of genetically-altered or neonatal mice, decreased study durations, initiation-promotion models, the greater incorporation of toxicokinetic and toxicodynamic assessments, structure-activity relationship (computerised) systems, in vitro assays, cDNA microarrays for detecting changes in gene expression, limited human clinical trials, and epidemiological research. The potential advantages of non-animal assays when compared to bioassays include the superior human specificity of the results, substantially reduced time-frames, and greatly reduced demands on financial, personnel and animal resources. Inexplicably, however, the regulatory agencies have been frustratingly slow to adopt alternative protocols. In order to decrease the enormous cost of cancer to society, a substantial redirection of resources away from excessively slow and resource-intensive rodent bioassays, into the further development and implementation of non-animal assays, is both strongly justified and urgently required.

12.
C Ramel. Mutation Research 1986, 168(3): 327-342
The deployment of short-term assays for the detection of carcinogens inevitably has to be based on the genetic alterations actually involved in carcinogenesis. This paper gives an overview of oncogene activation and other mutagenic events connected with cancer induction. It is emphasized that there are indications of DNA alterations in carcinogenicity which are not in accordance with "conventional" mutations and mutation frequencies, as measured by short-term assays of point mutations, chromosome aberrations and numerical chromosome changes. This discrepancy between DNA alterations in carcinogenicity and the endpoints of short-term assays in current use includes transpositions, insertion mutations, polygene mutations, gene amplifications and DNA methylations. Furthermore, tumourigenicity may imply an induction of genetic instability, followed by a cascade of genetic alterations. The evaluation of short-term assays for carcinogenesis mostly involves two correlations: between mutation and animal cancer data on the one hand, and between animal cancer data and human carcinogenicity on the other. It should be stressed that animal bioassays for cancer in general imply tests specifically for the property of chemicals to function as complete carcinogens, which may be a rather poor reflection of the actual situation in human populations. The primary aim of short-term mutagenicity assays is to provide evidence as to whether a compound can be expected to cause mutations in humans, and such evidence has to be considered seriously even against a background of negative cancer data. For the evaluation of data from short-term assays, the massive amount of empirical data from different assays should be used, and new computer systems in that direction can be expected to provide improved predictions of carcinogenicity.

13.
111 chemicals of known rodent carcinogenicity (49 carcinogens, 62 noncarcinogens), including many promoters of carcinogenesis, nongenotoxic carcinogens, hepatocarcinogens, and halogenated hydrocarbons, were selected for study. The chemicals were administered by gavage at two dose levels to female Sprague-Dawley rats. The effects of these 111 chemicals on 4 biochemical assays (hepatic DNA damage by alkaline elution (DD), hepatic ornithine decarboxylase activity (ODC), serum alanine aminotransferase activity (ALT), and hepatic cytochrome P-450 content (P450)) were determined. Composite parameters are defined as follows: CP = [ODC and P450], CT = [ALT and ODC], and TS = [DD or CP or CT]. The operational characteristics of TS for predicting rodent cancer were sensitivity 55%, specificity 87%, positive predictivity 77%, negative predictivity 71%, and concordance 73%. For these chemicals, the 73% concordance of this study was superior to the concordance obtained from published data from other laboratories on the Ames test (53%), structural alerts (SA) (46%), chromosome aberrations in Chinese hamster ovary cells (ABS) (48%), cell mutation in mouse lymphoma L5178Y cells (MOLY) (52%), and sister-chromatid exchange in Chinese hamster ovary cells (SCE) (60%). The 4 in vivo biochemical assays were complementary to each other. The composite parameter TS also shows complementarity to all 5 other predictors of rodent cancer examined in this paper. For example, the Ames test alone has a concordance of only 53%. In combination with TS, the concordance is increased to 62% (Ames or TS) or to 63% (Ames and TS). For the 67 chemicals with data available for SA, the concordance for predicting rodent carcinogenicity was 47% (for SA alone), 54% (for SA or TS), and 66% (for SA and TS).
These biochemical assays will be useful: (1) to predict rodent carcinogenicity per se, (2) to 'confirm' the results of short-term mutagenicity tests by the high specificity mode of the biochemical assays (the specificity and positive predictivity are both 100%), and (3) to be a component of future complementary batteries of tests for predicting rodent carcinogenicity.
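The operational characteristics quoted above all derive from a standard 2×2 confusion matrix. As an illustration, the counts below are back-calculated from the reported percentages for the 111 chemicals (49 carcinogens, 62 noncarcinogens); they are a reconstruction consistent with those figures, not counts taken from the paper's tables:

```python
# Reconstruction of the TS composite parameter's operational characteristics.
# Counts are inferred so that the computed metrics match the reported
# 55% / 87% / 77% / 71% / 73% figures; they are illustrative only.

tp, fn = 27, 22   # carcinogens called positive / negative by TS (27 + 22 = 49)
tn, fp = 54, 8    # noncarcinogens called negative / positive by TS (54 + 8 = 62)

sensitivity = tp / (tp + fn)            # fraction of carcinogens detected
specificity = tn / (tn + fp)            # fraction of noncarcinogens cleared
positive_predictivity = tp / (tp + fp)  # P(carcinogen | positive result)
negative_predictivity = tn / (tn + fn)  # P(noncarcinogen | negative result)
concordance = (tp + tn) / (tp + tn + fp + fn)  # overall agreement

print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}, "
      f"pos. pred. {positive_predictivity:.0%}, "
      f"neg. pred. {negative_predictivity:.0%}, "
      f"concordance {concordance:.0%}")
```

Note the asymmetry this makes explicit: a test can have high specificity (few false alarms on noncarcinogens) while still missing nearly half of the true carcinogens, which is why sensitivity and specificity must be reported together rather than as a single concordance number.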

14.
The approaches to quantitatively assessing the health risks of chemical exposure have not changed appreciably in the past 50 to 80 years, the focus remaining on high-dose studies that measure adverse outcomes in homogeneous animal populations. This expensive, low-throughput approach relies on conservative extrapolations to relate animal studies to much lower-dose human exposures and is of questionable relevance to predicting risks to humans at their typical low exposures. It makes little use of a mechanistic understanding of the mode of action by which chemicals perturb biological processes in human cells and tissues. An alternative vision, proposed by the U.S. National Research Council (NRC) report Toxicity Testing in the 21st Century: A Vision and a Strategy, called for moving away from traditional high-dose animal studies to an approach based on perturbation of cellular responses using well-designed in vitro assays. Central to this vision are (a) "toxicity pathways" (the innate cellular pathways that may be perturbed by chemicals) and (b) the determination of chemical concentration ranges where those perturbations are likely to be excessive, thereby leading to adverse health effects if present for a prolonged duration in an intact organism. In this paper we briefly review the original NRC report and responses to that report over the past 3 years, and discuss how the change in testing might be achieved in the U.S. and in the European Union (EU). EU initiatives in developing alternatives to animal testing of cosmetic ingredients have run very much in parallel with the NRC report. Moving from current practice to the NRC vision would require using prototype toxicity pathways to develop case studies showing the new vision in action. In this vein, we also discuss how the proposed strategy for toxicity testing might be applied to the toxicity pathways associated with DNA damage and repair.

15.
Entering a new millennium seems a good time to challenge some old ideas, which in our view are implausible, have little supportive evidence, and might best be left behind. In this essay, we summarize a decade of work, raising four issues that involve toxicology, nutrition, public health, and government regulatory policy. (a) Paracelsus or parascience: the dose (trace) makes the poison. Half of all chemicals, whether natural or synthetic, are positive in high-dose rodent cancer tests. These results are unlikely to be relevant at the low doses of human exposure. (b) Even Rachel Carson was made of chemicals: natural vs. synthetic chemicals. Human exposure to naturally occurring rodent carcinogens is ubiquitous, and dwarfs the general public's exposure to synthetic rodent carcinogens. (c) Errors of omission: micronutrient inadequacy is genotoxic. The major causes of cancer (other than smoking) do not involve exogenous carcinogenic chemicals: dietary imbalances, hormonal factors, infection and inflammation, and genetic factors. Insufficiency of many micronutrients, which appears to mimic radiation, is a preventable source of DNA damage. (d) Damage by distraction: regulating low hypothetical risks. Putting huge amounts of money into minuscule hypothetical risks damages public health by diverting resources and distracting the public from major risks.

16.
The impact of new technologies on human population studies
Human population studies involve clinical or epidemiological observations that associate environmental exposures with health endpoints and disease. Clearly, these are the most sought after data to support assessments of human health risk from environmental exposures. However, the foundations of many health risk assessments rest on experimental studies in rodents performed at high doses that elicit adverse outcomes, such as organ toxicity or tumors. Using the results of human studies and animal data, risk assessors define the levels of environmental exposures that may lead to disease in a portion of the population. These decisions on potential health risks are frequently based on the use of default assumptions that reflect limitations in our scientific knowledge. An important immediate goal of toxicogenomics, including proteomics and metabonomics, is to offer the possibility of making decisions affecting public health and public policy based on detailed toxicity, mechanistic, and exposure data in which many of the uncertainties have been eliminated. Ultimately, these global technologies will dramatically impact the practice of public health and risk assessment as applied to environmental health protection. The impact is already being felt in the practice of toxicology, where animal experimentation using highly controlled dose-time parameters is possible. It is also being seen in human population studies, where understanding human genetic variation and genomic reactions to specific environmental exposures is enhancing our ability to uncover the causes of variations in human response to environmental exposures. These new disciplines hold the promise of reducing the costs and time lines associated with animal and human studies designed to assess both the toxicity of environmental pollutants and the efficacy of therapeutic drugs. However, as with any new science, experience must be gained before the promise can be fulfilled.
Given the number and diversity of drugs, chemicals and environmental agents, the various species in which they are studied, and the time and dose factors that are critical to the induction of beneficial and adverse effects, it is only through the development of a profound knowledge base that toxicology and environmental health can rapidly advance. The National Institute of Environmental Health Sciences (NIEHS), National Center for Toxicogenomics and its university-based Toxicogenomics Research Consortium (TRC), and resource contracts, are engaged in the development, application and standardization of the science upon which to build such a knowledge base on Chemical Effects in Biological Systems (CEBS). In addition, the NIEHS Environmental Genome Project (EGP) is working to systematically identify and characterize common sequence polymorphisms in many genes with suspected roles in determining chemical sensitivity. The rationale of the EGP is that certain genes have a greater than average influence over human susceptibility to environmental agents. If we identify and characterize the polymorphisms in those genes, we will increase our understanding of human disease susceptibility. This knowledge can be used to protect susceptible individuals from disease and to reduce adverse exposure and environmentally induced disease.

17.
Due to limited human exposure data, risk classification and the consequent regulation of exposure to potential carcinogens has conventionally relied mainly upon animal tests. However, several investigations have revealed animal carcinogenicity data to be lacking in human predictivity. To investigate the reasons for this, we surveyed 160 chemicals within the US Environmental Protection Agency chemicals database that possessed animal but not human exposure data, and that had received human carcinogenicity assessments by 1 January 2004. We found that a wide variety of species (with rodents predominating) and a wide variety of routes of administration were used, and that a particularly wide variety of organ systems were affected. The likely causes of the poor human predictivity of rodent carcinogenicity bioassays include: 1) the profound discordance of bioassay results between rodent species, strains and genders, and further, between rodents and human beings; 2) the variable, yet substantial, stresses caused by handling and restraint, and by the stressful routes of administration common to carcinogenicity bioassays, and their effects on hormonal regulation, immune status and predisposition to carcinogenesis; 3) differences in rates of absorption and transport mechanisms between test routes of administration and other important human routes of exposure; 4) the considerable variability of organ systems in response to carcinogenic insults, both between and within species; and 5) the predisposition of chronic high-dose bioassays toward false positive results, due to the overwhelming of physiological defences and the unnatural elevation of cell division rates during ad libitum feeding studies. Such factors render profoundly difficult any attempt to accurately extrapolate human carcinogenic hazards from animal data.

18.
Stewart BW. Mutation Research 2008;658(1-2):124-151
Readily achieved comparative assessment of carcinogenic risks consequent upon environmental exposures may increase understanding and contribute to cancer prevention. Procedures for hazard identification and quantitative risk assessment are established, but limited when addressing novel exposures to previously known carcinogens or any exposure to agents having only suspected carcinogenic activity. To complement other means of data evaluation, a procedure for qualitative assessment of carcinogenic risk is described. This involves categorizing the relevant carcinogen and circumstances under which exposure occurs. The categories for carcinogens are those used for hazard identification and involve whether the agent is (1) a recognized carcinogen for humans; (2) probably or (3) possibly carcinogenic for humans; (4) characterized by inadequate evidence of carcinogenicity; or (5) lacking carcinogenicity. Exposure is categorized by whether it is one which (1) establishes the agent as a recognized carcinogen; (2) is taken into account in establishing carcinogenicity status; (3) is distinct from those providing clearest evidence of carcinogenicity; (4) is not characterized in relation to carcinogenicity; or (5) involves an exposure in which absence of carcinogenic outcome is observed. These two categories of evidence allow the risk inherent in a situation to be banded as indicative of a proven, likely, inferred, unknown or unlikely carcinogenic outcome, and further characterized using sub-bands. The procedure has been applied to about fifty situations. For recognized carcinogens, including asbestos and polycyclic aromatic hydrocarbons, risks consequent upon occupational exposure, the impact of point source pollution, residence near contaminated sites and general environmental exposure are allocated across the proven band and a likely sub-band. 
For solvents, pesticides and other compounds whose carcinogenicity is less clearly established, the impact on residents living near a production site, or near earlier related industrial activity, is allocated to certain inferred sub-bands. The unknown band, which identifies exposure to an agent with inadequate evidence of carcinogenicity (rather than indicating equivocal or negative data in any context), covers both the impact of certain pollutants and user exposure to some consumer products. Situations allocated to the unlikely risk band principally involve certain consumer products. Overall, such risk assessment may be of greatest worth in focusing community attention on proven causes of cancer and the associated preventive measures.
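The two-axis banding procedure described in this abstract can be pictured as a simple lookup from (agent category, exposure category) to a risk band. A minimal Python sketch follows; note that apart from the case the abstract states explicitly (a recognized carcinogen under an exposure that established or supported its carcinogenicity status falls in the proven band), the specific band assigned to each category pair is an illustrative guess, not the published scheme.

```python
# Hypothetical encoding of the qualitative risk-banding procedure
# (Stewart, 2008). Category labels follow the abstract; the band
# assignments below are illustrative except where stated in the text.

AGENT_CATEGORIES = {
    1: "recognized human carcinogen",
    2: "probably carcinogenic to humans",
    3: "possibly carcinogenic to humans",
    4: "inadequate evidence of carcinogenicity",
    5: "lacking carcinogenicity",
}

EXPOSURE_CATEGORIES = {
    1: "exposure that established the agent as a recognized carcinogen",
    2: "exposure taken into account in establishing carcinogenicity status",
    3: "exposure distinct from those providing clearest evidence",
    4: "exposure not characterized in relation to carcinogenicity",
    5: "exposure with observed absence of carcinogenic outcome",
}

def risk_band(agent: int, exposure: int) -> str:
    """Map the two category axes to a qualitative risk band.

    Only the recognized-carcinogen/defining-exposure -> "proven" case
    is explicit in the abstract; the rest is a plausible illustration."""
    if agent == 5 or exposure == 5:
        return "unlikely"
    if agent == 4 or exposure == 4:
        return "unknown"
    if agent == 1:
        return "proven" if exposure <= 2 else "likely"
    if agent == 2:
        return "likely"
    return "inferred"  # agent == 3: possibly carcinogenic

# e.g. asbestos under the occupational exposure that established it:
print(risk_band(1, 1))  # proven
# a possibly carcinogenic solvent, residence near a production site:
print(risk_band(3, 3))  # inferred
```

In the published procedure each band is further divided into sub-bands; the sketch omits that refinement and shows only the top-level banding logic.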

19.
The approaches to quantitatively assessing the health risks of chemical exposure have not changed appreciably in the past 50 to 80 years, the focus remaining on high-dose studies that measure adverse outcomes in homogeneous animal populations. This expensive, low-throughput approach relies on conservative extrapolations to relate animal studies to much lower-dose human exposures and is of questionable relevance to predicting risks to humans at their typical low exposures. It makes little use of a mechanistic understanding of the mode of action by which chemicals perturb biological processes in human cells and tissues. An alternative vision, proposed by the U.S. National Research Council (NRC) report Toxicity Testing in the 21st Century: A Vision and a Strategy, called for moving away from traditional high-dose animal studies to an approach based on perturbation of cellular responses using well-designed in vitro assays. Central to this vision are (a) “toxicity pathways” (the innate cellular pathways that may be perturbed by chemicals) and (b) the determination of chemical concentration ranges where those perturbations are likely to be excessive, thereby leading to adverse health effects if present for a prolonged duration in an intact organism. In this paper we briefly review the original NRC report and responses to that report over the past 3 years, and discuss how the change in testing might be achieved in the U.S. and in the European Union (EU). EU initiatives in developing alternatives to animal testing of cosmetic ingredients have run very much in parallel with the NRC report. Moving from current practice to the NRC vision would require using prototype toxicity pathways to develop case studies showing the new vision in action. In this vein, we also discuss how the proposed strategy for toxicity testing might be applied to the toxicity pathways associated with DNA damage and repair.

20.
Facilitative mechanisms of lead as a carcinogen
Silbergeld EK. Mutation Research 2003;533(1-2):121-133
The carcinogenicity of lead compounds has received renewed attention because of continuing environmental and occupational sources of exposure in many countries. The epidemiological evidence for an association between lead exposures and human cancer risk has been strengthened by recent studies, and new data on mechanisms of action provide biological plausibility for assessing lead as a human carcinogen. Both epidemiological and mechanistic data are consistent with a facilitative role for lead in carcinogenesis, that is, lead by itself may not be both necessary and sufficient for the induction of cancer, but at a cellular and molecular level lead may permit or enhance carcinogenic events involved in DNA damage, DNA repair, and regulation of tumor suppressor and promoter genes. Some of these events may also be relevant to understanding mechanisms of lead-induced reproductive toxicity.
