Similar Literature
20 similar records found
1.
The removal of carcinogenic factors would be the most efficient measure to prevent cancer. As far as known chemicals are concerned, every effort is made to avoid them, or at least to reduce exposure to such compounds, but it is also necessary to detect unknown carcinogens, especially among chemicals such as drugs and foodstuffs to which large populations are exposed. Giving suspected chemicals to laboratory animals is the standard carcinogenicity test, but studies of the carcinogenicity of unknown chemicals in animals are time-consuming, expensive and cumbersome. This is why other means of establishing carcinogenicity are sought. Several rapid tests are available today to select suspected carcinogens. These methods aim primarily at detecting, at the cell or tissue level, chemically induced changes that appear essential to trigger the carcinogenic process, such as somatic mutations. They include studies of the mutagenicity of chemicals in Salmonella-type bacteria, yeast and cultured mammalian cells, together with the induction of recessive lethal mutations in Drosophila, unscheduled DNA repair synthesis and the transformation of mammalian cells in vitro. Although the activity of chemicals in such tests correlates with their carcinogenicity, discrepancies are found; thus, in vivo tests on laboratory animals remain the most reliable method of determining carcinogenicity. Whereas direct extrapolation of experimental data to human pathology is impossible, experimental evidence of the carcinogenicity of a chemical should allow us to draw constructive conclusions. We shall never be able to reject drugs that produce the expected results and cannot be replaced by other drugs, but we can abandon drugs whose beneficial effects are not exceptional and which can be replaced by other chemicals. As for chemicals used as food additives or in cosmetics and recognized as carcinogenic in animals, they should be given up entirely. Any decision made should be based on animal studies.

2.
The drug Anatomical Therapeutic Chemical (ATC) classification system is a widely used and accepted drug classification system recommended and maintained by the World Health Organization (WHO). Each drug in this system is assigned one or more ATC codes, indicating the classes it belongs to at each of five levels. Given a chemical/drug, correct identification of its ATC codes in this system can be helpful for understanding its therapeutic effects. Several computational methods have been proposed to identify the first-level ATC classes for any drug, most of them built as multi-label classifiers. One previous study proposed a quite different scheme containing two network methods, based on the shortest path (SP) and random walk with restart (RWR) algorithms respectively, to infer novel chemicals/drugs for each first-level class. However, due to the limitations of the SP and RWR algorithms, there still exist many hidden chemicals/drugs that these two methods cannot discover. This study employed another classic network algorithm, the Laplacian heat diffusion (LHD) algorithm, to construct a new computational method for recognizing novel latent chemicals/drugs of each first-level ATC class. This algorithm was applied to a chemical network containing a large amount of chemical-interaction information to evaluate the associations between candidate chemicals/drugs and each ATC class. Three screening tests, which measured specificity and the association with one ATC class, were then applied to yield more reliable potential members for each class. Some hidden chemicals/drugs that cannot be found by the previous methods were recognized, and they were extensively analyzed to confirm that they can be novel members of the corresponding ATC class.
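As a rough illustration of the kind of network scoring described in this abstract, the following is a minimal sketch of Laplacian heat diffusion on a chemical-interaction graph. The adjacency matrix, diffusion time and seed set below are hypothetical placeholders, not the paper's actual data or parameters.

```python
import numpy as np
from scipy.linalg import expm

def laplacian_heat_diffusion(adjacency, seed_indices, t=0.5):
    """Score every node by diffusing unit 'heat' from the seed nodes
    (e.g. known members of one ATC class) across the chemical network."""
    degrees = adjacency.sum(axis=1)
    laplacian = np.diag(degrees) - adjacency              # combinatorial graph Laplacian
    heat0 = np.zeros(adjacency.shape[0])
    heat0[list(seed_indices)] = 1.0 / len(seed_indices)   # initial heat placed on the seeds
    # Closed-form solution of dh/dt = -L h  ->  h(t) = exp(-t L) h(0)
    return expm(-t * laplacian) @ heat0

# Toy 5-chemical network (symmetric, unweighted); indices 0 and 1 are known class members.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 0, 1],
              [0, 1, 0, 0, 1],
              [0, 0, 1, 1, 0]], dtype=float)
scores = laplacian_heat_diffusion(A, seed_indices=[0, 1])
print(scores)  # higher score = stronger association with the seeded ATC class
```

In a workflow like the one summarized above, candidate chemicals would then be ranked by such scores and passed through the additional screening tests before being proposed as novel class members.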

3.
Drug-induced liver injury (DILI) is a significant concern in drug development due to the poor concordance between preclinical and clinical findings of liver toxicity. We hypothesized that the DILI types (hepatotoxic side effects) seen in the clinic can be translated into the development of predictive in silico models for use in the drug discovery phase. We identified 13 hepatotoxic side effects with high accuracy for classifying marketed drugs for their DILI potential. We then developed in silico predictive models for each of these 13 side effects, which were further combined to construct a DILI prediction system (DILIps). The DILIps yielded 60-70% prediction accuracy for three independent validation sets. To enhance confidence in the identification of drugs that cause severe DILI in humans, the "Rule of Three" was developed in DILIps by using a consensus strategy based on the 13 models. This gave a high positive predictive value (91%) when applied to an external dataset containing 206 drugs from three independent literature datasets. Using the DILIps, we screened all the drugs in DrugBank and investigated their DILI potential in terms of protein targets and therapeutic categories through network modeling. We demonstrated that two therapeutic categories, anti-infectives for systemic use and musculoskeletal system drugs, were enriched for DILI, which is consistent with current knowledge. We also identified protein targets and pathways that are related to drugs that cause DILI by using pathway analysis and co-occurrence text mining. While marketed drugs were the focus of this study, the DILIps has potential as an evaluation tool to screen and prioritize new drug candidates or chemicals, such as environmental chemicals, to avoid those that might cause liver toxicity. We expect that the methodology can also be applied to other drug safety endpoints, such as renal or cardiovascular toxicity.
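The "Rule of Three" consensus is not spelled out in detail in this abstract; the sketch below shows one plausible reading (flag a drug as a likely severe-DILI candidate when at least three of the 13 side-effect models vote positive). The threshold of three and the model votes are illustrative assumptions, not the authors' actual rule or data.

```python
from typing import Sequence

def rule_of_three(model_votes: Sequence[bool], threshold: int = 3) -> bool:
    """Consensus call across the 13 hepatotoxic side-effect models:
    return True (predicted severe DILI) when enough models vote positive.
    The threshold of 3 is an assumption made for illustration."""
    return sum(model_votes) >= threshold

# Hypothetical votes from 13 independent side-effect classifiers for one drug.
votes = [True, False, True, False, False, True, False,
         False, False, False, False, True, False]
print(rule_of_three(votes))  # True: four positive votes meet the consensus threshold
```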

4.
5.
Over the past decades, a number of drugs have been withdrawn or have required special labeling because of adverse effects observed post-marketing. Species differences in drug toxicity in preclinical safety tests, the lack of sensitive biomarkers, and nonrepresentative patient populations in clinical trials are probable reasons for the failures in predicting human drug toxicity. It is proposed that toxicology should evolve from an empirical practice into an investigative discipline. Accurate prediction of human drug toxicity requires resources and time to be spent in clearly defining key toxic pathways and the corresponding risk factors, an investment that, hopefully, will be repaid by a lower rate of clinical failure due to toxicity and a decreased frequency of market withdrawal due to unacceptable adverse drug effects.

6.
Motility and feeding assays were assessed as in vitro systems for screening novel compounds for anthelmintic activity against adult Haemonchus contortus. The study aimed to develop an assay with the parasitic adult stage of this species that could be used in conjunction with, or as an alternative to, the free-living larval-stage screens commonly used for drug discovery with many parasitic nematode species. The feeding assay showed limitations because a significant degree of feeding apparently continued in worms showing greatly reduced motility in the presence of some drugs; hence, the feeding assay would most likely underestimate the toxicity of these drugs. The motility assay was able to detect the toxicity of known anthelmintics, including the 'slow-acting' benzimidazoles. A small-scale screening exercise used the motility assay to detect toxicity towards adult parasites for 10 compounds out of a group of 200 chemicals (selected because of known toxic effects in larval development assays). The motility assay therefore appeared suitable for drug screening against adult H. contortus. Using the adult stage for drug screening in this way ensures that the drug is toxic towards the parasite life stage to be targeted in vivo; a lack of activity in subsequent in vivo trials could, therefore, most likely be attributed to host pharmacokinetic factors rather than an intrinsic lack of activity of the drug against the adult parasite.

7.
A new international project to evaluate the relevance of in vitro tests of the general toxicity of chemicals to human systemic and local toxicity has been organized by the Scandinavian Society of Cell Toxicology under the title Multicenter Evaluation of In Vitro Cytotoxicity (MEIC). The basic assumptions underlying the project, as well as its practical goals and the design of the program, are outlined. The list of the first 50 reference chemicals is presented. The chemicals are an otherwise unbiased selection of compounds with known human acute lethal dosages and blood concentrations, including LD50 values in the rat or mouse. Most agents also have other data on human toxicity and toxicokinetics, as well as more extensive animal toxicity data. International laboratories already using or developing in vitro tests of various partial aspects of general toxicity are invited to test the substances, and the results will be evaluated by us. The predictivity of the in vitro results for both partial and gross human toxicity data will be determined using a combination of univariate regression analysis and soft multivariate modeling. The predictivity of the in vitro results will be compared with that of conventional animal tests for the same chemicals. Finally, batteries of tests with optimal predictive power for various types of human toxicity will be selected. The need for and possible uses of such batteries are discussed.

8.
Role of genetics and drug metabolism in human cancer risk   (cited by 13: 0 self-citations, 13 by others)
D. W. Nebert, Mutation Research, 1991, 247(2): 267-281
The research field concerning hereditary differences in responses to drugs is called 'pharmacogenetics'. At least 5 dozen pharmacogenetic polymorphisms have been described in clinical medicine; many are responsible for marked differences in genetic predisposition toward toxicity or cancer. Three are detailed here: the acetylation, debrisoquine, and AH locus polymorphisms. All 3 are very common in the United States population: 1 in 2 people is a 'slow acetylator', 1 in 12 is a 'poor metabolizer' of more than 2 dozen commonly prescribed drugs in the debrisoquine panel, and the CYP1A1 and CYP1A2 (cytochromes P(1)450 and P(3)450) genes are highly inducible by cigarette smoke in 1 of 10 patients. Differences in xenobiotic metabolism between individuals in the same family can be greater than 200-fold, suggesting that occupationally hazardous chemicals, as well as prescribed drugs with a narrow therapeutic window, might cause strikingly dissimilar effects in patients of differing genotypes. Our ultimate goal is 'preventive toxicology', i.e. the development of simple, inexpensive, unequivocal and sensitive assays to predict individual risk of toxicity or cancer. These tests could help the individual choose a safer lifestyle or place of work and might aid the physician in deciding which drug to prescribe.

9.
Purpose

Limiting exposure to potentially toxic chemicals in food packaging can lead to environmental impact trade-offs. No available tool, however, considers trade-offs between environmental impacts of packaging systems and exposure to potentially toxic chemicals in food packaging. This study therefore explores the research needs for extending life cycle impact assessment (LCIA) to include exposure to chemicals in food packaging.

Methods

The LCIA framework for human toxicity was extended for the first time to include consumer exposure to chemicals in food packaging through the product intake fraction (PiF) metric. The related exposure pathway was added to LCIA without other modifications to the existing toxicity characterization framework used by USEtox®, i.e., effect factor derivation. The developed method was applied to a high impact polystyrene (HIPS) container case study with the functional unit of providing 1 kg of yogurt in single servings. Various exposure scenarios were considered, including an evidence-based scenario using concentration data and a migration model. Human toxicity impact scores in comparative toxic units (CTUh) for the use stage were evaluated and then compared to human toxicity impact scores from a conventional LCIA methodology.

Results and discussion

Data allowed toxicity characterization of use stage exposure to only seven chemicals in HIPS out of forty-four identified. Data required were the initial concentration of chemicals in food packaging, chemical mass transfer from packaging into food, and relevant toxicity information. Toxicity characterization demonstrated that the combined CTUh for HIPS material acquisition, manufacturing, and disposal stages exceeded the toxicity scores related to consumer exposure to previously estimated concentrations of the seven characterizable chemicals in HIPS by about two orders of magnitude. The CTUh associated with consumer exposure became relevant when migration was above 0.1% of the European regulatory levels. Results emphasize missing data for chemical concentrations in food contact materials and a need to expand the current USEtox method for effect factor derivation (e.g., to consider endocrine disruption, mixture toxicity, background exposure, and thresholds when relevant). A minimal sketch of this use-stage calculation is given after the Conclusions below.

Conclusions

An LCIA method was developed to include consumer exposure to chemicals in food packaging. Further study is required to assess realistic scenarios to inform decisions and policies, such as circular economy, which can lead to trade-offs between environmental impacts and potentially toxic chemicals in packaging. To apply the developed method, data regarding occurrence, concentration, and toxicity of chemicals in food packaging are needed. Revisiting the derivation of effect factors in future work could improve the interpretation of human toxicity impact scores.
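To make the use-stage calculation described in the Methods and Results concrete, here is a minimal sketch of how a product-intake-fraction-based human toxicity score could be assembled per functional unit (1 kg of yogurt in single servings). The chemical names, masses, PiF values and effect factors are hypothetical placeholders, not data from the study.

```python
from dataclasses import dataclass

@dataclass
class PackagingChemical:
    name: str
    mass_kg_per_fu: float   # mass of the chemical in the packaging per functional unit
    pif: float              # product intake fraction: kg taken in per kg present in the product
    effect_factor: float    # disease cases per kg of chemical taken in (USEtox-style EF)

def use_stage_ctu_h(chemicals):
    """Use-stage human toxicity score (CTUh): sum over chemicals of
    mass in packaging x product intake fraction x effect factor."""
    return sum(c.mass_kg_per_fu * c.pif * c.effect_factor for c in chemicals)

# Hypothetical migrants in a HIPS yogurt container (illustrative numbers only).
chemicals = [
    PackagingChemical("styrene monomer", mass_kg_per_fu=2e-6, pif=0.05, effect_factor=1e-4),
    PackagingChemical("additive X (hypothetical)", mass_kg_per_fu=5e-7, pif=0.10, effect_factor=5e-4),
]
print(f"Use-stage score: {use_stage_ctu_h(chemicals):.2e} CTUh per functional unit")
```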


10.
The public understands and supports the ethical use of human subjects in medical research, recognizing the unique role of this type of study in the development of new drugs and therapeutic strategies for the treatment of disease. The use of data from human subjects can also be of value in understanding the circumstances under which individuals exposed to chemicals in the food supply, in the workplace, or in the environment might experience toxicity, i.e., in support of risk assessment. However, questions have been raised as to whether this latter type of research is ethical, or can be performed in an ethical manner. Under what circumstances is it acceptable to intentionally expose human subjects to potentially toxic agents? This is an extremely important issue for the risk assessment community to address, because it affects in a fundamental way the types of information that will be available to conduct human health risk assessments. Four papers in this issue offer viewpoints on the value of human data, the circumstances under which human subjects might be exposed to toxic chemicals for research purposes, the ethical problems associated with this research, and the role of human vs. animal data in the development of toxicity values for human health risk assessment.

11.
BACKGROUND: Toxicology studies utilizing animals and in vitro cellular or tissue preparations have been used to study the toxic effects and mechanisms of action of drugs and chemicals and to determine the effective and safe doses of drugs in humans and the risk of toxicity from chemical exposures. Testing in animals could be improved if dosing on a mg/kg basis were abandoned and drugs and chemicals were administered so as to compare the effects of pharmacokinetically and toxicokinetically equivalent serum levels in the animal model and in humans. Because alert physicians or epidemiology studies, not animal studies, have discovered most human teratogens and toxicities in children, animal studies play a minor role in discovering teratogens and agents that are deleterious to infants and children. In vitro studies play an even less important role, although they are helpful in describing the cellular or tissue effects of drugs or chemicals and their mechanisms of action. One cannot determine the magnitude of human risk from in vitro studies when they are the only source of toxicology data. METHODS: Toxicology studies on adult animals are carried out by pharmaceutical companies, chemical companies, the Food and Drug Administration (FDA), many laboratories at the National Institutes of Health, and scientific investigators in laboratories throughout the world. Although a vast number of animal toxicology studies have been carried out on pregnant and adult animals, there is a paucity of studies utilizing newborn, infant, and juvenile animals. This deficiency is compounded by the fact that there are very few toxicology studies carried out in children. That is one reason why pregnant women and children are referred to as "therapeutic orphans." RESULTS: When animal studies are carried out with newborn and developing animals, the results demonstrate that generalizations are less applicable and less predictive than in toxicology studies on pregnant animals. Although many studies show that infants and developing animals may have difficulty metabolizing drugs and are more vulnerable to the toxic effects of environmental chemicals, there are exceptions indicating that infants and developing animals may be less vulnerable and more resilient to some drugs and chemicals. In other words, the generalization that developing animals are always more sensitive to environmental toxicants is not valid. For animal toxicology studies to be useful, they have to utilize modern concepts of pharmacokinetics and toxicokinetics, as well as "mechanism of action" (MOA) studies, to determine whether animal data can be used for determining human risk. One example is the inability to determine carcinogenic risks in humans for some drugs and chemicals that produce tumors in rodents when the oncogenesis is the result of peroxisome proliferation, a reaction that is of diminished importance in humans. CONCLUSIONS: Scientists can utilize animal studies to investigate the toxicokinetic and toxicodynamic aspects of drugs and environmental toxicants, but such studies have to be carried out with the most modern techniques and interpreted with the highest level of scholarship and objectivity. Threshold exposures, no-observed-adverse-effect level (NOAEL) exposures, and toxic effects can be determined in animals, but they have to be interpreted with caution when applied to humans.
Adult problems in growth, endocrine dysfunction, neurobehavioral abnormalities, and oncogenesis may be related to exposures to drugs, chemicals, and physical agents during development and may be fruitful areas for investigation. Maximum permissible exposures have to be based on data, not on generalizations applied to all drugs and chemicals. Epidemiology studies are still the best methodology for determining human risk and the effects of environmental toxicants. Carrying out these focused studies in developing humans will be difficult; animal studies may be our only alternative for answering many questions about specific postnatal developmental vulnerabilities.

12.
The suggested test system is based on the following conditions: (1) economy of performance in a short time; (2) analysis of gene and chromosome mutations in germ and somatic cells; (3) evaluation of the mutagenic effects not only of the substance itself but also of the products of its metabolism; (4) inclusion in the system of only those tests that give minimal variability between separate experiments; and (5) evaluation of the dose-effect relationship. The practical testing scheme is divided into two parts: a screening programme and a complete programme. The screening programme consists of two tests: (a) a test on microorganisms with metabolic activation in vitro; and (b) a cytogenetic analysis of mammalian bone marrow. The complete programme includes four tests: (a) a test on microorganisms with metabolic activation in vitro and in vivo; (b) a dominant lethal mutation test in mammals; (c) a cytogenetic analysis of mammalian bone marrow; and (d) a cytogenetic analysis in cultured human lymphocytes. The principles for selecting substances for testing under the screening or complete programme are well founded: prevalence in the population, economic (or medical) significance, and information about related chemicals showing mutagenic, carcinogenic or teratogenic effects. Chemicals to be tested under the screening programme include industrial chemicals, organophosphorus insecticides, and drugs taken by a limited group of patients. Chemicals to be tested under the complete programme include pesticides, food additives, widely used drugs, and chemicals of the first group for which a genetic effect is detected in any test of the screening programme. In estimating genetic risk, it is advisable to follow the rule that a positive effect identified in any test object of the system must be extrapolated directly to humans. The mutagenic hazard of a chemical can be evaluated quantitatively from the increase over the spontaneous mutation level in the test object, on the basis of the average dose and exposure of the given chemical in the human population. Chemicals subject to quantitative evaluation are those that have shown mutagenic activity in any of the test objects and that are widespread and, because of their social or economic value, cannot be replaced or withdrawn from use. From the genetic point of view, any substance with mutagenic activity is dangerous and should be prohibited from use, replaced by another, non-mutagenic chemical, or limited to contact by persons of non-reproductive age. As a temporary hygienic measure, it is recommended that a chemical be classified as especially mutagenic, and its use prohibited or limited, when its average population dose produces an increase of one-tenth or more over the spontaneous mutation level.

13.
The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpreting the results, as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data pose problems because of the large number of statistical tests involved. Often, it is not clear whether a "statistically significant" effect is real or a false positive (type I error) due to sampling variation. The authors' conclusions appear to be reached somewhat subjectively from the pattern of statistical significances, discounting those which they judge to be type I errors and ignoring any biomarker where the p-value is greater than p = 0.05. By using standardised effect sizes (SESs), however, a range of graphical methods can be applied and an overall assessment of the mean absolute response can be made. The approach is an extension of existing methods, not a replacement. It is intended to assist toxicologists and regulators in interpreting the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the original authors. Line plots, box plots and bar plots show the pattern of response, and dose-response relationships are easily seen. A "bootstrap" test compares the mean absolute differences across dose groups. In four out of seven papers where the no observed adverse effect level (NOAEL) was estimated by the authors, it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated.
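As a rough illustration of the standardised-effect-size approach described above (not the author's actual code), the sketch below computes an SES for each biomarker and tests the mean absolute SES with a simple resampling (permutation-style) procedure standing in for the "bootstrap" test. Group sizes, biomarker values and the number of resamples are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def standardised_effect_size(control, treated):
    """SES = (treated mean - control mean) / pooled standard deviation."""
    n1, n2 = len(control), len(treated)
    pooled_var = ((n1 - 1) * np.var(control, ddof=1) +
                  (n2 - 1) * np.var(treated, ddof=1)) / (n1 + n2 - 2)
    return (np.mean(treated) - np.mean(control)) / np.sqrt(pooled_var)

def resampling_test_mean_abs_ses(control_data, treated_data, n_resamples=2000):
    """Compare the observed mean absolute SES across biomarkers against a null
    distribution obtained by reshuffling animals between groups."""
    observed = np.mean([abs(standardised_effect_size(c, t))
                        for c, t in zip(control_data, treated_data)])
    null_stats = []
    for _ in range(n_resamples):
        stats = []
        for c, t in zip(control_data, treated_data):
            pooled = rng.permutation(np.concatenate([c, t]))  # no-effect null
            stats.append(abs(standardised_effect_size(pooled[:len(c)], pooled[len(c):])))
        null_stats.append(np.mean(stats))
    p_value = np.mean(np.array(null_stats) >= observed)
    return observed, p_value

# Two hypothetical biomarkers (e.g. ALT, body weight), 5 control vs 5 treated animals each.
control = [rng.normal(50, 5, 5), rng.normal(300, 20, 5)]
treated = [rng.normal(60, 5, 5), rng.normal(290, 20, 5)]
print(resampling_test_mean_abs_ses(control, treated))
```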

14.
Objective: To establish a method for measuring the tension of isolated uterine smooth muscle from pregnant rats in reproductive toxicity studies of drugs. Methods: SD rats were given the test substance on gestation days 6-15 (GD6-15), and uterine smooth muscle was collected on gestation day 20 (GD20). The muscle strips were stimulated with different concentrations of oxytocin and magnesium sulfate solution to select the optimal experimental conditions. On this basis, changes in the tension of isolated uterine smooth muscle from pregnant rats in different dose groups were measured; data were analyzed with an RM-6240BD multichannel physiological signal acquisition and processing system, and the results were statistically evaluated. Results: Oxytocin at 0.007 U/mL and magnesium sulfate at 0.008 mol/mL were selected as the optimal stimulation concentrations. With increasing doses of the test substance, the frequency and tension of the pregnant uterine smooth muscle in each group showed a gradually increasing trend after addition of oxytocin, whereas the amplitude, frequency and tension in each dose group showed a decreasing trend after addition of magnesium sulfate; however, no significant differences were seen between any dose group and the vehicle control group (P > 0.05). Conclusion: This study established a method for measuring the tension of isolated uterine smooth muscle from pregnant rats. The method better reflects the toxic effects of a test substance on the uterine muscle of pregnant rats and allows a more comprehensive evaluation of its reproductive toxicity.

15.
Toxicogenomic approach for assessing toxicant-related disease   (cited by 6: 0 self-citations, 6 by others)
The problems of identifying environmental factors involved in the etiology of human disease and of performing safety and risk assessments of drugs and chemicals have long been formidable. Three principal components for predicting potential human health risks are: (1) the diverse structures and properties of thousands of chemicals and other stressors in the environment; (2) the time and dose parameters that define the relationship between exposure and disease; and (3) the genetic diversity of the organisms used as surrogates to determine adverse chemical effects. The global techniques evolving from successful genomics efforts are providing exciting new tools with which to address these intractable problems of environmental health and toxicology. In order to exploit the scientific opportunities, the National Institute of Environmental Health Sciences has created the National Center for Toxicogenomics (NCT). The primary mission of the NCT is to use gene expression technology, proteomics and metabolite profiling to create a reference knowledge base that will allow scientists to understand mechanisms of toxicity and to predict the potential toxicity of new chemical entities and drugs. A principal scientific objective underpinning the use of microarray analysis of chemical exposures is to demonstrate the utility of signature profiling of the action of drugs or chemicals and to use microarray methodologies to determine biomarkers of exposure and potential adverse effects. The initial approach of the NCT is to use proof-of-principle experiments in an effort to "phenotypically anchor" altered patterns of gene expression to conventional parameters of toxicity and to define the dose and time relationships in which the expression of such signature genes may precede the development of overt toxicity. The microarray approach is used in conjunction with proteomic techniques to identify specific proteins that may serve as signature biomarkers. The longer-range goal of these efforts is to develop a reference relational database of chemical effects in biological systems (CEBS) that can be used to define common mechanisms of toxicity, chemical and drug actions, and cellular pathways of response, injury and, ultimately, disease. To implement this strategy, the NCT has created a consortium of research organizations and private-sector companies to collaborate actively in populating the database with high-quality primary data. The evolution of discrete databases into a toxicogenomics knowledge base will be accomplished by establishing relational interfaces with other sources of information on the structure and activity of chemicals, such as that of the National Toxicology Program (NTP), and with databases annotating gene identity, sequence, and function.

16.
Drug-induced liver toxicity is a main reason for withdrawals of new drugs in late clinical phases and after launch. Thus, hepatotoxicity screening of drug candidates at the pre-clinical stage is important for reducing drug attrition rates during the clinical development process. Here, we present commercially available hepatocytes that could be used for early toxicity evaluation of drug candidates. Using our hepatic differentiation technology, we obtained highly pure (≥98%) hepatocytes from human embryonic stem cells (hESCs) with mature phenotypes and gene expression profiles similar to those of primary human tissues. Furthermore, we optimized a 96-well culture condition for hESC-derived hepatocytes suitable for in vitro toxicity tests. We then demonstrated the efficacy of our optimized hepatocyte model for predicting the hepatotoxicity of Chinese herbal medicines and showed that the toxicity patterns from our hepatocyte model were similar to those of primary cultured human hepatocytes. We conclude that our hepatocyte model could be a good alternative cell source for pre-clinical toxicity testing to predict potential hepatotoxicity in the drug discovery industry.

17.
The genetic toxicity of human carcinogens and its implications   (cited by 9: 0 self-citations, 9 by others)
Twenty-three chemicals and chemical combinations have been designated by the International Agency for Research on Cancer (IARC) as causally associated with cancer in humans. The literature was searched for reports of their activity in the Salmonella mutagenicity assay and for evidence of their ability to induce chromosome aberrations or micronuclei in the bone marrow of mice or rats. In addition, the chemical structures of these carcinogens were assessed for the presence of electrophilic substituents that might be associated with their mutagenicity and carcinogenicity. The purpose of this study was to determine which human carcinogens exhibit genetic toxicity in vitro and in vivo and to what extent they can be detected using these two widely employed short-term tests for genetic toxicity. The results revealed 20 of the 23 carcinogens to be active in one or both short-term tests. Treosulphan, for which short-term test results are not available, is predicted to be active based on its structure. The remaining two agents, asbestos and conjugated estrogens, are not mutagenic to Salmonella; asbestos is not likely to induce cytogenetic effects in the bone marrow, and the potential activity of conjugated estrogens in the bone marrow is difficult to anticipate. These findings show that genetic toxicity is characteristic of the majority of IARC Group 1 human carcinogens. If these chemicals are considered representative of human carcinogens, then two short-term tests may serve as an effective primary screen for chemicals that present a carcinogenic hazard to humans.

18.
The genetic toxicology of Gene-Tox non-carcinogens   (cited by 1: 0 self-citations, 1 by others)
The Gene-Tox Program has identified 61 chemicals that have been tested in chronic rodent carcinogenesis bioassays and found to be inactive. The genetic toxicology data on these 61 non-carcinogens are reviewed and summarized. A large proportion of these chemicals have been tested only to a limited extent in genetic toxicity bioassays: 32 in 2 or fewer tests. Of the remaining 29 chemicals, 28% have been tested in 9 or more tests which encompass a range of genetic endpoints: gene mutation, chromosomal effects, other genetic endpoints, and cell transformation. The genetic toxicity of 12 chemicals with sufficient data is discussed in detail: benzoin, caffeine, caprolactam, ethanol, halothane, hycanthone methanesulfonate, malathion, maleic hydrazide, methotrexate, 1-naphthylamine, 4-nitro-o-phenylenediamine, and p-phenylenediamine. A new technique for the evaluation of multiple test data, the "genetic activity profile", has been applied to 6 of these chemicals, allowing the qualitative and quantitative information to be compared collectively. In evaluating the genotoxic effects of these non-carcinogens, a number of discrepancies between the results from genetic toxicity bioassays and chronic rodent bioassays have been uncovered. These discrepancies are discussed in light of current knowledge of the strengths and weaknesses of both genetic toxicity bioassays and chronic rodent bioassays.

19.
In considering drug therapy for pregnant women, it must be borne in mind that almost all chemical compounds in use as therapeutic agents pass from the maternal to the fetal circulation through the placenta. These drugs can produce a wide range of harmful effects on the fetus and the neonatal infant. The effects of some substances for which we have data reflecting a deleterious effect are listed. It is suggested that in the future more caution be exercised in using drugs during pregnancy and that, in both obstetrical and pediatric histories, any therapy given to the mother during gestation be recorded in detail.

20.
Toxicological risk assessment for chemicals is still based mainly on highly standardised protocols for animal experimentation and exposure assessment. However, developments in our knowledge of general physiology, chemico-biological interactions and (computer-supported) modelling have resulted in a tremendous change in our understanding of the molecular mechanisms underlying the toxicity of chemicals. This permits the development of biologically based models in which both the biokinetics and the toxicodynamics of compounds can be described. In this paper, the possibilities of developing systems in which the systemic (acute and chronic) toxicities of chemicals can be quantified without heavy reliance on animal experiments are discussed. By integrating data derived from different sources, predictions of toxicity can be made. Key elements in this integrated approach are the evaluation of chemical functionalities representing structural alerts for toxic actions, the construction of biokinetic models on the basis of non-animal data (for example, tissue-blood partition coefficients and in vitro biotransformation parameters), tests or batteries of tests for determining basal cytotoxicity, and more specific tests for evaluating tissue or organ toxicity. It is concluded that this approach is a useful tool for various steps in toxicological hazard and risk assessment, especially for those forms of toxicity for which validated in vitro and other non-animal tests have already been developed.
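The integrated approach summarized above combines structural alerts, non-animal biokinetic parameters and in vitro toxicity data. As a rough illustration of the biokinetic element only, here is a minimal one-compartment sketch that predicts an average tissue concentration from repeated dosing using an in vitro-derived clearance and a tissue-blood partition coefficient. The model structure and all parameter values are simplified assumptions for illustration, not the paper's method.

```python
def tissue_concentration(dose_mg_per_kg, interval_h, clearance_L_per_h_per_kg,
                         tissue_blood_partition):
    """Average steady-state tissue concentration for repeated dosing in a
    one-compartment model: C_blood,ss = dose rate / clearance, then scaled by
    the tissue-blood partition coefficient (assumes complete absorption)."""
    dose_rate = dose_mg_per_kg / interval_h            # mg/kg/h
    c_blood_ss = dose_rate / clearance_L_per_h_per_kg  # mg/L, average blood level at steady state
    return c_blood_ss * tissue_blood_partition         # mg/L in the target tissue

# Hypothetical inputs: 10 mg/kg every 24 h, in vitro-scaled clearance 0.2 L/h/kg,
# tissue-blood partition coefficient of 4 (placeholder values only).
print(tissue_concentration(10, 24, 0.2, tissue_blood_partition=4.0))
```

A predicted tissue level of this kind could then be compared against concentrations causing effects in basal cytotoxicity or organ-specific in vitro tests, in the spirit of the integrated strategy the abstract describes.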
