Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
The basis for the prediction of toxicity from chemical structure is that the properties of a chemical are implicit in its molecular structure. Biological activity can be expressed as a function of partition and reactivity, that is, for a chemical to be able to express its toxicity, it must be transported from its site of administration to its site of action and then it must bind to or react with its receptor or target. This process may also involve metabolic transformation of the chemical. The application of these principles to the prediction of the toxicity of new or untested chemicals has been achieved in a number of different ways covering a wide range of complexity, from computer systems containing databases of hundreds of chemicals, to simple "reading across" between chemicals with similar chemical/toxicological functionality. The common feature of the approaches described in this article is that their starting point is a mechanistic hypothesis linking chemical structure and/or functionality with the toxicological endpoint of interest. The prediction of toxicity from chemical structure can make a valuable contribution to the reduction of animal usage in the screening out of potentially toxic chemicals at an early stage and in providing data for making positive classifications of toxicity. This revised version was published online in July 2006 with corrections to the Cover Date.

2.
Carcinogenicity is one of the toxicological endpoints causing the highest concern, and the standard rodent bioassays used to assess the carcinogenic potential of chemicals and drugs are extremely long and costly and require the sacrifice of large numbers of animals. For these reasons, we have attempted to develop a global quantitative structure-activity relationship (QSAR) model for carcinogenic potential using a data set of 1464 compounds (the Galvez data set, available from http://www.uv.es/-galvez/tablevi.pdf), including many marketed drugs. Although experimental toxicity testing in animal models remains unavoidable for new drug candidates at an advanced stage of development, the global QSAR model developed here can predict the carcinogenicity of new compounds in silico, providing a tool for initial screening of candidate molecules that reduces animal testing, cost and time. Considering the large number of data points with diverse structural features used for model development (n(training) = 732) and model validation (n(test) = 732), the model developed in this study has encouraging statistical quality (leave-one-out Q2 = 0.731, R2pred = 0.716). The model suggests that higher lipophilicity, conjugated ring systems, and thioketo and nitro groups contribute positively towards drug carcinogenicity. On the contrary, tertiary and secondary nitrogens; phenolic, enolic and carboxylic OH fragments; and three-membered rings reduce carcinogenicity. Branching, size and shape are found to be crucial factors for drug-induced carcinogenicity. All these points may be considered in order to reduce the carcinogenic potential of candidate molecules.
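The two validation statistics quoted for this model can be made concrete. The following is a minimal sketch (with invented numbers, not the paper's data) of how leave-one-out Q2 and external R2pred are conventionally defined in QSAR validation:

```python
# Sketch of standard QSAR validation metrics; all data below are invented.

def q2_loo(y_obs, y_pred_loo):
    """Leave-one-out Q2 = 1 - PRESS/SS, where each prediction in
    y_pred_loo was made with that compound left out of the model fit."""
    mean_y = sum(y_obs) / len(y_obs)
    press = sum((o - p) ** 2 for o, p in zip(y_obs, y_pred_loo))
    ss = sum((o - mean_y) ** 2 for o in y_obs)
    return 1.0 - press / ss

def r2_pred(y_test_obs, y_test_pred, y_train_obs):
    """External R2pred: squared test-set prediction errors are scaled by
    the spread of the test-set observations around the *training* mean."""
    mean_train = sum(y_train_obs) / len(y_train_obs)
    num = sum((o - p) ** 2 for o, p in zip(y_test_obs, y_test_pred))
    den = sum((o - mean_train) ** 2 for o in y_test_obs)
    return 1.0 - num / den

# Toy illustration:
train_y = [1.0, 2.0, 3.0, 4.0]
loo_pred = [1.2, 1.9, 3.1, 3.7]
print(round(q2_loo(train_y, loo_pred), 3))                      # → 0.97
print(round(r2_pred([1.5, 2.5, 3.5], [1.4, 2.7, 3.3], train_y), 3))  # → 0.955
```

Values near 1 indicate good internal (Q2) and external (R2pred) predictivity; the paper's 0.731/0.716 are typical of a diverse, noncongeneric training set.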

3.
DNA microarrays and toxicogenomics: applications for ecotoxicology? (Cited by 5: self-citations 0, other citations 5)

4.

Background  

Bioactivity profiling using high-throughput in vitro assays can reduce the cost and time required for toxicological screening of environmental chemicals and can also reduce the need for animal testing. Several public efforts are aimed at discovering patterns or classifiers in high-dimensional bioactivity space that predict tissue, organ or whole animal toxicological endpoints. Supervised machine learning is a powerful approach to discover combinatorial relationships in complex in vitro/in vivo datasets. We present a novel model to simulate complex chemical-toxicology data sets and use this model to evaluate the relative performance of different machine learning (ML) methods.
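The idea of simulating a chemical-toxicology data set to benchmark ML methods can be sketched in miniature. The simulation rule, noise level and single-assay baseline below are invented for illustration and are not the authors' model:

```python
import random

random.seed(0)

def simulate(n_chem=200, n_assay=10, noise=0.05):
    """Simulate a binary in vitro assay matrix in which in vivo toxicity
    depends on a *combination* of assays (here, assays 0 AND 3), plus
    label noise -- an invented stand-in for the paper's simulation model."""
    X = [[random.randint(0, 1) for _ in range(n_assay)] for _ in range(n_chem)]
    y = [int(row[0] and row[3]) for row in X]
    y = [lab if random.random() > noise else 1 - lab for lab in y]
    return X, y

def best_single_assay_accuracy(X, y):
    """Baseline: the best accuracy any one assay (or its negation) achieves.
    A combinatorial ground truth caps how well this baseline can do, which
    is what motivates supervised ML over simple classifiers."""
    n = len(y)
    best = 0.0
    for j in range(len(X[0])):
        acc = sum(int(row[j] == lab) for row, lab in zip(X, y)) / n
        best = max(best, acc, 1 - acc)
    return best

X, y = simulate()
print(best_single_assay_accuracy(X, y))
```

Because toxicity here requires two assays to be active jointly, the single-assay baseline plateaus well below perfect accuracy, leaving room that combinatorial ML methods could exploit.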

5.
Data Mining of Toxic Chemicals: Structure Patterns and QSAR (Cited by 1: self-citations 0, other citations 1)
We take a two-step strategy to explore noncongeneric toxic chemicals from the RTECS database: screening of structure patterns, followed by generation of a detailed relationship between structure and activity. An efficient similarity comparison is proposed to screen chemical patterns for further QSAR analysis. A CoMFA study is then carried out on one structure pattern as an example of the implementation; the result shows that QSAR studies of structure patterns can provide an estimate of the activity as well as a detailed relationship between activity and structure. The performance of the overall procedure demonstrates that this stepwise scheme is feasible and effective for mining a database of toxic chemicals.
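The abstract does not specify the similarity measure used; a common choice for this kind of structure-pattern screen is the Tanimoto coefficient on binary substructure fingerprints, sketched here with invented fingerprints (sets of "on" bit indices):

```python
# Hedged sketch of a fingerprint-based similarity screen; the fingerprints
# and threshold are illustrative, not taken from the paper.

def tanimoto(fp_a, fp_b):
    """Tanimoto similarity of two fingerprints given as sets of on-bits:
    |A & B| / |A | B|."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def screen(query_fp, library, threshold=0.7):
    """Return (name, similarity) for library compounds above the threshold."""
    return [(name, tanimoto(query_fp, fp))
            for name, fp in library.items()
            if tanimoto(query_fp, fp) >= threshold]

library = {
    "cmpd_A": {1, 2, 3, 5, 8},
    "cmpd_B": {1, 2, 3, 4, 5},
    "cmpd_C": {9, 10, 11},
}
hits = screen({1, 2, 3, 5}, library)
# hits → [('cmpd_A', 0.8), ('cmpd_B', 0.8)]
```

Compounds passing the screen would then move to the second, detailed QSAR stage (CoMFA in the paper's example).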

6.
Seely JC. Lab Animal 2008, 37(5): 206-209
Although exposure to drugs or toxicants can affect children and adults very differently, many compounds lack specific safety information for children. Studies in juvenile animals can help researchers assess pediatric patients' potential response to certain chemicals. Juvenile studies are highly sensitive to animal age, sex and species and must be planned with care to prevent misinterpretation of experimental data. The author reviews considerations for the design of these studies, focusing on toxicological and pathological aspects.

7.

Purpose

Today's society uses and emits an enormous number of different, potentially ecotoxic chemicals into the environment. The vast majority of these substances lack characterisation factors describing their ecotoxicity potential. A first-stage, high-throughput screening tool is therefore needed to prioritise which substances require further measures.

Methods

USEtox characterisation factors were calculated in this work from data generated by quantitative structure-activity relationship (QSAR) models, in order to expand substance coverage where characterisation factors were missing. Existing QSAR models for physico-chemical data and ecotoxicity were used, and to fill further data gaps, an algae QSAR model was developed. The existing USEtox characterisation factors were used as a reference to evaluate the impact of using QSARs to generate input data to USEtox, with a focus on ecotoxicity data. An inventory of the chemicals that make up the Swedish societal stock of plastic additives, and their associated predicted emissions, was used as a case study to rank chemicals according to their ecotoxicity potential.

Results and discussion

Of the 210 chemicals in the inventory, only 41 had characterisation factors in the USEtox database. With the use of QSAR-generated substance data, an additional 89 characterisation factors could be calculated, substantially improving substance coverage in the ranking. The choice of QSAR model was shown to be important for the reliability of the results; even with the best-correlated model, however, the discrepancies between characterisation factors based on estimated data and those based on experimental data were very large.

Conclusions

The use of QSAR-estimated data as a basis for calculating characterisation factors, and the further use of those factors for ranking by ecotoxicity potential, was assessed as a feasible way to gather substance data for large datasets. However, further research and development of guidance on how to make use of estimated data are needed to improve the accuracy of the results.
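For orientation, the USEtox-style freshwater ecotoxicity calculation discussed in this study is, in outline, a product of fate, exposure and effect factors, with the effect factor derived from an HC50 (geometric mean of species-level EC50 values) — the slot that QSAR-estimated ecotoxicity data fill when measurements are missing. The sketch below follows that outline, but all numbers are invented and units are simplified:

```python
import math

# Rough sketch of a USEtox-style characterisation factor:
#   CF = FF * XF * EF,  EF = 0.5 / HC50
# where FF is the fate factor, XF the exposure fraction, and HC50 the
# geometric mean of chronic EC50 values across trophic levels.
# All parameter values below are illustrative assumptions.

def hc50(ec50_values_mg_per_l):
    """Geometric mean of species-level EC50 values (mg/L)."""
    logs = [math.log10(v) for v in ec50_values_mg_per_l]
    return 10 ** (sum(logs) / len(logs))

def characterisation_factor(fate_factor, exposure_fraction, ec50s):
    effect_factor = 0.5 / hc50(ec50s)
    return fate_factor * exposure_fraction * effect_factor

# Invented EC50s for algae (e.g. QSAR-estimated), daphnia and fish:
cf = characterisation_factor(12.0, 0.8, ec50s=[4.0, 1.0, 0.25])
# hc50 of [4.0, 1.0, 0.25] is 1.0 mg/L, so cf = 12.0 * 0.8 * 0.5 = 4.8
```

Because HC50 is a geometric mean of log-scale quantities, an error of one order of magnitude in a single QSAR-estimated EC50 shifts the characterisation factor substantially, which is consistent with the large discrepancies reported above.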

8.
BACKGROUND: Toxicology studies utilizing animals and in vitro cellular or tissue preparations have been used to study the toxic effects and mechanisms of action of drugs and chemicals, to determine effective and safe doses of drugs in humans, and to assess the risk of toxicity from chemical exposures. Testing in animals could be improved if dosing on a mg/kg basis were abandoned and drugs and chemicals were instead administered at pharmacokinetically and toxicokinetically equivalent serum levels in the animal model and the human. Because alert physicians and epidemiology studies, not animal studies, have discovered most human teratogens and toxicities in children, animal studies play a minor role in discovering teratogens and agents that are deleterious to infants and children. In vitro studies play an even less important role, although they are helpful in describing the cellular or tissue effects of drugs or chemicals and their mechanisms of action. One cannot determine the magnitude of human risk from in vitro studies when they are the only source of toxicology data. METHODS: Toxicology studies on adult animals are carried out by pharmaceutical companies, chemical companies, the Food and Drug Administration (FDA), many laboratories at the National Institutes of Health, and scientific investigators in laboratories throughout the world. Although a vast number of animal toxicology studies have been carried out on pregnant and adult animals, there is a paucity of studies utilizing newborn, infant, and juvenile animals. This deficiency is compounded by the fact that very few toxicology studies are carried out in children; that is one reason why pregnant women and children are referred to as "therapeutic orphans." RESULTS: When animal studies are carried out with newborn and developing animals, the results demonstrate that generalizations are less applicable and less predictable than in toxicology studies on pregnant animals. Although many studies show that infants and developing animals may have difficulty metabolizing drugs and are more vulnerable to the toxic effects of environmental chemicals, there are exceptions indicating that infants and developing animals may be less vulnerable and more resilient to some drugs and chemicals. In other words, the generalization that developing animals are always more sensitive to environmental toxicants is not valid. For animal toxicology studies to be useful, they have to utilize modern concepts of pharmacokinetics and toxicokinetics, as well as "mechanism of action" (MOA) studies, to determine whether animal data can be applied to determining human risk. One example is the inability to determine carcinogenic risks in humans for some drugs and chemicals that produce tumors in rodents when the oncogenesis is the result of peroxisome proliferation, a reaction that is of diminished importance in humans. CONCLUSIONS: Scientists can utilize animal studies to examine the toxicokinetic and toxicodynamic aspects of drugs and environmental toxicants, but these studies have to be carried out with the most modern techniques and interpreted with the highest level of scholarship and objectivity. Threshold exposures, no-observed-adverse-effect-level (NOAEL) exposures, and toxic effects can be determined in animals but have to be interpreted with caution when applied to humans. Adult problems in growth, endocrine dysfunction, neurobehavioral abnormalities, and oncogenesis may be related to exposures to drugs, chemicals, and physical agents during development, and may be fruitful areas for investigation. Maximum permissible exposures have to be based on data, not on generalizations applied to all drugs and chemicals. Epidemiology studies are still the best methodology for determining human risk and the effects of environmental toxicants, but carrying out such focused studies in developing humans will be difficult. Animal studies may be our only alternative for answering many questions with regard to specific postnatal developmental vulnerabilities.

9.
Whole animal testing is an essential part in evaluating the toxicological and pharmacological profiles of chemicals and pharmaceuticals, but these experiments are expensive and cumbersome. A cell culture analog (CCA) system, when used in conjunction with a physiologically based pharmacokinetic (PBPK) model, provides an in vitro supplement to animal studies and the possibility of a human surrogate for predicting human response in clinical trials. A PBPK model mathematically simulates animal metabolism by modeling the absorption, distribution, metabolism, and elimination kinetics of a chemical in interconnected tissue compartments. A CCA uses mammalian cells cultured in interconnected chambers to physically represent the corresponding PBPK. These compartments are connected by recirculating tissue culture medium that acts as a blood surrogate. The purpose of this article is to describe the design and basic operation of the microscale manifestation of such a system. Microscale CCAs offer the potential for inexpensive, relatively high throughput evaluation of chemicals while minimizing demand for reagents and cells. Using microfabrication technology, a three-chamber ("lung"-"liver"-"other") microscale cell culture analog (microCCA) device was fabricated on a 1 in. (2.54 cm) square silicon chip. With a design flow rate of 1.76 microL/min, this microCCA device achieves approximate physiological liquid-to-cell ratio and hydrodynamic shear stress while replicating the liquid residence time parameters in the PBPK model. A dissolved oxygen sensor based on collision quenching of a fluorescent ruthenium complex by oxygen molecules was integrated into the system, demonstrating the potential to integrate real-time sensors into such devices.
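A minimal sketch of the kind of PBPK compartment model that such a microCCA physically represents: three well-mixed compartments in a recirculating loop, with first-order metabolism in the "liver". Apart from the 1.76 microL/min flow rate quoted above, the compartment volumes, rate constant and series flow scheme are assumptions for illustration, not the paper's parameters:

```python
# Hedged sketch: Euler integration of a three-compartment recirculating
# PBPK model ("lung" -> "liver" -> "other" -> back to "lung").
# Volumes and the metabolism rate constant are invented.

def simulate_pbpk(c0_lung=1.0,            # initial "lung" concentration (arbitrary units)
                  q=1.76e-6,              # recirculation flow, L/min (1.76 microL/min)
                  v=(2e-6, 3e-6, 5e-6),   # compartment volumes, L (assumed)
                  k_met=0.05,             # first-order liver metabolism, 1/min (assumed)
                  dt=0.01, t_end=60.0):
    """Return [c_lung, c_liver, c_other] after t_end minutes."""
    c = [c0_lung, 0.0, 0.0]
    for _ in range(int(t_end / dt)):
        # flow q carries medium around the series loop; each compartment's
        # concentration changes by (mass in - mass out) / volume
        d_lung = q * (c[2] - c[0]) / v[0]
        d_liver = q * (c[0] - c[1]) / v[1] - k_met * c[1]
        d_other = q * (c[1] - c[2]) / v[2]
        c = [c[0] + dt * d_lung, c[1] + dt * d_liver, c[2] + dt * d_other]
    return c
```

The residence time in each compartment is v/q; matching those ratios in hardware is what lets the microCCA's chambers stand in for the model's tissue compartments.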

10.
Organic chemistry has been, and for the foreseeable future will remain, vitally important for crop protection. Control of fungal pathogens, insect pests and weeds is crucial to enhanced food provision. As world population continues to grow, it is timely to assess the current situation, anticipate future challenges and consider how new chemistry may help meet those challenges. In future, agriculture will increasingly be expected to provide not only food and feed, but also crops for conversion into renewable fuels and chemical feedstocks. This will further increase the demand for higher crop yields per unit area, requiring chemicals used in crop production to be even more sophisticated. In order to contribute to programmes of integrated crop management, there is a requirement for chemicals to display high specificity, demonstrate benign environmental and toxicological profiles, and be biodegradable. It will also be necessary to improve production of those chemicals, because waste generated by the production process offsets the overall benefit. Three aspects are considered in this review: advances in the discovery process for new molecules for sustainable crop protection, including tests for environmental and toxicological properties as well as biological activity; advances in synthetic chemistry that may offer efficient and environmentally benign manufacturing processes for modern crop protection chemicals; and issues related to energy use and production through agriculture.

11.
The conventional method for assessing the safety of products, ranging from pharmaceuticals to agrochemicals, biocides and industrial and household chemicals - including cosmetics - involves determining their toxicological properties by using experimental animals. The aim is to identify any possible adverse effects in humans by using these animal models. Providing safe products is undoubtedly of the utmost importance but, over the last decade or so, this aim has come into conflict with strong public opinion, especially in Europe, against animal testing. Industry, academia and the regulators have worked in partnership to find other ways of evaluating the safety of products, by non-animal testing, or at least by reducing the numbers of animals required and the severity of the tests in which they are used. There is a long way to go before products can be evaluated without any animal studies, and it may be that this laudable aim is an impossible dream. Nevertheless, considerable progress has been made by using a combination of in vitro tests and the prediction of properties based on chemical structure. The aim of this review is to describe these important and worthwhile developments in various areas of toxicological testing, with a focus on the European regulatory framework for general industrial and household chemicals.

12.
Recent developments in the prediction of toxicity from chemical structure have been reviewed. Attention has been drawn to some of the problems that can be encountered in the area of predictive toxicology, including the need for a multi-disciplinary approach and the need to address mechanisms of action. Progress has been hampered by the sparseness of good quality toxicological data. Perhaps too much effort has been devoted to exploring new statistical methods rather than to the creation of data sets for hitherto uninvestigated toxicological endpoints and/or classes of chemicals.

13.
14.
Scientists face growing pressure to move away from using traditional animal toxicity tests to determine whether manufactured chemicals are safe. Numerous ethical, scientific, business, and legislative incentives will help to drive this shift. However, a number of hurdles must be overcome in the coming years before non-animal methods are adopted into widespread practice, particularly from regulatory, scientific, and global perspectives. Several initiatives are nevertheless underway that promise to increase confidence in newer alternative methods, which will support the move towards a future in which fewer data from animal tests are required in the assessment of chemical safety.

15.
Naphthalene (1) and para-dichlorobenzene (PDCB, 2), which are widely used as moth repellents and air fresheners, cause cancer in rodents and are potential human carcinogens. However, their mechanisms of action remain unclear. Here we describe a novel method for delivering and screening hydrophobic chemicals in C. elegans and apply this technique to investigate the ways in which naphthalene and PDCB may promote tumorigenesis in mammals. We show that naphthalene and PDCB inhibit apoptosis in C. elegans, a result that suggests a cellular mechanism by which these chemicals may promote the survival and proliferation of latent tumor cells. In addition, we find that a naphthalene metabolite directly inactivates caspases by oxidizing the active site cysteine residue; this suggests a molecular mechanism by which these chemicals suppress apoptosis. Naphthalene and PDCB are the first small-molecule apoptosis inhibitors identified in C. elegans. The power of C. elegans molecular genetics, in combination with the possibility of carrying out large-scale chemical screens in this organism, makes C. elegans an attractive and economic animal model for both toxicological studies and drug screens.

16.
A workshop convened to define research needs in toxicology identified several deficiencies in the data and methods currently applied in risk assessment. The workshop panel noted that improving the link between chemical exposure and toxicological response requires a better understanding of the biological basis for inter- and intra-human variability and susceptibility. This understanding will not be complete unless all life stages are taken into consideration. Because animal studies serve as a foundation for toxicological assessment, proper accounting for cross-species extrapolation is essential. To achieve this, adjustments for dose-rate effects must be improved, which will aid in extrapolating toxicological responses to low doses and from short-term exposures. Success depends on greater use of validated, biologically based dose-response models that include pharmacokinetic and pharmacodynamic data. Research in these areas will help define uncertainty factors and reduce reliance on underlying default assumptions. Throughout the workshop the panel recognized that biomedical science, and toxicology in particular, is on the verge of a revolution because of advances in genomics and proteomics. Data from these high-output technologies are anticipated to greatly improve risk assessment by enabling scientists to better define and model the elements of the relationship between exposure to biological hazards and health risks in populations with differing susceptibilities.

17.
Empirical methods for building predictive models of the relationships between molecular structure and useful properties are becoming increasingly important. This has arisen because drug discovery and development have become more complex. A large amount of biological target information is becoming available through molecular biology. Automation of chemical synthesis and pharmacological screening has also provided a vast amount of experimental data. Tools for designing libraries and for robustly and quickly extracting information from molecular databases and high-throughput screening experiments enable leads to be discovered more effectively. As drug leads progress down the development pipeline, the ability to predict their physicochemical, pharmacokinetic and toxicological properties is becoming increasingly important in reducing the number of expensive, late development failures. Quantitative structure-activity relationship (QSAR) methods have much to offer in these areas. However, QSAR analysis has many traps for unwary practitioners. This review introduces the concepts behind QSAR, points out problems that may be encountered, suggests ways of avoiding the pitfalls and introduces several exciting, new QSAR methods discovered during the last decade.

18.
1. Comparative physiology may help to improve toxicologists' ability to assess and predict the toxicological risks of chemicals.
2. Three main lines of approach are distinguished: (A) comparative research on the toxicokinetics of chemicals in different species; (B) research on ecophysiological characteristics; and (C) studies aimed at identifying biological markers that can signal toxic effects in both experimental and free-living populations of organisms.
3. Some remarks are made on the limiting conditions that must be fulfilled for comparative physiology to be valuable from a toxicological point of view.

19.
The approaches to quantitatively assessing the health risks of chemical exposure have not changed appreciably in the past 50 to 80 years; the focus remains on high-dose studies that measure adverse outcomes in homogeneous animal populations. This expensive, low-throughput approach relies on conservative extrapolations to relate animal studies to much lower-dose human exposures and is of questionable relevance to predicting risks to humans at their typical low exposures. It makes little use of a mechanistic understanding of the mode of action by which chemicals perturb biological processes in human cells and tissues. An alternative vision, proposed by the U.S. National Research Council (NRC) report Toxicity Testing in the 21st Century: A Vision and a Strategy, called for moving away from traditional high-dose animal studies to an approach based on perturbation of cellular responses using well-designed in vitro assays. Central to this vision are (a) "toxicity pathways" (the innate cellular pathways that may be perturbed by chemicals) and (b) the determination of the chemical concentration ranges in which those perturbations are likely to be excessive, thereby leading to adverse health effects if present for a prolonged duration in an intact organism. In this paper we briefly review the original NRC report and responses to it over the past 3 years, and discuss how the change in testing might be achieved in the U.S. and in the European Union (EU). EU initiatives in developing alternatives to animal testing of cosmetic ingredients have run very much in parallel with the NRC report. Moving from current practice to the NRC vision would require using prototype toxicity pathways to develop case studies showing the new vision in action. In this vein, we also discuss how the proposed strategy for toxicity testing might be applied to the toxicity pathways associated with DNA damage and repair.

20.

Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)