Similar Literature
20 similar documents retrieved.
1.
Recent developments in the prediction of toxicity from chemical structure have been reviewed. Attention has been drawn to some of the problems that can be encountered in the area of predictive toxicology, including the need for a multi-disciplinary approach and the need to address mechanisms of action. Progress has been hampered by the sparseness of good quality toxicological data. Perhaps too much effort has been devoted to exploring new statistical methods rather than to the creation of data sets for hitherto uninvestigated toxicological endpoints and/or classes of chemicals.

2.
There is a great deal of current interest in the use of commercial, automated programs for the prediction of mutagenicity and carcinogenicity based on chemical structure. However, the goal of accurate and reliable toxicity prediction for any chemical, based solely on structural information, remains elusive. The toxicity prediction challenge is global in its objective but limited in its solution to local domains of chemicals acting according to similar mechanisms of action in the biological system; to predict, we must be able to generalize based on chemical structure, but the biology fundamentally limits our ability to do so. Available commercial systems for mutagenicity and/or carcinogenicity prediction differ in their specifics, yet most fall into two major categories: (1) automated approaches that rely on the use of statistics for extracting correlations between structure and activity; and (2) knowledge-based expert systems that rely on a set of programmed rules distilled from available knowledge and human expert judgement. These two categories of approaches differ in the ways that they represent, process, and generalize chemical-biological activity information. An application of four commercial systems (TOPKAT, CASE/MULTI-CASE, DEREK, and OncoLogic) to mutagenicity and carcinogenicity prediction for a particular class of chemicals, the haloacetic acids (HAs), is presented to highlight these differences. Some discussion is devoted to the issue of gauging the relative performance of commercial prediction systems, as well as to the role of prospective prediction exercises in this effort. Finally, an alternative approach that stops short of delivering a prediction to the user, involving structure searching and database exploration, is briefly considered.

3.
Literature data were collected on the floristic distribution and toxicity of phytochemicals to herbivores and on herbivore specialization in order to test phytochemical coevolution theory. The theory makes four predictions that can be tested with this information. Herbivores can adapt to novel, more toxic chemicals by becoming specialists, or they can become generalists but at the cost of lower feeding success on any particular host. Thus, the first two predictions are as follows: herbivores should do better on chemicals that are present in their normal host, and this pattern should be stronger for specialists than for generalists. The "escape and radiation" aspect of the theory holds that if a plant taxon with a novel defense chemical diversifies, the chemical will become widespread. Eventually, herbivores will adapt to and disarm it. So the third prediction is that more widespread chemicals are less toxic than more narrowly distributed ones. Because generalists should not do as well as specialists on chemicals disarmed by the latter, the fourth prediction is that the third prediction should hold more strongly for generalists than for specialists and should depend on the presence or absence of the chemical in the normal host. Multiple regressions of toxicity (herbivore mortality and final weight) on three predictor variables (chemical presence/absence in the normal host, specialism, and chemical floristic distribution) and relevant interactions were used to test these predictions. Chemical presence/absence in the normal host, the interaction between this variable and specialism, and chemical floristic distribution had significant effects on both measures of toxicity, supporting the first three predictions of the model. Support for the fourth prediction (a three-way interaction among all predictor variables) was evident for final weight but not mortality, perhaps because growth is more responsive to toxicity differences than survival. In short, the phytochemistry literature provides broad support for the phytochemical coevolution model.
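A minimal sketch of the regression design described above, assuming a hypothetical table of literature records with one row per herbivore-chemical pair; the column names and synthetic values are illustrative, not data from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
records = pd.DataFrame({
    "in_host":          rng.integers(0, 2, n),   # chemical present in herbivore's normal host?
    "specialist":       rng.integers(0, 2, n),   # specialist (1) vs generalist (0)
    "floristic_spread": rng.integers(1, 60, n),  # no. of plant families containing the chemical
})
# Toy response: final weight rises when the chemical is familiar or widespread.
records["final_weight"] = (
    5 + 3 * records["in_host"] + 0.05 * records["floristic_spread"]
    + 2 * records["in_host"] * records["specialist"]
    + rng.normal(0, 1, n)
)

# Full factorial model; '*' expands to all main effects and interactions,
# including the three-way term used to test the fourth prediction.
model = smf.ols("final_weight ~ in_host * specialist * floristic_spread",
                data=records).fit()
print(model.summary())
# The same design would be refit with mortality as the response.
```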

4.
A considerable amount of chemical data is available for surficial sediments in Port Jackson. Some of the highest concentrations of heavy metals, organochlorines and polycyclic aromatic hydrocarbons of any capital port occur in sediments mantling shallow tributaries and embayments close to central Sydney. However, these data have limited ability to predict adverse effects on living resources, and in the absence of toxicological data, sediment quality guidelines (SQGs) have been used to assess the possible adverse biological effects of sedimentary contaminants in this estuary. Several SQGs are currently available for both fresh and marine environments, but the scheme used in the current study is based on empirical analysis of matching chemical and biological data compiled by the National Oceanic and Atmospheric Administration (NOAA) in the U.S. The NOAA SQGs can be used to assess individual chemicals, or to estimate the probability of acute sediment toxicity by calculating "mean quotients" for a large range of contaminants. Although many individual chemicals in sediment exceed SQGs over extensive parts of Port Jackson, the "mean quotient" results suggest that only a small proportion (1%) of the harbour may be highly toxic (74% probability of toxicity). However, sediment in a considerably larger proportion of the port (almost 25%) is estimated to have a 49% probability of being toxic using the "mean quotient" approach. These results are indicative at best; contemporaneous chemical/biological/ecotoxicological studies are needed to verify the applicability of SQGs developed in the U.S. for use outside North America, and site-specific studies of this type are still required to determine the toxicity of sediments in Port Jackson.
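A minimal sketch of the "mean quotient" screening calculation the abstract refers to: each measured concentration is divided by its sediment quality guideline value, and the quotients are averaged. The chemicals, guideline values and sample concentrations below are illustrative placeholders, not Port Jackson data or the exact NOAA guideline set.

```python
# Illustrative guideline values, mg/kg dry weight (assumed, ERM-style).
GUIDELINE = {
    "Cu": 270.0,
    "Pb": 218.0,
    "Zn": 410.0,
    "total_PAH": 44.8,
}

def mean_quotient(sample: dict) -> float:
    """Average of concentration/guideline ratios over the measured chemicals."""
    quotients = [sample[chem] / GUIDELINE[chem] for chem in sample if chem in GUIDELINE]
    return sum(quotients) / len(quotients)

sample = {"Cu": 320.0, "Pb": 150.0, "Zn": 500.0, "total_PAH": 20.0}  # mg/kg dry wt
print(f"mean quotient = {mean_quotient(sample):.2f}")
# Higher mean quotients correspond to a higher empirically derived
# probability of acute sediment toxicity in the NOAA scheme.
```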

5.
The ability to assess the potential genotoxicity, carcinogenicity, or other toxicity of pharmaceutical or industrial chemicals based on chemical structure information is a highly coveted and shared goal of varied academic, commercial, and government regulatory groups. These diverse interests often employ different approaches and have different criteria and use for toxicity assessments, but they share a need for unrestricted access to existing public toxicity data linked with chemical structure information. Currently, there exists no central repository of toxicity information, commercial or public, that adequately meets the data requirements for flexible analogue searching, Structure-Activity Relationship (SAR) model development, or building of chemical relational databases (CRD). The distributed structure-searchable toxicity (DSSTox) public database network is being proposed as a community-supported, web-based effort to address these shared needs of the SAR and toxicology communities. The DSSTox project has the following major elements: (1) to adopt and encourage the use of a common standard file format (structure data file (SDF)) for public toxicity databases that includes chemical structure, text and property information, and that can easily be imported into available CRD applications; (2) to implement a distributed source approach, managed by a DSSTox Central Website, that will enable decentralized, free public access to structure-toxicity data files, and that will effectively link knowledgeable toxicity data sources with potential users of these data from other disciplines (such as chemistry, modeling, and computer science); and (3) to engage public/commercial/academic/industry groups in contributing to and expanding this community-wide, public data sharing and distribution effort. The DSSTox project's overall aims are to effect the closer association of chemical structure information with existing toxicity data, and to promote and facilitate structure-based exploration of these data within a common chemistry-based framework that spans toxicological disciplines.
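A minimal sketch of how a structure data file (SDF) of the kind proposed here could be read into a chemical relational table. It uses RDKit; the file path and the property field names ("CASRN", "Mutagenicity_Call") are hypothetical examples of the text/property fields an SDF record can carry, not fields mandated by DSSTox.

```python
import pandas as pd
from rdkit import Chem

def load_sdf_records(path: str) -> pd.DataFrame:
    """Read structures plus selected property fields from an SDF into a table."""
    rows = []
    for mol in Chem.SDMolSupplier(path):
        if mol is None:          # skip records RDKit cannot parse
            continue
        props = mol.GetPropsAsDict()
        rows.append({
            "smiles": Chem.MolToSmiles(mol),        # canonical structure string
            "casrn": props.get("CASRN"),            # hypothetical identifier field
            "mutagenicity": props.get("Mutagenicity_Call"),  # hypothetical toxicity field
        })
    return pd.DataFrame(rows)

# df = load_sdf_records("dsstox_subset.sdf")   # placeholder path
# print(df.head())
```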

6.
7.
Rosenkranz HS. Mutation Research 2003;529(1-2):117-127.
The health risk manager and policy analyst must frequently make recommendations based upon incomplete toxicity data. This situation is encountered in the evaluation of human carcinogenic risks, as animal cancer bioassay results are often unavailable. In this study, in order to assess the relevance of other possible indicators of carcinogenic risk, we used the "chemical diversity approach" to estimate the magnitude of the human carcinogenic risk based upon Salmonella mutagenicity and systemic toxicity data for the "universe of chemicals" to which humans have the potential to be exposed. Analyses of the properties of 10,000 agents representative of the "universe of chemicals" suggest that chemicals that have genotoxic potential and exhibit greater systemic toxicity are more likely to be carcinogens than non-genotoxicants or agents that exhibit lesser toxicity. Since "genotoxic" carcinogenicity is a hallmark of recognized human carcinogens, these findings are relevant to human cancer risk assessment.
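A minimal sketch of the kind of comparison underlying the "chemical diversity approach" described above: estimate how much more often mutagenic and systemically toxic chemicals are carcinogens than non-mutagenic, less toxic ones. All counts are hypothetical placeholders, not the study's data.

```python
# (mutagenic, high systemic toxicity) -> (carcinogens, total tested); assumed counts
groups = {
    (True,  True):  (45, 60),
    (True,  False): (30, 60),
    (False, True):  (15, 60),
    (False, False): ( 8, 60),
}

for (mutagenic, high_tox), (pos, total) in groups.items():
    p = pos / total
    print(f"mutagenic={mutagenic!s:5}  high toxicity={high_tox!s:5}  P(carcinogen) = {p:.2f}")
# A higher carcinogen fraction among genotoxic and more systemically toxic
# agents is the pattern the abstract reports for the "universe of chemicals".
```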

8.
The conventional method for assessing the safety of products, ranging from pharmaceuticals to agrochemicals, biocides and industrial and household chemicals - including cosmetics - involves determining their toxicological properties by using experimental animals. The aim is to identify any possible adverse effects in humans by using these animal models. Providing safe products is undoubtedly of the utmost importance but, over the last decade or so, this aim has come into conflict with strong public opinion, especially in Europe, against animal testing. Industry, academia and the regulators have worked in partnership to find other ways of evaluating the safety of products, by non-animal testing, or at least by reducing the numbers of animals required and the severity of the tests in which they are used. There is a long way to go before products can be evaluated without any animal studies, and it may be that this laudable aim is an impossible dream. Nevertheless, considerable progress has been made by using a combination of in vitro tests and the prediction of properties based on chemical structure. The aim of this review is to describe these important and worthwhile developments in various areas of toxicological testing, with a focus on the European regulatory framework for general industrial and household chemicals.

9.
DNA microarrays and toxicogenomics: applications for ecotoxicology?

10.
Ionic liquids (ILs), a class of materials with unique physicochemical properties, have been used extensively in the fields of chemical engineering, biotechnology, material sciences, pharmaceutics, and many others. Because ILs are very polar by nature, they can migrate into the environment, with the possibility of inclusion in the food chain and bioaccumulation in living organisms. However, ILs are not inherently biocompatible. Therefore, practical uses of ILs must be preceded by suitable toxicological assessments. Among different methods, the use of microorganisms to evaluate IL toxicity provides many advantages, including short generation time, rapid growth, and environmental and industrial relevance. This article reviews recent research progress on the toxicological properties of ILs toward microorganisms and highlights the computational prediction of various toxicity models.

11.
The notification procedure of the European Union (EU) for new chemicals requires the application of protocols on physicochemical and toxicological tests for the evaluation of physicochemical properties and probable toxic effects of each notified substance. A computerised database was developed from data sets and toxicological test protocols relating to substance properties responsible for skin and eye irritation/corrosion. To develop specific structure-activity relationship (SAR) models and to find rules for a decision support system (DSS) to predict local irritation/corrosion, physical property data, chemical structure data and toxicological data for approximately 1300 chemicals, each having a purity of 95% or more, were evaluated. The evaluation demonstrated that the lipid solubility and aqueous solubility of a chemical are relevant to, or in some cases responsible for, the observed local effects of a substance on the skin and eyes of rabbits. The octanol/water partition coefficient and the measured value of the surface tension of a saturated aqueous solution of the substance give additional information that permits the definition of detailed SAR algorithms that use measured solubility values. Data on melting points and vapour pressure can be used to assess the intensity and duration of local contact with a chemical. Considerations relating to the reactivity of a pure chemical can be based on molecular weight and the nature of the heteroatoms present. With respect to local lesions produced following contact with the skin and eyes of rabbits, the data evaluation revealed that no general "local irritation/corrosion potential" of a chemical can be defined. A variety of mechanisms are responsible for the formation of local lesions on the skin or in the eyes: serious lesions are produced by mechanisms different from those that cause moderate irritation in these organs. In order to develop a DSS that uses the information extracted from the database, chemical main groups were categorised on the basis of their empirical formulae, and rules were defined of the type IF (physicochemical property) A, THEN NOT (toxic) effect B, based on correlations between specific local effects and measured physicochemical values. Other rules of the type IF substructure A, THEN effect B were developed, based on correlations between specific local effects and the submitted structural formulae. Reactive chemical substructures relevant to the formation of local lesions, and rules for the prediction of the absence of any skin irritation potential, were identified. Proposals are made relating to the development of alternatives to eye irritation testing with rabbits.
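A minimal sketch of decision-support rules of the form "IF (physicochemical property) A, THEN NOT (toxic) effect B" described above. The specific properties, thresholds and conclusions are illustrative assumptions, not the rules actually derived from the notification database.

```python
from dataclasses import dataclass

@dataclass
class Substance:
    name: str
    log_kow: float           # octanol/water partition coefficient (log10)
    water_solubility: float  # g/L, saturated aqueous solution
    melting_point: float     # deg C

def skin_rules(s: Substance) -> list[str]:
    """Return conclusions of the form 'NOT <effect>' triggered by the rules."""
    conclusions = []
    # Illustrative rule: a high-melting solid with negligible water and lipid
    # solubility is unlikely to reach viable skin layers in irritating amounts.
    if s.melting_point > 150 and s.water_solubility < 0.001 and s.log_kow < 0:
        conclusions.append("NOT skin irritation")
    # Illustrative rule: very low lipid solubility argues against corrosivity
    # by a partitioning-driven mechanism (reactivity-based rules would be separate).
    if s.log_kow < -2:
        conclusions.append("NOT skin corrosion (non-reactive mechanism)")
    return conclusions

example = Substance("example solid", log_kow=-2.5,
                    water_solubility=0.0005, melting_point=210)
print(skin_rules(example))
```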

12.
BASF has developed a metabolomics database (MetaMap® Tox) containing approximately 500 data-rich chemicals, agrochemicals and drugs. This metabolome database has been built upon 28-day studies in rats (adapted to the OECD 407 guideline) with blood sampling and metabolic profiling after 7, 14 and 28 days of test substance treatment. Numerous metabolome patterns have been established for different toxicological targets (liver, kidney, thyroid, testes, blood, nervous system and endocrine system) which are specific for different toxicological modes of action. With these patterns, early detection of toxicological effects and the underlying mechanism can now be obtained from routine studies. Early recognition of the toxicological mode of action will help to develop new compounds with a more favourable toxicological profile and will also help to reduce the number of animal studies necessary to do so. Thus this technology contributes to animal welfare through reduction and refinement (2R), but it also has potential as a replacement method by analyzing samples from in vitro studies. With respect to the REACH legislation, for which a large number of animal studies will need to be performed, one of the most promising methods to reduce the number of animal experiments is grouping of chemicals and read-across to those which are data-rich. So far, mostly chemical similarity or QSAR models drive the selection process for chemical grouping. However, "omics" technologies such as metabolomics may help to optimize the chemical grouping process by providing biologically based criteria for toxicological equivalence: "from QSAR to QBAR" (quantitative biological activity relationship).
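A minimal sketch of using metabolome profiles as a biological criterion for chemical grouping and read-across, as the abstract suggests: compounds whose plasma metabolite changes correlate strongly are grouped together. The metabolite fold-change values and compound names are hypothetical.

```python
import numpy as np

# Rows: compounds; values: log2 fold-changes of selected plasma metabolites (assumed)
profiles = {
    "candidate":   np.array([ 1.2, -0.8, 0.3,  2.1, -0.5]),
    "reference_A": np.array([ 1.0, -0.7, 0.4,  1.8, -0.6]),  # data-rich analogue
    "reference_B": np.array([-0.9,  1.1, 0.0, -1.5,  0.7]),  # different mode of action
}

def pattern_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation between two metabolome response patterns."""
    return float(np.corrcoef(a, b)[0, 1])

for ref in ("reference_A", "reference_B"):
    r = pattern_similarity(profiles["candidate"], profiles[ref])
    print(f"candidate vs {ref}: r = {r:.2f}")
# A high correlation with a data-rich reference supports grouping the candidate
# with that reference for read-across ("QBAR" rather than QSAR).
```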

13.
All substances are toxic when the dose is large enough. In order to regulate the use of chemicals, we need to measure the level at which toxic effects are found. Epidemiological evidence suggests that present levels of chemical use do not lead to widespread harmful contamination of the human environment. For chemicals, most of the problems of toxicity are found in the workplace, while the population at large gets most of its toxic effects from voluntary exposure to substances such as tobacco smoke and ethanol. The prevention and control of toxic effects depends on a series of steps. This begins with measurement of toxicity in model systems, such as laboratory animals, and the estimation of the likely exposure of workers or consumers. Reliable extrapolation of information gathered from animals to the diverse and biochemically differing human population depends on understanding mechanisms of toxic effects. The toxic effect and mechanisms of action of substances such as carbon tetrachloride or paracetamol have been extensively investigated, and our ability to predict toxicity or develop antidotes to poisoning has had some success, but epidemiology is still an essential part of assessment of toxic effects of new chemicals. The example of phenobarbitone shows how animal experiments may well lead to conclusions which do not apply to man. After measurement of toxicity and assessment of likely hazards in use comes the final evaluation of the use of a chemical. This depends not only on its toxicity, but also on its usefulness. The direct effects on health may be small in comparison with the indirect advantageous effects which a useful substance such as vinyl chloride may bring. The assessment of risks and benefits of new chemicals can be partly removed from a political style of discourse, but the evaluation of the relative weight to be attached to these risks and benefits is inescapably political. The scientific contribution must be to allow the debate to take place in the light of maximum clarity of information about the consequences of use of chemicals.

14.
Many evaluations estimating safe levels of hydrophobic organic chemicals in sediments do not account for confounding factors such as physical habitat quality or covariance among chemicals. Controlled experiments demonstrating cause and effect can be conducted with spiked sediment toxicity tests, but application of this methodology has been limited in part by concerns about chemical bioavailability and challenges in achieving target concentrations. Relevant literature was reviewed to assess the utility of standardizing sediment equilibration times; hydrophobicity, complex sediment characteristics, and temperature were identified as potentially equally important factors. Disequilibrium appears likely following limited equilibration time but should yield conservative toxicity test results relative to aged field sediments. Nominal and measured concentrations in over 20 published studies were compared to assess spiked chemical recovery (i.e., measured concentration/nominal concentration). Recovery varied substantially among studies and was not readily predictable based on spiking or extraction method, chemical properties, or measured sediment characteristics, although unmeasured differences between sediments appeared to be important. Factors affecting specific studies included chemical adsorption to glassware, biodegradation, and volatilization. Pre- and post-toxicity test analyses are recommended to confirm exposure concentrations. Studies with 2,3,7,8-tetrachlorodibenzo-p-dioxin (2,3,7,8-TCDD) and hexachlorobenzene (HCB) exemplify the utility of verifying results of field studies using spiked sediment tests. Sediments spiked with these chemicals at concentrations greatly exceeding those in associated field studies caused no adverse effects in test organisms, demonstrating that other chemicals co-occurring in the test sediment samples caused the toxicity initially attributed to 2,3,7,8-TCDD and HCB in the field studies. Another key application of spiked sediment tests has been the investigation of total organic carbon (TOC) as the primary factor affecting bioavailability of hydrophobic organic chemicals. A review of LC50s for nine chemicals reported in 12 studies shows that comparable LC50s derived in different sediments generally agree within a factor of five when concentrations are normalized to a constant TOC. Additionally, use of spiked sediment toxicity testing to investigate toxicological interactions among chemicals provides a promising approach to improving the ability to predict sediment toxicity in the field.
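A minimal sketch of the two simple calculations behind this review: spiked-chemical recovery (measured/nominal) and normalization of sediment LC50s to organic carbon content. All numbers are illustrative placeholders, not values from the cited studies.

```python
def recovery(measured: float, nominal: float) -> float:
    """Fraction of the nominal spiked concentration actually measured."""
    return measured / nominal

def oc_normalized_lc50(lc50_dry_wt: float, toc_fraction: float) -> float:
    """Express an LC50 (mg/kg dry weight) per kg of organic carbon."""
    return lc50_dry_wt / toc_fraction

print(f"recovery = {recovery(measured=7.2, nominal=10.0):.0%}")   # 72%

# Two sediments with different TOC give similar OC-normalized LC50s:
print(oc_normalized_lc50(lc50_dry_wt=15.0, toc_fraction=0.01))  # 1% TOC sediment
print(oc_normalized_lc50(lc50_dry_wt=75.0, toc_fraction=0.05))  # 5% TOC sediment
```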

15.
Toxicity testing: creating a revolution based on new technologies
Biotechnology is evolving at a tremendous rate. Although drug discovery is now heavily focused on high-throughput and miniaturized screening, the application of these advances to the toxicological assessment of chemicals and chemical products has been slow. Nevertheless, the impending surge in demands for the regulatory toxicity testing of chemicals provides the impetus for the incorporation of novel methodologies into hazard identification and risk assessment. Here, we review the current and likely future value of these new technologies in relation to toxicological evaluation and the protection of human health.

16.
Kolman A. Tsitologiia 2010;52(10):888-890, inside back cover.
Dr. Björn Ekwall (1940-2000) was a prominent Swedish cell toxicologist who made an outstanding contribution to the field of in vitro toxicology. In the early 1980s, Ekwall formulated the so-called basal cytotoxicity concept, which became the basis for a modern orientation in cell toxicology: the use of tests on cells in culture, instead of tests on experimental animals, to predict acute systemic toxicity in humans. To verify his theories, Ekwall organized and led the international toxicological project MEIC, the Multicentre Evaluation of In Vitro Cytotoxicity programme (1989-1999). In this project, 50 selected chemicals were tested in 100 laboratories worldwide with more than 60 different in vitro tests (the laboratories chose the tests themselves). The MEIC project was unique not only because of its large scale but, in particular, because, for the first time, human peak blood concentrations after acute poisoning with chemicals were used as references, with the aim of checking the predictivity of the in vitro assays. The results of the MEIC project clearly demonstrated the possibility of using in vitro tests to predict the toxicity of chemicals in humans.

17.
A method for determining toxicity that uses a bacterium as the indicator organism, developed previously (Botsford 1998), perceives most divalent cations as toxic. Mercury is perceived as the most toxic, followed by cadmium, zinc and copper. It was found that adding 2.5 mM EDTA to the reaction relieved the toxicity of the 15 divalent cations tested. This effect does not appear to be simple chelation: one micromolar EDTA eliminated the toxicity of 1.6 mM calcium or 0.006 mM mercury. Thirty-six chemicals were tested for their toxicity in the presence and absence of 2.5 mM EDTA and 25 ppm calcium. Twenty-one were less toxic, and two of these, p-aminobenzoic acid and tetrachloroethylene, no longer appeared toxic in the assay when these additions were present. Six chemicals had the same toxicity with and without the additions. Nine chemicals were more toxic when the EDTA and calcium were present. The experiment was repeated with six chemicals at ten times the EDTA concentration and ten times the calcium concentration, and the toxicity with the 10× additions was compared with that with the 1× additions. For four of the six chemicals, the absorbance values, and hence the apparent toxicity, changed at the higher EDTA and calcium concentrations. Clearly, before EDTA can be added to mitigate the toxicity of divalent cations, it must be determined how much EDTA is required to eliminate the toxicity caused by the ions present in the sample. Alternatively, if the identity of the contaminating organic chemical is known, the effect of EDTA and the divalent cations present on the apparent toxicity of the compound can be determined.

18.
Toxicological risk assessment for chemicals is still mainly based on highly standardised protocols for animal experimentation and exposure assessment. However, developments in our knowledge of general physiology, in chemico-biological interactions and in (computer-supported) modelling have resulted in a tremendous change in our understanding of the molecular mechanisms underlying the toxicity of chemicals. This permits the development of biologically based models in which the biokinetics as well as the toxicodynamics of compounds can be described. This paper discusses the possibilities of developing systems in which the systemic (acute and chronic) toxicities of chemicals can be quantified without heavy reliance on animal experiments. By integrating data derived from different sources, predictions of toxicity can be made. Key elements in this integrated approach are the evaluation of chemical functionalities representing structural alerts for toxic actions, the construction of biokinetic models on the basis of non-animal data (for example, tissue-blood partition coefficients and in vitro biotransformation parameters), tests or batteries of tests for determining basal cytotoxicity, and more specific tests for evaluating tissue or organ toxicity. It is concluded that this approach is a useful tool for various steps in toxicological hazard and risk assessment, especially for those forms of toxicity for which validated in vitro and other non-animal tests have already been developed.
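A minimal sketch of the kind of biokinetic model the abstract refers to, built from non-animal inputs: a one-compartment model whose volume of distribution is approximated from a tissue-blood partition coefficient and whose clearance is scaled from in vitro biotransformation data. All parameter values and scaling factors are illustrative assumptions, not a validated model.

```python
import numpy as np

def plasma_concentration(dose_mg_per_kg, vd_l_per_kg, clearance_l_per_h_per_kg, t_hours):
    """C(t) = (Dose/Vd) * exp(-k*t) for an i.v. bolus, with k = CL/Vd."""
    k = clearance_l_per_h_per_kg / vd_l_per_kg
    return (dose_mg_per_kg / vd_l_per_kg) * np.exp(-k * t_hours)

# Illustrative non-animal inputs:
tissue_blood_partition = 3.0                      # e.g. estimated from log Kow
vd = 0.2 + 0.6 * tissue_blood_partition           # crude volume of distribution, L/kg
in_vitro_clint = 12.0                             # µL/min per 10^6 hepatocytes (assumed)

# Very rough scale-up of intrinsic clearance to whole-body L/h/kg,
# using assumed hepatocellularity and liver mass (no liver-flow model).
cells_per_g_liver, liver_g, body_kg = 120e6, 1800.0, 70.0
clearance = (in_vitro_clint * 1e-6 * 60           # L/h per 10^6 cells
             * cells_per_g_liver / 1e6            # per gram of liver
             * liver_g / body_kg)                 # whole body, per kg body weight

t = np.linspace(0, 24, 25)
conc = plasma_concentration(10.0, vd, clearance, t)
print(f"Cmax ~ {conc[0]:.2f} mg/L, C(24 h) ~ {conc[-1]:.4f} mg/L")
# Predicted concentrations can then be compared with in vitro basal
# cytotoxicity or organ-specific effect concentrations.
```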

19.
Almost 10 years ago, microarray technology was established as a new powerful tool for large-scale analysis of gene expression. Soon thereafter, the new technology was discovered by toxicologists for the purpose of deciphering the molecular events underlying toxicity, and the term "toxicogenomics" appeared in the scientific literature. Ever since, the toxicology community has been fascinated by the multiplicity of sophisticated possibilities toxicogenomics seems to offer: genome-wide analysis of toxicant-induced expression profiles may provide a means for prediction of toxicity prior to classical toxicological endpoints such as histopathology or clinical chemistry. Some researchers even speculated that the classical methods would soon become superfluous. It was assumed that by using toxicogenomics it would be possible to classify compounds early in drug development and consequently save animals, time, and money in pre-clinical toxicity studies. Moreover, it seemed within reach to unravel the molecular mechanisms underlying toxicity. The feasibility of bridging data derived from in vitro and in vivo systems, identifying new biomarkers, and comparing toxicological responses across species was also excessively praised. After several years of intensive application of microarray technology in the field of toxicology, and not only by the pharmaceutical industry, it is now time to survey its achievements and to question how many of these wishes and promises have really come true.

20.
Management of pests and diseases remains a key issue for agricultural profitability and environmental health. Moves towards sustainability require a reduction in chemical toxicity loadings and conservation of natural enemies to maintain pest control. There is a great deal of information from laboratory tests regarding the effects of chemicals on beneficial predators and parasitoids, but very few translations of these effects into field impacts, particularly under commercial conditions. To address this issue, we calculated a chemical toxicity score for 19 commercial vineyards based on IOBC toxicity ratings and application number, and compared this to extensive field collections to determine whether natural enemy populations can be related to predicted toxicity loadings. Invertebrates were sampled four times during the growing season using canopy sticky traps and ground-level pitfall traps. Ordination analyses using non-metric multidimensional scaling indicated that community structure in vineyards correlated with site chemical use, while principal components analyses identified the taxa involved. One ordination axis from the canopy data and two axes from the ground-level data were correlated with overall IOBC ratings for the vineyards. Principal components analyses indicated that spiders, lacewings, carabids and parasitoids were all affected by chemical use. IOBC ratings based on laboratory studies therefore correlated with chemical effects on field populations of natural enemies in commercial vineyards where complexes of pesticides were applied. The use of chemicals with low toxicity to beneficials, as predicted by IOBC ratings, will contribute to preservation and maintenance of natural enemies in vineyard ecosystems.
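A minimal sketch of a per-vineyard chemical toxicity score of the kind described above: each spray application contributes its IOBC toxicity rating, and contributions are summed over the season. The products, ratings and application counts are hypothetical, and the exact weighting used in the study may differ.

```python
from collections import defaultdict

# (vineyard, product, IOBC rating 1=harmless..4=harmful, number of applications) - assumed
spray_records = [
    ("vineyard_A", "sulfur",                1, 6),
    ("vineyard_A", "organophosphate",       4, 2),
    ("vineyard_B", "sulfur",                1, 8),
    ("vineyard_B", "synthetic_pyrethroid",  3, 1),
]

scores = defaultdict(int)
for vineyard, _product, iobc_rating, n_applications in spray_records:
    scores[vineyard] += iobc_rating * n_applications

for vineyard, score in sorted(scores.items()):
    print(vineyard, score)   # vineyard_A: 1*6 + 4*2 = 14; vineyard_B: 1*8 + 3*1 = 11
# These site scores can then be correlated with ordination axes (e.g. NMDS)
# summarizing the natural-enemy community collected at each vineyard.
```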
