Similar Documents (20 results)
1.
In its White Paper, "Strategy for a Future Chemicals Policy," published in 2001, the European Commission (EC) proposed the REACH (Registration, Evaluation and Authorisation of CHemicals) system to deal with both existing and new chemical substances. This system is based on a top-down approach to toxicity testing, in which the degree of toxicity information required is dictated primarily by production volume (tonnage). If testing is to be based on traditional methods, very large numbers of laboratory animals could be needed in response to the REACH system, causing ethical, scientific and logistical problems that would be incompatible with the time-schedule envisaged for testing. The EC has emphasised the need to minimise animal use, but has failed to produce a comprehensive strategy for doing so. The present document provides an overall scheme for predictive toxicity testing, whereby the non-animal methods identified and discussed in a recent and comprehensive ECVAM document could be used in a tiered approach to provide a rapid and scientifically justified basis for the risk assessment of chemicals for their toxic effects in humans. The scheme starts with a preliminary risk assessment process (involving available information on hazard and exposure), followed by testing based on physicochemical properties and (Q)SAR approaches. (Q)SAR analyses are used in conjunction with expert system and biokinetic modelling, and information on metabolism and identification of the principal metabolites in humans. The resulting information is then combined with production levels and patterns of use to assess potential human exposure. The nature and extent of any further testing should be based strictly on the need to fill essential information gaps in order to generate adequate risk assessments, and should rely on non-animal methods, as far as possible. The scheme also includes a feedback loop, so that new information is used to improve the predictivity of computational expert systems. Several recommendations are made, the most important of which is that the European Union (EU) should actively promote the improvement and validation of (Q)SAR models and expert systems, and computer-based methods for biokinetic modelling, since these offer the most realistic and most economical solution to the need to test large numbers of chemicals.
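The tiered, exposure-driven logic described above lends itself to a simple decision sketch. The following Python fragment is only an illustration of that kind of flow, assuming hypothetical thresholds, field names and a made-up structural-alert helper; it is not part of the REACH scheme or the ECVAM document.

```python
# Minimal sketch of a tiered, non-animal-first screening flow.
# All thresholds, field names and values are hypothetical illustrations,
# not values taken from REACH or the ECVAM document.
from dataclasses import dataclass

@dataclass
class Chemical:
    name: str
    tonnage: float                 # annual production volume, tonnes
    log_kow: float                 # physicochemical descriptor
    qsar_alerts: list[str]         # structural alerts flagged in silico
    exposure_estimate: float       # predicted human exposure, mg/kg bw/day

def tiered_assessment(chem: Chemical) -> str:
    """Return a coarse recommendation based on exposure and in silico hazard."""
    # Tier 0: use existing hazard/exposure information and (Q)SAR alerts.
    if not chem.qsar_alerts and chem.exposure_estimate < 0.001:
        return "low priority: document assessment, no further testing"
    # Tier 1: alerts or non-trivial exposure -> targeted in vitro follow-up.
    if chem.qsar_alerts and chem.exposure_estimate < 0.1:
        return "tier 1: in vitro tests targeted at the flagged alerts"
    # Tier 2: only if essential information gaps remain after tiers 0-1.
    return "tier 2: fill remaining data gaps, in vivo only as a last resort"

print(tiered_assessment(Chemical("example", 1200.0, 3.2, ["michael_acceptor"], 0.05)))
```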

2.

3.
This paper presents some results of a joint research project conducted by FRAME and Liverpool John Moores University, and sponsored by Defra, on the status of alternatives to animal testing with regard to the European Union REACH (Registration, Evaluation and Authorisation of Chemicals) system for the safety testing and risk assessment of chemicals. The project covered all the main toxicity endpoints associated with REACH. This paper focuses on the use of alternative (non-animal) methods (both in vitro and in silico) for repeat dose (sub-acute, sub-chronic and chronic) toxicity testing. It reviews the limited number of in silico and in vitro tests available for this endpoint, and outlines new technologies which could be used in the future, e.g. the use of biomarkers and the 'omics' technologies. An integrated testing strategy is proposed, which makes use of as much non-animal data as possible, before any essential in vivo studies are performed. Although none of the non-animal tests are currently undergoing validation, their results could help to reduce the number of animals required for testing for repeat dose toxicity.

4.
5.
Toxicological risk assessment for chemicals is still mainly based on highly standardised protocols for animal experimentation and exposure assessment. However, developments in our knowledge of general physiology, in chemicobiological interactions and in (computer-supported) modelling, have resulted in a tremendous change in our understanding of the molecular mechanisms underlying the toxicity of chemicals. This permits the development of biologically based models, in which the biokinetics as well as the toxicodynamics of compounds can be described. This paper discusses the possibilities of developing systems in which the systemic (acute and chronic) toxicities of chemicals can be quantified without heavy reliance on animal experiments. By integrating data derived from different sources, predictions of toxicity can be made. Key elements in this integrated approach are the evaluation of chemical functionalities representing structural alerts for toxic actions, the construction of biokinetic models on the basis of non-animal data (for example, tissue-blood partition coefficients, in vitro biotransformation parameters), tests or batteries of tests for determining basal cytotoxicity, and more-specific tests for evaluating tissue or organ toxicity. It is concluded that this approach is a useful tool for various steps in toxicological hazard and risk assessment, especially for those forms of toxicity for which validated in vitro and other non-animal tests have already been developed.
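As a rough illustration of the biokinetic element of this approach, the sketch below builds a one-compartment model from the kind of non-animal inputs mentioned (a partition-derived volume of distribution and an in vitro-scaled clearance). All parameter values are hypothetical placeholders, not data from the paper.

```python
# Minimal one-compartment biokinetic sketch parameterised with the kind of
# non-animal inputs mentioned above (in vitro clearance, partition data).
# All numbers are hypothetical placeholders.
import numpy as np

V_D = 0.8          # apparent volume of distribution (L/kg), e.g. from tissue-blood partitioning
CL_INT = 0.05      # intrinsic clearance scaled from an in vitro assay (L/h/kg)
DOSE = 1.0         # dose (mg/kg), assumed fully and instantly absorbed

k_el = CL_INT / V_D                           # first-order elimination rate constant (1/h)
t = np.linspace(0, 24, 97)                    # time grid over 24 h
c_plasma = (DOSE / V_D) * np.exp(-k_el * t)   # plasma concentration (mg/L)

print(f"Cmax ~ {c_plasma[0]:.2f} mg/L, C(24 h) ~ {c_plasma[-1]:.3f} mg/L")
# Comparing such predicted concentrations with in vitro effect concentrations
# (e.g. basal cytotoxicity IC50s) is one way to anchor a non-animal risk estimate.
```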

6.
Four predictions are made on the future of space age technologies in human and cultural ecology: first, remote sensing systems will generate a need for more fieldwork, not less; second, the services and skills of anthropologists will become essential to the interpretation of satellite data, especially as these relate to areas characterized by non-Western cultural practices; third, training in remote sensing and the use of geographic information systems will become a regular offering for anthropology students; and fourth, since these new systems and methods can be applied retrospectively to the re-analysis of earlier ethnographic works, space age technologies will be with us for some time to come.

7.
The European Centre for the Validation of Alternative Methods (ECVAM) proposes to make the validation process more flexible, while maintaining its high standards. The various aspects of validation are broken down into independent modules, and the information necessary to complete each module is defined. The data required to assess test validity in an independent peer review, not the process, are thus emphasised. Once the information to satisfy all the modules is complete, the test can enter the peer-review process. In this way, the between-laboratory variability and predictive capacity of a test can be assessed independently. Thinking in terms of validity principles will broaden the applicability of the validation process to a variety of tests and procedures, including the generation of new tests, new technologies (for example, genomics, proteomics), computer-based models (for example, quantitative structure-activity relationship models), and expert systems. This proposal also aims to take into account existing information, defining this as retrospective validation, in contrast to a prospective validation study, which has been the predominant approach to date. This will permit the assessment of test validity by completing the missing information via the relevant validation procedure: prospective validation, retrospective validation, catch-up validation, or a combination of these procedures.

8.
In April 2009, the International Life Sciences Institute (ILSI) Health and Environmental Sciences Institute's (HESI) Developmental and Reproductive Toxicology Technical Committee held a two-day workshop entitled "Developmental Toxicology-New Directions." The third session of the workshop focused on ways to refine animal studies to improve relevance and predictivity for human risk. The session included five presentations on: (1) considerations for refining developmental toxicology testing and data interpretation; (2) comparative embryology and considerations in study design and interpretation; (3) pharmacokinetic considerations in study design; (4) utility of genetically modified models for understanding mode-of-action; and (5) special considerations in reproductive testing for biologics. The presentations were followed by discussion by the presenters and attendees. Much of the discussion focused on aspects of refining current animal testing strategies, including use of toxicokinetic data, dose selection, tiered/triggered testing strategies, species selection, and use of alternative animal models. Another major area of discussion was use of non-animal-based testing paradigms, including how to define a "signal" or adverse effect, translating in vitro exposures to whole animal and human exposures, validation strategies, the need to bridge the existing gap between classical toxicology testing and risk assessment, and development of new technologies. Although there was general agreement among participants that the current testing strategy is effective, there was also consensus that traditional methods are resource-intensive and improved effectiveness of developmental toxicity testing to assess risks to human health is possible. This article provides a summary of the session's presentations and discussion and describes some key areas that warrant further consideration.

9.
Modern epidemiology suggests a potential interactive association between diet, lifestyle, genetics and the risk of many chronic diseases. As such, many epidemiologic studies attempt to consider assessment of dietary intake alongside genetic measures and other variables of interest. However, given the multi-factorial complexities of dietary exposures, all dietary intake assessment methods are associated with measurement errors which affect dietary estimates and may obscure disease risk associations. For this reason, dietary biomarkers measured in biological specimens are being increasingly used as additional or substitute estimates of dietary intake and nutrient status. Genetic variation may influence dietary intake and nutrient metabolism and may affect the utility of a dietary biomarker to properly reflect dietary exposures. Although there are many functional dietary biomarkers that, if utilized appropriately, can be very informative, a better understanding of the interactions between diet and genes as potentially determining factors in the validity, application and interpretation of dietary biomarkers is necessary. It is the aim of this review to highlight how some important biomarkers are being applied in nutrition epidemiology and to address some associated questions and limitations. This review also emphasizes the need to identify new dietary biomarkers and highlights the emerging field of nutritional metabonomics as an analytical method to assess metabolic profiles as measures of dietary exposures and indicators of dietary patterns, dietary changes or effectiveness of dietary interventions. The review will also touch upon new statistical methodologies for the combination of dietary questionnaire and biomarker data for disease risk assessment. It is clear that dietary biomarkers require much further research in order to be better applied and interpreted. Future priorities should be to integrate high quality dietary intake information, measurements of dietary biomarkers, metabolic profiles of specific dietary patterns, genetics and novel statistical methodology in order to provide important new insights into gene-diet-lifestyle-disease risk associations.
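As a loose illustration of combining questionnaire and biomarker data, the sketch below applies a simple regression-calibration step to simulated intakes; it is a generic statistical device, not a method proposed in the review, and all numbers are invented.

```python
# Illustrative sketch of one way questionnaire and biomarker data can be
# combined: a simple regression-calibration step before disease modelling.
# The data are simulated; this is not a method taken from the review.
import numpy as np

rng = np.random.default_rng(0)
n = 500
true_intake = rng.normal(50, 10, n)                    # unobserved "true" intake
ffq = true_intake + rng.normal(0, 8, n)                # questionnaire estimate, noisy
biomarker = 0.4 * true_intake + rng.normal(0, 2, n)    # biomarker, different error structure

# Calibrate: regress the biomarker on the questionnaire, then use the
# calibrated value as the exposure variable in a downstream risk model.
slope, intercept = np.polyfit(ffq, biomarker, 1)
calibrated = intercept + slope * ffq
print(f"calibration slope ~ {slope:.2f} (attenuation due to questionnaire error)")
```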

10.
11.
12.
Toxicogenomics (TGx) can be defined as the application of "omics" techniques to toxicology and risk assessment. By identifying molecular changes associated with toxicity, TGx data might assist hazard identification and the investigation of causes. Early technical challenges were evaluated and addressed by consortia (e.g. ILSI/HESI and the Microarray Quality Control (MAQC) consortium), which demonstrated that TGx gave reliable and reproducible information. The MAQC also produced "best practice on signature generation" after conducting an extensive evaluation of different methods on common datasets. Two findings of note were the need for methods that control batch variability, and that the predictive ability of a signature changes in concert with the variability of the endpoint. The key challenge remaining is data interpretation, because TGx can identify molecular changes that are causal, associated with or incidental to toxicity. Application of Bradford Hill's tests for causation, which are used to build mode of action (MOA) arguments, can produce reasonable hypotheses linking altered pathways to phenotypic changes. However, challenges in interpretation still remain: are all pathway changes equal, which are most important and plausibly linked to toxicity? Therefore, the expert judgement of the toxicologist is still needed. There are theoretical reasons why consistent alterations across a metabolic pathway are important, but similar changes in signalling pathways may not alter information flow. At the molecular level, thresholds may be due to the inherent properties of the regulatory network, for example switch-like behaviours from some network motifs (e.g. positive feedback) in the perturbed pathway leading to the toxicity. The application of systems biology methods to TGx data can generate hypotheses that explain why a threshold response exists. However, are we adequately trained to make these judgments? There is a need for collaborative efforts between regulators, industry and academia to properly define how these technologies can be applied using appropriate case-studies.
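The threshold argument from network motifs can be made concrete with a toy simulation. The sketch below integrates a single positive-feedback equation with illustrative parameters; it is not a model from the paper, only a demonstration of how such a motif yields switch-like behaviour.

```python
# Sketch of why a positive-feedback motif can produce a threshold (switch-like)
# dose response, as discussed above. Parameters are illustrative only.
import numpy as np
from scipy.integrate import odeint

def feedback(x, t, stimulus, k_deg=1.0, v_max=4.0, K=1.0, n=4):
    # production = basal stimulus + cooperative positive feedback; linear decay
    return stimulus + v_max * x**n / (K**n + x**n) - k_deg * x

t = np.linspace(0, 50, 500)
for stimulus in (0.05, 0.2, 0.5):
    x_final = odeint(feedback, 0.0, t, args=(stimulus,))[-1, 0]
    print(f"stimulus {stimulus:>4}: steady state ~ {x_final:.2f}")
# Below a critical stimulus the system settles near the low state; above it,
# the feedback drives it to a much higher state - an apparent threshold.
```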

13.
Nonhuman primate models of Parkinson's disease
Nonhuman primate (NHP) models of Parkinson's disease (PD) play an essential role in the understanding of PD pathophysiology and the assessment of PD therapies. NHP research enabled the identification of environmental risk factors for the development of PD. Electrophysiological studies in NHP models of PD identified the neural circuit responsible for PD motor symptoms, and this knowledge led to the development of subthalamic surgical ablation and deep brain stimulation. Similar to human PD patients, parkinsonian monkeys are responsive to dopamine replacement therapies and present complications associated with their long-term use, a similarity that facilitated the assessment of new symptomatic treatments, such as dopaminergic agonists. New generations of compounds and novel therapies that use directed intracerebral delivery of drugs, cells, and viral vectors benefit from preclinical evaluation in NHP models of PD. There are several NHP models of PD, each with characteristics that make it suitable for the study of different aspects of the disease or potential new therapies. Investigators who use the models and peer scientists who evaluate their use need information about the strengths and limitations of the different PD models and their methods of evaluation. This article provides a critical review of available PD monkey models, their utilization, and how they compare to emerging views of PD as a multietiologic, multisystemic disease. The various models are particularly useful for representing different aspects of PD at selected time points. This conceptualization provides clues for the development of new NHP models and facilitates the clinical translation of findings. As ever, successful application of any model depends on matching the model to the scientific question to be answered. Adequate experimental designs, with multiple outcome measures of clinical relevance and an appropriate number of animals, are essential to minimize the limitations of models and increase their predictive clinical validity.

14.
The unprecedented advances in molecular biology during the last two decades have resulted in a dramatic increase in knowledge about gene structure and function, an immense database of genetic sequence information, and an impressive set of efficient new technologies for monitoring genetic sequences, genetic variation, and global functional gene expression. These advances have led to a new sub-discipline of toxicology: "toxicogenomics". We define toxicogenomics as "the study of the relationship between the structure and activity of the genome (the cellular complement of genes) and the adverse biological effects of exogenous agents". This broad definition encompasses most of the variations in the current usage of this term, and in its broadest sense includes studies of the cellular products controlled by the genome (messenger RNAs, proteins, metabolites, etc.). The new "global" methods of measuring families of cellular molecules, such as RNA, proteins, and intermediary metabolites have been termed "-omic" technologies, based on their ability to characterize all, or most, members of a family of molecules in a single analysis. With these new tools, we can now obtain complete assessments of the functional activity of biochemical pathways, and of the structural genetic (sequence) differences among individuals and species, that were previously unattainable. These powerful new methods of high-throughput and multi-endpoint analysis include gene expression arrays that will soon permit the simultaneous measurement of the expression of all human genes on a single "chip". Likewise, there are powerful new methods for protein analysis (proteomics: the study of the complement of proteins in the cell) and for analysis of cellular small molecules (metabonomics: the study of the cellular metabolites formed and degraded under genetic control). This will likely be extended in the near future to other important classes of biomolecules such as lipids, carbohydrates, etc. These assays provide a general capability for global assessment of many classes of cellular molecules, providing new approaches to assessing functional cellular alterations. These new methods have already facilitated significant advances in our understanding of the molecular responses to cell and tissue damage, and of perturbations in functional cellular systems. As a result of this rapidly changing scientific environment, regulatory and industrial toxicology practice is poised to undergo dramatic change during the next decade. These advances present exciting opportunities for improved methods of identifying and evaluating potential human and environmental toxicants, and of monitoring the effects of exposures to these toxicants. These advances also present distinct challenges. For example, the significance of specific changes and the performance characteristics of new methods must be fully understood to avoid misinterpretation of data that could lead to inappropriate conclusions about the toxicity of a chemical or a mechanism of action. We discuss the likely impact of these advances on the fields of general and genetic toxicology, and risk assessment. We anticipate that these new technologies will (1) lead to new families of biomarkers that permit characterization and efficient monitoring of cellular perturbations, (2) provide an increased understanding of the influence of genetic variation on toxicological outcomes, and (3) allow definition of environmental causes of genetic alterations and their relationship to human disease.
The broad application of these new approaches will likely erase the current distinctions among the fields of toxicology, pathology, genetic toxicology, and molecular genetics. Instead, a new integrated approach will likely emerge that involves a comprehensive understanding of genetic control of cellular functions, and of cellular responses to alterations in normal molecular structure and function.
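As a minimal illustration of the kind of global expression readout described above, the sketch below ranks simulated array data by fold-change and a t-test; the data, gene names and cut-offs are invented for demonstration.

```python
# Toy illustration of a "global" expression readout: ranking genes by
# fold-change and a simple t-test on a simulated expression matrix.
# Data and gene names are simulated, not from any real array experiment.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
genes = [f"gene_{i}" for i in range(1000)]
control = rng.normal(8.0, 1.0, size=(1000, 5))   # log2 intensities, 5 control samples
treated = rng.normal(8.0, 1.0, size=(1000, 5))
treated[:25] += 2.0                              # 25 genes truly induced

log2_fc = treated.mean(axis=1) - control.mean(axis=1)
t_stat, p_val = stats.ttest_ind(treated, control, axis=1)
hits = [g for g, fc, p in zip(genes, log2_fc, p_val) if abs(fc) > 1 and p < 0.01]
print(f"{len(hits)} putative differentially expressed genes (of 25 spiked in)")
```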

15.
Problem formulation, risk analysis, and risk characterization are, respectively, the design, estimation, and interpretation stages of risk assessment. Models traditionally have been used to estimate exposure and effects; now opportunities are growing to use them to design and interpret risk assessments as well. This could raise the level of rigor, reproducibility, and transparency in the risk assessment process, and improve the way information and expertise get integrated to advise risk managers. The importance of good design and interpretation to the success of risk assessment and risk management, and the role of modeling in that success, is becoming increasingly apparent, but to date models are used to only a fraction of their potential. We provide two examples of the use of models to design and interpret risk assessments. The first looks at the use of models to better characterize risks by modeling uncertainties and exposure from offsite sources, and the second at their use to forecast the future risks of a new technology. Following the examples, we discuss some important obstacles to translating new modeling opportunities into practice. These include practical limits on the abilities of organizations to assimilate new tools and methods, and conceptual limits in the way people think about models.
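One common way models are used to characterise uncertainty in a risk estimate is Monte Carlo propagation of input distributions. The sketch below is a generic example with hypothetical distributions and an assumed reference dose; it is not one of the case studies discussed in the paper.

```python
# Sketch of using a model to characterise uncertainty in an exposure-based
# risk estimate via Monte Carlo sampling. All distributions are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

concentration = rng.lognormal(mean=np.log(0.5), sigma=0.6, size=n)  # mg/L in medium
intake_rate   = rng.normal(2.0, 0.4, size=n).clip(min=0.1)          # L/day
body_weight   = rng.normal(70.0, 12.0, size=n).clip(min=30.0)       # kg
reference_dose = 0.05                                                # mg/kg/day (assumed)

dose = concentration * intake_rate / body_weight                     # mg/kg/day
hazard_quotient = dose / reference_dose

print(f"median HQ  = {np.median(hazard_quotient):.2f}")
print(f"95th %ile  = {np.percentile(hazard_quotient, 95):.2f}")
print(f"P(HQ > 1)  = {(hazard_quotient > 1).mean():.3f}")
```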

16.
DNA barcodes, like traditional sources of taxonomic information, are potentially powerful heuristics in the identification of described species but require mindful analytical interpretation. The role of DNA barcoding in generating hypotheses of new taxa in need of formal taxonomic treatment is discussed, and it is emphasized that the recursive process of character evaluation is both necessary and best served by understanding the empirical mechanics of the discovery process. These undertakings carry enormous ramifications not only for the translation of DNA sequence data into taxonomic information but also for our comprehension of the magnitude of species diversity and its disappearance. This paper examines the potential strengths and pitfalls of integrating DNA sequence data, specifically in the form of DNA barcodes as they are currently generated and analyzed, with taxonomic practice.
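A minimal sketch of barcode-based identification, assuming toy aligned sequences and a plain uncorrected p-distance, is shown below; real barcoding workflows use curated reference markers (e.g. COI) and more careful distance and threshold analysis.

```python
# Minimal illustration of barcode-based identification: assign a query
# sequence to the reference species with the smallest uncorrected p-distance.
# Sequences are short made-up strings, not real barcodes.
def p_distance(a: str, b: str) -> float:
    """Proportion of differing sites between two aligned sequences."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

references = {
    "species_A": "ATGCGTACGTTAGCCTAGGA",
    "species_B": "ATGCGTACCTTAGCATAGGA",
    "species_C": "TTGCATACGTAAGCCTAGCA",
}
query = "ATGCGTACGTTAGCATAGGA"

distances = {sp: p_distance(query, seq) for sp, seq in references.items()}
best = min(distances, key=distances.get)
print(distances)
print(f"nearest reference: {best}")
# A real analysis would also check whether the best match falls within the
# intraspecific range of variation before treating it as an identification.
```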

17.
The European Workshop for Rheumatology Research met this year in Leiden, The Netherlands. The Workshop provided a platform to feast on new technologies and how they have taken research programmes forward. While there will be the inevitable delay during which mechanisms are devised for analysing the huge amount of information generated by these technologies, there is a lot already to look forward to. Highlights included genomic, reverse genomic and proteomic approaches to understanding disease pathogenesis and to identifying new therapeutic targets. Opportunities for exploring whether pharmacogenomics has a place in the clinic are now a reality, and phage display technology has been applied to in vivo arthritis models to identify human synovial microvascular 'post codes'.

18.
The market for biotherapeutic monoclonal antibodies (mAbs) is large and is growing rapidly. However, attrition poses a significant challenge for the development of mAbs, and for biopharmaceuticals in general, with large associated costs in resource and animal use. Termination of candidate mAbs may occur due to poor translation from preclinical models to human safety. It is critical that the industry addresses this problem to maintain productivity. Though attrition poses a significant challenge for pharmaceuticals in general, there are specific challenges related to the development of antibody-based products. Due to species specificity, non-human primates (NHP) are frequently the only pharmacologically relevant species for nonclinical safety and toxicology testing for the majority of antibody-based products, and therefore, as more mAbs are developed, increased NHP use is anticipated. The integration of new and emerging in vitro and in silico technologies, e.g., cell- and tissue-based approaches, systems pharmacology and modeling, have the potential to improve the human safety prediction and the therapeutic mAb development process, while reducing and refining animal use simultaneously. In 2014, to engage in open discussion about the challenges and opportunities for the future of mAb development, a workshop was held with over 60 regulators and experts in drug development, mechanistic toxicology and emerging technologies to discuss this issue. The workshop used industry case-studies to discuss the value of the in vivo studies and identify opportunities for in vitro technologies in human safety assessment. From these and continuing discussions it is clear that there are opportunities to improve safety assessment in mAb development using non-animal technologies, potentially reducing future attrition, and there is a shared desire to reduce animal use through minimised study design and reduced numbers of studies.

19.
Goh WW, Lee YH, Chung M, Wong L. Proteomics 2012, 12(4-5): 550-563
Proteomics provides important information, which may not be inferable from indirect sources such as RNA or DNA, on key players in biological systems or disease states. However, it suffers from coverage and consistency problems. The advent of network-based analysis methods can help in overcoming these problems but requires careful application and interpretation. This review briefly considers current trends in proteomics technologies and the causes of the critical issues that need to be addressed, i.e. incomplete data coverage and inter-sample inconsistency. On the coverage issue, we argue that holistic analysis based on biological networks provides a suitable background on which more robust models and interpretations can be built, and we introduce some recently developed approaches. On consistency, group-based approaches based on identified clusters, as well as on properly integrated pathway databases, are particularly useful. Although protein interaction and pathway networks are still largely incomplete, given proper quality checks, appropriate applications and reasonably sized data sets, they yield valuable insights that greatly complement data generated from quantitative proteomics.
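As a small illustration of the group-based, network-aware interpretation the review advocates, the sketch below maps a set of detected proteins onto a toy interaction network using networkx; the edge list and protein identifiers are invented for illustration.

```python
# Sketch of a group-based interpretation step: map the proteins detected in a
# sample onto a (toy) interaction network and inspect connected clusters, so a
# pathway can be flagged even when some members are missed by the instrument.
import networkx as nx

interactions = [("P1", "P2"), ("P2", "P3"), ("P3", "P4"), ("P5", "P6"), ("P6", "P7")]
network = nx.Graph(interactions)

detected = {"P1", "P3", "P4", "P6"}          # proteins observed in this run

subgraph = network.subgraph(detected)
for component in nx.connected_components(subgraph):
    # undetected neighbours of a detected cluster are candidate coverage gaps,
    # not necessarily evidence of absence
    missed = set().union(*(set(network[p]) for p in component)) - detected
    print(f"cluster {sorted(component)}: possible missed members {sorted(missed)}")
```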

20.
Evaluation of: Deighton RF, Kerr LE, Short DM et al. Network generation enhances interpretation of proteomics data from induced apoptosis. Proteomics DOI: 10.1002/pmic.200900112 (2010) (Epub ahead of print).

The huge ongoing improvements in proteomics technologies, including the development of high-throughput mass spectrometry, are resulting in ever increasing information on protein behavior during cellular processes. The exponential accumulation of proteomics data has the promise to advance biomedical sciences by shedding light on the most important events that regulate mammalian cells under normal and pathophysiological conditions. This may provide practical insights that will impact medical practice and therapy, and may permit the development of a new generation of personalized therapeutics. Proteomics, as a powerful tool, creates numerous opportunities as well as challenges. At the different stages, data interpretation requires proteomics analysis, various tools to help deal with large proteomics data banks, and the extraction of more functional information. Network analysis tools facilitate proteomics data interpretation and predict protein functions and functional interactions, and allow in silico identification of intracellular pathways. The work reported by Deighton and colleagues illustrates an example of improving proteomics data interpretation by network generation. The authors used Ingenuity Pathway Analysis to generate a protein network predicting direct and indirect interactions between 13 proteins found to be affected by staurosporine treatment. Importantly, the authors highlight the caution required when interpreting the results from a small number of proteins analyzed using network analysis tools.
