Similar Literature
 20 similar documents found
1.
The aims of this paper are to debate and raise awareness about the use of systematic, interconnected approaches for biodiversity collection curation by exploring the multi-disciplinary relevance of quality management tools developed by clinical biobanks. An appraisal of their best practices indicated the need for improved sample and process chain annotation, as a significant number of historical collections used in medical research were of inadequate quality. This stimulated the creation of a new discipline, biospecimen science, to develop quality management tools for clinical biobanks, two of which, Biospecimen Reporting for Improved Study Quality (BRISQ) and the Standard PRE-analytical Code (SPREC), report critical information about samples and process chain variables. Unprecedented advances in molecular-genetic and in silico technologies applied across the tree of life require international conservation networks to generate and share knowledge. This knowledge is used in biodiversity and systematics research, to address the accelerating loss of species, and to support the sustainable use of bioresources. This review investigates the application of BRISQ and SPREC for biodiversity research and conservation using natural history, museum and living culture collections as case studies. The distinction between preservation and conservation is discussed with regard to process and storage treatments and how they impact the usability of biospecimens and cultures. We conclude: (i) more rigorous approaches are needed for the quality management of biospecimens, bioresources and their associated sample and processing data to assure their fitness-for-purpose; and (ii) biospecimen science tools developed by clinical biobanks can be adapted to future-proof the quality of biodiversity collections and the reliability of molecular data generated from their use.
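As a rough illustration of the kind of annotation BRISQ and SPREC are designed to capture, the following Python sketch defines a simplified preanalytical record for a specimen and renders it as a single delimited code. The field names and code values are hypothetical examples chosen for this sketch, not the official SPREC element definitions.

```python
# Illustrative only: a minimal record for SPREC-style preanalytical annotation.
# The field names and code values are hypothetical examples, not the official
# SPREC element definitions; consult the published SPREC specification for real use.
from dataclasses import dataclass

@dataclass
class PreanalyticalRecord:
    sample_type: str          # e.g. "TIS" for tissue (assumed example code)
    collection_method: str    # how the specimen was obtained
    fixation_delay_h: float   # delay to fixation/stabilisation, in hours
    fixative: str             # fixative or stabiliser used
    fixation_time_h: float    # time in fixative, in hours
    storage_condition: str    # e.g. "M80" for -80 °C storage (assumed example code)

    def to_code(self) -> str:
        """Render the record as a single delimited annotation string."""
        return "-".join([
            self.sample_type,
            self.collection_method,
            f"{self.fixation_delay_h:g}h",
            self.fixative,
            f"{self.fixation_time_h:g}h",
            self.storage_condition,
        ])

# Example: a field-collected tissue sample annotated at accession time.
record = PreanalyticalRecord("TIS", "FIELD", 2.0, "ETOH96", 24.0, "M80")
print(record.to_code())   # TIS-FIELD-2h-ETOH96-24h-M80
```

A collection database could store such a code next to each accession so that downstream users can judge fitness-for-purpose before requesting material.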

2.
Abstract

High quality cancer research using human biospecimens must be based on biospecimens that have been obtained according to rigorous ethical and procedural standards. Methods for biospecimen collection, stabilization, processing and storage for research, however, are highly variable, and the biospecimens available for research are often of unknown quality. The National Cancer Institute Biospecimen Research Network program was initiated in 2006 to conduct, sponsor, report and collaborate on research to better understand the effects of different biospecimen collection methods on downstream molecular analysis. An online Biospecimen Research Database and an annual symposium, Advancing Cancer Research through Biospecimen Science, have been developed, and many research projects are underway to build a knowledge base from which to develop evidence-based biospecimen standard operating procedures and methods for assessing biospecimen quality. These efforts will enable better cancer research and development.

3.
Human biospecimen samples (HBS) and associated data stored in biobanks (also called “biotrusts,” “biorepositories,” or “biodistributors”) are critical resources for translational research. As HBS quality is decisive for the reproducibility of research results, biobanks are also key assets for new developments in precision medicine. Biobanks are more than infrastructures providing HBS and associated data. Biobanks pioneered the identification and standardization of sources of preanalytical variation in HBS, thus paving the way for current biospecimen science. To achieve this milestone, biobankers have successively assumed the roles of “detective” and then “architect” to identify new detrimental impacts of preanalytical variables on tissue integrity. While standardized omics methods are expected to be practiced throughout research communities, the accepted best practices and standards for biospecimen handling are generally neither known nor applied by researchers. Therefore, it is essential not only to raise awareness within omics communities of the basic concepts of collecting, storing, and utilizing HBS today, but also to offer insights on biobanking in the cancer omics context.

4.

Background

Cancer health disparities research depends on access to biospecimens from diverse racial/ethnic populations. This mixed-methods study, combining quantitative and qualitative analysis of survey results, assessed barriers, concerns, and practices for sharing biospecimens/data among researchers working with biospecimens from minority populations in a five-state region of the United States (Arizona, Colorado, New Mexico, Oklahoma, and Texas). The ultimate goals of this research were to understand data sharing barriers among biomedical researchers; to guide strategies to increase participation in biospecimen research; and to strengthen collaborative opportunities among researchers.

Methods and Population

Email invitations to anonymous participants (n = 605 individuals identified via the NIH RePORT database) resulted in 112 responses. The survey assessed demographics, specimen collection data, and attitudes about virtual biorepositories. Respondents were primarily principal investigators at PhD-granting institutions (91.1%) conducting basic research (62.3%); most were non-Hispanic White (63.4%) and men (60.6%). The low response rate limited the statistical power of the analyses; furthermore, the number of respondents for each survey question varied.

Results

Findings from this study identified barriers to biospecimen research, including lack of access to sufficient biospecimens and limited availability of diverse tissue samples. Many of these barriers can be attributed to poor annotation of biospecimens and researchers’ unwillingness to share existing collections. Addressing these barriers to accessing biospecimens is essential to combating cancer in general and cancer health disparities in particular. This study confirmed researchers’ willingness to participate in a virtual biorepository (n = 50 respondents agreed). However, researchers in this region listed clear specifications for establishing and using such a biorepository: specifications related to standardized procedures, funding, and protections of human subjects and intellectual property. The results help guide strategies to increase data sharing behaviors and to increase participation of researchers with multiethnic biospecimen collections in collaborative research endeavors.

Conclusions

Data sharing by researchers is essential to leveraging knowledge and resources needed for the advancement of research on cancer health disparities. Although U.S. funding entities have guidelines for data and resource sharing, future efforts should accommodate researcher preferences in order to promote collaboration on cancer health disparities.

5.
For scientific, ethical and economic reasons, experiments involving animals should be appropriately designed, correctly analysed and transparently reported. This increases the scientific validity of the results, and maximises the knowledge gained from each experiment. A minimum amount of relevant information must be included in scientific publications to ensure that the methods and results of a study can be reviewed, analysed and repeated. Omitting essential information can raise scientific and ethical concerns. We report the findings of a systematic survey of reporting, experimental design and statistical analysis in published biomedical research using laboratory animals. Medline and EMBASE were searched for studies reporting research on live rats, mice and non-human primates carried out in UK and US publicly funded research establishments. Detailed information was collected from 271 publications about the objective or hypothesis of the study, the number, sex, age and/or weight of animals used, and the experimental and statistical methods. Only 59% of the studies stated the hypothesis or objective of the study and the number and characteristics of the animals used. Appropriate and efficient experimental design is a critical component of high-quality science. Most of the papers surveyed did not use randomisation (87%) or blinding (86%) to reduce bias in animal selection and outcome assessment. Only 70% of the publications that used statistical methods described their methods and presented the results with a measure of error or variability. This survey has identified a number of issues that need to be addressed in order to improve experimental design and reporting in publications describing research using animals. Scientific publication is a powerful and important source of information; the authors of scientific publications therefore have a responsibility to describe their methods and results comprehensively, accurately and transparently, and peer reviewers and journal editors share the responsibility to ensure that published studies fulfil these criteria.
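To make the randomisation and blinding point concrete, here is a minimal Python sketch, not taken from any of the surveyed papers, of how animals might be randomly allocated to groups with a recorded seed and then relabelled with neutral codes so that outcome assessment can be blinded. All names and values are illustrative.

```python
# Minimal sketch (illustrative, not from the surveyed papers): seeded random
# allocation of animals to treatment groups, plus neutral codes so that outcome
# assessment can be performed blind to treatment.
import random

def allocate(animal_ids, groups, seed=20240101):
    """Randomly assign animals to groups with a recorded seed, and issue blinded codes."""
    rng = random.Random(seed)          # fixed, reported seed makes the allocation auditable
    ids = list(animal_ids)
    rng.shuffle(ids)
    allocation = {aid: groups[i % len(groups)] for i, aid in enumerate(ids)}
    blinded = {f"A{i + 1}": aid for i, aid in enumerate(ids)}  # assessor sees only "A1", "A2", ...
    return allocation, blinded

allocation, blinded = allocate(range(1, 21), ["control", "treated"])
print(allocation)   # e.g. {7: 'control', 13: 'treated', ...}
```

Reporting the seed, the allocation procedure and the blinding codes in the publication is what allows the design to be reviewed and repeated.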

6.
Analysis of formalin-fixed paraffin-embedded (FFPE) tissue by immunohistochemistry (IHC) is commonplace in clinical and research laboratories. However, reports suggest that IHC results can be compromised by biospecimen preanalytical factors. The National Cancer Institute’s Biospecimen Preanalytical Variables Program conducted a systematic study to examine the potential effects of delay to fixation (DTF) and time in fixative (TIF) on IHC using 24 cancer biomarkers. Differences in IHC staining, relative to controls with a DTF of 1 hr, were observed in FFPE kidney tumor specimens after a DTF of ≥2 hr. Reductions in H-score and/or staining intensity were observed for c-MET, p53, PAX2, PAX8, pAKT, and survivin, whereas increases were observed for RCC1, EGFR, and CD10. Prolonged TIF of 72 hr resulted in significantly reduced H-scores of CD44 and c-Met in kidney tumor specimens, compared with controls with 12-hr TIF. An elevated probability of altered staining intensity due to DTF was observed for nine antigens, whereas for prolonged TIF an elevated probability was observed for one antigen. Results reported here and elsewhere across tumor types and antigens support limiting DTF to ≤1 hr when possible and fixing tissues in formalin for 12–24 hr to avoid confounding effects of these preanalytical factors on IHC.
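A hypothetical helper along the following lines could flag specimens whose recorded preanalytical variables fall outside the limits the study supports (DTF of at most 1 h and TIF of 12 to 24 h); the function name and thresholds are written here for illustration only.

```python
# Illustrative helper (not from the cited study): flag specimens whose recorded
# delay to fixation (DTF) or time in fixative (TIF) falls outside the limits
# supported by the abstract (DTF <= 1 h; TIF 12-24 h).
def check_preanalytical(dtf_hours: float, tif_hours: float) -> list:
    warnings = []
    if dtf_hours > 1:
        warnings.append(f"DTF of {dtf_hours} h exceeds the recommended <= 1 h")
    if not 12 <= tif_hours <= 24:
        warnings.append(f"TIF of {tif_hours} h is outside the recommended 12-24 h window")
    return warnings

print(check_preanalytical(dtf_hours=2, tif_hours=72))
# ['DTF of 2 h exceeds the recommended <= 1 h', 'TIF of 72 h is outside the recommended 12-24 h window']
```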

7.
8.
9.
Background: Reproducibility is a defining feature of a scientific discovery, and it can be assessed at different levels for different types of study. The purpose of the Human Cell Atlas (HCA) project is to build maps of molecular signatures of all human cell types and states to serve as references for future discoveries. Constructing such a complex reference atlas must involve the assembly and aggregation of data from multiple labs, probably generated with different technologies, and this imposes much higher reproducibility requirements than individual research projects. To add another layer of complexity, the bioinformatics procedures involved in single-cell data analysis are highly flexible and diverse: many factors in the processing and analysis of single-cell RNA-seq data can shape the final results in different ways. Methods: To study what levels of reproducibility can be reached in current practice, we conducted a detailed reproduction study of a well-documented recent publication on the atlas of human blood dendritic cells, as an example, to break down the bioinformatics steps and factors that are crucial for reproducibility at different levels. Results: We found that the major scientific discovery could be reproduced well after some effort, but there were also differences in some details that may cause uncertainty in the future reference. This study provides a detailed case observation for the ongoing discussions of the standards the HCA community should adopt when releasing data and publications to guarantee the reproducibility and reliability of the future atlas. Conclusion: Current practices of releasing data and publications may not be adequate to guarantee the reproducibility of the HCA. We propose building more stringent guidelines and standards on the information that needs to be provided along with publications for projects involved in the HCA program.
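One practical way to reduce this flexibility problem is to publish the exact processing choices alongside the data. The sketch below, an illustration rather than the pipeline of the reproduced study, records hypothetical single-cell processing parameters and environment details as a JSON provenance file; every step name and value is an assumption made for the example.

```python
# Illustrative sketch: recording the processing choices that shape single-cell
# RNA-seq results so they can be reported alongside a publication. The steps,
# parameters and values are hypothetical examples, not the cited study's pipeline.
import json
import platform
import sys

provenance = {
    "python": sys.version.split()[0],
    "platform": platform.platform(),
    "random_seed": 0,
    "steps": [
        {"step": "cell_filtering", "params": {"min_genes": 200, "max_mito_frac": 0.1}},
        {"step": "normalisation",  "params": {"method": "total-count", "target_sum": 10000}},
        {"step": "hvg_selection",  "params": {"n_top_genes": 2000}},
        {"step": "dim_reduction",  "params": {"method": "PCA", "n_comps": 50}},
        {"step": "clustering",     "params": {"method": "graph-based", "resolution": 1.0}},
    ],
}

with open("analysis_provenance.json", "w") as fh:
    json.dump(provenance, fh, indent=2)   # archived and cited next to the processed data
```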

10.
Quantitative metabolite profiling in biological samples has the potential to reflect physiological status and to identify disease-associated disturbances in metabolic networks. However, this approach is hampered by a wide range of preanalytical variables. Hence, the aim of our study was to develop a standardized preanalytical protocol for metabolite profiling of amino acids and acylcarnitines in human blood. Amino acids and acylcarnitines were simultaneously analyzed after butylation of 3 μL dried blood or 10 μL whole blood, serum and anticoagulated plasma using electrospray tandem mass spectrometry. The influence of exogenous and endogenous preanalytical variables was investigated in healthy volunteers. Different sampling materials and anticoagulants for blood collection were investigated. Concentrations of long-chain acylcarnitines were 5-fold higher in EDTA whole blood or dried whole blood compared to serum and anticoagulated plasma. Significant differences in amino acid concentrations were found for capillary versus venous blood sampling. Fasting for 8 h before specimen collection minimized the nutritional influence. Physical activity significantly alters amino acid and short-chain acylcarnitine concentrations. As a result of our preanalytical investigation, we developed a pretreatment protocol based on EDTA whole blood dried on filter paper to reduce preanalytical variability and facilitate reproducible quantitative metabolite profiling in clinical trials.

11.
The scientific journal Nature Methods has retracted a publication that reported numerous unexpected mutations after a CRISPR-Cas9 experiment, based on whole-genome sequencing of one control and two experimental genome-edited mice. In the 10 months since publication, the data presented have been strongly contested and criticized by the scientific and biotech communities, through publications, open science channels and social networks. The criticism focused on the animal used as the control, which was derived from the same mouse strain as the experimental individuals but from an unrelated sub-colony; hence, the control and experimental mice were genetically divergent. The most plausible explanation for the vast majority of the reported unexpected mutations is the expected background of genetic polymorphisms that normally accumulates between two different colonies of the same mouse strain as a result of spontaneous mutations and genetic drift. Therefore, the reported mutations were most likely not related to CRISPR-Cas9 activity.

12.
High quality clinical biospecimens are vital for biomarker discovery, verification, and validation. Variations in blood processing and handling can affect protein abundances and assay reliability. Using an untargeted LC-MS approach, we systematically measured the impact of preanalytical variables on the plasma proteome. Time prior to processing was the only variable that affected the plasma protein levels. LC-MS quantification showed that preprocessing times <6 h had minimal effects on the immunodepleted plasma proteome, but by 4 days significant changes were apparent. Elevated levels of many proteins were observed, suggesting that in addition to proteolytic degradation during the preanalytical phase, changes in protein structure are also important considerations for protocols using antibody depletion. As to processing variables, a comparison of single- vs double-spun plasma showed minimal differences. After processing, the impact of ≤3 freeze–thaw cycles was negligible regardless of whether freshly collected samples were processed in short succession or the cycles occurred during 14–17 years of frozen storage (−80 °C). Thus, clinical workflows that necessitate modest delays in blood processing times or employ different centrifugation steps can yield valuable samples for biomarker discovery and verification studies.

13.
Governance, underlying general ICT (Information and Communication Technology) architecture, and workflow of the Central Research Infrastructure for molecular Pathology (CRIP) are discussed as a model enabling biobank networks to form operational “meta biobanks” whilst respecting the donors’ privacy, biobank autonomy and confidentiality, and the researchers’ needs for appropriate biospecimens and information, as well as confidentiality. Tailored to these needs, CRIP efficiently accelerates and facilitates research with human biospecimens and data.

14.
The standardized nomenclature of rodent strains, genes and mutations has long been the focus of careful attention. Its aim is to provide proper designation of laboratory animals used in research projects and to convey as much information on each strain as possible. Since the development of different techniques to mutate the genome of laboratory rodents on a large scale, the correct application of current nomenclature systems has become even more significant. It not only facilitates the accurate communication of scientific results but is also indispensable for controlling the dramatically increased number of transgenic animal models in experimental units, archives and databases. It is regrettable that many publications, especially on transgenic rodents, use vague and inappropriate strain designations. This situation should be improved, particularly considering the increasingly emphasized importance of genetic background on the phenotype of mutations. The aim of these guidelines is to raise awareness about specific features of production and of the current nomenclature system used for transgenic rodents.

15.
Serum protein profiling by MS is a promising method for early detection of disease. Important characteristics for serum protein profiling are preanalytical factors, analytical reproducibility and high throughput. Problems related to preanalytical factors can be overcome by using standardized and rigorous sample collection and sample handling protocols. The sensitivity of the MS analysis relies on the quality of the sample; consequently, the blood sample preparation step is crucial to obtain pure and concentrated samples and enrichment of the proteins and peptides of interest. This review focuses on the serum sample preparation step prior to protein profiling by MALDI MS analysis, with particular focus on various SPE methods. The application of SPE techniques with different chromatographic properties such as RP, ion exchange, or affinity binding to isolate specific subsets of molecules (subproteomes) is advantageous for increasing resolution and sensitivity in the subsequent MS analysis. In addition, several of the SPE sample preparation methods are simple and scalable and have proven easy to automate for higher reproducibility and throughput, which is important in a clinical proteomics setting.

16.
There is a wealth of knowledge in the field of in vitro diagnostics with regard to preanalytical variables and their impact on the determination of peptide and protein analytes in human serum and plasma. This information is applicable to clinical proteomics investigations, which utilize the same sample types. Studies have demonstrated that the majority of variations and errors in in vitro diagnostics seem to occur in the preanalytical phase prior to specimen analysis. Preanalytical processes include study design, compliance of the subjects investigated, compliance of the technical staff in adherence to protocols, choice of specimens utilized and sample collection and processing. These variables can have a dramatic impact on the determination of analytes and can affect result outcomes, reproducibility and the validity of investigations. By drawing analogies to in vitro diagnostics practices, specific variables that are likely to impact the results of proteomics studies can be identified. Recognition of such variables is the first step towards their understanding and, eventually, controlling their impact. In this article, we will review preanalytical variables, provide examples for their effects on the determination of distinct peptides and proteins and discuss potential implications for clinical proteomics investigations.

17.
Human biospecimen collection, processing and preservation are rapidly emerging subjects providing essential support to clinical as well as basic researchers. Unlike the collection of other biospecimens (e.g. DNA and serum), biobanking of viable immune cells, such as peripheral blood mononuclear cells (PBMC) and/or isolated immune cell subsets, is still in its infancy. While certain aspects of processing and freezing conditions have been studied in past years, little is known about the effect of blood transportation on immune cell survival, phenotype and specific functions. However, especially for multicentric and cooperative projects it is vital to know those effects precisely. In this study we investigated the effect of blood shipping and pre-processing delay on immune cell phenotype and function at both cellular and subcellular levels. Peripheral blood was collected from healthy volunteers (n = 9): at a distal location (shipped overnight) and in the central laboratory (processed immediately). PBMC were processed in the central laboratory and analyzed post-cryopreservation. We analyzed yield, major immune subset distribution, proliferative capacity of T cells, cytokine pattern and T-cell receptor signal transduction. Results show that overnight transportation of blood samples does not globally compromise T-cell subsets, as they largely retain their phenotype and proliferative capacity. However, NK and B cell frequencies, the production of certain PBMC-derived cytokines, and the IL-6-mediated cytokine signaling pathway are altered by transportation. Various control experiments were carried out to compare issues related to shipping versus pre-processing delay on site. Our results suggest implementing appropriate controls when using multicenter logistics for blood transportation aimed at subsequent isolation of viable immune cells, e.g. in multicenter clinical trials or studies analyzing immune cells/subsets. One important conclusion might be that, despite changes due to overnight shipment, highly standardized central processing (and analysis) could be superior to multicentric, de-central processing with more difficult standardization.

18.
19.
Building on experimental results showing that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, we present a new scientific theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Because the energy coding model reveals mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results in neuro-electrophysiology but also quantitatively explain recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we expect it to have important consequences for quantitative research on cognitive function.

20.
The magnitude and direction of reported physiological effects induced using transcranial magnetic stimulation (TMS) to modulate human motor cortical excitability have proven difficult to replicate routinely. We conducted an online survey on the prevalence and possible causes of these reproducibility issues. A total of 153 researchers were identified via their publications and invited to complete an anonymous internet-based survey that asked about their experience trying to reproduce published findings for various TMS protocols. The prevalence of questionable research practices known to contribute to low reproducibility was also determined. We received 47 completed surveys from researchers with an average of 16.4 published papers (95% CI 10.8–22.0) that used TMS to modulate motor cortical excitability. Respondents also had a mean of 4.0 (2.5–5.7) relevant completed studies that would never be published. Across a range of TMS protocols, 45–60% of respondents found similar results to those in the original publications; the other respondents were able to reproduce the original effects only sometimes or not at all. Only 20% of respondents used formal power calculations to determine study sample sizes. Others relied on previously published studies (25%), personal experience (24%) or flexible post-hoc criteria (41%). Approximately 44% of respondents knew researchers who engaged in questionable research practices (range 32–70%), yet only 18% admitted to engaging in them (range 6–38%). These practices included screening subjects to find those that respond in a desired way to a TMS protocol, selectively reporting results and rejecting data based on a gut feeling. In a sample of 56 published papers that were inspected, not a single questionable research practice was reported. Our survey revealed that approximately 50% of researchers are unable to reproduce published TMS effects. Researchers need to start increasing study sample size and eliminating—or at least reporting—questionable research practices in order to make the outcomes of TMS research reproducible.
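For context on the power-calculation finding, the following sketch shows a standard normal-approximation sample-size estimate for a two-group comparison; the effect size and error rates are example values chosen for illustration, not figures from the survey.

```python
# Illustrative power calculation (normal approximation) for a two-group design,
# of the kind only 20% of surveyed researchers reported using. The effect size
# and error rates are example values, not figures from the survey.
import math
from scipy.stats import norm

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate per-group sample size for a two-sided, two-sample t-test."""
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value for the chosen type I error rate
    z_beta = norm.ppf(power)            # quantile corresponding to the desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

print(n_per_group(effect_size=0.8))   # 25 with this approximation; exact t-test methods give ~26
```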
