Similar Documents
20 similar documents found (search time: 15 ms)
1.
Renewed efforts in tuberculosis (TB) research have led to important new insights into the biology and epidemiology of this devastating disease. Yet, in the face of the modern epidemics of HIV/AIDS, diabetes, and multidrug resistance—all of which contribute to susceptibility to TB—global control of the disease will remain a formidable challenge for years to come. New high-throughput genomics technologies are already contributing to studies of TB's epidemiology, comparative genomics, evolution, and host–pathogen interaction. We argue here, however, that new multidisciplinary approaches—especially the integration of epidemiology with systems biology in what we call "systems epidemiology"—will be required to eliminate TB.

2.
Fortney K, Jurisica I. Human Genetics 2011, 130(4):465-481
Over the past two decades, high-throughput (HTP) technologies such as microarrays and mass spectrometry have fundamentally changed clinical cancer research. They have revealed novel molecular markers of cancer subtypes, metastasis, and drug sensitivity and resistance. Some have been translated into the clinic as tools for early disease diagnosis, prognosis, and individualized treatment and response monitoring. Despite these successes, many challenges remain: HTP platforms are often noisy and suffer from false positives and false negatives; optimal analysis and successful validation require complex workflows; and great volumes of data are accumulating at a rapid pace. Here we discuss these challenges, and show how integrative computational biology can help diminish them by creating new software tools, analytical methods, and data standards.
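Controlling the false positives mentioned above is usually a matter of multiple-testing correction. Below is a minimal sketch of Benjamini–Hochberg FDR control; it is a generic illustration, not a method prescribed by the review, and the p-values are invented.

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Return the indices (into pvalues) rejected at FDR level alpha."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0
    # BH rule: find the largest rank whose p-value is under rank/m * alpha.
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * alpha:
            k = rank
    return set(order[:k])

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.9]
print(sorted(benjamini_hochberg(pvals)))  # [0, 1]: only the two smallest survive
```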

3.
Journal of Proteomics 2010, 73(2):252-266
In recent years, affinity-based technologies have become important tools for serum profiling to uncover protein expression patterns linked to disease state or therapeutic effects. In this study, we describe a path towards the production of an antibody microarray that allows protein profiling of biotinylated human serum samples with reproducible sensitivity in the picomolar range. With growing numbers of affinity reagents available, protein profiles must be validated efficiently, and we describe a cross-platform strategy based on data concordance with a suspension bead array that interrogates the identical set of antibodies with the same cohort of serum samples. Comparative analysis enabled screening for high-performing antibodies that displayed consistent results across the two platforms and targeted known serum components. Moreover, data processing methods such as sample referencing and normalization were evaluated for their effects on inter-platform agreement. Our work suggests that mutual validation of protein expression profiles using alternative microarray platforms holds great potential to become an important and valuable component of affinity-based high-throughput proteomic screening, as it allows the number of discovered targets to be narrowed down prior to orthogonal, uniplexed validation approaches.
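As an illustration of how inter-platform agreement might be quantified, the sketch below computes Lin's concordance correlation coefficient, which penalizes both random scatter and systematic bias between paired measurements. The choice of metric and the intensity values are assumptions for illustration, not the statistic used by the authors.

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient for paired measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Hypothetical log-intensities for one antibody across five sera on each platform:
planar = [1.2, 3.4, 2.2, 5.1, 4.0]   # planar antibody microarray (a.u.)
beads = [1.0, 3.1, 2.5, 4.8, 4.2]    # suspension bead array (a.u.)
print(f"CCC = {lins_ccc(planar, beads):.3f}")
```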

4.
Accurate and efficient genome-wide detection of copy number variants (CNVs) is essential for understanding human genomic variation, genome-wide CNV association studies, cytogenetics research and diagnostics, and independent validation of CNVs identified from sequencing-based technologies. Numerous array-based platforms for CNV detection exist, utilizing array Comparative Genome Hybridization (aCGH), Single Nucleotide Polymorphism (SNP) genotyping, or both. We have quantitatively assessed the abilities of twelve leading genome-wide CNV detection platforms to accurately detect Gold Standard sets of CNVs in the genome of HapMap CEU sample NA12878, and found significant differences in performance. The technologies analyzed were the NimbleGen 4.2M, 2.1M and 3×720K Whole Genome and CNV-focused arrays, the Agilent 1×1M CGH and High Resolution and 2×400K CNV and SNP+CGH arrays, the Illumina Human Omni1-Quad array and the Affymetrix SNP 6.0 array. The Gold Standards used were a 1000 Genomes Project sequencing-based set of 3997 validated CNVs and an ultra-high-resolution aCGH-based set of 756 validated CNVs. We found that sensitivity, total number, size range and breakpoint resolution of CNV calls were highest for CNV-focused arrays. Our results are important for cost-effective CNV detection and validation for both basic and clinical applications.
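To make the sensitivity comparison concrete, here is a minimal sketch of how sensitivity against a gold-standard CNV set could be scored. The 50% reciprocal-overlap criterion and the toy coordinates are assumptions, not the exact matching rule of the study.

```python
def reciprocal_overlap(a, b, threshold=0.5):
    """True if intervals a=(start, end) and b overlap by >= threshold
    of the length of BOTH intervals."""
    overlap = max(0, min(a[1], b[1]) - max(a[0], b[0]))
    return (overlap >= threshold * (a[1] - a[0]) and
            overlap >= threshold * (b[1] - b[0]))

def sensitivity(gold_cnvs, platform_calls):
    """Fraction of gold-standard CNVs recovered by at least one platform call."""
    hits = sum(any(reciprocal_overlap(g, c) for c in platform_calls)
               for g in gold_cnvs)
    return hits / len(gold_cnvs)

# Toy example (coordinates in bp on one chromosome):
gold = [(100_000, 150_000), (300_000, 320_000), (500_000, 560_000)]
calls = [(95_000, 148_000), (505_000, 555_000)]
print(f"sensitivity = {sensitivity(gold, calls):.2f}")  # 0.67
```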

5.
To apply exome-seq-derived variants in the clinical setting, there is an urgent need to identify the best variant caller(s) from a large collection of available options. We have used an Illumina exome-seq dataset as a benchmark, with two validation scenarios—family pedigree information and SNP array data for the same samples—permitting global high-throughput cross-validation to evaluate the quality of SNP calls derived from several popular variant discovery tools, from both the open-source and commercial communities, using a set of designated quality metrics. To the best of our knowledge, this is the first large-scale performance comparison of exome-seq variant discovery tools using high-throughput validation with both Mendelian inheritance checking and SNP array data. This approach allows us to gain insights into the accuracy of SNP calling in an unprecedented way, whereas previously reported comparison studies have only assessed the concordance of these tools without directly assessing the quality of the derived SNPs. More importantly, the main purpose of our study was to establish a reusable procedure that applies high-throughput validation to compare the quality of SNP discovery tools with a focus on exome-seq, and which can be used to compare any forthcoming tool(s) of interest.
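The Mendelian inheritance checking used for validation can be illustrated with a few lines of code. This is a minimal sketch for biallelic SNPs in a father-mother-child trio, with genotypes encoded as ALT-allele counts; it is not the paper's pipeline.

```python
def transmitted(genotype):
    """Alleles (0 = REF, 1 = ALT) a parent with this ALT count can pass on."""
    alleles = set()
    if genotype < 2:
        alleles.add(0)
    if genotype > 0:
        alleles.add(1)
    return alleles

def mendelian_consistent(father, mother, child):
    """True if the child's ALT count can arise from one allele per parent."""
    return any(p + m == child
               for p in transmitted(father)
               for m in transmitted(mother))

# A het x het trio can produce any child genotype; hom-ref x hom-ref cannot
# produce a het child, so such a site counts against the variant caller:
assert mendelian_consistent(1, 1, 2)
assert not mendelian_consistent(0, 0, 1)
```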

6.
Although the recent advances in stem cell engineering have gained a great deal of attention due to their high potential in clinical research, the applicability of stem cells for preclinical screening in the drug discovery process is still challenging due to difficulties in controlling the stem cell microenvironment and the limited availability of high-throughput systems. Recently, researchers have been actively developing and evaluating three-dimensional (3D) cell culture-based platforms using microfluidic technologies, such as organ-on-a-chip and organoid-on-a-chip platforms, and they have achieved promising breakthroughs in stem cell engineering. In this review, we start with a comprehensive discussion on the importance of microfluidic 3D cell culture techniques in stem cell research and their technical strategies in the field of drug discovery. In a subsequent section, we discuss microfluidic 3D cell culture techniques for high-throughput analysis for use in stem cell research. In addition, some potential and practical applications of organ-on-a-chip or organoid-on-a-chip platforms using stem cells as drug screening and disease models are highlighted.

7.
The importance of structural variants (SVs) for human phenotypes and diseases is now recognized. Although a variety of SV detection platforms and strategies that vary in sensitivity and specificity have been developed, few benchmarking procedures are available to confidently assess their performance in biological and clinical research. To facilitate the validation and application of these SV detection approaches, we established an Asian reference material by characterizing the genome of an Epstein-B...

8.

Background

In industry and academic research, there is an increasing demand for flexible automated microfermentation platforms with advanced sensing technology. However, up to now, conventional platforms cannot generate continuous data in high-throughput cultivations, in particular for monitoring biomass and fluorescent proteins. Furthermore, microfermentation platforms are needed that can easily combine cost-effective, disposable microbioreactors with downstream processing and analytical assays.

Results

To meet this demand, a novel automated microfermentation platform consisting of a BioLector and a liquid-handling robot (Robo-Lector) was successfully built and tested. The BioLector provides a cultivation system that is able to permanently monitor microbial growth and the fluorescence of reporter proteins under defined conditions in microtiter plates. Three exemplary methods were programmed on the Robo-Lector platform to study high-throughput cultivation processes in detail, especially recombinant protein expression. The host/vector system E. coli BL21(DE3) pRhotHi-2-EcFbFP, expressing the fluorescence protein EcFbFP, was investigated here. With the method 'induction profiling' it was possible to conduct 96 different induction experiments (varying inducer concentrations from 0 to 1.5 mM IPTG at 8 different induction times) simultaneously in an automated way. The method 'biomass-specific induction' made it possible to automatically induce cultures with different growth kinetics in a microtiter plate at the same biomass concentration, which resulted in a relative standard deviation of EcFbFP production of only ±7%. The third method, 'biomass-specific replication', enabled the generation of equal initial biomass concentrations in main cultures from precultures with different growth kinetics. This was realized by automatically transferring an appropriate inoculum volume from the different preculture microtiter wells to the respective wells of the main culture plate, where similar growth kinetics could subsequently be obtained.
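The dilution arithmetic behind 'biomass-specific replication' reduces to a simple mass balance. The sketch below is a hypothetical illustration; the function name, target OD and well volume are assumptions, not values from the study.

```python
def inoculum_volume_ul(od_preculture, od_target=0.1, well_volume_ul=1000):
    """Volume of preculture to transfer so the main culture starts at od_target.
    From the mass balance od_target * V_well = od_preculture * V_inoc
    (ignoring the small volume change on transfer)."""
    return od_target * well_volume_ul / od_preculture

# Precultures with different growth kinetics reach different ODs ...
for od in (2.0, 3.5, 5.0):
    print(f"preculture OD {od}: transfer {inoculum_volume_ul(od):.0f} uL")
# ... yet every main-culture well is inoculated to the same initial biomass.
```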

Conclusion

The Robo-Lector generates extensive kinetic data in high-throughput cultivations, particularly for biomass and fluorescence protein formation. Based on the non-invasive on-line-monitoring signals, actions of the liquid-handling robot can easily be triggered. This interaction between the robot and the BioLector (Robo-Lector) combines high-content data generation with systematic high-throughput experimentation in an automated fashion, offering new possibilities to study biological production systems. The presented platform uses a standard liquid-handling workstation with widespread automation possibilities. Thus, high-throughput cultivations can now be combined with small-scale downstream processing techniques and analytical assays. Ultimately, this novel versatile platform can accelerate and intensify research and development in the field of systems biology as well as modelling and bioprocess optimization.

9.
The widespread use of DNA microarrays has led to the discovery of many genes whose expression profile may have significant clinical relevance. The translation of these data to the bedside requires that gene expression be validated as protein expression, and that annotated clinical samples be available for correlative and quantitative studies to assess the clinical context and usefulness of putative biomarkers. We review two microarray platforms developed to facilitate the clinical validation of candidate biomarkers: tissue microarrays and reverse-phase protein microarrays. Tissue microarrays are arrays of core biopsies obtained from paraffin-embedded tissues, which can be assayed for histologically specific protein expression by immunohistochemistry. Reverse-phase protein microarrays consist of arrays of cell lysates or, more recently, plasma or serum samples, which can be assayed for protein quantity and for the presence of post-translational modifications such as phosphorylation. Although these platforms are limited by the availability of validated antibodies, both enable the preservation of precious clinical samples as well as the experimental standardization in a high-throughput manner that is characteristic of microarray technologies. While tissue microarrays are rapidly becoming a mainstay of translational research, reverse-phase protein microarrays require further technical refinement and validation prior to their widespread adoption by research laboratories.

10.
This paper presents a map of Africa's rainforests for 2005. Derived from moderate resolution imaging spectroradiometer data at a spatial resolution of 250 m and with an overall accuracy of 84%, this map provides new levels of spatial and thematic detail. The map is accompanied by measurements of deforestation between 1990, 2000 and 2010 for West Africa, Central Africa and Madagascar derived from a systematic sample of Landsat images—imagery from equivalent platforms is used to fill gaps in the Landsat record. Net deforestation is estimated at 0.28% yr⁻¹ for the period 1990–2000 and 0.14% yr⁻¹ for the period 2000–2010. West Africa and Madagascar exhibit a much higher deforestation rate than the Congo Basin, for example, three times higher for West Africa and nine times higher for Madagascar. Analysis of variance over the Congo Basin is then used to show that expanding agriculture and increasing fuelwood demands are key drivers of deforestation in the region, whereas well-controlled timber exploitation programmes have little or no direct influence on forest-cover reduction at present. Rural and urban population concentrations and fluxes are also identified as strong underlying causes of deforestation in this study.
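For readers who want to reproduce rate figures of this kind, the sketch below annualizes a forest-cover change using Puyravaud's standard formula, r = ln(A2/A1)/(t2 − t1). The abstract does not state which estimator the authors used, and the example areas are invented to match the reported 0.28% yr⁻¹ figure.

```python
import math

def annual_change_rate(area_t1, area_t2, years):
    """Annualized rate of forest-cover change, r = ln(A2/A1) / (t2 - t1).
    Negative values indicate net deforestation."""
    return math.log(area_t2 / area_t1) / years

# e.g. a region losing 2.8% of its forest cover over a decade:
r = annual_change_rate(100.0, 97.2, 10)
print(f"{r * 100:.2f}% per year")  # about -0.28% yr^-1, as for 1990-2000
```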

11.
The fractional concentration of exhaled nitric oxide (FeNO) is a biomarker of airway inflammation that is being increasingly considered in clinical, occupational, and epidemiological applications ranging from asthma management to the detection of air pollution health effects. FeNO depends strongly on exhalation flow rate. This dependency has allowed for the development of mathematical models whose parameters quantify airway and alveolar compartment contributions to FeNO. Numerous methods have been proposed to estimate these parameters using FeNO measured at multiple flow rates. These methods—which allow for non-invasive assessment of localized airway inflammation—have the potential to provide important insights on inflammatory mechanisms. However, different estimation methods produce different results and a serious barrier to progress in this field is the lack of a single recommended method. With the goal of resolving this methodological problem, we have developed a unifying framework in which to present a comprehensive set of existing and novel statistical methods for estimating parameters in the simple two-compartment model. We compared statistical properties of the estimators in simulation studies and investigated model fit and parameter estimate sensitivity across methods using data from 1507 schoolchildren from the Southern California Children's Health Study, one of the largest multiple flow FeNO studies to date. We recommend a novel nonlinear least squares model with natural log transformation on both sides that produced estimators with good properties, satisfied model assumptions, and fit the Children's Health Study data well.
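The flow dependency described above can be written down explicitly. Assuming the standard steady-state two-compartment model, FeNO(flow) = Caw + (CA − Caw)·exp(−Daw/flow), the sketch below fits it by nonlinear least squares with a natural-log transform on both sides, in the spirit of the recommended method; the flow and FeNO values are fabricated for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_compartment(flow, ca, caw, daw):
    """Steady-state two-compartment model: FeNO at exhalation flow (mL/s)."""
    return caw + (ca - caw) * np.exp(-daw / flow)

def log_model(flow, ca, caw, daw):
    # Log transform on both sides, as in the recommended estimator.
    return np.log(two_compartment(flow, ca, caw, daw))

flows = np.array([30.0, 50.0, 100.0, 300.0])   # exhalation flows, mL/s
feno = np.array([45.0, 30.0, 18.0, 8.0])       # measured FeNO, ppb (made up)

params, _ = curve_fit(log_model, flows, np.log(feno), p0=[2.0, 60.0, 20.0],
                      bounds=([1e-3, 1e-3, 1e-3], [50.0, 300.0, 200.0]))
ca, caw, daw = params
print(f"alveolar CA = {ca:.1f} ppb, airway Caw = {caw:.1f} ppb, Daw = {daw:.1f} mL/s")
```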

12.
13.
To improve treatment and reduce mortality from cancer, a key task is to detect the disease as early as possible. To achieve this, many new technologies have been developed for biomarker discovery and validation. This review provides an overview of omics technologies in biomarker discovery and cancer detection, and highlights recent applications and future trends in cancer diagnostics. Although the present omics methods are not ready for immediate clinical use as diagnostic tools, it can be envisaged that simple, fast, robust, portable and cost-effective clinical diagnosis systems could be available in the near future for home and bedside use.

14.
Genetic heterogeneity in a mixed sample of tumor and normal DNA can confound characterization of the tumor genome. Numerous computational methods have been proposed to detect aberrations in DNA samples from tumor and normal tissue mixtures. Most of these require tumor purities to be at least 10–15%. Here, we present a statistical model to capture information, contained in the individual's germline haplotypes, about expected patterns in the B allele frequencies from SNP microarrays while fully modeling their magnitude, the first such model for SNP microarray data. Our model consists of a pair of hidden Markov models—one for the germline and one for the tumor genome—which, conditional on the observed array data and patterns of population haplotype variation, have a dependence structure induced by the relative imbalance of an individual's inherited haplotypes. Together, these hidden Markov models offer a powerful approach for dealing with mixtures of DNA where the main component represents the germline, thus suggesting natural applications for the characterization of primary clones when stromal contamination is extremely high, and for identifying lesions in rare subclones of a tumor when tumor purity is sufficient to characterize the primary lesions. Our joint model for germline haplotypes and acquired DNA aberration is flexible, allowing a large number of chromosomal alterations, including balanced and imbalanced losses and gains, copy-neutral loss-of-heterozygosity (LOH) and tetraploidy. We found our model (which we term J-LOH) to be superior for localizing rare aberrations in a simulated 3% mixture sample. More generally, our model provides a framework for full integration of the germline and tumor genomes to deal more effectively with missing or uncertain features, and thus extract maximal information from difficult scenarios where existing methods fail.
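A back-of-the-envelope calculation shows why a 3% mixture is so challenging. The sketch below, which is an illustration and not the authors' J-LOH model, computes the expected B allele frequency at a germline-heterozygous SNP under a tumor/normal mixture.

```python
def expected_baf(purity, tumor_b_copies, tumor_total_copies):
    """Expected B allele frequency (BAF) at a germline-het site: the normal
    fraction contributes 1 B allele of 2 total; the tumor fraction
    contributes its own copy-number configuration."""
    b = purity * tumor_b_copies + (1 - purity) * 1
    total = purity * tumor_total_copies + (1 - purity) * 2
    return b / total

# Copy-neutral LOH (tumor carries 0 B copies of 2 total):
print(expected_baf(0.03, 0, 2))  # 0.485 -- a 3% mixture shifts BAF by only 0.015
print(expected_baf(0.40, 0, 2))  # 0.300 -- easily visible at 40% purity
```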

15.

Fungal disease is an increasingly recognised global clinical challenge associated with high mortality. Early diagnosis of fungal infection remains problematic due to the poor sensitivity and specificity of current diagnostic modalities. Advances in sequencing technologies hold promise for addressing these shortcomings and improving fungal detection and identification. Translating such emerging approaches into mainstream clinical care will require refinement of current sequencing and analytical platforms, ensuring standardisation and consistency through robust clinical benchmarking and validation across a range of patient populations. In this state-of-the-art review, we discuss current diagnostic and therapeutic challenges associated with fungal disease and provide key examples where the application of sequencing technologies has potential diagnostic application in assessing the human ‘mycobiome’. We assess how ready access to fungal sequencing may be exploited in broadening our insight into host–fungal interaction, providing scope for clinical diagnostics and the translation of emerging mycobiome research into clinical practice.


16.
By virtue of advances in next-generation sequencing technologies, we have access to new genome sequences almost daily. The tempo of these advances is accelerating, promising greater depth and breadth. In light of these extraordinary advances, the need for fast, parallel methods to define gene function becomes ever more important. Collections of genome-wide deletion mutants in yeasts and E. coli have served as workhorses for the characterization of gene function, but this approach is not scalable: current gene-deletion approaches require each of the thousands of genes that comprise a genome to be deleted and verified. Only after this work is complete can we pursue high-throughput phenotyping. Over the past decade, our laboratory has refined a portfolio of competitive, miniaturized, high-throughput genome-wide assays that can be performed in parallel. This parallelization is possible because of the inclusion of DNA 'tags', or 'barcodes', in each mutant, with the barcode serving as a proxy for the mutation; barcode abundance can then be measured to assess mutant fitness. In this study, we seek to fill the gap between DNA sequence and barcoded mutant collections. To accomplish this we introduce a combined transposon disruption-barcoding approach that opens up parallel barcode assays to newly sequenced, but poorly characterized, microbes. To illustrate this approach we present a new Candida albicans barcoded disruption collection and describe how both microarray-based and next-generation sequencing-based platforms can be used to collect 10,000–1,000,000 gene-gene and drug-gene interactions in a single experiment.
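The barcode-as-proxy idea can be made concrete: mutant fitness is scored from the change in barcode frequency between time points. The sketch below uses a log2 fold change with pseudocounts; the normalization choices and counts are illustrative assumptions, not the laboratory's exact pipeline.

```python
import math

def barcode_fitness(counts_t0, counts_t1, pseudocount=1.0):
    """Per-mutant log2 fold change of barcode frequency from t0 to t1;
    frequencies are normalized by total reads per time point."""
    n0 = sum(counts_t0.values())
    n1 = sum(counts_t1.values())
    return {
        bc: math.log2(((counts_t1.get(bc, 0) + pseudocount) / n1) /
                      ((counts_t0.get(bc, 0) + pseudocount) / n0))
        for bc in counts_t0
    }

t0 = {"mutA": 500, "mutB": 500, "mutC": 500}
t1 = {"mutA": 900, "mutB": 450, "mutC": 60}   # mutC drops out under drug
print(barcode_fitness(t0, t1))  # strongly negative mutC => drug-gene interaction
```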

17.
MicroRNA profiling represents an important first step in deducing individual RNA-based regulatory function in a cell, tissue, or at a specific developmental stage. Currently there are several different platforms to choose from when generating initial miRNA profiles. In this study we investigate recently developed digital microRNA high-throughput technologies. Four different platforms were compared, including next-generation SOLiD ligation sequencing and Illumina HiSeq sequencing, hybridization-based NanoString nCounter, and miRCURY locked nucleic acid RT-qPCR. For all four technologies, full microRNA profiles were generated from human cell lines that represent noninvasive and invasive tumorigenic breast cancer. This study reports the correlation between platforms, as well as a more extensive analysis of the accuracy and sensitivity of the data generated by each platform, and important considerations when verifying results through the use of additional technologies. We found all the platforms to be highly capable of microRNA analysis. Furthermore, the two NGS platforms and RT-qPCR all have equally high sensitivity, and fold-change accuracy is independent of individual miRNA concentration for NGS and RT-qPCR. Based on these findings we propose new guidelines and considerations for performing microRNA profiling.
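Cross-platform agreement of the kind reported here is typically summarized with correlation on log-transformed profiles. The sketch below is a generic illustration with invented expression values, not data from the study.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Hypothetical expression values for five miRNAs on two platforms,
# log2-transformed with a +1 offset to tame the dynamic range:
platform_a = np.log2(np.array([120.0, 35.0, 980.0, 15.0, 240.0]) + 1)
platform_b = np.log2(np.array([150.0, 28.0, 870.0, 22.0, 300.0]) + 1)

r, _ = pearsonr(platform_a, platform_b)     # linear agreement on log scale
rho, _ = spearmanr(platform_a, platform_b)  # rank agreement (robust to scale)
print(f"Pearson r = {r:.3f}, Spearman rho = {rho:.3f}")
```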

18.
TARGETS 2003, 2(4):147-153
The most effective targeted cancer therapies have arisen from research into genetically altered oncogenes, including BCR-ABL, HER2, RAS and EGFR. Recent advances in cancer genetics have identified many regions of the genome that undergo amplification (increase in copy number) but, in most cases, the key oncogenic targets driving the growth and survival of cancer cells remain unknown. In this review, we discuss high-throughput technologies for the discovery of putative oncogenes, and clinical and functional validation of these genes as targets for therapy. New technologies in translational genomics facilitate the identification, validation and prioritization of candidate molecular targets for anti-cancer therapy.

19.
Water     
Water remains a scarce and valuable resource. Improving technologies for water purification, use and recycling should be a high priority for all branches of science.

One of our most crucial and finite resources is freshwater. How often do biologists spare a thought for this substance, other than to think about its purity for the sake of an experiment? How often do we consider that 30 litres of cooling water are used to make one litre of double-distilled water? Americans use approximately 100 gallons per person per day, whereas millions of the world's poor subsist on less than 5 gallons per day. Within the next 15 years, it is estimated that more than 1.8 billion people will be living in regions with severe water scarcity, partly as a result of climate change. By 2030 it is estimated that the annual global demand for water will increase from 4,500 billion m³ to 6,900 billion m³—approximately 40% more than the amount of freshwater available (Water Resources Group, 2009). We are not only facing an increasing scarcity of water, but we also misuse the water that is available. Approximately 2.5 billion people use rivers to dispose of waste—not to mention what industry dumps into them—while freshwater dams generate problems of their own, including population displacement, the spread of new and more diseases to people living in the vicinity of the river, as well as effects on ecology and farming downstream.

Many factors influence the supply of and demand for water, and a one-size-fits-all solution for all regions is therefore not possible. There are essentially two strategies to ensure a sound supply of freshwater: we either use less water, or we make more of the water that we do use. The first is a typical accounting approach and is limited in scope, whereas the second calls for better science and engineering approaches.

Although the surface of the Earth is mostly covered with water, more than 95% of it is salty or inaccessible. One clear solution to increase the freshwater supply is desalination, which can be done by distillation or osmosis, through the use of carbon nanotubes, or by using another promising new technology: biomimetics. Water can be filtered through aquaporins—proteins that transport water molecules in and out of cells. Such biotechnologies could reach the market as early as 2013, although other exciting technologies are already available. Simple chemistry can be used, for example, in the 'PUR' water purifier, which uses gravity to precipitate water-borne contaminants and pathogens, or the water purifier akin to a trash bag, which cleanses water through a nanofibre filter containing microbicides and carbon to remove pollutants and pathogens. Such simple and cheap technology is ideal for the billions of the world's poor who do not have access to clean drinking water.

Of the available freshwater, agriculture uses the largest share—up to 70% in many regions—and technological and biotechnological solutions can also contribute to preserving water in this context. New farming processes that can retain water in the soil, recycle it or reduce its use include no-till farming, crop intensification, improved fertilizer usage, crop development, waste water re-use and pre- and post-harvest food processing, among many others. The different degrees of water quality can also be exploited; 'grey water'—which is unsafe for human consumption—could still be used in agriculture.

In addition to improving management practices, there is no question that we need considerably more innovation in water technology to close the supply–demand gap. These developments should include better processes for purification and desalination, more efficient industrial use and re-use, and improved agricultural usage. The problem is that the water sector is poorly funded in all respects, including research. New technologies could help to re-use water and reclaim resources from wastewater while generating biogas from the waste. There is also enormous potential for the use of water beyond its consumption in households, agriculture and industry. 'Blue energy', for instance, generates power from reverse electrodialysis by mixing saltwater and freshwater across an ion-exchange membrane stack. This could potentially generate energy wherever rivers flow into the sea.

With so many innovations already under way with so little funding, what other technologies can we come up with to reduce water usage and deal with medical, industrial and individual waste? The issue of waste is a serious and pressing problem: we find pharmaceutical chemicals in fish, which are in turn consumed by humans and other species in the food chain. We need to find ways to effectively transform waste into biodegradable products that can be used as fertilizers, as well as to recover valuable molecules such as rare metals. The downstream consequences of such technologies will be the regeneration of coastal estuaries, lower levels of contaminants in marine life and cleaner rivers. Ultimately, we need much more research into reducing water use, purification, bioremediation and recycling. I submit that this should be a priority research area for all the natural sciences and engineering.

Companies are held accountable these days for socially responsible projects, sustainability and their carbon footprint—this includes water usage. Why should research institutions not be held responsible too? After all, we claim to be at the cutting edge of science and should set the trend. Research grants should have a 'green component' and a score should be given to applications according to water usage and 'green work'.

20.
Médecine Nucléaire 2020, 44(3):158-163
The metabolome, which represents the complete set of molecules (metabolites) in a biological sample (cell, tissue, organ, organism), is the final downstream product of the metabolic cell processes that involve the genome and exogenous sources. The metabolome is characterized by a large number of small molecules with a huge diversity of chemical structures and abundances. Exploring the metabolome requires complementary analytical platforms to achieve extensive coverage. The metabolome is continually evolving, reflecting the continuous flux of metabolic and signaling pathways. Metabolomic research aims to study biochemical processes by detecting and quantifying metabolites to obtain a metabolic picture able to give a functional readout of the physiological state. Recent advances in mass spectrometry (one of the most widely used technologies for metabolomics studies) have made it possible to determine the spatial distribution of metabolites in tissues. In a two-part article, we describe the usual metabolomics technologies, workflows and strategies leading to the implementation of new clinical biomarkers. In this second part, we first describe the steps of a metabolomic analysis from sample collection to biomarker validation. Then, with two examples, autism spectrum disorders and Alzheimer's disease, we illustrate the contributions of metabolomics to clinical practice. Finally, we discuss the complementarity of in vivo (positron emission tomography) and in vitro (metabolomics) molecular explorations for biomarker research.

