Similar Literature
20 similar documents retrieved.
1.
Technical and experimental advances in microaspiration techniques, RNA amplification, quantitative real-time polymerase chain reaction (qPCR), and cDNA microarray analysis have led to an increase in the number of studies of single-cell gene expression. In particular, the central nervous system (CNS) is an ideal structure in which to apply single-cell gene expression paradigms. Unlike an organ composed of one principal cell type, the brain contains a constellation of neuronal and non-neuronal cell populations. A goal is to sample gene expression from similar cell types within a defined region without potential contamination by expression profiles of adjacent neuronal subpopulations and non-neuronal cells. The unprecedented resolution afforded by single-cell RNA analysis in combination with cDNA microarrays and qPCR-based analyses allows for relative gene expression level comparisons across cell types under different experimental conditions and disease states. The ability to analyze single cells is an important distinction from global and regional assessments of mRNA expression and can be applied to optimally prepared tissues from animal models as well as postmortem human brain tissues. This focused review illustrates the potential power of single-cell gene expression studies within the CNS in relation to neurodegenerative and neuropsychiatric disorders such as Alzheimer's disease (AD) and schizophrenia, respectively.
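As a minimal illustration of the relative quantification underlying such qPCR-based comparisons, the sketch below computes a fold change with the delta-delta-Ct method; the Ct values, genes and conditions are hypothetical, and the calculation is a generic example rather than the workflow of the studies reviewed here.

```python
# Minimal sketch (hypothetical data): relative expression via the delta-delta-Ct method,
# as commonly used for qPCR comparisons between cell populations or conditions.

def fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    """Return 2^(-ddCt) for one target gene, normalised to a reference gene."""
    d_ct_sample = ct_target_sample - ct_ref_sample    # normalise to the reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control                # compare condition vs control
    return 2 ** (-dd_ct)

# Hypothetical Ct values from single cells of one neuronal population
print(fold_change(ct_target_sample=27.1, ct_ref_sample=18.3,
                  ct_target_control=25.0, ct_ref_control=18.1))  # ~0.27-fold, i.e. down-regulated
```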

2.
The importance of maintaining DNA methylation patterns and faithful transmission of these patterns during cell division to ensure appropriate gene expression has been known for many decades now. It has largely been assumed that the symmetrical nature of CpG motifs, the most common site for DNA methylation in mammals, together with the presence of maintenance methylases able to methylate newly synthesised DNA, ensures that there is concordance of methylation on both strands. However, although this assumption is compelling in theory, little experimental evidence exists that either supports or refutes this assumption. Here, we have undertaken a genome-wide single-nucleotide resolution analysis to determine the frequency with which hemimethylated CpG sites exist in sheep muscle tissue. Analysis of multiple independent samples provides strong evidence that stably maintained hemimethylation is a very rare occurrence, at least in this tissue. Given the rarity of stably maintained hemimethylation, next-generation sequencing data from both DNA strands may be carefully combined to increase the accuracy with which DNA methylation can be measured at single-nucleotide resolution.
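Because stably maintained hemimethylation is rare, the read counts supporting a CpG call on the two strands can usually be pooled. The sketch below shows one simple way such pooling might look for a single site; the counts are hypothetical and this is not the pipeline used in the study.

```python
# Sketch (hypothetical counts): pooling bisulfite-sequencing evidence from both strands
# of a CpG so that each site is estimated from more reads.

def methylation_level(meth_plus, unmeth_plus, meth_minus, unmeth_minus):
    """Fraction methylated after combining read counts from the + and - strands."""
    methylated = meth_plus + meth_minus
    total = methylated + unmeth_plus + unmeth_minus
    return methylated / total if total else float("nan")

# One CpG site: 7/9 methylated reads on the plus strand, 6/8 on the minus strand
print(methylation_level(meth_plus=7, unmeth_plus=2, meth_minus=6, unmeth_minus=2))  # 13/17 ~ 0.76
```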

3.
Understanding how flowers develop from undifferentiated stem cells has occupied developmental biologists for decades. Key to unraveling this process is a detailed knowledge of the global regulatory hierarchies that control developmental transitions, cell differentiation and organ growth. These hierarchies may be deduced from gene perturbation experiments, which determine the effects on gene expression after specific disruption of a regulatory gene. Here, we tested experimental strategies for gene perturbation experiments during Arabidopsis thaliana flower development. We used artificial miRNAs (amiRNAs) to disrupt the functions of key floral regulators, and expressed them under the control of various inducible promoter systems that are widely used in the plant research community. To be able to perform genome-wide experiments with stage-specific resolution using the various inducible promoter systems for gene perturbation experiments, we also generated a series of floral induction systems that allow collection of hundreds of synchronized floral buds from a single plant. Based on our results, we propose strategies for performing dynamic gene perturbation experiments in flowers, and outline how they may be combined with versions of the floral induction system to dissect the gene regulatory network underlying flower development.

4.
Advances in single particle electron cryomicroscopy have made it possible to routinely elucidate the structure of biological specimens at subnanometer resolution. At this resolution, secondary structure elements are discernible by their signature. However, identification and interpretation of high resolution structural features are hindered by the contrast loss caused by experimental and computational factors. This contrast loss is traditionally modeled by a Gaussian decay of structure factors with a temperature factor, or B-factor. Standard restoration procedures usually sharpen the experimental maps either by applying a Gaussian function with an inverse ad hoc B-factor, or according to the amplitude decay of a reference structure. EM-BFACTOR is a program designed to facilitate the use of the method for objective B-factor determination and contrast restoration introduced by Rosenthal and Henderson [Rosenthal, P.B., Henderson, R., 2003. Optimal determination of particle orientation, absolute hand, and contrast loss in single-particle electron cryomicroscopy. J. Mol. Biol. 333, 721-745]. The program has been developed to interact with the most common packages for single particle electron cryomicroscopy. Further investigation of this sharpening method via EM-BFACTOR shows that it helps to reveal the high resolution molecular features concealed in experimental density maps, thereby making them better suited for interpretation. The method may therefore facilitate the analysis of experimental data in high resolution single particle electron cryomicroscopy.
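The Gaussian sharpening described above amounts to multiplying the map's Fourier amplitudes by exp(B*s^2/4), where s = 1/d is the spatial frequency, i.e. the inverse of the exp(-B*s^2/4) decay. Below is a minimal NumPy sketch of that amplitude scaling with a hypothetical B value; unlike EM-BFACTOR, it applies no figure-of-merit weighting or resolution cutoff.

```python
import numpy as np

def sharpen_map(vol, voxel_size, b_factor):
    """Scale the map's Fourier amplitudes by exp(+B * s^2 / 4), with s in 1/Angstrom.
    Simplified: no figure-of-merit weighting and no low-pass cutoff."""
    n = vol.shape[0]                          # assumes a cubic map, n x n x n voxels
    freqs = np.fft.fftfreq(n, d=voxel_size)   # spatial frequencies in 1/Angstrom
    sx, sy, sz = np.meshgrid(freqs, freqs, freqs, indexing="ij")
    s2 = sx**2 + sy**2 + sz**2
    ft = np.fft.fftn(vol)
    return np.real(np.fft.ifftn(ft * np.exp(b_factor * s2 / 4.0)))

# Hypothetical example: sharpen a random 64^3 "map" (1.2 A voxels) with B = 100 A^2
sharpened = sharpen_map(np.random.rand(64, 64, 64), voxel_size=1.2, b_factor=100.0)
```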

5.
Several factors, including spatial and temporal coherence of the electron microscope, specimen movement, recording medium, and scanner optics, contribute to the decay of the measured Fourier amplitude in electron image intensities. We approximate the combination of these factors as a single Gaussian envelope function, the width of which is described by a single experimental B-factor. We present an improved method for estimating this B-factor from individual micrographs by combining the use of X-ray solution scattering and numerical fitting to the average power spectrum of particle images. A statistical estimation from over 200 micrographs of herpes simplex virus type-1 capsids was used to estimate the spread in the experimental B-factor of the data set. The B-factor is experimentally shown to be dependent on the objective lens defocus setting of the microscope. The average B-factor, the X-ray scattering intensity of the specimen, and the number of particles required to determine the structure at a lower resolution can be used to estimate the minimum fold increase in the number of particles that would be required to extend a single particle reconstruction to a specified higher resolution. We conclude that microscope and imaging improvements to reduce the experimental B-factor will be critical for obtaining an atomic resolution structure.
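In the Rosenthal-Henderson framework this abstract builds on, the number of particles needed scales roughly as exp(B/(2d^2)), so the fold increase required to move from resolution d1 to d2 can be estimated as exp(B/2 * (1/d2^2 - 1/d1^2)). The sketch below performs that back-of-envelope calculation with hypothetical numbers; the full estimate in the paper also uses the measured X-ray scattering intensity.

```python
import math

def particle_fold_increase(b_factor, d_current, d_target):
    """Approximate fold increase in particle number needed to extend a reconstruction
    from d_current to d_target (Angstroms), given an experimental B-factor (A^2) and
    assuming the required particle number scales as exp(B / (2 d^2))."""
    return math.exp(0.5 * b_factor * (1.0 / d_target**2 - 1.0 / d_current**2))

# Hypothetical: B = 300 A^2, extending an 8 A reconstruction to 6 A
print(particle_fold_increase(b_factor=300.0, d_current=8.0, d_target=6.0))  # ~6x more particles
```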

6.
Understanding and characterising biochemical processes inside single cells requires experimental platforms that allow one to perturb and observe the dynamics of such processes as well as computational methods to build and parameterise models from the collected data. Recent progress with experimental platforms and optogenetics has made it possible to expose each cell in an experiment to an individualised input and automatically record cellular responses over days with fine time resolution. However, methods to infer parameters of stochastic kinetic models from single-cell longitudinal data have generally been developed under the assumption that experimental data is sparse and that responses of cells to at most a few different input perturbations can be observed. Here, we investigate and compare different approaches for calculating parameter likelihoods of single-cell longitudinal data based on approximations of the chemical master equation (CME), with a particular focus on coupling the linear noise approximation (LNA) or moment closure methods to a Kalman filter. We show that, as long as cells are measured sufficiently frequently, coupling the LNA to a Kalman filter allows one to accurately approximate likelihoods and to infer model parameters from data even in cases where the LNA provides poor approximations of the CME. Furthermore, the computational cost of filtering-based iterative likelihood evaluation scales advantageously in the number of measurement times and different input perturbations and is thus ideally suited for data obtained from modern experimental platforms. To demonstrate the practical usefulness of these results, we perform an experiment in which single cells, equipped with an optogenetic gene expression system, are exposed to various different light-input sequences and measured at several hundred time points, and we use parameter inference based on iterative likelihood evaluation to parameterise a stochastic model of the system.
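At its core, the filtering approach evaluates the likelihood recursively: the LNA (or a moment closure) supplies Gaussian predictions of the next state, and a Kalman update folds in each measurement. The sketch below shows this recursion for a scalar linear-Gaussian state-space model; the dynamics, noise parameters and data are hypothetical stand-ins for the LNA-derived moments used in the paper.

```python
import math

def kalman_log_likelihood(y, a, q, c, r, m0, p0):
    """Log-likelihood of observations y under x_t = a*x_{t-1} + N(0, q), y_t = c*x_t + N(0, r).
    In an LNA-based scheme, a and q would come from the linear noise approximation."""
    m, p, loglik = m0, p0, 0.0
    for obs in y:
        m, p = a * m, a * a * p + q                       # predict
        s = c * c * p + r                                 # innovation variance
        loglik += -0.5 * (math.log(2 * math.pi * s) + (obs - c * m) ** 2 / s)
        k = p * c / s                                     # Kalman gain
        m, p = m + k * (obs - c * m), (1 - k * c) * p     # update
    return loglik

# Hypothetical single-cell trace (e.g. fluorescence at successive measurement times)
print(kalman_log_likelihood([1.1, 1.4, 1.2, 1.6], a=0.9, q=0.05, c=1.0, r=0.1, m0=1.0, p0=0.2))
```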

7.
J Wang, HC Fan, B Behr, SR Quake. Cell, 2012, 150(2): 402-412
Meiotic recombination and de novo mutation are the two main contributions toward gamete genome diversity, and many questions remain about how an individual human's genome is edited by these two processes. Here, we describe a high-throughput method for single-cell whole-genome analysis that was used to measure the genomic diversity in one individual's gamete genomes. A microfluidic system was used for highly parallel sample processing and to minimize nonspecific amplification. High-density genotyping results from 91 single cells were used to create a personal recombination map, which was consistent with population-wide data at low resolution but revealed significant differences from pedigree data at higher resolution. We used the data to test for meiotic drive and found evidence for gene conversion. High-throughput sequencing on 31 single cells was used to measure the frequency of large-scale genome instability, and deeper sequencing of eight single cells revealed de novo mutation rates with distinct characteristics.
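To make the idea of a single-gamete recombination map concrete, the sketch below scans ordered heterozygous SNPs in one gamete for switches between the two parental haplotypes and reports the inferred crossover intervals. The haplotype calls and positions are hypothetical, and a real pipeline such as the one in this study must additionally model genotyping errors and missing data.

```python
# Sketch (hypothetical data): locate crossovers in one gamete as switch points between
# parental haplotypes along ordered heterozygous SNPs; errors and missing calls are ignored.

def crossover_positions(haplotype_calls, positions):
    """haplotype_calls: ordered list of 'A'/'B' parental-haplotype assignments per SNP.
    Returns the midpoints of intervals where the haplotype switches."""
    breaks = []
    for i in range(1, len(haplotype_calls)):
        if haplotype_calls[i] != haplotype_calls[i - 1]:
            breaks.append((positions[i - 1] + positions[i]) // 2)
    return breaks

calls = ["A", "A", "A", "B", "B", "B", "B", "A", "A"]
pos = [1_000, 5_000, 9_000, 15_000, 22_000, 30_000, 41_000, 47_000, 55_000]
print(crossover_positions(calls, pos))  # two inferred crossovers: [12000, 44000]
```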

8.
Twenty years ago, the release of the first draft of the human genome sequence instigated a paradigm shift in genomics and molecular biology. Arguably, structural biology is entering an analogous era, with availability of an experimentally determined or predicted molecular model for almost every protein-coding gene from many genomes, producing a reference "structureome". Structural predictions require experimental validation, and not all proteins conform to a single structure, making any reference structureome necessarily incomplete. Despite these limitations, a reference structureome can be used to characterize cell state in more detail than by quantifying sequence or expression levels alone. Cryogenic electron microscopy (cryo-EM) is a method that can generate atomic resolution views of molecules and cells frozen in place. In this perspective I consider how emerging cryo-EM methods are contributing to the new field of structureomics.

9.
Three PCR-based methods are described that allow covalent drug-DNA adducts, and their repair, to be studied at various levels of resolution, from gene regions down to the individual nucleotide level in single copy genes. A quantitative PCR (QPCR) method measures the total damage on both DNA strands in a gene region, usually between 300 and 3000 base pairs in length. Strand-specific QPCR incorporates adaptations that allow damage to be measured in the same region as QPCR but in a strand-specific manner. Single-strand ligation PCR allows the detection of adduct formation at the level of single nucleotides, on individual strands, in a single copy gene in mammalian cells. If antibodies to the DNA adducts of interest are available, these can be used to capture and isolate adducted DNA for use in single-strand ligation PCR, increasing the sensitivity of the assay.
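QPCR-based damage assays typically convert the drop in amplification of the damaged sample, relative to control, into an average lesion frequency by assuming lesions are Poisson-distributed along the fragment (lesions per fragment = -ln(A_damaged / A_control)). The sketch below shows that standard conversion with hypothetical amplification values; it is a generic illustration rather than the exact calculation specified in the protocols described here.

```python
import math

def lesions_per_10kb(amp_damaged, amp_control, fragment_length_bp):
    """Average adduct frequency per 10 kb from relative QPCR amplification, assuming a
    Poisson distribution of polymerase-blocking lesions along the amplified fragment."""
    lesions_per_fragment = -math.log(amp_damaged / amp_control)
    return lesions_per_fragment * 10_000 / fragment_length_bp

# Hypothetical: the damaged sample amplifies to 40% of control over a 2.7 kb target
print(lesions_per_10kb(amp_damaged=0.40, amp_control=1.0, fragment_length_bp=2700))  # ~3.4 lesions / 10 kb
```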

10.
A new and previously overlooked area of gene expression regulation research is emerging. This area is concerned with distinguishing the expression of a single gene from the averaged expression of many gene copies within a cell population. This paper reviews research focused on individual genes in inducible gene expression systems. The main experimental strategy is to measure the gene expression level of a single cell containing a single reporter gene molecule. Contrary to the commonly held belief, gene induction is found to be stochastic under certain conditions. The possible mechanisms and implications are discussed.
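Stochastic induction of a single gene copy is often illustrated with a two-state (telegraph) promoter model simulated by the Gillespie algorithm, in which slow switching between inactive and active promoter states produces bursty mRNA production. The sketch below is a generic illustration with hypothetical rate constants, not a model taken from the reviewed work.

```python
import math
import random

def telegraph_gillespie(k_on, k_off, k_tx, k_deg, t_end, seed=1):
    """Gillespie simulation of one gene copy switching OFF<->ON and producing mRNA when ON."""
    random.seed(seed)
    t, gene_on, mrna, trace = 0.0, 0, 0, []
    while t < t_end:
        rates = [k_on * (1 - gene_on), k_off * gene_on, k_tx * gene_on, k_deg * mrna]
        total = sum(rates)
        if total == 0:
            break
        t += -math.log(1.0 - random.random()) / total   # exponential waiting time
        r, acc, chosen = random.random() * total, 0.0, 0
        for i, rate in enumerate(rates):                # choose which reaction fires
            acc += rate
            if r < acc:
                chosen = i
                break
        if chosen == 0:
            gene_on = 1
        elif chosen == 1:
            gene_on = 0
        elif chosen == 2:
            mrna += 1
        else:
            mrna -= 1
        trace.append((t, mrna))
    return trace

# Hypothetical rates: slow promoter switching gives bursty, cell-to-cell variable induction
print(telegraph_gillespie(k_on=0.02, k_off=0.05, k_tx=2.0, k_deg=0.1, t_end=200.0)[-5:])
```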

11.
12.
Two different methods of using paralogous genes for phylogenetic inference have been proposed: reconciled trees (or gene tree parsimony) and uninode coding. Gene tree parsimony suffers from 10 serious problems, including differential weighting of nucleotide and gap characters, undersampling that can be misinterpreted as synapomorphy, not allowing all of the characters to interact, and giving conflicts between gene trees equal weight regardless of branch support. These problems are largely avoided by using uninode coding. The uninode coding method is elaborated here to address multiple gene duplications within a single gene family and to handle problems caused by lack of gene tree resolution. An example of vertebrate phylogeny inferred from nine genes is reanalyzed using uninode coding. We suggest that uninode coding be used instead of gene tree parsimony for phylogenetic inference from paralogous genes.

13.
For the analysis of protein-DNA interactions by coupled gel-shift/footprinting, DNA fragments need to be extracted from polyacrylamide gels and subsequently separated on high resolution gels. Due to impurities in the extracted DNA, single nucleotide resolution is frequently not achieved. We now describe an improved experimental strategy that employs transient coupling of DNA fragments to a solid support in order to extract DNA of high purity quantitatively, rapidly and reliably. As an example, we describe the application of our protocol to 'in-gel footprinting' by copper phenanthroline. The method should also find application in chemical interference assays.

14.
Linear molecular motors translocate along polymeric tracks using discrete steps. The step length is usually measured using constant-force single molecule experiments in which the polymer is tethered to a force-clamped microsphere. During the enzymatic cycle the motor shortens the tether contour length. Experimental conditions influence the achievable step length resolution, and ideally experiments should be conducted with high clamp-force using slow motors linked to small beads via stiff short tethers. We focus on the limitations that the polymer-track flexibility, the thermal motion of the microsphere, and the motor kinetics pose for step-length measurement in a typical optical tweezers experiment. An expression for the signal/noise ratio in a constant-force, worm-like chain tethered particle, single-molecule experiment is developed. The signal/noise ratio is related to the Fourier transform of the pairwise distance distribution, commonly used to determine step length from a time-series. Monte Carlo simulations verify the proposed theory for experimental parameter values typically encountered with molecular motors (polymerases and helicases) translocating along single- or double-stranded nucleic acids. The predictions are consistent with recent experimental results for double-stranded DNA tethers. Our results map favorable experimental conditions for observing single motor steps on various substrates but indicate that principal resolution limits are set by thermal fluctuations.
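The connection between step detection and the pairwise distance distribution can be made concrete in a few lines: histogram all pairwise differences of the position trace and look for the periodicity in the histogram's Fourier transform, whose first strong peak lies at one over the step size. The sketch below uses a simulated noisy staircase with a hypothetical 2.5 nm step; it illustrates the general idea rather than reproducing the authors' signal/noise analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated noisy staircase: hypothetical 2.5 nm steps with Gaussian positional noise
step_nm, n_steps, pts_per_step = 2.5, 40, 25
trace = np.repeat(np.arange(n_steps) * step_nm, pts_per_step)
trace = trace + rng.normal(0.0, 0.6, trace.size)

# Pairwise distance distribution of the trace
diffs = np.abs(trace[:, None] - trace[None, :]).ravel()
hist, _ = np.histogram(diffs, bins=np.arange(0.0, 20.0, 0.1))

# Periodicity of the histogram reveals the spatial frequency of stepping
spectrum = np.abs(np.fft.rfft(hist - hist.mean()))
freqs = np.fft.rfftfreq(hist.size, d=0.1)        # in 1/nm
peak = freqs[1 + np.argmax(spectrum[1:])]        # skip the zero-frequency bin
print(f"estimated step size ~ {1.0 / peak:.2f} nm")  # close to 2.5 nm for modest noise
```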

15.
16.
17.
The three-dimensional (3D) conformation of chromatin is crucial to stringently regulate gene expression patterns and DNA replication in a cell-type-specific manner. Hi-C is a key technique for measuring 3D chromatin interactions genome-wide. Estimating and predicting the resolution of a library is an essential step in any Hi-C experimental design. Here, we present the mathematical concepts to estimate the resolution of a dataset and to predict whether deeper sequencing would enhance the resolution. We have developed HiCRes, a docker pipeline, by applying these concepts to several Hi-C libraries.
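A widely used operational definition of Hi-C map resolution (following Rao et al., 2014) is the smallest bin size at which a large majority of bins, e.g. 80%, still receive at least about 1,000 contacts. The sketch below applies that definition to hypothetical binned counts; it illustrates the concept and is not the HiCRes implementation.

```python
import numpy as np

def passes_resolution(contacts_per_bin, min_contacts=1000, min_fraction=0.8):
    """True if at least `min_fraction` of bins have >= `min_contacts` contacts."""
    contacts_per_bin = np.asarray(contacts_per_bin)
    return np.mean(contacts_per_bin >= min_contacts) >= min_fraction

def estimate_resolution(counts_by_binsize):
    """counts_by_binsize maps bin size (bp) -> per-bin contact counts; returns the
    smallest bin size that passes the criterion, i.e. the estimated map resolution."""
    for bin_size in sorted(counts_by_binsize):
        if passes_resolution(counts_by_binsize[bin_size]):
            return bin_size
    return None

# Hypothetical library: simulated per-bin counts at two candidate bin sizes
rng = np.random.default_rng(0)
counts = {10_000: rng.poisson(700, 5000), 25_000: rng.poisson(4400, 2000)}
print(estimate_resolution(counts))  # 25000: at this depth, 10 kb bins are too sparse
```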

18.
19.
In this experimental study, the patterns in early marine biofouling communities, and their possible implications for surveillance and environmental management, were explored using metabarcoding, viz. 18S ribosomal RNA gene barcoding in combination with high-throughput sequencing. The community structure of eukaryotic assemblages and the patterns of initial succession were assessed from settlement plates deployed in a busy port for one, five and 15 days. The metabarcoding results were verified with traditional morphological identification of taxa from selected experimental plates. Metabarcoding analysis identified > 400 taxa at a comparatively low taxonomic level, and morphological analysis resulted in the detection of 25 taxa at varying levels of resolution. Despite the differences in resolution, data from both methods were consistent at high taxonomic levels, and similar patterns in community shifts were observed. A high percentage of sequences belonging to genera known to contain non-indigenous species (NIS) were detected after exposure for only one day.
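Community shifts like those described here are commonly summarised by computing a dissimilarity index on the taxon-by-sample read-count table. The sketch below computes Bray-Curtis dissimilarities between hypothetical read-count profiles for the three deployment times; it is a generic illustration, not the analysis pipeline used in the study.

```python
import numpy as np

def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two taxon read-count vectors (0 = identical)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return np.abs(a - b).sum() / (a + b).sum()

# Hypothetical read counts for five taxa on plates retrieved after 1, 5 and 15 days
day1 = [500, 20, 5, 0, 0]
day5 = [300, 150, 80, 40, 5]
day15 = [80, 220, 310, 180, 90]

print(bray_curtis(day1, day5), bray_curtis(day1, day15))  # ~0.41 vs ~0.85: communities diverge over time
```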

20.
In this work, the application of a multivariate curve resolution procedure based on alternating least squares optimization (MCR-ALS) to the analysis of data from DNA microarrays is proposed. For this purpose, simulated and publicly available experimental data sets have been analyzed. Application of MCR-ALS, a method that operates without the use of any training set, has enabled the resolution of the information relevant to the classification of different cancer cell lines using a small set of components, each defined by a sample profile and a pure gene expression profile. From the resolved sample profiles, a classification of samples according to their origin is proposed. From the resolved pure gene expression profiles, a set of over- or underexpressed genes that could be related to the development of cancer has been selected. Advantages of the MCR-ALS procedure in relation to other previously proposed procedures, such as principal component analysis, are discussed.
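MCR-ALS factorises the data matrix D (samples x genes) as D ~ C S^T by alternating least-squares updates of the sample profiles C and the pure expression profiles S under constraints such as non-negativity. A bare-bones sketch of that alternation on simulated data follows; real MCR-ALS implementations add initial estimates, additional constraints and convergence criteria beyond this illustration.

```python
import numpy as np

def mcr_als(D, n_components, n_iter=200, seed=0):
    """Minimal MCR-ALS sketch: alternately solve D ~ C @ S.T for C and S,
    enforcing non-negativity by clipping (real implementations use constrained solvers)."""
    rng = np.random.default_rng(seed)
    S = rng.random((D.shape[1], n_components))                    # pure expression profiles
    for _ in range(n_iter):
        C = np.clip(D @ S @ np.linalg.pinv(S.T @ S), 0, None)     # sample profiles
        S = np.clip(D.T @ C @ np.linalg.pinv(C.T @ C), 0, None)   # expression profiles
    return C, S

# Simulated data: two sample classes mixing two pure expression profiles
rng = np.random.default_rng(1)
C_true = np.vstack([np.tile([1.0, 0.1], (10, 1)), np.tile([0.1, 1.0], (10, 1))])
S_true = rng.random((500, 2))
D = C_true @ S_true.T + 0.01 * rng.random((20, 500))
C_hat, S_hat = mcr_als(D, n_components=2)
print(C_hat.round(2)[:3], C_hat.round(2)[-3:])  # the two sample groups show distinct component patterns
```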
