Similar articles (20 results)
1.
Quality Assessment and Data Analysis for microRNA Expression Arrays
MicroRNAs are small (~22 nt) RNAs that regulate gene expression and play important roles in both normal and disease physiology. The use of microarrays for global characterization of microRNA expression is becoming increasingly popular and has the potential to be a widely used and valuable research tool. However, microarray profiling of microRNA expression raises a number of data analytic challenges that must be addressed in order to obtain reliable results. We introduce here a universal reference microRNA reagent set as well as a series of nonhuman spiked-in synthetic microRNA controls, and demonstrate their use for quality control and between-array normalization of microRNA expression data. We also introduce diagnostic plots designed to assess and compare various normalization methods. We anticipate that the reagents and analytic approach presented here will be useful for improving the reliability of microRNA microarray experiments.
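As an illustration of the normalization idea, here is a minimal sketch (all names hypothetical; not the paper's actual pipeline): each array's log2 intensities are shifted so that its spike-in control probes share a common median level.

```python
import numpy as np

def spike_in_normalize(log_intensities, spike_cols):
    """Scale each array so its spiked-in control probes share a common median.

    log_intensities : (n_arrays, n_probes) array of log2 intensities
    spike_cols      : column indices of the non-human spike-in controls
    """
    spike = log_intensities[:, spike_cols]
    per_array_median = np.median(spike, axis=1)   # one offset per array
    target = per_array_median.mean()              # common reference level
    offsets = per_array_median - target
    return log_intensities - offsets[:, None]     # subtract on the log scale

# Two arrays differing by a constant log2 shift of 1.0
data = np.array([[1.0, 2.0, 3.0, 4.0],
                 [2.0, 3.0, 4.0, 5.0]])
norm = spike_in_normalize(data, spike_cols=[2, 3])
```

After normalization, the constant between-array shift is removed, which is the behaviour diagnostic plots of spike-in levels are meant to verify.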

2.
When studying animal perception, one can normally localize perceptual events in time via behavioural responses time-locked to the stimuli. With multistable stimuli, however, perceptual changes occur despite stationary stimulation. Here, the challenge is to infer these not directly observable perceptual states indirectly from the behavioural data. This estimation is complicated by the fact that an animal's performance is contaminated by errors. We propose a two-step approach to overcome this difficulty: First, one sets up a generative, stochastic model of the behavioural time series based on the relevant parameters, including the probability of errors. Second, one performs a model-based maximum-likelihood estimation on the data in order to extract the non-observable perceptual state transitions. We illustrate this methodology for data from experiments on perception of bistable apparent motion in pigeons. The observed behavioural time series is analysed and explained by a combination of a Markovian perceptual dynamics with a renewal process that governs the motor response. We propose a hidden Markov model in which non-observable states represent both the perceptual states and the states of the renewal process of the motor dynamics, while the observable states account for overt pecking performance. Showing that this constitutes an appropriate phenomenological model of the time series of observable pecking events, we use it subsequently to obtain an estimate of the internal (and thus covert) perceptual reversals. These may directly correspond to changes in the activity of mutually inhibitory populations of motion selective neurones tuned to orthogonal directions.
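The decoding step can be sketched with a generic hidden Markov model. The toy below (hypothetical parameters: two hidden perceptual states, pecks at keys 'A'/'B', a 10% response-error rate) recovers covert state transitions with log-space Viterbi decoding, a simple stand-in for the paper's full maximum-likelihood estimation:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence, computed in log space."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({}); back.append({})
        for s in states:
            prev = max(states, key=lambda p: V[t - 1][p] + math.log(trans_p[p][s]))
            V[t][s] = V[t - 1][prev] + math.log(trans_p[prev][s]) + math.log(emit_p[s][obs[t]])
            back[t][s] = prev
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

# Sticky perceptual states; the lone deviant peck is explained as a motor error
states = ('percA', 'percB')
start = {'percA': 0.5, 'percB': 0.5}
trans = {'percA': {'percA': 0.9, 'percB': 0.1},
         'percB': {'percA': 0.1, 'percB': 0.9}}
emit = {'percA': {'A': 0.9, 'B': 0.1},
        'percB': {'A': 0.1, 'B': 0.9}}
pecks = ['A', 'A', 'B', 'A', 'A', 'B', 'B', 'B']
decoded = viterbi(pecks, states, start, trans, emit)
```

Note how the isolated 'B' at the third trial is attributed to response error rather than a perceptual reversal, because two extra state transitions are less likely than one emission error.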

3.
A brief version of the Implicit Association Test (BIAT) has been introduced. The present research identified analytical best practices for overall psychometric performance of the BIAT. In 7 studies and multiple replications, we investigated analytic practices with several evaluation criteria: sensitivity to detecting known effects and group differences, internal consistency, relations with implicit measures of the same topic, relations with explicit measures of the same topic and other criterion variables, and resistance to an extraneous influence of average response time. The D data-transformation algorithms outperformed the other approaches. This replicates and extends the strong prior performance of D compared to conventional analytic techniques. We conclude with recommended analytic practices for standard use of the BIAT.
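For readers unfamiliar with the D transformation, here is a bare-bones sketch of the conventional IAT-style D score (difference of mean latencies divided by the pooled SD of all trials). It is illustrative only and omits the published algorithm's error penalties and latency trimming:

```python
from statistics import mean, stdev

def d_score(compatible_rts, incompatible_rts):
    """IAT-style D: latency difference scaled by the pooled SD of all trials."""
    pooled_sd = stdev(compatible_rts + incompatible_rts)
    return (mean(incompatible_rts) - mean(compatible_rts)) / pooled_sd

compat = [600, 620, 640, 610, 630]      # ms, hypothetical latencies
incompat = [700, 720, 740, 710, 730]
d = d_score(compat, incompat)
```

Because D is scaled by each respondent's own trial-level variability, it resists the extraneous influence of overall response speed that raw latency differences suffer from.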

4.
Longitudinal data can always be represented by a time series with a deterministic trend and randomly correlated residuals, the latter of which do not usually form a stationary process. The class of linear spectral models is a basis for the exploratory analysis of these data. The theory and techniques of factor analysis provide a means by which one component of the residual series can be separated from an error series, and then partitioned into a sum of randomly scaled metameters that characterize the sample paths of the residuals. These metameters, together with linear modelling techniques, are then used to partition the nonrandom trend into a determined component, which is associated with the sample paths of the residuals, and an independent inherent component. Linear spectral models are assumption-free and represent both random and nonrandom trends with fewer terms than any other mixed-effects linear model. Data on body-weight growth of juvenile mice are used in this paper to illustrate the application of linear spectral models, through a relatively sophisticated exploratory analysis.

5.
Assessing the dynamics of wild populations often involves an estimate of the finite rate of population increase (λ) or the instantaneous rate of increase (r). However, a pervasive problem in trend estimation is that many analytical techniques assume independent errors among the observations. To be valid, variance estimates around λ (or r) must account for serial correlation that exists in abundance data. Time series analysis provides a method for estimating population trends and associated variances when serial correlation of errors occurs. We offer an approach and present an example for estimating λ and its associated variance when observations are correlated over time. We present a simplified time series method and variance estimator to account for autocorrelation based on a moving average process. We illustrate the procedure using a spectacled eider (Somateria fischeri) data set of estimated annual abundances from aerial transect surveys conducted from 1957 to 1995. The analytic variance estimator provides a way to plan future studies to reduce uncertainty and bias in estimates of population growth rates. Demographic studies with policy implications or those involving species of conservation concern should especially consider the correlated nature of population trend data.
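A minimal sketch of the point estimate and the autocorrelation check (hypothetical data; the paper's moving-average variance estimator is not reproduced here): regress ln N on time to obtain r, then inspect the lag-1 autocorrelation of the residuals, which a naive OLS variance formula ignores.

```python
import numpy as np

def growth_rate(years, counts):
    """Slope of ln N on time gives r; lambda = e^r.  Also report the lag-1
    autocorrelation of the residuals: if appreciable, a naive variance for
    the trend is too small and must be corrected for serial correlation."""
    t = np.asarray(years, float)
    y = np.log(np.asarray(counts, float))
    r, intercept = np.polyfit(t, y, 1)
    resid = y - (r * t + intercept)
    rho1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
    return r, np.exp(r), rho1

# Hypothetical survey: 5% annual decline with a small cyclic disturbance
years = np.arange(1957, 1967)
idx = years - 1957
counts = 1000.0 * 0.95 ** idx * np.exp(0.01 * np.sin(idx))
r, lam, rho1 = growth_rate(years, counts)
```

Here λ is recovered near its true value of 0.95; the residual autocorrelation ρ₁ is the quantity whose neglect biases variance estimates downward.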

6.
Surface plasmon resonance (SPR)-biosensor techniques directly provide essential information for the study and characterization of small molecule-nucleic acid interactions, and the use of these methods is steadily increasing. The method is label-free and monitors the interactions in real time. Both dynamic and steady-state information can be obtained for a wide range of reaction rates and binding affinities. This article presents the basics of the SPR technique, provides suggestions for experimental design, and illustrates data processing and analysis of results. A specific example of the interaction of a well-known minor groove binding agent, netropsin, with DNA is evaluated by both kinetic and steady-state SPR methods. Three different experiments are used to illustrate different approaches and analysis methods. The three sets of results show the reproducibility of the binding constants and agreement from both steady-state and kinetic analyses. These experiments also show that reliable kinetic information can be obtained, even with difficult systems, if the experimental conditions are optimized to minimize mass transport effects. Limitations of the biosensor-SPR technique are also discussed to provide an awareness of the care needed to conduct a successful experiment.
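For orientation, the standard 1:1 (Langmuir) association-phase model underlying such kinetic fits can be sketched as follows. The rate constants are hypothetical, chosen only to be in a plausible range for a small-molecule-DNA interaction:

```python
import numpy as np

def association_curve(t, conc, ka, kd, rmax):
    """Association-phase response of a 1:1 interaction model:
    R(t) = Req * (1 - exp(-(ka*C + kd)*t)),  Req = Rmax*C / (C + KD)."""
    kobs = ka * conc + kd                  # observed rate constant (1/s)
    req = rmax * conc / (conc + kd / ka)   # steady-state response at this C
    return req * (1.0 - np.exp(-kobs * t))

# Hypothetical constants: ka in 1/(M s), kd in 1/s, Rmax in response units
ka, kd, rmax = 1e5, 1e-2, 100.0
t = np.linspace(0, 600, 601)               # 10 min association phase
r_curve = association_curve(t, conc=1e-6, ka=ka, kd=kd, rmax=rmax)
steady = r_curve[-1]
```

Fitting R(t) at several analyte concentrations yields ka and kd (kinetic analysis), while fitting the plateau Req versus C yields KD = kd/ka (steady-state analysis); agreement between the two is the consistency check the abstract describes.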

7.
Comparative analysis of nonhuman animal communication systems and their complexity, particularly in comparison to human language, has been generally hampered by both a lack of sufficiently extensive data sets and appropriate analytic tools. Information theory measures provide an important quantitative tool for examining and comparing communication systems across species. In this paper we use the original application of information theory, that of statistical examination of a communication system's structure and organization. As an example of the utility of information theory to the analysis of animal communication systems, we applied a series of information theory statistics to a statistically categorized set of bottlenose dolphin Tursiops truncatus, whistle vocalizations. First, we use the first-order entropic relation in a Zipf-type diagram (Zipf 1949 Human Behavior and the Principle of Least Effort) to illustrate the application of temporal statistics as comparative indicators of repertoire complexity, and as possible predictive indicators of acquisition/learning in animal vocal repertoires. Second, we illustrate the need for more extensive temporal data sets when examining the higher entropic orders, indicative of higher levels of internal informational structure, of such vocalizations, which could begin to allow the statistical reconstruction of repertoire organization. Third, we propose using 'communication capacity' as a measure of the degree of temporal structure and complexity of statistical correlation, represented by the values of entropic order, as an objective tool for interspecies comparison of communication complexity. In doing so, we introduce a new comparative measure, the slope of Shannon entropies, and illustrate how it potentially can be used to compare the organizational complexity of vocal repertoires across a diversity of species. 
Finally, we illustrate the nature and predictive application of these higher-order entropies using a preliminary sample of dolphin whistle vocalizations. The purpose of this preliminary report is to re-examine the original application of information theory to the field of animal communication, illustrate its potential utility as a comparative tool for examining the internal informational structure of animal vocal repertoires and their development, and discuss its relationship to behavioural ecology and evolutionary theory. Copyright 1999 The Association for the Study of Animal Behaviour.
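The first-order entropy and a Zipf-type slope can be computed in a few lines. The sketch below uses a hypothetical categorized symbol sequence and estimates the slope by least squares on the log-log rank-frequency plot:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """First-order Shannon entropy (bits/symbol) of a categorized repertoire."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def zipf_slope(symbols):
    """Slope of log10(frequency) vs log10(rank); values near -1 suggest
    Zipf-like rank-frequency structure in the repertoire."""
    freqs = sorted(Counter(symbols).values(), reverse=True)
    xs = [math.log10(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log10(f) for f in freqs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

whistles = list('AAAABBC')   # hypothetical categorized whistle types
H = shannon_entropy(whistles)
slope = zipf_slope(whistles)
```

Higher entropic orders (digram, trigram, ...) follow the same pattern applied to symbol n-tuples, which is why they demand the much larger data sets the abstract calls for.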

8.
The similarities and differences between the banding patterns obtained in human chromosomes with the Quinacrine fluorescence and the Acetic-Saline-Giemsa (ASG) techniques are described. The use of these techniques to identify each chromosome pair in the human karyotype is discussed, as is the use of the methods to identify aberrant chromosomes and to map points of exchange in translocations and inversions. A number of examples are used to illustrate the resolution permitted by these new methods. Seven polymorphic regions on normal chromosomes are described, which include four identified by fluorescence on chromosomes 3, 4, 13, and 22. The secondary constrictions on chromosomes 1, 9, and 16, which had previously been observed in conventionally stained preparations from favourable material, are particularly clear in all cells treated with the Giemsa techniques. The new methods make it possible to detect small differences in size between the heterochromatic blocks at these regions in homologous chromosomes. The benefit to human genetics of studying the familial segregation of both structurally rearranged and normal, but polymorphic chromosomes, where the chromosomes or parts of chromosomes can be unambiguously identified is stressed.

9.
A computer model of protein aggregation competing with productive folding is proposed. Our model adapts techniques from lattice Monte Carlo studies of protein folding to the problem of aggregation. However, rather than starting with a single string of residues, we allow independently folding strings to undergo collisions and consider their interactions in different orientations. We first present some background into the nature and significance of protein aggregation and the use of lattice Monte Carlo simulations in understanding other aspects of protein folding. The results of a series of simulation experiments involving simple versions of the model illustrate the importance of considering aggregation in simulations of protein folding and provide some preliminary understanding of the characteristics of the model. Finally, we discuss the value of the model in general and of our particular design decisions and experiments. We conclude that computer simulation techniques developed to study protein folding can provide insights into protein aggregation, and that a better understanding of aggregation may in turn provide new insights into and constraints on the more general protein folding problem.

10.
We combine two techniques in order to discuss the time-varying elastic properties of the left ventricular muscle. An analytic model for the shape and forces in the left ventricle is combined with the Fourier series representations of certain of the ventricular dimensions and pressure to derive expressions for the stress and strain in the left ventricle. The strain is thus a function of the elastic material properties, which are then expressed as functions of time by using Fourier series. The only data needed for a numerical study using these techniques are closed-chest determinations of the ventricular dimensions and the ventricular pressure.
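The Fourier-series step can be illustrated by least-squares estimation of the coefficients from one uniformly sampled period (a synthetic pressure-like waveform, not the authors' ventricular data):

```python
import numpy as np

def fourier_coeffs(signal, n_harmonics):
    """Least-squares Fourier coefficients of one uniformly sampled period:
    p(t) ~ a0 + sum_k [a_k cos(k w t) + b_k sin(k w t)]."""
    n = len(signal)
    t = np.arange(n) * 2 * np.pi / n
    cols = [np.ones(n)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * t), np.sin(k * t)]
    A = np.column_stack(cols)                       # design matrix
    coef, *_ = np.linalg.lstsq(A, signal, rcond=None)
    return coef, A

# Hypothetical waveform: DC level plus first two harmonics
n = 200
t = np.arange(n) * 2 * np.pi / n
pressure = 50 + 30 * np.cos(t) + 10 * np.sin(2 * t)
coef, A = fourier_coeffs(pressure, n_harmonics=3)
recon = A @ coef    # reconstructed waveform from the fitted series
```

The coefficient vector is ordered [a0, a1, b1, a2, b2, ...]; substituting such truncated series for the measured dimensions and pressure is what turns the stress-strain expressions into smooth functions of time.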

11.
12.
A series of experiments examined possible strategies human subjects might use in tasks requiring rapid detection or recognition of a taste quality. In a reaction time (RT) paradigm, subjects were to decide whether each of a series of stimuli flowing over the tongue contained a previously designated target taste. Several tasks of varying difficulty were used. The simplest task required subjects to decide whether the target taste or water was presented. The most difficult task required discrimination between two different target tastes in a series of mixtures formed by orthogonally combining the target taste with two different irrelevant tastes. The speed at which subjects could detect and/or recognize target tastes was related to the RT for the particular taste. However, it was also clear that other variables, including the specific stimuli in the mixtures and the cognitive demands placed on the subjects, influenced performance. These results suggest that differences in taste onset time, as indexed by RT, can serve as a cue which subjects use to aid identification of single tastes in a mixture. It is concluded that the ease with which subjects can identify single tastes in a mixture is related to, among other variables, the differences in taste onset time between the tastes.

13.
Measurements of metabolic rate in rats: a comparison of techniques
Two different open-circuit techniques of measuring metabolic rate were examined in rats at rest and during exercise. With one technique ambient air was drawn through a tightly fitting mask that was secured to the rat's head, whereas with the other technique the rat was placed into and ambient air was drawn through a Plexiglas box. Two series of experiments were performed. In series I, two groups were studied that consisted of rats that had received myocardial infarctions produced by coronary arterial ligations and rats that had received sham operations. In this series of experiments O2 uptake (VO2) and CO2 production (VCO2) were measured at rest, during four levels of submaximal exercise, and during maximal treadmill exercise in the same group of rats by use of both techniques in random order. VO2, VCO2, and the calculated respiratory exchange ratio (R) were similar at rest, during the highest level of submaximal exercise (20% grade, 37 m/min), and during maximal exercise; however, VO2 and VCO2 were significantly lower with the metabolic box technique compared with the mask technique during the three lowest work loads (5% grade, 19 m/min; 10% grade, 24 m/min; and 15% grade, 31 m/min). These differences appeared to be associated with a change in gait produced when the mask was worn. In series II, the arterial blood gas and acid-base responses to both submaximal and maximal exercise were measured using both techniques in a group of instrumented rats that had a catheter placed into the right carotid artery.(ABSTRACT TRUNCATED AT 250 WORDS)
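The open-circuit computation behind VO2 and VCO2 can be sketched as follows, using the Haldane (nitrogen-balance) correction for the difference between inspired and expired volumes. Flow is assumed here to be measured on the outflow side, and all numbers are hypothetical:

```python
def open_circuit_vo2(flow_ml_min, fi_o2, fe_o2, fi_co2, fe_co2):
    """O2 uptake and CO2 production (ml/min; flow assumed STPD) from an
    open-circuit system.  Because N2 is neither consumed nor produced,
    inflow * FiN2 = outflow * FeN2, which corrects for unequal volumes."""
    fi_n2 = 1.0 - fi_o2 - fi_co2
    fe_n2 = 1.0 - fe_o2 - fe_co2
    inflow = flow_ml_min * fe_n2 / fi_n2
    vo2 = inflow * fi_o2 - flow_ml_min * fe_o2      # O2 in minus O2 out
    vco2 = flow_ml_min * fe_co2 - inflow * fi_co2   # CO2 out minus CO2 in
    return vo2, vco2

# Hypothetical resting rat: 500 ml/min drawn through the box, room air in
vo2, vco2 = open_circuit_vo2(500.0, fi_o2=0.2093, fe_o2=0.1950,
                             fi_co2=0.0004, fe_co2=0.0130)
rer = vco2 / vo2    # respiratory exchange ratio, R in the abstract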

14.
Dataset partitioning and validation techniques are required in all artificial neural network based waste models. However, there is currently no consensual approach on the validation techniques. This study examines the effects of three time series nested forward validation techniques (rolling origin - RO, rolling window - RW, and growing window - GW) on total municipal waste disposal estimates using recurrent neural network (RNN) models, and benchmarks model performance with respect to multiple linear regression (MLR) models. Validation selection techniques appear important to waste disposal time series model construction and evaluation. Sample size is found to be an important factor in model accuracy for both RNN and MLR models. Better performance is observed in Trial RW4, probably due to a more consistent testing set in 2019. Overall, the MAPE of the waste disposal models ranged from 10.4% to 12.7%. Both GW and RO validation techniques appear appropriate for RNN waste models. However, MLR waste models are more sensitive to the dataset characteristics, and the RO validation technique appears more suitable for MLR models. It is found that data characteristics are more important than training period duration. It is recommended that dataset normality and skewness be examined in waste disposal modeling.
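Terminology for nested forward validation varies between papers; under one common reading, growing-window splits expand the training set each fold while rolling-window splits slide a fixed-length training set forward (rolling origin re-anchors the forecast origin each fold). A minimal generator for the two window schemes, with hypothetical names:

```python
def growing_window(n, initial, horizon=1):
    """Growing-window splits: the training set expands each fold."""
    splits, start = [], initial
    while start + horizon <= n:
        splits.append((list(range(0, start)),
                       list(range(start, start + horizon))))
        start += horizon
    return splits

def rolling_window(n, window, horizon=1):
    """Rolling-window splits: a fixed-length training set slides forward."""
    splits, start = [], window
    while start + horizon <= n:
        splits.append((list(range(start - window, start)),
                       list(range(start, start + horizon))))
        start += horizon
    return splits

gw = growing_window(6, initial=3)   # train on indices 0..k-1, test on k
rw = rolling_window(6, window=3)    # train on the 3 most recent indices
```

Both schemes keep the test fold strictly after the training data, which is the property that makes them valid for time series, unlike random k-fold splits.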

15.
In the preceding paper, we have investigated the structural heterogeneous character of a series of amorphous samples prepared from various starchy substrates (native potato starch, amylopectin and amylose) following different techniques of preparation (casting, freeze drying and solvent exchange). Spectral decompositions of the C1 resonances of the (13)C CP-MAS (Cross Polarization and Magic Angle Spinning) spectra under (1)H decoupling have shown the existence of five main types of alpha(1-4) linkages. In this part, 2D solid state NMR WISE experiments and the (13)C/(1)H magnetization transfer in CP as a local probe for both structures and dynamics were used. The (13)C CP magnetization curves versus contact time of each C1 component in each recorded spectrum were fitted with an analytic function taking into account two (1)H reservoirs. Interpretation of the characteristic times derived from fitting yields some improvements on the knowledge of the heterogeneity of the samples and on the water molecules distribution.

16.
The automation of laboratory techniques has greatly increased the number of experiments that can be carried out in the chemical and biological sciences. Until recently, this automation has focused primarily on improving hardware. Here we argue that future advances will concentrate on intelligent software to integrate physical experimentation and results analysis with hypothesis formulation and experiment planning. To illustrate our thesis, we describe the 'Robot Scientist' - the first physically implemented example of such a closed loop system. In the Robot Scientist, experimentation is performed by a laboratory robot, hypotheses concerning the results are generated by machine learning and experiments are allocated and selected by a combination of techniques derived from artificial intelligence research. The performance of the Robot Scientist has been evaluated by a rediscovery task based on yeast functional genomics. The Robot Scientist is proof that the integration of programmable laboratory hardware and intelligent software can be used to develop increasingly automated laboratories.

17.
Many biomedical experiments require the qualitative and quantitative localization of trace elements with high sensitivity and good spatial resolution. The feasibility of measuring the chemical form of the elements, the time course of trace element metabolism, and conducting experiments in living biological systems are also important requirements for biological trace element research. Nuclear analytical techniques that employ ion or photon beams have grown in importance in the past decade and have led to several new experimental approaches. Some of the important features of these methods are reviewed here along with their role in trace element research. Examples of their use are given to illustrate potential for new research directions. It is emphasized that the effective application of these methods necessitates a closely integrated multidisciplinary scientific team.

18.
Most computational models for gender classification use global information (the full face image), giving equal weight to the whole face area irrespective of the importance of the internal features. Here, we use a representation of face images that includes both global and featural information. We use dimensionality reduction techniques and a support vector machine classifier and show that this method performs better than either global or feature based representations alone. We also present results of human subjects' performance on the gender classification task and evaluate how the different dimensionality reduction techniques compare with human performance. The results support the psychological plausibility of the combined global and feature based representation.
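The pipeline can be sketched with an SVD-based PCA projection. For brevity the sketch below substitutes a nearest-centroid classifier for the SVM and uses synthetic two-class data; every name here is hypothetical:

```python
import numpy as np

def pca_project(X, k):
    """Project rows of X onto the top-k principal components (via SVD)."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return (X - mu) @ vt[:k].T, mu, vt[:k]

def nearest_centroid_fit(Z, y):
    """Class centroids in the reduced space."""
    return {c: Z[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(centroids, z):
    return min(centroids, key=lambda c: np.linalg.norm(z - centroids[c]))

rng = np.random.default_rng(0)
# Toy "feature vectors": two well-separated classes in 10 dimensions
X = np.vstack([rng.normal(0, 1, (20, 10)), rng.normal(3, 1, (20, 10))])
y = np.array([0] * 20 + [1] * 20)
Z, mu, comps = pca_project(X, k=2)
cents = nearest_centroid_fit(Z, y)
preds = np.array([nearest_centroid_predict(cents, z) for z in Z])
acc = (preds == y).mean()
```

A combined global-plus-featural representation would simply concatenate the full-image vector with cropped feature vectors before the projection step; the reduction and classification stages are unchanged.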

19.
MOTIVATION: Accurate time series for biological processes are difficult to estimate due to problems of synchronization, temporal sampling and rate heterogeneity. Methods are needed that can utilize multi-dimensional data, such as those resulting from DNA microarray experiments, in order to reconstruct time series from unordered or poorly ordered sets of observations. RESULTS: We present a set of algorithms for estimating temporal orderings from unordered sets of sample elements. The techniques we describe are based on modifications of a minimum-spanning tree calculated from a weighted, undirected graph. We demonstrate the efficacy of our approach by applying these techniques to an artificial data set as well as several gene expression data sets derived from DNA microarray experiments. In addition to estimating orderings, the techniques we describe also provide useful heuristics for assessing relevant properties of sample datasets such as noise and sampling intensity, and we show how a data structure called a PQ-tree can be used to represent uncertainty in a reconstructed ordering. AVAILABILITY: Academic implementations of the ordering algorithms are available as source code (in the programming language Python) on our web site, along with documentation on their use. The artificial 'jelly roll' data set upon which the algorithm was tested is also available from this web site. The publicly available gene expression data may be found at http://genome-www.stanford.edu/cellcycle/ and http://caulobacter.stanford.edu/CellCycle/.
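The core idea (ordering samples along a minimum spanning tree of pairwise distances) can be sketched as follows. This is a simplified stand-in for the paper's algorithms, using Prim's construction and a double-BFS walk of the tree's longest path:

```python
import numpy as np

def mst_ordering(points):
    """Order samples along an MST of pairwise Euclidean distances:
    build the tree (Prim), then walk its longest path (two BFS passes)."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    in_tree, adj = {0}, {i: [] for i in range(n)}
    while len(in_tree) < n:                       # Prim: add cheapest edge out
        u, v = min(((u, v) for u in in_tree for v in range(n) if v not in in_tree),
                   key=lambda e: d[e[0], e[1]])
        adj[u].append(v); adj[v].append(u); in_tree.add(v)

    def farthest(src):                            # BFS path to a farthest node
        seen, order, parent = {src}, [src], {src: None}
        for node in order:
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb); order.append(nb); parent[nb] = node
        end, path = order[-1], []
        while end is not None:
            path.append(end); end = parent[end]
        return path[::-1]

    return farthest(farthest(0)[-1])

# Shuffled samples from a smooth 2-D trajectory (progression-like data)
t = np.array([0.0, 0.4, 0.1, 0.3, 0.2])
pts = np.column_stack([t, t ** 2])
order = mst_ordering(pts)
```

For data lying near a smooth trajectory the MST is close to a path, so the longest-path walk recovers the latent ordering up to direction; branches off the path are where PQ-tree-style uncertainty representations become useful.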

20.
Biophysical Journal, 2021, 120(20): 4472-4483
Single-molecule (SM) approaches have provided valuable mechanistic information on many biophysical systems. As technological advances lead to ever-larger data sets, tools for rapid analysis and identification of molecules exhibiting the behavior of interest are increasingly important. In many cases the underlying mechanism is unknown, making unsupervised techniques desirable. The divisive segmentation and clustering (DISC) algorithm is one such unsupervised method that idealizes noisy SM time series much faster than computationally intensive approaches without sacrificing accuracy. However, DISC relies on a user-selected objective criterion (OC) to guide its estimation of the ideal time series. Here, we explore how different OCs affect DISC’s performance for data typical of SM fluorescence imaging experiments. We find that OCs differing in their penalty for model complexity each optimize DISC’s performance for time series with different properties such as signal/noise and number of sample points. Using a machine learning approach, we generate a decision boundary that allows unsupervised selection of OCs based on the input time series to maximize performance for different types of data. This is particularly relevant for SM fluorescence data sets, which often have signal/noise near the derived decision boundary and include time series of nonuniform length because of stochastic bleaching. Our approach, AutoDISC, allows unsupervised per-molecule optimization of DISC, which will substantially assist in the rapid analysis of high-throughput SM data sets with noisy samples and nonuniform time windows.
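The role of the objective criterion can be illustrated generically: fit idealizations with different numbers of states and let an OC such as BIC (penalty p·ln n) or AIC (penalty 2p) arbitrate between them. The sketch below is not the DISC algorithm, and its parameter count is a deliberately crude placeholder:

```python
import numpy as np

def gaussian_ll(resid, sigma):
    """Gaussian log-likelihood of residuals with common noise level sigma."""
    n = len(resid)
    return -0.5 * n * np.log(2 * np.pi * sigma ** 2) - np.sum(resid ** 2) / (2 * sigma ** 2)

def fit_k_states(x, k, iters=20):
    """Tiny 1-D k-means idealization: snap each point to the nearest of k levels."""
    levels = np.quantile(x, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - levels[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                levels[j] = x[labels == j].mean()
    resid = x - levels[labels]
    sigma = max(resid.std(), 1e-9)
    n_params = 2 * k                  # placeholder count: levels + occupancies
    return gaussian_ll(resid, sigma), n_params

def bic(ll, p, n): return -2 * ll + p * np.log(n)
def aic(ll, p, n): return -2 * ll + 2 * p

rng = np.random.default_rng(1)
true = rng.integers(0, 2, 300).astype(float)     # two-state trajectory
trace = true + rng.normal(0, 0.1, 300)           # SM-fluorescence-like noise
scores = {k: bic(*fit_k_states(trace, k), n=300) for k in (1, 2)}
best_k = min(scores, key=scores.get)
```

Because BIC's penalty grows with ln n while AIC's does not, the two criteria can disagree on short, noisy traces; that sensitivity to trace length and signal/noise is exactly what motivates per-molecule OC selection.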
