Similar Literature
1.
《Biophysical journal》2023,122(2):433-441
Potential energy landscapes are useful models in describing events such as protein folding and binding. While single-molecule fluorescence resonance energy transfer (smFRET) experiments encode information on continuous potentials for the system probed, including rarely visited barriers between putative potential minima, this information is rarely decoded from the data. This is because existing analysis methods often model smFRET output assuming, from the outset, that the system probed evolves in a discretized state space to be analyzed within a hidden Markov model (HMM) paradigm. By contrast, here, we infer continuous potentials from smFRET data without discretely approximating the state space. We do so by operating within a Bayesian nonparametric paradigm by placing priors on the family of all possible potential curves. As our inference accounts for a number of required experimental features raising computational cost (such as incorporating discrete photon shot noise), the framework leverages a structured-kernel-interpolation Gaussian process prior to help curtail that cost. We show that our structured-kernel-interpolation priors for potential energy reconstruction from smFRET analysis accurately infer the potential energy landscape from an smFRET binding experiment. We then illustrate advantages of structured-kernel-interpolation priors for potential energy reconstruction from smFRET over standard HMM approaches by providing information, such as barrier heights and friction coefficients, that is otherwise inaccessible to HMMs.

2.
Single-molecule fluorescence resonance energy transfer (smFRET) probes conformational changes by measuring the efficiency of fluorescence energy transfer between a donor and an acceptor within a single molecule. Extracting such information about biological macromolecules requires statistical analysis of large numbers of single-molecule signals, and analyzing them manually is laborious, subjective, and poorly reproducible. We therefore applied the wavelet transform and the rolling-ball algorithm to the statistical analysis of single-molecule signals in smFRET images. While ensuring that single-molecule signals were still detected accurately, we analyzed the linearity of images processed by the two algorithms. The results show that both the rolling-ball and wavelet-transform algorithms effectively remove background noise from single-molecule FRET images while preserving the linearity of the single-molecule fluorescence signals. Finally, we used the rolling-ball algorithm to process single-molecule FRET images and to build a histogram of FRET efficiencies for 15 bp DNA, from which the FRET efficiency of the 15 bp DNA was calculated.
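The rolling-ball background subtraction described above can be approximated by a grey-scale morphological opening (an erosion followed by a dilation). The sketch below illustrates the idea in 1-D with pure NumPy; it is an illustration only, not the authors' implementation, and the window radius and synthetic trace are arbitrary choices:

```python
import numpy as np

def rolling_ball_background(signal, radius):
    """1-D rolling-ball-style background: a grey-scale opening (erosion
    then dilation) with a flat structuring element of width 2*radius + 1.
    Features narrower than the element (fluorescence spots) are removed,
    leaving an estimate of the smooth background."""
    w = 2 * radius + 1
    pad = np.pad(signal, radius, mode="edge")
    eroded = np.lib.stride_tricks.sliding_window_view(pad, w).min(axis=1)
    pad2 = np.pad(eroded, radius, mode="edge")
    return np.lib.stride_tricks.sliding_window_view(pad2, w).max(axis=1)

baseline = np.linspace(10.0, 30.0, 200)   # slow background drift
trace = baseline.copy()
trace[80:84] += 100.0                     # narrow fluorescence burst
corrected = trace - rolling_ball_background(trace, radius=10)
```

The background-subtracted trace retains the burst while the drifting baseline is removed; the same opening generalizes to 2-D images with a disk-shaped structuring element.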

3.
Wavelet thresholding with Bayesian false discovery rate control
The false discovery rate (FDR) procedure has become a popular method for handling multiplicity in high-dimensional data. The definition of FDR has a natural Bayesian interpretation; it is the expected proportion of null hypotheses mistakenly rejected given a measure of evidence for their truth. In this article, we propose controlling the positive FDR using a Bayesian approach where the rejection rule is based on the posterior probabilities of the null hypotheses. Correspondence between Bayesian and frequentist measures of evidence in hypothesis testing has been studied in several contexts. Here we extend the comparison to multiple testing with control of the FDR and illustrate the procedure with an application to wavelet thresholding. The problem consists of recovering signal from noisy measurements. This involves extracting wavelet coefficients that result from true signal and can be formulated as a multiple hypothesis-testing problem. We use simulated examples to compare the performance of our approach to the Benjamini and Hochberg (1995, Journal of the Royal Statistical Society, Series B 57, 289-300) procedure. We also illustrate the method with nuclear magnetic resonance spectral data from human brain.
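For reference, the frequentist Benjamini–Hochberg step-up procedure that the authors compare against can be sketched in a few lines (the p-values below are synthetic, and the article's Bayesian posterior-probability rejection rule is not reproduced here):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean rejection mask controlling the FDR at level q.

    Sort the p-values, find the largest rank k with p_(k) <= q*k/m,
    and reject the k hypotheses with the smallest p-values."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m
    passed = p[order] <= thresh
    k = int(np.max(np.nonzero(passed)[0])) + 1 if passed.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True
    return reject

# Toy example: 5 "signal" coefficients with tiny p-values among 95 nulls.
rng = np.random.default_rng(2)
pvals = np.concatenate([rng.uniform(0, 1e-4, 5), rng.uniform(0, 1, 95)])
mask = benjamini_hochberg(pvals, q=0.05)
```

Applied to wavelet coefficients, the mask selects which detail coefficients are kept as true signal and which are set to zero.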

4.
《Médecine Nucléaire》2007,31(5):219-234
Scintigraphic images are strongly affected by Poisson noise. This article presents the results of a comparison between denoising methods for Poisson noise according to different criteria: the gain in signal-to-noise ratio, the preservation of resolution and contrast, and the visual quality. The wavelet techniques recently developed to denoise Poisson noise limited images are divided into two groups based on: (1) the Haar representation, (2) the transformation of Poisson noise into white Gaussian noise by the Haar–Fisz transform followed by a denoising. In this study, three variants of the first group and three variants of the second, including the adaptive Wiener filter, four types of wavelet thresholdings and the Bayesian method of Pizurica were compared to Metz and Hanning filters and to Shine, a systematic noise elimination process. All these methods, except Shine, are parametric. For each of them, ranges of optimal values for the parameters were highlighted as a function of the aforementioned criteria. The intersection of ranges for the wavelet methods without thresholding was empty, and these methods were therefore not further compared quantitatively. The thresholding techniques and Shine gave the best results in resolution and contrast. The largest improvement in signal-to-noise ratio was obtained by the filters. Ideally, these filters should be accurately defined for each image. This is difficult in the clinical context. Moreover, they generate oscillation artefacts. In addition, the wavelet techniques did not bring significant improvements, and are rather slow. Therefore, Shine, which is fast and works automatically, appears to be an interesting alternative.
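The Haar–Fisz transform used by the second group of methods converts Poisson noise into approximately white Gaussian noise before a Gaussian denoiser is applied. A closely related, simpler variance-stabilizing step is the Anscombe transform, sketched here as a stand-in with the same purpose (it is not the transform used in the article):

```python
import numpy as np

def anscombe(x):
    """Anscombe transform: maps Poisson counts to data with approximately
    unit variance, so any Gaussian denoiser can be applied afterwards."""
    return 2.0 * np.sqrt(np.asarray(x, dtype=float) + 3.0 / 8.0)

def inverse_anscombe(y):
    """Simple algebraic inverse (slightly biased but serviceable)."""
    return (np.asarray(y, dtype=float) / 2.0) ** 2 - 3.0 / 8.0

rng = np.random.default_rng(3)
counts = rng.poisson(lam=20.0, size=100_000)   # synthetic count image, flattened
stabilized = anscombe(counts)                  # std close to 1 regardless of lam
```

After denoising in the stabilized domain, the inverse transform maps the estimate back to the count scale.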

5.
The identification of new diagnostic or prognostic biomarkers is one of the main aims of clinical cancer research. In recent years, there has been a growing interest in using mass spectrometry for the detection of such biomarkers. The MS signal resulting from MALDI‐TOF measurements is contaminated by different sources of technical variations that can be removed by a prior pre‐processing step. In particular, denoising makes it possible to remove the random noise contained in the signal. Wavelet methodology associated with thresholding is usually used for this purpose. In this study, we adapted two multivariate denoising methods that combine wavelets and PCA to MS data. The objective was to obtain better denoising of the data so as to extract the meaningful proteomic biological information from the raw spectra and reach meaningful clinical conclusions. The proposed methods were evaluated and compared with the classical soft thresholding denoising method using both real and simulated data sets. It was shown that taking into account common structures of the signals by adding a dimension reduction step on approximation coefficients through PCA provided more effective denoising when combined with soft thresholding on detail coefficients.
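The classical soft-thresholding baseline against which the multivariate methods are compared can be sketched as follows, with a single-level Haar transform written out in NumPy to keep the example self-contained (the wavelet, the universal threshold, and the synthetic signal are illustrative assumptions, not the study's exact settings):

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt."""
    out = np.empty(2 * approx.size)
    out[0::2] = (approx + detail) / np.sqrt(2.0)
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out

def soft_threshold(c, t):
    """Shrink coefficients toward zero by t (soft thresholding)."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

rng = np.random.default_rng(4)
n = 1024
clean = np.sin(np.linspace(0, 4 * np.pi, n))
noisy = clean + rng.normal(0.0, 0.3, n)
a, d = haar_dwt(noisy)
t = 0.3 * np.sqrt(2.0 * np.log(n))   # universal threshold, sigma assumed known
denoised = haar_idwt(a, soft_threshold(d, t))
```

The paper's contribution is to add a PCA dimension-reduction step on the approximation coefficients before this detail-coefficient thresholding.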

6.
Single-molecule fluorescence resonance energy transfer (smFRET) measurement is a powerful technique for investigating dynamics of biomolecules, for which various efforts have been made to overcome significant stochastic noise. Time stamp (TS) measurement has been employed experimentally to enrich information within the signals, while data analyses such as the hidden Markov model (HMM) have been successfully applied to recover the trajectories of molecular state transitions from time-binned photon counting signals or images. In this article, we introduce the HMM for TS-FRET signals, employing the variational Bayes (VB) inference to solve the model, and demonstrate the application of VB-HMM-TS-FRET to simulated TS-FRET data. The same analysis using VB-HMM is conducted for other models and the previously reported change point detection scheme. The performance is compared to other analysis methods or data types and we show that our VB-HMM-TS-FRET analysis can achieve the best performance and results in the highest time resolution. Finally, an smFRET experiment was conducted to observe spontaneous branch migration of Holliday-junction DNA. VB-HMM-TS-FRET was successfully applied to reconstruct the state transition trajectory with the number of states consistent with the nucleotide sequence. The results suggest that a single migration process frequently involves rearrangement of multiple basepairs.

7.
Tracking single molecules in living cells provides invaluable information on their environment and on the interactions that underlie their motion. New experimental techniques now permit the recording of large amounts of individual trajectories, enabling the implementation of advanced statistical tools for data analysis. In this primer, we present a Bayesian approach toward treating these data, and we discuss how it can be fruitfully employed to infer physical and biochemical parameters from single-molecule trajectories.

8.
A wavelet-transform-based denoising algorithm for ECG signals
Objective: To remove the electromyographic interference, power-line interference, and baseline drift introduced during ECG acquisition, so that the noise does not cause feature points of the ECG signal to be misidentified or missed. Methods: The ECG signal was first decomposed with the coif4 wavelet using the Mallat algorithm, and then denoised using a compromise between soft and hard thresholding followed by wavelet reconstruction. Results: Simulations on ECG signals from the MIT/BIH Arrhythmia Database show that the three common types of noise were removed effectively. Conclusion: The method performs well in real time and provides a foundation for clinical analysis and diagnosis.
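The soft/hard threshold compromise mentioned in the Methods can be sketched as below; the mixing parameter `alpha` and its default value are assumptions for illustration, not necessarily the article's exact formula:

```python
import numpy as np

def compromise_threshold(c, t, alpha=0.5):
    """Soft/hard compromise thresholding: alpha=1 gives soft, alpha=0 hard.

    Coefficients with magnitude below t are zeroed; those above are shrunk
    by alpha*t, reducing the bias of pure soft thresholding while avoiding
    the discontinuity of pure hard thresholding."""
    c = np.asarray(c, dtype=float)
    return np.where(np.abs(c) > t, np.sign(c) * (np.abs(c) - alpha * t), 0.0)

coeffs = np.array([-3.0, -0.5, 0.2, 1.5, 4.0])
shrunk = compromise_threshold(coeffs, t=1.0, alpha=0.5)
```

In the denoising pipeline this rule would be applied to the detail coefficients of each decomposition level before wavelet reconstruction.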

9.
Single-molecule real time trajectories are embedded in high noise. To extract kinetic or dynamic information of the molecules from these trajectories often requires idealization of the data in steps and dwells. One major premise behind the existing single-molecule data analysis algorithms is Gaussian ‘white’ noise, which displays no correlation in time and whose amplitude is independent of the data sampling frequency. This so-called ‘white’ noise is widely assumed but its validity has not been critically evaluated. We show that correlated noise exists in single-molecule real time trajectories collected from optical tweezers. The assumption of white noise during analysis of these data can lead to serious over- or underestimation of the number of steps depending on the algorithms employed. We present a statistical method that quantitatively evaluates the structure of the underlying noise, takes the noise structure into account, and identifies steps and dwells in a single-molecule trajectory. Unlike existing data analysis algorithms, this method uses Generalized Least Squares (GLS) to detect steps and dwells. Under the GLS framework, the optimal number of steps is chosen using model selection criteria such as the Bayesian Information Criterion (BIC). Comparison with existing step detection algorithms showed that this GLS method can detect step locations with the highest accuracy in the presence of correlated noise. Because this method is automated, and directly works with high bandwidth data without pre-filtering or assumption of Gaussian noise, it may be broadly useful for analysis of single-molecule real time trajectories.
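The idea of choosing the number of steps by an information criterion can be sketched with ordinary least squares and a single candidate change point. This is a deliberate simplification: the article's method uses Generalized Least Squares with an estimated noise covariance, which this sketch omits under a white-noise assumption.

```python
import numpy as np

def best_split(y):
    """Index minimizing the residual sum of squares of a one-step fit."""
    best_i, best_rss = 1, np.inf
    for i in range(1, y.size):
        rss = ((y[:i] - y[:i].mean()) ** 2).sum() + \
              ((y[i:] - y[i:].mean()) ** 2).sum()
        if rss < best_rss:
            best_i, best_rss = i, rss
    return best_i, best_rss

def bic(rss, n, k):
    """BIC for a piecewise-constant model with k change points
    (k + 1 level parameters plus a noise variance)."""
    return n * np.log(rss / n) + (k + 2) * np.log(n)

rng = np.random.default_rng(5)
y = np.concatenate([np.zeros(100), np.full(100, 2.0)]) + rng.normal(0, 0.3, 200)
rss0 = ((y - y.mean()) ** 2).sum()          # 0-step model
split, rss1 = best_split(y)                  # best 1-step model
chosen = 1 if bic(rss1, y.size, 1) < bic(rss0, y.size, 0) else 0
```

Extending this to many steps (recursive splitting, GLS weighting by the estimated noise covariance) recovers the structure of the published method.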

10.
Hidden Markov models have been used to restore recorded signals of single ion channels buried in background noise. Parameter estimation and signal restoration are usually carried out through likelihood maximization by using variants of the Baum-Welch forward-backward procedures. This paper presents an alternative approach for dealing with this inferential task. The inferences are made by using a combination of the framework provided by Bayesian statistics and numerical methods based on Markov chain Monte Carlo stochastic simulation. The reliability of this approach is tested by using synthetic signals of known characteristics. The expectations of the model parameters estimated here are close to those calculated using the Baum-Welch algorithm, but the present methods also yield estimates of their errors. Comparisons of the results of the Bayesian Markov Chain Monte Carlo approach with those obtained by filtering and thresholding demonstrate clearly the superiority of the new methods.

11.

Background

A key to increasing the power of multilocus association tests is to reduce the number of degrees of freedom by suppressing noise from data. One of the difficulties is to decide how much noise to suppress. An often overlooked problem is that commonly used association tests based on genotype data cannot utilize the genetic information contained in spatial ordering of SNPs (see proof in the Appendix), which may prevent them from achieving higher power.

Results

We develop a score test based on wavelet transform with empirical Bayesian thresholding. Extensive simulation studies are carried out under various LD structures as well as using HapMap data from many different chromosomes for both qualitative and quantitative traits. Simulation results show that the proposed test automatically adjusts the level of noise suppression according to LD structures, and it consistently achieves power higher than or comparable to many commonly used association tests, including the principal component regression method (PCReg).

Conclusion

The wavelet-based score test automatically suppresses the right amount of noise and uses the information contained in spatial ordering of SNPs to achieve higher power.

12.
Calcium sparks and embers are localized intracellular events of calcium release in muscle cells studied frequently by confocal microscopy using line-scan imaging. The large quantity of images and large number of events require automatic detection procedures based on signal processing methods. In the past decades these methods were based on thresholding procedures. Although wavelet transforms have recently been introduced as well, they have not become widespread. We have implemented a set of algorithms based on one- and two-dimensional versions of the à trous wavelet transform. The algorithms were used to perform spike filtering, denoising and detection procedures. Because the algorithms depend on user-adjustable parameters, the effect of these parameters on the efficiency of the algorithms was studied in detail. We give methods to avoid false positive detections which are the consequence of the background noise in confocal images. In order to establish the efficiency and reliability of the algorithms, various tests were performed on artificial and experimental images. Spark parameters (amplitude, full width at half maximum) calculated using the traditional and the wavelet methods were compared. We found that the latter method is capable of identifying more events with better accuracy on experimental images. Furthermore, we extended the wavelet-based transform from calcium sparks to long-lasting small-amplitude events such as calcium embers. The method not only solved their automatic detection but enabled the identification of events with small amplitude that otherwise escaped the eye, rendering the determination of their characteristic parameters more accurate.
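A one-dimensional version of the à trous (undecimated) wavelet transform with the common B3-spline kernel can be sketched as follows; the kernel choice, boundary handling, and number of levels are illustrative assumptions, and the article works on 2-D line-scan images:

```python
import numpy as np

def a_trous(signal, levels=3):
    """Undecimated 'à trous' wavelet transform with the B3-spline kernel.

    Returns the detail (wavelet) planes plus the final smooth residual;
    the sum of all returned arrays reconstructs the input exactly."""
    kernel = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    c = np.asarray(signal, dtype=float)
    planes = []
    for j in range(levels):
        # Dilate the kernel by inserting 2**j - 1 zeros between taps.
        k = np.zeros(4 * 2 ** j + 1)
        k[:: 2 ** j] = kernel
        pad = k.size // 2
        smooth = np.convolve(np.pad(c, pad, mode="reflect"), k, mode="valid")
        planes.append(c - smooth)   # detail plane at scale j
        c = smooth
    planes.append(c)                # smooth residual
    return planes

x = np.sin(np.linspace(0, 2 * np.pi, 128))
planes = a_trous(x, levels=3)
recon = np.sum(planes, axis=0)
```

Event detection then thresholds the detail planes at each scale; because the transform is undecimated, detected positions map directly back to pixels in the line-scan image.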

13.
Y. Slim  K. Raoof 《IRBM》2010,31(4):209-220
The signal-to-noise ratio (SNR) of the surface respiratory electromyography (EMG) signal is very low; the EMG signal is contaminated by different types of noise, especially the cardiac artefact (ECG). This article addresses the problem of removing the ECG artefact from the respiratory EMG signal. The new method uses an adaptive structure with an ECG reference signal obtained by wavelet decomposition. The proposed algorithm requires only one channel to estimate both the adaptive filter's input reference noise and the respiratory EMG signal. The technique combines two steps. The first decomposes the signal into sub-bands with the forward discrete wavelet transform to obtain the wavelet coefficients, applies an improved soft-thresholding function, and reconstructs the ECG input reference signal from the transformed coefficients. The second uses an adaptive filter, specifically an LMS filter, to remove the ECG signal. Statistical and mathematical analyses confirm that the method is appropriate. Compared with the results obtained using previous techniques, the results achieved with the new algorithm show a significant improvement in the efficiency of ECG rejection.
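The LMS adaptive noise canceller at the core of the second step can be sketched as below; the filter order, step size, and the synthetic sinusoidal "ECG-like" interference are illustrative assumptions, not the article's settings:

```python
import numpy as np

def lms_filter(reference, primary, order=8, mu=0.01):
    """LMS adaptive noise canceller.

    'primary' is signal + interference; 'reference' is correlated with
    the interference only. The filter output converges to an estimate of
    the interference, so the error e is the cleaned signal."""
    w = np.zeros(order)
    e = np.zeros(primary.size)
    for n in range(order, primary.size):
        x = reference[n - order:n][::-1]   # most recent reference samples
        y = w @ x                          # interference estimate
        e[n] = primary[n] - y              # cleaned output
        w += 2.0 * mu * e[n] * x           # stochastic-gradient update
    return e

rng = np.random.default_rng(6)
n = np.arange(5000)
interference = np.sin(2 * np.pi * 0.05 * n)     # "ECG-like" artefact
emg = 0.2 * rng.standard_normal(n.size)         # "EMG-like" signal
primary = emg + interference
cleaned = lms_filter(interference, primary, order=8, mu=0.01)
```

In the published method the reference channel is not given directly: it is reconstructed from the contaminated channel itself via wavelet decomposition and thresholding before being fed to the canceller.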

14.
A practical guide to single-molecule FRET
Roy R  Hohng S  Ha T 《Nature methods》2008,5(6):507-516
Single-molecule fluorescence resonance energy transfer (smFRET) is one of the most general and adaptable single-molecule techniques. Despite the explosive growth in the application of smFRET to answer biological questions in the last decade, the technique has been practiced mostly by biophysicists. We provide a practical guide to using smFRET, focusing on the study of immobilized molecules that allow measurements of single-molecule reaction trajectories from 1 ms to many minutes. We discuss issues a biologist must consider to conduct successful smFRET experiments, including experimental design, sample preparation, single-molecule detection and data analysis. We also describe how a smFRET-capable instrument can be built at a reasonable cost with off-the-shelf components and operated reliably using well-established protocols and freely available software.

15.
For over a decade, experimental evolution has been combined with high-throughput sequencing techniques. In so-called Evolve-and-Resequence (E&R) experiments, populations are kept in the laboratory under controlled experimental conditions where their genomes are sampled and allele frequencies monitored. However, identifying signatures of adaptation in E&R datasets is far from trivial, and it is still necessary to develop more efficient and statistically sound methods for detecting selection in genome-wide data. Here, we present Bait-ER – a fully Bayesian approach based on the Moran model of allele evolution to estimate selection coefficients from E&R experiments. The model has overlapping generations, a feature that describes several experimental designs found in the literature. We tested our method under several different demographic and experimental conditions to assess its accuracy and precision, and it performs well in most scenarios. Nevertheless, some care must be taken when analysing trajectories where drift largely dominates and starting frequencies are low. We compare our method with other available software and report that ours has generally high accuracy even for trajectories whose complexity goes beyond a classical sweep model. Furthermore, our approach avoids the computational burden of simulating an empirical null distribution, outperforming available software in terms of computational time and facilitating its use on genome-wide data. We implemented and released our method in a new open-source software package that can be accessed at https://doi.org/10.5281/zenodo.7351736.

16.
A hybrid two-dimensional ECG data compression method based on the wavelet transform
We propose a new wavelet-transform-based hybrid method for compressing two-dimensional electrocardiogram (ECG) data. Exploiting two kinds of correlation in ECG data, the method first converts the one-dimensional ECG signal into a two-dimensional sequence. The two-dimensional sequence is then wavelet transformed, and the resulting coefficients are compressed with an improved coding scheme: based on the characteristics of the individual coefficient subbands and the similarity between them, the set partitioning in hierarchical trees (SPIHT) algorithm and vector quantization (VQ) are both modified, and a hybrid of the improved SPIHT and VQ algorithms is used to encode the wavelet coefficients. The proposed algorithm was compared with representative existing wavelet-based compression algorithms and with other two-dimensional ECG compression algorithms on arrhythmia records from the MIT/BIH database. The results show that the proposed algorithm is suitable for ECG signals with a variety of waveform characteristics and achieves a high compression ratio while preserving reconstruction quality.

17.
Heart sounds are characteristic indicators of cardiovascular health. The objective of this work is to explore the relationship between the wavelet transform and the noise characteristics of heart sounds, and the suitability of bispectrum estimation for classifying them. Because the wavelet transform has multi-scale, multi-resolution properties, heart sound signals spanning different frequency ranges are decomposed and displayed at the different scales of the wavelet representation, and, based on the frequency distribution of heart sound signals, interference components can be eliminated by selecting the reconstruction coefficients. Comparing the denoising performance of four wavelets (haar, db6, sym8 and coif6) shows that db6 achieves the best denoising of heart sound signals, and comparing decompositions with different numbers of levels shows that a five-level db6 decomposition performs best. In practice, the db6 wavelet also denoised 51 clinical heart sound signals well. Furthermore, in a clinical analysis of 29 normal signals from healthy subjects and 22 abnormal signals from patients with coronary heart disease, applying bispectrum estimation via an ARMA coefficient model to the denoised signals reliably distinguished abnormal from normal signals.

18.
In this paper, we introduce a model-based Bayesian denoising framework for phonocardiogram (PCG) signals. The denoising framework is founded on a new dynamical model for PCG, which is capable of generating realistic synthetic PCG signals. The introduced dynamical model is based on PCG morphology, is inspired by the electrocardiogram (ECG) dynamical model proposed by McSharry et al., and can represent various morphologies of normal PCG signals. The extended Kalman smoother (EKS) is the Bayesian filter that is used in this study. In order to facilitate the adaptation of the denoising framework to each input PCG signal, the parameters are selected automatically from the input signal itself. This approach is evaluated on several PCGs recorded on healthy subjects, while artificial white Gaussian noise is added to each signal, and the SNR and morphology of the outputs of the proposed denoising approach are compared with the outputs of the wavelet denoising (WD) method. The results of the EKS demonstrate better performance than WD over a wide range of PCG SNRs. The new PCG dynamical model can also be employed to develop other model-based processing frameworks such as heart sound segmentation and compression.

19.
This paper discusses the suitability, in terms of noise reduction, of various methods which can be applied to an image type often used in radiation therapy: the portal image. Among these methods, the analysis focuses on those operating in the wavelet domain. Wavelet-based methods tested on natural images – such as the thresholding of the wavelet coefficients, the minimization of the Stein unbiased risk estimator on a linear expansion of thresholds (SURE-LET), and the Bayes least-squares method using as a prior a Gaussian scale mixture (BLS-GSM method) – are compared with other methods that operate on the image domain – an adaptive Wiener filter and a nonlocal mean filter (NLM). For the assessment of the performance, the peak signal-to-noise ratio (PSNR), the structural similarity index (SSIM), the Pearson correlation coefficient, and the Spearman rank correlation (ρ) coefficient are used. The performance of the wavelet filters and the NLM method are similar, but wavelet filters outperform the Wiener filter in terms of portal image denoising. It is shown how BLS-GSM and NLM filters produce the smoothest image, while keeping soft-tissue and bone contrast. As for the computational cost, filters using a decimated wavelet transform (decimated thresholding and SURE-LET) turn out to be the most efficient, with calculation times around 1 s.
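Of the quality metrics listed, PSNR is the simplest to state; a minimal implementation is sketched below (assuming 8-bit images, so a peak value of 255):

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between reference and test images:
    10 * log10(peak^2 / MSE), infinite for identical images."""
    ref = np.asarray(reference, dtype=float)
    tst = np.asarray(test, dtype=float)
    mse = np.mean((ref - tst) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = np.full((8, 8), 100.0)
noisy = ref + 10.0                # uniform error of 10 grey levels -> MSE = 100
value = psnr(ref, noisy)
```

SSIM and the rank-correlation measures used in the paper additionally capture structural and monotonic agreement that a pixelwise MSE-based score like PSNR misses.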

20.
This paper presents a new ECG denoising approach based on noise reduction algorithms in empirical mode decomposition (EMD) and discrete wavelet transform (DWT) domains. Unlike the conventional EMD based ECG denoising approaches that neglect a number of initial intrinsic mode functions (IMFs) containing the QRS complex as well as noise, we propose to perform windowing in the EMD domain in order to reduce the noise from the initial IMFs instead of discarding them completely thus preserving the QRS complex and yielding a relatively cleaner ECG signal. The signal thus obtained is transformed in the DWT domain, where an adaptive soft thresholding based noise reduction algorithm is employed considering the advantageous properties of the DWT compared to that of the EMD in preserving the energy in the presence of noise and in reconstructing the original ECG signal with a better time resolution. Extensive simulations are carried out using the MIT-BIH arrythmia database and the performance of the proposed method is evaluated in terms of several standard metrics. The simulation results show that the proposed method is able to reduce noise from the noisy ECG signals more accurately and consistently in comparison to some of the stateof-the-art methods.  相似文献   
