1.
Rather than discarding motor unit potential trains (MUPTs) that fail to meet 100% validity criteria, we describe and evaluate a novel editing routine that preserves valid discharge times by decreasing shape variability (variance ratio, VR) within a MUPT. The error filtered estimation (EFE) algorithm is then applied to the remaining ‘high confidence’ discharge times to estimate inter-discharge interval (IDI) statistics. Decomposed surface EMG data from the flexor carpi radialis, recorded from 20 participants during 60% MVC wrist flexion, were used. Two levels of denoising criteria (relaxed and strict) were applied to remove MUPs, decreasing the VR and increasing the signal-to-noise ratio (SNR) of a MUPT. Overall, VR decreased by 24.88% and SNR increased by 6.0% (p’s < 0.05). The MUP template peak-to-peak (P-P) amplitude and P-P duration depended on the level of denoising (p’s < 0.05). The standard error of the estimate (SEE) of the mean IDI before and after editing with the relaxed criteria was very similar (3.2% versus 3.69%; p > 0.05). The same was true between denoising criteria: the SEE increased only to 5.14% for the strict criteria (p > 0.05). Editing the MUPTs thus significantly decreased MUP shape variability and the measures extracted from the MUP templates, with trivial differences in the SEE of the mean IDI between edited and unedited MUPTs.
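As a rough illustration of the editing idea, here is a minimal NumPy sketch. The function names, the simplified VR surrogate, and the stopping thresholds are our own illustrative choices, not the paper's EFE implementation: MUPs whose removal most reduces shape variability are dropped iteratively, and IDI statistics are then computed from the surviving discharge times.

```python
import numpy as np

def variance_ratio(mups):
    """Simplified shape-variability surrogate: variance of the aligned MUP
    waveforms about their mean template, relative to total signal variance
    (a stand-in for the paper's VR, not its exact definition)."""
    template = mups.mean(axis=0)
    resid_var = ((mups - template) ** 2).mean()
    return resid_var / mups.var()

def edit_mupt(mups, vr_target=0.2, min_keep=10):
    """mups: (n_discharges, n_samples) array of aligned MUP waveforms.
    Iteratively remove the MUP whose exclusion lowers VR the most, until
    VR <= vr_target or only min_keep discharges remain."""
    keep = list(range(len(mups)))
    while len(keep) > min_keep and variance_ratio(mups[keep]) > vr_target:
        vrs = [variance_ratio(mups[[j for j in keep if j != i]]) for i in keep]
        keep.remove(keep[int(np.argmin(vrs))])  # removal giving the lowest VR
    return np.array(keep)

def idi_stats(times, keep):
    """IDI mean/SD from the remaining 'high confidence' discharge times.
    (EFE itself additionally filters implausible intervals, omitted here.)"""
    idis = np.diff(np.sort(np.asarray(times)[keep]))
    return idis.mean(), idis.std()
```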
2.
The identification of new diagnostic or prognostic biomarkers is one of the main aims of clinical cancer research. In recent years, there has been growing interest in using mass spectrometry for the detection of such biomarkers. The MS signal resulting from MALDI-TOF measurements is contaminated by several sources of technical variation that can be removed in a pre-processing step. In particular, denoising removes the random noise contained in the signal; wavelet methods combined with thresholding are usually used for this purpose. In this study, we adapted to MS data two multivariate denoising methods that combine wavelets and PCA. The objective was to denoise the data more effectively, so as to extract the meaningful proteomic biological information from the raw spectra and reach sound clinical conclusions. The proposed methods were evaluated and compared with the classical soft-thresholding denoising method using both real and simulated data sets. Taking into account structure shared across signals, by adding a PCA dimension-reduction step on the approximation coefficients, provided more effective denoising when combined with soft thresholding of the detail coefficients.
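A sketch of the combined scheme, assuming PyWavelets and scikit-learn are available; the wavelet, decomposition level, component count and universal threshold below are illustrative placeholders, not the paper's settings:

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

def multivariate_denoise(spectra, wavelet="sym8", level=5, n_components=3):
    """spectra: (n_spectra, n_points) array of MS signals on a common m/z axis."""
    coeffs = [pywt.wavedec(s, wavelet, level=level) for s in spectra]

    # Dimension reduction on the approximation coefficients via PCA,
    # exploiting structure shared across the spectra.
    A = np.vstack([c[0] for c in coeffs])
    pca = PCA(n_components=n_components)
    A_red = pca.inverse_transform(pca.fit_transform(A))

    denoised = []
    for i, c in enumerate(coeffs):
        new_c = [A_red[i]]
        for d in c[1:]:
            # Universal soft threshold per detail level (one common choice).
            sigma = np.median(np.abs(d)) / 0.6745
            thr = sigma * np.sqrt(2 * np.log(len(d)))
            new_c.append(pywt.threshold(d, thr, mode="soft"))
        denoised.append(pywt.waverec(new_c, wavelet)[: spectra.shape[1]])
    return np.array(denoised)
```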
3.

Background

The popularity of new sequencing technologies has led to an explosion of possible applications, including new approaches in biodiversity studies. However, each of these sequencing technologies suffers from sequencing errors originating from different factors. For 16S rRNA metagenomics studies, 454 pyrosequencing is one of the most frequently used platforms, but sequencing errors still cause important data analysis issues (e.g., in clustering into taxonomic units and in biodiversity estimation). Moreover, retaining a larger portion of the sequencing data, by preserving as much of the read length as possible while keeping the error rate within an acceptable range, has important consequences for taxonomic precision.

Results

The new error correction algorithm proposed in this work - NoDe (Noise Detector) - is trained to identify positions in 454 sequencing reads that are likely to contain an error, and subsequently clusters those error-prone reads with correct reads, resulting in an error-free representative read. A benchmarking study with other denoising algorithms shows that NoDe can detect up to 75% more errors in a large-scale mock community dataset, at a low computational cost compared to the second-best algorithm considered in this study. The positive effect of NoDe in 16S rRNA studies was confirmed by the improved precision of the clustering of pyrosequencing reads into operational taxonomic units.
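The flag-and-cluster idea can be caricatured in a few lines of Python. This is a toy sketch under strong assumptions: NoDe's trained classifier, its flowgram-derived features and its clustering strategy are far more elaborate, and the per-read error flags are assumed to be given.

```python
def hamming(a, b):
    """Mismatch count over the shared prefix plus the length difference."""
    m = min(len(a), len(b))
    return sum(x != y for x, y in zip(a[:m], b[:m])) + abs(len(a) - len(b))

def flag_and_cluster(reads, is_error_prone, max_dist=3):
    """reads: list of sequences; is_error_prone: parallel list of booleans
    (assumed to come from a trained error classifier, as in NoDe).
    Returns {representative_read: [member reads]}."""
    clean = [r for r, e in zip(reads, is_error_prone) if not e]
    clusters = {r: [r] for r in clean}
    for r, e in zip(reads, is_error_prone):
        if not e or not clean:
            continue
        rep = min(clean, key=lambda c: hamming(r, c))
        if hamming(r, rep) <= max_dist:
            clusters[rep].append(r)  # absorb the error-prone read
    return clusters
```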

Conclusions

NoDe was shown to be a computationally efficient denoising algorithm for pyrosequencing reads, producing the lowest error rates in an extensive benchmarking study against other denoising algorithms.

Electronic supplementary material

The online version of this article (doi:10.1186/s12859-015-0520-5) contains supplementary material, which is available to authorized users.
4.
One of the most commonly used methods for protein separation is 2-DE. After 2-DE gel scanning, images emerge with a plethora of spot features that are usually contaminated by inherent noise. The objective of denoising is to remove noise to the extent that the true spots are recovered correctly and accurately, i.e., without introducing distortions that lead to the detection of false spot features. In this paper we propose and justify the use of the contourlet transform as a tool for denoising 2-DE gel images. We compare its effectiveness with state-of-the-art methods such as wavelet-based multiresolution image analysis and spatial filtering. We show that contourlets not only achieve better average S/N performance than wavelets and spatial filters, but also preserve spot boundaries and faint spots better and alter the intensities of informative spot features less, leading to more accurate spot volume estimation and more reliable spot detection - operations that are essential to differential expression proteomics for biomarker discovery.
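Since contourlets have no standard Python implementation, the sketch below instead shows the wavelet-based multiresolution baseline that the paper compares against, using PyWavelets; the wavelet and level are illustrative choices.

```python
import numpy as np
import pywt

def denoise_gel_image(img, wavelet="db4", level=3):
    """2-D wavelet denoising of a scanned gel image (the baseline the paper
    benchmarks contourlets against). img: 2-D float array."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # Noise scale from the finest diagonal subband (robust MAD estimate).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(img.size))
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
        for detail in coeffs[1:]
    ]
    rec = pywt.waverec2(new_coeffs, wavelet)
    return rec[: img.shape[0], : img.shape[1]]
```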
5.

Background

Reducing the effects of sequencing errors and PCR artifacts has emerged as an essential component in amplicon-based metagenomic studies. Denoising algorithms have been designed that can reduce error rates in mock community data, but they change the sequence data in a manner that can be inconsistent with the process of removing errors in studies of real communities. In addition, they are limited by the size of the dataset and the sequencing technology used.

Results

FlowClus uses a systematic approach to filter and denoise reads efficiently. When denoising real datasets, FlowClus provides feedback about the process that can be used to adjust the parameters of the algorithm to suit the particular dataset. When used to analyze a mock community dataset, FlowClus produced a lower error rate than other denoising algorithms while retaining significantly more sequence information. Among its other attributes, FlowClus can analyze longer reads generated at all stages of 454 sequencing technology, as well as reads from Ion Torrent. It has processed a large dataset of 2.2 million GS-FLX Titanium reads in twelve hours; with its more efficient (but less precise) trie analysis option, this time was further reduced to seven minutes.
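The trie option can be illustrated with a toy Python dereplication step. FlowClus itself is written in C and operates on flowgram values; this sketch only conveys the trie idea that identical reads collapse onto a single path.

```python
def build_trie(reads):
    """Insert reads into a nested-dict trie; '$' marks read ends with counts."""
    root = {}
    for r in reads:
        node = root
        for base in r:
            node = node.setdefault(base, {})
        node["$"] = node.get("$", 0) + 1
    return root

def dereplicate(reads):
    """Collapse exact duplicates via the trie: return (read, count) pairs."""
    out, stack = [], [(build_trie(reads), "")]
    while stack:
        node, prefix = stack.pop()
        for k, v in node.items():
            if k == "$":
                out.append((prefix, v))
            else:
                stack.append((v, prefix + k))
    return out
```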

Conclusions

Many of the amplicon-based metagenomics datasets generated over the last several years have been processed through a denoising pipeline that likely caused deleterious effects on the raw data. By using FlowClus, one can avoid such negative outcomes while maintaining control over the filtering and denoising processes. Because of its efficiency, FlowClus can be used to re-analyze multiple large datasets together, thereby leading to more standardized conclusions. FlowClus is freely available on GitHub (jsh58/FlowClus); it is written in C and supported on Linux.

Electronic supplementary material

The online version of this article (doi:10.1186/s12859-015-0532-1) contains supplementary material, which is available to authorized users.
6.
In this paper, a new filtering method is presented to remove Rician noise from magnetic resonance images (MRI) acquired with a single-coil MRI acquisition system. The filter is based on a nonlocal neutrosophic set (NLNS) approach to Wiener filtering. A neutrosophic set (NS), part of neutrosophy theory, studies the origin, nature, and scope of neutralities, as well as their interactions with different ideational spectra. Here, we carry the neutrosophic set into the image domain and define some concepts and operators for image denoising. First, the nonlocal means filter is applied to the noisy MRI. The resulting image is transformed into the NS domain, described by three membership sets: true (T), indeterminacy (I) and false (F). The entropy of the neutrosophic set is defined and used to measure the indeterminacy. The ω-Wiener filtering operation is applied to T and F to decrease the set indeterminacy and remove the noise. Experiments were conducted on simulated MR images from the BrainWeb database and on clinical MR images. The results show that the NLNS Wiener filter produces better denoising results, in both qualitative and quantitative terms, than other denoising methods such as the classical Wiener filter, the anisotropic diffusion filter, total variation minimization and the nonlocal means filter. The visual and diagnostic quality of the denoised image is well preserved.
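A two-stage stand-in for the NLM-then-Wiener backbone can be sketched with scikit-image and SciPy. This omits the neutrosophic T/I/F transformation and the entropy-based indeterminacy handling that define the actual NLNS filter, and treats the noise as approximately Gaussian rather than Rician.

```python
from scipy.signal import wiener
from skimage.restoration import denoise_nl_means, estimate_sigma

def nlm_then_wiener(mri_slice, patch_size=5, patch_distance=6, wiener_size=5):
    """mri_slice: 2-D float array scaled to [0, 1]. Applies non-local means
    and then Wiener filtering, the two building blocks the NLNS filter
    combines (without the neutrosophic indeterminacy weighting)."""
    sigma = float(estimate_sigma(mri_slice))
    nlm = denoise_nl_means(
        mri_slice,
        patch_size=patch_size,
        patch_distance=patch_distance,
        h=1.15 * sigma,  # filtering strength tied to the noise estimate
        sigma=sigma,
        fast_mode=True,
    )
    return wiener(nlm, mysize=wiener_size)
```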
7.
8.
Electrocardiography (ECG) signals are often contaminated by various kinds of noise or artifacts, for example, morphological changes due to motion artifact, or non-stationary noise due to muscular contraction (EMG). Some of these contaminations severely limit the usefulness of ECG signals, especially when computer-aided algorithms are utilized. In this paper, a novel ECG enhancement algorithm is proposed based on sparse derivatives. By solving a convex ℓ1 optimization problem, artifacts are reduced by modeling the clean ECG signal as a sum of two signals whose second- and third-order derivatives (differences) are sparse, respectively. The algorithm was applied to a QRS detection system and validated on the MIT-BIH Arrhythmia database (109,452 annotations), yielding a sensitivity of Se = 99.87% and a positive predictivity of +P = 99.88%.
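The stated convex model translates almost directly into CVXPY. This is a sketch: the regularization weights lam2 and lam3 are illustrative placeholders, not the paper's tuned values.

```python
import cvxpy as cp
import numpy as np

def sparse_derivative_denoise(y, lam2=0.5, lam3=0.5):
    """Estimate a clean ECG from noisy samples y by solving
        min 0.5*||y - x1 - x2||_2^2 + lam2*||D2 x1||_1 + lam3*||D3 x2||_1,
    i.e. the sum-of-two-signals model described in the abstract, where D2
    and D3 are second- and third-order difference operators."""
    n = len(y)
    x1, x2 = cp.Variable(n), cp.Variable(n)
    cost = (0.5 * cp.sum_squares(y - x1 - x2)
            + lam2 * cp.norm1(cp.diff(x1, k=2))
            + lam3 * cp.norm1(cp.diff(x2, k=3)))
    cp.Problem(cp.Minimize(cost)).solve()
    return np.asarray(x1.value) + np.asarray(x2.value)
```

The denoised estimate x1 + x2 could then be passed to a standard QRS detector, mirroring the validation setup described above.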
9.
Positron emission tomography (PET) images have been incorporated into the radiotherapy process as a powerful tool to assist in the contouring of lesions, leading to a broad spectrum of automatic segmentation schemes for PET images (PET-AS). However, not all proposed PET-AS algorithms take the preceding image-preparation steps into consideration, and PET image noise has been shown to be one of the most relevant factors affecting segmentation. This study demonstrates a nonlinear filtering method based on spatially adaptive wavelet shrinkage with three-dimensional context modelling, which considers the correlation of each voxel with its neighbours. This noise reduction method yields excellent edge-preservation properties. To evaluate the filter's influence on segmentation, it was compared with a set of Gaussian filters (the most conventional choice) and with two previously optimised edge-preserving filters. Five segmentation schemes, among those most commonly implemented in commercial software, were used: fixed thresholding, adaptive thresholding, watershed, adaptive region growing and affinity propagation clustering. Segmentation results were evaluated using the Dice similarity coefficient and the classification error. A simple metric based on the average edge width was also included to better characterise the blurring induced by each filter. The proposed noise reduction procedure improved segmentation across all tested settings and proved more stable under low-contrast, high-noise conditions. The denoising scheme used thus reinforces the capacity of the segmentation method.
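A non-adaptive 3-D shrinkage baseline plus the Dice metric can be sketched with PyWavelets and NumPy. The paper's filter additionally adapts the threshold per voxel through 3-D context modelling, which this sketch omits; the wavelet, level and threshold are illustrative.

```python
import numpy as np
import pywt

def shrink_3d(volume, wavelet="db2", level=2):
    """Plain (non-adaptive) 3-D wavelet soft shrinkage of a PET volume."""
    coeffs = pywt.wavedecn(volume, wavelet, level=level)
    # Noise scale from the finest detail subbands (robust MAD estimate).
    finest = np.concatenate([v.ravel() for v in coeffs[-1].values()])
    sigma = np.median(np.abs(finest)) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(volume.size))
    new = [coeffs[0]] + [
        {k: pywt.threshold(v, thr, mode="soft") for k, v in d.items()}
        for d in coeffs[1:]
    ]
    rec = pywt.waverecn(new, wavelet)
    return rec[tuple(slice(s) for s in volume.shape)]

def dice(seg, ref):
    """Dice similarity coefficient between two binary masks."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    return 2.0 * np.logical_and(seg, ref).sum() / (seg.sum() + ref.sum())
```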
10.
Denoising array-based comparative genomic hybridization data using wavelets
Array-based comparative genomic hybridization (array-CGH) provides a high-throughput, high-resolution method for measuring relative changes in DNA copy number simultaneously at thousands of genomic loci. Typically, these measurements are reported and displayed linearly on chromosome maps, and gains and losses are detected as deviations from normal diploid cells. We propose denoising the data to uncover the true copy-number changes before drawing inferences about the patterns of aberrations in the samples. Nonparametric techniques are particularly suitable for denoising because they do not impose a parametric model when finding structure in the data. In this paper, we employ wavelets to denoise the data: wavelets have sound theoretical properties and a fast computational algorithm, and they are particularly well suited to handling the abrupt changes seen in array-CGH data. A simulation study shows that denoising the data prior to testing achieves greater power in detecting aberrant spots than using the raw data. Finally, we illustrate the method on two array-CGH data sets.
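A minimal sketch with PyWavelets; the Haar choice, decomposition level and universal threshold are illustrative, though the paper motivates wavelets precisely by such abrupt, piecewise-constant changes.

```python
import numpy as np
import pywt

def denoise_cgh(log2_ratios, wavelet="haar", level=4):
    """Denoise one chromosome's log2-ratio profile; Haar suits the
    piecewise-constant jumps expected at copy-number breakpoints."""
    coeffs = pywt.wavedec(log2_ratios, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(log2_ratios)))
    coeffs[1:] = [pywt.threshold(d, thr, mode="soft") for d in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(log2_ratios)]

# Gains and losses can then be called as deviations of the denoised
# profile from 0 (normal diploid), e.g. np.where(np.abs(denoised) > cutoff).
```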