Similar Literature
20 similar documents found (search time: 31 ms)
1.
The experimental process of collecting images from macromolecules in an electron microscope is such that it does not allow for prior specification of the angular distribution of the projection images. As a consequence, an uneven distribution of projection directions may occur. Concerns have been raised recently about the behavior of 3D reconstruction algorithms for the case of unevenly distributed projections. It has been illustrated on experimental data that in the case of a heavily uneven distribution of projection directions some algorithms tend to elongate the reconstructed volumes along the overloaded direction so much as to make a quantitative biological analysis impossible. In response to these concerns we have developed a strategy for quantitative comparison and optimization of 3D reconstruction algorithms. We apply this strategy to quantitatively analyze algebraic reconstruction techniques (ART) with blobs, simultaneous iterative reconstruction techniques (SIRT) with voxels, and weighted backprojection (WBP). We show that the elongation artifacts that had been previously reported can be strongly reduced. With our specific choices for the free parameters of the three algorithms, WBP reconstructions tend to be inferior to those obtained with either SIRT or ART, and the results obtained with ART are comparable to those with SIRT, but at a very small fraction of the computational cost of SIRT.

2.
The Filtered Back-Projection (FBP) algorithm and its modified versions are the most important techniques for CT (computerized tomography) reconstruction; however, they may produce aliasing degradation in the reconstructed images due to projection discretization. General iterative reconstruction (IR) algorithms, on the other hand, suffer from a heavy computational burden and other drawbacks. In this paper, an iterative FBP approach is proposed to reduce the aliasing degradation. In the approach, the image reconstructed by the FBP algorithm is treated as an intermediate image and projected along the original projection directions to produce reprojection data. The difference between the original and reprojection data is filtered by a special digital filter and then reconstructed by FBP to produce a correction term. The correction term is added to the intermediate image to update it. This procedure can be performed iteratively to gradually improve the reconstruction until a certain stopping criterion is satisfied. Simulations and tests on real data show that the proposed approach is better than the FBP algorithm and some IR algorithms in terms of general image-quality criteria. The calculation burden is several times that of FBP, which is much less than that of general IR algorithms and acceptable in most situations. Therefore, the proposed algorithm has potential applications in practical CT systems.
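The reproject-correct-update loop described above can be sketched in a few lines. The following is a minimal, hedged illustration that uses scikit-image's parallel-beam radon/iradon as stand-ins for the projector and the FBP operator; the scalar relaxation factor relax merely stands in for the paper's special digital filter, and the fixed iteration count replaces its stopping criterion.

import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, resize

# Simulated parallel-beam data: 60 views of a 128 x 128 phantom.
phantom = resize(shepp_logan_phantom(), (128, 128))
theta = np.linspace(0.0, 180.0, 60, endpoint=False)
sinogram = radon(phantom, theta=theta)

# Plain FBP gives the intermediate image.
recon = iradon(sinogram, theta=theta)

# Iterative FBP: reproject the intermediate image, weight the residual
# sinogram, reconstruct a correction term with FBP, and update.
relax = 0.5                                   # hypothetical stand-in for the special filter
for _ in range(10):                           # fixed iteration count instead of a stopping criterion
    reprojection = radon(recon, theta=theta)
    residual = sinogram - reprojection        # mismatch in projection space
    correction = iradon(relax * residual, theta=theta)
    recon = recon + correction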

3.
Projection and back-projection are the most computationally intensive parts of computed tomography (CT) reconstruction and are essential to the acceleration of CT reconstruction algorithms. Compared to back-projection, parallelization efficiency in projection is highly limited by race conditions and the lack of thread synchronization. In this paper, a strategy of Fixed Sampling Number Projection (FSNP) is proposed to ensure operation synchronization in ray-driven projection on the Graphical Processing Unit (GPU). Texture fetching is also utilized to further accelerate the interpolations in both projection and back-projection. We validate the performance of the FSNP approach using both simulated and real cone-beam CT data. Experimental results show that the proposed FSNP method together with texture fetching is 10-16 times faster than the conventional approach based on global memory, and thus leads to more efficient iterative algorithms for CT reconstruction.
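The core idea, every ray taking the same fixed number of samples so that all GPU threads do identical work, can be illustrated with a simplified CPU sketch. The function below is a hypothetical parallel-beam projector in which bilinear interpolation stands in for texture fetching; the function name and parameter values are illustrative, not the authors' implementation.

import numpy as np

def fsnp_forward_project(image, angles, n_det, n_samples=256):
    # Ray-driven projection with a fixed sampling number per ray (FSNP-style):
    # every ray evaluates exactly n_samples interpolated points, so all rays
    # perform identical work.  Bilinear interpolation replaces GPU texture
    # fetching in this CPU sketch.
    ny, nx = image.shape
    cx, cy = (nx - 1) / 2.0, (ny - 1) / 2.0
    ts = np.linspace(-max(nx, ny) / 2.0, max(nx, ny) / 2.0, n_samples)
    dets = np.linspace(-n_det / 2.0, n_det / 2.0, n_det)
    sino = np.zeros((len(angles), n_det))
    for ia, a in enumerate(angles):
        ca, sa = np.cos(a), np.sin(a)
        for idet, s in enumerate(dets):
            # sample positions along this ray (same count for every ray)
            xs = np.clip(cx + s * ca - ts * sa, 0, nx - 2)
            ys = np.clip(cy + s * sa + ts * ca, 0, ny - 2)
            x0, y0 = xs.astype(int), ys.astype(int)
            fx, fy = xs - x0, ys - y0
            vals = (image[y0, x0] * (1 - fx) * (1 - fy)
                    + image[y0, x0 + 1] * fx * (1 - fy)
                    + image[y0 + 1, x0] * (1 - fx) * fy
                    + image[y0 + 1, x0 + 1] * fx * fy)
            sino[ia, idet] = vals.sum() * (ts[1] - ts[0])
    return sino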

4.
In practical applications of computed tomography (CT) imaging, it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesirable. These issues demand that high-quality CT images be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accuracy of the reconstruction also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate the elements of this matrix, and we present results based on real projection data.
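Siddon's method fills one row of the system matrix per ray by computing the exact intersection length of that ray with every pixel it crosses. The following is a minimal 2D sketch of the parametric idea; it assumes unit-spaced pixels on a grid spanning [0, nx] x [0, ny] and ray endpoints p0, p1 given in that coordinate system, and all names are illustrative rather than taken from the paper.

import numpy as np

def siddon_row(p0, p1, nx, ny):
    # One row of the system matrix: (pixel_index, intersection_length) pairs
    # for the ray segment p0 -> p1 crossing a unit-spaced nx x ny pixel grid.
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0
    d = np.where(np.abs(d) < 1e-12, 1e-12, d)        # avoid division by zero for axis-aligned rays
    # parametric values at which the ray crosses each vertical and horizontal grid line
    ax = (np.arange(nx + 1) - p0[0]) / d[0]
    ay = (np.arange(ny + 1) - p0[1]) / d[1]
    alphas = np.concatenate(([0.0, 1.0], ax, ay))
    alphas = np.unique(alphas[(alphas >= 0.0) & (alphas <= 1.0)])   # sorted crossings inside the segment
    ray_len = np.linalg.norm(p1 - p0)
    row = []
    for a0, a1 in zip(alphas[:-1], alphas[1:]):
        mid = p0 + 0.5 * (a0 + a1) * d                # midpoint of the sub-segment locates its pixel
        i, j = int(np.floor(mid[0])), int(np.floor(mid[1]))
        if 0 <= i < nx and 0 <= j < ny:
            row.append((j * nx + i, (a1 - a0) * ray_len))
    return row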

5.
In single photon emission computed tomography (SPECT), accurate attenuation maps are needed to perform the attenuation compensation that is essential for high-quality radioactivity estimation. Formulating the SPECT activity and attenuation reconstruction tasks as coupled signal estimation and system parameter identification problems, where the activity distribution and the attenuation parameters are treated as random variables with known prior statistics, we present a nonlinear dual reconstruction scheme based on unscented Kalman filtering (UKF) principles. In this framework, the dynamic changes of the organ radioactivity distribution are described through state-space evolution equations, while the photon-counting SPECT projection data are measured through the observation equations. The activity distribution is first estimated with sub-optimal fixed attenuation parameters, followed by attenuation map reconstruction given these activity estimates. These coupled estimation processes are repeated iteratively until convergence. The results obtained from Monte Carlo simulated data, a physical phantom, and real SPECT scans demonstrate the improved performance of the proposed method, both by visual inspection of the images and by quantitative evaluation, compared to the widely used EM-ML algorithms. The dual estimation framework has the potential to be useful for estimating the attenuation map from emission data only and thus to benefit radioactivity reconstruction.

6.
In this work we propose a reconstruction algorithm (ART with blobs) that has not previously been used in electron tomography, and we compare it with the standard method in the field (weighted back-projection, WBP). We assume that only a limited set of very noisy images, collected around a single tilt axis, is available, which is a typical situation in electron tomography. In general, the reconstruction problem is underdetermined (due to the limited number of projections) and the data are inconsistent (due to the high level of noise). The evaluation of the results is performed in a rigorous way by a task-oriented approach that makes use of numerical observers. ART with blobs outperforms WBP for a number of key tasks. Results are presented both for simplified line-integral data and for realistic simulations of macromolecular structures embedded in amorphous ice.

7.
The large amount of image data necessary for high-resolution 3D reconstruction of macromolecular assemblies leads to significant increases in computational time. One of the most time-consuming operations is 3D density map reconstruction, and software optimization can greatly reduce the time required for any given structural study. The majority of algorithms proposed for improving the computational effectiveness of 3D reconstruction are based on a ray-by-ray projection of each image into the reconstructed volume. In this paper, we propose a novel fast implementation of the "filtered back-projection" algorithm based on a voxel-by-voxel principle. Our implementation has been exhaustively tested using both model and real data. We compared 3D reconstructions obtained by the new approach with results obtained by the filtered back-projection algorithm and the Fourier-Bessel algorithm commonly used for reconstructing icosahedral viruses. These computational experiments demonstrate the robustness, reliability, and efficiency of the approach.
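A voxel-by-voxel (gather) back-projection loops over output elements rather than over rays, which avoids write conflicts between rays. The following 2D, parallel-beam sketch illustrates the principle only; it assumes the sinogram has already been ramp-filtered and is not the authors' 3D implementation.

import numpy as np

def pixel_driven_backprojection(sinogram, angles, size):
    # Voxel-driven (here: pixel-driven) back-projection: for every output
    # pixel and every view, gather the filtered projection value at the
    # detector coordinate onto which the pixel maps (linear interpolation).
    n_views, n_det = sinogram.shape
    ys, xs = np.mgrid[0:size, 0:size]
    xs = xs - (size - 1) / 2.0
    ys = ys - (size - 1) / 2.0
    det = np.arange(n_det) - (n_det - 1) / 2.0
    recon = np.zeros((size, size))
    for a, proj in zip(angles, sinogram):
        t = xs * np.cos(a) + ys * np.sin(a)          # detector coordinate of each pixel
        recon += np.interp(t, det, proj, left=0.0, right=0.0)
    return recon * np.pi / (2.0 * n_views)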

8.
Liu H, Wang S, Gao F, Tian Y, Chen W, Hu Z, Shi P. PLoS ONE 2012, 7(3): e32224
In Positron Emission Tomography (PET), an optimal estimate of the radioactivity concentration is obtained from the measured emission data under certain criteria. So far, all the well-known statistical reconstruction algorithms require an exactly known system probability matrix a priori, and the quality of this system model largely determines the quality of the reconstructed images. In this paper, we propose an algorithm for PET image reconstruction for the real-world case where the PET system model is subject to uncertainties. The method casts PET reconstruction as a regularization problem, and the image estimate is obtained by means of an uncertainty-weighted least-squares framework. The performance of our method is evaluated with simulated Shepp-Logan data and real phantom data, which demonstrate significant improvements in image quality over least-squares reconstruction.
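When the uncertainty weights are diagonal, a weighted least-squares estimate has a simple closed form. The sketch below illustrates the idea with a generic Tikhonov term; the weight vector, the regularization parameter beta, and the direct normal-equations solve are illustrative assumptions rather than the paper's exact formulation.

import numpy as np

def uwls_reconstruct(y, G, weights, beta=1e-2):
    # Solve  x = argmin (y - G x)^T W (y - G x) + beta ||x||^2
    # via the normal equations.  W = diag(weights) encodes confidence in each
    # measurement / system-matrix row; beta is a hypothetical Tikhonov weight.
    W = np.diag(weights)
    A = G.T @ W @ G + beta * np.eye(G.shape[1])
    b = G.T @ W @ y
    return np.linalg.solve(A, b)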

9.
Phylogenetic mixtures model the inhomogeneous molecular evolution commonly observed in data. The performance of phylogenetic reconstruction methods where the underlying data are generated by a mixture model has stimulated considerable recent debate. Much of the controversy stems from simulations of mixture model data on a given tree topology for which reconstruction algorithms output a tree of a different topology; these findings were held up to show the shortcomings of particular tree reconstruction methods. In so doing, the underlying assumption was that mixture model data on one topology can be distinguished from data evolved on an unmixed tree of another topology given enough data and the "correct" method. Here we show that this assumption can be false. For biologists, our results imply that, for example, the combined data from two genes whose phylogenetic trees differ only in terms of branch lengths can perfectly fit a tree of a different topology.

10.
A new algorithm for three-dimensional reconstruction from randomly oriented projections has been developed. The algorithm recovers the 3D Radon transform from the 2D Radon transforms (sinograms) of the projections. The structure in direct space is obtained by an inversion of the 3D Radon transform. The mathematical properties of the Radon transform are exploited to design a special filter that can be used to correct inconsistencies in a data set and to fill the gaps in the Radon transform that originate from missing projections. Several versions of the algorithm have been implemented, with and without a filter and with different interpolation methods for merging the sinograms into the 3D Radon transform. The algorithms have been tested on analytical phantoms and experimental data and have been compared with a weighted back-projection (WBP) algorithm. A quantitative analysis of phantoms reconstructed from noise-free and noise-corrupted projections shows that the new algorithms are more accurate than WBP when the number of projections is small. Experimental structures obtained by the new methods are strictly comparable to those obtained by WBP. Moreover, the algorithm is more than 10 times faster than WBP when applied to a data set of 1000-5000 projections.

11.
MOTIVATION: Cluster analysis of genome-wide expression data from DNA microarray hybridization studies has proved to be a useful tool for identifying biologically relevant groupings of genes and samples. In the present paper, we focus on several important issues related to clustering algorithms that have not yet been fully studied. RESULTS: We describe a simple and robust algorithm for the clustering of temporal gene expression profiles that is based on the simulated annealing procedure. In general, this algorithm is guaranteed to eventually find the globally optimal distribution of genes over clusters. We introduce an iterative scheme that serves to evaluate quantitatively the optimal number of clusters for each specific data set. The scheme is based on standard approaches used in regular statistical tests. The basic idea is to organize the search for the optimal number of clusters simultaneously with the optimization of the distribution of genes over clusters. The efficiency of the proposed algorithm has been evaluated by means of a reverse engineering experiment, that is, a situation in which the correct distribution of genes over clusters is known a priori. This statistically rigorous test has shown that our algorithm places more than 90% of genes into the correct clusters. Finally, the algorithm has been tested on real gene expression data (expression changes during the yeast cell cycle), for which the fundamental patterns of gene expression and the assignment of genes to clusters are well understood from numerous previous studies.
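A simulated-annealing assignment of genes to clusters can be sketched compactly: propose moving a single gene to another cluster and accept the move with the Metropolis rule applied to the within-cluster sum of squared distances. The temperature schedule, the cost function, and the parameter values below are generic illustrative choices, not the authors' exact procedure.

import numpy as np

def sa_cluster(profiles, k, n_steps=20000, t0=1.0, cooling=0.9995, seed=0):
    # profiles: (n_genes, n_timepoints) array of expression profiles.
    # Returns one cluster label per gene found by simulated annealing.
    rng = np.random.default_rng(seed)
    n = len(profiles)
    labels = rng.integers(0, k, size=n)

    def cost(lab):
        # within-cluster sum of squared distances to the cluster mean
        c = 0.0
        for j in range(k):
            members = profiles[lab == j]
            if len(members):
                c += ((members - members.mean(axis=0)) ** 2).sum()
        return c

    current, t = cost(labels), t0
    for _ in range(n_steps):
        g = rng.integers(n)                    # pick one gene ...
        new = rng.integers(k)                  # ... and a candidate cluster
        if new == labels[g]:
            continue
        cand = labels.copy()
        cand[g] = new
        c = cost(cand)
        # Metropolis acceptance: always take improvements, sometimes accept worse moves
        if c < current or rng.random() < np.exp((current - c) / t):
            labels, current = cand, c
        t *= cooling                           # cool the temperature
    return labels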

12.
MOTIVATION: Reconstructing evolutionary trees is an important problem in biology. A response to the computational intractability of most of the traditional criteria for inferring evolutionary trees has been a focus on new criteria, particularly quartet-based methods that seek to merge trees derived on subsets of four species from a given species-set into a tree for that entire set. Unfortunately, most of these methods are very sensitive to errors in the reconstruction of the trees for individual quartets of species. A recently developed technique called quartet cleaning can alleviate this difficulty in certain cases by using redundant information in the complete set of quartet topologies for a given species-set to correct such errors. RESULTS: In this paper, we describe two new local vertex quartet cleaning algorithms which have optimal time complexity and error-correction bound, respectively. These are the first known local vertex quartet cleaning algorithms that are optimal with respect to either of these attributes.

13.
Three-dimensional electron microscopy allows direct visualization of biological macromolecules close to their native state. The high impact of this technique in the structural biology field is closely tied to the development of new image-processing algorithms. In order to achieve subnanometer resolution, the size and number of images involved in a three-dimensional reconstruction increase, and so do the computational requirements. New chips integrating multiple processors are hitting the market at reduced cost. This high-integration, low-cost trend has only just begun and is expected to bring real supercomputers to our laboratory desktops in the coming years. This paper proposes a parallel implementation of a computation-intensive algorithm for three-dimensional reconstruction, ART, that takes advantage of the computational power of modern multicore platforms. ART is a sophisticated iterative reconstruction algorithm that has turned out to be well suited to the conditions found in three-dimensional electron microscopy. In view of the performance obtained in this work, these modern platforms are expected to play an important role in facing the future challenges of three-dimensional electron microscopy.

14.
Several methods have been developed to estimate the parental contributions to the genetic pool of an admixed population. Some pairwise comparisons have been performed on real data but, to date, no systematic comparison of a large number of methods has been attempted. In this study, we performed a comparison, based on simulated data, of six of the most cited methods in the literature of the last 20 years. Five of these methods use allele frequencies and differ in the statistical treatment of the data. The last one also considers the degree of molecular divergence by estimating coalescence times. Comparisons are based on the frequency with which each method can be applied, the bias and the mean square error of the estimates, and the frequency with which the true value falls within the confidence interval. Finally, each method was applied to a real data set of variously introgressed honeybee populations. Under optimal conditions (highly differentiated parental populations, recent hybridization event), all methods perform equally well. When conditions are not optimal, the methods perform differently, but no method is always better or worse than all the others. Some guidelines are given for the choice of method.

15.
PURPOSE: Non-local means (NLM) based reconstruction is a promising approach for few-view computed tomography (CT), but it often suffers from over-smoothed image edges. To address this problem, an adaptive NLM reconstruction method based on rotational invariance (ART-RIANLM) is proposed. METHODS: The method consists of four steps: 1) initializing parameters; 2) ART reconstruction from the raw data; 3) positivity constraint on the reconstructed image; 4) image updating by RIANLM filtering. In RIANLM, two rotation-invariant measures, average gradient (AG) and region homogeneity (RH), are proposed to calculate the distance between two patches, and a novel NLM filter is developed to avoid over-smoothed images. Moreover, the parameter h in RIANLM, which controls the decay of the weights, is adapted during reconstruction to avoid over-smoothing, whereas in NLM it remains constant throughout the reconstruction process. The proposed method is validated on two digital phantoms and on real projection data. RESULTS: In our experiments, the search neighborhood is set to 15 × 15 and the similarity window to 3 × 3. For the simulated Shepp-Logan phantom, ART-RIANLM produces reconstructions with higher SNR (36.23 dB vs. 24.00 dB) and lower MAE (0.0006 vs. 0.0024) than ART-NLM. Visual inspection shows that the proposed method suppresses artifacts and noise more effectively and recovers image edges better. The results on real data are consistent with the simulations. CONCLUSIONS: A RIANLM-based reconstruction method for few-view CT is presented. Compared to the traditional ART-NLM method, the SNR of ART-RIANLM increases by 51% and the MAE decreases by 75%.
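The outer loop of such an ART-plus-NLM scheme (ART sweep, positivity constraint, NLM filtering with a decaying filter strength) can be sketched as below. The rotation-invariant patch distance of RIANLM is not reproduced here; scikit-image's denoise_nl_means and the decay schedule for h are stand-ins, and all parameter values are illustrative.

import numpy as np
from skimage.restoration import denoise_nl_means

def art_sweep(x, A, b, relax=0.2):
    # One Kaczmarz (ART) sweep: x is the flattened image, A the system
    # matrix (one row per ray), b the measured projection data.
    for ai, bi in zip(A, b):
        nrm = ai @ ai
        if nrm > 0:
            x = x + relax * (bi - ai @ x) / nrm * ai
    return x

def art_nlm_reconstruct(A, b, shape, n_iter=20, h0=0.08):
    x = np.zeros(int(np.prod(shape)))
    for it in range(n_iter):
        x = art_sweep(x, A, b)                 # step 2: ART update from raw data
        x = np.clip(x, 0.0, None)              # step 3: positivity constraint
        img = x.reshape(shape)
        h = h0 / (1 + it)                      # hypothetical decaying filter strength
        img = denoise_nl_means(img, patch_size=3, patch_distance=7, h=h)   # step 4: NLM filtering
        x = img.ravel()
    return x.reshape(shape)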

16.
MOTIVATION: Grouping genes with similar expression patterns is called gene clustering, which has proved to be a useful tool for extracting the underlying biological information in gene expression data. Many clustering procedures have shown success in microarray gene clustering; most of them belong to the family of heuristic clustering algorithms. Model-based algorithms are an alternative; they are based on the assumption that the whole set of microarray data is a finite mixture of distributions of a certain type with different parameters. Application of model-based algorithms to unsupervised clustering has been reported. Here, for the first time, we demonstrate the use of a model-based algorithm in supervised clustering of microarray data. RESULTS: We applied the proposed methods to real gene expression data and to simulated data. We show that the supervised model-based algorithm is superior to the unsupervised method and to the support vector machine (SVM) method. AVAILABILITY: The program, written in the SAS language and implementing methods I-III of this report, is available upon request. The SVM software is available at http://svm.sdsc.edu/cgi-bin/nph-SVMsubmit.cgi

17.
Treatment of coronary lesions by percutaneous transluminal angioplasty is performed from 2D observations acquired in an optimal view, i.e., a viewpoint showing the segment containing the stenosis in its most extended and unobstructed dimension over a given perimeter around the lesion. In clinical routine, the search for this view generally involves a large number of acquisitions with repeated injections of contrast agent. In this paper, we present an automatic method to determine this optimal view. We assume that a dynamic sequence of the 3D coronary tree is available, either from pre-segmented CT data or by reconstruction of the 3D coronary tree from the projections acquired at each phase of the cardiac cycle. We then project the volume onto the detector plane for all possible gantry orientations. The optimal viewing angle is the one that minimizes the degree of segment foreshortening and the overlap with adjacent segments in projection space.
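The foreshortening criterion can be illustrated by sweeping the gantry angles and comparing the projected length of the segment direction with its true length. The beam-direction parameterization used below for the LAO/RAO and cranial/caudal angles is a simplified assumption, and the overlap criterion from the paper is not modelled.

import numpy as np

def foreshortening_map(seg_dir, lao_angles_deg, cran_angles_deg):
    # Degree of foreshortening of a vessel-segment direction for every gantry
    # orientation: 0 means the segment is seen at full length, 1 means it is
    # viewed end-on.  seg_dir is a 3D direction vector of the segment.
    seg_dir = np.asarray(seg_dir, float)
    seg_dir = seg_dir / np.linalg.norm(seg_dir)
    out = np.zeros((len(lao_angles_deg), len(cran_angles_deg)))
    for i, a in enumerate(np.radians(lao_angles_deg)):
        for j, b in enumerate(np.radians(cran_angles_deg)):
            # hypothetical unit beam direction for this (LAO, cranial) pair
            beam = np.array([np.sin(a) * np.cos(b), -np.sin(b), np.cos(a) * np.cos(b)])
            proj_len = np.linalg.norm(seg_dir - (seg_dir @ beam) * beam)   # length projected onto the detector plane
            out[i, j] = 1.0 - proj_len
    return out

# The optimal view, by this criterion alone, is the angle pair that minimizes the returned map.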

18.
PURPOSE: Limited-angle CT imaging is an effective technique to reduce radiation dose. However, existing image reconstruction methods can effectively reduce streak artifacts but fail to suppress the artifacts that appear around edges owing to incomplete projection data. A modified NLM (mNLM) based reconstruction method is therefore proposed. METHODS: Since the artifacts around edges are mainly local, it is possible to restore the true pixel values in artifact regions using pixels located in artifact-free regions. In each iteration, mNLM is performed on the image reconstructed by ART, followed by a positivity constraint. To address the problem that ART-mNLM may introduce undesirable information into the image, ART-TV is then used in the subsequent iterations after ART-mNLM has run for a number of iterations. The proposed algorithm is named ART-mNLM/TV. RESULTS: Simulation experiments are performed to validate the feasibility of the algorithm. When the scanning range is [0°, 150°], our algorithm outperforms ART-NLM and ART-TV with more than 40% and 29% improvement in SNR, and more than 58% and 49% reduction in MAE, respectively. Reconstructions from real projection data also demonstrate the effectiveness of the presented algorithm. CONCLUSION: This paper uses mNLM, which benefits from the redundancy of information across the whole image, to recover the true values of pixels in artifact regions from pixels in artifact-free regions, so that artifacts around edges can be mitigated effectively. Experiments show that the proposed ART-mNLM/TV achieves better performance than traditional methods.
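The TV stage that follows the mNLM iterations can be illustrated with a single gradient-descent step on the isotropic total-variation norm. This is a generic TV step under simple forward/backward-difference assumptions, not the exact scheme of the paper; the step size and the eps smoothing constant are illustrative.

import numpy as np

def tv_gradient_step(img, step=0.02, eps=1e-8):
    # One descent step on the (smoothed) isotropic TV norm of a 2D image.
    dx = np.diff(img, axis=1, append=img[:, -1:])    # forward differences
    dy = np.diff(img, axis=0, append=img[-1:, :])
    mag = np.sqrt(dx ** 2 + dy ** 2 + eps)           # smoothed gradient magnitude
    px, py = dx / mag, dy / mag
    # divergence of the normalized gradient field (backward differences)
    div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
    return img + step * div                          # gradient of TV is -div, so descent adds step * div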

19.
Three-dimensional reconstruction from electron micrographs requires the selection of many single-particle projection images; more than 10 000 are generally required to obtain 5- to 10-Å structural resolution. Consequently, various automatic detection algorithms have been developed and successfully applied to large symmetric protein complexes. This paper presents a new automated particle recognition and pick-up procedure based on a three-layer neural network, which has a larger application range than other automated procedures. Its use for both faint and noisy electron micrographs is demonstrated. The method requires only 200 selected particles as learning data and is able to detect images of proteins as small as 200 kDa.

20.
Elucidating gene regulatory networks (GRNs) from large-scale experimental data remains a central challenge in systems biology. Recently, numerous techniques, particularly consensus-driven approaches combining different algorithms, have emerged as a potentially promising strategy for inferring accurate GRNs. Here, we develop a novel consensus inference algorithm, TopkNet, that can integrate multiple algorithms to infer GRNs. Comprehensive performance benchmarking on a cloud computing framework demonstrated that (i) a simple strategy of combining many algorithms does not always lead to performance improvement relative to the cost of consensus, and (ii) TopkNet, which integrates only high-performing algorithms, provides significant performance improvements over the best individual algorithms and over community prediction. These results suggest that a priori determination of high-performing algorithms is key to reconstructing an unknown regulatory network. Similarity among gene-expression datasets can be useful for determining potentially optimal algorithms for the reconstruction of unknown regulatory networks: if the expression data associated with a known regulatory network are similar to those associated with an unknown regulatory network, the algorithms found to be optimal for the known network can be repurposed to infer the unknown one. Based on this observation, we developed a quantitative measure of similarity among gene-expression datasets and demonstrated that, if the similarity between two expression datasets is high, TopkNet integrating the algorithms that are optimal for the known dataset performs well on the unknown dataset. The consensus framework TopkNet, together with the similarity measure proposed in this study, provides a powerful strategy for harnessing the wisdom of the crowds in the reconstruction of unknown regulatory networks.
