Similar literature
 20 similar documents found.
1.
We present EMAN (Electron Micrograph ANalysis), a software package for performing semiautomated single-particle reconstructions from transmission electron micrographs. The goal of this project is to provide software capable of performing single-particle reconstructions beyond 10 Å as such high-resolution data become available. A complete single-particle reconstruction algorithm is implemented. Options are available to generate an initial model for particles with no symmetry, a single axis of rotational symmetry, or icosahedral symmetry. Model refinement is an iterative process, which utilizes classification by model-based projection matching. CTF (contrast transfer function) parameters are determined using a new paradigm in which data from multiple micrographs are fit simultaneously. Amplitude and phase CTF correction is then performed automatically as part of the refinement loop. A graphical user interface is provided, so even those with little image-processing experience will be able to begin performing reconstructions. Advanced users can directly use the lower-level shell commands and even expand the package utilizing EMAN's extensive image-processing library. The package was written from scratch in C++ and is provided free of charge on our Web site. We present an overview of the package as well as several conformance tests with simulated data.
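As an illustration of the kind of CTF model whose parameters such packages fit, here is a minimal NumPy sketch; the function name and parameter values are illustrative assumptions, not EMAN code. Phase correction then amounts to flipping the sign of Fourier components wherever the modelled CTF is negative.

```python
import numpy as np

def ctf_1d(s, defocus_um=1.5, voltage_kv=300.0, cs_mm=2.0, amp_contrast=0.07):
    """Evaluate a standard 1-D CTF model at spatial frequencies s (1/Angstrom).
    Parameter values are illustrative; in practice they are fitted to the micrograph data."""
    volts = voltage_kv * 1e3
    wavelength = 12.2639 / np.sqrt(volts + 0.97845e-6 * volts**2)    # electron wavelength, Angstrom
    defocus = defocus_um * 1e4                                       # Angstrom
    cs = cs_mm * 1e7                                                 # Angstrom
    gamma = np.pi * wavelength * s**2 * (defocus - 0.5 * cs * wavelength**2 * s**2)
    return -(np.sqrt(1.0 - amp_contrast**2) * np.sin(gamma) + amp_contrast * np.cos(gamma))

# Phase correction flips the sign of Fourier components wherever the fitted CTF is negative.
freqs = np.linspace(0.0, 0.25, 512)
flip_mask = np.sign(ctf_1d(freqs))
```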

2.
Reduced-representation templates are used in a real-space pattern-matching framework to facilitate automatic particle picking from electron micrographs. The procedure consists of five parts. First, reduced templates are constructed either from models or directly from the data. Second, a real-space pattern-matching algorithm is applied using the reduced representations as templates. Third, peaks are selected from the resulting score map using peak-shape characteristics. Fourth, the surviving peaks are tested for distance constraints. Fifth, a correlation-based outlier screening is applied. Test applications to a data set of keyhole limpet hemocyanin particles indicate that the method is robust and reliable.
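A minimal sketch of the score-map and peak-selection steps, assuming NumPy/SciPy; the function names, fixed distance threshold, and plain local-maximum test are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import maximum_filter

def score_map(micrograph, template):
    """Cross-correlate a (reduced) template against the micrograph."""
    t = template - template.mean()
    return fftconvolve(micrograph, t[::-1, ::-1], mode="same")

def pick_peaks(scores, min_dist=50, n_peaks=200):
    """Keep local maxima, enforcing a minimum distance between picked positions."""
    local_max = scores == maximum_filter(scores, size=min_dist)
    ys, xs = np.nonzero(local_max)
    order = np.argsort(scores[ys, xs])[::-1][:n_peaks]
    return list(zip(ys[order], xs[order]))
```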

3.
Single-particle analysis is a 3-D structure determination method using electron microscopy (EM). In this method, a large number of projections is required to create a 3-D reconstruction. To enable completely automatic picking without a matching template or a training data set, we established a new method in which the frames used to pick up particles are randomly shifted and rotated over the electron micrograph and, using the total average image of the framed images as an index, each frame converges on a particle. In this process, shifts are selected so as to increase the contrast of the average. Through iterated shifts and further selection of the shifts, the frames are induced to move so as to surround particles. In this algorithm, hundreds of frames are initially distributed randomly over the electron micrograph in which multi-particle images are dispersed. Starting from these frames, one of them is selected and shifted randomly, and acceptance or rejection of its new position is judged by the simulated annealing (SA) method, in which the contrast score of the total average image is adopted as an index. After iteration of this process, the position of each frame converges so as to surround a particle and the framed images are picked up. This is the first unsupervised, fully automatic particle-picking method applicable to EM images of various kinds of proteins, especially to low-contrast cryo-EM protein images.
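A minimal NumPy sketch of the Metropolis-style acceptance step this describes: one frame is shifted at random and the move is kept or reverted according to the change in the contrast of the running average. The function names, shift range, and variance-based contrast score are illustrative assumptions.

```python
import numpy as np

def contrast(frames, image, box):
    """Contrast index: variance of the average of the boxed-out sub-images."""
    crops = [image[y:y + box, x:x + box] for y, x in frames]
    return np.mean(crops, axis=0).var()

def anneal_step(frames, image, box, temperature, rng):
    """Randomly shift one frame; accept or revert the move with the Metropolis rule."""
    h, w = image.shape
    i = rng.integers(len(frames))
    old_y, old_x = frames[i]
    new_y = int(np.clip(old_y + rng.integers(-5, 6), 0, h - box))
    new_x = int(np.clip(old_x + rng.integers(-5, 6), 0, w - box))
    before = contrast(frames, image, box)
    frames[i] = (new_y, new_x)
    delta = contrast(frames, image, box) - before
    if delta < 0 and rng.random() >= np.exp(delta / temperature):
        frames[i] = (old_y, old_x)   # reject: revert to the previous position
    return frames
```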

4.
Recent progress in single-particle reconstruction methods and cryo-EM techniques has led to the determination of macromolecular structures with unprecedented resolution. The number of particles that goes into the reconstruction is a key determinant in achieving high resolution. Interactive manual picking of particles from an electron micrograph is a very time-consuming, tedious, and inefficient process. We have implemented a fast automatic particle picking procedure in the SPIDER environment. The procedure makes use of template-matching schemes and employs a recently developed locally normalized correlation algorithm based on Fourier techniques. As a test, we have used this procedure to pick 70S Escherichia coli ribosomes from a cryo-electron micrograph. Different search strategies, including use of a circular mask and asymmetric masks for different orientations of the particle, have been explored, and their relative efficiencies are discussed. The results indicate that the procedure can be optimally used to pick ribosomes in a fully automatic way within the limit of selecting fewer than 10% false positives while missing about 15% of true positives.
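A minimal NumPy/SciPy sketch of a locally normalized cross-correlation of this kind, in which the local image mean and variance under the template footprint are obtained with FFT convolutions. The function name and the square footprint are assumptions; the actual SPIDER implementation differs.

```python
import numpy as np
from scipy.signal import fftconvolve

def local_normalized_correlation(image, template):
    """Cross-correlation normalized by the local standard deviation of the image
    under the template footprint, computed with FFT convolutions."""
    mask = np.ones(template.shape)
    n = mask.size
    t0 = template - template.mean()                      # zero-mean template
    local_mean = fftconvolve(image, mask, mode="same") / n
    local_sq = fftconvolve(image**2, mask, mode="same") / n
    local_var = np.maximum(local_sq - local_mean**2, 1e-12)
    numerator = fftconvolve(image, t0[::-1, ::-1], mode="same")
    return numerator / (n * np.sqrt(local_var) * t0.std())
```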

5.
MOTIVATION: Despite the growing literature devoted to finding differentially expressed genes in assays probing different tissue types, little attention has been paid to the combinatorial nature of feature selection inherent to large, high-dimensional gene expression datasets. New flexible data analysis approaches capable of searching relevant subgroups of genes and experiments are needed to understand multivariate associations of gene expression patterns with observed phenotypes. RESULTS: We present in detail a deterministic algorithm to discover patterns of multivariate gene associations in gene expression data. The patterns discovered are differential with respect to a control dataset. The algorithm is exhaustive and efficient, reporting all existing patterns that fit a given input parameter set while avoiding enumeration of the entire pattern space. The value of the pattern discovery approach is demonstrated by finding a set of genes that differentiate between two types of lymphoma. Moreover, these genes are found to behave consistently in an independent dataset produced in a different laboratory using different arrays, thus validating the genes selected using our algorithm. We show that the genes deemed significant in terms of their multivariate statistics would be missed using other methods. AVAILABILITY: Our set of pattern discovery algorithms, including a user interface, is distributed as a package called Genes@Work. This package is freely available to non-commercial users and can be downloaded from our website (http://www.research.ibm.com/FunGen).

6.
Single particle analysis (SPA) coupled with high-resolution electron cryo-microscopy is emerging as a powerful technique for the structure determination of membrane protein complexes and soluble macromolecular assemblies. Current estimates suggest that approximately 10^4-10^5 particle projections are required to attain a 3 Å resolution 3D reconstruction (symmetry dependent). Selecting this number of molecular projections differing in size, shape and symmetry is a rate-limiting step for the automation of 3D image reconstruction. Here, we present SwarmPS, a feature-rich, GUI-based software package to manage large-scale, semi-automated particle picking projects. The software provides cross-correlation and edge-detection algorithms. Algorithm-specific parameters are transparently and automatically determined through user interaction with the image, rather than by trial and error. Other features include multiple image handling (~10^2 images), local and global particle selection options, interactive image freezing, automatic particle centering, and full manual override to correct false positives and negatives. SwarmPS is user friendly, flexible, extensible, fast, and capable of exporting boxed-out projection images, or particle coordinates, compatible with downstream image processing suites.

7.
A new learning-based approach is presented for particle detection in cryo-electron micrographs using the Adaboost learning algorithm. The approach builds directly on the successful detectors developed for the domain of face detection. It is a discriminative algorithm which learns important features of the particle's appearance using a set of training examples of the particles and a set of images that do not contain particles. The algorithm is fast (10 s on a 1.3 GHz Pentium M processor), is generic, and is not limited to any particular shape or size of the particle to be detected. The method has been evaluated on a publicly available dataset of 82 cryoEM images of keyhole limpet hemocyanin (KLH). From 998 automatically extracted particle images, the 3-D structure of KLH has been reconstructed at a resolution of 23.2 Å, which is the same resolution as obtained using particles manually selected by a trained user.
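A minimal sketch of boosted detection of this flavor, assuming scikit-learn; the two rectangle-difference (Haar-like) features and the function names are placeholders for the much richer feature set the published detector learns, and scikit-learn's AdaBoostClassifier stands in for the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def haar_features(patch):
    """Two crude rectangle-difference (Haar-like) features over a square patch."""
    h, w = patch.shape
    return np.array([
        patch[:, : w // 2].sum() - patch[:, w // 2:].sum(),   # left minus right half
        patch[: h // 2, :].sum() - patch[h // 2:, :].sum(),   # top minus bottom half
    ])

def train_detector(particle_patches, background_patches):
    """Boost decision stumps over the features (scikit-learn's default base learner)."""
    X = np.array([haar_features(p) for p in particle_patches + background_patches])
    y = np.array([1] * len(particle_patches) + [0] * len(background_patches))
    return AdaBoostClassifier(n_estimators=50).fit(X, y)
```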

8.
9.
Predicting which items a target user will select in the future is an important function for recommendation systems. Matrix factorization techniques have been shown to achieve good performance on temporal rating-type data, but little is known about temporal item-selection data. In this paper, we develop a unified model that combines multi-task non-negative matrix factorization and linear dynamical systems to capture the evolution of user preferences. Specifically, user and item features are projected into a latent factor space by factoring co-occurrence matrices into a common basis item-factor matrix and multiple factor-user matrices. Moreover, we represent both the within- and between-relationships of the multiple factor-user matrices using a state transition matrix, capturing the changes in user preferences over time. Experiments show that the proposed algorithm outperforms the other algorithms on two real datasets, extracted from Netflix movies and Last.fm music. Furthermore, our model provides a novel dynamic topic model for tracking the evolution of a user's behavior over time.
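A rough NumPy/scikit-learn analogue of the idea, assuming the interaction data arrives as one user-by-item matrix per time slice. A shared basis, per-slice user factors, and a least-squares transition matrix stand in for the paper's joint formulation; all names are illustrative.

```python
import numpy as np
from sklearn.decomposition import NMF

def fit_temporal_nmf(slices, k=10):
    """Shared item-factor basis across time slices plus a linear transition between
    the per-slice user-factor matrices (a simplified analogue of the joint model)."""
    stacked = np.vstack(slices)                       # (T * users, items)
    nmf = NMF(n_components=k, init="nndsvda", max_iter=500)
    nmf.fit(stacked)
    H = nmf.components_                               # shared item-factor basis
    user_factors = [nmf.transform(s) for s in slices]
    # Least-squares estimate of the state transition matrix A: W_{t+1} ~ W_t A
    X = np.vstack(user_factors[:-1])
    Y = np.vstack(user_factors[1:])
    A, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return H, user_factors, A
```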

10.
In order to make a high-resolution model of macromolecular structures from cryo-electron microscope (cryo-EM) raw images, one has to be precise at every processing step from particle picking to 3D image reconstruction. In this paper we propose a collection of novel methods for filtering cryo-EM images and for automatic picking of particles. These methods have been developed for two cases: (1) when particles can be identified and (2) when particles are not distinguishable. The advantages of these methods are demonstrated on standard purified protein samples and, to keep them general, we make no ad hoc assumptions about the geometry of the particle projections. We have also suggested a filtering method to increase the signal-to-noise (S/N) ratio, which has proved useful at other levels of reconstruction, i.e., finding orientations and 3D model reconstruction.
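One common way to raise the S/N ratio before picking is a Fourier band-pass filter; a minimal NumPy sketch follows. The cut-off values and function name are illustrative assumptions, not the specific filter proposed in the paper.

```python
import numpy as np

def bandpass(image, low=0.01, high=0.15):
    """Fourier band-pass filter; cut-offs are fractions of the sampling frequency."""
    fy = np.fft.fftfreq(image.shape[0])[:, None]
    fx = np.fft.fftfreq(image.shape[1])[None, :]
    r = np.sqrt(fx**2 + fy**2)
    mask = (r >= low) & (r <= high)
    return np.fft.ifft2(np.fft.fft2(image) * mask).real
```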

11.
The FindEM particle picking program was tested on the publicly available keyhole limpet hemocyanin (KLH) dataset, and the results were submitted for the "bakeoff" contest at the recent particle picking workshop (Zhu et al., 2003b). Two alternative ways of using the program are demonstrated and the results are compared. The first approximates exhaustive projection matching with a full set of expected views, which need to be known. This corresponds to the task of extending a known structure to higher resolution, for which many thousands of additional images are required. The second procedure illustrates the use of multivariate statistical analysis (MSA) to filter a preliminary set of candidate particles containing a high proportion of false particles. This set was generated using the FindEM program to search with one template that crudely represents the expected views. Classification of the resulting set of candidate particles then allows the desired classes to be selected while the rest are discarded. This approach requires no prior information about the structure and is suitable for the initial investigation of an unknown structure: the class averages indicate the symmetry and oligomeric state of the particles. Potential improvements in speed and accuracy are discussed.
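A stand-in sketch for the MSA/classification step, assuming scikit-learn: PCA plus k-means is used here in place of the correspondence-analysis-based MSA usually found in EM packages, and all names are assumptions. The user would keep only the classes whose averages look like genuine particle views.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def classify_candidates(windows, n_factors=20, n_classes=10):
    """Reduce candidate windows to a few factors, cluster them, and return class
    averages so true-particle classes can be selected and the rest discarded."""
    X = np.array([w.ravel() for w in windows])
    factors = PCA(n_components=n_factors).fit_transform(X)
    labels = KMeans(n_clusters=n_classes, n_init=10).fit_predict(factors)
    class_averages = {c: X[labels == c].mean(axis=0).reshape(windows[0].shape)
                      for c in range(n_classes) if np.any(labels == c)}
    return labels, class_averages
```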

12.
SUMMARY: AnnBuilder is an R package for assembling genomic annotation data. The system currently provides parsers to process annotation data from LocusLink, the Gene Ontology Consortium, and the Human Genome Project, and can be extended to new data sources via user-defined parsers. AnnBuilder differs from other existing systems in that it gives users unlimited ability to assemble data from sources of their own choosing. The products of AnnBuilder are files in XML format that can be easily used by different systems. AVAILABILITY: Open source; available from http://www.bioconductor.org.

13.
The automation of single particle selection and tomographic segmentation of asymmetric particles and objects is facilitated by continuing improvement of methods based on the detection of pixel discontinuity. Here, we present the new arbitrary z-crossings approach, which can be employed to enhance the accuracy of edge detection algorithms that are based on the second derivative. This is demonstrated using the Laplacian of Gaussian (LoG) filter. In its normal implementation the LoG filter uses a z value of zero to define edge contours. In contrast, the arbitrary z-crossings approach allows the user to adjust z, which causes the subsequently generated contours to tend towards lighter or darker image objects, depending on the sign of z. This functionality has been coupled with an additional feature: the ability to use the major and minor axes of bounding contours to refine automated object selection. In combination, these features significantly enhance the accuracy of particle selection and the speed of tomographic segmentation. Both features have been incorporated into the software package SwarmPS, in which parameters are automatically adjusted based on user-defined target selection.
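A minimal SciPy sketch of the underlying idea: threshold the LoG response at an adjustable level z rather than at zero. The function name, neighbourhood test, and sign convention are illustrative assumptions, not the SwarmPS implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def z_crossing_mask(image, sigma=3.0, z=0.0):
    """Mark pixels where the Laplacian-of-Gaussian response crosses the level z.
    Shifting z away from zero biases the contours toward lighter or darker objects,
    depending on its sign and the image contrast convention."""
    log = gaussian_laplace(image.astype(float), sigma)
    above = log > z
    crossing = np.zeros_like(above)
    # A pixel lies on a z-crossing if it and a 4-neighbour straddle the level z.
    crossing[:-1, :] |= above[:-1, :] != above[1:, :]
    crossing[:, :-1] |= above[:, :-1] != above[:, 1:]
    return crossing
```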

14.
MOTIVATION: Core sets are needed to guarantee access to the useful alleles and characteristics retained in genebanks. We have developed a computational tool named 'PowerCore' that supports the construction of core sets by reducing the redundancy of useful alleles and thus enhancing their richness. RESULTS: The program, using an approach different from previous methodologies, selects core-set entries by the advanced M (maximization) strategy implemented through a modified heuristic algorithm. The resulting core sets have been validated to retain all characteristics for qualitative traits and all classes for quantitative ones. PowerCore effectively selected the accessions with higher diversity representing the entire coverage of variables, and gave a 100% reproducible list of entries whenever repeated. AVAILABILITY: PowerCore uses the .NET Framework Version 1.1 environment, which is freely available for the MS Windows platform. The files can be downloaded from http://genebank.rda.go.kr/powercore/. The distribution includes executable programs, sample data and a user manual.
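A tiny Python sketch of the flavor of a maximization-strategy heuristic: greedily add the accession covering the most not-yet-represented trait classes until every class appears. This is a simplification for illustration; PowerCore's modified heuristic differs.

```python
def greedy_core_set(accessions):
    """Greedy class-coverage selection.

    `accessions` maps accession id -> set of trait classes it carries; returns a
    list of accession ids that together cover every observed class.
    """
    uncovered = set().union(*accessions.values())
    core = []
    while uncovered:
        best = max(accessions, key=lambda a: len(accessions[a] & uncovered))
        core.append(best)
        uncovered -= accessions[best]
    return core
```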

15.
We present a new particle tracking software algorithm designed to accurately track the motion of low-contrast particles against a background with large variations in light levels. The method is based on a polynomial fit of the intensity around each feature point, weighted by a Gaussian function of the distance from the centre, and is especially suitable for tracking endogenous particles in the cell, imaged with bright field, phase contrast or fluorescence optical microscopy. Furthermore, the method can simultaneously track particles of all different sizes, and allows significant freedom in their shape. The algorithm is evaluated using the quantitative measures of accuracy and precision of previous authors, using simulated images at variable signal-to-noise ratios. To these we add new tests: the error due to a non-uniform background, and the error due to two particles approaching each other. Finally, the tracking of particles in real cell images is demonstrated. The method is made freely available for non-commercial use as a software package with a graphical user interface, which can be run within the Matlab programming environment.
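A minimal NumPy sketch of a Gaussian-weighted quadratic fit of this kind: fit a 2-D quadratic to the intensity around a candidate point and take the stationary point of the fit as the sub-pixel centre. The function name, polynomial order, and weighting details are assumptions, not the published algorithm.

```python
import numpy as np

def subpixel_centre(patch, sigma=3.0):
    """Refine a feature position by a Gaussian-weighted quadratic fit of intensity."""
    h, w = patch.shape
    yy, xx = np.mgrid[:h, :w]
    x = (xx - w // 2).ravel().astype(float)
    y = (yy - h // 2).ravel().astype(float)
    sw = np.exp(-(x**2 + y**2) / (4 * sigma**2))          # square root of a Gaussian weight
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coeffs, *_ = np.linalg.lstsq(A * sw[:, None], patch.ravel() * sw, rcond=None)
    a, b, c, d, e, f = coeffs
    # Stationary point of a + b*x + c*y + d*x^2 + e*x*y + f*y^2
    dx, dy = np.linalg.solve([[2 * d, e], [e, 2 * f]], [-b, -c])
    return w // 2 + dx, h // 2 + dy
```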

16.
Molecular docking, and virtual screening based on molecular docking, have become an integral part of many modern structure-based drug discovery efforts. Hence, it is a useful endeavor to evaluate existing docking programs, since this can assist in the choice of the most suitable docking algorithm for a particular study. The objective of the current study was to evaluate the ability of ArgusLab 4.0, a relatively new molecular modeling package in which molecular docking is implemented, to reproduce crystallographic binding orientations, and to compare its accuracy with that of a well-established commercial package, GOLD. The study also aimed to evaluate the effect of the nature of the binding site and of ligand properties on docking accuracy. The three-dimensional structures of a carefully chosen set of 75 pharmaceutically relevant protein-ligand complexes were used for the comparative study. The study revealed that the commercial package outperforms the freely available docking engine in almost all the parameters tested. However, it also revealed that, although lagging behind in accuracy, results from ArgusLab are biologically meaningful. This, taken together with the fact that ArgusLab has an easy-to-use graphical user interface, means that it can be employed as an effective teaching tool to demonstrate molecular docking to beginners in this area.

17.
Reaction kinetics for complex, highly interconnected kinetic schemes are modeled using analytical solutions to a system of ordinary differential equations. The algorithm employs standard linear algebra methods implemented using MATLAB functions in a Visual Basic interface. A graphical user interface for simple entry of reaction schemes facilitates comparison of a variety of reaction schemes. To ensure microscopic balance, graph theory algorithms are used to detect violations of thermodynamic cycle constraints. Analytical solutions based on linear differential equations allow fast comparisons of first-order kinetic rates and amplitudes as a function of changing ligand concentrations. For analysis of higher-order kinetics, we also implemented a solution using numerical integration. To determine rate constants from experimental data, fitting algorithms that adjust rate constants to fit the model to imported data were implemented using the Levenberg-Marquardt or Broyden-Fletcher-Goldfarb-Shanno methods. We have included the ability to carry out global fitting of data sets obtained at varying ligand concentrations. These tools are combined in a single package, which we have dubbed VisKin, to guide and analyze kinetic experiments. The software is available online for use on PCs.
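For a first-order scheme the analytical solution is given by the matrix exponential (equivalently, the eigen decomposition) of the rate matrix; here is a minimal NumPy/SciPy sketch. VisKin itself uses MATLAB linear-algebra routines, and the rate constants below are arbitrary illustrative values.

```python
import numpy as np
from scipy.linalg import expm

def first_order_kinetics(K, c0, times):
    """Concentration time courses for a linear kinetic scheme dc/dt = K c,
    where K is the rate matrix and c0 the initial concentrations."""
    return np.array([expm(K * t) @ c0 for t in times])

# Example: A <-> B with forward rate 2.0 and reverse rate 0.5 (arbitrary units)
K = np.array([[-2.0, 0.5],
              [ 2.0, -0.5]])
traj = first_order_kinetics(K, np.array([1.0, 0.0]), np.linspace(0.0, 5.0, 50))
```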

18.
Accurate and automatic particle detection from cryo-electron microscopy (cryo-EM) images is very important for high-resolution reconstruction of large macromolecular structures. In this paper, we present a method for particle picking based on shape feature detection. Two fundamental concepts of computational geometry, namely the distance transform and the Voronoi diagram, are used for the detection of critical features as well as for accurate location of particles in the images or micrographs. Unlike conventional template-matching methods, our approach detects particles based on their boundary features instead of intensities. The geometric features derived from the boundaries provide an efficient way of locating particles quickly and accurately, avoiding a brute-force search for the best position/orientation. Our approach is fully automatic and has been successfully applied to detect particles with approximately circular or rectangular shapes (e.g., KLH particles). Particle detection can be enhanced by multiple sets of parameters used in edge detection and/or by anisotropic filtering. We also discuss the extension of this approach to other types of particles with certain geometric features.
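A simplified SciPy sketch of how a distance transform can locate roughly circular particles: binarize the image, compute the Euclidean distance transform, and take its sufficiently high local maxima as candidate centres. The threshold, smoothing, and function name are assumptions, and the Voronoi-based feature analysis of the paper is not reproduced here.

```python
import numpy as np
from scipy import ndimage

def centres_from_distance_transform(micrograph, threshold, min_radius=10):
    """Candidate centres of roughly circular particles as peaks of the distance transform."""
    foreground = ndimage.gaussian_filter(micrograph, 2) > threshold
    dist = ndimage.distance_transform_edt(foreground)
    peaks = (dist == ndimage.maximum_filter(dist, size=2 * min_radius)) & (dist > min_radius)
    return np.argwhere(peaks)
```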

19.
Cyprinodontiformes is a diverse and speciose order that includes topminnows, pupfishes, swordtails, mosquitofishes, guppies, and mollies. Sister group to the Beloniformes and Atheriniformes, Cyprinodontiformes contains approximately twice the number of species of these other two orders combined. Recent studies suggest that this group is well suited to capturing prey by “picking” small items from the water surface, water column, and the substrate. Because picking places unusual performance demands on the feeding apparatus, this mode of prey capture may rely upon novel morphological modifications not found in more widespread ram‐ or suction‐based feeding mechanisms. To assess this evolutionary hypothesis, we describe the trophic anatomy of 16 cyprinodontiform species, selected to broadly represent the order as well as capture intrageneric variation. The group appears to have undergone gradual morphological changes to become increasingly specialized for picking and scraping behaviors. We also identify a suite of functional characters related to the acquisition of a novel and previously undescribed mechanism of premaxillary protrusion and retraction, including: modification of the “premaxillomandibular” ligament (which connects each side of the premaxilla to the ipsilateral mandible, or lower jaw), a novel architecture of the ligaments and bony elements that unite the premaxillae, maxillae and palatine bones, and novel insertions of the adductor muscles onto the jaws. These morphological changes to both the upper and lower jaws suggest an evolutionary trend within this group toward increased reliance on picking individual prey from the water column/substrate or for scraping encrusting material from the substrate. We propose that the suite of morphological characters described here enables a functional innovation, “picking,” which leads to novel trophic habits.

20.
The user-based collaborative filtering (CF) algorithm is one of the most popular approaches for making recommendations. Despite its success, the traditional user-based CF algorithm suffers from a serious problem: it measures the influence between two users only through a symmetric similarity calculated from their consumption histories. This means that, for a pair of users, their influences on each other are assumed to be the same, which may not be true. Intuitively, an expert may have an impact on a novice user, but a novice user may not affect an expert at all. Besides, each user may possess a global importance factor that affects his/her influence on the remaining users. To this end, in this paper we propose an asymmetric user influence model to measure the directed influence between two users and adopt the PageRank algorithm to calculate the global importance value of each user. The directed influence values and the global importance values are then integrated to derive the final influence values between two users. Finally, we use the final influence values to improve the performance of the traditional user-based CF algorithm. Extensive experiments have been conducted, and the results confirm that both the asymmetric user influence model and the global importance value play key roles in improving recommendation accuracy; the proposed method therefore significantly outperforms existing recommendation algorithms, in particular the user-based CF algorithm, on datasets with high rating density.
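A rough Python/NetworkX sketch of the idea: compute an asymmetric overlap-based influence between users, then weight it by a PageRank-style global importance score. The normalization, data layout (user id mapped to a set of consumed items), and function name are illustrative assumptions, not the paper's exact formulation.

```python
import networkx as nx

def influence_matrix(items_by_user):
    """Directed pairwise influence values: asymmetric history overlap scaled by the
    source user's PageRank importance. `items_by_user` maps user id -> set of items."""
    users = list(items_by_user)
    g = nx.DiGraph()
    for u in users:
        for v in users:
            if u == v:
                continue
            shared = len(items_by_user[u] & items_by_user[v])
            if shared:
                # Asymmetric similarity: shared items normalised by the target's history size.
                g.add_edge(u, v, weight=shared / len(items_by_user[v]))
    importance = nx.pagerank(g, weight="weight")      # global importance of each user
    return {(u, v): g[u][v]["weight"] * importance.get(u, 0.0) for u, v in g.edges}
```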
