Related Articles
Found 20 related articles (search time: 31 ms)
1.
The coupling of protein energetics and sequence changes is a critical aspect of computational protein design, as well as of understanding protein evolution, human disease, and drug resistance. To study the molecular basis for this coupling, computational tools must be sufficiently accurate and computationally inexpensive to handle large amounts of sequence data. We have developed a computational approach based on the linear interaction energy (LIE) approximation to predict the change in the free energy of the native state induced by a single mutation. This approach was applied to a set of 822 mutations in 10 proteins, yielding an average unsigned error of 0.82 kcal/mol and a correlation coefficient of 0.72 between the calculated and experimental ΔΔG values. The method accurately identifies destabilizing hot spot mutations; however, it has difficulty distinguishing between stabilizing and destabilizing mutations because of the distribution of stability changes in the set of mutations used to parameterize the model. The model also performs well in initial tests on a small set of double mutations. On the basis of these promising results, we can begin to examine the relationship between protein stability and fitness, correlated mutations, and drug resistance.
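The reported error statistics can be reproduced from any set of predicted and measured ΔΔG values. A minimal sketch of the evaluation (the ΔΔG values and the 2.0 kcal/mol hot-spot threshold below are invented for illustration, not taken from the study):

```python
import math

def evaluate_predictions(pred, expt):
    """Average unsigned error and Pearson correlation between
    predicted and experimental ddG values (kcal/mol)."""
    n = len(pred)
    aue = sum(abs(p - e) for p, e in zip(pred, expt)) / n
    mp, me = sum(pred) / n, sum(expt) / n
    cov = sum((p - mp) * (e - me) for p, e in zip(pred, expt))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    se = math.sqrt(sum((e - me) ** 2 for e in expt))
    return aue, cov / (sp * se)

def is_hotspot(ddg, threshold=2.0):
    """Call a mutation a destabilizing 'hot spot' if its predicted
    ddG exceeds a chosen threshold (hypothetical cutoff)."""
    return ddg > threshold

# Hypothetical predicted vs. experimental ddG for four mutations
pred = [0.5, 2.8, -0.3, 1.4]
expt = [0.7, 2.5, -0.1, 1.0]
aue, r = evaluate_predictions(pred, expt)
```

For the toy data above, the unsigned errors average to 0.275 kcal/mol and the correlation is high, mirroring the kind of summary statistics the study reports over its 822 mutations.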

2.
Identifying protein-coding regions in DNA sequences is an active problem in computational biology. In this study, we present a self-adaptive spectral rotation (SASR) approach that visualizes coding regions in DNA sequences based on the triplet periodicity property, without any prior training process. It is intended to provide a rough prediction of coding regions when the extra information required for training by other leading methods is unavailable. In this approach, a Fourier spectrum is calculated from the subsequence following each position in the DNA sequence. From these spectra, a random walk in the complex plane is generated as the SASR's graphic output. Applications of the SASR to real DNA data show that patterns in the graphic output reveal the locations of coding regions and the frame shifts between them: arcs indicate coding regions, stable points indicate non-coding regions, and the shapes of corners reveal frame shifts. Tests on a genomic data set from Saccharomyces cerevisiae show that the graphic patterns for coding and non-coding regions differ markedly, so that coding regions can be distinguished visually. A timing test also shows that the SASR can be implemented easily with a computational complexity of O(N).
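The triplet-periodicity signal such methods build on can be sketched as the magnitude of the period-3 DFT component of the four base-indicator sequences. This is the underlying spectral quantity only, not the SASR's rotation and random-walk scheme itself:

```python
import cmath

def period3_signal(seq):
    """Summed magnitude of the period-3 DFT component of the four
    base-indicator sequences; large values suggest the triplet bias
    characteristic of coding regions."""
    sig = 0.0
    for base in "ACGT":
        # DFT of the 0/1 indicator sequence of `base`, at period 3
        sig += abs(sum(cmath.exp(-2j * cmath.pi * i / 3)
                       for i, b in enumerate(seq) if b == base))
    return sig
```

A strongly triplet-biased sequence (e.g. a pure codon repeat) scores far higher than a homopolymer of the same length, whose period-3 component cancels.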

3.
4.
The large amount of image data necessary for high-resolution 3D reconstruction of macromolecular assemblies leads to significant increases in computational time. One of the most time-consuming operations is 3D density map reconstruction, and software optimization can greatly reduce the time required for any given structural study. The majority of algorithms proposed for improving the computational effectiveness of 3D reconstruction are based on a ray-by-ray projection of each image into the reconstructed volume. In this paper, we propose a novel fast implementation of the filtered back-projection algorithm based on a voxel-by-voxel principle. Our implementation has been exhaustively tested using both model and real data. We compared 3D reconstructions obtained by the new approach with results from the filtered back-projection algorithm and the Fourier-Bessel algorithm commonly used for reconstructing icosahedral viruses. These computational experiments demonstrate the robustness, reliability, and efficiency of the approach.
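The voxel-by-voxel principle differs from ray-by-ray mainly in loop order: each voxel gathers contributions from every view, rather than each ray scattering into the volume. A bare 2D sketch with nearest-neighbor interpolation (the filtering step of filtered back-projection is omitted, and the geometry is deliberately simplistic):

```python
import math

def backproject(sinogram, angles, size):
    """Pixel-by-pixel back-projection: each pixel gathers, for every
    view, the projection value it maps onto under that view's angle."""
    center = (size - 1) / 2.0
    ncols = len(sinogram[0])
    ccol = (ncols - 1) / 2.0
    img = [[0.0] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            for proj, theta in zip(sinogram, angles):
                # signed distance of the pixel from the rotation axis
                t = (x - center) * math.cos(theta) + (y - center) * math.sin(theta)
                j = int(round(t + ccol))
                if 0 <= j < ncols:
                    img[y][x] += proj[j]
    return img
```

Back-projecting a single central spike from two orthogonal views concentrates intensity at the center pixel, the expected smeared point response.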

5.
MOTIVATION: Now that the whole-genome sequences of many species have been determined, computational prediction of RNA secondary structures and computational identification of non-coding RNA regions by comparative genomics have become important, and more advanced alignment methods are required. Recently, an approach of structural alignment for RNA sequences has been introduced to solve these problems. Pair hidden Markov models on tree structures (PHMMTSs), proposed by Sakakibara, are efficient automata-theoretic models for structural alignment of RNA secondary structures, although they are incapable of handling pseudoknots. On the other hand, tree adjoining grammars (TAGs), a subclass of context-sensitive grammars, are suitable for modeling pseudoknots. Our goal is to extend PHMMTSs by incorporating TAGs so that pseudoknots can be handled. RESULTS: We propose pair stochastic TAGs (PSTAGs) for aligning and predicting RNA secondary structures, including a simple type of pseudoknot that can represent most known pseudoknot structures. First, we extend PHMMTSs, defined on alignments of 'trees', to PSTAGs, defined on alignments of 'TAG trees', which represent the derivation processes of TAGs and are functionally equivalent to derived trees of TAGs. We then develop an efficient dynamic programming algorithm of PSTAGs for obtaining an optimal structural alignment including pseudoknots. We implemented the PSTAG algorithm and demonstrate its properties by using it to align and predict several small pseudoknot structures. We believe that our program based on PSTAGs is the first grammar-based, practically executable software for comparative analysis of RNA pseudoknot structures and, further, of non-coding RNAs.

6.
Profile hidden Markov models (HMMs) based on classical HMMs have been widely applied to protein sequence identification. The formulation of the forward and backward variables in profile HMMs rests on the statistical independence assumptions of probability theory. We propose a fuzzy profile HMM to overcome the limitations of those assumptions and to achieve improved alignment of protein sequences belonging to a given family. The proposed model fuzzifies the forward and backward variables by incorporating Sugeno fuzzy measures and Choquet integrals, thus further extending the generalized HMM. Based on the fuzzified forward and backward variables, we propose a fuzzy Baum-Welch parameter estimation algorithm for profiles. The strong correlations and sequence preferences involved in protein structures make this fuzzy-architecture-based model a suitable candidate for building profiles of a given family, since fuzzy sets can handle uncertainty better than classical methods.
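The Choquet integral that replaces the plain weighted sum in such fuzzified recursions can be computed by sorting the integrand and weighting its increments by the measure of the corresponding "coalition" of states. A minimal sketch with hypothetical scores and measure values (not the paper's actual model):

```python
def choquet_integral(values, measure):
    """Discrete Choquet integral of `values` (dict: item -> score)
    with respect to a fuzzy measure `measure` (dict: frozenset -> weight),
    where measure[S] is needed for every upper set S encountered."""
    items = sorted(values, key=values.get)        # ascending by score
    total, prev = 0.0, 0.0
    for i, item in enumerate(items):
        coalition = frozenset(items[i:])          # items scoring >= current
        total += (values[item] - prev) * measure[coalition]
        prev = values[item]
    return total
```

With an additive measure this reduces to an ordinary weighted sum; non-additive measures let the model express interactions between states that independence-based recursions cannot.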

7.
8.
Multiple sequence alignment is one of the dominant problems in computational molecular biology. Numerous scoring functions and methods have been proposed, most of which lead to NP-hard problems. In this paper we propose, for the first time, a general formulation for multiple alignment with arbitrary gap costs based on an integer linear program (ILP). In addition, we describe a branch-and-cut algorithm that effectively solves the ILP to optimality. We evaluate the performance of our approach, in terms of running time and alignment quality, using the BAliBASE database of reference alignments. The results show that our implementation ranks among the best programs developed so far.
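The objective such an ILP optimizes is typically a sum-of-pairs score with gap costs. A simplified scorer, assuming match/mismatch scores and affine-style gap penalties (for brevity, gap runs in either row of a pair are merged, a simplification of true affine scoring):

```python
def pair_score(a, b, match=2, mismatch=-1, gap_open=-3, gap_extend=-1):
    """Score one pair of aligned rows column by column; a run of
    columns where exactly one row is gapped pays gap_open once plus
    gap_extend for each further column."""
    score, in_gap = 0, False
    for x, y in zip(a, b):
        if x == '-' and y == '-':
            continue                         # double gaps are free
        if x == '-' or y == '-':
            score += gap_extend if in_gap else gap_open
            in_gap = True
        else:
            score += match if x == y else mismatch
            in_gap = False
    return score

def sum_of_pairs(rows, **kw):
    """Sum-of-pairs objective over all row pairs of the alignment."""
    return sum(pair_score(rows[i], rows[j], **kw)
               for i in range(len(rows)) for j in range(i + 1, len(rows)))
```

The ILP view assigns a binary variable to each candidate aligned pair and gap and maximizes exactly this kind of objective subject to consistency constraints.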

9.
It is frequently impossible to meet the assumptions underlying the statistical approach to classification of food products by a sensory panel. As an alternative, we have investigated the applicability of fuzzy set theory. Within a fuzzy set framework, a product may belong to several classes simultaneously, and no assumptions are made about the distribution of sensory properties within a product class. Fuzzy classification models can be constructed from a set of training objects by linking soft class labels to sensory attributes through an inference procedure based on fuzzy logic. A number of fuzzy inference procedures have been evaluated on several attribute sets. A satisfactory classification was obtained using a very simple implication rule and a set of three attributes.
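One very simple inference procedure of the kind evaluated: triangular memberships per attribute, min as the implication (AND), and max to aggregate rules per class. The attribute names, parameters, and class labels below are invented for illustration:

```python
def triangular(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify(sample, rules):
    """Fuzzy inference sketch: each rule is (class, {attr: (a, b, c)});
    rule strength = min over its attribute memberships, and a class's
    membership = max over the strengths of its rules."""
    out = {}
    for cls, attr_sets in rules:
        strength = min(triangular(sample[attr], *params)
                       for attr, params in attr_sets.items())
        out[cls] = max(out.get(cls, 0.0), strength)
    return out
```

The soft output naturally allows a product to belong to several classes at once, which is exactly the behavior the statistical approach cannot express.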

10.
The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data is considered one of the most important issues in systems biology research. With the development of high-throughput technology and the growing complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are applied to such large-scale problems, they encounter two difficulties: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm for inferring large-scale GRNs in high-performance parallel computing environments. We propose a novel asynchronous parallel framework that improves the accuracy and lowers the time complexity of large-scale GRN inference by combining a splitting technique with ordinary differential equation (ODE)-based optimization. The algorithm exploits the sparsity and modularity of GRNs to split a whole large-scale GRN into many small-scale modular subnetworks. Through ODE-based optimization of all subnetworks in parallel, with asynchronous communication between them, the parameters of the whole network are readily obtained. To test the performance of the approach, we used well-known benchmark datasets from the Dialogue on Reverse Engineering Assessment and Methods (DREAM) challenge, the experimentally determined GRN of Escherichia coli, and a published dataset containing more than 10,000 genes, comparing the proposed approach with several popular algorithms in the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm is clearly superior for inferring large-scale GRNs.
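The splitting step relies on the sparsity and modularity of the network. In the simplest case, weakly connected components already yield independently optimizable subproblems; a sketch of that decomposition (ignoring the finer modularity-based partitioning the paper would need for connected networks):

```python
def modules(edges, genes):
    """Split a sparse regulatory network into connected components,
    each of which can be fitted as an independent subnetwork."""
    adj = {g: set() for g in genes}
    for a, b in edges:               # treat regulation as undirected here
        adj[a].add(b)
        adj[b].add(a)
    seen, comps = set(), []
    for g in genes:
        if g in seen:
            continue
        stack, comp = [g], []
        while stack:                 # iterative depth-first search
            v = stack.pop()
            if v in seen:
                continue
            seen.add(v)
            comp.append(v)
            stack.extend(adj[v] - seen)
        comps.append(sorted(comp))
    return comps
```

Each component's ODE parameters can then be estimated in parallel, with asynchronous exchange of boundary information stitching the whole network back together.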

11.
The literature has been relatively silent about post-conflict processes. However, understanding the way humans deal with post-conflict situations is a challenge for our societies. With this in mind, we focus the present study on the rationality of cooperative decision making after an intergroup conflict, i.e., the extent to which groups take advantage of post-conflict situations to obtain benefits from collaborating with the other group involved in the conflict. Based on dual-process theories of thinking and the affect heuristic, we propose that intergroup conflict hinders the rationality of cooperative decision making. We also hypothesize that this rationality improves when groups engage in an in-group deliberative discussion. The results of a laboratory experiment support the idea that intergroup conflict, accompanied by indicators of the activation of negative feelings (negative affect state and elevated heart rate), has a negative effect on this rationality over time, in both group and individual decision making. Although intergroup conflict leads to sub-optimal decision making, rationality improves when groups and individuals subjected to intergroup conflict make decisions after an in-group deliberative discussion. Additionally, the increased rationality of group decision making after the deliberative discussion is transferred to subsequent individual decision making.

12.
The large-sample theory of semiparametric models based on maximum likelihood estimation (MLE) with a shape constraint on the nonparametric component is well studied. Relatively less attention has been paid to the computational aspects of semiparametric MLE. Computing the semiparametric MLE with existing approaches such as the expectation-maximization (EM) algorithm can be prohibitively expensive when the missing rate is high. In this paper, we propose a computational framework for semiparametric MLE based on an inexact block coordinate ascent (BCA) algorithm, and we show theoretically that the proposed algorithm converges. This computational framework can be applied to a wide range of data structures, such as panel count data, interval-censored data, and degradation data, among others. Simulation studies demonstrate favorable performance compared with existing algorithms in terms of accuracy and speed. Two data sets are used to illustrate the proposed computational method, which we implement in the R package BCA1SG, available on CRAN.
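The flavor of an inexact BCA iteration can be shown on a toy concave objective: cycle over parameter blocks and improve each with a single cheap (here finite-difference) gradient step while holding the others fixed. This is only a sketch of the iteration pattern, the paper's algorithm operates on a semiparametric likelihood:

```python
def block_coordinate_ascent(f, blocks, x0, step=0.1, iters=200):
    """Inexact BCA sketch: each pass improves one block of coordinates
    at a time instead of solving each block subproblem exactly."""
    x = list(x0)
    h = 1e-6                                   # finite-difference width
    for _ in range(iters):
        for block in blocks:
            for i in block:
                xp = list(x)
                xp[i] += h
                grad = (f(xp) - f(x)) / h      # approximate partial derivative
                x[i] += step * grad            # single ascent step (inexact)
    return x
```

On a separable concave function the iterates contract geometrically toward the maximizer, which is the behavior the convergence theory generalizes.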

13.
Knowledge of the complete three-dimensional (3D) mechanical behavior of soft tissues is essential in understanding their pathophysiology and in developing novel therapies. Despite significant progress made in experimentation and modeling, a complete approach for the full characterization of soft tissue 3D behavior remains elusive. A major challenge is the complex architecture of soft tissues, such as myocardium, which endows them with strongly anisotropic and heterogeneous mechanical properties. Available experimental approaches for quantifying the 3D mechanical behavior of myocardium are limited to preselected planar biaxial and 3D cuboidal shear tests. These approaches fall short in pursuing a model-driven approach that operates over the full kinematic space. To address these limitations, we took the following approach. First, based on a kinematical analysis and using a given strain energy density function (SEDF), we obtained an optimal set of displacement paths based on the full 3D deformation gradient tensor. We then applied this optimal set to obtain novel experimental data from a 1-cm cube of post-infarcted left ventricular myocardium. Next, we developed an inverse finite element (FE) simulation of the experimental configuration embedded in a parameter optimization scheme for estimation of the SEDF parameters. Notable features of this approach include: (i) enhanced determinability and predictive capability of the estimated parameters following an optimal design of experiments, (ii) accurate simulation of the experimental setup and transmural variation of local fiber directions in the FE environment, and (iii) application of all displacement paths to a single specimen to minimize testing time so that tissue viability could be maintained. 
Our results indicated that, in contrast to the common approach of conducting preselected tests and choosing an SEDF a posteriori, the optimal design of experiments, integrated with a chosen SEDF and full 3D kinematics, leads to a more robust characterization of the mechanical behavior of myocardium and higher predictive capabilities of the SEDF. The methodology proposed and demonstrated herein will ultimately provide a means to reliably predict tissue-level behaviors, thus facilitating organ-level simulations for efficient diagnosis and evaluation of potential treatments. While applied to myocardium, such developments are also applicable to characterization of other types of soft tissues.

14.
In this paper, we propose a context-aware reminding system to assist elders with dementia in their daily activities. We focus in particular on the issue of conflict detection and handling in real-world reminding applications. We adopt a planning approach to describe the constraints on reminders and transform those constraints into a temporal constraint graph for detecting reminder conflicts. For real-time reminder scheduling, we define three types of disruptive activities and present a context-aware reminding strategy to handle conflicts between pre-planned and disruptive activities. Finally, a system prototype is introduced.
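In the simplest case, conflict detection over scheduled reminders reduces to finding overlapping time windows. A sketch with hypothetical reminder names and times (the paper's temporal constraint graph handles richer constraints than plain intervals):

```python
def conflicts(reminders):
    """Detect pairwise conflicts between scheduled reminders,
    each modeled as a (name, start, end) time interval."""
    out = []
    s = sorted(reminders, key=lambda r: r[1])   # sort by start time
    for i in range(len(s)):
        for j in range(i + 1, len(s)):
            if s[j][1] < s[i][2]:               # next starts before this ends
                out.append((s[i][0], s[j][0]))
    return out
```

A detected conflict would then trigger the rescheduling strategy, e.g. shifting the lower-priority reminder past the disruptive activity.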

15.
Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage, and modification. The measure of information transfer in particular, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy between two processes requires multiple observed realizations of those processes to estimate the associated probability density functions. To obtain the necessary observations, available estimators typically assume stationarity of the processes so that observations can be pooled over time. This assumption, however, is a major obstacle to applying these estimators in neuroscience, where the observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues showed theoretically that the stationarity assumption can be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble is often readily available in neuroscience experiments in the form of experimental trials. In this work, we therefore combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach, suited to the increased computational demand of the ensemble method in practical applications. In particular, we use a massively parallel implementation on a graphics processing unit to handle the computationally heaviest aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes, and we demonstrate the applicability of the ensemble method to magnetoencephalographic data. While we mainly evaluate the proposed method on neuroscience data, we expect it to be applicable in a variety of fields concerned with the analysis of information transfer in complex biological, social, and artificial systems.
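The core of the ensemble idea is to pool observations across trials at matched time points instead of across time within one trial. A plug-in estimator for discrete symbols with history length 1 (the estimator discussed above is a nearest-neighbor method for continuous data, which this sketch does not reproduce):

```python
import math
from collections import Counter

def transfer_entropy(xs, ys):
    """Plug-in transfer entropy X -> Y from an ensemble of trials.
    xs, ys: lists of trials, each a list of discrete symbols; counts
    are pooled across trials, so stationarity in time is not needed."""
    triples = Counter()
    for x, y in zip(xs, ys):
        for t in range(1, len(y)):
            triples[(y[t], y[t - 1], x[t - 1])] += 1
    n = sum(triples.values())
    pair_yy, pair_yx, single_y = Counter(), Counter(), Counter()
    for (yt, yp, xp), c in triples.items():
        pair_yy[(yt, yp)] += c     # (target, target past)
        pair_yx[(yp, xp)] += c     # (target past, source past)
        single_y[yp] += c
    te = 0.0
    for (yt, yp, xp), c in triples.items():
        # sum p(yt,yp,xp) * log2[ p(yt|yp,xp) / p(yt|yp) ]
        te += (c / n) * math.log2(
            (c * single_y[yp]) / (pair_yy[(yt, yp)] * pair_yx[(yp, xp)]))
    return te
```

When Y copies X with a one-step lag, the estimate is positive in the X to Y direction and exactly zero in the reverse direction, as transfer entropy's asymmetry requires.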

16.
Wei LY, Huang CL, Chen CH. BMC Genetics 2005, 6(Suppl 1): S133
Rough set theory and decision trees are data mining methods used to deal with vagueness and uncertainty, and they have been used to unearth hidden patterns in complicated datasets collected from industrial processes. The Genetic Analysis Workshop 14 simulated data were generated by a system that implemented multiple correlations among four successive layers of genetic data (disease-related loci, endophenotypes, phenotypes, and one disease trait). When information from one layer was blocked, creating uncertainty in the correlations among the layers, the correlation between the first and last layers (susceptibility genes and the disease trait in this case) was not easily detected directly. In this study, we proposed a two-stage process that applies rough set theory and decision trees to identify genes conferring susceptibility to the disease trait. In the first stage, decision trees were built from the phenotypes of subjects and their parents to predict trait values. The phenotypes retained in the decision trees were then advanced to the second stage, where rough set theory was applied to discover the minimal subsets of genes associated with the disease trait. For comparison, decision trees were also constructed to map susceptibility genes during the second stage. Our results showed that the first-stage decision trees had accuracy rates of about 99% in predicting the disease trait, but neither the decision trees nor rough set theory identified the true disease-related loci in the second stage.
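The rough-set step searches for minimal attribute subsets whose indiscernibility classes still determine the trait. The basic building blocks, shown on a toy decision table with hypothetical gene names:

```python
def indiscernibility(rows, attrs):
    """Partition object indices into classes that are indiscernible
    on the given attributes."""
    classes = {}
    for i, row in enumerate(rows):
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(i)
    return list(classes.values())

def lower_approximation(rows, attrs, target):
    """Objects whose entire indiscernibility class lies inside the
    target set; these are classified with certainty by `attrs`."""
    inside = [c for c in indiscernibility(rows, attrs) if c <= target]
    return set().union(*inside) if inside else set()
```

A reduct is then a minimal `attrs` whose lower approximation of each trait class matches that of the full attribute set; here objects 0 and 1 share genotypes but not trait membership, so only object 2 is certain.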

17.
Methanogenic fermentation involves a natural ecosystem that can be used for wastewater treatment. This anaerobic process can have two locally stable steady states and one unstable steady state, making the process hard to handle. The aim of this work is to propose analytical criteria to detect hazardous operating modes, namely situations where the system evolves toward acidification of the plant. We first introduce a commonly used simplified model and recall its main properties. To assess the evolution of the system, we study the phase plane and split it into nineteen zones according to qualitative traits. We then introduce a methodology to monitor the trajectory of the system across these zones in real time and determine its position in the plane. This leads to a dynamical risk index based on the analysis of transitions from one zone to another, and generates a classification of the zones according to their dangerousness. Finally, the proposed strategy is applied to a virtual process based on the ADM1 model. It is worth noting that the proposed approach does not rely on the values of the parameters and is thus very robust. © 2009 American Institute of Chemical Engineers, Biotechnol. Prog., 2009
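The zone bookkeeping can be sketched as bracketing the two state variables between qualitative thresholds and flagging trajectories that enter zones classified as dangerous. The thresholds and zone labels below are illustrative only, not the paper's nineteen-zone partition:

```python
def zone(s1, s2, thresholds):
    """Locate the state (s1, s2) in a grid cell of the phase plane by
    counting how many thresholds each coordinate exceeds."""
    i = sum(s1 > t for t in thresholds[0])
    j = sum(s2 > t for t in thresholds[1])
    return (i, j)

def is_hazardous(path, hazardous_zones):
    """Flag a trajectory that enters a zone classified as leading
    toward acidification."""
    return any(z in hazardous_zones for z in path)
```

A dynamical risk index would additionally score the transitions along `path` rather than only membership, raising the alarm as the trajectory drifts through ever more dangerous zones.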

18.
Most mathematical models of the growth and remodeling of load-bearing soft tissues are based on one of two major approaches: a kinematic theory that specifies an evolution equation for the stress-free configuration of the tissue as a whole or a constrained mixture theory that specifies rates of mass production and removal of individual constituents within stressed configurations. The former is popular because of its conceptual simplicity, but relies largely on heuristic definitions of growth; the latter is based on biologically motivated micromechanical models, but suffers from higher computational costs due to the need to track all past configurations. In this paper, we present a temporally homogenized constrained mixture model that combines advantages of both classical approaches, namely a biologically motivated micromechanical foundation, a simple computational implementation, and low computational cost. As illustrative examples, we show that this approach describes well both cell-mediated remodeling of tissue equivalents in vitro and the growth and remodeling of aneurysms in vivo. We also show that this homogenized constrained mixture model suggests an intimate relationship between models of growth and remodeling and viscoelasticity. That is, important aspects of tissue adaptation can be understood in terms of a simple mechanical analog model, a Maxwell fluid (i.e., spring and dashpot in series) in parallel with a “motor element” that represents cell-mediated mechanoregulation of extracellular matrix. This analogy allows a simple implementation of homogenized constrained mixture models within commercially available simulation codes by exploiting available models of viscoelasticity.
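The mechanical analog described above can be integrated explicitly: a Maxwell element (spring of stiffness k in series with a dashpot of viscosity eta) in parallel with a constant motor stress, here under a step strain so the Maxwell branch relaxes toward the motor tension. Parameter values are arbitrary illustration, not tissue data:

```python
def simulate(total_strain, k, eta, motor_stress, dt=0.01, steps=1000):
    """Stress-relaxation history of the analog model under a constant
    applied strain: the Maxwell branch decays as its dashpot flows,
    while the parallel 'motor element' holds a constant tension."""
    elastic_strain = total_strain       # dashpot starts unstretched
    out = []
    for _ in range(steps):
        sigma_m = k * elastic_strain    # Maxwell branch stress
        # dashpot flows at rate sigma_m / eta, unloading the spring
        elastic_strain -= (sigma_m / eta) * dt
        out.append(sigma_m + motor_stress)
    return out
```

The exponential decay toward the motor stress is the viscoelastic signature of remodeling that the homogenized constrained mixture model exploits.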

19.
In this work we present a web-based tool for estimating multiple alignment quality using Bayesian hypothesis testing. The proposed method is very simple, easily implemented, and fast, with linear complexity. We evaluated the method against a series of different alignments (a set of random and biologically derived alignments) and compared the results with tools based on classical statistical methods (such as sFFT and csFFT). Taking the correlation coefficient as an objective criterion of true quality, we found that Bayesian hypothesis testing performed better on average than the classical methods we tested. This approach may be used independently or as a component of any tool in computational biology that relies on statistical estimation of alignment quality. AVAILABILITY: http://www.fmi.ch/groups/functional.genomics/tool.htm. SUPPLEMENTARY INFORMATION: Supplementary data are available from http://www.fmi.ch/groups/functional.genomics/tool-Supp.htm.
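One generic way to cast per-column alignment quality as a Bayesian hypothesis test: compare the marginal likelihood of a column under a Dirichlet-multinomial "conserved" model against a uniform background. This is a sketch of the general technique, not the tool's actual models:

```python
import math

def log_bayes_factor(column, alphabet="ACGT", alpha=1.0):
    """Log Bayes factor for one alignment column: multinomial with a
    symmetric Dirichlet(alpha) prior vs. a uniform background model.
    Positive values favor the conserved (structured) hypothesis."""
    counts = [column.count(a) for a in alphabet]
    n, k = sum(counts), len(alphabet)
    # marginal likelihood of the Dirichlet-multinomial model
    log_ml = (math.lgamma(k * alpha) - math.lgamma(n + k * alpha)
              + sum(math.lgamma(c + alpha) - math.lgamma(alpha)
                    for c in counts))
    log_uniform = n * math.log(1.0 / k)
    return log_ml - log_uniform
```

A perfectly conserved column yields a positive log Bayes factor, while a maximally mixed column yields a negative one, so summing over columns gives a simple linear-time quality score.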

20.
Biological soft tissues experience damage and failure as a result of injury, disease, or simply age; examples include torn ligaments and arterial dissections. Given the complexity of tissue geometry and material behavior, computational models are often essential for studying both damage and failure. Yet, because of the need to account for discontinuous phenomena such as crazing, tearing, and rupturing, continuum methods are limited. Therefore, we model soft tissue damage and failure using a particle/continuum approach. Specifically, we combine continuum damage theory with Smoothed Particle Hydrodynamics (SPH). Because SPH is a meshless particle method, and particle connectivity is determined solely through a neighbor list, discontinuities can be readily modeled by modifying this list. We show, for the first time, that an anisotropic hyperelastic constitutive model commonly employed for modeling soft tissue can be conveniently implemented within a SPH framework and that SPH results show excellent agreement with analytical solutions for uniaxial and biaxial extension as well as finite element solutions for clamped uniaxial extension in 2D and 3D. We further develop a simple algorithm that automatically detects damaged particles and disconnects the spatial domain along rupture lines in 2D and rupture surfaces in 3D. We demonstrate the utility of this approach by simulating damage and failure under clamped uniaxial extension and in a peeling experiment of virtual soft tissue samples. In conclusion, SPH in combination with continuum damage theory may provide an accurate and efficient framework for modeling damage and failure in soft tissues.
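Because SPH connectivity lives entirely in the neighbor list, failure can be introduced simply by pruning it. A sketch of the disconnect step, with hypothetical particle ids and damage values (the paper's detection of rupture lines and surfaces is geometric and more involved):

```python
def prune_neighbors(neighbors, damage, threshold=1.0):
    """Meshless failure sketch: once a particle's damage reaches the
    threshold, sever its neighbor-list entries so a crack can open
    without any remeshing."""
    pruned = {}
    for p, nbrs in neighbors.items():
        if damage.get(p, 0.0) >= threshold:
            pruned[p] = set()                 # fully failed particle
        else:
            pruned[p] = {q for q in nbrs
                         if damage.get(q, 0.0) < threshold}
    return pruned
```

After pruning, the standard SPH summations simply no longer couple particles across the rupture, which is what lets cracks and peeling fronts propagate.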
