Found 20 similar documents; search time: 0 ms
2.
Joshua Colvin, Michael I Monine, Ryan N Gutenkunst, William S Hlavacek, Daniel D Von Hoff, Richard G Posner 《BMC Bioinformatics》2010,11(1):404
Background
The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems.
3.
Sensitivity analysis quantifies the dependence of system behavior on the parameters that affect the process dynamics. Classical sensitivity analysis, however, does not directly apply to discrete stochastic dynamical systems, which have recently gained popularity because of their relevance to the simulation of biological processes. In this work, sensitivity analysis for discrete stochastic processes is developed based on density function (distribution) sensitivity, using an analog of the classical sensitivity and the Fisher Information Matrix. There exist many circumstances, such as in systems with multistability, in which the stochastic effects become nontrivial and classical sensitivity analysis on the deterministic representation of a system cannot adequately capture the true system behavior. The proposed analysis is applied to a bistable chemical system, the Schlögl model, and to a synthetic genetic toggle-switch model. Comparisons between the stochastic and deterministic analyses show the significance of explicitly considering the probabilistic nature of this class of processes in sensitivity analysis.
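The distribution-level (density-function) sensitivity idea can be illustrated on a simple immigration-death process. The sketch below is a hedged illustration with made-up parameters, using a plain central finite difference on simulated end-time distributions rather than the Fisher-Information-based analysis of the paper:

```python
import random

def immigration_death_sample(k_in, k_out, t_end, rng):
    """One end-time sample X(t_end) of an immigration-death process
    (births at constant rate k_in, deaths at rate k_out * x), via SSA."""
    x, t = 0, 0.0
    while True:
        a0 = k_in + k_out * x            # total propensity (always > 0)
        tau = rng.expovariate(a0)
        if t + tau > t_end:
            return x
        t += tau
        x += 1 if rng.random() * a0 < k_in else -1

def distribution_sensitivity(k_in, k_out, dk, n, t_end=10.0, seed=0):
    """Central finite-difference estimate of dP(X(t_end)=i)/dk_in --
    a sensitivity of the whole distribution, not of a single
    deterministic output."""
    rng = random.Random(seed)
    lo = [immigration_death_sample(k_in - dk, k_out, t_end, rng) for _ in range(n)]
    hi = [immigration_death_sample(k_in + dk, k_out, t_end, rng) for _ in range(n)]
    support = range(max(lo + hi) + 1)
    pmf = lambda data, i: sum(1 for v in data if v == i) / len(data)
    return [(pmf(hi, i) - pmf(lo, i)) / (2 * dk) for i in support]

# Sensitivities over the whole support sum to zero: probability is conserved
sens = distribution_sensitivity(k_in=5.0, k_out=1.0, dk=0.5, n=500)
```

Because each empirical distribution sums to one, the sensitivities over the support sum to zero, which is a useful sanity check on any distribution-sensitivity estimate.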
4.
Background
The importance of stochasticity in cellular processes involving low numbers of molecules has led to the development of stochastic models such as the chemical master equation. As in other modelling frameworks, the accompanying rate constants are important for end-applications such as analyzing system properties (e.g. robustness) or predicting the effects of genetic perturbations. Prior knowledge of kinetic constants is usually limited, and the model identification routine typically includes parameter estimation from experimental data. Although the subject of parameter estimation is well-established for deterministic models, it is not yet routine for the chemical master equation. In addition, recent advances in measurement technology have made the quantification of genetic substrates possible down to single-molecule levels. Thus, the purpose of this work is to develop practical and effective methods for estimating kinetic model parameters in the chemical master equation and other stochastic models from single-cell and cell-population experimental data.
5.
We investigate how stochastic reaction processes are affected by external perturbations. We describe an extension of deterministic metabolic control analysis (MCA) to the stochastic regime. We introduce stochastic sensitivities for the mean and covariance values of reactant concentrations and reaction fluxes and show that MCA-like summation theorems hold among these sensitivities. The summation theorem for flux variances is shown to depend on the size of the measurement time window within which reaction events are counted for measuring a single flux. The degree of this time-window dependency can become significant for processes involving multi-time-scale dynamics and is estimated by introducing a new measure of time-scale separation. This dependency is shown to be closely related to the power-law scaling observed in flux fluctuations in various complex networks.
6.
Background
A prerequisite for the mechanistic simulation of a biochemical system is detailed knowledge of its kinetic parameters. Despite recent experimental advances, the estimation of unknown parameter values from observed data is still a bottleneck for obtaining accurate simulation results. Many methods exist for parameter estimation in deterministic biochemical systems; methods for discrete stochastic systems are less well developed. Given the probabilistic nature of stochastic biochemical models, a natural approach is to choose parameter values that maximize the probability of the observed data with respect to the unknown parameters, a.k.a. the maximum likelihood parameter estimates (MLEs). MLE computation for all but the simplest models requires the simulation of many system trajectories that are consistent with experimental data. For models with unknown parameters, this presents a computational challenge, as the generation of consistent trajectories can be an extremely rare occurrence.
Results
We have developed Monte Carlo Expectation-Maximization with Modified Cross-Entropy Method (MCEM2): an accelerated method for calculating MLEs that combines advances in rare event simulation with a computationally efficient version of the Monte Carlo expectation-maximization (MCEM) algorithm. Our method requires no prior knowledge regarding parameter values, and it automatically provides a multivariate parameter uncertainty estimate. We applied the method to five stochastic systems of increasing complexity, progressing from an analytically tractable pure-birth model to a computationally demanding model of yeast polarization. Our results demonstrate that MCEM2 substantially accelerates MLE computation on all tested models when compared to a stand-alone version of MCEM. Additionally, we show how our method identifies parameter values for certain classes of models more accurately than two recently proposed computationally efficient methods.
Conclusions
This work provides a novel, accelerated version of a likelihood-based parameter estimation method that can be readily applied to stochastic biochemical systems. In addition, our results suggest opportunities for added efficiency improvements that will further enhance our ability to mechanistically simulate biological processes.
8.
The weighted stochastic simulation algorithm (wSSA) recently developed by Kuwahara and Mura and the refined wSSA proposed by Gillespie et al. based on the importance sampling technique open the door for efficient estimation of the probability of rare events in biochemical reaction systems. In this paper, we first apply the importance sampling technique to the next reaction method (NRM) of the stochastic simulation algorithm and develop a weighted NRM (wNRM). We then develop a systematic method for selecting the values of importance sampling parameters, which can be applied to both the wSSA and the wNRM. Numerical results demonstrate that our parameter selection method can substantially improve the performance of the wSSA and the wNRM in terms of simulation efficiency and accuracy.
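The importance-sampling idea behind the wSSA can be sketched on a toy birth-death process. The model, rate constants, and bias factor `gamma` below are illustrative assumptions, not values from the paper; the point is only the mechanics of biased reaction selection with a likelihood-ratio weight:

```python
import random

def wssa_rare_event(x0, k_plus, k_minus, target, t_end, gamma, n_runs, seed=0):
    """Estimate P(population reaches `target` before t_end) for a
    birth-death process (birth rate k_plus, death rate k_minus * x).
    Reactions are *selected* under biased propensities b_j, while each
    trajectory carries a likelihood-ratio weight that keeps the
    estimator unbiased -- the importance-sampling idea behind the wSSA."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        x, t, w = x0, 0.0, 1.0
        while x < target:
            a = [k_plus, k_minus * x]          # true propensities
            b = [gamma * a[0], a[1] / gamma]   # biased toward births
            a0, b0 = sum(a), sum(b)
            tau = rng.expovariate(a0)          # firing time drawn from true a0
            if t + tau > t_end:
                break
            t += tau
            j = 0 if rng.random() * b0 < b[0] else 1
            w *= (a[j] / b[j]) * (b0 / a0)     # likelihood-ratio correction
            x += 1 if j == 0 else -1
        if x >= target:
            total += w
    return total / n_runs

# gamma = 1.0 recovers the plain (unweighted) SSA estimator
p_hat = wssa_rare_event(x0=10, k_plus=1.0, k_minus=0.1, target=14,
                        t_end=5.0, gamma=1.5, n_runs=300)
```

With `gamma > 1` the rare upward excursion is sampled more often, and the weight `w` removes the resulting bias from the estimate.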
9.
Mauch S, Stalzer M 《IEEE/ACM Transactions on Computational Biology and Bioinformatics》2011,8(1):27-35
One can generate trajectories to simulate a system of chemical reactions using either Gillespie's direct method or Gibson and Bruck's next reaction method. Because one usually needs many trajectories to understand the dynamics of a system, performance is important. In this paper, we present new formulations of these methods that improve the computational complexity of the algorithms. We present optimized implementations, available from http://cain.sourceforge.net/, that offer better performance than previous work. There is no single method that is best for all problems. Simple formulations often work best for systems with a small number of reactions, while some sophisticated methods offer the best performance for large problems and scale well asymptotically. We investigate the performance of each formulation on simple biological systems using a wide range of problem sizes. We also consider the numerical accuracy of the direct and the next reaction method. We have found that special precautions must be taken in order to ensure that randomness is not discarded during the course of a simulation.
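Gillespie's direct method referenced above admits a compact sketch. The following minimal Python implementation is an illustration of the basic algorithm only, not the optimized formulations presented in the paper:

```python
import random

def gillespie_direct(x, stoich, rate_fns, t_end, seed=0):
    """One trajectory of a reaction system via Gillespie's direct method.

    x        -- list of species counts (initial state)
    stoich   -- list of state-change vectors, one per reaction
    rate_fns -- list of propensity functions a_j(state)
    """
    rng = random.Random(seed)
    t, path = 0.0, [(0.0, tuple(x))]
    while t < t_end:
        a = [f(x) for f in rate_fns]     # current propensities
        a0 = sum(a)
        if a0 == 0.0:                    # no reaction can fire
            break
        t += rng.expovariate(a0)         # exponential waiting time
        r, j, acc = rng.random() * a0, 0, a[0]
        while acc < r:                   # pick reaction j with prob a_j / a0
            j += 1
            acc += a[j]
        x = [xi + d for xi, d in zip(x, stoich[j])]
        path.append((t, tuple(x)))       # state recorded at each firing
    return path

# Example: pure decay A -> 0 with propensity 0.5 * A
trajectory = gillespie_direct([100], [(-1,)], [lambda s: 0.5 * s[0]], t_end=20.0)
```

The next reaction method replaces the linear reaction search with a priority queue of tentative firing times, which is where the complexity improvements discussed in the paper come in.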
11.
Goutsias J 《Biophysical journal》2007,92(7):2350-2365
We study fundamental relationships between classical and stochastic chemical kinetics for general biochemical systems with elementary reactions. Analytical and numerical investigations show that intrinsic fluctuations may qualitatively and quantitatively affect both transient and stationary system behavior. Thus, we provide a theoretical understanding of the role that intrinsic fluctuations may play in inducing biochemical function. The mean concentration dynamics are governed by differential equations that are similar to the ones of classical chemical kinetics, expressed in terms of the stoichiometry matrix and time-dependent fluxes. However, each flux is decomposed into a macroscopic term, which accounts for the effect of mean reactant concentrations on the rate of product synthesis, and a mesoscopic term, which accounts for the effect of statistical correlations among interacting reactions. We demonstrate that the ability of a model to account for phenomena induced by intrinsic fluctuations may be seriously compromised if we do not include the mesoscopic fluxes. Unfortunately, computation of fluxes and mean concentration dynamics requires intensive Monte Carlo simulation. To circumvent the computational expense, we employ a moment closure scheme, which leads to differential equations that can be solved by standard numerical techniques to obtain more accurate approximations of fluxes and mean concentration dynamics than the ones obtained with the classical approach.
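The macroscopic/mesoscopic distinction can be made concrete on a dimerization reaction 2A → B, whose stochastic propensity depends on E[A(A−1)] rather than E[A]². The sketch below (illustrative parameters, not from the paper) compares a Monte Carlo mean with the classical rate-equation mean:

```python
import random

def ssa_mean_dimerization(a_init, c, t_end, n_runs, seed=0):
    """Monte Carlo mean of A(t_end) for the dimerization 2A -> B.
    The stochastic propensity c*A*(A-1)/2 depends on E[A(A-1)], not on
    E[A]^2, which is where the mesoscopic (correlation) term enters."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        x, t = a_init, 0.0
        while x > 1:                      # reaction needs at least 2 copies
            a = c * x * (x - 1) / 2.0
            tau = rng.expovariate(a)
            if t + tau > t_end:
                break
            t += tau
            x -= 2                        # one firing consumes two A
        total += x
    return total / n_runs

def ode_mean_dimerization(a_init, c, t_end, dt=1e-3):
    """Classical (macroscopic) mean from da/dt = -c*a^2, forward Euler."""
    a, t = float(a_init), 0.0
    while t < t_end:
        a -= c * a * a * dt
        t += dt
    return a
```

Running both on the same parameters exposes the gap that the mesoscopic flux term, and the moment closure built on it, is meant to close.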
13.
A P system represents a distributed and parallel bio-inspired computing model in which the basic data structures are multi-sets or strings. Numerical P systems have been introduced recently; they use numerical variables and local programs (or evolution rules), usually in a deterministic way. They may find interesting applications in areas such as computational biology, process control or robotics. The first simulator of numerical P systems (SNUPS) has been designed, implemented and made available to the scientific community by the authors of this paper. SNUPS allows a wide range of applications, from modeling and simulation of ordinary differential equations, to the use of membrane systems as computational blocks of cognitive architectures, and as controllers for autonomous mobile robots. This paper describes the functioning of a numerical P system and presents an overview of SNUPS capabilities together with an illustrative example. Availability: SNUPS is freely available to researchers as a standalone application and may be downloaded from a dedicated website, http://snups.ics.pub.ro/, which includes a user manual and sample membrane structures.
16.
Werner Sandmann 《Mathematical biosciences》2009,221(1):43-2141
Stochastic simulation of biological systems proceeds by repeatedly generating sample paths or trajectories of the underlying stochastic process, from which many relevant and important system properties can be obtained. While a great deal of research targets accelerated trajectory generation, issues concerning variability across trajectories are often neglected. Advanced methods for properly quantifying statistical accuracy and determining a reasonable number of trajectories are rarely addressed formally in the context of biological system simulation, even though mathematical statistics provides a large body of powerful theory. We invoke this theory and show how mathematically well-founded sequential estimation approaches can systematically generate enough, but not too many, trajectories to achieve a prescribed accuracy. The practical applicability is demonstrated and illustrated by numerical examples through simulation studies of an immigration-death process and a gene regulatory network.
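A sequential estimation scheme of the kind described above can be sketched as a stopping rule on the confidence-interval half-width. The tolerances and the stand-in sampler below are illustrative assumptions, not the authors' exact procedure:

```python
import math
import random

def sequential_mean(sample_fn, rel_err=0.05, z=1.96, n_min=30, n_max=100000):
    """Draw one trajectory summary at a time until the estimated
    confidence-interval half-width falls below rel_err * |mean|.
    Returns the estimate and the number of trajectories used."""
    n, mean, m2 = 0, 0.0, 0.0          # Welford's running mean/variance
    while n < n_max:
        v = sample_fn()
        n += 1
        d = v - mean
        mean += d / n
        m2 += d * (v - mean)
        if n >= n_min:
            half = z * math.sqrt(m2 / (n - 1) / n)   # z * standard error
            if half <= rel_err * abs(mean):
                break
    return mean, n

# Stand-in for an expensive trajectory summary: a noisy scalar readout
rng = random.Random(42)
estimate, n_used = sequential_mean(lambda: rng.gauss(10.0, 2.0))
```

The point of the sequential approach is visible in `n_used`: the simulation stops as soon as the prescribed accuracy is met, instead of running a number of trajectories fixed in advance.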
17.
A J van Soest, A L Schwab, M F Bobbert, G J van Ingen Schenau 《Journal of Biomechanics》1992,25(10):1219-1226
Direct dynamics computer simulation is gaining importance as a research tool in the biomechanical study of complex human movements. Therefore, the need for general-purpose software packages with which the equations of motion can be derived automatically and solved numerically is growing. In this paper such a method is described: SPACAR. The method is compared to well-known commercially available software packages. On the basis of the results obtained on a test problem simulated with both SPACAR and DADS, it is concluded that both methods are accurate; DADS is much faster. The user-friendliness of SPACAR is less than that of DADS. However, SPACAR has two major advantages. First is the basic deformability of all elements, which allows handling of all kinds of problems within a unified framework; second is the full availability of the source code, which allows the experienced user to broaden the scope of possibilities to any extent.
18.
Stochastic P systems and the simulation of biochemical processes with dynamic compartments (total citations: 3; self-citations: 0; cited by others: 3)
We introduce a sequential rewriting strategy for P systems based on Gillespie's stochastic simulation algorithm, and show that the resulting formalism of stochastic P systems makes it possible to simulate biochemical processes in dynamically changing, nested compartments. Stochastic P systems have been implemented using the spatially explicit programming language MGS. Implementation examples include models of the Lotka-Volterra auto-catalytic system, and the life cycle of the Semliki Forest virus.