Similar Articles (20 results found)
1.
One can generate trajectories to simulate a system of chemical reactions using either Gillespie's direct method or Gibson and Bruck's next reaction method. Because one usually needs many trajectories to understand the dynamics of a system, performance is important. In this paper, we present new formulations of these methods that improve the computational complexity of the algorithms. We present optimized implementations, available from http://cain.sourceforge.net/, that offer better performance than previous work. There is no single method that is best for all problems. Simple formulations often work best for systems with a small number of reactions, while some sophisticated methods offer the best performance for large problems and scale well asymptotically. We investigate the performance of each formulation on simple biological systems using a wide range of problem sizes. We also consider the numerical accuracy of the direct and next reaction methods. We have found that special precautions must be taken to ensure that randomness is not discarded during the course of a simulation.
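Gillespie's direct method referred to above is well documented in the literature. As a plain illustration of the baseline algorithm (not the optimized formulations of this paper), a minimal sketch might look like the following; the function names and the decay example are assumptions of this sketch:

```python
import random

def gillespie_direct(x, propensities, stoich, t_end, seed=0):
    """One trajectory via Gillespie's direct method.

    x: initial species counts, propensities: list of functions a_j(x),
    stoich: list of state-change vectors, t_end: stop time.
    """
    rng = random.Random(seed)
    t, path = 0.0, [(0.0, list(x))]
    while t < t_end:
        a = [f(x) for f in propensities]
        a0 = sum(a)
        if a0 == 0.0:                    # no reaction can fire
            break
        t += rng.expovariate(a0)         # exponential waiting time
        r, acc, j = rng.random() * a0, 0.0, 0
        for j, aj in enumerate(a):       # pick reaction j with prob a_j / a0
            acc += aj
            if r < acc:
                break
        x = [xi + vi for xi, vi in zip(x, stoich[j])]
        path.append((t, list(x)))
    return path

# Simple decay A -> 0 with rate constant 0.5
path = gillespie_direct([100], [lambda s: 0.5 * s[0]], [[-1]], t_end=50.0)
```

The next reaction method improves on this by reusing random numbers and indexing reactions in a priority queue, which is where the formulation differences studied in the paper matter.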

2.
Memory is a ubiquitous phenomenon in biological systems in which the present system state is not entirely determined by the current conditions but also depends on the time evolutionary path of the system. Specifically, many memorial phenomena are characterized by chemical memory reactions that may fire under particular system conditions. These conditional chemical reactions contradict the extant stochastic approaches for modeling chemical kinetics and have increasingly posed significant challenges to mathematical modeling and computer simulation. To tackle the challenge, I proposed a novel theory consisting of the memory chemical master equations and the memory stochastic simulation algorithm. A stochastic model for single-gene expression was proposed to illustrate the key function of memory reactions in inducing the bursting dynamics of gene expression that have been observed in recent experiments. The importance of memory reactions was further validated by the stochastic model of the p53-MDM2 core module. Simulations showed that memory reactions are a major mechanism for realizing both sustained oscillations of p53 protein numbers in single cells and damped oscillations over a population of cells. These successful applications of the memory modeling framework suggest that this innovative theory is an effective and powerful tool for studying memory processes and conditional chemical reactions in a wide range of complex biological systems.

3.
In this paper a nonholonomic mobile robot with completely unknown dynamics is discussed. A mathematical model has been considered and an efficient neural network is developed, which ensures guaranteed tracking performance leading to stability of the system. The neural network assumes a single-layer structure by taking advantage of the robot regressor dynamics, which express the highly nonlinear robot dynamics in a linear form in terms of the known and unknown robot dynamic parameters. No boundedness assumptions are placed on the unmodeled disturbances. The controller is capable of generating real-time smooth and continuous velocity control signals that drive the mobile robot to follow the desired trajectories. The proposed approach resolves the speed-jump problem present in some previous tracking controllers. Further, this neural network does not require offline training procedures. Lyapunov theory has been used to prove system stability. The practicality and effectiveness of the proposed tracking controller are demonstrated by simulation and comparison results.

4.
Networks of molecular interactions regulate key processes in living cells. Therefore, understanding their functionality is a high priority in advancing biological knowledge. Boolean networks are often used to describe cellular networks mathematically and are fitted to experimental datasets. The fitting often results in ambiguities, since the interpretation of the measurements is not straightforward and the data contain noise. To facilitate a more reliable mapping between datasets and Boolean networks, we develop an algorithm that infers network trajectories from a dataset distorted by noise. We analyze our algorithm theoretically and demonstrate its accuracy using simulation and microarray expression data.
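The Boolean network trajectories being fitted here are easy to generate in code. The three-node network and its update rules below are purely hypothetical illustrations of the formalism, not the networks analyzed in the paper:

```python
# Hypothetical 3-node Boolean network: each node's next state is a
# Boolean function of the current state vector s = (s0, s1, s2).
rules = {
    0: lambda s: s[1] and not s[2],
    1: lambda s: s[0],
    2: lambda s: s[0] or s[1],
}

def step(state):
    """Synchronous update: all nodes evaluated on the old state."""
    return tuple(int(bool(rules[i](state))) for i in range(len(state)))

def trajectory(state, n_steps):
    """Deterministic trajectory of the network from a start state."""
    traj = [state]
    for _ in range(n_steps):
        state = step(state)
        traj.append(state)
    return traj

traj = trajectory((1, 0, 0), 5)   # settles into the (0, 0, 0) fixed point
```

The inference problem of the paper runs the other way: given noisy observations of such states, recover the most plausible trajectory consistent with the network.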

5.
Human motion studies have focused primarily on modeling straight point-to-point reaching movements. However, many goal-directed reaching movements, such as movements directed towards oneself, are not straight but rather follow highly curved trajectories. These movements are particularly interesting to study since they are essential in our everyday life, appear early in development and are routinely used to assess movement deficits following brain lesions. We argue that curved and straight-line reaching movements are generated by a unique neural controller and that the observed curvature of the movement is the result of an active control strategy that follows the geometry of one’s body, for instance to avoid trajectories that would hit the body or yield postures close to the joint limits. We present a mathematical model that accounts for such an active control strategy and show that the model reproduces with high accuracy the kinematic features of human data during unconstrained reaching movements directed toward the head. The model consists of a nonlinear dynamical system with a single stable attractor at the target. Embodiment-related task constraints are expressed as a force field that acts on the dynamical system. Finally, we discuss the biological plausibility and neural correlates of the model’s parameters and suggest that embodiment should be considered as a main cause for movement trajectory curvature.

6.
Probabilistic automata are compared with deterministic ones in simulations of growing networks made of dividing interconnected cells. On examples of chains, wheels and tree-like structures made of large numbers of cells it is shown that the number of necessary states in the initial generating cell automaton is reduced drastically when the automaton is probabilistic rather than deterministic. Since the price paid is a decrease in the accuracy of the generated network, conditions under which reasonable compromises can be achieved are studied. They depend on the degree of redundancy of the final network (defined from the complexity of a deterministic automaton capable of generating it with maximum accuracy), on the "entropy" of the generating probabilistic automaton, and on the effects of different inputs on its transition probabilities (as measured by its "capacity" in the sense of Shannon's information theory). The results are used to discuss and make more precise the notion of biological specificity. It is suggested that the weak metaphor of a genetic program, classically used to account for the role of DNA in specific genetic determinations, be replaced by that of inputs to biochemical probabilistic automata.

7.
Silver nanoparticles are among the most commercialized nanomaterials. They are widely applied as biocides for their strong antimicrobial activity, and their conductive, optic and catalytic properties make them desirable in many other applications. The chemical and physical processes used to synthesize silver nanoparticles generally have many disadvantages and are not eco-friendly. In this review, we discuss biological alternatives that have been developed using microorganisms or plants to produce biogenic silver. Until now, only their antimicrobial activity has been studied in detail; a wide range of practical applications as biocide, biosensor, and catalyst remains unexplored. The shape, size, and functionalization of the nanoparticles are defined by the biological system used to produce them, hence for every application a specific biological production process needs to be chosen. On the other hand, biogenic silver must compete with chemically produced nanosilver on the market, so large-scale production of inexpensive nanoparticles is needed. This can only be achieved when the biological production system is chosen as a function of yield. Hence, the true challenge for biogenic silver is finding the balance between scalability, price, and applicability. Biotechnol. Bioeng. 2012; 109: 2422–2436. © 2012 Wiley Periodicals, Inc.

8.
It has been suggested by Thom that the mathematical theory of structural stability gives insight into the stable reproduction of form in biological systems. In this paper a process for the generation of some simple forms is described and it is shown that the forms produced are insensitive to perturbations in the production process. The reason for this stability is found in an analysis of the process in terms of the theory of structural stability and in particular in terms of “catastrophes”. Based on the example of our form generating process, we give a general definition of form and define the pattern recognition problem (which is inverse to the generation problem) in the context of that definition.

9.
Wu R  Ma CX  Lin M  Wang Z  Casella G 《Biometrics》2004,60(3):729-738
The incorporation of developmental control mechanisms of growth has proven to be a powerful tool in mapping quantitative trait loci (QTL) underlying growth trajectories. A theoretical framework for implementing a QTL mapping strategy with growth laws has been established. This framework can be generalized to an arbitrary number of time points at which growth is measured, and becomes computationally more tractable when the assumption of variance stationarity is made. In practice, however, this assumption is likely to be violated for age-specific growth traits due to a scale effect. In this article, we present a new statistical model for mapping growth QTL that also addresses the problem of variance stationarity by using the transform-both-sides (TBS) model advocated by Carroll and Ruppert (1984, Journal of the American Statistical Association 79, 321-328). The TBS-based model for mapping growth QTL not only maintains the original biological properties of a growth model, but also increases the accuracy and precision of parameter estimation and the power to detect a QTL responsible for growth differentiation. Using the TBS-based model, we successfully map a QTL governing growth trajectories to a linkage group in an example of forest trees. The statistical and biological properties of the estimates of this growth QTL position and effect are investigated using Monte Carlo simulation studies. The implications of our model for understanding the genetic architecture of growth are discussed.

10.
On the basis of the concept of biological activity, large-scale evolution by generating new genes from gene duplication is theoretically compared between monoploid and diploid organisms. The comparison is carried out not only for the process of generating one new gene but also for the process of generating two or more kinds of new genes from successive gene duplication. It reveals the following difference in evolutionary pattern between monoploids and diploids. The monoploid organism is better suited to generating one or two new genes step by step, but its successive gene duplication is obliged to generate smaller sizes of genes owing to the more severe lowering of biological activity or self-reproducing rate. This is consistent with the evolutionary pattern of prokaryotes, which steadily developed chemical syntheses, O2-releasing photosynthesis and O2-respiration in their respective lineages. On the other hand, the diploid organism, with its plural number of homologous chromosome pairs, has a chance to bring together many kinds of new genes by the hybridization of variants that have experienced different origins of gene duplication. Although this strategy of hybridization avoids the severe lowering of biological activity, the more kinds of new genes are involved, the longer it takes to establish their homozygotes. During this long period, furthermore, different types of variants accumulate in the population, and their successive hybridization sometimes yields various styles of new organisms. This evolutionary pattern explains the explosive divergence of body plans that has occasionally occurred in diploid organisms, because cell differentiation is a representative character exhibited by many kinds of genes and its evolution to the higher hierarchy constructs body plans.

11.
Reactions involving proton transfer, especially those taking place in excited states (ESPT), have received considerable attention. These reactions underlie many biological and biochemical processes that are vital to life; DNA mutations, excited-state funnelling, and transport phenomena are just examples. Among the many molecules undergoing this type of photoreaction, phenols show a unique property, the so-called photoacidity. In the present communication, solvent- (methanol and ammonia) assisted excited-state proton transfer and the photoacidity of 2-hydroxypyridine (2HP) are investigated at the DFT M06-2X / def2pvv level of theory. Excited states of 2HP and its complexes with methanol and ammonia were examined at two levels. First, by sampling a Wigner distribution of 300 points, the photoabsorption spectra were simulated within the nuclear ensemble approximation. Second, the dynamics of the excited states were simulated with the surface hopping method; several separate spectral windows, with three hundred trajectories each, were considered. The lifetimes of the excited states and the photochemical channels observed are discussed.

12.
This paper describes an iterative learning control scheme for fed-batch operation where repetitive trajectory tracking tasks are required. The proposed learning strategy is model-independent; it takes advantage of the repetitive feature of system operations with a certain degree of intelligence and requires only a small dynamic database for the learning process. The convergence of the learning process is proven. An example of simultaneously tracking two predefined trajectories by iterative learning control with two control inputs is given to illustrate the methodology. Satisfactory performance of the learning system can be observed from the simulation results.
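The core idea of model-free iterative learning control is a trial-to-trial update of the input profile from the previous trial's tracking error. A minimal P-type sketch follows; the toy FIR plant, learning gain, and constant setpoint are assumptions of this illustration, not the fed-batch process of the paper:

```python
def run_batch(u):
    """Toy repetitive plant: y[k] = 0.5*u[k] + 0.2*u[k-1] (illustrative)."""
    out, prev = [], 0.0
    for uk in u:
        out.append(0.5 * uk + 0.2 * prev)
        prev = uk
    return out

def ilc(reference, n_trials=30, gain=1.0):
    """P-type iterative learning control: u_{i+1} = u_i + gain * e_i."""
    u = [0.0] * len(reference)
    e = list(reference)
    for _ in range(n_trials):
        y = run_batch(u)                              # one batch run
        e = [r - yk for r, yk in zip(reference, y)]   # trial error
        u = [uk + gain * ek for uk, ek in zip(u, e)]  # learn for next trial
    return u, e

ref = [1.0] * 20            # constant target trajectory over the batch
u, e = ilc(ref)             # error shrinks across trials
```

Convergence here relies on |1 - gain * 0.5| < 1, where 0.5 is the plant's direct gain; no model of the plant appears in the update law itself, which is the "model-independent" feature the abstract emphasizes.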

13.
Abseher R  Nilges M 《Proteins》2000,39(1):82-88
Collective motions in biological macromolecules have been shown to be important for function. The most important collective motions occur on slow time scales, which poses a sampling problem in dynamic simulation of biomolecules. We present a novel method for efficient conformational sampling. The method combines the simulation of an ensemble of concurrent trajectories with restraints acting on the ensemble of structures as a whole. Two properties of the ensemble may be restrained: (i) the variance of the ensemble and (ii) the average position of the ensemble. Both properties are defined in a subspace of collective coordinate space spanned by an arbitrary number of modes. We show that weak restraints on the ensemble variance suffice for an increase in sampling efficiency along soft modes by two orders of magnitude. The resulting trajectories exhibit virtually the same structural quality as trajectories generated by restraint-free molecular dynamics simulation, as judged by standard structure validation tools. The method is used to probe the resistance of a structure against conformational changes along collective modes and clearly distinguishes soft from stiff modes. Further applications are discussed. Proteins 2000;39:82-88.

14.
ABSTRACT: BACKGROUND: A prerequisite for the mechanistic simulation of a biochemical system is detailed knowledge of its kinetic parameters. Despite recent experimental advances, the estimation of unknown parameter values from observed data is still a bottleneck for obtaining accurate simulation results. Many methods exist for parameter estimation in deterministic biochemical systems; methods for discrete stochastic systems are less well developed. Given the probabilistic nature of stochastic biochemical models, a natural approach is to choose parameter values that maximize the probability of the observed data with respect to the unknown parameters, i.e., the maximum likelihood parameter estimates (MLEs). MLE computation for all but the simplest models requires the simulation of many system trajectories that are consistent with experimental data. For models with unknown parameters, this presents a computational challenge, as the generation of consistent trajectories can be an extremely rare occurrence. RESULTS: We have developed Monte Carlo Expectation-Maximization with Modified Cross-Entropy Method (MCEM2): an accelerated method for calculating MLEs that combines advances in rare event simulation with a computationally efficient version of the Monte Carlo expectation-maximization (MCEM) algorithm. Our method requires no prior knowledge regarding parameter values, and it automatically provides a multivariate parameter uncertainty estimate. We applied the method to five stochastic systems of increasing complexity, progressing from an analytically tractable pure-birth model to a computationally demanding model of yeast polarization. Our results demonstrate that MCEM2 substantially accelerates MLE computation on all tested models when compared to a stand-alone version of MCEM. Additionally, we show how our method identifies parameter values for certain classes of models more accurately than two recently proposed computationally efficient methods.
CONCLUSIONS: This work provides a novel, accelerated version of a likelihood-based parameter estimation method that can be readily applied to stochastic biochemical systems. In addition, our results suggest opportunities for added efficiency improvements that will further enhance our ability to mechanistically simulate biological processes.

15.
Integrating concepts of maintenance and of origins is essential to explaining biological diversity. The unified theory of evolution attempts to find a common theme linking production rules inherent in biological systems, explaining the origin of biological order as a manifestation of the flow of energy and the flow of information on various spatial and temporal scales, with the recognition that natural selection is an evolutionarily relevant process. Biological systems persist in space and time by transforming energy from one state to another in a manner that generates structures which allow the system to continue to persist. Two classes of energetic transformations allow this: heat-generating transformations, resulting in a net loss of energy from the system, and conservative transformations, changing unusable energy into states that can be stored and used subsequently. All conservative transformations in biological systems are coupled with heat-generating transformations; hence, inherent biological production, or genealogical processes, is positively entropic. There is a self-organizing phenomenology common to genealogical phenomena, which imparts an arrow of time to biological systems. Natural selection, which by itself is time-reversible, contributes to the organization of the self-organized genealogical trajectories. The interplay of genealogical (diversity-promoting) and selective (diversity-limiting) processes produces biological order to which the primary contribution is genealogical history. Dynamic changes occurring on time scales shorter than speciation rates are microevolutionary; those occurring on time scales longer than speciation rates are macroevolutionary. Macroevolutionary processes are neither reducible to, nor autonomous from, microevolutionary processes. Authorship alphabetical.

16.
17.
In this paper we give an overview of some very recent work, as well as presenting a new approach, on the stochastic simulation of multi-scaled systems involving chemical reactions. In many biological systems (such as genetic regulation and cellular dynamics) there is a mix of small numbers of key regulatory proteins and medium to large numbers of other molecules. In addition, it is important to be able to follow the trajectories of individual molecules by taking proper account of the randomness inherent in such a system. We describe different types of simulation techniques (including the stochastic simulation algorithm, Poisson Runge–Kutta methods and the balanced Euler method) for treating simulations in the three different reaction regimes: slow, medium and fast. We then review some recent techniques for the treatment of coupled slow and fast reactions in stochastic chemical kinetics and present a new approach that couples the three regimes mentioned above. We then apply this approach to a biologically inspired problem involving the expression and activity of LacZ and LacY proteins in E. coli, and conclude with a discussion of the significance of this work.
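To give a flavor of the Poisson-based approximate methods in this family, here is a minimal explicit tau-leaping sketch for the single decay reaction A -> 0. This is an assumption of the illustration, not the coupled three-regime scheme of the paper:

```python
import math
import random

def sample_poisson(rng, lam):
    """Knuth's Poisson sampler; adequate for the small means used here."""
    if lam <= 0.0:
        return 0
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def tau_leap(x0, rate, tau, t_end, seed=1):
    """Explicit tau-leaping for A -> 0 with propensity a(x) = rate * x."""
    rng = random.Random(seed)
    x, t, path = x0, 0.0, [(0.0, x0)]
    while t < t_end and x > 0:
        k = sample_poisson(rng, rate * x * tau)  # firings in [t, t + tau)
        x = max(x - k, 0)                        # clamp: no negative counts
        t += tau
        path.append((t, x))
    return path

path = tau_leap(x0=100, rate=0.5, tau=0.1, t_end=50.0)
```

In the multi-regime setting reviewed in the paper, a leap like this handles fast, high-copy reactions, while slow, low-copy reactions are still fired exactly, one event at a time.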

18.
19.
Upstream bioprocess characterization and optimization are time- and resource-intensive tasks. In the biopharmaceutical industry, statistical design of experiments (DoE) in combination with response surface models (RSMs) is regularly used, neglecting the process trajectories and dynamics. Generating process understanding with time-resolved, dynamic process models allows one to understand the impact of temporal deviations and production dynamics, and provides a better understanding of the process variations that stem from the biological subsystem. The authors propose to use DoE studies in combination with hybrid modeling for process characterization. This approach is showcased on Escherichia coli fed-batch cultivations at the 20 L scale, evaluating the impact of three critical process parameters. The performance of a hybrid model is compared to a pure data-driven model and the widely adopted RSM of the process endpoints. Further, the ability of the time-resolved models to simultaneously predict biomass and titer is evaluated. The superior behavior of the hybrid model compared to the pure black-box approaches for process characterization is presented. The evaluation considers important criteria, such as the prediction accuracy of the biomass and titer endpoints as well as the time-resolved trajectories. This showcases the high potential of hybrid models for soft-sensing and model predictive control.

20.
Modeling biophysical processes generally requires knowledge of the underlying biological parameters. The quality of simulation results is strongly influenced by the accuracy of these parameters; hence the identification of the parameter values a model includes is a major part of simulating biophysical processes. In many cases, secondary data can be gathered by experimental setups, which are exploitable by mathematical inverse modeling techniques. Here we describe a method for parameter identification of the diffusion properties of calcium in the nuclei of rat hippocampal neurons. The method is based on a Gauss-Newton method for solving a least-squares minimization problem and was formulated in such a way that it is ideally implementable in the simulation platform uG. Making use of independently published space- and time-dependent calcium imaging data, generated from laser-assisted calcium uncaging experiments, we identified the diffusion properties of nuclear calcium and were able to validate a previously published model that describes nuclear calcium dynamics as a diffusion process.
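The Gauss-Newton least-squares step at the heart of such an inverse problem can be sketched in a few lines. The exponential decay model, the noise-free synthetic data, and the starting values below are assumptions of this illustration, not the calcium diffusion model or the uG platform:

```python
import math

def gauss_newton(ts, ys, theta, n_iter=50):
    """Fit y(t) = A * exp(-D * t) by Gauss-Newton on residuals r_i."""
    A, D = theta
    for _ in range(n_iter):
        # residuals r_i = y_i - A*exp(-D*t_i) and their Jacobian
        r = [y - A * math.exp(-D * t) for t, y in zip(ts, ys)]
        J = [(-math.exp(-D * t), A * t * math.exp(-D * t)) for t in ts]
        # normal equations (J^T J) delta = -J^T r for the 2-parameter model
        a11 = sum(j0 * j0 for j0, _ in J)
        a12 = sum(j0 * j1 for j0, j1 in J)
        a22 = sum(j1 * j1 for _, j1 in J)
        b1 = -sum(j0 * ri for (j0, _), ri in zip(J, r))
        b2 = -sum(j1 * ri for (_, j1), ri in zip(J, r))
        det = a11 * a22 - a12 * a12
        dA = (b1 * a22 - b2 * a12) / det     # Cramer's rule, 2x2 solve
        dD = (a11 * b2 - a12 * b1) / det
        A, D = A + dA, D + dD
    return A, D

ts = [0.1 * i for i in range(20)]
ys = [2.0 * math.exp(-0.8 * t) for t in ts]   # noise-free synthetic data
A, D = gauss_newton(ts, ys, theta=(1.8, 0.6))
```

Because the synthetic data are noise-free, the residuals vanish at the optimum and Gauss-Newton behaves like Newton's method near the solution; real imaging data would of course leave nonzero residuals.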
