Similar Documents (20 results)
1.
A constant dilemma in theoretical ecology is knowing whether model predictions correspond to real phenomena or whether they are artifacts of the modelling framework. The frequent absence of detailed ecological data against which models can be tested gives this issue particular importance. We address this question in the specific case of invasion in a predator-prey system with oscillatory population kinetics, in which both species exhibit local random movement. Given only these two basic qualitative features, we consider whether we can deduce any properties of the behaviour following invasion. To do this we study four different types of mathematical model, which have no formal relationship, but which all reflect our two qualitative ingredients. The models are: reaction-diffusion equations, coupled map lattices, deterministic cellular automata, and integrodifference equations. We present results of numerical simulations of the invasion of prey by predators for each model, and show that although there are certain differences, the main qualitative features of the behaviour behind invasion are the same for all the models. Specifically, there are either irregular spatiotemporal oscillations behind the invasion, or regular spatiotemporal oscillations with the form of a periodic travelling 'wake', depending on parameter values. The observation of this behaviour in all types of model strongly suggests that it is a direct consequence of our basic qualitative assumptions, and as such is an ecological reality which will always occur behind invasion in actual oscillatory predator-prey systems.
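As a rough companion to the first of the four frameworks listed above, the following sketch integrates a one-dimensional reaction-diffusion predator-prey model with local diffusive movement and a small pocket of invading predators. The Rosenzweig-MacArthur-style kinetics, parameter values, and numerical scheme are illustrative assumptions, not the specific models used in the abstract.

```python
# Minimal sketch: predator invasion in a 1-D reaction-diffusion predator-prey model.
# Kinetics and parameters are illustrative assumptions (Rosenzweig-MacArthur-type),
# not the specific model used in the abstract above.
import numpy as np

nx, L, T, dt = 400, 400.0, 200.0, 0.01
dx = L / nx
D_u, D_v = 1.0, 1.0                      # prey / predator diffusion coefficients
a, b, m = 5.0, 5.0, 0.6                  # illustrative kinetic parameters

u = np.ones(nx)                          # prey at carrying capacity everywhere
v = np.zeros(nx)
v[:10] = 0.2                             # small pocket of invading predators at the left edge

def laplacian(f):
    # zero-flux (Neumann) boundaries via edge padding
    fp = np.pad(f, 1, mode="edge")
    return (fp[2:] - 2 * f + fp[:-2]) / dx**2

for _ in range(int(T / dt)):
    pred = a * u * v / (1.0 + b * u)     # saturating (Holling type II) predation term
    du = u * (1.0 - u) - pred            # logistic prey growth minus predation
    dv = pred - m * v                    # predator growth minus mortality
    u += dt * (D_u * laplacian(u) + du)
    v += dt * (D_v * laplacian(v) + dv)

# Behind the invasion front one expects either irregular spatiotemporal
# oscillations or a periodic travelling "wake", depending on parameters.
print(u[::50].round(3), v[::50].round(3))
```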

2.
Quantitative models of biochemical networks (signal transduction cascades, metabolic pathways, gene regulatory circuits) are a central component of modern systems biology. Building and managing these complex models is a major challenge that can benefit from the application of formal methods adopted from theoretical computing science. Here we provide a general introduction to the field of formal modelling, which emphasizes the intuitive biochemical basis of the modelling process, but is also accessible for an audience with a background in computing science and/or model engineering. We show how signal transduction cascades can be modelled in a modular fashion, using both a qualitative approach (qualitative Petri nets) and quantitative approaches (continuous Petri nets and ordinary differential equations, ODEs). We review the major elementary building blocks of a cellular signalling model, discuss which critical design decisions have to be made during model building, and present a number of novel computational tools that can help to explore alternative modular models in an easy and intuitive manner. These tools, which are based on Petri net theory, offer convenient ways of composing hierarchical ODE models, and permit a qualitative analysis of their behaviour. We illustrate the central concepts using signal transduction as our main example. The ultimate aim is to introduce a general approach that provides the foundations for a structured formal engineering of large-scale models of biochemical networks.
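To make the ODE side of this discussion concrete, here is a minimal sketch of one elementary signalling building block, a phosphorylation/dephosphorylation cycle, written as a composable ODE module. The mass-action rate laws and rate constants are illustrative assumptions rather than any model from the paper.

```python
# Minimal sketch of one elementary signalling building block as an ODE module:
# a phosphorylation/dephosphorylation cycle driven by an upstream kinase signal.
# Rate laws and constants are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def cycle(t, y, k_phos=1.0, k_dephos=0.5, signal=1.0):
    r, rp = y                                  # unphosphorylated / phosphorylated protein
    v_f = k_phos * signal * r                  # kinase step (mass action)
    v_r = k_dephos * rp                        # phosphatase step (mass action)
    return [v_r - v_f, v_f - v_r]

sol = solve_ivp(cycle, (0.0, 20.0), [1.0, 0.0], dense_output=True)
print("steady-state phosphorylated fraction ~", round(sol.y[1, -1], 3))

# Modules like this can be composed hierarchically: the output `rp` of one cycle
# becomes the `signal` input of the next stage in a cascade.
```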

3.
Quantitative computational models play an increasingly important role in modern biology. Such models typically involve many free parameters, and assigning their values is often a substantial obstacle to model development. Directly measuring in vivo biochemical parameters is difficult, and collectively fitting them to other experimental data often yields large parameter uncertainties. Nevertheless, in earlier work we showed in a growth-factor-signaling model that collective fitting could yield well-constrained predictions, even when it left individual parameters very poorly constrained. We also showed that the model had a “sloppy” spectrum of parameter sensitivities, with eigenvalues roughly evenly distributed over many decades. Here we use a collection of models from the literature to test whether such sloppy spectra are common in systems biology. Strikingly, we find that every model we examine has a sloppy spectrum of sensitivities. We also test several consequences of this sloppiness for building predictive models. In particular, sloppiness suggests that collective fits to even large amounts of ideal time-series data will often leave many parameters poorly constrained. Tests over our model collection are consistent with this suggestion. This difficulty with collective fits may seem to argue for direct parameter measurements, but sloppiness also implies that such measurements must be formidably precise and complete to usefully constrain many model predictions. We confirm this implication in our growth-factor-signaling model. Our results suggest that sloppy sensitivity spectra are universal in systems biology models. The prevalence of sloppiness highlights the power of collective fits and suggests that modelers should focus on predictions rather than on parameters.
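A toy numerical illustration of what a "sloppy" sensitivity spectrum looks like: for a sum-of-exponentials model (a standard sloppy-model example), the sketch below assembles the least-squares sensitivity matrix J^T J and prints its eigenvalues, which typically spread over several decades. The model, parameter values, and time points are assumptions chosen only for illustration.

```python
# Minimal sketch: compute the parameter-sensitivity spectrum (eigenvalues of J^T J)
# for a toy multi-exponential model, a standard example of a "sloppy" model.
# Model, parameters and time points are illustrative assumptions.
import numpy as np

t = np.linspace(0.0, 5.0, 50)
theta = np.log(np.array([1.0, 0.5, 2.0, 0.3]))     # log-rates: sensitivities taken in log space

def model(log_rates):
    rates = np.exp(log_rates)
    return np.sum(np.exp(-np.outer(t, rates)), axis=1)   # sum of decaying exponentials

# Finite-difference Jacobian of the model output w.r.t. the log-parameters
eps = 1e-6
J = np.empty((t.size, theta.size))
for i in range(theta.size):
    dp = np.zeros_like(theta)
    dp[i] = eps
    J[:, i] = (model(theta + dp) - model(theta - dp)) / (2 * eps)

eigvals = np.linalg.eigvalsh(J.T @ J)[::-1]        # descending order
print("sensitivity eigenvalues:", eigvals)
print("span (decades):", np.log10(eigvals[0] / eigvals[-1]).round(1))
```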

4.
The Scale of Successional Models and Restoration Objectives
Successional models are used to predict how restoration projects will achieve their goals. These models have been developed on different spatial and temporal scales and consequently emphasize different types of dynamics. This paper focuses on the restoration goal of self-sustainability, considered here as a long-term goal. Because of the temporal scale of this goal, we must treat processes arising outside the restoration site as being of greater importance than the restoration itself. Because ecological systems are open, restoration sites will be subjected to many external influential processes. Depending on the landscape context, the impact of these processes may not be noticeable, or, at the other extreme, they may prevent the achievement of restoration objectives. A second issue is the nature of processes over the long term: each process is a complex of characteristics such as magnitude, frequency, and extent, and ecological systems are adapted only to a range of values of each of these characteristics. Restoration often combines goals that operate on different scales, and models appropriate to each of these goals need consideration.

5.
This paper proposes the use of hidden Markov time series models for the analysis of the behaviour sequences of one or more animals under observation. These models have advantages over the Markov chain models commonly used for behaviour sequences, as they can allow for time-trend or expansion to several subjects without sacrificing parsimony. Furthermore, they provide an alternative to higher-order Markov chain models if a first-order Markov chain is unsatisfactory as a model. To illustrate the use of such models, we fit multivariate and univariate hidden Markov models allowing for time-trend to data from an experiment investigating the effects of feeding on the locomotory behaviour of locusts (Locusta migratoria).
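As a minimal illustration of the machinery involved, the sketch below evaluates the log-likelihood of an observed behaviour sequence under a two-state hidden Markov model using the standard scaled forward algorithm. The states, observation symbols, and probabilities are invented for illustration and are not the fitted locust model.

```python
# Minimal sketch: forward-algorithm likelihood of a behaviour sequence under a
# 2-state hidden Markov model. States, symbols and probabilities are illustrative
# assumptions, not the fitted locust model.
import numpy as np

# hidden states: 0 = "active", 1 = "resting"; observed symbols: 0 = move, 1 = still
pi = np.array([0.5, 0.5])                       # initial state distribution
A = np.array([[0.9, 0.1],                       # state transition matrix
              [0.2, 0.8]])
B = np.array([[0.8, 0.2],                       # emission probabilities P(symbol | state)
              [0.1, 0.9]])

obs = [0, 0, 1, 1, 1, 0, 1, 1]                  # an observed behaviour sequence

def log_likelihood(obs, pi, A, B):
    alpha = pi * B[:, obs[0]]                   # forward variable at t = 0
    log_l = 0.0
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]           # propagate and weight by emission
        scale = alpha.sum()                     # rescale to avoid underflow
        log_l += np.log(scale)
        alpha /= scale
    return log_l + np.log(alpha.sum())

print("log-likelihood:", round(log_likelihood(obs, pi, A, B), 3))
```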

6.
Mathematical methods of biochemical pathway analysis are rapidly maturing to a point where it is possible to provide objective rationale for the natural design of metabolic systems and where it is becoming feasible to manipulate these systems based on model predictions, for instance, with the goal of optimizing the yield of a desired microbial product. So far, theory-based metabolic optimization techniques have mostly been applied to steady-state conditions or the minimization of transition time, using either linear stoichiometric models or fully kinetic models within biochemical systems theory (BST). This article addresses the related problem of controllability, where the task is to steer a non-linear biochemical system, within a given time period, from an initial state to some target state, which may or may not be a steady state. For this purpose, BST models in S-system form are transformed into affine non-linear control systems, which are subjected to an exact feedback linearization that permits controllability through independent variables. The method is exemplified with a small glycolytic-glycogenolytic pathway that had been analyzed previously by several other authors in different contexts.
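For readers unfamiliar with the S-system form of biochemical systems theory, the sketch below integrates a two-variable S-system in which a scalar control input enters affinely through one production term, the kind of structure the feedback-linearization step exploits. The exponents, rate constants, and control schedule are illustrative assumptions, not the glycolytic-glycogenolytic example analysed in the paper.

```python
# Minimal sketch: a two-variable S-system (BST power-law form)
#   dX1/dt = (a1 + u) * X2**g12 - b1 * X1**h11
#   dX2/dt =  a2 * X1**g21     - b2 * X2**h22
# with a scalar control input u entering affinely through the first production term.
# Exponents, rate constants and the control schedule are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

a1, b1, a2, b2 = 2.0, 1.0, 1.0, 1.5
g12, h11, g21, h22 = 0.5, 0.8, 0.6, 0.7

def u(t):
    return 0.5 if t > 5.0 else 0.0          # step input: boost production after t = 5

def s_system(t, x):
    x1, x2 = x
    dx1 = (a1 + u(t)) * x2**g12 - b1 * x1**h11
    dx2 = a2 * x1**g21 - b2 * x2**h22
    return [dx1, dx2]

sol = solve_ivp(s_system, (0.0, 15.0), [1.0, 1.0], max_step=0.05)
print("state before step input:", sol.y[:, np.searchsorted(sol.t, 5.0) - 1].round(3))
print("final state after step :", sol.y[:, -1].round(3))
```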

7.
8.
Traditionally, a scientific model is thought to provide a good scientific explanation to the extent that it satisfies certain scientific goals that are thought to be constitutive of explanation (e.g. generating understanding, identifying mechanisms, making predictions, identifying high-level patterns, allowing us to control and manipulate phenomena). Problems arise when we realize that individual scientific models cannot simultaneously satisfy all the scientific goals typically associated with explanation. A given model’s ability to satisfy some goals must always come at the expense of satisfying others. This has resulted in philosophical disputes regarding which of these goals are in fact necessary for explanation, and as such which types of models can and cannot provide explanations (e.g. dynamical models, optimality models, topological models, etc.). Explanatory monists argue that one goal will be explanatory in all contexts, while explanatory pluralists argue that the goal will vary based on pragmatic considerations. In this paper, I argue that such debates are misguided, and that both monists and pluralists are incorrect. Instead of any goal being given explanatory priority over others in a given context, the different goals are all deeply dependent on one another for their explanatory power. Any model that sacrifices some explanatory goals to attain others will always necessarily undermine its own explanatory power in the process. And so when forced to choose between individual scientific models, there can be no explanatory victors. Given that no model can satisfy all the goals typically associated with explanation, no one model in isolation can provide a good scientific explanation. Instead we must appeal to collections of models. Collections of models provide an explanation when they satisfy the web of interconnected goals that justify the explanatory power of one another.

9.
Computational models are increasingly used to investigate and predict the complex dynamics of biological and biochemical systems. Nevertheless, governing equations of a biochemical system may not be (fully) known, which would necessitate learning the system dynamics directly from, often limited and noisy, observed data. On the other hand, when expensive models are available, systematic and efficient quantification of the effects of model uncertainties on quantities of interest can be an arduous task. This paper leverages the notion of flow-map (de)compositions to present a framework that can address both of these challenges via learning data-driven models useful for capturing the dynamical behavior of biochemical systems. Data-driven flow-map models seek to directly learn the integration operators of the governing differential equations in a black-box manner, irrespective of the structure of the underlying equations. As such, they can serve as a flexible approach for deriving fast-to-evaluate surrogates for expensive computational models of system dynamics, or, alternatively, for reconstructing the long-term system dynamics via experimental observations. We present a data-efficient approach to data-driven flow-map modeling based on polynomial chaos Kriging. The approach is demonstrated for discovery of the dynamics of various benchmark systems and a coculture bioreactor subject to external forcing, as well as for uncertainty quantification of a microbial electrosynthesis reactor. Such data-driven models and analyses of dynamical systems can be paramount in the design and optimization of bioprocesses and integrated biomanufacturing systems.
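A stripped-down illustration of the flow-map idea: rather than polynomial chaos Kriging, the sketch below fits a simple polynomial regression mapping the state at time t to the state at t + dt for a logistic-growth benchmark and then rolls the learned map forward as a surrogate. The benchmark system, the regressor, and all settings are illustrative assumptions standing in for the paper's method.

```python
# Minimal sketch of data-driven flow-map modelling: learn the map x_t -> x_{t+dt}
# from simulated state pairs and roll it forward as a cheap surrogate.
# The benchmark ODE (logistic growth) and the polynomial regressor are
# illustrative stand-ins for the polynomial chaos Kriging used in the paper.
import numpy as np

dt, r, K = 0.1, 1.0, 10.0

def step_true(x):
    # one explicit-Euler step of logistic growth dx/dt = r x (1 - x/K)
    return x + dt * r * x * (1.0 - x / K)

# Training pairs (x_t, x_{t+dt}) sampled across the state space
x_train = np.linspace(0.5, 9.5, 40)
y_train = step_true(x_train)

# Fit the flow map as a cubic polynomial x_{t+dt} ~ F(x_t)
coeffs = np.polyfit(x_train, y_train, deg=3)
flow_map = np.poly1d(coeffs)

# Roll the learned flow map forward from a new initial condition
x_surrogate, x_reference = [1.0], [1.0]
for _ in range(100):
    x_surrogate.append(float(flow_map(x_surrogate[-1])))
    x_reference.append(step_true(x_reference[-1]))

print("surrogate  x(T):", round(x_surrogate[-1], 4))
print("reference  x(T):", round(x_reference[-1], 4))
```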

10.
11.
Modelling is most clearly understood as an adjunct in the process of deriving predictions from hypotheses. By representing a hypothesised mechanism in a model we hope, by manipulating the model, to understand the consequences of the hypothesis. Eight dimensions on which models of biological behaviour can vary are described: the degree of realism with which they apply to biology; the level of biology they represent; the generality or range of systems the model is supposed to cover; the abstraction or amount of biological detail represented; the accuracy of representation of the mechanisms; the medium in which the model is built; the match of the model behaviour to biological behaviour; and the utility of the model in providing biological understanding and/or technical insight. It is hoped this framework will help to clarify debates over different approaches to modelling, particularly by pointing out how the above dimensions are relatively independent and should not be conflated.

12.
Scientific understanding in physics or physiology is based on models or theories devised to describe what is known, within the limits imposed by observation error. Carefully integrated models can be used for prediction, and the inferences assessed via further experiments designed to test the adequacy of the theory summarizing the state of knowledge. This is the systems approach, the basis of theoretical physiology; the models, like those of theoretical physics, should be firmly based on fundamental reproducible observations of a physical or chemical nature, held together with the principles of mathematics, logic, and the conservation of mass and energy. Modern computing power is such that comprehensive models can now be constructed and tested. For this approach data sets should include as many simultaneously obtained items of information of differing sorts as possible to reduce the degrees of freedom in fitting models to data. By taking advantage of large memories and rapid computation, modular construction techniques permit the formulation of multimodels covering more than a single hierarchical level, and thereby allow the investigator to understand the effects of controllers at the molecular level on overall cell or organism behavior. How does this influence the research and teaching practices of physiology? Because the computer also allows a new type of collaboration involving the networking of ideas, data bases, analytical techniques, and experiment designing, investigators in geographically distributed individual laboratories can plan, work, and analyze in concert. The prediction from this socioscientific model is therefore that networked computer-based modeling will serve to coalesce the ideas and observations of enlarging groups of investigators.

13.
14.
Ricard J. Comptes Rendus Biologies 2010, 333(11-12): 761-768.
Together, these two theoretical papers offer an alternative to the hypothesis of a primordial RNA world. Their basic idea is that the first prebiotic systems could have been networks of catalysed reactions encapsulated by a membrane. To test this hypothesis, the main obligatory features of living systems are listed, and it is asked whether encapsulated biochemical networks could display them. The traits of living systems are the following: the ability to reproduce; the possession of an identity; the fact that biological events should be considered in the context of a history; and the ability to evolve by selection of alterations of structure and self-organization. The aim of these two papers is precisely to show that encapsulated biochemical networks can possess these properties and can be considered good candidates for the first prebiotic systems. In the present paper it is shown that if the proteinoids are not very specific catalysts and if some of the reactions of the network are autocatalytic whereas others are not, the resulting system does not reach a steady state and tends to duplicate. Along the same lines, these biochemical networks possess an identity, viz. information defined from the probability of occurrence of the network nodes. Moreover, the interaction of two ligands can increase, or decrease, this information. In the first case, the system is defined as emergent; in the second case it is considered integrated. Another property of living systems is that their behaviour is defined in the context of a time-arrow. For instance, they are able to sense whether the intensity of a signal is reached after an increase, or a decrease. This property can be mimicked by a simple physico-chemical system made up of the diffusion of a ligand followed by its chemical transformation catalysed by a proteinoid displaying inhibition by excess substrate. Under these conditions the system reacts differently depending on whether the same ligand concentration is reached after an increase or a decrease.

15.
Kim D. Bio Systems 2007, 87(2-3): 322-331.
Elasmobranchs can detect extremely weak electric fields, and they have characteristic approach strategies for finding an electric dipole source generated by prey or conspecifics. They appear to align the body at a constant angle with the current flow lines of the electric field while swimming towards prey. However, it has not been studied how they process the perception of electric fields for this approach behaviour or what kind of neural mechanism is used. We use a model of electrosensory perception based on electrodynamics and explore a possible approach mechanism based on the sensory landscape distributed over the electroreceptors. This paper shows that elasmobranchs can estimate the direction of the electric field by swaying their heads, which provides basic information for their particular approach behaviour. A velocity profile of voltage gradients and intensity differences among the ampullary clusters can provide additional cues for detecting a prey source.
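As a rough numerical companion, the sketch below evaluates the potential of a prey-like current dipole in a conducting medium at two laterally separated receptor clusters and shows how the inter-cluster voltage difference changes sign as the head sways. The point-dipole potential in a uniform conductor is standard, but the numerical values and sampling geometry are illustrative assumptions rather than parameters from the paper.

```python
# Minimal sketch: potential of a prey-like current dipole in seawater, sampled at
# two receptor clusters on either side of the head; rotating ("swaying") the head
# makes the inter-cluster voltage difference cross zero when the cluster axis is
# perpendicular to the local field, revealing the field direction.
# All numerical values are illustrative assumptions.
import numpy as np

sigma = 4.0                                    # seawater conductivity (S/m), rough value
p = np.array([1e-7, 0.0])                      # current dipole moment (A*m), along x

def potential(r):
    # standard point current-dipole potential in a uniform conductor: V = p.r / (4*pi*sigma*|r|^3)
    d = np.linalg.norm(r)
    return p @ r / (4.0 * np.pi * sigma * d**3)

head_centre = np.array([0.1, 0.3])             # head position relative to the prey dipole (m)
half_span = 0.05                               # receptor clusters 5 cm either side of the midline

for phi_deg in range(0, 181, 30):              # head sway angle
    phi = np.radians(phi_deg)
    axis = np.array([np.cos(phi), np.sin(phi)])    # direction of the inter-cluster axis
    dv = potential(head_centre + half_span * axis) - potential(head_centre - half_span * axis)
    print(f"sway angle {phi_deg:3d} deg  ->  inter-cluster voltage difference {dv:+.3e} V")

# dv changes sign as the cluster axis rotates past the direction perpendicular to
# the local field, which is the kind of cue head-swaying could exploit.
```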

16.
Understanding the properties of a system as emerging from the interaction of well-described parts is the most important goal of Systems Biology. Although in the practice of Lactic Acid Bacteria (LAB) physiology we most often think of the parts as the proteins and metabolites, a wider interpretation of what a part is can be useful. For example, different strains or species can be the parts of a community, or we could study only the chemical reactions as the parts of metabolism (ignoring the enzymes that catalyze them), as is done in flux balance analysis. As long as we have some understanding of the properties of these parts, we can investigate whether their interaction leads to novel or unanticipated behaviour of the system that they constitute. There has been a tendency in the Systems Biology community to think that the collection and integration of data should continue ad infinitum, or that we will otherwise not be able to understand the systems that we study in their details. However, it may sometimes be useful to take a step back and consider whether the knowledge that we already have may not explain the system behaviour that we find so intriguing. Reasoning about systems can be difficult, and may require the application of mathematical techniques. The reward is sometimes the realization of unexpected conclusions, or in the worst case, that we still do not know enough details of the parts, or of the interactions between them. We will discuss a number of cases, with a focus on LAB-related work, where a typical systems approach has brought new knowledge or perspective, often counterintuitive, and clashing with conclusions from simpler approaches. Novel types of testable hypotheses may also be generated by the systems approach, as we will illustrate. Finally, we will give an outlook on the fields of research where the systems approach may point the way for the near future.

17.
Nonlinear dynamical biomolecular systems can evidently be considered prototypes of molecular-level information-processing devices capable of solving problems of high computational complexity. With this goal in mind, the dynamics of a biochemical system based on the enzymatic oxidation of uric acid were considered. The system was studied as a distributed biomolecular structure with a predetermined geometry of enzyme distribution on a porous planar medium. Operating in the regime of stepwise dissipative structure formation, this system demonstrated complicated modes of behaviour.

18.
An important goal of systems biology is to develop quantitative models that explain how specific molecular features give rise to systems-level properties. Metabolic and regulatory pathways that contain multifunctional proteins are especially interesting to study from this perspective because they have frequently been observed to exhibit robustness: the ability of a system to perform its proper function even as levels of its components change. In this study, we use extensive biochemical data and algebraic modeling to develop and analyze a model that shows how robust behavior arises in the isocitrate dehydrogenase (IDH) regulatory system of Escherichia coli, which was shown experimentally in 1985 to exhibit robustness. E. coli IDH is regulated by reversible phosphorylation catalyzed by the bifunctional isocitrate dehydrogenase kinase/phosphatase (IDHKP), and the level of IDH activity determines whether carbon flux is directed through the glyoxylate bypass (for growth on two-carbon substrates) or the full tricarboxylic acid cycle. Our model, which incorporates recent structural data on IDHKP, identifies several specific biochemical features of the system (including homodimerization of IDH and bifunctionality of IDHKP) that provide a potential explanation for robustness. Using algebraic techniques, we derive an invariant that summarizes the steady-state relationship between the phospho-forms of IDH. We use the invariant in combination with kinetic data on IDHKP to calculate IDH activity at a range of total IDH levels and find that our model predicts robustness. Our work unifies much of the known biochemistry of the IDH regulatory system into a single quantitative framework and highlights the importance of constructing biochemically realistic models in systems biology.
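The flavour of robustness discussed here can be illustrated with a textbook-style toy that is much simpler than the paper's IDH/IDHKP model: in the two-species mass-action network A + B -> 2B, B -> A, the steady-state level of A equals k2/k1 regardless of the total amount of A plus B. The sketch below, with made-up rate constants, checks this numerically.

```python
# Minimal sketch of absolute concentration robustness: in the toy mass-action
# network  A + B -> 2B,  B -> A, the steady-state level of A equals k2/k1
# regardless of the total amount A + B. This is a textbook-style illustration,
# not the paper's IDH/IDHKP model.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 2.0, 1.0

def rhs(t, y):
    a, b = y
    v1 = k1 * a * b          # A + B -> 2B
    v2 = k2 * b              # B -> A
    return [-v1 + v2, v1 - v2]

for total in (1.0, 2.0, 5.0):
    sol = solve_ivp(rhs, (0.0, 200.0), [0.9 * total, 0.1 * total], rtol=1e-9, atol=1e-12)
    print(f"total = {total:.1f}  ->  steady-state A = {sol.y[0, -1]:.4f}  (k2/k1 = {k2/k1:.4f})")
```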

19.
20.
This paper addresses product yield optimization in microorganisms grown in continuous culture. Traditional optimization strategies of random mutagenesis and selection will eventually have limited efficacy, thus requiring more focused strategies. The best candidates for such strategies appear to be mathematical models that capture the essence of metabolic systems and permit optimization with computational methods. In the past, models used for this purpose have been stoichiometric, kinetic in the form of S-systems, or ad hoc. This work presents a deterministic approach based on generalized mass action (GMA) systems. These systems are interesting in that they allow direct merging of stoichiometric and S-system models. Two illustrations are considered. In the first case, the fermentation pathway in Saccharomyces cerevisiae is optimized for ethanol production under steady-state conditions. The model of this pathway is relatively small, with five states and eight rate constants. The second example addresses the maximization of citric acid in the mold Aspergillus niger. For the optimization of this larger pathway system with 30 states and 60 reactions, a Mixed Integer Nonlinear Programming (MINLP) formulation is proposed. It is shown that efficient MINLP algorithms, based on convexification, branch-and-reduce methods, and binary variable selection, are essential for solving these difficult optimization problems.
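To give a feel for the GMA (generalized mass action) form and the optimization task, the sketch below writes a two-reaction power-law pathway with product inhibition, computes its steady-state flux as a function of a fold-change in the first enzyme's activity, and selects the best fold-change from a coarse grid under a metabolite cap. The pathway, exponents, constraint, and the grid search itself are illustrative assumptions, far simpler than the MINLP formulation used in the paper.

```python
# Minimal sketch: a two-reaction GMA (power-law) pathway,  S -> X1 -> P, with
# product inhibition of the first step, optimized by a coarse grid search over the
# fold-change e1 in the first enzyme's activity subject to a metabolite cap.
# Exponents, rate constants and the constraint are illustrative assumptions;
# the paper uses full MINLP formulations on much larger models.
import numpy as np

S = 2.0                                   # fixed substrate level
g1, g2 = 1.0, 0.8                         # GMA rate constants
f_s, f_inh, f_x = 0.5, -0.4, 0.8          # kinetic orders (power-law exponents)
x1_max = 4.0                              # allowed burden on the intermediate X1

def steady_state(e1):
    # solve  e1*g1*S**f_s * X1**f_inh = g2 * X1**f_x  for X1, then return (X1, flux)
    x1 = (e1 * g1 * S**f_s / g2) ** (1.0 / (f_x - f_inh))
    flux = g2 * x1**f_x
    return x1, flux

best = None
for e1 in np.linspace(1.0, 5.0, 41):      # candidate fold-changes in enzyme 1 activity
    x1, flux = steady_state(e1)
    if x1 <= x1_max and (best is None or flux > best[1]):
        best = (e1, flux, x1)

print("best fold-change e1 = %.1f, steady-state flux = %.3f, X1 = %.3f" % best)
```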
