Similar Literature
20 similar documents found (search time: 31 ms)
1.
Summary: Models of optimal carbon allocation schedules have influenced the way plant ecologists think about life history evolution, particularly for annual plants. The present study asks (1) how, within the framework of these models, are their predictions affected by within-season variation in mortality and carbon assimilation rates? and (2) what are the consequences of these prediction changes for empirical tests of the models? A companion paper examines the basic assumptions of the models themselves. I conducted a series of numerical experiments with a simple carbon allocation model. Results suggest that both qualitative and quantitative predictions can sometimes be sensitive to parameter values for net assimilation rate and mortality: for some parameter values, both the time and size at onset of reproduction, as well as the number of reproductive intervals, vary considerably as a result of small variations in these parameters. For other parameter values, small variations in the parameters result in only small changes in predicted phenotype, but these changes have very large fitness consequences. Satisfactory empirical tests are thus likely to require considerable accuracy in parameter estimates. The effort required for parameter estimation imposes a practical constraint on empirical tests, making large multipopulation comparisons impractical. It may be most practical to compare the predicted and observed fitness consequences of variation in the timing of onset of reproduction.
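To make the kind of numerical experiment described above concrete, the following is a minimal sketch (not the study's actual model) of a bang-bang carbon allocation schedule: vegetative growth until a switch time, then reproduction, with exponential mortality. The growth rate A, mortality rate mu, and season length T are hypothetical values chosen only to illustrate how the optimal switch time shifts under small parameter changes.

```python
import numpy as np

def fitness(t_switch, A=0.10, mu=0.02, T=100.0, v0=1.0):
    """Expected reproductive output under a bang-bang allocation schedule.

    Vegetative mass grows exponentially (dV/dt = A*V) until t_switch, after
    which all assimilate goes to reproduction; survival to time t is
    exp(-mu*t).  Fitness = integral from t_switch to T of A*V*exp(-mu*t) dt,
    evaluated in closed form.  All parameter values are illustrative only.
    """
    v_switch = v0 * np.exp(A * t_switch)           # size at onset of reproduction
    return A * v_switch * (np.exp(-mu * t_switch) - np.exp(-mu * T)) / mu

# How does the optimal switch time respond to small changes in A and mu?
switch_times = np.linspace(0.0, 99.0, 400)
for A, mu in [(0.10, 0.02), (0.11, 0.02), (0.10, 0.03)]:
    w = np.array([fitness(ts, A=A, mu=mu) for ts in switch_times])
    print(f"A={A:.2f}, mu={mu:.2f}: optimal switch ≈ t={switch_times[w.argmax()]:.1f}, "
          f"fitness={w.max():.2f}")
```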

2.
Several recent publications describe remarkably promising effects of transplanting olfactory ensheathing cells as a potential future method to repair human spinal cord injuries. But why were cells from the nose transplanted into the spinal cord? What are olfactory ensheathing cells, and how might they produce these beneficial effects? And more generally, what do we mean by spinal cord injury? To what extent can we compare repair in an animal to repair in a human?

3.
Ogawa K, Miyake Y. Bio Systems, 2011, 103(3): 400-409
Many conventional models have used the positional information hypothesis to explain the elementary processes of morphogenesis during the development of multicellular organisms. These models assume that the steady concentration patterns of morphogens formed in the extracellular environment possess an important property of positional information, so-called “robustness”. However, recent experiments reported that a steady morphogen pattern, the concentration gradient of the Bicoid protein during early Drosophila embryonic development, is not robust to embryo-to-embryo variability. These reports encourage a reconsideration of a long-standing problem in systematic cell differentiation: what constitutes positional information for cells, and what is the origin of the robust boundary of gene expression? To address these problems at a cellular level, in this article we focus on regenerative phenomena, which show another important property of positional information, “size invariance”. In view of regenerative phenomena, we propose a new mathematical model describing how a spatial pattern of positional values is generated. In this model, positional values are defined as the values into which differentiable cells transform the spatial pattern that provides positional information. The model is described mathematically as an associative algebra composed of terms that are products of fundamental operators, under the assumption that the operators are derived from the characteristic properties of cell differentiation at an amputation surface in regenerative phenomena. We apply this model to the concentration pattern of the Bicoid protein during anterior-posterior axis formation in Drosophila, and consider the conditions needed to establish the robust boundary of hunchback gene expression.

4.
Computational simulation models can provide a way of understanding and predicting insect population dynamics and evolution of resistance, but the usefulness of such models depends on generating or estimating the values of key parameters. In this paper, we describe four numerical algorithms for generating or estimating key parameters used to simulate four different processes within such models. First, we describe a novel method to generate an offspring genotype table for one- or two-locus genetic models for simulating evolution of resistance, and how this method can be extended to create offspring genotype tables for models with more than two loci. Second, we describe how we use a generalized inverse matrix to find a least-squares solution to an over-determined linear system for estimation of parameters in probit models of kill rates. This algorithm can also be used for the estimation of parameters of Freundlich adsorption isotherms. Third, we describe a simple algorithm to randomly select initial frequencies of genotypes, either without any special constraints or with some pre-selected frequencies. We also give a simple method to calculate the “stable” Hardy–Weinberg equilibrium proportions that would result from these initial frequencies. Fourth, we describe how the problem of estimating the intrinsic rate of natural increase of a population can be converted to a root-finding problem, and how the bisection algorithm can then be used to find the rate. We implemented all these algorithms in MATLAB and Python; the key statements in both implementations consist of only a few commands and are given in the appendices. The results of numerical experiments are also provided to demonstrate that our algorithms are valid and efficient.
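As a sketch of the fourth algorithm (the paper's own MATLAB and Python code is in its appendices and is not reproduced here), estimating the intrinsic rate of natural increase can be posed as finding the root of the Euler–Lotka equation, sum_x l_x m_x exp(-r x) = 1, by bisection. The life table below is invented for illustration.

```python
import numpy as np

# Hypothetical life table: survivorship l_x and fecundity m_x at ages x = 1..5.
ages = np.array([1, 2, 3, 4, 5], dtype=float)
lx   = np.array([1.0, 0.8, 0.6, 0.3, 0.1])
mx   = np.array([0.0, 2.0, 3.0, 2.0, 1.0])

def euler_lotka(r):
    """f(r) = sum_x l_x * m_x * exp(-r*x) - 1; its root is the intrinsic rate r."""
    return np.sum(lx * mx * np.exp(-r * ages)) - 1.0

def bisect(f, lo, hi, tol=1e-10, max_iter=200):
    """Standard bisection: assumes f(lo) and f(hi) have opposite signs."""
    flo = f(lo)
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        fmid = f(mid)
        if abs(fmid) < tol or (hi - lo) < tol:
            return mid
        if flo * fmid < 0.0:
            hi = mid
        else:
            lo, flo = mid, fmid
    return 0.5 * (lo + hi)

r = bisect(euler_lotka, -1.0, 2.0)
print(f"intrinsic rate of natural increase r ≈ {r:.4f}")
```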

5.
Habitat fragmentation and connectivity loss pose significant threats to biodiversity at both local and landscape levels. Strategies to increase and preserve ecological connectivity are important for dealing with the potential threat of habitat degradation. Various metrics have been used to measure (i.e., quantify) landscape composition and configuration in landscape ecology. However, their relationship with ecological connectivity must be understood in order to interpret landscape patterns comprehensively. In the present study, correlations between ecological connectivity and landscape complexity are examined based on information-theoretic metrics. Two primary questions are explored: (1) to what extent are landscape mosaic measures of entropy correlated with ecological connectivity, with landscape gradient-based measures, and with each other? (2) are landscape gradient-based entropy measures more strongly correlated with ecological connectivity than discrete entropy measures? Results show that all information-theoretic metrics are statistically significant (p < 0.05) for modelling ecological connectivity. Among categorically-based indices, the relationship between the ecological connectivity index (ECI) and joint entropy was the most significant, while a generalized additive model indicated that Boltzmann entropy could predict ECI, explaining ∼60% of the variance. Therefore, configurational entropy can be used to improve ecological connectivity models.
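As an illustration of one categorically-based, information-theoretic descriptor, the sketch below computes the joint Shannon entropy of adjacent cell-class pairs in a small categorical land-cover raster. The raster, the restriction to horizontal neighbours, and the natural-log units are assumptions made for the example, not details of the study.

```python
import numpy as np
from collections import Counter

# Hypothetical 6x6 categorical land-cover raster (class labels 0-2).
rng = np.random.default_rng(0)
landscape = rng.integers(0, 3, size=(6, 6))

def joint_entropy(grid):
    """Shannon entropy (nats) of the joint distribution of horizontally
    adjacent cell-class pairs, one of several information-theoretic
    descriptors of landscape configuration."""
    pairs = Counter(zip(grid[:, :-1].ravel(), grid[:, 1:].ravel()))
    counts = np.array(list(pairs.values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

print(f"joint entropy of adjacent-pair classes: {joint_entropy(landscape):.3f} nats")
```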

6.
Intention, Goal, Scope, Background: It has gained growing acceptance in recent years that there are values in LCA, and several authors have discussed how value orientations can influence LCA models and results. The aim of this article is to continue this discussion and to focus on value choices in LCA. Objectives: To find a way of describing value orientations in relation to choices in LCA. Methods: This objective has been pursued by investigating the relationship between values and traditional science, exploring the concept of values, investigating the relationship between values and choice, and suggesting a way to describe the value base for specific choices in LCA. Results and Discussion: Research on how to improve the environmental performance of products resembles peace research in that it aims to achieve a certain value-laden situation in society. The epistemological basis for peace research also seems to apply to LCA research. The term value has many meanings. There are several classification methods for values, and I claim that one is more suitable for choices in LCA than the others. The correlation between values and choice is not straightforward, and values can only partially explain choices. Conclusions: Describing the value base for choices in LCA increases the consistency and transparency of the value choices and offers a means of justifying them. Recommendations and Outlook: It is recommended that the value base be described in terms of 1) what is included in the concern for the environment, 2) how trade-offs are made, and 3) how uncertainty is handled.

7.
8.
The Hodgkin-Huxley (HH) model is the basis for numerous neural models. There are two negative feedback processes in the HH model that regulate rhythmic spiking. The first is an outward current with an activation variable n that has an opposite influence to the excitatory inward current and therefore provides subtractive negative feedback. The other is the inactivation of an inward current with an inactivation variable h that reduces the amount of positive feedback and therefore provides divisive feedback. Rhythmic spiking can be obtained with either negative feedback process, so we ask what is gained by having two feedback processes. We also ask how the different negative feedback processes contribute to spiking. We show that having two negative feedback processes makes the HH model more robust to changes in applied currents and conductance densities than models that possess only one negative feedback variable. We also show that the contributions made by the subtractive and divisive feedback variables are not static, but depend on time scales and conductance values. In particular, they contribute differently to the dynamics in Type I versus Type II neurons.
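A minimal sketch of the standard Hodgkin-Huxley equations (textbook parameter values, forward-Euler integration; not the authors' analysis code) makes the two feedback variables explicit: n gates the subtractive outward K+ current, while h divisively scales the inward Na+ current.

```python
import numpy as np

# Standard HH parameters (modern convention: mV, ms, uF/cm^2, mS/cm^2).
C, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.4

def a_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def b_n(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)
def a_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def b_m(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))

def simulate(I_app=10.0, T=100.0, dt=0.01):
    """Forward-Euler integration of the HH equations; returns the spike count."""
    V, m, h, n = -65.0, 0.05, 0.6, 0.32
    spikes, above = 0, False
    for _ in range(int(T / dt)):
        I_Na = g_Na * m**3 * h * (V - E_Na)   # h provides divisive feedback
        I_K  = g_K * n**4 * (V - E_K)         # n provides subtractive feedback
        I_L  = g_L * (V - E_L)
        V += dt * (I_app - I_Na - I_K - I_L) / C
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        if V > 0 and not above:
            spikes, above = spikes + 1, True
        elif V < -20:
            above = False
    return spikes

for I in (5.0, 10.0, 20.0):
    print(f"I_app = {I:4.1f} uA/cm^2 -> {simulate(I)} spikes in 100 ms")
```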

9.
10.
For the binding of peptides to wild-type HIV-1 and BIV TAR RNA and to mutants with bulges of various sizes, changes in the ΔΔG values of binding were determined from experimental Kd values. The corresponding entropies of these bulges are estimated by enumerating all possible RNA bulge conformations on a lattice and then applying the Boltzmann relationship. Independent calculations of entropies from fluctuations are also carried out using the Gaussian network model (GNM) recently introduced for analyzing folded structures. Strong correlations are seen between the changes in free energy determined for binding and the two different unbound entropy calculations. The fact that the calculated entropy increase with larger bulge size is correlated with the enhanced experimental binding free energy is unusual. This system exhibits a dependence on the entropy of the unbound form that is opposite to that in usual binding models. Instead of a large initial entropy being unfavorable because it would be reduced upon binding, here the larger entropies actually favor binding. Several interpretations are possible: (i) the higher conformational freedom implies a higher competence for binding with minimal strain, by suitable selection amongst the set of already accessible conformations; (ii) larger bulge entropies enhance the probability of the specific favorable conformation of the bound state; (iii) the increased freedom of the larger bulges contributes more to the bound state than to the unbound state; (iv) indirectly, the large entropy of the bound state might have an unfavorable effect on the solvent structure. Nonetheless, this unusual effect is interesting.
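The lattice-enumeration idea can be illustrated with a toy calculation: count self-avoiding walks of a given length on a 2-D square lattice as a stand-in for bulge conformations, then apply the Boltzmann relation S = k_B ln W. The 2-D lattice and the walk lengths below are simplifications for illustration, not the study's actual enumeration.

```python
import math

def count_saw(n, pos=(0, 0), visited=None):
    """Count self-avoiding walks of n steps on a 2-D square lattice,
    a toy stand-in for enumerating RNA bulge conformations."""
    if visited is None:
        visited = {pos}
    if n == 0:
        return 1
    total = 0
    x, y = pos
    for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
        if nxt not in visited:
            total += count_saw(n - 1, nxt, visited | {nxt})
    return total

k_B = 1.380649e-23  # J/K
for bulge_len in (2, 3, 5, 7):
    W = count_saw(bulge_len)
    S = k_B * math.log(W)               # Boltzmann relation S = k_B * ln(W)
    print(f"bulge of {bulge_len} residues: W = {W:5d} conformations, "
          f"S = {S:.2e} J/K (larger bulge -> larger conformational entropy)")
```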

11.
Mathematical models in ecology and epidemiology often consider populations “at equilibrium”, where in-flows, such as births, equal out-flows, such as deaths. For stochastic models, what is meant by equilibrium is less clear – should the population size be fixed, or should it grow and shrink with equal probability? Two different mechanisms for implementing a stochastic steady state are considered. Under these mechanisms, both a predator-prey model and an epidemic model have vastly different outcomes, including the median population values for both predators and prey and the median levels of infection within a hospital (P < 0.001 for all comparisons). These results suggest that the question of how a stochastic steady state is modeled, and what it implies for the dynamics of the system, should be carefully considered.
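To make the two notions of a stochastic steady state concrete, here is a toy sketch (not either of the paper's models): a birth-death process with balanced per-capita rates, in which the population drifts up and down with equal probability, versus one in which every death is immediately replaced so the size is pinned. All rates, sizes, and run lengths are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def drifting_equilibrium(n0=100, rate=1.0, t_max=50.0):
    """Per-capita birth rate == death rate: size wanders (0 is absorbing)."""
    n, t = n0, 0.0
    while t < t_max and n > 0:
        t += rng.exponential(1.0 / (2.0 * rate * n))   # time to next birth-or-death event
        n += 1 if rng.random() < 0.5 else -1
    return n

def fixed_size_equilibrium(n0=100):
    """Every death is immediately balanced by a birth: composition may turn
    over, but the total size never changes."""
    return n0

finals = [drifting_equilibrium() for _ in range(500)]
print(f"drifting steady state: median final size = {np.median(finals):.0f}, "
      f"IQR = {np.percentile(finals, 25):.0f}-{np.percentile(finals, 75):.0f}")
print(f"fixed-size steady state: final size = {fixed_size_equilibrium()}")
```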

12.
The work began in 1972 when three young assistant professors used a slime mold to see if electromagnetic fields would affect it. The fields did, though the effects were small and hard to tease out of the noise. The cell cycle was lengthened and there were changes in respiration. So, the next question was “how and why?” Further changes were seen using these and then other bacterial and eukaryotic cells in respiration, in ATP, in the protein replication chain, and so forth. Changes occurred even in cell extracts that lacked an intact plasma membrane. Nerve cells showed changes in leakage of neurotransmitters and in neurite outgrowth from excised ganglia. Based on some experiments with nerve cells, I also did some computer calculations, modeling the internal electric and magnetic fields and current densities in simplified representations of bone fractures and also of spinal cords in vertebrae. More recently, I have collaborated on some theoretical models of what fields might be doing at the cellular and molecular level, particularly with reference to the radical model. With each piece of research, my collaborators and I generally found a small piece of information about fields and biological systems; and each answer raised another set of questions, which is the way of science. Though bioelectromagnetic scientists have learned much and can say much at greater depth about what happens when an organism is exposed to a field, the fundamental question still remains: What exactly is going on here? © 2021 Bioelectromagnetics Society.

13.
Hydra and its fellow cnidarians - sea anemones, corals and jellyfish - are simple, mostly sessile animals that depend on bioactive chemicals for survival. In this review, we briefly describe what is known about the chemical armament of Hydra, and detail future research directions where Hydra can help illuminate major questions in chemical ecology, pharmacology, developmental biology and evolution. Focusing on two groups of putative toxins from Hydra - phospholipase A2s and proteins containing ShK and zinc metalloprotease domains, we ask: how do different venom components act together during prey paralysis? How is a venom arsenal created and how does it evolve? How is the chemical arsenal delivered to its target? To what extent does a chemical and biotic coupling exist between an organism and its environment? We propose a model whereby in Hydra and other cnidarians, bioactive compounds are secreted both as localized point sources (nematocyte discharges) and across extensive body surfaces, likely combining to create complex "chemical landscapes". We speculate that these cnidarian-derived chemical landscapes may affect the surrounding community on scales from microns to, in the case of coral reefs, hundreds of kilometers.

14.
How can animals learn the prey densities available in an environment that changes unpredictably from day to day, and how much effort should they devote to doing so, rather than exploiting what they already know? Using a two-armed bandit situation, we simulated several processes that might explain the trade-off between exploring and exploiting. They included an optimising model, dynamic backward sampling; a dynamic version of the matching law; the Rescorla-Wagner model; a neural network model; and ε-greedy and rule-of-thumb models derived from the study of reinforcement learning in artificial intelligence. Under conditions like those used in published studies of birds’ performance under two-armed bandit conditions, all models usually identified the more profitable source of reward, and did so more quickly when the reward probability differential was greater. Only the dynamic programming model switched from exploring to exploiting more quickly when available time in the situation was less. With sessions of equal length presented in blocks, a session-length effect was induced in some of the models by allowing motivational, but not memory, carry-over from one session to the next. The rule-of-thumb model was the most successful overall, though the neural network model also performed better than the remaining models.
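One of the simpler processes compared, the ε-greedy rule from reinforcement learning, can be sketched as follows; the reward probabilities, ε, and trial counts are hypothetical, and the other models (dynamic backward sampling, the matching law, Rescorla-Wagner, the neural network) are not reproduced here.

```python
import numpy as np

def epsilon_greedy_bandit(p_rewards=(0.2, 0.6), eps=0.1, n_trials=500, seed=0):
    """Simulate an epsilon-greedy forager on a two-armed bandit.

    With probability eps the agent explores (random arm); otherwise it
    exploits the arm with the higher running mean reward.
    """
    rng = np.random.default_rng(seed)
    counts = np.zeros(2)
    means = np.zeros(2)
    choices = np.zeros(n_trials, dtype=int)
    for t in range(n_trials):
        arm = int(rng.integers(2)) if rng.random() < eps else int(np.argmax(means))
        reward = float(rng.random() < p_rewards[arm])
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]   # incremental mean update
        choices[t] = arm
    best = int(np.argmax(p_rewards))
    return np.mean(choices == best)

# A larger reward-probability differential should speed identification of the better arm.
for p in [(0.45, 0.55), (0.30, 0.70), (0.10, 0.90)]:
    frac = np.mean([epsilon_greedy_bandit(p, seed=s) for s in range(50)])
    print(f"reward probs {p}: fraction of choices on the better arm = {frac:.2f}")
```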

15.
Identifying the climatic drivers of an ecological system is a key step in assessing its vulnerability to climate change. The climatic dimensions to which a species or system is most sensitive – such as means or extremes – can guide methodological decisions for projections of ecological impacts and vulnerabilities. However, scientific workflows for combining climate projections with ecological models have received little explicit attention. We review Global Climate Model (GCM) performance along different dimensions of change and compare frameworks for integrating GCM output into ecological models. In systems sensitive to climatological means, it is straightforward to base ecological impact assessments on mean projected changes from several GCMs. Ecological systems sensitive to climatic extremes may benefit from what we term the ‘model space’ approach: a comparison of ecological projections based on simulated climate from historical and future time periods. This approach leverages the experimental framework used in climate modeling, in which historical climate simulations serve as controls for future projections. Moreover, it can capture projected changes in the intensity and frequency of climatic extremes, rather than assuming that future means will determine future extremes. Given the recent emphasis on the ecological impacts of climatic extremes, the strategies we describe will be applicable across species and systems. We also highlight practical considerations for the selection of climate models and data products, emphasizing that the spatial resolution of the climate change signal is generally coarser than the grid cell size of downscaled climate model output. Our review illustrates how understanding the way climate model outputs are derived and downscaled can improve the selection and application of climatic data in ecological modeling.

16.
The current proliferation of proposals for health care reform makes it difficult to sort out the differences among plans and the likely outcome of different approaches to reform. The current health care system has two basic features. The first, enrollment and eligibility functions, includes how people get into the system and gain coverage for health care services. We describe 4 models, ranging from an individual, voluntary approach to a universal, tax-based model. The second, the provision of health care, includes how physician services are organized, how they are paid for, what mechanisms are in place for quality assurance, and the degree of organization and oversight of the health care system. We describe 7 models of the organization component, including the current fee-for-service system with no national health budget, managed care, salaried providers under a budget, and managed competition with and without a national health budget. These 2 components provide the building blocks for health care plans, presented as a matrix. We also evaluate several reform proposals by how they combine these 2 elements.

17.
18.
High-throughput computational methods in X-ray protein crystallography are indispensable to meet the goals of structural genomics. In particular, automated interpretation of electron density maps, especially those at mediocre resolution, can significantly speed up the protein structure determination process. TEXTAL™ is a software application that uses pattern recognition, case-based reasoning and nearest neighbor learning to produce reasonably refined molecular models, even with average quality data. In this work, we discuss a key issue to enable fast and accurate interpretation of typically noisy electron density data: what features should be used to characterize the density patterns, and how relevant are they? We discuss the challenges of constructing features in this domain, and describe SLIDER, an algorithm to determine the weights of these features. SLIDER searches a space of weights using ranking of matching patterns (relative to mismatching ones) as its evaluation function. Exhaustive search being intractable, SLIDER adopts a greedy approach that judiciously restricts the search space only to weight values that cause the ranking of good matches to change. We show that SLIDER contributes significantly in finding the similarity between density patterns, and discuss the sensitivity of feature relevance to the underlying similarity metric.
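A deliberately simplified, hypothetical sketch of the idea behind SLIDER-style weight tuning (not the TEXTAL implementation): for each feature in turn, try candidate weights and keep the one that most improves the rank of the known matching pattern, relative to mismatches, under a weighted-distance similarity. The features, data, and grid of candidate weights are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: each query pattern has one true match and several decoys,
# described by 4 density-derived features; the last two features are noisy
# (uninformative), so a good weight search should down-weight them.
n_queries, n_decoys, n_feat = 30, 20, 4
noise_sd = np.array([0.2, 0.2, 2.0, 2.0])
queries = rng.normal(size=(n_queries, n_feat))
true_matches = queries + rng.normal(size=queries.shape) * noise_sd
decoys = rng.normal(size=(n_queries, n_decoys, n_feat))

def mean_rank(weights):
    """Average rank of the true match among candidates under weighted Euclidean
    distance (rank 1 = best); lower is better."""
    ranks = []
    for q, tm, ds in zip(queries, true_matches, decoys):
        cands = np.vstack([tm[None, :], ds])
        dist = np.sqrt(((cands - q) ** 2 * weights).sum(axis=1))
        ranks.append(1 + np.sum(dist < dist[0]))
    return float(np.mean(ranks))

weights = np.ones(n_feat)
grid = np.linspace(0.0, 2.0, 21)
for f in range(n_feat):                 # greedy: tune one feature weight at a time
    trial = weights.copy()
    scores = []
    for w in grid:
        trial[f] = w
        scores.append(mean_rank(trial))
    weights[f] = grid[int(np.argmin(scores))]
print("tuned weights:", np.round(weights, 2), " mean rank:", mean_rank(weights))
```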

19.
‘Deterministic’ models in population dynamics often are really approximations to stochastic models, justified by an appeal to ‘the law of large numbers’. It is proposed to call such models ‘pseudodeterministic’. Four questions are discussed in this article: (1) What errors may be made by equating deterministically predicted values to expectations? (2) When, and in what sense, may numbers be assumed to be large? (3) How large are the variances, coefficients of variation, etc., assigned to the variables in the stochastic versions of the models? (4) What role may pseudodeterministic models play in empirical research, where problems of statistical reliability arise? As an example, a modified Nicholson-Bailey model of the interaction between insect parasitoids and their hosts is discussed; the modification consists of assigning a random (density-independent) mortality to the parasitoid population. A stochastic version of this model is discussed. The expectation of the final host density is compared with the value computed from the deterministic model. The latter value is systematically lower than the former. The magnitude of the difference depends on parameter values. The variability to be expected with the stochastic model is characterized by the coefficient of variation of the final host density; its dependence on parameter values and initial conditions is discussed. It is concluded that it is worthwhile in practical applications to estimate parasitoid mortality, and that the coefficient of variation in real situations may be far from negligible.
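A minimal sketch, with invented parameter values, of the kind of comparison described: a Nicholson-Bailey host-parasitoid map with random, density-independent parasitoid mortality, comparing the deterministic prediction (mortality fixed at its mean) against the mean and coefficient of variation of the final host density over stochastic replicates.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative parameter values (not those of the paper).
lam, a, c = 2.0, 0.05, 1.0          # host growth rate, searching efficiency, conversion
H0, P0, n_gen = 25.0, 10.0, 20
mean_surv, sd_surv = 0.6, 0.15      # random density-independent parasitoid survival

def run(survival_draw):
    """Iterate the modified Nicholson-Bailey map and return final host density."""
    H, P = H0, P0
    for _ in range(n_gen):
        esc = np.exp(-a * P)                       # fraction of hosts escaping parasitism
        H, P = lam * H * esc, survival_draw() * c * H * (1.0 - esc)
    return H

# Deterministic ("pseudodeterministic") version: survival fixed at its mean.
H_det = run(lambda: mean_surv)

# Stochastic version: survival drawn anew each generation.
draws = [run(lambda: float(np.clip(rng.normal(mean_surv, sd_surv), 0.0, 1.0)))
         for _ in range(2000)]
H_mean, H_sd = np.mean(draws), np.std(draws)
print(f"deterministic final host density: {H_det:.1f}")
print(f"stochastic expectation: {H_mean:.1f}  (CV = {H_sd / H_mean:.2f})")
```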

20.
Numerous formulations with the same mathematical properties can be relevant for modelling a biological process. Different formulations can predict different model dynamics, such as equilibrium vs. oscillations, even if they are quantitatively close (structural sensitivity). The question we address in this paper is: does the choice of a formulation affect predictions about the number of stable states? We focus on a predator–prey model with predator competition that exhibits multiple stable states. A bifurcation analysis is carried out with respect to prey carrying capacity and species body-mass ratio, within the range of values found in food web models. Bifurcation diagrams built for two type-II functional responses differ in two ways. First, the kind of stable state (equilibrium vs. oscillations) is different for 26.0–49.4% of the parameter values, depending on the parameter space investigated. Using generalized modelling, we highlight the role of the functional response slope in this difference. Secondly, the number of stable states is higher with Ivlev's functional response for 0.1–14.3% of the parameter values. These two changes interact to create different model predictions if a parameter value or a state variable is altered. In these two examples of disturbance, Holling's disc equation predicts a higher system resilience. Indeed, Ivlev's functional response predicts that disturbance may trap the system into an alternative stable state that can be escaped only by a larger alteration (hysteresis phenomena). Two questions arise from this work: (i) to what extent can complex ecological models be affected by this sensitivity to model formulation? and (ii) how should these uncertainties in model predictions be dealt with?
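The two functional-response formulations compared can be written side by side. The sketch below defines Holling's disc equation and Ivlev's form, matches their initial slope and maximum uptake rate so that they are quantitatively close, and iterates a simple predator-prey model with predator competition under each; all parameter values are illustrative, not those of the paper's bifurcation analysis.

```python
import numpy as np

# Illustrative parameters: the two responses share the same initial slope (a)
# and the same maximum uptake rate (1/h), yet remain different functions.
a, h = 1.0, 0.5
def holling_II(N): return a * N / (1.0 + a * h * N)                 # Holling's disc equation
def ivlev(N):      return (1.0 / h) * (1.0 - np.exp(-a * h * N))    # Ivlev's form

r, K, e, m, q = 1.0, 10.0, 0.5, 0.2, 0.05   # prey growth, carrying capacity, conversion,
                                            # predator mortality, predator competition

def simulate(f, N0=5.0, P0=2.0, dt=0.01, T=500.0):
    """Euler-integrate a predator-prey model with predator competition (q*P^2)."""
    N, P = N0, P0
    for _ in range(int(T / dt)):
        dN = r * N * (1.0 - N / K) - f(N) * P
        dP = e * f(N) * P - m * P - q * P * P
        N, P = max(N + dt * dN, 0.0), max(P + dt * dP, 0.0)
    return N, P

for name, f in [("Holling type II", holling_II), ("Ivlev", ivlev)]:
    N, P = simulate(f)
    print(f"{name:16s}: state after 500 time units  N = {N:.2f}, P = {P:.2f}")
```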

