Similar Articles
20 similar articles found.
1.
Reconstructing the dynamics of populations is complicated by the different types of stochasticity experienced by populations, in particular if some forms of stochasticity introduce bias in parameter estimation in addition to error. Identification of systematic biases is critical when determining whether the intrinsic dynamics of populations are stable or unstable and whether or not populations exhibit an Allee effect, i.e., a minimum size below which deterministic extinction should follow. Using a simulation model that allows for Allee effects and a range of intrinsic dynamics, we investigated how three types of stochasticity—demographic, environmental, and random catastrophes—affect our ability to reconstruct the intrinsic dynamics of populations. Demographic stochasticity aside, which is only problematic in small populations, we find that environmental stochasticity—positive and negative environmental fluctuations—caused increased error in parameter estimation, but bias was rarely problematic, except at the highest levels of noise. Random catastrophes, events causing large-scale mortality and likely to be more common than usually recognized, caused immediate bias in parameter estimates, in particular when Allee effects were large. In the latter case, population stability was predicted when endogenous dynamics were actually unstable and the minimum viable population size was overestimated in populations with small or non-existent Allee effects. Catastrophes also generally increased extinction risk, in particular when endogenous Allee effects were large. We propose a method for identifying data points likely resulting from catastrophic events when such events have not been recorded. Using social spider colonies (Anelosimus spp.) as models for populations, we show that after known or suspected catastrophes are accounted for, reconstructed growth parameters are consistent with intrinsic dynamical instability and substantial Allee effects.
Our results are applicable to metapopulation or time series data and are relevant for predicting extinction in conservation applications or the management of invasive species.
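The kind of simulation the abstract describes can be illustrated with a toy discrete-time model. The Ricker-with-Allee form, all parameter values, and the catastrophe mechanism below are assumptions chosen for illustration, not the authors' actual model:

```python
import math
import random

def step(n, r=0.03, K=1000.0, A=20.0, p_cat=0.05, cat_survival=0.1, rng=random):
    """One generation of a Ricker-type map with an Allee threshold A
    (deterministic decline below A, growth between A and K), plus rare
    catastrophes that kill most of the population.  A toy formulation,
    not the authors' exact simulation model."""
    n = n * math.exp(r * (1.0 - n / K) * (n / A - 1.0))
    if rng.random() < p_cat:            # random large-scale mortality event
        n *= cat_survival
    return n

def simulate(n0, T=200, seed=1, **kwargs):
    """Iterate the map for T generations from initial size n0."""
    rng = random.Random(seed)
    traj = [float(n0)]
    for _ in range(T):
        traj.append(step(traj[-1], rng=rng, **kwargs))
    return traj
```

With catastrophes switched off (`p_cat=0.0`), a population starting below A declines deterministically and one starting between A and K settles near K; switching catastrophes on produces exactly the sudden drops that, per the abstract, bias parameter estimation if unaccounted for.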

2.
The fractional concentration of exhaled nitric oxide (FeNO) is a biomarker of airway inflammation that is being increasingly considered in clinical, occupational, and epidemiological applications ranging from asthma management to the detection of air pollution health effects. FeNO depends strongly on exhalation flow rate. This dependency has allowed for the development of mathematical models whose parameters quantify airway and alveolar compartment contributions to FeNO. Numerous methods have been proposed to estimate these parameters using FeNO measured at multiple flow rates. These methods—which allow for non-invasive assessment of localized airway inflammation—have the potential to provide important insights into inflammatory mechanisms. However, different estimation methods produce different results, and a serious barrier to progress in this field is the lack of a single recommended method. With the goal of resolving this methodological problem, we have developed a unifying framework in which to present a comprehensive set of existing and novel statistical methods for estimating parameters in the simple two-compartment model. We compared statistical properties of the estimators in simulation studies and investigated model fit and parameter estimate sensitivity across methods using data from 1507 schoolchildren from the Southern California Children's Health Study, one of the largest multiple flow FeNO studies to date. We recommend a novel nonlinear least squares model with natural log transformation on both sides that produced estimators with good properties, satisfied model assumptions, and fit the Children's Health Study data well.
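The fitting idea can be sketched as follows. The equation is the standard steady-state two-compartment form (exhaled NO tends to the airway-wall concentration at low flows and the alveolar concentration at high flows); the parameter names, starting values, and flow grid are illustrative assumptions, not the study's protocol:

```python
import numpy as np
from scipy.optimize import curve_fit

def feno_model(flow, c_alv, c_aw, d_aw):
    """Steady-state two-compartment FeNO model: c_alv is the alveolar
    concentration (high-flow limit), c_aw the airway-wall concentration
    (low-flow limit), d_aw the airway transfer factor."""
    return c_aw + (c_alv - c_aw) * np.exp(-d_aw / flow)

def fit_log_both_sides(flow, feno, p0=(1.5, 110.0, 10.0)):
    """Nonlinear least squares with a natural-log transform applied to
    both sides, in the spirit of the method the abstract recommends
    (a sketch, not the study's exact estimation procedure)."""
    log_model = lambda f, *p: np.log(feno_model(f, *p))
    popt, _ = curve_fit(log_model, flow, np.log(feno), p0=p0, maxfev=10000)
    return popt
```

Fitting in log space weights the multiplicative error structure of FeNO measurements more sensibly than fitting raw concentrations, which is one plausible reason a log-both-sides model satisfies the residual assumptions better.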

3.
From population genetics theory, elevating the mutation rate of a large population should progressively reduce average fitness. If the fitness decline is large enough, the population will go extinct in a process known as lethal mutagenesis. Lethal mutagenesis has been endorsed in the virology literature as a promising approach to viral treatment, and several in vitro studies have forced viral extinction with high doses of mutagenic drugs. Yet only one empirical study has tested the genetic models underlying lethal mutagenesis, and the theory failed on even a qualitative level. Here we provide a new level of analysis of lethal mutagenesis by developing and evaluating models specifically tailored to empirical systems that may be used to test the theory. We first quantify a bias in the estimation of a critical parameter and consider whether that bias underlies the previously observed lack of concordance between theory and experiment. We then consider a seemingly ideal protocol that avoids this bias—mutagenesis of virions—but find that it is hampered by other problems. Finally, we derive results that reveal difficulties in even interpreting mutations assayed from double-stranded genomes. Our analyses expose unanticipated complexities in testing the theory. Nevertheless, the previous failure of the theory to predict experimental outcomes appears to reside in evolutionary mechanisms neglected by the theory (e.g., beneficial mutations) rather than in a mismatch between the empirical setup and model assumptions. This interpretation raises the specter that naive attempts at lethal mutagenesis may augment adaptation rather than retard it.
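The core demographic argument can be made concrete with the generic haploid condition for lethal mutagenesis from the theoretical literature (fecundity b, genomic deleterious mutation rate U, extinction once b·e^(-U) < 1); this is the textbook condition, not the tailored models the abstract develops:

```python
import math

def growth_factor(b, U):
    """Long-run per-generation growth factor when each individual leaves
    b offspring and acquires a Poisson number of deleterious mutations
    with mean U (so e^-U is the mutation-free fraction).  Deterministic
    extinction is expected once this drops below 1."""
    return b * math.exp(-U)

def critical_mutation_rate(b):
    """Mutagenesis threshold: the population declines once U > ln(b)."""
    return math.log(b)
```

The abstract's point is that estimating U (and b) from real assays is where the biases enter; the threshold itself is simple.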

4.
A variety of filtering methods enable the recursive estimation of system state variables and inference of model parameters. These methods have found application in a range of disciplines and settings, including engineering design and forecasting, and, over the last two decades, have been applied to infectious disease epidemiology. For any system of interest, the ideal filter depends on the nonlinearity and complexity of the model to which it is applied, the quality and abundance of observations being entrained, and the ultimate application (e.g. forecast, parameter estimation, etc.). Here, we compare the performance of six state-of-the-art filter methods when used to model and forecast influenza activity. Three particle filters—a basic particle filter (PF) with resampling and regularization, maximum likelihood estimation via iterated filtering (MIF), and particle Markov chain Monte Carlo (pMCMC)—and three ensemble filters—the ensemble Kalman filter (EnKF), the ensemble adjustment Kalman filter (EAKF), and the rank histogram filter (RHF)—were used in conjunction with a humidity-forced susceptible-infectious-recovered-susceptible (SIRS) model and weekly estimates of influenza incidence. The modeling frameworks, first validated with synthetic influenza epidemic data, were then applied to fit and retrospectively forecast the historical incidence time series of seven influenza epidemics during 2003–2012, for 115 cities in the United States. Results suggest that when using the SIRS model the ensemble filters and the basic PF are more capable of faithfully recreating historical influenza incidence time series, while the MIF and pMCMC do not perform as well for multimodal outbreaks. For forecast of the week with the highest influenza activity, the accuracies of the six model-filter frameworks are comparable; the three particle filters perform slightly better predicting peaks 1–5 weeks in the future; the ensemble filters are more accurate predicting peaks in the past.  
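As a minimal illustration of one of the six methods, here is a basic bootstrap particle filter with resampling applied to a plain (unforced) SIRS model and weekly incidence. The population size, rates, priors, and observation-noise scale are all illustrative assumptions, far simpler than the paper's humidity-forced framework:

```python
import numpy as np

def sirs_week(state, N=100_000, gamma=1 / 3, rho=1 / 365):
    """Advance particles one week (7 daily Euler steps) of a plain SIRS
    model.  state columns: S, I, beta (daily transmission rate).
    Returns the new state and each particle's weekly incidence."""
    S, I, beta = state[:, 0].copy(), state[:, 1].copy(), state[:, 2]
    inc = np.zeros(len(state))
    for _ in range(7):
        new_inf = beta * S * I / N
        R = N - S - I
        S += rho * R - new_inf
        I += new_inf - gamma * I
        inc += new_inf
    out = state.copy()
    out[:, 0], out[:, 1] = S, I
    return out, inc

def bootstrap_pf(obs, n_part=2000, seed=0, N=100_000):
    """Basic bootstrap particle filter: propagate, weight by the weekly
    observation, resample; beta does a small random walk between weeks."""
    rng = np.random.default_rng(seed)
    state = np.column_stack([
        np.full(n_part, 0.9 * N),          # S0 treated as known here
        rng.uniform(50, 200, n_part),      # prior on initial infected
        rng.uniform(0.3, 1.0, n_part),     # prior on beta
    ])
    beta_means, inc_means = [], []
    for y in obs:
        state[:, 2] = np.clip(state[:, 2] + rng.normal(0, 0.02, n_part), 0.05, 2.0)
        state, inc = sirs_week(state, N=N)
        sigma = max(10.0, 0.15 * y)        # crude observation-noise scale
        logw = -0.5 * ((y - inc) / sigma) ** 2
        w = np.exp(logw - logw.max())
        idx = rng.choice(n_part, n_part, p=w / w.sum())
        state = state[idx]
        beta_means.append(state[:, 2].mean())
        inc_means.append(inc[idx].mean())
    return np.array(beta_means), np.array(inc_means)
```

Running the filter on synthetic incidence generated by the same model recovers the transmission rate and tracks the epidemic curve, which is the validation step the paper performs (with far more machinery) before turning to the historical city-level data.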

5.
Executions in Texas from 1994–2005 do not deter homicides, contrary to the results of Land et al. (2009). We find that using different models—based on pre-tests for unit roots that correct for earlier model misspecifications—one cannot reject the null hypothesis that executions do not lead to a change in homicides in Texas over this period. Using additional control variables, we show that variables such as the number of prisoners in Texas may drive the main drop in homicides over this period. Such conclusions, however, are highly sensitive to model specification decisions, calling into question the assumptions about fixed parameters and constant structural relationships. This means that dynamic regressions used to account for policy changes that may affect homicides need to be applied with significant care and attention.

6.
Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. 
Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter initialization. Finally, the architecture extended control to tasks beyond those used for CLDA training. These results have significant implications for the development of clinically viable neuroprosthetics.

7.
We present a rigorous statistical model that infers the structure of P. falciparum mixtures—including the number of strains present, their proportion within the samples, and the amount of unexplained mixture—using whole genome sequence (WGS) data. Applied to simulation data, artificial laboratory mixtures, and field samples, the model provides reasonable inference with as few as 10 reads or 50 SNPs and works efficiently even with much larger data sets. Source code and example data for the model are provided in an open source fashion. We discuss the possible uses of this model as a window into within-host selection for clinical and epidemiological studies.

8.
Point 1: The ecological models of Alfred J. Lotka and Vito Volterra have had an enormous impact on ecology over the past century. Some of the earliest—and clearest—experimental tests of these models were famously conducted by Georgy Gause in the 1930s. Although well known, the data from these experiments are not widely available and are often difficult to analyze using standard statistical and computational tools.
Point 2: Here, we introduce the gauseR package, a collection of tools for fitting Lotka-Volterra models to time series data of one or more species. The package includes several methods for parameter estimation and optimization, and includes 42 datasets from Gause's species interaction experiments and related work. Additionally, we include with this paper a short blog post discussing the historical importance of these data and models, and an R vignette with a walk-through introducing the package methods. The package is available for download at github.com/adamtclark/gauseR.
Point 3: To demonstrate the package, we apply it to several classic experimental studies from Gause, as well as two other well-known datasets on multi-trophic dynamics on Isle Royale, and in spatially structured mite populations. In almost all cases, models fit observations closely and fitted parameter values make ecological sense.
Point 4: Taken together, we hope that the methods, data, and analyses that we present here provide a simple and user-friendly way to interact with complex ecological data. We are optimistic that these methods will be especially useful to students and educators who are studying ecological dynamics, as well as researchers who would like a fast tool for basic analyses.
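gauseR itself is an R package; as a rough Python analogue of the same trajectory-matching idea, one can integrate a Lotka-Volterra system and fit its parameters by least squares. The predator-prey form, parameter values, and optimizer choice below are illustrative assumptions, not gauseR's actual implementation:

```python
import numpy as np
from scipy.optimize import least_squares

def lv_series(params, z0, n_obs, dt=0.01, obs_every=1.0):
    """Euler-integrate classic predator-prey Lotka-Volterra dynamics
    (dN/dt = a*N - b*N*P, dP/dt = c*N*P - d*P) and return the state at
    n_obs evenly spaced observation times."""
    a, b, c, d = params
    n, p = z0
    out = [(n, p)]
    for _ in range(n_obs - 1):
        for _ in range(int(obs_every / dt)):
            # both derivatives evaluated at the old state
            n, p = n + dt * (a - b * p) * n, p + dt * (c * n - d) * p
        out.append((n, p))
    return np.array(out)

def fit_lv(data, p0):
    """Trajectory-matching least squares, a minimal sketch of the kind
    of fitting gauseR performs on Gause's time series."""
    def resid(params):
        return (lv_series(params, data[0], len(data)) - data).ravel()
    return least_squares(resid, p0, bounds=(1e-6, np.inf)).x
```

With real (noisy, sparse) data the choice of initial guesses and the local minima of the objective matter considerably, which is presumably why the package bundles several estimation and optimization methods.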

9.
The importance of the simultaneous consideration of tissue and substrate concentration in the estimation of ion uptake is discussed. An elaboration of the model of ion uptake, originally proposed by E. Epstein, is developed. The modified model takes both tissue and substrate ion concentration into account and uses true constants in the estimation of uptake. A test case—K+ uptake by a barley cultivar—has been presented to show the working of the model. The relevance of the modified model is also pointed out.

10.
Marine bacterial diversity is immense and believed to be driven in part by trade-offs in metabolic strategies. Here we consider heterotrophs that rely on organic carbon as an energy source and present a molecular-level model of cell metabolism that explains the dichotomy between copiotrophs—which dominate in carbon-rich environments—and oligotrophs—which dominate in carbon-poor environments—as the consequence of trade-offs between nutrient transport systems. While prototypical copiotrophs, like Vibrios, possess numerous phosphotransferase systems (PTS), prototypical oligotrophs, such as SAR11, lack PTS and rely on ATP-binding cassette (ABC) transporters, which use binding proteins. We develop models of both transport systems and use them in proteome allocation problems to predict the optimal nutrient uptake and metabolic strategy as a function of carbon availability. We derive a Michaelis–Menten approximation of ABC transport, analytically demonstrating how the half-saturation concentration is a function of binding protein abundance. We predict that oligotrophs can attain nanomolar half-saturation concentrations using binding proteins with only micromolar dissociation constants while closely matching transport and metabolic capacities. However, our model predicts that this requires large periplasms and that the slow diffusion of the binding proteins limits uptake. Thus, binding proteins are critical for oligotrophic survival yet severely constrain growth rates. We propose that this trade-off fundamentally shaped the divergent evolution of oligotrophs and copiotrophs.
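The qualitative claim (a micromolar-affinity binding protein yielding a far lower effective half-saturation when abundant) can be reproduced with a toy two-step uptake scheme: substrate binds a periplasmic binding protein with dissociation constant K_D, and the loaded complex saturates the membrane transporter with Michaelis constant K_M. This scheme and its constants are illustrative assumptions in arbitrary units, not the paper's derivation:

```python
def uptake(S, B_tot, K_D=1.0, K_M=0.05, v_max=1.0):
    """Composite uptake rate: the loaded-complex concentration is
    C = B_tot * S / (K_D + S) at quasi-equilibrium, and the transporter
    processes C with Michaelis constant K_M."""
    C = B_tot * S / (K_D + S)
    return v_max * C / (K_M + C)

def half_saturation(B_tot, K_D=1.0, K_M=0.05):
    """Algebra on the composite curve gives a Michaelis-Menten form in S
    with K_eff = K_D / (1 + B_tot / K_M): an abundant binding protein
    pushes the half-saturation far below the protein's own K_D."""
    return K_D / (1.0 + B_tot / K_M)
```

With B_tot = 100 K_M, the effective half-saturation sits roughly two orders of magnitude below K_D, mirroring the abstract's nanomolar-from-micromolar prediction; the cost, as the abstract notes, is carrying that large binding-protein pool.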

11.
12.
Computational modeling has been applied for data analysis in psychology, neuroscience, and psychiatry. One of its important uses is to infer the latent variables underlying behavior by which researchers can evaluate corresponding neural, physiological, or behavioral measures. This feature is especially crucial for computational psychiatry, in which altered computational processes underlying mental disorders are of interest. For instance, several studies employing model-based fMRI—a method for identifying brain regions correlated with latent variables—have shown that patients with mental disorders (e.g., depression) exhibit diminished neural responses to reward prediction errors (RPEs), which are the differences between experienced and predicted rewards. Such model-based analysis has the drawback that the parameter estimates and inference of latent variables are not necessarily correct—rather, they usually contain some errors. A previous study theoretically and empirically showed that the error in model-fitting does not necessarily cause a serious error in model-based fMRI. However, the study did not deal with certain situations relevant to psychiatry, such as group comparisons between patients and healthy controls. We developed a theoretical framework to explore such situations. We demonstrate that parameter misspecification can critically affect the results of group comparison. Even if the RPE response in patients is completely intact, a spurious difference from healthy controls is observable. Such a situation occurs when the ground-truth learning rate differs between groups but a common learning rate is used, as per previous studies. Furthermore, even if the parameters are appropriately fitted to individual participants, spurious group differences in RPE responses are observable when the model lacks a component that differs between groups.
These results highlight the importance of appropriate model-fitting and the need for caution when interpreting the results of model-based fMRI.
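The spurious-attenuation mechanism is easy to demonstrate in a toy Rescorla-Wagner simulation: generate a noisy "neural" signal from a subject's true RPEs, then build the regressor with a common (misspecified) learning rate. The task structure, learning rates, and noise level are illustrative assumptions, not the paper's framework:

```python
import numpy as np

def make_rewards(n_trials=2000, block=100, p_hi=0.8, p_lo=0.2, seed=0):
    """Binary rewards whose probability alternates between blocks."""
    rng = np.random.default_rng(seed)
    p = np.where((np.arange(n_trials) // block) % 2 == 0, p_hi, p_lo)
    return (rng.random(n_trials) < p).astype(float)

def rpe_series(rewards, alpha):
    """Rescorla-Wagner reward prediction errors for learning rate alpha."""
    v, deltas = 0.5, []
    for r in rewards:
        d = r - v
        deltas.append(d)
        v += alpha * d
    return np.array(deltas)

def rpe_slope(alpha_true, alpha_fit, noise_sd=0.5, seed=0):
    """Regress a noisy 'neural' signal generated from the true RPEs on a
    regressor built with a (possibly misspecified) learning rate.  A toy
    version of the group-comparison problem, not the paper's analysis."""
    rng = np.random.default_rng(seed + 1)
    rewards = make_rewards(seed=seed)
    neural = rpe_series(rewards, alpha_true) + rng.normal(0, noise_sd, len(rewards))
    regressor = rpe_series(rewards, alpha_fit)
    return np.polyfit(regressor, neural, 1)[0]
```

If one group's true learning rate matches the common fitted rate and the other group's does not, the second group's slope is attenuated even though its underlying RPE signal is fully intact, which is exactly the spurious group difference the abstract warns about.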

13.
The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, the firing threshold dynamics adaptively adjust the effective timescale of somatic integration in order to preserve sensitivity to rapid signals over a broad range of input statistics. To capture this, we introduce a new Generalized Integrate-and-Fire model featuring nonlinear firing threshold dynamics and conductance-based adaptation that outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter—describing somatic integration—and the spike-history filter—accounting for spike-frequency adaptation—dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights into the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold due to Na+-channel inactivation regulate the sensitivity to rapid input fluctuations.

14.
Parametric methods for identifying laterally transferred genes exploit the directional mutational biases unique to each genome. Yet the development of new, more robust methods—as well as the evaluation and proper implementation of existing methods—relies on an arbitrary assessment of performance using real genomes, where the evolutionary histories of genes are not known. We have used the framework of a generalized hidden Markov model to create artificial genomes modeled after genuine genomes. To model a genome, “core” genes—those displaying patterns of mutational biases shared among large numbers of genes—are identified by a novel gene clustering approach based on the Akaike information criterion. Gene models derived from multiple “core” gene clusters are used to generate an artificial genome that models the properties of a genuine genome. Chimeric artificial genomes—representing those having experienced lateral gene transfer—were created by combining genes from multiple artificial genomes, and the performance of the parametric methods for identifying “atypical” genes was assessed directly. We found that a hidden Markov model that included multiple gene models, each trained on sets of genes representing the range of genotypic variability within a genome, could produce artificial genomes that mimicked the properties of genuine genomes. Moreover, different methods for detecting foreign genes performed differently—i.e., they had different sets of strengths and weaknesses—when identifying atypical genes within chimeric artificial genomes.

15.
Mechanical characteristics of single biological cells are used to identify and possibly leverage interesting differences among cells or cell populations. Fluidity—hysteresivity normalized to the extremes of an elastic solid or a viscous liquid—can be extracted from, and compared among, multiple rheological measurements of cells: creep compliance versus time, complex modulus versus frequency, and phase lag versus frequency. With multiple strategies available for acquisition of this nondimensional property, fluidity may serve as a useful and robust parameter for distinguishing cell populations, and for understanding the physical origins of deformability in soft matter. Here, for three disparate eukaryotic cell types deformed in the suspended state via optical stretching, we examine the dependence of fluidity on chemical and environmental influences at a timescale of ∼1 s. We find that fluidity estimates are consistent in the time and frequency domains under a structural damping (power-law or fractional-derivative) model, but not under an equivalent-complexity, lumped-component (spring-dashpot) model; the latter predicts spurious time constants. Although fluidity is suppressed by chemical cross-linking, we find that ATP depletion in the cell does not measurably alter the parameter, and we thus conclude that active ATP-driven events are not a crucial enabler of fluidity during linear viscoelastic deformation of a suspended cell. Finally, by using the capacity of optical stretching to produce near-instantaneous increases in cell temperature, we establish that fluidity increases with temperature—now measured in a fully suspended, sortable cell without the complicating factor of cell-substratum adhesion.
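Under the structural damping (power-law) model the abstract favors, fluidity is simply the power-law exponent, and it can be read consistently from either the time or the frequency domain. A minimal sketch (symbols and prefactors are generic, not the paper's notation):

```python
import numpy as np

def creep_powerlaw(t, J0=1.0, alpha=0.3, t0=1.0):
    """Structural-damping creep compliance J(t) = J0 * (t / t0)^alpha;
    alpha = 0 is an elastic solid, alpha = 1 a viscous liquid."""
    return J0 * (t / t0) ** alpha

def fluidity_from_creep(t, J):
    """Fluidity as the log-log slope of creep compliance versus time."""
    return np.polyfit(np.log(t), np.log(J), 1)[0]

def fluidity_from_phase(phase_lag_rad):
    """For a power-law material G*(w) ~ (i*w)^alpha the phase lag is
    alpha * pi / 2, so fluidity is recoverable from phase as well."""
    return 2.0 * phase_lag_rad / np.pi
```

The cross-domain consistency the abstract reports corresponds to these two estimators agreeing on the same data, something a spring-dashpot model with discrete time constants cannot achieve over a wide frequency range.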

16.
We study the genetic basis of adaptation in a moving optimum model, in which the optimal value for a quantitative trait increases over time at a constant rate. We first analyze a one-locus two-allele model with recurrent mutation, for which we derive accurate analytical approximations for (i) the time at which a previously deleterious allele becomes beneficial, (ii) the waiting time for a successful new mutation, and (iii) the time the mutant allele needs to reach fixation. On the basis of these results, we show that the shortest total time to fixation is for alleles with intermediate phenotypic effect. We derive an approximation for this “optimal” effect, and we show that it depends in a simple way on a composite parameter, which integrates the ecological parameters and the genetic architecture of the trait. In a second step, we use stochastic computer simulations of a multilocus model to study the order in which mutant alleles with different effects go to fixation. In agreement with the one-locus results, alleles with intermediate effect tend to become fixed earlier than those with either small or large effects. However, the effect size of the fastest mutations differs from the one predicted in the one-locus model. We show how these differences can be explained by two specific effects of multilocus genetics. Finally, we discuss our results in the light of three relevant timescales acting in the system—the environmental, mutation, and fixation timescales—which define three parameter regimes leading to qualitative differences in the adaptive substitution pattern.
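The first analytical quantity, the time at which a previously deleterious allele becomes beneficial, can be sketched in the standard moving-optimum setup: Gaussian stabilizing selection of width omega^2 with the optimum at v*t, wild-type phenotype at 0, mutant at a. Under the small-s log-fitness approximation the mutant turns beneficial exactly when the optimum has moved half its effect, t* = a / (2v). The numbers below are illustrative, not from the paper:

```python
def selection_coefficient(a, t, v, omega2=25.0):
    """Approximate selection coefficient of a mutant of phenotypic
    effect a (wild type at 0) under Gaussian stabilizing selection of
    width omega2 when the optimum sits at v*t.  Small-s log-fitness
    approximation; a sketch of the setup, not the paper's full model."""
    opt = v * t
    return (opt ** 2 - (opt - a) ** 2) / (2.0 * omega2)

def time_until_beneficial(a, v):
    """The mutant turns beneficial once the optimum has moved half its
    effect: t* = a / (2 * v)."""
    return a / (2.0 * v)
```

The trade-off behind the "intermediate effect is fastest" result is visible here: large-effect alleles wait longer (t* grows with a) before selection favors them, while small-effect alleles become beneficial early but then fix slowly.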

17.
Mislocalization and aggregation of the huntingtin protein are related to Huntington’s disease. Its first exon—more specifically the first 17 amino acids (Htt17)—is crucial for the physiological and pathological functions of huntingtin. It regulates huntingtin’s activity through posttranslational modifications and serves as an anchor to membrane-containing organelles of the cell. Recently, structure and orientation of the Htt17 membrane anchor were determined using a combined solution and solid-state NMR approach. This prompted us to refine this model by investigating the dynamics and thermodynamics of this membrane anchor on a POPC bilayer using all-atom, explicit solvent molecular dynamics and Hamiltonian replica exchange. Our simulations are combined with various experimental measurements to generate a high-resolution atomistic model for the huntingtin Htt17 membrane anchor on a POPC bilayer. More precisely, we observe that the single α-helix structure is more stable in the phospholipid membrane than the NMR model obtained in the presence of dodecylphosphocholine detergent micelles. The resulting Htt17 monomer has its hydrophobic plane oriented parallel to the bilayer surface. Our results further unveil the key residues interacting with the membrane in terms of hydrogen bonds, salt-bridges, and nonpolar contributions. We also observe that Htt17 equilibrates at a well-defined insertion depth and that it perturbs the physical properties—order parameter, thickness, and area per lipid—of the bilayer in a manner that could favor its dimerization. Overall, our observations reinforce and refine the NMR measurements on the Htt17 membrane anchor segment of huntingtin that is of fundamental importance to its biological functions.

18.
Objective: Tumor cachexia is an important prognostic parameter in epithelial ovarian cancer (EOC). Tumor cachexia is characterized by metabolic and inflammatory disturbances. These conditions might be reflected by body composition measurements (BCMs) ascertained by pre-operative computed tomography (CT). Thus, we aimed to identify the prognostically most relevant BCMs assessed by pre-operative CT in EOC patients.
Methods: We evaluated muscle BCMs and well established markers of nutritional and inflammatory status, as well as clinical-pathological parameters in 140 consecutive patients with EOC. Furthermore, a multiplexed inflammatory marker panel of 25 cytokines was used to determine the relationship of BCMs with inflammatory markers and patient’s outcome. All relevant parameters were evaluated in uni- and multivariate survival analysis.
Results: Muscle attenuation (MA)—a well established BCM parameter—is an independent prognostic factor for survival in multivariate analysis (HR 2.25; p = 0.028). Low MA—reflecting a state of cachexia—is also associated with residual tumor after cytoreductive surgery (p = 0.046) and with an unfavorable performance status (p = 0.015). Moreover, MA is associated with Eotaxin and IL-10 out of the 25 cytokine multiplex marker panel in multivariate linear regression analysis (p = 0.021 and p = 0.047, respectively).
Conclusion: MA—ascertained by routine pre-operative CT—is an independent prognostic parameter in EOC patients. Low MA is associated with the inflammatory, as well as the nutritional component of cachexia. Therefore, the clinical value of pre-operative CT could be enhanced by the assessment of MA.

19.
Native biodiversity is threatened by invasive species in many terrestrial and marine systems, and conservation managers have demonstrated successes by responding with eradication or control programs. Although invasive species are often the direct cause of threat to native species, ecosystems can react in unexpected ways to their removal or reduction. Here, we use theoretical models to predict boom-bust dynamics, where the removal of predatory or competitive pressure from a native herbivore results in oscillatory population dynamics (boom-bust), which can endanger the native species’ population in the short term. We simulate control activities, applied to multiple theoretical three-species Lotka-Volterra ecosystem models consisting of vegetation, a native herbivore, and an invasive predator. Based on these communities, we then develop a predictive tool that—based on relative parameter values—predicts whether control efforts directed at the invasive predator will lead to herbivore release followed by a crash. Further, by investigating the different functional responses, we show that model structure, as well as model parameters, are important determinants of conservation outcomes. Finally, control strategies that can mitigate these negative consequences are identified. Managers working in similar data-poor ecosystems can use the predictive tool to assess the probability that their system will exhibit boom-bust dynamics, without knowing exact community parameter values.
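The herbivore-release mechanism can be reproduced with a minimal vegetation-herbivore-predator Lotka-Volterra simulation in which the invasive predator is eradicated at time zero. The parameter values are illustrative assumptions, not calibrated to any real system or to the paper's models:

```python
import numpy as np

def simulate_release(T=120.0, dt=0.01):
    """Start at the three-species coexistence equilibrium, remove the
    invasive predator, and track the released herbivore (Euler steps)."""
    r, K, a, e, m, b, f, d = 1.0, 1.0, 1.0, 1.0, 0.2, 1.0, 0.5, 0.2
    H = d / (f * b)               # herbivore equilibrium with predator: 0.4
    V = 1.0 - H                   # vegetation equilibrium: 0.6
    P = 0.0                       # predator eradicated at t = 0
    out = []
    for _ in range(int(T / dt)):
        dV = r * V * (1 - V / K) - a * V * H
        dH = e * a * V * H - m * H - b * H * P
        dP = f * b * H * P - d * P
        V, H, P = V + dt * dV, H + dt * dH, P + dt * dP
        out.append(H)
    return np.array(out)
```

After eradication the herbivore overshoots well above its new predator-free equilibrium (boom, here 0.8), then crashes below it as it depletes the vegetation (bust) before the damped oscillation settles; this transient dip is the short-term endangerment the abstract describes.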

20.
Batesian mimicry evolves when individuals of a palatable species gain the selective advantage of reduced predation because they resemble a toxic species that predators avoid. Here, we evaluated whether—and in which direction—Batesian mimicry has evolved in a natural population of mimics following extirpation of their model. We specifically asked whether the precision of coral snake mimicry has evolved among kingsnakes from a region where coral snakes recently (1960) went locally extinct. We found that these kingsnakes have evolved more precise mimicry; by contrast, no such change occurred in a sympatric non-mimetic species or in conspecifics from a region where coral snakes remain abundant. Presumably, more precise mimicry has continued to evolve after model extirpation, because relatively few predator generations have passed, and the fitness costs incurred by predators that mistook a deadly coral snake for a kingsnake were historically much greater than those incurred by predators that mistook a kingsnake for a coral snake. Indeed, these results are consistent with prior theoretical and empirical studies, which revealed that only the most precise mimics are favoured as their model becomes increasingly rare. Thus, highly noxious models can generate an ‘evolutionary momentum’ that drives the further evolution of more precise mimicry—even after models go extinct.
