Similar Articles
20 similar articles found.
1.
In this paper we present a multiscale, individual-based simulation environment that integrates CompuCell3D for lattice-based modelling at the cellular level and Bionetsolver for intracellular modelling. CompuCell3D (CC3D) provides an implementation of the lattice-based Cellular Potts Model, or CPM (also known as the Glazier-Graner-Hogeweg, or GGH, model), together with a Monte Carlo method based on the Metropolis algorithm for system evolution. The integration of CC3D for cellular systems with Bionetsolver for subcellular systems enables us to develop a multiscale mathematical model and to study the evolution of cell behaviour driven by the dynamics inside the cells, capturing aspects of cell behaviour and interaction that are not possible to capture using continuum approaches. We then apply this multiscale modelling technique to a model of cancer growth and invasion, based on a previously published model of Ramis-Conde et al. (2008) in which individual cell behaviour is driven by a molecular network describing the dynamics of E-cadherin and β-catenin. In that model, which we refer to as the centre-based model, an alternative individual-based modelling technique was used, namely a lattice-free approach. In many respects, the GGH (CPM) methodology and the centre-based approach share the same overall goal: to mimic the behaviours and interactions of biological cells. Although the mathematical foundations and computational implementations of the two approaches are very different, the results of the presented simulations are compatible with each other, suggesting that individual-based approaches offer a natural way of describing complex multi-cell, multiscale models. The ability to easily reproduce the results of one modelling approach with an alternative approach is also essential for model cross-validation, and helps to identify any modelling artefacts specific to a given computational approach.
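The GGH/CPM copy-attempt dynamics described above can be illustrated with a minimal sketch. This is not CC3D's implementation: the lattice, the two cells, and the contact energy `J`, volume penalty `LAM`, target volume `V_TARGET` and temperature `T` are all toy values chosen for illustration.

```python
import math, random

random.seed(0)
J, LAM, T = 1.0, 0.5, 2.0      # illustrative contact energy, volume penalty, temperature
V_TARGET = 8                   # illustrative target cell volume (lattice sites)
N = 6
# 6x6 lattice of cell IDs: 0 = medium, 1 and 2 = two adjacent cells
lattice = [[0] * N for _ in range(N)]
for i in range(2, 4):
    for j in range(1, 3):
        lattice[i][j] = 1
    for j in range(3, 5):
        lattice[i][j] = 2

def volume(cid):
    return sum(row.count(cid) for row in lattice)

def hamiltonian():
    h = 0.0
    for i in range(N):
        for j in range(N):
            # contact energy over right/down neighbour pairs (each pair counted once)
            for di, dj in ((0, 1), (1, 0)):
                ni, nj = i + di, j + dj
                if ni < N and nj < N and lattice[i][j] != lattice[ni][nj]:
                    h += J
    for cid in (1, 2):         # volume constraint for the two cells
        h += LAM * (volume(cid) - V_TARGET) ** 2
    return h

def metropolis_step():
    """One copy attempt: a random site tries to copy a random neighbour's ID."""
    i, j = random.randrange(N), random.randrange(N)
    di, dj = random.choice(((0, 1), (1, 0), (0, -1), (-1, 0)))
    ni, nj = (i + di) % N, (j + dj) % N
    if lattice[i][j] == lattice[ni][nj]:
        return False
    old, h_old = lattice[i][j], hamiltonian()
    lattice[i][j] = lattice[ni][nj]
    dh = hamiltonian() - h_old
    if dh <= 0 or random.random() < math.exp(-dh / T):
        return True            # accept the copy
    lattice[i][j] = old        # reject: restore the old ID
    return False

for _ in range(200):
    metropolis_step()
total = volume(1) + volume(2)
```

In CC3D the Hamiltonian carries many more terms (adhesion matrices, chemotaxis, and the subcellular state supplied by Bionetsolver), but the accept/reject logic is of this Metropolis form.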

2.
Existing compartmental mathematical modelling methods for epidemics, such as SEIR models, cannot accurately represent the effects of contact tracing. This makes them inappropriate for evaluating testing and contact tracing strategies to contain an outbreak. An alternative used in practice is the application of agent- or individual-based models (ABMs). However, ABMs are complex, less well understood and much more computationally expensive. This paper presents a new method for accurately including the effects of Testing, contact-Tracing and Isolation (TTI) strategies in standard compartmental models. We derive our method using a careful probabilistic argument to show how contact tracing at the individual level is reflected in aggregate at the population level. We show that the resulting SEIR-TTI model accurately approximates the behaviour of a mechanistic agent-based model at far lower computational cost. The computational efficiency is such that it can be easily and cheaply used for exploratory modelling to quantify the required levels of testing and tracing, alone and with other interventions, to assist adaptive planning for managing disease outbreaks.
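For contrast with the TTI extension, a plain compartmental SEIR model takes only a few lines. The sketch below uses forward-Euler integration with illustrative rates (`beta`, `sigma`, `gamma` are assumptions, not values from the paper); the TTI terms themselves are omitted.

```python
# Forward-Euler integration of a standard SEIR model (no TTI terms).
beta, sigma, gamma = 0.3, 0.2, 0.1   # illustrative transmission, incubation, recovery rates
N_pop = 1_000_000.0
S, E, I, R = N_pop - 1.0, 0.0, 1.0, 0.0
dt, days = 0.1, 300
for _ in range(int(days / dt)):
    new_inf = beta * S * I / N_pop   # force of infection times susceptibles
    dS = -new_inf
    dE = new_inf - sigma * E
    dI = sigma * E - gamma * I
    dR = gamma * I
    S += dS * dt; E += dE * dt; I += dI * dt; R += dR * dt
```

The SEIR-TTI construction adds traced/isolated sub-compartments and flows derived from the paper's probabilistic argument; the point of the comparison is that the whole system remains a handful of ODEs rather than an agent-level simulation.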

3.
4.
Data from long-term monitoring sites are vital for biogeochemical process understanding, and for model development. Implicitly or explicitly, information provided by both monitoring and modelling must be extrapolated in order to have wider scientific and policy utility. In many cases, large-scale modelling utilises little of the data available from long-term monitoring, instead relying on simplified models and limited, often highly uncertain, data for parameterisation. Here, we propose a new approach whereby outputs from model applications to long-term monitoring sites are upscaled to the wider landscape using a simple statistical method. For the 22 lakes and streams of the UK Acid Waters Monitoring Network (AWMN), standardised concentrations (Z scores) for Acid Neutralising Capacity (ANC), dissolved organic carbon, nitrate and sulphate show high temporal coherence among sites. This coherence permits annual mean solute concentrations at a new site to be predicted by back-transforming Z scores derived from observations or model applications at other sites. The approach requires limited observational data for the new site, such as annual mean estimates from two synoptic surveys. Several illustrative applications of the method suggest that it is effective at predicting long-term ANC change in upland surface waters, and may have wider application. Because it is possible to parameterise and constrain more sophisticated models with data from intensively monitored sites, the extrapolation of model outputs to policy relevant scales using this approach could provide a more robust, and less computationally demanding, alternative to the application of simple generalised models using extrapolated input data.
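The back-transformation step can be sketched as follows. The series, sites and survey values are invented for illustration; only the mechanics follow the approach described: standardise coherent reference sites, recover a new site's mean and standard deviation from two observed annual means, then back-transform the regional Z-score signal.

```python
import statistics

# Illustrative annual-mean ANC series (ueq/l) at two well-monitored sites;
# the numbers are made up for the sketch.
site_a = [12.0, 15.0, 18.0, 22.0, 25.0, 30.0]
site_b = [-5.0, -2.0, 1.0, 4.0, 8.0, 12.0]

def z_scores(x):
    mu, sd = statistics.mean(x), statistics.pstdev(x)
    return [(v - mu) / sd for v in x]

# Temporal coherence: average the standardised series into one regional signal
regional_z = [statistics.mean(pair) for pair in zip(z_scores(site_a), z_scores(site_b))]

# New site: only two synoptic-survey annual means (here, years 0 and 5)
obs_years, obs_vals = [0, 5], [3.0, 20.0]
# Solve mu, sd of the new site from value = mu + z * sd at the two observed years
z0, z1 = regional_z[obs_years[0]], regional_z[obs_years[1]]
sd_new = (obs_vals[1] - obs_vals[0]) / (z1 - z0)
mu_new = obs_vals[0] - z0 * sd_new
predicted = [mu_new + z * sd_new for z in regional_z]   # full back-transformed series
```

The prediction reproduces the two observed values exactly and interpolates the remaining years from the coherent regional signal.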

5.
Ye W, Lin X, Taylor JM. Biometrics. 2008;64(4):1238-1246.
SUMMARY: In this article we investigate regression calibration methods to jointly model longitudinal and survival data using a semiparametric longitudinal model and a proportional hazards model. In the longitudinal model, a biomarker is assumed to follow a semiparametric mixed model where covariate effects are modeled parametrically and subject-specific time profiles are modeled nonparametrically using a population smoothing spline and subject-specific random stochastic processes. The Cox model is assumed for survival data by including both the current measure and the rate of change of the underlying longitudinal trajectories as covariates, as motivated by a prostate cancer study application. We develop a two-stage semiparametric regression calibration (RC) method. Two variations of the RC method are considered, risk set regression calibration and a computationally simpler ordinary regression calibration. Simulation results show that the two-stage RC approach performs well in practice and effectively corrects the bias from the naive method. We apply the proposed methods to the analysis of a dataset for evaluating the effects of the longitudinal biomarker PSA on the recurrence of prostate cancer.
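A deliberately simplified, linear version of the stage-1 calibration can be sketched as below: each subject's trajectory is fitted by ordinary least squares (standing in for the paper's smoothing-spline mixed model), yielding the fitted current value and rate of change that would enter the stage-2 Cox model (the Cox fit itself is omitted here). All parameter values are invented.

```python
import random

random.seed(1)

def ols_line(ts, ys):
    """Per-subject least-squares fit: intercept and slope of the biomarker."""
    n = len(ts)
    tbar, ybar = sum(ts) / n, sum(ys) / n
    b = sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys)) / \
        sum((t - tbar) ** 2 for t in ts)
    return ybar - b * tbar, b

# Simulate 200 subjects with noisy biomarker measurements at times 0..4
subjects = []
for _ in range(200):
    a_true = random.gauss(1.0, 0.3)   # subject-specific intercept
    b_true = random.gauss(0.5, 0.1)   # subject-specific slope
    ts = [0, 1, 2, 3, 4]
    ys = [a_true + b_true * t + random.gauss(0, 0.2) for t in ts]
    subjects.append((ts, ys))

# Stage 1 output: calibrated covariates = fitted current value (t = 4) and slope.
# Stage 2 (not shown) would plug these into the Cox model in place of the
# error-prone observed biomarker.
calibrated = []
for ts, ys in subjects:
    a, b = ols_line(ts, ys)
    calibrated.append((a + 4 * b, b))
```

Risk-set regression calibration refits this stage within each risk set as events occur; the ordinary version above calibrates once.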

6.
We have developed a method of calculating the solvation energy of a surface based on an implicit solvent model. This new model, called COSMIC, is an extension of the established COSMO solvation approach and allows the technique to be applied to systems of any periodicity, from finite molecules, through polymers and surfaces, to cavities of water within a bulk unit cell. As well as extending the scope of the COSMO technique, it also improves numerical stability by removing a number of discontinuities in the potential energy surface. The COSMIC model has been applied to barium sulfate, where it was found to produce surface energies and configurations similar to those from much more computationally expensive explicit molecular dynamics simulations. The calculated solvated morphology of barium sulfate was found to differ significantly from that calculated in vacuum, with a reduced number of faces present.

7.
8.
9.
The mechanical properties of well-ordered porous materials are related to their geometrical parameters at the mesoscale. Finite element (FE) analysis is a powerful tool for designing well-ordered porous materials by analysing their mechanical behaviour. However, FE models are often computationally expensive. This article aims to develop a cost-effective FE model to simulate well-ordered porous metallic materials for orthopaedic applications. Solid and beam FE modelling approaches are compared, using finite-size and infinite-media models with a cubic unit cell geometry. The model is then applied to compare two unit cell geometries: cubic and diamond. Finite-size models provide results similar to those of the infinite-media approach for large sample sizes, and also capture the influence of boundary conditions on the mechanical response for small sample sizes. The beam FE modelling approach showed little computational cost and results similar to the solid FE modelling approach. The diamond unit cell geometry appeared more suitable for orthopaedic applications than the cubic unit cell geometry.

10.
Providing generic and cost-effective modelling approaches to reconstruct and forecast freshwater temperature using predictors such as air temperature and water discharge is a prerequisite to understanding the ecological processes underlying the impact of water temperature, and of global warming, on continental aquatic ecosystems. Using air temperature as a simple linear predictor of water temperature can lead to significant bias in forecasts, as it does not disentangle seasonality and long-term trends in the signal. Here, we develop an alternative approach based on hierarchical Bayesian statistical time series modelling of water temperature, air temperature and water discharge using seasonal sinusoidal periodic signals and time-varying means and amplitudes. The fitting and forecasting performances of this approach are compared with those of simple linear regression between water and air temperatures using (i) an illustrative simulated example and (ii) applications to three French coastal streams with contrasting bio-geographical conditions and sizes. The time series modelling approach fits the data better and, unlike linear regression, exhibits no forecasting bias in long-term trends. The new model also allows for more accurate forecasts of water temperature than linear regression, together with a fair assessment of the uncertainty around forecasts. The warming of water temperature forecast by our hierarchical Bayesian model was slower and more uncertain than that expected under the classical regression approach. These forecasts are in a form that is readily usable in further ecological analyses and will allow weighting of outcomes from different scenarios to manage climate change impacts on freshwater wildlife.
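Because the seasonal signal is sinusoidal, fitting a time-invariant version of the model reduces to linear algebra. The sketch below recovers the mean, amplitude and phase of a synthetic daily temperature series by Fourier projection over whole periods; it is a far simpler stand-in for the hierarchical Bayesian model with time-varying means and amplitudes, and all constants are invented.

```python
import math, random

random.seed(0)
N = 3 * 365                         # three whole years of daily values
days = range(N)
# Synthetic water temperature: mean 10 C, seasonal amplitude 6 C, phase 1.0 rad
temp = [10.0 + 6.0 * math.sin(2 * math.pi * d / 365 + 1.0) + random.gauss(0, 0.5)
        for d in days]

# Write A*sin(wt + phi) as a*sin(wt) + b*cos(wt) and project onto each basis
# function; over whole periods this is an exact least-squares fit.
mean_level = sum(temp) / N
a = 2 / N * sum(y * math.sin(2 * math.pi * d / 365) for d, y in zip(days, temp))
b = 2 / N * sum(y * math.cos(2 * math.pi * d / 365) for d, y in zip(days, temp))
amplitude = math.hypot(a, b)
phase = math.atan2(b, a)
```

The hierarchical model lets `mean_level` and `amplitude` drift between years, which is exactly what separates a long-term warming trend from the seasonal cycle.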

11.
Fluorescence lifetime imaging (FLIM) is widely applied to obtain quantitative information from fluorescence signals, particularly using Förster Resonant Energy Transfer (FRET) measurements to map, for example, protein-protein interactions. Extracting FRET efficiencies or population fractions typically entails fitting data to complex fluorescence decay models but such experiments are frequently photon constrained, particularly for live cell or in vivo imaging, and this leads to unacceptable errors when analysing data on a pixel-wise basis. Lifetimes and population fractions may, however, be more robustly extracted using global analysis to simultaneously fit the fluorescence decay data of all pixels in an image or dataset to a multi-exponential model under the assumption that the lifetime components are invariant across the image (dataset). This approach is often considered to be prohibitively slow and/or computationally expensive but we present here a computationally efficient global analysis algorithm for the analysis of time-correlated single photon counting (TCSPC) or time-gated FLIM data based on variable projection. It makes efficient use of both computer processor and memory resources, requiring less than a minute to analyse time series and multiwell plate datasets with hundreds of FLIM images on standard personal computers. This lifetime analysis takes account of repetitive excitation, including fluorescence photons excited by earlier pulses contributing to the fit, and is able to accommodate time-varying backgrounds and instrument response functions. 
We demonstrate that this global approach allows us to readily fit time-resolved fluorescence data to complex models including a four-exponential model of a FRET system, for which the FRET efficiencies of the two species of a bi-exponential donor are linked, and polarisation-resolved lifetime data, where a fluorescence intensity and bi-exponential anisotropy decay model is applied to the analysis of live cell homo-FRET data. A software package implementing this algorithm, FLIMfit, is available under an open source licence through the Open Microscopy Environment.
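The core of variable projection, separating the linear amplitudes (solved in closed form) from the nonlinear lifetimes (searched), can be shown on a toy bi-exponential decay. This sketch ignores the instrument response, repetitive excitation and backgrounds handled by FLIMfit, and uses a crude lifetime grid search in place of a proper nonlinear optimiser; all constants are illustrative.

```python
import math, random

random.seed(0)
ts = [0.05 * i for i in range(200)]            # time bins (arbitrary units)
TAU1, TAU2, A1, A2 = 0.5, 2.5, 3.0, 1.0        # ground-truth lifetimes and amplitudes
y = [A1 * math.exp(-t / TAU1) + A2 * math.exp(-t / TAU2) + random.gauss(0, 0.005)
     for t in ts]

def project(t1, t2):
    """Inner (linear) step: closed-form best amplitudes for fixed lifetimes."""
    p1 = [math.exp(-t / t1) for t in ts]
    p2 = [math.exp(-t / t2) for t in ts]
    g11 = sum(u * u for u in p1); g22 = sum(v * v for v in p2)
    g12 = sum(u * v for u, v in zip(p1, p2))
    r1 = sum(u * w for u, w in zip(p1, y)); r2 = sum(v * w for v, w in zip(p2, y))
    det = g11 * g22 - g12 * g12                # 2x2 normal equations via Cramer's rule
    a1 = (r1 * g22 - r2 * g12) / det
    a2 = (g11 * r2 - g12 * r1) / det
    rss = sum((w - a1 * u - a2 * v) ** 2 for w, u, v in zip(y, p1, p2))
    return a1, a2, rss

# Outer (nonlinear) step: search only over the two lifetimes
grid = [0.1 * k for k in range(2, 40)]
best = min(((project(t1, t2), t1, t2) for t1 in grid for t2 in grid if t2 > t1 + 0.2),
           key=lambda r: r[0][2])
(fit_a1, fit_a2, _), fit_t1, fit_t2 = best
```

In the global-analysis setting the lifetimes are shared across all pixels, so the expensive nonlinear search is done once while the per-pixel amplitudes remain cheap linear solves, which is what makes whole-plate datasets tractable.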

12.
Recent advances in statistical software have led to the rapid diffusion of new methods for modelling longitudinal data. Multilevel (also known as hierarchical or random effects) models for binary outcomes have generally been based on a logistic-normal specification, by analogy with earlier work for normally distributed data. The appropriate application and interpretation of these models remains somewhat unclear, especially when compared with the computationally more straightforward semiparametric or 'marginal' modelling (GEE) approaches. In this paper we pose two interrelated questions. First, what limits should be placed on the interpretation of the coefficients and inferences derived from random-effect models involving binary outcomes? Second, what diagnostic checks are appropriate for evaluating whether such random-effect models provide adequate fits to the data? We address these questions by means of an extended case study using data on adolescent smoking from a large cohort study. Bayesian estimation methods are used to fit a discrete-mixture alternative to the standard logistic-normal model, and posterior predictive checking is used to assess model fit. Surprising parallels in the parameter estimates from the logistic-normal and mixture models are described and used to question the interpretability of the so-called 'subject-specific' regression coefficients from the standard multilevel approach. Posterior predictive checks suggest a serious lack of fit of both multilevel models. The results do not provide final answers to the two questions posed, but we expect that lessons learned from the case study will provide general guidance for further investigation of these important issues.

13.
This paper presents a novel approach to constitutive modeling of viscoelastic soft tissues. The formulation combines an anisotropic strain energy function, accounting for preferred material directions, to define the elastic stress–strain relationship, with a discrete-time black-box dynamic model, borrowed from the theory of system identification, to describe the time-dependent behavior. This discrete-time formulation leads naturally to a recursive time integration scheme that calculates the current stress state using strain and stress values stored at a limited number of previous time instants. The viscoelastic model and the numerical procedure are assessed through two numerical examples: the simulation of a uniaxial tensile test and the inflation of a thin tube. Both simulations are performed using parameter values based on previous experiments on preserved bovine pericardium. Parameters are then adjusted to investigate the sensitivity of the model. The hypotheses the model relies upon are discussed and the main limitations are stated.
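The recursive time-integration idea, current stress from the current strain plus a stored internal variable, can be sketched with a one-branch standard-linear-solid model. The moduli, relaxation time and step size below are illustrative toys, not the identified black-box coefficients of the paper.

```python
import math

E_INF, E1, TAU = 2.0, 1.0, 0.5     # illustrative long-term modulus, branch modulus, relaxation time
DT = 0.01                          # time step (s)

def simulate(strain):
    """Recursive stress update: only the previous internal variable and strain
    are stored, mirroring a limited-history time integration scheme."""
    stresses, h, prev = [], 0.0, strain[0]
    decay = math.exp(-DT / TAU)
    for eps in strain:
        # exponential update of the viscous branch's internal stress
        h = decay * h + E1 * math.exp(-DT / (2 * TAU)) * (eps - prev)
        stresses.append(E_INF * eps + h)
        prev = eps
    return stresses

# Step-strain relaxation test: strain jumps to 1% and is held
strain_hist = [0.0] + [0.01] * 500
sigma = simulate(strain_hist)
```

Under the held strain the stress relaxes from its instantaneous value toward the long-term elastic response `E_INF * eps`, without ever storing the full strain history.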

14.
15.
Efficient measurement error correction with spatially misaligned data
Association studies in environmental statistics often involve exposure and outcome data that are misaligned in space. A common strategy is to employ a spatial model such as universal kriging to predict exposures at locations with outcome data and then estimate a regression parameter of interest using the predicted exposures. This results in measurement error because the predicted exposures do not correspond exactly to the true values. We characterize the measurement error by decomposing it into Berkson-like and classical-like components. One correction approach is the parametric bootstrap, which is effective but computationally intensive since it requires solving a nonlinear optimization problem for the exposure model parameters in each bootstrap sample. We propose a less computationally intensive alternative termed the "parameter bootstrap" that only requires solving one nonlinear optimization problem, and we also compare bootstrap methods to other recently proposed methods. We illustrate our methodology in simulations and with publicly available data from the Environmental Protection Agency.

16.
This study presents a biomechanical model of orthodontic tooth movement. Although such models have already been presented in the literature, most of them incorporate computationally expensive finite elements (FE) methods to determine the strain distribution in the periodontal ligament (PDL). In contrast, the biomechanical model presented in this work avoids the use of FE methods. The elastic deformation of the PDL is modelled using an analytical approach, which does not require setting up a 3D model of the tooth. The duration of the lag phase is estimated using the calculated hydrostatic stresses, and bone remodelling is predicted by modelling the alveolar bone as a viscous material. To evaluate the model, some typically used motion patterns were simulated and a sensitivity analysis was carried out on the parameters. Results show that despite some shortcomings, the model is able to describe commonly used motion patterns in orthodontic tooth movement, in both single- and multi-rooted teeth.

17.
Understanding the mechanics of adaptive evolution requires not only knowing the quantitative genetic bases of the traits of interest but also obtaining accurate measures of the strengths and modes of selection acting on these traits. Most recent empirical studies of multivariate selection have employed multiple linear regression to obtain estimates of the strength of selection. We reconsider the motivation for this approach, paying special attention to the effects of nonnormal traits and fitness measures. We apply an alternative statistical method, logistic regression, to estimate the strength of selection on multiple phenotypic traits. First, we argue that the logistic regression model is more suitable than linear regression for analyzing data from selection studies with dichotomous fitness outcomes. Subsequently, we show that estimates of selection obtained from the logistic regression analyses can be transformed easily to values that directly plug into equations describing adaptive microevolutionary change. Finally, we apply this methodology to two published datasets to demonstrate its utility. Because most statistical packages now provide options to conduct logistic regression analyses, we suggest that this approach should be widely adopted as an analytical tool for empirical studies of multivariate selection.
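The proposed workflow, fit a logistic regression to dichotomous fitness and transform the coefficient into a selection gradient, can be sketched as follows. The data are simulated, the fit is plain gradient ascent on the log-likelihood, and the final line applies the average-gradient transformation of Janzen and Stern (1998); treat all constants as illustrative.

```python
import math, random

random.seed(0)

# Simulate one standardised trait under directional selection on survival
n = 2000
z = [random.gauss(0, 1) for _ in range(n)]
# true model: logit(survival) = -0.5 + 1.0 * z
surv = [1 if random.random() < 1 / (1 + math.exp(-(-0.5 + 1.0 * zz))) else 0
        for zz in z]

# Fit logistic regression by gradient ascent on the log-likelihood
b0, b1, lr = 0.0, 0.0, 0.5
for _ in range(400):
    g0 = g1 = 0.0
    for zz, s in zip(z, surv):
        p = 1 / (1 + math.exp(-(b0 + b1 * zz)))
        g0 += s - p
        g1 += (s - p) * zz
    b0 += lr * g0 / n
    b1 += lr * g1 / n

# Average-gradient transformation: convert the logistic coefficient into a
# selection gradient on relative fitness (Janzen & Stern 1998)
wbar = sum(surv) / n
ps = [1 / (1 + math.exp(-(b0 + b1 * zz))) for zz in z]
beta_grad = b1 * (sum(p * (1 - p) for p in ps) / n) / wbar
```

`beta_grad` is the quantity that plugs into the standard equations for the change in mean phenotype, which is the abstract's point about logistic estimates being easy to convert.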

18.
Chemical synaptic transmission involves the release of a neurotransmitter that diffuses in the extracellular space and interacts with specific receptors located on the postsynaptic membrane. Computer simulation approaches provide fundamental tools for exploring various aspects of the synaptic transmission under different conditions. In particular, Monte Carlo methods can track the stochastic movements of neurotransmitter molecules and their interactions with other discrete molecules, the receptors. However, these methods are computationally expensive, even when used with simplified models, preventing their use in large-scale and multi-scale simulations of complex neuronal systems that may involve large numbers of synaptic connections. We have developed a machine-learning based method that can accurately predict relevant aspects of the behavior of synapses, such as the percentage of open synaptic receptors as a function of time since the release of the neurotransmitter, with considerably lower computational cost compared with the conventional Monte Carlo alternative. The method is designed to learn patterns and general principles from a corpus of previously generated Monte Carlo simulations of synapses covering a wide range of structural and functional characteristics. These patterns are later used as a predictive model of the behavior of synapses under different conditions without the need for additional computationally expensive Monte Carlo simulations. This is performed in five stages: data sampling, fold creation, machine learning, validation and curve fitting. The resulting procedure is accurate, automatic, and it is general enough to predict synapse behavior under experimental conditions that are different to the ones it has been trained on. 
Since our method efficiently reproduces the results that can be obtained with Monte Carlo simulations at a considerably lower computational cost, it is suitable for simulating large numbers of synapses and is therefore an excellent tool for multi-scale simulations.
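The predict-instead-of-simulate idea can be illustrated with a toy surrogate. Here a made-up closed-form curve generator stands in for the Monte Carlo simulator, and a k-nearest-neighbour average stands in for the machine-learning model actually trained by the authors; every name and parameter below is invented for the sketch.

```python
import math, random

random.seed(0)

def mc_open_fraction(q, w, t):
    """Stand-in 'Monte Carlo' result: open-receptor fraction over time for a
    synapse with neurotransmitter count q and cleft width w (toy kinetics)."""
    tau = 0.5 + w
    peak = q / (q + 2000.0)
    return peak * (t / tau) * math.exp(1 - t / tau)

# Corpus of previously generated simulations covering the parameter space
corpus = []
for _ in range(300):
    q = random.uniform(1000, 6000)
    w = random.uniform(0.02, 0.3)
    curve = [mc_open_fraction(q, w, 0.1 * k) for k in range(1, 30)]
    corpus.append(((q, w), curve))

def predict(q, w, k=5):
    """k-NN surrogate: average the stored curves of the k nearest simulations."""
    def dist(p):  # scale each feature to its range before measuring distance
        return ((p[0] - q) / 5000.0) ** 2 + ((p[1] - w) / 0.28) ** 2
    nearest = sorted(corpus, key=lambda item: dist(item[0]))[:k]
    return [sum(c[i] for _, c in nearest) / k for i in range(29)]

pred = predict(3000, 0.1)
truth = [mc_open_fraction(3000, 0.1, 0.1 * k) for k in range(1, 30)]
err = max(abs(a - b) for a, b in zip(pred, truth))
```

Once the corpus exists, each prediction costs a lookup rather than a fresh stochastic simulation, which is the source of the speed-up the abstract describes.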

19.
Yin G, Cai J. Biometrics. 2005;61(1):151-161.
As an alternative to the mean regression model, the quantile regression model has been studied extensively with independent failure time data. However, due to natural or artificial clustering, it is common to encounter multivariate failure time data in biomedical research where the intracluster correlation needs to be accounted for appropriately. For right-censored correlated survival data, we investigate the quantile regression model and adapt an estimating equation approach for parameter estimation under the working independence assumption, as well as a weighted version for enhancing the efficiency. We show that the parameter estimates are consistent and asymptotically follow normal distributions. The variance estimation using asymptotic approximation involves nonparametric functional density estimation. We employ the bootstrap and perturbation resampling methods for the estimation of the variance-covariance matrix. We examine the proposed method for finite sample sizes through simulation studies, and illustrate it with data from a clinical trial on otitis media.

20.
Next-generation sequencing of pooled samples (Pool-seq) is a popular method to assess genome-wide diversity patterns in natural and experimental populations. However, Pool-seq is associated with specific sources of noise, such as unequal individual contributions. Consequently, using Pool-seq for the reconstruction of evolutionary history has remained underexplored. Here we describe a novel Approximate Bayesian Computation (ABC) method to infer demographic history, explicitly modelling Pool-seq sources of error. By jointly modelling Pool-seq data, demographic history and the effects of selection due to barrier loci, we obtain estimates of demographic history parameters accounting for technical errors associated with Pool-seq. Our ABC approach is computationally efficient as it relies on simulating subsets of loci (rather than the whole genome) and on using relative summary statistics and relative model parameters. Our simulation study results indicate that Pool-seq data allow distinguishing between general scenarios of ecotype formation (single versus parallel origin) and inferring relevant demographic parameters (e.g. effective sizes and split times). We exemplify the application of our method to Pool-seq data from the rocky-shore gastropod Littorina saxatilis, sampled on a narrow geographical scale at two Swedish locations where two ecotypes (Wave and Crab) are found. Our model choice and parameter estimates show that the ecotypes formed before colonization of the two locations (i.e. single origin) and are maintained despite gene flow. These results indicate that demographic modelling and inference can be successful based on pool-sequencing using ABC, contributing to the development of suitable null models that allow for a better understanding of the genetic basis of divergent adaptation.
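The ABC rejection loop underlying such inference can be sketched in a few lines. The "simulator", summary statistic and prior below are toys invented for illustration; the paper's method uses realistic Pool-seq error models, many summary statistics, and relative parameterisations.

```python
import random

random.seed(0)

def simulate_diff(split_time, n_loci=200):
    """Toy simulator: per-locus differentiation grows with split time, with
    added 'pool' noise. Not the paper's model, just the ABC mechanics."""
    return [max(0.0, min(1.0, random.expovariate(1.0 / (0.002 * split_time))
                              + random.gauss(0, 0.01)))
            for _ in range(n_loci)]

def summary(freqs):
    return sum(freqs) / len(freqs)   # a single toy summary statistic

true_T = 50.0
observed = summary(simulate_diff(true_T))

# ABC rejection: draw the split time from the prior, simulate, and keep draws
# whose summary statistic lands close to the observed one
accepted = []
for _ in range(5000):
    T = random.uniform(1.0, 200.0)
    if abs(summary(simulate_diff(T)) - observed) < 0.008:
        accepted.append(T)
posterior_mean = sum(accepted) / len(accepted)
```

The accepted draws approximate the posterior of the split time; simulating only subsets of loci, as the paper does, keeps each iteration of this loop cheap.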


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号