Similar Literature (20 results)
1.
Hybrid simulation of cellular behavior
MOTIVATION: To be valuable to biological or biomedical research, in silico methods must be scaled to complex pathways and large numbers of interacting molecular species. The correct method for performing such simulations, discrete event simulation by Monte Carlo generation, is computationally costly for large complex systems. Approximation of molecular behavior by continuous models fails to capture stochastic behavior that is essential to many biological phenomena. RESULTS: We present a novel approach to building hybrid simulations in which some processes are simulated discretely, while other processes are handled in a continuous simulation by differential equations. This approach preserves the stochastic behavior of cellular pathways, yet enables scaling to large populations of molecules. We present an algorithm for synchronizing data in a hybrid simulation and discuss the trade-offs in such simulations. We have implemented the hybrid simulation algorithm and have validated it by simulating the statistical behavior of the well-known lambda phage switch. Hybrid simulation provides a new method for exploring the sources and nature of stochastic behavior in cells.
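A minimal sketch of the synchronization problem this entry describes, not the authors' implementation: a continuous variable is integrated between discrete stochastic events, while the discrete event clock accumulates the time-varying propensity until it crosses an exponentially distributed threshold. The two-state gene, rate constants, and Euler integrator below are all invented for illustration.

```python
# Hypothetical hybrid loop: ODE between events, integrated-propensity clock.
import math, random

def hybrid_trajectory(x0=0.0, t_end=10.0, dt=1e-3,
                      k_prod=5.0, k_deg=0.5, k_event=0.05):
    """Protein level x evolves continuously; a discrete switch event
    fires with time-dependent propensity k_event * x(t)."""
    x, t, producing = x0, 0.0, True
    threshold = -math.log(random.random())   # next-event threshold
    integral = 0.0                           # accumulated propensity
    events = []
    while t < t_end:
        dxdt = (k_prod if producing else 0.0) - k_deg * x
        x += dxdt * dt                       # continuous (Euler) update
        integral += k_event * x * dt         # synchronize the event clock
        t += dt
        if integral >= threshold:            # discrete event fires
            events.append(round(t, 3))
            producing = not producing        # stochastic state switch
            integral = 0.0
            threshold = -math.log(random.random())
    return x, events

print(hybrid_trajectory())
```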

2.
Cheon S, Liang F. Bio Systems 2011, 105(3):243-249.
Recently, the stochastic approximation Monte Carlo algorithm was proposed by Liang et al. (2007) as a general-purpose stochastic optimization and simulation algorithm. An annealing version of this algorithm was developed for folding problems of small real proteins. The numerical results indicate that it outperforms simulated annealing and conventional Monte Carlo algorithms as a stochastic optimization algorithm. We also propose a method for using secondary structures in protein folding. The predicted protein structures are rather close to the true structures.
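For context, a compact sketch of the stochastic approximation Monte Carlo idea this entry builds on: adaptive log-weights over energy bins are updated with a decaying gain factor so the sampler visits all bins and escapes local minima. The toy landscape, binning, and tuning constants are assumptions, not the authors' protein-folding setup.

```python
import math, random

def samc(energy, propose, x0, edges, n_steps=50_000, t0=1000.0):
    """SAMC sketch: `edges` defines energy bins; theta are adaptive
    log-weights driving the sampler toward under-visited bins."""
    m = len(edges) + 1
    def bin_of(e):
        for i, b in enumerate(edges):
            if e < b:
                return i
        return m - 1
    theta = [0.0] * m
    pi = 1.0 / m                                  # desired bin frequencies
    x, e = x0, energy(x0)
    best = (e, x)
    for t in range(1, n_steps + 1):
        x_new = propose(x)
        e_new = energy(x_new)
        i, j = bin_of(e), bin_of(e_new)
        # Metropolis ratio for the bin-reweighted target.
        log_r = -(e_new - e) + theta[i] - theta[j]
        if log_r >= 0 or random.random() < math.exp(log_r):
            x, e = x_new, e_new
        gamma = t0 / max(t0, t)                   # decaying gain factor
        for idx in range(m):                      # theta update step
            theta[idx] -= gamma * pi
        theta[bin_of(e)] += gamma
        if e < best[0]:
            best = (e, x)
    return best

# Toy rugged 1D landscape standing in for a folding energy surface.
best = samc(lambda x: 0.1 * x * x + math.sin(3 * x),
            lambda x: x + random.uniform(-1, 1), x0=0.0,
            edges=[-0.5, 0.0, 0.5, 1.0, 2.0])
print(best)
```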

3.
A formal partially dynamical approach to ergodic sampling, hybrid Monte Carlo, has been adapted for the first time from its proven application in quantum chromodynamics to realistic molecular systems. A series of simulations of pancreatic trypsin inhibitor were run using temperature-rescaled molecular dynamics and hybrid Monte Carlo. It was found that simulations run using hybrid Monte Carlo equilibrated an order of magnitude faster than those run using temperature-rescaled molecular dynamics. Certain aspects of improved performance obtained using hybrid Monte Carlo are probably due to the increased efficiency with which this algorithm explores phase space. To discuss this we introduce the notion of “trajectory stiffness”. © 1993 John Wiley & Sons, Inc.
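The hybrid (Hamiltonian) Monte Carlo move itself is standard and can be sketched generically: refresh the momenta, integrate Hamiltonian dynamics with a leapfrog scheme, and accept or reject on the total-energy error. The harmonic toy potential below stands in for the molecular force field used in the paper; step size and trajectory length are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def hmc_step(x, grad_u, u, eps=0.05, n_leap=20):
    """One HMC update for potential energy u(x) with gradient grad_u."""
    p = rng.standard_normal(x.shape)            # fresh Gaussian momenta
    h_old = u(x) + 0.5 * p @ p
    x_new = x.copy()
    p_new = p - 0.5 * eps * grad_u(x)           # initial half-kick
    for _ in range(n_leap):                     # leapfrog integration
        x_new = x_new + eps * p_new
        p_new = p_new - eps * grad_u(x_new)
    p_new = p_new + 0.5 * eps * grad_u(x_new)   # undo the extra half-kick
    h_new = u(x_new) + 0.5 * p_new @ p_new
    # Metropolis test on the discretization error of H.
    if rng.random() < np.exp(min(0.0, h_old - h_new)):
        return x_new
    return x

# Toy harmonic potential in 3 dimensions.
u = lambda x: 0.5 * x @ x
grad_u = lambda x: x
x = np.zeros(3)
for _ in range(1000):
    x = hmc_step(x, grad_u, u)
print(x)
```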

4.
In the purification of monoclonal antibodies, ion-exchange chromatography is typically used among the polishing steps to reduce the amount of product-related impurities such as aggregates and fragments, whilst simultaneously reducing HCP, residual Protein A and potential toxins and viruses. When the product-related impurities are difficult to separate from the products, the optimization of these chromatographic steps can be complex and laborious. In this paper, we optimize the polishing chromatography of a monoclonal antibody from a challenging ternary feed mixture by introducing a hybrid approach of the simplex method and a form of local optimization. To maximize the productivity of this preparative bind-and-elute cation-exchange chromatography, wide ranges of the three critical operational parameters (column loading, initial salt concentration, and gradient slope) had to be considered. The hybrid optimization approach is shown to be extremely effective in dealing with this complex separation, which was subject to multiple constraints based on yield, purity, and product breakthrough. Furthermore, it enabled the generation of a large knowledge space that was subsequently used to study the sensitivity of the objective function. Increased design-space understanding was gained through the application of Monte Carlo simulations. Hence, this work proposes a powerful hybrid optimization method, applied to an industrially relevant process development challenge. The properties of this approach, and the results and insights gained, make it perfectly suited for the rapid development of biotechnological unit operations during early-stage bioprocess development.

5.
Introduction

The Monte Carlo technique is widely used and recommended for including uncertainties in LCA. Typically, 1,000 or 10,000 runs are done, but a clear argument for that number is not available, and with the growing size of LCA databases, an excessively high number of runs may become prohibitively time-consuming. We therefore investigate whether a large number of runs is useful, or if it might be unnecessary or even harmful.

Probability theory

We review the standard theory of probability distributions for describing stochastic variables, including the combination of different stochastic variables into a calculation. We also review the standard theory of inferential statistics for estimating a probability distribution, given a sample of values. For estimating the distribution of a function of probability distributions, two major techniques are available: an analytical one, applying probability theory, and a numerical one, using Monte Carlo simulation. Because the analytical technique is often unavailable, the obvious way out is Monte Carlo. However, we demonstrate and illustrate that it leads to overly precise conclusions on the values of estimated parameters, and to incorrect hypothesis tests.

Numerical illustration

We demonstrate the effect for two simple cases: one system in a stand-alone analysis, and a comparative analysis of two alternative systems. Both cases illustrate that statistical hypotheses which should not be rejected are in fact rejected in a highly convincing way, pointing to a fundamental flaw.

Discussion and conclusions

Apart from the obvious recommendation to use larger samples for estimating input distributions, we suggest restricting the number of Monte Carlo runs to a number not greater than the sample sizes used for the input parameters. As a final note, when the input parameters are not estimated from samples but through a procedure, such as the popular pedigree approach, the Monte Carlo approach should not be used at all.
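A hedged illustration of the concluding recommendation, with invented numbers: cap the number of Monte Carlo runs at the size of the sample used to estimate the input distribution, so output statistics do not appear more precise than the underlying data warrant.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical measured sample for one input parameter (n = 25).
measured = rng.normal(loc=10.0, scale=2.0, size=25)

def propagate(model, sample, requested_runs=10_000):
    """Propagate input uncertainty through `model`, with the number of
    Monte Carlo runs capped at the measured sample size."""
    n_runs = min(requested_runs, len(sample))
    mu, sigma = sample.mean(), sample.std(ddof=1)
    draws = rng.normal(mu, sigma, size=n_runs)
    return np.array([model(x) for x in draws])

# Toy impact model: impact scales quadratically with the parameter.
out = propagate(lambda x: 0.3 * x**2, measured)
print(f"runs: {len(out)}, mean: {out.mean():.2f}, sd: {out.std(ddof=1):.2f}")
```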


6.
MOTIVATION: The stochastic kinetics of a well-mixed chemical system, governed by the chemical Master equation, can be simulated using the exact methods of Gillespie. However, these methods do not scale well as systems become more complex and larger models are built to include reactions with widely varying rates, since the computational burden of simulation increases with the number of reaction events. Continuous models may provide an approximate solution and are computationally less costly, but they fail to capture the stochastic behavior of small populations of macromolecules. RESULTS: In this article we present a hybrid simulation algorithm that dynamically partitions the system into subsets of continuous and discrete reactions, approximates the continuous reactions deterministically as a system of ordinary differential equations (ODE) and uses a Monte Carlo method for generating discrete reaction events according to a time-dependent propensity. Our approach to partitioning is improved such that we dynamically partition the system of reactions, based on a threshold relative to the distribution of propensities in the discrete subset. We have implemented the hybrid algorithm in an extensible framework, utilizing two rigorous ODE solvers to approximate the continuous reactions, and use an example model to illustrate the accuracy and potential speedup of the algorithm when compared with exact stochastic simulation. AVAILABILITY: Software and benchmark models used for this publication can be made available upon request from the authors.
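A stripped-down sketch of propensity-threshold partitioning as described; the paper uses rigorous ODE solvers, whereas the forward-Euler update, the fixed threshold, and all rate constants below are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def hybrid_step(x, reactions, threshold, dt_max):
    """One hybrid step over state vector x.
    `reactions` is a list of (propensity_fn, stoichiometry_vector)."""
    props = np.array([a(x) for a, _ in reactions])
    discrete = props < threshold                  # dynamic partition
    a0 = props[discrete].sum()                    # total discrete propensity
    tau = rng.exponential(1.0 / a0) if a0 > 0 else np.inf
    dt = min(tau, dt_max)
    x = x.astype(float)
    # Continuous subset: crude forward-Euler update over dt.
    for (a, v), p, d in zip(reactions, props, discrete):
        if not d:
            x += p * dt * np.asarray(v, dtype=float)
    # Discrete subset: fire one Gillespie event if it fell inside dt.
    if tau <= dt_max:
        idx = np.flatnonzero(discrete)
        j = rng.choice(idx, p=props[idx] / a0)
        x += np.asarray(reactions[j][1], dtype=float)
    return x, dt

# Example: fast production treated continuously, rare conversion discrete
# (the conversion migrates to the continuous subset once A grows large).
reactions = [(lambda x: 50.0,        np.array([1, 0])),    # 0 -> A
             (lambda x: 0.1 * x[0],  np.array([-1, 1]))]   # A -> B
x, t = np.array([10.0, 0.0]), 0.0
while t < 10.0:
    x, dt = hybrid_step(x, reactions, threshold=5.0, dt_max=0.05)
    t += dt
print(x)
```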

7.
An optimization framework based on the use of hybrid models is presented for preparative chromatographic processes. The first step in the hybrid model strategy involves the experimental determination of the parameters of the physical model, which consists of the full general rate model coupled with the kinetic form of the steric mass action isotherm. These parameters are then used to carry out a set of simulations with the physical model to obtain data on the functional relationship between various objective functions and decision variables. The resulting data is then used to estimate the parameters for neural-network-based empirical models. These empirical models are developed in order to enable the exploration of a wide variety of different design scenarios without any additional computational requirements. The resulting empirical models are then used with a sequential quadratic programming optimization algorithm to maximize the objective function, production rate times yield (in the presence of solubility and purity constraints), for binary and tertiary model protein systems. The use of hybrid empirical models to represent complex preparative chromatographic systems significantly reduces the computational time required for simulation and optimization. In addition, it allows both multivariable optimization and rapid exploration of different scenarios for optimal design.
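A hedged sketch of the hybrid-model workflow: sample an expensive "physical model" (here a stand-in quadratic, not the general rate model), fit a neural-network surrogate, and maximize the surrogate with an SQP-type solver. SciPy's SLSQP plays the role of the sequential quadratic programming step; all names, ranges, and the toy objective are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

def physical_model(d):
    """Placeholder for an expensive chromatography simulation:
    returns production rate * yield for decision variables d."""
    load, salt = d
    return -(load - 0.6) ** 2 - (salt - 0.3) ** 2 + 1.0

# 1) Sample the physical model to build training data.
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = np.array([physical_model(d) for d in X])

# 2) Fit the neural-network-based empirical model (surrogate).
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X, y)

# 3) Maximize the surrogate with SciPy's SQP-style solver (SLSQP),
#    subject to box constraints on the decision variables.
res = minimize(lambda d: -surrogate.predict(d.reshape(1, -1))[0],
               x0=[0.5, 0.5], method="SLSQP",
               bounds=[(0.0, 1.0), (0.0, 1.0)])
print("optimum near:", res.x)
```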

8.
MOTIVATION: Monte Carlo methods are the most effective means of exploring the energy landscapes of protein folding. The rugged topography of folding energy landscapes causes sampling inefficiencies, however, particularly at low, physiological temperatures. RESULTS: A hybrid Monte Carlo method, termed density guided importance sampling (DGIS), is presented that overcomes these sampling inefficiencies. The method is shown to be highly accurate and efficient in determining Boltzmann-weighted structural metrics of a discrete off-lattice protein model. In comparison to the Metropolis Monte Carlo method and the hybrid Monte Carlo methods jump-walking, smart-walking and replica-exchange, the DGIS method is shown to be more efficient, requiring no parameter optimization. The method guides the simulation towards under-sampled regions of the energy spectrum and recognizes when equilibrium has been reached, avoiding arbitrary and excessively long simulation times. AVAILABILITY: Fortran code available from the authors upon request. CONTACT: m.j.parker@leeds.ac.uk.
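The DGIS algorithm itself is not specified in the abstract; for orientation, here is the Metropolis Monte Carlo baseline it is compared against, in generic form. The toy double-well energy illustrates the low-temperature sampling inefficiency such methods target.

```python
import math, random

def metropolis(energy, propose, x0, beta=1.0, n_steps=20_000):
    """Generic Metropolis sampler: `energy` maps a state to E(x),
    `propose` maps a state to a trial state."""
    x, e = x0, energy(x0)
    samples = []
    for _ in range(n_steps):
        x_new = propose(x)
        e_new = energy(x_new)
        # Accept with probability min(1, exp(-beta * dE)).
        if e_new <= e or random.random() < math.exp(-beta * (e_new - e)):
            x, e = x_new, e_new
        samples.append(x)
    return samples

# Toy 1D double-well energy; at low temperature (high beta), transitions
# between the wells become rare, which is the inefficiency in question.
samples = metropolis(lambda x: (x * x - 1.0) ** 2,
                     lambda x: x + random.uniform(-0.5, 0.5),
                     x0=0.0, beta=8.0)
print(sum(samples) / len(samples))
```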

9.

Purpose

When product systems are optimized to minimize environmental impacts, uncertainty in the process data may impact optimal decisions. The purpose of this article is to propose a mathematical method for life cycle assessment (LCA) optimization that protects decisions against uncertainty at the life cycle inventory (LCI) stage.

Methods

A robust optimization approach is proposed for decision making under uncertainty in the LCI stage. The proposed approach incorporates data uncertainty into an optimization problem in which the matrix-based LCI model appears as a constraint. The level of protection against data uncertainty in the technology and intervention matrices can be controlled to reflect varying degrees of conservatism.

Results and discussion

A simple numerical example of an electricity generation product system is used to illustrate the main features of this methodology. A comparison is made between the robust optimization approach and decision making using a Monte Carlo analysis. Challenges in implementing the robust optimization approach for common uncertainty distributions found in LCA and for large product systems are discussed. Supporting source code is available for download at https://github.com/renwang/Robust_Optimization_LCI_Uncertainty.

Conclusions

A robust optimization approach for matrix-based LCI is proposed. The approach incorporates data uncertainties into an optimization framework for LCI and provides a mechanism to control the level of protection against uncertainty. The tool computes optimal decisions that protect against worst-case realizations of data uncertainty. The robust optimal solution is conservative and is able to avoid the negative consequences of uncertainty in decision making.
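A minimal sketch of the matrix-based LCI model that appears as the constraint, g = B A^-1 f (intervention matrix B, technology matrix A, final demand f), alongside the Monte Carlo comparison mentioned above. The matrices and the 5% perturbation are toy assumptions; robust optimization instead guards against worst-case realizations within such a perturbation set.

```python
import numpy as np

rng = np.random.default_rng(7)

A = np.array([[1.0, -0.1],      # technology matrix
              [-0.2, 1.0]])
B = np.array([[0.5, 1.2]])      # intervention matrix (e.g., kg CO2/unit)
f = np.array([1.0, 0.0])        # final demand

g_nominal = B @ np.linalg.solve(A, f)    # nominal inventory g = B A^-1 f

# Monte Carlo alternative: sample A entrywise (5% relative noise)
# and look at the spread of the resulting inventory.
gs = []
for _ in range(1000):
    A_s = A * (1 + 0.05 * rng.standard_normal(A.shape))
    gs.append((B @ np.linalg.solve(A_s, f))[0])

print(f"nominal: {g_nominal[0]:.3f}, "
      f"MC 5th-95th percentile: {np.percentile(gs, [5, 95])}")
```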

10.
Summary

When cells are subjected to ionizing radiation the specific energy rate (microscopic analog of dose-rate) varies from cell to cell. Within one cell, this rate fluctuates during the course of time; a crossing of a sensitive cellular site by a high energy charged particle produces many ionizations almost simultaneously, but during the interval between events no ionizations occur. In any cell-survival model one can incorporate the effect of such fluctuations without changing the basic biological assumptions. Using stochastic differential equations and Monte Carlo methods to take into account stochastic effects, we calculated the dose-survival relationships in a number of current cell survival models. Some of the models assume quadratic misrepair; others assume saturable repair enzyme systems. It was found that a significant effect of random fluctuations is to decrease the theoretically predicted amount of dose-rate sparing. In the limit of low dose-rates, neglecting the stochastic nature of specific energy rates often leads to qualitatively misleading results by drastically overestimating the surviving fraction. In the opposite limit of acute irradiation, analyzing the fluctuations in rates merely amounts to analyzing fluctuations in total specific energy via the usual microdosimetric specific energy distribution function, and neglecting fluctuations usually underestimates the surviving fraction. The Monte Carlo methods interpolate systematically between the low dose-rate and high dose-rate limits. As in other approaches, the slope of the survival curve at low dose-rates is virtually independent of dose and equals the initial slope of the survival curve for acute radiation.
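A toy illustration of the fluctuation effect in the acute-irradiation limit, assuming a linear-quadratic survival model and Poisson-distributed energy-deposition events; all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def surviving_fraction(dose, event_size, alpha=0.2, beta=0.05,
                       n_cells=50_000):
    """Compare survival when specific energy arrives in discrete,
    Poisson-distributed events of size `event_size` (stochastic) versus
    every cell receiving exactly `dose` (deterministic)."""
    n_events = rng.poisson(dose / event_size, size=n_cells)
    z = n_events * event_size                 # specific energy per cell
    surv_stoch = np.exp(-alpha * z - beta * z**2).mean()
    surv_det = float(np.exp(-alpha * dose - beta * dose**2))
    return surv_stoch, surv_det

# Consistent with the abstract's high-dose-rate limit, neglecting the
# fluctuations (deterministic case) underestimates the surviving fraction.
print(surviving_fraction(dose=4.0, event_size=0.5))
```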

11.

Background  

The fundamental role that intrinsic stochasticity plays in cellular functions has been shown via numerous computational and experimental studies. In the face of such evidence, it is important that intracellular networks are simulated with stochastic algorithms that can capture molecular fluctuations. However, separation of time scales and disparity in species population, two common features of intracellular networks, make stochastic simulation of such networks computationally prohibitive. While recent work has addressed each of these challenges separately, a generic algorithm that can simultaneously tackle disparity in time scales and population scales in stochastic systems is currently lacking. In this paper, we propose the hybrid, multiscale Monte Carlo (HyMSMC) method that fills this void.

12.
We have developed a computational method of protein design to detect amino acid sequences that are adaptable to given main-chain coordinates of a protein. In this method, the selection of amino acid types employs a Metropolis Monte Carlo method with a scoring function in conjunction with the approximation of free energies computed from 3D structures. To compute the scoring function, a side-chain prediction using another Metropolis Monte Carlo method was performed to select structurally suitable side-chain conformations from a side-chain library. In total, two layers of Monte Carlo procedures were performed: first to select amino acid types (1st-layer Monte Carlo) and then to predict side-chain conformations (2nd-layer Monte Carlo). We applied this method to design of the entire sequence of the SH3 domain, Protein G, and BPTI. The predicted sequences were similar to those of the wild-type proteins. We compared the results of the predictions with and without the 2nd-layer Monte Carlo method. The results revealed that the two-layer Monte Carlo method produced better sequence similarity to the wild-type proteins than the one-layer method. Finally, we applied this method to neuraminidase of influenza virus. The results were consistent with the sequences identified from the isolated viruses.
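A schematic two-layer Metropolis scheme in the spirit of this description: the outer layer mutates amino acid types, while the inner layer relaxes a one-float-per-residue stand-in for side-chain conformations before scoring. The energy function is a toy assumption, not the authors' free-energy approximation or rotamer library.

```python
import math, random

AAS = "ACDEFGHIKLMNPQRSTVWY"

def inner_mc(seq, conf, energy, beta=1.0, n=200):
    """2nd layer: Metropolis over side-chain 'conformations'."""
    e = energy(seq, conf)
    for _ in range(n):
        trial = conf[:]
        trial[random.randrange(len(conf))] += random.uniform(-0.3, 0.3)
        e_t = energy(seq, trial)
        if e_t <= e or random.random() < math.exp(-beta * (e_t - e)):
            conf, e = trial, e_t
    return conf, e

def design(length, energy, beta=1.0, n=300):
    """1st layer: Metropolis over amino acid types, scoring each trial
    sequence via an inner side-chain relaxation."""
    seq = [random.choice(AAS) for _ in range(length)]
    conf, e = inner_mc(seq, [0.0] * length, energy, beta)
    for _ in range(n):
        trial = seq[:]
        trial[random.randrange(length)] = random.choice(AAS)
        conf_t, e_t = inner_mc(trial, conf[:], energy, beta)
        if e_t <= e or random.random() < math.exp(-beta * (e_t - e)):
            seq, conf, e = trial, conf_t, e_t
    return "".join(seq), e

# Toy energy: prefers hydrophobics at even positions and compact conformers.
hydro = set("AVILMFWC")
energy = lambda s, c: sum((0.0 if (a in hydro) == (i % 2 == 0) else 1.0)
                          + 0.1 * x * x
                          for i, (a, x) in enumerate(zip(s, c)))
print(design(10, energy))
```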

13.
Biophysical Journal 2022, 121(12):2398-2410.
Quorum sensing is a bacterial cell-cell communication process that regulates gene expression. The search and binding of the autoinducer molecule (AHL)-bound LuxR-type proteins to specific sites on DNA in quorum-sensing cells in Gram-negative bacteria is a complex process and has been theoretically investigated based on a discrete-state stochastic approach. It is shown that several factors such as the rate of formation of the AHL-bound LuxR protein within the cells and its dissociation to freely diffusing AHL, the diffusion of the latter in and out of the cells, positive feedback loops, and the cell population density play important roles in the protein target search and can control the gene regulation processes. Physical-chemical arguments to explain these observations are presented. Our calculations of the dynamic properties are also supplemented by Monte Carlo computer simulations. Our theoretical model provides physical insights into the complex mechanisms of protein target search in quorum-sensing cells.

14.
ABACUS [Grishaev et al. (2005) Proteins 61:36-43] is a novel protocol for automated protein structure determination via NMR. ABACUS starts from molecular fragments defined by unassigned J-coupled spin-systems and involves a Monte Carlo stochastic search in assignment space, probabilistic sequence selection, and assembly of fragments into structures that are used to guide the stochastic search. Here, we report further development of the two main algorithms that increase the flexibility and robustness of the method. Performance of the BACUS [Grishaev and Llinás (2004) J Biomol NMR 28:1-101] algorithm was significantly improved through use of sequential connectivities available from through-bond correlated 3D-NMR experiments, and a new set of likelihood probabilities derived from a database of 56 ultra-high-resolution X-ray structures. A multicanonical Monte Carlo procedure, Fragment Monte Carlo (FMC), was developed for sequence-specific assignment of spin-systems. It relies on enhanced assignment sampling and provides the uncertainty of assignments in a quantitative manner. The efficiency of the protocol was validated on data from four proteins of between 68 and 116 residues, yielding 100% accuracy in sequence-specific assignment of backbone and side-chain resonances.

15.
Consistently predicting biopolymer structure at atomic resolution from sequence alone remains a difficult problem, even for small sub-segments of large proteins. Such loop prediction challenges, which arise frequently in comparative modeling and protein design, can become intractable as loop lengths exceed 10 residues and if surrounding side-chain conformations are erased. Current approaches, such as the protein local optimization protocol or kinematic inversion closure (KIC) Monte Carlo, involve stages that coarse-grain proteins, simplifying modeling but precluding a systematic search of all-atom configurations. This article introduces an alternative modeling strategy based on a ‘stepwise ansatz’, recently developed for RNA modeling, which posits that any realistic all-atom molecular conformation can be built up by residue-by-residue stepwise enumeration. When harnessed to a dynamic-programming-like recursion in the Rosetta framework, the resulting stepwise assembly (SWA) protocol enables enumerative sampling of a 12 residue loop at a significant but achievable cost of thousands of CPU-hours. In a previously established benchmark, SWA recovers crystallographic conformations with sub-Angstrom accuracy for 19 of 20 loops, compared to 14 of 20 by KIC modeling with a comparable expenditure of computational power. Furthermore, SWA gives high accuracy results on an additional set of 15 loops highlighted in the biological literature for their irregularity or unusual length. Successes include cis-Pro touch turns, loops that pass through tunnels of other side-chains, and loops of lengths up to 24 residues. Remaining problem cases are traced to inaccuracies in the Rosetta all-atom energy function. In five additional blind tests, SWA achieves sub-Angstrom accuracy models, including the first such success in a protein/RNA binding interface, the YbxF/kink-turn interaction in the fourth ‘RNA-puzzle’ competition. These results establish all-atom enumeration as an unusually systematic approach to ab initio protein structure modeling that can leverage high performance computing and physically realistic energy functions to more consistently achieve atomic accuracy.

16.
Chemical synaptic transmission involves the release of a neurotransmitter that diffuses in the extracellular space and interacts with specific receptors located on the postsynaptic membrane. Computer simulation approaches provide fundamental tools for exploring various aspects of the synaptic transmission under different conditions. In particular, Monte Carlo methods can track the stochastic movements of neurotransmitter molecules and their interactions with other discrete molecules, the receptors. However, these methods are computationally expensive, even when used with simplified models, preventing their use in large-scale and multi-scale simulations of complex neuronal systems that may involve large numbers of synaptic connections. We have developed a machine-learning based method that can accurately predict relevant aspects of the behavior of synapses, such as the percentage of open synaptic receptors as a function of time since the release of the neurotransmitter, with considerably lower computational cost compared with the conventional Monte Carlo alternative. The method is designed to learn patterns and general principles from a corpus of previously generated Monte Carlo simulations of synapses covering a wide range of structural and functional characteristics. These patterns are later used as a predictive model of the behavior of synapses under different conditions without the need for additional computationally expensive Monte Carlo simulations. This is performed in five stages: data sampling, fold creation, machine learning, validation and curve fitting. The resulting procedure is accurate, automatic, and it is general enough to predict synapse behavior under experimental conditions that are different to the ones it has been trained on. Since our method efficiently reproduces the results that can be obtained with Monte Carlo simulations at a considerably lower computational cost, it is suitable for the simulation of high numbers of synapses and it is therefore an excellent tool for multi-scale simulations.
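A hedged sketch of the surrogate strategy: build a corpus of (parameters, time) -> open-receptor-fraction pairs, fit a regression model, and query the model instead of rerunning the simulator. The data generator below is a synthetic stand-in for the Monte Carlo corpus, and the random-forest regressor is one plausible model choice, not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)

def fake_mc_open_fraction(k_on, k_off, t):
    """Stand-in for an expensive MC run: pseudo first-order rise and
    slow decay of the open-receptor fraction, plus sampling noise."""
    frac = (k_on / (k_on + k_off)) * (1 - np.exp(-(k_on + k_off) * t))
    return frac * np.exp(-0.1 * t) + rng.normal(0, 0.01)

# Corpus: random rate constants and query times -> simulated responses.
X = rng.uniform([0.5, 0.5, 0.0], [5.0, 5.0, 10.0], size=(5000, 3))
y = np.array([fake_mc_open_fraction(*row) for row in X])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
# The fitted model now answers queries at negligible cost.
print("held-out R^2:", round(model.score(X_te, y_te), 3))
```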

17.
Genetic alterations such as point mutations, chromosomal rearrangements, modification of DNA methylation and chromosome aberrations accumulate during the lifetime of an organism. They can be caused by intrinsic errors in DNA replication and repair as well as by external factors such as exposure to mutagenic substances or radiation. The main purpose of the present work is to begin an exploration of the stochastic nature of non-equilibrium DNA alteration caused by events such as tautomeric shifts. This is done by modeling the genetic DNA code chain as a sequence of DNA-bit values ('1' for normal bases and '-1' for abnormal bases). We observe the number of DNA-bit changes resulting from the random point mutation process which, in the model, is induced by a stochastic Brownian mutagen (BM) as it diffuses through the DNA-bit system. Using both analytical and Monte Carlo (MC) simulation techniques, we observe the local and global numbers of DNA-bit changes. It is found that in 1D the local DNA-bit density behaves like 1/t, while the global total number of switched (abnormal) DNA-bits increases as t. The probability distribution P(b, 0, t) of b(0, t) is log-normal. It is also found that when the number of mutagens is increased, the total number of abnormal DNA-bits does not grow linearly with the number of mutagens. All analytic results are in good agreement with the simulation results.
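A minimal sketch of the DNA-bit model as described: a single mutagen performs a 1D random walk over a chain of bits ('1' normal, '-1' abnormal) and switches the bit at each site it visits. The flip-on-every-visit rule, chain length, and boundary handling are assumptions; the scaling results quoted above belong to the paper's specific model.

```python
import numpy as np

rng = np.random.default_rng(11)

def brownian_mutagen(n_sites=10_001, n_steps=50_000):
    bits = np.ones(n_sites, dtype=int)        # all bases start normal ('1')
    pos = n_sites // 2                        # mutagen starts mid-chain
    n_abnormal = 0
    history = np.empty(n_steps, dtype=int)    # abnormal-bit count over time
    for t in range(n_steps):
        pos = (pos + rng.choice((-1, 1))) % n_sites   # Brownian step
        bits[pos] = -bits[pos]                # visit switches the DNA-bit
        n_abnormal += 1 if bits[pos] == -1 else -1
        history[t] = n_abnormal
    return history

history = brownian_mutagen()
print("abnormal DNA-bits at end of run:", history[-1])
```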

18.
Introduction

Interventional procedures are associated with potentially high radiation doses to the skin. The 2013/59/EURATOM Directive establishes that the equipment used for interventional radiology must have a device or a feature informing the practitioner of relevant parameters for assessing patient dose at the end of the procedure. Monte Carlo codes of radiation transport are considered to be one of the most reliable tools available to assess doses. However, they are usually too time-consuming for use in clinical practice. This work presents the validation of the fast Monte Carlo code MC-GPU for application in interventional radiology.

Methodologies

MC-GPU calculations were compared against the well-validated Monte Carlo simulation code PENELOPE/penEasy by simulating the organ dose distribution in a voxelized anthropomorphic phantom. In a second phase, the code was compared against thermoluminescent measurements performed on slab phantoms, both in a calibration laboratory and at a hospital.

Results

The results obtained from the two simulation codes show very good agreement: differences in the output were within 1%, whereas the calculation time on the MC-GPU was 2500 times shorter. Differences with respect to measurements are of the order of 10%, within the associated uncertainty.

Conclusions

It has been verified that MC-GPU provides good estimates of the dose when compared to the PENELOPE program. It is also shown that it performs very well when assessing organ doses in very short times, less than one minute, in real clinical set-ups. Future steps would be to simulate complex procedures with several projections.

19.
Using informational statistics, an analysis-of-variance model is developed for separating the exclusive effects of two simultaneously present factors on the variability of the ongoing behaviour of a reference subject. In particular the following factors are considered: the preceding behaviour of the reference subject itself and the preceding behaviour of its partner. Effects due to the latter are usually regarded as representing communication. The model is compared with other information-statistical models for social interaction proposed in ethological research. Three properties are discussed: structural complexity, the rationale for identifying relevant effects and the efficiency in measuring them. With respect to measuring communication it is shown that several existing models confounded inter-individual and intra-individual effects in behaviour sequences. It is pointed out that different analytical frameworks (e.g. Markovian stochastic processes or analysis-of-variance) can use the same information-statistical formalism but give rise to different interpretations. Finally, the relation between the complexity of inter- and intra-individual effects in interaction sequences and the structure of information-statistical models is discussed. In Appendix I computational procedures are specified; in Appendix II a Monte Carlo procedure for testing observed variability measures is presented.

20.