Similar Articles
Found 20 similar articles (search time: 46 ms)
1.
Abstract

Computer simulations using particles are an attractive method to extract microscopic information about flow phenomena [1]. The molecular dynamics (MD) method, in which Newton's equations of motion are integrated, gives the temporal development of the system. In MD simulations of fluid flows, the computational region is limited to atomistic scales [2]. On the other hand, the direct simulation Monte Carlo (DSMC) method, in which collisions of particles are treated on a probabilistic basis, has the potential to treat a realistic system of macroscopic scale length while retaining atomistic detail. The DSMC method provides an efficient way to integrate the Boltzmann equation from the rarefied-gas to the near-continuum regime. Bird clarified the validity of the DSMC method in the near-continuum flow region [3]. However, the DSMC method has not been applied to the continuum region and compared with continuum hydrodynamics.
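The probabilistic collision step that distinguishes DSMC from MD can be illustrated with a minimal single-cell sketch of Bird's no-time-counter (NTC) scheme. The hard-sphere collision model and all parameter values below are illustrative assumptions, not taken from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

def dsmc_collision_step(v, sigma, f_num, cell_vol, dt, vr_max):
    """One DSMC collision step in a single cell (NTC scheme, hard spheres).

    v       : (N, 3) array of particle velocities (modified in place)
    sigma   : collision cross-section
    f_num   : number of real molecules per simulated particle
    cell_vol, dt : cell volume and time step
    vr_max  : running estimate of the maximum relative speed
    """
    n = len(v)
    # Number of candidate pairs tested this step (NTC formula).
    n_cand = int(0.5 * n * (n - 1) * f_num * sigma * vr_max * dt / cell_vol)
    for _ in range(n_cand):
        i, j = rng.choice(n, size=2, replace=False)
        vr = float(np.linalg.norm(v[i] - v[j]))
        vr_max = max(vr_max, vr)
        # Accept the candidate pair with probability vr / vr_max.
        if rng.random() < vr / vr_max:
            # Elastic hard-sphere collision: relative velocity is
            # scattered isotropically, center-of-mass velocity is kept.
            vcm = 0.5 * (v[i] + v[j])
            cos_t = 2.0 * rng.random() - 1.0
            sin_t = np.sqrt(1.0 - cos_t**2)
            phi = 2.0 * np.pi * rng.random()
            vrel = vr * np.array([sin_t * np.cos(phi),
                                  sin_t * np.sin(phi), cos_t])
            v[i] = vcm + 0.5 * vrel
            v[j] = vcm - 0.5 * vrel
    return v, vr_max
```

Because each accepted collision preserves the center-of-mass velocity and the magnitude of the relative velocity, momentum and kinetic energy are conserved exactly, which is a quick sanity check for any implementation.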

2.
Abstract

This paper continues our Monte Carlo simulation study of liquid hydrogen chloride [1]. The importance of non-additive interactions is carefully analyzed. Computed atom pair correlation functions are compared to neutron scattering experiments [2]. A difference algorithm (the "Δ-algorithm") is developed, which makes non-additive Monte Carlo simulations practicable. We also report an implementation of this algorithm on a transputer network, taking advantage of the inherent parallelism of the Δ-algorithm.

3.
Purpose

The main focus of the current paper is the clinical implementation of a Monte Carlo based platform for treatment plan validation for Tomotherapy and Cyberknife, without adding additional tasks to the dosimetry department.

Methods

The Monte Carlo platform consists of C++ classes for the actual functionality and a web-based GUI that allows accessing the system using a web browser. Calculations are based on BEAMnrc/DOSXYZnrc and/or GATE and are performed automatically after exporting the DICOM data from the treatment planning system. For Cyberknife treatments of moving targets, the log files saved during treatment (positions of the robot, internal fiducials, and external markers) can be used in combination with the 4D planning CT to reconstruct the actually delivered dose. The Monte Carlo platform is also used for calculations on MRI images, using pseudo-CT conversion.

Results

For Tomotherapy treatments we obtain excellent agreement (within 2%) for almost all cases. However, we were able to detect a problem regarding the CT Hounsfield unit definition of the Toshiba Large Bore CT when using a large reconstruction diameter. For Cyberknife treatments we obtain excellent agreement with the Monte Carlo algorithm of the treatment planning system. For some extreme cases, when treating small lung lesions in low-density lung tissue, small differences are obtained due to the different cut-off energy of the secondary electrons.

Conclusions

A Monte Carlo based treatment plan validation tool has been successfully implemented in clinical routine and is used to systematically validate all Cyberknife and Tomotherapy plans.

4.
Abstract

Time-saving procedures unifying Monte Carlo and self-consistent-field approaches for the calculation of equilibrium potentials and density distributions of mobile ions around a polyion in a polyelectrolyte system are considered. In the final version of the method the region around the polyion is divided into two zones, internal and external; all ions of the internal zone are accounted for explicitly in a Monte Carlo procedure, while in the external zone the self-consistent-field approximation is applied, with an exchange of ions between the regions. Simulations are carried out for cylindrical and spherical polyions in solutions with mono- and divalent ions and their mixtures. The results are compared with the Poisson-Boltzmann approximation and with experimental data on intrinsic viscosity.

5.
Abstract

A taboo-based Monte Carlo search, which restricts the sampling of the region near an old configuration, is developed. In this procedure, Monte Carlo simulation and a random search method are combined to improve the sampling efficiency. The feasibility of this method is tested on global optimization of a continuous model function, melting of 256 Lennard-Jones particles at T* = 0.680 and ρ* = 0.850, and polypeptides (alanine dipeptide and Met-enkephalin). From the comparison of results for the model function between our method and other methods, we find an increased convergence rate and a high probability of escaping from local energy minima. The results for the Lennard-Jones solids and polypeptides show that the convergence toward the equilibrium state is better than that of other methods. It is also found that no significant bias in the ensemble distribution is detected, even though taboo-based Monte Carlo search does not sample the correct ensemble distribution, owing to the restriction of sampling near an old configuration.
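The taboo restriction described above can be sketched as a plain Metropolis walk that additionally rejects proposals falling near recently visited configurations, which pushes the search away from regions it has already explored. The double-well test function, temperature, step size, and taboo radius below are all illustrative assumptions, not the model function used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Double-well test function with its global minimum near x = -1.
    return (x**2 - 1.0)**2 + 0.3 * x

def taboo_mc_search(x0, n_steps=5000, step=0.3, beta=3.0,
                    taboo_radius=0.1, taboo_len=50):
    """Metropolis search with a taboo list: proposals within
    `taboo_radius` of the last `taboo_len` accepted configurations
    are rejected outright, discouraging re-sampling of old regions."""
    x, taboo = x0, [x0]
    best_x, best_f = x0, f(x0)
    for _ in range(n_steps):
        x_new = x + rng.normal(scale=step)
        # Taboo restriction: skip proposals too close to recent configs.
        if any(abs(x_new - t) < taboo_radius for t in taboo):
            continue
        # Standard Metropolis acceptance on the remaining proposals.
        if rng.random() < np.exp(-beta * (f(x_new) - f(x))):
            x = x_new
            taboo.append(x)
            taboo = taboo[-taboo_len:]
            if f(x) < best_f:
                best_x, best_f = x, f(x)
    return best_x, best_f
```

Starting the walk in the shallower right-hand well (x0 = 1.0), the taboo pressure plus thermal acceptance lets it cross the barrier at x = 0 and locate the deeper well near x = -1, illustrating the improved escape from local minima reported in the abstract.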

6.
Aim

In this study, we investigated the initial electron parameters of a Siemens Artiste linac with a 6 MV photon beam using the Monte Carlo method.

Background

It is essential to define all the characteristics of the initial electrons hitting the target, i.e. the mean energy and the full width at half maximum (FWHM) of the spatial intensity distribution, which are needed to run Monte Carlo simulations. Monte Carlo is the most accurate method for the simulation of radiotherapy treatments.

Materials and methods

The linac head geometry was modeled using the BEAMnrc code. The phase space files were used as input to a DOSXYZnrc simulation to determine the dose distribution in a water phantom. We obtained percent depth dose curves and the lateral dose profile. All results were obtained at 100 cm SSD for a 10 × 10 cm² field.

Results

We conclude that there is good agreement between the Monte Carlo simulation and measurement data when an electron mean energy of 6.3 MeV and an FWHM of 0.30 cm are used as initial parameters. We observed that the FWHM value has very little effect on the PDD, while both the electron mean energy and the FWHM affect the lateral dose profile. However, these effects are within tolerance values.

Conclusions

The initial parameters depend especially on the components of the linac head. The phase space file obtained from the Monte Carlo simulation of a linac can be used for the calculation of scattering and MLC leakage, for comparing dose distributions in patients, and in various other studies.

7.
Purpose

At its introduction in 2014, dose calculation for the first MLC on a robotic SRS/SBRT platform was limited to a correction-based finite-size pencil beam (FSPB) algorithm. We report on the dosimetric accuracy of a novel Monte Carlo (MC) dose calculation algorithm for this MLC, included in the Precision™ treatment planning system.

Methods

A phantom was built of one slab (5.0 cm) of lung-equivalent material (0.09–0.29 g/cc) enclosed by 3.5 cm (above) and 5 cm (below) slabs of solid water (1.045 g/cc). This was irradiated using rectangular (15.4 × 15.4 mm² to 53.8 × 53.7 mm²) and two irregular MLC fields. Radiochromic film (EBT3) was positioned perpendicular to the slabs and parallel to the beam. Calculated dose distributions were compared to film measurements using line scans and 2D gamma analysis.

Results

Measured and MC-calculated percent depth dose curves showed a characteristic dose drop within the low-density region, which was not correctly reproduced by FSPB. Superior average gamma pass rates (2%/1 mm) were found for MC (91.2 ± 1.5%) compared to FSPB (55.4 ± 2.7%). However, MC calculations exhibited localized anomalies at mass density transitions around 0.15 g/cc, which were traced to a simplification in electron transport. Absence of these anomalies was confirmed in a modified build of the MC engine, which increased gamma pass rates to 96.6 ± 1.2%.

Conclusions

The novel MC algorithm greatly improves dosimetric accuracy in heterogeneous tissue, potentially expanding the clinical use of robotic radiosurgery with MLC. In-depth, independent validation is paramount to identify and reduce the residual uncertainties in any software solution.
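The gamma analysis used to score agreement between film and calculation can be illustrated in a simplified 1D global form. The default tolerances mirror the 2%/1 mm criterion quoted in the abstract; the grids, function names, and the global normalization to the maximum measured dose are assumptions of this sketch:

```python
import numpy as np

def gamma_1d(x_meas, d_meas, x_calc, d_calc, dose_tol=0.02, dist_tol=1.0):
    """1D global gamma index. dose_tol is a fraction of the maximum
    measured dose; dist_tol is in the same units as x (e.g. mm).
    Returns one gamma value per measured point."""
    dd = dose_tol * d_meas.max()          # global dose criterion
    gammas = np.empty_like(d_meas)
    for i, (xm, dm) in enumerate(zip(x_meas, d_meas)):
        # Squared generalized distance to every calculated point;
        # gamma is the minimum over the calculated distribution.
        g2 = ((d_calc - dm) / dd)**2 + ((x_calc - xm) / dist_tol)**2
        gammas[i] = np.sqrt(g2.min())
    return gammas

def pass_rate(gammas):
    # Percentage of points with gamma <= 1 (the usual pass criterion).
    return 100.0 * np.mean(gammas <= 1.0)
```

For identical measured and calculated profiles every gamma is exactly zero and the pass rate is 100%; disagreements in dose or position then raise individual gamma values above 1, lowering the pass rate as in the FSPB-vs-MC comparison above.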

8.
We employ Monte Carlo simulations to investigate the interaction between an adsorbing linear flexible cationic polyelectrolyte and a ternary mixed fluid membrane containing neutral (phosphatidylcholine, PC), monovalent (phosphatidylserine, PS), and multivalent (phosphatidylinositol, PIP2) anionic lipids. We systematically explore the influences of polyelectrolyte chain length, polyelectrolyte charge density, polyelectrolyte total charge amount, and salt solution ionic strength on the static and dynamic properties of different anionic lipid species. Our results show that the multivalent PIP2 lipids dominate the polyelectrolyte–membrane interaction and competitively inhibit polyelectrolyte–PS binding. When the total charge amount of the polyelectrolyte is less than that of the local oppositely charged PIP2 lipids, the polyelectrolyte can drag the bound multivalent lipids to diffuse on the membrane, but cannot interact with the PS lipids. Under this condition, the diffusion behaviors of the polyelectrolyte closely follow the prediction of the Rouse model, and the polyelectrolyte chain properties determine the adsorption amount, concentration gradients, and hierarchical mobility of the bound PIP2 lipids. However, when the total charge amount of the polyelectrolyte is larger than that of the local PIP2 lipids, the polyelectrolyte further binds the PS lipids around the polyelectrolyte–PIP2 complex to achieve local electrical neutrality. In this condition, parts of the polyelectrolyte desorb from the membrane and show faster mobility, and the bound PS presents much faster mobility than the segregated PIP2. This work provides an explanation for heterogeneity formation in different anionic lipids induced by polyelectrolyte adsorption.

9.
Aim

The purpose of this study is to calculate the radiation dose around a brachytherapy source in a water phantom, for different seed locations or rotations of the source, by a matrix summation method.

Background

Monte Carlo based codes like MCNP are widely used for performing radiation transport calculations and dose evaluation in brachytherapy. But for complicated situations, such as using more than one source or moving or rotating a source, routine Monte Carlo dose calculation requires long run times.

Materials and methods

The MCNPX code was used to calculate the radiation dose around a 192Ir brachytherapy source, and the result was saved in a 3D matrix. We then used this matrix to evaluate, by matrix summation, the absorbed dose at any point due to several sources, or due to a source that has been shifted or rotated.

Results

Three-dimensional (3D) dose results and isodose curves are presented for a 192Ir source in a cubic water phantom, shifted over 10 steps and rotated by 45° and 90°, based on the matrix summation method. We also applied the method to several arrays of sources.

Conclusion

The matrix summation method can be used for 3D dose calculations for any brachytherapy source that has been moved or rotated. This simple method is very fast compared to routine Monte Carlo based methods. In addition, it can be applied to dose optimization studies.
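The matrix summation idea, shifting a single precomputed single-source dose matrix and summing the copies instead of rerunning the Monte Carlo transport, can be sketched in a few lines. Integer-voxel shifts and zero dose outside the precomputed grid are simplifying assumptions of this sketch; the function names are illustrative:

```python
import numpy as np

def shifted_dose(kernel, shift):
    """Shift a precomputed 3D dose matrix by an integer voxel offset,
    zero-filling voxels that move in from outside the grid (no wrap)."""
    out = np.zeros_like(kernel)
    src = tuple(slice(max(0, -s), kernel.shape[a] - max(0, s))
                for a, s in enumerate(shift))
    dst = tuple(slice(max(0, s), kernel.shape[a] + min(0, s))
                for a, s in enumerate(shift))
    out[dst] = kernel[src]
    return out

def total_dose(kernel, source_positions, center):
    """Total dose from several identical sources by matrix summation:
    the single-source kernel (scored with the source at `center`)
    is shifted to each source position and the copies are summed."""
    total = np.zeros_like(kernel)
    for pos in source_positions:
        shift = tuple(p - c for p, c in zip(pos, center))
        total += shifted_dose(kernel, shift)
    return total
```

Each additional source or source position then costs only an array shift and an addition, which is why the method is so much faster than rerunning the transport calculation for every configuration; rotations by 45° and 90° can be handled the same way with rotated copies of the kernel.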

10.
Abstract

We present a novel method to simulate phase equilibria in atomic and molecular systems. The method is a Molecular Dynamics version of the Gibbs-Ensemble Monte Carlo technique, which was developed some years ago for the direct simulation of phase equilibria in fluid systems. The idea is to have two separate simulation boxes which can exchange particles (or molecules) in a thermodynamically consistent fashion. Here we present the derivation of the generalized equations of motion and discuss the relation of the resulting trajectory averages to the relevant ensemble. We test this Gibbs-Ensemble Molecular Dynamics algorithm by applying it to an atomic and a molecular system, i.e. to the liquid-gas coexistence in a Lennard-Jones fluid and in n-hexane. In both cases our results are in good accord with previous mean field and Gibbs-Ensemble Monte Carlo results, as well as with the experimental data in the case of hexane. We also show that our Gibbs-Ensemble Molecular Dynamics algorithm, like other Molecular Dynamics techniques, can be used to study the dynamics of the system. Self-diffusion coefficients calculated with this method are in agreement with the results of conventional constant-temperature Molecular Dynamics.
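The particle-exchange move at the heart of the original Monte Carlo form of the Gibbs-Ensemble technique (which the MD version above generalizes) can be sketched for an ideal gas, where the insertion and removal energy changes vanish and the acceptance rule reduces to a ratio of counts and volumes. The box sizes, counts, and function names below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_exchange_step(n1, n2, v1, v2, du_dst=0.0, du_src=0.0, beta=1.0):
    """One particle-exchange trial of Gibbs-Ensemble Monte Carlo.
    A particle is moved from a randomly chosen source box to the other
    and accepted with probability
        min(1, (N_src * V_dst) / ((N_dst + 1) * V_src)
               * exp(-beta * (du_dst + du_src))),
    where du_dst / du_src are the insertion and removal energy changes
    (both zero for an ideal gas)."""
    if rng.random() < 0.5:
        src_n, dst_n, src_v, dst_v, to_box1 = n1, n2, v1, v2, False
    else:
        src_n, dst_n, src_v, dst_v, to_box1 = n2, n1, v2, v1, True
    if src_n == 0:
        return n1, n2                      # nothing to transfer
    acc = (src_n * dst_v) / ((dst_n + 1) * src_v) * np.exp(-beta * (du_dst + du_src))
    if rng.random() < acc:                 # rng.random() < acc implies min(1, acc)
        if to_box1:
            n1, n2 = n1 + 1, n2 - 1
        else:
            n1, n2 = n1 - 1, n2 + 1
    return n1, n2
```

For equal volumes and zero interaction energy, repeated exchange trials drive an initially unbalanced pair of boxes toward equal densities, the ideal-gas analogue of the thermodynamically consistent particle exchange described in the abstract.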

11.
傅煜, 雷渊才, 曾伟生. 《生态学报》 (Acta Ecologica Sinica), 2015, 35(23): 7738-7747
Using continuous observation data and biomass data from fixed Chinese fir (Cunninghamia lanceolata) plots in the systematic sampling framework of Jiangxi Province, the total above-ground biomass of Chinese fir in Jiangxi Province was estimated by repeatedly simulating, with the Monte Carlo method, the process of scaling single-tree biomass models up to regional above-ground biomass. Based on designs with different modeling sample sizes n and different coefficients of determination R², the effects of variability in the model parameters and in the model residuals on the uncertainty of the regional biomass estimate were studied separately. The results show that the estimated above-ground biomass of Chinese fir in Jiangxi Province in 2009 was (19.84 ± 1.27) t/hm², with the uncertainty accounting for about 6.41% of the estimate. The computation time needed for the biomass estimate and its uncertainty to stabilize decreased as the modeling sample size and R² increased; residual variability had a smaller effect on the uncertainty than parameter variability.

12.
We have studied the use of a new Monte Carlo (MC) chain generation algorithm, introduced by T. Garel and H. Orland [(1990) Journal of Physics A, Vol. 23, pp. L621–L626], for examining the thermodynamics of protein folding transitions and for generating candidate Cα backbone structures as starting points for a de novo protein structure paradigm. This algorithm, termed the guided replication Monte Carlo method, allows a rational approach to the introduction of known "native" folded characteristics as constraints in the chain generation process. We have shown this algorithm to be computationally very efficient in generating large ensembles of candidate Cα chains on the face-centered cubic lattice, and illustrate its use by calculating a number of thermodynamic quantities related to protein folding characteristics. In particular, we have used this static MC algorithm to compute such temperature-dependent quantities as the ensemble mean energy, the ensemble mean free energy, the heat capacity, and the mean-square radius of gyration. We also demonstrate the use of several simple "guide fields" for introducing protein-specific constraints into the ensemble generation process. Several extensions to our current model are suggested, and applications of the method to other folding-related problems are discussed. © 1995 John Wiley & Sons, Inc.

13.
How rare are magic squares? So far, the exact number of magic squares of order n is known only for n ≤ 5. For larger squares, we need statistical approaches to estimate the number. For this purpose, we formulated the problem as a combinatorial optimization problem and applied the multicanonical Monte Carlo method (MMC), which has been developed in the field of computational statistical physics. Among all possible arrangements of the numbers 1, 2, …, n² in an n × n square, the probability of finding a magic square decreases faster than exponentially with n. We estimated the number of magic squares for n ≤ 30. The number of magic squares for n = 30 was estimated to be 6.56(29) × 10²⁰⁵⁶, and the corresponding probability is as small as 10⁻²¹². Thus the MMC is effective for counting very rare configurations.
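For n = 3 the count is still accessible by exhaustive enumeration, which makes the scale of the problem concrete; the helper names below are illustrative:

```python
from itertools import permutations

def is_magic(sq, n):
    """Check whether a flat tuple of the numbers 1..n^2, read row by
    row, forms an n x n magic square (all rows, columns, and both
    diagonals sum to the magic constant n*(n^2+1)/2)."""
    s = n * (n * n + 1) // 2
    rows = all(sum(sq[i*n:(i+1)*n]) == s for i in range(n))
    cols = all(sum(sq[i + j*n] for j in range(n)) == s for i in range(n))
    diag1 = sum(sq[i*n + i] for i in range(n)) == s
    diag2 = sum(sq[i*n + (n - 1 - i)] for i in range(n)) == s
    return rows and cols and diag1 and diag2

def count_magic_exhaustive(n):
    """Exhaustive count over all (n^2)! arrangements - feasible only
    for n <= 3, which is exactly why rare-event sampling methods such
    as multicanonical MC are needed for larger n."""
    return sum(is_magic(p, n) for p in permutations(range(1, n*n + 1)))
```

`count_magic_exhaustive(3)` returns 8 (one essentially different square times its 8 rotations and reflections) out of 9! = 362,880 arrangements, a probability of about 2.2 × 10⁻⁵; already for n = 4 the 16! ≈ 2 × 10¹³ arrangements put enumeration out of reach, motivating the multicanonical approach.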

14.
A Markov chain Monte Carlo (MCMC) algorithm to sample an exchangeable covariance matrix, such as that of the error terms (R0) in a multiple-trait animal model with missing records under normal-inverted Wishart priors, is presented. The algorithm (FCG) is based on a conjugate form of the inverted Wishart density that avoids sampling the missing error terms. Normal prior densities are assumed for the "fixed" effects and breeding values, whereas the covariance matrices are assumed to follow inverted Wishart distributions. The inverted Wishart prior for the environmental covariance matrix is a product density over all patterns of missing data. The resulting MCMC scheme eliminates the correlation between the sampled missing residuals and the sampled R0, which in turn decreases the total number of samples needed to reach convergence. The use of the FCG algorithm on a multiple-trait data set with an extreme pattern of missing records produced a dramatic reduction in the autocorrelations among samples for all lags from 1 to 50; this increased the effective sample size by a factor of 2.5 to 7 and reduced the number of samples needed to attain convergence, when compared with the "data augmentation" algorithm.

15.
Abstract

Polyampholyte copolymers containing both positive and negative monomers regularly dispersed along the chain were studied. The Monte Carlo method was used to simulate chains with charged monomers interacting through a screened Coulomb potential. The neutral polyampholyte chains collapse due to the attractive electrostatic interactions. The non-neutral chains adopt extended conformations because the repulsive polyelectrolyte effects dominate the attractive polyampholyte interactions. The results are in good agreement with experiment.

16.
Introduction

The Monte Carlo technique is widely used and recommended for including uncertainties in LCA. Typically, 1,000 or 10,000 runs are done, but a clear argument for that number is not available, and with the growing size of LCA databases, an excessively high number of runs may become prohibitively time-consuming. We therefore investigate whether a large number of runs is useful, or whether it might be unnecessary or even harmful.

Probability theory

We review the standard theory of probability distributions for describing stochastic variables, including the combination of different stochastic variables in a calculation. We also review the standard theory of inferential statistics for estimating a probability distribution given a sample of values. For estimating the distribution of a function of probability distributions, two major techniques are available: analytical, applying probability theory, and numerical, using Monte Carlo simulation. Because the analytical technique is often unavailable, the obvious way out is Monte Carlo. However, we demonstrate and illustrate that it leads to overly precise conclusions on the values of estimated parameters, and to incorrect hypothesis tests.

Numerical illustration

We demonstrate the effect for two simple cases: one system in a stand-alone analysis, and a comparative analysis of two alternative systems. Both cases illustrate that statistical hypotheses that should not be rejected are in fact rejected in a highly convincing way, pointing out a fundamental flaw.

Discussion and conclusions

Apart from the obvious recommendation to use larger samples for estimating input distributions, we suggest restricting the number of Monte Carlo runs to a number not greater than the sample sizes used for the input parameters. As a final note, when the input parameters are not estimated from samples but through a procedure, such as the popular pedigree approach, the Monte Carlo approach should not be used at all.
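The core objection, that extra Monte Carlo runs shrink the apparent standard error without adding any real information, can be demonstrated in a few lines. The input sample size of 10 and the normal input distribution are illustrative assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_se_of_mean(sample, n_runs):
    """Standard error of the Monte Carlo mean when drawing n_runs values
    from a normal distribution fitted to `sample`. This shrinks as
    1/sqrt(n_runs) even though the underlying knowledge - 10 real
    observations - never improves."""
    fitted = rng.normal(sample.mean(), sample.std(ddof=1), size=n_runs)
    return fitted.std(ddof=1) / np.sqrt(n_runs)

sample = rng.normal(10.0, 2.0, size=10)              # only 10 real observations
real_se = sample.std(ddof=1) / np.sqrt(len(sample))  # honest uncertainty of the mean
```

With 100,000 runs the Monte Carlo standard error is roughly 100 times smaller than the honest standard error implied by the 10 observations, so confidence intervals and hypothesis tests built on the Monte Carlo output become spuriously precise, which is the flaw the illustration above exposes.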


17.
Abstract

Monte Carlo simulations of water in the NVT ensemble using three models (SPC, TIP4P and TIPS2) are reported. The internal energy, dielectric constant, and site-site radial distribution functions of liquid water (temperature 300 K, mass density 1 g cm⁻³) were calculated and compared with experiment. It was found that of the three intermolecular potential models, SPC gives the best dielectric constant. Since SPC also yields acceptable results for the energy and structure, it is judged to be the best among the three models studied.

18.
Introduction

The increased radioresistance of hypoxic cells compared to well-oxygenated cells is quantified by the oxygen enhancement ratio (OER). In this study we created a FLUKA Monte Carlo based tool for the inclusion of both the OER and the relative biological effectiveness (RBE) in biologically weighted dose (ROWD) calculations in proton therapy, and applied it to explore the impact of hypoxia.

Methods

The RBE-weighted dose was adapted for hypoxia by making the RBE model parameters dependent on the OER, in addition to the linear energy transfer (LET). The OER depends on the partial oxygen pressure (pO2) and the LET. To demonstrate model performance, calculations were done with spread-out Bragg peaks (SOBP) in water phantoms with pO2 ranging from strongly hypoxic to normoxic (0.01–30 mmHg), and with a head and neck cancer proton plan optimized with an RBE of 1.1 and pO2 estimated voxel-by-voxel using [18F]-EF5 PET. An RBE of 1.1 and the Rørvik RBE model were used for the ROWD calculations.

Results

The SOBP in water showed decreasing ROWD with decreasing pO2. In the plans accounting for oxygenation, the median target doses were approximately a factor of 1.1 lower than in the corresponding plans that did not consider the OER. Hypoxia-adapted target ROWDs were considerably more heterogeneous than the RBE1.1-weighted doses.

Conclusion

We realized a Monte Carlo based tool for calculating the ROWD. Read-in of patient pO2 and estimation of the ROWD with flexibility in the choice of RBE model were achieved, giving a tool that may be useful in future clinical applications of hypoxia-guided particle therapy.

19.
Abstract

A novel Monte Carlo simulation method, the Dual Ensemble Monte Carlo (DEMC) method, is developed for the investigation of membrane separation processes. In this method a spatial combination of grand canonical MC and canonical MC techniques is employed. The DEMC method can be used to calculate the separation factor at a specific chemical potential gradient. First, the accuracy of the DEMC method is checked by generating a gas density gradient between two reservoir regions. Thereafter, we apply the method to CO2/N2 gas separation by inorganic membranes and calculate the dependence of the separation factor on the micropore size of the membranes.

20.
Purpose

A reliable model to simulate nuclear interactions is fundamental for ion therapy. We have already shown how BLOB ("Boltzmann-Langevin One Body"), a model developed to simulate heavy-ion interactions up to a few hundred MeV/u, can also simulate 12C reactions in the same energy domain. However, its computation time is too long for any medical application. For this reason we present the possibility of emulating it with a deep learning algorithm.

Methods

The BLOB final state is a probability density function (PDF) of finding a nucleon at a given position of the phase space. We discretised this PDF and trained a variational auto-encoder (VAE) to reproduce the discrete PDF. As a proof of concept, we developed and trained a VAE to emulate BLOB in simulating the interactions of 12C with 12C at 62 MeV/u. To gain more control over the generation, we forced the VAE latent space to be organised with respect to the impact parameter (b) by training a classifier of b jointly with the VAE.

Results

The distributions obtained from the VAE are similar to the input ones, and the computation time needed to use the VAE as a generator is negligible.

Conclusions

We show that it is possible to use a deep learning approach to emulate a model developed to simulate nuclear reactions in the energy range of interest for ion therapy. We foresee implementing the generation part in C++ and interfacing it with the most widely used Monte Carlo toolkit, Geant4.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号