Similar Literature
20 similar articles found (search time: 31 ms)
1.
Twelve terrestrial and marine studies were conducted at various sites in Malaysia, Brazil, and the United States between April 1999 and February 2004. These data were analyzed using five density estimate techniques for stationary (non-motile) organisms including Stratified Random Sampling, Point-Center Quarter, Third Nearest Object, Weinberg, and Strong. The Strong method gave the most accurate density estimates of stationary animals and plants. Stratified Random Sampling ranked second best and the Third Nearest Object the third best. Belt or strip transects may be preferable but can be restrictive in some situations because of logistics and associated time constraints. Straight line measurements on reefs were 3–27% more accurate than reef slack line and reef contour measurements. Most study areas measured with the standardized Morisita index of dispersion were moderately aggregated. Results from the Third Nearest Object and Point-Center Quarter techniques indicate that the addition of more data to establish a density correction factor does not necessarily give more accurate estimates of density.
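The Point-Center Quarter method named above can be sketched as follows, assuming the classical Cottam–Curtis form in which density is estimated as 1 over the squared mean of the point-to-nearest-organism distances taken in four quadrants around each sample point; the function names and the simulated unit-square population are illustrative, not from the study.

```python
import math
import random

def pcq_density(points, sample_points):
    """Point-Center Quarter estimate: at each sample point, record the
    distance to the nearest organism in each of the four quadrants;
    density is estimated as 1 / (mean distance)^2 (Cottam-Curtis form)."""
    distances = []
    for sx, sy in sample_points:
        nearest = [None, None, None, None]  # NE, NW, SW, SE
        for px, py in points:
            dx, dy = px - sx, py - sy
            if dx == 0 and dy == 0:
                continue
            q = (0 if dx >= 0 else 1) if dy >= 0 else (3 if dx >= 0 else 2)
            d = math.hypot(dx, dy)
            if nearest[q] is None or d < nearest[q]:
                nearest[q] = d
        # quadrants with no organism (possible near the plot edge) are skipped
        distances.extend(d for d in nearest if d is not None)
    mean_d = sum(distances) / len(distances)
    return 1.0 / mean_d ** 2

random.seed(1)
true_density = 200.0  # organisms per unit area, simulated
pts = [(random.random(), random.random()) for _ in range(200)]
samples = [(random.random(), random.random()) for _ in range(50)]
estimate = pcq_density(pts, samples)
```

For a random (Poisson-like) pattern the estimate should land near the true density; edge effects in a bounded plot bias it somewhat low, which is one reason correction factors such as those discussed in the abstract are studied.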

2.
Abstract

The principal purpose of this paper is to demonstrate the use of the Inverse Monte Carlo technique for calculating pair interaction energies in monatomic liquids from a given equilibrium property. This method is based on the mathematical relation between transition probability and pair potential given by the fundamental equation of the “importance sampling” Monte Carlo method. In order to have well-defined conditions for the test of the Inverse Monte Carlo method, a Metropolis Monte Carlo simulation of a Lennard-Jones liquid is carried out to give the equilibrium pair correlation function determined by the assumed potential. Because an equilibrium configuration is a prerequisite for an Inverse Monte Carlo simulation, a model system is generated that reproduces the pair correlation function calculated by the Metropolis Monte Carlo simulation and therefore represents the system in thermal equilibrium. This configuration is used to simulate virtual atom displacements. The resulting changes in atom distribution for each single simulation step are inserted into a set of non-linear equations defining the transition probability for the virtual change of configuration. Solving the set of equations for the pair interaction energies recovers the Lennard-Jones potential by which the equilibrium configuration was determined.
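The "importance sampling" acceptance rule underlying the forward Metropolis simulation can be sketched on the smallest possible Lennard-Jones system: a single pair separation sampled at fixed temperature. This is a toy sketch, not the paper's liquid simulation; the reduced units, temperature, and hard bounds on the separation are assumptions made so the one-dimensional example is well defined.

```python
import math
import random

def lj(r, eps=1.0, sigma=1.0):
    """Lennard-Jones pair potential V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

def metropolis_pair(beta, steps, step_size=0.1, seed=0):
    """Metropolis sampling of the separation r of one LJ pair:
    accept a trial displacement with probability min(1, exp(-beta * dE)).
    The separation is confined to (0.5, 2.5) so the toy problem has a
    normalizable equilibrium distribution."""
    rng = random.Random(seed)
    r, e = 1.1, lj(1.1)
    samples = []
    for _ in range(steps):
        r_new = r + rng.uniform(-step_size, step_size)
        if 0.5 < r_new < 2.5:
            e_new = lj(r_new)
            if e_new <= e or rng.random() < math.exp(-beta * (e_new - e)):
                r, e = r_new, e_new
        samples.append(r)  # rejected moves repeat the current state
    return samples

samples = metropolis_pair(beta=5.0, steps=20000)
mean_r = sum(samples) / len(samples)
```

At low temperature the sampled separation concentrates near the potential minimum at 2^(1/6) sigma, which is the kind of equilibrium structure the Inverse Monte Carlo step then tries to invert back into a potential.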

3.
Network reliability is an important index for measuring the dependability of large-scale networks, and this paper addresses reliability evaluation methods for complex networks. The Monte Carlo (MC) method is studied: the general principle of MC simulation and the MC-based reliability evaluation approach are introduced. Because sampling is central to Monte Carlo simulation, random variables and several kinds of discrete distributions are also reviewed. A novel reliability evaluation method based on the Monte Carlo method is then proposed. To evaluate network reliability efficiently, the proposed method generates time-pointers for the arc failure events, constructs an event table for the complex network, updates the network states, and draws samples from a geometric distribution. The precision and unbiasedness of the reliability estimates are discussed. Furthermore, a series of numerical experiments compares the efficiency of crude Monte Carlo (CMC) and other traditional methods under the same experimental conditions.
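The crude Monte Carlo (CMC) baseline that the proposed method is compared against can be sketched directly: sample each arc up or down, test source-to-sink connectivity, and average. The five-arc bridge network, arc reliability 0.9, and trial count below are illustrative choices, not taken from the paper.

```python
import random
from collections import deque

ARCS = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]  # classic bridge network, s=0, t=3

def connected(up_arcs, source=0, sink=3):
    """Breadth-first search over the arcs that are currently working."""
    adj = {}
    for u, v in up_arcs:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        if node == sink:
            return True
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

def crude_mc_reliability(p, trials, seed=42):
    """Crude Monte Carlo: sample each arc up with probability p and count
    the fraction of sampled network states in which source and sink connect."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        state = [arc for arc in ARCS if rng.random() < p]
        hits += connected(state)
    return hits / trials

est = crude_mc_reliability(0.9, 20000)
```

For this bridge network the exact two-terminal reliability at p = 0.9 is about 0.9785 (by conditioning on the bridge arc), so the estimate should fall close to that; the event-table and geometric-sampling machinery in the abstract exists to reach such answers with far fewer state evaluations.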

4.
Three numerical techniques for generating thermally accessible configurations of globular proteins are considered; these techniques are the molecular dynamics method, the Metropolis Monte Carlo method, and a modified Monte Carlo method which takes account of the forces acting on the protein atoms. The molecular dynamics method is shown to be more efficient than either of the Monte Carlo methods. Because it may be necessary to use Monte Carlo methods in certain important types of sampling problems, the behavior of these methods is examined in some detail. It is found that an acceptance ratio close to 1/6 yields optimum efficiency for the Metropolis method, in contrast to what is often assumed. This result, together with the overall inefficiency of the Monte Carlo methods, appears to arise from the anisotropic forces acting on the protein atoms due to their covalent bonding. Possible ways of improving the Monte Carlo methods are suggested.
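The quantity being tuned in that study, the Metropolis acceptance ratio, is controlled by the trial step size. A minimal sketch on a standard normal target (an assumption; the paper's target is a protein energy surface) shows the trade-off: small steps are almost always accepted, large steps rarely.

```python
import math
import random

def metropolis_acceptance(step_size, steps=20000, seed=7):
    """Fraction of accepted Metropolis moves for a standard normal target.
    Tuning step size to hit a target acceptance ratio is the knob studied
    in the abstract (which argues ~1/6 is optimal for their problem,
    against the commonly assumed higher values)."""
    rng = random.Random(seed)
    x, accepted = 0.0, 0
    for _ in range(steps):
        x_new = x + rng.uniform(-step_size, step_size)
        # log-density of N(0,1) up to a constant is -x^2/2
        if rng.random() < math.exp(min(0.0, (x * x - x_new * x_new) / 2.0)):
            x = x_new
            accepted += 1
    return accepted / steps

small = metropolis_acceptance(0.2)   # tiny steps: high acceptance, slow exploration
large = metropolis_acceptance(8.0)   # huge steps: low acceptance, wasted proposals
```

The optimum acceptance ratio balances these two failure modes, and the abstract's point is that where the optimum lies depends on the anisotropy of the target.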

5.
Monte Carlo studies of the unperturbed amylosic chain conformation have been carried out in the approximation of separable chain configuration energies. Sample chains of arbitrary chain length have been generated so as to be distributed consistently with refined estimates of the configuration energy and thus suitable for evaluation of averages of the desired configuration-dependent properties. Perspective drawings of representative chains from the Monte Carlo sample have been made for comparison with standard idealizations of amylosic chain conformation. The molecular model employed generates a randomly coiling chain possessing perceptible regions of left-handed pseudohelical backbone trajectory. Distribution functions for the end-to-end distance of short amylosic chains disclose some propensity for the chain to suffer self-intersections at short range in the chain sequence, which may vitiate the usual amylosic chain models based on the assumed independence of sets of glycosidic linkage torsion angles. The amylosic persistence vector and persistence length have been calculated as a function of chain length for the chain model employed.
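The simplest of the "standard idealizations" such sample chains are compared against is the freely-jointed chain, for which the mean squared end-to-end distance is exactly n·b². The sketch below generates such chains by Monte Carlo; it is a generic random-coil model, not the torsion-angle amylose model of the abstract.

```python
import math
import random

def random_chain(n_bonds, bond_length=1.0, rng=random):
    """Freely-jointed chain: each bond points in a uniformly random
    direction on the unit sphere.  Returns the list of backbone coordinates."""
    x = y = z = 0.0
    coords = [(0.0, 0.0, 0.0)]
    for _ in range(n_bonds):
        # uniform direction: z-component uniform in [-1, 1], azimuth uniform
        cz = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        s = math.sqrt(1.0 - cz * cz)
        x += bond_length * s * math.cos(phi)
        y += bond_length * s * math.sin(phi)
        z += bond_length * cz
        coords.append((x, y, z))
    return coords

rng = random.Random(3)
n, trials = 50, 2000
r2 = []
for _ in range(trials):
    end = random_chain(n, rng=rng)[-1]
    r2.append(end[0] ** 2 + end[1] ** 2 + end[2] ** 2)
mean_r2 = sum(r2) / trials  # theory for this model: <R^2> = n * b^2 = 50
```

Distribution functions of the end-to-end distance, like those examined in the abstract, are obtained by histogramming the same Monte Carlo sample.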

6.
Purpose: The main focus of the current paper is the clinical implementation of a Monte Carlo based platform for treatment plan validation for Tomotherapy and Cyberknife, without adding additional tasks to the dosimetry department.
Methods: The Monte Carlo platform consists of C++ classes for the actual functionality and a web-based GUI that allows accessing the system using a web browser. Calculations are based on BEAMnrc/DOSXYZnrc and/or GATE and are performed automatically after exporting the DICOM data from the treatment planning system. For Cyberknife treatments of moving targets, the log files saved during the treatment (position of the robot, internal fiducials, and external markers) can be used in combination with the 4D planning CT to reconstruct the actually delivered dose. The Monte Carlo platform is also used for calculation on MRI images, using pseudo-CT conversion.
Results: For Tomotherapy treatments we obtain an excellent agreement (within 2%) for almost all cases. However, we were able to detect a problem regarding the CT Hounsfield unit definition of the Toshiba Large Bore CT when using a large reconstruction diameter. For Cyberknife treatments we obtain an excellent agreement with the Monte Carlo algorithm of the treatment planning system. For some extreme cases, when treating small lung lesions in low-density lung tissue, small differences are obtained due to the different cut-off energy of the secondary electrons.
Conclusions: A Monte Carlo based treatment plan validation tool has successfully been implemented in clinical routine and is used to systematically validate all Cyberknife and Tomotherapy plans.

7.
A formal partially dynamical approach to ergodic sampling, hybrid Monte Carlo, has been adapted for the first time from its proven application in quantum chromodynamics to realistic molecular systems. A series of simulations of pancreatic trypsin inhibitor were run using temperature-rescaled molecular dynamics and hybrid Monte Carlo. It was found that simulations run using hybrid Monte Carlo equilibrated an order of magnitude faster than those run using temperature-rescaled molecular dynamics. Certain aspects of improved performance obtained using hybrid Monte Carlo are probably due to the increased efficiency with which this algorithm explores phase space. To discuss this we introduce the notion of “trajectory stiffness”. © 1993 John Wiley & Sons, Inc.
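Hybrid Monte Carlo combines a momentum draw, a short Hamiltonian trajectory integrated with the leapfrog scheme, and a Metropolis accept/reject on the total-energy error. A minimal sketch on a standard normal target (U(x) = x²/2; a toy analogue, not the protein system of the abstract, and the step size and trajectory length are arbitrary choices):

```python
import math
import random

def hmc_gaussian(iters=5000, eps=0.2, leapfrog_steps=10, seed=11):
    """Hybrid Monte Carlo for a standard normal target.
    Each iteration: sample momentum, leapfrog-integrate dx/dt = p,
    dp/dt = -dU/dx = -x, then accept with probability min(1, exp(-dH))."""
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(iters):
        p = rng.gauss(0.0, 1.0)
        x_new, p_new = x, p
        p_new -= 0.5 * eps * x_new          # initial half-step for momentum
        for step in range(leapfrog_steps):
            x_new += eps * p_new             # full position step
            if step != leapfrog_steps - 1:
                p_new -= eps * x_new         # full momentum step (interior)
        p_new -= 0.5 * eps * x_new          # final half-step for momentum
        h_old = 0.5 * p * p + 0.5 * x * x
        h_new = 0.5 * p_new * p_new + 0.5 * x_new * x_new
        if rng.random() < math.exp(min(0.0, h_old - h_new)):
            x = x_new
        samples.append(x)
    return samples

samples = hmc_gaussian()
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Because whole trajectories rather than single-coordinate moves are proposed, successive samples decorrelate quickly, which is the mechanism behind the order-of-magnitude faster equilibration the abstract reports.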

8.
Abstract

A bulk Lennard-Jones fluid was simulated using the grand canonical Monte Carlo method. Three different sampling methods were used in the transition matrix, namely the Metropolis, Barker and a third novel method. While it can be shown that the Metropolis method will give the most accurate ensemble averages in the limit of an infinitely long run, the new method termed “Modified Barker Sampling” (MBS), is shown to be superior for the runs of practical length for the particular system studied.
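The two classical transition-matrix choices compared above differ only in the acceptance probability: Metropolis uses min(1, e^(-ΔE)) and Barker uses 1/(1 + e^(ΔE)) (in units of kT). Both leave the target distribution invariant; Barker accepts strictly less often. A sketch on a one-dimensional normal target (an assumption for illustration; the paper's system is a grand canonical LJ fluid):

```python
import math
import random

def sample_normal(rule, steps=20000, step_size=1.0, seed=5):
    """Random-walk sampler for N(0,1) using either the Metropolis rule
    min(1, e^-dE) or the Barker rule 1/(1 + e^dE), the two transition-
    matrix choices compared in the abstract.  Returns (variance of the
    chain, acceptance ratio)."""
    rng = random.Random(seed)
    x, xs, accepted = 0.0, [], 0
    for _ in range(steps):
        x_new = x + rng.uniform(-step_size, step_size)
        d_e = (x_new * x_new - x * x) / 2.0   # "energy" of N(0,1) is x^2/2
        if rule == "metropolis":
            p_acc = min(1.0, math.exp(-d_e))
        else:                                  # Barker
            p_acc = 1.0 / (1.0 + math.exp(d_e))
        if rng.random() < p_acc:
            x = x_new
            accepted += 1
        xs.append(x)
    mean = sum(xs) / steps
    var = sum((v - mean) ** 2 for v in xs) / steps
    return var, accepted / steps

var_m, acc_m = sample_normal("metropolis")
var_b, acc_b = sample_normal("barker")
```

Both chains reproduce the unit target variance; the interest of modified schemes like MBS is in how fast the averages converge at practical run lengths, not in the limiting answer.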

9.
J. Nedelman, Biometrics, 1983, 39(4):1009–1020
Sampling models are investigated for counts of mosquitoes from a malaria field survey conducted by the World Health Organization in Nigeria. The data can be described by a negative binomial model for two-way classified count data, where the cell means are constrained to satisfy row-by-column independence and the parameter k is constant across rows. An algorithm, based on iterative proportional fitting, is devised for finding maximum likelihood estimates. Sampling properties of the estimates and likelihood-ratio statistics for the small sample sizes of the data are investigated by Monte Carlo experiments. The WHO reported an observation that the relative efficiencies of four trapping methods vary over time. Out of eight villages in the survey area, this observation is found to be true in only the one village that is near a swamp.
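The negative binomial model with dispersion parameter k can be sampled as a gamma-Poisson mixture, which is also how such Monte Carlo experiments are typically driven. The sketch below checks the defining moments E[X] = μ and Var[X] = μ + μ²/k; the parameter values are illustrative, not those fitted to the mosquito data.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's multiplication method for a Poisson draw (fine for small lambda)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def neg_binomial(mu, k, rng):
    """Negative binomial count via the gamma-Poisson mixture:
    lambda ~ Gamma(shape=k, scale=mu/k), then X | lambda ~ Poisson(lambda).
    Gives E[X] = mu and Var[X] = mu + mu^2/k (overdispersed counts)."""
    lam = rng.gammavariate(k, mu / k)
    return poisson(lam, rng)

rng = random.Random(9)
mu, k, n = 3.0, 2.0, 20000
draws = [neg_binomial(mu, k, rng) for _ in range(n)]
mean = sum(draws) / n
var = sum((d - mean) ** 2 for d in draws) / n   # theory: 3 + 9/2 = 7.5
```

The overdispersion (variance exceeding the mean) is exactly what makes the negative binomial preferable to the Poisson for aggregated mosquito counts.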

10.
11.
Four computational methods for estimating mean fecundity are compared by Monte Carlo simulation. One of the four methods is the simple expedient of estimating fecundity at sample mean length, a method known to be downwardly biased. The Monte Carlo study shows that the other three methods reduce bias and provide worthwhile efficiency gains. For small samples, the most efficient of the four methods is a 'bias adjustment', proposed here, that uses easily calculated sample statistics. For large samples, a numerical integration method has the highest efficiency. The fourth method, a 'direct summation' procedure which can be done easily in many statistical or spreadsheet programs, performs well for all sample sizes.
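The downward bias of "fecundity at sample mean length" and the behavior of "direct summation" follow from Jensen's inequality when fecundity is a convex function of length. The sketch below assumes an illustrative relation F(L) = a·L³ and a normal length distribution; neither is specified in the abstract.

```python
import random

def fecundity(length, a=0.01, b=3):
    """Illustrative convex length-fecundity relation F(L) = a * L**b
    (the exact relation used in the paper is not given here)."""
    return a * length ** b

rng = random.Random(13)
lengths = [rng.gauss(50.0, 10.0) for _ in range(5000)]
mean_length = sum(lengths) / len(lengths)

# method 1: fecundity evaluated at the sample mean length (downwardly biased)
naive = fecundity(mean_length)
# method 2: 'direct summation' -- average F over every sampled length
direct = sum(fecundity(L) for L in lengths) / len(lengths)
```

For these parameters the theoretical values are E[F(L)] = a(μ³ + 3μσ²) = 1400 versus F(μ) = 1250, so direct summation should sit clearly above the naive estimate, illustrating the bias the paper's adjustment corrects.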

12.
It has been 10 years since the publication of the relative risk model (RRM) for regional scale ecological risk assessment. The approach has since been used successfully for a variety of freshwater, marine, and terrestrial environments in North America, South America, and Australia. During this period the types of stressors have been expanded to include more than contaminants. Invasive species, habitat loss, stream alteration and blockage, temperature, change in land use, and climate have been incorporated into the assessments. Major developments in the RRM have included the extensive use of geographical information systems, uncertainty analysis using Monte Carlo techniques, and its application to retrospective assessments to determine causation. The future uses of the RRM include assessments for forestry and conservation management, an increasing use in invasive species evaluation, and in sustainability. Developments in risk communication, the use of Bayesian approaches, and in uncertainty analyses are on the horizon.  相似文献   

13.
A Monte Carlo model has been developed to support the design of a 180° geometry x-ray fluorescence system for the measurement of cadmium concentration in deep body organs such as the kidney. 133Xe was investigated as the excitation photon source. A total of 15×10⁶ simulated incident photons were used. Monte Carlo simulations were performed using the EGS4 Monte Carlo code system. The results showed that for distances between the skin and the kidney surface of 30–60 mm, cadmium concentrations of 15–60 μg/g kidney tissue could easily be detected. The mean skin and kidney doses during such measurements were estimated to be 8 and 0.9 mGy, respectively. Received: 1 June 1999 / Accepted: 10 February 2000
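The elementary building block of photon-transport codes such as EGS4 is sampling the free path to the next interaction from the exponential attenuation law. A minimal sketch (the attenuation coefficient below is an illustrative number, not a measured tissue value):

```python
import math
import random

def fraction_reaching(depth_mm, mu_per_mm, n_photons, seed=17):
    """Sample each photon's free path from s = -ln(U)/mu and count how
    many travel past the given depth before their first interaction.
    The surviving fraction should match the analytic law exp(-mu * d)."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_photons):
        path = -math.log(1.0 - rng.random()) / mu_per_mm  # U in (0, 1]
        if path > depth_mm:
            survived += 1
    return survived / n_photons

frac = fraction_reaching(depth_mm=30.0, mu_per_mm=0.1, n_photons=100000)
# analytic answer for these assumed values: exp(-3) ~ 0.0498
```

Full simulations like the one in the abstract chain this depth sampling with sampled interaction types, scattering angles, and fluorescence emission, which is why millions of photon histories are needed.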

14.
Chemical synaptic transmission involves the release of a neurotransmitter that diffuses in the extracellular space and interacts with specific receptors located on the postsynaptic membrane. Computer simulation approaches provide fundamental tools for exploring various aspects of the synaptic transmission under different conditions. In particular, Monte Carlo methods can track the stochastic movements of neurotransmitter molecules and their interactions with other discrete molecules, the receptors. However, these methods are computationally expensive, even when used with simplified models, preventing their use in large-scale and multi-scale simulations of complex neuronal systems that may involve large numbers of synaptic connections. We have developed a machine-learning based method that can accurately predict relevant aspects of the behavior of synapses, such as the percentage of open synaptic receptors as a function of time since the release of the neurotransmitter, with considerably lower computational cost compared with the conventional Monte Carlo alternative. The method is designed to learn patterns and general principles from a corpus of previously generated Monte Carlo simulations of synapses covering a wide range of structural and functional characteristics. These patterns are later used as a predictive model of the behavior of synapses under different conditions without the need for additional computationally expensive Monte Carlo simulations. This is performed in five stages: data sampling, fold creation, machine learning, validation and curve fitting. The resulting procedure is accurate, automatic, and it is general enough to predict synapse behavior under experimental conditions that are different to the ones it has been trained on. 
Since our method efficiently reproduces the results that can be obtained with Monte Carlo simulations at a considerably lower computational cost, it is suitable for the simulation of high numbers of synapses and it is therefore an excellent tool for multi-scale simulations.

15.
The space in the unit cell of a metmyoglobin crystal not occupied by myoglobin atoms was filled with water using Monte Carlo calculations. Independent calculations with different amounts of water have been performed. Structure factors were calculated using the water coordinates thus obtained and the known coordinates of the myoglobin atoms. A comparison with experimental structure factors showed that both the low and the high resolution regime could be well reproduced with 814 Monte Carlo water molecules per unit cell with a B-value of 50 Å². The Monte Carlo water molecules yield a smaller standard R-value (0.166) than using a homogeneous electron density for the simulation of the crystal water (R = 0.212). A reciprocal space refinement of the water and the protein coordinates has been performed. Monte Carlo calculations can be used to obtain information for crystallographically invisible parts of the unit cell and yield better coordinates for the visible part in the refinement. Correspondence to: F. Parak

16.
Z. Lu, Y. V. Hui, A. H. Lee, Biometrics, 2003, 59(4):1016–1026
Minimum Hellinger distance estimation (MHDE) has been shown to discount anomalous data points in a smooth manner with first-order efficiency for a correctly specified model. An estimation approach is proposed for finite mixtures of Poisson regression models based on MHDE. Evidence from Monte Carlo simulations suggests that MHDE is a viable alternative to the maximum likelihood estimator when the mixture components are not well separated or the model parameters are near zero. Biometrical applications also illustrate the practical usefulness of the MHDE method.

17.
Book Reviews     
C. D. Kemp, A. W. Kemp, Biometrics, 1999, 55(2):666–670
Books reviewed in this article:
GENTLE, J. E. Numerical Linear Algebra for Applications in Statistics.
GENTLE, J. E. Random Number Generation and Monte Carlo Methods.
LOTKA, A. J. Analytical Theory of Biological Populations

18.
In our recent series of papers, we have used the structures of statistical significance from Monte Carlo simulations to improve the predictions of secondary structure of RNA and to analyze the possible role of locally significant structures in the life cycle of human immunodeficiency virus. Because of the intensive computational requirements for Monte Carlo simulation, it becomes impractical, even using a supercomputer, to assess the significance of a structure with a window size > 200 along an RNA sequence of 1000 bases or more. In this paper, we have developed a new procedure that drastically reduces the time needed to assess the significance of structures. In fact, the efficiency of this new method allows us to assess structures on the VAX as well as the CRAY. Received on May 11, 1989; accepted on August 22, 1989
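The shuffle-based Monte Carlo significance test underlying such analyses can be sketched generically: score the native sequence, score many composition-preserving shuffles, and express the native score as a z-score against that null distribution. The pairing score below is a deliberately crude hairpin proxy (the real work scores predicted folding energies, which are not reproduced here), and the test sequence is invented.

```python
import random
import statistics

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def pairing_score(seq):
    """Toy structure statistic: number of complementary base pairs when
    the sequence is folded back on itself at its midpoint."""
    n = len(seq)
    return sum(COMPLEMENT[seq[i]] == seq[n - 1 - i] for i in range(n // 2))

def shuffle_z_score(seq, n_shuffles=500, seed=21):
    """Monte Carlo significance: compare the native score against scores
    of base-composition-preserving shuffles, expressed as a z-score."""
    rng = random.Random(seed)
    native = pairing_score(seq)
    bases = list(seq)
    null_scores = []
    for _ in range(n_shuffles):
        rng.shuffle(bases)
        null_scores.append(pairing_score("".join(bases)))
    mu = statistics.fmean(null_scores)
    sd = statistics.stdev(null_scores)
    return (native - mu) / sd

# a designed hairpin-like sequence should stand out from its shuffles
z = shuffle_z_score("G" * 10 + "A" * 6 + "C" * 10)
```

The cost of this scheme is n_shuffles scoring passes per window, which is exactly the burden the paper's faster procedure attacks.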

19.
The MC dynamics of an off-lattice all-atom protein backbone model with rigid amide planes are studied. The only degrees of freedom are the dihedral angle pairs of the C-atoms. Conformational changes are generated by Monte Carlo (MC) moves. The MC moves considered are single rotations (simple moves, SMs), giving rise to global conformational changes, or, alternatively, cooperative rotations in a window of amide planes (window moves, WMs), generating local conformational changes in the window. Outside the window the protein conformation is kept invariant by constraints. These constraints produce a bias in the distribution of dihedral angles. The WMs are corrected for this bias by suitable Jacobians. The energy function used is derived from the CHARMM force field. In a first application to polyalanine it is demonstrated that WMs sample the conformational space more efficiently than SMs.
Abbreviations: CPU, Central Processing Unit; MC, Monte Carlo; MCD, Monte Carlo Dynamics; MD, Molecular Dynamics; RMS, Root-Mean-Square; RMSD, Root-Mean-Square Deviation; SM, Simple Move; WM, Window Move

20.
Sampling from a finite population on multiple occasions introduces dependencies between the successive samples when overlap is designed. Such sampling designs lead to efficient statistical estimates, while they allow estimating changes over time for the targeted outcomes. This makes them very popular in real-world statistical practice. Sampling with partial replacement can also be very efficient in biological and environmental studies where estimation of toxicants and their trends over time is the main interest. Sampling with partial replacement is designed here on two occasions in order to estimate the median concentration of chemical constituents quantified by means of liquid chromatography coupled with tandem mass spectrometry. Such data represent relative peak areas resulting from the chromatographic analysis. They are therefore positive-valued and skewed data, and are commonly fitted very well by the log-normal model. A log-normal model is assumed here for chemical constituents quantified in mainstream cigarette smoke in a real case study. Combining design-based and model-based approaches for statistical inference, we seek the median estimation of chemical constituents by sampling with partial replacement on two time occasions. We also discuss the limitations of extending the proposed approach to other skewed population models. The latter is investigated by means of a Monte Carlo simulation study.
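The payoff of assuming a log-normal model for median estimation can be shown with a small Monte Carlo study of its own: for log-normal data the population median is exp(μ), so the model-based estimator exp(mean of logs) competes with the raw sample median. This is a toy analogue of the abstract's setting (single occasion, no partial replacement; sample size and parameters are arbitrary).

```python
import math
import random
import statistics

def simulate(n=50, reps=2000, mu=0.0, sigma=1.0, seed=23):
    """Compare mean squared errors of two median estimators on
    log-normal samples: the raw sample median versus the model-based
    exp(mean of logs).  The true log-normal median is exp(mu)."""
    rng = random.Random(seed)
    true_median = math.exp(mu)
    se_sample, se_model = 0.0, 0.0
    for _ in range(reps):
        data = [rng.lognormvariate(mu, sigma) for _ in range(n)]
        est_sample = statistics.median(data)
        est_model = math.exp(statistics.fmean(math.log(x) for x in data))
        se_sample += (est_sample - true_median) ** 2
        se_model += (est_model - true_median) ** 2
    return se_sample / reps, se_model / reps

mse_sample, mse_model = simulate()
```

When the log-normal model holds, the model-based estimator has smaller mean squared error, which is the efficiency argument for combining model-based inference with the sampling design; when the model is wrong, that advantage can reverse, which is the limitation the abstract's own Monte Carlo study probes.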

