20 similar records found (search time: 0 ms)
1.
Hiroshi Wako 《Journal of Protein Chemistry》1989,8(6):733-747
Monte Carlo simulations of a small protein, crambin, were carried out with and without hydration energy. The methodology presented here is characterized, compared with other similar simulations of proteins in solution, by two points: (1) protein conformations are treated in fixed geometry, so that dihedral angles, rather than the Cartesian coordinates of atoms, are the independent variables; and (2) instead of treating water molecules explicitly in the calculation, hydration energy is incorporated into the conformational energy function in the form Σᵢ gᵢAᵢ, where Aᵢ is the accessible surface area of atomic group i in a given conformation and gᵢ is the free energy of hydration per unit surface area of that atomic group (i.e., the hydration-shell model). The validity of this model was tested by carrying out Monte Carlo simulations from two kinds of starting conformations, native and unfolded, and in two kinds of systems, in vacuo and in solution. In the simulations starting from the native conformation, the differences between the mean properties of the in vacuo and solution simulations are not very large, but the fluctuations around the mean conformation during the simulation are somewhat smaller in solution than in vacuo. On the other hand, in the simulations starting from the unfolded conformation, the molecule fluctuates much more in solution than in vacuo, and the effects of including the hydration energy are very pronounced. The results suggest that the method presented in this paper is useful for simulations of proteins in solution.
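The hydration-shell term lends itself to a compact sketch: the hydration energy is a sum of per-group hydration free energies gᵢ weighted by accessible surface areas Aᵢ. The Python below is a minimal illustration; the numeric values are hypothetical placeholders, and real Aᵢ values would come from an accessible-surface-area calculation that is not shown.

```python
# Hydration-shell model sketch: hydration energy = sum over atomic
# groups of g_i * A_i.  The g_i values and areas below are
# illustrative placeholders, not the parameters used in the paper.

def hydration_energy(groups):
    """Sum g_i * A_i over atomic groups.

    groups: iterable of (g_i, A_i) pairs, where g_i is the free
    energy of hydration per unit area and A_i is the accessible
    surface area of the group in the current conformation.
    """
    return sum(g * a for g, a in groups)

def total_energy(e_intra, groups):
    # Total conformational energy = intramolecular energy + hydration term.
    return e_intra + hydration_energy(groups)

if __name__ == "__main__":
    # Hypothetical values for three atomic groups.
    groups = [(0.012, 30.0), (-0.060, 15.0), (0.005, 42.0)]
    print(round(hydration_energy(groups), 3))
```

In a Monte Carlo move, only the Aᵢ of groups whose exposure changes would need recomputing before re-evaluating this sum.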
2.
Vania Elisabeth Barlette, Fábio Luiz Laurenti Garbujo, Luiz Carlos Gomide Freitas 《Molecular Engineering》1997,7(3-4):439-455
A five-site potential model combining Lennard–Jones plus Coulomb potential functions has been developed for the chloroform molecule. The partial charges needed for the Coulombic interactions were derived using the CHELPG procedure implemented in the Gaussian 92 program. These calculations were performed at the MP2 level with the MC-311G* basis set for Cl and 6-311G** for the C and H atoms. The parameters for the Lennard–Jones potentials were optimized to reproduce experimental values for the density and enthalpy of vaporization of the pure liquid at 298 K and 1 atm. The statistical mechanics calculations were performed with the Monte Carlo method in the isothermal–isobaric (NpT) ensemble. Besides the values obtained for the density, ρ, and the molar enthalpy of vaporization at constant pressure, ΔHV, for liquid chloroform, results for the molar volume, Vm, molar heat capacity, Cp, isobaric thermal expansivity, αp, and isothermal compressibility, κT, of this pure liquid are also in very good agreement with experimental observations. Size effects on the values of the thermodynamic properties were investigated. The potential model was also tested by computing the free energy of solvating one chloroform molecule in its own liquid at 298 K using a statistical perturbation approach; the result compares well with the experimental value. Site–site pair correlation functions were calculated and are in good accord with theoretical results available in the literature. Dipole–dipole correlation functions for the present five-site model were also calculated at different carbon–carbon distances and compared to those obtained using the four-site model reported in the literature. An investigation of the solvent dependence of the relative free energy for cis/trans conversion of a hypothetical solute in TIP4P water and in chloroform was carried out. The results show strong interaction of the water and chloroform molecules with the gauche conformer.
The value obtained for the free energy barrier for cis/trans rotation in TIP4P water is higher than that in chloroform. This result is in agreement with continuum solvation theory, as the conformer with the higher dipole moment is more favoured by the solvent with the higher dielectric constant. The results also show an increase in entropy as the solute goes from the cis to the trans geometry, and this effect is more appreciable in the aqueous solution. This revised version was published online in June 2006 with corrections to the Cover Date.
3.
In contrast to the various "potential impact" indices that have been proposed, we show that indices for real damage can be derived, based on the impact-pathway methodology. This involves the calculation of the increased pollutant concentration in all affected regions due to an incremental emission (e.g. μg/m3 of particles, using models of atmospheric dispersion and chemistry), followed by the calculation of physical impacts (e.g. the number of cases of asthma due to these particles, using a concentration-response function). The numbers are summed over all receptors of concern (population, crops, buildings, …). We show that in a uniform world (linear dose-response function, uniform receptor density, and uniform atmospheric removal rate) the conservation of matter implies a very simple formula for the total damage. The generalization to secondary pollutants is straightforward. By detailed numerical evaluations, using real data for atmospheric dispersion and geographic receptor distribution, we have demonstrated that this simple formula is an excellent representation of typical damages. Results are shown for the principal air pollutants emitted by the smoke stacks of industrial installations or by road transport.
A preliminary version was presented as a keynote lecture at the SETAC Meeting in Bordeaux, April 14-18, 1998.
4.
One barrier to interpreting the observational evidence concerning the adverse health effects of air pollution for public policy purposes is the measurement error inherent in estimates of exposure based on ambient pollutant monitors. Exposure assessment studies have shown that data from monitors at central sites may not adequately represent personal exposure. Thus, the exposure error resulting from using centrally measured data as a surrogate for personal exposure can potentially lead to a bias in estimates of the health effects of air pollution. This paper develops a multi-stage Poisson regression model for evaluating the effects of exposure measurement error on estimates of the effects of particulate air pollution on mortality in time-series studies. To implement the model, we have used five validation data sets on personal exposure to PM10. Our goal is to combine data on the associations between ambient concentrations of particulate matter and mortality for a specific location with validation data on the association between ambient and personal concentrations of particulate matter at the locations where such data have been collected. We use these data in a model to estimate the relative risk of mortality associated with estimated personal-exposure concentrations and make a comparison with the risk of mortality estimated from measurements of ambient concentration alone. We apply this method to data comprising daily mortality counts, ambient concentrations of PM10 measured at a central site, and temperature for Baltimore, Maryland, from 1987 to 1994.
We have selected our home city of Baltimore to illustrate the method; the measurement error correction model is general and can be applied to other appropriate locations. Our approach uses a combination of: (1) a generalized additive model with log link and Poisson error for the mortality-personal-exposure association; (2) a multi-stage linear model to estimate the variability across the five validation data sets in the personal-ambient-exposure association; and (3) data augmentation methods to address the uncertainty resulting from the missing personal exposure time series in Baltimore. In the Poisson regression model, we account for smooth seasonal and annual trends in mortality using smoothing splines. Taking into account the heterogeneity across locations in the personal-ambient-exposure relationship, we quantify the degree to which the exposure measurement error biases the results toward the null hypothesis of no effect, and we estimate the loss of precision in the estimated health effects due to indirectly estimating personal exposures from ambient measurements.
5.
Assessment of potential health risk of heavy metals in soils from a rapidly developing region of China
Soil heavy metal contamination is a major environmental concern, and the health risk associated with heavy metals is not fully explored. A combination of spatial analysis and Monte Carlo simulation was successfully used to identify the possible sources and health risks of cadmium (Cd), arsenic (As), mercury (Hg), lead (Pb), chromium (Cr), and copper (Cu) in soils collected from a rapidly developing region of China. It was found that the mean concentrations of Cd (0.17 mg/kg), As (8.74 mg/kg), Hg (0.15 mg/kg), Pb (27.28 mg/kg), and Cu (33.32 mg/kg) were greater than the soil background values. The accumulation and spatial variability of heavy metals were significantly affected by anthropogenic activities and soil properties. The risk assessment indicated that non-carcinogenic risk was not significant. However, 95% of the total cumulative carcinogenic risk values for children were greater than 1 × 10⁻⁵, implying high potential carcinogenic risk, with As and Pb the major contributors. Ingestion of heavy metals in the soils was the main exposure pathway, compared with inhalation and dermal exposure. The concentration of heavy metals in the soils, the particulate emission factor, and the dermal exposure ratio were the major parameters affecting health risk. This study highlights the importance of assessing direct soil-exposure health risk in studying heavy metal exposures.
6.
This article evaluates the implications of uncertainty in the life cycle (LC) energy efficiency and greenhouse gas (GHG) emissions of rapeseed oil (RO) as an energy carrier displacing fossil diesel (FD). The uncertainties addressed include parameter uncertainty as well as scenario uncertainty concerning how RO coproduct credits are accounted for (uncertainty due to modeling choices). We have carried out an extensive data collection to build an LC inventory accounting for parameter uncertainty. Different approaches for the carbon stock changes associated with converting set-aside land to rapeseed cultivation have been considered, which result in different values: from −0.25 t C/(ha·yr) (carbon uptake by the soil, in tonnes per hectare per year) to 0.60 t C/(ha·yr) (carbon emission). The energy renewability efficiency and GHG emissions of RO are presented, which show the influence of parameter versus scenario uncertainty. Primary energy savings and avoided GHG emissions when RO displaces FD have also been calculated: avoided GHG emissions show considerably higher uncertainty than energy savings, mainly due to land use (nitrous oxide emissions from soil) and land use conversion (carbon stock changes). The results demonstrate the relevance of applying uncertainty approaches; emphasize the need to reduce uncertainty in environmental life cycle modeling, particularly in the calculation of GHG emissions; and show the importance of integrating uncertainty into the interpretation of results.
7.
Hao Jiang 《Molecular simulation》2015,41(9):727-734
Monte Carlo simulation is conducted to obtain the structure, excess internal energy, and Helmholtz energy for systems containing charged and neutral hard spheres at comparable concentrations. The results are compared with the thermodynamic properties predicted by solving the Ornstein–Zernike equation with the hypernetted chain (HNC) and mean spherical approximation (MSA) closures. The HNC approximation is found to represent the simulation results well for both structure and excess energy, while the excess energy from MSA deviates from the simulation results in the intermediate- and high-density range. A simple modification of MSA, referred to as KMSA, is proposed to accurately predict the excess internal and Helmholtz energies in the studied density range. KMSA is shown to capture the effects of the neutral component, size and charge asymmetry, system temperature, and the dielectric constant of the background solvent on the excess energy of electrolyte systems.
8.
9.
To assess the health risks caused by soil heavy metals in China's mining areas, concentration data for eight heavy metals in 77 mines were collected from the previous literature. Monte Carlo simulation was used to analyze the corresponding carcinogenic and noncarcinogenic risks, and a sensitivity analysis was carried out for each parameter. The results showed that, among the different types of mining areas, multi-metal mines have the highest carcinogenic risk, followed by tungsten and antimony mines. Their carcinogenic risk values are all greater than 10⁻⁴, which is unacceptable. Pb is the heavy metal with the highest noncarcinogenic risk: its log-transformed value is 3.2, much larger than the threshold of 0; Pb is followed by As and Hg. Therefore, Pb, As, and Hg are the heavy metals that should be controlled preferentially in polluted mining areas. Sensitivity analysis showed that the soil ingestion rate, exposure frequency, and pollutant concentration level are the factors with the greatest impacts on health risks; more attention should be paid to these factors when addressing heavy metal pollution in mining areas. In addition, for the surveyed mines, children had a greater health risk than adults, so children should be given extra attention.
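The probabilistic risk calculation summarized above typically builds on the standard soil-ingestion exposure form, ADD = C · IngR · EF · ED / (BW · AT), with risk = ADD · SF. The sketch below is a hedged illustration of the Monte Carlo approach only; all distributions, parameter values, and the slope factor are made-up placeholders, not the study's inputs.

```python
# Minimal Monte Carlo sketch of carcinogenic risk for the
# soil-ingestion pathway.  Parameter distributions and values are
# illustrative assumptions, not those of the paper.
import random

random.seed(0)

def simulate_risk(n=10_000):
    risks = []
    for _ in range(n):
        c = random.lognormvariate(3.0, 0.5)    # soil concentration, mg/kg (assumed)
        ing = random.triangular(50, 200, 100)  # ingestion rate, mg/day (assumed)
        ef = random.uniform(180, 365)          # exposure frequency, day/yr (assumed)
        ed, bw, at = 6, 15.0, 70 * 365         # duration (yr), body weight (kg), averaging time (day)
        sf = 1.5e-3                            # hypothetical slope factor, (mg/kg/day)^-1
        # Average daily dose; 1e-6 converts mg of soil to kg.
        add = c * ing * 1e-6 * ef * ed / (bw * at)
        risks.append(add * sf)
    return risks

risks = sorted(simulate_risk())
p95 = risks[int(0.95 * len(risks))]
print(f"95th-percentile carcinogenic risk: {p95:.2e}")
```

Sensitivity analysis of the kind the abstract mentions would then correlate each sampled input with the output risk across iterations.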
10.
The principal purpose of this paper is to demonstrate the use of the Inverse Monte Carlo technique for calculating pair interaction energies in monoatomic liquids from a given equilibrium property. The method is based on the mathematical relation between transition probability and pair potential given by the fundamental equation of the "importance sampling" Monte Carlo method. In order to have well-defined conditions for the test of the Inverse Monte Carlo method, a Metropolis Monte Carlo simulation of a Lennard-Jones liquid is carried out to give the equilibrium pair correlation function determined by the assumed potential. Because an equilibrium configuration is a prerequisite for an Inverse Monte Carlo simulation, a model system is generated that reproduces the pair correlation function calculated by the Metropolis Monte Carlo simulation and therefore represents the system in thermal equilibrium. This configuration is used to simulate virtual atom displacements. The resulting changes in atom distribution for each single simulation step are inserted into a set of non-linear equations defining the transition probability for the virtual change of configuration. Solving this set of equations for the pair interaction energies recovers the Lennard-Jones potential by which the equilibrium configuration was determined.
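As a rough illustration of the forward step used above as a reference, a bare-bones Metropolis Monte Carlo loop for a small periodic Lennard-Jones system might look like the following. The system size, temperature, and move width are illustrative choices, not those of the paper, and no pair-correlation accumulation is shown.

```python
# Minimal Metropolis Monte Carlo sketch for a small periodic 2D
# Lennard-Jones system (epsilon = sigma = 1).  Illustrative only.
import math
import random

random.seed(1)

N, L, T = 16, 4.0, 1.5          # particles, box length, reduced temperature
BETA = 1.0 / T

def pair_energy(r2):
    """LJ energy for squared distance r2 (floored to avoid overflow)."""
    inv6 = 1.0 / max(r2, 1e-12) ** 3
    return 4.0 * (inv6 * inv6 - inv6)

def dist2(a, b):
    # Minimum-image convention in a periodic square box.
    dx = (a[0] - b[0] + L / 2) % L - L / 2
    dy = (a[1] - b[1] + L / 2) % L - L / 2
    return dx * dx + dy * dy

def energy_of(i, pos):
    return sum(pair_energy(dist2(pos[i], pos[j])) for j in range(N) if j != i)

pos = [[random.uniform(0, L), random.uniform(0, L)] for _ in range(N)]
accepted = 0
for step in range(20_000):
    i = random.randrange(N)
    old = pos[i][:]
    e_old = energy_of(i, pos)
    pos[i] = [(old[0] + random.uniform(-0.1, 0.1)) % L,
              (old[1] + random.uniform(-0.1, 0.1)) % L]
    de = energy_of(i, pos) - e_old
    # Metropolis criterion: accept downhill moves; uphill with prob exp(-beta*dE).
    if de <= 0 or random.random() < math.exp(-BETA * de):
        accepted += 1
    else:
        pos[i] = old
print(f"acceptance ratio: {accepted / 20_000:.2f}")
```

The inverse technique the paper describes runs this machinery in reverse: it treats the pair energies as the unknowns and solves for them from the observed transition statistics.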
11.
Natalia Nikolova Kiril Tenekedjiev Krasimir Kolev 《Central European Journal of Biology》2008,3(4):345-350
Progress curve analysis is a convenient tool for the characterization of enzyme action: a single reaction mixture provides multiple experimentally measured points for continuously varying amounts of substrates and products with exactly the same enzyme and modulator concentrations. The determination of kinetic parameters from the progress curves, however, requires complex mathematical evaluation of the time-course data. Some freely available programs (e.g. FITSIM, DYNAFIT) are widely applied to fit kinetic parameters to user-defined enzymatic mechanisms, but users often overlook the stringent requirements of the analytic procedures for appropriate design of the input experiments. Flaws in the experimental setup result in unreliable parameters, with consequent misinterpretation of the biological phenomenon under study. The present commentary suggests some helpful mathematical tools to improve the analytic procedure in order to diagnose major errors in the concept and design of kinetic experiments.
12.
Xie Y Yu H Yang H Shi Q Zhang X 《Biochemical and biophysical research communications》2006,349(1):15-19
The translocation of a confined polymer chain through a nano-channel has been simulated using the two-dimensional bond fluctuation model (BFM) with Monte Carlo dynamics. It is found that the trapping time τ_trap for the polymer chain to overcome the free energy barrier during translocation depends exponentially on the chain length N and on the channel length M. The results suggest that the height of the free energy barrier depends linearly on N and M, which differs from the prediction for a Gaussian chain.
13.
Jeffrey A. Walker 《Evolution; international journal of organic evolution》2014,68(7):2128-2136
Multiple regression of observational data is frequently used to infer causal effects. Partial regression coefficients are biased estimates of causal effects if unmeasured confounders are not in the regression model. The sensitivity of partial regression coefficients to omitted confounders is investigated with a Monte‐Carlo simulation. A subset of causal traits is “measured” and their effects are estimated using ordinary least squares regression and compared to their expected values. Three major results are: (1) the error due to confounding is much larger than that due to sampling, especially with large samples, (2) confounding error shrinks trivially with sample size, and (3) small true effects are frequently estimated as large effects. Consequently, confidence intervals from regression are poor guides to the true intervals, especially with large sample sizes. The addition of a confounder to the model improves estimates only 55% of the time. Results are improved with complete knowledge of the rank order of causal effects but even with this omniscience, measured intervals are poor proxies for true intervals if there are many unmeasured confounders. The results suggest that only under very limited conditions can we have much confidence in the magnitude of partial regression coefficients as estimates of causal effects.
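The core phenomenon, omitted-confounder bias in least-squares estimates, can be reproduced in a few lines. The sketch below is not the paper's simulation design; the coefficients, the confounder correlation, and the sample size are illustrative assumptions chosen so the expected bias is easy to verify (slope expectation = 1.0 + 1.0 × 0.5 = 1.5).

```python
# Minimal sketch of confounding bias: trait x correlates with an
# unmeasured confounder u that also affects y, so regressing y on x
# alone over-estimates x's causal effect (true value 1.0).
import random

random.seed(42)

def simulate(n=100_000, beta_x=1.0, beta_u=1.0, rho=0.5):
    xs, ys = [], []
    for _ in range(n):
        u = random.gauss(0, 1)                            # unmeasured confounder
        x = rho * u + (1 - rho**2) ** 0.5 * random.gauss(0, 1)
        y = beta_x * x + beta_u * u + random.gauss(0, 1)  # causal model
        xs.append(x)
        ys.append(y)
    return xs, ys

def ols_slope(xs, ys):
    # Single-predictor OLS: slope = cov(x, y) / var(x).
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

xs, ys = simulate()
print(f"estimated effect: {ols_slope(xs, ys):.2f}  (true effect: 1.00)")
```

Note that increasing n tightens the estimate around the *biased* value 1.5 rather than the true 1.0, which is the paper's point that confounding error does not shrink with sample size.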
14.
Population modeling for a squirrel monkey colony breeding in a captive laboratory environment was approached with the use of two different mathematical modeling techniques. Deterministic modeling was used initially on a spreadsheet to estimate future census figures for animals in various age/sex classes. Historical data were taken as input parameters for the model, combined with harvesting policies to calculate future population figures in the colony. This was followed by a more sophisticated stochastic model that is capable of accommodating random variations in biological phenomena, as well as smoothing out measurement errors. Point estimates (means) for input parameters used in the deterministic model are replaced by probability distributions fitted into historical data from colony records. With the use of Crystal Ball (Decisioneering, Inc., Denver, CO) software, user-selected distributions are embedded in appropriate cells in the spreadsheet model. A Monte Carlo simulation scheme running within the spreadsheet draws (on each cycle) random values for input parameters from the distribution embedded in each relevant cell, and thus generates output values for forecast variables. After several thousand runs, a distribution is formed at the output end representing estimates for population figures (forecast variables) in the form of probability distributions. Such distributions provide the decision-maker with a mathematical habitat for statistical analysis in a stochastic setting. In addition to providing standard statistical measures (e.g., mean, variance, and range) that describe the location and shape of the distribution, this approach offers the potential for investigating crucial issues such as conditions surrounding the plausibility of extinction.
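The spreadsheet-plus-Crystal-Ball workflow amounts to replacing point estimates with probability distributions and resampling the projection many times. A minimal stand-in sketch (with made-up birth and survival distributions and a hypothetical fixed harvesting policy, not the colony's actual parameters or age/sex structure) could be:

```python
# Sketch of a stochastic population projection: uncertain vital
# rates are drawn each year, and repeated runs yield a distribution
# of forecast census figures.  All rates and counts are illustrative.
import random
import statistics

random.seed(7)

def project_population(start=120, years=5, n_runs=5000):
    finals = []
    for _ in range(n_runs):
        pop = start
        for _ in range(years):
            birth_rate = random.betavariate(20, 80)  # mean ~0.20, uncertain
            survival = random.betavariate(90, 10)    # mean ~0.90, uncertain
            harvested = 5                            # hypothetical fixed policy
            pop = max(0, round(pop * survival + pop * birth_rate) - harvested)
        finals.append(pop)
    return finals

finals = sorted(project_population())
print(f"mean forecast: {statistics.mean(finals):.1f}, "
      f"5th-95th percentile: {finals[250]}-{finals[4750]}")
```

The output distribution plays the role of the Crystal Ball forecast cell: its tail mass near zero is what would be inspected when assessing the plausibility of extinction.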
15.
Wojciech Zieliński 《Biometrical journal. Biometrische Zeitschrift》1992,34(3):291-296
Seven multiple-comparison procedures (Tukey, Scheffé, Bonferroni, Studentized Maximum Modulus, Duncan, Newman-Keuls, and F) are compared with respect to the probability of a correct decision. Monte Carlo simulation shows that no single procedure is best. AMS 1985 Subject Classification: 62 J 15.
16.
Steven L. Fischer Robin H. Hampton 《Computer methods in biomechanics and biomedical engineering》2014,17(3):199-203
The use of principal component analysis (PCA) as a multivariate statistical approach to reduce complex biomechanical data-sets is growing. With its increased application in biomechanics, there has been a concurrent divergence in the use of criteria to determine how much the data is reduced (i.e. how many principal factors are retained). This short communication presents power equations to support the use of a parallel analysis (PA) criterion as a quantitative and transparent method for determining how many factors to retain when conducting a PCA. Monte Carlo simulation was used to carry out PCA on random data-sets of varying dimension. This process mimicked the PA procedure that would be required to determine principal component (PC) retention for any independent study in which the data-set dimensions fell within the range tested here. A surface was plotted for each of the first eight PCs, expressing the expected outcome of a PA as a function of the dimensions of a data-set. A power relationship was used to fit the surface, facilitating the prediction of the expected outcome of a PA as a function of the dimensions of a data-set. Coefficients used to fit the surface and facilitate prediction are reported. These equations enable the PA to be freely adopted as a criterion to inform PC retention. A transparent and quantifiable criterion to determine how many PCs to retain will enhance the ability to compare and contrast between studies.
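The parallel-analysis criterion itself is straightforward to sketch: a PC is retained when its observed eigenvalue exceeds the corresponding mean eigenvalue from PCA on random normal data of the same dimensions. The synthetic data set and the mean-based retention rule below are illustrative choices, not the power-equation surfaces fitted in the paper.

```python
# Minimal parallel-analysis sketch using eigenvalues of the
# correlation matrix.  The data set is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(0)

def correlation_eigenvalues(data):
    corr = np.corrcoef(data, rowvar=False)
    return np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending

def parallel_analysis(data, n_iter=200):
    n, p = data.shape
    observed = correlation_eigenvalues(data)
    # Mean eigenvalues over PCAs of random data with the same shape.
    random_eigs = np.mean(
        [correlation_eigenvalues(rng.standard_normal((n, p)))
         for _ in range(n_iter)], axis=0)
    return int(np.sum(observed > random_eigs)), observed, random_eigs

# Synthetic data: two correlated blocks of variables -> expect 2 PCs retained.
n = 200
f1, f2 = rng.standard_normal(n), rng.standard_normal(n)
data = np.column_stack([f + 0.3 * rng.standard_normal(n)
                        for f in (f1, f1, f1, f2, f2, f2)])
k, obs, rand_mean = parallel_analysis(data)
print(f"PCs retained by parallel analysis: {k}")
```

A percentile-based rule (e.g. the 95th percentile of the random eigenvalues instead of the mean) is a common, slightly more conservative variant.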
17.
G. Ian Town 《Biomarkers》2001,6(1):15-18
Air quality in Christchurch has been debated widely over the last 30 years, and at present there is a Draft Plan from the Canterbury Regional Council whose main aim is to improve air quality in the region. An inventory of emissions has shown that the main source of particulate pollution in the city is the use of solid-fuel domestic heating appliances such as open fires and wood burners. Pollution from road traffic is considered a significant contributor to other contaminants but accounts for less than 10% of particulates. There is local evidence that during the winter months, when atmospheric inversion conditions occur, levels of PM10 (particulate matter less than 10 µm in diameter) exceed local guidelines (50 µg m⁻³, 24-hour average) approximately 30 times each year. Research performed in Christchurch suggests that these levels of air pollution account for both premature mortality and an increase in symptoms and medication requirements in susceptible sub-groups such as those with chronic obstructive airways disease. Ongoing research is planned in Christchurch, and a collaborative approach between public health physicians, biostatisticians, toxicologists, and clinical researchers is likely to yield further useful information to inform decision-making by the Canterbury Regional Council.
18.
19.
Hsien Hui Khoo Reginald B. H. Tan Masayuki Sagisaka 《The International Journal of Life Cycle Assessment》2008,13(4):312-318
Background, aim and scope The interest in the use of biomass as a renewable energy resource has rapidly grown over the past few years. In Singapore, biomass resources are mostly from waste wood. This article presents a few technological options, namely carbonization, for the conversion of woody biomass into a solid fuel, charcoal. Materials and methods In the first stage, a life cycle assessment (LCA) ‘gate-to-gate’ system was developed for a conventional carbonizer system, a modern carbonizer from Japan, and a proposed four-stage partial furnace carbonizer from Tunisia. The potential environmental impacts were generated for global warming potential, acidification, human toxicity and photochemical oxidant potential. Based on the first set of results, the second LCA investigation was carried out comparing the selected carbonizer from Japan and an existing incinerator in Singapore. The second LCA adopted a unique approach combining social costs of pollution with the economic factors of the two biomass conversion technologies. Results The carbonizer from Japan resulted in approximately 85% less greenhouse gases than the conventional carbonization system and 54% less than the proposed four-stage carbonizer from Tunisia. In terms of acidification and human toxicity, the carbonizers from Japan and Tunisia display nearly similar results—both were considerably lower than the conventional carbonizer. For photochemical oxidant potential, very minimal emissions are generated from the four-stage carbonizer and nearly zero impact is realized for the carbonization technology from Japan. Discussion From the first set of LCA results, the Japanese carbonizer is favored in terms of its environmental results. 
The highest environmental impacts from the conventional carbonizer were due to large and uncontrolled emissions of acidic gases, greenhouse gases (particularly CO2 and CH4), particulates, and non-methane volatile organic compounds from both fugitive sources and energy requirements. The second LCA addressed the performance of the carbonizer from Japan against an existing incinerator in terms of environmental as well as cost performance. This approach translated pollutant emissions into monetary costs to highlight the social health impacts. Conclusions For the first LCA, the accumulated impacts from the Japanese carbonizer proved to be significantly lower, especially for global warming potential. The overall environmental performance of the four-stage carbonizer from Tunisia ranked slightly below that of the Japanese unit and well above the conventional carbonizer. The second LCA results showed a noteworthy improvement of 90% for human health from the modern Japanese carbonizer technology when compared against conventional incinerators. Without considering health issues or social costs, the total value per ton of wood treated is nearly the same for both the incinerator and the carbonizer. Recommendations and perspectives Interest in biomass as a raw material for producing energy has emerged rapidly in many countries. However, careful analysis and comparison of technologies are necessary to ensure favorable environmental outcomes. A full life cycle study, along with costs and the impact of pollution on society, should be performed before any large-scale biomass conversion technology is implemented. LCA can be applied to quantify and verify the overall environmental performance of a particular technology of interest, as well as to further explore the proposed technology in terms of costs and social implications.
20.
Theodoros Skevas Scott M. Swinton Sophia Tanner Gregg Sanford Kurt D. Thelen 《Global Change Biology Bioenergy》2016,8(6):1162-1177
Perennial, cellulosic bioenergy crops represent a risky investment. The potential for adoption of these crops depends not only on mean net returns, but also on the associated probability distributions and on the risk preferences of farmers. Using 6‐year observed crop yield data from highly productive and marginally productive sites in the southern Great Lakes region and assuming risk neutrality, we calculate expected breakeven biomass yields and prices compared to corn (Zea mays L.) as a benchmark. Next we develop Monte Carlo budget simulations based on stochastic crop prices and yields. The crop yield simulations decompose yield risk into three components: crop establishment survival, time to maturity, and mature yield variability. Results reveal that corn with harvest of grain and 38% of stover (as cellulosic bioenergy feedstock) is both the most profitable and the least risky investment option. It dominates all perennial systems considered across a wide range of farmer risk preferences. Although not currently attractive for profit‐oriented farmers who are risk neutral or risk averse, perennial bioenergy crops have a higher potential to successfully compete with corn under marginal crop production conditions. 相似文献