Similar Documents

20 similar documents found.
1.
Abstract

We show that the classical Metropolis Monte Carlo (MMC) algorithm converges very slowly when applied to the primitive electrolyte environment of a high charge-density polyelectrolyte. This slow convergence, which is due to the large density inhomogeneity around the polyelectrolyte, produces noticeable errors in the ion distribution functions for MMC runs of 1.3 × 10⁶ trial steps started from nonequilibrium distributions. We report that an algorithm we call DSMC (density-scaled Monte Carlo) overcomes this problem and provides relatively rapid convergence in this application. We suggest that DSMC should be well suited to other Monte Carlo simulations of physical systems in which large density inhomogeneities occur.

2.
3.

Background, aim, and scope

Uncertainty information is essential for the proper use of life cycle assessment (LCA) and environmental assessments in decision making. So far, parameter uncertainty propagation has mainly been studied using Monte Carlo techniques, which are relatively computationally heavy, especially for the comparison of multiple scenarios; this often limits their use to research or to the inventory stage only. Furthermore, Monte Carlo simulations do not automatically assess the sensitivity of the result to individual parameters or their contributions to the overall uncertainty. The present paper aims to develop, and apply to both inventory and impact assessment, an explicit and transparent analytical approach to uncertainty. This approach applies Taylor series expansions to the uncertainty propagation of lognormally distributed parameters.

Materials and methods

We first apply the Taylor series expansion method to analyze the uncertainty propagation of a single scenario, in which case the squared geometric standard deviation of the final output is determined as a function of the model sensitivity to each input parameter and the squared geometric standard deviation of each parameter. We then extend this approach to the comparison of two or more LCA scenarios. Since in LCA it is crucial to account for both common inventory processes and common impact assessment characterization factors among the different scenarios, we further develop the approach to address this dependency. We provide a method to easily determine a range and a best estimate of (a) the squared geometric standard deviation on the ratio of the two scenario scores, “A/B”, and (b) the degree of confidence in the prediction that the impact of scenario A is lower than B (i.e., the probability that A/B<1). The approach is tested on an automobile case study and resulting probability distributions of climate change impacts are compared to classical Monte Carlo distributions.
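As an illustration of this propagation rule, the sketch below uses a hypothetical multiplicative model (not the paper's case study): the squared geometric standard deviation of the output is computed analytically from the model sensitivities and the input GSDs, then checked against a Monte Carlo run. All numbers are assumptions.

```python
import numpy as np

# For a multiplicative model y = prod_i x_i^s_i with lognormal inputs x_i,
# the Taylor-series (analytical) rule is exact:
#   ln(GSD_y)^2 = sum_i s_i^2 * ln(GSD_i)^2
# GSDs and sensitivities below are illustrative assumptions.

rng = np.random.default_rng(0)

gsd = np.array([1.2, 1.5, 1.1])   # geometric standard deviations of the inputs
s = np.array([1.0, 0.6, -0.3])    # model sensitivities (exponents)

# analytical propagation
ln_gsd_y_sq = np.sum(s**2 * np.log(gsd)**2)
gsd_y = np.exp(np.sqrt(ln_gsd_y_sq))

# Monte Carlo check: sample lognormal inputs and push them through the model
n = 200_000
x = rng.lognormal(mean=0.0, sigma=np.log(gsd), size=(n, 3))
y = np.prod(x**s, axis=1)
gsd_y_mc = np.exp(np.std(np.log(y)))

print(f"analytical GSD_y = {gsd_y:.4f}, Monte Carlo GSD_y = {gsd_y_mc:.4f}")
```

For a purely multiplicative lognormal model the two agree to sampling error; the analytical route needs no sampling at all.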

Results

The probability distributions obtained with the Taylor series expansion lead to results similar to the classical Monte Carlo distributions, while being substantially simpler; the Taylor series method tends to underestimate the 2.5% confidence limit by 1-11% and the 97.5% limit by less than 5%. The analytical Taylor series expansion easily provides the explicit contributions of each parameter to the overall uncertainty. For the steel front end panel, the factor contributing most to the climate change score uncertainty is the gasoline consumption (>75%). For the aluminum panel, the electricity and aluminum primary production, as well as the light oil consumption, are the dominant contributors to the uncertainty. The developed approach for scenario comparisons, differentiating between common and independent parameters, leads to results similar to those of a Monte Carlo analysis; for all tested cases, we obtained a good concordance between the Monte Carlo and the Taylor series expansion methods regarding the probability that one scenario is better than the other.

Discussion

The Taylor series expansion method addresses the crucial need of accounting for dependencies in LCA, both for common LCI processes and common LCIA characterization factors. The developed approach in Eq. 8, which differentiates between common and independent parameters, estimates the degree of confidence in the prediction that scenario A is better than B, yielding results similar to those found with Monte Carlo simulations.
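A minimal numerical sketch of the scenario-comparison idea (illustrative numbers only, not the paper's Eq. 8): a parameter common to both scenarios cancels from the ratio A/B, so the degree of confidence that A is better than B depends only on the independent parameters.

```python
import numpy as np
from math import erf, log, sqrt

# With lognormal parameters, a factor c common to scenarios A and B drops
# out of the ratio A/B, so P(A/B < 1) is set by the independent parts alone.
# All medians and GSDs below are assumptions for the sketch.

rng = np.random.default_rng(1)

med_a, gsd_a = 1.0, 1.3   # independent part of scenario A
med_b, gsd_b = 1.2, 1.4   # independent part of scenario B
gsd_c = 2.0               # common parameter shared by both scenarios

# analytical: ln(A/B) is normal; the common factor cancels
sigma = sqrt(log(gsd_a) ** 2 + log(gsd_b) ** 2)
mean = log(med_a / med_b)
p_analytical = 0.5 * (1 + erf((-mean / sigma) / sqrt(2)))  # P(A/B < 1)

# Monte Carlo with the common parameter sampled once per trial
n = 200_000
c = rng.lognormal(0.0, log(gsd_c), n)
a = med_a * rng.lognormal(0.0, log(gsd_a), n)
b = med_b * rng.lognormal(0.0, log(gsd_b), n)
p_mc = np.mean(c * a < c * b)

print(f"P(A < B): analytical {p_analytical:.3f}, Monte Carlo {p_mc:.3f}")
```

Note that treating c as independent between the scenarios would wrongly inflate the uncertainty of the ratio, which is why the common/independent distinction matters.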

Conclusions

The probability distributions obtained with the Taylor series expansion are virtually equivalent to those from a classical Monte Carlo simulation, while being significantly easier to obtain. An automobile case study on an aluminum front end panel demonstrated the feasibility of this method and illustrated its simultaneous and consistent application to both inventory and impact assessment. The explicit and innovative analytical approach, based on Taylor series expansions of lognormal distributions, provides the contribution to the uncertainty from each parameter and strongly reduces calculation time.

4.
Purpose

Objective uncertainty quantification (UQ) of a product life cycle assessment (LCA) is a critical step for decision making. Environmental impacts can be measured directly or estimated using models. A model is described by underlying mathematical functions that approximate the environmental impacts during the various LCA stages. In this study, three possible uncertainty sources of a mathematical model were investigated: input variability, model parameter uncertainty (distinguished from input uncertainty in this study), and model-form uncertainty. A simple and easy-to-implement method is proposed to quantify each source.

Methods

Various data analytics methods were used to conduct a thorough model uncertainty analysis: (1) Interval analysis was used for input uncertainty quantification. Direct sampling using Monte Carlo (MC) simulation was used for the interval analysis, and the results were compared to those of indirect nonlinear optimization as an alternative approach. A machine learning surrogate model was developed to perform both the direct MC sampling and the indirect nonlinear optimization. (2) Bayesian inference was adopted to quantify parameter uncertainty. (3) A recently introduced model correction method based on orthogonal polynomial basis functions was used to evaluate the model-form uncertainty. The methods were applied to a pavement LCA to propagate uncertainties through an energy and global warming potential (GWP) estimation model; a pavement section in the Chicago metropolitan area served as the case study.
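A rough sketch of the first step, interval analysis by direct Monte Carlo sampling, is shown below. The energy model and all bounds are made-up placeholders, not the pavement LCA model from the study.

```python
import numpy as np

# Interval analysis by direct MC sampling: inputs vary inside given bounds
# and the induced output interval is estimated from the sampled extremes.
# The "energy_model" and bounds are invented for illustration.

rng = np.random.default_rng(2)

bounds = np.array([[0.8, 1.2],    # e.g. material quantity factor
                   [0.9, 1.1],    # e.g. transport distance factor
                   [0.7, 1.3]])   # e.g. equipment fuel-use factor

def energy_model(x):
    # placeholder nonlinear model of energy use (arbitrary functional form)
    return 100.0 * x[:, 0] * x[:, 1] + 20.0 * x[:, 2] ** 2

# direct sampling: draw uniformly inside the input box
n = 100_000
x = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n, 3))
y = energy_model(x)

lo, hi = y.min(), y.max()
width_pct = 100 * (hi - lo) / y.mean()
print(f"output interval ~ [{lo:.1f}, {hi:.1f}], width ~ {width_pct:.0f}% of mean")
```

Indirect nonlinear optimization would instead search for the exact minimum and maximum of the model over the box; direct sampling approaches the same interval from the inside.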

Results and discussion

Results indicate that each uncertainty source contributes to the overall energy and GWP output of the LCA. Input uncertainty was shown to have a significant impact on the overall GWP output; for the example case study, the GWP interval was around 50%. Parameter uncertainty results showed that an assumed ±10% uniform variation in the model parameter priors resulted in 28% variation in the GWP output. Model-form uncertainty had the lowest impact (less than 10% variation in the GWP), because the original energy model is relatively accurate in estimating the energy. However, a sensitivity analysis of the model-form uncertainty showed that variation of up to 180% in the results can arise when the original model is less accurate.

Conclusions

Investigating each uncertainty source of the model demonstrated the importance of accurate characterization, propagation, and quantification of uncertainty. This study proposes independent and relatively easy-to-implement methods that provide robust grounds for objective model uncertainty analysis in LCA applications. Assumptions on inputs, parameter distributions, and model form need to be justified. Input uncertainty plays a key role in the overall pavement LCA output. The proposed model correction method, as well as the interval analysis, were relatively easy to implement. Research is still needed to develop a more generic and simplified MCMC simulation procedure that is fast to implement.


5.
Purpose

Several models are available in the literature to estimate agricultural emissions. From a life cycle assessment (LCA) perspective, there is no standardized procedure for estimating emissions of nitrogen or other nutrients. This article aims to compare four agricultural models (PEF, SALCA, Daisy, and Animo) of differing complexity and to test their suitability and sensitivity in LCA.

Methods

The required input data, the outputs obtained, and the main characteristics of the models are presented. The performance of the models was then evaluated for their suitability in estimating nitrogen emissions in LCA, using an adapted version of the criteria proposed by the United Nations Framework Convention on Climate Change (UNFCCC) and other relevant studies. Finally, nitrogen emissions from a case study of irrigated maize in Spain were estimated using the selected models and tested in a full LCA to characterize the impacts.

Results and discussion

According to the set of criteria, the models scored, from best to worst: Daisy (77%), SALCA (74%), Animo (72%), and PEF (70%), making Daisy the most suitable model for the LCA framework. Regarding the case study, the estimated emissions agreed with literature data for irrigated corn crops in Spain and the Mediterranean, except for N2O emissions. The impact characterization showed differences of up to 56% for the most relevant impact categories when considering nitrogen emissions. Additionally, an overview of the models used to estimate nitrogen emissions in LCA studies showed that many models have been used, but not always in a suitable or justified manner.

Conclusions

Although mechanistic models are more laborious, mainly because of the amount of input data required, this study shows that Daisy could be a suitable model for estimating emissions when fertilizer application is relevant to the environmental study. In addition, because LCA urgently needs a solid methodology for estimating nitrogen emissions, mechanistic models such as Daisy could be used to derive default values for different archetype scenarios.


6.
Abstract

In this paper we extend the topological analysis of the Aluminum distributions of the Faujasite lattice, proposed in a previous paper, to the case of Zeolite-Y. Here the exact counting of all the inequivalent configurations is complicated by the huge number of possible structures, but the physically relevant distributions can be found using a Monte Carlo method, which turns out to be very efficient. We compare, whenever possible, the Monte Carlo results with the exact counts, and in all these cases we find perfect agreement. The approach thus seems applicable to the study of every class of Zeolites. In the first two sections the method is introduced, and in the third the relevant results for Zeolite-Y are presented and discussed.

7.
Purpose

Biological models to estimate the relative biological effectiveness (RBE) or the equivalent dose in 2 Gy fractions (EQD2) are needed for treatment planning and plan evaluation in carbon ion therapy. We present a model-independent, Monte Carlo based sensitivity analysis (SA) approach to quantify the impact of different uncertainties on the biological models.

Methods and materials

The Monte Carlo based SA is used to evaluate variations in biological parameters. The key property of this SA is the high number of simulation runs, each with randomized input parameters, allowing a statistical, variance-based ranking of the input variations. The potential of this SA is shown in a simplified one-dimensional treatment plan optimization. Physical properties of carbon ion beams (e.g., fragmentation) are simulated using the Monte Carlo code FLUKA. To estimate the biological effects of ion beams relative to X-rays, we use the Local Effect Model (LEM) within the framework of the linear-quadratic (LQ) model. Currently, only uncertainties in the output of the biological models are taken into account.

Results/conclusions

The presented SA is suitable for evaluating the impact of variations in biological parameters. A major advantage is the ability to assess and display the sensitivity of the evaluated quantity to several parameter variations at the same time. The main challenges for later use in three-dimensional treatment plan evaluation are computational time and memory usage. The presented SA can be performed with any analytical or numerical function and can hence be applied to any biological model used in carbon ion therapy.
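The variance-based ranking idea can be sketched as follows. The quadratic LQ-style function and all the spreads are stand-ins invented for illustration, not the LEM or FLUKA outputs.

```python
import numpy as np

# Model-independent, variance-based sensitivity analysis: many runs with
# randomized inputs, then a statistical ranking of how much output variance
# each input explains (squared correlation serves here as a simple proxy
# for a first-order variance-based index). All spreads are assumptions.

rng = np.random.default_rng(3)
n = 50_000

# randomized variations of the biological model inputs
alpha = rng.normal(0.10, 0.02, n)    # linear LQ coefficient (1/Gy)
beta = rng.normal(0.05, 0.005, n)    # quadratic LQ coefficient (1/Gy^2)
dose = rng.normal(2.0, 0.1, n)       # physical dose (Gy)

# evaluated quantity: the LQ survival exponent alpha*D + beta*D^2
effect = alpha * dose + beta * dose**2

# variance-based ranking of the input variations
for name, x in [("alpha", alpha), ("beta", beta), ("dose", dose)]:
    r2 = np.corrcoef(x, effect)[0, 1] ** 2
    print(f"{name}: explains ~{100 * r2:.0f}% of the output variance")
```

Because every run randomizes all inputs at once, one batch of simulations yields a sensitivity ranking for all parameters simultaneously, which is the property the abstract highlights.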

8.
Abstract

We report results of direct Monte Carlo simulations of n-pentane and n-decane at the liquid–vapour interface for a number of temperatures. The intermolecular interactions are modeled using the latest version of the anisotropic united atom model (AUA4). We have used the local long-range correction energy and an algorithm that selects randomly, with equal probability, between two different displacements. The liquid and vapour densities are in excellent agreement with experimental data and with those previously calculated using the GEMC method.

9.
Aim

The aim of this work was to develop multiple-source models for electron beams of the NEPTUN 10PC medical linear accelerator using the BEAMDP computer code.

Background

One of the most accurate techniques of radiotherapy dose calculation is Monte Carlo (MC) simulation of radiation transport, which requires detailed information about the beam in the form of a phase-space file. The computing time required to simulate the beam data and obtain phase-space files from a clinical accelerator is significant. Calculating dose distributions using multiple-source models is an alternative to using phase-space data as direct input to the dose calculation system.

Materials and methods

Monte Carlo simulation of the accelerator head was performed, recording the particle phase space with the details of each particle history. Multiple-source models were built from the phase-space files of the Monte Carlo simulations. These simplified beam models were used to generate Monte Carlo dose calculations, which were compared with calculations based on the phase-space data for electron beams.

Results

Comparison of the measured and calculated dose distributions using the phase-space files and the multiple-source models for three electron beam energies showed that the measured and calculated values match each other well throughout the curves.

Conclusion

Dose distributions calculated using the multiple-source models and the phase-space data agree within 1.3%, demonstrating that the models can be used for dosimetry research purposes and dose calculations in radiotherapy.

10.
Purpose

Life cycle assessment (LCA) is the process of systematically assessing impacts arising from the interaction between the environment and human activity. Combining machine learning (ML) with LCA methods can contribute greatly to reducing impacts. The sheer number of input parameters, and their uncertainties, that contribute to the full life cycle makes a broader application of ML complex and difficult to achieve. Hence a systems engineering approach should be taken, applying ML in isolation to individual aspects of the LCA. This study addresses the challenge of leveraging ML methods to deliver LCA solutions. The overarching hypothesis is that LCA underpinned by ML methods and informed by dynamic data paves the way to more accurate LCA while supporting life cycle decision making.

Methods

In this study, previous research on ML for LCA was considered, and a literature review was undertaken.

Results

The results showed that ML can be a useful tool for certain aspects of LCA. ML methods were shown to apply efficiently to optimization scenarios in LCA. Finally, ML methods have been integrated into existing inventory databases to streamline the LCA across many use cases.

Conclusions

This article summarises the characteristics of the existing literature and provides suggestions for future work addressing the limitations and gaps found in the literature.


11.
Population modeling for a squirrel monkey colony breeding in a captive laboratory environment was approached with the use of two different mathematical modeling techniques. Deterministic modeling was used initially on a spreadsheet to estimate future census figures for animals in various age/sex classes. Historical data were taken as input parameters for the model, combined with harvesting policies to calculate future population figures in the colony. This was followed by a more sophisticated stochastic model that is capable of accommodating random variations in biological phenomena, as well as smoothing out measurement errors. Point estimates (means) for input parameters used in the deterministic model are replaced by probability distributions fitted into historical data from colony records. With the use of Crystal Ball (Decisioneering, Inc., Denver, CO) software, user-selected distributions are embedded in appropriate cells in the spreadsheet model. A Monte Carlo simulation scheme running within the spreadsheet draws (on each cycle) random values for input parameters from the distribution embedded in each relevant cell, and thus generates output values for forecast variables. After several thousand runs, a distribution is formed at the output end representing estimates for population figures (forecast variables) in the form of probability distributions. Such distributions provide the decision-maker with a mathematical habitat for statistical analysis in a stochastic setting. In addition to providing standard statistical measures (e.g., mean, variance, and range) that describe the location and shape of the distribution, this approach offers the potential for investigating crucial issues such as conditions surrounding the plausibility of extinction.
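A toy reconstruction of this deterministic-then-stochastic workflow is sketched below, with invented rates and a single collapsed age/sex class standing in for the colony records, and plain NumPy standing in for the Crystal Ball machinery.

```python
import numpy as np

# Deterministic projection with point estimates, then a stochastic version
# in which input rates are drawn from distributions on every Monte Carlo
# trial. All rates, the class structure, and the harvest policy are
# illustrative assumptions, not colony data.

rng = np.random.default_rng(4)

def project(pop, birth_rate, survival, years=10, harvest=5.0):
    # all age/sex classes collapsed into one head count for brevity
    for _ in range(years):
        pop = survival * pop + birth_rate * pop - harvest
    return pop

# deterministic run using point estimates (means) of the input parameters
det = project(pop=100.0, birth_rate=0.25, survival=0.85)

# stochastic runs: parameters drawn from distributions fitted to records
n = 20_000
br = rng.normal(0.25, 0.02, n)          # birth rate per animal per year
sv = rng.beta(170, 30, n)               # survival rate, mean 0.85, in (0, 1)
sims = np.array([project(100.0, b, s) for b, s in zip(br, sv)])

print(f"deterministic forecast: {det:.0f} animals")
print(f"stochastic forecast: mean {sims.mean():.0f}, "
      f"2.5-97.5% range [{np.percentile(sims, 2.5):.0f}, "
      f"{np.percentile(sims, 97.5):.0f}]")
```

The stochastic output is a full distribution rather than a single number, so questions such as the probability of falling below a viability threshold can be read off directly.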

12.
Abstract

The principal purpose of this paper is to demonstrate the use of the Inverse Monte Carlo technique for calculating pair interaction energies in monoatomic liquids from a given equilibrium property. The method is based on the mathematical relation between transition probability and pair potential given by the fundamental equation of the “importance sampling” Monte Carlo method. In order to have well-defined conditions for the test of the Inverse Monte Carlo method, a Metropolis Monte Carlo simulation of a Lennard-Jones liquid is carried out to give the equilibrium pair correlation function determined by the assumed potential. Because an equilibrium configuration is a prerequisite for an Inverse Monte Carlo simulation, a model system is generated that reproduces the pair correlation function calculated by the Metropolis Monte Carlo simulation and therefore represents the system in thermal equilibrium. This configuration is used to simulate virtual atom displacements. The resulting changes in atom distribution for each single simulation step are inserted into a set of non-linear equations defining the transition probability for the virtual change of configuration. Solving the set of equations for the pair interaction energies recovers the Lennard-Jones potential by which the equilibrium configuration was determined.
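For orientation, the forward ingredient that Inverse Monte Carlo inverts is the Metropolis transition probability min(1, exp(-ΔE/kT)). A minimal sketch of that forward step for a small Lennard-Jones system follows; the system size and parameters are illustrative only, not the paper's setup.

```python
import numpy as np

# Minimal "importance sampling" (Metropolis) move for a tiny Lennard-Jones
# system: the acceptance rule min(1, exp(-beta * dE)) is the transition
# probability that the Inverse Monte Carlo method later inverts to recover
# the pair potential. No cutoff or periodic boundaries, for brevity.

rng = np.random.default_rng(6)

def lj_energy(pos, eps=1.0, sig=1.0):
    # total pair energy of the configuration
    e = 0.0
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(pos[i] - pos[j])
            e += 4 * eps * ((sig / r) ** 12 - (sig / r) ** 6)
    return e

def metropolis_step(pos, beta=1.0, delta=0.1):
    i = rng.integers(len(pos))
    trial = pos.copy()
    trial[i] += rng.uniform(-delta, delta, 3)
    dE = lj_energy(trial) - lj_energy(pos)
    # accept with probability min(1, exp(-beta * dE))
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        return trial, True
    return pos, False

pos = rng.uniform(0.0, 3.0, (10, 3))     # 10 atoms in a small box
accepted = 0
for _ in range(500):
    pos, ok = metropolis_step(pos)
    accepted += ok
print(f"acceptance rate over 500 steps: {accepted / 500:.2f}")
```

The inverse problem treats the observed changes in atom distribution under such moves as data and solves for the pair energies that would produce the given transition probabilities.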

13.
Purpose

It is frequently mentioned in the literature that LCA is linear, without proof, or even without a clear definition of the criterion for linearity. Here we study the meaning of the term linear and, in relation to that, the question of whether LCA is indeed linear.

Methods

We explore the different meanings of the term linearity in the context of mathematical models. This leads to a distinction between linear functions, homogeneous functions, homogeneous linear functions, bilinear functions, and multilinear functions. Each of them is defined in accessible terms and illustrated with examples.

Results

We analyze traditional, matrix-based, LCA, and conclude that LCA is not linear in any of the senses defined.

Discussion and conclusions

Despite the negative answer to the research question, there are many respects in which LCA can be regarded to be, at least to some extent, linear. We discuss a few of such cases. We also discuss a few practical implications for practitioners of LCA and for developers of new methods for LCI and LCIA.


14.
Purpose

In support of the sustainable development of our societies, future engineers should have elementary knowledge of sustainability assessment and the use of life cycle assessment (LCA). Publications on pedagogical experience with teaching LCA in higher education are, however, scarce. Here, we describe and discuss 20 years of experience in teaching LCA at MSc level in an engineering university, with the ambition to share our insights and inspire the teaching of LCA as part of a university curriculum.

Methods

We detail the design of an LCA course taught at the Technical University of Denmark since 1997. The course structure relies on (i) a structured combination of theoretical teaching, practical assignments and hands-on practice on LCA case studies, and (ii) the conduct of real-life LCA case studies in collaboration with companies or other organisations. Through the semester-long duration of the course, students from different engineering backgrounds perform full-fledged LCA studies in groups, passing through two iterations—a screening LCA supporting a more targeted LCA.

Results and discussion

The course design, which relies on a learning-by-doing principle, is transparently described to inspire LCA teachers among the readers. Historical evolution and statistics about the course, including its 192 case studies run in collaboration with 105 companies and institutions, are analysed and serve as basis to discuss the benefits and challenges of its different components, such as the theory acquisition, the assignment work, the LCA software learning, the conduct of case studies, the merits of industrial collaborations and grading approaches.

Conclusions

We demonstrate the win-win situation created by the setting of the course, in which the students are actively engaged and learn efficiently how to perform an LCA, while the collaborating companies often gain useful insights into their analysed case studies. The course can also be an eye-opener for companies unfamiliar with LCA, which are introduced to life cycle thinking and the potential benefits of LCA. We have no hesitation in recommending that industries and LCA teachers engage in such collaborations, even in the fundamental teaching of LCA techniques.


15.
The life cycle environmental profile of energy‐consuming products is dominated by the products’ use stage. Variation in real‐world product use can therefore yield large differences in the results of life cycle assessment (LCA). Adequate characterization of input parameters is paramount for uncertainty quantification and has been a challenge to wider adoption of the LCA method. After emphasis in recent years on methodological development, data development has become the primary focus again. Pervasive sensing presents the opportunity to collect rich data sets and improve profiling of use‐stage parameters. Illustrating a data‐driven approach, we examine energy use in domestic cooling systems, focusing on climate change as the impact category. Specific objectives were to examine: (1) how characterization of the use stage by different probability distributions and (2) how characterizing data aggregated at successively higher granularity affects LCA modeling results and the uncertainty in output. Appliance‐level electricity data were sourced from domestic residences for 3 years. Use‐stage variables were propagated in a stochastic model and analyses simulated by Monte Carlo procedure. Although distribution choice did not necessarily significantly impact the estimated output, there were differences in the estimated uncertainty. Characterization of use‐stage power consumption in the model at successively higher data granularity reduced the output uncertainty with diminishing returns. Results therefore justify the collection of high granularity data sets representing the life cycle use stage of high‐energy products. The availability of such data through proliferation of pervasive sensing presents increasing opportunities to better characterize data and increase confidence in results of LCA.
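The distribution-choice comparison can be sketched as follows, with synthetic data standing in for the metered readings; the usage hours, lifetime, and grid emission factor are assumptions made for the sketch.

```python
import numpy as np

# Fit alternative probability distributions to (synthetic) metered power
# data, propagate each through a simple use-stage climate change model by
# Monte Carlo, and compare the resulting output uncertainty.

rng = np.random.default_rng(5)

# synthetic "metered" power readings (W), standing in for sensed data
data = rng.gamma(shape=20.0, scale=50.0, size=5000)   # mean near 1000 W

hours_per_year = 600.0      # assumed annual cooling hours
years = 10.0                # assumed product lifetime
gwp_factor = 0.5e-3         # assumed kg CO2-eq per Wh of grid electricity

n = 100_000
# (1) normal characterization of the power data
p_norm = rng.normal(data.mean(), data.std(), n)
# (2) lognormal characterization of the same data
mu, sigma = np.log(data).mean(), np.log(data).std()
p_logn = rng.lognormal(mu, sigma, n)

for name, p in [("normal", p_norm), ("lognormal", p_logn)]:
    gwp = p * hours_per_year * years * gwp_factor     # kg CO2-eq over life
    lo, hi = np.percentile(gwp, [2.5, 97.5])
    print(f"{name}: mean {gwp.mean():.0f} kg CO2-eq, "
          f"95% interval [{lo:.0f}, {hi:.0f}]")
```

The means of the two characterizations are close, but the tails of the output intervals differ, which mirrors the abstract's observation that distribution choice affects the estimated uncertainty more than the estimated output.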

16.
Abstract

A bulk Lennard-Jones fluid was simulated using the grand canonical Monte Carlo method. Three different sampling methods were used in the transition matrix: the Metropolis method, the Barker method, and a third, novel method. While it can be shown that the Metropolis method gives the most accurate ensemble averages in the limit of an infinitely long run, the new method, termed “Modified Barker Sampling” (MBS), is shown to be superior for runs of practical length for the particular system studied.

17.
Low-dose-rate extrapolation using the multistage model
C. Portier, D. Hoel. Biometrics 1983, 39(4):897–906
The distribution of the maximum likelihood estimates of virtually safe levels of exposure to environmental chemicals is derived using large-sample theory and Monte Carlo simulation according to the Armitage-Doll multistage model. Using historical dose-response data, we develop a set of 33 two-stage models upon which we base our conclusions. The large-sample distributions of the virtually safe dose are normal when the multistage-model parameters have nonzero expectation, and skewed otherwise. The large-sample theory does not provide a good approximation of the distribution observed for small bioassays when Monte Carlo simulation is used. The constrained nature of the multistage-model parameters leads to bimodal distributions for small bioassays. The two modes are the direct result of estimating the linear parameter in the multistage model: the lower mode results from estimating this parameter to be nonzero, and the upper mode from estimating it to be zero. The results of this research emphasize the need to incorporate biological theory in the model-selection process.

18.

Purpose

The analysis of uncertainty in life cycle assessment (LCA) studies has been a topic for more than 10 years, and many commercial LCA programs now feature a sampling approach called Monte Carlo analysis. Yet a full Monte Carlo analysis of a large LCA system, for instance one containing the 4,000 unit processes of ecoinvent v2.2, is rarely carried out by LCA practitioners. One reason for this is computation time. A faster alternative to Monte Carlo is analytical error propagation by means of a Taylor series expansion; however, this approach suffers from being explained in the literature in conflicting ways, hampering its implementation in most software packages for LCA. The purpose of this paper is to compare the two approaches from a theoretical and practical perspective.

Methods

In this paper, we compare the analytical and sampling approaches in terms of their theoretical background and their mathematical formulation. Using three case studies—one stylized, one real-sized, and one input–output (IO)-based—we approach these techniques from a practical perspective and compare them in terms of speed and results.

Results

Depending on the precise question, a sampling or an analytical approach provides more useful information. Whenever they provide the same indicators, an analytical approach is much faster but less reliable when the uncertainties are large.

Conclusions

For a good analysis, analytical and sampling approaches are equally important. We recommend that practitioners use both whenever available, and that software suppliers implement both.

19.
Abstract

Time-saving procedures unifying Monte Carlo and self-consistent field approaches for the calculation of equilibrium potentials and density distributions of mobile ions around a polyion in a polyelectrolyte system are considered. In the final version of the method, the region around the polyion is divided into two zones, internal and external; all ions in the internal zone are accounted for explicitly in a Monte Carlo procedure, while in the external zone the self-consistent field approximation is applied, with an exchange of ions between the regions. Simulations are carried out for cylindrical and spherical polyions in solutions with mono- and divalent ions and their mixtures. The results are compared with the Poisson–Boltzmann approximation and with experimental data on intrinsic viscosity.

20.
Inventory data and characterization factors in life cycle assessment (LCA) contain considerable uncertainty. The most common method of parameter uncertainty propagation to the impact scores is Monte Carlo simulation, which remains a resource‐intensive option—probably one of the reasons why uncertainty assessment is not a regular step in LCA. An analytical approach based on Taylor series expansion constitutes an effective means to overcome the drawbacks of the Monte Carlo method. This project aimed to test the approach on a real case study, and the resulting analytical uncertainty was compared with Monte Carlo results. The sensitivity and contribution of input parameters to output uncertainty were also analytically calculated. This article outlines an uncertainty analysis of the comparison between two case study scenarios. We conclude that the analytical method provides a good approximation of the output uncertainty. Moreover, the sensitivity analysis reveals that the uncertainty of the most sensitive input parameters was not initially considered in the case study. The uncertainty analysis of the comparison of two scenarios is a useful means of highlighting the effects of correlation on uncertainty calculation. This article shows the importance of the analytical method in uncertainty calculation, which could lead to a more complete uncertainty analysis in LCA practice.
