Similar Documents
20 similar documents found (search time: 15 ms)
1.
This study presents a method for identifying cost-effective sampling designs for long-term monitoring of groundwater remediation over multiple monitoring periods under uncertain flow conditions. A contaminant transport model simulates plume migration under many equally likely stochastic hydraulic conductivity fields and provides representative samples of contaminant concentrations. Monitoring costs are minimized subject to a constraint on the acceptable error in the estimate of total mass for multiple contaminants simultaneously over many equiprobable realizations of the hydraulic conductivity field. A new myopic heuristic algorithm (MS-ER) built around a new error-reducing search neighborhood is developed to solve the optimization problem. A simulated annealing algorithm using the error-reducing neighborhood (SA-ER) and a genetic algorithm (GA) are also considered. The method is applied to a hypothetical aquifer where enhanced anaerobic bioremediation of four toxic chlorinated ethene species is modeled with a complex contaminant transport model. MS-ER consistently outperformed SA-ER and GA in multiple trials of each algorithm. The best MS-ER design reduced project cost by nearly 25% relative to a conservative sampling plan that uses all possible locations and samples.
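The constrained optimization described above can be illustrated with a small sketch. The code below is a hypothetical, simplified stand-in for the SA-ER idea: a simulated annealing loop that toggles sampling locations in and out of the design while rejecting any design whose total-mass estimation error, averaged over synthetic "realizations," exceeds a threshold. The paper's error-reducing neighborhood, transport model, and multi-contaminant accounting are not reproduced; all names, data, and parameter values here are illustrative.

```python
import math
import random

def estimation_error(design, concentrations):
    """Mean relative error of the total-mass estimate over all realizations
    when only the locations in `design` are sampled (simple scaling estimator)."""
    errs = []
    for c in concentrations:                  # one concentration list per realization
        true_mass = sum(c)
        est_mass = sum(c[i] for i in design) * len(c) / len(design)
        errs.append(abs(est_mass - true_mass) / true_mass)
    return sum(errs) / len(errs)

def anneal_design(concentrations, max_error=0.15, steps=3000, seed=1):
    """Minimize the number of sampled locations subject to an error constraint."""
    rng = random.Random(seed)
    n = len(concentrations[0])
    current = set(range(n))                   # feasible start: sample everywhere
    best = set(current)
    temp = 1.0
    for _ in range(steps):
        cand = set(current)
        i = rng.randrange(n)
        if i in cand:                         # toggle one location in or out
            cand.remove(i)
        else:
            cand.add(i)
        if cand and estimation_error(cand, concentrations) <= max_error:
            delta = len(cand) - len(current)  # cost change (each sample costs 1)
            if delta <= 0 or rng.random() < math.exp(-delta / temp):
                current = cand
                if len(current) < len(best):
                    best = set(current)
        temp *= 0.999                         # geometric cooling schedule
    return sorted(best)

# Synthetic stand-in for transport-model output: 20 realizations x 12 locations.
rng = random.Random(0)
realizations = [[rng.uniform(0.5, 1.5) for _ in range(12)] for _ in range(20)]
design = anneal_design(realizations)
```

Because the search starts from the all-locations design (zero estimation error) and only ever accepts feasible moves, the returned design always satisfies the error constraint.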

2.
Long-term monitoring optimization (LTMO) has proved a valuable method for reducing costs, ensuring that proper remedial decisions are made, and streamlining data collection and management requirements over the life of a monitoring program. A three-tiered approach for LTMO has been developed that combines a qualitative evaluation, an evaluation of temporal trends in contaminant concentrations, and a spatial statistical analysis. The results of the three evaluations are combined to determine the degree to which a monitoring program addresses its objectives, and a decision algorithm is applied to assess the optimal monitoring frequency and spatial distribution of the components of the monitoring network. Ultimately, application of the three-tiered method can identify modifications to sampling locations and sampling frequency that will optimally meet monitoring objectives. To date, the three-tiered approach has been applied to monitoring programs at 18 sites and has identified a potential average reduction of over one-third of well sampling events per year. This paper discusses the three-tiered methodology, including data compilation and site screening, qualitative evaluation decision logic, temporal trend evaluation, and spatial statistical analysis, illustrated with the results of a case study site. Additionally, results of multiple applications of the three-tiered LTMO approach are summarized, and future work is discussed.
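The temporal-trend tier of such an approach is commonly implemented with a nonparametric test such as Mann-Kendall (the abstract does not name a specific test, so this is an assumption); a minimal version without tie correction:

```python
import math

def mann_kendall(values):
    """Mann-Kendall S statistic and normal-approximation Z score for a
    monotonic trend in a time series (no tie correction; a two-sided test
    at the 5% level rejects "no trend" when |Z| > 1.96)."""
    n = len(values)
    s = sum(
        (values[j] > values[i]) - (values[j] < values[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A steadily declining concentration record gives a strongly negative trend.
s, z = mann_kendall([10.0, 8.5, 7.9, 6.2, 5.5, 4.1, 3.0, 2.8])
```

A significantly negative Z at a well supports reducing its sampling frequency; a positive or absent trend argues for keeping it in the network.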

3.

4.
The effect of three-dimensional heterogeneity of saturated hydraulic conductivity on the vertical transport of solutes in soils is examined by means of controlled numerical experiments. Saturated hydraulic conductivity, an important transport parameter that controls the dispersion of pollutants in heterogeneous soils, is assumed to be composed of a homogeneous mean value and a perturbation caused by the vertical variability of soil properties, producing a stochastic process in the mean flow direction. The spatial heterogeneity of porous soils is characterized by the variance and correlation scale of the saturated hydraulic conductivity in the transport domain. Numerical experiments are carried out to evaluate the extent of contaminant dispersion in Hawaiian Oxic soils when uncertainty exists as a result of the spatial heterogeneity of saturated hydraulic conductivity. Statistical analysis of saturated hydraulic conductivity measurements on undisturbed soil cores from two locations in Hawaiian Oxic soils indicated two different soils with the same mean and different variances. The partial differential equations describing three-dimensional transient flow and solute transport in soils with a random conductivity field were solved to evaluate the effect of these two variance levels on the transport of a contaminant plume originating from the surface. The significance of the variance on the spatial and temporal distribution of tracer concentrations is demonstrated using solute breakthrough curves at various depths in the soil profile. The longitudinal macrodispersivity resulting from tracer spreading in the heterogeneous soils with a finite local dispersivity is also analyzed. The analysis shows similar solute dispersion behavior for the two variances; however, there is an overall reduction in the dispersion of solutes when a uniform velocity field with the same mean is used. Macrodispersivity values in heterogeneous soils are proportional to the variance at smaller travel distances but converge to the same value at larger travel distances.
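The "homogeneous mean plus stochastic perturbation" conductivity model described above can be sketched in one dimension as a stationary AR(1) process for ln K with an exponential correlation structure. This is a generic illustration of such a field generator, not the paper's method, and every parameter value below is made up.

```python
import math
import random

def lognormal_k_field(n, mean_lnk, var_lnk, corr_len, dz=0.1, seed=0):
    """1-D saturated-conductivity profile: ln K is a stationary AR(1)
    process with variance `var_lnk` and exponential correlation over
    length scale `corr_len`, sampled every `dz` depth units."""
    rng = random.Random(seed)
    rho = math.exp(-dz / corr_len)       # lag-one correlation of the AR(1) process
    sigma = math.sqrt(var_lnk)
    f = rng.gauss(0.0, sigma)            # draw the perturbation's stationary start
    field = []
    for _ in range(n):
        field.append(math.exp(mean_lnk + f))
        # Innovation variance chosen so the process stays stationary.
        f = rho * f + rng.gauss(0.0, sigma * math.sqrt(1.0 - rho * rho))
    return field

field = lognormal_k_field(n=500, mean_lnk=0.0, var_lnk=0.5, corr_len=0.5)
```

Generating many such fields with different seeds yields the equiprobable realizations over which transport statistics (e.g., breakthrough curves, macrodispersivity) can be averaged.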

5.
Lawrence Livermore National Laboratory (LLNL) uses a cost-effective sampling (CES) methodology to evaluate and review ground water contaminant data and optimize the site's ground water monitoring plan. The CES methodology is part of LLNL's regulatory approved compliance monitoring plan (Lamarre et al., 1996). It allows LLNL to adjust the ground water sampling plan every quarter in response to changing conditions at the site. Since the use of the CES methodology has been approved by the appropriate regulatory agencies, such adjustments do not need additional regulatory approval. This permits LLNL to respond more quickly to changing conditions. The CES methodology bases the sampling frequency for each location on trend, variability, and magnitude statistics describing the contaminants at that location, and on the input of the technical staff (hydrologists, chemists, statisticians, and project leaders). After initial setup is complete, each application of CES takes only a few days for as many as 400 wells. Effective use of the CES methodology requires sufficient data, an understanding of contaminant transport at the site, and an adequate number of monitoring wells downgradient of the contamination. The initial implementation of CES at LLNL in 1992 produced a 40% reduction in the required number of annual routine ground water samples at LLNL. This has saved LLNL $390,000 annually in sampling, analysis, and data management costs.
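A CES-style mapping from trend, variability, and magnitude statistics to a sampling frequency might look like the sketch below. The actual LLNL decision rules and thresholds are not given in the abstract, so every cutoff and category here is a hypothetical illustration of the shape of such logic, not the real methodology.

```python
def recommended_frequency(trend_z, cov, max_conc, mcl):
    """Hypothetical CES-style decision rule: combine trend strength
    (a Mann-Kendall-type Z score), variability (coefficient of variation),
    and magnitude relative to a compliance limit (e.g., an MCL) into a
    sampling-frequency recommendation. Illustrative thresholds only."""
    score = 0
    if abs(trend_z) > 1.96:   # statistically significant trend
        score += 2
    if cov > 1.0:             # highly variable record
        score += 1
    if max_conc > mcl:        # concentrations above the compliance limit
        score += 2
    if score >= 4:
        return "quarterly"
    if score >= 2:
        return "semiannual"
    return "annual"
```

In the real methodology these statistics are moderated by technical-staff review before the schedule is changed; a rule of this kind only proposes a starting point.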

6.
Existing long-term groundwater monitoring programs can be optimized to increase their effectiveness and efficiency, with the potential to generate considerable cost savings. The optimization can be achieved through an overall evaluation of the conditions of the contaminant plume and the monitoring network, focused spatial and temporal sampling analyses, and automated, efficient management of data, analyses, and reporting. Version 2.0 of the Monitoring and Remediation Optimization System (MAROS) software, by integrating long-term monitoring analysis strategies and innovative optimization methods with a data management, processing, and reporting system, allows site managers to quickly develop cost-effective long-term groundwater monitoring plans. The MAROS optimization strategy consists of a hierarchical combination of analysis methods essential to the decision-making process. Analyses are performed in three phases: 1) evaluating site information and historical monitoring data to obtain local concentration trends and an overview of the plume status; 2) developing optimal sampling plans for future monitoring at the site with innovative optimization methods; and 3) assessing the statistical sufficiency of the sampling plans to provide insights into the future performance of the monitoring program. Two case studies are presented to demonstrate the usefulness of the developed techniques and the rigor of the software.

7.

8.
A biphasic nonlinear mathematical model is proposed for the mass transport that occurs during constant flow-rate infusions into brain tissue. The model takes into account geometric and material nonlinearities and a hydraulic conductivity dependent upon strain. The biphasic and convective–diffusive transport equations were implemented in a custom-written code assuming spherical symmetry and using an updated Lagrangian finite element algorithm. Results of the model indicate that the inclusion of these nonlinearities produced modest changes in the interstitial concentration but important variations in drug penetration and bulk concentration. Increased penetration of the drug but smaller bulk concentrations were obtained at smaller strains caused by combinations of parameters such as increased Young's modulus and initial hydraulic conductivity. This indicates that simulations of constant flow-rate infusions under the assumption of infinitesimal deformations or rigidity of the tissue may yield lower bulk concentrations near the infusion cavity and over-predictions of the penetration of the infused agent. The analyses also showed that decreasing the infusion flow rate for a fixed amount of drug results in increased penetration of the infused agent. From a clinical point of view, this may promote a safer infusion that delivers the therapeutic range over the desired volume while avoiding damage to the tissue by minimizing deformation and strain.

9.
Much modern work in phylogenetics depends on statistical sampling approaches to phylogeny construction to estimate probability distributions of possible trees for any given input data set. Our theoretical understanding of sampling approaches to phylogenetics remains far less developed than that for optimization approaches, however, particularly with regard to the number of sampling steps needed to produce accurate samples of tree partition functions. Despite the many advantages in principle of being able to sample trees from sophisticated probabilistic models, we have little theoretical basis for concluding that the prevailing sampling approaches do in fact yield accurate samples from those models within realistic numbers of steps. We propose a novel approach to phylogenetic sampling intended to be both efficient in practice and more amenable to theoretical analysis than the prevailing methods. The method depends on replacing the standard tree rearrangement moves with an alternative Markov model in which one solves a theoretically hard but practically tractable optimization problem on each step of sampling. The resulting method can be applied to a broad range of standard probability models, yielding practical algorithms for efficient sampling and rigorous proofs of accurate sampling for heated versions of some important special cases. We demonstrate the efficiency and versatility of the method by an analysis of uncertainty in tree inference over varying input sizes. In addition to providing a new practical method for phylogenetic sampling, the technique is likely to prove applicable to many similar problems involving sampling over combinatorial objects weighted by a likelihood model.

10.
In recent years the genetic algorithm (GA) has been used successfully to solve many optimization problems. One of the most difficult questions in applying a GA to a particular problem is that of coding. In this paper a scheme is derived to optimize one aspect of the coding in an automatic fashion. This is done by using a high-cardinality alphabet and optimizing the meaning of the letters. The scheme is especially well suited to cases where a number of similar problems need to be solved. The use of the scheme is demonstrated on such a family of problems: the simplified problem of navigating a ‘robot’ in a ‘room.’ It is shown that for the sample problem family the proposed algorithm is superior to the canonical GA. Received: 26 August 1994 / Accepted in revised form: 13 January 1995
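For reference, the canonical GA that serves as the baseline in studies like this one can be sketched in a few lines: tournament selection, one-point crossover, and per-gene mutation over strings from an integer alphabet. The paper's automatic optimization of the letters' meanings is not reproduced here; the fitness function and all parameters below are toy choices.

```python
import random

def genetic_algorithm(fitness, alphabet_size, length, pop_size=40,
                      gens=60, mut_rate=0.05, seed=0):
    """Canonical GA over fixed-length strings from a high-cardinality
    integer alphabet: binary tournament selection, one-point crossover,
    and per-gene mutation."""
    rng = random.Random(seed)
    pop = [[rng.randrange(alphabet_size) for _ in range(length)]
           for _ in range(pop_size)]

    def pick():
        a, b = rng.sample(pop, 2)              # binary tournament
        return a if fitness(a) >= fitness(b) else b

    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)     # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [rng.randrange(alphabet_size) if rng.random() < mut_rate
                     else g for g in child]    # per-gene mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy fitness: maximize the sum of genes over a 16-symbol alphabet.
best = genetic_algorithm(lambda s: sum(s), alphabet_size=16, length=8)
```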

11.
12.
Root hydraulic conductivity has been shown to decrease under phosphorus (P) deficiency. This study investigated how the formation of aerenchyma is related to this change. Root anatomy, as well as root hydraulic conductivity, was studied in maize (Zea mays L.) roots under different phosphorus nutrition conditions. Plant roots under P stress showed enhanced degradation of cortical cells, and the resulting aerenchyma formation was associated with reduced root hydraulic conductivity, supporting our hypothesis that air spaces forming in the cortex of phosphorus-stressed roots impede the radial transport of water in the root cylinder. Further evidence came from the variation in aerenchyma formation due to genotypic differences: five maize inbred lines differing in root cortex porosity showed a significant negative correlation between porosity and root hydraulic conductivity. Shoot relative water content was also lower in P-deficient maize plants than in P-sufficient ones when the treatment was sufficiently prolonged, suggesting a limitation of water transport due to the lowered root hydraulic conductivity of P-deficient plants.

13.
The self-organizing map (SOM), a kind of unsupervised neural network, has been used for both static data management and dynamic data analysis. To further exploit its search abilities, in this paper we propose an SOM-based algorithm (SOMS) for optimization problems involving both static and dynamic functions. Furthermore, a new SOM weight-updating rule is proposed to enhance the learning efficiency; it dynamically adjusts the neighborhood function as the SOM learns the system parameters. As a demonstration, the proposed SOMS is applied to function optimization and to dynamic trajectory prediction, and its performance is compared with that of the genetic algorithm (GA), since both methods conduct their searches in similar ways.
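For context, the classic SOM update that such work builds on is `w_j += lr * h(j, winner) * (x - w_j)`, with a Gaussian neighborhood `h` that shrinks over training. The sketch below is the generic one-dimensional SOM, not the modified SOMS rule proposed in the paper; all parameters are illustrative.

```python
import math
import random

def som_1d(data, n_units=10, epochs=50, lr0=0.5, radius0=3.0, seed=0):
    """Minimal 1-D self-organizing map: for each input, find the winning
    unit and pull every unit toward the input, weighted by a Gaussian
    neighborhood around the winner that narrows as training proceeds."""
    rng = random.Random(seed)
    lo, hi = min(data), max(data)
    weights = [rng.uniform(lo, hi) for _ in range(n_units)]
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)                    # decaying learning rate
        radius = max(radius0 * (1.0 - t / epochs), 0.5)  # shrinking neighborhood
        for x in data:
            winner = min(range(n_units), key=lambda j: abs(x - weights[j]))
            for j in range(n_units):
                h = math.exp(-((j - winner) ** 2) / (2.0 * radius ** 2))
                weights[j] += lr * h * (x - weights[j])
    return weights

rng = random.Random(1)
data = [rng.uniform(0.0, 1.0) for _ in range(200)]
weights = som_1d(data)
```

The neighborhood schedule is the lever the paper modifies: adapting it during learning, rather than fixing a decay schedule, is what their new rule targets.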

14.
Waste stabilization ponds (WSPs) have been used extensively to provide wastewater treatment throughout the world. However, no rigorous assessment of WSPs that accounts for cost in addition to hydrodynamics and treatment efficiency has been performed. A study was conducted that coupled computational fluid dynamics (CFD) with an optimization program to select the best WSP configuration based on cost and treatment efficiency. Monitoring the fecal coliform concentration at the reactor outlet showed that the conventional 70% pond-width baffle design is not consistently the best pond configuration, contrary to previous reports in the literature. The target effluent log reduction can be achieved by reducing the amount of construction material and tolerating some degree of fluid mixing within the pond. As expected, the multi-objective genetic algorithm produced a lower-cost WSP design than a SIMPLEX optimization algorithm, though with only a marginal increase in the effluent microbial log reduction. Several other designs generated by the CFD/optimization model showed that shorter and longer baffles, alternative depths, and different reactor length-to-width ratios could improve the hydraulic efficiency of the ponds at a reduced overall construction cost.

15.
Xylem vessel structure changes as trees grow and mature. Age- and development-related changes in xylem structure are likely related to changes in hydraulic function. We examined whether hydraulic function, including hydraulic conductivity and vulnerability to water-stress-induced xylem embolism, changed over the course of cambial development in the stems of 17 tree species. We compared current-year growth of young (1–4 years), intermediate (2–7 years), and older (3–10 years) stems occurring in series along branches. Diffuse and ring porous species were examined, but nearly all species produced only diffuse porous xylem in the distal branches that were examined irrespective of their mature xylem porosity type. Vessel diameter and length increased with cambial age. Xylem became both more conductive and more cavitation resistant with cambial age. Ring porous species had longer and wider vessels and xylem that had higher conductivity and was more vulnerable to cavitation; however, these differences between porosity types were not present in young stem samples. Understanding plant hydraulic function and architecture requires the sampling of multiple-aged tissues because plants may vary considerably in their xylem structural and functional traits throughout the plant body, even over relatively short distances and closely aged tissues.

16.
This paper studies the application of evolutionary algorithms to the bi-objective travelling salesman problem. Two classes of evolutionary algorithms are considered: estimation of distribution algorithms (EDAs) and the genetic algorithm (GA). The solution to this problem is a set of trade-off alternatives. The problem is solved by optimizing the order of the cities so as to simultaneously minimize the two objectives of travelling distance and travelling cost incurred by the travelling salesman. In this paper, binary-representation-based evolutionary algorithms are replaced with an integer representation. Three existing EDAs are altered to use this integer representation: the restricted Boltzmann machine (RBM), the univariate marginal distribution algorithm (UMDA), and population-based incremental learning (PBIL). Each city is associated with a representative integer, and the probability of each representative integer occupying any position of the chromosome is constructed through the modeling approach of the EDAs. New sequences of cities are obtained by sampling from the probabilistic model. A refinement operator and a local search operator are also proposed. The EDAs are subsequently hybridized with the GA in order to complement the limitations of both algorithms. The effect that each of these operators has on the quality of the solutions is investigated. Empirical results show that the hybrid algorithms are capable of finding a set of good trade-off solutions.
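The "learn position-wise probabilities, then sample valid tours" idea can be sketched as a UMDA-style loop. This is a simplified, single-objective illustration of the setup described above: no RBM or PBIL variants, no refinement or local-search operators, no GA hybridization, and a made-up random instance.

```python
import random

def umda_tsp(dist, pop_size=60, elite_frac=0.3, gens=40, seed=0):
    """UMDA-style integer-representation EDA for a single-objective TSP:
    estimate, for each tour position, a smoothed probability over city
    labels from the elite tours, then sample new tours from the model,
    drawing without replacement so every sample is a valid permutation."""
    rng = random.Random(seed)
    n = len(dist)

    def tour_len(t):
        return sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))

    def sample(model):
        tour, remaining = [], set(range(n))
        for pos in range(n):
            # Zero out cities already placed; Laplace smoothing keeps
            # at least one positive weight among the remaining cities.
            w = [model[pos][c] if c in remaining else 0.0 for c in range(n)]
            city = rng.choices(range(n), weights=w)[0]
            tour.append(city)
            remaining.discard(city)
        return tour

    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=tour_len)
        elite = pop[: max(2, int(elite_frac * pop_size))]
        model = [[1.0] * n for _ in range(n)]   # Laplace-smoothed counts
        for t in elite:
            for pos, city in enumerate(t):
                model[pos][city] += 1.0
        pop = elite + [sample(model) for _ in range(pop_size - len(elite))]
    best = min(pop, key=tour_len)
    return best, tour_len(best)

# Hypothetical instance: 7 random points in the unit square.
rng = random.Random(42)
pts = [(rng.random(), rng.random()) for _ in range(7)]
dist = [[((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 for bx, by in pts]
        for ax, ay in pts]
best_tour, best_len = umda_tsp(dist)
```

For the bi-objective version in the paper, the scalar `tour_len` ranking would be replaced by a Pareto-based ranking over distance and cost.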

17.
Information on the distribution of multiple species in a common landscape is fundamental to effective conservation and management. However, distribution data are expensive to obtain and often limited to high-profile species in a system. A recently developed technique, environmental DNA (eDNA) sampling, has been shown to be more sensitive than traditional detection methods for many aquatic species. A second and perhaps underappreciated benefit of eDNA sampling is that a sample originally collected to determine the presence of one species can be re-analyzed to detect additional taxa without additional field effort. We developed an eDNA assay for the western pearlshell mussel (Margaritifera falcata) and evaluated its effectiveness by analyzing previously collected eDNA samples that were annotated with information including sample location and deposited in a central repository. The eDNA samples were initially collected to determine habitat occupancy by nonbenthic fish species at sites in the vicinity of locations recently occupied by western pearlshell. These repurposed eDNA samples produced results congruent with historical western pearlshell surveys and permitted a more precise delineation of the extent of local populations. That a sampling protocol designed to detect fish was also successful for detecting a freshwater mussel suggests that rapidly accumulating collections of eDNA samples can be repurposed to enhance the efficiency and cost-effectiveness of aquatic biodiversity monitoring.

18.
19.
Iterative pass optimization of sequence data (cited 3 times: 1 self-citation, 2 by others)
The problem of determining minimum-cost hypothetical ancestral sequences for a given cladogram is known to be NP-complete. This "tree alignment" problem has motivated the considerable effort placed in multiple sequence alignment procedures. Wheeler in 1996 proposed a heuristic method, direct optimization, to calculate cladogram costs without the intervention of multiple sequence alignment. This method, though more efficient in time and more effective in cladogram length than many alignment-based procedures, greedily optimizes nodes based on descendant information only. In their proposal of an exact multiple alignment solution, Sankoff et al. in 1976 described a heuristic procedure, the iterative improvement method, to create alignments at internal nodes by solving a series of median problems. Combining three-sequence direct optimization with iterative improvement and a branch-length-based cladogram cost procedure yields an algorithm that frequently results in superior (i.e., lower) cladogram costs. This iterative pass optimization is both computation- and memory-intensive, but economies can be made to reduce the burden. An example in arthropod systematics is discussed.

20.
Traffic optimization of railroad networks was considered using an algorithm biologically inspired by an amoeba-like organism, the plasmodium of the true slime mold Physarum polycephalum. The organism develops a transportation network consisting of tubular structures that transport protoplasm. It has been reported that the plasmodium can find the shortest path interconnecting multiple food sites during an adaptation process (Nakagaki et al., 2001. Biophys. Chem. 92, 47-52). By mimicking this adaptation process, Tero et al. (2007) developed a path-finding algorithm. In this paper, the algorithm is modified for optimizing traffic distribution in infrastructure transportation networks such as railroads, under the constraint that the network topology is given. Application of the algorithm to a railroad network in metropolitan Tokyo, Japan is demonstrated. The results are evaluated using three performance functions related to cost, traveling efficiency, and network weakness. The resulting traffic distribution suggests that the modified Physarum algorithm balances these performance measures within a certain parameter range, mirroring the biological adaptation process.
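The core of the Physarum adaptation process is easy to see on a toy network: flux divides among tubes like current among parallel conductors, and each tube's conductivity relaxes toward the flux it carries, so lightly used tubes wither. The sketch below is a minimal two-path version of the Tero et al. dynamics (with linear flux feedback); the paper's railroad-network modification and performance functions are not reproduced, and all parameters are illustrative.

```python
def physarum_two_paths(l_short=1.0, l_long=2.0, d0=0.5, total_flow=1.0,
                       dt=0.1, steps=500):
    """Physarum tube adaptation on two parallel paths between one source
    and one sink. Flux divides like parallel conductors (conductance D/L),
    and each tube's conductivity relaxes toward its flux: dD/dt = |Q| - D.
    The shorter path should end up carrying essentially all of the flow."""
    d_short, d_long = d0, d0
    for _ in range(steps):
        g_short = d_short / l_short          # conductance of the short tube
        g_long = d_long / l_long             # conductance of the long tube
        q_short = total_flow * g_short / (g_short + g_long)  # current divider
        q_long = total_flow - q_short
        d_short += dt * (q_short - d_short)  # forward-Euler adaptation step
        d_long += dt * (q_long - d_long)
    return d_short, d_long

d_short, d_long = physarum_two_paths()
```

On a general graph the current-divider step becomes a linear (Kirchhoff) solve for the pressures at every node, but the per-edge adaptation rule is the same.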


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)