Similar Literature
20 similar articles found.
1.
Existing long-term groundwater monitoring programs can be optimized to increase their effectiveness and efficiency, with the potential to generate considerable cost savings. Optimization can be achieved through an overall evaluation of the contaminant plume and the monitoring network, focused spatial and temporal sampling analyses, and automated, efficient management of data, analyses, and reporting. Version 2.0 of the Monitoring and Remediation Optimization System (MAROS) software integrates long-term monitoring analysis strategies and innovative optimization methods with a data management, processing, and reporting system, allowing site managers to quickly develop cost-effective long-term groundwater monitoring plans. The MAROS optimization strategy consists of a hierarchical combination of analysis methods essential to the decision-making process. Analyses are performed in three phases: 1) evaluating site information and historical monitoring data to obtain local concentration trends and an overview of the plume status; 2) developing optimal sampling plans for future monitoring at the site with innovative optimization methods; and 3) assessing the statistical sufficiency of the sampling plans to provide insights into the future performance of the monitoring program. Two case studies are presented to demonstrate the usefulness of the developed techniques and the rigor of the software.
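The first phase above, extracting local concentration trends from historical well records, can be sketched as a log-linear regression of concentration against time. This is a hypothetical minimal approach with illustrative thresholds, not the MAROS implementation itself:

```python
import numpy as np

def concentration_trend(times_yr, concs_ug_l):
    """Fit ln(C) = ln(C0) - k*t and classify the trend.

    A minimal sketch of phase-1 trend screening; the +/-0.05 1/yr
    thresholds are illustrative, not MAROS defaults.
    """
    k = -np.polyfit(times_yr, np.log(concs_ug_l), 1)[0]  # first-order rate, 1/yr
    if k > 0.05:
        label = "decreasing"
    elif k < -0.05:
        label = "increasing"
    else:
        label = "stable"
    return k, label

# Synthetic well record: exponential decline, C = 120*exp(-0.3*t)
t = np.arange(0, 8.0, 0.5)
c = 120.0 * np.exp(-0.3 * t)
rate, trend = concentration_trend(t, c)
print(round(rate, 2), trend)  # recovers the 0.3 1/yr decline rate
```

Each well's rate and label would then feed the plume-status overview before any sampling-plan optimization.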

2.
Principles and indicators for monitoring biological species resources, and sampling design methods
Monitoring of biological species resources is fundamental to understanding their current status and an important means of supporting their conservation and management. This paper describes the principles of scientific rigor, operational feasibility, and sustainability for species resource monitoring, and proposes a procedure for developing monitoring plans; a plan should fully account for available staffing, funding, and logistical support, and should be evaluated periodically. The roles and limitations of indicator species in resource monitoring are analyzed; taxa with differing ecological requirements and life histories should be selected as monitoring targets. Methods for selecting monitoring indicators are discussed: indicators should be measurable, scientifically grounded, publicly acceptable, low-cost, and high-benefit, and monitoring methods should be scientifically sound, capable of detecting the relevant changes, and should use efficient, low-cost, standardized protocols. Problems with the sampling designs of existing monitoring programs are analyzed, the effects of spatial variability and detectability on monitoring error and ways of handling them are explored, and the determination of sample size and the size, shape, and placement of monitoring plots are discussed. Plots should be representative, reflecting the species composition and abundance of communities in the monitored region within a limited monitored area. Finally, issues of scale and standardization in species resource monitoring are discussed.
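The sample-size determination mentioned above can be illustrated with the standard power-based formula n = ((z_alpha + z_beta) * sigma / delta)^2; this is a generic statistical sketch, not a formula taken from the paper:

```python
import math

def sample_size(sigma, delta, z_alpha=1.96, z_beta=0.84):
    """Number of plots needed to detect a mean change of `delta` given
    between-plot standard deviation `sigma` (two-sided alpha = 0.05,
    power = 0.80). Values below are illustrative, not from the study."""
    return math.ceil(((z_alpha + z_beta) * sigma / delta) ** 2)

# e.g. detecting a 10-unit change in abundance when sigma = 20
print(sample_size(sigma=20.0, delta=10.0))  # -> 32 plots
```

Larger spatial variability or lower detectability inflates sigma, which is exactly why the paper stresses representative plot placement.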

3.
An effective groundwater monitoring system can be implemented by the combined use of cone penetrometer testing (CPT), HydroPunch® sampling, and borehole geophysical methods. Together, these techniques provide a cost-effective approach to designing a groundwater monitoring system for geologists or hydrogeologists assessing a site. Given the relatively high costs of determining groundwater quality for site assessments, coupled with regulatory agency compliance, the combined methods can provide an edge in an increasingly competitive environmental industry. CPT combined with HydroPunch sampling can delineate the horizontal and vertical extent and concentration of a contaminant plume, define the extent and thickness of a free-product plume, define soil and aquifer characteristics, and aid in the proper selection of well location and screen placement. The use of borehole geophysics further enhances the interpretation provided by the CPT, yielding additional information about the depositional regime of the area of investigation and a more detailed picture of the stratigraphy. The CPT and HydroPunch can be used in unconsolidated sediments, HydroPunch sampling can be combined with a hollow-stem auger system, and borehole geophysics can be run in almost any environment. CPT and borehole geophysics provide information on the specific lithologic characteristics needed to obtain a groundwater sample from vertically separated aquifers, while the HydroPunch can obtain a discrete, chemically representative groundwater sample from the targeted aquifer. CPT and borehole geophysics can also be used to determine lithology and to correlate equivalent strata from one borehole or well to the next. Borehole geophysical interpretation also provides a means of determining not only the stratigraphy and lithology but also the aquifer parameters and the type of fluids in the aquifer.
Hydrogeologic and geologic data obtained from these three methods can be used to maximize the cost-effectiveness and design efficiency of a groundwater monitoring system. Proper well locations and screened-interval placements are determined by a coherent design process rather than by random chance. Two studies demonstrating the combined application of CPT, HydroPunch, and borehole geophysics for the design and placement of groundwater monitoring wells are presented in the following discussion.

4.
Explosives are subject to several attenuation processes that can reduce concentrations in groundwater over time. Some of these processes are well defined, while others are poorly understood. The objective of the project was to optimize data collection and processing procedures for evaluating and implementing monitored natural attenuation of explosives. After conducting experiments to optimize data quality, a protocol was established for quarterly monitoring of thirty wells over a 2-year period at a former waste disposal site. Microbial biomarkers and stable isotopes of nitrogen and carbon were explored as additional approaches to tracking attenuation processes. The project included a cone penetrometry sampling event to characterize site lithology and to obtain sample material for biomarker studies. A three-dimensional groundwater model was applied to conceptualize and predict the future behavior of the contaminant plume. The groundwater monitoring data demonstrated declining concentrations of explosives over the 2 years. Biomarker data showed the potential for microbial degradation and provided an estimate of the degradation rate. Measuring stable isotope fractions of nitrogen in TNT was a promising method of monitoring TNT attenuation. Overall, the results of the demonstration suggest that monitored natural attenuation is viable and should be among the options considered for remediation of explosives-contaminated sites.

5.
Natural attenuation of benzene and dichloroethanes in groundwater contaminated by leachate from the West KL Avenue landfill in Kalamazoo, Michigan, was evaluated in three phases. Existing data from previous site investigations were used to locate a series of high-resolution vertical profile samples. By analyzing data from the discrete vertical profile samples, the rates of attenuation of benzene and dichloroethanes in the plume were forecast. Permanent monitoring wells were then installed over the depth intervals associated with high concentrations in the vertical profile sampling, and these wells were monitored over time to obtain independent estimates of the rates of degradation of benzene and dichloroethanes. Estimates of first-order attenuation rate constants were obtained using two methods: the method of Buscheck and Alcantar (1995), which is based on a one-dimensional steady-state analytical solution, and the tracer correction method of Wiedemeier et al. (1996). The rates of attenuation predicted from the vertical profile sampling were in good agreement with the rates obtained from the permanent monitoring well data, indicating that the long-term behavior of the contaminant plumes is consistent with the initial forecast. The results also indicated that the natural attenuation of benzene, 1,1-dichloroethane (DCA), and 1,2-DCA was statistically significant (at the 0.05 level).
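The Buscheck and Alcantar (1995) estimate can be sketched as follows: regress ln(C) against distance along the plume centerline, then convert the slope magnitude k/vx into a first-order rate using their one-dimensional steady-state relation. The profile, velocity, and dispersivity values below are hypothetical, not the Kalamazoo site data:

```python
import numpy as np

def buscheck_alcantar_lambda(x_m, conc, v_c_m_per_yr, alpha_x_m):
    """First-order attenuation rate (1/yr) from a centerline profile,
    after Buscheck and Alcantar (1995):
        lambda = (v_c / (4*alpha_x)) * [(1 + 2*alpha_x*(k/vx))^2 - 1]
    `v_c` is the retarded contaminant velocity, `alpha_x` the
    longitudinal dispersivity; inputs here are hypothetical."""
    slope = -np.polyfit(x_m, np.log(conc), 1)[0]  # k/vx, 1/m
    return (v_c_m_per_yr / (4.0 * alpha_x_m)) * ((1.0 + 2.0 * alpha_x_m * slope) ** 2 - 1.0)

# Synthetic centerline profile: C = 1000*exp(-0.02*x)
x = np.linspace(0.0, 200.0, 9)
c = 1000.0 * np.exp(-0.02 * x)
lam = buscheck_alcantar_lambda(x, c, v_c_m_per_yr=30.0, alpha_x_m=10.0)
print(round(lam, 3))  # attenuation rate in 1/yr
```

Comparing such a profile-derived rate with the rate from repeated sampling of fixed wells is exactly the cross-check the study performed.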

6.
A nationwide survey of chlorinated volatile organic compound (CVOC) plumes was conducted across a spectrum of sites from diverse hydrogeologic environments and contaminant release scenarios. The goal was to evaluate significant trends in the data that relate plume behavior to site variables (e.g., source strength, mean groundwater velocity, reductive dehalogenation regime) through correlation and population analyses. Data from 65 sites (government facilities, dry cleaners, landfills) were analyzed, yielding 247 individual CVOC plumes by compound. The analyses revealed several trends, notably correlations between plume length and both maximum observed concentration (presumably reflecting the source term) and mean groundwater velocity. Reductive dehalogenation, indicated by daughter products and groundwater geochemistry, appears to exert a relatively subtle effect on plume length, apparent only after the contributions of source strength and groundwater velocity are factored out. CVOC properties (Koc, Henry's law constant) exert significant effects on the variability in maximum observed concentrations between sites but have little influence on plume length. Probabilistic plume modeling, entailing Monte Carlo simulation of an analytical solution for average plume behavior with parameter distributions derived from site data, was used to produce a synthetic plume set for comparison with field data. Modeling results exhibited good agreement with field data in terms of parameter sensitivity.
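The probabilistic plume modeling described above can be sketched by sampling parameters from distributions and evaluating a simple steady-state plume-length expression, L = (v / lambda) * ln(C0 / C_threshold), with dispersion neglected. All distributions below are hypothetical stand-ins, not the survey-derived ones:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical lognormal parameter distributions (not fitted to the survey data)
v = rng.lognormal(mean=np.log(30.0), sigma=0.5, size=n)       # groundwater velocity, m/yr
lam = rng.lognormal(mean=np.log(0.5), sigma=0.5, size=n)      # decay rate, 1/yr
c0 = rng.lognormal(mean=np.log(10_000.0), sigma=1.0, size=n)  # source concentration, ug/L
c_thr = 5.0                                                   # delineation threshold, ug/L

# Steady-state 1D advection-decay plume length, dispersion neglected
lengths = (v / lam) * np.log(c0 / c_thr)

print(int(np.median(lengths)))  # median synthetic plume length, m
```

The resulting synthetic plume-length population can then be compared against the field-observed distribution, which is the parameter-sensitivity check the survey reports.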

7.
Poplar and willow tree stands were installed in 2003 at a site in Raleigh, North Carolina, containing total petroleum hydrocarbon (TPH)-contaminated groundwater. The objective was groundwater uptake and plume control. The water table was 5 to 6 m below ground surface (bgs), so methods were used to encourage deep root development. Growth rates, rooting depth, and sap flow were measured for trees in Plot A, located in the center of the plume, and in Plot B, peripheral to the plume. The trees were initially sub-irrigated with vertically installed drip lines and by 2005 had roots 4 to 5 m bgs. Water balance calculations suggested groundwater uptake. In 2007, the average sap flow was higher for Plot B (~59 L per day per tree) than for Plot A (~23 L per day per tree), probably as a result of TPH-induced stress in Plot A. Nevertheless, the estimated rate of groundwater uptake for Plot A was sufficient, relative to the calculated rate of groundwater flux beneath the stand, that a high level of plume control was achieved based on MODFLOW modeling results. Down-gradient groundwater monitoring wells installed in late 2011 should provide quantitative data on plume control.

8.
Tert-butyl alcohol (TBA) may be present in groundwater as an original component of leaked gasoline or as a degradation product of methyl tert-butyl ether (MTBE). Evidence for natural attenuation of TBA in groundwater is presented from a chemical plant in Pasadena, Texas. Shallow groundwater in several areas of the plant has been affected by historic leaks and spills of TBA. A decade of regular groundwater monitoring of one plume, consisting primarily of TBA, shows generally declining concentrations and a shrinking plume area. Natural attenuation mechanisms are limiting the advective transport of TBA. The principal attenuation mechanism in this case is probably biodegradation, as the other physical components of natural attenuation (dilution, dispersion, diffusion, adsorption, chemical reactions, and volatilization) cannot explain the behavior of the plume over time. Biodegradation was also indicated by enrichment of the stable carbon isotope composition (13C/12C) of TBA along the flow path. Preliminary dissolved gas and electron acceptor analyses indicate that the groundwater is under at least sulfate-reducing conditions in the core of the plume, and the processes responsible for biodegradation of TBA may include fermentation under aerobic (plume fringes) and possibly anaerobic conditions. This case history demonstrates that natural attenuation of TBA is important and can be used as a groundwater management tool at this site.
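The isotope-enrichment evidence can be quantified with the Rayleigh equation, which converts a downgradient shift in delta-13C into an estimated biodegraded fraction. The sketch below uses a hypothetical enrichment factor and delta values; the abstract reports neither:

```python
def biodegraded_fraction(delta0_permil, delta_permil, eps_permil):
    """Fraction of TBA biodegraded along the flow path, via the
    Rayleigh equation: f = ((d + 1000)/(d0 + 1000))**(1000/eps).
    The enrichment factor eps (negative for normal isotope effects)
    and the delta values used below are hypothetical."""
    f = ((delta_permil + 1000.0) / (delta0_permil + 1000.0)) ** (1000.0 / eps_permil)
    return 1.0 - f

# e.g. delta-13C shifting from -28 to -24 permil with eps = -4.5 permil
B = biodegraded_fraction(-28.0, -24.0, -4.5)
print(round(B, 2))  # estimated biodegraded fraction
```

A measurable enrichment along the flow path, as at the Pasadena site, implies a nonzero biodegraded fraction regardless of the exact eps chosen.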

9.
Question: How can long-term monitoring of hydrological and ecological parameters support management strategies aimed at wetland restoration and re-creation in a complex hydrological system? Location: Newham Bog National Nature Reserve, Northumberland, UK, a site with a long history of active management, recorded as drought-sensitive over the last 100 years. Methods: Water level readings are correlated with longer-term hydrological databases, and these data are related to vegetation data collected intermittently over a 12-year period. Two analyses are undertaken: (1) a composite DCA analysis of 1993 and 2002 survey data to assess plant community transitions within the wetland and over time, and (2) analysis of recent vegetation data to explore wider vegetation gradients. This allows (3) communities to be classified using NVC classes and (4) integration with revised Ellenberg F-values. Results: Drought impact and subsequent hydrological recovery over a 22-year period are quantified. Vegetation data display strong moisture and successional gradients. Analysis shows a shift from grassland communities toward mire communities across much of the site. Conclusion: The site is regionally unique in that it has a detailed long-term monitoring record. Hydrological data and vegetation surveys have allowed the impact of the most recent 'groundwater' drought (1989–1997) to be quantified. This information on system resilience, combined with eco-hydrological analyses of plant community–water regime/quality relationships, provides a basis for recommendations concerning conservation and restoration.

10.
This study presents a method for identifying cost-effective sampling designs for long-term monitoring of groundwater remediation over multiple monitoring periods under uncertain flow conditions. A contaminant transport model is used to simulate plume migration under many equally likely stochastic hydraulic conductivity fields and provides representative samples of contaminant concentrations. Monitoring costs are minimized under a constraint to meet an acceptable level of error in the estimate of total mass for multiple contaminants simultaneously over many equiprobable realizations of the hydraulic conductivity field. A new myopic heuristic algorithm (MS-ER), built around a new error-reducing search neighborhood, is developed to solve the optimization problem. A simulated annealing algorithm using the error-reducing neighborhood (SA-ER) and a genetic algorithm (GA) are also considered. The method is applied to a hypothetical aquifer where enhanced anaerobic bioremediation of four toxic chlorinated ethene species is modeled using a complex contaminant transport model. The MS-ER algorithm consistently performed better than SA-ER and GA in multiple trials of each algorithm. The best MS-ER design produced a savings of nearly 25% in project cost over a conservative sampling plan that uses all possible locations and samples.
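The myopic, error-constrained idea can be sketched as a greedy loop: starting from all candidate wells, repeatedly drop the well whose removal least inflates the total-mass estimation error across conductivity realizations, stopping before the error constraint is violated. Everything below is a hypothetical toy with synthetic data, not the MS-ER algorithm or its transport model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_real, n_wells = 50, 12
# Synthetic concentrations: rows = K-field realizations, cols = candidate wells
conc = rng.lognormal(mean=2.0, sigma=0.8, size=(n_real, n_wells))
true_mass = conc.sum(axis=1)  # "full network" mass proxy per realization

def mass_error(selected):
    """Mean relative error of the scaled-up mass estimate over realizations."""
    est = conc[:, sorted(selected)].sum(axis=1) * n_wells / len(selected)
    return np.mean(np.abs(est - true_mass) / true_mass)

max_err = 0.15  # illustrative error constraint
selected = set(range(n_wells))
while len(selected) > 1:
    # Myopic step: test every single-well removal, keep the cheapest in error
    best = min(selected, key=lambda w: mass_error(selected - {w}))
    if mass_error(selected - {best}) > max_err:
        break
    selected.remove(best)

print(len(selected), round(mass_error(selected), 3))
```

A simulated-annealing or genetic-algorithm variant would explore the same error-reducing neighborhood stochastically instead of taking the single best myopic step.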

12.
Long-term monitoring optimization (LTMO) has proved to be a valuable method for reducing costs, ensuring that proper remedial decisions are made, and streamlining data collection and management requirements over the life of a monitoring program. A three-tiered approach for LTMO has been developed that combines a qualitative evaluation with an evaluation of temporal trends in contaminant concentrations and a spatial statistical analysis. The results of the three evaluations are combined to determine the degree to which a monitoring program addresses its objectives, and a decision algorithm is applied to assess the optimal monitoring frequency and spatial distribution of the components of the monitoring network. Ultimately, the three-tiered method can be used to identify potential modifications to sampling locations and sampling frequency that will optimally meet monitoring objectives. To date, the three-tiered approach has been applied to monitoring programs at 18 sites and has identified a potential average reduction of more than one-third in well sampling events per year. This paper discusses the three-tiered methodology, including data compilation and site screening, qualitative evaluation decision logic, temporal trend evaluation, and spatial statistical analysis, illustrated with the results of a case study site. Additionally, results of multiple applications of the three-tiered LTMO approach are summarized, and future work is discussed.
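The temporal-trend tier of such approaches is commonly implemented with a Mann-Kendall test. A minimal numpy-only sketch follows (normal approximation, no tie correction; an assumed generic form, not necessarily the paper's exact procedure):

```python
import numpy as np

def mann_kendall(values):
    """Mann-Kendall S statistic and z-score (normal approximation,
    no tie correction) for a concentration time series."""
    x = np.asarray(values, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Steadily declining well record -> strongly negative S and z
s, z = mann_kendall([10.0, 8.5, 8.9, 7.2, 6.1, 6.4, 5.0, 4.2])
print(s, round(z, 2))
```

A |z| exceeding 1.96 would flag a significant trend at the 0.05 level, which the decision algorithm can then combine with the qualitative and spatial tiers.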

14.
Techniques for monitored natural attenuation usually produce large, complex datasets that are difficult to interpret. Here, human health risk assessment and multivariate statistical analyses are combined to extract and analyze useful information from large monitoring datasets, in order to identify the main pollutants in a petroleum-contaminated aquifer in northeast China and the main biogeochemical processes affecting them. The data included organic and inorganic geochemical species concentrations, physicochemical indicators, and C and S stable isotope data collected over four years from more than 10 rounds of monitoring. The health risk assessment indicated that benzene was a representative pollutant. Cluster analysis classified the groundwater samples into two groups and indicated that strong biodegradation occurred near the core and upgradient portions of the petroleum hydrocarbon plume. The factors explaining most of the variability were extracted by principal component analysis and correlated with biodegradation and mineral dissolution processes. The factor scores and spatial distributions of hydrogeochemical and isotope indicators confirmed that biodegradation effects weakened, and mineral dissolution strengthened, from upgradient to downgradient of the contaminant plume. This analysis approach could be useful for rapidly characterizing pollution and identifying biodegradation processes in contaminated aquifers from large, complex datasets. The results will provide a basis for developing an enhanced bioremediation scheme for the study site.
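Principal component analysis of this kind can be sketched with plain numpy: standardize the sample-by-variable matrix, then take the SVD. The data below are synthetic stand-ins for the hydrochemical matrix; no claim is made about the study's actual loadings:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic monitoring matrix: 40 samples x 6 hydrochemical variables,
# generated from two latent processes (think "biodegradation" and
# "mineral dissolution") plus a little measurement noise
latent = rng.normal(size=(40, 2))
loadings = rng.normal(size=(2, 6))
X = latent @ loadings + 0.1 * rng.normal(size=(40, 6))

# Standardize each variable, then PCA via SVD
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, sv, Vt = np.linalg.svd(Z, full_matrices=False)
explained = sv**2 / np.sum(sv**2)

print(round(float(explained[:2].sum()), 2))  # variance captured by first 2 PCs
```

With two latent processes driving the data, the first two components capture nearly all the variance; on real monitoring data the factor scores (rows of U scaled by sv) are what get mapped along the plume.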

15.
Land reclamation associated with natural gas development has become increasingly important for mitigating land surface disturbance in western North America. Because well pads occur on sites with multiple land uses and ownership types, the progress and outcomes of reclamation efforts are of interest to multiple stakeholders, including industry, practitioners and consultants, regulatory agents, private landowners, and the scientific community. Reclamation success criteria often vary within and among government agencies and across land ownership types. Typically, reclamation success of a well pad is judged by comparing vegetation cover from a single transect on the pad to a single transect in an adjacent reference site, with data collected by a large number of technicians with varying field monitoring skills. We utilized the "SamplePoint" image analysis software and a spatially balanced sampling design, called balanced acceptance sampling, to demonstrate how spatially explicit quantitative data can be used to determine whether sites are meeting various reclamation success criteria, and used chi-square tests to show how sites differ in vegetation percent cover from a statistical standpoint. This method collects field data faster than traditional methods. We demonstrate how quantitative, spatially explicit data can be utilized by multiple stakeholders, how they can improve upon current reference site selection, how they can satisfy reclamation monitoring requirements for multiple regulatory agencies, and how they may help improve future seed mix selection, and we discuss how the approach may reduce costs for operations responsible for reclamation and reduce observer bias.

16.
Lawrence Livermore National Laboratory (LLNL) uses a cost-effective sampling (CES) methodology to evaluate and review ground water contaminant data and optimize the site's ground water monitoring plan. The CES methodology is part of LLNL's regulatory-approved compliance monitoring plan (Lamarre et al., 1996). It allows LLNL to adjust the ground water sampling plan every quarter in response to changing conditions at the site; because use of the CES methodology has been approved by the appropriate regulatory agencies, such adjustments do not need additional regulatory approval, permitting LLNL to respond more quickly to changing conditions. The CES methodology bases the sampling frequency for each location on trend, variability, and magnitude statistics describing the contaminants at that location, and on the input of the technical staff (hydrologists, chemists, statisticians, and project leaders). After initial setup is complete, each application of CES takes only a few days for as many as 400 wells. Effective use of the CES methodology requires sufficient data, an understanding of contaminant transport at the site, and an adequate number of monitoring wells downgradient of the contamination. The initial implementation of CES at LLNL in 1992 produced a 40% reduction in the required number of annual routine ground water samples, saving LLNL $390,000 annually in sampling, analysis, and data management costs.
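The CES idea of setting each well's sampling frequency from trend, variability, and magnitude statistics can be sketched as a simple scoring table. The thresholds and frequency tiers below are hypothetical illustrations, not LLNL's calibrated values:

```python
import numpy as np

def ces_frequency(concs, mcl):
    """Recommend a sampling frequency from trend, variability, and
    magnitude statistics of one well's record. All thresholds are
    illustrative stand-ins for the site-calibrated CES values."""
    t = np.arange(len(concs), dtype=float)
    slope = np.polyfit(t, concs, 1)[0]           # trend (conc units per event)
    cv = np.std(concs) / np.mean(concs)          # variability
    mag = np.max(concs) / mcl                    # magnitude vs. the standard
    score = int(abs(slope) > 0.1 * np.mean(concs)) + int(cv > 0.5) + int(mag > 1.0)
    return {0: "annual", 1: "semiannual", 2: "quarterly", 3: "quarterly"}[score]

# Low, stable, below-standard record -> infrequent sampling
print(ces_frequency([4.1, 3.9, 4.0, 4.2, 3.8], mcl=5.0))
# Rising, variable, above-standard record -> frequent sampling
print(ces_frequency([10.0, 50.0, 20.0, 80.0, 120.0], mcl=5.0))
```

In the real methodology the statistical score is then tempered by technical-staff review before a well's frequency is changed.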

17.
Observational sampling methods provide clearly defined guidelines for the collection and analysis of behavioral data. In some situations, use of formal sampling regimes may be impractical or impossible. A case in point is data collection conducted by animal care staff at zoological parks and aquaria. Often, time is sufficiently limited that data collection is perceived as a task that cannot be accomplished given the normal constraints of the day. Here, we explore the efficacy and validity of using more variable and abridged sampling regimes, in an effort to identify the appropriateness of such observation schemes for systematic monitoring of behavior. We describe the results of studies on three species (two polar bears, an Atlantic bottlenose dolphin calf, and two brown bears) conducted over a period of several years at the Brookfield Zoo, Brookfield, Illinois, USA. Data collection schemes varied both within and across groups in order to provide a basis of comparison. In all cases, there were significant differences based on sampling regime for rare behaviors (those that individually comprised <15% of the activity budget), but not for common behaviors. Subsampling from larger datasets indicated that data reliability increases with the number of observations. We discuss the strengths and weaknesses of such sporadic sampling methods and suggest that, in many instances, such limited data collection may yet yield an accurate picture of animal activity and should not be overlooked as a viable management tool.

18.
Seawater intrusion is a widespread environmental problem in coastal aquifers, near which more than two-thirds of the world's population lives. It is caused by indiscriminate and unplanned groundwater withdrawal to meet the growing freshwater needs of coastal regions. Computer-based models are useful tools for finding optimal solutions to seawater intrusion management problems. Various simulation and optimization modeling approaches have been used to solve such problems, and optimization approaches have proved especially valuable when combined with simulation models. This paper reviews the combined application of simulation and optimization modeling for seawater intrusion management in coastal aquifers. The review revealed that the simulation–optimization approach is well suited to finding optimal solutions to seawater intrusion management problems, even with a large number of variables. It is recommended that future research be directed toward improving long-term hydraulic assessment by collecting and analyzing widespread spatial data, which can be achieved by expanding observation and monitoring networks. Coupling socioeconomic aspects into seawater intrusion modeling is another direction that could be pursued in future studies.

19.
DNA-based techniques are increasingly used for measuring the biodiversity (species presence, identity, abundance, and community composition) of terrestrial and aquatic ecosystems. While there are numerous reviews of molecular methods and bioinformatic steps, there has been little consideration of the methods used to collect the samples upon which these later steps are based. This represents a critical knowledge gap, as methodologically sound field sampling is the foundation for subsequent analyses. We reviewed field sampling methods used for metabarcoding studies of both terrestrial and freshwater ecosystem biodiversity over a nearly three-year period (n = 75). We found that 95% (n = 71) of these studies used subjective sampling methods, used inappropriate field methods, and/or failed to provide critical methodological information. It would be possible for researchers to replicate only 5% of the metabarcoding studies in our sample, a poorer level of reproducibility than for ecological studies in general. Our findings suggest that greater attention to field sampling methods and reporting is necessary in eDNA-based studies of biodiversity to ensure robust outcomes and future reproducibility. Methods must be fully and accurately reported, and protocols should be developed that minimize subjectivity. Standardization of sampling protocols would be one way to help improve reproducibility, with the additional benefit of allowing compilation and comparison of data across studies.

20.
Ideally, population sizes of large wild animals are estimated using mathematical models and samples selected under a rigorous experimental design. Field conditions, however, often violate the models' assumptions, making random sample selection impossible; the resulting estimates are then not only unreliable but potentially meaningless. For wildlife management purposes, an accurate population size is often unnecessary: a long-term trend in numbers is sufficient to guide management. In China, long-term monitoring of large mammals has not yet become routine practice. This paper reports two long-term trend-monitoring projects for wild ungulate populations, at Yeniugou in Qinghai Province and in Aksai County, Gansu Province. We monitored continuously with the same methods throughout and explicitly acknowledged the uncertainty in the resulting counts. Despite this uncertainty, trends in the ungulate populations at the monitoring sites could still be detected, and these results can help wildlife managers adjust management plans in response to the observed changes.
