Similar Literature
A total of 20 similar documents were found (search time: 15 ms).
1.
Areal up-scaling of organic and inorganic metabolism at the reef scale is possible using in-situ measurements together with remote sensing data that provide the extent of each bottom type inside the reef. Using a SPOT image and published metabolic values, the gross production (93,560 × 10³ kg C year⁻¹), excess production (10,017 × 10³ kg C year⁻¹) and calcification (165,348 × 10³ kg CaCO₃ year⁻¹) over 35 km² of coral reef environment at Moorea Island (French Polynesia) are estimated. While the computations are straightforward, certain assumptions must be made in order to conduct the scaling exercise. The exercise is valid only if the metabolism of reef benthos is additive across increasing spatial scales. Despite the difficulty of quantitatively assessing such extrapolations, spatial additivity appears to hold in practice. The other limitation is that the reef must be treated as a closed system in an equilibrium state, assumed to be accurately described by the few available in-situ measurements. To treat the reef as an open system, long-term metabolic measurements coupled with knowledge of oceanic and land forcing processes are required. These theoretical considerations point to the necessity of integrated multi-scale studies based on both remote sensing and in-situ data in order to better understand the productivity and calcification of reefs in the context of current global change.
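
As a hedged illustration of the areal up-scaling arithmetic described above (per-habitat metabolic rates multiplied by the mapped area of each bottom type and summed over the reef), here is a minimal Python sketch; the habitat names, areas and rates are invented placeholders, not the Moorea values.

```python
# Hypothetical illustration of areal up-scaling of reef metabolism.
# Rates are per-area values (kg C or kg CaCO3 m^-2 yr^-1); areas come from a
# classified remote-sensing map (m^2). All numbers are placeholders.

habitats = {
    #               area_m2, gross_production, calcification
    "fringing_reef": (8.0e6, 2.9, 4.2),
    "barrier_reef":  (12.0e6, 3.1, 5.0),
    "lagoon_sand":   (15.0e6, 0.8, 0.4),
}

def upscale(habitats):
    """Sum per-habitat rate x area products over the whole reef."""
    gross = sum(area * gp for area, gp, _ in habitats.values())
    calc = sum(area * ca for area, _, ca in habitats.values())
    return gross, calc

gross_total, calc_total = upscale(habitats)
print(f"Gross production: {gross_total / 1e3:,.0f} x 10^3 kg C yr^-1")
print(f"Calcification:    {calc_total / 1e3:,.0f} x 10^3 kg CaCO3 yr^-1")
```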

2.
With the popularization and development of cloud computing, many scientific computing applications are now run in cloud environments. However, the application scenarios of scientific computing are becoming increasingly dynamic and complicated, with unpredictable job submission times, different job priorities, and deadline and budget constraints on job execution. Thus, how to perform scientific computing efficiently in the cloud has become an urgent problem. To address this problem, we design an elastic resource provisioning and task scheduling mechanism to execute scientific workflow jobs in the cloud. The goal of this mechanism is to complete as many high-priority workflow jobs as possible under budget and deadline constraints. The mechanism consists of four steps: job preprocessing, job admission control, elastic resource provisioning and task scheduling. We evaluate it with four kinds of real scientific workflow jobs under different budget constraints, taking into account the uncertainties of task runtime estimates, VM provisioning delays, and task failures. The results show that in most cases our mechanism achieves better performance than other mechanisms. In addition, the uncertainties of task runtime estimates, VM provisioning delays, and task failures do not have a major impact on the mechanism's performance.
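
The abstract does not disclose the internals of the mechanism, so the sketch below only illustrates, under invented assumptions, what a priority- and budget-aware admission-control step of this kind could look like; the Job class, cost model and thresholds are hypothetical, not the authors' algorithm.

```python
# Illustrative admission-control step: accept high-priority workflow jobs
# only while their estimated cost and finish time fit the remaining budget
# and their deadlines. This is NOT the paper's algorithm, just a sketch.
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Job:
    priority: int                 # lower value = higher priority
    name: str = field(compare=False)
    est_runtime_h: float = field(compare=False)
    deadline_h: float = field(compare=False)

def admit_jobs(jobs, budget, vm_price_per_hour, now_h=0.0):
    """Greedily admit jobs in priority order while budget/deadline hold."""
    admitted, remaining = [], budget
    heap = list(jobs)
    heapq.heapify(heap)
    while heap:
        job = heapq.heappop(heap)
        cost = job.est_runtime_h * vm_price_per_hour
        finishes_in_time = now_h + job.est_runtime_h <= job.deadline_h
        if cost <= remaining and finishes_in_time:
            admitted.append(job.name)
            remaining -= cost
    return admitted, remaining

jobs = [Job(1, "montage", 4.0, 10.0), Job(2, "cybershake", 6.0, 8.0),
        Job(3, "epigenomics", 3.0, 5.0)]
print(admit_jobs(jobs, budget=1.0, vm_price_per_hour=0.10))
```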

3.
Land use change has a major impact on the goods and services that our environment supplies to society. While detailed ecological or biophysical field studies are needed to quantify the exact amount of ecosystem service supply at local scales, such monitoring may be unfeasible at the regional scale. Since field-scale monitoring schemes for ecosystem services or ecosystem functioning are missing, proxy-based indicators can help to assess the historic development of ecosystem services or ecosystem functioning at the regional scale. Using the historic development (1964–2004) of the district of Leipzig, Germany, as an example, we show how land use/land cover data can be used to derive regional-scale indicators for ecosystem functions. We focus on two hypotheses: (1) ecosystem functioning has degraded over time and (2) changes in land use configuration play an important role in this degradation. The study focuses on indicators for ecosystem functions related to (i) water purification by riparian buffer strips, (ii) pollination, (iii) food production and (iv) outdoor recreation. Each indicator builds on the analysis of land use configuration and land use composition information and is tested for sensitivity/robustness with respect to parameters that had to be estimated from expert knowledge. We show that land use composition is an important aspect in our ecosystem service assessment. Although our study region experienced a maximum land use change of 11% in the major land use classes between 1964 and 2004, we see a decrease of up to 23% in the ecosystem function indicators. The regional assessment shows an overall trend of degradation of ecosystem functioning from 1964 to 1984. This trend is reversed between 1984 and 1994, but the recovery slowed down until 2004 without reaching the level of 1964.
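
As a hedged illustration of the simplest building block of such proxy indicators, the sketch below computes land-use composition shares and their change between two survey years; the class names and areas are invented and are not the Leipzig data.

```python
# Hypothetical example: compare land-use composition shares between two
# survey years and report the change per class. The class areas below are
# invented; they are not the Leipzig district data.

def composition_shares(areas_ha):
    total = sum(areas_ha.values())
    return {cls: area / total for cls, area in areas_ha.items()}

areas_1964 = {"arable": 62_000, "forest": 18_000, "grassland": 12_000, "urban": 8_000}
areas_2004 = {"arable": 55_000, "forest": 19_000, "grassland": 10_000, "urban": 16_000}

s64, s04 = composition_shares(areas_1964), composition_shares(areas_2004)
for cls in s64:
    change_pp = 100 * (s04[cls] - s64[cls])   # change in percentage points
    print(f"{cls:>10}: {100*s64[cls]:5.1f}% -> {100*s04[cls]:5.1f}% ({change_pp:+.1f} pp)")
```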

4.
Synopsis A visual census technique is described in which the results of three separate enumerations of fish at a site are combined to produce a best estimate of the fish fauna present. Its precision and accuracy are examined and compared with those of censuses obtained by modifications of the technique. Visual censuses can display high repeatability, but they seldom (if ever) completely sample the fish present at a site. Accuracy varies with the technique used. In our tests, the preferred method yielded 82% of species and 75% of individuals known to be present and potentially censusable at the time the observations were made. Visual censuses are of comparable accuracy to ichthyocide collections at unenclosed sites, but the two methods sample different components of the total fish fauna. It is important when using visual censuses to remember that their accuracy is not 100%.
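
A minimal sketch of the census logic (three repeat counts combined into a best estimate and compared against the fauna known to be present) is given below; the species sets are invented and the paper's actual combination rules may differ.

```python
# Hypothetical illustration: combine three repeat visual censuses into a
# "best estimate" species list (here simply their union) and express
# accuracy as the fraction of the known fauna detected. Names are made up.

census_1 = {"parrotfish", "wrasse", "damselfish", "goby"}
census_2 = {"parrotfish", "surgeonfish", "damselfish"}
census_3 = {"wrasse", "damselfish", "goby", "butterflyfish"}
known_fauna = census_1 | census_2 | census_3 | {"blenny", "cardinalfish"}

best_estimate = census_1 | census_2 | census_3
accuracy = len(best_estimate & known_fauna) / len(known_fauna)
print(f"Species detected: {len(best_estimate)} of {len(known_fauna)} "
      f"({100 * accuracy:.0f}% of known species)")
```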

5.
Cluster Computing - Cloud computing is a new computation technology that provides services to consumers and businesses. The main idea of Cloud computing is to present software and hardware services...

6.
Continuous summit-to-sea maps showing both land features and shallow-water coral reefs have been completed in Puerto Rico and the U.S. Virgin Islands, using circa-2000 Landsat 7 Enhanced Thematic Mapper (ETM+) imagery. Continuous land/sea terrain was mapped by merging Digital Elevation Models (DEM) with satellite-derived bathymetry. Benthic habitat characterizations were created by unsupervised classifications of Landsat imagery clustered using field data, and produced maps with an estimated overall accuracy of >75% (Tau coefficient >0.65). These were merged with Geocover-LC (land use/land cover) data to create continuous land/sea cover maps. Image pairs from different dates were analyzed using Principal Components Analysis (PCA) in order to detect areas of change in the marine environment over two different time intervals: 2000 to 2001, and 1991 to 2003. This activity demonstrates the capability of Landsat imagery to produce continuous summit-to-sea maps, as well as to detect certain changes in the shallow-water marine environment, providing a valuable tool for efficient coastal zone monitoring and effective management and conservation.
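
The accuracy figures quoted above (overall accuracy and the Tau coefficient) can be derived from a classification confusion matrix; the hedged sketch below uses the equal-prior-probability form of Tau and an invented matrix, which is not the study's error matrix.

```python
# Hedged sketch: overall accuracy and the Tau coefficient (equal prior
# probabilities) from a confusion matrix of a benthic habitat
# classification. The matrix below is invented for illustration.
import numpy as np

confusion = np.array([           # rows = map class, cols = reference class
    [48,  4,  2],
    [ 5, 37,  6],
    [ 3,  5, 40],
])

n_classes = confusion.shape[0]
overall_accuracy = np.trace(confusion) / confusion.sum()
p_random = 1.0 / n_classes                      # equal a priori probability
tau = (overall_accuracy - p_random) / (1.0 - p_random)

print(f"Overall accuracy: {overall_accuracy:.2f}")
print(f"Tau coefficient:  {tau:.2f}")
```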

7.
Effective environmental management requires monitoring programmes that provide specific links between changes in environmental conditions and ecosystem health. This article reviews the suitability of a range of bioindicators for use in monitoring programmes that link changes in water quality to changes in the condition of coral-reef ecosystems. From the literature, 21 candidate bioindicators were identified, whose responses to changes in water quality varied spatially and temporally; responses ranged from rapid (hours) changes within individual corals to long-term (years) changes in community composition. From this list, the most suitable bioindicators were identified by determining whether responses were (i) specific, (ii) monotonic, (iii) variable, (iv) practical and (v) ecologically relevant to management goals. For long-term monitoring programmes that aim to quantify the effects of chronic changes in water quality, 11 bioindicators were selected: symbiont photophysiology, colony brightness, tissue thickness and surface rugosity of massive corals, skeletal elemental and isotopic composition, abundance of macro-bioeroders, micro- and meiobenthic organisms such as foraminifera, coral recruitment, macroalgal cover, taxonomic richness of corals and the maximal depth of coral-reef development. For short-term monitoring programmes, or environmental impact assessments that aim to quantify the effects of acute changes in water quality, a subset of seven of these bioindicators was selected, including partial mortality. Their choice will depend on the specific objectives and the timeframe available for each monitoring programme. An assessment framework is presented to assist in the selection of bioindicators to quantify the effects of changing water quality on coral-reef ecosystems.

8.
Approximately one quarter of zooxanthellate coral species have a depth distribution from shallow waters (<30 m) down to mesophotic depths of 30-60 m. The deeper populations of such species are less likely to be affected by certain environmental perturbations, including high temperature/high irradiance causing coral bleaching. This has led to the hypothesis that deep populations may serve as refuges and a source of recruits for shallow reef habitats. The extent of vertical connectivity of reef coral species, however, is largely unquantified. Using 10 coral host microsatellite loci and sequences of the host mtDNA putative control region, as well as ribosomal DNA (rDNA) ITS2 sequences of the coral's algal endosymbionts (Symbiodinium), we examine population structure, connectivity and symbiont specificity in the brooding coral Seriatopora hystrix across a depth profile in both northwest (Scott Reef) and northeast Australia (Yonge Reef). Strong genetic structuring over depth was observed in both regions based on the microsatellite loci; however, Yonge Reef exhibited an additional partitioning of mtDNA lineages (associated with specific symbiont ITS2 types), whereas Scott Reef was dominated by a single mtDNA lineage (with no apparent host-symbiont specificity). Evidence for recruitment of larvae of deep water origin into shallow habitats was found at Scott Reef, suggesting that recovery of shallow water habitats may be aided by migration from deep water refuges. Conversely, no migration from the genetically divergent deep slope populations into the shallow habitats was evident at Yonge Reef, making recovery of shallow habitats from deeper waters at this location highly unlikely.

9.
10.

One of the technologies for increasing the safety and welfare of humans on roads is Vehicular Cloud Computing (VCC). This technology brings the advantages of cloud computing to the Vehicular Ad Hoc Network (VANET). By utilizing modern equipment in current vehicles, VCC can play a significant role in smart transportation systems. Despite the potential of this technology, effective methods for managing the available resources and providing the expected quality of service in such an environment are not yet adequately available. One of the most important barriers to providing such solutions appears to be the resource constraints and the very high dynamics of vehicles in VCC. In this article, based on virtualization and taking the characteristics of this environment into account, we propose simple methods to manage resources better and improve the quality of service. By providing a flexible data structure to control the important data in the environment effectively, we were able to achieve better simulation results than previous methods. To illustrate the impact of the proposed methods, we compared them with some of the most important methods in this context, using SUMO 1.2.0 and MATLAB R2019a software for the simulations. The simulation results indicate that the proposed methods outperform previous methods in terms of resource efficiency, Quality of Service (QoS), and load balancing.


11.
Challenges in using land use and land cover data for global change studies (cited 5 times: 0 self-citations, 5 by others)
Land use and land cover data play a central role in climate change assessments. These data originate from different sources and inventory techniques. Each source of land use/cover data has its own domain of applicability and quality standards. Often data are selected without explicitly considering the suitability of the data for the specific application, the bias originating from data inventory and aggregation, and the effects of uncertainty in the data on the results of the assessment. Uncertainties due to data selection and handling can be of the same order of magnitude as uncertainties related to the representation of the processes under investigation. While acknowledging the differences in data sources and the causes of inconsistencies, several methods have been developed to optimally extract information from the data and document the uncertainties. These methods include data integration, improved validation techniques and harmonization of classification systems. Based on the data needs of global change studies and on data availability, recommendations are formulated that aim at optimal use of current data and focused efforts for additional data collection. These include: improved documentation using classification systems for land use/cover data; careful selection of data given the specific application; and the use of appropriate scaling and aggregation methods. In addition, data availability may be improved by combining different data sources to optimize information content, while collection of additional data must focus on validation of available data sets and improved coverage of regions and land cover types with a high level of uncertainty. Specific attention in data collection should be given to the representation of land management (systems) and mosaic landscapes.

12.
Hydrodynamics and water-column properties were investigated off west-central Guam from July 2007 through January 2008. Rapid fluctuations, on time scales of tens of minutes, in currents, temperature, salinity, and acoustic backscatter were observed to occur at sub-diurnal frequencies along more than 2 km of the fore reef but not at the reef crest. During periods characterized by higher sea-surface temperatures (SSTs), weaker wind forcing, smaller ocean surface waves, and greater thermal stratification, rapid decreases in temperature and concurrent rapid increases in salinity and acoustic backscatter coincided with onshore-directed near-bed currents and offshore-directed near-surface currents. During the study, these cool-water events, on average, lasted 2.3 h, decreased the water temperature by 0.57 °C, increased the salinity by 0.25 PSU, and were two orders of magnitude more prevalent during the summer season than the winter. During the summer season, when the average satellite-derived SST anomaly was +0.63 °C, these cooling events, on average, lowered the temperature by 1.14 °C along the fore reef but only 0.11 °C along the reef crest. The rapid shifts appear to be the result of internal tidal bores pumping cooler, more saline, higher-backscatter oceanic water from depths >50 m over cross-shore distances of hundreds of meters into the warmer, less saline waters at depths of 20 m and shallower. Such internal bores appear to have the potential to buffer shallow coral reefs from predicted increases in SSTs by bringing cool, offshore water to shallow coral environments. These cooling internal bores may also provide additional benefits that offset stress, such as supplying food to thermally stressed corals, reducing stress due to ultraviolet radiation and/or low salinity, and delivering coral larvae from deeper reefs not impacted by surface thermal stress. Thus, the presence of internal bores might be an important factor locally in the resilience of select coral reefs facing increased thermal stress.

13.
Synchronised and fluctuating reproduction by plant populations, called masting, is widespread in diverse taxonomic groups. Here, we propose a new method to explore the proximate mechanism of masting by combining spatiotemporal flowering data, biochemical analysis of resource allocation and mathematical modelling. Flowering data from 170 trees over 13 years showed the emergence of clustering, with trees in a given cluster mutually synchronised in reproduction, which was successfully explained by resource budget models. Analysis of the resources invested in the development of reproductive organs showed that the parametric values used in the model differ significantly between nitrogen and carbon. Using a fully parameterised model, we showed that the observed flowering pattern is explained only when the interplay between nitrogen dynamics and climatic cues is considered. This result indicates that our approach successfully identified resource-type-specific roles in masting and that the method is suitable for a wide range of plant species.
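
As a hedged illustration, the sketch below implements a generic resource budget model of the Isagi/Satake–Iwasa type (reserves accumulate annually and reproduction is triggered above a threshold); it is not the paper's fully parameterised nitrogen/carbon model, and Ps, Lt and Rc are illustrative values.

```python
# Hedged sketch of a generic resource budget model for masting:
# reserves accumulate each year and flowering occurs only when they exceed
# a threshold, after which fruiting depletes the reserves. This is NOT the
# paper's parameterisation; Ps, Lt and Rc are illustrative.

def resource_budget(years=20, Ps=1.0, Lt=4.0, Rc=2.0, reserve=0.0):
    """Return per-year flowering investment for one tree."""
    flowering = []
    for _ in range(years):
        reserve += Ps                      # annual resource gain
        if reserve > Lt:                   # threshold exceeded -> reproduce
            invest = reserve - Lt          # flowering investment
            reserve = Lt - Rc * invest     # fruiting costs Rc * invest
            flowering.append(round(invest, 2))
        else:
            flowering.append(0.0)
    return flowering

print(resource_budget())
```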

14.
We attempted to obtain carbon sequestration maps of deciduous forests in Japan using parameters detectable by the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor, and to determine how the spatial pattern of carbon sequestration differs within the same forest ecosystem type. For this investigation, we first parameterized the MODIS algorithm at one flux tower site, Takayama, for the years 2002–2003. The MODIS algorithm could link flux-based net ecosystem productivity (NEP) with simple functions controlled by a thermal infrared band and a vegetation index. Second, the performance of the MODIS algorithm was validated through comparison with the flux-based NEP at another flux tower site, Hitsujigaoka. The MODIS-based NEP at Hitsujigaoka agreed with the flux-based NEP, with an R² of 0.879 and a root mean square error of 1.64 gC m−2 day−1, regardless of canopy structure and age. The MODIS algorithm was noteworthy for its general applicability in different locations. Finally, we used the MODIS algorithm for the same forest ecosystem type across Japan for regional extrapolation of NEP. The MODIS-based NEP of deciduous forests in Japan showed great variance (347 ± 288 gC m−2 year−1 in 2002), according to stand structure and the climatic conditions of the year. Studies quantifying the ecosystem carbon balance need to consider the variance, frequency and spatial distributions of NEP. Satellite remote sensing demonstrated its potential for large-scale mapping of NEP.
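
The validation statistics quoted above (R² and RMSE between flux-based and MODIS-based NEP) can be reproduced in a few lines; the sketch below uses invented daily NEP values and computes R² as the squared Pearson correlation, which may differ from the authors' exact regression setup.

```python
# Hedged sketch: agreement statistics (R^2 and RMSE) between flux-tower NEP
# and satellite-estimated NEP, the two metrics quoted in the validation.
# The daily NEP values below (gC m^-2 day^-1) are invented, not site data.
import numpy as np

flux_nep  = np.array([1.2, 2.5, 3.1, 0.4, -0.8, 2.0, 3.6, 1.5])
modis_nep = np.array([1.0, 2.9, 2.6, 0.9, -0.2, 1.7, 3.9, 1.1])

rmse = np.sqrt(np.mean((modis_nep - flux_nep) ** 2))
r = np.corrcoef(flux_nep, modis_nep)[0, 1]     # Pearson correlation
print(f"RMSE = {rmse:.2f} gC m-2 day-1, R^2 = {r**2:.3f}")
```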

15.
Objective: To summarise the evidence supporting the use of rapid d-dimer testing combined with estimation of clinical probability to exclude the diagnosis of deep venous thrombosis among outpatients. Data sources: Medline (June 1993 to December 2003), the Database of Abstracts and Reviews (DARE), and reference lists of studies in English. Selection of studies: We selected 12 studies from among 84 reviewed. The selected studies included more than 5000 patients and used a rapid d-dimer assay and explicit criteria to classify cases as having low, intermediate, or high clinical probability of deep vein thrombosis of the lower extremity among consecutive outpatients. Review methods: Diagnosis required objective confirmation, and untreated patients had to have at least three months of follow-up. The outcome was objectively documented venous thromboembolism. Two authors independently abstracted data using a data collection form. Results: When the less sensitive SimpliRED d-dimer assay was used, the three month incidence of venous thromboembolism was 0.5% (95% confidence interval 0.07% to 1.1%) among patients with a low clinical probability of deep vein thrombosis and normal d-dimer concentrations. When a highly sensitive d-dimer assay was used, the three month incidence of venous thromboembolism was 0.4% (0.04% to 1.1%) among outpatients with low or moderate clinical probability of deep vein thrombosis and a normal d-dimer concentration. Conclusions: The combination of low clinical probability for deep vein thrombosis and a normal result from the SimpliRED d-dimer test safely excludes a diagnosis of acute venous thrombosis. A normal result from a highly sensitive d-dimer test effectively rules out deep vein thrombosis among patients classified as having either low or moderate clinical probability of deep vein thrombosis.
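
As a hedged illustration of how a three-month incidence and its 95% confidence interval are computed, the sketch below uses invented counts and a Wilson score interval; the review's own confidence-interval method may differ.

```python
# Hedged sketch: three-month VTE incidence with a 95% Wilson score interval
# for a hypothetical cohort of untreated patients with low clinical
# probability and a normal d-dimer. Counts are invented.
from math import sqrt

def wilson_ci(events, n, z=1.96):
    p = events / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

events, n = 5, 1000                 # hypothetical: 5 VTEs in 1000 patients
lo, hi = wilson_ci(events, n)
print(f"Incidence {100*events/n:.1f}% (95% CI {100*lo:.2f}% to {100*hi:.2f}%)")
```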

16.
Purpose: To train and evaluate a very deep dilated residual network (DD-ResNet) for fast and consistent auto-segmentation of the clinical target volume (CTV) for breast cancer (BC) radiotherapy with big data. Methods: DD-ResNet was an end-to-end model enabling fast training and testing. We used a big dataset comprising 800 patients who underwent breast-conserving therapy for evaluation. The CTVs were validated by experienced radiation oncologists. We performed a fivefold cross-validation to test the performance of the model. Segmentation accuracy was quantified by the Dice similarity coefficient (DSC) and the Hausdorff distance (HD). The performance of the proposed model was evaluated against two other deep learning models: the deep dilated convolutional neural network (DDCNN) and the deep deconvolutional neural network (DDNN). Results: Mean DSC values of DD-ResNet (0.91 and 0.91) were higher than those of the other two networks (DDCNN: 0.85 and 0.85; DDNN: 0.88 and 0.87) for both right-sided and left-sided BC. It also had smaller mean HD values of 10.5 mm and 10.7 mm compared with DDCNN (15.1 mm and 15.6 mm) and DDNN (13.5 mm and 14.1 mm). Mean segmentation time was 4 s, 21 s and 15 s per patient with DDCNN, DDNN and DD-ResNet, respectively. The DD-ResNet was also superior to results reported in the literature. Conclusions: The proposed method could segment the CTV accurately with acceptable time consumption. It was invariant to the body size and shape of patients and could improve the consistency of target delineation and streamline radiotherapy workflows.
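
The primary accuracy metric quoted above, the Dice similarity coefficient, is straightforward to compute from binary masks; the sketch below uses tiny invented masks rather than real CT segmentations.

```python
# Hedged sketch: Dice similarity coefficient (DSC) between a predicted and a
# reference binary segmentation mask. The tiny 2-D masks are invented; real
# CTV masks would be 3-D arrays.
import numpy as np

def dice(pred: np.ndarray, ref: np.ndarray) -> float:
    pred, ref = pred.astype(bool), ref.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    denom = pred.sum() + ref.sum()
    return 2.0 * intersection / denom if denom else 1.0

pred = np.array([[0, 1, 1, 0],
                 [0, 1, 1, 1],
                 [0, 0, 1, 1]])
ref  = np.array([[0, 1, 1, 0],
                 [0, 1, 1, 0],
                 [0, 1, 1, 1]])
print(f"DSC = {dice(pred, ref):.2f}")
```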

17.

Many consumers participate in the smart city via smart portable gadgets such as wearables, personal gadgets, mobile devices, or sensor systems. In the IoT edge computing systems of the smart city, the fundamental difficulty is to pick reliable participants, since not all smart IoT gadgets are trustworthy; certain gadgets might deliberately disrupt networks or services and degrade the customer experience. A trust-based Internet of Things (TM-IoT) cloud computing method is proposed in this research. The problem is addressed by choosing trustworthy partners to enhance the quality of services of the IoT edge network in smart architectures. A smart device selection recommendation method based on the changing networks was developed. The evolutionary concept of games was applied to examine the reliability and durability of the trust-management technique presented in this article, and the Lyapunov concept was applied to analyse the reliability and durability of the trust-management system. A real scenario involving personal health control systems and air-quality monitoring and assessment in a smart city setting confirmed the efficiency of the trust-management mechanism. Experiments have demonstrated that the methodology for trust administration suggested in this research plays a major part in promoting collaboration among multiple intelligent gadgets in the IoT edge computing system, with an efficiency of 97%. It resists harmful threats against service suppliers more consistently and is suitable for the smart world's massive IoT edge computing systems.


18.
Short amplicon primers were redesigned for 17 microsatellite loci developed in St. Vincent's Amazon and six loci developed in blue-and-yellow macaw and tested using six species of Neotropical parrot. Polymorphism was observed at 12 loci in blue-and-yellow macaw, 10 in red-and-green macaw, 11 in scarlet macaw, 10 in chestnut-fronted macaw, 11 in red-bellied macaw and 16 in mealy parrot. Number of alleles per locus ranged from two to 23 and expected heterozygosity ranged from 0.05 to 0.95. The resulting multiplexed loci will be useful in evaluating genetic diversity, genetic structure and mating system in Neotropical parrots.
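
As a hedged illustration of the quoted summary statistic, the sketch below computes expected heterozygosity per locus as simple gene diversity (He = 1 − Σ p_i²) from invented allele counts; it omits the small-sample (Nei) correction the authors may have used.

```python
# Hedged sketch: expected heterozygosity per locus from allele frequencies.
# The allele counts are invented, not the macaw/parrot data, and no
# small-sample correction is applied.

def expected_heterozygosity(allele_counts):
    n = sum(allele_counts)
    return 1.0 - sum((c / n) ** 2 for c in allele_counts)

loci = {
    "locus_A": [12, 10, 8, 6, 4],   # five alleles observed in a sample
    "locus_B": [38, 2],             # nearly monomorphic locus
}
for name, counts in loci.items():
    print(f"{name}: He = {expected_heterozygosity(counts):.2f}")
```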

19.
Some general equations for stage-frequency estimation are presented and their applications discussed. Tukey's (1958) jackknife technique is suggested for the calculation of the approximate variances associated with estimators of population parameters.
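
A minimal sketch of Tukey's (1958) delete-one jackknife variance estimator is given below, applied to an invented set of stage-duration estimates; the paper's stage-frequency estimators themselves are not reproduced here.

```python
# Hedged sketch of the delete-one jackknife: approximate the variance of an
# estimator (here the sample mean of stage durations) by recomputing it with
# each observation left out in turn. Data are invented.
import numpy as np

def jackknife_variance(data, estimator=np.mean):
    n = len(data)
    leave_one_out = np.array(
        [estimator(np.delete(data, i)) for i in range(n)]
    )
    return (n - 1) / n * np.sum((leave_one_out - leave_one_out.mean()) ** 2)

durations = np.array([3.1, 2.8, 3.6, 2.9, 3.3, 3.0, 3.5])
print(f"Estimate = {durations.mean():.2f}, "
      f"jackknife variance = {jackknife_variance(durations):.4f}")
```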

20.