Similar Literature
20 similar documents retrieved (search time: 15 ms).
1.
Species distributions are already affected by climate change. Forecasting their long-term evolution requires models whose validation has been thoroughly assessed. Our aim here is to demonstrate that the sensitivity of such models to the characteristics of their climate inputs may complicate their validation and introduce uncertainties into their predictions. In this study, we conducted a sensitivity analysis of a process-based tree distribution model, Phenofit, to climate input characteristics. The analysis covered two North American trees that differ greatly in their distributions and eight types of historical climate input that differ in spatial (local or gridded) and temporal (daily vs. monthly) resolution as well as in type (locally recorded, extrapolated, or simulated by General Circulation Models). We show that the spatial and temporal resolution of the climate data, as well as their type, strongly affect the model predictions. The sensitivity analysis also revealed the importance, for global climate change impact assessment, of (i) the daily variability of temperatures in modeling the biological processes that shape species distributions, (ii) climate data at high latitudes and elevations, and (iii) climate data with high spatial resolution.
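Why daily temperature variability matters can be seen with a toy calculation: because processes such as forcing accumulation respond nonlinearly to temperature, a degree-day sum computed from daily values differs from one computed from the monthly mean. The sketch below illustrates this with synthetic data; it is not the Phenofit model, and the base temperature and climate values are purely illustrative.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily mean temperatures (deg C) for a 31-day month with a mean near -1 degC.
daily_t = rng.normal(loc=-1.0, scale=5.0, size=31)
monthly_mean = daily_t.mean()

T_BASE = 0.0  # illustrative base temperature for forcing accumulation (deg C)

# Degree-day sum from daily data: only the days above the base temperature contribute.
gdd_daily = np.clip(daily_t - T_BASE, 0.0, None).sum()

# The same quantity computed from the monthly mean misses the warm-day contribution entirely.
gdd_monthly = max(monthly_mean - T_BASE, 0.0) * daily_t.size

print(f"Forcing sum from daily data:   {gdd_daily:.1f} degree-days")
print(f"Forcing sum from monthly mean: {gdd_monthly:.1f} degree-days")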

2.
Intergovernmental Panel on Climate Change (IPCC) Tier 1 methodologies commonly underpin project-scale carbon accounting for changes in land use and management and are used in frameworks for Life Cycle Assessment and carbon footprinting of food and energy crops. These methodologies, however, were intended for use at large spatial scales, which can introduce error in predictions at finer spatial scales. There is an urgent need for the development and implementation of higher tier methodologies that can be applied at fine spatial scales (e.g. farm/project/plantation) for food and bioenergy crop greenhouse gas (GHG) accounting to facilitate decision making in the land-based sectors. Higher tier methods have been defined by the IPCC; they must be well evaluated, operate across a range of domains (e.g. climate region, soil type, crop type, topography), and account for the land use transitions and management changes being implemented. Furthermore, the data required to calibrate and drive the models used at higher tiers need to be available and applicable at fine spatial resolution, covering the meteorological, soil, cropping system and management domains, with quantified uncertainties. Testing the reliability of the models will require data either from sites with repeated measurements or from chronosequences. We review current global capability for estimating changes in soil carbon at fine spatial scales and present a vision for a framework capable of quantifying land use change and management impacts on soil carbon, which could be used for addressing issues such as bioenergy and biofuel sustainability, food security, forest protection, and direct/indirect impacts of land use change. The aim of this framework is to provide a globally accepted standard of carbon measurement and modelling appropriate for GHG accounting that could be applied at project to national scales (allowing outputs to be scaled up to a country level), to address the impacts of land use and land management change on soil carbon.
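For orientation, the Tier 1 approach estimates mineral soil carbon stocks from climate/soil-specific reference stocks scaled by multiplicative stock change factors. A simplified form of the IPCC (2006) equations is sketched below; factor names follow the guidelines, but defaults, refinements and uncertainty terms are omitted:

SOC = SOC_{REF} \cdot F_{LU} \cdot F_{MG} \cdot F_{I} \cdot A,
\qquad
\Delta C_{\mathrm{mineral}} = \frac{SOC_{0} - SOC_{(0-T)}}{D}

where SOC_REF is the reference stock for the climate-soil combination, F_LU, F_MG and F_I are the land-use, management and input factors, A is the area, and D is the default time dependence of the stock change factors (20 years). Higher tier methods replace these coarse multipliers with measurements or process-based models.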

3.
A global prognostic scheme of leaf onset using satellite data (total citations: 2, of which 0 self-citations and 2 by others)
Leaf phenology describes the seasonal cycle of leaf functioning. Although it is essential for understanding the interactions between the biosphere, the climate, and biogeochemical cycles, it has received little attention in the modelling community at the global scale. This article focuses on the prediction of spatial patterns of the climatological onset date of leaf growth for the decade 1983–93. It examines the possibility of extrapolating existing local models of leaf onset date to the global scale. Climate is the main variable that controls leaf phenology for a given biome at this scale, and satellite observations provide a unique means to study the seasonal cycle of canopies. We combine leaf onset dates retrieved from NOAA/AVHRR satellite NDVI with climate data and the DISCover land-cover map to identify appropriate models, and determine their new parameters at a 0.5° spatial resolution. We define two main regions: at temperate and high latitudes leaf onset models are mainly dependent on temperature; at low latitudes they are controlled by water availability. Some local leaf onset models are no longer relevant at the global scale, making their calibration impossible. Nevertheless, we define our unified model by retaining, for each biome, the model that best reproduces the spatial distribution of leaf onset dates. The main spatial patterns of leaf onset date are well simulated, such as the Sahelian gradient due to aridity and the high-latitude gradient due to frost. At temperate and high latitudes, simulated onset dates are in good agreement with climatological observations; 62% of treated grid-cells have a simulated leaf onset date within 10 days of the satellite-observed onset date (which is also the temporal resolution of the NDVI data). In tropical areas, the subgrid heterogeneity of the phenology is larger and our model's predictive power is diminished. The difficulties encountered in the tropics are due to the ambiguity of the satellite signal interpretation and the low reliability of rainfall and soil moisture fields.
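The two model families identified at global scale, temperature-driven at temperate and high latitudes and water-driven at low latitudes, can be sketched as simple threshold models. The code below is a minimal illustration with invented thresholds and synthetic forcing data, not the calibrated models of the study.

import numpy as np

def onset_day_temperature(daily_tmean, t_base=5.0, forcing_req=150.0):
    """First day of year when growing degree-days above t_base reach forcing_req."""
    forcing = np.cumsum(np.clip(daily_tmean - t_base, 0.0, None))
    hits = np.nonzero(forcing >= forcing_req)[0]
    return int(hits[0]) + 1 if hits.size else None

def onset_day_moisture(daily_precip, precip_req=50.0):
    """First day of year when cumulative rainfall reaches precip_req (mm)."""
    cum = np.cumsum(daily_precip)
    hits = np.nonzero(cum >= precip_req)[0]
    return int(hits[0]) + 1 if hits.size else None

# Toy forcing data: a sinusoidal annual temperature cycle and a simple wet season.
doy = np.arange(365)
tmean = 10.0 - 15.0 * np.cos(2 * np.pi * doy / 365)      # deg C
precip = np.where((doy > 120) & (doy < 270), 3.0, 0.2)    # mm/day

print("Temperature-driven onset (day of year):", onset_day_temperature(tmean))
print("Moisture-driven onset (day of year):   ", onset_day_moisture(precip))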

4.
Ecosystems provide life-sustaining services upon which human civilization depends, but their degradation largely continues unabated. Spatially explicit information on ecosystem services (ES) provision is required to better guide decision making, particularly for mountain systems, which are characterized by vertical gradients, isolation and high topographic complexity, making them particularly sensitive to global change. But while spatially explicit ES quantification and valuation allows the identification of areas of abundant or limited supply of and demand for ES, the accuracy and usefulness of the information varies considerably depending on the scale and methods used. Using four case studies from mountainous regions in Europe and the U.S., we quantify information gains and losses when mapping five ES - carbon sequestration, flood regulation, agricultural production, timber harvest, and scenic beauty - at coarse and fine resolution (250 m vs. 25 m in Europe and 300 m vs. 30 m in the U.S.). We analyze the effects of scale on ES estimates and their spatial pattern and show how these effects are related to different ES, terrain structure and model properties. ES estimates differ substantially between the fine and coarse resolution analyses in all case studies and across all services. This scale effect is not equally strong for all ES. We show that spatially explicit information about non-clustered, isolated ES tends to be lost at coarse resolution and, against expectation, mainly in less rugged terrain, which calls for finer resolution assessments in such contexts. The effect of terrain ruggedness is also related to model properties such as dependency on land use-land cover data. We close with recommendations for mapping ES to make the resulting maps more comparable, and suggest a four-step approach to address the issue of scale when mapping ES that can deliver information to support ES-based decision making with greater accuracy and reliability.
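The loss of isolated, non-clustered ES supply at coarse resolution can be reproduced with a minimal block-aggregation experiment; the sketch below dilutes a single 25 m hotspot when averaging up to a 250 m grid (values and grid sizes are illustrative only).

import numpy as np

def aggregate(fine, factor):
    """Block-average a square raster by an integer factor (e.g. 25 m -> 250 m)."""
    n = fine.shape[0] // factor
    return fine[:n * factor, :n * factor].reshape(n, factor, n, factor).mean(axis=(1, 3))

rng = np.random.default_rng(1)
fine = rng.normal(1.0, 0.1, size=(100, 100))   # background ES supply on a 25 m grid
fine[40, 40] = 25.0                            # one isolated, high-supply cell

coarse = aggregate(fine, factor=10)            # 250 m grid

print("Fine-grid maximum:  ", round(fine.max(), 2))    # hotspot fully visible
print("Coarse-grid maximum:", round(coarse.max(), 2))  # hotspot averaged away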

5.
Space is a very important aspect of the simulation of biochemical systems, and the need for simulation algorithms able to cope with space is becoming ever more compelling. Complex and detailed models of biochemical systems need to deal with the movement of single molecules and particles, taking into consideration localized fluctuations, transport phenomena, and diffusion. A common drawback of spatial models lies in their complexity: models can become very large, and their simulation can be time consuming, especially if we want to capture the system's behavior in a reliable way using stochastic methods in conjunction with a high spatial resolution. To deliver on the promise of systems biology to understand a system as a whole, we need to scale up the size of the models we are able to simulate, moving from sequential to parallel simulation algorithms. In this paper, we analyze Smoldyn, a widely used algorithm for the stochastic simulation of chemical reactions with spatial resolution and single-molecule detail, and we propose an alternative, innovative implementation that exploits the parallelism of Graphics Processing Units (GPUs). The implementation executes the most computationally demanding steps (diffusion, unimolecular and bimolecular reactions, and the most common cases of molecule-surface interaction) on the GPU, computing them in parallel over each molecule of the system. The implementation offers good speed-ups and real-time, high-quality graphics output.
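The core of the diffusion step is a per-molecule Brownian update, which is what makes a one-thread-per-molecule GPU mapping natural. The sketch below uses NumPy vectorization as a stand-in for a CUDA kernel; it is not the Smoldyn or GPU code itself, and units and parameters are illustrative.

import numpy as np

def diffuse(positions, diff_coeff, dt, rng):
    """One Brownian-dynamics step applied to every molecule independently.

    The per-molecule update is embarrassingly parallel, which is what the GPU
    implementation exploits (one thread per molecule); NumPy vectorization
    stands in for the CUDA kernel here.
    """
    sigma = np.sqrt(2.0 * diff_coeff * dt)  # rms displacement per axis
    return positions + rng.normal(0.0, sigma, size=positions.shape)

rng = np.random.default_rng(42)
pos0 = rng.uniform(0.0, 10.0, size=(100_000, 3))  # 1e5 molecules in a 10x10x10 box
pos = pos0.copy()
for _ in range(100):                               # 100 steps of dt = 1e-3
    pos = diffuse(pos, diff_coeff=1.0, dt=1e-3, rng=rng)

msd = np.mean(np.sum((pos - pos0) ** 2, axis=1))
print(f"Mean squared displacement: {msd:.3f} (theory 6*D*t = {6 * 1.0 * 0.1:.3f})")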

6.
Johansson AC, Lindahl E. Proteins. 2008;70(4):1332–1344.
Studies of the insertion and interactions of amino acids in lipid membranes are pivotal to our understanding of membrane protein structure and function. Calculating the insertion cost as a function of transmembrane helix sequence is thus an important step towards improved membrane protein prediction and, eventually, drug design. Here, we present position-dependent free energies of solvation for all amino acid analogs along the membrane normal. The profiles cover the entire region from bulk water to the hydrophobic core and were produced from all-atom molecular dynamics simulations. Calculated differences corresponding to mutations, as well as the costs for entire segments, match experimental data well, and in addition the profiles provide the spatial resolution currently not available from experiments. Polar side-chains largely maintain their hydration and assume quite ordered conformations, which indicates that the solvation cost is mainly entropic. The cost of solvating charged side-chains is not only significantly lower than for implicit solvation models, but also close to experiments, meaning these could well maintain their protonation states inside the membrane. The single notable exception to the experimental agreement is proline, which is quite expensive to introduce in vivo despite its hydrophobicity, a difference possibly explained by kinks making it harder to insert helices in the translocon.
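As background, one standard way to express a position-dependent solvation free energy along the membrane normal z is as a potential of mean force relative to bulk water,

W(z) = -k_{B} T \,\ln\!\left[\frac{\rho(z)}{\rho_{\mathrm{bulk}}}\right],

although in practice such profiles are obtained from biased all-atom simulations (e.g. umbrella sampling or thermodynamic integration) rather than from raw equilibrium densities. This relation is given only for orientation and is not taken from the paper.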

7.
Cell adhesion and migration crucially depend on the transmission of actomyosin-generated forces through sites of focal adhesion to the extracellular matrix. Here we report experimental and computational advances in improving the resolution and reliability of traction force microscopy. First, we introduce the use of two differently colored nanobeads as fiducial markers in polyacrylamide gels and explain how the displacement field can be computationally extracted from the fluorescence data. Second, we present several improvements to the standard methods for force reconstruction from the displacement field: the boundary element method, Fourier-transform traction cytometry, and traction reconstruction with point forces. Using extensive data simulation, we show that the spatial resolution of the boundary element method can be improved considerably by splitting the elastic field into near, intermediate, and far field. Fourier-transform traction cytometry requires considerably less computer time, but can achieve a comparable resolution only when combined with Wiener filtering or appropriate regularization schemes. Both methods tend to underestimate forces, especially at small adhesion sites. Traction reconstruction with point forces does not suffer from this limitation, but is only applicable with stationary and well-developed adhesion sites. Third, we combine these advances and for the first time reconstruct fibroblast traction with a spatial resolution of ∼1 μm.
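The trade-off between resolution and noise that regularization controls can be illustrated with a generic zeroth-order Tikhonov inversion. The sketch below uses an arbitrary smooth forward kernel rather than the elastic (Boussinesq) Green's function of real traction force microscopy, and the recovered peak is underestimated, echoing the behaviour described above for small adhesion sites.

import numpy as np

def tikhonov_inverse(G, u, lam):
    """Zeroth-order Tikhonov-regularized solution of u = G t.

    G   : forward operator mapping tractions to bead displacements
    u   : measured displacement vector (with noise)
    lam : regularization parameter trading resolution against noise amplification
    """
    A = G.T @ G + lam ** 2 * np.eye(G.shape[1])
    return np.linalg.solve(A, G.T @ u)

rng = np.random.default_rng(3)
t_true = np.zeros(50)
t_true[20:25] = 1.0                                   # a small, localized traction patch
idx = np.arange(50.0)
G = np.exp(-0.1 * np.abs(np.subtract.outer(idx, idx)))  # generic smooth forward kernel
u = G @ t_true + rng.normal(0.0, 0.05, size=50)       # noisy "measured" displacements

t_hat = tikhonov_inverse(G, u, lam=0.5)
print("True peak 1.0, recovered peak:", round(t_hat.max(), 2))  # smoothed and underestimated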

8.
There is a growing body of public health research documenting how characteristics of neighborhoods are associated with differences in the health status of residents. However, little is known about how the spatial resolution of neighborhood observational data or community audits affects the identification of neighborhood differences in health. We developed a systematic neighborhood observation instrument for collecting data at very high spatial resolution (we observe each parcel independently) and used it to collect data in a low-income minority neighborhood in Dallas, TX. In addition, we collected data on the health status of individuals residing in this neighborhood. We then assessed the inter-rater reliability of the instrument and compared the costs and benefits of using data at this high spatial resolution. Our instrument provides a reliable and cost-effective method for collecting neighborhood observational data at high spatial resolution, which then allows researchers to explore the impact of varying geographic aggregations. Furthermore, these data facilitate a demonstration of the predictive accuracy of self-reported health status. We find that ordered logit models of health status using observational data at different spatial resolutions produce different results. This implies a need to analyze the variation in correlative relationships at different geographic resolutions when there is no solid theoretical rationale for choosing a particular resolution. We argue that neighborhood data at high spatial resolution greatly facilitates the evaluation of alternative geographic specifications in studies of neighborhood and health.
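For reference, the ordered (proportional-odds) logit model referred to here has the standard form

\Pr(Y_i \le j \mid x_i) = \frac{1}{1 + \exp\!\left[-(\alpha_j - x_i^{\top}\beta)\right]}, \qquad j = 1, \dots, J-1,

where Y_i is the ordinal self-reported health category, the \alpha_j are cut-points, and x_i contains the neighborhood observations aggregated at a chosen spatial resolution; refitting with x_i aggregated at different resolutions is what produces the differing estimates of \beta. This is the textbook model form, stated here for orientation rather than taken from the paper.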

9.
Many cellular structures and organelles are too small to be properly resolved by conventional light microscopy. This is particularly true for dendritic spines and glial processes, which are very small, dynamic, and embedded in dense tissue, making it difficult to image them under realistic experimental conditions. Two-photon microscopy is currently the method of choice for imaging in thick living tissue preparations, both in acute brain slices and in vivo. However, the spatial resolution of a two-photon microscope, which is limited to ∼350 nm by the diffraction of light, is not sufficient for resolving many important details of neural morphology, such as the width of spine necks or thin glial processes. Recently developed superresolution approaches, such as stimulated emission depletion microscopy, have set new standards of optical resolution in imaging living tissue. However, the important goal of superresolution imaging with significant subdiffraction resolution has not yet been accomplished in acute brain slices. To overcome this limitation, we have developed a new microscope based on two-photon excitation and pulsed stimulated emission depletion microscopy, which provides unprecedented spatial resolution and excellent experimental access in acute brain slices using a long-working-distance objective. The new microscope improves on the spatial resolution of a regular two-photon microscope by a factor of four to six, and it is compatible with time-lapse and simultaneous two-color superresolution imaging in living cells. We demonstrate the potential of this nanoscopy approach for brain slice physiology by imaging the morphology of dendritic spines and microglial cells well below the surface of acute brain slices.
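The resolution gain can be related to the usual STED scaling law,

d \approx \frac{\lambda}{2\,\mathrm{NA}\,\sqrt{1 + I/I_{\mathrm{sat}}}},

where \lambda is the wavelength, NA the numerical aperture, I the depletion-beam intensity and I_{sat} the saturation intensity of the dye; increasing I/I_{sat} pushes d below the diffraction limit \lambda/(2\,\mathrm{NA}). This textbook relation is included for context only; the factor-of-four-to-six improvement quoted above reflects how far I/I_{sat} could be pushed under the reported imaging conditions.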

10.
Generalized Spatial Dirichlet Process Models (total citations: 1, of which 0 self-citations and 1 by others)
Many models for the study of point-referenced data explicitly introduce spatial random effects to capture residual spatial association. These spatial effects are customarily modelled as a zero-mean stationary Gaussian process. The spatial Dirichlet process introduced by Gelfand et al. (2005) produces a random spatial process which is neither Gaussian nor stationary. Rather, it varies about a process that is assumed to be stationary and Gaussian. The spatial Dirichlet process arises as a probability-weighted collection of random surfaces. This can be limiting for modelling and inferential purposes since it insists that a process realization must be one of these surfaces. We introduce a random distribution for the spatial effects that allows different surface selection at different sites. Moreover, we can specify the model so that the marginal distribution of the effect at each site still comes from a Dirichlet process. The development is offered constructively, providing a multivariate extension of the stick-breaking representation of the weights. We then introduce mixing using this generalized spatial Dirichlet process. We illustrate with a simulated dataset of independent replications and note that we can embed the generalized process within a dynamic model specification to eliminate the independence assumption.
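For readers unfamiliar with the construction being generalized, the standard stick-breaking representation of Dirichlet-process weights can be simulated in a few lines. This sketch is the classical univariate construction, not the multivariate, site-dependent extension introduced in the paper.

import numpy as np

def stick_breaking_weights(alpha, n_atoms, rng):
    """Draw truncated Dirichlet-process weights via stick breaking.

    v_k ~ Beta(1, alpha);  w_k = v_k * prod_{j<k} (1 - v_j)
    """
    v = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

rng = np.random.default_rng(7)
w = stick_breaking_weights(alpha=2.0, n_atoms=20, rng=rng)
print("Weights:", np.round(w, 3))
print("Sum (approaches 1 as the truncation grows):", round(w.sum(), 3))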

11.
The development of fluorescent indicators represented a revolution for the life sciences. Genetically encoded and synthetic fluorophores with sensing abilities allowed the visualization of biologically relevant species with high spatial and temporal resolution. Synthetic dyes are of particular interest thanks to their high tunability and the wide range of measurable analytes. However, these molecules suffer from several limitations related to small-molecule behavior (poor solubility, difficulties in targeting, and often no ratiometric imaging). In this work we introduce the development of dendrimer-based sensors and present a procedure for pH measurement in vitro, in living cells, and in vivo. We chose dendrimers as an ideal platform for our sensors because of their many desirable properties (monodispersity, tunable properties, multivalency) that have made them a widely used scaffold for several biomedical devices. The conjugation of fluorescent pH indicators to the dendrimer scaffold led to an enhancement of their sensing performance. In particular, the dendrimer-based sensors exhibit reduced cell leakage, improved intracellular targeting, and allow ratiometric measurements. These novel sensors were successfully employed to measure pH in living HeLa cells and in vivo in mouse brain.

12.
Stochastic ecological network occupancy (SENO) models predict the probability that species will occur in a sample of an ecological network. In this review, we introduce SENO models as a means to fill a gap in the theoretical toolkit of ecologists. As input, SENO models use a topological interaction network and rates of colonization and extinction (including consumer effects) for each species. A SENO model then simulates the ecological network over time, resulting in a series of sub-networks that can be used to identify commonly encountered community modules. The proportion of time a species is present in a patch gives its expected probability of occurrence, whose sum across species gives expected species richness. To illustrate their utility, we provide simple examples of how SENO models can be used to investigate how topological complexity, species interactions, species traits, and spatial scale affect communities in space and time. They can categorize species as biodiversity facilitators, contributors, or inhibitors, making this approach promising for ecosystem-based management of invasive, threatened, or exploited species.
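A minimal flavour of such a simulation: track the presence/absence of each species in one patch over discrete time steps, with colonization and extinction probabilities and a consumer effect on its resource, then read occupancy probabilities off the time averages. The two-species network and rates below are invented for illustration and are far simpler than a full SENO model.

import numpy as np

# Two species: a resource (index 0) and its consumer (index 1). The consumer can
# only persist while the resource is present, and raises the resource's
# extinction probability while both co-occur.
col = np.array([0.10, 0.05])    # per-step colonization probabilities
ext = np.array([0.02, 0.10])    # baseline per-step extinction probabilities
consumer_pressure = 0.05        # extra resource extinction when the consumer is present

rng = np.random.default_rng(11)
present = np.zeros(2, dtype=bool)
occupied_steps = np.zeros(2)
n_steps = 50_000

for _ in range(n_steps):
    e = ext.copy()
    if present[0] and present[1]:
        e[0] += consumer_pressure
    # Occupied patches survive with probability 1 - e; empty ones are colonized with probability col.
    present = np.where(present, rng.random(2) >= e, rng.random(2) < col)
    present[1] &= present[0]    # consumer cannot persist without the resource
    occupied_steps += present

print("Expected occupancy probabilities:", np.round(occupied_steps / n_steps, 3))
print("Expected species richness:", round((occupied_steps / n_steps).sum(), 3))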

13.
Because the regulation of microcirculation in the cerebral cortex cannot be analyzed without measuring the blood flow dynamics and oxygen concentration in cerebral microvessels, we developed a fluorescence and phosphorescence system for estimating red blood cell velocity and oxygen tension in the cerebral microcirculation noninvasively and continuously with high spatial resolution. Using red blood cells labeled with fluorescein isothiocyanate to visualize red cell distribution, and using the oxygen quenching of Pd-meso-tetra-(4-carboxyphenyl)-porphyrin phosphorescence to measure oxygen tension, enabled simultaneous measurement of blood velocity and oxygen tension. We examined how the measurement accuracy was affected by the spatial resolution and by the excitation laser light passing through the targeted microvessel and exciting the oxygen probe dye in the tissue beneath it. Focusing the excitation light into the microvessel stabilized the phosphorescence lifetime at each spatial resolution; moreover, it greatly reduced phosphorescence from the brain tissue. Animal experiments involving acute hemorrhagic shock demonstrated the feasibility of our system by showing that the changes in venular velocity and oxygen tension are synchronized to the change in mean arterial pressure. Our system measures the red cell velocity and oxygen concentration in the cerebral microcirculation by exploiting the differences in luminescence and wavelength between fluorescence and phosphorescence, making it possible to easily acquire information about cerebral microcirculatory distribution and oxygen tension simultaneously.
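Lifetime-based oxygen measurements of this kind rest on the Stern-Volmer relation for collisional quenching,

\frac{1}{\tau} = \frac{1}{\tau_0} + k_q \, p\mathrm{O}_2,

where \tau is the measured phosphorescence lifetime, \tau_0 the lifetime in the absence of oxygen, and k_q the quenching constant; with \tau_0 and k_q calibrated for the Pd-porphyrin probe, a lifetime measurement converts directly to oxygen tension. This is the standard relation underlying phosphorescence oximetry, stated here for context.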

14.
Field studies analyzing the stable isotope composition of xylem water are providing important information on ecosystem water relations. However, the capacity of stable isotopes to characterize the functioning of plants in their environment has not been fully explored because of methodological constraints on the extent and resolution at which samples could be collected and analysed. Here, we introduce an in situ method offering the potential to continuously monitor the stable isotope composition of tree xylem water via its vapour phase using a commercial laser-based isotope analyser and compact microporous probes installed into the xylem. Our technique enables efficient high-frequency measurement with intervals of only a few minutes per sample while eliminating the need for costly and cumbersome destructive collection of plant material and laboratory-based processing. We present field observations of xylem water hydrogen and oxygen isotope compositions obtained over several days including a labelled irrigation event and compare them against results from concurrent destructive sampling with cryogenic distillation and mass spectrometric analysis. The data demonstrate that temporal changes as well as spatial patterns of integration in xylem water isotope composition can be resolved through direct measurement. The new technique can therefore present a valuable tool to study the hydraulic architecture and water utilization of trees.
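Isotope compositions in such work are conventionally reported in delta notation relative to the VSMOW standard,

\delta = \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{VSMOW}}} - 1 \right) \times 1000\ \text{‰},

where R is the ratio of the heavy to the light isotope (²H/¹H or ¹⁸O/¹⁶O). This standard definition is included here for orientation only.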

15.
Biophysical Journal. 2020;118(12):3026–3040.
Currently, a significant barrier to building predictive models of cellular self-assembly processes is that molecular models cannot capture minutes-long dynamics that couple distinct components with active processes, whereas reaction-diffusion models cannot capture structures of molecular assembly. Here, we introduce the nonequilibrium reaction-diffusion self-assembly simulator (NERDSS), which addresses this spatiotemporal resolution gap. NERDSS integrates efficient reaction-diffusion algorithms into generalized software that operates on user-defined molecules through diffusion, binding and orientation, unbinding, chemical transformations, and spatial localization. By connecting the fast processes of binding with the slow timescales of large-scale assembly, NERDSS integrates molecular resolution with reversible formation of ordered, multisubunit complexes. NERDSS encodes models using rule-based formatting languages to facilitate model portability, usability, and reproducibility. Applying NERDSS to steps in clathrin-mediated endocytosis, we design multicomponent systems that can form lattices in solution or on the membrane, and we predict how stochastic but localized dephosphorylation of membrane lipids can drive lattice disassembly. The NERDSS simulations reveal the spatial constraints on lattice growth and the role of membrane localization and cooperativity in nucleating assembly. By modeling viral lattice assembly and recapitulating oscillations in protein expression levels for a circadian clock model, we illustrate the adaptability of NERDSS. NERDSS simulates user-defined assembly models that were previously inaccessible to existing software tools, with broad applications to predicting self-assembly in vivo and designing high-yield assemblies in vitro.
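At its core, the stochastic chemistry that such simulators couple to diffusion and structure can be illustrated with a plain Gillespie simulation of reversible binding. The sketch below is a well-mixed toy model only, with invented rate constants; NERDSS adds spatial diffusion, molecular orientation, and multi-subunit structure on top of this kind of kinetics.

import numpy as np

def gillespie_binding(a0, b0, kon, koff, t_end, rng):
    """Gillespie stochastic simulation of reversible binding A + B <-> AB."""
    a, b, ab, t = a0, b0, 0, 0.0
    while t < t_end:
        r_bind, r_unbind = kon * a * b, koff * ab
        total = r_bind + r_unbind
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)          # waiting time to the next event
        if rng.random() < r_bind / total:          # choose which reaction fires
            a, b, ab = a - 1, b - 1, ab + 1
        else:
            a, b, ab = a + 1, b + 1, ab - 1
    return a, b, ab

rng = np.random.default_rng(5)
print("Final (A, B, AB):", gillespie_binding(100, 100, kon=0.01, koff=0.1, t_end=50.0, rng=rng))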

16.
An ab initio method for building structural models of proteins from x-ray solution scattering data is presented. Simulated annealing is employed to find a chain-compatible spatial distribution of dummy residues which fits the experimental scattering pattern up to a resolution of 0.5 nm. The efficiency of the method is illustrated by the ab initio reconstruction of models of several proteins, with known and unknown crystal structure, from experimental scattering data. The new method substantially improves the resolution and reliability of models derived from scattering data and makes solution scattering a useful technique in large-scale structural characterization of proteins.
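The general idea, annealing a cloud of dummy residues against a target scattering curve computed with the Debye formula, can be sketched as follows. This is a generic Metropolis scheme with invented parameters; it omits the chain-compatibility and compactness constraints that make the published method work.

import numpy as np

def debye_intensity(coords, q):
    """Scattering intensity of identical point scatterers via the Debye formula."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    qd = q[:, None, None] * d[None, :, :]
    sinc = np.where(qd > 0, np.sin(qd) / np.where(qd > 0, qd, 1.0), 1.0)
    return sinc.sum(axis=(1, 2))

def discrepancy(coords, q, target):
    """Mean squared relative difference between computed and target intensities."""
    return np.mean(((debye_intensity(coords, q) - target) / target) ** 2)

def anneal(target_iq, q, n_beads, n_steps, rng, step=1.0, t0=1.0):
    """Metropolis simulated annealing of dummy-bead positions against a target curve."""
    coords = rng.normal(0.0, 10.0, size=(n_beads, 3))
    chi2 = discrepancy(coords, q, target_iq)
    for i in range(n_steps):
        temp = t0 * (1.0 - i / n_steps) + 1e-6           # linear cooling schedule
        trial = coords.copy()
        trial[rng.integers(n_beads)] += rng.normal(0.0, step, size=3)
        chi2_new = discrepancy(trial, q, target_iq)
        if chi2_new < chi2 or rng.random() < np.exp(-(chi2_new - chi2) / temp):
            coords, chi2 = trial, chi2_new
    return coords, chi2

rng = np.random.default_rng(0)
q = np.linspace(0.02, 0.5, 40)                           # momentum transfer grid (arbitrary units)
true_structure = rng.normal(0.0, 8.0, size=(30, 3))      # "unknown" structure used to make the target
coords, chi2 = anneal(debye_intensity(true_structure, q), q, n_beads=30, n_steps=2000, rng=rng)
print("Final relative discrepancy:", round(chi2, 4))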

17.
The reliability of multi-item scales has received a lot of attention in the psychometric literature, where a myriad of measures such as Cronbach's α or the Spearman–Brown formula have been proposed. Most of these measures, however, are based on very restrictive models that apply only to unidimensional instruments. In this article, we introduce two measures to quantify the reliability of multi-item scales based on a more general model. We show that they capture two different aspects of the reliability problem and satisfy a minimum set of intuitive properties. The relevance and complementary value of the measures are studied and earlier approaches are placed in a broader theoretical framework. Finally, we apply them to investigate the reliability of the Positive and Negative Syndrome Scale, a rating scale for the assessment of the severity of schizophrenia.
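For comparison with the more general measures proposed in the article, the classical Cronbach's α for a unidimensional scale can be computed directly from an item-score matrix, as in the sketch below (synthetic data; the formula is the standard one).

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

rng = np.random.default_rng(2)
trait = rng.normal(size=(200, 1))                        # latent severity
scores = trait + rng.normal(scale=0.8, size=(200, 5))    # five noisy items measuring it
print("Cronbach's alpha:", round(cronbach_alpha(scores), 2))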

18.
The interaction between a charged metal implant surface and a surrounding body fluid (electrolyte solution) leads to ion redistribution and thus to the formation of an electrical double layer (EDL). The physical properties of the EDL contribute substantially to the formation of the complex implant-biosystem interface. Study of the EDL began with Hermann von Helmholtz in 1879 and remains a scientific challenge today. The present mini review focuses on introducing the generalized Stern theory of the EDL, which takes into account the orientational ordering of water molecules. To ascertain the plausibility of the generalized Stern models described, we follow the classical model of Stern and introduce two Langevin models for the spatial variation of the relative permittivity, for point-like and for finite-sized ions. We attempt to uncover the subtle interplay between water ordering and finite-sized ions and their impact on the electric potential near the charged implant surface. Two complementary effects appear to account for the spatial dependency of the relative permittivity near the charged implant surface: the dipole moment vectors of water molecules are predominantly oriented towards the surface, and water molecules are depleted due to the accumulation of counterions. Finally, the expressions for the relative permittivity in both Langevin models are generalized by also taking into account the cavity and reaction fields.
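The orientational ordering that drives these models enters through the classical Langevin function. In its textbook form (not the generalized expressions derived in the review, which additionally account for water depletion by counterions, finite ion size, and cavity/reaction-field corrections):

L(x) = \coth x - \frac{1}{x}, \qquad
P(\mathbf{r}) = n_w\, p\, L\!\left(\frac{p\,E(\mathbf{r})}{k_B T}\right), \qquad
\varepsilon_r(\mathbf{r}) \approx 1 + \frac{P(\mathbf{r})}{\varepsilon_0\, E(\mathbf{r})},

where n_w is the number density of water dipoles, p their dipole moment, and E the local field magnitude; as E grows near a highly charged surface, the dipoles saturate and the relative permittivity drops.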

19.
20.
The current approach to using machine learning (ML) algorithms in healthcare is to either require clinician oversight for every use case or use their predictions without any human oversight. We explore a middle ground that lets ML algorithms abstain from making a prediction, to simultaneously improve their reliability and reduce the burden placed on human experts. To this end, we present a general penalized loss minimization framework for training selective prediction-set (SPS) models, which choose to either output a prediction set or abstain. The resulting models abstain when the outcome is difficult to predict accurately, such as on subjects who are too different from the training data, and achieve higher accuracy on those they do give predictions for. We then introduce a model-agnostic, statistical inference procedure for the coverage rate of an SPS model that ensembles individual models trained using K-fold cross-validation. We find that SPS ensembles attain prediction-set coverage rates closer to the nominal level and have narrower confidence intervals for their marginal coverage rates. We apply our method to train neural networks that abstain more for out-of-sample images on the MNIST digit prediction task and achieve higher predictive accuracy for ICU patients compared to existing approaches.
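A stripped-down illustration of the abstention idea (not the paper's penalized-loss SPS framework or its K-fold ensembling): build a prediction set from the most probable classes until a target coverage is reached, and abstain when that set would have to be implausibly large. Thresholds below are illustrative.

import numpy as np

def selective_prediction_set(probs, coverage=0.9, max_set_size=3):
    """Return a prediction set of the most probable classes, or None to abstain.

    Classes are added in order of decreasing probability until the requested
    coverage is reached; if that takes more than max_set_size classes, the
    example is judged too uncertain and the model abstains.
    """
    order = np.argsort(probs)[::-1]
    cum = np.cumsum(probs[order])
    size = int(np.searchsorted(cum, coverage)) + 1
    return None if size > max_set_size else order[:size].tolist()

confident = np.array([0.85, 0.10, 0.03, 0.02])
ambiguous = np.full(4, 0.25)
print("Confident example ->", selective_prediction_set(confident))   # small prediction set
print("Ambiguous example ->", selective_prediction_set(ambiguous))   # abstains (None)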

