Similar Articles
20 similar articles found.
1.
Although adoption of newer Point-of-Care (POC) diagnostics is increasing, using POC diagnostics data to improve epidemiological models remains a significant challenge. In this work, we propose a method to process zip-code-level POC datasets and apply these processed data to calibrate an epidemiological model. Specifically, we develop a calibration algorithm based on simulated annealing and calibrate a parsimonious equation-based model of modified Susceptible-Infected-Recovered (SIR) dynamics. The results show that parsimonious models are remarkably effective in predicting the dynamics observed in the number of infected patients, and that our calibration algorithm can predict the peak loads observed in POC diagnostics data while staying within reasonable, empirically reported parameter ranges from the literature. Additionally, we explore the future use of the calibrated values by testing the correlation between peak load and population density from Census data. Our results show that linearity assumptions about the relationships among various factors can be misleading; further data sources and analysis are therefore needed to identify relationships between additional parameters and existing calibrated ones. Calibration approaches such as ours can determine the values of newly added parameters along with existing ones and enable policy-makers to make better multi-scale decisions.
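A minimal sketch of the kind of calibration loop this abstract describes: a simulated-annealing search over SIR parameters fitted to a weekly infected-count series. The case counts, parameter bounds, and cooling schedule below are illustrative assumptions, not the authors' data or implementation.

```python
# Sketch: calibrate beta and gamma of an SIR model to a hypothetical weekly
# infected-count series via simulated annealing. Illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

observed = np.array([12, 30, 70, 140, 220, 260, 230, 170, 110, 60])  # hypothetical POC counts
N, weeks = 10_000, len(observed)

def sir_infected(params):
    beta, gamma = params
    def rhs(t, y):
        s, i, r = y
        return [-beta * s * i / N, beta * s * i / N - gamma * i, gamma * i]
    sol = solve_ivp(rhs, (0, weeks - 1), [N - observed[0], observed[0], 0],
                    t_eval=np.arange(weeks))
    return sol.y[1]                                  # infected compartment over time

def cost(params):
    return np.mean((sir_infected(params) - observed) ** 2)

rng = np.random.default_rng(0)
current = np.array([0.5, 0.2])                       # initial (beta, gamma) guess
cur_cost = cost(current)
best, best_cost, temp = current.copy(), cur_cost, 500.0
for step in range(2000):
    candidate = np.clip(current + rng.normal(0, 0.02, 2), 1e-3, 2.0)  # stay in a plausible range
    cand_cost = cost(candidate)
    if cand_cost < cur_cost or rng.random() < np.exp((cur_cost - cand_cost) / temp):
        current, cur_cost = candidate, cand_cost     # Metropolis acceptance
        if cur_cost < best_cost:
            best, best_cost = current.copy(), cur_cost
    temp *= 0.995                                    # geometric cooling
print("calibrated beta, gamma:", best, "MSE:", best_cost)
```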

2.
Models of land use change are primary tools for analyzing the causes and consequences of land use change, assessing its impacts on ecosystems, and supporting land use planning and policy. However, no single model is able to capture all of the key processes essential to exploring land use change at different scales and to fully assess driving factors and impacts. Given the multi-scale nature of land use change, combining and integrating existing land use change models is a feasible solution. Taking the Sangong watershed as a case study, this paper describes an integrated methodology in which the Conversion of Land Use and its Effects (CLUE) model, a spatially explicit land use change model, is combined with a system dynamics (SD) model to analyze land use dynamics at different scales. The SD model calculates aggregate changes in demand for each land type, while the CLUE model translates these demands into land use patterns. Without spatial detail, the SD model ensures appropriate treatment of macro-economic, demographic and technological developments, and of changes in economic policies influencing land use demand and supply in a specific region. With the CLUE model, land use change is simulated at high spatial resolution, taking into account land use suitability, spatial policies and restrictions, to balance land use demand and supply. The application of the combined SD-CLUE approach in the Sangong watershed suggests that the methodology can, to some extent, reflect the complex behavior of the land use system at different scales and serve as a useful tool for analyzing complex drivers such as land use policies and assessing their impacts on land use change. The SD model was calibrated with 1987–1998 data and validated with 1998–2004 data; by combining the SD model with the CLUE-S model, future land use scenarios were analyzed for 2004–2030. This work can support a better understanding of the possible impacts of land use change on terrestrial ecosystems and provide scientific support for land use planning and management in the watershed.
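A toy illustration of the SD-to-CLUE division of labour described above: an aggregate demand model sets the area required per land type, and a suitability-based allocator distributes those areas over a grid. The demand trend, suitability layers and greedy allocation rule are invented placeholders; CLUE-S itself uses an iterative competition among land use types rather than this crude scheme.

```python
# Sketch: couple an aggregate demand model (SD role) with a suitability-based
# spatial allocator (CLUE role). Data and rules are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
rows, cols, classes = 50, 50, 3                    # 0 = cropland, 1 = settlement, 2 = forest
suitability = rng.random((classes, rows, cols))    # stand-in for soil/slope/access layers

def demand(year):
    # SD role: aggregate area (in cells) demanded per class; here a simple trend.
    settlement = 300 + 10 * (year - 2004)
    cropland = 1500 - 8 * (year - 2004)
    forest = rows * cols - settlement - cropland
    return np.array([cropland, settlement, forest])

def allocate(dem):
    # CLUE role: fill each class's demanded area in order of cell suitability.
    land = np.full((rows, cols), -1)
    order = np.argsort(-suitability.reshape(classes, -1), axis=1)
    remaining = dem.copy()
    for cls in np.argsort(-dem):                   # allocate the largest demand first
        for flat in order[cls]:
            if remaining[cls] == 0:
                break
            r, c = divmod(flat, cols)
            if land[r, c] == -1:
                land[r, c] = cls
                remaining[cls] -= 1
    return land

for year in (2010, 2020, 2030):
    land = allocate(demand(year))
    print(year, [int((land == k).sum()) for k in range(classes)])
```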

3.
SAmBA is a new software package for the design of minimal experimental protocols using the notion of orthogonal arrays of strength 2. The main application of SAmBA is the search for protein crystallization conditions. Given a user input defining the relevant effectors/variables (e.g., pH, temperature, salts) and their states (e.g., pH: 5, 6, 7 and 8), the software proposes an optimal set of experiments in which all tested variables and the pairwise interactions between them are symmetrically sampled. No a priori restrictions on the number or range of experimental variables are imposed. SAmBA consists of two complementary programs, SAm and BA, using a simulated annealing approach and a backtracking algorithm, respectively. The software is freely available as C code or as an interactive JAVA applet at http://igs-server.cnrs-mrs.fr. Proteins 29:252–257, 1997. © 1997 Wiley-Liss, Inc.
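The annealing component of SAmBA works toward designs in which every pair of variable levels is sampled as evenly as possible. The sketch below minimizes a simple pairwise-imbalance score for a hypothetical four-factor crystallization screen; the factors, level counts, experiment budget and scoring function are assumptions, not the published SAm algorithm.

```python
# Sketch: anneal a small experiment set toward strength-2 (pairwise) balance.
import itertools
import numpy as np

levels = [4, 3, 3, 2]        # e.g. pH, temperature, salt, additive (hypothetical screen)
n_exp = 16                   # number of experiments the user is willing to run
rng = np.random.default_rng(2)

def imbalance(design):
    # Sum of variances of pair-occurrence counts over all factor pairs.
    score = 0.0
    for a, b in itertools.combinations(range(len(levels)), 2):
        counts = np.zeros((levels[a], levels[b]))
        for row in design:
            counts[row[a], row[b]] += 1
        score += counts.var()
    return score

design = np.array([[rng.integers(l) for l in levels] for _ in range(n_exp)])
temp, cur = 1.0, imbalance(design)
for _ in range(5000):
    i, j = rng.integers(n_exp), rng.integers(len(levels))
    old = design[i, j]
    design[i, j] = rng.integers(levels[j])          # perturb one cell of the design
    new = imbalance(design)
    if new < cur or rng.random() < np.exp((cur - new) / temp):
        cur = new                                   # accept the move
    else:
        design[i, j] = old                          # revert
    temp *= 0.999
print("final imbalance score:", cur)
print(design)
```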

4.
Simulation and prediction of urban and village settlement growth and land use change in Yingkou City
Based on five Landsat TM satellite images of Yingkou City, Liaoning Province, acquired in 1988, 1992, 1997, 2000 and 2004, the SLEUTH urban growth and land use change model was used to simulate and predict the growth of urban and village settlements and land use change in Yingkou City from 2005 to 2030 under six scenarios (current trends, no protection, moderate protection, managed growth, ecological sustainability, and regional and urban planning). The results showed that from 1988 to 2004, urban and village settlements in Yingkou City grew by 14.93 km2, and that from 1997 to 2004 the areas of water bodies, orchards, mining land, cropland and other land types changed considerably. For 2005–2030, under the ecological sustainability scenario, the area of urban and village settlements would grow slowly and cropland and forest resources would be well protected, although settlement growth would be constrained to some extent. Under the no-protection scenario, settlements would grow fastest and cropland loss would be largest; under the current-trends scenario, the area of cropland loss would be similar to that of the no-protection scenario but its spatial pattern would differ; under the moderate protection and managed growth scenarios, cropland loss would be small; and under the regional and urban planning scenario, settlement growth would be concentrated in urban development zones and areas around the city. Running SLEUTH under different scenarios can simulate the effects of different land management policies on settlement growth and land use change, providing guidance for coordinating urban-rural development and building a new socialist countryside in China.

5.
One of the most difficult and time-consuming aspects of building compartmental models of single neurons is assigning values to free parameters to make models match experimental data. Automated parameter-search methods potentially represent a more rapid and less labor-intensive alternative to choosing parameters manually. Here we compare the performance of four different parameter-search methods on several single-neuron models. The methods compared are conjugate-gradient descent, genetic algorithms, simulated annealing, and stochastic search. Each method has been tested on five different neuronal models ranging from simple models with between 3 and 15 parameters to a realistic pyramidal cell model with 23 parameters. The results demonstrate that genetic algorithms and simulated annealing are generally the most effective methods. Simulated annealing was overwhelmingly the most effective method for simple models with small numbers of parameters, but the genetic algorithm method was equally effective for more complex models with larger numbers of parameters. The discussion considers possible explanations for these results and makes several specific recommendations for the use of parameter searches on neuronal models.

6.
Urban expansion of Fuxin City from 1997 to 2013 was simulated with the SLEUTH model. The results showed that the best-fit coefficients for urban expansion during the city's economic transition were: diffusion 6, breed 64, spread 44, slope resistance 52, and road gravity 90. Urban expansion in Fuxin mainly took the form of new-center growth (new urban centers produced by spontaneous growth) and edge growth (further growth of new and old urban centers). Expansion during the transition was strongly influenced by road gravity, whose coefficient reached 90. As a resource-exhausted mining city, Fuxin has experienced landslides, subsidence and other hazards caused by mine depletion, and urban expansion during the study period was strongly resisted by slope. In terms of city size, road gravity has a greater effect on small cities than on large ones, and new-center growth is more likely to appear in small cities. Since Fuxin has been designated a resource-exhausted city, economic transition is its top priority; introduced foreign-funded enterprises, newly built development zones and industrial land tend to be sited in areas with convenient road access, are strongly affected by roads, and are prone to enclave-style (leapfrog) development. Using the optimal parameters obtained from SLEUTH calibration, the urban extent of Fuxin was simulated; edge growth was simulated well, while new-center (enclave) growth was simulated poorly, mainly because new-center growth is driven largely by policy decisions rather than by cell-neighborhood interactions. The simulated urban extents for 2001, 2006, 2010 and 2013 achieved high accuracy.

7.
By rearranging naturally occurring genetic components, gene networks can be created that display novel functions. When designing these networks, the kinetic parameters describing DNA/protein binding are of great importance, as these parameters strongly influence the behavior of the resulting gene network. This article presents an optimization method based on simulated annealing to locate combinations of kinetic parameters that produce a desired behavior in a genetic network. Since gene expression is an inherently stochastic process, the simulation component of the simulated annealing optimization is conducted using an accurate multiscale simulation algorithm to calculate an ensemble of network trajectories at each iteration of the simulated annealing algorithm. Using the three-gene repressilator of Elowitz and Leibler as an example, we show that gene network optimizations can be conducted using a mechanistically realistic model integrated stochastically. The repressilator is optimized to give oscillations of an arbitrary specified period. These optimized designs may then provide a starting point for the selection of genetic components needed to realize an in vivo system.
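A stripped-down sketch of the optimization idea: anneal repressilator parameters until the oscillation period approaches a target value. The paper evaluates ensembles of stochastic multiscale trajectories; for brevity the sketch below uses a reduced deterministic ODE repressilator, a crude peak-based period estimate, and two hypothetical kinetic parameters (maximal expression rate and Hill coefficient).

```python
# Sketch: tune reduced repressilator parameters toward a target oscillation period.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.signal import find_peaks

def repressilator(t, y, alpha, n):
    # Reduced 3-variable ring: each species represses the next (Hill repression).
    return [alpha / (1 + y[(i - 1) % 3] ** n) - y[i] for i in range(3)]

def period(params):
    alpha, n = params
    t = np.linspace(0, 200, 4000)
    sol = solve_ivp(repressilator, (0, 200), [1.0, 1.5, 2.0], t_eval=t, args=(alpha, n))
    peaks, _ = find_peaks(sol.y[0])
    if len(peaks) < 3:
        return np.inf                               # no sustained oscillation
    return np.mean(np.diff(t[peaks]))

target = 25.0
rng = np.random.default_rng(3)
current, temp = np.array([10.0, 2.0]), 1.0
cur_cost = abs(period(current) - target)
for _ in range(300):
    cand = np.clip(current + rng.normal(0, [0.5, 0.1]), [1.0, 1.5], [50.0, 4.0])
    cost = abs(period(cand) - target)
    if cost < cur_cost or rng.random() < np.exp((cur_cost - cost) / temp):
        current, cur_cost = cand, cost
    temp *= 0.98
print("parameters:", current, "period error:", cur_cost)
```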

8.
A parallel genetic algorithm for optimization is outlined, and its performance on both mathematical and biomechanical optimization problems is compared to a sequential quadratic programming algorithm, a downhill simplex algorithm and a simulated annealing algorithm. When high-dimensional non-smooth or discontinuous problems with numerous local optima are considered, only the simulated annealing and the genetic algorithm, which are both characterized by a weak search heuristic, are successful in finding the optimal region in parameter space. The key advantage of the genetic algorithm is that it can easily be parallelized at negligible overhead.
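A compact sketch of the parallelization idea: fitness evaluations within each generation are independent, so they can be farmed out to a process pool. The Rastrigin test function, population size and GA operators below are placeholders standing in for the authors' biomechanical objective, not their implementation.

```python
# Sketch: a genetic algorithm whose per-generation fitness evaluations run in a
# process pool. Rastrigin (many local optima) stands in for a biomechanical cost.
import numpy as np
from multiprocessing import Pool

DIM, POP, GENS = 10, 60, 80
rng = np.random.default_rng(4)

def fitness(x):
    return 10 * DIM + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def evolve():
    pop = rng.uniform(-5.12, 5.12, (POP, DIM))
    with Pool() as pool:
        for _ in range(GENS):
            scores = np.array(pool.map(fitness, pop))       # parallel evaluation
            parents = pop[np.argsort(scores)[:POP // 2]]    # truncation selection
            children = []
            for _ in range(POP - len(parents)):
                a, b = parents[rng.integers(len(parents), size=2)]
                mask = rng.random(DIM) < 0.5                # uniform crossover
                child = np.where(mask, a, b)
                child += rng.normal(0, 0.1, DIM) * (rng.random(DIM) < 0.2)  # sparse mutation
                children.append(child)
            pop = np.vstack([parents, np.array(children)])
    return pop[np.argmin([fitness(x) for x in pop])]

if __name__ == "__main__":
    best = evolve()
    print("best fitness:", fitness(best))
```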

9.
An automated calibration method is proposed and applied to the complex hydro-ecological model Delft3D-BLOOM, which is calibrated from monitoring data of Lake Champs-sur-Marne, a small shallow urban lake in the Paris region (France). This method (ABC-RF-SA) combines Approximate Bayesian Computation (ABC) with the machine learning algorithm Random Forest (RF) and a Sensitivity Analysis (SA) of the model parameters. Three target variables are used (total chlorophyll, cyanobacteria and dissolved oxygen concentration) to calibrate 133 parameters. ABC-RF-SA is first applied to a set of simulated observations to validate the methodology. It is then applied to a real set of high-frequency observations recorded over about two weeks at Lake Champs-sur-Marne. The methodology is also compared to standard ABC and ABC-RF formulations. Only ABC-RF-SA allowed the model to reproduce the observed biogeochemical dynamics. The coupling of ABC with RF and SA thus appears crucial for its application to complex hydro-ecological models.
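Delft3D-BLOOM and the full ABC-RF-SA workflow are far larger than can be shown here; the sketch below illustrates only the generic ABC-RF idea on a toy one-parameter growth model: simulate from a parameter prior, then let a random forest relate summary statistics back to the parameter, with classic rejection ABC shown for comparison. The toy model, prior and summaries are assumptions.

```python
# Sketch of the generic ABC-RF idea on a toy model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)

def simulate(growth_rate, days=14, noise=0.1):
    # Toy chlorophyll-like trajectory; stands in for the hydro-ecological model.
    t = np.arange(days)
    return 2.0 * np.exp(growth_rate * t) * (1 + noise * rng.standard_normal(days))

def summaries(series):
    return np.array([series.mean(), series.max(), series[-1] / series[0]])

# "Observed" high-frequency record (here generated with a known rate of 0.15).
obs = summaries(simulate(0.15))

# Reference table: prior draws and their simulated summaries.
prior_draws = rng.uniform(0.0, 0.3, 2000)
table = np.array([summaries(simulate(r)) for r in prior_draws])

# Random-forest regression from summaries to the parameter (ABC-RF step).
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(table, prior_draws)
print("RF point estimate of growth rate:", rf.predict(obs.reshape(1, -1))[0])

# Classic rejection ABC for comparison: keep draws whose summaries are closest.
dist = np.linalg.norm((table - obs) / table.std(axis=0), axis=1)
accepted = prior_draws[np.argsort(dist)[:50]]
print("rejection-ABC posterior mean:", accepted.mean())
```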

10.
Budburst models have mainly been developed to capture the processes of individual trees, and vary in their complexity and plant physiological realism. We evaluated how well eleven models capture the variation in budburst of birch and Norway spruce in Germany, Austria, the United Kingdom and Finland. The comparison was based on the models' performance in relation to their underlying physiological assumptions under four different calibration schemes. The models were not able to accurately simulate the timing of budburst. In general the models overestimated the temperature effect, so that budburst was simulated too early in the United Kingdom and too late in Finland. Among the better-performing models were three models based on the growing degree day concept, with or without day length or chilling, and an empirical model based on spring temperatures. These were also the models least influenced by the calibration data. For birch the best calibration scheme was based on multiple sites in either Germany or Europe, and for Norway spruce the best scheme included multiple sites in Germany or cold years across all sites. Most model and calibration combinations indicated greater bias with higher spring temperatures, mostly simulating earlier than observed budburst.
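The growing degree day (GDD) concept mentioned above can be stated in a few lines: accumulate daily mean temperature above a base threshold from a fixed start date and predict budburst when a critical sum is reached. The base temperature, start date, critical sum and synthetic temperature curve below are illustrative placeholders, not calibrated values from the study.

```python
# Sketch: thermal-time (growing degree day) budburst prediction.
import numpy as np

def budburst_day(daily_mean_temp, t_base=5.0, start_doy=1, gdd_crit=150.0):
    """Return the day of year on which cumulative GDD first exceeds gdd_crit."""
    forcing = np.maximum(daily_mean_temp[start_doy - 1:] - t_base, 0.0)
    cum = np.cumsum(forcing)
    hit = np.argmax(cum >= gdd_crit)
    if cum[hit] < gdd_crit:
        return None                                  # threshold never reached this year
    return start_doy + hit

# Synthetic spring warming curve for one site-year (sigmoidal warm-up).
doy = np.arange(1, 181)
temps = -2 + 18 / (1 + np.exp(-(doy - 100) / 15))
print("predicted budburst DOY:", budburst_day(temps))
```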

11.
Biomineralized skeletons are widespread in animals, and their origins can be traced to the latest Ediacaran or early Cambrian fossil record, in virtually all animal groups. The origin of animal skeletons is inextricably linked with the diversification of animal body plans and the dramatic changes in ecology and geosphere–biosphere interactions across the Ediacaran–Cambrian transition. This apparent independent acquisition of skeletons across diverse animal clades has been proposed to have been driven by co‐option of a conserved ancestral genetic toolkit in different lineages at the same time. This ‘biomineralization toolkit’ hypothesis makes predictions of the early evolution of the skeleton, predictions tested herein through a critical review of the evidence from both the fossil record and development of skeletons in extant organisms. Furthermore, the distribution of skeletons is here plotted against a time‐calibrated animal phylogeny, and the nature of the deep ancestors of biomineralizing animals interpolated using ancestral state reconstruction. All these lines of evidence point towards multiple instances of the evolution of biomineralization through the co‐option of an inherited organic skeleton and genetic toolkit followed by the stepwise acquisition of more complex skeletal tissues under tighter biological control. This not only supports the ‘biomineralization toolkit’ hypothesis but also provides a model for describing the evolution of complex biological systems across the Ediacaran–Cambrian transition.

12.
Inertial measurement units (IMUs) are integrated electronic devices that contain accelerometers, magnetometers and gyroscopes. Wearable motion capture systems based on IMUs have been advertised as alternatives to optical motion capture. In this paper, the accuracy of five different IMUs of the same type in measuring 3D orientation in static situations, as well as the calibration of the accelerometers and magnetometers within the IMUs, has been investigated. The maximum absolute static orientation error was 5.2 degrees, higher than the 1 degree claimed by the vendor. If the IMUs are re-calibrated at the time of measurement with the re-calibration procedure described in this paper, it is possible to obtain an error of less than 1 degree, in agreement with the vendor's specifications (XSens Technologies B.V. 2005. Motion tracker technical documentation Mtx-B. Version 1.03. Available from: www.xsens.com). The new calibration appears to be valid for at least 22 days provided the sensor is not exposed to high impacts. However, if several sensors are 'daisy chained' together, changes to the magnetometer bias can cause heading errors of up to 15 degrees. The results demonstrate the non-linear relationship between the vendor's orthogonality claim of <0.1 degrees and the accuracy of 3D orientation obtained from factory-calibrated IMUs in static situations. The authors hypothesise that the high magnetic dip (64 degrees) in our laboratory may have exacerbated the errors reported. For biomechanical research, small relative movements of a body segment from a calibrated position are likely to be more accurate than large-scale global motion, which may have an error of up to 9.8 degrees.
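One standard way to quantify static orientation errors of the kind reported above is the angle of the relative rotation that maps the IMU-reported orientation onto a reference orientation. The reference and measured rotation matrices below are made up for illustration; the formula itself is the usual angle-from-rotation-matrix relation, not a procedure taken from the paper.

```python
# Sketch: static orientation error as the angle of the relative rotation between
# a reference orientation and the IMU-reported orientation (both as matrices).
import numpy as np

def rot_z(deg):
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])

def orientation_error_deg(R_ref, R_imu):
    R_rel = R_ref.T @ R_imu                          # rotation left over after alignment
    cos_theta = np.clip((np.trace(R_rel) - 1) / 2, -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

R_reference = rot_z(30.0)        # orientation from a trusted rig (illustrative)
R_measured = rot_z(33.5)         # orientation reported by the IMU (illustrative)
print("static orientation error:", orientation_error_deg(R_reference, R_measured), "deg")
```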

13.
Applications of ecosystem flux models on large geographical scales are often limited by model complexity and data availability. Here we calibrated and evaluated a semi‐empirical ecosystem flux model, PREdict Light‐use efficiency, Evapotranspiration and Soil water (PRELES), for various forest types and climate conditions, based on eddy covariance data from 55 sites. A Bayesian approach was adopted for model calibration and uncertainty quantification. We applied the site‐specific calibrations and multisite calibrations to nine plant functional types (PFTs) to obtain the site‐specific and PFT‐specific parameter vectors for PRELES. A systematically designed cross‐validation was implemented to evaluate calibration strategies and the risks in extrapolation. The combination of plant physiological traits and climate patterns generated significant variation in vegetation responses and model parameters across but not within PFTs, implying that applying the model without PFT‐specific parameters is risky. But within PFT, the multisite calibrations performed as accurately as the site‐specific calibrations in predicting gross primary production (GPP) and evapotranspiration (ET). Moreover, the variations among sites within one PFT could be effectively simulated by simply adjusting the parameter of potential light‐use efficiency (LUE), implying significant convergence of simulated vegetation processes within PFT. The hierarchical modelling of PRELES provides a compromise between satellite‐driven LUE and physiologically oriented approaches for extrapolating the geographical variation of ecosystem productivity. Although measurement errors of eddy covariance and remotely sensed data propagated a substantial proportion of uncertainty or potential biases, the results illustrated that PRELES could reliably capture daily variations of GPP and ET for contrasting forest types on large geographical scales if PFT‐specific parameterizations were applied.
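PRELES itself is not reproduced here; the sketch below only illustrates the Bayesian machinery the abstract refers to: a Metropolis sampler drawing a potential light-use-efficiency (LUE) parameter of a toy GPP model from its posterior given synthetic flux data. The toy GPP formulation, prior bounds and noise level are assumptions.

```python
# Sketch: Metropolis sampling of a single LUE parameter for a toy GPP model,
# standing in for the Bayesian calibration of PRELES against eddy covariance data.
import numpy as np

rng = np.random.default_rng(6)
par = rng.uniform(2, 12, 365)                    # daily PAR, synthetic
modifier = rng.uniform(0.4, 1.0, 365)            # combined temperature/VPD/soil-water modifier
true_lue, sigma = 0.9, 0.5
gpp_obs = true_lue * par * modifier + sigma * rng.standard_normal(365)

def log_posterior(lue):
    if not 0.0 < lue < 3.0:                      # flat prior on a plausible range
        return -np.inf
    resid = gpp_obs - lue * par * modifier
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

chain, current = [], 1.5
lp = log_posterior(current)
for _ in range(20000):
    prop = current + rng.normal(0, 0.02)
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:      # Metropolis acceptance
        current, lp = prop, lp_prop
    chain.append(current)
post = np.array(chain[5000:])                    # discard burn-in
print("posterior mean LUE:", post.mean(), "95% CI:", np.percentile(post, [2.5, 97.5]))
```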

14.
Analyzing urban expansion trends and their drivers is extremely important for sustainable urban development. In Vietnam, however, urbanization has mainly been studied in big cities, often ignoring mountainous and border ones. The present study examined urban expansion and urbanization trends in different directions in Lao Cai city, northern Vietnam, based on remote sensing and GIS data. Over 35 years, the city's urban areas have been concentrated mainly along the riverside and in the north-northwest direction (accounting for 27.95% of urban land area), driven by the border-gate economy; accordingly, urbanization intensity index (UII) values in the 70–100% range are concentrated in the north-northwest region. With urbanization intensity over the 35 years exceeding 43%, Lao Cai is urbanizing rapidly, but it also faces substantial potential flood risk. To determine flood risk in the study area based on natural and socio-economic factors, we used a Gaussian process regression (GPR) model. To overcome the limitations of GPR alone, we combined it with the Firefly algorithm (FA) to optimize model performance. The results showed that the FA-GPR model is suitable for flood risk mapping in Lao Cai city. With R2 = 0.87, this work shows that greater urbanization intensity corresponds to greater flood risk. Therefore, for sustainable development, it is necessary to ensure harmony between economic goals and environmental protection goals.
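A minimal illustration of the FA-GPR coupling: a small firefly search tunes GPR kernel hyperparameters against held-out error. The synthetic predictors, firefly constants, search bounds and fitness function are all placeholders and not the authors' flood-risk dataset or configuration.

```python
# Sketch: tune Gaussian-process-regression hyperparameters with a tiny firefly
# search, evaluated by validation error on synthetic conditioning-factor data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(7)
X = rng.random((200, 4))                           # stand-ins for slope, rainfall, UII, river distance
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)
Xtr, Xva, ytr, yva = X[:150], X[150:], y[:150], y[150:]

def fitness(p):
    length_scale, noise = 10 ** p                  # p holds log10 of the two hyperparameters
    gpr = GaussianProcessRegressor(kernel=RBF(length_scale), alpha=noise, optimizer=None)
    gpr.fit(Xtr, ytr)
    return np.mean((gpr.predict(Xva) - yva) ** 2)  # validation MSE (to minimize)

n_ff, dims = 12, 2
lo, hi = np.array([-1.0, -4.0]), np.array([1.0, 0.0])
pos = rng.uniform(lo, hi, (n_ff, dims))
light = np.array([fitness(p) for p in pos])
for _ in range(20):
    for i in range(n_ff):
        for j in range(n_ff):
            if light[j] < light[i]:                # move firefly i toward brighter (better) j
                r2 = np.sum((pos[i] - pos[j]) ** 2)
                beta = 0.8 * np.exp(-1.0 * r2)
                pos[i] += beta * (pos[j] - pos[i]) + 0.05 * rng.standard_normal(dims)
                pos[i] = np.clip(pos[i], lo, hi)
                light[i] = fitness(pos[i])
best = pos[np.argmin(light)]
print("best log10(length_scale), log10(noise):", best, "val MSE:", light.min())
```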

15.
Conventional electromyography (EMG)-driven musculoskeletal models are calibrated during maximum voluntary contraction (MVC) tasks, but individuals with low back pain cannot perform unbiased MVCs. To address this issue, EMG-driven models can be calibrated in submaximal tasks. However, the effects of maximal (when data points include the maximum contraction) and submaximal calibration techniques on model outputs (e.g., muscle forces, spinal loads) remain unknown. We calibrated a subject-specific EMG-driven model using maximal and submaximal isometric contractions and simulated different independent tasks. Both approaches satisfactorily predicted external moments (Pearson's correlation ∼ 0.75; relative error = 44%), and removing calibration tasks under axial torques markedly improved the model performance (Pearson's correlation ∼ 0.92; relative error ∼ 28%). Unlike individual muscle forces, gross (aggregate) model outputs (i.e., spinal loads, stability index, and the sum of abdominal/back muscle forces) estimated from the maximal and submaximal calibration techniques were highly correlated (r > 0.78). The submaximal calibration method overestimated spinal loads (6% on average) and abdominal muscle forces (11% on average). Individual muscle forces estimated from the maximal and submaximal approaches were substantially different; however, gross model outputs (especially internal loads and stability index) remained highly correlated with small to moderate relative differences. The submaximal calibration technique can therefore be considered an alternative to the conventional maximal calibration approach.

16.
How cognitive task behavior is generated by brain network interactions is a central question in neuroscience. Answering this question calls for the development of novel analysis tools that can firstly capture neural signatures of task information with high spatial and temporal precision (the “where and when”) and then allow for empirical testing of alternative network models of brain function that link information to behavior (the “how”). We outline a novel network modeling approach suited to this purpose that is applied to noninvasive functional neuroimaging data in humans. We first dynamically decoded the spatiotemporal signatures of task information in the human brain by combining MRI-individualized source electroencephalography (EEG) with multivariate pattern analysis (MVPA). A newly developed network modeling approach—dynamic activity flow modeling—then simulated the flow of task-evoked activity over more causally interpretable (relative to standard functional connectivity [FC] approaches) resting-state functional connections (dynamic, lagged, direct, and directional). We demonstrate the utility of this modeling approach by applying it to elucidate network processes underlying sensory–motor information flow in the brain, revealing accurate predictions of empirical response information dynamics underlying behavior. Extending the model toward simulating network lesions suggested a role for the cognitive control networks (CCNs) as primary drivers of response information flow, transitioning from early dorsal attention network-dominated sensory-to-response transformation to later collaborative CCN engagement during response selection. These results demonstrate the utility of the dynamic activity flow modeling approach in identifying the generative network processes underlying neurocognitive phenomena.

How is cognitive task behavior generated by brain network interactions? This study describes a novel network modeling approach and applies it to source electroencephalography data. The model accurately predicts future information dynamics underlying behavior and (via simulated lesioning) suggests a role for cognitive control networks as key drivers of response information flow.
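The core of activity flow modeling is that a held-out region's task activity is predicted as a connectivity-weighted sum of the activity of all other regions. The sketch below uses random data and plain lagged least-squares weights in place of the paper's MRI-individualized source EEG and dynamic FC estimation, so it only demonstrates the mechanics, not the reported results.

```python
# Sketch: activity-flow-style prediction of a held-out region's task time course
# from other regions' lagged activity, weighted by connectivity estimated at rest.
import numpy as np

rng = np.random.default_rng(8)
regions, t_rest, t_task, lag = 20, 1000, 200, 1

rest = rng.standard_normal((regions, t_rest))      # resting-state time series (synthetic)
task = rng.standard_normal((regions, t_task))      # task-evoked time series (synthetic)

def lagged_fc(data, target):
    # Regression weights from all other regions at t-lag to the target at t.
    X = np.delete(data, target, axis=0)[:, :-lag].T
    y = data[target, lag:]
    return np.linalg.lstsq(X, y, rcond=None)[0]

def predict_region(target):
    w = lagged_fc(rest, target)                    # connectivity estimated at rest
    X_task = np.delete(task, target, axis=0)[:, :-lag].T
    return X_task @ w                              # predicted task activity (lag-shifted)

target = 5
pred = predict_region(target)
actual = task[target, lag:]
r = np.corrcoef(pred, actual)[0, 1]
print("prediction accuracy (r) for held-out region:", round(r, 3))  # near zero for random data
```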

17.
The lens of the vertebrate eye was the classic model used to demonstrate the concepts of inductive interactions controlling development. However, it is in the Drosophila model that the greatest progress in understanding molecular mechanisms of eye development have most recently been made. This progress can be attributed to the power of molecular genetics, an approach that was once confined to simpler systems like worms and flies, but is now becoming possible in vertebrates. Thus, the use of transgenic and knock-out gene technology, coupled with the availability of new positional cloning methods, has recently initiated a surge of progress in the mouse genetic model and has also led to the identification of genes involved in human inherited disorders. In addition, gene transfer techniques have opened up opportunities for progress using chick, Xenopus, and other classic developmental systems. Finally, a new vertebrate genetic model, zebrafish, appears very promising for molecular studies. As a result of the opportunities presented by these new approaches, eye development has come into the limelight, hence the timeliness of this focus issue of Developmental Genetics. In this introductory review, we discuss three areas of current work arising through the use of these newer genetic approaches, and pertinent to research articles presented herein. We also touch on related studies reported at the first Keystone Meeting on Ocular Cell and Molecular Biology, recently held in Tamarron Springs, Colorado, January 7–12, 1997. Dev. Genet. 20:175–185, 1997. © 1997 Wiley-Liss, Inc.

18.
19.
Background: Photoneutrons are produced in radiation therapy with high-energy photons, and capture gamma rays are a byproduct of neutron interactions with the wall material of radiotherapy rooms. Aim: In the current study, an analytical formula was proposed for capture gamma dose calculations in double-bend mazes in radiation therapy rooms. Materials and methods: A total of 40 different layouts with double-bend mazes and an 18 MeV photon beam of a Varian 2100 Clinac were simulated using the MCNPX Monte Carlo (MC) code. Neutron capture gamma ray dose equivalent was calculated by the MC method along the maze and at the maze entrance door of all the simulated rooms. All MC results were then fitted to an empirical formula for capture gamma dose calculations. The Wu–McGinley analytical formula for capture gamma dose equivalent at the maze entrance door of single-bend mazes was also used for comparison purposes. Results: For capture gamma dose equivalents at the maze entrance door, differences of 2–11% were seen between MC and the derived equation, while differences of 36–87% were found between MC and the Wu–McGinley method. Conclusion: Our results showed that the derived formula was consistent with the MC results for all 40 geometries. However, as a new formula, further evaluations are required to validate its use in practical situations. Its application is recommended for capture gamma dose calculations in double-bend mazes to improve shielding calculations.

20.
The SITE® model was originally developed to study the response of tropical ecosystems to varying environmental conditions. The present study evaluated the applicability of the SITE model to simulating energy fluxes in a tropical semi-deciduous forest of the southern Amazon Basin. The model was run with data representing the wet and dry seasons and was calibrated separately for each season. The outputs of the calibrated model [net radiation (Rn), latent heat flux (LE) and sensible heat flux (H)] were compared with field observations for validation. With season-specific parameter calibration and a 30-min simulation time step, the magnitude of the simulated temporal flux variation agreed satisfactorily with the observed field data. There was a tendency to underestimate LE and overestimate H. Of all the calibration parameters, the soil moisture parameter varied the most between seasons, and thus most influenced SITE model performance.
