Similar Literature
20 similar documents retrieved (search time: 15 ms)
1.
There is an urgent need to develop alternative strategies to combat malaria caused by Plasmodium falciparum (P. falciparum) because of growing drug resistance and the increasing incidence of infection in humans. 3D models of P. falciparum annotated proteins, built using molecular modeling techniques, will enhance our understanding of the mechanisms of host-parasite interaction for the identification of drug targets and malarial vaccine design. Potential structural templates for P. falciparum annotated proteins were selected from the PDB (Protein Data Bank) using BLASTP (basic local alignment search tool for proteins). This exercise identified 476 Plasmodium proteins with one or more known structural templates (≥40% identity) for further modeling. The pairwise sequence alignments generated for protein modeling were manually checked for errors. The models were then constructed using MODELLER (a comparative protein structure modelling program), followed by energy minimization in the AMBER force field, and checked for errors using PROCHECK. AVAILABILITY: http://bioinfo.icgeb.res.in/codes/model.html.  相似文献
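For readers who want to reproduce the template-selection step described above, a minimal sketch is shown below: it filters BLASTP hits (tab-separated, -outfmt 6 style output) by the ≥40% identity cutoff mentioned in the abstract. The input file name and column layout are assumptions; this is not the authors' pipeline.

```python
# Minimal sketch (not the authors' pipeline): select PDB templates for each
# P. falciparum query from BLASTP tabular output (-outfmt 6), keeping hits
# with >= 40% sequence identity, as described in the abstract.
# Assumes a hypothetical input file "pfalciparum_vs_pdb.tsv".
from collections import defaultdict

MIN_IDENTITY = 40.0  # percent identity threshold used for modeling

templates = defaultdict(list)
with open("pfalciparum_vs_pdb.tsv") as handle:
    for line in handle:
        fields = line.rstrip("\n").split("\t")
        query, subject, pct_identity = fields[0], fields[1], float(fields[2])
        if pct_identity >= MIN_IDENTITY:
            templates[query].append((subject, pct_identity))

# Proteins with at least one usable structural template, best template first
modelable = {q: sorted(hits, key=lambda h: -h[1]) for q, hits in templates.items()}
print(f"{len(modelable)} proteins have templates with >= {MIN_IDENTITY}% identity")
```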

2.
Park H  Seok C 《Proteins》2012,80(8):1974-1986
Contemporary template-based modeling techniques allow modeling methods to be applied to a vast range of biological problems. However, they tend to fail to provide accurate structures for less-conserved local regions in sequence even when the overall structure can be modeled reliably. We call these regions unreliable local regions (ULRs). Accurate modeling of ULRs is of enormous value because they are frequently involved in functional specificity. In this article, we introduce a new method for modeling ULRs in template-based models by employing a sophisticated loop modeling technique. Combined with our previous study on protein termini, the method is applicable to refinement of both loop and terminus ULRs. A large-scale test carried out in a blind fashion in CASP9 (the 9th Critical Assessment of techniques for protein structure prediction) shows that ULR structures are improved over initial template-based models by refinement in more than 70% of the successfully detected ULRs. It is also notable that successful modeling of several long ULRs of more than 12 residues is achieved. Overall, the current results show that a careful application of loop and terminus modeling can be a promising tool for model refinement in template-based modeling.  相似文献
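As a loose illustration of what an unreliable local region might look like at the sequence level, the sketch below flags stretches of a target-template alignment whose local identity falls below a threshold in a sliding window. The window size and cutoff are invented for illustration and are not the criteria used in the paper.

```python
# Illustrative sketch (assumed window size and cutoff, not the CASP9 protocol):
# flag "unreliable local regions" (ULRs) as stretches of a target-template
# alignment whose local sequence identity falls below a threshold.
def find_ulrs(target, template, window=9, min_identity=0.3):
    """target/template: aligned sequences of equal length ('-' for gaps)."""
    assert len(target) == len(template)
    n = len(target)
    unreliable = [False] * n
    for i in range(n):
        lo, hi = max(0, i - window // 2), min(n, i + window // 2 + 1)
        matches = sum(a == b and a != '-' for a, b in zip(target[lo:hi], template[lo:hi]))
        if matches / (hi - lo) < min_identity:
            unreliable[i] = True
    # collapse per-residue flags into (start, end) segments
    ulrs, start = [], None
    for i, flag in enumerate(unreliable + [False]):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            ulrs.append((start, i - 1))
            start = None
    return ulrs

print(find_ulrs("MKTAYIAKQR--QISFVKSHFSRQ", "MKTWYIGKQRLEQI--VKAHFARQ"))
```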

3.
Konopka AK 《Proteomics》2007,7(6):846-856
The theory of surrogacy is briefly outlined as one of the conceptual foundations of systems biology that has been developed for the last 30 years in the context of Hertz-Rosen modeling relationship. Conceptual foundations of modeling convoluted (biologically complex) systems are briefly reviewed and discussed in terms of current and future research in systems biology. New as well as older results that pertain to the concepts of modeling relationship, sequence of surrogacies, cascade of representations, complementarity, analogy, metaphor, and epistemic time are presented together with a classification of models in a cascade. Examples of anticipated future applications of surrogacy theory in life sciences are briefly discussed.  相似文献   

4.
PBPK models in risk assessment--A focus on chloroprene
Mathematical models are increasingly being used to simulate events in the exposure-response continuum, and to support quantitative predictions of risks to human health. Physiologically based pharmacokinetic (PBPK) models address that portion of the continuum from an external chemical exposure to an internal dose at a target site. Essential data needed to develop a PBPK model include values of key physiological parameters (e.g., tissue volumes, blood flow rates) and chemical specific parameters (rate of chemical absorption, distribution, metabolism, and elimination) for the species of interest. PBPK models are commonly used to: (1) predict concentrations of an internal dose over time at a target site following external exposure via different routes and/or durations; (2) predict human internal concentration at a target site based on animal data by accounting for toxicokinetic and physiological differences; and (3) estimate variability in the internal dose within a human population resulting from differences in individual pharmacokinetics. Himmelstein et al. [M.W. Himmelstein, S.C. Carpenter, P.M. Hinderliter, Kinetic modeling of beta-chloroprene metabolism. I. In vitro rates in liver and lung tissue fractions from mice, rats, hamsters, and humans, Toxicol. Sci. 79 (1) (2004) 18-27; M.W. Himmelstein, S.C. Carpenter, M.V. Evans, P.M. Hinderliter, E.M. Kenyon, Kinetic modeling of beta-chloroprene metabolism. II. The application of physiologically based modeling for cancer dose response analysis, Toxicol. Sci. 79 (1) (2004) 28-37] developed a PBPK model for chloroprene (2-chloro-1,3-butadiene; CD) that simulates chloroprene disposition in rats, mice, hamsters, or humans following an inhalation exposure. Values for the CD-PBPK model metabolic parameters were obtained from in vitro studies, and model simulations compared to data from in vivo gas uptake studies in rats, hamsters, and mice. The model estimate for total amount of metabolite in lung correlated better with rodent tumor incidence than did the external dose. Based on this PBPK model analytical approach, Himmelstein et al. [M.W. Himmelstein, S.C. Carpenter, M.V. Evans, P.M. Hinderliter, E.M. Kenyon, Kinetic modeling of beta-chloroprene metabolism. II. The application of physiologically based modeling for cancer dose response analysis, Toxicol. Sci. 79 (1) (2004) 28-37; M.W. Himmelstein, R. Leonard, R. Valentine, Kinetic modeling of beta-chloroprene metabolism: default and physiologically-based modeling approaches for cancer dose response, in: IISRP Symposium on Evaluation of Butadiene & Chloroprene Health Effects, September 21, 2005, TBD--reference in this proceedings issue of Chemical-Biological Interactions] propose that observed species differences in the lung tumor dose-response result from differences in CD metabolic rates. The CD-PBPK model has not yet been submitted to EPA for use in developing the IRIS assessment for chloroprene, but is sufficiently developed to be considered. The process that EPA uses to evaluate PBPK models is discussed, as well as potential applications for the CD-PBPK model in an IRIS assessment.  相似文献   
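The sketch below shows the general structure of an inhalation PBPK model of the kind discussed above: flow-limited liver and rest-of-body compartments, gas exchange at the lung, and saturable (Michaelis-Menten) metabolism in the liver, integrated with SciPy. All parameter values are placeholders and do not correspond to the published chloroprene (CD-PBPK) model.

```python
# Minimal, generic inhalation PBPK sketch (placeholder parameters, NOT the
# published chloroprene model of Himmelstein et al.): flow-limited liver and
# "rest of body" compartments with saturable (Michaelis-Menten) metabolism
# in the liver, as an illustration of the model class discussed above.
import numpy as np
from scipy.integrate import odeint

# Physiological parameters (illustrative rat-like values)
QC, QL = 5.0, 1.0           # cardiac output, liver blood flow (L/h)
QR = QC - QL                # rest-of-body blood flow (L/h)
VL, VR = 0.04, 0.25         # tissue volumes (L)
PL, PR, PB = 2.0, 1.5, 10.0 # liver/blood, rest/blood, blood/air partition coefficients
QP = 4.0                    # alveolar ventilation (L/h)
VMAX, KM = 0.5, 0.1         # metabolism (mg/h, mg/L)
CIN = 1.0                   # inhaled concentration (mg/L air)

def pbpk(y, t):
    AL, AR, AMET = y                       # amounts in liver, rest, metabolized
    CL, CR = AL / VL, AR / VR              # tissue concentrations
    CVL, CVR = CL / PL, CR / PR            # venous concentrations leaving tissues
    CV = (QL * CVL + QR * CVR) / QC        # mixed venous blood
    CA = (QC * CV + QP * CIN) / (QC + QP / PB)  # arterial blood after gas exchange
    rmet = VMAX * CVL / (KM + CVL)         # Michaelis-Menten metabolism in liver
    dAL = QL * (CA - CVL) - rmet
    dAR = QR * (CA - CVR)
    return [dAL, dAR, rmet]

t = np.linspace(0, 6, 200)                 # 6-hour exposure
sol = odeint(pbpk, [0.0, 0.0, 0.0], t)
print("total metabolized after 6 h (mg):", sol[-1, 2])
```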

5.

Background

Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction, since the association of a few proteins can give rise to an enormous number of feasible protein complexes. The layer-based approach is an approximate but accurate method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is greatly reduced, and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems that were not accessible previously.

Results

ALC (Automated Layer Construction) is a computer program that highly simplifies the building of reduced modular models, according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website.

Conclusion

ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files.  相似文献   
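To make the combinatorial-complexity argument concrete, the sketch below contrasts the number of microstate variables for a scaffold with n independent binding sites (2^n complexes) with the n occupancy variables of a layer/macrostate description, and integrates the occupancy ODEs. This is only an illustration of the motivation for layer-based modeling; it does not use the ALC rule syntax.

```python
# Illustration of the combinatorial-complexity problem that motivates the
# layer-based approach (this is NOT the ALC rule syntax): a scaffold protein
# with n independent binding sites has 2**n distinct complexes (microstates),
# while a layer/macrostate description needs only n occupancy variables.
import numpy as np
from scipy.integrate import odeint

n_sites = 10
print("microstate variables:", 2 ** n_sites, " layer variables:", n_sites)

# Layer-style ODEs: fractional occupancy of each site, assuming independence.
kon = np.linspace(0.5, 2.0, n_sites)   # site-specific on-rates (illustrative)
koff = np.full(n_sites, 0.2)
ligand = 1.0

def occupancy(theta, t):
    return kon * ligand * (1 - theta) - koff * theta

t = np.linspace(0, 20, 100)
theta = odeint(occupancy, np.zeros(n_sites), t)
# Probability of any particular full complex is the product of site occupancies.
print("steady-state occupancy of site 0:", round(theta[-1, 0], 3))
```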

6.
Reliable prediction of model accuracy is an important unsolved problem in protein structure modeling. To address this problem, we studied 24 individual assessment scores, including physics-based energy functions, statistical potentials, and machine learning-based scoring functions. Individual scores were also used to construct approximately 85,000 composite scoring functions using support vector machine (SVM) regression. The scores were tested for their abilities to identify the most native-like models from a set of 6000 comparative models of 20 representative protein structures. Each of the 20 targets was modeled using a template of <30% sequence identity, corresponding to challenging comparative modeling cases. The best SVM score outperformed all individual scores by decreasing the average RMSD difference between the model identified as the best of the set and the model with the lowest RMSD (ΔRMSD) from 0.63 Å to 0.45 Å, while having a higher Pearson correlation coefficient to RMSD (r=0.87) than any other tested score. The most accurate score is based on a combination of the DOPE non-hydrogen atom statistical potential; surface, contact, and combined statistical potentials from MODPIPE; and two PSIPRED/DSSP scores. It was implemented in the SVMod program, which can now be applied to select the final model in various modeling problems, including fold assignment, target-template alignment, and loop modeling.  相似文献
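The composite-scoring idea can be sketched as follows: train a support vector regressor that maps several individual assessment scores to RMSD, use its prediction to pick the best decoy, and report ΔRMSD and the Pearson correlation. The data are synthetic and the features are generic placeholders, not the DOPE/MODPIPE/PSIPRED combination used in the paper.

```python
# Sketch of the composite-scoring idea (synthetic data, not the authors'
# DOPE/MODPIPE/PSIPRED feature set): train an SVM regressor that maps several
# individual model-assessment scores to RMSD, then use it to pick the best
# model of a decoy set and report DeltaRMSD and the Pearson correlation.
import numpy as np
from scipy.stats import pearsonr
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n_models, n_scores = 300, 5
scores = rng.normal(size=(n_models, n_scores))            # individual scores
rmsd = 2.0 + scores @ rng.uniform(0.2, 1.0, n_scores) + rng.normal(0, 0.3, n_models)
rmsd = np.clip(rmsd, 0.3, None)

train, test = np.arange(0, 200), np.arange(200, n_models)
svm = SVR(kernel="rbf", C=10.0).fit(scores[train], rmsd[train])
pred = svm.predict(scores[test])

delta_rmsd = rmsd[test][np.argmin(pred)] - rmsd[test].min()  # DeltaRMSD
r, _ = pearsonr(pred, rmsd[test])
print(f"DeltaRMSD = {delta_rmsd:.2f} A, Pearson r = {r:.2f}")
```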

7.
Gravel S 《Genetics》2012,191(2):607-619
Migrations have played an important role in shaping the genetic diversity of human populations. Understanding genomic data thus requires careful modeling of historical gene flow. Here we consider the effect of relatively recent population structure and gene flow and interpret genomes of individuals that have ancestry from multiple source populations as mosaics of segments originating from each population. This article describes general and tractable models for local ancestry patterns with a focus on the length distribution of continuous ancestry tracts and the variance in total ancestry proportions among individuals. The models offer improved agreement with Wright-Fisher simulation data when compared to the state of the art and can be used to infer time-dependent migration rates from multiple populations. Considering HapMap African-American (ASW) data, we find that a model with two distinct phases of "European" gene flow significantly improves the modeling of both tract lengths and ancestry variances.  相似文献
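A toy version of the tract-length reasoning is sketched below: under a single admixture pulse T generations ago, recombination breakpoints accumulate at roughly T per Morgan, and tracts from the minority source are approximately exponential in length. This standard approximation is used only to illustrate the quantities the paper models (tract lengths and ancestry-proportion variance); it is not the paper's time-dependent migration model.

```python
# Toy simulation of local-ancestry tracts under a single admixture pulse
# T generations ago (a standard approximation, not the paper's model):
# breakpoints occur at ~T per Morgan, and each segment is assigned to the
# minority source with probability m, so minority tracts are roughly
# exponential with rate (1 - m) * T.
import numpy as np

rng = np.random.default_rng(1)
T, m, genome_morgans, n_ind = 8, 0.2, 35.0, 500

minority_tracts, ancestry_fraction = [], []
for _ in range(n_ind):
    pos, cur_len, cur_anc, minority_total = 0.0, 0.0, None, 0.0
    while pos < genome_morgans:
        seg = min(rng.exponential(1.0 / T), genome_morgans - pos)
        anc = rng.random() < m            # True = minority-source segment
        if anc == cur_anc:
            cur_len += seg                # extend the current tract
        else:
            if cur_anc:                   # close a finished minority tract
                minority_tracts.append(cur_len)
            cur_anc, cur_len = anc, seg
        if anc:
            minority_total += seg
        pos += seg
    if cur_anc:
        minority_tracts.append(cur_len)
    ancestry_fraction.append(minority_total / genome_morgans)

print("mean minority tract length (cM):", round(100 * np.mean(minority_tracts), 1))
print("variance in ancestry proportion:", round(float(np.var(ancestry_fraction)), 4))
```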

8.
Computational models of integrin-based adhesion complexes have revealed important insights into the mechanisms by which cells establish connections with their external environment. However, how changes in conformation and function of individual adhesion proteins regulate the dynamics of whole adhesion complexes remains largely elusive. This is because of the large separation in time and length scales between the dynamics of individual adhesion proteins (nanoseconds and nanometers) and the emergent dynamics of the whole adhesion complex (seconds and micrometers), and the limitations of molecular simulation approaches in extracting accurate free energies, conformational transitions, reaction mechanisms, and kinetic rates that can inform mechanisms at the larger scales. In this review, we discuss models of integrin-based adhesion complexes and highlight their main findings regarding: (i) the conformational transitions of integrins at the molecular and macromolecular scales and (ii) the molecular clutch mechanism at the mesoscale. Lastly, we present unanswered questions in the field of modeling adhesions and propose new ideas for exciting future modeling opportunities.  相似文献

9.
Extrapolation of health risks from high to low doses has received considerable attention in carcinogenic risk assessment for decades. Fitting statistical dose-response models to experimental data collected at high doses and using the fitted models to estimate effects at low doses can lead to quite different risk predictions. Dissatisfaction with this procedure has been expressed both by toxicologists, who saw a deficit of biological knowledge in the models, and by risk modelers, who saw the need for mechanistically based stochastic modeling. This contribution summarizes the present status of low dose modeling and the determination of the shape of dose-response curves. We will address the controversial issues of the appropriateness of threshold models, the estimation of no observed adverse effect levels (NOAELs), and their relevance for low dose modeling. We will distinguish between quantal dose-response models for tumor incidence and models of the more informative age/time dependent tumor incidence. The multistage model and the two-stage model of clonal expansion are considered as dose-response models accounting for biological mechanisms. Problems of the identifiability of mechanisms are addressed, the relation between administered dose and effective target dose is illustrated by examples, and the recently proposed Benchmark Dose concept for risk assessment is presented with its consequences for mechanistic modeling and statistical estimation.  相似文献
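For the quantal case, the sketch below fits the multistage model P(d) = 1 - exp(-(q0 + q1 d + q2 d^2)) to invented bioassay counts by maximum likelihood and solves for the benchmark dose at 10% extra risk. The data and starting values are illustrative only.

```python
# Sketch of quantal dose-response fitting with the multistage model
# P(d) = 1 - exp(-(q0 + q1*d + q2*d^2)) and a Benchmark Dose (BMD) for 10%
# extra risk. The bioassay counts below are invented for illustration only.
import numpy as np
from scipy.optimize import minimize, brentq

dose = np.array([0.0, 1.0, 3.0, 10.0])          # administered dose
tumors = np.array([2, 5, 12, 30])               # animals with tumors
n = np.array([50, 50, 50, 50])                  # animals per group

def prob(d, q):
    return 1.0 - np.exp(-(q[0] + q[1] * d + q[2] * d ** 2))

def neg_loglik(q):
    p = np.clip(prob(dose, q), 1e-9, 1 - 1e-9)
    return -np.sum(tumors * np.log(p) + (n - tumors) * np.log(1 - p))

fit = minimize(neg_loglik, x0=[0.05, 0.01, 0.001],
               bounds=[(0, None)] * 3, method="L-BFGS-B")
q = fit.x

# BMD10: dose giving 10% extra risk over background
def extra_risk(d):
    p0 = prob(0.0, q)
    return (prob(d, q) - p0) / (1 - p0) - 0.10

bmd = brentq(extra_risk, 1e-6, dose.max())
print("fitted q:", np.round(q, 4), " BMD10:", round(bmd, 2))
```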

10.
Nuclear pore complexes mediate the rapid trafficking of target macromolecules between the nucleus and the cytoplasm but exclude non-targets. Mathematical modeling helps to define the physical properties of a transport medium that can selectively enhance the permeation of some molecules but block others. Recent pioneering work has established a basis for quantitative modeling of nuclear translocation, and we expect this field to expand rapidly. A second area where modeling of nucleocytoplasmic transport has been prominently employed is in efforts to understand the regulatory networks by which signals pass between the nuclear and cytoplasmic compartments. Recent evidence suggests that the distinctive kinetics and spatial organization of nuclear transport processes can be used to efficiently propagate signals by new and unexpected pathways.  相似文献   

11.
The dynamics of biological reaction networks are strongly constrained by thermodynamics. A holistic understanding of their behavior and regulation requires mathematical models that observe these constraints. However, kinetic models may easily violate the constraints imposed by the principle of detailed balance if no special care is taken. Detailed balance demands that in thermodynamic equilibrium all fluxes vanish. We introduce a thermodynamic-kinetic modeling (TKM) formalism that adapts the concepts of potentials and forces from irreversible thermodynamics to kinetic modeling. In the proposed formalism, the thermokinetic potential of a compound is proportional to its concentration. The proportionality factor is a compound-specific parameter called capacity. The thermokinetic force of a reaction is a function of the potentials. Every reaction has a resistance that is the ratio of thermokinetic force and reaction rate. For mass-action type kinetics, the resistances are constant. Since it relies on the thermodynamic concept of potentials and forces, the TKM formalism structurally observes detailed balance for all values of capacities and resistances. Thus, it provides an easy way to formulate physically feasible kinetic models of biological reaction networks. The TKM formalism is useful for modeling large biological networks that are subject to many detailed balance relations.  相似文献
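A hedged sketch of the TKM bookkeeping for a small chain A <-> B <-> C is given below, with the potential taken as concentration divided by capacity (by analogy with electrical capacitance) and the rate as force divided by a constant resistance; the exact conventions are assumptions, but the example shows how fluxes vanish once the potentials equalize, i.e., detailed balance holds structurally.

```python
# Hedged sketch of thermodynamic-kinetic (TKM-style) bookkeeping for
# A <-> B <-> C, using the quantities named in the abstract: a potential per
# compound (here taken as concentration / capacity), a constant resistance
# per reaction, and rate = force / resistance. The specific form of the force
# (difference of reactant and product potentials) is an assumption.
import numpy as np
from scipy.integrate import odeint

capacity = {"A": 1.0, "B": 0.5, "C": 2.0}        # compound-specific capacities
resistance = [0.2, 0.4]                          # one constant resistance per reaction
reactions = [("A", "B"), ("B", "C")]

def rates(conc):
    pot = {s: conc[i] / capacity[s] for i, s in enumerate("ABC")}
    return [(pot[sub] - pot[prod]) / R for (sub, prod), R in zip(reactions, resistance)]

def ode(conc, t):
    v = rates(conc)
    return [-v[0], v[0] - v[1], v[1]]

t = np.linspace(0, 50, 500)
sol = odeint(ode, [1.0, 0.0, 0.0], t)
print("equilibrium concentrations:", np.round(sol[-1], 3))
print("fluxes at equilibrium (should vanish):", np.round(rates(sol[-1]), 6))
```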

12.
Computational modeling has traditionally played an important role in dissecting the mechanisms of cardiac dysfunction. Ventricular electromechanical models, likely the most sophisticated virtual organs to date, integrate detailed information across the spatial scales of cardiac electrophysiology and mechanics and are capable of capturing the emergent behavior and the interaction between electrical activation and mechanical contraction of the heart. The goal of this review is to provide an overview of the latest advancements in multiscale electromechanical modeling of the ventricles. We first detail the general framework of multiscale ventricular electromechanical modeling and describe the state of the art in computational techniques and experimental validation approaches. The powerful utility of ventricular electromechanical models in providing a better understanding of cardiac function is then demonstrated by reviewing the latest insights obtained by these models, focusing primarily on the mechanisms by which mechanoelectric coupling contributes to ventricular arrhythmogenesis, the relationship between electrical activation and mechanical contraction in the normal heart, and the mechanisms of mechanical dyssynchrony and resynchronization in the failing heart. Computational modeling of cardiac electromechanics will continue to complement basic science research and clinical cardiology and holds promise to become an important clinical tool aiding the diagnosis and treatment of cardiac disease.  相似文献

13.
We study a hybrid model that combines Cox proportional hazards regression with tree-structured modeling. The main idea is to use step functions, provided by a tree structure, to 'augment' Cox (1972) proportional hazards models. The proposed model not only provides a natural assessment of the adequacy of the Cox proportional hazards model but also improves its model fitting without loss of interpretability. Both simulations and an empirical example are provided to illustrate the use of the proposed method.  相似文献   
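The general idea of augmenting a Cox model with tree-derived step functions can be illustrated as below (simulated data): a shallow regression tree fit to a crude risk proxy partitions the covariate space, and leaf-membership indicators are then added to a Cox proportional hazards model. The lifelines and scikit-learn packages, the risk proxy, and the simulated data are all assumptions; this is not the authors' fitting algorithm.

```python
# Loose illustration of "augmenting" a Cox model with tree-derived step
# functions (simulated data; this is not the authors' fitting algorithm).
# A shallow regression tree fit to a crude risk proxy partitions the
# covariate space; leaf-membership indicators then enter a Cox PH model.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 400
x1, x2 = rng.normal(size=n), rng.normal(size=n)
# True hazard depends linearly on x1 plus a step in x2
lam = 0.1 * np.exp(0.7 * x1 + 1.0 * (x2 > 0.5))
time = rng.exponential(1 / lam)
censor = rng.exponential(10.0, size=n)
T, E = np.minimum(time, censor), (time <= censor).astype(int)

# Step 1: shallow tree on a rough risk proxy (here -log of observed time;
# note this proxy ignores censoring and is used only for illustration)
tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=40)
tree.fit(np.column_stack([x1, x2]), -np.log(T))
leaf = tree.apply(np.column_stack([x1, x2]))

# Step 2: Cox model with linear terms plus leaf indicators (one leaf dropped)
df = pd.DataFrame({"T": T, "E": E, "x1": x1, "x2": x2})
for lf in np.unique(leaf)[1:]:
    df[f"leaf_{lf}"] = (leaf == lf).astype(int)
cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
cph.print_summary()
```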

14.
Forest insect outbreaks can have large impacts on ecosystems and understanding the underlying ecological processes is critical for their management. Current process-based modeling approaches of insect outbreaks are often based on population processes operating at small spatial scales (i.e. within individual forest stands). As such, they are difficult to parameterize and offer limited applicability when modeling and predicting outbreaks at the landscape level where management actions take place. In this paper, we propose a new process-based landscape model of forest insect outbreaks that is based on stand defoliation, the Forest-Infected-Recovering-Forest (FIRF) model. We explore both spatially-implicit (mean field equations with global dispersal) and spatially-explicit (cellular automata with limited dispersal between neighboring stands) versions of this model to assess the role of dispersal in the landscape dynamics of outbreaks. We show that density-dependent dispersal is necessary to generate cyclic outbreaks in the spatially-implicit version of the model. The spatially-explicit FIRF model with local and stochastic dispersal displays cyclic outbreaks at the landscape scale and patchy outbreaks in space, even without density-dependence. Our simple, process-based FIRF model reproduces large scale outbreaks and can provide an innovative approach to model and manage forest pests at the landscape scale.  相似文献   
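A toy spatially-explicit version of the FIRF idea is sketched below: stands on a grid cycle Forest -> Infected -> Recovering -> Forest, and infection spreads stochastically from infected neighbors in a Moore neighborhood. The transition probabilities are placeholders, not the paper's parameterization.

```python
# Toy spatially-explicit FIRF-style cellular automaton (placeholder rates,
# not the paper's parameterization): stands cycle Forest -> Infected ->
# Recovering -> Forest, and infection spreads stochastically from infected
# neighbors within a Moore neighborhood with periodic boundaries.
import numpy as np

F, I, R = 0, 1, 2
rng = np.random.default_rng(2)
size, steps = 100, 200
beta, recover_p, regrow_p = 0.08, 0.3, 0.05   # per-neighbor infection, recovery, regrowth

grid = np.zeros((size, size), dtype=int)
grid[rng.random((size, size)) < 0.01] = I      # sparse initial outbreak

history = []
for _ in range(steps):
    infected = (grid == I).astype(float)
    # number of infected Moore neighbors for every cell
    neighbors = sum(np.roll(np.roll(infected, di, 0), dj, 1)
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)
                    if (di, dj) != (0, 0))
    rand = rng.random(grid.shape)
    new = grid.copy()
    new[(grid == F) & (rand < 1 - (1 - beta) ** neighbors)] = I
    new[(grid == I) & (rand < recover_p)] = R
    new[(grid == R) & (rand < regrow_p)] = F
    grid = new
    history.append(infected.mean())

print("defoliated fraction over the last 50 steps:", np.round(history[-50::10], 3))
```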

15.
The Food and Drug Administration (FDA) initiative of Process Analytical Technology (PAT) encourages the monitoring of biopharmaceutical manufacturing processes by innovative solutions. Raman spectroscopy and the chemometric modeling tool partial least squares (PLS) have been applied to this aim for monitoring cell culture process variables. This study compares the chemometric modeling methods of Support Vector Machine radial (SVMr), Random Forests (RF), and Cubist to the commonly used linear PLS model for predicting cell culture components—glucose, lactate, and ammonia. This research is performed to assess whether the use of PLS as standard practice is justified for chemometric modeling of Raman spectroscopy and cell culture data. Model development data from five small-scale bioreactors (2 × 1 L and 3 × 5 L) using two Chinese hamster ovary (CHO) cell lines were used to predict against a manufacturing scale bioreactor (2,000 L). Analysis demonstrated that Cubist predictive models were better for average performance over PLS, SVMr, and RF for glucose, lactate, and ammonia. The root mean square error of prediction (RMSEP) of Cubist modeling was acceptable for the process concentration ranges of glucose (1.437 mM), lactate (2.0 mM), and ammonia (0.819 mM). Interpretation of variable importance (VI) results theorizes the potential advantages of Cubist modeling in avoiding interference of Raman spectral peaks. Predictors/Raman wavenumbers (cm−1) of interest for individual variables are X1139–X1141 for glucose, X846–X849 for lactate, and X2941–X2943 for ammonia. These results demonstrate that other beneficial chemometric models are available for use in monitoring cell culture with Raman spectroscopy.  相似文献   
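A condensed version of such a model comparison on synthetic spectra is sketched below with scikit-learn, scoring PLS, an RBF-kernel SVM, and a random forest by RMSEP on held-out samples. Cubist is omitted because it is a separate R/third-party implementation; the spectra and concentrations are simulated, not the CHO bioreactor data.

```python
# Sketch of the chemometric model comparison on synthetic "spectra" (Cubist
# itself is an R/third-party algorithm and is omitted here): PLS vs. SVM with
# an RBF kernel vs. random forest for predicting a concentration from
# spectral intensities, scored by root mean square error of prediction.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 200, 300
conc = rng.uniform(0, 30, n_samples)                         # e.g. glucose, mM
peak = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 150) / 5) ** 2)
X = conc[:, None] * peak + rng.normal(0, 0.5, (n_samples, n_wavenumbers))

X_tr, X_te, y_tr, y_te = train_test_split(X, conc, random_state=1)
models = {
    "PLS": PLSRegression(n_components=5),
    "SVMr": SVR(kernel="rbf", C=100.0),
    "RF": RandomForestRegressor(n_estimators=200, random_state=1),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    rmsep = mean_squared_error(y_te, np.ravel(model.predict(X_te))) ** 0.5
    print(f"{name}: RMSEP = {rmsep:.2f} mM")
```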

16.
Advances in lake ecosystem dynamics modeling
Starting from the role of systems analysis in lake ecosystem dynamics research, this review summarizes the processes, methods, and software used for dynamic modeling of lake ecosystems. On this basis, the development of lake ecosystem dynamics models in China and abroad is reviewed. Since the 1960s, lake ecosystem dynamics models have evolved from simple zero-dimensional models to complex integrated water-quality, hydrodynamic, and ecological models and structurally dynamic ecological models such as the LakeWeb model. Research on lake ecosystem dynamics models in China began in the 1980s and has focused mainly on heavily eutrophic lakes such as Dianchi, Taihu, Donghu, and Chaohu, as well as other water bodies. Several software packages have been developed for simulating lake ecosystem dynamics, principally CEQUALICM, WASP, AQUATOX, PAMOLARE, and CAEDYM, together with ECOPATH for simulating lake energy flows. Lake ecosystem dynamics models still have shortcomings in monitoring, data sharing, model structure, parameter selection, and uncertainty analysis, which need to be addressed in future research.  相似文献

17.
In many research projects on modeling and analyzing biological pathways, the Petri net has been recognized as a promising method for representing biological pathways. From the pioneering works by Reddy et al., 1993, and Hofestädt, 1994, that model metabolic pathways by traditional Petri net, several enhanced Petri nets such as colored Petri net, stochastic Petri net, and hybrid Petri net have been used for modeling biological phenomena. Recently, Matsuno et al., 2003b, introduced the hybrid functional Petri net (HFPN) in order to give a more intuitive and natural modeling method for biological pathways than these existing Petri nets. Although the paper demonstrates the effectiveness of HFPN with two examples of gene regulation mechanism for circadian rhythms and apoptosis signaling pathway, there has been no detailed explanation about the method of HFPN construction for these examples. The purpose of this paper is to describe a method to construct biological pathways with the HFPN step-by-step. The method is demonstrated by the well-known glycolytic pathway controlled by the lac operon gene regulatory mechanism.  相似文献
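A minimal continuous-Petri-net-style interpreter (a much simplified subset of what an HFPN provides) is sketched below: places carry continuous markings and transitions fire at marking-dependent speeds, integrated with a simple Euler step. The two-reaction toy pathway and the rate constants are invented and are not the paper's glycolysis / lac operon model.

```python
# Minimal continuous-Petri-net-style sketch (a simplified subset of the hybrid
# functional Petri net, not the paper's glycolysis / lac operon model):
# places carry continuous markings and transitions fire at marking-dependent
# speeds, integrated with a simple Euler scheme.
places = {"glucose": 5.0, "g6p": 0.0, "pyruvate": 0.0}

# (input places, output places, speed function of the current markings)
transitions = [
    (["glucose"], ["g6p"],      lambda m: 0.8 * m["glucose"]),
    (["g6p"],     ["pyruvate"], lambda m: 0.5 * m["g6p"]),
]

dt = 0.01
for _ in range(2000):                      # simulate 20 time units
    speeds = [f(places) for _, _, f in transitions]
    for (inputs, outputs, _), v in zip(transitions, speeds):
        for p in inputs:
            places[p] -= v * dt            # consume from input places
        for p in outputs:
            places[p] += v * dt            # produce into output places

print({p: round(m, 3) for p, m in places.items()})
```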

18.
Modeling organism distributions from survey data involves numerous statistical challenges, including accounting for zero‐inflation, overdispersion, and selection and incorporation of environmental covariates. In environments with high spatial and temporal variability, addressing these challenges often requires numerous assumptions regarding organism distributions and their relationships to biophysical features. These assumptions may limit the resolution or accuracy of predictions resulting from survey‐based distribution models. We propose an iterative modeling approach that incorporates a negative binomial hurdle, followed by modeling of the relationship of organism distribution and abundance to environmental covariates using generalized additive models (GAM) and generalized additive models for location, scale, and shape (GAMLSS). Our approach accounts for key features of survey data by separating binary (presence‐absence) from count (abundance) data, separately modeling the mean and dispersion of count data, and incorporating selection of appropriate covariates and response functions from a suite of potential covariates while avoiding overfitting. We apply our modeling approach to surveys of sea duck abundance and distribution in Nantucket Sound (Massachusetts, USA), which has been proposed as a location for offshore wind energy development. Our model results highlight the importance of spatiotemporal variation in this system, as well as identifying key habitat features including distance to shore, sediment grain size, and seafloor topographic variation. Our work provides a powerful, flexible, and highly repeatable modeling framework with minimal assumptions that can be broadly applied to the modeling of survey data with high spatiotemporal variability. Applying GAMLSS models to the count portion of survey data allows us to incorporate potential overdispersion, which can dramatically affect model results in highly dynamic systems. Our approach is particularly relevant to systems in which little a priori knowledge is available regarding relationships between organism distributions and biophysical features, since it incorporates simultaneous selection of covariates and their functional relationships with organism responses.  相似文献   
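The hurdle structure itself can be sketched compactly (simulated data): a logistic model for presence/absence plus a count model fit to the positive observations, here with statsmodels. The smooth GAM/GAMLSS terms, covariate selection, and the zero-truncation correction of the count part used in the full approach are deliberately omitted.

```python
# Bare-bones hurdle sketch with statsmodels (simulated data): a logistic model
# for presence/absence plus a negative binomial model fit to the positive
# counts only. The smooth (GAM/GAMLSS) covariate terms and the zero-truncation
# correction used in a proper hurdle model are omitted for brevity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1000
depth = rng.uniform(0, 40, n)                       # covariate, e.g. water depth (m)
X = sm.add_constant(depth)

p_occ = 1 / (1 + np.exp(-(1.5 - 0.1 * depth)))      # presence probability
present = rng.random(n) < p_occ
mu = np.exp(2.0 - 0.05 * depth)                     # expected count where present
counts = np.where(present, rng.negative_binomial(2, 2 / (2 + mu)), 0)

# Part 1: presence/absence
occ_fit = sm.GLM((counts > 0).astype(int), X, family=sm.families.Binomial()).fit()
# Part 2: abundance given presence (positive counts only)
pos = counts > 0
count_fit = sm.GLM(counts[pos], X[pos], family=sm.families.NegativeBinomial()).fit()

print(occ_fit.params, count_fit.params, sep="\n")
```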

19.
Species distribution models (SDMs) are important management tools for highly mobile marine species because they provide spatially and temporally explicit information on animal distribution. Two prevalent modeling frameworks used to develop SDMs for marine species are generalized additive models (GAMs) and boosted regression trees (BRTs), but comparative studies have rarely been conducted; most rely on presence‐only data; and few have explored how features such as species distribution characteristics affect model performance. Since the majority of marine species BRTs have been used to predict habitat suitability, we first compared BRTs to GAMs that used presence/absence as the response variable. We then compared results from these habitat suitability models to GAMs that predict species density (animals per km2) because density models built with a subset of the data used here have previously received extensive validation. We compared both the explanatory power (i.e., model goodness of fit) and predictive power (i.e., performance on a novel dataset) of the GAMs and BRTs for a taxonomically diverse suite of cetacean species using a robust set of systematic survey data (1991–2014) within the California Current Ecosystem. Both BRTs and GAMs were successful at describing overall distribution patterns throughout the study area for the majority of species considered, but when predicting on novel data, the density GAMs exhibited substantially greater predictive power than both the presence/absence GAMs and BRTs, likely due to both the different response variables and fitting algorithms. Our results provide an improved understanding of some of the strengths and limitations of models developed using these two methods. These results can be used by modelers developing SDMs and resource managers tasked with the spatial management of marine species to determine the best modeling technique for their question of interest.  相似文献   
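A stripped-down version of the GAM-versus-BRT comparison on synthetic presence/absence data is sketched below, scoring both models by AUC on a held-out set. It assumes the third-party pygam package for the GAM and uses scikit-learn's gradient boosting as a stand-in for a boosted regression tree; it does not reproduce the density models or the cetacean survey data.

```python
# Rough sketch of the GAM vs. BRT comparison on synthetic presence/absence
# data (assumes the third-party package pygam for the GAM; scikit-learn's
# GradientBoostingClassifier stands in for a boosted regression tree).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from pygam import LogisticGAM, s

rng = np.random.default_rng(4)
n = 2000
sst = rng.uniform(10, 25, n)                  # sea surface temperature (deg C)
depth = rng.uniform(0, 4000, n)               # seafloor depth (m)
logit = -8 + 0.5 * sst - 0.03 * (sst - 18) ** 2 - 0.0008 * depth
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)
X = np.column_stack([sst, depth])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
gam = LogisticGAM(s(0) + s(1)).fit(X_tr, y_tr)                 # smooth terms per covariate
brt = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05).fit(X_tr, y_tr)

print("GAM AUC:", round(roc_auc_score(y_te, gam.predict_proba(X_te)), 3))
print("BRT AUC:", round(roc_auc_score(y_te, brt.predict_proba(X_te)[:, 1]), 3))
```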

20.
Flexible parametric measurement error models
Inferences in measurement error models can be sensitive to modeling assumptions. Specifically, if the model is incorrect, the estimates can be inconsistent. To reduce sensitivity to modeling assumptions and yet still retain the efficiency of parametric inference, we propose using flexible parametric models that can accommodate departures from standard parametric models. We use mixtures of normals for this purpose. We study two cases in detail: a linear errors-in-variables model and a change-point Berkson model.  相似文献   
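To see why measurement error makes inference sensitive to distributional assumptions, the sketch below simulates a linear errors-in-variables model whose true covariate follows a two-component normal mixture, then contrasts the attenuated naive least-squares slope with a simple method-of-moments correction that assumes the error variance is known. The paper's flexible parametric (mixture-of-normals) maximum-likelihood fit is beyond this sketch.

```python
# Simulation sketch of a linear errors-in-variables model in which the true
# covariate follows a two-component normal mixture. It shows the attenuation
# of naive least squares and a simple method-of-moments correction using the
# (assumed known) measurement error variance; the flexible parametric
# maximum-likelihood approach of the paper is not implemented here.
import numpy as np

rng = np.random.default_rng(5)
n, sigma_u = 5000, 1.0                           # sample size, error SD

# True covariate from a mixture of normals (violates a single-normal model)
comp = rng.random(n) < 0.4
x_true = np.where(comp, rng.normal(-2, 0.7, n), rng.normal(2, 1.0, n))
w = x_true + rng.normal(0, sigma_u, n)           # observed, error-contaminated
y = 1.0 + 0.5 * x_true + rng.normal(0, 0.5, n)   # outcome, true slope = 0.5

beta_naive = np.cov(w, y)[0, 1] / np.var(w)      # attenuated toward zero
reliability = (np.var(w) - sigma_u ** 2) / np.var(w)
beta_corrected = beta_naive / reliability        # method-of-moments correction

print(f"naive slope = {beta_naive:.3f}, corrected = {beta_corrected:.3f} (truth 0.5)")
```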
