Similar Documents

Found 20 similar documents (search time: 15 ms).
1.
Characterizing the three-dimensional structure of macromolecules is central to understanding their function. Traditionally, structures of proteins and their complexes have been determined using experimental techniques such as X-ray crystallography, NMR, or cryo-electron microscopy—applied individually or in an integrative manner. Meanwhile, however, computational methods for protein structure prediction have been improving their accuracy, gradually, then suddenly, with the breakthrough advance by AlphaFold2, whose models of monomeric proteins are often as accurate as experimental structures. This breakthrough foreshadows a new era of computational methods that can build accurate models for most monomeric proteins. Here, we envision how such accurate modeling methods can combine with experimental structural biology techniques, enhancing integrative structural biology. We highlight the challenges that arise when considering multiple structural conformations, protein complexes, and polymorphic assemblies. These challenges will motivate further developments, both in modeling programs and in methods to solve experimental structures, towards better and quicker investigation of structure–function relationships.

2.
The use of computational modeling and simulation has increased in many biological fields, but despite their potential these techniques are only marginally applied in nutritional sciences. Nevertheless, recent applications of modeling have been instrumental in answering important nutritional questions from the cellular up to the physiological levels. Capturing the complexity of today's important nutritional research questions poses a challenge for modeling to become truly integrative in the consideration and interpretation of experimental data at widely differing scales of space and time. In this review, we discuss a selection of available modeling approaches and applications relevant for nutrition. We then put these models into perspective by categorizing them according to their space and time domain. Through this categorization process, we identified a dearth of models that consider processes occurring between the microscopic and macroscopic scale. We propose a "middle-out" strategy to develop the required full-scale, multilevel computational models. Exhaustive and accurate phenotyping, the use of the virtual patient concept, and the development of biomarkers from "-omics" signatures are identified as key elements of a successful systems biology modeling approach in nutrition research—one that integrates physiological mechanisms and data at multiple space and time scales.

3.
Human physiological functions are regulated across many orders of magnitude in space and time. Integrating the information and dynamics from one scale to another is critical for the understanding of human physiology and the treatment of diseases. Multi-scale modeling, as a computational approach, has been widely adopted by researchers in computational and systems biology. A key unsolved issue is how to appropriately represent the dynamical behaviors of a high-dimensional model at a lower scale by a low-dimensional model at a higher scale, so that it can be used to investigate complex dynamical behaviors at even higher scales of integration. In this article, we first review the different widely used modeling methodologies and their applications at different scales. We then discuss the gaps between modeling methodologies and between scales, and consider potential methods for bridging them.
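One textbook route from a high-dimensional lower-scale model to a low-dimensional higher-scale one is timescale separation via the quasi-steady-state approximation (QSSA). The sketch below is a stdlib-only illustration with invented rate constants, not a method from the reviewed article: a fast substrate/complex system is collapsed to a one-variable Michaelis-Menten model, and the two are integrated side by side.

```python
# Multiscale reduction sketch: a fast-slow enzyme system (the detailed,
# lower-scale model) collapsed to a one-variable Michaelis-Menten model
# (the higher-scale description) via the quasi-steady-state approximation.
# All rate constants are invented for illustration.

K1, KM1, K2, E0 = 100.0, 100.0, 1.0, 0.1   # fast binding, scarce enzyme

def full_step(s, c, dt):
    """One Euler step of the full substrate (s) / complex (c) system."""
    ds = -K1 * s * (E0 - c) + KM1 * c
    dc = K1 * s * (E0 - c) - (KM1 + K2) * c
    return s + dt * ds, c + dt * dc

def reduced_step(s, dt):
    """One Euler step of the QSSA-reduced model: ds/dt = -k2*E0*s/(Km+s)."""
    km = (KM1 + K2) / K1
    return s + dt * (-K2 * E0 * s / (km + s))

s_full, c, s_red, dt = 10.0, 0.0, 10.0, 5e-4
for _ in range(10000):                      # integrate to t = 5
    s_full, c = full_step(s_full, c, dt)
    s_red = reduced_step(s_red, dt)

# The reduced variable tracks the full model's total uncommitted substrate.
print(abs((s_full + c) - s_red))
```

The fast complex variable is eliminated, so the reduced model can be advanced with far fewer state variables at the higher scale while reproducing the slow substrate dynamics.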

4.
During the past several years, there have been a number of advances in the computational and theoretical modeling of lipid bilayer structural and dynamical properties. Molecular dynamics (MD) simulations have increased in length and time scales by about an order of magnitude. MD simulations continue to be applied to more complex systems, including mixed bilayers and bilayer self-assembly. A critical problem is bridging the gap between the still very small MD simulations and the time and length scales of experimental observations. Several new and promising techniques, which use atomic-level correlation and response functions from simulations as input to coarse-grained modeling, are being pursued.
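At the core of any MD simulation like those described above is a symplectic time integrator. As a minimal sketch (a single 1D harmonic "bond" with k = m = 1, nothing like a real bilayer force field), the velocity Verlet scheme used by most MD codes looks like this:

```python
# Minimal molecular-dynamics sketch: the velocity Verlet integrator used by
# most MD codes, applied to a single 1D harmonic "bond" with k = m = 1.
# Purely illustrative; bilayer simulations use full force fields and
# thermostats on top of this core loop.

def force(x, k=1.0):
    return -k * x

def velocity_verlet(x, v, dt):
    a = force(x)
    x_new = x + v * dt + 0.5 * a * dt * dt
    v_new = v + 0.5 * (a + force(x_new)) * dt
    return x_new, v_new

x, v, dt = 1.0, 0.0, 0.01
e_start = 0.5 * v * v + 0.5 * x * x        # total energy at t = 0
for _ in range(10000):                     # ~16 oscillation periods
    x, v = velocity_verlet(x, v, dt)
e_end = 0.5 * v * v + 0.5 * x * x

# Velocity Verlet is symplectic: the total energy stays bounded near e_start.
print(abs(e_end - e_start))
```

The bounded energy error over long trajectories is precisely why this integrator family dominates MD, where simulations must remain stable for millions of steps.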

5.
Recent developments in implicit solvent models may be compared in terms of accuracy and computational efficiency. Based on improvements in the accuracy of generalized Born methods and the speed of Poisson-Boltzmann solvers, it appears that the two techniques are converging to a point at which both will be suitable for simulating certain types of biomolecular systems over sizable time and length scales.
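The generalized Born method mentioned here approximates the electrostatic solvation energy with a pairwise formula (the Still et al. functional form). A stdlib-only sketch, where the Born radii are made-up inputs rather than radii derived from a structure:

```python
import math

# Generalized Born sketch using the Still et al. pairwise functional form.
# Charges in e, distances and radii in Angstrom, energy in kcal/mol
# (Coulomb constant 332.06, halved in the prefactor). The radii below are
# made-up inputs, not effective Born radii computed from a structure.

def f_gb(r, ri, rj):
    return math.sqrt(r * r + ri * rj * math.exp(-r * r / (4.0 * ri * rj)))

def gb_polarization(charges, radii, dists, eps=80.0):
    """Double sum over all i, j; the i == j terms are the Born self-energies."""
    total = 0.0
    for i, qi in enumerate(charges):
        for j, qj in enumerate(charges):
            total += qi * qj / f_gb(dists[i][j], radii[i], radii[j])
    return -166.03 * (1.0 - 1.0 / eps) * total

# Sanity check: a single ion reduces to the Born formula -166.03*(1-1/eps)*q^2/R.
print(gb_polarization([1.0], [2.0], [[0.0]]))
```

The appeal over Poisson-Boltzmann is visible in the code: the energy is an analytic pairwise sum, so forces are cheap to evaluate every MD step, whereas PB requires solving a PDE on a grid.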

6.
The goal of this retrospective article is to place the body of my lab's multiscale mechanobiology work in the context of top-down and bottom-up engineering of bone. We have used biosystems engineering, computational modeling and novel experimental approaches to understand bone physiology, in health and disease, across time (in utero, postnatal growth, maturity, aging and death, as well as evolution) and across length scales (a single bone like a femur, m; a sample of bone tissue, mm-cm; a cell and its local environment, μm; down to the length scale of the cell's own skeleton, the cytoskeleton, nm). First, we introduce the concept of flow in bone and the three calibers of porosity through which fluid flows. Then we describe, at the organ-tissue, tissue-cell and cell-molecule length scales, both multiscale computational models and experimental methods to predict flow in bone and to understand fluid flow as a means of delivering chemical and mechanical cues in bone. Drawing on studies spanning multiple length and time scales, we discuss the importance of appropriate boundary conditions, site-specific material parameters, permeability measures and even micro- and nano-anatomically correct geometries for model predictions and their value in understanding the multiscale mechanobiology of bone. Insights from these multiscale computational modeling and experimental methods are providing us with a means to predict, engineer and manufacture bone tissue in the laboratory and in the human body.
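Fluid flow through a porous medium such as bone tissue is commonly modeled at the tissue scale with Darcy's law, q = -(k/μ)·dP/dx. A minimal sketch with placeholder values (the permeability, viscosity and pressure drop below are illustrative, not measured bone parameters):

```python
# Darcy-flow sketch for fluid movement through a porous tissue sample:
# volumetric flux q = -(k / mu) * dP/dx. The permeability, viscosity and
# pressure values below are placeholders, not measured bone parameters.

def darcy_flux(k, mu, dp, dx):
    """k: permeability (m^2), mu: viscosity (Pa*s), dp/dx: gradient (Pa/m)."""
    return -(k / mu) * (dp / dx)

# Example: a water-like interstitial fluid, a nominal permeability, and a
# 1 kPa pressure drop across 1 cm of tissue.
q = darcy_flux(k=1e-17, mu=1e-3, dp=-1000.0, dx=0.01)
print(q)   # flux in m/s; positive sign means flow down the pressure gradient
```

In a multiscale model, the permeability k is exactly the kind of site-specific material parameter the article emphasizes: it differs by orders of magnitude across the three calibers of bone porosity.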

7.
8.
The nature of the optical cycle of photoactive yellow protein (PYP) makes its elucidation challenging for both experiment and theory. The long transition times render conventional simulation methods ineffective, yet the short signaling-state lifetime makes experimental data difficult to obtain and interpret. Here, through an innovative combination of computational methods, a prediction and analysis of the biological signaling state of PYP is presented. Coarse-grained modeling and a locally scaled diffusion map are first used to obtain a rough bird's-eye view of the free energy landscape of photo-activated PYP. Then all-atom reconstruction, followed by an enhanced sampling scheme (diffusion-map-directed molecular dynamics), is used to focus on the signaling-state region of configuration space and obtain an ensemble of signaling-state structures. To the best of our knowledge, this is the first time an all-atom reconstruction from a coarse-grained model has been performed in a relatively unexplored region of molecular configuration space. We compare our signaling-state prediction with previous computational and more recent experimental results; the comparison is favorable, which validates the method presented. This approach provides additional insight into the PYP photocycle, and can be applied to other systems for which more direct methods are impractical.
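The diffusion-map idea behind this workflow can be sketched in a few lines: build a Gaussian kernel between configurations, row-normalize it into a Markov matrix, and read geometry off the diffusion process. The toy 1D "configurations" and fixed bandwidth below are invented for illustration (the article's locally scaled variant adapts the bandwidth per point):

```python
import math

# Diffusion-map sketch (stdlib only): Gaussian kernel over a few 1D
# "configurations", row-normalized into a Markov matrix; points connected
# by many short diffusion paths end up close in diffusion distance.

points = [0.0, 0.1, 5.0, 5.1]          # two well-separated clusters
eps = 1.0                               # fixed kernel bandwidth (toy value)

kernel = [[math.exp(-(a - b) ** 2 / eps) for b in points] for a in points]
markov = [[v / sum(row) for v in row] for row in kernel]

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

m_t = markov
for _ in range(3):                      # M^8 via repeated squaring
    m_t = matmul(m_t, m_t)

def diffusion_distance(i, j, m):
    return math.sqrt(sum((m[i][k] - m[j][k]) ** 2 for k in range(len(m))))

d_within = diffusion_distance(0, 1, m_t)    # same cluster
d_across = diffusion_distance(0, 2, m_t)    # different clusters
print(d_within < d_across)
```

Real applications diagonalize the Markov matrix and use the leading nontrivial eigenvectors as slow collective coordinates, which is what makes the bird's-eye view of the free energy landscape possible.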

9.
Given the importance of protein-protein interactions for nearly all biological processes, the design of protein affinity reagents for use in research, diagnosis or therapy is an important endeavor. Engineered proteins would ideally have high specificities for their intended targets, but achieving interaction specificity by design can be challenging. There are two major approaches to protein design or redesign. Most commonly, proteins and peptides are engineered using experimental library screening and/or in vitro evolution. An alternative approach involves using protein structure and computational modeling to rationally choose sequences predicted to have desirable properties. Computational design has successfully produced novel proteins with enhanced stability, desired interactions and enzymatic function. Here we review the strengths and limitations of experimental library screening and computational structure-based design, giving examples where these methods have been applied to designing protein interaction specificity. We highlight recent studies that demonstrate strategies for combining computational modeling with library screening. The computational methods provide focused libraries predicted to be enriched in sequences with the properties of interest. Such integrated approaches represent a promising way to increase the efficiency of protein design and to engineer complex functionality such as interaction specificity.

10.
Agent-based models (ABMs) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects are described procedurally as a function of internal states and local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large, complex ABMs. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques for analyzing such complex systems typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close "neighborhood" of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for the ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa.
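The difference between local and global sensitivity analysis can be made concrete with variance-based first-order indices, S_i = Var(E[Y|X_i]) / Var(Y). The sketch below estimates them by exhaustive grid evaluation of a trivially cheap two-parameter stand-in model (nothing to do with ENISI, where each evaluation would be an expensive simulation):

```python
# Global (variance-based) sensitivity sketch: first-order Sobol indices
# S_i = Var(E[Y|X_i]) / Var(Y), estimated on a toy two-parameter "model"
# by exhaustive grid evaluation. The model is a cheap stand-in; for a real
# ABM each evaluation would be a full stochastic simulation.

def model(a, b):
    return a + 0.1 * b              # parameter a dominates by construction

grid = [i / 20.0 for i in range(21)]            # uniform grid on [0, 1]
ys = [model(a, b) for a in grid for b in grid]
mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)

def first_order(index):
    """Variance over one parameter of the mean over the other, over Var(Y)."""
    cond_means = []
    for x in grid:
        vals = [model(x, b) for b in grid] if index == 0 else \
               [model(a, x) for a in grid]
        cond_means.append(sum(vals) / len(vals))
    m = sum(cond_means) / len(cond_means)
    return (sum((c - m) ** 2 for c in cond_means) / len(cond_means)) / var

s_a, s_b = first_order(0), first_order(1)
print(s_a > s_b)        # parameter a explains most of the output variance
```

Unlike a one-at-a-time perturbation around a single operating point, these indices average over the whole parameter space, which is what the article means by capturing the global impact of parameters.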

11.
12.
Metabolic compartmentation represents a major characteristic of eukaryotic cells. The analysis of compartmented metabolic networks is complicated by separation and parallelization of pathways, intracellular transport, and the need for regulatory systems to mediate communication between interdependent compartments. Metabolic flux analysis (MFA) has the potential to reveal compartmented metabolic events, although it is a challenging task requiring demanding experimental techniques and sophisticated modeling. At present no ready-made solution can be provided to cope with the complexity of compartmented metabolic networks, but new powerful tools are emerging. This review gives an overview of different strategies to approach this issue, focusing on different MFA methods and highlighting the additional information that should be included to improve the outcome of an experiment and associate estimation procedures.
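At its core, MFA imposes steady-state mass balances (S·v = 0 for each internal metabolite) and solves for unmeasured fluxes from measured ones. A minimal sketch on an invented toy network (real MFA handles compartmented, underdetermined networks and often isotope-labeling data):

```python
# Metabolic flux analysis sketch: steady-state balances S * v = 0 for
# internal metabolites, solved for unmeasured fluxes. Toy network
# (invented for illustration):
#   v1: -> A (uptake)   v2: A -> B   v3: B -> (biomass)   v4: A -> (secreted)
# Balances:  A: v1 - v2 - v4 = 0      B: v2 - v3 = 0

def solve_fluxes(v1, v3):
    """Infer v2, v4 from measured uptake v1 and biomass flux v3."""
    v2 = v3            # balance on B
    v4 = v1 - v2       # balance on A
    return v2, v4

v2, v4 = solve_fluxes(v1=10.0, v3=6.0)
print(v2, v4)

# Both metabolite balances close with the inferred fluxes.
print(10.0 - v2 - v4, v2 - 6.0)
```

Compartmentation is exactly what breaks this simple picture: parallel pathways in different organelles add unmeasurable transport fluxes, making the system underdetermined without the extra information the review discusses.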

13.
Biological systems are traditionally studied by focusing on a specific subsystem, building an intuitive model for it, and refining the model using results from carefully designed experiments. Modern experimental techniques provide massive data on the global behavior of biological systems, and systematically using these large datasets for refining existing knowledge is a major challenge. Here we introduce an extended computational framework that combines formalization of existing qualitative models, probabilistic modeling, and integration of high-throughput experimental data. Using our methods, it is possible to interpret genomewide measurements in the context of prior knowledge on the system, to assign statistical meaning to the accuracy of such knowledge, and to learn refined models with improved fit to the experiments. Our model is represented as a probabilistic factor graph, and the framework accommodates partial measurements of diverse biological elements. We study the performance of several probabilistic inference algorithms and show that hidden model variables can be reliably inferred even in the presence of feedback loops and complex logic. We show how to refine prior knowledge on combinatorial regulatory relations using hypothesis testing and derive p-values for learned model features. We test our methodology and algorithms on a simulated model and on two real yeast models. In particular, we use our method to explore uncharacterized relations among regulators in the yeast response to hyper-osmotic shock and in the yeast lysine biosynthesis system. Our integrative approach to the analysis of biological regulation is demonstrated to synergistically combine qualitative and quantitative evidence into concrete biological predictions.

14.
15.
Varner VD, Taber LA. Bio Systems. 2012;109(3):412-419.
Researchers in developmental biology are increasingly recognizing the value of theoretical models in studies of morphogenesis. However, creating and testing realistic quantitative models for morphogenetic processes can be an extremely challenging task. The focus of this paper is on models for the mechanics of morphogenesis. Models for these problems often must include large changes in geometry, leading to highly nonlinear problems with the possibility of multiple solutions that must be sorted out using experimental data. Here, we illustrate our approach to these problems using the specific example of head fold formation in the early chick embryo. The interplay between experimental and theoretical results is emphasized throughout, as the model is gradually refined. Some of the limitations inherent in theoretical/computational modeling of biological systems are also discussed.

16.
Mathematical and computational modeling of cardiac excitation-contraction coupling has produced considerable insights into how the heart muscle contracts. With the increase in biophysical and physiological data available, the modeling has become more sophisticated, with investigations spanning in scale from molecular components to whole cells. These modeling efforts have provided insight into cardiac excitation-contraction coupling that advanced and complemented experimental studies. One goal is to extend these detailed cellular models to model the whole heart. While this has been done with mechanical and electrophysiological models, the complexity and fast time course of calcium dynamics have made inclusion of detailed calcium dynamics in whole-heart models impractical. Novel methods such as the probability density approach and the moment closure technique, which increase computational efficiency, might make this tractable.

17.
The traditional approach to computational biophysics studies of molecular systems is brute-force molecular dynamics simulation under the conditions of interest. The disadvantage of this approach is that the time and length scales accessible to computer simulations often do not reach biologically relevant scales. An alternative approach, which we call intuitive modeling, is hypothesis-driven and based on tailoring simplified protein models to the systems of interest. Using intuitive modeling, the length and time scales that can be achieved with simplified protein models exceed those of traditional molecular dynamics simulations. Here, we describe several recent studies that signify the predictive power of simplified protein models within the intuitive-modeling approach.
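A common family of simplified protein models (Go-like models) scores a conformation only by how many of its native contacts are formed, so folding progress reduces to the fraction of native contacts, Q. A stdlib-only sketch with toy 2D bead coordinates and an arbitrary contact cutoff:

```python
import math

# Simplified ("Go-like") protein model sketch: the energy depends only on
# native contacts, so folding progress is tracked by the fraction of native
# contacts Q. Coordinates and the contact cutoff are toy values.

def contacts(coords, cutoff=1.5):
    """Set of residue pairs (i < j, |i - j| > 1) within the cutoff."""
    found = set()
    for i in range(len(coords)):
        for j in range(i + 2, len(coords)):
            if math.dist(coords[i], coords[j]) < cutoff:
                found.add((i, j))
    return found

def fraction_native(coords, native_contacts, cutoff=1.5):
    if not native_contacts:
        return 0.0
    formed = contacts(coords, cutoff) & native_contacts
    return len(formed) / len(native_contacts)

# A "native" hairpin-like fold vs. a stretched-out chain of the same beads.
native = [(0, 0), (1, 0), (1, 1), (0, 1)]
unfolded = [(0, 0), (1, 0), (2, 0), (3, 0)]
q_native = fraction_native(native, contacts(native))
q_unfolded = fraction_native(unfolded, contacts(native))
print(q_native, q_unfolded)   # 1.0 0.0
```

Because the energy function ignores non-native interactions, such models are orders of magnitude cheaper than all-atom force fields, which is what buys the extended time and length scales the article describes.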

18.
Hübner K, Sahle S, Kummer U. The FEBS Journal. 2011;278(16):2767-2857.
Systems biology has received ever-increasing interest during the last decade. A large amount of third-party funding is spent on this topic, which involves quantitative experimentation integrated with computational modeling. Industrial companies are also starting to use this approach more and more often, especially in pharmaceutical research and biotechnology. This leads to the question of whether such interest is wisely invested and whether there are success stories to be told for basic science and/or technology/biomedicine. In this review, we focus on the application of systems biology approaches that have been employed to shed light on both biochemical functions and previously unknown mechanisms. We point out which computational and experimental methods are employed most frequently and which trends in systems biology research can be observed. Finally, we discuss some problems that we have encountered in publications in the field.

19.
Recent technological advances enabled high-throughput collection of Small Angle X-ray Scattering (SAXS) profiles of biological macromolecules. Thus, computational methods for integrating SAXS profiles into structural modeling are needed more than ever. Here, we review specifically the use of SAXS profiles for the structural modeling of proteins, nucleic acids, and their complexes. First, the approaches for computing theoretical SAXS profiles from structures are presented. Second, computational methods for predicting protein structures, dynamics of proteins in solution, and assembly structures are covered. Third, we discuss the use of SAXS profiles in integrative structure modeling approaches that depend simultaneously on several data types.
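The standard route for computing a theoretical SAXS profile from coordinates is the Debye formula, I(q) = Σᵢⱼ fᵢfⱼ sin(q·rᵢⱼ)/(q·rᵢⱼ). A minimal sketch with three invented "atoms" and constant form factors (real codes use q-dependent form factors and a solvation-layer correction):

```python
import math

# Theoretical SAXS profile via the Debye formula:
#   I(q) = sum_ij f_i * f_j * sin(q * r_ij) / (q * r_ij)
# The "atoms" and form factors below are toy values; real codes use
# q-dependent form factors and a solvation-layer correction.

def debye_intensity(coords, factors, q):
    total = 0.0
    for i in range(len(coords)):
        for j in range(len(coords)):
            r = math.dist(coords[i], coords[j])
            if q * r == 0.0:
                total += factors[i] * factors[j]          # sinc(0) = 1
            else:
                total += factors[i] * factors[j] * math.sin(q * r) / (q * r)
    return total

coords = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (0.0, 4.0, 0.0)]
factors = [6.0, 7.0, 8.0]      # carbon/nitrogen/oxygen-like weights

# In the q -> 0 limit the intensity approaches (sum of form factors)^2.
print(debye_intensity(coords, factors, 1e-9), sum(factors) ** 2)
```

The quadratic cost in the number of atoms is why fast approximations (spherical-harmonics expansions, coarse-graining) matter for the high-throughput fitting the review describes.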

20.
Zhao N, Pang B, Shyu CR, Korkin D. Proteomics. 2011;11(22):4321-4330.
Structural knowledge about protein-protein interactions can provide insights into the basic processes underlying cell function. Recent progress in experimental and computational structural biology has led to a rapid growth of experimentally resolved structures and computationally determined near-native models of protein-protein interactions. However, determining whether a protein-protein interaction is physiological or an artifact of an experimental or computational method remains a challenging problem. In this work, we have addressed two related problems. The first is distinguishing between experimentally obtained physiological and crystal-packing protein-protein interactions. The second concerns the classification of near-native and inaccurate docking models. We first defined a universal set of interface features and employed a support vector machine (SVM)-based approach to classify the interactions for both problems, with the accuracy, precision, and recall of the first problem's classifier reaching 93%. To improve the classification, we next developed a semi-supervised learning approach for the second problem, using a transductive SVM (TSVM). We applied both classifiers to a commonly used protein docking benchmark of 124 complexes. We found that while we reached classification accuracies of 78.9% for the SVM classifier and 80.3% for the TSVM classifier, improving protein-docking methods by model re-ranking remains a challenging problem.
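The supervised setup above, features per interface plus a binary label, can be sketched with a stdlib-only logistic regression standing in for the SVM (the article's actual classifier). Both the two features and the labels below are synthetic, generated so that the two classes are separable:

```python
import math
import random

# Interface-classification sketch: a stdlib-only logistic regression stands
# in for the SVM used in the article. Features and labels are synthetic:
# label 1 = "physiological interface", 0 = "crystal-packing contact".

random.seed(0)

def sample(label):
    center = 2.0 if label == 1 else 0.0    # e.g. rescaled buried area, contacts
    return [center + random.gauss(0, 0.3), center + random.gauss(0, 0.3)], label

data = [sample(i % 2) for i in range(200)]

w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(100):                       # gradient descent on the log-loss
    for x, y in data:
        z = max(-30.0, min(30.0, w[0] * x[0] + w[1] * x[1] + b))
        grad = 1.0 / (1.0 + math.exp(-z)) - y
        w[0] -= lr * grad * x[0]
        w[1] -= lr * grad * x[1]
        b -= lr * grad

accuracy = sum((w[0] * x[0] + w[1] * x[1] + b > 0) == (y == 1)
               for x, y in data) / len(data)
print(accuracy)
```

A transductive SVM, as used for the second problem, extends this picture by additionally placing the decision boundary in a low-density region of the unlabeled docking models, which is what makes it semi-supervised.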

Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司). ICP license: 京ICP备09084417号.