Similar Articles
A total of 20 similar articles were found (search time: 15 ms).
1.
Neuron modeling may be said to have originated with the Hodgkin and Huxley action potential model in 1952 and Rall's models of integrative activity of dendrites in 1964. Over the ensuing decades, these approaches have led to a massive development of increasingly accurate and complex data-based models of neurons and neuronal circuits. ModelDB was founded in 1996 to support this new field and enhance the scientific credibility and utility of computational neuroscience models by providing a convenient venue for sharing them. It has grown to include over 1100 published models covering more than 130 research topics. It is actively curated and developed to help researchers discover and understand models of interest. ModelDB also provides mechanisms to assist running models both locally and remotely, and has a graphical tool that enables users to explore the anatomical and biophysical properties that are represented in a model. Each of its capabilities is undergoing continued refinement and improvement in response to user experience. Large research groups (Allen Brain Institute, EU Human Brain Project, etc.) are emerging that collect data across multiple scales and integrate that data into many complex models, presenting new challenges of scale. We end by predicting a future for neuroscience increasingly fueled by new technology and high performance computation, and increasingly in need of comprehensive user-friendly databases such as ModelDB to provide the means to integrate the data for deeper insights into brain function in health and disease.

2.
Background: Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. Results: This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. Comparing the results of a computational experiment on the experimental structure and on a set of its decoy models allows developers and users to assess the specific accuracy threshold required to perform the task effectively. Conclusions: The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Pre-computed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database.
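For readers who want to experiment with the idea of binning decoys by accuracy, the usual yardstick is the RMSD against the experimental structure after optimal superposition (Kabsch algorithm). The sketch below is a generic Python illustration, not code from the ModelDB server; the inputs are assumed to be matched coordinate arrays (e.g., Cα positions) and the toy usage just checks that a rotated, translated copy of the "native" gives an RMSD of essentially zero.

```python
import numpy as np

def kabsch_rmsd(native, decoy):
    """RMSD between two matched (N, 3) coordinate sets after optimal superposition
    (Kabsch algorithm) -- a convenient measure for binning decoys by accuracy."""
    P = np.asarray(native, dtype=float)
    Q = np.asarray(decoy, dtype=float)
    P = P - P.mean(axis=0)                   # remove translation
    Q = Q - Q.mean(axis=0)
    H = Q.T @ P                              # 3x3 covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against improper rotations
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # optimal rotation of the decoy
    diff = P - Q @ R.T
    return float(np.sqrt((diff ** 2).sum() / len(P)))

# Toy check: a rotated and translated copy of the "native" coordinates gives RMSD ~ 0.
rng = np.random.default_rng(1)
native = rng.normal(size=(30, 3))
c, s = np.cos(0.3), np.sin(0.3)
rot = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
print(round(kabsch_rmsd(native, native @ rot.T + 2.0), 6))
```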

3.
SenseLab: new developments in disseminating neuroscience information
This article presents the latest developments in neuroscience information dissemination through the SenseLab suite of databases: NeuronDB, CellPropDB, ORDB, OdorDB, OdorMapDB, ModelDB and BrainPharm. These databases include information related to: (i) neuronal membrane properties and neuronal models, and (ii) genetics, genomics, proteomics and imaging studies of the olfactory system. We describe here: the new features for each database, the evolution of SenseLab's unifying database architecture and instances of SenseLab database interoperation with other neuroscience online resources.

4.
5.
The objective of the paper is to show that electroosmotic flow might play an important role in the intracellular transport of biomolecules. The paper presents two mathematical models describing the role of electroosmosis in the transport of negatively charged messenger proteins to the negatively charged nucleus and in the recovery of fluorescence after photobleaching. The parameters of the models were derived from an extensive review of the literature data. Computer simulations were performed within the COMSOL 4.2a software environment. The first model demonstrated that the presence of electroosmosis might intensify the flux of messenger proteins to the nucleus and allow the efficient transport of negatively charged phosphorylated messenger proteins against the electrostatic repulsion of the negatively charged nucleus. The second model revealed that the presence of the electroosmotic flow made the time of fluorescence recovery dependent on the position of the bleaching spot relative to the cellular membrane. The magnitude of the electroosmotic flow effect was shown to be quite substantial, i.e., increasing the flux of the messengers onto the nucleus up to 4-fold relative to pure diffusion and resulting in up to a 3-fold change in the fluorescence recovery time, and therefore in the apparent diffusion coefficient determined from fluorescence recovery after photobleaching (FRAP) experiments. Based on the results of the modeling and on the universal nature of the electroosmotic flow, the potential wider implications of electroosmotic flow in intracellular and extracellular biological processes are discussed. Both models are available for download from ModelDB.
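For intuition about the size of such an effect: for a 1-D slab with a fixed concentration at one face, an absorbing boundary at the other (a crude stand-in for the nucleus), and a constant drift velocity representing electroosmotic advection, the steady-state flux exceeds the pure-diffusion flux by a factor that depends only on the Peclet number Pe = vL/D. This is a back-of-the-envelope sketch under those simplifying assumptions, not the paper's COMSOL model; a ratio of about 4 corresponds to Pe close to 3.9.

```python
import math

def flux_enhancement(pe: float) -> float:
    """Steady-state flux with constant drift divided by the pure-diffusion flux,
    for a 1-D slab with a fixed concentration at x = 0 and an absorbing boundary
    at x = L.  pe = v * L / D is the Peclet number of the drift."""
    if pe == 0.0:
        return 1.0                        # no drift: pure diffusion
    return pe * math.exp(pe) / math.expm1(pe)

for pe in (0.0, 0.5, 1.0, 2.0, 3.9):
    print(f"Pe = {pe:3.1f}: flux ratio = {flux_enhancement(pe):.2f}")
```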

6.
7.
The methods used for ecosystem modelling are generally based on differential equations. Nowadays, new computational models based on the concurrent processing of multiple agents (multi-agent systems) or on the simulation of biological processes with Population Dynamic P-System (PDP) models are gaining importance. These models have significant advantages over traditional models, such as high computational efficiency, modularity, and their ability to model the interaction between different biological processes that operate concurrently. As a result, they are becoming useful for simulating complex dynamic ecosystems that are intractable with classical techniques. On the other hand, the main drawback of P-System models is the need for calibration. The model parameters represent field measurements taken by experts. However, the exact values of some of these parameters are unknown, and experts define a numerical interval of possible values. Therefore, it is necessary to perform a calibration process to fit the best value within each interval. When the number of unknown parameters increases, the calibration process becomes computationally complex and storage requirements increase significantly. In this paper, we present a parallel tool (PSysCal) for calibrating next-generation PDP models. The results show that the calibration time is reduced exponentially with the amount of computational resources. However, the complexity of the calibration process and a limitation in the number of available computational resources make the calibration process intractable for large models. To solve this, we propose a heuristic technique (PSysCal+H). The results show that this technique significantly reduces the computational cost, making it practical for solving large model instances even with limited computational resources.
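The calibration task itself (choosing one value inside each expert-supplied interval so that simulated outputs match observations) can be illustrated independently of P systems. The sketch below is not PSysCal or PSysCal+H; it is a minimal parallel random-search calibration of a toy logistic-growth model, with hypothetical parameter names, intervals and observations, only to show how interval constraints and parallel scoring fit together.

```python
import numpy as np
from multiprocessing import Pool

# Hypothetical expert-supplied intervals (parameter name -> (low, high)).
INTERVALS = {"birth_rate": (0.1, 0.9), "mortality": (0.01, 0.2), "capacity": (200.0, 800.0)}
OBSERVED = np.array([350.0, 420.0, 410.0])      # illustrative field measurements
TIMES, N0 = (5.0, 10.0, 15.0), 50.0             # sampling times and initial population

def simulate(params):
    """Toy stand-in for one simulation run: logistic growth sampled at three times."""
    r = params["birth_rate"] - params["mortality"]
    k = params["capacity"]
    return np.array([k / (1.0 + (k / N0 - 1.0) * np.exp(-r * t)) for t in TIMES])

def score(params):
    """Sum-of-squares misfit between a candidate simulation and the observations."""
    return float(np.sum((simulate(params) - OBSERVED) ** 2)), params

def random_candidate(rng):
    """Draw one candidate parameter set uniformly inside the expert intervals."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in INTERVALS.items()}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    candidates = [random_candidate(rng) for _ in range(5000)]
    with Pool() as pool:                        # candidates are scored in parallel
        error, best = min(pool.map(score, candidates), key=lambda pair: pair[0])
    print(f"best misfit: {error:.1f}  best parameters: {best}")
```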

8.
One of the most important challenges of contemporary biology is understanding how cells assemble into tissues. The complexity of morphogenesis calls for computational tools able to identify the dominant mechanisms involved in shaping tissues. This narrative review presents individual-based computational models that proved useful in simulating phenomena of interest in tissue engineering (TE), a research field that aims to create tissue replacements in the laboratory. First, we briefly describe morphogenetic mechanisms. Then, we present several computational models of cellular and subcellular resolution, along with applications that illustrate their potential to address problems of TE. Finally, we analyze experiments that may be used to validate computational models of tissue constructs made of cohesive cells. Our analysis shows that the models available in the literature are not exploited to their full potential. We argue that, upon validation, a computational model can be used to optimize cell culture conditions and to design new experiments.

9.
This paper presents two approaches to the individual-based modelling of bacterial ecologies and evolution using computational tools. The first approach is a fine-grained model that is based on networks of interactivity between computational objects representing genes and proteins. The second approach is a coarser-grained, agent-based model, which is designed to explore the evolvability of adaptive behavioural strategies in artificial bacteria represented by learning classifier systems. The structure and implementation of these computational models is discussed, and some results from simulation experiments are presented. Finally, the potential applications of the proposed models to the solution of real-world computational problems, and their use in improving our understanding of the mechanisms of evolution, are briefly outlined.
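To make the individual-based style concrete, the sketch below is a deliberately minimal agent-based loop (not the paper's gene/protein-network or classifier-system model): bacteria on a 1-D nutrient strip move randomly, feed locally, pay a metabolic cost, divide when they have stored enough energy, and die when they run out. All constants are illustrative.

```python
import random

# Minimal agent-based sketch: bacteria on a 1-D nutrient strip.
GRID, STEPS = 50, 200
nutrient = [5.0] * GRID
bacteria = [{"pos": random.randrange(GRID), "energy": 1.0} for _ in range(10)]

for _ in range(STEPS):
    newborn = []
    for b in bacteria:
        b["pos"] = (b["pos"] + random.choice((-1, 0, 1))) % GRID   # random walk
        eaten = min(0.5, nutrient[b["pos"]])                       # feed locally
        nutrient[b["pos"]] -= eaten
        b["energy"] += eaten - 0.1                                 # metabolic cost
        if b["energy"] >= 2.0:                                     # divide
            b["energy"] /= 2.0
            newborn.append({"pos": b["pos"], "energy": b["energy"]})
    bacteria = [b for b in bacteria + newborn if b["energy"] > 0]  # remove the dead
    nutrient = [min(5.0, n + 0.05) for n in nutrient]              # slow nutrient regrowth

print("surviving bacteria:", len(bacteria))
```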

10.
There exists a large body of research on the lens of the mammalian eye spanning the past several decades. The objective of this work is to provide a link between the most recent computational models and some of the pioneering work of the 1970s and 80s. We introduce a general nonelectroneutral model to study the microcirculation in the lens of the eye. It describes the steady-state relationships among ion fluxes, water flow, and the electric field inside cells and in the narrow extracellular spaces between cells in the lens. Using asymptotic analysis, we derive a simplified model based on physiological data and compare our results with those in the literature. We show that our simplified model can be reduced further to the first-generation models, whereas our full model is consistent with the most recent computational models. In addition, our simplified model captures in its equations the main features of the full computational models. Our results serve as a useful intermediate link between the computational models and the first-generation analytical models. Simplified models of this sort may be particularly helpful as the roles of similar osmotic pumps of microcirculation are examined in other tissues with narrow extracellular spaces, such as cardiac and skeletal muscle, liver, kidney, epithelia in general, and the narrow extracellular spaces of the central nervous system, the brain. Simplified models may reveal the general functional plan of these systems before full computational models become feasible and specific.
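The paper's own equations are not reproduced in this listing; as a hedged reminder of the general form such electrodiffusion-convection models are built from (an assumption about the family of models, not a quotation from the paper), the flux of ion species $i$ is typically written as a convection-augmented Nernst-Planck expression,

$$\mathbf{J}_i = -D_i\!\left(\nabla c_i + \frac{z_i F}{RT}\, c_i \nabla\phi\right) + c_i\,\mathbf{u},$$

where $c_i$ is the concentration, $z_i$ the valence, $D_i$ the diffusion coefficient, $\phi$ the electric potential and $\mathbf{u}$ the water velocity. Electroneutral formulations replace the Poisson equation for $\phi$ with the constraint $\sum_i z_i c_i = 0$, whereas a nonelectroneutral model retains it.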

11.
In this paper we take the view that computational models of biological systems should satisfy two conditions: they should be able to predict function at a systems-biology level, and robust techniques for validation against biological models must be available. A modelling paradigm for developing a predictive computational model of cellular interaction is described, and methods of providing robust validation against biological models are explored, followed by a consideration of software issues.

12.

Background  

Computational models of protein structure are usually inaccurate and exhibit significant deviations from the true structure. The utility of models depends on the degree of these deviations. A number of predictive methods have been developed to discriminate between globally incorrect and approximately correct models. However, only a few methods predict the correctness of different parts of computational models. Several Model Quality Assessment Programs (MQAPs) have been developed to detect local inaccuracies in unrefined crystallographic models, but it is not known whether they are useful for computational models, which usually exhibit different and much more severe errors.

13.
Simplified material models are commonly used in computational simulation of biological soft tissue as an approximation of the complicated material response and to minimize computational resources. However, the simulation of complex loadings, such as long-duration tissue swelling, necessitates complex models that are not easy to formulate. This paper strives to offer a comprehensive updated Lagrangian formulation procedure for various non-linear material models for the finite element analysis of biological soft tissues, including definitions of the Cauchy stress and the spatial tangent stiffness. The relationships between water content, osmotic pressure, ionic concentration and the pore pressure stress of the tissue are discussed, along with the merits of these models and their applications.
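As a hedged pointer to the kind of relationship being discussed (generic relations from dilute-solution theory and biphasic mixture mechanics, not the paper's specific constitutive formulation): the osmotic pressure difference across a membrane follows the van't Hoff relation, and the total Cauchy stress in a fluid-saturated tissue is commonly split into an effective solid stress and a pore pressure,

$$\Delta\pi = RT\left(c^{\mathrm{int}} - c^{\mathrm{ext}}\right), \qquad \boldsymbol{\sigma} = \boldsymbol{\sigma}_{\mathrm{eff}} - p\,\mathbf{I},$$

where $c^{\mathrm{int}}$ and $c^{\mathrm{ext}}$ are the internal and external solute concentrations and $p$ is the pore pressure.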

14.
Computational models are increasingly essential to systems neuroscience. Models serve as proofs of concept, tests of sufficiency, and as quantitative embodiments of working hypotheses and are important tools for understanding and interpreting complex data sets. In the olfactory system, models have played a particularly prominent role in framing contemporary theories and presenting novel hypotheses, a role that will only grow as the complexity and intricacy of experimental data continue to increase. This review will attempt to provide a comprehensive, functional overview of computational ideas in olfaction and outline a computational framework for olfactory processing based on the insights provided by these diverse models and their supporting data.

15.
Recent advances in experimental plant biology have led to an increased potential to investigate plant development at a systems level. The emerging research field of Computational Morphodynamics aims to lead this development by combining dynamic spatial experimental data with computational models of molecular networks, growth, and mechanics in a multicellular context. The increased number of published models may lead to a diversification of our understanding of these systems, and methods for evaluating, comparing, and sharing models are among the main challenges for the future. We will discuss this problem using ideas originating from physics and use recent computational models of plant development as examples.

16.
Biophysical Journal, 2020, 118(6): 1455–1465
Physical models of biological systems can become difficult to interpret when they have a large number of parameters. But the models themselves actually depend on (i.e., are sensitive to) only a subset of those parameters. This phenomenon is due to parameter space compression (PSC), in which a subset of parameters emerges as "stiff" as a function of time or space. PSC has only been used to explain analytically solvable physics models. We have generalized this result by developing a numerical approach to PSC that can be applied to any computational model. We validated our method against analytically solvable models of a random walk with drift and protein production and degradation. We then applied our method to a simple computational model of microtubule dynamic instability. We propose that numerical PSC has the potential to identify the low-dimensional structure of many computational models in biophysics. The low-dimensional structure of a model is easier to interpret and identifies the mechanisms and experiments that best characterize the system.
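The paper's numerical PSC procedure is not reproduced here; the sketch below only illustrates the underlying ingredient of any such sensitivity analysis: a finite-difference Jacobian of model output with respect to parameters, whose eigen-spectrum of J^T J separates stiff from sloppy parameter directions. The toy model (production and degradation of a protein) mirrors one of the validation cases named in the abstract, but the code itself is an assumption-laden illustration, not the authors' method.

```python
import numpy as np

def model(theta, t):
    """Toy protein production/degradation model: x(t) = (k_prod / k_deg) * (1 - exp(-k_deg * t))."""
    k_prod, k_deg = theta
    return (k_prod / k_deg) * (1.0 - np.exp(-k_deg * t))

def sensitivity_spectrum(theta, t, rel_step=1e-6):
    """Eigenvalues of J^T J, where J[i, j] = d model(t_i) / d theta_j (forward differences)."""
    theta = np.asarray(theta, dtype=float)
    base = model(theta, t)
    J = np.empty((t.size, theta.size))
    for j in range(theta.size):
        perturbed = theta.copy()
        h = rel_step * max(1.0, abs(theta[j]))
        perturbed[j] += h
        J[:, j] = (model(perturbed, t) - base) / h
    return np.sort(np.linalg.eigvalsh(J.T @ J))[::-1]

t = np.linspace(0.1, 10.0, 50)
# Larger eigenvalue = stiffer parameter direction; smaller = sloppier direction.
print(sensitivity_spectrum([2.0, 0.5], t))
```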

17.
We present Illuminator, a user-friendly web front end to computational models such as docking and 3D shape similarity calculations. Illuminator was specifically created to allow non-experts to design and submit molecules to computational chemistry programs. As such it provides a simple user interface allowing users to submit jobs starting from a 2D structure. The models provided are pre-optimized by computational chemists for each specific target. We provide an example of how Illuminator was used to prioritize the design of molecular substituents in the Anadys HCV Polymerase (NS5B) project. With 7500 submitted jobs in 1.5 years, Illuminator has allowed project teams at Anadys to accelerate the optimization of novel leads. It has also improved communication between project members and increased demand for computational drug discovery tools.

18.
An essential phenomenon of the functional brain is synaptic plasticity, which is associated with changes in the strength of synapses between neurons. These changes are affected by both extracellular and intracellular mechanisms. For example, intracellular phosphorylation-dephosphorylation cycles have been shown to play a special role in synaptic plasticity. Here we provide the first computational comparison of models for synaptic plasticity by evaluating five models describing postsynaptic signal transduction networks. Our simulation results show that some of the models change their behavior completely with varying total concentrations of protein kinase and phosphatase. Furthermore, the responses of the models vary when they are compared with each other. Based on our study, we conclude that there is a need for a general setup to objectively compare the models and an urgent demand for minimum criteria that a computational model for synaptic plasticity needs to meet.
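The sketch below is not one of the five compared models; it is a deliberately minimal phosphorylation-dephosphorylation cycle with Michaelis-Menten kinetics and illustrative rate constants, showing why the total kinase and phosphatase concentrations shift the steady-state phosphorylation level of a substrate.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal phosphorylation-dephosphorylation cycle (Michaelis-Menten kinetics).
# All concentrations and rate constants are illustrative, not taken from the five models.
S_TOT = 10.0                                  # total substrate (arbitrary units)
k_cat_k, K_m_k = 1.0, 2.0                     # kinase turnover rate and Michaelis constant
k_cat_p, K_m_p = 0.8, 2.0                     # phosphatase turnover rate and Michaelis constant

def cycle(t, y, kinase_tot, phosphatase_tot):
    s_p = y[0]                                # phosphorylated substrate
    s = S_TOT - s_p                           # unphosphorylated substrate
    phosphorylation = k_cat_k * kinase_tot * s / (K_m_k + s)
    dephosphorylation = k_cat_p * phosphatase_tot * s_p / (K_m_p + s_p)
    return [phosphorylation - dephosphorylation]

# Sweep the total kinase concentration to see how the steady state shifts.
for kinase_tot in (0.5, 1.0, 2.0):
    sol = solve_ivp(cycle, (0.0, 200.0), [0.0], args=(kinase_tot, 1.0))
    print(f"kinase_tot = {kinase_tot}: steady-state phospho-fraction ~ {sol.y[0, -1] / S_TOT:.2f}")
```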

19.
Although phylogenetic inference of protein-coding sequences continues to dominate the literature, few analyses incorporate evolutionary models that consider the genetic code. This problem is exacerbated by the exclusion of codon-based models from commonly employed model selection techniques, presumably due to the computational cost associated with codon models. We investigated an efficient alternative to standard nucleotide substitution models, in which codon position (CP) is incorporated into the model. We determined the most appropriate model for alignments of 177 RNA virus genes and 106 yeast genes, using 11 substitution models including one codon model and four CP models. The majority of analyzed gene alignments are best described by CP substitution models, rather than by standard nucleotide models, and without the computational cost of full codon models. These results have significant implications for phylogenetic inference of coding sequences as they make it clear that substitution models incorporating CPs not only are a computationally realistic alternative to standard models but may also frequently be statistically superior.
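Whatever substitution models are fitted, this kind of comparison usually reduces to an information criterion that trades log-likelihood against parameter count. The helper below sketches an AIC comparison in Python; the log-likelihoods and parameter counts are placeholders chosen only to mirror the qualitative conclusion above, not values from the study.

```python
def aic(log_likelihood: float, n_params: int) -> float:
    """Akaike information criterion: lower is better."""
    return 2.0 * n_params - 2.0 * log_likelihood

# Placeholder fits for one hypothetical alignment (numbers are made up for illustration).
fits = {
    "GTR (single partition)":    {"lnL": -12450.0, "k": 10},
    "GTR + codon-position (CP)": {"lnL": -12310.0, "k": 30},
    "full codon model":          {"lnL": -12295.0, "k": 62},
}
best = min(fits, key=lambda name: aic(fits[name]["lnL"], fits[name]["k"]))
for name, f in fits.items():
    print(f"{name:28s} AIC = {aic(f['lnL'], f['k']):.1f}")
print("preferred model:", best)
```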

20.
Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation (V&V). The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of V&V principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques.
