Similar documents
20 similar documents found (search time: 31 ms)
1.
Appropriate combined models are discussed for the analysis of complex traits. It is argued that combined models may be necessary for optimally extracting the information from family studies. It is further argued that, especially as we face genes with much smaller effects, our ability to find these genes will depend on how precisely and accurately we are able to model the interrelationships. We need these newer models and methods to extract that information optimally, and we also need to reorient ourselves as to how we interpret the information so extracted. It is projected that path and segregation analysis, as seen in terms of combined models, will be useful in the new millennium.

2.
Protein folding is an important problem in structural biology with significant medical implications, particularly for misfolding disorders like Alzheimer's disease. Solving the folding problem will ultimately require a combination of theory and experiment, with theoretical models providing a comprehensive view of folding and experiments grounding these models in reality. Here we review progress towards this goal over the past decade, with an emphasis on recent theoretical advances that are empowering chemically detailed models of folding and the new results these technologies are providing. In particular, we discuss new insights made possible by Markov state models (MSMs), including the role of non-native contacts and the hub-like character of protein folded states.
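As a minimal sketch of the MSM machinery the review discusses (our own toy example, not code or data from the review), one can estimate a row-stochastic transition matrix from a discretized state trajectory and recover the model's stationary (equilibrium) distribution from its leading left eigenvector:

```python
import numpy as np

def msm_transition_matrix(traj, n_states, lag=1):
    """Estimate a row-stochastic transition matrix by counting
    transitions at the chosen lag time in a discrete state trajectory."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(traj[:-lag], traj[lag:]):
        counts[i, j] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def stationary_distribution(T):
    """Left eigenvector of T with eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(T.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    return pi / pi.sum()

# Hypothetical 3-state system (e.g., unfolded, intermediate, folded)
rng = np.random.default_rng(0)
true_T = np.array([[0.90, 0.09, 0.01],
                   [0.10, 0.80, 0.10],
                   [0.01, 0.04, 0.95]])
traj = [0]
for _ in range(20000):
    traj.append(rng.choice(3, p=true_T[traj[-1]]))

T = msm_transition_matrix(np.array(traj), 3)
pi = stationary_distribution(T)
```

Real MSMs discretize molecular-dynamics trajectories into thousands of states, but the estimation step has the same shape.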

3.
Results and new hypotheses in animal models often stimulate development of new paradigms in how we view rheumatoid arthritis (RA). The complexity of RA does, however, eventually lead to the rejection of these hypotheses. Here, it is argued that the large number of so-far described animal models, when taken together, also reveals a complex disease. Fortunately, detailed study of each of the animal models will reveal this complexity, and may also be helpful in elucidating the complexity of the human disease. Benoist and Mathis [1] recently contributed a new animal model in which an autoimmune response to a ubiquitous antigen leads to an antibody-mediated inflammatory attack in the joints. It is argued that this new model, as with other animal models, is unlikely to explain RA, but it will add to the tools available to reveal the complexity of RA.

4.
Whole-cell models that explicitly represent all cellular components at the molecular level have the potential to predict phenotype from genotype. However, even for simple bacteria, whole-cell models will contain thousands of parameters, many of which are poorly characterized or unknown. New algorithms are needed to estimate these parameters and enable researchers to build increasingly comprehensive models. We organized the Dialogue for Reverse Engineering Assessments and Methods (DREAM) 8 Whole-Cell Parameter Estimation Challenge to develop new parameter estimation algorithms for whole-cell models. We asked participants to identify a subset of parameters of a whole-cell model given the model’s structure and in silico “experimental” data. Here we describe the challenge, the best performing methods, and new insights into the identifiability of whole-cell models. We also describe several valuable lessons we learned toward improving future challenges. Going forward, we believe that collaborative efforts supported by inexpensive cloud computing have the potential to solve whole-cell model parameter estimation.
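The challenge setting, known model structure plus in silico "experimental" data and unknown parameters, can be caricatured with a far smaller model than a whole cell. The first-order decay model and grid search below are our own illustrative stand-ins, not any method used in the challenge:

```python
import numpy as np

# Known structure: x(t) = x0 * exp(-k t); unknown parameter: k.
# "Experimental" data are generated in silico with small noise.
rng = np.random.default_rng(1)
k_true, x0 = 0.3, 10.0
t = np.linspace(0.0, 10.0, 50)
data = x0 * np.exp(-k_true * t) * (1 + 0.02 * rng.standard_normal(t.size))

def sse(k):
    """Sum of squared errors between model prediction and data."""
    return np.sum((x0 * np.exp(-k * t) - data) ** 2)

# A plain grid search, a minimal substitute for the optimization
# strategies challenge participants actually used.
grid = np.linspace(0.01, 1.0, 1000)
k_hat = grid[np.argmin([sse(k) for k in grid])]
```

The identifiability questions the paper raises appear when many parameters trade off against each other, which this one-parameter sketch deliberately avoids.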

5.
Microbe Hunting     
Summary: Platforms for pathogen discovery have improved since the days of Koch and Pasteur; nonetheless, the challenges of proving causation are at least as daunting as they were in the late 1800s. Although we will almost certainly continue to accumulate low-hanging fruit, where simple relationships will be found between the presence of a cultivatable agent and a disease, these successes will be increasingly infrequent. The future of the field rests instead in our ability to follow footprints of infectious agents that cannot be characterized using classical microbiological techniques and to develop the laboratory and computational infrastructure required to dissect complex host-microbe interactions. I have tried to refine the criteria used by Koch and successors to prove linkage to disease. These refinements are working constructs that will continue to evolve in light of new technologies, new models, and new insights. What will endure is the excitement of the chase. Happy hunting!

6.
Are we there yet? Tracking the development of new model systems
It is increasingly clear that additional 'model' systems are needed to elucidate the genetic and developmental basis of organismal diversity. Whereas model system development previously required enormous investment, recent advances including the decreasing cost of DNA sequencing and the power of reverse genetics to study gene function are greatly facilitating the process. In this review, we consider two aspects of the development of new genetic model systems: first, the types of questions being advanced using these new models; and second, the essential characteristics and molecular tools for new models, depending on the research focus. We hope that researchers will be inspired to explore this array of emerging models and even consider developing new molecular tools for their own favorite organism.

7.
Modeling the biogeographic consequences of climate change requires confidence in model predictions under novel conditions. However, models often fail when extended to new locales, and such instances have been used as evidence of a change in physiological tolerance, that is, a fundamental niche shift. We explore an alternative explanation and propose a method for predicting the likelihood of failure based on physiological performance curves and environmental variance in the original and new environments. We define the transient event margin (TEM) as the gap between energetic performance failure, defined as CTmax, and the upper lethal limit, defined as LTmax. If TEM is large relative to environmental fluctuations, models will likely fail in new locales. If TEM is small relative to environmental fluctuations, models are likely to be robust for new locales, even when mechanism is unknown. Using temperature, we predict when biogeographic models are likely to fail and illustrate this with a case study. We suggest that failure is predictable from an understanding of how climate drives nonlethal physiological responses, but for many species such data have not been collected. Successful biogeographic forecasting thus depends on understanding when the mechanisms limiting distribution of a species will differ among geographic regions, or at different times, resulting in realized niche shifts. TEM allows prediction of the likelihood of such model failure.
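The TEM comparison lends itself to a direct calculation. The sketch below is our reading of the abstract's rule of thumb (TEM large relative to environmental fluctuation implies likely model failure), with illustrative numbers rather than values from the case study:

```python
import numpy as np

def tem_failure_risk(ct_max, lt_max, env_temps):
    """Transient event margin (TEM): the gap between the temperature at
    which energetic performance fails (CTmax) and the upper lethal limit
    (LTmax). Heuristic flag, our reading of the abstract rather than the
    authors' exact criterion: when TEM is large relative to environmental
    fluctuation, sublethal limits rather than lethal ones set the
    distribution, and a lethal-limit model will likely fail elsewhere."""
    tem = lt_max - ct_max
    fluctuation = np.std(env_temps)
    return tem, tem > fluctuation  # True => failure likely in new locales

# Hypothetical species and temperature record (degrees C)
tem, likely_fail = tem_failure_risk(
    ct_max=28.0, lt_max=35.0,
    env_temps=np.array([24.0, 26.0, 25.5, 27.0, 23.5]))
```

Here a 7 degree margin dwarfs roughly 1.3 degrees of fluctuation, so the flag is raised.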

8.
The rapid ecological shifts that are occurring due to climate change present major challenges for managers and policymakers and, therefore, are one of the main concerns for environmental modelers and evolutionary biologists. Species distribution models (SDMs) are appropriate tools for assessing the relationship between species distributions and environmental conditions, and they are customarily used to forecast the biogeographical response of species to climate change. A serious limitation of species distribution models when forecasting the effects of climate change is that they normally assume that species behavior and climatic tolerances will remain constant through time. In this study, we propose a new methodology, based on fuzzy logic, for incorporating the potential capacity of species to adapt to new conditions into species distribution models. Our results demonstrate that it is possible to include different behavioral responses of species when predicting the effects of climate change on species distribution. The favorability models offered in this study span two extremes: one considering that the species will not modify its present behavior, and another assuming that the species will take full advantage of the possibilities offered by an increase in environmental favorability. This methodology may provide a more realistic approach to assessing the consequences of global change for species' distribution and conservation. Overlooking the potential of species' phenotypic plasticity may under- or overestimate the predicted response of species to changes in environmental drivers and its effects on species distribution. Using this approach, we could reinforce the science behind conservation planning in the current situation of rapid climate change.
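A fuzzy favorability score can be made concrete with a trapezoidal membership function. The function and both scenarios below are our own illustration of the two extremes the abstract describes; the breakpoints are invented, not taken from the study:

```python
def trapezoidal_favorability(x, low, opt_lo, opt_hi, high):
    """Fuzzy membership for an environmental variable x: 0 outside
    [low, high], 1 on the optimal plateau [opt_lo, opt_hi], and linear
    ramps in between."""
    if x <= low or x >= high:
        return 0.0
    if opt_lo <= x <= opt_hi:
        return 1.0
    if x < opt_lo:
        return (x - low) / (opt_lo - low)
    return (high - x) / (high - opt_hi)

# Two hypothetical extremes for a projected temperature of 24 C:
# 1) the species keeps its present behavior (tolerances fixed);
# 2) the species fully exploits the increase in favorability
#    (tolerances shifted upward with the climate).
no_change = trapezoidal_favorability(24.0, low=5, opt_lo=10, opt_hi=20, high=25)
full_adaptation = trapezoidal_favorability(24.0, low=8, opt_lo=13, opt_hi=23, high=28)
```

Any realistic forecast then lies between the two favorability values, which is the interval the paper's two extreme models bracket.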

9.
The previous companion paper describes the initial (seed) schema architecture that gives rise to the observed prey-catching behavior. In this second paper in the series we describe the fundamental adaptive processes required during learning after lesioning. Following bilateral transections of the hypoglossal nerve, anurans lunge toward mealworms with no accompanying tongue or jaw movement. Nevertheless, anurans with permanent hypoglossal transections eventually learn to catch their prey by first learning to open their mouth again and then lunging their body further and increasing their head angle. In this paper we present a new learning framework, called schema-based learning (SBL). SBL emphasizes the importance of the currently existing structure (schemas) that defines a functioning system for the incremental and autonomous construction of ever more complex structure to achieve ever more complex levels of functioning. We may rephrase this statement in the language of Schema Theory (Arbib 1992, for a comprehensive review) as the learning of new schemas based on the stock of current schemas. SBL emphasizes a fundamental principle of organization called coherence maximization, which deals with the maximization of congruence between the results of an interaction (external or internal) and the expectations generated for that interaction. A central hypothesis is the existence of a hierarchy of predictive internal models (predictive schemas) throughout the control center, the brain, of the agent. Hence, we include predictive models in the perceptual, sensorimotor, and motor components of the autonomous agent architecture. We then show that predictive models are fundamental for structural learning. In particular, we show how a system can learn a new structural component (augment the overall network topology) after being lesioned in order to recover (or even improve) its original functionality. Learning after lesioning is a special case of structural learning, but it clearly shows that solutions cannot be known or hardwired a priori, since it cannot be known in advance which substructure is going to break down.

10.
Recent advances in biotechnology and the availability of ever more powerful computers have led to the formulation of increasingly complex models at all levels of biology. One of the main aims of systems biology is to couple these together to produce integrated models across multiple spatial scales and physical processes. In this review, we formulate a definition of multi-scale in terms of levels of biological organisation and describe the types of model that are found at each level. Key issues that arise in trying to formulate and solve multi-scale and multi-physics models are considered and examples of how these issues have been addressed are given for two of the more mature fields in computational biology: the molecular dynamics of ion channels and cardiac modelling. As even more complex models are developed over the coming few years, it will be necessary to develop new methods to model them (in particular in coupling across the interface between stochastic and deterministic processes) and new techniques will be required to compute their solutions efficiently on massively parallel computers. We outline how we envisage these developments occurring.
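The stochastic/deterministic interface mentioned above can be illustrated on the smallest possible example. The sketch below (our own, not from the review) simulates a single degradation reaction with Gillespie's stochastic algorithm and compares the ensemble mean with the deterministic rate-equation limit:

```python
import numpy as np

# One reaction, X -> 0 at rate k per molecule. The deterministic limit
# is x(t) = x0 * exp(-k t); the stochastic mean should agree with it.
rng = np.random.default_rng(3)
k, x0, t_end = 0.5, 200, 4.0

def gillespie_decay(x0, k, t_end):
    """Exact stochastic simulation of first-order decay: draw exponential
    waiting times with rate k*x until t_end, decrementing x each event."""
    t, x = 0.0, x0
    while x > 0:
        t += rng.exponential(1.0 / (k * x))
        if t > t_end:
            break
        x -= 1
    return x

samples = [gillespie_decay(x0, k, t_end) for _ in range(500)]
stochastic_mean = np.mean(samples)
deterministic = x0 * np.exp(-k * t_end)
```

Multi-scale models must decide, reaction by reaction or region by region, which of these two descriptions to use and how to hand state across the boundary.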

11.
Inhibitors of farnesyltransferase are effective against a variety of tumors in mouse models of cancer. Clinical trials to evaluate these agents in humans are ongoing. In our effort to develop new farnesyltransferase inhibitors, we have discovered bioavailable aryl tetrahydropyridines that are potent in cell culture. The design, synthesis, SAR and biological properties of these compounds will be discussed.

12.
Large scale, high-resolution global data on farm animal distributions are essential for spatially explicit assessments of the epidemiological, environmental and socio-economic impacts of the livestock sector. This has been the major motivation behind the development of the Gridded Livestock of the World (GLW) database, which has been extensively used since its first publication in 2007. The database relies on a downscaling methodology whereby census counts of animals in sub-national administrative units are redistributed at the level of grid cells as a function of a series of spatial covariates. The recent upgrade of GLW1 to GLW2 involved automating the processing, improving the input data, and downscaling at a spatial resolution of 1 km per cell (5 km per cell in the earlier version). The underlying statistical methodology, however, remained unchanged. In this paper, we evaluate new methods to downscale census data with higher accuracy and increased processing efficiency. Two main factors were evaluated, based on sample census datasets of cattle in Africa and chickens in Asia. First, we implemented and evaluated Random Forest models (RF) instead of stratified regressions. Second, we investigated whether models that predicted the number of animals per rural person (per capita) could provide better downscaled estimates than the previous approach that predicted absolute densities (animals per km2). RF models consistently provided better predictions than the stratified regressions for both continents and species. The benefit of per capita over absolute density models varied according to the species and continent. In addition, different technical options were evaluated to reduce the processing time while maintaining predictive power. Future GLW runs (GLW 3.0) will apply the new RF methodology with optimized modelling options. The potential benefit of per capita models will need to be further investigated with a better distinction between rural and agricultural populations.
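The core downscaling step, redistributing each administrative unit's census total across its grid cells in proportion to a model-predicted score, can be sketched in a few lines. In GLW the per-cell score comes from a fitted model such as a Random Forest over spatial covariates; here the scores and the tiny grid are invented for illustration:

```python
import numpy as np

def downscale_counts(census, unit_of_cell, suitability):
    """Redistribute per-unit census counts to grid cells in proportion
    to a per-cell suitability score, preserving each administrative
    unit's total exactly (the pycnophylactic property)."""
    cells = np.zeros_like(suitability, dtype=float)
    for unit, total in census.items():
        mask = unit_of_cell == unit
        w = suitability[mask]
        cells[mask] = total * w / w.sum()
    return cells

# Toy example: 5 cells split between two administrative units.
census = {"A": 1000.0, "B": 400.0}
unit_of_cell = np.array(["A", "A", "A", "B", "B"])
suitability = np.array([1.0, 3.0, 1.0, 2.0, 2.0])
animals_per_cell = downscale_counts(census, unit_of_cell, suitability)
```

Whether the score predicts absolute density or animals per rural person changes only how `suitability` is built; the redistribution step is the same.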

13.
Inhibitors of farnesyltransferase are effective against a variety of tumors in mouse models of cancer. Clinical trials to evaluate these agents in humans are ongoing. In our effort to develop new farnesyltransferase inhibitors, we have discovered a series of aryl tetrahydropyridines that incorporate substituted glycine, phenylalanine and histidine residues. The design, synthesis, SAR and biological properties of these compounds will be discussed.

14.
The objective of this article is to help risk assessors and managers step back from paying sole attention to finer and finer detail (e.g., measuring nuances of a single chemical's biochemical action at the molecular level). As professionals, we must always remember that the higher service that risk assessment provides is to improve everyone's long-term well-being and survival. It is especially important to note that there are many lessons all Americans should experience as early as possible. For instance, our Native American tribal members are taught from birth that we all live enmeshed within the environment, not isolated from it or superior to it. Our practical everyday needs, including basic nutritional, spiritual, and economic needs, are all derived directly from a clean, functioning environment. In return, we must accept the fact that we are not masters or owners of the environment, and that we don't have dominion over ecological processes. Our relationship is best viewed as part of a human-eco-cultural system. The risks to this system as a whole must be reflected in new transparent system-level models that easily show the relationships among, and equality of, all of these elements. Transparent, user-friendly, system-level models must become standard tools in every risk assessor's toolbox. Such models are being developed and have already made a difference. Tribal risk information, in particular, needs to be produced at the community and system level, must include eco-cultural metrics, and requires geospatial and temporal integration that most conventional models cannot accomplish. I expect that risk assessors, once enlightened, will insist that such models become a required part of risk analysis in the near future.

15.
The mechanism that controls digit formation has long intrigued developmental and theoretical biologists, and many different models and mechanisms have been proposed. Here we review models of limb development with a specific focus on digit and long bone formation. Decades of experiments have revealed the basic signaling circuits that control limb development, and recent advances in imaging and molecular technologies provide us with unprecedented spatial detail and a broader view of the regulatory networks. Computational approaches are important to integrate the available information into a consistent framework that will allow us to achieve a deeper level of understanding, and that will help with the future planning and interpretation of complex experiments, paving the way to in silico genetics. Previous models of development had to be focused on very few, simple regulatory interactions. Algorithmic developments and increasing computing power now enable the generation and validation of increasingly realistic models that can be used to test old theories and uncover new mechanisms. Birth Defects Research (Part C) 102:1–12, 2014. © 2014 Wiley Periodicals, Inc.

16.
When resources are patchily distributed in an environment, behavioral ecologists frequently turn to ideal free distribution (IFD) models to predict the spatial distribution of organisms. In these models, predictions about distributions depend upon two key factors: the quality of habitat patches and the nature of competition between consumers. Surprisingly, however, no IFD models have explored the possibility that consumers modulate their competitive efforts in an evolutionarily stable manner. Instead, previous models assume that resource acquisition ability and competition are fixed within species or within phenotypes. We explored the consequences of adaptive modulation of competitive effort by incorporating tug-of-war theory into payoff equations from the two main classes of IFD models (continuous input (CI) and interference). In the models we develop, individuals can increase their share of the resources available in a patch, but do so at the costs of increased resource expenditures and increased negative interactions with conspecifics. We show how such models can provide new hypotheses to explain what are thought to be deviations from IFDs (e.g., the frequent observation of fewer animals than predicted in "good" patches of habitat). We also detail straightforward predictions made uniquely by the models we develop, and we outline experimental tests that will distinguish among alternatives.
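For orientation, the baseline continuous-input IFD that the authors build on (before adding tug-of-war competitive effort) has a closed-form equilibrium: consumers distribute so that per-capita intake is equal across patches, which with equal competitors means numbers match input rates. A minimal sketch with invented input rates:

```python
import numpy as np

def ideal_free_distribution(inputs, n_consumers):
    """Continuous-input IFD with equal competitors: at equilibrium the
    per-capita intake Q_i / n_i is equal across patches, so consumer
    numbers are proportional to resource input rates ("input matching")."""
    inputs = np.asarray(inputs, dtype=float)
    return n_consumers * inputs / inputs.sum()

# Three patches with hypothetical resource input rates (items per minute)
inputs = np.array([6.0, 3.0, 1.0])
n = ideal_free_distribution(inputs, n_consumers=20)
per_capita_intake = inputs / n
```

The paper's contribution is to let each individual also choose how hard to compete, which perturbs this equilibrium and can leave "good" patches with fewer animals than input matching predicts.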

17.
Models of kin or group selection usually feature only one possible fitness transfer. The phenotypes are either to make this transfer or not to make it, and for any given fitness transfer, Hamilton's rule predicts which of the two phenotypes will spread. In this article we allow for the possibility that different individuals or different generations face similar, but not necessarily identical, possibilities for fitness transfers. In this setting, phenotypes are preference relations, which concisely specify behaviour for a range of possible fitness transfers (rather than being a specification for only one particular situation an animal or human can be in). For this more general set-up, we find that only preference relations that are linear in fitnesses can be explained using models of kin selection, and that the same applies to a large class of group selection models. This provides a new implication of hierarchical selection models that could in principle falsify them, even if relatedness (or a parameter for assortativeness) is unknown. The empirical evidence for humans suggests that hierarchical selection models alone are not enough to explain their other-regarding or altruistic behaviour.
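The single-transfer case the article generalizes is just Hamilton's rule, and its linearity in fitnesses is easy to see in code (our own illustration; the numbers are arbitrary):

```python
def hamiltons_rule(r, b, c):
    """Hamilton's rule: an altruistic fitness transfer spreads when the
    relatedness-weighted benefit exceeds the cost, r * b > c. Because the
    criterion is linear in fitnesses, scaling b and c by the same factor
    never changes the prediction; this is exactly the kind of linear
    preference over transfers that kin selection models can explain."""
    return r * b > c

favored = hamiltons_rule(r=0.5, b=3.0, c=1.0)        # 1.5 > 1.0
not_favored = hamiltons_rule(r=0.25, b=3.0, c=1.0)   # 0.75 < 1.0
scaled = hamiltons_rule(r=0.5, b=6.0, c=2.0)         # same as favored
```

A preference relation that, say, weighs small costs disproportionately would violate this linearity, and by the article's result could not arise from kin selection alone.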

18.
Multichannel data collection in the neurosciences is routine and has necessitated the development of methods to identify the direction of interactions among processes. The most widely used approach for detecting these interactions in such data is based on autoregressive models of stochastic processes, although some work has raised the possibility of serious difficulties with this approach. This article demonstrates that these difficulties are present and that they are intrinsic features of the autoregressive method. Here, we introduce a new method taking into account unobserved processes and based on coherence. Two examples of three-process networks are used to demonstrate that although coherence measures are intrinsically non-directional, a particular network configuration will be associated with a particular set of coherences. These coherences may not specify the network uniquely, but in principle will specify all network configurations consistent with their values and will also specify the relationships among the unobserved processes. Moreover, when new information becomes available, the values of the measures of association already in place do not change, but the relationships among the unobserved processes may become further resolved.  
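The non-directionality of coherence, which the article turns from a weakness into a feature, can be checked numerically. The segment-averaged estimator below is a bare-bones Welch-style sketch of our own, applied to two invented signals sharing a common drive:

```python
import numpy as np

def msc(x, y, nseg=8):
    """Magnitude-squared coherence estimated by averaging periodograms
    over non-overlapping segments. By construction |S_xy|^2 = |S_yx|^2,
    so the measure cannot distinguish x -> y from y -> x."""
    n = len(x) // nseg
    Sxx = Syy = Sxy = 0
    for k in range(nseg):
        X = np.fft.rfft(x[k * n:(k + 1) * n])
        Y = np.fft.rfft(y[k * n:(k + 1) * n])
        Sxx = Sxx + np.abs(X) ** 2
        Syy = Syy + np.abs(Y) ** 2
        Sxy = Sxy + X * np.conj(Y)
    return np.abs(Sxy) ** 2 / (Sxx * Syy)

# Two channels driven by a common (possibly unobserved) process,
# with y lagging x by 5 samples.
rng = np.random.default_rng(2)
drive = rng.standard_normal(4096)
x = drive + 0.1 * rng.standard_normal(4096)
y = np.roll(drive, 5) + 0.1 * rng.standard_normal(4096)

c_xy = msc(x, y)
c_yx = msc(y, x)
```

The two coherence spectra are identical even though the lag gives the interaction a clear direction; that directional information lives in the cross-spectral phase, which coherence discards.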

19.
Prey-dependent models, with the predation rate (per predator) a function of prey numbers alone, predict the existence of a trophic cascade. In a trophic cascade, the addition of a top predator to a two-level food chain to make a three-level food chain will lead to increases in the population size of the primary producers, and the addition of nutrients to three-level chains will lead to increases in the population numbers at only the first and third trophic levels. In contrast, ratio-dependent models, with the predation rate (per predator) dependent on the ratio of predator numbers to prey, predict that additions of top predators will not increase the population sizes of the primary producers, and that the addition of nutrients to a three-level food chain will lead to increases in population numbers at all trophic levels. Surprisingly, recent meta-analyses show that freshwater pelagic food web patterns match neither prey-dependent models (in pelagic webs, "prey" are phytoplankton, and "predators" are zooplankton), nor ratio-dependent models. In this paper we use a modification of the prey-dependent model, incorporating strong interference within the zooplankton trophic level, that does yield patterns matching those found in nature. This zooplankton interference model corresponds to a more reticulate food web than in the linear, prey-dependent model, which lacks zooplankton interference. We thus reconcile data with a new model, and make the testable prediction that the strength of trophic cascades will depend on the degree of heterogeneity in the zooplankton level of the food chain.
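The trophic-cascade prediction of the prey-dependent model can be verified from equilibrium algebra. The chain below (logistic producer R, consumer C, optional top predator P, linear functional responses) and its parameter values are our own generic illustration, not the paper's model:

```python
# dR/dt = r*R*(1 - R/K) - a*R*C
# dC/dt = e*a*R*C - m*C - b*C*P
# dP/dt = f*b*C*P - d*P
r, K = 1.0, 10.0            # producer growth rate, carrying capacity
a, e, m = 0.5, 0.5, 0.4     # attack rate, conversion efficiency, consumer death
b, f, d = 0.4, 0.5, 0.2     # predator attack, conversion, predator death

# Two-level chain (R, C): the consumer pins the producer at R2.
R2 = m / (e * a)
C2 = r * (1 - R2 / K) / a

# Three-level chain (R, C, P): the predator pins the consumer at C3,
# releasing the producer up to R3 (the cascade).
C3 = d / (f * b)
R3 = K * (1 - a * C3 / r)
```

With these numbers the producer equilibrium rises from 1.6 to 5.0 when the top predator is added, while the consumer falls, which is exactly the cascade pattern that prey-dependent (but not ratio-dependent) models predict.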

20.
This study adapted human videofluoroscopic swallowing study (VFSS) methods for use with murine disease models for the purpose of facilitating translational dysphagia research. Successful outcomes are dependent upon three critical components: test chambers that permit self-feeding while standing unrestrained in a confined space, recipes that mask the aversive taste/odor of commercially available oral contrast agents, and a step-by-step test protocol that permits quantification of swallow physiology. Elimination of one or more of these components will have a detrimental impact on the study results. Moreover, the energy level capability of the fluoroscopy system will determine which swallow parameters can be investigated. Most research centers have high-energy fluoroscopes designed for use with people and larger animals, which results in exceptionally poor image quality when testing mice and other small rodents. Despite this limitation, we have identified seven VFSS parameters that are consistently quantifiable in mice when using a high-energy fluoroscope in combination with the new murine VFSS protocol. We recently obtained a low-energy fluoroscopy system with exceptionally high imaging resolution and magnification capabilities that was designed for use with mice and other small rodents. Preliminary work using this new system, in combination with the new murine VFSS protocol, has identified 13 swallow parameters that are consistently quantifiable in mice, which is nearly double the number obtained using conventional (i.e., high-energy) fluoroscopes. Identification of additional swallow parameters is expected as we optimize the capabilities of this new system. Results thus far demonstrate the utility of using a low-energy fluoroscopy system to detect and quantify subtle changes in swallow physiology that may otherwise be overlooked when using high-energy fluoroscopes to investigate murine disease models.
