Similar Articles
20 similar articles found (search time: 0 ms)
1.
Self-organizing feature maps (SOFMs) represent a dimensionality-reduction algorithm that has been used to replicate feature topographies observed experimentally in primary visual cortex (V1). We used the SOFM algorithm to model possible topographies of generic sensory cortical areas containing up to five arbitrary physiological features. This study explored the conditions under which these multi-feature SOFMs contained two features that were mapped monotonically and aligned orthogonally with one another (i.e., “globally orthogonal”), as well as the conditions under which the map of one feature aligned with the longest anatomical dimension of the modeled cortical area (i.e., “dominant”). In a single SOFM with more than two features, we never observed more than one dominant feature, nor did we observe two globally orthogonal features in the same map in which a dominant feature occurred. Whether dominance or global orthogonality occurred depended upon how heavily weighted the features were relative to one another. The most heavily weighted features are likely to correspond to those physical stimulus properties transduced directly by the sensory epithelium of a particular sensory modality. Our results imply, therefore, that in the primary cortical area of sensory modalities with a two-dimensional sensory epithelium, these two features are likely to be organized globally orthogonally to one another, and neither feature is likely to be dominant. In the primary cortical area of sensory modalities with a one-dimensional sensory epithelium, however, this feature is likely to be dominant, and no two features are likely to be organized globally orthogonally to one another. Because the auditory system transduces a single stimulus feature (i.e., frequency) along the entire length of the cochlea, these findings may have particular relevance for topographic maps of primary auditory cortex. This research was supported by The McDonnell Center for Higher Brain Function, The Wallace H. Coulter Foundation and NIH grant DC008880.
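The core of such models is the standard Kohonen SOFM update. The sketch below is a minimal, generic illustration in which a per-feature weight vector stands in for how heavily each physiological feature is weighted (an assumption for illustration); it is not the authors' five-feature cortical model.

```python
import numpy as np

def train_sofm(stimuli, grid_shape=(30, 20), feature_weights=None,
               n_iter=20000, lr0=0.5, sigma0=8.0, seed=0):
    """Minimal Kohonen SOFM: map n-dimensional stimuli onto a 2D cortical sheet.

    feature_weights scales each stimulus dimension, mimicking how heavily a
    physiological feature is weighted relative to the others (illustrative only).
    """
    rng = np.random.default_rng(seed)
    n_features = stimuli.shape[1]
    w = np.ones(n_features) if feature_weights is None else np.asarray(feature_weights)
    units = np.array([(i, j) for i in range(grid_shape[0]) for j in range(grid_shape[1])])
    weights = rng.random((len(units), n_features))          # one weight vector per cortical unit

    for t in range(n_iter):
        x = stimuli[rng.integers(len(stimuli))] * w          # weighted stimulus sample
        bmu = np.argmin(np.sum((weights - x) ** 2, axis=1))  # best-matching unit
        lr = lr0 * np.exp(-t / n_iter)                       # decaying learning rate
        sigma = sigma0 * np.exp(-t / n_iter)                 # shrinking neighbourhood
        d2 = np.sum((units - units[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2 * sigma ** 2))                   # neighbourhood function
        weights += lr * h[:, None] * (x - weights)           # pull neighbours toward stimulus
    return weights.reshape(*grid_shape, n_features)

# e.g. two heavily weighted "epithelial" features plus three weaker ones,
# mapped onto an elongated (30 x 20) cortical sheet
stimuli = np.random.default_rng(1).random((5000, 5))
maps = train_sofm(stimuli, feature_weights=[1.0, 1.0, 0.3, 0.3, 0.3])
```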

2.
Influenza viruses have been responsible for large losses of lives around the world and continue to present a great public health challenge. Antigenic characterization based on hemagglutination inhibition (HI) assay is one of the routine procedures for influenza vaccine strain selection. However, HI assay is only a crude experiment reflecting the antigenic correlations among testing antigens (viruses) and reference antisera (antibodies). Moreover, antigenic characterization is usually based on more than one HI dataset. The combination of multiple datasets results in an incomplete HI matrix with many unobserved entries. This paper proposes a new computational framework for constructing an influenza antigenic cartography from this incomplete matrix, which we refer to as Matrix Completion-Multidimensional Scaling (MC-MDS). In this approach, we first reconstruct the HI matrices of viruses and antibodies using low-rank matrix completion, and then generate the two-dimensional antigenic cartography using multidimensional scaling. Moreover, for influenza HI tables with a herd immunity effect (such as those from human influenza viruses), we propose a temporal model to reduce the inherent temporal bias of HI tables caused by herd immunity. By applying our method to HI datasets containing H3N2 influenza A viruses isolated from 1968 to 2003, we identified eleven clusters of antigenic variants, representing all major antigenic drift events in these 36 years. Our results showed that both the completed HI matrix and the antigenic cartography obtained via MC-MDS are useful in identifying influenza antigenic variants and thus can be used to facilitate influenza vaccine strain selection. The webserver is available at http://sysbio.cvm.msstate.edu/AntigenMap.
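MC-MDS as described amounts to two standard steps: complete the sparse log-titer matrix with a low-rank method, then embed the completed table in two dimensions with multidimensional scaling. The sketch below uses a simple SoftImpute-style SVD thresholding and scikit-learn's MDS; the toy titers, rank and threshold are placeholders, not the AntigenMap implementation.

```python
import numpy as np
from sklearn.manifold import MDS

def soft_impute(M, mask, rank=3, shrink=1.0, n_iter=200):
    """Fill unobserved entries of M (mask == False) with a low-rank estimate."""
    X = np.where(mask, M, 0.0)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s = np.maximum(s - shrink, 0.0)[:rank]               # soft-threshold, keep top ranks
        low_rank = (U[:, :rank] * s) @ Vt[:rank]
        X = np.where(mask, M, low_rank)                      # keep observed entries fixed
    return X

# Toy HI matrix: rows = test antigens, cols = reference antisera, NaN = unobserved.
hi = np.array([[1280, 640, np.nan],
               [np.nan, 1280, 160],
               [80, np.nan, 1280.]])
logtiter = np.log2(hi / 10.0)                                # common log2(titer/10) units
mask = ~np.isnan(logtiter)
completed = soft_impute(np.nan_to_num(logtiter), mask)

# Antigenic distance proxy: drop from each antiserum's maximum titer.
dist = completed.max(axis=0)[None, :] - completed
coords = MDS(n_components=2, dissimilarity='euclidean',
             random_state=0).fit_transform(dist)             # 2D antigenic cartography
print(coords)
```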

3.
In this review, we summarize recent evidence that perceptual learning can occur not only under training conditions but also in situations of unattended and passive sensory stimulation. We suggest that the key to learning is to boost stimulus-related activity that is normally insufficient to exceed a learning threshold. We discuss how factors such as attention and reinforcement have crucial, permissive roles in learning. We observe, however, that highly optimized stimulation protocols can also boost responses and promote learning. This helps to reconcile observations of how learning can occur (or fail to occur) in seemingly contradictory circumstances, and argues that different processes that affect learning operate through similar mechanisms that are probably based on, and mediated by, neuromodulatory factors.

4.
A unifying computational framework for motor control and social interaction (total citations: 17; self-citations: 0; citations by others: 17)
Recent empirical studies have implicated the use of the motor system during action observation, imitation and social interaction. In this paper, we explore the computational parallels between the processes that occur in motor control and in action observation, imitation, social interaction and theory of mind. In particular, we examine the extent to which motor commands acting on the body can be equated with communicative signals acting on other people and suggest that computational solutions for motor control may have been extended to the domain of social interaction.

5.
Despite the establishment of design principles to optimize codon choice for heterologous expression vector design, the relationship between codon sequence and final protein yield remains poorly understood. In this work, we present a computational framework for the identification of a set of mutant codon sequences for optimized heterologous protein production, which uses a codon-sequence mechanistic model of protein synthesis. Through a sensitivity analysis of the optimal steady-state configuration of protein synthesis, we identify the codons that are most rate-limiting with respect to the steady-state protein synthesis rate, and we replace them with synonymous codons recognized by charged tRNAs that are more efficient for translation, so that the resulting codon elongation rate is higher. Repeating this procedure, we iteratively optimize the codon sequence for a higher protein synthesis rate while taking into account multiple constraints of various types. We determine a small set of optimized synonymous codon sequences that are very close to each other in sequence space but differ in their impact on properties such as ribosomal utilization and secondary structure. This limited number of sequences can then be offered for further experimental study. Overall, the proposed method is valuable for understanding the effects of different mRNA sequence properties on the final protein yield in heterologous protein production, and it can find applications in synthetic biology and biotechnology.
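The iterative step described, find the most rate-limiting codon and swap it for a faster synonymous codon, can be sketched as a greedy loop over per-codon elongation rates. The rate table and the bottleneck criterion below are illustrative placeholders, not the authors' mechanistic translation model or its sensitivity analysis.

```python
# Hypothetical per-codon elongation rates (aa/s); a real table would come from
# tRNA abundances or a fitted mechanistic model.
ELONGATION_RATE = {'CTG': 6.0, 'CTA': 1.5, 'TTA': 1.2,      # Leu codons
                   'GCC': 5.5, 'GCG': 2.0,                   # Ala codons
                   'AAA': 4.0, 'AAG': 5.8}                   # Lys codons

SYNONYMS = {'CTG': ['CTA', 'TTA', 'CTG'], 'CTA': ['CTG', 'TTA', 'CTA'],
            'TTA': ['CTG', 'CTA', 'TTA'], 'GCC': ['GCG', 'GCC'],
            'GCG': ['GCC', 'GCG'], 'AAA': ['AAG', 'AAA'], 'AAG': ['AAA', 'AAG']}

def optimize_codons(codons, n_rounds=10):
    """Greedy sketch: repeatedly replace the slowest (most rate-limiting) codon
    with its fastest synonym, mimicking the iterative sensitivity-guided swap."""
    seq = list(codons)
    for _ in range(n_rounds):
        rates = [ELONGATION_RATE[c] for c in seq]
        bottleneck = min(range(len(seq)), key=lambda i: rates[i])
        best = max(SYNONYMS[seq[bottleneck]], key=ELONGATION_RATE.get)
        if best == seq[bottleneck]:
            break                                            # nothing faster available
        seq[bottleneck] = best
    return seq

print(optimize_codons(['CTA', 'GCG', 'AAA', 'TTA']))
# -> ['CTG', 'GCC', 'AAG', 'CTG']: each position now uses its fastest synonym
```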

6.
Technological advances in genomics and imaging have led to an explosion of molecular and cellular profiling data from large numbers of samples. This rapid increase in biological data dimension and acquisition rate is challenging conventional analysis strategies. Modern machine learning methods, such as deep learning, promise to leverage very large data sets for finding hidden structure within them, and for making accurate predictions. In this review, we discuss applications of this new breed of analysis approaches in regulatory genomics and cellular imaging. We provide background on what deep learning is and the settings in which it can be successfully applied to derive biological insights. In addition to presenting specific applications and providing tips for practical use, we also highlight possible pitfalls and limitations to guide computational biologists on when and how to make the most of this new technology.

7.
Grating cells were discovered in the V1 and V2 areas of the monkey visual cortex by von der Heydt et al. (1992). These cells responded vigorously to grating patterns of appropriate orientation and periodicity. Computational models inspired by these findings were used as a texture operator (Kruizinga and Petkov 1995, 1999; Petkov and Kruizinga 1997) and for the emergence and self-organization of grating cells (Brunner et al. 1998; Bauer et al. 1999). The aim of this paper is to create a grating cell operator that demonstrates responses similar to those of monkey grating cells, by applying the operator to the same stimuli as in the experiments carried out by von der Heydt et al. (1992). The operator is also tested on images that contain periodic patterns, as suggested by De Valois (1988). In order to learn more about the role of grating cells in natural vision, the operator is applied to 338 real-world images of textures obtained from three different databases. The results suggest that grating cells respond strongly to regular alternating periodic patterns of a certain orientation. Such patterns are common in images of human-made structures, like buildings, fabrics, and tiles, and in regular natural periodic patterns, which are relatively rare in nature.
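A much simpler proxy for orientation- and period-tuned responses is local Gabor energy, computed from a quadrature pair of Gabor kernels. The sketch below illustrates tuning to periodic patterns of a given orientation; it is not the published Kruizinga–Petkov grating cell operator, which additionally suppresses responses to single bars.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(freq, theta, sigma=4.0, size=21):
    """Quadrature pair of Gabor kernels at spatial frequency `freq` (cycles/pixel)
    and orientation `theta` (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr), envelope * np.sin(2 * np.pi * freq * xr)

def grating_response(image, freq, theta):
    """Crude grating-cell proxy: local Gabor energy, which is large only where
    several bars of the right orientation and period fall under the envelope."""
    even, odd = gabor_kernel(freq, theta)
    e = fftconvolve(image, even, mode='same')
    o = fftconvolve(image, odd, mode='same')
    return np.sqrt(e**2 + o**2)

# A vertical square-wave grating responds strongly; a single bar much less so.
x = np.arange(128)
grating = np.tile((np.sin(2 * np.pi * x / 8) > 0).astype(float), (128, 1))
single_bar = np.zeros((128, 128)); single_bar[:, 60:64] = 1.0
print(grating_response(grating, 1/8, 0).max(), grating_response(single_bar, 1/8, 0).max())
```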

8.
MOTIVATION: The task of engineering a protein to perform a target biological function is known as protein design. A commonly used paradigm casts this functional design problem as a structural one, assuming a fixed backbone. In probabilistic protein design, positional amino acid probabilities are used to create a random library of sequences to be simultaneously screened for biological activity. Clearly, certain choices of probability distributions will be more successful in yielding functional sequences. However, since the number of sequences is exponential in protein length, computational optimization of the distribution is difficult. RESULTS: In this paper, we develop a computational framework for probabilistic protein design following the structural paradigm. We formulate the distribution of sequences for a structure using the Boltzmann distribution over their free energies. The corresponding probabilistic graphical model is constructed, and we apply belief propagation (BP) to calculate marginal amino acid probabilities. We test this method on a large structural dataset and demonstrate the superiority of BP over previous methods. Nevertheless, since the results obtained by BP are far from optimal, we thoroughly assess the paradigm using high-quality experimental data. We demonstrate that, for small-scale sub-problems, BP attains results identical to those produced by exact inference on the paradigmatic model. However, quantitative analysis shows that the predicted distributions differ significantly from the experimental data. These findings, along with the excellent performance we observed using BP on the smaller problems, suggest potential shortcomings of the paradigm. We conclude with a discussion of how it may be improved in the future.
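On a toy problem, the paradigm's core quantity, per-position amino acid marginals under a Boltzmann distribution over sequence energies, can be computed exactly by enumeration; BP is only needed when the sequence space is too large for this. The pairwise energy table below is a made-up placeholder, not a physical force field.

```python
import itertools
import numpy as np

AA = ['A', 'L', 'E', 'K']          # tiny alphabet for illustration
BETA = 1.0                         # inverse temperature

# Hypothetical pairwise contact energies between amino acids at interacting positions.
rng = np.random.default_rng(0)
pair_energy = {(a, b): e for (a, b), e in
               zip(itertools.product(AA, AA), rng.normal(0, 1, len(AA)**2))}

contacts = [(0, 1), (1, 2), (0, 2)]            # designed positions that touch in the structure
n_pos = 3

def energy(seq):
    return sum(pair_energy[(seq[i], seq[j])] for i, j in contacts)

# Exact Boltzmann marginals by brute-force enumeration (4**3 = 64 sequences).
weights = {seq: np.exp(-BETA * energy(seq)) for seq in itertools.product(AA, repeat=n_pos)}
Z = sum(weights.values())
marginals = np.zeros((n_pos, len(AA)))
for seq, w in weights.items():
    for i, aa in enumerate(seq):
        marginals[i, AA.index(aa)] += w / Z

print(np.round(marginals, 3))      # rows sum to 1: positional amino acid probabilities
```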

9.
The "Virtual Cell" provides a general system for testing cell biological mechanisms and creates a framework for encapsulating the burgeoning knowledge base comprising the distribution and dynamics of intracellular biochemical processes. It approaches the problem by associating biochemical and electrophysiological data describing individual reactions with experimental microscopic image data describing their subcellular localizations. Individual processes are collected within a physical and computational infrastructure that accommodates any molecular mechanism expressible as rate equations or membrane fluxes. An illustration of the method is provided by a dynamic simulation of IP3-mediated Ca2+ release from endoplasmic reticulum in a neuronal cell. The results can be directly compared to experimental observations and provide insight into the role of experimentally inaccessible components of the overall mechanism.  相似文献   

10.
Endovascular aneurysm repair (EVAR; Greenhalgh in N Engl J Med 362(20):1863–1871, 2010) techniques have revolutionized the treatment of thoracic and abdominal aortic aneurysm disease, greatly reducing the perioperative mortality and morbidity associated with open surgical repair techniques. However, EVAR is not free of important complications such as late device migration, endoleak formation and fracture of device components that may result in adverse events such as aneurysm enlargement, need for long-term imaging surveillance and secondary interventions or even death. These complications result from the device's inability to withstand the hemodynamics of blood flow and to keep its originally intended post-operative position over time. Understanding the in vivo biomechanical working environment experienced by endografts is a critical factor in improving their long-term performance. To date, no study has investigated the mechanics of contact between device and aorta in a three-dimensional setting. In this work, we developed a comprehensive Computational Solid Mechanics and Computational Fluid Dynamics framework to investigate the mechanics of endograft positional stability. The main building blocks of this framework are: (1) Three-dimensional non-planar aortic and stent-graft geometrical models, (2) Realistic multi-material constitutive laws for aorta, stent, and graft, (3) Physiological values for blood flow and pressure, and (4) A frictional model to describe the contact between the endograft and the aorta. We introduce a new metric for numerical quantification of the positional stability of the endograft. Lastly, in the results section, we test the framework by investigating the impact of several factors that are clinically known to affect endograft stability.

11.
Lipids are important compounds for human physiology and as renewable resources for fuels and chemicals. In lipid research, there is a large gap between the currently available pathway-level representations of lipids and lipid structure databases, in which the number of compounds is expanding rapidly with high-throughput mass spectrometry methods. In this work, we introduce a computational approach to bridge this gap by making associations between metabolic pathways and the lipid structures increasingly discovered through lipidomics studies. Our approach, called NICELips (Network Integrated Computational Explorer for Lipidomics), is based on the formulation of generalized enzymatic reaction rules for lipid metabolism, and it employs the generalized rules to postulate novel pathways of lipid metabolism. It further integrates all discovered lipids into biological networks of enzymatic reactions that constitute their biosynthesis and biodegradation pathways. We illustrate the utility of our approach through a case study of bis(monoacylglycero)phosphate (BMP), a biologically important glycerophospholipid whose synthesis and catabolic route(s) remain poorly characterized. Using NICELips, we were able to propose various synthesis and degradation pathways for this compound and for several other lipids whose metabolism, like that of BMP, is unknown, as well as several alternative novel biosynthesis and biodegradation pathways for lipids with known metabolism. NICELips has potential applications in designing therapeutic interventions for lipid-associated disorders and in the metabolic engineering of model organisms for improving the biobased production of lipid-derived fuels and chemicals.

12.
13.
14.
15.
W Chen, W Zhou, T Xia, X Gu. PLoS ONE 2012, 7(7): e38699
One difficulty in conducting biologically meaningful dynamic analysis at the systems biology level is that in vivo system regulation is complex. Moreover, many kinetic rates are unknown, making global system analysis intractable in practice. In this article, we demonstrate a computational pipeline to help solve this problem, using the exocytotic process as an example. Exocytosis is an essential process in all eukaryotic cells that allows cells to communicate through vesicles containing a wide range of intracellular molecules. During this process, a set of proteins called SNAREs acts as the engine of vesicle-membrane fusion by forming a four-helix bundle complex between target-membrane-specific and vesicle-specific SNAREs. As expected, the regulatory network for exocytosis is very complex. Based on the current understanding of the protein-protein interaction network related to exocytosis, we mathematically formulated the whole system as a set of ordinary differential equations (ODEs). We then applied an inverse-problem approach to estimate the kinetic parameters of the fundamental subsystem (without regulation) from limited in vitro experimental data; the estimates fit well with values reported from conventional assays. These estimates allowed us to conduct an efficient stability analysis under a specified parameter space for the exocytotic process with or without regulation. Finally, we discuss the potential of this approach to explain experimental observations and to make testable hypotheses for further experimentation.
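The inverse-problem step, estimating unknown kinetic rates by fitting the ODE model to limited in vitro time-course data, follows a standard pattern: simulate the ODEs for candidate rates and minimize the residual against the measurements. The two-step cascade and synthetic data below are placeholders, not the SNARE network of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def model(t_eval, k1, k2, y0=(1.0, 0.0, 0.0)):
    """Toy A -> B -> C cascade standing in for a fusion subsystem; k1, k2 are the
    unknown kinetic rates to be recovered from the data."""
    def rhs(t, y):
        a, b, c = y
        return [-k1 * a, k1 * a - k2 * b, k2 * b]
    sol = solve_ivp(rhs, (0.0, t_eval[-1]), y0, t_eval=t_eval, rtol=1e-8)
    return sol.y[1]                                   # observable: intermediate B (e.g. assembled complex)

t = np.linspace(0, 10, 15)
true_k = (0.8, 0.3)
rng = np.random.default_rng(2)
data = model(t, *true_k) + rng.normal(0, 0.01, t.size)  # "limited, noisy in vitro measurements"

fit = least_squares(lambda k: model(t, *k) - data, x0=[0.1, 0.1], bounds=(1e-6, 10))
print("estimated rates:", np.round(fit.x, 3), "true:", true_k)
```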

16.
High-throughput genomic technologies are revolutionizing modern biology. In particular, DNA microarrays have become one of the most powerful tools for profiling global mRNA expression in different tissues and environmental conditions, and for detecting single nucleotide polymorphisms. The broad applicability of gene expression profiling to the biological and medical realms has generated expanding demand for mass production of microarrays, which in turn has created considerable interest in improving the cost-effectiveness of microarray fabrication techniques. We have developed a computational framework for an optimal synthesis strategy for oligonucleotide microarrays. The problem was introduced by Hubbell et al. Here, we formalize the problem, obtain precise bounds on its complexity and devise several computational solutions.

17.
Confidence judgements, self-assessments about the quality of a subject's knowledge, are considered a central example of metacognition. Prima facie, introspection and self-report appear to be the only way to access the subjective sense of confidence or uncertainty. Contrary to this notion, overt behavioural measures can be used to study confidence judgements by animals trained in decision-making tasks with perceptual or mnemonic uncertainty. Here, we suggest that a computational approach can clarify the issues involved in interpreting these tasks and provide a much-needed springboard for advancing the scientific understanding of confidence. We first review relevant theories of probabilistic inference and decision-making. We then critically discuss behavioural tasks employed to measure confidence in animals and show how quantitative models can help to constrain the computational strategies underlying confidence-reporting behaviours. In our view, post-decision wagering tasks with continuous measures of confidence appear to offer the best available metrics of confidence. Since behavioural reports alone provide a limited window into mechanism, we argue that progress calls for measuring the neural representations and identifying the computations underlying confidence reports. We present a case study using such a computational approach to study the neural correlates of decision confidence in rats. This work shows that confidence assessments may be considered higher order, but can be generated using elementary neural computations that are available to a wide range of species. Finally, we discuss the relationship of confidence judgements to the wider behavioural uses of confidence and uncertainty.
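The probabilistic-inference view the review builds on can be made concrete with a minimal signal-detection example: the decision is the sign of a noisy evidence sample, and confidence is the posterior probability that the chosen option is correct given that sample. This is a generic textbook sketch, not the rats' task or the neural model of the case study.

```python
import numpy as np
from scipy.stats import norm

def decide_with_confidence(evidence, d_prime=1.5, sigma=1.0):
    """Sign of the evidence gives the binary choice; confidence is the posterior
    probability of being correct under two equally likely Gaussian hypotheses
    centred at +/- d_prime/2 (a standard signal-detection formulation)."""
    mu = d_prime / 2.0
    choice = np.sign(evidence)
    like_pos = norm.pdf(evidence, loc=+mu, scale=sigma)
    like_neg = norm.pdf(evidence, loc=-mu, scale=sigma)
    p_pos = like_pos / (like_pos + like_neg)              # posterior for the "+" stimulus
    confidence = np.where(choice > 0, p_pos, 1 - p_pos)   # prob. the chosen side is correct
    return choice, confidence

# Weak evidence near the boundary yields low confidence; strong evidence, high confidence.
samples = np.array([0.05, 0.8, -2.0])
print(decide_with_confidence(samples))
```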

18.
The analysis of hemodynamic parameters and functional reactivity of cerebral capillaries is still controversial. To assess the hemodynamic parameters in the cortical capillary network, a generic model was created using 2D Voronoi tessellation in which each edge represents a capillary segment. This method can create an appropriate generic model of the cerebral capillary network for any part of the brain cortex, because the geometric model allows the capillary density to be varied. The modeling presented here is based on morphometric parameters extracted from physiological data of the human cortex. The pertinent hemodynamic parameters were obtained by numerical simulation based on effective blood viscosity as a function of hematocrit and microvessel diameter, phase separation and plasma skimming effects. The hemodynamic parameters of capillary networks with two different densities (consistent with the variation of the morphometric data in the human cortical capillary network) were analyzed. The results show pertinent hemodynamic parameters for each model. The heterogeneity (coefficient of variation) and the mean value of hematocrits, flow rates and velocities of both network models were specified. The distributions of blood flow throughout both models seem to confirm the hypothesis that all capillaries in a cortical network are recruited at rest (normal conditions). The results also demonstrate a discrepancy in network resistance between the two models, which derives from the difference in the number density of capillary segments between the models.
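Building the generic network, a 2D Voronoi tessellation whose edges serve as capillary segments, each assigned a Poiseuille flow resistance, can be sketched with scipy. The diameters, domain scaling and constant viscosity below are simplified placeholders; the paper's model additionally includes hematocrit-dependent viscosity, phase separation and plasma skimming.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(3)
seeds = rng.random((200, 2))                       # seed point count sets the capillary density
vor = Voronoi(seeds)

segments = []                                      # each finite Voronoi edge = one capillary
for (v1, v2) in vor.ridge_vertices:
    if v1 == -1 or v2 == -1:                       # skip edges that extend to infinity
        continue
    p1, p2 = vor.vertices[v1], vor.vertices[v2]
    if np.all((p1 >= 0) & (p1 <= 1)) and np.all((p2 >= 0) & (p2 <= 1)):
        segments.append((p1, p2))

def poiseuille_resistance(length_um, diameter_um, viscosity_cp=4.0):
    """Hydraulic resistance of a cylindrical segment, R = 128 * mu * L / (pi * D^4)."""
    mu = viscosity_cp * 1e-3                       # Pa.s
    L = length_um * 1e-6
    D = diameter_um * 1e-6
    return 128.0 * mu * L / (np.pi * D ** 4)       # Pa.s / m^3

diam = rng.normal(6.0, 1.0, len(segments)).clip(3.0, 9.0)          # capillary-scale diameters (um)
lengths = [np.linalg.norm(p2 - p1) * 500 for p1, p2 in segments]   # unit domain scaled to 500 um
R = [poiseuille_resistance(L, d) for L, d in zip(lengths, diam)]
print(f"{len(segments)} capillary segments, median resistance {np.median(R):.3g} Pa.s/m^3")
```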

19.
20.
Biomechanics and Modeling in Mechanobiology - The process of vision begins in the retina, yet the role of biomechanical forces in the retina is relatively unknown and only recently being explored....

