Similar documents (20 results)
1.
The results of a previous theoretical study of a class of systems are applied to the design of neural nets that aim to simulate biological behavior. In addition to models of single aperiodic and periodic neurons, a “neural oscillator” is developed which consists of two cross-excited neurons. Its response resembles the firing pattern of certain biological neural oscillators, such as the flight system of the locust. By a suitable change of its parameters it can also be made highly irregular, providing a deterministic model of spontaneous neural activity.
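
To make the construction concrete, the following is a minimal, purely illustrative sketch of a two-neuron oscillator built from cross-excitation: each unit integrates a small tonic drive plus delayed excitation from its partner, fires on crossing a threshold, and is then reset and held refractory. The model form and every parameter value (tonic, w, theta, refractory, delay) are assumptions for illustration, not the equations of the paper.

```python
# Illustrative two-neuron "oscillator" with cross-excitation (not the paper's
# exact model).  Each unit integrates a tonic drive plus delayed excitation
# from the other unit, spikes on reaching a threshold, then resets and stays
# refractory for a few steps.  All parameter values are assumptions.
def two_neuron_oscillator(steps=400, tonic=0.02, w=0.6,
                          theta=1.0, refractory=5, delay=3):
    v = [0.3, 0.0]                       # asymmetric start desynchronizes the pair
    refr = [0, 0]                        # remaining refractory steps
    queue = [[0.0] * delay for _ in range(2)]   # delayed synaptic input buffers
    spikes = []
    for t in range(steps):
        for i in range(2):
            syn = queue[i].pop(0)        # excitation arriving now
            queue[i].append(0.0)
            if refr[i] > 0:              # absolute refractory period
                refr[i] -= 1
                continue
            v[i] += tonic + syn
            if v[i] >= theta:            # spike: reset and excite the partner
                v[i] = 0.0
                refr[i] = refractory
                queue[1 - i][-1] += w    # delivered after `delay` steps
                spikes.append((t, i))
    return spikes

print(two_neuron_oscillator()[:12])      # (time step, neuron index) pairs
```

Run as is, the pair settles into a repeating joint firing pattern; the paper's point is that deterministic loops of this general kind can be tuned from periodic to highly irregular behaviour.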

2.
Neurons, glia, and endothelial cells of the cerebral microvasculature co-exist in intimate proximity in nervous tissues, and their homeostatic interactions in health, as well as their coordinated response to injury, have led to the concept that they form the basic elements of a functional neurovascular unit. During the course of normal cellular metabolism, growth, and development, each of these brain cell types secretes various species of potentially neurotoxic peptides and factors, events that increase in magnitude as brain cells age. This article reviews contemporary research on the secretory products of the three primary cell types that constitute the neurovascular unit in deep brain regions. We provide some novel in vitro data that illustrate potentially pathogenic paracrine effects within primary cells of the neurovascular unit. For example, the pro-inflammatory cytokine interleukin (IL)-1β was found to stimulate amyloid-β (Aβ) peptide release from human neural cells, and human brain microvessel endothelial cells exposed to transient hypoxia were found to secrete IL-1β at concentrations known to induce Aβ42 peptide release from human neural cells. Hypoxia and excessive IL-1β and Aβ42 abundance are typical pathogenic stress factors implicated in the initiation and development of common, chronic neurological disorders such as Alzheimer's disease. These data support the hypothesis that paracrine effects of stressed constituent cells of the neurovascular unit may contribute to the “spreading effects” characteristic of progressive neurodegenerative disorders.

3.
The response time of a random net is defined as the expected time (measured in the number of synaptic delays) required for the excitation in the net (measured by the fraction of neurons firing per unit time) to reach a certain level. The response time is calculated in terms of the net parameters as a function of the intensity of the outside stimulation. Two principal types of cases are studied: 1) an instantaneous initial stimulation, and 2) continuously applied stimulation. It is shown that for a certain type of net, where the required level of excitation is small, the response time–intensity equation reduces to the one derived on the basis of the “one-factor” theory applied to a neural connection. More general assumptions, however, give different types of equations. The concept of the “net threshold” is defined, and its calculation is indicated. The net threshold for instantaneous stimulation is, in general, greater than that for continuous stimulation. The results are discussed with reference to existing theories of reaction times.
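
As a hedged illustration of the definition (not the paper's derivation), the sketch below iterates a mean-field update for the fraction of neurons firing per synaptic delay in a random net under continuously applied stimulation and counts the number of delays needed to reach a prescribed activity level. The connectivity k, neuron threshold h, and update rule are assumptions chosen only to show the response time shrinking as the stimulus intensity grows.

```python
# Hedged sketch of the "response time" idea: track the fraction of neurons
# firing per synaptic delay in a mean-field random net under continuous
# stimulation and count how many delays it takes to reach a target level.
# The update rule and parameters are illustrative, not the paper's equations.
import math

def firing_prob(p, stim, k=10, h=3):
    """P(neuron fires next step) given fraction p active, k random inputs,
    a firing threshold of h active inputs, and external stimulation prob."""
    p_net = sum(math.comb(k, j) * p**j * (1 - p)**(k - j) for j in range(h, k + 1))
    return 1 - (1 - p_net) * (1 - stim)     # fires via net input or stimulus

def response_time(stim, level=0.5, p0=0.0, max_delays=1000):
    p = p0
    for t in range(1, max_delays + 1):
        p = firing_prob(p, stim)
        if p >= level:
            return t
    return None   # activity never reaches the required level

for s in (0.05, 0.1, 0.2, 0.4):
    print(s, response_time(s))   # stronger stimulation -> shorter response time
```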

4.
Partial directed coherence: a new concept in neural structure determination
This paper introduces a new frequency-domain approach to describe the relationships (direction of information flow) between multivariate time series based on the decomposition of multivariate partial coherences computed from multivariate autoregressive models. We discuss its application and compare its performance to other approaches to the problem of determining neural structure relations from the simultaneous measurement of neural electrophysiological signals. The new concept is shown to reflect a frequency-domain representation of the concept of Granger causality. Received: 25 April 2000 / Accepted in revised form: 13 November 2000
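
As an illustration of the quantity being introduced, the snippet below computes the magnitude of partial directed coherence from the coefficient matrices of a fitted multivariate autoregressive model, following the usual normalization of each column of Ā(f) = I − Σ_r A_r·exp(−i2πfr) by its norm. The two-channel coefficient matrices are made-up example values, not estimates from any data set.

```python
# Sketch of partial directed coherence (PDC) computed from the coefficient
# matrices of a multivariate autoregressive (MVAR) model.  The example
# coefficient matrices are arbitrary illustrative values; in practice
# A_1..A_p would be estimated from the recorded signals.
import numpy as np

def pdc(A, freqs):
    """A: array of shape (p, n, n) with MVAR coefficients A_1..A_p.
    Returns |PDC_ij(f)| with shape (len(freqs), n, n): influence j -> i."""
    p, n, _ = A.shape
    out = np.empty((len(freqs), n, n))
    for k, f in enumerate(freqs):
        Abar = np.eye(n, dtype=complex)
        for r in range(1, p + 1):
            Abar -= A[r - 1] * np.exp(-2j * np.pi * f * r)
        denom = np.sqrt((np.abs(Abar) ** 2).sum(axis=0))  # column-wise norm
        out[k] = np.abs(Abar) / denom
    return out

A = np.array([[[0.5, 0.0], [0.4, 0.3]],     # lag-1 coefficients: x1 drives x2
              [[-0.2, 0.0], [0.0, -0.1]]])  # lag-2 coefficients
freqs = np.linspace(0, 0.5, 6)              # normalized frequency (fs = 1)
print(np.round(pdc(A, freqs)[:, 1, 0], 3))  # PDC from channel 1 to channel 2
```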

5.
The aim of this paper is to develop a multiscale hierarchical hybrid model, based on finite element analysis and neural network computation, to link the mesoscopic scale (trabecular network level) and the macroscopic scale (whole bone level) in order to simulate the process of bone remodelling. As whole-bone simulation, including the 3D reconstruction of trabecular-level bone, is time consuming, finite element calculation is only performed at the macroscopic level, whilst trained neural networks are employed as numerical substitutes for the finite element code needed for the mesoscale prediction. The bone mechanical properties are updated at the macroscopic scale depending on the morphological and mechanical adaptation at the mesoscopic scale computed by the trained neural network. The digital image-based modelling technique using μ-CT and voxel finite element analysis is used to capture volume elements representative of 2 mm³ at the mesoscale level of the femoral head. The input data for the artificial neural network are a set of bone material parameters, boundary conditions and the applied stress. The output data are the updated bone properties and some trabecular bone factors. The current approach is, to our knowledge, the first model that incorporates both finite element analysis and neural network computation to rapidly simulate multilevel bone adaptation.
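
The division of labour described above can be sketched as follows: an inexpensive stand-in function plays the role of the mesoscale voxel finite element run, a small neural network is trained on its outputs offline, and the macroscale update then queries the trained network instead of the FE code. The stand-in function, the (density, stress) inputs and all numbers are assumptions for illustration; scikit-learn's MLPRegressor is used only as a convenient generic network.

```python
# Hedged sketch of using a trained neural network as a surrogate for the
# mesoscale finite element computation.  fake_mesoscale_fe() is a toy
# stand-in for the voxel FE run on a 2 mm^3 volume element; its form and all
# parameter ranges are assumptions, not results from the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def fake_mesoscale_fe(density, stress_mpa):
    """Toy 'updated apparent modulus' (GPa) for a trabecular volume element."""
    return 6.0 * density ** 1.8 + 0.2 * stress_mpa

# 1) Build a training set offline from sampled (bone density, applied stress).
X = np.column_stack([rng.uniform(0.1, 0.9, 400),      # volume-fraction-like density
                     rng.uniform(0.0, 5.0, 400)])     # applied stress in MPa
y = fake_mesoscale_fe(X[:, 0], X[:, 1])

# 2) Train the surrogate network once.
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X, y)

# 3) Inside the macroscale remodelling loop, query the surrogate instead of
#    re-running the mesoscale finite element model.
print(surrogate.predict([[0.45, 1.2]]))               # predicted modulus (GPa)
```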

6.
The linear no-threshold (LNT) controversy covers much more than the mere discussion of whether or not “the LNT hypothesis is valid”. It is shown that one cannot expect to find a single, let alone a unique, dose–effect relationship. Each element within the biological reaction chain that is affected by ionizing radiation contributes in a specific way to the final biological endpoint of interest. The resulting dose–response relationship represents the superposition of all these effects. To date there is neither a closed and clear picture of the entirety of radiation action for doses below some 10 mSv, nor does clear epidemiological evidence exist for an increase in risk of stochastic effects in this dose range. On the other hand, radiation protection demands quantitative risk estimates as well as practicable dose concepts. In this respect, the LNT concept is preferred over any alternative concept. However, the LNT concept does not necessarily mean that the mechanism of cancer induction is intrinsically linear. It could hold even if the underlying multi-step mechanisms act in a non-linear way; in this case it would express a certain “attenuation” of non-linearities. Favouring LNT over threshold, hyper-linear, or sub-linear models for radiation-protection purposes on the one hand, while preferring one of these models (e.g. for a specific effect) on biological grounds for scientific purposes on the other, is not a contradiction.
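
For orientation only, the competing low-dose shapes at issue can be written as excess-risk functions of dose D; the parameters α, β, and D_T below are generic illustrative symbols, not values taken from the article:

\[
R_{\text{LNT}}(D)=\alpha D,\qquad
R_{\text{thr}}(D)=\alpha\,(D-D_T)\,\Theta(D-D_T),\qquad
R_{\text{LQ}}(D)=\alpha D+\beta D^{2},
\]

where \(\Theta\) is the Heaviside step function; hyper- and sub-linear behaviour correspond to responses lying above or below \(R_{\text{LNT}}\) at low doses.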

7.
The calculation of the size of the “sensitive volume” or “control center” in biological effects of radiation is discussed from the viewpoint of the probabilistic theory of these phenomena based on the concept of random “effective events”. On the basis of that theory, the resistance of a microorganism to radiation is defined as its “mean life” under irradiation at one roentgen per minute. This mean is calculated for processes with and without recovery. The case of variable sensitivity, as it occurs for instance during mitosis, is discussed in detail. Methods are given to calculate this variability from survival curves or similar experimental data. The theory is applied to experiments of A. Zuppinger on the irradiation of Ascaris eggs with X-rays.
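
One standard way of making the “mean life” idea concrete, given here as an illustrative single-event model without recovery rather than the paper's full treatment, is:

\[
S(t)=e^{-k r t},\qquad
\bar{T}=\int_{0}^{\infty} S(t)\,dt=\frac{1}{k r},
\]

where \(r\) is the dose rate (one roentgen per minute in the definition above), \(k\) is the probability per roentgen of a random “effective event” in the sensitive volume, \(S(t)\) is the probability that the organism is still unaffected at time \(t\), and \(\bar{T}\) is the mean life. Recovery, or a sensitivity that varies during mitosis, makes \(k\) time-dependent and changes the value of the integral.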

8.
Despite the voluminous literature on biological functions produced over the last 40 years, few philosophers have studied the concept of function as it is used in neuroscience. Recently, Craver (forthcoming; also see Craver 2001) defended the causal role theory against the selected effects theory as the most appropriate theory of function for neuroscience. This paper argues that, though neuroscientists do study causal role functions, the scope of that theory is not as universal as claimed. Despite the strong prima facie superiority of the causal role theory, the selected effects theory (when properly developed) can handle many cases from neuroscience with equal facility. It argues this by presenting a new theory of function that generalizes the notion of a ‘selection process’ to include processes such as neural selection, antibody selection, and some forms of learning; that is, to include structures that have been differentially retained as well as those that have been differentially reproduced. This view, called the generalized selected effects theory of function, will be defended from criticism and distinguished from similar views in the literature.

9.
We wondered whether random populations of dissociated cultured cortical neurons, despite their lack of structure and/or regional specialization, are capable of modulating their neural activity as the effect of a time-varying stimulation – a simulated ‘sensory’ afference. More specifically, we used localized low-frequency, non-periodic trains of stimuli to simulate sensory afferences, and asked how much information about the original trains of stimuli could be extracted from the neural activity recorded at the different sites. Furthermore, motivated by the results of studies performed both in vivo and in vitro on different preparations, which suggested that isolated spikes and bursts may play different roles in coding time-varying signals, we explored the amount of such ‘sensory’ information that could be associated with these different firing modes. Finally, we asked whether and how such ‘sensory’ information is transferred from the sites of stimulation (i.e., the ‘sensory’ areas) to the other regions of the neural populations. To do this we applied stimulus reconstruction techniques and information theoretic concepts that are typically used to investigate neural coding in sensory systems. Our main results are that (1) slow variations of the rate of stimulation are coded into isolated spikes and in the time of occurrence of bursts (but not in the bursts’ temporal structure); (2) increasing the rate of stimulation increases the proportion of isolated spikes in the average evoked response and their importance in coding for the stimuli; and (3) the ability to recover the time course of the pattern of stimulation is strongly related to the degree of functional connectivity between stimulation and recording sites. These observations parallel similar findings in intact nervous systems regarding the complementary roles of bursts and tonic spikes in encoding sensory information. Our results also have interesting implications for the field of neuro-robotic interfaces: the ability of populations of neurons to code information is a prerequisite for hybrid systems in which neuronal populations are used to control external devices.
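
In the same spirit as the information-theoretic analysis mentioned above, the sketch below generates a slowly varying, non-periodic stimulation rate, draws synthetic spike counts that noisily track it, and computes a plug-in estimate of the mutual information between stimulus and response from a joint histogram. The synthetic data, bin counts, and the simple estimator are assumptions for illustration; they are not the recordings or the exact estimator used in the study.

```python
# Hedged sketch of quantifying 'sensory' information: synthetic stimulation
# rate, synthetic Poisson spike counts tracking it, and a plug-in mutual
# information estimate from a 2-D histogram.  All choices are illustrative.
import numpy as np

rng = np.random.default_rng(1)

# Slowly varying, non-periodic stimulation rate (events per time bin).
noise = rng.standard_normal(5200)
kernel = np.ones(200) / 200                      # smoothing kernel
stim = 2.0 + 3.0 * np.convolve(noise, kernel, mode="valid")[:5000]
stim = np.clip(stim, 0.1, None)

# Synthetic evoked response: Poisson spike counts whose mean tracks the stimulus.
resp = rng.poisson(0.8 * stim)

def mutual_information(x, y, bins=8):
    """Plug-in MI estimate (in bits) from a 2-D histogram of (x, y)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

print(round(mutual_information(stim, resp), 3), "bits per time bin")
```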

10.
Hebb proposed the concept of a neural assembly distributed across cortical tissue as a model for the representation of information in the cerebral cortex. Later developments of the concept highlight the need for overlapping membership between independent assemblies, and the spread of activity throughout an assembly once it is activated above a critical level (ignition). Formalisation of the neural assembly concept, especially in relation to quantitative data from the real cortex, is at a very early stage. We consider two constraints on neural assembly size: (1) if a neural assembly is too small, the fraction of its neurons that need to be active to ignite the whole assembly becomes unrealistically large; (2) if assemblies in a block of cortical tissue become too large, the block becomes ‘unsafe’, that is, unwanted spread from an active assembly to overlapping ones becomes inevitable. We consider variations in three parameters: neuronal firing threshold, connection density, and the total number of assemblies stored in the block of cortical tissue. Given biologically plausible values for these parameters, we estimate the maximum assembly size compatible with ignitability of individual assemblies, low probability of unwanted spread to overlapping assemblies, and safe operation of the block as a whole. Received: 7 March 1997 / Accepted in revised form: 1 July 1997
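
The first of the two size constraints can be illustrated with a back-of-envelope calculation: if a member of an assembly of N neurons receives connections from other members with probability c and needs roughly h coincident active inputs to fire, then ignition requires the initially active fraction f to satisfy f·c·(N−1) ≳ h. The values of c and h below are illustrative assumptions, not the parameter estimates of the paper.

```python
# Rough illustration of constraint (1): the minimum fraction of an assembly
# that must already be active for the rest to ignite, assuming a member needs
# about h coincident active inputs and is connected to other members with
# probability c.  c and h are assumed, illustrative values.
def min_ignition_fraction(N, c=0.1, h=20):
    return min(1.0, h / (c * (N - 1)))

for N in (100, 500, 2000, 10000):
    print(N, round(min_ignition_fraction(N), 3))
# Small assemblies demand an unrealistically large active fraction to ignite.
```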

11.
By “neural net” will be meant “neural net without circles.” Every neural net effects a transformation from inputs (i.e., firing patterns of the input neurons) to outputs (firing patterns of the output neurons). Two neural nets will be called equivalent if they effect the same transformation from inputs to outputs. A canonical form is found for neural nets with respect to equivalence; i.e., a class of neural nets is defined, no two of which are equivalent, and which contains a neural net equivalent to any given neural net. This research was supported by the U.S. Air Force under Contract AF 49(638)-414, monitored by the Air Force Office of Scientific Research.
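
The notion of equivalence can be illustrated directly: a circle-free net computes a fixed map from input firing patterns to output firing patterns, so two nets are equivalent exactly when they agree on every input pattern. The two tiny threshold-unit nets below (both computing exclusive-or, but wired differently) are invented examples, not taken from the paper.

```python
# Sketch of the equivalence notion: a circle-free (feedforward) net is just a
# map from input firing patterns to output firing patterns, so two nets are
# equivalent iff they agree on every input pattern.  The example nets are
# illustrative threshold-unit nets, not from the paper.
from itertools import product

def run_net(layers, x):
    """layers: list of layers; each unit is (weights, threshold).
    A unit fires (1) iff the weighted sum of the previous layer >= threshold."""
    for layer in layers:
        x = tuple(int(sum(w * xi for w, xi in zip(weights, x)) >= theta)
                  for weights, theta in layer)
    return x

# Two differently wired nets, each computing XOR of two input neurons.
net_a = [[((1, 1), 1), ((1, 1), 2)], [((1, -1), 1)]]    # (OR, AND) -> OR and not AND
net_b = [[((1, -1), 1), ((-1, 1), 1)], [((1, 1), 1)]]   # (x and not y, y and not x) -> OR

equivalent = all(run_net(net_a, x) == run_net(net_b, x)
                 for x in product((0, 1), repeat=2))
print(equivalent)   # True: the nets effect the same input-output transformation
```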

12.
Studies on drawing circles with both hands in the horizontal plane have shown that this task is easy to perform across a wide range of movement frequencies under the symmetrical mode of coordination, whereas under the asymmetrical mode (both limbs moving clockwise or counterclockwise) increases in movement frequency have a disruptive effect on trajectory control and hand coordination. To account for these interference effects, we propose a simplified computer model for bimanual circle drawing based on the assumptions that (1) circular trajectories are generated from two orthogonal oscillations coupled with a phase delay, (2) the trajectories are organized on two levels, “intention” and “motor execution”, and (3) the motor systems controlling each hand are prone to neural cross-talk. The neural cross-talk consists of dispatching some fraction of any force command sent to one limb as a mirror image to the other limb. Assuming predominant coupling influences from the dominant to the nondominant limb, the simulations successfully reproduced the main characteristics of performance during asymmetrical bimanual circle drawing with increasing movement frequencies, including disruption of the circular form drawn with the nondominant hand, increasing dephasing of the hand movements, increasing variability of the phase difference, and occasional reversals of movement direction in the nondominant limb. The implications of these results for current theories of bimanual coordination are discussed. Received: 23 June 1998 / Accepted in revised form: 20 April 1999
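
The model's key ingredients can be sketched in a few lines: each hand's intended command is a pair of orthogonal oscillations with a quarter-cycle phase delay (a circle), and at the execution level a fraction of each command leaks to the other hand as a mirror image, with a stronger leak assumed from the dominant to the nondominant hand. The coupling strengths and the simple "circularity" measure are illustrative assumptions, not the parameters of the paper.

```python
# Minimal sketch of the cross-talk idea: intended circles from two orthogonal
# oscillations, plus a mirror-image leak of each command to the other hand
# (stronger from dominant to nondominant).  All numbers are assumptions.
import numpy as np

t = np.linspace(0, 4, 1000)            # seconds
f = 2.0                                # movement frequency (Hz)
phase = np.pi / 2                      # phase delay producing a circle

# Intended trajectories: asymmetrical mode (both hands circling the same way).
right_int = np.column_stack([np.cos(2 * np.pi * f * t),
                             np.cos(2 * np.pi * f * t - phase)])
left_int = right_int.copy()

def mirror(traj):
    return traj * np.array([-1.0, 1.0])   # mirror image about the midline

k_dom_to_nondom, k_nondom_to_dom = 0.25, 0.05
right_exec = right_int + k_nondom_to_dom * mirror(left_int)
left_exec = left_int + k_dom_to_nondom * mirror(right_int)

# Distortion of the drawn form: ratio of the two radii of the executed ellipse.
def circularity(traj):
    radii = traj.max(axis=0) - traj.min(axis=0)
    return radii.min() / radii.max()       # 1.0 = perfect circle

print("right hand circularity:", round(circularity(right_exec), 2))
print("left hand circularity:", round(circularity(left_exec), 2))
```

The nondominant (left) hand's trace ends up noticeably less circular than the dominant hand's, which is the qualitative effect the abstract describes.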

13.
In this paper, we highlight the topological properties of leader neurons, whose existence is an experimental fact. Several experimental studies show the existence of leader neurons in population bursts of activity in 2D living neural networks (Eytan and Marom, J Neurosci 26(33):8465–8476, 2006; Eckmann et al., New J Phys 10(015011), 2008). A leader neuron is defined as a neuron that fires at the beginning of a burst (or network spike) more often than would be expected by chance given its mean firing rate. This means that leader neurons have some burst-triggering power beyond a chance-level statistical effect. In this study, we characterize these leader neuron properties. This naturally leads us to simulate 2D neural networks. To build our simulations, we choose the leaky integrate-and-fire (LIF) neuron model (Gerstner and Kistler 2002; Cessac, J Math Biol 56(3):311–345, 2008), which allows fast simulations (Izhikevich, IEEE Trans Neural Netw 15(5):1063–1070, 2004; Gerstner and Naud, Science 326:379–380, 2009). The dynamics of our LIF model produces stable leader neurons in the population bursts that we simulate. These leader neurons are excitatory neurons and have a low membrane potential firing threshold. Apart from these first two properties, the conditions required for a neuron to be a leader neuron are difficult to identify and seem to depend on several parameters of the simulations themselves. However, a detailed linear analysis shows a trend in the properties required for a neuron to be a leader neuron. Our main finding is that a leader neuron sends signals to many excitatory neurons but to few inhibitory neurons, and receives signals from only a few other excitatory neurons. Our linear analysis exhibits five essential properties of leader neurons, each with a different relative importance. This means that, for a given neural network with a fixed mean number of connections per neuron, our analysis gives a way of predicting which neurons are good leader neurons and which are not. Our prediction formula correctly assesses leadership for at least ninety percent of neurons.
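
The operational definition above can be turned into a simple score: how much more often a neuron fires in the first bin of a population burst than its overall firing rate would predict. The raster below is synthetic (neuron 0 is constructed to spike at the burst onsets) and the burst onset times are taken as given; this only illustrates the scoring step, not the paper's LIF network simulations or linear analysis.

```python
# Hedged sketch of the leader-neuron definition: score each neuron by the
# ratio of its firing probability in burst-onset bins to its overall firing
# rate.  The spike raster is synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(3)
n_neurons, n_bins = 20, 10000
rates = rng.uniform(0.01, 0.05, n_neurons)              # spikes per bin
spikes = rng.random((n_bins, n_neurons)) < rates        # background firing

burst_onsets = np.arange(100, n_bins, 250)              # known burst start bins
spikes[burst_onsets, 0] = True                          # neuron 0 "leads" bursts

def leadership(spikes, onsets):
    """Ratio of observed to chance-level firing in burst-onset bins."""
    observed = spikes[onsets].mean(axis=0)               # P(fire | burst onset)
    chance = spikes.mean(axis=0)                         # overall firing rate
    return observed / chance

score = leadership(spikes, burst_onsets)
print("best candidate leader:", int(score.argmax()),
      "score:", round(float(score.max()), 1))
```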

14.
This work investigated the growth of Kluyveromyces marxianus NRRL Y-7571 by solid-state fermentation (SSF) in a medium composed of sugarcane bagasse, molasses, corn steep liquor and soybean meal within a packed-bed bioreactor. Seven experimental runs were carried out to evaluate the effects of flow rate and inlet air temperature on the following microbial rates: cell mass production, total reducing sugar and oxygen consumption, carbon dioxide and ethanol production, and metabolic heat and water generation. A mathematical model based on an artificial neural network was developed to predict the above-mentioned microbial rates as functions of the fermentation time, initial total reducing sugar concentration, and inlet and outlet air temperatures. The results showed that the microbial rates were temperature dependent over the range 27–50°C. The proposed model efficiently predicted the microbial rates, indicating that the neural network approach can be used to simulate microbial growth in SSF.
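
A hedged sketch of the kind of model described, using a generic small feed-forward network (scikit-learn's MLPRegressor) to map (fermentation time, initial total reducing sugar, inlet air temperature, outlet air temperature) to one microbial rate. The synthetic training table and the placeholder target function stand in for the data from the seven experimental runs; all names and numbers are assumptions.

```python
# Illustrative sketch of an ANN mapping process variables to a microbial rate.
# The synthetic data and the placeholder target are assumptions, not the
# experimental measurements reported in the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
# Columns: time (h), initial TRS (g/kg), inlet air T (C), outlet air T (C)
X = np.column_stack([rng.uniform(0, 48, 200), rng.uniform(100, 200, 200),
                     rng.uniform(27, 50, 200), rng.uniform(27, 50, 200)])
# Placeholder target: CO2 production rate (arbitrary units) with an assumed
# temperature optimum and decay over fermentation time.
y = 0.02 * X[:, 1] * np.exp(-((X[:, 3] - 38.0) / 8.0) ** 2) * np.exp(-X[:, 0] / 30.0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10, 10),
                                   max_iter=5000, random_state=0))
model.fit(X, y)
print(model.predict([[12.0, 150.0, 35.0, 38.0]]))   # predicted CO2 rate
```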

15.
Analysis schemes for the classification of synergism and antagonism for mixed agents operate on the discrepancies between observed and calculated results. As such, they cannot be confirmed by experiments and therefore have to be tested in terms of mathematical and logical self-consistency. The concept of independent action is close to the literal meaning of the term “non-interaction”. Since this concept depends neither on the mechanisms of action nor on the type of effect scale used, it is suitable as one of the basic criteria for the definition of synergism and antagonism. A general mathematical framework of independent action is presented in this paper, based on the concept of “relative effect” as used in the literature. The different equations for independent action currently used in various areas are shown to be manifestations of a general formula under different sets of boundary conditions, which are the natural limiting values of the effects of the corresponding system observed at low and at high doses of the agents. The framework can be generalized to the combined action of n agents as well as to the interaction of an agent with itself. In addition, the differential form of the formula for independent action is derived. This framework of systematic definitions and derived equations enables a more in-depth study of the implications of the concept of independent action and its relation to other concepts of non-interaction.
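
The most familiar special case of such an independent-action formula, given here for illustration (it is not necessarily the general formula derived in the paper), applies when the relative effects E_i in [0, 1] are interpreted as probabilities of response:

\[
1 - E_{1\ldots n} \;=\; \prod_{i=1}^{n}\bigl(1 - E_i\bigr),
\qquad\text{so for two agents}\qquad
E_{AB} = E_A + E_B - E_A E_B .
\]

Here the boundary conditions are the natural limits E → 0 at vanishing dose and E → 1 at very high dose; synergism and antagonism would then correspond to observed combined effects lying above or below this prediction.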

16.
This essay examines the origin(s) of genotype–environment interaction, or G × E. “Origin(s)” and not “the origin” because the thesis is that there were actually two distinct concepts of G × E at its beginning: a biometric concept, or G × E_B, and a developmental concept, or G × E_D. R. A. Fisher, one of the founders of population genetics and the creator of the statistical analysis of variance, introduced the biometric concept as he attempted to resolve one of the main problems in the biometric tradition of biology – partitioning the relative contributions of nature and nurture responsible for variation in a population. Lancelot Hogben, an experimental embryologist and also a statistician, introduced the developmental concept as he attempted to resolve one of the main problems in the developmental tradition of biology – determining the role that developmental relationships between genotype and environment played in the generation of variation. To argue for this thesis, I outline Fisher's and Hogben's separate routes to their respective concepts of G × E; these separate interpretations of G × E are then drawn on to explicate a debate between Fisher and Hogben over the importance of G × E, the first installment of a persistent controversy. Finally, Fisher's G × E_B and Hogben's G × E_D are traced beyond their own work into mid-20th-century population and developmental genetics, and then into the infamous IQ Controversy of the 1970s.

17.
18.
The present study investigated the influence of tenuigenin, an active ingredient of Polygala tenuifolia Willd, on the proliferation and differentiation of hippocampal neural stem cells in vitro. Tenuigenin was added to a neurosphere culture and neurosphere growth was measured using the MTT assay. The influence of tenuigenin on the proliferation of neural progenitors was examined by a clone-forming assay and BrdU detection. In addition, the differentiation of neural stem cells was compared using immunocytochemistry for βIII-tubulin and GFAP. The results showed that addition of tenuigenin to the neural stem cell medium increased the number of newly formed neurospheres. More neurons were also obtained when tenuigenin was added to the differentiation medium. These findings suggest that tenuigenin is involved in regulating the proliferation and differentiation of hippocampal neural stem cells, which may be one of the underlying reasons for tenuigenin's nootropic and anti-aging effects.

19.
20.