Similar documents
20 similar documents found.
1.
Self-organization of neurons described by the maximum-entropy principle
In the article the maximum-entropy principle and Parzen windows are applied to derive an optimal mapping of a continuous random variable into a discrete one. The mapping can be performed by a network of self-organizing information processing units similar to biological neurons. Each neuron is selectively sensitized to one prototype from the sample space of the discrete random variable. The continuous random variable is applied as the input signal exciting the neurons. The response of the network is described by the excitation vector, which represents the encoded input signal. Due to the interaction between neurons, the excitations cause adaptive changes of the prototypes. The derived mathematical model explains this interaction in detail; a simplified self-organization rule derived from it corresponds to that of Kohonen. One- and two-dimensional examples of self-organization simulated on a computer are shown in the article.
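The abstract does not reproduce the derived update rule; as a rough illustration of the Kohonen-style rule it is said to reduce to, here is a minimal Python sketch with an arbitrary Gaussian neighborhood and learning rate (not the paper's maximum-entropy derivation):

```python
import numpy as np

def kohonen_update(prototypes, x, lr=0.1, sigma=1.0):
    """One self-organization step: move prototypes toward input x,
    weighted by a neighborhood function around the best-matching unit.
    Illustrative only; the paper derives its rule from the
    maximum-entropy principle and Parzen windows."""
    # index of the most excited (best-matching) prototype
    winner = np.argmin(np.linalg.norm(prototypes - x, axis=1))
    # lateral interaction: Gaussian neighborhood over prototype indices
    idx = np.arange(len(prototypes))
    h = np.exp(-((idx - winner) ** 2) / (2 * sigma ** 2))
    # adaptive change of each prototype toward the input
    return prototypes + lr * h[:, None] * (x - prototypes)

# usage: ten one-dimensional prototypes adapting to uniform input
rng = np.random.default_rng(0)
protos = rng.uniform(0, 1, size=(10, 1))
for _ in range(1000):
    protos = kohonen_update(protos, rng.uniform(0, 1, size=1))
```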

2.
A computer model of sustained chopper neurons in the ventral cochlear nucleus is presented and investigated. In the companion paper, the underlying neurophysiological and neuroanatomical data are demonstrated. To explain the preference of chopper neurons for oscillations with periods which are multiples of a 0.4 ms synaptic delay, we suggest a model of circularly connected chopper neurons. In order to simulate chopper neurons within a physiological dynamic range for periodicity encoding, it is necessary to assume that they receive an input from onset neurons. Our computer analysis of the resulting simple neuronal network shows that it can produce stable oscillations. The chopping can be triggered by an amplitude-modulated signal (AM). The dynamic range and the synchronous response of the simulated chopper neurons to AM are enhanced significantly by an additional input from onset neurons. Physiological properties of chopper neurons in the cat, such as mean, standard deviation, and coefficient of variation of the interspike interval are matched precisely by our simulations.

3.
The mean and variance of the total synaptic input to a neuron can vary independently, suggesting two distinct information channels. Here we examine the impact of rapidly varying signals, delivered via these two information conduits, on the temporal dynamics of neuronal firing rate responses. We examine the responses of model neurons to step functions in either the mean or the variance of the input current. Our results show that the temporal dynamics governing response onset depend on the choice of model. Specifically, the existence of a hard threshold introduces an instantaneous component into the response onset of a leaky integrate-and-fire model that is not present in other models studied here. Other response features, for example a decaying oscillatory approach to a new steady-state firing rate, appear to be more universal among neuronal models. The decay time constant of this approach is a power-law function of noise magnitude over a wide range of input parameters. Understanding how specific model properties underlie these response features is important for understanding how neurons will respond to rapidly varying signals, as the temporal dynamics of the response onset and response decay to a new steady state determine what range of signal frequencies a population of neurons can respond to and faithfully encode.
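As a sketch of the kind of numerical experiment the abstract describes (not the authors' models or parameters), the following snippet steps the mean of a noisy input current mid-simulation and reads out the population firing rate of leaky integrate-and-fire neurons:

```python
import numpy as np

def lif_population_rate(n=2000, T=1.0, dt=1e-4, tau=0.02,
                        v_th=1.0, v_reset=0.0,
                        mu=(0.8, 1.2), sigma=0.5, t_step=0.5):
    """Simulate n leaky integrate-and-fire neurons driven by noisy input
    whose mean jumps from mu[0] to mu[1] at t_step (variance fixed).
    Returns the time axis and the population firing rate (spikes/s per
    neuron); all values are arbitrary illustrative choices."""
    steps = int(T / dt)
    v = np.zeros(n)
    rate = np.zeros(steps)
    rng = np.random.default_rng(1)
    for i in range(steps):
        m = mu[0] if i * dt < t_step else mu[1]
        noise = sigma * np.sqrt(dt) * rng.standard_normal(n)
        v += dt * (-v + m) / tau + noise
        spiked = v >= v_th          # hard threshold -> abrupt rate component
        v[spiked] = v_reset
        rate[i] = spiked.mean() / dt
    return np.arange(steps) * dt, rate

t, r = lif_population_rate()
```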

4.
A mathematical model of neural processing is proposed which incorporates a theory for the storage of information. The model consists of a network of neurons that linearly processes incoming neural activity. The network stores the input by modifying the synaptic properties of all of its neurons. The model lends support to a distributive theory of memory using synaptic modification. The dynamics of the processing and storage are represented by a discrete system. Asymptotic analysis is applied to the system to show the learning capabilities of the network under constant input. Results are also given to predict the network's ability to learn periodic input, and input subjected to small random fluctuations.

5.
Neuronal impedance characterizes the magnitude and timing of the subthreshold response of a neuron to oscillatory input at a given frequency. It is known to be influenced by both the morphology of the neuron and the presence of voltage-gated conductances in the cell membrane. Most existing theoretical accounts of neuronal impedance considered the effects of voltage-gated conductances but neglected the spatial extent of the cell, while others examined spatially extended dendrites with a passive or spatially uniform quasi-active membrane. We derived an explicit mathematical expression for the somatic input impedance of a model neuron consisting of a somatic compartment coupled to an infinite dendritic cable which contained voltage-gated conductances, in the more general case of non-uniform dendritic membrane potential. The validity and generality of this model was verified through computer simulations of various model neurons. The analytical model was then applied to the analysis of experimental data from real CA1 pyramidal neurons. The model confirmed that the biophysical properties and predominantly dendritic localization of the hyperpolarization-activated cation current I_h were important determinants of the impedance profile, but also predicted a significant contribution from a depolarization-activated fast inward current. Our calculations also implicated the interaction of I_h with amplifying currents as the main factor governing the shape of the impedance-frequency profile in two types of hippocampal interneuron. Our results provide not only a theoretical advance in our understanding of the frequency-dependent behavior of nerve cells, but also a practical tool for the identification of candidate mechanisms that determine neuronal response properties.
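For orientation only, and not the paper's analytical expression for a soma coupled to an infinite cable, a single quasi-active compartment already shows how a slow h-like current shapes the impedance-frequency profile; the parameter values below are arbitrary:

```python
import numpy as np

def impedance(freq_hz, C=2e-10, g_leak=1e-8, g_h=1.5e-8, tau_h=0.05):
    """Impedance magnitude (ohms) of one quasi-active compartment with
    capacitance C, leak conductance g_leak, and a slow h-like current of
    steady-state conductance g_h and activation time constant tau_h:
        Z(w) = 1 / (jwC + g_leak + g_h / (1 + jw*tau_h)).
    Illustrative values only."""
    w = 2 * np.pi * np.asarray(freq_hz)
    z = 1.0 / (1j * w * C + g_leak + g_h / (1.0 + 1j * w * tau_h))
    return np.abs(z)

freqs = np.logspace(-1, 2, 200)          # 0.1 to 100 Hz
z_mag = impedance(freqs)
f_res = freqs[np.argmax(z_mag)]          # resonance set by the slow current
```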

6.
Spinal motor neurons have voltage-gated ion channels localized in their dendrites that generate plateau potentials. The physical separation of ion channels for spiking from plateau-generating channels can result in nonlinear bistable firing patterns. The physical separation and geometry of the dendrites result in asymmetric coupling between dendrites and soma that has not been addressed in reduced models of nonlinear phenomena in motor neurons. We measured voltage attenuation properties of six anatomically reconstructed and type-identified cat spinal motor neurons to characterize asymmetric coupling between the dendrites and soma. We showed that the voltage attenuation at any distance from the soma was direction-dependent and could be described as a function of the input resistance at the soma. An analytical solution for the lumped cable parameters in a two-compartment model was derived based on this finding. This is the first two-compartment modeling approach that directly derives lumped cable parameters from the geometrical and passive electrical properties of anatomically reconstructed neurons.
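A minimal sketch of the model class discussed, assuming arbitrary passive parameters and representing direction-dependent attenuation by different lumped coupling terms in the two equations (the paper derives these from reconstructed morphology; this is not that derivation):

```python
import numpy as np

def two_compartment(T=0.5, dt=1e-5, C_s=1e-10, C_d=3e-10,
                    g_s=5e-9, g_d=1.5e-8,
                    g_sd=2e-9, g_ds=6e-9, I_soma=2e-10):
    """Passive soma-dendrite model with asymmetric lumped coupling:
    the coupling coefficient seen by the soma (g_ds) differs from the
    one seen by the dendrite (g_sd), so attenuation depends on the
    direction of current flow. Illustrative values only."""
    steps = int(T / dt)
    v_s = v_d = 0.0
    vs, vd = np.empty(steps), np.empty(steps)
    for i in range(steps):
        dv_s = (-g_s * v_s + g_ds * (v_d - v_s) + I_soma) / C_s
        dv_d = (-g_d * v_d + g_sd * (v_s - v_d)) / C_d
        v_s += dt * dv_s
        v_d += dt * dv_d
        vs[i], vd[i] = v_s, v_d
    return vs, vd

vs, vd = two_compartment()
attenuation_soma_to_dendrite = vd[-1] / vs[-1]   # steady-state ratio
```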

7.
Adaptive behavior in unicellular organisms (e.g., bacteria) depends on highly organized networks of proteins governing purposefully the myriad of molecular processes occurring within the cellular system. For instance, bacteria are able to explore the environment within which they develop by utilizing the motility of their flagellar system as well as a sophisticated biochemical navigation system that samples the environmental conditions surrounding the cell, searching for nutrients or moving away from toxic substances or dangerous physical conditions. In this paper we discuss how proteins of the intervening signal transduction network can be modeled as artificial neurons, simulating the dynamical aspects of bacterial taxis. The model is based on the assumption that, in some important aspects, proteins can be considered as processing elements or McCulloch-Pitts artificial neurons that transfer and process information from the bacterium's membrane surface to the flagellar motor. This simulation of bacterial taxis has been carried out on a hardware realization of a McCulloch-Pitts artificial neuron using an operational amplifier. Based on the behavior of the operational amplifier, we produce a model of the interaction between CheY and FliM, elements of the prokaryotic two-component system controlling chemotaxis, as well as a simulation of learning and evolution processes in bacterial taxis. On the one hand, our simulation results indicate that, computationally, these protein 'switches' are similar to McCulloch-Pitts artificial neurons, suggesting a bridge between evolution and learning in dynamical systems at cellular and molecular levels and the evolutive hardware approach. On the other hand, important protein 'tactilizing' properties are not captured by the model, and this suggests further complexity steps to explore in the approach to biological molecular computing.
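Since the abstract treats a protein switch as a McCulloch-Pitts unit, a minimal Python version of such a unit is sketched below; the weights, threshold, and attractant/repellent encoding are invented for illustration and are not taken from the paper's op-amp realization:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """McCulloch-Pitts unit: returns 1 iff the weighted sum of binary
    inputs reaches the threshold. Used here only to illustrate treating
    a protein 'switch' (e.g. CheY-P acting on FliM) as a threshold
    element; weights and threshold are arbitrary."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

# toy chemotaxis decision: repellent input drives, attractant suppresses,
# the switch that would trigger clockwise flagellar rotation (tumbling)
attractant, repellent = 1, 0
tumble = mcculloch_pitts([attractant, repellent], weights=[-1, 1], threshold=1)
```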

8.
The discharge rates of premotor, brain-stem neurons that create eye movements modulate in relation to eye velocity, yet firing rates of extraocular motoneurons contain both eye-position and eye-velocity signals. The eye-position signal is derived from the eye-velocity command by means of a neural network which functions as a temporal integrator. We have previously proposed a network of lateral-inhibitory neurons that is capable of performing the required integration. That analysis centered on the temporal aspects of the signal processing for a limited class of idealized inputs. All of its cells were identical and carried only the integrated signal. Recordings in the brain stem, however, show that neurons in the region of the neural integrator have a variety of background firing rates, all carry some eye-velocity signal as well as the eye-position signal, and carry the former with different strengths depending on the type of eye movement being made. It was necessary to see if the proposed model could be modified to make its neurons more realistic. By modifying the spatial distribution of afferents to the network, we demonstrate that the same basic model functions properly in spite of afferents with nonuniform background firing rates. To introduce the eye-velocity signal, a double-layer network, consisting of inhibitory and excitatory cells, was necessary. By presenting the velocity input to only local regions of this network, it was shown that all cells in the network still carried the integrated signal and that its cells could carry different eye-velocity signals for different types of eye movements. Thus, this model simulates, quantitatively and qualitatively, the behavior of neurons seen in the region of the neural integrator.
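A generic way to see how a recurrent network can act as a temporal integrator is to give its connectivity an eigenvalue of one; the sketch below does this with a uniform excitatory mode and is only a toy stand-in for the lateral-inhibitory, double-layer circuit described in the abstract:

```python
import numpy as np

def integrator_network(n=20, T=2.0, dt=1e-3, tau=0.01):
    """Toy linear rate network whose recurrent matrix has one eigenvalue
    equal to 1 (the uniform mode), so the population activity integrates
    its input: a brief eye-velocity pulse leaves a sustained,
    eye-position-like signal. Generic illustration only."""
    W = np.ones((n, n)) / n                 # uniform mode has eigenvalue 1
    r = np.zeros(n)
    steps = int(T / dt)
    pos = np.empty(steps)
    for i in range(steps):
        velocity = 1.0 if 0.5 < i * dt < 0.7 else 0.0   # brief velocity command
        r += dt / tau * (-r + W @ r + velocity)
        pos[i] = r.mean()                   # persists after the pulse ends
    return pos

pos = integrator_network()
```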

9.
Synchronized oscillation is very commonly observed in many neuronal systems and might play an important role in the response properties of the system. We have studied how spontaneous oscillatory activity affects the responsiveness of a neuronal network, using a neural network model of the visual cortex built from Hodgkin-Huxley type excitatory (E-) and inhibitory (I-) neurons. When the isotropic local E-I and I-E synaptic connections were sufficiently strong, the network commonly generated gamma-frequency oscillatory firing patterns in response to random feed-forward (FF) input spikes. This spontaneous oscillatory network activity injects a periodic local current that can amplify a weak synaptic input and enhance the network's responsiveness. When E-E connections were added, we found that the strength of oscillation can be modulated by varying the FF input strength without any changes in single-neuron properties or interneuron connectivity. The response modulation is proportional to the oscillation strength, which leads to self-regulation such that the cortical network selectively amplifies various FF inputs according to their strength, without requiring any adaptation mechanism. We show that this selective cortical amplification is controlled by E-E cell interactions. We also found that this response amplification is spatially localized, which suggests that the responsiveness modulation may also be spatially selective. This suggests a generalized mechanism by which neural oscillatory activity can enhance the selectivity of a neural network to FF inputs.

10.
An olfactory neuronal network for vapor recognition in an artificial nose
Odorant sensitivity and discrimination in the olfactory system appear to involve extensive neural processing of the primary sensory inputs from the olfactory epithelium. To test formally the functional consequences of such processing, we implemented in an artificial chemosensing system a new analytical approach that is based directly on neural circuits of the vertebrate olfactory system. An array of fiber-optic chemosensors, constructed with response properties similar to those of olfactory sensory neurons, provides time-varying inputs to a computer simulation of the olfactory bulb (OB). The OB simulation produces spatiotemporal patterns of neuronal firing that vary with vapor type. These patterns are then recognized by a delay line neural network (DLNN). In the final output of these two processing steps, vapor identity is encoded by the spatial patterning of activity across units in the DLNN, and vapor intensity is encoded by response latency. The OB-DLNN combination thus separates identity and intensity information into two distinct codes carried by the same output units, enabling discrimination among organic vapors over a range of input signal intensities. In addition to providing a well-defined system for investigating olfactory information processing, this biologically based neuronal network performs better than standard feed-forward neural networks in discriminating vapors when small amounts of training data are used.

11.
Dendritic shaft (Zd) and spine (Zsp) input impedances were computed numerically for sites on hippocampal neurons, using a segmental format of cable calculations. The Zsp values for a typical spine appended onto a dendritic shaft averaged less than 2% higher than the Zd values for the adjacent dendritic shaft. Spine synaptic inputs were simulated by a brief conductance transient, which possessed a time integral of 12 × 10^-10 S·ms. This input resulted in an average peak spine response of 20 mV for both dentate granule neurons and CA1 pyramidal cells. The average spine transient was attenuated less than 2% in conduction across the spine neck, considering peak voltage, waveform parameters, and charge transfer. The spine conductance transient resulted in an average somatic response of 100 μV in the dentate granule neurons, because of passive electrotonic propagation. The same input transient was also applied to proximal and distal sites on CA1 pyramidal cells. The predicted responses at the soma demonstrated a clear difference between the proximal and distal inputs, in terms of both peak voltage and waveform parameters. Thus, the main determinant of the passive propagation of transient electrical signals in these neurons appears to be dendritic branching rather than signal attenuation through the spine neck.

12.
A cable model is presented for a pair of electrotonically coupled neurons to investigate the spatial effects of soma-somatic gap junctions. The model extends that of Poznanski et al. (1995), in which each neuron is represented by a tapered equivalent cable attached to an isopotential soma, with the two somas being electrically coupled. The model is posed generally, so that both active and passive properties can be considered. In the active case a system of nonlinear integral equations is derived for the voltage, whilst in the passive case these have an exact solution that also holds for inputs modelled as synaptic reversal potentials. Analytical and numerical methods are used to examine the sensitivity of the soma potentials (in particular) to the coupling resistance.

13.
14.
Chopper neurons in the cochlear nucleus are characterized by intrinsic oscillations with short average interspike intervals (ISIs) and relative level independence of their response (Pfeiffer, Exp Brain Res 1:220–235, 1966; Blackburn and Sachs, J Neurophysiol 62:1303–1329, 1989), properties which are unattained by models of single chopper neurons (e.g., Rothman and Manis, J Neurophysiol 89:3070–3082, 2003a). In order to achieve short ISIs, we optimized the time constants of the Rothman and Manis single-neuron model with genetic algorithms. Some parameters in the optimization, such as the temperature and the capacitance of the cell, turned out to be crucial for the required acceleration of the response. In order to achieve the relative level independence, we simulated an interconnected network consisting of Rothman and Manis neurons. The results indicate that, by stabilization of intrinsic oscillations, it is possible to simulate the physiologically observed level independence of ISIs. As previously reviewed and demonstrated (Bahmer and Langner, Biol Cybern 95:371–379, 2006a), chopper neurons show a preference for ISIs which are multiples of 0.4 ms. It was also demonstrated that a network consisting of two optimized Rothman and Manis neurons which activate each other with synaptic delays of 0.4 ms shows a preference for ISIs of 0.8 ms. Oscillations with various multiples of 0.4 ms as ISIs may be derived from neurons in a more complex network that is activated by simultaneous input from an onset neuron and several auditory nerve fibers.
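The preference for ISIs at multiples of the synaptic delay can be illustrated with a bare event-driven loop of two mutually exciting units, a deliberately stripped-down stand-in for the optimized Rothman and Manis network that ignores membrane dynamics and onset input:

```python
def chopper_ring(n_events=10, delay_ms=0.4):
    """Two units that excite each other with a fixed synaptic delay:
    each spike triggers the partner one delay later, so each unit fires
    with an interspike interval of 2 * delay_ms (0.8 ms here).
    Membrane dynamics and refractoriness are deliberately omitted."""
    spikes = {0: [0.0], 1: []}          # unit 0 is kicked off at t = 0
    t, unit = 0.0, 0
    for _ in range(n_events):
        t += delay_ms                   # partner fires one delay later
        unit = 1 - unit
        spikes[unit].append(t)
    isi = spikes[0][1] - spikes[0][0]   # 0.8 ms
    return spikes, isi
```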

15.
The complexity of biological neural networks does not allow their biophysical properties to be directly related to the dynamics of their electrical activity. We present a reservoir computing approach for functionally identifying a biological neural network, i.e. for building an artificial system that is functionally equivalent to the reference biological network. Employing feed-forward and recurrent networks with fading memory, i.e. reservoirs, we propose a point-process-based learning algorithm to train the internal parameters of the reservoir and the connectivity between the reservoir and the memoryless readout neurons. Specifically, the model is an Echo State Network (ESN) with leaky integrator neurons, whose individual leakage time constants are also adapted. The proposed ESN algorithm learns a predictive model of stimulus-response relations in in vitro and simulated networks, i.e. it models their response dynamics. Receiver Operating Characteristic (ROC) curve analysis indicates that these ESNs can imitate the response signal of a reference biological network. Reservoir adaptation improved the performance of an ESN over readout-only training methods in many cases. This also held for adaptive feed-forward reservoirs, which had no recurrent dynamics. We demonstrate the predictive power of these ESNs on various tasks with cultured and simulated biological neural networks.
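A sketch of the leaky-integrator Echo State Network update the abstract builds on, with invented reservoir size, leak rate, and spectral radius, and without the point-process readout training or per-neuron time-constant adaptation described by the authors:

```python
import numpy as np

def esn_states(u, n_res=100, leak=0.3, rho=0.9, seed=3):
    """Leaky-integrator ESN state update (illustrative, with a single
    global leak rate rather than per-neuron adapted time constants):
        x <- (1 - a) * x + a * tanh(W x + W_in u)
    The recurrent matrix is rescaled to spectral radius rho so the
    reservoir has fading memory."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0, 1, (n_res, n_res))
    W *= rho / max(abs(np.linalg.eigvals(W)))
    W_in = rng.normal(0, 1, (n_res, 1))
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = (1 - leak) * x + leak * np.tanh(W @ x + (W_in * u_t).ravel())
        states[t] = x
    return states            # a linear readout would be trained on these

states = esn_states(np.sin(np.linspace(0, 20, 500)))
```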

16.
In a simulated neuron with a dendritic tree, the relative effects of active and passive dendritic membranes on transfer properties were studied. The simulations were performed by means of a digital computer. The computations calculated the changes in transmembrane voltages of many compartments over time as a function of other biophysical variables. These variables were synaptic input intensity, critical firing threshold, rate of leakage of current across the membrane, and rate of longitudinal current spread between compartments. For both passive and active dendrites, the transfer properties of the soma were studied for different rates of longitudinal current spread. With low rates of current spread, graded changes in firing threshold produced correspondingly graded changes in output discharge. With high rates of current spread, the neuron became a bistable operator in which spiking was enhanced if the threshold was below a certain level and suppressed if the threshold was above that level. Since alterations in firing threshold were shown to have the same effect on firing rate as alterations in synaptic input intensity, the neuron can be said to change from graded to contrast-enhancing in its response to stimuli of different intensities. The presence or absence of dendritic spiking was found to have a significant effect on the integrative properties of the simulated neuron. In particular, contrast enhancement was considerably more pronounced in neurons with passive than with active dendrites, in that somatic spike rates reached a higher maximum when dendrites were passive. With active dendrites, a less intense input was needed to initiate somatic spiking than with passive dendrites, because a distal dendritic spike could easily propagate by means of longitudinal current spread to the soma. Once somatic spiking was initiated, though, spike rates tended to be lower with active than with passive dendrites, because the soma recovered more slowly from its post-spike refractory period if it was also influenced by refractory periods in the dendrites. The experiment of comparing neurons with active and passive dendrites was repeated at a different, higher value of synaptic input. The same differences in transfer properties between the active and passive cases emerged as before. Spiking patterns in neurons with active dendrites were also affected by the time distribution of synaptic inputs. In a previous study, inputs had been random over both space and time, varying about a predetermined mean, whereas in the present study, inputs were random over space but uniform over time. When inputs were made uniform over time, spiking became more difficult to initiate and the transition from graded to bistable response became less sharp.

17.
Baroni F, Torres JJ, Varona P. PLoS ONE 2010, 5(12): e15023
Neurons react differently to incoming stimuli depending upon their previous history of stimulation. This property can be considered as a single-cell substrate for transient memory, or context-dependent information processing: depending upon the current context that the neuron "sees" through the subset of the network impinging on it in the immediate past, the same synaptic event can evoke a postsynaptic spike or just a subthreshold depolarization. We propose a formal definition of History-Dependent Excitability (HDE) as a measure of the propensity to fire at any moment in time, linking the subthreshold history-dependent dynamics with spike generation. This definition allows the quantitative assessment of the intrinsic memory for different single-neuron dynamics and input statistics. We illustrate the concept of HDE by considering two general dynamical mechanisms: the passive behavior of an Integrate and Fire (IF) neuron, and the inductive behavior of a Generalized Integrate and Fire (GIF) neuron with subthreshold damped oscillations. This framework allows us to characterize the sensitivity of different model neurons to the detailed temporal structure of incoming stimuli. While a neuron with intrinsic oscillations discriminates equally well between input trains with the same or different frequency, a passive neuron discriminates better between inputs with different frequencies. This suggests that passive neurons are better suited to rate-based computation, while neurons with subthreshold oscillations are advantageous in a temporal coding scheme. We also address the influence of intrinsic properties in single-cell processing as a function of input statistics, and show that intrinsic oscillations enhance discrimination sensitivity at high input rates. Finally, we discuss how the recognition of these cell-specific discrimination properties might further our understanding of neuronal network computations and their relationships to the distribution and functional connectivity of different neuronal types.
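To make the IF/GIF contrast concrete, a minimal subthreshold sketch is given below: a generalized integrate-and-fire neuron with a coupled recovery variable rings after a pulse, while setting the coupling to zero recovers the passive case. All parameters are arbitrary and spike generation is omitted:

```python
import numpy as np

def gif_subthreshold(I, dt=1e-4, tau_v=0.02, tau_w=0.05, k=2.0):
    """Subthreshold dynamics of a generalized integrate-and-fire (GIF)
    neuron: a recovery variable w couples back to v and, for these
    parameters, yields damped oscillations; k=0 gives the passive IF
    case. Illustrative values; no threshold or reset."""
    v = w = 0.0
    vs = np.empty(len(I))
    for i, inp in enumerate(I):
        dv = (-v - k * w + inp) / tau_v
        dw = (v - w) / tau_w
        v += dt * dv
        w += dt * dw
        vs[i] = v
    return vs

# brief current pulse: the GIF trace rings, the passive trace does not
I = np.zeros(5000)
I[1000:1100] = 1.0
v_gif = gif_subthreshold(I, k=2.0)
v_if = gif_subthreshold(I, k=0.0)
```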

18.
Interaction mechanisms between excitatory and inhibitory impulse sequences operating on neurons play an important role in the processing of information by the nervous system. For instance, the convergence of excitatory and inhibitory influences on retinal ganglion cells to form their receptive fields has been taken as an example of the process of neuronal sharpening by lateral inhibition. In order to analyze quantitatively the functional behavior of such a system, Shannon's entropy method for multiple-access channels has been applied to biological two-input, one-output systems using the theoretical model developed by Tsukada et al. (1979). Here we extend this procedure with a view to reducing the redundancy of information in the input signal space of single neurons, and attempt to obtain a new interpretation of the information processing of the system. The concept of a redundancy-reducing mechanism in single neurons is examined and discussed for the following two processes. The first process is concerned with a signal space formed by superposing two random sequences on the input of a neuron. In this process, we introduce a coding technique to encode the inhibitory sequence by using the timing of the excitatory sequence, which is closely related to an encoding technique for multiple-access channels with a correlated source (Marko, 1966, 1970, 1973; Slepian and Wolf, 1973) and which is an invariant transformation of the input signal space that does not change the information content of the input. The second process is concerned with a procedure for reducing redundant signals in the signal space mentioned before. In this connection, an important point is to see how single neurons reduce the dimensionality of the signal space via transformation with a minimum loss of effective information. For this purpose we introduce the criterion that the average transmission of information from signal space to the output does not change when redundant signals are added. This assumption is based on the fact that two signals are equivalent if and only if they have identical input-output behavior. The mechanism is examined and estimated by using a computer-simulated model. As the result of such a simulation we can estimate the minimal segmentation of the signal space which is necessary and sufficient for temporal pattern sensitivity in neurons.
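As a toy stand-in for the entropy method mentioned (not the Tsukada et al. procedure), the snippet below computes a plug-in estimate of the information transmitted from a two-input signal space to a single output from discrete samples:

```python
import numpy as np
from collections import Counter

def mutual_information(x1, x2, y):
    """Empirical Shannon information transmitted from the joint input
    (x1, x2), e.g. excitatory and inhibitory spike counts per bin, to
    the output y: I((X1, X2); Y) in bits. Plug-in estimate from
    discrete samples; illustrative only."""
    n = len(y)
    p_xy = Counter(zip(x1, x2, y))
    p_x = Counter(zip(x1, x2))
    p_y = Counter(y)
    info = 0.0
    for (a, b, c), cnt in p_xy.items():
        p_joint = cnt / n
        info += p_joint * np.log2(p_joint * n * n / (p_x[(a, b)] * p_y[c]))
    return info

# toy example: output follows the excitatory input unless inhibition is on
rng = np.random.default_rng(4)
e = rng.integers(0, 2, 10000)
i = rng.integers(0, 2, 10000)
y = e * (1 - i)
bits = mutual_information(e, i, y)
```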

19.
Estimating biologically realistic model neurons from electrophysiological data is a key issue in neuroscience that is central to understanding neuronal function and network behavior. However, directly fitting detailed Hodgkin–Huxley type model neurons to somatic membrane potential data is a notoriously difficult optimization problem that can require hours or days of supercomputing time. Here we extend an efficient technique that indirectly matches neuronal currents derived from somatic membrane potential data to two-compartment model neurons with passive dendrites. In consequence, this approach can fit semi-realistic detailed model neurons in a few minutes. For validation, fits are obtained to model-derived data for various thalamo-cortical neuron types, including fast/regular spiking and bursting neurons. A key aspect of the validation is sensitivity testing to perturbations arising in experimental data, including sampling rates, inadequately estimated membrane dynamics/channel kinetics and intrinsic noise. We find that maximal conductance estimates and the resulting membrane potential fits diverge smoothly and monotonically from near-perfect matches when unperturbed. Curiously, some perturbations have little effect on the error because they are compensated by the fitted maximal conductances. Therefore, the extended current-based technique applies well under moderately inaccurate model assumptions, as required for application to experimental data. Furthermore, the accompanying perturbation analysis gives insights into neuronal homeostasis, whereby tuning intrinsic neuronal properties can compensate for changes from development or neurodegeneration.

20.
Chaos and synchrony in a model of a hypercolumn in visual cortex
Neurons in cortical slices emit spikes or bursts of spikes regularly in response to a suprathreshold current injection. This behavior is in marked contrast to the behavior of cortical neurons in vivo, whose response to electrical or sensory input displays a strong degree of irregularity. Correlation measurements show a significant degree of synchrony in the temporal fluctuations of neuronal activities in cortex. We explore the hypothesis that these phenomena are the result of the synchronized chaos generated by the deterministic dynamics of local cortical networks. A model of a hypercolumn in the visual cortex is studied. It consists of two populations of neurons, one inhibitory and one excitatory. The dynamics of the neurons are based on a Hodgkin-Huxley type model of excitable voltage-clamped cells with several cellular and synaptic conductances. A slow potassium current is included in the dynamics of the excitatory population to reproduce the observed adaptation of the spike trains emitted by these neurons. The pattern of connectivity has a spatial structure which is correlated with the internal organization of hypercolumns in orientation columns. Numerical simulations of the model show that in an appropriate parameter range, the network settles in a synchronous chaotic state, characterized by a strong temporal variability of the neural activity which is correlated across the hypercolumn. Strong inhibitory feedback is essential for the stabilization of this state. These results show that the cooperative dynamics of large neuronal networks are capable of generating variability and synchrony similar to those observed in cortex. Auto-correlation and cross-correlation functions of neuronal spike trains are computed, and their temporal and spatial features are analyzed. In other parameter regimes, the network exhibits two additional states: synchronized oscillations and an asynchronous state. We use our model to study cortical mechanisms for orientation selectivity. It is shown that in a suitable parameter regime, when the input is not oriented, the network has a continuum of states, each representing an inhomogeneous population activity which is peaked at one of the orientation columns. As a result, when a weakly oriented input stimulates the network, it yields a sharp orientation tuning. The properties of the network in this regime, including the appearance of virtual rotations and broad stimulus-dependent cross-correlations, are investigated. The results agree with the predictions of the mean-field theory which was previously derived for a simplified model of stochastic, two-state neurons. The relation between the results of the model and experiments in visual cortex is discussed.
