Similar Documents
 20 similar documents found.
1.
Two observations about the cortex have puzzled neuroscientists for a long time. First, neural responses are highly variable. Second, the level of excitation and inhibition received by each neuron is tightly balanced at all times. Here, we demonstrate that both properties are necessary consequences of neural networks that represent information efficiently in their spikes. We illustrate this insight with spiking networks that represent dynamical variables. Our approach is based on two assumptions: We assume that information about dynamical variables can be read out linearly from neural spike trains, and we assume that neurons only fire a spike if that improves the representation of the dynamical variables. Based on these assumptions, we derive a network of leaky integrate-and-fire neurons that is able to implement arbitrary linear dynamical systems. We show that the membrane voltage of the neurons is equivalent to a prediction error about a common population-level signal. Among other things, our approach allows us to construct an integrator network of spiking neurons that is robust against many perturbations. Most importantly, neural variability in our networks cannot be equated to noise. Despite exhibiting the same single-unit properties as widely used population code models (e.g., tuning curves, Poisson-distributed spike trains), balanced networks are orders of magnitude more reliable. Our approach suggests that spikes do matter when considering how the brain computes, and that the reliability of cortical representations could have been strongly underestimated.
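A minimal Python sketch of the core idea in this abstract, written for this listing only: a population of leaky neurons whose membrane voltage is the readout error projected onto each neuron's decoding weight, firing only when a spike improves the representation. It tracks a given signal rather than implementing an arbitrary linear dynamical system, and all parameters, weights and the test signal are illustrative assumptions, not the paper's actual network.

```python
import numpy as np

# Toy spike-coding network: each neuron's "voltage" is the prediction error
# projected onto its decoding weight, and it fires only if crossing its
# threshold (the cost of its own spike) reduces the readout error.
rng = np.random.default_rng(0)
N, dt, T, lam = 20, 1e-3, 2.0, 10.0              # neurons, step (s), duration (s), readout leak (1/s)
G = rng.choice([-0.1, 0.1], size=N)              # decoding weights (half positive, half negative)
thresh = G ** 2 / 2.0                            # per-neuron firing threshold
steps = int(T / dt)
x = np.sin(2 * np.pi * np.arange(steps) * dt)    # signal to be represented
x_hat, x_hat_trace = 0.0, np.zeros(steps)

for k in range(steps):
    V = G * (x[k] - x_hat)                       # membrane voltage = projected prediction error
    i = int(np.argmax(V - thresh))
    if V[i] > thresh[i]:                         # spike only if it improves the representation
        x_hat += G[i]                            # the spike instantly updates the linear readout
    x_hat += dt * (-lam * x_hat)                 # leaky decay of the readout between spikes
    x_hat_trace[k] = x_hat

print("readout RMSE:", np.sqrt(np.mean((x - x_hat_trace) ** 2)))
```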

2.
This paper proposes an extension to the model of a spiking neuron for information processing in artificial neural networks, developing a new approach for the dynamic threshold of the integrate-and-fire neuron. This new approach invokes characteristics of biological neurons such as the behavior of chemical synapses and the receptor field. We demonstrate how such a digital model of spiking neurons can solve complex nonlinear classification with a single neuron, performing experiments for the classical XOR problem. Compared with rate-coded networks and the classical integrate-and-fire model, the trained network demonstrated faster information processing, requiring fewer neurons and shorter learning periods. The extended model validates all the logic functions of biological neurons when such functions are necessary for the proper flow of binary codes through a neural network.

3.
Mejias JF, Torres JJ. PLoS ONE. 2011;6(3):e17255
In this work we study the detection of weak stimuli by spiking (integrate-and-fire) neurons in the presence of a certain level of noisy background neural activity. Our study focuses on the realistic assumption that the synapses in the network exhibit activity-dependent processes, such as short-term synaptic depression and facilitation. Employing mean-field techniques as well as numerical simulations, we found that there are two possible noise levels which optimize signal transmission. This new finding is in contrast with the classical theory of stochastic resonance, which is able to predict only one optimal level of noise. We found that the complex interplay between the adaptive neuron threshold and activity-dependent synaptic mechanisms is responsible for this new phenomenology. Our main results are confirmed by employing a more realistic FitzHugh-Nagumo neuron model, which displays threshold variability, as well as by considering more realistic stochastic synaptic models and realistic signals such as Poissonian spike trains.
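A rough Python sketch of the kind of stochastic-resonance experiment described here: a plain leaky integrate-and-fire neuron driven by a subthreshold sinusoidal "weak stimulus" plus white background noise whose amplitude is swept, with detection quantified by the spectral signal-to-noise ratio of the output spike train. The paper's model additionally includes depressing/facilitating synapses and an adaptive threshold, which this sketch omits; all parameter values are assumptions chosen only for illustration.

```python
import numpy as np

def output_snr(noise_sigma, dt=1e-4, T=10.0, tau=0.02, v_th=1.0,
               I0=0.8, A=0.1, f=5.0, seed=1):
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    t = np.arange(steps) * dt
    xi = rng.standard_normal(steps)
    v = 0.0
    spike_train = np.zeros(steps)
    for k in range(steps):
        I = I0 + A * np.sin(2 * np.pi * f * t[k])        # weak, subthreshold signal
        v += dt / tau * (-v + I) + noise_sigma * np.sqrt(dt) * xi[k]
        if v >= v_th:                                     # threshold crossing: spike and reset
            v = 0.0
            spike_train[k] = 1.0
    if spike_train.sum() == 0:
        return 0.0
    # power at the signal frequency relative to the local noise floor
    power = np.abs(np.fft.rfft(spike_train - spike_train.mean())) ** 2
    freqs = np.fft.rfftfreq(steps, dt)
    sig_bin = int(np.argmin(np.abs(freqs - f)))
    noise_floor = np.median(power[max(sig_bin - 50, 1): sig_bin + 50])
    return power[sig_bin] / noise_floor

for sigma in [0.0, 0.3, 0.6, 1.0, 2.0, 4.0]:
    print(f"noise amplitude {sigma:.1f} -> output SNR {output_snr(sigma):.1f}")
```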

4.
5.
6.
The voltage trace of neuronal activities can follow multiple timescale dynamics that arise from correlated membrane conductances. Such processes can result in power-law behavior in which the membrane voltage cannot be characterized with a single time constant. The emergent effect of these membrane correlations is a non-Markovian process that can be modeled with a fractional derivative. A fractional derivative is a non-local process in which the value of the variable is determined by integrating a temporally weighted voltage trace, also called the memory trace. Here we developed and analyzed a fractional leaky integrate-and-fire model in which the exponent of the fractional derivative can vary from 0 to 1, with 1 representing the normal derivative. As the exponent of the fractional derivative decreases, the weights of the voltage trace increase. Thus, the value of the voltage is increasingly correlated with the trajectory of the voltage in the past. By varying only the fractional exponent, our model can reproduce upward and downward spike adaptations found experimentally in neocortical pyramidal cells and tectal neurons in vitro. The model also produces spikes with longer first-spike latency and high inter-spike variability with a power-law distribution. We further analyze spike adaptation and the responses to noisy and oscillatory input. The fractional model generates reliable spike patterns in response to noisy input. Overall, the spiking activity of the fractional leaky integrate-and-fire model deviates from the spiking activity of the Markovian model and reflects the temporal accumulated intrinsic membrane dynamics that affect the response of the neuron to external stimulation.
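A minimal Python sketch of a fractional leaky integrate-and-fire neuron, using an explicit Grünwald-Letnikov discretisation of the fractional derivative; the paper's exact numerical scheme and parameters may differ, and the values below are illustrative assumptions. Setting alpha = 1 recovers the usual Markovian LIF update.

```python
import numpy as np

def fractional_lif(alpha=0.6, dt=1e-3, T=1.0, tau=0.05, I=1.2,
                   v_th=1.0, v_reset=0.0):
    steps = int(T / dt)
    # Grunwald-Letnikov weights: c_0 = 1, c_j = c_{j-1} * (1 - (alpha+1)/j)
    c = np.ones(steps + 1)
    for j in range(1, steps + 1):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    v_hist = [0.0]                        # full voltage history = memory trace
    spikes = []
    for n in range(steps):
        f = (-v_hist[-1] + I) / tau       # ordinary LIF right-hand side
        # fractional update: the new voltage depends on the *whole* past trace
        memory = sum(c[j] * v_hist[-j] for j in range(1, len(v_hist) + 1))
        v_new = dt**alpha * f - memory
        if v_new >= v_th:                 # spike and reset; the reset stays in the trace
            spikes.append((n + 1) * dt)
            v_new = v_reset
        v_hist.append(v_new)
    return spikes

print("number of spikes:", len(fractional_lif()), "first few (s):", fractional_lif()[:5])
```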

7.
For simulations of large spiking neuron networks, an accurate, simple and versatile single-neuron modeling framework is required. Here we explore the versatility of a simple two-equation model: the adaptive exponential integrate-and-fire neuron. We show that this model generates multiple firing patterns depending on the choice of parameter values, and present a phase diagram describing the transition from one firing type to another. We give an analytical criterion to distinguish between continuous adaptation, initial bursting, regular bursting and two types of tonic spiking. Also, we report that the deterministic model is capable of producing irregular spiking when stimulated with constant current, indicating low-dimensional chaos. Lastly, the simple model is fitted to experimental recordings of cortical neurons under step-current stimulation. The results provide support for the suitability of simple models such as the adaptive exponential integrate-and-fire neuron for large network simulations.
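For reference, a short Python sketch of the two-equation adaptive exponential integrate-and-fire (aEIF) model under a constant step current. The parameter values are the commonly quoted Brette & Gerstner set, used here only for illustration; the paper explores how varying them moves the model between tonic spiking, adaptation and bursting.

```python
import numpy as np

C, gL, EL = 281.0, 30.0, -70.6          # pF, nS, mV
VT, DT = -50.4, 2.0                     # mV
tau_w, a, b = 144.0, 4.0, 80.5          # ms, nS, pA
V_reset, V_peak = -70.6, 20.0           # mV
dt, T, I = 0.1, 500.0, 1000.0           # ms, ms, pA (constant step current)

V, w, spike_times = EL, 0.0, []
for k in range(int(T / dt)):
    # membrane equation with exponential spike initiation and adaptation current w
    dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) - w + I) / C
    dw = (a * (V - EL) - w) / tau_w
    V += dt * dV
    w += dt * dw
    if V >= V_peak:                     # spike: reset voltage, increment adaptation
        spike_times.append(k * dt)
        V = V_reset
        w += b
print("spike times (ms):", [round(t, 1) for t in spike_times[:8]])
```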

8.
The response of a population of neurons to time-varying synaptic inputs can show a rich phenomenology, hardly predictable from the dynamical properties of the membrane’s inherent time constants. For example, a network of neurons in a state of spontaneous activity can respond significantly more rapidly than each single neuron taken individually. Under the assumption that the statistics of the synaptic input is the same for a population of similarly behaving neurons (mean field approximation), it is possible to greatly simplify the study of neural circuits, both in the case in which the statistics of the input are stationary (reviewed in La Camera et al. in Biol Cybern, 2008) and in the case in which they are time varying and unevenly distributed over the dendritic tree. Here, we review theoretical and experimental results on the single-neuron properties that are relevant for the dynamical collective behavior of a population of neurons. We focus on the response of integrate-and-fire neurons and real cortical neurons to long-lasting, noisy, in vivo-like stationary inputs and show how the theory can predict the observed rhythmic activity of cultures of neurons. We then show how cortical neurons adapt on multiple time scales in response to input with stationary statistics in vitro. Next, we review how it is possible to study the general response properties of a neural circuit to time-varying inputs by estimating the response of single neurons to noisy sinusoidal currents. Finally, we address the dendrite–soma interactions in cortical neurons leading to gain modulation and spike bursts, and show how these effects can be captured by a two-compartment integrate-and-fire neuron. Most of the experimental results reviewed in this article have been successfully reproduced by simple integrate-and-fire model neurons.

9.
Long-range dependence (LRD) has been observed in a variety of phenomena in nature, and for several years also in the spiking activity of neurons. Often, this is interpreted as originating from a non-Markovian system. Here we show that a purely Markovian integrate-and-fire (IF) model, with a noisy slow adaptation term, can generate interspike intervals (ISIs) that appear to have LRD. However, a proper analysis shows that this is not the case asymptotically. For comparison, we also consider a new model of an individual IF neuron with fractional (non-Markovian) noise. The correlations of its spike trains are studied and proven to have LRD, unlike classical IF models. On the other hand, to correctly measure long-range dependence, it is usually necessary to know if the data are stationary. Thus, a methodology to evaluate the stationarity of the ISIs is presented and applied to the various IF models. We explain that Markovian IF models may seem to have LRD because of non-stationarities.

10.
11.
Florian RV. PLoS ONE. 2012;7(8):e40233
In many cases, neurons process information carried by the precise timings of spikes. Here we show how neurons can learn to generate specific temporally precise output spikes in response to input patterns of spikes having precise timings, thus processing and memorizing information that is entirely temporally coded, both as input and as output. We introduce two new supervised learning rules for spiking neurons with temporal coding of information (chronotrons), one that provides high memory capacity (E-learning), and one that has a higher biological plausibility (I-learning). With I-learning, the neuron learns to fire the target spike trains through synaptic changes that are proportional to the synaptic currents at the timings of real and target output spikes. We study these learning rules in computer simulations where we train integrate-and-fire neurons. Both learning rules allow neurons to fire at the desired timings, with sub-millisecond precision. We show how chronotrons can learn to classify their inputs, by firing identical, temporally precise spike trains for different inputs belonging to the same class. When the input is noisy, the classification also leads to noise reduction. We compute lower bounds for the memory capacity of chronotrons and explore the influence of various parameters on chronotrons' performance. The chronotrons can model neurons that encode information in the time of the first spike relative to the onset of salient stimuli or neurons in oscillatory networks that encode information in the phases of spikes relative to the background oscillation. Our results show that firing one spike per cycle optimizes memory capacity in neurons encoding information in the phase of firing relative to a background rhythm.  相似文献   
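A heavily simplified, I-learning-style training loop in Python, meant only to illustrate the idea that weights change in proportion to the synaptic currents evaluated at the desired and the actually produced output spike times. The neuron model, kernels, learning rate and the single-target task below are assumptions for this sketch; the paper's exact E-learning and I-learning rules differ in detail.

```python
import numpy as np

rng = np.random.default_rng(3)
n_steps, n_in, dt = 200, 50, 1e-3
tau_m, tau_s, v_th, lr = 0.02, 0.005, 1.0, 0.05
in_spikes = rng.random((n_steps, n_in)) < 5.0 * dt          # 5 Hz Poisson inputs
target_step = 120                                           # desired output spike at 120 ms
w = rng.normal(0.5, 0.1, n_in)

for epoch in range(200):
    v, syn = 0.0, np.zeros(n_in)
    out_steps, syn_trace = [], np.zeros((n_steps, n_in))
    for k in range(n_steps):
        syn += -dt / tau_s * syn + in_spikes[k]             # per-synapse exponential current
        syn_trace[k] = syn
        v += dt / tau_m * (-v + w @ syn)
        if v >= v_th:
            out_steps.append(k)
            v = 0.0
    # I-learning-style update: push synaptic currents at the target time up,
    # and currents at the (wrong) actual spike times down.
    dw = syn_trace[target_step].copy()
    for k in out_steps:
        dw -= syn_trace[k]
    w += lr * dw

print("output spike times (ms) after training:", [k * dt * 1e3 for k in out_steps])
```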

12.
One of the reasons the visual cortex has attracted the interest of computational neuroscience is that it has well-defined inputs. The lateral geniculate nucleus (LGN) of the thalamus is the source of visual signals to the primary visual cortex (V1). Most large-scale cortical network models approximate the spike trains of LGN neurons as simple Poisson point processes. However, many studies have shown that neurons in the early visual pathway are capable of spiking with high temporal precision and their discharges are not Poisson-like. To gain an understanding of how response variability in the LGN influences the behavior of V1, we study response properties of model V1 neurons that receive purely feedforward inputs from LGN cells modeled either as noisy leaky integrate-and-fire (NLIF) neurons or as inhomogeneous Poisson processes. We first demonstrate that the NLIF model is capable of reproducing many experimentally observed statistical properties of LGN neurons. Then we show that a V1 model in which the LGN input to a V1 neuron is modeled as a group of NLIF neurons produces higher orientation selectivity than the one with Poisson LGN input. The second result implies that statistical characteristics of LGN spike trains are important for V1’s function. We conclude that physiologically motivated models of V1 need to include more realistic LGN spike trains that are less noisy than inhomogeneous Poisson processes.
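A small Python sketch contrasting the two LGN input models compared above: an inhomogeneous Poisson process and a noisy LIF (NLIF) neuron driven by the same time-varying rate. The rate profile, the rate-to-current mapping and all parameters are illustrative assumptions; the point is only that the NLIF spike counts come out less variable (lower Fano factor) than Poisson counts.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, T = 1e-3, 5.0
t = np.arange(0, T, dt)
rate = 20.0 + 15.0 * np.sin(2 * np.pi * 4.0 * t)       # Hz, time-varying drive

# (a) inhomogeneous Poisson: spike in a bin with probability rate*dt
poisson_spikes = rng.random(t.size) < rate * dt

# (b) noisy LIF driven by a current proportional to the same rate
tau, v_th, v = 0.02, 1.0, 0.0
nlif_spikes = np.zeros(t.size, dtype=bool)
for k in range(t.size):
    I = rate[k] / 20.0                                  # crude rate-to-current mapping
    v += dt / tau * (-v + I) + 0.2 * np.sqrt(dt) * rng.standard_normal()
    if v >= v_th:
        nlif_spikes[k] = True
        v = 0.0

# 250 ms windows = exactly one modulation period, so rate modulation does not
# inflate the count variance
for name, s in [("Poisson", poisson_spikes), ("NLIF", nlif_spikes)]:
    counts = s.reshape(-1, 250).sum(axis=1)
    print(name, "Fano factor:", counts.var() / counts.mean())
```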

13.
The study of several aspects of the collective dynamics of interacting neurons can be highly simplified if one assumes that the statistics of the synaptic input is the same for a large population of similarly behaving neurons (mean field approach). In particular, under such an assumption, it is possible to determine and study all the equilibrium points of the network dynamics when the neuronal response to noisy, in vivo-like, synaptic currents is known. The response function can be computed analytically for simple integrate-and-fire neuron models and it can be measured directly in experiments in vitro. Here we review theoretical and experimental results about the neural response to noisy inputs with stationary statistics. These response functions are important to characterize the collective neural dynamics that are proposed to be the neural substrate of working memory, decision making and other cognitive functions. Applications to the case of time-varying inputs are reviewed in a companion paper (Giugliano et al. in Biol Cybern, 2008). We conclude that modified integrate-and-fire neuron models are good enough to reproduce faithfully many of the relevant dynamical aspects of the neuronal response measured in experiments on real neurons in vitro.
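As an example of the kind of stationary response function discussed here, the sketch below evaluates the standard diffusion-approximation (Siegert) firing rate of a leaky integrate-and-fire neuron receiving noisy input with mean mu and noise amplitude sigma. The parameter values (in volts and seconds) are illustrative assumptions, not taken from the review.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import erf

def lif_rate(mu, sigma, tau_m=0.020, tau_ref=0.002, v_th=0.020, v_reset=0.0):
    # Siegert formula: rate = 1 / (tau_ref + tau_m * sqrt(pi) * integral),
    # where the integrand is exp(u^2) * (1 + erf(u)) between the rescaled
    # reset and threshold values.
    lower = (v_reset - mu) / sigma
    upper = (v_th - mu) / sigma
    integral, _ = quad(lambda u: np.exp(u ** 2) * (1.0 + erf(u)), lower, upper)
    return 1.0 / (tau_ref + tau_m * np.sqrt(np.pi) * integral)

for mu in [0.010, 0.015, 0.018, 0.020, 0.025]:
    print(f"mu = {mu * 1e3:.0f} mV  ->  rate = {lif_rate(mu, sigma=0.005):.1f} Hz")
```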

14.
We systematically investigate the detectability of a weak electric field in a noisy neural network based on the Izhikevich neuron model. The network is composed of excitatory and inhibitory neurons in a ratio similar to that of the mammalian neocortex, and axonal conduction delays between neurons are also considered. We find that the noise intensity can modulate the detectability of the weak electric field. A stochastic resonance (SR) phenomenon induced by white noise is observed when the weak electric field is applied to the network. Interestingly, SR almost disappears when the connections between neurons are removed, suggesting that neural coupling amplifies the synchronization of neuronal spiking. Furthermore, network parameters such as the connection probability, the synaptic coupling strength, the size of the neuron population and the neuron heterogeneity can also affect the detectability of the weak electric field. Finally, the model's sensitivity is studied in detail, and the results show that the neural network model has an optimal region for detecting the weak electric field signal.
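A stripped-down Python sketch in the spirit of this study: a small network of Izhikevich neurons (4:1 excitatory/inhibitory split and parameter heterogeneity as in Izhikevich's classic demo network), driven by noise plus a weak sinusoidal "electric field" term common to all neurons. Conduction delays and the paper's detection measure are omitted, and the field amplitude, noise level and network size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
Ne, Ni = 80, 20
re, ri = rng.random(Ne), rng.random(Ni)
a = np.concatenate([0.02 * np.ones(Ne), 0.02 + 0.08 * ri])
b = np.concatenate([0.2 * np.ones(Ne), 0.25 - 0.05 * ri])
c = np.concatenate([-65 + 15 * re ** 2, -65 * np.ones(Ni)])
d = np.concatenate([8 - 6 * re ** 2, 2 * np.ones(Ni)])
S = np.hstack([0.5 * rng.random((Ne + Ni, Ne)), -rng.random((Ne + Ni, Ni))])
v = -65.0 * np.ones(Ne + Ni)
u = b * v
noise_sigma, E_field, f = 5.0, 1.0, 10.0              # noise level, weak 10 Hz field

firings = []
for t in range(1000):                                  # 1000 ms, 1 ms steps
    I = noise_sigma * rng.standard_normal(Ne + Ni)     # background noise
    I += E_field * np.sin(2 * np.pi * f * t / 1000.0)  # weak common field
    fired = np.where(v >= 30.0)[0]
    firings.extend((t, n) for n in fired)
    v[fired] = c[fired]
    u[fired] += d[fired]
    I += S[:, fired].sum(axis=1)                       # synaptic input from this step's spikes
    v += 0.5 * (0.04 * v ** 2 + 5 * v + 140 - u + I)   # two half-steps for numerical stability
    v += 0.5 * (0.04 * v ** 2 + 5 * v + 140 - u + I)
    u += a * (b * v - u)
print("total spikes in 1 s:", len(firings))
```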

15.
We present a general analysis of highly connected recurrent neural networks which are able to learn and retrieve a finite number of static patterns. The arguments are based on spike trains and their interval distribution and require no specific model of a neuron. In particular, they apply to formal two-state neurons as well as to more refined models like the integrate-and-fire neuron or the Hodgkin-Huxley equations. We show that the mean firing rate, defined as the inverse of the mean interval length, is the only relevant parameter (apart from the synaptic weights) that determines the existence of retrieval solutions with a large overlap with one of the learnt patterns. The statistics of the spiking noise (Gaussian, Poisson or other) and hence the shape of the interval distribution does not matter. Thus, our unifying approach explains why, and when, all the different associative networks which treat static patterns yield basically the same results, i.e., belong to the same universality class.

16.
The ability of spiking neurons to synchronize their activity in a network depends on the response behavior of these neurons as quantified by the phase response curve (PRC) and on coupling properties. The PRC characterizes the effects of transient inputs on spike timing and can be measured experimentally. Here we use the adaptive exponential integrate-and-fire (aEIF) neuron model to determine how subthreshold and spike-triggered slow adaptation currents shape the PRC. Based on that, we predict how synchrony and phase locked states of coupled neurons change in the presence of synaptic delays and unequal coupling strengths. We find that increased subthreshold adaptation currents cause a transition of the PRC from only phase advances to phase advances and delays in response to excitatory perturbations. Increased spike-triggered adaptation currents on the other hand predominantly skew the PRC to the right. Both adaptation-induced changes of the PRC are modulated by spike frequency, being more prominent at lower frequencies. Applying phase reduction theory, we show that subthreshold adaptation stabilizes synchrony for pairs of coupled excitatory neurons, while spike-triggered adaptation causes locking with a small phase difference, as long as synaptic heterogeneities are negligible. For inhibitory pairs synchrony is stable and robust against conduction delays, and adaptation can mediate bistability of in-phase and anti-phase locking. We further demonstrate that stable synchrony and bistable in/anti-phase locking of pairs carry over to synchronization and clustering of larger networks. The effects of adaptation in aEIF neurons on PRCs and network dynamics qualitatively reflect those of biophysical adaptation currents in detailed Hodgkin-Huxley-based neurons, which underscores the utility of the aEIF model for investigating the dynamical behavior of networks. Our results suggest neuronal spike frequency adaptation as a mechanism synchronizing low frequency oscillations in local excitatory networks, but indicate that inhibition rather than excitation generates coherent rhythms at higher frequencies.
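To make the PRC concept concrete, here is a short Python sketch of how a phase response curve is measured numerically: find the unperturbed period of a tonically firing neuron, deliver a small perturbation at different phases, and record the resulting shift of the next spike. The neuron below is a plain leaky integrate-and-fire model without adaptation, chosen only to keep the example short; the paper performs this analysis on the aEIF model with adaptation currents.

```python
import numpy as np

dt, tau, v_th, v_reset, I0, kick = 1e-4, 0.02, 1.0, 0.0, 1.5, 0.02

def time_to_next_spike(perturb_time=None):
    v, t = v_reset, 0.0
    while True:
        v += dt / tau * (-v + I0)                     # constant-current LIF update
        t += dt
        if perturb_time is not None and perturb_time <= t < perturb_time + dt:
            v += kick                                 # small depolarizing perturbation
        if v >= v_th:
            return t

T0 = time_to_next_spike()                             # unperturbed period
phases = np.linspace(0.05, 0.95, 10)
prc = [(T0 - time_to_next_spike(perturb_time=phi * T0)) / T0 for phi in phases]
for phi, dphi in zip(phases, prc):
    print(f"phase {phi:.2f}: phase advance {dphi:+.3f}")
```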

17.
18.
Capturing the response behavior of spiking neuron models with rate-based models facilitates the investigation of neuronal networks using powerful methods for rate-based network dynamics. To this end, we investigate the responses of two widely used neuron model types, the Izhikevich and augmented multi-adaptive threshold (AMAT) models, to a range of spiking inputs ranging from step responses to natural spike data. We find (i) that linear-nonlinear firing rate models fitted to test data can be used to describe the firing-rate responses of AMAT and Izhikevich spiking neuron models in many cases; (ii) that firing-rate responses are generally too complex to be captured by first-order low-pass filters but require bandpass filters instead; (iii) that linear-nonlinear models capture the response of AMAT models better than of Izhikevich models; (iv) that the wide range of response types evoked by current-injection experiments collapses to few response types when neurons are driven by stationary or sinusoidally modulated Poisson input; and (v) that AMAT and Izhikevich models show different responses to spike input despite identical responses to current injections. Together, these findings suggest that rate-based models of network dynamics may capture a wider range of neuronal response properties by incorporating second-order bandpass filters fitted to responses of spiking model neurons. These models may contribute to bringing rate-based network modeling closer to the reality of biological neuronal networks.
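A small Python sketch of the kind of rate-based description discussed here: a linear-nonlinear (LN) model whose linear stage is a bandpass filter (difference of two exponentials, so it has no DC gain) followed by a static rectifying nonlinearity. The filter time constants, gains, baseline and the step input are illustrative assumptions, not fitted values from the paper.

```python
import numpy as np

dt, T = 1e-3, 2.0
t = np.arange(0, T, dt)
stimulus = (t > 0.5).astype(float)                     # step input at 0.5 s

tau_fast, tau_slow = 0.01, 0.05
kernel_t = np.arange(0, 0.3, dt)
kernel = np.exp(-kernel_t / tau_fast) / tau_fast - np.exp(-kernel_t / tau_slow) / tau_slow
kernel *= dt                                           # discretized bandpass filter

linear = np.convolve(stimulus, kernel)[: t.size]       # linear (bandpass) stage
rate = 20.0 * np.maximum(linear + 0.2, 0.0)            # static rectifying nonlinearity (Hz)

print("peak rate %.1f Hz at t = %.3f s, steady rate %.1f Hz"
      % (rate.max(), t[rate.argmax()], rate[-1]))
```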

19.
Information about the external world is delivered to the brain in the form of spike trains structured in time. During further processing in higher areas, this information is subjected to a certain condensation process, which results in the formation of abstract conceptual images of the external world, apparently represented as fairly uniform spiking activity partially independent of the details of the input spike trains. A possible physical mechanism of condensation at the level of an individual neuron was discussed recently. In a reverberating spiking neural network, this mechanism should cause the dynamics to settle into the same uniform/periodic activity in response to a set of various inputs. Since the same periodic activity may correspond to different input spike trains, we interpret this as a possible candidate for an information condensation mechanism in a network. Our purpose is to test this possibility in a network model consisting of five fully connected neurons, in particular the influence of the geometric size of the network on its ability to condense information. The dynamics of 20 spiking neural networks of different geometric sizes are modelled by means of computer simulation. Each network is propelled into reverberating dynamics by applying various initial input spike trains, and we run the dynamics until it becomes periodic. Shannon's formula is used to calculate the amount of information in any input spike train and in any periodic state found. As a result, we obtain an explicit estimate of the degree of information condensation in the networks, and conclude that it depends strongly on the network's geometric size.
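One simple way to apply Shannon's formula to a spike train, sketched in Python: bin the train, treat each bin as a binary symbol, and sum the per-bin entropies under an independence assumption. The paper's exact binning and state definition may differ; the bin width and the example train below are assumptions for illustration only.

```python
import numpy as np

def spike_train_information(spike_times, t_max, bin_width=1e-3):
    n_bins = int(np.ceil(t_max / bin_width))
    binary = np.zeros(n_bins, dtype=int)
    binary[(np.asarray(spike_times) / bin_width).astype(int)] = 1
    p = binary.mean()                       # probability of a spike per bin
    if p in (0.0, 1.0):
        return 0.0
    h_bin = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return n_bins * h_bin                   # bits, assuming independent bins

spikes = [0.013, 0.051, 0.072, 0.130, 0.190]
print("information: %.1f bits" % spike_train_information(spikes, t_max=0.2))
```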

20.
It has recently been shown that networks of spiking neurons with noise can emulate simple forms of probabilistic inference through “neural sampling”, i.e., by treating spikes as samples from a probability distribution of network states that is encoded in the network. Deficiencies of the existing model are its reliance on single neurons for sampling from each random variable, and the resulting limitation in representing quickly varying probabilistic information. We show that both deficiencies can be overcome by moving to a biologically more realistic encoding of each salient random variable through the stochastic firing activity of an ensemble of neurons. The resulting model demonstrates that networks of spiking neurons with noise can easily track and carry out basic computational operations on rapidly varying probability distributions, such as the odds of getting rewarded for a specific behavior. We demonstrate the viability of this new approach towards neural coding and computation, which makes use of the inherent parallelism of generic neural circuits, by showing that this model can explain experimentally observed firing activity of cortical neurons for a variety of tasks that require rapid temporal integration of sensory information.
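A minimal Python sketch of what "neural sampling" computes in its most stripped-down form: binary variables z whose joint distribution is Boltzmann, p(z) proportional to exp(b·z + ½ zᵀWz), sampled by updating one variable at a time with a sigmoidal firing probability. Plain Gibbs updates stand in here for the stochastic spiking neurons and ensembles used in the paper; the weights and biases are random illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
W = rng.normal(0, 0.8, (n, n))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)                       # symmetric weights, no self-coupling
b = rng.normal(0, 0.5, n)                      # biases (e.g. evidence or reward odds)

z = rng.integers(0, 2, n)
samples = np.zeros((20000, n))
for s in range(samples.shape[0]):
    k = s % n                                  # update one "neuron" per step
    u = b[k] + W[k] @ z                        # local membrane-potential-like quantity
    z[k] = rng.random() < 1.0 / (1.0 + np.exp(-u))   # fire (z_k = 1) with sigmoid probability
    samples[s] = z

print("marginal P(z_k = 1):", samples[5000:].mean(axis=0).round(2))
```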
