Similar Literature

20 similar articles retrieved (search time: 0 ms)
1.
Multiple unit activity in deep layers of the frontal and motor cortices was recorded by chronically implanted semimicroelectrodes in waking cats with different levels of food motivation. From four to seven neuronal spike trains were selected from the recorded multiunit activity. Interactions between neighbouring neurons in the motor and frontal areas of the neocortex (within local neuronal networks) and between neurons of these areas (distributed neuronal networks) were estimated by means of statistical cross-correlation analysis of spike trains within the range of delays from 0 to 100 ms. Neurons in the local networks were divided into two subgroups: neurons with higher spike amplitudes, dominated by divergent connections, and neurons with lower spike amplitudes, dominated by convergent connections. Strong monosynaptic connections (discharges with a delay of less than 2 ms) between the neurons with high- and low-amplitude spikes formed the basis of the local networks. Connections between low-amplitude neurons in the frontal cortex and high-amplitude neurons in the motor cortex dominated in the distributed networks. A 24-hour food deprivation predominantly altered the late interneuronal cross-correlations with time delays within the range of 2-100 ms in both local and distributed networks.
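The cross-correlation analysis described above can be sketched as follows (synthetic spike trains rather than the study's recordings; the 3 ms delay and firing rates are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two binary spike trains at 1 ms resolution (synthetic). Neuron B tends to
# fire 3 ms after neuron A, mimicking a short-latency synaptic connection.
T = 20000                                        # duration in ms
a = (rng.random(T) < 0.02).astype(int)           # ~20 Hz Poisson-like train
b = np.zeros(T, dtype=int)
b[3:] = a[:-3]                                   # B echoes A with a 3 ms delay
b |= (rng.random(T) < 0.01).astype(int)          # plus independent background

def cross_correlogram(ref, tgt, max_lag):
    """Count coincidences of target spikes at each lag after reference spikes."""
    lags = np.arange(max_lag + 1)
    counts = np.array([np.sum(ref[:T - lag] * tgt[lag:]) for lag in lags])
    return lags, counts

lags, counts = cross_correlogram(a, b, 100)
peak_lag = lags[np.argmax(counts)]
print(peak_lag)  # the correlogram peaks at the imposed 3 ms delay
```

A sharp peak at a lag below 2 ms would be read, as in the study, as a putative monosynaptic connection; here the peak sits at the delay built into the surrogate data.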

2.
Characterizing metastable neural dynamics in finite-size spiking networks remains a daunting challenge. We propose to address this challenge in the recently introduced replica-mean-field (RMF) limit. In this limit, networks are made of infinitely many replicas of the finite network of interest, but with randomized interactions across replicas. Such randomization renders certain excitatory networks fully tractable at the cost of neglecting activity correlations, but with explicit dependence on the finite size of the neural constituents. However, metastable dynamics typically unfold in networks with mixed inhibition and excitation. Here, we extend the RMF computational framework to point-process-based neural network models with exponential stochastic intensities, allowing for mixed excitation and inhibition. Within this setting, we show that metastable finite-size networks admit multistable RMF limits, which are fully characterized by stationary firing rates. Technically, these stationary rates are determined as the solutions of a set of delay differential equations under certain regularity conditions that any physical solution must satisfy. We solve this problem by combining the resolvent formalism and singular-perturbation theory. Importantly, we find that these rates specify probabilistic pseudo-equilibria which accurately capture the neural variability observed in the original finite-size network. We also discuss the emergence of metastability as a stochastic bifurcation, which can be interpreted as a static phase transition in the RMF limits. In turn, we expect to leverage the static picture of RMF limits to infer purely dynamical features of metastable finite-size networks, such as the transition rates between pseudo-equilibria.

3.
The joint influence of recurrent feedback and noise on gain control in a network of globally coupled spiking leaky integrate-and-fire neurons is studied theoretically and numerically. The context of our work is the origin of divisive versus subtractive gain control, as mixtures of these effects are seen in a variety of experimental systems. We focus on changes in the slope of the mean firing frequency-versus-input bias (fI) curve when the gain control signal to the cells comes from the cells’ output spikes. Feedback spikes are modeled as alpha functions that produce an additive current in the current balance equation. For generality, they occur after a fixed minimum delay. We show that purely divisive gain control, i.e. changes in the slope of the fI curve, arises naturally with this additive negative or positive feedback, due to the linearizing action of feedback. Negative feedback alone lowers the gain, accounting in particular for gain changes in weakly electric fish upon pharmacological opening of the feedback loop as reported by Bastian (J Neurosci 6:553–562, 1986). When negative feedback is sufficiently strong it further causes oscillatory firing patterns which produce irregularities in the fI curve. Small positive feedback alone increases the gain, but larger amounts cause abrupt jumps to higher firing frequencies. On the other hand, noise alone in open loop linearizes the fI curve around threshold, and produces mixtures of divisive and subtractive gain control. With both noise and feedback, the combined gain control schemes produce a primarily divisive gain control shift, indicating the robustness of feedback gain control in stochastic networks. Similar results are found when the “input” parameter is the contrast of a time-varying signal rather than the bias current. Theoretical results are derived relating the slope of the fI curve to feedback gain and noise strength. Good agreement with simulation results is found for inhibitory and excitatory feedback. Finally, divisive feedback is also found for conductance-based feedback (shunting or excitatory) with and without noise. This article is part of a special issue on Neuronal Dynamics of Sensory Coding.
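A minimal numerical illustration of one ingredient of this story, that noise linearizes the fI curve around threshold (a sketch with a single leaky integrate-and-fire neuron and invented parameters, not the paper's network model):

```python
import numpy as np

rng = np.random.default_rng(1)

def lif_rate(bias, noise_std, T=5000.0, dt=0.1, tau=10.0, vth=1.0, vreset=0.0):
    """Mean firing rate (Hz) of a noisy leaky integrate-and-fire neuron
    integrated with the Euler-Maruyama scheme."""
    v, spikes = 0.0, 0
    for _ in range(int(T / dt)):
        v += dt / tau * (bias - v) + noise_std * np.sqrt(dt / tau) * rng.standard_normal()
        if v >= vth:
            spikes += 1
            v = vreset
    return spikes / (T / 1000.0)

# With zero noise, a subthreshold bias (0.9 < vth) never fires; noise smooths
# the hard threshold, producing spikes and a graded, more linear fI relation.
silent = lif_rate(bias=0.9, noise_std=0.0)
noisy = lif_rate(bias=0.9, noise_std=0.3)
print(silent, noisy)
```

Sweeping the bias over a range of values and plotting rate against bias at several noise levels reproduces the threshold-softening effect that mixes subtractive and divisive gain changes near threshold.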

4.
Networks of neurons produce diverse patterns of oscillations, arising from the network's global properties, the propensity of individual neurons to oscillate, or a mixture of the two. Here we describe noisy limit cycles and quasi-cycles, two related mechanisms underlying emergent oscillations in neuronal networks whose individual components, stochastic spiking neurons, do not themselves oscillate. Both mechanisms are shown to produce gamma-band oscillations at the population level while individual neurons fire at a rate much lower than the population frequency. Spike trains in a network undergoing noisy limit cycles display a preferred period which is not found in the case of quasi-cycles, due to the even faster decay of phase information in quasi-cycles. These oscillations persist in sparsely connected networks, and varying the network's connectivity varies the oscillation frequency. A network of such neurons behaves as a stochastic perturbation of the deterministic Wilson-Cowan equations, and it undergoes noisy limit cycles or quasi-cycles depending on whether the deterministic equations have a limit cycle or a weakly stable focus. These mechanisms provide a new perspective on the emergence of rhythmic firing in neural networks, showing the coexistence of population-level oscillations with very irregular individual spike trains in a simple and general framework.
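The population-level mechanism can be sketched with stochastic Wilson-Cowan-type rate equations (illustrative parameters, not the paper's): around a stable focus, noise sustains fluctuations that would otherwise decay away, which is the quasi-cycle regime.

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):
    """Sigmoidal population response function."""
    return 1.0 / (1.0 + np.exp(-x))

# Euler-Maruyama integration of noisy Wilson-Cowan-type E-I rate equations.
dt, steps = 0.1, 20000
wee, wei, wie, wii = 10.0, 12.0, 10.0, 2.0   # coupling strengths (invented)
he, hi, sigma = 1.0, -3.0, 0.02              # external inputs, noise amplitude
E = np.zeros(steps); I = np.zeros(steps)
E[0], I[0] = 0.3, 0.2
for t in range(steps - 1):
    E[t + 1] = E[t] + dt * (-E[t] + f(wee * E[t] - wei * I[t] + he)) \
        + sigma * np.sqrt(dt) * rng.standard_normal()
    I[t + 1] = I[t] + dt * (-I[t] + f(wie * E[t] - wii * I[t] + hi)) \
        + sigma * np.sqrt(dt) * rng.standard_normal()

# Without noise the trajectory would settle onto a fixed point; with noise the
# excitatory rate keeps fluctuating around it indefinitely.
late = E[steps // 2:]
print(late.std())
```

Taking the power spectrum of the late portion of `E` would reveal whether the fluctuations carry a preferred frequency, which is how noisy limit cycles and quasi-cycles are told apart in practice.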

5.
Spiking Neural Networks, the latest generation of Artificial Neural Networks, are characterized by their bio-inspired nature and by a higher computational capacity than other neural models. In real biological neurons, stochastic processes represent an important mechanism of neural behavior and are responsible for their special arithmetic capabilities. In this work we present a simple hardware implementation of spiking neurons that accounts for this probabilistic nature. The advantage of the proposed implementation is that it is fully digital and can therefore be massively implemented in Field Programmable Gate Arrays. The high computational capabilities of the proposed model are demonstrated by the study of both feed-forward and recurrent networks that are able to implement high-speed signal filtering and to solve complex systems of linear equations.
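The flavor of such a probabilistic digital neuron can be sketched in software (the LFSR-comparator scheme below is a common trick in FPGA stochastic computing; the register width and threshold here are invented, not taken from the paper):

```python
# A digital "stochastic" neuron: each clock cycle, an integer membrane register
# is compared against a pseudo-random number from a linear-feedback shift
# register (LFSR); a spike is emitted when the draw falls below the membrane
# value, so the mean firing probability tracks the normalized potential.

def lfsr16(state):
    """16-bit Fibonacci LFSR with taps 16, 14, 13, 11 (maximal length)."""
    bit = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
    return (state >> 1) | (bit << 15)

state, membrane, spikes, cycles = 0xACE1, 20000, 0, 50000
for _ in range(cycles):
    state = lfsr16(state)
    spikes += state < membrane    # stochastic threshold comparison

# The empirical firing probability approximates membrane / 2**16.
print(spikes / cycles, membrane / 2 ** 16)
```

In hardware the comparison and shift are a handful of gates per neuron, which is what makes massive FPGA replication attractive.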

6.
Determining the physical structure of the neuronal circuits that governs neuronal responses is an important goal for brain research. With rapid advances in large-scale recording techniques, identification of neuronal circuits with multiple neurons and stages or layers has become possible and highly sought after. Although methods for mapping the connection structure of circuits have developed greatly in recent years, they are mostly limited to simple scenarios involving a few neurons in a pairwise fashion; dissecting dynamical circuits, and particularly mapping out a complete functional circuit that converges onto a single neuron, remains a challenge. Here, we show that a recent method, termed spike-triggered non-negative matrix factorization (STNMF), can address these issues. By simulating different scenarios of spiking neural networks with various connections between neurons and stages, we demonstrate that STNMF is an effective method for dissecting functional connections within a circuit. Using spiking activities recorded at neurons of the output layer, STNMF can recover a complete circuit consisting of all cascade computational components of presynaptic neurons, as well as their spiking activities. For simulated simple and complex cells of the primary visual cortex, STNMF allows us to dissect the pathway of visual computation. Taken together, these results suggest that STNMF could provide a useful approach for investigating neuronal systems by leveraging recorded functional neuronal activity.
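A toy version of the STNMF idea (assumptions: a 1-D white-noise stimulus, two invented rectified subunits, and plain multiplicative-update NMF standing in for the authors' full pipeline):

```python
import numpy as np

rng = np.random.default_rng(3)

# A model cell pools two rectified "subunits" over a white-noise stimulus;
# NMF applied to the spike-triggered stimulus ensemble should separate them.
D, N = 16, 30000
stim = rng.standard_normal((N, D))
sub1, sub2 = np.zeros(D), np.zeros(D)
sub1[2:6] = 1.0; sub2[10:14] = 1.0            # ground-truth subunit filters
drive = np.maximum(stim @ sub1, 0) + np.maximum(stim @ sub2, 0)
spikes = rng.random(N) < 0.05 * drive         # Bernoulli spiking
V = stim[spikes]
V = V - V.min()                               # shift ensemble to non-negative

def nmf(V, rank, iters=200, eps=1e-9):
    """Plain multiplicative-update NMF (Lee-Seung style)."""
    n, m = V.shape
    W, H, errs = rng.random((n, rank)), rng.random((rank, m)), []
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
        errs.append(np.linalg.norm(V - W @ H))
    return W, H, errs

W, H, errs = nmf(V, rank=2)
print(errs[0], errs[-1])   # reconstruction error decreases over iterations
```

In this toy setting the rows of `H` tend to localize over the two subunit positions, which is the structural information STNMF extracts from real output-layer recordings.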

7.
Massive synaptic pruning following over-growth is a general feature of mammalian brain maturation. This article studies the synaptic pruning that occurs in large networks of simulated spiking neurons in the absence of specific input patterns of activity. The evolution of connections between neurons was governed by a novel bio-inspired spike-timing-dependent synaptic plasticity (STDP) modification rule that included a slow decay term. The network reached a steady state with a bimodal distribution of synaptic weights that were either incremented to the maximum value or decremented to the lowest value. After 10^6 time steps the number of synapses that remained active was below 10% of the number of initially active synapses, independently of network size. The synaptic modification rule did not introduce spurious biases in the geometrical distribution of the remaining active projections. The results show that, under certain conditions, the model is capable of generating spontaneously emergent cell assemblies.
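An abstract caricature of how a pairing-driven rule with a slow decay term drives weights to a bimodal distribution and prunes most synapses (the event statistics below are invented; this is not the paper's STDP rule):

```python
import numpy as np

rng = np.random.default_rng(4)

n_syn, steps = 2000, 5000
w = rng.uniform(0.4, 0.6, n_syn)              # initial mid-range weights
a_plus, a_minus, decay = 0.02, 0.02, 0.0002
# Per-synapse probability that a spike pairing is causal (pre-before-post);
# synapses embedded in correlated activity see mostly causal pairings.
causal = rng.uniform(0.3, 0.7, n_syn)

for _ in range(steps):
    paired = rng.random(n_syn) < 0.1          # a spike pairing occurred
    potentiate = rng.random(n_syn) < causal   # causal vs anti-causal pairing
    w += paired * np.where(potentiate, a_plus, -a_minus)
    w -= decay                                # slow decay term
    np.clip(w, 0.0, 1.0, out=w)

extreme = np.mean((w < 0.1) | (w > 0.9))
surviving = np.mean(w > 0.9)
print(extreme, surviving)  # most weights sit at the bounds; a minority survive
```

With hard bounds, weights drift to the extremes, so the distribution becomes bimodal and the low-end weights can be pruned without changing network behavior.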

8.
9.
Borisyuk R. BioSystems 2002, 67(1-3): 3-16
We study the dynamics of activity in neural networks of enhanced integrate-and-fire elements (with random noise, refractory periods, signal propagation delay, decay of postsynaptic potential, etc.). We consider networks composed of two interacting populations of excitatory and inhibitory neurons with all-to-all or random sparse connections. Computer simulations show that the regime of regular oscillations is very stable over a broad range of parameter values. In particular, oscillations are possible even in the case of very sparse and randomly distributed inhibitory connections and high background activity. We describe two scenarios for the onset of oscillations, similar to the Andronov-Hopf and saddle-node-on-limit-cycle bifurcations of dynamical systems. The role of oscillatory dynamics in information encoding and processing is discussed.

10.
The construction of a Spiking Neural Network (SNN), i.e. the choice of an appropriate topology and the configuration of its internal parameters, represents a great challenge for SNN-based applications. Evolutionary Algorithms (EAs) offer an elegant solution for these challenges, and methods capable of exploring both types of search spaces simultaneously appear to be the most promising. A variety of such heterogeneous optimization algorithms have emerged recently, in particular in the field of probabilistic optimization. In this paper, a literature review on heterogeneous optimization algorithms is presented and an example of probabilistic optimization of an SNN is discussed in detail. The paper provides an experimental analysis of a novel Heterogeneous Multi-Model Estimation of Distribution Algorithm (hMM-EDA). First, practical guidelines for configuring the method are derived, and then the performance of hMM-EDA is compared to state-of-the-art optimization algorithms. Results show that hMM-EDA is a lightweight, fast and reliable optimization method that requires the configuration of only a few parameters. Its performance on a synthetic heterogeneous benchmark problem is highly competitive and suggests its suitability for the optimization of SNNs.
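A univariate EDA on a toy bit-string problem shows the core loop that hMM-EDA elaborates (sketch; this is UMDA/PBIL-style with invented population sizes, not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(5)

# Estimation of Distribution Algorithm on OneMax: maintain a probability
# vector, sample a population from it, select the best samples, and
# re-estimate the probabilities from them.
n_bits, pop_size, elite, iters = 40, 60, 15, 60
p = np.full(n_bits, 0.5)
for _ in range(iters):
    pop = (rng.random((pop_size, n_bits)) < p).astype(int)
    fitness = pop.sum(axis=1)                  # OneMax: count of ones
    best = pop[np.argsort(fitness)[-elite:]]   # truncation selection
    p = 0.7 * p + 0.3 * best.mean(axis=0)      # smoothed model update

solution = (p > 0.5).astype(int)
print(solution.sum())  # close to the all-ones optimum (40)
```

hMM-EDA extends this loop with multiple probabilistic models so that discrete structural choices and continuous parameters of an SNN can be searched simultaneously.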

11.
Being permanently confronted with an uncertain world, brains have faced evolutionary pressure to represent this uncertainty in order to respond appropriately. Often, this requires visiting multiple interpretations of the available information or multiple solutions to an encountered problem. This gives rise to the so-called mixing problem: since all of these “valid” states represent powerful attractors, yet can be very dissimilar from one another, switching between them can be difficult. We propose that cortical oscillations can be effectively used to overcome this challenge. By acting as an effective temperature, background spiking activity modulates exploration. Rhythmic changes induced by cortical oscillations can then be interpreted as a form of simulated tempering. We provide a rigorous mathematical discussion of this link and study some of its phenomenological implications in computer simulations. This identifies a new computational role of cortical oscillations and connects them to various phenomena in the brain, such as sampling-based probabilistic inference, memory replay, multisensory cue combination, and place cell flickering.
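The simulated-tempering reading can be illustrated outside any network model: Metropolis sampling of a bimodal distribution with a rhythmically modulated temperature escapes modes that trap a constant-low-temperature sampler (all parameters below are invented):

```python
import math
import random

random.seed(6)

def energy(x):
    return 4.0 * (x * x - 1.0) ** 2           # double well, modes at x = +/-1

def count_mode_switches(temps, steps=20000, prop=0.2):
    """Metropolis random walk; the temperature schedule cycles once per step."""
    x, side, switches = 1.0, 1, 0
    for t in range(steps):
        T = temps[t % len(temps)]
        y = x + random.gauss(0.0, prop)
        dE = energy(y) - energy(x)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            x = y
        s = 1 if x > 0 else -1
        if s != side:
            switches, side = switches + 1, s
    return switches

period = 200                                   # one "oscillation cycle" in steps
oscillating = [1.5 if t < period // 2 else 0.1 for t in range(period)]
hot_cold = count_mode_switches(oscillating)
cold = count_mode_switches([0.1])
print(hot_cold, cold)  # rhythmic heating yields mode hopping; constant cold does not
```

In the paper's reading, the high-temperature half-cycle corresponds to the phase of strong background spiking, which flattens the energy landscape and lets the network hop between attractors.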

12.
13.
Can the topology of a recurrent spiking network be inferred from observed activity dynamics? Which statistical parameters of network connectivity can be extracted from firing rates, correlations and related measurable quantities? To approach these questions, we analyze distance-dependent correlations of the activity in small-world networks of neurons with current-based synapses, derived from a simple ring topology. We find that, in particular, the distribution of correlation coefficients of subthreshold activity can distinguish random networks from networks with distance-dependent connectivity. Such distributions can be estimated by sampling from random pairs. We also demonstrate the crucial role of the weight distribution, most notably compliance with Dale's principle, for the activity dynamics in recurrent networks of different types.

14.
Spontaneous activity in biological neural networks shows patterns of dynamic synchronization. We propose that these patterns support the formation of a small-world structure—network connectivity optimal for distributed information processing. We present numerical simulations with connected Hindmarsh–Rose neurons in which, starting from random connection distributions, small-world networks evolve as a result of applying an adaptive rewiring rule. The rule connects pairs of neurons that tend to fire in synchrony, and disconnects those that fail to synchronize. Repeated application of the rule leads to small-world structures. This mechanism is robustly observed for bursting and irregular firing regimes.
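The rewiring rule itself can be sketched with a fixed synthetic "synchrony" matrix standing in for the Hindmarsh-Rose dynamics (everything below is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

n = 40
sync = rng.random((n, n))                 # surrogate pairwise synchrony scores
sync = (sync + sync.T) / 2
np.fill_diagonal(sync, 0.0)

pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
A = np.zeros((n, n), dtype=bool)          # adjacency matrix
for k in rng.choice(len(pairs), size=80, replace=False):
    i, j = pairs[k]
    A[i, j] = A[j, i] = True              # start from 80 random edges

def mean_edge_sync(A):
    return sync[np.triu(A)].mean()

before = mean_edge_sync(A)
for _ in range(60):
    # Connect the most synchronized unconnected pair ...
    add = max((p for p in pairs if not A[p]), key=lambda p: sync[p])
    # ... and disconnect the least synchronized connected pair.
    cut = min((p for p in pairs if A[p]), key=lambda p: sync[p])
    A[add] = A[add[::-1]] = True
    A[cut] = A[cut[::-1]] = False
after = mean_edge_sync(A)
print(before, after)   # edges concentrate on synchronous pairs
```

In the full model the synchrony scores are recomputed from the neuronal dynamics after each rewiring step, and it is this feedback loop that steers the topology toward a small-world structure.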

15.
Several efforts are currently underway to decipher the connectome, or parts thereof, in a variety of organisms. Ascertaining the detailed physiological properties of all the neurons in these connectomes, however, is beyond the scope of such projects. It is therefore unclear to what extent knowledge of the connectome alone will advance a mechanistic understanding of computation occurring in these neural circuits, especially when the high-level function of the circuit in question is unknown. We consider here the question of how the wiring diagram of neurons imposes constraints on what neural circuits can compute when we cannot assume detailed information on the physiological response properties of the neurons. We call such constraints—which arise by virtue of the connectome—connectomic constraints on computation. For feedforward networks equipped with neurons that obey a deterministic spiking neuron model satisfying a small number of properties, we ask whether, just by knowing the architecture of a network, we can rule out computations that it could be doing, no matter what response properties each of its neurons may have. We show results of this form for certain classes of network architectures. On the other hand, we also prove that with the limited set of properties assumed for our model neurons, there are fundamental limits to the constraints imposed by network structure. Thus, our theory suggests that while connectomic constraints might restrict the computational ability of certain classes of network architectures, more elaborate information on the properties of neurons in the network may be required before such results can be discerned for other classes of networks.

16.
During development, the mammalian brain differentiates into specialized regions with distinct functional abilities. While many factors contribute to functional specialization, we explore the effect of neuronal density on the development of neuronal interactions in vitro. Two types of cortical networks, namely, dense and sparse with 50,000 and 12,500 total cells, respectively, are studied. Activation graphs that represent pairwise neuronal interactions are constructed using a competitive first response model. These graphs reveal that, during development in vitro, dense networks form activation connections earlier than sparse networks. Link entropy analysis of dense network activation graphs suggests that the majority of connections between electrodes are reciprocal in nature. Information theoretic measures reveal that early functional information interactions (among three electrodes) are synergetic in both dense and sparse networks. However, during later stages of development, previously synergetic relationships become primarily redundant in dense, but not in sparse networks. Large link entropy values in the activation graph are related to the domination of redundant ensembles in late stages of development in dense networks. Results demonstrate differences between dense and sparse networks in terms of informational groups, pairwise relationships, and activation graphs. These differences suggest that variations in cell density may result in different functional specializations of nervous system tissue in vivo.

17.
The numerical simulation of spiking neural networks requires particular attention. On the one hand, time-stepping methods are generic but they are prone to numerical errors and need specific treatments to deal with the discontinuities of integrate-and-fire models. On the other hand, event-driven methods are more precise but they are restricted to a limited class of neuron models. We present here a voltage-stepping scheme that combines the advantages of these two approaches and consists of a discretization of the voltage state-space. The numerical simulation is reduced to a local event-driven method that induces an implicit activity-dependent time discretization (time-steps automatically increase when the neuron is slowly varying). We show analytically that such a scheme leads to a high-order algorithm so that it accurately approximates the neuronal dynamics. The voltage-stepping method is generic and can be used to simulate any kind of neuron models. We illustrate it on nonlinear integrate-and-fire models and show that it outperforms time-stepping schemes of Runge-Kutta type in terms of simulation time and accuracy.
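For the simplest case, a leaky integrate-and-fire neuron with constant input, voltage-stepping can be written down in a few lines, because the time to cross each voltage level has a closed form (illustrative units; the actual scheme handles time-varying inputs via local event-driven integration):

```python
import math

# Voltage-stepping for dv/dt = (I - v)/tau with constant input I: the membrane
# trajectory v(t) = I + (v0 - I)exp(-t/tau) is advanced one voltage level dv
# at a time, and each step's duration is computed analytically. Time steps
# grow automatically as v nears threshold, where the dynamics slow down.
tau, I, vth, v0, dv = 10.0, 1.2, 1.0, 0.0, 0.001

t, v, durations = 0.0, v0, []
while v < vth:
    v_next = min(v + dv, vth)
    dt = tau * math.log((I - v) / (I - v_next))   # exact level-crossing time
    durations.append(dt)
    t, v = t + dt, v_next

exact = tau * math.log((I - v0) / (I - vth))      # closed-form first-spike time
print(t, exact, durations[-1] / durations[0])     # later steps are longer
```

The accumulated step durations reproduce the closed-form spike time, and the implicit time step near threshold is several times longer than at reset, which is the activity-dependent discretization the article exploits.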

18.
Self-organized criticality refers to the spontaneous emergence of self-similar dynamics in complex systems poised between order and randomness. The presence of self-organized critical dynamics in the brain is theoretically appealing and is supported by recent neurophysiological studies. Despite this, the neurobiological determinants of these dynamics have not been previously sought. Here, we systematically examined the influence of such determinants in hierarchically modular networks of leaky integrate-and-fire neurons with spike-timing-dependent synaptic plasticity and axonal conduction delays. We characterized emergent dynamics in our networks by distributions of active neuronal ensemble modules (neuronal avalanches) and rigorously assessed these distributions for power-law scaling. We found that spike-timing-dependent synaptic plasticity enabled a rapid phase transition from random subcritical dynamics to ordered supercritical dynamics. Importantly, modular connectivity and low wiring cost broadened this transition, and enabled a regime indicative of self-organized criticality. The regime only occurred when modular connectivity, low wiring cost and synaptic plasticity were simultaneously present, and the regime was most evident when between-module connection density scaled as a power-law. The regime was robust to variations in other neurobiologically relevant parameters and favored systems with low external drive and strong internal interactions. Increases in system size and connectivity facilitated internal interactions, permitting reductions in external drive and facilitating convergence of postsynaptic-response magnitude and synaptic-plasticity learning rate parameter values towards neurobiologically realistic levels. We hence infer a novel association between self-organized critical neuronal dynamics and several neurobiologically realistic features of structural connectivity. 
The central role of these features in our model may reflect their importance for neuronal information processing.
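A branching-process caricature captures the avalanche statistics at stake (a stand-in for the full plastic network; offspring numbers are Poisson and the branching ratio m plays the role of the control parameter):

```python
import numpy as np

rng = np.random.default_rng(8)

# Each active neuron activates Poisson(m) others in the next generation.
# At the critical point m = 1, avalanche sizes become scale-free, so the
# largest events dwarf those of a subcritical (m < 1) system.
def avalanche_sizes(m, n_trials=2000, cap=10000):
    sizes = []
    for _ in range(n_trials):
        active, size = 1, 1
        while active and size < cap:
            active = rng.poisson(m * active)   # next generation of activations
            size += active
        sizes.append(min(size, cap))
    return np.array(sizes)

sub = avalanche_sizes(0.5)
crit = avalanche_sizes(1.0)
print(sub.max(), crit.max())   # critical dynamics produce far larger avalanches
```

Assessing power-law scaling in the critical size distribution, as the study does for its avalanche data, then amounts to fitting and testing the tail of `crit`.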

19.
20.
Seung HS. Neuron 2003, 40(6): 1063-1073
It is well-known that chemical synaptic transmission is an unreliable process, but the function of such unreliability remains unclear. Here I consider the hypothesis that the randomness of synaptic transmission is harnessed by the brain for learning, in analogy to the way that genetic mutation is utilized by Darwinian evolution. This is possible if synapses are "hedonistic," responding to a global reward signal by increasing their probabilities of vesicle release or failure, depending on which action immediately preceded reward. Hedonistic synapses learn by computing a stochastic approximation to the gradient of the average reward. They are compatible with synaptic dynamics such as short-term facilitation and depression and with the intricacies of dendritic integration and action potential generation. A network of hedonistic synapses can be trained to perform a desired computation by administering reward appropriately, as illustrated here through numerical simulations of integrate-and-fire model neurons.
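The core of the hedonistic-synapse rule can be sketched for a single synapse on a task that rewards successful vesicle release (learning rate and parameterization invented; Seung's formulation covers whole networks):

```python
import math
import random

random.seed(9)

def p_of(q):
    """Release probability as a sigmoid of the synaptic parameter q."""
    return 1.0 / (1.0 + math.exp(-q))

q, lr = 0.0, 0.2
for _ in range(2000):
    p = p_of(q)
    released = random.random() < p            # stochastic vesicle release
    reward = 1.0 if released else 0.0         # this task rewards release
    eligibility = (1.0 if released else 0.0) - p
    q += lr * reward * eligibility            # REINFORCE-style update:
                                              # stochastic gradient of reward

print(p_of(q))  # release probability climbs toward 1
```

The update increases release probability when release preceded reward and decreases it when failure did, so in expectation the synapse ascends the gradient of the average reward, as in the paper's analysis.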


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号