Similar Articles
20 similar articles found.
1.
This report continues our research into the effectiveness of adaptive synaptogenesis in constructing feed-forward networks which perform good transformations on their inputs. Good transformations are characterized by the maintenance of input information and the removal of statistical dependence. Adaptive synaptogenesis stochastically builds and sculpts a synaptic connectivity in initially unconnected networks using two mechanisms. The first, synaptogenesis, creates new, excitatory, feed-forward connections. The second, associative modification, adjusts the strength of existing synapses. Our previous implementations of synaptogenesis only incorporated a postsynaptic regulatory process, receptivity to new innervation (Adelsberger-Mangan and Levy 1993a, b). In the present study, a presynaptic regulatory process, presynaptic avidity, which regulates the tendency of a presynaptic neuron to participate in a new synaptic connection as a function of its total synaptic weight, is incorporated into the synaptogenesis process. In addition, we investigate a third mechanism, selective synapse removal. This process removes synapses between neurons whose firing is poorly correlated. Networks that are constructed with the presynaptic regulatory process maintain more information and remove more statistical dependence than networks constructed with postsynaptic receptivity and associative modification alone. Selective synapse removal also improves network performance, but only when implemented in conjunction with the presynaptic regulatory process. Received: 20 August 1993/Accepted in revised form: 16 April 1994  相似文献   
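To make the three mechanisms above concrete, here is a minimal toy sketch of synaptogenesis gated by postsynaptic receptivity and presynaptic avidity, associative modification of existing weights, and removal of poorly correlated synapses. The exponential receptivity/avidity forms, the binary rate-coded neurons, the pruning schedule, and all parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out, T = 20, 10, 5000
W = np.zeros((n_in, n_out))          # initially unconnected feed-forward weights

def step(x_rate=0.2):
    """One presentation: binary input pattern, linear-threshold outputs."""
    x = (rng.random(n_in) < x_rate).astype(float)
    y = ((x @ W) > 0.5).astype(float)
    return x, y

cov = np.zeros((n_in, n_out))        # running pre/post correlation estimate
for t in range(T):
    x, y = step()
    # 1) synaptogenesis: receptivity falls with total input weight (postsynaptic),
    #    avidity falls with total output weight (presynaptic) -- assumed forms
    recept = np.exp(-W.sum(axis=0))          # per output neuron
    avid   = np.exp(-W.sum(axis=1))          # per input neuron
    p_new  = 0.01 * np.outer(avid, recept) * (W == 0)
    W[rng.random(W.shape) < p_new] = 0.05    # new weak excitatory synapse
    # 2) associative (Hebbian-style) modification of existing synapses
    W += 0.01 * np.outer(x, y) * (W > 0)
    W = np.clip(W, 0.0, 1.0)
    # 3) track pre/post correlation and periodically prune poorly correlated synapses
    cov = 0.999 * cov + 0.001 * np.outer(x - x.mean(), y - y.mean())
    if t % 500 == 499:
        W[(W > 0) & (cov < 0.0)] = 0.0       # selective synapse removal

print("synapses per output neuron:", (W > 0).sum(axis=0))
```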

2.
This study compares the ability of excitatory, feed-forward neural networks to construct good transformations on their inputs. The quality of such a transformation is judged by the minimization of two information measures: the information loss of the transformation and the statistical dependency of the output. The networks that are compared differ from each other in the parametric properties of their neurons and in their connectivity. The particular network parameters studied are output firing threshold, synaptic connectivity, and associative modification of connection weights. The network parameters that most directly affect firing levels are threshold and connectivity. Networks incorporating neurons with dynamic threshold adjustment produce better transformations. When firing threshold is optimized, sparser synaptic connectivity produces a better transformation than denser connectivity. Associative modification of synaptic weights confers only a slight advantage in the construction of optimal transformations. Additionally, our research shows that some environments are better suited than others for recoding. Specifically, input environments high in statistical dependence, i.e. those environments most in need of recoding, are more likely to undergo successful transformations.  相似文献   

3.
Spike-timing-dependent plasticity (STDP) is believed to structure neuronal networks by slowly changing the strengths (or weights) of the synaptic connections between neurons depending upon their spiking activity, which in turn modifies the neuronal firing dynamics. In this paper, we investigate the change in synaptic weights induced by STDP in a recurrently connected network in which the input weights are plastic but the recurrent weights are fixed. The inputs are divided into two pools with identical constant firing rates and equal within-pool spike-time correlations, but with no between-pool correlations. Our analysis uses the Poisson neuron model in order to predict the evolution of the input synaptic weights and focuses on the asymptotic weight distribution that emerges due to STDP. The learning dynamics induces a symmetry breaking for the individual neurons, namely for sufficiently strong within-pool spike-time correlation each neuron specializes to one of the input pools. We show that the presence of fixed excitatory recurrent connections between neurons induces a group symmetry-breaking effect, in which neurons tend to specialize to the same input pool. Consequently STDP generates a functional structure on the input connections of the network.  相似文献   
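For readers unfamiliar with the learning rule itself, the sketch below implements generic pair-based additive STDP with exponential windows, the family of rules such analyses build on. The amplitudes and time constants are illustrative, and the Poisson-neuron network framework of the paper is not reproduced here.

```python
import numpy as np

# Pair-based additive STDP: potentiate when pre precedes post, depress otherwise.
A_plus, A_minus = 0.005, 0.0053      # illustrative amplitudes (slight LTD bias)
tau_plus, tau_minus = 17e-3, 34e-3   # illustrative time constants (s)

def stdp_dw(pre_times, post_times):
    """Total weight change from all pre/post spike pairs (all-to-all pairing)."""
    dw = 0.0
    for t_post in post_times:
        for t_pre in pre_times:
            dt = t_post - t_pre
            if dt > 0:                       # pre before post -> potentiation
                dw += A_plus * np.exp(-dt / tau_plus)
            elif dt < 0:                     # post before pre -> depression
                dw -= A_minus * np.exp(dt / tau_minus)
    return dw

rng = np.random.default_rng(1)
pre  = np.sort(rng.uniform(0, 1, 20))        # ~20 Hz Poisson-like pre spikes (s)
post = np.sort(rng.uniform(0, 1, 20))
print("net weight change:", stdp_dw(pre, post))
```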

4.
Florian RV. PLoS ONE 2012, 7(8): e40233
In many cases, neurons process information carried by the precise timings of spikes. Here we show how neurons can learn to generate specific temporally precise output spikes in response to input patterns of spikes having precise timings, thus processing and memorizing information that is entirely temporally coded, both as input and as output. We introduce two new supervised learning rules for spiking neurons with temporal coding of information (chronotrons), one that provides high memory capacity (E-learning), and one that has a higher biological plausibility (I-learning). With I-learning, the neuron learns to fire the target spike trains through synaptic changes that are proportional to the synaptic currents at the timings of real and target output spikes. We study these learning rules in computer simulations where we train integrate-and-fire neurons. Both learning rules allow neurons to fire at the desired timings, with sub-millisecond precision. We show how chronotrons can learn to classify their inputs, by firing identical, temporally precise spike trains for different inputs belonging to the same class. When the input is noisy, the classification also leads to noise reduction. We compute lower bounds for the memory capacity of chronotrons and explore the influence of various parameters on chronotrons' performance. The chronotrons can model neurons that encode information in the time of the first spike relative to the onset of salient stimuli or neurons in oscillatory networks that encode information in the phases of spikes relative to the background oscillation. Our results show that firing one spike per cycle optimizes memory capacity in neurons encoding information in the phase of firing relative to a background rhythm.  相似文献   
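The sketch below is a heavily simplified, hypothetical reading of the I-learning rule quoted above, applied to a leaky integrate-and-fire neuron: weights are nudged up in proportion to each input's synaptic current at the target spike times and down at the actual output spike times. The kernels, update signs, neuron model, and parameters are assumptions for illustration, not the published chronotron implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, T = 1e-4, 0.2
t = np.arange(0, T, dt)
n_in, tau_s, tau_m, v_th = 50, 5e-3, 10e-3, 1.0
w = rng.normal(0.0, 0.1, n_in)

in_spikes = [np.sort(rng.uniform(0, T, 5)) for _ in range(n_in)]  # fixed input pattern
target = np.array([0.05, 0.12, 0.17])                             # desired output spikes (s)

def psc(spikes):
    """Exponential postsynaptic current trace for one input."""
    trace = np.zeros_like(t)
    for s in spikes:
        idx = t >= s
        trace[idx] += np.exp(-(t[idx] - s) / tau_s)
    return trace

I = np.stack([psc(s) for s in in_spikes])          # (n_in, time) synaptic currents

for epoch in range(200):
    drive = w @ I                                   # total synaptic current
    v, out = 0.0, []
    for k in range(len(t)):                         # leaky integrate-and-fire
        v += dt / tau_m * (-v + drive[k])
        if v >= v_th:
            out.append(t[k])
            v = 0.0
    # I-learning-like update (assumed signs): strengthen inputs in proportion to
    # their current at target spike times, weaken them at actual output spikes.
    for ts in target:
        w += 0.01 * I[:, min(int(round(ts / dt)), len(t) - 1)]
    for ts in out:
        w -= 0.01 * I[:, min(int(round(ts / dt)), len(t) - 1)]

print("output spikes after training:", [round(s, 3) for s in out],
      "target:", target.tolist())
```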

5.
Recordings from area V4 of monkeys have revealed that when the focus of attention is on a visual stimulus within the receptive field of a cortical neuron, two distinct changes can occur: The firing rate of the neuron can change and there can be an increase in the coherence between spikes and the local field potential (LFP) in the gamma-frequency range (30-50 Hz). The hypothesis explored here is that these observed effects of attention could be a consequence of changes in the synchrony of local interneuron networks. We performed computer simulations of a Hodgkin-Huxley type neuron driven by a constant depolarizing current, I, representing visual stimulation and a modulatory inhibitory input representing the effects of attention via local interneuron networks. We observed that the neuron's firing rate and the coherence of its output spike train with the synaptic inputs was modulated by the degree of synchrony of the inhibitory inputs. When inhibitory synchrony increased, the coherence of spiking model neurons with the synaptic input increased, but the firing rate either increased or remained the same. The mean number of synchronous inhibitory inputs was a key determinant of the shape of the firing rate versus current (f-I) curves. For a large number of inhibitory inputs (approximately 50), the f-I curve saturated for large I and an increase in input synchrony resulted in a shift of sensitivity-the model neuron responded to weaker inputs I. For a small number (approximately 10), the f-I curves were non-saturating and an increase in input synchrony led to an increase in the gain of the response-the firing rate in response to the same input was multiplied by an approximately constant factor. The firing rate modulation with inhibitory synchrony was highest when the input network oscillated in the gamma frequency range. Thus, the observed changes in firing rate and coherence of neurons in the visual cortex could be controlled by top-down inputs that regulated the coherence in the activity of a local inhibitory network discharging at gamma frequencies.  相似文献   
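The snippet below sketches only the rate part of this argument: a conductance-based leaky integrate-and-fire neuron stands in for the paper's Hodgkin-Huxley model, driven by a constant current and by inhibitory synapses whose spike times are either tightly synchronized on the gamma cycle or spread across it. The neuron model and all parameter values are assumptions chosen for illustration, not the published simulation.

```python
import numpy as np

def lif_rate(I_inj, sync, n_inh=50, f_gamma=40.0, T=2.0, dt=1e-4,
             C=200e-12, tau_m=20e-3, E_L=-65e-3, v_th=-50e-3, v_reset=-65e-3,
             E_inh=-75e-3, g_peak=2e-9, tau_inh=5e-3, seed=3):
    """Firing rate of a conductance-based LIF driven by current I_inj plus n_inh
    inhibitory synapses firing at f_gamma Hz.  `sync` in [0, 1] sets how tightly
    the inhibitory spike times cluster on the gamma cycle (1 = fully synchronized)."""
    rng = np.random.default_rng(seed)
    t = np.arange(0, T, dt)
    g = np.zeros_like(t)
    period = 1.0 / f_gamma
    for _ in range(n_inh):
        jitter = (1.0 - sync) * rng.uniform(0, period)
        for s in np.arange(jitter, T, period):
            idx = t >= s
            g[idx] += g_peak * np.exp(-(t[idx] - s) / tau_inh)
    v, n_spk = E_L, 0
    for k in range(len(t)):
        dv = -(v - E_L) / tau_m + (I_inj - g[k] * (v - E_inh)) / C
        v += dt * dv
        if v >= v_th:
            v, n_spk = v_reset, n_spk + 1
    return n_spk / T

for I_inj in (150e-12, 250e-12, 350e-12):
    print("I = %d pA:  synchronized %5.1f Hz   asynchronous %5.1f Hz"
          % (I_inj * 1e12, lif_rate(I_inj, sync=1.0), lif_rate(I_inj, sync=0.0)))
```

Sweeping I_inj with the two synchrony settings gives a crude f-I comparison: synchronized inhibition leaves windows within each gamma cycle during which the injected current can drive spikes, so the rate at a given current is at least as high as under asynchronous inhibition of the same average strength.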

6.
Networks of synchronized fast-spiking interneurons are thought to be key elements in the generation of gamma (γ) oscillations (30–80 Hz) in the brain. We examined how such γ-oscillatory inhibition regulates the output of a cortical pyramidal cell. Specifically, we modeled a situation where a pyramidal cell receives inputs from γ-synchronized fast-spiking inhibitory interneurons. This model successfully reproduced several important aspects of a recent experimental result regarding the γ-inhibitory regulation of pyramidal cellular firing that is presumably associated with the sensation of whisker stimuli. Through an in-depth analysis of this model system, we show that there is an obvious rhythmic gating effect of the γ-oscillated interneuron networks on the pyramidal neuron’s signal transmission. This effect is further illustrated by the interactions of this interneuron network and the pyramidal neuron. Prominent power in the γ frequency range can emerge provided that there are appropriate delays on the excitatory connections and inhibitory synaptic conductance between interneurons. These results indicate that interactions between excitation and inhibition are critical for the modulation of coherence and oscillation frequency of network activities.  相似文献   

7.
Spike-timing-dependent plasticity (STDP) determines the evolution of the synaptic weights according to their pre- and post-synaptic activity, which in turn changes the neuronal activity. In this paper, we extend previous studies of input selectivity induced by STDP for single neurons to the biologically interesting case of a neuronal network with fixed recurrent connections and plastic connections from external pools of input neurons. We use a theoretical framework based on the Poisson neuron model to analytically describe the network dynamics (firing rates and spike-time correlations) and thus the evolution of the synaptic weights. This framework incorporates the time course of the post-synaptic potentials and synaptic delays. Our analysis focuses on the asymptotic states of a network stimulated by two homogeneous pools of “steady” inputs, namely Poisson spike trains which have fixed firing rates and spike-time correlations. The STDP model extends rate-based learning in that it can implement, at the same time, both a stabilization of the individual neuron firing rates and a slower weight specialization depending on the input spike-time correlations. When one input pathway has stronger within-pool correlations, the resulting synaptic dynamics induced by STDP are shown to be similar to those arising in the case of a purely feed-forward network: the weights from the more correlated inputs are potentiated at the expense of the remaining input connections.

8.
Accurately describing synaptic interactions between neurons and how interactions change over time are key challenges for systems neuroscience. Although intracellular electrophysiology is a powerful tool for studying synaptic integration and plasticity, it is limited by the small number of neurons that can be recorded simultaneously in vitro and by the technical difficulty of intracellular recording in vivo. One way around these difficulties may be to use large-scale extracellular recording of spike trains and apply statistical methods to model and infer functional connections between neurons. These techniques have the potential to reveal large-scale connectivity structure based on the spike timing alone. However, the interpretation of functional connectivity is often approximate, since only a small fraction of presynaptic inputs are typically observed. Here we use in vitro current injection in layer 2/3 pyramidal neurons to validate methods for inferring functional connectivity in a setting where input to the neuron is controlled. In experiments with partially-defined input, we inject a single simulated input with known amplitude on a background of fluctuating noise. In a fully-defined input paradigm, we then control the synaptic weights and timing of many simulated presynaptic neurons. By analyzing the firing of neurons in response to these artificial inputs, we ask 1) How does functional connectivity inferred from spikes relate to simulated synaptic input? and 2) What are the limitations of connectivity inference? We find that individual current-based synaptic inputs are detectable over a broad range of amplitudes and conditions. Detectability depends on input amplitude and output firing rate, and excitatory inputs are detected more readily than inhibitory. Moreover, as we model increasing numbers of presynaptic inputs, we are able to estimate connection strengths more accurately and detect the presence of connections more quickly. These results illustrate the possibilities and outline the limits of inferring synaptic input from spikes.  相似文献   

9.
Synaptic information efficacy (SIE) is a statistical measure to quantify the efficacy of a synapse. It measures how much information is gained, on the average, about the output spike train of a postsynaptic neuron if the input spike train is known. It is a particularly appropriate measure for assessing the input–output relationship of neurons receiving dynamic stimuli. Here, we compare the SIE of simulated synaptic inputs measured experimentally in layer 5 cortical pyramidal neurons in vitro with the SIE computed from a minimal model constructed to fit the recorded data. We show that even with a simple model that is far from perfect in predicting the precise timing of the output spikes of the real neuron, the SIE can still be accurately predicted. This arises from the ability of the model to predict output spikes influenced by the input more accurately than those driven by the background current. This indicates that in this context, some spikes may be more important than others. Lastly we demonstrate another aspect where using mutual information could be beneficial in evaluating the quality of a model, by measuring the mutual information between the model’s output and the neuron’s output. The SIE, thus, could be a useful tool for assessing the quality of models of single neurons in preserving input–output relationship, a property that becomes crucial when we start connecting these reduced models to construct complex realistic neuronal networks.  相似文献   
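As a rough illustration of the kind of quantity involved, the snippet below computes a plug-in mutual information estimate between binarized, binned input and output spike trains. This is only a crude stand-in for the SIE computation; the estimator, bin sizes, corrections, and the toy spike trains are assumptions, not the procedure used in the paper.

```python
import numpy as np

def binarize(spike_times, T, bin_size):
    """0/1 array: does each bin contain at least one spike?"""
    edges = np.arange(0, T + bin_size, bin_size)
    counts, _ = np.histogram(spike_times, bins=edges)
    return (counts > 0).astype(int)

def mutual_information(x, y):
    """Plug-in MI (bits) between two binary sequences of equal length."""
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((x == a) & (y == b))
            p_a, p_b = np.mean(x == a), np.mean(y == b)
            if p_ab > 0:
                mi += p_ab * np.log2(p_ab / (p_a * p_b))
    return mi

rng = np.random.default_rng(4)
T, rate = 100.0, 5.0                       # 100 s of ~5 Hz input spiking
inp = np.sort(rng.uniform(0, T, int(rate * T)))
# Toy output: fires ~3 ms after a random 40% of input spikes, plus background spikes.
out = np.sort(np.concatenate([inp[rng.random(len(inp)) < 0.4] + 0.003,
                              rng.uniform(0, T, int(2 * T))]))
x = binarize(inp, T, 0.01)
y = binarize(out, T, 0.01)
print("MI per 10-ms bin: %.4f bits" % mutual_information(x, y))
```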

10.
The magnitude and apparent complexity of the brain's connectivity have left explicit networks largely unexplored. As a result, the relationship between the organization of synaptic connections and how the brain processes information is poorly understood. A recently proposed retinal network that produces neural correlates of color vision is refined and extended here to a family of general logic circuits. For any combination of high and low activity in any set of neurons, one of the logic circuits can receive input from the neurons and activate a single output neuron whenever the input neurons have the given activity state. The strength of the output neuron's response is a measure of the difference between the smallest of the high inputs and the largest of the low inputs. The networks generate correlates of known psychophysical phenomena. These results follow directly from the most cost-effective architectures for specific logic circuits and the minimal cellular capabilities of excitation and inhibition. The networks function dynamically, making their operation consistent with the speed of most brain functions. The networks show that well-known psychophysical phenomena do not require extraordinarily complex brain structures, and that a single network architecture can produce apparently disparate phenomena in different sensory systems.
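Below is one possible reading of the response measure described in the abstract, written as a rectified difference between the smallest of the "high" inputs (excitatory drive) and the largest of the "low" inputs (inhibitory veto). The function name, the rectified-difference form, and the example values are illustrative assumptions, not the paper's circuit.

```python
# Toy reading of the logic-circuit response measure described above.

def logic_response(high_inputs, low_inputs):
    """Output activity for one input state (all activities in [0, 1])."""
    drive = min(high_inputs) if high_inputs else 1.0   # weakest "high" input
    veto = max(low_inputs) if low_inputs else 0.0      # strongest "low" input
    return max(0.0, drive - veto)                      # rectified difference

# A cell selective for "input A high AND input B low":
for a, b in [(0.9, 0.1), (0.9, 0.8), (0.2, 0.1), (0.2, 0.8)]:
    print(f"A={a:.1f}, B={b:.1f} -> response {logic_response([a], [b]):.2f}")
```

Only the first input state produces a strong response, so the unit behaves as an "A AND NOT B" gate whose output strength grades with how decisively the condition is met.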

11.
Statistical Signs of Synaptic Interaction in Neurons
The influence of basic open-loop synaptic connections on the firing of simultaneously recorded neurons has been investigated with auto- and cross-correlation histograms, using experimental records and computer simulations. The basic connections examined were direct synaptic excitation, direct synaptic inhibition, and shared synaptic input. Each type of synaptic connection produces certain characteristic features in the cross-correlogram depending on the properties of the synapse and statistical features in the firing pattern of each neuron. Thus, empirically derived cross-correlation measures can be interpreted in terms of the underlying physiological mechanisms. Their potential uses and limitations in the detection and identification of synaptic connections between neurons whose extracellularly recorded spike trains are available are discussed.  相似文献   
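A basic cross-correlation histogram of the kind described above can be computed as in the sketch below, where a toy "postsynaptic" train contains extra spikes shortly after a fraction of the "presynaptic" spikes to mimic direct synaptic excitation. The spike trains, delay, and bin settings are illustrative assumptions.

```python
import numpy as np

def cross_correlogram(ref, target, window=0.05, bin_size=0.001):
    """Counts of target spikes at each lag relative to every reference spike."""
    edges = np.arange(-window, window + bin_size, bin_size)
    counts = np.zeros(len(edges) - 1)
    for t_ref in ref:
        d = target - t_ref
        counts += np.histogram(d[(d >= -window) & (d < window)], bins=edges)[0]
    return edges[:-1] + bin_size / 2, counts

rng = np.random.default_rng(5)
T = 200.0
pre = np.sort(rng.uniform(0, T, int(10 * T)))     # ~10 Hz "presynaptic" train
# "Postsynaptic" train: background spikes plus extra spikes ~2 ms after a
# random 30% of the presynaptic spikes, mimicking direct synaptic excitation.
post = np.sort(np.concatenate([rng.uniform(0, T, int(8 * T)),
                               pre[rng.random(len(pre)) < 0.3] + 0.002]))
lags, counts = cross_correlogram(pre, post)
print("peak at %.1f ms lag, %d counts (baseline ~%d per bin)"
      % (1000 * lags[np.argmax(counts)], counts.max(), np.median(counts)))
```

The narrow peak at a short positive lag is the characteristic signature of direct synaptic excitation discussed in the abstract; shared input or inhibition would shape the correlogram differently.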

12.
A time-varying Resistance-Capacitance (RC) circuit computer model was constructed based on known membrane and synaptic properties of the visual-vestibular network of the marine snail Hermissenda crassicornis. Specific biophysical properties and synaptic connections of identified neurons are represented as lumped parameters (circuit elements) in the model; in the computer simulation, differential equations are approximated by difference equations. The model's output, membrane potential, an indirect measure of firing frequency, closely parallels the behavioral and electrophysiologic outputs of Hermissenda in response to the same input stimuli presented during and after associative learning. The parallelism of the computer-modeled and the biologic outputs suggests that the model captures the features necessary and sufficient for associative learning.
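The difference-equation approximation mentioned above amounts to a forward-Euler update of the RC membrane equation. The sketch below shows that update for a single compartment with a time-varying synaptic conductance; the parameter values are generic illustrations, not the lumped parameters of the Hermissenda model.

```python
import numpy as np

# Forward-Euler difference-equation approximation of one RC membrane compartment:
#   C dV/dt = -(V - E_rest)/R - g_syn(t) * (V - E_syn)

dt = 0.1e-3                      # 0.1 ms time step
T = 0.3                          # 300 ms
t = np.arange(0, T, dt)
C, R = 1e-9, 100e6               # 1 nF, 100 MOhm  (tau = RC = 100 ms)
E_rest, E_syn = -70e-3, 0.0
g_syn = np.where((t > 0.1) & (t < 0.2), 5e-9, 0.0)   # 5 nS conductance step

V = np.empty_like(t)
V[0] = E_rest
for k in range(1, len(t)):
    dVdt = (-(V[k-1] - E_rest) / R - g_syn[k-1] * (V[k-1] - E_syn)) / C
    V[k] = V[k-1] + dt * dVdt    # difference-equation update
print("peak membrane potential: %.1f mV" % (1000 * V.max()))
```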

13.
Brain networks store new memories using functional and structural synaptic plasticity. Memory formation is generally attributed to Hebbian plasticity, while homeostatic plasticity is thought to have an ancillary role in stabilizing network dynamics. Here we report that homeostatic plasticity alone can also lead to the formation of stable memories. We analyze this phenomenon using a new theory of network remodeling, combined with numerical simulations of recurrent spiking neural networks that exhibit structural plasticity based on firing rate homeostasis. These networks are able to store repeatedly presented patterns and recall them upon the presentation of incomplete cues. Storage is fast, governed by the homeostatic drift. In contrast, forgetting is slow, driven by a diffusion process. Joint stimulation of neurons induces the growth of associative connections between them, leading to the formation of memory engrams. These memories are stored in a distributed fashion throughout the connectivity matrix, and individual synaptic connections have only a small influence. Although memory-specific connections are increased in number, the total numbers of inputs and outputs of neurons undergo only small changes during stimulation. We find that homeostatic structural plasticity induces a specific type of “silent memories”, different from conventional attractor states.
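The sketch below is a drastically simplified cartoon of structural plasticity driven by firing-rate homeostasis: neurons grow free axonal and dendritic elements when their rate is below a target, prune synapses when above it, and free elements are paired at random into new synapses. The rate model, stimulation schedule, and parameters are assumptions for illustration, not the paper's theory or simulations; the qualitative point is only that co-stimulated neurons end up preferentially wired to each other.

```python
import numpy as np

rng = np.random.default_rng(6)
N, target = 50, 5.0                        # neurons, target firing rate (Hz)
W = np.zeros((N, N), dtype=int)            # synapse counts, W[i, j]: i -> j
axons = np.zeros(N)                        # free axonal (output) elements
dends = np.zeros(N)                        # free dendritic (input) elements

for step in range(3000):
    ext = np.full(N, 4.0)
    if step < 1500:
        ext[:10] += 4.0                    # repeatedly co-stimulate a subgroup
    r = np.clip(ext + 0.1 * W.sum(axis=0), 0, 50)   # crude rate estimate
    # Firing-rate homeostasis of structural elements (assumed linear rule):
    axons += np.maximum(0.02 * (target - r), 0.0)   # grow when below target
    dends += np.maximum(0.02 * (target - r), 0.0)
    for j in np.where(r > target)[0]:               # prune inputs when above target
        in_partners = np.where(W[:, j] > 0)[0]
        if len(in_partners) and rng.random() < 0.02 * (r[j] - target):
            W[rng.choice(in_partners), j] -= 1
    # Pair free axonal and dendritic elements at random into new synapses.
    while axons.max() >= 1 and dends.max() >= 1:
        i = rng.choice(np.where(axons >= 1)[0])
        j = rng.choice(np.where(dends >= 1)[0])
        if i != j:
            W[i, j] += 1
        axons[i] -= 1
        dends[j] -= 1

print("synapses within the stimulated group  :", W[:10, :10].sum())
print("synapses between the two groups       :", W[:10, 10:].sum() + W[10:, :10].sum())
print("synapses within the unstimulated group:", W[10:, 10:].sum())
```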

14.
Vasopressin neurons, responding to input generated by osmotic pressure, use an intrinsic mechanism to shift from slow irregular firing to a distinct phasic pattern, consisting of long bursts and silences lasting tens of seconds. With increased input, bursts lengthen, eventually shifting to continuous firing. The phasic activity remains asynchronous across the cells and is not reflected in the population output signal. Here we have used a computational vasopressin neuron model to investigate the functional significance of the phasic firing pattern. We generated a concise model of the synaptic input driven spike firing mechanism that gives a close quantitative match to vasopressin neuron spike activity recorded in vivo, tested against endogenous activity and experimental interventions. The integrate-and-fire based model provides a simple physiological explanation of the phasic firing mechanism involving an activity-dependent slow depolarising afterpotential (DAP) generated by a calcium-inactivated potassium leak current. This is modulated by the slower, opposing, action of activity-dependent dendritic dynorphin release, which inactivates the DAP, the opposing effects generating successive periods of bursting and silence. Model cells are not spontaneously active, but fire when perturbed by random perturbations mimicking synaptic input. We constructed one population of such phasic neurons, and another population of similar cells but which lacked the ability to fire phasically. We then studied how these two populations differed in the way that they encoded changes in afferent inputs. By comparison with the non-phasic population, the phasic population responds linearly to increases in tonic synaptic input. Non-phasic cells respond to transient elevations in synaptic input in a way that strongly depends on background activity levels, phasic cells in a way that is independent of background levels, and show a similar strong linearization of the response. These findings show large differences in information coding between the populations, and apparent functional advantages of asynchronous phasic firing.  相似文献   
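The sketch below is a cartoon of the mechanism described above: an integrate-and-fire cell perturbed by random synaptic-like input, with an activity-dependent depolarizing afterpotential (DAP) variable and a slower dynorphin-like variable that inactivates it, so that bursts and silences alternate. All time constants, increments, and the neuron model are illustrative assumptions, not the published vasopressin model.

```python
import numpy as np

rng = np.random.default_rng(7)
dt, T = 0.01, 600.0                          # 10 ms steps, 10 min of activity
v_th, baseline, noise_sd = 1.0, 0.75, 0.3
tau_v, tau_dap, tau_dyn = 0.05, 1.5, 25.0    # s (illustrative time constants)
k_dap, k_dyn, dap_max = 0.4, 0.01, 1.2       # per-spike increments (illustrative)

v = dap = dyn = 0.0
spikes = []
for i in range(int(T / dt)):
    # Random perturbation stands in for synaptic input; the DAP contribution is
    # inactivated by the slower dynorphin-like variable `dyn`.
    drive = baseline + rng.normal(0.0, noise_sd) + dap * max(0.0, 1.0 - dyn)
    v += dt / tau_v * (-v + drive)
    dap -= dt / tau_dap * dap
    dyn -= dt / tau_dyn * dyn
    if v >= v_th:
        spikes.append(i * dt)
        v = 0.0
        dap = min(dap + k_dap, dap_max)      # activity-dependent DAP
        dyn += k_dyn                         # slower, opposing dynorphin action

isi = np.diff(spikes)
print("total spikes:", len(spikes),
      "| silences > 5 s:", int(np.sum(isi > 5.0)),
      "| longest silence: %.1f s" % isi.max())
```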

15.
What cellular and network properties allow reliable neuronal rhythm generation or firing that can be started and stopped by brief synaptic inputs? We investigate rhythmic activity in an electrically-coupled population of brainstem neurons driving swimming locomotion in young frog tadpoles, and how activity is switched on and off by brief sensory stimulation. We build a computational model of 30 electrically-coupled conditional pacemaker neurons on one side of the tadpole hindbrain and spinal cord. Based on experimental estimates for neuron properties, population sizes, synapse strengths and connections, we show that: long-lasting, mutual, glutamatergic excitation between the neurons allows the network to sustain rhythmic pacemaker firing at swimming frequencies following brief synaptic excitation; activity persists but rhythm breaks down without electrical coupling; NMDA voltage-dependency doubles the range of synaptic feedback strengths generating sustained rhythm. The network can be switched on and off at short latency by brief synaptic excitation and inhibition. We demonstrate that a population of generic Hodgkin-Huxley type neurons coupled by glutamatergic excitatory feedback can generate sustained asynchronous firing switched on and off synaptically. We conclude that networks of neurons with NMDAR mediated feedback excitation can generate self-sustained activity following brief synaptic excitation. The frequency of activity is limited by the kinetics of the neuron membrane channels and can be stopped by brief inhibitory input. Network activity can be rhythmic at lower frequencies if the neurons are electrically coupled. Our key finding is that excitatory synaptic feedback within a population of neurons can produce switchable, stable, sustained firing without synaptic inhibition.  相似文献   

16.
Dynamical behavior of a biological neuronal network depends significantly on the spatial pattern of synaptic connections among neurons. While neuronal network dynamics has extensively been studied with simple wiring patterns, such as all-to-all or random synaptic connections, not much is known about the activity of networks with more complicated wiring topologies. Here, we examined how different wiring topologies may influence the response properties of neuronal networks, paying attention to irregular spike firing, which is known as a characteristic of in vivo cortical neurons, and spike synchronicity. We constructed a recurrent network model of realistic neurons and systematically rewired the recurrent synapses to change the network topology, from a localized regular and a “small-world” network topology to a distributed random network topology. Regular and small-world wiring patterns greatly increased the irregularity or the coefficient of variation (Cv) of output spike trains, whereas such an increase was small in random connectivity patterns. For given strength of recurrent synapses, the firing irregularity exhibited monotonous decreases from the regular to the random network topology. By contrast, the spike coherence between an arbitrary neuron pair exhibited a non-monotonous dependence on the topological wiring pattern. More precisely, the wiring pattern to maximize the spike coherence varied with the strength of recurrent synapses. In a certain range of the synaptic strength, the spike coherence was maximal in the small-world network topology, and the long-range connections introduced in this wiring changed the dependence of spike synchrony on the synaptic strength moderately. However, the effects of this network topology were not really special in other properties of network activity. Action Editor: Xiao-Jing Wang  相似文献   
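The sketch below covers only the topology sweep and one of the measures used in this kind of study: a Watts-Strogatz-style rewiring that moves a directed ring lattice from regular through small-world to random connectivity, plus the coefficient of variation of inter-spike intervals. The spiking network simulation itself is omitted, and the sizes and rewiring scheme are illustrative assumptions rather than the paper's model.

```python
import numpy as np

rng = np.random.default_rng(8)
n, k = 200, 10

def ring_lattice(n, k):
    """Directed ring lattice: each neuron projects to its k nearest neighbours."""
    A = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for d in range(1, k // 2 + 1):
            A[i, (i + d) % n] = A[i, (i - d) % n] = True
    return A

def rewire(A, p, rng):
    """Watts-Strogatz-style rewiring: each edge is redirected to a random target
    with probability p (p=0 regular, small p small-world, p=1 random)."""
    A = A.copy()
    n = A.shape[0]
    for i, j in zip(*np.nonzero(A)):
        if rng.random() < p:
            A[i, j] = False
            new_j = rng.integers(n)
            while new_j == i or A[i, new_j]:
                new_j = rng.integers(n)
            A[i, new_j] = True
    return A

def cv_isi(spike_times):
    """Coefficient of variation of inter-spike intervals (~1 for a Poisson train)."""
    isi = np.diff(np.sort(spike_times))
    return isi.std() / isi.mean()

A0 = ring_lattice(n, k)
for p in (0.0, 0.1, 1.0):
    A = rewire(A0, p, rng)
    local = np.mean([min((j - i) % n, (i - j) % n) <= k // 2
                     for i, j in zip(*np.nonzero(A))])
    print("p=%.1f  edges=%d  fraction of near-neighbour edges=%.2f"
          % (p, A.sum(), local))
print("CV of a Poisson-like train:", round(cv_isi(rng.uniform(0, 100, 500)), 2))
```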

17.
Synaptic plasticity is the cellular mechanism underlying the phenomena of learning and memory. Much of the research on synaptic plasticity is based on the postulate of Hebb (1949) who proposed that, when a neuron repeatedly takes part in the activation of another neuron, the efficacy of the connections between these neurons is increased. Plasticity has been extensively studied, and often demonstrated through the processes of LTP (Long Term Potentiation) and LTD (Long Term Depression), which represent an increase and a decrease of the efficacy of long-term synaptic transmission. This review summarizes current knowledge concerning the cellular mechanisms of LTP and LTD, whether at the level of excitatory synapses, which have been the most studied, or at the level of inhibitory synapses. However, if we consider neuronal networks rather than the individual synapses, the consequences of synaptic plasticity need to be considered on a large scale to determine if the activity of networks are changed or not. Homeostatic plasticity takes into account the mechanisms which control the efficacy of synaptic transmission for all the synaptic inputs of a neuron. Consequently, this new concept deals with the coordinated activity of excitatory and inhibitory networks afferent to a neuron which maintain a controlled level of excitability during the acquisition of new information related to the potentiation or to the depression of synaptic efficacy. We propose that the protocols of stimulation used to induce plasticity at the synaptic level set up a "homeostatic potentiation" or a "homeostatic depression" of excitation and inhibition at the level of the neuronal networks. The coordination between excitatory and inhibitory circuits allows the neuronal networks to preserve a level of stable activity, thus avoiding episodes of hyper- or hypo-activity during the learning and memory phases.  相似文献   

18.
Dragoi G, Harris KD, Buzsáki G. Neuron 2003, 39(5): 843-853
In the brain, information is encoded by the firing patterns of neuronal ensembles and the strength of synaptic connections between individual neurons. We report here that representation of the environment by "place" cells is altered by changing synaptic weights within hippocampal networks. Long-term potentiation (LTP) of intrinsic hippocampal pathways abolished existing place fields, created new place fields, and rearranged the temporal relationship within the affected population. The effect of LTP on neuron discharge was rate and context dependent. The LTP-induced "remapping" occurred without affecting the global firing rate of the network. The findings support the view that learned place representation can be accomplished by LTP-like synaptic plasticity within intrahippocampal networks.  相似文献   

19.
Our goal is to understand how nearly synchronous modes arise in heterogenous networks of neurons. In heterogenous networks, instead of exact synchrony, nearly synchronous modes arise, which include both 1:1 and 2:2 phase-locked modes. Existence and stability criteria for 2:2 phase-locked modes in reciprocally coupled two neuron circuits were derived based on the open loop phase resetting curve (PRC) without the assumption of weak coupling. The PRC for each component neuron was generated using the change in synaptic conductance produced by a presynaptic action potential as the perturbation. Separate derivations were required for modes in which the firing order is preserved and for those in which it alternates. Networks composed of two model neurons coupled by reciprocal inhibition were examined to test the predictions. The parameter regimes in which both types of nearly synchronous modes are exhibited were accurately predicted both qualitatively and quantitatively provided that the synaptic time constant is short with respect to the period and that the effect of second order resetting is considered. In contrast, PRC methods based on weak coupling could not predict 2:2 modes and did not predict the 1:1 modes with the level of accuracy achieved by the strong coupling methods. The strong coupling prediction methods provide insight into what manipulations promote near-synchrony in a two neuron network and may also have predictive value for larger networks, which can also manifest changes in firing order. We also identify a novel route by which synchrony is lost in mildly heterogenous networks.  相似文献   
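The sketch below illustrates only the first step described above: measuring an open-loop PRC by perturbing a tonically firing model neuron with a synaptic conductance pulse at different phases, and reading off first- and second-order resetting from the perturbed cycle and the one that follows. A leaky integrate-and-fire cell is used as a simplified stand-in for the paper's model neurons, and all parameters are illustrative.

```python
import numpy as np

dt = 1e-5
tau_m, E_L, v_th, v_reset = 20e-3, -65e-3, -50e-3, -65e-3
I_R = 20e-3                                  # tonic drive (I*R) above rest
E_syn, g_rel, tau_syn = -80e-3, 0.5, 5e-3    # inhibitory pulse, g relative to leak

def run(n_spikes, pulse_time=None):
    """Spike times of the model neuron, with an optional synaptic conductance pulse."""
    v, t, spikes = E_L, 0.0, []
    while len(spikes) < n_spikes:
        g = 0.0
        if pulse_time is not None and t >= pulse_time:
            g = g_rel * np.exp(-(t - pulse_time) / tau_syn)
        v += dt * (-(v - E_L) + I_R - g * (v - E_syn)) / tau_m
        t += dt
        if v >= v_th:
            spikes.append(t)
            v = v_reset
    return spikes

free = run(3)
T0 = free[1] - free[0]                        # intrinsic period
for phi in (0.1, 0.3, 0.5, 0.7, 0.9):
    pert = run(3, pulse_time=free[0] + phi * T0)
    first  = (pert[1] - pert[0] - T0) / T0    # first-order resetting (delay > 0)
    second = (pert[2] - pert[1] - T0) / T0    # second-order resetting
    print("phase %.1f: first-order %+.3f, second-order %+.3f" % (phi, first, second))
```

Late-phase perturbations leave residual conductance that spills into the next cycle, which is where the second-order resetting emphasized in the abstract comes from.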

20.
Alcohol dependence and withdrawal have been shown to cause neuroadaptive changes at multiple levels of the nervous system. At the neuron level, adaptations of synaptic connections have been extensively studied in a number of brain areas, and accumulating evidence also shows the importance of alcohol dependence-related changes in the intrinsic cellular properties of neurons. At the same time, it is still largely unknown how such neural adaptations impact the firing and integrative properties of neurons. To address these problems, here we analyze physiological properties of neurons in the bed nucleus of stria terminalis (jcBNST) in animals with a history of alcohol dependence. As a comprehensive approach, we first measure passive and active membrane properties of neurons using conventional current clamp protocols and then analyze their firing responses under the action of simulated synaptic bombardment via dynamic clamp. We find that most physiological properties as measured by DC current injection are barely affected during protracted withdrawal. However, neuronal excitability as measured from firing responses under simulated synaptic inputs with the dynamic clamp is markedly reduced in all 3 types of jcBNST neurons. These results support the importance of studying the effects of alcohol and drugs of abuse on the firing properties of neurons with dynamic clamp protocols designed to bring the neurons into a high conductance state. Since the jcBNST integrates excitatory inputs from the basolateral amygdala (BLA) and cortical inputs from the infralimbic and the insular cortices, and in turn is believed to contribute to the inhibitory input to the central nucleus of the amygdala (CeA), the reduced excitability of the jcBNST during protracted withdrawal in alcohol-dependent animals will likely affect the ability of the jcBNST to shape the activity and output of the CeA.
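The sketch below shows the kind of high-conductance-state stimulus referred to above: a generic point-conductance, dynamic-clamp-style input (Ornstein-Uhlenbeck excitatory and inhibitory conductances) injected into a leaky integrate-and-fire cell, contrasted with a DC step that is subthreshold on its own. The neuron model, the conductance statistics, and all parameters are assumptions for illustration, not the jcBNST recordings or protocols.

```python
import numpy as np

rng = np.random.default_rng(9)
dt, T = 1e-4, 5.0
t = np.arange(0, T, dt)
C, g_L, E_L = 200e-12, 10e-9, -70e-3
v_th, v_reset = -50e-3, -60e-3
E_e, E_i = 0.0, -75e-3

def ou_conductance(mean, sd, tau):
    """Ornstein-Uhlenbeck conductance trace, a common dynamic-clamp waveform."""
    g = np.empty_like(t)
    g[0] = mean
    for k in range(1, len(t)):
        g[k] = g[k-1] + dt / tau * (mean - g[k-1]) \
               + sd * np.sqrt(2 * dt / tau) * rng.normal()
    return np.clip(g, 0.0, None)

def firing_rate(I_dc=0.0, g_e=None, g_i=None):
    """Leaky integrate-and-fire response to DC current and/or injected conductances."""
    v, n = E_L, 0
    for k in range(len(t)):
        I_syn = 0.0
        if g_e is not None and g_i is not None:
            I_syn = -g_e[k] * (v - E_e) - g_i[k] * (v - E_i)
        v += dt / C * (-g_L * (v - E_L) + I_dc + I_syn)
        if v >= v_th:
            v, n = v_reset, n + 1
    return n / T

print("DC step only (150 pA)      : %.1f Hz" % firing_rate(I_dc=150e-12))
print("simulated high-conductance : %.1f Hz"
      % firing_rate(g_e=ou_conductance(15e-9, 5e-9, 3e-3),
                    g_i=ou_conductance(40e-9, 10e-9, 10e-3)))
```

Because firing in the high-conductance state is driven by conductance fluctuations and shunting rather than by mean current alone, it can expose excitability differences that a DC step misses, which is the methodological point the abstract makes.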
