Similar articles
20 similar articles found (search time: 15 ms)
1.
Rhythms at slow (<1 Hz) frequency, alternating between Up and Down states, occur during slow-wave sleep, under deep anaesthesia and in cortical slices of mammals maintained in vitro. Such spontaneous oscillations result from the interplay between network reverberations, nonlinearly sustained by strong synaptic coupling, and a fatigue mechanism that inhibits neuronal firing in an activity-dependent manner. By pharmacologically varying the excitability of brain slices, we probe the network dynamics underlying slow rhythms, uncovering an intrinsic anticorrelation between Up and Down state durations. In addition, a non-monotonic change in Down state duration is observed, which shrinks the distribution of accessible slow-rhythm frequencies. Attractor dynamics with activity-dependent self-inhibition predicts a similar trend even when system excitability is reduced, because the Up and Down states lose stability. Hence, such cortical rhythms tend to display a maximal spread of the Up/Down frequency distribution, suggesting that the system dynamics sit on a critical boundary of the parameter space. This would be an optimal solution for the system to display a wide spectrum of dynamical regimes and timescales.
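The attractor picture in this abstract can be illustrated with a minimal rate model (a generic sketch with illustrative parameters, not the authors' fitted model): a single population with strong recurrent coupling and slow activity-dependent self-inhibition behaves as a relaxation oscillator that alternates between Up and Down states.

```python
import numpy as np

def simulate_updown(T=3.0, dt=5e-4, w=10.0, theta=2.0,
                    b=9.0, tau_r=0.01, tau_a=0.3):
    """Rate model with activity-dependent self-inhibition:
    dr/dt = (-r + S(w*r - a - theta)) / tau_r   (population rate)
    da/dt = (-a + b*r) / tau_a                  (slow adaptation)
    With strong recurrent gain w and slow adaptation, the system is a
    relaxation oscillator alternating between Up (r ~ 1) and Down (r ~ 0).
    """
    S = lambda x: 1.0 / (1.0 + np.exp(-x))  # sigmoidal population gain
    n = int(T / dt)
    r = np.zeros(n)
    a = np.zeros(n)
    for k in range(n - 1):
        r[k + 1] = r[k] + dt * (-r[k] + S(w * r[k] - a[k] - theta)) / tau_r
        a[k + 1] = a[k] + dt * (-a[k] + b * r[k]) / tau_a
    return r, a
```

Reducing the gain `w` (the analogue of lowering excitability pharmacologically) shortens the range of `a` over which both states exist, which is the stability-loss effect the abstract describes.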

2.
It is commonly assumed that the spontaneous activity of striatal output neurons is characterized by two-state behavior. This assumption is mainly based on in vivo intracellular recordings under urethane and/or ketamine-xylazine anesthesia showing that striatal neurons oscillate between two preferred membrane potentials: a Down state (hyperpolarized level), resulting from an inwardly rectifying potassium conductance, and an Up state (depolarized level), caused by complex interactions between a barrage of cortical synaptic excitation and voltage-dependent potassium conductances. However, a recent comparative study using different anesthetics showed that striatal neurons can exhibit various shapes of synaptic activity depending on the temporal structure and degree of synchronization of their cortico-striatal afferents. These new data demonstrate that the "classical" Up and Down states are not the only spontaneous activity that can be encountered in striatal neurons in vivo. Rather, we propose that striatal neurons exhibit various synaptic activities and firing patterns depending on the state of vigilance. This hypothesis could be tested in further experiments in which the intracellular activity of striatal neurons is recorded during the natural sleep-wake cycle.

3.
Rhythmic neuronal network activity underlies brain oscillations. To investigate how connected neuronal networks contribute to the emergence of the α-band and to the regulation of Up and Down states, we study a model based on synaptic short-term depression-facilitation with afterhyperpolarization (AHP). We found that the α-band is generated by the network behavior near the attractor of the Up-state. Coupling inhibitory and excitatory networks by reciprocal connections leads to the emergence of a stable α-band during the Up states, as reflected in the spectrogram. To better characterize the emergence and stability of thalamocortical oscillations containing α and δ rhythms during anesthesia, we model the interaction of two excitatory networks with one inhibitory network, showing that this minimal topology underlies the generation of a persistent α-band in the neuronal voltage characterized by dominant Up over Down states. Finally, we show that the emergence of the α-band appears when external inputs are suppressed, while fragmentation occurs at small synaptic noise or with increasing inhibitory inputs. To conclude, α-oscillations could result from the synaptic dynamics of interacting excitatory neuronal networks with and without AHP, a principle that could apply to other rhythms.

4.
Cortical neural networks exhibit high internal variability in spontaneous dynamic activities and they can robustly and reliably respond to external stimuli with multilevel features, from microscopic irregular spiking of neurons to macroscopic oscillatory local field potentials. A comprehensive study integrating these multilevel features in spontaneous and stimulus-evoked dynamics with seemingly distinct mechanisms is still lacking. Here, we study the stimulus-response dynamics of biologically plausible excitation-inhibition (E-I) balanced networks. We confirm that networks around critical synchronous transition states can maintain strong internal variability but are sensitive to external stimuli. In this dynamical region, applying a stimulus to the network can reduce the trial-to-trial variability and shift the network oscillatory frequency while preserving the dynamical criticality. These multilevel features widely observed in different experiments cannot simultaneously occur in non-critical dynamical states. Furthermore, the dynamical mechanisms underlying these multilevel features are revealed using a semi-analytical mean-field theory that derives the macroscopic network field equations from the microscopic neuronal networks, enabling analysis by nonlinear dynamics theory and the linear noise approximation. The generic dynamical principle revealed here contributes to a more integrative understanding of neural systems and brain functions and incorporates multimodal and multilevel experimental observations. The E-I balanced neural network in combination with the effective mean-field theory can serve as a mechanistic modeling framework to study the multilevel neural dynamics underlying neural information and cognitive processes.

5.
During slow-wave sleep, brain electrical activity is dominated by slow (<1 Hz) electroencephalogram (EEG) oscillations characterized by periodic transitions between active (Up) and silent (Down) states in the membrane voltage of cortical and thalamic neurons. The sleep slow oscillation is believed to play a critical role in the consolidation of recent memories. Past computational studies, based on Hodgkin-Huxley-type neuronal models, revealed possible intracellular and network mechanisms of neuronal activity during sleep; however, they did not explore the large-scale cortical network dynamics that depend on collective behavior in large populations of neurons. In this new study, we developed a novel class of reduced discrete-time spiking neuron models for large-scale network simulations of wake and sleep dynamics. In addition to the spiking mechanism, the new model implements nonlinearities capturing the effects of the leak current, the Ca2+ dependent K+ current and the persistent Na+ current, which were found to be critical for transitions between the Up and Down states of the slow oscillation. We applied the new model to study large-scale two-dimensional cortical network activity during slow-wave sleep. Our study explains the traveling-wave dynamics and characteristic synchronization properties of transitions between Up and Down states of the slow oscillation as observed in vivo in recordings from cats. We further predict a critical role of synaptic noise and slow adaptive currents for spike-sequence replay as found during sleep-related memory consolidation.
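The abstract describes its reduced discrete-time spiking models only at a high level. A well-known representative of this model class, shown here purely as an illustration (it is a standard textbook example, not necessarily the authors' model), is the Rulkov map: a two-variable map neuron that produces spiking and bursting without any differential equations.

```python
import numpy as np

def rulkov(n_steps=20000, alpha=4.5, sigma=0.14, mu=0.001):
    """Rulkov map neuron, a two-variable discrete-time spiking model:
    x[n+1] = alpha / (1 + x[n]**2) + y[n]   (fast, spiking variable)
    y[n+1] = y[n] - mu * (x[n] - sigma)     (slow variable)
    For alpha > 4 the map produces chaotic spiking/bursting.
    """
    x = np.empty(n_steps)
    y = np.empty(n_steps)
    x[0], y[0] = -1.0, -3.0
    for n in range(n_steps - 1):
        x[n + 1] = alpha / (1.0 + x[n] ** 2) + y[n]
        y[n + 1] = y[n] - mu * (x[n] - sigma)
    return x, y
```

The appeal of map-based models for large-scale sleep simulations is that each neuron costs only a handful of arithmetic operations per time step, with no stiff ODE integration.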

6.
Randomly connected networks of integrate-and-fire (IF) neurons are known to display asynchronous irregular (AI) activity states, which resemble the discharge activity recorded in the cerebral cortex of awake animals. However, it is not clear whether such activity states are specific to simple IF models, or if they also exist in networks where neurons are endowed with complex intrinsic properties similar to electrophysiological measurements. Here, we investigate the occurrence of AI states in networks of nonlinear IF neurons, such as the adaptive exponential IF (Brette-Gerstner-Izhikevich) model. This model can display intrinsic properties such as low-threshold spiking (LTS), regular spiking (RS) or fast spiking (FS). We successively investigate the oscillatory and AI dynamics of thalamic, cortical and thalamocortical networks using such models. AI states can be found in each case, sometimes with surprisingly small network sizes of the order of a few tens of neurons. We show that the presence of LTS neurons in cortex or in thalamus explains the robust emergence of AI states for relatively small network sizes. Finally, we investigate the role of spike-frequency adaptation (SFA). In cortical networks with strong SFA in RS cells, the AI state is transient, but when SFA is reduced, AI states can be self-sustained for long times. In thalamocortical networks, AI states are found when the cortex is itself in an AI state, but with strong SFA, the thalamocortical network displays Up and Down state transitions, similar to intracellular recordings during slow-wave sleep or anesthesia. Self-sustained Up and Down states could also be generated by two-layer cortical networks with LTS cells. These models suggest that intrinsic properties such as adaptation and low-threshold bursting activity are crucial for the genesis and control of AI states in thalamocortical networks.
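The adaptive exponential IF model named above has a standard published form. The single-neuron sketch below uses the commonly quoted Brette-Gerstner parameter values (an assumption; the paper's exact settings are not given in the abstract) and reproduces the spike-frequency adaptation discussed in the abstract as progressively lengthening interspike intervals.

```python
import numpy as np

def adex(I=1000.0, T=500.0, dt=0.05):
    """Adaptive exponential IF neuron (units: mV, ms, nS, pA, pF),
    forward-Euler integration, commonly quoted parameter set."""
    C, gL, EL = 281.0, 30.0, -70.6
    VT, DT = -50.4, 2.0              # soft threshold and slope factor
    tau_w, a, b = 144.0, 4.0, 80.5   # adaptation time constant and couplings
    Vr, Vpeak = -70.6, 0.0
    V, w = EL, 0.0
    spikes = []
    for k in range(int(T / dt)):
        expo = gL * DT * np.exp(min((V - VT) / DT, 20.0))  # capped to avoid overflow
        dV = dt * (-gL * (V - EL) + expo - w + I) / C
        dw = dt * (a * (V - EL) - w) / tau_w
        V += dV
        w += dw
        if V >= Vpeak:               # spike: reset and increment adaptation
            spikes.append(k * dt)
            V = Vr
            w += b
    return np.array(spikes)
```

Setting `b` small mimics the reduced-SFA regime in which, per the abstract, AI states become self-sustained.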

7.
Up-Down synchronization in neuronal networks refers to spontaneous switches between periods of high collective firing activity (Up state) and periods of silence (Down state). Recent experimental reports have shown that astrocytes can control the emergence of such Up-Down regimes in neural networks, although the molecular or cellular mechanisms that are involved are still uncertain. Here we propose neural network models made of three populations of cells: excitatory neurons, inhibitory neurons and astrocytes, interconnected by synaptic and gliotransmission events, to explore how astrocytes can control this phenomenon. The presence of astrocytes in the models is indeed observed to promote the emergence of Up-Down regimes with realistic characteristics. Our models show that the difference of signalling timescales between astrocytes and neurons (seconds versus milliseconds) can induce a regime where the frequency of gliotransmission events released by the astrocytes does not synchronize with the Up and Down phases of the neurons, but remains essentially stable. However, these gliotransmission events are found to change the localization of the bifurcations in the parameter space so that with the addition of astrocytes, the network enters a bistability region of the dynamics that corresponds to Up-Down synchronization. Taken together, our work provides a theoretical framework to test scenarios and hypotheses on the modulation of Up-Down dynamics by gliotransmission from astrocytes.

8.
Here we explore the possibility that a core function of sensory cortex is the generation of an internal simulation of sensory environment in real-time. A logical elaboration of this idea leads to a dynamical neural architecture that oscillates between two fundamental network states, one driven by external input, and the other by recurrent synaptic drive in the absence of sensory input. Synaptic strength is modified by a proposed synaptic state matching (SSM) process that ensures equivalence of spike statistics between the two network states. Remarkably, SSM, operating locally at individual synapses, generates accurate and stable network-level predictive internal representations, enabling pattern completion and unsupervised feature detection from noisy sensory input. SSM is a biologically plausible substrate for learning and memory because it brings together sequence learning, feature detection, synaptic homeostasis, and network oscillations under a single unifying computational framework.

9.
We explore and analyze the nonlinear switching dynamics of neuronal networks with non-homogeneous connectivity. The general significance of such transient dynamics for brain function is unclear; however, decision-making processes in perception and cognition, for instance, have been linked to them. The network under study here is comprised of three subnetworks of either excitatory or inhibitory leaky integrate-and-fire neurons, of which two are of the same type. The synaptic weights are arranged to establish and maintain a balance between excitation and inhibition in the case of a constant external drive. Each subnetwork is randomly connected, where all neurons belonging to a particular population have the same in-degree and the same out-degree. Neurons in different subnetworks are also randomly connected with the same probability; however, depending on the type of the pre-synaptic neuron, the synaptic weight is scaled by a factor. We observed that for a certain range of the "within" versus "between" connection weights (the bifurcation parameter), the network activation spontaneously switches between the two subnetworks of the same type. This kind of dynamics has been termed "winnerless competition", which here also has a random component. In our model, this phenomenon is well described by a set of coupled stochastic differential equations of Lotka-Volterra type that imply a competition between the subnetworks. The associated mean-field model shows the same dynamical behavior as observed in simulations of large networks comprising thousands of spiking neurons. The deterministic phase portrait is characterized by two attractors and a saddle node; the stochastic component is essentially given by the inherent multiplicative noise of the system. We find that the dwell-time distribution of the active states is exponential, indicating that the noise drives the system randomly from one attractor to the other.
A similar model for a larger number of populations might suggest a general approach to studying the dynamics of interacting populations of spiking networks.
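The Lotka-Volterra description above can be sketched as two competing populations with multiplicative noise. The coefficients and the small baseline drive `lam` below are illustrative assumptions, not values from the paper; with competition strength c > 1, coexistence is unstable and one population "wins" until noise kicks the system toward the other attractor.

```python
import numpy as np

def simulate_lv(sigma, T=200.0, dt=0.01, c=2.0, lam=0.01, seed=0):
    """Competitive Lotka-Volterra pair with multiplicative noise:
    dx_i = [x_i (1 - x_i - c x_j) + lam] dt + sigma x_i dW_i
    For c > 1 the symmetric coexistence point is a saddle, and the two
    single-winner states are attractors (winnerless competition when noisy).
    """
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    x = np.empty((n, 2))
    x[0] = (0.6, 0.4)
    for k in range(n - 1):
        xi = x[k]
        drift = xi * (1.0 - xi - c * xi[::-1]) + lam
        noise = sigma * xi * rng.normal(size=2) * np.sqrt(dt)
        x[k + 1] = np.clip(xi + drift * dt + noise, 0.0, None)  # keep rates nonnegative
    return x
```

With `sigma = 0` the dynamics are deterministic winner-take-all; with `sigma > 0` the multiplicative noise produces the attractor-to-attractor switching described in the abstract.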

10.
Short-term memory in the brain cannot in general be explained, as long-term memory can, by a gradual modification of synaptic weights, since it takes place too quickly. Theories based on some form of cellular bistability, however, do not seem able to account for the fact that noisy neurons can collectively store information in a robust manner. We show how a sufficiently clustered network of simple model neurons can be instantly induced into metastable states capable of retaining information for a short time (a few seconds). The mechanism is robust to different network topologies and kinds of neural model. This could constitute a viable means available to the brain for sensory and/or short-term memory without the need for synaptic learning. Relevant phenomena described by neurobiology and psychology, such as local synchronization of synaptic inputs and power-law statistics of forgetting avalanches, emerge naturally from this mechanism, and we suggest possible experiments to test its viability in more biological settings.

11.
Zero-lag synchronization between distant cortical areas has been observed in a diversity of experimental data sets and between many different regions of the brain. Several computational mechanisms have been proposed to account for such isochronous synchronization in the presence of long conduction delays: of these, the phenomenon of "dynamical relaying", a mechanism that relies on a specific network motif, has proven to be the most robust with respect to parameter mismatch and system noise. Surprisingly, and contrary to a common belief in the community, the common driving motif is an unreliable means of establishing zero-lag synchrony. Although dynamical relaying has been validated in empirical and computational studies, a deeper account of its dynamical mechanisms, and a comparison with dynamics on other motifs, has been lacking. By systematically comparing synchronization on a variety of small motifs, we establish that the presence of a single reciprocally connected pair, a "resonance pair", plays a crucial role in distinguishing those motifs that foster zero-lag synchrony in the presence of conduction delays (such as dynamical relaying) from those that do not (such as the common driving triad). Remarkably, minor structural changes to the common driving motif that incorporate a reciprocal pair recover robust zero-lag synchrony. The findings are observed in computational models of spiking neurons, populations of spiking neurons and neural mass models, and arise whether the oscillatory systems are periodic, chaotic, noise-free or driven by stochastic inputs. The influence of the resonance pair is also robust to parameter mismatch and asymmetrical time delays amongst the elements of the motif. We call this manner of facilitating zero-lag synchrony resonance-induced synchronization, outline the conditions for its occurrence, and propose that it may be a general mechanism to promote zero-lag synchrony in the brain.

12.
The dynamics of networks of sparsely connected excitatory and inhibitory integrate-and-fire neurons are studied analytically. The analysis reveals a rich repertoire of states, including synchronous states in which neurons fire regularly; asynchronous states with stationary global activity and very irregular individual cell activity; and states in which the global activity oscillates but individual cells fire irregularly, typically at rates lower than the global oscillation frequency. The network can switch between these states, provided the external frequency, or the balance between excitation and inhibition, is varied. Two types of network oscillations are observed. In the fast oscillatory state, the network frequency is almost fully controlled by the synaptic time scale. In the slow oscillatory state, the network frequency depends mostly on the membrane time constant. Finite size effects in the asynchronous state are also discussed.

13.
Cellular memory, which allows cells to retain information from their environment, is important for a variety of cellular functions, such as adaptation to external stimuli, cell differentiation, and synaptic plasticity. Although posttranslational modifications have received much attention as a source of cellular memory, the mechanisms directing such alterations have not been fully uncovered. It may be possible to embed memory in multiple stable states in dynamical systems governing modifications. However, several experiments on modifications of proteins suggest long-term relaxation depending on experienced external conditions, without explicit switches over multi-stable states. As an alternative to a multistability memory scheme, we propose “kinetic memory” for epigenetic cellular memory, in which memory is stored as a slow-relaxation process far from a stable fixed state. Information from previous environmental exposure is retained as the long-term maintenance of a cellular state, rather than switches over fixed states. To demonstrate this kinetic memory, we study several models in which multimeric proteins undergo catalytic modifications (e.g., phosphorylation and methylation), and find that a slow relaxation process of the modification state, logarithmic in time, appears when the concentration of a catalyst (enzyme) involved in the modification reactions is lower than that of the substrates. Sharp transitions from a normal fast-relaxation phase into this slow-relaxation phase are revealed, and explained by enzyme-limited competition among modification reactions. The slow-relaxation process is confirmed by simulations of several models of catalytic reactions of protein modifications, and it enables the memorization of external stimuli, as its time course depends crucially on the history of the stimuli. This kinetic memory provides novel insight into a broad class of cellular memory and functions. 
In particular, applications to long-term potentiation are discussed, including dynamic modifications of calcium-calmodulin kinase II and the cAMP-response element-binding protein, which are essential for synaptic plasticity.

14.
Efficient degradation of autophagic vacuoles (AVs) via lysosomes is an important cellular homeostatic process. This is particularly challenging for neurons because mature acidic lysosomes are relatively enriched in the soma. Although dynein-driven retrograde transport of AVs was suggested, a fundamental question remains how autophagosomes generated at distal axons acquire dynein motors for retrograde transport toward the soma. In this paper, we demonstrate that late endosome (LE)–loaded dynein–snapin complexes drive AV retrograde transport in axons upon fusion of autophagosomes with LEs into amphisomes. Blocking the fusion with syntaxin17 knockdown reduced recruitment of dynein motors to AVs, thus immobilizing them in axons. Deficiency in dynein–snapin coupling impaired AV transport, resulting in AV accumulation in neurites and synaptic terminals. Altogether, our study provides the first evidence that autophagosomes recruit dynein through fusion with LEs and reveals a new motor–adaptor sharing mechanism by which neurons may remove distal AVs engulfing aggregated proteins and dysfunctional organelles for efficient degradation in the soma.

15.
In systems biology, questions concerning the molecular and cellular makeup of an organism are of utmost importance, especially when trying to understand how unreliable components, such as genetic circuits, biochemical cascades, and ion channels, enable reliable and adaptive behaviour. The repertoire and speed of biological computations are limited by thermodynamic or metabolic constraints: an example can be found in neurons, where fluctuations in biophysical states limit the information they can encode, and where an estimated 20-60% of the brain's total energy budget is used for signalling, either via action potentials or synaptic transmission. Here, we consider the imperatives for neurons to optimise computational and metabolic efficiency, wherein benefits and costs trade off against each other in the context of self-organised and adaptive behaviour. In particular, we link information-theoretic (variational) and thermodynamic (Helmholtz) free-energy formulations of neuronal processing and show how they are related in a fundamental way through a complexity-minimisation lemma.

16.
When neurons fire action potentials, the dissipation of free energy is usually not considered directly, because the change in free energy is often negligible compared to the immense reservoir stored in neural transmembrane ion gradients, and because the long-term energy requirements are met through chemical energy, i.e., metabolism. However, these gradients can temporarily nearly vanish in neurological diseases, such as migraine and stroke, and in traumatic brain injury ranging from concussions to severe injuries. We study biophysical neuron models based on the Hodgkin-Huxley (HH) formalism, extended to include time-dependent ion concentrations inside and outside the cell and metabolically driven ion pumps. We reveal the basic mechanism of a state of free-energy starvation (FES) with bifurcation analyses showing that, over a large range of pump rates, the ion dynamics is bistable when the cell is not in contact with an ion bath. We interpret this as a threshold reduction in a new fundamental mechanism of ionic excitability that causes a long-lasting but transient FES, as observed in pathological states. In particular, we conclude that coupling the extracellular ion concentrations to a large glial-vascular bath can act as an inhibitory mechanism crucial for ion homeostasis, whereas the pumps alone are insufficient to recover from FES. Our results provide the missing link between the HH formalism and the activator-inhibitor models that have been used successfully to model migraine phenotypes, and will therefore allow us to test the hypothesis that migraine symptoms are explained by disturbed function of ion-channel subunits, pumps, and other proteins that regulate ion homeostasis.

17.
Cortical neurons are bistable; as a consequence, their local field potentials can fluctuate between quiescent and active states, generating slow (<1 Hz) oscillations widely known as transitions between Up and Down states. Despite a large number of studies on Up-Down transitions, deciphering their nature, mechanisms and function remains a challenging task. In this paper we focus on recent experimental evidence showing that a class of spontaneous oscillations can emerge within the Up states. In particular, a non-trivial peak around [Formula: see text] Hz appears in their associated power spectra, which enhances the activity power at higher frequencies (in the [Formula: see text] Hz band). Moreover, this rhythm within Ups seems to be an emergent or collective phenomenon, given that individual neurons do not lock to it and remain mostly unsynchronized. Remarkably, similar oscillations (and the concomitant peak in the spectrum) do not appear in the Down states. Here we shed light on these findings by using different computational models of cortical network dynamics with different levels of physiological complexity. Our conclusion, supported by both theory and simulations, is that the collective phenomenon of "stochastic amplification of fluctuations", previously described in other contexts such as ecology and epidemiology, explains in an elegant and parsimonious manner, beyond model-dependent details, this extra rhythm emerging only in the Up states and not in the Downs.
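Stochastic amplification of fluctuations can be demonstrated in its simplest form: a linear system whose deterministic dynamics decay to a stable fixed point (no limit cycle) nonetheless shows a sharp spectral peak at its resonant frequency when driven by white noise, a "quasicycle". The 10 Hz resonance below is an arbitrary illustrative choice, not a value from the paper.

```python
import numpy as np

def quasicycle_spectrum(f0=10.0, gamma=0.5, sigma=1.0,
                        dt=1e-3, n_seg=50, seg_len=2000, seed=1):
    """Noise-driven damped oscillator x'' + 2*gamma*x' + w0^2 x = sigma*xi(t).
    Deterministically the origin is a stable focus, yet the segment-averaged
    periodogram of x peaks near f0: a noise-sustained quasicycle."""
    rng = np.random.default_rng(seed)
    w0 = 2 * np.pi * f0
    x, v = 0.0, 0.0
    xs = np.empty(n_seg * seg_len)
    for k in range(xs.size):  # semi-implicit Euler-Maruyama
        x += dt * v
        v += dt * (-2 * gamma * v - w0 ** 2 * x) + sigma * np.sqrt(dt) * rng.normal()
        xs[k] = x
    segs = xs.reshape(n_seg, seg_len)
    psd = np.mean(np.abs(np.fft.rfft(segs, axis=1)) ** 2, axis=0)  # averaged periodogram
    freqs = np.fft.rfftfreq(seg_len, dt)
    return freqs, psd
```

Removing the noise drive makes the oscillation die out entirely, which mirrors the abstract's point: the rhythm exists only because fluctuations are selectively amplified near the resonance.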

18.
Discriminative touch relies on afferent information carried to the central nervous system by action potentials (spikes) in ensembles of primary afferents bundled in peripheral nerves. These sensory quanta are first processed by the cuneate nucleus before the afferent information is transmitted to brain networks serving specific perceptual and sensorimotor functions. Here we report data on the integration of primary afferent synaptic inputs obtained with in vivo whole-cell patch-clamp recordings from the neurons of this nucleus. We find that synaptic integration in individual cuneate neurons is dominated by 4-8 primary afferent inputs with large synaptic weights. In a simulation we show that an arrangement with a low number of primary afferent inputs can maximize the transfer, across the cuneate nucleus, of information encoded in the spatiotemporal patterns of spikes generated when a human fingertip contacts objects. Hence, the observed distributions of synaptic weights support high-fidelity transfer of signals from ensembles of tactile afferents. Various anatomical estimates suggest that a cuneate neuron may receive hundreds of primary afferents rather than 4-8. Therefore, we discuss the possibility that adaptation of the synaptic weight distribution, possibly involving silent synapses, may function to maximize information transfer in somatosensory pathways.

19.
Tracking moving objects, including one's own body, is a fundamental ability of higher organisms, playing a central role in many perceptual and motor tasks. While it is unknown how the brain learns to follow and predict the dynamics of objects, it is known that this process of state estimation can be learned purely from the statistics of noisy observations. When the dynamics are simply linear with additive Gaussian noise, the optimal solution is the well-known Kalman filter (KF), the parameters of which can be learned via latent-variable density estimation (the EM algorithm). The brain does not, however, directly manipulate matrices and vectors, but instead appears to represent probability distributions with the firing rates of populations of neurons, "probabilistic population codes." We show that a recurrent neural network, a modified form of an exponential family harmonium (EFH), that takes a linear probabilistic population code as input can learn, without supervision, to estimate the state of a linear dynamical system. After observing a series of population responses (spike counts) to the position of a moving object, the network learns to represent the velocity of the object and forms nearly optimal predictions about the position at the next time step. This result builds on our previous work showing that a similar network can learn to perform multisensory integration and coordinate transformations for static stimuli. The receptive fields of the trained network also make qualitative predictions about the developing and learning brain: tuning gradually emerges for higher-order dynamical states not explicitly present in the inputs, appearing as delayed tuning for the lower-order states.
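The Kalman filter benchmark mentioned above is standard. This sketch (with assumed noise levels, not the paper's) tracks a constant-velocity state from noisy position observations and shows the filtered position estimate beating the raw observations.

```python
import numpy as np

def run_kf(T=200, dt=0.1, q=0.05, r=1.0, seed=0):
    """Kalman filter for a constant-velocity model observed in position.
    State s = [pos, vel]; observation y = pos + N(0, r^2)."""
    rng = np.random.default_rng(seed)
    A = np.array([[1.0, dt], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])
    Q = q**2 * np.array([[dt**3 / 3, dt**2 / 2],
                         [dt**2 / 2, dt]])       # process-noise covariance
    R = np.array([[r**2]])
    # simulate ground truth and observations
    s = np.array([0.0, 1.0])
    truth, obs = [], []
    for _ in range(T):
        s = A @ s + rng.multivariate_normal(np.zeros(2), Q)
        truth.append(s.copy())
        obs.append(H @ s + rng.normal(0.0, r, 1))
    # filter
    m, P = np.zeros(2), np.eye(2)
    est = []
    for y in obs:
        m = A @ m                                 # predict
        P = A @ P @ A.T + Q
        S = H @ P @ H.T + R                       # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
        m = m + K @ (y - H @ m)                   # update
        P = (np.eye(2) - K @ H) @ P
        est.append(m.copy())
    truth, est, obs = np.array(truth), np.array(est), np.array(obs)
    rmse_filt = np.sqrt(np.mean((est[:, 0] - truth[:, 0]) ** 2))
    rmse_obs = np.sqrt(np.mean((obs[:, 0] - truth[:, 0]) ** 2))
    return rmse_filt, rmse_obs
```

In the paper's setting, the network is not given A, H, Q, or R; the point of the EM comparison is that these parameters can be learned from observations alone.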

20.
A fundamental issue in neuroscience is how to identify the multiple biophysical mechanisms through which neurons generate observed patterns of spiking activity. In previous work, we proposed a method for linking observed patterns of spiking activity to specific biophysical mechanisms based on a state-space modeling framework and a sequential Monte Carlo, or particle filter, estimation algorithm. We have shown, in simulation, that this approach is able to identify a space of simple biophysical models that are consistent with observed spiking data (and that include the model that generated the data), but we had yet to demonstrate the application of the method to identify realistic currents from real spike-train data. Here, we apply the particle filter to spiking data recorded from rat layer V cortical neurons, and correctly identify the dynamics of a slow, intrinsic current. The underlying intrinsic current is successfully identified in four distinct neurons, even though the cells exhibit two distinct classes of spiking activity: regular spiking and bursting. This approach, linking statistical, computational, and experimental neuroscience, provides an effective technique to constrain detailed biophysical models to specific mechanisms consistent with observed spike-train data.
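A minimal bootstrap particle filter of the kind this method builds on can be sketched as follows. The AR(1) latent dynamics and noise levels are illustrative stand-ins for the paper's biophysical state-space model (which tracks a slow intrinsic current), not the authors' actual model.

```python
import numpy as np

def bootstrap_pf(n_particles=500, T=300, a=0.95, sq=0.3, so=1.0, seed=2):
    """Bootstrap particle filter for the toy state-space model
    x_t = a*x_{t-1} + N(0, sq^2),  y_t = x_t + N(0, so^2).
    Returns RMSE of the filter's posterior mean and of the raw
    observations, both measured against the hidden state."""
    rng = np.random.default_rng(seed)
    # simulate hidden state and observations
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = a * x[t - 1] + rng.normal(0.0, sq)
    y = x + rng.normal(0.0, so, T)
    # filter
    parts = rng.normal(0.0, 1.0, n_particles)
    est = np.zeros(T)
    for t in range(T):
        parts = a * parts + rng.normal(0.0, sq, n_particles)  # propagate
        logw = -0.5 * ((y[t] - parts) / so) ** 2              # Gaussian log-likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        est[t] = np.sum(w * parts)                            # posterior mean
        parts = rng.choice(parts, size=n_particles, p=w)      # resample
    rmse_pf = np.sqrt(np.mean((est - x) ** 2))
    rmse_obs = np.sqrt(np.mean((y - x) ** 2))
    return rmse_pf, rmse_obs
```

The same propagate-weight-resample loop applies when the latent state is a conductance-based neuron model and the observations are spike times; only the transition and likelihood functions change.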
