Similar articles
20 similar articles found (search time: 484 ms)
1.
Voltage-gated ion channels in neuronal membranes fluctuate randomly between different conformational states due to thermal agitation. Fluctuations between conducting and nonconducting states give rise to noisy membrane currents and subthreshold voltage fluctuations and may contribute to variability in spike timing. Here we study subthreshold voltage fluctuations due to active voltage-gated Na+ and K+ channels as predicted by two commonly used kinetic schemes: the Mainen et al. (1995) (MJHS) kinetic scheme, which has been used to model dendritic channels in cortical neurons, and the classical Hodgkin-Huxley (1952) (HH) kinetic scheme for the squid giant axon. We compute the magnitudes, amplitude distributions, and power spectral densities of the voltage noise in isopotential membrane patches predicted by these kinetic schemes. For both schemes, noise magnitudes increase rapidly with depolarization from rest. Noise is larger for smaller patch areas but is smaller for increased model temperatures. We contrast the results from Monte Carlo simulations of the stochastic nonlinear kinetic schemes with analytical, closed-form expressions derived using passive and quasi-active linear approximations to the kinetic schemes. For all subthreshold voltage ranges, the quasi-active linearized approximation is accurate within 8% and may thus be used in large-scale simulations of realistic neuronal geometries.  相似文献   
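The Monte Carlo side of such a comparison can be sketched generically. Below is a minimal simulation of a population of two-state (open/closed) channels; the rates, patch size, and time step are illustrative placeholders, not the MJHS or HH kinetic parameters, and the full schemes have more states than this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-state channel: opening rate alpha, closing rate beta (1/ms).
alpha, beta = 0.5, 1.5
N = 10_000            # channels in the patch
dt = 0.01             # ms
steps = 50_000

p_open = alpha / (alpha + beta)        # stationary open probability
n_open = int(N * p_open)               # start at the stationary mean
trace = np.empty(steps)
for t in range(steps):
    # Each closed channel opens with prob. alpha*dt; each open one closes with beta*dt.
    n_open += rng.binomial(N - n_open, alpha * dt) - rng.binomial(n_open, beta * dt)
    trace[t] = n_open

# Stationary channel-number fluctuations should match the binomial prediction
# N*p*(1-p); the current noise is this variance times the squared unitary current.
var_mc = trace[5_000:].var()
var_th = N * p_open * (1 - p_open)
print(var_mc / var_th)
```

Shrinking `N` raises the *relative* noise, which is the patch-area effect the abstract describes.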

2.
High-frequency oscillations (above 30 Hz) have been observed in sensory and higher-order brain areas, and are believed to constitute a general hallmark of functional neuronal activation. Fast inhibition in interneuronal networks has been suggested as a general mechanism for the generation of high-frequency oscillations. Certain classes of interneurons exhibit subthreshold oscillations, but the effect of this intrinsic neuronal property on the population rhythm is not completely understood. We study the influence of intrinsic damped subthreshold oscillations in the emergence of collective high-frequency oscillations, and elucidate the dynamical mechanisms that underlie this phenomenon. We simulate neuronal networks composed of either Integrate-and-Fire (IF) or Generalized Integrate-and-Fire (GIF) neurons. The IF model displays purely passive subthreshold dynamics, while the GIF model exhibits subthreshold damped oscillations. Individual neurons receive inhibitory synaptic currents mediated by spiking activity in their neighbors as well as noisy synaptic bombardment, and fire irregularly at a lower rate than population frequency. We identify three factors that affect the influence of single-neuron properties on synchronization mediated by inhibition: i) the firing rate response to the noisy background input, ii) the membrane potential distribution, and iii) the shape of Inhibitory Post-Synaptic Potentials (IPSPs). For hyperpolarizing inhibition, the GIF IPSP profile (factor iii)) exhibits post-inhibitory rebound, which induces a coherent spike-mediated depolarization across cells that greatly facilitates synchronous oscillations. This effect dominates the network dynamics, hence GIF networks display stronger oscillations than IF networks. However, the restorative current in the GIF neuron lowers firing rates and narrows the membrane potential distribution (factors i) and ii), respectively), which tend to decrease synchrony. 
If inhibition is shunting instead of hyperpolarizing, post-inhibitory rebound is not elicited and factors i) and ii) dominate, yielding lower synchrony in GIF networks than in IF networks.  相似文献   
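The post-inhibitory rebound that distinguishes the GIF from the IF response (factor iii) can be illustrated with a toy subthreshold simulation: a passive leak plus, for the GIF, a slow restorative current that produces damped oscillations. All constants below are assumptions for illustration, not the paper's fitted values.

```python
import numpy as np

def simulate(gif, dt=0.1, t_end=300.0):
    """Subthreshold response (rest = 0) to a 10 ms hyperpolarizing pulse."""
    g_L, a, tau_w = 0.1, 0.2, 20.0      # leak, coupling, slow-current time const.
    V, w = 0.0, 0.0
    out = np.empty(int(t_end / dt))
    for i in range(out.size):
        I = -1.0 if i * dt < 10.0 else 0.0
        dV = -g_L * V - (w if gif else 0.0) + I
        if gif:
            w += dt * (a * V - w) / tau_w   # restorative current -> damped STOs
        V += dt * dV
        out[i] = V
    return out

v_gif, v_if = simulate(True), simulate(False)
after = int(10.0 / 0.1)                      # first sample after pulse offset
print(v_gif[after:].max(), v_if[after:].max())
```

After the pulse the GIF voltage overshoots rest (rebound depolarization), while the purely passive IF voltage relaxes back monotonically and never crosses it.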

3.
Due to the limitations of current voltage sensing techniques, optimal filtering of noisy, undersampled voltage signals on dendritic trees is a key problem in computational cellular neuroscience. These limitations lead to voltage data that is incomplete (in the sense of only capturing a small portion of the full spatiotemporal signal) and often highly noisy. In this paper we use a Kalman filtering framework to develop optimal experimental design methods for voltage sampling. Our approach is to use a simple greedy algorithm with lazy evaluation to minimize the expected square error of the estimated spatiotemporal voltage signal. We take advantage of some particular features of the dendritic filtering problem to efficiently calculate the Kalman estimator’s covariance. We test our framework with simulations of real dendritic branching structures and compare the quality of both time-invariant and time-varying sampling schemes. While the benefit of using the experimental design methods was modest in the time-invariant case, improvements of 25–100% over more naïve methods were found when the observation locations were allowed to change with time. We also present a heuristic approximation to the greedy algorithm that is an order of magnitude faster while still providing comparable results.
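The greedy-with-lazy-evaluation idea is generic: marginal gains of a diminishing-returns objective are kept in a priority queue and re-evaluated only when an entry surfaces at the top. A self-contained sketch on a toy set-coverage objective (the data and names here are invented, not the paper's voltage-sampling objective):

```python
import heapq

def lazy_greedy(candidates, gain, k):
    """Greedy selection with lazy evaluation: a candidate's marginal gain is
    recomputed only when it reaches the top of the heap, which is valid
    whenever gains are diminishing (submodular objectives)."""
    chosen = []
    heap = [(-gain(c, chosen), 0, c) for c in candidates]
    heapq.heapify(heap)
    while heap and len(chosen) < k:
        neg_g, rnd, c = heapq.heappop(heap)
        if rnd == len(chosen):                 # gain still up to date: take it
            chosen.append(c)
        else:                                  # stale: recompute and push back
            heapq.heappush(heap, (-gain(c, chosen), len(chosen), c))
    return chosen

# Toy submodular objective: set coverage.
sets = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6, 7}, "d": {1, 7}}
covered = lambda sel: set().union(*(sets[s] for s in sel))
gain = lambda c, sel: len(sets[c] - covered(sel))
print(lazy_greedy(sets, gain, 2))
```

The payoff is that most candidates are never re-evaluated after the first round, which is what makes greedy design tractable on large dendritic trees.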

4.
The classical cable equation, in which membrane conductance is considered constant, is modified by including the linearized effect of membrane potential on sodium and potassium ionic currents, as formulated in the Hodgkin-Huxley equations for the squid giant axon. The resulting partial differential equation is solved by numerical inversion of the Laplace transform of the voltage response to current and voltage inputs. The voltage response is computed for voltage step, current step, and current pulse inputs, and the effect of temperature on the response to a current step input is also calculated. The validity of the linearized approximation is examined by comparing the linearized response to a current step input with the solution of the nonlinear partial differential cable equation for various subthreshold current step inputs. All the computed responses for the squid giant axon show oscillatory behavior and depart significantly from what is predicted on the basis of the classical cable equation. The linearization procedure, coupled with numerical inversion of the Laplace transform, proves to be a convenient approach which predicts at least qualitatively the subthreshold behavior of the nonlinear system.

5.
Hodgkin–Huxley (HH) models of neuronal membrane dynamics consist of a set of nonlinear differential equations that describe the time-varying conductance of various ion channels. Using observations of voltage alone we show how to estimate the unknown parameters and unobserved state variables of an HH model in the expected circumstance that the measurements are noisy, the model has errors, and the state of the neuron is not known when observations commence. The joint probability distribution of the observed membrane voltage and the unobserved state variables and parameters of these models is a path integral through the model state space. The solution to this integral allows estimation of the parameters and thus a characterization of many biological properties of interest, including channel complement and density, that give rise to a neuron’s electrophysiological behavior. This paper describes a method for directly evaluating the path integral using a Monte Carlo numerical approach. This provides estimates not only of the expected values of model parameters but also of their posterior uncertainty. Using test data simulated from neuronal models comprising several common channels, we show that short (<50 ms) intracellular recordings from neurons stimulated with a complex time-varying current yield accurate and precise estimates of the model parameters as well as accurate predictions of the future behavior of the neuron. We also show that this method is robust to errors in model specification, supporting model development for biological preparations in which the channel expression and other biophysical properties of the neurons are not fully known.  相似文献   
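The Monte Carlo evaluation of a parameter posterior can be illustrated on a far simpler stand-in problem. The single-parameter decay model and random-walk Metropolis sampler below are assumptions for illustration only, not the paper's path-integral formulation over full HH state space.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in: infer the decay rate k of V' = -k*V from noisy voltage samples.
k_true, sigma = 0.5, 0.05
t = np.linspace(0.0, 5.0, 50)
data = np.exp(-k_true * t) + sigma * rng.normal(size=t.size)

def log_post(k):
    """Gaussian log-likelihood with a flat prior on k > 0."""
    if k <= 0:
        return -np.inf
    resid = data - np.exp(-k * t)
    return -0.5 * np.sum(resid**2) / sigma**2

samples, k = [], 1.0
lp = log_post(k)
for _ in range(20_000):
    prop = k + 0.05 * rng.normal()          # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
        k, lp = prop, lp_prop
    samples.append(k)

est = float(np.mean(samples[5_000:]))        # posterior mean after burn-in
print(est)
```

The spread of `samples` after burn-in is an estimate of the posterior uncertainty, which is the kind of output the paper's method provides for the full HH parameter set.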

6.
An analytical approach is presented for determining the response of a neuron or of the activity in a network of connected neurons, represented by systems of nonlinear ordinary stochastic differential equations—the FitzHugh-Nagumo system with Gaussian white noise current. For a single neuron, five equations hold for the first- and second-order central moments of the voltage and recovery variables. From this system we obtain, under certain assumptions, five differential equations for the means, variances, and covariance of the two components. One may use these quantities to estimate the probability that a neuron is emitting an action potential at any given time. The differential equations are solved by numerical methods. We also perform simulations on the stochastic FitzHugh-Nagumo system and compare the results with those obtained from the differential equations for both sustained and intermittent deterministic current inputs with superimposed noise. For intermittent currents, which mimic synaptic input, the agreement between the analytical and simulation results for the moments is excellent. For sustained input, the analytical approximations perform well for small noise as there is excellent agreement for the moments. In addition, the probability that a neuron is spiking as obtained from the empirical distribution of the potential in the simulations gives a result almost identical to that obtained using the analytical approach. However, when there is sustained large-amplitude noise, the analytical method is only accurate for short time intervals. Using the simulation method, we study the distribution of the interspike interval directly from simulated sample paths. We confirm that noise extends the range of input currents over which (nonperiodic) spike trains may exist and investigate the dependence of such firing on the magnitude of the mean input current and the noise amplitude.
For networks we find the differential equations for the means, variances, and covariances of the voltage and recovery variables and show how solving them leads to an expression for the probability that a given neuron, or given set of neurons, is firing at time t. Using such expressions one may implement dynamical rules for changing synaptic strengths directly without sampling. The present analytical method applies equally well to temporally nonhomogeneous input currents and is expected to be useful for computational studies of information processing in various nervous system centers.  相似文献   
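The simulation side of such a comparison is typically an Euler-Maruyama integration of the stochastic FitzHugh-Nagumo system. A minimal sketch (classic textbook parameters, small additive noise on the voltage only) that checks the ensemble mean against the deterministic solution, which is the small-noise regime where the moment equations work best:

```python
import numpy as np

rng = np.random.default_rng(2)

# Euler-Maruyama for the noisy FitzHugh-Nagumo system:
#   dv = (v - v^3/3 - w + I) dt + eps dW,   dw = phi (v + a - b w) dt
a, b, phi, I, eps = 0.7, 0.8, 0.08, 0.0, 0.02
dt, steps, trials = 0.01, 5_000, 200

v = np.full(trials, -1.0)
w = np.zeros(trials)
vd, wd = -1.0, 0.0            # deterministic reference trajectory (eps = 0)
max_gap = 0.0
for _ in range(steps):
    dW = rng.normal(scale=np.sqrt(dt), size=trials)
    v, w = (v + dt * (v - v**3 / 3 - w + I) + eps * dW,
            w + dt * phi * (v + a - b * w))
    vd, wd = (vd + dt * (vd - vd**3 / 3 - wd + I),
              wd + dt * phi * (vd + a - b * wd))
    max_gap = max(max_gap, abs(v.mean() - vd))

print(max_gap)
```

For this small noise amplitude the ensemble mean stays close to the deterministic path; for large sustained noise that agreement degrades, as the abstract reports.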

7.
Medial entorhinal cortex layer II stellate cells display subthreshold oscillations (STOs). We study a single-compartment biophysical model of such cells which qualitatively reproduces these STOs. We argue that in the subthreshold interval (STI) the seven-dimensional model can be reduced to a three-dimensional system of equations with well-differentiated time scales. Using dynamical systems arguments we provide a mechanism for the generation of STOs. This mechanism is based on the “canard structure,” in which relevant trajectories stay close to repelling manifolds for a significant interval of time. We also show that the transition from subthreshold oscillatory activity to spiking (“canard explosion”) is controlled in the STI by the same structure. A similar mechanism is invoked to explain why noise increases the robustness of the STO regime. Taking advantage of the reduction of the dimensionality of the full stellate cell system, we propose a nonlinear artificially spiking (NAS) model in which the STI reduced system is supplemented with a threshold for spiking and a reset voltage. We show that the synchronization properties in networks made up of the NAS cells are similar to those of networks using the full stellate cell models. In memory of Angel A. Alonso

8.
9.
 The theory of optimal foraging predicts abrupt changes in consumer behavior which lead to discontinuities in the functional response. Therefore population dynamical models with optimal foraging behavior can be appropriately described by differential equations with discontinuous right-hand sides. In this paper we analyze the behavior of three different Lotka–Volterra predator–prey systems with optimal foraging behavior. We examine a predator–prey model with alternative food, a two-patch model with mobile predators and resident prey, and a two-patch model with both predators and prey mobile. We show that in the studied examples, optimal foraging behavior changes the neutral stability intrinsic to Lotka–Volterra systems to the existence of a bounded global attractor. The analysis is based on the construction and use of appropriate Lyapunov functions for models described by discontinuous differential equations. Received: 23 March 1999  相似文献   

10.
Optimal filtering of noisy voltage signals on dendritic trees is a key problem in computational cellular neuroscience. However, the state variable in this problem—the vector of voltages at every compartment—is very high-dimensional: realistic multicompartmental models often have on the order of N = 10^4 compartments. Standard implementations of the Kalman filter require O(N^3) time and O(N^2) space, and are therefore impractical. Here we take advantage of three special features of the dendritic filtering problem to construct an efficient filter: (1) dendritic dynamics are governed by a cable equation on a tree, which may be solved using sparse matrix methods in O(N) time; and current methods for observing dendritic voltage (2) provide low-SNR observations and (3) only image a relatively small number of compartments at a time. The idea is to approximate the Kalman equations in terms of a low-rank perturbation of the steady-state (zero-SNR) solution, which may be obtained in O(N) time using methods that exploit the sparse tree structure of dendritic dynamics. The resulting methods give a very good approximation to the exact Kalman solution, but only require O(N) time and space. We illustrate the method with applications to real and simulated dendritic branching structures, and describe how to extend the techniques to incorporate spatially subsampled, temporally filtered, and nonlinearly transformed observations.
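For an unbranched cable the O(N) sparse solve reduces to the Thomas algorithm for tridiagonal systems; real dendrites need its generalization to trees. A self-contained sketch with arbitrary matrix values, verified against a dense solve:

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system in O(N): a = sub-, b = main, c = super-diagonal
    (a[0] and c[-1] are unused)."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

n = 1000
a = np.full(n, -1.0); b = np.full(n, 3.0); c = np.full(n, -1.0)
d = np.ones(n)
x = thomas(a, b, c, d)

# Check against the dense system (only feasible at this toy size).
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(A @ x, d))
```

The same elimination idea works on a tree by processing compartments from the leaves toward the root, which is what keeps the filter linear in N.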

11.
Neurons generate spikes reliably with millisecond precision if driven by a fluctuating current—is it then possible to predict the spike timing knowing the input? We determined parameters of an adapting threshold model using data recorded in vitro from 24 layer 5 pyramidal neurons from rat somatosensory cortex, stimulated intracellularly by a fluctuating current simulating synaptic bombardment in vivo. The model generates output spikes whenever the membrane voltage (a filtered version of the input current) reaches a dynamic threshold. We find that for input currents with large fluctuation amplitude, up to 75% of the spike times can be predicted with a precision of ±2 ms. Some of the intrinsic neuronal unreliability can be accounted for by a noisy threshold mechanism. Our results suggest that, under random current injection into the soma, (i) neuronal behavior in the subthreshold regime can be well approximated by a simple linear filter; and (ii) most of the nonlinearities are captured by a simple threshold process.  相似文献   
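A minimal adapting-threshold generator in the spirit described here: the voltage is a leaky filter of the input current, and a spike is emitted whenever it crosses a threshold that jumps after each spike and decays back to baseline. All constants are illustrative assumptions, not the fitted model from the recordings.

```python
import numpy as np

dt, tau_m, tau_th = 0.1, 10.0, 30.0   # ms; illustrative time constants
th0, th_jump = 1.0, 0.5               # baseline threshold and post-spike jump

def run(current):
    V, th, spikes = 0.0, th0, []
    for i, I in enumerate(current):
        V += dt * (-V / tau_m + I)        # leaky filter of the input current
        th += dt * (th0 - th) / tau_th    # threshold relaxes to baseline
        if V >= th:
            spikes.append(i * dt)
            th += th_jump                 # adaptation: raise the threshold
            V = 0.0                       # reset
    return spikes

spikes = run(np.full(2000, 0.25))         # 200 ms of constant drive
print(len(spikes))
```

Under constant drive the interspike intervals lengthen as the threshold accumulates, reproducing spike-frequency adaptation; a noisy threshold term (omitted here) would account for the residual unreliability the abstract mentions.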

12.
We present fast methods for filtering voltage measurements and performing optimal inference of the location and strength of synaptic connections in large dendritic trees. Given noisy, subsampled voltage observations we develop fast ℓ1-penalized regression methods for Kalman state-space models of the neuron voltage dynamics. The value of the ℓ1-penalty parameter is chosen using cross-validation or, for low signal-to-noise ratio, a Mallows’ Cp-like criterion. Using low-rank approximations, we reduce the inference runtime from cubic to linear in the number of dendritic compartments. We also present an alternative, fully Bayesian approach to the inference problem using a spike-and-slab prior. We illustrate our results with simulations on toy and real neuronal geometries. We consider observation schemes that either scan the dendritic geometry uniformly or measure linear combinations of voltages across several locations with random coefficients. For the latter, we show how to choose the coefficients to offset the correlation between successive measurements imposed by the neuron dynamics. This results in a “compressed sensing” observation scheme, with an important reduction in the number of measurements required to infer the synaptic weights.

13.
The generation of spiking resonances in neurons (preferred spiking responses to oscillatory inputs) requires the interplay of the intrinsic ionic currents that operate at the subthreshold voltage level and the spiking mechanisms. Combinations of the same types of ionic currents in different parameter regimes may give rise to different types of nonlinearities in the voltage equation (e.g., parabolic- and cubic-like), generating subthreshold (membrane potential) oscillation patterns with different properties. These nonlinearities are not apparent in the model equations, but can be uncovered by plotting the voltage nullclines in the phase-plane diagram. We investigate the spiking resonant properties of conductance-based models that are biophysically equivalent at the subthreshold level (same ionic currents), but dynamically different (parabolic- and cubic-like voltage nullclines). As a case study we consider a model having a persistent sodium and a hyperpolarization-activated (h-) current, which exhibits subthreshold resonance in the theta frequency band. We unfold the concept of spiking resonance into evoked and output spiking resonance. The former focuses on the input frequencies that are able to generate spikes, while the latter focuses on the output spiking frequencies regardless of the input frequency that generated these spikes. A cell can exhibit one or both types of resonances. We also measure spiking phasonance, which is an extension of subthreshold phasonance (zero-phase-shift response to oscillatory inputs) to the spiking regime. The subthreshold resonant properties of both types of models are communicated to the spiking regime for low enough input amplitudes as the voltage response for the subthreshold resonant frequency band rises above threshold.
For higher input amplitudes evoked spiking resonance is no longer present in these models, but output spiking resonance is present primarily in the parabolic-like model due to a cycle skipping mechanism (involving mixed-mode oscillations), while the cubic-like model shows a better 1:1 entrainment. We use dynamical systems tools to explain the underlying mechanisms and the mechanistic differences between the resonance types. Our results demonstrate that the effective time scales that operate at the subthreshold regime to generate intrinsic subthreshold oscillations, mixed-mode oscillations and subthreshold resonance do not necessarily determine the existence of a preferred spiking response to oscillatory inputs in the same frequency band. The results discussed in this paper highlight both the complexity of the suprathreshold responses to oscillatory inputs in neurons having resonant and amplifying currents with different time scales and the fact that the identity of the participating ionic currents is not enough to predict the resulting patterns, but additional dynamic information, captured by the geometric properties of the phase-space diagram, is needed.  相似文献   

14.
Huber MT, Braun HA. Biosystems 2007; 89(1–3): 38–43.
Biological systems are notoriously noisy. Noise, therefore, also plays an important role in many models of neural impulse generation. Noise is not only introduced for more realistic simulations but also to account for cooperative effects between noisy and nonlinear dynamics. Often, this is achieved by a simple noise term in the membrane equation (current noise). However, there are ongoing discussions whether such current noise is justified or whether rather conductance noise should be introduced because it is closer to the natural origin of noise. Therefore, we have compared the effects of current and conductance noise in a neuronal model for subthreshold oscillations and action potential generation. We did not see any significant differences in the model behavior with respect to voltage traces, tuning curves of interspike intervals, interval distributions or frequency responses when the noise strength is adjusted. These findings indicate that simple current noise can give reasonable results in neuronal simulations with regard to physiological relevant noise effects.  相似文献   
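The comparison can be sketched on a passive membrane by scaling the conductance noise by the driving force at rest, so that the two noise sources have matched effective amplitude. Parameters below are illustrative, not those of the subthreshold-oscillation model in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

E_L, E_syn, g_L, C = -65.0, 0.0, 0.1, 1.0    # illustrative membrane constants
dt, steps, burn = 0.05, 200_000, 20_000
sigma_I = 0.5
sigma_g = sigma_I / (E_syn - E_L)            # match amplitudes at rest

def run(conductance_noise):
    V = E_L
    trace = np.empty(steps)
    for i in range(steps):
        xi = rng.normal() / np.sqrt(dt)      # white noise sample
        if conductance_noise:
            I_noise = sigma_g * xi * (E_syn - V)   # scales with driving force
        else:
            I_noise = sigma_I * xi                 # plain current noise
        V += dt * (-g_L * (V - E_L) + I_noise) / C
        trace[i] = V
    return trace

var_I = run(False)[burn:].var()
var_g = run(True)[burn:].var()
print(var_g / var_I)
```

With the amplitudes matched this way, the subthreshold voltage variance produced by the two noise types is nearly the same, consistent with the paper's finding that adjusted current noise gives reasonable results.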

15.
Direct measurements of deep-brain and body-core temperature were performed on rats to determine the influence of cerebral blood flow (CBF) on brain temperature regulation under static and dynamic conditions. Static changes of CBF were achieved using different anesthetics (chloral hydrate, CH; α-chloralose, αCS; and isoflurane, IF) with αCS causing larger decreases in CBF than CH and IF; dynamic changes were achieved by inducing transient hypercapnia (5% CO2 in 40% O2 and 55% N2). Initial deep-brain/body-core temperature differentials were anesthetic-type dependent with the largest differential observed with rats under αCS anesthesia (ca. 2°C). Hypercapnia induction raised rat brain temperature under all three anesthesia regimes, but by different anesthetic-dependent amounts correlated with the initial differentials—αCS anesthesia resulted in the largest brain temperature increase (0.32 ± 0.08°C), while CH and IF anesthesia lead to smaller increases (0.12 ± 0.03 and 0.16 ± 0.05°C, respectively). The characteristic temperature transition time for the hypercapnia-induced temperature increase was 2–3 min under CH and IF anesthesia and ~4 min under αCS anesthesia. We conclude that both, the deep-brain/body-core temperature differential and the characteristic temperature transition time correlate with CBF: a lower CBF promotes higher deep-brain/body-core temperature differentials and, upon hypercapnia challenge, longer characteristic transition times to increased temperatures.  相似文献   

16.
Long-range dependence (LRD) has been observed in a variety of phenomena in nature, and for several years also in the spiking activity of neurons. Often, this is interpreted as originating from a non-Markovian system. Here we show that a purely Markovian integrate-and-fire (IF) model, with a noisy slow adaptation term, can generate interspike intervals (ISIs) that appear to have LRD. However, a proper analysis shows that this is not the case asymptotically. For comparison, we also consider a new model of an individual IF neuron with fractional (non-Markovian) noise. The correlations of its spike trains are studied and proven to have LRD, unlike classical IF models. On the other hand, to correctly measure long-range dependence, it is usually necessary to know if the data are stationary. Thus, a methodology to evaluate stationarity of the ISIs is presented and applied to the various IF models. We explain that Markovian IF models may seem to have LRD because of non-stationarities.

17.
Ion channel stochasticity can influence the voltage dynamics of neuronal membrane, with stronger effects for smaller patches of membrane because of the correspondingly smaller number of channels. We examine this question with respect to first spike statistics in response to a periodic input of membrane patches including stochastic Hodgkin-Huxley channels, comparing these responses to spontaneous firing. Without noise, firing threshold of the model depends on frequency—a sinusoidal stimulus is subthreshold for low and high frequencies and suprathreshold for intermediate frequencies. When channel noise is added, a stimulus in the lower range of subthreshold frequencies can influence spike output, while high subthreshold frequencies remain subthreshold. Both input frequency and channel noise strength influence spike timing. Specifically, spike latency and jitter have distinct minima as a function of input frequency, showing a resonance like behavior. With either no input, or low frequency subthreshold input, or input in the low or high suprathreshold frequency range, channel noise reduces latency and jitter, with the strongest impact for the lowest input frequencies. In contrast, for an intermediate range of suprathreshold frequencies, where an optimal input gives a minimum latency, the noise effect reverses, and spike latency and jitter increase with channel noise. Thus, a resonant minimum of the spike response as a function of frequency becomes more pronounced with less noise. Spike latency and jitter also depend on the initial phase of the input, resulting in minimal latencies at an optimal phase, and depend on the membrane time constant, with a longer time constant broadening frequency tuning for minimal latency and jitter. Taken together, these results suggest how stochasticity of ion channels may influence spike timing and thus coding for neurons with functionally localized concentrations of channels, such as in “hot spots” of dendrites, spines or axons.  相似文献   

18.
Many mathematical models for physical and biological problems have been and will be built in the form of differential equations or systems of such equations. With the advent of digital computers one has been able to find (approximate) solutions for equations that used to be intractable. Many of the mathematical techniques used in this area amount to replacing the given differential equations by appropriate difference equations, so that extensive research has been done into how to choose appropriate difference equations whose solutions are “good” approximations to the solutions of the given differential equations. The present paper investigates a different, although related problem. For many physical and biological phenomena the “continuum” type of thinking, that is at the basis of any differential equation, is not natural to the phenomenon, but rather constitutes an approximation to a basically discrete situation: in much work of this type the “infinitesimal step lengths” handled in the reasoning which leads up to the differential equation, are not really thought of as infinitesimally small, but as finite; yet, in the last stage of such reasoning, where the differential equation rises from the differentials, these “infinitesimal” step lengths are allowed to go to zero: that is where the above-mentioned approximation comes in. Under this kind of circumstances, it seems more natural to build the model as a discrete difference equation (recurrence relation) from the start, without going through the painful, doubly approximative process of first, during the modeling stage, finding a differential equation to approximate a basically discrete situation, and then, for numerical computing purposes, approximating that differential equation by a difference scheme.
The paper pursues this idea for some simple examples, where the old differential equation, though approximative in principle, had been at least qualitatively successful in describing certain phenomena, and shows that this idea, though plausible and sound in itself, does encounter some difficulties. The reason is that each differential equation, as it is set up in the way familiar to theoretical physicists and biologists, does correspond to a plethora of discrete difference equations, all of which in the limit (as step length→0) yield the same differential equation, but whose solutions, for not too small step length, are often widely different, some of them being quite irregular. The disturbing thing is that all these difference equations seem to adequately represent the same (physical or biological) reasoning as the differential equation in question. So, in order to choose the “right” difference equation, one may need to draw upon more detailed (physical or) biological considerations. All this does not say that one should not prefer discrete models for phenomena that seem to call for them; but only that their pursuit may require additional (physical or) biological refinement and insight. The paper also investigates some mathematical problems related to the fact of many difference equations being associated with one differential equation.  相似文献   
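The phenomenon described here, one differential equation corresponding to many discrete schemes with qualitatively different behavior, is classically illustrated by logistic growth: every solution of the ODE x' = r x (1 - x) with 0 < x(0) < 1 converges monotonically to 1, but the naive Euler discretization of the same model fails to settle once the step length is large.

```python
def euler_logistic(h, r=1.0, x0=0.1, n=400):
    """Iterate the Euler scheme x_{k+1} = x_k + h*r*x_k*(1 - x_k)."""
    x = x0
    for _ in range(n):
        x += h * r * x * (1 - x)
    return x

small = euler_logistic(0.1)   # small step: settles at the fixed point x = 1
large = euler_logistic(2.5)   # large step: the orbit oscillates and never settles
print(small, large)
```

Both recurrences yield the same ODE as h → 0, yet for h*r = 2.5 the fixed point x = 1 is unstable and the orbit cycles, illustrating why choosing the "right" difference equation needs extra modeling insight.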

19.
A model for enzymic catalysis is presented using the mathematical theories of differential geometry and Stieltjes integration. The Stieltjes integrator is a complex-valued function of bounded variation which represents the curvature and torsion, hence the conformation, of the backbone of an enzyme molecule. The integrand is a complex-valued continuous function which describes the shape of the surface of a substrate molecule. We postulate that enzyme-substrate interactions correspond to evaluations of Stieltjes integrals, and that observables of enzymic catalysis correspond to projections. Results from the mathematical theory of the Stieltjes integral are discussed together with their biological interpretations. We contrast the difference between structural and functional proteins, and construct analogues of enzyme cofactors, modifications, and regulation. Various techniques of locating the active site on enzymes are also given. We construct a total variation metric, which is particularly useful for detecting similarities among proteins. An examination of the many different modes of convergence of mathematical functions representing biological molecules leads to a mathematical statement of the fundamental dogma of molecular biology, that ‘structure implies function’. Similar arguments also result in the converse statement ‘function dictates structure’, which is a basic premise of relational biology. Stepped-helical approximations of the backbone space curves of enzymes provide a concrete computational tool with which to calculate the Stieltjes integrals that model enzymic catalysis, by replacing the integral with a finite series. The duality between enzymes and substrates (that they are meters ‘observing’ one another) is shown to be a consequence of the mathematical duality of Banach spaces. The Stieltjes integrals of enzyme-substrate interactions are hence shown to be bounded bilinear functionals.
The mechanism of enzymic catalysis, the transformation from substrate to product, is also formulated in the Stieltjes integration context via the mathematical theory of adjoints. The paper closes with suggestions for generalizations, prospects for future studies, and a review of the correspondence between mathematical and biological concepts.  相似文献   

20.