Similar Articles (20 results)
1.
Small universal spiking neural P systems
Păun A, Păun G. Bio Systems 2007;90(1):48-60.
In the search for small universal computing devices of various types, we consider here the case of spiking neural P systems (SN P systems), in two variants: as devices that compute functions and as devices that generate sets of numbers. We start with the first case and produce a universal spiking neural P system with 84 neurons. If a slight generalization of the rules is adopted, namely rules that produce several spikes simultaneously, then a considerable reduction, to 49 neurons, is obtained. For SN P systems used as generators of sets of numbers, we find a universal system with restricted rules having 76 neurons and one with extended rules having 50 neurons.

2.
It is well known that some neurons tend to fire packets of action potentials followed by periods of quiescence (bursts) while others within the same stage of sensory processing fire in a tonic manner. However, the respective computational advantages of bursting and tonic neurons for encoding time varying signals largely remain a mystery. Weakly electric fish use cutaneous electroreceptors to convey information about sensory stimuli and it has been shown that some electroreceptors exhibit bursting dynamics while others do not. In this study, we compare the neural coding capabilities of tonically firing and bursting electroreceptor model neurons using information theoretic measures. We find that both bursting and tonically firing model neurons efficiently transmit information about the stimulus. However, the decoding mechanisms that must be used for each differ greatly: a non-linear decoder would be required to extract all the available information transmitted by the bursting model neuron whereas a linear one might suffice for the tonically firing model neuron. Further investigations using stimulus reconstruction techniques reveal that, unlike the tonically firing model neuron, the bursting model neuron does not encode the detailed time course of the stimulus. A novel measure of feature detection reveals that the bursting neuron signals certain stimulus features. Finally, we show that feature extraction and stimulus estimation are mutually exclusive computations occurring in bursting and tonically firing model neurons, respectively. Our results therefore suggest that stimulus estimation and feature extraction might be parallel computations in certain sensory systems rather than being sequential as has been previously proposed.
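
The contrast above between linear and non-linear decoding can be made concrete with a linear stimulus-reconstruction sketch. The snippet below is a generic illustration, not the study's model: a toy rate encoder, invented parameters, and a least-squares decoding filter whose quality is summarized by a coding fraction.

```python
# Minimal sketch of linear stimulus reconstruction from a spike train.
# Everything here (encoder, bin size, filter length) is illustrative.
import numpy as np

rng = np.random.default_rng(0)
T = 5000                                          # number of time bins
stimulus = rng.standard_normal(T)

# Toy encoder: Poisson spikes whose rate follows a smoothed copy of the stimulus.
rate = np.clip(5 + 10 * np.convolve(stimulus, np.ones(5) / 5, mode="same"), 0, None)
spikes = rng.poisson(rate * 0.01)

# Off-line decoding filter over causal and acausal lags, fitted by least squares:
# stimulus(t) ~ sum_k h(k) * spikes(t - k).
lag_range = np.arange(-20, 20)
X = np.stack([np.roll(spikes, k) for k in lag_range], axis=1)
h, *_ = np.linalg.lstsq(X, stimulus, rcond=None)
reconstruction = X @ h

coding_fraction = 1 - np.var(stimulus - reconstruction) / np.var(stimulus)
print(f"coding fraction of the linear decoder: {coding_fraction:.2f}")
```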

3.
For understanding the computation and function of single neurons in sensory systems, one needs to investigate how sensory stimuli are related to a neuron’s response and which biological mechanisms underlie this relationship. Mathematical models of the stimulus–response relationship have proved very useful in approaching these issues in a systematic, quantitative way. A starting point for many such analyses has been provided by phenomenological “linear–nonlinear” (LN) models, which comprise a linear filter followed by a static nonlinear transformation. The linear filter is often associated with the neuron’s receptive field. However, the structure of the receptive field is generally a result of inputs from many presynaptic neurons, which may form parallel signal processing pathways. In the retina, for example, certain ganglion cells receive excitatory inputs from ON-type as well as OFF-type bipolar cells. Recent experiments have shown that the convergence of these pathways leads to intriguing response characteristics that cannot be captured by a single linear filter. One approach to adjust the LN model to the biological circuit structure is to use multiple parallel filters that capture ON and OFF bipolar inputs. Here, we review these new developments in modeling neuronal responses in the early visual system and provide details about one particular technique for obtaining the required sets of parallel filters from experimental data.
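
As a concrete illustration of the modelling framework described above, here is a minimal sketch (toy filters and nonlinearity, not fitted to data) of a single-filter LN response alongside a two-pathway variant in which rectified ON and OFF filters are summed before the output nonlinearity.

```python
import numpy as np

def ln_response(stimulus, kernel, nonlinearity):
    """Single-filter LN model: linear filtering followed by a static nonlinearity."""
    return nonlinearity(np.convolve(stimulus, kernel, mode="same"))

def two_pathway_response(stimulus, k_on, k_off, nonlinearity):
    """Parallel ON and OFF filters, each rectified, summed, then passed through
    the output nonlinearity -- one way to capture convergent bipolar input."""
    on = np.maximum(np.convolve(stimulus, k_on, mode="same"), 0)
    off = np.maximum(np.convolve(stimulus, k_off, mode="same"), 0)
    return nonlinearity(on + off)

rng = np.random.default_rng(1)
t = np.arange(20)
k_on = np.exp(-t / 5.0)          # toy monophasic ON filter
k_off = -k_on                    # sign-inverted OFF filter
relu = lambda x: np.maximum(x, 0)

stim = rng.standard_normal(2000)
print("single-filter mean response:", ln_response(stim, k_on, relu).mean())
print("two-pathway mean response:  ", two_pathway_response(stim, k_on, k_off, relu).mean())
```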

4.
Although models based on independent component analysis (ICA) have been successful in explaining various properties of sensory coding in the cortex, it remains unclear how networks of spiking neurons using realistic plasticity rules can realize such computation. Here, we propose a biologically plausible mechanism for ICA-like learning with spiking neurons. Our model combines spike-timing dependent plasticity and synaptic scaling with an intrinsic plasticity rule that regulates neuronal excitability to maximize information transmission. We show that a stochastically spiking neuron learns one independent component for inputs encoded either as rates or using spike-spike correlations. Furthermore, different independent components can be recovered, when the activity of different neurons is decorrelated by adaptive lateral inhibition.
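
The paper's mechanism combines STDP, synaptic scaling and intrinsic plasticity in spiking neurons; the sketch below is only a rate-based caricature of one-unit ICA-like learning on toy mixed sources, with a kurtosis-seeking Hebbian update and weight normalisation standing in for synaptic scaling.

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples = 20000
sources = rng.laplace(size=(2, n_samples))     # super-Gaussian independent sources
mixing = np.array([[1.0, 0.6], [0.4, 1.0]])
x = mixing @ sources                           # observed mixtures

# Whiten the mixtures so that one-unit extraction is well posed.
evals, evecs = np.linalg.eigh(np.cov(x))
z = evecs @ np.diag(evals ** -0.5) @ evecs.T @ x

w = rng.standard_normal(2)
w /= np.linalg.norm(w)
eta = 1e-4
for _ in range(5):                             # a few passes over the data
    for t in range(n_samples):
        y = w @ z[:, t]
        w += eta * z[:, t] * y ** 3            # kurtosis-seeking Hebbian term
        w /= np.linalg.norm(w)                 # normalisation ("synaptic scaling")

y_all = w @ z
print("correlation with source 1:", np.corrcoef(y_all, sources[0])[0, 1])
print("correlation with source 2:", np.corrcoef(y_all, sources[1])[0, 1])
```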

5.
Extracting invariant features in an unsupervised manner is crucial for performing complex computations such as recognizing objects, analyzing music, or understanding speech. While various algorithms have been proposed to perform such a task, Slow Feature Analysis (SFA) uses time as a means of detecting those invariants, extracting the slowly time-varying components in the input signals. In this work, we address the question of how such an algorithm can be implemented by neurons, and apply it in the context of audio stimuli. We propose a projected gradient implementation of SFA that can be adapted to a Hebbian-like learning rule dealing with biologically plausible neuron models. Furthermore, we show that a Spike-Timing Dependent Plasticity learning rule, shaped as a smoothed second derivative, implements SFA for spiking neurons. The theory is supported by numerical simulations, and to illustrate a simple use of SFA, we have applied it to auditory signals. We show that a single SFA neuron can learn to extract the tempo in sound recordings.
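
For reference, the batch version of SFA can be written in a few lines: whiten the signal, then take the direction whose temporal derivative has minimal variance. The toy signals and the absence of any nonlinear expansion below are simplifications; the paper's projected-gradient and STDP formulations are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 2 * np.pi, 4000)
slow = np.sin(t)                               # slowly varying latent signal
fast = np.sin(29 * t)                          # fast carrier
x = np.stack([slow + 0.5 * fast, fast - 0.3 * slow])   # two mixed observations
x = x - x.mean(axis=1, keepdims=True)

# Whiten the observations.
evals, evecs = np.linalg.eigh(np.cov(x))
z = evecs @ np.diag(evals ** -0.5) @ evecs.T @ x

# The slowest direction is the eigenvector of the derivative covariance
# with the smallest eigenvalue.
d_evals, d_evecs = np.linalg.eigh(np.cov(np.diff(z, axis=1)))
y = d_evecs[:, 0] @ z

print("correlation with the slow latent (up to sign):", np.corrcoef(y, slow)[0, 1])
```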

6.
7.
8.
Dassow J, Vaszil G. Bio Systems 2004;74(1-3):1-7.
We consider splicing systems reflecting two important aspects of the behaviour of DNA molecules in nature or in laboratory experiments which so far have not been studied in the literature. We examine the effect of splicing rules applied to finite multisets of words using sequential and different types of parallel derivation strategies and compare the sets of words or sets of multisets which can be obtained.
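
As background, a single splicing rule can be illustrated on ordinary strings. The sketch below follows the common (u1, u2; u3, u4) cut-and-recombine convention and does not implement the paper's multiset or parallel derivation modes; the example words and sites are invented.

```python
# Given x = x1 u1 u2 x2 and y = y1 u3 u4 y2, the rule (u1, u2; u3, u4)
# produces the word x1 u1 u4 y2.
def splice(x: str, y: str, rule: tuple[str, str, str, str]) -> set[str]:
    u1, u2, u3, u4 = rule
    results = set()
    for i in range(len(x) + 1):
        if x[i:].startswith(u1 + u2):
            for j in range(len(y) + 1):
                if y[j:].startswith(u3 + u4):
                    results.add(x[: i + len(u1)] + y[j + len(u3):])
    return results

# Example: cut both words between an "A" and a "T" site and recombine.
print(splice("GGATCC", "AAATTT", ("A", "T", "A", "T")))
```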

9.
Novel statistical methods were used to distinguish functionally distinct brain regions using their cDNA array gene expression profiles, and it was found that one of four specific factors is often associated with the most regionally discriminative genes. Gene expression profiles were determined for the substantia nigra (SN), striatum (STR), parietal cortex (PC), and posterolateral cortical amygdaloid nucleus (PLCo). An F-test identified 339 of the 1185 array genes as having P ≤ 0.01, and a gene ranking and selection method based on Soft Independent Modeling of Class Analogy (SIMCA) was then applied to obtain the 59 most discriminative genes. Their discriminative power was validated in three steps. The most convincing step showed their ability to correctly predict the brain regional classifications for 18 "test" gene expression sets obtained from the four regions. A two-way Hierarchical Cluster Analysis organized the 59 genes into six clusters according to their expression differences across the brain regions. Expression patterns in the SN and STR regions differed greatly from each other and from the PC and PLCo. The closer similarity in the gene expression patterns of the PC and PLCo was probably due to their functional similarity. The important factors determining differences in the regional gene expression profiles across the six clusters were (1) regional myelin/oligodendrocyte levels, (2) resident neuron types, (3) neurotransmitter innervation profiles, and (4) Ca2+-dependent signaling and second messenger systems.
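
The first filtering step described above (a per-gene one-way ANOVA across regions) is easy to sketch. The snippet below uses random toy data and an invented effect, not the study's arrays, and stops before the SIMCA ranking stage.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(4)
n_genes, n_per_region = 1185, 6
regions = ["SN", "STR", "PC", "PLCo"]

# expression[g][r] -> samples of gene g in region r (random toy data)
expression = rng.normal(size=(n_genes, len(regions), n_per_region))
expression[:50, 0, :] += 2.0     # make the first 50 genes region-dependent

selected = [
    g for g in range(n_genes)
    if f_oneway(*expression[g]).pvalue <= 0.01
]
print(f"{len(selected)} genes pass the F-test at P <= 0.01")
```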

10.
Florian RV. PLoS ONE 2012;7(8):e40233.
In many cases, neurons process information carried by the precise timings of spikes. Here we show how neurons can learn to generate specific temporally precise output spikes in response to input patterns of spikes having precise timings, thus processing and memorizing information that is entirely temporally coded, both as input and as output. We introduce two new supervised learning rules for spiking neurons with temporal coding of information (chronotrons), one that provides high memory capacity (E-learning), and one that has a higher biological plausibility (I-learning). With I-learning, the neuron learns to fire the target spike trains through synaptic changes that are proportional to the synaptic currents at the timings of real and target output spikes. We study these learning rules in computer simulations where we train integrate-and-fire neurons. Both learning rules allow neurons to fire at the desired timings, with sub-millisecond precision. We show how chronotrons can learn to classify their inputs, by firing identical, temporally precise spike trains for different inputs belonging to the same class. When the input is noisy, the classification also leads to noise reduction. We compute lower bounds for the memory capacity of chronotrons and explore the influence of various parameters on chronotrons' performance. The chronotrons can model neurons that encode information in the time of the first spike relative to the onset of salient stimuli or neurons in oscillatory networks that encode information in the phases of spikes relative to the background oscillation. Our results show that firing one spike per cycle optimizes memory capacity in neurons encoding information in the phase of firing relative to a background rhythm.
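
The sketch below is a heavily simplified, discrete-time caricature in the spirit of the I-learning idea: synaptic weights are nudged in proportion to each synapse's current trace at target spike times (potentiation) and at the neuron's actual spike times (depression). The kernels, thresholds and constants are invented, and convergence is not guaranteed for arbitrary inputs.

```python
import numpy as np

rng = np.random.default_rng(5)
n_in, T, dt, tau = 50, 200, 1.0, 10.0
eta, threshold = 0.05, 1.0

inputs = rng.random((n_in, T)) < 0.02          # sparse random input spike trains
target = np.zeros(T, dtype=bool)
target[[60, 140]] = True                       # desired output spike times

def run(w, learn=False):
    """Simulate a leaky integrator; optionally accumulate supervised updates."""
    trace, v, dw, out = np.zeros(n_in), 0.0, np.zeros(n_in), []
    for t in range(T):
        trace += -trace * dt / tau + inputs[:, t]    # per-synapse current trace
        v += -v * dt / tau + w @ inputs[:, t]
        fired = v >= threshold
        if fired:
            v = 0.0
            out.append(t)
        if learn:
            # push toward target spikes, away from spurious ones, weighted by
            # each synapse's current at that moment
            dw += eta * trace * (float(target[t]) - float(fired))
    return out, dw

w = rng.random(n_in) * 0.05
for _ in range(200):
    w += run(w, learn=True)[1]

print("output spikes after training:", run(w)[0], "target:", [60, 140])
```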

11.
12.
PACES: Protein sequential assignment by computer-assisted exhaustive search
A crucial step in determining solution structures of proteins using nuclear magnetic resonance (NMR) spectroscopy is the process of sequential assignment, which correlates backbone resonances to corresponding residues in the primary sequence of a protein, today typically using data from triple-resonance NMR experiments. Although the development of automated approaches for sequential assignment has greatly facilitated this process, the performance of these programs is usually less satisfactory for large proteins, especially in cases of missing connectivity or severe chemical shift degeneracy. Here, we report the development of a novel computer-assisted method for sequential assignment, using an algorithm that conducts an exhaustive search of all spin systems, both for establishing sequential connectivities and then for assignment. By running the program iteratively with user intervention after each cycle, ambiguities in the assignments can be eliminated efficiently and backbone resonances can be assigned rapidly. The efficiency and robustness of this approach have been tested with 27 proteins of sizes varying from 76 to 723 amino acids, and with data of varying qualities, using experimental data for three proteins and published assignments modified with simulated noise for the other 24. The complexity of sequential assignment with regard to the size of the protein, the completeness of NMR data sets, and the uncertainty in resonance positions has been examined. Supplementary material to this paper is available in electronic form at http://dx.doi.org/10.1023/A:1023589029301
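
The core idea of exhaustive connectivity search can be illustrated on a toy problem. The shifts, tolerance and brute-force enumeration below are invented for illustration and are not the PACES implementation; note that even this tiny example yields several consistent orderings, the kind of ambiguity the iterative procedure is meant to resolve.

```python
from itertools import permutations

# Each spin system carries (its own CA shift, the CA shift observed for the
# preceding residue). Values are invented.
spin_systems = {
    "s1": (58.1, 54.3),
    "s2": (62.0, 58.2),
    "s3": (54.3, 61.9),
}
TOL = 0.3   # matching tolerance in ppm (illustrative)

def connectable(prev_sys: str, next_sys: str) -> bool:
    """Two systems connect when prev's own CA matches next's 'preceding' CA."""
    return abs(spin_systems[prev_sys][0] - spin_systems[next_sys][1]) <= TOL

chains = [
    order for order in permutations(spin_systems)
    if all(connectable(a, b) for a, b in zip(order, order[1:]))
]
print("consistent sequential orderings:", chains)
```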

13.
Hangartner RD, Cull P. Bio Systems 2000;58(1-3):167-176.
In this paper, we address the question: can biologically feasible neural nets compute more than can be computed by deterministic polynomial-time algorithms? Since we want to maintain a claim of plausibility and reasonableness, we restrict ourselves to nets that are algorithmically easy to construct, and we rule out infinite precision in parameters and in any analog parts of the computation. Our approach is to consider the recent advances in randomized algorithms and see if such randomized computations can be described by neural nets. We start with a pair of neurons and show that, by connecting them with reciprocal inhibition and providing some tonic input, the steady state will have one neuron ON and one neuron OFF, but which neuron will be ON and which will be OFF will be chosen at random (perhaps it would be better to say that microscopic noise in the analog computation will be turned into a megascale random bit). We then show that we can build a small network that uses this random bit process to repeatedly generate random bits. This random bit generator can then be connected with a neural net representing the deterministic part of a randomized algorithm. We therefore demonstrate that these neural nets can carry out probabilistic computation and are thus less limited than classical neural nets.
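
A rate-model sketch of the random-bit element is shown below: two units with tonic drive and reciprocal inhibition settle into a winner-take-all state, and small noise decides which unit ends up ON. Parameters are illustrative, not the paper's.

```python
import numpy as np

def random_bit(rng, steps=500, dt=0.1, tonic=1.0, inhibition=2.0, noise=0.01):
    """Two mutually inhibiting rate units; noise breaks the symmetric state."""
    v = np.zeros(2)
    for _ in range(steps):
        drive = tonic - inhibition * np.maximum(v[::-1], 0)   # reciprocal inhibition
        v += dt * (-v + drive) + noise * rng.standard_normal(2)
    return int(v[0] > v[1])

rng = np.random.default_rng()
bits = [random_bit(rng) for _ in range(1000)]
print("fraction of 1s:", sum(bits) / len(bits))   # should hover near 0.5
```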

14.
A single neuron, located in the center of each segmental ganglion of H. medicinalis, is antidromically activated by electrical stimulation of the ventral cord anteriorly and posteriorly to the ganglion, at the same threshold as the fast conducting system (FCS) and with a latency equal to the FCS conduction time. This neuron is activated trans-synaptically by tactile and photic stimulation of the skin and by stimulation of high-threshold fibres running along the cord. A spike evoked by intracellular stimulation of this neuron propagates along the FCS. Intracellular staining shows that this neuron sends two axonal branches in the anterior and posterior median connectives. Direct electrical stimulation of touch cells (T cells), as well as mechanical stimulation of the skin, lowers the threshold of, and may eventually fire, the FCS neurons, not only at the level of the ganglion to which they belong, but also at the level of the neighbouring ganglia. This effect is mediated by bilateral pathways located in the lateral connectives. It is concluded that the FCS consists of a chain of single neurons, located in each ganglion and electrotonically coupled to each other. Touch cells project with excitatory synapses onto the FCS neurons.

15.
In designing a study to demonstrate the existence of a major locus for a quantitative trait, an investigator chooses a sampling rule to ascertain pedigrees. The choice of sampling rule can significantly affect the study's power. Here, we compare two types of sampling rules for family studies: fixed-structure rules, in which the same set of relatives are sampled for each proband, and sequential rules, in which the relative or relatives to be sampled next may depend on the trait values of the individuals already observed. We compare fixed-structure and sequential sampling in the setting of extended pedigrees, a quantitative trait, and the genetic mixed model. Using computer simulation, we show that sequential sampling can increase power to detect segregation at a dominant major locus by over 60% in comparison with fixed-structure sampling. Just as important, this substantially increased power is obtained with an easily implemented sampling rule, one that might reasonably be employed in a family study of a quantitative trait.
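
The distinction between the two rules can be sketched schematically: a fixed-structure rule always samples the same set of relatives, while a sequential rule decides whether to keep sampling from the trait values already observed. The trait model and the stopping criterion below are invented placeholders, not the genetic mixed model or the rules compared in the study.

```python
import numpy as np

rng = np.random.default_rng(6)
trait = lambda: rng.normal()          # placeholder for a genetic mixed model

def fixed_structure(n_sibs=4):
    """Always sample the proband plus a fixed number of siblings."""
    return [trait() for _ in range(1 + n_sibs)]

def sequential(max_sibs=8, extend_if_above=1.0):
    """Sample relatives one at a time; stop once the latest value is unremarkable."""
    values = [trait()]
    while len(values) <= max_sibs and abs(values[-1]) >= extend_if_above:
        values.append(trait())
    return values

print("fixed-structure pedigree size:", len(fixed_structure()))
print("sequential pedigree size:     ", len(sequential()))
```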

16.
17.
The responses of cortical neurons are often characterized by measuring their spectro-temporal receptive fields (STRFs). The STRF of a cell can be thought of as a representation of its stimulus 'preference', but it is also a filter or 'kernel' that represents the best linear prediction of the response of that cell to any stimulus. A range of in vivo STRFs with varying properties have been reported in various species, although none in humans. Using a computational model, it has been shown that responses of ensembles of artificial STRFs, derived from limited sets of formative stimuli, preserve information about utterance class and prosody as well as the identity and sex of the speaker in a model speech classification system. In this work we help to put this idea on a biologically plausible footing by developing a simple model thalamo-cortical system built of conductance-based neurons and synapses, some of which exhibit spike-time-dependent plasticity. We show that the neurons in such a model, when exposed to formative stimuli, develop STRFs with varying temporal properties exhibiting a range of heterotopic integration. These model neurons also, in common with neurons measured in vivo, exhibit a wide range of non-linearities; this deviation from linearity can be exposed by characterizing the difference between the measured response of each neuron to a stimulus and the response predicted by the STRF estimated for that neuron. The proposed model, with its simple architecture, learning rule, and modest number of neurons (<1000), is suitable for implementation in neuromorphic analogue VLSI hardware and hence could form the basis of a developmental, real-time, neuromorphic sound classification system.
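
As a reference point for the linear-kernel view of an STRF, the snippet below estimates one by spike-triggered averaging of a toy random spectrogram. The toy encoder and dimensions are invented, and real estimates typically also correct for stimulus correlations.

```python
import numpy as np

rng = np.random.default_rng(7)
n_freq, n_time, n_lags = 16, 5000, 20
spectrogram = rng.standard_normal((n_freq, n_time))

# Toy neuron: fires preferentially when one frequency band was high 5 bins ago.
drive = np.roll(spectrogram[4], 5)
spikes = rng.random(n_time) < 0.02 * np.exp(drive - 1)

# Spike-triggered average of the stimulus segments preceding each spike.
strf = np.zeros((n_freq, n_lags))
spike_times = np.flatnonzero(spikes)
spike_times = spike_times[spike_times >= n_lags]
for t in spike_times:
    strf += spectrogram[:, t - n_lags:t]
strf /= len(spike_times)

print("peak of the estimated STRF (freq, lag):",
      np.unravel_index(np.abs(strf).argmax(), strf.shape))
```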

18.
Future hybrid neuron-semiconductor chips will consist of complex neural networks that are directly interfaced to electronic integrated circuits. They will help us to understand the dynamics of neuronal networks and may lead to novel computational facilities. Here we report on an elementary step towards such neurochips. We designed and fabricated a silicon chip for multiple two-way interfacing, and cultured on it pairs of neurons from the pedal ganglia of the snail Lymnaea stagnalis. These neurons were joined to each other by an electrical synapse, and to the chip by a capacitive stimulator and a recording transistor. We obtained a set of neuroelectronic units with sequential and parallel signal transmission through the neuron–silicon interface and the synapse, with a bidirectionally interfaced neuron-pair and with a signal path from the chip through a synaptically connected neuron pair back to the chip. The prospects for assembling more involved hybrid networks on the basis of these neuroelectronic units are considered.

19.
The Hebbian rule (Hebb 1949), coupled with an appropriate mechanism to limit the growth of synaptic weights, allows a neuron to learn to respond to the first principal component of the distribution of its input signals (Oja 1982). Rubner and Schulten (1990) have recently suggested the use of an anti-Hebbian rule in a network with hierarchical lateral connections. When applied to neurons with linear response functions, this model allows additional neurons to learn to respond to additional principal components (Rubner and Tavan 1989). Here we apply the model to neurons with non-linear response functions characterized by a threshold and a transition width. We propose local, unsupervised learning rules for the threshold and the transition width, and illustrate the operation of these rules with some simple examples. A network using these rules sorts the input patterns into classes, which it identifies by a binary code, with the coarser structure coded by the earlier neurons in the hierarchy.
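
A linear sketch of the underlying Oja/Rubner-Schulten scheme is given below: Hebbian feedforward learning plus anti-Hebbian hierarchical lateral connections that decorrelate later units from earlier ones. The toy covariance and learning rates are invented, and the paper's threshold and transition-width rules are not included.

```python
import numpy as np

rng = np.random.default_rng(8)
n_in, n_out, eta = 4, 2, 0.005
L = np.linalg.cholesky(np.diag([4.0, 2.0, 1.0, 0.5]))   # toy input covariance

W = rng.standard_normal((n_out, n_in)) * 0.1             # feedforward (Hebbian) weights
U = np.zeros((n_out, n_out))                             # lateral (anti-Hebbian) weights

for _ in range(40000):
    x = L @ rng.standard_normal(n_in)
    y = np.zeros(n_out)
    for i in range(n_out):
        y[i] = W[i] @ x + U[i, :i] @ y[:i]                # hierarchical lateral input
    for i in range(n_out):
        W[i] += eta * y[i] * (x - y[i] * W[i])            # Oja's rule
        U[i, :i] += -eta * y[i] * y[:i]                   # anti-Hebbian decorrelation

print(np.round(W, 2))   # rows should approach the two leading principal directions
```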

20.
Dynamics of spike-timing dependent synaptic plasticity are analyzed for excitatory and inhibitory synapses onto cerebellar Purkinje cells. The purpose of this study is to place theoretical constraints on candidate synaptic learning rules that determine the changes in synaptic efficacy due to pairing complex spikes with presynaptic spikes in parallel fibers and inhibitory interneurons. Constraints are derived for the timing between complex spikes and presynaptic spikes; these constraints result from requiring stability of the learning dynamics. Potential instabilities in the parallel fiber synaptic learning rule are found to be stabilized by synaptic plasticity at inhibitory synapses if the inhibitory learning rules are stable, and conditions for stability of inhibitory plasticity are given. Combining excitatory with inhibitory plasticity provides a mechanism for minimizing the overall synaptic input. Stable learning rules are shown to be able to sculpt simple-spike patterns by regulating the excitability of neurons in the inferior olive that give rise to climbing fibers.
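
For readers unfamiliar with pairing-based plasticity, the snippet below evaluates a generic two-exponential STDP window over all pre/post spike pairings. The window shape and constants are illustrative and do not encode the cerebellar sign conventions analysed in the paper.

```python
import numpy as np

def stdp_window(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for one pre/post pairing with dt = t_post - t_pre (ms)."""
    return a_plus * np.exp(-dt / tau) if dt >= 0 else -a_minus * np.exp(dt / tau)

def total_update(pre_spikes, post_spikes):
    """Sum the window over every pre/post pairing (all-to-all interaction)."""
    return sum(stdp_window(t_post - t_pre)
               for t_pre in pre_spikes for t_post in post_spikes)

print(total_update(pre_spikes=[10.0, 50.0], post_spikes=[12.0, 45.0]))
```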
