Similar Documents
20 similar documents found.
1.
Hangartner RD, Cull P. Bio Systems, 2000, 58(1-3): 167-176
In this paper, we address the question: can biologically feasible neural nets compute more than can be computed by deterministic polynomial-time algorithms? Since we want to maintain a claim of plausibility and reasonableness, we restrict ourselves to nets that are algorithmically easy to construct, and we rule out infinite precision in parameters and in any analog parts of the computation. Our approach is to consider the recent advances in randomized algorithms and see whether such randomized computations can be described by neural nets. We start with a pair of neurons and show that, by connecting them with reciprocal inhibition and some tonic input, the steady state will have one neuron ON and one neuron OFF, but which neuron will be ON and which will be OFF is chosen at random (perhaps it would be better to say that microscopic noise in the analog computation is turned into a megascale random bit). We then show that we can build a small network that uses this random-bit process to repeatedly generate random bits. This random-bit generator can then be connected with a neural net representing the deterministic part of a randomized algorithm. We therefore demonstrate that these neural nets can carry out probabilistic computation and are thus less limited than classical neural nets.
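A minimal simulation sketch (ours, not the authors' code; all parameter values are illustrative) of the mechanism just described: two neurons coupled by reciprocal inhibition with tonic drive, where microscopic noise decides which unit ends up ON, yielding one random bit per trial.

import numpy as np

def random_bit(rng, steps=2000, dt=0.01, tonic=1.0, inhibition=2.0, noise=0.01):
    x = np.zeros(2)                                    # activities of the two neurons
    for _ in range(steps):
        drive = tonic - inhibition * x[::-1]           # each neuron inhibits the other
        rate = 1.0 / (1.0 + np.exp(-4.0 * drive))      # sigmoidal activation
        x += dt * (-x + rate) + np.sqrt(dt) * noise * rng.standard_normal(2)
    return int(x[0] > x[1])                            # which neuron won acts as a (nearly fair) coin

rng = np.random.default_rng(0)
print([random_bit(rng) for _ in range(20)])            # a mix of 0s and 1s driven only by noise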

2.
The chaotic nature of atmospheric dynamics has stimulated the application of methods and ideas derived from statistical dynamics. For instance, ensemble systems, designed to sample the phase space around the initial condition, have recently been used extensively to make weather predictions. Such an approach has been shown to improve substantially the usefulness of the forecasts, since it allows forecasters to issue probabilistic forecasts. These works have modified the dominant paradigm for interpreting the evolution of atmospheric flows (and, to some extent, oceanic motions), attributing more importance to the probability distribution of the variables of interest than to a single representation. The ensemble experiments can be considered as crude attempts to estimate the evolution of the probability distribution of the climate variables, which turns out to be the only physical quantity of practical relevance. However, little work has been done on directly modeling the evolution of the probability distribution itself. In this paper it is shown that the evolution of the probability distribution can be written as a functional integral of the same kind introduced by Feynman in quantum mechanics, using some of the methods and results developed in statistical physics. The approach yields a formal solution to the Fokker-Planck equation corresponding to the Langevin-like equation of motion with noise. The method is very general and provides a framework generalizable to red noise, as well as to delay differential equations and even field equations, i.e., partial differential equations with noise, for example general circulation models with noise. These concepts are applied to an example taken from a simple ENSO model.
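A schematic one-dimensional version of this construction (our notation, not the paper's): a Langevin equation with white noise, its Fokker-Planck equation, and the Feynman-style path-integral (Onsager-Machlup) form of the transition probability, written here up to a discretization-dependent Jacobian term.

\frac{dx}{dt} = F(x) + \eta(t), \qquad \langle \eta(t)\,\eta(t') \rangle = 2D\,\delta(t - t'),

\frac{\partial P}{\partial t} = -\frac{\partial}{\partial x}\big[F(x)\,P\big] + D\,\frac{\partial^2 P}{\partial x^2},

P(x, t \mid x_0, 0) = \int \mathcal{D}[x(\tau)]\;\exp\!\left\{ -\frac{1}{4D} \int_0^{t} \big[\dot{x}(\tau) - F(x(\tau))\big]^2 \, d\tau \right\}.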

3.
Biological data suggest that activity patterns emerging in small- and large-scale neural systems may play an important role in performing the functions of the neural system and, in particular, neural computations. It is proposed in this paper that neural systems can be understood in terms of pattern computation and abstract communication systems theory. It is shown that, by analysing high-resolution surface EEG data, it is possible to determine abstract probabilistic rules that describe how emerging activity patterns follow earlier activity patterns. The results indicate the applicability of the proposed approach for understanding the working of complex neural systems.

4.
Using phase-space reconstruction from one-dimensional and multi-dimensional time series and quantitative criteria for chaos, combined with a neural network, analyses, computations and classification are carried out on electroencephalogram (EEG) signals of five kinds of human consciousness activities (relaxation, mental arithmetic of multiplication, mental composition of a letter, visualizing a three-dimensional object being revolved about an axis, and visualizing numbers being written or erased on a blackboard). Comparative studies of the determinism, the phase graph, the power spectra, the approximate entropy, the correlation dimension and the Lyapunov exponent of the EEG signals of the five kinds of consciousness activities lead to the following conclusions: (1) The statistics of the determinism computation indicate that chaotic characteristics may be present in human consciousness activities, and the central tendency measure (CTM) is consistent with the phase graph, so it can be used to partition the EEG attractor. (2) The power spectra show that the ideation of a single subject is almost identical across tasks, but the frequency channels of different consciousness activities differ slightly. (3) The approximate entropy differs between subjects; under the same conditions, the larger a subject's approximate entropy, the better the subject's innovation. (4) The correlation dimension and the Lyapunov exponent indicate that human brain activities live on attractors of fractional dimension. (5) The nonlinear quantitative criteria, combined with the neural network, can classify the different kinds of consciousness activities well; the classification results indicate that the arithmetic activity is discriminated better than the abstract ones.
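A compact sketch (illustrative, not the authors' code) of the approximate entropy statistic referred to above; typical choices are embedding dimension m = 2 and tolerance r = 0.2 times the signal's standard deviation.

import numpy as np

def approximate_entropy(signal, m=2, r_factor=0.2):
    x = np.asarray(signal, dtype=float)
    r = r_factor * x.std()

    def phi(m):
        n = len(x) - m + 1
        templates = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of m-length templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = (dist <= r).mean(axis=1)              # fraction of matching templates
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(1)
noisy = rng.standard_normal(500)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
print(approximate_entropy(noisy), approximate_entropy(regular))   # noise gives a much larger value than the sine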

5.
Life history evolution and demographic stochasticity
Summary: Can demographic stochasticity bias the evolution of life history traits? Under a neutral version of the Cole-Charnov-Schaffer model, variance in offspring number for both annuals and perennials depends on the precise values of fitness components. Either annuals or perennials may have the larger variance (for equal λ), depending on the importance of random survival versus fixed reproduction. By extension, the variance in offspring number should generally depend on whether λ is mainly composed of highly variable elements or elements with limited variation. Thus, data about the variability of demographic parameters may be as important as data about their mean values. This result concerns only one source of demographic stochasticity, the probabilistic nature of demographic processes like survival. The other source of demographic stochasticity is the fact that populations are composed of whole numbers of individuals (integer arithmetic). Integer arithmetic without probabilistic demography (or environmental variation) can make it difficult for rare invaders to persist in populations even when selection would favour the invaders in a deterministic model. Integer arithmetic can also cause population coexistence when the equivalent deterministic model leads to exclusion. This effect disappears when demography is probabilistic, and probably also when there is environmental variation. Thus probabilistic demography and environmental variation may make some population patterns more, rather than less, understandable.

6.
The question of how the collective activity of neural populations gives rise to complex behaviour is fundamental to neuroscience. At the core of this question lie considerations about how neural circuits can perform computations that enable sensory perception, decision making, and motor control. It is thought that such computations are implemented through the dynamical evolution of distributed activity in recurrent circuits. Thus, identifying dynamical structure in neural population activity is a key challenge towards a better understanding of neural computation. At the same time, interpreting this structure in light of the computation of interest is essential for linking the time-varying activity patterns of the neural population to ongoing computational processes. Here, we review methods that aim to quantify structure in neural population recordings through a dynamical system defined in a low-dimensional latent variable space. We discuss advantages and limitations of different modelling approaches and address future challenges for the field.

7.
Douglas W. Morris. Oikos, 2005, 109(2): 223-238
Ecologists continue to debate the roles of deterministic versus stochastic (or neutral) processes in the assembly of ecological communities. The debate often hinges on issues of temporal and spatial scale. Resolution of the competing views depends on a detailed understanding of variation in the structure of local communities through time and space. Analyses of twelve years of data on a diverse assemblage of 13 boreal small mammal species revealed both deterministic and stochastic patterns. Stochastic membership in the overall community created unique assemblages of species in both time and space. But the relative abundances of the two codominant species were much less variable, and suggest a significant role for strong interactions that create temporal and spatial autocorrelation in abundance. As species wax and wane in abundance, they are nevertheless subject to probabilistic rules of local assembly. At the scales I report on here, poorly understood large-scale processes influence the presence and absence of the majority of (sparse) species in the assembly. But the overall pool of species nevertheless obeys local rules on their ultimate stochastic assembly into groups of interacting species.

8.
The visual system must learn to infer the presence of objects and features in the world from the images it encounters, and as such it must, either implicitly or explicitly, model the way these elements interact to create the image. Do the response properties of cells in the mammalian visual system reflect this constraint? To address this question, we constructed a probabilistic model in which the identity and attributes of simple visual elements were represented explicitly and learnt the parameters of this model from unparsed, natural video sequences. After learning, the behaviour and grouping of variables in the probabilistic model corresponded closely to functional and anatomical properties of simple and complex cells in the primary visual cortex (V1). In particular, feature identity variables were activated in a way that resembled the activity of complex cells, while feature attribute variables responded much like simple cells. Furthermore, the grouping of the attributes within the model closely parallelled the reported anatomical grouping of simple cells in cat V1. Thus, this generative model makes explicit an interpretation of complex and simple cells as elements in the segmentation of a visual scene into basic independent features, along with a parametrisation of their moment-by-moment appearances. We speculate that such a segmentation may form the initial stage of a hierarchical system that progressively separates the identity and appearance of more articulated visual elements, culminating in view-invariant object recognition.

9.
Understanding the genetic regulatory network, comprising genes, RNA, proteins and the network connections and dynamical control rules among them, is a major task of contemporary systems biology. I focus here on the use of the ensemble approach to find one or more well-defined ensembles of model networks whose statistical features match those of real cells and organisms. Such ensembles should help explain and predict features of real cells and organisms. More precisely, an ensemble of model networks is defined by constraints on the "wiring diagram" of regulatory interactions and the "rules" governing the dynamical behavior of regulated components of the network. The ensemble consists of all networks consistent with those constraints. Here I discuss ensembles of random Boolean networks, scale-free Boolean networks, "medusa" Boolean networks, continuous variable networks, and others. For each ensemble, M statistical features are measured, such as the size distribution of avalanches in gene activity changes unleashed by transiently altering the activity of a single gene, the distribution of distances between gene activities in different cell types, and others. This creates an M-dimensional space, where each ensemble corresponds to a cluster of points or distributions. Using current and future experimental techniques, such as gene arrays, these M properties are to be measured for real cells and organisms, again yielding a cluster of points or distributions in the M-dimensional space. The procedure then finds ensembles close to those of real cells and organisms and hill-climbs to attempt to match the observed M features, thus obtaining one or more ensembles that should predict and explain many features of the regulatory networks in cells and organisms.
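A small, self-contained sketch (illustrative sizes and rules, not the author's code) of one such statistical feature: the size of the avalanche of gene-activity changes unleashed by transiently flipping a single gene in a random Boolean network with N genes and K inputs per gene.

import numpy as np

def random_boolean_network(n, k, rng):
    inputs = np.array([rng.choice(n, size=k, replace=False) for _ in range(n)])
    tables = rng.integers(0, 2, size=(n, 2 ** k))      # a random Boolean rule for each gene
    return inputs, tables

def step(state, inputs, tables):
    idx = np.array([int("".join(map(str, state[inp])), 2) for inp in inputs])
    return tables[np.arange(len(state)), idx]

def avalanche_size(n=50, k=2, steps=30, seed=0):
    rng = np.random.default_rng(seed)
    inputs, tables = random_boolean_network(n, k, rng)
    state = rng.integers(0, 2, size=n)
    perturbed = state.copy()
    perturbed[0] ^= 1                                   # transiently flip one gene
    changed = np.zeros(n, dtype=bool)
    for _ in range(steps):
        state = step(state, inputs, tables)
        perturbed = step(perturbed, inputs, tables)
        changed |= state != perturbed
    return int(changed.sum())                           # number of genes that ever differed

print([avalanche_size(seed=s) for s in range(10)])      # samples from the avalanche-size distribution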

10.
Climate change impact assessments are plagued with uncertainties from many sources, such as climate projections or the inadequacies in structure and parameters of the impact model. Previous studies have tried to account for the uncertainty from one or two of these. Here, we developed a triple-ensemble probabilistic assessment using seven crop models, multiple sets of model parameters and eight contrasting climate projections together to comprehensively account for uncertainties from these three important sources. We demonstrated the approach by assessing the climate change impact on barley growth and yield at Jokioinen, Finland, in the Boreal climatic zone and Lleida, Spain, in the Mediterranean climatic zone, for the 2050s. We further quantified and compared the contributions of crop model structure, crop model parameters and climate projections to the total variance of the ensemble output using Analysis of Variance (ANOVA). Based on the triple-ensemble probabilistic assessment, the medians of simulated yield change were -4% and +16%, and the probability of decreasing yield was 63% and 31% in the 2050s, at Jokioinen and Lleida, respectively, relative to 1981–2010. The contribution of crop model structure to the total variance of the ensemble output was larger than that from downscaled climate projections and model parameters. The relative contributions of crop model parameters and downscaled climate projections to the total variance of the ensemble output varied greatly among the seven crop models and between the two sites. The contribution of downscaled climate projections was on average larger than that of crop model parameters. This information on the uncertainty from different sources can be quite useful for model users to decide where to put the most effort when preparing or choosing models or parameters for impact analyses. We conclude that the triple-ensemble probabilistic approach, which accounts for the uncertainties from multiple important sources, provides more comprehensive information for quantifying uncertainties in climate change impact assessments than conventional approaches that are deterministic or account for the uncertainties from only one or two of the sources.
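A hedged sketch of the kind of ANOVA-style variance decomposition described above: main-effect sums of squares from a full factorial ensemble of simulated yield changes indexed by (crop model, parameter set, climate projection). The data here are synthetic; only the bookkeeping is illustrated.

import numpy as np

rng = np.random.default_rng(2)
n_model, n_param, n_climate = 7, 3, 8
yields = rng.normal(size=(n_model, n_param, n_climate))          # placeholder ensemble outputs
yields += rng.normal(scale=2.0, size=(n_model, 1, 1))            # an artificial model-structure effect

grand = yields.mean()
total_ss = ((yields - grand) ** 2).sum()

def main_effect_ss(axis_keep):
    means = yields.mean(axis=tuple(i for i in range(3) if i != axis_keep))
    n_per_level = yields.size / yields.shape[axis_keep]
    return n_per_level * ((means - grand) ** 2).sum()

for name, ax in [("crop model", 0), ("parameters", 1), ("climate projection", 2)]:
    print(f"{name}: {main_effect_ss(ax) / total_ss:.1%} of total variance")
# Remaining variance is attributable to interactions (not decomposed here).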

11.
Microarray data analysis has been shown to provide an effective tool for studying cancer and genetic diseases. Although classical machine learning techniques have successfully been applied to find informative genes and to predict class labels for new samples, common restrictions of microarray analysis such as small sample sizes, a large attribute space and high noise levels still limit its scientific and clinical applications. Increasing the interpretability of prediction models while retaining a high accuracy would help to exploit the information content in microarray data more effectively. For this purpose, we evaluate our rule-based evolutionary machine learning systems, BioHEL and GAssist, on three public microarray cancer datasets, obtaining simple rule-based models for sample classification. A comparison with other benchmark microarray sample classifiers based on three diverse feature selection algorithms suggests that these evolutionary learning techniques can compete with state-of-the-art methods like support vector machines. The obtained models reach accuracies above 90% in two-level external cross-validation, with the added value of facilitating interpretation by using only combinations of simple if-then-else rules. As a further benefit, a literature mining analysis reveals that prioritizations of informative genes extracted from BioHEL's classification rule sets can outperform gene rankings obtained from a conventional ensemble feature selection in terms of the pointwise mutual information between relevant disease terms and the standardized names of top-ranked genes.

12.
Abstract

Performing molecular dynamics in a fully continuous and differentiable framework can be viewed as a deterministic mathematical mapping between, on one side, the force field parameters that describe the potential energy interactions and input macroscopic conditions, and, on the other, the calculated corresponding macroscopic properties of the bulk molecular system.

Within this framework, it is possible to apply standard methods of variational calculus for the computation of the partial derivatives of the molecular dynamics mapping based on the integration of either the adjoint equations or the sensitivity equations of the classical Newtonian equations of motion. We present procedures for these computations in the standard microcanonical (N, V, E) ensemble, and compare the computational efficiency of the two approaches. The general formulations developed are applied to the specific example of bulk ethane fluid.

With these procedures in place, it is now possible to compute the partial derivatives of any property determined by molecular dynamics with respect to any input property and any potential parameter. Moreover, these derivatives are computed to essentially the same level of numerical accuracy as the output properties themselves.
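A schematic statement (our notation; only the forward-sensitivity variant is sketched here) of the kind of derivative computation described above: for Newtonian dynamics m\ddot{q} = F(q;\theta) with a force-field parameter \theta, the sensitivity s = \partial q/\partial\theta obeys the linearized equations of motion, and the derivative of a time-averaged property follows by the chain rule.

m\,\ddot{s}(t) = \frac{\partial F}{\partial q}\big(q(t);\theta\big)\, s(t) + \frac{\partial F}{\partial \theta}\big(q(t);\theta\big), \qquad s(0) = \dot{s}(0) = 0,

\bar{A}(\theta) = \frac{1}{T}\int_0^{T} a\big(q(t;\theta)\big)\,dt, \qquad \frac{d\bar{A}}{d\theta} = \frac{1}{T}\int_0^{T} \frac{\partial a}{\partial q}\big(q(t;\theta)\big)\cdot s(t)\,dt.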

13.

Background

During sentence processing we decode the sequential combination of words, phrases or sentences according to previously learned rules. The computational mechanisms and neural correlates of these rules are still much debated. Another key issue is whether sentence processing relies solely on language-specific mechanisms or is also governed by domain-general principles.

Methodology/Principal Findings

In the present study, we investigated the relationship between sentence processing and implicit sequence learning in a dual-task paradigm in which the primary task was a non-linguistic task (the Alternating Serial Reaction Time Task, measuring probabilistic implicit sequence learning), while the secondary task was a sentence comprehension task relying on syntactic processing. We used two control conditions: a non-linguistic one (a math condition) and a linguistic task (a word processing task). Here we show that sentence processing interfered with the probabilistic implicit sequence learning task, while the other two tasks did not produce a similar effect.

Conclusions/Significance

Our findings suggest that operations during sentence processing utilize resources underlying non-domain-specific probabilistic procedural learning. Furthermore, they provide a bridge between two competing frameworks of language processing. It appears that procedural and statistical models of language are not mutually exclusive, particularly for sentence processing. These results show that the implicit procedural system is engaged in sentence processing, but at the level of mechanism, language might still be based on statistical computations.

14.
Tractography uses diffusion MRI to estimate the trajectory and cortical projection zones of white matter fascicles in the living human brain. There are many different tractography algorithms and each requires the user to set several parameters, such as curvature threshold. Choosing a single algorithm with specific parameters poses two challenges. First, different algorithms and parameter values produce different results. Second, the optimal choice of algorithm and parameter value may differ between different white matter regions or different fascicles, subjects, and acquisition parameters. We propose using ensemble methods to reduce algorithm and parameter dependencies. To do so we separate the processes of fascicle generation and evaluation. Specifically, we analyze the value of creating optimized connectomes by systematically combining candidate streamlines from an ensemble of algorithms (deterministic and probabilistic) and systematically varying parameters (curvature and stopping criterion). The ensemble approach leads to optimized connectomes that provide better cross-validated prediction error of the diffusion MRI data than optimized connectomes generated using a single-algorithm or parameter set. Furthermore, the ensemble approach produces connectomes that contain both short- and long-range fascicles, whereas single-parameter connectomes are biased towards one or the other. In summary, a systematic ensemble tractography approach can produce connectomes that are superior to standard single parameter estimates both for predicting the diffusion measurements and estimating white matter fascicles.

15.
Ordinary algebra may be used to backcalculate health-based cleanup targets in deterministic risk assessments, but it does not work in interval or probabilistic risk assessments. Equations with interval or random variables do not follow the rules of ordinary algebra. This paper explains the need for more sophisticated methods to backcalculate soil cleanup targets when using interval or random variables.
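A toy illustration (hypothetical exposure model and numbers, not taken from the paper) of the point being made: inverting the risk equation at a mean exposure value does not control an upper-percentile risk once the exposure factor is a random variable, so the cleanup target has to be found by forward Monte Carlo simulation instead.

import numpy as np

rng = np.random.default_rng(3)
risk_limit = 1e-6
# Hypothetical lognormal intake factor: incremental risk per unit soil concentration
intake = rng.lognormal(mean=np.log(2e-4), sigma=0.8, size=100_000)

# Deterministic backcalculation using the mean exposure factor:
c_naive = risk_limit / intake.mean()
print("naive target:", c_naive,
      "95th-percentile risk at that target:", np.quantile(c_naive * intake, 0.95))

# Forward probabilistic approach: shrink the target until the 95th-percentile risk meets the limit.
c = c_naive
while np.quantile(c * intake, 0.95) > risk_limit:
    c *= 0.95
print("probabilistic target:", c)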

16.
A fundamental and frequently overlooked aspect of animal learning is its reliance on compatibility between the learning rules used and the attentional and motivational mechanisms directing them to process the relevant data (called here data-acquisition mechanisms). We propose that this coordinated action, which may first appear fragile and error prone, is in fact extremely powerful, and critical for understanding cognitive evolution. Using basic examples from imprinting and associative learning, we argue that by coevolving to handle the natural distribution of data in the animal's environment, learning and data-acquisition mechanisms are tuned jointly so as to facilitate effective learning using relatively little memory and computation. We then suggest that this coevolutionary process offers a feasible path for the incremental evolution of complex cognitive systems, because it can greatly simplify learning. This is illustrated by considering how animals and humans can use these simple mechanisms to learn complex patterns and represent them in the brain. We conclude with some predictions and suggested directions for experimental and theoretical work.

17.
This paper discusses two problems related to three-dimensional object recognition. The first is segmentation and the selection of a candidate object in the image; the second is the recognition of a three-dimensional object from different viewing positions. Regarding segmentation, it is shown how globally salient structures can be extracted from a contour image based on geometrical attributes, including smoothness and contour length. This computation is performed by a parallel network of locally connected neuron-like elements. With respect to the effect of viewing position, it is shown how the problem can be overcome by using linear combinations of a small number of two-dimensional object views. In both problems the emphasis is on methods that are relatively low-level in nature. Segmentation is performed using a bottom-up process driven by the geometry of image contours. Recognition is performed without using explicit three-dimensional models, but by the direct manipulation of two-dimensional images.
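A compact sketch of the "linear combination of views" idea mentioned above (orthographic projection, point correspondences assumed known; all data synthetic and illustrative): a novel view of the same rigid object is well explained as a linear combination of a few stored two-dimensional views, while a view of a different object is not.

import numpy as np

def view(points3d, angle):
    c, s = np.cos(angle), np.sin(angle)
    rot = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])        # rotation about the y-axis
    p = points3d @ rot.T
    return np.concatenate([p[:, 0], p[:, 1]])                  # stacked image coordinates

rng = np.random.default_rng(4)
obj = rng.normal(size=(30, 3))
other = rng.normal(size=(30, 3))

# Three stored views plus a constant column (to absorb translation)
stored = np.column_stack([view(obj, a) for a in (0.0, 0.4, 0.9)] + [np.ones(60)])

def residual(v):
    coeffs, *_ = np.linalg.lstsq(stored, v, rcond=None)
    return np.linalg.norm(stored @ coeffs - v)

print(residual(view(obj, 0.6)))     # near zero: novel view of the same object is spanned
print(residual(view(other, 0.6)))   # large: a different object is not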

18.
This paper presents results of theoretical computations on the interaction energies and geometries for the binding to nucleic acids of a number of representative groove-binding, non-intercalating drugs: netropsin, distamycin A, SN 18071, etc. The computations account for the specificity of binding in all cases and demonstrate that the formation of hydrogen bonds is necessary neither for binding nor for the preference for the minor groove of AT sequences of B-DNA. It appears that, if a relatively good steric fit can be obtained in the minor groove, the interaction will be preferentially stabilized there by the favorable electrostatic potential generated in this groove by the AT sequences. The computation of the interaction energies in free space does not, however, reproduce the order of affinities of the ligands studied and yields binding energies that are too large. The introduction of the solvent effect, through the computation of hydration and cavitation effects, confirms the specificity, improves the ordering and brings the values of the energies close to the experimental ones. The theoretical account of the "surprising" effect of netropsin binding to the major groove of the TψC stem of tRNA(Phe) confirms the decisive significance of the distribution of the molecular electrostatic potential for the selection of the binding site. The inclusion in the computations of the flexibility of DNA enables correct prediction of the main features of the macromolecular deformation upon ligand binding.

19.
Prediction of protein structural class with Rough Sets

Background  

A new method for the prediction of protein structural classes is constructed based on the Rough Sets algorithm, a rule-based data mining method. Amino acid composition and 8 physicochemical properties are used as conditional attributes for the construction of the decision system. After reducing the decision system, decision rules are generated, which can be used to classify new objects.

20.
Jochen Trommer. Morphology, 2013, 23(2): 269-289
Syncretism in inflectional paradigms often corresponds only partially to natural classes. In this paper, I propose Morpheme Generalization Grammars, a novel paradigm-based approach to this phenomenon in which the morphosyntactic content of every affix corresponds to the maximal area of the paradigm where it regularly occurs, while additional Morpheme Generalization Rules selectively extend its paradigmatic coverage by deleting part of the featural content of affixes for specific paradigm cells. The resulting formalism maximizes the use of paradigmatic extension rules familiar from the Rules of Referral in Paradigm Function Morphology, but also has close parallels to Impoverishment rules in Distributed Morphology. By imposing inherent restrictions on the content of inflectional affixes, it substantially reduces the amount of analytic ambiguity in the modeling of inflectional morphology.
