Similar Articles
20 similar articles found (search time: 0 ms)
1.
 The idea that a sparse representation is the computational principle of visual systems has been supported by Olshausen and Field [Nature (1996) 381: 607–609] and many other studies. On the other hand, neurons in the inferotemporal cortex respond to moderately complex features called icon alphabets, and such neurons respond invariantly to stimulus position. To incorporate this property into sparse representation, an algorithm is proposed that trains basis functions using sparse representations with shift invariance. Shift invariance means that basis functions are allowed to move over the image data and that the coefficients are equipped with shift invariance. The algorithm is applied to natural images. It is ascertained that moderately complex graphical features emerge that are not as simple as Gabor filters and not as complex as real objects. Shift invariance and moderately complex features correspond to the properties of icon alphabets. The results show that there is another connection between visual information processing and sparse representations. Received: 3 November 1999 / Accepted in revised form: 17 February 2000
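The shift-invariant coefficient described above can be illustrated with a minimal sketch: a basis function is slid over the image, and the coefficient is taken as the strongest response over all positions. The function and the toy edge filter below are hypothetical illustrations of that one idea, not the paper's algorithm (which also trains the basis functions).

```python
import numpy as np

def shift_invariant_response(image, basis):
    """Slide a basis function over an image and return the maximum
    correlation and the shift at which it occurs. This is a toy sketch
    of a shift-invariant coefficient, not the paper's learning rule."""
    H, W = image.shape
    h, w = basis.shape
    best, best_shift = -np.inf, (0, 0)
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            r = np.sum(image[i:i + h, j:j + w] * basis)
            if r > best:
                best, best_shift = r, (i, j)
    return best, best_shift

# Toy example: a 1x2 edge filter responds to an edge wherever it sits.
img = np.zeros((8, 8))
img[:, 5] = 1.0                      # vertical edge at column 5
basis = np.array([[-1.0, 1.0]])      # hypothetical edge-detecting basis
resp, shift = shift_invariant_response(img, basis)
```

Because the coefficient is the maximum over positions, the same response is obtained no matter where in the image the edge appears.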

2.
3.
We describe a class of feed-forward neural network models for associative content-addressable memory (ACAM) which utilize sparse internal representations for stored data. In addition to the input and output layers, our networks incorporate an intermediate processing layer which serves to label each stored memory and to perform error correction and association. We study two classes of internal label representations: the unary representation and various sparse, distributed representations. Finally, we consider storage of sparse data and sparsification of data. These models are found to have advantages in terms of storage capacity, hardware efficiency, and recall reliability when compared to the Hopfield model, and to possess analogies to both biological neural networks and standard digital computer memories.
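A minimal sketch of the unary-label scheme described above: the intermediate layer holds one unit per stored memory, the winning unit labels the probe, and the output layer reproduces the stored pattern for that label, thereby correcting errors in the probe. The patterns and dimensions below are illustrative assumptions, not the paper's networks.

```python
import numpy as np

def recall(probe, memories):
    """Toy three-layer ACAM recall. The intermediate layer assigns a
    unary (one-hot) label via winner-take-all on the overlaps, and the
    output layer emits the stored pattern for that label."""
    M = np.array(memories)            # input -> label weights: one row per memory
    overlaps = M @ np.array(probe)    # label-layer activations
    label = int(np.argmax(overlaps))  # unary internal representation
    return label, M[label]            # label -> output: retrieve stored pattern

# Two stored +/-1 patterns; probe is the first one with a single bit flipped.
mems = [[1, 1, -1, -1], [-1, -1, 1, 1]]
noisy = [1, -1, -1, -1]
label, out = recall(noisy, mems)
```

Recall returns the clean stored pattern, so the single-bit error in the probe is corrected.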

4.
Paton RC. BioSystems 2002;66(1-2):43-53
The contemporary research and development context in multidisciplinary biology has a serious requirement for integrating knowledge from disparate sources, and for facilitating much-needed inter- and intra-disciplinary dialogue. A multiplicity of models arises when pluralistic approaches to modelling are followed, and also when there is a requirement to model not only systems and data, but also knowledge of systems and data. The challenges of addressing this multiplicity include not only articulating the structure of complex systems, but also placing modelling within the framework of a process as well as a product. The graph representations presented here facilitate dialogue, modelling, clarification and specification of concepts, and the sharing of terms. This paper explores relationships between collections of graph representations. It is hoped that in future, when readers look at a node or a process in a graph, they will have a much deeper appreciation of relationships and context.

5.
Once a new vaccine has been licensed, public health expertise is needed to support the decision regarding its possible inclusion in the national immunisation schedule. This analysis, based on an assessment of the benefits/risks balance and the costs/effectiveness ratio, is a multidisciplinary exercise. Largely based on epidemiological and immunological expertise, it also requires biomathematical and economic input if the long-term consequences of the vaccination are to be taken into account. Indeed, the main drivers of the decision are the burden of the disease, the characteristics of the vaccine in terms of effectiveness and safety, the cost of the vaccination, the feasibility of adding the vaccine to the schedule, the social demand for this vaccination, and the positive or negative indirect effects of large-scale vaccination on the epidemiology of the disease, in addition to the direct protective effect for vaccinated individuals. New vaccines are generally characterised by a more limited epidemiological impact than older vaccines, in a context of growing societal requirements regarding the safety of drugs, and especially of vaccines. Both the real and the perceived benefits/risks balances for the more recent vaccines appear questionable. The possibility of detrimental epidemiological consequences of either insufficient vaccination coverage or serotype (or serogroup) replacement is another factor that makes decisions regarding vaccination strategies increasingly complex.
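The costs/effectiveness ratio mentioned above is usually computed in incremental form, as extra cost per extra unit of health effect gained. A minimal sketch, with purely hypothetical figures (the abstract gives no numbers):

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio (ICER): extra cost per extra
    unit of health effect (for example, per QALY gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical programme: vaccination costs 5.0M vs 1.0M without it,
# and yields 2000 vs 1200 QALYs over the modelled horizon.
ratio = icer(5_000_000, 1_000_000, 2000, 1200)   # cost per QALY gained
```

The resulting cost per QALY would then be compared against a willingness-to-pay threshold as one input to the multidisciplinary decision described above.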

6.
7.
Molecular Basis for Genetic Recombination

8.
9.
Photographs are often used to establish the identity of an individual or to verify that they are who they claim to be. Yet, recent research shows that it is surprisingly difficult to match a photo to a face. Neither humans nor machines can perform this task reliably. Although human perceivers are good at matching familiar faces, performance with unfamiliar faces is strikingly poor. The situation is no better for automatic face recognition systems. In practical settings, automatic systems have been consistently disappointing. In this review, we suggest that failure to distinguish between familiar and unfamiliar face processing has led to unrealistic expectations about face identification in applied settings. We also argue that a photograph is not necessarily a reliable indicator of facial appearance, and develop our proposal that summary statistics can provide more stable face representations. In particular, we show that image averaging stabilizes facial appearance by diluting aspects of the image that vary between snapshots of the same person. We review evidence that the resulting images can outperform photographs in both behavioural experiments and computer simulations, and outline promising directions for future research.
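The image-averaging idea can be sketched directly: a pixel-wise mean of aligned snapshots dilutes image-specific variation (lighting, expression, camera) while stable person-specific structure accumulates. The toy "identity" array and noise model below are assumptions for illustration, and the sketch assumes the snapshots are already registered to a common frame.

```python
import numpy as np

def average_image(snapshots):
    """Pixel-wise mean of aligned face images: a minimal sketch of the
    image-averaging proposal, assuming pre-registered snapshots."""
    stack = np.stack([np.asarray(s, dtype=float) for s in snapshots])
    return stack.mean(axis=0)

rng = np.random.default_rng(0)
identity = np.full((4, 4), 0.5)                       # the stable "face"
snaps = [identity + rng.normal(0, 0.2, (4, 4)) for _ in range(50)]
avg = average_image(snaps)

# The average lies closer to the underlying identity than a single snapshot.
err_avg = np.abs(avg - identity).mean()
err_one = np.abs(snaps[0] - identity).mean()
```

With zero-mean image-to-image variation, the averaging error shrinks roughly as one over the square root of the number of snapshots, which is the stabilizing effect the review describes.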

10.
11.
Accuracy of alternative representations for integrated biochemical systems
Voit EO, Savageau MA. Biochemistry 1987;26(21):6869-6880
The Michaelis-Menten formalism often provides appropriate representations of individual enzyme-catalyzed reactions in vitro but is not well suited for the mathematical analysis of complex biochemical networks. Mathematically tractable alternatives are the linear formalism and the power-law formalism. Within the power-law formalism there are alternative ways to represent biochemical processes, depending upon the degree to which fluxes and concentrations are aggregated. Two of the most relevant variants for dealing with biochemical pathways are treated in this paper. In one variant, aggregation leads to a rate law for each enzyme-catalyzed reaction, which is then represented by a power-law function. In the other, aggregation produces a composite rate law for either the net rate of increase or the net rate of decrease of each system constituent; the composite rate laws are then represented by power-law functions. The first variant is the mathematical basis for a method of biochemical analysis called metabolic control; the second is the basis for biochemical systems theory. We compare the accuracy of the linear and of the two power-law representations for networks of biochemical reactions governed by Michaelis-Menten and Hill kinetics. Michaelis-Menten kinetics are always represented more accurately by power-law than by linear functions. Hill kinetics are in most cases best modeled by power-law functions, but in some cases linear functions are best. Aggregation into composite rate laws for the net increase or net decrease of each system constituent almost always improves the accuracy of the power-law representation. The improvement in accuracy is one of several factors that contribute to the wide range of validity of this power-law representation. Other contributing factors that are discussed include the nonlinear character of the power-law formalism, homeostatic regulatory mechanisms in living systems, and simplification of rate laws by regulatory mechanisms in vivo.
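The power-law representation of a Michaelis-Menten rate law can be sketched by matching the rate and its log-log slope (the kinetic order) at a chosen operating point. The parameter values below are illustrative assumptions, not taken from the paper:

```python
def mm_rate(S, Vmax=1.0, Km=1.0):
    """Michaelis-Menten rate law v = Vmax * S / (Km + S)."""
    return Vmax * S / (Km + S)

def power_law_approx(S, S0, Vmax=1.0, Km=1.0):
    """Local power-law representation v = a * S**g, with the kinetic
    order g equal to the log-log slope of the MM rate at operating
    point S0, and a chosen so the two rates agree at S0."""
    g = Km / (Km + S0)                  # d(ln v)/d(ln S) at S0
    a = mm_rate(S0, Vmax, Km) / S0**g   # match the rate at S0
    return a * S**g

S0 = 1.0
exact = mm_rate(1.2)
approx = power_law_approx(1.2, S0)
rel_err = abs(approx - exact) / exact   # small near the operating point
```

By construction the two representations agree exactly at the operating point, and the power-law form stays close for moderate excursions, which is the kind of accuracy comparison the paper carries out systematically.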

12.
We describe a hierarchical, generative model that can be viewed as a nonlinear generalization of factor analysis and can be implemented in a neural network. The model uses bottom-up, top-down and lateral connections to perform Bayesian perceptual inference correctly. Once perceptual inference has been performed the connection strengths can be updated using a very simple learning rule that only requires locally available information. We demonstrate that the network learns to extract sparse, distributed, hierarchical representations.
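The generative direction of such a model can be sketched in miniature: sparse top-level causes pass through a nonlinearity and a linear visible layer to produce data. The dimensions, weight distributions, and noise level below are illustrative assumptions, and neither the inference nor the local learning rule from the abstract is shown.

```python
import numpy as np

def generate(n, rng):
    """Sample n data vectors from a toy two-layer generative model:
    sparse top-level causes, a nonlinear hidden layer, then a linear
    visible layer with additive noise. A sketch in the spirit of a
    nonlinear generalization of factor analysis, not the paper's model."""
    W2 = rng.normal(size=(3, 5))           # top -> hidden weights
    W1 = rng.normal(size=(5, 8))           # hidden -> visible weights
    top = rng.laplace(size=(n, 3))         # sparse (heavy-tailed) causes
    hidden = np.maximum(0.0, top @ W2)     # nonlinear hidden layer
    visible = hidden @ W1 + 0.1 * rng.normal(size=(n, 8))
    return visible

rng = np.random.default_rng(1)
data = generate(100, rng)                  # 100 samples, 8 visible units
```

Linear factor analysis is the special case in which the hidden nonlinearity is the identity and there is a single layer of Gaussian factors.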

13.
Forecasting technological progress is of great interest to engineers, policy makers, and private investors. Several models have been proposed for predicting technological improvement, but how well do these models perform? An early hypothesis made by Theodore Wright in 1936 is that cost decreases as a power law of cumulative production. An alternative hypothesis is Moore's law, which can be generalized to say that technologies improve exponentially with time. Other alternatives were proposed by Goddard, Sinclair et al., and Nordhaus. These hypotheses have not previously been rigorously tested. Using a new database on the cost and production of 62 different technologies, which is the most expansive of its kind, we test the ability of six different postulated laws to predict future costs. Our approach involves hindcasting and developing a statistical model to rank the performance of the postulated laws. Wright's law produces the best forecasts, but Moore's law is not far behind. We discover a previously unobserved regularity that production tends to increase exponentially. A combination of an exponential decrease in cost and an exponential increase in production would make Moore's law and Wright's law indistinguishable, as originally pointed out by Sahal. We show for the first time that these regularities are observed in data to such a degree that the performance of these two laws is nearly the same. Our results show that technological progress is forecastable, with the square root of the logarithmic error growing linearly with the forecasting horizon at a typical rate of 2.5% per year. These results have implications for theories of technological change, and assessments of candidate technologies and policies for climate change mitigation.
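The two leading hypotheses can be sketched on synthetic data: Wright's law regresses log cost on log cumulative production, while the generalized Moore's law regresses log cost on time. The cost and production series below are fabricated for illustration only, built so that production grows exponentially and cost follows Wright's law exactly, the combination that, as noted above, makes the two laws nearly indistinguishable.

```python
import numpy as np

# Fabricated history (arbitrary units): exponentially growing production,
# cost falling as a power law of cumulative production (Wright's law).
t = np.arange(10)
production = 100 * np.exp(0.3 * t)
cumulative = np.cumsum(production)
cost = 50 * cumulative ** -0.25            # Wright exponent -0.25 by design

# Fit both postulated laws by least squares in log space.
wright_slope = np.polyfit(np.log(cumulative), np.log(cost), 1)[0]
moore_slope = np.polyfit(t, np.log(cost), 1)[0]
```

Because cumulative production is itself growing roughly exponentially in time, the time regression also finds a clean negative slope, illustrating why hindcasting over many technologies is needed to separate the two laws' forecasting performance.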

14.
15.
16.
17.
18.
The evidence for patrilocal group organization among precontact hunter-gatherers is ambiguous. Observations among modern hunter-gatherers suggest that few, if any, are organized into patrilineal hordes or bands, but these observations stem from declining populations in transformed environments. These inadequacies in the data imbue the theoretical arguments concerning hunter-gatherer local organization with special importance. It is shown that ecological arguments against the viability of patrilineal bands fail because they conflate the membership of the camp with the band. The argument that variations in family size preclude patrilineal bands is found to be true in nonpolygynous populations. However, it is argued that patrilineal bands should develop in stable, strongly polygynous populations with small local groups. [Keywords: hunter-gatherers, local organization, ecology, demography, polygyny]

19.
20.
Gompertz's model (1825) has remained a purely empirical one, despite its frequent use. A theoretical justification is given which permits its employment in cases of accretionary growth.
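The Gompertz model is commonly written W(t) = A * exp(-b * exp(-c * t)), a sigmoid that saturates at the asymptotic size A. A minimal sketch with illustrative parameter values (the abstract gives none):

```python
import math

def gompertz(t, A=1.0, b=2.0, c=0.5):
    """Gompertz growth curve W(t) = A * exp(-b * exp(-c * t)).
    A is the asymptotic size; b sets the initial displacement and c the
    rate of approach. Parameter values here are illustrative only."""
    return A * math.exp(-b * math.exp(-c * t))

# Size sampled at regular intervals: monotone growth toward A.
sizes = [gompertz(t) for t in range(0, 20, 2)]
```

Unlike the logistic curve, the Gompertz curve is asymmetric about its inflection point, which is one reason it fits many accretionary growth records empirically.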


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)