Similar Literature
 20 similar documents found.
1.
In 1923, Max Wertheimer proposed a research programme and method in visual perception. He conjectured the existence of a small set of geometric grouping laws governing the perceptual synthesis of phenomenal objects, or "gestalts", from the atomic retinal input. In this paper, we review this set of geometric grouping laws, drawing on the works of Metzger, Kanizsa and their schools. We then explain why the Gestalt theory research programme can be translated into a Computer Vision programme. This translation is not straightforward, since Gestalt theory never addressed two fundamental matters: image sampling and the measurement of image information. Once these gaps are filled, we show that gestalt grouping laws can be translated into quantitative laws allowing the automatic computation of gestalts in digital images. From the psychophysical viewpoint, a key issue arises: the computer vision gestalt detection methods deliver predictable perception thresholds. We are therefore in a position to build artificial images and check whether the computationally predicted thresholds agree with the psychophysical ones. We describe and discuss two preliminary sets of experiments in which we compared the gestalt detection performance of several subjects with the predicted detection curve. In our opinion, the results of this experimental comparison support a much more systematic interaction between computational predictions in Computer Vision and psychophysical experiments.
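To make the notion of a predictable detection threshold concrete, the sketch below illustrates the a-contrario detection rule used in this line of work: a candidate grouping is accepted when the number of false alarms (NFA), the expected count of equally extreme events under a background noise model, falls below a threshold. This is a minimal illustration with hypothetical numbers, not the paper's implementation.

    from math import comb

    def binomial_tail(n, k, p):
        # P[X >= k] for X ~ Binomial(n, p)
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    def nfa(n_tests, n, k, p):
        # Number of false alarms: expected count, under the noise model,
        # of candidate groupings at least as extreme as the observed one.
        return n_tests * binomial_tail(n, k, p)

    # Hypothetical example: a candidate alignment of n=50 pixels, k=30 of
    # which have gradient orientation aligned with the segment to precision
    # p=1/16; an N x N image has roughly N^4 candidate segments to test.
    N = 256
    print(nfa(N**4, n=50, k=30, p=1 / 16))  # far below 1: the gestalt is detected

Because the threshold depends only on counts and the noise model, the predicted detection curve can be compared directly with subjects' psychophysical thresholds, as the abstract describes.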

2.

Background

In fledgling areas of research, evidence supporting causal assumptions is often scarce because few empirical studies have been conducted. In many studies it remains unclear what impact explicit and implicit causal assumptions have on the research findings; often only the researchers' primary assumptions are presented. This is particularly true for research on the effect of faculty’s teaching performance on their role modeling. There is therefore a need for robust frameworks and methods for the transparent, formal presentation of the underlying causal assumptions used in assessing the causal effects of teaching performance on role modeling. This study explores the effects of different (plausible) causal assumptions on research outcomes.

Methods

This study revisits a previously published study about the influence of faculty’s teaching performance on their role modeling (as teacher-supervisor, physician and person). We drew eight directed acyclic graphs (DAGs) to visually represent different plausible causal relationships between the variables under study. These DAGs were subsequently translated into corresponding statistical models, and regression analyses were performed to estimate the associations between teaching performance and role modeling.

Results

The different causal models were compatible with major differences in the magnitude of the relationship between faculty’s teaching performance and their role modeling. Odds ratios for the associations between teaching performance and the three role model types ranged from 31.1 to 73.6 for the teacher-supervisor role, from 3.7 to 15.5 for the physician role, and from 2.8 to 13.8 for the person role.

Conclusions

Different sets of assumptions about causal relationships in role modeling research can be visually depicted using DAGs, which are then used to guide both statistical analysis and interpretation of results. Since study conclusions can be sensitive to different causal assumptions, results should be interpreted in the light of causal assumptions made in each study.
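As a minimal illustration of how different DAGs translate into different statistical models, the sketch below uses synthetic data (the variable names, confounder and effect sizes are invented for illustration, not taken from the study): two causal diagrams imply different adjustment sets, and the estimated odds ratio for the same exposure shifts accordingly.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 5000

    # Hypothetical confounder (say, clinical experience) affecting both
    # teaching performance T and role-model nomination R.
    c = rng.normal(size=n)
    t = (rng.normal(size=n) + c > 0).astype(float)   # teaching performance
    logit_r = -1.0 + 2.0 * t + 1.5 * c               # true effect of T: OR = e^2 ~ 7.4
    r = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_r))).astype(float)

    # DAG 1 treats C as irrelevant; DAG 2 treats it as a confounder to adjust for.
    for exog, label in [(sm.add_constant(t), "DAG 1: no adjustment"),
                        (sm.add_constant(np.column_stack([t, c])), "DAG 2: adjust for C")]:
        fit = sm.Logit(r, exog).fit(disp=0)
        print(label, "OR =", round(float(np.exp(fit.params[1])), 2))

The unadjusted model recovers a markedly inflated odds ratio, mirroring the abstract's finding that plausible alternative causal models are compatible with large differences in estimated effect sizes.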

3.
The problems related to the development of concepts of rational taxonomy and rational classifications (taxonomic systems) in biology are discussed. Rational taxonomy is based on the assumption that the key characteristic of rationality is deductive inference of certain partial judgments about the reality under study from other judgments taken as more general and a priori true. Accordingly, two forms of rationality are distinguished: ontological and epistemological. The former implies inference of the properties of classifications from general (essential) properties of the reality being investigated; the latter implies inference of partial rules of judgment about classifications from more general (formal) rules. The following principal concepts of ontologically rational biological taxonomy are considered: the "crystallographic" approach; inference of the orderliness of organismal diversity from general laws of Nature; inference of this orderliness from the orderliness of ontogenetic development programs; approaches based on the concept of natural kind and Cassirer's series theory; approaches based on the systemic concept; and approaches based on the idea of periodic systems. The various concepts of ontologically rational taxonomy can be generalized by the idea of causal taxonomy, according to which any biologically sound classification is founded on a contentwise model of biological diversity that explicitly indicates the general causes responsible for that diversity. Each category of general causation, with its corresponding background model, may serve as the basis for a particular ontologically rational taxonomy as a distinctive research programme. Concepts of epistemologically rational taxonomy and classifications (taxonomic systems) can be interpreted in terms of the application of epistemological criteria for substantiating the scientific status of taxonomy in general and of taxonomic systems in particular. These concepts include: assessment of the consistency of taxonomy from the standpoint of inductive and hypothetico-deductive argumentation schemes, and of such fundamental criteria of the naturalness of classifications as their prognostic capabilities; and the foundation of a theory of "general taxonomy" as a "general logic", including elements of the axiomatic method. The latter concept constitutes the core of the programme of general classiology; it is inconsistent because nothing like a "general logic" exists. It is argued that a theory of taxonomy as a biological discipline cannot be elaborated on the formal principles of epistemological rationality; instead, it should be elaborated as an ontologically rational theory based on biologically sound metatheories about the causes of biological diversity.

4.
One of the major criticisms of optimal foraging theory (OFT) is that it is not testable. In discussions of this criticism, opposing parties have confused methodological concepts and used meaningless biological concepts. In this paper we discuss such misunderstandings and show that OFT has an empirically testable, and even well-confirmed, general core theory. One of our main conclusions is that specific model testing should not be aimed at proving optimality, but rather at identifying the context in which certain types of behaviour are optimal. To do this, it is necessary to be aware of the assumptions made in testing a model. The assumptions explicitly stated in the literature up to now do not completely cover the actual assumptions made in testing OFT models in practice. We present a more comprehensive set of assumptions. Although all the assumptions play a role in testing models, they are not of equal status. Crucial assumptions concern constraints and the relation between fitness and currency. It is therefore essential to make such assumptions testable in practice. We show that a more explicit relationship between OFT modelling and evolutionary theory can help with this. Specifically, phylogeny reconstruction and population dynamic modelling can and should be used to formulate assumptions concerning constraints and currencies.

5.
A simulation study of the relationship between habitat destruction and animal species extinction   (cited 28 times: 13 self-citations, 15 citations by others)
林振山 (Lin Zhenshan), 汪曙光 (Wang Shuguang). 《生态学报》 (Acta Ecologica Sinica), 2002, 22(4): 535-540
Using a model of multiple coexisting species, the dynamics of different animal populations were simulated under a range of conditions. The results show that: (1) species extinction caused by habitat destruction depends on the assumptions made about species mortality rates and the relevant equilibria; under different assumptions, even for the same rate of habitat destruction, the species that go extinct may be the strongest competitors or the comparatively weak ones, so it is neither necessarily the weak species that go extinct first, as traditional theories of species evolution hold, nor necessarily the strongest species, as Tilman and colleagues maintain; (2) if the weak species have higher average mortality rates, then once the habitat suffers a certain degree of destruction, more of the strong species go extinct and extinction times are greatly shortened; (3) with species mortality rates held constant, a larger equilibrium occupancy (denoted p^0) of a species on undestroyed habitat favours its survival.
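The abstract contrasts its results with those of Tilman and colleagues, whose competition-colonization metapopulation model with a destroyed habitat fraction D is the standard reference point here. The sketch below integrates that model (all parameter values are hypothetical); swapping in the alternative mortality vector reproduces the kind of assumption-dependence the paper describes.

    import numpy as np

    def rhs(p, c, m, D):
        # Multispecies competition-colonization model with a fraction D of
        # habitat destroyed (Tilman et al. 1994). Species are ranked by
        # competitive ability: species j < i displaces species i.
        dp = np.empty_like(p)
        for i in range(len(p)):
            dp[i] = (c[i] * p[i] * (1 - D - p[: i + 1].sum())
                     - m[i] * p[i]
                     - np.sum(c[:i] * p[:i]) * p[i])
        return dp

    # Hypothetical parameters: 5 species; weaker competitors colonize faster.
    c = 0.2 * 2.0 ** np.arange(5)
    m_equal = np.full(5, 0.1)                 # equal mortality for all species
    m_weak_high = 0.1 * 1.3 ** np.arange(5)  # weaker species die faster

    p = np.full(5, 0.05)
    for _ in range(50000):                    # crude Euler integration, dt = 0.01
        p = np.clip(p + 0.01 * rhs(p, c, m_equal, D=0.4), 0.0, 1.0)
    print(p.round(3))                         # occupancies that persist under D = 0.4

Which species are driven to extinction depends on the mortality assumptions (m_equal versus m_weak_high) and on the equilibrium occupancies, which is the abstract's central point.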

6.
The results of experiments performed by L.S. Stone and by R.W. Sperry on vision in salamanders are analyzed. A summary of Reichardt's theory, which originated in the researches of B. Hassenstein on insects, is given, and an application to salamanders is carried out. Introducing a simpler mathematical development based on the one-factor theory of excitation, the same formal results are obtained. A complete model, which includes those results and tries to fit the salamander behavior described by Stone, is elaborated and translated into Boolean terms. As a complementary consideration, it is pointed out that the functional rigidity of the visuomotor system, deriving from a biological molecular hysteresis, could be framed within the neuro-affinity principle, the spatial expression of which would turn out to be a biological field. This research was supported by the United States Air Force through the Air Force Office of Scientific Research of the Air Research and Development Command under Contract No. AF 18(600)-1454. Reproduction in whole or in part is permitted for any purpose of the United States Government.

7.
8.
A model is developed for the spread of a state in small social groups. Under suitable assumptions the model exhibits formal identity with Markov chain theory. The basic theorems and classifications of Markov chain theory are stated and interpreted in terms of the model. Finally, some procedures for testing the model are indicated.
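A minimal sketch of the kind of model described (the transition matrix and state interpretation are invented for illustration): the group's state is the number of members who have adopted some trait, and standard absorbing-chain results then give expected times to absorption.

    import numpy as np

    # Hypothetical 3-person group; the chain's state is the number of members
    # who have adopted a trait (0..3). Adoption spreads by contact, and the
    # all-or-none states are absorbing, as in simple contagion models.
    P = np.array([
        [1.0, 0.0, 0.0, 0.0],   # 0 adopters: absorbing (nothing to spread)
        [0.1, 0.5, 0.4, 0.0],   # 1 adopter
        [0.0, 0.1, 0.4, 0.5],   # 2 adopters
        [0.0, 0.0, 0.0, 1.0],   # 3 adopters: absorbing
    ])

    # Expected time to absorption from each transient state: t = (I - Q)^-1 1,
    # a standard result of absorbing Markov chain theory.
    Q = P[1:3, 1:3]
    t = np.linalg.solve(np.eye(2) - Q, np.ones(2))
    print(t)  # mean number of steps until the group settles in an absorbing state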

9.
Blyth MG, Mestel AJ, Zabielski L. Biorheology, 2002, 39(3-4): 345-350
The macrocirculation is modelled by incompressible Newtonian flow through a rigid network of pipes, for which possible simplifications are discussed. The common assumptions of two-dimensionality or axisymmetry can be generalised to helical symmetry, and in the first part of the paper the three-dimensionality of arterial bends is considered by varying the curvature and torsion of a section of a helical pipe. The torsion is found to impart a preferential twist to the cross-sectional flow. This loss of symmetry ensures that flow separation is less severe for a helical bend than for a toroidal bend. The effects of variations in body size are examined using allometric scaling laws. In the second part of the paper, the approach to "fully developed" Dean or Womersley flow is considered in an attempt to quantify the regions of validity of idealised models. A perturbation approach, akin to hydrodynamic stability theory, is used. It is argued that potential flows are often more suitable for describing the rapid interactions between geometry and pulsatility than the eventual fully developed state, so that, for example, the first 100 degrees of the aortic arch may be considered irrotational. Helical potential flows are found to develop faster than the corresponding toroidal flows, but slower than those in a straight pipe. The presence of vorticity in the core also retards the development of symmetric flows. It is concluded that while idealised flows can occur at some points in the body, in general experimental observation is needed to justify their use. Particular caution is recommended when interpreting calculations with Poiseuille input.

10.
Reliability theory is a general theory about systems failure. It allows researchers to predict the age-related failure kinetics for a system of given architecture (reliability structure) and given reliability of its components. Reliability theory predicts that even those systems that are entirely composed of non-aging elements (with a constant failure rate) will nevertheless deteriorate (fail more often) with age, if these systems are redundant in irreplaceable elements. Aging, therefore, is a direct consequence of systems redundancy. Reliability theory also predicts the late-life mortality deceleration with subsequent leveling-off, as well as the late-life mortality plateaus, as an inevitable consequence of redundancy exhaustion at extreme old ages. The theory explains why mortality rates increase exponentially with age (the Gompertz law) in many species, by taking into account the initial flaws (defects) in newly formed systems. It also explains why organisms "prefer" to die according to the Gompertz law, while technical devices usually fail according to the Weibull (power) law. Theoretical conditions are specified when organisms die according to the Weibull law: organisms should be relatively free of initial flaws and defects. The theory makes it possible to find a general failure law applicable to all adult and extreme old ages, where the Gompertz and the Weibull laws are just special cases of this more general failure law. The theory explains why relative differences in mortality rates of compared populations (within a given species) vanish with age, and mortality convergence is observed due to the exhaustion of initial differences in redundancy levels. Overall, reliability theory has an amazing predictive and explanatory power with a few, very general and realistic assumptions. Therefore, reliability theory seems to be a promising approach for developing a comprehensive theory of aging and longevity integrating mathematical methods with specific biological knowledge.
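The core prediction is easy to reproduce numerically. The sketch below (parameter values hypothetical) computes the failure rate of a redundant block of n parallel, irreplaceable, non-aging elements, each with constant failure rate k; the system's rate climbs steeply at first and then levels off at k, the late-life plateau the theory predicts.

    import numpy as np

    def system_hazard(t, n, k):
        # Hazard (failure rate) of a block of n parallel, irreplaceable,
        # non-aging elements, each with constant failure rate k.
        q = 1.0 - np.exp(-k * t)              # P(single element failed by t)
        surv = 1.0 - q ** n                   # system survival probability
        dens = n * k * np.exp(-k * t) * q ** (n - 1)  # system failure density
        return dens / surv

    t = np.linspace(0.01, 100, 6)
    print(system_hazard(t, n=10, k=0.1).round(4))
    # Rises steeply at first (Weibull-like, ~ n * k^n * t^(n-1) for small t),
    # then levels off at k: redundancy exhaustion yields the mortality plateau.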

11.
There are three partially random kinetic mechanisms which may be visualized for enzymes that catalyze reactions involving three substrates. The steady-state rate equation for these mechanisms can be reduced to the form derived by assuming equilibrium kinetics from consideration of inequality relationships among rate constants alone. The validity of these assumptions and of the resulting rate laws is considered.

12.
In Greece, since 2000, the teaching of evolutionary theory has been restricted solely to lower (junior) high school, and specifically to the ninth grade. Even though the theory of evolution is included in the 12th-grade biology textbook, it is not taught in Greek upper (senior) high schools. This study presents research conducted on the conceptions of Greek students regarding issues in the theory of evolution after the formal completion of its teaching. The sample comprised 411 10th-grade students from 12 different schools. The results show that the students appear to have a positive view of the idea of evolution, the evolution of man, and the common origin of organisms. However, they retain many alternative views, or else are completely ignorant of basic issues in evolutionary theory: what counts as evolution in biology, the main mechanism of evolutionary change (natural selection), what the theory of evolution actually explains, and what the word "theory" means in science. At least in Greece, these views still prevail because the theory of evolution is marginalized in the teaching of biology in Greek schools, and biology education does not help students formulate the overall conceptual structures that would enable them to understand the question of biological change.
Lucia Prinou

13.
Insofar as it may be regarded as a purely logicomathematical system, lacking empirical content, economic theory may, by appropriate empirical interpretation, be applied to domains quite apart from the market. Here such use yields a model of caste or jajmani relations. The chief assumptions of the model regarding the stability of a caste system are the following: (1) it requires a high concentration of political power; (2) it requires "prices" that do not change "freely" with supply and demand; (3) it requires consonance of political power, wealth, and ritual rank. These assumptions and various related deductions are tested statistically by data derived from village studies in India, Pakistan, and Ceylon. The evidence indicates that the model is generally plausible and may be applicable to hereditary hierarchies everywhere.

14.
Classical and Connectionist theories of cognitive architecture seek to explain systematicity (i.e., the property of human cognition whereby cognitive capacity comes in groups of related behaviours) as a consequence of syntactically and functionally compositional representations, respectively. However, both theories depend on ad hoc assumptions to exclude specific instances of these forms of compositionality (e.g., grammars, networks) that do not account for systematicity. By analogy with the Ptolemaic (i.e., geocentric) theory of planetary motion, although either theory can be made consistent with the data, both nonetheless fail to fully explain it. Category theory, a branch of mathematics, provides an alternative explanation based on the formal concept of adjunction, which relates a pair of structure-preserving maps called functors. A functor generalizes the notion of a map between representational states to include a map between state transformations (or processes). In a formal sense, systematicity is a necessary consequence of a higher-order theory of cognitive architecture, in contrast to the first-order theories derived from Classicism or Connectionism. Category theory offers a re-conceptualization for cognitive science, analogous to the one Copernicus provided for astronomy, in which representational states are no longer the center of the cognitive universe; that place is taken by the relationships between the maps that transform them.

15.
Levins and Lewontin have contributed significantly to our philosophical understanding of the structures, processes, and purposes of mathematical theorizing and modeling in biology. Here I explore their separate and joint pleas to avoid making abstract and ideal scientific models ontologically independent by confusing or conflating our scientific models with the world. I differentiate two views of theorizing and modeling, orthodox and dialectical, in order to examine the advocacy of the latter view by Levins and Lewontin, among others. I compare the positions of these two views on four points regarding ontological assumptions: (1) the origin of ontological assumptions, (2) the relation of such assumptions to the formal models of the same theory, (3) their use in integrating and negotiating different formal models of distinct theories, and (4) their employment in explanatory activity. "Dialectical" is used here both in its Hegelian-Marxist sense of opposition and tension between alternative positions and in its Platonic sense of dialogue between advocates of distinct theories. I investigate three case studies, from Levins and Lewontin as well as from a recent paper of mine, that show the relevance and power of the dialectical understanding of theorizing and modeling.

16.

Purpose

The nature of end-of-life (EoL) processes is highly uncertain for constructions built today. This uncertainty is often neglected in life cycle assessments (LCAs) of construction materials. This paper tests how EoL assumptions influence LCA comparisons of two alternative roof construction elements: glue-laminated wooden beams and steel frames. The assumptions tested include the type of technology and the use of attributional or consequential modelling approaches.

Methods

The study covers impact categories often considered in the construction industry: total and non-renewable primary energy demand, water depletion, global warming, eutrophication and photo-chemical oxidant creation. The following elements of the EoL processes are tested: the energy source used in demolition, the fuel type used for transportation to the disposal site, the means of disposal and the method for handling allocation problems in the EoL modelling. Two assumptions regarding technology development are tested: no development from today’s technologies, and that today’s low-impact technologies have become representative of average future technologies. For allocating environmental impacts of the waste handling to by-products (heat or recycled material), an attributional cut-off approach is compared with a consequential substitution approach. A scenario excluding all EoL processes is also considered.

Results and discussion

In all comparable scenarios, glulam beams have clear environmental benefits compared to steel frames, except in a scenario in which steel frames are recycled and today’s average steel production is substituted, in which case impacts are similar. The choice of methodological approach (attributional, consequential or fully disregarding EoL processes) does not seem to influence the relative performance of the compared construction elements. In absolute terms, four factors are shown to be critical for the results: whether EoL phases are considered at all, whether recycling or incineration is assumed in the disposal of glulam beams, whether a consequential or attributional approach is used in modelling the disposal processes, and whether today’s average technology or a low-impact technology is assumed for the substituted technology.

Conclusions

The results suggest that EoL assumptions can be highly important for LCA comparisons of construction materials, particularly in absolute terms. Therefore, we recommend that EoL uncertainties are taken into consideration in any LCA of long-lived products. For the studied product type, LCA practitioners should particularly consider EoL assumptions regarding the means of disposal, the expected technology development of disposal processes and any substituted technology, and the choice between attributional and consequential approaches.
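As a minimal numerical sketch of the methodological choice at stake (all figures hypothetical, for a single construction element): the attributional cut-off approach simply sums the element's own life cycle burdens, the consequential substitution approach additionally credits the burdens avoided by a by-product, and excluding EoL drops the final stages entirely.

    # Hypothetical GWP contributions (kg CO2-eq per functional unit).
    production = 120.0
    demolition = 5.0
    incineration = 40.0
    avoided_heat_credit = 55.0   # assumed district heat displaced by energy recovery

    gwp_cutoff = production + demolition + incineration   # attributional cut-off
    gwp_subst = gwp_cutoff - avoided_heat_credit          # consequential substitution
    gwp_no_eol = production                               # EoL processes excluded
    print(gwp_cutoff, gwp_subst, gwp_no_eol)              # 165.0 110.0 120.0

Even with these toy numbers, the three modelling choices reorder the absolute results, which is why the paper flags EoL assumptions as critical in absolute terms.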

17.
The scope of current optimal diet theory is greatly restricted by certain rather stringent assumptions upon which it rests. One of these is that the type of prey a predator encounters next is not influenced by the last type encountered. The purpose of this paper is to relax this and certain other assumptions and, in so doing, arrive at a set of rules for determining the structure of the optimal diet which are analogous to, but more general than, those of current theory. Once obtained, these rules are contrasted with their earlier analogues. The major findings are that (1) prey types are not necessarily added to the optimal diet in order of decreasing energy to handling time ratio, and (2) the abundance of a type initially excluded from the diet is not necessarily irrelevant in determining whether or not that type will be included in the future. These findings show that, in the more general case considered, the structure of the optimal diet may be quite different from that predicted by current theory.
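For reference, the sketch below implements the classical prey-choice rules that this paper generalizes (the zero-one algorithm of standard optimal diet theory; the prey parameters are hypothetical). The paper's point is precisely that, once encounters are sequentially dependent, this profitability ranking and the irrelevance of excluded types' abundances no longer hold.

    def optimal_diet(prey):
        # Classical prey model: rank types by profitability e/h and add them
        # while the next type's profitability exceeds the diet's overall
        # rate of energy intake, R = sum(lam*e) / (1 + sum(lam*h)).
        prey = sorted(prey, key=lambda x: x["e"] / x["h"], reverse=True)
        diet, E, H = [], 0.0, 1.0   # numerator and denominator of the intake rate
        for p in prey:
            if p["e"] / p["h"] <= E / H:
                break               # zero-one rule: all less profitable types excluded
            diet.append(p)
            E += p["lam"] * p["e"]  # lam = encounter rate with this type
            H += p["lam"] * p["h"]
        return diet, E / H

    # Hypothetical prey types: energy e, handling time h, encounter rate lam.
    prey = [{"e": 10, "h": 2, "lam": 0.2},
            {"e": 6,  "h": 3, "lam": 0.5},
            {"e": 3,  "h": 4, "lam": 1.0}]
    print(optimal_diet(prey))  # first two types enter the diet; the third is excluded

Note that in this classical algorithm the encounter rate of an excluded type never matters; finding (2) above shows that this property breaks down under sequentially dependent encounters.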

18.
The aim of the study is to contribute to a better understanding of some aspects of the structure of biological knowledge, and to clarify the extent to which methods of formal reasoning can be useful in this field when only qualitative information is available. A fragment of biological knowledge (the theory of cell motility) is analysed from the logico-methodological point of view as a coherent system, and the possibility of its formal representation is investigated. The analysis is based on distinguishing the main objects, and their features (attributes), of which a given piece of knowledge is composed, and the values these features may take. The features are interconnected by relations (with varying numbers of arguments), and these relations constitute the main (general, higher-level) laws of the given fragment of knowledge (theory). Values of attributes are likewise mutually connected, and these relations correspond to the detailed (lower-level) laws. A computer system (written in the Prolog language) makes it possible to perform inference operations of both progressive and regressive type. The main categories of reasoning procedures are described and illustrated by examples, namely: (a) search for conclusions that can be confronted with actual knowledge in order to verify the system as a whole; (b) formation of working hypotheses in the course of their empirical verification and the explanation of facts and laws. The problem of development and modification of the system is also discussed.
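A minimal sketch of the progressive (forward-chaining) style of inference the paper implements in Prolog, rendered here in Python with invented attributes and rules: lower-level laws connect attribute values, and new facts are derived until a fixed point is reached.

    # Hypothetical rules of the form: if all condition attributes have the
    # given values, conclude a new attribute value (a lower-level "law").
    rules = [
        ({"ATP_level": "low"}, ("motility", "reduced")),
        ({"motility": "reduced"}, ("wound_healing", "impaired")),
    ]
    facts = {"ATP_level": "low"}   # initial observations

    changed = True
    while changed:                 # progressive inference to a fixed point
        changed = False
        for cond, (attr, val) in rules:
            if all(facts.get(a) == v for a, v in cond.items()) and facts.get(attr) != val:
                facts[attr] = val
                changed = True
    print(facts)  # derived conclusions, to be confronted with actual knowledge

Regressive (backward) reasoning would instead start from a hypothesis such as ("wound_healing", "impaired") and search for rule chains and facts that could support it.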

19.
The applicability of qualitative analysis and modelling is discussed with reference to the population dynamics of the spruce bark beetle Ips typographus. On the basis of a set of assumptions and postulates, the structure of the population-dynamical interactions is discussed. Dynamic analysis suggests that the Scandinavian spruce bark beetle population is a stable system which may be triggered into "boom and bust" dynamics. We suggest that qualitative analysis should proceed, explicitly, through four steps: (1) an explicit statement of the biological assumptions and postulates; (2) a structural analysis, wherein the critical interaction network of the system is defined; (3) an equilibrium analysis, wherein the potential equilibrium points of the key organism (e.g., the pest species) are deduced; and (4) a dynamic analysis, wherein the multispecies equilibrium points are found and their sensitivity to perturbation analyzed. This formal approach should expose many and varied population systems to the interpretive and explanatory powers of qualitative analysis.
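Step (4), the dynamic analysis, can be illustrated with a local stability check on an interaction matrix. In the sketch below the sign structure loosely mimics a beetle-tree-enemy system; the magnitudes are arbitrary placeholders, not estimates for Ips typographus.

    import numpy as np

    # Hypothetical community matrix: bark beetles (B), host trees (T),
    # natural enemies (E); entry [i, j] is the effect of component j on
    # the growth of component i near equilibrium.
    A = np.array([
        [-0.2,  0.5, -0.4],   # B: self-limited, gains from T, suppressed by E
        [-0.6, -0.1,  0.0],   # T: killed by B, slow self-regulation
        [ 0.3,  0.0, -0.3],   # E: grows on B, self-limited
    ])
    eigs = np.linalg.eigvals(A)
    print(eigs.real.max() < 0)  # True -> the equilibrium is locally stable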

20.
D Clive. Mutation Research, 1988, 205(1-4): 313-330
The present analysis examines the assumptions underlying, the perceptions and predictivity of, and the need for short-term tests (STTs) for genotoxicity, in light of recent findings that most noncarcinogens from the National Toxicology Program are genotoxic (i.e., positive in one or more in vitro STTs). Reasonable assumptions about the prevalence of carcinogens (1-10% of all chemicals), the sensitivity of these STTs (ca. 90% of all carcinogens are genotoxic) and their estimated "false positive" incidence (60-75%) imply that the majority of chemicals elicit genotoxic responses and, consequently, that most in vitro genotoxins are likely to be noncarcinogenic. Thus, either the usual treatment conditions used in these in vitro STTs are producing a large proportion of artifactual and meaningless positive results, or else in vitro mutagenicity is too common a property of chemicals to serve as a useful predictor of carcinogenicity or other human risk. In contrast, the limited database on in vivo STTs suggests that the current versions of these assays may have low sensitivity, which appears unlikely to improve without dropping either their "short-term" aspect or the rodent-carcinogenicity benchmark. It is suggested that in vivo genotoxicity protocols be modified to take into consideration both the fundamentals of toxicology and the lessons learned from in vitro genetic toxicology. In the meantime, while in vivo assays undergo rigorous validation, genetic toxicology as currently practiced should not be a formal aspect of chemical or drug development, on the grounds that it is incapable of providing realistic and reliable information on human risk. It is urged that data generated in new, unvalidated in vivo genotoxicity assays be exempted from the normal regulatory reporting requirements, in order to encourage industry to participate in the laborious and expensive development of this next phase of genetic toxicology.
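The abstract's central argument is a Bayes' theorem calculation, reproduced in the sketch below using the figures it states (prevalence 1-10%, sensitivity ca. 90%, false-positive incidence 60-75%): even at the most optimistic ends of these ranges, most positive results come from noncarcinogens.

    def ppv(prevalence, sensitivity, false_positive_rate):
        # P(carcinogen | positive short-term test), by Bayes' theorem.
        true_pos = prevalence * sensitivity
        false_pos = (1 - prevalence) * false_positive_rate
        return true_pos / (true_pos + false_pos)

    # Figures taken from the abstract.
    for prev in (0.01, 0.10):
        for fpr in (0.60, 0.75):
            print(f"prevalence {prev:.0%}, FPR {fpr:.0%}: PPV {ppv(prev, 0.90, fpr):.1%}")
    # Even the best case (10% prevalence, 60% FPR) gives a PPV of only ~14%.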
