Similar Articles
20 similar articles found.
1.
The current paradigm of eukaryotic evolution is based primarily on comparative analysis of ribosomal RNA sequences. It shows several early-emerging lineages, mostly amitochondriate, which might be living relics of a progressive assembly of the eukaryotic cell. However, the analysis of slow-evolving positions, carried out with the newly developed slow-fast method, reveals that these lineages are, in terms of nucleotide substitution, fast-evolving ones, misplaced at the base of the tree by a long branch attraction artefact. Since the fast-evolving groups are not always the same, depending on which macromolecule is used as a marker, this explains most of the observed incongruent phylogenies. The current paradigm of eukaryotic evolution thus has to be seriously re-examined as the eukaryotic phylogeny is presently best summarized by a multifurcation. This is consistent with the Big Bang hypothesis that all extant eukaryotic lineages are the result of multiple cladogeneses within a relatively brief period, although insufficiency of data is also a possible explanation for the lack of resolution. For further resolution, rare evolutionary events such as shared insertions and/or deletions or gene fusions might be helpful.  相似文献   

2.
Optimized ecological-production paradigms for the farming-pastoral ecotone of northern China
叶学华, 梁士楚. 《生态学报》 (Acta Ecologica Sinica), 2004, 24(12): 2878-2886
The farming-pastoral ecotone of northern China performs dual ecological and production functions and occupies an important strategic position in national economic development. Constructing optimized ecological-production paradigms is an effective way to manage it scientifically. An optimized ecological-production paradigm is a model agro-forestry-pastoral compound management system, structurally optimized, functionally sustainable, and economically feasible, built by applying ecological-economic principles and systems-science methods, with ecological restoration and reconstruction as the goal, multi-purpose resource use and landscape ecological design as the core, and integrated biological-natural and socio-economic analysis as the basis, combining modern scientific achievements with the best of traditional farming and husbandry techniques. This article reviews the state of research on the configuration of optimized ecological-production paradigms and their supporting technologies, and proposes that paradigm construction should observe four principles: matching the scale of development to land carrying capacity, respecting limiting factors, accounting for landscape heterogeneity and scale, and unifying ecological, economic, and social benefits. In view of current problems in paradigm research, such as an overemphasis on economic benefits, a single research scale, a lack of interdisciplinary synthesis and integration, and an incomplete benefit evaluation system, the article outlines future directions: strengthening basic theory and applied technology, adjusting and optimizing existing paradigms, establishing a unified evaluation system, and quantifying how suitable and how well optimized a paradigm is. Basic research should focus on the form and substance of the responses of major agro-ecosystems to different human disturbances and on determining a series of key ecological-economic thresholds; applied research should focus on adjusting the industrial structure, actively developing high-technology industries, and raising the output-to-input ratio. Ecological and economic indicators should be monitored dynamically after a paradigm is implemented so that the paradigm can be adjusted and optimized in time, and the benefits achieved by different paradigms should be compared quantitatively across sites to select the best one. Interdisciplinary integration will become a further feature of paradigm research.

3.
MOTIVATION: The task of engineering a protein to perform a target biological function is known as protein design. A commonly used paradigm casts this functional design problem as a structural one, assuming a fixed backbone. In probabilistic protein design, positional amino acid probabilities are used to create a random library of sequences to be simultaneously screened for biological activity. Clearly, certain choices of probability distributions will be more successful in yielding functional sequences. However, since the number of sequences is exponential in protein length, computational optimization of the distribution is difficult. RESULTS: In this paper, we develop a computational framework for probabilistic protein design following the structural paradigm. We formulate the distribution of sequences for a structure using the Boltzmann distribution over their free energies. The corresponding probabilistic graphical model is constructed, and we apply belief propagation (BP) to calculate marginal amino acid probabilities. We test this method on a large structural dataset and demonstrate the superiority of BP over previous methods. Nevertheless, since the results obtained by BP are far from optimal, we thoroughly assess the paradigm using high-quality experimental data. We demonstrate that, for small scale sub-problems, BP attains identical results to those produced by exact inference on the paradigmatic model. However, quantitative analysis shows that the distributions predicted significantly differ from the experimental data. These findings, along with the excellent performance we observed using BP on the smaller problems, suggest potential shortcomings of the paradigm. We conclude with a discussion of how it may be improved in the future.  相似文献   
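The abstract outlines the Boltzmann-weighted graphical model without implementation detail. The sketch below is a minimal illustration of that idea, not the authors' code: sum-product message passing over a toy chain of four positions and a three-letter amino-acid alphabet, with all energies and the temperature invented for the example. On a chain the messages are exact; a real design problem would iterate the same updates loopily over a contact graph.

```python
import numpy as np

rng = np.random.default_rng(0)

AA = ["A", "L", "S"]            # toy 3-letter amino-acid alphabet
n_pos, k, kT = 4, len(AA), 0.6  # positions on a short chain, Boltzmann temperature

# Hypothetical single-position and pairwise contact energies (random toy values).
E_single = rng.normal(size=(n_pos, k))
E_pair = rng.normal(size=(n_pos - 1, k, k))     # interactions between neighbours i, i+1

phi = np.exp(-E_single / kT)                    # unary Boltzmann factors
psi = np.exp(-E_pair / kT)                      # pairwise Boltzmann factors

# Sum-product message passing along the chain (exact here; "loopy" BP would
# iterate the same updates on a graph with cycles, e.g. a 3-D contact map).
fwd = [np.ones(k) for _ in range(n_pos)]        # messages i-1 -> i
bwd = [np.ones(k) for _ in range(n_pos)]        # messages i+1 -> i
for i in range(1, n_pos):
    m = (fwd[i - 1] * phi[i - 1]) @ psi[i - 1]
    fwd[i] = m / m.sum()
for i in range(n_pos - 2, -1, -1):
    m = psi[i] @ (bwd[i + 1] * phi[i + 1])
    bwd[i] = m / m.sum()

# Positional amino-acid marginals: the probabilities used to build the library.
marginals = phi * np.vstack(fwd) * np.vstack(bwd)
marginals /= marginals.sum(axis=1, keepdims=True)

for i, p in enumerate(marginals):
    print(f"position {i}: " + ", ".join(f"P({a})={q:.2f}" for a, q in zip(AA, p)))
```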

4.
The molar and molecular views of behavior are not different theories or levels of analysis; they are different paradigms. The molecular paradigm views behavior as composed of discrete units (responses) occurring at moments in time and strung together in chains to make up complex performances. The discrete pieces are held together as a result of association by contiguity. The molecular view has a long history both in early thought about reflexes and in associationism, and, although it was helpful to getting a science of behavior started, it has outlived its usefulness. The molar view stems from a conviction that behavior is continuous, as argued by John Dewey, Gestalt psychologists, Karl Lashley, and others. The molar paradigm views behavior as inherently extended in time and composed of activities that have integrated parts. In the molar paradigm, activities vary in their scale of organization--i.e., as to whether they are local or extended--and behavior may be controlled sometimes by short-term relations and sometimes by long-term relations. Applied to choice, the molar paradigm rests on two simple principles: (a) all behavior constitutes choice; and (b) all activities take time. Equivalence between choice and behavior occurs because every situation contains more than one alternative activity. The principle that behavior takes time refers not simply to any notion of response duration, but to the necessity that identifying one action or another requires a sample extended in time. The molecular paradigm's momentary responses are inferred from extended samples in retrospect. In this sense, momentary responses constitute abstractions, whereas extended activities constitute concrete particulars. Explanations conceived within the molecular paradigm invariably involve hypothetical constructs, because they require causes to be contiguous with responses. Explanations conceived within the molar paradigm retain direct contact with observable variables.  相似文献   

5.
Paik MC  Sacco R  Lin IF 《Biometrics》2000,56(4):1145-1156
One of the objectives in the Northern Manhattan Stroke Study is to investigate the impact of stroke subtype on the functional status 2 years after the first ischemic stroke. A challenge in this analysis is that the functional status at 2 years after stroke is not completely observed. In this paper, we propose a method to handle nonignorably missing binary functional status when the baseline value and the covariates are completely observed. The proposed method consists of fitting four separate binary regression models: for the baseline outcome, the outcome 2 years after the stroke, the product of the previous two, and finally, the missingness indicator. We then conduct a sensitivity analysis by varying the assumptions about the third and the fourth binary regression models. Our method belongs to an imputation paradigm and can be an alternative to the weighting method of Rotnitzky and Robins (1997, Statistics in Medicine 16, 81-102). A jackknife variance estimate is proposed for the variance of the resulting estimate. The proposed analysis can be implemented using statistical software such as SAS.  相似文献   
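As a rough companion to the variance estimate mentioned above, the following is a generic leave-one-out jackknife in Python, not the authors' implementation. In the paper's setting the plugged-in statistic would be the imputation-based estimate built from the four binary regressions; the toy example uses a sample mean only so the jackknife standard error can be checked against the analytic one.

```python
import numpy as np

def jackknife_variance(data, statistic):
    """Leave-one-out jackknife variance of an arbitrary statistic.

    `data` is an (n, ...) array of subject-level records and `statistic`
    maps such an array to a scalar estimate (here it would be the
    imputation-based estimate built from the four binary regressions).
    """
    n = len(data)
    theta_loo = np.array([statistic(np.delete(data, i, axis=0)) for i in range(n)])
    theta_bar = theta_loo.mean()
    return (n - 1) / n * np.sum((theta_loo - theta_bar) ** 2)

# Toy check: variance of a sample mean, which should be close to s^2 / n.
rng = np.random.default_rng(1)
x = rng.binomial(1, 0.4, size=200).astype(float)
print("jackknife SE:", np.sqrt(jackknife_variance(x, np.mean)))
print("analytic  SE:", np.sqrt(x.var(ddof=1) / len(x)))
```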

6.
The emergent properties of biological systems, organized around complex networks of irregularly connected elements, limit the applications of the direct scientific method to their study. The current lack of knowledge opens new perspectives to the inverse scientific paradigm where observations are accumulated and analysed by advanced data-mining techniques to enable a better understanding and the formulation of testable hypotheses about the structure and functioning of these systems. The current technology allows for the wide application of omics analytical methods in the determination of time-resolved molecular profiles of biological samples. Here it is proposed that the theory of dynamical systems could be the natural framework for the proper analysis and interpretation of such experiments. A new method is described, based on the techniques of non-linear time series analysis, which is providing a global view on the dynamics of biological systems probed with time-resolved omics experiments.  相似文献   
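The abstract names non-linear time series analysis without specifying an algorithm. One standard building block is time-delay (Takens) embedding, sketched below on a synthetic oscillatory profile; the delay, embedding dimension, and signal are illustrative assumptions, not the authors' method.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding: map a scalar series into `dim`-dimensional
    state vectors [x(t), x(t + tau), ..., x(t + (dim - 1) * tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Synthetic time-resolved "abundance" profile: a noisy oscillation.
t = np.linspace(0, 20, 400)
x = np.sin(t) + 0.05 * np.random.default_rng(2).normal(size=t.size)

X = delay_embed(x, dim=3, tau=10)        # reconstructed state-space trajectory
# Pairwise distances in the embedding are the raw material for quantities
# such as correlation dimension or Lyapunov-exponent estimates.
d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
print("embedded points:", X.shape, " median pairwise distance:", np.median(d).round(3))
```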

7.
A new method will be presented which allows the perception of body odors in humans to be studied objectively. The analysis of body odor‐evoked potentials was used to investigate if and how the human brain is able to differentiate self from non‐self body odor for the first time. Six subjects (three females) participated in two experimental sessions. In each session, two body odors (axillary hair) were presented within an olfactory oddball paradigm. One of the odors was collected from the subject and the other from an odor donor of the same sex. In the first session the subjects' attention was distracted to a secondary task (passive paradigm), in the second session the subjects were asked to actively differentiate the odors (active paradigm). For the EEG recordings the odors were presented within a constantly flowing airstream. The results show that the subjects could hardly differentiate the body odors subjectively. However, it could be demonstrated that the central nervous processing of one's own odor was faster than the processing of the chemosensory non‐self signal. Moreover, in the active paradigm, the potentials appeared to be larger when the subjects perceived their own body odor. The conclusion is reached that the measurement of chemosensory event‐related potentials (CSERP) is the method of choice for the investigation of HLA‐associated body odors. This revised version was published online in July 2006 with corrections to the Cover Date.  相似文献   

8.
Brains from young (20 day old) and adult rats were used to compare myelin yields obtained by sedimentation and flotation techniques. The flotation method consistently gave approx 70% higher yields of myelin than the sedimentation method. Both myelin preparations have virtually identical protein composition as assessed by sodium dodecyl sulfate-polyacrylamide gel electrophoresis. Electrophoretic analysis revealed substantial concentrations of myelin proteins in the non-myelin particulate fraction obtained by the sedimentation but not by the flotation method. The study indicates that the paradigm of the sedimentation method results in a significant loss of myelin during isolation, and that this loss can be avoided or minimized by employing the flotation method.  相似文献   

9.
This paper describes an initial but fundamental attempt to lay some groundwork for a fuzzy-set-based paradigm for sensory analysis and to demonstrate how fuzzy set and neural network techniques may lead to a natural way for sensory data interpretation. Sensory scales are described as fuzzy sets, sensory attributes as fuzzy variables, and sensory responses as sample membership grades. Multi-judge responses are formulated as a fuzzy membership vector or fuzzy histogram of response, which gives an overall panel response free of the unverifiable assumptions implied in conventional approaches. Neural networks are used to provide an effective tool for modeling and analysis of sensory responses in their naturally fuzzy and complex forms. A maximum method of defuzzification is proposed to give a crisp grade of the majority opinion. Two applications in meat quality evaluation are used to demonstrate the use of the paradigm and procedure. It is hoped that this work will bring up some new ideas and generate interest in research on application of fuzzy sets and neural networks in sensory analysis.  相似文献   
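A minimal sketch of the fuzzy-histogram idea follows, assuming a hypothetical five-point scale and a toy panel of ten judges; it shows the membership vector of panel responses and the maximum method of defuzzification, but omits the neural-network modelling stage described in the abstract.

```python
import numpy as np

# Hypothetical 5-point sensory scale treated as fuzzy categories (labels only).
SCALE = ["very poor", "poor", "fair", "good", "very good"]

def fuzzy_histogram(judge_scores, n_levels=5):
    """Aggregate multi-judge ordinal scores into a fuzzy membership vector:
    the fraction of the panel endorsing each scale category."""
    counts = np.bincount(np.asarray(judge_scores), minlength=n_levels).astype(float)
    return counts / counts.sum()

def defuzzify_max(membership):
    """Maximum method of defuzzification: return the majority-opinion grade."""
    return SCALE[int(np.argmax(membership))]

# Toy panel of 10 judges rating one meat sample (0 = very poor ... 4 = very good).
panel = [3, 3, 4, 2, 3, 4, 3, 2, 3, 4]
mu = fuzzy_histogram(panel)
print("fuzzy histogram:", dict(zip(SCALE, mu.round(2))))
print("crisp majority grade:", defuzzify_max(mu))
```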

10.
Clustering analysis has a growing role in the study of co-expressed genes for gene discovery. Conventional binary and fuzzy clustering do not embrace the biological reality that some genes may be irrelevant for a problem and not be assigned to a cluster, while other genes may participate in several biological functions and should simultaneously belong to multiple clusters. Also, these algorithms cannot generate tight clusters that focus on their cores or wide clusters that overlap and contain all possibly relevant genes. In this paper, a new clustering paradigm is proposed. In this paradigm, all three eventualities of a gene being exclusively assigned to a single cluster, being assigned to multiple clusters, and being not assigned to any cluster are possible. These possibilities are realised through the primary novelty of the introduction of tunable binarization techniques. Results from multiple clustering experiments are aggregated to generate one fuzzy consensus partition matrix (CoPaM), which is then binarized to obtain the final binary partitions. This is referred to as Binarization of Consensus Partition Matrices (Bi-CoPaM). The method has been tested with a set of synthetic datasets and a set of five real yeast cell-cycle datasets. The results demonstrate its validity in generating relevant tight, wide, and complementary clusters that can meet requirements of different gene discovery studies.  相似文献   
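The abstract describes the Bi-CoPaM pipeline at a high level. The sketch below is a simplified Python illustration assuming scikit-learn and SciPy: repeated k-means runs on toy data, Hungarian alignment of cluster labels, averaging into a fuzzy consensus partition matrix, and a plain threshold standing in for the paper's tunable binarization techniques. It is enough to show how a low threshold yields wide overlapping clusters while a high threshold yields tight cores with some genes left unassigned.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 8))          # toy "expression profiles": 60 genes x 8 time points
K, R = 3, 10                          # number of clusters, number of clustering runs

def partition_matrix(labels, k):
    """Binary genes-by-clusters partition matrix for one clustering run."""
    U = np.zeros((len(labels), k))
    U[np.arange(len(labels)), labels] = 1.0
    return U

# Run the clustering several times and align cluster labels to the first run
# (Hungarian matching on cluster overlap), then average into a fuzzy CoPaM.
runs = [partition_matrix(KMeans(K, n_init=5, random_state=r).fit_predict(X), K)
        for r in range(R)]
ref = runs[0]
aligned = [ref]
for U in runs[1:]:
    cost = -ref.T @ U                             # negative overlap with reference clusters
    _, perm = linear_sum_assignment(cost)
    aligned.append(U[:, perm])
copam = np.mean(aligned, axis=0)                  # fuzzy consensus partition matrix

# Simple binarization: a low threshold gives wide, overlapping clusters,
# a high threshold gives tight cores and leaves some genes unassigned.
for thr in (0.3, 0.9):
    B = (copam >= thr).astype(int)
    print(f"threshold {thr}: cluster memberships per gene range "
          f"{B.sum(axis=1).min()}..{B.sum(axis=1).max()}")
```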

11.
The terms “paradigm” and “paradigm shift” originated in “The Structure of Scientific Revolutions” by Thomas Kuhn. A paradigm can be defined as the generally accepted concepts and practices of a field, and a paradigm shift its replacement in a scientific revolution. A paradigm shift results from a crisis caused by anomalies in a paradigm that reduce its usefulness to a field. Claims of paradigm shifts and revolutions are made frequently in the neurosciences. In this article I will consider neuroscience paradigms, and the claim that new tools and techniques rather than crises have driven paradigm shifts. I will argue that tool development has played a minor role in neuroscience revolutions.  相似文献   

12.
Experiments are the foundation of empirical science, and experimental paradigms that are broadly applicable across settings and species are particularly useful for comparative research. Originally developed to address questions related to perception and cognition of pre‐verbal human infants, the looking time experimental paradigm has been increasingly used to study animal behavior and cognition, particularly in non‐human primates. Looking time experiments are based on the assumption that animals direct eye gaze toward objects or scenes based on their degree of interest, and use looking behavior to infer perceptual or cognitive characteristics of subjects. This paradigm can be used in a variety of contexts and is not based on species‐typical behaviors, allowing for intra‐ and interspecific comparisons. Here, we describe the history of use of looking time measures, provide an overview of the problems and controversies related to this method, and offer recommendations on how to implement looking time tasks, focusing on the preparation of stimuli, experimental procedures, and data analysis. Our overview focuses on non‐human primates, where most work has been carried out, but the issues discussed should be applicable to a wide range of species. We conclude that despite pertinent criticism, looking time tasks are practical when executed and interpreted properly. The further implementation of these methods in studies of animal behavior and cognition is likely to be fruitful.  相似文献   

13.
The environmental changes induced by projects occurring in an area are evaluated by cumulative impact assessments (CIA), which consider the consequences of multiple projects, each insignificant on its own, yet important when evaluated collectively. When future human activities are of interest, the proposed activities are included in CIA using an analytical platform that can supply precise predictions but with asymptotically null accuracy. To compensate for the lack of accuracy we propose a shift in the paradigm governing CIA. The paradigm shift advocates a change in the focus of CIA investigations from the detailed analysis of one unlikely future to the identification of patterns describing the future changes in the environment. To illustrate the paradigm shift, a set of 144 possible and equally likely futures were developed and used to identify the potential impacts of forest harvesting and petroleum drilling on the habitat of moose and marten. The univariate and multivariate analysis of two measures of habitat, namely average habitat suitability index (HSI) and surface of the stands with HSI >0.5, revealed at least three distinct periods in the next 100 years. Multivariate analysis also showed that habitat quality is of immediate importance and marten is sensitive during the first third of the century. Our findings indicate that the spatial and temporal arrangement of the human disturbances could be more important than the magnitude of the disturbance. The attributes associated with significant environmental changes are harvesting age and the investigated environmental elements of an ecosystem, in this case the species habitat.  相似文献   

14.
Mathematical models have played an important role in the analysis of circadian systems. The models include simulation of differential equation systems to assess the dynamic properties of a circadian system and the use of statistical models, primarily harmonic regression methods, to assess the static properties of the system. The dynamical behaviors characterized by the simulation studies are the response of the circadian pacemaker to light, its rate of decay to its limit cycle, and its response to the rest-activity cycle. The static properties are phase, amplitude, and period of the intrinsic oscillator. Formal statistical methods are not routinely employed in simulation studies, and therefore the uncertainty in inferences based on the differential equation models and their sensitivity to model specification and parameter estimation error cannot be evaluated. The harmonic regression models allow formal statistical analysis of static but not dynamical features of the circadian pacemaker. The authors present a paradigm for analyzing circadian data based on the Box iterative scheme for statistical model building. The paradigm unifies the differential equation-based simulations (direct problem) and the model fitting approach using harmonic regression techniques (inverse problem) under a single schema. The framework is illustrated with the analysis of a core-temperature data series collected under a forced desynchrony protocol. The Box iterative paradigm provides a framework for systematically constructing and analyzing models of circadian data.  相似文献   
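For the static (inverse-problem) side, the harmonic regression step can be illustrated with a single-harmonic cosinor fit. The sketch below uses a synthetic core-temperature-like series with an assumed 24.2 h period; mesor, amplitude, and acrophase are recovered by ordinary least squares. It is a generic illustration, not the authors' protocol or data.

```python
import numpy as np

# Synthetic core-temperature-like series: 24.2 h period, acrophase at 17 h, plus noise.
rng = np.random.default_rng(4)
t = np.arange(0, 72, 0.5)                       # hours: 3 days sampled every 30 min
period = 24.2
y = 37.0 + 0.4 * np.cos(2 * np.pi * (t - 17.0) / period) + 0.05 * rng.normal(size=t.size)

# Harmonic (cosinor) regression at an assumed period: y ~ mesor + A*cos + B*sin.
w = 2 * np.pi * t / period
Xd = np.column_stack([np.ones_like(t), np.cos(w), np.sin(w)])
mesor, A, B = np.linalg.lstsq(Xd, y, rcond=None)[0]

amplitude = np.hypot(A, B)
acrophase_h = (np.arctan2(B, A) * period / (2 * np.pi)) % period   # time of the fitted peak
print(f"mesor={mesor:.2f} degC, amplitude={amplitude:.2f} degC, acrophase={acrophase_h:.1f} h")
```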

15.
There has been growing interest in the likelihood paradigm of statistics, where statistical evidence is represented by the likelihood function and its strength is measured by likelihood ratios. The available literature in this area has so far focused on parametric likelihood functions, though in some cases a parametric likelihood can be robustified. This focused discussion on parametric models, while insightful and productive, may have left the impression that the likelihood paradigm is best suited to parametric situations. This article discusses the use of empirical likelihood functions, a well‐developed methodology in the frequentist paradigm, to interpret statistical evidence in nonparametric and semiparametric situations. A comparative review of literature shows that, while an empirical likelihood is not a true probability density, it has the essential properties, namely consistency and local asymptotic normality that unify and justify the various parametric likelihood methods for evidential analysis. Real examples are presented to illustrate and compare the empirical likelihood method and the parametric likelihood methods. These methods are also compared in terms of asymptotic efficiency by combining relevant results from different areas. It is seen that a parametric likelihood based on a correctly specified model is generally more efficient than an empirical likelihood for the same parameter. However, when the working model fails, a parametric likelihood either breaks down or, if a robust version exists, becomes less efficient than the corresponding empirical likelihood.  相似文献   
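As a concrete illustration of nonparametric evidential analysis, the sketch below computes the profile empirical likelihood ratio for a mean via its Lagrange-multiplier form and calibrates it with the usual chi-square(1) limit. The data and the tested values of the mean are invented; the paper's examples and efficiency comparisons are not reproduced here.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_logratio(x, mu0):
    """-2 log empirical likelihood ratio for the mean, via the Lagrange multiplier.
    Valid when mu0 lies strictly inside the range of the data."""
    d = np.asarray(x, float) - mu0
    g = lambda lam: np.sum(d / (1.0 + lam * d))          # estimating equation in lambda
    lo = -1.0 / d.max() + 1e-10                          # keep every weight positive
    hi = -1.0 / d.min() - 1e-10
    lam = brentq(g, lo, hi)
    return 2.0 * np.sum(np.log1p(lam * d))               # -2 * log(EL ratio)

rng = np.random.default_rng(5)
x = rng.exponential(scale=2.0, size=80)                  # skewed data, no parametric model assumed

# Wilks-type calibration: -2 log R(mu) is asymptotically chi-square with 1 df.
for mu0 in (1.6, 2.0, 2.6):
    stat = el_logratio(x, mu0)
    print(f"mu0={mu0}: -2logR={stat:.2f}, p={chi2.sf(stat, df=1):.3f}")
```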

16.
Experimental research in adult attachment theory faces the challenge of adequately activating the adult attachment system. In view of the multitude of methods employed for this purpose so far, this paper suggests making further use of the methodological advantages of semantic priming. To enable the use of such a paradigm in a German-speaking context, a set of German words belonging to the semantic categories ‘interpersonal closeness’, ‘interpersonal distance’, and ‘neutral’ was identified, and their semantics were validated by combining production and rating methods. A total of 164 university students answered the corresponding online questionnaires. Ratings were analysed using analysis of variance (ANOVA) and cluster analysis, from which three clearly distinct groups emerged. Beyond providing validated stimulus and target words that can be used to activate the adult attachment system in a semantic priming paradigm, the results point to important links between attachment and stress that call for further investigation.

17.
18.
陈曦, 梁松斌. 《生态学报》 (Acta Ecologica Sinica), 2023, 43(20): 8268-8278
As the research paradigm of ecosystem services shifts, re-examining the ecosystem services cascade framework as a research tool carries important theoretical and practical significance. Introducing the perspective of co-production theory and using induction and deduction, logical inference, and comparative analysis, this paper proposes an approach to reconstructing the cascade framework. The study argues that: (1) attention to human agency is a defining feature of the paradigm shift in ecosystem services research, yet the cascade framework lacks the key cascade relationship through which human activities act back on ecosystem services; (2) public participation is an indispensable element of the framework, and although academic understanding of it keeps deepening, it lacks systematic theoretical guidance and needs the support of a theory of deep public participation; (3) co-production theory goes beyond traditional public participation and supplies three interrelated elements for reconstructing the framework, namely stakeholders, production arenas, and production cycles; (4) the reconstructed cascade framework embodies a two-way coupling between flows of ecosystem services and flows of human activity, providing a complete, continuous, and systematic analytical framework for further research.

19.
In complex software systems, modularity and readability tend to be degraded owing to inseparable interactions between concerns that are distinct features in a program. Such interactions result in tangled code that is hard to develop and maintain. Aspect-Oriented Programming (AOP) is a powerful method for modularizing source code and for decoupling cross-cutting concerns. A decade of growing research on AOP has brought the paradigm into many exciting areas. However, pioneering work on AOP has not flourished enough to enrich the design of distributed systems using the refined AOP paradigm. This article investigates three case studies that cover time-honored issues such as fault-tolerant computing, network heterogeneity, and object replication in the cluster computing community using the AOP paradigm. The aspects that we define here are simple, intuitive, and reusable. Our intensive experiences show that (i) AOP can improve the modularity of cluster computing software by separating the source code into base and instrumented parts, and (ii) AOP helps developers to deploy additional features to legacy cluster computing software without harming code modularity and system performance.  相似文献   

20.
We investigated whether presenting of dilutions of phenyl ethyl alcohol at random succession according to the method of constant stimuli can replace the standard procedure of presenting a various number of dilutions in a staircase paradigm. Forty-six men and 44 women, aged 19-76 years, participated in this study. Phenyl ethyl alcohol was diluted in a ratio of 1:2, starting from 4%. Presentation of the odorant followed a three-alternative, temporal forced-choice paradigm with two blanks in addition to the odorant. Twenty dilutions were administered in a randomized order. Odor threshold was obtained by logistic regression of the correct and incorrect identifications of the probe containing the odorant. Thresholds were also calculated on the basis of the first 16 dilution steps only. Results from these procedures were compared with 'gold-standard' threshold assessment employing a three-alternative, temporal forced-choice staircase paradigm with seven reversals using 16 dilutions of phenyl ethyl alcohol. The method of constant stimuli took a shorter and less variable testing time than the staircase technique. The use of 20 dilution steps provided no better results than the use of 16 steps. The method of constant stimuli exhibited a good test-retest reliability (r = 0.7; P < 0.001) comparable to that of the staircase method and provided unbiased results highly correlated (r = 0.8; P < 0.001) with those of the staircase technique with similar inter-test variability. Applying 16 dilutions (1:2 steps) of phenyl ethyl alcohol at random succession in a three-alternative, temporal forced-choice paradigm is thus a simple and reliable procedure for the reproducible assessment of odor thresholds that may be contemplated as an alternative to the 'gold-standard' staircase method of clinical odor threshold assessment.  相似文献   
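A simulated version of the constant-stimuli analysis is sketched below: correct and incorrect identifications across dilution steps are generated under a three-alternative forced-choice model and fed to a logistic regression, from which a threshold step is read off. The psychometric function, number of presentations, and threshold criterion are assumptions for illustration, not the study's exact procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)

# Simulated constant-stimuli session: dilution steps 1 (strongest) .. 16 (weakest),
# each presented twice in random order in a 3-alternative forced-choice trial.
steps = rng.permutation(np.repeat(np.arange(1, 17), 2))
true_threshold = 9.0
p_detect = 1.0 / (1.0 + np.exp(steps - true_threshold))      # detection falls off with dilution
# A trial is correct if the odorant is detected or if the 1-in-3 guess succeeds.
correct = (rng.random(steps.size) < p_detect) | (rng.random(steps.size) < 1 / 3)

# Logistic regression of correct/incorrect identification on dilution step.
model = LogisticRegression().fit(steps.reshape(-1, 1), correct.astype(int))
b0, b1 = model.intercept_[0], model.coef_[0, 0]

# One simple read-out: the dilution step at which the fitted probability is 0.5
# (a clinical criterion could instead be set midway between chance and 1).
threshold_step = -b0 / b1
print(f"estimated odor threshold at dilution step {threshold_step:.1f}")
```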
