Similar Literature
20 similar articles found (search time: 0 ms)
1.
In this article the question of reconstructing a phylogeny from additive distance data is addressed. Previous algorithms used the complete distance matrix of the n OTUs (Operational Taxonomic Units) that correspond to the tips of the tree, requiring O(n²) computing time. It is shown that this is wasteful for biologically reasonable trees: if the internal nodes of the tree have bounded degrees, an O(n log n) algorithm is possible. It is also shown that if the nodes can have unbounded degrees, the problem has n² as a lower bound.
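As a concrete illustration of what "additive" means here, the sketch below checks the four-point condition that characterizes tree-like distance matrices. It is a brute-force O(n⁴) correctness test, not the O(n log n) reconstruction algorithm of the article; the example tree and tolerance are assumptions.

```python
# Minimal sketch: verify the four-point condition for an additive matrix.
# This is a validity check, NOT the article's reconstruction algorithm.
from itertools import combinations

def is_additive(D, tol=1e-9):
    """True if D satisfies the four-point condition: for every quartet
    (i, j, k, l), the two largest of the three pairwise sums are equal."""
    n = len(D)
    for i, j, k, l in combinations(range(n), 4):
        s = sorted([D[i][j] + D[k][l], D[i][k] + D[j][l], D[i][l] + D[j][k]])
        if abs(s[2] - s[1]) > tol:
            return False
    return True

# Distances from the tree ((A:1,B:2),(C:3,D:1)) with internal edge length 2.
D = [[0, 3, 6, 4],
     [3, 0, 7, 5],
     [6, 7, 0, 4],
     [4, 5, 4, 0]]
print(is_additive(D))  # True
```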

2.
The “expensive tissue hypothesis” states that large brains are active at high metabolic rates, which have to be financed by a significant trade-off with other organs such as the alimentary tract. Recent morphological findings on primate brains and guts support this idea, also considering the importance of high-energy diets as a possible driving force of this process. However, the trade-off correlation between brain and alimentary tract, the essence of the “expensive tissue hypothesis”, has not yet been tested using molecular data to complement morpho-functional findings. We therefore hypothesize that the activity of marker proteins expressed in both brain and alimentary tract should parallel the functional morphology of the organs at the molecular level. Thus, in animals feeding on hard-to-digest diets, we would expect a high concentration per unit mass of such a marker protein in the digestive tract and a lower concentration in the brain; in animals feeding on easily digested, high-energy food, we would expect the reverse pattern. Recent preliminary studies suggest that carbonic anhydrase II (CA-II) could act as such a marker. The enzyme concentration was found to increase in the brain with higher cerebral activity from cattle to humans, and conversely to decrease in salivary secretions. This inverse distribution of CA-II between saliva and brain in cattle and primates might be the first molecular evidence of the validity of the “expensive tissue hypothesis”.

3.
We consider the problem of what is being optimized in human actions with respect to various aspects of human movements and different motor tasks. From the mathematical point of view this problem consists of finding an unknown objective function given the values at which it reaches its minimum. This problem is called the inverse optimization problem. Until now the main approach to this problem has been the cut-and-try method, which consists of introducing an objective function and checking how well it reflects the experimental data. Using this approach, different objective functions have been proposed for the same motor action. In the current paper we focus on inverse optimization problems with additive objective functions and linear constraints. Such problems are typical in human movement science; the problem of muscle (or finger) force sharing is an example. For such problems we obtain sufficient conditions for uniqueness and propose a method for determining the objective functions. To illustrate our method we analyze the problem of force sharing among the fingers in a grasping task. We estimate the objective function from the experimental data and show that it can predict the force-sharing pattern for a vast range of external forces and torques applied to the grasped object. The resulting objective function is quadratic with essentially non-zero linear terms.
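To make the forward problem concrete, here is a minimal sketch of force sharing under an additive quadratic objective with non-zero linear terms and a single linear constraint, solved in closed form with a Lagrange multiplier. The coefficients and total force below are invented for illustration; the article's estimated objective function is not reproduced here.

```python
# Minimal forward-problem sketch: minimize sum_i (a_i*f_i^2 + b_i*f_i)
# subject to sum_i f_i = F. Optimality with multiplier lam gives
# 2*a_i*f_i + b_i = lam, so f_i = (lam - b_i) / (2*a_i), and lam is
# fixed by the constraint.

def share_force(a, b, F):
    inv = [1.0 / (2.0 * ai) for ai in a]
    lam = (F + sum(bi / (2.0 * ai) for ai, bi in zip(a, b))) / sum(inv)
    return [(lam - bi) / (2.0 * ai) for ai, bi in zip(a, b)]

# Hypothetical coefficients for four fingers; total normal force 20 N.
a = [1.0, 1.2, 1.5, 2.0]   # quadratic weights (assumed)
b = [0.5, 0.2, 0.1, 0.4]   # essentially non-zero linear terms (assumed)
forces = share_force(a, b, 20.0)
print([round(f, 3) for f in forces], round(sum(forces), 3))  # sums to 20.0
```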

4.
5.
Evolutionary optimization has been proposed as a method to generate machine learning through automated discovery. A simulation of natural evolution is conducted using the traveling salesman problem as an artificial environment. For an exact solution of a traveling salesman problem, the only known algorithms require the number of steps to grow at least exponentially with the number of elements in the problem. Three adaptive techniques are described and analyzed. Evolutionary adaptation is demonstrated to be worthwhile in a variety of contexts. Local stagnation is prevented by allowing for the probabilistic survival of the simulated organisms. In complex problems, the final routing is estimated to be better than 99.99999999999% of all possible tours, even though only a small fraction (8.58 × 10⁻¹⁵¹) of the total number of tours are examined.
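A minimal sketch in the spirit of this abstract, not the paper's exact scheme: tours evolve by segment-reversal mutation, and survival is probabilistic by rank, so even inferior tours occasionally persist and stagnation in a local optimum is discouraged. All parameters (population size, survival decay, iteration count) are assumptions.

```python
# Minimal evolutionary TSP sketch with probabilistic (rank-based) survival.
import math, random

def tour_len(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def mutate(tour):
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # reverse a segment

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(30)]
pop = [random.sample(range(30), 30) for _ in range(40)]
for _ in range(2000):
    offspring = [mutate(t) for t in pop]
    ranked = sorted(pop + offspring, key=lambda t: tour_len(t, pts))
    # Probabilistic survival: better tours are more likely, not certain, to live.
    pop = [t for r, t in enumerate(ranked) if random.random() < 0.9 ** r][:40]
    while len(pop) < 40:                       # refill to constant size
        pop.append(mutate(random.choice(ranked[:40])))
best = min(pop, key=lambda t: tour_len(t, pts))
print(round(tour_len(best, pts), 3))
```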

6.
Analytic expressions for the velocity profile and distribution of neutrally buoyant particles in laminar flow were obtained as functions of the radial distance. A modified Einstein viscosity model and the hypothesis that the total force on all the particles flowing in the tube is a minimum were used. The methods of the variational calculus were used in the mathematical development. A velocity profile differing only slightly from the parabolic form of that for Hagen-Poiseuille flow was obtained. For particle distribution, the equations developed predict a maximum concentration along the center line for some flows and a maximum concentration in a ring some distance from the center line in other flows. Both of these concentration profiles have been observed experimentally. Quantitative predictions from the equations derived must await further experimental work to permit calculation of the parameters included in the equations.

7.
Freese’s Hypothesis states that a single specific alteration in the sequence of nucleotides of an information-bearing DNA molecule results in a specific mutational effect. Within the framework of the DNA-protein coding problem developed elsewhere, and assuming the quasi-ergodicity of the general coding process, it is shown that Freese’s Hypothesis allows us to derive expressions for the length of the smallest mutable DNA molecule and to obtain a bound for the maximal number of allelic molecules of fixed length. To illustrate these ideas, calculations are carried out on appropriate data from bacteriophage and man, and the results are shown to differ by a factor of 10 (modulo the rather crude approximations used). It is further shown that, if ρ(N) and ϱ(N) are respectively the number of information-bearing words of length N in a given code and the number of words of length N, then the limit \(\lim_{N\to\infty} \rho(N)/\varrho(N)\) depends sensitively on the parameter ε which specifies the given code. The implications of this result for the spontaneous aggregation of a sufficient number of information-bearing words to characterize an organism are discussed. This research was supported by the United States Air Force through the Air Force Office of Scientific Research of the Air Research and Development Command, under Contract No. AF 49(638)-917.

8.
9.
It is widely acknowledged that integrating fossils into data sets of extant taxa is imperative for proper placement of fossils, resolution of relationships, and a better understanding of character evolution. The importance of this process has been further magnified because of the crucial role of fossils in dating divergence times. Outstanding issues remain, including appropriate methods to place fossils in phylogenetic trees, the importance of molecules versus morphology in these analyses, as well as the impact of potentially large amounts of missing data for fossil taxa. In this study we used the angiosperm clade Juglandaceae as a model for investigating methods of integrating fossils into a phylogenetic framework of extant taxa. The clade has a rich fossil record relative to low extant diversity, as well as a robust molecular phylogeny and morphological database for extant taxa. After combining fossil organ genera into composite and terminal taxa, our objectives were to (1) compare multiple methods for the integration of the fossils and extant taxa (including total evidence, molecular scaffolds, and molecular matrix representation with parsimony [MRP]); (2) explore the impact of missing data (incomplete taxa and characters) and the evidence for placing fossils on the topology; (3) simulate the phylogenetic effect of missing data by creating "artificial fossils"; and (4) place fossils and compare the impact of single and multiple fossil constraints in estimating the age of clades. Despite large and variable amounts of missing data, each of the methods provided reasonable placement of both fossils and simulated "artificial fossils" in the phylogeny previously inferred only from extant taxa. Our results clearly show that the amount of missing data in any given taxon is not by itself an operational guideline for excluding fossils from analysis. Three fossil taxa (Cruciptera simsonii, Paleoplatycarya wingii, and Platycarya americana) were placed within crown clades containing living taxa for which relationships previously had been suggested based on morphology, whereas Polyptera manningii, a mosaic taxon with equivocal affinities, was placed firmly as sister to two modern crown clades. The position of Paleooreomunnea stoneana was ambiguous with total evidence but conclusive with DNA scaffolds and MRP. There was less disturbance of relationships among extant taxa using a total evidence approach, and the DNA scaffold approach did not provide improved resolution or internal support for clades compared to total evidence, whereas weighted MRP retained comparable levels of support but lost crown clade resolution. Multiple internal minimum age constraints generally provided reasonable age estimates, but the use of single constraints provided by extinct genera tended to underestimate clade ages.

10.
The use of DNA microarrays opens up the possibility of measuring the expression levels of thousands of genes simultaneously under different conditions. Time-course experiments allow researchers to study the dynamics of gene interactions. The inference of genetic networks from such measurements can give important insights for the understanding of a variety of biological problems. Most of the existing methods for genetic network reconstruction require many experimental data points, or can only be applied to the reconstruction of small subnetworks. Here we present a method that reduces the dimensionality of the dataset and then extracts the significant dynamic correlations among genes. The method requires only a number of data points achievable in common time-course experiments.
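A minimal sketch of the two-step idea, dimensionality reduction followed by extraction of strong pairwise correlations, using random stand-in data; the component count and correlation threshold are assumptions, and the paper's actual procedure may differ in both steps.

```python
# Minimal sketch: SVD-based dimensionality reduction, then correlation
# screening among the reduced gene profiles. Data are random stand-ins.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 12))           # 200 genes x 12 time points (stand-in)

# Reduce dimensionality: project the time courses onto the top principal axes.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                     # number of components (assumed)
Z = Xc @ Vt[:k].T                         # gene profiles in the reduced space

# Extract strong pairwise correlations among the reduced profiles.
C = np.corrcoef(Z)
i, j = np.where(np.triu(np.abs(C) > 0.8, k=1))  # threshold (assumed)
print(list(zip(i.tolist(), j.tolist()))[:5])     # first few candidate links
```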

11.
A theory and computational scheme for the three-dimensional structure and dynamic conformational properties of naturally occurring peptides are proposed, based on a known amino acid sequence. The diverse biological activity of a low-molecular-weight peptide is shown to arise from a restricted number of preferred spatial structures which may occur under physiological conditions. Each particular function of an oligopeptide is connected to a definite spatial structure belonging to the set of low-energy conformations; switching from one biological activity of a peptide to another results from a shift of the conformational equilibrium caused by a change of environmental conditions. This shift is provided for by specific intramolecular interactions, alternative in their nature, which stabilize a particular structure. An approach is suggested which enables the construction of a synthetic analog with a predetermined physiologically active conformation, prior to any chemical and biological tests.

12.
This paper presents an approach to the well-known Travelling Salesman Problem (TSP) using Self-Organizing Maps (SOM). The SOM algorithm carries interesting topological information about the configuration of its neurons in Cartesian space, which can be used to solve optimization problems. Aspects of initialization, parameter adaptation, and complexity analysis of the proposed SOM-based algorithm are discussed. The results show an average deviation of 3.7% from the optimal tour length for a set of 12 TSP instances.
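A minimal SOM-ring sketch for the TSP, illustrating how the one-dimensional topology of the neuron ring is exploited: the winning neuron and its ring neighbors are pulled toward each sampled city until the ring unfolds into a tour. The initialization and decay schedules here are assumptions, not the adaptation rules analyzed in the paper.

```python
# Minimal SOM-ring sketch for TSP (assumed parameters and schedules).
import numpy as np

rng = np.random.default_rng(0)
cities = rng.random((12, 2))
m = 3 * len(cities)                          # neurons per city (assumed)
ring = rng.random((m, 2))                    # random initialization (assumed)
lr, radius = 0.8, m / 2
for it in range(4000):
    c = cities[rng.integers(len(cities))]                  # sample a city
    w = np.argmin(((ring - c) ** 2).sum(axis=1))           # winner neuron
    d = np.minimum(np.abs(np.arange(m) - w), m - np.abs(np.arange(m) - w))
    h = np.exp(-(d ** 2) / (2 * max(radius, 1e-3) ** 2))   # ring neighborhood
    ring += lr * h[:, None] * (c - ring)                   # pull toward city
    lr *= 0.9997; radius *= 0.9997                         # decay schedules
# Read off the tour: visit cities in the order of their winner neurons.
tour = np.argsort([np.argmin(((ring - c) ** 2).sum(axis=1)) for c in cities])
print(tour)
```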

13.
An evaluation of the molecular clock hypothesis using mammalian DNA sequences
A statistical analysis of extensive DNA sequence data from primates, rodents, and artiodactyls clearly indicates that no global molecular clock exists in mammals. Rates of nucleotide substitution in rodents are estimated to be four to eight times higher than those in higher primates and two to four times higher than those in artiodactyls. There is strong evidence for lower substitution rates in apes and humans than in monkeys, supporting the hominoid slowdown hypothesis. There is also evidence for lower rates in humans than in apes, suggesting a further rate slowdown in the human lineage after the separation of humans from apes. By contrast, substitution rates are nearly equal in mouse and rat. These results suggest that differences in generation time or, more precisely, in the number of germline DNA replications per year are the primary cause of rate differences in mammals. Further, these differences are more in line with the neutral mutation hypothesis than if the rates are the same for short- and long-living mammals.

14.
C. Y. Meng, A. P. Dempster. Biometrics, 1987, 43(2): 301-311
Statistical analyses of simple tumor rates from an animal experiment with one control and one treated group typically consist of hypothesis testing of many 2 × 2 tables, one for each tumor type or site. The multiplicity of significance tests may cause excessive overall false-positive rates. This paper presents a Bayesian approach to the problem of multiple significance testing. We develop a normal logistic model that accommodates the incidences of all tumor types or sites observed in the current experiment simultaneously as well as their historical control incidences. Exchangeable normal priors are assumed for certain linear terms in the model. Posterior means, standard deviations, and Bayesian P-values are computed for an average treatment effect as well as for the effects on individual tumor types or sites. Model assumptions are checked using probability plots and the sensitivity of the parameter estimates to alternative priors is studied. The method is illustrated using tumor data from a chronic animal experiment.

15.
16.
17.
A system of quality control for gynecologic cytologic screening designed to minimize the problem of false-negative cytologic reports is proposed. The system involves the harvesting of "quality control" smears, at the time of colposcopic examination, from patients referred to colposcopy because of previous abnormal smears that had been interpreted as representing dysplasia or neoplasia. These quality control smears are then submitted to the cytology laboratory, indistinguishable from bona fide cases, among the routine smears. This appears to be a more effective means of quality control than the current standard system used in the United States, which involves cytology laboratories routinely rescreening 10% of the slides reported out as "negative".

18.
MOTIVATION: Epigenetic modifications are one of the critical factors regulating gene expression and genome function. Among different epigenetic modifications, the differential histone modification sites (DHMSs) are of great interest for studying the dynamic nature of epigenetic and gene expression regulation among various cell types, stages, or environmental responses. To capture histone modifications at the whole-genome scale, ChIP-seq technology is becoming a robust and comprehensive approach. Thus the DHMSs are potentially identifiable by comparing two ChIP-seq libraries. However, little has been addressed on this issue in the literature.
RESULTS: Aiming at identifying DHMSs, we propose an approach called ChIPDiff for the genome-wide comparison of histone modification sites identified by ChIP-seq. Based on the observations of ChIP fragment counts, the proposed approach employs a hidden Markov model (HMM) to infer the states of histone modification changes at each genomic location. We evaluated the performance of ChIPDiff by comparing the H3K27me3 modification sites between mouse embryonic stem cells (ESC) and neural progenitor cells (NPC). We demonstrated that the H3K27me3 DHMSs identified by our approach are of high sensitivity, specificity, and technical reproducibility. ChIPDiff was further applied to uncover the differential H3K4me3 and H3K36me3 sites between different cell states. Interesting biological discoveries were achieved from such comparisons in our study.
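A minimal sketch of only the count-comparison step: genomic bins are flagged when the library-size-normalized log ratio of ChIP fragment counts exceeds a cutoff. ChIPDiff itself infers the change state at each location with a hidden Markov model rather than thresholding bins independently; the pseudocount, threshold, and counts below are assumptions.

```python
# Minimal sketch of comparing binned fragment counts between two ChIP-seq
# libraries. ChIPDiff smooths such evidence with an HMM; this does not.
import math

def differential_bins(counts1, counts2, n1, n2, threshold=1.0):
    """Flag bins whose library-size-normalized log2 ratio exceeds a cutoff."""
    calls = []
    for x, y in zip(counts1, counts2):
        ratio = math.log2(((x + 1) / n1) / ((y + 1) / n2))  # pseudocount of 1
        calls.append(ratio > threshold or ratio < -threshold)
    return calls

esc = [40, 5, 12, 90, 7]    # stand-in fragment counts per bin, library 1
npc = [6, 4, 15, 10, 30]    # stand-in fragment counts per bin, library 2
print(differential_bins(esc, npc, n1=1e6, n2=1.2e6))
```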

19.
An algorithm for locating the region in conformational space containing the global energy minimum of a polypeptide is described. Distances are used as the primary variables in the minimization of an objective function that incorporates both energetic and distance-geometric terms. The latter are obtained from geometry and energy functions, rather than nuclear magnetic resonance experiments, although the algorithm can incorporate distances from nuclear magnetic resonance data if desired. The polypeptide is generated originally in a space of high dimensionality. This has two important consequences. First, all interatomic distances are initially at their energetically most favorable values; i.e. the polypeptide is initially at a global minimum-energy conformation, albeit a high-dimensional one. Second, the relaxation of dimensionality constraints in the early stages of the minimization removes many potential energy barriers that exist in three dimensions, thereby allowing a means of escaping from three-dimensional local minima. These features are used in an algorithm that produces short trajectories of three-dimensional minimum-energy conformations. A conformation in the trajectory is generated by allowing the previous conformation in the trajectory to evolve in a high-dimensional space before returning to three dimensions. The resulting three-dimensional structure is taken to be the next conformation in the trajectory, and the process is iterated. This sequence of conformations results in a limited but efficient sampling of conformational space. Results for test calculations on Met-enkephalin, a pentapeptide with the amino acid sequence H-Tyr-Gly-Gly-Phe-Met-OH, are presented. A tight cluster of conformations (in three-dimensional space) is found with ECEPP energies (Empirical Conformational Energy Program for Peptides) lower than any previously reported. This cluster of conformations defines a region in conformational space in which the global-minimum-energy conformation of enkephalin appears to lie.

20.
The analysis of experimental data from the photocycle of bacteriorhodopsin (bR) as sums of exponentials has accumulated a large amount of information on its kinetics which is still controversial. One reason for ambiguous results can be found in the inherent instabilities connected with the fitting of noisy data by sums of exponentials. Nevertheless, there are strategies to optimize the experiments and the data analysis by a proper combination of well known techniques. This paper describes an applicable approach based on the correct weighting of the data, a separation of the linear and the non-linear parameters in the process of the least squares approximation, and a statistical analysis applying the correlation matrix, the determinant of Fisher's information matrix, and the variance of the parameters as a measure of the reliability of the results. In addition, the confidence regions for the linear approximation of the non-linear model are compared with confidence regions for the true non-linear model. Evaluation techniques and rules for an optimum experimental design are mainly exemplified by the analysis of numerically generated model data with increasing complexity. The estimation of the number of exponentials significant for the interpretation of a given set of data is demonstrated by using records from eight absorption and photocurrent experiments on the photocycle of bacteriorhodopsin.
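A minimal sketch of the separation of linear and nonlinear parameters (variable projection) on synthetic two-exponential data: for fixed decay rates the amplitudes are obtained by linear least squares, so the outer minimization runs only over the rates. The model size, noise level, and optimizer are assumptions, not the paper's full weighting and statistical analysis.

```python
# Minimal variable-projection sketch for fitting a sum of exponentials.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
t = np.linspace(0, 5, 200)
# Synthetic data: amplitudes 2.0 and 0.7, decay rates 1.3 and 0.2, plus noise.
y = 2.0 * np.exp(-1.3 * t) + 0.7 * np.exp(-0.2 * t) + rng.normal(0, 0.01, t.size)

def residual_norm(log_rates):
    # For fixed rates, the amplitudes enter linearly: solve them exactly.
    A = np.exp(-np.exp(log_rates)[None, :] * t[:, None])  # exponential basis
    amps, *_ = np.linalg.lstsq(A, y, rcond=None)          # linear subproblem
    return np.sum((A @ amps - y) ** 2)

# Outer search over the nonlinear parameters only (log keeps rates positive).
fit = minimize(residual_norm, x0=np.log([1.0, 0.1]), method="Nelder-Mead")
print(np.exp(fit.x))   # recovered decay rates, roughly [1.3, 0.2]
```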
