Similar Articles
20 similar articles found
1.
Brian Charlesworth. Genetics 2015, 200(3): 667–669
The Genetic Society of America’s Thomas Hunt Morgan Medal is awarded to an individual GSA member for lifetime achievement in the field of genetics. For over 40 years, 2015 recipient Brian Charlesworth has been a leader in both theoretical and empirical evolutionary genetics, making substantial contributions to our understanding of how evolution acts on genetic variation. Some of the areas in which Charlesworth’s research has been most influential are the evolution of sex chromosomes, transposable elements, deleterious mutations, sexual reproduction, and life history. He also developed the influential theory of background selection, whereby the recurrent elimination of deleterious mutations reduces variation at linked sites, providing a general explanation for the correlation between recombination rate and genetic variation.

I am grateful to the Genetics Society of America for honoring me with the Thomas Hunt Morgan Medal and for inviting me to contribute this essay. I have spent nearly 50 years doing research in population genetics. This branch of genetics uses knowledge of the rules of inheritance to predict how the genetic composition of a population will change under the forces of evolution and compares the predictions to relevant data. As our knowledge of how genomes are organized and function has increased, so has the range of problems confronted by population geneticists. We are, however, a relatively small part of the genetics community, and sometimes it seems that our field is regarded as less important than those branches of genetics concerned with the properties of cells and individual organisms.

I will take this opportunity to explain why I believe that population genetics is useful to a broad range of biologists. The fundamental importance of population genetics lies in the basic insights it provides into the mechanisms of evolution, some of which are far from intuitively obvious.
Many of these insights came from the work of the first generation of population geneticists, notably Fisher, Haldane, and Wright. Their mathematical models showed that, contrary to what was believed by the majority of biologists in the 1920s, natural selection operating on Mendelian variation can cause evolutionary change at rates sufficient to explain historical patterns of evolution. This led to the modern synthesis of evolution (Provine 1971). No one can claim to understand how evolution works without some basic understanding of classical population genetics; those who do run the risk of making mistakes such as asserting that rapid evolutionary change is most likely to occur in small founder populations (Mayr 1954).
The modern synthesis is getting on for 80 years old, so this argument will probably not convince skeptical molecular geneticists that population genetics has a lot to offer the modern biologist. I therefore provide two examples of the useful role that population genetic studies can play.

First, one of the most notable discoveries of the past 40 years was the finding that the genomes of most species contain families of transposable elements (TEs) with the capacity to make new copies that insert elsewhere in the genome (Shapiro 1983). This led to two schools of thought about why they are present in the genome. One claimed that TEs are maintained because they confer benefits on the host by producing adaptively useful mutations (Syvanen 1984); the other believed that they are parasites, maintained by their ability to replicate within the genome despite potentially deleterious fitness effects of TE insertions (Doolittle and Sapienza 1980; Orgel and Crick 1980).

The second hypothesis can be tested by comparing population genetic predictions with the results of TE surveys within populations. In the early 1980s, Chuck Langley, several collaborators, and I tried to do just this, using populations of Drosophila melanogaster (Charlesworth and Langley 1989). The models predicted that most Drosophila TEs should be found at low population frequencies at their insertion sites. This is because D. melanogaster populations have large effective sizes (Ne); Ne is essentially the number of individuals that genetically contribute to the next generation. A large Ne means that a very small selection pressure can keep deleterious elements at low frequencies. This is a consequence of one of the most important findings of classical population genetics: the fate of a variant in a population is determined by the product of Ne and the strength of selection (Fisher 1930; Kimura 1962).
If, for example, Ne is 1000, a mutation that reduces fitness relative to wild type by 0.001 will be eliminated from the population with near certainty. Using the crude tools then available (restriction mapping of cloned genomic regions and in situ hybridization of labeled TE probes to polytene chromosomes), we found that nearly all TEs are indeed present at low frequencies in the population (Charlesworth and Langley 1989). Most of the exceptions to this rule were found in genomic regions in which little crossing over occurs (Maside et al. 2005). This is consistent with Chuck’s proposal that a major contributor to the removal of TEs from the population is selection against aneuploid progeny created by crossing over among homologous TEs at different locations in the genome (Langley et al. 1988). It is now a familiar finding that nonrecombining genomes or genomic regions tend to be full of TEs and other kinds of repetitive sequences; the population genetic reasons for this, discussed by Charlesworth et al. (1994), are perhaps not so familiar.

Modern genomic methods provide much more powerful means for identifying TE insertions. Recent population surveys using these methods have confirmed the older findings: most TEs in Drosophila are present at low frequencies, and there is statistical evidence for selection against insertions (Barron et al. 2014). This is consistent with the existence of elaborate molecular mechanisms for repressing TE activity, such as the Piwi-interacting RNA (piRNA) pathway of animals (Senti and Brennecke 2010); there would be no reason to evolve such mechanisms if TEs were harmless. In a few cases, TEs have swept to high frequencies or fixation, and there is convincing evidence that at least some of these events are associated with increased fitness caused by the TE insertions themselves (Barron et al. 2014).
These cases do not contradict the intragenomic parasite hypothesis for the maintenance of TEs; favorable mutations induced by TEs are too rare to outweigh the elimination of deleterious insertions unless new insertions continually replace those that are lost.
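The claim that a variant's fate is governed by the product of Ne and the strength of selection can be made concrete with Kimura's (1962) diffusion approximation for the fixation probability of a new mutation. The sketch below is my own illustration, not the essay's; the function name is hypothetical, and it assumes for simplicity that census size equals effective size.

```python
import math

def fixation_prob(s, Ne):
    """Kimura (1962): probability that a new mutation with selection
    coefficient s, present as a single copy (frequency 1/(2Ne)), is
    eventually fixed in a population of effective size Ne.
    Simplifying assumption: census size equals effective size."""
    p0 = 1.0 / (2 * Ne)            # initial frequency of a new mutation
    if s == 0:
        return p0                  # neutral case: fixation prob = initial frequency
    return (1 - math.exp(-4 * Ne * s * p0)) / (1 - math.exp(-4 * Ne * s))

# The essay's example: Ne = 1000 and a mutation with s = -0.001, so Ne*s = -1.
u = fixation_prob(-0.001, 1000)
print(f"fixation probability: {u:.2e}")                      # tiny: loss is near-certain
print(f"relative to a neutral variant: {u / fixation_prob(0, 1000):.3f}")
```

What matters is the product Ne·s: the same s = -0.001 in a population of Ne = 100 (Ne·s = -0.1) would behave almost neutrally.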
From the theory of aging, to the degeneration of Y chromosomes, to the dynamics of transposable elements, our understanding of the genetic basis of evolution is deeper and richer as a result of Charlesworth’s many contributions to the field. —Charles Langley, University of California, Davis
My other example is a population genetics discovery about a fundamental biological process: the PRDM9 protein involved in establishing recombination hot spots in humans. This was enabled by the revolution in population genetics brought about by coalescent theory (Hudson 1990), which is a powerful tool for examining the statistical properties of a sample from a population under the hypothesis of selective neutrality. The basic idea is simple: if we sample two homologous, nonrecombining haploid genomes (e.g., mitochondrial DNA) from a large population, there is a probability of 1/(2Ne) that they are derived from the same parental genome in the preceding generation; i.e., they coalesce (Ne is the effective population size for the genome region in question). If they fail to coalesce in that generation, there is a probability of 1/(2Ne) that they coalesce one generation further back, and so on. If n genomes are sampled, there is a bifurcating tree connecting them back to their common ancestor. The size and shape of this tree are highly random, so genetically independent components of the genome experience different trees, even if they share the same Ne. The properties of sequence variability in the sample can be modeled by throwing mutations at random onto the tree (Hudson 1990).

Recombination causes different sites in the genome to experience different trees, but closely linked sites have much more similar trees than independent sites. At the level of sequence variability, close linkage results in nonrandom associations between neutral variants, known as linkage disequilibrium (LD). The extent of LD among neutral variants at different sites is determined by the product of Ne and the frequency of recombination between them, c (Ohta and Kimura 1971; McVean 2002).
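The coalescent process just described is straightforward to simulate. In the sketch below (my own illustration, not from the essay), time is measured in generations: while k lineages remain, the waiting time to the next coalescence is exponential with rate C(k,2)/(2Ne), and mutations could then be thrown onto the tree as a Poisson process on its total branch length.

```python
import random

def coalescent_tree_length(n, Ne, rng):
    """Total branch length, in generations, of one random coalescent
    genealogy for n sampled haploid genomes (standard neutral model)."""
    total, k = 0.0, n
    while k > 1:
        # while k lineages remain, some pair coalesces at rate
        # k*(k-1)/2 / (2*Ne) per generation
        t = rng.expovariate(k * (k - 1) / 2.0 / (2.0 * Ne))
        total += k * t             # each of the k branches extends by t
        k -= 1
    return total

rng = random.Random(42)
n, Ne, reps = 5, 1000, 20000
mean_len = sum(coalescent_tree_length(n, Ne, rng) for _ in range(reps)) / reps

# Theory: E[length] = 4*Ne * sum_{i=1}^{n-1} 1/i  (about 8333 for n=5, Ne=1000)
expected = 4 * Ne * sum(1.0 / i for i in range(1, n))
print(mean_len, expected)
```

Under the infinite-sites model, the expected number of segregating sites is the per-generation mutation rate times this expected tree length, which is the relationship that Watterson's estimator of diversity inverts.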
Richard Hudson proposed a statistical method for estimating Nec from data on variants at multiple sites across the genome (Hudson 2001), which was implemented in the widely used computer program LDhat by Gil McVean and colleagues (McVean et al. 2002). Applications to large data sets on human sequence variability showed that the genome is full of recombination hot spots and cold spots, consistent with previous molecular genetic studies of specific loci (Myers et al. 2005). Most recombination occurs in hot spots and very little in between them, accounting for the fact that there is almost complete LD over tens or even hundreds of kilobases in humans. The identification of a large number of hot spots led to the discovery of a sequence motif bound by a zinc finger protein, PRDM9, at about the same time that mouse geneticists discovered that PRDM9 promotes recombination (McVean and Myers 2010; Baudat et al. 2014). These discoveries have led to many interesting observations, such as associations between PRDM9 variants in humans and individual variation in recombination rates, generating an ongoing research program of great scientific interest (Baudat et al. 2014).

With the ever-increasing use of genomic data, I am confident that many more such fruitful interactions between molecular and population genetics will take place. A take-home message is that more needs to be done to integrate training in population, molecular, and computational approaches to provide the next generation of researchers with the broad range of knowledge they will need.

4.
Faced with a shortage of trained nursing staff and a high wastage rate among learners, the management of a district general hospital decided to close some of its acute beds. A beds committee attempted to minimize the effects of these closures by introducing a bed bureau, “pooling,” and a simple system for forecasting waiting list admissions.

The figures for recruitment and wastage of nurses improved, and a very high turnover per available bed was achieved. This increased efficiency in numerical terms was not mirrored by an improvement in the morale of the doctors and nurses working on the wards, who were subjected to new pressures and considered that at times the standard of patient care deteriorated.

5.
Narra HP, Ochman H. Current Biology 2006, 16(17): R705–R710
Though bacteria are predominantly asexual, the genetic information in their genomes can be expanded and modified through mechanisms that introduce DNA from outside sources. Bacterial sex differs from that of eukaryotes in that it is unidirectional and does not involve gamete fusion or reproduction. The input of DNA during bacterial sex generates diversity in two ways: through the alteration of existing genes by recombination and through the introduction of novel sequences. Each of these processes has been shown to aid in the survival and diversification of lineages.

6.
Greenland S. Biometrics 2000, 56(3): 915–921
Regression models with random coefficients arise naturally in both frequentist and Bayesian approaches to estimation problems. They are becoming widely available in standard computer packages under the headings of generalized linear mixed models, hierarchical models, and multilevel models. I here argue that such models offer a more scientifically defensible framework for epidemiologic analysis than the fixed-effects models now prevalent in epidemiology. The argument invokes an antiparsimony principle attributed to L. J. Savage, which is that models should be rich enough to reflect the complexity of the relations under study. It also invokes the countervailing principle that you cannot estimate anything if you try to estimate everything (often used to justify parsimony). Regression with random coefficients offers a rational compromise between these principles as well as an alternative to analyses based on standard variable-selection algorithms and their attendant distortion of uncertainty assessments. These points are illustrated with an analysis of data on diet, nutrition, and breast cancer.

7.
Callesen H, Bak A, Greve T. Theriogenology 1992, 38(5): 959–968
Two Pregnant Mare Serum Gonadotrophin (PMSG) antisera were tested in 174 dairy cows that were superovulated with PMSG and were then given prostaglandin at 60 hours after PMSG. At 48 hours after injection of prostaglandin, the cows were given either PMSG antiserum (monoclonal (n=56) or polyclonal (n=57)), or saline as control (n=61). Ova (n=1,206) were recovered either nonsurgically or after slaughter. Of these, 757 were evaluated morphologically to be transferable embryos. A proportion of these embryos (n=295 from 52 flushed donors) were transferred to synchronized recipients and the pregnancy results were recorded. The reproductive function of 37 flushed donors was followed for 6 months after superovulation. No significant effect of the PMSG antisera could be demonstrated in any of the parameters studied (i.e., ovulation rate, number of follicles at collection, total yield of ova, fertilization rate, number of transferable embryos, pregnancy results after transfer of embryos, or period required by the donor cows for restitution of reproductive function after superovulation and recovery). It is concluded that use of PMSG antiserum did not improve the embryo yield in terms of the number and quality of transferable embryos or enhance normalization of reproductive function of the donor in the 6-month period after superovulation. Therefore, in an embryo transfer operation, the routine use of PMSG antiserum in a PMSG superovulation regimen in cattle is not recommended.

10.
Schrauf C, Call J, Fuwa K, Hirata S. PLoS One 2012, 7(7): e41044
The extent to which tool-using animals take into account relevant task parameters is poorly understood. Nut cracking is one of the most complex forms of tool use, and the choice of an adequate hammer is a critical aspect of success. Several properties make a hammer suitable for nut cracking, with weight being a key factor in determining the impact of a strike; in general, the greater the weight, the fewer strikes required. This study experimentally investigated whether chimpanzees are able to encode the relevance of weight as a property of hammers to crack open nuts. By presenting chimpanzees with three hammers that differed solely in weight, we assessed their ability to relate the weight of the different tools to their effectiveness and thus select the most effective one(s). Our results show that chimpanzees use weight alone in selecting tools to crack open nuts and that experience clearly affects the subjects' attentiveness to the tool properties that are relevant for the task at hand. Chimpanzees can encode the requirements that a nut-cracking tool should meet (in terms of weight) to be effective.

11.
Feng Y, Luo L. Amino Acids 2008, 35(3): 607–614
This paper develops a novel sequence-based method, tetra-peptide-based increment of diversity with quadratic discriminant analysis (TPIDQD for short), for protein secondary-structure prediction. The proposed TPIDQD method is based on tetra-peptide signals and is used to predict the structure of the central residue of a sequence fragment. The three-state overall per-residue accuracy (Q3) is about 80% in the threefold cross-validated test for 21-residue fragments in the CB513 dataset. The accuracy can be further improved by taking long-range sequence information (fragments of more than 21 residues) into account in prediction. The results show that tetra-peptide signals can indeed reflect some relationship between an amino acid sequence and its secondary structure, indicating the importance of tetra-peptide signals as the protein folding code in protein structure prediction.
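The Q3 score used in the abstract is simply the fraction of residues whose predicted three-state label (helix, strand, or coil) matches the true one. A minimal sketch; the function name and the toy labels are hypothetical, not drawn from the paper or the CB513 dataset:

```python
def q3_accuracy(predicted, actual):
    """Three-state per-residue accuracy: the fraction of positions where
    the predicted secondary-structure label (H/E/C) equals the true label."""
    if len(predicted) != len(actual):
        raise ValueError("sequences must be the same length")
    matches = sum(p == a for p, a in zip(predicted, actual))
    return matches / len(actual)

# Toy 10-residue example: one strand residue mispredicted, so Q3 = 0.9
print(q3_accuracy("HHHHCCEEEC", "HHHHCCEECC"))  # 0.9
```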

12.
Deoxyribonucleoside kinases (dNKs) phosphorylate deoxyribonucleosides to their corresponding monophosphate compounds. dNKs also phosphorylate deoxyribonucleoside analogues that are used in the treatment of cancer or viral infections. The study of the mammalian dNKs has therefore always been of great medical interest. During the last 20 years, however, research on dNKs has extended to nonmammalian organisms. In this review, we focus on nonviral dNKs, in particular their diversity and their practical applications. The diversity of this enzyme family in different organisms has proven valuable in studying the evolution of enzymes. Some of these newly discovered enzymes have been useful in numerous practical applications in medicine and biotechnology and have contributed to our understanding of the structural basis of nucleoside and nucleoside analogue activation.

13.
A new germfree chicken cage for rearing chicks up to 3 or 4 weeks of age has been designed and is in use at the University of Missouri. The cage and the accessory parts, the small magnet and hinged door, the “Bactytector,” the built-in air filter assembly, and the glass top with connecting air outlet filter have been described in detail. The complete operating procedure for sterilizing the cage and diet and the method of adding sterile embryonated eggs have been outlined. Data on the effectiveness of the cage as a physical barrier to microbes have been presented.

14.
This article argues that policies aimed at sustainability need to address the spatial dimensions of environmental problems and their solutions. In particular, spatial configurations of economic activities deserve attention, which means addressing land use, infrastructure, trade, and transport. Unfortunately, good theory and indicators to support the analysis and design of spatial-environmental policies are not fully developed. One approach that has become very popular in the last decade is the ecological footprint (EF). It is both an environmental accounting tool and an aggregate indicator, used by scientists, environmental organizations, and popular media. Despite criticisms of the EF method in the past, its popularity has only increased; indeed, a growing number of publications applying the EF appear in scientific journals. We review the EF approach from indicator-methodology and welfare angles and assess its policy relevance. Our conclusion is that it does not offer any meaningful information for public policy.

15.
Metabolite profiling is commonly performed by GC–MS of methoximated trimethylsilyl (TMS) derivatives. The popularity of this technique owes much to the robust, library-searchable spectra produced by electron ionization (EI). However, due to extensive fragmentation, EI spectra of trimethylsilyl derivatives are commonly dominated by trimethylsilyl fragments (e.g., m/z 73 and 147), and higher-m/z fragment ions carrying structural information are at low abundance. Consequently, different metabolites can have similar EI spectra, which presents problems for the identification of “unknowns” and for the detection and deconvolution of overlapping peaks. The aim of this work is to explore the use of positive chemical ionization (CI) as an adjunct to EI for GC–MS metabolite profiling. Two reagent gases differing in proton affinity (CH4 and NH3) were used to analyse 111 metabolite standards and extracts from plant samples. NH3-CI mass spectra were simple and generally dominated by [MH]+ and/or the adduct [M+NH4]+. For the 111 metabolite standards, m/z 73 and 147 were less than 3% of the base peak in NH3-CI and less than 30% of the base peak in CH4-CI. With CH4-CI, [MH]+ was generally present but at lower relative abundance than with NH3-CI. CH4-CI spectra were commonly dominated by losses from [MH]+ of CH4 ([M+1-16]+), of one to three TMSOH ([M+1-n×90]+), and of combinations of the two ([M+1-n×90-16]+). CH4-CI and NH3-CI mass spectra are presented for 111 common metabolites, and CI is used with real samples to help identify overlapping peaks and to aid identification via determination of the pseudomolecular ion with NH3-CI and structural information with CH4-CI.
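The adduct and loss series described in the abstract amount to nominal-mass bookkeeping: +1 Da for protonation, +18 for the ammonium adduct, -16 per CH4 loss, and -90 per TMSOH loss. A sketch of that arithmetic; the function names and the worked example are mine, not from the paper:

```python
def nh3_ci_ions(M):
    """Nominal m/z values expected to dominate an NH3-CI spectrum of a
    molecule of nominal mass M: protonated molecule and ammonium adduct."""
    return {"[MH]+": M + 1, "[M+NH4]+": M + 18}

def ch4_ci_ions(M, n_tms):
    """CH4-CI: [MH]+ plus the loss series reported in the abstract:
    CH4 (16 Da) and one to three TMSOH (90 Da each), singly and combined."""
    ions = {"[MH]+": M + 1, "[M+1-16]+": M + 1 - 16}
    for n in range(1, min(n_tms, 3) + 1):   # the abstract reports 1-3 TMSOH losses
        ions[f"[M+1-{n}x90]+"] = M + 1 - 90 * n
        ions[f"[M+1-{n}x90-16]+"] = M + 1 - 90 * n - 16
    return ions

# e.g. a hypothetical TMS derivative of nominal mass 467 with 4 TMS groups
print(nh3_ci_ions(467))   # {'[MH]+': 468, '[M+NH4]+': 485}
print(ch4_ci_ions(467, 4))
```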

16.
Niacin or niacin combinations were administered in dosages of 1.0 to 6.0 gm. to 31 hypercholesterolemic patients for periods up to three years. Eighty per cent were able to continue medication for long periods without significant side effects. Jaundice, apparently due to nicotinic acid, occurred in one patient. Liver toxicity will probably be a hazard in the use of this therapy.

Significant and maintained serum cholesterol depression was achieved in 80 per cent of the patients who were able to take adequate dosage. Reduction in xanthomata was observed with cholesterol reduction. In some cases, when larger doses were not tolerated or were unsuccessful, combining 1.5 gm. of niacin with small doses of estrogen or triparanol achieved the desired effect. Aluminum nicotinate had no important advantage over plain niacin.

17.
A study was carried out to determine whether the double diffusion gel test, when applied to the serum of patients with clear-cut penicillin reactions of various types, might be useful for demonstrating the presence of precipitating antibody. Results did not demonstrate the antibody.

The difference in results with this test obtained by various workers was not explained by the observations in this study.

Other approaches to determining the mechanism of the penicillin reaction are discussed, and it is noted that the hemagglutination test, newly applied to the penicillin reaction problem, may be useful after further investigation.
