Similar articles
2.
Does the Golgi self-organize or does it form around an instructive template? Evidence on both sides is piling up, but a definitive conclusion is proving elusive.

In the battle to define the Golgi, discussions easily spiral into what can appear like nitpicking. In a contentious poster session, an entire worldview rests on whether you think a particular mutant is arrested with vesicles that are close to but distinct from the ER, or almost budded from but still attached to the ER.

Sometimes obscured by these details are the larger issues. This debate “gets to the fundamental issue of how you think of the Golgi,” says Ben Glick of the University of Chicago (Chicago, IL). “The dogma has been that you need a template to build an organelle. But in the secretory system it's possible in principle that you could get de novo organization of structure. That's the issue that stirs people emotionally and intellectually.”

Then there are the collateral issues. There is an ongoing controversy about the nature of forward transport through the Golgi—it may occur via forward movement of small vesicles, or by gradual maturation of one cisterna to form the next. The cisternal maturation model “argues for a Golgi that can be made and consumed,” says Graham Warren (Yale University, New Haven, CT)—a situation that is more difficult to reconcile with Warren's template-determined universe.

Even more confusing is the situation in mitosis. Accounts vary wildly on how much of the Golgi disappears into the ER during mitosis. The answer would determine to what extent the cell has to rebuild the Golgi after mitosis, and what method it might use to do so.

Several laboratories have made major contributions to address these issues. But none define them so clearly as those of Warren and Jennifer Lippincott-Schwartz (National Institutes of Health, Bethesda, MD).
At almost every turn, on almost every issue, it seems that Warren and Lippincott-Schwartz reach opposite conclusions, sometimes based on similar or identical data.

And yet, at least in public, there is a remarkable lack of rancor. “These are not easy experiments for us to do,” says Warren. “It's all cutting-edge research and we are pushing the technology to the limit. Part of that is that you push your own interpretation.” For her part, Lippincott-Schwartz approaches a lengthy poster-session debate with Warren with something approaching glee. This is not triumphal glee, however. Rather, Lippincott-Schwartz seems to relish the opportunity to exchange ideas, and on this point Warren agrees. “Complacency is the worst thing to have in a field,” he says. The debate “has made all of us think a lot harder.”

3.
Hunter P. EMBO Reports 2011, 12(6): 504–507
New applications and technologies, and more rigorous safety measures, could herald a new era for genetically modified crops with improved traits, for use in agriculture and the pharmaceutical industry.

The imminent prospect of the first approval of a plant-made pharmaceutical (PMP) for human use could herald a new era for applied plant science, after a decade of public backlash against genetically modified crops, particularly in Europe. Yet the general resistance to genetically modified organisms might have done plant biotechnology a favour in the long run, by forcing it to adopt more-rigorous procedures for efficacy and safety in line with the pharmaceutical industry. This could, in turn, lead to renewed vigour for plant science, with the promise of developing not only food crops that deliver benefits to consumers and producers, but also a wide range of new pharmaceuticals.

This is certainly the view of David Aviezer, CEO of Protalix, an Israeli company that has developed what could become the first recombinant therapeutic protein from plants to treat Gaucher disease. The protein is called taliglucerase alpha; it is a recombinant human form of the enzyme glucocerebrosidase that is produced in genetically engineered carrot cells. This enzyme has a crucial role in the breakdown of glycolipids in the cell membrane, which are used either to provide energy or for cellular recognition. Deficiency of this enzyme causes an accumulation of lipids, with a variety of effects including premature death.

“My feeling is that there is a dramatic change in this area with a shift away from the direction where a decade ago biotech companies like Monsanto and Dow went with growing transgenic plants in an open field, and instead moving this process into a more regulatory well-defined process inside a clean room,” Aviezer said.
“Now the process is taking place in confined conditions and is very highly regulated as in the pharmaceutical industry.”

He argues that this is ushering in a new era for plant biotechnology that could lead to greater public acceptance, although he denies that the move to clean-room development has been driven purely by the environmental backlash against genetically modified organisms in the late 1990s and early 2000s. “That was one aspect, but I think the move has been coming more from an appreciation that biopharmaceuticals require a more regulatory defined setting than is achieved at the moment with transgenic plants.”

Interest in deriving pharmaceuticals from plants, known colloquially as ‘pharming', first took off in the 1990s after researchers showed that monoclonal antibodies could be made in tobacco plants (Hiatt et al, 1989). This led to the genetic engineering of plants to produce vaccines, antibodies and proteins for therapeutics, but none gained regulatory approval, mostly because of safety concerns. Moreover, the plants were grown in open fields, therefore attracting the same criticisms as transgenic food crops. In fact, a recent study showed that the views of the public on pharming depended on the product and the means to produce it; the researchers found increasing acceptance if the plants were used to produce therapeutics against severe diseases and were grown in containment (Pardo et al, 2009).

However, it was the technical challenges involved in purification, and the associated regulatory issues, that really delayed the PMP field, according to George Lomonossoff, project leader in biological chemistry at the John Innes Centre for plant research in Norwich in the UK, part of the Biotechnology and Biological Sciences Research Council (BBSRC).
“Extraction from plants required the development of systems which are not clogged by the large amounts of fibrous material, mainly cellulose, and the development of GMP [good manufacturing practice; quality and testing guidelines for pharmaceutical manufacture] compliant methods of purification which are distinct from those required from, say, mammalian cells,” said Lomonossoff. “All this is very time consuming.”

“Secondly there was no regulatory framework in place to assess the risks associated with proteins produced in plants, and determining how equivalent they are to mammalian-cell-produced material and what kind of contaminants you might have to guard against,” Lomonossoff added. “Again, attempting to address all possible concerns is a lengthy and expensive process.” Yet recent work by Protalix and a few other companies, such as Dow Agrosciences, has given grounds for optimism that purification and GMP-compliant methods of production have finally been established, Lomonossoff added.

The first important breakthrough for PMPs came in 2006, when Dow Agrosciences gained regulatory approval from the US Department of Agriculture for a vaccine against Newcastle disease, a contagious bird infection caused by the paramyxovirus PMV-1. “Though the vaccine, produced in tobacco-suspension culture cells, was never deployed commercially, it showed that regulatory approval for a plant-made pharmaceutical can be obtained, albeit for veterinary use in this case,” Lomonossoff said.

As approval is imminent for taliglucerase alpha for human use, it is natural to ask why plants, as opposed to micro-organisms and animals, are worth the effort as sources of vaccines, antibiotics or hormones.
There are three reasons: first, plants can manufacture some existing drugs more cheaply; second, they can do it more quickly; and third, and perhaps most significantly, they will be able to manufacture more complex proteins that cannot be produced with sufficient yield in any other way.

An important example in the first category is insulin, which is being manufactured in increasing quantities to treat type 1 diabetes and some cases of type 2 diabetes. Until the arrival of recombinant DNA technology, replacement insulin was derived from the pancreases of animals in abattoirs, mostly cattle and pigs, but it is now more often produced from transgenic Escherichia coli, or sometimes yeast. Recently, there has been growing interest in using plants rather than bacteria as sources of insulin (Davidson, 2004; Molony et al, 2005). SemBioSys, a plant biotechnology company based in Calgary, Canada, is now developing systems to produce insulin and other therapeutic proteins in the seeds of safflower, an oilseed crop (Boothe et al, 2009).

“We have developed technology that combines the high-capacity, low-cost production of therapeutic proteins in seeds with a novel technology that simplifies downstream purification,” said Joseph Boothe, vice president of research and development at SemBioSys. “The target proteins are engineered to associate with small, oil-containing structures within the seed known as oilbodies,” Boothe explained. “When extracted from the seed, these oilbodies and associated proteins can be separated from other components by simple centrifugation.
As a result, much of the heavy lifting around the initial purification is accomplished without chromatography, providing for substantial cost savings.”

The second potential advantage of PMPs is their speed to market, which could prove most significant for the production of vaccines, either against emerging diseases or against seasonal influenza, for which immunological changes in the virus mean that newly formulated vaccines are required each year. “In terms of a vaccine, I think influenza is very promising particularly as speed is of the essence in combating new strains,” Lomonossoff said. “Using transient expression methods, you can go from sequence to expressed protein in two weeks.” Transient gene expression involves introducing genes into a cell to produce a target protein, rather than permanently incorporating the gene into a host genome. It is emerging as a faster and less technically difficult alternative to developing stable cell lines for expressing bioengineered proteins. The process of delivering the desired gene into target cells, known as transfection, can be effected not only by viruses, but also by non-viral agents including various lipids, polyethylenimine and calcium phosphate.

The last of the three advantages of plants for pharmaceutical production—the ability to manufacture proteins not available by other means—is creating perhaps the greatest excitement. The Protalix taliglucerase alpha protein falls into this category, and is likely to be followed by other candidates for treating disorders that require enzymes or complex molecules beyond the scope of bacteria, according to Aviezer. “I would say that for simpler proteins, bacteria will still be the method of choice for a while,” Aviezer said. “But for more complex proteins currently made via mammalian cells, I think we can offer a very attractive alternative using plant cells.”

Indeed, plants can in principle be engineered to produce any protein, including animal ones, as Boothe pointed out.
“In some cases this may require additional genetic engineering to enable the plant to perform certain types of protein modification that differ between plants and animals,” he said. “The classic example of this is glycosylation. With recent advances in the field it is now possible to engineer plants to glycosylate proteins in a manner similar to that of mammalian cells.” Glycosylation is a site-directed process that adds mono- or polysaccharides to organic molecules, and it plays a vital role in folding and in conferring stability on the finished molecule or macromolecule. Although plants can be engineered to perform it, bacteria generally cannot, which is a major advantage of plant systems over micro-organisms for pharmaceutical manufacture, according to Aviezer. “This enables plant systems to do complex folding and so make proteins for enzyme replacement or antibodies,” Aviezer said.

In addition to plants themselves, their viruses also have therapeutic potential, either to display epitopes—the protein, sugar or lipid components of antigens on the surface of an infectious agent—so as to trigger an immune response or, alternatively, to deliver a drug directly into a cell. However, as Lomonossoff pointed out, regulatory authorities remain reluctant to approve any compound containing foreign nucleic acids for human use because of the risk of infection as a side effect. “I hope the empty particle technology [viruses without DNA] we have recently developed will revive this aspect,” Lomonossoff said. “The empty particles can also be used as nano-containers for targeted drug delivery and we are actively exploring this.”

As pharmaceutical production is emerging as a new field for plant biology, there is a small revolution going on in plant breeding, with the emergence of genomic techniques that allow simultaneous selection across several traits.
Although genetic modification can, by importing a foreign gene, provide instant expression of a desired trait, such as drought tolerance, protein content or pesticide resistance, the new field of genomics-assisted breeding has just as great a potential through the selection of unique variants within the existing gene pool of a plant, according to Douwe de Boer, managing director of the Netherlands biotech group Genetwister. “With this technology it will be possible to breed faster and more efficiently, especially for complex traits that involve multiple genes,” he said. “By using markers it is possible to combine many different traits in one cultivar, variety, or line in a pre-planned manner and as such breed superior crops.”

Genomics-assisted breeding is being used either as a substitute for, or a complement to, genetic-modification techniques, both for food crops, to bolt on traits such as nutrient value or drought resistance, and for pharmaceutical products, for example to increase the yield of a desired compound or to reduce unwanted side effects. Yet more research is required to make genomics-assisted breeding as widely used as established genetic-modification techniques. “The challenge in our research is to find markers for each trait and as such we extensively make use of bio-informatics for data storage, analysis and visualization,” de Boer said.

The rewards are potentially enormous, according to Alisdair Fernie, a group leader at the Max Planck Institute of Molecular Plant Physiology in Potsdam, Germany. “Smart breeding will certainly have a massive impact in the future,” Fernie said.
“The application of genomics technologies and next generation sequencing will surely revolutionize plant breeding and will eventually allow this to be achieved with clinical precision.” The promise of such genomic technologies in plants extends beyond food and pharmaceuticals to energy and new materials or products such as lubricants; the potential of plants is that they are not just able to produce the desired compound, but can often do so more quickly, efficiently and cheaply than competing biotechnological methods.

4.
Crop shortages     
A lack of breeders to apply the knowledge from plant science is jeopardizing public breeding programmes and the training of future plant scientists.

In the midst of an economic downturn, many college and university students in the USA face an uncertain future. There is one crop of graduates, though, who need not worry about unemployment: plant breeders. “Our students start with six-digit salaries once they leave and they have three or four offers. We have people coming to molecular biology and they can't find jobs. People coming to plant breeding, they have as many jobs as they want,” said Edward Buckler, a geneticist with the US Department of Agriculture's Agricultural Research Service Institute for Genomic Diversity at Cornell University (Ithaca, NY, USA).

The secret behind the success of qualified breeders on the job market is that they can join ‘Big Ag'—big agriculture—that is, the major seed companies. Roger Boerma, coordinator of academic research for the Center for Applied Genetic Technologies at the University of Georgia (Athens, GA, USA), said that most of his graduate and postdoctoral students find jobs at companies such as Pioneer, Monsanto and Syngenta, rather than working in the orchards and fields of academic research. According to Todd Wehner, a professor and cucurbit breeder at the Department of Horticultural Science, North Carolina State University (Raleigh, NC, USA), the best-paying jobs—US$100,000 plus good benefits and research conditions—are at seed companies that deal with the main crops (Guner & Wehner, 2003). By contrast, university positions typically start at US$75,000 on a tenure track.

As a result, Wehner said, public crop breeding in the USA has begun to disappear. “To be clear, there is no shortage of plant breeders,” he said.
“There is a shortage of plant breeders in the public sector.” The lure of Big Ag depletes universities and research institutes of plant breeders—who, after all, are the ones who create new plant varieties for agriculture—and jeopardizes the training of future generations of plant scientists and breeders. Moreover, there is an increasing demand for breeders to address the challenge of creating environmentally sustainable ways to grow more food for an increasing human population on Earth.

At the same time, basic plant research is making rapid progress. The genomes of most of the main crop plants and many vegetables have been sequenced, which has enabled researchers to better understand the molecular details of how plants fend off pests and pathogens, or withstand drought and flooding. This research has also generated molecular markers—short regions of DNA that are linked to, for example, better resistance to fungi or other pathogens. So-called marker-assisted breeding based on this information is now able to create new plant varieties more effectively than would be possible with the classical strategy of crossing, selection and backcrossing.

However, applying the genomic knowledge requires both breeders and plant scientists with a better understanding of each other's expertise. As David Baulcombe, professor of botany at the University of Cambridge, UK, commented: “I think the important gap is actually in making sure that the fundamental scientists working on genomics understand breeding, and equally that those people doing breeding understand the potential of genomics. This is part of the translational gap. There's incomplete understanding on both sides.”

In the genomic age, plant breeding has an image problem: like other hands-on agricultural work, it is dirty and unglamorous.
“A research project in agriculture in the twenty-first century resembles agriculture for farmers in the eighteenth century,” Wehner said. “Harvesting in the fields in the summer might be considered one of the worst jobs, but not to me. I'm harvesting cucumbers just like everybody else. I don't mind working at 105 degrees, with 95% humidity and insects biting my ankles. I actually like that. I like that better than office work.”

For most students, however, genomics is the more appealing option as a cutting-edge and glamorous research field. “The exciting photographs that you always see are people holding up glass test tubes and working in front of big computer screens,” Wehner explained.

In addition, Wehner said that federal and state governments have given greater priority and funding to molecular genetics than to plant breeding. “The reason we've gone away from plant breeding of course is that faculty can get competitive grants for large amounts of money to do things that are more in the area of molecular genetics,” he explained. “Plant breeders have switched over to molecular genetics because they can get money there and they can't get money in plant breeding.”

“The frontiers of science shifted from agriculture to genetics, especially the genetics of corn, wheat and rice,” agreed Richard Flavell, former Director of the John Innes Centre (Norwich, UK) and now Chief Scientific Officer of Ceres (Thousand Oaks, CA, USA). “As university departments have chased their money, chased the bright students, they have [focused on] programmes that pull in research dollars on the frontiers, and plant breeding has been left behind as something of a Cinderella subject.”

In a sense, public plant breeding has become a victim of its own success.
Wehner explained that over the past century, the protection of intellectual property has created a profitable market for private corporations, to the detriment of public programmes. “It started out where they could protect seed-propagated crops,” he said. “The companies began to hire plant breeders and develop their own varieties. And that started the whole agricultural business, which is now huge.”

As a result, Wehner said, the private sector can now outmanoeuvre public breeders at will. “[Seed companies] have huge teams that can go much faster than I can go. They have winter nurseries and big greenhouses and lots of pathologists and molecular geneticists and they have large databases and seed technologists and sales reps and catalogue artists and all those things. They can do much faster cucumber breeding than I can. They can beat me in any area that they choose to focus on.”

He said that seed corporations turn to public breeders only when they are looking for rare seeds obtained on expeditions around the world, or for special knowledge. These crops, and the breeders and other scientists who work on them, receive far less financial support from government than do the more profitable crops, such as corn and soybean. In effect, these crops are in a position analogous to that of orphan drugs, which receive little attention because the patients who need them represent a small economic market.

The dwindling support for public breeding programmes is also a result of larger political developments. Since the 1980s, when British Prime Minister Margaret Thatcher and US President Ronald Reagan championed the private sector in all things, governments have consistently withdrawn support for public research programmes wherever the private sector can profit. “Plant breeding programmes are expensive. My programme costs about US$500,000 a year to run for my crops, watermelon and cucumber.
Universities don't want to spend that money if they don't have to, especially if it's already being done by the private sector,” Wehner said.

“Over the last 30 years or so, food supplies and food security have fallen off the agenda of policymakers,” Baulcombe explained. “Applied research in academic institutions is disappearing, and so the opportunities for linking the achievements of basic research with applications, at least in the public sector, are disappearing. You've got these two areas of the work going in opposite directions.”

There's another problem for plant breeding in the publish-or-perish world of academia. According to Ian Graham, Director of the Centre for Novel Agricultural Products at the University of York in the UK, potential academics in the plant sciences are turned off by plant breeding as a discipline because it is difficult to publish the research in high-impact journals.

Graham, who is funded by the Bill & Melinda Gates Foundation to breed new varieties of Artemisia—the plant that produces the anti-malarial compound artemisinin—said this could change. “Now with the new [genomic] technologies, the whole subject of plant breeding has come back into the limelight. We can start thinking seriously about not just the conventional crops […] but all the marginal crops as well that we can really start employing these technologies on and doing exciting science and linking phenotypes to genes and phenotypes to the underlying biology,” he said. “It takes us back again closer to the science.
That will bring more people into plant breeding.”

Buckler, who specializes in functional genomic approaches to dissect complex traits in maize, wheat and Arabidopsis, said that public breeding still moves at a slower pace. “The seed companies are trying to figure out how to move genomics from gene discovery all the way to the breeding side. And it's moving forward,” he said. “There have been some real intellectual questions that people are trying to overcome as to how fast to integrate genomics. I think it's starting to occur also with a lot of the public breeders. A lot of it has been that the cost of genotyping, especially for specialty crops, was too high to develop marker systems that would really accelerate breeding.”

Things might be about to change on the cost side as well. Buckler said that decreasing costs for sequencing and genotyping will give public breeding a boost. Using today's genomic tools, researchers and plant breeders could match the achievements of the last century of maize breeding within three years. He said that comparable gains could be made in specialty crops, the forte of public breeding. “Right now, most of the simulations suggest that we can accelerate it about threefold,” Buckler said. “Maybe as our knowledge increases, we can approach a 15-fold rate increase.”

Indeed, the increasing knowledge from basic research could well contribute to significant advances in the coming years. “We've messed around with genes in a rather blind, sort of non-predictive process,” said Scott Jackson, a plant genomics expert at Purdue University (West Lafayette, IN, USA), who headed the team that decoded the soybean genome (Schmutz et al, 2010). “Having a full genome sequence, having all the genes underlying all the traits in whatever plant organism you're looking at, makes it less blind.
You can determine which genes affect the trait, and it has the potential to make it a more predictive process where you can take specific genes in combinations and you can predict what the outcome might be. I think that's where the real revolution in plant breeding is going to come.”

Nevertheless, the main problem that could hold back this revolution is a lack of trained people in academia and the private sector. Ted Crosbie, Head of Plant Breeding at Monsanto (St Louis, MO, USA), commented at the national Plant Breeding Coordinating Committee meeting in 2008 that “[w]e, in the plant breeding industry, face a number of challenges. More plant breeders are reaching retirement age at a time when the need for plant breeders has never been greater […] We need to renew our nation's capacity for plant breeding.”

Dry bean breeder James Kelly, a professor of crop and soil sciences at Michigan State University (East Lansing, MI, USA), said that while there has been a disconnect between public breeders and genomics researchers, new federal grants are designed to increase collaboration.

In the meantime, developing countries such as India and China have been filling the gap. “China is putting a huge amount of effort into agriculture. They actually know the importance of food. They have plant breeders all over the place,” Wehner said. “The US is starting to fall behind. And now, agricultural companies are looking around wondering—where are we going to get our plant breeders?”

To address the problem, major agriculture companies have begun to fund fellowships to train new plant breeders. Thus far, Buckler said, these efforts have had only a small impact. He noted that 500 new PhDs a year are needed just in maize breeding. “It's not uncommon for the big companies like Monsanto, Pioneer and Syngenta to spend money on training, on endowing chairs at universities,” Flavell said.
“It's good PR, but they're serious about the need for breeders.”

The US government has also taken some measures to alleviate the problem. Congress decided to establish the US National Institute of Food and Agriculture (Washington, DC, USA) under the auspices of the US Department of Agriculture to make more efficient use of research money, advance the application of plant science and attract new students to plant breeding (see the interview with Roger Beachy in this issue, pp 504–507). Another approach is to use distance education to train breeders, such as technicians who want to advance their careers, in certificate programmes rather than master's or doctorate programmes.

“If [breeding] is not done in universities in the public sector, where is it done?” Flavell asked about the future of public breeding. “I can wax lyrical and perhaps be perceived as being over the top, but if we're going to manage this planet on getting more food out of less land, this has to be almost one of the highest things that has got to be taken care of by government.” Wehner added: “The public in the developed world thinks food magically appears in grocery stores. There is no civilization without agriculture. Without plant breeders to work on improving our crops, civilization is at risk.”
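The marker-assisted breeding described in this article reduces, at its core, to a simple filtering idea, sketched below with entirely hypothetical marker and breeding-line names (real programmes genotype thousands of genome-wide markers and weigh them statistically): candidate lines are genotyped for DNA markers known to be linked to desired traits, and only the lines carrying every required marker advance to field trials.

```python
# Toy sketch of marker-assisted selection. All marker and line
# names are hypothetical, for illustration only.

# Markers linked to the traits the breeder wants to combine.
required_markers = {"fungal_resistance_M1", "drought_tolerance_M7"}

# Genotyping results: which trait-linked markers each candidate
# breeding line carries.
lines = {
    "line_A": {"fungal_resistance_M1", "drought_tolerance_M7", "yield_M3"},
    "line_B": {"fungal_resistance_M1"},
    "line_C": {"drought_tolerance_M7", "yield_M3"},
}

def select(lines, required):
    """Keep only lines whose markers include every required marker."""
    return sorted(name for name, markers in lines.items()
                  if required <= markers)  # subset test

print(select(lines, required_markers))  # ['line_A']
```

Only `line_A` carries both required markers, so it alone would be crossed and advanced; the same screen applied at the seedling stage is what lets marker-assisted programmes skip several seasons of field-based selection.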

5.
Geneticists and historians collaborated recently to identify the remains of King Richard III of England, found buried under a car park. Genetics has many more contributions to make to history, but scientists and historians must learn to speak each other's languages.

The remains of King Richard III (1452–1485), who was killed with sword in hand at the Battle of Bosworth Field at the end of the Wars of the Roses, had lain undiscovered for centuries. Earlier this year, molecular biologists, historians, archaeologists and other experts from the University of Leicester, UK, reported that they had finally found his last resting place. They compared ancient DNA extracted from a scoliotic skeleton discovered under a car park in Leicester—once the site of Greyfriars church, where Richard was rumoured to be buried, but the location of which had been lost to time—with that of a seventeenth-generation nephew of King Richard: it was a match. Richard has captured the public imagination for centuries: the Tudor-friendly playwright William Shakespeare (1564–1616) portrayed Richard as an evil hunchback who killed his nephews in order to ascend to the throne, whilst in succeeding years others have leapt to his defence and backed an effort to find his remains.

Molecular biologist Turi King, who led the Leicester team that extracted the DNA and tracked down a descendant of Richard's older sister, said that Richard's case shows how multi-disciplinary teams can join forces to answer history's questions. “There is a lot of talk about what meaning does it have,” she said. “It tells us where Richard III was buried; that the story that he was buried in Greyfriars is true.
I think there are some people who [will] try and say: “well, it''s going to change our view of him” […] It won''t, for example, tell us about his personality or if he was responsible for the killing of the Princes in the Tower.”

The discovery and identification of Richard''s skeleton made headlines around the world, but he is not the main prize when it comes to collaborations between historians and molecular biologists. Although some of the work has focused on high-profile historic figures—such as Louis XVI (1754–1793), the only French king to be executed, and Vlad the Impaler, the Transylvanian royal whose patronymic name inspired Bram Stoker''s Dracula (Fig 1)—many other projects involve population studies. Application of genetics to history is revealing much about the ancestry and movements of groups of humans, from the fall of the Roman Empire to ancient China.

Figure 1. The use of molecular genetics to untangle history. Even when the historical record is robust, molecular biology can contribute to our understanding of important figures and their legacies and provide revealing answers to questions about ancient princes and kings.

Medieval historian Michael McCormick of Harvard University, USA, commented that historians have traditionally relied on studying records written on paper, sheepskin and papyrus. However, he and other historians are now teaming up with geneticists to read the historical record written down in the human genome and expand their portfolio of evidence. “What we''re seeing happening now—because of the tremendous impact from the natural sciences and particularly the application of genomics; what some of us are calling genomic archaeology—is that we''re working back from modern genomes to past events reported in our genomes,” McCormick explained. “The boundaries between history and pre-history are beginning to dissolve. 
It''s a really very, very exciting time.”

…in the absence of written records, DNA and archaeological records could help fill in gaps

McCormick partnered with Mark Thomas, an evolutionary geneticist at University College London, UK, to try to unravel the mystery of one million Romano-Celtic men who went missing in Britain after the fall of the Roman Empire. Between the fourth and seventh centuries, Germanic tribes of Angles, Saxons and Jutes began to settle in Britain, replacing the Romano-British culture and forcing some of the original inhabitants to migrate to other areas. “You can''t explain the predominance of the Germanic Y chromosome in England based on the population unless you imagine (a) that they killed all the male Romano-Celts or (b) there was what Mark called ‘sexual apartheid'' and the conquerors mated preferentially with the local women. [The latter] seems to be the best explanation that I can see,” McCormick said of the puzzle.

Ian Barnes, a molecular palaeobiologist at Royal Holloway, University of London, commented that McCormick studies an unusual period, for which both archaeological and written records exist. “I think archaeologists and historians are used to having conflicting evidence between the documentary record and the archaeological record. If we bring in DNA, the goal is to work out how to pair all the information together into the most coherent story.”

Patrick Geary, Professor of Western Medieval History at the Institute for Advanced Study in Princeton, New Jersey, USA, studies the migration period of Europe: a time in the first millennium when Germanic tribes, including the Goths, Vandals, Huns and Longobards, moved across Europe as the Roman Empire was declining. “We do not have detailed written information about these migrations or invasions or whatever one wants to call them. Primarily what we have are accounts written later on, some generations later, from the contemporary record. 
What we tend to have are things like sermons bemoaning the fate of people because God''s wrath has brought the barbarians on them. Hardly the kind of thing that gives us an idea of exactly what is going on—are these really invasions, are they migrations, are they small military groups entering the Empire? And what are these ‘peoples'': biologically related ethnic groups, or ad hoc confederations?” he said.

Geary thinks that in the absence of written records, DNA and archaeological records could help fill in the gaps. He gives the example of jewellery, belt buckles and weapons found in ancient graves in Hungary and Northern and Southern Italy, which suggest migrations rather than invasions: “If you find this kind of jewellery in one area and then you find it in a cemetery in another, does it mean that somebody was selling jewellery in these two areas? Does this mean that people in Italy—possibly because of political change—want to identify themselves, dress themselves in a new style? This is hotly debated,” Geary explained. Material goods can suggest a relationship between people but the confirmation will be found in their DNA. “These are the kinds of questions that nobody has been able to ask because until very recently, DNA analysis simply could not be done and there were so many problems with it that this was just hopeless,” he explained. Geary has already collected some ancient DNA samples and plans to collect more from burial sites north and south of the Alps dating from the sixth century, hoping to sort out kinship relations and genetic profiles of populations.

King said that working with ancient DNA is a tricky business. “There are two reasons that mitochondrial DNA (mtDNA) is the DNA we wished to be able to analyse in [King] Richard. In the first instance, we had a female line relative of Richard III and mtDNA is passed through the female line. 
Fortunately, it''s also the most likely bit of DNA that we''d be able to retrieve from the skeletal remains, as there are so many copies of it in the cell. After death, our DNA degrades, so mtDNA is easier to retrieve simply due to the sheer number of copies in each cell.”

Geary contrasted the analysis of modern and ancient DNA. He called modern DNA analysis “[…] almost an industrial thing. You send it off to a lab, you get it back, it''s very mechanical.” Meanwhile, he described ancient DNA work as artisanal, because of degeneration and contamination. “Everything that touched it, every living thing, every microbe, every worm, every archaeologist leaves DNA traces, so it''s a real mess.” He said the success rate for extracting ancient mtDNA from teeth and dense bones is only 35%. The rate for nuclear DNA is only 10%. “Five years ago, the chances would have been zero of getting any, so 10% is a great step forward. And it''s possible we would do even better because this is a field that is rapidly transforming.”

But the bottleneck is not only the technical challenge to extract and analyse ancient DNA. Historians and geneticists also need to understand each other better. “That''s why historians have to learn what it is that geneticists do, what this data is, and the geneticists have to understand the kind of questions that [historians are] trying to ask, which are not the old nineteenth century questions about identity, but questions about population, about gender roles, about relationship,” Geary said.

DNA analysis can help to resolve historical questions and mysteries about our ancestors, but both historians and geneticists are becoming concerned about potential abuses and frivolous applications of DNA analysis in their fields. Thomas is particularly disturbed by studies based on single historical figures. 
“Unless it''s a pretty damn advanced analysis, then studying individuals isn''t particularly useful for history unless you want to say something like this person had blue eyes or whatever. Population level studies are best,” he said. He conceded that the genetic analysis of Richard III''s remains was a sound application but added that this often is not the case with other uses, which he referred to as “genetic astrology.” He was critical of researchers who come to unsubstantiated conclusions based on ancient DNA, and scientific journals that readily publish such papers.

…both historians and geneticists are becoming concerned about potential abuses or frivolous applications of DNA analysis in their fields

Thomas said that it is reasonable to analyse a Y chromosome or mtDNA to estimate a certain genetic trait. “But then to look at the distribution of those, note in the tree where those types are found, and informally, interpretively make inferences—“Well this must have come from here and therefore when I find it somewhere else then that means that person must have ancestors from this original place”—[…] that''s deeply flawed. It''s the most widely used method for telling historical stories from genetic data. And yet it is easily the one with the least credibility.” Thomas criticized such facile use of genetic data, which misleads the public and the media. “I suppose I can''t blame these [broadcast] guys because it''s their job to make the programme look interesting. If somebody comes along and says ‘well, I can tell you you''re descended from some Viking warlord or some Celtic princess'', then who are they to question.”

Similarly, the historians have reservations about making questionable historical claims on the basis of DNA analysis. Geary said the use of mtDNA to identify Richard III was valuable because it answered a specific, factual question. 
However, he is turned off by other research using DNA to look at individual figures, such as a case involving a princess who was a direct descendant of the woman who posed for Leonardo da Vinci''s Mona Lisa. “There''s some people running around trying to dig up famous people and prove the obvious. I think that''s kind of silly. There are others that I think are quite appropriate, and while [it] is not my kind of history, I think it is fine,” he said. “The Richard III case was in the tradition of forensics.”

…the cases in which historians and archaeologists work with molecular biologists are rare and remain disconnected in general from the mainstream of historical or archaeological research

Nicola Di Cosmo, a historian at the Institute for Advanced Study, who is researching the impact of climate change on the thirteenth century Mongol empire, follows closely the advances in DNA and history research, but has not yet applied it to his own work. Nevertheless, he said that genetics could help to understand the period he studies because there are no historical documents, although monumental burials exist. “It is important to get a sense of where these people came from, and that''s where genetics can help,” he said. He is also concerned about geneticists who publish results without involving historians and without examining other records. He cited a genetic study of a so-called ‘Eurasian male'' in a prestige burial of the Asian Hun Xiongnu, a nomadic people who at the end of the third century B.C. formed a tribal league that dominated most of Central Asia for more than 500 years. “The conclusion the geneticists came to was that there was some sort of racial tolerance in this nomadic empire, but we have no way to even assume that they had any concept of race or tolerance.”

Di Cosmo commented that the cases in which historians and archaeologists work with molecular biologists are rare and remain disconnected in general from the mainstream of historical or archaeological research. 
“I believe that historians, especially those working in areas for which written records are non-existent, ought to be taking seriously the evidence churned out by genetic laboratories. On the other hand, geneticists must realize that the effectiveness of their research is limited unless they access reliable historical information and understand how a historical argument may or may not explain the genetic data” [1].

Notwithstanding the difficulties in collaboration between two fields, McCormick is excited about historians working with DNA. He said the intersection of history and genomics could create a new scientific discipline in the years ahead. “I don''t know what we''d call it. It would be a sort of fusion science. It certainly has the potential to produce enormous amounts of enormously interesting new evidence about our human past.”  相似文献   

6.
Hunter P 《EMBO reports》2010,11(12):924-926
The global response to the credit crunch has varied from belt tightening to spending sprees. Philip Hunter investigates how various countries react to the financial crisis in terms of supporting scientific research.

The overall state of biomedical research in the wake of the global financial crisis remains unclear amid growing concern that competition for science funding is compromising the pursuit of research. Such concerns pre-date the credit crunch, but there is a feeling that an increasing amount of time and energy is being wasted in the ongoing scramble for grants, in the face of mounting pressure from funding agencies demanding value for money. Another problem is balancing funding between different fields; while the biomedical sciences have generally fared well, they are increasingly dependent on basic research in physics and chemistry, which are in greater jeopardy. This has led to calls for rebalancing funding, in order to ensure the long-term viability of all fields in an increasingly multidisciplinary and collaborative research world.

For countries that are cutting funding—such as Spain, Italy and the UK—the immediate priority is to preserve the fundamental research base and avoid a significant drain of expertise, either to rival countries or away from science altogether. This has highlighted the plight of postdoctoral researchers who have traditionally been the first to suffer from funding cuts, partly because they have little immediate impact on a country''s scientific competitiveness. Postdocs have been the first to go whenever budgets have been cut, according to Richard Frankel, a physicist at California Polytechnic State University in San Luis Obispo, who investigates magnetotaxis in bacteria. 
“In the short term there will be little effect but the long-term effects can be devastating,” he said.

…there is a feeling that an increasing amount of time and energy is being wasted in the ongoing scramble for grants, in the face of mounting pressure from funding agencies…

According to Peter Stadler, head of a bioinformatics group at the University of Leipzig in Germany, such cuts tend to cause the long-term erosion of a country''s science skills base. “Short-term cuts in science funding translate totally into a brain drain, since they predominantly affect young researchers who are paid from the soft money that is drying up first,” said Stadler. “They either leave science, an irreversible step, or move abroad but do not come back later, because the medium-term effect of cuts is a reduction in career opportunities and fiercer competition giving those already in the system a big advantage.”

Even when young researchers are not directly affected, the prevailing culture of short-term funding—which requires ongoing grant applications—can be disruptive, according to Xavier Salvatella, principal investigator in the Laboratory of Molecular Biophysics at the Institute for Research in Biomedicine in Barcelona, Spain. “I do not think the situation is dramatic but too much time is indeed spent writing proposals,” he commented. “Because success rates are decreasing, the time devoted to raise funds to run the lab necessarily needs to increase.”

At the University of Adelaide in Australia, Andrew Somogyi, professor of pharmacology, thinks that the situation is serious: “[M]y postdocs would spend about half their time applying for grants.” Somogyi pointed out that the success rate has been declining in Australia, as it has in some other countries. 
“For ARC [Australian Research Council] the success rate is now close to 20%, which means many excellent projects don''t get funding because the assessment is now so fine cut,” he said.

Similar developments have taken place in the USA at both the National Institutes of Health (NIH)—which provides US$16 billion in funding per year—and the American Cancer Society (ACS), the country''s largest private non-profit funder of cancer research, with a much smaller pot of US$120 million per year. The NIH funded 21% of research proposals submitted to it in 2009, compared with 32% a decade earlier, while the ACS approves only 15% of grant applications, down several percentage points over the past few years.

While the NIH is prevented by federal law from allowing observers in to its grant review meetings, the ACS did allow a reporter from Nature to attend one of its sessions on the condition that the names of referees and the applications themselves were not revealed (Powell, 2010). The general finding was that while the review process works well when around 30% of proposals are successful, it tends to break down as the success rate drops, as more arbitrary decisions are made and the risk of strong pitches being rejected increases. This can also discourage the best people from being reviewers because the process becomes more tiring and time-consuming.

Even when young researchers are not directly affected, the prevailing culture of short-term funding—which requires ongoing grant applications—can be disruptive…

In some countries, funding shortfalls are also leading to the loss of permanent jobs, for example in the UK where finance minister George Osborne announced on October 20 that the science budget would be frozen at £4.6 billion, rather than cut as had been expected. Even so, combined with the cut in funding for universities that was announced on the same day, this raises the prospect of reductions in academic staff numbers, which could affect research projects. 
This follows several years of increasing funding for UK science. Such uncertainty is damaging, according to Cornelius Gross, deputy head of the mouse biology unit, European Molecular Biology Laboratory in Monterotondo, Italy. “Large fluctuations in funding have been shown to cause damage beyond their direct magnitude as can be seen in the US where the Clinton boom was inevitably followed by a slowdown that led to rapid and extreme tightening of budgets,” he said.

Some countries are aware of these dangers and have acted to protect budgets and, in some cases, even increase spending. A report by the OECD argued that countries and companies that boosted research and development spending during the ‘creative destruction'' of an economic downturn tended to gain ground on their competitors and emerge from the crisis in a relatively stronger position (OECD, 2009). This was part of the rationale of the US stimulus package, which was intended to provide an immediate lift to the economy and has been followed by a slight increase in funding. The NIH''s budget is set to increase by $1 billion, or 3% from 2010 to 2011, reaching just over $32 billion. This looks like a real-term increase, since inflation in the USA is now between 1 and 2%. However, there are fears that budgets will soon be cut; even now the small increase at the Federal level is being offset by cuts in state support, according to Mike Seibert, research fellow at the US Department of Energy''s National Renewable Energy Laboratory. “The stimulus funds are disappearing in the US, and the overall budget for science may be facing a correction at the national level as economic, budget, and national debt issues are addressed,” he said. 
“The states in most cases are suffering their own budget crises and will be cutting back on anything that is not nailed down.”

…countries and companies that boosted research and development spending during the ‘creative destruction'' of an economic downturn tended to gain ground on their competitors…

In Germany, the overall funding situation is also confused by a split between the Federal and 16 state governments, each of which has its own budget for science. In contrast to many other countries though, both federal and state governments have responded boldly to the credit crisis by increasing the total budget for the DFG (Deutsche Forschungsgemeinschaft)—Germany''s largest research funding agency—to €2.3 billion in 2011. Moreover, total funding for research and education from the BMBF (Federal Ministry for Education and Research) is expected to increase by another 7% from €10.9 billion in 2010 to €11.64 billion, although the overall federal budget is set to shrink by 3.8% under Germany''s austerity measures (Anon, 2010). There have also been increases in funding from non-government sources, such as the Fraunhofer Society, Europe''s largest application-oriented research organization, which has an annual budget of €1.6 billion.

The German line has been strongly applauded by the European Union, which since 2007 has channelled its funding for cutting-edge research through the European Research Council (ERC). 
The ERC''s current budget of €7.5 billion, which runs until 2013, was set in 2007 and negotiations for the next period have not yet begun, but the ERC''s executive agency director Jack Metthey has indicated that it will be increased: “The Commission will firmly sustain in the negotiations the view that research and innovation, central to the Europe 2020 Strategy agreed by the Member States, should be a top budgetary priority.” Metthey also implied that governments cutting funding, as the UK had been planning to do, were practising a false economy that would gain only in the short term. “Situations vary at the national level but the European Commission believes that governments should maintain and even increase research and innovation investments during difficult times, because these are pro-growth, anti-crisis investments,” he said.

Many other countries have to cope with flat or declining science budgets; some are therefore exploring ways in which to do more with less. In Japan, for instance, money has been concentrated on larger projects and fewer scientists, with the effect of intensifying the grant application process. Since 2002, the total Japanese government budget for science and technology has remained flat at around ¥3,500 billion—or €27 billion at current exchange rates—with a 1% annual decline in university support but increased funding for projects considered to be of high value to the economy. This culminated in March 2010 with the launch of the ¥100 billion (€880 million) programme for World Leading Innovative Research and Development on Science and Technology.

But such attempts to make funding more competitive or focus it on specific areas could have unintended side effects on innovation and risk taking. One side effect can be favouring scientists who may be less creative but good at attracting grants, according to Roger Butlin, evolutionary biologist at the University of Sheffield in the UK. 
“Some productive staff are being targeted because they do not bring in grants, so money is taking precedence over output,” said Butlin. “This is very dangerous if it results in loss of good theoreticians or data specialists, especially as the latter will be a critical group in the coming years.”

“Scientists are usually very energetic when they can pursue their own ideas and less so when the research target is too narrowly prescribed”

There have been attempts to provide funding for young scientists based entirely on merit, such as the ERC ‘Starting Grant'' for top young researchers, whose budget was increased by 25% to €661 million for 2011. Although they are welcome, such schemes could also backfire unless they are supported by measures to continue supporting the scientists after these early career grants expire, according to Gross. “There are moves to introduce significant funding for young investigators to encourage independence, so-called anti-brain-drain grants,” he said. “These are dangerous if provided without later independent positions for these people and a national merit-based funding agency to support their future work.”

Such schemes might work better if they are incorporated into longer-term funding programmes that provide some security as well as freedom to expand a project and explore promising side avenues. Butlin cited the Canadian ‘Discovery Grant'' scheme as an example worth adopting elsewhere; it supports ongoing programmes with long-term goals, giving researchers freedom to pursue new lines of investigation, provided that they fit within the overall objective of the project.

To some extent the system of ‘open calls''—supported by some European funding agencies—has the same objective, although it might not provide long-term funding. The idea is to allow scientists to manoeuvre within a broad objective, rather than confining them to specific lines of research or ‘thematic calls'', which tend to be highly focused. 
“The majority of funding should be distributed through open calls, rather than thematic calls,” said Thomas Höfer from the Modeling Research Group at the German Cancer Research Center & BioQuant Center in Heidelberg. “Scientists are usually very energetic when they can pursue their own ideas and less so when the research target is too narrowly prescribed. In my experience as a reviewer at both the national and EU level, open calls are also better at funding high-quality research whereas too narrow thematic calls often result in less coherent proposals.”

“Cutting science, and education, is the national equivalent of a farmer eating his ‘seed corn'', and will lead to developing nation status within a generation”

Common threads seem to be emerging from the different themes and opinions about funding: budgets should be consistent over time and spread fairly among all disciplines, rather than focused on targeted objectives. They should also be spread across the working lifetime of a scientist rather than fired, scatter-gun fashion, at young researchers. Finally, policies should put a greater emphasis on long-term support for the best scientists and projects, chosen for their merit. Above all, funding policy should reflect the fundamental importance of science to economies, as Seibert concluded: “Cutting science, and education, is the national equivalent of a farmer eating his ‘seed corn'', and will lead to developing nation status within a generation.”  相似文献   

7.
Last year''s Nobel Prizes for Carol Greider and Elizabeth Blackburn should be encouraging for all female scientists with children

Carol Greider, a molecular biologist at Johns Hopkins University (Baltimore, MD, USA), recalled that when she received a phone call from the Nobel Foundation early in October last year, she was staring down a large pile of laundry. The caller informed her that she had won the 2009 Nobel Prize in Physiology or Medicine along with Elizabeth Blackburn, her mentor and co-discoverer of the enzyme telomerase, and Jack Szostak. The Prize was not only the ultimate reward for her own achievements, but it also highlighted a research field in biology that, unlike most others, is renowned for attracting a significant number of women.

Indeed, the 2009 awards stood out in particular, as five women received Nobel prizes. In addition to the Prize for Greider and Blackburn, Ada E. Yonath received one in chemistry, Elinor Ostrom became the first female Prize-winner in economics, and Herta Müller won for literature (Fig 1).

Figure 1. The 2009 Nobel Laureates assembled for a photo during their visit to the Nobel Foundation on 12 December 2009. Back row, left to right: Nobel Laureates in Chemistry Ada E. Yonath and Venkatraman Ramakrishnan, Nobel Laureates in Physiology or Medicine Jack W. Szostak and Carol W. Greider, Nobel Laureate in Chemistry Thomas A. Steitz, Nobel Laureate in Physiology or Medicine Elizabeth H. Blackburn, and Nobel Laureate in Physics George E. Smith. Front row, left to right: Nobel Laureate in Physics Willard S. Boyle, Nobel Laureate in Economic Sciences Elinor Ostrom, Nobel Laureate in Literature Herta Müller, and Nobel Laureate in Economic Sciences Oliver E. Williamson. © The Nobel Foundation 2009. Photo: Orasis.

Greider, the daughter of scientists, has overcome many obstacles during her career. 
She had dyslexia that placed her in remedial classes: “I thought I was stupid,” she told The New York Times (Dreifus, 2009). Yet, by far the biggest challenge she has tackled is being a woman with children in a man''s world. When she attended a press conference at Johns Hopkins to announce the Prize, she brought her children Gwendolyn and Charles with her (Fig 2). “How many men have won the Nobel in the last few years, and they have kids the same age as mine, and their kids aren''t in the picture? That''s a big difference, right? And that makes a statement,” she said.

The Prize […] highlighted a research field in biology that, unlike most others, is renowned for attracting a significant number of women

Figure 2. Mother, scientist and Nobel Prize-winner: Carol Greider is greeted by her lab and her children. © Johns Hopkins Medicine 2009. Photo: Keith Weller.

Marie Curie (1867–1934), the Polish–French physicist and chemist, was the first woman to win the Prize in 1903 for physics, together with her husband Pierre, and again for chemistry in 1911—the only woman to twice achieve such recognition. Curie''s daughter Irène Joliot-Curie (1897–1956), a French chemist, also won the Prize with her husband Frédéric in 1935. Since Curie''s 1911 prize, 347 Nobel Prizes in Physiology or Medicine and Chemistry (the fields in which biologists are recognized) have been awarded, but only 14—just 4%—have gone to women, with 9 of these awarded since 1979. That is a far cry from women holding up half the sky.

Yet, despite the dominance of men in biology and the other natural sciences, telomere research has a reputation as a field dominated by women. Daniela Rhodes, a structural biologist and senior scientist at the MRC Laboratory of Molecular Biology (Cambridge, UK) recalls joining the field in 1993. “When I went to my first meeting, my world changed because I was used to being one of the few female speakers,” she said. 
“Most of the speakers there were female.” She estimated that 80% of the speakers at meetings at Cold Spring Harbor Laboratory in those early days were women, while the ratio in the audience was more balanced.

Since Curie''s 1911 prize, 347 Nobel Prizes in Physiology or Medicine and Chemistry […] have been awarded, but only 14—just 4%—have gone to women…

“There''s nothing particularly interesting about telomeres to women,” Rhodes explained. “[The] field covers some people like me who do structural biology, to cell biology, to people interested in cancers […] It could be any other field in biology. I think it''s [a result of] having women start it and [including] other women.” Greider comes to a similar conclusion: “I really see it as a founder effect. It started with Joe Gall [who originally recruited Blackburn to work in his lab].”

Gall, a cell biologist, […] welcomed women to his lab at a time when the overall situation for women in science was “reasonably glum”…

Gall, a cell biologist, earned a reputation for being gender neutral while working at Yale University in the 1950s and 1960s; he welcomed women to his lab at a time when the overall situation for women in science was “reasonably glum,” as he put it. “It wasn''t that women were not accepted into PhD programs. It''s just that the opportunities for them afterwards were pretty slim,” he explained.

“Very early on he was very supportive to a number of women who went on and then had their own labs and […] many of those women [went] out in the world [to] train other women,” Greider commented. “A whole tree that then grows up that in the end there are many more women in that particular field simply because of that historical event.”

Thomas Cech, who won the Nobel Prize for Chemistry in 1989 and who worked in Gall''s lab with Blackburn, agreed: “In biochemistry and metabolism, we talk about positive feedback loops. This was a positive feedback loop. 
Joe Gall's lab at Yale was an environment that was free of bias against women, and it was scientifically supportive."

Gall, now 81 and working at the Carnegie Institution of Washington (Baltimore, MD, USA), is somewhat dismissive about his positive role. "It never occurred to me that I was doing anything unusual. It literally, really did not. And it's only been in the last 10 or 20 years that anyone made much of it," he said. "If you look back, […] my laboratory [was] very close to [half] men and [half] women."

The situation for women in graduate education had already improved by the 1970s and 1980s. "[W]hen I entered graduate school," Greider recalled, "it was a time when the number of graduate students [who] were women was about 50%. And it wasn't unusual at all." What has changed, though, is the number of women choosing to pursue a scientific career further. According to the US National Science Foundation (Arlington, VA, USA), women received 51.8% of doctorates in the life sciences in 2006, compared with 43.8% in 1996, 34.6% in 1986, 20.7% in 1976 and 11.9% in 1966 (www.nsf.gov/statistics).

In fact, Gall suspects that biology tends to attract more women than the other sciences. "I think if you look in biology departments that you would find a higher percentage [of women] than you would in physics and chemistry," he said. "I think […] it's hard to dissociate societal effects from specific effects, but probably fewer women are inclined to go into chemistry [or] physics. Certainly, there is no lack of women going into biology." However, the representation of women falls off at each level, from postdoc to assistant professor and tenured professor. Cech estimated that only about 20% of the biology faculty in the USA are women.

"[It] is a leaky pipeline," Greider explained. "People exit the system. Women exit at a much higher proportion than do men.
I don't see it as a [supply] pipeline issue at all, getting the trainees in, because for 25 years there have been a great number of women trainees."

Nancy Hopkins, a molecular biologist and long-time advocate on issues affecting women faculty members at the Massachusetts Institute of Technology (Cambridge, MA, USA), said that the situation in the USA has improved because of civil rights laws and affirmative action. "I was hired—almost every woman of my generation was hired—as a result of affirmative action. Without it, there wouldn't have been any women on the faculty," she said, but added that: "We all thought that with civil rights and affirmative action you'd open the doors and women would come in and everything would just follow. And it turned out that was not true."

Indeed, in a speech at an academic conference in 2005, Harvard President Lawrence Summers said that innate differences between males and females might be one reason why fewer women than men succeeded in science and mathematics. The economist, who served as Secretary of the Treasury under President Bill Clinton, told The Boston Globe that "[r]esearch in behavioural genetics is showing that things people previously attributed to socialization weren't [due to socialization after all]" (Bombardieri, 2005).

Some attendees of the meeting were angered by Summers's remarks that women do not have the same 'innate ability' as men in some fields. Hopkins said she left the meeting as a protest and in "a state of shock and rage". "It isn't a question of political correctness, it's about making unscientific, unfounded and damaging comments. It's what discrimination is," she said, adding that Summers's views reflect the problems women face in moving up the ladder in academia.
"To have the president of Harvard say that the second most important reason for their not being equal was really their intrinsic genetic inferiority is so shocking that no matter how many times I think back to his comments, I'm still shocked. These women were not asking to be considered better or special. They were just asking to have their gender be invisible."

Nonetheless, women are making inroads into academia, despite lingering prejudice and discrimination. One field of biology that counts a relatively high number of successful women among its upper ranks is developmental biology. Christiane Nüsslein-Volhard, for example, is Director of the Max Planck Institute for Developmental Biology in Tübingen, Germany, and won the Nobel Prize for Physiology or Medicine in 1995 for her work on the development of Drosophila embryos. She estimated that about 30% of speakers at conferences in her field are women.

However, she also noted that women have never been the majority in her own lab owing to the social constraints of German society. She explained that in Germany, Switzerland and Austria, family issues pose barriers for many women who want to have children and advance professionally because the pressure for women not to use day care is extremely strong. As such, "[w]omen want to stay home because they want to be an ideal mother, and then at the same time they want to go to work and do an ideal job and somehow this is really very difficult," she said. "I don't know a single case where the husband stays at home and takes care of the kids and the household. This doesn't happen. So women are now in an unequal situation because if they want to do the job, they cannot; they don't have a chance to find someone to do the work for them.
[…] The wives need wives." In response to this situation, Nüsslein-Volhard has established the CNV Foundation to financially support young women scientists with children in Germany, helping to pay for assistance with household chores and child care.

Rhodes, an Italian native who grew up in Sweden, agreed with Nüsslein-Volhard's assessment of the situation for many European female scientists with children. "Some European countries are very old-fashioned. If you look at the Protestant countries like Holland, women still do not really go out and have a career. It tends to be the man," she said. "What I find depressing is [that in] a country like Sweden where I grew up, which is a very liberated country, there has been equality between men and women for a couple of generations, and if you look at the percentage of female professors at the universities, it's still only 10%." In fact, studies from both Europe and the USA show that academic science is not a welcoming environment for women with children; less so than for childless women and fathers, who are more likely to succeed in academic research (Ledin et al, 2007; Martinez et al, 2007).

For Hopkins, her divorce at the age of 30 made a choice between children and a career unavoidable. Offered a position at MIT, she recalled that she very deliberately chose science. She said that she thought to herself: "Okay, I'm going to take the job, not have children and not even get married again because I couldn't imagine combining that career with any kind of decent family life." For many women, the recent Nobel Prize for Greider, who raised two children, and Blackburn (Fig 3), who raised one, therefore comes as much-needed reassurance that it is possible to combine family life and a career in science. Hopkins said the appearance of Greider and her children at the press conference sent "the message to young women that they can do it, even though very few women in my generation could do it.
The ways in which some women are managing to do it are going to become the role models for the women who follow them."

Figure 3: Elizabeth Blackburn greets colleagues and the media at a reception held in Genentech Hall at UCSF Mission Bay to celebrate her award of the Nobel Prize in Physiology or Medicine. © University of California, San Francisco 2009. Photo: Susan Merrell.  相似文献

8.
Samuel Caddick 《EMBO reports》2008,9(12):1174-1176

9.
Suran M 《EMBO reports》2011,12(1):27-30
Few environmental disasters are as indicting of humanity as major oil spills. Yet Nature has sometimes shown a remarkable ability to clean up the oil on its own.In late April 2010, the BP-owned semi-submersible oilrig known as Deepwater Horizon exploded just off the coast of Louisiana. Over the following 84 days, the well from which it had been pumping spewed 4.4 million barrels of crude oil into the Gulf of Mexico, according to the latest independent report (Crone & Tolstoy, 2010). In August, the US Government released an even grimmer estimate: according to the federal Flow Rate Technical Group, up to 4.9 million barrels were excreted during the course of the disaster. Whatever the actual figure, images from NASA show that around 184.8 million gallons of oil have darkened the waters just 80 km from the Louisiana coast, where the Mississippi Delta harbours marshlands and an abundance of biodiversity (NASA Jet Propulsion Laboratory, 2010; Fig 1).…the Deepwater incident is not the first time that a massive oil spill has devastated marine and terrestrial ecosystems, nor is it likely to be the lastOpen in a separate windowFigure 1Images of the Deepwater Horizon oil slick in the Gulf of Mexico. These images were recorded by NASA''s Terra spacecraft in May 2010. The image dimensions are 346 × 258 kilometres and North is toward the top. In the upper panel, the oil appears bright turquoise owing to the combination of images that were used from the Multi-angle Imaging SpectroRadiometer (MISR) aboard the craft. The Mississippi Delta, which harbors marshlands and an abundance of biodiversity, is visible in the top left of the image. The white arrow points to a plume of smoke and the red cross-hairs indicate the former location of the drilling rig. 
The lower two panels are enlargements of the smoke plume, which is probably the result of controlled burning of collected oil on the surface. © NASA/GSFC/LaRC/JPL, MISR Team

The resulting environmental and economic situation in the Gulf is undoubtedly dreadful—the shrimp-fishing industry has been badly hit, for example. Yet the Deepwater incident is not the first time that a massive oil spill has devastated marine and terrestrial ecosystems, nor is it likely to be the last. In fact, the US National Oceanic and Atmospheric Administration (NOAA) deals with approximately 300 oil spills per year, and the Deepwater catastrophe—despite its extent and the enormous amount of oil released—might not be as terrible for the environment as was originally feared. Jacqueline Michel, a geochemist who has worked on almost every major oil spill since the 1970s and who is a member of NOAA's scientific support team for the Gulf spill, commented that "the marshes and grass are showing some of the highest progresses of [oil] degradation because of the wetness." This rapid degradation is partly due to an increased number of oil-consuming microbes in the water, whose population growth in response to the spill is cleaning things up at a relatively fast pace (Hazen et al, 2010).

It therefore seems that, however bad the damage, Nature's capacity to repair itself might prevent the unmitigated disaster that many feared on first sight of the Deepwater spill. As the late social satirist George Carlin (1937–2008) once put it: "The planet will shake us off like a bad case of fleas, a surface nuisance[.] The planet will be here for a long, long—LONG—time after we're gone, and it will heal itself, it will cleanse itself, because that's what it does, it's a self-correcting system."

Michel believes that there are times when it is best to leave nature alone.
In such cases the oil will degrade naturally by processes as simple as exposure to sunlight—which can break it down—or exposure to the air—which evaporates many of its components. "There have been spills where there was no response because we knew we were going to cause more harm," Michel said. "Although we're going to remove heavier layers of surface oil [in this case], the decision has been made to leave oil on the beach because we believe it will degrade in a timescale of months […] through natural processing."

To predict the rate of general environmental recovery, Michel said one should examine the area's fauna, the progress of which can be very variable. Species have different recovery rates: although it takes only weeks or months for tiny organisms such as plankton to bounce back to their normal population density, it can take years for larger species such as the endangered sea turtle to recover.

Kimberly Gray, professor of environmental chemistry and toxicology at Northwestern University (Evanston, IL, USA), is most concerned about the oil damaging the bottom of the food chain. "Small hits at the bottom are amplified as you move up," she explained. "The most chronic effects will be at the base of the food chain […] we may see lingering effects with the shrimp population, which in time may crash. With Deepwater, it's sort of like the straw that broke the shrimp's back."

Wetlands in particular are a crucial component of the natural recovery of ecosystems, as they provide flora that are central to the diets of many organisms. They also provide nesting grounds and protective areas where fish and other animals find refuge from predation. "Wetlands and marsh systems are Nature's kidneys and they've been damaged," Gray said.
The problem is exacerbated because the Louisiana wetlands are already stressed in the aftermath of Hurricane Katrina, which devastated the Gulf coast in August 2005, and because of constant human activity and environmental damage. As Gray commented, "Nature has a very powerful capacity to repair itself, but what's happening in the modern day is assault after assault."

Ron Thom, a marine ecologist at Pacific Northwest National Laboratory—a US government-funded research facility (Richland, WA, USA)—has done important research on coastal ecosystems. He believes that such habitats are able to decontaminate themselves to a limited degree because of evolution. "[Coastal-related ecosystems are] pretty resilient because they've been around a long time and know how to survive," he said.

As a result, wetlands can decontaminate themselves of pollutants such as oil, nitrate and phosphate. However, encountering large amounts of pollutants in a short period of time can overwhelm the healing process, or even stop it altogether. "We did some experiments here in the early 90s looking at the ability for salt marshes to break down oil," Thom said. "When we put too much oil on the surface of the marsh it killed everything." He explained that the oil also destroyed the sediment–soil column, where plant roots are located. Eventually, the roots disintegrated and the entire soil core fell apart. According to Thom, the Louisiana marshes were weakened by sediment and nutrient starvation, which suggests that the Deepwater spill destroyed below-ground material in some locations.
"You can alter a place through a disturbance so drastic that it never recovers to what it used to be because things have changed so much," he said.

Michael Blum, a coastal marsh ecologist at Tulane University in New Orleans, said that it is hard to determine the long-term effects of the oil because little is known about the relevant ecotoxicology—the effect of toxic agents on ecosystems. He has conducted extensive research on how coastal marsh plants respond to stress: some marshes might be highly susceptible to oil, whereas others could have evolved to metabolize hydrocarbons in response to natural oil seepage. In the former case, marshes might perish after drastic exposure to oil, leading to major shifts in plant communities. In the latter case, the process of coping with oil could involve the uptake of pollutants in the oil—known as polycyclic aromatic hydrocarbons (PAHs)—and their reintroduction into the environment. "If plants are growing in the polluted sediments and tapping into those contaminated sources, they can pull that material out of the soil and put it back into the water column or back into the leaf tissue that is a food source for other organisms," Blum explained.

In addition to understanding the responses of various flora, scientists also need to know how the presence of oil in an ecosystem affects the fauna. One model that is used to predict the effects of oil on vertebrates is the killifish, a group of minnows that thrive in the waters of Virginia's Elizabeth River, where they are continuously exposed to PAHs deposited in the water by a creosote factory (Meyer & Di Giulio, 2003). "The killifish have evolved tolerance to the exposure of PAHs over chronic, long-term conditions," Blum said.
"This suggests that something similar may occur elsewhere, including in Gulf Coast marshes exposed to oil."

Although Michel is optimistic about the potential for environmental recovery, she pointed out that no two spills are the same. "There are [a] lot of things we don't know; we never had a spill that had surface release for so long at this water depth," she said. Nevertheless, to better predict the long-term effects, scientists have turned to data from similar incidents.

In 1989, the petroleum tanker Exxon Valdez struck Bligh Reef off the coast of Prince William Sound in Alaska and poured a minimum of 11 million gallons of oil into the water—enough to fill 125 Olympic-sized swimming pools. Stanley Rice, a senior scientist at NOAA in Juneau, Alaska, studies the long-term effects of the spill and the resulting oil-related issues in Prince William Sound. Rice has worked with the spill since day 3 and, 20 years later, he is seeing major progress. "I never want to give the impression that we had this devastating oil spill in 1989 and it's still devastating," he said. "We have pockets of a few species where lingering oil hurts their survival, but in terms of looking at the Sound in its entirety […] it's done a lot of recovery in 20 years."

Despite the progress, Rice is still concerned about one group of otters. The cold temperature of the water in the Sound—rarely above 5 °C—slows the disintegration of the oil and, every so often, the otters come in contact with a lingering pocket. When they are searching for food, for example, the otters often dig into pits containing oil and become contaminated, which damages their ability to maintain body temperature.
As a result, they cannot catch as much food and starve because they need to consume the equivalent of 25% of their body weight every day (Rice, 2009). "Common colds or worse, pneumonia, are extremely debilitating to an animal that has to work literally 365 days a year, almost 8 to 12 hours a day," Rice explained. "If they don't eat enough to sustain themselves, they die of hypothermia." Nevertheless, in just the last two years, Rice has finally seen the otter population rebound.

Unlike the otters, one pod of orca whales has not been so lucky. Since it no longer has any reproductive females, the pod will eventually become extinct. However, as it dies out, orca prey such as seals and otters will have a better chance of reproducing. "There are always some winners and losers in these types of events," Rice said. "Nature is never static."

The only 'loser' that Rice is concerned about at the moment is the herring, as many of their populations have remained damaged for the past 20 years. "Herring are critical to the ecosystem," he said. "[They are] a base diet for many species […] Prince William Sound isn't fully recovered until the herring recover."

North America is not alone in dealing with oil-spill disasters—Europe has had plenty of experience too. One of the worst spills occurred when the oil tanker Prestige leaked around 20 million gallons of oil into the waters off the Galician coast in Northern Spain in 2002. The spill also affected the coastline of France and is considered Spain's worst ecological disaster.

"The impacts of the Prestige were indeed severe in comparison with other spills around the world," said attorney Xabier Ezeizabarrena, who represented the Fishermen Guilds of Gipuzkoa in a lawsuit relating to the spill.
"Some incidents aren't even reported, but in the European Union the ratio is at least one oil spill every six months."

In Ezeizabarrena's estimation, Spanish officials did not respond appropriately to the leak. The government was denounced for towing the shipwreck further out into the Atlantic Ocean—where it eventually sank—rather than to a port. "There was a huge lack of measures and tools from the Spanish government in particular," Ezeizabarrena said. "[However], there was a huge response from civil society […] to work together [on restoration efforts]."

Ionan Marigómez, professor of cellular biology at the University of the Basque Country, Spain, was the principal investigator on a federal coastal-surveillance programme named Orbankosta. He recorded the effects of the oil on the Basque coast, was a member of the Basque government's technical advisory commission for the response to the Prestige spill, and chaired the government's scientific committee. "Unfortunately, most of us scientists were not prepared to answer questions related to the biological impact of restoration strategies," Marigómez said. "We lacked data to support our advice since continued monitoring is not conducted in the area […] and most of us had developed our scientific activity with too much focus on each one's particular area when the problem needed a holistic view."

For disasters involving oil, oceanographic data to monitor and predict the movement of the spill is essential. Clean-up efforts were initially encouraged in Spain, but data provided by coastal-inspection programmes such as Orbankosta informed the decision not to clean up the Basque shoreline, allowing the remaining oil debris to disintegrate naturally.
In fact, the cleaning activity that took place in Galicia only extended the oil pollution to the supralittoral zone—the area of the beach splashed by the high tide, rather than submerged by it—as well as to local soil deposits. On the Basque coast, restoration efforts were limited to regions where people were at risk, such as rocky areas near beaches and marinas.

Eight years later, Galicia still suffers from the after-effects of the Prestige disaster. Thick subsurface layers of grey sand are found on beaches, sometimes under sand that seems to be uncontaminated. In Corme-Laxe Bay and Cies Island in Galicia, PAH levels have decreased. Studies have confirmed, however, that organisms exposed to the area's sediments had accumulated PAHs in their bodies. Marigómez, for example, studied the long-term effects of the spill on mussels. Depending on their location, PAH levels decreased in the sampled mussel tissue between one and two years after the spill. However, later research showed that certain sites suffered subsequent increases in the level of PAHs, owing to the remobilization of oil residues (Cajaraville et al, 2006). Indeed, many populations of macroinvertebrate species—which are the keystones of coastal ecosystems—became extinct at the most-affected locations, although neighbouring populations recolonized these areas. The evidence suggests that only time will tell what will happen to the Galician ecosystem. The same goes for oil-polluted environments around the world.

The concern about whether nature can recover from oil spills might seem extreme, considering that oil is a natural product derived from the earth. But too much of anything can be harmful, and oil would remain locked underground without human efforts to extract it.
"As from Paracelsus' aphorism, the dose makes the poison," Marigómez said.

According to the US Energy Information Administration, the world consumes approximately 31 billion barrels of oil per year; more than 700 times the amount that leaked during the Deepwater spill. Humanity continues, in the words of some US politicians, to "drill, baby, drill!" On 12 October 2010, less than a year after the Gulf Coast disaster, US President Barack Obama declared that he was lifting the ban on deepwater drilling. It appears that George Carlin got it right again when he satirized a famous American anthem: "America, America, man sheds his waste on thee, and hides the pines with billboard signs from sea to oily sea!"

10.
Suran M 《EMBO reports》2011,12(5):404-407
The increasing influence of the Tea Party in Congress and politics has potential repercussions for public funding of scientific research in the USA.

In 2009, Barack Obama became the 44th President of the USA, amid hopes that he would fix the problems created or left unresolved by his predecessor. However, despite his positive mantra, "Yes we can," the situation was going to get worse: the country was spiralling towards an economic recession, a collapsing residential real-estate market and the loss of millions of jobs. The national debt now lingers around US$14 trillion (US Department of the Treasury, 2011). In response to these hardships and the presence of a perceived 'socialist' president in office, a new political movement started brewing that would challenge both the Democrats and the Republicans—the two parties that have dominated US politics for generations. Known as the Tea Party, this movement has been gaining national momentum in its denouncement of the status quo of the government, especially in relation to federal spending, including the support of scientific research.

The name is a play on the Boston Tea Party, at which more than 100 American colonists dumped 45 tonnes of tea into Boston Harbor (Massachusetts, USA) in 1773 to protest against the British taxation of imported tea. Whereas the 18th-century Boston Tea Party formed to protest against a specific tax, the Tea Party of the 21st century protests against taxes and 'big' government in general.

Many view Tea Party followers as modern muckrakers, but supporters claim their movement is fundamentally about upholding the US Constitution. Tea Party Patriots, a non-partisan organization, considers itself to be the official home of the Tea Party movement.
The movement is fuelled by three values—fiscal responsibility, limited government and free markets—which Tea Party Patriots believe are guaranteed by the Constitution, although not necessarily upheld by the administration.

"If you read the Constitution, the limits of government involvement in society [are] pretty well-defined and our government has gone farther and farther beyond the specific limits of the Constitution," said Mark Meckler, one of the co-founders of Tea Party Patriots. "Our Constitution is not designed as an empowering document, but as a limiting document… [and] was intended to be used as a weapon by the people against the government to keep it in the box." Tea Partiers tend to be especially critical when it comes to spending tax dollars on bank bailouts and health care, but anything goes when it comes to cutting back on public spending—even science. "We believe everything needs to be on the table since the government is virtually bankrupt," Meckler said. "We need to cut the waste, cut the abuse [and] get rid of the departments that shouldn't exist."

On 19 February 2011, the US House of Representatives, which is currently controlled by the Republicans, passed a federal-spending bill for the remainder of the 2011 fiscal year budget. Among other cuts, the bill called for billions of dollars to be slashed from the budgets of federal science agencies. If the bill is signed into law, the National Institutes of Health (NIH) will have $1.6 billion cut from its budget—a 5.2% decrease—and the Department of Energy (DOE) will experience an 18% cut in funding for its Office of Science.
Other agencies targeted include the Environmental Protection Agency (EPA), the National Aeronautics and Space Administration (NASA), the National Institute of Standards and Technology (NIST) and the National Science Foundation (NSF; Anon, 2011; Cho, 2011). Although the US Senate, which has a Democratic majority, must consider the bill before any definite amendments to the budget are made, it is likely that there will be some cuts to science funding.

Although the House is in favour of science-related cuts, President Obama supports spending more on science education, basic research and clean-energy research. He has also proposed an 11.8% increase in the budget of the DOE, as well as an 8% increase in the NSF budget (Office of Management and Budget, 2011).

Joann Roskoski, acting assistant director of the Biology Directorate at the NSF, said her institute is strongly in favour of President Obama's budget proposal. "President Obama is a very strong supporter of fundamental research and STEM [science, technology, engineering and mathematics] education because he perceives it as investing in the future of the country," she said. "These are just difficult budgetary times and we'll just have to wait and see what happens. As they say, the president proposes and Congress disposes."

Karl Scheidt, a professor of chemistry at Northwestern University (Evanston, Illinois, USA), has four grants from federal agencies. "A couple of my grants expire this year, which is happening at the worst, worst possible time," explained Scheidt, whose grants are funded by the NIH and the NSF. He added that although many politicians either do not understand or do not believe in the fundamentals of science, they still preach to the masses about what they 'think' they know.
"I think it's an absolute travesty that many people don't understand science and that many of the Republicans who don't fully understand science perpetuate incorrect assumptions and scientific falsehoods when speaking in public," he said. "It makes the US less competitive and puts us collectively at a disadvantage relative to other nations if we don't succeed in scientific education and innovative research in the future."

Although the Tea Party is not technically associated with the Republican Party, all Tea Party representatives and senators ran as Republican candidates in the last election. Whereas only one-third of Senate seats are on the ballot every two years for a six-year term, all House seats are contested every two years. In the most recent Senatorial election, 50% of Tea Party-backed candidates won; 10 in total. 140 candidates for seats in the House of Representatives were backed by the Tea Party—all of them Republicans—but only 40 won. Nevertheless, with around 100 new Republicans in office, a House controlled by a Republican majority and most Congress-based Republicans in agreement with Tea Party ideals, the Tea Party actually has a lot of sway on the voting floor.

Of course, as a fundamentally grass-roots movement, its influence is not limited to the halls of power. Since just before the November election last year, Tea Party-backed politicians have received more scrutiny and media exposure, meaning more people have listened to their arguments against spending on science. In fact, Republican politicians associated with the Tea Party have made critical and sometimes erroneous comments about science. Representative Michele Bachmann, for example, claimed that because carbon dioxide is a natural gas, it is not harmful to our atmosphere (Johnson, 2009). Representative Jack Kingston denounced the theory of evolution and stated that he did not come from a monkey (The Huffington Post, 2011).
When asked how old he believes the Earth to be, Senator Rand Paul refused to answer (Binckes, 2010). He also introduced a bill to cut the NSF budget by 62%, and targeted the budget of the Centers for Disease Control and Prevention.

Scheidt believes part of the challenge is that many scientists do not properly articulate the importance of their work to the public, and there is limited representation on behalf of science in Washington. “It's difficult sometimes to advocate for and explain the critical importance of basic research and for the most part, Congress may not always appreciate the basic fundamental mission of organizations like the NIH,” Scheidt said. “Arlen Specter was one of the few people who could form coalitions with his colleagues on both sides of the aisle and communicate why scientific research is critical. Why discovering new ways to perform transplants and creating new medicines are so important to everyone.”

…part of the challenge is that many scientists do not properly articulate the importance of their work to the public, and there is limited representation on behalf of science in Washington

Specter, a former senator, was Republican until 2009 when he decided to switch political parties. During the last Democratic primary, he lost the Pennsylvania Senate nomination after serving in Congress for more than four decades. The Democratic nominee, Joe Sestak, eventually lost the coveted seat to Pat Toomey, a Tea Party Republican who sponsored an amendment denying NIH funding for some grants while he was a House member. Toomey is also against funding climate science and clean-energy research with federal dollars.

Specter was considered a strong supporter of biomedical research, especially cancer research. He was the catalyst that pushed through a great deal of pro-science legislation, such as adding approximately $10 billion to NIH funding as part of the stimulus package in 2009, and doubling NIH funding in the 1990s.
As scientific research was so important to him, he served on the US Senate Committee on Appropriations Subcommittee on Labor, Health and Human Services, Education, and Related Agencies and on the Senate Committee on Environment and Public Works. Specter was a popular political champion of science not only because of all he had accomplished, but also because so few scientists are elected to office.

Among those Democrats who lost their seats to Tea Party Republicans was Congressman Bill Foster. Foster, who once worked for the Fermi National Accelerator Laboratory (Fermilab)—which is funded by the DOE—represented Batavia, Illinois, which is also where Fermilab has its headquarters. “The new representative in the district where Fermilab resides is Randy Hultgren, a Republican, who has been very supportive of the laboratory since he's been elected,” said Cindy Conger, Chief Financial Officer at Fermilab. “He's very interested in us and very interested […] in us having adequate funding.”

However, Fermilab is suffering financially. “We will […] have some level of layoffs,” Conger said. “Inadequate federal funding could result in more layoffs or not being able to run our machines for part of the year. These are the things we are contemplating doing in the event of a significant budget cut. Nothing is off the table [but] we will do everything we can to run the [Tevatron] accelerator.”

But Fermilab's desperate appeal for $35 million per year for the next three fiscal years was denied by the Obama administration and not included in the 2012 White House budget request. As a result, the most powerful proton–antiproton accelerator in the USA, the Tevatron, is shutting down indefinitely near the end of this year.

Another pro-science Republican is former Congressman John Porter, who studied at the Massachusetts Institute of Technology.
He encouraged the federal funding of science while serving as chair of the House Subcommittee on Labor, Health and Human Services, and Education, as well as on the House Committee on Appropriations and Related Agencies. Like Scheidt, Porter said a problem is that not many members of Congress really understand science or what goes into scientific research.

“Many members of Congress don't realize that the money appropriated for the funding of scientific research through NIH, NSF […] is sent out to research institutes in their districts and states where the research is conducted,” said Porter, who retired from Congress in 2001 after serving for more than 20 years. “They simply haven't been exposed to it and that's the fault of the science community, which has a great responsibility to educate about the mechanisms on how we fund scientific research.”

Today, Porter is vice-chair of the Foundation for the NIH and also chairs Research!America, a non-profit organization which aims to further medical, health and scientific research as higher national priorities. He also noted that industry would not fund scientific research in the way the government does because there would essentially be no profits. Therefore, federal funding remains essential.

“Let's take away the phones, iPads and everything else [those against the federal funding of science] depend on and see what's left,” Porter said. “The US is the world leader in science, technology and research and the way we got there and the way we have created the technology that makes life easier […] is a result of making investments in that area.”

For now, Scheidt said the best approach is to educate as many people as possible to understand that scientific research is a necessity, not a luxury. “We unfortunately have a very uneducated population in regard to science and it's not 100% their fault,” he said.
“However, if people took a real interest in science and paid as much attention to stem-cell or drug-discovery research as they did to the Grammy Awards or People magazine I think they would appreciate what's going on in the science world.”

…the best approach is to educate as many people as possible to understand that scientific research is a necessity, not a luxury

Instead, the USA is lagging behind its competitors when it comes to STEM education. According to the 2009 Program for International Student Assessment (PISA), the USA is ranked 17th in science and 25th in maths out of 34 countries (US Department of Education, 2010). “We're in a cluster now, we're no longer the leading country,” said D. Martin Watterson, a molecular biologist who sits on NIH peer-review committees to evaluate grant proposals. The reason, according to Watterson, is that the first things to be cut after a budget decrease are training grants for continuing education efforts. Moreover, the USA already lacks highly trained workers in the field of science. “In some disciplines, employers now look to other places in Europe and Asia to find those trained personnel,” Watterson said.

Ultimately, most people at least want a final budget to be passed so that there is sufficient time to plan ahead. However, Georgetown University political science professor Clyde Wilcox thinks that a compromise is not so simple. “The problem is that it's a three-way poker game. People are going to sit down and they're going to be bargaining, negotiating and bluffing each other,” he said. “The House Republicans just want to cut the programs that they don't like, so they're not cutting any Republican programs for the most part.”

As a result, institutions such as the EPA find themselves being targeted by the Republicans. Although there is not a filibuster-proof majority of Democrats in the Senate, they still are a majority and will try to preserve science funding.
Wilcox said that it is not necessarily a good thing to continue negotiating if nothing gets done and the country is dependent on continuing resolutions.

Although there is not a filibuster-proof majority of Democrats in the Senate, they still are a majority and will try to preserve science funding

“What the real problem is, when push comes to shove, someone has to blink,” he said. “I don't think there will be deep cuts in science for a number of reasons, one is science is consistent with the Democratic ideology of education and the Republican ideology of investment. And then, we don't really spend that much on science anyway so you couldn't come remotely close to balancing the budget even if you eliminated everything.”

Porter believes that representatives of the two parties, who were not as polarized during his time in Congress as they are today, have been driven apart by the political climate. “The president has made [science] a very important issue on his agenda and unfortunately, there are many Republicans today that say if he's for it, I'm against it,” Porter said. In fact, several government officials ignored repeated requests or declined to comment for this article.

“It's time for everybody from both parties to stand up for the country, put the party aside and find solutions to our problems,” Porter commented. “The American people didn't just elect us to yell at each other, they elected us to do a job. You have to choose priorities and to me the most important priority is to have our children lead better lives, to have all human beings live longer, healthier, happier lives and to have our economy grow and prosper and our standard of living maintained—the only way to do that is to invest where we lead the world and that's in science.”

Hunter P 《EMBO reports》2011,12(3):205-207
A more complete catalogue of the Earth's fauna and flora and a more holistic view of man-made environmental problems could help to slow the rate of biodiversity loss.

In the wake of the admission from the United Nations (UN) that, to date, efforts have failed to even slow down the rate of extinction across almost all plant and animal taxa (CBD, 2010), the fight to reverse the human-induced loss of biodiversity is entering a new chapter. The failure to achieve the targets set in 2002 for reducing decline has led to a revised strategy from the Convention on Biological Diversity (CBD). This new approach recognizes that species conservation cannot be treated in isolation from other issues facing humans, including climate change, water scarcity, poverty, agricultural development and global conflict. It also acknowledges that declining biodiversity cannot be tackled properly without a more accurate inventory of the species in existence today. Thus, a large part of the strategy to combat species decline focuses on building an exhaustive catalogue of life.

The Global Strategy for Plant Conservation includes such a plan. The intention is to compile an online flora of known plants by 2020, which should enable comprehensive conservation efforts to gather steam. Peter Wyse Jackson, president of the Missouri Botanical Garden in the USA, said that around 25% of the estimated 400,000 plant species in the world are thought to be threatened. He said that around 850 botanical gardens have, between them, collected around 100,000 species, but only a quarter of these are from the threatened group. “World Flora online will then be an essential baseline to determine the status of individual plant species and threats to them,” Jackson explained.
“By 2020 it is proposed that at least 75% of known threatened plants should be conserved both in the wild and in existing collections.”

…an online flora of known plants […] should enable comprehensive conservation efforts to gather steam

The Missouri Botanical Garden will have an important role in the project and Jackson commented that the first step of the plan has already been achieved: the establishment of an online checklist of flora that is needed to build a comprehensive database of the plant species in the world.

Yet, some other plans to halt species decline have drawn criticism. “In my opinion, whilst such international targets are useful to motivate individuals, states and wider society to do conservation, they are not necessarily realistic because they are often ‘pulled out of the hat' with very little science behind them,” commented Shonil Bhagwat, senior research fellow at the School of Geography and the Environment at Oxford University.

The revised CBD plan specifies measures for reversing the decline in biodiversity. One target is to enlarge protected areas for wildlife, within which activities such as logging are prohibited. Ecological corridors could then connect these areas to allow migration and create a network of ‘safe' places for wildlife.

Such a corridor is being created between two parts of the Brazilian Atlantic rainforest—the Pau Brasil National Park and the Monte Pascoal National Park—both of which are already protected. “Well-managed protected areas keep away biodiversity threats, such as deforestation, invasive species, hunting and poaching,” explained Arnd Alexander Rose, marketing manager for Brazil at The Nature Conservancy, a conservation organization that operates on all continents.
“We think that the connectivity between the national parks is essential for the long-term permanence of local species, especially fauna,” Rose said.

Worldwide, only around 6% of coastlines are within protected areas, but around 12% of the total land area is protected—a figure that is perhaps higher than many would expect, reflecting the large size of many national parks and other designated wildlife zones. Nevertheless, the coverage of different habitats varies greatly: “Only 5% of the world's temperate needle-leaf forests and woodlands, 4.4% of temperate grasslands and 2.2% of lake systems are protected” (CBD, 2010). The aim of the CBD is to increase the total area of protected land to 17% by 2020, and also to expand the protected coastal zones, as well as extending the area of protected oceans to 10%.

Things at sea, however, are different, both in terms of biodiversity and protection. The biggest threat to many marine species is not direct human activity—poaching or habitat encroachment, for example—but the impact of increased ocean acidity due to rising atmospheric carbon dioxide levels. Halting or reversing this increase will therefore contribute to the marine conservation effort and biodiversity in the long term.

However, the first task is to establish the extent of marine biodiversity, particularly in terms of invertebrate animals, which are not well catalogued. Ian Poiner is CEO of the Australian Institute of Marine Science and chair of the steering committee for the first Census of Marine Life (Census of Marine Life, 2010), which has revealed the enormity of our remaining uncertainty. “So far 250,000 species [of invertebrates] have been formally described, but at least another 750,000 remain to be discovered, and I think it could be as many as 10 million,” Poiner said. As evidence for this uncertainty he points to the continuing high rate of discovery of new species around coral reefs, where each organism also tends to come with a new parasite.
The situation is compounded by the problem of how to define diversity among prokaryotes.

“…250,000 species [of invertebrates] have been formally described, but at least another 750,000 remain to be discovered…”

Even if the number of non-vertebrate marine species remaining to be discovered turns out to be at the low end of estimates, Poiner points out that the abundance and diversity of life in the oceans will still be far greater than was expected before the census. For fish—a group that has been more extensively analysed than invertebrates—Poiner notes that there are several thousand species yet to be discovered, in addition to the 25,000 or more known species.

The levels of diversity are perhaps most surprising for microorganisms. It was expected that these organisms would be present in astronomically large numbers—they are thought to account for 50–90% of the biomass in the oceans, as measured by total amount of carbon—but the high degree of genetic divergence found within even relatively small areas was unexpected. “We found there are about 38,000 kinds of bacteria in a litre of sea water,” Poiner said. “We also found that rarity is common, especially for microbes. If you take two separate litre samples of sea water just 10 or 20 kilometres apart, only a small percentage of the 38,000 bacteria types in each one are of the same kind. The challenge now is to find out why most are so rare.”

This mystery is confounded by another result of the census: there is a much greater degree of connectedness than had been expected. Many fish, and even smaller invertebrate species, travel huge distances and navigate with great accuracy, rather like migratory birds.
“Pacific white sharks will travel long distances and come back to within 50 metres from where they started,” Poiner said, by way of example.

The behaviour of the sharks was discovered by using new tags, measuring just a few centimetres across, that can be attached to the heads of any large creatures to track their location and measure temperature, conductivity—and thereby salinity—and depth. For smaller creatures, such as baby salmon, a different technology is used that involves the attachment of passive acoustic sensors to their bodies. These trigger a signal when the fish swim through arrays of acoustic receivers that are installed in shallower waters at locations throughout the oceans.

Although tagging and acoustic monitoring are providing new information about the movements and interactions of many species throughout the oceans, the huge task remains of identifying and cataloguing those species. For this, the quickly maturing technique of DNA barcoding has been useful and provides a relatively inexpensive and convenient way of assessing whether a specimen belongs to a new species or not. The method uses a short DNA sequence in the mitochondrial gene for cytochrome c oxidase subunit 1 (CO1)—around 600 base pairs in most species—which differs little within species but significantly between them (Kress & Erickson, 2008).

The Marine Census programme involves several barcoding centres that have determined barcodes for more than 2,000 of the 7,000 known species of holozooplankton, for example (Census of Marine Zooplankton: http://www.cmarz.org). Holozooplankton are small, completely planktonic invertebrates—which spend their lives floating or swimming in open water—and are a particularly sensitive marker of environmental changes such as ocean warming or acidification.

DNA barcoding can also be applied to prokaryotes, although it requires alternative sequences owing to the lack of mitochondria.
In addition, horizontal gene transfer and uncertainty about how to define prokaryotic species complicate the task of cataloguing them. Nevertheless, by targeting a suitable core subset of a few genes, bacteria and archaea can be identified quite accurately, and barcoding can increase our knowledge and understanding of their behaviour and evolution.

Such techniques could be applied to the identification of marine prokaryotic species, but Poiner argues that they need further refinement and will probably need to be combined with analytical methods that help estimate the total diversity, given that it is impossible to identify every single species at present. Indeed, the task of assessing the diversity of even land-based microorganisms is difficult, but such cataloguing is a prerequisite for accurate assessment of their response to environmental change.

“There is a general rule that the smaller things are the less we know about them,” commented Stephen Blackmore, Regius Keeper of the Royal Botanical Gardens in Edinburgh, UK, a leading centre for conservation research. “I think it is very difficult or too early to say how biodiversity at the microscopic level is being impacted. Some of the newer approaches using DNA diversity to see, for example, what microorganisms are present in soil, will be important.”

In the immediate future, advanced DNA analysis techniques have a more urgent application: the identification of genetic diversity within eukaryotic species. This is important because it determines the ability of populations to cope with rapid change: a species with greater genetic diversity is more likely to have individuals with phenotypes capable of surviving changes in habitat, temperature or nutrient availability.
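The logic behind CO1 barcoding described above, that sequences diverge little within a species but substantially between species, can be sketched as a simple distance comparison. This is only an illustrative toy, assuming pre-aligned sequences of equal length; the sequences and the 2% threshold below are made-up values, not real barcode data or an actual barcoding pipeline.

```python
# Toy sketch of DNA-barcode species discrimination (illustrative values only).
# Real barcoding uses ~600 bp of the CO1 gene and empirically chosen thresholds.

def p_distance(seq_a: str, seq_b: str) -> float:
    """Proportion of mismatched positions between two aligned sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    mismatches = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    return mismatches / len(seq_a)

def same_species(seq_a: str, seq_b: str, threshold: float = 0.02) -> bool:
    """Call two barcodes conspecific if they differ at fewer than
    `threshold` (here a hypothetical 2%) of aligned positions."""
    return p_distance(seq_a, seq_b) < threshold

# Made-up 60-base 'barcodes': one close variant, one clearly divergent.
ref = "ATGGCATTAGCC" * 5
conspecific = ref[:-1] + "T"             # 1 mismatch in 60: within-species variation
other = ref.replace("GCC", "GTA")        # 10 mismatches in 60: another 'species'

print(round(p_distance(ref, conspecific), 3))  # 0.017
print(same_species(ref, conspecific))          # True
print(same_species(ref, other))                # False
```

In practice the threshold is not fixed a priori; barcoding studies look for a "gap" between the distribution of within-species and between-species distances.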
Genetic evidence will help to determine the secret of success for many invasive species of plants and animals, as they have already adapted to human influence.

“A major emerging theme is to look at the genetic diversity present in wild plant populations and to try to correlate this with identifying the populations that are best suited for coping with climate change,” Blackmore said. “But it's a very new field and so far not much is being funded. Meanwhile, the immediate prospect is that plants will continue slipping away more or less unnoticed. Even where the landscape appears green there is generally a steady erosion of plant biodiversity going on, driven by the shrinking of natural habitats, the encroachment of invasive species, climate change and land management practices.”

Yet Blackmore is optimistic that knowledge of how to preserve biodiversity is increasing, even for less adaptable species. “We know how to, for example, grow food crops in ways that are more beneficial to biodiversity, but the desire for the cheapest food means that uptake is too limited. We know how to do most of the things needed to protect biodiversity. Unfortunately they are not being done.”

There is hope, though, that increased understanding of biodiversity as a single, interconnected problem—rather than a series of unrelated hot spots and particular species—will lead to more coherent strategies for arresting global decline. The fate of flowering plants, for example, is intimately tied to their pollinators and seed dispersers. Most land animals in turn depend directly or indirectly on plants. “Since plants are the base of the food chain in all terrestrial environments, the threats to animals are increasing even more rapidly than those to the plants they depend upon,” Blackmore noted.
“It is still the case, however, that most conservation action is framed in terms of charismatic animals—such as tigers, whales, polar bears and pandas—rather than on the continuation of the kinds of place they require to live in.”

Given human nature, this ‘cute' framing of the problem is perhaps inevitable. However, if it creates a groundswell of public concern leading to voluntary involvement and donation towards biodiversity conservation, then all species might benefit in the end. After all, animals and plants do not respect arbitrary human boundaries, so an ecological corridor and protected habitat created for tigers will also benefit other, less ‘cuddly' species.

Despite the scientific community's overwhelming support for the European Research Council, many grant recipients are irked about red tape.

There is one thing that most European researchers agree on: B stands for Brussels and bureaucracy. Research funding from the European Commission (EC), which distributes EU money, is accompanied by strict accountability and auditing rules in order to ensure that European taxpayers' money is not wasted. All disbursements are treated the same, whether subsidies to farmers or grants to university researchers. However, the creation of the European Research Council (ERC) in 2007 as a new EU funding agency for basic research created high hopes among scientists for a reduced bureaucratic burden.

…many researchers who have received ERC funding have been angered by accounting rules inherited from the EC's Framework Programmes…

The ERC has, indeed, been a breath of fresh air to European-level research funding, as it distributes substantial grants based only on the excellence of the proposal, and it has been overwhelmingly supported by the scientific community. Nevertheless, many researchers who have received ERC funding have been angered by accounting rules inherited from the EC's Framework Programmes, which seem impossible to change. In particular, a requirement to fill out time sheets to demonstrate that scientists spend an appropriate amount of time working on the project for which they received their ERC grant has triggered protests over the paperwork (Jacobs, 2009).

Luis Serrano, Coordinator of the Systems Biology Programme at the Centre for Genomic Regulation in Barcelona, Spain, and recipient of a €2 million ERC Advanced Investigator Grant for five years, said the requirement of keeping time sheets is at best a waste of time and at worst an insult to high-level researchers. “Time sheets do not make much sense, to be honest. If you want to cheat, you can always cheat,” he said.
He said other grants he receives from the Spanish government and the Human Frontier Science Programme do not require time sheets.

Complaints by academic researchers about the creeping bureaucratization of research are not confined to the old continent (see Opinion by Paul van Helden, page 648). As most research, as well as universities and research institutes, is now funded by public agencies using taxpayers' money, governments and regulators feel under pressure to make sure that the funds are not wasted or misappropriated. Yet the USA and the EU have taken different approaches to making sure that scientists use public money correctly. In the USA, misappropriation of public money is considered a criminal offence that can be penalized by a ban on receiving public funds, fines and even jail time; in fact, a few scientists in the USA have gone to prison.

By contrast, the EU puts the onus on controlling how public money is spent upfront. Research funding under the EU's Framework Programmes requires clearly spelt-out deliverables and milestones, and requires researchers to adhere to strict accountability and auditing rules. Not surprisingly, this comes with an administrative burden that has raised the ire of many scientists who feel that their time is better spent doing research. Serrano said that in a major research centre such as the CRG, the administration can minimize the paper burden. “My administration prepares them for me and I go one, two, three, four, five and I do all of them. You can even have a machine sign for you,” he commented.
“But I can imagine researchers who don't have the administrative help, this can take up a significant amount of time.” For ERC grants, which by definition are for ‘blue-skies' research and thus do not have milestones or deliverables, such paperwork is clearly not needed.

Complaints by academic researchers about the creeping bureaucratization of research are not confined to the old continent

Not everyone is as critical as Serrano though. Vincent Savolainen at the Division of Biology at Imperial College London, UK, and recipient of a €2.5 million, five-year ERC Advanced Investigator Grant, said, “Everything from the European Commission always comes with time sheets, and ERC is part of the European Commission.” Still, he felt it was very confusing to track time spent on individual grants for Principal Investigators such as him. “It is a little bit ridiculous but I guess there are places where people may abuse the system. So I can also see the side of the European Commission,” he said. “It's not too bad. I can live with doing time sheets every month,” he added. “Still, it would be better if they got rid of it.”

Juleen Zierath, an integrative physiologist in the Department of Molecular Medicine at Karolinska Institutet (Stockholm, Sweden), who received a €2.5 million, five-year ERC grant, takes the time sheets in her stride. “If I worked in a company, I would have to fill out a time sheet,” she said. “I'm delighted to have the funding. It's a real merit. It's a real honour. It really helps my work. If I have to fill out a time sheet for the privilege of having that amount of funding for five years, it's not a big issue.”

Zierath, a native of Milwaukee (WI, USA) who came to Karolinska for graduate work in 1989, said the ERC's requirements are certainly “bureaucracy light” compared with the accounting and reporting requirements for more traditional EU funding instruments, such as the ‘Integrated Projects'. “ERC allows you to focus more on the science,” she said.
“I don't take time sheets as a signal that the European Union doesn't count on us to be doing our work on the project. They have to be able to account for where they're spending the money somehow and I think it's okay. I can understand where some people would be really upset about that.”

…governments and regulators feel under pressure to make sure that the funds are not wasted or misappropriated…

The complaints about time sheets and other bureaucratic red tape have caught the attention of high-level scientists and research managers throughout Europe. In March 2009, the EC appointed an outside panel, headed by Vaira Vike-Freiberga, former President of Latvia, to review the ERC's structures and mechanisms. The panel reported in July last year that the objective of building a world-class institution is not properly served by “undue cumbersome regulations, checks and controls.” Although fraud and mismanagement should be prevented, excessively bureaucratic procedures detract from the mission, and might be counter-productive.

Helga Nowotny, President of the ERC, said the agency has to operate within the rules of the EC's Framework Programme 7, which includes the ERC. She explained that if researchers hold several grants, the EC wants recipients to account for their time. “The Commission and the Rules of Participation of course argue that many of these researchers have more than one grant or they may have other contracts. In order to be accountable, the researchers must tell us how much time they spend on the project. But instead of simply asking if they spent a percentage of time on it, the Commission auditors insist on time sheets. I realize that filling them out has a high symbolic value for a researcher.
So, why not leave it to the administration of the host institution?”

Particle physicist Ian Halliday, President of the European Science Foundation and a major supporter of the ERC, said that financial irregularities that affected the EU over many years prompted the Commission to tighten its monitoring of cash outlays. “There have been endless scandals over the agricultural subsidies. Wine leaks. Nonexistent olive trees. You name it,” he said. “The Commission's financial system is designed to cope with that kind of pressure as opposed to trusting the University of Cambridge, for example, which has been there for 800 years or so and has a well-earned reputation by now. That kind of system is applied in every corner of the European Commission. And that is basically what is causing the trouble. But these rules are not appropriate for research.”

…financial irregularities that affected the EU over many years prompted the Commission to tighten its monitoring of cash outlays

Nowotny is sympathetic and sensitive to the researchers' complaints, saying that requiring time sheets for researchers sends a message of distrust. “It feels like you're not trusted. It has this sort of pedantic touch to it,” she said. “If you've been recognized for doing this kind of top research, researchers feel, ‘Why bother [with time sheets]?'” But the bureaucratic alternative would not work for the ERC either. This would mean spelling out ‘deliverables' in advance, which is clearly not possible with frontier research.

Moreover, as Halliday pointed out, there is inevitably an element of fiction with time sheets in a research environment. In his area of research, for example, he considers it reasonable to track the hours of a technician fabricating parts of a telescope. But he noted that there is a different dynamic for researchers: “Scientists end up doing their science sitting in their bath at midnight. And you mull over problems and so forth.
How do you put that on a time sheet?” Halliday added that one of the original arguments in establishing the ERC was to put it at arm's length from the Commission and in particular from financial regulations. But to require scientists to specify what proportion of their neurons are dedicated to a particular project at any hour of the day or night is nonsensical. Nowotny agreed. “The time sheet says I've been working on this from 11 in the morning until 6 in the evening or until midnight or whatever. This is not the way frontier research works,” she said.

Halliday, who served for seven years as chief executive of the Particle Physics and Astronomy Research Council (Swindon, UK), commented that all governments require accountability. In Great Britain, for instance, much more general accountability rules are applied to grantees, thereby offering a measure of trust. “We were given a lot of latitude. Don't get me wrong that we allowed fraud, but the system was fit for the purpose of science. If a professor says he's spending half his time on a certain bit of medical research, let's say, the government will expect half his salary to show up in the grants he gets from the funding agencies. We believe that if the University of Cambridge says that this guy is spending half his time on this research, then that's probably right and nobody would get excited if it was 55% or 45%. People would get excited if it was 5%. There are checks and balances at that kind of level, but it's not at a level of time sheets. It will be checked whether the project has done roughly what it said.”

Other funding agencies also take a less bureaucratic approach. Candace Hassall, head of Basic Careers at the Wellcome Trust (London, UK), which funds research to improve human and animal health, said Wellcome's translation awards have milestones that researchers are expected to meet. But “time sheets are something that the Wellcome Trust hasn't considered at all.
I would be astonished if we would ever consider them. We like to work closely with our researchers, but we don't require that level of reporting detail,” she said. “We think that such detailed, day-by-day monitoring is actually potentially counterproductive overall. It drives people to be afraid to take risks when risks should be taken.”
On the other side of the Atlantic, Jack Dixon, vice president and chief scientific officer at the Howard Hughes Medical Institute (Chevy Chase, MD, USA), who directs Hughes' investigator programme, said he'd never heard of researchers being asked to keep time sheets: “Researchers filling out time sheets is just something that's never crossed our minds at the Hughes. I find it sort of goofy if you want to know the truth.”
Instead, Hughes trusts researchers to spend the money according to their needs. “We trust them,” Dixon said. “What we ask each of our scientists to do is devote 75% of their time to research and then we give them 25% of their time which they can use to teach, serve on committees. They can do consulting. They can do a variety of things. Researchers are free to explore.”
There is already growing support for eliminating the time sheets and other bureaucratic requirements that come with an ERC grant, which are obviously just a hangover from the old system. Indeed, there have been other complaints, such as reviewers of grant applications having to fax in copies of their passports or identity cards before being allowed sight of the proposals, said Nowotny.
The review panel called on the EC to adapt its rules “based on trust and not suspicion and mistrust” so that the ERC can attain the “full realization of the dream shared by so many Europeans in the academic and policy world as well as in political milieus.”
In fact, a system based on trust still works better in the academic world. Hassall commented that lump-sum payments encourage the necessary trust and give researchers a sense of freedom, which is already the principle behind ERC funding. “We think that you have to trust the researcher. Their careers are on the line,” she said. Nowotny hopes the ERC will be allowed to take a similar approach to that of the Wellcome Trust, with its grants treated more like “a kind of prize money” than as a contract for services.
She sees an opportunity to relax the bureaucratic burden with a scheduled revision of the Rules of Participation, but issues a word of caution given that, when it comes to EU money, other players are involved. “We don't know whether we will succeed in this because it's up to the finance ministers, not even the research ministers,” she explained. “It's the finance ministers who decide the rules of participation. If finance ministers agree, then the time sheets would be gone.”

13.
Wolinsky H 《EMBO reports》2011,12(2):107-109
Considering a patient's ethnic background can make some diagnoses easier. Yet, ‘racial profiling’ is a highly controversial concept and might soon be replaced by the advent of individualized medicine.
In 2005, the US Food and Drug Administration (FDA; Bethesda, MD, USA) approved BiDil—a combination of vasodilators to treat heart failure—and hailed it as the first drug to specifically treat an ethnic group. “Approval of a drug to treat severe heart failure in self-identified black population is a striking example of how a treatment can benefit some patients even if it does not help all patients,” announced Robert Temple, the FDA's Director of Medical Policy. “The information presented to the FDA clearly showed that blacks suffering from heart failure will now have an additional safe and effective option for treating their condition” (Temple & Stockbridge, 2007). Even the National Medical Association—the African-American version of the American Medical Association—advocated the drug, which was developed by NitroMed, Inc. (Lexington, MA, USA). A new era in medicine based on racial profiling seemed to be in the offing.
By January 2008, however, the ‘breakthrough’ had gone bust. NitroMed shut down its promotional campaign for BiDil—a combination of the vasodilators isosorbide dinitrate, which affects arteries and veins, and hydralazine hydrochloride, which predominantly affects arteries. In 2009, it sold its BiDil interests and was itself acquired by another pharmaceutical company.
In the meantime, critics had largely discredited the efforts of NitroMed, thereby striking a blow against the drug, if not the concept of racial profiling or race-based medicine.
Jonathan Kahn, a historian and law professor at Hamline University (St Paul, MN, USA), described the BiDil strategy as “a leap to genetics.” He demonstrated that NitroMed, motivated to extend its US patent scheduled to expire in 2007, purported to discover an advantage for a subpopulation of self-identified black people (Kahn, 2009). He noted that NitroMed conducted a race-specific trial to gain FDA approval, but, as there were no comparisons with other populations, it never had conclusive data to show that BiDil worked in black people differently from anyone else.
“If you want to understand heart failure, you look at heart failure, and if you want to understand racial disparities in conditions such as heart failure or hypertension, there is much to look at that has nothing to do with genetics,” Kahn said, adding “that jumping to race as a genetic construct is premature at best and reckless generally in practice.” The USA, he explained, has a century-old tradition of marketing to racial and ethnic groups. “BiDil brought to the fore the notion that you can have ethnic markets not only in things like cigarettes and food, but also in pharmaceuticals,” Kahn commented.
However, despite BiDil's failure, the search for race-based therapies and diagnostics is not over. “What I have found is an increasing, almost exponential, rise in the use of racial and ethnic categories in biotechnology-related patents,” Kahn said. “A lot of these products are still in the pipeline. They're still patent applications, they're not out on the market yet, so it's hard to know how they'll play out.”
The growing knowledge of the human genome is also providing new opportunities to market medical products aimed at specific ethnic groups. The first bumpy steps were taken with screening for genetic risk factors for breast cancer.
Myriad Genetics (Salt Lake City, UT, USA) holds broad patents in the USA for breast-cancer screening tests that are based on mutations of the BRCA1 and BRCA2 genes, but it faced challenges in Europe, where critics raised concerns about the high costs of screening.
The European Patent Office initially granted Myriad patents for the BRCA1- and BRCA2-based tests in 2001, after years of debate. But it revoked the patent on BRCA1 in 2005, a decision that was itself reversed in 2009. In 2005, Myriad decided to narrow the scope of BRCA2 testing on the basis of ethnicity. The company won a patent to predict breast-cancer risk in Ashkenazi Jewish women on the basis of BRCA2 mutations, which occur in one in 100 of these women. Physicians offering the test are supposed to ask their patients whether they are in this ethnic group, and then pay a fee to Myriad.
Kahn said Myriad took this approach to package the test differently in order to protect its financial interests. However, he commented, the idea of ethnic profiling by asking women whether they identify themselves as Ashkenazi Jewish, and then paying extra for an ‘ethnic’ medical test, did not work in Europe. “It's ridiculous,” Kahn commented.
After the preliminary sequence of the human genome was published a decade ago, experts noted that humans were almost the same genetically, implying that race was irrelevant. In fact, the validity of race as a concept in science—let alone the use of the word—has been hotly debated. “Race, inasmuch as the concept ought to be used at all, is a social concept, not a biological one.
And using it as though it were a biological one is as much an ethical problem as a scientific problem,” commented Samia Hurst, a physician and bioethicist at Geneva University Medical School in Switzerland.
Photo: © Monalyn Gracia/Corbis
Citing a popular slogan: “There is no gene for race,” she noted, “there doesn't seem to be a single cluster of genes that fits with identification within an ethnic group, let alone with disease risks as well. We're also in an increasingly mixed world where many people—and I count myself among them—just don't know what to check on the box. If you start counting up your grandparents and end up with four different ethnic groups, what are you going to do? So there are an increasing number of people who just don't fit into those categories at all.”
Still, some dismiss criticism of racial profiling as political correctness that could potentially prevent patients from receiving proper care. Sally Satel, a psychiatrist in Washington, DC, USA, does not shy away from describing herself as a racially profiling physician and argues that it is good medicine. A commentator and resident scholar at the American Enterprise Institute (Washington, DC, USA), a nonpartisan conservative think tank, Satel wrote the book PC, M.D.: How Political Correctness is Corrupting Medicine. “In practicing medicine, I am not color blind. I take note of my patient's race. So do many of my colleagues,” she wrote in a New York Times article entitled “I am a racially profiling doctor” (Satel, 2002).
Satel noted in an interview that it is an undeniable fact that black people tend to have more renal disease, Native Americans have more diabetes and white people have more cystic fibrosis.
She said these differences can help doctors to decide which drugs to prescribe at which dose and could potentially lead researchers to discover new therapies on the basis of race.
Satel added that the mention of race and medicine makes many people nervous. “You can dispel that worry by taking pains to specify biological lineage. Simply put, members of a group have more genes in common than members of the population at large. Some day geneticists hope to be able to conduct genomic profiles of each individual, making group identity irrelevant, but until then, race-based therapeutics has its virtues,” she said. “Denying the relationship between race and medicine flies in the face of clinical reality, and pretending that we are all at equal risk for health problems carries its own dangers.”
However, Hurst contended that this approach may be good epidemiology, rather than racial profiling. Physicians therefore need to be cautious about using skin colour, genomic data and epidemiological data in decision making. “If African Americans are at a higher risk for hypertension, are you not going to check for hypertension in white people? You need to check in everyone in any case,” she commented.
Hurst said European physicians, similarly to their American colleagues, deal with race and racial profiling, albeit in a different way. “The way in which we struggle with it is strongly determined by the history behind what could be called the biases that we have. If you have been a colonial power, if the past is slavery or if the past or present is immigration, it does change some things,” she said. “On the other hand, you always have the difficulty of doing fair and good medicine in a social situation that has a kind of ‘them and us’ structure.
Because you're not supposed to do medicine in a ‘them and us’ structure, you're supposed to treat everyone according to their medical needs and not according to whether they're part of ‘your tribe’ or ‘another tribe’.”
Indeed, social factors largely determine one's health, rather than ethnic or genetic factors. August A. White III, an African-American orthopaedic surgeon at Harvard Medical School (Boston, MA, USA) and author of the book Seeing Patients: Unconscious Bias in Health Care, noted that race is linked to disparities in health care in the USA. A similar point can be made in Europe where, for example, Romani people face discrimination in several countries.
White said that although genetic research shows that race is not a scientific concept, the way people are labelled in society and how they are treated needs to be taken into account. “It'd be wonderful at some point if we can pop one's key genetic information into a computer and get a printout of which medications are best for them and which doses are best for them,” he commented. “In the meantime though, I advocate careful operational attempts to treat everyone as human beings and to value everyone's life, not devalue old people, or devalue women, or devalue different religious faiths, etc.”
Notwithstanding the scientific objections, a major obstacle for the concept of racial profiling has been the fact that the word ‘race’ itself is politically loaded, as a result of, among other things, the baggage of eugenics and Nazi racism and the legacies of slavery and colonialism. Richard Tutton, a sociologist at Lancaster University in the UK, said that the British scientists he interviewed for a Wellcome Trust project a few years ago prefer the term ethnicity to race. “Race is used in a legal sense in relation to inequality, but certainly otherwise, ethnicity is the preferred term, which obviously is different to the US,” he said.
“I remember having conversations with German academics and obviously in Germany you couldn't use the R-word.”
Jan Helge Solbakk, a physician, theologian and medical ethicist at the University of Oslo in Norway, said the use of the term race in Europe is a non-starter because it makes it impossible for the public and policy-makers to communicate. “I think in Europe it would be politically impossible to launch a project targeting racial differences on the genetic level. The challenge is to find not just a more politically correct concept, but a genetically more accurate concept, and to pursue such research questions,” he said. According to Kahn, researchers therefore tend to refer to ethnicity rather than race: “They're talking about European, Asian and African, but they're referring to it as ethnicity instead of race because they think somehow that's more palatable.”
Regardless, race-based medicine might just be a stepping stone towards more refined and accurate methods, with the advent of personalized medicine based on genomics, according to Leroy Hood, whose work has helped to develop tools to analyse the human genome. The focus of his institute—the Institute for Systems Biology (Seattle, WA, USA)—is to identify genetic variants that can inform and help patients to pioneer individualized health care.
“Race as a concept is disappearing with interbreeding,” Hood said. “Race distinction is going to slowly fade away. We can use it now because we have signposts for race, which are colour, fairness, kinkiness of hair, but compared to a conglomeration of things that define a race, those are very few features.
The race-defining features are going to be segregating away from one another more and more as the population becomes racially heterogeneous, so I think it's going to become a moot point.”
Hood instead advocates “4P” health care—“Predictive, Personalized, Preventive and Participatory.” “My overall feeling about the race-based correlations is that it is far more important to think about the individual and their individual unique spectra of health and wellness,” he explained. “I think we are not going to deal in the future with racial or ethnic populations; rather, medicine of the future is going to be focused entirely on the individual.”
Yet Arthur Caplan, Director of the Center for Bioethics at the University of Pennsylvania (Philadelphia, PA, USA), is skeptical about the prospects for both race-based and personalized medicine. “Race-based medicine will play a minor role over the next few years in health care because race is a minor factor in health,” he said. “It's not like we have a group of people who keel over dead at 40 who are in the same ethnic group.”
Caplan also argued that establishing personalized genomic medicine in a decade is a pipe dream. “The reason I say that is it's not just the science,” he explained. “You have to redo the whole health-care system to make that possible. You have to find manufacturers who can figure out how to profit from personalized medicine who are both in Europe and the United States. You have to have doctors that know how to prescribe them. It's a big, big revamping. That's not going to happen in 10 years.”
Hood, however, is more optimistic and plans to advance the concept with pilot projects; he believes that Europe might be the better testing ground. “I think the European systems are much more efficient for pioneering personalized medicine than the United States because the US health-care system is utterly chaotic. We have every combination of every kind of health care and health delivery.
We have no common shared vision,” he said. “In the end we may well go to Europe to persuade a country to really undertake this. The possibility of facilitating a revolution in health care is greater in Europe than in the United States.”

14.
Wolinsky H 《EMBO reports》2010,11(11):830-833
Sympatric speciation—the rise of new species in the absence of geographical barriers—remains a puzzle for evolutionary biologists. Though the evidence for sympatric speciation itself is mounting, an underlying genetic explanation remains elusive.
For centuries, the greatest puzzle in biology was how to account for the sheer variety of life. In his 1859 landmark book, On the Origin of Species, Charles Darwin (1809–1882) finally supplied an answer: his grand theory of evolution explained how the process of natural selection, acting on the substrate of genetic mutations, could gradually produce new organisms that are better adapted to their environment. It is easy to see how adaptation to a given environment can differentiate organisms that are geographically separated; different environmental conditions exert different selective pressures on organisms and, over time, the selection of mutations creates different species—a process that is known as allopatric speciation.
It is more difficult to explain how new and different species can arise within the same environment. Although Darwin never used the term sympatric speciation for this process, he did describe the formation of new species in the absence of geographical separation. “I can bring a considerable catalogue of facts,” he argued, “showing that within the same area, varieties of the same animal can long remain distinct, from haunting different stations, from breeding at slightly different seasons, or from varieties of the same kind preferring to pair together” (Darwin, 1859).
In the 1920s and 1930s, however, allopatric speciation and the role of geographical isolation became the focus of speciation research. Among those leading the charge was Ernst Mayr (1904–2005), a young evolutionary biologist, who would go on to influence generations of biologists with his later work in the field.
William Baker, head of palm research at the Royal Botanic Gardens, Kew in Richmond, UK, described Mayr as “one of the key figures to crush sympatric speciation.” Frank Sulloway, a Darwin Scholar at the Institute of Personality and Social Research at the University of California, Berkeley, USA, similarly asserted that Mayr's scepticism about sympatry was central to his career.
Since Mayr's death in 2005, however, several publications have challenged the notion that sympatric speciation is a rare exception to the rule of allopatry. These papers describe examples of both plants and animals that have undergone speciation in the same location, with no apparent geographical barriers to explain their separation. In these instances, a single ancestral population has diverged to the extent that the two new species cannot produce viable offspring, despite the fact that their ranges overlap. The debate about sympatric and allopatric speciation has livened up since Mayr's death, as Mayr's influence over the field has waned and as new tools and technologies in molecular biology have become available.
Sulloway, who studied with Mayr at Harvard University in the late 1960s and early 1970s, notes that Mayr's background in natural history and years of fieldwork in New Guinea and the Solomon Islands contributed to his perception that the bulk of the data supported allopatry. “Ernst's early career was in many ways built around that argument. It wasn't the only important idea he had, but he was one of the strong proponents of it.
When an intellectual stance exists where most people seem to have gotten it wrong, there is a tendency to sort of lay down the law,” Sulloway said.
Sulloway also explained that Mayr “felt that botanists had basically led Darwin astray because there is so much evidence of polyploidy in plants and Darwin turned in large part to the study of botany and geographical distribution in drawing evidence in The Origin.” Indeed, polyploidization is common in plants and can lead to ‘instantaneous’ speciation without geographical barriers.
In February 2006, the journal Nature simultaneously published two papers that described sympatric speciation in animals and plants, reopening the debate. Axel Meyer, a zoologist and evolutionary biologist at the University of Konstanz, Germany, demonstrated with his colleagues that sympatric speciation has occurred in cichlid fish in Lake Apoyo, Nicaragua (Barluenga et al, 2006). The researchers claimed that the ancestral fish only seeded the crater lake once; from this, new species have evolved that are distinct and reproductively isolated. Meyer's paper was broadly supported, even by critics of sympatric speciation, perhaps because Mayr himself endorsed sympatric speciation among the cichlids in his 2001 book What Evolution Is. “[Mayr] told me that in the case of our crater lake cichlids, the onus of showing that it's not sympatric speciation lies with the people who strongly believe in only allopatric speciation,” Meyer said.
The other paper in Nature—by Vincent Savolainen, a molecular systematist at Imperial College London, UK, and colleagues—described the sympatric speciation of Howea palms on Lord Howe Island (Fig 1), a minute Pacific island paradise (Savolainen et al, 2006a).
Savolainen's research had originally focused on plant diversity in the gesneriad family—the best-known example of which is the African violet—while he was in Brazil for the Geneva Botanical Garden, Switzerland. However, he realized that he would never be able to prove the occurrence of sympatry within a continent. “It might happen on a continent,” he explained, “but people will always argue that maybe they were separated and got together after. […] I had to go to an isolated piece of the world and that's why I started to look at islands.”
Figure 1 | Lord Howe Island. Photo: Ian Hutton.
He eventually heard about Lord Howe Island, which is situated just off the east coast of Australia, has an area of 56 km2 and is known for its abundance of endemic palms (Sidebar A). The palms, Savolainen said, were an ideal focus for sympatric research: “Palms are not the most diverse group of plants in the world, so we could make a phylogeny of all the related species of palms in the Indian Ocean, southeast Asia and so on.”

Sidebar A | Research in paradise

Alexander Papadopulos is no Tarzan of the Apes, but he has spent a couple of months over the past two years aloft in palm trees hugging rugged mountainsides on Lord Howe Island, a Pacific island paradise and UNESCO World Heritage site.
Papadopulos—who is finishing his doctorate at Imperial College London, UK—said the views are breathtaking, but the work is hard and a bit treacherous as he moves from branch to branch. “At times, it can be quite hairy. Often you're looking over a 600-, 700-metre drop without a huge amount to hold onto,” he said. “There's such dense vegetation on most of the steep parts of the island. You're actually climbing between trees. There are times when you're completely unsupported.”
Papadopulos typically spends around 10 hours a day in the field, carrying a backpack and utility belt with a digital camera, a trowel to collect soil samples, a first-aid kit, a field notebook, food and water, specimen bags, tags to label specimens, a GPS device and more. After several days in the field, he spends a day working in a well-equipped field lab and sleeping in the quarters that were built by the Lord Howe governing board to accommodate the scientists who visit the island on various projects. Papadopulos is studying Lord Howe's flora, which includes more than 200 plant species, about half of which are indigenous.
Vincent Savolainen said it takes a lot of planning to get materials to Lord Howe: the two-hour flight from Sydney is on a small plane, with only about a dozen passengers on board and limited space for equipment. Extra gear—from gardening equipment to silica gel and wood for boxes in which to dry wet specimens—arrives via other flights or by boat, to serve the needs of the various scientists on the team, including botanists, evolutionary biologists and ecologists.
Savolainen praised the well-stocked research station for visiting scientists. It is run by the island board and situated near the palm nursery.
It includes one room for the lab and another with bunks. “There is electricity and even email,” he said. Papadopulos said only in the past year has the internet service been adequate to accommodate video calls back home.
Ian Hutton, a Lord Howe-based naturalist and author who has lived on the island since 1980, said the island authorities set limits not only on the number of residents—350—but also on the number of visitors at one time—400—as well as banning cats, to protect birds such as the flightless wood hen. He praised the Imperial/Kew group: “They're world leaders in their field. And they're what I call ‘Gentlemen Botanists’. They're very nice people, they engage the locals here. Sometimes researchers might come here, and they're just interested in what they're doing and they don't want to share what they're doing. Not so with these people.” Savolainen said his research helps the locals: “The genetics that we do on the island are not only useful to understand big questions about evolution, but we also always provide feedback to help in its conservation efforts.”
Yet, in Savolainen's opinion, Mayr's influential views made it difficult to obtain research funding. “Mayr was a powerful figure and he dismissed sympatric speciation in textbooks. People were not too keen to put money on this,” Savolainen explained. Eventually, the Leverhulme Trust (London, UK) gave Savolainen and Baker £70,000 between 2003 and 2005 to get the research moving. “It was enough to do the basic genetics and to send a research assistant for six months to the island to do a lot of natural history work,” Savolainen said.
Once the initial results had been processed, the project received a further £337,000 from the British Natural Environment Research Council in 2008, and €2.5 million from the European Research Council in 2009.
From the data collected on Lord Howe Island, Savolainen and his team constructed a dated phylogenetic tree showing that the two endemic species of the palm Howea (Arecaceae; Fig 2) are sister taxa. From their tree, the researchers were able to establish that the two species—one with a thatch of leaves and one with curly leaves—diverged long after the island was formed 6.9 million years ago. Even where they are found in close proximity, the two species cannot interbreed, as they flower at different times.
Figure 2 | The two species of Howea palm. (A) Howea fosteriana (Kentia palm). (B) Howea belmoreana. Photos: William Baker, Royal Botanic Gardens, Kew, Richmond, UK.
According to the researchers, the palm speciation probably occurred owing to the different soil types in which the plants grow. Baker explained that there are two soil types on Lord Howe—the older volcanic soil and the younger calcareous soils. The Kentia palm grows in both, whereas the curly variety is restricted to the volcanic soil. These soil types are closely intercalated—fingers and lenses of calcareous soils intrude into the volcanic soils in lowland Lord Howe Island. “You can step over a geological boundary and the palms in the forest can change completely, but they remain extremely close to each other,” Baker said.
“What's more, the palms are wind-pollinated, producing vast amounts of pollen that blows all over the place during the flowering season—people even get pollen allergies there because there is so much of the stuff.” According to Savolainen, the fact that the two species have different flowering times is a “way of having isolation so that they don't reproduce with each other […] this is a mechanism that evolved to allow other species to diverge in situ on a few square kilometres.”
According to Baker, a causative link between the different soils and the altered flowering times has not been demonstrated, “but we have suggested that at the time of speciation, perhaps when calcareous soils first appeared, an environmental effect may have altered the flowering time of palms colonising the new soil, potentially causing non-random mating and kicking off speciation. This is just a hypothesis—we need to do a lot more fieldwork to get to the bottom of this,” he said. What is clear is that this is not allopatric speciation, as “the micro-scale differentiation in geology and soil type cannot create geographical isolation”, said Baker.
The results of the palm research caused something of a splash in evolutionary biology, although the study was not without its critics. Tod Stuessy, Chair of the Department of Systematic and Evolutionary Botany at the University of Vienna, Austria, has dealt with similar issues of divergence on Chile's Juan Fernández Islands—also known as the Robinson Crusoe Islands—in the South Pacific. From his research, he points out that on old islands, large ecological areas that once separated species—and caused allopatric speciation—could have since disappeared, diluting the argument for sympatry. “There are a lot of cases [in the Juan Fernández Islands] where you have closely related species occurring in the same place on an island, even in the same valley.
We never considered that they had sympatric origins because we were always impressed by how much the island had been modified through time,” Stuessy said. “What [the Lord Howe researchers] really didn't consider was that Lord Howe Island could have changed a lot over time since the origins of the species in question.” It has also been argued that one of the palm species on Lord Howe Island might have evolved allopatrically on a now-sunken island in the same oceanic region.
In their response to a letter from Stuessy, Savolainen and colleagues argued that erosion on the island has been mainly coastal and equal from all sides. “Consequently, Quaternary calcarenite deposits, which created divergent ecological selection pressures conducive to Howea species divergence, have formed evenly around the island; these are so closely intercalated with volcanic rocks that allopatric speciation due to ecogeographic isolation, as Stuessy proposes, is unrealistic” (Savolainen et al, 2006b). Their rebuttal has found support in the field. Evolutionary biologist Loren Rieseberg at the University of British Columbia in Vancouver, Canada, said: “Basically, you have two sister species found on a very small island in the middle of the ocean. It's hard to see how one could argue anything other than they evolved there. To me, it would be hard to come up with a better case.”
Whatever the reality, several scientists involved in the debate think that molecular biology could help to eventually resolve the issue. Savolainen said that the next challenges will be to determine which genes are responsible for speciation, and whether sympatric speciation is common. New sequencing techniques should enable the team to obtain a complete genomic sequence for the palms.
Savolainen said that next-generation sequencing is “a total revolution.” Using sequencing, he explained, the team “want to basically dissect exactly what genes are involved and what has happened […] Is it very special on Lord Howe and for this palm, or is [sympatric speciation] a more general phenomenon? This is a big question now. I think now we've found places like Lord Howe and [have] tools like the next-gen sequencing, we can actually get the answer.”

Determining whether sympatric speciation occurs in animal species will prove equally challenging, according to Meyer. His own lab, among others, is already looking for ‘speciation genes’, but this remains a tricky challenge. “Genetic models […] argue that two traits (one for ecological specialisation and another for mate choice, based on those ecological differences) need to become tightly linked on one chromosome (so that they don't get separated, often by segregation or crossing over). The problem is that the genetic basis for most ecologically relevant traits are not known, so it would be very hard to look for them,” Meyer explained. “But, that is about to change […] because of next-generation sequencing and genomics more generally.”

Others are more cautious. “In some situations, such as on isolated oceanic islands, or in crater lakes, molecular phylogenetic information can provide strong evidence of sympatric speciation. It also is possible, in theory, to use molecular data to estimate the timing of gene flow, which could help settle the debate,” Rieseberg said. However, he cautioned that although molecular data will add to the debate, it will not settle it alone. “We will still need information from historical biogeography, natural history, phylogeny, and theory, etc. to move things forward.”

Many researchers who knew Mayr personally think he would have enjoyed the challenge to his views.
“I can only imagine that it would've been great fun to engage directly with him [on sympatry on Lord Howe],” Baker said. “It's a shame that he wasn't alive to comment on [our paper].” In fact, Mayr was not really as opposed to sympatric speciation as some think. “If one is of the opinion that Mayr opposed all forms of sympatric speciation, well then this looks like a big swing back the other way,” Sulloway commented. “But if one reads Mayr carefully, one sees that he was actually interested in potential exceptions and, as best he could, chronicled which ones he thought were the best candidates.”

Mayr's opinions aside, many biologists today have stronger feelings against sympatric speciation than he did himself in his later years, Meyer added. “I think that Ernst was more open to the idea of sympatric speciation later in his life. He got ‘softer’ on this during the last two of his ten decades of life that I knew him. I was close to him personally and I think that he was much less dogmatic than he is often made out to be […] So, I don't think that he is spinning in his grave.” Mayr once told Sulloway that he liked to take strong stances, precisely so that other researchers would be motivated to try to prove him wrong. “If they eventually succeeded in doing so, Mayr felt that science was all the better for it.”

Figure: Alex Papadopulos and Ian Hutton doing fieldwork on a very precarious ridge on top of Mt. Gower. Photo: William Baker, Royal Botanic Gardens, Kew, Richmond, UK.

15.
16.
Wolinsky H 《EMBO reports》2012,13(4):308-312
Genomics has become a powerful tool for conservationists to track individual animals, analyse populations and inform conservation management. But as helpful as these techniques are, they are not a substitute for stricter measures to protect threatened species.

You might call him Queequeg. Like Herman Melville's character in the 1851 novel Moby Dick, Howard Rosenbaum plies the seas in search of whales following old whaling charts. Standing on the deck of a 12 m boat, he brandishes a crossbow with hollow-tipped darts to harpoon the flanks of the whales as they surface to breathe (Fig 1). “We liken it to a mosquito bite. Sometimes there's a reaction. Sometimes the whales are competing to mate with a female, so they don't even react to the dart,” explained Rosenbaum, a conservation biologist and geneticist, and Director of the New York City-based Wildlife Conservation Society's Ocean Giants programme. Rosenbaum and his colleagues use the darts to collect half-gram biopsy samples of whale epidermis and fat—about the size of a human fingernail—to extract DNA as part of international efforts to save the whales.

Figure 1. Howard Rosenbaum with a crossbow to obtain skin samples from whales. © Wildlife Conservation Society.

Like Rosenbaum, many conservation biologists and wildlife managers increasingly rely on DNA analysis tools to identify species, determine sex or analyse pedigrees. George Amato, Director of the Sackler Institute for Comparative Genomics at the American Museum of Natural History in New York, NY, USA, said that during his 25-year career, genetic tools have become increasingly important for conservation biology and related fields.
Genetic information, taken from individual animals and extending to whole populations, now plays a valuable part in decisions about levels of protection for certain species or populations, and in managing conflicts between humans and conservation goals.

Moreover, Amato expects the use and importance of genetics to grow even more, given that conservation of biodiversity has become a global issue. “My office overlooks Central Park. And there are conservation issues in Central Park: how do you maintain the diversity of plants and animals? I live in suburban Connecticut, where we want the highest levels of diversity within a suburban environment,” he said. “Then, you take this all the way to Central Africa. There are conservation issues across the entire spectrum of landscapes. With global climate change, techniques in genetics and molecular biology are being used to look at issues and questions across that entire landscape.”

Rosenbaum commented, “The genomic revolution has certainly changed the way we think about conservation and the questions we can ask and the things we can do. It can be a forensic analysis.” The data translates “into a conservation value where governments, conservationists, and people who actively protect these species can use this information to better protect these animals in the wild.”

Rosenbaum and colleagues from the Wildlife Conservation Society, the American Museum of Natural History and other organizations used genomics for the largest study so far of the population dynamics of humpback whales in the Southern hemisphere, based on more than 1,500 DNA samples [1].
The researchers analysed population structure and migration rates; they found the highest gene flow between whales that breed on either side of the African continent and a lower gene flow between whales on opposite sides of the Atlantic, from the Brazilian coast to southern Africa. The group also identified an isolated population of fewer than 200 humpbacks in the northern Indian Ocean off the Arabian Peninsula, which are only distantly related to the humpbacks breeding off the coast of Madagascar and the eastern coast of southern Africa. “This group is a conservation priority,” Rosenbaum noted.

He said the US National Oceanic and Atmospheric Administration is using this information to determine whether whale populations are recovering or endangered and what steps should be taken to protect them. Through wildlife management and protection, humpbacks have rebounded to 60,000 or more individuals from fewer than 5,000 in the 1960s. Rosenbaum's data will, among other things, help to verify whether the whales should be managed as one large group or divided into subgroups.

He has also been looking at DNA collected from dolphins caught in fishing nets off the coast of Argentina. Argentine officials will be using the data to make recommendations about managing these populations. “We've been able to demonstrate that it's not one continuous population in Argentina. There might be multiple populations that merit conservation protection,” Rosenbaum explained.

The sea turtle is another popular creature that is high on conservationists' lists. To get DNA samples from sea turtles, population geneticist and wildlife biologist Nancy FitzSimmons from the University of Canberra in Australia resorts to a simpler method than Rosenbaum's harpoon. “Ever hear of a turtle rodeo?” she asked.
FitzSimmons goes out on a speed boat in the Great Barrier Reef with her colleagues, dives into the water and wrangles a turtle on board so it can be measured, tagged, have its reproductive system examined with a laparoscope and a small skin sample removed with scissors or a scalpel for DNA analysis (Fig 2).

Figure 2. Geneticist Stewart Pittard measuring a sea turtle. © Michael P. Jensen, NOAA.

Like Rosenbaum, she uses DNA as a forensic tool to characterize individuals and populations [2]. “That's been a really important part, to be able to tell people who are doing the management, ‘This population is different from that one, and you need to manage them appropriately,’” FitzSimmons explained. The researchers have characterized the turtle's feeding grounds around Australia to determine which populations are doing well and which are not. If they see that certain groups are being harmed through predation or being trapped in ‘ghost nets’ abandoned by fishermen, conservation measures can be implemented.

FitzSimmons, who started her career studying the genetics of bighorn sheep, has recently been using DNA technology in other areas, including finding purebred crocodiles to reintroduce them into a wetland ecosystem at Cat Tien National Park in Vietnam. “DNA is invaluable. You can't reintroduce animals that aren't purebred,” she said, explaining the rationale for looking at purebreds. “It's been quite important to do genetic studies to make sure you're getting the right animals to the right places.”

Geneticist Hans Geir Eiken, senior researcher at the Norwegian Institute for Agricultural and Environmental Research in Svanvik, Norway, does not wrestle with the animals he is interested in. He uses a non-intrusive method to collect DNA from brown bears (Fig 3). “We collect the hair that is on the vegetation, on the ground. We can manage with only a single hair to get a DNA profile,” he said.
“We can even identify mother and cub in the den based on the hairs. We can collect hairs from at least two different individuals and separate them afterwards and identify them as separate entities. Of course we also study how they are related and try to separate the bears into pedigrees, but that's more research and it's only occasionally that we do that for [bear] management.”

Figure 3. Bear management in Scandinavia. (A) A brown bear in a forest in Northern Finland. © Alexander Kopatz, Norwegian Institute for Agricultural and Environmental Research. (B) Faecal sampling. Monitoring of bears in Norway is performed in a non-invasive way by sampling hair and faecal samples in the field, followed by DNA profiling. © Hans Geir Eiken. (C) Brown-bear hair sample obtained by so-called systematic hair trapping. A scent lure is put in the middle of a small area surrounded by barbed wire. To investigate the smell, the bears have to cross the wire and some hair is caught. © Hans Geir Eiken. (D) A female, 2.5-year-old bear that was shot at Svanvik in the Pasvik Valley in Norway in August 2008. She and her brother had started to eat from garbage cans after they left their mother, and the authorities gave permission to shoot them. The male was shot one month later after appearing in a schoolyard. © Hans Geir Eiken.

Eiken said the Norwegian government does not invest a lot of money in helicopters or other surveillance methods, and does not want to bother the animals. “A lot of disturbing things were done to bears. They were trapped. They were radio-collared,” he said. “I think as a researcher we should replace those approaches with non-invasive genetic techniques. We don't disturb them. We just collect samples from them.”

Eiken said that the bears pose a threat to the two million sheep that roam freely around Norway. “Bears can kill several tons of them every day. This is not the case in the other countries where they don't have free-ranging sheep.
That's why it's a big economic issue for us in Norway.” Wildlife managers therefore have to balance the fact that brown bears are endangered against the economic interests of sheep owners; about 10% of the brown bears are killed each year because they have caused damage, or as part of a restricted ‘licensed’ hunting programme. Eiken said that within two days of a sheep kill, DNA analysis can determine which species killed the sheep, and, if it is a bear, which individual. “We protect the females with cubs. Without the DNA profiles, it would be easy to kill the females, which also take sheep of course.”

It is not only wildlife management that interests Eiken; he was part of a group led by Axel Janke at the Biodiversity and Climate Research Centre in Frankfurt am Main, Germany, which completed sequencing of the brown bear genome last year. The genome will be compared with that of the polar bear in the hope of finding genes involved in environmental adaptation. “The reason why [the comparison is] so interesting between the polar bear and the brown bear is that if you look at their evolution, it's [maybe] less than one million years when they separated. In genetics that's not a very long time,” Eiken said. “But there are a lot of other issues that we think are even more interesting. Brown bears stay in their caves for 6 months in northern Norway. We think we can identify genes that allow the bear to be in the den for so long without dying from it.”

Like bears, wolves have also been clashing with humans for centuries. Hunters exterminated the natural wolf population in the Scandinavian Peninsula in the late nineteenth century as governments protected reindeer farming in northern Scandinavia.
After the Swedish government finally banned wolf hunting in the 1960s, three wolves immigrated from Finland and Russia in the 1980s; together with a few later immigrants that joined the highly inbred population, their descendants have grown into a population of about 250. Sweden now has a database of all individual wolves, their pedigrees and breeding territories to manage the population and resolve conflicts with farmers. “Wolves are very good at causing conflicts with people. If a wolf takes a sheep or cattle, or it is in a recreation area, it represents a potential conflict. If a wolf is identified as a problem, then the local authorities may issue a license to shoot that wolf,” said Staffan Bensch, a molecular ecologist and ornithologist at Lund University in Sweden.

Again, it is the application of genomics tools that informs conservation management for the Scandinavian wolf population. Bensch, who is best known for his work on population genetics and genomics of migratory songbirds, was called on to apply his knowledge of microsatellite analysis. The investigators collect saliva from the site where a predator has chewed or bitten the prey, and extract mitochondrial DNA to determine whether a wolf, a bear, a fox or a dog has killed the livestock. The genetic information can potentially serve as a death warrant if a wolf is linked with a kill, and as a basis for determining compensation for livestock owners.

Yet, not all wolves are equal. “If it's shown to be a genetically valuable wolf, then somehow more damage can be tolerated, such as a wolf taking livestock for instance,” Bensch said. “In the management policy, there is genetic analysis of every wolf that has a question on whether it should be shot or saved. An inbred Scandinavian wolf has no valuable genes so it's more likely to be shot.” Moreover, Bensch said that DNA analysis showed that in at least half the cases, dogs were the predator.
“There are so many more dogs than there are wolves,” he said. “Some farmers are prejudiced that it is the wolf that killed their sheep.”

According to Dirk Steinke, lead scientist at Marine Barcode of Life and an evolutionary biologist at the Biodiversity Institute of Ontario at the University of Guelph in Canada, DNA barcoding could also contribute to conservation efforts. The technique—usually based on comparing the sequence of the mitochondrial CO1 gene with a database—could help to address the growing trade in shark fins for wedding feasts in China and among the Chinese diaspora, for example. Shark fins confiscated by Australian authorities from Indonesian ships are often a mess of tissue; barcoding helps them to identify the exact species. “As it turns out, some of them are really in the high-threat categories on the IUCN Red List of Threatened Species, so it was pretty concerning,” Steinke said. “That is something where barcoding turns into a tool where wildlife management can be done—even if they only get fragments of an animal. I am not sure if this can prevent people from hunting those animals, but you can at least give them the feedback on whether they did something illegal or not.”

Steinke commented that DNA tools are handy not only for megafauna, but also for the humbler creatures in the sea, “especially when it comes to marine invertebrates. The larval stages are the only ones where they are mobile. If you're looking at wildlife management from an invertebrate perspective in the sea, then these mobile life stages are very important.
Their barcoding might become very handy because for some of those groups it's the only reliable way of knowing what you're looking at.” Yet, this does not necessarily translate into better conservation: “Enforcement reactions come much quicker when it's for the charismatic megafauna,” Steinke conceded.

Moreover, reliable identification of animal species could even improve human health. For instance, Amato and colleagues from the US Centers for Disease Control and Prevention demonstrated for the first time the presence of zoonotic viruses in non-human primates seized in American airports [3]. They identified retroviruses (simian foamy virus) and/or herpesviruses (cytomegalovirus and lymphocryptovirus), which potentially pose a threat to human health. Amato suggested that surveillance of the wildlife trade by using barcodes would help to prevent disease. Moreover, DNA barcoding could also show whether meat is from monkeys or other wild animals, to distinguish illegally hunted and traded bushmeat—the term used for meat from wild animals in Africa—from legal meats.

Amato's group also applied barcoding to bluefin tuna, commonly used in sushi, which he described as the “bushmeat of the developed world”, as the species is being driven to near extinction through overharvesting. Developing barcodes for tuna could help to distinguish bluefin from yellowfin or other tuna species and could assist measures to protect the bluefin. “It can be used sort of like ‘Wildlife CSI’ (after the popular American TV series),” he said.

In fact, barcoding for law enforcement is growing.
Mitchell Eaton, assistant unit leader at the US Geological Survey New York Cooperative Fish and Wildlife Research Unit in Ithaca, NY, USA, noted that the technique is being used by US government agencies such as the FDA and the US Fish & Wildlife Service, as well as African and South American governments, to monitor the illegal export of pets and bushmeat. It is also used as part of the United Nations' Convention on Biological Diversity for cataloguing the Earth's biodiversity, identifying pathogens and monitoring endangered species. He expects that more law enforcement agencies around the world will routinely apply these tools: “This is actually easy technology to use.”

In this way, barcoding, together with genetics and related technologies, helps to address a major challenge in conservation and protection: monitoring the size, distribution and migration of animal populations and analysing their genetic diversity. It gives biologists and conservationists a better picture of what needs extra protective measures, and gives enforcement agencies a new and reliable forensic tool to identify and track illegal hunting and trade of protected species. As helpful as these technologies are, however, they are not sufficient to protect severely threatened species such as the bluefin tuna and are therefore not a substitute for more political action and stricter enforcement.

17.
Wolinsky H 《EMBO reports》2010,11(12):921-924
The US still leads the world in stem-cell research, yet US scientists are facing yet another political and legal battle for federal funding to support research using human embryonic stem cells.

Disputes over stem-cell research have been standard operating procedure since James Thomson and John Gearhart created the first human embryonic stem-cell (hESC) lines. Their work triggered an intense and ongoing debate about the morality, legality and politics of using hESCs for biomedical research. “Stem-cell policy has caused craziness all over the world. It is a never-ending, irresolvable battle about the moral status [of embryos],” commented Timothy Caulfield, research director of the Health Law Institute at the University of Alberta in Edmonton, Canada. “We're getting to an interesting time in history where science is playing a bigger and bigger part in our lives, and it's becoming more controversial because it's becoming more powerful. We need to make some interesting choices about how we decide what kind of scientific inquiry can go forward and what can't go forward.”

The most contested battleground for stem-cell research has been the USA, since President George W. Bush restricted federal funding for research using hESCs. His successor, Barack Obama, eventually lifted those restrictions, but a pending lawsuit and the November congressional elections have once again thrown the field into jeopardy.

Three days after the election, the deans of US medical schools, chiefs of US hospitals and heads of leading scientific organizations sent letters to both the House of Representatives and the Senate urging them to pass the Stem Cell Research Advancement Act when they come back into session. The implication was to pass legislation now, while the Democrats were still in the majority.
Republicans, boosted in the election by the emerging fiscally conservative Tea Party movement, will be the majority in the House from January, changing the political climate. The Republicans also cut into the Democratic majority in the Senate.

Policies and laws to regulate stem-cell research vary between countries. Italy, for example, does not allow the destruction of an embryo to generate stem-cell lines, but it does allow research on such cells if they are imported. Nevertheless, the Italian government deliberately excluded funding for projects using hESCs from its 2009 call for proposals for stem-cell research. In the face of a legislative vacuum, this October, Science Foundation Ireland and the Health Research Board in Ireland decided not to consider grant applications for projects involving hESC lines. The UK is at the other end of the scale; it has legalized both research with and the generation of stem-cell lines, albeit under strict regulation by the independent Human Fertilisation and Embryology Authority. As Caulfield commented, the UK is “ironically viewed as one of the most permissive [on stem-cell policy], but is perceived as one of the most bureaucratic.”

Somewhere in the middle is Germany, where scientists are allowed to use several approved cell lines, but any research that leads to the destruction of an embryo is illegal. Josephine Johnston, director of research operations at the Hastings Center in Garrison, NY, USA—a bioethics centre—said: “In Germany you can do research on embryonic stem cells, but you can't take the cells out of the embryo. So, they import their cells from outside of Germany and to me, that's basically outsourcing the bit that you find difficult as a nation. It doesn't make a lot of sense ethically.”

Despite the public debates and lack of federal support, Johnston noted that the USA continues to lead the world in the field.
“[Opposition] hasn't killed stem-cell research in the United States, but it definitely is a headache,” she said. In October, physicians at the Shepherd Center, a spinal cord and brain injury rehabilitation hospital and clinical research centre in Atlanta, GA, USA, began to treat the first patient with hESCs. This is part of a clinical trial to test a stem-cell-based therapy for spinal cord injury, which was developed by the US biotechnology company Geron using cells derived from surplus in vitro fertilization embryos.

Nevertheless, the debate in the USA, where the executive, legislative and judicial branches of government all weigh in, is becoming confusing. “We're never going to have consensus [on the moral status of fetuses] and any time that stem-cell research becomes tied to that debate, there's going to be policy uncertainty,” Caulfield said. “That's what's happened again in the United States.”

Johnston commented that what makes the USA different is its distinction between federally funded and non-federally funded research. “It isn't much discussed within the United States, but it's a really dramatic difference to an outsider,” she said. She pointed out that, by contrast, in other countries the rules for stem-cell research apply across the board.

The election of Barack Obama as US President triggered the latest bout of uncertainty. The science community welcomed him with open arms; after all, he supports doubling the budget of the National Institutes of Health (NIH) over the next ten years, and he dismantled the policies of his predecessor that barred the agency from funding projects beyond the 60 extant hESC lines—only 21 of which were viable. Obama also called on Congress to provide legal backing and funding for the research.

The executive order had unforeseen consequences for researchers working with embryonic or adult stem cells.
Sean Morrison, Director of the University of Michigan's Centre for Stem Cell Biology (Ann Arbor, MI, USA), said he thought that Obama's executive order had swung open the door on federal support forever. “Everybody had that impression,” he said.

Leonard I. Zon, Director of the Stem Cell Program at Children's Hospital Boston (MA, USA), was so confident in Obama's political will that his laboratory stopped its practice of labelling liquid nitrogen containers as P (Presidential) and NP (non-Presidential) to avoid legal hassles. His lab also stopped purchasing and storing separate pipettes and culture dishes funded by the NIH and private sources such as the Howard Hughes Medical Institute (HHMI; Chevy Chase, MD, USA).

But some researchers who focused on adult cells felt that the NIH was now biased in favour of embryonic cells. Backed by pro-life and religious groups, two scientists—James Sherley of the Boston Biomedical Research Institute and Theresa Deisher of AVM Biotechnology (Seattle, WA)—questioned the legality of the new NIH rules and filed a lawsuit against the Department of Health and Human Services (HHS) Secretary, Kathleen Sebelius. Deisher had founded her company to “[w]ork to provide safe, effective and affordable alternative vaccines and stem-cell therapies that are not tainted by embryonic or electively aborted fetal materials” (www.avmbiotech.com).

Sherley argued in an Australian newspaper in October 2006 that the science behind embryonic stem-cell research is flawed, and he rejected arguments that the research will make available new cures for terrible diseases (Sherley, 2006). In court, the researchers also argued that they were irreparably disadvantaged in competing for government grants by their work on adult stem cells. Judge Royce C.
Lamberth of the District Court of the District of Columbia initially ruled that the plaintiffs had no grounds on which to sue. However, the US Court of Appeals for the District of Columbia overturned his decision, finding that the plaintiffs had standing “[b]ecause the Guidelines have intensified the competition for a share in a fixed amount” of NIH funding. With the case back in his court, Lamberth reversed his decision on August 23 this year, granting a preliminary injunction to block the new NIH guidelines on embryonic stem-cell work. The injunction rests on the 1995 Dickey-Wicker Amendment, an appropriations bill rider, which prohibits the HHS from funding “research in which a human embryo or embryos are destroyed, discarded or knowingly subjected to risk of injury or death.” By allowing the destruction of embryos, Lamberth argued, the NIH rules violate the law.

This triggered another wave of uncertainty as dozens of labs faced a freeze of federal funding. Morrison commented that an abrupt end to funding does not normally occur in biomedical research in the USA. “We normally have years of warning when grants are going to end so we can make a plan about how we can have smooth transitions from one funding source to another,” he said. Morrison—whose team has been researching Hirschsprung disease, a congenital enlargement of the colon—said his lab potentially faced a loss of US$250,000 overnight. “I e-mailed the people in my lab and said, ‘We may have just lost this funding and if so, then the project is over’”.

Morrison explained that the positions of two people in his lab were affected by the cut, along with a third person whose job was partly funded by the grant. “Even though it's only somewhere between 10–15% of the funding in my lab, it's still a lot of money,” he said.
“It's not like we have hundreds of thousands of dollars of discretionary funds lying around in case a problem like that comes up.” Zon noted that his lab, which had experienced an increase in the pace of discovery since Obama signed his order, reverted to its Bush-era practices.

On September 27 this year, a federal appeals court for the District of Columbia extended the stay of Lamberth's injunction to enable the government to pursue its appeal. The NIH was allowed to distribute US$78 million earmarked for 44 scientists during the appeal. The court said the matter should be expedited, but it could, over the years ahead, make its way to the US Supreme Court.

The White House welcomed the decision of the appeals court in favour of the NIH. “President Obama made expansion of stem-cell research and the pursuit of groundbreaking treatments and cures a top priority when he took office. We're heartened that the court will allow [the] NIH and their grantees to continue moving forward while the appeal is resolved,” said White House press secretary Robert Gibbs. The White House might have been glad of some good news, while it wrestles with the worst economic downturn since the Great Depression and the rise of the Tea Party movement.

Timothy Kamp, whose lab at the University of Wisconsin (Madison, WI, USA) researches embryonic stem-cell-derived cardiomyocytes, said that he finds the Tea Party movement confusing. “It's hard for me to know what a uniform platform is for the Tea Party. I've heard a few comments from folks in the Tea Party who have opposed stem-cell research,” he said. However, the position of the Tea Party on the topic of stem-cell research could prove to be of vital importance.
The Tea Party took its name from the Boston Tea Party, the famous 1773 protest against the British Tea Act and its attempt to extract yet more taxes from the American colonies, during which protesters dressed up as Native Americans and threw tea into Boston harbour. Contemporary Tea Party members tend to have a longer list of complaints, but generally want to reduce the size of government and cut taxes. Their increasing popularity in the USA and the success of many Tea Party-backed Republican candidates in the upcoming congressional election could jeopardize Obama’s plans to pass new laws to regulate federal funding for stem-cell research.

Even without a formal position on the matter, the Tea Party has had an impact on stem-cell research through its electoral victories. Perhaps their most high-profile candidate was the telegenic Christine O’Donnell, a Republican Senatorial candidate from Delaware. The Susan B. Anthony List, a pro-life women’s group, has described O’Donnell as one of “the brightest new stars” opposing abortion (www.lifenews.com/state5255.html). Although O’Donnell was eventually defeated in the 2 November congressional election, by winning the Republican primary in August she knocked out nine-term Congressman and former Delaware governor Mike Castle, a moderate Republican known for his willingness to work with Democrats to pass legislation to protect stem-cell research.

In the past, Castle and Diana DeGette, a Democratic representative from Colorado, co-sponsored the Stem Cell Research Advancement Act to expand federal funding of embryonic stem-cell research. They aimed to support Obama’s executive order and “ensure a lasting ethical framework overseeing stem cell research at the National Institutes of Health”.

Morrison described Castle as “one of the great public servants in this country—no matter what political affiliation you have.
For him to lose to somebody with such a chequered background and such shaky positions on things like evolution and other issues is a tragedy for the country.” Another stem-cell research advocate, Pennsylvania Senator Arlen Specter, a Republican-turned-Democrat, was also defeated in the primary. He had introduced legislation in September to codify Obama’s order. Specter, a cancer survivor, said his legislation was aimed at removing the “great uncertainty in the research community”.

According to Sarah Binder, a political scientist at George Washington University in Washington, DC, the chances of passing legislation to codify the Obama executive order are decreasing: “As the Republican Party becomes more conservative and as moderates can’t get nominated in that party, it does lead you to wonder whether it’s possible to make anything happen [with the new Congress] in January.”

There are a variety of opinions about how the outcome of the November elections will influence stem-cell policies. Binder said that a number of prominent Republicans have strongly promoted stem-cell research, including the Reagan family. “This hasn’t been a purely Democratic initiative,” she said. “The question is whether the Republican party has moved sufficiently to the right to preclude action on stem cells.” Historically there was “massive” Republican support for funding bills in 2006 and 2007 that were ultimately vetoed by Bush, she noted.

…the debate about public funding for stem-cell research is only part of the picture, given the role of private business and states

“Rightward shifts in the House and Senate do not bode well for legislative efforts to entrench federal support for stem-cell research,” Binder said. “First, if a large number of Republicans continue to oppose such funding, a conservative House majority is unlikely to pursue the issue.
Second, Republican campaign commitments to reduce federal spending could hit the NIH and its support for stem-cell research hard.”

Binder added that “a lingering unknown” is how the topic will be framed: “If it gets framed as a pro-choice versus pro-life initiative, that’s quite difficult for Congress to overcome in a bipartisan way. If it is framed as a question of medical research and medical breakthroughs and scientific advancement, it won’t fall purely on partisan lines. If members of Congress talk about their personal experiences, such as having a parent affected by Parkinson’s, then you could see even pro-life members voting in favour of a more expansive interpretation of stem-cell funding.”

Johnson said that Congress could alter the wording of the Dickey-Wicker Amendment when passing the NIH budget for 2011 to remove the conflict. “You don’t have to get rid of the amendment completely, but you could rephrase it,” she said. She also commented that the public essentially supports embryonic stem-cell research. “The polls and surveys show the American public is morally behind there being some limited form of embryonic stem-cell research funded by federal money. They don’t favour cloning. There is not a huge amount of support for creating embryos from scratch for research. But there seems to be pretty wide support among the general public for the kind of embryonic stem-cell research that the NIH is currently funding.”

In the end, however, the debate about public funding for stem-cell research is only part of the picture, given the role of private business and states. Glenn McGee, a professor at the Center for Practical Bioethics in Kansas City, MO, USA, and editor of the American Journal of Bioethics, commented that perhaps too much emphasis is being put on federal funding. He said that funding from states such as California and from industry—which are not restricted—has become a more important force than NIH funding.
“We’re a little bit delusional if we think that this is a moment where the country is making a big decision about what’s going to happen with stem cells,” he said. “I think that ship has sailed.”  相似文献   

19.
Wolinsky H 《EMBO reports》2011,12(8):772-774
With large charities such as the Wellcome Trust or the Gates Foundation committed to funding research, is there a risk that politicians could cut public funding for science?

Towards the end of 2010, with the British economy reeling from the combined effects of the global recession, the burst bubble of property speculation and a banking crisis, the country came close to cutting its national science and research budget by up to 25%. UK Business Secretary Vince Cable argued, “there is no justification for taxpayers’ money being used to support research which is neither commercially useful nor theoretically outstanding” (BBC, 2010). The outcry from UK scientists was both passionate and reasoned until, in the end, the British budget slashers blinked and the UK government backed down. The Chancellor of the Exchequer, George Osborne, announced in October that the government would freeze science and research funding at £4.6 billion per annum for four years, although even this represents about a 10% cut in real terms because of inflation.

“there is no justification for taxpayers’ money being used to support research which is neither commercially useful nor theoretically outstanding”

There has been a collective sigh of relief. Sir John Savill, Chief Executive of the Medical Research Council (UK), said: “The worst projections for cuts to the science budget have not been realised. It’s clear that the government has listened to and acted on the evidence showing investment in science is vital to securing a healthy, sustainable and prosperous future.”

Yet Britain is unusual compared with its counterparts elsewhere in the European Union (EU) and the USA, because private charities, such as the Wellcome Trust (London, UK) and Cancer Research UK (London, UK), already have budgets that rival those of their government counterparts.
It was this fact, coupled with UK Prime Minister David Cameron’s idea of the ‘big society’—a vision of smaller government, increased government–private partnerships and a bigger role for non-profit organizations, such as single-disease-focused charities—that led the British government to contemplate reducing its contribution to research, relying on the private sector to pick up the slack.

Jonathan Grant, president of RAND Europe (London, UK)—a not-for-profit research institute that advises on policy and decision-making—commented: “There was a strong backlash and [the UK Government] pulled back from that position [to cut funding]. But that’s the first time I’ve really ever seen it floated as a political idea; that government doesn’t need to fund cancer research because we’ve got all these not-for-profits funding it.”

“…that’s the first time I’ve really ever seen it floated as a political idea; that government doesn’t need to fund cancer research because we’ve got all these not-for-profits funding it”

But the UK was not alone in mooting the idea that research budgets might have to suffer under the financial crisis. Some had worried that declining government funding of research would spread across the developed world, although the worst of these fears have not been realized.

Peter Gruss, President of the Max Planck Society (Munich, Germany), explained that his organization receives 85% of its budget of more than €1.5 billion from the public purses of the German federal government, German state ministries and the EU, and that not all governments have backed away from their commitment to research. In fact, during the crisis, the German and US governments boosted their funding of research with the goal of helping the economic recovery. In 2009, German Chancellor Angela Merkel’s government, through negotiation with the German state science ministries, approved a windfall of €18 billion in new science funding, to be spread over the next decade.
Similarly, US President Barack Obama’s administration boosted spending on research with a temporary stimulus package for science, through the American Recovery and Reinvestment Act. Even so, Harry Greenberg, Senior Associate Dean for Research at Stanford University (California, USA), pointed out that until the US government injected stimulus funding, the budget at the National Institutes of Health (NIH; Bethesda, Maryland, USA) had essentially “been flat as a pancake for five or six years, and that means that it’s actually gone down and it’s having an effect on people being able to sustain their research mission.”

Gruss likewise said that the research community should remain vigilant. “I think one could phrase it as there is a danger. If you look at Great Britain, there is the Wellcome Trust, a very strong funding organization for life sciences and medical-oriented, health-oriented research. I think it’s in the back of the minds of the politicians that there is a gigantic foundation that supports that [kind of research]. I don’t think one can deny that. There is an atmosphere that people like the Gates family [Bill and Melinda Gates Foundation] invests in health-related issues, particularly in the poorer countries, [and that] maybe that is something that suffices.”

The money available for research from private foundations and charities is growing in both size and scope. According to Iain Mattaj, Director General of the European Molecular Biology Laboratory (EMBL; Heidelberg, Germany), this growth might not be a bad thing. As he pointed out, private funding often complements government funding, with charities such as the Wellcome Trust going out of their way to leverage government spending without reducing government contributions. “My feeling is that the reason that the UK government is freezing research funding has all to do with economics and nothing to do with the fact that there are potentially private funders,” he said.
“Several very large charities in particular are putting a lot of money into health research. The Gates Foundation is the biggest that has just come on the scene, but the Howard Hughes Medical Institute [HHMI; Chevy Chase, Maryland, USA] and the Wellcome Trust are very big, essentially private charities which have their own agendas.”

…charities such as the Wellcome Trust [go] out of their way to leverage government spending without reducing government contributions

But, as he explained, these charities actually contribute to the overall health research budget, rather than substituting funds from one area to another. In fact, they often team up to tackle difficult research questions in partnership with each other and with government. Two-thirds of the €140 million annual budget of EMBL comes from the European states that agree to fund it, with additional contributions from private sources such as the Wellcome Trust and public sources such as the NIH.

Yet over the years, as priorities have changed, the focus of those partnerships and the willingness to spend money on certain research themes or approaches has shifted, both within governments and in the private sector. Belief in the success of US President Richard Nixon’s famous ‘war on cancer’, for example, has waned over the years, although the fight and the funding continue. “I don’t want to use the word political, because of course the decisions are sometimes political, but actually it was a social priority to fight cancer. It was a social priority to fight AIDS,” Mattaj commented. “For the Wellcome Trust and the Gates Foundation, which are fighting tropical diseases, they see that as a social necessity, rather than a personal interest if you like.”

Nevertheless, Mattaj is not surprised that there is an inclination to reduce research spending in the UK and many smaller countries battered by the economic downturn.
“Most countries have to reduce public spending, and research is public spending. It may be less badly hit than other aspects of public spending. [As such] it’s much better off than many other aspects of public spending.”

A shift away from government funding towards private funding, especially from disease-focused charities, worries some observers that less money will be available for basic, curiosity-driven research—a move from pure research to ‘cure’ research. Moreover, charities are often just as vulnerable to economic downturns, so relying on them is no guarantee of funding in harsh economic times. Indeed, greater reliance on private funding would be a return to the era of ‘gentlemen scientists’ and their benefactors (Sidebar A).

Sidebar A | Gentlemen scientists

Greater reliance on private funding would return science to a bygone age of gentlemen scientists relying on the largesse of their wealthy sponsors. In 1831, for example, naturalist Charles Darwin’s (1809–1882) passage on the HMS Beagle was paid for by his father, albeit reluctantly. According to Laura Snyder, an expert on Victorian science and culture at St John’s University (New York, USA), by the time Darwin returned to England in 1836, the funding game had changed and government and private scientific societies had begun to have a bigger role. When Sir John Frederick William Herschel (1791–1871), an English mathematician, astronomer, chemist, experimental photographer and inventor, journeyed to the Cape Colony in 1833, the British government offered to give him a free ride aboard an Admiralty ship. “Herschel turned them down because he wanted to be free to do whatever he wanted once he got to South Africa, and he didn’t want to feel beholden to government to do what they wanted him to do,” Snyder explained, drawing from her new book The Philosophical Breakfast Club, which covers the creation of the modern concept of science.

Charles Babbage (1791–1871), the mathematician, philosopher, inventor and mechanical engineer who originated the concept of a programmable computer, was a member of the same circle as Herschel and William Whewell (1794–1866), a polymath, geologist, astronomer and theologian who coined the word ‘scientist’. Although he was wealthy, having inherited £100,000 in 1827—valued at about £13.3 million in 2008—Babbage felt that the government should help pay for his research that served the public interest.

“Babbage was asking the government constantly for money to build his difference engine,” Snyder said. Babbage griped about feeling like a tradesman begging to be paid. “It annoyed him. He felt that the government should just have said, ‘We will support the engine, whatever it is that you need, just tell us and we’ll write you a check’.
But that’s not what the government was about to do.” Instead, the British government expected Babbage to report on his progress before it loosened its purse strings. Snyder explained, “What the government was doing was a little bit more like grants today, in the sense that you have to justify getting more money and you have to account for spending the money. Babbage just wanted an open pocketbook at his disposal.” In the end the government donated £17,000, and Babbage never completed the machine.

Janet Rowley, a geneticist at the University of Chicago, is worried that the change in funding will make it more difficult to obtain money for the kind of research that led to her discovery in the 1970s of the first chromosomal translocations that cause cancer. She calls such work ‘fishing expeditions’. She said that the Leukemia & Lymphoma Society (White Plains, New York, USA), for example—a non-profit funder of research—has modified its emphasis: “They have now said that they are going to put most of their resources into translational work and trying to take ideas that are close to clinical application, but need what are called incubator funds to ramp up from a laboratory to small-scale industrial production to increase the amount of compound or whatever is required to do studies on more patients.”

This echoes Vince Cable’s view that taxpayers should not have to spend money on research that is not of direct economic, technological or health benefit to them. But if neither charities nor governments are willing to fund basic research, then who will pay the bill?

…if neither charities nor governments are willing to fund basic research, then who will pay the bill?

Iain Mattaj believes that the line between pure research and cure research is actually too blurred to make these kinds of funding distinctions. “In my view, it’s very much a continuum. I think many people who do basic research are actually very interested in the applications of their research.
That’s just not their expertise,” he said. “I think many people who are at the basic end of research are more than happy to see things that they find out contributing towards things that are useful for society.”

Jack Dixon, Vice President and Chief Scientific Officer at HHMI, also thinks that the line is blurry: “This divide between basic research and translational research is somewhat arbitrary, somewhat artificial in nature. I think every scientist I know who makes important, basic discoveries likes to [...] see their efforts translate into things that help humankind. Our focus at the Hughes has always been on basic things, but we love to see them translated into interesting products.” Even so, HHMI spends less than US$1 billion annually on research, which is overshadowed by the $30 billion spent by the NIH and the relatively huge budgets of the Wellcome Trust and Cancer Research UK. “We’re a small player in terms of the total research funding in the US, so I just don’t see the NIH pulling back on supporting research,” Dixon said.

By way of example, Brian Druker, Professor of Medicine at the Oregon Health & Science University (Portland, Oregon, USA) and a HHMI scientist, picked up on Rowley’s work with cancer-causing chromosomal translocations and developed the blockbuster anti-cancer drug imatinib, marketed by Novartis. “Brian Druker is one of our poster boys in terms of the work he’s done and how that is translated into helping people live longer lives that have this disease,” Dixon commented.

There is a similar view at Stanford. The distinction between basic and applied is “in the eye of the beholder,” Greenberg said. “Basic discovery is the grist for the mill that leads to translational research and new breakthroughs. It’s always been a little difficult to convey, but at least here at Stanford, that’s number one.
Number two, many of our very basic researchers enjoy thinking about the translational or clinical implications of their basic findings and some of them want to be part of doing it. They want some benefit for mankind other than pure knowledge.”

“Basic discovery is the grist for the mill that leads to translational research and new breakthroughs”

Had it not backed down from the massive cuts that were proposed, the UK Government might have found its intention to cut funding for basic, rather than applied, research difficult to implement. Identifying which research will be of no value to society is like trying to decide which child will grow up to be Prime Minister. Nevertheless, most would agree that governments have a duty to get value for money for the taxpayer, but defining the value of research in purely economic or translational terms is both short-sighted and near impossible. Even so, science is feeling the economic downturn and budgets are tighter than they have been for a long time. As Greenberg concluded, “It’s human nature when everybody is feeling the pinch that you think [yours] is bigger than the next guy’s, but I would be hard pressed to say who is getting pinched, at least in the biomedical agenda, more than who else.”  相似文献   

20.
Philip Hunter 《EMBO reports》2010,11(8):583-586
Current research aims to produce traditional biofuels from algae, but their potential to generate sustainable energy might be even greater and more ‘natural’

At the time of writing, oil continues to pour into the Gulf of Mexico. It is one of the worst environmental disasters in human history and a shocking reminder of the costs of our addiction to fossil fuels. However, alternative sources of sustainable energy, such as wind, waves and sunshine, cannot alone replace fossil fuels in the short or even medium term. As nuclear fusion is bogged down by almost intractable engineering challenges, and nuclear fission produces toxic and radioactive waste, research has focused increasingly on converting solar energy into electricity or fuels through photosynthesis—either through the use of artificial compounds that mimic the process, or through bioengineered organisms that do it ‘naturally’.

…research has focused increasingly on converting solar energy into electricity or fuels through photosynthesis…

In the ‘natural’ camp, microalgae—single-celled algae—have emerged as the most promising candidates, mainly because of their potential for converting solar energy more efficiently and with less negative environmental impact than the alternatives, especially biofuel crops such as corn and soy. Cyanobacteria—photosynthesizing prokaryotes, rather than single-celled eukaryotes—also hold promise in this regard. However, as Ben Graziano, technology commercialization manager at the Carbon Trust, an independent non-profit company set up by the UK government to develop low-carbon energy technologies, pointed out: “We may look at cyanobacteria in the future […] but they produce different co-products and we need to look at those when producing a commercial case for biofuel production.”

Perhaps surprisingly, the principal foundations of algae biofuel research were laid in the USA during the presidency of George W.
Bush, particularly at the US National Renewable Energy Laboratory (NREL; Golden, CO), the largest federal agency dedicated to research on alternative energy. The interest in algae was triggered by the growing conviction that microalgae could greatly reduce the amount of land or water surface needed to produce sustainable energy, according to Mike Seibert, research fellow at NREL. “Corn grain ethanol—a current biofuel—has a solar energy conversion efficiency of about 0.05%, and thus has a huge land footprint,” he explained. “Replacing all the gasoline used in the USA with corn grain ethanol would take a corn field 1,000 miles (1,600 km) a side. Algae on the other hand have [a] theoretical conversion efficiency of 10% and in practice, 2%, and so could replace all US gasoline in an area 110 miles (176 km) a side.”

Given this promise, Europe is racing to catch up with the USA. A lobbying group, the European Algae Biomass Association (EABA), has been established with support from the European Commission to promote research and generate funding, demonstrating confidence that the commercial production of algae biofuels can be achieved, perhaps within as little as a decade. In the UK, the Carbon Trust has established a programme to achieve commercial-scale production of biofuels from algae by 2020. “I think by then it will have achieved parity with current biofuels, reaching US$1 per litre production costs, about 10 times cheaper than is possible with algae today,” Graziano said.

But the large-scale potential of algae biofuels remains unproven and requires more fundamental research, cautioned Pierre-Antoine Vernon, project manager of the European Biodiesel Board (EBB), a non-profit organization in Brussels, Belgium, set up in 1997 by biofuel producers to promote the development and use of biofuel in the European Union (EU).
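Seibert's comparison follows from a simple scaling law: the land area needed scales inversely with solar conversion efficiency, so the side of a square field scales with one over the square root of efficiency. The sketch below is our own back-of-envelope illustration, not from the article; it assumes equal insolation and equal energy demand, anchored to the quoted corn-ethanol figure. Note that this naive scaling gives roughly 160 miles a side at 2% efficiency, so Seibert's quoted 110-mile figure evidently folds in additional yield assumptions beyond raw conversion efficiency.

```python
import math

def side_miles(efficiency, ref_side_miles=1000.0, ref_efficiency=0.0005):
    """Side of a square field delivering the same fuel energy, assuming the
    required area scales inversely with solar conversion efficiency.
    Anchored to Seibert's reference point: corn grain ethanol at ~0.05%
    efficiency needs a field roughly 1,000 miles on a side."""
    area_ratio = ref_efficiency / efficiency  # higher efficiency -> smaller area
    return ref_side_miles * math.sqrt(area_ratio)

print(side_miles(0.0005))  # corn ethanol: 1,000 miles by construction
print(side_miles(0.02))    # algae, 2% practical efficiency: ~158 miles a side
print(side_miles(0.10))    # algae, 10% theoretical efficiency: ~71 miles a side
```

The square root appears because area shrinks 40-fold going from 0.05% to 2% efficiency, so the side length shrinks by √40 ≈ 6.3.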
“It should be kept in mind that this is not yet a mature technology, as indicated by the diversity of algae strains, processing techniques and end products, which are typical for a nascent industry sector trying to identify the right technological path to the objective pursued,” he said.

The interest in algae was triggered by the growing conviction that microalgae could greatly reduce the amount of land or water surface needed to produce sustainable energy…

There are also regulatory and commercial factors that might inhibit large-scale deployment of algae farms for biofuel production. “You should not underestimate the regulatory barriers to the introduction of new technologies,” Vernon said. “The EBB is currently facing strong opposition from the oil and car industries in the context of the technical standardisation for biodiesel and diesel.”

Such opposition is rooted partly in the vested interests of the oil industry, but also in a natural desire to raise the bar when it comes to monitoring the safety and environmental suitability of biofuels, which must be seen to be squeaky clean and as carbon neutral as possible. “Biofuels use is under scrutiny wherever they are used, even while they represent a mere 5% of fuels used in the EU,” Vernon said. “By contrast, the remaining 95% of fossil fuels are still free from sustainability reporting, and even massive oil spills with incomparably higher consequences on biodiversity and the environment are not likely to prompt the introduction of sustainability criteria.”

There are also regulatory and commercial factors that might inhibit large-scale deployment of algae farms for biofuel production

This last point is now being put to the test by the BP spillage in the Gulf of Mexico; US President Barack Obama, in his Oval Office speech in June, called for a new focus on alternative sources of energy.
Yet even this is a double-edged sword for the biofuel industry, according to Mike Griffin, an expert on the impact of oil spills from Carnegie Mellon University (Pittsburgh, PA, USA). “For the next five years you will see more money in oil spill effects work,” he said. “More money flows after each major spill until the politicians forget. This could mean less money for everything else since, with our economic situation, the pie is shrinking.”

Nevertheless, the future of algae biofuel research seems secure, even if the extent of funding depends on larger economic factors. Apart from energy conversion efficiency, algae could score from other by-products that would improve the economics of production. As Vernon noted, the economics of algae is similar to that of current biofuels in the sense that you need to find applications for the main product—the oil used to make biofuels—and the by-products, mainly protein and carbohydrates. “For soybean, which was cultivated to produce soybean meal to feed cattle long before biofuels existed, an application was already there. For algae, the challenge is to find a species whose ‘algae meal’ can be used before considering biofuels production.” Promisingly, it looks as if the ‘algae meal’ too could be used to feed animals (Becker, 2007).

In addition to protein and the oils that are used for biofuels, algae also produce carbohydrates, which could be used to produce biogas: methane and carbon dioxide. “You can recycle the CO2 back into the system, and burn the methane to produce electricity, yielding water and more CO2, which again would go back into the algae pond,” Graziano explained.

Furthermore, the conversion of lipids into biofuels can, as Vernon pointed out, be accomplished by using methods established for biodiesel production from plants.
“That is one way to harness the potential of algae, as one possible feedstock for biodiesel production, by making them produce lipids that can in turn be trans-esterified into biodiesel,” he said. “Trans-esterification is a rather simple chemical reaction, for which tried-and-tested production technology emitting little greenhouse gas is available.”

There is also the more remote possibility of generating electricity directly from algae. Researchers from Stanford University in the USA and Yonsei University in Seoul, South Korea, inserted gold nanoelectrodes into individual cells, drawing one picoampere (10^−12 A) of current from each (Ryu et al, 2010). At this level it would take about a trillion photosynthesizing cells more than an hour to generate the amount of energy stored in a single AA battery. Yet, as the study’s lead author Won Hyoung Ryu from Yonsei University pointed out, electricity could be generated more efficiently by cutting out the intermediate step of producing biofuels, or even by creating hydrogen as a direct output—the hydrogen would still need to be burned first. “The extraction of photosynthetic electrons requires fewer energy conversion steps compared with hydrogen-based electricity production that requires at least three steps such as solar to hydrogen, hydrogen to heat, and finally heat to electricity,” Ryu said. “Every conversion step involves a certain degree of energy loss.”

But, as Ryu conceded, there are fundamental challenges to overcome: “First, we need to find a way to access the thylakoid membranes of millions of cells in parallel to obtain practically meaningful energy. Second, we still use external energy—overvoltage—to extract the photosynthetic electrons.” At present, energy has to be put in before it can be extracted—an issue that certainly needs to be resolved if microalgae are ever to be used as constituents of self-charging batteries, for example.
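The trillion-cells figure can be sanity-checked with some rough arithmetic. This sketch is our own illustration, not from the paper; the assumed AA capacity of 2.5 Ah is a typical value, and the calculation ignores the difference between the cells' output voltage and the battery's operating voltage:

```python
# Rough sanity check of the "trillion cells vs. one AA battery" comparison.
CURRENT_PER_CELL_A = 1e-12   # one picoampere drawn per algal cell (Ryu et al, 2010)
N_CELLS = 1e12               # a trillion photosynthesizing cells in parallel

total_current_a = CURRENT_PER_CELL_A * N_CELLS   # roughly 1 ampere in total

AA_CAPACITY_AH = 2.5         # assumed typical AA battery capacity (~2-3 Ah)
hours_to_match = AA_CAPACITY_AH / total_current_a

print(f"Total current: {total_current_a:.1f} A")
print(f"Hours to match one AA battery's charge: {hours_to_match:.1f}")
```

A trillion cells at one picoampere each supply about one ampere, so matching the charge of one AA battery takes a couple of hours, consistent with the article's "more than an hour".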
Doing so would also involve other challenges, such as dealing with dead cells and waste products, which would have to be recycled within the battery.

Apart from energy conversion efficiency, algae could score from other by-products that would improve the economics of production

In the shorter term, microalgae will therefore be used to produce ‘traditional’ biofuels, given the proven advantages of algae over land plants. According to Anastasios Melis, whose laboratory at the University of California, Berkeley, USA, specializes in microalgae, cyanobacteria and plant photosynthesis: “Proven commercial scale productivities of microalgae and cyanobacteria are much better than those of plants because of the ‘carpeting effect’ […] Also, microalgae and cyanobacteria do not invest photosynthate into roots, which is biomass that cannot be harvested or exploited. There may also be secondary reasons for the efficiency advantage, such as the fact that larger plants are often limited by the supply of carbon dioxide, since their stomata tend to close under bright sunlight to protect the tissues against photo damage.”

Yet microalgae also show reduced photosynthetic efficiency under bright sunlight. The reason is that most algal species have adapted to the low light levels below the surface of the ocean by developing large chlorophyll-based antennae that harvest as much of the limited light available as possible. Under stronger sunlight, a lot of energy is wasted because the cell is incapable of converting it all, and most of the excess is dissipated as heat. This wasteful process also mops up the incoming radiation and prevents it from reaching cells at greater depths, further limiting the scope for the high cell densities that are necessary to increase energy conversion.

Melis and colleagues have tackled this problem by engineering strains with shorter light-harvesting antennae, using DNA insertion mutagenesis in a model species, Chlamydomonas reinhardtii (Melis, 2007).
This technique has a long history of use in gene discovery, but its application to developing algal cells that convert energy more efficiently is new. The fundamental idea is to create random mutations and identify those that generate the desired phenotype, in this case shorter light-harvesting antennae. Melis also inserted an exogenous DNA tag alongside the new base pairs, enabling him to locate the genomic DNA flanking the mutation. If the resulting cell has truncated antennae, the gene affected by that mutation can then be identified as one associated with the development of the light-harvesting apparatus.

But, as with all mutations, there is a high probability that these will cause other, less desirable phenotypic changes in addition to the shortened antennae. Indeed, such changes have often turned out to include reduced photosynthetic efficiency, defeating the object of the exercise. In response, Melis developed screening processes to identify strains with truncated antennae but fully functioning photosynthesis. This entails visually inspecting candidate colonies: those with low chlorophyll densities, and therefore short harvesting antennae, are yellowish in colour rather than green. The selected strains are then cultured and tested for energy yields during photosynthesis to identify the most efficient energy converters.

Melis has already demonstrated that cells with truncated antennae are illuminated much more uniformly in dense cultures and achieve the desired effect of creating a thick carpet of algae that harvests light efficiently.
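The uniform-illumination effect follows from simple Beer–Lambert attenuation: cells with smaller antennae present a smaller absorption cross-section, so light penetrates deeper into a dense culture. The sketch below uses purely illustrative cross-section values, not measurements from Melis's strains.

```python
import math

def light_at_depth(i0, sigma, density, depth):
    """Beer-Lambert attenuation: irradiance surviving to a given depth
    in a suspension of cells with absorption cross-section sigma."""
    return i0 * math.exp(-sigma * density * depth)

I0 = 1.0               # surface irradiance (normalized)
DENSITY = 1.0          # cell density (arbitrary units)
SIGMA_WILDTYPE = 2.0   # illustrative cross-section, large antenna
SIGMA_TRUNCATED = 0.5  # illustrative cross-section, truncated antenna

for depth in (0.5, 1.0, 2.0):
    wt = light_at_depth(I0, SIGMA_WILDTYPE, DENSITY, depth)
    tr = light_at_depth(I0, SIGMA_TRUNCATED, DENSITY, depth)
    print(f"depth {depth}: wild-type {wt:.3f}, truncated {tr:.3f}")
```

With these numbers, at unit depth a wild-type culture passes only about 14% of surface light while the truncated-antenna culture passes about 61%, which is why the latter is lit far more evenly from top to bottom.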
“Accordingly, the truncated light-harvesting chlorophyll antenna size property may find application in the commercial exploitation of microalgae and plants for the generation of biomass, biofuel, chemical feedstock, as well as nutraceutical and pharmaceutical products,” he said.

Improving the ability of algae to harvest light is an important step towards improving the efficiency of photosynthesis, especially in the densely populated volumes of water in algae biofuel farms (Fig 1). There is also the hope of going further and bioengineering microalgae to produce biofuels or electricity directly, cutting out the need to convert lipids into biofuels such as biodiesel or hydrogen. This is a harder challenge because it involves engineering a truly fundamental change in the second stage of photosynthesis, the Calvin cycle, to redirect the energy liberated by splitting water away from the normal production of glucose and towards the desired biofuel or electricity.

[Figure 1: Schematic drawing of an algae farm for the production of biofuels.]

Fortunately, evolution has provided a good starting point: the hydrogenase enzyme, which protects against damage when the Calvin cycle is unable to mop up all the electrons produced by light harvesting. This can happen just after sunrise, when light harvesting kicks in but the Calvin cycle has not yet ‘woken up' from its night's rest. Under these circumstances, hydrogenase guides the electrons directly to the protons produced by splitting water, forming hydrogen. The enzyme is eventually inhibited by the oxygen released as normal photosynthesis gets going and resumes for the day.

Research has therefore focused on holding back this oxygen feedback mechanism to increase the production of hydrogen.
The first breakthrough came in 2000, when Melis and Seibert reported that depriving algal cultures of sulphate would cut the rate of photosynthesis (Melis et al, 2000). The result was a 90% reduction in oxygen production, sufficient to allow the hydrogenase enzyme to continue diverting electrons towards protons and yield hydrogen for a longer period.

Although this was a considerable step forward, it did not solve the problem, because the C. reinhardtii cells soon died when deprived of sulphate. Melis, Seibert and others have since worked on various methods to achieve the same effect at a molecular level without depriving the cells of sulphate ions, by diverting electrons away from the Calvin cycle while maintaining overall levels of photosynthesis. This involves getting a number of things right and will probably require tuning several genes at the whole-genome level to achieve the desired objectives.

The recent announcement by Craig Venter that he has created a synthetic bacterium by transplanting a chemically synthesized genome into another species of bacteria (Gibson et al, 2010) has therefore added a new twist to the story. Venter's technology could enable scientists to make changes to algae at the level of the whole genome, custom-building a suite of enzymatic tools to redirect the energy produced by photosynthesis. Venter's team re-engineered the genome of the bacterium Mycoplasma mycoides from digitized sequence information. The resulting genome was then transplanted into bacterial cells of another species, Mycoplasma capricolum, which then acquired all the phenotypic properties of M. mycoides and was capable of self-replication.

Venter's development might prove a significant step on the road towards algae-derived biofuels, according to Ryu. “I think it is a smart move and look forward to hearing what would come out in the near future,” he said. “Regardless of whether it works or fails, we will always learn something.
For our approach, genomic manipulation can help greatly.” A key target, Ryu explained, will be the ferredoxin proteins that act as biological capacitors in photosynthesis, accepting electrons from the chlorophyll antennae and carrying them to the Calvin cycle. “In genetically-modified algae, ferredoxin stops delivering the photosynthetic electrons to the Calvin cycle […] Then we have a better chance of stealing the electrons,” Ryu said.

Such exciting prospects stoke further optimism that science could at last provide a significant and sustainable source of energy, delivered in a variety of forms that might include transportation fuels, hydrogen, large-scale electricity production and possibly self-charging organic batteries.
