Similar articles
20 similar articles found.
1.
Wolinsky H. EMBO Reports 2010, 11(11): 830–833.
Sympatric speciation—the rise of new species in the absence of geographical barriers—remains a puzzle for evolutionary biologists. Though the evidence for sympatric speciation itself is mounting, an underlying genetic explanation remains elusive.

For centuries, the greatest puzzle in biology was how to account for the sheer variety of life. In his 1859 landmark book, On the Origin of Species, Charles Darwin (1809–1882) finally supplied an answer: his grand theory of evolution explained how the process of natural selection, acting on the substrate of genetic mutations, could gradually produce new organisms that are better adapted to their environment. It is easy to see how adaptation to a given environment can differentiate organisms that are geographically separated; different environmental conditions exert different selective pressures on organisms and, over time, the selection of mutations creates different species—a process that is known as allopatric speciation.

It is more difficult to explain how new and different species can arise within the same environment. Although Darwin never used the term sympatric speciation for this process, he did describe the formation of new species in the absence of geographical separation. “I can bring a considerable catalogue of facts,” he argued, “showing that within the same area, varieties of the same animal can long remain distinct, from haunting different stations, from breeding at slightly different seasons, or from varieties of the same kind preferring to pair together” (Darwin, 1859).

In the 1920s and 1930s, however, allopatric speciation and the role of geographical isolation became the focus of speciation research. Among those leading the charge was Ernst Mayr (1904–2005), a young evolutionary biologist, who would go on to influence generations of biologists with his later work in the field. William Baker, head of palm research at the Royal Botanic Gardens, Kew in Richmond, UK, described Mayr as “one of the key figures to crush sympatric speciation.” Frank Sulloway, a Darwin scholar at the Institute of Personality and Social Research at the University of California, Berkeley, USA, similarly asserted that Mayr's scepticism about sympatry was central to his career.

Since Mayr's death in 2005, however, several publications have challenged the notion that sympatric speciation is a rare exception to the rule of allopatry. These papers describe examples of both plants and animals that have undergone speciation in the same location, with no apparent geographical barriers to explain their separation. In these instances, a single ancestral population has diverged to the extent that the two new species cannot produce viable offspring, despite the fact that their ranges overlap. The debate about sympatric and allopatric speciation has livened up since Mayr's death, as Mayr's influence over the field has waned and as new tools and technologies in molecular biology have become available.

Sulloway, who studied with Mayr at Harvard University in the late 1960s and early 1970s, notes that Mayr's background in natural history and years of fieldwork in New Guinea and the Solomon Islands contributed to his perception that the bulk of the data supported allopatry. “Ernst's early career was in many ways built around that argument. It wasn't the only important idea he had, but he was one of the strong proponents of it. When an intellectual stance exists where most people seem to have gotten it wrong, there is a tendency to sort of lay down the law,” Sulloway said.

Sulloway also explained that Mayr “felt that botanists had basically led Darwin astray because there is so much evidence of polyploidy in plants and Darwin turned in large part to the study of botany and geographical distribution in drawing evidence in The Origin.” Indeed, polyploidization is common in plants and can lead to ‘instantaneous' speciation without geographical barriers.

In February 2006, the journal Nature simultaneously published two papers that described sympatric speciation in animals and plants, reopening the debate. Axel Meyer, a zoologist and evolutionary biologist at the University of Konstanz, Germany, demonstrated with his colleagues that sympatric speciation has occurred in cichlid fish in Lake Apoyo, Nicaragua (Barluenga et al, 2006). The researchers claimed that the ancestral fish only seeded the crater lake once; from this, new species have evolved that are distinct and reproductively isolated. Meyer's paper was broadly supported, even by critics of sympatric speciation, perhaps because Mayr himself endorsed sympatric speciation among the cichlids in his 2001 book What Evolution Is. “[Mayr] told me that in the case of our crater lake cichlids, the onus of showing that it's not sympatric speciation lies with the people who strongly believe in only allopatric speciation,” Meyer said.

The other paper in Nature—by Vincent Savolainen, a molecular systematist at Imperial College, London, UK, and colleagues—described the sympatric speciation of Howea palms on Lord Howe Island (Fig 1), a minute Pacific island paradise (Savolainen et al, 2006a). Savolainen's research had originally focused on plant diversity in the gesneriad family—the best known example of which is the African violet—while he was in Brazil for the Geneva Botanical Garden, Switzerland. However, he realized that he would never be able to prove the occurrence of sympatry within a continent. “It might happen on a continent,” he explained, “but people will always argue that maybe they were separated and got together after. […] I had to go to an isolated piece of the world and that's why I started to look at islands.”

[Figure 1: Lord Howe Island. Photo: Ian Hutton.]

He eventually heard about Lord Howe Island, which is situated just off the east coast of Australia, has an area of 56 km² and is known for its abundance of endemic palms (Sidebar A). The palms, Savolainen said, were an ideal focus for sympatric research: “Palms are not the most diverse group of plants in the world, so we could make a phylogeny of all the related species of palms in the Indian Ocean, southeast Asia and so on.”

Sidebar A | Research in paradise

Alexander Papadopulos is no Tarzan of the Apes, but he has spent a couple of months over the past two years aloft in palm trees hugging rugged mountainsides on Lord Howe Island, a Pacific island paradise and UNESCO World Heritage site.

Papadopulos—who is finishing his doctorate at Imperial College London, UK—said the views are breathtaking, but the work is hard and a bit treacherous as he moves from branch to branch. “At times, it can be quite hairy. Often you're looking over a 600-, 700-metre drop without a huge amount to hold onto,” he said. “There's such dense vegetation on most of the steep parts of the island. You're actually climbing between trees. There are times when you're completely unsupported.”

Papadopulos typically spends around 10 hours a day in the field, carrying a backpack and utility belt with a digital camera, a trowel to collect soil samples, a first-aid kit, a field notebook, food and water, specimen bags, tags to label specimens, a GPS device and more. After several days in the field, he spends a day working in a well-equipped field lab and sleeping in the quarters that were built by the Lord Howe governing board to accommodate the scientists who visit the island on various projects. Papadopulos is studying Lord Howe's flora, which includes more than 200 plant species, about half of which are indigenous.

Vincent Savolainen said it takes a lot of planning to get materials to Lord Howe: the two-hour flight from Sydney is on a small plane, with only about a dozen passengers on board and limited space for equipment. Extra gear—from gardening equipment to silica gel and wood for boxes in which to dry wet specimens—arrives via other flights or by boat, to serve the needs of the various scientists on the team, including botanists, evolutionary biologists and ecologists.

Savolainen praised the well-stocked research station for visiting scientists. It is run by the island board and situated near the palm nursery. It includes one room for the lab and another with bunks. “There is electricity and even email,” he said. Papadopulos said only in the past year has the internet service been adequate to accommodate video calls back home.

Ian Hutton, a Lord Howe-based naturalist and author, who has lived on the island since 1980, said the island authorities set limits on not only the number of residents—350—but also the number of visitors at one time—400—as well as banning cats, to protect birds such as the flightless wood hen. He praised the Imperial/Kew group: “They're world leaders in their field. And they're what I call ‘Gentlemen Botanists'. They're very nice people, they engage the locals here. Sometimes researchers might come here, and they're just interested in what they're doing and they don't want to share what they're doing. Not so with these people.” Savolainen said his research helps the locals: “The genetics that we do on the island are not only useful to understand big questions about evolution, but we also always provide feedback to help in its conservation efforts.”

Yet, in Savolainen's opinion, Mayr's influential views made it difficult to obtain research funding. “Mayr was a powerful figure and he dismissed sympatric speciation in textbooks. People were not too keen to put money on this,” Savolainen explained. Eventually, the Leverhulme Trust (London, UK) gave Savolainen and Baker £70,000 between 2003 and 2005 to get the research moving. “It was enough to do the basic genetics and to send a research assistant for six months to the island to do a lot of natural history work,” Savolainen said. Once the initial results had been processed, the project received a further £337,000 from the British Natural Environment Research Council in 2008, and €2.5 million from the European Research Council in 2009.

From the data collected on Lord Howe Island, Savolainen and his team constructed a dated phylogenetic tree showing that the two endemic species of the palm Howea (Arecaceae; Fig 2) are sister taxa. From their tree, the researchers were able to establish that the two species—one with a thatch of leaves and one with curly leaves—diverged long after the island was formed 6.9 million years ago. Even where they are found in close proximity, the two species cannot interbreed as they flower at different times.

[Figure 2: The two species of Howea palm. (A) Howea fosteriana (Kentia palm). (B) Howea belmoreana. Photos: William Baker, Royal Botanic Gardens, Kew, Richmond, UK.]

According to the researchers, the palm speciation probably occurred owing to the different soil types in which the plants grow. Baker explained that there are two soil types on Lord Howe—the older volcanic soil and the younger calcareous soils. The Kentia palm grows in both, whereas the curly variety is restricted to the volcanic soil. These soil types are closely intercalated—fingers and lenses of calcareous soils intrude into the volcanic soils in lowland Lord Howe Island. “You can step over a geological boundary and the palms in the forest can change completely, but they remain extremely close to each other,” Baker said. “What's more, the palms are wind-pollinated, producing vast amounts of pollen that blows all over the place during the flowering season—people even get pollen allergies there because there is so much of the stuff.” According to Savolainen, that the two species have different flowering times is a “way of having isolation so that they don't reproduce with each other […] this is a mechanism that evolved to allow other species to diverge in situ on a few square kilometres.”

According to Baker, a causative link between the different soils and the altered flowering times has not been demonstrated, “but we have suggested that at the time of speciation, perhaps when calcareous soils first appeared, an environmental effect may have altered the flowering time of palms colonising the new soil, potentially causing non-random mating and kicking off speciation. This is just a hypothesis—we need to do a lot more fieldwork to get to the bottom of this,” he said. What is clear is that this is not allopatric speciation, as “the micro-scale differentiation in geology and soil type cannot create geographical isolation”, said Baker.
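The dating logic behind this result can be made concrete with a simple molecular-clock calculation: the sequence divergence between the two sister species, divided by twice a per-lineage substitution rate, gives an approximate divergence time that can be compared with the island's 6.9-million-year age. The sketch below shows only that comparison; the divergence and rate values are hypothetical placeholders, not figures from the Savolainen study.

```python
# Illustrative molecular-clock estimate of divergence time.
# The divergence and substitution rate below are assumed values for
# demonstration only; they are not taken from the Howea study.

def divergence_time_mya(pairwise_divergence: float, subs_per_site_per_myr: float) -> float:
    """Estimate divergence time in millions of years.

    pairwise_divergence: proportion of sites differing between the two species.
    subs_per_site_per_myr: substitution rate per lineage (per site, per Myr).
    Divergence accumulates along both lineages, hence the factor of two.
    """
    return pairwise_divergence / (2.0 * subs_per_site_per_myr)

if __name__ == "__main__":
    assumed_divergence = 0.01   # 1% sequence divergence (hypothetical)
    assumed_rate = 0.005        # substitutions/site/Myr per lineage (hypothetical)
    island_age = 6.9            # age of Lord Howe Island in Myr (from the article)

    t = divergence_time_mya(assumed_divergence, assumed_rate)
    print(f"Estimated divergence: {t:.2f} Myr; island age: {island_age} Myr")
    print("Speciation post-dates island formation" if t < island_age
          else "Divergence predates the island")
```

A published dated phylogeny would, of course, rest on multiple loci and explicit calibrations rather than a single fixed rate; the sketch only illustrates why an estimated divergence time younger than the island argues against a pre-island, and hence potentially allopatric, split.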
The results of the palm research caused something of a splash in evolutionary biology, although the study was not without its critics. Tod Stuessy, Chair of the Department of Systematic and Evolutionary Botany at the University of Vienna, Austria, has dealt with similar issues of divergence on Chile's Juan Fernández Islands—also known as the Robinson Crusoe Islands—in the South Pacific. From his research, he points out that on old islands, large ecological areas that once separated species—and caused allopatric speciation—could have since disappeared, diluting the argument for sympatry. “There are a lot of cases [in the Juan Fernández Islands] where you have closely related species occurring in the same place on an island, even in the same valley. We never considered that they had sympatric origins because we were always impressed by how much the island had been modified through time,” Stuessy said. “What [the Lord Howe researchers] really didn't consider was that Lord Howe Island could have changed a lot over time since the origins of the species in question.” It has also been argued that one of the palm species on Lord Howe Island might have evolved allopatrically on a now-sunken island in the same oceanic region.

In their response to a letter from Stuessy, Savolainen and colleagues argued that erosion on the island has been mainly coastal and equal from all sides. “Consequently, Quaternary calcarenite deposits, which created divergent ecological selection pressures conducive to Howea species divergence, have formed evenly around the island; these are so closely intercalated with volcanic rocks that allopatric speciation due to ecogeographic isolation, as Stuessy proposes, is unrealistic” (Savolainen et al, 2006b). Their rebuttal has found support in the field. Evolutionary biologist Loren Rieseberg at the University of British Columbia in Vancouver, Canada, said: “Basically, you have two sister species found on a very small island in the middle of the ocean. It's hard to see how one could argue anything other than they evolved there. To me, it would be hard to come up with a better case.”

Whatever the reality, several scientists involved in the debate think that molecular biology could help to eventually resolve the issue. Savolainen said that the next challenges will be to determine which genes are responsible for speciation, and whether sympatric speciation is common. New sequencing techniques should enable the team to obtain a complete genomic sequence for the palms. Savolainen said that next-generation sequencing is “a total revolution.” By using sequencing, he explained, the team “want to basically dissect exactly what genes are involved and what has happened […] Is it very special on Lord Howe and for this palm, or is [sympatric speciation] a more general phenomenon? This is a big question now. I think now we've found places like Lord Howe and [have] tools like the next-gen sequencing, we can actually get the answer.”

Determining whether sympatric speciation occurs in animal species will prove equally challenging, according to Meyer. His own lab, among others, is already looking for ‘speciation genes', but this remains a tricky challenge. “Genetic models […] argue that two traits (one for ecological specialisation and another for mate choice, based on those ecological differences) need to become tightly linked on one chromosome (so that they don't get separated, often by segregation or crossing over). The problem is that the genetic basis for most ecologically relevant traits are not known, so it would be very hard to look for them,” Meyer explained. “But, that is about to change […] because of next-generation sequencing and genomics more generally.”
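Meyer's point about linkage can be illustrated with a toy two-locus simulation: an allele for ecological specialisation and an allele for mate choice stay associated only if recombination between the two loci is rare. The code below is a deliberately simplified illustration written for this summary, not Meyer's model; the population size, number of generations and recombination rates are arbitrary assumptions, and selection and assortative mating are left out so that only the decay of the association is visible.

```python
# Toy illustration: recombination erodes the association between an
# "ecological" locus (E) and a "mate-choice" locus (M) unless the two
# are tightly linked. Parameters are arbitrary, for demonstration only.
import random

def simulate_ld(recomb_rate: float, generations: int = 50, pop_size: int = 2000) -> float:
    """Return the linkage disequilibrium D after a number of generations.

    Haplotypes are (E, M) pairs; the population starts with only (0, 0)
    and (1, 1) haplotypes, i.e. perfect association (D = 0.25).
    Each offspring draws two parental haplotypes at random and recombines
    them with probability recomb_rate.
    """
    pop = [(0, 0)] * (pop_size // 2) + [(1, 1)] * (pop_size // 2)
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            h1, h2 = random.choice(pop), random.choice(pop)
            if random.random() < recomb_rate:   # crossover between the loci
                new_pop.append((h1[0], h2[1]))
            else:                               # haplotype passed on intact
                new_pop.append(h1)
        pop = new_pop
    freq_11 = sum(1 for e, m in pop if e == 1 and m == 1) / pop_size
    p_e = sum(e for e, _ in pop) / pop_size
    p_m = sum(m for _, m in pop) / pop_size
    return freq_11 - p_e * p_m   # D: covariance between the two loci

if __name__ == "__main__":
    random.seed(1)
    for r in (0.5, 0.05, 0.001):   # unlinked, loosely linked, tightly linked
        print(f"recombination rate {r:>6}: D after 50 generations = {simulate_ld(r):.3f}")
```

With free recombination (r = 0.5) the association decays within a few dozen generations, whereas tight linkage preserves it; this is why such models require the two loci to sit close together on one chromosome, or within a region where recombination is otherwise suppressed.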
Others are more cautious. “In some situations, such as on isolated oceanic islands, or in crater lakes, molecular phylogenetic information can provide strong evidence of sympatric speciation. It also is possible, in theory, to use molecular data to estimate the timing of gene flow, which could help settle the debate,” Rieseberg said. However, he cautioned that although molecular data will add to the debate, it will not settle it alone. “We will still need information from historical biogeography, natural history, phylogeny, and theory, etc. to move things forward.”

Many researchers who knew Mayr personally think he would have enjoyed the challenge to his views. “I can only imagine that it would've been great fun to engage directly with him [on sympatry on Lord Howe],” Baker said. “It's a shame that he wasn't alive to comment on [our paper].” In fact, Mayr was not really as opposed to sympatric speciation as some think. “If one is of the opinion that Mayr opposed all forms of sympatric speciation, well then this looks like a big swing back the other way,” Sulloway commented. “But if one reads Mayr carefully, one sees that he was actually interested in potential exceptions and, as best he could, chronicled which ones he thought were the best candidates.”

Mayr's opinions aside, many biologists today have stronger feelings against sympatric speciation than he did himself in his later years, Meyer added. “I think that Ernst was more open to the idea of sympatric speciation later in his life. He got ‘softer' on this during the last two of his ten decades of life that I knew him. I was close to him personally and I think that he was much less dogmatic than he is often made out to be […] So, I don't think that he is spinning in his grave.” Mayr once told Sulloway that he liked to take strong stances, precisely so that other researchers would be motivated to try to prove him wrong. “If they eventually succeeded in doing so, Mayr felt that science was all the better for it.”

[Figure: Alex Papadopulos and Ian Hutton doing fieldwork on a very precarious ridge on top of Mt Gower. Photo: William Baker, Royal Botanic Gardens, Kew, Richmond, UK.]

2.
Wolinsky H. EMBO Reports 2011, 12(12): 1226–1229.
Looking back on the International Year of Biodiversity, some conservationists hope that it has raised awareness, if nothing else. Even so, many scientists remain pessimistic about our efforts to halt biodiversity decline.

The United Nations' (UN) International Year of Biodiversity in 2010 was supposed to see the adoption of measures that would slow global environmental decline and the continuing loss of endangered species and habitats. Even before, in 2002, most UN members had committed to halting the decline in biodiversity, which is a measure of the health of ecosystems. But the results of these international efforts have been funereal. Moreover, the current global economic crisis, coupled with growing anti-science attitudes in the USA, is adding to the concern of scientists about whether there is the political will to address the loss of biodiversity and whether habitat loss and extinction rates are reaching a point of no return.

Ahmed Djoghlaf, Executive Secretary of the Convention on Biological Diversity under the UN Environment Programme based in Montreal, Canada, said that of the 175 national reports submitted as part of the International Year of Biodiversity to his agency last year, none reported any progress. “There is not a single report received last year that claimed to have stopped or reduced the loss of biodiversity,” he said. “These reports confirm that the rate of loss of biodiversity today is unprecedented and the rate is 1,000 times higher than the natural rate of extinction of species, and [his agency's Global Biodiversity Outlook 2010; UN, 2010a] predicts that if business is allowed to continue then major ecosystems, the ocean, the fish, the forests, will reach the tipping point, meaning that there will be irreversible and irreparable damage done to the ecosystems.”

The UN campaign traces its roots to the European Union (EU) commitment in 2001 to halt the loss of biodiversity by 2010. The 2010 goal was incorporated into the UN Millennium Development Goals because of the severe impact of biodiversity loss on human well-being. However, the EU last year conceded in a report that it missed its 2010 target, too. The EU's Biodiversity Action Plan, launched in 2006, shows that Europe's biodiversity “remains under severe threat from the excessive demands we are making on our environment, such as changes in land use, pollution, invasive species and climate change.” Yet, EU Environment Commissioner Janez Potočnik has seen some positive signs: “We have learned some very important lessons and managed to raise biodiversity to the top of the political agenda. But we need everyone on board and not just in Europe. The threat around the world is even greater than in the EU,” he wrote last year (EC, 2010).

Despite the initiative's poor report card, Djoghlaf was upbeat about the International Year of Biodiversity. “It was a success because it was celebrated everywhere,” he said. “In Switzerland, they conducted a survey before and after the International Year of Biodiversity and they concluded that at the end of the year, 67% of all the Swiss people are now aware of biodiversity. When the year started it was 40%. People are more and more aware. In addition, biodiversity has entered the top of the political agenda.”

In October 2010, delegates from 193 countries attended the UN Convention on Biodiversity in Nagoya, Japan, and adopted 20 strategic goals to be achieved by 2020 (UN, 2010b). The so-called Aichi Biodiversity Targets include increased public awareness of the values of biodiversity and the steps that individuals can take to conserve and act sustainably; the halving or halting of the rate of loss of all natural habitats, including forests; and the conservation of 17% of terrestrial and inland water, and 10% of coastal and marine areas through effective and equitable management, resulting in ecologically representative and well-connected systems. By contrast, 13% of land areas and 1% of marine areas were protected in 2010.

However, the Convention on Biological Diversity is not enforceable. Anne Larigauderie, Executive Director of DIVERSITAS (Paris, France), which promotes research on biodiversity science, said that it is up to the individual countries to adopt enforceable legislation. “In principle, countries have committed. Now it depends on what individual countries are going to do with the agreement,” she said. “I would say that things are generally going in the right direction and it's too early to tell whether or not it's going to have an impact in terms of responding and in terms of the biodiversity itself.”

Researchers, however, have been disappointed by the International Year of Biodiversity. Conservation biologist Stuart Butchart, of BirdLife International in Cambridge, UK—a partnership of non-governmental environmental organizations—and colleagues from other environmental groups compiled a list of 31 indicators to measure progress towards the 2010 goal of the International Year of Biodiversity. He and his collaborators reported in Science (Butchart et al, 2010) that these indicators, including species population trends, extinction risks and habitat conditions, showed declines with no significant rate reductions. At the same time, indicators of pressure on biodiversity, such as resource consumption, invasive alien species, nitrogen pollution, over-exploitation and climate change impacts, showed increases. “Despite some local successes and increasing responses (including extent and biodiversity coverage of protected areas, sustainable forest management, policy responses to invasive alien species and biodiversity-related aid), the rate of biodiversity loss does not appear to be slowing,” the researchers wrote.

[Photo: © Thomas Kitchin & Victoria Hurst/Wave/Corbis]

Butchart pointed out that even if the International Year of Biodiversity had an impact on raising awareness and reducing biodiversity loss, detecting the change would take time. He said that the International Year of Biodiversity fell short of increasing awareness in parts of government not dealing with the environment, including ministries of transport, tourism, treasury and finance. It also seems probable that the campaign had little impact on the business sector, which affects development projects with a direct impact on biodiversity. “People can't even seem to get together on global climate change, which is a whole lot more obvious and right there,” Peter Raven, president emeritus of the Missouri Botanical Garden in St Louis, USA, explained. “Biodiversity always seems to be a sort of mysterious background thing that isn't quite there.”

Ilkka Hanski, a professor in the Department of Ecology and Evolutionary Biology at the University of Helsinki in Finland, said that studies such as Butchart's “indicate that nothing really happened in 2010. Biodiversity decline continued and has been declining over the past 10 years.”

Other researchers are more positive, although with reservations. Conservation biologist Thomas Eugene Lovejoy III, Heinz Center Biodiversity Chair and former president of the Center in Washington, DC, USA—a non-partisan, non-profit organization dedicated to advancing sound environmental policy—said that economic trends affect biodiversity and that biodiversity efforts might actually be benefiting from the current global economic crisis. For example, the decline in the housing markets in the USA and Europe has reduced the demand for lumber for new construction and has led to a reduction in deforestation. “Generally speaking, when there is an economic downturn, some of the things that are pressuring biodiversity actually abate somewhat. That's the good news. The bad news is that the ability to marshal resources to do some things proactively gets harder,” he said.

Chris Thomas, a conservation biologist at the University of York in the UK, who studies ecosystems and species in the context of climate change, said that economic depressions do slow the rate of damage to the environment. “But it also takes eyes off the ball of environmental issues. It's not clear whether these downturns, when you look over a period of a decade, make much difference or not.” Hanski agreed: “[B]ecause there is less economic activity, there may be less use of resources and such. But I don't think this is a way to solve our problems. It won't lead to any stable situation. It just leads to a situation where economic policies become more and more dependent on measures that try actually just to increase the growth as soon as possible.”

Raven said that in bad times, major interests such as those involved in raising cattle, growing soybeans and clearing habitat for oil palms have reduced political clout because there is less money available for investment. But he said economic downturns do not slow poor people scrounging for sustenance in natural habitats.

To overcome this attitude of neglect, Lovejoy thinks there ought to be a new type of ‘economics' that demonstrates the benefits of biodiversity and brings the “natural world into the normal calculus.” Researchers are already making progress in this direction. Thomas said that the valuation of nature is one of the most active areas of research. “People have very different opinions as to how much of it can be truly valued. But it is a rapidly developing field,” he said. “Once you've decided how much something is worth, then you've got to ask what are the financial or other mechanisms by which the true value of this resource can be appreciated.”

Hanski said that the main problem is the short-term view of economic forecasts. “Rapid use of natural resources because of short-term calculation may actually lead to a sort of exploitation rather than conservation or preservation.” He added that the emphasis on economic growth in rich societies in North America and Europe is frustrating. “We have become much richer than in 1970 when there actually was talk of zero growth in serious terms. So now we are richer and we are becoming more and more dependent on continued growth, the opposite of what we should be aiming at. It's a problem with our society and economics clearly, but I can't be very optimistic about the biodiversity or other environmental issues in this kind of situation.” He added that biodiversity is still taking a backseat to economics: “There is a very long way to go right now with the economic situation in Europe, it's clear that these sorts of [biodiversity] issues are not the ones which are currently being debated by the heads of states.”

The economic downturn, which has led to reduced government and private funding and declines in endowments, has also hurt organizations dedicated to preserving biodiversity. Butchart said that some of the main US conservation organizations, including the Nature Conservancy and the World Wildlife Fund, have experienced staff cuts of up to 30%. “Organizations have had to tighten their belts and rein in programmes just to stay afloat, so it's definitely impacted the degree to which we could work effectively,” he said. “Most of the big international conservation organizations have had to lay off large numbers of staff.”

Cary Fowler, Executive Director of the Global Crop Diversity Trust in Rome, Italy, a public–private partnership to fund key crop collections for food security, also feels the extra challenges of the global economic crisis. “We invest our money conservatively like a foundation would in order to generate income that can reliably pay the bills in these seed banks year after year. So I'm always nervous and I have the computer on at the moment looking at what's happening with the sovereign debt crisis here in Europe. It's not good,” he said. “Governments are not being very generous with contributions to this area. Donors will rarely give a reason [for cutting funding].”

The political situation in the USA, the world's largest economy, is also not boding well for conservation of and research into biodiversity. The political extremism of the Republican Party during the run-up to the 2012 presidential election has worried many involved in biodiversity issues. Republican contender Texas Governor Rick Perry has been described as ‘anti-science' for his denial of man-made climate change, a switch from the position of 2008 Republican candidate John McCain. Perry was also reported to describe evolution as a “theory that's out there, and it's got some gaps in it” at a campaign event in New Hampshire earlier in the year.

Raven said this attitude is putting the USA at a disadvantage. “It drives us to an anti-intellectualism and a lack of real verification for anything which is really serious in terms of our general level of scientific education and our ability to act intelligently,” he said.

Still, Larigauderie said that although the USA has not signed the conventions on biodiversity, she has seen US observers attend the meetings, especially under the Obama administration. “They just can't speak,” she said. Meanwhile, Lovejoy said that biodiversity could get lost in the “unbelievable polarisation affecting US politics. I have worked out of Washington for 36 years now—I've never seen anything like this: an unwillingness to actually listen to the other side.”

Raven said it is vital for the USA to commit to preserving biodiversity nationally and internationally. “It's extremely important because our progress towards sustainability for the future will depend on our ability to handle biodiversity in large part. We're already using about half of all the total photosynthetic productivity on land worldwide and that in turn means we're cutting our options back badly. The US is syphoning money by selling debt and of course promoting instability all over the world,” he explained. “It's clear that there is no solution to it other than a level population, more moderate consumption levels and new technologies altogether.”

The EU and the UN have also changed the time horizon for halting the decline in biodiversity. As part of the Nagoya meeting, the UN announced the UN Decade for Biodiversity. The strategic objectives include a supporting framework for the implementation of the Biodiversity Strategic Plan 2011–2020 and the Aichi Biodiversity Targets, as well as guidance to regional and international organizations, and more public awareness of biodiversity issues.

But Butchart remains sceptical. “I suspect ‘decades of whatever' have even less impact than years,” he said. “2008 was the International Year of the Potato. I don't know how much impact that had on your life and awareness. I think there is greater awareness and greater potential to make significant progress in addressing biodiversity loss now than there was 10 years ago, but the scale of the challenge is still immense.”

Hanski has similar doubts. “I believe it's inevitable that a very large fraction of the species on Earth will go extinct in the next hundred years. I can't see any change to that.” But he is optimistic that some positive change can be made. “Being pessimistic doesn't help. The nations still can make a difference.” He said he has observed ecotourism playing a role in saving some species in Madagascar, where he does some of his research.

“We're not going to fundamentally be able to wipe life off the planet,” Thomas said. “We will wipe ourselves off the planet virtually certainly before we wipe life out on Earth. However, from the point of view of humanity as a culture, and in terms of the resources we might be able to get from biodiversity indirectly or directly, if we start losing things then it takes things millions of years to ‘re-evolve' something that does an equivalent job. From a human perspective, when we wipe lots of things out, they're effectively permanently lost. Of course it would be fascinating and I would love to be able to come back to the planet in 10 million years and see what it looks like, assuming humans are not here and other stuff will be.”

Djoghlaf, by contrast, is more optimistic about our chances: “I believe in the human survival aspect. When humankind realises that the current pattern of production and consumption and the current way that it is dealing with nature is unsustainable, we will wake up.”

3.
Hunter P. EMBO Reports 2011, 12(3): 205–207.
A more complete catalogue of the Earth's fauna and flora and a more holistic view of man-made environmental problems could help to slow the rate of biodiversity loss.

In the wake of the admission from the United Nations (UN) that, to date, efforts have failed to even slow down the rate of extinction across almost all plant and animal taxa (CBD, 2010), the fight to reverse the human-induced loss of biodiversity is entering a new chapter. The failure to achieve the targets set in 2002 for reducing decline has led to a revised strategy from the Convention on Biological Diversity (CBD). This new approach recognizes that species conservation cannot be treated in isolation from other issues facing humans, including climate change, water scarcity, poverty, agricultural development and global conflict. It also acknowledges that declining biodiversity cannot be tackled properly without a more accurate inventory of the species in existence today. Thus, a large part of the strategy to combat species decline focuses on building an exhaustive catalogue of life.

The Global Strategy for Plant Conservation includes such a plan. The intention is to compile an online flora of known plants by 2020, which should enable comprehensive conservation efforts to gather steam. Peter Wyse Jackson, president of the Missouri Botanical Garden in the USA, said that around 25% of the estimated 400,000 plant species in the world are thought to be threatened. He said that around 850 botanical gardens have, between them, collected around 100,000 species, but only a quarter of these are from the threatened group. “World Flora online will then be an essential baseline to determine the status of individual plant species and threats to them,” Jackson explained. “By 2020 it is proposed that at least 75% of known threatened plants should be conserved both in the wild and in existing collections.”

The Missouri Botanical Garden will have an important role in the project and Jackson commented that the first step of the plan has already been achieved: the establishment of an online checklist of flora that is needed to build a comprehensive database of the plant species in the world.

Yet, some other plans to halt species decline have drawn criticism. “In my opinion, whilst such international targets are useful to motivate individuals, states and wider society to do conservation, they are not necessarily realistic because they are often ‘pulled out of the hat' with very little science behind them,” commented Shonil Bhagwat, senior research fellow at the School of Geography and the Environment at Oxford University.

The revised CBD plan specifies measures for reversing the decline in biodiversity. One target is to enlarge protected areas for wildlife, within which activities such as logging are prohibited. Ecological corridors could then connect these areas to allow migration and create a network of ‘safe' places for wildlife.

Such a corridor is being created between two parts of the Brazilian Atlantic rainforest—the Pau Brasil National Park and the Monte Pascoal National Park—both of which are already protected. “Well-managed protected areas keep away biodiversity threats, such as deforestation, invasive species, hunting and poaching,” explained Arnd Alexander Rose, marketing manager for Brazil at The Nature Conservancy, a conservation organization that operates on all continents. “We think that the connectivity between the national parks is essential for the long-term permanence of local species, especially fauna,” Rose said.

Worldwide, only around 6% of coastlines are within protected areas, but around 12% of the total land area is protected—a figure that is perhaps higher than many would expect, reflecting the large size of many national parks and other designated wildlife zones. Nevertheless, the coverage of different habitats varies greatly: “Only 5% of the world's temperate needle-leaf forests and woodlands, 4.4% of temperate grasslands and 2.2% of lake systems are protected” (CBD, 2010). The aim of the CBD is to increase the total area of protected land to 17% by 2020, and also to expand the protected coastal zones, as well as extending the area of protected oceans to 10%.

Things at sea, however, are different, both in terms of biodiversity and protection. The biggest threat to many marine species is not direct human activity—poaching or habitat encroachment, for example—but the impact of increased ocean acidity due to rising atmospheric carbon dioxide levels. Halting or reversing this increase will therefore contribute to the marine conservation effort and biodiversity in the long term.

However, the first task is to establish the extent of marine biodiversity, particularly in terms of invertebrate animals, which are not well catalogued. Ian Poiner is CEO of the Australian Institute of Marine Science and chair of the steering committee for the first Census of Marine Life (Census of Marine Life, 2010), which has revealed the enormity of our remaining uncertainty. “So far 250,000 species [of invertebrates] have been formally described, but at least another 750,000 remain to be discovered, and I think it could be as many as 10 million,” Poiner said. As evidence for this uncertainty he points to the continuing high rate of discovery of new species around coral reefs, where each organism also tends to come with a new parasite. The situation is compounded by the problem of how to define diversity among prokaryotes.

Even if the number of non-vertebrate marine species remaining to be discovered turns out to be at the low end of estimates, Poiner points out that the abundance and diversity of life in the oceans will still be far greater than was expected before the census. For fish—a group that has been more extensively analysed than invertebrates—Poiner notes that there are several thousand species yet to be discovered, in addition to the 25,000 or more known species.

The levels of diversity are perhaps most surprising for microorganisms. It was expected that these organisms would be present in astronomically large numbers—they are thought to account for 50–90% of the biomass in the oceans, as measured by total amount of carbon—but the high degree of genetic divergence found within even relatively small areas was unexpected. “We found there are about 38,000 kinds of bacteria in a litre of sea water,” Poiner said. “We also found that rarity is common, especially for microbes. If you take two separate litre samples of sea water just 10 or 20 kilometres apart, only a small percentage of the 38,000 bacteria types in each one are of the same kind. The challenge now is to find out why most are so rare.”

This mystery is confounded by another result of the census: there is a much greater degree of connectedness than had been expected. Many fish, and even smaller invertebrate species, travel huge distances and navigate with great accuracy, rather like migratory birds. “Pacific white sharks will travel long distances and come back to within 50 metres from where they started,” Poiner said, by way of example.

The behaviour of the sharks was discovered by using new tags, measuring just a few centimetres across, that can be attached to the heads of any large creatures to track their location and measure temperature, conductivity—and thereby salinity—and depth. For smaller creatures, such as baby salmon, a different technology is used that involves the attachment of passive acoustic sensors to their bodies. These trigger a signal when the fish swim through arrays of acoustic receivers that are installed in shallower waters at locations throughout the oceans.

Although tagging and acoustic monitoring are providing new information about the movements and interactions of many species throughout the oceans, the huge task remains of identifying and cataloguing those species. For this, the quickly maturing technique of DNA barcoding has been useful and provides a relatively inexpensive and convenient way of assessing whether a specimen belongs to a new species or not. The method uses a short DNA sequence in the mitochondrial gene for cytochrome c oxidase subunit 1 (CO1)—around 600 base pairs in most species—which differs little within species but significantly between them (Kress & Erickson, 2008).

The Marine Census programme involves several barcoding centres that have determined barcodes for more than 2,000 of the 7,000 known species of holozooplankton, for example (Census of Marine Zooplankton: http://www.cmarz.org). Holozooplankton are small, completely planktonic invertebrates—which spend their lives floating or swimming in open water—and are a particularly sensitive marker of environmental changes such as ocean warming or acidification.

DNA barcoding can also be applied to prokaryotes, although it requires alternative sequences owing to the lack of mitochondria. In addition, horizontal gene transfer and uncertainty about how to define prokaryotic species complicate the task of cataloguing them. Nevertheless, by targeting a suitable core subset of a few genes, bacteria and archaea can be identified quite accurately, and barcoding can increase our knowledge and understanding of their behaviour and evolution.
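In practice, the barcoding approach described above boils down to a distance comparison: a query CO1 sequence is assigned to the reference species with the most similar barcode, provided the distance stays below a within-species threshold. The following is a minimal sketch of that idea; the toy sequences and the 2% default threshold are illustrative assumptions rather than parameters taken from the Census of Marine Life projects, which typically work with aligned ~600-bp fragments and model-corrected distances.

```python
# Minimal sketch of barcode-based identification using uncorrected
# p-distance (proportion of mismatched sites). Reference barcodes and
# the 2% threshold below are illustrative placeholders only.

def p_distance(seq_a: str, seq_b: str) -> float:
    """Proportion of differing sites between two aligned sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    diffs = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    return diffs / len(seq_a)

def assign_species(query: str, references: dict[str, str], threshold: float = 0.02):
    """Return (species, distance) for the closest reference barcode, or
    (None, distance) if even the best match exceeds the threshold, which
    may indicate an undescribed species."""
    best_species, best_dist = None, float("inf")
    for species, barcode in references.items():
        d = p_distance(query, barcode)
        if d < best_dist:
            best_species, best_dist = species, d
    return (best_species, best_dist) if best_dist <= threshold else (None, best_dist)

if __name__ == "__main__":
    # Toy 20-bp 'barcodes' standing in for ~600-bp CO1 fragments.
    refs = {
        "Species A": "ATGCGTACGTTAGCCTAGGA",
        "Species B": "ATGCGTACGTTAGCCTTCCA",
    }
    query = "ATGCGTACGTTAGCCTAGGT"   # one mismatch from Species A
    # Threshold loosened here because the toy sequences are so short.
    print(assign_species(query, refs, threshold=0.10))
```

On real data, the within- and between-species distance distributions would be estimated from the reference library itself rather than fixed in advance, and a corrected distance such as Kimura two-parameter is commonly used instead of a raw mismatch proportion.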
Such techniques could be applied to the identification of marine prokaryotic species, but Poiner argues that they need further refinement and will probably need to be combined with analytical methods that help estimate the total diversity, given that it is impossible to identify every single species at present. Indeed, the task of assessing the diversity of even land-based microorganisms is difficult, but such cataloguing is a prerequisite for accurate assessment of their response to environmental change.

“There is a general rule that the smaller things are the less we know about them,” commented Stephen Blackmore, Regius Keeper of the Royal Botanic Garden in Edinburgh, UK, a leading centre for conservation research. “I think it is very difficult or too early to say how biodiversity at the microscopic level is being impacted. Some of the newer approaches using DNA diversity to see, for example, what microorganisms are present in soil, will be important.”

In the immediate future, advanced DNA analysis techniques have a more urgent application: the identification of genetic diversity within eukaryotic species. This is important because it determines the ability of populations to cope with rapid change: a species with greater genetic diversity is more likely to have individuals with phenotypes capable of surviving changes in habitat, temperature or nutrient availability. Genetic evidence will help to determine the secret of success for many invasive species of plants and animals, as they have already adapted to human influence.

“A major emerging theme is to look at the genetic diversity present in wild plant populations and to try to correlate this with identifying the populations that are best suited for coping with climate change,” Blackmore said. “But it's a very new field and so far not much is being funded. Meanwhile, the immediate prospect is that plants will continue slipping away more or less un-noticed. Even where the landscape appears green there is generally a steady erosion of plant biodiversity going on, driven by the shrinking of natural habitats, the encroachment of invasive species, climate change and land management practices.”

Yet Blackmore is optimistic that knowledge of how to preserve biodiversity is increasing, even for less adaptable species. “We know how to, for example, grow food crops in ways that are more beneficial to biodiversity, but the desire for the cheapest food means that uptake is too limited. We know how to do most of the things needed to protect biodiversity. Unfortunately they are not being done.”

There is hope, though, that increased understanding of biodiversity as a single, interconnected problem—rather than a series of unrelated hot spots and particular species—will lead to more coherent strategies for arresting global decline. The fate of flowering plants, for example, is intimately tied to their pollinators and seed dispersers. Most land animals in turn depend directly or indirectly on plants. “Since plants are the base of the food chain in all terrestrial environments, the threats to animals are increasing even more rapidly than those to the plants they depend upon,” Blackmore noted. “It is still the case, however, that most conservation action is framed in terms of charismatic animals—such as tigers, whales, polar bears and pandas—rather than on the continuation of the kinds of place they require to live in.”

Due to human nature, this ‘cute' framing of the problem is perhaps inevitable. However, if it creates a groundswell of public concern leading to voluntary involvement and donation towards biodiversity conservation, then all species might benefit in the end. After all, animals and plants do not respect arbitrary human boundaries, so an ecological corridor and protected habitat created for tigers will also benefit other, less ‘cuddly' species.

4.
Wolinsky H. EMBO Reports 2011, 12(2): 107–109.
Considering a patient's ethnic background can make some diagnoses easier. Yet, ‘racial profiling' is a highly controversial concept and might soon be replaced by the advent of individualized medicine.

In 2005, the US Food and Drug Administration (FDA; Bethesda, MD, USA) approved BiDil—a combination of vasodilators to treat heart failure—and hailed it as the first drug to specifically treat an ethnic group. “Approval of a drug to treat severe heart failure in self-identified black population is a striking example of how a treatment can benefit some patients even if it does not help all patients,” announced Robert Temple, the FDA's Director of Medical Policy. “The information presented to the FDA clearly showed that blacks suffering from heart failure will now have an additional safe and effective option for treating their condition” (Temple & Stockbridge, 2007). Even the National Medical Association—the African-American counterpart of the American Medical Association—advocated the drug, which was developed by NitroMed, Inc. (Lexington, MA, USA). A new era in medicine based on racial profiling seemed to be in the offing.

By January 2008, however, the ‘breakthrough' had gone bust. NitroMed shut down its promotional campaign for BiDil—a combination of the vasodilators isosorbide dinitrate, which affects arteries and veins, and hydralazine hydrochloride, which predominantly affects arteries. In 2009, it sold its BiDil interests and was itself acquired by another pharmaceutical company.

In the meantime, critics had largely discredited the efforts of NitroMed, thereby striking a blow against the drug, if not the concept of racial profiling or race-based medicine. Jonathan Kahn, a historian and law professor at Hamline University (St Paul, MN, USA), described the BiDil strategy as “a leap to genetics.” He demonstrated that NitroMed, motivated to extend its US patent scheduled to expire in 2007, purported to discover an advantage for a subpopulation of self-identified black people (Kahn, 2009). He noted that NitroMed conducted a race-specific trial to gain FDA approval, but, as there were no comparisons with other populations, it never had conclusive data to show that BiDil worked in black people differently from anyone else.

“If you want to understand heart failure, you look at heart failure, and if you want to understand racial disparities in conditions such as heart failure or hypertension, there is much to look at that has nothing to do with genetics,” Kahn said, adding “that jumping to race as a genetic construct is premature at best and reckless generally in practice.” The USA, he explained, has a century-old tradition of marketing to racial and ethnic groups. “BiDil brought to the fore the notion that you can have ethnic markets not only in things like cigarettes and food, but also in pharmaceuticals,” Kahn commented.

However, despite BiDil's failure, the search for race-based therapies and diagnostics is not over. “What I have found is an increasing, almost exponential, rise in the use of racial and ethnic categories in biotechnology-related patents,” Kahn said. “A lot of these products are still in the pipeline. They're still patent applications, they're not out on the market yet so it's hard to know how they'll play out.”

The growing knowledge of the human genome is also providing new opportunities to market medical products aimed at specific ethnic groups. The first bumpy steps were taken with screening for genetic risk factors for breast cancers. Myriad Genetics (Salt Lake City, UT, USA) holds broad patents in the USA for breast-cancer screening tests that are based on mutations of the BRCA1 and BRCA2 genes, but it faced challenges in Europe, where critics raised concerns about the high costs of screening.

The European Patent Office initially granted Myriad patents for the BRCA1- and BRCA2-based tests in 2001, after years of debate. But it revoked the patent on BRCA1 in 2005, a decision that was again reversed in 2009. In 2005 Myriad decided to narrow the scope of BRCA2 testing on the basis of ethnicity. The company won a patent to predict breast-cancer risk in Ashkenazi Jewish women on the basis of BRCA2 mutations, which occur in one in 100 of these women. Physicians offering the test are supposed to ask their patients whether they are in this ethnic group, and then pay a fee to Myriad.

Kahn said Myriad took this approach to package the test differently in order to protect its financial interests. However, he commented, the idea of ethnic profiling by asking women whether they identify themselves as Ashkenazi Jewish and then paying extra for an ‘ethnic' medical test did not work in Europe. “It's ridiculous,” Kahn commented.

After the preliminary sequence of the human genome was published a decade ago, experts noted that humans were almost the same genetically, implying that race was irrelevant. In fact, the validity of race as a concept in science—let alone the use of the word—has been hotly debated. “Race, inasmuch as the concept ought to be used at all, is a social concept, not a biological one. And using it as though it were a biological one is as much an ethical problem as a scientific problem,” commented Samia Hurst, a physician and bioethicist at Geneva University Medical School in Switzerland.

[Photo: © Monalyn Gracia/Corbis]

Citing a popular slogan, “There is no gene for race,” she noted: “There doesn't seem to be a single cluster of genes that fits with identification within an ethnic group, let alone with disease risks as well. We're also in an increasingly mixed world where many people—and I count myself among them—just don't know what to check on the box. If you start counting up your grandparents and end up with four different ethnic groups, what are you going to do? So there are an increasing number of people who just don't fit into those categories at all.”

Still, some dismiss criticism of racial profiling as political correctness that could potentially prevent patients from receiving proper care. Sally Satel, a psychiatrist in Washington, DC, USA, does not shy away from describing herself as a racially profiling physician and argues that it is good medicine. A commentator and resident scholar at the conservative think tank the American Enterprise Institute (Washington, DC, USA), Satel wrote the book PC, M.D.: How Political Correctness is Corrupting Medicine. “In practicing medicine, I am not color blind. I take note of my patient's race. So do many of my colleagues,” she wrote in a New York Times article entitled “I am a racially profiling doctor” (Satel, 2002).

Satel noted in an interview that it is an undeniable fact that black people tend to have more renal disease, Native Americans have more diabetes and white people have more cystic fibrosis. She said these differences can help doctors to decide which drugs to prescribe at which dose, and could potentially lead researchers to discover new therapies on the basis of race.

Satel added that the mention of race and medicine makes many people nervous. “You can dispel that worry by taking pains to specify biological lineage. Simply put, members of a group have more genes in common than members of the population at large. Some day geneticists hope to be able to conduct genomic profiles of each individual, making group identity irrelevant, but until then, race-based therapeutics has its virtues,” she said. “Denying the relationship between race and medicine flies in the face of clinical reality, and pretending that we are all at equal risk for health problems carries its own dangers.”

However, Hurst contended that this approach may be good epidemiology, rather than racial profiling. Physicians therefore need to be cautious about using skin colour, genomic data and epidemiological data in decision making. “If African Americans are at a higher risk for hypertension, are you not going to check for hypertension in white people? You need to check in everyone in any case,” she commented.

Hurst said European physicians, similarly to their American colleagues, deal with race and racial profiling, albeit in a different way. “The way in which we struggle with it is strongly determined by the history behind what could be called the biases that we have. If you have been a colonial power, if the past is slavery or if the past or present is immigration, it does change some things,” she said. “On the other hand, you always have the difficulty of doing fair and good medicine in a social situation that has a kind of ‘them and us' structure. Because you're not supposed to do medicine in a ‘them and us' structure, you're supposed to treat everyone according to their medical needs and not according to whether they're part of ‘your tribe' or ‘another tribe'.”

Indeed, social factors largely determine one's health, rather than ethnic or genetic factors. Augustus A. White III, an African-American orthopaedic surgeon at Harvard Medical School (Boston, MA, USA) and author of the book Seeing Patients: Unconscious Bias in Health Care, noted that race is linked to disparities in health care in the USA. A similar point can be made in Europe where, for example, Romani people face discrimination in several countries.

White said that although genetic research shows that race is not a scientific concept, the way people are labelled in society and how they are treated needs to be taken into account. “It'd be wonderful at some point if we can pop one's key genetic information into a computer and get a printout of which medications are best for them and which doses are best for them,” he commented. “In the meantime though, I advocate careful operational attempts to treat everyone as human beings and to value everyone's life, not devalue old people, or devalue women, or devalue different religious faiths, etc.”

Notwithstanding the scientific denunciation, a major obstacle for the concept of racial profiling has been the fact that the word ‘race' itself is politically loaded, as a result of, among other things, the baggage of eugenics and Nazi racism and the legacies of slavery and colonialism. Richard Tutton, a sociologist at Lancaster University in the UK, said that British scientists he interviewed for a Wellcome Trust project a few years ago prefer the term ethnicity to race. “Race is used in a legal sense in relation to inequality, but certainly otherwise, ethnicity is the preferred term, which obviously is different to the US,” he said. “I remember having conversations with German academics and obviously in Germany you couldn't use the R-word.”

Jan Helge Solbakk, a physician, theologian and medical ethicist at the University of Oslo in Norway, said the use of the term race in Europe is a non-starter because it makes it impossible for the public and policy-makers to communicate. “I think in Europe it would be politically impossible to launch a project targeting racial differences on the genetic level. The challenge is to find not just a more politically correct concept, but a genetically more accurate concept and to pursue such research questions,” he said. According to Kahn, researchers therefore tend to refer to ethnicity rather than race: “They're talking about European, Asian and African, but they're referring to it as ethnicity instead of race because they think somehow that's more palatable.”

Regardless, race-based medicine might just be a stepping stone towards more refined and accurate methods, with the advent of personalized medicine based on genomics, according to Leroy Hood, whose work has helped to develop tools to analyse the human genome. The focus of his institute—the Institute for Systems Biology (Seattle, WA, USA)—is to identify genetic variants that can inform and help patients to pioneer individualized health care.

“Race as a concept is disappearing with interbreeding,” Hood said. “Race distinction is going to slowly fade away. We can use it now because we have signposts for race, which are colour, fairness, kinkiness of hair, but compared to a conglomeration of things that define a race, those are very few features. The race-defining features are going to be segregating away from one another more and more as the population becomes racially heterogeneous, so I think it's going to become a moot point.”

Hood instead advocates “4P” health care—“Predictive, Personalized, Preventive and Participatory.” “My overall feeling about the race-based correlations is that it is far more important to think about the individual and their individual unique spectra of health and wellness,” he explained. “I think we are not going to deal in the future with racial or ethnic populations; rather, medicine of the future is going to be focused entirely on the individual.”

Yet Arthur Caplan, Director of the Center for Bioethics at the University of Pennsylvania (Philadelphia, PA, USA), is sceptical about the prospects for both race-based and personalized medicine. “Race-based medicine will play a minor role over the next few years in health care because race is a minor factor in health,” he said. “It's not like we have a group of people who keel over dead at 40 who are in the same ethnic group.”

Caplan also argued that establishing personalized genomic medicine in a decade is a pipe dream. “The reason I say that is it's not just the science,” he explained. “You have to redo the whole health-care system to make that possible. You have to find manufacturers who can figure out how to profit from personalized medicine who are both in Europe and the United States. You have to have doctors that know how to prescribe them. It's a big, big revamping. That's not going to happen in 10 years.”

Hood, however, is more optimistic and plans to advance the concept with pilot projects; he believes that Europe might be the better testing ground. “I think the European systems are much more efficient for pioneering personalized medicine than the United States because the US health-care system is utterly chaotic. We have every combination of every kind of health care and health delivery. We have no common shared vision,” he said. “In the end we may well go to Europe to persuade a country to really undertake this. The possibility of facilitating a revolution in health care is greater in Europe than in the United States.”

5.
Crop shortages     
A lack of breeders to apply the knowledge from plant science is jeopardizing public breeding programmes and the training of future plant scientistsIn the midst of an economic downturn, many college and university students in the USA face an uncertain future. There is one crop of graduates, though, who need not worry about unemployment: plant breeders. “Our students start with six-digit salaries once they leave and they have three or four offers. We have people coming to molecular biology and they can''t find jobs. People coming to plant breeding, they have as many jobs as they want,” said Edward Buckler, a geneticist with the US Department of Agriculture''s Agricultural Research Service Institute for Genomic Diversity at Cornell University (Ithaca, NY, USA).The lure of Big Ag depletes universities and research institutes of plant breeders […] and jeopardizes the training of future generations of plant scientists and breedersThe secret behind the success of qualified breeders on the job market is that they can join ‘Big Ag''—big agriculture—that is, major seed companies. Roger Boerma, coordinator of academic research for the Center for Applied Genetic Technologies at the University of Georgia (Athens, GA, USA), said that most of his graduate and postdoctoral students find jobs at companies such as Pioneer, Monsanto and Syngenta, rather than working in the orchards and fields of academic research. According to Todd Wehner, a professor and cucurbit breeder at the Department of Horticultural Science, North Carolina State University (Raleigh, NC, USA), the best-paying jobs—US$100,000 plus good benefits and research conditions—are at seed companies that deal with the main crops (Guner & Wehner, 2003). By contrast, university positions typically start at US$75,000 and tenure track.As a result, Wehner said, public crop breeding in the USA has begun to disappear. “To be clear, there is no shortage of plant breeders,” he said. “There is a shortage of plant breeders in the public sector.” The lure of Big Ag depletes universities and research institutes of plant breeders—who, after all, are the ones who create new plant varieties for agriculture—and jeopardizes the training of future generations of plant scientists and breeders. Moreover, there is an increasing demand for breeders to address the challenge of creating environmentally sustainable ways to grow more food for an increasing human population on Earth.At the same time, basic plant research is making rapid progress. The genomes of most of the main crop plants and many vegetables have been sequenced, which has enabled researchers to better understand the molecular details of how plants fend off pests and pathogens, or withstand drought and flooding. This research has also generated molecular markers—short regions of DNA that are linked to, for example, better resistance to fungi or other pathogens. So-called marker-assisted breeding based on this information is now able to create new plant varieties more effectively than would be possible with the classical strategy of crossing, selection and backcrossing.However, applying the genomic knowledge requires both breeders and plant scientists with a better understanding of each other''s expertise. As David Baulcombe, professor of botany at the University of Cambridge, UK, commented, “I think the important gap is actually in making sure that the fundamental scientists working on genomics understand breeding, and equally that those people doing breeding understand the potential of genomics. 
This is part of the translational gap. There''s incomplete understanding on both sides.”…applying the genomic knowledge requires both breeders and plant scientists with a better understanding of each other''s expertiseIn the genomic age, plant breeding has an image problem: like other hands-on agricultural work, it is dirty and unglamorous. “A research project in agriculture in the twenty-first century resembles agriculture for farmers in the eighteenth century,” Wehner said. “Harvesting in the fields in the summer might be considered one of the worst jobs, but not to me. I''m harvesting cucumbers just like everybody else. I don''t mind working at 105 degrees, with 95% humidity and insects biting my ankles. I actually like that. I like that better than office work.”For most students, however, genomics is the more appealing option as a cutting-edge and glamorous research field. “The exciting photographs that you always see are people holding up glass test tubes and working in front of big computer screens,” Wehner explained.In addition, Wehner said that federal and state governments have given greater priority and funding to molecular genetics than to plant breeding. “The reason we''ve gone away from plant breeding of course is that faculty can get competitive grants for large amounts of money to do things that are more in the area of molecular genetics,” he explained. “Plant breeders have switched over to molecular genetics because they can get money there and they can''t get money in plant breeding.”“The frontiers of science shifted from agriculture to genetics, especially the genetics of corn, wheat and rice,” agreed Richard Flavell, former Director of the John Innes Centre (Norwich, UK) and now Chief Scientific Officer of Ceres (Thousand Oaks, CA, USA). “As university departments have chased their money, chased the bright students, they have [focused on] programmes that pull in research dollars on the frontiers, and plant breeding has been left behind as something of a Cinderella subject.”In the genomic age, plant breeding has an image problem: like other hands-on agricultural work, it is dirty and unglamorousIn a sense, public plant breeding has become a victim of its own success. Wehner explained that over the past century, the protection of intellectual property has created a profitable market for private corporations to the detriment of public programmes. “It started out where they could protect seed-propagated crops,” he said. “The companies began to hire plant breeders and develop their own varieties. And that started the whole agricultural business, which is now huge.”As a result, Wehner said, the private sector can now outmanoeuvre public breeders at will. “[Seed companies] have huge teams that can go much faster than I can go. They have winter nurseries and big greenhouses and lots of pathologists and molecular geneticists and they have large databases and seed technologists and sales reps and catalogue artists and all those things. They can do much faster cucumber breeding than I can. They can beat me in any area that they choose to focus on.”He said that seed corporations turn only to public breeders when they are looking for rare seeds obtained on expeditions around the world or special knowledge. These crops and the breeders and other scientists who work on them receive far less financial support from government than do the more profitable crops, such as corn and soybean. 
In effect, these crops are in an analogous position to orphan drugs that receive little attention because the patients who need them represent a small economic market. The dwindling support for public breeding programmes is also a result of larger political developments. Since the 1980s, when British Prime Minister Margaret Thatcher and US President Ronald Reagan championed the private sector in all things, government has consistently withdrawn support for public research programmes wherever the private sector can profit. “Plant breeding programmes are expensive. My programme costs about US$500,000 a year to run for my crops, watermelon and cucumber. Universities don't want to spend that money if they don't have to, especially if it's already being done by the private sector,” Wehner said. “Over the last 30 years or so, food supplies and food security have fallen off the agenda of policymakers,” Baulcombe explained. “Applied research in academic institutions is disappearing, and so the opportunities for linking the achievements of basic research with applications, at least in the public sector, are disappearing. You've got these two areas of the work going in opposite directions.” There's another problem for plant breeding in the publish-or-perish world of academia. According to Ian Graham, Director of the Centre for Novel Agricultural Products at York University in the UK, potential academics in the plant sciences are turned off by plant breeding as a discipline because it is difficult to publish the research in high-impact journals. Graham, who is funded by the Bill & Melinda Gates Foundation to breed new varieties of Artemisia—the plant that produces the anti-malarial compound artemisinin—said this could change. “Now with the new [genomic] technologies, the whole subject of plant breeding has come back into the limelight. We can start thinking seriously about not just the conventional crops […] but all the marginal crops as well that we can really start employing these technologies on and doing exciting science and linking phenotypes to genes and phenotypes to the underlying biology,” he said. “It takes us back again closer to the science. That will bring more people into plant breeding.” Buckler, who specializes in functional genomic approaches to dissect complex traits in maize, wheat and Arabidopsis, said that public breeding still moves at a slower pace. “The seed companies are trying to figure out how to move genomics from gene discovery all the way to the breeding side. And it's moving forward,” he said. “There have been some real intellectual questions that people are trying to overcome as to how fast to integrate genomics. I think it's starting to occur also with a lot of the public breeders. A lot of it has been that the cost of genotyping, especially for specialty crops, was too high to develop marker systems that would really accelerate breeding.” Things might be about to change on the cost side as well. Buckler said that decreasing costs for sequencing and genotyping will give public breeding a boost. Using today's genomic tools, researchers and plant breeders could match the achievements of the last century in maize breeding within three years.
He said that comparable gains could be made in specialty crops, the forte of public breeding. “Right now, most of the simulations suggest that we can accelerate it about threefold,” Buckler said. “Maybe as our knowledge increases, maybe we can approach a 15-fold rate increase.”Indeed, the increasing knowledge from basic research could well contribute to significant advances in the coming years. “We''ve messed around with genes in a rather blind, sort of non-predictive process,” said Scott Jackson, a plant genomics expert at Purdue University (West Lafayette, IN, USA), who headed the team that decoded the soybean genome (Schmutz et al, 2010). “Having a full genome sequence, having all the genes underlying all the traits in whatever plant organism you''re looking at, makes it less blind. You can determine which genes affect the trait and it has the potential to make it a more predictive process where you can take specific genes in combinations and you can predict what the outcome might be. I think that''s where the real revolution in plant breeding is going to come.”Nevertheless, the main problem that could hold back this revolution is a lack of trained people in academia and the private sector. Ted Crosbie, Head of Plant Breeding at Monsanto (St Louis, MO, USA), commented at the national Plant Breeding Coordinating Committee meeting in 2008 that “[w]e, in the plant breeding industry, face a number of challenges. More plant breeders are reaching retirement age at a time when the need for plant breeders has never been greater […] We need to renew our nation''s capacity for plant breeding.”“…with the new [genomic] technologies, the whole subject of plant breeding has come back into the limelight”Dry bean breeder James Kelly, a professor of crop and soil sciences at Michigan State University (East Lansing, MI, USA), said while there has been a disconnect between public breeders and genomics researchers, new federal grants are designed to increase collaboration.In the meantime, developing countries such as India and China have been filling the gap. “China is putting a huge amount of effort into agriculture. They actually know the importance of food. They have plant breeders all over the place,” Wehner said. “The US is starting to fall behind. And now, agricultural companies are looking around wondering—where are we going to get our plant breeders?”To address the problem, major agriculture companies have begun to fund fellowships to train new plant breeders. Thus far, Buckler said, these efforts have had only a small impact. He noted that 500 new PhDs a year are needed just in maize breeding. “It''s not uncommon for the big companies like Monsanto, Pioneer and Syngenta to spend money on training, on endowing chairs at universities,” Flavell said. “It''s good PR, but they''re serious about the need for breeders.”The US government has also taken some measures to alleviate the problem. Congress decided to establish the US National Institute of Food and Agriculture (Washington, DC, USA) under the auspices of the US Department of Agriculture to make more efficient use of research money, advance the application of plant science and attract new students to plant breeding (see the interview with Roger Beachy in this issue, pp 504–507). 
Another approach is to use distance education to train breeders, such as technicians who want to advance their careers, in certificate programmes rather than master''s or doctorate programmes.“If [breeding] is not done in universities in the public sector, where is it done?”…“If [breeding] is not done in universities in the public sector, where is it done?” Flavell asked about the future of public breeding. “I can wax lyrical and perhaps be perceived as being over the top, but if we''re going to manage this planet on getting more food out of less land, this has to be almost one of the highest things that has got to be taken care of by government.” Wehner added, “The public in the developed world thinks food magically appears in grocery stores. There is no civilization without agriculture. Without plant breeders to work on improving our crops, civilization is at risk.”  相似文献   

6.
Samuel Caddick 《EMBO reports》2008,9(12):1174-1176
  相似文献   

7.
8.
Wolinsky H 《EMBO reports》2010,11(12):921-924
The US still leads the world in stem-cell research, yet US scientists are facing yet another political and legal battle for federal funding to support research using human embryonic stem cells. Disputes over stem-cell research have been standard operating procedure since James Thomson and John Gearhart created the first human embryonic stem-cell (hESC) lines. Their work triggered an intense and ongoing debate about the morality, legality and politics of using hESCs for biomedical research. “Stem-cell policy has caused craziness all over the world. It is a never-ending, irresolvable battle about the moral status [of embryos],” commented Timothy Caulfield, research director of the Health Law Institute at the University of Alberta in Edmonton, Canada. “We're getting to an interesting time in history where science is playing a bigger and bigger part in our lives, and it's becoming more controversial because it's becoming more powerful. We need to make some interesting choices about how we decide what kind of scientific inquiry can go forward and what can't go forward.” The most contested battleground for stem-cell research has been the USA, since President George W. Bush banned federal funding for research that uses hESCs. His successor, Barack Obama, eventually reversed the ban, but a pending lawsuit and the November congressional elections have once again thrown the field into jeopardy. Three days after the election, the deans of US medical schools, chiefs of US hospitals and heads of leading scientific organizations sent letters to both the House of Representatives and the Senate urging them to pass the Stem Cell Research Advancement Act when they come back into session. The implication was to pass legislation now, while the Democrats were still the majority. Republicans, boosted in the election by the emerging fiscally conservative Tea Party movement, will be the majority in the House from January, changing the political climate. The Republicans also cut into the Democratic majority in the Senate. Policies and laws to regulate stem-cell research vary between countries. Italy, for example, does not allow the destruction of an embryo to generate stem-cell lines, but it does allow research on such cells if they are imported. Nevertheless, the Italian government deliberately excluded funding for projects using hESCs from its 2009 call for proposals for stem-cell research. In the face of legislative vacuums, this October, Science Foundation Ireland and the Health Research Board in Ireland decided not to consider grant applications for projects involving hESC lines. The UK is at the other end of the scale; it has legalized both research with and the generation of stem-cell lines, albeit under strict regulation by the independent Human Fertilisation and Embryology Authority. As Caulfield commented, the UK is “ironically viewed as one of the most permissive [on stem-cell policy], but is perceived as one of the most bureaucratic.” Somewhere in the middle is Germany, where scientists are allowed to use several approved cell lines, but any research that leads to the destruction of an embryo is illegal. Josephine Johnston, director of research operations at the Hastings Center in Garrison, NY, USA—a bioethics centre—said: “In Germany you can do research on embryonic stem-cells, but you can't take the cells out of the embryo.
So, they import their cells from outside of Germany and to me, that's basically outsourcing the bit that you find difficult as a nation. It doesn't make a lot of sense ethically.” Despite the public debates and lack of federal support, Johnston noted that the USA continues to lead the world in the field. “[Opposition] hasn't killed stem-cell research in the United States, but it definitely is a headache,” she said. In October, physicians at the Shepherd Center, a spinal cord and brain injury rehabilitation hospital and clinical research centre in Atlanta, GA, USA, began to treat the first patient with hESCs. This is part of a clinical trial to test a stem-cell-based therapy for spinal cord injury, which was developed by the US biotechnology company Geron from surplus embryos from in vitro fertilization. Nevertheless, the debate in the USA, where various branches of government—executive, legislative and judicial—weigh in, is becoming confusing. “We're never going to have consensus [on the moral status of fetuses] and any time that stem-cell research becomes tied to that debate, there's going to be policy uncertainty,” Caulfield said. “That's what's happened again in the United States.” Johnston commented that what makes the USA different is the rules about federally funded and non-federally funded research. “It isn't much discussed within the United States, but it's a really dramatic difference to an outsider,” she said. She pointed out that, by contrast, in other countries the rules for stem-cell research apply across the board. The election of Barack Obama as US President triggered the latest bout of uncertainty. The science community welcomed him with open arms; after all, he supports doubling the budget of the National Institutes of Health (NIH) over the next ten years and dismantled the policies of his predecessor that barred it from funding projects beyond the 60 extant hESC lines—only 21 of which were viable. Obama also called on Congress to provide legal backing and funding for the research. The executive order had unforeseen consequences for researchers working with embryonic or adult stem cells. Sean Morrison, Director of the University of Michigan's Centre for Stem Cell Biology (Ann Arbor, MI, USA), said he thought that Obama's executive order had swung open the door on federal support forever. “Everybody had that impression,” he said. Leonard I. Zon, Director of the Stem Cell Program at Children's Hospital Boston (MA, USA), was so confident in Obama's political will that his laboratory stopped its practice of labelling liquid nitrogen containers as P (Presidential) and NP (non-Presidential) to avoid legal hassles. His lab also stopped purchasing and storing separate pipettes and culture dishes funded by the NIH and private sources such as the Howard Hughes Medical Institute (HHMI; Chevy Chase, MD, USA). But some researchers who focused on adult cells felt that the NIH was now biased in favour of embryonic cells. Backed by pro-life and religious groups, two scientists—James Sherley of the Boston Biomedical Research Institute and Theresa Deisher of AVM Biotechnology (Seattle, WA, USA)—questioned the legality of the new NIH rules and filed a lawsuit against the Department of Health and Human Services (HHS) Secretary, Kathleen Sebelius.
Deisher had founded her company to “[w]ork to provide safe, effective and affordable alternative vaccines and stem-cell therapies that are not tainted by embryonic or electively aborted fetal materials” (www.avmbiotech.com). Sherley argued in an Australian newspaper in October 2006 that the science behind embryonic stem-cell research is flawed and rejected arguments that the research will make available new cures for terrible diseases (Sherley, 2006). In court, the researchers also argued that they were irreparably disadvantaged in competing for government grants by their work on adult stem cells. Judge Royce C. Lamberth of the US District Court for the District of Columbia initially ruled that the plaintiffs had no grounds on which to sue. However, the US Court of Appeals for the District of Columbia Circuit overturned his decision, finding that the plaintiffs did have standing to sue “[b]ecause the Guidelines have intensified the competition for a share in a fixed amount” of NIH funding. With the case back in his court, Lamberth reversed his decision on August 23 this year, granting a preliminary injunction to block the new NIH guidelines on embryonic stem-cell work. The injunction rests on the 1995 Dickey-Wicker Amendment, an appropriation bill rider, which prohibits the HHS from funding “research in which a human embryo or embryos are destroyed, discarded or knowingly subjected to risk of injury or death.” By allowing the destruction of embryos, Lamberth argued, the NIH rules violate the law. This triggered another wave of uncertainty as dozens of labs faced a freeze of federal funding. Morrison commented that an abrupt end to funding does not normally occur in biomedical research in the USA. “We normally have years of warning when grants are going to end so we can make a plan about how we can have smooth transitions from one funding source to another,” he said. Morrison—whose team has been researching Hirschsprung disease, a congenital enlargement of the colon—said his lab potentially faced a loss of US$250,000 overnight. “I e-mailed the people in my lab and said, ‘We may have just lost this funding and if so, then the project is over’.” Morrison explained that the positions of two people in his lab were affected by the cut, along with a third person whose job was partly funded by the grant. “Even though it's only somewhere between 10–15% of the funding in my lab, it's still a lot of money,” he said. “It's not like we have hundreds of thousands of dollars of discretionary funds lying around in case a problem like that comes up.” Zon noted that his lab, which had experienced an increase in the pace of discovery since Obama signed his order, reverted to its Bush-era practices. On September 27 this year, a federal appeals court for the District of Columbia extended its stay of Lamberth's injunction to enable the government to pursue its appeal. The NIH was allowed to distribute US$78 million earmarked for 44 scientists during the appeal. The court said the matter should be expedited, but it could, over the years ahead, make its way to the US Supreme Court. The White House welcomed the decision of the appeals court in favour of the NIH. “President Obama made expansion of stem-cell research and the pursuit of groundbreaking treatments and cures a top priority when he took office.
We''re heartened that the court will allow [the] NIH and their grantees to continue moving forward while the appeal is resolved,” said White House press secretary Robert Gibbs. The White House might have been glad of some good news, while it wrestles with the worst economic downturn since the Great Depression and the rise of the Tea Party movement.Even without a formal position on the matter, the Tea Party has had an impact on stem-cell research through its electoral victoriesTimothy Kamp, whose lab at the University of Wisconsin (Madison, WI, USA) researches embryonic stem-cell-derived cardiomyocytes, said that he finds the Tea Party movement confusing. “It''s hard for me to know what a uniform platform is for the Tea Party. I''ve heard a few comments from folks in the Tea Party who have opposed stem-cell research,” he said.However, the position of the Tea Party on the topic of stem-cell research could prove to be of vital importance. The Tea Party took its name from the Boston Tea Party—a famous protest in 1773 in which American colonists protested against the passing of the British Tea Act, for its attempt to extract yet more taxes from the new colony. Protesters dressed up as Native Americans and threw tea into the Boston harbour. Contemporary Tea Party members tend to have a longer list of complaints, but generally want to reduce the size of government and cut taxes. Their increasing popularity in the USA and the success of many Tea Party-backed Republican candidates for the upcoming congressional election could jeopardize Obama''s plans to pass new laws to regulate federal funding for stem-cell research.Even without a formal position on the matter, the Tea Party has had an impact on stem-cell research through its electoral victories. Perhaps their most high-profile candidate was the telegenic Christine O''Donnell, a Republican Senatorial candidate from Delaware. The Susan B. Anthony List, a pro-life women''s group, has described O''Donnell as one of “the brightest new stars” opposing abortion (www.lifenews.com/state5255.html). Although O''Donnell was eventually defeated in the 2 November congressional election, by winning the Republican primary in August, she knocked out nine-term Congressman and former Delaware governor Mike Castle, a moderate Republican known for his willingness to work with Democrats to pass legislation to protect stem-cell research.In the past, Castle and Diane DeGette, a Democratic representative from Colorado, co-sponsored the Stem Cell Research Advancement Act to expand federal funding of embryonic stem-cell research. They aimed to support Obama''s executive order and “ensure a lasting ethical framework overseeing stem cell research at the National Institutes of Health”.Morrison described Castle as “one of the great public servants in this country—no matter what political affiliation you have. For him to lose to somebody with such a chequered background and such shaky positions on things like evolution and other issues is a tragedy for the country.” Another stem-cell research advocate, Pennsylvania Senator Arlen Specter, a Republican-turned-Democrat, was also defeated in the primary. He had introduced legislation in September to codify Obama''s order. 
Specter, a cancer survivor, said his legislation is aimed at removing the “great uncertainty in the research community”. According to Sarah Binder, a political scientist at George Washington University in Washington, DC, the chances of passing legislation to codify the Obama executive order are decreasing: “As the Republican Party becomes more conservative and as moderates can't get nominated in that party, it does lead you to wonder whether it's possible to make anything happen [with the new Congress] in January.” There are a variety of opinions about how the outcome of the November elections will influence stem-cell policies. Binder said that a number of prominent Republicans have strongly promoted stem-cell research, including the Reagan family. “This hasn't been a purely Democratic initiative,” she said. “The question is whether the Republican party has moved sufficiently to the right to preclude action on stem cells.” Historically there was “massive” Republican support for funding bills in 2006 and 2007 that were ultimately vetoed by Bush, she noted. “Rightward shifts in the House and Senate do not bode well for legislative efforts to entrench federal support for stem-cell research,” Binder said. “First, if a large number of Republicans continue to oppose such funding, a conservative House majority is unlikely to pursue the issue. Second, Republican campaign commitments to reduce federal spending could hit the NIH and its support for stem-cell research hard.” Binder added that “a lingering unknown” is how the topic will be framed: “If it gets framed as a pro-choice versus pro-life initiative, that's quite difficult for Congress to overcome in a bipartisan way. If it is framed as a question of medical research and medical breakthroughs and scientific advancement, it won't fall purely on partisan lines. If members of Congress talk about their personal experiences, such as having a parent affected by Parkinson's, then you could see even pro-life members voting in favour of a more expansive interpretation of stem-cell funding.” Johnston said that Congress could alter the wording of the Dickey-Wicker Amendment when passing the NIH budget for 2011 to remove the conflict. “You don't have to get rid of the amendment completely, but you could rephrase it,” she said. She also commented that the public essentially supports embryonic stem-cell research. “The polls and surveys show the American public is morally behind there being some limited form of embryonic stem-cell research funded by federal money. They don't favour cloning. There is not a huge amount of support for creating embryos from scratch for research. But there seems to be pretty wide support among the general public for the kind of embryonic stem-cell research that the NIH is currently funding.” In the end, however, the debate about public funding for stem-cell research is only part of the picture, given the role of private business and states. Glenn McGee, a professor at the Center for Practical Bioethics in Kansas City, MO, USA, and editor of the American Journal of Bioethics, commented that perhaps too much emphasis is being put on federal funding. He said that funding from states such as California and from industry—neither of which is restricted—has become a more important force than NIH funding.
“We''re a little bit delusional if we think that this is a moment where the country is making a big decision about what''s going to happen with stem cells,” he said. “I think that ship has sailed.”  相似文献   

9.
Suran M 《EMBO reports》2011,12(1):27-30
Few environmental disasters are as indicting of humanity as major oil spills. Yet Nature has sometimes shown a remarkable ability to clean up the oil on its own. In late April 2010, the BP-owned semi-submersible oil rig known as Deepwater Horizon exploded just off the coast of Louisiana. Over the following 84 days, the well from which it had been pumping spewed 4.4 million barrels of crude oil into the Gulf of Mexico, according to the latest independent report (Crone & Tolstoy, 2010). In August, the US Government released an even grimmer estimate: according to the federal Flow Rate Technical Group, up to 4.9 million barrels were released during the course of the disaster. Whatever the actual figure, images from NASA show that around 184.8 million gallons of oil have darkened the waters just 80 km from the Louisiana coast, where the Mississippi Delta harbours marshlands and an abundance of biodiversity (NASA Jet Propulsion Laboratory, 2010; Fig 1). [Figure 1: Images of the Deepwater Horizon oil slick in the Gulf of Mexico, recorded by NASA's Terra spacecraft in May 2010. The image dimensions are 346 × 258 kilometres and North is toward the top. In the upper panel, the oil appears bright turquoise owing to the combination of images used from the Multi-angle Imaging SpectroRadiometer (MISR) aboard the craft; the Mississippi Delta is visible in the top left, the white arrow points to a plume of smoke and the red cross-hairs indicate the former location of the drilling rig. The lower two panels are enlargements of the smoke plume, which is probably the result of controlled burning of collected oil on the surface. © NASA/GSFC/LaRC/JPL, MISR Team.] The resulting environmental and economic situation in the Gulf is undoubtedly dreadful—the shrimp-fishing industry has been badly hit, for example. Yet the Deepwater incident is not the first time that a massive oil spill has devastated marine and terrestrial ecosystems, nor is it likely to be the last. In fact, the US National Oceanic and Atmospheric Administration (NOAA) deals with approximately 300 oil spills per year and the Deepwater catastrophe—despite its extent and the enormous amount of oil released—might not be as terrible for the environment as was originally feared. Jacqueline Michel, a geochemist who has worked on almost every major oil spill since the 1970s and who is a member of NOAA's scientific support team for the Gulf spill, commented that “the marshes and grass are showing some of the highest progresses of [oil] degradation because of the wetness.” This rapid degradation is partly due to an increased number of oil-consuming microbes in the water, whose population growth in response to the spill is cleaning things up at a relatively fast pace (Hazen et al, 2010). It therefore seems that, however bad the damage, Nature's capacity to repair itself might prevent the unmitigated disaster that many feared on first sight of the Deepwater spill. As the late social satirist George Carlin (1937–2008) once put it: “The planet will shake us off like a bad case of fleas, a surface nuisance[.]
The planet will be here for a long, long—LONG—time after we''re gone, and it will heal itself, it will cleanse itself, because that''s what it does, it''s a self-correcting system.”Michel believes that there are times when it is best to leave nature alone. In such cases the oil will degrade naturally by processes as simple as exposure to sunlight—which can break it down—or exposure to the air—which evaporates many of its components. “There have been spills where there was no response because we knew we were going to cause more harm,” Michel said. “Although we''re going to remove heavier layers of surface oil [in this case], the decision has been made to leave oil on the beach because we believe it will degrade in a timescale of months […] through natural processing.”To predict the rate of general environmental recovery, Michel said one should examine the area''s fauna, the progress of which can be very variable. Species have different recovery rates and although it takes only weeks or months for tiny organisms such as plankton to bounce back to their normal population density, it can take years for larger species such as the endangered sea turtle to recover.…however bad the damage, Nature''s capacity to repair itself might prevent the unmitigated disaster that many feared on first sight…Kimberly Gray, professor of environmental chemistry and toxicology at Northwestern University (Evanston, IL, USA), is most concerned about the oil damaging the bottom of the food chain. “Small hits at the bottom are amplified as you move up,” she explained. “The most chronic effects will be at the base of the food chain […] we may see lingering effects with the shrimp population, which in time may crash. With Deepwater, it''s sort of like the straw that broke the shrimp''s back.”Wetlands in particular are a crucial component of the natural recovery of ecosystems, as they provide flora that are crucial to the diets of many organisms. They also provide nesting grounds and protective areas where fish and other animals find refuge from predation. “Wetlands and marsh systems are Nature''s kidneys and they''ve been damaged,” Gray said. The problem is exacerbated because the Louisiana wetlands are already stressed in the aftermath of Hurricane Katrina, which devastated the Gulf coast in August 2005, and because of constant human activity and environmental damage. As Gray commented, “Nature has a very powerful capacity to repair itself, but what''s happening in the modern day is assault after assault.”Ron Thom, a marine ecologist at Pacific Northwest National Laboratory—a US government-funded research facility (Richland, WA, USA)—has done important research on coastal ecosystems. He believes that such habitats are able to decontaminate themselves to a limited degree because of evolution. “[Coastal-related ecosystems are] pretty resilient because they''ve been around a long time and know how to survive,” he said.As a result, wetlands can decontaminate themselves of pollutants such as oil, nitrate and phosphate. However, encountering large amounts of pollutants in a short period of time can overwhelm the healing process, or even stop it altogether. “We did some experiments here in the early 90s looking at the ability for salt marshes to break down oil,” Thom said. “When we put too much oil on the surface of the marsh it killed everything.” He explained that the oil also destroyed the sediment–soil column, where plant roots are located. Eventually, the roots disintegrated and the entire soil core fell apart. 
According to Thom, the Louisiana marshes were weakened by sediment and nutrient starvation, which suggests that the Deepwater spill destroyed below-ground material in some locations. “You can alter a place through a disturbance so drastic that it never recovers to what it used to be because things have changed so much,” he said.“Nature has a very powerful capacity to repair itself, but what''s happening in the modern day is assault after assault”Michael Blum, a coastal marsh ecologist at Tulane University in New Orleans, said that it is hard to determine the long-term effects of the oil because little is known about the relevant ecotoxicology—the effect of toxic agents on ecosystems. He has conducted extensive research on how coastal marsh plants respond to stress: some marshes might be highly susceptible to oil whereas others could have evolved to deal with natural oil seepage to metabolize hydrocarbons. In the former, marshes might perish after drastic exposure to oil leading to major shifts in plant communities. In the latter case, the process of coping with oil could involve the uptake of pollutants in the oil—known as polycyclic aromatic hydrocarbons (PAHs)—and their reintroduction into the environment. “If plants are growing in the polluted sediments and tapping into those contaminated sources, they can pull that material out of the soil and put it back into the water column or back into the leaf tissue that is a food source for other organisms,” Blum explained.In addition to understanding the responses of various flora, scientists also need to know how the presence of oil in an ecosystem affects the fauna. One model that is used to predict the effects of oil on vertebrates is the killifish; a group of minnows that thrive in the waters of Virginia''s Elizabeth River, where they are continuously exposed to PAHs deposited in the water by a creosote factory (Meyer & Di Giulio, 2003). “The killifish have evolved tolerance to the exposure of PAHs over chronic, long-term conditions,” Blum said. “This suggests that something similar may occur elsewhere, including in Gulf Coast marshes exposed to oil.”Although Michel is optimistic about the potential for environmental recovery, she pointed out that no two spills are the same. “There are lot of things we don''t know, we never had a spill that had surface release for so long at this water depth,” she said. Nevertheless, to better predict the long-term effects, scientists have turned to data from similar incidents.In 1989, the petroleum tanker Exxon Valdez struck Bligh Reef off the coast of Prince William Sound in Alaska and poured a minimum of 11 million gallons of oil into the water—enough to fill 125 Olympic-sized swimming pools. Senior scientist at NOAA, Stanley Rice of Juno, Alaska, studies the long-term effects of the spill and the resulting oil-related issues in Prince William Sound. Rice has worked with the spill since day 3 and, 20 years later, he is seeing major progress. “I never want to give the impression that we had this devastating oil spill in 1989 and it''s still devastating,” he said. “We have pockets of a few species where lingering oil hurts their survival, but in terms of looking at the Sound in its entirety […] it''s done a lot of recovery in 20 years.”…little is known about the relevant ecotoxicology—the effect of toxic agents on ecosystemsDespite the progress, Rice is still concerned about one group of otters. 
The cold temperature of the water in the Sound—rarely above 5 °C—slows the disintegration of the oil and, every so often, the otters come in contact with a lingering pocket. When they are searching for food, for example, the otters often dig into pits containing oil and become contaminated, which damages their ability to maintain body temperature. As a result, they cannot catch as much food and starve because they need to consume the equivalent of 25% of their body weight every day (Rice, 2009). “Common colds or worse, pneumonia, are extremely debilitating to an animal that has to work literally 365 days a year, almost 8 to 12 hours a day,” Rice explained. “If they don't eat enough to sustain themselves, they die of hypothermia.” Nevertheless, in just the last two years, Rice has finally seen the otter population rebound. Unlike the otters, one pod of orca whales has not been so lucky. Since it no longer has any reproductive females, the pod will eventually become extinct. However, as it dies out, orca prey such as seals and otters will have a better chance of reproducing. “There are always some winners and losers in these types of events,” Rice said. “Nature is never static.” The only ‘loser’ that Rice is concerned about at the moment is the herring, as many of their populations have remained damaged for the past 20 years. “Herring are critical to the ecosystem,” he said. “[They are] a base diet for many species […] Prince William Sound isn't fully recovered until the herring recover.” North America is not alone in dealing with oil-spill disasters—Europe has had plenty of experience too. One of the worst spills occurred when the oil tanker Prestige leaked around 20 million gallons of oil into the waters of the Galician coast in Northern Spain in 2002. This also affected the coastline of France and is considered Spain's worst ecological disaster. “The impacts of the Prestige were indeed severe in comparison with other spills around the world,” said attorney Xabier Ezeizabarrena, who represented the Fishermen Guilds of Gipuzkoa in a lawsuit relating to the spill. “Some incidents aren't even reported, but in the European Union the ratio is at least one oil spill every six months.” In Ezeizabarrena's estimation, Spanish officials did not respond appropriately to the leak. The government was denounced for towing the shipwreck further out into the Atlantic Ocean—where it eventually sank—rather than to a port. “There was a huge lack of measures and tools from the Spanish government in particular,” Ezeizabarrena said. “[However], there was a huge response from civil society […] to work together [on restoration efforts].” Ionan Marigómez, professor of cellular biology at the University of the Basque Country, Spain, was the principal investigator on a federal coastal-surveillance programme named Orbankosta. He recorded the effects of the oil on the Basque coast and was a member of the Basque government's technical advisory commission for the response to the Prestige spill. He was also chair of the government's scientific committee. “Unfortunately, most of us scientists were not prepared to answer questions related to the biological impact of restoration strategies,” Marigómez said.
“We lacked data to support our advice since continued monitoring is not conducted in the area […] and most of us had developed our scientific activity with too much focus on each one''s particular area when the problem needed a holistic view.”…the world consumes approximately 31 billion barrels of oil per year; more than 700 times the amount that leaked during the Deepwater spillFor disasters involving oil, oceanographic data to monitor and predict the movement of the spill is essential. Clean-up efforts were initially encouraged in Spain, but data provided by coastal-inspection programmes such as Orbankosta informed the decision to not clean up the Basque shoreline, allowing the remaining oil debris to disintegrate naturally. In fact, the cleaning activity that took place in Galicia only extended the oil pollution to the supralittoral zone—the area of the beach splashed by the high tide, rather than submerged by it—as well as to local soil deposits. On the Basque coast, restoration efforts were limited to regions where people were at risk, such as rocky areas near beaches and marinas.Eight years later, Galicia still suffers from the after-effects of the Prestige disaster. Thick subsurface layers of grey sand are found on beaches, sometimes under sand that seems to be uncontaminated. In Corme-Laxe Bay and Cies Island in Galicia, PAH levels have decreased. Studies have confirmed, however, that organisms exposed to the area''s sediments had accumulated PAHs in their bodies. Marigómez, for example, studied the long-term effects of the spill on mussels. Depending on their location, PAH levels decreased in the sampled mussel tissue between one and two years after the spill. However, later research showed that certain sites suffered later increases in the level of PAHs, due to the remobilization of oil residues (Cajaraville et al, 2006). Indeed, many populations of macroinvertebrate species—which are the keystones of coastal ecosystems—became extinct at the most-affected locations, although neighbouring populations recolonized these areas. The evidence suggests that only time will tell what will happen to the Galicia ecosystem. The same goes for oil-polluted environments around the world.The concern whether nature can recover from oil spills might seem extreme, considering that oil is a natural product derived from the earth. But too much of anything can be harmful and oil would remain locked underground without human efforts to extract it. “As from Paracelsus'' aphorism, the dose makes the poison,” Marigómez said.According to the US Energy Information Administration, the world consumes approximately 31 billion barrels of oil per year; more than 700 times the amount that leaked during the Deepwater spill. Humanity continues, in the words of some US politicians, to “drill, baby, drill!” On 12 October 2010, less than a year after the Gulf Coast disaster, US President Barack Obama declared that he was lifting the ban on deepwater drilling. It appears that George Carlin got it right again when he satirized a famous American anthem: “America, America, man sheds his waste on thee, and hides the pines with billboard signs from sea to oily sea!”  相似文献   

10.
Geneticists and historians collaborated recently to identify the remains of King Richard III of England, found buried under a car park. Genetics has many more contributions to make to history, but scientists and historians must learn to speak each other''s languages.The remains of King Richard III (1452–1485), who was killed with sword in hand at the Battle of Bosworth Field at the end of the War of the Roses, had lain undiscovered for centuries. Earlier this year, molecular biologists, historians, archaeologists and other experts from the University of Leicester, UK, reported that they had finally found his last resting place. They compared ancient DNA extracted from a scoliotic skeleton discovered under a car park in Leicester—once the site of Greyfriars church, where Richard was rumoured to be buried, but the location of which had been lost to time—with that of a seventeenth generation nephew of King Richard: it was a match. Richard has captured the public imagination for centuries: Tudor-friendly playwright William Shakespeare (1564–1616) portrayed Richard as an evil hunchback who killed his nephews in order to ascend to the throne, whilst in succeeding years others have leapt to his defence and backed an effort to find his remains.The application of genetics to history is revealing much about the ancestry and movements of groups of humans, from the fall of the Roman Empire to ancient ChinaMolecular biologist Turi King, who led the Leicester team that extracted the DNA and tracked down a descendant of Richard''s older sister, said that Richard''s case shows how multi-disciplinary teams can join forces to answer history''s questions. “There is a lot of talk about what meaning does it have,” she said. “It tells us where Richard III was buried; that the story that he was buried in Greyfriars is true. I think there are some people who [will] try and say: “well, it''s going to change our view of him” […] It won''t, for example, tell us about his personality or if he was responsible for the killing of the Princes in the Tower.”The discovery and identification of Richard''s skeleton made headlines around the world, but he is not the main prize when it comes to collaborations between historians and molecular biologists. Although some of the work has focused on high-profile historic figures—such as Louis XVI (1754–1793), the only French king to be executed, and Vlad the Impaler, the Transylvanian royal whose patronymic name inspired Bram Stoker''s Dracula (Fig 1)—many other projects involve population studies. Application of genetics to history is revealing much about the ancestry and movements of groups of humans, from the fall of the Roman Empire to ancient China.Open in a separate windowFigure 1The use of molecular genetics to untangle history. Even when the historical record is robust, molecular biology can contribute to our understanding of important figures and their legacies and provide revealing answers to questions about ancient princes and kings.Medieval historian Michael McCormick of Harvard University, USA, commented that historians have traditionally relied on studying records written on paper, sheepskin and papyrus. However, he and other historians are now teaming up with geneticists to read the historical record written down in the human genome and expand their portfolio of evidence. 
“What we''re seeing happening now—because of the tremendous impact from the natural sciences and particularly the application of genomics; what some of us are calling genomic archaeology—is that we''re working back from modern genomes to past events reported in our genomes,” McCormick explained. “The boundaries between history and pre-history are beginning to dissolve. It''s a really very, very exciting time.”…in the absence of written records, DNA and archaeological records could help fill in gapsMcCormick partnered with Mark Thomas, an evolutionary geneticist at University College London, UK, to try to unravel the mystery of one million Romano-Celtic men who went missing in Britain after the fall of the Roman Empire. Between the fourth and seventh centuries, Germanic tribes of Angles, Saxons and Jutes began to settle in Britain, replacing the Romano-British culture and forcing some of the original inhabitants to migrate to other areas. “You can''t explain the predominance of the Germanic Y chromosome in England based on the population unless you imagine (a) that they killed all the male Romano-Celts or (b) there was what Mark called ‘sexual apartheid'' and the conquerors mated preferentially with the local women. [The latter] seems to be the best explanation that I can see,” McCormick said of the puzzle.Ian Barnes, a molecular palaeobiologist at Royal Holloway University of London, commented that McCormick studies an unusual period, for which both archaeological and written records exist. “I think archaeologists and historians are used to having conflicting evidence between the documentary record and the archaeological record. If we bring in DNA, the goal is to work out how to pair all the information together into the most coherent story.”Patrick Geary, Professor of Western Medieval History at the Institute for Advanced Study in Princeton, New Jersey, USA, studies the migration period of Europe: a time in the first millennium when Germanic tribes, including the Goths, Vandals, Huns and Longobards, moved across Europe as the Roman Empire was declining. “We do not have detailed written information about these migrations or invasions or whatever one wants to call them. Primarily what we have are accounts written later on, some generations later, from the contemporary record. What we tend to have are things like sermons bemoaning the faith of people because God''s wrath has brought the barbarians on them. Hardly the kind of thing that gives us an idea of exactly what is going on—are these really invasions, are they migrations, are they small military groups entering the Empire? And what are these ‘peoples'': biologically related ethnic groups, or ad hoc confederations?” he said.Geary thinks that in the absence of written records, DNA and archaeological records could help fill in the gaps. He gives the example of jewellery, belt buckles and weapons found in ancient graves in Hungary and Northern and Southern Italy, which suggest migrations rather than invasions: “If you find this kind of jewellery in one area and then you find it in a cemetery in another, does it mean that somebody was selling jewellery in these two areas? Does this mean that people in Italy—possibly because of political change—want to identify themselves, dress themselves in a new style? This is hotly debated,” Geary explained. Material goods can suggest a relationship between people but the confirmation will be found in their DNA. 
“These are the kinds of questions that nobody has been able to ask because until very recently, DNA analysis simply could not be done and there were so many problems with it that this was just hopeless,” he explained. Geary has already collected some ancient DNA samples and plans to collect more from burial sites north and south of the Alps dating from the sixth century, hoping to sort out kinship relations and genetic profiles of populations.King said that working with ancient DNA is a tricky business. “There are two reasons that mitochondrial DNA (mtDNA) is the DNA we wished to be able to analyse in [King] Richard. In the first instance, we had a female line relative of Richard III and mtDNA is passed through the female line. Fortunately, it''s also the most likely bit of DNA that we''d be able to retrieve from the skeletal remains, as there are so many copies of it in the cell. After death, our DNA degrades, so mtDNA is easier to retrieve simply due to the sheer number of copies in each cell.”Geary contrasted the analysis of modern and ancient DNA. He called modern DNA analysis “[…] almost an industrial thing. You send it off to a lab, you get it back, it''s very mechanical.” Meanwhile, he described ancient DNA work as artisanal, because of degeneration and contamination. “Everything that touched it, every living thing, every microbe, every worm, every archaeologist leaves DNA traces, so it''s a real mess.” He said the success rate for extracting ancient mtDNA from teeth and dense bones is only 35%. The rate for nuclear DNA is only 10%. “Five years ago, the chances would have been zero of getting any, so 10% is a great step forward. And it''s possible we would do even better because this is a field that is rapidly transforming.”But the bottleneck is not only the technical challenge to extract and analyse ancient DNA. Historians and geneticists also need to understand each other better. “That''s why historians have to learn what it is that geneticists do, what this data is, and the geneticists have to understand the kind of questions that [historians are] trying to ask, which are not the old nineteenth century questions about identity, but questions about population, about gender roles, about relationship,” Geary said.DNA analysis can help to resolve historical questions and mysteries about our ancestors, but both historians and geneticists are becoming concerned about potential abuses and frivolous applications of DNA analysis in their fields. Thomas is particularly disturbed by studies based on single historical figures. “Unless it''s a pretty damn advanced analysis, then studying individuals isn''t particularly useful for history unless you want to say something like this person had blue eyes or whatever. Population level studies are best,” he said. He conceded that the genetic analysis of Richard III''s remnants was a sound application but added that this often is not the case with other uses, which he referred to as “genetic astrology.” He was critical of researchers who come to unsubstantiated conclusions based on ancient DNA, and scientific journals that readily publish such papers.…both historians and geneticists are becoming concerned about potential abuses or frivolous applications of DNA analysis in their fieldsThomas said that it is reasonable to analyse a Y chromosome or mtDNA to estimate a certain genetic trait. 
“But then to look at the distribution of those, note in the tree where those types are found, and informally, interpretively make inferences—‘Well this must have come from here and therefore when I find it somewhere else then that means that person must have ancestors from this original place'—[…] that's deeply flawed. It's the most widely used method for telling historical stories from genetic data. And yet it is easily the one with the least credibility.” Thomas criticized such facile use of genetic data, which misleads the public and the media. “I suppose I can't blame these [broadcast] guys because it's their job to make the programme look interesting. If somebody comes along and says ‘well, I can tell you you're descended from some Viking warlord or some Celtic princess', then who are they to question.”

Similarly, the historians have reservations about making questionable historical claims on the basis of DNA analysis. Geary said the use of mtDNA to identify Richard III was valuable because it answered a specific, factual question. However, he is turned off by other research using DNA to look at individual figures, such as a case involving a princess who was a direct descendant of the woman who posed for Leonardo Da Vinci's Mona Lisa. “There's some people running around trying to dig up famous people and prove the obvious. I think that's kind of silly. There are others that I think are quite appropriate, and while it is not my kind of history, I think it is fine,” he said. “The Richard III case was in the tradition of forensics.”

Nicola Di Cosmo, a historian at the Institute for Advanced Study, who is researching the impact of climate change on the thirteenth century Mongol empire, follows closely the advances in DNA and history research, but has not yet applied it to his own work. Nevertheless, he said that genetics could help to understand the period he studies because there are no historical documents, although monumental burials exist. “It is important to get a sense of where these people came from, and that's where genetics can help,” he said. He is also concerned about geneticists who publish results without involving historians and without examining other records. He cited a genetic study of a so-called ‘Eurasian male' in a prestige burial of the Asian Hun Xiongnu, a nomadic people who at the end of the third century B.C. formed a tribal league that dominated most of Central Asia for more than 500 years. “The conclusion the geneticists came to was that there was some sort of racial tolerance in this nomadic empire, but we have no way to even assume that they had any concept of race or tolerance.”

Di Cosmo commented that the cases in which historians and archaeologists work with molecular biologists are rare and remain disconnected in general from the mainstream of historical or archaeological research. “I believe that historians, especially those working in areas for which written records are non-existent, ought to be taking seriously the evidence churned out by genetic laboratories.
On the other hand, geneticists must realize that the effectiveness of their research is limited unless they access reliable historical information and understand how a historical argument may or may not explain the genetic data” [1].

Notwithstanding the difficulties in collaboration between the two fields, McCormick is excited about historians working with DNA. He said the intersection of history and genomics could create a new scientific discipline in the years ahead. “I don't know what we'd call it. It would be a sort of fusion science. It certainly has the potential to produce enormous amounts of enormously interesting new evidence about our human past.”

11.
“LAUGHING GAS is the newest thing for kids seeking kicks,” the Stanford Daily reports. “They sniff it.”

So begins a news story in the Los Angeles Times of 26 January 1967. The story continues:

“It's the latest way to travel, or so say a growing group of devotees on the campus,” the university student paper said. “It can produce much the same effects as psychedelic drugs, they claim, and it's cheaper to obtain.”

“One student said he buys the gas, nitrous oxide, from a medical supply house. ‘They think I am anesthetizing rats,' he explained.

“Campus medical authorities said the gas, sniffed ‘in sufficient amounts... could produce all the states of anesthesia, including the final stage—death.'”

12.
More than a blog     
Wolinsky H 《EMBO reports》2011,12(11):1102-1105
Blogging is circumventing traditional communication channels and levelling the playing field of science communication. It helps scientists, journalists and interested laypeople to make their voices heard.

Last December, astrobiologists reported in the journal Science that they had discovered the first known microorganism on Earth capable of growing and reproducing by using arsenic (Wolfe-Simon et al, 2010). While media coverage went wild, the paper was met with a resounding public silence from the scientific community. That is, until a new breed of critic, science bloggers, weighed in. Leading the pack was Rosie Redfield, who runs a microbiology research lab in the Life Sciences Centre at the University of British Columbia in Vancouver, Canada. She posted a critique of the research to her blog, RRResearch (rrresearch.fieldofscience.com), which went viral. Redfield said that her site, which is typically a quiet window on activities in her lab, got 100,000 hits in a week.

This incident, like a handful before it and probably more to come, has raised the profile of science blogging and the freedom that the Internet offers to express an opinion and reach a broad audience. Yet it also raises questions about the validity of unfettered opinion and personal bias, and the ability to publish online with little editorial oversight and few checks and balances.

Redfield certainly did not hold back in her criticism of the paper. Her post said of the arsenic study: “Lots of flim-flam, but very little reliable information. [...] If this data was presented by a PhD student at their committee meeting, I'd send them back to the bench to do more clean-up and controls.” She also opined on why the article was published: “I don't know whether the authors are just bad scientists or whether they're unscrupulously pushing NASA's ‘There's life in outer space!' agenda. I hesitate to blame the reviewers, as their objections are likely to have been overruled by Science's editors in their eagerness to score such a high-impact publication.”

Despite the fervor and immediacy of the blogosphere, it took Science and Felisa Wolfe-Simon, the lead author on the paper, nearly six months to respond in print. Eventually, eight letters appeared in Science covering various aspects of the controversy, including one from Redfield, who is now studying the bacteria in her lab. Bruce Alberts, editor-in-chief of Science, downplayed the role that blogging played in drumming up interest in the controversial study. “I am sure that the number of letters sent to us via our website reflected a response to the great publicity the article received, some of it misleading [...] This number was also likely expanded by the blogging activity, but it was not directly connected to the blogs in any way that I can detect,” he explained.

Bloggers, of course, have a different take on the matter, arguing that it was another example of a growing number of cases of ‘refutation by blog'. The blogging community heralds Redfield as a hero to science and science blogging.
By now, more traditional science media outlets have also joined the bloggers in their skepticism over the paper''s claims, with many repeating the points Redfield made in her original blog response.Jerry Coyne, an evolutionary geneticist at the University of Chicago in the USA, writes the blog Why Evolution is True (whyevolutionistrue.wordpress.com), which is a spinoff from his book of the same name. He said that bloggers, both professional scientists and journalists, have been gaining a new legitimacy in recent years as a result of things such as the arsenic bacteria case, as well as from shooting holes in the 2009 claims that the fossil of the extinct primate Darwinius masillae from the Messel Pit in Germany was a ''missing link'' between two primate species (Franzen et al, 2009). “[Blogging has] really affected the pace of how science is done. One of the good things about science blogging, certainly as a professional, is you''re able to pass judgment on papers instantly. You don''t have to write a letter to the editor and have it reviewed. [Redfield] is a good example of the value of science blogging. Claims that are sort of outlandish and strong can be discredited or at least addressed instantaneously instead of waiting weeks and weeks like you''d otherwise have to do,” he said.“... you''re able to pass judgment on papers instantly. You don''t have to write a letter to the editor and have it reviewed”Perhaps because of the increasingly public profile of popular science bloggers, as well as the professional and social value that is becoming attached to their blogs, science blogging is gaining in both popularity and validity. The content in science blogs covers a wide spectrum from genuine science news to simply describing training or running a lab, to opinionated rants about science and its social impact. The authorship is no less diverse than the content with science professionals, science journalists and enthusiastic amateurs all contributing to the melting pot, which also has an impact on the quality.Carl Zimmer is a freelance science journalist, who writes primarily for the New York Times and Discover Magazine, and blogs at The Loom (blogs.discovermagazine.com/loom). “Most scientists have not been trained how to write, so they are working at a disadvantage,” he said. “[Writing for them] would be like me trying to find a dinosaur. I wouldn''t do a very good job because I don''t really know how to do that. There are certainly some scientists who have a real knack for writing and blogs have been a fantastic opportunity for them because they can just start typing away and all of a sudden have thousands of people who want to read what they write every day.”Bora Zivkovic, who is a former online community manager at Public Library of Science, focusing mainly on PLoS ONE, is one of those scientists. A native of Belgrade, he started commenting in the mid-1990s about the Balkan wars on Usenet, an Internet discussion network. He began blogging about science and politics in 2004 and later about his interest in chronobiology, which stems from his degree in the topic from North Carolina State University. He still combines these interests in his latest blog, Blog Around the Clock (blogs.scientificamerican.com/a-blog-around-the-clock). Last year, Scientific American named Zivkovic its blog editor and he set up a blogging network for the publication. “There isn''t really a definition of what is appropriate,” he said. 
“The number one rule in the blogosphere is you never tell a blogger what to blog about. Those bloggers who started on their own who are scientists treasure their independence more than anything, so networks that give completely free reign and no editorial control are the only ones that can attract interesting bloggers with their own voices.”“The number one rule in the blogosphere is you never tell a blogger what to blog about”Daniel McArthur, an Australian scientist now based in the UK, who blogs about the genetic and evolutionary basis of human variation at Genetic Future (www.wired.com/wiredscience/geneticfuture), and about personal genomics at Genomes Unzipped (www.genomesunzipped.org), said that it is difficult to define a science blog. “I think it''s semantics. There are people like me who spend some time writing about science and some time writing about industry and gossiping about things in the industrial world. Then there are the people who write about the process of doing science. There are many, many blogs where [...] the content is much more about [the blogger''s] personal voyage as a scientist rather than the science that they do. Then there are people who use science blogging as an extra thing that they do and the primary purpose of their blog is to add political advocacy. I think it''s very hard to draw a line between the different categories. My feeling is that science bloggers should write about whatever it is they want to write about .”The ability to distribute your opinion, scientific or otherwise, online and in public is raising difficult questions about standards and the difference between journalism and opinion. Sean Carroll, who writes for the physics group blog Cosmic Variance (blogs.discovermagazine.com/cosmicvariance), is a senior research associate in the Department of Physics at the California Institute of Technology in the USA. “Some blogging is indistinguishable from what you would ordinarily call journalism. Some blogging is very easily distinguishable from what you would ordinarily call journalism,” he said. “I think that whether we like it or not, the effect of the Internet is that readers need to be a little bit more aware of the status of what they are looking at. Is this something reputable? Anyone can have a blog and say anything, so that one fact is both good and bad. It''s bad because there is a tremendous amount of rubbish on the Internet [...] and people who have trouble telling the rubbish from the good stuff will get confused. But it''s also good because it used to be the case that only a very small number of voices were represented in major media.”Zimmer contrasts the independence of blogging with traditional journalism. “You really get to set your own rules. You''re not working with any editor and you''re not trying to satisfy them. You''re just trying to satisfy yourself. In terms of the style of what I do, I will tend to write more—I think of [my blog posts] as short essays, as opposed to an article in the New York Times where I''ll be writing about interviewing someone or describing them on a visit I paid to them. One of the great things about a blog is that it''s a way of making a connection with people who are your readers and people who are following you for a long time.”One of the world''s most popular scientist bloggers is Paul Zachary Myers, known as PZ, a biology professor at the University of Minnesota in the USA. 
He blogs at Pharyngula (scienceblogs.com/pharyngula), a site named for a particular stage in development shared by all vertebrate embryos. “Passion is an important part of this. If you can communicate a love of the science that you''re talking about, then you''re a natural for blogging,” he explained. “[Pharyngula] is a blog where I have chosen just to express myself, so self-expression is the goal and what I write about are things that annoy me or interest me.”“Passion is an important part of this. If you can communicate a love of the science that you''re talking about, then you''re a natural for blogging”Myers'' blog, which is driven by a mix of opinion, colourful science writing, campaigning against creationism and an unflinching approach to topics about which he is passionate, draws about 3 million visitors a month. He said his blog attracts more traffic than other blogs because it is not purely about science. “I do a lot of very diverse things such as controversial religious stuff and politics, and whatever I feel like. So I tap into a lot of interest groups and that builds up my rank quite a bit. I''d say there are quite a few other science blogs out there that are pure science blogs, but pure science blogs—where they just talk about science and nothing but science—cannot get quite as much traffic as a more broadly based blog.”In an example of his sometimes-incendiary posting, Myers recently took on the Journal of Cosmology regarding an article on the discovery of bacteria fossils in a meteorite. He said that the counterattack got personal, but that he usually enjoys “the push back” from readers. “That''s part of the argument. I would say that everyone has an equal right to make their case on the web. That''s sometimes daunting for some people, but I think it''s part of the give and take of free speech. It''s good. It''s actually kind of fun to get into these arguments.”Beyond the circus that can surround blogs such as Pharyngula, scientist bloggers are debating whether their blogging counts as a professional activity. Redfield said that blogging can be taken into account among the outreach some governments now require from researchers who receive public funds. She said that some researchers now list their blogging activity in their efforts to communicate science to the public.Coyne, however, does not share his interest in blogging with other senior faculty at the University of Chicago, because he does not believe they value it as a professional activity. Still, he said that he recognizes the names of famous scientists among his blog readers and argues that scientists should consider blogging to hone their writing skills. “Blogging gives you outreach potential that you really should have if you''re grant funded, and it''s fun. It opens doors for you that wouldn''t have opened if you just were in your laboratory. So I would recommend it. It takes a certain amount of guts to put yourself out there like that, but I find it immensely rewarding,” he said. In fact, Coyne has had lecture and print publishing opportunities arise from his blogs.“It opens doors for you that wouldn''t have opened if you just were in your laboratory [...] It takes a certain amount of guts to put yourself out there like that...”Redfield said she finds blogging—even if no one reads her posts—a valuable way to focus her thoughts. “Writing online is valuable at all levels for people who choose to do it. 
Certainly, by far the best science writing happening is in the community of writers who are considered bloggers,” she said.In terms of pay, science blogging usually remains in the ''hobby zone'', with pay varying widely from nothing at all to small amounts from advertising and web traffic. ''GrrlScientist'', an American-trained molecular evolutionary biologist based in Germany, who prefers to go by her nom de blog, has been blogging for seven years. She writes the popular Punctuated Equilibrium blog (www.guardian.co.uk/science/punctuated-equilibrium) for The Guardian newspaper in the UK, as well as Maniraptora (blogs.nature.com/grrlscientist) for the Nature Network, and is co-author of This Scientific Life (scientopia.org/blogs/thisscientificlife) for the science writing community Scientopia. She said she earns a small amount from ad impressions downloaded when her blog is viewed at The Guardian. On the other end of the scale is Myers, who declined to disclose his income from blogging. “It''s a respectable amount. It''s a nice supplement to my income, but I''m not quitting my day job,” he said.Yet bloggers tend not to do it for the money. “I know that when I go to give talks, the fact that I have the blog is one of the first things that people mention, and lots of students in particular say that they really enjoy the blog and that they''re encouraged by it,” Carroll explained. “Part of what we do is not only talk about science, but we act as examples of what it means to be scientist. We are human beings. We care about the world. We have outside interests. We like our jobs. We try to be positive role models for people who are deciding whether or not this is something that they might want to get into themselves one day.”The rise of the science blogosphere has not all been plain sailing. Although the Internet has been hailed as a brave new world of writing where bloggers can express themselves without interference from editors or commercial interests, it has still seen its share of controversy. The blogging portal ScienceBlogs was the launchpad for some of the best and most popular writers of the new generation of science bloggers, including Myers and Zivkovic. But an incident at ScienceBlogs shook up the paradise and raised journalistic ethical quandaries.In July 2010, a new site, Food Frontiers (foodfrontiers.pepsicoblogs.com), appeared on ScienceBlog, sponsored by PepsiCo, the makers of the popular drink. The blog featured posts written by the beverage maker''s representatives and was blended in with the other blog content on the portal. “Pepsi''s blog looked like my blog or PZ''s blog,” Zivkovic explained, “with no warning that this was paid for and written by Pepsi''s R&D or PR people [...] talking about nutrition from a Pepsi perspective, which is a breach in the wall between advertorial and editorial. The moment the Pepsi blog went live, about 10 bloggers immediately left.” He said that the journalist-bloggers in particular pointed to a break of trust that would sully the reputation of ScienceBlogs writers and confuse readers.In his final blog at the site, titled ''A Farewell to Scienceblogs: the Changing Science Blogging Ecosystem'', Zivkovic nailed the danger of the ''Pepsigate'' incident to the validity of the blogosphere. He wrote: “What is relevant is that this event severely undermined the reputation of all of us. Who can trust anything we say in the future? Even if you already know me and trust me, can people arriving here by random searches trust me? 
Once they look around the site and see that Pepsi has a blog here, why would they believe I am not exactly the same, some kind of shill for some kind of industry?” (scienceblogs.com/clock/2010/07/scienceblogs_and_me_and_the_ch.php). Myers, who at the time was responsible for more than 40% of the traffic at ScienceBlogs, went ‘on strike' to protest. In the aftermath, the Pepsi blog was pulled.

Redfield raises another interesting word of caution. “Most scientists are extensively worried about being scooped, so they're scared to say anything about what's actually going on in their lab for fear that one of their competitors will steal their ideas,” she said. In this context, social networking sites such as ResearchGate (www.researchgate.net; Sidebar A) might be a more appropriate avenue for securely sharing ideas and exchanging tips and information, because they enable users to control who has access to their missives.

Sidebar A | ResearchGate—social media goes pro

Whenever she is looking for ideas for a research project, biologist Anne-Laure Prunier, who works in the Department of Cellular Biology and Infection at the Institut Pasteur in Paris, has recently turned to ResearchGate (www.researchgate.net), the scientists'' version of the social networking site Facebook. “Every time I have used ResearchGate, I found it really useful,” she commented.ResearchGate, based in Berlin, Germany and Cambridge, USA, is a free service that launched in January 2009. It was co-founded by Ijad Madisch, who earned his MD and PhD from the University of Hannover''s medical school in Germany and is a former research fellow at Harvard Medical School. He explained that his goal in starting the network was to make research more efficient. “During my research in Boston, I noticed that science is very inefficient, especially if you''re doing an experiment and trying to get feedback from people working on the same problem. You don''t have any platforms, online networks where you can go and ask questions or if you''re trying to find someone with a specific skill set. So I decided to do that on my own.”As a result, the site offers researchers functionality similar to Facebook—the modern template for social networking. Through ResearchGate, members can follow colleagues, be followed by those interested in their research, share their conference attendance and recent papers—their own or those that interest them—and most importantly, perhaps, ask and answer questions about science and scientific techniques.“You can get in touch with a lot of different people with a lot of different backgrounds,” Prunier explained. “When I have a very precise technical question for which I don''t find an answer in my institute, I turn to ResearchGate and I ask this question to the community. I have done it three times and every time I have gotten a lot of answers and comments, and I was able to exchange information with a lot of different people which I found really useful.”By May 2011, ResearchGate had reached one million members across 192 countries. The largest numbers of registrations come from the USA, the UK, Germany and India. Biologists, who are second only to medical doctors on the site, make up more than 20% of members. In addition to blogging, ResearchGate is just one example of how the Internet—originally invented to allow physicists to share data with one another—is changing the way that scientists communicate and share information with each other and the public.Carroll, on the other hand, who has been blogging since 2004, said that physicists are very comfortable about publicly sharing research papers with colleagues online. “The whole discussion gets very heated and very deep in some places about open access publishing. Physicists look on uncomprehendingly in fact because they put everything for free on line. That''s what we''ve been doing for years. It works.” But he said they are more cautious about blogging for a general audience. By contrast, he believes biology is especially well-suited to being blogged. “[Biologists are] actually more comfortable with talking to a wider audience because biology, whether it is through medicine or through debates about creationism or life on other planets or whatever, gets involved with public debate quite often.”Zivkovic agrees: “PZ [Myers] and me and a number of others are interested in reaching a broad lay audience, showing how science is fun and cool and interesting and important in various ways. 
Connecting science to other areas of life, from art to politics, and showing the lay audience how relevant science is to everyday life.” Even so, he pointed out that although blogging is popularizing science with the public, there is a less-mainstream sphere serving professional scientists as a forum for surviving the cut and thrust of modern science. “There is a strong subset of the science blogosphere that discusses a life in science, career choices, how to succeed in academia [...] A lot of these are written by people who [...] believe that if their real names were out there it could jeopardize their jobs. They're not interested in talking to lay audiences. They are discussing survival techniques in today's science with each other and providing a forum for other young people coming into science.”

Ultimately, whether you read popular science blogs, trawl deeper for survival tips, or write your own, the science blogosphere is expanding rapidly and is likely to do so for years to come.

13.
Heterogeneity in social interactions can have important consequences for the spread of information and diseases, and consequently for conservation and invasive-species management. Common carp (Cyprinus carpio) are a highly social, ubiquitous, and invasive freshwater fish. Management strategies targeting foraging carp may be ideal because laboratory studies have suggested that carp can learn, have individual personalities, a unique diet, and often form large social groups. To examine social feeding behaviors of wild carp, we injected 344 carp with passive integrated transponder (PIT) tags and continuously monitored their feeding behaviors at multiple sites in a natural lake in Minnesota, USA. The high-resolution, spatio-temporal data were analyzed using a Gaussian mixture model (GMM). Based on these associations, we analyzed group size, feeding bout duration, and the heterogeneity and connectivity of carp social networks at foraging sites. Wild carp responded quickly to bait, forming aggregations most active from dusk to dawn. During the 2020 baiting period (20 days), 133 unique carp were detected 616,593 times. There was some evidence that feeding at multiple sites was constrained by basin geography, but not distance alone. GMM results suggested that feeding bouts were short, with frequent turnover of small groups. Individual foraging behavior was highly heterogeneous, with Gini coefficients of 0.79 in 2020 and 0.66 in 2019. “Superfeeders”—those contributing to 80% of total cumulative detections (top 18% and top 29% of foragers in 2020 and 2019, respectively)—were more likely to be detected earlier at feeding stations, had larger body sizes, and had higher network measures of degree, weighted degree, and betweenness than non-superfeeders. Overall, our results indicate that wild carp foraging is social, easily induced by bait, dominated by large-bodied individuals, and potentially predictable, which suggests social behaviors could be leveraged in management of carp, one of the world's most recognizable and invasive fish.
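The abstract's two headline statistics lend themselves to a compact illustration. The Python sketch below is not the study's analysis code; it assumes only a list of hypothetical per-fish detection counts and shows (i) the Gini coefficient used to summarize foraging heterogeneity and (ii) one reading of the "superfeeder" definition, namely the smallest set of individuals that together account for 80% of all detections.

```python
import numpy as np

def gini(counts):
    """Gini coefficient of per-individual detection counts (0 = even; values near 1 = highly skewed)."""
    x = np.sort(np.asarray(counts, dtype=float))    # ascending order
    n = x.size
    cum = np.cumsum(x)
    return (n + 1 - 2 * cum.sum() / cum[-1]) / n

def superfeeders(counts, threshold=0.80):
    """Indices of the fewest individuals whose combined detections reach `threshold` of the total
    (one reading of the abstract's 'superfeeder' definition)."""
    counts = np.asarray(counts, dtype=float)
    order = np.argsort(counts)[::-1]                # most-detected individuals first
    share = np.cumsum(counts[order]) / counts.sum()
    k = int(np.searchsorted(share, threshold)) + 1  # smallest top-k reaching the threshold
    return order[:k]

# Hypothetical detection counts for ten PIT-tagged carp (not data from the study)
detections = [120000, 95000, 60000, 20000, 5000, 3000, 1500, 800, 300, 100]
print(round(gini(detections), 2))   # heterogeneity of foraging effort
print(superfeeders(detections))     # indices of fish driving 80% of detections
```

Ranking fish by detection count before taking the cumulative share keeps the selected set as small as possible, which matches the idea of a minority of large-bodied individuals dominating foraging; the bout-level grouping reported in the study additionally relied on the GMM, which this sketch does not attempt.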

14.
Wolinsky H 《EMBO reports》2012,13(4):308-312
Genomics has become a powerful tool for conservationists to track individual animals, analyse populations and inform conservation management. But as helpful as these techniques are, they are not a substitute for stricter measures to protect threatened species.

You might call him Queequeg. Like Herman Melville's character in the 1851 novel Moby Dick, Howard Rosenbaum plies the seas in search of whales following old whaling charts. Standing on the deck of a 12 m boat, he brandishes a crossbow with hollow-tipped darts to harpoon the flanks of the whales as they surface to breathe (Fig 1). “We liken it to a mosquito bite. Sometimes there's a reaction. Sometimes the whales are competing to mate with a female, so they don't even react to the dart,” explained Rosenbaum, a conservation biologist and geneticist, and Director of the New York City-based Wildlife Conservation Society's Ocean Giants programme. Rosenbaum and his colleagues use the darts to collect half-gram biopsy samples of whale epidermis and fat—about the size of a human fingernail—to extract DNA as part of international efforts to save the whales.

Figure 1 | Howard Rosenbaum with a crossbow to obtain skin samples from whales. © Wildlife Conservation Society.

Like Rosenbaum, many conservation biologists and wildlife managers increasingly rely on DNA analysis tools to identify species, determine sex or analyse pedigrees. George Amato, Director of the Sackler Institute for Comparative Genomics at the American Museum of Natural History in New York, NY, USA, said that during his 25-year career, genetic tools have become increasingly important for conservation biology and related fields. Genetic information, gathered from individual animals up to whole populations, now plays a valuable part in making decisions about levels of protection for certain species or populations and managing conflicts between humans and conservation goals.

Moreover, Amato expects the use and importance of genetics to grow even more, given that conservation of biodiversity has become a global issue. “My office overlooks Central Park. And there are conservation issues in Central Park: how do you maintain the diversity of plants and animals? I live in suburban Connecticut, where we want the highest levels of diversity within a suburban environment,” he said. “Then, you take this all the way to Central Africa. There are conservation issues across the entire spectrum of landscapes. With global climate change, techniques in genetics and molecular biology are being used to look at issues and questions across that entire landscape.”

Rosenbaum commented, “The genomic revolution has certainly changed the way we think about conservation and the questions we can ask and the things we can do. It can be a forensic analysis.” The data translates “into a conservation value where governments, conservationists, and people who actively protect these species can use this information to better protect these animals in the wild.”

Rosenbaum and colleagues from the Wildlife Conservation Society, the American Museum of Natural History and other organizations used genomics for the largest study so far—based on more than 1,500 DNA samples—about the population dynamics of humpback whales in the Southern hemisphere [1].
The researchers analysed population structure and migration rates; they found the highest gene flow between whales that breed on either side of the African continent and a lower gene flow between whales on opposite sides of the Atlantic, from the Brazilian coast to southern Africa. The group also identified an isolated population of fewer than 200 humpbacks in the northern Indian Ocean off the Arabian Peninsula, which are only distantly related to the humpbacks breeding off the coast of Madagascar and the eastern coast of southern Africa. “This group is a conservation priority,” Rosenbaum noted.

He said the US National Oceanic and Atmospheric Administration is using this information to determine whether whale populations are recovering or endangered and what steps should be taken to protect them. Through wildlife management and protection, humpbacks have rebounded to 60,000 or more individuals from fewer than 5,000 in the 1960s. Rosenbaum's data will, among other things, help to verify whether the whales should be managed as one large group or divided into subgroups.

He has also been looking at DNA collected from dolphins caught in fishing nets off the coast of Argentina. Argentine officials will be using the data to make recommendations about managing these populations. “We've been able to demonstrate that it's not one continuous population in Argentina. There might be multiple populations that merit conservation protection,” Rosenbaum explained.

The sea turtle is another popular creature that is high on conservationists' lists. To get DNA samples from sea turtles, population geneticist and wildlife biologist Nancy FitzSimmons from the University of Canberra in Australia reverts to a simpler method than Rosenbaum's harpoon. “Ever hear of a turtle rodeo?” she asked. FitzSimmons goes out on a speed boat in the Great Barrier Reef with her colleagues, dives into the water and wrangles a turtle on board so it can be measured, tagged, have its reproductive system examined with a laparoscope and a skin tag removed with a small scissor or scalpel for DNA analysis (Fig 2).

Figure 2 | Geneticist Stewart Pittard measuring a sea turtle. © Michael P. Jensen, NOAA.

Like Rosenbaum, she uses DNA as a forensic tool to characterize individuals and populations [2]. “That's been a really important part, to be able to tell people who are doing the management, ‘This population is different from that one, and you need to manage them appropriately,'” FitzSimmons explained. The researchers have characterized the turtle's feeding grounds around Australia to determine which populations are doing well and which are not. If they see that certain groups are being harmed through predation or being trapped in ‘ghost nets' abandoned by fishermen, conservation measures can be implemented.

FitzSimmons, who started her career studying the genetics of bighorn sheep, has recently been using DNA technology in other areas, including finding purebred crocodiles to reintroduce them into a wetland ecosystem at Cat Tien National Park in Vietnam. “DNA is invaluable. You can't reintroduce animals that aren't purebred,” she said, explaining the rationale for looking at purebreds. “It's been quite important to do genetic studies to make sure you're getting the right animals to the right places.”

Geneticist Hans Geir Eiken, senior researcher at the Norwegian Institute for Agricultural and Environmental Research in Svanvik, Norway, does not wrestle with the animals he is interested in.
He uses a non-intrusive method to collect DNA from brown bears (Fig 3). “We collect the hair that is on the vegetation, on the ground. We can manage with only a single hair to get a DNA profile,” he said. “We can even identify mother and cub in the den based on the hairs. We can collect hairs from at least two different individuals and separate them afterwards and identify them as separate entities. Of course we also study how they are related and try to separate the bears into pedigrees, but that's more research and it's only occasionally that we do that for [bear] management.”

Figure 3 | Bear management in Scandinavia. (A) A brown bear in a forest in Northern Finland. © Alexander Kopatz, Norwegian Institute for Agricultural and Environmental Research. (B) Faecal sampling. Monitoring of bears in Norway is performed in a non-invasive way by sampling hair and faecal samples in the field followed by DNA profiling. © Hans Geir Eiken. (C) Brown-bear hair sample obtained by so-called systematic hair trapping. A scent lure is put in the middle of a small area surrounded by barbed wire. To investigate the smell, the bears have to cross the wire and some hair will be caught. © Hans Geir Eiken. (D) A female, 2.5-year-old bear that was shot at Svanvik in the Pasvik Valley in Norway in August 2008. She and her brother had started to eat from garbage cans after they left their mother and the authorities gave permission to shoot them. The male was shot one month later after appearing in a schoolyard. © Hans Geir Eiken.

Eiken said the Norwegian government does not invest a lot of money in helicopters or other surveillance methods, and does not want to bother the animals. “A lot of disturbing things were done to bears. They were trapped. They were radio-collared,” he said. “I think as a researcher we should replace those approaches with non-invasive genetic techniques. We don't disturb them. We just collect samples from them.”

Eiken said that the bears pose a threat to two million sheep that roam freely around Norway. “Bears can kill several tons of them every day. This is not the case in the other countries where they don't have free-ranging sheep. That's why it's a big economic issue for us in Norway.” Wildlife managers therefore have to balance the fact that brown bears are endangered against the economic interests of sheep owners; about 10% of the brown bears are killed each year because they have caused damage, or as part of a restricted ‘licensed' hunting programme. Eiken said that within two days of a sheep kill, DNA analysis can determine which species killed the sheep, and, if it is a bear, which individual. “We protect the females with cubs. Without the DNA profiles, it would be easy to kill the females, which also take sheep of course.”

It is not only wildlife management that interests Eiken; he was part of a group led by Axel Janke at the Biodiversity and Climate Research Centre in Frankfurt am Main, Germany, which completed sequencing of the brown bear genome last year. The genome will be compared with that of the polar bear in the hope of finding genes involved in environmental adaptation. “The reason why [the comparison is] so interesting between the polar bear and the brown bear is that if you look at their evolution, it's [maybe] less than one million years when they separated. In genetics that's not a very long time,” Eiken said.
“But there are a lot of other issues that we think are even more interesting. Brown bears stay in their caves for 6 months in northern Norway. We think we can identify genes that allow the bear to be in the den for so long without dying from it.”Like bears, wolves have also been clashing with humans for centuries. Hunters exterminated the natural wolf population in the Scandinavian Peninsula in the late nineteenth century as governments protected reindeer farming in northern Scandinavia. After the Swedish government finally banned wolf hunting in the 1960s, three wolves from Finland and Russia immigrated in the 1980s, and the population rose to 250, along with some other wolves that joined the highly inbred population. Sweden now has a database of all individual wolves, their pedigrees and breeding territories to manage the population and resolve conflicts with farmers. “Wolves are very good at causing conflicts with people. If a wolf takes a sheep or cattle, or it is in a recreation area, it represents a potential conflict. If a wolf is identified as a problem, then the local authorities may issue a license to shoot that wolf,” said Staffan Bensch, a molecular ecologist and ornithologist at Lund University in Sweden.Again, it is the application of genomics tools that informs conservation management for the Scandinavian wolf population. Bensch, who is best known for his work on population genetics and genomics of migratory songbirds, was called to apply his knowledge of microsatellite analysis. The investigators collect saliva from the site where a predator has chewed or bitten the prey, and extract mitochondrial DNA to determine whether a wolf, a bear, a fox or a dog has killed the livestock. The genetic information potentially can serve as a death warrant if a wolf is linked with a kill, and to determine compensation for livestock owners.The genetic information potentially can serve as a death warrant if a wolf is linked with a kill…Yet, not all wolves are equal. “If it''s shown to be a genetically valuable wolf, then somehow more damage can be tolerated, such as a wolf taking livestock for instance,” Bensch said. “In the management policy, there is genetic analysis of every wolf that has a question on whether it should be shot or saved. An inbred Scandinavian wolf has no valuable genes so it''s more likely to be shot.” Moreover, Bensch said that DNA analysis showed that in at least half the cases, dogs were the predator. “There are so many more dogs than there are wolves,” he said. “Some farmers are prejudiced that it is the wolf that killed their sheep.”According to Dirk Steinke, lead scientist at Marine Barcode of Life and an evolutionary biologist at the Biodiversity Institute of Ontario at the University of Guelph in Canada, DNA barcoding could also contribute to conservation efforts. The technique—usually based on comparing the sequence of the mitochondrial CO1 gene with a database—could help to address the growing trade in shark fins for wedding feasts in China and among the Chinese diaspora, for example. Shark fins confiscated by Australian authorities from Indonesian ships are often a mess of tissue; barcoding helps them to identify the exact species. “As it turns out, some of them are really in the high-threat categories on the IUCN Red List of Threatened Species, so it was pretty concerning,” Steinke said. “That is something where barcoding turns into a tool where wildlife management can be done—even if they only get fragments of an animal. 
I am not sure if this can prevent people from hunting those animals, but you can at least give them the feedback on whether they did something illegal or not.”

Steinke commented that DNA tools are handy not only for megafauna, but also for the humbler creatures in the sea, “especially when it comes to marine invertebrates. The larval stages are the only ones where they are mobile. If you're looking at wildlife management from an invertebrate perspective in the sea, then these mobile life stages are very important. Their barcoding might become very handy because for some of those groups it's the only reliable way of knowing what you're looking at.” Yet, this does not necessarily translate into better conservation: “Enforcement reactions come much quicker when it's for the charismatic megafauna,” Steinke conceded.

Moreover, reliable identification of animal species could even improve human health. For instance, Amato and colleagues from the US Centers for Disease Control and Prevention demonstrated for the first time the presence of zoonotic viruses in non-human primates seized in American airports [3]. They identified retroviruses (simian foamy virus) and/or herpesviruses (cytomegalovirus and lymphocryptovirus), which potentially pose a threat to human health. Amato suggested that surveillance of the wildlife trade by using barcodes would help facilitate prevention of disease. Moreover, DNA barcoding could also show whether the meat itself is from monkeys or other wild animals to distinguish illegally hunted and traded bushmeat—the term used for meat from wild animals in Africa—from legal meats.

Amato's group also applied barcoding to bluefin tuna, commonly used in sushi, which he described as the “bushmeat of the developed world”, as the species is being driven to near extinction through overharvesting. Developing barcodes for tuna could help to distinguish bluefin from yellowfin or other tuna species and could assist measures to protect the bluefin. “It can be used sort of like ‘Wildlife CSI' (after the popular American TV series),” he said.

In fact, barcoding for law enforcement is growing. Mitchell Eaton, assistant unit leader at the US Geological Survey New York Cooperative Fish and Wildlife Research Unit in Ithaca, NY, USA, noted that the technique is being used by US government agencies such as the FDA and the US Fish & Wildlife Service, as well as African and South American governments, to monitor the illegal export of pets and bushmeat. It is also used as part of the United Nations' Convention on Biological Diversity for cataloguing the Earth's biodiversity, identifying pathogens and monitoring endangered species. He expects that more law enforcement agencies around the world will routinely apply these tools: “This is actually easy technology to use.”

In that way, barcoding as well as genetics and its related technologies help to address a major problem in conservation and protection measures: to monitor the size, distribution and migration of populations of animals and to analyse their genetic diversity. It gives biologists and conservationists a better picture of what needs extra protective measures, and gives enforcement agencies a new and reliable forensic tool to identify and track illegal hunting and trade of protected species.
As helpful as these technologies are, however, they are not sufficient to protect severely threatened species such as the bluefin tuna and are therefore not a substitute for more political action and stricter enforcement.
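The identification step described above comes down to comparing a query CO1 fragment against a reference library and reporting the closest match; in practice this is done by aligning reads against curated databases such as BOLD or GenBank. As a rough, self-contained illustration only—the species entries and sequences below are made-up placeholders rather than real CO1 barcodes, and no alignment is performed—a naive percent-identity matcher in Python might look like this.

```python
# Naive illustration of CO1 barcode matching: score a query fragment against a
# small reference library by ungapped percent identity over the shared length.
# Real pipelines align reads and query curated databases (e.g. BOLD, GenBank);
# the sequences below are invented placeholders, not real CO1 barcodes.

REFERENCE_BARCODES = {  # hypothetical species -> CO1 fragment
    "Prionace glauca (blue shark)":     "ACTTTGGTCAGCCCGGAGCTTTA",
    "Sphyrna lewini (hammerhead)":      "ACTCTGGTCAACCCGGAGCATTA",
    "Thunnus thynnus (bluefin tuna)":   "GCTCTAAGCCTCCTTATTCGAGC",
}

def percent_identity(a: str, b: str) -> float:
    """Fraction of matching bases over the shorter of the two sequences, no gaps."""
    n = min(len(a), len(b))
    matches = sum(1 for x, y in zip(a[:n], b[:n]) if x == y)
    return matches / n if n else 0.0

def best_match(query: str):
    """Return (species, identity) of the closest reference barcode to the query."""
    scored = {sp: percent_identity(query, ref) for sp, ref in REFERENCE_BARCODES.items()}
    species = max(scored, key=scored.get)
    return species, scored[species]

if __name__ == "__main__":
    confiscated_fin = "ACTCTGGTCAACCCGGAGCATTA"   # made-up query fragment
    species, ident = best_match(confiscated_fin)
    print(f"Closest reference: {species} ({ident:.0%} identity)")
```

Real barcoding workflows additionally apply divergence thresholds and check how completely the reference library covers candidate species before calling an identification.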

15.
Howard Wolinsky 《EMBO reports》2013,14(10):871-873
Will the US Supreme Court's ruling that genes can no longer be patented in the USA boost venture capital investment into biotech and medical startup companies?

Three years ago, Noubar Afeyan, managing partner and CEO of Flagship Ventures, an early-stage venture capital firm in Cambridge, Massachusetts, USA, was working with a biotech start-up company developing techniques for BRCA gene testing for breast cancer risk that avoided the patents held by Myriad Genetics, a molecular diagnostics company in Salt Lake City (Utah, USA) and the only operator in the field. However, despite the promise of the start-up's techniques, investors were put off by Myriad's extensive patent portfolio and fiercely defensive tactics: “A lot of investors were simply not willing to take that chance, even though our technology was superior in many ways and patentably different,” Afeyan said. The effort to launch the start-up ultimately failed.

Afeyan believes the prospects for such start-ups improved on the morning of 13 June 2013 when the US Supreme Court ruled in a unanimous vote that Myriad's fundamental patents on the BRCA1 and BRCA2 genes themselves are invalid, opening up the field to new competitors. The court's ruling, however, validated Myriad's patents for BRCA cDNA and methods-of-use.

The court's decision comes at a time when venture capital investment into the life sciences is projected to decline in the years ahead. Some believe that the court's decision sets a precedent and could provide a boost for products, diagnostics and other tests under development that would have been legally difficult in the light of existing patents on human and other DNA sequences.

The US Patent Office issued the original patents for the BRCA1 and BRCA2 genes in 1997 and 1998 for the US National Institute of Environmental Health Sciences, the University of Utah and Myriad Genetics. One year earlier, Myriad had launched its first diagnostic test for breast cancer risk based on the two genes and has since aggressively defended it against both private and public competitors in court. Many universities and hospitals were originally offering the test for a lower cost, but Myriad forced them to stop and eventually monopolized BRCA-based diagnostics for breast cancer risk in the USA and several other countries.

“Myriad did not create anything,” Justice Clarence Thomas wrote in the Supreme Court's decision. “To be sure, it found an important and useful gene, but separating that gene from its surrounding genetic material is not an act of invention.” Even so, the court did uphold Myriad's patents on the methodology of its test. Ron Rogers, a spokesman for the biotech firm, said the Supreme Court had “affirmed the patent eligibility of synthetic DNA and underscored the importance and applicability of method-of-use patents for gene-based diagnostic tests. Before the Supreme Court case we had 24 patents and 520 claims. After the Supreme Court decision, we still have 24 patents. […] [T]he number of our patent claims was reduced to 515. In the Supreme Court case itself, only nine of our 520 patent claims were at issue. Of the nine, the Supreme Court ruled that five were not patent-eligible and they ruled that four were patent-eligible.
We still have strong intellectual property protection surrounding our BRCA test and the Supreme Court's decision doesn't change that.” Within hours of the ruling, capitalism kicked into high gear. Two companies, Ambry Genetics in Aliso Viejo, California, and Gene by Gene Ltd in Houston, Texas, USA, announced that they were launching tests for the BRCA1 and BRCA2 genes for less than the US$3,100 Myriad has been charging privately insured patients and the US$2,795 it charges patients covered by Medicare—the government health plan for the elderly and disabled. Several other companies and universities also announced that they would be offering BRCA testing. Entrepreneur Bennett Greenspan, a managing partner of Gene by Gene, explained that his company had been poised to offer BRCA testing if the Supreme Court ruled against Myriad. He said, “We had written a press release with our PR firm a month before the release of the Supreme Court with the intention that if the Supreme Court overruled the patent or invalidated the patent that we would launch right away and if they didn't, we would just tear up the press release.” His company had previously offered BRCA gene testing in Israel based on guidelines from the European Union. Myriad Genetics has not given up defending its patents, however. On 9 and 10 July 2013, it slapped Ambry and Gene by Gene with lawsuits in the US District Court in Salt Lake City for allegedly infringing on patents covering synthetic DNA and methods-of-use related to the BRCA1 and BRCA2 genes. Rogers commented that the testing process used by the firms “infringes 10 patents covering synthetic primers, probes and arrays, as well as methods of testing, related to the BRCA1 and BRCA2 genes.” On 6 August 2013, Ambry countersued Myriad, arguing that the company “continues a practice of using overreaching practices to wrongfully monopolize the diagnostic testing of humans' BRCA1 and BRCA2 genes in the United States and to attempt to injure any competitor […] Due to Myriad's anticompetitive conduct, customers must pay significantly higher prices for Myriad's products in the relevant market, often nearly twice as high as the price of Ambry's products and those of other competitors” [1]. Just as the courts will have to clarify whether the competitors in this case infringe on Myriad's patents, it is also not clear how the Supreme Court's ruling will affect the biotech and diagnostics industry at large, now that one of the most contested patents for a human gene has been ruled invalid. In recent years, venture capital investment into the life sciences has been in decline. The National Venture Capital Association and the Medical Innovation & Competitiveness Coalition reported from a survey that “An estimated funding loss of half a billion dollars over the next three years will cost America jobs at a time when we desperately need employment growth” [2]. The survey of 156 venture capital firms found that 39% of respondents said they had reduced investment in the life sciences during the previous three years, and the same proportion intended to do so in the next three years.
“[US Food and Drug Administration] FDA regulatory challenges were identified as having the highest impact on these investment decisions,” the report states, adding that many investors intended to shift their focus from the US towards Europe and the Asia/Pacific region. Another report from the same groups explains how public policy involving the FDA and other players in “the medical innovation ecosystem”—including the US patent system, public agencies, tax policy, securities regulation, immigration laws and private groups such as insurers—affects the decisions of investors to commit to funding medical innovation [3]. Some investors think that the court decision about the patentability of human DNA will increase confidence and help to attract investors back to the life sciences. “The clarity is helpful because for the longest time people didn't do things because of ambiguity about whether those patents would be enforceable,” Afeyan said. “It's one thing to not do something because of a patent, it's another to not do something because you know that they have patents but you're not sure what it's going to stop you from doing because it hasn't been really fully fleshed out. Now I think it is reasonably well fleshed out and I think you will see more innovation in the space.” Others also appreciate the clarification from the Supreme Court about what constitutes a patentable invention in regard to human genes and DNA. “The Myriad decision was a very solid reading of the underlying purpose of our patent law, which is to reward novel invention,” commented Patrick Chung, a partner with New Enterprise Associates, a venture capital firm in Menlo Park, California, which invested in 23andMe, a personal genomics company based in Mountain View (California, USA), and who serves on the 23andMe board. But not everyone agrees that the Supreme Court's decision has provided clarity. “You could spin it and say that it was beneficial to create some certainty, but at the end of the day, what the Court did was reduce the scope of what you're allowed to get patent claims on,” said Michael Schuster, a patent lawyer and Intellectual Property Partner and Co-Chair of the Life Sciences Group at Fenwick & West LLP in San Francisco, California, USA. “It's going to be a continuing dance between companies, smart patent lawyers, and the courts to try to minimize the impact of this decision.” Kevin Noonan, a molecular biologist and patent lawyer with McDonnell Boehnen Hulbert & Berghoff LLP in Chicago, Illinois, USA, commented that he does not expect the Supreme Court decision to have much of an impact on venture investments or anything else. “This case comes at a time fortunately when biotechnology is mature enough so that the more pernicious effects of the decision are not going to be quite as harmful as they would if this had happened ten, 15 or 20 years ago,” he said. “We're now in the ‘post-genomic' era; since the late '90s and turn of the century, the genomic and genetic data from the Human Genome Project have been on publicly available databases. As a consequence, if a company didn't apply for a patent before the gene was disclosed publicly, it certainly is not able to apply for a patent now. The days of obtaining these sequences and trying to patent them are behind us.” Noonan also noted that the Myriad Genetics patents were due to expire in 2014–2015 anyway.
“Patents are meaningless if you can''t enforce them. And when they expire, you can no longer enforce them. So it really isn''t an impediment to genetic testing now,” he explained. “What the case illustrates is a disconnect between scientists and lawyers. That''s an old battle.”George Church, professor of genetics at Harvard Medical School (Boston, Massachusetts, USA) and Director of the Personal Genome Project, maintains that the Supreme Court decision will have minimal influence on the involvement of venture capitalists in biotech. “I think it''s a non-issue. It''s basically addressing something that was already dead. That particular method of patenting or trying to patent components of nature without modification was never really a viable strategy and in a particular case of genes, most of the patents in the realm of bio-technology have added value to genes and that''s what they depend on to protect their patent portfolio—not the concept of the gene itself,” he said. “I don''t know of any investor who is freaked out by this at all. Presumably there are some, because the stock oscillates. But you can get stock to oscillate with all kinds of nonsense. But I think the sober, long-term investors who create companies that keep innovating are not impacted.”Church suggests that the biggest concern for Myriad now is whole-gene sequencing, rather than the Supreme Court''s decision. “Myriad should be worrying about the new technology, and I''m sure they''ve already considered this. The new technology allows you to sequence hundreds of genes or the whole genome for basically the price they''ve been charging all along for two genes. And from what I understand, they are expanding their collection to many genes, taking advantage of next generation sequencing as other companies have already,” he said.Whatever its consequences in the US, the Supreme Court''s decision will have little impact on other parts of the world, notably Europe, where Myriad also holds patents on the BRCA genes in several countries. Gert Matthijs, Head of the Laboratory for Molecular Diagnostics at the Centre for Human Genetics in Leuven, Belgium, says that even though the US Supreme Court has invalidated the principle of patenting genes in America, the concept remains in Europe and is supported by the European Parliament and the European Patent Convention. “Legally, nothing has changed in Europe,” he commented. “But there is some authority from the US Supreme Court even if it''s not legal authority in Europe. Much of what has been used as arguments in the Supreme Court discussions has been written down by the genetics community in Europe back in 2008 in the recommendations on behalf of the European Society for Human Genetics. The Supreme Court decision is something that most of us in Europe would agree upon only because people have been pushing towards protecting the biotech industry that the pendulum was so way out in Europe.”Benjamin Jackson, Senior Director of legal affairs at Myriad Genetics, commented that Myriad holds several patents in Europe that are not likely to be affected by the Supreme Court''s ruling. “The patent situation both generally and for Myriad is a lot clearer in Europe. The European Union Biotech Directive very clearly says that isolated DNA is patentable even if it shares the same sequence as natural DNA,” he said. 
“Right now, it''s pretty uncontroversial, or at least it''s well settled law basically in Europe that isolated DNA is patentable.” However, while the Directive states that “biological material which is isolated from its natural environment or produced by means of a technical process” might be patentable “even if it previously occurred in nature”, the European Patent Office (EPO) in Munich, Germany, requires that the subject matter is an inventive step and not just an obvious development of existing technology and that the industrial application and usefulness must be disclosed in the application.Myriad has opened a headquarters in Zurich and a lab in Munich during the past year, hoping to make inroads in Europe. In some EU countries, Myriad offers its BRCA test as part of cancer diagnosis. In other countries, BRCA testing is conducted at a fraction of what Myriad charges in the USA, either because institutions ignore the patents that are not enforced in their jurisdictions, or because these countries, such as Belgium, were not included in the patent granted by the European Patent Office. Moreover, in various countries BRCA testing is only available through the healthcare system and only as part of a more extensive diagnosis of cancer risk. In addition, as Matthijs commented, “[t]he healthcare system in Europe is very heterogeneous and that''s also of course a big impediment for a big laboratory to try and conquer Europe because you have to go through different reimbursement policies in different countries and that''s not easy.”Ultimately, it seems the Supreme Court''s decision might turn out to have little impact on biotech firms in either the USA or Europe. Technological advances, in particular new sequencing technologies, might render the issue of patenting individual genes increasingly irrelevant.  相似文献   

16.
Suran M 《EMBO reports》2011,12(5):404-407
The increasing influence of the Tea Party in Congress and politics has potential repercussions for public funding of scientific research in the USA. In 2009, Barack Obama became the 44th President of the USA, amid hopes that he would fix the problems created or left unresolved by his predecessor. However, despite his positive mantra, “Yes we can,” the situation was going to get worse: the country was spiralling towards an economic recession, a collapsing residential real-estate market and the loss of millions of jobs. Now, the national debt lingers around US$14 trillion (US Department of the Treasury, 2011). In response to these hardships and the presence of a perceived ‘socialist' president in office, a new political movement started brewing that would challenge both the Democrats and the Republicans—the two parties that have dominated US politics for generations. Known as the Tea Party, this movement has been gaining national momentum in its denunciation of the status quo of the government, especially in relation to federal spending, including the support of scientific research. The name is a play on the Boston Tea Party, at which more than 100 American colonists dumped 45 tonnes of tea into Boston Harbour (Massachusetts, USA) in 1773 to protest against the British taxation of imported tea. Whereas the 18th-century Boston Tea Party formed to protest against a specific tax, the Tea Party of the 21st century protests against taxes and ‘big' government in general. Many view Tea Party followers as modern muckrakers, but supporters claim their movement is fundamentally about upholding the US Constitution. Tea Party Patriots, a non-partisan organization, considers itself to be the official home of the Tea Party movement. Fuelled by the values of fiscal responsibility, limited government and free markets, Tea Party Patriots believe that these three principles are granted by the Constitution, although not necessarily upheld by the administration. “If you read the Constitution, the limits of government involvement in society [are] pretty well-defined and our government has gone farther and farther beyond the specific limits of the Constitution,” said Mark Meckler, one of the co-founders of Tea Party Patriots. “Our Constitution is not designed as an empowering document, but as a limiting document… [and] was intended to be used as a weapon by the people against the government to keep it in the box.” Tea Partiers tend to be especially critical when it comes to spending tax dollars on bank bailouts and health care, but anything goes when it comes to cutting back on public spending—even science. “We believe everything needs to be on the table since the government is virtually bankrupt,” Meckler said. “We need to cut the waste, cut the abuse [and] get rid of the departments that shouldn't exist.” On 19 February 2011, the US House of Representatives, which is currently controlled by the Republicans, passed a federal-spending bill for the remainder of the 2011 fiscal year. Among other cuts, the bill called for billions of dollars to be slashed from the budgets of federal science agencies. If the bill is signed into law, the National Institutes of Health (NIH) will have $1.6 billion cut from its budget—a 5.2% decrease—and the Department of Energy (DOE) will experience an 18% cut in funding for its Office of Science.
Other agencies targeted include the Environmental Protection Agency (EPA), the National Aeronautics and Space Administration (NASA), the National Institute of Standards and Technology (NIST) and the National Science Foundation (NSF; Anon, 2011; Cho, 2011). Although the US Senate, which has a Democratic majority, must consider the bill before any definite amendments to the budget are made, it is likely that there will be some cuts to science funding.Although the House is in favour of science-related cuts, President Obama supports spending more on science education, basic research and clean-energy research. He has also proposed an 11.8% increase in the budget of the DOE, as well as an 8% increase in the NSF budget (Office of Management and Budget, 2011).The House is in favour of science-related cuts, but President Obama is in favour of spending more on science education, basic science and clean-energy researchJoann Roskoski, acting assistant director of the Biology Directorate at the NSF, said her institute is strongly in favour of President Obama''s budget proposal. “President Obama is a very strong supporter of fundamental research and STEM [science, technology, engineering and mathematics] education because he perceives it as investing in the future of the country,” she said. “These are just difficult budgetary times and we''ll just have to wait and see what happens. As they say, the president proposes and Congress disposes.”Karl Scheidt, a professor of chemistry at Northwestern University (Evanston, Illinois, USA), has four grants from federal agencies. “A couple of my grants expire this year, which is happening at the worst, worst possible time,” explained Scheidt, whose grants are funded by the NIH and the NSF. He added that although many politicians either do not understand or believe in the fundamentals of science, they still preach to the masses about what they ‘think'' they know. “I think it''s an absolute travesty that many people don''t understand science and that many of the Republicans who don''t fully understand science perpetuate incorrect assumptions and scientific falsehoods when speaking in public,” he said. “It makes the US less competitive and puts us collectively at a disadvantage relative to other nations if we don''t succeed in scientific education and innovative research in the future.”Although the Tea Party is not technically associated with the Republican Party, all Tea-Party representatives and senators ran as Republican candidates in the last election. While only one-third of seats in the Senate are on the ballot every two years for a six-year term, all House seats are for a two-year term. In the most recent Senatorial election, 50% of Tea Party-backed candidates won; 10 in total. 140 candidates for seats in the House of Representatives were backed by the Tea Party—all of whom were Republicans—but only 40 won. Nevertheless, with around 100 new Republicans in office, a House controlled by a Republican majority and most Congress-based Republicans in agreement with Tea Party ideals, the Tea Party actually has a lot of sway on the voting floor.Of course, as a fundamentally grass-roots movement, their influence is not limited to the halls of power. Since just before the November election last year, Tea Party-backed politicians have received more scrutiny and media exposure, meaning more people have listened to their arguments against spending on science. 
In fact, Republican politicians associated with the Tea Party have made critical and sometimes erroneous comments about science. Representative Michele Bachmann, for example, claimed that because carbon dioxide is a natural gas, it is not harmful to our atmosphere (Johnson, 2009). Representative Jack Kingston denounced the theory of evolution and stated that he did not come from a monkey (The Huffington Post, 2011). When asked how old he believes the Earth to be, Senator Rand Paul refused to answer (Binckes, 2010). He also introduced a bill to cut the NSF budget by 62%, and targeted the budget of the Centers for Disease Control and Prevention. Scheidt believes part of the challenge is that many scientists do not properly articulate the importance of their work to the public, and that there is limited representation on behalf of science in Washington. “It's difficult sometimes to advocate for and explain the critical importance of basic research and for the most part, Congress may not always appreciate the basic fundamental mission of organizations like the NIH,” Scheidt said. “Arlen Specter was one of the few people who could form coalitions with his colleagues on both sides of the aisle and communicate why scientific research is critical. Why discovering new ways to perform transplants and creating new medicines are so important to everyone.” Specter, a former senator, was a Republican until 2009, when he decided to switch political parties. During the last Democratic primary, he lost the Pennsylvania Senate nomination after serving in Congress for more than four decades. The Democratic nominee, Joe Sestak, eventually lost the coveted seat to Pat Toomey, a Tea Party Republican who sponsored an amendment denying NIH funding for some grants while he was a House member. Toomey is also against funding climate science and clean-energy research with federal dollars. Specter was considered a strong supporter of biomedical research, especially cancer research. He was the catalyst that pushed through a great deal of pro-science legislation, such as adding approximately $10 billion to NIH funding as part of the stimulus package in 2009, and doubling NIH funding in the 1990s. As scientific research was so important to him, he served on the US Senate Committee on Appropriations Subcommittee on Labor, Health and Human Services, Education, and Related Agencies and on the Senate Committee on Environment and Public Works. Specter was a popular political champion of science not only because of all he had accomplished, but also because so few scientists are elected to office. Among those Democrats who lost their seats to Tea Party Republicans was Congressman Bill Foster. Foster, who once worked for the Fermi National Accelerator Laboratory (Fermilab)—which is funded by the DOE—represented Batavia, Illinois, which is also where Fermilab has its headquarters. “The new representative in the district where Fermilab resides is Randy Hultgren, a Republican, who has been very supportive of the laboratory since he's been elected,” said Cindy Conger, Chief Financial Officer at Fermilab. “He's very interested in us and very interested […] in us having adequate funding.” However, Fermilab is suffering financially. “We will […] have some level of layoffs,” Conger said.
“Inadequate federal funding could result in more layoffs or not being able to run our machines for part of the year. These are the things we are contemplating doing in the event of a significant budget cut. Nothing is off the table [but] we will do everything we can to run the [Tevatron] accelerator.”But Fermilab''s desperate appeal for $35 million per year for the next three fiscal years was denied by the Obama administration and not included in the 2012 White House budget request. As a result, the most powerful proton–antiproton accelerator in the USA, the Tevatron, is shutting down indefinitely near the end of this year.Another pro-science Republican is former Congressman John Porter, who studied at the Massachusetts Institute of Technology. He encouraged the federal funding of science while serving as chair of the House Subcommittee on Labor, Health and Human Services, and Education, as well as on the House Committee on Appropriations and Related Agencies. Like Scheidt, Porter said a problem is that not many members of Congress really understand science or what goes into scientific research.“Many members of Congress don''t realize that the money appropriated for the funding of scientific research through NIH, NSF […] is sent out to research institutes in their districts and states where the research is conducted,” said Porter, who retired from Congress in 2001 after serving for more than 20 years. “They simply haven''t been exposed to it and that''s the fault of the science community, which has a great responsibility to educate about the mechanisms on how we fund scientific research.”Today, Porter is vice-chair of the Foundation for the NIH and also chairs Research!America, a non-profit organization which aims to further medical, health and scientific research as higher national priorities. He also noted that industry would not fund scientific research in the way the government does because there would essentially be no profits. Therefore, federal funding remains essential.“Let''s take away the phones, iPads and everything else [those against the federal funding of science] depend on and see what''s left,” Porter said. “The US is the world leader in science, technology and research and the way we got there and the way we have created the technology that makes life easier […] is a result of making investments in that area.”For now, Scheidt said the best approach is to educate as many people as possible to understand that scientific research is a necessity, not a luxury. “We unfortunately have a very uneducated population in regard to science and it''s not 100% their fault,” he said. “However, if people took a real interest in science and paid as much attention to stem-cell or drug-discovery research as they did to the Grammy Awards or People magazine I think they would appreciate what''s going on in the science world.”…the best approach is to educate as many people as possible to understand that scientific research is a necessity, not a luxuryInstead, the USA is lagging behind its competitors when it comes to STEM education. According to the 2009 Program for International Student Assessment (PISA), the USA is ranked 17th on science and 25th on maths out of 34 countries (US Department of Education, 2010). “We''re in a cluster now, we''re no longer the leading country,” said D. Martin Watterson, a molecular biologist who sits on NIH peer-review committees to evaluate grant proposals. 
The reason, according to Watterson, is that the first things to be cut after a budget decrease are training grants for continuing education efforts. Moreover, the USA already lacks highly trained workers in the field of science. “In some disciplines, employers now look to other places in Europe and Asia to find those trained personnel,” Watterson said.Ultimately, most people at least want a final budget to be passed so that there is sufficient time to plan ahead. However, Georgetown University political science professor Clyde Wilcox thinks that a compromise is not so simple. “The problem is that it''s a three-way poker game. People are going to sit down and they''re going to be bargaining, negotiating and bluffing each other,” he said. “The House Republicans just want to cut the programs that they don''t like, so they''re not cutting any Republican programs for the most part.”As a result, institutions such as the EPA find themselves being targeted by the Republicans. Although there is not a filibuster-proof majority of Democrats in the Senate, they still are a majority and will try to preserve science funding. Wilcox said that it is not necessarily a good thing to continue negotiating if nothing gets done and the country is dependent on continuing resolutions.Although there is not a filibuster-proof majority of Democrats in the Senate, they still are a majority and will try to preserve science funding“What the real problem is, when push comes to shove, someone has to blink,” he said. “I don''t think there will be deep cuts in science for a number of reasons, one is science is consistent with the Democratic ideology of education and the Republican ideology of investment. And then, we don''t really spend that much on science anyway so you couldn''t come remotely close to balancing the budget even if you eliminated everything.”Although during his time in Congress representatives of both parties were not as polarized as they are today, Porter believes the reason they are now is because of the political climate. “The president has made [science] a very important issue on his agenda and unfortunately, there are many Republicans today that say if he''s for it, I''m against it,” Porter said. In fact, several government officials ignored repeated requests or declined to comment for this article.“It''s time for everybody from both parties to stand up for the country, put the party aside and find solutions to our problems,” Porter commented. “The American people didn''t just elect us to yell at each other, they elected us to do a job. You have to choose priorities and to me the most important priority is to have our children lead better lives, to have all human beings live longer, healthier, happier lives and to have our economy grow and prosper and our standard of living maintained—the only way to do that is to invest where we lead the world and that''s in science.”  相似文献   

17.
Wolinsky H 《EMBO reports》2011,12(9):897-900
Our knowledge of the importance of telomeres to health and ageing continues to grow. Some scientists are therefore commercializing their research, whereas others believe we need an even deeper understanding before we can interpret the results. After 30 years of research, the analysis of telomere length is emerging as a commercial biomarker for ageing and disease, as well as a tool in the search for new medications. Several companies offer tests for telomere length, and more are due to launch their products shortly. Even so, and despite the commercial enthusiasm, interpreting precisely what an individual's telomeres mean for their health and longevity remains challenging. As a result, there is some division within the research community between those who are pushing ahead with ventures to offer tests to the public, and those who feel that telomere testing is not yet ready for prime time. Peter Lansdorp, a scientist at the British Columbia Cancer Agency and a professor at the University of British Columbia (Vancouver, Canada), founded his company, Repeat Diagnostics, in response to the number of questions and requests he received from physicians for tests for telomere length. The company became the first to offer commercial telomere testing in 2005 and now mainly serves medical researchers, although it makes its test available to the public through their physicians for C$400. Nevertheless, Lansdorp thinks that testing is of limited use for the public. “Testing [...] outside the context of research studies is in my view premature. Unfortunately I think some scientists are exploiting it,” he said. “At this point, I would discourage people from getting their telomeres tested unless there are symptoms in the family that may point to a telomere problem, or a disease related to a telomere problem. I don't see why on Earth you would want to do that for normal individuals.” Others are more convinced of the general utility of telomere tests, when used in combination with other diagnostic tools. Elizabeth Blackburn, Professor of Biology and Physiology at the University of California (San Francisco, USA), was a co-recipient of the Nobel Prize in Physiology or Medicine in 2009 for her part in the discovery of telomerase, the enzyme that replenishes telomeres (Sidebar A). She stressed that the point of telomere testing is to obtain “an overall picture using a marker that integrates many inputs, and produces a robust statistical association with [...] disease risks. It is not a specific diagnostic.” Telome Health, Inc. (Menlo Park, California, USA)—the company that Blackburn helped found and that she now advises in a scientific capacity—plans to begin selling its own US$200 telomere test later this year. “The science has been emerging at a rapid pace recently [...] for those who are familiar with the wealth of the evidence and the accumulated data, the overwhelming pattern is that there are clear associations with telomere maintenance, including longitudinal patterns, and health measures that have had well-tested clinical relevance,” she explained.

Sidebar A | Telomeres and telomerase

Telomeres are regions of repetitive DNA sequence that prevent the DNA replication process or damage from degrading the ends of chromosomes, essentially acting as buffers and protecting the genes closest to the chromosome ends. Russian biologist Alexei Olovnikov first hypothesized in the early 1970s that chromosomes could not completely replicate their ends, and that such losses could ultimately lead to the end of cell division (Olovnikov, 1973). Some years later, Elizabeth Blackburn, then a postdoctoral fellow in Joseph Gall's lab at Yale University (New Haven, Connecticut, USA), and her colleagues published work suggesting that telomere shortening was linked with ageing at the cellular level, affected lifespan and could lead to cancer (Blackburn & Gall, 1978; Szostak & Blackburn, 1982). In 1984, Carol Greider, working as a graduate student in Blackburn's lab at the University of California (Berkeley, USA), discovered telomerase, the enzyme that replenishes telomeres. Blackburn and Greider, together with Jack Szostak, were awarded the 2009 Nobel Prize in Physiology or Medicine for “the discovery of how chromosomes are protected by telomeres and the enzyme telomerase” (http://nobelprize.org/nobel_prizes/medicine/laureates/2009/).
María Blasco, Director of the Centro Nacional de Investigaciones Oncológicas (CNIO; Spanish National Cancer Research Centre; Madrid, Spain), is similarly optimistic about the prospect of telomere testing becoming a routine health test. “As an analogy, telomere length testing could be similar to what has occurred with cholesterol tests, which went in [the] early 80s from being an expensive test for which no direct drug treatment was available to being a routine test in general health check-ups,” she said. Carol Greider, Professor and Director of Molecular Biology and Genetics at Johns Hopkins University (Baltimore, Maryland, USA) and co-recipient of the 2009 Nobel Prize with Blackburn, however, does not believe that testing is ready for widespread use, although she agreed that telomere length can reveal a lot about disease and is an important subject for research. “Certainly, right now, I think it's very premature to be offering this kind of testing to the public. I don't think that the research has yet told us about the risks, what we can actually say statistically with high confidence, so it's unclear to me if there is any real value to the general public to testing telomeres,” she said. Blasco is Chief Scientific Advisor to Life Length, a CNIO spin-off company that launched its test last year to a storm of media attention. “For some scientists, there is always a question that needs to be solved or has not been sufficiently evaluated,” she said. “We have lots of information showing that telomere length is important for understanding ageing and certain diseases [...] New technologies have been developed that allow us now to measure telomere length in a large scale using a simple blood sample or a spit sample. The fact that the technology is here and the science is here makes this a good moment to market this testing.”“We have lots of information showing that telomere length is important for understanding ageing and certain diseases [...]
the technology is here and the science is here”Apart from discussion of the science, companies that offer telomere testing are also encountering scepticism from ethicists and other scientists about the value of telomere-length testing for normal healthy people.Lansdorp, who is a medical doctor by training, thinks that practitioners are not yet ready to use and interpret the tests. “It''s a new field and there are good clinical papers out there, but the irony is that our work [that] has highlighted the value of these tests for specific clinical conditions [is] now being used [...] to make the point that it''s really important to have your telomeres tested, but the dots are not connected by a straight line,” he said.Jonathan Stein, Director of Science and Research at SpectraCell Laboratories (Houston, Texas, USA)—which offers its US $250 telomere test as an extension of its nutritional product line that is sold to family physicians, chiropractors and naturopaths—said that there has only really been demand for the telomere test from his company among physicians and their spouses, but not for use in the clinic. “Doctors are incredibly curious about [the test] and then when we do follow-ups in general, they tell us it''s interesting and they know it''s valuable, but they''re not entirely sure what it means to people. Where we go from the bench to bedside, there seems to be a real sticking point,” he said, adding that he thinks demand will increase as the public becomes increasingly educated about telomeres and health.Arthur Caplan, Professor of Bioethics and Director of the Center for Bioethics at the University of Pennsylvania (Philadelphia, USA), is not clear that even an educated public will be interested in what the test can tell them. “We don''t have any great reason to think that people will be interested in knowing facts about themselves [...] if they can''t do anything about it. I think most people would say ''I''m not going to spend money on this until you tell me if there''s something I can do to slow this process or expand my life''.” As such, he thinks that companies that are getting in early to ''cash in'' on the novelty of telomere testing are unlikely to see huge success, partly because the science is not yet settled.Calvin Harley, President and Chief Science Officer at Telome Health, disagrees. He thinks that two things will drive demand for telomere testing: the growing number of clinical studies validating the utility of the test, and the growing interest in lifestyle changes and interventions that help to maintain telomeres....two things will drive demand for telomere testing: the growing number of clinical studies [...] and [...] interest in lifestyle changes and interventions that help maintain telomeresBut these are early days. Jerry Shay, Professor of Cell Biology and Neuroscience at the University of Texas Southwestern Medical Center (Dallas, USA) and an adviser to the company Life Length, said that early adopters are likely to be the health conscious and the curious. “Some people will say, ''Well, look, I had my telomeres measured: I''m a 60 year old with 50-year-old telomeres'',” he explained. “It will have ''My telomeres are longer than your telomeres'' type of cocktail talk appeal. That''s fine. I have no problem with that as long as we can follow this sort of population and individuals over decades.”“It will have ''My telomeres are longer than your telomeres'' type of cocktail talk appeal [...] 
I have no problem with that as long as we can follow this sort of population and individuals...”Shay''s last point is the key—research and data collection. Even those commercializing telomere-length tests agree that our understanding of telomere biology, although extensive, is incomplete and that we have yet to unpick fully the links between telomeres and disease. Stefan Kiechl, a telomere researcher in the Department of Neurology at Innsbruck Medical University (Austria), published an article last year on telomere length and cancer (Willeit et al, 2010). “The appealing thing with telomere length measurements is that they allow the estimation of the biological—in contrast to the chronological—age of an organism. This was previously not possible. Moreover, long telomere length has been linked with a low risk of advanced atherosclerosis, cardiovascular disease and cancer, and, vice versa, short telomere length is associated with a higher risk of these diseases.”But, he said that problems remain to be resolved, such as whether telomere length can only be measured in cells that are readily available, such as leukocytes, and whether telomere length in leukocytes varies substantially from telomere length in other tissues and cells. “Moreover, there is still insufficient knowledge on which lifestyle behaviours and other factors affect telomere length,” he concluded.This might be a bumpy road. When Life Length announced its launch in May, newspapers carried headlines such as ''The £400 test that tells you how long you''ll live'', reporting: “A blood test that can show how fast someone is ageing—and offers the tantalizing possibility of estimating how long they have left to live—is to go on sale to the general public in Britain later this year” (Connor, 2011).The story was catchy, but Life Length officials are determined to explain that, despite the name of the company, its tests do not predict longevity for individuals. Blasco said that the word ''life'' in the name is meant as an analogy between telomeres and life. “A British newspaper chose to use this headline, but the company name has no intention to predict longevity,” she said. Instead, the name refers to extensive research correlating the shortened chromosome tips with the risk for certain diseases and personal habits, such as smoking, obesity, lack of exercise and stress, Blasco explained.Life Length''s test measures the abundance of short telomeres, as they claim that there is genetic evidence that short telomeres are the ones that are relevant to disease. “The preliminary results are exciting: we are observing that the percent of short telomeres with increasing age is more divergent between individuals than average telomere length for the same group of individuals,” Blasco explained. “This is exactly what you would expect from a parameter [abundance of short telomeres] that reflects the effects of environmental factors and lifestyle on people''s telomeres.” She noted that being in a lower quartile of average telomere length and the higher quartile of abundance of short telomeres would indicate that telomeres are shorter than normal for a given age, which has been correlated with a higher risk of developing certain diseases.So, what can be done about an abundance of short telomeres? Lansdorp said that, as a physician, he would be hard pressed to know what to tell patients to do about it. “The best measure of someone''s age and life expectancy is the date on their birth certificate. 
Telomere length, as a biomarker, shows a clear correlation with age at the population level. For an individual the value of telomere length is very limited,” he said. “I suspect there''s going to be a lot of false alarms based on biological variation as well as measurement errors using these less accurate tests.”“The best measure of someone''s age and life expectancy is the date on their birth certificate. [...] For an individual the value of telomere length is very limited”Harley, however, said that if telomere length were perfectly correlated with age, it would be a useless biomarker, except for in forensic work. “The differences in telomere length between individuals at any given age is where the utility lies [...] people with shorter telomeres are at higher risk for morbidity and mortality. In addition, there is emerging data suggesting that people with shorter telomeres respond differently to certain drugs than people with longer telomeres. This fits into the paradigm of personalized medicine,” he said....if telomere length were perfectly correlated with age, it would be a useless biomarker, except for in forensic workWhile he was at Geron Corporation, Harley was the lead discoverer of telomerase activators purified from the root of Astragalus membranaceus. Harley, Blasco and colleagues have published two peer-reviewed papers on one of those molecules, TA-65—one in humans and the other in mice (de Jesus et al, 2011; Harley et al, 2011). Both showed positive effects on certain health measures, and Blasco''s lab found that mice treated with TA-65 had improved health status compared with those given a placebo. “However, we did not see significant effects on longevity,” Blasco said.In the meantime, researchers are squabbling about the techniques used by the testing companies. Greider maintains that Flow-FISH (fluorescence in situ hybridization), which was developed by Lansdorp, is the gold standard used by clinical researchers and that it is the most reliable technique. Harley argues that the quantitative real-time (qRT)-PCR assay developed by the Blackburn lab is just as reliable, and easier to scale-up for commercial use. Blasco pointed out that, similarly to its rivals, the qFISH used by Life Length offers measurements of average telomere length, but that it is the only company to report the percentage of short telomeres in individual cells. In the end, Lansdorp suggested that the errors inherent in the tests, along with biological variations and cost, should give healthy people pause for thought about being tested.Ultimately, whichever test for telomere length is used and whatever the results can tell us about longevity and health, it is unlikely that manipulating telomere length will unlock the fountain of youth, à la Spanish explorer Juan Ponce de León y Figueroa (1474–1521). Nevertheless, telomere testing could become a key diagnostic tool for getting a few more years out of life, and it could motivate people to follow healthier lifestyles. As Kiechl pointed out, “[t]here is convincing evidence that calculation of an individual''s risk of cardiovascular disease [...] substantially enhances compliance for taking medicines and the willingness to change lifestyle. Knowing one''s biological age may well have similar favourable effects.”  相似文献   

18.
Greener M 《EMBO reports》2008,9(11):1067-1069
A consensus definition of life remains elusive. In July this year, the Phoenix Lander robot—launched by NASA in 2007 as part of the Phoenix mission to Mars—provided the first irrefutable proof that water exists on the Red Planet. “We've seen evidence for this water ice before in observations by the Mars Odyssey orbiter and in disappearing chunks observed by Phoenix […], but this is the first time Martian water has been touched and tasted,” commented lead scientist William Boynton from the University of Arizona, USA (NASA, 2008). The robot's discovery of water in a scooped-up soil sample increases the probability that there is, or was, life on Mars. Meanwhile, the Darwin project, under development by the European Space Agency (ESA; Paris, France; www.esa.int/science/darwin), envisages a flotilla of four or five free-flying spacecraft to search for the chemical signatures of life in 25 to 50 planetary systems. Yet, in the vastness of space, to paraphrase the British astrophysicist Arthur Eddington (1882–1944), life might be not only stranger than we imagine, but also stranger than we can imagine. The limits of our current definitions of life raise the possibility that we would not be able to recognize an extra-terrestrial organism. Back on Earth, molecular biologists—whether deliberately or not—are empirically tackling the question of what life is. Researchers at the J Craig Venter Institute (Rockville, MD, USA), for example, have synthesized an artificial bacterial genome (Gibson et al, 2008). Others have worked on ‘minimal cells' with the aim of synthesizing a ‘bioreactor' that contains the minimum of components necessary to be self-sustaining, reproduce and evolve. Some biologists regard these features as the hallmarks of life (Luisi, 2007). However, to decide who is first in the ‘race to create life' requires a consensus definition of life itself. “A definition of the precise boundary between complex chemistry and life will be critical in deciding which group has succeeded in what might be regarded by the public as the world's first theology practical,” commented Jamie Davies, Professor of Experimental Anatomy at the University of Edinburgh, UK. For most biologists, defining life is a fascinating, fundamental, but largely academic question. It is, however, crucial for exobiologists looking for extra-terrestrial life on Mars, Jupiter's moon Europa, Saturn's moon Titan and on planets outside our solar system. In their search for life, exobiologists base their working hypothesis on the only example to hand: life on Earth. “At the moment, we can only assume that life elsewhere is based on the same principles as on Earth,” said Malcolm Fridlund, Secretary for the Exo-Planet Roadmap Advisory Team at the ESA's European Space Research and Technology Centre (Noordwijk, The Netherlands). “We should, however, always remember that the universe is a peculiar place and try to interpret unexpected results in terms of new physics and chemistry.” The ESA's Darwin mission will, therefore, search for life-related gases such as carbon dioxide, water, methane and ozone in the atmospheres of other planets. On Earth, the emergence of life altered the balance of atmospheric gases: living organisms produced all of the Earth's oxygen, which now accounts for one-fifth of the atmosphere. “If all life on Earth was extinguished, the oxygen in our atmosphere would disappear in less than 4 million years, which is a very short time as planets go—the Earth is 4.5 billion years old,” Fridlund said.
He added that organisms present in the early phases of life on Earth produced methane, which alters atmospheric composition compared with a planet devoid of life. Although the Darwin project will use a pragmatic and specific definition of life, biologists, philosophers and science-fiction authors have devised numerous other definitions—none of which are entirely satisfactory. Some are based on basic physiological characteristics: a living organism must feed, grow, metabolize, respond to stimuli and reproduce. Others invoke metabolic definitions that define a living organism as having a distinct boundary—such as a membrane—which facilitates interaction with the environment and transfers the raw materials needed to maintain its structure (Wharton, 2002). The minimal cell project, for example, defines cellular life as “the capability to display a concert of three main properties: self-maintenance (metabolism), reproduction and evolution. When these three properties are simultaneously present, we will have a full fledged cellular life” (Luisi, 2007). These concepts regard life as an emergent phenomenon arising from the interaction of non-living chemical components. Cryptobiosis—hidden life, also known as anabiosis—and bacterial endospores challenge the physiological and metabolic elements of these definitions (Wharton, 2002). When the environment changes, certain organisms are able to undergo cryptobiosis—a state in which their metabolic activity either ceases reversibly or is barely discernible. Cryptobiosis allows the larvae of the African fly Polypedilum vanderplanki to survive desiccation for up to 17 years and temperatures ranging from −270 °C (liquid helium) to 106 °C (Watanabe et al, 2002). It also allows the cysts of the brine shrimp Artemia to survive desiccation, ultraviolet radiation, extremes of temperature (Wharton, 2002) and even toyshops, which sell the cysts as ‘sea monkeys'. Organisms in a cryptobiotic state show characteristics that vary markedly from what we normally consider to be life, although they are certainly not dead. “[C]ryptobiosis is a unique state of biological organization”, commented James Clegg, from the Bodega Marine Laboratory at the University of California (Davis, CA, USA), in an article in 2001 (Clegg, 2001). Bacterial endospores, which are the “hardiest known form of life on Earth” (Nicholson et al, 2000), are able to withstand almost any environment—perhaps even interplanetary space. Microbiologists isolated endospores of strict thermophiles from cold lake sediments and revived spores from samples some 100,000 years old (Nicholson et al, 2000). Another problem with the definitions of life is that they can expand beyond biology. The minimal cell project, for example, in common with most modern definitions of life, encompasses the ability to undergo Darwinian evolution (Wharton, 2002). “To be considered alive, the organism needs to be able to undergo extensive genetic modification through natural selection,” said Professor Paul Freemont from Imperial College London, UK, whose research interests encompass synthetic biology. But the virtual ‘organisms' in computer simulations such as the Game of Life (www.bitstorm.org/gameoflife) and Tierra (http://life.ou.edu/tierra) also exhibit life-like characteristics, including growth, death and evolution—similar to robots and other artificial systems that attempt to mimic life (Guruprasad & Sekar, 2006).
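To make the Game of Life reference concrete, the following is a minimal, illustrative Python sketch (not part of the original article); the step function and the glider pattern shown here are standard Conway conventions rather than anything specific to the simulations cited above. Two simple rules, birth on three live neighbours and survival on two or three, are enough to produce the 'growth' and 'death' that make such simulations look superficially life-like.

from collections import Counter

def step(live):
    # Count how many live neighbours each cell adjacent to a live cell has.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3; everything else dies.
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A 'glider': a five-cell pattern that moves across the grid generation after generation.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for generation in range(4):
    print(generation, sorted(glider))
    glider = step(glider)

Running the sketch prints the glider's coordinates shifting each generation, which is the kind of behaviour the article describes as life-like despite arising from nothing more than a deterministic update rule.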
“At the moment, we have some problems differentiating these approaches from something biologists consider [to be] alive,” Fridlund commented.…to decide who is first in the ‘race to create life'' requires a consensus definition of lifeBoth the genetic code and all computer-programming languages are means of communicating large quantities of codified information, which adds another element to a comprehensive definition of life. Guenther Witzany, an Austrian philosopher, has developed a “theory of communicative nature” that, he claims, differentiates biotic and abiotic life. “Life is distinguished from non-living matter by language and communication,” Witzany said. According to his theory, RNA and DNA use a ‘molecular syntax'' to make sense of the genetic code in a manner similar to language. This paragraph, for example, could contain the same words in a random order; it would be meaningless without syntactic and semantic rules. “The RNA/DNA language follows syntactic, semantic and pragmatic rules which are absent in [a] random-like mixture of nucleic acids,” Witzany explained.Yet, successful communication requires both a speaker using the rules and a listener who is aware of and can understand the syntax and semantics. For example, cells, tissues, organs and organisms communicate with each other to coordinate and organize their activities; in other words, they exchange signals that contain meaning. Noradrenaline binding to a β-adrenergic receptor in the bronchi communicates a signal that says ‘dilate''. “If communication processes are deformed, destroyed or otherwise incorrectly mediated, both coordination and organisation of cellular life is damaged or disturbed, which can lead to disease,” Witzany added. “Cellular life also interprets abiotic environmental circumstances—such as the availability of nutrients, temperature and so on—to generate appropriate behaviour.”Nonetheless, even definitions of life that include all the elements mentioned so far might still be incomplete. “One can make a very complex definition that covers life on the Earth, but what if we find life elsewhere and it is different? My opinion, shared by many, is that we don''t have a clue of how life arose on Earth, even if there are some hypotheses,” Fridlund said. “This underlies many of our problems defining life. Since we do not have a good minimum definition of life, it is hard or impossible to find out how life arose without observing the process. Nevertheless, I''m an optimist who believes the universe is understandable with some hard work and I think we will understand these issues one day.”Both synthetic biology and research on organisms that live in extreme conditions allow biologists to explore biological boundaries, which might help them to reach a consensual minimum definition of life, and understand how it arose and evolved. Life is certainly able to flourish in some remarkably hostile environments. Thermus aquaticus, for example, is metabolically optimal in the springs of Yellowstone National Park at temperatures between 75 °C and 80 °C. Another extremophile, Deinococcus radiodurans, has evolved a highly efficient biphasic system to repair radiation-induced DNA breaks (Misra et al, 2006) and, as Fridlund noted, “is remarkably resistant to gamma radiation and even lives in the cooling ponds of nuclear reactors.”In turn, synthetic biology allows for a detailed examination of the elements that define life, including the minimum set of genes required to create a living organism. 
Researchers at the J Craig Venter Institute, for example, have synthesized a 582,970-base-pair Mycoplasma genitalium genome containing all the genes of the wild-type bacteria, except one that they disrupted to block pathogenicity and allow for selection. ‘Watermarks'' at intergenic sites that tolerate transposon insertions identify the synthetic genome, which would otherwise be indistinguishable from the wild type (Gibson et al, 2008).Yet, as Pier Luigi Luisi from the University of Roma in Italy remarked, even M. genitalium is relatively complex. “The question is whether such complexity is necessary for cellular life, or whether, instead, cellular life could, in principle, also be possible with a much lower number of molecular components”, he said. After all, life probably did not start with cells that already contained thousands of genes (Luisi, 2007).…researchers will continue their attempts to create life in the test tube—it is, after all, one of the greatest scientific challengesTo investigate further the minimum number of genes required for life, researchers are using minimal cell models: synthetic genomes that can be included in liposomes, which themselves show some life-like characteristics. Certain lipid vesicles are able to grow, divide and grow again, and can include polymerase enzymes to synthesize RNA from external substrates as well as functional translation apparatuses, including ribosomes (Deamer, 2005).However, the requirement that an organism be subject to natural selection to be considered alive could prove to be a major hurdle for current attempts to create life. As Freemont commented: “Synthetic biologists could include the components that go into a cell and create an organism [that is] indistinguishable from one that evolved naturally and that can replicate […] We are beginning to get to grips with what makes the cell work. Including an element that undergoes natural selection is proving more intractable.”John Dupré, Professor of Philosophy of Science and Director of the Economic and Social Research Council (ESRC) Centre for Genomics in Society at the University of Exeter, UK, commented that synthetic biologists still approach the construction of a minimal organism with certain preconceptions. “All synthetic biology research assumes certain things about life and what it is, and any claims to have ‘confirmed'' certain intuitions—such as life is not a vital principle—aren''t really adding empirical evidence for those intuitions. Anyone with the opposite intuition may simply refuse to admit that the objects in question are living,” he said. “To the extent that synthetic biology is able to draw a clear line between life and non-life, this is only possible in relation to defining concepts brought to the research. For example, synthetic biologists may be able to determine the number of genes required for minimal function. Nevertheless, ‘what counts as life'' is unaffected by minimal genomics.”Partly because of these preconceptions, Dan Nicholson, a former molecular biologist now working at the ESRC Centre, commented that synthetic biology adds little to the understanding of life already gained from molecular biology and biochemistry. Nevertheless, he said, synthetic biology might allow us to go boldly into the realms of biological possibility where evolution has not gone before.An engineered synthetic organism could, for example, express novel amino acids, proteins, nucleic acids or vesicular forms. 
A synthetic organism could use pyranosyl-RNA, which produces a stronger and more selective pairing system than the naturally occurring furanosyl-RNA (Bolli et al, 1997). Furthermore, the synthesis of proteins that do not exist in nature—so-called never-born proteins—could help scientists to understand why evolutionary pressures selected only certain structures.

As Luisi remarked, the ratio between the number of theoretically possible proteins containing 100 amino acids and the number actually present in nature is close to the ratio between the volume of the universe and the volume of a single hydrogen atom, or the ratio between all the sand in the Sahara Desert and a single grain (see the rough estimate sketched below). Exploring never-born proteins could, therefore, allow synthetic biologists to determine whether particular physical, structural, catalytic, thermodynamic and other properties maximized the evolutionary fitness of natural proteins, or whether the current protein repertoire is predominantly the result of chance (Luisi, 2007).

In the final analysis, as with all science, deep understanding is more important than labelling with words.

“Synthetic biology also could conceivably help overcome the ‘n = 1 problem'—namely, that we base biological theorising on terrestrial life only,” Nicholson said. “In this way, synthetic biology could contribute to the development of a more general, broader understanding of what life is and how it might be defined.”

No matter the uncertainties, researchers will continue their attempts to create life in the test tube—it is, after all, one of the greatest scientific challenges. Whether or not they succeed will depend partly on the definition of life that they use; in any case, the research should yield numerous insights that benefit biologists generally. “The process of creating a living system from chemical components will undoubtedly offer many rich insights into biology,” Davies concluded. “However, the definition will, I fear, reflect politics more than biology. Any definition will, therefore, be subject to a lot of inter-lab political pressure. Definitions are also important for bioethical legislation and, as a result, reflect larger politics more than biology. In the final analysis, as with all science, deep understanding is more important than labelling with words.”  相似文献
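The scale of Luisi's comparison can be made concrete with a back-of-envelope estimate. The first figure follows directly from the 20 standard amino acids; the count of natural proteins used in the second line (~10^13) is an illustrative assumption added here, not a number given in the article.

\[
N_{\text{possible}} = 20^{100} = 10^{\,100\log_{10}20} \approx 1.3\times10^{130}
\]
\[
\frac{N_{\text{possible}}}{N_{\text{natural}}} \approx \frac{1.3\times10^{130}}{10^{13}} \approx 10^{117}
\qquad \text{(assuming } N_{\text{natural}}\sim10^{13}\text{, an illustrative guess)}
\]

Even with a generous estimate of the number of natural proteins, the theoretically possible sequence space exceeds it by more than a hundred orders of magnitude, which is the point of the hydrogen-atom and sand-grain analogies.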

19.
Hunter P 《EMBO reports》2010,11(12):924-926
The global response to the credit crunch has varied from belt-tightening to spending sprees. Philip Hunter investigates how various countries are reacting to the financial crisis in terms of supporting scientific research.

The overall state of biomedical research in the wake of the global financial crisis remains unclear amid growing concern that competition for science funding is compromising the pursuit of research. Such concerns pre-date the credit crunch, but there is a feeling that an increasing amount of time and energy is being wasted in the ongoing scramble for grants, in the face of mounting pressure from funding agencies demanding value for money. Another problem is balancing funding between different fields; while the biomedical sciences have generally fared well, they are increasingly dependent on basic research in physics and chemistry, which is in greater jeopardy. This has led to calls for rebalancing funding, in order to ensure the long-term viability of all fields in an increasingly multidisciplinary and collaborative research world.

For countries that are cutting funding—such as Spain, Italy and the UK—the immediate priority is to preserve the fundamental research base and avoid a significant drain of expertise, either to rival countries or away from science altogether. This has highlighted the plight of postdoctoral researchers, who have traditionally been the first to suffer from funding cuts, partly because losing them has little immediate impact on a country's scientific competitiveness. Postdocs have been the first to go whenever budgets have been cut, according to Richard Frankel, a physicist at California Polytechnic State University in San Luis Obispo, who investigates magnetotaxis in bacteria. “In the short term there will be little effect but the long-term effects can be devastating,” he said.

…there is a feeling that an increasing amount of time and energy is being wasted in the ongoing scramble for grants, in the face of mounting pressure from funding agencies…

According to Peter Stadler, head of a bioinformatics group at the University of Leipzig in Germany, such cuts tend to cause the long-term erosion of a country's science skills base. “Short-term cuts in science funding translate totally into a brain drain, since they predominantly affect young researchers who are paid from the soft money that is drying up first,” said Stadler. “They either leave science, an irreversible step, or move abroad but do not come back later, because the medium-term effect of cuts is a reduction in career opportunities and fiercer competition giving those already in the system a big advantage.”

Even when young researchers are not directly affected, the prevailing culture of short-term funding—which requires ongoing grant applications—can be disruptive, according to Xavier Salvatella, principal investigator in the Laboratory of Molecular Biophysics at the Institute for Research in Biomedicine in Barcelona, Spain. “I do not think the situation is dramatic but too much time is indeed spent writing proposals,” he commented. “Because success rates are decreasing, the time devoted to raise funds to run the lab necessarily needs to increase.”

At the University of Adelaide in Australia, Andrew Somogyi, professor of pharmacology, thinks that the situation is serious: “[M]y postdocs would spend about half their time applying for grants.” Somogyi pointed out that the success rate has been declining in Australia, as it has in some other countries.
“For ARC [Australian Research Council] the success rate is now close to 20%, which means many excellent projects don't get funding because the assessment is now so fine cut,” he said.

Similar developments have taken place in the USA, at both the National Institutes of Health (NIH), which provides US$16 billion of funding per year, and the American Cancer Society (ACS), the country's largest private non-profit funder of cancer research, which has a much smaller pot of US$120 million per year. The NIH funded 21% of the research proposals submitted to it in 2009, compared with 32% a decade earlier, while the ACS approves only 15% of grant applications, down several percentage points over the past few years.

While the NIH is prevented by federal law from allowing observers into its grant review meetings, the ACS did allow a reporter from Nature to attend one of its sessions, on the condition that the names of referees and the applications themselves were not revealed (Powell, 2010). The general finding was that while the review process works well when around 30% of proposals are successful, it tends to break down as the success rate drops: more arbitrary decisions are made and the risk of strong pitches being rejected increases. This can also discourage the best people from serving as reviewers, because the process becomes more tiring and time-consuming.

Even when young researchers are not directly affected, the prevailing culture of short-term funding—which requires ongoing grant applications—can be disruptive…

In some countries, funding shortfalls are also leading to the loss of permanent jobs, for example in the UK, where finance minister George Osborne announced on October 20 that the science budget would be frozen at £4.6 billion, rather than cut as had been expected. Even so, combined with the cut in funding for universities that was announced on the same day, this raises the prospect of reductions in academic staff numbers, which could affect research projects. This follows several years of increasing funding for UK science. Such uncertainty is damaging, according to Cornelius Gross, deputy head of the mouse biology unit at the European Molecular Biology Laboratory in Monterotondo, Italy. “Large fluctuations in funding have been shown to cause damage beyond their direct magnitude as can be seen in the US where the Clinton boom was inevitably followed by a slowdown that led to rapid and extreme tightening of budgets,” he said.

Some countries are aware of these dangers and have acted to protect budgets and, in some cases, even increase spending. A report by the OECD argued that countries and companies that boosted research and development spending during the ‘creative destruction' of an economic downturn tended to gain ground on their competitors and emerge from the crisis in a relatively stronger position (OECD, 2009). This was part of the rationale of the US stimulus package, which was intended to provide an immediate lift to the economy and has been followed by a slight increase in funding. The NIH's budget is set to increase by $1 billion, or 3%, from 2010 to 2011, reaching just over $32 billion. This looks like a real-terms increase, since inflation in the USA is currently between 1% and 2%. However, there are fears that budgets will soon be cut; even now, the small increase at the federal level is being offset by cuts in state support, according to Mike Seibert, research fellow at the US Department of Energy's National Renewable Energy Laboratory.
“The stimulus funds are disappearing in the US, and the overall budget for science may be facing a correction at the national level as economic, budget and national debt issues are addressed,” he said. “The states in most cases are suffering their own budget crises and will be cutting back on anything that is not nailed down.”

…countries and companies that boosted research and development spending during the ‘creative destruction' of an economic downturn tended to gain ground on their competitors…

In Germany, the overall funding situation is also complicated by the split between the federal government and the 16 state governments, each of which has its own budget for science. In contrast to many other countries, though, both federal and state governments have responded boldly to the credit crisis by increasing the total budget for the DFG (Deutsche Forschungsgemeinschaft)—Germany's largest research funding agency—to €2.3 billion in 2011. Moreover, total funding for research and education from the BMBF (Federal Ministry of Education and Research) is expected to increase by another 7%, from €10.9 billion in 2010 to €11.64 billion, although the overall federal budget is set to shrink by 3.8% under Germany's austerity measures (Anon, 2010). There have also been increases in funding from non-government sources, such as the Fraunhofer Society, Europe's largest application-oriented research organization, which has an annual budget of €1.6 billion.

The German line has been strongly applauded by the European Union, which since 2007 has channelled its funding for cutting-edge research through the European Research Council (ERC). The ERC's current budget of €7.5 billion, which runs until 2013, was set in 2007 and negotiations for the next period have not yet begun, but the ERC's executive agency director, Jack Metthey, has indicated that it will be increased: “The Commission will firmly sustain in the negotiations the view that research and innovation, central to the Europe 2020 Strategy agreed by the Member States, should be a top budgetary priority.” Metthey also implied that governments cutting funding, as the UK had been planning to do, were making a false economy that would gain only in the short term. “Situations vary at the national level but the European Commission believes that governments should maintain and even increase research and innovation investments during difficult times, because these are pro-growth, anti-crisis investments,” he said.

Many other countries have to cope with flat or declining science budgets, and some are therefore exploring ways of doing more with less. In Japan, for instance, money has been concentrated on larger projects and fewer scientists, which has intensified the grant application process. Since 2002, the total Japanese government budget for science and technology has remained flat at around ¥3,500 billion—or €27 billion at current exchange rates—with a 1% annual decline in university support but increased funding for projects considered to be of high value to the economy. This culminated in March 2010 with the launch of the ¥100 billion (€880 million) programme for World Leading Innovative Research and Development on Science and Technology.

But such attempts to make funding more competitive, or to focus it on specific areas, could have unintended side effects on innovation and risk-taking.
One side effect can be to favour scientists who may be less creative but are good at attracting grants, according to Roger Butlin, an evolutionary biologist at the University of Sheffield in the UK. “Some productive staff are being targeted because they do not bring in grants, so money is taking precedence over output,” said Butlin. “This is very dangerous if it results in loss of good theoreticians or data specialists, especially as the latter will be a critical group in the coming years.”

“Scientists are usually very energetic when they can pursue their own ideas and less so when the research target is too narrowly prescribed”

There have been attempts to provide funding for young scientists based entirely on merit, such as the ERC ‘Starting Grant' for top young researchers, whose budget was increased by 25% to €661 million for 2011. Although such schemes are welcome, they could backfire unless they are backed by measures that continue to support scientists after these early-career grants expire, according to Gross. “There are moves to introduce significant funding for young investigators to encourage independence, so-called anti-brain-drain grants,” he said. “These are dangerous if provided without later independent positions for these people and a national merit-based funding agency to support their future work.”

Such schemes might work better if they were incorporated into longer-term funding programmes that provide some security as well as the freedom to expand a project and explore promising side avenues. Butlin cited the Canadian ‘Discovery Grant' scheme as an example worth adopting elsewhere; it supports ongoing programmes with long-term goals, giving researchers freedom to pursue new lines of investigation, provided that they fit within the overall objective of the project.

To some extent, the system of ‘open calls'—supported by some European funding agencies—has the same objective, although it might not provide long-term funding. The idea is to allow scientists to manoeuvre within a broad objective, rather than confining them to specific lines of research or ‘thematic calls', which tend to be highly focused. “The majority of funding should be distributed through open calls, rather than thematic calls,” said Thomas Höfer from the Modeling Research Group at the German Cancer Research Center & BioQuant Center in Heidelberg. “Scientists are usually very energetic when they can pursue their own ideas and less so when the research target is too narrowly prescribed. In my experience as a reviewer at both the national and EU level, open calls are also better at funding high-quality research whereas too narrow thematic calls often result in less coherent proposals.”

“Cutting science, and education, is the national equivalent of a farmer eating his ‘seed corn', and will lead to developing nation status within a generation”

Common threads seem to emerge from the different themes and opinions about funding: budgets should be consistent over time and spread fairly among all disciplines, rather than focused on targeted objectives. They should also be spread across the working lifetime of a scientist, rather than being fired, scatter-gun fashion, at young researchers. Finally, policies should put a greater emphasis on long-term support for the best scientists and projects, chosen on merit.
Above all, funding policy should reflect the fundamental importance of science to economies, as Seibert concluded: “Cutting science, and education, is the national equivalent of a farmer eating his ‘seed corn', and will lead to developing nation status within a generation.”  相似文献
