Similar Documents
1.
A large number of recent studies suggest that the sensorimotor system uses probabilistic models to predict its environment and makes inferences about unobserved variables in line with Bayesian statistics. One of the important features of Bayesian statistics is Occam's Razor: an inbuilt preference for simpler models when comparing competing models that explain some observed data equally well. Here, we test directly for Occam's Razor in sensorimotor control. We designed a sensorimotor task in which participants had to draw lines through clouds of noisy samples of an unobserved curve generated by one of two possible probabilistic models: a simple model with a large length scale, leading to smooth curves, and a complex model with a short length scale, leading to more wiggly curves. In training trials, participants were informed about the model that generated the stimulus so that they could learn the statistics of each model. In probe trials, participants were then exposed to ambiguous stimuli. When an ambiguous stimulus could be fitted equally well by both models, participants showed a clear preference for the simpler model. Moreover, we found that participants' choice behaviour was quantitatively consistent with Bayesian Occam's Razor. We also show that participants' drawn trajectories were similar to samples from the Bayesian predictive distribution over trajectories and significantly different from two non-probabilistic heuristics. In two control experiments, we show that the preference for the simpler model cannot simply be explained by a difference in physical effort or by a preference for curve smoothness. Our results suggest that Occam's Razor is a general behavioural principle already present during sensorimotor processing.
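The abstract describes the two generative models only qualitatively (a smooth process with a long length scale versus a wiggly one with a short length scale), so the snippet below is a minimal illustrative sketch rather than the authors' analysis: it scores two Gaussian-process-style models on the same noisy curve by their log marginal likelihood, whose log-determinant term is the automatic complexity penalty behind Bayesian Occam's Razor. The kernel form, length scales, noise level and test curve are all assumptions made for illustration.

```python
import numpy as np

def rbf_kernel(x, length_scale, signal_var=1.0, noise_var=0.05):
    """Squared-exponential covariance plus observation noise (illustrative choices)."""
    d = x[:, None] - x[None, :]
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2) + noise_var * np.eye(len(x))

def log_evidence(y, K):
    """Log marginal likelihood of a zero-mean Gaussian model: a data-fit term plus
    a log-determinant complexity (Occam) penalty."""
    alpha = np.linalg.solve(K, y)
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * y @ alpha - 0.5 * logdet - 0.5 * len(y) * np.log(2 * np.pi)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
# An "ambiguous" stimulus: a gently varying curve that both models can fit reasonably well.
y = 0.5 * np.sin(np.pi * x) + 0.05 * rng.standard_normal(len(x))

K_simple = rbf_kernel(x, length_scale=0.5)    # simple model: long length scale, smooth curves
K_complex = rbf_kernel(x, length_scale=0.1)   # complex model: short length scale, wiggly curves

lml_simple, lml_complex = log_evidence(y, K_simple), log_evidence(y, K_complex)
# With equal priors, the posterior odds equal the evidence ratio; on data like this the
# less flexible model typically wins because its probability mass sits on smooth curves.
print(f"log evidence: simple {lml_simple:.2f}, complex {lml_complex:.2f}")
print(f"P(simple | data) = {1.0 / (1.0 + np.exp(lml_complex - lml_simple)):.3f}")
```

Comparing such posterior model probabilities with participants' choices on ambiguous probe trials is, in spirit, the kind of quantitative test of Bayesian Occam's Razor the abstract refers to.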

2.

Objective

Previous studies have identified different typologies of role models (as teacher/supervisor, physician and person) and explored which faculty characteristics distinguish good role models. The aim of this study was to explore how, and to what extent, clinical faculty's teaching performance influences residents' evaluations of faculty's different role modelling statuses, especially across different specialties.

Methods

In a prospective, multicenter, multispecialty study of faculty's teaching performance, we used web-based questionnaires to gather empirical data from residents. The main outcome measures were the different typologies of role modelling. The predictors were faculty's overall teaching performance and faculty's teaching performance on specific domains of teaching. The data were analyzed using multilevel regression equations.

Results

In total, 219 residents (69% response rate) filled out 2,111 questionnaires about 423 faculty members (96% response rate). Faculty's overall teaching performance influenced all role model typologies (ORs ranging from 8.0 to 166.2). For the specific domains of teaching, all three role model typologies were strongly associated with “professional attitude towards residents” (OR: 3.28 for the teacher/supervisor role, 2.72 for the physician role and 7.20 for the person role). Further, the teacher/supervisor role was strongly associated with “feedback” and “learning climate” (ORs: 3.23 and 2.70, respectively). However, the associations of the specific domains of teaching with faculty's role modelling varied widely across specialties.

Conclusion

This study suggests that faculty can substantially enhance their role modelling by improving their teaching performance. The amount of influence that the specific domains of teaching have on role modelling differs across specialties.
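The abstract reports its analysis only at the level of odds ratios from multilevel regression. As a rough, hypothetical illustration of where such odds ratios come from, the sketch below fits a single-level logistic regression to simulated questionnaire data; the variable names, the simulated effect sizes and the omission of the residents-within-faculty nesting are simplifying assumptions, not the study's actual model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000  # hypothetical number of questionnaires, roughly the scale reported above

# Simulated predictors: standardised teaching-performance scores on two domains (illustrative).
prof_attitude = rng.standard_normal(n)
feedback = rng.standard_normal(n)

# Simulated binary outcome: whether the resident rates the faculty member as a role model.
# The coefficients are invented purely to generate data with a positive association.
linpred = -0.5 + 1.0 * prof_attitude + 0.8 * feedback
is_role_model = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

X = sm.add_constant(np.column_stack([prof_attitude, feedback]))
fit = sm.Logit(is_role_model, X).fit(disp=False)

# Odds ratios are the exponentiated regression coefficients: an OR of about 2.7 means a
# one-SD higher score multiplies the odds of being rated a role model by about 2.7,
# holding the other predictor fixed.
odds_ratios = np.exp(fit.params[1:])
print(dict(zip(["professional attitude", "feedback"], np.round(odds_ratios, 2))))
```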

3.
A major challenge in computational biology is constraining free parameters in mathematical models. Adjusting a parameter to make a given model output more realistic sometimes has unexpected and undesirable effects on other model behaviors. Here, we extend a regression-based method for parameter sensitivity analysis and show that a straightforward procedure can uniquely define most ionic conductances in a well-known model of the human ventricular myocyte. The model's parameter sensitivity was analyzed by randomizing ionic conductances, running repeated simulations to measure physiological outputs, then collecting the randomized parameters and simulation results as “input” and “output” matrices, respectively. Multivariable regression derived a matrix whose elements indicate how changes in conductances influence model outputs. We show here that if the number of linearly independent outputs equals the number of inputs, the regression matrix can be inverted. This is significant because it implies that the inverted matrix can specify the ionic conductances required to generate a particular combination of model outputs. Applying this idea to the myocyte model tested, we found that most ionic conductances could be specified with precision (R² > 0.77 for 12 out of 16 parameters). We also applied this method to a test case of changes in electrophysiology caused by heart failure and found that changes in most parameters could be well predicted. We complemented our findings using a Bayesian approach to demonstrate that model parameters cannot be specified using limited outputs, but that they can be successfully constrained if multiple outputs are considered. Our results place on a solid mathematical footing the intuition-based procedure of simultaneously matching a model's output to several data sets. More generally, this method shows promise as a tool to define model parameters, in electrophysiology and in other biological fields.
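The ventricular myocyte model itself is not reproduced here, so the sketch below only illustrates the general recipe the abstract describes, with a hypothetical stand-in for the simulator: randomise the parameters, regress log outputs on log parameters to obtain the sensitivity matrix, and, because the number of outputs equals the number of inputs, invert that matrix to predict the parameters needed for a given combination of outputs. The toy output function, parameter count and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_params = n_outputs = 4   # equal numbers of inputs and outputs, so the regression matrix is square
n_trials = 400

def toy_model(g):
    """Hypothetical stand-in for the myocyte simulator: maps conductance scale factors g
    to positive 'physiological outputs' with mild nonlinearity and multiplicative noise."""
    out = np.array([
        1.5 * g[0] + 0.5 * g[1],
        0.8 * g[1] + 0.3 * g[2] ** 2,
        0.6 * g[2] + 0.9 * g[3],
        0.4 * g[0] + 0.7 * np.sqrt(g[3]),
    ])
    return out * np.exp(0.02 * rng.standard_normal(n_outputs))

# Randomise conductances (log-normal scale factors), run the "simulations",
# and collect log inputs and log outputs as matrices.
X = rng.normal(0.0, 0.2, size=(n_trials, n_params))        # log scale factors
Y = np.array([np.log(toy_model(np.exp(x))) for x in X])    # log outputs

Xc, Yc = X - X.mean(0), Y - Y.mean(0)
B, *_ = np.linalg.lstsq(Xc, Yc, rcond=None)   # sensitivity matrix: parameters -> outputs

# Because B is square and well conditioned here, it can be inverted: given a target
# combination of outputs, predict the parameter changes required to produce it.
B_inv = np.linalg.inv(B)
x_true = rng.normal(0.0, 0.2, size=n_params)
y_new = np.log(toy_model(np.exp(x_true))) - Y.mean(0)
x_pred = y_new @ B_inv + X.mean(0)

print("true log scale factors     :", np.round(x_true, 3))
print("predicted from outputs only:", np.round(x_pred, 3))
```

In the paper the same idea is applied to a biophysically detailed model, and a Bayesian analysis plays the role of checking how many outputs are needed before the parameters are well constrained.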

4.
Howard Wolinsky. EMBO reports (2013) 14(10): 871-873
Will the US Supreme Court''s ruling that genes can no longer be patented in the USA boost venture capital investment into biotech and medical startup companies?Three years ago, Noubar Afeyan, managing partner and CEO of Flagship Ventures, an early-stage venture capital firm in Cambridge, Massachusetts, USA, was working with a biotech start-up company developing techniques for BRCA gene testing for breast cancer risk that avoided the patents held by Myriad Genetics, a molecular diagnostics company in Salt Lake City (Utah, USA) and the only operator in the field. However, despite the promise of the start-up''s techniques, investors were put off by Myriad''s extensive patent portfolio and fiercely defensive tactics: “A lot of investors were simply not willing to take that chance, even though our technology was superior in many ways and patentably different,” Afeyan said. The effort to launch the start-up ultimately failed.…it is also not clear how the Supreme Court''s ruling will affect the […] industry at large, now that one of the most contested patents for a human gene has been ruled invalidAfeyan believes the prospects for such start-ups improved on the morning of 13 June 2013 when the US Supreme Court ruled in an unanimous vote that Myriad''s fundamental patents on the BRCA1 and BRCA2 genes themselves are invalid, opening up the field to new competitors. The court''s ruling, however, validated Myriad''s patents for BRCA cDNA and methods-of-use.The court''s decision comes at a time when venture capital investment into the life sciences is projected to decline in the years ahead. Some believe that the court''s decision sets a precedent and could provide a boost for products, diagnostics and other tests under development that would have been legally difficult in the light of existing patents on human and other DNA sequences.The US Patent Office issued the original patents for the BRCA 1 and BRCA2 genes in 1997 and 1998 for the US National Institute of Environmental Health Services, the University of Utah and Myriad Genetics. One year earlier, Myriad had launched its first diagnostic test for breast cancer risk based on the two genes and has since aggressively defended it against both private and public competitors in court. Many universities and hospitals were originally offering the test for a lower cost, but Myriad forced them to stop and eventually monopolized BRCA-based diagnostics for breast cancer risk in the USA and several other countries.“Myriad did not create anything,” Justice Clarence Thomas wrote in the Supreme Court''s decision. “To be sure, it found an important and useful gene, but separating that gene from its surrounding genetic material is not an act of invention.” Even so, the court did uphold Myriad''s patents on the methodology of its test. Ron Rogers, a spokesman for the biotech firm, said the Supreme Court had “affirmed the patent eligibility of synthetic DNA and underscored the importance and applicability of method-of-use patents for gene-based diagnostic tests. Before the Supreme Court case we had 24 patents and 520 claims. After the Supreme Court decision, we still have 24 patents. […] [T]he number of our patent claims was reduced to 515. In the Supreme Court case itself, only nine of our 520 patent claims were at issue. Of the nine, the Supreme Court ruled that five were not patent-eligible and they ruled that four were patent-eligible. 
We still have strong intellectual property protection surrounding our BRCA test and the Supreme Court''s decision doesn''t change that.”Within hours of the ruling, capitalism kicked into high gear. Two companies, Ambry Genetics in Alieso Viejo, California, and Gene by Gene Ltd in Houston, Texas, USA, announced that they were launching tests for the BRCA1 and BRCA2 genes for less than the US$3,100 Myriad has been charging privately insured patients and US$2,795 for patients covered by Medicare—the government health plan for the elderly and disabled. Several other companies and universities also announced they would be offering BRCA testing.Entrepreneur Bennett Greenspan, a managing partner of Gene by Gene, explained that his company had been poised to offer BRCA testing if the Supreme Court ruled against Myriad. He said, “We had written a press release with our PR firm a month before the release of the Supreme Court with the intention that if the Supreme Court overruled the patent or invalidated the patent that we would launch right away and if they didn''t, we would just tear up the press release.” His company had previously offered BRCA gene testing in Israel based on guidelines from the European Union.Myriad Genetics has not given up defending its patents, however. On 9 and 10 July 2013, it slapped Ambry and Gene by Gene with lawsuits in the US District Court in Salt Lake City for allegedly infringing on patents covering synthetic DNA and methods-of-use related to the BRCA1 and BRCA2 genes. Rogers commented that the testing processes used by the firms “infringes 10 patents covering synthetic primers, probes and arrays, as well as methods of testing, related to the BRCA1 and BRCA2 genes.”On 6 August 2013, Ambry countersued Myriad, arguing that the company “continues a practice of using overreaching practices to wrongfully monopolize the diagnostic testing of humans'' BRCA1 and BRCA2 genes in the United States and to attempt to injure any competitor […] Due to Myriad''s anticompetitive conduct, customers must pay significantly higher prices for Myriad''s products in the relevant market, often nearly twice as high as the price of Ambry''s products and those of other competitors” [1].Just as the courts will have to clarify whether the competitors in this case infringe on Myriad''s patents, it is also not clear how the Supreme Court''s ruling will affect the biotech and diagnostics industry at large, now that one of the most contested patents for a human gene has been ruled invalid. In recent years, venture capital investment into the life sciences has been in decline. The National Venture Capital Association and the Medical Innovation & Competitiveness Coalition reported from a survey that, “An estimated funding loss of half a billion dollars over the next three years will cost America jobs at a time when we desperately need employment growth” [2]. The survey of 156 venture capital firms found that 39% of respondents said they had reduced investment in the life sciences during the previous three years, and the same proportion intended to do so in the next three years. 
“[US Food and Drug Administration] FDA regulatory challenges were identified as having the highest impact on these investment decisions,” the report states, adding that many investors intended to shift their focus from the US towards Europe and the Asia/Pacific region.Another report from the same groups explains how public policy involving the FDA and other players in “the medical innovation ecosystem”—including the US patent system, public agencies, tax policy, securities regulation, immigration laws and private groups such as insurers—affect the decisions of investors to commit to funding medical innovation [3].Some investors think that the court decision about the patentability of human DNA will increase confidence and help to attract investors back to the life sciencesSome investors think that the court decision about the patentability of human DNA will increase confidence and help to attract investors back to the life sciences. “The clarity is helpful because for the longest time people didn''t do things because of ambiguity about whether those patents would be enforceable,” Afeyan said. “It''s one thing to not do something because of a patent, it''s another to not do something because you know that they have patents but you''re not sure what it''s going to stop you from doing because it hasn''t been really fully fleshed out. Now I think it is reasonably well fleshed out and I think you will see more innovation in the space.”Others also appreciate the clarification from the Supreme Court about what is a patentable invention in regard to human genes and DNA. “The Myriad decision was a very solid reading of the underlying purpose of our patent law, which is to reward novel invention,” commented Patrick Chung, a partner with New Enterprise Associates, a venture capital firm in Menlo Park, California, which invested in 23andMe, a personal genomics company based in Mountain View (California, USA), and who serves on the 23andMe board.But not everyone agrees that the Supreme Court''s decision has provided clarity. “You could spin it and say that it was beneficial to create some certainty, but at the end of the day, what the Court did was reduce the scope of what you''re allowed to get patent claims on,” said Michael Schuster, a patent lawyer and Intellectual Property Partner and Co-Chair of the Life Sciences Group at Fenwick & West LLP in San Francisco, California, USA. “It''s going to be a continuing dance between companies, smart patent lawyers, and the courts to try to minimize the impact of this decision.”Kevin Noonan, a molecular biologist and patent lawyer with McDonnell Boehnen Hulbert & Berghoff LLP in Chicago, Illinois, USA, commented that he does not expect the Supreme Court decision will have much of an impact on venture investments or anything else. “This case comes at a time fortunately when biotechnology is mature enough so that the more pernicious effects of the decision are not going to be quite as harmful as they would if this had happened ten, 15 or 20 years ago,” he said. “We''re now in the ‘post-genomic'' era; since the late ‘90s and turn of the century, the genomic and genetic data from the Human Genome Project have been on publicly available databases. As a consequence, if a company didn''t apply for a patent before the gene was disclosed publicly, it certainly is not able to apply for a patent now. The days of obtaining these sequences and trying to patent them are behind us.”Noonan also noted that the Myriad Genetics patents were due to expire in 2014–2015 anyway. 
“Patents are meaningless if you can''t enforce them. And when they expire, you can no longer enforce them. So it really isn''t an impediment to genetic testing now,” he explained. “What the case illustrates is a disconnect between scientists and lawyers. That''s an old battle.”George Church, professor of genetics at Harvard Medical School (Boston, Massachusetts, USA) and Director of the Personal Genome Project, maintains that the Supreme Court decision will have minimal influence on the involvement of venture capitalists in biotech. “I think it''s a non-issue. It''s basically addressing something that was already dead. That particular method of patenting or trying to patent components of nature without modification was never really a viable strategy and in a particular case of genes, most of the patents in the realm of bio-technology have added value to genes and that''s what they depend on to protect their patent portfolio—not the concept of the gene itself,” he said. “I don''t know of any investor who is freaked out by this at all. Presumably there are some, because the stock oscillates. But you can get stock to oscillate with all kinds of nonsense. But I think the sober, long-term investors who create companies that keep innovating are not impacted.”Church suggests that the biggest concern for Myriad now is whole-gene sequencing, rather than the Supreme Court''s decision. “Myriad should be worrying about the new technology, and I''m sure they''ve already considered this. The new technology allows you to sequence hundreds of genes or the whole genome for basically the price they''ve been charging all along for two genes. And from what I understand, they are expanding their collection to many genes, taking advantage of next generation sequencing as other companies have already,” he said.Whatever its consequences in the US, the Supreme Court''s decision will have little impact on other parts of the world, notably Europe, where Myriad also holds patents on the BRCA genes in several countries. Gert Matthijs, Head of the Laboratory for Molecular Diagnostics at the Centre for Human Genetics in Leuven, Belgium, says that even though the US Supreme Court has invalidated the principle of patenting genes in America, the concept remains in Europe and is supported by the European Parliament and the European Patent Convention. “Legally, nothing has changed in Europe,” he commented. “But there is some authority from the US Supreme Court even if it''s not legal authority in Europe. Much of what has been used as arguments in the Supreme Court discussions has been written down by the genetics community in Europe back in 2008 in the recommendations on behalf of the European Society for Human Genetics. The Supreme Court decision is something that most of us in Europe would agree upon only because people have been pushing towards protecting the biotech industry that the pendulum was so way out in Europe.”Benjamin Jackson, Senior Director of legal affairs at Myriad Genetics, commented that Myriad holds several patents in Europe that are not likely to be affected by the Supreme Court''s ruling. “The patent situation both generally and for Myriad is a lot clearer in Europe. The European Union Biotech Directive very clearly says that isolated DNA is patentable even if it shares the same sequence as natural DNA,” he said. 
“Right now, it's pretty uncontroversial, or at least it's well settled law basically in Europe that isolated DNA is patentable.” However, while the Directive states that “biological material which is isolated from its natural environment or produced by means of a technical process” might be patentable “even if it previously occurred in nature”, the European Patent Office (EPO) in Munich, Germany, requires that the subject matter involves an inventive step, not just an obvious development of existing technology, and that the industrial application and usefulness be disclosed in the application. Myriad has opened a headquarters in Zurich and a lab in Munich during the past year, hoping to make inroads in Europe. In some EU countries, Myriad offers its BRCA test as part of cancer diagnosis. In other countries, BRCA testing is conducted at a fraction of what Myriad charges in the USA, either because institutions ignore the patents that are not enforced in their jurisdictions, or because these countries, such as Belgium, were not included in the patent granted by the European Patent Office. Moreover, in various countries BRCA testing is only available through the healthcare system and only as part of a more extensive diagnosis of cancer risk. In addition, as Matthijs commented, “[t]he healthcare system in Europe is very heterogeneous and that's also of course a big impediment for a big laboratory to try and conquer Europe because you have to go through different reimbursement policies in different countries and that's not easy.” Ultimately, it seems the Supreme Court's decision might turn out to have little impact on biotech firms in either the USA or Europe. Technological advances, in particular new sequencing technologies, might render the issue of patenting individual genes increasingly irrelevant.

5.
Despite the scientific community''s overwhelming support for the European Research Council, many grant recipients are irked about red tapeThere is one thing that most European researchers agree on: B stands for Brussels and bureaucracy. Research funding from the European Commission (EC), which distributes EU money, is accompanied by strict accountability and auditing rules in order to ensure that European taxpayers'' money is not wasted. All disbursements are treated the same, whether subsidies to farmers or grants to university researchers. However, the creation of the European Research Council (ERC) in 2007 as a new EU funding agency for basic research created high hopes among scientists for a reduced bureaucratic burden.… many researchers who have received ERC funding have been angered with accounting rules inherited from the EC''s Framework Programmes…ERC has, indeed, been a breath of fresh air to European-level research funding as it distributes substantial grants based only on the excellence of the proposal and has been overwhelmingly supported by the scientific community. Nevertheless, many researchers who have received ERC funding have been angered with accounting rules inherited from the EC''s Framework Programmes, and which seem impossible to change. In particular, a requirement to fill out time sheets to demonstrate that scientists spend an appropriate amount of time working on the project for which they received their ERC grant has triggered protests over the paperwork (Jacobs, 2009).Luis Serrano, Coordinator of the Systems Biology Programme at the Centre for Genomic Regulation in Barcelona, Spain, and recipient of a €2 million ERC Advanced Investigator Grant for five years, said the requirement of keeping time sheets is at best a waste of time and worst an insult to the high-level researchers. “Time sheets do not make much sense, to be honest. If you want to cheat, you can always cheat,” he said. He said other grants he receives from the Spanish government and the Human Frontier Science Programme do not require time sheets.Complaints by academic researchers about the creeping bureaucratization of research are not confined to the old continent (see Opinion by Paul van Helden, page 648). As most research, as well as universities and research institutes, is now funded by public agencies using taxpayers'' money, governments and regulators feel to be under pressure to make sure that the funds are not wasted or misappropriated. Yet, the USA and the EU have taken different approaches to making sure that scientists use public money correctly. In the USA, misappropriation of public money is considered a criminal offence that can be penalized by a ban on receiving public funds, fines and even jail time; in fact, a few scientists in the USA have gone to prison.By contrast, the EU puts the onus on controlling how public money is spent upfront. Research funding under the EU''s Framework Programmes requires clearly spelt out deliverables and milestones, and requires researchers to adhere to strict accountability and auditing rules. Not surprisingly, this comes with an administrative burden that has raised the ire of many scientists who feel that their time is better spent doing research. Serrano said in a major research centre such as the CRG, the administration could minimize the paper burden. “My administration prepares them for me and I go one, two, three, four, five and I do all of them. You can even have a machine sign for you,” he commented. 
“But I can imagine researchers who don''t have the administrative help, this can take up a significant amount of time.” For ERC grants, which by definition are for ‘blue-skies'' research and thus do not have milestones or deliverables, such paperwork is clearly not needed.Complaints by academic researchers about the creeping bureaucratization of research are not confined to the old continentNot everyone is as critical as Serrano though. Vincent Savolainen at the Division of Biology at Imperial College London, UK, and recipient of a €2.5 million, five-year ERC Advanced Investigator Grant, said, “Everything from the European Commission always comes with time sheets, and ERC is part of the European Commission.” Still, he felt it was very confusing to track time spent on individual grants for Principal Investigators such as him. “It is a little bit ridiculous but I guess there are places where people may abuse the system. So I can also see the side of the European Commission,” he said. “It''s not too bad. I can live with doing time sheets every month,” he added. “Still, it would be better if they got rid of it.”Juleen Zierath, an integrative physiologist in the Department of Molecular Medicine at Karolinska Institutet (Stockholm, Sweden), who received a €2.5 million, five-year ERC grant, takes the time sheets in her stride. “If I worked in a company, I would have to fill out a time sheet,” she said. “I''m delighted to have the funding. It''s a real merit. It''s a real honour. It really helps my work. If I have to fill out a time sheet for the privilege of having that amount of funding for five years, it''s not a big issue.”Zierath, a native of Milwaukee (WI, USA) who came to Karolinska for graduate work in 1989, said the ERC''s requirements are certainly “bureaucracy light” compared with the accounting and reporting requirements for more traditional EU funding instruments, such as the ‘Integrated Projects''. “ERC allows you to focus more on the science,” she said. “I don''t take time sheets as a signal that the European Union doesn''t count on us to be doing our work on the project. They have to be able to account for where they''re spending the money somehow and I think it''s okay. I can understand where some people would be really upset about that.”…governments and regulators feel to be under pressure to make sure that the funds are not wasted or misappropriated…The complaints about time sheets and other bureaucratic red tape have caught the attention of high-level scientists and research managers throughout Europe. In March 2009, the EC appointed an outside panel, headed by Vaira Vike-Freiberga, former President of Latvia, to review the ERC''s structures and mechanisms. The panel reported in July last year that the objective of building a world-class institution is not properly served by “undue cumbersome regulations, checks and controls.” Although fraud and mismanagement should be prevented, excessively bureaucratic procedures detract from the mission, and might be counter-productive.Helga Nowotny, President of the ERC, said the agency has to operate within the rules of the EC''s Framework Programme 7, which includes the ERC. She explained that if researchers hold several grants, the EC wants recipients to account for their time. “The Commission and the Rules of Participation of course argue that many of these researchers have more than one grant or they may have other contracts. In order to be accountable, the researchers must tell us how much time they spend on the project. 
But instead of simply asking if they spent a percentage of time on it, the Commission auditors insist on time sheets. I realize that filling them out has a high symbolic value for a researcher. So, why not leave it to the administration of the host institution?”Particle physicist Ian Halliday, President of the European Science Foundation and a major supporter of the ERC, said that financial irregularities that affected the EU over many years prompted the Commission to tighten its monitoring of cash outlays. “There have been endless scandals over the agricultural subsidies. Wine leaks. Nonexistent olive trees. You name it,” he said. “The Commission''s financial system is designed to cope with that kind of pressure as opposed to trusting the University of Cambridge, for example, which has been there for 800 years or so and has a well-earned reputation by now. That kind of system is applied in every corner of the European Commission. And that is basically what is causing the trouble. But these rules are not appropriate for research.”…financial irregularities that affected the EU over many years prompted the Commission to tighten its monitoring of cash outlaysNowotny is sympathetic and sensitive to the researchers'' complaints, saying that requiring time sheets for researchers sends a message of distrust. “It feels like you''re not trusted. It has this sort of pedantic touch to it,” she said. “If you''ve been recognized for doing this kind of top research, researchers feel, ‘Why bother [with time sheets]?''” But the bureaucratic alternative would not work for the ERC either. This would mean spelling out ‘deliverables'' in advance, which is clearly not possible with frontier research.Moreover, as Halliday pointed out, there is inevitably an element of fiction with time sheets in a research environment. In his area of research, for example, he considers it reasonable to track the hours of a technician fabricating parts of a telescope. But he noted that there is a different dynamic for researchers: “Scientists end up doing their science sitting in their bath at midnight. And you mull over problems and so forth. How do you put that on a time sheet?” Halliday added that one of the original arguments in establishing the ERC was to put it at an arm''s length from the Commission and in particular from financial regulations. But to require scientists to specify what proportion of their neurons are dedicated to a particular project at any hour of the day or night is nonsensical. Nowotny agreed. “The time sheet says I''ve been working on this from 11 in the morning until 6 in the evening or until midnight or whatever. This is not the way frontier research works,” she said.Halliday, who served for seven years as chief executive of the Particle Physics and Astronomy Research Council (Swindon, UK), commented that all governments require accountability. In Great Britain, for instance, much more general accountability rules are applied to grantees, thereby offering a measure of trust. “We were given a lot of latitude. Don''t get me wrong that we allowed fraud, but the system was fit for the purpose of science. If a professor says he''s spending half his time on a certain bit of medical research, let''s say, the government will expect half his salary to show up in the grants he gets from the funding agencies. We believe that if the University of Cambridge says that this guy is spending half his time on this research, then that''s probably right and nobody would get excited if it was 55% or 45%. 
People would get excited if it was 5%. There are checks and balances at that kind of level, but it''s not at a level of time sheets. It will be checked whether the project has done roughly what it said.”Other funding agencies also take a less bureaucratic approach. Candace Hassall, head of Basic Careers at the Wellcome Trust (London, UK), which funds research to improve human and animal health, said Wellcome''s translation awards have milestones that researchers are expected to meet. But “time sheets are something that the Wellcome Trust hasn''t considered at all. I would be astonished if we would ever consider them. We like to work closely with our researchers, but we don''t require that level of reporting detail,” she said. “We think that such detailed, day-by-day monitoring is actually potentially counterproductive overall. It drives people to be afraid to take risks when risks should be taken.”…to require scientists to specify what proportion of their neurons are dedicated to a particular project at any hour of the day or night is nonsensicalOn the other side of the Atlantic, Jack Dixon, vice president and chief scientific officer at the Howard Hughes Medical Institution (Chevy Chase, MD, USA), who directs Hughes'' investigator programme, said he''d never heard of researchers being asked to keep time sheets: “Researchers filling out time sheets is just something that''s never crossed our minds at the Hughes. I find it sort of goofy if you want to know the truth.”In fact, a system based on trust still works better in the academic worldInstead, Hughes trusts researchers to spend the money according to their needs. “We trust them,” Dixon said. “What we ask each of our scientists to do is devote 75% of their time to research and then we give them 25% of their time which they can use to teach, serve on committees. They can do consulting. They can do a variety of things. Researchers are free to explore.”There is already growing support for eliminating the time sheets and other bureaucratic requirements that come with an ERC grant, and which are obviously just a hangover from the old system. Indeed, there have been complaints, such as reviewers of grant applications having to fax in copies of their passports or identity cards, before being allowed sight of the proposals, said Nowotny. The review panel called on the EC to adapt its rules “based on trust and not suspicion and mistrust” so that the ERC can attain the “full realization of the dream shared by so many Europeans in the academic and policy world as well as in political milieus.”In fact, a system based on trust still works better in the academic world. Hassall commented that lump-sum payments encourage the necessary trust and give researchers a sense of freedom, which is already the principle behind ERC funding. “We think that you have to trust the researcher. Their careers are on the line,” she said. Nowotny hopes ERC will be allowed to take a similar approach to that of the Wellcome Trust, with its grants treated more like “a kind of prize money” than as a contract for services.She sees an opportunity to relax the bureaucratic burden with a scheduled revision of the Rules of Participation but issues a word of caution given that, when it comes to EU money, other players are involved. “We don''t know whether we will succeed in this because it''s up to the finance ministers, not even the research ministers,” she explained. “It''s the finance ministers who decide the rules of participation. 
If finance ministers agree then the time sheets would be gone.”

6.
Thornley JH. Annals of Botany (2011) 108(7): 1365-1380

Background and Aims

Plant growth and respiration still present unresolved issues, which are examined here using a model. The aims of this work are: to compare the model's predictions with McCree's observation-based respiration equation, which led to the ‘growth respiration/maintenance respiration paradigm’ (GMRP), a comparison required to give the model credibility; to clarify the nature of maintenance respiration (MR) using a model that does not represent MR explicitly; and to examine algebraic and numerical predictions for the respiration:photosynthesis ratio.

Methods

A two-state-variable growth model is constructed, with substrate and structure as the state variables, applicable at scales from single plant to ecosystem. Four processes are represented: photosynthesis; growth with associated growth respiration (GR); senescence, giving a flux towards litter; and recycling of part of this flux back to the substrate pool. There are four significant parameters: the growth efficiency, rate constants for substrate utilization and structure senescence, and the fraction of senesced structure returned to the substrate pool.

Key Results

The model can simulate McCree's data on respiration, providing an alternative interpretation to the GMRP. The model's parameters are related to the parameters used in this paradigm. MR is defined and calculated in terms of the model's parameters in two ways: first during exponential growth at zero growth rate, and secondly at equilibrium. The two approaches concur. The equilibrium respiration:photosynthesis ratio has the value of 0.4, depending only on growth efficiency and recycling fraction.

Conclusions

McCree's equation is an approximation that the model can describe; it is mistaken to interpret his second coefficient as a maintenance requirement. An MR rate is defined and extracted algebraically from the model. MR as a specific process is not required and may be replaced with an approach from which an MR rate emerges. The model suggests that the respiration:photosynthesis ratio is conservative because it depends on only two parameters, whose values are likely to be similar across ecosystems.
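The abstract gives the model only in outline, so the following is a minimal numerical sketch of a two-pool (substrate and structure) model of the kind described, not Thornley's exact equations. Under the assumed formulation, substrate supplied by photosynthesis is used at rate k, a fraction Y of the substrate used becomes structure (the remainder being respired), structure senesces at rate m, and a fraction f of the senescence flux is recycled to the substrate pool; no maintenance respiration term appears explicitly. At equilibrium this sketch gives a respiration:photosynthesis ratio of (1 - Y)/(1 - fY), which depends only on the growth efficiency and the recycling fraction, consistent with the conclusion above; all parameter values are illustrative.

```python
# Minimal two-pool sketch (substrate S, structure W); not Thornley's exact formulation.
# Assumed parameters (illustrative): growth efficiency Y, substrate-utilisation rate k,
# senescence rate m, recycled fraction f, constant gross photosynthesis P.
Y, k, m, f, P = 0.75, 0.2, 0.05, 0.5, 1.0

S, W = 1.0, 1.0   # initial pool sizes (arbitrary)
dt = 0.01
for _ in range(200_000):             # integrate long enough to reach equilibrium
    use = k * S                      # substrate utilisation
    growth = Y * use                 # fraction Y becomes structure
    respiration = (1.0 - Y) * use    # the remainder is respired (growth respiration only)
    senescence = m * W               # structure lost towards litter
    S += dt * (P - use + f * senescence)
    W += dt * (growth - senescence)

ratio_numeric = (1.0 - Y) * k * S / P
ratio_algebra = (1.0 - Y) / (1.0 - f * Y)   # equilibrium value implied by this sketch
print(f"R:P at equilibrium: numeric {ratio_numeric:.3f}, algebraic {ratio_algebra:.3f}")
```

With Y = 0.75 and f = 0.5 the ratio comes out at 0.4, matching the figure quoted above, and an apparent maintenance component emerges from recycling even though no explicit maintenance respiration term is present; whether Thornley's exact algebra reduces to this particular expression would need to be checked against the paper.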

7.
Wolinsky H. EMBO reports (2011) 12(8): 772-774
With large charities such as the Wellcome Trust or the Gates Foundation committed to funding research, is there a risk that politicians could cut public funding for science?Towards the end of 2010, with the British economy reeling from the combined effects of the global recession, the burst bubble of property speculation and a banking crisis, the country came close to cutting its national science and research budget by up to 25%. UK Business Secretary Vince Cable argued, “there is no justification for taxpayers'' money being used to support research which is neither commercially useful nor theoretically outstanding” (BBC, 2010). The outcry from UK scientists was both passionate and reasoned until, in the end, the British budget slashers blinked and the UK government backed down. The Chancellor of the Exchequer, George Osborne, announced in October that the government would freeze science and research funding at £4.6 billion per annum for four years, although even this represents about a 10% cut in real terms, because of inflation.“there is no justification for taxpayers'' money being used to support research which is neither commercially useful nor theoretically outstanding”There has been a collective sigh of relief. Sir John Savill, Chief Executive of the Medical Research Council (UK), said: “The worst projections for cuts to the science budget have not been realised. It''s clear that the government has listened to and acted on the evidence showing investment in science is vital to securing a healthy, sustainable and prosperous future.”Yet Britain is unusual compared with its counterparts elsewhere in the European Union (EU) and the USA, because private charities, such as the Wellcome Trust (London, UK) and Cancer Research UK (London, UK), already have budgets that rival those of their government counterparts. It was this fact, coupled with UK Prime Minister David Cameron''s idea of the ‘big society''—a vision of smaller government, increased government–private partnerships and a bigger role for non-profit organizations, such as single-disease-focused charities—that led the British government to contemplate reducing its contribution to research, relying on the private sector to pick up the slack.Jonathan Grant, president of RAND Europe (London, UK)—a not-for-profit research institute that advises on policy and decision-making—commented: “There was a strong backlash and [the UK Government] pulled back from that position [to cut funding]. But that''s the first time I''ve really ever seen it floated as a political idea; that government doesn''t need to fund cancer research because we''ve got all these not-for-profits funding it.”“…that''s the first time I''ve really ever seen it floated as a political idea; that government doesn''t need to fund cancer research because we''ve got all these not-for-profits funding it”But the UK was not alone in mooting the idea that research budgets might have to suffer under the financial crisis. Some had worried that declining government funding of research would spread across the developed world, although the worst of these fears have not been realized.Peter Gruss, President of the Max Planck Society (Munich, Germany), explained that his organization receives 85% of its more-than €1.5 billion budget from the public purses of the German federal government, German state ministries and the EU, and that not all governments have backed away from their commitment to research. 
In fact, during the crisis, the German and US governments boosted their funding of research with the goal of helping the economic recovery. In 2009, German Chancellor Angela Merkel''s government, through negotiation with the German state science ministries, approved a windfall of €18 billion in new science funding, to be spread over the next decade. Similarly, US President Barack Obama''s administration boosted spending on research with a temporary stimulus package for science, through the American Recovery and Reinvestment Act.Even so, Harry Greenberg, Senior Associate Dean for Research at Stanford University (California, USA) pointed out that until the US government injected stimulus funding, the budget at the National Institutes of Health (NIH; Bethesda, Maryland, USA) had essentially “been flat as a pancake for five or six years, and that means that it''s actually gone down and it''s having an effect on people being able to sustain their research mission.”Similarly, Gruss said that the research community should remain vigilant. “I think one could phrase it as there is a danger. If you look at Great Britain, there is the Wellcome Trust, a very strong funding organization for life sciences and medical-oriented, health-oriented research. I think it''s in the back of the minds of the politicians that there is a gigantic foundation that supports that [kind of research]. I don''t think one can deny that. There is an atmosphere that people like the Gates family [Bill and Melinda Gates Foundation] invests in health-related issues, particularly in the poorer countries [and that] maybe that is something that suffices.”The money available for research from private foundations and charities is growing in both size and scope. According to Iain Mattaj, Director General of the European Molecular Biology Laboratory (EMBL; Heidelberg, Germany), this growth might not be a bad thing. As he pointed out, private funding often complements government funding, with charities such as the Wellcome Trust going out of their way to leverage government spending without reducing government contributions. “My feeling is that the reason that the UK government is freezing research funding has all to do with economics and nothing to do with the fact that there are potentially private funders,” he said. “Several very large charities in particular are putting a lot of money into health research. The Gates Foundation is the biggest that has just come on the scene, but the Howard Hughes Medical Institute [HHMI; Chevy Chase, Maryland, USA] and the Wellcome Trust are very big, essentially private charities which have their own agendas.”…charities such as the Wellcome Trust [go] out of their way to leverage government spending without reducing government contributionscontributionsOpen in a separate window© CorbisBut, as he explained, these charities actually contribute to the overall health research budget, rather than substituting funds from one area to another. In fact, they often team up to tackle difficult research questions in partnership with each other and with government. Two-thirds of the €140 million annual budget of EMBL comes from the European states that agree to fund it, with additional contributions from private sources such as the Wellcome Trust and public sources such as the NIH.Yet over the years, as priorities have changed, the focus of those partnerships and the willingness to spend money on certain research themes or approaches has shifted, both within governments and in the private sector. 
Belief in the success of US President Richard Nixon''s famous ‘war on cancer'', for example, has waned over the years, although the fight and the funding continues. “I don''t want to use the word political, because of course the decisions are sometimes political, but actually it was a social priority to fight cancer. It was a social priority to fight AIDS,” Mattaj commented. “For the Wellcome Trust and the Gates Foundation, which are fighting tropical diseases, they see that as a social necessity, rather than a personal interest if you like.”Nevertheless, Mattaj is not surprised that there is an inclination to reduce research spending in the UK and many smaller countries battered by the economic downturn. “Most countries have to reduce public spending, and research is public spending. It may be less badly hit than other aspects of public spending. [As such] it''s much better off than many other aspects of public spending.”A shift away from government funding to private funding, especially from disease-focused charities, worries some that less funding will be available for basic, curiosity-driven research—a move from pure research to ‘cure'' research. Moreover, charities are often just as vulnerable to economic downturns, so relying on them is not a guarantee of funding in harsh economic times. Indeed, greater reliance on private funding would be a return to the era of ‘gentlemen scientists'' and their benefactors (Sidebar A).

Sidebar A | Gentlemen scientists

Greater reliance on private funding would return science to a bygone age of gentlemen scientists relying on the largesse of their wealthy sponsors. In 1831, for example, naturalist Charles Darwin''s (1809–1882) passage on the HMS Beagle was paid for by his father, albeit reluctantly. According to Laura Snyder, an expert on Victorian science and culture at St John''s University (New York, USA), by the time Darwin returned to England in 1836, the funding game had changed and government and private scientific societies had begun to have a bigger role. When Sir John Frederick William Herschel (1791–1871), an English mathematician, astronomer, chemist, experimental photographer and inventor, journeyed to Cape Colony in 1833, the British government offered to give him a free ride aboard an Admiralty ship. “Herschel turned them down because he wanted to be free to do whatever he wanted once he got to South Africa, and he didn''t want to feel beholden to government to do what they wanted him to do,” Snyder explained, drawing from her new book The Philosophical Breakfast Club, which covers the creation of the modern concept of science.Charles Babbage (1791–1871), the mathematician, philosopher, inventor and mechanical engineer who originated the concept of a programmable computer, was a member of the same circle as Herschel and William Whewell (1794–1866), a polymath, geologist, astronomer and theologian, who coined the word ''scientist''. Although he was wealthy, having inherited £100,000 in 1827—valued at about £13.3 million in 2008—Babbage felt that government should help pay for his research that served the public interest.“Babbage was asking the government constantly for money to build his difference engine,” Snyder said. Babbage griped about feeling like a tradesman begging to be paid. “It annoyed him. He felt that the government should just have said, ''We will support the engine, whatever it is that you need, just tell us and we''ll write you a check''. But that''s not what the government was about to do.”Instead, the British government expected Babbage to report on his progress before it loosened its purse strings. Snyder explained, “What the government was doing was a little bit more like grants today, in the sense that you have to justify getting more money and you have to account for spending the money. Babbage just wanted an open pocketbook at his disposal.”In the end the government donated £17,000, and Babbage never completed the machine.Janet Rowley, a geneticist at the University of Chicago, is worried that the change in funding will make it more difficult to obtain money for the kind of research that led to her discovery in the 1970s of the first chromosomal translocations that cause cancer. She calls such work ‘fishing expeditions''. She said that the Leukemia & Lymphoma Society (White Plains, New York, USA), for example—a non-profit funder of research—has modified its emphasis: “They have now said that they are going to put most of their resources into translational work and trying to take ideas that are close to clinical application, but need what are called incubator funds to ramp up from a laboratory to small-scale industrial production to increase the amount of compound or whatever is required to do studies on more patients.”This echoes Vince Cable''s view that taxpayers should not have to spend money on research that is not of direct economic, technological or health benefit to them. 
But if neither charities nor governments are willing to fund basic research, then who will pay the bill?…if neither charities nor governments are willing to fund basic research, then who will pay the bill?Iain Mattaj believes that the line between pure research and cure research is actually too blurred to make these kinds of funding distinctions. “In my view, it''s very much a continuum. I think many people who do basic research are actually very interested in the applications of their research. That''s just not their expertise,” he said. “I think many people who are at the basic end of research are more than happy to see things that they find out contributing towards things that are useful for society.”Jack Dixon, Vice President and Chief Scientific Officer at HHMI, also thinks that the line is blurry: “This divide between basic research and translational research is somewhat arbitrary, somewhat artificial in nature. I think every scientist I know who makes important, basic discoveries likes to [...] see their efforts translate into things that help humankind. Our focus at the Hughes has always been on basic things, but we love to see them translated into interesting products.” Even so, HHMI spends less than US $1 billion annually on research, which is overshadowed by the $30 billion spent by the NIH and the relatively huge budgets of the Wellcome Trust and Cancer Research UK. “We''re a small player in terms of the total research funding in the US, so I just don''t see the NIH pulling back on supporting research,” Dixon said.By way of example, Brian Druker, Professor of Medicine at the Oregon Health & Science University (Portland, Oregon, USA) and a HHMI scientist, picked up on Rowley''s work with cancer-causing chromosomal translocations and developed the blockbuster anti-cancer drug, imatinib, marketed by Novartis. “Brian Druker is one of our poster boys in terms of the work he''s done and how that is translated into helping people live longer lives that have this disease,” Dixon commented.There is a similar view at Stanford. The distinction between basic and applied is “in the eye of the beholder,” Greenberg said. “Basic discovery is the grist for the mill that leads to translational research and new breakthroughs. It''s always been a little difficult to convey, but at least here at Stanford, that''s number one. Number two, many of our very basic researchers enjoy thinking about the translational or clinical implications of their basic findings and some of them want to be part of doing it. They want some benefit for mankind other than pure knowledge.”“Basic discovery is the grist for the mill that leads to translational research and new breakthroughs”If it had not backed down from the massive cuts to the research budget that were proposed, the intention of the UK Government to cut funding for basic, rather than applied, research might have proven difficult to implement. Identifying which research will be of no value to society is like trying to decide which child will grow up to be Prime Minister. Nevertheless, most would agree that governments have a duty to get value-for-money for the taxpayer, but defining the value of research in purely economic or translational terms is both short-sighted and near impossible. Even so, science is feeling the economic downturn and budgets are tighter than they have been for a long time. 
As Greenberg concluded, “It's human nature when everybody is feeling the pinch that you think [yours] is bigger than the next guy's, but I would be hard pressed to say who is getting pinched, at least in the biomedical agenda, more than who else.”

8.
Wolinsky H. EMBO reports (2011) 12(12): 1226-1229
Looking back on the International Year of Biodiversity, some conservationists hope that it has raised awareness, if nothing else. Even so, many scientists remain pessimistic about our efforts to halt biodiversity decline.
The United Nations' (UN) International Year of Biodiversity in 2010 was supposed to see the adoption of measures that would slow global environmental decline and the continuing loss of endangered species and habitats. Even before, in 2002, most UN members had committed to halting the decline in biodiversity, which is a measure of the health of ecosystems. But the results of these international efforts have been funereal. Moreover, the current global economic crisis, coupled with growing anti-science attitudes in the USA, is adding to the concern of scientists about whether there is the political will to address the loss of biodiversity and whether habitat loss and extinction rates are reaching a point of no return.
Ahmed Djoghlaf, Executive Secretary of the Convention on Biological Diversity under the UN Environment Programme based in Montreal, Canada, said that of the 175 national reports submitted as part of the International Year of Biodiversity to his agency last year, none reported any progress. “There is not a single report received last year that claimed to have stopped or reduced the loss of biodiversity,” he said. “These reports confirm that the rate of loss of biodiversity today is unprecedented and the rate is 1,000 times higher than the rate of natural extinction of species, and [his agency's Global Biodiversity Outlook 2010; UN, 2010a] predicts that if business is allowed to continue then major ecosystems, the ocean, the fish, the forests, will reach the tipping point, meaning that there will be irreversible and irreparable damage done to the ecosystems.”
The UN campaign traces its roots to the European Union (EU) commitment in 2001 to halt the loss of biodiversity by 2010. The 2010 goal was incorporated into the UN Millennium Development Goals because of the severe impact of biodiversity loss on human well-being. However, the EU last year conceded in a report that it missed its 2010 target, too. The EU's Biodiversity Action Plan, launched in 2006, shows that Europe's biodiversity “remains under severe threat from the excessive demands we are making on our environment, such as changes in land use, pollution, invasive species and climate change.” Yet, EU Environment Commissioner Janez Potočnik has seen some positive signs: “We have learned some very important lessons and managed to raise biodiversity to the top of the political agenda. But we need everyone on board and not just in Europe. The threat around the world is even greater than in the EU,” he wrote last year (EC, 2010).
Despite the initiative's poor report card, Djoghlaf was upbeat about the International Year of Biodiversity. “It was a success because it was celebrated everywhere,” he said. “In Switzerland, they conducted a survey before and after the International Year of Biodiversity and they concluded that at the end of the year, 67% of all the Swiss people are now aware of biodiversity. When the year started it was 40%. People are more and more aware. In addition, biodiversity has entered the top of the political agenda.”
In October 2010, delegates from 193 countries attended the UN Convention on Biodiversity in Nagoya, Japan, and adopted 20 strategic goals to be achieved by 2020 (UN, 2010b). The so-called Aichi Biodiversity Targets include increased public awareness of the values of biodiversity and the steps that individuals can take to conserve and act sustainably; the halving or halting of the rate of loss of all natural habitats, including forests; and the conservation of 17% of terrestrial and inland water, and 10% of coastal and marine areas through effective and equitable management, resulting in ecologically representative and well-connected systems. By contrast, 13% of land areas and 1% of marine areas were protected in 2010.
However, the Convention on Biological Diversity is not enforceable. Anne Larigauderie, Executive Director of DIVERSITAS (Paris, France), which promotes research on biodiversity science, said that it is up to the individual countries to adopt enforceable legislation. “In principle, countries have committed. Now it depends on what individual countries are going to do with the agreement,” she said. “I would say that things are generally going in the right direction and it's too early to tell whether or not it's going to have an impact in terms of responding and in terms of the biodiversity itself.”
Researchers, however, have been disappointed by the International Year of Biodiversity. Conservation biologist Stuart Butchart, of BirdLife International in Cambridge, UK—a partnership of non-governmental environmental organizations—and colleagues from other environmental groups compiled a list of 31 indicators to measure progress towards the 2010 goal of the International Year of Biodiversity. He and his collaborators reported in Science (Butchart et al, 2010) that these indicators, including species population trends, extinction risks and habitat conditions, showed declines with no significant rate reductions. At the same time, indicators of pressure on biodiversity, such as resource consumption, invasive alien species, nitrogen pollution, over-exploitation and climate change impacts showed increases. “Despite some local successes and increasing responses (including extent and biodiversity coverage of protected areas, sustainable forest management, policy responses to invasive alien species and biodiversity-related aid), the rate of biodiversity loss does not appear to be slowing,” the researchers wrote.
Butchart pointed out that even if the International Year of Biodiversity had an impact on raising awareness and reducing biodiversity loss, detecting the change would take time. He said that the International Year of Biodiversity fell short of increasing awareness in parts of government not dealing with the environment, including ministries of transport, tourism, treasury and finance. It also seems probable that the campaign had little impact on the business sector, which affects development projects with a direct impact on biodiversity. “People can't even seem to get together on global climate change, which is a whole lot more obvious and right there,” Peter Raven, president emeritus of the Missouri Botanical Garden in St Louis, USA, explained. “Biodiversity always seems to be a sort of mysterious background thing that isn't quite there.”
Ilkka Hanski, a professor in the Department of Ecology and Evolutionary Biology at the University of Helsinki in Finland, said that studies such as Butchart's “indicate that nothing really happened in 2010. Biodiversity decline continued and has been declining over the past 10 years.”
Other researchers are more positive, although with reservations. Conservation biologist Thomas Eugene Lovejoy III, Heinz Center Biodiversity Chair and former president of the Center in Washington, DC, USA—a non-partisan, non-profit organization dedicated to advancing sound environmental policy—said that economic trends affect biodiversity and that biodiversity efforts might actually be benefiting from the current global economic crisis. For example, the decline in the housing markets in the USA and Europe has reduced the demand for lumber for new construction and has led to a reduction in deforestation. “Generally speaking, when there is an economic downturn, some of the things that are pressuring biodiversity actually abate somewhat. That's the good news. The bad news is that the ability to marshal resources to do some things proactively gets harder,” he said.
Chris Thomas, a conservation biologist at the University of York in the UK, who studies ecosystems and species in the context of climate change, said that economic depressions do slow the rate of damage to the environment. “But it also takes eyes off the ball of environmental issues. It's not clear whether these downturns, when you look over a period of a decade, make much difference or not.” Hanski agreed: “[B]ecause there is less economic activity, there may be less use of resources and such. But I don't think this is a way to solve our problems. It won't lead to any stable situation. It just leads to a situation where economic policies become more and more dependent on measures that try actually just to increase the growth as soon as possible.”
Raven said that in bad times, major interests such as those involved in raising cattle, growing soybeans and clearing habitat for oil palms have reduced political clout because there is less money available for investment. But he said economic downturns do not slow poor people scrounging for sustenance in natural habitats.
To overcome this attitude of neglect, Lovejoy thinks there ought to be a new type of ‘economics' that demonstrates the benefits of biodiversity and brings the “natural world into the normal calculus.” Researchers are already making progress in this direction. Thomas said that the valuation of nature is one of the most active areas of research. “People have very different opinions as to how much of it can be truly valued. But it is a rapidly developing field,” he said. “Once you've decided how much something is worth, then you've got to ask what are the financial or other mechanisms by which the true value of this resource can be appreciated.”
Hanski said that the main problem is the short-term view of economic forecasts. “Rapid use of natural resources because of short-term calculation may actually lead to a sort of exploitation rather than conservation or preservation.” He added that the emphasis on economic growth in rich societies in North America and Europe is frustrating. “We have become much richer than in 1970 when there actually was talk of zero growth in serious terms. So now we are richer and we are becoming more and more dependent on continued growth, the opposite of what we should be aiming at. It's a problem with our society and economics clearly, but I can't be very optimistic about the biodiversity or other environmental issues in this kind of situation.” He added that biodiversity is still taking a backseat to economics: “There is a very long way to go. Right now, with the economic situation in Europe, it's clear that these sorts of [biodiversity] issues are not the ones which are currently being debated by the heads of states.”
The economic downturn, which has led to reduced government and private funding and declines in endowments, has also hurt organizations dedicated to preserving biodiversity. Butchart said that some of the main US conservation organizations, including the Nature Conservancy and the World Wildlife Fund, have experienced staff cuts of up to 30%. “Organizations have had to tighten their belts and rein in programmes just to stay afloat, so it's definitely impacted the degree to which we could work effectively,” he said. “Most of the big international conservation organizations have had to lay off large numbers of staff.”
Cary Fowler, Executive Director of the Global Crop Diversity Trust in Rome, Italy, a public–private partnership to fund key crop collections for food security, also feels the extra challenges of the global economic crisis. “We invest our money conservatively like a foundation would in order to generate income that can reliably pay the bills in these seed banks year after year. So I'm always nervous and I have the computer on at the moment looking at what's happening with the sovereign debt crisis here in Europe. It's not good,” he said. “Governments are not being very generous with contributions to this area. Donors will rarely give a reason [for cutting funding].”
The political situation in the USA, the world's largest economy, is also not boding well for conservation of and research into biodiversity. The political extremism of the Republican Party during the run-up to the 2012 presidential election has worried many involved in biodiversity issues. Republican contender Texas Governor Rick Perry has been described as ‘anti-science' for his denial of man-made climate change, a switch from the position of 2008 Republican candidate John McCain. Perry was also reported to have described evolution as a “theory that's out there, and it's got some gaps in it” at a campaign event in New Hampshire earlier in the year.
Raven said this attitude is putting the USA at a disadvantage. “It drives us to an anti-intellectualism and a lack of real verification for anything which is really serious in terms of our general level of scientific education and our ability to act intelligently,” he said.
Still, Larigauderie said that although the USA has not signed the conventions on biodiversity, she has seen US observers attend the meetings, especially under the Obama administration. “They just can't speak,” she said. Meanwhile, Lovejoy said that biodiversity could get lost in the “unbelievable polarisation affecting US politics. I have worked out of Washington for 36 years now—I've never seen anything like this: an unwillingness to actually listen to the other side.”
Raven said it is vital for the USA to commit to preserving biodiversity nationally and internationally. “It's extremely important because our progress towards sustainability for the future will depend on our ability to handle biodiversity in large part. We're already using about half of all the total photosynthetic productivity on land worldwide and that in turn means we're cutting our options back badly. The US is syphoning money by selling debt and of course promoting instability all over the world,” he explained. “It's clear that there is no solution to it other than a level population, more moderate consumption levels and new technologies altogether.”
The EU and the UN have also changed the time horizon for halting the decline in biodiversity. As part of the Nagoya meeting, the UN announced the UN Decade for Biodiversity. The strategic objectives include a supporting framework for the implementation of the Biodiversity Strategic Plan 2011–2020 and the Aichi Biodiversity Targets, as well as guidance to regional and international organizations, and more public awareness of biodiversity issues.
But Butchart remains sceptical. “I suspect ‘decades of whatever' have even less impact than years,” he said. “2008 was the International Year of the Potato. I don't know how much impact that had on your life and awareness. I think there is greater awareness and greater potential to make significant progress in addressing biodiversity loss now than there was 10 years ago, but the scale of the challenge is still immense.”
Hanski has similar doubts. “I believe it's inevitable that a very large fraction of the species on Earth will go extinct in the next hundred years. I can't see any change to that.” But he is optimistic that some positive change can be made. “Being pessimistic doesn't help. The nations still can make a difference.” He said he has observed ecotourism playing a role in saving some species in Madagascar, where he does some of his research.
“We're not going to fundamentally be able to wipe life off the planet,” Thomas said. “We will wipe ourselves off the planet virtually certainly before we wipe life out on Earth. However, from the point of view of humanity as a culture, and in terms of the resources we might be able to get from biodiversity indirectly or directly, if we start losing things then it takes things millions of years to ‘re-evolve' something that does an equivalent job. From a human perspective, when we wipe lots of things out, they're effectively permanently lost. Of course it would be fascinating and I would love to be able to come back to the planet in 10 million years and see what it looks like, assuming humans are not here and other stuff will be.”
Djoghlaf, by contrast, is more optimistic about our chances: “I believe in the human survival aspect. When humankind realises that the current pattern of production and consumption and the current way that it is dealing with nature is unsustainable, we will wake up.”  相似文献

9.
Samuel Caddick 《EMBO reports》2008,9(12):1174-1176
  相似文献   

10.
Geneticists and historians collaborated recently to identify the remains of King Richard III of England, found buried under a car park. Genetics has many more contributions to make to history, but scientists and historians must learn to speak each other's languages.
The remains of King Richard III (1452–1485), who was killed with sword in hand at the Battle of Bosworth Field at the end of the Wars of the Roses, had lain undiscovered for centuries. Earlier this year, molecular biologists, historians, archaeologists and other experts from the University of Leicester, UK, reported that they had finally found his last resting place. They compared ancient DNA extracted from a scoliotic skeleton discovered under a car park in Leicester—once the site of Greyfriars church, where Richard was rumoured to be buried, but the location of which had been lost to time—with that of a seventeenth-generation nephew of King Richard: it was a match. Richard has captured the public imagination for centuries: Tudor-friendly playwright William Shakespeare (1564–1616) portrayed Richard as an evil hunchback who killed his nephews in order to ascend to the throne, whilst in succeeding years others have leapt to his defence and backed an effort to find his remains.
Molecular biologist Turi King, who led the Leicester team that extracted the DNA and tracked down a descendant of Richard's older sister, said that Richard's case shows how multi-disciplinary teams can join forces to answer history's questions. “There is a lot of talk about what meaning does it have,” she said. “It tells us where Richard III was buried; that the story that he was buried in Greyfriars is true. I think there are some people who [will] try and say: “well, it's going to change our view of him” […] It won't, for example, tell us about his personality or if he was responsible for the killing of the Princes in the Tower.”
The discovery and identification of Richard's skeleton made headlines around the world, but he is not the main prize when it comes to collaborations between historians and molecular biologists. Although some of the work has focused on high-profile historic figures—such as Louis XVI (1754–1793), the only French king to be executed, and Vlad the Impaler, the Transylvanian royal whose patronymic name inspired Bram Stoker's Dracula (Fig 1)—many other projects involve population studies. Application of genetics to history is revealing much about the ancestry and movements of groups of humans, from the fall of the Roman Empire to ancient China.
Figure 1: The use of molecular genetics to untangle history. Even when the historical record is robust, molecular biology can contribute to our understanding of important figures and their legacies and provide revealing answers to questions about ancient princes and kings.
Medieval historian Michael McCormick of Harvard University, USA, commented that historians have traditionally relied on studying records written on paper, sheepskin and papyrus. However, he and other historians are now teaming up with geneticists to read the historical record written down in the human genome and expand their portfolio of evidence. “What we're seeing happening now—because of the tremendous impact from the natural sciences and particularly the application of genomics; what some of us are calling genomic archaeology—is that we're working back from modern genomes to past events reported in our genomes,” McCormick explained. “The boundaries between history and pre-history are beginning to dissolve. It's a really very, very exciting time.”
McCormick partnered with Mark Thomas, an evolutionary geneticist at University College London, UK, to try to unravel the mystery of one million Romano-Celtic men who went missing in Britain after the fall of the Roman Empire. Between the fourth and seventh centuries, Germanic tribes of Angles, Saxons and Jutes began to settle in Britain, replacing the Romano-British culture and forcing some of the original inhabitants to migrate to other areas. “You can't explain the predominance of the Germanic Y chromosome in England based on the population unless you imagine (a) that they killed all the male Romano-Celts or (b) there was what Mark called ‘sexual apartheid' and the conquerors mated preferentially with the local women. [The latter] seems to be the best explanation that I can see,” McCormick said of the puzzle.
Ian Barnes, a molecular palaeobiologist at Royal Holloway, University of London, commented that McCormick studies an unusual period, for which both archaeological and written records exist. “I think archaeologists and historians are used to having conflicting evidence between the documentary record and the archaeological record. If we bring in DNA, the goal is to work out how to pair all the information together into the most coherent story.”
Patrick Geary, Professor of Western Medieval History at the Institute for Advanced Study in Princeton, New Jersey, USA, studies the migration period of Europe: a time in the first millennium when Germanic tribes, including the Goths, Vandals, Huns and Longobards, moved across Europe as the Roman Empire was declining. “We do not have detailed written information about these migrations or invasions or whatever one wants to call them. Primarily what we have are accounts written later on, some generations later, from the contemporary record. What we tend to have are things like sermons bemoaning the faith of people because God's wrath has brought the barbarians on them. Hardly the kind of thing that gives us an idea of exactly what is going on—are these really invasions, are they migrations, are they small military groups entering the Empire? And what are these ‘peoples': biologically related ethnic groups, or ad hoc confederations?” he said.
Geary thinks that in the absence of written records, DNA and archaeological records could help fill in the gaps. He gives the example of jewellery, belt buckles and weapons found in ancient graves in Hungary and Northern and Southern Italy, which suggest migrations rather than invasions: “If you find this kind of jewellery in one area and then you find it in a cemetery in another, does it mean that somebody was selling jewellery in these two areas? Does this mean that people in Italy—possibly because of political change—want to identify themselves, dress themselves in a new style? This is hotly debated,” Geary explained. Material goods can suggest a relationship between people but the confirmation will be found in their DNA. “These are the kinds of questions that nobody has been able to ask because until very recently, DNA analysis simply could not be done and there were so many problems with it that this was just hopeless,” he explained. Geary has already collected some ancient DNA samples and plans to collect more from burial sites north and south of the Alps dating from the sixth century, hoping to sort out kinship relations and genetic profiles of populations.
King said that working with ancient DNA is a tricky business. “There are two reasons that mitochondrial DNA (mtDNA) is the DNA we wished to be able to analyse in [King] Richard. In the first instance, we had a female-line relative of Richard III and mtDNA is passed through the female line. Fortunately, it's also the most likely bit of DNA that we'd be able to retrieve from the skeletal remains, as there are so many copies of it in the cell. After death, our DNA degrades, so mtDNA is easier to retrieve simply due to the sheer number of copies in each cell.”
Geary contrasted the analysis of modern and ancient DNA. He called modern DNA analysis “[…] almost an industrial thing. You send it off to a lab, you get it back, it's very mechanical.” Meanwhile, he described ancient DNA work as artisanal, because of degeneration and contamination. “Everything that touched it, every living thing, every microbe, every worm, every archaeologist leaves DNA traces, so it's a real mess.” He said the success rate for extracting ancient mtDNA from teeth and dense bones is only 35%. The rate for nuclear DNA is only 10%. “Five years ago, the chances would have been zero of getting any, so 10% is a great step forward. And it's possible we would do even better because this is a field that is rapidly transforming.”
But the bottleneck is not only the technical challenge of extracting and analysing ancient DNA. Historians and geneticists also need to understand each other better. “That's why historians have to learn what it is that geneticists do, what this data is, and the geneticists have to understand the kind of questions that [historians are] trying to ask, which are not the old nineteenth-century questions about identity, but questions about population, about gender roles, about relationship,” Geary said.
DNA analysis can help to resolve historical questions and mysteries about our ancestors, but both historians and geneticists are becoming concerned about potential abuses and frivolous applications of DNA analysis in their fields. Thomas is particularly disturbed by studies based on single historical figures. “Unless it's a pretty damn advanced analysis, then studying individuals isn't particularly useful for history unless you want to say something like this person had blue eyes or whatever. Population-level studies are best,” he said. He conceded that the genetic analysis of Richard III's remains was a sound application but added that this often is not the case with other uses, which he referred to as “genetic astrology.” He was critical of researchers who come to unsubstantiated conclusions based on ancient DNA, and of scientific journals that readily publish such papers.
Thomas said that it is reasonable to analyse a Y chromosome or mtDNA to estimate a certain genetic trait. “But then to look at the distribution of those, note in the tree where those types are found, and informally, interpretively make inferences—“Well this must have come from here and therefore when I find it somewhere else then that means that person must have ancestors from this original place”—[…] that's deeply flawed. It's the most widely used method for telling historical stories from genetic data. And yet it is easily the one with the least credibility.” Thomas criticized such facile use of genetic data, which misleads the public and the media. “I suppose I can't blame these [broadcast] guys because it's their job to make the programme look interesting. If somebody comes along and says ‘well, I can tell you you're descended from some Viking warlord or some Celtic princess', then who are they to question.”
Similarly, the historians have reservations about making questionable historical claims on the basis of DNA analysis. Geary said the use of mtDNA to identify Richard III was valuable because it answered a specific, factual question. However, he is turned off by other research using DNA to look at individual figures, such as a case involving a princess who was a direct descendant of the woman who posed for Leonardo da Vinci's Mona Lisa. “There's some people running around trying to dig up famous people and prove the obvious. I think that's kind of silly. There are others that I think are quite appropriate, and while it is not my kind of history, I think it is fine,” he said. “The Richard III case was in the tradition of forensics.”
Nicola Di Cosmo, a historian at the Institute for Advanced Study, who is researching the impact of climate change on the thirteenth-century Mongol empire, follows closely the advances in DNA and history research, but has not yet applied it to his own work. Nevertheless, he said that genetics could help to understand the period he studies because there are no historical documents, although monumental burials exist. “It is important to get a sense of where these people came from, and that's where genetics can help,” he said. He is also concerned about geneticists who publish results without involving historians and without examining other records. He cited a genetic study of a so-called ‘Eurasian male' in a prestige burial of the Asian Hun Xiongnu, a nomadic people who at the end of the third century B.C. formed a tribal league that dominated most of Central Asia for more than 500 years. “The conclusion the geneticists came to was that there was some sort of racial tolerance in this nomadic empire, but we have no way to even assume that they had any concept of race or tolerance.”
Di Cosmo commented that the cases in which historians and archaeologists work with molecular biologists are rare and remain disconnected in general from the mainstream of historical or archaeological research. “I believe that historians, especially those working in areas for which written records are non-existent, ought to be taking seriously the evidence churned out by genetic laboratories. On the other hand, geneticists must realize that the effectiveness of their research is limited unless they access reliable historical information and understand how a historical argument may or may not explain the genetic data” [1].
Notwithstanding the difficulties in collaboration between the two fields, McCormick is excited about historians working with DNA. He said the intersection of history and genomics could create a new scientific discipline in the years ahead. “I don't know what we'd call it. It would be a sort of fusion science. It certainly has the potential to produce enormous amounts of enormously interesting new evidence about our human past.”  相似文献

11.
Suran M 《EMBO reports》2011,12(1):27-30
Few environmental disasters are as indicting of humanity as major oil spills. Yet Nature has sometimes shown a remarkable ability to clean up the oil on its own.In late April 2010, the BP-owned semi-submersible oilrig known as Deepwater Horizon exploded just off the coast of Louisiana. Over the following 84 days, the well from which it had been pumping spewed 4.4 million barrels of crude oil into the Gulf of Mexico, according to the latest independent report (Crone & Tolstoy, 2010). In August, the US Government released an even grimmer estimate: according to the federal Flow Rate Technical Group, up to 4.9 million barrels were excreted during the course of the disaster. Whatever the actual figure, images from NASA show that around 184.8 million gallons of oil have darkened the waters just 80 km from the Louisiana coast, where the Mississippi Delta harbours marshlands and an abundance of biodiversity (NASA Jet Propulsion Laboratory, 2010; Fig 1).…the Deepwater incident is not the first time that a massive oil spill has devastated marine and terrestrial ecosystems, nor is it likely to be the lastOpen in a separate windowFigure 1Images of the Deepwater Horizon oil slick in the Gulf of Mexico. These images were recorded by NASA''s Terra spacecraft in May 2010. The image dimensions are 346 × 258 kilometres and North is toward the top. In the upper panel, the oil appears bright turquoise owing to the combination of images that were used from the Multi-angle Imaging SpectroRadiometer (MISR) aboard the craft. The Mississippi Delta, which harbors marshlands and an abundance of biodiversity, is visible in the top left of the image. The white arrow points to a plume of smoke and the red cross-hairs indicate the former location of the drilling rig. The lower two panels are enlargements of the smoke plume, which is probably the result of controlled burning of collected oil on the surface.© NASA/GSFC/LaRC/JPL, MISR TeamThe resulting environmental and economic situation in the Gulf is undoubtedly dreadful—the shrimp-fishing industry has been badly hit, for example. Yet the Deepwater incident is not the first time that a massive oil spill has devastated marine and terrestrial ecosystems, nor is it likely to be the last. In fact, the US National Oceanic and Atmospheric Association (NOAA) deals with approximately 300 oil spills per year and the Deepwater catastrophe—despite its extent and the enormous amount of oil released—might not be as terrible for the environment as was originally feared. Jacqueline Michel, a geochemist who has worked on almost every major oil spill since the 1970s and who is a member of NOAA''s scientific support team for the Gulf spill, commented that “the marshes and grass are showing some of the highest progresses of [oil] degradation because of the wetness.” This rapid degradation is partly due to an increased number of oil-consuming microbes in the water, whose population growth in response to the spill is cleaning things up at a relatively fast pace (Hazen et al, 2010).It therefore seems that, however bad the damage, Nature''s capacity to repair itself might prevent the unmitigated disaster that many feared on first sight of the Deepwater spill. As the late social satirist George Carlin (1937–2008) once put it: “The planet will shake us off like a bad case of fleas, a surface nuisance[.] 
The planet will be here for a long, long—LONG—time after we''re gone, and it will heal itself, it will cleanse itself, because that''s what it does, it''s a self-correcting system.”Michel believes that there are times when it is best to leave nature alone. In such cases the oil will degrade naturally by processes as simple as exposure to sunlight—which can break it down—or exposure to the air—which evaporates many of its components. “There have been spills where there was no response because we knew we were going to cause more harm,” Michel said. “Although we''re going to remove heavier layers of surface oil [in this case], the decision has been made to leave oil on the beach because we believe it will degrade in a timescale of months […] through natural processing.”To predict the rate of general environmental recovery, Michel said one should examine the area''s fauna, the progress of which can be very variable. Species have different recovery rates and although it takes only weeks or months for tiny organisms such as plankton to bounce back to their normal population density, it can take years for larger species such as the endangered sea turtle to recover.…however bad the damage, Nature''s capacity to repair itself might prevent the unmitigated disaster that many feared on first sight…Kimberly Gray, professor of environmental chemistry and toxicology at Northwestern University (Evanston, IL, USA), is most concerned about the oil damaging the bottom of the food chain. “Small hits at the bottom are amplified as you move up,” she explained. “The most chronic effects will be at the base of the food chain […] we may see lingering effects with the shrimp population, which in time may crash. With Deepwater, it''s sort of like the straw that broke the shrimp''s back.”Wetlands in particular are a crucial component of the natural recovery of ecosystems, as they provide flora that are crucial to the diets of many organisms. They also provide nesting grounds and protective areas where fish and other animals find refuge from predation. “Wetlands and marsh systems are Nature''s kidneys and they''ve been damaged,” Gray said. The problem is exacerbated because the Louisiana wetlands are already stressed in the aftermath of Hurricane Katrina, which devastated the Gulf coast in August 2005, and because of constant human activity and environmental damage. As Gray commented, “Nature has a very powerful capacity to repair itself, but what''s happening in the modern day is assault after assault.”Ron Thom, a marine ecologist at Pacific Northwest National Laboratory—a US government-funded research facility (Richland, WA, USA)—has done important research on coastal ecosystems. He believes that such habitats are able to decontaminate themselves to a limited degree because of evolution. “[Coastal-related ecosystems are] pretty resilient because they''ve been around a long time and know how to survive,” he said.As a result, wetlands can decontaminate themselves of pollutants such as oil, nitrate and phosphate. However, encountering large amounts of pollutants in a short period of time can overwhelm the healing process, or even stop it altogether. “We did some experiments here in the early 90s looking at the ability for salt marshes to break down oil,” Thom said. “When we put too much oil on the surface of the marsh it killed everything.” He explained that the oil also destroyed the sediment–soil column, where plant roots are located. Eventually, the roots disintegrated and the entire soil core fell apart. 
According to Thom, the Louisiana marshes were weakened by sediment and nutrient starvation, which suggests that the Deepwater spill destroyed below-ground material in some locations. “You can alter a place through a disturbance so drastic that it never recovers to what it used to be because things have changed so much,” he said.“Nature has a very powerful capacity to repair itself, but what''s happening in the modern day is assault after assault”Michael Blum, a coastal marsh ecologist at Tulane University in New Orleans, said that it is hard to determine the long-term effects of the oil because little is known about the relevant ecotoxicology—the effect of toxic agents on ecosystems. He has conducted extensive research on how coastal marsh plants respond to stress: some marshes might be highly susceptible to oil whereas others could have evolved to deal with natural oil seepage to metabolize hydrocarbons. In the former, marshes might perish after drastic exposure to oil leading to major shifts in plant communities. In the latter case, the process of coping with oil could involve the uptake of pollutants in the oil—known as polycyclic aromatic hydrocarbons (PAHs)—and their reintroduction into the environment. “If plants are growing in the polluted sediments and tapping into those contaminated sources, they can pull that material out of the soil and put it back into the water column or back into the leaf tissue that is a food source for other organisms,” Blum explained.In addition to understanding the responses of various flora, scientists also need to know how the presence of oil in an ecosystem affects the fauna. One model that is used to predict the effects of oil on vertebrates is the killifish; a group of minnows that thrive in the waters of Virginia''s Elizabeth River, where they are continuously exposed to PAHs deposited in the water by a creosote factory (Meyer & Di Giulio, 2003). “The killifish have evolved tolerance to the exposure of PAHs over chronic, long-term conditions,” Blum said. “This suggests that something similar may occur elsewhere, including in Gulf Coast marshes exposed to oil.”Although Michel is optimistic about the potential for environmental recovery, she pointed out that no two spills are the same. “There are lot of things we don''t know, we never had a spill that had surface release for so long at this water depth,” she said. Nevertheless, to better predict the long-term effects, scientists have turned to data from similar incidents.In 1989, the petroleum tanker Exxon Valdez struck Bligh Reef off the coast of Prince William Sound in Alaska and poured a minimum of 11 million gallons of oil into the water—enough to fill 125 Olympic-sized swimming pools. Senior scientist at NOAA, Stanley Rice of Juno, Alaska, studies the long-term effects of the spill and the resulting oil-related issues in Prince William Sound. Rice has worked with the spill since day 3 and, 20 years later, he is seeing major progress. “I never want to give the impression that we had this devastating oil spill in 1989 and it''s still devastating,” he said. “We have pockets of a few species where lingering oil hurts their survival, but in terms of looking at the Sound in its entirety […] it''s done a lot of recovery in 20 years.”…little is known about the relevant ecotoxicology—the effect of toxic agents on ecosystemsDespite the progress, Rice is still concerned about one group of otters. 
The cold temperature of the water in the Sound—rarely above 5 °C—slows the disintegration of the oil and, every so often, the otters come in contact with a lingering pocket. When they are searching for food, for example, the otters often dig into pits containing oil and become contaminated, which damages their ability to maintain body temperature. As a result, they cannot catch as much food and starve because they need to consume the equivalent of 25% of their body weight every day (Rice, 2009).“Common colds or worse, pneumonia, are extremely debilitating to an animal that has to work literally 365 days a year, almost 8 to 12 hours a day,” Rice explained. “If they don''t eat enough to sustain themselves, they die of hyperthermia.” Nevertheless, in just the last two years, Rice has finally seen the otter population rebound.Unlike the otters, one pod of orca whales has not been so lucky. Since it no longer has any reproductive females, the pod will eventually become extinct. However, as it dies out, orca prey such as seals and otters will have a better chance of reproducing. “There are always some winners and losers in these types of events,” Rice said. “Nature is never static.”The only ‘loser'' that Rice is concerned about at the moment is the herring, as many of their populations have remained damaged for the past 20 years. “Herring are critical to the ecosystem,” he said. “[They are] a base diet for many species […] Prince William Sound isn''t fully recovered until the herring recover.”North America is not alone in dealing with oil-spill disasters—Europe has had plenty of experience too. One of the worst spills occurred when the oil tanker Prestige leaked around 20 million gallons of oil into the waters of the Galacian coast in Northern Spain in 2002. This also affected the coastline of France and is considered Spain''s worst ecological disaster.“The impacts of the Prestige were indeed severe in comparison with other spills around the world,” said attorney Xabier Ezeizabarrena, who represented the Fishermen Guilds of Gipuzkoa in a lawsuit relating to the spill. “Some incidents aren''t even reported, but in the European Union the ratio is at least one oil spill every six months.”For disasters involving oil, oceanographic data to monitor and predict the movement of the spill is essentialIn Ezeizabarrena''s estimation, Spanish officials did not respond appropriately to the leak. The government was denounced for towing the shipwreck further out into the Atlantic Ocean—where it eventually sank—rather than to a port. “There was a huge lack of measures and tools from the Spanish government in particular,” Ezeizabarrena said. “[However], there was a huge response from civil society […] to work together [on restoration efforts].”Ionan Marigómez, professor of cellular biology at the University of the Basque Country, Spain, was the principal investigator on a federal coastal-surveillance programme named Orbankosta. He recorded the effects of the oil on the Basque coast and was a member of the Basque government''s technical advisory commission for the response to the Prestige spill. He was also chair of the government''s scientific committee. “Unfortunately, most of us scientists were not prepared to answer questions related to the biological impact of restoration strategies,” Marigómez said. 
“We lacked data to support our advice since continued monitoring is not conducted in the area […] and most of us had developed our scientific activity with too much focus on each one''s particular area when the problem needed a holistic view.”…the world consumes approximately 31 billion barrels of oil per year; more than 700 times the amount that leaked during the Deepwater spillFor disasters involving oil, oceanographic data to monitor and predict the movement of the spill is essential. Clean-up efforts were initially encouraged in Spain, but data provided by coastal-inspection programmes such as Orbankosta informed the decision to not clean up the Basque shoreline, allowing the remaining oil debris to disintegrate naturally. In fact, the cleaning activity that took place in Galicia only extended the oil pollution to the supralittoral zone—the area of the beach splashed by the high tide, rather than submerged by it—as well as to local soil deposits. On the Basque coast, restoration efforts were limited to regions where people were at risk, such as rocky areas near beaches and marinas.Eight years later, Galicia still suffers from the after-effects of the Prestige disaster. Thick subsurface layers of grey sand are found on beaches, sometimes under sand that seems to be uncontaminated. In Corme-Laxe Bay and Cies Island in Galicia, PAH levels have decreased. Studies have confirmed, however, that organisms exposed to the area''s sediments had accumulated PAHs in their bodies. Marigómez, for example, studied the long-term effects of the spill on mussels. Depending on their location, PAH levels decreased in the sampled mussel tissue between one and two years after the spill. However, later research showed that certain sites suffered later increases in the level of PAHs, due to the remobilization of oil residues (Cajaraville et al, 2006). Indeed, many populations of macroinvertebrate species—which are the keystones of coastal ecosystems—became extinct at the most-affected locations, although neighbouring populations recolonized these areas. The evidence suggests that only time will tell what will happen to the Galicia ecosystem. The same goes for oil-polluted environments around the world.The concern whether nature can recover from oil spills might seem extreme, considering that oil is a natural product derived from the earth. But too much of anything can be harmful and oil would remain locked underground without human efforts to extract it. “As from Paracelsus'' aphorism, the dose makes the poison,” Marigómez said.According to the US Energy Information Administration, the world consumes approximately 31 billion barrels of oil per year; more than 700 times the amount that leaked during the Deepwater spill. Humanity continues, in the words of some US politicians, to “drill, baby, drill!” On 12 October 2010, less than a year after the Gulf Coast disaster, US President Barack Obama declared that he was lifting the ban on deepwater drilling. It appears that George Carlin got it right again when he satirized a famous American anthem: “America, America, man sheds his waste on thee, and hides the pines with billboard signs from sea to oily sea!”  相似文献   

12.
Hidden Markov models (HMMs) and their variants are widely used in Bioinformatics applications that analyze and compare biological sequences. Designing a novel application requires the insight of a human expert to define the model's architecture. The implementation of prediction algorithms and algorithms to train the model's parameters, however, can be a time-consuming and error-prone task. We here present HMMConverter, a software package for setting up probabilistic HMMs, pair-HMMs as well as generalized HMMs and pair-HMMs. The user defines the model itself and the algorithms to be used via an XML file which is then directly translated into efficient C++ code. The software package provides linear-memory prediction algorithms, such as the Hirschberg algorithm, banding and the integration of prior probabilities and is the first to present computationally efficient linear-memory algorithms for automatic parameter training. Users of HMMConverter can thus set up complex applications with a minimum of effort and also perform parameter training and data analyses for large data sets.  相似文献
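The abstract above describes how HMMConverter turns an XML model description into C++ code with linear-memory prediction and training algorithms. Purely as orientation for readers unfamiliar with the underlying machinery, the sketch below implements the standard forward algorithm for a tiny discrete HMM in Python. It is a generic, hypothetical illustration: the two-state model, its probabilities and the function names are invented for this example and are not HMMConverter's XML schema or its generated code; it only shows the kind of dynamic programming that such packages automate before refinements such as the Hirschberg algorithm, banding or linear-memory parameter training come into play.

# Minimal, generic forward algorithm for a discrete-emission HMM (log-space).
# Illustrative only; not HMMConverter code.
import math

def log_sum_exp(values):
    """Numerically stable log(sum(exp(v) for v in values))."""
    m = max(values)
    if m == -math.inf:
        return -math.inf
    return m + math.log(sum(math.exp(v - m) for v in values))

def forward_log_likelihood(obs, states, log_init, log_trans, log_emit):
    """Return log P(obs | model) for a simple discrete-emission HMM."""
    # Initialise with the first observation.
    prev = {s: log_init[s] + log_emit[s][obs[0]] for s in states}
    # Recurse over the remaining observations.
    for o in obs[1:]:
        curr = {}
        for s in states:
            curr[s] = log_emit[s][o] + log_sum_exp(
                [prev[r] + log_trans[r][s] for r in states]
            )
        prev = curr
    return log_sum_exp(list(prev.values()))

# Toy two-state model over a binary alphabet (hypothetical numbers).
states = ["A", "B"]
log_init = {"A": math.log(0.6), "B": math.log(0.4)}
log_trans = {
    "A": {"A": math.log(0.7), "B": math.log(0.3)},
    "B": {"A": math.log(0.4), "B": math.log(0.6)},
}
log_emit = {
    "A": {"x": math.log(0.9), "y": math.log(0.1)},
    "B": {"x": math.log(0.2), "y": math.log(0.8)},
}
print(forward_log_likelihood(list("xxyyx"), states, log_init, log_trans, log_emit))

The computation is done in log space because products of many probabilities below one quickly underflow double precision; generated HMM code typically makes the same choice.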

13.
Wolinsky H 《EMBO reports》2011,12(2):107-109
Considering a patient's ethnic background can make some diagnoses easier. Yet, ‘racial profiling' is a highly controversial concept and might soon be replaced by the advent of individualized medicine.
In 2005, the US Food and Drug Administration (FDA; Bethesda, MD, USA) approved BiDil—a combination of vasodilators to treat heart failure—and hailed it as the first drug to specifically treat an ethnic group. “Approval of a drug to treat severe heart failure in self-identified black population is a striking example of how a treatment can benefit some patients even if it does not help all patients,” announced Robert Temple, the FDA's Director of Medical Policy. “The information presented to the FDA clearly showed that blacks suffering from heart failure will now have an additional safe and effective option for treating their condition” (Temple & Stockbridge, 2007). Even the National Medical Association—the African-American version of the American Medical Association—advocated the drug, which was developed by NitroMed, Inc. (Lexington, MA, USA). A new era in medicine based on racial profiling seemed to be in the offing.
By January 2008, however, the ‘breakthrough' had gone bust. NitroMed shut down its promotional campaign for BiDil—a combination of the vasodilators isosorbide dinitrate, which affects arteries and veins, and hydralazine hydrochloride, which predominantly affects arteries. In 2009, it sold its BiDil interests and was itself acquired by another pharmaceutical company.
In the meantime, critics had largely discredited the efforts of NitroMed, thereby striking a blow against the drug if not the concept of racial profiling or race-based medicine. Jonathan Kahn, a historian and law professor at Hamline University (St Paul, MN, USA), described the BiDil strategy as “a leap to genetics.” He demonstrated that NitroMed, motivated to extend its US patent scheduled to expire in 2007, purported to discover an advantage for a subpopulation of self-identified black people (Kahn, 2009). He noted that NitroMed conducted a race-specific trial to gain FDA approval, but, as there were no comparisons with other populations, it never had conclusive data to show that BiDil worked in black people differently from anyone else.
“If you want to understand heart failure, you look at heart failure, and if you want to understand racial disparities in conditions such as heart failure or hypertension, there is much to look at that has nothing to do with genetics,” Kahn said, adding “that jumping to race as a genetic construct is premature at best and reckless generally in practice.” The USA, he explained, has a century-old tradition of marketing to racial and ethnic groups. “BiDil brought to the fore the notion that you can have ethnic markets not only in things like cigarettes and food, but also in pharmaceuticals,” Kahn commented.
However, despite BiDil's failure, the search for race-based therapies and diagnostics is not over. “What I have found is an increasing, almost exponential, rise in the use of racial and ethnic categories in biotechnology-related patents,” Kahn said. “A lot of these products are still in the pipeline. They're still patent applications, they're not out on the market yet so it's hard to know how they'll play out.”
The growing knowledge of the human genome is also providing new opportunities to market medical products aimed at specific ethnic groups. The first bumpy steps were taken with screening for genetic risk factors for breast cancers. Myriad Genetics (Salt Lake City, UT, USA) holds broad patents in the USA for breast-cancer screening tests that are based on mutations of the BRCA1 and BRCA2 genes, but it faced challenges in Europe, where critics raised concerns about the high costs of screening.
The European Patent Office initially granted Myriad patents for the BRCA1- and BRCA2-based tests in 2001, after years of debate. But it revoked the patent on BRCA1 in 2005, a decision that was again reversed in 2009. In 2005, Myriad decided to narrow the scope of BRCA2 testing on the basis of ethnicity. The company won a patent to predict breast-cancer risk in Ashkenazi Jewish women on the basis of BRCA2 mutations, which occur in one in 100 of these women. Physicians offering the test are supposed to ask their patients whether they are in this ethnic group, and then pay a fee to Myriad.
Kahn said Myriad took this approach to package the test differently in order to protect its financial interests. However, he commented, the idea of ethnic profiling by asking women whether they identify themselves as Ashkenazi Jewish and then paying extra for an ‘ethnic' medical test did not work in Europe. “It's ridiculous,” Kahn commented.
After the preliminary sequence of the human genome was published a decade ago, experts noted that humans were almost the same genetically, implying that race was irrelevant. In fact, the validity of race as a concept in science—let alone the use of the word—has been hotly debated. “Race, inasmuch as the concept ought to be used at all, is a social concept, not a biological one. And using it as though it were a biological one is as much an ethical problem as a scientific problem,” commented Samia Hurst, a physician and bioethicist at Geneva University Medical School in Switzerland.
Citing a popular slogan: “There is no gene for race,” she noted, “there doesn't seem to be a single cluster of genes that fits with identification within an ethnic group, let alone with disease risks as well. We're also in an increasingly mixed world where many people—and I count myself among them—just don't know what to check on the box. If you start counting up your grandparents and end up with four different ethnic groups, what are you going to do? So there are an increasing number of people who just don't fit into those categories at all.”
Still, some dismiss criticism of racial profiling as political correctness that could potentially prevent patients from receiving proper care. Sally Satel, a psychiatrist in Washington, DC, USA, does not shy away from describing herself as a racially profiling physician and argues that it is good medicine. A commentator and resident scholar at the nonpartisan conservative think tank, the American Enterprise Institute (Washington, DC, USA), Satel wrote the book PC, M.D.: How Political Correctness is Corrupting Medicine. “In practicing medicine, I am not color blind. I take note of my patient's race. So do many of my colleagues,” she wrote in a New York Times article entitled “I am a racially profiling doctor” (Satel, 2002).
Satel noted in an interview that it is an undeniable fact that black people tend to have more renal disease, Native Americans have more diabetes and white people have more cystic fibrosis. She said these differences can help doctors to decide which drugs to prescribe at which dose and could potentially lead researchers to discover new therapies on the basis of race.
Satel added that the mention of race and medicine makes many people nervous. “You can dispel that worry by taking pains to specify biological lineage. Simply put, members of a group have more genes in common than members of the population at large. Some day geneticists hope to be able to conduct genomic profiles of each individual, making group identity irrelevant, but until then, race-based therapeutics has its virtues,” she said. “Denying the relationship between race and medicine flies in the face of clinical reality, and pretending that we are all at equal risk for health problems carries its own dangers.”
However, Hurst contended that this approach may be good epidemiology, rather than racial profiling. Physicians therefore need to be cautious about using skin colour, genomic data and epidemiological data in decision making. “If African Americans are at a higher risk for hypertension, are you not going to check for hypertension in white people? You need to check in everyone in any case,” she commented.
Hurst said European physicians, similarly to their American colleagues, deal with race and racial profiling, albeit in a different way. “The way in which we struggle with it is strongly determined by the history behind what could be called the biases that we have. If you have been a colonial power, if the past is slavery or if the past or present is immigration, it does change some things,” she said. “On the other hand, you always have the difficulty of doing fair and good medicine in a social situation that has a kind of ‘them and us' structure. Because you're not supposed to do medicine in a ‘them and us' structure, you're supposed to treat everyone according to their medical needs and not according to whether they're part of ‘your tribe' or ‘another tribe'.”
Indeed, social factors largely determine one's health, rather than ethnic or genetic factors. August A. White III, an African-American orthopaedic surgeon at Harvard Medical School (Boston, MA, USA) and author of the book Seeing Patients: Unconscious Bias In Health Care, noted that race is linked to disparities in health care in the USA. A similar point can be made in Europe where, for example, Romani people face discrimination in several countries.
White said that although genetic research shows that race is not a scientific concept, the way people are labelled in society and how they are treated need to be taken into account. “It'd be wonderful at some point if we can pop one's key genetic information into a computer and get a printout of which medications are best for them and which doses are best for them,” he commented. “In the meantime though, I advocate careful operational attempts to treat everyone as human beings and to value everyone's life, not devalue old people, or devalue women, or devalue different religious faiths, etc.”
Notwithstanding the scientific denunciation, a major obstacle for the concept of racial profiling has been the fact that the word ‘race' itself is politically loaded, as a result of, among other things, the baggage of eugenics and Nazi racism and the legacies of slavery and colonialism. Richard Tutton, a sociologist at Lancaster University in the UK, said that British scientists he interviewed for a Wellcome Trust project a few years ago prefer the term ethnicity to race. “Race is used in a legal sense in relation to inequality, but certainly otherwise, ethnicity is the preferred term, which obviously is different to the US,” he said. “I remember having conversations with German academics and obviously in Germany you couldn't use the R-word.”
Jan Helge Solbakk, a physician, theologian and medical ethicist at the University of Oslo in Norway, said the use of the term race in Europe is a non-starter because it makes it impossible for the public and policy-makers to communicate. “I think in Europe it would be politically impossible to launch a project targeting racial differences on the genetic level. The challenge is to find not just a more politically correct concept, but a genetically more accurate concept and to pursue such research questions,” he said. According to Kahn, researchers therefore tend to refer to ethnicity rather than race: “They're talking about European, Asian and African, but they're referring to it as ethnicity instead of race because they think somehow that's more palatable.”
Regardless, race-based medicine might just be a stepping stone towards more refined and accurate methods, with the advent of personalized medicine based on genomics, according to Leroy Hood, whose work has helped to develop tools to analyse the human genome. The focus of his company—the Institute for Systems Biology (Seattle, WA, USA)—is to identify genetic variants that can inform and help patients to pioneer individualized health care.
“Race as a concept is disappearing with interbreeding,” Hood said. “Race distinction is going to slowly fade away. We can use it now because we have signposts for race, which are colour, fairness, kinkiness of hair, but compared to a conglomeration of things that define a race, those are very few features. The race-defining features are going to be segregating away from one another more and more as the population becomes racially heterogeneous, so I think it's going to become a moot point.”
Hood instead advocates “4P” health care—“Predictive, Personalized, Preventive and Participatory.” “My overall feeling about the race-based correlations is that it is far more important to think about the individual and their individual unique spectra of health and wellness,” he explained. “I think we are not going to deal in the future with racial or ethnic populations, rather medicine of the future is going to be focused entirely on the individual.”
Yet, Arthur Caplan, Director of the Center for Bioethics at the University of Pennsylvania (Philadelphia, PA, USA), is skeptical about the prospects for both race-based and personalized medicine. “Race-based medicine will play a minor role over the next few years in health care because race is a minor factor in health,” he said. “It's not like we have a group of people who keel over dead at 40 who are in the same ethnic group.”
Caplan also argued that establishing personalized genomic medicine in a decade is a pipe dream. “The reason I say that is it's not just the science,” he explained. “You have to redo the whole health-care system to make that possible. You have to find manufacturers who can figure out how to profit from personalized medicine who are both in Europe and the United States. You have to have doctors that know how to prescribe them. It's a big, big revamping. That's not going to happen in 10 years.”
Hood, however, is more optimistic and plans to advance the concept with pilot projects; he believes that Europe might be the better testing ground. “I think the European systems are much more efficient for pioneering personalized medicine than the United States because the US health-care system is utterly chaotic. We have every combination of every kind of health care and health delivery. We have no common shared vision,” he said. “In the end we may well go to Europe to persuade a country to really undertake this. The possibility of facilitating a revolution in health care is greater in Europe than in the United States.”  相似文献

14.
Does the Golgi self-organize or does it form around an instructive template? Evidence on both sides is piling up, but a definitive conclusion is proving elusive. In the battle to define the Golgi, discussions easily spiral into what can appear like nitpicking. In a contentious poster session, an entire worldview rests on whether you think a particular mutant is arrested with vesicles that are close to but distinct from the ER or almost budded from but still attached to the ER. Sometimes obscured by these details are the larger issues. This debate “gets to the fundamental issue of how you think of the Golgi,” says Ben Glick of the University of Chicago (Chicago, IL). “The dogma has been that you need a template to build an organelle. But in the secretory system it's possible in principle that you could get de novo organization of structure. That's the issue that stirs people emotionally and intellectually.” Then there are the collateral issues. There is an ongoing controversy about the nature of forward transport through the Golgi—it may occur via forward movement of small vesicles, or by gradual maturation of one cisterna to form the next. The cisternal maturation model “argues for a Golgi that can be made and consumed,” says Graham Warren (Yale University, New Haven, CT)—a situation that is more difficult to reconcile with Warren's template-determined universe. Even more confusing is the situation in mitosis. Accounts vary wildly on how much of the Golgi disappears into the ER during mitosis. The answer would determine to what extent the cell has to rebuild the Golgi after mitosis, and what method it might use to do so. Several laboratories have made major contributions to address these issues. But none define them so clearly as those of Warren and Jennifer Lippincott-Schwartz (National Institutes of Health, Bethesda, MD). At almost every turn, on almost every issue, it seems that Warren and Lippincott-Schwartz reach opposite conclusions, sometimes based on similar or identical data. And yet, at least in public, there is a remarkable lack of rancor. “These are not easy experiments for us to do,” says Warren. “It's all cutting-edge research and we are pushing the technology to the limit. Part of that is that you push your own interpretation.” For her part, Lippincott-Schwartz approaches a lengthy poster-session debate with Warren with something approaching glee. This is not triumphal glee, however. Rather, Lippincott-Schwartz seems to relish the opportunity to exchange ideas, and on this point Warren agrees. “Complacency is the worst thing to have in a field,” he says. The debate “has made all of us think a lot harder.”  相似文献

15.
Crop shortages     
A lack of breeders to apply the knowledge from plant science is jeopardizing public breeding programmes and the training of future plant scientistsIn the midst of an economic downturn, many college and university students in the USA face an uncertain future. There is one crop of graduates, though, who need not worry about unemployment: plant breeders. “Our students start with six-digit salaries once they leave and they have three or four offers. We have people coming to molecular biology and they can''t find jobs. People coming to plant breeding, they have as many jobs as they want,” said Edward Buckler, a geneticist with the US Department of Agriculture''s Agricultural Research Service Institute for Genomic Diversity at Cornell University (Ithaca, NY, USA).The lure of Big Ag depletes universities and research institutes of plant breeders […] and jeopardizes the training of future generations of plant scientists and breedersThe secret behind the success of qualified breeders on the job market is that they can join ‘Big Ag''—big agriculture—that is, major seed companies. Roger Boerma, coordinator of academic research for the Center for Applied Genetic Technologies at the University of Georgia (Athens, GA, USA), said that most of his graduate and postdoctoral students find jobs at companies such as Pioneer, Monsanto and Syngenta, rather than working in the orchards and fields of academic research. According to Todd Wehner, a professor and cucurbit breeder at the Department of Horticultural Science, North Carolina State University (Raleigh, NC, USA), the best-paying jobs—US$100,000 plus good benefits and research conditions—are at seed companies that deal with the main crops (Guner & Wehner, 2003). By contrast, university positions typically start at US$75,000 and tenure track.As a result, Wehner said, public crop breeding in the USA has begun to disappear. “To be clear, there is no shortage of plant breeders,” he said. “There is a shortage of plant breeders in the public sector.” The lure of Big Ag depletes universities and research institutes of plant breeders—who, after all, are the ones who create new plant varieties for agriculture—and jeopardizes the training of future generations of plant scientists and breeders. Moreover, there is an increasing demand for breeders to address the challenge of creating environmentally sustainable ways to grow more food for an increasing human population on Earth.At the same time, basic plant research is making rapid progress. The genomes of most of the main crop plants and many vegetables have been sequenced, which has enabled researchers to better understand the molecular details of how plants fend off pests and pathogens, or withstand drought and flooding. This research has also generated molecular markers—short regions of DNA that are linked to, for example, better resistance to fungi or other pathogens. So-called marker-assisted breeding based on this information is now able to create new plant varieties more effectively than would be possible with the classical strategy of crossing, selection and backcrossing.However, applying the genomic knowledge requires both breeders and plant scientists with a better understanding of each other''s expertise. As David Baulcombe, professor of botany at the University of Cambridge, UK, commented, “I think the important gap is actually in making sure that the fundamental scientists working on genomics understand breeding, and equally that those people doing breeding understand the potential of genomics. 
This is part of the translational gap. There''s incomplete understanding on both sides.”…applying the genomic knowledge requires both breeders and plant scientists with a better understanding of each other''s expertiseIn the genomic age, plant breeding has an image problem: like other hands-on agricultural work, it is dirty and unglamorous. “A research project in agriculture in the twenty-first century resembles agriculture for farmers in the eighteenth century,” Wehner said. “Harvesting in the fields in the summer might be considered one of the worst jobs, but not to me. I''m harvesting cucumbers just like everybody else. I don''t mind working at 105 degrees, with 95% humidity and insects biting my ankles. I actually like that. I like that better than office work.”For most students, however, genomics is the more appealing option as a cutting-edge and glamorous research field. “The exciting photographs that you always see are people holding up glass test tubes and working in front of big computer screens,” Wehner explained.In addition, Wehner said that federal and state governments have given greater priority and funding to molecular genetics than to plant breeding. “The reason we''ve gone away from plant breeding of course is that faculty can get competitive grants for large amounts of money to do things that are more in the area of molecular genetics,” he explained. “Plant breeders have switched over to molecular genetics because they can get money there and they can''t get money in plant breeding.”“The frontiers of science shifted from agriculture to genetics, especially the genetics of corn, wheat and rice,” agreed Richard Flavell, former Director of the John Innes Centre (Norwich, UK) and now Chief Scientific Officer of Ceres (Thousand Oaks, CA, USA). “As university departments have chased their money, chased the bright students, they have [focused on] programmes that pull in research dollars on the frontiers, and plant breeding has been left behind as something of a Cinderella subject.”In the genomic age, plant breeding has an image problem: like other hands-on agricultural work, it is dirty and unglamorousIn a sense, public plant breeding has become a victim of its own success. Wehner explained that over the past century, the protection of intellectual property has created a profitable market for private corporations to the detriment of public programmes. “It started out where they could protect seed-propagated crops,” he said. “The companies began to hire plant breeders and develop their own varieties. And that started the whole agricultural business, which is now huge.”As a result, Wehner said, the private sector can now outmanoeuvre public breeders at will. “[Seed companies] have huge teams that can go much faster than I can go. They have winter nurseries and big greenhouses and lots of pathologists and molecular geneticists and they have large databases and seed technologists and sales reps and catalogue artists and all those things. They can do much faster cucumber breeding than I can. They can beat me in any area that they choose to focus on.”He said that seed corporations turn only to public breeders when they are looking for rare seeds obtained on expeditions around the world or special knowledge. These crops and the breeders and other scientists who work on them receive far less financial support from government than do the more profitable crops, such as corn and soybean. 
In effect, these crops are in an analogous position to orphan drugs that receive little attention because the patients who need them represent a small economic market. The dwindling support for public breeding programmes is also a result of larger political developments. Since the 1980s, when British Prime Minister Margaret Thatcher and US President Ronald Reagan championed the private sector in all things, government has consistently withdrawn support for public research programmes wherever the private sector can profit. “Plant breeding programmes are expensive. My programme costs about US$500,000 a year to run for my crops, watermelon and cucumber. Universities don't want to spend that money if they don't have to, especially if it's already being done by the private sector,” Wehner said. “Over the last 30 years or so, food supplies and food security have fallen off the agenda of policymakers,” Baulcombe explained. “Applied research in academic institutions is disappearing, and so the opportunities for linking the achievements of basic research with applications, at least in the public sector, are disappearing. You've got these two areas of the work going in opposite directions.” There's another problem for plant breeding in the publish-or-perish world of academia. According to Ian Graham, Director of the Centre for Novel Agricultural Products at York University in the UK, potential academics in the plant sciences are turned off by plant breeding as a discipline because it is difficult to publish the research in high-impact journals. Graham, who is funded by the Bill & Melinda Gates Foundation to breed new varieties of Artemisia—the plant that produces the anti-malarial compound artemisinin—said this could change. “Now with the new [genomic] technologies, the whole subject of plant breeding has come back into the limelight. We can start thinking seriously about not just the conventional crops […] but all the marginal crops as well that we can really start employing these technologies on and doing exciting science and linking phenotypes to genes and phenotypes to the underlying biology,” he said. “It takes us back again closer to the science. That will bring more people into plant breeding.” Buckler, who specializes in functional genomic approaches to dissect complex traits in maize, wheat and Arabidopsis, said that public breeding still moves at a slower pace. “The seed companies are trying to figure out how to move genomics from gene discovery all the way to the breeding side. And it's moving forward,” he said. “There have been some real intellectual questions that people are trying to overcome as to how fast to integrate genomics. I think it's starting to occur also with a lot of the public breeders. A lot of it has been that the cost of genotyping, especially for specialty crops, was too high to develop marker systems that would really accelerate breeding.” Things might be about to change on the cost side as well. Buckler said that decreasing costs for sequencing and genotyping will give public breeding a boost. Using today's genomic tools, researchers and plant breeders could match the achievements of the last century in maize breeding within three years.
He said that comparable gains could be made in specialty crops, the forte of public breeding. “Right now, most of the simulations suggest that we can accelerate it about threefold,” Buckler said. “Maybe as our knowledge increases, maybe we can approach a 15-fold rate increase.”Indeed, the increasing knowledge from basic research could well contribute to significant advances in the coming years. “We''ve messed around with genes in a rather blind, sort of non-predictive process,” said Scott Jackson, a plant genomics expert at Purdue University (West Lafayette, IN, USA), who headed the team that decoded the soybean genome (Schmutz et al, 2010). “Having a full genome sequence, having all the genes underlying all the traits in whatever plant organism you''re looking at, makes it less blind. You can determine which genes affect the trait and it has the potential to make it a more predictive process where you can take specific genes in combinations and you can predict what the outcome might be. I think that''s where the real revolution in plant breeding is going to come.”Nevertheless, the main problem that could hold back this revolution is a lack of trained people in academia and the private sector. Ted Crosbie, Head of Plant Breeding at Monsanto (St Louis, MO, USA), commented at the national Plant Breeding Coordinating Committee meeting in 2008 that “[w]e, in the plant breeding industry, face a number of challenges. More plant breeders are reaching retirement age at a time when the need for plant breeders has never been greater […] We need to renew our nation''s capacity for plant breeding.”“…with the new [genomic] technologies, the whole subject of plant breeding has come back into the limelight”Dry bean breeder James Kelly, a professor of crop and soil sciences at Michigan State University (East Lansing, MI, USA), said while there has been a disconnect between public breeders and genomics researchers, new federal grants are designed to increase collaboration.In the meantime, developing countries such as India and China have been filling the gap. “China is putting a huge amount of effort into agriculture. They actually know the importance of food. They have plant breeders all over the place,” Wehner said. “The US is starting to fall behind. And now, agricultural companies are looking around wondering—where are we going to get our plant breeders?”To address the problem, major agriculture companies have begun to fund fellowships to train new plant breeders. Thus far, Buckler said, these efforts have had only a small impact. He noted that 500 new PhDs a year are needed just in maize breeding. “It''s not uncommon for the big companies like Monsanto, Pioneer and Syngenta to spend money on training, on endowing chairs at universities,” Flavell said. “It''s good PR, but they''re serious about the need for breeders.”The US government has also taken some measures to alleviate the problem. Congress decided to establish the US National Institute of Food and Agriculture (Washington, DC, USA) under the auspices of the US Department of Agriculture to make more efficient use of research money, advance the application of plant science and attract new students to plant breeding (see the interview with Roger Beachy in this issue, pp 504–507). 
Another approach is to use distance education to train breeders, such as technicians who want to advance their careers, in certificate programmes rather than master's or doctorate programmes. “If [breeding] is not done in universities in the public sector, where is it done?” Flavell asked about the future of public breeding. “I can wax lyrical and perhaps be perceived as being over the top, but if we're going to manage this planet on getting more food out of less land, this has to be almost one of the highest things that has got to be taken care of by government.” Wehner added, “The public in the developed world thinks food magically appears in grocery stores. There is no civilization without agriculture. Without plant breeders to work on improving our crops, civilization is at risk.”  相似文献
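As a concrete illustration of the marker-assisted breeding described earlier in this item (selecting candidate lines on a DNA marker linked to, say, fungal resistance before ranking them on phenotype), here is a minimal sketch. The line names, marker genotypes and yield figures are invented for illustration only and do not come from any programme mentioned above.

```python
# Minimal sketch of marker-assisted selection: keep only candidate lines that
# carry the favourable allele at a resistance-linked marker, then rank the
# survivors on a measured phenotype. All names and numbers are hypothetical.

candidates = [
    {"line": "CU-101", "marker_R1": "AA", "yield_t_ha": 4.1},
    {"line": "CU-102", "marker_R1": "Aa", "yield_t_ha": 4.6},
    {"line": "CU-103", "marker_R1": "aa", "yield_t_ha": 5.0},  # lacks the allele
    {"line": "CU-104", "marker_R1": "AA", "yield_t_ha": 4.8},
]

def carries_resistance_allele(genotype: str, allele: str = "A") -> bool:
    """True if the favourable allele is present at the marker locus."""
    return allele in genotype

selected = sorted(
    (c for c in candidates if carries_resistance_allele(c["marker_R1"])),
    key=lambda c: c["yield_t_ha"],
    reverse=True,
)

for c in selected:
    print(c["line"], c["marker_R1"], c["yield_t_ha"])
# CU-103 is dropped despite the highest yield because it lacks the marker
# allele; CU-104, CU-102 and CU-101 advance, ranked by yield.
```

The design point the sketch captures is that the marker acts as an early, cheap filter so that only lines predicted to carry the trait are grown out and phenotyped at scale.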

16.
Suran M 《EMBO reports》2011,12(5):404-407
The increasing influence of the Tea Party in Congress and politics has potential repercussions for public funding of scientific research in the USAIn 2009, Barack Obama became the 44th President of the USA, amid hopes that he would fix the problems created or left unresolved by his predecessor. However, despite his positive mantra, “Yes we can,” the situation was going to get worse: the country was spiralling towards an economic recession, a collapsing residential real-estate market and the loss of millions of jobs. Now, the deficit lingers around US$14 trillion (US Department of the Treasury, 2011). In response to these hardships and the presence of a perceived ‘socialist'' president in office, a new political movement started brewing that would challenge both the Democrats and the Republicans—the two parties that have dominated US politics for generations. Known as the Tea Party, this movement has been gaining national momentum in its denouncement of the status quo of the government, especially in relation to federal spending, including the support of scientific research.The name is a play on the Boston Tea Party, at which more than 100 American colonists dumped 45 tonnes of tea into Boston Harbour (Massachusetts, USA) in 1773 to protest against the British taxation of imported tea. Whereas the 18th century Boston Tea Party formed to protest against a specific tax, the Tea Party of the 21st century protests against taxes and ‘big'' government in general.Many view Tea Party followers as modern muckrakers, but supporters claim their movement is fundamentally about upholding the US Constitution. Tea Party Patriots, a non-partisan organization, considers itself to be the official home of the Tea Party movement. Fuelled by the values of fiscal responsibility, limited government and free markets, Tea Party Patriots believe, these three principles are granted by the Constitution, although not necessarily upheld by the administration.“If you read the Constitution, the limits of government involvement in society [are] pretty well-defined and our government has gone farther and farther beyond the specific limits of the Constitution,” said Mark Meckler, one of the co-founders of Tea Party Patriots. “Our Constitution is not designed as an empowering document, but as a limiting document… [and] was intended to be used as a weapon by the people against the government to keep it in the box.” Tea Partiers tend to be especially critical when it comes to spending tax dollars on bank bailouts and health care, but anything goes when it comes to cutting back on public spending—even science. “We believe everything needs to be on the table since the government is virtually bankrupt,” Meckler said. “We need to cut the waste, cut the abuse [and] get rid of the departments that shouldn''t exist.”Tea Partiers tend to be especially critical when it comes to spending tax dollars on bank bailouts and health care, but anything goes when […]cutting […] public spending—even scienceOn 19 February 2011, the US House of Representatives, which is currently controlled by the Republicans, passed a federal-spending bill for the remainder of the 2011 fiscal year budget. Among other cuts, the bill called for billions of dollars to be slashed from the budgets of federal science agencies. If the bill is signed into law, the National Institutes of Health (NIH) will have $1.6 billion cut from its budget—a 5.2% decrease—and the Department of Energy (DOE) will experience an 18% cut in funding for its Office of Science. 
Other agencies targeted include the Environmental Protection Agency (EPA), the National Aeronautics and Space Administration (NASA), the National Institute of Standards and Technology (NIST) and the National Science Foundation (NSF; Anon, 2011; Cho, 2011). Although the US Senate, which has a Democratic majority, must consider the bill before any definite amendments to the budget are made, it is likely that there will be some cuts to science funding.Although the House is in favour of science-related cuts, President Obama supports spending more on science education, basic research and clean-energy research. He has also proposed an 11.8% increase in the budget of the DOE, as well as an 8% increase in the NSF budget (Office of Management and Budget, 2011).The House is in favour of science-related cuts, but President Obama is in favour of spending more on science education, basic science and clean-energy researchJoann Roskoski, acting assistant director of the Biology Directorate at the NSF, said her institute is strongly in favour of President Obama''s budget proposal. “President Obama is a very strong supporter of fundamental research and STEM [science, technology, engineering and mathematics] education because he perceives it as investing in the future of the country,” she said. “These are just difficult budgetary times and we''ll just have to wait and see what happens. As they say, the president proposes and Congress disposes.”Karl Scheidt, a professor of chemistry at Northwestern University (Evanston, Illinois, USA), has four grants from federal agencies. “A couple of my grants expire this year, which is happening at the worst, worst possible time,” explained Scheidt, whose grants are funded by the NIH and the NSF. He added that although many politicians either do not understand or believe in the fundamentals of science, they still preach to the masses about what they ‘think'' they know. “I think it''s an absolute travesty that many people don''t understand science and that many of the Republicans who don''t fully understand science perpetuate incorrect assumptions and scientific falsehoods when speaking in public,” he said. “It makes the US less competitive and puts us collectively at a disadvantage relative to other nations if we don''t succeed in scientific education and innovative research in the future.”Although the Tea Party is not technically associated with the Republican Party, all Tea-Party representatives and senators ran as Republican candidates in the last election. While only one-third of seats in the Senate are on the ballot every two years for a six-year term, all House seats are for a two-year term. In the most recent Senatorial election, 50% of Tea Party-backed candidates won; 10 in total. 140 candidates for seats in the House of Representatives were backed by the Tea Party—all of whom were Republicans—but only 40 won. Nevertheless, with around 100 new Republicans in office, a House controlled by a Republican majority and most Congress-based Republicans in agreement with Tea Party ideals, the Tea Party actually has a lot of sway on the voting floor.Of course, as a fundamentally grass-roots movement, their influence is not limited to the halls of power. Since just before the November election last year, Tea Party-backed politicians have received more scrutiny and media exposure, meaning more people have listened to their arguments against spending on science. 
In fact, Republican politicians associated with the Tea Party have made critical and sometimes erroneous comments about science. Representative Michele Bachmann, for example, claimed that because carbon dioxide is a natural gas, it is not harmful to our atmosphere (Johnson, 2009). Representative Jack Kingston denounced the theory of evolution and stated that he did not come from a monkey (The Huffington Post, 2011). When asked how old he believes the Earth to be, Senator Rand Paul refused to answer (Binckes, 2010). He also introduced a bill to cut the NSF budget by 62%, and targeted the budget of the Centers for Disease Control and Prevention. Scheidt believes part of the challenge is that many scientists do not properly articulate the importance of their work to the public, and there is limited representation on behalf of science in Washington. “It's difficult sometimes to advocate for and explain the critical importance of basic research and for the most part, Congress may not always appreciate the basic fundamental mission of organizations like the NIH,” Scheidt said. “Arlen Specter was one of the few people who could form coalitions with his colleagues on both sides of the aisle and communicate why scientific research is critical. Why discovering new ways to perform transplants and creating new medicines are so important to everyone.” Specter, a former senator, was Republican until 2009 when he decided to switch political parties. During the last Democratic primary, he lost the Pennsylvania Senate nomination after serving in Congress for more than four decades. The Democratic nominee, Joe Sestak, eventually lost the coveted seat to Pat Toomey, a Tea Party Republican who sponsored an amendment denying NIH funding for some grants while he was a House member. Toomey is also against funding climate science and clean-energy research with federal dollars. Specter was considered a strong supporter of biomedical research, especially cancer research. He was the catalyst that pushed through a great deal of pro-science legislation, such as adding approximately $10 billion to NIH funding as part of the stimulus package in 2009, and doubling NIH funding in the 1990s. As scientific research was so important to him, he served on the US Senate Committee on Appropriations Subcommittee on Labor, Health and Human Services, Education, and Related Agencies and on the Senate Committee on Environment and Public Works. Specter was a popular political champion of science not only because of all he had accomplished, but also because so few scientists are elected to office. Among those Democrats who lost their seats to Tea Party Republicans was Congressman Bill Foster. Foster, who once worked for the Fermi National Accelerator Laboratory (Fermilab)—which is funded by the DOE—represented Batavia, Illinois, which is also where Fermilab has its headquarters. “The new representative in the district where Fermilab resides is Randy Hultgren, a Republican, who has been very supportive of the laboratory since he's been elected,” said Cindy Conger, Chief Financial Officer at Fermilab. “He's very interested in us and very interested […] in us having adequate funding.” However, Fermilab is suffering financially. “We will […] have some level of layoffs,” Conger said.
“Inadequate federal funding could result in more layoffs or not being able to run our machines for part of the year. These are the things we are contemplating doing in the event of a significant budget cut. Nothing is off the table [but] we will do everything we can to run the [Tevatron] accelerator.”But Fermilab''s desperate appeal for $35 million per year for the next three fiscal years was denied by the Obama administration and not included in the 2012 White House budget request. As a result, the most powerful proton–antiproton accelerator in the USA, the Tevatron, is shutting down indefinitely near the end of this year.Another pro-science Republican is former Congressman John Porter, who studied at the Massachusetts Institute of Technology. He encouraged the federal funding of science while serving as chair of the House Subcommittee on Labor, Health and Human Services, and Education, as well as on the House Committee on Appropriations and Related Agencies. Like Scheidt, Porter said a problem is that not many members of Congress really understand science or what goes into scientific research.“Many members of Congress don''t realize that the money appropriated for the funding of scientific research through NIH, NSF […] is sent out to research institutes in their districts and states where the research is conducted,” said Porter, who retired from Congress in 2001 after serving for more than 20 years. “They simply haven''t been exposed to it and that''s the fault of the science community, which has a great responsibility to educate about the mechanisms on how we fund scientific research.”Today, Porter is vice-chair of the Foundation for the NIH and also chairs Research!America, a non-profit organization which aims to further medical, health and scientific research as higher national priorities. He also noted that industry would not fund scientific research in the way the government does because there would essentially be no profits. Therefore, federal funding remains essential.“Let''s take away the phones, iPads and everything else [those against the federal funding of science] depend on and see what''s left,” Porter said. “The US is the world leader in science, technology and research and the way we got there and the way we have created the technology that makes life easier […] is a result of making investments in that area.”For now, Scheidt said the best approach is to educate as many people as possible to understand that scientific research is a necessity, not a luxury. “We unfortunately have a very uneducated population in regard to science and it''s not 100% their fault,” he said. “However, if people took a real interest in science and paid as much attention to stem-cell or drug-discovery research as they did to the Grammy Awards or People magazine I think they would appreciate what''s going on in the science world.”…the best approach is to educate as many people as possible to understand that scientific research is a necessity, not a luxuryInstead, the USA is lagging behind its competitors when it comes to STEM education. According to the 2009 Program for International Student Assessment (PISA), the USA is ranked 17th on science and 25th on maths out of 34 countries (US Department of Education, 2010). “We''re in a cluster now, we''re no longer the leading country,” said D. Martin Watterson, a molecular biologist who sits on NIH peer-review committees to evaluate grant proposals. 
The reason, according to Watterson, is that the first things to be cut after a budget decrease are training grants for continuing education efforts. Moreover, the USA already lacks highly trained workers in the field of science. “In some disciplines, employers now look to other places in Europe and Asia to find those trained personnel,” Watterson said.Ultimately, most people at least want a final budget to be passed so that there is sufficient time to plan ahead. However, Georgetown University political science professor Clyde Wilcox thinks that a compromise is not so simple. “The problem is that it''s a three-way poker game. People are going to sit down and they''re going to be bargaining, negotiating and bluffing each other,” he said. “The House Republicans just want to cut the programs that they don''t like, so they''re not cutting any Republican programs for the most part.”As a result, institutions such as the EPA find themselves being targeted by the Republicans. Although there is not a filibuster-proof majority of Democrats in the Senate, they still are a majority and will try to preserve science funding. Wilcox said that it is not necessarily a good thing to continue negotiating if nothing gets done and the country is dependent on continuing resolutions.Although there is not a filibuster-proof majority of Democrats in the Senate, they still are a majority and will try to preserve science funding“What the real problem is, when push comes to shove, someone has to blink,” he said. “I don''t think there will be deep cuts in science for a number of reasons, one is science is consistent with the Democratic ideology of education and the Republican ideology of investment. And then, we don''t really spend that much on science anyway so you couldn''t come remotely close to balancing the budget even if you eliminated everything.”Although during his time in Congress representatives of both parties were not as polarized as they are today, Porter believes the reason they are now is because of the political climate. “The president has made [science] a very important issue on his agenda and unfortunately, there are many Republicans today that say if he''s for it, I''m against it,” Porter said. In fact, several government officials ignored repeated requests or declined to comment for this article.“It''s time for everybody from both parties to stand up for the country, put the party aside and find solutions to our problems,” Porter commented. “The American people didn''t just elect us to yell at each other, they elected us to do a job. You have to choose priorities and to me the most important priority is to have our children lead better lives, to have all human beings live longer, healthier, happier lives and to have our economy grow and prosper and our standard of living maintained—the only way to do that is to invest where we lead the world and that''s in science.”  相似文献   

17.
Wolinsky H 《EMBO reports》2012,13(4):308-312
Genomics has become a powerful tool for conservationists to track individual animals, analyse populations and inform conservation management. But as helpful as these techniques are, they are not a substitute for stricter measures to protect threatened species.You might call him Queequeg. Like Herman Melville''s character in the 1851 novel Moby Dick, Howard Rosenbaum plies the seas in search of whales following old whaling charts. Standing on the deck of a 12 m boat, he brandishes a crossbow with hollow-tipped darts to harpoon the flanks of the whales as they surface to breathe (Fig 1). “We liken it to a mosquito bite. Sometimes there''s a reaction. Sometimes the whales are competing to mate with a female, so they don''t even react to the dart,” explained Rosenbaum, a conservation biologist and geneticist, and Director of the New York City-based Wildlife Conservation Society''s Ocean Giants programme. Rosenbaum and his colleagues use the darts to collect half-gram biopsy samples of whale epidermis and fat—about the size of a human fingernail—to extract DNA as part of international efforts to save the whales.Open in a separate windowFigure 1Howard Rosenbaum with a crossbow to obtain skin samples from whales. © Wildlife Conservation Society.Like Rosenbaum, many conservation biologists and wildlife managers increasingly rely on DNA analysis tools to identify species, determine sex or analyse pedigrees. George Amato, Director of the Sackler Institute for Comparative Genomics at the American Museum of Natural History in New York, NY, USA, said that during his 25-year career, genetic tools have become increasingly important for conservation biology and related fields. Genetic information taken from individual animals to the extent of covering whole populations now plays a valuable part in making decisions about levels of protection for certain species or populations and managing conflicts between humans and conservation goals.[…] many conservation biologists and wildlife managers increasingly rely on DNA analysis tools to identify species, determine sex or analyse pedigreesMoreover, Amato expects the use and importance of genetics to grow even more, given that conservation of biodiversity has become a global issue. “My office overlooks Central Park. And there are conservation issues in Central Park: how do you maintain the diversity of plants and animals? I live in suburban Connecticut, where we want the highest levels of diversity within a suburban environment,” he said. “Then, you take this all the way to Central Africa. There are conservation issues across the entire spectrum of landscapes. With global climate change, techniques in genetics and molecular biology are being used to look at issues and questions across that entire landscape.”Rosenbaum commented, “The genomic revolution has certainly changed the way we think about conservation and the questions we can ask and the things we can do. It can be a forensic analysis.” The data translates “into a conservation value where governments, conservationists, and people who actively protect these species can use this information to better protect these animals in the wild.”“The genomic revolution has certainly changed the way we think about conservation […]”Rosenbaum and colleagues from the Wildlife Conservation Society, the American Museum of Natural History and other organizations used genomics for the largest study so far—based on more than 1,500 DNA samples—about the population dynamics of humpback whales in the Southern hemisphere [1]. 
The researchers analysed population structure and migration rates; they found the highest gene flow between whales that breed on either side of the African continent and a lower gene flow between whales on opposite sides of the Atlantic, from the Brazilian coast to southern Africa. The group also identified an isolated population of fewer than 200 humpbacks in the northern Indian Ocean off the Arabian Peninsula, which are only distantly related to the humpbacks breeding off the coast of Madagascar and the eastern coast of southern Africa. “This group is a conservation priority,” Rosenbaum noted.He said the US National Oceanographic and Atmospheric Administration is using this information to determine whether whale populations are recovering or endangered and what steps should be taken to protect them. Through wildlife management and protection, humpbacks have rebounded to 60,000 or more individuals from fewer than 5,000 in the 1960s. Rosenbaum''s data will, among other things, help to verify whether the whales should be managed as one large group or divided into subgroups.He has also been looking at DNA collected from dolphins caught in fishing nets off the coast of Argentina. Argentine officials will be using the data to make recommendations about managing these populations. “We''ve been able to demonstrate that it''s not one continuous population in Argentina. There might be multiple populations that merit conservation protection,” Rosenbaum explained.The sea turtle is another popular creature that is high on conservationists'' lists. To get DNA samples from sea turtles, population geneticist and wildlife biologist Nancy FitzSimmons from the University of Canberra in Australia reverts to a simpler method than Rosenbaum''s harpoon. “Ever hear of a turtle rodeo?” she asked. FitzSimmons goes out on a speed boat in the Great Barrier Reef with her colleagues, dives into the water and wrangles a turtle on board so it can be measured, tagged, have its reproductive system examined with a laparoscope and a skin tag removed with a small scissor or scalpel for DNA analysis (Fig 2).Open in a separate windowFigure 2Geneticist Stewart Pittard measuring a sea turtle. © Michael P. Jensen, NOAA.Like Rosenbaum, she uses DNA as a forensic tool to characterize individuals and populations [2]. “That''s been a really important part, to be able to tell people who are doing the management, ‘This population is different from that one, and you need to manage them appropriately,''” FitzSimmons explained. The researchers have characterized the turtle''s feeding grounds around Australia to determine which populations are doing well and which are not. If they see that certain groups are being harmed through predation or being trapped in ‘ghost nets'' abandoned by fishermen, conservation measures can be implemented.FitzSimmons, who started her career studying the genetics of bighorn sheep, has recently been using DNA technology in other areas, including finding purebred crocodiles to reintroduce them into a wetland ecosystem at Cat Tien National Park in Vietnam. “DNA is invaluable. You can''t reintroduce animals that aren''t purebred,” she said, explaining the rationale for looking at purebreds. “It''s been quite important to do genetic studies to make sure you''re getting the right animals to the right places.”Geneticist Hans Geir Eiken, senior researcher at the Norwegian Institute for Agricultural and Environmental Research in Svanvik, Norway, does not wrestle with the animals he is interested in. 
He uses a non-intrusive method to collect DNA from brown bears (Fig 3). “We collect the hair that is on the vegetation, on the ground. We can manage with only a single hair to get a DNA profile,” he said. “We can even identify mother and cub in the den based on the hairs. We can collect hairs from at least two different individuals and separate them afterwards and identify them as separate entities. Of course we also study how they are related and try to separate the bears into pedigrees, but that''s more research and it''s only occasionally that we do that for [bear] management.”Open in a separate windowFigure 3Bear management in Scandinavia. (A) A brown bear in a forest in Northern Finland © Alexander Kopatz, Norwegian Institute for Agricultural and Environmental Research. (B) Faecal sampling. Monitoring of bears in Norway is performed in a non-invasive way by sampling hair and faecal samples in the field followed by DNA profiling. © Hans Geir Eiken. (C) Brown-bear hair sample obtained by so-called systematic hair trapping. A scent lure is put in the middle of a small area surrounded by barbed wire. To investigate the smell, the bears have to cross the wire and some hair will be caught. © Hans Geir Eiken. (D) A female, 2.5-year-old bear that was shot at Svanvik in the Pasvik Valley in Norway in August 2008. She and her brother had started to eat from garbage cans after they left their mother and the authorities gave permission to shoot them. The male was shot one month later after appearing in a schoolyard. © Hans Geir Eiken.Eiken said the Norwegian government does not invest a lot of money on helicopters or other surveillance methods, and does not want to not bother the animals. “A lot of disturbing things were done to bears. They were trapped. They were radio-collared,” he said. “I think as a researcher we should replace those approaches with non-invasive genetic techniques. We don''t disturb them. We just collect samples from them.”Eiken said that the bears pose a threat to two million sheep that roam freely around Norway. “Bears can kill several tons of them everyday. This is not the case in the other countries where they don''t have free-ranging sheep. That''s why it''s a big economic issue for us in Norway.” Wildlife managers therefore have to balance the fact that brown bears are endangered against the economic interests of sheep owners; about 10% of the brown bears are killed each year because they have caused damage, or as part of a restricted ‘licensed'' hunting programme. Eiken said that within two days of a sheep kill, DNA analysis can determine which species killed the sheep, and, if it is a bear, which individual. “We protect the females with cubs. Without the DNA profiles, it would be easy to kill the females, which also take sheep of course.”Wildlife managers […] have to balance the fact that brown bears are endangered against the economic interests of sheep owners…It is not only wildlife management that interests Eiken; he was part of a group led by Axel Janke at the Biodiversity and Climate Research Centre in Frankfurt am Main, Germany, which completed sequencing of the brown bear genome last year. The genome will be compared with that of the polar bear in the hope of finding genes involved in environmental adaptation. “The reason why [the comparison is] so interesting between the polar bear and the brown bear is that if you look at their evolution, it''s [maybe] less than one million years when they separated. In genetics that''s not a very long time,” Eiken said. 
“But there are a lot of other issues that we think are even more interesting. Brown bears stay in their caves for 6 months in northern Norway. We think we can identify genes that allow the bear to be in the den for so long without dying from it.”Like bears, wolves have also been clashing with humans for centuries. Hunters exterminated the natural wolf population in the Scandinavian Peninsula in the late nineteenth century as governments protected reindeer farming in northern Scandinavia. After the Swedish government finally banned wolf hunting in the 1960s, three wolves from Finland and Russia immigrated in the 1980s, and the population rose to 250, along with some other wolves that joined the highly inbred population. Sweden now has a database of all individual wolves, their pedigrees and breeding territories to manage the population and resolve conflicts with farmers. “Wolves are very good at causing conflicts with people. If a wolf takes a sheep or cattle, or it is in a recreation area, it represents a potential conflict. If a wolf is identified as a problem, then the local authorities may issue a license to shoot that wolf,” said Staffan Bensch, a molecular ecologist and ornithologist at Lund University in Sweden.Again, it is the application of genomics tools that informs conservation management for the Scandinavian wolf population. Bensch, who is best known for his work on population genetics and genomics of migratory songbirds, was called to apply his knowledge of microsatellite analysis. The investigators collect saliva from the site where a predator has chewed or bitten the prey, and extract mitochondrial DNA to determine whether a wolf, a bear, a fox or a dog has killed the livestock. The genetic information potentially can serve as a death warrant if a wolf is linked with a kill, and to determine compensation for livestock owners.The genetic information potentially can serve as a death warrant if a wolf is linked with a kill…Yet, not all wolves are equal. “If it''s shown to be a genetically valuable wolf, then somehow more damage can be tolerated, such as a wolf taking livestock for instance,” Bensch said. “In the management policy, there is genetic analysis of every wolf that has a question on whether it should be shot or saved. An inbred Scandinavian wolf has no valuable genes so it''s more likely to be shot.” Moreover, Bensch said that DNA analysis showed that in at least half the cases, dogs were the predator. “There are so many more dogs than there are wolves,” he said. “Some farmers are prejudiced that it is the wolf that killed their sheep.”According to Dirk Steinke, lead scientist at Marine Barcode of Life and an evolutionary biologist at the Biodiversity Institute of Ontario at the University of Guelph in Canada, DNA barcoding could also contribute to conservation efforts. The technique—usually based on comparing the sequence of the mitochondrial CO1 gene with a database—could help to address the growing trade in shark fins for wedding feasts in China and among the Chinese diaspora, for example. Shark fins confiscated by Australian authorities from Indonesian ships are often a mess of tissue; barcoding helps them to identify the exact species. “As it turns out, some of them are really in the high-threat categories on the IUCN Red List of Threatened Species, so it was pretty concerning,” Steinke said. “That is something where barcoding turns into a tool where wildlife management can be done—even if they only get fragments of an animal. 
I am not sure if this can prevent people from hunting those animals, but you can at least give them the feedback on whether they did something illegal or not.”Steinke commented that DNA tools are handy not only for megafauna, but also for the humbler creatures in the sea, “especially when it comes to marine invertebrates. The larval stages are the only ones where they are mobile. If you''re looking at wildlife management from an invertebrate perspective in the sea, then these mobile life stages are very important. Their barcoding might become very handy because for some of those groups it''s the only reliable way of knowing what you''re looking at.” Yet, this does not necessarily translate into better conservation: “Enforcement reactions come much quicker when it''s for the charismatic megafauna,” Steinke conceded.“Enforcement reactions come much quicker when it''s for the charismatic megafauna”Moreover, reliable identification of animal species could even improve human health. For instance, Amato and colleagues from the US Centers for Disease Control and Prevention demonstrated for the first time the presence of zoonotic viruses in non-human primates seized in American airports [3]. They identified retroviruses (simian foamy virus) and/or herpesviruses (cytomegalovirus and lymphocryptovirus), which potentially pose a threat to human health. Amato suggested that surveillance of the wildlife trade by using barcodes would help facilitate prevention of disease. Moreover, DNA barcoding could also show whether the meat itself is from monkeys or other wild animals to distinguish illegally hunted and traded bushmeat—the term used for meat from wild animals in Africa—from legal meats.Amato''s group also applied barcoding to bluefin tuna, commonly used in sushi, which he described as the “bushmeat of the developed world”, as the species is being driven to near extinction through overharvesting. Developing barcodes for tuna could help to distinguish bluefin from yellowfin or other tuna species and could assist measures to protect the bluefin. “It can be used sort of like ‘Wildlife CSI'' (after the popular American TV series),” he said.As helpful as these technologies are […] they are not sufficient to protect severely threatened species…In fact, barcoding for law enforcement is growing. Mitchell Eaton, assistant unit leader at the US Geological Survey New York Cooperative Fish and Wildlife Research Unit in Ithaca, NY, USA, noted that the technique is being used by US government agencies such as the FDA and the US Fish & Wildlife Service, as well as African and South American governments, to monitor the illegal export of pets and bushmeat. It is also used as part of the United Nations'' Convention on Biological Diversity for cataloguing the Earth''s biodiversity, identifying pathogens and monitoring endangered species. He expects that more law enforcement agencies around the world will routinely apply these tools: “This is actually easy technology to use.”In that way, barcoding as well as genetics and its related technologies help to address a major problem in conservation and protection measures: to monitor the size, distribution and migration of populations of animals and to analyse their genetic diversity. It gives biologists and conservations a better picture of what needs extra protective measures, and gives enforcement agencies a new and reliable forensic tool to identify and track illegal hunting and trade of protected species. 
As helpful as these technologies are, however, they are not sufficient to protect severely threatened species such as the bluefin tuna and are therefore not a substitute for more political action and stricter enforcement.  相似文献   
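For readers unfamiliar with how a CO1 “barcode” identifies a confiscated sample, the following minimal sketch matches a query fragment against a toy reference set using naive sequence identity. Real workflows align queries against curated databases such as BOLD; the species entries and sequences below are hypothetical, heavily truncated placeholders, not real barcode data.

```python
# Minimal sketch of the CO1 barcoding idea described above: match a query
# sequence against a small reference set and report the closest species.
# A production pipeline would use proper alignment against a full database;
# here a simple per-position identity score is used purely to illustrate.

REFERENCE_BARCODES = {  # hypothetical, truncated CO1 fragments
    "Thunnus thynnus (bluefin tuna)":      "ACTTGGTGCATGAGCAGGAATAGT",
    "Thunnus albacares (yellowfin tuna)":  "ACTTGGTGCCTGAGCCGGAATAGT",
    "Carcharhinus longimanus (shark)":     "ACCCTATATTTAATCTTCGGTGCA",
}

def identity(a: str, b: str) -> float:
    """Fraction of identical positions over the shared length (no alignment)."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a[:n], b[:n])) / n

def best_match(query: str):
    """Return the reference species with the highest identity to the query."""
    scores = {sp: identity(query, ref) for sp, ref in REFERENCE_BARCODES.items()}
    species = max(scores, key=scores.get)
    return species, scores[species]

if __name__ == "__main__":
    confiscated_fragment = "ACTTGGTGCATGAGCAGGAATAGT"  # hypothetical query
    species, score = best_match(confiscated_fragment)
    print(f"Closest reference: {species} ({score:.0%} identity)")
```

In practice a match is only trusted above a similarity threshold and with adequate reference coverage for the group in question, which is why curated reference libraries matter as much as the matching step itself.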

18.
It is not currently possible to measure the real-world thought process that a child has while observing an actual school lesson. However, if it could be done, children's neural processes would presumably be predictive of what they know. Such neural measures would shed new light on children's real-world thought. Toward that goal, this study examines neural processes that are evoked naturalistically, during educational television viewing. Children and adults all watched the same Sesame Street video during functional magnetic resonance imaging (fMRI). Whole-brain intersubject correlations between the neural timeseries from each child and a group of adults were used to derive maps of “neural maturity” for children. Neural maturity in the intraparietal sulcus (IPS), a region with a known role in basic numerical cognition, predicted children's formal mathematics abilities. In contrast, neural maturity in Broca's area correlated with children's verbal abilities, consistent with prior language research. Our data show that children's neural responses while watching complex real-world stimuli predict their cognitive abilities in a content-specific manner. This more ecologically natural paradigm, combined with the novel measure of “neural maturity,” provides a new method for studying real-world mathematics development in the brain.  相似文献
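To make the “neural maturity” measure concrete, here is a minimal sketch (not the authors' code) of one common way such an intersubject correlation is computed: z-score each voxel's timeseries and correlate a child's data with the adult group-average timeseries, voxel by voxel. The array shapes, the choice to average the adults before correlating, and all variable names are assumptions for illustration only.

```python
import numpy as np

def neural_maturity(child_ts, adult_ts_group):
    """Per-voxel intersubject correlation between one child and the adult mean.

    child_ts       : array of shape (n_timepoints, n_voxels)
    adult_ts_group : array of shape (n_adults, n_timepoints, n_voxels)
    Returns an array of shape (n_voxels,), one Pearson r per voxel.
    """
    adult_mean = adult_ts_group.mean(axis=0)  # (n_timepoints, n_voxels)

    def zscore(x):
        # Standardize each voxel's timeseries over time (population std).
        return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-12)

    c = zscore(child_ts)
    a = zscore(adult_mean)

    # With z-scored series, the mean of their product over time is Pearson r.
    return (c * a).mean(axis=0)

# Toy example with random data: 10 adults, 200 timepoints, 5000 voxels.
rng = np.random.default_rng(0)
adults = rng.standard_normal((10, 200, 5000))
child = rng.standard_normal((200, 5000))
maturity_map = neural_maturity(child, adults)
print(maturity_map.shape)  # (5000,)
```

In a study like the one summarized above, such a per-voxel map would then be restricted to regions of interest (e.g., IPS or Broca's area) and related to each child's behavioural scores.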

19.
Cândido Godói (CG) is a small municipality in South Brazil with approximately 6,000 inhabitants. It is known as the “Twins' Town” due to its high rate of twin births. Recently it was claimed that this high frequency of twinning is connected to experiments performed by the German Nazi doctor Joseph Mengele. It is known, however, that the town was founded by a small number of families, so a genetic founder effect may represent an alternative explanation for the high twinning prevalence in CG. In this study, we tested specific predictions of the “Nazi's experiment” and the “founder effect” hypotheses. We surveyed a total of 6,262 baptism records from 1959–2008 in CG Catholic churches, and identified 91 twin pairs and one triplet. Contrary to the “Nazi's experiment” hypothesis, there is no spurt in twinning during the years (1964–1968) when Mengele allegedly was in CG (P = 0.482). Moreover, there is no temporal trend for a declining rate of twinning since the 1960s (P = 0.351), and no difference in twinning among CG districts when comparing two periods: 1927–1958 and 1959–2008 (P = 0.638). On the other hand, the “founder effect” hypothesis is supported by an isonymy analysis showing that women who gave birth to twins have a higher inbreeding coefficient than women who never had twins (0.0148 and 0.0081, respectively, P = 0.019). In summary, our results show no evidence for the “Nazi's experiment” hypothesis and strongly suggest that the “founder effect” hypothesis is a much more likely explanation for the high prevalence of twinning in CG. If this hypothesis is correct, then this community represents a valuable population in which genetic factors linked to twinning may be identified.  相似文献
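The temporal comparisons in this abstract come down to contingency-table tests on counts of twin versus singleton maternities. The sketch below illustrates that idea with a chi-square test; the row totals echo the reported 6,262 records and 92 multiple maternities, but the split across periods is a hypothetical placeholder, not the study's data, and the exact test the authors used is not specified here.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts (NOT the study's data): multiple maternities vs. other
# baptism records, split into the alleged Mengele window and all other years.
counts = [
    [9, 605],     # 1964-1968: twin/triplet maternities, other records
    [83, 5565],   # remaining years covered by the baptism records
]

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p:.3f}")

# The authors report P = 0.482 for the real records; a non-significant result
# of this kind gives no support to a birth spurt confined to the alleged visit.
```

The same table-based logic extends to the district-by-period comparison, while the isonymy result relies on surname-based inbreeding coefficients rather than counts of births.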
