Similar literature
20 similar documents retrieved.
1.
The authors of “The anglerfish deception” respond to the criticism of their article.EMBO reports (2012) advanced online publication; doi: 10.1038/embor.2012.70EMBO reports (2012) 13 2, 100–105; doi: 10.1038/embor.2011.254Our respondents, eight current or former members of the EFSA GMO panel, focus on defending the EFSA''s environmental risk assessment (ERA) procedures. In our article for EMBO reports, we actually focused on the proposed EU GMO legislative reform, especially the European Commission (EC) proposal''s false political inflation of science, which denies the normative commitments inevitable in risk assessment (RA). Unfortunately the respondents do not address this problem. Indeed, by insisting that Member States enjoy freedom over risk management (RM) decisions despite the EFSA''s central control over RA, they entirely miss the relevant point. This is the unacknowledged policy—normative commitments being made before, and during, not only after, scientific ERA. They therefore only highlight, and extend, the problem we identified.The respondents complain that we misunderstood the distinction between RA and RM. We did not. We challenged it as misconceived and fundamentally misleading—as though only objective science defined RA, with normative choices cleanly confined to RM. Our point was that (i) the processes of scientific RA are inevitably shaped by normative commitments, which (ii) as a matter of institutional, policy and scientific integrity must be acknowledged and inclusively deliberated. They seem unaware that many authorities [1,2,3,4] have recognized such normative choices as prior matters, of RA policy, which should be established in a broadly deliberative manner “in advance of risk assessment to ensure that [RA] is systematic, complete, unbiased and transparent” [1]. This was neither recognized nor permitted in the proposed EC reform—a central point that our respondents fail to recognize.In dismissing our criticism that comparative safety assessment appears as a ‘first step'' in defining ERA, according to the new EFSA ERA guidelines, which we correctly referred to in our text but incorrectly referenced in the bibliography [5], our respondents again ignore this widely accepted ‘framing'' or ‘problem formulation'' point for science. The choice of comparator has normative implications as it immediately commits to a definition of what is normal and, implicitly, acceptable. Therefore the specific form and purpose of the comparison(s) is part of the validity question. Their claim that we are against comparison as a scientific step is incorrect—of course comparison is necessary. This simply acts as a shield behind which to avoid our and others'' [6] challenge to their self-appointed discretion to define—or worse, allow applicants to define—what counts in the comparative frame. Denying these realities and their difficult but inevitable implications, our respondents instead try to justify their own particular choices as ‘science''. First, they deny the first-step status of comparative safety assessment, despite its clear appearance in their own ERA Guidance Document [5]—in both the representational figure (p.11) and the text “the outcome of the comparative safety assessment allows the determination of those ‘identified'' characteristics that need to be assessed [...] and will further structure the ERA” (p.13). 
Second, despite their claims to the contrary, ‘comparative safety assessment'', effectively a resurrection of substantial equivalence, is a concept taken from consumer health RA, controversially applied to the more open-ended processes of ERA, and one that has in fact been long-discredited if used as a bottleneck or endpoint for rigorous RA processes [7,8,9,10]. The key point is that normative commitments are being embodied, yet not acknowledged, in RA science. This occurs through a range of similar unaccountable RA steps introduced into the ERA Guidance, such as judgement of ‘biological relevance'', ‘ecological relevance'', or ‘familiarity''. We cannot address these here, but our basic point is that such endless ‘methodological'' elaborations of the kind that our EFSA colleagues perform, only obscure the institutional changes needed to properly address the normative questions for policy-engaged science.Our respondents deny our claim concerning the singular form of science the EC is attempting to impose on GM policy and debate, by citing formal EFSA procedures for consultations with Member States and non-governmental organizations. However, they directly refute themselves by emphasizing that all Member State GM cultivation bans, permitted only on scientific grounds, have been deemed invalid by EFSA. They cannot have it both ways. We have addressed the importance of unacknowledged normativity in quality assessments of science for policy in Europe elsewhere [11]. However, it is the ‘one door, one key'' policy framework for science, deriving from the Single Market logic, which forces such singularity. While this might be legitimate policy, it is not scientific. It is political economy.Our respondents conclude by saying that the paramount concern of the EFSA GMO panel is the quality of its science. We share this concern. However, they avoid our main point that the EC-proposed legislative reform would only exacerbate their problem. Ignoring the normative dimensions of regulatory science and siphoning-off scientific debate and its normative issues to a select expert panel—which despite claiming independence faces an EU Ombudsman challenge [12] and European Parliament refusal to discharge their 2010 budget, because of continuing questions over conflicts of interests [13,14]—will not achieve quality science. What is required are effective institutional mechanisms and cultural norms that identify, and deliberatively address, otherwise unnoticed normative choices shaping risk science and its interpretive judgements. It is not the EFSA''s sole responsibility to achieve this, but it does need to recognize and press the point, against resistance, to develop better EU science and policy.  相似文献   

2.
3.
4.
Rinaldi A (2012) EMBO reports 13(4): 303–307
Scientists and journalists try to engage the public with exciting stories, but who is guilty of overselling research and what are the consequences?

Scientists love to hate the media for distorting science or getting the facts wrong. Even as they do so, they court publicity for their latest findings, which can bring a slew of media attention and public interest. Getting your research into the national press can result in great boons in terms of political and financial support. Conversely, when scientific discoveries turn out to be wrong, or to have been hyped, the negative press can have a damaging effect on careers and, perhaps more importantly, the image of science itself. Walking the line between ‘selling’ a story and ‘hyping’ it far beyond the evidence is no easy task. Professional science communicators work carefully with scientists and journalists to ensure that the messages from research are translated for the public accurately and appropriately. But when things do go wrong, is it always the fault of journalists, or are scientists and those they employ to communicate sometimes equally to blame?

Hyping in science has existed since the dawn of research itself. When scientists relied on the money of wealthy benefactors with little expertise to fund their research, the temptation to claim that they could turn lead into gold, or that they could discover the secret of eternal life, must have been huge. In the modern era, hyping of research tends to make less exuberant claims, but it is no less damaging and no less deceitful, even if sometimes unintentionally so. A few recent cases have brought this problem to the surface again.

The most frenzied of these was the report in Science last year that a newly isolated bacterial strain could replace phosphate with arsenate in cellular constituents such as nucleic acids and proteins [1]. The study, led by NASA astrobiologist Felisa Wolfe-Simon, showed that a new strain of the Halomonadaceae family of halophilic proteobacteria, isolated from the alkaline and hypersaline Mono Lake in California (Fig 1), could not only survive in arsenic-rich conditions, such as those found in its original environment, but even thrive by using arsenic entirely in place of phosphorus. “The definition of life has just expanded. As we pursue our efforts to seek signs of life in the solar system, we have to think more broadly, more diversely and consider life as we do not know it,” commented Ed Weiler, NASA’s associate administrator for the Science Mission Directorate at the agency’s Headquarters in Washington, in the original press release [2].

Figure 1. Sunrise at Mono Lake. Mono Lake, located in eastern California, is bounded to the west by the Sierra Nevada mountains. This ancient alkaline lake is known for unusual tufa (limestone) formations rising from the water’s surface (shown here), as well as for its hypersalinity and high concentrations of arsenic. See Wolfe-Simon et al [1]. Credit: Henry Bortman.

The accompanying “search for life beyond Earth” and “alternative biochemistry makeup” hints contained in the same release were lapped up by the media, which covered the breakthrough with headlines such as “Arsenic-loving bacteria may help in hunt for alien life” (BBC News), “Arsenic-based bacteria point to new life forms” (New Scientist), “Arsenic-feeding bacteria find expands traditional notions of life” (CNN). 
However, it did not take long for criticism to manifest, with many scientists openly questioning whether background levels of phosphorus could have fuelled the bacteria''s growth in the cultures, whether arsenate compounds are even stable in aqueous solution, and whether the tests the authors used to prove that arsenic atoms were replacing phosphorus ones in key biomolecules were accurate. The backlash was so bitter that Science published the concerns of several research groups commenting on the technical shortcomings of the study and went so far as to change its original press release for reporters, adding a warning note that reads “Clarification: this paper describes a bacterium that substitutes arsenic for a small percentage of its phosphorus, rather than living entirely off arsenic.”Microbiologists Simon Silver and Le T. Phung, from the University of Illinois, Chicago, USA, were heavily critical of the study, voicing their concern in one of the journals of the Federation of European Microbiological Societies, FEMS Microbiology Letters. “The recent online report in Science […] either (1) wonderfully expands our imaginations as to how living cells might function […] or (2) is just the newest example of how scientist-authors can walk off the plank in their imaginations when interpreting their results, how peer reviewers (if there were any) simply missed their responsibilities and how a press release from the publisher of Science can result in irresponsible publicity in the New York Times and on television. We suggest the latter alternative is the case, and that this report should have been stopped at each of several stages” [3]. Meanwhile, Wolfe-Simon is looking for another chance to prove she was right about the arsenic-loving bug, and Silver and colleagues have completed the bacterium''s genome shotgun sequencing and found 3,400 genes in its 3.5 million bases (www.ncbi.nlm.nih.gov/Traces/wgs/?val=AHBC01).“I can only comment that it would probably be best if one had avoided a flurry of press conferences and speculative extrapolations. The discovery, if true, would be similarly impressive without any hype in the press releases,” commented John Ioannidis, Professor of Medicine at Stanford University School of Medicine in the USA. “I also think that this is the kind of discovery that can definitely wait for a validation by several independent teams before stirring the world. It is not the type of research finding that one cannot wait to trumpet as if thousands and millions of people were to die if they did not know about it,” he explained. “If validated, it may be material for a Nobel prize, but if not, then the claims would backfire on the credibility of science in the public view.”Another instructive example of science hyping was sparked by a recent report of fossil teeth, dating to between 200,000 and 400,000 years ago, which were unearthed in the Qesem Cave near Tel Aviv by Israeli and Spanish scientists [4]. Although the teeth cannot yet be conclusively ascribed to Homo sapiens, Homo neanderthalensis, or any other species of hominid, the media coverage and the original press release from Tel Aviv University stretched the relevance of the story—and the evidence—proclaiming that the finding demonstrates humans lived in Israel 400,000 years ago, which should force scientists to rewrite human history. 
Were such evidence of modern humans in the Middle East so long ago confirmed, it would indeed clash with the prevailing view of human origin in Africa some 200,000 years ago and the dispersal from the cradle continent that began about 70,000 years ago. But, as freelance science writer Brian Switek has pointed out, “The identity of the Qesem Cave humans cannot be conclusively determined. All the grandiose statements about their relevance to the origin of our species reach beyond what the actual fossil material will allow” [5].An example of sensationalist coverage? “It has long been believed that modern man emerged from the continent of Africa 200,000 years ago. Now Tel Aviv University archaeologists have uncovered evidence that Homo sapiens roamed the land now called Israel as early as 400,000 years ago—the earliest evidence for the existence of modern man anywhere in the world,” reads a press release from the New York-based organization, American Friends of Tel Aviv University [6].“The extent of hype depends on how people interpret facts and evidence, and their intent in the claims they are making. Hype in science can range from ‘no hype'', where predictions of scientific futures are 100% fact based, to complete exaggeration based on no facts or evidence,” commented Zubin Master, a researcher in science ethics at the University of Alberta in Edmonton, Canada. “Intention also plays a role in hype and the prediction of scientific futures, as making extravagant claims, for example in an attempt to secure funds, could be tantamount to lying.”Are scientists more and more often indulging in creative speculation when interpreting their results, just to get extraordinary media coverage of their discoveries? Is science journalism progressively shifting towards hyping stories to attract readers?“The vast majority of scientific work can wait for some independent validation before its importance is trumpeted to the wider public. Over-interpretation of results is common and as scientists we are continuously under pressure to show that we make big discoveries,” commented Ioannidis. “However, probably our role [as scientists] is more important in making sure that we provide balanced views of evidence and in identifying how we can question more rigorously the validity of our own discoveries.”“The vast majority of scientific work can wait for some independent validation before its importance is trumpeted to the wider public”Stephanie Suhr, who is involved in the management of the European XFEL—a facility being built in Germany to generate intense X-ray flashes for use in many disciplines—notes in her introduction to a series of essays on the ethics of science journalism that, “Arguably, there may also be an increasing temptation for scientists to hype their research and ‘hit the headlines''” [7]. In her analysis, Suhr quotes at least one instance—the discovery in 2009 of the Darwinius masillae fossil, presented as the missing link in human evolution [8]—in which the release of a ‘breakthrough'' scientific publication seems to have been coordinated with simultaneous documentaries and press releases, resulting in what can be considered a study case for science hyping [7].Although there is nothing wrong in principle with a broad communication strategy aimed at the rapid dissemination of a scientific discovery, some caveats exist. “[This] strategy […] might be better applied to a scientific subject or body of research. 
When applied to a single study, there [is] a far greater likelihood of engaging in unmerited hype with the risk of diminishing public trust or at least numbing the audience to claims of ‘startling new discoveries’,” wrote science communication expert Matthew Nisbet in his Age of Engagement blog (bigthink.com/blogs/age-of-engagement) about how media communication was managed in the Darwinius affair. “[A]ctivating the various channels and audiences was the right strategy but the language and metaphor used strayed into the realm of hype,” Nisbet, who is an Associate Professor in the School of Communication at American University, Washington DC, USA, commented in his post [9]. “We are ethically bound to think carefully about how to go beyond the very small audience that follows traditional science coverage and think systematically about how to reach a wider, more diverse audience via multiple media platforms. But in engaging with these new media platforms and audiences, we are also ethically bound to avoid hype and maintain accuracy and context” [9].

But the blame for science hype cannot be laid solely at the feet of scientists and press officers. Journalists must take their fair share of reproach. “As news online comes faster and faster, there is an enormous temptation for media outlets and journalists to quickly publish topics that will grab the readers’ attention, sometimes at the cost of accuracy,” Suhr wrote [7]. Of course, the media landscape is extremely varied, as science blogger and writer Bora Zivkovic pointed out. “There is no unified thing called ‘Media’. There are wonderful specialized science writers out there, and there are beat reporters who occasionally get assigned a science story as one of several they have to file every day,” he explained. “There are careful reporters, and there are those who tend to hype. There are media outlets that value accuracy above everything else; others that put beauty of language above all else; and there are outlets that value speed, sexy headlines and ad revenue above all.”

One notable example of media-sourced hype comes from J. Craig Venter’s announcement in the spring of 2010 of the first self-replicating bacterial cell controlled by a synthetic genome (Fig 2). A major media buzz ensued, over-emphasizing and somewhat distorting an already remarkable scientific achievement. Press coverage ranged from the extremes of announcing ‘artificial life’ to saying that Venter was playing God, adding to cultural and bioethical tension the warning that synthetic organisms could be turned into biological weapons or cause environmental disasters.

Figure 2. Schematic depicting the assembly of a synthetic Mycoplasma mycoides genome in yeast. For details of the construction of the genome, please see the original article. From Gibson et al [13] Science 329, 52–56. Reprinted with permission from AAAS.

“The notion that scientists might some day create life is a fraught meme in Western culture. One mustn’t mess with such things, we are told, because the creation of life is the province of gods, monsters, and practitioners of the dark arts. 
Thus, any hint that science may be on the verge of putting the power of creation into the hands of mere mortals elicits a certain discomfort, even if the hint amounts to no more than distorted gossip,” remarked Rob Carlson, who writes on the future role of biology as a human technology, about the public reaction and the media frenzy that arose from the news [10].Yet the media can also behave responsibly when faced with extravagant claims in press releases. Fiona Fox, Chief Executive of the Science Media Centre in the UK, details such an example in her blog, On Science and the Media (fionafox.blogspot.com). The Science Media Centre''s role is to facilitate communication between scientists and the press, so they often receive calls from journalists asking to be put in touch with an expert. In this case, the journalist asked for an expert to comment on a story about silver being more effective against cancer than chemotherapy. A wild claim; yet, as Fox points out in her blog, the hype came directly from the institution''s press office: “Under the heading ‘A silver bullet to beat cancer?'' the top line of the press release stated that ‘Lab tests have shown that it (silver) is as effective as the leading chemotherapy drug—and may have far fewer side effects.'' Far from including any caveats or cautionary notes up front, the press office even included an introductory note claiming that the study ‘has confirmed the quack claim that silver has cancer-killing properties''” [11]. Fox praises the majority of the UK national press that concluded that this was not a big story to cover, pointing out that, “We''ve now got to the stage where not only do the best science journalists have to fight the perverse news values of their news editors but also to try to read between the lines of overhyped press releases to get to the truth of what a scientific study is really claiming.”…the concern is that hype inflates public expectations, resulting in a loss of trust in a given technology or research avenue if promises are not kept; however, the premise is not fully provenYet, is hype detrimental to science? In many instances, the concern is that hype inflates public expectations, resulting in a loss of trust in a given technology or research avenue if promises are not kept; however, the premise is not fully proven (Sidebar A). “There is no empirical evidence to suggest that unmet promises due to hype in biotechnology, and possibly other scientific fields, will lead to a loss of public trust and, potentially, a loss of public support for science. Thus, arguments made on hype and public trust must be nuanced to reflect this understanding,” Master pointed out.

Sidebar A | Up and down the hype cycle

Although hype is usually considered a negative and largely unwanted aspect of scientific and technological communication, it cannot be denied that emphasizing, at least initially, the benefits of a given technology can further its development and use. From this point of view, hype can be seen as a normal stage of technological development, within certain limits. The maturity, adoption and application of specific technologies apparently follow a common trend pattern, described by the information technology company, Gartner, Inc., as the ‘hype cycle’. The idea is based on the observation that, after an initial trigger phase, novel technologies pass through a peak of over-excitement (or hype), often followed by a subsequent general disenchantment, before eventually coming under the spotlight again and reaching a stable plateau of productivity. Thus, hype cycles “[h]ighlight overhyped areas against those that are high impact, estimate how long technologies and trends will take to reach maturity, and help organizations decide when to adopt” (www.gartner.com).

“Science is a human endeavour and as such it is inevitably shaped by our subjective responses. Scientists are not immune to these same reactions and it might be valuable to evaluate the visibility of different scientific concepts or technologies using the hype cycle,” commented Pedro Beltrao, a cellular biologist at the University of California San Francisco, USA, who runs the Public Rambling blog (pbeltrao.blogspot.com) about bioinformatics science and technology. The exercise of placing technologies in the context of the hype cycle can help us to distinguish between their real productive value and our subjective level of excitement, Beltrao explained. “As an example, I have tried to place a few concepts and technologies related to systems biology along the cycle’s axis of visibility and maturity [see illustration]. Using this, one could suggest that technologies like gene-expression arrays or mass-spectrometry have reached a stable productivity level, while the potential of concepts like personalized medicine or genome-wide association studies (GWAS) might be currently over-valued.”

Together with bioethicist colleague David Resnik, Master has recently highlighted the need for empirical research that examines the relationships between hype, public trust, and public enthusiasm and/or support [12]. Their argument proposes that studies on the effect of hype on public trust can be undertaken by using both quantitative and qualitative methods: “Research can be designed to measure hype through a variety of sources including websites, blogs, movies, billboards, magazines, scientific publications, and press releases,” the authors write. “Semi-structured interviews with several specific stakeholders including genetics researchers, media representatives, patient advocates, other academic researchers (that is, ethicists, lawyers, and social scientists), physicians, ethics review board members, patients with genetic diseases, government spokespersons, and politicians could be performed. Also, members of the general public would be interviewed” [12]. 
They also point out that such an approach to estimate hype and its effect on public enthusiasm and support should carefully define the public under study, as different publics might have different expectations of scientific research, and will therefore have different baseline levels of trust.

Ultimately, exaggerating, hyping or outright lying is rarely a good thing. Hyping science is detrimental to various degrees to all science communication stakeholders—scientists, institutions, journalists, writers, newspapers and the public. It is important that scientists take responsibility for their share of the hyping done and do not automatically blame the media for making things up or getting things wrong. Such discipline in science communication is increasingly important as science searches for answers to the challenges of this century. Increased awareness of the underlying risks of over-hyping research should help to balance the scientific facts with speculation on the enticing truths and possibilities they reveal. The real challenge lies in favouring such an evolved approach to science communication in the face of a rolling 24-hour news cycle, tight science budgets and the uncontrolled and uncontrollable world of the Internet.

Figure. The hype cycle for the life sciences. Pedro Beltrao’s view of the excitement–disappointment–maturation cycle of bioscience-related technologies and/or ideas. GWAS: genome-wide association studies. Credit: Pedro Beltrao.
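As a rough illustration of the hype-cycle idea described in Sidebar A, the short Python sketch below encodes the cycle's phases using the article's own phrasing (trigger, peak of over-excitement, disenchantment, plateau of productivity) and the example placements Beltrao mentions for a few systems-biology technologies. It is a toy representation of the concept only, not a tool used by anyone quoted here; the phase names and placements are paraphrased from the text.

```python
# Toy sketch of the 'hype cycle' described in Sidebar A (illustrative only).
# Phase names follow the article's wording; the placements paraphrase
# Pedro Beltrao's subjective reading of a few systems-biology technologies.

from enum import Enum


class HypePhase(Enum):
    TRIGGER = "initial trigger phase"
    PEAK_OF_HYPE = "peak of over-excitement (hype)"
    DISENCHANTMENT = "general disenchantment"
    PLATEAU_OF_PRODUCTIVITY = "stable plateau of productivity"


# Example placements mentioned in the article.
placements = {
    "gene-expression arrays": HypePhase.PLATEAU_OF_PRODUCTIVITY,
    "mass spectrometry": HypePhase.PLATEAU_OF_PRODUCTIVITY,
    "personalized medicine": HypePhase.PEAK_OF_HYPE,
    "genome-wide association studies (GWAS)": HypePhase.PEAK_OF_HYPE,
}

if __name__ == "__main__":
    for technology, phase in placements.items():
        print(f"{technology}: {phase.value}")
```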

5.
Wickson F, Wynne B (2012) EMBO reports 13(2): 100–105
A recent proposal to reform the EU''s policy on the use of genetically modified crops looks good at first sight, but there are dangers for science lurking in the background.Anglerfish are predators that live in the eternal darkness of the deep oceans and have a distinctive way of catching their prey. They use a long light-emitting filament that extends from their head to lure organisms in the darkness. Those attracted to the shimmering light and movement are then unwittingly caught in front of the anglerfish''s wide-open jaws. Such is the nature of the European Commission (EC)''s proposal for a new European Union (EU) policy on the regulation of genetically modified organisms (GMOs)—it looks alluring at first glance, but there are hidden dangers lurking in the background.After years of protracted conflict between the EC and several EU member states over the import of GM food and the use of GM crops in agriculture, a new regulatory approach to the approval and cultivation of GMOs is currently moving through the legislative process. In July 2010, the EC proposed the inclusion of a new article (Article 26b) in Directive 2001/18/EC that regulates the deliberate environmental release of GMOs. It would give member states autonomy to make their own decisions about cultivating GM crops, independently of EC authorizations (EC, 2010). However, member states would not be able to make such decisions on the grounds of scientific assessments of health and environmental risk because these are performed by the EU''s scientific advisory body, the European Food Safety Authority (EFSA). The EC''s rationale for this proposed policy change is to address the bitter resistance to GM crops in some member states and break the resulting long-standing regulatory and policy deadlock.In July 2011, the European Parliament (EP) overwhelmingly voted to endorse the principle of member-state freedom, but rejected the EC''s attempt to completely prevent member states from using scientific arguments to ban GMOs (Sidebar A). Whereas the EC wished to protect a centralized and singular voice of science for EU policy (namely the EFSA), the EP asserted that the different conditions across the EU could allow a rational scientific approach to reach different conclusions, especially on matters of environmental risk. In its amendments, the EP also implicitly accepted other points of criticism of the EFSA and EC processes; for example, that there are normative choices being made in EU GM policy, but under the false name of science.

Sidebar A | Development of the proposal for EU GM regulatory reform

4 December 2008: Council identifies areas for improvement in the European Union (EU) framework for authorizing genetically modified organisms (GMOs), including fuller environmental assessment and socio-economic appraisal.
2 March 2009: A Dutch proposal is made to the Environment Council (of EU Member State Ministers) that the decision to cultivate GM crops should be left to individual member states.
24 June 2009: A group of 13 member states requests that the European Commission (EC) give member states the freedom to decide on the cultivation of GM plants based on “relevant socio-economic aspects”.
3 September 2009: EC President José Manuel Barroso suggests “it should be possible to combine a Community authorisation system, based on science, with freedom for Member States to decide whether or not they wish to cultivate GM crops on their territory” (Barroso, 2009).
13 July 2010: In response to the Council of Ministers, the EC proposes amendments to Directive 2001/18/EC through the addition of Article 26b, allowing member states to restrict or prohibit GMO cultivation on grounds other than adverse effects to health and the environment.
September 2010: Ad hoc working party is established by COREPER (The Committee of Permanent Representatives of Member State Governments) to consider the EC’s proposal, taking into account the recommendation on coexistence.
7 September 2010: Delegates to the ad hoc working party raise concerns about the legality of the proposal within international trade law, as well as the need for enhanced clarity on the proposed acceptable grounds for member state restrictions of GMO cultivation.
27 September and 14 October 2010: Councils on Agriculture and Fisheries and Environment reiterate concerns of the COREPER working party and the opinion of the Council Legal Service is requested.
5 November 2010: Council Legal Service opinion concludes that the EC’s proposal might not be compatible with international treaties or with the General Agreement on Tariffs and Trade (GATT).
23 November 2010: Commission Services disagrees with legal service opinion and argues that the EC’s proposal is a way to ensure smooth functioning of the internal market in accordance with Article 114 of the EU Constitution—the 2009 Treaty of Lisbon—and that grounds other than ethics might be invoked; for example, public order or public interest to preserve cultural traditions, or ‘public morals’ as permitted under GATT.
8 December 2010: COREPER working party argues that a list of grounds that could be used by member states to restrict GMOs under the new proposal needs to be provided by the EC.
9 December 2010: EU Economic and Social Committee (2011) concludes that the proposal will “create more vagueness than certainty and could in practice result in a proliferation of (legally unstable) measures adopted by States” and also calls for more clearly specified grounds for restrictions.
8 February 2011: Commission Services (2011) release an open but not exhaustive list of possible reasons that could be invoked to restrict or prohibit GMO cultivation under the new proposal, including: public morals, public order, avoiding presence in other products, social policy objectives, land-use planning, cultural policy and general environmental policy objectives (other than assessment of adverse effects of GMOs on the environment) such as maintenance of certain types of landscape features, ecosystems or ecosystem services.
12 April 2011: European Parliament (EP) Environment Committee votes to submit to the full EP its amendments to the EC legislative proposal to include scientifically justified environmental impacts complementary to those assessed by the EFSA as legitimate grounds for member state restrictions or prohibitions. This includes prevention of pesticide resistance, invasiveness and/or biodiversity loss; maintenance of seed purity, local biodiversity, unviability of coexistence regimes, ecosystem and agricultural sustainability; and/or presence of persistent uncertainty through data absence or contradictions (Committee on the Environment, Public Health and Food Safety, 2011).
5 July 2011 (originally scheduled for 9 June): Parliament plenary vote on the EC’s proposal and the EP Environment Committee amendments. Large majority votes in favour of Environment Committee amendments (548 for, 84 against, 31 abstentions). This Parliamentary verdict goes to the Council of Ministers for agreement on a final legal schedule.

There are inherent dangers with the EC’s goal of pursuing a political and economic union for Europe that increasingly depends on claims about a unitary, singular, deterministic and independent quality to scientific risk analysis. We argue that such claims are confused, false and ultimately self-defeating, despite the honourable intent of the original reasons for moving towards political union.

In recent years, several EU member states have used the ‘safeguard clause’, Article 23 of EC Directive 2001/18, to ban the cultivation of GM crops in their territories, despite safety approvals from the EFSA. Article 23 allows ‘temporary’ prohibitions if there is new scientific knowledge indicating a potential risk to human health or the environment. However, the EFSA has assessed and declared that all such current prohibitions by member states lack sufficient scientific support and are therefore illegal under the original EC authorizations. Nonetheless, various member states uphold these bans, thereby formally violating European law and creating an escalating sense of crisis. This has seen the attempt to establish a centralized authority for the regulation of GMOs fall into disarray, as bans are met with EC legal threats, and these are met with further member state intransigence. Disagreement between the EC and member states has typically focused on the EFSA, which acts as scientific authority to its policy client, the EC’s Directorate-General for Consumer Health and Protection. The EFSA’s central responsibility for risk assessment effectively makes it the EC’s scientific authority for GM policy, and it is the risk science of the EFSA’s GM panel that has been publicly disputed in member states’ justifications of their Article 23 prohibitions.

In September 2009, EC President José Manuel Barroso urged a reconsideration of the EU constitutional principle of subsidiarity in GMO policy: “It should be possible to combine a Community authorisation system, based on science, with freedom for Member States to decide whether or not they wish to cultivate GM crops on their territory” (Barroso, 2009). In July 2010, the EC (2010) proposed amendments to Directive 2001/18 to create a formal basis for member states to restrict or prohibit the cultivation in their territory of GMOs authorized at the EU level. 
The EC proposal turns the existing situation on its head: instead of prohibitions only being permitted on the basis of potential risks to human health or the environment, the new proposal would allow bans only on “grounds other than those related to the assessment of the adverse effect on health and environment” (EC, 2010). The implication is that the EFSA adequately assesses health and environmental risks. Yet, this was, and remains, precisely the main issue for those member states that refuse to accept the scientific adequacy of EFSA authorizations.This separation of risk science from other concerns has been misinterpreted by some commentators as allowing EU member states to make “arbitrary” decisions, “without explanation” and “based on irrational criteria” (Sabalza et al, 2011). This ignores other rational grounds for decision-making—for example, socio-economic and/or ethical considerations. Moreover, it fails to recognize the contingencies that pervade risk assessment: that is, the possibility for divergent scientific assessments depending on different framing commitments, including the way such commitments define relevant factors, interpretive criteria and implicit burden-of-proof assumptions (Wynne, 1989; Stirling, 1998). The impossibility of separating scientific risk knowledge from normative questions, assumptions and commitments is neither a failing of that science nor those institutions. It is an unavoidable reality that needs to be addressed in an enlightened and accountable way.Disputed science is crucial in disagreements over GMOs, but the dispute is not limited to facts revealed by research. It is also about the normative commitments that scientists make and how these shape what are deemed to be salient and reliable facts; for example, the choices made concerning the relevant questions to ask, the appropriate methods to employ, the pertinent baselines for comparison and so on. The EC''s proposal embodies a confusion of risk science with an idealized model of pure scientific research unaffected by normative considerations, and which, therefore, supposedly speaks only in the singular voice of Nature. Thus the EC produces a framework that asserts that current scientific and regulatory institutions, namely the EFSA in this case, are sufficiently capable of exhaustively defining and assessing such risks in an impartial, objective and over-arching way. However, not only are there legitimate scientific differences in environmental, agronomic and health risk assessment situations across Europe, there are also unacknowledged social, ethical and political commitments embedded in the supposedly singular EC risk science (Brunk et al, 1991; EU, 2007). 
An unavoidable effect of this confusion is that member states'' legitimate differences with EFSA''s ‘science'' (which stands for EC–EU policy), are arbitrarily rendered ‘unscientific'' and illegitimate.The EC''s proposal embodies a confusion of risk science with an idealized model of pure scientific research…The conflation of risk, science and rationality into the combined position that risk represents the only legitimate ground for social concern, current scientific and regulatory institutions are capable of defining and assessing such risks in an impartial and objective way, and scientific risk assessment as performed by existing institutions is the only rational basis for decision-making, is arguably exactly the institutional mindset that has created the current paralysis in EU GMO regulation and policy, and therefore the need for reform. Thus the same mindset that created the paralysing conflict in the first place is informing the EC''s approach to revising legislation.At first sight, then, the EC''s proposal seems to be a positive move to accept different member state policies on GM cultivation, particularly as it includes socio-economic and/or ethical considerations as legitimate grounds for these. Closer examination, however, suggests that it might be a trap. The EC''s proposal attempts to create a rigid boundary between a supposed singular, objective, non-contingent and universal scientific knowledge on risk, and diverse ‘non-scientific'' social, ethical, religious and/or political concerns. This framing of a rigid division between the scientific and ‘non-scientific'', corresponding with a ‘rational (universal)'' over ‘irrational (local)'' standpoint, ignores that risk science is actually shaped by unacknowledged normative commitments and contingencies, which are manifested through uncertainty, ambiguity, indeterminacy and ignorance (Wynne, 1992; Brunk et al, 1991). This scientism as a form of politics undermines an enlightened, scientifically informed democratic cultureThe EC proposal draws on ideals of impartiality in research science (Daston & Galison, 2007; Lacey, 2005), but uses these to claim authority for what is a different knowledge culture, namely regulatory science (Jasanoff, 1990). The EC has been here before (Laurence & Wynne, 1989), and its stance seems to express that economic and political union is achievable through scientific authority—as if science can declare unionist policy ends as a revelation of Nature rather than as a reasonably argued but contestable human aim. This scientism as a form of politics undermines an enlightened, scientifically informed democratic culture.In April 2011, the Environment Committee of the EP recommended significant amendments to the EC''s proposal. These amendments allowed contextually variable definitions of environmental harm, recognized the intertwined character of nature and culture in agriculture, and acknowledged the significance of scientific uncertainties. In doing so, these proposed amendments permitted non-scientific as well as scientific reasons for bans by member states (Committee on the Environment, Public Health and Food Safety, 2011; Sidebar B). On 5 July 2011, the EP followed these recommendations and voted down the original EC proposal (EP, 2011).

Sidebar B | Legal text of the EC legislative proposal and EP amendments

Original wording of the European Commission proposal

Article 26b: “Member States may adopt measures restricting or prohibiting the cultivation of all or particular [genetically modified organisms, GMOs] […] in all or part of their territory, provided that:
(a) those measures are based on grounds other than those related to the assessment of the adverse effect on health and environment which might arise from the deliberate release or the placing on the market of GMOs”.

Amendments voted by the European Parliament

“Member States may adopt, after a case-by-case examination, measures restricting or prohibiting the cultivation of particular GMOs or of groups of GMOs defined by crop or trait or of all GMOs […] in all or part of their territory, provided that:
(a) those measures are based on
(i) duly justified grounds relating to local or regional environmental impacts which might arise from the deliberate release or placing on the market of GMOs, and which are complementary to the environmental impacts examined during the scientific assessment of the impacts on the environment conducted under Part C of this Directive [that is, by the EFSA]; or grounds relating to risk management. Those grounds may include:
  • the prevention of the development of pesticide resistance among weeds and pests;
  • the invasiveness or persistence of a GM variety, or the possibility of interbreeding with domestic cultivated or wild plants;
  • the prevention of negative impacts on the local environment caused by changes in agricultural practices linked to the cultivation of GMOs;
  • the maintenance and development of agricultural practices which offer a better potential to reconcile production with ecosystem sustainability;
  • the maintenance of local biodiversity, including certain habitats and ecosystems, or certain types of natural and landscape features;
  • the absence or lack of adequate data concerning the potential negative impacts of the release of GMOs on the local or regional environment of a Member State, including on biodiversity;
(ii) grounds relating to socio-economic impacts. Those grounds may include:
  • the impracticability or the high costs of coexistence measures or the impossibility of implementing coexistence measures due to specific geographical conditions such as small islands or mountain zones;
  • the need to protect the diversity of agricultural production; or
  • the need to ensure seed purity;
(iii) other grounds that may include land use, town and country planning, or other legitimate factors”.Other significant European Parliament amendments“(2a) The Commission and Member States should ensure, as a priority, the implementation of the Environment Council Conclusions adopted on 4 December 2008, namely a proper implementation of the legal requirements laid down in Annex II of Directive 2001/18/EC for the risk assessment of GMOs. In particular, the long-term environmental effects of GM crops, as well as their potential effects on non-target organisms, should be rigorously assessed; the characteristics of the receiving environments and the geographical areas in which GM plants may be cultivated should be duly taken into account; and the potential environmental consequences brought about by changes in the use of herbicides linked to herbicide-tolerant GM crops should be assessed. More specifically, the Commission should ensure that the new guidelines on GMO risk assessment are adopted. Those guidelines should not be based only on the principle of substantial equivalence or on the concept of a comparative safety assessment, and should make it possible to clearly identify direct and indirect long-term effects, as well as scientific uncertainties. The European Food Safety Authority (EFSA) and the Member States should aim to establish an extensive network of scientific organizations representing all disciplines including those relating to ecological issues, and should cooperate to identify at an early stage any potential divergence between scientific opinions with a view to resolving or clarifying the contentious scientific issues. The Commission and the Member States should ensure that the necessary resources for independent research on the potential risks of GMOs are secured, and that the enforcement of intellectual property rights does not prevent independent researchers from accessing all relevant material.”The justification for the ‘supplementary'' scientific reasons for member state bans was that environmental issues specific to national, regional or local conditions will require scientific data that might not be sufficiently addressed in a risk assessment at the European level. The EP amendments also stated that a lack of relevant information would constitute legitimate grounds for member states to restrict or prohibit the cultivation of GMOs. Other amendments also restate and reinforce the requirement to implement Directive 2001/18 (Sidebar B), for example, including specific legislative requirements to assess long-term risks. Although this legal requirement already exists under Directive 2001/18, complaints continue that long-term and cumulative risks from GMOs have never been adequately considered by the EFSA and consequent EC authorizations. Significantly, the EP also sought to change the legal basis for the proposed new regulation from Article 114 of the EU Lisbon Treaty, which focuses on the establishment of a single market, to Article 192, which grants member states responsibility for the conservation of fauna and flora, land-use or town and country planning. This change recognizes that GMO cultivation and agriculture are closely linked to issues of land-use, the conservation of flora and fauna, and biodiversity. 
The idea is that closed, expert resolution, using the disinterested and supposedly unitary voice of science, will deliver union by abstract revelation…Whereas the EP voted with a large majority to amend the EC''s proposal, the EC perspective remains alive and the EP''s recommended amendments are now undergoing political bargaining between the Council of Ministers, the EC and the EP. As one participant informed us, the EP version will not necessarily survive into final EU legislation as it is unlikely that the Council will support all of the amendments. If these negotiations fail to reach agreement, an interagency group with members from the EP and the Council of Ministers will prepare a common statement that will then be voted on again by both bodies. Although the EP''s proposed amendments are promising, political negotiation continues and the EC''s approach, which is fundamentally different from the EP''s, might well survive.While the content of any final EU legislative text remains speculative at this point, it is worth considering the EP''s amendments in the light of new EU guidelines for the environmental risk assessment of GM plants, which after a period of feedback and consultation were finalized in May 2011 (EFSA, 2011). It seems that many of the issues that the EP''s amendments outline as potentially valid grounds for member state restrictions or prohibitions already fall within the scope of the EFSA. This includes specific environmental risks, such as invasiveness or weed and pest resistance, and general issues such as the specificities of the receiving environment and the potential for changes in agricultural management practices. The EP also calls for the documentation of any scientific uncertainty and disagreement, which has been EU law, although unimplemented, for nine years.The more controversial that public issues involving science become, the more this aggrandisement of scientific risk assessment becomes appealingOne important and contested issue in the finalized guidelines of the EFSA (2011) is the introduction of a ‘Comparative Safety Assessment'' by the EFSA GM panel (ENSSER, 2011). If the GM crop under assessment passes this first step, no further questions need be asked. Given that comparative baselines are themselves a point of contention and are normative choices that affect the scientific appraisal, this makes the EC''s attempt to assert central singular scientific rule over any possible scientific criticism by member states even more problematic and potentially provocative.Therefore, if the EFSA follows its own guidelines for environmental risk assessment, the concrete issues described in the EP''s amendments could not be considered ‘complementary to'' those assessed by the EFSA. Should this happen, we would return to a situation in which the EFSA is deemed to sufficiently assess environmental risk, and in which member states would effectively be left with no formal basis to contest the quality or content of the agency''s assessment, and therefore could not prohibit or restrict GM cultivation on the basis of an alternative interpretation of the available science as defined by the EFSA. 
Consistent with our critical analysis of the EC''s proposal, such a final agreement would maintain most of the problematic framing that mistakenly defines ‘science'' performed by the EFSA and endorsed by the EC to be above and before any other normative commitments with respect to GMO risk assessment and policy.It is also worth noting that the very language of ‘complementary'' suggests that EFSA risk assessments might be incomplete, but it does not acknowledge that choices in risk-based science and/or particular framings of a risk assessment might be legitimately different or mutually exclusive between the EFSA and member states. This is true, for example, of the choices of ‘normal baseline'' comparators for defining harm, of protection goals, of the timescales during which to observe effects, of the chosen endpoints, of the relevant test material, or of the required weight of empirical evidence for defining adequate ‘proof'' of harm. The language of ‘complementary'' might be a pragmatic compromise; however, honesty might require ‘alternative'' and the wider corresponding political debate over the hidden normative issues that this deserves.The quality of the risk assessment process will inevitably remain an issue of debate, as will the significance of particular uncertainties or gaps in knowledge, owing to the inherently normative nature of risk-based choices and assumptions. Here, the key issues are to what extent local, regional and national environmental and social conditions and aims can remain valid grounds for member states'' prohibitions; to what extent scientific disagreement over the validity and quality of a risk assessment can be used as valid grounds for member state prohibitions; and whether the knowledge produced and used in risk research and assessment can be acknowledged to be fundamentally different from knowledge as scientific research. This latter admission would also expose the normative public questions that are currently hidden and promoted falsely in the name of science in the EC''s attempt to achieve a unified European policy authority.Recognizing the fundamentally different nature of risk-based regulatory science, particularly the way it makes normative choices and assumptions, requires that risk assessment as a policy tool include broad-based deliberative quality assurance (Wickson, 2009). This is recognized by the EP in its calls to fully involve member states, competent scientific bodies and other relevant stakeholders in the assessment process. Since the EP, in its amendments, also challenged the appropriateness of the EFSA guidelines to have assessment led by the discredited principle of substantial equivalence (or an undefined ‘comparative safety assessment''), it is important that the guidelines themselves also remain open for critical scrutiny.The EC''s constitutional role is that of technical administrator of EU policy. Fearing that the original ideal of political union could disintegrate, the EC has often attempted to sublimate political, institutional and cultural differences into purely technical idioms. The idea is that closed, expert resolution, using the disinterested and supposedly unitary voice of science, will deliver union by abstract revelation rather than by grounded and grinding political negotiation. The tendency to redefine problematic policy issues as scientific questions only, and to assume that public concerns can be correspondingly reduced to scientific unitary terms such as ‘risk'', is understandable. 
This typically accompanies the increased framing of policy issues as exclusively questions of risk. Although risk has long been identified as an ambiguous combination of both propositional knowledge and value-commitments (EU, 2007; Brunk et al, 1991; Wynne, 1989)—that is, it can only be scientifically defined by first choosing what is ‘at risk’ (worth protecting)—a scientific definition of public concerns has been abetted by policy officials who are anxious to avoid or mitigate political confrontation. The more controversial that public issues involving science become, the more this aggrandisement of scientific risk assessment becomes appealing (Wynne, 2001). We are not suggesting that risks are unimportant. Rather, it is that they do not have singular meaning or definition; nor are they the sole definition of public concerns or of public policy issues.

By seeking to establish a clean and final boundary between the social and scientific aspects of decision-making on GMO cultivation, the EC fails to acknowledge and confront the normative dimensions embedded in risk-based science for policy. This discredits the good name of science and costs it public support. The fact that the EP, led by its Environment Committee, has challenged this particular proposal is significant, but will do little to change matters unless the final legislative document incorporates the suggested EP amendments. More broadly, we argue that the EC’s background assumption of apolitical scientific sovereignty over rationally supported policy difference needs to be acknowledged, and altered.

Although the precise form of the legislative amendments voted by the EP remains to be seen, the opening up of the EC’s misconceived assertion could be directly pursued through the establishment of more inclusive and plural knowledge assessment processes, under a different understanding of the kinds of knowledge in play. Such developments would increase transparency in political decision-making and its scientific justifications, rather than hiding these under a false mantle of objective, singular and uncontestable science. Science for policy cannot be rendered more accountable without also making accountable the policy processes that mutually shape and deploy that science (Jasanoff, 2004). Pursuing this would, however, require a different vision and practice of European political union; one not based on a habit of false scientific determination (Waterton & Wynne, 1996).

Permitting diverse European policy options on GM cultivation and rationally diverse grounds for their justification could improve the potential for a resilient EU socio-agricultural policy portfolio. However, the cultivation of resilient diversity within European agriculture and its supporting sciences depends on the crucial question of whether GM can indeed coexist with alternative approaches, technologies and imaginaries. The scientific and political challenges associated with this, however, are a whole new kettle of fish.

6.
Perry JN  Arpaia S  Bartsch D  Kiss J  Messéan A  Nuti M  Sweet JB  Tebbe CC 《EMBO reports》2012,13(6):481-2; author reply 482-3
The correspondents argue that “The anglerfish deception” contains omissions, errors, misunderstandings and misinterpretations.EMBO reports (2012) advanced online publication; doi: 10.1038/embor.2012.71EMBO reports (2012) 13 2, 100–105; doi: 10.1038/embor.2011.254The commentary [1] on aspects of genetically modified organism (GMO) regulation, risk assessment and risk management in the EU contains omissions, errors, misunderstandings and misinterpretations. As background, environmental risk assessment (ERA) of genetically modified (GM) plants for cultivation in the EU is conducted by applicants following principles and data requirements described in the Guidance Document (ERA GD) established by the European Food Safety Authority (EFSA) [2], which follows the tenets of Directive 2001/18/EC. The ERA GD was not referenced in [1], which wrongly referred only to EFSA guidance that does not cover ERA. Applications for cultivation of a GM plant containing the ERA, submitted to the European Commission (EC), are checked by the EFSA to ensure they address all the requirements specified in its ERA GD [2]. A lead Member State (MS) is then appointed to conduct the initial evaluation of the application, requesting further information from the applicant if required. The MS evaluation is forwarded to the EC, EFSA and all other MSs. Meanwhile, all other MSs can comment on the application and raise concerns. The EFSA GMO Panel carefully considers the content of the application, the lead MS Opinion, other MSs'' concerns, all relevant data published in the scientific literature, and the applicant''s responses to its own requests for further information. The Panel then delivers its Opinion on the application, which covers all the potential environmental areas of risk listed in 2001/18/EC. This Opinion is sent to the EC, all MSs and the applicant and published in the EFSA journal (efsa.europa.eu). Panel Opinions on GM plants for cultivation consider whether environmental harm might be caused, and, if so, suggest possible management to mitigate these risks, and make recommendations for post-market environmental monitoring (PMEM). The final decision on whether to allow the cultivation of GM plants, and any specific conditions for management and monitoring, rests with the EC and MSs and is not within the remit of the EFSA.Against this background we respond to several comments in [1]. Regarding the Comparative Safety Assessment of GM plants and whether or not further questions are asked following this assessment, the Comparative Safety Assessment, described fully in [2], is not a ‘first step''. It is a general principle that forms a central part of the ERA process, as introduced in section 2.1 of [2]. Each ERA starts with problem formulation and identification, facilitating a structured approach to identifying potential risks and scientific uncertainties; following this critical first step many further questions must be asked and addressed. In [2] it is clearly stated that all nine specific areas of risk listed in 2001/18/EC must be addressed—persistence and invasiveness; vertical gene flow; horizontal gene flow; interactions with target organisms; interactions with non-target organisms; human health; animal health; biogeochemical processes; cultivation, management and harvesting techniques. 
Under the Comparative Safety Assessment, following problem formulation, each of these areas of risk must be assessed by using a six-step approach, involving hazard identification, hazard characterization, exposure assessment, risk characterization, risk management strategies and an overall risk evaluation and conclusion. Indeed, far from asking “no further questions” [1], the EFSA GMO Panel always sends a sequence of written questions to the applicant as part of the ERA process to achieve a complete set of data to support the ERA evaluation (on average about ten per application).The principle of comparative analysis in ERA—sometimes referred to as substantial equivalence in the risk assessment of food and feed—is not discredited. The comparative approach is supported by all of the world''s leading national science academies [for example, 3]; none has recommended an alternative. The principle is enshrined in risk assessment guidelines issued by all relevant major international bodies, including the World Health Organization, the Food and Agriculture Organization of the United Nations and the Organisation for Economic Co-operation and Development. Critics of this approach have failed to propose any credible alternative baseline to risk assess GMOs. The comparative analysis as described in [2] is not a substitute for a safety assessment, but is a tool within the ERA [4] through which comparisons are made with non-GM counterparts in order to identify hazards associated with the GM trait, the transformation process and the associated management systems, which are additional to those impacts associated with the non-GM plant itself. The severity and frequency of these hazards are then quantified in order to assess the levels of risks associated with the novel features of the GM plant and its cultivation.European Parliament (EP) communications include that “the characteristics of the receiving environments and the geographical areas in which GM plants may be cultivated should be duly taken into account”. We agree, and the ERA GD [2] recognizes explicitly that receiving environments differ across the EU, and that environmental impacts might differ regionally. Therefore, the ERA GD [2] demands that such differences be fully accounted for in cultivation applications and that receiving environments be assessed separately in each of the nine specific areas of risk (see section 2.3.2). Furthermore, [2] states in section 3.5 that the ERA should consider scenarios representative of the diversity of situations that might occur and assess their potential implications. The EP communications state that “the long-term environmental effects of GM crops, as well as their potential effects on non-target organisms, should be rigorously assessed”. This is covered explicitly in section 2.3.4 of [2], and developed in the recent guidance on PMEM [5].The EFSA is committed to openness, transparency and dialogue and meets regularly with a wide variety of stakeholders including non-governmental organizations (NGOs) [6] to discuss GMO topics. That the EFSA is neither a centralized nor a singular voice of science in the EU is clear, because the initial report on the ERA is delivered by a MS, not the EFSA; all MSs can comment on the ERA; and EFSA GMO Panel Opinions respond transparently to every concern raised by each MS. Following publication, the EFSA regularly attends the SCFCAH Committee (comprising MS representatives) to account for its Opinions. 
The involvement of all MSs in the evaluation process ensures that concerns relating to their environments are addressed in the ERA. Subsequently, MSs can contribute to decisions on the management and monitoring of GM plants in their territories if cultivation is approved.In recent years, several MSs have used the ‘safeguard clause'', Article 23 of 2001/18/EC, to attempt to ban the cultivation of specific GM plants in their territories, despite earlier EFSA Panel Opinions on those plants. But the claim that “the risk science of the EFSA''s GM Panel has been publicly disputed in Member State''s justifications of their Article 23 prohibitions” needs to be placed into context [1]. When a safeguard clause (SC) is issued by a MS, the EFSA GMO Panel is often asked by the EC to deliver an Opinion on the scientific basis of the SC. The criteria on which to judge the documentation accompanying a SC are whether: (i) it represents new scientific evidence—and is not just repetition of information previously assessed—that demonstrates a risk to human and animal health and the environment; and (from the guidance notes to Annex II of 2001/18/EC) (ii) it is proportionate to the level of risk and to the level of uncertainty. It is pertinent that on 8 September 2011, the EU Court of Justice ruled that ‘with a view to the adoption of emergency measures, Article 34 of Regulation (EC) No 1829/2003 requires Member States to establish, in addition to urgency, the existence of a situation which is likely to constitute a clear and serious risk to human health, animal health or the environment''. Scientific literature is monitored continually by the Panel and relevant new work is examined to determine whether it raises any new safety concern. In all cases where the EFSA was consulted by the EC, there has been no new scientific information presented that would invalidate the Panel''s previous assessment.Throughout [1] the text demonstrates a fundamental misunderstanding of the distinction between ERA and risk management. ERA is the responsibility of the EFSA, although it is asked for its opinion on risk management methodology by the EC. Risk management implementation is the responsibility of the EC and MSs. Hence, the setting of protection goals is an issue for risk managers and might vary between MSs. However, the ERA GD [2], through its six-step approach, makes it mandatory for applications to relate the results of any studies directly to limits of environmental concern that reflect protection goals and the level of change deemed acceptable. Indeed, the recent EFSA GMO Panel Opinions on Bt-maize events [for example, 7] have been written specifically to provide MSs and risk managers with the tools to adapt the results of the quantified ERA to their own local protection goals. This enables MSs to implement risk management and PMEM proportional to the risks identified in their territories.The EFSA GMO Panel comprises independent researchers, appointed for their expertise following an open call to the scientific community. The Panel receives able support from staff of the EFSA GMO Unit and numerous ad hoc members of its working groups. It has no agenda and is neither pro- or anti-GMOs; its paramount concern is the quality of the science underpinning its Guidance Documents and Opinions.  相似文献   

7.
8.
The debate about GM crops in Europe holds valuable lessons about risk management and risk communication. These lessons will be helpful for the upcoming debate on GM animals.Biomedical research and biotechnology have grown enormously in the past decades, as nations have heavily invested time and money in these endeavours to reap the benefits of the so-called ‘bioeconomy''. Higher investments on research should increase knowledge, which is expected to translate into applied research and eventually give rise to new products and services that are of economic or social benefit. Many governments have developed ambitious strategies—both economic and political—to accelerate this process and fuel economic growth (http://www.oecd.org/futures/bioeconomy/2030). However, it turns out that social attitudes are a more important factor for translating scientific advances than previously realized; public resistance can effectively slow down or even halt technological progress, and some hoped-for developments have hit roadblocks. Addressing these difficulties has become a major challenge for policy-makers, who have to find the middle ground between promoting innovation and addressing ethical and cultural values.There are many examples of how scientific and technological advances raise broad societal concerns: research that uses human embryonic stem cells, nanotechnology, cloning and genetically modified (GM) organisms are perhaps the most contested ones. The prime example of a promising technology that has failed to reach its full potential owing to ethical, cultural and societal concerns is GM organisms (GMOs); specifically, GM crops. Intense lobbying and communication by ‘anti-GM'' groups, combined with poor public relations from industry and scientists, has turned consumers against GM crops and has largely hampered the application of this technology in most European countries. Despite this negative outcome, however, the decade-long debate has provided important lessons and insight for the management of other controversial technologies: in particular, the use of GM animals.During the early 1990s, ‘anti-GM'' non-governmental organizations (NGOs) and ‘pro-GM'' industry were the main culprits for the irreversible polarization of the GMO debate. Both groups lobbied policy-makers and politicians, but NGOs ultimately proved better at persuading the public, a crucial player in the debate. Nevertheless, the level of public outcry varied significantly, reaching its peak in the European Union (EU). In addition to the values of citizens and effective campaigning by NGOs, the structural organization of the EU had a crucial role in triggering the GMO crisis. Within the EU, the European Commission (EC) is an administrative body the decisions of which have a legal impact on the 27 Member States. The EC is well-aware of its unique position and has compensated its lack of democratic accountability by increasing transparency and making itself accessible to the third sector [1]. This strategy was an important factor in the GMO debate as the EC was willing to listen to the views of environmental groups and consumer organizations.…it turns out that social attitudes are a more important factor for translating scientific advances than previously realized…Environmental NGOs successfully exploited this gap between the European electorate and the EC, and assumed to speak as the vox populi in debates. At the same time, politicians in EU Member States were faced with aggressive anti-GMO campaigns and increasingly polarized debates. 
To avoid the lobbying pressure and alleviate public concerns, they chose to hide behind science: the result was a proliferation of 'scientific committees' charged with assessing the health and environmental risks of GM crops.

Scientists soon realized that their so-called 'expert consultation' was only a political smoke screen in most cases. Their reports and advice were used as arguments to justify policies—rather than tools for determining policy—that sometimes ignored the actual evidence and scientific results [2,3]. For example, in 2008, French President Nicolas Sarkozy announced that he would not authorize GM pest-resistant MON810 maize for cultivation in France if 'the experts' had any concerns over its safety. However, although the scientific committee appointed to assess MON810 concluded that the maize was safe for cultivation, the government's version of the report eventually claimed that scientists had "serious doubts" about MON810 safety, which was then used as an argument to ban its cultivation. François Hollande's government has adopted a similar strategy to maintain the ban on MON810 [4].

In addition to the values of citizens and effective campaigning by NGOs, the structural organization of the EU had a crucial role in triggering the GMO crisis

Such unilateral decisions by Member States challenged the EC's authority to approve the cultivation of GM crops in the EU. After intense discussions, the EC and the Member States agreed on a centralized procedure for the approval of GMOs and the distribution of responsibilities for the three stages of the risk management process: risk assessment, risk management and risk communication (Fig 1). The European Food Safety Authority (EFSA) alone would be responsible for carrying out risk assessment, whilst the Member States would deal with risk management through the standard EU comitology procedure, by which policy-makers from Member States reach consensus on existing laws. Finally, both the EC and Member States committed to engage with European citizens in an attempt to gain credibility and promote transparency.

Figure 1: Risk assessment and risk management for GM crops in the EU. The new process for GM crop approval under Regulation (EC) No. 1829/2003, which defines the responsibilities for risk assessment and risk management. EC, European Community; EU, European Union; GM, genetically modified.

More than 20 years after this debate, the claims made both for and against GM crops have failed to materialize. GMOs have neither reduced world hunger nor destroyed entire ecosystems or poisoned humankind, even after widespread cultivation. Most of the negative effects have occurred in international food trade [5], partly owing to a lack of harmonization in international governance. More importantly, given that the EU is the largest commodity market in the world, this is caused by the EU's chronic resistance to GM crops. The agreed centralized procedure has not been implemented satisfactorily and the blame is laid at the door of risk management (http://ec.europa.eu/food/food/biotechnology/evaluation/index_en.htm). Indeed, the 27 Member States have never reached a consensus on GM crops, which is the only non-functional comitology procedure in the EU [2].
Moreover, even after a GM crop was approved, some member states refused to allow its cultivation, which prompted the USA, Canada and Argentina to file a dispute at the World Trade Organization (WTO) against the EU.The inability to reach agreement through the comitology procedure, has forced the EC to make the final decision for all GMO applications. Given that the EC is an administrative body with no scientific expertise, it has relied heavily on EFSA''s opinion. This has created a peculiar situation in which the EFSA performs both risk assessment and management. Anti-GM groups have therefore focused their efforts on discrediting the EFSA as an expert body. Faced with regular questions related to agricultural management or globalization, EFSA scientists are forced to respond to issues that are more linked to risk management than risk assessment [5]. By repeatedly mixing socio-economic and cultural values with scientific opinions, NGOs have questioned the expertise of EFSA scientists and portrayed them as having vested interests in GMOs.Nevertheless, there is no doubt that science has accumulated enormous knowledge on GM crops, which are the most studied crops in human history [6]. In the EU alone, about 270 million euros have been spent through the Framework Programme to study health and environmental risks [5]. Framework Programme funding is approved by Member State consensus and benefits have never been on the agenda of these studies. Despite this bias in funding, the results show that GM crops do not pose a greater threat to human health and the environment than traditional crops [5,6,7]. In addition, scientists have reached international consensus on the methodology to perform risk assessment of GMOs under the umbrella of the Codex Alimentarius [8]. One might therefore conclude that the scientific risk assessment is solid and, contrary to the views of NGOs, that science has done its homework. However, attention still remains fixed on risk assessment in an attempt to fix risk management. But what about the third stage? Have the EC and Member States done their homework on risk communication?It is generally accepted that risk management in food safety crucially depends on efficient risk communication [9]. However, risk communication has remained the stepchild of the three risk management stages [6]. A review of the GM Food/Feed Regulations noted that public communication by EU authorities had been sparse and sometimes inconsistent between the EC and Member States. Similarly, a review of the EC Directive for the release of GMOs to the environment described the information provided to the public as inadequate because it is highly technical and only published in English (http://ec.europa.eu/food/food/biotechnology/evaluation/index_en.htm). Accordingly, it is not surprising that EU citizens remain averse to GMOs. Moreover, a Eurobarometer poll lists GMOs as one of the top five environmental issues for which EU citizens feel they lack sufficient information [10]. Despite the overwhelming proliferation of scientific evidence, politicians and policy-makers have ignored the most important stakeholder: society. Indeed, the reviews mentioned above recommend that the EC and Member States should improve their risk communication activities.What have we learned from the experience? Is it prudent and realistic to gauge the public''s views on a new technology before it is put into use? Can we move towards a bioeconomy and continue to ignore society? 
To address these questions, we focus on GM animals, as these organisms are beginning to reach the market, raise many similar issues to GM plants and thus have the potential to re-open the GM debate. GM animals, if brought into use, will involve a similar range and distribution of stakeholders in the EU, with two significant differences: animal welfare organizations will probably take the lead over environmental NGOs in the anti-GM side, and the breeding industry is far more cautious in adopting GM animals than the plant seed industry was to adopt GM crops [11].It is generally accepted that risk management in food safety crucially depends on efficient risk communicationGloFish®—a GM fish that glows when illuminated with UV light and is being sold as a novelty pet—serves as an illustrative example. GloFish® was the first GM animal to reach the market and, more importantly, did so without any negative media coverage. It is also a controversial application of GM technology, as animal welfare organizations and scientists alike consider it a frivolous use of GM, describing it as “complete nonsense” [18]. The GloFish® is not allowed in the EU, but it is commercially available throughout the USA, except in California. One might imagine that consumers in general would not be that interested in GloFish®, as research indicates that consumer acceptance of a new product is usually higher when there are clear perceived benefits [13,14]. It is difficult to imagine the benefit of GloFish® beyond its novelty, and yet it has been found illegally in the Netherlands, Germany and the UK [15]. This highlights the futility of predicting the public''s views without consulting them.Consumer attitudes and behaviour—including in regard to GMOs—are complex and change over time [13,14]. During the past years, the perception from academia and governments of the public has moved away from portraying them as a ‘victim'' of industry towards recognizing consumers as an important factor for change. Still, such arguments put citizens at the end of the production chain where they can only exert their influence by choosing to buy or to ignore certain products. Indeed, one of the strongest arguments against GM crops has been that the public never asked for them in the first place.With GM animals, the use of recombinant DNA technologies in animal breeding would rekindle an old battle between animal welfare organizations and the meat industryWith GM animals, the use of recombinant DNA technologies in animal breeding would rekindle an old battle between animal welfare organizations and the meat industry. Animal welfare organizations claim that European consumers demand better treatment for farm animals, whilst industry maintains that price remains one of the most important factors for consumers [12]. Both sides have facts to support their claims: animal welfare issues take a prominent role in the political agenda and animal welfare organizations are growing in both number and influence; industry can demonstrate a competitive disadvantage over countries in which animal welfare regulations are more relaxed and prices are lower, such as Argentina. However, the public is absent in this debate.Consumers have been described as wearing two hats: one that supports animal welfare and one that looks at the price ticket at the supermarket [16]. This situation has an impact on the breeding of livestock and the meat industry, which sees consumer prices decreasing whilst production costs increase. 
This trend is believed to reflect the increasing detachment of consumers from the food production chain [17]. Higher demands on animal welfare standards, environmental protection and competing international meat producers all influence the final price of meat. To remain competitive, the meat industry has to increase production per unit; it can therefore be argued that one of the main impetuses to develop GM animals was created by the behaviour—not belief—of consumers. This second example illustrates once again that society cannot be ignored when discussing any strategy to move towards the bioeconomy.

The EU's obsession with assessing risk and side-lining benefits has not facilitated an open dialogue

In conclusion, we believe that functional risk management requires all three components, including risk communication. For applications of biotechnology, a disproportionate amount of emphasis has been placed on risk assessment. The result is that the GMO debate has been framed as black and white, as either safe or unsafe, leaving policy-makers with the difficult task of educating the public about the many shades of grey. However, there is a wide range of issues that a citizen will want to take into account when deciding about GM, and not all of them can be answered by science. Citizens might trust what scientists say, but "when scientists and politicians are brought together, we may well not trust that the quality of science will remain intact" [18]. Reducing the debate to scientific matters alone gives a free pass to the misuse of science and has a negative impact on science itself. Whilst scientists publishing pro-GM results have been attacked by NGOs, scientific publications that highlighted potential risks of GM crops came under disproportionate attacks from the scientific community [19].

Flexible governance and context need to work hand-in-hand if investments in biotechnology are ultimately to benefit society. The EU's obsession with assessing risk and side-lining benefits has not facilitated an open dialogue. The GMO experience has also shown that science cannot provide all the answers. Democratically elected governments should therefore take the lead in communicating the risks and benefits of technological advances to their electorate, and should discuss what the bioeconomy really means and the role of new technologies, including GMOs. We need to move the spotlight away from the science alone to take in the bigger picture. Ultimately, do consumers feel that paying a few extra cents for a dozen eggs is worth it if they know the chicken is happy, whether it is so-called 'natural' or GM?

Núria Vàzquez-Salat
Louis-Marie Houdebine  相似文献

9.
Direct-to-consumer genetic tests and population genome research challenge traditional notions of privacy and consentThe concerns about genetic privacy in the 1990s were largely triggered by the Human Genome Project (HGP) and the establishment of population biobanks in the following decade. Citizens and lawmakers were worried that genetic information on people, or even subpopulations, could be used to discriminate or stigmatize. The ensuing debates led to legislation both in Europe and the USA to protect the privacy of genetic information and prohibit genetic discrimination.Notions of genetic determinism have also been eroded as population genomics research has discovered a plethora of risk factors that offer only probabilistic value…Times have changed. The cost of DNA sequencing has decreased markedly, which means it will soon be possible to sequence individual human genomes for a few thousand dollars. Notions of genetic determinism have also been eroded as population genomics research has discovered a plethora of risk factors that offer only probabilistic value for predicting disease. Nevertheless, there are several increasingly popular internet genetic testing services that do offer predictions to consumers of their health risks on the basis of genetic factors, medical history and lifestyle. Also, not to be underestimated is the growing popularity of social networks on the internet that expose the decline in traditional notions of the privacy of personal information. It was only a matter of time until all these developments began to challenge the notion of genetic privacy.For instance, the internet-based Personal Genome Project asks volunteers to make their personal, medical and genetic information publicly available so as, “to advance our understanding of genetic and environmental contributions to human traits and to improve our ability to diagnose, treat, and prevent illness” (www.personalgenomes.org). The Project, which was founded by George Church at Harvard University, has enrolled its first 10 volunteers and plans to expand to 100,000. Its proponents have proclaimed the limitations, if not the death, of privacy (Lunshof et al, 2008) and maintain that, under the principle of veracity, their own personal genomes will be made public. Moreover, they have argued that in a socially networked world there can be no total guarantee of confidentiality. Indeed, total protection of privacy is increasingly unrealistic in an era in which direct-to-consumer (DTC) genetic testing is offered on the internet (Lee & Crawley, 2009) and forensic technologies can potentially ‘identify'' individuals in aggregated data sets, even if their identity has been anonymized (Homer et al, 2008).Since the start of the HGP in the 1990s, personal privacy and the confidentiality of genetic information have been important ethical and legal issues. Their ‘regulatory'' expression in policies and legislation has been influenced by both genetic determinism and exceptionalism. Paradoxically, there has been a concomitant emergence of collaborative and international consortia conducting genomics research on populations. These consortia openly share data, on the premise that it is for public benefit. 
These developments require a re-examination of an ‘ethics of scientific research'' that is founded solely on the protection and rights of the individual.… total protection of privacy is increasingly unrealistic in an era in which direct-to-consumer (DTC) genetic testing is offered on the internetAlthough personalized medicine empowers consumers and democratizes the sharing of ‘information'' beyond the data sharing that characterizes population genomics research (Kaye et al, 2009), it also creates new social groups based on beliefs of common genetic susceptibility and risk (Lee & Crawley, 2009). The increasing allure of DTC genetic tests and the growth of online communities based on these services also challenges research in population genomics to provide the necessary scientific knowledge (Yang et al, 2009). The scientific data from population studies might therefore lend some useful validation to the results from DTC, as opposed to the probabilistic ‘harmful'' information that is now provided to consumers (Ransohoff & Khoury, 2010; Action Group on Erosion, Technology and Concentration, 2008). Population data clearly erodes the linear, deterministic model of Mendelian inheritance, in addition to providing information on inherited risk factors. The socio-demographic data provided puts personal genetic risk factors in a ‘real environmental'' context (Knoppers, 2009).Thus, beginning with a brief overview of the principles of data sharing and privacy under both population and consumer testing, we will see that the notion of identifiability is closely linked to the definition of what constitutes ‘personal'' information. It is against this background that we need to examine the issue of consumer consent to online offers of genetic tests that promise whole-genome sequencing and analysis. Moreover, we also demonstrate the need to restructure ethical reviews of genetic research that are not part of classical clinical trials and that are non-interventionist, such as population studies.The HGP heralded a new open access approach under the Bermuda Principles of 1996: “It was agreed that all human genomic sequence information, generated by centres funded for large-scale human sequencing, should be freely available and in the public domain in order to encourage research and development and to maximise its benefit to society” (HUGO, 1996). Reaffirmed in 2003 under the Fort Lauderdale Rules, the premise was that, “the scientific community will best be served if the results of community resource projects are made immediately available for free and unrestricted use by the scientific community to engage in the full range of opportunities for creative science” (HUGO, 2003). The international Human Genome Organization (HUGO) played an important role in achieving this consensus. Its Ethics Committee considered genomic databases as “global public goods” (HUGO Ethics Committee, 2003). The value of this information—based on the donation of biological samples and health information—to realize the benefits of personal genomics is maximized through collaborative, high-quality research. Indeed, it could be argued that, “there is an ethical imperative to promote access and exchange of information, provided confidentiality is protected” (European Society of Human Genetics, 2003). 
This promotion of data sharing culminated in a recent policy on releasing research data, including pre-publication data (Toronto International Data Release Workshop, 2009).There is room for improvement in both the personal genome and the population genome endeavoursIn its 2009 Guidelines for Human Biobanks and Genetic Research Databases, the Organization for Economic Cooperation and Development (OECD) states that the “operators of the HBGRD [Human Biobanks and Genetic Research Databases] should strive to make data and materials widely available to researchers so as to advance knowledge and understanding.” More specifically, the Guidelines propose mechanisms to ensure the validity of access procedures and applications for access. In fact, they insist that access to human biological materials and data should be based on “objective and clearly articulated criteria [...] consistent with the participants'' informed consent”. Access policies should be fair, transparent and not inhibit research (OECD, 2009).In parallel to such open and public science was the rise of privacy protection, particularly when it concerns genetic information. The United Nations Educational, Scientific and Cultural Organization''s (UNESCO) 2003 International Declaration on Human Genetic Data (UNESCO, 2003) epitomizes this approach. Setting genetic information apart from other sensitive medical or personal information, it mandated an “express” consent for each research use of human genetic data or samples in the absence of domestic law, or, when such use “corresponds to an important public interest reason”. Currently, however, large population genomics infrastructures use a broad consent as befits both their longitudinal nature as well as their goal of serving future unspecified scientific research. The risk is that ethics review committees that require such continuous “express” consents will thereby foreclose efficient access to data in such population resources for disease-specific research. It is difficult for researchers to provide proof of such “important public interest[s]” in order to avoid reconsents.Personal information itself refers to identifying and identifiable information. Logically, a researcher who receives a coded data set but who does not have access to the linking keys, would not have access to ‘identifiable'' information and so the rules governing access to personal data would not apply (Interagency Advisory Panel on Research Ethics, 2009; OHRP, 2008). In fact, in the USA, such research is considered to be on ‘non-humans'' and, in the absence of institutional rules to the contrary, it would theoretically not require research ethics approval (www.vanderbilthealth.com/main/25443).… the ethics norms that govern clinical research are not suited for the wide range of data privacy and consent issues in today''s social networks and bioinformatics systemsNevertheless, if the samples or data of an individual are accessible in more than one repository or on DTC internet sites, a remote possibility remains that any given individual could be re-identified (Homer et al, 2008). To prevent the restriction of open access to public databases, owing to the fear of re-identifiability, a more reasonable approach is necessary; “[t]his means that a mere hypothetical possibility to single out the individual is not enough to consider the persons as ‘identifiable''” (Data Protection Working Party, 2007). 
This is a proportionate and important approach because fundamental genomic ‘maps'' such as the International HapMap Project (www.hapmap.org) and the 1000 Genomes project (www.1000genomes.org) have stated as their goal “to make data as widely available as possible to further scientific progress” (Kaye et al, 2009). What then of the nature of the consent and privacy protections in DTC genetic testing?The Personal Genome Project makes the genetic and medical data of its volunteers publicly available. Indeed, there is a marked absence of the traditional confidentiality and other protections of the physician–patient relationship across such sites; overall, the degree of privacy protection by commercial DTC and other sequencing enterprises varies. The company 23andMe allows consumers to choose whether they wish to disclose personal information, but warns that disclosure of personal information is also possible “through other means not associated with 23andMe, […] to friends and/or family members […] and other individuals”. 23andMe also announces that it might enter into commercial or other partnerships for access to its databases (www.23andme.com). deCODEme offers tiered levels of visibility, but does not grant access to third parties in the absence of explicit consumer authorization (www.decodeme.com). GeneEssence will share coded DNA samples with other parties and can transfer or sell personal information or samples with an opt-out option according to their Privacy Policy, though the terms of the latter can be changed at any time (www.geneessence.com). Navigenics is transparent: “If you elect to contribute your genetic information to science through the Navigenics service, you allow us to share Your Genetic Data and Your Phenotype Information with not-for-profit organizations who perform genetic or medical research” (www.navigenics.com). Finally, SeqWright separates the personal information of its clients from their genetic information so as to avoid access to the latter in the case of a security breach (www.seqwright.com).Much has been said about the lack of clinical utility and validity of DTC genetic testing services (Howard & Borry, 2009), to say nothing of the absence of genetic counsellors or physicians to interpret the resulting probabilistic information (Knoppers & Avard, 2009; Wright & Kroese, 2010). But what are the implications for consent and privacy considering the seemingly divergent needs of ensuring data sharing in population projects and ‘protecting'' consumer-citizens in the marketplace?At first glance, the same accusations of paternalism levelled at ethics review committees who hesitate to respect the broad consent of participants in population databases could be applied to restraining the very same citizens from genetic ‘info-voyeurism'' on the internet. But, it should be remembered that citizen empowerment, which enables their participation both in population projects and in DTC, is expressed within very different contexts. Population biobanks, by the very fact of their broad consent and long-term nature, have complex security systems and are subject to governance and ongoing ethical monitoring and review. In addition, independent committees evaluate requests for access (Knoppers & Abdul-Rahman, 2010). The same cannot be said for the governance of the DTC companies just presented.There is room for improvement in both the personal genome and the population genome endeavours. 
The former require regulatory approaches to ensure the quality, safety, security and utility of their services. The latter require further clarification of their ongoing funding and operations and more transparency to the public as researchers begin to access these resources for disease-specific studies (Institute of Medicine, 2009). Public genomic databases should be interoperable and grant access to authenticated researchers internationally in order to be of utility and statistical significance (Burton et al, 2009). Moreover, enabling international access to such databases for disease-specific research means that the interests of publicly funded research and privacy protection must be weighed against each other, rather than imposing a requirement that research has to demonstrate that the public interest substantially outweighs privacy protection (Weisbrot, 2009). Collaboration through interoperability has been one of the goals of the Public Population Project in Genomics (P3G; www.p3g.org) and, more recently, of the Biobanking and Biomolecular Resources Research Infrastructure (www.bbmri.eu).

Even if the tools for harmonization and standardization are built and used, will trans-border data flow still be stymied by privacy concerns? The mutual recognition between countries of privacy-equivalent approaches—that is, safe harbour—the limiting of access to approved researchers, and the development of international best practices in privacy, security and transparency through a Code of Conduct, along with a system for penalizing those who fail to respect such norms, would go some way towards maintaining public trust in genomic and genetic research (P3G Consortium et al, 2009). Finally, consumer protection agencies should monitor DTC sites under a regulatory regime, to ensure that these companies adhere to their own privacy policies.

… genetic information is probabilistic and participating in population or on-line studies may not create the fatalistic and harmful discriminatory scenarios originally perceived or imagined

More importantly, in both contexts, the ethics norms that govern clinical research are not suited for the wide range of data privacy and consent issues in today's social networks and bioinformatics systems. One could go further and ask whether the current biomedical ethics review system is inadequate—if not inappropriate—in these 'data-driven research' contexts. Perhaps it is time to create ethics review and oversight systems that are particularly adapted for those citizens who seek either to participate through online services or to contribute to population research resources. Both are contexts of minimal risk and require structural governance reforms rather than the application of traditional ethics consent and privacy review processes that are more suited to clinical research involving drugs or devices. In this information age, genetic information is probabilistic, and participating in population or online studies might not create the fatalistic and harmful discriminatory scenarios originally perceived or imagined. The time is ripe for a change in governance and regulatory approaches, a reform that is consistent with what citizens seem to have already understood and acted on.

Bartha Maria Knoppers  相似文献

10.
Greener M 《EMBO reports》2008,9(11):1067-1069
A consensus definition of life remains elusive

In July this year, the Phoenix Lander robot—launched by NASA in 2007 as part of the Phoenix mission to Mars—provided the first irrefutable proof that water exists on the Red Planet. "We've seen evidence for this water ice before in observations by the Mars Odyssey orbiter and in disappearing chunks observed by Phoenix […], but this is the first time Martian water has been touched and tasted," commented lead scientist William Boynton from the University of Arizona, USA (NASA, 2008). The robot's discovery of water in a scooped-up soil sample increases the probability that there is, or was, life on Mars.

Meanwhile, the Darwin project, under development by the European Space Agency (ESA; Paris, France; www.esa.int/science/darwin), envisages a flotilla of four or five free-flying spacecraft to search for the chemical signatures of life in 25 to 50 planetary systems. Yet, in the vastness of space, to paraphrase the British astrophysicist Arthur Eddington (1882–1944), life might be not only stranger than we imagine, but also stranger than we can imagine. The limits of our current definitions of life raise the possibility that we would not be able to recognize an extra-terrestrial organism.

Back on Earth, molecular biologists—whether deliberately or not—are empirically tackling the question of what life is. Researchers at the J Craig Venter Institute (Rockville, MD, USA), for example, have synthesized an artificial bacterial genome (Gibson et al, 2008). Others have worked on 'minimal cells' with the aim of synthesizing a 'bioreactor' that contains the minimum of components necessary to be self-sustaining, reproduce and evolve. Some biologists regard these features as the hallmarks of life (Luisi, 2007). However, to decide who is first in the 'race to create life' requires a consensus definition of life itself. "A definition of the precise boundary between complex chemistry and life will be critical in deciding which group has succeeded in what might be regarded by the public as the world's first theology practical," commented Jamie Davies, Professor of Experimental Anatomy at the University of Edinburgh, UK.

For most biologists, defining life is a fascinating, fundamental, but largely academic question. It is, however, crucial for exobiologists looking for extra-terrestrial life on Mars, Jupiter's moon Europa, Saturn's moon Titan and on planets outside our solar system.

In their search for life, exobiologists base their working hypothesis on the only example to hand: life on Earth. "At the moment, we can only assume that life elsewhere is based on the same principles as on Earth," said Malcolm Fridlund, Secretary for the Exo-Planet Roadmap Advisory Team at the ESA's European Space Research and Technology Centre (Noordwijk, The Netherlands). "We should, however, always remember that the universe is a peculiar place and try to interpret unexpected results in terms of new physics and chemistry."

The ESA's Darwin mission will, therefore, search for life-related gases such as carbon dioxide, water, methane and ozone in the atmospheres of other planets. On Earth, the emergence of life altered the balance of atmospheric gases: living organisms produced all of the Earth's oxygen, which now accounts for one-fifth of the atmosphere. "If all life on Earth was extinguished, the oxygen in our atmosphere would disappear in less than 4 million years, which is a very short time as planets go—the Earth is 4.5 billion years old," Fridlund said.
He added that organisms present in the early phases of life on Earth produced methane, which alters atmospheric composition compared with a planet devoid of life.Although the Darwin project will use a pragmatic and specific definition of life, biologists, philosophers and science-fiction authors have devised numerous other definitions—none of which are entirely satisfactory. Some are based on basic physiological characteristics: a living organism must feed, grow, metabolize, respond to stimuli and reproduce. Others invoke metabolic definitions that define a living organism as having a distinct boundary—such as a membrane—which facilitates interaction with the environment and transfers the raw materials needed to maintain its structure (Wharton, 2002). The minimal cell project, for example, defines cellular life as “the capability to display a concert of three main properties: self-maintenance (metabolism), reproduction and evolution. When these three properties are simultaneously present, we will have a full fledged cellular life” (Luisi, 2007). These concepts regard life as an emergent phenomenon arising from the interaction of non-living chemical components.Cryptobiosis—hidden life, also known as anabiosis—and bacterial endospores challenge the physiological and metabolic elements of these definitions (Wharton, 2002). When the environment changes, certain organisms are able to undergo cryptobiosis—a state in which their metabolic activity either ceases reversibly or is barely discernible. Cryptobiosis allows the larvae of the African fly Polypedilum vanderplanki to survive desiccation for up to 17 years and temperatures ranging from −270 °C (liquid helium) to 106 °C (Watanabe et al, 2002). It also allows the cysts of the brine shrimp Artemia to survive desiccation, ultraviolet radiation, extremes of temperature (Wharton, 2002) and even toyshops, which sell the cysts as ‘sea monkeys''. Organisms in a cryptobiotic state show characteristics that vary markedly from what we normally consider to be life, although they are certainly not dead. “[C]ryptobiosis is a unique state of biological organization”, commented James Clegg, from the Bodega Marine Laboratory at the University of California (Davies, CA, USA), in an article in 2001 (Clegg, 2001). Bacterial endospores, which are the “hardiest known form of life on Earth” (Nicholson et al, 2000), are able to withstand almost any environment—perhaps even interplanetary space. Microbiologists isolated endospores of strict thermophiles from cold lake sediments and revived spores from samples some 100,000 years old (Nicholson et al, 2000).…life might be not only stranger than we imagine, but also stranger than we can imagineAnother problem with the definitions of life is that these can expand beyond biology. The minimal cell project, for example, in common with most modern definitions of life, encompass the ability to undergo Darwinian evolution (Wharton, 2002). “To be considered alive, the organism needs to be able to undergo extensive genetic modification through natural selection,” said Professor Paul Freemont from Imperial College London, UK, whose research interests encompass synthetic biology. But the virtual ‘organisms'' in computer simulations such as the Game of Life (www.bitstorm.org/gameoflife) and Tierra (http://life.ou.edu/tierra) also exhibit life-like characteristics, including growth, death and evolution—similar to robots and other artifical systems that attempt to mimic life (Guruprasad & Sekar, 2006). 
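To make the comparison concrete, a minimal sketch of the Game of Life's update rule shows how 'growth' and 'death' emerge from nothing more than a fixed local rule. This Python snippet is not drawn from the article; the wrap-around grid and the 'glider' starting pattern are illustrative assumptions only.

def step(grid):
    """Return the next generation of a 2D grid of 0/1 cells (toroidal edges assumed)."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the eight neighbours, wrapping around the grid edges.
            live = sum(grid[(r + dr) % rows][(c + dc) % cols]
                       for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                       if (dr, dc) != (0, 0))
            # Conway's rules: a dead cell is 'born' with exactly 3 live neighbours;
            # a live cell 'survives' with 2 or 3; every other cell dies or stays dead.
            nxt[r][c] = 1 if live == 3 or (grid[r][c] == 1 and live == 2) else 0
    return nxt

# Illustrative 'glider': a pattern that propagates across the grid over generations.
glider = [[0, 1, 0, 0, 0],
          [0, 0, 1, 0, 0],
          [1, 1, 1, 0, 0],
          [0, 0, 0, 0, 0],
          [0, 0, 0, 0, 0]]
for _ in range(4):
    glider = step(glider)

Patterns such as the glider grow, move and eventually die out, yet few would call the program alive—which is precisely the difficulty raised in the following remark.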
“At the moment, we have some problems differentiating these approaches from something biologists consider [to be] alive,” Fridlund commented.

…to decide who is first in the ‘race to create life' requires a consensus definition of life

Both the genetic code and all computer-programming languages are means of communicating large quantities of codified information, which adds another element to a comprehensive definition of life. Guenther Witzany, an Austrian philosopher, has developed a “theory of communicative nature” that, he claims, differentiates biotic from abiotic matter. “Life is distinguished from non-living matter by language and communication,” Witzany said. According to his theory, RNA and DNA use a ‘molecular syntax' to make sense of the genetic code in a manner similar to language. This paragraph, for example, could contain the same words in a random order; it would be meaningless without syntactic and semantic rules. “The RNA/DNA language follows syntactic, semantic and pragmatic rules which are absent in [a] random-like mixture of nucleic acids,” Witzany explained.

Yet, successful communication requires both a speaker using the rules and a listener who is aware of and can understand the syntax and semantics. For example, cells, tissues, organs and organisms communicate with each other to coordinate and organize their activities; in other words, they exchange signals that contain meaning. Noradrenaline binding to a β-adrenergic receptor in the bronchi communicates a signal that says ‘dilate'. “If communication processes are deformed, destroyed or otherwise incorrectly mediated, both coordination and organisation of cellular life is damaged or disturbed, which can lead to disease,” Witzany added. “Cellular life also interprets abiotic environmental circumstances—such as the availability of nutrients, temperature and so on—to generate appropriate behaviour.”

Nonetheless, even definitions of life that include all the elements mentioned so far might still be incomplete. “One can make a very complex definition that covers life on the Earth, but what if we find life elsewhere and it is different? My opinion, shared by many, is that we don't have a clue of how life arose on Earth, even if there are some hypotheses,” Fridlund said. “This underlies many of our problems defining life. Since we do not have a good minimum definition of life, it is hard or impossible to find out how life arose without observing the process. Nevertheless, I'm an optimist who believes the universe is understandable with some hard work and I think we will understand these issues one day.”

Both synthetic biology and research on organisms that live in extreme conditions allow biologists to explore biological boundaries, which might help them to reach a consensual minimum definition of life and to understand how it arose and evolved. Life is certainly able to flourish in some remarkably hostile environments. Thermus aquaticus, for example, is metabolically optimal in the springs of Yellowstone National Park at temperatures between 75 °C and 80 °C. Another extremophile, Deinococcus radiodurans, has evolved a highly efficient biphasic system to repair radiation-induced DNA breaks (Misra et al, 2006) and, as Fridlund noted, “is remarkably resistant to gamma radiation and even lives in the cooling ponds of nuclear reactors.”

In turn, synthetic biology allows for a detailed examination of the elements that define life, including the minimum set of genes required to create a living organism. Researchers at the J Craig Venter Institute, for example, have synthesized a 582,970-base-pair Mycoplasma genitalium genome containing all the genes of the wild-type bacterium, except one that they disrupted to block pathogenicity and allow for selection. ‘Watermarks' at intergenic sites that tolerate transposon insertions identify the synthetic genome, which would otherwise be indistinguishable from the wild type (Gibson et al, 2008).

Yet, as Pier Luigi Luisi from the University of Roma in Italy remarked, even M. genitalium is relatively complex. “The question is whether such complexity is necessary for cellular life, or whether, instead, cellular life could, in principle, also be possible with a much lower number of molecular components,” he said. After all, life probably did not start with cells that already contained thousands of genes (Luisi, 2007).

To investigate further the minimum number of genes required for life, researchers are using minimal cell models: synthetic genomes that can be included in liposomes, which themselves show some life-like characteristics. Certain lipid vesicles are able to grow, divide and grow again, and can include polymerase enzymes to synthesize RNA from external substrates, as well as functional translation apparatuses, including ribosomes (Deamer, 2005).

However, the requirement that an organism be subject to natural selection to be considered alive could prove to be a major hurdle for current attempts to create life. As Freemont commented: “Synthetic biologists could include the components that go into a cell and create an organism [that is] indistinguishable from one that evolved naturally and that can replicate […] We are beginning to get to grips with what makes the cell work. Including an element that undergoes natural selection is proving more intractable.”

John Dupré, Professor of Philosophy of Science and Director of the Economic and Social Research Council (ESRC) Centre for Genomics in Society at the University of Exeter, UK, commented that synthetic biologists still approach the construction of a minimal organism with certain preconceptions. “All synthetic biology research assumes certain things about life and what it is, and any claims to have ‘confirmed' certain intuitions—such as life is not a vital principle—aren't really adding empirical evidence for those intuitions. Anyone with the opposite intuition may simply refuse to admit that the objects in question are living,” he said. “To the extent that synthetic biology is able to draw a clear line between life and non-life, this is only possible in relation to defining concepts brought to the research. For example, synthetic biologists may be able to determine the number of genes required for minimal function. Nevertheless, ‘what counts as life' is unaffected by minimal genomics.”

Partly because of these preconceptions, Dan Nicholson, a former molecular biologist now working at the ESRC Centre, commented that synthetic biology adds little to the understanding of life already gained from molecular biology and biochemistry. Nevertheless, he said, synthetic biology might allow us to go boldly into the realms of biological possibility where evolution has not gone before. An engineered synthetic organism could, for example, express novel amino acids, proteins, nucleic acids or vesicular forms. A synthetic organism could use pyranosyl-RNA, which produces a stronger and more selective pairing system than the naturally occurring furanosyl-RNA (Bolli et al, 1997). Furthermore, the synthesis of proteins that do not exist in nature—so-called never-born proteins—could help scientists to understand why evolutionary pressures only selected certain structures.

As Luisi remarked, the ratio between the number of theoretically possible proteins containing 100 amino acids and the real number present in nature is close to the ratio between the space of the universe and the space of a single hydrogen atom, or the ratio between all the sand in the Sahara Desert and a single grain. Exploring never-born proteins could, therefore, allow synthetic biologists to determine whether particular physical, structural, catalytic, thermodynamic and other properties maximized the evolutionary fitness of natural proteins, or whether the current protein repertoire is predominantly the result of chance (Luisi, 2007).

“Synthetic biology also could conceivably help overcome the ‘n = 1 problem'—namely, that we base biological theorising on terrestrial life only,” Nicholson said. “In this way, synthetic biology could contribute to the development of a more general, broader understanding of what life is and how it might be defined.”

No matter the uncertainties, researchers will continue their attempts to create life in the test tube—it is, after all, one of the greatest scientific challenges. Whether or not they succeed will depend partly on the definition of life that they use, though in any case, the research should yield numerous insights that are beneficial to biologists generally. “The process of creating a living system from chemical components will undoubtedly offer many rich insights into biology,” Davies concluded. “However, the definition will, I fear, reflect politics more than biology. Any definition will, therefore, be subject to a lot of inter-lab political pressure. Definitions are also important for bioethical legislation and, as a result, reflect larger politics more than biology. In the final analysis, as with all science, deep understanding is more important than labelling with words.”  相似文献
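A quick order-of-magnitude check of the sequence-space ratio Luisi invokes (the count of possible sequences is straightforward combinatorics; the figure assumed for the number of distinct natural proteins is an illustrative guess, not a value from the article):

\[
N_{\text{possible}} = 20^{100} \approx 1.3 \times 10^{130}, \qquad
\frac{N_{\text{possible}}}{N_{\text{natural}}} \sim \frac{10^{130}}{10^{13}} = 10^{117}
\quad \text{(assuming on the order of } 10^{13} \text{ natural sequences)}.
\]

Even under far more generous assumptions about the size of the natural repertoire, the ratio remains so vast that astronomical analogies of this kind are of the right character.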

11.
Samuel Caddick 《EMBO reports》2008,9(12):1174-1176
  相似文献   

12.
Lessons from science studies for the ongoing debate about ‘big' versus ‘little' research projects

During the past six decades, the importance of scientific research to the developed world and the daily lives of its citizens has led many industrialized countries to rebrand themselves as ‘knowledge-based economies'. The increasing role of science as a main driver of innovation and economic growth has also changed the nature of research itself. Starting with the physical sciences, recent decades have seen academic research increasingly conducted in the form of large, expensive and collaborative ‘big science' projects that often involve multidisciplinary, multinational teams of scientists, engineers and other experts.

Although laboratory biology was late to join the big science trend, there has nevertheless been a remarkable increase in the number, scope and complexity of research collaborations and projects involving biologists over the past two decades (Parker et al, 2010). The Human Genome Project (HGP) is arguably the most well known of these and attracted serious scientific, public and government attention to ‘big biology'. Initial exchanges were polarized and often polemic, as proponents of the HGP applauded the advent of big biology and argued that it would produce results unattainable through other means (Hood, 1990). Critics highlighted the negative consequences of massive-scale research, including the industrialization, bureaucratization and politicization of research (Rechsteiner, 1990). They also suggested that it was not suited to generating knowledge at all; Nobel laureate Sydney Brenner joked that sequencing was so boring it should be done by prisoners: “the more heinous the crime, the bigger the chromosome they would have to decipher” (Roberts, 2001).

A recent Opinion in EMBO reports summarized the arguments against “the creeping hegemony” of ‘big science' over ‘little science' in biomedical research. First, many large research projects are of questionable scientific and practical value. Second, big science transfers the control of research topics and goals to bureaucrats, when decisions about research should be primarily driven by the scientific community (Petsko, 2009). Gregory Petsko makes a valid point in his Opinion about wasteful research projects and raises the important question of how research goals should be set and by whom. Here, we contextualize Petsko's arguments by drawing on the history and sociology of science to expound the drawbacks and benefits of big science. We then advance an alternative to the current antipodes of ‘big' and ‘little' biology, which offers some of the benefits and avoids some of the adverse consequences.

Big science is not a recent development. Among the first large, collaborative research projects were the Manhattan Project to develop the atomic bomb, and efforts to decipher German codes during the Second World War. The concept itself was put forward in 1961 by physicist Alvin Weinberg, and further developed by historian of science Derek De Solla Price in his pioneering book, Little Science, Big Science: “The large-scale character of modern science, new and shining and all powerful, is so apparent that the happy term ‘Big Science' has been coined to describe it” (De Solla Price, 1963). Weinberg noted that science had become ‘big' in two ways. First, through the development of elaborate research instrumentation, the use of which requires large research teams, and second, through the explosive growth of scientific research in general. More recently, big science has come to refer to a diverse but strongly related set of changes in the organization of scientific research. This includes expensive equipment and large research teams, but also the increasing industrialization of research activities, the escalating frequency of interdisciplinary and international collaborations, and the increasing manpower needed to achieve research goals (Galison & Hevly, 1992). Many areas of biological research have shifted in these directions in recent years and have radically altered the methods by which biologists generate scientific knowledge.

Understanding the implications of this change begins with an appreciation of the history of collaborations in the life sciences—biology has long been a collaborative effort. Natural scientists accompanied the great explorers in the grand alliance between science and exploration during the sixteenth and seventeenth centuries (Capshew & Rader, 1992), which not only served to map uncharted territories, but also contributed enormously to knowledge of the fauna and flora discovered. These early expeditions gradually evolved into coordinated, multidisciplinary research programmes, which began with the International Polar Years, intended to concentrate international research efforts at the North and South Poles (1882–1883; 1932–1933). The Polar Years became exemplars of large-scale life science collaboration, begetting the International Geophysical Year (1957–1958) and the International Biological Programme (1968–1974).

Despite this long history of collaboration, laboratory biology remained ‘small-scale' until the rising prominence of molecular biology changed the research landscape. During the late 1950s and early 1960s, many research organizations encouraged international collaboration in the life sciences, spurring the creation of, among other things, the European Molecular Biology Organization (1964) and the European Molecular Biology Laboratory (1974). In addition, international mapping and sequencing projects were developed around model organisms such as Drosophila and Caenorhabditis elegans, and scientists formed research networks, exchanged research materials and information, and divided labour across laboratories. These new ways of working set the stage for the HGP, which is widely acknowledged as the cornerstone of the current ‘post-genomics era'. As an editorial on ‘post-genomics cultures' put it in the journal Nature, “Like it or not, big biology is here to stay” (Anon, 2001).

Just as big science is not new, neither are concerns about its consequences. As early as 1948, the sociologist Max Weber worried that as equipment was becoming more expensive, scientists were losing autonomy and becoming more dependent on external funding (Weber, 1948). Similarly, although Weinberg and De Solla Price expressed wonder at the scope of the changes they were witnessing, they too offered critical evaluations. For Weinberg, the potentially negative consequences associated with big science were “administratitis, moneyitis, and journalitis”, meaning the dominance of science administrators over practitioners, the tendency to view funding increases as a panacea for solving scientific problems, and progressively blurry lines between scientific and popular writing in order to woo public support for big research projects (Weinberg, 1961). De Solla Price worried that the bureaucracy associated with big science would fail to entice the intellectual mavericks on which science depends (De Solla Price, 1963). These concerns remain valid and have been voiced time and again.

As big science represents a major investment of time, money and manpower, it tends to determine and channel research in particular directions that afford certain possibilities and preclude others (Cook & Brown, 1999). In the worst case, this can result in entire scientific communities following false leads, as was the case in the 1940s and 1950s for Soviet agronomy. Huge investments were made to demonstrate the superiority of Lamarckian over Mendelian theories of heritability, which held back Russian biology for decades (Soyfer, 1994). Such worst-case scenarios are, however, rare. A more likely consequence is that big science can diminish the diversity of research approaches. For instance, plasma fusion scientists are now under pressure to design projects that are relevant to the large-scale International Thermonuclear Experimental Reactor, despite the potential benefits of a wide array of smaller-scale machines and approaches (Hackett et al, 2004). Big science projects can also involve coordination challenges, take substantial time to realize success, and be difficult to evaluate (Neal et al, 2008).

Another danger of big science is that researchers will lose the intrinsic satisfaction that arises from having personal control over their work. Dissatisfaction could lower research productivity (Babu & Singh, 1998) and might create the concomitant danger of losing talented young researchers to other, more engaging callings. Moreover, the alienation of scientists from their work as a result of big science enterprises can lead to a loss of personal responsibility for research. In turn, this can increase the likelihood of misconduct, as effective social control is eroded and “the satisfactions of science are overshadowed by organizational demands, economic calculations, and career strategies” (Hackett, 1994).

Practicing scientists are aware of these risks. Yet, they remain engaged in large-scale projects because they must, but also because of the real benefits these projects offer. Importantly, big science projects allow for the coordination and activation of diverse forms of expertise across disciplinary, national and professional boundaries to solve otherwise intractable basic and applied problems. Although calling for international and interdisciplinary collaboration is popular, practicing it is notably less popular and much harder (Weingart, 2000). Big science projects can act as a focal point that allows researchers from diverse backgrounds to cooperate, and simultaneously advances different scientific specialties while forging interstitial connections among them.

Another major benefit of big science is that it facilitates the development of common research standards and metrics, allowing for the rapid development of nascent research frontiers (Fujimura, 1996). Furthermore, the high profile of big science efforts such as the HGP and CERN draws public attention to science, potentially enhancing scientific literacy and the public's willingness to support research.

Big science can also ease some of the problems associated with scientific management. In terms of training, graduate students and junior researchers involved in big science projects can gain additional skills in problem-solving, communication and team working (Court & Morris, 1994). The bureaucratic structure and well-defined roles of big science projects also make leadership transitions and researcher attrition easier to manage compared with the informal, refractory organization of most small research projects. Big science projects also provide a visible platform for resource acquisition and the recruitment of new scientific talent. Moreover, through their sheer size, diversity and complexity, they can also increase the frequency of serendipitous social interactions and scientific discoveries (Hackett et al, 2008). Finally, large-scale research projects can influence scientific and public policy. Big science creates organizational structures in which many scientists share responsibility for, and expectations of, a scientific problem (Van Lente, 1993). This shared ownership and these shared futures help coordinate communication and enable researchers to present a united front when advancing the potential benefits of their projects to funding bodies.

Given these benefits and pitfalls of big science, how might molecular biology best proceed? Petsko's response is that, “[s]cientific priorities must, for the most part, be set by the free exchange of ideas in the scientific literature, at meetings and in review panels. They must be set from the bottom up, from the community of scientists, not by the people who control the purse strings.” It is certainly the case, as Petsko also acknowledges, that science has benefited from a combination of generous public support and professional autonomy. However, we are less sanguine about his belief that the scientific community alone has the capacity to ascertain the practical value of particular lines of inquiry, determine the most appropriate scale of research, and bring them to fruition. In fact, current mismatches between the production of scientific knowledge and the information needs of public policy-makers strongly suggest that the opposite is true (Sarewitz & Pielke, 2007).

Instead, we maintain that these types of decision should be determined through collective decision-making that involves researchers, governmental funding agencies, science policy experts and the public. In fact, the highly successful HGP involved such collaborations (Lambright, 2002). Taking into account the opinions and attitudes of these stakeholders better links knowledge production to the public good (Cash et al, 2003)—a major justification for supporting big biology. We do agree with Petsko, however, that large-scale projects can develop pathological characteristics, and that all programmes should therefore undergo regular assessments to determine their continuing worth.

Rather than arguing for or against big science, molecular biology would best benefit from strategic investments in a diverse portfolio of big, little and ‘mezzo' research projects. Their size, duration and organizational structure should be determined by the research question, subject matter and intended goals (Westfall, 2003). Parties involved in making these decisions should, in turn, aim at striking a profitable balance between differently sized research projects to garner the benefits of each and allow practitioners the autonomy to choose among them.

This will require new, innovative methods for supporting and coordinating research. An important first step is ensuring that funding is made available for all kinds of research at a range of scales. For this to happen, the current funding model needs to be modified. The practice of allocating separate funds for individual investigator-driven and collective research projects is a positive step in the right direction, but it does not discriminate between projects of different sizes at a sufficiently fine resolution. Instead, multiple funding pools should be made available for projects of different sizes and scales, allowing for greater accuracy in project planning, funding and evaluation.

Second, science policy should consciously facilitate the ‘scaling up', ‘scaling down' and concatenation of research projects when needed. For instance, special funds might be established for supporting small-scale but potentially transformative research with the capacity to be scaled up in the future. Alternatively, small-scale satellite research projects that are more nimble, exploratory and risky could complement big science initiatives or be generated by them. This is also in line with Petsko's statement that “the best kind of big science is the kind that supports and generates lots of good little science.” Another potentially fruitful strategy we suggest would be to fund independent, small-scale research projects to work on co-relevant research, with the later objective of consolidating them into a single project in a kind of building-block assembly. Using these and other mechanisms for organizing research at different scales could help to ameliorate some of the problems associated with big science, while also accruing its most important benefits.

Within the life sciences, the field of ecology perhaps best exemplifies this strategy. Although it encompasses many small-scale laboratory and field studies, ecologists now collaborate in a variety of novel organizations that blend elements of big, little and mezzo science and that are designed to catalyse different forms of research. For example, the US National Center for Ecological Analysis and Synthesis brings together researchers and data from many smaller projects to synthesize their findings. The Long Term Ecological Research Network consists of dozens of mezzo-scale collaborations focused on specific sites, but also leverages big science through cross-site collaborations. While investments are made in classical big science projects, such as the National Ecological Observatory Network, no one project or approach has dominated—nor should it. In these ways, ecologists have been able to reap the benefits of big science whilst maintaining diverse research approaches and individual autonomy, while still enjoying the intrinsic satisfaction associated with scientific work.

Big biology is here to stay and is neither a curse nor a blessing. It is up to scientists and policy-makers to discern how to benefit from the advantages that ‘bigness' has to offer, while avoiding the pitfalls inherent in so doing. The challenge confronting molecular biology in the coming years is to decide which kinds of research projects are best suited to getting the job done. Molecular biology itself arose, in part, from the migration of physicists to biology; as physics research projects and collaborations grew and became more dependent on expensive equipment, appreciating the saliency of one's own work became increasingly difficult, which led some to seek refuge in the comparatively little science of biology (Dev, 1990). The current situation, which Petsko criticizes in his Opinion article, is thus the result of an organizational and intellectual cycle that began more than six decades ago. It would certainly behoove molecular biologists to heed his warnings and consider the best paths forward.

Niki Vermeulen, John N. Parker and Bart Penders  相似文献

13.
Zhang JY 《EMBO reports》2011,12(4):302-306
How can grass-roots movements evolve into a national research strategy? The bottom-up emergence of synthetic biology in China could give some pointers.Given its potential to aid developments in renewable energy, biosensors, sustainable chemical industries, microbial drug factories and biomedical devices, synthetic biology has enormous implications for economic development. Many countries are therefore implementing strategies to promote progress in this field. Most notably, the USA is considered to be the leader in exploring the industrial potential of synthetic biology (Rodemeyer, 2009). Synthetic biology in Europe has benefited from several cross-border studies, such as the ‘New and Emerging Science and Technology'' programme (NEST, 2005) and the ‘Towards a European Strategy for Synthetic Biology'' project (TESSY; Gaisser et al, 2008). Yet, little is known in the West about Asia''s role in this ‘new industrial revolution'' (Kitney, 2009). In particular, China is investing heavily in scientific research for future developments, and is therefore likely to have an important role in the development of synthetic biology.Initial findings seem to indicate that the emergence of synthetic biology in China has been a bottom-up construction of a new scientific framework…In 2010, as part of a study of the international governance of synthetic biology, the author visited four leading research teams in three Chinese cities (Beijing, Tianjin and Hefei). The main aims of the visits were to understand perspectives in China on synthetic biology, to identify core themes among its scientific community, and to address questions such as ‘how did synthetic biology emerge in China?'', ‘what are the current funding conditions?'', ‘how is synthetic biology generally perceived?'' and ‘how is it regulated?''. Initial findings seem to indicate that the emergence of synthetic biology in China has been a bottom-up construction of a new scientific framework; one that is more dynamic and comprises more options than existing national or international research and development (R&D) strategies. Such findings might contribute to Western knowledge of Chinese R&D, but could also expose European and US policy-makers to alternative forms and patterns of research governance that have emerged from a grass-roots level.…the process of developing a framework is at least as important to research governance as the big question it might eventually addressA dominant narrative among the scientists interviewed is the prospect of a ‘big-question'' strategy to promote synthetic-biology research in China. This framework is at a consultation stage and key questions are still being discussed. Yet, fieldwork indicates that the process of developing a framework is at least as important to research governance as the big question it might eventually address. According to several interviewees, this approach aims to organize dispersed national R&D resources into one grand project that is essential to the technical development of the field, preferably focusing on an industry-related theme that is economically appealling to the Chinese public.Chinese scientists have a pragmatic vision for research; thinking of science in terms of its ‘instrumentality'' has long been regarded as characteristic of modern China (Schneider, 2003). 
However, for a country in which the scientific community is sometimes described as an “uncoordinated ‘bunch of loose ends''” (Cyranoski, 2001) “with limited synergies between them” (OECD, 2007), the envisaged big-question approach implies profound structural and organizational changes. Structurally, the approach proposes that the foundational (industry-related) research questions branch out into various streams of supporting research and more specific short-term research topics. Within such a framework, a variety of Chinese universities and research institutions can be recruited and coordinated at different levels towards solving the big question.It is important to note that although this big-question strategy is at a consultation stage and supervised by the Ministry of Science and Technology (MOST), the idea itself has emerged in a bottom-up manner. One academic who is involved in the ongoing ministerial consultation recounted that, “It [the big-question approach] was initially conversations among we scientists over the past couple of years. We saw this as an alternative way to keep up with international development and possibly lead to some scientific breakthrough. But we are happy to see that the Ministry is excited and wants to support such an idea as well.” As many technicalities remain to be addressed, there is no clear time-frame yet for when the project will be launched. Yet, this nationwide cooperation among scientists with an emerging commitment from MOST seems to be largely welcomed by researchers. Some interviewees described the excitement it generated among the Chinese scientific community as comparable with the establishment of “a new ‘moon-landing'' project”.Of greater significance than the time-frame is the development process that led to this proposition. On the one hand, the emergence of synthetic biology in China has a cosmopolitan feel: cross-border initiatives such as international student competitions, transnational funding opportunities and social debates in Western countries—for instance, about biosafety—all have an important role. On the other hand, the development of synthetic biology in China has some national particularities. Factors including geographical proximity, language, collegial familiarity and shared interests in economic development have all attracted Chinese scientists to the national strategy, to keep up with their international peers. Thus, to some extent, the development of synthetic biology in China is an advance not only in the material synthesis of the ‘cosmos''—the physical world—but also in the social synthesis of aligning national R&D resources and actors with the global scientific community.To comprehend how Chinese scientists have used national particularities and global research trends as mutually constructive influences, and to identify the implications of this for governance, this essay examines the emergence of synthetic biology in China from three perspectives: its initial activities, the evolution of funding opportunities, and the ongoing debates about research governance.China''s involvement in synthetic biology was largely promoted by the participation of students in the International Genetically Engineered Machine (iGEM) competition, an international contest for undergraduates initiated by the Massachusetts Institute of Technology (MIT) in the USA. 
Before the iGEM training workshop that was hosted by Tianjin University in the Spring of 2007, there were no research records and only two literature reviews on synthetic biology in Chinese scientific databases (Zhao & Wang, 2007). According to Chunting Zhang of Tianjin University—a leading figure in the promotion of synthetic biology in China—it was during these workshops that Chinese research institutions joined their efforts for the first time (Zhang, 2008). From the outset, the organization of the workshop had a national focus, while it engaged with international networks. Synthetic biologists, including Drew Endy from MIT and Christina Smolke from Stanford University, USA, were invited. Later that year, another training camp designed for iGEM tutors was organized in Tianjin and included delegates from Australia and Japan (Zhang, 2008).Through years of organizing iGEM-related conferences and workshops, Chinese universities have strengthened their presence at this international competition; in 2007, four teams from China participated. During the 2010 competition, 11 teams from nine universities in six provinces/municipalities took part. Meanwhile, recruiting, training and supervising iGEM teams has become an important institutional programme at an increasing number of universities.…training for iGEM has grown beyond winning the student awards and become a key component of exchanges between Chinese researchers and the international communityIt might be easy to interpret the enthusiasm for the iGEM as a passion for winning gold medals, as is conventionally the case with other international scientific competitions. This could be one motive for participating. Yet, training for iGEM has grown beyond winning the student awards and has become a key component of exchanges between Chinese researchers and the international community (Ding, 2010). Many of the Chinese scientists interviewed recounted the way in which their initial involvement in synthetic biology overlapped with their tutoring of iGEM teams. One associate professor at Tianjin University, who wrote the first undergraduate textbook on synthetic biology in China, half-jokingly said, “I mainly learnt [synthetic biology] through tutoring new iGEM teams every year.”Participation in such contests has not only helped to popularize synthetic biology in China, but has also influenced local research culture. One example of this is that the iGEM competition uses standard biological parts (BioBricks), and new BioBricks are submitted to an open registry for future sharing. A corresponding celebration of open-source can also be traced to within the Chinese synthetic-biology community. In contrast to the conventional perception that the Chinese scientific sector consists of a “very large number of ‘innovative islands''” (OECD, 2007; Zhang, 2010), communication between domestic teams is quite active. In addition to the formally organized national training camps and conferences, students themselves organize a nationwide, student-only workshop at which to informally test their ideas.More interestingly, when the author asked one team whether there are any plans to set up a ‘national bank'' for hosting designs from Chinese iGEM teams, in order to benefit domestic teams, both the tutor and team members thought this proposal a bit “strange”. The team leader responded, “But why? There is no need. With BioBricks, we can get any parts we want quite easily. 
Plus, it directly connects us with all the data produced by iGEM teams around the world, let alone in China. A national bank would just be a small-scale duplicate.”From the beginning, interest in the development of synthetic biology in China has been focused on collective efforts within and across national borders. In contrast to conventional critiques on the Chinese scientific community''s “inclination toward competition and secrecy, rather than openness” (Solo & Pressberg, 2007; OECD, 2007; Zhang, 2010), there seems to be a new outlook emerging from the participation of Chinese universities in the iGEM contest. Of course, that is not to say that the BioBricks model is without problems (Rai & Boyle, 2007), or to exclude inputs from other institutional channels. Yet, continuous grass-roots exchanges, such as the undergraduate-level competition, might be as instrumental as formal protocols in shaping research culture. The indifference of Chinese scientists to a ‘national bank'' seems to suggest that the distinction between the ‘national'' and ‘international'' scientific communities has become blurred, if not insignificant.However, frequent cross-institutional exchanges and the domestic organization of iGEM workshops seem to have nurtured the development of a national synthetic-biology community in China, in which grass-roots scientists are comfortable relying on institutions with a cosmopolitan character—such as the BioBricks Foundation—to facilitate local research. To some extent, one could argue that in the eyes of Chinese scientists, national and international resources are one accessible global pool. This grass-roots interest in incorporating local and global advantages is not limited to student training and education, but also exhibited in evolving funding and regulatory debates.In the development of research funding for synthetic biology, a similar bottom-up consolidation of national and global resources can also be observed. As noted earlier, synthetic-biology research in China is in its infancy. A popular view is that China has the potential to lead this field, as it has strong support from related disciplines. In terms of genome sequencing, DNA synthesis, genetic engineering, systems biology and bioinformatics, China is “almost at the same level as developed countries” (Pan, 2008), but synthetic-biology research has only been carried out “sporadically” (Pan, 2008; Huang, 2009). There are few nationally funded projects and there is no discernible industrial involvement (Yang, 2010). Most existing synthetic-biology research is led by universities or institutions that are affiliated with the Chinese Academy of Science (CAS). As one CAS academic commented, “there are many Chinese scientists who are keen on conducting synthetic-biology research. But no substantial research has been launched nor has long-term investment been committed.”The initial undertaking of academic research on synthetic biology in China has therefore benefited from transnational initiatives. The first synthetic-biology project in China, launched in October 2006, was part of the ‘Programmable Bacteria Catalyzing Research'' (PROBACTYS) project, funded by the Sixth Framework Programme of the European Union (Yang, 2010). 
A year later, another cross-border collaborative effort led to the establishment of the first synthetic-biology centre in China: the Edinburgh University–Tianjing University Joint Research Centre for Systems Biology and Synthetic Biology (Zhang, 2008).There is also a comparable commitment to national research coordination. A year after China''s first participation in iGEM, the 2008 Xiangshan conference focused on domestic progress. From 2007 to 2009, only five projects in China received national funding, all of which came from the National Natural Science Foundation of China (NSFC). This funding totalled ¥1,330,000 (approximately £133,000; www.nsfc.org), which is low in comparison to the £891,000 funding that was given in the UK for seven Networks in Synthetic Biology in 2007 alone (www.bbsrc.ac.uk).One of the primary challenges in obtaining funding identified by the interviewees is that, as an emerging science, synthetic biology is not yet appreciated by Chinese funding agencies. After the Xiangshan conference, the CAS invited scientists to a series of conferences in late 2009. According to the interviewees, one of the main outcomes was the founding of a ‘China Synthetic Biology Coordination Group''; an informal association of around 30 conference delegates from various research institutions. This group formulated a ‘regulatory suggestion'' that they submitted to MOST, which stated the necessity and implications of supporting synthetic-biology research. In addition, leading scientists such as Chunting Zhang and Huanming Yang—President of the Beijing Genomic Institute (BGI), who co-chaired the Beijing Institutes of Life Science (BILS) conferences—have been active in communicating with government institutions. The initial results of this can be seen in the MOST 2010 Application Guidelines for the National Basic Research Program, in which synthetic biology was included for the first time, among ‘key supporting areas'' (MOST, 2010). Meanwhile, in 2010, NSFC allocated ¥1,500,000 (approximately £150,000) to synthetic-biology research, which is more than the total funding the area had received in the past three years.The search for funding further demonstrates the dynamics between national and transnational resources. Chinese R&D initiatives have to deal with the fact that scientific venture-capital and non-governmental research charities are underdeveloped in China. In contrast to the EU or the USA, government institutions in China, such as the NSFC and MOST, are the main and sometimes only domestic sources of funding. Yet, transnational funding opportunities facilitate the development of synthetic biology by alleviating local structural and financial constraints, and further integrate the Chinese scientific community into international research.This is not a linear ‘going-global'' process; it is important for Chinese scientists to secure and promote national and regional support. In addition, this alignment of national funding schemes with global research progress is similar to the iGEM experience, as it is being initiated through informal bottom-up associations between scientists, rather than by top-down institutional channels.As more institutions have joined iGEM training camps and participated in related conferences, a shared interest among the Chinese scientific community in developing synthetic biology has become visible. 
In late 2009, at the conference that founded the informal ‘coordination group', the proposition of integrating national expertise through a big-question approach emerged. According to one professor in Beijing—who was a key participant in the discussion at the time—this proposition of a nationwide synergy was not so much about ‘national pride' or an aim to develop a ‘Chinese' synthetic biology as it was about research practicality. She explained: “synthetic biology is at the convergence of many disciplines, computer modelling, nano-technology, bioengineering, genomic research etc. Individual researchers like me can only operate on part of the production chain. But I myself would like to see where my findings would fit in a bigger picture as well. It just makes sense for a country the size of China to set up some collective and coordinated framework so as to seek scientific breakthrough.”

From the first participation in the iGEM contest to the later exploration of funding opportunities and collective research plans, scientists have been keen to invite and incorporate domestic and international resources to keep up with global research. Yet, there are still regulatory challenges to be met.

The reputation of “the ‘wild East' of biology” (Dennis, 2002) is associated with China's previous inattention to ethical concerns about the life sciences, especially in embryonic-stem-cell research. Similarly, synthetic biology creates few social concerns in China. Public debate is minimal and most media coverage has been positive. Synthetic biology is depicted as “a core in the fourth wave of scientific development” (Pan, 2008) or “another scientific revolution” (Huang, 2009). Whilst recognizing its possible risks, mainstream media believe that “more people would be attracted to doing good while making a profit than doing evil” (Fang & He, 2010). In addition, biosecurity and biosafety training in China are at an early stage, with few mandatory courses for students (Barr & Zhang, 2010). The four leading synthetic-biology teams I visited regarded the general biosafety regulations that apply to microbiology laboratories as sufficient for synthetic biology. In short, with little social discontent and no imminent public threat, synthetic biology in China could be carried out in a ‘research-as-usual' manner.

Yet, fieldwork suggests that, in contrast to this previous insensitivity to global ethical concerns, the synthetic-biology community in China has taken a more proactive approach to engaging with international debates. It is important to note that there are still no synthetic-biology-specific administrative guidelines or professional codes of conduct in China. However, Chinese stakeholders participate in building a ‘mutual inclusiveness' between global and domestic discussions.

One of the most recent examples of this is a national conference about the ethical and biosafety implications of synthetic biology, which was jointly hosted by the China Association for Science and Technology, the Chinese Society of Biotechnology and the Beijing Institutes of Life Science CAS, in Suzhou in June 2010. The discussion was open to the mainstream media. The debate was not simply a recapitulation of Western worries, such as playing god, potential dual-use or ecological containment. It also focused on the particular concerns of developing countries about how to avoid further widening the developmental gap with advanced countries (Liu, 2010).

In addition to general discussions, there are also sustained transnational communications. For example, one of the first three projects funded by the NSFC was a three-year collaboration on biosafety and risk-assessment frameworks between the Institute of Botany at CAS and the Austrian Organization for International Dialogue and Conflict Management (IDC).

Chinese scientists are also keen to increase their involvement in the formulation of international regulations. The CAS and the Chinese Academy of Engineering are engaged with their peer institutions in the UK and the USA to “design more robust frameworks for oversight, intellectual property and international cooperation” (Royal Society, 2009). It is too early to tell what influence China will achieve in this field. Yet, the changing image of the country from an unconcerned wild East to a partner in lively discussions signals a new dynamic in the global development of synthetic biology.

From self-organized participation in iGEM to bottom-up funding and governance initiatives, two features are repeatedly exhibited in the emergence of synthetic biology in China: global resources and international perspectives complement national interests; and the national and cosmopolitan research strengths are mostly instigated at the grass-roots level. During the process of introducing, developing and reflecting on synthetic biology, many formal or informal, provisional or long-term alliances have been established from the bottom up. Student contests, funding programmes, joint research centres and coordination groups are only a few of the means by which scientists can drive synthetic biology forward in China. However, the inputs of different social actors have not led to disintegration of the field into an array of individualized pursuits, but have transformed it into collective synergies, or the big-question approach. Underlying the diverse efforts of Chinese scientists is a sense of ‘inclusiveness', or the idea of bringing together previously detached research expertise. Thus, the big-question strategy cannot be interpreted as just another nationally organized agenda in response to global scientific advancements. Instead, it represents a more intricate development path corresponding to how contemporary research evolves on the ground.

In comparison to the increasingly visible grass-roots efforts, the role of the Chinese government seems relatively small at this stage. Government input—such as the potential stewardship of the MOST in directing a big-question approach or long-term funding—remains important; the scientists who were interviewed expend a great deal of effort to attract governmental participation. Yet, China's experience highlights that the key to comprehending regional scientific capacity lies not so much in what the government can do, but rather in what is taking place in laboratories. It is important to remember that Chinese iGEM victories, collaborative synthetic-biology projects and ethical discussions all took place before the government became involved. Thus, to appreciate fully the dynamics of an emerging science, it might be necessary to focus on what is formulated from the bottom up.

The experience of China in synthetic biology demonstrates the power of grass-roots, cross-border engagement to promote contemporary research. More specifically, it is a result of the commitment of Chinese scientists to incorporating national and international resources, actors and social concerns. For practical reasons, the national organization of research, such as through the big-question approach, might still have an important role. However, synthetic biology might not only be a mosaic of national agendas, but might also be shaped by transnational activities and scientific resources. What Chinese scientists will collectively achieve remains to be seen. Yet, the emergence of synthetic biology in China might be indicative of a new paradigm for how research practices can be introduced, normalized and regulated.  相似文献

14.
Katrin Weigmann 《EMBO reports》2013,14(12):1043-1046
Scientists are exploring crowdfunding as a potential new source of cash for their research.One day early in 2011, Jarrett Byrnes and Jai Ranganathan, both ecologists at the National Center for Ecological Analysis and Synthesis, Santa Barbara, CA, USA, had a great idea about an alternative way to fund research projects. Byrnes was sitting in his office when Ranganathan came in to tell him about a proposal he had seen on the crowdfunding website Kickstarter (www.kickstarter.com) to erect in Detroit a statue of RoboCop, the robot-human hero of a US science fiction action film. “The RoboCop project seemed a little esoteric, but the proposers had done a fabulous job at communicating why it was both interesting and important,” Byrnes said. It took the internet by storm and raised more than US$65,000 from almost 3,000 backers.If this could be done for RoboCop, Byrnes and Ranganathan wondered whether it could be done for science as well. They asked friends and fellow scientists whether they would be interested in crowdfunding a research project and, after receiving substantial positive feedback, launched #SciFund Challenge in November 2011 with 59 research proposals. #SciFund Challenge helps researchers put together a crowdfunding proposal, supports their outreach activities and launches coordinated campaigns on the crowdfunding website of their partner RocketHub (www.rockethub.com). “We thought we would do it all at the same time so we could help each other out,” explained Byrnes, who is now chief networking officer for #SciFund Challenge.Crowdfunding is the practice of funding a project by raising many small contributions from a large number of individuals, typically via the internet. An artist, film-maker or musician would put together an online profile of their project and choose a platform such as Kickstarter, RocketHub or Indiegogo for its presentation. If people like the project, they can pledge money to it. Backers are usually charged only if the project succeeds in reaching its funding goal before the deadline. Kickstarter is one of the largest crowdfunding portals and focuses on creative projects; in 2012, more than 2 million backers pledged more than US$320 million to Kickstarter projects (http://www.kickstarter.com/year/2012?ref=what_is_kickstarter#overall_stats).Creative projects always work towards a concrete product—an exhibition, a DVD or a computer game—which is not necessarily the case for science, particularly basic researchThe challenge for Byrnes and others is whether crowdfunding works for science. Creative projects always work towards a concrete product—an exhibition, a DVD or a computer game—which is not necessarily the case for science, particularly basic research. Would people donate money for the pursuit of knowledge? In a time of economic crisis and budget cuts, many scientists are eager to give it a try. Crowdfunding of science has exploded in recent years, with funding goals becoming increasingly ambitious; some projects have attracted US$10,000–20,000 or even more. New platforms, such as Petridish, FundaGeek or Microryza, specifically cater to research projects [1]. 
The academic system is starting to adapt too: the University of California, San Francisco, CA, USA, has made a deal with the crowdfunding site Indiegogo that allows backers to make money donated via the site tax deductible [2].Kristina Killgrove, now assistant professor in the department of anthropology, University of West Florida, USA, became interested in crowdfunding when traditional ways of funding research were not available to her. “At the time, I did not have a permanent faculty job. I was an adjunct instructor, with a contract for only one semester, so there was no good way for me to apply for a grant through regular channels, like our National Science Foundation,” she explained. Her proposal to study the DNA of ancient Roman skeletons to learn more about the geographical origins and heritage of the lower classes and slaves in the Roman Empire was part of the first round of #SciFund Challenge projects and attracted donors interested in ancient Rome and in DNA analysis. She exceeded her financial target of $6,000 in less than 2 weeks and eventually raised more than $10,000 from 170 funders.Ethan Perlstein had also reached an academic deadlock when he first turned to crowdfunding. His independent postdoctoral fellowship at Princeton University, USA, came to an end in late 2012 and his future was unclear. Grants for basic research came in his experience from the government or foundations, and he had never questioned that premise. “But as I felt my own existential crisis emerging—I might not get an academic job—I started to think about crowdfunding more seriously. I searched for other scientists who had tried it,” Perlstein said.“It is time to experiment with the way we experiment,” Perlstein''s crowdfunding video proclaims. Indeed, he approached crowdfunding in a scientific way. He analysed various successful projects in search for some general principles. “I wanted a protocol,” he said. “I wanted to do as much as I could beforehand to increase the likelihood of success.” Together with his colleagues, David Sulzer, professor in the departments of neurology and psychiatry at Columbia Medical School, New York, NY, USA, and lead experimentalist Daniel Korostyshevsky, he asked for $25,000 to study the distribution of amphetamines within mouse brain cells to elucidate the mechanism by which these drugs increase dopamine levels at synapses. The crowdfunding experiment worked and their project was fully funded.Creation of a good website with a convincing video is a crucial step towards success. “When crafting your project, it is important to try to put yourselves in the shoes of the audience,” recommended Cindy Wu, who founded San-Francisco-based science crowdfunding company, Microryza, with Denny Luan when they were in graduate school. “You as a scientist find your work absolutely fascinating. Communicating this passion to a broader audience is absolutely key,” said Byrnes. However, recruitment of people to the website is at least as important as the site itself. Many successful crowdfunders build their campaign on existing social networks to channel potential funders to their own website [3]. “Building an audience for your work, having people aware of you and what you are doing, is of paramount importance,” Byrnes explained, and added that crowdfunding can be as time consuming as grant applications. “But it''s a different kind of time. I find it actually quite satisfying,” he said.“Be scientific about it,” Perlstein advised. How many donors are needed to reach a funding goal? 
How many page views would be required accordingly, assuming a certain conversion rate? “If you approach a crowdfunding campaign methodically, it doesn''t guarantee success, but at least you implement best practices.” Perlstein is now an independent scientist renting laboratory space from the Molecular Sciences Institute, a non-profit research facility in Berkeley, CA, USA. “Academia and I were in a long-term relationship for over a decade but we broke up,” he explained. With federal and state funding flat or on a downwards trend, he sees his future in fundraising from patrons, venture philanthropists or disease foundations in addition to crowdfunding. Yet Perlstein remains an exception. Most scientists do not use crowdfunding as an alternative to normal funding opportunities, but rather as a supplement. A typical crowdfunding project nowadays would raise a few thousand dollars, which is enough to fund a student''s work for a summer or to buy some equipment [3].A typical crowdfunding project nowadays would raise a few thousand dollars, which is enough to fund a student''s work for a summer or to buy some equipmentCrowdfunding is also ideal to get new ideas off the ground, which was a key incentive to found Microryza. Cindy Wu''s experience with the academic funding system in graduate school taught her how difficult it was to get small grants for seed ideas. Together with her colleague Denny Luan she interviewed 100 scientists on the topic. “Every single person said there is always a seed idea they want to work on but it is difficult to get funding for early stage research,” she said. The two students concluded that crowdfunding would be able to fill that gap and a few months later set up Microryza. “Crowdfunding is a fantastic way to begin a project and collect preliminary data on something that might be a little risky but very exciting,” Byrnes said. “When you then write up a proposal for a larger governmentally funded grant you have evidence that you are doing outreach work and that you are bringing the results of your work to a broader audience.”Crowdfunded projects cover a wide range of research from ecology, medicine, physics and chemistry to engineering and economics. Some projects are pure basic science, such as investigating polo kinase in yeast (http://www.rockethub.com/projects/3753-cancer-yeast-has-answers), whereas others are applied, for instance developing a new method to clean up ocean oil spills (http://www.kickstarter.com/projects/cesarminoru/protei-open-hardware-oil-spill-cleaning-sailing-ro). Project creators may be students, professors or independent scientists, and research is carried out in universities, companies or hired laboratory space or outsourced to core facilities. Some projects aim to touch people''s heartstrings, such as saving butterflies (http://www.rockethub.com/projects/11903did-you-know-butterflies-have-std) and others address politically relevant topics, such as gun policy and safety (https://www.microryza.com/projects/gun-control-research-project).Some proposals have immediate relevance, such as the excavation of a triceratops skeleton to display it in the Seattle museum (https://www.microryza.com/projects/bring-a-triceratops-to-seattle). Backers can follow the project and see that the promise has been kept. For many projects in basic research, however, progress is much more abstract even if there are long-term goals, such as a cure for cancer, conservation strategies to save butterflies or so on. 
But will a non-scientist be able to evaluate the relevance of a particular project for such long-term goals? Will interested donors be able to judge whether these goals are within reach? Indeed, science crowdfunding has drawn criticism for its lack of peer review and has been accused of pushing scientists into overselling their research [1,4,5]. “There is a risk that it provides opportunities for scientists who are less than scrupulous to deceive the general public,” commented Stephen Curry, professor of structural biology at Imperial College, London, UK.…science crowdfunding has drawn criticism for its lack of peer review and has been accused of pushing scientists into overselling their researchMany scientific crowdfunding sites have systems in place to check the credibility of research proposals. “At #SciFund Challenge we have what we like to call a gentle peer review. If an undergraduate is promising to overthrow they theory of gravity we will have some questions about that,” explained Byrnes. Microryza would also not let any project pass. The team checks the proposal creator''s identity and evaluates whether the proposal addresses a scientific question and the project goals are within the capabilities of the researcher. “We plan to have some sort of crowd-sourced peer review sometime in the future,” said Wu. Other platforms, such as FundaGeek, have discussion forums where potential donors are encouraged to debate the merits of a proposal. As crowdfunding does not involve spending large amounts of public money, it might be an ideal way to try out new forms of peer review. According to Curry, however, there are important aspects of academic peer review that cannot be provided by these systems. “The advantage of grant committees considering many applications in competition with one another is that it allows the best ones to be selected. Details of prior work and expected feasibility are necessary to judge a project,” he said.Crowdfunding is selling science to the crowd, and, just like in any outreach activity, there might be cases of conveying projects too optimistically or overstating their impact. Yet, a main advantage of crowdfunding is that it allows donors to stay involved in projects and that it encourages direct interaction between scientists and non-scientists [3]. If a crowdfunding project does not live up to its promises, the donors will find out. “Microryza is really about sharing the discovery process directly with the donors,” Wu explained. “Every time something happens in the lab scientists post an update and an email goes out to all donors.” Perlstein also maintains close contact with his backers, having met many of them in person. “If we accept their money we are going to give them front row seats to the science,” he said. Research is a labour-intensive, slow process that includes technical difficulties and reconsideration of hypotheses, a fact that might come as a surprise to non-scientists. “We are actually doing a service here to enlighten the non-scientists that this is the rhythm of basic science,” said Perlstein.Crowdfunding of science has exploded in recent years, with funding goals becoming increasingly ambitious; some projects have attracted US$10,000–20,000 or even moreCrowdfunding is by no means a gold mine, with most research projects raising only a few thousand dollars. Byrnes, however, is optimistic that it will grow and inspire a larger crowd to get involved. “Now you see $10 million projects in gaming technology and the arts. That took some years to happen. 
We will get there, but we still have a lot to learn. I think science crowdfunding is still in the early growth phase,” he said.
As crowdfunding increases, scientists will find themselves confronted with some questions. The open sharing of the scientific process with a broader public is a key aspect of crowdfunded projects. In many cases, scientists make the primary record of a research project publicly available. What does this entail when it comes to publications or patents? “Most journals don't have a policy on open notebooks,” acknowledged Wu. Filing of patents could also become difficult if scientists have already made all their work and results public.
Crowdfunding also enables projects to be undertaken outside the academic system, where rules and regulations are less well defined. uBiome, a citizen science start-up, draws on crowds not only for funding but also for providing data. The company collected more than US$300,000 through Indiegogo to sequence the microbiome of its donors (http://www.indiegogo.com/projects/ubiome-sequencing-your-microbiome). Whereas academic biomedical research involving humans has to be reviewed by an independent ethics committee, this requirement would not apply to the uBiome project. “[P]rojects that don't want federal money, FDA approval, or to publish in traditional journals require no ethical review at all as far as we know,” Jessica Richman and Zachary Apte, cofounders of uBiome, wrote in an invited guest blog on Scientific American (http://blogs.scientificamerican.com/guest-blog/2013/07/22/crowdfunding-and-irbs-the-case-of-ubiome/). The researchers worked with an independent institutional review board to provide ethics oversight. Some crowdfunding websites, such as Microryza, make sure their researchers have approval from an institutional review board. Greater consistency is needed, however, to ensure that research is carried out according to ethics standards.
Crowdfunding is not a one-size-fits-all revenue stream for science. It might be easier to get support for ‘catchy’ topics than for investigation of molecular interactions or protein structures. Crowdfunding is not going to replace public funding either; rather, it would coexist as a more democratic form of philanthropy. But for those who embrace it, crowdfunding can be a rewarding experience. “I had a lot of fun being part of #SciFund—I got to meet a lot of other interesting scientists, I raised some money, and I learned a bit about working with journalists and science writers to get my ideas and results disseminated to the public,” Killgrove said. Crowdfunding provides an opportunity for public engagement, raises public awareness, and gives scientists an incentive to communicate their research to a broader public. “In many cases, scientists do not receive any real incentive for doing outreach work,” Byrnes said. “Crowdfunding can be seen as a means to reward them for their effort.”  相似文献   

15.
Blurring lines     
The research activities of direct-to-consumer genetic testing companies raise questions about consumers as research subjects.
The recent rise of companies that offer genetic testing directly to consumers, bypassing the traditional face-to-face consultation with a health-care professional, has created a steady stream of debate over the actual and potential value of these services (Hogarth et al, 2008). Despite the debates, however, the reality remains that these services are being offered and have genuine consequences for consumers. As opposed to the issues that have regularly been discussed regarding direct-to-consumer (DTC) genetic testing, the fact that these companies use consumers' data to perform research has been given relatively little attention. This omission is misconceived as this practice—within the wider realm of DTC genetic testing services—raises its own questions and concerns. In particular, it is blurring the line between consumers and research subjects, which threatens to undermine the public trust and confidence in genetic research that the scientific community has been trying to build over the past decades.
With this in mind, we analysed the websites—including informed consent forms and privacy policies—of five companies that offer DTC full genome testing: 23andMe, deCODE, Navigenics, Gene Essence—the genetic testing service offered by the company BioMarker Pharmaceuticals—and SeqWright. Two questions guided our study: Are consumers aware that the data generated by the company to fulfil the terms of their service will later be used for research? Even if this is the case, is the process of consent provided by companies ethically acceptable from the point of view of academic research?
As there are no empirical data available to answer the first question, we turned to the websites of the companies to understand how explicitly they present their research activities. At the time of the study—from July 2009 to January 2010—23andMe, deCODE and Navigenics candidly revealed on their websites that they conduct research using consumer data (Sidebar A). By contrast, SeqWright and Gene Essence provided what we identified as indirect and even ambiguous information about their research activities. For example, in a SeqWright online order form, the company notes: “Please volunteer any diseases from which you currently suffer (this can help us advance medical research by enabling us [sic] discover new SNP [single nucleotide polymorphism]/Disease associations)”. The information in Gene Essence's privacy policy was similarly vague (http://geneessence.com/our-labs/privacy-policy.html), stating that “electing to provide Optional Profile Information may enable the Company to advance the science of genetics and provide you with an even better understanding of who you are genetically”.

Sidebar A | Information provided by direct-to-consumer genetic testing companies*

23andMe: “You understand that your genetic and other contributed personal information will be stored in 23andMe research databases, and authorized personnel of 23andMe will conduct research using said databases.” (https://www.23andme.com/about/consent; accessed 29 January 2010)
deCODE: “Information that you provide about yourself under the security of your account and privacy of your chosen username may be used by deCODEme only to gather statistical aggregate information about the users of the deCODEme website. Such analysis may include information that we would like to be able to report back to you and other users of deCODEme, such as in counting the number of users grouped by gender or age, or associating genetic variants with any of the self-reported user attributes. In any such analyses and in presenting any such statistical information, deCODE will ensure that user identities are not exposed.” (http://www.decodeme.com/faq; accessed 29 January 2010)
Navigenics: “Navigenics is continuously improving the quality of our service, and we strive to contribute to scientific and medical research. To that end, we might de-link Your Genetic Data and Your Phenotype Information and combine it with other members' information so that we can perform research to: […] Discover or validate associations between certain genetic variations and certain health conditions or traits, as well as other insights regarding human health.” (http://www.navigenics.com/visitor/what_we_offer/our_policies/informed_consent/health_compass; accessed 29 January 2010)
*See main text for information from SeqWright and Gene Essence.
If, as appears to be the case, these statements are the only declarations offered by these two companies alluding to their presumed research activities, it is virtually impossible for consumers to understand that their data will be used for research purposes. Moreover, despite the fact that the three other companies do state that they conduct research using consumer genotypes, even their declarations still give cause for concern. For instance, both Navigenics and deCODE ‘tuck away’ most of the information in their terms of service agreements, privacy policies, or in the informed consent sections of their websites. This is worrisome, as most consumers do not even read and/or understand the ‘legalese’ or ‘small print’ when signing online forms (ICO, 2008).
Even when a company is relatively transparent about its research activities, one might still be concerned by a lack of consumer awareness of these activities. Between July and September 2009, 23andMe offered a new service called the “23andMe research edition”, which was prominently displayed on the company website. This version of their service, which was part of what the company calls the “23andMe research revolution”, was offered for US$99—one-quarter of the price of their traditional personal genome scan—and it offered less information to consumers than the “traditional” service. 
For instance, the abridged research edition offered no information about carrier status, pharmacogenomics or ancestry, nor could the customer browse or download the raw genomic data (https://www.23andme.com/researchrevolution/compare).
At a glance, it seemed that 23andMe were marketing the “research edition” as a more affordable option, owing to the fact that the consumers were being given less information and because its name implied that the data would be used for research. Granted, the company did not explicitly express this last assumption, but the term “research edition” could have easily led consumers to this conclusion. However, what is particularly troubling about the two options—“research edition” and “traditional”, presented as distinct products—is that the consent forms for both services were identical. The issue is therefore whether, by calling one option “research edition”, 23andMe made it less clear to individuals purchasing the “traditional” service that their data would also be used for research purposes.
Even if we were assured that consumers are at least aware of the research being conducted, we must still ask whether the companies obtain consent that is adequate compared with that required from volunteers for similar research studies. To answer this question, we considered official guidelines covering consent, public views on the topic and information gleaned from the websites of DTC genetic testing companies.
Concerning public opinion, many studies show that participants who have agreed to have their tissue used for one type of research do not necessarily automatically agree to take part in other studies (Goodson & Vernon, 2004; Schwartz et al, 2001). Furthermore, in a survey of more than 1,000 patients, 72% considered it important to be notified when leftover blood taken for clinical use was to be used for research (Hull et al, 2008). Most of those patients who wanted to be notified would require the researchers to get permission for other research (Hull et al, 2008).
Although some of the companies in our study do mention the diseases that they might study, they are not specific and do not describe the scope of the research that will be done. Indeed, beyond the initial customer signature required to complete the purchase of the genetic testing service, it is not always clear whether the companies would ever contact consumers to obtain explicit consent for internally conducted research. That said, if they were to send out surveys or questionnaires to request supplementary phenotype information, and consumers were to fill out and return those forms, the companies might consider this as consent to research. We would argue, however, that this blurs the line between individuals as consumers and as research participants: requesting additional information could still be understood by consumers as an additional service that they purchased and not an explicit invitation to take part in research.
The issue of the identifiability of genomic data is inextricably related to the issue of consent as “[p]romises of anonymity and privacy are important to a small but significant proportion of potential participants” (Andrews, 2009). 
In the study performed by Hull and colleagues, 23% of participants differentiated between scenarios where samples and data were stored anonymously or with identifiers (Hull et al, 2008). The issue of anonymity is particularly important under the US Common Rule definition of ‘human subject’ research (HHS, 2009). It dictates that research conducted using samples from people who cannot be identified is not considered human subject research and as such does not require consent. Although this rule applies only to federally funded research, it might become pertinent if companies collaborate with publicly funded institutions, such as universities. More generally, regulations such as the Common Rule and the US Food and Drug Administration's regulations for the protection of human subjects highlight the importance of the protection of individuals in research. Research activities conducted by companies selling DTC genetic tests should therefore be similarly transparent and accountable to a regulatory body.
On the basis of the information from the websites of the companies we surveyed, it is not unambiguously clear whether the data used in their research is anonymized or not. That said, 23andMe claims it will keep consumers informed of future advancements in science and might ask them for additional phenotype information, suggesting that it maintains the link between genotype data and the personal information of its customers. As such, research conducted by 23andMe could be considered to involve human subjects. Thus, if 23andMe were to comply voluntarily with the Common Rule, they would have to obtain adequate informed consent.
Even in cases in which data or samples are anonymized, studies show that people do care about what happens to their sample (Hull et al, 2008; Schwartz et al, 2001). Furthermore, it is becoming more and more apparent that there are intrinsic limits to the degree of protection that can be achieved through sample and data de-identification and anonymization in genomic research (Homer et al, 2008; Lin et al, 2004; McGuire & Gibbs, 2006; P3G Consortium et al, 2009). This further weakens the adequacy of companies obtaining broad-sense consent from consumers who, most probably, are not even aware that research is being conducted.
The European Society of Human Genetics (ESHG) has recently issued a statement on DTC genetic testing for health-related purposes, which states that “[t]he ESHG is concerned with the inadequate consent process through which customers are enrolled in such research. If samples or data are to be used in any research, this should be clear to consumers, and a separate and unambiguous consent procedure should take place” (ESHG, 2010). Another document was recently drafted by the UK Human Genetics Commission (HGC), entitled ‘Common Framework of Principles for Direct-to-Consumer Genetic Testing Services’ (HGC, 2009). The principles were written with the intention of promoting high standards and consistency in the DTC genetic testing market and to protect the interests of consumers and their families. 
Although this document is not finalized and the principles themselves cannot control or regulate the market in any tangible way, this framework, along with the ESHG statement, constitutes the most up-to-date and exhaustive documents addressing DTC genetic testing activities.
Principle 4.5 states: “If a test provider intends to use a consumer's biological samples and/or associated personal or genetic data for research purposes, the consumer should be informed whether the research has been approved by a research ethics committee or other competent authority, whether the biological sample and data will be transferred to or kept in a biobank or database, and about measures to ensure the security of the sample. The consumer should be informed of any risks or potential benefits associated with participating in the research.” Principle 5.6 of the HGC's draft states that a “[s]eparate informed consent should be requested by the test provider before biological samples are used for any secondary purposes, e.g. research, or before any third party is permitted access to biological samples. Consumers' biological samples and personal and genetic data should only be used for research that has been approved by a research ethics committee (REC) or other relevant competent authority.”
None of the companies we surveyed reveal on their websites whether internal research protocols have been approved by a REC or by an independent “competent authority”. Furthermore, no such independent body exists that deals specifically with the research activities of commercial companies selling DTC genetic tests. Additionally, if a company did claim to have internal ethical oversight, it would be questionable whether such a committee would really have any power to veto or change the company's research activities.
Moreover, while all five companies do state what will happen to the DNA sample—in most cases, unless asked otherwise by the consumer, the DNA sample will be destroyed shortly after testing—not enough is revealed about what will happen to the data. Some companies say where data is kept and comment on the security of the website, but as mentioned previously, companies are not clear about whether data will be anonymized. Traditionally, a great deal of focus has been placed on the fate and storage of biological samples, but genome-wide testing of hundreds of thousands of individuals for thousands or even millions of SNPs generates a lot of data. This information is not equivalent, of course, to a full genome sequence, but it can fuel numerous genomic studies in the immediate and medium-term future. As such, additional issues above and beyond basic informed consent also become a concern. For instance, what will happen to the data if a company goes bankrupt or is sold? Will the participants be sent new consent forms if the nature of the company or the research project changes drastically?
The activities of companies offering DTC genetic testing have not only blurred the lines between medical services and consumer products, but also between these two activities and research. As a consequence, the appropriate treatment and autonomy of individuals who purchase DTC genetic testing services could be undermined. 
Paramount to this issue is the fact that companies should be completely transparent with the public about whether people purchasing their tests are consumers or research subjects or both. Although an individual who reads through the websites of such companies might be considered a simple ‘browser’ of the website, once the terms and conditions are signed—irrespective of an actual reading or comprehension—the curious consumer becomes a client and a research subject.
Companies using consumer samples and data to conduct research are in essence creating databases of information that can be mined and studied in the same way as biobanks and databases generated by academic institutions. As such, consumers who become research participants should be treated with the same respect and under the same norms as those involved in biobank research. As stated by the Organization for Economic Co-operation and Development, research should “respect the participants and be conducted in a manner that upholds human dignity, fundamental freedoms and human rights and be carried out by responsible researchers” (OECD, 2009). On the basis of our analysis of the websites of five companies offering DTC full genome testing, there is little evidence that the participation of ‘consumers’ in research is fully informed.
The analysis of company websites was conducted in 2009 and early 2010. The information offered to consumers by the companies mentioned in this Outlook might have changed following the study's completion or the article's publication.
Heidi C. Howard, Pascal Borry, Bartha Maria Knoppers  相似文献   

16.
17.
Martinson BC 《EMBO reports》2011,12(8):758-762
Universities have been churning out PhD students to reap financial and other rewards for training biomedical scientists. This deluge of cheap labour has created unhealthy competition, which encourages scientific misconduct.
Most developed nations invest a considerable amount of public money in scientific research for a variety of reasons: most importantly because research is regarded as a motor for economic progress and development, and to train a research workforce for both academia and industry. Not surprisingly, governments are occasionally confronted with questions about whether the money invested in research is appropriate and whether taxpayers are getting the maximum value for their investments.
The training and maintenance of the research workforce is a large component of these investments. Yet discussions in the USA about the appropriate size of this workforce have typically been contentious, owing to an apparent lack of reliable data to tell us whether the system yields academic ‘reproduction rates’ that are above, below or at replacement levels. In the USA, questions about the size and composition of the research workforce have historically been driven by concerns that the system produces an insufficient number of scientists. As Donald Kennedy, then Editor-in-Chief of Science, noted several years ago, leaders in prestigious academic institutions have repeatedly rung alarm bells about shortages in the science workforce. Less often does one see questions raised about whether too many scientists are being produced, or concerns about unintended consequences that may result from such overproduction. Yet, recognizing that resources are finite, it seems reasonable to ask what level of competition for resources is productive, and at what level competition becomes counter-productive.
Finding a proper balance between the size of the research workforce and the resources available to sustain it has other important implications. Unhealthy competition—too many people clamouring for too little money and too few desirable positions—creates its own problems, most notably research misconduct and lower-quality, less innovative research. If an increasing number of scientists are scrambling for jobs and resources, some might begin to cut corners in order to gain a competitive edge. Moreover, many in the science community worry that every publicized case of research misconduct could jeopardize those resources, if politicians and taxpayers become unwilling to invest in a research system that seems to be riddled with fraud and misconduct.
The biomedical research enterprise in the USA provides a useful context in which to examine the level of competition for resources among academic scientists. My thesis is that the system of publicly funded research in the USA as it is currently configured supports a feedback system of institutional incentives that generate excessive competition for resources in biomedical research. These institutional incentives encourage universities to overproduce graduate students and postdoctoral scientists, who are both trainees and a cheap source of skilled labour for research while in training. 
However, once they have completed their training, they become competitors for money and positions, thereby exacerbating competitive pressures.
The resulting scarcity of resources, partly through its effect on peer review, leads to a shunting of resources away from both younger researchers and the most innovative ideas, which undermines the effectiveness of the research enterprise as a whole. Faced with an increasing number of grant applications and the consequent decrease in the percentage of projects that can be funded, reviewers tend to ‘play it safe’ and favour projects that have a higher likelihood of yielding results, even if the research is conservative in the sense that it does not explore new questions. Resource scarcity can also introduce unwanted randomness to the process of determining which research gets funded. A large group of scientists, led by a cancer biologist, has recently mounted a campaign against a change in a policy of the National Institutes of Health (NIH) to allow only one resubmission of an unfunded grant proposal (Wadman, 2011). The core of their argument is that peer reviewers are likely able to distinguish the top 20% of research applications from the rest, but that within that top 20%, distinguishing the top 5% or 10% means asking peer reviewers for a level of precision that is simply not possible. With funding levels in many NIH institutes now within that 5–10% range, the argument is that reviewers are being forced to choose at random which excellent applications do and do not get funding. In addition to the inefficiency of overproduction and excessive competition in terms of their costs to society and opportunity costs to individuals, these institutional incentives might undermine the integrity and quality of science, and reduce the likelihood of breakthroughs.
My colleagues and I have expressed such concerns about workforce dynamics and related issues in several publications (Martinson, 2007; Martinson et al, 2005, 2006, 2009, 2010). Early on, we observed that, “missing from current analyses of scientific integrity is a consideration of the wider research environment, including institutional and systemic structures” (Martinson et al, 2005). Our more recent publications have been more specific about the institutional and systemic structures concerned. It seems that at least a few important leaders in science share these concerns.
In April 2009, the NIH, through the National Institute of General Medical Sciences (NIGMS), issued a request for applications (RFA) calling for proposals to develop computational models of the research workforce (http://grants.nih.gov/grants/guide/rfa-files/RFA-GM-10-003.html). 
Although such an initiative might be premature given the current level of knowledge, the rationale behind the RFA seems irrefutable: “there is a need to […] pursue a systems-based approach to the study of scientific workforce dynamics.” Roughly four decades after the NIH appeared on the scene, this is, to my knowledge, the first official, public recognition that the biomedical workforce tends not to conform nicely to market forces of supply and demand, despite the fact that others have previously made such arguments.
Early last year, Francis Collins, Director of the NIH, published a PolicyForum article in Science, voicing many of the concerns I have expressed about specific influences that have led to growth rates in the science workforce that are undermining the effectiveness of research in general, and biomedical research in particular. He notes the increasing stress in the biomedical research community after the end of the NIH “budget doubling” between 1998 and 2003, and the likelihood of further disruptions when the American Recovery and Reinvestment Act of 2009 (ARRA) funding ends in 2011. Arguing that innovation is crucial to the future success of biomedical research, he notes the tendency towards conservatism of the NIH peer-review process, and how this worsens in fiscally tight times. Collins further highlights the ageing of the NIH workforce—as grants increasingly go to older scientists—and the increasing time that researchers are spending in itinerant and low-paid postdoctoral positions as they stack up in a holding pattern, waiting for faculty positions that may or may not materialize. Having noted these challenging trends, and echoing the central concerns of a 2007 Nature commentary (Martinson, 2007), he concludes that “…it is time for NIH to develop better models to guide decisions about the optimum size and nature of the US workforce for biomedical research. A related issue that needs attention, though it will be controversial, is whether institutional incentives in the current system that encourage faculty to obtain up to 100% of their salary from grants are the best way to encourage productivity.”
Similarly, Bruce Alberts, Editor-in-Chief of Science, writing about incentives for innovation, notes that the US biomedical research enterprise includes more than 100,000 graduate students and postdoctoral fellows. He observes that “only a select few will go on to become independent research scientists in academia”, and argues that “assuming that the system supporting this career path works well, these will be the individuals with the most talent and interest in such an endeavor” (Alberts, 2009).
His editorial is not concerned with what happens to the remaining majority, but argues that even among the select few who manage to succeed, the funding process for biomedical research “forces them to avoid risk-taking and innovation”. The primary culprit, in his estimation, is the conservatism of the traditional peer-review system for federal grants, which values “research projects that are almost certain to ‘work’”. 
He continues, “the innovation that is essential for keeping science exciting and productive is replaced by […] research that has little chance of producing the breakthroughs needed to improve human health.”
Although I believe his assessment of the symptoms is correct, I think he has misdiagnosed the cause, in part because he has failed to identify which influence he is concerned with from the network of influences in biomedical research. To contextualize the influences of concern to Alberts, we must consider the remaining majority of doctorally trained individuals so easily dismissed in his editorial, and further examine what drives the dynamics of the biomedical research workforce.
Labour economists might argue that market forces will always balance the number of individuals with doctorates with the number of appropriate jobs for them in the long term. Such arguments would ignore, however, the typical information asymmetry between incoming graduate students, whose knowledge about their eventual job opportunities and career options is by definition far more limited than that of those who run the training programmes. They would also ignore the fact that universities are generally not confronted with the externalities resulting from overproduction of PhDs, and have positive financial incentives that encourage overproduction. During the past 40 years, NIH ‘extramural’ funding has become crucial for graduate student training, faculty salaries and university overheads. For their part, universities have embraced NIH extramural funding as a primary revenue source that, for a time, allowed them to implement a business model based on the interconnected assumptions that, as one of the primary ‘outputs’ or ‘products’ of the university, more doctorally trained individuals are always better than fewer, and because these individuals are an excellent source of cheap, skilled labour during their training, they help to contain the real costs of faculty research.
However, it has also made universities increasingly dependent on NIH funding. As recently documented by the economist Paula Stephan, most faculty growth in graduate school programmes during the past decade has occurred in medical colleges, with the majority—more than 70%—in non-tenure-track positions. Arguably, this represents a shift of risk away from universities and onto their faculty. Despite perennial cries of concern about shortages in the research workforce (Butz et al, 2003; Kennedy et al, 2004; National Academy of Sciences et al, 2005), a number of commentators have recently expressed concerns that the current system of academic research might be overbuilt (Cech, 2005; Heinig et al, 2007; Martinson, 2007; Stephan, 2007). Some explicitly connect this to structural arrangements between the universities and NIH funding (Cech, 2005; Collins, 2007; Martinson, 2007; Stephan, 2007).
In 1995, David Korn pointed out what he saw as some problematic aspects of the business model employed by Academic Medical Centers (AMCs) in the USA during the past few decades (Korn, 1995). 
He noted the reliance of AMCs on the relatively low-cost, but highly skilled labour represented by postdoctoral fellows, graduate students and others—who quickly start to compete with their own professors and mentors for resources. Having identified the economic dependence of the AMCs on these inexpensive labour pools, he noted additional problems with the graduate training programmes themselves. “These programs are […] imbued with a value system that clearly indicates to all participants that true success is only marked by the attainment of a faculty position in a high-profile research institution and the coveted status of principal investigator on NIH grants.” Pointing to “more than 10 years of severe supply/demand imbalance in NIH funds”, Korn concluded that, “considering the generative nature of each faculty mentor, this enterprise could only sustain itself in an inflationary environment, in which the society's investment in biomedical research and clinical care was continuously and sharply expanding.” From 1994 to 2003, total funding for biomedical research in the USA increased at an annual rate of 7.8%, after adjustment for inflation. The comparable rate of growth between 2003 and 2007 was 3.4% (Dorsey et al, 2010). These observations resonate with the now classic observation by Derek J. de Solla Price, from more than 30 years before, that growth in science frequently follows an exponential pattern that cannot continue indefinitely; the enterprise must eventually come to a plateau (de Solla Price, 1963).
In May 2009, echoing some of Korn's observations, Nobel laureate Roald Hoffmann caused a stir in the US science community when he argued for a “de-coupling” of the dual roles of graduate students as trainees and cheap labour (Hoffmann, 2009). His suggestion was to cease supporting graduate students with faculty research grants, and to use the money instead to create competitive awards for which graduate students could apply, making them more similar to free agents. During the ensuing discussion, Shirley Tilghman, president of Princeton University, argued that “although the current system has succeeded in maximizing the amount of research performed […] it has also degraded the quality of graduate training and led to an overproduction of PhDs in some areas. Unhitching training from research grants would be a much-needed form of professional ‘birth control’” (Mervis, 2009).
Although the issue of what I will call the ‘academic birth rate’ is the central concern of this analysis, the ‘academic end-of-life’ also warrants some attention. The greying of the NIH research workforce is another important driver of workforce dynamics, and it is integrally linked to the fate of young scientists. A 2008 news item in Science quoted then 70-year-old Robert Wells, a molecular geneticist at Texas A&M University: “if I and other old birds continue to land the grants, the [young scientists] are not going to get them.” He worries that the budget will not be able to support the 100 people “I've trained […] to replace me” (Kaiser, 2008). While his claim of 100 trainees might be astonishing, it might be more astonishing that his was the outlying perspective. 
The majority of senior scientists interviewed for that article voiced intentions to keep doing science—and going after NIH grants—until someone forced them to stop or they died.
Some have looked at the current situation with concern, primarily because of the threats it poses to the financial and academic viability of universities (Korn, 1995; Heinig et al, 2007; Korn & Heinig, 2007), although most of those who express such concerns have been distinctly reticent to acknowledge the role of universities in creating and maintaining the situation. Others have expressed concerns about the differential impact of extreme competition and meagre job prospects on the recruitment, development and career survival of young and aspiring scientists (Freeman et al, 2001; Kennedy et al, 2004; Martinson et al, 2006; Anderson et al, 2007a; Martinson, 2007; Stephan, 2007). There seems to be little disagreement, however, that the system has generated excessively high competition for federal research funding, and that this threatens to undermine the very innovation and production of knowledge that is its raison d'être.
The production of knowledge in science, particularly of the ‘revolutionary’ variety, is generally not a linear input–output process with predictable returns on investment, clear timelines and high levels of certainty (Lane, 2009). On the contrary, it is arguable that “revolutionary science is a high risk and long-term endeavour which usually fails” (Charlton & Andras, 2008). Predicting where, when and by whom breakthroughs in understanding will be produced has proven to be an extremely difficult task. In the face of such uncertainty, and denying the realities of finite resources, some have argued that the best bet is to maximize the number of scientists, using that logic to justify a steady-state production of new PhDs, regardless of whether the labour market is sending signals of increasing or decreasing demand for that supply. Only recently have we begun to explore the effects of the current arrangement on the process of knowledge production, and on innovation in particular (Charlton & Andras, 2008; Kolata, 2009).
Bruce Alberts, in the above-mentioned editorial, points to several initiatives launched by the NIH that aim to get a larger share of NIH funding into the hands of young scientists with particularly innovative ideas. These include the “New Innovator Award”, the “Pioneer Award” and the “Transformational R01 Awards”. The proportion of NIH funding dedicated to these awards, however, amounts to “only 0.27% of the NIH budget” (Alberts, 2009). Such a small proportion of the NIH budget does not seem likely to generate a large amount of more innovative science. 
Moreover, to the extent that such initiatives actually succeed in enticing more young investigators to become dependent on NIH funds, any benefit these efforts have in terms of innovation may be offset by further increases in competition for resources that will come when these new ‘innovators’ reach the end of this specialty funding and add to the rank and file of those scrapping for funds through the standard mechanisms.
Our studies on research integrity have been mostly oriented towards understanding how the influences within which academic scientists work might affect their behaviour, and thus the quality of the science they produce (Anderson et al, 2007a, 2007b; Martinson et al, 2009, 2010). My colleagues and I have focused on whether biomedical researchers perceive fairness in the various exchange relationships within their work systems. I am persuaded by the argument that expectations of fairness in exchange relationships have been hard-wired into us through evolution (Crockett et al, 2008; Hsu et al, 2008; Izuma et al, 2008; Pennisi, 2009), with the advent of modern markets being a primary manifestation of this. Thus, violations of these expectations strike me as potentially corrupting influences. Such violations might be prime motivators for ill will, possibly engendering bad-faith behaviour among those who perceive themselves to have been slighted, and therefore increasing the risk of research misconduct. They might also corrupt the enterprise by signalling to talented young people that biomedical research is an inhospitable environment in which to develop a career, possibly chasing away some of the most talented individuals, and encouraging a selection of characteristics that might not lead to optimal effectiveness, in terms of scientific innovation and productivity (Charlton, 2009).
To the extent that we have an ecology with steep competition that is fraught with high risks of career failure for young scientists after they incur large costs of time, effort and sometimes financial resources to obtain a doctoral degree, why would we expect them to take on the additional, substantial risks involved in doing truly innovative science and asking risky research questions? And why, in such a cut-throat setting, would we not anticipate an increase in corner-cutting, and a corrosion of good scientific practice, collegiality, mentoring and sociability? Would we not also expect a reduction in high-risk, innovative science, and a reversion to a more career-safe type of ‘normal’ science? Would this not reduce the effectiveness of the institution of biomedical research? I do not claim to know the conditions needed to maximize the production of research that is novel, innovative and conducted with integrity. I am fairly certain, however, that putting scientists in tenuous positions in which their careers and livelihoods would be put at risk by pursuing truly revolutionary research is one way to insure against it.  相似文献   

18.
The Women in Cell Biology (WICB) committee of the American Society for Cell Biology (ASCB) was started in the 1970s in response to the documented underrepresentation of women in academia in general and cell biology in particular. By coincidence or causal relationship, I am happy to say that since WICB became a standing ASCB committee, women have been well represented in ASCB's leadership and as symposium speakers at the annual meeting. However, the need to provide opportunities and information useful to women in developing their careers in cell biology is still vital, given the continuing bias women face in the larger scientific arena. With its emphasis on mentoring, many of WICB's activities benefit the development of both men and women cell biologists. The WICB “Career Column” in the monthly ASCB Newsletter is a source of accessible wisdom. At the annual ASCB meeting, WICB organizes the career discussion and mentoring roundtables, childcare awards, Mentoring Theater, career-related panel and workshop, and career recognition awards. Finally, the WICB Speaker Referral Service provides a list of outstanding women whom organizers of scientific meetings, scientific review panels, and university symposia/lecture series can reach out to when facing the proverbial dilemma, “I just don't know any women who are experts.”
Although women are approaching parity in earning PhD and MD degrees, studies of their underrepresentation in academia, as principal investigators in funded science and in leadership positions, have led to the conclusion that gender schemas (Valian, 1999) work against women and diminish their success. This picture is supported by the National Academy of Sciences' (NAS) report Beyond Bias and Barriers, which concludes that “Neither our academic institutions nor our nation can afford such underuse of precious human capital in science and engineering” (NAS, 2007). A New Yorker cartoon captures the scene at too many scientific gatherings (Figure 1).
FIGURE 1: “The subject of tonight's discussion is: Why are there no women on this panel?” Cartoon by David Sipress from The New Yorker Collection, www.cartoonbank.com. Used under Rights Managed License. Copyright holder Conde Nast.
The Women in Cell Biology (WICB) committee was started in the early 1970s with notices of ad hoc meetings posted in women's washrooms during the American Society for Cell Biology (ASCB) annual meeting and a mimeographed newsletter (Williams, 1996a, 1996b). One goal for WICB's “founding mothers” was to achieve more equitable representation as participants within ASCB, including more accurate representation of women within the ASCB leadership and as speakers. Consonant with this goal, WICB was delighted to become a standing committee of the ASCB in 1992. In the 30 years prior to this watershed date, only 13% of ASCB presidents were women. Since 1992, 50% of ASCB presidents have been women. I cannot determine whether this is causal, reflective of a third variable, or pure coincidence. But it is remarkable.
The number of women leaders and speakers within ASCB suggests that WICB's initial goal of more accurate representation of women has been largely achieved. However, the goals of helping women cell biologists successfully juggle career and family, find mentors, and achieve gender equity in job placement continue to be challenges. 
We have developed multiple WICB-sponsored activities throughout the year, and especially at the annual ASCB meeting, to give cell biologists tools with which to meet these challenges.  相似文献   

19.
20.
Brothers in arms     
Andrea Rinaldi 《EMBO reports》2013,14(10):866-870
The horrific injuries and difficult working conditions faced by military medical personnel have forced the military to fund biomedical research to treat soldiers; those new technologies and techniques contribute significantly to civilian medicine.
War is the father of all things, Heraclitus believed. The military's demand for better weapons and transportation, as well as tools for communication, detection and surveillance, has driven technological progress during the past 150 years or so, producing countless civilian applications as a fallout. The military has invested heavily into high-energy physics, materials science, navigation systems and cryptology. Similarly, military-funded biomedical research encompasses the whole range from basic to applied research programmes (Fig 1), and the portion of military-funded research in the biological and medical fields is now considerable.
Figure 1: 1944 advertisement for Diebold Inc. (Ohio, USA) in support of blood donations for soldiers wounded in the Second World War. The military has traditionally been one of the greatest proponents of active research on synthetic blood production, blood substitutes and oxygen therapeutics for treating battlefield casualties. One recent approach in this direction is The Defense Advanced Research Projects Agency's (DARPA's) Blood Pharming programme, which plans to use human haematopoietic stem cells—such as those obtained from umbilical cord blood—as a “starting material to develop an automated, fieldable cell culture and packaging system capable of producing transfusable amounts of universal donor red blood cells” (http://www.darpa.mil/Our_Work/DSO/Programs/Blood_Pharming.aspx).
War has always driven medical advances. From ancient Roman to modern times, treating the wounds of war has yielded surgical innovations that have been adopted by mainstream medicine. For example, the terrible effect of modern artillery on soldiers in the First World War was a major impetus for plastic surgery. Similarly, microbiology has benefited from war and military research: from antiseptics to prevent and cure gangrene to the massive production of penicillin during the Second World War, as well as more basic research into a wide range of pathogens, militaries worldwide have long been enthusiastic sponsors of microbiology research. Nowadays, military-funded research on pathogens uses state-of-the-art genotyping methods to study outbreaks and the spread of infection and seeks new ways to combat antibiotic resistance that afflicts both combatants and civilians.
The US Military Infectious Diseases Research Program (MIDRP) is particularly active in vaccine development to protect soldiers, especially those deployed overseas. Its website notes that: “Since the passing of the 1962 Kefauver–Harris Drug Amendment, which added the FDA requirement for proof of efficacy in addition to proof of safety for human products, there have been 28 innovative vaccines licenced in the US, including 13 vaccines currently designated for paediatric use. These 28 innovative vaccine products targeted new microorganisms, utilized new technology, or consisted of novel combinations of vaccines. Of these 28, the US military played a significant role in the development of seven licenced vaccines” (https://midrp.amedd.army.mil/). 
These successes include the tetravalent meningococcal vaccine and the oral typhoid vaccine, while current research is looking into the development of vaccines against malaria, dengue fever and hepatitis E.
Similarly, the US Military HIV Research Program (MHRP) is working on the development of a global HIV-1 vaccine (http://www.hivresearch.org). MHRP scientists were behind the RV144 vaccine study in Thailand—the largest ever HIV vaccine study conducted in humans—that demonstrated that the vaccine was capable of eliciting modest and transient protection against the virus [1]. In the wake of the cautious optimism raised by the trial, subsequent research is providing insights into the workings of RV144 and is opening new doors for vaccine designers to strengthen the vaccine. In a recent study, researchers isolated four monoclonal antibodies induced by the RV144 vaccine and directed at a particular region of the HIV virus envelope associated with reduced infection, the variable region 2. They found that these antibodies recognized HIV-1-infected CD4(+) T cells and tagged them for attack by the immune system [2].
In response to the medical problems military personnel are suffering in Iraq and Afghanistan, a recent clinical trial funded by the US Department of the Army demonstrated the efficacy of the aminoglycoside antibiotic paromomycin—either with or without gentamicin—for the topical treatment of cutaneous leishmaniasis, the most common form of infection by Leishmania parasites. Cutaneous leishmaniasis—which is endemic in Iraq and Afghanistan and rather frequent among soldiers deployed there—is transmitted to humans through the bite of infected sandflies: it causes skin ulcers and sores and can cause serious disability and social prejudice [3]. Topical treatments would offer advantages over therapies that require the systemic administration of antiparasitic drugs. The study—a phase 3 trial—was conducted in Tunisia and enrolled some 375 patients with one to five ulcerative lesions from cutaneous leishmaniasis. Patients, all aged between 5 and 65, received topical applications of a cream containing either 15% paromomycin with 0.5% gentamicin, 15% paromomycin alone or the control cream, which contained no antibiotic. The combination of paromomycin and gentamicin cured cutaneous leishmaniasis with an efficacy of 81%, compared with 82% for paromomycin alone and just 58% for the control—the skin sores of cutaneous leishmaniasis often heal on their own. Researchers reported no adverse reactions to paromomycin-containing creams. Because the combination therapy with gentamicin is probably effective against a larger range of Leishmania parasitic species and strains causing the disease, it could become a first-line treatment for cutaneous leishmaniasis on a global scale, the authors concluded [3].
Not surprisingly, trauma and regenerative and reconstructive medicine are other large areas of research in which military influence is prevalent. The treatment of wounds, shock and the rehabilitation of major trauma patients are the very essence of medical aid on the battlefield (Figs 2, 3). “Our experience of military conflict, in particular the medicine required to deal with severe injuries, has led to significant improvements in trauma medicine. 
Through advances in the prevention of blood loss and the treatment of coagulopathy for example, patients are now surviving injuries that 5–10 years ago would have been fatal,” said Professor Janet Lord, who leads a team investigating the inflammatory response in injured soldiers at the National Institute for Health Research Surgical Reconstruction and Microbiology Research Centre (NIHR SRMRC) in Birmingham, UK (http://www.srmrc.nihr.ac.uk/).
Figure 2: Medical services in Britain, 1917. Making an artificial leg for a wounded serviceman at Roehampton Hospital in Surrey. This image is from The First World War Poetry Digital Archive, University of Oxford (www.oucs.ox.ac.uk/ww1lit). Copyright: The Imperial War Museum.
Figure 3: US soldiers use the fireman's carry to move a simulated casualty to safety during a hyper-realistic training environment, known as trauma lanes, as part of the final phase of the Combat Lifesaver Course given by medics from Headquarters Support Company, Headquarters and Headquarters Battalion, 25th Inf. Div., USD-C, at Camp Liberty, Iraq, March 2011. Credit: US Army, photo by Sgt Jennifer Sardam.
NIHR SRMRC integrates basic researchers at Birmingham University with clinicians and surgeons at the Royal Centre for Defence Medicine and University Hospital Birmingham to improve the treatment of traumatic injury in both military and civilian patients. As Lord explained, the centre has two trauma-related themes. The first is looking at “[t]he acute response to injury, which analyses the kinetics and nature of the inflammatory response to tissue damage and is developing novel therapies to ensure the body responds appropriately to injury and does not stay in a hyper-inflamed state. The latter is a particular issue with older patients whose chance of recovery from trauma is significantly lower than younger patients,” she said. The second theme is “[n]eurotrauma and regeneration, which studies traumatic brain injury, trying to develop better ways to detect this early to prevent poor outcomes if it goes undiagnosed,” Lord said.
Kevlar helmets and body armour have saved the lives of many soldiers, but they offer little protection to the face and eyes, and little protection in general against blasts to the head. Because human retinas and brains show little potential for regeneration, patients with face and eye injuries often suffer from loss of vision and other consequences for the rest of their lives. However, a new stem cell and regenerative approach for the treatment of retinal injury and blindness is on the horizon. “Recent progress in stem cell research has begun to emerge on the possible exploitation of stem cell-based strategies to repair the damaged CNS (central nervous system). In particular, research from our laboratory and others have demonstrated that Müller cells—dormant stem-like cells found throughout the retina—can serve as a source of retinal progenitor cells to regenerate photoreceptors as well as all other types of retinal neurons,” explained Dong Feng Chen at the Schepens Eye Research Institute, Massachusetts Eye and Ear of the Harvard Medical School in Boston (Massachusetts, USA). 
In collaboration with the US Department of Defence, the Schepens Institute is steering the Military Vision Research Program, “to develop new ways to save the vision of soldiers injured on today's battlefield and to push the frontier of vision technologies forward” (http://www.schepens.harvard.edu).
“My laboratory has shown that adult human and mouse Müller cells can not only regenerate retina-specific neurons, but can also do so following induction by a single small molecule compound, alpha-aminoadipate,” Chen explained. She said that alpha-aminoadipate causes isolated Müller glial cells in culture to lose their glial phenotype, express progenitor cell markers and divide. Injection of alpha-aminoadipate into the subretinal space of adult mice in vivo induces mature Müller glia to de-differentiate and generate new retinal neurons and photoreceptor cells [4]. “Our current effort seeks to further elucidate the molecular pathways underlying the regenerative behaviour of Müller cells and to achieve functional regeneration of the damaged retina with small molecule compounds,” Chen said. “As the retina has long served as a model of the CNS, and Müller cells share commonalities with astroglial lineage cells in the brain and spinal cord, the results of this study can potentially be broadened to future development of treatment strategies for other neurodegenerative diseases, such as brain and spinal cord trauma, or Alzheimer and Parkinson disease.”
Brain injuries account for a large percentage of the wounds sustained by soldiers. The Defense Advanced Research Projects Agency (DARPA), an agency of the US Department of Defense, recently awarded US$6 million to a team of researchers to develop nanotechnology therapies for the treatment of traumatic brain injury and associated infections. The researchers are trying to develop nanoparticles carrying small interfering RNA (siRNA) molecules to reach and treat drug-resistant bacteria and inflammatory cells in the brain. Protecting the siRNA within a nanocomplex covered with specific tissue homing and cell-penetrating peptides will make it possible to deliver the therapeutics to infected cells beyond the blood–brain barrier—which normally makes it difficult to get antibiotics to the brain. The project has been funded within the framework of DARPA's In Vivo Nanoplatforms programme that “seeks to develop new classes of adaptable nanoparticles for persistent, distributed, unobtrusive physiologic and environmental sensing as well as the treatment of physiologic abnormalities, illness and infectious disease” (www.darpa.mil).
“The DARPA funding agency often uses the term ‘DARPA-hard’ to refer to problems that are extremely tough to solve. What makes this a DARPA-hard problem is the fact that it is so difficult to deliver therapeutics to the brain. This is an underserved area of research,” explained team leader Michael Sailor, from the University of California San Diego, in a press release (http://ucsdnews.ucsd.edu/pressrelease/darpa_awards_6_million_to_develop_nanotech_therapies_for_traumatic_brain_in).
In the near future, DARPA, whose budget is set for a 1.8% increase to US$2.9 billion next year, will focus on another important project dealing with the CNS. 
The BRAIN Initiative—short for Brain Research through Advancing Innovative Neurotechnologies—is a new research effort that its proponents intend will "revolutionize our understanding of the human mind and uncover new ways to treat, prevent, and cure brain disorders like Alzheimer's, schizophrenia, autism, epilepsy and traumatic brain injury" (www.whitehouse.gov). Out of a total US$110 million investment, DARPA will receive US$50 million to work on understanding the dynamic functions of the brain and demonstrating breakthrough applications based on these insights (Fig 4). In addition to exploring new research areas, this money will be directed towards ongoing projects of typical, though not exclusively, military interest that involve enhancing or recovering brain functions, such as the development of brain-interfaced prosthetics and the elucidation of the mechanisms underlying neural reorganization and plasticity to accelerate recovery from injury.

Figure 4: The BRAIN (Brain Research through Advancing Innovative Neurotechnologies) Initiative infographic. A complete version can be downloaded at http://www.whitehouse.gov/infographics/brain-initiative.

"[T]here is this enormous mystery waiting to be unlocked, and the BRAIN Initiative will change that by giving scientists the tools they need to get a dynamic picture of the brain in action and better understand how we think and how we learn and how we remember. And that knowledge could be—will be—transformative," said US President Obama, presenting the initiative (http://www.whitehouse.gov/the-press-office/2013/04/02/remarks-president-brain-initiative-and-american-innovation).

"The President's initiative reinforces the significance of understanding how the brain records, processes, uses, stores and retrieves vast quantities of information. This kind of knowledge of brain function could inspire the design of a new generation of information processing systems; lead to insights into brain injury and recovery mechanisms; and enable new diagnostics, therapies and devices to repair traumatic injury," explained DARPA Director Arati Prabhakar in a press release (http://www.darpa.mil/NewsEvents/Releases/2013/04/02.aspx).

But BRAIN is also stirring up some controversy. Some scientists fear that this kind of 'big and bold' science, with a rigid top-down approach and vaguely defined objectives, will drain resources from smaller projects in fundamental biology [5]. Others ask whether the BRAIN investment will really generate the huge return hinted at in Obama's speech at the initiative's launch, or whether a substantial amount of hype about the potential outcomes was used to sell the project (http://ksj.mit.edu/tracker/2013/04/obamas-brain-initiative-and-alleged-140).

As these examples show, the most important player in military-funded biomedical research is the USA, with the UK following at a distance. But other countries with huge defence budgets are gearing up, although with less visibility. In July 2011, for instance, India and Kyrgyzstan opened the joint Mountain Biomedical Research Centre in the Kyrgyz capital Bishkek, to carry out research into the mechanisms of short- and long-term adaptation to high altitude. The institute will use molecular biological approaches to identify markers for screening people for resistance to high altitude and for susceptibility to high-altitude sickness and other mountain maladies.
On the Indian side, the scientists involved in the new research centre belong to the Defence Institute of Physiology and Allied Sciences, and the money came from India's defence budget.

As mankind seems unlikely to give up on armed conflicts anytime soon, war-torn human bodies will still need to be cured and wounds healed. Whether the original impetus for military-funded biomedical research is noble or not, it nonetheless fuels considerable innovation, leading to important medical discoveries that ultimately benefit all.
