

Time to talk
Authors: Holger Breithaupt
Abstract: Debate over the publication of the H5N1 flu virus papers highlights the need for better risk management of dual-use research. Scientists should start this discussion instead of waiting for governments to implement regulation.

EMBO reports (2012) 13, 578; doi:10.1038/embor.2012.77

Freedom of research is a concept widely respected in democratic societies and is often enshrined in constitutional law. We tout academics' freedom to pursue their quest for knowledge and understanding as a hallmark of a truly free society.

The reality, though, is a little more nuanced. Although academics in democracies are usually free to investigate any idea they like, they are not as free when it comes to the design and conduct of their experiments. In fact, academic research is probably more regulated than most enterprises. Researchers who falsify or misrepresent data might well find themselves joining the queue at the local job centre; when it comes to corporate or financial fraud, only the most egregious cases are ever punished. Experiments conducted on vertebrates must be vetted to ensure they meet the standards of animal welfare; industrial agriculture has no such qualms about how it treats its livestock. Privacy and consent are paramount when conducting research using databases of human genetic and medical data; the business models of Facebook or Google play fast and loose with privacy and consent.

Despite the plethora of rules and laws that govern the conduct of science and hold scientists to high standards, they do not seem to have slowed the overall pace of research, even if some areas are seen as overregulated. Yet, as Michele Garfinkel pointed out in the context of stem cell research, regulation is a relatively small price to pay in exchange for public trust in—and financial support of—research and researchers [1].

New laws and stricter rules should therefore not be seen as an end to academic freedom or an undue hurdle for research. We may well see a new wave of regulations addressing renewed concerns about biosafety and biosecurity, triggered by research into a mutant version of the avian H5N1 influenza virus. Two papers—one published in Nature, one still to come from Science—have attracted considerable attention from the media and politicians, and led to the unprecedented recommendation by the National Science Advisory Board for Biosecurity (NSABB), an advisory board for the US government, to restrict the publication of crucial information. In the meantime, the NSABB has recommended publication of revised versions of these papers, but its original argument—that the information about manipulating the virus could be abused to create a biological weapon—remains valid.

Notwithstanding the discussion about murky risks versus vague benefits for public health, there is a broader need to address concerns about research that could endanger human health or the environment. Many biologists feel that this was addressed at the 1975 Asilomar conference on recombinant DNA technology and that nothing bad has happened. Although true, the concerns that inspired the conference have also led to regulations on how to handle recombinant DNA and organisms.

What has changed, though, is the ability of molecular biologists to manipulate living matter. At the time of Asilomar, recombinant DNA technology was in its infancy and PCR had not even been invented. Modern technologies now allow scientists to analyse whole organisms at different levels of organization, manipulate their genomes and biochemistry, and even create new viruses and bacteria from scratch. Whilst some of these technologies still require a level of instrumentation, know-how and sophistication that few laboratories can muster, it is only a matter of time until these technologies and skills become widely available—including to the mentally unhinged researcher or someone else with less than beneficial intent. Misuse need not be criminal in intent: scientists who use cholera toxin for their experiments or work on filoviruses may not even be aware that they are handling a dual-use agent.

These risks are ill-defined, but they are not negligible. The lesson from the H5N1 debates is that biological research might require new regulations to manage these risks without unduly hindering research or public health. It would not be in the interests of science, however, if such debates were left to policy-makers and the media. The scientific community should proactively acknowledge the need for better risk management and set discussions in motion.

The H5N1 experience has also provided some pointers as to how a system to manage biosafety and biosecurity could look. Two crucial choke points in research are funding and publication. Funding agencies could determine whether any given research project poses dual-use risks, and whether the potential benefits outweigh possible future abuse, and could accordingly demand stricter safety measures. The NSABB has also recommended expanding the role of institutional biosafety review to address dual-use risks and biosecurity. Journals could perform a similar risk–benefit analysis—preferably involving experts in biosafety and biosecurity—to determine whether the information offered for publication poses an undue risk to public or environmental safety. Many journals and funding agencies already require that experiments using human subjects or animals are done in an ethically acceptable manner; biosecurity review would become another measure by which to ensure scientists act responsibly and benefit from public trust in science. It's time to start talking.
Keywords:
This article is indexed in PubMed and other databases.