Similar Literature
20 similar articles found.
1.
David K. Chan, Bioethics, 2015, 29(4): 274–282
Despite criticism that dignity is a vague and slippery concept, a number of international guidelines on bioethics have cautioned against research that is contrary to human dignity, with reference specifically to genetic technology. What is the connection between genetic research and human dignity? In this article, I investigate the concept of human dignity in its various historical forms, and examine its status as a moral concept. Unlike Kant's ideal concept of human dignity, the empirical or relational concept takes human dignity as something that is affected by one's circumstances and what others do. I argue that the dignity objection to some forms of genetic research rests on a view of human nature that gives humans a special status in nature – one that is threatened by the potential of genetic research to reduce individuals to their genetic endowment. I distinguish two main philosophical accounts of human nature. One of these, the Aristotelian view, is compatible with the use of genetic technology to help humans realize their inherent potential to a fuller extent.  相似文献   

2.
Legal controls over data collection in European countries have badly affected the work of epidemiologists. By contrast, journalists have been allowed far greater freedoms. The aims and tasks of both professions are in line with accepted values in our society, especially those of inquiry and the benefits of an open society. Society seems willing to accept that, in the interests of wider public good, journalism may sometimes invade individuals' privacy and do them harm, but it is not prepared to offer epidemiology an equal measure of tolerance.

3.
Enzyme design and engineering strategies rely almost exclusively on nature's alphabet of twenty canonical amino acids. Recent years have seen the emergence of powerful genetic code expansion methods that allow hundreds of structurally diverse amino acids to be installed into proteins in a site-selective manner. Here, we will highlight how the availability of an expanded alphabet of amino acids has opened new avenues in enzyme engineering research. Genetically encoded noncanonical amino acids have provided new tools to probe complex enzyme mechanisms, improve biocatalyst activity and stability, and most ambitiously to design enzymes with new catalytic mechanisms that would be difficult to access within the constraints of the genetic code. We anticipate that the studies highlighted in this article, coupled with the continuing advancements in genetic code expansion technology, will promote the widespread use of noncanonical amino acids in biocatalysis research in the coming years.  相似文献   

4.
Water     
Water remains a scarce and valuable resource. Improving technologies for water purification, use and recycling should be a high priority for all branches of science.

One of our most crucial and finite resources is freshwater. How often do biologists spare a thought for this substance, other than to think about its purity for the sake of an experiment? How often do we consider that 30 litres of cooling water are used to make one litre of double-distilled water? Americans use approximately 100 gallons per person per day, whereas millions of the world's poor subsist on less than 5 gallons per day. Within the next 15 years, it is estimated that more than 1.8 billion people will be living in regions with severe water scarcity, partly as a result of climate change. By 2030 it is estimated that the annual global demand for water will increase from 4,500 billion m³ to 6,900 billion m³—approximately 40% more than the amount of freshwater available (Water Resources Group, 2009). We are not only facing an increasing scarcity of water, but we also misuse the available water. Approximately 2.5 billion people use rivers to dispose of waste—not to mention what industry dumps into them—while freshwater dams generate problems of their own including population displacement, the spread of new and more diseases to people living in the vicinity of the river, as well as effects on ecology and farming downstream.

Many factors influence the supply of and demand for water, and a one-size-fits-all solution for all regions is therefore not possible. There are essentially two strategies to ensure a sound supply of freshwater: we either use less water, or we make more of the water that we do use. The first is a typical accounting approach and is limited in scope, whereas the second calls for better science and engineering approaches.

Although the surface of the Earth is mostly covered with water, more than 95% of it is salty or inaccessible. One clear solution to increase freshwater supply is desalination, which can be done by distillation or osmosis, through the use of carbon nanotubes, or by using another promising new technology: biomimetics. Water can be filtered through aquaporins—proteins that transport water molecules in and out of cells. Such biotechnologies could reach the market as early as 2013, although other exciting technologies are already available. Simple chemistry can be used, for example, in the ‘PUR’ water purifier that uses gravity to precipitate water-borne contaminants and pathogens, or the water purifier akin to a trash bag, which cleanses water through a nanofibre filter containing microbicides and carbon to remove pollutants and pathogens. Such simple and cheap technology is ideal for billions of the world's poor who do not have access to clean drinking water.

Of the available freshwater, agriculture uses the largest share—up to 70% in many regions—and technological and biotechnological solutions can also contribute to preserving water in this context. New farming processes that can retain water in the soil, recycle it or reduce its use include no-till farming, crop intensification, improved fertilizer usage, crop development, waste water re-use and pre- and post-harvest food processing, among many others. The different degrees of water quality can also be exploited: ‘grey water’—which is unsafe for human consumption—could still be used in agriculture.

In addition to improving management practices, there is no question that we need considerably more innovation in water technology to close the supply–demand gap. These developments should include better processes for purification and desalination, more efficient industrial use and re-use, and improved agricultural usage. The problem is that the water sector is poorly funded in all respects, including research. New technologies could help to re-use water and reclaim resources from wastewater while generating biogas from the waste. There is also enormous potential for the use of water beyond its consumption in households, agriculture and industry. ‘Blue energy’, for instance, generates power from reverse electrodialysis by mixing saltwater and freshwater across an ion exchange membrane stack. This could potentially generate energy wherever rivers flow into the sea.

With so many innovations already under way with so little funding, what other technologies can we come up with to reduce water usage and deal with medical, industrial and individual waste? The issue of waste is a serious and pressing problem: we find pharmaceutical chemicals in fish, which are in turn consumed by humans and other species in the food chain. We need to find ways to effectively transform waste into biodegradable products that can be used as fertilizers, as well as to recover valuable molecules such as rare metals. The downstream consequences of such technologies will be the regeneration of coastal estuaries, lower levels of contaminants in marine life and cleaner rivers. Ultimately, we need much more research into reducing water use, purification, bioremediation and recycling. I submit that this should be a priority research area for all the natural sciences and engineering.

Companies are held accountable these days for socially responsible projects, sustainability and their carbon footprint—this includes water usage. Why should research institutions not be held responsible too? After all, we claim to be at the cutting edge of science and should set the trend. Research grants should have a ‘green component’ and a score should be given to applications according to water usage and ‘green work’.
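To put a rough number on the ‘blue energy’ mentioned above, an upper bound can be estimated from the osmotic pressure of seawater. The sketch below is a back-of-the-envelope illustration only, not a figure from the editorial: it assumes an ideal van 't Hoff solution, a nominal seawater salt concentration of 0.6 M NaCl (my assumption), and an effectively infinite seawater reservoir.

```python
# Rough upper bound on salinity-gradient ("blue") energy per cubic metre of
# river water mixed into a large excess of seawater. Ideal van 't Hoff
# osmotic pressure pi = i * c * R * T; maximum extractable work ~ pi * V.
R = 8.314          # J / (mol K), gas constant
T = 298.0          # K, roughly 25 degrees C
c_salt = 600.0     # mol / m^3, assumed ~0.6 M NaCl for seawater
i = 2              # van 't Hoff factor for fully dissociated NaCl

pi_pascal = i * c_salt * R * T          # osmotic pressure, ~3.0 MPa (~30 bar)
work_per_m3_joule = pi_pascal * 1.0     # ideal work per 1 m^3 of freshwater
work_per_m3_kwh = work_per_m3_joule / 3.6e6

print(f"osmotic pressure ~ {pi_pascal / 1e5:.0f} bar")
print(f"ideal maximum ~ {work_per_m3_kwh:.2f} kWh per m^3 of river water")
```

Practical reverse-electrodialysis stacks recover only part of this thermodynamic limit of roughly 0.8 kWh per cubic metre, but even a fraction of it, scaled to the discharge of a large river, explains the interest in estuaries as energy sources.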

5.
6.
Paige Brown, EMBO Reports, 2012, 13(11): 964–967
Many scientists blame the media for sensationalising scientific findings, but new research suggests that things can go awry at all levels, from the scientific report to the press officer to the journalist.

Everything gives you cancer, at least if you believe what you read in the news or see on TV. Fortunately, everything also cures cancer, from red wine to silver nanoparticles. Of course the truth lies somewhere in between, and scientists might point out that these claims are at worst dangerous sensationalism and at best misjudged journalism. These kinds of media story, which inflate the risks and benefits of research, have led to a mistrust of the press among some scientists. But are journalists solely at fault when science reporting goes wrong, as many scientists believe [1]? New research suggests it is time to lay to rest the myth that the press alone is to blame. The truth is far more nuanced and science reporting can go wrong at many stages, from the researchers to the press officers to the diverse producers of news.

Many science communication researchers suggest that science in the media is not as distorted as scientists believe, although they do admit that science reporting tends to under-represent risks and over-emphasize benefits [2]. “I think there is a lot less of this [misreported science] than some scientists presume. I actually think that there is a bit of laziness in the narrative around science and the media,” said Fiona Fox, Director of the UK Science Media Centre (London, UK), an independent press office that serves as a liaison between scientists and journalists. “My bottom line is that, certainly in the UK, a vast majority of journalists report science accurately in a measured way, and it's certainly not a terrible story. Having said that, lots of things do go wrong for a number of reasons.”

Fox said that the centre sees everything from fantastic press releases to those that completely misrepresent and sensationalize scientific findings. They have applauded news stories that beautifully reported the caveats and limitations of a particular scientific study, but they have also cringed as a radio talk show pitted a massive and influential body of research against a single non-scientist sceptic.

“You ask, is it the press releases, is it the universities, is it the journalists? The truth is that it's all three,” Fox said. “But even admitting that is admitting more complexity. So anyone who says that scientists and university press officers deliver perfectly accurate science and the media misrepresent it […] that really is not the whole story.”

Scientists and scientific institutions today invest more time and effort into communicating with the media than they did a decade ago, especially given the modern emphasis on communicating scientific results to the public [3]. Today, there are considerable pressures on scientists to reach out and even ‘sell their work’ to public relations officers and journalists. “For every story that a journalist has hyped and sensationalized, there will be another example of that coming directly from a press release that we [scientists] hyped and sensationalized,” Fox said. “And for every time that that was a science press officer, there will also be a science press officer who will tell you, ‘I did a much more nuanced press release, but the academic wanted me to over claim for it’.”

Although science public relations has helped to put scientific issues on the public agenda, there are also dangers inherent in the process of translation from original research to press release to media story. Previous research in the area of science communication has focused on conflicting scientific and media values, and the effects of science media on audiences. However, studies have raised awareness of the role of press releases in distorting information from the lab bench to published news [4].

In a 2011 study of genetic research claims made in press releases and mainstream print media, science communication researcher Jean Brechman, who works at the US advertising and marketing research firm Gallup & Robinson, found evidence that scientific knowledge gets distorted as it is “filtered and translated for mass communication” with “slippages and inconsistencies” occurring along the way, such that the end message does not accurately represent the original science [4]. Although Brechman and colleagues found a concerning point of distortion in the transition between press release and news article, they also observed a misrepresentation of the original science in a significant portion of the press releases themselves.

In a previous study, Brechman and his colleagues had also concluded that “errors commonly attributed to science journalists, such as lack of qualifying details and use of oversimplified language, originate in press releases.” Even more worrisome, as Fox told a Nature commentary author in 2009, public relations departments are increasingly filling the need of the media for quick content [5].

Fox believes that a common characteristic of misrepresented science in press releases and the media is the over-claiming of preliminary studies. As such, the growing prevalence of rapid, short-format publications that publicize early results might be exacerbating the problem. Research has also revealed that over-emphasis on the beneficial effects of experimental medical treatments seen in press releases and news coverage, often called ‘spin’, can stem from bias in the abstract of the original scientific article itself [6]. Such findings warrant a closer examination of the language used in scientific articles and abstracts, as the wording and ‘spin’ of conclusions drawn by researchers in their peer-reviewed publications might have significant impacts on subsequent media coverage.

Of course, some stories about scientific discoveries are just not easy to tell owing to their complexity. They are “messy, complicated, open to interpretation and ripe for misreporting,” as Fox wrote in a post on her blog On Science and the Media (fionafox.blogspot.com). They do not fit the single-page blog post or the short press release. Some scientific experiments and the peer-reviewed articles and media stories that flow from them are inherently full of caveats, contexts and conflicting results and cannot be communicated in a short format [7].

In a 2012 issue of Perspectives on Psychological Science, Marco Bertamini at the University of Liverpool (UK) and Marcus R. Munafo at the University of Bristol (UK) suggested that a shift toward “bite-size” publications in areas of science such as psychology might be promoting more single-study models of research, fewer efforts to replicate initial findings, curtailed detailing of previous relevant work and bias toward “false alarm” or false-positive results [7]. The authors pointed out that larger, multi-experiment studies are typically published in longer papers with larger sample sizes and tend to be more accurate. They also suggested that this culture of brief, single-study reports based on small data sets will lead to the contamination of the scientific literature with false-positive findings. Unfortunately, false science far more easily enters the literature than leaves it [8].

One famous example is that of Andrew Wakefield, whose 1998 publication in The Lancet claimed to link autism with the combined measles, mumps and rubella (MMR) vaccination. It took years of work by many scientists, and the aid of an exposé by British investigative reporter Brian Deer, to finally force retraction of the paper. However, significant damage had already been done and many parents continue to avoid immunizing their children out of fear. Deer claims that scientific journals were a large part of the problem: “[D]uring the many years in which I investigated the MMR vaccine controversy, the worst and most inexcusable reporting on the subject, apart from the original Wakefield claims in the Lancet, was published in Nature and republished in Scientific American,” he said. “There is an enormous amount of hypocrisy among those who accuse the media of misreporting science.”

What factors are promoting this shift to bite-size science? One is certainly the increasing pressure and competition to publish many papers in high-impact journals, which prefer short articles with new, ground-breaking findings.

“Bibliometrics is playing a larger role in academia in deciding who gets a job and who gets promoted,” Bertamini said. “In general, if things are measured by citations, there is pressure to publish as much and as often as possible, and also to focus on what is surprising; thus, we can see how this may lead to an inflation in the number of papers but also an increase in publication bias.”

Bertamini points to the real possibility that measured effects emerging from a group of small samples can be much larger than the real effect in the total population. “This variability is bad enough, but it is even worse when you consider that what is more likely to be written up and accepted for publication are exactly the larger differences,” he explained.

Alongside the endless pressure to publish, the nature of the peer-reviewed publication process itself prioritizes exciting and statistically impressive results. Fluke scientific discoveries and surprising results are often considered newsworthy, even if they end up being false-positives. The bite-size article aggravates this problem in what Bertamini fears is a growing similarity between academic writing and media reporting: “The general media, including blogs and newspapers, will of course focus on what is curious, funny, controversial, and so on. Academic papers must not do the same, and the quality control system is there to prevent that.”

The real danger is that, with more than one million scientific papers published every year, journalists can tend to rely on only a few influential journals such as Science and Nature for science news [3]. Although the influence and reliability of these prestigious journals is well established, the risk that journalists and other media producers might be propagating the exciting yet preliminary results published in their pages is undeniable.

Fox has personal experience of the consequences of hype surrounding surprising but preliminary science. Her sister has chronic fatigue syndrome (CFS), a debilitating medical condition with no known test or cure. When Science published an article in 2009 linking CFS with a viral agent, Fox was naturally both curious and sceptical [9]. “I thought even if I knew that this was an incredibly significant finding, the fact that nobody had ever found a biological link before also meant that it would have to be replicated before patients could get excited,” Fox explained. “And of course what happened was all the UK journalists were desperate to splash it on the front page because it was so surprising and so significant and could completely revolutionize the approach to CFS, the treatment and potential cure.”

Fox observed that while some journalists placed the caveats of the study deep within their stories, others left them out completely. “I gather in the USA it was massive, it was front page news and patients were going online to try and find a test for this particular virus. But in the end, nobody could replicate it, literally nobody. A Dutch group tried, Imperial College London, lots of groups, but nobody could replicate it. And in the end, the paper has been withdrawn from Science.”

For Fox, the fact that the paper was withdrawn, incidentally due to a finding of contamination in the samples, was less interesting than the way that the paper was reported by journalists. “We would want any journal press officer to literally in the first paragraph be highlighting the fact that this was such a surprising result that it shouldn't be splashed on the front page,” she said. Of course to the journalist, waiting for the study to be replicated is anathema in a culture that values exciting and new findings. “To the scientific community, the fact that it is surprising and new means that we should calm down and wait until it is proved,” Fox warned.

So, the media must also take its share of the blame when it comes to distorting science news. Indeed, research analysing science coverage in the media has shown that stories tend to exaggerate preliminary findings, use sensationalist terms, avoid complex issues, fail to mention financial conflicts of interest, ignore statistical limitations and transform inherent uncertainties into controversy [3,10].

One concerning development within journalism is the ‘balanced treatment’ of controversial science, also called ‘false balance’ by many science communicators. This balanced treatment has helped supporters of pseudoscientific notions gain equal ground with scientific experts in media stories on issues such as climate change and biotechnology [11].

“Almost every time the issue of creationism or intelligent design comes up, many newspapers and other media feel that they need to present ‘both sides’, even though one is clearly nonsensical, and indeed harmful to public education,” commented Massimo Pigliucci, author of Nonsense on Stilts: How to Tell Science from Bunk [12].

Fox also criticizes false balance on issues such as global climate change. “On that one you can't blame the scientific community, you can't blame science press officers,” she said. “That is a real clashing of values. One of the values that most journalists have bred into them is about balance and impartiality, balancing the views of one person with an opponent when it's controversial. So on issues like climate change, where there is a big controversy, their instinct as a journalist will be to make sure that if they have a climate scientist on the radio or on TV or quoted in the newspaper, they pick up the phone and make sure that they have a climate skeptic.” However, balanced viewpoints should not threaten years of rigorous scientific research embodied in a peer-reviewed publication. “We are not saying generally that we [scientists] want special treatment from journalists,” Fox said, “but we are saying that this whole principle of balance, which applies quite well in politics, doesn't cross over to science…”

Bertamini believes the situation could be made worse if publication standards are relaxed in favour of promoting a more public and open review process. “If today you were to research the issue of human contribution to global warming you would find a consensus in the scientific literature. Yet you would find no such consensus in the general media. In part this is due to the existence of powerful and well-funded lobbies that fill the media with unfounded skepticism. Now imagine if these lobbies had more access to publish their views in the scientific literature, maybe in the form of post publication feedback. This would be a dangerous consequence of blurring the line that separates scientific writing and the broader media.”

In an age in which the way science is presented in the news can have significant impacts for audiences, especially when it comes to health news, what can science communicators and journalists do to keep audiences reading without having to distort, hype, trivialize, dramatize or otherwise misrepresent science?

Pigliucci believes that many different sources—press releases, blogs, newspapers and investigative science journalism pieces—can cross-check reported science and challenge its accuracy, if necessary. “There are examples of bloggers pointing out technical problems with published scientific papers,” Pigliucci said. “Unfortunately, as we all know, the game can be played the other way around too, with plenty of bloggers, ‘twitterers’ and others actually obfuscating and muddling things even more.” Pigliucci hopes to see a cultural change take place in science reporting, one that emphasizes “more reflective shouting, less shouting of talking points,” he said.

Fox believes that journalists still need to cover scientific developments more responsibly, especially given that scientists are increasingly reaching out to press officers and the public. Journalists can inform, intrigue and entertain whilst maintaining accurate representations of the original science, but need to understand that preliminary results must be replicated and validated before being splashed on the front page. They should also strive to interview experts who do not have financial ties or competing interests in the research, and they should put scientific stories in the context of a broader process of nonlinear discovery. According to Pigliucci, journalists can and should be educating themselves on the research process and the science of logical conclusion-making, giving themselves the tools to provide critical and investigative coverage when needed. At the same time, scientists should undertake proper media training so that they are comfortable communicating their work to journalists or press officers.

“I don't think there is any fundamental flaw in how we communicate science, but there is a systemic flaw in the sense that we simply do not educate people about logical fallacies and cognitive biases,” Pigliucci said, advising that scientists and communicators alike should be intimately familiar with the subjects of philosophy and psychology. “As for bunk science, it has always been with us, and it probably always will be, because human beings are naturally prone to all sorts of biases and fallacious reasoning. As Carl Sagan once put it, science (and reason) is like a candle in the dark. It needs constant protection and a lot of thankless work to keep it alive.”

7.
Chan S, Quigley M, Bioethics, 2007, 21(8): 439–448
Recent ethical and legal challenges have arisen concerning the rights of individuals over their IVF embryos, leading to questions about how, when the wishes of parents regarding their embryos conflict, such situations ought to be resolved. A notion commonly invoked in relation to frozen embryo disputes is that of reproductive rights: a right to have (or not to have) children. This has sometimes been interpreted to mean a right to have, or not to have, one's own genetic children. But can such rights legitimately be asserted to give rise to claims over embryos? We examine the question of property in genetic material as applied to gametes and embryos, and whether rights over genetic information extend to grant control over IVF embryos. In particular we consider the purported right not to have one's own genetically related children from a property‐based perspective. We argue that even if we concede that such (property) rights do exist, those rights become limited in scope and application upon engaging in reproduction. We want to show that once an IVF embryo is created for the purpose of reproduction, any right not to have genetically‐related children that may be based in property rights over genetic information is ceded. There is thus no right to prevent one's IVF embryos from being brought to birth on the basis of a right to avoid having one's own genetic children. Although there may be reproductive rights over gametes and embryos, these are not grounded in genetic information.  相似文献   

8.
In this review, the state of the art in animal cell technology using suspension culture techniques is updated up to the end of 1980. We discuss, on a broad basis, the current status and potential developments of both the purely biological and the biochemical engineering aspects that may be important for improving the performance and design of animal cell technologies. The process economics could be considerably improved by the use of transformed animal cell substrates, the use of cheaper cultivation media, and by methodological and engineering means; most of these approaches are currently being realized. Nevertheless, for a great variety of biologically active substances that do not require co- or post-translational processing, recombinant DNA techniques (e.g. genetic engineering) are a promising alternative for the production of animal-cell-derived substances.

9.
The existing literature on the development of recombinant DNA technology and genetic engineering tends to focus on Stanley Cohen and Herbert Boyer’s recombinant DNA cloning technology and its commercialization starting in the mid-1970s. Historians of science, however, have pointedly noted that experimental procedures for making recombinant DNA molecules were initially developed by Stanford biochemist Paul Berg and his colleagues, Peter Lobban and A. Dale Kaiser, in the early 1970s. This paper, recognizing the uneasy disjuncture between scientific authorship and legal invention in the history of recombinant DNA technology, investigates the development of recombinant DNA technology in its full scientific context. I do so by focusing on Stanford biochemist Berg’s research on the genetic regulation of higher organisms. As I hope to demonstrate, Berg’s new venture reflected a mass migration of biomedical researchers as they shifted from studying prokaryotic organisms like bacteria to studying eukaryotic organisms like mammalian and human cells. It was out of this boundary crossing from prokaryotic to eukaryotic systems through virus model systems that recombinant DNA technology and other significant new research techniques and agendas emerged. Indeed, in their attempt to reconstitute ‘life’ as a research technology, Stanford biochemists’ recombinant DNA research recast genes as a sequence that could be rewritten through biochemical operations. The last part of this paper shifts focus from recombinant DNA technology’s academic origins to its transformation into a genetic engineering technology by examining the wide range of experimental hybridizations which occurred as techniques and knowledge circulated between Stanford biochemists and the Bay Area’s experimentalists. Situating their interchange in a dense research network based at Stanford’s biochemistry department, this paper helps to revise the canonized history of genetic engineering’s origins that emerged during the patenting of Cohen–Boyer’s recombinant DNA cloning procedures.

10.
Plant agriculture is poised at a technological inflection point. Recent advances in genome engineering make it possible to precisely alter DNA sequences in living cells, providing unprecedented control over a plant's genetic material. Potential future crops derived through genome engineering include those that better withstand pests, that have enhanced nutritional value, and that are able to grow on marginal lands. In many instances, crops with such traits will be created by altering only a few nucleotides among the billions that comprise plant genomes. As such, and with the appropriate regulatory structures in place, crops created through genome engineering might prove to be more acceptable to the public than plants that carry foreign DNA in their genomes. Public perception and the performance of the engineered crop varieties will determine the extent to which this powerful technology contributes towards securing the world's food supply.
This article is part of the PLOS Biology Collection “The Promise of Plant Translational Research.”
Over the past 100 years, technological advances have resulted in remarkable increases in agricultural productivity. Such advances include the production of hybrid plants and the use of the genes of the Green Revolution—genes that alter plant stature and thereby increase productivity [1],[2]. More recently, transgenesis, or the introduction of foreign DNA into plant genomes, has been a focus of crop improvement efforts. In the US, more than 90% of cultivated soybeans and corn contain one or more transgenes that provide traits such as resistance to insects or herbicides [3]. Transgenesis, however, has limitations: it is fundamentally a process of gene addition and does not harness a plant's native genetic repertoire to create traits of agricultural value. Furthermore, public concerns over the cultivation of crops with foreign DNA, particularly those generated by the introduction of genes from distantly related organisms, have impeded their widespread use. The regulatory frameworks created to protect the environment and to address public safety concerns have added considerably to the cost of transgenic crop production [4]. These costs have limited the use of transgenesis for creating crops with agriculturally valuable traits to a few high-profit crops, such as cotton, soybean, and corn.

The tools of genome engineering allow DNA in living cells to be precisely manipulated (reviewed in [5]). Although genome engineering can be used to add transgenes to specific locations in genomes, thereby offering an improvement over existing methods of transgenesis, a more powerful application is to modify genetic information to create new traits. Traditionally, new traits are introduced into cultivated varieties through breeding regimes that take advantage of existing natural genetic variation. Alternatively, new genetic variation is created through mutagenesis. With genome engineering, it is possible to first determine the DNA sequence modifications that are desired in the cultivated variety and then introduce this genetic variation precisely and rapidly. The ability to control the type of genetic variation introduced into crop plants promises to change the way new varieties are generated. Already genome engineering is being used in crop production pipelines in the developed world, and this technology can also be used to improve the crops that feed the burgeoning populations of developing countries.

11.
Bachman's Sparrow (Peucaea aestivalis) is a fire-dependent species that has undergone range-wide population declines in recent decades. We examined genetic diversity in Bachman's Sparrows to determine whether natural barriers have led to distinct population units and to assess the effect of anthropogenic habitat loss and fragmentation. Genetic diversity was examined across the geographic range by genotyping 226 individuals at 18 microsatellite loci and sequencing 48 individuals at mitochondrial and nuclear genes. Multiple analyses consistently demonstrated little genetic structure and high levels of genetic variation, suggesting that populations are panmictic. Based on these genetic data, separate management units/subspecies designations or translocations to promote gene flow among fragmented populations do not appear to be necessary. Panmixia in Bachman's Sparrow may be a consequence of an historical range expansion and retraction. Alternatively, high vagility in Bachman's Sparrow may be an adaptation to the ephemeral, fire-mediated habitat that this species prefers. In recent times, high vagility also appears to have offset inbreeding and loss of genetic diversity in highly fragmented habitat.
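The diversity statistic at the heart of a microsatellite survey like this is easy to sketch. The short Python example below computes Nei's expected heterozygosity (gene diversity) for a single locus from diploid genotypes; the genotype values are invented for illustration, and this is the generic textbook formula rather than the authors' actual analysis pipeline.

```python
from collections import Counter

def expected_heterozygosity(genotypes):
    """Nei's expected heterozygosity (gene diversity) for one locus.

    genotypes: list of (allele1, allele2) tuples for diploid individuals.
    Returns the unbiased estimate (n/(n-1)) * (1 - sum p_i^2),
    where n is the number of allele copies sampled.
    """
    alleles = [a for genotype in genotypes for a in genotype]
    n = len(alleles)                          # allele copies = 2 * individuals
    freqs = [count / n for count in Counter(alleles).values()]
    return (n / (n - 1)) * (1 - sum(p * p for p in freqs))

# Toy data: one microsatellite locus scored in five individuals
# (allele names are repeat lengths; values invented for illustration).
locus = [(152, 156), (152, 152), (156, 160), (160, 160), (152, 156)]
print(round(expected_heterozygosity(locus), 3))   # ~0.733 for this toy locus
```

In practice such per-locus values would be averaged over all loci (18 in this study) and compared among putative populations to judge structure and diversity.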

12.
Legal aspects of genetic information
The federally funded Human Genome Initiative will lead to the development of new capabilities to learn about an individual's genetic status. Legal issues are raised concerning patients' and other parties' access to that information. This article discusses the effect of existing statutes and case law on three pivotal questions: To what sort of information are people entitled? What control should people have over their genetic information? Do people have a right to refuse genetic information? The article emphasizes that the law protects a patient's right to obtain or refuse genetic information about oneself, as well as the right to control the dissemination of that information to others.

13.
Ethics education has now been raised to a professional level. In professional fields such as engineering and medicine, ethics education is therefore a required part of the curriculum, yet many science programmes still do not treat it as compulsory. This raises a question: is science a professional course of study? If it is, do scientists such as zoologists need to be familiar with the ethical codes and standards that apply to their work? Zoologists are sensitive to issues exposed in medicine, including how we treat animals and how, or whether, we pursue genetic engineering. From a moral standpoint, however, the practice of ethics education is more practical than either of these concerns. This paper develops these points further and assesses the need for, and feasibility of, adding ethics education to the science curriculum. In today's society, animal scientists are respected professionals who face daily decisions that can profoundly affect our living environment. For this reason, animal scientists must master ethical standards and be able to make decisions consistent with them; only then can the teaching of zoology ensure the sustained professional development of zoologists.

14.
Rapid update     
Herpes virus and Alzheimer's: Infection of the brain with Herpes simplex virus (HSV) could be a risk factor for Alzheimer's disease (AD). Using PCR, Ruth Itzhaki (UMIST, Manchester, UK) and colleagues have found that the APOE4 allele is far more common in AD patients who have HSV in their brains than in people who do not have AD (whether or not they have HSV in their brains). HSV is also more common in the peripheral nervous systems of APOE4-positive people. These findings support the theory that an environmental factor (HSV) and a genetic factor (possession of the APOE4 allele) could, in combination, lead to neurodegeneration, and suggest that treatment with antivirals might delay or prevent the onset of AD in APOE4-positive individuals. Professor Itzhaki reported her findings at the Biochemical Society's Winter Meeting in Reading, UK on December 16.

Cheaper chips for all?: DNA chip giants Affymetrix and Molecular Dynamics have formed a consortium to standardize their technology. This should make DNA chips, readers, software and reagents from different sources compatible and more affordable. It is hoped that this will speed up the development of new chip-based diagnostic, therapeutic and disease-management products.

Copper the key to prion disease?: A report by Hans Kretzschmar and colleagues in the 18/25 December issue of Nature suggests that normal prion proteins might transport or store copper. The evidence is twofold: first, the structure of normal prion proteins suggests that they can bind copper with high affinity; and, second, knockout mice lacking a prion protein have reduced amounts of copper in membranes extracted from brain tissue. Interestingly, several proteins implicated in neurodegenerative diseases, including superoxide dismutase 1, monoamine oxidase and amyloid precursor protein, are also copper-binding proteins.

Cocaine vaccine heading for trials: Immunologic has received clearance from the FDA for an Investigational New Drug licence for an anti-cocaine vaccine. Phase I trials are due to start early in 1998. The vaccine comprises a cocaine–protein conjugate; it is expected to raise an antibody response to the drug, which should ‘mop up’ cocaine before it reaches the brain.

15.
16.
The CRISPR–Cas system is the newest targeted nuclease for genome engineering. In less than 1 year, the ease, robustness and efficiency of this method have facilitated an immense range of genetic modifications in most model organisms. Full and conditional gene knock-outs, knock-ins, large chromosomal deletions and subtle mutations can be obtained using combinations of clustered regularly interspaced short palindromic repeats (CRISPRs) and DNA donors. In addition, with CRISPR–Cas compounds, multiple genetic modifications can be introduced seamlessly in a single step. CRISPR–Cas not only brings genome engineering capacities to species such as rodents and livestock in which the existing toolbox was already large, but has also enabled precise genetic engineering of organisms with difficult-to-edit genomes such as zebrafish, and of technically challenging species such as non-human primates. The CRISPR–Cas system allows generation of targeted mutations in mice, even in laboratories with limited or no access to the complex, time-consuming standard technology using mouse embryonic stem cells. Here we summarize the distinct applications of CRISPR–Cas technology for obtaining a variety of genetic modifications in different model organisms, underlining their advantages and limitations relative to other genome editing nucleases. We will guide the reader through the many publications that have seen the light in the first year of CRISPR–Cas technology.  相似文献   
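A concrete first step in most CRISPR–Cas experiments is choosing a target site. The sketch below, a minimal illustration rather than a real guide-design tool, scans a DNA sequence for SpCas9-style protospacers (20 nt followed by an NGG PAM) on the forward strand; the example sequence is made up, and real designs would also consider the reverse strand, off-target risk, GC content and position within the gene.

```python
import re

def find_spcas9_sites(seq):
    """Return (start, protospacer, PAM) tuples for 20-nt targets followed by an
    NGG PAM on the forward strand; overlapping sites are reported too."""
    seq = seq.upper()
    sites = []
    # Lookahead so overlapping candidate sites are not skipped.
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
        sites.append((m.start(), m.group(1), m.group(2)))
    return sites

# Made-up target region, for illustration only.
example = "ATGCTGACCGGATTACCAGTTGGACGATCCTGGAATTCAAGCTTGGCACTGGCCGTCGTTTTACA"
for start, protospacer, pam in find_spcas9_sites(example):
    print(f"{start:3d}  {protospacer}  PAM={pam}")
```

Each reported protospacer would then be cloned into a guide RNA construct, optionally alongside a DNA donor when a knock-in rather than a knock-out is desired.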

17.
18.
In this paper I argue that we can learn much about ‘wild justice’ and the evolutionary origins of social morality – behaving fairly – by studying social play behavior in group-living animals, and that interdisciplinary cooperation will help immensely. In our efforts to learn more about the evolution of morality we need to broaden our comparative research to include animals other than non-human primates. If one is a good Darwinian, it is premature to claim that only humans can be empathic and moral beings. By asking the question ‘What is it like to be another animal?’ we can discover rules of engagement that guide animals in their social encounters. When I study dogs, for example, I try to be a ‘dogocentrist’ and practice ‘dogomorphism.’ My major arguments center on the following ‘big’ questions: Can animals be moral beings or do they merely act as if they are? What are the evolutionary roots of cooperation, fairness, trust, forgiveness, and morality? What do animals do when they engage in social play? How do animals negotiate agreements to cooperate, to forgive, to behave fairly, to develop trust? Can animals forgive? Why cooperate and play fairly? Why did play evolve as it has? Does ‘being fair’ mean being more fit – do individual variations in play influence an individual's reproductive fitness, are more virtuous individuals more fit than less virtuous individuals? What is the taxonomic distribution of cognitive skills and emotional capacities necessary for individuals to be able to behave fairly, to empathize, to behave morally? Can we use information about moral behavior in animals to help us understand ourselves? I conclude that there is strong selection for cooperative fair play in which individuals establish and maintain a social contract to play because there are mutual benefits when individuals adopt this strategy and group stability may also be fostered. Numerous mechanisms have evolved to facilitate the initiation and maintenance of social play to keep others engaged, so that agreeing to play fairly and the resulting benefits of doing so can be readily achieved. I also claim that the ability to make accurate predictions about what an individual is likely to do in a given social situation is a useful litmus test for explaining what might be happening in an individual's brain during social encounters, and that intentional or representational explanations are often important for making these predictions.

19.
Gene-editing technology edits target genes in an organism by inserting, knocking out, modifying or replacing specific DNA fragments. Unlike early genetic engineering techniques, which inserted genetic material randomly into the host genome, gene editing can target the intended insertion site, allowing precise modification of specific loci in an organism's genome and deliberate alteration of its genetic information; it is now widely used in zebrafish genomics, developmental genetics and gene-function research. The methods include mutagenesis, the Tol2 transposon, Morpholinos, ZFNs, TALENs and the CRISPR/Cas system. This review introduces the mechanisms of action and development of gene-editing technology. As a precise and efficient genetic engineering method, gene editing has advanced rapidly in recent years. It can be used to study gene function through targeted mutation of specific genes, and it can also be applied to gene therapy for certain hereditary diseases by inserting functional genes to replace defective ones. Gene-editing technology will undoubtedly be of great research and application value in basic biology, medicine, biotechnology and many other fields.

20.
Innovative new genome engineering technologies for manipulating chromosomes have appeared in the last decade. One of these technologies, recombination mediated genetic engineering (recombineering) allows for precision DNA engineering of chromosomes and plasmids in Escherichia coli. Single-stranded DNA recombineering (SSDR) allows for the generation of subtle mutations without the need for selection and without leaving behind any foreign DNA. In this review we discuss the application of SSDR technology in lactic acid bacteria, with an emphasis on key factors that were critical to move this technology from E. coli into Lactobacillus reuteri and Lactococcus lactis. We also provide a blueprint for how to proceed if one is attempting to establish SSDR technology in a lactic acid bacterium. The emergence of CRISPR-Cas technology in genome engineering and its potential application to enhancing SSDR in lactic acid bacteria is discussed. The ability to perform precision genome engineering in medically and industrially important lactic acid bacteria will allow for the genetic improvement of strains without compromising safety.  相似文献   
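The flavour of single-stranded DNA recombineering can be conveyed with a short sketch: build a mutagenic oligo centred on the desired base change, with homology arms on either side, and reverse-complement it if it must anneal to the opposite strand. The Python below is a generic illustration with a made-up sequence and arbitrary arm length, not the protocol from the review; real oligo design must also account for replication-fork direction, mismatch-repair evasion and oligo chemistry.

```python
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq):
    return seq.translate(COMPLEMENT)[::-1]

def design_ssdr_oligo(locus, position, new_base, arm=35, use_other_strand=True):
    """Return a mutagenic single-stranded oligo with `arm` nt of homology on
    each side of `position` (0-based), carrying `new_base` at the centre."""
    if not (arm <= position <= len(locus) - arm - 1):
        raise ValueError("position too close to the end of the supplied sequence")
    top = locus[position - arm:position] + new_base + locus[position + 1:position + arm + 1]
    # If the oligo should anneal to the opposite (e.g. lagging-strand template)
    # strand, return the reverse complement; a simplifying assumption here.
    return reverse_complement(top) if use_other_strand else top

# Made-up 100-nt locus; introduce an A->G change at position 50.
locus = ("ATGGCTAGCTAGGCTTACGATCGATCGGATCCATGCAAGTTCGATCGGTA"
         "ACGTTAGCATCGGATCCTTAGCATGCATCGATCGGCTAAGCTTACGATCG")
oligo = design_ssdr_oligo(locus, position=50, new_base="G", arm=35)
print(len(oligo), oligo)   # 71-nt mutagenic oligo
```

Because such oligos introduce a single subtle change and leave no foreign DNA behind, this kind of design step is central to applying SSDR without selection markers.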
