Similar articles
1.
Research needs a balance between risk‐taking in “breakthrough projects” and gradual progress. Building a sustainable knowledge base requires support for both. Subject Categories: Careers, Economics, Law & Politics, Science Policy & Publishing

Science is about venturing into the unknown to find unexpected insights and establish new knowledge. Increasingly, academic institutions and funding agencies such as the European Research Council (ERC) explicitly encourage and support scientists to pursue risky and, hopefully, ground‐breaking research. Such incentives are important and have been greatly appreciated by the scientific community. However, the success of the ERC has had its downsides, as other actors in the funding ecosystem have adopted the ERC’s focus on “breakthrough science” and the corresponding notions of scientific excellence. We argue that these tendencies are concerning, since disruptive breakthrough innovation is not the only form of innovation in research. While continuous, gradual innovation is often taken for granted, it could become endangered in a research and funding ecosystem that places ever higher value on breakthrough science. This is problematic since, paradoxically, breakthrough potential in science builds on gradual innovation. If the value of gradual innovation is not better recognized, the potential for breakthrough innovation may well be stifled.
Concerns that the hypercompetitive dynamics of the current scientific system may impede rather than spur innovative research have been voiced for many years (Alberts et al, 2014). As performance indicators continue to play a central role for promotions and grants, researchers are under pressure to publish extensively, quickly, and preferably in high‐ranking journals (Burrows, 2012). These dynamics increase the risk of mental health issues among scientists (Jaremka et al, 2020), dis‐incentivise relevant and important work (Benedictus et al, 2016), decrease the quality of scientific papers (Sarewitz, 2016) and induce conservative and short‐term thinking rather than the risk‐taking and original thinking required for scientific innovation (Alberts et al, 2014; Fochler et al, 2016). Against this background, strong incentives for fostering innovative and daring research are indispensable.

2.
Academic Core Facilities are optimally situated to improve the quality of preclinical research by implementing quality control measures and offering these to their users. Subject Categories: Methods & Resources, Science Policy & Publishing

During the past decade, the scientific community and outside observers have noted a concerning lack of rigor and transparency in preclinical research that led to talk of a “reproducibility crisis” in the life sciences (Baker, 2016; Bespalov & Steckler, 2018; Heddleston et al, 2021). Various measures have been proposed to address the problem: from better training of scientists to more oversight to expanded publishing practices such as preregistration of studies. The recently published EQIPD (Enhancing Quality in Preclinical Data) System is, to date, the largest initiative that aims to establish a systematic approach for increasing the robustness and reliability of biomedical research (Bespalov et al, 2021). However, promoting a cultural change in research practices warrants a broad adoption of the Quality System and its underlying philosophy. It is here that academic Core Facilities (CF), research service providers at universities and research institutions, can make a difference.

It is fair to assume that a significant fraction of published data originated from experiments that were designed, run, or analyzed in CFs. These academic services play an important role in the research ecosystem by offering access to cutting‐edge equipment and by developing and testing novel techniques and methods that impact research in the academic and private sectors alike (Bikovski et al, 2020). Equipment and infrastructure are not the only value: CFs employ competent personnel with profound knowledge and practical experience of the specific field of interest: animal behavior, imaging, crystallography, genomics, and so on. Thus, CFs are optimally positioned to address concerns about the quality and robustness of preclinical research.

3.

In “Structural basis of transport and inhibition of the Plasmodium falciparum transporter PfFNT” by Lyu et al (2021), the authors depict the inhibitor MMV007839 in its hemiketal form in Fig 3A and F, Fig 4C, and Appendix Figs S10A, B and S13. We note that Golldack et al (2017) reported that the linear vinylogous acid tautomer of MMV007839 constitutes the binding and inhibitory entity of PfFNT. The authors are currently obtaining higher resolution cryo‐EM structural data of MMV007839‐bound PfFNT to ascertain which of the interconvertible tautomers is bound, and the paper will be updated accordingly.

4.
Lessons from implementing quality control systems in an academic research consortium to improve Good Scientific Practice and reproducibility. Subject Categories: Microbiology, Virology & Host Pathogen Interaction, Science Policy & Publishing

Low reproducibility rates within biomedical research negatively impact productivity and translation. One promising approach to enhancing the translation of robust preclinical results into clinically relevant and transferable data is the systematic implementation of quality measures in daily laboratory routines.
Today's fast‐evolving research environment needs effective quality measures to ensure reproducibility and data integrity (Macleod et al, 2014; Begley et al, 2015; Begley & Ioannidis, 2015; Baker, 2016). Academic research institutions and laboratories may be as committed to good scientific practices (GSPs) as their counterparts in the biotech and pharmaceutical industry but operate largely without clearly defined standards (Bespalov et al, 2021; Emmerich et al, 2021). Although many universities expect their scientists to adhere to GSPs, they often neither systematically support nor monitor the quality of their research activities. Peer review of publications is still regarded as the primary validation of quality control in academic research. However, reviewers only assess work after it has been performed—often over years—and interventions in the experimental process are thus no longer possible.

The reasons for the lack of dedicated quality management (QM) implementations in academic laboratories include an anticipated overload of regulatory tasks that could negatively affect productivity, concerns about the loss of scientific freedom and, importantly, limited resources in academia and academic funding schemes.

5.
The response by the author. Subject Categories: S&S: Economics & Business, S&S: Ethics

I thank Michael Bronstein and Sophia Vinogradov for their interest and comments. I would like to respond to a few of their points.

First, I agree with the authors that empirical studies should be conducted to validate any approaches to prevent the spread of misinformation before their implementation. Nonetheless, I think that the ideas I have proposed may be worth further discussion and may inspire empirical studies to test their effectiveness.

Second, the authors warn that informing about the imperfections of scientific research may undermine trust in science and scientists, which could result in higher vulnerability to online health misinformation (Roozenbeek et al, 2020; Bronstein & Vinogradov, 2021). I believe that transparency about limitations and problems in research does not necessarily have to diminish trust in science and scientists. On the contrary, as Veit et al put it, “such honesty… is a prerequisite for maintaining a trusting relationship between medical institutions (and practitioners) and the public” (Veit et al, 2021). Importantly, to give an honest picture of scientific research, information about its limitations should be put in adequate context. In particular, the public should also be aware that “good science” is being done by many researchers; that we do have solid evidence of the effectiveness of many medical interventions; and that efforts are being taken to address the problems related to the quality of research.

Third, Bronstein and Vinogradov suggest that false and dangerous information should be censored. I agree with the authors that “[c]ensorship can prevent individuals from being exposed to false and potentially dangerous ideas” (Bronstein & Vinogradov, 2021). I also recognize that some information is false beyond any doubt and its spread may be harmful. What I am concerned about are, among others, the challenges related to defining what counts as dangerous and false information and to limiting censorship only to this kind of information. For example, on what sources should decisions to censor be based, and who should make such decisions? Anyone, whether an individual or an organization, with a responsibility to censor information will likely be prone not only to mistakes, but also to abuses of power to foster their own interests. Do the benefits we want to achieve by censorship outweigh the potential risks?

Fourth, we need rigorous empirical studies examining the actual impact of medical misinformation. What exactly are the harms we are trying to protect against, and what is their scale? This information is necessary to choose proportionate and effective measures to reduce the harms. Bronstein and Vinogradov give an example of a harm which may be caused by misinformation: an increase in methanol poisoning in Iran. Yet, as noticed by the authors, misinformation is not the sole factor in this case; there are also cultural and other contexts (Arasteh et al, 2020; Bronstein & Vinogradov, 2021). Importantly, the methods of studies exploring the effects of misinformation should be carefully elaborated, especially when study participants are asked to self‐report. A recent study suggests that some claims about the prevalence of dangerous behaviors, such as drinking bleach, which may have been caused by misinformation, are largely exaggerated due to the presence of problematic respondents in surveys (preprint: Litman et al, 2021).

Last but not least, I would like to call attention to the importance of how the veracity of information is determined in empirical studies on misinformation.
For example, in a study by Roozenbeek et al, cited by Bronstein and Vinogradov, the World Health Organization (WHO) was used as a reliable source of information, which raises questions. For instance, Roozenbeek et al (2020) used the statement “the coronavirus was bioengineered in a military lab in Wuhan” as an example of false information, relying on the judgment of the WHO found on its “mythbusters” website (Roozenbeek et al, 2020). Yet, is there solid evidence to claim that this statement is false? At present, at least some scientists declare that we cannot rule out that the virus was genetically manipulated in a laboratory (Relman, 2020; Segreto & Deigin, 2020). Interestingly, the WHO also no longer excludes such a possibility and has launched an investigation into this issue (https://www.who.int/health‐topics/coronavirus/origins‐of‐the‐virus, https://www.who.int/emergencies/diseases/novel‐coronavirus‐2019/media‐resources/science‐in‐5/episode‐21‐‐‐covid‐19‐‐‐origins‐of‐the‐sars‐cov‐2‐virus); the claim that the laboratory origin of the virus is a falsehood is no longer present on the WHO “mythbusters” website (https://www.who.int/emergencies/diseases/novel‐coronavirus‐2019/advice‐for‐public/myth‐busters). Against this backdrop, some results of the study by Roozenbeek et al (2020) seem misleading. In particular, the perception of the reliability of the statement about the bioengineered virus by study participants in Roozenbeek et al (2020) does not reflect susceptibility to misinformation, as intended by the researchers, but rather how the respondents perceive the reliability of uncertain information.

I hope that discussion and research on these and related issues will continue.

6.
Even though the predominant model of science communication with the public is now based on dialogue, many experts still adhere to the outdated deficit model of informing the public. Subject Categories: Genetics, Gene Therapy & Genetic Disease, S&S: History & Philosophy of Science, S&S: Ethics

During the past decades, public communication of science has undergone profound changes: from policy‐driven to policy‐informing, from promoting science to interpreting science, and from dissemination to interaction (Burgess, 2014). These shifts in communication paradigms have an impact on what is expected from scientists who engage in public communication: they should be seen as fellow citizens rather than as experts whose task is to increase the scientific literacy of the lay public. Many scientists engage in science communication because they see this as their responsibility toward society (Loroño‐Leturiondo & Davies, 2018). Yet, a significant proportion of researchers still “view public engagement as an activity of talking to rather than with the public” (Hamlyn et al, 2015). The highly criticized “deficit model,” which sees the role of experts as educating the public to mitigate skepticism, still persists (Simis et al, 2016; Suldovsky, 2016).

Indeed, a survey we conducted among experts in training seems to corroborate the persistence of the deficit model even among younger scientists. Based on these results and our own experience with organizing public dialogues about human germline gene editing (Box 1), we discuss the implications of this outdated science communication model and an alternative model of public engagement that aims to align science with the needs and values of the public.

Box 1: The DNA‐dialogue project

The Dutch DNA‐dialogue project invited citizens to discuss and form opinions about human germline gene editing. During 2019 and 2020, the project organized twenty‐seven dialogues with professionals, such as embryologists and midwives, and with various lay audiences. Different scenarios of a world in 2039 (https://www.rathenau.nl/en/making‐perfect‐lives/discussing‐modification‐heritable‐dna‐embryos) served as the starting point. Participants expressed their initial reactions to these scenarios with emotion cards and thereby explored the values they themselves and other participants deemed important as they elaborated further. Starting each dialogue in this way provides a context that enables everyone to participate in dialogue about complex topics such as human germline gene editing and demonstrates that scientific knowledge should not be a prerequisite for participation.

An important example of “different” relevant knowledge surfaced during a dialogue with children between 8 and 12 years old at the Sophia Children’s Hospital in Rotterdam (Fig 1). Most adults in the DNA‐dialogues accepted human germline gene modification for severe genetic diseases, as they wished the best possible care and outcome for their children. The children at Sophia, however, stated that they would find it terrible if their parents had altered something about them before they had been born; their parents would not even have known them. Some children went so far as to say that they would no longer be themselves without their genetic condition, and that their condition had also given them experiences they would rather not have missed.

Figure 1. Children participating in a DNA‐dialogue meeting. Photographed by Levien Willemse.

7.
Open Science calls for transparent science and involvement of various stakeholders. Here are examples of and advice for meaningful stakeholder engagement. Subject Categories: Economics, Law & Politics, History & Philosophy of Science

The concepts of Open Science and Responsible Research and Innovation call for a more transparent and collaborative science, and more participation of citizens. The way to achieve this is through cooperation with different actors or “stakeholders”: individuals or organizations who can contribute to, or benefit from, research, regardless of whether they are researchers themselves or not. Examples include funding agencies, citizens’ associations, patients, and policy makers (https://aquas.gencat.cat/web/.content/minisite/aquas/publicacions/2018/how_measure_engagement_research_saris1_aquas2018.pdf). Such cooperation is even more relevant in the current, challenging times—even apart from a global pandemic—when pseudo‐science, fake news, nihilist attitudes, and ideologies too often threaten the social and technological progress enabled by science. Stakeholder engagement in research can inform and empower citizens, help render research more socially acceptable, and enable policies grounded on evidence‐based knowledge. Beyond that, stakeholder engagement is also beneficial to researchers and to research itself. In a recent survey, the majority of scientists reported benefits from public engagement (Burns et al, 2021). These can include increased mutual trust and mutual learning, improved social relevance of research, and improved adoption of results and knowledge (Cottrell et al, 2014). Finally, stakeholder engagement is often regarded as an important factor in sustaining public investment in the life sciences (Burns et al, 2021).
Here, we discuss different levels of stakeholder engagement by way of example, presenting various activities organized by European research institutions. Based on these experiences, we propose ten reflection points that we believe should be considered by the institutions, the scientists, and the funding agencies to achieve meaningful and impactful stakeholder engagement.

8.
Removing the 14‐day limit for research on human embryos without public deliberation could jeopardize public trust in and support of research on human development. Subject Categories: Development & Differentiation, S&S: Economics & Business, Molecular Biology of Disease

In On Revolution, Hannah Arendt, one of the great political thinkers of the 20th century, stated that “promises and agreements deal with the future and provide stability in the ocean of future uncertainty where the unpredictable may break in from all sides” (Arendt, 1963). She cited the Mayflower Compact, which was “drawn up on the ship and signed upon landing” on the uncharted territory of the American continent, as such an example of promise in Western history. Human beings are born with the capacity to act freely amid the vast ocean of uncertainty, but this capacity also creates unpredictable and irreversible consequences. Thus, in society and in politics, moral virtues can only persist through “making promises and keeping them” (Arendt, 1959).

9.
Lazy hazy days
Scientists have warned about the looming climate crisis for decades, but the world has been slow to act. Are we in danger of making a similar mistake by neglecting the dangers of other climatic catastrophes? Subject Categories: Biotechnology & Synthetic Biology, Economics, Law & Politics, Evolution & Ecology

On one of my trips to Antarctica, I was enjoined to refer not to “global warming” or even to “climate change.” The former implies a uniform and rather benign process, while the second suggests just a transition from one state to another and seems to minimize all the attendant risks to survival. Neither of these terms adequately or accurately describes what is happening to our planet's climate system as a result of greenhouse gas emissions; not to mention the effects of urbanization, intensive agriculture, deforestation, and other consequences of human population growth. Instead, I was encouraged to use the term “climate disruption,” which embraces the multiplicity of events taking place, some of them still hard to model, that are altering the planetary ecosystem in dramatic ways.

With climate disruption now an urgent and undeniable reality, policymakers are finally waking up to the threats that scientists have been warning about for decades. They have accepted the need for action (UNFCCC Conference of the Parties, 2021), even if the commitment remains patchy or lukewarm. But to implement all the necessary changes is a massive undertaking, and it is debatable whether we have enough time left. The fault lies mostly with those who resisted change for so long, hoping the problem would just go away, or denying that it was happening at all. We face a crisis today because the changes needed simply cannot be executed overnight. It will take time for the infrastructure to be put in place, whether for renewable electricity, for the switch to carbon‐neutral fuels, for sustainable agriculture and construction, and for net carbon capture. If the problems worsen, requiring even more drastic action, at least we do have a direction of travel, though we would be starting off from an even more precarious situation.

However, given the time that it has taken—and will still take—to turn around the juggernaut of our industrial society, are we in danger of making the same mistakes all over again, by ignoring the risks of the very opposite process happening in our lifetime? The causes of historic climate cooling are still debated, and though we have fairly convincing evidence regarding specific, sudden events, there is no firm consensus on what is behind longer‐term and possibly cyclical changes in the climate.

The two best‐documented examples are the catastrophe of 536–540 AD and the effects of the Laki Haze of 1783–1784. The cause of the 536–540 event is still debated, but is widely believed to have been one or more massive volcanic eruptions that created a global atmospheric dust‐cloud, resulting in a temperature drop of up to 2°C with concomitant famines and societal crises (Toohey et al, 2016; Helama et al, 2018). The Laki Haze was caused by the massive outpouring of sulfurous fumes from the Laki eruption in Iceland. Its effects on the climate, though just as immediate, were less straightforward. The emissions, combined with other meteorological anomalies, produced a disruption of the jetstream, as well as other localized effects. In northwest Europe, the first half of the summer of 1783 was exceptionally hot, but the following winters were dramatically cold, and the mean temperature across much of the northern hemisphere is estimated to have dropped by around 1.3°C for 2–3 years (Thordarson & Self, 2003).
In Iceland itself, as well as much of western and northern Europe, the effects were even more devastating, with widespread crop failures and deaths of both livestock and humans exacerbated by the toxicity of the volcanic gases (Schmidt et al, 2011).

Other volcanic events in recorded time have produced major climatic disturbances, such as the 1815 Tambora eruption in Indonesia, which resulted in “the year without a summer,” marked by temperature anomalies of up to 4°C (Fasullo et al, 2017), again precipitating worldwide famine. The 1883 Krakatoa eruption produced similar disruption, albeit of a lesser magnitude, though the effects are proposed to have been much longer lasting (Gleckler et al, 2006).

Much more scientifically challenging is the so‐called Little Ice Age, approximately from 1250 to 1700 AD, when global temperatures were significantly lower than in the preceding and following centuries. It was marked by particularly frigid and prolonged winters in the northern hemisphere. There is no strong consensus as to its cause(s) or even its exact dates; nor even that it can be considered a global‐scale event rather than a summation of several localized phenomena. A volcanic eruption in 1257 with effects similar to those of the Tambora eruption has been suggested as an initiating event. Disruption of the oceanic circulation system resulting from prolonged anomalies in solar activity is another possible explanation (Lapointe & Bradley, 2021). Nevertheless, and despite an average global cooling of < 1°C, the effects on global agriculture, settlement, migration and trade, on pandemics such as the Black Death, and perhaps even on wars and revolutions, were profound.

Once or twice in the past century, we have faced devastating wars, tsunamis and pandemics that seemed to come out of the blue and exacted massive tolls on humanity. From the most recent of each of these, there is a growing realization that, although these events are rare and poorly predictable, we can greatly limit the damage if we prepare properly. By devoting a small proportion of our resources over time, we can build the infrastructure and the mechanisms to cope when these disasters do eventually strike.

Without abandoning any of the emergency measures to combat anthropogenic warming, I believe that the risk of climate cooling needs to be addressed in the same way. The infrastructure for burning fossil fuels needs to be mothballed, not destroyed. Carbon capture needs to be implemented in a way that is rapidly reversible, if this should ever be needed. Alternative transportation routes need to be planned and built in case existing ones become impassable due to ice or flooding. Properly insulated buildings are not just a way of saving energy. They are essential for survival in extreme cold, as those of us who live in the Arctic countries are well aware—but many other regions also experience severe winters, for which we should all prepare.

Biotechnology needs to be set to work to devise ways of mitigating the effects of sudden climatic events such as the Laki Haze or the Tambora and Krakatoa eruptions, as well as longer‐term phenomena like the Little Ice Age. Could bacteria be used, for example, to detoxify and dissipate a sulfuric aerosol such as the one generated by the Laki eruption? Methane is generally regarded as a major contributor to the greenhouse effect, but it is short‐lived in the atmosphere.
So, could methanogens somehow be harnessed to bring about a temporary rise in global temperatures to offset the short‐term cooling effects of a volcanic dust‐cloud?

We already have a global seed bank in Svalbard (Asdal & Guarino, 2018): it might easily be expanded to include a greater representation of cold‐resistant varieties of the world's crop plants that might one day be vital to human survival. And the experience of the Laki Haze indicates a need for varieties capable of withstanding acid rain and other volcanic pollutants, as well as drought and water saturation. An equivalent (embryo) bank for strains of agriculturally important animals potentially threatened by the effects of abrupt cooling of the climate or catastrophic toxification of the atmosphere is also worth considering.

It has generally been thought impractical and pointless to prepare for even rarer events, such as cometary impacts, but events that have occurred repeatedly in recorded history and over an even longer time scale (Helama et al, 2021) are likely to happen again. We should and can be better prepared. This is not to say that we should pay attention to every conspiracy theorist or crank, or to paid advocates for energy corporations that seek short‐term profits at the expense of long‐term survival, but the dangers of climate disruption of all kinds are too great to ignore. Instead of our current rather one‐dimensional thinking, we need an “all‐risks” approach to the subject: learning from the past and the present to prepare for the future.

10.
A survey of academics in Germany shows a lack of and a great demand for training in leadership skills. Subject Categories: Careers, Science Policy & Publishing

Success and productivity in science are measured largely by the number of publications in scientific journals and the acquisition of third‐party funding to finance further research (Detsky, 2011). Consequently, as young researchers advance in their careers, they become highly trained in directly related skills, such as scientific writing, so as to increase their chances of securing publications and grants. Acquiring leadership skills, however, is often neglected, as these do not contribute to the evaluation of scientific success (Detsky, 2011). Therefore, an early‐career researcher may become the leader of a research group based on publication record and solicitation of third‐party funding, but without any training in leadership or team management skills (Lashuel, 2020). Leadership, in the context of academic research, requires a unique set of competencies, knowledge and skills in addition to “traditional” leadership skills (Anthony & Antony, 2017), such as managing change, adaptability, empathy, motivating individuals, and setting direction and vision, among others. Academic leadership also requires promoting the research group’s reputation, networking, protecting staff autonomy, promoting academic credibility, and managing complexity (Anthony & Antony, 2017).

11.
Commercial screening services for inheritable diseases raise concerns about pressure on parents to terminate “imperfect babies”. Subject Categories: S&S: Economics & Business, Molecular Biology of Disease

Nearly two decades have passed since the first draft sequences of the human genome were published, at the eyewatering cost of nearly US$3 billion for the publicly funded project. Sequencing costs have dropped drastically since, and a range of direct‐to‐consumer genetics companies now offer partial sequencing of your individual genome in the US$100 price range, and whole‐genome sequencing for less than US$1,000.

While such tests are mainly for personal use, there have also been substantial price drops in clinical genome sequencing, which has greatly enabled the study of and screening for inheritable disorders. This has both advanced our understanding of these diseases in general and benefitted early diagnosis of many genetic disorders, which is crucial for early and efficient treatment. Such detection can, in fact, now occur long before birth: from cell‐free DNA testing during the first trimester of pregnancy, to genetic testing of embryos generated by in vitro fertilization, to preconception carrier screening of parents to find out if both are carriers of an autosomal recessive condition. While such prenatal testing of foetuses or embryos primarily focuses on diseases caused by chromosomal abnormalities, technological advances also allow for the testing of an increasing number of heritable monogenic conditions in cases where the disease‐causing variants are known.

The medical benefits of such screening are obvious: I personally have lost two pregnancies, one to Turner's syndrome and the other to an extremely rare and lethal autosomal recessive skeletal dysplasia, and I know first‐hand the heartbreak and devastation involved in finding out that you will lose the child you already love so much. It should be noted, though, that very rarely Turner syndrome is survivable, and the long‐term outlook is typically good in those cases (GARD, 2021). In addition, I have Kallmann syndrome, a highly genetically complex dominant endocrine disorder (Maione et al, 2018), and early detection and treatment make a difference in outcome. Being able to screen early during pregnancy or childhood therefore has significant benefits for affected children. Many other genetic disorders similarly benefit from prenatal screening and detection.

But there is also obvious cause for concern: the concept of “designer babies” selected for sex, physical features, or other apparent benefits is well entrenched in our society – and indeed culture – as a product of a dystopian future. As just one recent example, Philip Ball, writing for the Guardian in 2017, described designer babies as “an ethical horror waiting to happen” (Ball, 2017). In addition, various commercial enterprises hope to capitalize on these screening technologies. Orchid Inc claims that their preconception screening allows you to “… safely and naturally, protect your baby from diseases that run in your family”. The fact that this is hugely problematic, if not impossible, from a technological perspective has already been extensively clarified by Lior Pachter, a computational biologist at Caltech (Pachter, 2021). George Church at Harvard University suggested creating a DNA‐based dating app that would effectively prevent people who are both carriers of certain genetic conditions from matching (Flynn, 2019).
Richard Dawkins at Oxford University recently commented that “…the decision to deliberately give birth to a Down [syndrome] baby, when you have the choice to abort it early in the pregnancy, might actually be immoral from the point of view of the child’s own welfare” (Dawkins, 2021).

These are just a few examples, and as screening technology becomes cheaper, more companies will jump on the bandwagon of perfect “healthy” babies. Conversely, this creates a risk that parents come under pressure to terminate pregnancies with “imperfect babies”, as I have experienced myself. What does this mean for people with rare diseases? From my personal moral perspective, the ethics are clear in cases where the pregnancy is clearly not viable. Yet, there are literally thousands of monogenic conditions and even chromosomal abnormalities, not all of which are lethal, and we are making constant strides in treating conditions that were previously considered untreatable. In addition, there is still societal prejudice against people with genetic disorders, and ignorance about what it is like to live with a rare disease. In reality, however, all rare disease patients I have encountered are happy to be alive and here, even those whose conditions have a significant impact on their quality of life. Many of us also don't like the terms “disorder” or “syndrome”, as we are so much more than merely a disorder or a syndrome.

Unfortunately, I also see many parents panic about the results of prenatal testing. Without adequate genetic counselling, they do not understand that their baby’s condition may actually have a quite good prognosis, without major impact on quality of life. Following from this, a mere diagnosis of a rare disease – many of which would not even necessarily have been detectable until later in life, if at all – can be enough to make parents consider termination, due to social stigma.

This of course raises the thorny issue of regulation, which ranges from the USA, where there is little to no regulation of such screening technologies (ACOG, 2020), to Sweden, where such screening technologies are banned with the exception of specific high‐risk/lethal medical conditions for which both parents are known carriers (SMER, 2021). As countries come to grips with both the potential and the risks involved in new screening technologies, medical ethics boards have approached this issue. And as screening technologies advance, we will need to ask ourselves difficult questions as a society. I know that in the world of “perfect babies” that some of these companies and individuals are trying to promote, I would not exist, nor would my daughter. I have never before had to find myself so often explaining to people that our lives have value, and I do not want to continue having to do so. Like other forms of diversity, genetic diversity is important and makes us richer as a society. As these screening technologies quickly advance and become more widely available, regulation should at least guarantee that screening involves proper genetic counselling from a trained clinical geneticist, so that parents actually understand the implications of the test results. More urgently, we need to address the problem of societal attitudes towards rare diseases, face the prejudice and fear towards patients, and understand that abolishing genetic diversity in a quest for perfect babies would impoverish humanity and make the world a much poorer place.

12.
Synthetic biology could harness the ability of microorganisms to use highly toxic cyanide compounds for growth, applying it to the bioremediation of cyanide‐contaminated mining wastes and areas. Subject Categories: Biotechnology & Synthetic Biology, Evolution & Ecology, Metabolism

Cyanide is a highly toxic chemical produced in large amounts by the mining and jewellery industries, steel manufacturing, coal coking, food processing and chemical synthesis (Luque‐Almagro et al, 2011). The mining industry uses so‐called cyanide leaching to extract gold and other precious metals from ores, which leaves large amounts of cyanide‐containing liquid wastes with arsenic, mercury, lead, copper, zinc and sulphuric acid as co‐contaminants.

Although these techniques are very efficient, they still produce about one million tonnes of toxic wastewaters each year, which are usually stored in artificial ponds that are prone to leaching or dam breaks and pose a major threat to the environment and human health (Luque‐Almagro et al, 2016). In 2000, a dam burst in Baia Mare, Romania, caused one of the worst environmental disasters in Europe. Liquid waste from a gold mining operation containing about 100 tonnes of cyanide spilled into the Somes River and eventually reached the Danube, killing up to 80% of wildlife in the affected areas. A more recent spill was caused by a blast furnace at Burns Harbor, IN, USA, which released 2,400 kg of ammonia and 260 kg of cyanide, at concentrations more than 1,000 times over the legal limit, into the Calumet River and Lake Michigan, severely affecting wildlife. Notwithstanding the enormous damage such major spills cause, industrial activities that continuously release small amounts of waste are similarly dangerous for human and environmental health.

The European Parliament, as part of its General Union Environment Action Programme, has called for a ban on cyanide in mining activities to protect water resources and ecosystems against pollution. Although several EU member states have joined this initiative, there is still no binding legislation. Similarly, there are no general laws in the USA to prevent cyanide spills, and the former administration even authorized the use of cyanide to control predators in agriculture.

13.

Correction to: The EMBO Journal (2021) 40: e107786. DOI 10.15252/embj.2021107786 | Published online 8 June 2021

The authors would like to add three references to the paper: Starr et al and Zahradník et al also reported that the Q498H or Q498R mutation has enhanced binding affinity to ACE2, and Liu et al reported on the binding of bat coronavirus to ACE2.

Starr et al and Zahradník et al have now been cited in the Discussion section, and the following sentence has been corrected from:

“According to our data, the SARS‐CoV‐2 RBD with Q498H increases the binding strength to hACE2 by 5‐fold, suggesting the Q498H mutant is more ready to interact with human receptor than the wildtype and highlighting the necessity for more strict control of virus and virus‐infected animals”

to:

“Here, according to our data and two recently published papers, the SARS‐CoV‐2 RBD with Q498H or Q498R increases the binding strength to hACE2 (Starr et al, 2020; Zahradník et al, 2021), suggesting the mutant with Q498H or Q498R is more ready to interact with human receptor than the wild type and highlighting the necessity for more strict control of virus and virus‐infected animals”.

The Liu et al citation has been added to the following sentence:

“In another paper published by our group recently, RaTG13 RBD was found to bind to hACE2 with much lower binding affinity than SARS‐CoV‐2 though RaTG13 displays the highest whole‐genome sequence identity (96.2%) with the SARS‐CoV‐2 (Liu et al, 2021)”.

Additionally, the authors have added the GISAID accession IDs to the sequence names of the SARS‐CoV‐2 in two human samples (Discussion section). To make identification unambiguous, the sequence names have been updated from “SA‐lsf‐27 and SA‐lsf‐37” to “GISAID accession ID: EPI_ISL_672581 and EPI_ISL_672589”.

Lastly, the authors declare in the Materials and Methods section that all experiments employed SARS‐CoV‐2 pseudovirus in cultured cells. These experiments were performed in a BSL‐2‐level laboratory and approved by the Science and Technology Conditions Platform Office, Institute of Microbiology, Chinese Academy of Sciences.

These changes are herewith incorporated into the paper.

14.
Giant viruses continue to yield fascinating discoveries from ancient eukaryotic immune defenses to viruses’ role in the global carbon cycle. Subject Categories: Ecology, Microbiology, Virology & Host Pathogen Interaction

The identification of the first giant virus shook up the field of virology in 2003 and challenged common ideas about the early evolution of viruses and eukaryotes (La Scola et al, 2003). Since then, more giant viruses from different host species have been discovered, along with virophages, which are viral parasites of giant viruses. It has also become increasingly clear that giant viruses and their parasites are not just another curiosity from an ecological niche but play an important role in eukaryotic evolution and perhaps also in global marine carbon cycles. Regardless, the evolution and ecology of giant viruses have become a fascinating field of study in their own right.

15.
Mycobacterium tuberculosis is a fascinating object of study: it is one of the deadliest pathogens of humankind, able to fend off persistent attacks by the immune system or drugs. Subject Categories: Immunology, Microbiology, Virology & Host Pathogen Interaction, Chemical Biology

I have been interested in infectious diseases ever since I began to study biology. As a graduate student, my pathogen of choice was Salmonella typhimurium, which typically causes diarrhea that can potentially lead to death. Salmonella's rapid doubling time, and the availability of elegant genetic tools, a wealth of reagents, and a robust animal infection model, put this bug at the apex of ideal host–pathogen systems to study. After I finished my PhD studies—and for reasons to be told another day—my career took an unexpected detour into an area of research I never thought I would be interested in: I went from the sublime to the ridiculous, from Salmonella to Mycobacterium tuberculosis (Mtb), an excruciatingly slow‐growing bacillus with few genetic tools, a paucity of reagents, and an animal model in which an experiment can take a year or longer. Having said all of that, I love working on this pathogen.

For those of you who do not know much about Mtb, it is the world's deadliest bacterium and the cause of the disease tuberculosis (TB). As Mtb is spread in aerosol droplets coughed up by infected individuals, TB is highly contagious, and about one‐third of the world's population may be infected with Mtb, although this number has been reasonably challenged (Behr et al, 2021). Even if the numbers of latent or asymptomatic infections are debated, there are some back‐of‐the‐envelope estimates that Mtb has killed more than a billion humans over the millennia. Although TB is often treatable with antibiotics and most Mtb‐infected healthy individuals are asymptomatic, the appearance of multi‐drug‐resistant Mtb and HIV/AIDS has further increased the number of deaths caused by this pathogen.

How has Mtb become such a successful pathogen? For one, we lack an effective vaccine to prevent infection. Many readers may point out that they have themselves been given a TB vaccine; known as “BCG” for bacille Calmette–Guérin, this is a laboratory‐attenuated strain of a species related to Mtb called Mycobacterium bovis. While BCG does provide some protection for children against TB, it is essentially ineffective against pulmonary TB in adults. For this reason, it is not used in the USA and many other countries.

Another major challenge to treating TB has been the lack of antimicrobials that can access Mtb bacilli in privileged sites known as granulomas, the cell‐fortified structures our immune system builds to contain microbial growth. In addition to the granuloma walls, Mtb has a highly complex cell envelope that protects it from many small molecules. I imagine that antimicrobial molecules have the challenging task of reaching an enemy shielded in armor, hiding deep inside a castle keep, and surrounded by a vast moat and an army of orcs.

On top of these therapeutic barriers, most antimicrobials work on metabolically active or growing bacteria. Mtb, however, grows very slowly, with a doubling time under optimal laboratory conditions of about 20 h—compared with 20 min for Salmonella. Moreover, Mtb is believed to enter a “persistent” or “latent” state in its natural host with limited cell divisions. This extremely slow growth makes treatment a long and tedious prospect: 6–12 months of antibiotic treatment are generally required, during which time one cannot drink alcohol due to the potential liver toxicity of the drugs. Believe it or not, there are people who would rather refuse TB treatment than give up alcohol for a few months.
Additionally, the perception of “feeling cured” after a few weeks of TB therapy can also lead to a lapse in compliance. The consequence of failing to clear a partially treated infection is the emergence of drug resistance, which has created strains that are extensively resistant to most frontline TB drugs.

When thinking about the difficulty of curing Mtb infections, I am reminded of the fierce and fearless honey badger, which came to fame through a viral YouTube video. The narrator points out how honey badgers “don't care” about battling vicious predators in order to get food: venomous snakes, stinging bees—you name it. I once saw a photo of a honey badger that looked more like a pin cushion, harpooned with numerous porcupine quills. This battle royale of the wilderness is a perfect analogy for Mtb versus the immune system: like the honey badger, Mtb really don't care.

Vaccines primarily work by coaxing our immune system to make antibodies that neutralize foreign invaders, most typically viruses, but also bacteria, some of which have evolved mechanisms to evade detection by antibodies or otherwise render them useless. In most cases, phagocytes then gobble up and kill invading bacteria. While phagocytes are critical in controlling Mtb infections, it is unclear which of their molecules or “effectors” act as executioners of Mtb. For example, nitric oxide and copper play roles in controlling Mtb in a mouse model, but it is unknown how these molecules exert their host‐protective activity, and whether or not they play a similar role in humans. Furthermore, despite the production of these antibacterial effectors—the “porcupine quills”—Mtb often persists due to intrinsic resistance mechanisms. Thus, while our immune system may have the tools to keep Mtb under control, it falls short of eradicating it from our bodies and, in many cases, fails to prevent the progression of the disease. Perhaps the most worrying observation is that prior infection, which is generally considered the most effective path to immunity for many infectious diseases, does not consistently protect against reinfection with Mtb.

These facts have left the TB field scrambling to identify new ways to fight this disease. Much of this work requires that researchers understand the fundamental processes of both the bacterium and its host. Studies of human populations around the globe have revealed differences in susceptibility to infection, the genetic and immunological bases of which are being investigated (Bellamy et al, 2000; Berry et al, 2010; Möller et al, 2010). These studies have made researchers increasingly aware that how the immune system responds to Mtb may play a critical role in disease control. For example, understanding why some individuals or populations are more or less susceptible to TB may help in the development of better vaccines. Also, the more we understand about what makes this pathogen so resilient to the immune system, the easier it will become to develop new antibacterial drugs or host‐directed therapies. These questions can only be answered once we fully understand how the host combats Mtb infections, and how the bacteria counteract these host defenses. While it is a daunting endeavor, my hope is that the efforts of many laboratories around the world will lead to a better understanding of the host–Mtb interface and ultimately help to eradicate this disease for good.

16.
17.
The EU's Biodiversity Strategy for 2030 makes great promises about halting the decline of biodiversity, but it offers little in terms of implementation. Subject Categories: S&S: Economics & Business, Ecology, S&S: Ethics

Earth is teeming with a stunning variety of life forms. Despite hundreds of years of exploration and taxonomic research, and with 1.2 million species classified, we still have no clear picture of the real extent of global biodiversity, with estimates ranging from 3 to 100 million species. A highly quoted—although not universally accepted—study predicted some 8.7 million species, of which about 2.2 million are marine (Mora et al, 2011). Although nearly any niche on the surface of Earth has been colonized by life, species richness is anything but evenly distributed. A large share of the known species is concentrated in relatively small areas, especially in the tropics (Fig 1). Ultimately, it is the network of interactions among life forms and the physical environment that makes up the global ecosystem we call the biosphere and that supports life itself.

Figure 1. Biological hotspots of the world. A total of 36 currently recognized hotspots make up < 3% of the planet's land area but harbor half of the world's endemic plant species and 42% of all terrestrial vertebrates. Overall, hotspots have lost more than 80% of their original extension. Credit: Richard J. Weller, Claire Hoch, and Chieh Huang, 2017, Atlas for the End of the World, http://atlas‐for‐the‐end‐of‐the‐world.com/. Reproduced with permission.

Driven by a range of complex and interwoven causes–such as changes in land and sea use, habitat destruction, overexploitation of organisms, climate change, pollution, and invasive species–biodiversity is declining at an alarming pace. A report by the Intergovernmental Science‐Policy Platform on Biodiversity and Ecosystem Services (IPBES) issued a clear warning: “An average of around 25 per cent of species in assessed animal and plant groups are threatened, suggesting that around 1 million species already face extinction, many within decades, unless action is taken to reduce the intensity of drivers of biodiversity loss. Without such action, there will be a further acceleration in the global rate of species extinction, which is already at least tens to hundreds of times higher than it has averaged over the past 10 million years” (IPBES, 2019) (Fig 2). Although focused on a smaller set of organisms, a more recent assessment by WWF has reached similar conclusions. Their Living Planet Index, which tracks the abundance of thousands of populations of mammals, birds, fish, reptiles, and amphibians around the world, shows a stark decline in monitored populations (WWF, 2020). As expected, the trend of biodiversity decline is not homogeneous, with tropical areas paying a disproportionately high price, mostly because of unrestrained deforestation and exploitation of natural resources.

Figure 2. The global, rapid decline of biodiversity. (A) Percentage of species threatened with extinction in taxonomic groups that have been assessed comprehensively, or through a “sampled” approach, or for which selected subsets have been assessed by the IUCN Red List of Threatened Species. Groups are ordered according to the best estimate, assuming that data‐deficient species are as threatened as non‐data‐deficient species. (B) Extinctions since 1500 for vertebrate groups. (C) Red List Index of species survival for taxonomic groups that have been assessed for the IUCN Red List at least twice. A value of 1 is equivalent to all species being categorized as Least Concern; a value of zero is equivalent to all species being classified as Extinct. Data for all panels from www.iucnredlist.org. Reproduced from (IPBES, 2019), with permission.
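As an aside on how the 0-to-1 scale in panel (C) is constructed: the Red List Index is conventionally derived from per‐species category weights. The short Python sketch below illustrates this under the standard IUCN weighting scheme (Least Concern = 0 up to Extinct = 5); the species list is hypothetical, and the code is an illustration of the general formula, not software from IUCN or IPBES.

    # Sketch of a Red List Index (RLI) calculation for a single assessment.
    # Assumes the standard IUCN category weights; the species list is hypothetical.
    WEIGHTS = {
        "LC": 0,  # Least Concern
        "NT": 1,  # Near Threatened
        "VU": 2,  # Vulnerable
        "EN": 3,  # Endangered
        "CR": 4,  # Critically Endangered
        "EX": 5,  # Extinct (the maximum weight, W_EX)
    }

    def red_list_index(categories):
        """RLI = 1 - sum(weights) / (W_EX * N).
        Returns 1.0 if all species are Least Concern, 0.0 if all are Extinct."""
        total = sum(WEIGHTS[c] for c in categories)
        return 1.0 - total / (WEIGHTS["EX"] * len(categories))

    # Five hypothetical species: mostly secure, one extinct.
    print(red_list_index(["LC", "LC", "NT", "VU", "EX"]))  # -> 0.68

Repeating such a calculation over successive assessments of the same taxonomic group is what produces the declining trend lines shown in panel (C).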
Driven by a range of complex and interwoven causes […] biodiversity is declining at an alarming pace.
Against this dire background, the EU has drafted a Biodiversity Strategy 2030, an ambitious framework aimed at tackling the key reasons behind biodiversity loss. The plan hinges on a few main elements, such as the establishment of protected areas for at least 30% of Europe's lands and seas (Fig 3); a significant increase of biodiversity‐rich landscape features on agricultural land by establishing buffer zones like hedges and fallow fields; halting and reversing the decline of pollinators; and planting 3 billion trees by 2030 (https://ec.europa.eu/info/strategy/priorities‐2019‐2024/european‐green‐deal/actions‐being‐taken‐eu/eu‐biodiversity‐strategy‐2030_en). The budget for implementing these measures was set at €20 billion per year.

Figure 3. Natura 2000, the EU's network of protected areas. In 2019, 18% of land in the EU was protected as Natura 2000, with the lowest share of protected land in Denmark (8%) and the highest in Slovenia (38%). In 2019, the largest national network of terrestrial Natura 2000 sites was located in Spain, covering 138,111 km2, followed by France (70,875 km2) and Poland (61,168 km2). Reproduced from Eurostat: https://ec.europa.eu/eurostat/statistics‐explained/index.php?title=Main_Page

“Nature is vital for our physical and mental wellbeing, it filters our air and water, it regulates the climate and it pollinates our crops. But we are acting as if it didn't matter, and losing it at an unprecedented rate”, said Virginijus Sinkevičius, Commissioner for the Environment, Oceans and Fisheries, at the press launch of the new EU action (https://ec.europa.eu/commission/presscorner/detail/en/ip_20_884). “This new Biodiversity Strategy builds on what has worked in the past, and adds new tools that will set us on a path to true sustainability, with benefits for all. The EU's aim is to protect and restore nature, to contribute to economic recovery from the current crisis, and to lead the way for an ambitious global framework to protect biodiversity around the planet”.

Environmental groups and other stakeholders have welcomed the EU's pledge in principle. “This is a unique opportunity to shape a new society in harmony with nature”, applauded Wetlands International. “We must not forget that the biodiversity and climate crisis is a much bigger and persistent challenge for humanity than COVID‐19” (https://europe.wetlands.org/news/welcoming‐the‐eu‐biodiversity‐strategy‐for‐2030/). EuroNatur, a foundation focused on conservation, stated that the goals set out by the new strategy provide a strong basis for improving the state of nature in the EU (www.euronatur.org).

Alongside the voices of praise, however, many have expressed concerns that the strategy could turn into little more than a wish list. “The big issue of the strategy is that while setting a goal for financial funds, the EU does not specify where the money is supposed to come from. It only says it should include ‘EU funds and national and private funding’”, commented the European Wilderness Society, an environmental advocacy non‐profit organization headquartered in Tamsweg, Austria. “Goals are important, but do not create change without an organized and sustainable implementation. It's a good and ambitious document, but what is also obvious is the lack of strategy of how to implement it, and a lack of discussion of why previous documents of this type failed” (https://wilderness‐society.org/ambitious‐eu‐biodiversity‐strategy‐2030/).
The Institute for European Environmental Policy (IEEP) is on the same page. The sustainability think‐tank based in Brussels and London noted that the outgoing EU 2020 biodiversity strategy suffered from major implementation problems, especially because of a lack of engagement at the national level and of ad hoc legislation supporting the meeting of key targets. Therefore, “[it] can be argued that a legally binding approach to the biodiversity governance framework is urgently needed unless Member States and other key stakeholders can show greater intrinsic ownership to deliver on agreed objectives” (https://ieep.eu/news/first‐impressions‐of‐the‐eu‐biodiversity‐strategy‐to‐2030). In addition, IEEP remarked that money is an issue, since the €20 billion figure appears more as an estimate than a certified obligation.

“The intentions of the Commission are good and the strategy contains a number of measures and targets that can really make a difference. However, implementation depends critically on the member states and experiences with the Common Agricultural Policy the past decade or so have taught us that many of them are more interested in short‐term economic objectives than in safeguarding the natural wealth of their country for future generations”, commented David Kleijn, an ecologist and nature conservation expert at Wageningen University, the Netherlands. “I think it is important that we now have an ambitious Biodiversity Strategy but at the same time I have little hope that we will be able to achieve its objectives”.
There is further criticism of specific measures, such as the proposal to plant 3 billion trees. "To have lots of trees planted in an area does not necessarily translate into an increase of biodiversity. Biodiverse ecosystems are the result of millions of years of complex multi-species interactions and evolutionary processes, which are not as easy to restore", explained plant ecologist Susana Gómez-González from the University of Cádiz, Spain. Planting a large number of trees is too simplistic an approach to save European forests from the combined effects of excessive anthropic pressure and climate change, and could even have detrimental effects (see Box 1). More emphasis should instead be placed on reducing tree harvesting in sensitive areas and on promoting natural forest renewal processes (Gómez-González et al, 2020). "For a biodiversity strategy, increasing the number of trees, or even increasing the forest area, should not be an objective; priority should be given to the conservation and restoration of natural ecosystems, forests and non-forests", Gómez-González said.

In other cases, it could be difficult, if not impossible, to reach some of the goals because of a lack of information. For example, one of the roadmap's targets is to restore at least 25,000 km of Europe's rivers to a free-flowing state. However, the number of barriers dispersed along European rivers will probably prevent even getting close to the mark. An international research team collected detailed information on existing instream barriers for 147 rivers in 36 European countries, coming up with the impressive figure of over 1.2 million obstacles that inevitably impact river ecosystems, affecting the transport and dispersion of aquatic organisms, nutrients, and sediments (Belletti et al, 2020). Existing inventories have mainly focused on dams and other large barriers, while, in fact, a large number of artificial structures are much smaller, such as weirs, locks, ramps, and fords. As a result, river fragmentation has been largely underestimated, and the models used to plan flow restoration might be seriously flawed. "To avoid 'death by a thousand cuts', a paradigm shift is necessary: to recognize that although large dams may draw most of the attention, it is the small barriers that collectively do most of the damage. Small is not beautiful", concluded the authors (Belletti et al, 2020).

Box 1: Why many trees don't (always) make a forest

Forests are cathedrals of biodiversity. They host by far the largest number of species on land, which provide food and essential resources for hundreds of millions of people worldwide. However, forests are disappearing and degrading at an alarming pace. The loss of these crucial ecosystems has given new impetus to a variety of projects aimed at stopping the devastation and possibly reversing the trend.

Once it is gone, can you rebuild a forest? Many believe the answer is yes, and that the obvious solution is to plant trees. Several countries have thus launched massive tree-planting programs, notably India and Ethiopia, where 350 million trees were planted in a single day (https://www.unenvironment.org/news-and-stories/story/ethiopia-plants-over-350-million-trees-day-setting-new-world-record). The World Economic Forum has set up its own One Trillion Tree initiative (https://www.1t.org/) "to conserve, restore, and grow one trillion trees by 2030". Launched in January last year at Davos, 1t.org was conceived as a platform for governments, companies and NGOs/civil society groups to support the UN Decade on Ecosystem Restoration (2021–2030). The initiative has been endorsed by renowned naturalist Jane Goodall, who commented: "1t.org offers innovative technologies which will serve to connect tens of thousands of small and large groups around the world that are engaged in tree planting and forest restoration" (https://www.weforum.org/agenda/2020/01/one-trillion-trees-world-economic-forum-launches-plan-to-help-nature-and-the-climate/).

However, things are far more complicated than they appear: large-scale tree-planting schemes are rarely a viable solution and can even be harmful. "[A] large body of literature shows that even the best planned restoration projects rarely fully recover the biodiversity of intact forests, owing to a lack of sources of forest-dependent flora and fauna in deforested landscapes, as well as degraded abiotic conditions resulting from anthropogenic activities", commented Karen Holl from the University of California, Santa Cruz, and Pedro Brancalion from the University of São Paulo (Holl & Brancalion, 2020). A common problem of tree plantations, for example, is the low survival rate of seedlings, mostly because the wrong tree species are selected and because of poor maintenance after planting. Moreover, grasslands and savannas, which are often targeted for establishing new forests, are themselves treasure troves of biodiversity. Ending indiscriminate deforestation, improving the protection of existing forests, and promoting their restoration would therefore be a more efficient strategy to preserve biodiversity in the shorter term. If tree planting is indeed necessary, it should be well planned: selecting the right areas for reforestation, using suitable tree species that can maximize biodiversity, and involving local populations to maintain the plantations, Holl and Brancalion argue (Holl & Brancalion, 2020).

The health of soil, which hosts a high proportion of biodiversity, is another problem the new strategy should address in a more focused manner. "In my opinion, the EU Biodiversity Strategy is already a leap forward in terms of policy interest in soils in general and in soil biodiversity in particular. Compared with other nations/regions of the world, Europe is by far at the forefront regarding this issue", commented Carlos António Guerra at the German Centre for Integrative Biodiversity Research (iDiv) in Leipzig, Germany, and Co-leader of the Global Soil Biodiversity Observation Network (https://geobon.org/bons/thematic-bon/soil-bon/). "Nevertheless, the connection between soil biodiversity and ecological functions needs further commitments. Soils allow for horizontal integration of several policy agendas, from climate to agriculture and, very importantly, nature conservation. This is not explicit in the EU Biodiversity Strategy in regard to soils". It remains to be seen whether the EU restoration plan will emphasize soil biodiversity or consider it a mere side effect of other initiatives, Guerra added. "A soil nature conservation plan should be proposed", he said. "Only such a plan, which implies that current and future protected areas have to consider, describe and protect their soil biodiversity, would make a significant push to help protect such a valuable resource".

More generally, research shows that the current paradigm of protection must shift to prevent further losses of biodiversity. An analysis of LIFE projects, a cornerstone of EU nature protection, found that conservation efforts are extremely polarized and strongly taxonomically biased (Mammola et al, 2020). From 1992 to 2018, investment in vertebrates was sixfold higher than that in invertebrates, with birds and mammals alone accounting for 72% of the targeted species and 75% of the total budget. In relative terms, investment per species has been 468 times higher for vertebrates than for invertebrates (Fig 4). There is no sound scientific reasoning behind this uneven conservation attention, merely popularity: "[T]he species covered by a greater number of LIFE projects were also those which attracted the most interest online, suggesting that conservation in the EU is largely driven by species charisma, rather than objective features", the researchers wrote (Mammola et al, 2020).

Figure 4: Taxonomic bias in EU fauna protection efforts. Breakdown of the number of projects (A) and budget allocation (B) across main animal groups covered by the LIFE projects (n = 835). (C) The 30 most covered species of vertebrates (out of 410) and invertebrates (out of 78) in the LIFE projects analyzed (n = 835). The vertical bar represents monetary investment and the blue scatter line the number of LIFE projects devoted to each species. Reproduced from Mammola et al (2020), with permission.

20.
Debates about the source of antibodies and their use are confusing two different issues. A ban on animal immunization would have no repercussions on the quality of antibodies. Subject Categories: S&S: Economics & Business, Methods & Resources, Chemical Biology

There is an ongoing debate on how antibodies are being generated, produced and used (Gray, 2020; Marx, 2020). Or rather, there are two debates, which are not necessarily related to each other. The first concerns the quality of antibodies used in scientific research and the repercussions for the validity of results (Bradbury & Pluckthun, 2015). The second is about the use of animals to generate and produce antibodies. Although these are two different issues, we observe that the debates have become entangled, with arguments for one topic incorrectly being used to motivate the other and vice versa. This is not helpful, and we should disentangle the knot.

Polyclonal antibodies are criticized because they suffer from cross-reactivity, high background and batch-to-batch variation (Bradbury & Pluckthun, 2015). Monoclonal antibodies produced from hybridomas are criticized because they often lack specificity owing to genetic heterogeneity introduced during hybridoma generation that impairs the quality of the monoclonals (Bradbury et al, 2018). These are valid criticisms, and producing antibodies in a recombinant manner will indeed help to improve quality and specificity. But a mediocre antibody will remain a mediocre antibody, no matter how it is produced. Recombinant methods will just produce a mediocre antibody more consistently.

Getting a good antibody is not easy, and much depends on the nature and complexity of the antigen. Low-quality antibodies are often the result of poor screening, poor quality control, incomplete characterization and the lack of international standards. Nevertheless, the technologies to ensure good selection and to guarantee consistent quality are much more advanced than a decade ago, and scientists and antibody producers should implement these to deliver high-quality antibodies. Whether antibodies are generated by animal immunization or from naïve or synthetic antibody libraries is less relevant; they can all be produced recombinantly, and screening and characterization are needed in all cases to determine quality and whether the antibody is fit for purpose.

But criticisms of the quality of many antibodies and pleas for switching to recombinant production cannot be mixed up with a call to ban animal immunization. The EU Reference Laboratory for Alternatives to Animal Testing (EURL ECVAM) recently published a recommendation to stop using animals for generating and producing antibodies for scientific, diagnostic and even therapeutic applications (EURL ECVAM, 2020). This recommendation is mainly supported by scientists who seem to be biased towards synthetic antibody technology for various reasons. Their main argument is that antibodies derived from naïve or synthetic libraries are a valid (and exclusive) alternative. But are they?

One can certainly select antibodies from non-immune libraries, and, depending on the antigen and the type of application, these antibodies can be fit for purpose. In fact, a few such antibodies have made it to the market as therapeutics, Adalimumab (Humira®) being a well-known example. But up to now, the vast majority of antibodies continue to come from animal immunization (Lu et al, 2020). And there is a good reason for that. It is generally possible to generate a few positive hits in a naïve/synthetic library; the more diverse the library, the more hits one is likely to get.
But many decades of experience with the immunization of animals, especially outbred ones, show that they generate larger numbers of antibodies with superior properties. And the more complex the antigen, the more the balance swings towards animal immunization if you want a guarantee of success.

Different factors are at work here. First, the immune system of mammals has evolved over millions of years to efficiently produce excellent antibodies against a very diverse range of antigens. Second, presenting the antigen multiple times in its desired (native) conformation to the animal immune system exploits the natural maturation process to fine-tune the immune response against particular qualities. Another factor is that in vivo maturation seems to select against negative properties such as self-recognition and aggregation. It also helps to select for important properties that go beyond mere molecular recognition (Jain et al, 2017). In industrial parlance, antibodies from animal immunization are more "developable" and have favourable biophysical properties (Lonberg, 2005). Indeed, the failure rate for antibodies selected from naïve or synthetic libraries is significantly higher.

Of course, the properties of synthetic antibodies selected from non-immune libraries can be further matured in vitro, for example by light-chain shuffling or targeted mutagenesis of the complementarity-determining regions (CDRs). While this method has become more sophisticated over the years, it remains a very complex and iterative process with no guarantee that it produces a high-quality antibody.

Antibodies are an ever more important tool in scientific research and a growing area in human and veterinary therapeutics. Major therapeutic breakthroughs in immunology and oncology in the past decades are based on antibodies (Lu et al, 2020). The vast majority of these therapeutic antibodies were derived from animals. An identical picture emerges for the antibodies in fast-track development to combat the current COVID-19 crisis: again, the vast majority are derived either from patients or from animal immunizations. The same holds true for antibodies used in diagnostics and epidemiologic studies for COVID-19.

It is for this reason that we need the tools and methods that guarantee antibodies of the highest quality and provide the best chance of success. The COVID-19 pandemic is only one illustration of this need. If we block access to these tools, both scientific research and society at large will be negatively impacted. We therefore should not limit ourselves to naïve and synthetic libraries. Animal immunization remains an indispensable method that needs to stay. But we all agree that these immunizations must be performed under best practice to further reduce harm to animals.
