Similar Articles (20 results)
1.

Background

Can political controversy have a “chilling effect” on the production of new science? This is a timely concern, given how often American politicians are accused of undermining science for political purposes. Yet little is known about how scientists react to these kinds of controversies.

Methods and Findings

Drawing on interview (n = 30) and survey data (n = 82), this study examines the reactions of scientists whose National Institutes of Health (NIH)-funded grants were implicated in a highly publicized political controversy. Critics charged that these grants were “a waste of taxpayer money.” The NIH defended each grant, and no funding was rescinded. Nevertheless, this study finds that many of the scientists whose grants were criticized now engage in self-censorship. About half of the sample said that they now remove potentially controversial words from their grant applications, and a quarter reported eliminating entire topics from their research agendas. Four researchers reportedly chose to move into more secure positions entirely, either outside academia or in jobs with guaranteed salaries. About 10% of the group reported that this controversy strengthened their commitment to complete their research and disseminate it widely.

Conclusions

These findings provide evidence that political controversies can shape what scientists choose to study. Debates about the politics of science usually focus on the direct suppression, distortion, and manipulation of scientific results. This study suggests that scholars must also examine how scientists may self-censor in response to political events.

2.
3.
4.
5.
The association of Zika virus (ZIKV) infection with microcephaly and neurological disease has highlighted an emerging public health concern. Here, we report the crystal structure of the full-length ZIKV nonstructural protein 1 (NS1), a major host-interaction molecule that functions in flaviviral replication, pathogenesis, and immune evasion. Notably, a long intertwined loop in the wing domain of ZIKV NS1 forms a hydrophobic “spike” that can contribute to cellular membrane association. The amino acid sequence of this “spike” varies among flaviviruses, but it is consistently either hydrophobic or positively charged, a feature favorable for membrane binding. Comparative analysis with West Nile and Dengue virus NS1 structures reveals conserved features but divergent electrostatic characteristics on both the inner and outer faces. These results point to differences in the mechanisms of flavivirus pathogenesis and should be taken into account during the development of diagnostic tools.

6.
The development of robust science policy depends on use of the best available data, rigorous analysis, and inclusion of a wide range of input. While director of the National Institute of General Medical Sciences (NIGMS), I took advantage of available data and emerging tools to analyze training time distribution by new NIGMS grantees, the distribution of the number of publications as a function of total annual National Institutes of Health support per investigator, and the predictive value of peer-review scores on subsequent scientific productivity. Rigorous data analysis should be used to develop new reforms and initiatives that will help build a more sustainable American biomedical research enterprise.Good scientists almost invariably insist on obtaining the best data potentially available and fostering open and direct communication and criticism to address scientific problems. Remarkably, this same approach is only sometimes used in the context of the development of science policy. In my opinion, several factors underlie the reluctance to apply scientific methods rigorously to inform science policy questions. First, obtaining the relevant data can be challenging and time-consuming. Tools relatively unfamiliar to many scientists may be required, and the data collected may have inherent limitations that make their use challenging. Second, reliance on data may require the abandonment of preconceived notions and a willingness to face potentially unwanted political consequences, depending on where the data analysis leads.One of my first experiences witnessing the application of a rigorous approach to a policy question involved previous American Society for Cell Biology Public Service awardee Tom Pollard when he and I were both at Johns Hopkins School of Medicine. Tom was leading an effort to reorganize the first-year medical school curriculum, trying to move toward an integrated plan and away from an entrenched departmentally based system (DeAngelis, 2000 ). He insisted that every lecture in the old curriculum be on the table for discussion, requiring frank discussions and defusing one of the most powerful arguments in academia: “But, we''ve always done it that way.” As the curriculum was being implemented, he recruited a set of a dozen or so students who were tasked with filling out questionnaires immediately after every lecture; this enabled evaluation and refinement of the curriculum and yielded a data set that changed the character of future discussions.After 13 years as a department director at Johns Hopkins (including a number of years as course director for the Molecules and Cells course in the first-year medical school curriculum), I had the opportunity to become director of the National Institute of General Medical Sciences (NIGMS) at the National Institutes of Health (NIH). NIH supports large data systems, as these are essential for NIH staff to perform their work in receiving, reviewing, funding, and monitoring research grants. While these rich data sources were available, the resources for analysis were not as sophisticated as they could have been. This became apparent when we tried to understand how long successful young scientists spent at various early-career stages (in graduate school, doing postdoctoral fellowships, and in faculty positions before funding). This was a relatively simple question to formulate, but it took considerable effort to collect the data because the relevant data were in free-text form. 
An intrepid staff member took on the challenge, and went through three years’ worth of biosketches by hand to find 360 individuals who had received their first R01 awards from NIGMS and then compiled data on the years those individuals had graduated from college, completed graduate school, started their faculty positions, and received their R01 awards. Analysis of these data revealed that the median time from BS/BA to R01 award was ∼15 years, including a median of 3.6 years between starting a faculty position and receiving the grant. These results were presented to the NIGMS Advisory Council but were not shared more widely, because of the absence of a good medium at the time for reporting such results. I did provide them subsequently through a blog in the context of a discussion of similar issues (DrugMonkey, 2012 ). To address the communications need, we had developed the NIGMS Feedback Loop, first as an electronic newsletter (NIGMS, 2005 ) and subsequently as a blog (NIGMS, 2009 ). This vehicle has been of great utility for bidirectional communication, particularly under unusual circumstances. For example, during the period prior to the implementation of the American Recovery and Reinvestment Act, that is, the “stimulus bill,” I shared our thoughts and solicited input from the community. I subsequently received and answered hundreds of emails that offered reactions and suggestions. Having these admittedly nonscientific survey data in hand was useful in subsequent NIH-wide policy-development discussions.At this point, staff members at several NIH institutes, including NIGMS, were developing tools for data analysis, including the ability to link results from different data systems. Many of the questions I was most eager to address involved the relationship between scientific productivity and other parameters, including the level of grant support and the results of peer review that led to funding in the first place. With an initial system that was capable of linking NIH-funded investigators to publications, I performed an analysis of the number of publications from 2007 to mid-2010 attributed to NIH funding as a function of the total amount of annual NIH direct-cost support for 2938 NIGMS-funded investigators from fiscal year 2006 (Berg, 2010 ). The results revealed that the number of publications did not increase monotonically but rather reached a plateau near an annual funding level near $700,000. This observation received considerable attention (Wadman, 2010 ) and provided support for a long-standing NIGMS policy of imposing an extra level of oversight for well-funded investigators. It is important to note that, not surprisingly, there was considerable variation in the number of publications at all funding levels and, in my opinion, this observation is as important as the plateau in moving policies away from automatic caps and toward case-by-case analysis by staff armed with the data.This analysis provoked considerable discussion on the Feedback Loop blog and elsewhere regarding whether the number of publications was an appropriate measure of productivity. With better tools, it was possible to extend such analyses to other measures, including the number of citations, the number of citations relative to other publications, and many other factors. This extended set of metrics was applied to an analysis of the ability of peer-review scores to predict subsequent productivity (Berg, 2012a , b ). Three conclusions were supported by this analysis. 
First, the various metrics were sufficiently correlated with one another that the choice of metric did not affect any major conclusions (although metrics such as number of citations performed slightly better than number of publications). Second, peer-review scores could predict subsequent productivity to some extent (compared with randomly assigned scores), but the level of prediction was modest. Importantly, this provided some of the first direct evidence that peer review is capable of identifying applications that are more likely to be productive. Finally, the results revealed no noticeable drop-off in productivity, even near the 20th percentile, supporting the view that a substantial amount of productive science is being left unfunded with pay lines below the 20th percentile, let alone the 10th percentile.

In 2011, I moved to the University of Pittsburgh and also became president-elect of the American Society for Biochemistry and Molecular Biology (ASBMB). In my new positions, I have been able to gain a more direct perspective on the current state of the academic biomedical research enterprise. It is exciting to be back in the trenches again. On the other hand, my observations support a conclusion I had drawn while I was at NIH: the biomedical research enterprise is not sustainable in its present form, due not only to the level of federal support but also to the duration of training periods, the number of individuals being trained to support the research effort, the lack of appropriate pathways for individuals interested in careers as bench scientists, challenges in the interactions between the academic and private sectors, and other factors. Working with the Public Affairs Advisory Committee at ASBMB, we have produced a white paper (ASBMB, 2013) that we hope will help initiate conversations about imagining and then moving toward more sustainable models for biomedical research. We can expect to arrive at effective policy changes and initiatives only through data-driven and thorough self-examination and candid discussions between different stakeholders. We look forward to working with leaders and members from other scientific societies as we tackle this crucial set of issues.

Jeremy M. Berg
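The funding-versus-output analysis described above is, in outline, a simple binned summary of per-investigator data. Below is a minimal sketch of that kind of analysis; the file name, column names, and bin widths are illustrative assumptions, not the actual NIH data systems or code used in the original study.

```python
# Hypothetical sketch of a funding-vs-output binning analysis.
# Input assumptions: one row per funded investigator, with annual direct-cost
# support and the number of publications attributed to that support.
import numpy as np
import pandas as pd

df = pd.read_csv("investigator_output.csv")   # columns: annual_direct_costs, n_publications

# Bin investigators by annual funding level ($100k-wide bins up to $1.5M).
bins = np.arange(0, 1_600_000, 100_000)
df["funding_bin"] = pd.cut(df["annual_direct_costs"], bins)

# Median and interquartile range of publication counts per bin: the median
# exposes any plateau, while the IQR shows the spread at every funding level.
summary = (
    df.groupby("funding_bin", observed=True)["n_publications"]
      .agg(median="median",
           q25=lambda s: s.quantile(0.25),
           q75=lambda s: s.quantile(0.75),
           n="size")
)
print(summary)
```

Reporting the spread alongside the median matters here: as noted above, the variation at each funding level is as policy-relevant as the plateau itself.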

7.
There is a common misconception that the United States is suffering from a “STEM shortage,” a dearth of graduates with scientific, technological, engineering, and mathematical backgrounds. In biomedical science, however, we are likely suffering from the opposite problem and could certainly better tailor training to actual career outcomes. At the Future of Research Symposium, various workshops identified this as a key issue in a pipeline traditionally geared toward academia. Proposals for reform all ultimately come up against the same problem: there is a shocking lack of data at institutional and national levels on the size, shape, and successful careers of participants in the research workforce. In this paper, we call for improved institutional reporting of the number of graduate students and postdocs and their training and career outcomes.We and our fellow postdocs across the Boston area (from institutions including Tufts, Harvard Medical School, MIT, Brandeis, and Boston University) organized the Future of Research Symposium (http://futureofresearch.org). In so doing, we sought to give young scientists in Boston a voice in discussions of fundamental challenges facing the research enterprise, such as hypercompetition, skewed incentives, and an unsustainable workforce model (Alberts et al., 2014 ). During the symposium, attendees (largely postdocs and graduate students) participated in workshops designed to identify the most pressing concerns for trainees and to solicit their thoughts on possible solutions. While the complete outcomes of those sessions are listed in our meeting report (McDowell et al., 2015 ) and the supporting data (McDowell et al., 2015 , Data set 1), the organizing committee identified three principles crucial to building a more sustainable scientific enterprise, among them transparency in collecting and sharing information on the research workforce.Our culture is affected by a deeply ingrained notion that there is a “STEM shortage”—a dearth of graduates with scientific, technological, engineering, and mathematical backgrounds— an assertion that has been repeated too many times to count (Teitelbaum, 2014 ). For example, the President''s Council of Advisors on Science and Technology called for an additional one million science, technology, engineering, and mathematics (STEM) trainees in 2012 (PCAST, 2012 ). Yet a recent report by the Center for Immigration Studies using U.S. census data is one of a chorus of recent publications asserting that STEM graduates are actually struggling to get relevant jobs (Camarota and Zeigler, 2014 ). For example, only 11% of those who hold a bachelor''s degree in science actually work in a science field (table 2 in Camarota and Zeigler, 2014 ). This rhetoric is also blatantly misleading for PhD holders in biomedical science and probably lulls students interested in this path into a false sense of job security. The number of graduate students has roughly doubled from 1990 to 2012 along with a comparable increase in the number of postdocs (figures 1 and 5 in National Institutes of Health [NIH], 2012 ). Yet there is little evidence to suggest that permanent research positions, whether in academia or industry, have increased concomitantly. The problem has been eloquently summed up by Henry Bourne, referring to the swelling postdoc pool (Bourne, 2013a ) that becomes a “holding tank” (Bourne, 2013b ) from which PhD holders find great difficulty transitioning into permanent positions. 
Tellingly, in the National Science Foundation''s (NSF) Science and Engineering Indicators 2014 report, the most rapidly growing reason cited for starting a postdoc is “other employment not available” (table 5-19 in National Science Board, 2014, p. 5-34 ). Recent efforts to make PhD programs broadly applicable outside academia (through the NIH BEST grants and other efforts) have bolstered the argument that a PhD in biomedical sciences is broadly applicable for many careers, but a culture still exists in academia that graduate students should be training only for academic tracks. While there may be some argument for maintaining current levels of graduate student numbers, on the condition that they receive training relevant to their own career goals, the benefits of a large postdoctoral workforce are still being called sharply into question.Despite this, many leading officials have yet to take a position on the issue of the size of the workforce. For example, Sally Rockey and Francis Collins have written that “there is no definitive evidence that PhD production exceeds current employment opportunities” (Rockey and Collins, 2013 ).Technically, this is correct, but only because there are no definitive data at all. Take, for example, a very basic metric: How many postdocs are there in the U.S. research system? This is clearly a statistic that the NIH should have on hand to make the bold assertion that PhD numbers do not exceed employment opportunities: after all, many PhDs simply transition into becoming postdoctoral researchers. Except, the NIH does not know how many postdocs there are. The Boston Globe recently reported that, “The National Institutes of Health estimates there are somewhere between 37,000 and 68,000 postdocs in the country,” a tolerance of 15,500 (Johnson, 2014 ). The NIH''s Biomedical Research Workforce Working Group Report gives no concrete numbers, and it qualifies data it does show with “the number of postdoctoral researchers … may be underestimated by as much as a factor of 2” (National Institutes of Health, 2012, p. 2 ) One estimate puts the number at a little more than 50,000 (National Research Council, 2011 ), while the NSF, using data from the Survey of Graduate Students and Postdoctorates in Science and Engineering, estimates 63,000 postdocs, 44,000 of whom are in science and engineering (National Science Board, 2014 ). From data from Boston-area postdoctoral offices, we are certain the number of postdocs in the Boston area alone approaches 9000, and so we agree with the National Postdoctoral Association that all these estimates are too low and that the number of postdoctoral researchers in the United States is close to 90,000 (www.nationalpostdoc.org/policy-22/what-is-a-postdoc). But the fact that this number is up for debate at all speaks to a need for better accounting practices, especially since alarms have sounded at the pyramidal nature of the workforce for more than a decade (National Research Council, 1998 ; Kennedy et al., 2004 ).While data on the biomedical research workforce are still incomplete, anecdotal evidence suggests graduate students are finally becoming savvier about their professional futures. We conducted an informal poll of a dozen students from across the United States, asking them what they thought of the job market for PhDs at the time they accepted the offer to go to graduate school (Figure 1). 
Those who entered graduate school earlier reported not considering the job market before starting their PhD; by contrast, those who matriculated more recently reported low expectations, especially for academic careers. While our extremely small survey would suggest that some students are entering graduate school with no expectation of staying in academia whatsoever, their choices are by necessity based on hearsay rather than concrete information.

FIGURE 1: Excerpted quotes from survey respondents. The question posed was “What did you think of the job market for PhDs at the time you accepted the offer to go to graduate school?” The year of matriculation is listed below each quote. Full responses are listed in Supplemental Table 1.

Therefore we believe that graduate programs and postdoc offices have a moral imperative to inform students and fellows of what they are getting into. We call for increased efforts in collecting and sharing data on student and fellow demographics and career outcomes, such as by conducting thorough exit and alumni surveys. We also encourage our recently graduated peers to cooperate fully with such requests from our alma maters. In biomedical science, some institutions are leading the way on this front, with the University of California–San Francisco and Duke University’s Program in Cell and Molecular Biology posting some statistics online (UCSF Graduate Division, 2013; Duke University, 2015). We believe that there is an obligation for other institutions to follow their lead. In addition, we believe that a culture supporting transparency will ultimately strengthen the scientific enterprise.

First, clear communication of career information may increase student and postdoc productivity down the road. While research shows that postdocs are able to accurately estimate their chances of attaining a faculty position (Sauermann, 2013), our experience suggests that many current graduate students do not gain this awareness until later in their careers. When rosy illusions are shattered only after an investment of many years, the ensuing disgruntlement can negatively impact trainees themselves, others in the lab, and even entire communities at particular institutions. Instead, making student outcomes more readily available is likely to select for students with realistic expectations of their training. Much like Orion Weiner’s finding that students with prior research experience subjectively perform better in graduate school, trainees who “know what they are getting into” may be more likely to display sustained motivation (Weiner, 2014).

Second, disclosure of these data will act as a catalyst for change. Increased transparency of program outcomes will help hold institutions and programs accountable for the quality of training they provide. Also, increased awareness of the actual career paths chosen by trainees will encourage programs to offer training in skills apart from those required to conduct academic research. Increased instruction in writing, management, and leadership will benefit all trainees, including those who do stay in academic research.

Students’ motivations for entering graduate school are already changing; academic institutions must now discard old rhetoric about the purpose of graduate school and confront this new landscape. It can no longer be acceptable to drive graduate programs purely toward academic career paths.
While critics may worry that such honesty could discourage some trainees from applying, it will also encourage those whose goals are better in line with their likely outcomes. As the research enterprise changes shape, students and postdocs deserve to enter it with their eyes open.

8.
Permanent secondary growth in plants requires that cell proliferation and differentiation be strictly controlled in the vascular meristem, which consists of (pro)cambial cells. The peptide hormone tracheary element differentiation inhibitory factor (TDIF) inhibits xylem differentiation in (pro)cambial cells, whereas the plant hormone brassinosteroid (BR) promotes it. However, it remains unclear how TDIF and BR cooperate to regulate xylem differentiation and thereby maintain the vascular meristem. In this study, I developed a simple method for evaluating xylem differentiation frequency in a vascular induction system, the Vascular cell Induction culture System Using Arabidopsis Leaves (VISUAL), by using a xylem-specific luciferase reporter line. In this quantitative system, TDIF suppressed and BR promoted xylem differentiation in a dose-dependent manner. Moreover, simultaneous treatment of (pro)cambial cells with TDIF and BR revealed that the two hormones can cancel each other’s effects on xylem differentiation, suggesting a competitive relationship between them. Thus, mutual inhibition of “ON” and “OFF” signals enables fine-tuned regulation of xylem differentiation in the vascular meristem.
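Dose-dependent effects such as those described above are typically summarized by fitting a dose-response curve to the reporter readout. The sketch below shows one way a TDIF dilution series could be quantified, assuming a four-parameter logistic model; the doses and luciferase values are placeholders, not data from the VISUAL experiments.

```python
# A minimal sketch of quantifying dose-dependent suppression of a xylem-specific
# luciferase readout with a four-parameter logistic (Hill) fit.
# All numbers below are made-up placeholders.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(dose, bottom, top, ec50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (dose / ec50) ** hill)

tdif_nM    = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)  # placeholder doses
luc_signal = np.array([95, 90, 75, 50, 30, 20, 15], dtype=float)    # placeholder readouts (% of mock)

params, _ = curve_fit(four_pl, tdif_nM, luc_signal, p0=[10, 100, 30, 1], maxfev=10_000)
bottom, top, ec50, hill = params
print(f"Estimated EC50 ≈ {ec50:.1f} nM, Hill slope ≈ {hill:.2f}")
```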

9.
10.
11.
12.
13.
Biosensors for signaling molecules allow the study of physiological processes by bringing together the fields of protein engineering, fluorescence imaging, and cell biology. Construction of genetically encoded biosensors generally relies on the availability of a binding “core” that is both specific and stable, which can then be combined with fluorescent molecules to create a sensor. However, binding proteins with the desired properties are often not available in nature, and substantial improvement of sensors can be required, particularly with regard to their durability. Ancestral protein reconstruction is a powerful protein-engineering tool able to generate highly stable and functional proteins. In this work, we sought to establish the utility of ancestral protein reconstruction for biosensor development, beginning with the construction of an l-arginine biosensor. l-arginine, as the immediate precursor to nitric oxide, is an important molecule in many physiological contexts, including brain function. Using a combination of ancestral reconstruction and circular permutation, we constructed a Förster resonance energy transfer (FRET) biosensor for l-arginine (cpFLIPR). cpFLIPR displays high sensitivity and specificity, with a Kd of ∼14 µM and a maximal dynamic range of 35%. Importantly, cpFLIPR was highly robust, enabling accurate l-arginine measurement at physiological temperatures. We established that cpFLIPR is compatible with two-photon excitation fluorescence microscopy and used it to report l-arginine concentrations in brain tissue.
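For a ratiometric sensor of this kind, a measured FRET ratio can be converted to a ligand concentration under a simple one-site binding model. The sketch below illustrates that conversion using the reported Kd of ∼14 µM; the function name and the Rmin/Rmax calibration values are illustrative assumptions (in practice they would come from ligand-free and saturating measurements of the sensor).

```python
# A minimal sketch of converting a ratiometric FRET readout to ligand
# concentration, assuming a simple one-site binding model:
#   [L] = Kd * (R - Rmin) / (Rmax - R)
def arginine_from_ratio(r, r_min, r_max, kd_uM=14.0):
    """Estimate [L-arginine] (uM) from a FRET ratio R for a one-site sensor."""
    if not (r_min < r < r_max):
        raise ValueError("Ratio outside the calibrated range of the sensor")
    return kd_uM * (r - r_min) / (r_max - r)

# Example with made-up calibration values (35% maximal dynamic range):
r_min, r_max = 1.00, 1.35
print(arginine_from_ratio(1.20, r_min, r_max))   # ~18.7 uM with these placeholder numbers
```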

14.
15.
New technologies drive progress in many research fields, including cell biology. Much technological innovation comes from “bottom-up” efforts by individual students and postdocs. However, technology development can be challenging, and a successful outcome depends on many factors. This article outlines some considerations that are important when embarking on a technology development project. Despite the challenges, developing a new technology can be extremely rewarding and could lead to a lasting impact in a given field.

As is true for many fields of research, cell biology has always been propelled forward by technological innovations (Botstein, 2010). Thanks to these advances we now have access to microscopes and other equipment with exquisite resolution and sensitivity, a variety of methods to track and quantify biological molecules, and many ingenious tools to manipulate genes, molecules, organelles, and cells. In addition, we have hardware and software that enable us to analyze our data, and build models of cells and their components.

Naturally, even today’s technologies have limitations, and hence there is always a need for improvements and for completely novel approaches that create new opportunities. Cell biology is one of the research areas with many chances for individual young scientists to invent and develop such new technologies. Numerous recent examples illustrate that such “bottom-up” efforts can be highly successful across all areas in cell biology; e.g., as a handy vector for RNA interference (Brummelkamp et al., 2002); as methods for visualization of protein–protein or protein–DNA interactions (Roux et al., 2012; Kind et al., 2013); as tools to study chromatin (van Steensel et al., 2001), ribonucleoprotein complexes (Ule et al., 2003), or translation (Ingolia et al., 2009); or as tags for sensitive protein detection (Tanenbaum et al., 2014), just to name a few examples.

As a student or postdoc, you may similarly conceive an idea for a new method or tool. Usually this idea is inspired by a biological question that you are trying to address in your ongoing research project. You might then also realize that the new method, at least on paper, may have additional applications. Yet the development of a new technique typically requires a substantial effort. Should you halt or delay your ongoing research and embark on the development of this new technique? And if so, what is the best strategy to minimize the risks and maximize the chance of success? How do you get the most out of the investment that it takes to develop the method? Here I will discuss some issues that students and postdocs might want to consider when venturing into the development of a new technique.

To develop or not to develop

Development of a new technique can take one to five years of full-time effort, and hence can be a risky endeavor for a young scientist. The decision to start such a project therefore requires careful weighing of the pros and cons (see text box). In essence, there are four main considerations.

Points to consider before starting to develop a new technology.

• Literature search: Does a similar technology already exist? Is there published evidence for or against its feasibility?
• How much time and effort will it take?
• What is the chance of success?
• Are you in the right environment to develop the technology?
• Are simple assays available for testing and optimization?
• How important are the biological questions that can be addressed?
• How broadly applicable will the technology be?
• What are the advantages compared with existing methods?
• Is the timing right (will there be substantial interest in the technology)?
• Is there potential for future applications/modifications that will further enhance the technology?
• How easy will it be for other researchers to use the technology?

First, conduct a thorough literature survey to ensure that the method has not been developed by others already, and to search for indications that the method may or may not work. The second consideration is the potential impact of the new technology. Impact is often difficult to predict, but it is linked to how broadly applicable the technology will be. Will the new technology only provide an answer to your specific biological question, or will it be more widely applicable? It may be helpful to ask: how many other scientists will be interested in using the technology, or at least will profit substantially from the resulting biological data or knowledge? If the answer is “about five,” then the impact will likely be low; if the answer is “possibly hundreds,” then it will certainly be worth the investment. This potential impact must be balanced against the third consideration, which is the estimated amount of time and effort it takes to develop the technology. The fourth major consideration is: What is the chance that my technique will actually work and what is the risk of failure? There is no general answer to this question, but below I will outline strategies to reduce the risk of failure and minimize the associated loss of time and effort. For this I will consider the common phases of technology development (Fig. 1).

Figure 1. Flow diagram showing the typical phases of technology development.

Quick proof-of-principle

An adage that is often heard in the biotechnology industry is “fail fast.” It is OK if a project turns out to be unsuccessful, as long as the failure becomes obvious soon after the start. This way the lost investment will be minimal. In an academic setting, it may also be good to prevent finding yourself empty-handed after years of work. As a rule of thumb, I suggest that one should aim to obtain a basic proof-of-principle within approximately four months of full-time work. If after this period there still is no indication that the method may eventually work, then it may be wise to terminate the project, because further efforts are then also likely to be too time-consuming. It is thus advisable to schedule a “continue/terminate” decision point about four months after the start of the project—and stick to it. Note that at this stage the proof-of-principle evidence may be rudimentary, but it is crucial that it is convincing enough to be a firm basis for the next step: optimization.

Optimization cycles

Obtaining the first proof-of-principle evidence is a reason to celebrate, but usually there is still a long way to go toward a robust, generally applicable method. Careful optimization is required, through iterations of systematic tuning of parameters and testing of the performance. This can be the most time-consuming phase of technology development. To keep the cycle time of the iterative optimizations short, it is essential that a quick, easy readout is chosen. This readout should be based on a simple assay that ideally requires no more than 1–2 d. It is important that the required equipment is readily accessible; for example, if for each iteration you have to wait for several weeks to get access to an overbooked shared FACS or sequencing machine, or if you depend on the goodwill of a distant collaborator who has many other things on his mind, then the optimization process will be slow and frustrating. If your technology consists of a lengthy protocol with multiple steps, try to optimize each step individually (separated from the rest of the protocol), and include good positive and negative controls.

Remember that statistical analysis is your ally: it is a tool to distinguish probable signals from random noise and thus enables you to make rational decisions in the optimization process (did condition A really yield better results than condition B?). Assays with quantitative readouts are easier to analyze statistically and are therefore preferable.
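As a concrete illustration of that statistical point, a quick two-condition comparison might look like the sketch below; the replicate values are placeholders, and Welch's t-test is just one reasonable choice for a quantitative readout.

```python
# A minimal sketch of the "did condition A really outperform condition B?" check.
# The readout values are placeholders for whatever quantitative assay is used.
from scipy import stats

condition_a = [412, 398, 455, 430, 441]   # e.g., reporter signal per replicate
condition_b = [365, 380, 342, 371, 390]

# Welch's t-test does not assume equal variances between conditions.
result = stats.ttest_ind(condition_a, condition_b, equal_var=False)
print(f"mean A = {sum(condition_a) / len(condition_a):.1f}, "
      f"mean B = {sum(condition_b) / len(condition_b):.1f}, "
      f"p = {result.pvalue:.3f}")
```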

Version 1.0: Reaping the first biological insights

During the optimization process it is helpful to define an endpoint that will result in “version 1.0” of the technology. Typically this is when the technology is ready to address its first interesting biological question. Once you have reached this point, it may be useful to temporarily refrain from further optimization of the technology, and focus on applying it to this biological question. This has two purposes. First, it subjects the technology to a real-life test that may expose some of its shortcomings, which then need to be addressed in further optimization cycles. Second, it may yield biological data that illustrate the usefulness of the technology, which may inspire other scientists to adopt the method. If you are based in a strictly technology-oriented laboratory, collaboration with a colleague who is an expert in the biological system at hand may expedite this phase and help to work out bugs in the methodology.

If version 1.0 performs well in this biological test, it may be time to publish the method. For senior postdocs, this may also be a good moment to start your own laboratory. A new technology is usually a perfect basis for such a step.

Disseminating and leveraging the technology

When, upon publication, other scientists adopt your new technology, they will often implement improvements and new applications, which makes the technology attractive to yet more scientists. This snowball effect is one of the hallmarks of a high-impact technology. An extreme example is the recently developed CRISPR–Cas9 technology (Doudna and Charpentier, 2014), for which improvements and new applications are currently reported almost on a weekly basis. What can you do to get such a snowball rolling?

First, it helps to publish the new technology in a widely read or Open Access journal, to present it at conferences, and to initiate collaborations in order to reach a broad group of potential users. Second, the threshold for others to use the new technology must be as low as possible. Thus, implementation of the technology must be simple, and users must have easy access to detailed protocols. A website with troubleshooting advice, answers to frequently asked questions, and (if applicable) software for download will also help. Depending on the complexity of the technology, it may be worth considering whether to organize hands-on training, perhaps in the form of a short course. This may seem like a big investment, but it can substantially contribute to the snowball effect.

Third, materials and software required for the technology should be readily available. Technology transfer offices of research institutes often insist on the signing of a material transfer agreement (MTA) before materials such as plasmids can be shared. But all too often this leads to a substantial administrative burden and delays of weeks or even months. Free “no-strings-attached” sharing of reagents is often the best way to promote your technology—and scientific progress in general.

Patents and the commercial route

Before publication of the technology, you may consider protecting the intellectual property by filing a patent application. Most academic institutes do this, but often the associated costs are high and the ultimate profits uncertain, in part because it can be difficult to enforce protection of a patented technology (how do you prove that your technology was used by someone else?). That said, some technologies or associated materials may be more effectively scaled up and disseminated through a commercial route than via purely academic channels. Specific companies may have distribution infrastructure or technical expertise that is hard to match in an academic laboratory. Founding your own company may also be a way to give the technology more leverage, as it provides access to funds not available in an academic setting. In these cases, timely filing of a patent application may be essential. Note that in certain countries one cannot apply for a patent once the technology has been publicly disclosed (e.g., at a conference).

Competing technologies

Often different technologies for the same purpose are invented independently and more or less simultaneously. It is therefore quite likely that sooner or later an alternative technology emerges in the literature, or appears on the commercial market. This is sometimes referred to as “competing technology,” but in an academic setting this is somewhat of a misnomer, as solid science requires multiple independent methods to cross-validate results. Moreover, it is extremely rare that two independent technologies cover exactly the same spectrum of applications. For example, one technology may have a higher resolution, but the other may be superior in sensitivity. The sudden emergence of a competing technology can however have strategic consequences, and it is important to carefully define the advantages of your technology and focus on these strengths.

A bright future for technology development

New technologies generally consist of a new combination of available technologies, or apply newly discovered fundamental principles. Because the pool of available knowledge and tools continues to expand, the opportunities to devise and test new methods will only improve. This is further facilitated by the increasing quality of basic methods and tools to build on. Thus, there is a bright future for technology development. With a carefully designed strategy, the risks associated with such efforts can be minimized and the overall impact maximized. In the end, it is extremely gratifying to apply a “home-grown” technology to exciting biological questions, and to see other laboratories use it.

16.
17.
Martinson BC. EMBO reports 2011, 12(8): 758–762
Universities have been churning out PhD students to reap financial and other rewards for training biomedical scientists. This deluge of cheap labour has created unhealthy competition, which encourages scientific misconduct.Most developed nations invest a considerable amount of public money in scientific research for a variety of reasons: most importantly because research is regarded as a motor for economic progress and development, and to train a research workforce for both academia and industry. Not surprisingly, governments are occasionally confronted with questions about whether the money invested in research is appropriate and whether taxpayers are getting the maximum value for their investments.…questions about the size and composition of the research workforce have historically been driven by concerns that the system produces an insufficient number of scientistsThe training and maintenance of the research workforce is a large component of these investments. Yet discussions in the USA about the appropriate size of this workforce have typically been contentious, owing to an apparent lack of reliable data to tell us whether the system yields academic ‘reproduction rates'' that are above, below or at replacement levels. In the USA, questions about the size and composition of the research workforce have historically been driven by concerns that the system produces an insufficient number of scientists. As Donald Kennedy, then Editor-in-Chief of Science, noted several years ago, leaders in prestigious academic institutions have repeatedly rung alarm bells about shortages in the science workforce. Less often does one see questions raised about whether too many scientists are being produced or concerns about unintended consequences that may result from such overproduction. Yet recognizing that resources are finite, it seems reasonable to ask what level of competition for resources is productive, and at what level does competition become counter-productive.Finding a proper balance between the size of the research workforce and the resources available to sustain it has other important implications. Unhealthy competition—too many people clamouring for too little money and too few desirable positions—creates its own problems, most notably research misconduct and lower-quality, less innovative research. If an increasing number of scientists are scrambling for jobs and resources, some might begin to cut corners in order to gain a competitive edge. Moreover, many in the science community worry that every publicized case of research misconduct could jeopardize those resources, if politicians and taxpayers become unwilling to invest in a research system that seems to be riddled with fraud and misconduct.The biomedical research enterprise in the USA provides a useful context in which to examine the level of competition for resources among academic scientists. My thesis is that the system of publicly funded research in the USA as it is currently configured supports a feedback system of institutional incentives that generate excessive competition for resources in biomedical research. These institutional incentives encourage universities to overproduce graduate students and postdoctoral scientists, who are both trainees and a cheap source of skilled labour for research while in training. 
However, once they have completed their training, they become competitors for money and positions, thereby exacerbating competitive pressures.Questions raised about whether too many scientists are being produced or concerns about the unintended consequences of such overproduction are less commonThe resulting scarcity of resources, partly through its effect on peer review, leads to a shunting of resources away from both younger researchers and the most innovative ideas, which undermines the effectiveness of the research enterprise as a whole. Faced with an increasing number of grant applications and the consequent decrease in the percentage of projects that can be funded, reviewers tend to ‘play it safe'' and favour projects that have a higher likelihood of yielding results, even if the research is conservative in the sense that it does not explore new questions. Resource scarcity can also introduce unwanted randomness to the process of determining which research gets funded. A large group of scientists, led by a cancer biologist, has recently mounted a campaign against a change in a policy of the National Institutes of Health (NIH) to allow only one resubmission of an unfunded grant proposal (Wadman, 2011). The core of their argument is that peer reviewers are likely able to distinguish the top 20% of research applications from the rest, but that within that top 20%, distinguishing the top 5% or 10% means asking peer reviewers for a level of precision that is simply not possible. With funding levels in many NIH institutes now within that 5–10% range, the argument is that reviewers are being forced to choose at random which excellent applications do and do not get funding. In addition to the inefficiency of overproduction and excessive competition in terms of their costs to society and opportunity costs to individuals, these institutional incentives might undermine the integrity and quality of science, and reduce the likelihood of breakthroughs.My colleagues and I have expressed such concerns about workforce dynamics and related issues in several publications (Martinson, 2007; Martinson et al, 2005, 2006, 2009, 2010). Early on, we observed that, “missing from current analyses of scientific integrity is a consideration of the wider research environment, including institutional and systemic structures” (Martinson et al, 2005). Our more recent publications have been more specific about the institutional and systemic structures concerned. It seems that at least a few important leaders in science share these concerns.In April 2009, the NIH, through the National Institute of General Medical Sciences (NIGMS), issued a request for applications (RFA) calling for proposals to develop computational models of the research workforce (http://grants.nih.gov/grants/guide/rfa-files/RFA-GM-10-003.html). 
Although such an initiative might be premature given the current level of knowledge, the rationale behind the RFA seems irrefutable: “there is a need to […] pursue a systems-based approach to the study of scientific workforce dynamics.” Roughly four decades after the NIH appeared on the scene, this is, to my knowledge, the first official, public recognition that the biomedical workforce tends not to conform nicely to market forces of supply and demand, despite the fact that others have previously made such arguments.Early last year, Francis Collins, Director of the NIH, published a PolicyForum article in Science, voicing many of the concerns I have expressed about specific influences that have led to growth rates in the science workforce that are undermining the effectiveness of research in general, and biomedical research in particular. He notes the increasing stress in the biomedical research community after the end of the NIH “budget doubling” between 1998 and 2003, and the likelihood of further disruptions when the American Recovery and Reinvestment Act of 2009 (ARRA) funding ends in 2011. Arguing that innovation is crucial to the future success of biomedical research, he notes the tendency towards conservatism of the NIH peer-review process, and how this worsens in fiscally tight times. Collins further highlights the ageing of the NIH workforce—as grants increasingly go to older scientists—and the increasing time that researchers are spending in itinerant and low-paid postdoctoral positions as they stack up in a holding pattern, waiting for faculty positions that may or may not materialize. Having noted these challenging trends, and echoing the central concerns of a 2007 Nature commentary (Martinson, 2007), he concludes that “…it is time for NIH to develop better models to guide decisions about the optimum size and nature of the US workforce for biomedical research. A related issue that needs attention, though it will be controversial, is whether institutional incentives in the current system that encourage faculty to obtain up to 100% of their salary from grants are the best way to encourage productivity.”Similarly, Bruce Alberts, Editor-in-Chief of Science, writing about incentives for innovation, notes that the US biomedical research enterprise includes more than 100,000 graduate students and postdoctoral fellows. He observes that “only a select few will go on to become independent research scientists in academia”, and argues that “assuming that the system supporting this career path works well, these will be the individuals with the most talent and interest in such an endeavor” (Alberts, 2009).His editorial is not concerned with what happens to the remaining majority, but argues that even among the select few who manage to succeed, the funding process for biomedical research “forces them to avoid risk-taking and innovation”. The primary culprit, in his estimation, is the conservatism of the traditional peer-review system for federal grants, which values “research projects that are almost certain to ‘work''”. 
He continues, “the innovation that is essential for keeping science exciting and productive is replaced by […] research that has little chance of producing the breakthroughs needed to improve human health.”If an increasing number of scientists are scrambling for jobs and resources, some might begin to cut corners in order to gain a competitive edgeAlthough I believe his assessment of the symptoms is correct, I think he has misdiagnosed the cause, in part because he has failed to identify which influence he is concerned with from the network of influences in biomedical research. To contextualize the influences of concern to Alberts, we must consider the remaining majority of doctorally trained individuals so easily dismissed in his editorial, and further examine what drives the dynamics of the biomedical research workforce.Labour economists might argue that market forces will always balance the number of individuals with doctorates with the number of appropriate jobs for them in the long term. Such arguments would ignore, however, the typical information asymmetry between incoming graduate students, whose knowledge about their eventual job opportunities and career options is by definition far more limited than that of those who run the training programmes. They would also ignore the fact that universities are generally not confronted with the externalities resulting from overproduction of PhDs, and have positive financial incentives that encourage overproduction. During the past 40 years, NIH ‘extramural'' funding has become crucial for graduate student training, faculty salaries and university overheads. For their part, universities have embraced NIH extramural funding as a primary revenue source that, for a time, allowed them to implement a business model based on the interconnected assumptions that, as one of the primary ‘outputs'' or ‘products'' of the university, more doctorally trained individuals are always better than fewer, and because these individuals are an excellent source of cheap, skilled labour during their training, they help to contain the real costs of faculty research.“…the current system has succeeded in maximizing the amount of research […] it has also degraded the quality of graduate training and led to an overproduction of PhDs…”However, it has also made universities increasingly dependent on NIH funding. As recently documented by the economist Paula Stephan, most faculty growth in graduate school programmes during the past decade has occurred in medical colleges, with the majority—more than 70%—in non-tenure-track positions. Arguably, this represents a shift of risk away from universities and onto their faculty. Despite perennial cries of concern about shortages in the research workforce (Butz et al, 2003; Kennedy et al, 2004; National Academy of Sciences et al, 2005) a number of commentators have recently expressed concerns that the current system of academic research might be overbuilt (Cech, 2005; Heinig et al, 2007; Martinson, 2007; Stephan, 2007). Some explicitly connect this to structural arrangements between the universities and NIH funding (Cech, 2005; Collins, 2007; Martinson, 2007; Stephan, 2007).In 1995, David Korn pointed out what he saw as some problematic aspects of the business model employed by Academic Medical Centers (AMCs) in the USA during the past few decades (Korn, 1995). 
He noted the reliance of AMCs on the relatively low-cost, but highly skilled labour represented by postdoctoral fellows, graduate students and others—who quickly start to compete with their own professors and mentors for resources. Having identified the economic dependence of the AMCs on these inexpensive labour pools, he noted additional problems with the graduate training programmes themselves. “These programs are […] imbued with a value system that clearly indicates to all participants that true success is only marked by the attainment of a faculty position in a high-profile research institution and the coveted status of principal investigator on NIH grants.” Pointing to “more than 10 years of severe supply/demand imbalance in NIH funds”, Korn concluded that, “considering the generative nature of each faculty mentor, this enterprise could only sustain itself in an inflationary environment, in which the society''s investment in biomedical research and clinical care was continuously and sharply expanding.” From 1994 to 2003, total funding for biomedical research in the USA increased at an annual rate of 7.8%, after adjustment for inflation. The comparable rate of growth between 2003 and 2007 was 3.4% (Dorsey et al, 2010). These observations resonate with the now classic observation by Derek J. de Solla Price, from more than 30 years before, that growth in science frequently follows an exponential pattern that cannot continue indefinitely; the enterprise must eventually come to a plateau (de Solla Price, 1963).In May 2009, echoing some of Korn''s observations, Nobel laureate Roald Hoffmann caused a stir in the US science community when he argued for a “de-coupling” of the dual roles of graduate students as trainees and cheap labour (Hoffmann, 2009). His suggestion was to cease supporting graduate students with faculty research grants, and to use the money instead to create competitive awards for which graduate students could apply, making them more similar to free agents. During the ensuing discussion, Shirley Tilghman, president of Princeton University, argued that “although the current system has succeeded in maximizing the amount of research performed […] it has also degraded the quality of graduate training and led to an overproduction of PhDs in some areas. Unhitching training from research grants would be a much-needed form of professional ‘birth control''” (Mervis, 2009).The greying of the NIH research workforce is another important driver of workforce dynamics, and it is integrally linked to the fate of young scientistsAlthough the issue of what I will call the ‘academic birth rate'' is the central concern of this analysis, the ‘academic end-of-life'' also warrants some attention. The greying of the NIH research workforce is another important driver of workforce dynamics, and it is integrally linked to the fate of young scientists. A 2008 news item in Science quoted then 70-year-old Robert Wells, a molecular geneticist at Texas A&M University, “‘if I and other old birds continue to land the grants, the [young scientists] are not going to get them.” He worries that the budget will not be able to support “the 100 people ‘I''ve trained […] to replace me''” (Kaiser, 2008). While his claim of 100 trainees might be astonishing, it might be more astonishing that his was the outlying perspective. 
The majority of senior scientists interviewed for that article voiced intentions to keep doing science—and going after NIH grants—until someone forced them to stop or they died.Some have looked at the current situation with concern, primarily because of the threats it poses to the financial and academic viability of universities (Korn, 1995; Heinig et al, 2007; Korn & Heinig, 2007), although most of those who express such concerns have been distinctly reticent to acknowledge the role of universities in creating and maintaining the situation. Others have expressed concerns about the differential impact of extreme competition and meagre job prospects on the recruitment, development and career survival of young and aspiring scientists (Freeman et al, 2001; Kennedy et al, 2004; Martinson et al, 2006; Anderson et al, 2007a; Martinson, 2007; Stephan, 2007). There seems to be little disagreement, however, that the system has generated excessively high competition for federal research funding, and that this threatens to undermine the very innovation and production of knowledge that is its raison d''etre.The production of knowledge in science, particularly of the ‘revolutionary'' variety, is generally not a linear input–output process with predictable returns on investment, clear timelines and high levels of certainty (Lane, 2009). On the contrary, it is arguable that “revolutionary science is a high risk and long-term endeavour which usually fails” (Charlton & Andras, 2008). Predicting where, when and by whom breakthroughs in understanding will be produced has proven to be an extremely difficult task. In the face of such uncertainty, and denying the realities of finite resources, some have argued that the best bet is to maximize the number of scientists, using that logic to justify a steady-state production of new PhDs, regardless of whether the labour market is sending signals of increasing or decreasing demand for that supply. Only recently have we begun to explore the effects of the current arrangement on the process of knowledge production, and on innovation in particular (Charlton & Andras, 2008; Kolata, 2009).…most of those who express such concerns have been reticent to acknowledge the role of universities themselves in creating and maintaining the situationBruce Alberts, in the above-mentioned editorial, points to several initiatives launched by the NIH that aim to get a larger share of NIH funding into the hands of young scientists with particularly innovative ideas. These include the “New Innovator Award,” the “Pioneer Award” and the “Transformational R01 Awards”. The proportion of NIH funding dedicated to these awards, however, amounts to “only 0.27% of the NIH budget” (Alberts, 2009). Such a small proportion of the NIH budget does not seem likely to generate a large amount of more innovative science. 
Moreover, to the extent that such initiatives actually succeed in enticing more young investigators to become dependent on NIH funds, any benefit these efforts have in terms of innovation may be offset by further increases in competition for resources that will come when these new 'innovators' reach the end of this specialty funding and add to the rank and file of those scrapping for funds through the standard mechanisms.

Our studies on research integrity have been mostly oriented towards understanding how the environments within which academic scientists work might affect their behaviour, and thus the quality of the science they produce (Anderson et al, 2007a, 2007b; Martinson et al, 2009, 2010). My colleagues and I have focused on whether biomedical researchers perceive fairness in the various exchange relationships within their work systems. I am persuaded by the argument that expectations of fairness in exchange relationships have been hard-wired into us through evolution (Crockett et al, 2008; Hsu et al, 2008; Izuma et al, 2008; Pennisi, 2009), with the advent of modern markets being a primary manifestation of this. Thus, violations of these expectations strike me as potentially corrupting influences. Such violations might be prime motivators for ill will, possibly engendering bad-faith behaviour among those who perceive themselves to have been slighted, and thereby increasing the risk of research misconduct. They might also corrupt the enterprise by signalling to talented young people that biomedical research is an inhospitable environment in which to develop a career, possibly chasing away some of the most talented individuals and encouraging a selection of characteristics that might not lead to optimal effectiveness in terms of scientific innovation and productivity (Charlton, 2009).

To the extent that we have an ecology of steep competition that is fraught with high risks of career failure for young scientists—after they have incurred large costs of time, effort and sometimes financial resources to obtain a doctoral degree—why would we expect them to take on the additional, substantial risks involved in doing truly innovative science and asking risky research questions? And why, in such a cut-throat setting, would we not anticipate an increase in corner-cutting and a corrosion of good scientific practice, collegiality, mentoring and sociability? Would we not also expect a reduction in high-risk, innovative science, and a reversion to a more career-safe type of 'normal' science? Would this not reduce the effectiveness of the institution of biomedical research? I do not claim to know the conditions needed to maximize the production of research that is novel, innovative and conducted with integrity. I am fairly certain, however, that putting scientists in tenuous positions, in which their careers and livelihoods would be put at risk by pursuing truly revolutionary research, is one way to insure against it.

18.
Brothers in arms     
Andrea Rinaldi 《EMBO reports》2013,14(10):866-870
The horrific injuries and difficult working conditions faced by military medical personnel have forced the military to fund biomedical research to treat soldiers; the resulting technologies and techniques contribute significantly to civilian medicine.

War is the father of all things, Heraclitus believed. The military's demand for better weapons and transportation, as well as tools for communication, detection and surveillance, has driven technological progress during the past 150 years or so, producing countless civilian applications as a fallout. The military has invested heavily in high-energy physics, materials science, navigation systems and cryptology. Similarly, military-funded biomedical research encompasses the whole range from basic to applied research programmes (Fig 1), and the portion of military-funded research in the biological and medical fields is now considerable.

Figure 1. A 1944 advertisement for Diebold Inc. (Ohio, USA) in support of blood donations for soldiers wounded in the Second World War. The military has traditionally been one of the greatest proponents of active research on synthetic blood production, blood substitutes and oxygen therapeutics for treating battlefield casualties. One recent approach in this direction is the Defense Advanced Research Projects Agency's (DARPA's) Blood Pharming programme, which plans to use human haematopoietic stem cells—such as those obtained from umbilical cord blood—as a "starting material to develop an automated, fieldable cell culture and packaging system capable of producing transfusable amounts of universal donor red blood cells" (http://www.darpa.mil/Our_Work/DSO/Programs/Blood_Pharming.aspx).

War has always driven medical advances. From ancient Roman to modern times, treating the wounds of war has yielded surgical innovations that have been adopted by mainstream medicine. For example, the terrible effect of modern artillery on soldiers in the First World War was a major impetus for plastic surgery. Similarly, microbiology has benefited from war and military research: from antiseptics to prevent and cure gangrene to the massive production of penicillin during the Second World War, as well as more basic research into a wide range of pathogens, militaries worldwide have long been enthusiastic sponsors of microbiology research. Nowadays, military-funded research on pathogens uses state-of-the-art genotyping methods to study outbreaks and the spread of infection, and seeks new ways to combat the antibiotic resistance that afflicts both combatants and civilians.

The US Military Infectious Diseases Research Program (MIDRP) is particularly active in vaccine development to protect soldiers, especially those deployed overseas. Its website notes: "Since the passing of the 1962 Kefauver–Harris Drug Amendment, which added the FDA requirement for proof of efficacy in addition to proof of safety for human products, there have been 28 innovative vaccines licenced in the US, including 13 vaccines currently designated for paediatric use. These 28 innovative vaccine products targeted new microorganisms, utilized new technology, or consisted of novel combinations of vaccines. Of these 28, the US military played a significant role in the development of seven licenced vaccines" (https://midrp.amedd.army.mil/).
These successes include the tetravalent meningococcal vaccine and the oral typhoid vaccine, while current research is looking into the development of vaccines against malaria, dengue fever and hepatitis E.

Similarly, the US Military HIV Research Program (MHRP) is working on the development of a global HIV-1 vaccine (http://www.hivresearch.org). MHRP scientists were behind the RV144 vaccine study in Thailand—the largest ever HIV vaccine study conducted in humans—which demonstrated that the vaccine was capable of eliciting modest and transient protection against the virus [1]. In the wake of the cautious optimism raised by the trial, subsequent research is providing insights into the workings of RV144 and is opening new doors for vaccine designers to strengthen the vaccine. In a recent study, researchers isolated four monoclonal antibodies induced by the RV144 vaccine and directed at a particular region of the HIV virus envelope associated with reduced infection, the variable region 2. They found that these antibodies recognized HIV-1-infected CD4+ T cells and tagged them for attack by the immune system [2].

In response to the medical problems military personnel are suffering in Iraq and Afghanistan, a recent clinical trial funded by the US Department of the Army demonstrated the efficacy of the aminoglycoside antibiotic paromomycin—either with or without gentamicin—for the topical treatment of cutaneous leishmaniasis, the most common form of infection by Leishmania parasites. Cutaneous leishmaniasis—which is endemic in Iraq and Afghanistan and rather frequent among soldiers deployed there—is transmitted to humans through the bite of infected sandflies: it causes skin ulcers and sores and can lead to serious disability and social stigma [3]. Topical treatments would offer advantages over therapies that require the systemic administration of antiparasitic drugs. The study—a phase 3 trial—was conducted in Tunisia and enrolled some 375 patients with one to five ulcerative lesions from cutaneous leishmaniasis. Patients, all aged between 5 and 65, received topical applications of a cream containing either 15% paromomycin with 0.5% gentamicin, 15% paromomycin alone, or a control cream containing no antibiotic. The combination of paromomycin and gentamicin cured cutaneous leishmaniasis with an efficacy of 81%, compared with 82% for paromomycin alone and just 58% for the control—the skin sores of cutaneous leishmaniasis often heal on their own. Researchers reported no adverse reactions to the paromomycin-containing creams. Because the combination therapy with gentamicin is probably effective against a larger range of the Leishmania parasite species and strains causing the disease, it could become a first-line treatment for cutaneous leishmaniasis on a global scale, the authors concluded [3].

Not surprisingly, trauma and regenerative and reconstructive medicine are other large areas of research in which military influence is prevalent. The treatment of wounds, shock and the rehabilitation of major trauma patients are the very essence of medical aid on the battlefield (Figs 2, 3). "Our experience of military conflict, in particular the medicine required to deal with severe injuries, has led to significant improvements in trauma medicine.
Through advances in the prevention of blood loss and the treatment of coagulopathy, for example, patients are now surviving injuries that 5–10 years ago would have been fatal," said Professor Janet Lord, who leads a team investigating the inflammatory response in injured soldiers at the National Institute for Health Research Surgical Reconstruction and Microbiology Research Centre (NIHR SRMRC) in Birmingham, UK (http://www.srmrc.nihr.ac.uk/).

Figure 2. Medical services in Britain, 1917: making an artificial leg for a wounded serviceman at Roehampton Hospital in Surrey. This image is from The First World War Poetry Digital Archive, University of Oxford (www.oucs.ox.ac.uk/ww1lit). Copyright: The Imperial War Museum.

Figure 3. US soldiers use the fireman's carry to move a simulated casualty to safety during a hyper-realistic training environment, known as trauma lanes, as part of the final phase of the Combat Lifesaver Course given by medics from Headquarters Support Company, Headquarters and Headquarters Battalion, 25th Inf. Div., USD-C, at Camp Liberty, Iraq, March 2011. Credit: US Army, photo by Sgt Jennifer Sardam.

NIHR SRMRC integrates basic researchers at Birmingham University with clinicians and surgeons at the Royal Centre for Defence Medicine and University Hospital Birmingham to improve the treatment of traumatic injury in both military and civilian patients. As Lord explained, the centre has two trauma-related themes. The first looks at "[t]he acute response to injury, which analyses the kinetics and nature of the inflammatory response to tissue damage and is developing novel therapies to ensure the body responds appropriately to injury and does not stay in a hyper-inflamed state. The latter is a particular issue with older patients whose chance of recovery from trauma is significantly lower than younger patients," she said. The second theme is "[n]eurotrauma and regeneration, which studies traumatic brain injury, trying to develop better ways to detect this early to prevent poor outcomes if it goes undiagnosed," Lord said.

Kevlar helmets and body armour have saved the lives of many soldiers, but they offer little protection for the face and eyes, or against blasts to the head in general. Because human retinas and brains show little potential for regeneration, patients with face and eye injuries often suffer from loss of vision and other consequences for the rest of their lives. However, a new stem cell and regenerative approach for the treatment of retinal injury and blindness is on the horizon. "Recent progress in stem cell research has begun to emerge on the possible exploitation of stem cell-based strategies to repair the damaged CNS (central nervous system). In particular, research from our laboratory and others have demonstrated that Müller cells—dormant stem-like cells found throughout the retina—can serve as a source of retinal progenitor cells to regenerate photoreceptors as well as all other types of retinal neurons," explained Dong Feng Chen at the Schepens Eye Research Institute, Massachusetts Eye and Ear, Harvard Medical School, in Boston (Massachusetts, USA).
In collaboration with the US Department of Defense, the Schepens Institute is steering the Military Vision Research Program, "to develop new ways to save the vision of soldiers injured on today's battlefield and to push the frontier of vision technologies forward" (http://www.schepens.harvard.edu).

"My laboratory has shown that adult human and mouse Müller cells can not only regenerate retina-specific neurons, but can also do so following induction by a single small molecule compound, alpha-aminoadipate," Chen explained. She said that alpha-aminoadipate causes isolated Müller glial cells in culture to lose their glial phenotype, express progenitor cell markers and divide. Injection of alpha-aminoadipate into the subretinal space of adult mice in vivo induces mature Müller glia to de-differentiate and generate new retinal neurons and photoreceptor cells [4]. "Our current effort seeks to further elucidate the molecular pathways underlying the regenerative behaviour of Müller cells and to achieve functional regeneration of the damaged retina with small molecule compounds," Chen said. "As the retina has long served as a model of the CNS, and Müller cells share commonalities with astroglial lineage cells in the brain and spinal cord, the results of this study can potentially be broadened to future development of treatment strategies for other neurodegenerative diseases, such as brain and spinal cord trauma, or Alzheimer and Parkinson disease."

Brain injuries account for a large percentage of the wounds sustained by soldiers. The Defense Advanced Research Projects Agency (DARPA), an agency of the US Department of Defense, recently awarded US$6 million to a team of researchers to develop nanotechnology therapies for the treatment of traumatic brain injury and associated infections. The researchers are trying to develop nanoparticles carrying small interfering RNA (siRNA) molecules to reach and treat drug-resistant bacteria and inflammatory cells in the brain. Protecting the siRNA within a nanocomplex covered with specific tissue-homing and cell-penetrating peptides will make it possible to deliver the therapeutics to infected cells beyond the blood–brain barrier, which normally makes it difficult to get antibiotics into the brain. The project has been funded within the framework of DARPA's In Vivo Nanoplatforms programme, which "seeks to develop new classes of adaptable nanoparticles for persistent, distributed, unobtrusive physiologic and environmental sensing as well as the treatment of physiologic abnormalities, illness and infectious disease" (www.darpa.mil).

"The DARPA funding agency often uses the term 'DARPA-hard' to refer to problems that are extremely tough to solve. What makes this a DARPA-hard problem is the fact that it is so difficult to deliver therapeutics to the brain. This is an underserved area of research," explained team leader Michael Sailor, from the University of California San Diego, in a press release (http://ucsdnews.ucsd.edu/pressrelease/darpa_awards_6_million_to_develop_nanotech_therapies_for_traumatic_brain_in).

In the near future, DARPA, whose budget is set for a 1.8% increase to US$2.9 billion next year, will focus on another important project dealing with the CNS.
The BRAIN Initiative—short for Brain Research through Advancing Innovative Neurotechnologies—is a new research effort that its proponents intend will "revolutionize our understanding of the human mind and uncover new ways to treat, prevent, and cure brain disorders like Alzheimer's, schizophrenia, autism, epilepsy and traumatic brain injury" (www.whitehouse.gov). Out of a total US$110 million investment, DARPA will obtain US$50 million to work on understanding the dynamic functions of the brain and demonstrating breakthrough applications based on these insights (Fig 4). In addition to exploring new research areas, this money will be directed towards ongoing projects of typical—although not exclusive—military interest that involve enhancing or recovering brain functions, such as the development of brain-interfaced prosthetics and uncovering the mechanisms underlying neural reorganization and plasticity to accelerate injury recovery.

Figure 4. The BRAIN (Brain Research through Advancing Innovative Neurotechnologies) Initiative infographic. A complete version can be downloaded at http://www.whitehouse.gov/infographics/brain-initiative.

"[T]here is this enormous mystery waiting to be unlocked, and the BRAIN Initiative will change that by giving scientists the tools they need to get a dynamic picture of the brain in action and better understand how we think and how we learn and how we remember. And that knowledge could be—will be—transformative," said US President Obama, presenting the initiative (http://www.whitehouse.gov/the-press-office/2013/04/02/remarks-president-brain-initiative-and-american-innovation).

"The President's initiative reinforces the significance of understanding how the brain records, processes, uses, stores and retrieves vast quantities of information. This kind of knowledge of brain function could inspire the design of a new generation of information processing systems; lead to insights into brain injury and recovery mechanisms; and enable new diagnostics, therapies and devices to repair traumatic injury," explained DARPA Director Arati Prabhakar in a press release (http://www.darpa.mil/NewsEvents/Releases/2013/04/02.aspx).

But BRAIN is also stirring up some controversy. Some scientists fear that this kind of 'big and bold' science, with a rigid top-down approach and vaguely defined objectives, will drain resources from smaller projects in fundamental biology [5]. Others ask whether the BRAIN project investment will really generate the huge return hinted at in Obama's speech during the initiative's launch, or whether a substantial amount of hype about the potential outcomes was used to sell the project (http://ksj.mit.edu/tracker/2013/04/obamas-brain-initiative-and-alleged-140).

As these examples show, the most important player in military-funded biomedical research is the USA, with the UK following at a distance. But other countries with huge defence budgets are gearing up, although with less visibility. In July 2011, for instance, India and Kyrgyzstan opened the joint Mountain Biomedical Research Centre in the Kyrgyz capital Bishkek to carry out research into the mechanisms of short- and long-term high-altitude adaptation. The institute will use molecular biological approaches to identify markers for screening people for high-altitude resistance and for susceptibility to high-altitude sickness and other mountain maladies.
On the Indian side, the scientists involved in the new research centre belong to the Defence Institute of Physiology and Applied Sciences, and the money came from India's defence budget.

As mankind seems unlikely to give up on armed conflicts anytime soon, war-torn human bodies will still need to be cured and wounds healed. Whether the original impetus for military-funded biomedical research is noble or not, it nonetheless fuels considerable innovation, leading to important medical discoveries that ultimately benefit all.

19.
20.