Similar articles
20 similar articles found.
1.
There is no perfect recipe to balance work and life in academic research. Everyone has to find their own optimal balance to derive fulfilment from life and work. Subject Categories: S&S: Careers & Training

A few years ago, a colleague came into my office, looking a little irate, and said, “I just interviewed a prospective student, and the first question was, ‘how is work‐life balance here?’”. Said colleague then explained how this question was one of his triggers. Actually, this sentiment isn't unusual among many PIs. And yet, asking about one's expected workload is a fair question. While some applicants are actually coached to ask it at interviews, I think that many younger scientists have genuine concerns about whether they will have enough time away from the bench to have a life outside of work.
In a nutshell, I believe there is no one‐size‐fits‐all definition of work–life balance (WLB). I also think WLB takes different forms depending on one's career stage. As a new graduate student, I didn't exactly burn the midnight oil; it took me a couple of years to get my bench groove on, but once I did, I worked a lot and hard. I also worked on weekends and holidays, because I wanted answers to the questions I had, whether it was the outcome of a bacterial transformation or the result from a big animal experiment. As a post‐doc, I worked similarly hard, although I may have actually spent fewer hours at the bench because I just got more efficient and because I read a lot at home and on the six train. But I also knew that I had to do as much as I could to get a job in NYC, where my husband was already a faculty member. The pressure was high, and the stress was intense. If you ask people who knew me at the time, they can confirm I was also about 30 pounds lighter than I am now (for what it's worth, I was far from emaciated!).
As an assistant professor, I still worked a lot at the bench in addition to training students and writing grant applications (it took me three‐plus years and many tears to get my first grant). As science started to progress, work got even busier, but in a good way. By no means did I necessarily work harder than those around me—in fact, I know I could have worked even more. And I’m not going to lie, there can be a lot of guilt associated with not working as much as your neighbor.
My example is only one of millions, and there is no general manual on how to handle WLB. Everyone has their own optimal balance they have to figure out. People with children or other dependents are particularly challenged; as someone without kids, I cannot even fathom how tough it must be. Even with some institutions providing child care, or for those lucky enough to have family take care of children, juggling home life with “lab life” can create exceptional levels of stress. What I have observed over the years is that trainees and colleagues with children become ridiculously efficient; they are truly remarkable. One of my most accomplished trainees had two children while she was a post‐doc, and she is a force to be reckoned with—although no longer in my laboratory, she still is a tour de force at work, no less with child number three just delivered! I think recruiters should view candidates with families as well—if not better—equipped to multi‐task and get the job done.
There are so many paths one can take in life, and there is no single, “correct” choice. If I had to define WLB, I would say it is whatever one needs to do in order to get the work done to one's satisfaction. For some people, putting in long days and nights might be what is needed.
Does someone who puts in more hours necessarily do better than one who doesn't, or does a childless scientist produce more results than one with kids? Absolutely not. People also have different goals in life: Some are literally “wedded” to their work, while others put much more emphasis on spending time with their families and seeing their children grow up. Importantly, these goals are not set in stone and can fluctuate throughout one's life. Someone recently said to me that there can be periods of intense vertical growth where “balance” is not called for, and other times in life where it is important and needed. I believe this sentiment eloquently sums up most of our lives.
Now that I'm a graying, privileged professor, I have started to prioritize other areas of life, in particular my health. I go running regularly (well, maybe jog very slowly), which takes a lot of time, but it is important for me to stay healthy. Pre‐pandemic, I made plans to visit more people in person, as life is too short not to see family and friends. In many ways, having acquired the skills to work more efficiently after many years in the laboratory and office, along with giving myself more time for my health, has freed up my mind to think of science differently, perhaps more creatively. It seems no matter how much I think I’m tipping the balance toward life, work still creeps in, and that’s perfectly OK. At the end of the day, my work is my life, gladly, so I no longer worry about how much I work, nor do I worry about how much time I spend away from it. If you, too, accomplish your goals and derive fulfillment from your work and your life, neither should you.

2.
3.
4.
An interview with Facundo D Batista, the new Editor‐in‐Chief of The EMBO Journal.

Facundo D. Batista has shaped our understanding of the molecular and cellular biology of B‐cell activation. In 2016, he relocated his lab to the Ragon Institute of Massachusetts General Hospital, MIT, and Harvard to explore the translational potential of two decades of basic research in B‐cell biology. The interview was conducted by Thiago Carvalho. Thiago Carvalho (TC): What inspired you to pursue a career in science? Facundo D Batista (FDB): I was very inspired by my undergraduate course on molecular biology at the University of Buenos Aires. The course was given for the first time, and we were taught the basic techniques of handling DNA, producing insulin, and so forth. Two professors in the course, Daniel Goldstein and Alberto Kornblihtt, really primed us to open our horizons and encouraged training in centers of excellence abroad. I did not speak any English at all, and applying to graduate school in the United States and doing the GRE was impossible for me. I would not have passed. Then, an opportunity to go to Italy and get experience in institutes that could provide me with better training came up. If I recall correctly, we were the first generation of Argentinian biology graduates—myself, Pablo Pomposiello, and many others—that left Argentina looking for a PhD. In general, people would try for a postdoc. I applied to a PhD program in Italy. I went with an open ticket for a year. If I had not passed the ICGEB/SISSA (Trieste) examination, I had three thousand dollars to travel around, and then I would go back to Argentina. I had never been in Europe before. So, for me it was an experience. What happened was that I was very lucky to be admitted in probably the first generation of this new institution, the International Centre for Genetic Engineering and Biotechnology in Italy. In three years, I finished my PhD, and then, to be honest, as an Argentinian in Europe, I did not have many postdoctoral funding opportunities either. TC: How did you move from Trieste to Cambridge’s Laboratory of Molecular Biology? FDB: I found Michael Neuberger’s laboratory to be very appealing, and I wrote to Michael. He replied to me, in a letter that I still keep, that—if I was able to obtain a fellowship—he would take me in his laboratory. A wonderful thing about EMBO was that it would recognize the country where you did your PhD when considering postdoctoral fellowship applications, giving me access to this important funding support. It was the very early days of diversity—the notion that people could be eligible for support based not only on their nationality, but also on their “scientific nationality”. It gave me a unique opportunity. TC: It was also an opportunity to meet another source of inspiration for you, César Milstein. FDB: César was not well at the time, he had heart problems. But I met him, and I felt very close because Michael was working with César, and he worked next door. For me, walking in those corridors with César Milstein and several other Nobel Prize winners—you know, Aaron Klug and Max Perutz—it was a dream. I could not believe that you could have lunch with these wonderful people, and they would come and talk to you, not as Dr. Klug or Dr. Milstein, but they would be César, Aaron, and Max. That for me was totally mind‐changing, together with my relationship with Michael, whom I love. They completely changed my perspective on science. TC: What do you remember most about Michael Neuberger as a mentor?
FDB: What was incredible about Michael was his clarity. You would present any biological problem to him, and he would crystallize in one sentence what the real question behind it was. He was amazing. Michael would enter into a state of thinking where he would stop looking at you and would start looking up at a wall and would start to concentrate for those 10, 20 minutes that you’d explain the problem. Then, he would come up with critical questions and he would be critical to the bones. I think that that is something that science has lost these days. I think that this notion of going deep into critically asking the right scientific questions has been lost as a tradition. It is something that I try to transmit to my postdocs and PhD students: Scientific criticism is not about personal or emotional evaluation. It is really about trying to nail down what the question is and how a project develops. I think that is what I remember most of Michael, his commitment to the people that worked with him and who surrounded him and that deep thinking and constant challenging about what is the next step. TC: In 2002, you started your laboratory at the London Research Institute FDB: I was at one stage considering staying at the LMB with my independent lab, and César and Michael were very supportive of that. But then came the opportunity to join the LRI—which at the time was still the ICRF. I was the last employee recruited (to the ICRF), and it was wonderful. The notion of changing environments again, changing colleagues. The LMB was not an immunology institute. It was a general research institute and the ICRF at that time was similar, with very little immunology. I have always valued the whole spectrum of biology from mathematical modeling to quantitative biology to biochemistry to technological inputs, to development, and so forth. TC: Your LRI laboratory revealed entirely new aspects of the molecular and cellular biology of B lymphocytes—one was the existence of organized membrane structures reminiscent of the immunological synapse first described in T cells that were crucial for activation. What are the implications of the immunological synapse for B‐cell function? FDB: It was a concept that was resisted by the B‐cell field. The notion at the time was that B cells would get activated by soluble antigens. But if you think about it, that does not make any sense. You will never reach a physiological concentration of a ligand that will allow you to engage a receptor in vivo at a low affinity. So in order to reach that concentration, you need to aggregate antigen on the surface of other cells first. And that makes the whole process much more efficient. It not only localizes the process into lymph nodes or spleens, but it also allows focusing the response into what the arrangement of a membrane is. I was not the first—the notion that antigens are on follicular dendritic cells was well‐established by early experiments. But I think our work transformed the field. A lot of laboratories have incorporated the notion that stimulating cells at the level of membranes changes the way that receptors perceive signals. This does not apply only to the B‐cell receptor, it applies to chemokines too, many of them are also coating the surface of other cells and that helps guide the signals that cells receive.I think that it is an important concept that is likely to be applicable to vaccines. 
There are several papers now showing that helping to aggregate antigens on the surface of macrophages or dendritic cells makes antigens more potent by driving them more efficiently into where they are used in follicles and lymph nodes. TC: What prompted your pivot to translational research? FDB: I had learned a lot about basic principles of B‐cell biology and antibody responses, but on model antigens. I felt at the time that translating that into humans and trying to understand how vaccines could be improved was an important step. I always like to recognize mentors or people who influenced me and one person who really influenced me in this thinking was Dennis Burton at Scripps. He was very early to incorporate into his HIV vaccine and antibody research people like me or Michelle Nussenzweig that were coming from basic B‐cell immunology to try to help to think about how vaccines can be improved. I decided to take a risk. I left a tenured, core‐funded position at the best institution in Europe to lead the Ragon Institute with Bruce Walker—I am the Associate Director and he is the Director—and brought my years of expertise at the ICGEB, LMB, LRI, and CRICK to a unique environment that is based on translational research. There is the incredible ecosystem of Harvard, MIT, and MGH, and the notion is to incorporate technologies and to incorporate immunology to tackle incredible challenges, like COVID‐19 is today. TC: Are there any major initiatives that you plan to focus on at The EMBO Journal? FDB: One of the things that I would really like to do is to involve the younger generations in the journal. I think that we have an opportunity for direct “translation”. I mean, EMBO has EMBO postdoctoral fellowships and EMBO young investigators, involving early career European scientists, but also scientists across the globe. We are discussing initiatives like, for example, inviting postdocs from different laboratories to present at the editorial meetings. The EMBO Journal has an open‐door policy in terms of people wanting to participate in the editorial meetings.I think that we have amazing scientists around the world that can really bring new views as to where the journal should be going. I feel strongly about that and about keeping a real sense of diversity in the journal, in terms of fields, in terms of gender, in terms of race, in getting people involved from Brazil, getting people involved from China, getting people involved from Japan, from across the globe. EMBO is no longer a European journal. EMBO is a journal whose office faces Europe, but it has a global outlook. TC: Early in their career, many researchers do not feel comfortable engaging with editors FDB: I sent one of my first papers as an independent P.I. to EMBO. That paper was editorially rejected. I replied to that rejection, saying that EMBO should stop publishing just biochemistry, and that they needed to appreciate the importance of quantitative cell biology. The paper was ultimately sent to review and accepted. What was also very positive was that a later review of the scope of The EMBO Journal came to a similar conclusion. That resulted in my appointment to the editorial advisory board of The EMBO Journal (I was not an EMBO member at the time). The positive message is that the journal very much welcomed receiving feedback. That was what made me like the journal. I felt that the journal was ready to listen, to change.This is not my journal. It is the community’s journal. I am just playing a role, putting in some time and effort. 
There are a lot of things that I do not see and other young people could see, and I am looking for inspiration there, to listen and translate those things into good policies for the journal. I think that this is important and I think that this is at the basis of what I want to be as a chief editor.

5.
When will COVID‐19 ever end? Various countries employ different strategies to address this; time will tell which response was best. Subject Categories: S&S: Economics & Business, Microbiology, Virology & Host Pathogen Interaction, S&S: Ethics

Peter Seeger’s anti‐war song with its poignant refrain, stretching out the second “ever” to convey hopeless fatigue with the continuing loss of life, applies to the pandemic too. “Where have all the old folks gone?” may replace the loss of young men in Seeger’s song. But they keep going, and it is not happening on distant continents; it is happening with them distanced in places they called home. At the time of writing in early March, there are a few answers to Seeger’s question from around the world. There are the isolationists who say that maintaining a tight cordon around a COVID‐free zone is the way to get out of the pandemic. There are the optimists with undiluted faith in the vaccines who say it will be all over when everyone gets a jab. And there are the fatalists who say it will eventually end when herd immunity stops the pandemic after many people have died or fallen ill.
Living in Australia, where there are only sporadic cases of COVID, it is tempting to see the merits of the isolationist strategy. Only a small number of international travelers can enter the continent every week. When we came back from Europe in November, arrival at Brisbane airport was followed by a police‐cordoned transfer to a pre‐allocated hotel—no choice, no balcony, no open windows—where we stayed (and paid) for a 14‐day confinement. On release, it was strange to find that life was close to normal: no masks and nearly no restrictions for public and private meetings. Sporting events and concerts do not have attendance restrictions. All that was different was a set of easy‐to‐follow rules about social distancing in shops or on the streets, limits on the number of people in lifts, and a requirement to register when going to a restaurant or bar. Since I settled back to COVID‐free life in Australia, the last incident in Queensland occurred a month ago, when a cleaner at a quarantine hotel got infected. It was “treated” with an instant 3‐day “circuit‐breaker” lockdown for the whole community. Forensic contact tracing was easy, and large numbers of people lined up for testing. Seven days later, the outbreak was declared over. A police inquiry examined the case to see whether regulations needed to be changed. The same rapid and uncompromising lockdown protocols have been employed in Melbourne, Perth, and New Zealand whenever somebody in the community tested positive. There is also continuous monitoring of public wastewater for viral RNA to quickly identify any new outbreak. Small numbers of positive cases are treated with maximum restrictions until life can return to “normal”. The plan is to expand these state policies to achieve a COVID‐free Australia, along with New Zealand and eventually the Pacific Islands.
The strict isolationist policy has its downsides. Only Australian citizens or permanent residents are allowed to enter the country. Families have been separated for months. Sudden closing of borders makes the country play a game of musical chairs: when the whistle is blown, you stay where you are. Freedoms that have been considered human rights have been side‐stepped. Government control is overt. Nonetheless, the dominant mood is that the good of the community trumps the rights of the individual, which may come as a surprise in a liberal democratic society. People benefit from the quality of (local) life, and while there is an economic hiatus for tourism and the international student business, the overall economy will come out without too much damage.
Interestingly, the most draconian state leaders get the highest ratings in polls and elections. Clear, unwavering leadership is appreciated.
Given their geographical situation, Australia, New Zealand, and other islands have clear advantages in pursuing their successful isolationist policies. For most of the rest of the world, though, the answer to “when will it ever end” points resolutely and confidently to vaccines. With amazing speed and fantastic effort, scientists in university and industry laboratories all over the world developed these silver bullets, the kryptonite that will put the virus in its place. Most countries have now placed all their chips on the vaccine square of the roulette table.
However, there are some aspects to consider before COVID raises the white flag. It will take months to achieve herd immunity; a long time during which deaths, illness, and restrictions will continue. With different vaccines in production and use, it is likely that some will protect better against the virus than others. The duration of their protection is still unclear, and hence the vaccine roll‐out could be interminable. More SARS‐CoV‐2 variants are on the rise, challenging the long‐term efficacy of the vaccine(s). The logistics and production demands are significant and will become even more acute as the vaccines go to developing countries. Anti‐vaxxers already see this as an opportunity to spread their mixture of lies, exaggerations, and selective information, which may make it more difficult to inoculate sufficient numbers in some communities. And yet, for most countries, there is no real alternative to breaking the vicious cycle of persistent local infections that are slowed by restrictions only to explode again when Christmas or business or the public mood demands a break. The optimists are realists in this scenario.
The third cohort are the fatalists. The Spanish flu ended after two years and 50 million deaths, and COVID will also run out of susceptible targets in due course. But herd immunity is a crude concept when the herd is people: our families, friends, and neighbors. Fatalism could translate into doing nothing and letting people die, and that is not a great policy when facing disaster.
The alternative to doing nothing is to combine various strategies, as Israel and the UK are doing: adopting some of the isolationist approaches while vaccinating as many people as quickly as possible. The epidemiological data indeed show that restrictions on interactions do reduce the number of cases. Some countries, Ireland for example, have seen ten‐fold reductions in daily cases after tightening restrictions on social interactions, even before the first needle hit an arm. This shows that the real impact of vaccination will only be known when a sufficient percentage of the population has been immunized and the social restrictions are lifted. Australia, with its significant travel restrictions, is another successful example. In addition, contact tracing and testing are very helpful to contain outbreaks and create corona‐free zones that can be expanded in a controlled manner. Of course, there are local, political, and economic factors at play, but these should not block attempts to lower infection rates until sufficient numbers of vaccine doses become available.
So, the answer to the question “when will it ever end?” will require a combination of the isolationist and optimist approaches, such that the fatalist solution does not prevail.
It will be interesting to revisit this question in two years’ time to see what the correct answer turns out to be.

6.
Writing and receiving reference letters in the time of COVID. Subject Categories: Careers

“People influence people. Nothing influences people more than a recommendation from a trusted friend. A trusted referral influences people more than the best broadcast message.” —Mark Zuckerberg.
I regularly teach undergraduate courses in genetics and genomics. Sure enough, at the end of each semester, after the final marks have been submitted, my inbox is bombarded with reference letter requests. “Dear Dr. Smith, I was a student in your Advanced Genetics course this past term and would be forever grateful if you would write me a reference for medical school…” I understand how hard it can be to find references, but I have a general rule that I will only write letters of support for individuals that I have interacted with face‐to‐face on at least a few occasions. This could include, for example, research volunteers in my laboratory, honors thesis students that I have supervised, and students who have gone out of their way to attend office hours and/or been regularly engaged in class discussions. I am selective about who I will write references for, not because I am unkind or lazy, but because I know from experience that a strong letter should include concrete examples of my professional interactions with the individual and should speak to their character and their academic abilities. In today''s highly competitive educational system, a letter that merely states that a student did well on the midterm and final exams will not suffice to get into medical or graduate school.However, over the past 2 years many, if not most, students have been attending university remotely with little opportunity to foster meaningful relationships with their instructors, peers, and mentors, especially for those in programs with large enrollments. Indeed, during the peak of Covid‐19, I stopped taking on undergraduate volunteers and greatly reduced the number of honors students in my laboratory. Similarly, my undergraduate lectures have been predominantly delivered online via Zoom, meaning I did not see or speak with most of the students in my courses. It did not help that nearly all of them kept their cameras and microphones turned off and rarely attended online office hours. Consequently, students are desperately struggling to identify individuals who can write them strong letters of reference. In fact, this past spring, I have had more requests for reference letters than ever before, and the same is true for many of my colleagues. Some of the emails I have received have been heartfelt and underscore how taxing the pandemic has been on young adults. With permission, I have included an excerpt from a message I received in early May:Hi Dr. Smith. You may not remember me, but I was in Genome Evolution this year. I enjoyed the class despite being absent for most of your live Zoom lectures because of the poor internet connection where I live. Believe it or not, my mark from your course was the highest of all my classes this term! Last summer, I moved back home to rural Northern Ontario to be closer to my family. My mom is a frontline worker and so I''ve been helping care for my elderly grandmother who has dementia as well as working part‐time as a tutor at the local high school to help pay tuition. All of this means that I''ve not paid as much attention to my studies as I should have. I''m hoping to go to graduate school this coming fall, but I have yet to find a professor who will write a reference for me. Would you please, please consider writing me a letter?I am sympathetic to the challenges students faced and continue to face during Covid‐19 and, therefore, I have gone out of my way to provide as many as I can with letters of support. 
But, it is no easy feat writing a good reference for someone you only know via an empty Zoom box and a few online assignments. My strategy has been to focus on their scholarly achievements in my courses, providing clear, tangible examples from examinations and essays, and to highlight the notable aspects of their CVs. I also make a point to stress how hard online learning can be for students (and instructors), reiterating some of the themes touched upon above. This may sound unethical to some readers but, in certain circumstances, I have allowed students to draft their own reference letters, which I can then vet, edit, and rewrite as I see fit.But it is not just undergraduates. After months and months of lockdowns and social distancing, many graduate students, postdocs, and professors are also struggling to find suitable references. In April, I submitted my application for promotion to Full Professor, which included the names of 20 potential reviewers. Normally, I would have selected at least some of these names from individuals I met at recent conferences and invited to university seminars, except I have not been to a conference in over 30 months. Moreover, all my recent invited talks have been on Zoom and did not include any one‐on‐one meetings with faculty or students. Thus, I had to include the names of scientists that I met over 3 years ago, hoping that my research made a lasting impression on them. I have heard similar anecdotes from many of my peers both at home and at other universities. Given all of this, I would encourage academics to be more forthcoming than they may have traditionally been when students or colleagues approach them for letters of support. Moreover, I think we could all be a little more forgiving and understanding when assessing our students and peers, be it for admissions into graduate school, promotion, or grant evaluations.Although it seems like life on university campuses is returning to a certain degree of normality, many scholars are still learning and working remotely, and who knows what the future may hold with regard to lockdowns. With this uncertainty, we need to do all we can to engage with and have constructive and enduring relationships with our university communities. For undergraduate and graduate students, this could mean regularly attending online office hours, even if it is only to introduce yourself, as well as actively participating in class discussions, whether they are in‐person, over Zoom, or on digital message boards. Also, do not disregard the potential and possibilities of remote volunteer research positions, especially those related to bioinformatics. Nearly, every laboratory in my department has some aspect of their research that can be carried out from a laptop computer with an Internet connection. Although not necessarily as enticing as working at the bench or in the field, computer‐based projects can be rewarding and an excellent path to a reference letter.If you are actively soliciting references, try and make it as easy as possible on your potential letter writers. Clearly and succinctly outline why you want this person to be a reference, what the letter writing/application process entails, and the deadline. Think months ahead, giving your references ample time to complete the letter, and do not be shy about sending gentle reminders. It is great to attach a CV, but also briefly highlight your most significant achievements in bullet points in your email (e.g., Dean''s Honours List 2021–22). 
This will save time for your references as they will not have to sift through many pages of a CV. No matter the eventual result of the application or award, be sure to follow up with your letter writers. There is nothing worse than spending time crafting a quality support letter and never learning the ultimate outcome of that effort. And, do not be embarrassed if you are unsuccessful and need to reach out again for another round of references—as Winston Churchill said, “Success is stumbling from failure to failure with no loss of enthusiasm.”

7.
We need more openness about age‐related infertility as it is a particular risk for many female scientists in academia who feel that they have to delay having children. Subject Categories: S&S: Careers & Training, Genetics, Gene Therapy & Genetic Disease

Balancing motherhood and a career in academic research is a formidable challenge, and there is substantial literature available on the many difficulties that scientists and mothers face (Kamerlin, 2016). Unsurprisingly, these challenges are very off‐putting for many female scientists, causing us to keep delaying motherhood while pursuing our hypercompetitive academic careers with arguments “I’ll wait until I have a faculty position”, “I’ll wait until I have tenure”, and “I’ll wait until I’m a full professor”. The problem is that we frequently end up postponing getting children based on this logic until the choice is no longer ours: Fertility unfortunately does decline rapidly over the age of 35, notwithstanding other potential causes of infertility.This column is therefore not about the challenges of motherhood itself, but rather another situation frequently faced by women in academia, and one that is still not discussed openly: What if you want to have children and cannot, either because biology is not on your side, or because you waited too long, or both? My inspiration for writing this article is a combination of my own experiences battling infertility in my path to motherhood, and an excellent piece by Dr. Arghavan Salles for Time Magazine, outlining the difficulties she faced having spent her most fertile years training to be a surgeon, just to find out that it might be too late for motherhood when she came out the other side of her training (Salles, 2019). Unfortunately, as academic work models remain unsupportive of parenthood, despite significant improvements, this is not a problem faced only by physicians, but also one faced by both myself and many other women I have spoken to.I want to start by sharing my own story, because it is a bit more unusual. I have a very rare (~ 1 in 125,000 in women (Laitinen et al, 2011)) congenital endocrine disorder, Kallmann syndrome (KS) (Boehm et al, 2015); as a result, my body is unable to produce its own sex hormones and I don’t have a natural cycle. It doesn’t take much background in science to realize that this has a major negative impact on my fertility—individuals with KS can typically only conceive with the help of fertility treatment. It took me a long time to get a correct diagnosis, but even before that, in my twenties, I was being told that it is extremely unlikely I will ever have biological children. I didn’t realize back then that KS in women is a very treatable form of infertility, and that fertility treatments are progressing forward in leaps and bounds. As I was also adamant that I didn’t even want to be a mother but rather focus on my career, this was not something that caused me too much consternation at the time.In parallel, like Dr. Salles, I spent my most fertile years chasing the academic career path and kept finding—in my mind—good reasons to postpone even trying for a child. There is really never a good time to have a baby in academia (I tell any of my junior colleagues who ask to not plan their families around “if only X…” because there will always be a new X). Like many, I naïvely believed that in vitro fertilization (IVF) would be the magic bullet that can solve all my fertility problems. I accordingly thought it safe to pursue first a faculty position, then tenure, then a full professorship, as I will have to have fertility treatment anyhow. In my late twenties, my doctors suggested that I consider fertility preservation, for example, through egg freezing. 
At the time, however, the technology was both extravagantly expensive and unreliable and I brushed it off as unnecessary: when the time comes, I would just do IVF. In reality, the IVF success rates for women in their mid‐to‐late 30s are typically only ~ 40% per egg retrieval, and this only gets worse with age, something many women are not aware of when planning parenthood and careers. It is also an extremely strenuous process both physically and emotionally, as one is exposed to massive doses of hormones, multiple daily injections, tremendous financial cost, and general worries about whether it will work or not.Then reality hit. What I believed would be an easy journey turned out to be extremely challenging, and took almost three years, seven rounds of treatment, and two late pregnancy losses. While the driving factor for my infertility remained my endocrine disorder, my age played an increasing role in problems responding to treatment, and it was very nearly too late for me, despite being younger than 40. Despite these challenges, we are among the lucky ones and there are many others who are not.I am generally a very open person, and as I started the IVF process, I talked freely about this with female colleagues. Because I was open about my own predicament, colleagues from across the world, who had never mentioned it to me before, opened up and told me their own children were conceived through IVF. However, many colleagues also shared stories of trying, and how they are for various—not infrequently age‐related—reasons unable to have children, even after fertility treatment. These experiences are so common in academia, much more than you could ever imagine, but because of the societal taboos that still surround infertility and pregnancy and infant loss, they are not discussed openly. This means that many academic women are unprepared for the challenges surrounding infertility, particularly with advanced age. In addition, the silence surrounding this issue means that women lose out on what would have otherwise been a natural support network when facing a challenging situation, which can make you feel tremendously alone.There is no right or wrong in family planning decisions, and having children young, delaying having children or deciding to not have children at all are all equally valid choices. However, we do need more openness about the challenges of infertility, and we need to bring this discussion out of the shadows. My goal with this essay is to contribute to breaking the silence, so that academics of both genders can make informed choices, whether about the timing of when to build a family or about exploring fertility preservation—which in itself is not a guaranteed insurance policy—as relevant to their personal choices. Ultimately, we need an academic system that is supportive of all forms of family choices, and one that creates an environment compatible with parenthood so that so many academics do not feel pressured to delay parenthood until it might be too late.  相似文献   

8.
Freelancer     
What long‐term changes can we expect in how academic work is conducted and remunerated in the post‐pandemic world? Subject Categories: S&S: Economics & Business, S&S: History & Philosophy of Science, S&S: Ethics

Although still two years away, my looming “retirement” from university employment is inevitably going to herald a major change of life. “Of course, you''ll become ‘Emeritus’”, most colleagues have opined. My answer to all of them has been a firm “No. I''ll become a freelancer”. The concept of a freelance scientist is obviously so alien to most of them that they invariably change the subject immediately. However, my gut feeling is that in 20 years or less, almost all of us will be freelancers of some kind.The COVID‐19 pandemic has altered the world of work in very obvious ways. There has been much talk of how the changes are likely to carry over to the future, even if more traditional patterns will probably reassert themselves in the short to medium term. Working from home, conducting meetings remotely, not wasting days travelling between continents for a few precious hours of face‐time and being free to structure workdays around our own priorities: these are the most obvious novelties that many believe will continue long after the effects of the pandemic on health and wealth have faded. But I have a slightly different take.Major disruptive events of worldwide import—world wars, global economic slumps, cataclysmic volcanic eruptions and pandemics—have often been harbingers of profound social change. This is not only due to their direct and immediate effects, but more so because the disruption accelerates and facilitates changes that were already happening. In the case of COVID‐19, one may place in this category the demise of cash, the rise of streaming services in place of live entertainment, online grocery shopping and even virtual dating. Another is paying people to stay home and do nothing, otherwise known as the universal basic income (or, in the USA, “stimulus cheques”).Inefficient practices in academia are equally ripe for change. Why bother with classes for 500 first‐year students when a much better edition of the lecture by an expert communicator is available on the internet? What’s the use of an ageing PhD advisor 20 years away from bench science, who struggles to guide the next generation of experimentalists in the lab, when the expertise of a plethora of specialists can easily be accessed online? What’s the value in published papers that are read by fewer people than wrote them? Or in seminars delivered to a roomful of attentive postdocs and PhD students who lack the courage or the time to address even a single question to the speaker?Yes, there is still great value in small‐group teaching and mentorship, in the creative verve of a close‐knit laboratory team, and in good writing and oratory: but the required skills are already different from those in which we were schooled. Thus, even if I do not hold in my palm the crystal ball to predict exactly which changes will happen and how fast, I believe that our traditional jobs are going to melt away very fast in the post‐pandemic world. Universities and research institutes may still exist, but I expect that their practices will be different, reshaped by rational need more than by tradition. Today’s academic science is already quite unlike that of 1920, but it has evolved so slowly during that century—spanning a much longer time period than the lifetime of a scientific career—that we barely perceive the changes that have occurred. 
In contrast, the changes now afoot will certainly happen much faster, especially since the funds to support the current “inefficient” model are likely to diminish rapidly.So, I predict that university teaching and science communication in general will be the first to evolve into freelance activities, where universities will invite bids from individuals or their agents and award commissions on a fee‐paying basis rather than using salaried employees. But these are not the only component parts of academia facing such a shake‐up. The practices of laboratory science are also likely to be rebuilt. When discussing with colleagues how research might be undertaken on a freelance basis, they usually raise issues such as bricks and mortar and the complex infrastructure that is needed to sustain cutting‐edge research, especially in the life sciences: how, they ask, could a freelancer access state‐of‐the‐art imaging, mass spectrometry or DNA sequencing? How could their acquisition of such expensive hardware possibly be financed, especially if they had to somehow acquire it personally and set it up in the garage or carry it around with them?But the answers to these questions are already evident in the practices of some major research agencies, most notably in Europe’s pioneering funder of single‐investigator grants for blue‐skies science, the European Research Council (ERC). The ERC already treats its awardees as freelancers, in the sense that it encourages them to shop around for the most attractive venue in which to embed and implement their research project. The quest for the best host institution takes place not only at the preparatory step of an ERC application: it also happens after the grant is awarded, since the grant money is considered inherently portable and can even be moved later on from one institution to another. This encourages potential host universities to compete for providing the best research environment, in which many factors come into play, not just but not least, the quality of its research infrastructure. How well it supports, rather than burdens its staff with administrative tasks, the nature of its recruitment and personnel policies, how it handles relocation issues for incoming researchers and their families, what opportunities it provides for further training in relevant skills and career development: these are just some of the factors in play.In recent years, universities have seen their primary role in this process as encouraging their own tenured or tenure‐track staff to apply for ERC grants. But I foresee the emphasis shifting increasingly to investigators who seek out universities that can make the most appealing offer, whilst universities and government agencies standing behind them will shape their policies so as to remain competitive. Moreover, in such a landscape there is no reason why a scientist cannot operate research projects on multiple sites if this offers the most convenient arrangement. The tools for remote meetings and cloud computing to which we have all become accustomed mean that there is no longer any need for a research group to be located in one building or even in one country, to operate efficiently as a team.At the same time, many of the tasks involved in running a research institute or department can be efficiently outsourced to the most competitive bidder—to be assessed on the basis of value‐for‐money, not just minimum cost. 
As a society, we should be asking ourselves why we continue to waste the talents of highly specialized scientists on performing admin tasks for which they are neither properly trained nor motivated, instead of just engaging a smart‐software developer. Why should we fund creative thinkers to undertake laboratory projects in host institutions that do not have the required state‐of‐the‐art facilities to perform them? Or allocate budgets that are so pared down that grantholders cannot even afford to purchase such services elsewhere? Why should we expect them to make do with poorly paid trainees instead of a team of professionals? And why should we continue to organize research in pyramid structures where everything depends on commands from the top, where all findings are announced using an institutional slide template, where colleagues elsewhere are considered as untrustworthy “competitors”, and where credit for individual creativity is usurped by seniors who barely know the contents of the papers they “write”?In the “old system”, we have all gotten used to making do with sub‐optimal working arrangements and grumbling about them, whilst considering them an immutable fact of life. But I envisage a time coming soon where we scientists will have the edge in reshaping the market for teaching and research in a way that is much more to our liking and properly aligned with our skills. At the same time, our individual success in accomplishing our professional goals will have a direct effect on our income and job satisfaction, and steer us towards activities where our talents are most effectively deployed. In short, I believe that we, as freelance scientists, will be much more firmly in control of science in the future and that time is not far off.  相似文献   

9.
Academia has fostered an unhealthy relationship with alcohol that has an undeniable impact on the health and behaviour of students and staff. Subject Categories: S&S: History & Philosophy of Science, Chemical Biology, S&S: Ethics

University life has a lot to offer. And, for better or worse, much of it goes hand in hand with a bottle. Believe it or not, I was a bit of a teetotaler in my undergraduate days but quickly made up for it in graduate school, where each celebration included inebriation. Indeed, my initial tour of the laboratory I eventually worked in included a refreshing visit to the grad club. Orientation week ended with a marathon beer blitz at a nightclub. The semester’s first invited seminar speaker was welcomed with the sounds of loose change, ice buckets and the clickity‐clack of organic microbrews being opened. Our inaugural genome evolution journal club was such a success that we vowed to spill even more red wine onto our notebooks the following week. In hindsight, I should have realized at this early stage in my studies that I was fostering an unhealthy and unsustainable relationship between biology and booze. Unfortunately, my post‐graduate education in alcohol didn’t stop there.
Like many keen students, I arrived at my first scientific conference with a belly full of nerves and a fistful of drink tickets, which I quickly put to good use at the poster session. The successful completion of my PhD proposal assessment was met with pats on the back as I was swiftly marched off to a local pub with no chance of escape. My first peer‐reviewed paper literally arrived with a pop as Champagne was generously poured into plastic cups for the entire laboratory group. My failures, too, were greeted with a liberal dose of ethanol. “Sorry you came up short on that scholarship application, Smitty. It’s nothing a little weapons‐grade Chianti won’t cure.” “That experiment failed again! Come on, let me buy you a lunchtime martini to make up for it.” Soon I learnt that every academic event, achievement or ailment, no matter how big or small, could be appropriately paired with beer, wine or spirit. Missing from the menu were two crucial ingredients for any burgeoning researcher: moderation and mindfulness.
But it was the older vintages that really inspired me – the legendary drinking escapades of my scientific mentors, advisors and idols. The tale of professor so‐and‐so who at that epic meeting in 1993 polished off an entire magnum of rosé at dinner and then went on to deliver one of the greatest keynote lectures on record at 9 am the following morning. That celebrated chaired researcher who kept the single malt next to the pipette tips for quick and easy access. The grizzled evolutionary ecologist who never went into the field without half a dozen cans of high‐end smoked oysters and two hip flasks, which didn’t contain water. And so, when I was told by someone in the know of how the most famous geneticist on campus wrote that monumental Nature paper (the one I’d read ten times!) while locked in his office for twelve hours with a six‐pack, I bought into the romance hook, line and sinker. The result: I’ve been nursing a recurring headache for nearly two decades and I’m still waiting on that Nature paper. Most importantly, I now realize the various dangers of romanticizing the bottle, especially for individuals in mentorship positions.
Like my idols before me, I’ve accrued a cask full of well‐oaked academic drinking stories, except that they haven’t aged well. There is that heroic evening of intense scotch‐fueled scientific discussion, which led to me forfeiting two front teeth to the concrete sidewalk (my mother still thinks it was a squash accident).
Or that time I commemorated the end of a great conference in Barcelona by throwing up on the front window of a café while the most prominent minds in my field sipped aperitifs inside (thank god this was before Twitter). Even more romantic: me buying a bottle of Côtes de Nuits Burgundy at Calgary airport en route to a job interview, discreetly opening the bottle in‐flight because economy class wine sucks, and then being met by airport security upon landing. Let’s just say I didn’t get the job. To some, these anecdotes might seem light‐hearted or silly, but they are actually rather sad and underscore the seriousness of substance abuse. Many readers will have their own complicated experiences with alcohol in academia and, I believe, will agree that it is high time we asked ourselves: are we training our graduate students to be great thinkers or great drinkers? Moreover, this question does not address the equally if not more serious issue of excessive drinking among undergraduate students.
As I sit at my desk writing this, I think to myself: is it normal that within a two‐minute walk of my university office there are three different places on campus where I can have a beer before lunch, not including the minifridge behind my desk? Is it normal that in my department the first thing we do after a student defends their thesis is go to the grad club, where they can have any alcoholic drink of their choosing for free from the goblet of knowledge, which is kept on a pedestal behind the bar? Is it normal that before the COVID pandemic, when I was visiting a prominent university for an invited talk, one of the professors I met with offered me a glass of expensive Japanese gin at 11 am? (And, yes, I accepted the drink.)
Of course, if you don’t want to drink you can just say no. But we are learning more and more how institutional cultures – “the deeply embedded patterns of organisational behaviour and the shared values, assumptions, beliefs or ideologies that members have about their organisation or its work” (Peterson & Spencer, 1991) – can have powerful effects on behaviour. Excessive alcohol consumption is undeniably an aspect of collegial culture, one that is having major impacts on the health and behaviour of students and staff, and one that I’ve been an active participant in for far too long. I’ll be turning forty in a few months and I have to face the fact that I’ve already drunk enough alcohol for two lifetimes, and not one drop of it has made me a better scientist, teacher or mentor. The question remains: how much more juice can I squeeze into this forty‐year‐old pickled lemon? Well, cheers to that.

10.
When universities need to make staffing cuts to balance the books, do they always do so in the fairest and most rational manner?

Universities, like most other large organizations, undergo periodic restructuring, expansions and contractions, driven largely by changes in their balance sheet. The contractions are often abrupt and painful. Typically, this outcome is preceded by a period when the top management already perceived a growing problem but postponed action because of its debilitating effects on morale and on the organization’s core functions, in the hope that “something might turn up” that would obviate the need for drastic cuts. Most often, something does not turn up, and the cuts end up being even more severe.The first sign of trouble is usually a “message to all staff” that seems to come out of the blue, couched in the most anodyne of terms, or even trying to put a positive spin on what is a fundamentally destructive process. But staff about to be made redundant do not appreciate being referred to as “person‐years”. It also seems completely pointless to dress up more or less arbitrary decisions on whom to terminate as “negotiations”, thus apportioning a share of blame to the union representatives who have not much say in these “negotiations” anyway. Although the cuts typically affect non‐academic staff, such as lab technicians, IT and audiovisual support, financial administrators, providers of student welfare services or travel and hospitality arrangers, academic staff at all levels are sometimes affected as well.Most of us can read between the lines and find disingenuous statements of this type insulting rather than reassuring. Let me translate this for you into plain English:“Due to the fact that our senior management has totally screwed up the university’s finances, we need to decrease our staff costs by 20%. Since most staff would not accept to do the same work for 20% less pay, we will instead just fire 20% of the staff. If you are one of the 20% who are able to do your job and have been properly trained, we strongly advise you to immediately seek employment elsewhere. After all, you might find yourselves fired in the next round of cuts, if these ones don’t prove sufficient”.The duties of the 20% who leave will be transferred to the 80% who remain. Since they will be required to carry out additional tasks beyond those that they currently undertake but were never trained to perform in the first place, it is natural that many of them will go on permanent sick leave or retire early, thus reducing our staff costs further and avoiding undue stress on persons we have failed to train properly.As a result, academic staff currently engaged on less quantifiable activities such as research and teaching must shoulder some of the burden. We hope to avoid firing academic staff but be aware that you may also be terminated if performance targets are missed, especially if you are unable to undertake simple obligations to help the university to function properly instead of wasting all your energy on research and teaching.The next phase of this process will be the outsourcing of many of the duties that the poorly trained 80% of staff and academics cannot do or are unwilling to do. We will invite tenders from different companies and always pick the lowest bidder, regardless of the quality of the services they are able to perform. However, be aware that academic staff who make use of these services must employ the companies contracted by the university and no others. 
You will also have to pay for these services from research grants or other income. Spin‐off companies that you have created could also bid for some of these services, which might enable you to cover the costs during the 2–3 years before your fledgling company goes bankrupt. Please also consider if and how you can outsource your own research and teaching, which would enable us to fire academic staff as well, in any future cost‐cutting exercise. If this process is successful, the university hopes to recycle any excess savings into a new round of Strategic Interconvertibility and Sustainable Innovation (SISI) grants. And above all, please do remember and implement our university’s new slogan: “The only useful research is market research”. Kit from HR, aka Howy Jacobs  相似文献

11.
12.
Since COVID‐19 hit last year, lecturers and professors have been exploring digital and other tools to teach and instruct their students. Subject Categories: S&S: Careers & Training, Methods & Resources

As Director of the Digital Pedagogy Lab at the University of Colorado in Denver, USA, Michael Sean Morris saw his work take on new significance as the COVID‐19 pandemic hit campuses around the world. “What happened with the pandemic was a lot of people who weren't accustomed to teaching online, or dealing with distance learning, or remote learning in any way, shape, or form, really tried to create a live classroom situation on their screen, mostly using Zoom or other similar technologies”, Morris said. “With technology now, we can do things which make us feel closer. So, we can do a Zoom; there can be synchronous chat in technologies like Slack, or discussion forums or what‐have‐you to make you feel like you're closer, to make you feel like you're sort of together at the same time. But the majority of online learning actually has been asynchronous, it's been everyone coming in when they can and doing their work when they can”. Educators have been divided over the use of online learning. But this changed when a deadly pandemic forced everyone from kindergarten to university into digital spaces. Luckily, many digital tools, such as Zoom, Slack, Blackboard Collaborate, or WhatsApp, were available to enable the migration. Nonetheless, teachers, lecturers, and professors struggle to provide their students with the knowledge and hands‐on training that is paramount for teaching biology.
  相似文献   

13.
14.
Borrowed robes     
Should scientists indulge their fantasies by writing fiction? Subject Categories: Careers, Economics, Law & Politics, History & Philosophy of Science

Like a substantial fraction of the literate population, I have a collection of unpublished novels in the drawer. Six of them in fact. Some of them were composed in barely more than a week, and others I have been struggling to complete for over 10 years: so maybe it is more accurate to say five and a half. Anyhow, most of them are good to go, give or take a bit of editorial redlining. Or, as my helpful EMBO editor would say, the removal of thousands of unnecessary adverbs and dubiously positioned commas. What do I write about and why? My style is not unique but rather particular. I write fiction in the style of non‐fiction. My subject matter is somewhere in the general realms of science fiction, alternate history and political drama. Putting these ingredients together, and taking account of my purported day job as a serious scientist, it is easy to see why my fictional work is potentially subversive—which is one reason why I have been rather reluctant thus far to let it out of the drawer. At the very least, I should take pains to conceal my identity, lest it corrupt perceptions of my scientific work. Even if I regularly tell my students not to believe everything they read, it would impose far too great a burden on them if they came to question my peer‐reviewed articles purely on the basis of untrue statements published in my name, spoken by jaded politicians, washed‐up academics or over‐credulous journalists. Even if they are imaginary. Real journalists are theoretically bound by strict rules of conduct. But imaginary ones can do whatever they like. Today, I noticed a passage in one of these unpublished works that is clearly written in the style of a young William Shakespeare, dealing with a subject matter that fits neatly into one of his most famous plays. In fact, the illusion was such that I was sure I must have lifted the passage from the play in question and set about searching for the quote, which I then could and should cite. Yet, all Internet searches failed to find any match. The character in whose mouth I placed the words was depicted as being in a delirious state where the boundaries of fact and fiction in his life were already blurred; borrowed identities being one of the themes of the entire novel and arguably of my entire oeuvre. But am I guilty here of plagiarism or poetry, in adopting the borrowed identity of my national playwright? In another work, I lay great emphasis on the damaging role of mitochondrial reactive oxygen species (ROS) as the cause of biological ageing. I have even grafted this explanation onto a thinly disguised version of one of my most valued colleagues. Although there is some support for such a hypothesis from real science, including some papers that I have myself co‐authored, it is also a dangerously broad generalization that leads easily into wrong turnings and misconstructions—let alone questionable policies and diet advice. But, by advancing this misleading and overly simplistic idea in print, have I potentially damaged not only my own reputation, but also that of other scientists whom I respect? Even if the author’s identity remains hidden. In one novel, I fantasize that nuclear weapons, whilst they do undoubtedly exist, have in fact been engineered by their inventors so as never actually to work, thus preventing their possible misuse by vainglorious or lunatic politicians unconcerned with the loss of millions of lives and planetary ruin. 
But if any insane national leader—of which there are unfortunately far too many—were actually to come to believe that my fiction in the style of non‐fiction were true, they might indeed risk the outbreak of nuclear war by starting a conventional one in order to secure their strategic goals. Elsewhere, I vindicate one author of published claims that were manifestly based on falsified data, asserting him to have instead been the victim of a conspiracy launched to protect the family of an otherwise much respected American President. None of which is remotely true. Or at least there is no actual evidence supporting my ridiculous account. I have great fun writing fiction of this kind. It is both liberating and relaxing to be able to ignore facts and the results of real experiments and just invent or distort them to suit an imaginary scenario. In an age when the media and real politicians have no qualms about propagating equally outrageous “alternative facts”, I can at least plead innocent by pointing out that my lies are deliberate and labelled as such, even if people might choose to believe them. In a further twist, the blurb I have written to describe my latest work characterizes it as the “semi‐fictionalized” biography of a real person, who was, in fact, a distant relative of mine. But if it is semi‐fictionalized, which bits are true and which are made up? Maybe almost the whole thing is invented? Or maybe 99% of it is based on demonstrable facts? Maybe the subject himself concocted his own life story and somehow planted it in falsified documents and newspaper articles to give it an air of truth. Or maybe the assertion that the story is semi‐fictionalized is itself a fictional device, that is, a lie. Perhaps the central character never existed at all. It is true (sic) that the most powerful fiction is grounded in fact—if something is plausible, it is all the more demanding of our attention. And it can point the way to truths that are not revealed by a simple catalogue of factual information, such as in a scientific report. But I have already said too much: if any of my novels ever do find their way into print, and should you chance to read them, I will be instantly unmasked. So maybe I’ll have to slot in something else in place of my pseudo‐Shakespearean verse, mitochondrial ROS hypothesis, defunct weapons of mass destruction and manipulated data manipulation.  相似文献

15.
I am just starting my career as a cancer biologist, but I have always been a Black man in America. This means that I have always inhabited a world that generally disregarded my existence in some form or another. It is June 17th, 2020, and protests have been happening for weeks since the killing of George Floyd in Minneapolis. The current state of America may be uneasy for some, but for many Americans, the looming threat of exclusion and violence has been an unwelcome companion since birth. This letter is not about a single person, but about the Black academic’s experience of race inside and outside of the academy during a time of social upheaval. I have trained in a variety of institutions, big and small, all the while acutely aware of the impact of my Blackness on my science. The intent of the following is to provoke the reader to reflect on how we as a nation can move toward radically positive change and not incremental adjustments to the status quo. The views expressed are my own and are the result of years of personal experience observing the anti-Black standard in America.

About the Author: I am currently a cancer biologist at the University of Minnesota Medical School. My lab works to eliminate cancer health disparities in African Heritage communities and investigates the roles of lipids in modifying the immune response in tumors. This is what I do, but not all of who I am. I am also the eldest child of a mother, who managed to convince me that she had eyes in the back of her head (thank you, Mom; it kept me honest). I am a big brother, a husband, and a father. I also consider myself a fortunate Black man in America. I grew up in places where many of my friends did not live to adulthood. If they managed to survive past adolescence, it was usually their dreams that died prematurely. I was lucky to have survived and to continue chasing my dream of becoming a scientist. I never considered myself the fastest, strongest, or even smartest kid growing up, but I was the most determined. Determined, despite the lack of access to role models in science that looked like me or shared my life experience. Now my mission is to increase the number of dreams achieved and impact as many young minds as my time on this planet permits. As a Black scientist, I sometimes have to remind myself that I have never been immune to racism. Because as you spend thousands of hours delving into the microscopic world, the macroworld starts to fade into the background like white noise. And if you get good at it, you almost forget about the strange looks, the excessive questioning, or even the obligatory “tailing” in stores, on campus, or at home. But it is strange to realize how much you have grown accustomed to discrimination and the fact that you unconsciously prepare for it daily, before it ever shows its ugly head, like a prize fighter training months before a fight. This past month, amid the Coronavirus Disease 2019 (COVID-19) pandemic, the rest of the world has decided to say police are bad, and oh, by the way, Black lives matter too—as if the oppression of Black bodies was new, or as though the recent string of names added to the ever-growing list of innocent Black Americans killed by authorities is an atypical occurrence. Well, sadly, it is not, and it never has been in this country or any other place with colonial origins. That is the truth, and there is no other way to state it. America is a country built on and driven by racist ideology. So, as a Black American in an “essential” worker role (I am now working on COVID-19-related research), I have physically been at work daily during the pandemic, as the spirit of solidarity sweeps the globe. As much as I want to say this is progress, I find myself asking “why now, and not then?” Why didn’t this happen when Trayvon Martin was murdered; why didn’t this happen when Rodney King was beaten (Alvarez and Buckley, 2013; Mullen and Skitka, 2006)? Is it a sign of the end times, or is it just that racism/White supremacy has finally run its course? I have a theory about why we are now seeing a mass movement against discrimination and police brutality (a.k.a. state-sanctioned murder). My theory states that had it not been for COVID-19 and the nationwide shutdown of normal life, none of this protesting would even be feasible. Why, you ask? The simple answer is that some people with the financial means can normally find ways to distract themselves with various activities, some noble and some … not so much, whereas other folks are less able to disconnect from the drudgery of hand-to-mouth living. 
Leave it to a global health crisis to reprioritize everyone’s entire life in one fell swoop. Suddenly, people who had vacation plans are stuck at home, whereas people who were just making ends meet are now unable to make those ends meet anymore. The haves and the have-nots are now both in an altered reality. Does this make them equal now? No, but it does allow people to see who their real friends, allies, and enemies are. I suspect that it’s the pulling back of the curtain that has made many people ready to fight, not to mention it is also very likely that many folks, after experiencing weeks of cabin fever, just needed some way to let off all that pent-up energy. Before COVID-19 became a full-time concern, tensions in the United States were already high as the recent killings of unarmed Black Americans (Breonna Taylor and Ahmaud Arbery) had gone viral and cries for justice echoed from coast to coast (Lovan, 2020). Once the reality of the pandemic set in and shelter-in-place orders were issued nationally, the situation became a powder keg waiting for just the right moment. That moment happened in North Minneapolis on May 25, 2020. With the release of the video showing the killing of George Floyd, the entire country and much of the world had a reason to go on a “righteous rampage” that has seemed to get the results some thought impossible to achieve. It cannot be overstated how critical social media has been in displaying the oppression of Black Americans at the hands of authorities to the entire world. Now, several months into the protests, the possibility of a “new” new normal has people dreaming of singing Kumbaya in technicolor. Yet, as one of the few Black faculty on my campus, I still feel like people are watching me, but for a different reason now. As various reforms are broadcast across the university, the random wellness “check-ins” start creeping in, and the requests for feedback on “new initiatives” seem to be like a new flavor of spam in my inbox. Now, I do appreciate the fact that people are starting to notice the oppressive nature of not being White in today’s world (in particular being Black in America), but I have been doing this for a while now, and I am not sure if hashtagged initiatives are healthy for anyone. Plus, it’s kind of creepy watching all of these people jump on the social justice bandwagon, when they weren’t here 4 mo ago or 4 years ago. For many Black academics, it is not about being involved with something when it’s trending; it’s about being “about that life” when it is inconvenient as hell. Again, I do appreciate the fact that more people are willing to fight oppression, racism, and White supremacy (even if only digitally), but you will have to forgive me if I do not trust you just yet. I mean, you are just checking in during what could be the last leg of a marathon, and we’ve been running this whole damn time! Here is a short answer to every wellness check-in email that many of the Black academics I know have received in the last 2 mo: “we were never okay in the first place, but thanks for FINALLY asking!” We don’t need any more bias training, hashtags, or email check-ins. It was a nice start, but it too has become a part of the status quo. The work now and always has been the eradication of underrepresentation, hurtful socialization, and ridiculously skewed power dynamics, not just the awareness of the fact. 
I don’t have all the answers, but if real change is desired, I think we can first start by teaching history accurately to EVERYONE: no more whitewashing the reality of America’s story and ignoring the contributions of Black academics (and Black Americans in general). Second, stop being silent when you see or hear racism at work or home. If you do nothing when racism shows up, you ARE a racist! Third, the privileged class must relinquish their “privilege” once and for all. That means the powers that were inherited based on historical (and present day) theft and oppression have to dissipate, with the ultimate goal of power sharing. The country club atmosphere of academia and the “fit culture” must erode in favor of true meritocracy. The best person for the job, and not “the person who won’t make me uncomfortable by making me see my own deeply held prejudices and fears.” Honestly, Black academics SHOULD not be charged with the task of fixing broken systems, along with protecting themselves and mentees, while working toward tenure. But if we (Black academics) are not driving the car, progress will likely go the wrong way again (getting rid of Uncle Ben and Aunt Jemima does not correct the underlying pathology). Paulo Freire’s Pedagogy of the Oppressed speaks to this in saying, “the violence of the oppressors prevents the oppressed from being fully human, the response of the latter to this violence is grounded in the desire to pursue the right to be human … the oppressed, fighting to be human, take away the oppressors’ power to dominate and suppress, they restore to the oppressors the humanity they had lost in the exercise of oppression.” (Freire, 1972, p. 56). This means that if we (Black academics) want to be treated as humans and as scholars, we must show you what that humanity looks like FIRST. Now the question is, are you willing to learn or are you going to co-opt this moment, this movement, to make it into something that fits your preconceived notion of the acceptable levels of Blackness in the academy?  相似文献

16.
Elda Grabocka investigates the role of stress granules in obesity and cancer.

When one thinks of high school, sharing hallways with students from 80 different countries is not the usual image that springs to mind. This was indeed Elda Grabocka’s experience. She grew up in Pogradec, a remote town in Albania—her parents, both physicians, were assigned to this location by the state. Elda won one of the two spots available for Albanian students in a national competition to attend the United World College of the Adriatic in Trieste, Italy, a high school focused on social change that brings together students from around the globe to promote intercultural understanding. Elda still remembers, with a smile on her face, the first glimpse of the laboratories as the senior students were working on their thesis projects: “That was exactly what I wanted to do!” She barely spoke English at the time and had to catch up to the level of her peers, but her perseverance and passion prevailed, and she obtained the International Baccalaureate Diploma (IBD). For the independent study of the IBD program, she submitted a research project in chemistry, which ended up being an important learning and life lesson: “That helped me understand that I was more suited to biology! In hindsight, it was great to have that experience so early; I certainly had no awareness then how essential failing and then learning from your failures is to science, but having a level of comfort with it from the beginning was probably a bonus.” (Photo: Elda Grabocka; courtesy of Chris Hamilton Photography.) But science was not the only professional option Elda contemplated—her volunteering experience with relief organizations in various refugee camps made her consider a career in public health and humanitarian relief efforts. She ultimately pursued a PhD in molecular pharmacology and structural biology in the laboratory of Phil Wedegaertner at Thomas Jefferson University. After studying heterotrimeric G-proteins and how the subcellular localization of their exchange factors regulates function, Elda felt the need to seek greener pastures. She went on to do a postdoc on one of the longest-studied oncogenes, RAS—her choice wasn’t motivated by the field, but by the mentor, Dafna Bar-Sagi. Elda’s admiration for Dafna is evident when she speaks about her time at the New York University Langone Medical Center: “It’s remarkable how many novel aspects of RAS biology that have shaped and then re-shaped our thinking about this oncogene have come out of her lab; I felt there was a depth and breadth to her approach to scientific research that if I could learn, I’d be able to see more of the angles, so to speak, ask better questions; she has really expanded my mind in all those aspects.” Elda’s work focused on the interplay between the mutated forms of RAS and the wild-type isoforms, which she and others have shown is context dependent, with the wild-type isoforms acting as both tumor suppressors and tumor promoters (1). While still in Dafna’s laboratory, Elda pursued a more independent scientific interest: the role of stress granules in mutant KRAS cells. In 2016, Elda returned to her alma mater, joining the Department of Cancer Biology at the Sidney Kimmel Cancer Center at Jefferson as an assistant professor, with stress granules in cancer as the focus of her laboratory. We contacted her to learn more about her research journey. What interested you about stress granules and their connection with obesity and cancer? I became interested in stress granules and their potential role in cancer early in my postdoc. 
I read a review by Stephen Elledge’s group where they described the “stress phenotype” of cancer as an important player in tumorigenesis. I realized that cancer cells exist mostly in a state of stress—for example, mutated genes, like oncogenic RAS, are potent inducers of many types of cellular stresses. I was working on a RAS ubiquitination project, and one of the candidates for a RAS de-ubiquitinating enzyme we were looking at was implicated in stress granule formation. Little was known about stress granules at the time—they are induced by types of stresses associated with tumors (hypoxia, oxidative stress, osmotic pressure, proteotoxic stress, metabolic stress, etc.), so the question I asked was whether stress granules could function as a stress coping/adaptation mechanism in cancer. Indeed, I found that stress granules are prevalent in tissues from patients with pancreatic cancer and mouse models of pancreatic cancer. Remarkably, not all cancer cells are the same in their capacity to form stress granules—all cells will make stress granules under stress, but KRAS mutant cancer cells have a heightened ability to do so because signaling from mutant KRAS enhances the levels of a molecule critical for stress granule formation, 15-deoxy-prostaglandin J2 (2). This enhanced capacity to make stress granules, in turn, renders KRAS mutant cells more resistant to stress and more dependent on stress granules; inhibition of stress granules leads to increased cell death in KRAS mutant versus KRAS wild-type cancer cells. (Image: Immunofluorescence staining of pancreatic ductal adenocarcinoma tissue showing cancer cells in red, stress granules in green, and nuclei in blue; courtesy of the Grabocka laboratory.) The work establishing this dependence was in vitro, so the primary goal when I started my laboratory was to determine their relevance in tumorigenesis, which led me to explore their connection to obesity and cancer for several reasons. First, obesity is a major predisposing factor for several cancers, including pancreatic and colon, which are prevalent KRAS-driven cancers for which treatment options are limited. Second, obesity is a complex pathology that likely impacts the pathobiology, the therapy response, and even the evolution of cancers that arise in this setting. Given that cell stress and inflammation are key features in obesity, this would make the ideal background to study the contribution of stress granules to tumorigenesis. I think this pre-existing stress [obesity] might necessitate the engagement of stress adaptive mechanisms from the early stages of tumorigenesis and may also lead to a high dependence on these processes. What are you currently working on, and what is up next for you? It’s a very exciting time to be working on stress granules! The field has grown significantly over the past 10 yr or so, especially with the renewed interest in phase separation. As organelles that form via phase separation when a cell is under stress, stress granules are perhaps one of the best examples of phase separation in vivo and a great platform to understand its relevance. The recent advances in defining the composition, as well as key molecular drivers and their functional domains in stress granule assembly, have been of great benefit. We are now better positioned to define the stress granule–specific functions in health and disease. Because stress granules are induced by various types of stresses, they could function as a pan-stress adaptation mechanism in cancer. 
This is a very appealing angle: if we can solve how stress granules enable stress adaptation, which is a major focus of my laboratory, we could have better anti-cancer therapies. The composition of stress granules, comprising hundreds of proteins and mRNAs involved in several aspects of cell biology, prompted me to ask whether cytoprotection under stress is their main and/or only function. What other cellular processes stress granules regulate, whether these vary with the type of stress, and how such processes are integrated into the stress response of cancer cells are burning questions we are currently working on, as the answers will advance our understanding of the role of stress granules in cancer. The “chronic stress” of cancer is heterogeneous in both spatial and temporal terms, as well as in the type of stress and intensity. I am also very curious to see if and how heterogeneity in stress stimuli impacts the composition of stress granules and the processes they regulate, and how this may affect tumor evolution. Also, cancer cells are not the only cells in the tumor that make stress granules. As a matter of fact, we reported that KRAS mutant cells can stimulate stress granule formation in a paracrine manner. An ongoing project in the laboratory that I’m very excited about is focused on understanding the contribution of stress granules to the pro-tumorigenic microenvironment. What kind of approach do you bring to your work? My approach is very hypothesis- and observation-driven; the latter in the sense that it can often be that initial spark that inspires an idea, draws connections, and looks for context and meaning. I also find that sometimes the answer to my next question, or the question I don’t yet know to ask, is hidden right in front of my eyes, so paying careful attention to the data is key. It is also where objective and critical evaluation of experimental results starts. There’s one line that’s firmly ingrained in my mind from my postdoctoral training, which is “Science is self-correcting.” It’s a note of caution that if you don’t pay attention and see only what you want to see, it will still eventually prove you wrong, and you’d have wasted a lot of time in the process. So I try to minimize that waste as much as possible—it is not entirely avoidable, since having a favorite hypothesis is part of the scientific thinking process, but it is crucial to remember to follow the data and not just convince yourself. What has been the biggest accomplishment in your career so far? It is still quite early in my career to start listing accomplishments. I feel privileged to do the work I do; I essentially get funded to pursue ideas that I find interesting. So I have a hard time with this question because it has a hint of pride, and when you start adding pride to privilege, as a junior principal investigator especially, it gets a bit too self-serving. I hope that the work we are doing stands the test of time and leads to, or helps lead to, a meaningful impact on patients’ lives—that would be a great accomplishment. What has been the biggest challenge in your career so far? The past two years of COVID have certainly been a different reality, and a constantly shifting one at that. 
From a career perspective, so much of a scientific career happens at the bench: experiments happen at the bench, we train at the bench, and animal work is long and requires multiple dedicated essential personnel and facilities. So, inevitably, remote work, shift work, limited occupancy, and the shortages we are now seeing in the supply chain have been a major challenge for everyone. I do think junior laboratories like mine feel that a bit harder. The bandwidth to absorb these challenges is much smaller if you’re just starting out, or if you’ve had a laboratory for a couple of years and are just ramping up. I must say though that it has made for stronger teamwork in the laboratory, and we’ve had to be really focused and efficient—so there’s an upside! (Photo: Out for a paddle; courtesy of Elda Grabocka.) Any tips for a successful research career? Hard to say, because it certainly means different things to different people. The only tip I would give, perhaps, is to define what that means, what that success looks like for oneself, and be true to that. I expect how each one defines it also changes with time and experience, but I do think it’s very important to identify what success means as early as possible and let that be what you measure your efforts against. It’s easy to get distracted, overwhelmed, or even disheartened otherwise. My own definition is quite simple: success is doing what I love to do, working toward answering a meaningful scientific question, and enabling/supporting my trainees to reach their potential—keeping that in mind has been very important and helpful.  相似文献

17.
18.
Commercial screening services for inheritable diseases raise concerns about pressure on parents to terminate “imperfect babies”. Subject Categories: S&S: Economics & Business, Molecular Biology of Disease

Nearly two decades have passed since the first draft sequences of the human genome were published at the eye‐watering cost of nearly US$3 billion for the publicly funded project. Sequencing costs have dropped drastically since, and a range of direct‐to‐consumer genetics companies now offer partial sequencing of your individual genome in the US$100 price range, and whole‐genome sequencing for less than US$1,000. While such tests are mainly for personal use, there have also been substantial price drops in clinical genome sequencing, which has greatly enabled the study of and screening for inheritable disorders. This has both advanced our understanding of these diseases in general, and benefitted early diagnosis of many genetic disorders, which is crucial for early and efficient treatment. Such detection can, in fact, now occur long before birth: from cell‐free DNA testing during the first trimester of pregnancy, to genetic testing of embryos generated by in vitro fertilization, to preconception carrier screening of parents to find out if both are carriers of an autosomal recessive condition. While such prenatal testing of foetuses or embryos primarily focuses on diseases caused by chromosomal abnormalities, technological advances also allow for the testing of an increasing number of heritable monogenic conditions in cases where the disease‐causing variants are known. The medical benefits of such screening are obvious: I personally have lost two pregnancies, one to Turner's syndrome and the other to an extremely rare and lethal autosomal recessive skeletal dysplasia, and I know first‐hand the heartbreak and devastation involved in finding out that you will lose the child you already love so much. It should be noted though that, very rarely, Turner syndrome is survivable and the long‐term outlook is typically good in those cases (GARD, 2021). In addition, I have Kallmann syndrome, a highly genetically complex dominant endocrine disorder (Maione et al, 2018), and early detection and treatment make a difference in outcome. Being able to screen early during pregnancy or childhood therefore has significant benefits for affected children. Many other genetic disorders similarly benefit from prenatal screening and detection. But there is also obvious cause for concern: the concept of “designer babies” selected for sex, physical features, or other apparent benefits is well entrenched in our society – and indeed culture – as a product of a dystopian future. Just as a recent example, Philip Ball, writing for the Guardian in 2017, described designer babies as “an ethical horror waiting to happen” (Ball, 2017). In addition, various commercial enterprises hope to capitalize on these screening technologies. Orchid Inc claims that their preconception screening allows you to “… safely and naturally, protect your baby from diseases that run in your family”. The fact that this is hugely problematic if not impossible from a technological perspective has already been extensively clarified by Lior Pachter, a computational biologist at Caltech (Pachter, 2021). George Church at Harvard University suggested creating a DNA‐based dating app that would effectively prevent people who are both carriers for certain genetic conditions from matching (Flynn, 2019). 
Richard Dawkins at Oxford University recently commented that “…the decision to deliberately give birth to a Down [syndrome] baby, when you have the choice to abort it early in the pregnancy, might actually be immoral from the point of view of the child’s own welfare” (Dawkins, 2021). These are just a few examples, and as screening technology becomes cheaper, more companies will jump on the bandwagon of perfect “healthy” babies. In turn, this creates a risk that parents come under pressure to terminate pregnancies with “imperfect babies”, as I have experienced myself. What does this mean for people with rare diseases? From my personal moral perspective, the ethics are clear in cases where the pregnancy is clearly not viable. Yet, there are literally thousands of monogenic conditions and even chromosomal abnormalities, not all of which are lethal, and we are making constant strides in treating conditions that were previously considered untreatable. In addition, there is still societal prejudice against people with genetic disorders, and ignorance about what it is like to live with a rare disease. In reality, however, all rare disease patients I have encountered are happy to be alive and here, even those whose conditions have a significant impact on their quality of life. Many of us also don't like the term “disorder” or “syndrome”, as we are so much more than merely a disorder or a syndrome. Unfortunately, I also see many parents panic about the results of prenatal testing. Without adequate genetic counselling, they do not understand that their baby’s condition may actually have quite a good prognosis without major impact on quality of life. Following from this, a mere diagnosis of a rare disease – many of which would not even necessarily have been detectable until later in life, if at all – can be enough to make parents consider termination, due to social stigma. This of course raises the thorny issue of regulation, with approaches ranging from the USA, where there is little to no regulation of such screening technologies (ACOG, 2020), to Sweden, where such screening technologies are banned with the exception of specific high‐risk/lethal medical conditions both parents are known carriers for (SMER, 2021). As countries come to grips with both the potential and the risks involved in new screening technologies, medical ethics boards have approached this issue. And as screening technologies advance, we will need to ask ourselves difficult questions as a society. I know that in the world of “perfect babies” that some of these companies and individuals are trying to promote, I would not exist, nor would my daughter. I have never before found myself so often explaining to people that our lives have value, and I do not want to continue having to do so. Like other forms of diversity, genetic diversity is important and makes us richer as a society. As these screening technologies quickly advance and become more widely available, regulation should at least guarantee that screening involves proper genetic counselling from a trained clinical geneticist so that parents actually understand the implications of the test results. More urgently, we need to address the problem of societal attitudes towards rare diseases, face the prejudice and fear towards patients, and understand that abolishing genetic diversity in a quest for perfect babies would impoverish humanity and make the world a much poorer place.  相似文献

19.
2020 has been one of the craziest and strangest years we have lived through. Now that it’s over, it’s an opportunity to show gratitude for all the good things. Subject Categories: S&S: History & Philosophy of Science

I moved to New York City the year of the attacks on September 11, 2001, one of the bleakest moments in the history of the United States. I was also in New York City when Superstorm Sandy hit in 2012. Luckily, far fewer people died due to the storm, but it was incredibly disruptive to many scientists in the affected area—my laboratory had to move four times over a period of 6 years in the storm’s aftermath. These were awful, tragic events, but 2020 may go down in the record books as one of the most stressful and crazy years in modern times. Not to be outdone, 2021 has started terribly as well, with COVID‐19 still ravaging the world and an attack on the US Capitol, something I thought I’d never see in my lifetime. The unnecessary deaths and the damage to America’s “House of the People” were heartbreaking. While these events were surely awful, nothing will be as crushing as the deaths of family members, close friends, and the children of friends; perhaps it is these experiences—and the death of a beloved dog—that prepared me for this year and made me grateful, maybe even more than usual, for what I have. But in the age of a pandemic, what am I particularly grateful for? I'm ridiculously grateful to have a job, a roof over my head, and food security. The older I get and the more I see illness and injury affect my colleagues, family, and friends, the more I appreciate my good health. I am grateful for Zoom (no, I have no investment in Zoom)—not for the innumerable seminars or meetings I have attended, but for the happy hours that helped to keep me sane during the lockdown. Some of these were with my laboratory, others with friends or colleagues, sometimes spread over nine time zones. Speaking of which, I’m also grateful for getting a more powerful router for the home office. I'm immeasurably grateful to be a scientist, as it allows me to satisfy my curiosity. While not a year‐round activity, it is immensely gratifying to be able to go to my laboratory, set up experiments, and watch the results coming in. Teaching and learning from students is an incredible privilege, and showing the next generation of scientists how to set up a PCR or run a protein gel can, as a well‐known lifestyle guru might say, spark serious joy. For this reason, I’m eternally grateful to my trainees; their endless curiosity, persistence, and energy make showing up to the laboratory a pleasure. My dear friend Randy Hampton recently told me he received a student evaluation thanking him for telling his virtually taught class that the opportunity to educate and to be educated is something worth being grateful for, a sentiment I passed on to a group of students I taught this past fall. I believe they, too, were grateful. While all of the above things focus on my own life, there are much broader things to be grateful for. For one, I am so grateful to all of those around the globe who wear masks and keep their distance and thereby keep themselves and others safe. I am grateful for the election of an American president who proudly wears a mask—often quite stylishly with his trademark Ray‐Ban Aviators—and has made fighting the COVID‐19 pandemic his top priority. 
President Biden's decision to ramp up vaccine production and distribution, along with his federal mask mandate, will save lives, hopefully not just in the United States but worldwide. This Gen‐X‐er is also especially grateful to the citizens of Generations Y and Z around the world for fighting for social justice; I am hopeful that the Black Lives Matter movement has gained traction and that we may finally see real change in how communities of color are treated. I have been heartened to see that in my adopted home state of New York, our local politicians ensure that communities that have been historically underserved are prioritized for COVID‐19 testing and vaccinations. Along these lines, I am also incredibly encouraged by the election of the first woman, who also happens to be of African and Asian heritage, to the office of vice president. Times are a changin'... While it is difficult to choose one top thing to be grateful for, I would personally go for science. I am stoked that, faced with a global crisis, science came to the rescue, as it often has in the past. If I had to find a silver lining in COVID‐19—albeit for the darkest of clouds—I am grateful for all of our colleagues who, despite their usual arguing, quickly and effectively developed tests, provided advice, epidemiological data and a better understanding of the virus and its mode of infection, and ultimately developed therapies and vaccines to save lives. The same can be said for the biotech and pharmaceutical industry that, notwithstanding its often‐noted faults, has been instrumental in developing, testing and mass‐producing efficient and safe vaccines in blistering record time. Needless to say, I also have much gratitude for all of the scientists and regulators at the FDA and elsewhere who work hard to make life as we once knew it come back to us, hopefully in the near future. Once again, thank you for everything, Science.  相似文献

20.