1.
AmpD is a cytoplasmic peptidoglycan (PG) amidase involved in bacterial cell-wall recycling and in the induction of β-lactamase, a key enzyme of β-lactam antibiotic resistance. AmpD belongs to the amidase_2 family, which includes zinc-dependent amidases and the peptidoglycan-recognition proteins (PGRPs), highly conserved pattern-recognition molecules of the immune system. Crystal structures of Citrobacter freundii AmpD were solved in this study for the apoenzyme, for the holoenzyme at two different pH values, and for the complex with the reaction products, providing insights into PG recognition and the catalytic process. These structures differ significantly from the previously reported NMR structure for the same protein. The NMR structure does not possess an accessible active site and shows the protein in what is proposed herein as an inactive “closed” conformation. The transition of the protein from this inactive conformation to the active “open” conformation, as seen in the x-ray structures, was studied by targeted molecular dynamics simulations, which revealed large conformational rearrangements (as much as 17 Å) in four specific regions representing one-third of the entire protein. It is proposed that the large conformational change that would take the inactive NMR structure to the active x-ray structure represents an unprecedented mechanism for activation of AmpD. Analysis is presented to argue that this activation mechanism might be representative of a regulatory process for other intracellular members of the bacterial amidase_2 family of enzymes.
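As a rough way to see what a "17 Å rearrangement" means in practice, the closed and open models can be superposed and per-residue displacements computed. The sketch below is a minimal illustration, assuming the Cα coordinates of the two conformations have already been extracted into matched NumPy arrays; the file names and the 5 Å cutoff are illustrative, not from the paper:

```python
# Sketch: per-residue displacement between two conformations of the same
# protein (e.g. a "closed" NMR model vs. an "open" crystal structure).
# File names and the 5 A cutoff are illustrative assumptions.
import numpy as np

def kabsch_align(P, Q):
    """Rigid-body superpose P onto Q (both N x 3 C-alpha arrays)."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(U @ Vt))            # guard against reflections
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    return Pc @ R, Qc

closed = np.load("ampd_closed_ca.npy")            # hypothetical N x 3 coordinates
opened = np.load("ampd_open_ca.npy")
P, Q = kabsch_align(closed, opened)
disp = np.linalg.norm(P - Q, axis=1)              # per-residue displacement (A)
print("max displacement: %.1f A" % disp.max())
print("residues moving > 5 A:", np.where(disp > 5.0)[0])
```

Residues flagged this way would correspond to the four rearranging regions only if the inputs are the actual closed and open models.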

2.
Hunter P (2011) EMBO Reports 12(6): 504-507
New applications and technologies, and more rigorous safety measures, could herald a new era for genetically modified crops with improved traits, for use in agriculture and the pharmaceutical industry.

The imminent prospect of the first approval of a plant-made pharmaceutical (PMP) for human use could herald a new era for applied plant science, after a decade of public backlash against genetically modified crops, particularly in Europe. Yet the general resistance to genetically modified organisms might have done plant biotechnology a favour in the long run, by forcing it to adopt more-rigorous procedures for efficacy and safety in line with the pharmaceutical industry. This could, in turn, lead to renewed vigour for plant science, with the promise of developing not only food crops that deliver benefits to consumers and producers, but also a wide range of new pharmaceuticals.

This is certainly the view of David Aviezer, CEO of Protalix, an Israeli company that has developed what could become the first recombinant therapeutic protein from plants to treat Gaucher disease. The protein is called taliglucerase alpha; it is a recombinant human form of the enzyme glucocerebrosidase that is produced in genetically engineered carrot cells. This enzyme has a crucial role in the breakdown of glycolipids in the cell membrane, which are used either to provide energy or for cellular recognition. Deficiency of the enzyme causes an accumulation of lipids, with a variety of effects including premature death.

“My feeling is that there is a dramatic change in this area with a shift away from the direction where a decade ago biotech companies like Monsanto and Dow went with growing transgenic plants in an open field, and instead moving this process into a more regulatory well-defined process inside a clean room,” Aviezer said. “Now the process is taking place in confined conditions and is very highly regulated as in the pharmaceutical industry.”

He argues that this is ushering in a new era for plant biotechnology that could lead to greater public acceptance, although he denies that the move to clean-room development has been driven purely by the environmental backlash against genetically modified organisms in the late 1990s and early 2000s. “That was one aspect, but I think the move has been coming more from an appreciation that biopharmaceuticals require a more regulatory defined setting than is achieved at the moment with transgenic plants.”

Interest in deriving pharmaceuticals from plants, known colloquially as ‘pharming’, first took off in the 1990s after researchers showed that monoclonal antibodies could be made in tobacco plants (Hiatt et al, 1989). This led to genetic engineering of plants to produce vaccines, antibodies and proteins for therapeutics, but none gained regulatory approval, mostly because of safety concerns. Moreover, the plants were grown in open fields, thereby attracting the same criticisms as transgenic food crops.
In fact, a recent study showed that the views of the public on pharming depended on the product and the means to produce it; the researchers found increasing acceptance if the plants were used to produce therapeutics against severe diseases and grown in containment (Pardo et al, 2009).

However, it was the technical challenges involved in purification and the associated regulatory issues that really delayed the PMP field, according to George Lomonossoff, project leader in biological chemistry at the John Innes Centre for plant research in Norwich in the UK, part of the Biotechnology and Biological Sciences Research Council (BBSRC). “Extraction from plants required the development of systems which are not clogged by the large amounts of fibrous material, mainly cellulose, and the development of GMP [good manufacturing practice; quality and testing guidelines for pharmaceutical manufacture] compliant methods of purification which are distinct from those required from, say, mammalian cells,” said Lomonossoff. “All this is very time consuming.”

“Secondly there was no regulatory framework in place to assess the risks associated with proteins produced in plants, and determining how equivalent they are to mammalian-cell-produced material and what kind of contaminants you might have to guard against,” Lomonossoff added. “Again, attempting to address all possible concerns is a lengthy and expensive process.” Yet recent work by Protalix and a few other companies, such as Dow Agrosciences, has given grounds for optimism that purification and GMP-compliant methods of production have finally been established, Lomonossoff added.

The first important breakthrough for PMPs came in 2006, when Dow Agrosciences gained regulatory approval from the US Department of Agriculture for a vaccine against Newcastle disease, a contagious bird infection caused by paramyxovirus PMV-1. “Though the vaccine, produced in tobacco-suspension culture cells, was never deployed commercially, it showed that regulatory approval for a plant-made pharmaceutical can be obtained, albeit for veterinary use in this case,” Lomonossoff said.

As approval is imminent for taliglucerase alpha for human use, it is natural to ask why plants, as opposed to micro-organisms and animals, are worth the effort as sources of vaccines, antibiotics or hormones. There are three reasons: first, plants can manufacture some existing drugs more cheaply; second, they can do it more quickly; and third, and perhaps most significantly, they will be able to manufacture more complex proteins that cannot be produced with sufficient yield in any other way.

An important example in the first category is insulin, which is being manufactured in increasing quantities to treat type 1 diabetes and some cases of type 2 diabetes. Until the arrival of recombinant DNA technology, replacement insulin was derived from the pancreases of animals in abattoirs, mostly cattle and pigs, but it is now more often produced from transgenic Escherichia coli, or sometimes yeast. Recently, there has been growing interest in using plants rather than bacteria as sources of insulin (Davidson, 2004; Molony et al, 2005).
SemBioSys, a plant biotechnology company based in Calgary, Canada, is now developing systems to produce insulin and other therapeutic proteins in the seeds of safflower, an oilseed crop (Boothe et al, 2009).

“We have developed technology that combines the high-capacity, low-cost production of therapeutic proteins in seeds with a novel technology that simplifies downstream purification,” said Joseph Boothe, vice president of research and development at SemBioSys. “The target proteins are engineered to associate with small, oil-containing structures within the seed known as oilbodies,” Boothe explained. “When extracted from the seed these oilbodies and associated proteins can be separated from other components by simple centrifugation. As a result, much of the heavy lifting around the initial purification is accomplished without chromatography, providing for substantial cost savings.”

The second potential advantage of PMPs is their speed to market, which could prove most significant for the production of vaccines, either against emerging diseases or seasonal influenza, for which immunological changes in the virus mean that newly formulated vaccines are required each year. “In terms of a vaccine, I think influenza is very promising particularly as speed is of the essence in combating new strains,” Lomonossoff said. “Using transient expression methods, you can go from sequence to expressed protein in two weeks.” Transient gene expression involves the introduction of genes into a cell to produce a target protein, rather than permanently incorporating the gene into a host genome. This is emerging as a less technically difficult and faster alternative to developing stable cell lines for expressing bioengineered proteins. The process of delivering the desired gene into the target cell, known as transfection, can be effected not only by viruses, but also by non-viral agents including various lipids, polyethylenimine and calcium phosphate.

The last of the three advantages of plants for pharmaceutical production—the ability to manufacture proteins not available by other means—is creating perhaps the greatest excitement. The Protalix taliglucerase alpha protein falls into this category, and is likely to be followed by other candidates for treating disorders that require enzymes or complex molecules beyond the scope of bacteria, according to Aviezer. “I would say that for simpler proteins, bacteria will still be the method of choice for a while,” Aviezer said. “But for more complex proteins currently made via mammalian cells, I think we can offer a very attractive alternative using plant cells.”

Indeed, plants can in principle be engineered to produce any protein, including animal ones, as Boothe pointed out. “In some cases this may require additional genetic engineering to enable the plant to perform certain types of protein modification that differ between plants and animals,” he said. “The classic example of this is glycosylation. With recent advances in the field it is now possible to engineer plants to glycosylate proteins in a manner similar to that of mammalian cells.” Glycosylation is a site-directed process that adds mono- or polysaccharides to organic molecules, and plays a vital role in folding and conferring stability on the finished molecule or macromolecule.
Although plants can be engineered to perform it, bacteria generally cannot, which is a major advantage of plant systems over micro-organisms for pharmaceutical manufacture, according to Aviezer. “This enables plant systems to do complex folding and so make proteins for enzyme replacement or antibodies,” Aviezer said.

In addition to plants themselves, their viruses also have therapeutic potential, either to display epitopes—the protein, sugar or lipid components of antigens on the surface of an infectious agent—so as to trigger an immune response or, alternatively, to deliver a drug directly into a cell. However, as Lomonossoff pointed out, regulatory authorities remain reluctant to approve any compound containing foreign nucleic acids for human use because of the risk of infection as a side effect. “I hope the empty particle technology [viruses without DNA] we have recently developed will revive this aspect,” Lomonossoff said. “The empty particles can also be used as nano-containers for targeted drug delivery and we are actively exploring this.”

As pharmaceutical production is emerging as a new field for plant biology, there is a small revolution going on in plant breeding, with the emergence of genomic techniques that allow simultaneous selection across several traits. Although genetic modification can, by importing a foreign gene, provide instant expression of a desired trait, such as drought tolerance, protein content or pesticide resistance, the new field of genomics-assisted breeding has just as great potential through selection of unique variants within the existing gene pool of a plant, according to Douwe de Boer, managing director of the Netherlands biotech group Genetwister. “With this technology it will be possible to breed faster and more efficiently, especially for complex traits that involve multiple genes,” he said. “By using markers it is possible to combine many different traits in one cultivar, variety, or line in a pre-planned manner and as such breed superior crops.”

Genomics-assisted breeding is being used either as a substitute for, or a complement to, genetic-modification techniques, both for food crops, to bolt on traits such as nutrient value or drought resistance, and for pharmaceutical products, for example to increase the yield of a desired compound or reduce unwanted side effects. Yet more research is required to make genomics-assisted breeding as widely used as established genetic-modification techniques. “The challenge in our research is to find markers for each trait and as such we extensively make use of bio-informatics for data storage, analysis and visualization,” de Boer said.

The rewards are potentially enormous, according to Alisdair Fernie, a group leader at the Max Planck Institute for Molecular Plant Physiology in Potsdam, Germany. “Smart breeding will certainly have a massive impact in the future,” Fernie said.
“The application of genomics technologies and next generation sequencing will surely revolutionize plant breeding and will eventually allow this to be achieved with clinical precision.” The promise of such genomic technologies in plants extends beyond food and pharmaceuticals to energy and new materials or products such as lubricants; the potential of plants is that they are not just able to produce the desired compound, but can often do so more quickly, efficiently and cheaply than competing biotechnological methods.

3.
Despite the scientific community’s overwhelming support for the European Research Council, many grant recipients are irked by red tape.

There is one thing that most European researchers agree on: B stands for Brussels and bureaucracy. Research funding from the European Commission (EC), which distributes EU money, is accompanied by strict accountability and auditing rules in order to ensure that European taxpayers’ money is not wasted. All disbursements are treated the same, whether subsidies to farmers or grants to university researchers. However, the creation of the European Research Council (ERC) in 2007 as a new EU funding agency for basic research created high hopes among scientists for a reduced bureaucratic burden.

The ERC has, indeed, been a breath of fresh air for European-level research funding, as it distributes substantial grants based only on the excellence of the proposal, and it has been overwhelmingly supported by the scientific community. Nevertheless, many researchers who have received ERC funding have been angered by accounting rules inherited from the EC’s Framework Programmes, which seem impossible to change. In particular, a requirement to fill out time sheets to demonstrate that scientists spend an appropriate amount of time working on the project for which they received their ERC grant has triggered protests over the paperwork (Jacobs, 2009).

Luis Serrano, Coordinator of the Systems Biology Programme at the Centre for Genomic Regulation (CRG) in Barcelona, Spain, and recipient of a €2 million ERC Advanced Investigator Grant for five years, said the requirement to keep time sheets is at best a waste of time and at worst an insult to high-level researchers. “Time sheets do not make much sense, to be honest. If you want to cheat, you can always cheat,” he said. Other grants he receives, from the Spanish government and the Human Frontier Science Programme, do not require time sheets.

Complaints by academic researchers about the creeping bureaucratization of research are not confined to the old continent (see Opinion by Paul van Helden, page 648). As most research, as well as universities and research institutes, is now funded by public agencies using taxpayers’ money, governments and regulators feel under pressure to make sure that the funds are not wasted or misappropriated. Yet the USA and the EU have taken different approaches to making sure that scientists use public money correctly. In the USA, misappropriation of public money is considered a criminal offence that can be penalized by a ban on receiving public funds, fines and even jail time; in fact, a few scientists in the USA have gone to prison.

By contrast, the EU puts the onus on controlling how public money is spent upfront. Research funding under the EU’s Framework Programmes requires clearly spelt-out deliverables and milestones, and requires researchers to adhere to strict accountability and auditing rules. Not surprisingly, this comes with an administrative burden that has raised the ire of many scientists who feel that their time is better spent doing research. Serrano said that in a major research centre such as the CRG, the administration can minimize the paperwork burden. “My administration prepares them for me and I go one, two, three, four, five and I do all of them. You can even have a machine sign for you,” he commented.
“But I can imagine researchers who don’t have the administrative help, this can take up a significant amount of time.” For ERC grants, which by definition are for ‘blue-skies’ research and thus do not have milestones or deliverables, such paperwork is clearly not needed.

Not everyone is as critical as Serrano, though. Vincent Savolainen at the Division of Biology at Imperial College London, UK, and recipient of a €2.5 million, five-year ERC Advanced Investigator Grant, said, “Everything from the European Commission always comes with time sheets, and ERC is part of the European Commission.” Still, he felt that tracking the time spent on individual grants was confusing for Principal Investigators such as himself. “It is a little bit ridiculous but I guess there are places where people may abuse the system. So I can also see the side of the European Commission,” he said. “It’s not too bad. I can live with doing time sheets every month,” he added. “Still, it would be better if they got rid of it.”

Juleen Zierath, an integrative physiologist in the Department of Molecular Medicine at the Karolinska Institutet (Stockholm, Sweden), who received a €2.5 million, five-year ERC grant, takes the time sheets in her stride. “If I worked in a company, I would have to fill out a time sheet,” she said. “I’m delighted to have the funding. It’s a real merit. It’s a real honour. It really helps my work. If I have to fill out a time sheet for the privilege of having that amount of funding for five years, it’s not a big issue.”

Zierath, a native of Milwaukee (WI, USA) who came to Karolinska for graduate work in 1989, said the ERC’s requirements are certainly “bureaucracy light” compared with the accounting and reporting requirements of more traditional EU funding instruments, such as the ‘Integrated Projects’. “ERC allows you to focus more on the science,” she said. “I don’t take time sheets as a signal that the European Union doesn’t count on us to be doing our work on the project. They have to be able to account for where they’re spending the money somehow and I think it’s okay. I can understand where some people would be really upset about that.”

The complaints about time sheets and other bureaucratic red tape have caught the attention of high-level scientists and research managers throughout Europe. In March 2009, the EC appointed an outside panel, headed by Vaira Vike-Freiberga, former President of Latvia, to review the ERC’s structures and mechanisms. The panel reported in July last year that the objective of building a world-class institution is not properly served by “undue cumbersome regulations, checks and controls”. Although fraud and mismanagement should be prevented, excessively bureaucratic procedures detract from the mission, and might be counter-productive.

Helga Nowotny, President of the ERC, said the agency has to operate within the rules of the EC’s Framework Programme 7, which includes the ERC. She explained that if researchers hold several grants, the EC wants recipients to account for their time. “The Commission and the Rules of Participation of course argue that many of these researchers have more than one grant or they may have other contracts. In order to be accountable, the researchers must tell us how much time they spend on the project.
But instead of simply asking if they spent a percentage of time on it, the Commission auditors insist on time sheets. I realize that filling them out has a high symbolic value for a researcher. So, why not leave it to the administration of the host institution?”

Particle physicist Ian Halliday, President of the European Science Foundation and a major supporter of the ERC, said that the financial irregularities that have affected the EU over many years prompted the Commission to tighten its monitoring of cash outlays. “There have been endless scandals over the agricultural subsidies. Wine leaks. Nonexistent olive trees. You name it,” he said. “The Commission’s financial system is designed to cope with that kind of pressure as opposed to trusting the University of Cambridge, for example, which has been there for 800 years or so and has a well-earned reputation by now. That kind of system is applied in every corner of the European Commission. And that is basically what is causing the trouble. But these rules are not appropriate for research.”

Nowotny is sympathetic and sensitive to the researchers’ complaints, saying that requiring time sheets for researchers sends a message of distrust. “It feels like you’re not trusted. It has this sort of pedantic touch to it,” she said. “If you’ve been recognized for doing this kind of top research, researchers feel, ‘Why bother [with time sheets]?’” But the bureaucratic alternative would not work for the ERC either: it would mean spelling out ‘deliverables’ in advance, which is clearly not possible with frontier research.

Moreover, as Halliday pointed out, there is inevitably an element of fiction with time sheets in a research environment. In his area of research, for example, he considers it reasonable to track the hours of a technician fabricating parts of a telescope. But he noted that there is a different dynamic for researchers: “Scientists end up doing their science sitting in their bath at midnight. And you mull over problems and so forth. How do you put that on a time sheet?” Halliday added that one of the original arguments for establishing the ERC was to put it at arm’s length from the Commission, and in particular from financial regulations. But to require scientists to specify what proportion of their neurons is dedicated to a particular project at any hour of the day or night is nonsensical. Nowotny agreed. “The time sheet says I’ve been working on this from 11 in the morning until 6 in the evening or until midnight or whatever. This is not the way frontier research works,” she said.

Halliday, who served for seven years as chief executive of the Particle Physics and Astronomy Research Council (Swindon, UK), commented that all governments require accountability. In Great Britain, for instance, much more general accountability rules are applied to grantees, thereby offering a measure of trust. “We were given a lot of latitude. Don’t get me wrong that we allowed fraud, but the system was fit for the purpose of science. If a professor says he’s spending half his time on a certain bit of medical research, let’s say, the government will expect half his salary to show up in the grants he gets from the funding agencies. We believe that if the University of Cambridge says that this guy is spending half his time on this research, then that’s probably right and nobody would get excited if it was 55% or 45%.
People would get excited if it was 5%. There are checks and balances at that kind of level, but it’s not at the level of time sheets. It will be checked whether the project has done roughly what it said.”

Other funding agencies also take a less bureaucratic approach. Candace Hassall, head of Basic Careers at the Wellcome Trust (London, UK), which funds research to improve human and animal health, said Wellcome’s translation awards have milestones that researchers are expected to meet. But “time sheets are something that the Wellcome Trust hasn’t considered at all. I would be astonished if we would ever consider them. We like to work closely with our researchers, but we don’t require that level of reporting detail,” she said. “We think that such detailed, day-by-day monitoring is actually potentially counterproductive overall. It drives people to be afraid to take risks when risks should be taken.”

On the other side of the Atlantic, Jack Dixon, vice president and chief scientific officer at the Howard Hughes Medical Institute (Chevy Chase, MD, USA), who directs Hughes’ investigator programme, said he’d never heard of researchers being asked to keep time sheets: “Researchers filling out time sheets is just something that’s never crossed our minds at the Hughes. I find it sort of goofy if you want to know the truth.”

Instead, Hughes trusts researchers to spend the money according to their needs. “We trust them,” Dixon said. “What we ask each of our scientists to do is devote 75% of their time to research and then we give them 25% of their time which they can use to teach, serve on committees. They can do consulting. They can do a variety of things. Researchers are free to explore.”

There is already growing support for eliminating the time sheets and other bureaucratic requirements that come with an ERC grant, which are obviously just a hangover from the old system. Indeed, there have been complaints, such as reviewers of grant applications having to fax in copies of their passports or identity cards before being allowed sight of the proposals, said Nowotny. The review panel called on the EC to adapt its rules “based on trust and not suspicion and mistrust” so that the ERC can attain the “full realization of the dream shared by so many Europeans in the academic and policy world as well as in political milieus.”

In fact, a system based on trust still works better in the academic world. Hassall commented that lump-sum payments encourage the necessary trust and give researchers a sense of freedom, which is already the principle behind ERC funding. “We think that you have to trust the researcher. Their careers are on the line,” she said. Nowotny hopes the ERC will be allowed to take an approach similar to that of the Wellcome Trust, with its grants treated more like “a kind of prize money” than as a contract for services.

She sees an opportunity to relax the bureaucratic burden with a scheduled revision of the Rules of Participation, but issues a word of caution given that, when it comes to EU money, other players are involved. “We don’t know whether we will succeed in this because it’s up to the finance ministers, not even the research ministers,” she explained. “It’s the finance ministers who decide the rules of participation.
If finance ministers agree then the time sheets would be gone.”

4.
Femtobiology freeze-frames crucial split seconds of chemical reactions to investigate how enzymes function. The potential prize from this knowledge could be new avenues for drug development or ways to produce clean energy.

Along with replication and mutability, living beings are set apart from the mineral background they inhabit by their metabolism—their ability to catalyse chemical reactions. Since Linus Pauling first proposed that these reactions are made possible by enzymes that recognize and bind tightly to their substrates at a crucial transition point [1], it has become increasingly clear that understanding these reactions requires details of the precise molecular alignments that take place on the timescale of femtoseconds (10⁻¹⁵ s).

This transition state is the ‘point of no return’ for colliding molecules in a chemical reaction. Beyond it, the reactants inevitably go on to form new products; before it, the reaction does not take place. It lasts for tens to hundreds of femtoseconds, while the molecules are at a state of maximum energy from which they will fall either towards completing the reaction or, with equal likelihood, away from it. The role of the enzyme is to enable the molecules to negotiate this energy summit and to reach the point of completing the reaction.

Many processes, including protein folding and the splitting of water during photosynthesis, pass through more than one transition state. Unravelling them all is a challenging task, but the potential prizes are great and might include the ability to harness reactions to produce carbon-neutral energy, for example by mimicking or exploiting photosynthesis. There are also great therapeutic possibilities, as cell replication in cancer or metabolic processes in pathogens could be halted by intervening at transition states to block key reactions.

This therapeutic avenue was first explored in 1986 by Richard Wolfenden, now at the University of North Carolina at Chapel Hill, USA, who calculated that conformational changes in the active site of an enzyme at the transition state should enable it to bind to the reactants with huge strength to overcome the energy barrier [2]. This, in turn, suggested that suitably designed analogues, mimicking the reactants at the transition state, could intervene by binding to the enzyme during that brief window, thus rendering the enzyme ineffective.

However, the technologies needed to gather information about transition states have only become available during the past decade. The principal technology in use is X-ray absorption spectroscopy (XAS), combined with an ultra-fast laser in an arrangement known as a ‘pump probe’. This setup determines the geometrical shape of the approaching molecular orbitals and the distribution of electrostatic charge around them: the XAS provides information about charge distribution, whilst the pump probe yields details of the geometrical structure during the crucial femtoseconds of the transition state.

The pump probe splits a short laser pulse into two pulses separated by a delay on the timescale of the relevant molecular vibrations. The first pulse—the pump—excites the sample, whereas the second pulse—the probe—measures the changes caused by the first (a toy numerical illustration of such a delay scan follows below). This information can be used to determine the structural details of the transition state, thus enabling the hunt for suitable analogues. Vern Schramm’s laboratory at the Albert Einstein College of Medicine of Yeshiva University, in New York, USA, is doing exactly this.
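To make the pump-probe idea concrete, here is a toy delay scan in Python: a pump creates an excited state whose decaying signal is sampled by probes at increasing delays, and the lifetime is recovered from the scan. The 150 fs lifetime, the noise level and the simple exponential form are assumptions for illustration; a real XAS pump-probe measurement is far more involved:

```python
# Toy pump-probe delay scan (illustrative only; real experiments probe
# femtosecond transition states with XAS, not this simple exponential).
import numpy as np

tau = 150e-15                              # assumed excited-state lifetime: 150 fs
delays = np.linspace(0, 600e-15, 25)       # pump-probe delays, 0-600 fs

rng = np.random.default_rng(0)
signal = np.exp(-delays / tau) + rng.normal(0, 0.02, delays.size)  # probe signal + noise

# Recover the lifetime from the scan with a log-linear fit
mask = signal > 0.05                       # avoid log of noise-dominated points
slope, _ = np.polyfit(delays[mask], np.log(signal[mask]), 1)
print("fitted lifetime: %.0f fs" % (-1e15 / slope))
```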
“Our approach gives geometry and electrostatic information for the transition state,” Schramm explained. “We can use computational approaches to compare these to large numbers of related molecules to see which best mimic the transition state.”

Schramm’s team has already applied this to develop a drug that targets Plasmodium falciparum—the protozoan parasite that causes malaria. The drug blocks the crucial purine pathway with a transition-state analogue [3]. Plasmodium is a purine auxotroph, meaning that it cannot manufacture the molecule directly. Instead, the parasite makes purines indirectly, through an enzyme called purine nucleoside phosphorylase that synthesizes a purine precursor called hypoxanthine. Schramm’s transition-state analogue, BCX4945, binds to the active site of the enzyme at the transition state and so blocks its action, starving the parasite of purine.

In trials, BCX4945 cleared P. falciparum infection in night monkeys of the Aotus genus—a model close to human malarial infection. But there was some re-emergence of the parasite at reduced levels after a few days, similar to the pattern observed with conventional anti-malarial drugs. The drug has been licensed to BioCryst Pharmaceuticals, which is providing it to third parties, under license, for clinical trials. “One such party is now evaluating the drug for a go/no-go decision to go forward into a small controlled human trial,” commented Schramm. “We expect that party to make that decision by mid-2013.”

Meanwhile, Schramm has planned laboratory studies to determine the exact mechanism of drug action, off-target effects and the efficiency of different drug combinations in night monkeys, as well as the rate of resistance formation in the parasite to BCX4945. However, he is having trouble finding funding for the research, as the eventual treatment would require more than three doses per day, making it difficult to deploy in regions that suffer from malaria and have poor health infrastructure. Nevertheless, Schramm is convinced that the drug has great potential because of its low toxicity and different mode of action, which starves the parasite. It has certainly demonstrated that transition-state analogues can work.

In the meantime, Schramm’s group is targeting human immunodeficiency virus (HIV), which has also resisted attempts to develop satisfactory therapies that are both effective and have acceptably low side effects. The aim is to inhibit the HIV-1 protease that cleaves newly synthesized polypeptides to enable the virus particles to become infectious and invade new cells. HIV protease inhibitors have been used for years, but resistant strains of HIV have emerged. Schramm believes that a transition-state analogue might overcome this problem of resistance. “We recently solved the transition-state structures of HIV protease native and drug-resistant enzymes,” he said. “Surprisingly, the transition states are identical. Thus, the resistance does not come from altered transition-state structure. The result shows that if a transition-state analogue can be found for the reaction, it should efficiently inhibit both the native and resistant enzymes.”

Although such an approach holds great promise, there are significant challenges in developing drugs that mimic transition states.
One is that solving the structure of the transition state itself is not sufficient, as the analogues might still not be suitable for use in humans. Kinases, for example, perform a wide variety of signalling and other functions by transferring phosphate groups. “In kinases, we understand the transition states, but biologically compatible mimics of the transition state have not been achieved,” Schramm said.

Even when biologically compatible, effective mimics are available, they might still prove inappropriate owing to unanticipated effects on other pathways. Schramm also pointed out that an inhibitor can be too powerful, irrespective of its mode of action. “Some human targets are essential and it will be harmful to cause complete inhibition for long periods. An example is the target of statins, HMG-CoA reductase, which is the pacemaker enzyme for cholesterol, but also for all other steroid hormones,” he explained. This biochemical knowledge of the target is crucial for using transition-state analogues, Schramm noted. “When the target is unique to a pathogen, for example, their use is ideal. But when the target is a host enzyme, for example in cancer, animal experiments are essential to show that the analogue has the desired effect with limited toxicity.”

Femtobiology is not only focused on identifying transition-state analogues for drug development; researchers are also digging into photosynthesis, given its potential for yielding carbon-neutral fuels and electric power. Photosynthesis involves two photoreactions that harvest light to energize electrons through a plethora of associated enzymes and co-factors. The crucial first step is carried out by photosystem 2 (PS2), which uses light energy to split two water molecules into oxygen and four electrons. The electrons are transferred to the Calvin cycle, in which they convert carbon dioxide into carbohydrates.

The water-splitting part of PS2 is the crucial component for solar energy conversion because it is the engine of the whole system and the key to its high efficiency [4]. “Understanding the water-splitting reaction and identifying the various reaction steps and intermediates is of key importance and will be very important for the development of new and efficient artificial systems,” explained Villy Sundstrom, whose team at Lund University in Sweden works on solar energy conversion research.

The water splitting occurs in a cluster of four manganese ions and one calcium ion in a five-state cycle. Analysing the process accurately requires elucidating the precise structure of each stage, each of which lasts for only a short period. An important step forward was made in 2011, with the production of a model of the complex in the ground S1-state by X-ray crystallography at a resolution of 1.9 Å [5]. This still left the great challenge of determining the structure of the transient S2-, S3- and S4-states, but provided essential information that stimulated further work on the structure of the S2-state [6]. The study of the S2-state, by Khandavalli Lakshmi and colleagues at The Baruch ’60 Center for Biochemical Solar Energy Research in Troy, New York, USA, involved the use of PS2 isolated from three species—two cyanobacteria and spinach.
The researchers trapped the oxygen-evolving complex (OEC) in the S2-intermediate state by low-temperature illumination.

Lakshmi’s team used a technique called two-dimensional hyperfine sublevel correlation spectroscopy to detect weak magnetic interactions between the manganese cluster of the S2-state and the surrounding protons. “The major breakthrough of the 1.9 Å X-ray crystal structure [of the S1-state] is that it identifies all of the amino acid ligands of the Mn4Ca-oxo cluster and four water molecules that are directly coordinated to the metal ions,” Lakshmi said. This helped the team with their detective work in locating all the structural units within 5 Å of the Mn4Ca-oxo cluster that might be involved. “This leads to several likely candidates that include amino acid ligands that are directly co-ordinated to the cluster, amino acid side chains that are not co-ordinated to the cluster, two water molecules that are co-ordinated to the manganese ion, two water molecules that are co-ordinated to the Ca2+ ion and nine water molecules that form a hydrogen bond network in the vicinity of the Mn4Ca-oxo cluster in the crystal structure,” Lakshmi explained.

One of the interesting findings was that the S2-states of the three organisms studied were almost indistinguishable. “In an unexpected but welcome surprise, we observe that the hyperfine spectra of the S2-state of the OEC of PSII from Thermosynechococcus vulcanus, the PsbB variant of Synechocystis PCC 6803 and spinach are identical,” Lakshmi said. “This suggests that the OEC of PSII is highly conserved in the three species.”

There is still some way to go to unravel all the S-states of PS2, Lakshmi conceded. “There are several open questions on the fate of the water molecules in the S-states that warrant immediate attention,” she said. These include the precise location and binding of the substrate water molecules, the oxidation state of the manganese ions that ligate the substrate water molecules and the precise geometry of the Mn4Ca-oxo cluster.

In parallel with the structural and functional analysis of the S-states of PS2, research has been ongoing into artificial systems that use catalysts other than the Mn4Ca-oxo cluster for water splitting. Such systems had achieved rates of oxygen production usually two orders of magnitude lower than that of PS2 itself. But a major advance uses a ruthenium catalyst to achieve water oxidation rates similar to those of PS2 [7]. There is just one important caveat: the catalyst does not use light to drive the oxidation; it uses an acidic solution of ammonium cerium nitrate, a compound of the rare earth metal cerium. However, the team believes that the high rates of oxidation achieved with the ruthenium catalyst could lead to water oxidation technology based on more abundant elements, such as the first-row metals rather than rare earth ones. In the future, they hope that the knowledge gained about these artificial catalysts and how they work will pave the way to the light-driven generation of molecular hydrogen by water splitting.

Whilst the ultimate aims of directly harnessing photosynthesis for human benefit, and of creating an artificial system that rivals the water-splitting efficiency of PS2, would be huge steps forward with profound implications for energy production, the end is a long way off.
In the meantime, the growing interest in split-second moments at the catalytic centres of many enzymes continues to enhance our knowledge of metabolism and lays the groundwork for progress in drug development, energy production and other areas.

5.
Cooperativity is extensively used by enzymes, particularly those acting at key metabolic branch points, to “fine tune” catalysis. Thus, cooperativity and enzyme catalysis are intimately linked, yet their linkage is poorly understood. Here we show that negative cooperativity in the rate-determining step in the E1 component of the Escherichia coli pyruvate dehydrogenase multienzyme complex is an outcome of redistribution of a “rate-promoting” conformational pre-equilibrium. An array of biophysical and biochemical studies indicates that non-catalytic but conserved residues directly regulate the redistribution. Furthermore, factors such as ligands and temperature, individually or in concert, also strongly influence the redistribution. As a consequence, these factors also exert their influence on catalysis by profoundly influencing the pre-equilibrium facilitated dynamics of communication between multienzyme components. Our observations suggest a mode of cooperativity in the E1 component that is consistent with the dynamical hypothesis shown to satisfactorily explain cooperativity in many well studied enzymes. The results point to the likely existence of multiple modes of communication between subunits when the entire class of thiamin diphosphate-dependent enzymes is considered.
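A minimal numerical sketch of the pre-equilibrium idea the abstract invokes: if only one conformer of a two-state equilibrium is catalytically competent, anything that redistributes the equilibrium (ligands, temperature) changes the observed rate. The thermodynamic numbers below are invented for illustration and are not the E1 component's actual parameters:

```python
# Toy two-state pre-equilibrium model: only the "active" conformer turns over,
# so redistributing the equilibrium changes the observed rate. Numbers are
# illustrative, not fitted to the E1 data.
import numpy as np

R = 8.314            # gas constant, J mol^-1 K^-1
dH, dS = 40e3, 120   # assumed enthalpy (J/mol) and entropy (J/mol/K) of inactive -> active

def fraction_active(T):
    K = np.exp(-(dH - T * dS) / (R * T))   # van 't Hoff equilibrium constant
    return K / (1.0 + K)

k_chem = 50.0        # intrinsic turnover of the active conformer, s^-1
for T in (283.0, 298.0, 313.0):
    f = fraction_active(T)
    print("T = %.0f K  f_active = %.2f  k_obs = %.1f s^-1" % (T, f, k_chem * f))
```

The observed rate tracks the active fraction, which is the sense in which a redistributed pre-equilibrium can masquerade as cooperativity in the rate-determining step.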

6.
Despite intense interest and considerable effort via high-throughput screening, there are few examples of small molecules that directly inhibit protein-protein interactions. This suggests that many protein interaction surfaces may not be intrinsically “druggable” by small molecules, and elevates in importance the few successful examples as model systems for improving our fundamental understanding of druggability. Here we describe an approach for exploring protein fluctuations enriched in conformations containing surface pockets suitable for small molecule binding. Starting from a set of seven unbound protein structures, we find that the presence of low-energy pocket-containing conformations is indeed a signature of druggable protein interaction sites and that analogous surface pockets are not formed elsewhere on the protein. We further find that ensembles of conformations generated with this biased approach structurally resemble known inhibitor-bound structures more closely than equivalent ensembles of unbiased conformations. Collectively these results suggest that “druggability” is a property encoded on a protein surface through its propensity to form pockets, and inspire a model in which the crude features of the predisposed pocket(s) restrict the range of complementary ligands; additional smaller conformational changes then respond to details of a particular ligand. We anticipate that the insights described here will prove useful in selecting protein targets for therapeutic intervention.
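The core idea, generating an ensemble biased toward pocket-containing conformations, can be caricatured in one dimension with a Metropolis sampler whose effective energy rewards a pocket score. The landscape, pocket function and bias strength below are stand-ins for illustration, not the authors' actual sampling protocol:

```python
# Minimal Metropolis caricature of biasing an ensemble toward pocket-containing
# states: a bias term rewards a (toy) pocket score. Energy and pocket functions
# are stand-ins, not a real force field.
import math, random

def energy(x):        return (x - 1.0) ** 2          # toy landscape, minimum at x = 1
def pocket_score(x):  return max(0.0, x - 1.5)       # "pocket opens" for x > 1.5

def sample(bias, steps=100_000, kT=1.0):
    x, traj = 0.0, []
    for _ in range(steps):
        x_new = x + random.uniform(-0.3, 0.3)
        dE = (energy(x_new) - bias * pocket_score(x_new)) \
           - (energy(x)     - bias * pocket_score(x))
        if dE <= 0 or random.random() < math.exp(-dE / kT):
            x = x_new                                # Metropolis acceptance
        traj.append(x)
    return traj

random.seed(1)
frac = lambda xs: sum(pocket_score(x) > 0 for x in xs) / len(xs)
print("pocket-open fraction, unbiased: %.2f  biased: %.2f"
      % (frac(sample(bias=0.0)), frac(sample(bias=2.0))))
```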

7.
Hunter P (2011) EMBO Reports 12(5): 401-404
Modern computing power grants scientists the ability to crunch previously impossible amounts of data. Nevertheless, the human brain has not yet been replaced as the engine of design and discovery.

Biologists first dipped their toes into the waters of mathematics when Gregor Mendel developed his laws of trait inheritance in the mid-nineteenth century. Since then, statistics and computation have come to play an important part in almost all aspects of applied and fundamental research. The power of number crunching has even led to questions about the role of traditional observation or insight in experimentation, with a concomitant concern that over-reliance on mathematics might lead researchers to lazily follow preconceived ideas, or to ignore inconvenient data that actually indicate that a theory needs modification.

Lars Jensen, a group leader in Disease Systems Biology at the Novo Nordisk Foundation (NNF) Centre for Protein Research at the University of Copenhagen in Denmark, takes this position. “It is, in my opinion, a risk that has always been there and still is there in the mathematical era,” he commented. “If researchers have a preconceived idea about how the results of an experiment should be, they may be tempted to classify observations as outliers if they do not fit the expectations.”

As such, the risk that statistics trump observation has to be considered carefully, but should not turn back the tide of computation and analysis in biology. Jeremy Nicholson, head of the Department of Surgery and Cancer in the Faculty of Medicine at Imperial College London, UK, argues that only the application of mathematics can show if the results of an experiment are true. “The only proof of biological activity is either in statistics, which of course goes back a long way, or geometry, as used in physical anthropology,” he said.

The role of mathematics in biological analysis is expanding, particularly with the advent of the various ‘omics’ fields. Multivariate statistics, for example, allows the simultaneous analysis of variables—such as the expression levels of several genes—which makes it possible to draw simple inferences from complex data sets. This analysis can be performed not only on the expression of the genes themselves, but also downstream, on the behaviour of the gene products; reflected, for example, in the molecular composition of a blood or tissue sample. This, Nicholson argues, has led to progress in the emerging field of surgical metabonomics, which he defines as a systems approach to examining changes in hundreds or thousands of low-molecular-weight metabolites in an intact tissue or biofluid. “Our biggest recent advances in thinking are in surgical metabonomics and real-time profiling,” he said, adding that these techniques will have a huge impact on diagnosis and surgical procedures.

The key point is that almost every biological state—be it a specific cancer or a metabolic condition, such as diabetes—has an associated pattern of relative molecular concentrations in cells and tissues. In principle, these signatures can be detected against the background of normal cellular function.
The data usually come from nuclear magnetic resonance or mass spectrometry analyses, which yield spectral peaks and troughs relating to the identity and relative proportions of the molecular constituents of the sample. The immediate objective is not to identify individual molecules, but to analyse the overall pattern of the components. The components are usually moieties of larger molecules, such as hydroxyl or amino groups, which yield characteristic peaks. However, because different molecules have groups in common, it is not immediately possible to identify the exact contents of a sample.

“A typical example is where one is looking for biomarkers of a disease,” explained Tim Ebbels, a senior lecturer in computational bioinformatics at Imperial College London. “You compare profiles from normal people against those with the disease and ask the question: which molecules change in concentration between the two groups?”

In the past, this analysis would have been done using a statistical technique such as a t-test, which considers only one variable at a time. The limitation is obvious: the test cannot detect small, coordinated differences in concentration across many molecules. This is where modern, so-called ‘latent-variable’ techniques step in. “Not only do [latent-variable techniques] allow one to spot groups of metabolites changing together—as you might expect if they are involved in the same pathway, for instance—but they also provide simple and intuitive visualizations of the data,” Ebbels explained. “Visualization is a key part of the discovery process and would be hard to do without these kinds of tools. For instance, how do you visualize the levels of hundreds of metabolites changing over hundreds or thousands of individuals? You cannot just plot a scatter plot of the levels of metabolite one versus metabolite two. You need a tool that reduces the dimensionality of the data—this is what latent-variable methods do.”

Recent advances have helped to take analysis to the next level by identifying individual molecules associated with a particular disease state from their spectral data. Such insight used to involve time-consuming literature searches to identify candidates associated with a particular pattern, and specific experiments to provide proof. Now, the emerging technique of statistical correlation spectroscopy identifies precise patterns that appear repeatedly across a set of samples, which enables software to identify the molecules responsible with increasing levels of certainty.

To a large extent, it is simply the availability of complex data sets spanning many variables that has driven progress in analytical biology, rather than advances in mathematics itself. Indeed, the mathematical tools underlying many of the recently developed methods are based on Bayes’ theorem, which was developed by the English Presbyterian minister Thomas Bayes and published after his death, in 1763, by the Royal Society of London. The theorem allows mathematicians to calculate the probability of a previous event on the basis of evidence or data that emerges afterwards.
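As a worked illustration of the theorem in this diagnostic spirit (the probabilities are invented for the example):

```python
# Worked Bayes example with illustrative numbers: the probability that a sample
# comes from a disease state, given a metabolite signature seen in 80% of
# patients but only 5% of healthy controls, with 2% disease prevalence.
prior = 0.02
p_sig_disease, p_sig_healthy = 0.80, 0.05

evidence = p_sig_disease * prior + p_sig_healthy * (1 - prior)
posterior = p_sig_disease * prior / evidence
print("P(disease | signature) = %.2f" % posterior)   # ~0.25
```

Even a signature seen in 80% of patients yields only a roughly 25% posterior at 2% prevalence, which is why base rates matter so much in biomarker work.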
In multivariate analyses, it identifies the event or condition most likely to be associated with a particular complex data set spanning many variables. In metabonomics, it can identify the disease associated with a particular distribution of molecules with a high degree of accuracy. “Modern mathematical techniques in biology is a large subject, but you can think of the progress being based on Bayesian approaches,” noted Gael Yvert from the Laboratory of Molecular and Cellular Biology at the École Normale Supérieure, Lyon, France.

Yvert applies multivariate analysis to genetic mechanisms that underlie phenotypic differences between individuals within a species, focusing on the yeast Saccharomyces cerevisiae. Bayesian methods have had a profound effect on both the design and the analysis of Yvert’s experiments. Essentially, Yvert has been able to strike a balance between statistical power and the cost of doing an experiment, which has helped him to minimize the number of microarrays needed to establish a link between a trait and a genetic background.

For example, if you want to identify the genes responsible for a particular trait by analysing expression levels across the genomes of two yeast strains under different experimental conditions, Bayesian techniques can provide the answer to a certain level of probability, which can then be increased by repeating the experiment. For a given budget, that means either reducing the number of conditions tested or the number of yeast strains that can be analysed. This is necessary, Yvert pointed out, because it is pointless to compromise on statistical power, especially if the conclusions are unexpected. “It is often better to have many replicates than to explore more conditions, because then robust inferences are obtained,” he explained.

There are other situations, though, in which modern analytical methods help scientists to understand biological mechanisms by revealing uncertainty. This is the case with interactions between proteins, which often depend on precise alignments of sites involving specific configurations of atoms. Until recently, the structure of such binding sites was obtained using X-ray or electron crystallography. These techniques have yielded increasingly accurate information about the position of atoms within a single protein or protein complex, but this is only a snapshot; it cannot reveal the extent of atomic flexibility within a complex molecule. Additionally, the crystal structures are based on the average positions of molecules, which might not be sufficient to predict the behaviour of the protein during interaction or binding. However, Russ Altman, chair of the Department of Bioengineering and director of the programme in Biomedical Informatics at Stanford University in the USA, explained that if you also know how much freedom each atom has to move within the structure, it becomes possible to predict how the molecule will interact during binding. Altman has applied computational modelling to determine the degree of uncertainty in the position of various atoms within a protein molecule.

“I think the ability to represent the uncertainty in the position of individual atoms is critical and still not fully appreciated,” Altman said.
“Crystal structures are fabulous, but crystals provide an environment that may encourage unrealistically low atomic positional deviations. The field of protein disorder and its importance for structure and function has exploded, and our work was an early indicator and demonstration of the importance of thinking about this.”

This has helped researchers to work out the detailed mechanisms of protein binding, which depend not only on the average positions of atoms, but also on their freedom of movement. “Our results suggest that certain atoms within proteins can be positioned with great certainty, while others have great uncertainty,” Altman commented.

This idea of representing the uncertainty of atoms has been scaled up to the study of binding between drugs and their targets, leading to the design of new therapeutic compounds that should be more effective. “We have recently shown that drug binding sites can be represented by a series of loosely interacting microenvironments,” Altman explained. “This representation allows us to recognize similar sites that might bind the same ligands, with similar microenvironments, but perhaps arranged slightly differently. These slight differences can be accommodated by flexibility in the ligand (Halperin et al, 2008).” Altman is applying this knowledge to designing kinase inhibitors. In theory, it should be possible to design drugs capable of targeting a broader range of related proteins, such as kinases, which often have key roles in inflammatory responses and disorders.

Over the past decade or two, the role and use of analytical and computational techniques have developed rapidly, but they are still just tools that require human insight to draw conclusions. An important point is that as data sets become larger and more complex, the potential for errors and different interpretations becomes greater, even with the help of sophisticated statistical analysis. In this regard, Dieter Ebert, whose group at the University of Basel in Switzerland specializes in the evolution of host–parasite interactions, noted that, “when you crunch large data sets, you accept that here and there you do make a mistake […] The idea is that, given the thousands of datapoints you can crunch, the few mistakes do not change the big picture. Once you go into details, you may find the mistakes, but this needs often a sharp eye and experience.”

A simple example is the widely applied technique of shotgun sequencing, used to determine whole genomes by breaking up the DNA into random overlapping segments that are small enough to be sequenced individually. These segments are then read and assembled into a continuous whole sequence by a computer program (a toy version of this merging step is sketched below). As Ebert noted, this process always misassembles a few of the sequences, even though the software is getting better all the time. Whether these errors matter depends on the application: whether the objective is to determine the broad structure and layout of a whole genome, or to focus more closely on specific sequences or even individual genes.

“If your work is aimed at seeing the larger picture, you may live with error rates below a certain threshold. So if you compare a newly assembled genome with previous genomes, and you are looking for the overall patterns, you can ignore some errors,” Ebert said.
“But if you pick out one section of the genome and you want to study the particular gene order in this region, I would strongly suggest you verify that this region was correctly assembled, even if the chance of misassembly is only 1 in 100.”The wider message is that as biology becomes more analytical and interdisciplinary, the skills required to design experiments and interpret results have inevitably changed. Nevertheless, one fundamental point remains: human skill and judgement are needed to determine whether a set of results confirms expectations, whether it indicates that further investigation is needed, or whether it requires revision of the existing orthodoxy. Ultimately, whatever tools are available and whatever technological advances are made, innovation and originality of the human spirit will still determine what makes science brilliant.  相似文献   
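A concrete, if simplified, illustration of the verification step Ebert recommends is to inspect read coverage across the region of interest, since coverage anomalies often accompany misassembly. The Python sketch below is a minimal, hypothetical example of such a check: the function names, thresholds and toy alignments are invented for illustration and do not correspond to any published pipeline.

def coverage_profile(region_length, alignments):
    """Per-base read depth for a region, given (start, end) alignments (0-based, end-exclusive)."""
    depth = [0] * region_length
    for start, end in alignments:
        for pos in range(max(start, 0), min(end, region_length)):
            depth[pos] += 1
    return depth

def flag_suspect_positions(depth, min_depth=2, high_factor=3.0):
    """Flag bases whose depth is suspiciously low, or high relative to the region mean."""
    mean_depth = sum(depth) / len(depth)
    return [i for i, d in enumerate(depth) if d < min_depth or d > high_factor * mean_depth]

# Toy alignments over a 50 bp region, with thin coverage around positions 20-26.
reads = [(0, 20), (0, 22), (25, 50), (27, 50)]
print(flag_suspect_positions(coverage_profile(50, reads)))  # -> [20, 21, 22, 23, 24, 25, 26]

In practice one would work from an alignment file and combine coverage with paired-end consistency and other signals; the point here is only that a targeted, region-level sanity check is cheap relative to the cost of trusting a misassembled region.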

8.
Zhang L  Ren G 《PloS one》2012,7(1):e30249
The dynamic personalities and structural heterogeneity of proteins are essential for proper functioning. Structural determination of dynamic/heterogeneous proteins is limited by conventional X-ray crystallography and electron microscopy (EM) single-particle reconstruction approaches, which require averaging over thousands to millions of different molecules. Cryo-electron tomography (cryoET) is an approach to determine the three-dimensional (3D) reconstruction of a single, unique biological object, such as a bacterium or cell, by imaging the object from a series of tilt angles. However, conventional reconstruction methods use large whole-micrograph images and are limited in reconstruction resolution (lower than 20 Å), especially for small, low-symmetry molecules (<400 kDa). In this study, we demonstrated that image distortion and tilt-measurement errors (including tilt-axis and tilt-angle errors) both play a major role in limiting reconstruction resolution. We therefore developed a “focused electron tomography reconstruction” (FETR) algorithm that improves resolution by decreasing the reconstructed image size so that it contains only a single-instance protein. FETR can tolerate certain levels of image distortion and tilt-measurement error, and can precisely determine the translational parameters via an iterative refinement process that applies a series of automatically generated dynamic filters and masks. To describe this method, a set of simulated cryoET images was employed; to validate the approach, real experimental images from negative staining and cryoET were used. Since this approach can obtain the structure of a single-instance molecule/particle, we named it individual-particle electron tomography (IPET): a robust strategy that requires no pre-given initial model, class averaging of multiple molecules or extended ordered lattice, and that tolerates small tilt errors, for high-resolution single “snapshot” molecule structure determination. Thus, FETR/IPET provides a completely new opportunity for single-molecule structure determination, and could be used to study the dynamic character and equilibrium fluctuations of macromolecules.  相似文献
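At the core of the FETR/IPET approach is the repeated re-estimation of a particle’s translational parameters against progressively less heavily filtered images. As a rough sketch of that general idea (not the authors’ implementation; all function names and parameters are invented), the following Python/NumPy code aligns a noisy 2D image to a reference by FFT cross-correlation, widening a low-pass filter on each iteration.

import numpy as np

def lowpass(img, cutoff):
    """Keep only spatial frequencies below `cutoff` (fraction of sampling rate, 0-0.5)."""
    f = np.fft.fftshift(np.fft.fft2(img))
    ny, nx = img.shape
    y, x = np.ogrid[-(ny // 2):ny - ny // 2, -(nx // 2):nx - nx // 2]
    mask = (x / nx) ** 2 + (y / ny) ** 2 <= cutoff ** 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

def best_shift(reference, image):
    """Translational offset that maximizes the circular cross-correlation."""
    cc = np.real(np.fft.ifft2(np.fft.fft2(reference) * np.conj(np.fft.fft2(image))))
    dy, dx = np.unravel_index(np.argmax(cc), cc.shape)
    ny, nx = cc.shape
    return (dy - ny if dy > ny // 2 else dy), (dx - nx if dx > nx // 2 else dx)

def refine(reference, image, cutoffs=(0.05, 0.1, 0.2, 0.4)):
    """Coarse-to-fine loop: re-estimate the shift while admitting higher frequencies."""
    total_dy = total_dx = 0
    for c in cutoffs:
        dy, dx = best_shift(lowpass(reference, c), lowpass(image, c))
        image = np.roll(image, shift=(dy, dx), axis=(0, 1))  # apply the current estimate
        total_dy, total_dx = total_dy + dy, total_dx + dx
    return int(total_dy), int(total_dx)

# Toy usage: recover a known displacement of a noisy random "particle" image.
rng = np.random.default_rng(0)
ref = rng.normal(size=(64, 64))
moved = np.roll(ref, shift=(5, -3), axis=(0, 1)) + 0.3 * rng.normal(size=(64, 64))
print(refine(ref, moved))  # the shift mapping `moved` back onto `ref`: close to (-5, 3)

The real algorithm operates on whole tilt series and adds dynamically generated masks, and it must also contend with tilt-axis and tilt-angle errors; this toy captures only the coarse-to-fine translational refinement loop.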

9.
10.
Gu W  Yang J  Lou Z  Liang L  Sun Y  Huang J  Li X  Cao Y  Meng Z  Zhang KQ 《PloS one》2011,6(1):e16262
Microbial ferulic acid decarboxylase (FADase) catalyzes the transformation of ferulic acid to 4-hydroxy-3-methoxystyrene (4-vinylguaiacol) via non-oxidative decarboxylation. Here we report the crystal structures of the Enterobacter sp. Px6-4 FADase and of the enzyme in complex with substrate analogues. Our analyses revealed that FADase possesses a half-opened bottom β-barrel, with the catalytic pocket located between the middle of the core β-barrel and the helical bottom. Its structure shares a high degree of similarity with members of the phenolic acid decarboxylase (PAD) superfamily. Structural analysis revealed that FADase catalyzes reactions by an “open-closed” mechanism involving a pocket of 8 × 8 × 15 Å on the surface of the enzyme. The active pocket can directly contact the solvent and allow the substrate to enter when induced by substrate analogues. Site-directed mutagenesis showed that the E134A mutation decreased enzyme activity by more than 60%, whereas the Y21A and Y27A mutations abolished activity completely. The combined structural and mutagenesis results suggest that, during decarboxylation of ferulic acid by FADase, Trp25 and Tyr27 are required for the entry and proper orientation of the substrate, while Glu134 and Asn23 participate in proton transfer.  相似文献

11.
Cândido Godói (CG) is a small municipality in South Brazil with approximately 6,000 inhabitants. It is known as the “Twins’ Town” owing to its high rate of twin births. Recently it was claimed that this high frequency of twinning is connected to experiments performed by the German Nazi doctor Joseph Mengele. It is known, however, that the town was founded by a small number of families, and a genetic founder effect may therefore represent an alternative explanation for the high twinning prevalence in CG. In this study, we tested specific predictions of the “Nazi’s experiment” and the “founder effect” hypotheses. We surveyed a total of 6,262 baptism records from 1959–2008 in CG catholic churches, and identified 91 twin pairs and one triplet. Contrary to the “Nazi’s experiment” hypothesis, there is no spurt in twinning during the years (1964–1968) when Mengele allegedly was in CG (P = 0.482). Moreover, there is no temporal trend towards a declining rate of twinning since the 1960s (P = 0.351), and no difference in twinning among CG districts between two different periods, 1927–1958 and 1959–2008 (P = 0.638). On the other hand, the “founder effect” hypothesis is supported by an isonymy analysis showing that women who gave birth to twins have a higher inbreeding coefficient than women who never had twins (0.0148 vs. 0.0081, P = 0.019). In summary, our results show no evidence for the “Nazi’s experiment” hypothesis and strongly suggest that the “founder effect” hypothesis is a much more likely explanation for the high prevalence of twinning in CG. If this hypothesis is correct, then this community represents a valuable population in which genetic factors linked to twinning may be identified.  相似文献
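The key comparison in the isonymy analysis, mean inbreeding coefficient of mothers of twins versus other mothers, can be tested without distributional assumptions using a permutation test. The sketch below is a generic illustration with invented numbers; it is not the authors’ data or code.

import random

def permutation_test(group_a, group_b, n_iter=10_000, seed=42):
    """One-sided permutation test for mean(group_a) > mean(group_b)."""
    rng = random.Random(seed)
    observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)  # relabel individuals at random
        diff = sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a)
        if diff >= observed:
            hits += 1
    return hits / n_iter  # fraction of relabelings at least as extreme as observed

# Invented inbreeding coefficients, purely for illustration.
twin_mothers = [0.020, 0.015, 0.011, 0.018, 0.022, 0.008]
other_mothers = [0.009, 0.007, 0.012, 0.005, 0.010, 0.006, 0.008]
print(permutation_test(twin_mothers, other_mothers))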

12.
“Big” molecules such as proteins and genes continue to capture the imagination of most biologists, biochemists and bioinformaticians. “Small” molecules, on the other hand, are the molecules that most of them prefer to ignore. However, it is becoming increasingly apparent that small molecules such as amino acids, lipids and sugars play a far more important role in all aspects of disease etiology and disease treatment than we once realized. This chapter focuses on an emerging field of bioinformatics called “chemical bioinformatics” – a discipline that has evolved to address the blended chemical and molecular-biological needs of toxicogenomics, pharmacogenomics, metabolomics and systems biology. In the following pages we will cover several topics related to chemical bioinformatics. First, a brief overview of some of the most important or useful chemical bioinformatic resources will be given. Second, a more detailed overview will be given of those resources that allow researchers to connect small molecules to diseases. This section will describe a number of recently developed databases or knowledgebases that explicitly relate small molecules – whether as treatment, symptom or cause – to disease. Finally, a short discussion will be provided on newly emerging software tools that exploit these databases as a means to discover new biomarkers or even new treatments for disease.

What to Learn in This Chapter

  • The meaning of chemical bioinformatics
  • Strengths and limitations of existing chemical bioinformatic databases
  • Using databases to learn about the cause and treatment of diseases
  • The Small Molecule Pathway Database (SMPDB)
  • The Human Metabolome Database (HMDB)
  • DrugBank
  • The Toxin and Toxin-Target Database (T3DB)
  • PolySearch and Metabolite Set Enrichment Analysis (see the enrichment sketch below)
This article is part of the “Translational Bioinformatics” collection for PLOS Computational Biology.
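As a taste of what the last of these tools does: at its simplest, metabolite set enrichment analysis asks whether a predefined metabolite set is over-represented among the metabolites flagged in an experiment, which reduces to a hypergeometric test. The sketch below is a minimal, hypothetical version of that calculation; real MSEA implementations add curated metabolite libraries, name normalization and multiple-testing correction.

from scipy.stats import hypergeom

def enrichment_p(universe_size, pathway_size, hits_total, hits_in_pathway):
    """P(observing >= hits_in_pathway pathway metabolites among hits_total flagged)."""
    # Survival function at k-1 gives P(X >= k) for the hypergeometric distribution.
    return hypergeom.sf(hits_in_pathway - 1, universe_size, pathway_size, hits_total)

# Illustrative numbers: 1,000 detectable metabolites, a 40-member pathway,
# 25 metabolites flagged in the experiment, 6 of which fall in the pathway.
print(enrichment_p(1000, 40, 25, 6))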
  相似文献   

13.
Yaari G  Eisenmann S 《PloS one》2011,6(10):e24532
The long-lasting debate initiated by Gilovich, Vallone and Tversky is revisited: does a “hot hand” phenomenon exist in sports? Here we come back to one of the cases analyzed in the original study, but with a much larger data set: all free throws taken during five regular seasons of the National Basketball Association (NBA). Evidence supporting the existence of the “hot hand” phenomenon is provided. However, while statistical traces of this phenomenon are observed in the data, an open question remains: are these non-random patterns the result of “success breeds success” and “failure breeds failure” mechanisms, or simply of “better” and “worse” periods? Although free throw data are not adequate to answer this question definitively, we speculate, based on them, that the latter is the dominant cause behind the appearance of the “hot hand” phenomenon in the data.  相似文献
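The most basic “hot hand” statistic on free-throw data compares the probability of a hit after a hit with the probability of a hit after a miss. The sketch below illustrates that comparison on a simulated make/miss sequence; it is a toy illustration, not the estimators used in the study.

import random

def conditional_hit_rates(shots):
    """P(hit | previous hit) and P(hit | previous miss) for a 0/1 shot sequence."""
    after_hit = [b for a, b in zip(shots, shots[1:]) if a == 1]
    after_miss = [b for a, b in zip(shots, shots[1:]) if a == 0]
    rate = lambda xs: sum(xs) / len(xs) if xs else float("nan")
    return rate(after_hit), rate(after_miss)

# Simulated shooter with no streakiness: both conditional rates hover near 0.75.
rng = random.Random(1)
shots = [1 if rng.random() < 0.75 else 0 for _ in range(5000)]
p_after_hit, p_after_miss = conditional_hit_rates(shots)
print(round(p_after_hit, 3), round(p_after_miss, 3))

For a memoryless shooter the two conditional rates converge, which is why a persistent gap between them in real data is read as a statistical trace of streakiness, although, as the abstract notes, such a gap cannot by itself distinguish feedback mechanisms from slowly varying “better” and “worse” periods.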

14.
15.

Background

Previous studies indicate that in published reports, trial results can be distorted by the use of “spin” (specific reporting strategies, intentional or unintentional, emphasizing the beneficial effect of the experimental treatment). We aimed to (1) evaluate the presence of “spin” in press releases and associated media coverage; and (2) evaluate whether findings of randomized controlled trials (RCTs) based on press releases and media coverage are misinterpreted.

Methods and Findings

We systematically searched for all press releases indexed in the EurekAlert! database between December 2009 and March 2010. Of the 498 press releases retrieved and screened, we included press releases for all two-arm, parallel-group RCTs (n = 70). We obtained a copy of the scientific article to which each press release related and systematically searched for related news items using Lexis Nexis. “Spin,” defined as specific reporting strategies (intentional or unintentional) emphasizing the beneficial effect of the experimental treatment, was identified in 28 (40%) scientific article abstract conclusions and in 33 (47%) press releases. In bivariate and multivariable analyses assessing journal type, funding source, sample size, type of treatment (drug or other), results of the primary outcomes (all nonstatistically significant versus other), author of the press release, and the presence of “spin” in the abstract conclusion, the only factor associated with “spin” in the press release was “spin” in the article abstract conclusion (relative risk [RR] 5.6 [95% CI 2.8–11.1], p<0.001). Findings of RCTs based on press releases were overestimated for 19 (27%) reports. News items were identified for 41 RCTs; 21 (51%) were reported with “spin,” mainly the same type of “spin” as identified in the press release and article abstract conclusion. Findings of RCTs based on the news items were overestimated for ten (24%) reports.
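For readers unfamiliar with the arithmetic behind the RR reported above: a relative risk and its 95% confidence interval follow from a 2×2 table by standard formulas, sketched below with invented counts rather than the study’s data.

import math

def relative_risk(a, b, c, d):
    """RR of the event for exposed (a events, b non-events) vs. unexposed
    (c events, d non-events), with a 95% Wald interval on the log scale."""
    rr = (a / (a + b)) / (c / (c + d))
    se_log = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo, hi = (math.exp(math.log(rr) + s * 1.96 * se_log) for s in (-1, 1))
    return rr, (lo, hi)

# Invented counts for illustration: "spin" in the press release among articles
# with (22 of 28) vs. without (11 of 42) "spin" in the abstract conclusion.
print(relative_risk(22, 6, 11, 31))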

Conclusion

“Spin” was identified in about half of press releases and media coverage. In multivariable analysis, the main factor associated with “spin” in press releases was the presence of “spin” in the article abstract conclusion.  相似文献   

16.
Our experience of and prejudices concerning food play an important role in modulating gustatory information processing; gustatory memory stored in the central nervous system influences gustatory information arising from the peripheral nervous system. We have elucidated the mechanism of the “top-down” modulation of taste perception in humans using functional magnetic resonance imaging (fMRI) and demonstrated that gustatory imagery is mediated by the prefrontal (PFC) and insular cortices (IC). However, the temporal order of activation of these brain regions during gustatory imagery is still an open issue. To explore the source of “top-down” signals during gustatory imagery tasks, we analyzed the temporal activation patterns of activated regions in the cerebral cortex using another non-invasive brain imaging technique, magnetoencephalography (MEG). Gustatory imagery tasks were presented as words (Letter G-V) or pictures (Picture G-V) of foods/beverages, and participants were asked to recall their taste. In the Letter G-V session, 7/9 (77.8%) participants showed activation in the IC with a latency of 401.7±34.7 ms (n = 7) from the onset of word presentation. In 5/7 (71.4%) of the participants who exhibited IC activation, the PFC was activated prior to the IC, at a latency of 315.2±56.5 ms (n = 5), significantly shorter than the latency to IC activation. In the Picture G-V session, the IC was activated in 6/9 (66.7%) participants, and only 1/9 (11.1%) participants showed activation in the PFC. There was no significant dominance between the right and left IC or PFC during gustatory imagery. These results support those of our previous fMRI study, in that the Letter G-V session rather than the Picture G-V session effectively activates the PFC and IC, and they strengthen the hypothesis that the PFC mediates “top-down” retrieval of gustatory information from long-term memory storage and in turn activates the IC.  相似文献

17.
Positive feedback plays a key role in the ability of signaling molecules to form highly localized clusters in the membrane or cytosol of cells. Such clustering can occur in the absence of localizing mechanisms such as pre-existing spatial cues, diffusional barriers, or molecular cross-linking. What prevents positive feedback from amplifying inevitable biological noise when an un-clustered “off” state is desired? And, what limits the spread of clusters when an “on” state is desired? Here, we show that a minimal positive feedback circuit provides the general principle for both suppressing and amplifying noise: below a critical density of signaling molecules, clustering switches off; above this threshold, highly localized clusters are recurrently generated. Clustering occurs only in the stochastic regime, suggesting that finite sizes of molecular populations cannot be ignored in signal transduction networks. The emergence of a dominant cluster for finite numbers of molecules is partly a phenomenon of random sampling, analogous to the fixation or loss of neutral mutations in finite populations. We refer to our model as the “neutral drift polarity model.” Regulating the density of signaling molecules provides a simple mechanism for a positive feedback circuit to robustly switch between clustered and un-clustered states. The intrinsic ability of positive feedback both to create and suppress clustering is a general mechanism that could operate within diverse biological networks to create dynamic spatial organization.  相似文献   
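The threshold behaviour described here, essentially no clustering below a critical molecule density and a recurrent dominant cluster above it, can be caricatured in a few lines of stochastic simulation. The sketch below is a deliberately crude toy rather than the authors’ model: molecules on a ring relocate one at a time, preferring empty sites flanked by occupied neighbours.

import random

def largest_cluster(occupied, n_sites):
    """Longest contiguous occupied run on the ring (handles wrap-around)."""
    best = run = 0
    for i in range(2 * n_sites):
        if i % n_sites in occupied:
            run += 1
            best = max(best, min(run, n_sites))
        else:
            run = 0
    return best

def simulate(n_molecules, n_sites=200, steps=20000, feedback=10.0, seed=0):
    """One molecule relocates per step, drawn to empty sites with occupied neighbours."""
    rng = random.Random(seed)
    occupied = set(rng.sample(range(n_sites), n_molecules))
    for _ in range(steps):
        occupied.remove(rng.choice(tuple(occupied)))  # a molecule lets go
        weights = [0.0 if s in occupied else
                   1.0 + feedback * (((s - 1) % n_sites in occupied) +
                                     ((s + 1) % n_sites in occupied))
                   for s in range(n_sites)]
        occupied.add(rng.choices(range(n_sites), weights=weights)[0])  # and rebinds
    return largest_cluster(occupied, n_sites)

# Largest-cluster size at low vs. high molecule density (same feedback strength).
print(simulate(n_molecules=10), simulate(n_molecules=60))

Because only one molecule moves per step and the population is finite, which cluster comes to dominate is partly a matter of random sampling, echoing the neutral-drift analogy drawn in the abstract.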

18.
In neuromuscular acetylcholine (ACh) receptor channels (AChRs), agonist molecules bind with a low affinity (LA) to two sites that can switch to high affinity (HA) and increase the probability of channel opening. We measured (by using single-channel kinetic analysis) the rate and equilibrium constants for LA binding and channel gating for several different agonists of adult-type mouse AChRs. Almost all of the variation in the equilibrium constants for LA binding was from differences in the association rate constants. These were consistently below the limit set by diffusion and were substantially different even though the agonists had similar sizes and the same charge. This suggests that binding to resting receptors is not by diffusion alone and, hence, that each binding site can undergo two conformational changes (“catch” and “hold”) that connect three different structures (apo-, LA-bound, and HA-bound). Analyses of ACh-binding protein structures suggest that this binding site, too, may adopt three discrete structures having different degrees of loop C displacement (“capping”). For the agonists we tested, the logarithms of the equilibrium constants for LA binding and LA↔HA gating were correlated. Although agonist binding and channel gating have long been considered to be separate processes in the activation of ligand-gated ion channels, this correlation implies that the catch-and-hold conformational changes are energetically linked and together comprise an integrated process having a common structural basis. We propose that loop C capping mainly reflects agonist binding, with its two stages corresponding to the formation of the LA and HA complexes. The catch-and-hold reaction coordinate is discussed in terms of preopening states and thermodynamic cycles of activation.  相似文献   
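Because the catch-and-hold scheme is a linear three-state equilibrium (apo ↔ LA-bound ↔ HA-bound), state occupancies at a given agonist concentration follow directly from the two equilibrium constants by mass action. The sketch below shows that arithmetic; the constants are invented for illustration and are not the paper’s fitted parameters.

def three_state_occupancy(agonist_conc, kd_catch, e_hold):
    """Equilibrium fractions of apo, LA-bound and HA-bound receptors for the
    linear scheme apo <-> LA <-> HA.
    kd_catch: dissociation constant of the low-affinity "catch" step
              (same units as agonist_conc).
    e_hold:   equilibrium constant of the LA -> HA "hold" step (dimensionless)."""
    w_apo, w_la = 1.0, agonist_conc / kd_catch
    w_ha = w_la * e_hold
    z = w_apo + w_la + w_ha  # partition sum over the three states
    return w_apo / z, w_la / z, w_ha / z

# Invented constants for illustration: 100 uM agonist, Kd = 150 uM, hold constant = 25.
print(three_state_occupancy(100e-6, 150e-6, 25.0))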

19.
20.
Living organisms are adept at forming inorganic materials (biominerals) with unique structures and properties that exceed the capabilities of engineered materials. Biomimetic materials syntheses are being developed that aim to replicate the advantageous properties of biominerals in vitro and to endow them with additional functionalities. Recently, proof-of-concept was provided for an alternative approach that allows the production of biomineral-based functional materials in vivo. In this approach, the cellular machinery for the biosynthesis of nano-/micropatterned SiO2 (silica) structures in diatoms was genetically engineered to incorporate a monomeric, cofactor-independent (“simple”) enzyme, HabB, into diatom silica. In the present work, it is demonstrated that this approach is also applicable to enzymes with “complex” activity requirements, including oligomerization, metal ions, organic redox cofactors, and posttranslational modifications. Functional expression of the enzymes β-glucuronidase, glucose oxidase, galactose oxidase, and horseradish peroxidase in the diatom Thalassiosira pseudonana was accomplished, and 66 to 78% of the expressed enzymes were stably incorporated into the biosilica. The in vivo incorporated enzymes represent approximately 0.1% (wt/wt) of the diatom biosilica and are stabilized against denaturation and proteolytic degradation. Furthermore, it is demonstrated that the gene construct for in vivo immobilization of glucose oxidase can be utilized as the first negative selection marker for diatom genetic engineering.  相似文献
