Similar Literature
20 similar records retrieved (search time: 281 ms)
1.
Natural host‐parasite interactions exhibit considerable variation in host quality, with profound consequences for disease ecology and evolution. For instance, treatments (such as vaccination) may select for more transmissible or virulent strains. Previous theory has addressed the ecological and evolutionary impact of host heterogeneity under the assumption that hosts and parasites disperse globally. Here, we investigate the joint effects of host heterogeneity and local dispersal on the evolution of parasite life‐history traits. We first formalise a general theoretical framework combining variation in host quality and spatial structure. We then apply this model to the specific problem of parasite evolution following vaccination. We show that, depending on the type of vaccine, spatial structure may select for higher or lower virulence compared to the predictions of non‐spatial theory. We discuss the implications of our results for disease management, and their broader fundamental relevance for other causes of host heterogeneity in nature.

2.
Hanley KA. Evolution, 2011, 4(4): 635-643
Even students who reject evolution are often willing to consider cases in which evolutionary biology contributes to, or undermines, biomedical interventions. Moreover, the intersection of evolutionary biology and biomedicine is fascinating in its own right. This review offers an overview of the ways in which evolution has impacted the design and deployment of live-attenuated virus vaccines, with subsections that may be useful as lecture material or as the basis for case studies in classes at a variety of levels. Live-attenuated virus vaccines have been modified in ways that restrain their replication in a host so that infection (vaccination) produces immunity but not disease. Applied evolution, in the form of serial passage in novel host cells, is a “classical” method to generate live-attenuated viruses. However, many live-attenuated vaccines exhibit reversion to virulence through back-mutation of attenuating mutations, compensatory mutations elsewhere in the genome, recombination or reassortment, or changes in quasispecies diversity. Additionally, the combination of multiple live-attenuated strains may result in competition or facilitation between individual vaccine viruses, resulting in undesirable increases in virulence or decreases in immunogenicity. Genetic engineering informed by evolutionary thinking has led to a number of novel approaches to generate live-attenuated virus vaccines that contain substantial safeguards against reversion to virulence and that ameliorate interference among multiple vaccine strains. Finally, vaccines have the potential to shape the evolution of their wild-type counterparts in counter-productive ways; at the extreme, vaccine-driven eradication of a virus may create an empty niche that promotes the emergence of new viral pathogens.

3.
4.
Eradication of Taenia solium cysticercosis: a role for vaccination of pigs.   Times cited: 12 (self-citations: 0, other citations: 12)
Neurocysticercosis due to Taenia solium is an important cause of human morbidity and mortality, particularly in Latin America and parts of Africa and Asia. The disease has been recognised as potentially eradicable. Emphasis has been placed on control of the parasite through mass chemotherapy of human populations to remove tapeworm carriers. This strategy does not control the source of tapeworm infections, cysticercosis in pigs, and parasite transmission may continue due to incomplete chemotherapy coverage of human tapeworm carriers or because of immigration of tapeworm carriers into control areas. Exceptionally effective, practical vaccines have been developed against cysticercosis in sheep and cattle and a recent trial has proved recombinant antigens to be effective against Taenia solium cysticercosis in pigs. A new strategy for eradication of Taenia solium is proposed, based principally on a combined approach of chemotherapy of human tapeworm carriers and vaccination of all pigs at risk of infection.

5.
Tolerance to parasites reduces the harm that infection causes the host (virulence). Here we investigate the evolution of parasites in response to host tolerance. We show that parasites may evolve either higher or lower within-host growth rates depending on the nature of the tolerance mechanism. If tolerance reduces virulence by a constant factor, the parasite is always selected to increase its growth rate. Alternatively, if tolerance reduces virulence in a nonlinear manner such that it is less effective at reducing the damage caused by higher growth rates, this may select for faster or slower replicating parasites. If the host is able to completely tolerate pathogen damage up to a certain replication rate, this may result in apparent commensalism, whereby infection causes no apparent virulence but the original evolution of tolerance has been costly. Tolerance tends to increase disease prevalence and may therefore lead to more, rather than less, disease-induced mortality. Once parasite evolution is accounted for, even a highly efficient tolerance mechanism may result in more individuals in total dying from disease. However, the evolution of tolerance often, although not always, reduces the individual risk of dying from infection.
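The constant-factor case described in this abstract can be sketched numerically. The functional forms below (transmission rising linearly with within-host growth, damage rising quadratically, tolerance dividing realised damage by a constant) are illustrative assumptions, not the authors' model:

```python
# Illustrative sketch, not the paper's model: parasite fitness when host
# tolerance divides realised damage by a constant factor T.
def fitness(g, T, d=1.0):
    """R(g): transmission (assumed ~ g) over the total loss rate of infections.
    d lumps background host death and recovery; damage is assumed ~ g**2."""
    virulence = g * g / T  # tolerance reduces realised damage by factor T
    return g / (virulence + d)

def optimal_growth(T, d=1.0):
    """Grid-search the growth rate maximising fitness (analytically sqrt(d*T))."""
    grid = [i / 1000.0 for i in range(1, 10000)]
    return max(grid, key=lambda g: fitness(g, T, d))

g_baseline = optimal_growth(T=1.0)  # ~1.0
g_tolerant = optimal_growth(T=4.0)  # ~2.0: stronger tolerance, faster growth
```

Under these assumptions the optimum is g* = sqrt(d·T), so any increase in constant-factor tolerance raises the selected growth rate, in line with the result stated above.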

6.
Despite the effectiveness of vaccines in dramatically decreasing the number of new infectious cases and severity of illnesses, imperfect vaccines may not completely prevent infection. This is because the immunity afforded by these vaccines is not complete and may wane with time, leading to resurgence and epidemic outbreaks notwithstanding high levels of primary vaccination. To prevent an endemic spread of disease, and achieve eradication, several countries have introduced booster vaccination programs. The question of whether this strategy could eventually provide the conditions for global eradication is addressed here by developing a seasonally-forced mathematical model. The analysis of the model provides the threshold condition for disease control in terms of four major parameters: coverage of the primary vaccine; efficacy of the vaccine; waning rate; and the rate of booster administration. The results show that if the vaccine provides only temporary immunity, then the infection typically cannot be eradicated by a single vaccination episode. Furthermore, having a booster program does not necessarily guarantee the control of a disease, though the level of epidemicity may be reduced. In addition, these findings strongly suggest that the high coverage of primary vaccination remains crucial to the success of a booster strategy. Simulations using estimated parameters for measles illustrate model predictions. This work was supported in part by the Natural Sciences and Engineering Research Council of Canada (NSERC). One of the authors (P.R.) acknowledges the support of the Ellison Medical Foundation.
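The four-parameter threshold discussed in this abstract can be illustrated with a deliberately simplified, non-seasonal calculation. The steady-state form below is an assumption for illustration, not the seasonally forced model analysed in the paper:

```python
# Sketch: effective reproduction number under primary coverage p, vaccine
# efficacy eps, waning rate omega, and booster administration rate b.
def effective_R(R0, p, eps, omega, b):
    """At steady state a vaccinee is protected for a fraction b / (b + omega)
    of the time: boosters (rate b) restore protection, waning (omega) removes it."""
    protected = p * eps * (b / (b + omega)) if (b + omega) > 0 else 0.0
    return R0 * (1.0 - protected)

# Waning immunity with no boosters: a single vaccination episode cannot
# control the disease, even at full coverage.
R_no_boost     = effective_R(R0=15, p=1.00, eps=0.95, omega=0.1, b=0.0)
# Boosters reduce epidemicity but do not guarantee control, and high primary
# coverage remains crucial:
R_boost_high_p = effective_R(R0=15, p=0.95, eps=0.95, omega=0.1, b=1.0)
R_boost_low_p  = effective_R(R0=15, p=0.50, eps=0.95, omega=0.1, b=1.0)
```

Even the high-coverage booster scenario leaves the effective reproduction number above 1 for these measles-like parameter values, echoing the abstract's conclusion that boosters alone need not achieve eradication.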

7.
Host resistance to parasites can come in two main forms: hosts may either reduce the probability of parasite infection (anti-infection resistance) or reduce parasite growth after infection has occurred (anti-growth resistance). Both resistance mechanisms are often imperfect, meaning that they do not fully prevent or clear infections. Theoretical work has suggested that imperfect anti-growth resistance can select for higher parasite virulence by favouring faster-growing and more virulent parasites that overcome this resistance. In contrast, imperfect anti-infection resistance is thought not to select for increased parasite virulence, because it is assumed that it reduces the number of hosts that become infected, but not the fitness of parasites in successfully infected hosts. Here, we develop a theoretical model to show that anti-infection resistance can in fact select for higher virulence when such resistance reduces the effective parasite dose that enters a host. Our model is based on a monarch butterfly-parasite system in which larval food plants confer resistance to the monarch host. We carried out an experiment and showed that this environmental resistance is most likely a form of anti-infection resistance, through which toxic food plants reduce the effective dose of parasites that initiates an infection. We used these results to build a mathematical model to investigate the evolutionary consequences of food plant-induced resistance. Our model shows that when the effective infectious dose is reduced, parasites can compensate by evolving a higher per-parasite growth rate, and consequently a higher intrinsic virulence. Our results are relevant to many insect host-parasite systems, in which larval food plants often confer imperfect anti-infection resistance. Our results also suggest that - for parasites where the infectious dose affects the within-host dynamics - vaccines that reduce the effective infectious dose can select for increased parasite virulence.  

8.
HOST LIFE HISTORY AND THE EVOLUTION OF PARASITE VIRULENCE   Times cited: 3 (self-citations: 0, other citations: 3)
Abstract.— We present a general epidemiological model of host‐parasite interactions that includes various forms of superinfection. We use this model to study the effects of different host life‐history traits on the evolution of parasite virulence. In particular, we analyze the effects of natural host death rate on the evolutionarily stable parasite virulence. We show that, contrary to classical predictions, an increase in the natural host death rate may select for lower parasite virulence if some form of superinfection occurs. This result is in agreement with the experimental results and the verbal argument presented by Ebert and Mangin (1997). This experiment is discussed in the light of the present model. We also point out the importance of superinfections for the effect of nonspecific immunity on the evolution of virulence. In a broader perspective, this model demonstrates that the occurrence of multiple infections may qualitatively alter classical predictions concerning the effects of various host life‐history traits on the evolution of parasite virulence.

9.
Poliomyelitis vaccination via live Oral Polio Vaccine (OPV) suffers from the inherent problem of reversion: the vaccine may, upon replication in the human gut, mutate back to virulence and transmissibility resulting in circulating vaccine derived polio viruses (cVDPVs). We formulate a general mathematical model to assess the impact of cVDPVs on prospects for polio eradication. We find that for OPV coverage levels below a certain threshold, cVDPVs have a small impact in comparison to the expected endemic level of the disease in the absence of reversion. Above this threshold, the model predicts a small but significant endemic level of the disease, even where standard models predict eradication. In light of this, we consider and analyze three alternative eradication strategies involving a transition from continuous OPV vaccination to either continuous Inactivated Polio Vaccine (IPV), pulsed OPV vaccination, or a one-time IPV pulse vaccination. Stochastic modeling shows continuous IPV vaccination is effective at achieving eradication for moderate coverage levels, while pulsed OPV is effective if higher coverage levels are maintained. The one-time pulse IPV method may also be a viable strategy, especially in terms of the number of vaccinations required and time to eradication, provided that a sufficiently large pulse is practically feasible. More investigation is needed regarding the frequency of revertant virus infection resulting directly from vaccination, the ability of IPV to induce gut immunity, and the potential role of spatial transmission dynamics in eradication efforts. B.G. Wagner’s research is supported by the Natural Sciences and Engineering Research Council of Canada (NSERC) Doctoral Scholarship. D.J.D. Earn’s research is supported by the Canadian Institutes of Health Research (CIHR), Natural Sciences and Engineering Research Council of Canada (NSERC) and the J.S. McDonnell Foundation.
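A toy version of the reversion effect can be written down directly. The model below — a standard SIR with births plus a small per-vaccination probability of seeding a transmissible revertant infection — is a stand-in for illustration, not the authors' model:

```python
# Sketch: above the classical eradication threshold (p > 1 - 1/R0), a
# reversion probability r > 0 per OPV dose leaves a small but nonzero
# endemic level, whereas r = 0 gives eradication.
def endemic_level(R0=6.0, p=0.95, r=1e-4, mu=0.02, gamma=26.0,
                  years=200, dt=0.01):
    beta = R0 * (gamma + mu)
    S, I = 1.0 - p, 1e-6                 # start near the disease-free state
    for _ in range(int(years / dt)):     # forward-Euler integration
        dS = mu * (1 - p) - beta * S * I - mu * S
        dI = beta * S * I + mu * p * r - (gamma + mu) * I  # reversion inflow
        S += dS * dt
        I += dI * dt
    return I

I_no_reversion = endemic_level(r=0.0)   # coverage p = 0.95 exceeds 1 - 1/R0
I_reversion    = endemic_level(r=1e-4)  # small but persistent endemic level
```

With reversion switched off the infection dies out as standard theory predicts; with a tiny reversion probability the same coverage leaves a low-level endemic equilibrium, the qualitative effect described in the abstract.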

10.
Imitation dynamics predict vaccinating behaviour   Times cited: 5 (self-citations: 0, other citations: 5)
There exists an interplay between vaccine coverage, disease prevalence and the vaccinating behaviour of individuals. Moreover, because of herd immunity, there is also a strategic interaction between individuals when they are deciding whether or not to vaccinate, because the probability that an individual becomes infected depends upon how many other individuals are vaccinated. To understand this potentially complex interplay, a game dynamic model is developed in which individuals adopt strategies according to an imitation dynamic (a learning process), and base vaccination decisions on disease prevalence and perceived risks of vaccines and disease. The model predicts that oscillations in vaccine uptake are more likely in populations where individuals imitate others more readily or where vaccinating behaviour is more sensitive to changes in disease prevalence. Oscillations are also more likely when the perceived risk of vaccines is high. The model reproduces salient features of the time evolution of vaccine uptake and disease prevalence during the whole-cell pertussis vaccine scare in England and Wales during the 1970s. This suggests that using game theoretical models to predict, and even manage, the population dynamics of vaccinating behaviour may be feasible.
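A minimal numerical sketch of this class of model follows; parameter names and values are illustrative, not the fitted values from the paper. Vaccine uptake x evolves by imitation, with the payoff to vaccinate driven by prevalence I and the perceived vaccine risk scaled to 1:

```python
# Sketch of a coupled SIR / imitation-dynamics model (illustrative parameters):
# kappa = imitation rate, omega = sensitivity of the payoff to prevalence.
def simulate(beta=280.0, gamma=28.0, mu=0.02, kappa=1.0, omega=2000.0,
             years=30, dt=1e-4):
    S, I, x = 0.05, 1e-4, 0.9      # susceptibles, infecteds, vaccine uptake
    uptake = []
    for _ in range(int(years / dt)):
        dS = mu * (1 - x) - beta * S * I - mu * S
        dI = beta * S * I - (gamma + mu) * I
        dx = kappa * x * (1 - x) * (-1.0 + omega * I)  # imitation dynamic
        S += dS * dt
        I += dI * dt
        x = min(max(x + dx * dt, 1e-6), 1.0 - 1e-6)    # guard the Euler step
        uptake.append(x)
    return uptake

uptake = simulate()
```

When prevalence is low the perceived vaccine risk dominates and uptake drifts down; rising prevalence then pushes uptake back up, which is the feedback loop behind the oscillations discussed in the abstract.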

11.
A study by Gandon et al. (2001) considered the potential ways pathogens may evolve in response to vaccination with imperfect vaccines. In this paper, by focusing on acute infections of vertebrate hosts, we examine whether imperfect vaccines that do not completely block a pathogen's replication (antigrowth) or transmission (antitransmission) may lead to evolution of more or less virulent pathogen strains. To address this question, we use models of the within-host dynamics of the pathogen and the host's immune responses. One advantage of the use of this within-host approach is that vaccination can be easily incorporated in the models and the trade-offs between pathogen transmissibility, host recovery, and virulence that drive evolution of pathogens in these models can be easily estimated. We find that the use of either antigrowth or antitransmission vaccines leads to the evolution of pathogens with an increased within-host growth rate; infection of unvaccinated hosts with such evolved pathogens results in high host mortality and low pathogen transmission. Vaccination of only a fraction of hosts with antigrowth vaccines may prevent pathogens from evolving high virulence due to pathogen adaptation to unvaccinated hosts and thus protection of vaccinated hosts from pathogen-induced disease. In contrast, antitransmission vaccines may be beneficial only if they are effective enough to cause pathogen extinction. Our results suggest that particular mechanisms of action of vaccines and their efficacy are crucial in predicting long-term evolutionary consequences of the use of imperfect vaccines.

12.
Salmonella spp. in cattle contribute to bacterial foodborne disease for humans. Reduction of Salmonella prevalence in herds is important to prevent human Salmonella infections. Typical control measures are culling of infectious animals, vaccination, and improved hygiene management. Vaccines have been developed for controlling Salmonella transmission in dairy herds; however, these vaccines are imperfect and a variety of vaccine effects on susceptibility, infectiousness, Salmonella shedding level, and duration of infectious period were reported. To assess the potential impact of imperfect Salmonella vaccines on prevalence over time and the eradication criterion, we developed a deterministic compartmental model with both replacement (cohort) and lifetime (continuous) vaccination strategies, and applied it to a Salmonella Cerro infection in a dairy farm. To understand the uncertainty of prevalence and identify key model parameters, global parameter uncertainty and sensitivity analyses were performed. The results show that imperfect Salmonella vaccines reduce the prevalence of Salmonella Cerro. Among the three vaccine effects that were being considered, decreasing the length of the infectious period is most effective in reducing the endemic prevalence. Analyses of contour lines of prevalence or the critical reproduction ratio illustrate that reducing prevalence to a certain level or zero can be achieved by choosing vaccines that have either a single vaccine effect at relatively high effectiveness, or two or more vaccine effects at relatively low effectiveness. Parameter sensitivity analysis suggests that effective control measures through applying Salmonella vaccines should be adjusted at different stages of infection. In addition, lifetime (continuous) vaccination is more effective than replacement (cohort) vaccination. The potential application of the developed vaccination model to other Salmonella serotypes related to foodborne diseases was also discussed.
The presented study may be used as a tool for guiding the development of Salmonella vaccines.
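The "single strong effect versus several weak effects" point can be made with a back-of-envelope criterion. Treating the three vaccine effects as multiplicative reductions is an assumption for illustration, not the fitted model from the study:

```python
# Sketch: reproduction ratio in a fully vaccinated herd, with sigma, tau and
# theta the fractional reductions in susceptibility, infectiousness and
# infectious-period length conferred by an imperfect vaccine.
def vaccinated_R(R0, sigma=0.0, tau=0.0, theta=0.0):
    return R0 * (1 - sigma) * (1 - tau) * (1 - theta)

R0 = 3.0
no_vaccine    = vaccinated_R(R0)                                    # 3.0
single_strong = vaccinated_R(R0, theta=0.70)                        # 0.9
three_weak    = vaccinated_R(R0, sigma=0.35, tau=0.35, theta=0.35)  # ~0.82
# Either route crosses the eradication criterion R < 1.
```

One strong effect (here a 70% cut in the infectious period) and three modest 35% effects both bring the reproduction ratio below 1, mirroring the contour-line result described above.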

13.
Understanding the processes that shape the evolution of parasites is a key challenge for evolutionary biology. It is well understood that different parasites may often infect the same host and that this may have important implications for their evolutionary dynamics. Here we examine the evolutionary implications of the conflict that arises when two parasite species, one vertically transmitted and the other horizontally transmitted, infect the same host. We show that the presence of a vertically transmitted parasite (VTP) often leads to the evolution of higher virulence in horizontally transmitted parasites (HTPs), particularly if the VTPs are feminizing. The high virulence in some HTPs may therefore result from coinfection with cryptic VTPs. The impact of an HTP on VTP evolution depends crucially on the nature of the life‐history trade‐offs. Fast virulent HTPs select for intermediate feminization and virulence in VTPs. Coevolutionary models show similar insights, but emphasize the importance of host life span to the outcome, with higher virulence in both types of parasite in short‐lived hosts. Overall, our models emphasize the interplay of host and parasite characteristics in the evolutionary outcome and point the way for further empirical study.

14.
There is little doubt that evolution has played a major role in preventing the control of infectious disease through antibiotic and insecticide resistance, but recent theory suggests disease interventions such as vaccination may lead to evolution of more harmful parasites. A new study published in PLOS Biology by Andrew Read and colleagues shows empirically that vaccination against Marek’s disease has favored higher virulence; without intervention, the birds die too quickly for any transmission to occur, but vaccinated hosts can both stay alive longer and shed the virus. This is an elegant empirical demonstration of how evolutionary theory can predict potentially dangerous responses of infectious disease to human interventions.
There is little doubt that evolution continues to play a major role in preventing drug and vector-control programs from eliminating many infectious diseases. How much of the global infectious disease burden is attributable to recent evolution, and how much to social and other forces, remains unclear, but we are unquestionably severely impacted by the evolutionary potential of pathogens [1]. There is a large body of evolutionary theory that seeks to understand the processes that make some infectious diseases acute and lethal while others are chronic and mild [2–5]. More recently, this general theory has been applied to make predictions of the evolutionary outcomes of particular disease interventions within the broader aim of “virulence management” [6,7,9]. Of particular importance is that the theory predicts that there is the potential for certain disease interventions, including certain types of vaccination, to select for the evolution of greater virulence (causing higher mortality) and therefore present a greater threat to their hosts [7]. However, the theory generally makes deliberately simple assumptions, ignoring, for example, the molecular mechanisms that underpin host–parasite interactions.
While this is one of the strengths of the approach—since it aims to make general predictions—it has been unclear how relevant this theory is to real infectious diseases. Read et al. have now provided a direct empirical test of one of the key theoretical predictions that “imperfect” vaccination can select for higher virulence [8]. The study is important because although there is increasing interest in evolutionary biology by the medical community [10,11], few empirical tests of evolutionary theory have been conducted that are of immediate relevance to important disease problems. Importantly, this empirical paper confirms the unintuitive and worrying predictions of a very simple theoretical model of the implications of a common disease intervention.
In some sense, theory on the evolution of virulence addresses the fundamental question of why infectious diseases kill their hosts. In a classic infectious disease, whether spread by contact, environmental infectious stages, or through vectors, the longer the host is infectious, the greater the chance that transmission will occur. If by killing the host the infectious period is shortened, then all things being equal, parasite genotypes that kill the host more slowly have a longer infectious period and will therefore be favored. Hence, virulence (defined as disease-induced mortality) will be selected against by evolution, and parasites should evolve to become benign; they should evolve away from parasitism towards commensalism (infection without host damage). This is the “conventional wisdom” [3] that leads ultimately to the question: why are some parasites lethal? High virulence in a relatively rare host into which a disease occasionally spills over, such as Ebola in humans, may persist because selection is predominantly occurring in the reservoir rather than the rare host.
Furthermore, recently emerged, initially virulent diseases may be in the process of evolving to lower virulence as they become more endemic in a new host. Also, in principle, disease-induced mortality could be a by-product of infection that is completely unrelated to both the genotype and life history characteristics of the parasite and is therefore not selected against. However, evolutionary theory assumes that virulence has been selected for because fundamentally things are not equal. Specifically, most theory assumes that disease-induced mortality (virulence) results from a “trade-off” (a gain in one trait comes at the expense of another) with another parasite characteristic, so that virulence is a correlated and unavoidable consequence of another factor that increases the chance of transmission. In this “trade-off hypothesis,” people have generally focused on virulence (mortality rate) being a by-product/cost of the transmission rate [2,5,12], as it is an appealing idea that high growth rates within the host may produce more transmission stages and therefore a higher rate of transmission, but also cause more damage and therefore higher virulence. Clearly within-host dynamics are much more complicated than this caricature, and we rarely understand all the mechanisms that underpin any trade-offs, but there is now good evidence for this overall relationship between transmission rate and mortality rate (virulence) in a number of systems [5,13–16]. Given this trade-off relationship, the theory predicts that there is an optimal transmission rate and level of virulence that maximizes the average number of infections that would occur in a completely susceptible population (the so-called basic reproductive number, R0). With any disease intervention, however, there is a clear and present danger that the balance between transmission rate and virulence will be altered, leading to changes in “optimal” virulence.
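The optimisation argument above can be made concrete with a toy trade-off; the square-root form of the transmission function below is purely illustrative:

```python
import math

# Sketch: with transmission beta(alpha) = c * sqrt(alpha), gains in
# transmission saturate while mortality costs keep rising, so R0 peaks at an
# intermediate virulence (analytically alpha* = recovery + background here).
def R0(alpha, c=2.0, recovery=1.0, background=0.2):
    """Basic reproductive number: transmission over total loss of infecteds."""
    return c * math.sqrt(alpha) / (alpha + recovery + background)

grid = [i / 1000.0 for i in range(1, 5000)]
alpha_star = max(grid, key=R0)   # ~1.2 = recovery + background
```

In this toy setting, an intervention that discounts the mortality cost of alpha in the denominator — as a leaky vaccine does for vaccinated hosts — raises the alpha that maximises R0, which is exactly the shift in "optimal" virulence discussed in the text.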
It is here that evolutionary theory can be useful in predicting the impact on infectious disease virulence of different interventions.
The theoretical paper that predicted the empirical results tested in the Read paper was inspired by the potential use of “imperfect” or “leaky” vaccines for malaria [7]. The key assumption of the model is that vaccination is “leaky” such that transmission can occur from infected, vaccinated individuals. If, on the other hand, the vaccination is “sterilizing,” preventing the infection (or at least the infectivity) of vaccinated individuals, then vaccinated individuals represent an evolutionary dead-end for parasites, as there is no opportunity for selection to occur. The model is simple in that it explicitly excludes escape mutants (which in some sense also make vaccines imperfect) and is typical of the approach used in evolutionary theory. The power of the approach is that by focusing on one process, the theory can make clear predictions. The key message of the detailed modeling is that leaky vaccination, which reduces the impact of the disease and thereby lowers pathogenicity, selects for a higher growth rate in the parasite, leading to a greater transmission rate and higher virulence [7]. Effectively, the cost of higher exploitation is reduced, which changes the shape of the virulence-transmission rate trade-off and allows for higher optimal growth and transmission rates. An important consequence is that a highly virulent parasite strain that kills its host so quickly that it cannot persist in an unvaccinated population can potentially circulate in a vaccinated population.
Read et al. present clear evidence that imperfect vaccines do indeed enable much more virulent strains of Marek’s disease to circulate than would be possible in the absence of vaccination. Marek’s is a disease of poultry that is spread by inhalation, persists in the environment, and initially causes paralysis in older birds.
Previous work from the group had shown clearly that there is a transmission-virulence trade-off in the disease [15,16]. Leaky vaccination [17] against Marek’s has been common since the 1970s, and over this period the disease has become much more virulent [12]. While during this time there have been a number of changes, including the intensification of production [18] and shorter bird life spans [15], that could, in theory, have increased virulence, the Read paper directly examines whether leaky vaccines could be the cause. In the new paper, the vaccine is confirmed to be “leaky” [17]: vaccinated birds can become infected and, critically, they can shed the virus. In the core experiment, Read et al. challenged vaccinated and unvaccinated birds from naïve parents (so that there were no maternal antibodies) with five virus strains that vary in virulence from 60% mortality over two months to 100% mortality within 10 days. In terms of classic virulence measures, this corresponds to roughly a 10-fold variation in the disease-induced mortality rate. Vaccination does reduce shedding of the virus; however, this positive effect of the intervention is overwhelmed in the more virulent viruses by the fact that unvaccinated birds die much more quickly. Without vaccination, the virulent strains generally kill the hosts before any transmission can take place: a strong example of the transmission-virulence trade-off in action. The researchers went further and directly examined transmission using sentinel birds. These sentinel birds were put in enclosures with either vaccinated or unvaccinated birds, both of which had been challenged with the more virulent viruses. Early death in the unvaccinated birds meant that no sentinels were infected, and this contrasted starkly with the vaccinated birds’ enclosure, where the sentinel birds were infected.
As a whole, this paper provides a direct test of the idea that vaccination allows the transmission of virus strains that are too virulent to transmit in non-vaccinated hosts.
A key criticism of the Gandon et al. theoretical paper is that there is no clear evidence of higher virulence due to human vaccination programs. However, the established successful human vaccination programs have mostly been “sterilizing” [19], although as we implement human vaccination programs with “leaky” vaccines, we will be carrying out real-world “experiments” that test the theory. The Read paper has shown, however, that this piece of evolutionary theory is pertinent to real-world infectious disease control. More generally, this study highlights the potential usefulness of evolutionary theory for disease control and suggests that it may therefore have an important role to play in the design of medical interventions. If so, it is important that we take a broad view of the evolution of virulence theory and the “trade-off hypothesis” beyond the simple relationship between transmission and mortality rates. Evolutionary theory is directly applicable whenever virulence is an optimum determined by the relative costs and benefits of a number of correlated parasite traits. This broader view is important since, for infectious agents that are obligate killers (i.e., they can only transmit at the death of their host), virulence is positively related to transmission. However, in these diseases there are other trade-offs, such as one between productivity and time to death, that lead to an evolutionarily optimal virulence that is determined by selection [5,20,21]. Indeed, the paper that is often cited as the origin of the trade-off hypothesis [4] described a trade-off between virulence (disease-induced mortality) and recovery (rather than transmission) such that faster growing, more damaging parasite strains are harder to clear, and the hosts take longer to recover [4,22].
Some of the criticism of the trade-off hypothesis focuses on whether there is a transmission–virulence relationship in a particular disease interaction; but if we take this broad view, there is considerable evidence that virulence is shaped by selection [5,12,23]. Furthermore, although “accidental” high virulence in a rare host may not be selected against, selection is likely to be happening in the more common hosts, and even if virulence is caused primarily by host immunopathology, this has the potential to select for parasites that modulate their growth or invest in immunomodulation [24–27].
In terms of vaccination programs, it would clearly be useful if there were more experimental tests of the theory. However, there are likely to be few systems in which there has been the widespread historical implementation of leaky vaccination that are also amenable to experimentation. There are, however, a number of key issues that still need to be addressed theoretically. In particular, any vaccination program is likely to have incomplete vaccine coverage, and understanding the impact of different levels of heterogeneity in vaccine coverage within the population on the evolution of virulence is a difficult but important problem. Furthermore, there is likely to be genetic variation both in resistance of hosts and the efficacy of vaccination within most populations, and this heterogeneity may have important implications for the outcome of vaccination [28]. It is also the case that there can often be specificity between different parasite strains and host genotypes, and this may be of considerable importance in many systems [12], particularly outside of relatively genetically homogeneous agricultural populations. That said, given that we now have this test in Marek’s disease, we are able, at least, to say that the theory is relevant to real-world disease systems. It seems prudent, therefore, to take this risk seriously and consider the potential for selection in the use of new vaccines.
The theory tells us that the key questions we need to ask of a vaccination program are: is the vaccine leaky? Does the vaccine act to reduce the impact of the disease within an individual? And is the virulence of the parasite selected for—whether it is due to the transmission–virulence trade-off or some other, broader trade-off relationship? These questions should ideally be addressed before the implementation of any vaccination program, and careful monitoring should be implemented in light of this potential selection for higher virulence.
More broadly, other disease interventions beyond vaccination also have the potential to impose selection on disease agents and cause similar problems [6]. It is very hard for us to predict the evolutionary outcome after the emergence of a new disease into a population, but we can and should do much better in predicting the impact of our own disease interventions. If we take it seriously, evolutionary theory gives us the opportunity to move towards a more evolutionarily rational program of disease intervention. While the Read paper shows how the simple theory of the Gandon paper can predict real disease dynamics, there is considerable potential for the development of more disease-specific theory that includes more of the key detailed mechanistic knowledge of a particular host–parasite interaction. There is an opportunity for real advances in the predictive power of the models through tight collaborations between evolutionary modelers and molecular parasitologists and/or virologists. Calls for more serious acceptance of evolutionary biology by the medical community are increasing [10,11], and the Read paper shows that a combination of predictive theory and empirical tests of this theory in real-world disease systems has real potential to improve disease interventions in the light of evolutionary responses.

15.
An increasing number of scientists have recently raised concerns about the threat posed by human intervention to the evolution of parasites and disease agents. New parasites (including pathogens) keep emerging, and parasites previously considered 'under control' are re-emerging, sometimes in highly virulent forms. This re-emergence may reflect parasite evolution driven by human activity, including ecological changes related to modern agricultural practices. Intensive farming creates conditions for parasite growth and transmission drastically different from what parasites experience in wild host populations, and may therefore alter selection on various traits, such as life-history traits and virulence. Although recent epidemic outbreaks highlight the risks associated with intensive farming practices, most work has focused on reducing the short-term economic losses imposed by parasites, for example through chemotherapy. Most research on parasite evolution has been conducted using laboratory model systems, often unrelated to economically important systems. Here, we review the possible evolutionary consequences of intensive farming by relating current knowledge of the evolution of parasite life history and virulence to the specific conditions parasites experience on farms. We show that intensive farming practices are likely to select for fast-growing, early-transmitted, and hence probably more virulent parasites. As an illustration, we consider the fish farming industry, a branch of intensive farming that has expanded dramatically in recent years, and present evidence supporting the idea that intensive farming conditions increase parasite virulence. We suggest that more studies should focus on the impact of intensive farming on parasite evolution, in order to build the currently lacking but necessary bridges between academia and decision-makers.

16.
Ecological interactions between microparasite populations in the same host are an important source of selection on pathogen traits such as virulence and drug resistance. In the rodent malaria model Plasmodium chabaudi in laboratory mice, more virulent parasites can competitively suppress less virulent parasites in mixed infections. There is evidence that some of this suppression is due to immune-mediated apparent competition, in which an immune response elicited by one parasite population suppresses the population density of another. This raises the question of whether enhanced immunity following vaccination would intensify competitive interactions, thus strengthening selection for virulence in Plasmodium populations. Using the P. chabaudi model, we studied mixed infections of virulent and avirulent genotypes in CD4+ T cell-depleted mice. Enhanced efficacy of CD4+ T cell-dependent responses is the aim of several candidate malaria vaccines. We hypothesized that if immune-mediated interactions were involved in competition, removal of the CD4+ T cells would alleviate competitive suppression of the avirulent parasite. Instead, we found no alleviation of competition in the acute phase, and significant enhancement of competitive suppression after parasite densities had peaked. Thus, the host immune response may actually be alleviating other forms of competition, such as competition over red blood cells. Our results suggest that the CD4+-dependent immune response, and mechanisms such as vaccination that act to enhance it, may not have the undesirable effect of exacerbating within-host competition and hence the strength of this source of selection for virulence.

17.
Infectious disease treatments, both pharmaceutical and vaccine-based, face three universal challenges: the difficulty of targeting treatments to the high-risk 'superspreader' populations who drive the great majority of disease spread, behavioral barriers in the host population (such as poor compliance and risk disinhibition), and the evolution of pathogen resistance. Here, we describe a proposed intervention that would overcome these challenges by capitalizing on Therapeutic Interfering Particles (TIPs), which are engineered to replicate conditionally in the presence of the pathogen and to spread between individuals, analogous to the 'transmissible immunization' that occurs with live-attenuated vaccines (but without the potential for reversion to virulence). Building on analyses of HIV field data from sub-Saharan Africa, we construct a multi-scale model, beginning at the single-cell level, to predict the effect of TIPs on individual patient viral loads and ultimately on population-level disease prevalence. Our results show that a TIP, engineered with properties based on a recent HIV gene-therapy trial, could stably lower HIV/AIDS prevalence by ~30-fold within 50 years and could complement current therapies. In contrast, optimistic antiretroviral therapy or vaccination campaigns alone could lower HIV/AIDS prevalence by less than 2-fold over 50 years. The TIP's efficacy arises from its exploitation of the same risk factors as the pathogen, allowing it to autonomously penetrate superspreader populations, maintain efficacy despite behavioral disinhibition, and limit viral resistance. While demonstrated here for HIV, the TIP concept could apply broadly to many viral infectious diseases and would represent a new paradigm for disease control, moving away from pathogen eradication and toward robust disease suppression.

18.
Global immunization programmes have achieved some remarkable successes. In 1977, Frank Fenner's Commission declared smallpox to have been eradicated by an 11-year-long intensive campaign. The Expanded Programme on Immunization encompassed six important childhood vaccines and reached over three-quarters of the world's children. Polio eradication has gone remarkably well, with only 10 of 200 countries reporting residual cases. But amidst all the good news there is also bad news: coverage is variable, infrastructure is crumbling, and newer vaccines are not being incorporated into many country programmes. The Bill and Melinda Gates Foundation has introduced a new dynamic here. From their initial gift of $100 million in December 1998, their commitment to date is US$1.5 billion, and rising. At the centre is a Global Children's Vaccine Fund, which permitted the launch, in January 2000, of the Global Alliance for Vaccines and Immunization (GAVI). This is targeted at the 74 poorest countries of the world and is designed to improve vaccination infrastructure, to purchase newer vaccines, and to support research and development. Even before we know how successful this programme will be, it has had imitators: the Global Fund to Fight AIDS, TB and Malaria borrowed many concepts from GAVI, as does the Global Alliance for Improved Nutrition, announced in May 2002 and heavily supported by Gates. Highly effective parasite control programmes antedate all this but will be much strengthened. However, we still face a sizeable budgetary gap, both for research and for bringing the best advances to all the people who need them.

19.
Despite the enormous success of mass immunization programs in reducing the incidence of infectious diseases, vaccine-escape strains have emerged, perhaps as a consequence of the strong selection pressures exerted on parasites by vaccines. Pertussis presents a well-documented example. As a childhood infection, it exhibits age-specific transmission biased toward children. Assuming different transmission rates between children and adults, I study, by means of an age-structured epidemic model, the evolutionary dynamics of parasite virulence in a vaccinated population. I find that age structure alone does not affect the evolutionary dynamics of parasite virulence. Motivated by empirical data reporting antigenic divergence from vaccine strains and mutations in virulence-associated genes in pertussis populations, I also allow mutations in parasite virulence and in associated immune evasion to occur in parallel. I conclude that this simultaneous adaptation of both traits may substantially alter the evolutionary course of the parasite. In particular, higher values of virulence are favoured once the parasite is able to evade transmission-blocking vaccine-induced immunity, whereas lower values of virulence are selected for once the parasite evolves the ability to evade virulence-blocking vaccine-induced immunity. I emphasize the importance of multi-trait evolution for assessing the direction of parasite adaptation more accurately.
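The age-structured setting can be made concrete with a minimal next-generation-matrix calculation. This is a generic textbook construction, not the model or the parameters of the paper above; the contact values, coverage `p`, and efficacy `eps` below are illustrative. R0 is the dominant eigenvalue of a matrix K whose entry K[i][j] is the expected number of infections in group i caused by one infectious individual in group j, and a leaky vaccine given to children scales the susceptibility of the child class (row 0 of K):

```python
import math

def dominant_eigenvalue_2x2(k):
    """Dominant eigenvalue (Perron root) of a 2x2 non-negative matrix."""
    tr = k[0][0] + k[1][1]
    det = k[0][0] * k[1][1] - k[0][1] * k[1][0]
    return (tr + math.sqrt(tr * tr - 4.0 * det)) / 2.0

# Next-generation matrix K, with group 0 = children, group 1 = adults.
# Transmission is biased toward children, as for a childhood infection.
k_base = [[3.0, 0.5],
          [0.8, 1.2]]
r0_base = dominant_eigenvalue_2x2(k_base)

# A leaky vaccine at coverage p with per-contact efficacy eps multiplies
# the susceptibility of the vaccinated child class, i.e. row 0 of K.
p, eps = 0.8, 0.75
factor = 1.0 - p * eps
k_vacc = [[factor * 3.0, factor * 0.5],
          [0.8, 1.2]]
r0_vacc = dominant_eigenvalue_2x2(k_vacc)

print(f"R0 before vaccination: {r0_base:.2f}")  # 3.20
print(f"R0 after vaccination:  {r0_vacc:.2f}")  # 1.60
```

With these illustrative numbers, vaccination halves R0 but leaves it above 1, the regime in which a leaky vaccine keeps transmission going and evolutionary questions of the kind discussed above arise.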

20.
The SIR (susceptible-infectious-resistant) and SIS (susceptible-infectious-susceptible) frameworks for infectious disease have been extensively studied and successfully applied. They implicitly assume the upper and lower limits of the range of possible host immune responses; however, the majority of infections fall into neither of these extreme categories. We combine two general mechanisms that straddle this range: temporary immune protection (immunity wanes over time since infection) and partial immune protection (immunity is not fully protective but reduces the risk of reinfection). We present a systematic analysis of the dynamics and equilibrium properties of these models in comparison with SIR and SIS, and analyse the outcome of vaccination programmes. We describe how the waning of immunity shortens inter-epidemic periods and poses major difficulties for disease eradication. When partial immunity is included, we identify a "reinfection threshold" in transmission. Below the reinfection threshold, primary infection dominates, levels of infection are low, and vaccination is highly effective (approximately an SIR model). Above the reinfection threshold, reinfection dominates, levels of infection are high, and vaccination fails to protect (approximately an SIS situation). This association between high prevalence of infection and vaccine failure emphasizes the problems of controlling recurrent infections in high-burden regions. However, vaccines that induce better protection than natural infection have the potential to raise the reinfection threshold, and therefore constitute interventions with a surprisingly high capacity to reduce infection where reduction is most needed.
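The reinfection-threshold behaviour described above can be reproduced with a minimal sketch of a partial-immunity (SIRI-type) model. The parameter values below (`gamma`, `mu`, `sigma`, and the two R0 values) are illustrative assumptions, not taken from the paper: recovered hosts are reinfected at a rate reduced by a factor sigma, so the reinfection threshold sits at R0 = 1/sigma, and equilibrium prevalence jumps from low to high as R0 crosses it:

```python
# Sketch of an SIRI model (SIR plus partial immunity): recovered hosts are
# reinfected at rate sigma * beta * I. sigma = 0 recovers SIR; sigma = 1
# behaves like SIS. All parameters here are illustrative.

def siri_equilibrium(r0, sigma, gamma=1.0, mu=0.02, years=800.0, dt=0.005):
    """Integrate with forward Euler and return the long-run prevalence I."""
    beta = r0 * (gamma + mu)          # transmission rate implied by R0
    s, i, r = 0.99, 0.01, 0.0         # initial fractions (S, I, R)
    for _ in range(int(years / dt)):
        new_inf = beta * s * i        # primary infections
        re_inf = sigma * beta * r * i  # reinfections of partially immune hosts
        ds = mu - new_inf - mu * s    # births replenish susceptibles
        di = new_inf + re_inf - (gamma + mu) * i
        dr = gamma * i - re_inf - mu * r
        s, i, r = s + ds * dt, i + di * dt, r + dr * dt
    return i

sigma = 0.25                          # reinfection threshold at R0 = 1/sigma = 4
low = siri_equilibrium(2.0, sigma)    # below threshold: SIR-like, low prevalence
high = siri_equilibrium(6.0, sigma)   # above threshold: SIS-like, high prevalence
print(f"prevalence below threshold: {low:.3f}")
print(f"prevalence above threshold: {high:.3f}")
```

Below the threshold the endemic prevalence is set by the slow demographic replenishment of susceptibles; above it, reinfection of the large recovered class sustains an order-of-magnitude higher prevalence, which is the regime in which vaccination fails to protect.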

