Similar Literature
20 similar documents found.
1.
Net blotch is a barley foliar disease caused by two forms of Pyrenophora teres: Pyrenophora teres f. teres (PTT) and Pyrenophora teres f. maculata (PTM). To monitor and quantify their occurrence during the growing season, a diagnostic system based on real-time PCR was developed. TaqMan MGB (Minor Groove Binder) primers and probes were designed that showed high specificity for each of the two forms of P. teres. As a host plant internal standard, TaqMan MGB primers and a probe based on the RacB gene sequence were designed. The method was optimised on pure fungal DNA and on plasmid standard dilutions. Quantification was accomplished by comparing Ct values of unknown samples with those obtained from plasmid standard dilutions. The assay detects down to five gene copies per reaction and produces reliable quantitative data over a range of six orders of magnitude. The developed assay was used to differentiate and quantify both forms of P. teres in infected barley leaves. A correlation of R² = 0.52 was obtained between Ct values and the size of symptom areas in the early stage of infection. Application of the TaqMan MGB technology to leaf samples collected from 20 barley varieties in the Kromeriz region during the 2003 and 2004 growing seasons revealed that P. teres f. teres predominated in both years. The developed method is an important tool for quantifying and monitoring the dynamics of the two forms of P. teres during the growing season.
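As an aside on how quantification against a plasmid standard dilution series works in practice, the sketch below fits Ct against log10 copy number and inverts the fit for unknown samples. All numbers are hypothetical illustrations, not values from the study.

```python
import numpy as np

# Hypothetical plasmid standard dilution series: known copies per reaction and measured Ct.
standard_copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5, 1e6])
standard_ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7, 16.3])

# Standard curve: Ct = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(standard_copies), standard_ct, 1)

# Amplification efficiency implied by the slope (100% corresponds to a slope of about -3.32).
efficiency = 10 ** (-1.0 / slope) - 1.0

def copies_from_ct(ct):
    """Invert the standard curve to estimate copy number in an unknown sample."""
    return 10 ** ((ct - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"sample with Ct = 24.5 -> {copies_from_ct(24.5):.0f} copies per reaction")
```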

2.
Pyrenophora teres, causal agent of net blotch of barley, exists in two forms, designated P. teres f. teres and P. teres f. maculata, which induce net form net blotch (NFNB) and spot form net blotch (SFNB), respectively. Significantly more work has been performed on the net form than on the spot form although recent activity in spot form research has increased because of epidemics of SFNB in barley-producing regions. Genetic studies have demonstrated that NFNB resistance in barley is present in both dominant and recessive forms, and that resistance/susceptibility to both forms can be conferred by major genes, although minor quantitative trait loci have also been identified. Early work on the virulence of the pathogen showed toxin effector production to be important in disease induction by both forms of pathogen. Since then, several laboratories have investigated effectors of virulence and avirulence, and both forms are complex in their interaction with the host. Here, we assemble recent information from the literature that describes both forms of this important pathogen and includes reports describing the host-pathogen interaction with barley. We also include preliminary findings from a genome sequence survey. TAXONOMY: Pyrenophora teres Drechs. Kingdom Fungi; Phylum Ascomycota; Subphylum Pezizomycotina; Class Dothideomycete; Order Pleosporales; Family Pleosporaceae; Genus Pyrenophora, form teres and form maculata. IDENTIFICATION: To date, no clear morphological or life cycle differences between the two forms of P. teres have been identified, and therefore they are described collectively. Towards the end of the growing season, the fungus produces dark, globosely shaped pseudothecia, about 1-2mm in diameter, on barley. Ascospores measuring 18-28μm × 43-61μm are light brown and ellipsoidal and often have three to four transverse septa and one or two longitudinal septa in the median cells. Conidiophores usually arise singly or in groups of two or three and are lightly swollen at the base. Conidia measuring 30-174μm × 15-23μm are smoothly cylindrical and straight, round at both ends, subhyaline to yellowish brown, often with four to six pseudosepta. Morphologically, P. teres f. teres and P. teres f. maculata are indistinguishable. HOST RANGE: Comprehensive work on the host range of P. teres f. teres has been performed; however, little information on the host range of P. teres f. maculata is available. Hordeum vulgare and H. vulgare ssp. spontaneum are considered to be the primary hosts for P. teres. However, natural infection by P. teres has been observed in other wild Hordeum species and related species from the genera Bromus, Avena and Triticum, including H. marinum, H. murinum, H. brachyantherum, H. distichon, H. hystrix, B. diandrus, A. fatua, A. sativa and T. aestivum (Shipton et al., 1973, Rev. Plant Pathol. 52:269-290). In artificial inoculation experiments under field conditions, P. teres f. teres has been shown to infect a wide range of gramineous species in the genera Agropyron, Brachypodium, Elymus, Cynodon, Deschampsia, Hordelymus and Stipa (Brown et al., 1993, Plant Dis. 77:942-947). Additionally, 43 gramineous species were used in a growth chamber study and at least one of the P. teres f. teres isolates used was able to infect 28 of the 43 species tested. However, of these 28 species, 14 exhibited weak type 1 or 2 reactions on the NFNB 1-10 scale (Tekauz, 1985). These reaction types are small pin-point lesions and could possibly be interpreted as nonhost reactions. 
In addition, the P. teres f. teres host range was investigated under field conditions by artificially inoculating 95 gramineous species with naturally infected barley straw. Pyrenophora teres f. teres was re-isolated from 65 of the species when infected leaves of adult plants were incubated on nutrient agar plates; however, other than Hordeum species, only two of the 65 host species exhibited moderately susceptible or susceptible field reaction types, with most species showing small dark necrotic lesions indicative of a highly resistant response to P. teres f. teres. Although these wild species have the potential to be alternative hosts, the high level of resistance identified for most of the species makes their role as a source of primary inoculum questionable. DISEASE SYMPTOMS: Two types of symptom are caused by P. teres. These are net-type lesions caused by P. teres f. teres and spot-type lesions caused by P. teres f. maculata. The net-like symptom, for which the disease was originally named, has characteristic narrow, dark-brown, longitudinal and transverse striations on infected leaves. The spot form symptom consists of dark-brown, circular to elliptical lesions surrounded by a chlorotic or necrotic halo of varying width.

3.
Spring barley cultivars currently grown in Germany are all more or less susceptible to Rhynchosporium secalis (Oudem.) J.J. Davis, but there are obvious differences in the degree of susceptibility under field conditions. Small differences between genotypes may be caused by both genetic and environmental effects. To minimize the influence of environmental variation on disease expression, several inoculation methods were developed in the present study. In two experiments, the effectiveness of spray inoculation with single-spore isolates was tested in the glasshouse and in the field, respectively. High infection levels were achieved in the glasshouse. Despite infection of barley in the field, disease expression remained low due to unfavourable conditions. A further experiment showed the usefulness of infected straw, applied in the autumn only, for testing the seedling infection type of spring barley cultivars against R. secalis. Seedling assay scores and field infection levels were closely related (r = 0.796, P < 0.01; r = 0.911, P < 0.001). Therefore, both the spray inoculation technique in the glasshouse and the inoculation technique using infected straw in the field appear suitable for detecting genetic differences in the resistance/susceptibility of spring barley cultivars to leaf blotch.

4.
A method for estimating disease-induced yield losses using single wheat tillers as experimental units was evaluated under commercial field conditions. Natural epidemics of Septoria tritici blotch, yellow rust and leaf rust, which developed late in the season, had no effect on the vegetative growth of the host but only reduced kernel weight. The optimal sample size was determined experimentally to be 300 tillers. Losses were also estimated in the same experiments using field plots as experimental units. Yield loss estimates derived from single tillers were significantly correlated with those derived from field plots. The relationship between disease and yield varied significantly among fields, even within a given year, region and cultivar. It was therefore concluded that this relationship has to be established empirically wherever yield losses are to be estimated, and that under Israeli conditions this may be done using the single-tiller method.

5.
An AUDPC (area under the disease progress curve) model was developed to describe the relationship between Alternaria porri infection and yield loss in garlic. Percentage yield loss was regressed against AUDPC and gave an acceptable fit to a linear model, with an intercept of -0.35, a slope of 0.09 and R² = 0.85 during 1990–91, and an intercept of 1.91, a slope of 0.08 and R² = 0.87 during 1991–92. The effect of various levels of leaf damage on garlic bulb yield was also studied to simulate damage caused by A. porri. Significant yield reductions from 25% defoliation at 5 wk, 50% defoliation at 4 wk and 75% defoliation at 3 wk before bulb maturity were recorded. However, no significant yield reduction was observed when damage occurred 2 wk before bulb maturity, even at the highest levels of leaf damage. Linear regression models were fitted for predicting yield at various levels of defoliation at different weeks before bulb maturity. These models can be used to describe the consequences of disease epidemics, projecting possible losses during the season and justifying the need for chemical disease control.
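As a worked illustration of how such an AUDPC-based estimate would be applied, the sketch below computes AUDPC by trapezoidal integration over illustrative assessment dates and plugs it into the 1990–91 coefficients quoted above; only the intercept (-0.35) and slope (0.09) come from the abstract, the assessment data are invented.

```python
def audpc(days, severity):
    """Area under the disease progress curve by the trapezoidal rule."""
    return sum((severity[i] + severity[i + 1]) / 2 * (days[i + 1] - days[i])
               for i in range(len(days) - 1))

# Illustrative assessment dates (days after planting) and % leaf area affected by A. porri.
days = [40, 50, 60, 70, 80, 90]
severity = [0, 1, 3, 6, 10, 15]

a = audpc(days, severity)
# 1990-91 model from the abstract: % yield loss = -0.35 + 0.09 * AUDPC
yield_loss_pct = -0.35 + 0.09 * a
print(f"AUDPC = {a:.0f}, predicted yield loss = {yield_loss_pct:.1f}%")
```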

6.
Comparisons of light leaf spot epidemics of differing duration and time of initiation were made in two experiments using a single cultivar of Brassica napus. Fungicide was applied before introduction of the disease to prevent infection, or some time after inoculation to stop further disease development. In the first experiment, substantial reductions in green leaf area and total plant dry matter were found at flowering when disease was introduced in the autumn or in January. Plant dry weight at maturity was also greatly reduced in these treatments. The detrimental effect of an epidemic initiated in the autumn was largely avoided if fungicide application began in February. Epidemics initiated in March had only small effects on final dry-matter yield. Seed yield was negatively correlated with the length of the epidemic. In a second experiment, early epidemics initiated in the autumn were halted after different time intervals. Commencing fungicide application even as early as December failed to prevent some loss of dry weight at flowering. At maturity, however, dry weight and seed yield were reduced significantly when fungicide application was delayed until February. Failure to control the disease resulted in a 46% loss of seed yield.

7.
The relationship between development of light leaf spot and yield loss in winter oilseed rape was analysed, initially using data from three experiments at sites near Aberdeen in Scotland in the 1991/92, 1992/93 and 1993/94 seasons. Over the three seasons, single-point models relating yield to light leaf spot incidence (% plants with leaves with light leaf spot) at GS 3.3 (flower buds visible) generally accounted for more of the variance than single-point models at earlier or later growth stages. Only in 1992/93, when a severe light leaf spot epidemic developed on leaves early in the season, did the single-point model for disease severity on leaves at GS 3.5/4.0 account for more of the variance than that for disease incidence at GS 3.3. In 1991/92 and 1992/93, when reasonably severe epidemics developed on stems, the single-point model for light leaf spot incidence on stems at GS 6.3 accounted for as much of the variance. Two-point models (disease severity at GS 3.3 and GS 4.0) and AUDPC models (disease incidence/severity) accounted for more of the variance than the single-point model based on disease incidence at GS 3.3 in 1992/93 but not in the other two seasons. Therefore, a simple model using light leaf spot incidence at GS 3.3 (x) as the explanatory variable was selected as a predictive model to estimate % yield loss (yr): yr = 0.32x - 0.57. This model fitted all three data sets from Scotland. When data sets from Rothamsted, Rosemaund and Thurloxton in England were used to test it, this single-point predictive model generally fitted the data well, except when yield loss was clearly not related to the occurrence of light leaf spot. However, the regression lines relating observed yield loss to light leaf spot incidence at GS 3.3 often had smaller slopes than the line produced by the model based on Scottish data.
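A trivial sketch of how the single-point model quoted above would be applied; the incidence value is a made-up example.

```python
def predicted_yield_loss(incidence_pct):
    """Single-point model from the abstract: % yield loss (yr) from light leaf spot
    incidence (% plants affected) assessed at GS 3.3."""
    return 0.32 * incidence_pct - 0.57

# e.g. 40% of plants with light leaf spot at GS 3.3 (illustrative value)
print(f"estimated yield loss = {predicted_yield_loss(40):.1f}%")
```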

8.
The effects of combinations of leaf rust inoculation at different growth stages and initial inoculum levels on leaf rust development and yield of the winter wheat cultivars McNair 1003 and Coker 762 were evaluated. Disease onset stage and initial inoculum level affected the rate of leaf rust development and the shape of the disease progress curves in both cultivars. Epidemics with common onset stages and different initial inoculum levels differed in area under the disease progress curve (AUDPC). Leaf rust epidemics initiated at Feekes growth stages 5, 7 and 10 reduced yield in both cultivars. Epidemics initiated early with high inoculum levels had the greatest deleterious effect on yield. Maximum losses due to leaf rust were 30–40%. Yield loss was directly related to AUDPC when the AUDPC varied from 500 to 1700 in McNair 1003 and from 250 to 1700 in Coker 762. Yield reduction was mainly due to reduced grain weight.

9.
Surveys and field experiments showed pasmo to be the most serious disease affecting UK winter linseed in the 1997–98, 1998–99 and 1999–2000 growing seasons. Survey data indicated that pasmo was widespread in England and Scotland, causing extensive loss of leaves and stem and capsule symptoms, on both winter and spring linseed crops. In winter linseed experiments at ADAS Boxworth and Rothamsted, when severe epidemics occurred (1997–98 and 1999–2000), control of pasmo with one or two MBC fungicide sprays increased yield. In experiments when severe pasmo epidemics did not occur (1998–99), fungicide applications did not increase yield. In all three growing seasons, large numbers of air-borne Mycosphaerella linicola ascospores were collected in the summer months. At the time when the winter linseed crop was emerging and becoming established in October/November, there were more air-borne M. linicola ascospores in 1999 than in 1998. April/May rainfall was much greater in 1998 (135 mm) and 2000 (223 mm), when severe pasmo epidemics developed by July, than in 1999 (68 mm) when disease severity in July was less. Regression analyses suggested that yield decreased as percentage area affected by pasmo on leaves or stems in July increased. The formulae relating yield loss to pasmo severity, derived from these experiments, were combined with disease survey data to estimate, retrospectively, the UK national losses from pasmo. Estimated national losses from pasmo on winter linseed, although >50% of crops were sprayed with fungicide, were approximately £2.9M in 1998, £1.6M in 1999 and £0.37M in 2000 (when the area of winter linseed had decreased greatly). Estimated combined losses on winter and spring linseed were approximately £14.8M in 1998, £34.9M in 1999 and £11.0M in 2000.

10.
Breban R. PLoS ONE 2011; 6(12): e28300.
Both pandemic and seasonal influenza are receiving more attention from mass media than ever before. Topics such as epidemic severity and vaccination are changing the way in which we perceive the utility of disease prevention. Voluntary influenza vaccination has recently been modeled using inductive reasoning games. It has thus been found that severe epidemics may occur because individuals do not vaccinate and, instead, attempt to benefit from the immunity of their peers. Such epidemics could be prevented by voluntary vaccination if incentives were offered. However, a key assumption has been that individuals make vaccination decisions based on whether there was an epidemic each influenza season; no other epidemiological information is available to them. In this work, we relax this assumption and investigate the consequences of making more informed vaccination decisions while no incentives are offered. We obtain three major results. First, individuals will not cooperate enough to constantly prevent influenza epidemics through voluntary vaccination, no matter how much they learn about influenza epidemiology. Second, broadcasting epidemiological information richer than whether an epidemic occurred may stabilize the vaccination coverage and suppress severe influenza epidemics. Third, the stable vaccination coverage follows the trend of the perceived benefit of vaccination. However, increasing the amount of epidemiological information released to the public may either increase or decrease the perceived benefit of vaccination. We discuss three scenarios where individuals know, in addition to whether there was an epidemic, (i) the incidence, (ii) the vaccination coverage, or (iii) both the incidence and the vaccination coverage, every influenza season. We show that broadcasting both the incidence and the vaccination coverage could yield either better or worse vaccination coverage than broadcasting each piece of information on its own.
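The inductive-reasoning game itself is not reproduced here, but the free-riding feedback it describes can be illustrated with a deliberately crude adaptive-expectations toy: agents vaccinate only while their perceived infection risk makes it worthwhile, coverage then erodes until an epidemic occurs and risk perception rebounds. All parameters and rules below are hypothetical and much simpler than the model in the paper.

```python
import random

N = 1000                    # population size (hypothetical)
COST_VACC, COST_INF = 0.1, 1.0
CRITICAL_COVERAGE = 0.6     # hypothetical coverage above which no epidemic occurs
ATTACK_RATE = 0.4           # crude fixed attack rate among the unvaccinated in an epidemic
perceived_risk = [0.5] * N  # each agent's perceived probability of infection if unvaccinated

for season in range(20):
    vaccinated = [perceived_risk[i] * COST_INF > COST_VACC for i in range(N)]
    coverage = sum(vaccinated) / N
    epidemic = coverage < CRITICAL_COVERAGE
    for i in range(N):
        infected = (not vaccinated[i]) and epidemic and random.random() < ATTACK_RATE
        # Exponential smoothing of perceived risk from this season's outcome.
        perceived_risk[i] = 0.7 * perceived_risk[i] + 0.3 * (1.0 if infected else 0.0)
    print(f"season {season:2d}: coverage = {coverage:.2f}, epidemic = {epidemic}")
```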

11.
Animal body size often varies systematically along latitudinal gradients, where individuals are either larger or smaller with varying season length. This study examines ecotypic responses by the generalist grasshopper Melanoplus femurrubrum (Orthoptera: Acrididae) in body size and covarying, physiologically based life history traits along a latitudinal gradient with respect to seasonality and energetics. The latitudinal compensation hypothesis predicts that smaller body size occurs in colder sites when populations must compensate for time constraints due to short seasons. Shorter season length requires faster developmental and growth rates to complete life cycles in one season. Using a common garden experimental design under laboratory conditions, we examined how grasshopper body size, consumption, developmental time, growth rate and metabolism varied among populations collected along an extended latitudinal gradient. When reared at the same temperature in the lab, individuals from northern populations were smaller, developed more rapidly, and showed higher growth rates, as expected for adaptations to shorter and generally cooler growing seasons. Temperature-dependent, whole-organism metabolic rate scaled positively with body size and was lower at northern sites, but mass-specific standard metabolic rate did not differ among sites. Total food consumption varied positively with body size, but northern populations exhibited a higher mass-specific consumption rate. Overall, compensatory life history responses corresponded with key predictions of the latitudinal compensation hypothesis in response to season length.

12.
In the UK, ascospores of Leptosphaeria maculans first infect leaves of oilseed rape in the autumn to cause phoma leaf spots, from which the fungus can grow to cause stem cankers in the spring. Yield losses due to early senescence and lodging result if the stem cankers become severe before harvest. The risk of severe stem canker epidemics needs to be forecast in the autumn when the pathogen is still in the leaves, since early infections cause the greatest yield losses and fungicides have limited curative activity. Currently, the most effective way to forecast severe stem canker is to monitor the onset of phoma leaf spotting in winter oilseed rape crops, although this does not allow much time in which to apply a fungicide. Early warnings of risks of severe stem canker epidemics could be provided at the beginning of the season through regional forecasts based on disease survey and weather data, with options for input of crop-specific information and for updating forecasts during the winter. The accuracy of such forecasts could be improved by including factors relating to the maturation of ascospores in pseudothecia, the release of ascospores and the occurrence of infection conditions, as they affect the onset, intensity and duration of the phoma leaf spotting phase. Accurate forecasting of severe stem canker epidemics can improve disease control and optimise fungicide use.

13.
Seasonal influenza epidemics cause consistent, considerable, widespread loss annually in terms of economic burden, morbidity, and mortality. With access to accurate and reliable forecasts of a current or upcoming influenza epidemic’s behavior, policy makers can design and implement more effective countermeasures. This past year, the Centers for Disease Control and Prevention hosted the “Predict the Influenza Season Challenge”, with the task of predicting key epidemiological measures for the 2013–2014 U.S. influenza season with the help of digital surveillance data. We developed a method for in-season forecasting of epidemics using a semiparametric empirical Bayes framework, and applied it to predict the weekly percentage of outpatient doctor visits for influenza-like illness, and the season onset, duration, peak time, and peak height, with and without using Google Flu Trends data. Previous work on epidemic modeling has focused on developing mechanistic models of disease behavior and applying time series tools to explain historical data. However, tailoring these models to certain types of surveillance data can be challenging, and overly complex models with many parameters can compromise forecasting ability. Our approach instead produces possibilities for the epidemic curve of the season of interest using modified versions of data from previous seasons, allowing for reasonable variations in the timing, pace, and intensity of the seasonal epidemics, as well as noise in observations. Since the framework does not make strict domain-specific assumptions, it can easily be applied to other diseases with seasonal epidemics. This method produces a complete posterior distribution over epidemic curves, rather than, for example, solely point predictions of forecasting targets. We report prospective influenza-like-illness forecasts made for the 2013–2014 U.S. influenza season, and compare the framework’s cross-validated prediction error on historical data to that of a variety of simpler baseline predictors.
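The following is not the authors' implementation; it is a minimal importance-sampling sketch of the general idea described above: candidate epidemic curves are generated by perturbing past seasons' curves in timing and intensity, then weighted by how well they match the partially observed current season, giving an approximate posterior over full-season curves. All data and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weekly ILI curves (% of visits) from three past seasons.
past_seasons = np.array([
    [1, 1, 2, 3, 5, 7, 6, 4, 3, 2, 1, 1],
    [1, 2, 3, 5, 8, 9, 7, 5, 3, 2, 1, 1],
    [1, 1, 1, 2, 3, 5, 8, 7, 5, 3, 2, 1],
], dtype=float)

observed = np.array([1.2, 1.5, 2.5, 3.8])   # partially observed current season (4 weeks)
noise_sd = 0.5                               # assumed observation noise

# Prior: candidate curves are past curves perturbed in timing and intensity.
candidates, log_w = [], []
for _ in range(2000):
    base = past_seasons[rng.integers(len(past_seasons))]
    curve = rng.lognormal(0.0, 0.2) * np.roll(base, rng.integers(-2, 3))
    # Log-likelihood of the observed weeks under Gaussian noise.
    resid = observed - curve[:len(observed)]
    log_w.append(-0.5 * np.sum((resid / noise_sd) ** 2))
    candidates.append(curve)

candidates, log_w = np.array(candidates), np.array(log_w)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Posterior summaries of two forecasting targets.
print(f"posterior mean peak height ~ {np.sum(w * candidates.max(axis=1)):.1f}%")
print(f"posterior mean peak week   ~ {np.sum(w * candidates.argmax(axis=1)):.1f}")
```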

14.
The effect of eyespot throughout the season on wheat receiving different amounts of nitrogen was studied in pot experiments. Plants inoculated in December showed chlorosis of outer leaves in February. Among plants with high nitrogen, eyespot killed 11%, caused straggling of 31% and whiteheads in 14% of the surviving ear-bearing straws, reduced yield of straw by 8% and of grain by 16%. The loss in straw yield was due to reduction in plant number, that of grain was about half due to reduction in number of ears and half to production of lighter grains. Among plants with low nitrogen the disease killed 23% of the plants, caused straggling of 86% and whiteheads in 18% of the surviving ear-bearing straws, and reduced yield of straw by 23% and of grain by 44%. The loss in straw yield was due to death of plants, that of grain was about two-thirds due to the production of fewer ears and one-third to that of lighter grains. In the high-nitrogen series the number of shoots at the time of maximum tillering was reduced by 29%; in the low-nitrogen series the disease caused reduction in height, a very uneven crop, delay in ear and anther emergence, and an increase in tail corn from 4% in the controls to 30% in the inoculated plants.
All inoculated plants became infected, but those receiving high nitrogen had only 49% of the ear-bearing straws with severe lesions at harvest, while those receiving low nitrogen had 86%. The larger number of tillers produced when nitrogen was applied may have enabled the less severely diseased shoots to survive and bear ears while the most severely infected died.

15.
Samples from 200–300 randomly selected spring barley crops were taken annually at growth stage 73–77 (milky ripe) from 1976 to 1980. The number of samples from each region was proportional to the area of barley grown in each region. The percentage of the area of the top two leaves affected by diseases was recorded. Mildew (Erysiphe graminis) was the most widespread and severe disease recorded. Brown rust (Puccinia hordei) and rhynchosporium (Rhynchosporium secalis) occurred frequently but at relatively low levels. Yellow rust (Puccinia striiformis) and septoria (Septoria nodorum) were seen on less than 50% of the samples in most years, and halo spot (Selenophoma donacis) and net blotch (Pyrenophora teres) were rarely recorded. There was an association between the severity of rhynchosporium and the number of rain days in May and June. The highest levels of brown rust occurred in the south and east, and rhynchosporium was more common in Wales and the south-west than in the east, but there were no differences in the regional distribution of other diseases. Cultivar resistance, sowing date, previous cropping and fungicide usage were all found to be associated with altered disease levels. The proportion of crops treated with a foliar fungicidal spray rose from 26% in 1976 to 47% in 1980. The use of tridemorph declined but that of triadimefon increased, reaching 29% of crops treated by 1980. The use of ethirimol as a seed treatment declined from 16% of crops grown from treated seed in 1976 to 7% in 1980. Estimated yield losses between 1976 and 1980 varied between 4% and 9% due to mildew, between 0.3% and 0.8% due to brown rust, and between 0.2% and 0.5% due to rhynchosporium.

16.
Monoconidial cultures of Pyrenophora teres, the causal agent of barley net blotch, were isolated from leaves collected from six populations of the barley landrace "S'orgiu sardu" growing in five agro-ecological areas of Sardinia, Italy, and genotyped using AFLPs. The 150 isolates were from lesions of either the "net form" (P. teres f. sp. teres) or the "spot form" (P. teres f. sp. maculata) of the disease. Of 121 AFLP markers, 42% were polymorphic. Cluster analysis resolved the isolates into two strongly divergent groups (F(ST) = 0.79), corresponding to the net (45% of the isolates) and the spot (55% of the isolates) forms (designated the NFR and SFR groups, respectively). The absence of intermediate genotypes and the low number of shared markers between the two groups indicated that hybridization between the two formae is rare or absent under the field conditions of Sardinia. Five of the barley populations hosted both forms but in different proportions. The SFR populations were similar in overall polymorphism to the NFR populations. However, compared with the SFR form, the NFR form occurred in all fields sampled and showed a higher population divergence (F(ST) = 0.43 versus F(ST) = 0.09 with all isolates; F(ST) = 0.37 versus F(ST) = 0.06 with clone-corrected samples), probably due to a lower migration rate. AFLP fingerprints resolved 117 distinct genotypes among the 150 isolates sampled (78%): 87% in SFR and 68% in NFR isolates. Although the absolute numbers may be a function of the number of AFLP markers assayed, the relative difference suggests that clonality is more prevalent among the NFR isolates (with 11 of 46 haplotypes observed more than once) than among the SFR isolates (7 of 71 haplotypes). Both digenic and multilocus linkage disequilibrium analyses suggested that sexual reproduction occurs at significant levels within the NFR and SFR populations, and that the relative contribution of sexual and asexual reproduction varies among different environments.
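For readers unfamiliar with the divergence statistic reported above, the sketch below computes a heterozygosity-based F_ST for a single biallelic (AFLP-like) marker between two populations. The estimator choice and all allele frequencies are assumptions for illustration; the abstract does not state which F_ST estimator was used.

```python
def fst_biallelic(p1, p2, n1, n2):
    """Heterozygosity-based F_ST = (H_T - H_S) / H_T for one biallelic locus,
    with allele frequencies p1, p2 and sample sizes n1, n2 used as weights."""
    p_bar = (n1 * p1 + n2 * p2) / (n1 + n2)
    h_t = 2 * p_bar * (1 - p_bar)                                        # total heterozygosity
    h_s = (n1 * 2 * p1 * (1 - p1) + n2 * 2 * p2 * (1 - p2)) / (n1 + n2)  # mean within-population
    return 0.0 if h_t == 0 else (h_t - h_s) / h_t

# Illustrative marker frequencies in the NFR (n = 46) and SFR (n = 71) groups.
print(f"F_ST = {fst_biallelic(0.95, 0.10, 46, 71):.2f}")
```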

17.
Climate change is causing warmer and more variable temperatures as well as physical flux in natural populations, which will affect the ecology and evolution of infectious disease epidemics. Using replicate seminatural populations of a coevolving freshwater invertebrate-parasite system (host: Daphnia magna, parasite: Pasteuria ramosa), we quantified the effects of ambient temperature and population mixing (physical flux within populations) on epidemic size and population health. Each population was seeded with an identical suite of host genotypes and dose of parasite transmission spores. Biologically reasonable increases in environmental temperature caused larger epidemics, and population mixing reduced overall epidemic size. Mixing also had a detrimental effect on host populations independent of disease. Epidemics drove parasite-mediated selection, leading to a loss of host genetic diversity, and mixed populations experienced greater evolution due to genetic drift over the season. These findings further our understanding of how diversity loss will reduce the host populations' capacity to respond to changes in selection, thereby stymying adaptation to further environmental change.

18.
An experiment designed to generate a wide variety of Mycosphaerella graminicola epidemics without deliberate replication was carried out in two years. Disease severity was estimated at frequent intervals during the life of the crop and yields were measured at harvest. Absolute estimates of disease severity were derived by regression of visual estimates on subsamples of leaves on which severity was estimated objectively using an image analysis system. Yield was predicted best by the integral of the square root of M. graminicola severity over the normal lifetime of each leaf, measured in thermal time. Only the top two leaves contributed to yield loss; no influence of the third leaf on yield was detected. Thousand grain weight was best predicted by the integral of the square root of M. graminicola severity on the flag leaf alone. Parameter estimates were similar in the two years. The prediction equations were consistent with yields observed in an experiment done in a third year using two sowing dates and two rates of nitrogen fertilisation, despite a much greater range of disease severity. Although critical-point models could describe each year's results adequately, neither the parameter estimates nor the growth stage at which the best relation occurred were consistent across years. The equations to predict loss may be useful to farmers in prediction-based decision-support systems.
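The predictor described above can be written as the integral of √(severity) with respect to thermal time over a leaf's normal lifetime; a minimal sketch of computing it from periodic assessments follows, with invented assessment values.

```python
def sqrt_severity_integral(thermal_time, severity):
    """Integral of sqrt(% severity) over thermal time (trapezoidal rule) for one leaf layer.
    thermal_time in degree-days, severity in % leaf area affected, as paired assessments."""
    total = 0.0
    for i in range(len(thermal_time) - 1):
        a, b = severity[i] ** 0.5, severity[i + 1] ** 0.5
        total += (a + b) / 2 * (thermal_time[i + 1] - thermal_time[i])
    return total

# Illustrative flag-leaf assessments: cumulative degree-days and % severity.
tt = [0, 100, 200, 300, 400]
sev = [0, 1, 4, 12, 25]
print(f"sqrt-severity integral = {sqrt_severity_integral(tt, sev):.0f}")
```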

19.
Phytophthora ramorum, an invasive pathogen and the causal agent of Sudden Oak Death, has become established in mixed-evergreen and redwood forests in coastal northern California. While oak and tanoak mortality is the most visible indication of P. ramorum’s presence, epidemics are largely driven by the presence of bay laurel (Umbellularia californica), a reservoir host that supports both prolific sporulation in the winter wet season and survival during the summer dry season. In order to better understand how over-summer survival of the pathogen contributes to variability in the severity of annual epidemics, we monitored the viability of P. ramorum leaf infections over three years along with coincident microclimate. The proportion of symptomatic bay laurel leaves that contained viable infections decreased during the first summer dry season and remained low for the following two years, likely due to the absence of conducive wet-season weather during the study period. Over-summer survival of P. ramorum was positively correlated with high percent canopy cover, less negative bay leaf water potential and fewer days exceeding 30°C, but was not significantly different between mixed-evergreen and redwood forest ecosystems. Decreased summer survival of P. ramorum in exposed locations and during unusually hot summers likely contributes to the observed spatiotemporal heterogeneity of P. ramorum epidemics.

20.
Barley, Hordeum vulgare L., is well adapted to subarctic Alaska growing conditions, but little is known about its response to grasshopper defoliation. A field experiment was conducted to study dry matter and grain yield in response to a combination of grasshopper defoliation and weeds in 2002 and 2003 near Delta Junction, AK (63°55′ N, 145°20′ W). Barley plants at the third to fourth leaf stage were exposed to a combination of two levels of weeds (present or absent) and four densities of grasshoppers (equivalent to 0, 25, 50, and 75 grasshoppers per m²) of third to fourth instars of Melanoplus sanguinipes (F.). Dry matter accumulation by the barley plants was determined three times during the growing seasons: approximately 10 d after introduction of the grasshoppers, shortly after anthesis, and at maturity. Dry matter accumulation and grain yield were much lower in 2003 than in 2002, probably due to very low levels of soil moisture early in the growing season of 2003. Head clipping accounted for a greater portion of yield loss in 2003 than in 2002. The reduction in harvestable yield due to grasshoppers remained fairly constant between years (1.9 and 1.4 g per grasshopper per m² in 2002 and 2003, respectively) despite a large difference in overall yield. Examination of the yield components suggests that yields were reduced by the early season drought in 2003 primarily through fewer seeds per head, whereas grasshoppers in both years reduced average seed weight but not the number of seeds.
