Similar Articles (20 results)
1.
Samples from 200–400 randomly selected winter barley crops were taken annually at growth stage 71–73 from 1981 to 1991, with the exception of 1984 and 1985. The number of samples from each region was proportional to the area of barley grown in each region. The percentage of the area of the top two leaves affected by diseases and the severity of stem base diseases were recorded. Mildew (Erysiphe graminis) was the most widespread of the foliar diseases and in three years (1982, 1986 and 1991) was also the most severe. Rhynchosporium (Rhynchosporium secalis), net blotch (Pyrenophora teres) and brown rust (Puccinia hordei) were also prevalent in some years. Of the stem base diseases, fusarium was often the most widespread. Eyespot (Pseudocercosporella herpotrichoides) severity varied widely from year to year, ranging from 1.2% of stems affected by moderate or severe symptoms in 1982 to 24.1% in 1988. There were regional differences in the severity of mildew, rhynchosporium, brown rust, halo spot (Selenophoma donacis) and eyespot. Cultivar resistance affected disease severity, and previous cropping affected eyespot and, less frequently, mildew, rhynchosporium and net blotch. Eyespot and, to a lesser extent, sharp eyespot were less severe in late- than in early-sown crops. The percentage of crops treated with a fungicidal spray increased from 72% in 1981 to 95% in 1991. The use of benzimidazole fungicides for the control of eyespot declined in response to the development of resistance, and more recently the use of prochloraz also declined. Broad-spectrum DMI fungicides were widely used, and the use of morpholines to improve mildew control increased significantly. The proportion of crops grown from seed treated with a non-mercurial fungicidal seed dressing reached a peak of 47% in 1986 but subsequently declined to 22% in 1990 and 1991.

2.
In experiments during 1983–86 take-all was more severe, and eyespot and sharp eyespot less frequent, in 2nd–4th crops of winter wheat at Woburn (Beds.) than at Rothamsted (Herts.). Third crops had most take-all and yielded least grain. Against this background, small plots, 37 cm × 31 cm, in which all plants were sampled, were tried as a means of increasing experimental precision. They were arranged in fours in incomplete blocks, and blocks with complementary treatments (putative controls of take-all) were paired. Thirty of these block-pairs were distributed throughout each experimental site in each year to provide one replicate of the design for each of three sampling times: April, June and August. Unattributed variation in disease and plant growth for plots within blocks was compared to that in other strata (block-pairs and blocks within block-pairs) of the experiment. The variability amongst block-pairs scattered throughout the site was nearly always greater than that for blocks within block-pairs (98% of take-all assessments, 71% of soil infectivity estimates, 94% of eyespot and sharp eyespot assessments and 86% of all plant measurements). The variability of blocks within block-pairs exceeded that of plots much less frequently (56% and 69% of take-all assessments, 33% and 25% of soil infectivity measurements, 63% and 56% of eyespot and sharp eyespot assessments and 50% and 63% of plant measurements; Rothamsted and Woburn, respectively). Small plots were judged mostly on this last comparison, where a variance ratio in excess of 1 indicated that the small plots had decreased variability and increased precision. Variance ratios for different assessments of take-all indicate that small plots: i) most consistently decreased disease variability during the years of maximum disease, ii) were slightly less effective at Rothamsted than at Woburn, and iii) were usually less effective in fourth crops than in previous crops.
Soil infectivity was most uniform after crops with most disease and blocks were rarely more variable than plots. Except when disease was severe, soil infectivity in August tended to be positively associated with the yield of the crop just harvested. These findings reveal changes in the scale of disease patterns, both during the crop sequence and within individual crops, and suggest more than one scale of pattern in take-all-infested fields. This is discussed in relation to field experimentation and take-all decline.
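The variance-ratio comparison used above to judge the small plots (blocks within block-pairs versus plots within blocks) can be sketched as a nested-design mean-square calculation. All numbers below are invented for illustration; they are not data from the study.

```python
import numpy as np

# Hypothetical disease scores for 5 block-pairs, each containing
# 2 blocks of 4 small plots (values invented for illustration).
rng = np.random.default_rng(0)
scores = rng.normal(loc=20.0, scale=3.0, size=(5, 2, 4))

# Mean square for blocks within block-pairs:
block_means = scores.mean(axis=2)                       # shape (5, 2)
pair_means = block_means.mean(axis=1, keepdims=True)    # shape (5, 1)
ss_blocks = 4 * ((block_means - pair_means) ** 2).sum()
ms_blocks = ss_blocks / (5 * (2 - 1))                   # 5 degrees of freedom

# Mean square for plots within blocks:
ss_plots = ((scores - block_means[:, :, None]) ** 2).sum()
ms_plots = ss_plots / (5 * 2 * (4 - 1))                 # 30 degrees of freedom

# A ratio above 1 indicates blocks were more variable than plots,
# i.e. the small plots decreased variability and increased precision.
variance_ratio = ms_blocks / ms_plots
```

The same ratio could be formed for any stratum pair (e.g. block-pairs versus blocks) to compare variability across the strata described in the abstract.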

3.
Three successive crops of winter wheat or barley were grown as second, third and fourth cereals. Communities of fungi on shoot bases, identified after isolation on agar media, were more diverse (determined by number of taxa identified) on wheat than on barley, and their diversity increased from year to year. Diversity was not affected by seed treatments containing fluquinconazole or prochloraz. Eyespot (caused by Tapesia spp.) and brown foot rot (caused by Fusarium spp. or Microdochium nivale) increased from year to year. Eyespot, brown foot rot (after the first year) and sharp eyespot (which remained infrequent), assessed in summer (June), affected wheat more than barley. Eyespot severity was increased slightly on barley by treatments containing fluquinconazole, formulated with or without prochloraz, in the second year (third cereal), when it was also decreased slightly on wheat by fluquinconazole plus prochloraz, except in plots where the treatment had been applied for two successive years. The increases or decreases in eyespot in the second year were accompanied by, respectively, decreases or increases in the frequency of Idriella bolleyi where fluquinconazole was applied alone. Although the eyespot pathogen Tapesia yallundae (but not Tapesia acuformis) is sensitive to fluquinconazole in vitro, seed treatment, applied principally to control take-all disease, is likely to have only a small effect against eyespot (or other stem-base diseases), and then only on wheat and when formulated with prochloraz.

4.
Plots were fumigated with various amounts of D-D or 85% dazomet dust and sown with spring wheat given various amounts of nitrogenous fertilizer. Dazomet increased yield and decreased take-all disease in the first crop after application, but increased the disease in the second crop. Although D-D increased take-all slightly, it increased yield in 1966, but in 1967 it decreased yield and its use was associated with a severe ear deformity. Fumigation had little effect on eyespot, sharp eyespot, root browning (Fusarium spp.), or browning root rot (Pythium spp.), but decreased nematode damage where nematodes were numerous.

5.
In a replicated field experiment mean yields of wheat from plots that, in the preceding 2 years, had carried oats, beans or potatoes were 39.2 and 42.6 cwt. per acre in 1954 for Holdfast and Cappelle, respectively; 42.8 and 55.8 in 1955 and 34.9 and 49.6 in 1956. Previous wheat crops had more effect than any other treatment in increasing the incidence of eyespot, take-all and weeds and in decreasing the number of ears per unit area and the yield of grain. In 1956 on plots carrying the first, second and third successive wheat crops the percentages of straws with eyespot were respectively 12, 54 and 42 and with take-all 0.1, 1 and 16. Cappelle was less severely infected by eyespot than Holdfast. The second and third successive wheat crops yielded an average of 23.3 cwt./acre less than the first wheat crop. Cappelle consistently yielded more than Holdfast, the mean difference being 13.8 cwt./acre after potatoes but only 3.8 cwt./acre after two wheat crops. The higher seed-rate gave an average increase in grain yield of 3.3 cwt./acre; but where eyespot and take-all were both severe the lower seed-rate yielded as much total and more dressed grain than the higher. Wheat given a spring top dressing of 6 cwt./acre Nitro-Chalk yielded an average of 4 cwt./acre more grain than wheat given 3 cwt./acre.

6.
Assessments of Phialophora radicicola var. graminicola (PRG) and Gaeumannomyces graminis var. tritici (GGT) were made by culturing and by direct microscopic examination of pieces of seminal roots from 16 winter wheat crops grown in different cropping sequences and with different phosphate manuring. PRG occurred on all wheat crops, but was abundant only on wheat after grass, where it seemed to delay the onset of damaging take-all by 1 yr. Delayed occurrence of take-all by phosphate fertiliser was not related to differences in populations of PRG. Wheat grown in ‘take-all decline’ soils had only small amounts of PRG, indicating that the development and the decline of take-all epidemics may be influenced by different biological control mechanisms; breaking sequences of wheat crops by 1 yr grass leys might harness the advantages of both mechanisms.

7.
An experiment was made on the fourth, fifth and sixth successive crops of winter wheat to determine the effects of various treatments on the troubles which result from close cereal cropping. Eyespot and lodging were prevalent in the first year (1946); weeds in the second; eyespot, lodging, take-all and weeds in the third.
Spraying with H2SO4 reduced the incidence of eyespot, lodging and weeds, and increased yield of grain on plots which received sulphate of ammonia (by 2.7, 2.2 and 10.0 cwt./acre in successive years).
Sulphate of ammonia increased the incidence at harvest of eyespot and lodging, reduced take-all and consistently increased yield of straw. Eyespot and lodging reduced the effect of the fertilizer on yield of grain; take-all increased it.
Increase in seed rate increased the incidence of severe eyespot and of take-all; it increased lodging except when plants were dwarfed by take-all.
Weight of straw and percentage straws with severe eyespot lesions independently affected lodging, together accounting for 51% of the variance in percentage area lodged at harvest and 64% of that lodged 33 days earlier.
Mean yields of grain on untreated plots sown with 3–3½ bushels seed/acre fell from 26.0 to 22.5 to 11.7 cwt./acre in successive years, whereas yields of 28.4, 29.9 and 29.1 cwt./acre were obtained on sprayed plots sown with 1½–2 bushels seed/acre which received 4 cwt./acre sulphate of ammonia, showing that high yields were maintained when eyespot, lodging, take-all and weeds were controlled.
By 1948 yields of grain on unsprayed plots had fallen to the level of those on similarly manured plots on the continuous wheat experiment on Broadbalk field. Spraying increased grain by amounts similar to those resulting from one year's fallow on Broadbalk; but fallow had its greatest effects on plots with low nitrogen, spraying on those with high nitrogen.
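The "51% of the variance" figure above corresponds to the R² of a two-predictor least-squares regression of lodging on straw weight and severe eyespot. A minimal sketch of that calculation, using invented numbers rather than the experiment's data:

```python
import numpy as np

# Invented plot data (illustration only): straw weight, % straws with
# severe eyespot lesions, and % area lodged at harvest.
straw = np.array([30.0, 35.0, 40.0, 42.0, 45.0, 50.0, 55.0, 60.0])
eyespot = np.array([5.0, 10.0, 8.0, 20.0, 25.0, 18.0, 35.0, 40.0])
lodged = np.array([2.0, 8.0, 10.0, 22.0, 30.0, 25.0, 45.0, 55.0])

# Design matrix with an intercept column and both predictors.
X = np.column_stack([np.ones_like(straw), straw, eyespot])
coef, *_ = np.linalg.lstsq(X, lodged, rcond=None)
fitted = X @ coef

# R^2: the fraction of variance in lodging jointly accounted
# for by the two predictors.
ss_res = ((lodged - fitted) ** 2).sum()
ss_tot = ((lodged - lodged.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
```

With the experiment's own measurements in place of these arrays, `r2` would be the 0.51 (harvest) or 0.64 (33 days earlier) quoted in the abstract.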

8.
Reduction of Take-all Inoculum by Rotation with Lupins, Oats or Field Peas
The feasibility of using lupins, oats and field peas as alternative rotation crops to reduce inoculum of the take-all fungus (Gaeumannomyces graminis var. tritici) under Western Australian field conditions, and disease in following wheat, was investigated in a one-year field trial, the soil from which was used in two succeeding pot experiments. The possible mechanisms of reduction of inoculum and disease by these crops were examined by testing the soil for pathogen and disease suppression. Rotation with lupins or oats for two seasons reduced (P <0.05) inoculum of the take-all fungus, and lupins, oats or field peas reduced (P <0.05) disease in following wheat. Lupins alone reduced inoculum and disease (P <0.1) after one season. No apparent suppression of the pathogen in the absence of host plants was recorded after one season of rotation, but after two seasons lupins, oats or field peas all suppressed (P <0.02) growth of the pathogen within soil. However, only field pea soil suppressed take-all in comparison with the wheat control. Although after two seasons all rotation crops were effective in reducing inoculum and disease, the mechanisms of reduction appear to differ between the rotation crops used in this study.

9.
In crops of winter wheat (1986–88) or winter barley (1987–88) inoculated with W-type or R-type isolates of Pseudocercosporella herpotrichoides and sown on different dates (1986) or at different seed rates (1987, 1988), eyespot epidemics developed in different ways. Methods of measuring eyespot incidence/severity during crop growth were compared for their ability to predict eyespot severity at grain filling. Regressions were calculated for eyespot severity score at GS 71 on earlier measurements, either at GS 30/31 (11 methods) or from GS 22 to GS 65 (3 methods). Based on measurements at GS 30/31, all the methods predicted eyespot severity at GS 71 well in plots of winter barley inoculated with W-type isolates (r, 0.83–0.97) but the accuracy of predictions in plots inoculated with R-type isolates was very variable (r, 0.09–0.71). Predictions for 1987 and 1988 were less accurate in wheat than in W-type plots of barley, but did not differ between W-type and R-type plots (r, 0.70–0.89). When the wheat data for 1986 were also included, predictions were less accurate, especially in R-type plots (r, 0–0.59). Generally, it was easier to predict eyespot severity at GS 71 in W-type than in R-type plots, especially in barley and in wheat before GS 37/39. Predictions of eyespot severity at GS 71 based on measurements before GS 25 were inaccurate for both wheat and barley. After GS 25 the accuracy of the prediction was generally good in W-type plots and did not improve greatly except in wheat after GS 59. However, there was a steady improvement in the accuracy of the prediction in R-type plots of barley from GS 24 to GS 53. Assessments of eyespot incidence on stems predicted eyespot severity at GS 71 more accurately than assessments on leaf sheaths on wheat after GS 37/39, but were not as good on barley until GS 53.
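The prediction method described above amounts to regressing final severity (GS 71) on an earlier measurement and judging accuracy by the correlation coefficient r. A minimal sketch with invented scores (not data from these trials):

```python
import numpy as np

# Invented eyespot scores for eight plots: an early measurement at
# GS 30/31 and the final severity score at GS 71 (illustration only).
early = np.array([5.0, 8.0, 12.0, 15.0, 20.0, 22.0, 30.0, 35.0])
final = np.array([10.0, 14.0, 20.0, 24.0, 33.0, 35.0, 48.0, 55.0])

# Least-squares fit of final severity on the early measurement.
slope, intercept = np.polyfit(early, final, 1)
predicted = slope * early + intercept

# Correlation coefficient r, the statistic quoted in the abstract;
# values near 1 indicate the early measurement predicts GS 71 severity well.
r = np.corrcoef(early, final)[0, 1]
```

Repeating this fit for each assessment method and growth stage would yield the ranges of r (e.g. 0.83–0.97 for W-type barley plots) reported above.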

10.
An association genetics analysis was conducted to investigate the genetics of resistance to Septoria tritici blotch, caused by the fungus Zymoseptoria tritici (alternatively Mycosphaerella graminicola), in cultivars and breeding lines of wheat (Triticum aestivum) used in the UK between 1860 and 2000. The population was tested with Diversity Array Technology (DArT) and simple-sequence repeat (SSR or microsatellite) markers. The lines formed a single population with no evidence for subdivision, because there were several common ancestors of large parts of the pedigree. Quantitative trait loci (QTLs) controlling Septoria resistance were postulated on 11 chromosomes, but 38% of variation was not explained by the identified QTLs. Calculation of best linear unbiased predictions (BLUPs) identified lineages of spring and winter wheat carrying different alleles for resistance and susceptibility. Abundant variation in Septoria resistance may be exploited by crossing well-adapted cultivars in different lineages to achieve transgressive segregation and thus breed for potentially durable quantitative resistance, whereas phenotypic selection for polygenic quantitative resistance should be effective in breeding cultivars with increased resistance. The most potent allele reducing susceptibility to Septoria, on chromosome arm 6AL, was associated with reduced leaf size. Genes which increase susceptibility to Septoria may have been introduced inadvertently into UK wheat breeding programmes from cultivars used to increase yield, rust resistance and eyespot resistance between the 1950s and 1980s. This indicates the need to consider trade-offs in plant breeding when numerous traits are important and to be cautious about the use of non-adapted germplasm.

11.
In 1986–88 the development of eyespot lesions in winter wheat or winter barley differed between plots inoculated with W-type isolates of Pseudocercosporella herpotrichoides and plots inoculated with R-type isolates. In the spring of 1986, after a cold winter, the incidence (% shoots infected) and severity (number of leaf sheaths penetrated) of eyespot lesions in wheat before GS 30/31 were greater in plots inoculated with R-type isolates than in those inoculated with W-type isolates. In 1987, after a mild winter, eyespot incidence and severity in both wheat and barley were initially greater in W-type plots than in R-type plots. In 1988, when the crop was October-sown, eyespot incidence and severity were greater in W-type than in R-type plots at GS 30/31. Differences in eyespot incidence and severity between W-type and R-type plots were smaller in barley than in wheat. Both the incidence and severity of eyespot were greater in early-sown than in late-sown plots. Seed rate had little effect on the rate of lesion development in 1987, but in 1988 the rate of penetration was less at the low seed rate for both wheat and barley.

12.
The proteolytic activity of the leaf extracellular space of wheat cultivars Pigüé and Isla Verde was estimated after inoculation of either detached leaves or plants with the fungus Septoria tritici. Pigüé is resistant, whereas Isla Verde is susceptible to the disease caused by S. tritici. Viable conidiospores of the fungus caused similar increases in both hydrogen peroxide production and chitinase activity of the cultivars studied. In contrast, they caused a decrease in the extracellular serine proteinase activity of Isla Verde and a significant increase in that of Pigüé. Independently of the cultivar from which it was extracted, the extracellular serine proteinase inhibited the germination of Septoria tritici conidiospores. These results suggest that the proteolytic activity of the leaf extracellular space can participate in the defence of wheat plants against Septoria tritici. Its regulation may be controlled by specific defence components of each cultivar.

13.
In a series of experiments excised leaves from take-all infected wheat plants and from control plants were inoculated with Septoria nodorum. Larger lesions, more lesions/leaf and more pycnidia/unit area of lesion were produced from take-all plants. Significant effects of predisposition were demonstrated when only 3% of the area of the total root system was infected by take-all. Microscopical investigations revealed that germ-tubes of S. nodorum grew more rapidly on leaves from take-all plants, but the time of penetration was not affected. It is proposed that the observed effects of predisposition arose because more germ-tubes produced successful infections and host tissue was more rapidly colonised. The importance of these results for the epidemiology of glume blotch is discussed.

14.

Background and Aims

French wheat grains may be of little value on world markets because they have low and highly variable grain protein concentrations (GPC). This ratio of nitrogen yield to grain yield depends on crop nitrogen (N) fertilization as well as on the crop's capacity to use N, which is known to vary with climate and disease severity. Here an examination is made of the respective roles that N remobilization and post-anthesis N uptake play in N yield variations, in particular when wheat crops (Triticum aestivum) are affected by leaf rust (Puccinia triticina) and Septoria tritici blotch (teleomorph Mycosphaerella graminicola).

Methods

Data from a 4-year field experiment was used to analyse N yield variations in wheat crops grown either with a third or no late N fertilization. Natural aerial epidemics ensured a range of disease severity, and fungicide ensured disease-free control plots. The data set of Gooding et al. (2005, Journal of Agricultural Science 143: 503–518) was incorporated in order to enlarge the range of conditions.

Key Results

Post-anthesis N uptake accounted for a third of N yield whilst N remobilization accounted for two-thirds in all crops whether affected by diseases or not. However, variations in N yield were highly correlated with post-anthesis N uptake, more than with N remobilization, in diseased and also healthy crops. Furthermore, N remobilization did not significantly correlate with N yield in healthy crops. These findings matched data from studies using various wheat genotypes under various management and climatic conditions. Leaf area duration (LAD) accurately predicted N remobilization whether or not crops were diseased; in diseased crops, LAD also accurately predicted N uptake.

Conclusions

Under the experimental conditions, N yield variations were closely associated with post-anthesis N uptake in diseased but also in healthy crops. Understanding the respective roles of N uptake and N remobilization in the case of diseased and healthy crops holds the promise of better modelling of variations in N yield, and thus in GPC.

Key words: Triticum aestivum, Puccinia triticina, leaf rust, Mycosphaerella graminicola, Septoria tritici blotch, N uptake, N remobilization, N yield, leaf area duration

15.
This research was initiated to determine whether soils suppressive to take-all of wheat caused by Gaeumannomyces graminis var. tritici (Ggt) occur in Montana, and to identify the organisms most likely involved in this suppression. From an initial screening of eight soils collected from different wheat growing areas of Montana, two were highly suppressive to take-all. Microbial characterization of these soils indicated that different mechanisms were involved in the suppression. In Larslan soil, mycoparasitism appeared to be the main mechanism. Two different fungi with exceptional ability to reduce the severity of take-all were isolated from this soil. One of these fungi could parasitize the hyphae of Ggt. Field tests with these fungi in Ggt-infested soil showed increases of over 100% in both harvestable tillers and grain yield as compared to treatments without these two fungi. In tests with 48 different bacteria and 10 actinomycetes from Larslan soil, none were able to consistently reduce severity of take-all alone, or in mixtures. In Toston soil, antibiosis by actinomycetes and perhaps the involvement of Pseudomonas spp. in production of antibiotics and/or siderophores appeared to be the most likely mechanisms involved in take-all suppression. Increases in shoot dry weight over that in the Ggt-infested control using mixtures of pseudomonads and actinomycetes ranged from 25% to 87%. Actinomycetes added individually or in mixtures to soil infested with Ggt consistently reduced the severity of the disease to a greater extent than did mixtures of Pseudomonas spp.

16.
Samples from 200–300 randomly selected spring barley crops were taken annually at growth stage 73–77 (milky ripe) from 1976 to 1980. The number of samples from each region was proportional to the area of barley grown in each region. The percentage of the area of the top two leaves affected by diseases was recorded. Mildew (Erysiphe graminis) was the most widespread and severe disease recorded. Brown rust (Puccinia hordei) and rhynchosporium (Rhynchosporium secalis) occurred frequently but at relatively low levels. Yellow rust (Puccinia striiformis) and septoria (Septoria nodorum) were seen on less than 50% of the samples in most years, and halo spot (Selenophoma donacis) and net blotch (Pyrenophora teres) were rarely recorded. There was an association between the severity of rhynchosporium and the number of rain days in May and June. The highest levels of brown rust occurred in the south and east and rhynchosporium was more common in Wales and the south-west than in the east, but there were no differences in the regional distribution of other diseases. Cultivar resistance, sowing date, previous cropping and fungicide usage were all found to be associated with altered disease levels. The proportion of crops treated with a foliar fungicidal spray rose from 26% in 1976 to 47% in 1980. The use of tridemorph declined but that of triadimefon increased, reaching 29% of crops treated by 1980. The use of ethirimol as a seed treatment declined from 16% of crops grown from treated seed in 1976 to 7% in 1980. Estimated yield losses between 1976 and 1980 varied between 4% and 9% due to mildew, between 0.3% and 0.8% due to brown rust and between 0.2% and 0.5% due to rhynchosporium.

17.
In experiments with commercial seed of different cultivars at Rothamsted and Woburn, Bedfordshire in 1985–88, the severity of black dot on daughter tubers at harvest differed between cultivars. The disease was most severe on Desiree tubers. Amounts of disease were similar at both sites in 1986–88 but in 1985 it was more severe at Woburn than at Rothamsted. Disease-free seed of 12 (1987) or 15 (1988) cultivars was planted in experiments at Rothamsted (inoculated with Colletotrichum coccodes or not) and at Mepal, Cambridgeshire (not inoculated), and black dot was assessed at harvest in October 1987 and in September and October 1988. There were significant differences in the amount of disease on different cultivars and the order of severity was similar at the two sites, on the two harvest dates in 1988 and in both years. Desiree, Maris Piper, Maris Peer and Record were amongst those cultivars severely affected whereas Cara, Pentland Crown and Romano were least affected. Skin discoloration caused by black dot was more noticeable on white-skinned than red-skinned cultivars and was severe on the Dutch cultivars Estima, Marfona, Santé and Wilja.

18.
Winter wheat drilled directly into stubble or pasture treated with paraquat to kill the vegetation has been found to be less severely attacked by take-all (Ophiobolus graminis (Sacc.) Sacc.) and eyespot (Cercosporella herpotrichoides Fron) than wheat drilled after cultivation. The reduction of take-all is associated, not with a direct effect of the chemical, but with factors, resulting from the technique, which limit the rate of spread of the fungus in the undisturbed soil.

19.
Incidence and severity of the take-all disease in spring wheat and spring barley caused by Gaeumannomyces graminis (syn. Ophiobolus graminis) were studied during seven years of monoculture. The fungus apparently survived for much longer periods in the soil under non-susceptible break-crops than previously recorded. The incidence and severity of infection increased progressively with each successive cereal crop from initially low levels to a maximum within 3–7 years, which was followed by a progressive but limited decline in the disease. Spring wheat was more susceptible to take-all than spring barley and the development of take-all decline (TAD) was recorded earlier in the sequences of wheat than of barley crops. Nitrogen did not influence the disease until the point of maximum incidence and severity, when it caused a reduction in disease levels in addition to that associated with TAD. Factors influencing the time of onset and the rate of development of take-all and of TAD are discussed and possible explanations for TAD are suggested.

20.
Septoria tritici blotch, caused by Mycosphaerella graminicola (anamorph Septoria tritici), is one of the most important foliar diseases of wheat in much of the world. Susceptibility of host plants to septoria was investigated by cytogenetic analysis. A line of Hobbit sib (Dwarf A) in which translocated chromosome 5BS–7BS was nominally substituted by chromosome arms 5BS and 7BS from Bezostaya 1 had a much lower mean level of septoria than Hobbit sib itself. By the use of microsatellite markers, it was shown that the 5BS arm of this line had in fact been substituted by the homologous arm of Chinese Spring. Further investigation of substitution and nullitetrasomic lines demonstrated that chromosome arm 5BS of Hobbit sib possesses genes which either promote susceptibility to septoria or suppress resistance. This chromosome arm has previously been shown to carry genes for resistance to yellow (stripe) rust and powdery mildew, implying a trade-off between resistances to these two diseases and to septoria in wheat breeding. Bezostaya 1 was found to have specific resistance to M. graminicola isolate IPO323, probably controlled by the gene Stb6 on chromosome arm 3AS, present in numerous wheat cultivars. It also had partial resistance to septoria distributed over several chromosomes, which may explain the value of this cultivar as a source of septoria resistance.
