Similar articles
20 similar articles found (search time: 31 ms)
1.
The incidence of Septoria nodorum and Rhynchosporium secalis in N.I.A.B. cultivar trials of wheat and barley respectively showed marked regional and seasonal variation. Peak annual incidence of the two diseases often coincided. Levels of S. nodorum infection in winter wheat generally exceeded those in spring wheat, while levels of R. secalis in winter and spring barley were usually similar. Annual peaks in mean disease levels were associated with above-average rainfall and raindays and below-average temperatures, especially during the period of epidemic development. Regional variation in disease incidence reflected the association with high rainfall. S. nodorum infection in wheat was more widespread, and incidence of leaf infection was higher, than R. secalis infection in barley. Annual and regional incidence of S. nodorum and R. secalis in trials was similar to that recorded in the Plant Pathology Laboratory surveys of commercial crops. Disease records from cultivar trials can provide useful additional information on the behaviour of S. nodorum and R. secalis nationally.
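As a hedged illustration of the rainfall association reported above, the sketch below correlates an annual disease-incidence series with rainfall during the epidemic period; all numbers are invented, not the survey's data.

```python
# Minimal sketch: Pearson correlation between annual mean disease
# incidence and rainfall during the epidemic period. The two series
# are hypothetical, for illustration only.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

incidence = [4.2, 9.1, 3.5, 12.0, 6.8]   # hypothetical % leaf area infected
rainfall = [180, 260, 150, 300, 210]      # hypothetical mm rainfall
print(f"r = {pearson(incidence, rainfall):.2f}")
```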

2.
Historical datasets have much to offer. We analyse data from winter wheat, spring and winter barley, oil seed rape, sugar beet and forage maize from the UK National List and Recommended List trials over the period 1948–2007. We find that since 1982, for the cereal crops and oil seed rape, at least 88% of the improvement in yield is attributable to genetic improvement, with little evidence that changes in agronomy have improved yields. In contrast, in the same time period, plant breeding and changes in agronomy have contributed almost equally to increased yields of forage maize and sugar beet. For the cereals prior to 1982, contributions from plant breeding were 42, 60 and 86% for winter barley, winter wheat and spring barley, respectively. These results demonstrate the overwhelming importance of plant breeding in increasing crop productivity in the UK. Winter wheat data are analysed in more detail to exemplify the use of historical data series to study and detect disease resistance breakdown, sensitivity of varieties to climatic factors, and also to test methods of genomic selection. We show that breakdown of disease resistance can cause biased estimates of variety and year effects, but that comparison of results between fungicide treated and untreated trials over years may be a means to screen for durable resistance. We find the greatest sensitivities of the winter wheat germplasm to seasonal differences in rainfall and temperature are to summer rainfall and winter temperature. Finally, for genomic selection, correlations between observed and predicted yield ranged from 0.17 to 0.83. The high correlation resulted from markers predicting kinship amongst lines rather than tagging multiple QTL. We believe the full value of these data will come from exploiting links with other experiments and experimental populations. However, not to exploit such valuable historical datasets is wasteful.
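One simple way to split yield gains into breeding and agronomy components, in the spirit of (though not necessarily identical to) the analysis described above, is to regress trial yield jointly on each variety's year of introduction (genetic trend) and on the calendar year of the trial (agronomic trend). The records and the additive model below are illustrative assumptions, not the authors' exact method.

```python
# Hedged sketch: decompose yield improvement into a genetic trend
# (slope on variety introduction year) and an agronomic trend
# (slope on trial calendar year) by least squares.
import numpy as np

# Hypothetical records: (variety introduction year, trial year, yield t/ha)
records = [
    (1985, 1990, 6.9), (1985, 1995, 7.1), (1992, 1995, 7.6),
    (1992, 2000, 7.8), (1999, 2000, 8.3), (1999, 2005, 8.4),
]
intro = np.array([r[0] for r in records], dtype=float)
year = np.array([r[1] for r in records], dtype=float)
yld = np.array([r[2] for r in records])

# Design matrix: intercept, introduction year, trial year
X = np.column_stack([np.ones_like(intro), intro, year])
(beta0, b_genetic, b_agronomy), *_ = np.linalg.lstsq(X, yld, rcond=None)
total = b_genetic + b_agronomy
print(f"genetic trend:  {b_genetic:.3f} t/ha/yr "
      f"({100 * b_genetic / total:.0f}% of total gain)")
print(f"agronomy trend: {b_agronomy:.3f} t/ha/yr")
```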

3.
A survey of foliar diseases of spring barley in England and Wales in 1967 (total citations: 3; self-citations: 0; by others: 3)
A total of 5,250 acres of spring barley was sampled in proportion to the barley acreage in each of eight regions. The percentage leaf area ‘affected’ by each disease was recorded for the first (flag) and second leaves at a growth stage when the grain was milky ripe. The operation was automated by punching the data directly on to computer paper tape, and a programme was written to analyse the results. Mildew was found to be the most important disease, causing an average loss in yield of approximately 18%, followed in descending order by brown rust 3%, leaf blotch 1%, yellow rust 1% and halo spot 1%. The total loss due to foliar diseases was estimated to be 20–25% of the national barley yield. Diseases were more severe in the southern than the northern regions; e.g. mildew severity was three times higher because of the greater popularity of mildew-susceptible varieties and the prevalence of weather more conducive to mildew development. Disease severity was not related to previous cropping, and in general the February-sown crops had more mildew than the April-sown crops.

4.
P. Frei, K. Gindro, H. Richter, S. Schürch. Journal of Phytopathology, 2007, 155(5): 281-288
Ramularia collo-cygni (Sutton & Waller) is involved in a disease complex of barley characterized by the formation of necrotic spots on leaves. Isolation of this fungus is difficult, which complicates the study of its epidemiology and the aetiology of the disease complex. A new assay based on the polymerase chain reaction (PCR) was developed to detect the presence of R. collo-cygni (Rcc) without prior isolation of the fungus or purification of its DNA. Primers RC3 and RC5 were designed to amplify a 348 bp fragment of the internal transcribed spacer region of this pathogen. These primers were highly specific to Rcc, as no cross-reactions were observed with other barley pathogens or saprobes commonly found on this crop. Amplification was possible from crude extracts (direct PCR), circumventing the need for DNA purification. Detection of 1 fg of target DNA was achieved with a single PCR. This specific and sensitive assay was used to study the epidemiology of Rcc on winter and spring barley at two locations in Switzerland. Rcc was present on winter barley from snow melt until harvest and gradually colonized all leaf layers. It was also found on volunteers, which, together with weeds, could allow the pathogen to survive between successive barley crops. The fungus was detected on spring barley only after it had sporulated on winter barley, indicating that Rcc probably spreads from winter to spring barley. The hypothesis that Rcc is a seed-borne pathogen can be tested more easily with this fast and reliable molecular tool, which may also find applications in barley breeding programmes and fungicide trials.
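As a rough in-silico illustration of what such a primer pair does, the sketch below predicts an amplicon length by locating the forward primer and the reverse complement of the reverse primer on a template; the sequences are placeholders, not the published RC3/RC5 primers.

```python
# Minimal sketch of in-silico PCR: find both primer binding sites in
# correct orientation and report the predicted amplicon length.
# All sequences below are invented placeholders.
import re

def reverse_complement(seq):
    comp = str.maketrans("ACGT", "TGCA")
    return seq.translate(comp)[::-1]

def amplicon_length(template, fwd, rev):
    """Predicted amplicon length, or None if the primers do not both
    bind the template in PCR orientation."""
    f = re.search(fwd, template)
    r = re.search(reverse_complement(rev), template)
    if f and r and r.end() > f.start():
        return r.end() - f.start()
    return None

template = "A" * 50 + "GCATTCGACTGAC" + "T" * 300 + "GTCCATAGCGTAA" + "C" * 50
fwd_primer = "GCATTCGACTGAC"                       # hypothetical forward primer
rev_primer = reverse_complement("GTCCATAGCGTAA")   # hypothetical reverse primer
print(amplicon_length(template, fwd_primer, rev_primer))  # 326 here
```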

5.
Samples from 200–300 randomly selected spring barley crops were taken annually at growth stage 73–77 (milky ripe) from 1976 to 1980. The number of samples from each region was proportional to the area of barley grown in each region. The percentage of the area of the top two leaves affected by diseases was recorded. Mildew (Erysiphe graminis) was the most widespread and severe disease recorded. Brown rust (Puccinia hordei) and rhynchosporium (Rhynchosporium secalis) occurred frequently but at relatively low levels. Yellow rust (Puccinia striiformis) and septoria (Septoria nodorum) were seen on less than 50% of the samples in most years, and halo spot (Selenophoma donacis) and net blotch (Pyrenophora teres) were rarely recorded. There was an association between the severity of rhynchosporium and the number of rain days in May and June. The highest levels of brown rust occurred in the south and east, and rhynchosporium was more common in Wales and the south-west than in the east, but there were no differences in the regional distribution of other diseases. Cultivar resistance, sowing date, previous cropping and fungicide usage were all found to be associated with altered disease levels. The proportion of crops treated with a foliar fungicidal spray rose from 26% in 1976 to 47% in 1980. The use of tridemorph declined but that of triadimefon increased, reaching 29% of crops treated by 1980. The use of ethirimol as a seed treatment declined from 16% of crops grown from treated seed in 1976 to 7% in 1980. Estimated yield losses between 1976 and 1980 varied between 4% and 9% due to mildew, between 0.3% and 0.8% due to brown rust and between 0.2% and 0.5% due to rhynchosporium.
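The sampling design described above (sample numbers proportional to regional barley area) can be sketched as follows; the region names and areas are invented for illustration.

```python
# Minimal sketch: allocate a fixed number of survey samples to regions
# in proportion to the barley area grown in each, using
# largest-remainder rounding so the allocations sum exactly.
barley_area_kha = {"East": 420, "South-west": 180, "Wales": 90, "North": 310}

def allocate(total_samples, areas):
    """Proportional allocation with largest-remainder rounding."""
    total_area = sum(areas.values())
    exact = {r: total_samples * a / total_area for r, a in areas.items()}
    alloc = {r: int(v) for r, v in exact.items()}
    leftover = total_samples - sum(alloc.values())
    # Give remaining samples to the regions with the largest remainders
    for r in sorted(exact, key=lambda r: exact[r] - alloc[r], reverse=True)[:leftover]:
        alloc[r] += 1
    return alloc

print(allocate(250, barley_area_kha))
```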

6.
Sprays of captafol, carbendazim, carbendazim + tridemorph + maneb, diclobutrazol, triadimefon or triadimefon + carbendazim all completely protected barley plants in a glasshouse against R. secalis for at least 30 days. However, their effectiveness in preventing disease development when applied after inoculation differed: triadimefon, triadimefon + carbendazim and diclobutrazol were the most effective, completely preventing symptom development when applied up to 5 days after inoculation to plants grown above 16 °C, and up to 8 days after inoculation below 8 °C. All the fungicides decreased the number of viable conidia produced by leaf blotch lesions and, when applied to infected plants at GS 30, greatly decreased the upward spread of the disease under simulated rain conditions; the most effective fungicides in these respects were triadimefon and triadimefon + carbendazim. The above fungicides and fungicide mixtures, together with the recently introduced materials fenpropimorph and propiconazole, were applied to diseased winter barley crops in winter or in spring. All treatments decreased leaf blotch development and increased yields. In most cases, a winter application was more effective than spring applications, particularly if applied in mid-November. The most effective fungicides were triadimefon and propiconazole. The field trial data fitted well with the performance predicted by the glasshouse investigations.

7.
Ammonium nitrate fertilizer labelled with 15N was applied in spring to winter wheat growing in undisturbed monoliths of clay and sandy loam soil in lysimeters; the rates of application were 95 kg N ha−1 (clay, spring 1976) and 102 kg N ha−1 (sandy loam, spring 1975). Crops of winter wheat, oilseed rape, peas and barley grown in the following 5 or 6 years were treated with unlabelled nitrogen fertilizer at rates recommended for maximum yields. During each year of the experiments the lysimeters were divided into treatments which were either freely drained or subjected to periods of waterlogging. Another labelled nitrogen application was made in 1980 to a separate group of lysimeters with a clay soil and a winter wheat crop to study further the uptake of nitrogen fertilizer in relation to waterlogging. In the first growing season, shoots of the winter wheat at harvest contained 46 and 58% of the fertilizer nitrogen applied to the clay and sandy loam soils respectively. In the following year the crops contained a further 1–2% of the labelled fertilizer, and after 5 and 6 years the total recoveries of labelled fertilizer in the crops were 49 and 62% on the clay and sandy loam soils respectively. In the first winter after the labelled fertilizer was applied, less than 1% of the fertilizer was lost in the drainage water, and only about 2% of the total nitrogen (mainly nitrate) in the drainage water from both soils was derived from the fertilizer. The maximum annual loss occurred the following year, but the proportion of tracer nitrogen in drainage was nevertheless smaller. Leaching losses over the 5 and 6 years from the clay and sandy loam soils were respectively 1.3 and 3.9% of the original application. On both soils the proportion of labelled nitrogen in the total crop nitrogen content was greater after a period of winter waterlogging than for freely drained treatments. This was most marked on the clay soil; the evidence points to winter waterlogging promoting denitrification, the consequent loss of soil nitrogen making the crop more dependent on spring fertilizer applications.

8.
Macdonald A.J., Poulton P.R., Stockdale E.A., Powlson D.S., Jenkinson D.S. Plant and Soil, 2002, 246(1): 123-137
An earlier paper (Macdonald et al., 1997; J. Agric. Sci. (Cambridge) 129, 125) presented data from a series of field experiments in which 15N-labelled fertilizers were applied in spring to winter wheat, winter oilseed rape, potatoes, sugar beet and spring beans grown on four different soils in SE England. Part of this N was retained in the soil and some remained in crop residues on the soil surface when the crop was harvested. In all cases the majority of this labelled N remained in organic form. In the present paper we describe experiments designed to follow the fate of this 'residual' 15N over the next 2 years (termed the first and second residual years) and measure its value to subsequent cereal crops. Averaging over all of the initial crops and soils, 6.3% of this 'residual' 15N was taken up during the first residual year when the following crop was winter wheat, and significantly less (5.5%) if it was spring barley. In the second year after the original application, a further 2.1% was recovered, this time by winter barley. Labelled N remaining after potatoes and sugar beet was more available to the first residual crop than that remaining after oilseed rape or winter wheat. By the second residual year, this difference had almost disappeared. The availability to subsequent crops of the labelled N remaining in or on the soil at harvest of the application year decreased in the order: silty clay loam > sandy loam > chalky loam > heavy clay. In most cases, only a small proportion of the residual fertilizer N available for plant uptake was recovered by the subsequent crop, indicating poor synchrony between the mineralization of 15N-labelled organic residues and crop N uptake. Averaging over all soils and crops, 22% of the labelled N applied as fertilizer was lost (i.e., unaccounted for in harvested crop and soil to a depth of 100 cm) by harvest in the year of application, rising to 34% at harvest of the first residual year and to 35% in the second residual year. In the first residual year, losses of labelled N were much greater after spring beans than after any of the other crops.
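A worked sketch of the mass balance behind the percentages quoted above: labelled N not recovered in harvested crop or in soil to 100 cm counts as lost. The crop/soil split below is an assumption chosen only to reproduce the quoted 22% loss in the application year.

```python
# Illustrative 15N mass balance; the crop and soil recoveries are
# assumed values, not figures from the paper.
applied = 100.0                 # applied labelled N, as % of itself
crop_uptake_y0 = 55.0           # assumed % recovered in harvested crop
soil_residue_y0 = 23.0          # assumed % retained in soil + residues
lost_y0 = applied - crop_uptake_y0 - soil_residue_y0
print(f"lost by first harvest: {lost_y0:.0f}%")   # matches the quoted 22%

# First residual year: 6.3% of the residual N recovered by winter wheat
uptake_y1 = 0.063 * soil_residue_y0
print(f"recovered by the next wheat crop: {uptake_y1:.2f}% of the application")
```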

9.
Temporal flux in the morphological and molecular diversity of UK barley (total citations: 15; self-citations: 5; by others: 10)
Genetic-diversity assessments, using both phenotypic and molecular-marker data, were made on a collection of 134 barley varieties (both winter and spring types), chosen on the basis of their representation on the NIAB "Recommended List" over the period 1925-1995. Genotypic (AFLP and SSR) and phenotypic (UPOV characters) data were analysed to determine short- and long-term temporal trends in diversity over the period. A consistent pattern emerged demonstrating that only a minor proportion of the overall variance appears to be the result of any temporal drift, although there were strong indications of qualitative shifts in diversity, probably related to the changing relative acreage of winter and spring barleys over the study period. Our overall conclusions are that systematic plant breeding does not inevitably lead to a reduction in the genetic diversity of agricultural crops, and that diverse breeding programmes and the variety delivery systems in place in the UK have generally been successful in maintaining sufficient genetic diversity to allow the steady rise in genetic potential that has been a feature of 20th century crop breeding. The concentration of breeding effort into a smaller number of independent programmes is likely to be prejudicial to the maintenance of the genetic diversity of a crop.
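The abstract does not specify a diversity statistic, but marker-based assessments of this kind commonly report Nei's gene diversity, H = 1 − Σp², computed from allele frequencies per locus and averaged over loci. A minimal sketch with invented frequencies:

```python
# Minimal sketch: Nei's gene diversity (expected heterozygosity)
# per locus, averaged over loci. Allele frequencies are invented.
def nei_diversity(allele_freqs):
    """Nei's gene diversity for one locus: H = 1 - sum(p_i^2)."""
    return 1.0 - sum(p * p for p in allele_freqs)

loci = [
    [0.6, 0.4],        # biallelic (AFLP-like) locus
    [0.5, 0.3, 0.2],   # multiallelic (SSR-like) locus
]
mean_h = sum(nei_diversity(locus) for locus in loci) / len(loci)
print(f"mean gene diversity H = {mean_h:.3f}")
```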

10.
A large number of accessions of covered and naked barley from eastern Nepal were grown without vernalization, and it was found that naked barley accessions were predominantly spring varieties while covered barley accessions were predominantly winter varieties. Seven accessions were subjected to a range of vernalization periods. Four naked varieties were spring varieties, although one showed some response to vernalization, but the three covered barleys were winter varieties. Although the majority of naked barleys are spring forms, they are winter-sown at high altitudes, and this does not conform to the distribution of naked barley described by Takahashi (1955). Wheat accessions which came from villages situated at high altitudes tended to have higher vernalization requirements than those which came from lower altitudes. This was taken to indicate local adaptation and a low movement of seeds (gene flow) between villages. The relationship between vernalization requirement and altitude was not found in barley. Marked but contrasting regional patterns of vernalization requirement occurred in the wheat and covered barley. It was concluded that gene flow was greater within regions than between them. This regional isolation, together with environmental heterogeneity, constitutes a major diversity-promoting mechanism.

11.
Samples from 200–400 randomly selected winter barley crops were taken annually at growth stage 71–73 from 1981 to 1991, with the exception of 1984 and 1985. The number of samples from each region was proportional to the area of barley grown in each region. The percentage of the area of the top two leaves affected by diseases and the severity of stem base diseases were recorded. Mildew (Erysiphe graminis) was the most widespread of the foliar diseases and in three years (1982, 1986 and 1991) was also the most severe. Rhynchosporium (Rhynchosporium secalis), net blotch (Pyrenophora teres) and brown rust (Puccinia hordei) were also prevalent in some years. Of the stem base diseases, fusarium was often the most widespread. Eyespot (Pseudocercosporella herpotrichoides) severity varied widely from year to year, ranging from 1.2% of stems affected by moderate or severe symptoms in 1982 to 24.1% in 1988. There were regional differences in the severity of mildew, rhynchosporium, brown rust, halo spot (Selenophoma donacis) and eyespot. Cultivar resistance affected disease severity, and previous cropping affected eyespot and, less frequently, mildew, rhynchosporium and net blotch. Eyespot and, to a lesser extent, sharp eyespot were less severe in late- than in early-sown crops. The percentage of crops treated with a fungicidal spray increased from 72% in 1981 to 95% in 1991. The use of benzimidazole fungicides for the control of eyespot declined in response to the development of resistance, and more recently the use of prochloraz also declined. Broad-spectrum DMI fungicides were widely used, and the use of morpholines to improve mildew control increased significantly. The proportion of crops grown from seed treated with a non-mercurial fungicidal seed dressing reached a peak of 47% in 1986 but subsequently declined to 22% in 1990 and 1991.

12.
Samples from 300–400 randomly selected winter wheat crops were taken annually at growth stage 73–75 from 1976 to 1988, with the exception of 1983 and 1984. The number of samples from each region was proportional to the area of wheat grown in each region. The percentage of the area of the top two leaves affected by diseases, the severity of ear and stem base diseases and, in 6 years, the severity of take-all were recorded. Septoria tritici and Septoria nodorum were, on average, the most severe of the foliar diseases, and eyespot (Pseudocercosporella herpotrichoides) was the most severe of the stem base diseases. Regional differences in levels of S. tritici, brown rust, sharp eyespot and nodal fusarium were significant. Cultivar resistance affected disease severity, and previous cropping patterns particularly affected take-all and eyespot. Eyespot and sharp eyespot were less severe in late- than in early-sown crops. The percentage of crops treated with a fungicidal spray increased from 14% in 1976 to over 90% between 1983 and 1985. Use of benzimidazole fungicides applied at growth stage 31 declined, while use of morpholines from flag leaf emergence onwards increased between 1985 and 1988.

13.
Precision of dryland wheat regional trials in China and comprehensive evaluation of test environments (total citations: 1; self-citations: 0; by others: 1)
Yield data from the national dryland winter and spring wheat regional trials conducted in China from 2003 to 2009, covering 4 wheat-growing regions, 233 varieties (lines) and 82 test sites, were used to study the precision, environmental discriminating ability and representativeness of China's national dryland wheat regional trial sites and to evaluate the test environments comprehensively. The results showed that the mean coefficient of variation (CV) of trial error for single-site, single-year trials was 6.1% and the mean relative least significant difference (RLSD) was 10.5%; CV values for multi-site, single-year trials were all within 8.2%. CV and RLSD were well controlled at most sites, so both trial precision and the precision of variety comparisons were high. Site discriminating ability was strongest in the northwest spring wheat group, with little difference among the other three production regions. Site representativeness was best in the northeast spring wheat group and poorest in the northwest spring wheat group. Taking both site discriminating ability and representativeness into account, a comprehensive environment-evaluation parameter (rg-h) was constructed with the aid of the GGE model. The results showed that only 32.4% of China's dryland wheat trial sites were ideal; comparing production regions, the proportions of ideal sites were, in descending order: northwest spring wheat dryland group (40.9%) > northeast spring wheat dryland group (33.3%) > Huang-Huai winter wheat dryland group (30.4%) > northern winter wheat dryland group (21.4%).
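Under standard definitions (which may differ in detail from the paper's), the two precision statistics reported above can be computed as follows; the inputs are invented but chosen to give values near those reported.

```python
# Hedged sketch of the two trial-precision statistics:
# CV   = 100 * sqrt(error MS) / trial mean
# RLSD = 100 * LSD / trial mean, with LSD = t * sqrt(2 * MSe / r)
from scipy import stats

def cv_and_rlsd(error_ms, trial_mean, reps, error_df, alpha=0.05):
    cv = 100.0 * error_ms ** 0.5 / trial_mean
    t = stats.t.ppf(1 - alpha / 2, error_df)   # two-sided critical value
    lsd = t * (2.0 * error_ms / reps) ** 0.5
    return cv, 100.0 * lsd / trial_mean

# Invented inputs: error mean square 0.09 (t/ha)^2, mean yield 5 t/ha,
# 3 replicates, 20 error degrees of freedom
cv, rlsd = cv_and_rlsd(error_ms=0.09, trial_mean=5.0, reps=3, error_df=20)
print(f"CV = {cv:.1f}%, RLSD = {rlsd:.1f}%")
```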

14.
Annual patterns of mildew on winter and spring barley, wheat and oats at NIAB trial sites for 1958-68 are reported. High intensities of mildew were preceded by early infections, particularly at sites where both winter and spring crops were infected early. Relative earliness and severity of infections in all six cereal crops were generally similar at any site. This pattern was repeated within defined mildew ‘regions’, i.e. north, east and west. Sprowston (Norfolk) was atypical of the eastern sites, showing patterns of mildew infection more characteristic of sites in the west, where mildew values were 50% higher than elsewhere. Dates of first visible infection became later further eastwards and northwards, particularly in winter cereals. This suggested the possibility of dispersal of inoculum by prevailing winds to the north, north-east and east, or a similar progression in climatic factors favourable to mildew development. Estimates of loss of potential yield in the trial plots of sites in each region and over the whole country were calculated for winter wheat, spring barley and spring oats using the formulae of Large and Doling, for which supplementary confirmation was provided.
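The Large and Doling formulae referred to above take a square-root form, loss% = k·√S, where S is the percentage mildew severity at a defined growth stage; the constants below are assumed for illustration and should be checked against the original papers.

```python
# Hedged sketch of a Large & Doling-style yield-loss estimate.
# The crop-specific constants are assumptions, not verified values.
import math

K = {"wheat": 2.0, "barley": 2.5}   # assumed constants, for illustration

def mildew_yield_loss(crop, severity_pct):
    """Estimated % yield loss from % mildew severity (square-root form)."""
    return K[crop] * math.sqrt(severity_pct)

for s in (1, 4, 9, 16):
    print(f"barley, {s}% mildew -> {mildew_yield_loss('barley', s):.1f}% loss")
```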

15.
Incidence and severity of the take-all disease in spring wheat and spring barley caused by Gaeumannomyces graminis (syn. Ophiobolus graminis) were studied during seven years of monoculture. The fungus apparently survived for much longer periods in the soil under non-susceptible break-crops than previously recorded. The incidence and severity of infection increased progressively with each successive cereal crop from initially low levels to a maximum within 3–7 years, which was followed by a progressive but limited decline in the disease. Spring wheat was more susceptible to take-all than spring barley and the development of take-all decline (TAD) was recorded earlier in the sequences of wheat than of barley crops. Nitrogen did not influence the disease until the point of maximum incidence and severity, when it caused a reduction in disease levels in addition to that associated with TAD. Factors influencing the time of onset and the rate of development of take-all and of TAD are discussed and possible explanations for TAD are suggested.

16.
In 1996 and 1997 a field survey of the abundance and species composition of cereal aphid primary and secondary parasitoids in spring barley, winter wheat and durum wheat was conducted in Zealand, Denmark. The purpose was to create a better understanding of the mechanisms underlying aphid–parasitoid dynamics in the field. Such an understanding can be used when developing biological control methods in cereals. In both years aphid attacks in cereals began in late June and never exceeded the economic threshold. In 1996 the first aphids were found in wheat on 26 June; in 1997 the first aphids were found on 24 June on both crops. The highest densities reached in 1996 were an average of six aphids per shoot in winter wheat and one aphid per shoot in spring barley. In 1997 the highest densities reached were 11 aphids per shoot in winter wheat and four aphids per shoot in spring barley. The aphid population collapsed by the end of July to early August in 1996, but by mid-July in 1997. The onset and peak of parasitization were delayed in comparison to aphid infestation. Parasitism was 20–60% by the end of the cropping season in spring barley, and 30–80% in winter wheat and durum wheat in 1996. In 1997 parasitism did not exceed 3–11% in barley and was less than 2% in one winter wheat field but more than 40% in the other winter wheat field sampled. In both years most parasitism was due to Aphidiidae (Hymenoptera). The two dominant species were Aphidius ervi Haliday and Aphidius rhopalosiphi De Stefani-Perez. Hyperparasitism began after primary parasitism and increased progressively during the cropping season. The two years were similar in many respects, including the species composition of aphids and parasitoids. The late start of the aphid infestation may have contributed to the high level of parasitization found in 1996, but in 1997 the aphid infestation period was so short that a parasitoid population did not have time to build up.

17.
Two separate surveys of root diseases of cereals in the Western Australian (WA) cereal belt were conducted: the first annually for wheat and barley during 1976–1982 and the second for wheat during 2005–2007. For the 1976–1982 survey, the cereal belt was divided into 15 zones based on location and rainfall. Sampling was representative of the actual cropping area, with both wheat and barley sampling sites selected by zone as a percentage of total sites. Over 31 000 plants were assessed from a total of 996 fields. Average take-all incidence ranged from 3% in the northern low rainfall zone to 57% in the southern high rainfall zone. Other root diseases assessed included rhizoctonia root rot, fusarium crown rot and subcrown internode discolouration. During the 2005–2007 survey, around 20 000 plants from a total of 210 fields being intensively cropped with cereals were surveyed for take-all, rhizoctonia root rot, fusarium crown rot, common root rot, root lesion nematode and cereal cyst nematode. The 2005–2007 survey results indicated that root and crown diseases prevailed in paddocks frequently cropped with cereals and occurred at damaging levels across all WA cropping districts surveyed. The more recent survey identified the fungal diseases rhizoctonia root rot and fusarium crown rot, and the root lesion nematode, as the most serious impediments to intensive cereal production, particularly in the southern region of WA. Comparing the 2005–2007 results with the 1976–1982 survey, the relative importance of take-all appears to have declined over the past 30 years.

18.
Fusarium head blight is an important disease of cereal crops caused by Fusarium species. It causes not only a reduction in yield; most Fusarium species (F. graminearum, F. culmorum, F. avenaceum, F. poae) also produce a range of toxic metabolites such as deoxynivalenol (DON) and zearalenone (ZEA). Fusarium species were monitored under natural infection conditions during the 2001–2002 and 2002–2003 growing seasons in two varietal winter wheat experiments on the experimental farm of the Hogeschool Gent at Bottelare. Disease pressure, DON and ZEA content, the Fusarium species present, and growth and yield parameters were determined. In both years there were significant differences between the varieties in susceptibility to Fusarium and in DON content. ZEA was not found in the kernels. The mean DON content was higher in 2002 (1.126 mg/kg) than in 2003 (0.879 mg/kg), although the mean disease severity was greater in 2003 than in 2002, which means that DON content was not always correlated with disease severity. The Fusarium species most frequently identified in the two field trials (Bottelare) were F. graminearum and F. culmorum. Varietal differences in susceptibility to Fusarium species and DON contamination could be detected.

19.
Plant diseases are caused by pathogen populations continuously subjected to evolutionary forces (gene flow, selection, and recombination). Ascochyta blight, caused by Mycosphaerella pinodes, is one of the most damaging necrotrophic pathogens of field peas worldwide. In France, both winter and spring peas are cultivated. Although these crops overlap by about 4 months (March to June), primary Ascochyta blight infections are not synchronous on the two crops. This suggests that the disease could be due to two different M. pinodes populations, specialized on either winter or spring pea. To test this hypothesis, 144 pathogen isolates were collected in the field during the winter and spring growing seasons in Rennes (western France), and all the isolates were genotyped using amplified fragment length polymorphism (AFLP) markers. Furthermore, the pathogenicities of 33 isolates randomly chosen from the collection were tested on four pea genotypes (2 winter and 2 spring types) grown under three climatic regimes, simulating winter, late winter, and spring conditions. M. pinodes isolates from winter and spring peas were genetically polymorphic but not differentiated according to cultivar type. Isolates from winter pea were more pathogenic than isolates from spring pea on hosts raised under winter conditions, while isolates from spring pea were more pathogenic than those from winter pea on plants raised under spring conditions. These results show that the disease that develops on winter and spring peas is initiated by a single population of M. pinodes whose pathogenicity is a plastic trait modulated by the physiological status of the host plant.

20.
Three successive crops of winter wheat or barley were grown as second, third and fourth cereals. Communities of fungi on shoot bases, identified after isolation on agar media, were more diverse (determined by the number of taxa identified) on wheat than on barley, and their diversity increased from year to year. Diversity was not affected by seed treatments containing fluquinconazole or prochloraz. Eyespot (caused by Tapesia spp.) and brown foot rot (caused by Fusarium spp. or Microdochium nivale) increased from year to year. Eyespot, brown foot rot (after the first year) and sharp eyespot (which remained infrequent), assessed in summer (June), affected wheat more than barley. Eyespot severity was increased slightly on barley by treatments containing fluquinconazole, formulated with or without prochloraz, in the second year (third cereal), when it was also decreased slightly on wheat by fluquinconazole plus prochloraz, except in plots where the treatment had been applied for two successive years. The increases or decreases in eyespot in the second year were accompanied by, respectively, decreases or increases in the frequency of Idriella bolleyi where fluquinconazole was applied alone. Although the eyespot pathogen Tapesia yallundae (but not Tapesia acuformis) is sensitive to fluquinconazole in vitro, seed treatment, applied principally to control take-all disease, is likely to have only a small effect against eyespot (or other stem-base diseases), and then only on wheat and when formulated with prochloraz.
