Similar Documents
20 similar documents found (search time: 46 ms)
1.
In a field experiment on winter wheat, take-all on plants and the infectivity of the soil were studied in crop sequences with different combinations of sowing dates. Take-all was negligible in the first wheat crop, but thereafter the mean disease intensity (measured using a take-all rating, TAR, with a maximum of 300) was 108, 190, 118 and 251 in the second to fifth successive crops. In each growing season, the disease differed amongst sequences and built up more rapidly and was more intense on plants sown in mid-September than on plants sown in mid-October. In late-sown plots, where volunteers had been present during the mid-September to mid-October period, take-all reached an intensity intermediate between that in early-sown plots and that in late-sown plots that had been kept free of volunteers. Volunteers, therefore, partially offset the expected beneficial effect of decreased disease with later sowing. Differences in take-all amongst sequences were most pronounced in the second wheat crop, and early sowing of the previous wheat increased the intensity of disease. In the following (third) crop, differences in disease intensity amongst sequences were smaller. Soil infectivity (measured by seedling bioassay after harvest) built up progressively from a low level after the first crop to peak after the third crop. In this build-up phase, soil infectivity estimates were always numerically greater after harvest of early-sown treatments than after later-sown treatments, although never significant at P = 0.05. The greatest difference (P = 0.06) was recorded in October before sowing of the third crop, where the comparison was between soil after two previous early sowings and soil after two previous later sowings with control of volunteers. In the same autumn, presence of green cover (i.e. volunteers) was associated with a smaller loss of soil infectivity between harvest and later sowing than occurred in the absence of green cover. In 2nd–4th crops, where comparisons were available and mean TARs indicated moderate levels of take-all, sowing later had no yield benefit, despite the greater take-all and soil infectivity associated with early sowing. Important considerations for the management of crops at risk of take-all are: 1) choosing appropriate sowing dates to minimize take-all or to encourage take-all decline, and 2) controlling volunteers and weed hosts where crops are sown late to minimize take-all.
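The take-all rating quoted above is not defined in the abstract; the sketch below assumes the common Rothamsted convention, in which plants are scored slight, moderate or severe and the rating is a weighted sum of the percentage of plants in each class (weights 1, 2 and 3, giving the stated maximum of 300 when every plant is severely infected). The function name and sample counts are illustrative.

```python
# Hedged sketch of a take-all rating (TAR) on a 0-300 scale, assuming the
# usual convention: TAR = 1*(% slight) + 2*(% moderate) + 3*(% severe).

def take_all_rating(n_slight: int, n_moderate: int, n_severe: int, n_total: int) -> float:
    """Return a TAR between 0 (no disease) and 300 (all plants severe)."""
    if n_total == 0:
        raise ValueError("sample contains no plants")
    pct = 100.0 / n_total  # percentage contributed by each plant
    return (1 * n_slight + 2 * n_moderate + 3 * n_severe) * pct

# Example: 20 slight, 10 moderate and 5 severe plants out of 50 sampled
print(take_all_rating(20, 10, 5, 50))  # 110.0
```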

2.
Dulout, Anne; Lucas, Philippe; Sarniguet, Alain; Doré, Thierry. Plant and Soil 1997, 197(1):149-155
Two experiments were carried out in France in which disease indices were used to evaluate the effects of wheat volunteers and blackgrass (Alopecurus myosuroides) on soil infectivity and soil conduciveness to take-all caused by Gaeumannomyces graminis var. tritici. Soil infectivity was evaluated by measuring the disease index on susceptible wheat plants grown on soil samples collected from the field. Soil conduciveness to the disease was obtained by measuring disease indices on plants grown on soil samples to which different amounts of take-all fungus inoculum were added. One experiment (Expt. 1) was carried out using soils from farmers' fields (two fields in 1994 and two in 1995); soil infectivity and soil conduciveness were evaluated for three experimental situations: bare soil, soil with wheat volunteers and soil with blackgrass plants. In 1994 the soil infectivity was zero in bare soil, high with the wheat cover, and intermediate with the blackgrass cover. In 1995 the soil infectivity was uniformly low for all three conditions. Soils bearing wheat were less conducive than bare soil; soils bearing blackgrass and bare soils were similarly conducive. A second experiment (Expt. 2), carried out in 1995, compared the soil infectivity and soil conduciveness to take-all of soils planted with wheat or blackgrass in set-aside land after periods of wheat monoculture of 0–6 yr. The soil infectivity was low for all treatments. The soil was more conducive after blackgrass than after wheat. In both cases, the soil conduciveness was less when the monoculture had continued for more than 4 yr. The decline was less after blackgrass than after wheat. Thus, whenever set-aside is set up during the increase phase of the disease in fields with cereal successions, abundant wheat volunteers might hinder the expected positive effect of a break in cereal successions on take-all development. The presence of blackgrass in a set-aside field, with significant soil infectivity and high soil conduciveness, might increase the risks of take-all development in a wheat crop following set-aside.

3.
Experiments were conducted in fields which had a history of nil to four rice (Oryza sativa L.) crops during the previous four summers. Incorporating stubble after each harvest reduced soil nitrate-N content between crops, but increased soil N mineralization potential. During the fourth successive crop, plots where stubble had been incorporated after the previous three harvests had an average 21% more soil NH4-N and 22% more N uptake than plots where stubble had been burnt. Soil fertility fell rapidly with increasing numbers of crops, and the unfertilized fifth crop accumulated approximately half the N (60 kg N ha-1) found in the unfertilized first crop (116 kg). Fertilizer N alleviated the effects of annual cropping; the application of 210 kg N ha-1 to the fifth crop (uptake of 156 kg N ha-1) resulted in similar N uptake to the first crop fertilized with 50 kg N ha-1 (154 kg N ha-1). Applying N at sowing had no significant effect on soil NH4-N concentration after permanent flood (PF), while N application at PF resulted in increased NH4-N concentration and N uptake until panicle initiation (PI). N applied at PI increased soil NH4-N concentration at least until the microspore stage. Management factors such as stubble incorporation and increasing N application rate maintained N supply and enabled successive rice crops to accumulate similar quantities of N at maturity.

4.
Incidence and severity of the take-all disease in spring wheat and spring barley caused by Gaeumannomyces graminis (syn. Ophiobolus graminis) were studied during seven years of monoculture. The fungus apparently survived for much longer periods in the soil under non-susceptible break-crops than previously recorded. The incidence and severity of infection increased progressively with each successive cereal crop from initially low levels to a maximum within 3–7 years, which was followed by a progressive but limited decline in the disease. Spring wheat was more susceptible to take-all than spring barley and the development of take-all decline (TAD) was recorded earlier in the sequences of wheat than of barley crops. Nitrogen did not influence the disease until the point of maximum incidence and severity, when it caused a reduction in disease levels in addition to that associated with TAD. Factors influencing the time of onset and the rate of development of take-all and of TAD are discussed and possible explanations for TAD are suggested.

5.
At the end of the spring 1987 growing season, the mycoparasite Sporidesmium sclerotivorum was applied at 0, 0.2, 2 or 20 kg ha-1 to lettuce plants infected with Sclerotinia minor. Disease incidence was monitored in the same plots for five subsequent crops (three fall and two spring crops) without additional application of either pathogen or mycoparasite. Logistic growth curves were fitted to the data to describe disease progression over time for each inoculum level within each of the five crops. Within each crop, increasing the quantity of mycoparasite inoculum resulted in positive horizontal displacement of the curve with respect to time. As quantities of inoculum of S. sclerotivorum increased, inflection points of the disease progress curves increased at a decreasing rate. Thus, additional mycoparasite inoculum resulted in ever-smaller increases in inflection point, and after a certain threshold level of mycoparasite inoculum (< 0.2 kg ha-1), increases in inflection point did not result in meaningful increases in harvestable lettuce. Maximum rates of disease increase were not different among the treatments within each crop, but were different between crops. Maximum rates of disease increase averaged 3.4, 3.4, 2.1, 3.6 and 1.5% day-1 for the fall 1987, spring 1988, fall 1988, spring 1989, and fall 1989, respectively. At all inoculum levels, the fall epidemics began later after planting than the spring epidemics.
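As a concrete illustration of the curve-fitting step described above, the sketch below fits a three-parameter logistic disease-progress curve and reads off the inflection point and the maximum rate of disease increase (the slope at the inflection, K*r/4 for this parameterisation). The observations are invented for illustration, not the experiment's data.

```python
# Sketch: fit y(t) = K / (1 + exp(-r*(t - t_infl))) to disease incidence.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t_infl):
    return K / (1.0 + np.exp(-r * (t - t_infl)))

days = np.array([10, 20, 30, 40, 50, 60, 70], dtype=float)      # days after planting
incidence = np.array([2, 5, 14, 35, 60, 78, 85], dtype=float)   # % plants diseased (invented)

(K, r, t_infl), _ = curve_fit(logistic, days, incidence, p0=[90.0, 0.1, 40.0])
max_rate = K * r / 4.0  # steepest slope of the logistic, at the inflection point
print(f"K = {K:.1f}%, inflection at {t_infl:.1f} days, max rate {max_rate:.2f} %/day")
```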

6.
Plants have evolved strategies of stimulating and supporting specific groups of antagonistic microorganisms in the rhizosphere as a defense against diseases caused by soilborne plant pathogens, owing to a lack of genetic resistance to some of the most common and widespread soilborne pathogens. Some of the best examples of natural microbial defense of plant roots occur in disease-suppressive soils. Soil suppressiveness against many different diseases has been described. Take-all is an important root disease of wheat, and soils become suppressive to take-all when wheat or barley is grown continuously in a field following a disease outbreak; this phenomenon is known as take-all decline (TAD). In Washington State, USA and The Netherlands, TAD results from the enrichment during monoculture of populations of 2,4-diacetylphloroglucinol (2,4-DAPG)-producing Pseudomonas fluorescens to a density of 10^5 CFU/g of root, the threshold required to suppress the take-all pathogen, Gaeumannomyces graminis var. tritici. 2,4-DAPG-producing P. fluorescens also are enriched by monoculture of other crops such as pea and flax, and evidence is accumulating that 2,4-DAPG producers contribute to the defense of plant roots in many different agroecosystems. At this time, 22 distinct genotypes of 2,4-DAPG producers (designated A–T, PfY and PfZ) have been defined by whole-cell repetitive sequence-based (rep)-PCR analysis, restriction fragment length polymorphism (RFLP) analysis of phlD, and phylogenetic analysis of phlD, but the number of genotypes is expected to increase. The genotype of an isolate is predictive of its rhizosphere competence on wheat and pea. Multiple genotypes often occur in a single soil and the crop species grown modulates the outcome of the competition among these genotypes in the rhizosphere. 2,4-DAPG producers are highly effective biocontrol agents against a variety of plant diseases and ideally suited for serving as vectors for expressing other biocontrol traits in the rhizosphere.

7.
The effect of rice culture on changes in the number of a strain of soybean root-nodule bacteria (Bradyrhizobium japonicum CB1809), already established in the soil by growing inoculated soybean crops, was investigated in transitional red-brown earth soils at two sites in south-western New South Wales. At the first site, 5.5 years elapsed between the harvest of the last of four successive crops of soybean and the sowing of the next. In this period three crops of rice and one crop of triticale were sown, and in the intervals between these crops, and after the crop of triticale, the land was fallowed. Before sowing the first rice crop, the number of Bradyrhizobium japonicum was 1.32×10^5 g-1 soil. The respective numbers of bradyrhizobia after the first, second and third rice crops were 4.52×10^4, 1.26×10^4 and 6.40×10^2 g-1 soil. In the following two years the population remained constant. Thus sufficient bradyrhizobia survived in soil to nodulate and allow N2-fixation by the succeeding soybean crop. At the second site, numbers of bradyrhizobia declined during a rice crop, but the decline was less than when the soil was fallowed (400-fold cf. 2200-fold). Multiplication of bradyrhizobia was rapid in the rhizosphere of soybean seedlings sown without inoculation in the rice bays. At 16 days after sowing, their numbers were not significantly different (p < 0.05) from those in plots where rice had not been sown. Nodulation of soybeans was greatest in plots where rice had not been grown, but yield and grain nitrogen were not significantly different (p < 0.05). Our results indicate that flooding soil has a deleterious effect on the survival of bradyrhizobia but, under the conditions of the experiments, sufficient B. japonicum strain CB1809 survived to provide good nodulation after three crops of rice covering a total period of 5.5 years between crops of soybean.

8.

Background and aims

Take-all, caused by the fungus Gaeumannomyces graminis var. tritici, is the most damaging root disease of wheat. A severe attack often leads to premature ripening and death of the plant, resulting in a reduction in grain yield and effects on grain quality (Gutteridge et al. in Pest Manag Sci 59:215–224, 2003). Premature death of the plant could also lead to inefficient use of applied nitrogen (Macdonald et al. in J Agric Sci 129(2):125–154, 1997). The aim of this study was to determine crop N uptake and the amount of residual mineral N in the soil after harvest where different severities of take-all had occurred.

Methods

Plant and soil samples were taken at anthesis and final harvest from areas showing good and poor growth (later confirmed to be caused by take-all disease) in three winter wheat crops grown on the same soil type on Rothamsted Farm in SE England in 1995, 2007 and 2008 (harvest sampling only). All crops received fertiliser N in spring at recommended rates (190–200 kg N ha-1). On each occasion, crops were assessed for severity of take-all infection (TAR), and crop N uptake and soil mineral N (nitrate plus ammonium; SMN) were determined. Grain yields were also measured.

Results

Grain yields (at 85% dry matter) of crops with moderate infection (good crops) ranged from 4.3 to 13.0 t ha-1, compared with only 0.9–4.5 t ha-1 for those with severe infection (poor crops). There were significant (P < 0.05) negative relationships between crop N uptake and TAR at anthesis and final harvest. At harvest, good crops contained 129–245 kg N ha-1 in grain, straw and stubble, of which 85–200 kg N ha-1 was in the grain. In contrast, poor crops contained only 46–121 kg N ha-1, of which only 22–87 kg N ha-1 was in the grain. Positive relationships between SMN and TAR were found at anthesis and final harvest. The SMN in the 0–50 cm layer following harvest of poor crops was significantly (P < 0.05) greater than that under good crops, and most (73–93%) was present as nitrate.

Conclusions

Localised patches of severe take-all infection decreased the efficiency with which hexaploid wheat plants recovered soil- and fertiliser-derived N, and increased the subsequent risk of nitrate leaching. The risk of gaseous N losses to the atmosphere from these areas may also have been enhanced.

9.
This paper reports results from a 3-year field experiment which examined nitrogen (N) leaching loss from land under various set-aside managements. Four treatments were examined: three ploughed treatments, sown with wheat, sown with ryegrass, or maintained as fallow; the fourth treatment was unploughed, with natural weed growth (volunteers) permitted. The 1-year set-aside was followed by two winter wheat test crops. Ceramic suction cups were installed at a depth of 90 cm and used to collect drainage water. N leaching loss was calculated by multiplying drainage volume, calculated from meteorological data, by its inorganic N concentration. Set-aside management significantly affected N leaching loss over the three years. During the set-aside year, the peak nitrate concentration from the unploughed treatment growing volunteer weeds was significantly lower than that from ploughed plots. Of the latter, by the spring, crop (i.e. wheat and ryegrass) assimilation of N had significantly reduced N concentration compared to the fallow. The four set-aside treatments had a carry-over effect to the following year (first wheat test crop), resulting in significant differences in N losses. Leaching following the ryegrass treatment was very small and we believe that the grass residues minimised rates of net N mineralization. The influence of set-aside management continued to the second wheat test crop, where N loss was greater under the all-wheat rotation because take-all had reduced yield and therefore crop N uptake.
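A minimal sketch of the leaching calculation described above (drainage volume multiplied by inorganic N concentration) follows, assuming drainage expressed in mm and concentration in mg N per litre; the unit handling and function name are assumptions, not the paper's own code.

```python
# Sketch: N leaching loss (kg N/ha) for one drainage period.
# 1 mm of drainage over 1 ha = 10,000 L; mg -> kg needs a factor of 1e6.

def n_leaching_loss(drainage_mm: float, conc_mg_n_per_l: float) -> float:
    """Return kg N ha-1 leached: drainage volume x inorganic N concentration."""
    litres_per_ha = drainage_mm * 10_000.0
    return litres_per_ha * conc_mg_n_per_l / 1e6

# Example: 25 mm of drainage at 15 mg N/L -> 3.75 kg N/ha
print(n_leaching_loss(25.0, 15.0))
```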

10.
The yield of wheat and the incidence of take-all were measured in crops grown in six different 4-year sequences, repeated in 3 successive years. The first crop of winter wheat grown after oats or beans yielded 13–23 cwt/acre (1632–2887 kg/ha) more grain than wheat after wheat or barley. Spring wheat after oats yielded 2–5 cwt/acre (250–625 kg/ha) more than spring wheat after wheat. The smaller yields of wheat after wheat or barley were caused mostly by greater prevalence of take-all. Regression analysis indicates that each 1% increase in straws with take-all decreased yield of winter wheat by 0.6%. Take-all was more prevalent in the second and third successive wheat crops after oats than in the fourth crop.
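As a worked reading of the regression result above, assuming the fitted relationship is linear in percentage terms (an illustration, not the authors' published equation): with disease-free yield Y0 and x the percentage of straws with take-all,

```latex
% Illustrative use of the reported slope (0.6% yield loss per 1% straws infected):
\[ Y(x) \approx Y_0\left(1 - 0.006\,x\right), \qquad Y(30) \approx 0.82\,Y_0 , \]
% i.e. a winter wheat crop with 30% of straws infected would be expected to
% yield roughly 82% of the disease-free yield.
```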

11.
Six sterol biosynthesis-inhibiting fungicides representing several combinations of properties were applied to soil to control naturally-occurring take-all (caused by Gaeumannomyces graminis var. tritici) in winter wheat in field experiments in two successive years. The average take-all severity category was never more than moderate in the different clay-loam and sandy loam sites used in each year. At each site in each year there were six treatments and an untreated control in an arrangement based on a balanced-incomplete-block design for six treatments in 10 blocks each with three treatments. Each block had three treated plots and a control plot and was paired with the complementary block of three treatments (plus control) to form a complete replicate, of which there were 30 per site. Take-all assessments in June or July showed that after incorporation into the seed bed (at 2 kg ha-1 and sometimes at 1 kg ha-1) in autumn, two non-volatile, strongly lipophilic compounds, nuarimol and triadimenol, with good intrinsic toxicity to the take-all fungus and slow rates of degradation, partially controlled take-all. However, another compound, flutriafol, with similar properties to nuarimol and triadimenol, controlled take-all less. Two slightly volatile, strongly lipophilic compounds, flusilazole and penconazole, with good intrinsic activity, were less effective (at 2 kg ha-1). A volatile, less lipophilic compound, PP 969, with less intrinsic activity, also partially controlled take-all, but only after application as a drench in the spring (2 kg ha-1). The most effective treatments were generally more effective the greater the level of disease (as indicated by assessments of disease in control plots), especially in spring assessments of disease. Although flutriafol did not perform as expected, it still seems reasonable to conclude that the requirements for a soil-applied fungicide to control take-all are likely to be: (i) good intrinsic fungitoxicity, (ii) some mobility in soil water (i.e. not strongly lipophilic), and (iii) season-long persistence.
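The design parameters quoted above can be checked with the standard balanced-incomplete-block identities; the worked values below assume the classical definitions (the untreated control plot added to each block sits outside these parameters).

```latex
% BIBD with v = 6 treatments, b = 10 blocks, k = 3 treatments per block:
\[ r = \frac{bk}{v} = \frac{10 \times 3}{6} = 5, \qquad
   \lambda = \frac{r(k-1)}{v-1} = \frac{5 \times 2}{5} = 2 , \]
% so each treatment is replicated 5 times per design and every pair of
% treatments occurs together in exactly 2 blocks.
```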

12.
An experiment was made on the fourth, fifth and sixth successive crops of winter wheat to determine the effects of various treatments on the troubles which result from close cereal cropping. Eyespot and lodging were prevalent in the first year (1946); weeds in the second; eyespot, lodging, take-all and weeds in the third.
Spraying with H2SO4 reduced the incidence of eyespot, lodging and weeds, and increased yield of grain on plots which received sulphate of ammonia (by 2.7, 2.2 and 10.0 cwt./acre in successive years).
Sulphate of ammonia increased the incidence at harvest of eyespot and lodging, reduced take-all and consistently increased yield of straw. Eyespot and lodging reduced the effect of the fertilizer on yield of grain, take-all increased it.
Increase in seed rate increased the incidence of severe eyespot and of take-all; it increased lodging except when plants were dwarfed by take-all.
Weight of straw and percentage straws with severe eyespot lesions independently affected lodging, together accounting for 51% of the variance in percentage area lodged at harvest and 64% of that lodged 33 days earlier.
Mean yields of grain on untreated plots sown with 3–3½ bushels seed/acre fell from 26.0 to 22.5 to 11.7 cwt./acre in successive years, whereas yields of 28.4, 29.9 and 29.1 cwt./acre were obtained on sprayed plots sown with 1½–2 bushels seed/acre which received 4 cwt./acre sulphate of ammonia, showing that high yields were maintained when eyespot, lodging, take-all and weeds were controlled.
By 1948 yields of grain on unsprayed plots had fallen to the level of those on similarly manured plots on the continuous wheat experiment on Broadbalk field. Spraying increased grain by amounts similar to those resulting from one year's fallow on Broadbalk; but fallow had its greatest effects on plots with low nitrogen, spraying on those with high nitrogen.

13.
In experiments during 1983–86 take-all was more severe and eyespot and sharp eyespot less frequent in 2nd–4th crops of winter wheat at Woburn (Beds.) than at Rothamsted (Herts.). Third crops had most take-all and yielded least grain. Against this background, small plots, 37 cm × 31 cm, in which all plants were sampled, were tried as a means of increasing experimental precision. They were arranged in fours in incomplete blocks, and blocks with complementary treatments (putative controls of take-all) were paired. Thirty of these block-pairs were distributed throughout each experimental site in each year to provide one replicate of the design for each of three sampling times: April, June and August. Unattributed variation in disease and plant growth for plots within blocks was compared to that in other strata (block-pairs and blocks within block-pairs) of the experiment. The variability amongst block-pairs scattered throughout the site was nearly always greater than that for blocks within block-pairs (98% of take-all assessments, 71% of soil infectivity estimates, 94% of eyespot and sharp eyespot assessments and 86% of all plant measurements). The variability of blocks within block-pairs exceeded that of plots much less frequently (56% and 69% of take-all assessments, 33% and 25% of soil infectivity measurements, 63% and 56% of eyespot and sharp eyespot assessments and 50% and 63% of plant measurements; Rothamsted and Woburn, respectively). Small plots were judged mostly on this last comparison, where a variance ratio in excess of 1 indicated that the small plots had decreased variability and increased precision. Variance ratios for different assessments of take-all indicate that small plots: i) most consistently decreased disease variability during the years of maximum disease, ii) were slightly less effective at Rothamsted than at Woburn, and iii) were usually less effective in fourth crops than in previous crops. Soil infectivity was most uniform after crops with most disease and blocks were rarely more variable than plots. Except when disease was severe, soil infectivity in August tended to be positively associated with the yield of the crop just harvested. These findings reveal changes in the scale of disease patterns, both during the crop sequence and within individual crops, and suggest more than one scale of pattern in take-all-infested fields. This is discussed in relation to field experimentation and take-all decline.
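A minimal sketch of the stratum comparison described above, for a balanced toy layout of 30 block-pairs, 2 blocks per pair and 4 small plots per block: nested mean squares are computed for each stratum, and a blocks-to-plots variance ratio above 1 is the criterion used for the small plots having decreased variability. Random numbers stand in for the (treatment-adjusted) residuals; the sizes and names are illustrative, not the paper's analysis code.

```python
# Sketch: mean squares for a balanced nested layout
# (block-pairs > blocks within pairs > plots within blocks).
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=(30, 2, 4))  # (pairs, blocks per pair, plots per block)

grand = y.mean()
pair_means = y.mean(axis=(1, 2))          # one mean per block-pair
block_means = y.mean(axis=2)              # one mean per block

ss_pairs = 2 * 4 * ((pair_means - grand) ** 2).sum()
ss_blocks = 4 * ((block_means - pair_means[:, None]) ** 2).sum()
ss_plots = ((y - block_means[..., None]) ** 2).sum()

ms_pairs = ss_pairs / (30 - 1)            # block-pairs stratum
ms_blocks = ss_blocks / (30 * (2 - 1))    # blocks within block-pairs
ms_plots = ss_plots / (30 * 2 * (4 - 1))  # plots within blocks
print(ms_pairs, ms_blocks, ms_plots, "ratio:", ms_blocks / ms_plots)
```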

14.
In a replicated field experiment mean yields of wheat from plots that, in the preceding 2 years, had carried oats, beans or potatoes were 39.2 and 42.6 cwt. per acre in 1954 for Holdfast and Cappelle, respectively; 42.8 and 55.8 in 1955 and 34.9 and 49.6 in 1956. Previous wheat crops had more effect than any other treatment in increasing the incidence of eyespot, take-all and weeds and in decreasing the number of ears per unit area and the yield of grain. In 1956 on plots carrying the first, second and third successive wheat crops the percentages of straws with eyespot were respectively 12, 54 and 42 and with take-all 0.1, 1 and 16. Cappelle was less severely infected by eyespot than Holdfast. The second and third successive wheat crops yielded an average of 23.3 cwt./acre less than the first wheat crop. Cappelle consistently yielded more than Holdfast, the mean difference being 13.8 cwt./acre after potatoes but only 3.8 cwt./acre after two wheat crops. The higher seed-rate gave an average increase in grain yield of 3.3 cwt./acre; but where eyespot and take-all were both severe the lower seed-rate yielded as much total and more dressed grain than the higher. Wheat given a spring top dressing of 6 cwt./acre Nitro-Chalk yielded an average of 4 cwt./acre more grain than wheat given 3 cwt./acre.

15.
To increase our understanding of the fate of applied nitrogen in Phaseolus vulgaris crops grown under tropical conditions, 15N-labelled urea was applied to bean crops and followed for three consecutive cropping periods. Each crop received 100 kg urea-N ha-1 and 41 kg KCl-K ha-1. At the end of each period we estimated each crop's recovery of the added nitrogen, the residual effects of nitrogen from the previous cropping period, the distribution of nitrogen in the soil profile, and leaching losses of nitrogen. In addition, to evaluate potential effects of added phosphorus on nitrogen cycling in this crop, beans were treated at planting with either 35 kg rock-phosphate-P, 35 kg superphosphate-P, or 0 kg P ha-1. Results showed that 31.2% of the nitrogen in the first crop was derived from the applied urea, which represents a nitrogen utilization efficiency of 38.5%. 6.2% of the nitrogen in the second crop was derived from fertilizer applied to the first crop, and 1.4% of the nitrogen in the third crop. Nitrogen utilization efficiencies for these two crops, with respect to the nitrogen applied to the first crop, were 4.6 and 1.2%, respectively. In total, the three crops recovered 44.3% of the nitrogen applied to the first crop. The remainder of the nitrogen was either still in the soil profile or had been lost by leaching, volatilization or denitrification. 15N enrichment of mineral N (NO3 + NH4) suggests that at the end of the second crop, the pulse of fertilizer applied to the first crop had probably passed the 120 cm depth. 15N enrichment of organic N suggests that root activity of beans and weeds transported nitrogen to 90–120 cm (or deeper). We could account for 109 kg fertilizer-N ha-1 in harvested biomass, crop residue, and soil at the end of the first cropping period. This indicates an experimental error of about 10% if no nitrogen was lost by volatilization, denitrification, or leaching below 120 cm. At the end of the second and third crops, 76 and 80 kg N ha-1, respectively, could be accounted for, suggesting that 20 to 25% of the applied N was lost from the system over a 2-crop period. The two types of added phosphorus did not significantly differ in their effects on bean yields.
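The two percentages reported for the first crop are mutually consistent under the standard 15N definitions (stated here as an assumption about the authors' accounting): %Ndff is the fraction of crop N derived from the labelled fertilizer, and utilization efficiency is fertilizer-derived N in the crop divided by N applied.

```latex
% With 100 kg N ha^-1 applied, %Ndff = 31.2% and NUE = 38.5%:
\[ N_{\mathrm{fert,crop}} = 0.385 \times 100 = 38.5\ \mathrm{kg\,N\,ha^{-1}}, \qquad
   N_{\mathrm{crop}} = \frac{38.5}{0.312} \approx 123\ \mathrm{kg\,N\,ha^{-1}} , \]
% i.e. a total first-crop N uptake of about 123 kg N ha^-1 follows from the
% two reported percentages.
```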

16.
The mechanisms of build-up of inoculum of the take-all fungus, Gaeumannomyces graminis var. tritici, and infection of crop plants from self-sown (volunteer) wheat were analysed in a factorial experiment in a glasshouse. Treatments comprised sowing date, inoculum density, soil aeration and texture, volunteer density and control of volunteers. Timing of treatments was related to field practice by the use of cumulative day-degrees and by consideration of sowing date and the geometry of seed placement. Prior sowing of volunteers, which were exposed to soil inoculum, resulted in significant increases in the incidence and severity of disease on subsequently sown seedlings. Increasing the density of volunteer seedlings increased the levels of subsequent infection. This effect, however, was significantly influenced by sowing date and density of the initial inoculum. The use of glyphosate to kill volunteers did not markedly affect the carry-over of disease.

17.
Pal, Sudhansu S. Plant and Soil 1998, 198(2):169-177
Phosphate solubilizing bacteria (PSB) were isolated from sixty soil samples of various soil classes and cropping histories in Himalayan regions of Uttar Pradesh, India by enrichment culture techniques. Phosphate solubilization and acid tolerance of each strain were estimated. A strain (PAS-2) isolated from a pasture and waste land of pH 4.8, organic matter 2.6%, available N 265 kg ha-1, available P2O5 (Bray's II) 2.3 kg ha-1 and available K2O 353 kg ha-1 had the highest P-solubilization (45 µg P per mL per day) and also the highest acid tolerance rating, 42. The strain was identified as Bacillus sp. Seed inoculation with this bacterial strain resulted in significant increases in grain and vegetative yield of fingermillet (Eleusine coracana), maize (Zea mays), amaranth (Amaranthus hypochondriacus), buckwheat (Fagopyrum esculentum) and frenchbean (Phaseolus vulgaris), with or without added P sources. The significant grain yields (quintal ha-1) with phosphate and seed inoculation ranged from 33.85 in maize, 26.33 in frenchbean, 22.41 in buckwheat and 20.71 in amaranth to 19.19 in fingermillet, as compared to controls. The highest response was observed with frenchbean, followed by fingermillet, buckwheat, amaranth and maize. Phosphate use efficiency was highest in frenchbean, followed by maize, and lowest and almost at par in buckwheat, amaranth and fingermillet. Available phosphate was also highest in the frenchbean-cultivated plot, followed by amaranth, fingermillet, buckwheat and maize. The MPN count of phosphate solubilizing bacteria was also influenced by seed inoculation of strain PAS-2. Frenchbean exerted the greatest rhizosphere effect, followed by the pseudocereals and cereals. Likewise, phosphate nutrition of the crops was improved through seed inoculation irrespective of added P sources. The study thus demonstrated that selection of an efficient strain of PSB from acid soil and its seed inoculation in a selected crop genotype is beneficial in boosting crop yield in low-productivity hill soil. Seed inoculation also created a greater rhizosphere effect than no inoculation, which improved the P nutrition of crops and also available soil P.

18.
Reduction of Take-all Inoculum by Rotation with Lupins, Oats or Field Peas
The feasibility of using lupins, oats and field peas as alternative rotation crops to reduce inoculum of the take-all fungus (Gaeumannomyces graminis var. tritici) under Western Australian field conditions, and disease in following wheat, was investigated with a one-year field trial, the soil from which was used in two succeeding pot experiments. The possible mechanisms of reduction of inoculum and disease by these crops were examined by testing the soil for pathogen and disease suppression. Rotation with lupins or oats for two seasons reduced (P < 0.05) inoculum of the take-all fungus, and lupins, oats or field peas reduced (P < 0.05) disease in following wheat. Lupins alone reduced inoculum and disease (P < 0.1) after one season. No apparent suppression of the pathogen in the absence of host plants was recorded after one season of rotation, but after two seasons lupins, oats or field peas all suppressed (P < 0.02) growth of the pathogen within soil. However, only field pea soil suppressed take-all in comparison with the wheat control. Although after two seasons all rotation crops were effective in reducing inoculum and disease, the mechanisms of reduction appear to differ between the rotation crops used in this study.

19.
Herdina; Roget, D. K. Plant and Soil 2000, 227(1-2):87-98
A rapid, routine DNA-based assay to quantify Gaeumannomyces graminis var. tritici (Ggt), the causal agent of take-all disease of cereals, has been developed and used for the prediction of take-all in a wide range of field soils. Based on the correlation of the DNA-based assay and a soil bioassay, the risk of disease development can be estimated. Ggt DNA levels of <30 pg, 30–50 pg and >50 pg in 0.1 g soil organic matter correspond to low, moderate and high levels of the disease, respectively. Limitations in the prediction of take-all, including sampling requirements to obtain representative soil samples from fields and increasing the sensitivity and the accuracy of the DNA assay, are described. The main advantage in using the DNA-based assay, in estimating the amount of Ggt inoculum in soil, is that the levels of Ggt in soil samples can be assessed rapidly and accurately. Farmers can now have soil samples assessed before sowing. The DNA result can be used to predict the potential yield loss and determine the most appropriate management options using decision support software that is currently available. This DNA technology is currently being used commercially to detect and predict take-all.
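The reported thresholds amount to a simple three-way classification; a sketch follows, with the function name and labels illustrative rather than part of the published assay.

```python
# Sketch: map Ggt DNA (pg per 0.1 g soil organic matter) to take-all risk,
# using the thresholds reported above (<30 low, 30-50 moderate, >50 high).

def take_all_risk(ggt_dna_pg: float) -> str:
    if ggt_dna_pg < 30:
        return "low"
    if ggt_dna_pg <= 50:
        return "moderate"
    return "high"

print(take_all_risk(12.0))  # low
print(take_all_risk(42.0))  # moderate
print(take_all_risk(75.0))  # high
```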

20.
A field experiment conducted at Central Rice Research Institute, Cuttack, during three successive seasons showed that with the 120-day-duration variety Ratna two dual crops of Azolla pinnata R. Brown (Bangkok isolate) could be achieved 25 and 50 days after transplanting (DAT) by inoculating 2.0 t ha-1 of fresh Azolla 10 and 30 DAT, respectively. One basal crop of Azolla could also be grown using the same inoculum 20 days before transplanting (DBT) in fallow rice fields. The three crops of Azolla grown (once before transplanting and twice after transplanting) gave an average total biomass of 38–63 and 43–64 t ha-1 fresh Azolla containing 64–90 and 76–94 kg N ha-1, respectively, in the square and rectangular spacings. Two crops of Azolla grown only as a dual crop, on the other hand, gave 26–39 and 29–41 t ha-1 fresh Azolla, which contained 44–61 and 43–59 kg N ha-1, respectively. Growth and yield of rice were significantly higher in the Azolla basal plus Azolla dual twice incorporated treatment than in the Azolla dual twice incorporation, Azolla basal plus 30 kg N ha-1 urea and 60 kg N ha-1 urea treatments. Azolla basal plus 30 kg N ha-1 urea and 60 kg N ha-1 urea showed similar yields, but Azolla dual twice incorporation yielded significantly less than either. The different spacings with the same plant population did not affect growth and yield significantly, whereas Azolla growth during dual cropping was 8.3 and 64% greater in the rectangular spacing than in the square spacing in the Azolla basal plus Azolla dual twice incorporation and Azolla dual twice incorporation treatments, respectively.
