Similar Articles
20 similar articles found.
1.
Sowing date and phosphorus utilization by wheat
Batten, G. D., Fettell, N. A., Mead, J. A. & Khan, M. A. Plant and Soil (1993) 155(1): 297–300
The uptake and utilization of phosphorus (P) by cereal crops is influenced by the growing period of the crop. In this article the effect of sowing date on the utilization of P by wheat crops grown in southern NSW is reviewed. Crops sown early in the accepted sowing period require smaller inputs of P fertilizer to reach the maximum yield but produce grain with a higher concentration of P than crops sown late in the sowing season. For later sowings a higher rate of applied P is required to achieve the yield potential but this is not associated with a high grain P concentration or a high rate of removal of P from the soil. If grain with a high P concentration is required as seed for subsequent crops, then sowing early, even with little or no applied P fertilizer, is preferable, although crops sown early in the season are likely to remove more P from the soil than the amount applied in fertilizer.

2.
A seed treatment containing fluquinconazole as the only active ingredient was tested in sequences of up to six consecutive crops of winter wheat. It was applied or not applied in each year, and was tested in all possible combinations with treatments applied in previous years. Take‐all was controlled effectively, and grain yield usually increased, when the disease intensity was moderate or severe in non‐treated crops, but control of the most severe take‐all did not result in acceptable yields or grain quality. Treatment of a first wheat or second wheat with little take‐all did not usually benefit the subsequent crop. Non‐treatment of a crop grown after a treated, diseased crop usually resulted in a marked increase in disease, indicating that treatment had delayed progress of the epidemic. Take‐all was controlled by treatment of a crop grown after a treated, diseased crop but the amount of control and of increased yield was often less than that in a treated crop grown after a non‐treated crop in the same crop sequence. Similar effects of seed treatment were apparent in crops grown on a site with take‐all decline. The alternative fungicide, silthiofam, applied as a seed treatment in the later years of some experiments, was usually as effective as fluquinconazole. From these experiments, it is recommended that: a) fluquinconazole seed treatment should be applied to a second or third wheat crop, grown after a first wheat crop that was managed to avoid rapid take‐all development (e.g. by avoiding very early sowing); b) a break crop should follow the treated crop; c) the seed treatment should not normally be used in longer sequences of wheat or on take‐all decline soil unless it is planned to follow the treated crop with a non‐cereal break.

3.
Experiments on the Rothamsted and Woburn Experimental Farms studied the effects on take‐all of different break crops and of set‐aside/conservation covers that interrupted sequences of winter wheat. There was no evidence for different effects on take‐all of the break crops per se but the presence of volunteers, in crops of oilseed rape, increased the amounts of take‐all in the following wheat. Severity of take‐all was closely related to the numbers of volunteers in the preceding break crops and covers, and was affected by the date of their destruction. Early destruction of set‐aside/conservation covers was usually effective in preventing damaging take‐all in the following wheat except, sometimes, when populations of volunteers were very large. The experiments were not designed to test the effects of sowing dates but different amounts of take‐all in the first wheats after breaks or covers apparently affected the severity of take‐all in the following (second) wheats only where the latter were relatively late sown. In earlier‐sown second wheats, take‐all was consistently severe and unrelated to the severity of the disease in the preceding (first) wheats. Results from two very simple experiments suggested that substituting set‐aside/conservation covers for winter wheat, for 1 year only, did not seriously interfere with the development of take‐all disease or with the development or maintenance of take‐all decline (TAD). With further research, it might be possible for growers wishing to exploit TAD to incorporate set‐aside/conservation covers into their cropping strategies, and especially to avoid the worst effects of the disease on grain yield during the early stages of epidemics.

4.
Winter wheat was grown for six successive years (Expt 1) and for three successive years (Expt 2) in field experiments on different soil types. Artificial inoculum of the take-all fungus (Gaeumannomyces graminis var. tritici cultured on autoclaved oat grains) was incorporated in the soil of some of the plots just before, or at, sowing of the first winter wheat crop. Expt 1 tested the incorporation of similar amounts of inoculum (212 kg ha⁻¹) at different depths. Expt 2 tested different amounts of inoculum at the same, shallow depth. Early sowing (September), late sowing (October) and spring inoculation were additional treatments, applied to the first crop only, in Expt 2. Seasonal factors apart, the disease outcome in the first year after inoculation depended on amounts and placement of applied inoculum, as well as date of sowing. Deeper inoculum resulted in less disease (Expt 1). Severe take-all was produced in Expt 2 by incorporating inoculum shallowly in sufficient quantities (400 kg ha⁻¹ or more). Less inoculum (200 kg ha⁻¹) generated less disease, especially in earlier-sown plots. Differences in disease amongst inoculum treatments were greatest in the first year and diminished subsequently, particularly where sowing had been early in the first year. In Expt 1, where first crops exposed to artificial inoculum developed moderate-to-severe disease, disease in subsequent second and/or third crops was less. In the fourth crop a second peak of disease occurred, coinciding with a first peak in sequences without added inoculum. Take-all decline (TAD) appeared to be expressed in all sequences thereafter. In Expt 2 in sequences without added inoculum, TAD occurred after a peak of disease in the second crop. Where 400 kg ha⁻¹ or more of inoculum were added, disease was severe in the first year and decreased progressively in successive years. Disease was less patchy in plots that received artificial inoculum. However, it remains uncertain whether severe disease caused by artificial inoculation achieved an early onset of true TAD. The infectivity of the top 12 cm of soil in the first 3 yr of Expt 1, determined by bioassay, depended on the depth of added inoculum and amount of disease in subsequent crops. However, at the time of the naturally occurring peak of disease severity (in either inoculated or non-inoculated plots) it did not predict either disease or TAD. Differences and similarities amongst epidemics developing naturally and those developing from different amounts and placement of applied inoculum have been revealed. The epidemiological implications of adding inoculum and the potential value of artificially-created epidemics of take-all in field trials are discussed.

5.
Seed treatments containing fluquinconazole, silthiofam or a standard fungicide mixture with no activity against take‐all were compared in all combinations of sequences in successive second and third winter wheat crops in five field experiments and second to fourth crops in a sixth experiment. Compared with the standard treatment, silthiofam decreased take‐all more effectively than fluquinconazole when crops were sampled at tillering. In samples taken in summer, during grain filling, silthiofam often decreased the incidence of take‐all (percentage of plants with root symptoms) more than fluquinconazole, but fluquinconazole more effectively decreased the incidence of severe take‐all (percentage of plants with more than 75% of their root systems blackened). It is suggested that these differences are a consequence of more effective control of primary infection of roots by silthiofam and of secondary, root‐to‐root, infection by fluquinconazole. Silthiofam usually increased yield more than did fluquinconazole, perhaps as a consequence of better early protection during tiller and/or spikelet formation. Treatment with either of the fungicides affected epidemic development in the treated crop and in crops grown subsequently. In particular, decreased take‐all had the effect of delaying the year‐to‐year epidemic, so that nontreatment of a subsequent crop resulted in an upsurge in disease. Treatment with either take‐all fungicide of a crop grown after a treated crop was relatively effective if the epidemic in the comparable nontreated crop sequence was continuing to increase. It was, however, detrimental if the disease was approaching its peak in the first treated crop, particularly if a treated (fourth wheat) crop was being compared with a similar crop in a nontreated sequence in which take‐all decline had developed. These results provide a basis for recommendations for the use of seed treatment fungicides in sequences of wheat crops.

6.
Granular and liquid formulations of chlorfenvinphos were applied to a sandy-loam as continuous logarithmically-changing doses of approximately 0.2 to 4.0 kg a.i./ha and radish were sown immediately and 23 days after treatment. After 62 days, insecticide concentrations in the soil had not diminished detectably in the granule treatment but had declined by about 20% in the liquid treatment. In both sowings, residues in the harvested radish were higher, dose-for-dose, from the granule than from the liquid treatment and with both formulations were lower in the second than in the first sowing. Within treatments there were log-log relationships between the dose of insecticide and residue concentrations in the soil and radish. In both sowings the residues were most variable between replicate soil and crop samples from the granule treatment. In the first sowing, cabbage root fly damage was reduced most by the liquid treatment but in the second sowing equivalent doses of the two formulations gave similar protection. At 2.0, 2.24 or 3.0 kg a.i./ha, chlorfenvinphos incorporated before sowing protected radish better than pre-sowing or pre-emergence applications to the soil surface. When applied before sowing, the liquid formulation gave better protection and lower residue concentrations in the radish than the granules. As a post-emergence spray, chlorfenvinphos was much more effective than fonofos, diazinon or triazophos but it was often phytotoxic, decreased yield significantly, left large residues in the radish in two of the four experiments and, in common with other surface treatments, substantially decreased the Z:E chlorfenvinphos isomer ratio. Although a single application of granules protected two successive radish crops, it was concluded that third and subsequent sowings on the same land should probably be treated with smaller doses to limit terminal residues.
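As a rough illustration of the log-log dose-residue relationship reported above, the sketch below fits a power-law model by least squares on log-transformed values. All doses and residue concentrations here are hypothetical stand-ins, not the experiment's data.

```python
# Hedged sketch: fitting log(residue) = a + b*log(dose), the log-log
# relationship described in the abstract. Values are hypothetical.
import numpy as np

doses = np.array([0.2, 0.5, 1.0, 2.0, 4.0])          # kg a.i./ha (hypothetical)
residues = np.array([0.04, 0.11, 0.24, 0.50, 1.10])  # mg/kg in radish (hypothetical)

# Ordinary least squares on the log-transformed data; np.polyfit returns
# the slope first for a degree-1 fit.
b, a = np.polyfit(np.log10(doses), np.log10(residues), deg=1)
print(f"residue ~ {10**a:.3f} * dose^{b:.2f}")

# Interpolate the expected residue at an intermediate dose of 3.0 kg a.i./ha.
dose = 3.0
predicted = 10 ** (a + b * np.log10(dose))
print(f"predicted residue at {dose} kg a.i./ha: {predicted:.3f} mg/kg")
```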

7.
Direct sowing of Miscanthus seed could lower crop establishment costs, and increase the rate of grower uptake and biomass supply for the emerging bio‐economy. A replicated field trial was conducted at two contrasting UK sites: Aberystwyth (ABR) in mid‐Wales and Blankney (BLK) in Lincolnshire. These sites encompass the west–east meteorological gradient in the United Kingdom where the growing season at ABR is cooler and wetter while BLK is warmer and drier. Primed and unprimed Miscanthus sinensis seeds were sown directly onto the soil surface with and without a clear biodegradable mulch film, at nine dates interspersed from May to October. Average daily mean soil surface temperatures measured over the first 2 months after sowing under the mulch film were higher than control plots (2.7°C ABR and 4.2°C BLK). At both sites, the film covering also affected soil volumetric moisture relative to uncovered control plots (−3% ABR and +8% BLK), demonstrating the negative impact of mulch film when sowing on dry soil. Over nine sowings, seed germination at ABR under film varied between −28% and +18% of germination under control conditions. Seedlings from the first three sowings at both sites under film had sufficient physiological maturity to survive the first winter period. At BLK, mulch film significantly increased tiller count and height in both the first and second years after sowing. At ABR, where temperatures were lower, film covering significantly increased tiller height but not count. Water priming had no significant effect on seed viability or germination in the field tests. Base temperatures for germination of primed and unprimed seeds on a thermal gradient plate were 7.0°C and 5.7°C, respectively, with a ±1.7°C confidence interval. Based on our results for M. sinensis in the United Kingdom, we recommend the sowing of unprimed seed in May under film and only when the soil is moist.
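Base temperatures like the 7.0°C and 5.7°C quoted above are conventionally estimated by regressing germination rate (the reciprocal of time to germination) on temperature and taking the x-intercept. A minimal sketch of that standard calculation, using hypothetical readings rather than the trial's thermal-gradient-plate data:

```python
# Hedged sketch of base-temperature estimation: germination rate is
# approximately linear in temperature, so Tb is the x-intercept of the
# fitted line. Temperatures and germination times are hypothetical.
import numpy as np

temps = np.array([10.0, 14.0, 18.0, 22.0, 26.0])           # °C (hypothetical)
days_to_germinate = np.array([20.0, 11.0, 8.0, 6.2, 5.1])  # days (hypothetical)

rate = 1.0 / days_to_germinate            # germination rate, per day
slope, intercept = np.polyfit(temps, rate, deg=1)

# rate ~ slope * (T - Tb)  =>  Tb = -intercept / slope
t_base = -intercept / slope
print(f"estimated base temperature: {t_base:.1f} °C")
```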

8.
By accelerating crop development, warming climates may result in mismatches between key sensitive growth stages and extreme climate events, with severe consequences for crop yield and food security. Using recent estimates of gene responses to vernalization and photoperiod in wheat, we modelled the flowering times of all ‘potential’ genotypes as influenced by the velocity of climate change across the Australian wheatbelt. In the period 1957–2010, seasonal increases in temperature of 0.012 °C yr⁻¹ were recorded, changing the flowering time of a mid‐season wheat genotype by an average of −0.074 day yr⁻¹, with flowering ‘velocity’ of up to 0.95 km yr⁻¹ towards the coastal edges of the wheatbelt; this is an estimate of how quickly the given genotype would have to be ‘moved’ across the landscape to maintain its original flowering time. These national changes are projected to accelerate by up to 3‐fold for seasonal temperature and by up to 5‐fold for flowering time between now and 2030, with average national shifts in flowering time of 0.33 and 0.41 day yr⁻¹ between baseline and the worst climate scenario tested for 2030 and 2050, respectively. Without new flowering alleles in commercial germplasm, the life cycle of wheat crops is predicted to shorten by 2 weeks by 2030 across the wheatbelt for the most pessimistic climate scenario. While current cultivars may be otherwise suitable for future conditions, they will flower earlier due to warmer temperatures. To allow earlier sowing to escape frost, heat and terminal drought, and to maintain the current growing period of early‐sown wheat crops in the future, breeders will need to develop and/or introduce new genetic sources for later flowering, more so in the eastern part of the wheatbelt.
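The flowering ‘velocity’ above is a climate-velocity-style ratio: the temporal trend in flowering date divided by its spatial gradient across the landscape. A toy calculation is shown below; the spatial gradient is an assumed value, chosen only so the result matches the quoted 0.95 km yr⁻¹, not a figure from the paper.

```python
# Hedged sketch of the velocity-of-change ratio: km/yr at which a genotype
# would have to shift across the landscape to keep its flowering date constant.
temporal_trend = -0.074    # day/yr, trend in flowering date (from the abstract)
spatial_gradient = -0.078  # day/km, spatial gradient of flowering date (assumed)

velocity = temporal_trend / spatial_gradient  # km/yr
print(f"flowering velocity ~ {velocity:.2f} km/yr")
```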

9.
Take‐all disease (Gaeumannomyces graminis var. tritici) in wheat crops is known to be suppressed by naturally occurring antagonistic fungi, closely related to the pathogen, that infect grasses and cereals. This form of suppression was re‐investigated because of the changing importance and role of grass weeds and grass covers in arable farming. Natural populations of the competitive fungus Gaeumannomyces cylindrosporus, allowed to develop under rye‐grass, were more effective than artificially introduced populations in suppressing the development of take‐all in following wheat crops. To be effective, the antagonist needs to be present before the start of wheat cropping. Introducing G. cylindrosporus, but not G. graminis var. graminis (a potential antagonist that is faster growing), into a previous crop, or just after the previous crop, sometimes suppressed take‐all, but the effect was small. It is concluded that, for any future attempts at biocontrol by these fungi, they should be introduced into a preceding crop not susceptible to take‐all. Take‐all inoculum in the soil should be at a minimum and effective hosts of the take‐all pathogen must not be present as weeds or volunteers.

10.
Dulout, A., Lucas, P., Sarniguet, A. & Doré, T. Plant and Soil (1997) 197(1): 149–155
Two experiments were carried out in France in which disease indices were used to evaluate the effects of wheat volunteers and blackgrass (Alopecurus myosuroides) on soil infectivity and soil conduciveness to take-all caused by Gaeumannomyces graminis var. tritici. Soil infectivity was evaluated by measuring the disease index on susceptible wheat plants grown on soil samples collected from the field. Soil conduciveness to the disease was obtained by measuring disease indices on plants grown on soil samples to which different amounts of take-all fungus inoculum were added. One experiment (Expt. 1) was carried out using soils from farmers' fields (two fields in 1994 and two in 1995); soil infectivity and soil conduciveness were evaluated for three experimental situations: bare soil, soil with wheat volunteers and soil with blackgrass plants. In 1994 the soil infectivity was zero in bare soil, high with the wheat cover, and intermediate with the blackgrass cover. In 1995 the soil infectivity was uniformly low for all three conditions. Soils bearing wheat were less conducive than bare soil, soils bearing blackgrass and bare soils were similarly conducive. A second experiment (Expt. 2) carried out in 1995 compared the soil infectivity and soil conduciveness to take-all of soils planted with wheat or blackgrass in set-aside land after periods of wheat monoculture of 0–6 yr. The soil infectivity was low for all treatments. The soil was more conducive after blackgrass than after wheat. In both cases, the soil conduciveness was less when the monoculture had continued for more than 4 yr. The decline was less after blackgrass than after wheat. Thus, whenever set-aside is set up during the increase phase of the disease in fields with cereal successions, abundant wheat volunteers might hinder the expected positive effect of a break in cereal successions on take-all development. The presence of blackgrass in a set-aside field, with significant soil infectivity and high soil conduciveness, might increase the risks of take-all development in a wheat crop following set-aside.

11.
Two field trials were conducted to investigate different herbage grasses and cereals for their susceptibility to the disease take‐all, for their impact on concentrations of the pathogen, Gaeumannomyces graminis var. tritici (Ggt), in soil and for their effect on development of take‐all in a subsequent wheat crop. In the herbage grass trial, Bromus willdenowii was highly susceptible to Ggt, and produced the greatest post‐senescence Ggt concentrations in soil and the highest incidence of take‐all in the following wheat crop. Lolium perenne, Lolium multiflorum and Festuca arundinacea supported low soil Ggt concentrations, and fallow supported the least. The relationship between susceptibility to Ggt and post‐senescence concentrations in soil differed between pasture grasses and cereals. In a trial in which Ggt was added to half the plots and where wheat, barley, triticale, rye or fallow were compared, the susceptibility of the cereals to take‐all was not clearly linked to post‐harvest soil Ggt concentrations. In particular, triticale and rye had low and negligible take‐all infection respectively, but greater post‐harvest soil Ggt concentrations than barley or wheat. This indicates that low Ggt concentrations on roots may build up during crop senescence on some cereals. Soil Ggt concentrations were greater following harvest in inoculated plots sown to cereals, but in the second year there was more take‐all in the previously non‐inoculated than inoculated plots. Thus, the grass and cereal species differed in susceptibility to take‐all, in their impact on Ggt multiplication and in associated take‐all severity in the following wheat crop.

12.
The sudden decline following the peak in population abundance of aphids on crops of small grain cereals is attributed to the joint effect of natural enemies and plant senescence. To distinguish between these causes, a four-year experiment was established in which the numbers of Metopolophium dirhodum (Walker) infesting spring wheat plots sown from April to June at c. 14 day intervals were determined. Aphid abundance in replicates sown at successive dates peaked within a period of 5-9 days (106-171 day degrees above a base temperature of 0°C) although their sowing dates varied by 62-97 days (727-1106 day degrees). At the time of the aphid population peaks, plants in the different sowings differed in age (11-99 days), developmental stage (stage 15-65 on the Zadoks scale), leaf nitrogen content and shoot mass. Maximum abundance of M. dirhodum decreased with sowing date because the time available for its population increase was shorter on late than early sowings. The abundance of M. dirhodum on spring wheat was similar to its abundance on winter wheat. After reaching peak abundance, aphids declined in numbers within 3-7 days. The effect of host plant ageing on the M. dirhodum decline thus appeared small. Natural enemies (largely mycoses), and timing of alata production may have contributed to the aphid decline.
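Day-degrees, as used above, are accumulated daily mean temperatures above a base temperature (0°C here). A minimal sketch with a hypothetical daily series:

```python
# Hedged sketch of day-degree accumulation above a base temperature.
def day_degrees(daily_means, base=0.0):
    """Sum of (mean - base) over days whose mean exceeds the base."""
    return sum(max(t - base, 0.0) for t in daily_means)

daily_means = [12.5, 14.0, 9.5, 16.0, 18.5, 15.0, 13.0]  # °C, hypothetical week
print(f"{day_degrees(daily_means):.1f} day-degrees above 0°C")
```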

13.
Effect of sowing date on the optimum plant density of winter wheat
Pressure on financial margins in UK wheat production is driving a review of all inputs, and seed represents one of the largest financial inputs in wheat production. The potential savings through exploiting the crop's ability to compensate for reduced population are, therefore, attractive. Field experiments were carried out at ADAS Rosemaund (Herefordshire, UK) in 1996/97, 1997/98 and 1998/99 to investigate the effect of sowing date on dry matter growth and yield responses of winter wheat to reduced plant population. There were three target sowing dates (late‐September, mid‐October and mid‐November), six seed rates (20, 40, 80, 160, 320 and 640 seeds m⁻²) and four varieties (Cadenza, Haven, Soissons and Spark). Grain yield was significantly affected by plant population with a mean reduction from 9.2 to 5.5 t ha⁻¹ as plant number was reduced from 336 to 13 plants m⁻². In addition, there was a significant interaction between plant density and sowing date. There was, however, no interaction between variety and plant population in terms of yield, except when lodging affected high plant populations of lodging susceptible varieties. The experiments demonstrated scope for reducing plant populations below the current target of 250–300 plants m⁻²; however, the degree of reduction was dependent on sowing date. Over the three years, the average economic optimum plant density was 62 plants m⁻² for late‐September, 93 plants m⁻² for mid‐October, and 139 plants m⁻² for mid‐November sowings. Compensation for reduced population was due to increased shoot number per plant, increased grain number per ear and to a lesser extent increased grain size. Higher economic optimum plant densities at later sowing dates were due to reduced tiller production and hence ear number per plant. The other compensatory mechanisms were unaffected by sowing date.
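One way to locate an economic optimum plant density like those reported above is to fit a diminishing-returns yield curve to the density treatments and solve for the density at which marginal grain revenue equals marginal seed cost. The sketch below does this under an assumed exponential yield response and hypothetical prices; it is an illustration of the concept, not the paper's method.

```python
# Hedged sketch: economic optimum plant density from a fitted yield-density
# curve. The exponential response form, prices and costs are assumptions.
import numpy as np
from scipy.optimize import curve_fit

density = np.array([13.0, 40.0, 80.0, 160.0, 336.0])  # plants per m² (hypothetical)
yield_t = np.array([5.5, 7.4, 8.4, 9.0, 9.2])         # t/ha (hypothetical)

def asymptotic(d, ymax, k):
    """Diminishing-returns yield response to plant density."""
    return ymax * (1.0 - np.exp(-k * d))

(ymax, k), _ = curve_fit(asymptotic, density, yield_t, p0=(9.5, 0.02))

grain_price = 100.0  # £ per tonne (hypothetical)
seed_cost = 3.0      # £ per ha per extra plant m⁻² established (hypothetical)

# Marginal revenue at density d is grain_price * ymax * k * exp(-k*d);
# setting it equal to the marginal seed cost and solving for d:
d_opt = -np.log(seed_cost / (grain_price * ymax * k)) / k
print(f"economic optimum ~ {d_opt:.0f} plants per m²")
```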

14.
The mechanisms of build-up of inoculum of the take-all fungus, Gaeumannomyces graminis var. tritici, and infection of crop plants from self-sown (volunteer) wheat were analysed in a factorial experiment in a glasshouse. Treatments comprised sowing date, inoculum density, soil aeration and texture, volunteer density and control of volunteers. Timing of treatments was related to field practice by the use of cumulative day-degrees and by consideration of sowing date and the geometry of seed placement. Prior sowing of volunteers, which were exposed to soil inoculum, resulted in significant increases in the incidence and severity of disease on subsequently sown seedlings. Increasing the density of volunteer seedlings increased the levels of subsequent infection. This effect, however, was significantly influenced by sowing date and density of the initial inoculum. The use of glyphosate to kill volunteers did not markedly affect the carry-over of disease.

15.
16.
Field experiments were done to evaluate the extent to which cover crops can be used to help farmers comply with current legislation on nitrate leaching from arable land in nitrate vulnerable zones. Nitrate leaching was measured in sandy loam and chalky loam soils under a range of early sown (mid-August) cover crops at two sites in SE England, and in the subsequent winter following their incorporation. Cover crop species tested were forage rape, rye, white mustard, a rye/white mustard mixture, Phacelia and ryegrass. Additional treatments were weeds plus cereal volunteers, a bare fallow and a conventional winter barley crop sown one month later than the cover crops and grown to maturity. Cover crop and bare fallow treatments were followed by spring barley. This was followed by winter barley, as was the conventional winter barley crop. In the winter immediately after establishment, early sown cover crops decreased nitrate leaching by 29–91% compared to bare fallow. They were most effective in a wet winter on the sandy loam where nitrate leaching under bare fallow was greatest. There was little difference between cover crop species with respect to their capacity to decrease nitrate leaching, but losses were consistently smaller under forage rape. The growth of weeds plus cereal volunteers significantly decreased nitrate leaching on the sandy loam compared with a bare fallow, but was less effective on the chalky loam. Nitrate leaching under the later sown winter barley was often greater than under cover crops, but under dry conditions leaching losses were similar. In the longer-term, in most cases, the inclusion of cover crops in predominantly cereal-based cropping systems did not significantly decrease cumulative nitrate leaching compared with two successive winter cereals. In summary, early sown cover crops are most likely to be effective when grown on freely drained sandy soils where the risk of nitrate leaching is greatest. They are less likely to be effective on poorer drained, medium-heavy textured soils in the driest parts of SE England. In these areas the regeneration of weeds and cereal volunteers together with some additional broadcast seed may be sufficient to avoid excessive nitrate losses. In the short-term, mineralization of N derived from the relatively small cover crops grown once every 3–4 years in cereal-based cropping systems is unlikely to contribute greatly to nitrate leaching in later years and adjustments to fertilizer N recommendations will not usually be necessary.

17.
The effect of rice culture on changes in the number of a strain of soybean root-nodule bacteria (Bradyrhizobium japonicum CB1809), already established in the soil by growing inoculated soybean crops, was investigated in transitional red-brown earth soils at two sites in south-western New South Wales. At the first site, 5.5 years elapsed between the harvest of the last of four successive crops of soybean and the sowing of the next. In this period three crops of rice and one crop of triticale were sown and in the intervals between these crops, and after the crop of triticale, the land was fallowed. Before sowing the first rice crop, the number of Bradyrhizobium japonicum was 1.32×10⁵ g⁻¹ soil. The respective numbers of bradyrhizobia after the first, second and third rice crops were 4.52×10⁴, 1.26×10⁴ and 6.40×10² g⁻¹ soil. In the following two years the population remained constant. Thus sufficient bradyrhizobia survived in soil to nodulate and allow N₂-fixation by the succeeding soybean crop. At the second site, numbers of bradyrhizobia declined during a rice crop, but the decline was less than when the soil was fallowed (400-fold cf. 2200-fold). Multiplication of bradyrhizobia was rapid in the rhizosphere of soybean seedlings sown without inoculation in the rice bays. At 16 days after sowing, their numbers were not significantly different (p<0.05) from those in plots where rice had not been sown. Nodulation of soybeans was greatest in plots where rice had not been grown, but yield and grain nitrogen were not significantly different (p<0.05). Our results indicate that flooding soil has a deleterious effect on the survival of bradyrhizobia but, under the conditions of the experiments, sufficient B. japonicum strain CB1809 survived to provide good nodulation after three crops of rice covering a total period of 5.5 years between crops of soybean.

18.
Proctor and Maris Puma barleys were sown in October, early March, and late April at 50, 100, 200, 400 and 800 plants per m² and at low and high fertilizer levels. Shoot dry matter and grain dry matter showed no significant response to density for the first two sowings, but increased with increasing density in the last sowing. In all sowings the ear number per m² rose with increasing density, and grain number per ear fell, and there were only small interactions between density and sowing date, but the effect on weight per grain differed markedly in the last sowing from that in the first two sowings. Nitrogen concentration (per cent of dry matter) in the grain and in the straw showed little response to density and the values for both grain and straw were highest in the last sowing. Nitrogen content per m² for both shoot and grain at first rose, and then fell with increasing density. The maximum amount of nitrogen per m² was found at about 100 plants per m² in the early sowings, and at 400 plants per m² in the last sowing. The nitrogen data indicated a loss of nitrogen from the plant at high densities.

19.
The assemblage of carabid species trapped over a 3 yr period in eight separate but contiguous fields was analysed by Canonical Correspondence Analysis (CCA). Seven of the fields were subject to arable crop rotations involving a mixture of autumn and early spring‐sown cereals (wheat or barley), late spring‐sown crops (potatoes, sugar and fodder beet and maize) and short‐term uncultivated grass leys. The eighth field was an established ‘permanent’ grass pasture of at least 50 yrs standing. Pooled samples collected in the first half of the growing season (April–June) showed clear evidence of soil cultivation effects on community structure. Systematic exclusion of samples and re‐analysis distinguished the fauna of, firstly, old pasture samples, then samples from ley pastures of increasing age and finally samples from fields with different times of soil cultivation. In the latter analysis, the main effects of soil cultivation were related to differences in ground cover over the winter and a direct effect of spring soil cultivation on autumn breeding populations. The ordination of pooled catches for the second half of the growing season (July–September) could not be related to known year, field or crop cultivation variables. The underlying species ordination suggested that later in the summer the effect of crop cover on soil microclimate may mask cultivation effects by influencing the post‐emergence dispersal of autumn‐breeding populations and the reproductive success of spring‐breeders. The combination of earlier soil cultivation effects, and probably microclimatic influences later in the season, resulted in the strong distinction of whole season carabid catches from individual fields. It is concluded that the uniqueness of individual field histories may provide a mechanism to promote the co‐existence of ecologically similar species within the farmed landscape and enhance the abundance and biodiversity of species in the face of routine soil cultivation.

20.
Remote sensing‐derived wheat crop yield‐climate models were developed to highlight the impact of temperature variation during thermo‐sensitive periods (anthesis and grain‐filling; TSP) of wheat crop development. Specific questions addressed are: can the impact of temperature variation occurring during the TSP on wheat crop yield be detected using remote sensing data and what is the impact? Do crop critical temperature thresholds during TSP exist in real world cropping landscapes? These questions are tested in one of the world's major wheat breadbaskets of Punjab and Haryana, north‐west India. Warming average minimum temperatures during the TSP had a greater negative impact on wheat crop yield than warming maximum temperatures. Warming minimum and maximum temperatures during the TSP explain a greater amount of variation in wheat crop yield than average growing season temperature. In complex real world cereal croplands there was a variable yield response to critical temperature threshold exceedance, specifically a more pronounced negative impact on wheat yield with increased warming events above 35 °C. The negative impact of warming increases with a later start‐of‐season, suggesting earlier sowing can reduce wheat crop exposure to harmful temperatures. However, even earlier sown wheat experienced temperature‐induced yield losses, which, when viewed in the context of projected warming up to 2100, indicates that adaptive responses should focus on increasing wheat tolerance to heat. This study shows it is possible to capture the impacts of temperature variation during the TSP on wheat crop yield in real world cropping landscapes using remote sensing data; this has important implications for monitoring the impact of climate change, variation and heat extremes on wheat croplands.
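Counting critical-temperature exceedances during the thermo-sensitive period reduces to a simple filter over daily maxima. A minimal sketch using the 35 °C threshold from the abstract and a hypothetical daily series:

```python
# Hedged sketch: count days in the thermo-sensitive period (TSP) whose
# maximum temperature exceeds a critical threshold. Data are hypothetical.
CRITICAL_C = 35.0

def hot_days_in_tsp(daily_tmax, tsp_start, tsp_end, threshold=CRITICAL_C):
    """Days in [tsp_start, tsp_end) with Tmax above the threshold."""
    return sum(1 for t in daily_tmax[tsp_start:tsp_end] if t > threshold)

tmax = [31.0, 33.5, 36.2, 34.8, 37.1, 35.5, 32.0, 30.4]  # °C (hypothetical)
print(hot_days_in_tsp(tmax, tsp_start=1, tsp_end=7))      # -> 3
```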

