Similar Literature (20 records)
1.
Lettuce tipburn is an irreversible physiological disorder caused by calcium deficiency that reduces crop value. In many cases, breeding tipburn-resistant cultivars is the only fundamental remedy. In this study, we investigated an efficient in vitro method to evaluate lettuce resistance to tipburn. Seedlings of 19 lettuce cultivars representing three head types were cultured on agar medium containing EGTA, which chelates Ca2+. The percentage of tipburned leaves increased in proportion to EGTA concentration: susceptible cultivars could be distinguished at 0.01 mM EGTA, whereas resistant cultivars were classified at 1.0 mM EGTA. Based on mean values of the tipburn measurements, susceptibility was highest for ‘Leaf Lettuce’, followed by ‘Butterhead Lettuce’ and then ‘Crisphead Lettuce’. Two cultivars were selected for further tests in hydroponic and pot culture, and the rank order of tipburn susceptibility in these experiments was consistent with that of the in vitro assay. In vitro evaluation of susceptibility to calcium deficiency is therefore useful for initial screening of lettuce cultivars for tipburn incidence. The resistant cultivars identified in this study are practical candidates for cultivation in controlled environments, such as plant factories, while sensitive cultivars are also useful as indicator plants to monitor environmental conditions.

2.
Spraying Chinese cabbage seedlings [Brassica pekinensis (Lour.) Rupr.] with the growth retardant daminozide (succinic acid-2,2-dimethylhydrazide) reduced tipburn of the mature plants. As the concentration of daminozide increased, the reduction in tipburn damage was correlated with increased calcium content in the young, susceptible leaves. This effect was much more pronounced in plants that were misted once a day during the head-formation period. Incubation of detached Chinese cabbage leaves for 48 h in the dark in solutions containing either EDTA or EGTA caused characteristic lesions at the leaf tips. The extent of the damage was reduced by including CaCl2 in the solutions. Leaves incubated in a solution of EDTA+GA3 or EGTA+GA3 were severely affected, the latter solution being the more harmful. GA3 alone did not enhance tipburn. CaCl2 greatly reduced the effect of the combination of chelating agents and GA3. Leaves derived from daminozide-treated plants and incubated in EDTA+GA3 developed fewer tipburn lesions than leaves of control plants treated with the same solutions. When detached leaves were water-stressed for 24 h prior to incubation in these solutions, the severity of tipburn symptoms increased. The possible interactions between GA, calcium chelation and tipburn development are discussed. Contribution no. 1171-E, 1984 series, from the ARO, The Volcani Center, Bet Dagan, Israel.

3.
Effects of decomposing walnut leaf litter on the antioxidant system and photosynthetic characteristics of stem lettuce
To explore the allelopathic effects of walnut on crops, a pot experiment with four litter application rates (0, 30, 60 and 90 g per pot) was conducted to study how walnut leaf litter decomposing naturally in soil affected the antioxidant system, photosynthetic physiology and growth of stem lettuce at 80, 100, 120 and 140 d after sowing. The results showed that: (1) superoxide dismutase (SOD), peroxidase (POD) and catalase (CAT) activities in the leaves of litter-treated lettuce were promoted at 80 d after sowing, inhibited at 100 d, and had largely returned to normal levels after 120 d, with SOD responding most sensitively. (2) At 100 and 120 d, soluble protein (SP) content in the lettuce leaves decreased significantly, whereas soluble sugar (SS) content increased significantly. (3) At 100 and 120 d, the net photosynthetic rate (Pn) of the lettuce leaves was significantly inhibited, and stomatal conductance (Gs) and transpiration rate (Tr) in all litter treatments were significantly lower than in the control. (4) Plant height, above-ground biomass and the proportion of above-ground biomass in total biomass of litter-treated lettuce were all significantly lower than the control at 120 d, but had largely returned to normal by 140 d. The study indicates that the allelopathic effect of walnut leaf litter decomposing in soil on stem lettuce first strengthens and then weakens with decomposition time; lettuce can partly alleviate the injury caused by allelochemicals by regulating its own protective enzyme activities and osmotic-adjustment substances, showing considerable tolerance to the allelopathy of walnut leaf litter, so stem lettuce can be intercropped or relay-cropped under walnut stands in production.

4.
A field experiment on winter wheat in autumn 1991 investigated the effect of the rhabditid nematode, Phasmarhabditis hermaphrodita, applied to soil at five dose rates (10⁸–10¹⁰ infective larvae ha⁻¹) immediately after seed sowing, on slug populations and damage to seeds and seedlings. The nematode was compared with methiocarb pellets broadcast at the recommended field rate immediately after drilling and with an untreated (no molluscicide) control. Slug damage to wheat seeds and seedlings was assessed 6 and 13 wk after drilling. Seedling survival increased and slug grazing damage to seedlings declined linearly with increasing log nematode dose. These two measures of slug damage were combined to give an index of undamaged plant equivalents, which also increased linearly with increasing log nematode dose. ANOVA showed that, after 6 wk, there were significantly more undamaged plant equivalents on plots treated with the two highest nematode doses (3 × 10⁹ and 1 × 10¹⁰ ha⁻¹) than on untreated plots, but the number of undamaged plant equivalents on methiocarb-treated plots was not significantly greater than that on untreated plots. Slug populations were assessed by refuge trapping and soil sampling. Deroceras reticulatum was the commonest of several slug species recorded. During the first 4 wk after sowing, significantly more slugs were found under refuge traps on plots treated with certain doses of P. hermaphrodita than under traps on untreated plots, and more showed signs of nematode infection than expected from the prevalence of infection in slugs from soil samples, suggesting that the presence of P. hermaphrodita altered slug behaviour. Application of P. hermaphrodita had no significant impact on numbers or biomass of slugs in soil during a 27 wk period after treatment, except after 5 wk, when slug numbers were inversely related to log nematode dose. However, by this time, numbers in soil samples from untreated plots had declined to levels similar to those in plots treated with the highest dose of nematodes. During the first 5 wk after treatment, c. 20% of slugs in soil samples from untreated plots showed symptoms of nematode infection. It is suggested that this represented the background level of infection in the experimental field rather than spread of infection from treated plots. The apparent lack of impact of P. hermaphrodita on slug numbers and biomass in soil suggests that its efficacy in protecting wheat from slug damage was through inhibition of feeding by infected slugs.
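As a purely illustrative aside on the regression described in this abstract, the relationship between undamaged plant equivalents and log nematode dose can be sketched as a simple linear fit. The dose range follows the abstract, but the response values below are hypothetical and serve only to show the form of the analysis.

```python
# Illustrative sketch only: linear fit of undamaged plant equivalents
# against log10(nematode dose). The dose range follows the abstract
# (1e8-1e10 infective larvae per ha); the response values are hypothetical.
import numpy as np

doses = np.array([1e8, 3e8, 1e9, 3e9, 1e10])           # infective larvae ha^-1
undamaged = np.array([52.0, 58.0, 66.0, 73.0, 80.0])    # hypothetical undamaged plant equivalents

slope, intercept = np.polyfit(np.log10(doses), undamaged, 1)
print(f"undamaged plant equivalents ~ {intercept:.1f} + {slope:.1f} * log10(dose)")
```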

5.
Summary The growth-promoting capacities of the insecticides/nematicides Curater (2,3-dihydro-2,2-dimethylbenzofuran-7-yl methylcarbamate), Temik (2-methyl-2-(methylthio)propionaldehyde O-(methylcarbamoyl)oxime) and Mocap (O-ethyl S,S-dipropyl phosphorodithioate) were compared. For each product, three equimolar doses were applied: 1.3×10⁻³, 2.6×10⁻³ and 5.2×10⁻³ mol per m². The treatments were applied to a field soil, part of which was left untreated and part of which was steamed, to test the direct effect of the products on the test plants. In the first experiment, wheat, lettuce and mungbeans were sown immediately after broadcast application of the insecticides. In the second experiment, three maize cultivars (Suroit, Breda and LG 12) were used as test plants: one half of the soil was treated with the insecticides just before sowing the maize, while the other half had originally been treated in the previous experiment. Considering the effect on growth (the weight of the two-month-old wheat and lettuce plants and the length of the mungbeans) in unsteamed and steamed soil, increasing amounts of Mocap had a negative effect on the growth of beans and wheat; in contrast, the weight of the lettuce was markedly increased. Curater 5G and Temik 10G had no influence on the mungbeans and a slightly positive effect on the growth of lettuce. The growth of wheat was slightly decreased by Temik 10G and slightly increased by Curater 5G. The growth of maize was decreased by increasing amounts of Mocap in both steamed and unsteamed soil. However, in plots treated in the previous experiment the reverse effect was observed, growth being slightly improved.

6.
Winter wheat was grown for six successive years (Expt 1) and for three successive years (Expt 2) in field experiments on different soil types. Artificial inoculum of the take-all fungus (Gaeumannomyces graminis var. tritici cultured on autoclaved oat grains) was incorporated into the soil of some of the plots just before, or at, sowing of the first winter wheat crop. Expt 1 tested the incorporation of similar amounts of inoculum (212 kg ha⁻¹) at different depths. Expt 2 tested different amounts of inoculum at the same, shallow depth. Early sowing (September), late sowing (October) and spring inoculation were additional treatments, applied to the first crop only, in Expt 2. Seasonal factors apart, the disease outcome in the first year after inoculation depended on the amount and placement of applied inoculum, as well as on date of sowing. Deeper inoculum resulted in less disease (Expt 1). Severe take-all was produced in Expt 2 by incorporating inoculum shallowly in sufficient quantities (400 kg ha⁻¹ or more). Less inoculum (200 kg ha⁻¹) generated less disease, especially in earlier-sown plots. Differences in disease amongst inoculum treatments were greatest in the first year and diminished subsequently, particularly where sowing had been early in the first year. In Expt 1, where first crops exposed to artificial inoculum developed moderate-to-severe disease, disease in subsequent second and/or third crops was less. In the fourth crop a second peak of disease occurred, coinciding with a first peak in sequences without added inoculum. Take-all decline (TAD) appeared to be expressed in all sequences thereafter. In Expt 2, in sequences without added inoculum, TAD occurred after a peak of disease in the second crop. Where 400 kg ha⁻¹ or more of inoculum was added, disease was severe in the first year and decreased progressively in successive years. Disease was less patchy in plots that received artificial inoculum. However, it remains uncertain whether severe disease caused by artificial inoculation achieved an early onset of true TAD. The infectivity of the top 12 cm of soil in the first 3 yr of Expt 1, determined by bioassay, depended on the depth of added inoculum and the amount of disease in subsequent crops. However, at the time of the naturally occurring peak of disease severity (in either inoculated or non-inoculated plots) it did not predict either disease or TAD. Differences and similarities amongst epidemics developing naturally and those developing from different amounts and placements of applied inoculum have been revealed. The epidemiological implications of adding inoculum and the potential value of artificially created epidemics of take-all in field trials are discussed.

7.
The effect of rice culture on changes in the number of a strain of soybean root-nodule bacteria (Bradyrhizobium japonicum CB1809), already established in the soil by growing inoculated soybean crops, was investigated in transitional red-brown earth soils at two sites in south-western New South Wales. At the first site, 5.5 years elapsed between the harvest of the last of four successive crops of soybean and the sowing of the next. In this period three crops of rice and one crop of triticale were sown, and in the intervals between these crops, and after the crop of triticale, the land was fallowed. Before sowing the first rice crop, the number of Bradyrhizobium japonicum was 1.32×10⁵ g⁻¹ soil. The respective numbers of bradyrhizobia after the first, second and third rice crops were 4.52×10⁴, 1.26×10⁴ and 6.40×10² g⁻¹ soil. In the following two years the population remained constant. Thus sufficient bradyrhizobia survived in soil to nodulate and allow N₂ fixation by the succeeding soybean crop. At the second site, numbers of bradyrhizobia declined during a rice crop, but the decline was less than when the soil was fallowed (400-fold cf. 2200-fold). Multiplication of bradyrhizobia was rapid in the rhizosphere of soybean seedlings sown without inoculation in the rice bays. At 16 days after sowing, their numbers were not significantly different (at P = 0.05) from those in plots where rice had not been sown. Nodulation of soybeans was greatest in plots where rice had not been grown, but yield and grain nitrogen were not significantly different (at P = 0.05). Our results indicate that flooding soil has a deleterious effect on the survival of bradyrhizobia but, under the conditions of these experiments, sufficient B. japonicum strain CB1809 survived to provide good nodulation after three crops of rice covering a total period of 5.5 years between crops of soybean.

8.
Tipburn in lettuce is a physiological disorder expressed as a necrosis on the margins of young developing leaves and is commonly observed under saline conditions. Tipburn is usually attributed to Ca2+ deficiency, and there has been very limited research on other mechanisms that may contribute to tipburn development. This work examines whether symptoms are mediated by increased reactive oxygen species (ROS) production. Two butter lettuce (Lactuca sativa L.) varieties, Sunstar (Su) and Pontina (Po), with contrasting tipburn susceptibility were grown in hydroponics with low Ca2+ (0.5 mM), with or without 50 mM NaCl. Tipburn symptoms were observed only in Su, and only in the saline treatment. Tipburn incidence in response to topical treatments with Ca2+ scavengers, Ca2+ transport inhibitors, and antioxidants was assessed. All treatments were applied before symptom expression and evaluated later, when symptoms were expected to occur. Superoxide presence in tissues was determined with nitro blue tetrazolium (NBT), and oxidative damage as malondialdehyde (MDA) content. Superoxide dismutase (SOD), catalase (CAT) and ascorbate peroxidase (APX) activities were assayed. Under control and saline conditions, tipburn could be induced in both varieties by topical treatments with a Ca2+ scavenger (EGTA) and Ca2+ transport inhibitors (verapamil, LaCl3), and reduced by supplying Ca2+ together with an ionophore (A23187). Tipburn symptoms were associated with locally produced ROS. Superoxide levels and oxidative damage increased significantly in leaf margins before symptom expression, while topical antioxidant applications (Tiron, DPI) reduced symptoms in treated leaves, but not in the rest of the plant. Antioxidant enzyme activity was higher in Po and increased more in response to EGTA treatments, and may contribute to mitigating oxidative damage and tipburn expression in this variety.

9.
In a field experiment on winter wheat, take-all on plants and the infectivity of the soil were studied in crop sequences with different combinations of sowing dates. Take-all was negligible in the first wheat crop, but thereafter the mean disease intensity (measured using a take-all rating, TAR, with a maximum of 300) was 108, 190, 118 and 251 in the second to fifth successive crops. In each growing season, the disease differed amongst sequences and built up more rapidly, and was more intense, on plants sown in mid-September than on plants sown in mid-October. In late-sown plots where volunteers had been present during the mid-September to mid-October period, take-all reached an intensity intermediate between that in early-sown plots and that in late-sown plots that had been kept free of volunteers. Volunteers therefore partially offset the expected beneficial effect of decreased disease with later sowing. Differences in take-all amongst sequences were most pronounced in the second wheat crop, and early sowing of the previous wheat increased the intensity of disease. In the following (third) crop, differences in disease intensity amongst sequences were smaller. Soil infectivity (measured by seedling bioassay after harvest) built up progressively from a low level after the first crop to a peak after the third crop. In this build-up phase, soil infectivity estimates were always numerically greater after harvest of early-sown treatments than after later-sown treatments, although never significant at P = 0.05. The greatest difference (P = 0.06) was recorded in October before sowing of the third crop, where the comparison was between soil after two previous early sowings and soil after two previous later sowings with control of volunteers. In the same autumn, the presence of green cover (i.e. volunteers) was associated with a smaller loss of soil infectivity between harvest and later sowing than occurred in the absence of green cover. In the 2nd–4th crops, where comparisons were available and mean TARs indicated moderate levels of take-all, sowing later had no yield benefit, despite more take-all and greater soil infectivity associated with early sowing. Important considerations for the management of crops at risk of take-all are (1) choosing appropriate sowing dates to minimize take-all or to encourage take-all decline, and (2) controlling volunteers and weed hosts where crops are sown late to minimize take-all.

10.
In 1997 and 1998 the stimulation of hatch of potato cyst nematodes (PCN) by a trap crop was studied at various times during the growing season in a container and a field experiment. Solanum nigrum ‘90-4750-188’ was used as the trap crop in both experiments and was sown on 1 May, 16 June or 1 August in two successive years on different plots. Neither experiment revealed much seasonal variation in hatchability of PCN juveniles under a trap crop. In the container experiment, the hatch of the Globodera pallida Pa3 population was equally and strongly stimulated (89%) at all sowing dates in both years, except for the 1 August sowing in 1998 (when the hatch was 77% under extremely wet soil conditions). In the control treatment with non-hosts (flax followed by barley) the total spontaneous hatch was 50% over 2 yr. In the field experiment, the hatch of PCN, averaged over the four populations, was also equally stimulated (71%) at all sowing dates in both years. In the control treatment with non-hosts (flax-barley) the total spontaneous hatch was 36% over 2 yr. Total hatch under the trap crop over 2 yr varied between the four PCN populations from 63% to 80%. In 1998 and 1999, control of potato cyst nematodes (PCN) by the potential trap crops Solanum sisymbriifolium and S. nigrum ‘90-4750-188’ was studied in the field. Potato was also included as a trap crop. In the 1998 experiment, potato, S. sisymbriifolium and S. nigrum strongly stimulated the hatch of PCN compared with the non-host white mustard (Sinapis alba). Roots of potato and white mustard were mainly found in the top 10 cm of soil, whereas roots of S. sisymbriifolium and S. nigrum were also abundant at depths of 10–20 cm and 20–30 cm. In the 1999 experiment, soil infestation with PCN decreased markedly with potato and S. sisymbriifolium as trap crops. In plots moderately to severely infested with 2-yr-old cysts (2–29 juveniles ml⁻¹ air-dried soil), potato reduced soil infestation by 87% and S. sisymbriifolium by 77%. In plots moderately to severely infested with 1-yr-old cysts the reductions were 74% and 60%, respectively. The reduction was least on plots very severely infested with PCN (110–242 juveniles ml⁻¹ soil): 69% and 52% for potato and S. sisymbriifolium, respectively. Soil infestations of plots that were initially slightly to severely infested with the root-knot nematode Meloidogyne hapla were greatly reduced under fallow and S. sisymbriifolium but increased under potato. From these and previous experiments it was concluded that, for several reasons, S. sisymbriifolium is a promising trap crop.

11.
Tipburn is an irreversible physiological disorder of Chinese cabbage that decreases crop value. Because of a strong environmental component, tipburn-resistant cultivars are the only solution, although tipburn resistance genes are unknown in Chinese cabbage. We studied three populations of Chinese cabbage over four growing seasons under field conditions: (a) 194 diverse inbred lines, (b) a doubled haploid (DH100) population, and (c) an F2 population. The 194 lines were genotyped using single nucleotide polymorphism markers, and genome-wide association mapping showed that 24 gQTLs were significantly associated with tipburn disease index. Analysis of the DH100 and F2 populations identified a shared tipburn-associated locus, gqbTRA06, which covered the region defined by one of the 24 gQTLs. Of 35 genes predicted in the 0.14-Mb quantitative trait locus region, Bra018575 (calreticulin family protein, BrCRT2) showed higher expression levels during disease development. We cloned the two BrCRT2 alleles from tipburn-resistant (BrCRT2R) and tipburn-susceptible (BrCRT2S) lines and identified a 51-bp deletion in BrCRT2S. Overexpression of BrCRT2R increased Ca2+ storage in the Arabidopsis crt2 mutant and also reduced cell death in leaf tips and margins under Ca2+-depleted conditions. Our results suggest that BrCRT2 is a possible candidate gene for controlling tipburn in Chinese cabbage.

12.
韩懂懂, 杨光, 邸雪颖, 李兆国. 《生态学报》 (Acta Ecologica Sinica), 2023, 43(21): 8727–8738
Exploring the drivers of soil physicochemical properties in burned areas is an important part of explaining how ecosystems respond to fire disturbance. This study aimed to identify the determining factors of soil physicochemical properties in burned Larix gmelinii forest, deepen understanding of the role of fire in boreal forest ecosystems, and provide scientific support and a theoretical basis for post-fire recovery of burned boreal forest and sustainable forestry. Burned Larix gmelinii stands one year after fire were taken as the study object, and 35 plots covering different fire severities and site conditions, together with control plots, were established. In each plot, tree data such as species, diameter at breast height and survival status were recorded; topographic data including aspect, slope position, slope and elevation were measured; and fire severity was quantified from vegetation change. Soil samples were collected at 0–5 cm and 5–10 cm depths, nine physicochemical indicators were measured, and these indicators were compared between burned and unburned plots. The trends linking each soil indicator to the quantified fire severity were then analysed, and the relative influence of fire severity, topography and tree factors on the soil physicochemical properties of the burned areas was compared. Compared with the control plots, fire increased the dispersion of the observed soil physicochemical values and significantly increased soil moisture content (MC) at 0–5 cm (P<0.05) and MC, total nitrogen (TN) and total organic carbon (TOC) at 5–10 cm (P<0.05). Not many soil physicochemical properties showed a clear trend with fire severity; it was observed that 0–5 cm soil MC, with increasing fire severity index, ...

13.
Plant and soil nitrogen isotope ratios (δ15N) were studied in experimental grassland plots of varying species richness. We hypothesized that partitioning of different sources of soil nitrogen among four plant functional groups (legumes, grasses, small herbs, tall herbs) should increase with diversity. Four years after sowing, all soils were depleted in 15N in the top 5 cm whereas in non-legume plots soils were enriched in 15N at 5–25 cm depth. Decreasing foliar δ15N and Δδ15N (= foliar δ15N − soil δ15N) values in legumes indicated increasing symbiotic N2 fixation with increasing diversity. In grasses, foliar Δδ15N also decreased with increasing diversity, suggesting enhanced uptake of N depleted in 15N. Foliar Δδ15N values of small and tall herbs were unaffected by diversity. Foliar Δδ15N values of grasses were also reduced in plots containing legumes, indicating direct use of legume-derived N depleted in 15N. Increased foliar N concentrations of tall and small herbs in plots containing legumes without reduced foliar δ15N indicated that these species obtained additional mineral soil N that was not consumed by legumes. These functional group and species specific shifts in the uptake of different N sources with increasing diversity indicate complementary resource use in diverse communities.
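As a purely illustrative aside, the Δδ15N index used in this abstract is simply the difference between foliar and soil isotope ratios; the sketch below computes it for hypothetical values, with the interpretation (lower Δδ15N in legumes pointing to symbiotic fixation of atmospheric N2, whose δ15N is 0 by definition) following the abstract.

```python
# Illustrative sketch only: delta-delta-15N = foliar d15N - soil d15N,
# computed for hypothetical foliar and soil values (per mil).
foliar_d15N = {"legume": 0.2, "grass": 1.5, "tall_herb": 3.0}   # hypothetical foliar values
soil_d15N = 3.5                                                  # hypothetical soil value

for group, foliar in foliar_d15N.items():
    delta = foliar - soil_d15N
    print(f"{group}: foliar d15N = {foliar:+.1f} per mil, delta-d15N = {delta:+.1f} per mil")
```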

14.
In glasshouses practising monoculture of butterhead lettuce in Belgium, high densities of pin nematodes (Paratylenchus spp.) are frequently associated with reduced plant growth. Growers currently apply chemical soil disinfestation measures to manage this problem, although stricter phytosanitary regulations are forcing a shift towards integrated management. Efficient implementation of such management requires knowledge about the factors influencing nematode population dynamics, and the damage threshold for lettuce. The nematode populations in five Belgian glasshouses were monitored for at least 1 year by frequent soil sampling at 0–30 cm and 30–60 cm depth. An undescribed species of Paratylenchus was identified in all glasshouses based on morphological and molecular features. High nematode densities (>20,000 (100 ml soil)⁻¹) occurred in winter and spring. Chemical soil disinfestation lowered these populations greatly, although up to 14% survived in the deeper soil layer. After soil steaming under negative pressure, no pin nematodes were found. After 2 months of black fallow, pin nematode densities were reduced by 50%–76%. Lamb's lettuce, parsley and wild rocket were found to be poor hosts in a pot experiment, while reproduction factors (Pf/Pi) on lettuce cultivars varied between 1 and 3. In three experiments with butterhead lettuce ‘Cosmopolia’ in pots with a series of 9 or 10 densities of Paratylenchus sp. [up to 35,000 (100 ml soil)⁻¹], no damage to lettuce heads was observed. However, root weight and root quality were reduced, and the corresponding damage thresholds were rather low [1,754 and 362 Paratylenchus sp. (100 ml soil)⁻¹, respectively]. Management strategies such as crop rotation, soil disinfestation or fallow are recommended to avoid pin nematode population build-up.

15.
Crisphead lettuce (Lactuca sativa L.) crops exhibit several economically important physiological disorders when grown in high-temperature conditions. These include tipburn, rib discoloration, premature bolting, ribbiness, and internal rib cracking. We evaluated seven physiological disorders and three agronomic traits segregating in a recombinant inbred line (RIL) population consisting of 152 F7 RILs derived from an intra-specific cross between two crisphead cultivars, L. sativa cv. Emperor × L. sativa cv. El Dorado; evaluations were carried out at each of two parental maturities in one planting and at one intermediate maturity in a second planting in each of 2 years, for a total of six evaluations. A genetic map was developed using 449 polymorphic SNP markers; it comprises 807 cM in 20 linkage groups covering 51% of the nine lettuce chromosomes. Composite interval mapping revealed a total of 36 significant QTLs for eight of the ten traits evaluated. Significant QTLs were distributed in 11 linkage groups on seven of the chromosomes and accounted for up to 83% of the phenotypic variation observed. The three largest QTLs for rib discoloration, which individually accounted for 7–21% of the variation, were clustered with QTLs for stem length, two with ribbiness and one with head firmness. Three major clusters of QTLs revealed pleiotropic effects or tight linkage between tipburn incidence and severity, head type, stem length, head firmness and ribbiness. One QTL, qTPB5.2, was detected in multiple trials and explained 38–70% of the variation in tipburn incidence. qTPB5.2 is therefore a useful candidate locus for breeding for tipburn resistance using marker-assisted selection.

16.
In the 1990s, during wet seasons, a new disease causing brown leaf spots on lettuce (Lactuca sativa) was found for the first time in many lettuce-growing areas of Austria and Germany. The causal agent, a new pathogenic species called Septoria birgitae, may be responsible for total crop loss. To study how temperature, inoculum density and leaf wetness period influence the incidence and severity of leaf spot on lettuce caused by S. birgitae, we carried out in vivo experiments in growth chambers and in the field. Additionally, we evaluated the relevance of infected plant debris acting as a primary inoculum source in soil for subsequent crops. S. birgitae produces spores over a wide temperature range between 5°C and 30°C, and can infect plants at temperatures between 10°C and 30°C, with an optimum between 20°C and 30°C. Spores of S. birgitae at a density of at least 10³ conidia mL⁻¹ are essential for disease outbreak on lettuce. Because leaf wetness is crucial for releasing conidia from pycnidia, we studied the impact of leaf wetness duration on disease development under various temperature conditions. For relevant leaf spot disease development on lettuce in vivo, a leaf wetness duration of at least 24 h and temperatures higher than 10°C were necessary. Leaf spot disease development in the field required several leaf wetness periods longer than 20 h at approximately 15°C at the beginning of crop cultivation. Incorporating S. birgitae-infected plant debris into soil as a primary inoculum was not relevant for leaf spot disease outbreak in the next year. However, in cases of continuous cropping of lettuce on the same field and in the same season, Septoria-infected lettuce debris may become more relevant.

17.
Evidence for biological nature of the grape replant problem in California
Westphal, Andreas; Browne, Greg T.; Schneider, Sally. Plant and Soil, 2002, 242(2): 197–203
A bioassay was developed to investigate causes of grape replant problems under controlled conditions. Soils were collected from methyl bromide-fumigated and non-fumigated plots at a site cleared of a 65-year-old grape vineyard (Vitis vinifera cv. Thompson Seedless) at Parlier, CA. Subsamples of the non-fumigated soil were either left non-treated, autoclaved (twice for 45 min), or heated at 40, 50, 60, 70, 80 or 90 °C for 30 min. Subsequently, the samples were placed in 120-mL pots, planted with rooted hardwood grape cuttings (V. vinifera cv. Carignane) and placed in a greenhouse or growth chamber. Three months after transplanting, vines from non-treated or 40 °C-treated soil had lower shoot weights and lower densities of healthy lateral roots than vines from the other treatments. Pythium spp. were isolated from 45 to 55% of the plated root segments from vines grown in non-treated soil or soil that had been heated at 40 or 50 °C, but were not detected in roots from soil given the other treatments. Egg masses of the root-knot nematode, Meloidogyne spp., were produced on roots from non-treated or 40 °C-treated soil, but no egg masses were detected on roots from the other treatments. In another test with the same soils, remnant roots from non-fumigated or pre-plant methyl bromide-fumigated soil were extracted and added to non-fumigated soil, soil from fumigated field plots, soil fumigated in a small container, or autoclaved potting mix. The transfer of old vine roots from non-fumigated field soil resulted in incidence of Pythium spp. on grape assay roots, but there was no measurable effect of the transfer on the growth and health of the bioassay plant roots. The results of the bioassays indicate that the grape replant problem at the California site had biological causes. The bioassay approach may aid in future determinations of the etiology of grape replant problems.

18.
A simple and rapid bioassay was implemented to detect the germination activity of extracts from soils in pre- and post-burn conditions. Soil samples taken from burnt, unburnt and adjacent plots at depths of 0–2, 2–4, 4–6 and 6–8 cm before and after burning mesic grassland in South Africa were analysed for germination activity over an eight-week period. Soil samples were extracted using dichloromethane and bioassayed using Grand Rapids lettuce (Lactuca sativa L.) achenes (seeds). The Grand Rapids lettuce seeds exhibited greater germination percentages when treated with extracts from burnt soil than with extracts from the other plots. The magnitude of the germination activity declined with time since the burn. The Grand Rapids lettuce seeds also exhibited significantly higher germination when treated with unburnt soil extracts than with the control (distilled water), which indicates the existence of other factors controlling germination in unburnt soil. Germination activity in the adjacent plots decreased with time. These findings indicate that the germination activity of the smoke derived from burning plant material diffuses into the soil and that its persistence declines with time. Given that the soil seed bank contains viable seeds at moderate depth, and that these are initially unaffected by the heat of the fire, smoke residues following a fire can influence the germination and recruitment of plant species that are responsive to smoke-derived compounds and are represented in the germinable soil seed bank.

19.
A multi-fan system (MFS) for single culture beds was developed to improve the airflow in a plant factory with artificial light. The MFS had seven fans installed on both the front and back sides of the culture beds to generate airflow from two opposite horizontal directions. Fans pushing air into the culture bed served as air inlets, while those pulling air out of the culture bed served as air outlets. In this study, three airflow patterns were evaluated: T1, both the front and back sides of the culture bed were air inlets; T2, the front side was an air inlet and the back side was an air outlet; and T3, both the front and back sides were air outlets. A culture bed with no MFS was used as a control (T4). Lettuce growth and tipburn occurrence were evaluated, and leaf boundary layer resistance (1/gbv), sensible heat flux (Sh) and latent heat flux (Lh) of the lettuce plants were estimated. The airflow pattern in T1 improved the air velocity (Va), with an average of 0.75 m s⁻¹ and a coefficient of variation of 65%. The 1/gbv decreased significantly with increasing Va, and the lowest value of 54.0 s m⁻¹ was observed in T1. The low resistance to heat and moisture transfer enhanced the Sh and Lh of the lettuce plants: the average Sh and Lh were 40% and 46% higher in T1 than in T4. The fresh and dry weights of lettuce plants in T1 were 1.13 and 1.06 times those in T4, respectively. No tipburn was observed in lettuce plants grown under the MFS, whereas five leaves per plant were injured by tipburn in T4. The results indicate that improving the airflow can improve the growth of indoor-cultured lettuce and alleviate the occurrence of tipburn through the decrease in 1/gbv and the increase in the transpiration rate.

20.
The response of seven lettuce cultivars to two geographically distinct Lettuce mosaic virus (LMV) isolates (LMV-A, LMV-T) was statistically evaluated based on infection rate, virus accumulation and symptom severity in different time trials. LMV-A is characterized by its ability to systemically infect cv. Salinas 88 (an mo1²-carrying resistant cultivar) and by inducing mild mosaic symptoms. Among the lettuce cultivars, Varamin (a native cultivar), like cv. Salinas, was the most susceptible to both LMV isolates, whereas another native cultivar, Varesh, was tolerant to the virus, with minimal viral accumulation and symptom scores that differed significantly from the other cultivars at P < 0.05. LMV-A systemically infected all susceptible lettuce cultivars more rapidly and at a higher rate than LMV-T. This isolate also accumulated in lettuce cultivars at a significantly higher level, as determined by semiquantitative ELISA, and induced more severe symptoms than the LMV-T isolate at 21 dpi. This is the first evidence for an LMV isolate from Iran with the ability to systemically infect an mo1²-carrying resistant cultivar of lettuce. In this study, the accumulation level of LMV showed a statistically significant positive correlation with symptom severity on lettuce plants. Based on the results, the three evaluated parameters differed considerably among lettuce cultivars and virus isolates.
