Similar Documents
 20 similar documents found (search time: 218 ms)
1.
Anti-predator behaviors often entail foraging costs, and thus prey response to predator cues should be adjusted to the level of risk (threat-sensitive foraging). At the same time, dangerous predators (with high hunting success) should engender the evolution of innate predator recognition and appropriate anti-predator behaviors that are effective even upon the first encounter with the predator. This leads to the prediction that prey might respond more strongly to cues of dangerous predators that are absent than to cues of less dangerous predators that are actually present. In an applied context, this predicts an immediate and stronger response of ungulates to the return of top predators such as wolves (Canis lupus) in many parts of Europe and North America than to current, less threatening mesopredators. We investigated the existence of innate threat-sensitive foraging in black-tailed deer. We took advantage of a quasi-experimental situation in which deer had not experienced wolf predation for ca. 100 years and were only potentially exposed to black bears (Ursus americanus). We tested the response of deer to the urine of wolf (dangerous) and black bear (less dangerous). Our results support the hypothesis of innate threat-sensitive foraging, with clearly increased passive avoidance and olfactory investigation of cues from wolf and, surprisingly, no response to black bear. Prey that previously evolved under high risk of predation by wolves may react strongly to the return of wolf cues in their environments thanks to innate responses retained during the period of predator absence, and this could be the source of far stronger non-consumptive effects of the predator guild than currently observed.

2.
Behavioural correlations are at the heart of understanding how conflicting demands shape the evolution of ecologically important behaviours. Many studies have focused on the effects of negative behavioural correlations generated by time budget conflicts within situations. We examined an alternative possibility that involves positive behavioural correlations due to behavioural carryovers across situations. Specifically, we examined the role of behavioural carryovers in governing antipredator responses of streamside salamander larvae, Ambystoma barbouri, to predatory green sunfish, Lepomis cyanellus. Earlier work showed that these larvae suffer heavy sunfish predation due to high larval exposure to fish (high proportion of time spent out of refuge). Earlier work also showed that paradoxically, despite selection pressure from fish, these larvae show higher exposure in the presence of fish (poorer antipredator behaviour) than a sister species that inhabits fishless, ephemeral ponds. The standard time budget trade-off between feeding and antipredator behaviour does not appear to explain the observed antipredator behaviours. Instead, the present study shows that the relatively large proportion of time that larvae spend out of refuge (exposed) in fish pools in the daytime can be explained in part by behavioural correlations across situations. Specifically, larvae showed positive correlations among individuals in their daytime exposure in fish pools, nighttime exposure in fish pools, and exposure in fishless pools. The benefits of high exposure in fishless conditions (associated with high feeding and developmental rates) and high exposure in fish pools at night (necessary to drift out of fish pools) have apparently overridden the predation cost of being exposed in fish pools in the day. Behavioural correlations across situations might often result in ecologically important behaviours that appear maladaptive in an isolated context. 
Copyright 2003 The Association for the Study of Animal Behaviour. Published by Elsevier Science Ltd. All rights reserved.

3.
Many prey animals experience temporal variation in the risk of predation and therefore face the problem of allocating their time between antipredator efforts and other activities like feeding and breeding. We investigated time allocation of prey animals that balanced predation risk and feeding opportunities. The predation risk allocation hypothesis predicts that animals should forage more in low- than in high-risk situations and that this difference should increase with an increasing attack ratio (i.e. difference between low- and high-risk situations) and proportion of time spent at high risk. To test these predictions we conducted a field test using bank voles (Clethrionomys glareolus) as prey and the least weasel (Mustela nivalis nivalis) as the predator. The temporal pattern and intensity of predation risk were manipulated in large outdoor enclosures, and the foraging effort and patch use of voles were measured by recording giving-up densities. We did not observe any variation in feeding effort due to changes in the level of risk or the proportion of time spent under high-risk conditions. The only significant effect was found when the attack ratio was altered: the foraging effort of voles was higher in the treatment with a low attack ratio than in the treatment with a high attack ratio. Thus the results did not support the predation risk allocation hypothesis and we question the applicability of the hypothesis to our study system. We argue that the deviation between the observed pattern of feeding behaviour of bank voles and that predicted by the predation risk allocation hypothesis was mostly due to the inability of voles to accurately assess changes in the level of risk. However, we also emphasise the difficulties of testing hypotheses under outdoor conditions and with mammals capable of flexible behavioural patterns.
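The giving-up density (GUD) measure used in this study can be illustrated with a toy model: a forager quits a depletable patch when its instantaneous harvest rate falls to the summed costs of foraging (metabolic, predation, and missed-opportunity costs), so higher perceived risk predicts a higher GUD. The linear harvest function and all parameter values below are illustrative assumptions, not values from the study.

```python
def giving_up_density(total_cost, a=0.05):
    """Density N* at which intake rate a*N falls to the total cost of foraging.

    Assumes a simple linear functional response, harvest rate = a * N,
    where N is the current food density in the patch (illustrative only).
    """
    if a <= 0:
        raise ValueError("encounter rate 'a' must be positive")
    return total_cost / a

# A higher predation cost (e.g. weasel present) means the forager should
# quit the patch at a higher remaining food density, i.e. a higher GUD.
gud_low_risk = giving_up_density(total_cost=1.0)
gud_high_risk = giving_up_density(total_cost=2.0)
```

Measuring GUDs in artificial food patches, as done here, inverts this logic: a higher observed GUD is read as evidence of a higher perceived foraging cost.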

4.
Prey animals must balance antipredatory behaviour with foraging behaviour. According to the risk allocation hypothesis, prey increase antipredatory behaviour and reduce foraging activity during pulses of high risk, but under continuous risk, other activities must continue and antipredatory behaviour decreases despite the risk. We studied the impact of lynx presence on the vigilance behaviour of wild roe deer under conditions of (i) a pulsed elevated risk, by experimentally spreading lynx urine as an olfactory cue, and (ii) continuous risk, by comparing an area where lynx was eradicated 160 years ago to an area where lynx was re-introduced 30 years ago. Roe deer were extremely vigilant in response to the predator olfactory cue; however, roe deer vigilance did not differ measurably between areas with and without potential lynx presence. Deer were more vigilant before sunset than during the night at both study areas, probably due to long-term adaptation of roe deer to human hunting during daytime. Vigilance decreased from August to September even though activity of lynx increases in autumn, which may be a result either of increased foraging due to a decrease in food quality in autumn, or of changes in the social organization of the deer. Our results suggest that the degree of vigilance depends on environmental cues. We found that roe deer respond to lynx urine despite a long absence of lynx in the ecosystem. Our results support the risk allocation hypothesis for responses to pulses of high risk but not for responses to continuous elevated levels of risk.

5.
Temporal variation in predation risk may fundamentally influence antipredator responses of prey animals. To maximize lifetime fitness, prey must be able to optimize energy gain and minimize predation risk, and responses to current levels of risk may be influenced by background levels of risk. A 'risk allocation' model has recently been proposed to predict the intensity of antipredator responses that should occur as predation risk varies over time. Prey animals from high-risk environments should respond to predators with relatively low intensities of antipredator behaviour because long periods of antipredator behaviour may result in unacceptable decreases in levels of foraging activity. Moreover, animals that are under frequent risk should devote more energy to foraging during brief pulses of safety compared with animals under infrequent attack. In this study, we experimentally tested the risk allocation hypothesis. We exposed juvenile rainbow trout, Oncorhynchus mykiss, to three levels of risk (high, moderate and low) crossed with two levels of temporal variation (exposed to risk three times a day and once a day). In accordance with the model, we found that trout exposed to risky situations more frequently responded with significantly less intense antipredator behaviour than trout exposed to risk infrequently. The intensity of response of trout exposed to moderate risk three times a day decreased to levels similar to situations of no risk. However, in contrast to the second prediction of the model, animals under frequent risk were not more active during periods of safety compared with animals under infrequent risk. Although behaviour in the face of predation risk was dependent on the broader temporal context in which risk varied, the specific predictions of the risk allocation model were only partly supported.
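The core logic of the risk allocation hypothesis can be sketched numerically: if a forager must meet a fixed daily intake and prefers to feed during safe periods, then as the proportion of time spent at high risk grows, more of the required feeding is forced into high-risk periods, so antipredator intensity during those periods must fall. This toy model (fixed requirement R, maximum feeding rate m) is an illustrative assumption, not the full dynamic formulation of the original risk allocation model.

```python
def high_risk_feeding_rate(p, R=16.0, m=24.0):
    """Feeding rate required during high-risk periods.

    p : proportion of the day spent under high risk (0 < p < 1)
    R : total daily intake requirement (illustrative units)
    m : maximum feeding rate per unit time (illustrative)

    The forager feeds at the maximum rate m during the safe fraction
    (1 - p); any remaining requirement must be met under high risk.
    """
    if not 0 < p < 1:
        raise ValueError("p must be strictly between 0 and 1")
    deficit = max(0.0, R - (1.0 - p) * m)
    return min(m, deficit / p)

# As the fraction of time at high risk increases, feeding under risk must
# rise, i.e. antipredator intensity falls, which is the hypothesis's core
# prediction for animals living under frequent risk.
rates = [high_risk_feeding_rate(p) for p in (0.4, 0.6, 0.8)]
```

The trout experiment above tests exactly this monotonic prediction by manipulating how often risk pulses occur.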

6.
S. M. Dixon  R. L. Baker 《Oecologia》1988,76(2):200-205
Summary: We used laboratory studies to examine the role of predation risk and the cost of anti-predator behaviour in determining the behavioural response of several larval instars of Ischnura verticalis to a fish predator (Lepomis gibbosus). Smaller larvae were less susceptible to fish predation than larger larvae. Smaller larvae depressed movement to a greater degree in the presence of fish than did larger larvae; large larvae were generally less active than small larvae regardless of fish presence. Reduced feeding resulted in smaller larvae suffering more in terms of reduced growth than did large larvae. In general, our results tend to support the hypothesis that individuals that suffer high costs of anti-predator behaviour but little risk of predation may only exhibit anti-predator behaviours in the presence of predators, whereas individuals with a higher risk of predation and a lower cost of anti-predator behaviour may evolve anti-predator mechanisms that are in effect even in the absence of predators.

7.
8.
Models explaining behavioural syndromes often focus on state-dependency, linking behavioural variation to individual differences in other phenotypic features. Empirical studies are, however, rare. Here, we tested for a size- and growth-dependent stable behavioural syndrome in the juvenile stages of a solitary apex predator (pike, Esox lucius), shown as repeatable foraging behaviour across risk. Pike swimming activity, latency to prey attack, and numbers of successful and unsuccessful prey attacks were measured during the presence/absence of visual contact with a competitor or predator. Foraging behaviour across risks was considered an appropriate indicator of boldness in this solitary predator, where a trade-off between foraging behaviour and threat avoidance has been reported. Support was found for a behavioural syndrome, where the rank order differences in foraging behaviour between individuals were maintained across time and risk situation. However, individual behaviour was independent of body size and growth in conditions of high food availability, showing no evidence to support the state-dependent personality hypothesis. The importance of a combination of spatial and temporal environmental variation for generating growth differences is highlighted.

9.
《Animal behaviour》1988,36(5):1317-1322
This study examines the compromise between predator avoidance and foraging in bluntnose minnows, Pimephales notatus, in shoals of different sizes and at three levels of hunger: 5, 24 or 72 h of deprivation. Trials were carried out in the laboratory with a predator present or absent. Foraging was affected significantly by shoal size, predator presence and hunger. Foraging latency decreased as shoal size and hunger level increased, but latency increased in the presence of a predator. Foraging rate was lower when a predator was present. In the absence of a predator, foraging rate increased as hunger level increased. At the 5 h hunger level, foraging rate increased as shoal size increased, in the absence but not the presence of a predator. At this low level of hunger, minnows in all shoal sizes fed at a low rate when the risk of predation was greater. At higher levels of hunger, fish in all shoal sizes fed at a high rate when no predator was present, so that foraging rate did not change across shoal size. When the predator was present at the higher hunger levels, only fish in larger, safer shoals fed at a rate greater than at the lowest hunger level.

10.
In natural environments, predation risk varies over time. The risk allocation hypothesis predicts that prey should adjust key anti-predator behaviours such as vigilance to temporal variation in risk. We tested the predictions of the risk allocation hypothesis in a natural environment where both a species-rich natural predator community and human hunters are abundant, and where the differences in seasonal and circadian activity between natural and anthropogenic predators provided a unique opportunity to quantify the contributions of different predator classes to anti-predator behaviour. Whereas natural predators were expected to show similar levels of activity throughout the seasons, hunter activity was high during the daytime during a clearly defined hunting season. According to the risk allocation hypothesis, vigilance should then be higher during the hunting season and during daytime hours than during the non-hunting season and night-time hours. Roe deer (Capreolus capreolus) on the edge of Białowieża Primeval Forest in Eastern Poland displayed vigilance behaviour consistent with these predictions. The behavioural response of roe deer to temporally varying predation risks emphasises the behavioural plasticity of this species and suggests that future studies of anti-predator behaviour need to incorporate circadian variation in predation pressure as well as risk gradients of both natural and anthropogenic predators.

11.
Prey animals often respond to predators by reducing activity levels. This can produce a trait-mediated indirect interaction (TMII) between predators and prey resources, whereby reduced foraging by prey in the presence of a predator causes an increase in prey resources. TMIIs play important roles in structuring communities, and it is important to understand factors that determine their strength. One such influence may be behavioural variation in the prey species, with indirect effects of predators being stronger within populations that are more responsive to the presence of a predator. We tested 1) whether the behavioural responsiveness of populations of wood frog tadpoles to predator cues was related to the predation risk in their native ponds, and 2) whether more responsive tadpoles yielded stronger TMIIs. To do this, we 1) measured the activity of tadpoles from 18 populations in mesocosms with and without caged predators, and 2) measured changes in the biomass of periphyton (the tadpoles' diet) between predator treatments for each population. We found that tadpoles from higher predation risk ponds reduced their time outside refuges more in the presence of predators and tended to move less when visible, suggesting possible local adaptation to predation regimes. Though the presence of predators generally resulted in higher periphyton biomass – a TMII – there was no evidence that the strength of this TMII was affected by variation in tadpole behaviour. Foraging activity and general activity may be decoupled to some extent, enabling high predation risk-adapted tadpoles to limit the fitness costs of reduced foraging when predators are present.

12.
Prey modify their behaviour to avoid predation, but dilemmas arise when predators vary in hunting style. Behaviours that successfully evade one predator sometimes facilitate exposure to another predator, forcing the prey to choose the lesser of two evils. In such cases, we need to quantify behavioural strategies in a mix of predators. We model optimal behaviour of Atlantic cod Gadus morhua larvae in a water column, and find the minimal vulnerability from three common predator groups with different hunting modes: 1) ambush predators that sit-and-wait for approaching fish larvae; 2) cruising invertebrates that eat larvae in their path; and 3) fish, which are visually hunting predators. We use a state-dependent model to find optimal behaviours (vertical position and swimming speed over a diel light cycle) under any given exposure to the three distinct modes of predation. We then vary the abundance of each predator and quantify direct and indirect effects of predation. The nature and strength of direct and indirect effects varied with predator type and abundance. Larvae escaped about half the mortality from fish by swimming deeper to avoid light, but their activity level and cumulative predation from ambush predators increased. When ambush invertebrates dominated, it was optimal to be less active but in more lit habitats, and predation from fish increased. Against cruising predators, there was no remedy. In all cases, the shift in behaviour allowed growth to remain almost the same, while total predation was cut by one third. In early life stages with high and size-dependent mortality rates, growth rate can be a poor measure of the importance of behavioural strategies.

13.
In the context of conservation hatcheries that seek to bolster wild populations by releasing captively-reared fishes into the wild, steelhead Oncorhynchus mykiss were used to test the hypothesis that naturalistic rearing environments promote adaptive behaviour that might otherwise not develop in typical hatchery environments. When comparisons were made among fish reared in barren, structured or structurally variable environments (i.e. the location of the structure was repositioned every 2–3 days), structure in the rearing environment increased future exploratory behaviour, but only if the structure was stable. Under conditions of high perceived predation risk, the fish no longer exhibited increased exploratory behaviour, suggesting that it is expressed in an adaptive, context-dependent manner. Another concern with hatcheries is that relaxed selection over multiple generations in captivity can increase maladaptive behavioural variation. Compared to rearing in hatchery-typical barren environments, rearing in structured-stable environments decreased behavioural variation. This effect, which occurred during development and did not involve selection, demonstrates a different mechanism for change in behavioural variation in captivity. These experiments show that effects of structure and structural stability occur at the level of both average behaviour and behavioural variation, and suggest that these effects should be considered when fishes are reared in hatcheries for later release into the wild.

14.
Different hunger levels can modify a prey's antipredator behavior in the presence or absence of food. Satiated animals often forego foraging if a predator is nearby, whereas starved animals may risk a predator encounter to search for food. This study evaluated the influence of nutritional state on the behavior of the flatworm Dugesia dorotocephala in the presence of food, predator, and crushed conspecific cues. We found that flatworms are attracted to cues originating from a food source, crushed conspecifics, and a predator (dragonfly larva) compared with control cues. Among the different hunger level treatment groups, levels of satiation had no influence on activity levels but significantly influenced time spent close to and distance from the cue source. Flatworm movement toward predator cues emitted from dragonfly larvae is contrary to our expectations. These results suggest either a unique case of chemical mimicry from the dragonfly larvae or an inherent attraction of planarians to odonate predators that allows them to scavenge the remains of other odonate prey items.

15.
Synopsis: Shoals of 3, 5, 7, 10, 15, and 20 bluntnose minnows, Pimephales notatus, were allowed to forage in the absence and presence of a fish predator, which was separated from the shoal by a clear Plexiglas partition. A typical dilution effect was observed in that individual fish in larger shoals were approached less frequently by the predator. In the absence of a predator, foraging latency decreased significantly and the rate of foraging increased with increasing shoal size. Foraging latency for each shoal size tended to increase in the presence of a predator and foraging rate decreased, significantly so for shoals of 7, 15, and 20 fish. Members of larger shoals were safer and enjoyed a greater level of food consumption, perhaps due to decreased individual vigilance for predators and social facilitation. However, foraging effort decreased when a predator was present, as more time was allocated to predator avoidance.

16.
1. The sub-lethal effects of hydrologic disturbances on stream invertebrates are poorly understood, but integral to some models of how disturbances influence population and community dynamics. Carnivorous larvae of a net-spinning caddisfly, Plectrocnemia conspersa, have a strong predatory impact in some streams. Their silken nets, however, are vulnerable to high-flow disturbance, and the consequent destruction of nets could reduce predatory impacts and have life history consequences.
2. In a laboratory experiment, we manipulated the frequency of disturbances that destroyed the nets of P. conspersa, in the presence and absence of potential prey. Animals were housed individually and each trial lasted 8 days. We estimated net size, cumulative mass of silk produced, net allocation (net mass expressed as a proportion of body mass), per capita prey consumption and growth or mass loss of larvae.
3. In the absence of prey, increased disturbance frequency was accompanied by increased loss of body mass, a reduction of net size and an increase in the cumulative mass of silk produced. At the highest disturbance frequency, larvae eventually gave up producing nets. The ratio of net mass to body mass decreased with increasing disturbance, suggesting a trade-off in the allocation of resources, with a decreasing proportion of resources available for foraging. In the presence of prey, increased disturbance frequency was accompanied by a reduction in per capita prey consumption. Although foraging success offset the costs of silk production, growth rate decreased with increasing disturbance and could eventually lead to reduced body size and fecundity of adults.
4. These sub-lethal effects suggest that hydrologic disturbances could impose metabolic costs and reduce the foraging efficiency of this predator. Thus, disturbances may reduce predator impact on prey populations and reduce predator population size without any direct mortality or loss of individuals.

17.
The complexity of behavioural interactions in predator-prey systems has recently begun to capture trait effects, or non-lethal effects, of predators on prey via induced behavioural changes. Non-lethal predation effects play crucial roles in shaping population and community dynamics, particularly by inducing changes to foraging, movement and reproductive behaviours of prey. Prey exhibit trade-offs in behaviours while minimizing predation risk. We use a novel evolutionary ecosystem simulation, EcoSim, to study such behavioural interactions and their effects on prey populations, thereby addressing the need for integrating multiple layers of complexity in behavioural ecology. EcoSim allows complex intra- and inter-specific interactions among behaviourally and genetically unique predator and prey individuals, as well as complex predator-prey dynamics and coevolution in a tri-trophic and spatially heterogeneous world. We investigated the effects of predation risk on prey energy budgets and fitness. Results revealed that energy budgets, life history traits, and allocation of energy to movements and fitness-related actions differed greatly between prey subjected to low predation risk and high predation risk. High predation risk suppressed prey foraging activity, increased total movement and decreased reproduction relative to low risk. We show that predation risk alone induces behavioural changes in prey which drastically affect population and community dynamics, and when interpreted within the evolutionary context of our simulation indicate that genetic changes accompanying coevolution have long-term effects on prey adaptability to the absence of predators.

18.
Growth rate has been established as a key parameter influencing foraging decisions involving the risk of predation. Through genetic manipulation, transgenic salmon bred to contain and transmit a growth hormone transgene are able to achieve growth rates significantly greater than those of unmanipulated salmon. Using such growth-enhanced transgenic Atlantic salmon, we directly tested the hypothesis that relative growth rates should be correlated with willingness to risk exposure to a predator. We used size-matched transgenic and control salmon in two experiments where these fish could either feed in safety, or in the presence of the predator. The first experiment constrained the predator behind a Plexiglas partition (no risk of mortality), the second required the fish to feed in the same compartment as the predator (a finite risk of mortality). During these experiments, transgenic salmon had rates of consumption that were approximately five times that of the control fish and rates of movement approximately double that of controls. Transgenic salmon also spent significantly more time feeding in the presence of the predator, and consumed absolutely more food at that location. When there was a real risk of mortality, control fish almost completely avoided the dangerous location. Transgenic fish continued to feed at this location, but at a reduced level. These data demonstrate that the growth enhancement associated with the transgenic manipulation increases the level of risk these fish are willing to incur while foraging. If the genetic manipulation necessary to increase growth rates is achievable through evolutionary change, these experiments suggest that growth rates of Atlantic salmon may be optimized by the risk of predation. Copyright 1999 The Association for the Study of Animal Behaviour.

19.
Summary: Mayfly larvae of Paraleptophlebia heteronea (McDunnough) had two antipredator responses to a nocturnal fish predator (Rhinichthys cataractae (Valenciennes)): flight into the drift and retreat into interstitial crevices. Drift rates of Paraleptophlebia abruptly increased 30-fold when fish were actively foraging in the laboratory streams but, even before fish were removed, drift began returning to control levels because larvae settled to the substrate and moved to areas of low risk beneath stones. This drifting response was used as an immediate escape behavior which likely decreases risk of capture from predators which forage actively at night. Surprisingly, drift most often occurred before contact between predator and prey, and we suggest that in darkness this mayfly may use hydrodynamic pressure waves for predator detection, rather than chemical cues, since fish forage in an upstream direction. Although drifting may represent a cost to mayfly larvae in terms of relocation to a new foraging area with unknown food resources, the immediate mortality risk probably outweighs the importance of staying within a profitable food patch because larvae can survive starvation for at least 2 d. In addition to drifting, mayflies retreated from upper, exposed substrate surfaces to concealed interstitial crevices immediately after a predator encounter, or subsequent to resettlement on the substrate after predator-induced drift. A latency period was associated with this response and mayflies remained in these concealed locations for at least 3 h after dace foraging ceased. Because this mayfly feeds at night and food levels are significantly lower in field refugia under stones, relative to exposed stone surfaces, predator avoidance activity may limit foraging time and, ultimately, reduce the food intake of this stream mayfly.

20.
Prey must balance gains from activities such as foraging and social behavior with predation risk. Optimal escape theory has been successful in predicting escape behavior of prey under a range of risk and cost factors. The optimal approach distance, the distance from the predator at which prey should begin to flee, occurs when risk equals cost. Optimal escape theory predicts that for a fixed cost, the approach distance increases as risk increases. It makes no predictions about approach distance for prey in refuges that provide only partial protection, or about escape variables other than approach distance, such as the likelihood of stopping before entering refuge and escape speed. By experimentally simulating a predator approaching keeled earless lizards, Holbrookia propinqua, the predictions of optimal escape theory for two risk factors, predator approach speed and directness of approach, were tested. In addition, we tested the predictions that the likelihood of fleeing into refuge without stopping and the speed of escape runs increase with risk (in this case, predator approach speed), and that lizards in incompletely protective refuges permit closer approach than lizards not in refuges. Approach distance increased with predator approach speed and directness of approach, confirming predictions of optimal escape theory. Lizards were more likely to enter refuge and ran faster when approached rapidly, verifying that predation risk affects escape decisions by the lizards for escape variables not included in optimal escape theory. They allowed closer approach when in incompletely protective refuges than when in the open, confirming the prediction that risk affects escape decisions while in refuge. Optimal escape theory has been highly successful, but testing it has led to relative neglect of important aspects of escape other than approach distance.
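The risk-equals-cost logic of optimal escape theory can be sketched as a one-line solver: if perceived risk declines with distance to the approaching predator and rises with approach speed, the optimal approach distance d* is the distance at which risk(d*) equals the cost of fleeing. The hyperbolic risk function below is purely an illustrative assumption, not a form used in the study.

```python
def approach_distance(speed, flee_cost=1.0):
    """Distance d* at which perceived risk equals the cost of fleeing.

    Assumes risk(d) = speed / (1 + d): risk grows with predator approach
    speed and falls with distance (illustrative functional form only).
    Solving speed / (1 + d*) = flee_cost gives d* = speed / flee_cost - 1,
    floored at zero (the prey never flees a predator that poses no risk).
    """
    if flee_cost <= 0:
        raise ValueError("flee_cost must be positive")
    return max(0.0, speed / flee_cost - 1.0)

# Faster (riskier) approaches should trigger flight at greater distances,
# the prediction confirmed for approach speed in the lizard experiment.
d_slow = approach_distance(speed=2.0)
d_fast = approach_distance(speed=4.0)
```

Any monotone risk function gives the same qualitative prediction; the theory's power comes from the intersection condition, not the particular curve.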


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号