Similar articles
20 similar articles found (search time: 422 ms)
1.
Temporal variation in predation risk may fundamentally influence antipredator responses of prey animals. To maximize lifetime fitness, prey must be able to optimize energy gain and minimize predation risk, and responses to current levels of risk may be influenced by background levels of risk. A ‘risk allocation’ model has recently been proposed to predict the intensity of antipredator responses that should occur as predation risk varies over time. Prey animals from high‐risk environments should respond to predators with relatively low intensities of antipredator behaviour because long periods of antipredator behaviour may result in unacceptable decreases in levels of foraging activity. Moreover, animals that are under frequent risk should devote more energy to foraging during brief pulses of safety compared with animals under infrequent attack. In this study, we experimentally tested the risk allocation hypothesis. We exposed juvenile rainbow trout, Oncorhynchus mykiss, to three levels of risk (high, moderate and low) crossed with two levels of temporal variation (exposed to risk three times a day and once a day). In accordance with the model, we found that trout exposed to risky situations more frequently responded with significantly less intense antipredator behaviour than trout exposed to risk infrequently. The intensity of response of trout exposed to moderate risk three times a day decreased to levels similar to situations of no risk. However, in contrast to the second prediction of the model, animals under frequent risk were not more active during periods of safety compared with animals under infrequent risk. Although behaviour in the face of predation risk was dependent on the broader temporal context in which risk varied, the specific predictions of the risk allocation model were only partly supported.
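The model's two qualitative predictions can be illustrated with a toy allocation sketch. This is a deliberate simplification, not Lima and Bednekoff's actual model: assume prey must meet a fixed energy requirement over a day, foraging effort is capped, and prey fill safe periods with foraging first, spilling over into risky periods only when safe time saturates. All numbers below are hypothetical.

```python
def allocate_foraging(p_risky, energy_req, max_effort=1.0):
    """Toy risk-allocation sketch: prey meet energy_req by foraging
    first during safe periods (fraction 1 - p_risky of the day) and
    forage under risk only for any remaining shortfall.
    Assumes 0 < p_risky < 1."""
    safe_time = 1.0 - p_risky
    effort_safe = min(max_effort, energy_req / safe_time)
    shortfall = energy_req - safe_time * effort_safe
    effort_risky = max(0.0, shortfall / p_risky)
    return effort_safe, effort_risky

# Infrequent risk: all foraging fits into the long safe periods,
# so the prey can show maximal antipredator behaviour when at risk.
print(allocate_foraging(p_risky=0.2, energy_req=0.5))

# Frequent risk: safe periods saturate, so the prey must also forage
# under risk (a weaker antipredator response) and works harder during
# its brief pulses of safety -- the two predictions tested above.
print(allocate_foraging(p_risky=0.8, energy_req=0.5))
```

Under frequent risk the sketch yields both a nonzero risky-period effort and a higher safe-period effort, mirroring the predictions the trout experiment tested.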

2.
Many prey animals experience temporal variation in the risk of predation and therefore face the problem of allocating their time between antipredator efforts and other activities like feeding and breeding. We investigated time allocation of prey animals that balanced predation risk and feeding opportunities. The predation risk allocation hypothesis predicts that animals should forage more in low- than in high-risk situations and that this difference should increase with an increasing attack ratio (i.e. difference between low- and high-risk situations) and proportion of time spent at high risk. To test these predictions we conducted a field test using bank voles (Clethrionomys glareolus) as a prey and the least weasel (Mustela nivalis nivalis) as a predator. The temporal pattern and intensity of predation risk were manipulated in large outdoor enclosures and the foraging effort and patch use of voles were measured by recording giving-up densities. We did not observe any variation in feeding effort due to changes in the level of risk or the proportion of time spent under high-risk conditions. The only significant effect was found when the attack ratio was altered: the foraging effort of voles was higher in the treatment with a low attack ratio than in the treatment with a high attack ratio. Thus the results did not support the predation risk allocation hypothesis and we question the applicability of the hypothesis to our study system. We argue that the deviation between the observed pattern of feeding behaviour of bank voles and that predicted by the predation risk allocation hypothesis was mostly due to the inability of voles to accurately assess the changes in the level of risk. However, we also emphasise the difficulties of testing hypotheses under outdoor conditions and with mammals capable of flexible behavioural patterns.

3.
1. Studies examining the integration of constitutive and inducible aspects of multivariate defensive phenotypes are rare. 2. I asked whether marine snails (Nucella lamellosa) from habitats with and without abundant predatory crabs differed in constitutive and inducible aspects of defensive shell morphology. 3. I examined multivariate shell shape development of snails from each habitat in the presence and absence of waterborne cues from feeding crabs (Cancer productus). I also examined the influence of constitutive and inducible shell morphology on resistance to crushing. 4. Regardless of the presence of crabs, snails from high-risk (HR) habitats developed rotund, short-spired shells, while snails from low-risk habitats developed elongate, tall-spired shells, indicating among-habitat divergence in constitutive shell shape. Moreover, allometry analyses indicated that constitutive developmental patterns underlying this variation also differed between habitats. However, snails from HR habitats showed greater plasticity for apertural lip thickness and apertural area in the presence of crab cues, indicating among-habitat variation in defence inducibility. 5. Both shell shape and apertural lip thickness contributed to shell strength, suggesting that constitutive shell shape development and inducible lip thickening have evolved jointly to form an effective defence in habitats where predation risk is high.

4.
Predation is a strong selective force acting on prey animals. Predation is by nature highly variable in time; however, this aspect of predation risk has traditionally been overlooked by behavioural ecologists. Lima and Bednekoff proposed the predation risk allocation hypothesis (RAH), predicting how temporal variation in predation risk drives prey antipredator behaviours. This model is based on the concept that prey adaptively allocate their foraging and antipredator efforts across high‐ and low‐risk situations, depending on the duration of high‐ vs. low‐risk situations and the relative risk associated with each of them. An unstudied extension of the RAH is the effect of predictability of predation risk. A predictable risk should lead to prey displaying minimal vigilance behaviours during predictable low‐risk periods and the strongest antipredator behaviours during risky periods. Conversely, an unpredictable predation risk should result in prey displaying constant vigilance behaviour, with suboptimal foraging rates during periods of safety but antipredator behaviours of lower intensity during periods of risk. We tested this extension of the RAH using convict cichlids exposed to high‐risk alarm cues at two frequencies of risk (1× vs. 3×) per day, on either a fixed or random schedule for 5 d. We then tested the fish for a response to high‐risk cues (alarm cues) and to low‐risk cues (disturbance resulting from the introduction of distilled water). Our study supports previous results on the effects of risk frequency and cue intensity on cichlid behaviour. We failed to show an effect of risk predictability on the behavioural responses of cichlids to high‐risk alarm cues, but predictability did influence responses to low‐risk cues. We encourage further studies to test the effect of predictability in other systems.

5.
In theory, survival rates and consequent population status might be predictable from instantaneous behavioural measures of how animals prioritize foraging vs. avoiding predation. We show, for the 30 most common small bird species ringed in the UK, that one quarter respond to higher predation risk as if it is mass-dependent and lose mass. Half respond to predation risk as if it only interrupts their foraging and gain mass, thus avoiding the consequent increased starvation risk from reduced foraging time. These mass responses to higher predation risk are correlated with population and conservation status both within and between species (and independently of foraging habitat, foraging guild, sociality index and size) over the last 30 years in Britain, with mass loss being associated with declining populations and mass gain with increasing populations. If individuals show an interrupted foraging response to higher predation risk, they are likely to be experiencing a high-quality foraging environment that should lead to higher survival, whereas individuals that show a mass-dependent foraging response are likely to be in lower-quality foraging environments, leading to relatively lower survival.

6.
Predation risk is often associated with group formation in prey, but recent advances in methods for analysing the social structure of animal societies make it possible to quantify the effects of risk on the complex dynamics of spatial and temporal organisation. In this paper we use social network analysis to investigate the impact of variation in predation risk on the social structure of guppy shoals and the frequency and duration of shoal splitting (fission) and merging (fusion) events. Our analyses revealed that variation in the level of predation risk was associated with divergent social dynamics, with fish in high-risk populations displaying a greater number of associations with overall greater strength and connectedness than those from low-risk sites. Temporal patterns of organisation also differed according to predation risk, with fission events more likely to occur over two short time periods (5 minutes and 20 minutes) in low-predation fish and over longer time scales (>1.5 hours) in high-predation fish. Our findings suggest that predation risk influences the fine-scale social structure of prey populations and that the temporal aspects of organisation play a key role in defining social systems.

7.
Mortality by moonlight: predation risk and the snowshoe hare   (Cited by: 1; self-citations: 0; citations by others: 1)
Optimal behavior theory suggests that prey animals will reduce activity during intermittent periods when elevated predation risk outweighs the fitness benefits of activity. Specifically, the predation risk allocation hypothesis predicts that prey activity should decrease dramatically at times of high predation risk if there is high temporal variation in predation risk but should remain relatively uniform when temporal variation in predation risk is low. To test these predictions we examined the seasonally variable response of snowshoe hares to moonlight and predation risk. Unlike studies finding uniform avoidance of moonlight in small mammals, we find that moonlight avoidance is seasonal and corresponds to seasonal variation in moonlight intensity. We radio-collared 177 wild snowshoe hares to estimate predation rates as a measure of risk and used movement distances from a sample of those animals as a measure of activity. In the snowy season, 5-day periods around full moons had 2.5 times more predation than around new moons, but that ratio of the increased predation rate was only 1.8 in the snow-free season. There was no significant increase in use of habitats with more hiding cover during full moons. Snowshoe hares' nightly movement distances decreased during high-risk full-moon periods in the snowy season but did not change according to moon phase in the snow-free season. These results are consistent with the predation risk allocation hypothesis.

8.
Predators can affect the vertical distribution of mobile intertidal invertebrates in two ways: they can (1) cause greater mortality of prey at certain intertidal levels, and (2) induce prey to seek safer intertidal areas. In this study, we investigate whether low-intertidal and subtidal predators affect the intertidal distribution of two congeneric species of small herbivorous gastropods of northeastern Pacific shores, Littorina sitkana Philippi 1846, and L. scutulata Gould 1849. In particular, we tested the hypothesis that predators affect the distribution of these snails by inducing them to seek higher and safer intertidal areas. On a wave-sheltered shore in Barkley Sound, British Columbia, L. sitkana and L. scutulata were both killed by predatory crabs (e.g., Cancer productus) more frequently when tethered near the lower limit of their intertidal distribution (approximately 1 m) than when tethered where they were most common (approximately 2.5 m), suggesting that high mortality rates are partly responsible for the lower limit of these snails' intertidal distribution. However, two field mark-recapture experiments indicated that the snails' behavioral response to predation risk also influences their distribution. In the first experiment, snails from the 2.5-m level (low risk) transplanted to the 1.0-m level (high risk) displayed a strong and consistent tendency to move shoreward, especially L. sitkana, some traveling 10-15 m in 2-3 days to regain their original level. These shoreward movements were especially precise in the northern part of the study area, where predation rates on tethered snails were greatest. Furthermore, larger, more vulnerable snails were more strongly oriented shoreward than smaller individuals, indicating that antipredator behavior might also contribute to intertidal size gradients in these species. In the second mark-recapture experiment, we manipulated predation risk using small cages and found that snails exposed to the odors of C. productus crabs foraging on conspecific and heterospecific snails displayed more precise (L. sitkana and L. scutulata) and longer (L. sitkana) shoreward movements than snails held in control conditions. These results provide the first experimental evidence that antipredator behavior may contribute to the intertidal distribution patterns of littorinids.

9.
Theoretical models of prey behaviour predict that food‐limited prey engage in risk‐prone foraging and thereby succumb to increased mortality from predation. However, predation risk also may be influenced by factors including prey density and structural cover, such that the presumed role of prey hunger on predation risk may be obfuscated in many complex predator–prey systems. Using a tadpole (prey) – dragonfly larva (predator) system, we determined relative risk posed to hungry vs. sated prey when both density and structural cover were varied experimentally. Overall, prey response to perceived predation risk was primarily restricted to increased cover use, and hungry prey did not exhibit risk‐prone foraging. Surprisingly, hungry prey showed lower activity than sated prey when exposed to predation risk, perhaps indicating increased effort in search of refuge or spatial avoidance of predator cues among sated animals. An interaction between hunger level and predation risk treatments indicated that prey state affected sensitivity to perceived risk. We also examined the lethal implications of prey hunger by allowing predators to select directly between hungry and sated prey. Although predators qualitatively favoured hungry prey when density was elevated and structural cover was sparse, the overall low observed variation in mortality risk between hunger treatments suggests that preferential selection of hungry prey was weak. This implies that hunger effects on prey mortality risk may not be readily observed in complex landscapes with additional factors influencing risk. Thus, current starvation‐predation trade‐off theory may need to be broadened to account for other mechanisms through which undernourished prey may cope with predation risk.

10.
Temporal variation of antipredatory behavior and a uniform distribution of predation risk over refuges and foraging sites may create foraging patterns different from those anticipated from risk in heterogeneous habitats. We studied the temporal variation in foraging behavior of voles exposed to uniform mustelid predation risk and heterogeneous avian predation risk of different levels induced by vegetation types in eight outdoor enclosures (0.25 ha). We manipulated mustelid predation risk with weasel presence or absence and avian predation risk by reducing or providing local cover at experimental food patches. Foraging at food patches was monitored by collecting giving-up densities at artificial food patches, overall activity was automatically monitored, and mortality of voles was monitored by live-trapping and radiotracking. Voles depleted the food to lower levels in the sheltered patches than in the exposed ones. In enclosures with higher avian predation risk caused by lower vegetation height, trays were depleted to lower levels. Unexpectedly, voles foraged in more trays and depleted trays to lower levels in the presence of weasels than in the absence. Weasels match their prey's body size and locomotive abilities and therefore increase predation risk uniformly over both foraging sites and refuge sites that can both be entered by the predator. This reduces the costs of missing opportunities other than foraging. Voles changed their foraging strategy accordingly by specializing on the experimental food patches with predictable returns and probably reduced their foraging in the matrix of natural food source with unpredictable returns and high risk to encounter the weasel. Moreover, after 1 day of weasel presence, voles shifted their main foraging activities to avoid the diurnal weasel. This behavior facilitated bird predation, probably by nocturnal owls, and more voles were killed by birds than by weasels. Food patch use of voles in weasel enclosures increased with time. Voles had to balance the previously missed feeding opportunities by progressively concentrating on artificial food patches.

11.
An animal's foraging decisions are the outcome of the relative importance of the risk of starvation and predation. Fat deposition insures against periods of food shortage but it also carries a cost in terms of mass-dependent predation risk due to reduced escape probability and extended exposure time. Accordingly, birds have been observed to show a unimodal foraging pattern with foraging concentrated at the end of the day under conditions of predictable food resources and high predation risk. We tested this hypothesis in a tropical granivorous finch, the rock firefinch, Lagonosticta sanguinodorsalis, in an outdoor aviary experiment during which food was provided ad lib and the risk of predation was varied by providing food either adjacent to, or 5 m away from cover. Rock firefinches showed a bimodal foraging pattern regardless of the risk of predation at which they fed. The results suggest that predation is relatively unimportant in shaping their daily feeding pattern despite mass gain during the day being similar to temperate birds. Foraging patterns closely follow diurnal temperature variation and this is suggested to be the main determinant of the observed bimodal pattern.

12.
1. Foraging herbivores must deal with plant characteristics that inhibit feeding and they must avoid being eaten. Principally, toxins limit food intake, while predation risk alters how long animals are prepared to harvest resources. Each of these factors strongly affects how herbivores use food patches, and both constraints can pose immediate proximate costs and long-term consequences to fitness. 2. Using a generalist mammalian herbivore, the common brushtail possum (Trichosurus vulpecula), our aim was to quantitatively compare the influence of plant toxin and predation risk on foraging decisions. 3. We performed a titration experiment by offering animals a choice between non-toxic food at a risky patch paired with food with one of five toxin concentrations at a safe patch. This allowed us to identify the tipping point, where the cost of toxin in the safe food patch was equivalent to the perceived predation risk in the alternative patch. 4. At low toxin concentration, animals ate more from the safe than the risky patch. As toxin concentration increased at the safe patch, intake shifted until animals ate mainly from the risky patch. This shift was associated with behavioural changes: animals spent more time and fed longer at the risky patch, while vigilance increased at both risky and safe patches. 5. Our results demonstrate that the variation in toxin concentration, which occurs intraspecifically among plants, can critically influence the relative cost of predation risk on foraging. We show that herbivores quantify, compare and balance these two different but proximate costs, altering their foraging patterns in the process. This has potential ecological and evolutionary implications for the production of plant defence compounds in relation to spatial variation in predation risk to herbivores.

13.
For species that cannot seek cover to escape predators, aggregation becomes an important strategy to reduce predation risk. However, aggregation may not be entirely beneficial because aggregated animals may compete for access to limited resources and might even attract predators. Available evidence suggests that foraging competition influences time allocation in large-bodied macropodid marsupials, but previous studies have focused primarily on species in areas with protective cover. We studied red kangaroos, a species often found in open country without noticeable cover, to determine whether they experienced a net benefit by aggregation. Red kangaroos varied their time allocation as a function of group size and, importantly, more variation in time allocation to vigilance and foraging was explained by non-linear models than by linear models. This suggests red kangaroos directly translated the reduction of predation risk brought about by aggregation into greater time foraging and less time engaged in vigilance. We infer that red kangaroos received a net benefit by aggregation. Social species living in the open may be generally expected to rely on others to help manage predation risk. Communicated by K. Kotrschal

14.
Although a variety of behaviors expose animals to some risk of predation, there is no accepted way to compare their relative risk. For animals that retreat to refugia when alarmed by predators, the proportion of time devoted to each out-of-refuge behavior multiplied by the total time required to return to a refuge can be used to compare a behavior's relative predation risk. Total time to return to a refuge is a function of both response time (the time required to respond to an increased risk of predation) and travel time (the time required to flee to a refuge once alarmed). Quantifying these components can illustrate how animals minimize exposure to predators. Golden marmots (Marmota caudata aurea) were a refuging prey species used to examine the utility of this measure and to understand how marmots minimized their risk of exposure to predation. Golden marmots devoted different amounts of time to looking, foraging, self-grooming, and playing. To estimate the behavior-specific time required to return to refugia, the location of different activities was noted and a behavior-specific travel time was calculated. Alarm calls were played back to marmots engaged in different behaviors to determine, in a standardized manner, if there were behavior-specific response times. Marmots appeared to minimize their predation risk by performing most behaviors close to refugia. Results suggest that foraging was the riskiest behavior, largely because marmots foraged far from refugia and spent about 30% of their time foraging. While sample sizes were small, results also suggested that play, a rare adult behavior, exposed animals to predation because of a relatively long response time.
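The measure described above is directly computable: relative risk of a behavior = proportion of time in that behavior × (response time + travel time to a refuge). A minimal sketch with hypothetical time budgets (the numbers below are made up for illustration; only the 30% foraging share echoes the abstract):

```python
def relative_risk(time_fraction, response_s, travel_s):
    """Relative predation risk of one out-of-refuge behavior:
    share of time in the behavior times the total time (seconds)
    needed to regain a refuge once a predator appears."""
    return time_fraction * (response_s + travel_s)

# Hypothetical time budgets and refuge-return components.
behaviors = {
    "foraging": relative_risk(0.30, 2.0, 6.0),  # performed far from refugia
    "looking":  relative_risk(0.50, 0.5, 1.0),  # near refugia, fast response
    "grooming": relative_risk(0.15, 3.0, 1.5),
    "playing":  relative_risk(0.05, 5.0, 2.0),  # rare, but slow response
}
riskiest = max(behaviors, key=behaviors.get)
print(riskiest, behaviors[riskiest])  # foraging is riskiest under these numbers
```

With these illustrative values the large time share and long travel time make foraging the riskiest behavior, the same qualitative conclusion the marmot study reached.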

15.
David B. Lewis & Lisa A. Eby, Oikos (2002) 96(1): 119-129
The effect of habitat structure on interactions between predators and prey may vary spatially. In estuarine salt marshes, heterogeneity in refuge quality derives from spatial variation in vegetation structure and in tidal inundation. We investigated whether predation by blue crabs on periwinkle snails was influenced by distance from the seaward edge of the salt marsh and by characteristics of the primary habitat structure, smooth cordgrass (Spartina alterniflora). Spartina may provide refuge for snails and interfere with foraging by crabs. Furthermore, predation risk should decline with distance from the seaward edge because landward regions require more travel time for crabs during tidal inundation. We investigated these processes using a comparative survey of snails and habitat traits, an experiment that assessed the crab population and measured predation risk, and a size-structured model that estimated encounter rates. Taken together, these approaches indicated that predation risk for snails was lower where Spartina was present and was lower in a landward direction. Furthermore, Spartina architecture and distance from the seaward edge interacted. The strength of the predation gradient between seaward and landward regions of the marsh was greater where Spartina was tall or dense. These predation gradients emerge because vegetation and distance inland decrease encounter rates between crabs and snails. This study suggests that habitat modification, a process not uncommon in salt marshes, may have consequences for interactions among intertidal fauna.

16.
Recent studies show that fish forage actively when perceived risk is low, but decrease foraging and increase vigilance when perceived risk is high. Isolated juvenile chum salmon, Oncorhynchus keta, were visually exposed to groups of conspecifics engaged in different activities to examine their ability to gain information about foraging opportunity and risk by interpreting conspecific behavior. Isolates ate most when exposed to feeding groups, less when exposed to nonfeeding groups, and least when exposed to alarmed groups. Isolates exposed to alarmed conspecifics also spent more time motionless than did fish exposed to either feeding or nonfeeding conspecifics. These findings indicate that schooling fish gain information by interpreting conspecific behavior, and are consistent with research showing that animals balance the conflicting demands of foraging and vigilance.

17.
The effects of predation on the use of social foraging tactics, such as producing and scrounging, are poorly known in animals. On the one hand, recent theoretical models predict increased use of scrounging with increasing predation risk, when scroungers seeking feeding opportunities also have a higher chance of detecting predators. On the other hand, there may be no relation between tactic use and predation when antipredator vigilance is not compatible with scanning flockmates. We investigated experimentally the effects of predation risk on social foraging tactic use in tree sparrows, Passer montanus. We manipulated predation risk in the field by changing the distance between shelter and a feeder. Birds visited the feeder in smaller flocks, spent less time on it and were somewhat more vigilant far from shelter than close to it. Increased predation risk strongly affected the social foraging tactic used: birds used the scrounger tactic 30% more often far from cover than close to it. Between-flock variability in scrounging frequency was not related to the average vigilance level of the flock members, and within-flock variability in the use of scrounging was negatively related to the vigilance of birds. Our results suggest that in tree sparrows, the increased frequency of scrounging during high predation risk cannot simply be explained by an additional advantage of increasing antipredator vigilance. We propose alternative mechanisms (e.g. increased stochasticity in food supply, and that riskier places are used by individuals with lower reserves) that may explain increased scrounging when animals forage under high predation risk.

18.
Whereas there are many studies of the time allocated to antipredator vigilance while animals forage, the vast majority of these studies remain correlative. This is potentially problematic because a variety of factors other than variation in perceived risk might influence putative antipredator behaviors such as time allocated to vigilance and foraging. We conducted an experimental study of yellow‐bellied marmot (Marmota flaviventris) antipredator behavior while marmots foraged at a replicated set of feeding stations established 1, 5, 10, and 20 m from their main burrows. Marmots appeared to perceive a reduced risk of predation when they foraged in the presence of other marmots; they allocated more time to foraging and decreased the time allocated to vigilance. When they foraged farther from their burrows, marmots initiated foraging after a substantially greater amount of time, tended to increase the frequency of their bouts of vigilance, and decreased the duration of each bout. Yearling marmots took less time to begin foraging than adults. Marmot flight initiation distance at our feeding trays was independent of the distance they foraged away from the burrow. Taken together, these experimental results demonstrate that marmots' perceptions of risk increased with distance to the burrow and decreased when other individuals were within 10 m of them while they foraged.

19.
Small passerines face a trade‐off when foraging during winter: increasing energy reserves makes them more vulnerable to predators, while a low level of reserves exposes them to a high risk of starvation. Whether small birds under these circumstances can afford to reduce their foraging activity under increased predation risk, for example at feeding sites more exposed to predators, has remained controversial in previous behavioural and ecological research. In this study, we investigated the foraging activity of free‐living Tree Sparrow Passer montanus flocks feeding on an artificial feeding platform. The predation risk perceived by the sparrows was manipulated by placing the platform either close to or far from a bushy shelter. Foraging activity, assessed as cumulative activity of sparrows per unit time on the platform, did not differ between the low‐risk and the high‐risk conditions and did not change significantly during the day. Feeding efficiency, assessed as pecking rate, was also not reduced under the high‐risk condition. Our results suggest that sparrows were forced to feed almost continuously during the day in order to maintain their preferred level of energy reserves. However, several behavioural changes helped sparrows adopt a safer foraging policy when feeding far from cover, as we found in another study. Altogether, sparrow flocks feeding far from cover decreased their overall foraging time (the time when any sparrow stayed on the platform) by approximately 20% compared with the near‐cover condition. A possible way to maintain the same level of foraging activity despite the reduction in overall foraging time is discussed.

20.
Prey animals encountering multiple stimuli must often make behavioral tradeoffs. Many environmental cues may influence the tradeoff observed, but recent theoretical work suggests that temporal variation in risk should influence how prey animals behave during any given period of risk. As time spent under risk of predation increases, prey animals will increase their allocation of foraging during periods of risk. This model is known as the risk allocation hypothesis (RAH; Lima & Bednekoff 1999). We tested the RAH using the crayfish Orconectes virilis. We selected two frequency regimes (exposure to risk every 6 or 12 h) and three cues suggestive of increasing risk (water, snapping turtle cue, and conspecific alarm cue). Test animals were exposed to one of the six frequency × risk combinations for 24 h, followed immediately by the simultaneous introduction of a food and a risk cue. Three behaviors (burrow use, non‐ambulatory motion, and locomotion) were then recorded for 5 min. Responses were significantly influenced by the interaction of risk and frequency. Further analysis indicated that responses were not consistently influenced by frequency alone. While our results do not support the predictions of the RAH for our frequency regimes, qualitative comparison with an earlier, similar study (Hazlett 1999) suggests that risk allocation is occurring in this system. We recommend that frequency of encounter with risk be considered in future studies. Ignoring temporal variation may lead to over‐ or underestimation of the subject's natural responses.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号