Relaxation of risk-sensitive behaviour of prey following disease-induced decline of an apex predator, the Tasmanian devil

2015 ◽  
Vol 282 (1810) ◽  
pp. 20150124 ◽  
Author(s):  
Tracey Hollings ◽  
Hamish McCallum ◽  
Kaely Kreger ◽  
Nick Mooney ◽  
Menna Jones

Apex predators structure ecosystems through lethal and non-lethal interactions with prey, and their global decline is causing loss of ecological function. Behavioural changes of prey are some of the most rapid responses to predator decline and may act as an early indicator of cascading effects. The Tasmanian devil (Sarcophilus harrisii), an apex predator, is undergoing progressive and extensive population decline, of more than 90% in long-diseased areas, caused by a novel disease. Time since local disease outbreak correlates with devil population declines and thus predation risk. We used hair traps and giving-up densities (GUDs) in food patches to test whether a major prey species of devils, the arboreal common brushtail possum (Trichosurus vulpecula), is responsive to the changing risk of predation when they forage on the ground. Possums spend more time on the ground, discover food patches faster and forage more to a lower GUD with increasing years since disease outbreak and greater devil population decline. Loss of top–down effects of devils with respect to predation risk was evident at 90% devil population decline, with possum behaviour indistinguishable from a devil-free island. Alternative predators may help to maintain risk-sensitive anti-predator behaviours in possums while devil populations remain low.

1995 ◽  
Vol 22 (1) ◽  
pp. 115 ◽  
Author(s):  
D. S. Hik

Like most heavily preyed-upon animals, snowshoe hares (Lepus americanus) have to balance conflicting demands of obtaining food at a high rate and avoiding predators. Adopting foraging behaviours to minimise predation risk may also lead to a decline in condition, and hence fecundity. Predictions of three hypotheses (condition constraint hypothesis, predator-avoidance constraint hypothesis, predation-sensitive foraging (PSF) hypothesis) were tested by comparing changes in the survival and condition of snowshoe hares on four experimental areas in winter during a cyclic peak and decline (1989–1993) near Kluane Lake, Yukon, Canada, where (i) predation risk was reduced by excluding terrestrial predators (FENCE), (ii) food supply was supplemented with rabbit chow ad libitum (FOOD), (iii) these two treatments were combined (FENCE+FOOD), and (iv) an unmanipulated CONTROL was used. Different patterns of survival and changes in body mass were observed in the presence and absence of terrestrial predators. On the CONTROL area, female body mass and fecundity declined, even though sufficient winter forage was apparently available in all years. A similar decrease in body mass was observed on the FOOD treatment, but only during the third year of the population decline. In contrast, female body mass remained high throughout the decline in the absence of terrestrial predators in the FENCE+FOOD and FENCE treatments. Winter survival declined on CONTROL and FENCE areas during the first year of the population decline (1991), but remained higher on FOOD until 1992 and FENCE+FOOD until 1993. These results generally supported the PSF hypothesis where terrestrial predators were present (CONTROL and FOOD grids). Where terrestrial predators were absent (FENCE and FENCE+FOOD), the results supported the alternative condition constraint hypothesis. The evidence suggests that a cascade of sublethal behavioural and physiological effects associated with increased predation risk contribute to the population decline and delayed recovery of cyclic low-phase populations of snowshoe hares.


2015 ◽  
Vol 282 (1802) ◽  
pp. 20142870 ◽  
Author(s):  
Christopher E. Gordon ◽  
Anna Feit ◽  
Jennifer Grüber ◽  
Mike Letnic

Predators can impact their prey via consumptive effects that occur through direct killing, and via non-consumptive effects that arise when the behaviour and phenotypes of prey shift in response to the risk of predation. Although predators' consumptive effects can have cascading population-level effects on species at lower trophic levels, there is less evidence that predators' non-consumptive effects propagate through ecosystems. Here we provide evidence that suppression of the abundance and activity of a mesopredator (the feral cat) by an apex predator (the dingo) has positive effects on both the abundance and foraging efficiency of a desert rodent. Then, by manipulating predators' access to food patches, we extend the idea that apex predators provide small prey with refuge from predation, showing that rodents increased their habitat breadth and use of 'risky' food patches where an apex predator was common but mesopredators were rare. Our study suggests that apex predators' suppressive effects on mesopredators extend to alleviating both mesopredators' consumptive and non-consumptive effects on prey.


2005 ◽  
Vol 27 (2) ◽  
pp. 231 ◽  
Author(s):  
LN Evans ◽  
MA Elgar ◽  
KA Handasyde

Detection and avoidance of predators are the principal strategies employed by prey to evade attack; by scanning their environment, prey individuals can reduce the likelihood of a predator approaching to within striking distance (Elgar 1989; Lima and Dill 1990). However, vigilance is often incompatible with foraging behaviours, and thus animals may be forced to trade off the risk of predation against acquiring food. Consequently, the quality of a particular resource patch and its associated predation risk may influence the foraging decisions of animals (Werner et al. 1983; Newman and Caraco 1987; Heithaus and Dill 2002). Cover is an important feature of a foraging site because it can provide a hiding place to escape potential predators (Lazarus and Symonds 1992). Thus, animals may prefer foraging sites that are close to cover, or adjust their level of vigilance at different distances from cover to compensate for changes in the chance of early detection and escape (Elgar 1989; Lima and Dill 1990; Lima et al. 1985; Kramer and Bonenfant 1997).


Author(s):  
Robert J. Nowicki ◽  
Jordan A. Thomson ◽  
James W. Fourqurean ◽  
Aaron J. Wirsing ◽  
Michael R. Heithaus

Crustaceana ◽  
2015 ◽  
Vol 88 (7-8) ◽  
pp. 839-856 ◽  
Author(s):  
J. Hesse ◽  
J. A. Stanley ◽  
A. G. Jeffs

Kelp habitats are in decline in many temperate coastal regions of the world due to climate change and the expansion of populations of grazing urchins. The loss of kelp habitat may influence the vulnerability of juveniles of commercially important species to predators. In this study, relative predation rates in kelp versus barren reef habitat were measured for early juvenile Australasian spiny lobster, Jasus edwardsii (Hutton, 1875), on the northeastern coast of New Zealand using tethering methods. Variation in assemblages of predators over small spatial scales appeared to be more important in determining the relative predation of lobsters, regardless of habitat type. Therefore, assessing the relative predation risk to early juvenile lobsters between kelp and barren habitats will require more extensive sampling at small spatial scales, as well as a specific focus on sampling during crepuscular and nocturnal periods, when these lobsters are most at risk of predation.


2010 ◽  
Vol 37 (4) ◽  
pp. 273 ◽  
Author(s):  
Karen Fey ◽  
Peter B. Banks ◽  
Hannu Ylönen ◽  
Erkki Korpimäki

Context. Potential mammalian prey commonly use the odours of their co-evolved predators to manage their risk of predation. But when the risk comes from an unknown source of predation, odours might not be perceived as dangerous and anti-predator responses may fail, except possibly when the alien predator is of the same archetype as a native predator. Aims. In the present study, we examined the anti-predator behavioural responses of voles from the outer archipelago of the Baltic Sea, south-western Finland, where they have had no resident mammalian predators in recent history. Methods. We investigated responses of field voles (Microtus agrestis) to odours of the native least weasel (Mustela nivalis) and a recently invading alien predator, the American mink (Mustela vison), in the laboratory. We also studied the short-term responses of free-ranging field voles and bank voles (Myodes glareolus) to simulated predation risk by alien mink on small islands in the outer archipelago of the Baltic Sea. Key results. In the laboratory, voles avoided odour cues of the native weasel but not of the alien mink. It is possible that the response to mink is a context-dependent learned response that could not be induced in the laboratory, whereas the response to weasel is innate. In the field, however, voles reduced activity during their normal peak-activity times at night in response to simulated alien-mink predation risk. No other shifts in space use, or towards activity in safer microhabitats or denser vegetation, were apparent. Conclusions. Voles appeared to recognise alien mink as predators from their odours in the wild. However, the reduction in activity is likely to be only a short-term immediate response to mink presence, one that is not augmented by longer-term strategies such as habitat shift. Because alien mink still strongly suppresses vole dynamics despite these anti-predator responses, we suggest that behavioural naiveté may not be the primary factor in the impact of an alien predator on native prey. Implications. Prey naiveté has long been considered the root cause of the devastating impacts of alien predators, whereby native prey simply fail to recognise and respond to the novel predation risk. Our results reveal a more complex form of naiveté, whereby native prey appeared to recognise alien predators as a threat but their response was ultimately inadequate. Recognition alone is therefore unlikely to afford native prey protection from alien-predator impacts, and management strategies that, for example, train prey to recognise novel threats must also induce effective responses if they are to succeed.


2018 ◽  
Vol 285 (1892) ◽  
pp. 20181582 ◽  
Author(s):  
Calum X. Cunningham ◽  
Christopher N. Johnson ◽  
Leon A. Barmuta ◽  
Tracey Hollings ◽  
Eric J. Woehler ◽  
...  

Top carnivores have suffered widespread global declines, with well-documented effects on mesopredators and herbivores. We know less about how carnivores affect ecosystems through scavenging. Tasmania's top carnivore, the Tasmanian devil (Sarcophilus harrisii), has suffered severe disease-induced population declines, providing a natural experiment on the role of scavenging in structuring communities. Using remote cameras and experimentally placed carcasses, we show that mesopredators consume more carrion in areas where devils have declined. Carcass consumption by the two native mesopredators was best predicted by competition for carrion, whereas consumption by the invasive mesopredator, the feral cat (Felis catus), was better predicted by the landscape-level abundance of devils, suggesting a relaxed landscape of fear where devils are suppressed. Reduced discovery of carcasses by devils was balanced by increased discovery by mesopredators. Nonetheless, carcasses persisted approximately 2.6-fold longer where devils have declined, highlighting their importance for rapid carrion removal. The major beneficiary of increased carrion availability was the forest raven (Corvus tasmanicus). Raven populations increased 2.2-fold from 1998 to 2017, the period of devil decline, but this increase occurred Tasmania-wide, making the cause unclear. This case study provides a little-studied potential mechanism for mesopredator release, with broad relevance to the vast areas of the world that have suffered carnivore declines.


2018 ◽  
Vol 11 (1) ◽  
pp. 100-103
Author(s):  
Aldo Alvarez-Risco ◽  
Jaime Delgado-Zegarra ◽  
Jaime A. Yáñez ◽  
Santiago Diaz-Risco ◽  
Shyla Del-Aguila-Arcentales

Abstract The growth of tourism to Peru and the gastronomic boom, with millions of people looking to taste Peruvian food, is creating a risk of predation of the natural resources needed to make these dishes. A focus solely on obtaining these ingredients can cause significant damage to Peruvian biodiversity, so stakeholders need to develop strategies to avoid predation driven by the gastronomic boom. Citizens and visitors need to play a role in protecting natural resources and contributing to environmental sustainability.


2009 ◽  
Vol 5 (5) ◽  
pp. 600-602 ◽  
Author(s):  
Bob B. M. Wong ◽  
Marja Järvenpää ◽  
Kai Lindström

Reproductive activities are often conspicuous and can increase the risk of predation. Evidence suggests that individuals are capable of responding to predators in a risk-sensitive manner. However, most studies tend to consider only the predator-mediated responses of males and females in isolation, and with little regard to differences in local environmental conditions. Here, we experimentally investigate the effects of environmental visibility (turbidity) and predation risk on reproductive decisions in the sand goby, Pomatoschistus minutus, when exposed to a visually oriented predator, the European perch, Perca fluviatilis. We found that gobies were more reluctant to spawn in the predator's presence, although larger males spawned sooner than smaller males. Interestingly, latency to spawning was unaffected by the visual environment, suggesting that gobies may be relying on non-visual cues under turbid conditions.


2001 ◽  
Vol 58 (1) ◽  
pp. 108-121 ◽  
Author(s):  
Jeffrey A Hutchings

Quantitative criteria used to assign species to categories of extinction risk may seriously overestimate these risks for marine fishes. The contemporary perception is that marine fishes may be less vulnerable to extinction than other taxa because of great natural variability in abundance, high fecundity, rapid population growth, and an intrinsically high capability of recovering from low population size. Contrary to this perception, however, there appears to be little theoretical or empirical support for the hypotheses that marine fish are more likely than nonmarine fishes to experience large reductions in population size, to produce unusually high levels of recruitment, to have higher reproductive rates, or to recover more rapidly from prolonged population declines. Although existing population-decline criteria may not accurately reflect probabilities of biological extinction, they do appear to reflect the converse: population recovery. Insufficient support for contemporary perceptions of their susceptibility to extinction, coupled with caveats associated with the assignment of extinction risk, suggests that significant increases in the population-decline thresholds used to assign marine fishes to at-risk categories would be inconsistent with a precautionary approach to fisheries management and the conservation of marine biodiversity.

