Behavioral responses across a mosaic of ecosystem states restructure a sea otter–urchin trophic cascade

2021
Vol 118 (11)
pp. e2012493118
Author(s):
Joshua G. Smith
Joseph Tomoleoni
Michelle Staedler
Sophia Lyon
Jessica Fujii
...  

Consumer and predator foraging behavior can impart profound trait-mediated constraints on community regulation that scale up to influence the structure and stability of ecosystems. Here, we demonstrate how the behavioral response of an apex predator to changes in prey behavior and condition can dramatically alter the role and relative contribution of top-down forcing, depending on the spatial organization of ecosystem states. In 2014, a rapid and dramatic decline in the abundance of a mesopredator (Pycnopodia helianthoides) and primary producer (Macrocystis pyrifera) coincided with a fundamental change in purple sea urchin (Strongylocentrotus purpuratus) foraging behavior and condition, resulting in a spatial mosaic of kelp forests interspersed with patches of sea urchin barrens. We show that this mosaic of adjacent alternative ecosystem states led to an increase in the number of sea otters (Enhydra lutris nereis) specializing on urchin prey, a population-level increase in urchin consumption, and an increase in sea otter survivorship. We further show that the spatial distribution of sea otter foraging efforts for urchin prey was not directly linked to high prey density but rather was predicted by the distribution of energetically profitable prey. Therefore, we infer that spatially explicit sea otter foraging enhances the resistance of remnant forests to overgrazing but does not directly contribute to the resilience (recovery) of forests. These results highlight the role of consumer and predator trait-mediated responses to resource mosaics that are common throughout natural ecosystems and enhance understanding of reciprocal feedbacks between top-down and bottom-up forcing on the regional stability of ecosystems.

2022
Vol 17 (1)
Author(s):
Eileen Goldberg
Kathleen Conte
Victoria Loblay
Sisse Groen
Lina Persson
...  

Abstract
Background: Population-level health promotion is often conceived as a tension between “top-down” and “bottom-up” strategy and action. We report behind-the-scenes insights from Australia’s largest ever investment in the “top-down” approach, the $45m state-wide scale-up of two childhood obesity programmes. We used Normalisation Process Theory (NPT) as a template to interpret the organisational embedding of the purpose-built software designed to facilitate the initiative. The use of the technology was mandatory for evaluation, i.e. for reporting the proportion of schools and childcare centres which complied with recommended health practices (the implementation targets). Additionally, the software was recommended as a device to guide the implementation process. We set out to study its use in practice.
Methods: Short-term, high-intensity ethnography with all 14 programme delivery teams across New South Wales was conducted, cross-sectionally, 4 years after scale-up began. The four key mechanisms of NPT (coherence/sensemaking, cognitive participation/engagement, collective action and reflexive monitoring) were used to describe the ways the technology had normalised (embedded).
Results: Some teams and practitioners embraced how the software offered a way of working systematically with sites to encourage uptake of recommended practices, while others rejected it as a form of “mechanisation”. Conscious choices had to be made at an individual and team level about the practice style offered by the technology, thus prompting personal sensemaking, re-organisation of work, awareness of choices by others and reflexivity about professional values. Local organisational arrangements allowed technology users to enter data and assist the work of non-users: collective action that legitimised opposite behaviours. Thus, the technology and the programme delivery style it represented were normalised by pathways of adoption and non-adoption. Normalised use and non-use were accepted and different choices made by local programme managers were respected. State-wide, implementation targets are being reported as met.
Conclusion: We observed a form of self-organisation where individual practitioners and teams are finding their own place in a new system, consistent with complexity-based understandings of fostering scale-up in health care. Self-organisation could be facilitated with further cross-team interaction to continuously renew and revise sensemaking processes and support diverse adoption choices across different contexts.


2013
Vol 59 (4)
pp. 485-505
Author(s):  
Jon E. Brommer

Abstract Individual-based studies allow quantification of phenotypic plasticity in behavioural, life-history and other labile traits. The study of phenotypic plasticity in the wild can shed new light on two ultimate objectives: (1) whether plasticity itself can evolve or is constrained by its genetic architecture, and (2) whether plasticity is associated with other traits, including fitness (selection). I describe the main statistical approach for how repeated records of individuals and a description of the environment (E) allow quantification of variation in plasticity across individuals (IxE) and genotypes (GxE) in wild populations. Based on a literature review of life-history and behavioural studies on plasticity in the wild, I discuss the present state of the two objectives listed above. Few studies have quantified GxE of labile traits in wild populations, and it is likely that power to detect statistically significant GxE is lacking. Apart from the issue of whether it is heritable, plasticity tends to correlate with average trait expression (a pattern not fully supported by the few genetic estimates available) and may thus be evolutionarily constrained in this way. Individual-specific estimates of plasticity tend to be related to other traits of the individual (including fitness), but these analyses may be anti-conservative because they predominantly concern statistics-on-statistics. Despite the increased interest in plasticity in wild populations, the putative lack of power to detect GxE in such populations hinders achieving general insights. I discuss possible steps to invigorate the field by moving away from simply testing for the presence of GxE to analyses that ‘scale up’ to population-level processes, and by the development of new behavioural theory to identify quantitative genetic parameters that can be estimated.
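To make the random-regression approach described in this abstract concrete, the following is a minimal sketch on simulated repeated records, not code from the paper: random intercepts capture among-individual variation in average trait expression, and random environmental slopes capture IxE. The variable names, simulated values and the use of Python/statsmodels are illustrative assumptions; estimating GxE would additionally require pedigree or relatedness information (an "animal model"), which is not shown.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_ind, n_obs = 50, 8                       # individuals, repeated records per individual

ind = np.repeat(np.arange(n_ind), n_obs)
env = rng.normal(size=n_ind * n_obs)       # environmental covariate E for each record
intercepts = rng.normal(0.0, 1.0, n_ind)   # among-individual variation in elevation
slopes = rng.normal(0.5, 0.3, n_ind)       # among-individual variation in plasticity (IxE)
trait = intercepts[ind] + slopes[ind] * env + rng.normal(0.0, 0.5, ind.size)

data = pd.DataFrame({"id": ind, "env": env, "trait": trait})

# Random intercepts and random env slopes per individual; the slope variance
# component is the IxE term of interest.
model = smf.mixedlm("trait ~ env", data, groups=data["id"], re_formula="~env")
fit = model.fit()
print(fit.summary())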


2020
Author(s):
Cheryl Case Johnson
Melissa Neuman
Peter MacPherson
Augustine Choko
Caitlin Quinn
...  

Abstract
Background: Many southern African countries are nearing the global goal to diagnose 90% of people with HIV by 2020. In 2016, 84% and 86% of people with HIV knew their status in Malawi and Zimbabwe, respectively. Despite this progress, gaps remain, particularly among men (≥25 years). We investigated awareness, use and willingness to use HIV self-testing (HIVST) prior to large-scale implementation and explored sociodemographic associations.
Methods: We pooled responses from two of the first cross-sectional Demographic and Health Surveys to include HIVST questions: Malawi and Zimbabwe in 2015-16. Sociodemographic factors and sexual risk behaviours associated with previously testing for HIV, and with awareness, past use and future willingness to self-test, were investigated using univariable and multivariable logistic regression, adjusting for the sample design and limiting analysis to participants with a completed questionnaire and a valid HIV result. Analysis of willingness to self-test was restricted to Zimbabwean men, as Malawians and women were not asked this question.
Results: Of 31 385 individuals, the proportion never tested was higher for men (31.2%) than women (16.5%), p<0.001. For men, having ever tested increased with age. Past use and awareness of HIVST were very low (1.2% and 12.6%, respectively). Awareness was lower among women than men (9.1% vs 15.3%; adjusted odds ratio (aOR) = 1.55; 95% confidence interval [CI]: 1.37-1.75) and was lower at younger ages and at lower education and literacy levels. Willingness to self-test among Zimbabwean men was high (84.5%), with having previously tested for HIV, high sexual risk and being ≥25 years associated with greater willingness. Wealthier men had greater awareness of HIVST than poorer men (p<0.001). Men at higher HIV-related sexual risk, compared to men at lower HIV-related sexual risk, had the greatest willingness to self-test (aOR = 3.74; 95% CI: 1.39-10.03, p<0.009).
Conclusions: In 2015-16, many Malawian and Zimbabwean men had never tested for HIV. Despite low awareness and minimal HIVST experience at that time, willingness to self-test was high among Zimbabwean men, especially among older men with moderate to high HIV-related sexual risk. These data provide a valuable baseline against which to investigate population-level uptake of HIVST as programmes scale up. Programmes introducing, or planning to introduce, HIVST should consider including HIVST questions in population-based surveys.
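As a rough illustration of the kind of multivariable logistic regression described in the Methods above, the sketch below models willingness to self-test against a few covariates on synthetic data. The column names, effect sizes and the omission of survey weights and the complex DHS sample design are simplifying assumptions, not the authors' analysis.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the pooled survey extract (one row per male respondent)
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "ever_tested": rng.integers(0, 2, n),
    "age_25plus": rng.integers(0, 2, n),
    "high_risk": rng.integers(0, 2, n),
})
logit_p = -1.0 + 0.8 * df["ever_tested"] + 0.5 * df["age_25plus"] + 1.1 * df["high_risk"]
df["willing"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Multivariable logistic regression; exponentiated coefficients are adjusted odds ratios
fit = smf.logit("willing ~ ever_tested + age_25plus + high_risk", data=df).fit()
print(np.exp(fit.params).round(2))       # aORs
print(np.exp(fit.conf_int()).round(2))   # 95% CIs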


2018
Vol 115 (29)
pp. 7545-7550
Author(s):
Erin E. Gorsich
Rampal S. Etienne
Jan Medlock
Brianna R. Beechler
Johannie M. Spaan
...  

Coinfecting parasites and pathogens remain a leading challenge for global public health due to their consequences for individual-level infection risk and disease progression. However, a clear understanding of the population-level consequences of coinfection is lacking. Here, we constructed a model that includes three individual-level effects of coinfection: mortality, fecundity, and transmission. We used the model to investigate how these individual-level consequences of coinfection scale up to produce population-level infection patterns. To parameterize this model, we conducted a 4-y cohort study in African buffalo to estimate the individual-level effects of coinfection with two bacterial pathogens, bovine tuberculosis (bTB) and brucellosis, across a range of demographic and environmental contexts. At the individual level, our empirical results identified bTB as a risk factor for acquiring brucellosis, but we found no association between brucellosis and the risk of acquiring bTB. Both infections were associated with reductions in survival and neither infection was associated with reductions in fecundity. The model reproduced coinfection patterns in the data and predicted opposite impacts of coinfection at individual and population scales: Whereas bTB facilitated brucellosis infection at the individual level, our model predicted the presence of brucellosis to have a strong negative impact on bTB at the population level. In modeled populations where brucellosis was present, the endemic prevalence and basic reproduction number (R0) of bTB were lower than in populations without brucellosis. Therefore, these results provide a data-driven example of competition between coinfecting pathogens that occurs when one pathogen facilitates secondary infections at the individual level.
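The toy two-pathogen susceptible-infected model below (not the authors' parameterization) illustrates how the individual-level effects described above, facilitation of brucellosis acquisition by bTB and elevated mortality in coinfected hosts, can be scaled up to compare population-level bTB prevalence with and without brucellosis circulating. All rates are illustrative assumptions.

from scipy.integrate import solve_ivp

# Illustrative rates (per year); these are NOT the fitted buffalo parameters
b, mu = 0.25, 0.20           # per-capita birth rate and background mortality
beta_T, beta_B = 1.2, 1.0    # transmission rates for bTB (T) and brucellosis (B)
f = 2.0                      # facilitation: bTB-infected hosts acquire brucellosis f times faster
d_T, d_B, d_C = 0.10, 0.10, 0.30   # added mortality: bTB only, brucellosis only, coinfected

def rhs(t, y):
    S, T, B, C = y                      # susceptible, bTB only, brucellosis only, coinfected
    N = S + T + B + C
    lam_T = beta_T * (T + C) / N        # force of infection, bTB
    lam_B = beta_B * (B + C) / N        # force of infection, brucellosis
    dS = b * N - (lam_T + lam_B + mu) * S
    dT = lam_T * S - (f * lam_B + mu + d_T) * T
    dB = lam_B * S - (lam_T + mu + d_B) * B
    dC = f * lam_B * T + lam_T * B - (mu + d_C) * C
    return [dS, dT, dB, dC]

def btb_prevalence(y0, years=300.0):
    sol = solve_ivp(rhs, (0.0, years), y0)
    S, T, B, C = sol.y[:, -1]
    return (T + C) / (S + T + B + C)

# Compare long-run bTB prevalence when brucellosis is or is not present initially
print("bTB prevalence with brucellosis circulating:   ",
      round(btb_prevalence([0.98, 0.01, 0.01, 0.0]), 3))
print("bTB prevalence without brucellosis circulating:",
      round(btb_prevalence([0.99, 0.01, 0.0, 0.0]), 3))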


Author(s):  
Aliza Werner-Seidler
Jennifer L. Hudson
Helen Christensen

This chapter describes the nature of primary prevention of anxiety and reports on evidence for its effectiveness. The chapter first defines prevention before reporting results of a systematic review of randomized controlled trials designed to prevent anxiety. A review of existing trials and associated effect sizes suggests that prevention programmes can be effective in preventing anxiety disorder incidence and symptoms in multiple settings (schools, workplaces, community) across the lifespan. The median effect size at post-test across all studies was 0.21, and 0.25 specifically for cognitive behavioural prevention programmes. Key elements common to prevention programmes are then discussed, including a consideration of programme content and personnel delivering the intervention. Key implementation barriers are raised, together with suggestions for how these might be overcome in order to scale up and offer prevention at a population level. The chapter concludes with a consideration of the impact these programmes could have on anxiety disorder incidence.


Science
2020
Vol 368 (6496)
pp. 1243-1247
Author(s):
Edward J. Gregr
Villy Christensen
Linda Nichol
Rebecca G. Martone
Russell W. Markel
...  

Predator recovery often leads to ecosystem change that can trigger conflicts with more recently established human activities. In the eastern North Pacific, recovering sea otters are transforming coastal systems by reducing populations of benthic invertebrates and releasing kelp forests from grazing pressure. These changes threaten established shellfish fisheries and modify a variety of other ecosystem services. The diverse social and economic consequences of this trophic cascade are unknown, particularly across large regions. We developed and applied a trophic model to predict these impacts on four ecosystem services. Results suggest that sea otter presence yields 37% more total ecosystem biomass annually, increasing the value of finfish [+9.4 million Canadian dollars (CA$)], carbon sequestration (+2.2 million CA$), and ecotourism (+42.0 million CA$). To the extent that these benefits are realized, they will exceed the annual loss to invertebrate fisheries (−7.3 million CA$). Recovery of keystone predators thus not only restores ecosystems but can also affect a range of social, economic, and ecological benefits for associated communities.
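For readers tallying the quoted service values, the back-of-envelope sum below uses only the figures given in the abstract; the gross and net totals are simple arithmetic, not numbers reported by the study.

# Annual ecosystem-service values quoted above, in millions of CA$
gains = {"finfish": 9.4, "carbon sequestration": 2.2, "ecotourism": 42.0}
invertebrate_fishery_loss = 7.3

gross_gain = sum(gains.values())
net_gain = gross_gain - invertebrate_fishery_loss
print(f"gross gains: {gross_gain:.1f} M CA$/yr, net after fishery loss: {net_gain:.1f} M CA$/yr")
# gross gains: 53.6 M CA$/yr, net after fishery loss: 46.3 M CA$/yr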


2020
Vol 7
Author(s):
Melissa A. Miller
Megan E. Moriarty
Laird Henkel
Martin Tim Tinker
Tristan L. Burgess
...  

We compiled findings from 15 years (1998–2012) of southern sea otter (Enhydra lutris nereis) necropsies, incorporating data from 560 animals. Sensitive diagnostic tests were used to detect biotoxins, bacteria, parasites and fungi. Methods to classify primary and contributing causes of death (COD) and sequelae utilized an updated understanding of health risks affecting this population. Several interesting patterns emerged, including identification of coastal regions of high sea otter mortality risk due to shark bite, cardiomyopathy, toxoplasmosis, sarcocystosis, acanthocephalan peritonitis and coccidioidomycosis. We identified demographic attributes that enhanced the risk of disease in relation to age, sex, and reproductive stage. Death due to white shark (Carcharodon carcharias) bite increased dramatically during the study period and was the most common primary COD. However, when primary and contributing COD were combined, the most prevalent COD was infectious disease (affecting 63% of otters), especially fatal infections by acanthocephalans (Profilicollis spp.) and protozoa (e.g., Sarcocystis neurona and Toxoplasma gondii). Fatal bacterial infections were also extremely common as a primary process or a sequela, affecting 68% of examined otters. Substantial advances were made in identifying sea otters that died following exposure to the pervasive marine neurotoxin domoic acid (DA), and DA intoxication was conservatively estimated as a primary or contributing COD for 20% of otters. Cardiomyopathy was also highly prevalent as a primary or contributing COD (41%) and exhibited significant associations with DA intoxication and protozoal infection. Among adult and aged adult females that were in late pup care through post-weaning at the time of death, 83% had end lactation syndrome (ELS) as a primary or contributing COD. This comprehensive longitudinal dataset is unique in its depth and scope. The large sample size and extensive time period provided an opportunity to investigate mortality patterns in a changing environment and identify spatial and temporal disease “hot spots” and emerging threats. Our findings will help improve estimates of population-level impacts of specific threats and optimize conservation and environmental mitigation efforts for this threatened species.


2002
Vol 29 (4)
pp. 436-459
Author(s):
Robert S. Steneck
Michael H. Graham
Bruce J. Bourque
Debbie Corbett
Jon M. Erlandson
...  

Kelp forests are phyletically diverse, structurally complex and highly productive components of coldwater rocky marine coastlines. This paper reviews the conditions in which kelp forests develop globally and where, why and at what rate they become deforested. The ecology and long archaeological history of kelp forests are examined through case studies from southern California, the Aleutian Islands and the western North Atlantic, well-studied locations that represent the widest possible range in kelp forest biodiversity. Global distribution of kelp forests is physiologically constrained by light at high latitudes and by nutrients, warm temperatures and other macrophytes at low latitudes. Within mid-latitude belts (roughly 40–60° latitude in both hemispheres) well-developed kelp forests are most threatened by herbivory, usually from sea urchins. Overfishing and extirpation of highly valued vertebrate apex predators often triggered herbivore population increases, leading to widespread kelp deforestation. Such deforestations have the most profound and lasting impacts on species-depauperate systems, such as those in Alaska and the western North Atlantic. Globally urchin-induced deforestation has been increasing over the past 2–3 decades. Continued fishing down of coastal food webs has resulted in shifting harvesting targets from apex predators to their invertebrate prey, including kelp-grazing herbivores. The recent global expansion of sea urchin harvesting has led to the widespread extirpation of this herbivore, and kelp forests have returned in some locations but, for the first time, these forests are devoid of vertebrate apex predators. In the western North Atlantic, large predatory crabs have recently filled this void and they have become the new apex predator in this system. Similar shifts from fish- to crab-dominance may have occurred in coastal zones of the United Kingdom and Japan, where large predatory finfish were extirpated long ago. Three North American case studies of kelp forests were examined to determine their long history with humans and project the status of future kelp forests to the year 2025. Fishing impacts on kelp forest systems have been both profound and much longer in duration than previously thought. Archaeological data suggest that coastal peoples exploited kelp forest organisms for thousands of years, occasionally resulting in localized losses of apex predators, outbreaks of sea urchin populations and probably small-scale deforestation. Over the past two centuries, commercial exploitation for export led to the extirpation of sea urchin predators, such as the sea otter in the North Pacific and predatory fishes like the cod in the North Atlantic. The large-scale removal of predators for export markets increased sea urchin abundances and promoted the decline of kelp forests over vast areas. Despite southern California having one of the longest known associations with coastal kelp forests, widespread deforestation is rare. It is possible that functional redundancies among predators and herbivores make this most diverse system most stable. Such biodiverse kelp forests may also resist invasion from non-native species. In the species-depauperate western North Atlantic, introduced algal competitors carpet the benthos and threaten future kelp dominance. There, other non-native herbivores and predators have become established and dominant components of this system. 
Climate changes have had measurable impacts on kelp forest ecosystems, and efforts to control the emission of greenhouse gases should be a global priority. However, overfishing appears to be the greatest manageable threat to kelp forest ecosystems over the 2025 time horizon. Management should focus on minimizing fishing impacts and restoring populations of functionally important species in these systems.

