Tracking oxy-thermal habitat compression encountered by Chesapeake Bay striped bass through acoustic telemetry

Author(s):  
Hikaru Itakura ◽  
Michael H P O’Brien ◽  
David Secor

Abstract. In many coastal ecosystems, habitat compression is caused by seasonal combinations of hypoxia and supraoptimal temperatures. These conditions commonly induce avoidance behaviours in mobile species, resulting in the concentrated use of marginal habitats. Using 3 years of acoustic telemetry and high-resolution water quality data recorded throughout Chesapeake Bay, we measured the seasonal movements and exposure of striped bass (Morone saxatilis) to oxy-thermal habitat compression. Striped bass moved to tidal freshwaters in spring (March–May), mesohaline waters in summer (June–August) and fall (September–November), and mesohaline and polyhaline waters in winter (December–February): seasonal patterns consistent with known spawning, foraging, and overwintering migrations. Habitat-selection analyses suggest that during periods of prevalent sub-pycnocline hypoxia (June–September), striped bass selected surface waters, consistent with avoidance of hypoxic bottom waters. Detections indicated tolerance of a wide range of surface water temperatures, including those >25°C, which regional regulatory bodies stipulate are stressful for this species. Still, during summer and fall, striped bass selected the lowest available temperatures and avoided water temperatures >27°C, demonstrating that Chesapeake Bay striped bass can encounter habitat compression through the behavioural avoidance of both bottom hypoxia and high temperatures.
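
These habitat-selection results rest on comparing conditions at fish detections ("used") against conditions available in the monitored water column. Below is a minimal Python sketch of such a used-versus-available comparison via Manly-type selection ratios; the simulated data, variable names, and 2°C bins are illustrative assumptions, not the authors' actual workflow.

```python
# Hypothetical sketch of a used-vs-available habitat selection analysis.
# Selection ratio w_i = (proportion of detections in temperature bin i)
#                     / (proportion of available water in bin i);
# w_i > 1 suggests selection, w_i < 1 suggests avoidance.
import numpy as np

rng = np.random.default_rng(42)

# Stand-ins for real data: temperatures at fish detections ("used") and
# temperatures logged across the monitored water column ("available").
used_temps = rng.normal(24.0, 2.0, size=500)        # degC at detections
available_temps = rng.normal(26.0, 3.0, size=5000)  # degC available

bins = np.arange(10, 34, 2)  # 2 degC temperature bins (assumed)
used_counts, _ = np.histogram(used_temps, bins=bins)
avail_counts, _ = np.histogram(available_temps, bins=bins)

used_prop = used_counts / used_counts.sum()
avail_prop = avail_counts / avail_counts.sum()

with np.errstate(divide="ignore", invalid="ignore"):
    selection_ratio = np.where(avail_prop > 0, used_prop / avail_prop, np.nan)

for lo, w in zip(bins[:-1], selection_ratio):
    if np.isnan(w):
        continue  # bin with no available water sampled
    flag = "selected" if w > 1 else "avoided"
    print(f"{lo:>2}-{lo + 2} degC: w = {w:5.2f} ({flag})")
```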

2018 ◽  
Vol 22 (2) ◽  
pp. 1175-1192 ◽  
Author(s):  
Qian Zhang ◽  
Ciaran J. Harman ◽  
James W. Kirchner

Abstract. River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling – in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) – are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2), and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb–Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among all methods for a wide range of prescribed β values and gap distributions. The aliasing method, however, does not itself account for sampling irregularity, and this introduces some bias in the result. Nonetheless, the wavelet method is recommended for estimating β in irregular time series until improved methods are developed. Finally, all methods' performances depend strongly on the sampling irregularity, highlighting that the accuracy and precision of each method are data-specific. Accurately quantifying the strength of fractal scaling in irregular water-quality time series remains an unresolved challenge for the hydrologic community and for other disciplines that must grapple with irregular sampling.
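
Of the estimators discussed, the Lomb–Scargle approach is the most direct to sketch: compute the periodogram of the irregularly sampled series, then take β as the negative slope of a log–log fit of power against frequency. The Python example below, using scipy.signal.lombscargle on a synthetic Brown-noise record, is a minimal illustration of that idea, not the paper's exact estimation protocol; the frequency grid and fitting range are assumptions.

```python
# Minimal sketch: estimate spectral slope (beta) of an irregularly sampled
# series with the Lomb-Scargle periodogram, where power ~ f^(-beta).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)

# Synthetic record: Brownian-motion-like series (beta ~ 2) sampled at
# irregular times, with increments scaled by sqrt(time step).
n = 1000
t = np.sort(rng.uniform(0, 1000, n))      # irregular sample times (days)
dt = np.diff(t, prepend=t[0] - 1.0)       # positive time steps
y = np.cumsum(rng.normal(size=n) * np.sqrt(dt))
y -= y.mean()                             # remove mean before periodogram

# Evaluate the periodogram on a log-spaced frequency grid.
freqs = np.logspace(-3, -0.5, 200)        # cycles per day (assumed range)
ang_freqs = 2 * np.pi * freqs             # lombscargle expects rad/day
power = lombscargle(t, y, ang_freqs)

# beta is the negative slope of log(power) vs. log(frequency).
slope, intercept = np.polyfit(np.log10(freqs), np.log10(power), 1)
print(f"Estimated beta = {-slope:.2f}")   # expect a value near 2
```

As the abstract notes, this estimator is biased low on gappy records, which is exactly the behavior the paper quantifies.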


Water ◽  
2020 ◽  
Vol 12 (6) ◽  
pp. 1568
Author(s):  
Barbara A. Doll ◽  
J. Jack Kurki-Fox ◽  
Jonathan L. Page ◽  
Natalie G. Nelson ◽  
Jeffrey P. Johnson

Stream restoration for mitigation purposes has grown rapidly since the 1980s. As the science advances, some organizations (Chesapeake Bay Program, North Carolina Department of Environmental Quality) have approved or are considering providing nutrient credits for stream restoration projects. Nutrient treatment on floodplains during overbank events is one of the least understood processes that have been considered as part of the Chesapeake Bay Program’s Stream Restoration Nutrient Crediting program. This study analyzed ten years of streamflow and water quality data from five stations in the Piedmont of North Carolina to evaluate proposed procedures for estimating nitrogen removal on the floodplain during overbank flow events. The volume of floodplain flow, the volume of floodplain flow potentially treated, and the nitrogen load retained on the floodplain were calculated for each overbank event, and a sensitivity analysis was completed. On average, 9% to 15% of the total annual streamflow volume accessed the floodplain. The percentage of the average annual volume of streamflow potentially treated ranged from 1.0% to 5.1%. Annually, this equates to 0.2% to 1.0% of the total N load retained/removed on the floodplain following restoration. The relatively low nitrogen retention/removal rates were due to a majority of floodplain flow occurring during a few large events each year that exceeded the treatment capacity of the floodplain. On an annual basis, 90% of total floodplain flow occurred during half of all overbank events and 50% of total floodplain flow occurred during two to three events each year. Findings suggest that evaluating only overbank events may lead to undervaluing stream restoration because treatment is limited by hydrologic controls that restrict floodplain retention time. Treatment is further governed by floodplain and channel size.
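
The volumetric bookkeeping described above can be approximated from a daily gauge record by flagging days above a bankfull threshold and treating flow in excess of bankfull as floodplain flow. The Python sketch below illustrates this partitioning with a simulated record and an assumed bankfull discharge; it is not the study's crediting procedure, which further accounts for treatment capacity and retention time.

```python
# Simplified sketch: partition a daily streamflow record into in-channel and
# floodplain (overbank) volume, given an assumed bankfull discharge.
import numpy as np

SECONDS_PER_DAY = 86_400
bankfull_cms = 40.0  # assumed bankfull discharge, m^3/s (site-specific)

rng = np.random.default_rng(7)
# Stand-in for a gauged daily record (m^3/s): lognormal flows incl. floods.
daily_q = rng.lognormal(mean=2.5, sigma=0.8, size=365)

overbank = np.maximum(daily_q - bankfull_cms, 0.0)  # flow above bankfull
floodplain_vol = overbank.sum() * SECONDS_PER_DAY   # m^3/yr on floodplain
total_vol = daily_q.sum() * SECONDS_PER_DAY         # m^3/yr total

print(f"Overbank days: {(overbank > 0).sum()}")
print(f"Floodplain share of annual volume: {floodplain_vol / total_vol:.1%}")
```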


2020 ◽  
Vol 125 (7) ◽  
Author(s):  
Maria Herrmann ◽  
Raymond G. Najjar ◽  
Fei Da ◽  
Jaclyn R. Friedman ◽  
Marjorie A. M. Friedrichs ◽  
...  

2013 ◽  
Vol 10 (1) ◽  
pp. 699-728 ◽  
Author(s):  
P. J. Gerla

Abstract. Carbonate reactions and equilibria play a dominant role in the biogeochemical function of many wetlands. The US Geological Survey PHREEQC computer code was used to model geochemical reactions that may be typical for wetlands with water budgets characterized by (a) input dominated by direct precipitation, (b) interaction with groundwater, (c) variable degrees of reaction with organic carbon, and (d) different rates of evapotranspiration. Rainfall with a typical composition was progressively reacted with calcite and organic carbon at various rates and proportions using PHREEQC. Contrasting patterns in the results suggest that basic water quality data collected in the field can reveal differences in the geochemical processes operating in wetlands. Given a temporal record, such data can signal subtle changes in surrounding land cover and use. To demonstrate this, temperature, pH, and electrical conductivity (EC) were monitored for three years in five large wetlands comprising 48 sample sites in northwest Minnesota. EC and pH of the samples ranged widely, from 23 to 1300 μS cm⁻¹ and from 5.5 to 9, respectively. The largest range in pH was observed in small beach-ridge wetlands, where two clusters were apparent: (1) low EC and a wide range of pH and (2) higher pH and EC. Large marshes within a glacial lake–till plain landscape show a broad range of pH and EC, but the values depend on the specific wetland. Outlying data typically occurred in altered or disturbed areas. The inter-annual and intra-wetland consistency of the results suggests that each wetland system hosts characteristic geochemical conditions.
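
The calcite-dissolution chemistry that PHREEQC handles in full can be illustrated with a hand-rolled equilibrium calculation. The Python sketch below solves the charge balance for water equilibrated with calcite at a fixed CO2 partial pressure, using standard 25 °C constants and treating activities as concentrations; it is a didactic stand-in, not a substitute for PHREEQC.

```python
# Sketch: pH and dissolved Ca of water equilibrated with calcite at fixed
# pCO2, via charge balance (25 degC constants, activities ~ concentrations).
import math
from scipy.optimize import brentq

KH = 10**-1.47    # Henry's law: [H2CO3*] = KH * pCO2
K1 = 10**-6.35    # H2CO3* = H+ + HCO3-
K2 = 10**-10.33   # HCO3- = H+ + CO3--
KSP = 10**-8.48   # calcite: [Ca++][CO3--] = KSP
KW = 1e-14        # water: [H+][OH-] = KW

def charge_balance(h, pco2):
    """Net charge (eq/L) as a function of [H+] at calcite saturation."""
    h2co3 = KH * pco2
    hco3 = K1 * h2co3 / h
    co3 = K2 * hco3 / h
    ca = KSP / co3
    oh = KW / h
    return 2 * ca + h - hco3 - 2 * co3 - oh

for pco2 in (10**-3.5, 10**-2.0):  # atmospheric vs. soil-respiration CO2
    h = brentq(charge_balance, 1e-12, 1e-4, args=(pco2,))
    ca = KSP * h * h / (K2 * K1 * KH * pco2)
    print(f"pCO2 = {pco2:.1e} atm -> pH = {-math.log10(h):.2f}, "
          f"Ca = {ca * 1e3:.2f} mmol/L")
```

At atmospheric pCO2 this reproduces the textbook result (pH near 8.3, Ca near 0.5 mmol/L), and raising pCO2 drives the lower-pH, higher-EC water the field clusters suggest.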


2017 ◽  
Author(s):  
Jonathan S Lefcheck ◽  
David J Wilcox ◽  
Rebecca R Murphy ◽  
Scott R Marion ◽  
Robert J Orth

Interactions among global change stressors and their effects at large scales are often proposed but seldom evaluated, primarily for lack of comprehensive, sufficiently long-term, and spatially extensive datasets. Seagrasses, which provide nursery habitat, improve water quality, and constitute a globally important carbon sink, are among the most vulnerable habitats on the planet. Here, we unite 31 years of high-resolution aerial monitoring and water quality data to elucidate the patterns and drivers of eelgrass (Zostera marina) abundance in Chesapeake Bay, USA, one of the largest and most valuable estuaries in the world, with an unparalleled history of regulatory efforts. We show that eelgrass area has declined 29% in total since 1991, with wide-ranging and severe ecological and economic consequences. We identify an interaction between decreasing water clarity and warming temperatures as the primary driver of this trend. Declining clarity has gradually reduced eelgrass cover over the past two decades, primarily in deeper beds where light is already limiting. In shallow beds, however, reduced visibility exacerbates the physiological stress of acute warming, leading to recent instances of decline approaching 80%. While degraded water quality has long been known to affect underwater grasses worldwide, we demonstrate a clear and rapidly emerging interaction with climate change. We highlight the urgent need to integrate a broader perspective into local water quality management, in the Chesapeake Bay and in the many other coastal systems facing similar stressors.
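
The clarity-by-temperature interaction reported here is the kind of effect that can be probed with an interaction term in a regression. The Python sketch below fits such a model to simulated data with statsmodels; the variable names and effect sizes are invented, and this shows only the general form of the test, not the authors' actual analysis.

```python
# Sketch of testing a water-clarity x temperature interaction on eelgrass
# abundance with an ordinary least squares model (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "clarity": rng.normal(0, 1, n),  # standardized water clarity (assumed)
    "temp": rng.normal(0, 1, n),     # standardized summer temperature
})
# Simulate a true interaction: warming hurts most where clarity is low
# (at clarity = -1 the temp effect is -0.7; at clarity = +1 it is +0.1).
df["eelgrass"] = (0.5 * df.clarity - 0.3 * df.temp
                  + 0.4 * df.clarity * df.temp
                  + rng.normal(0, 0.5, n))

model = smf.ols("eelgrass ~ clarity * temp", data=df).fit()
print(model.summary().tables[1])  # coefficient table incl. clarity:temp term
```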


Author(s):  
Mondher Chehata ◽  
David Jasinski ◽  
Michael C. Monteith ◽  
William B. Samuels


1995 ◽  
Vol 31 (8) ◽  
pp. 133-139 ◽  
Author(s):  
L. R. Shuyler ◽  
L. C. Linker ◽  
C. P. Walters

The Chesapeake Bay Program (CBP) is based on good science and high-quality data, which has allowed the program to set and implement meaningful goals. The research phase resulted in the 1983 Chesapeake Bay Agreement, which called for the jurisdictions to focus existing pollution control programs on reducing nutrient loads to the Bay. A second Bay Agreement, developed and signed by the jurisdictions in 1987, contained 27 specific goals, including a Basinwide Nutrient Reduction Strategy to reduce “1985 controllable” nutrient loads to the Bay by 40 percent by the year 2000. To assure high-quality monitoring data, CBP established strong quality assurance and quality control procedures that are used for all monitoring. To assist with the monitoring, a computer program, the “Chesapeake Bay Automated Monitoring System”, was developed to evaluate the quality of field and laboratory data and to allow the data to be loaded directly into the CBP computers. The Chesapeake Bay Program Office developed models for the drainage basin and for the waters of the Bay. The watershed model simulates the pollutant loads from eight land uses, the majority of the point sources, and atmospheric deposition; it routes these loads through the river systems and delivers them to the Bay for use in the Bay model. The Bay model takes these loads, adds atmospheric deposition, loads from the ocean interface, and loads from bottom sediments, and simulates water quality at all points in the Bay. The models were used to confirm that the 40 percent nutrient reduction goal was appropriate, resulting in an amendment to the 1987 Agreement that calls for a commitment by the jurisdictions to develop tributary-specific strategies to reach the goal. The tributary strategies lay out the goals and the direction that must be taken to achieve the nutrient reductions, and they are being evaluated by the program office using the models. The results of this evaluation and model simulations of implementation progress in the basin are discussed, along with the economic implications of reaching these goals.
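
The watershed-model structure sketched above (land-use loads routed downriver to drive a Bay model) can be caricatured with a simple export-coefficient calculation. In the Python sketch below, every land-use area, export coefficient, and the delivery factor is invented for illustration; the actual CBP watershed model is a process-based simulation, not this bookkeeping.

```python
# Toy export-coefficient model: nutrient load = sum(area_i * coefficient_i),
# attenuated by an in-stream delivery factor before reaching the Bay.
# All numbers below are invented for illustration.

export_coeff_kg_per_ha = {   # total N export by land use (assumed values)
    "forest": 2.0,
    "cropland": 15.0,
    "pasture": 8.0,
    "urban": 10.0,
}
area_ha = {                  # hypothetical subwatershed areas
    "forest": 50_000,
    "cropland": 20_000,
    "pasture": 10_000,
    "urban": 5_000,
}
point_source_kg = 120_000    # assumed point-source N load
delivery_factor = 0.7        # fraction surviving riverine transport (assumed)

edge_of_stream = sum(area_ha[lu] * export_coeff_kg_per_ha[lu]
                     for lu in area_ha) + point_source_kg
delivered_to_bay = edge_of_stream * delivery_factor
print(f"Edge-of-stream N load: {edge_of_stream:,.0f} kg/yr")
print(f"Delivered to Bay:      {delivered_to_bay:,.0f} kg/yr")
```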

