Nearshore Benthic Mapping in the Great Lakes: A Multi-Agency Data Integration Approach in Southwest Lake Michigan

2021 ◽  
Vol 13 (15) ◽  
pp. 3026
Author(s):  
Molly K. Reif ◽  
Brandon S. Krumwiede ◽  
Steven E. Brown ◽  
Ethan J. Theuerkauf ◽  
Joseph H. Harwood

The Laurentian Great Lakes comprise the largest assemblage of inland waterbodies in North America, with vast, geographically and environmentally complex nearshore benthic substrate and associated habitat. The Great Lakes Water Quality Agreement, originally signed in 1972, aims to help restore and protect the basin, and ecosystem monitoring is a primary objective to support adaptive management, environmental policy, and decision making. Yet monitoring ecosystem trends remains challenging, potentially hindering progress in lake management and restoration. Consistent, high-resolution maps of nearshore substrate and associated habitat are fundamental to support management needs, and the nexus of high-quality remotely sensed data and improved analytical methods is increasing opportunities for large-scale nearshore benthic mapping at project-relevant spatial resolutions. This study attempts to advance the integration of high-fidelity data (airborne imagery and lidar, satellite imagery, in situ observations, etc.) and machine learning to identify and classify nearshore benthic substrate and associated habitat using a case study in southwest Lake Michigan along Illinois Beach State Park, Illinois, USA. Data inputs and analytical methods were evaluated to better understand their implications with respect to the Coastal and Marine Ecological Classification Standard (CMECS) classification hierarchy, resulting in an approach that could be readily applied to other shallow coastal environments. Substrate and biotic components were classified iteratively in two Tiers, in which classes of increasing specificity were identified using different combinations of airborne and satellite data inputs.
Classification accuracy assessments revealed that for the Tier 1 substrate component (3 classes), average overall accuracy was 90.10 ± 0.60% for 24 airborne data combinations and 89.77 ± 1.02% for 12 satellite data combinations, whereas the Tier 1 biotic component (2 classes) average overall accuracy was 93.58 ± 0.91% for 24 airborne data combinations and 92.67 ± 0.71% for 11 satellite data combinations. The Tier 2 result for the substrate component (2 classes) was 93.28% for 2 airborne data combinations and 95.25% for the biotic component (2 classes). The study builds on foundational efforts to move towards a more integrated data approach, whereby data strengths and limitations for mapping nearshore benthic substrate and associated habitat, expressed through classification accuracy, were evaluated within the context of the CMECS classification hierarchy, and has direct applicability to critical monitoring needs in the Great Lakes.
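For context, the overall accuracies quoted above are, in a standard accuracy assessment, the proportion of validation samples falling on the diagonal of a confusion matrix. A minimal sketch (the 3×3 matrix below is hypothetical, not taken from the study):

```python
def overall_accuracy(confusion):
    """Overall accuracy: correctly classified samples over all samples.

    confusion[i][j] counts reference-class-i samples mapped to class j."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# Hypothetical 3-class (Tier 1 substrate-style) confusion matrix:
cm = [[90, 5, 5],
      [4, 88, 8],
      [6, 7, 87]]
acc = overall_accuracy(cm)  # 265/300, about 0.883
```

Averaging this statistic over each set of data-input combinations, as the study does, then yields the reported mean ± spread per Tier and component.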

Climate ◽  
2021 ◽  
Vol 9 (3) ◽  
pp. 43
Author(s):  
Jake Wiley ◽  
Andrew Mercer

As the mesoscale dynamics of lake-effect snow (LES) become better understood, recent and ongoing research is beginning to focus on the large-scale environments conducive to LES. Synoptic-scale composites were constructed for Lake Michigan and Lake Superior LES events using an LES case repository for these regions within the U.S.; North American Regional Reanalysis (NARR) data for each LES event were used to construct synoptic maps of dominant LES patterns for each lake. These maps were formulated using a previously implemented composite technique that blends principal component analysis with k-means cluster analysis. A sample case from each resulting cluster was also selected and simulated with the Advanced Research version of the Weather Research and Forecasting (WRF) model to obtain an example mesoscale depiction of the LES environment. The study revealed four synoptic setups for Lake Michigan and three for Lake Superior, whose primary differences were discrepancies in a surface pressure dipole structure previously linked with Great Lakes LES. These subtle synoptic-scale differences suggest that, while overall LES impacts are driven more by mesoscale conditions for these lakes, synoptic-scale conditions still provide important insight into the character of LES forcing mechanisms, primarily the steering flow and air–lake thermodynamics.
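The composite technique mentioned above blends principal component analysis with k-means clustering. As a rough illustration of the clustering half only (the PCA reduction step is omitted), here is a minimal k-means run on already-reduced data such as PC scores; all values are hypothetical:

```python
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: alternate nearest-centroid assignment and
    centroid recomputation for a fixed number of iterations."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster (keep the
        # old centroid if a cluster ends up empty).
        centroids = [
            tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical 2-D PC scores forming two well-separated groups:
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centroids, clusters = kmeans(pts, k=2)
```

In the study's setting, each "point" would be an event's leading PC scores, and each resulting cluster a candidate synoptic setup.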


2019 ◽  
Vol 58 (11) ◽  
pp. 2421-2436 ◽  
Author(s):  
M. Talat Odman ◽  
Andrew T. White ◽  
Kevin Doty ◽  
Richard T. McNider ◽  
Arastoo Pour-Biazar ◽  
...  

Abstract. High levels of ozone have been observed along the shores of Lake Michigan for the last 40 years, and models continue to struggle to replicate ozone behavior in the region. In the retrospective way in which models are used in air quality regulation development, nudging or four-dimensional data assimilation (FDDA) of the large-scale environment is important for constraining model forecast errors. Here, paths for incorporating large-scale meteorological conditions while retaining model mesoscale structure are evaluated. For the July 2011 case studied here, iterative FDDA strategies did not improve mesoscale performance in the Great Lakes region in terms of diurnal trends or monthly averaged statistics, with overestimation of nighttime wind speed remaining an issue. Two vertical nudging strategies were evaluated for their effects on the development of nocturnal low-level jets (LLJs) and their impacts on air quality simulations. Nudging only above the planetary boundary layer, which has been a standard option in many air quality simulations, significantly dampened the amplitude of the LLJ relative to nudging only above a height of 2 km. While the LLJ was preserved with nudging only above 2 km, there was some deterioration in wind performance, compared with profiler networks, above the jet between 500 m and 2 km. In examining the impact of the nudging strategies on the air quality performance of the Community Multiscale Air Quality model, performance was found to improve when nudging above 2 km. This result may reflect the importance of the LLJ in transport, or perhaps a change in mixing in the models.
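The "nudge only above a cutoff height" strategy discussed above amounts to adding a Newtonian relaxation term to the model tendency at eligible levels, pulling the model state toward the analysis while leaving lower levels free. A minimal single-step sketch (the profile values, relaxation timescale, and cutoff below are hypothetical, not taken from the study):

```python
def nudge_profile(u, u_ref, z, z_cut=2000.0, tau=3600.0, dt=600.0):
    """One explicit Newtonian-relaxation (nudging) step: at levels above
    z_cut (m), relax the model wind u toward the analysis wind u_ref
    with timescale tau (s); levels at or below z_cut are left free."""
    out = []
    for ui, ri, zi in zip(u, u_ref, z):
        if zi > z_cut:
            ui = ui + dt * (ri - ui) / tau
        out.append(ui)
    return out

# Hypothetical profile: a 500 m jet is untouched, a 3 km level is nudged.
winds = nudge_profile(u=[5.0, 15.0, 10.0],
                      u_ref=[6.0, 8.0, 12.0],
                      z=[100.0, 500.0, 3000.0])
```

Because levels at or below the cutoff receive no relaxation term, a nocturnal jet near 500 m can develop freely, which is the behavior the above-2-km strategy preserves.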


2021 ◽  
Vol 13 (15) ◽  
pp. 3023
Author(s):  
Jinghua Xiong ◽  
Shenglian Guo ◽  
Jiabo Yin ◽  
Lei Gu ◽  
Feng Xiong

Flooding is one of the most widespread and frequent weather-related hazards and has devastating impacts on society and ecosystems. Monitoring flooding is a vital issue for water resources management, socioeconomic sustainable development, and maintaining life safety. By integrating multiple precipitation, evapotranspiration, and GRACE Follow-On (GRAFO) terrestrial water storage anomaly (TWSA) datasets, this study uses the water balance principle coupled with the CaMa-Flood hydrodynamic model to assess the spatiotemporal discharge variations in the Yangtze River basin during the catastrophic 2020 flood. The results show that: (1) TWSA bias dominates the overall uncertainty in runoff at the basin scale, which is spatially governed by uncertainty in TWSA and precipitation; (2) spatially, field significance at the 5% level is found for the correlations between GRAFO-based runoff and GLDAS results, and the GRAFO-derived discharge series is highly correlated with both in situ observations and hydrological simulations for the Yangtze River basin, at the 0.01 significance level; (3) the GRAFO-derived discharge captures the flood peaks in July and August and the recession process in October 2020. Our approach provides an alternative way of monitoring large-scale extreme hydrological events with the latest GRAFO release and the CaMa-Flood model.
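The water balance principle invoked above can be written R = P − ET − dS/dt, with the storage change estimated from successive TWSA solutions. A minimal monthly sketch (all series hypothetical, in mm/month; the study's actual processing chain feeds such runoff into CaMa-Flood for routing):

```python
def water_balance_runoff(precip, et, twsa):
    """Monthly basin runoff from the water balance R = P - ET - dS/dt,
    with the storage change dS/dt taken as a central difference of the
    TWSA series; the first and last months are dropped."""
    runoff = []
    for i in range(1, len(twsa) - 1):
        ds = (twsa[i + 1] - twsa[i - 1]) / 2.0
        runoff.append(precip[i] - et[i] - ds)
    return runoff

# Hypothetical monthly series (mm):
r = water_balance_runoff(precip=[100, 120, 200, 150],
                         et=[50, 60, 70, 60],
                         twsa=[0, 10, 40, 50])  # -> [40.0, 110.0]
```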


2021 ◽  
Vol 13 (2) ◽  
pp. 176
Author(s):  
Peng Zheng ◽  
Zebin Wu ◽  
Jin Sun ◽  
Yi Zhang ◽  
Yaoqin Zhu ◽  
...  

As the volume of remotely sensed data grows significantly, content-based image retrieval (CBIR) becomes increasingly important, especially for cloud computing platforms that facilitate processing and storing big data in a parallel and distributed way. This paper proposes a novel parallel CBIR system for hyperspectral image (HSI) repositories on cloud computing platforms, guided by unmixed spectral information, i.e., endmembers and their associated fractional abundances, to retrieve hyperspectral scenes. Existing unmixing methods, however, suffer an extremely high computational burden when extracting metadata from large-scale HSI data. To address this limitation, we implement a distributed, parallel unmixing method that runs on cloud computing platforms to accelerate the unmixing processing flow. In addition, we implement a global standard distributed HSI repository equipped with a large spectral library in a software-as-a-service mode, providing users with HSI storage, management, and retrieval services through web interfaces. Furthermore, the parallel implementation of unmixing processing is incorporated into the CBIR system to establish the parallel unmixing-based content retrieval system. The performance of the proposed parallel CBIR system was verified in terms of both unmixing efficiency and accuracy.
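Pixel-wise unmixing of this kind is embarrassingly parallel, which is what makes a distributed implementation natural: each pixel's abundances are computed independently of every other pixel. As a minimal illustration of the linear mixing model (not the paper's distributed algorithm), here is the closed-form two-endmember case, with all spectra hypothetical:

```python
def two_endmember_abundance(pixel, e1, e2):
    """Abundance a of endmember e1 in a pixel spectrum under the linear
    mixing model x = a*e1 + (1 - a)*e2, solved in closed form by least
    squares and clipped to the physical range [0, 1]."""
    d = [a - b for a, b in zip(e1, e2)]
    num = sum((x - b) * di for x, b, di in zip(pixel, e2, d))
    den = sum(di * di for di in d)
    return max(0.0, min(1.0, num / den))

# Hypothetical two-band endmembers and a mixed pixel:
a = two_endmember_abundance(pixel=[0.7, 0.3], e1=[1.0, 0.0], e2=[0.0, 1.0])
# a = 0.7: the pixel is 70% endmember e1.
```

Because pixels are independent, scaling out is a matter of mapping such a function over partitions of the image with a distributed map-style framework, and the resulting abundance vectors serve as the compact metadata the CBIR system retrieves against.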


Sensors ◽  
2021 ◽  
Vol 21 (12) ◽  
pp. 3982
Author(s):  
Giacomo Lazzeri ◽  
William Frodella ◽  
Guglielmo Rossi ◽  
Sandro Moretti

Wildfires have affected global forests and the Mediterranean area with increasing recurrence and intensity in recent years, with climate change resulting in reduced precipitation and higher temperatures. To assess the impact of wildfires on the environment, burned area mapping has become progressively more relevant. Initially carried out via field sketches, the advent of satellite remote sensing opened new possibilities, reducing the cost, uncertainty, and safety risks of the previous techniques. In the present study an experimental methodology was adopted to test the potential of advanced remote sensing techniques such as multispectral Sentinel-2, PRISMA hyperspectral satellite, and UAV (unmanned aerial vehicle) remotely sensed data for the multitemporal mapping of burned areas via soil–vegetation recovery analysis in two test sites in Portugal and Italy. In case study one, an innovative multiplatform data classification was performed by correlating Sentinel-2 RBR (relativized burn ratio) fire severity classes with the scene hyperspectral signature in a pixel-by-pixel comparison, leading to a converging classification. In the adopted methodology, RBR burned area analysis and vegetation recovery were tested for accordance with biophysical vegetation parameters (LAI, fCover, and fAPAR). In case study two, a UAV-sensed NDVI index was adopted for high-resolution mapping data collection. At the large scale, the Sentinel-2 RBR index proved efficient for burned area analysis, from the perspectives of both fire severity and vegetation recovery. Despite the time elapsed between the event and the acquisition, the PRISMA hyperspectral converging classification based on Sentinel-2 was able to detect and discriminate different spectral signatures corresponding to different fire severity classes.
At the slope scale, the UAV platform proved to be an effective tool for mapping and characterizing the burned area, giving a clear advantage with respect to field GPS mapping. Results highlighted that UAV platforms, if equipped with a hyperspectral sensor and used in a synergistic approach with PRISMA, would be a useful tool for classifying satellite-acquired scenes, allowing for the acquisition of ground truth.
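The RBR used above is derived from the Normalized Burn Ratio (NBR), computed from near-infrared (NIR) and shortwave-infrared (SWIR) reflectance, with the pre/post difference (dNBR) relativized by the pre-fire NBR. A minimal sketch with hypothetical reflectance values:

```python
def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectance."""
    return (nir - swir) / (nir + swir)

def rbr(pre_nbr, post_nbr):
    """Relativized Burn Ratio: the pre/post NBR difference (dNBR)
    scaled by pre-fire NBR (the 1.001 offset avoids division by zero)."""
    return (pre_nbr - post_nbr) / (pre_nbr + 1.001)

# Hypothetical Sentinel-2-style reflectances, pre- and post-fire:
severity = rbr(nbr(nir=0.5, swir=0.2), nbr(nir=0.25, swir=0.4))
```

Healthy vegetation has high NIR and low SWIR reflectance (high NBR); burning reverses this, so larger RBR values indicate higher fire severity, which is then binned into the severity classes used for the Sentinel-2/PRISMA comparison.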


Forests ◽  
2021 ◽  
Vol 12 (4) ◽  
pp. 430 ◽  
Author(s):  
Ronald S. Zalesny ◽  
Andrej Pilipović ◽  
Elizabeth R. Rogers ◽  
Joel G. Burken ◽  
Richard A. Hallett ◽  
...  

Poplar remediation systems are ideal for reducing runoff, cleaning groundwater, and delivering ecosystem services to the North American Great Lakes and globally. We used phyto-recurrent selection (PRS) to establish sixteen phytoremediation buffer systems (phyto buffers) (buffer groups: 2017 × 6; 2018 × 5; 2019 × 5) throughout the Lake Superior and Lake Michigan watersheds comprised of twelve PRS-selected clones each year. We tested for differences in genotypes, environments, and their interactions for health, height, diameter, and volume from ages one to four years. All trees had optimal health. Mean first-, second-, and third-year volume ranged from 71 ± 26 to 132 ± 39 cm3; 1440 ± 575 to 5765 ± 1132 cm3; and 8826 ± 2646 to 10,530 ± 2110 cm3, respectively. Fourth-year mean annual increment of 2017 buffer group trees ranged from 1.1 ± 0.7 to 7.8 ± 0.5 Mg ha−1 yr−1. We identified generalist varieties with superior establishment across a broad range of buffers (‘DM114’, ‘NC14106’, ‘99038022’, ‘99059016’) and specialist clones uniquely adapted to local soil and climate conditions (‘7300502’, ‘DN5’, ‘DN34’, ‘DN177’, ‘NM2’, ‘NM5’, ‘NM6’). Using generalists and specialists enhances the potential for phytoremediation best management practices that are geographically robust, being regionally designed yet globally relevant.


2019 ◽  
Vol 29 (Supplement_4) ◽  
Author(s):  
J C Rejon-Parrilla ◽  
M Salcher-Konrad ◽  
M Nguyen ◽  
K Davis ◽  
P Jonsson ◽  
...  

Abstract
Background: Increasingly, health technology assessment (HTA) agencies must decide whether new medicines should be used routinely in the absence of randomised controlled trial (RCT) data, relying solely on non-randomised studies (NRS), which are at high risk of bias due to confounding. Against the background of increased availability and improved methods to analyse non-randomised data (e.g., propensity score methods and instrumental variables), it is important for decision-makers to have guidance on the analysis and interpretation of NRS to inform health economic evaluation. We therefore aimed to systematically and empirically assess the performance of NRS using different analytical methods as compared to RCTs and to develop recommendations on the basis of our findings.
Methods: We conducted a large-scale meta-epidemiological review to obtain estimates of the discrepancy in treatment effects in matched RCTs and NRS of pharmacologic interventions from published meta-analyses indexed in MEDLINE and the Cochrane Database of Systematic Reviews. We also consulted with HTA bodies, regulators and academics from five European countries to learn from their experience with using non-randomised evidence.
Results: We compiled the largest dataset to date of clinical topics with matching RCTs and NRS using various analytical methods, covering >100 unique clinical questions. Incorporating information on direction of effect and effect size from >700 unique studies, the dataset can be used to evaluate discrepancies in treatment effects between study designs across a wide range of therapeutic areas.
Conclusions: An empirically based understanding of the risk of bias in NRS is required in order to promote the adequate use of non-randomised evidence as input for health economic decision-making.


2015 ◽  
Vol 112 (19) ◽  
pp. 6236-6241 ◽  
Author(s):  
Thomas M. Neeson ◽  
Michael C. Ferris ◽  
Matthew W. Diebel ◽  
Patrick J. Doran ◽  
Jesse R. O’Hanley ◽  
...  

In many large ecosystems, conservation projects are selected by a diverse set of actors operating independently at spatial scales ranging from local to international. Although small-scale decision making can leverage local expert knowledge, it also may be an inefficient means of achieving large-scale objectives if piecemeal efforts are poorly coordinated. Here, we assess the value of coordinating efforts in both space and time to maximize the restoration of aquatic ecosystem connectivity. Habitat fragmentation is a leading driver of declining biodiversity and ecosystem services in rivers worldwide, and we simultaneously evaluate optimal barrier removal strategies for 661 tributary rivers of the Laurentian Great Lakes, which are fragmented by at least 6,692 dams and 232,068 road crossings. We find that coordinating barrier removals across the entire basin is nine times more efficient at reconnecting fish to headwater breeding grounds than optimizing independently for each watershed. Similarly, a one-time pulse of restoration investment is up to 10 times more efficient than annual allocations totaling the same amount. Despite widespread emphasis on dams as key barriers in river networks, improving road culvert passability is also essential for efficiently restoring connectivity to the Great Lakes. Our results highlight the dramatic economic and ecological advantages of coordinating efforts in both space and time during restoration of large ecosystems.
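The study solves a formal optimization model over thousands of barriers; a toy greedy sketch (barrier names, habitat gains, and costs all hypothetical) nonetheless illustrates the core finding, namely why pooling a restoration budget across the basin can beat allocating it watershed by watershed:

```python
def greedy_removals(barriers, budget):
    """Greedy portfolio: repeatedly fund the barrier with the best
    habitat-gain-per-cost ratio that still fits the remaining budget.
    barriers: list of (name, habitat_gain, cost) tuples."""
    chosen, total_gain = [], 0.0
    for name, gain, cost in sorted(barriers, key=lambda b: b[1] / b[2],
                                   reverse=True):
        if cost <= budget:
            budget -= cost
            chosen.append(name)
            total_gain += gain
    return chosen, total_gain

# Hypothetical barriers: (name, habitat gain, removal cost).
barriers = [("culvertA", 10.0, 1.0), ("damB", 50.0, 10.0),
            ("culvertC", 8.0, 1.0), ("damD", 30.0, 10.0)]

# One pooled basin-wide budget of 12 ...
pooled_gain = greedy_removals(barriers, 12.0)[1]          # 68.0
# ... versus the same 12 split evenly between two watersheds:
split_gain = (greedy_removals(barriers[:2], 6.0)[1]
              + greedy_removals(barriers[2:], 6.0)[1])    # 18.0
```

Splitting the budget strands each watershed below the cost of its high-value dam, while the pooled budget funds the cheap culverts and the best dam; the paper's exact optimization makes the same point at basin scale.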


2005 ◽  
Vol 26 (23) ◽  
pp. 5325-5342 ◽  
Author(s):  
G. E. Host ◽  
J. Schuldt ◽  
J. J. H. Ciborowski ◽  
L. B. Johnson ◽  
T. Hollenhorst ◽  
...  

2016 ◽  
Author(s):  
Rogier Westerhoff ◽  
Paul White ◽  
Zara Rawlinson

Abstract. Large-scale models and satellite data are increasingly used to characterise groundwater and its recharge at the global scale. Although these models have the potential to fill data gaps and resolve trans-boundary issues, they are often neglected in smaller-scale studies, since their data are often coarse or uncertain. Large-scale models and satellite data could play a more important role in smaller-scale (i.e., national or regional) studies if they could be adjusted to fit that scale. In New Zealand, large-scale models and satellite data are not used for groundwater recharge estimation at the national scale, since regional councils (i.e., the water managers) have varying water policies and models are calibrated at the local scale. Also, some regions have many localised ground observations (but poor record coverage), whereas others are data-sparse. Estimation of recharge is therefore inconsistent at the national scale. This paper presents an approach that applies large-scale global models and satellite data to estimate rainfall recharge at the national to regional scale across New Zealand. We present a model, NGRM, largely inspired by the global-scale WaterGAP recharge model but improved and adjusted using national data. The NGRM model uses MODIS-derived evapotranspiration (ET) and vegetation satellite data, and the available nation-wide datasets on rainfall, elevation, soil and geology. A valuable addition to the recharge estimation is the model uncertainty estimate, based on the variance, covariance and sensitivity of all input data components in the model environment. This research shows that, with minor model adjustments and improved input data, large-scale models and satellite data can be used to derive rainfall recharge estimates, including their uncertainty, at the smaller, i.e., national and regional, scale of New Zealand. The New Zealand recharge estimated by the NGRM model compares well with most local and regional lysimeter data and recharge models.
The NGRM can therefore be assumed capable of filling gaps in data-sparse areas and of creating more consistency between datasets from different regions, i.e., of resolving trans-boundary issues. This research also shows that smaller-scale recharge studies in New Zealand should use boundaries larger than a single (sub-)aquifer, preferably the whole catchment. It also points out the need for improved collaboration at the international, national and regional levels to further merge large-scale (global) models to smaller (i.e., national or regional) scales. Future research should, collaboratively, focus on: improvement of rainfall-runoff and snowmelt methods; inclusion of river recharge; further improvement of input data (rainfall, evapotranspiration, soil and geology); and the impact of recharge uncertainty in mountainous and irrigated areas.
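The model-uncertainty estimate described above rests on first-order propagation of input variances through the model's sensitivities. For independent inputs this reduces to the familiar root-sum-of-squares; a minimal sketch (sensitivities and standard deviations hypothetical, covariance terms omitted for brevity):

```python
def propagated_sigma(sens, sigma):
    """First-order uncertainty propagation for R = f(x1..xn) with
    independent inputs: var(R) ~= sum_i (df/dx_i)^2 * var(x_i).
    sens: partial derivatives df/dx_i; sigma: input std deviations."""
    return sum((s * sd) ** 2 for s, sd in zip(sens, sigma)) ** 0.5

# Hypothetical: recharge roughly rainfall minus ET, so the sensitivities
# are +1 and -1; input std deviations of 20 and 15 mm/yr give 25 mm/yr.
sigma_r = propagated_sigma(sens=[1.0, -1.0], sigma=[20.0, 15.0])  # 25.0
```

With correlated inputs (e.g., rainfall and ET estimated from overlapping data) the cross terms 2·s_i·s_j·cov(x_i, x_j) would be added inside the sum, which is why the NGRM estimate also tracks covariance.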

