Advances and next steps exploring the role of groundwater storage in watershed variability across spatial scales

Author(s):  
Laura Condon

Groundwater is by far the largest unfrozen freshwater resource on the planet, yet it is often excluded or greatly simplified in global and continental scale models. It is well established that feedbacks between groundwater depth, surface runoff and land energy fluxes can influence watershed dynamics; however, we still do not understand the large scale implications of these exchanges in evolving systems. Advances in continental scale integrated hydrologic modeling increasingly allow us to explore these interactions across spatial scales. With large scale models we can start to quantify the total impact that groundwater-surface water exchanges have on the water balance as a whole, as well as on watershed dynamics. Here I will explore the buffering effect that groundwater can have on both human and natural water stressors across the US, and the physical drivers of these connections. I will also explore the impacts of long term trends on the stability of groundwater-surface water exchanges. These results demonstrate the importance of the subsurface for future hydrologic predictions and the potential gains from improved groundwater representations in large scale simulations. While there have been great advances in large scale groundwater modeling in recent years, there is still a significant need for continued community model development and intercomparison.

2021 ◽  
Author(s):  
Kor de Jong ◽  
Marc van Kreveld ◽  
Debabrata Panja ◽  
Oliver Schmitz ◽  
Derek Karssenberg

Data availability at the global scale is increasing exponentially. Although considerable challenges remain regarding the identification of model structure and parameters for continental scale hydrological models, we will soon reach the point where global scale models can be defined at very high resolutions of 100 m or less. One of the key challenges is how to make simulations of these ultra-high resolution models tractable ([1]).

Our research addresses this through the development of a model building framework specifically designed to distribute calculations over multiple cluster nodes. This framework enables domain experts such as hydrologists to develop their own large scale models in a scripting language like Python, without needing to acquire the skills to write low-level computer code for parallel and distributed computing.

We present the design and implementation of this software framework and illustrate its use with a prototype 100 m, 1 h continental scale hydrological model. Our modelling framework ensures that any model built with it is parallelized. This is made possible by providing the model builder with a set of model building blocks, coded in such a manner that parallelization of calculations occurs within and across these building blocks, for any combination of building blocks. There is thus full flexibility on the side of the modeller, without losing performance.

This breakthrough is made possible by a novel approach to the implementation of the model building framework, called asynchronous many-tasks, provided by the HPX C++ software library ([3]). The code in the model building framework expresses spatial operations as large collections of interdependent tasks that can be executed efficiently on individual laptops as well as computer clusters ([2]). Our framework currently includes the most essential operations for building large scale hydrological models, including those for simulating the transport of material through a flow direction network. By combining these operations, we rebuilt an existing 100 m, 1 h resolution model, thus far used for simulations of small catchments; this required limited coding, as we only had to replace the computational back end of the existing model. Runs at continental scale on a computer cluster show acceptable strong and weak scaling, a strong indication that global simulations at this resolution will soon be technically possible.

Future work will focus on extending the set of modelling operations and adding scalable I/O, after which existing models that are currently limited in their ability to use the computational resources available to them can be ported to this new environment.

More information about our modelling framework is at https://lue.computationalgeography.org.

References

[1] M. Bierkens. Global hydrology 2015: State, trends, and directions. Water Resources Research, 51(7):4923–4947, 2015.
[2] K. de Jong, et al. An environmental modelling framework based on asynchronous many-tasks: scalability and usability. Submitted.
[3] H. Kaiser, et al. HPX – The C++ standard library for parallelism and concurrency. Journal of Open Source Software, 5(53):2352, 2020.
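The asynchronous many-tasks idea at the heart of this framework can be illustrated with a short, runnable Python sketch. This is a didactic stand-in, not the LUE API: a thread pool plays the role HPX plays in the real C++ implementation, and partitions are simple array blocks rather than distributed data.

```python
# Didactic sketch of asynchronous many-tasks (NOT the LUE API).
# Each whole-field operation becomes one task per partition; a task
# waits only on its own input partitions, so submitting never blocks
# and successive operations build up a task graph.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

pool = ThreadPoolExecutor()

def scatter(field, nr_partitions):
    """Split a field into partitions, each wrapped in a future."""
    return [pool.submit(np.copy, part)
            for part in np.array_split(field, nr_partitions)]

def local_op(op, *operands):
    """Apply an element-wise operation partition-by-partition."""
    def task(futures):
        # Resolve this task's inputs inside the worker, not the caller.
        return op(*[f.result() for f in futures])
    return [pool.submit(task, fs) for fs in zip(*operands)]

# Usage: precipitation excess on partitioned 1000 x 1000 fields.
rain = scatter(np.random.rand(1000, 1000), 4)
infiltration = scatter(np.random.rand(1000, 1000), 4)
excess = local_op(lambda p, i: np.maximum(p - i, 0.0), rain, infiltration)
result = np.concatenate([f.result() for f in excess])  # gather only when needed
```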


2018 ◽  
Vol 76 (4) ◽  
pp. 1072-1082 ◽  
Author(s):  
Niels T Hintzen ◽  
Geert Aarts ◽  
Adriaan D Rijnsdorp

Abstract High-resolution vessel monitoring system (VMS) data have led to detailed estimates of the distribution of fishing in both time and space. While several studies have documented large-scale changes in fishing distribution, fine-scale patterns are still poorly documented, despite VMS data allowing for such analyses. We apply a methodology that can explain and predict effort allocation at fine spatial scales; a scale relevant for assessing impact on the benthic ecosystem. This study uses VMS data to quantify the stability of fishing grounds (i.e. aggregated fishing effort) at a microscale (tens of meters). The model links effort registered at a large scale (ICES rectangle; 1° longitude × 0.5° latitude, ~3600 km²) to fine-scale trawling intensities at a local scale (i.e. a scale matching the gear width, here 24 m). For the first time in the literature, the method estimates the part of an ICES rectangle that is unfavourable or inaccessible for fisheries, which is shown to be highly stable over time and suggests higher proportions of inaccessible grounds for either extremely muddy or coarser substrates. The study furthermore shows high stability in the aggregation of fishing, where aggregation has a positive relationship with depth heterogeneity and a negative relationship with year-on-year variability in fishing intensity.
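To make the two headline quantities concrete, the following sketch (not the paper's fitted model; all numbers are invented) allocates rectangle-scale effort over a fine grid containing an inaccessible fraction, then recovers the untrawled proportion and a simple measure of aggregation.

```python
# Illustrative simulation, not the paper's statistical model: effort
# registered at ICES-rectangle scale is spread over fine-scale cells
# with heavy-tailed weights; a fixed share of cells is inaccessible.
import numpy as np

rng = np.random.default_rng(1)
nr_cells = 100_000        # fine-scale cells within one rectangle
p_inaccessible = 0.3      # assumed share of unfavourable ground
total_effort = 50_000     # trawling events registered at rectangle scale

weights = rng.lognormal(mean=0.0, sigma=1.5, size=nr_cells)
weights[rng.random(nr_cells) < p_inaccessible] = 0.0
probs = weights / weights.sum()

events = rng.multinomial(total_effort, probs)    # effort per fine cell
untrawled = np.mean(events == 0)                  # includes inaccessible ground
top10_share = np.sort(events)[-nr_cells // 10:].sum() / total_effort

print(f"proportion of cells untrawled: {untrawled:.2f}")
print(f"share of effort in top 10% of cells: {top10_share:.2f}")
```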


2011 ◽  
Vol 50 (12) ◽  
pp. 2504-2513 ◽  
Author(s):  
Stefanie M. Herrmann ◽  
Karen I. Mohr

Abstract A classification of rainfall seasonality regimes in Africa was derived from gridded rainfall and land surface temperature products. By adapting a method that goes back to Walter and Lieth’s approach of presenting climatic diagrams, relationships between estimated rainfall and temperature were used to determine the presence and pattern of humid, arid, and dry months. The temporal sequence of humid, arid, and dry months defined nonseasonal as well as single-, dual-, and multiple-wet-season regimes with one or more rainfall peaks per wet season. The use of gridded products resulted in a detailed, spatially continuous classification for the entire African continent at two different spatial resolutions, which compared well to local-scale studies based on station data. With its focus on rainfall patterns at fine spatial scales, this classification is complementary to coarser and more genetic classifications based on atmospheric driving forces. An analysis of the stability of the resulting seasonality regimes shows areas of relatively high year-to-year stability in the single-wet-season regimes and areas of lower year-to-year stability in the dual- and multiple-wet-season regimes as well as in transition zones.
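A minimal sketch of a Walter-and-Lieth-style month classification is given below. The 2T rule is the classic convention from those climatic diagrams; the absolute threshold separating "dry" from merely "arid" months is illustrative, not the value used in the study.

```python
# Walter-Lieth style classification sketch: a month is humid when the
# precipitation curve lies above the temperature curve under the classic
# P(mm) = 2T(degC) scaling. The dry_threshold_mm value is illustrative.
import numpy as np

def classify_months(precip_mm, temp_c, dry_threshold_mm=10.0):
    """Label each month humid, arid, or dry."""
    labels = np.where(precip_mm >= 2.0 * temp_c, "humid", "arid")
    labels = np.where(precip_mm < dry_threshold_mm, "dry", labels)
    return labels

# Example: a single-wet-season regime (one contiguous run of humid months).
precip = np.array([2, 1, 5, 30, 90, 140, 160, 120, 70, 25, 5, 2], float)
temp = np.array([24, 26, 28, 29, 28, 26, 25, 25, 26, 27, 26, 24], float)
print(classify_months(precip, temp))
```

The temporal sequence of these labels is then what defines the regime: one run of humid months gives a single-wet-season regime, two separated runs a dual-wet-season regime, and so on.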


2021 ◽  
Author(s):  
Denis Anikiev ◽  
Hans-Jürgen Götze ◽  
Judith Bott ◽  
Angela Maria Gómez-García ◽  
Maria Laura Gomez Dacal ◽  
...  

We introduce a modelling concept for the construction of 3-D data-constrained subsurface structural density models at different spatial scales: from large-scale models (thousands of square km) to regional (hundreds of square km) and small-scale (tens of square km) models used in applied geophysics. These models are important for understanding the drivers of geohazards, for efficient and sustainable extraction of resources from sedimentary basins, such as groundwater, hydrocarbons or deep geothermal energy, and for investigating the capacity for long-term underground storage of gas and radioactive materials.

The modelling concept involves interactive fitting of potential fields (gravity and magnetics) and their derivatives within IGMAS+ (Interactive Gravity and Magnetic Application System), a well-known software tool with almost 40 years of development behind it. The core of IGMAS+ is the analytical solution of the volume integral for the gravity and magnetic effects of homogeneous bodies bounded by polyhedrons of triangulated model interfaces. The backbone model is constrained by interdisciplinary data, e.g. geological maps, seismic reflection and refraction profiles, structural signatures obtained from seismic receiver functions, local surveys, etc. The software supports spherical geometries to resolve the first-order effects related to the curvature of the Earth, which is especially important for large-scale models.

Currently developed and maintained at the Helmholtz Centre Potsdam – GFZ German Research Centre for Geosciences, IGMAS+ has a cross-platform implementation with parallelized computations and optimized storage. The powerful graphical interface makes the interactive modelling and geometry modification process user-friendly and robust. IGMAS+ has historically been free for research and education purposes, and there is a long-term plan for it to remain so.

IGMAS+ has been used in various tectonic settings, and we demonstrate its flexibility and usability on several lithospheric-scale case studies in South America and Europe.

Both science and industry are close to the goal of treating all available geoscientific data and geophysical methods inside a single subsurface model that integrates most of the interdisciplinary measurement-based constraints and the essential structural trends coming from geology. This approach presents challenges both for its implementation within the modelling software and for the usability and plausibility of the generated results, requiring a modelling concept that integrates these data and methods in a feasible way, together with recent advances in data science. We therefore also present the future outlook of our modelling concept with regard to these challenges.
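IGMAS+ evaluates the analytical volume integral for triangulated polyhedral bodies; the sketch below substitutes the simplest possible body, a buried sphere (point-mass equivalent), purely to illustrate the forward-model-and-fit workflow the abstract describes. All numerical values are invented.

```python
# Minimal forward-modelling sketch (a sphere stands in for IGMAS+'s
# polyhedral bodies): compute the model's gravity effect along a profile,
# compare with "observed" data, and adjust one geometry parameter.
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gz_sphere(x, x_c, depth, radius, delta_rho):
    """Vertical gravity effect (mGal) of a buried sphere along a profile."""
    mass = 4.0 / 3.0 * np.pi * radius**3 * delta_rho
    gz = G * mass * depth / ((x - x_c) ** 2 + depth**2) ** 1.5
    return gz * 1e5  # m/s^2 -> mGal

x = np.linspace(-5000.0, 5000.0, 201)               # profile coordinates, m
observed = gz_sphere(x, 0.0, 1500.0, 500.0, 300.0)  # synthetic "data"

# Interactive fitting in miniature: vary depth, keep other parameters fixed.
for depth in (1000.0, 1500.0, 2000.0):
    model = gz_sphere(x, 0.0, depth, 500.0, 300.0)
    rms = np.sqrt(np.mean((model - observed) ** 2))
    print(f"depth {depth:6.0f} m -> RMS misfit {rms:.4f} mGal")
```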


2020 ◽  
Author(s):  
Ilaria Tabone ◽  
Alexander Robinson ◽  
Jorge Alvarez-Solas ◽  
Javier Blasco ◽  
Daniel Moreno ◽  
...  

Simulations with large-scale ice sheet models are crucial for understanding the long-term evolution of an ice sheet and its response to climate forcings. However, solving the ice-flow equations and representing the processes specific to an ice sheet at large spatial scales requires reducing the model's computational complexity to a certain degree. To do so, coarse-resolution models represent several physical processes and ice characteristics through parameterisations. Ice-sheet boundary conditions (e.g. basal sliding, surface ablation, grounded and marine basal melting) as well as unconstrained ice-flow properties (e.g. the ice-flow enhancement factor) are some examples. However, choosing the best parameter values to represent such processes well is a demanding exercise. Statistical methods, from simple techniques to advanced Bayesian approaches, have been employed to evaluate model performance. Here we optimise the performance of a new state-of-the-art hybrid ice-sheet-shelf model by applying a skill-score method based on a multi-misfit approach. A large ensemble of paleo-to-present transient simulations of the Greenland ice sheet (GrIS) is produced through the Latin Hypercube Sampling technique. Results are evaluated against a variety of information, comprising the present-day state of the ice sheet (e.g. ice thickness, ice velocity, basal thermal state) as well as available paleo reconstructions (e.g. glacial maximum extent, past elevation at the ice core sites). The results are then assembled to generate a single skill-score value based on a Gaussian approach. The procedure is applied to various model parameters to evaluate the best choice of values associated with their parameterisations.
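The ensemble-and-score workflow can be sketched as follows. The Latin Hypercube sampling and the Gaussian skill-score form follow the abstract; the parameter names, bounds, misfit stand-in and tolerances are invented placeholders, since the real misfits come from running the ice-sheet model.

```python
# Sketch of the workflow: LHS ensemble -> per-target misfits ->
# Gaussian skill per target -> single aggregated score per run.
import numpy as np
from scipy.stats import qmc

# Latin Hypercube sample of three illustrative model parameters.
sampler = qmc.LatinHypercube(d=3, seed=0)
lower = np.array([1.0, 1e-3, 0.1])   # e.g. enhancement factor, sliding, melt
upper = np.array([5.0, 1e-1, 10.0])
params = qmc.scale(sampler.random(n=256), lower, upper)

def misfits(p):
    """Placeholder for running the model and measuring errors against
    present-day observations and paleo reconstructions."""
    return np.abs(p - (lower + upper) / 2.0)

sigmas = (upper - lower) / 4.0  # assumed tolerated error per target

# Gaussian skill per target, multiplied into one score per ensemble member.
scores = np.array([np.prod(np.exp(-0.5 * (misfits(p) / sigmas) ** 2))
                   for p in params])
print("best-scoring parameter set:", params[np.argmax(scores)])
```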


2011 ◽  
Vol 366 (1582) ◽  
pp. 3292-3302 ◽  
Author(s):  
Robert M. Ewers ◽  
Raphael K. Didham ◽  
Lenore Fahrig ◽  
Gonçalo Ferraz ◽  
Andy Hector ◽  
...  

Opportunities to conduct large-scale field experiments are rare, but such experiments provide a unique means of revealing the complex processes that operate within natural ecosystems. Here, we review the design of existing large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.


2011 ◽  
Vol 278 (1717) ◽  
pp. 2437-2445 ◽  
Author(s):  
Paul-Camilo Zalamea ◽  
François Munoz ◽  
Pablo R. Stevenson ◽  
C. E. Timothy Paine ◽  
Carolina Sarmiento ◽  
...  

Plant phenology is concerned with the timing of recurring biological events. Though phenology has traditionally been studied through intensive surveys of a local flora, results from such surveys are difficult to generalize to broader spatial scales. In this study, by contrast, we assembled a continental-scale dataset of herbarium specimens for the emblematic genus of Neotropical pioneer trees, Cecropia, and applied Fourier spectral and cospectral analyses to investigate the reproductive phenology of 35 species. We detected significant annual, sub-annual and continuous patterns, and discuss the variation in patterns within and among climatic regions. Although previous studies have suggested that pioneer species generally produce flowers continually throughout the year, we found that at least one third of Cecropia species are characterized by clear annual flowering behaviour. We further investigated the relationships between phenology and climate seasonality, showing strong associations between phenology and seasonal variation in precipitation and temperature. We also verified our results against field survey data gathered from the literature. Our findings indicate that herbarium material is a reliable resource for investigating large-scale patterns in plant phenology, offering a promising complement to local intensive field studies.
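The spectral idea can be sketched with synthetic monthly specimen counts (invented, not Cecropia data): the Fourier spectrum of the series reveals whether flowering records recur annually, sub-annually, or show no dominant period.

```python
# Fourier detection of an annual cycle in a monthly specimen-count series.
# The series is synthetic: an annual (period-12) signal plus noise.
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(240)  # 20 years of monthly flowering-specimen counts
counts = 5 + 3 * np.cos(2 * np.pi * months / 12) + rng.normal(0, 1, months.size)

spectrum = np.abs(np.fft.rfft(counts - counts.mean())) ** 2
freqs = np.fft.rfftfreq(months.size, d=1.0)  # cycles per month

peak = freqs[np.argmax(spectrum)]
print(f"dominant period: {1 / peak:.1f} months")  # ~12 -> annual flowering
```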


2010 ◽  
Vol 23 (22) ◽  
pp. 5933-5957 ◽  
Author(s):  
G. M. Martin ◽  
S. F. Milton ◽  
C. A. Senior ◽  
M. E. Brooks ◽  
S. Ineson ◽  
...  

Abstract The reduction of systematic errors is a continuing challenge for model development. Feedbacks and compensating errors in climate models often make finding the source of a systematic error difficult. In this paper, it is shown how model development can benefit from the use of the same model across a range of temporal and spatial scales. Two particular systematic errors are examined: tropical circulation and precipitation distribution, and summer land surface temperature and moisture biases over Northern Hemisphere continental regions. Each of these errors affects the model performance on time scales ranging from a few days to several decades. In both cases, the characteristics of the long-time-scale errors are found to develop during the first few days of simulation, before any large-scale feedbacks have taken place. The ability to compare model diagnostics from the first few days of a forecast, initialized from a realistic atmospheric state, directly with observations has allowed deficiencies in the physical parameterizations to be identified that, when corrected, lead to improvements across the full range of time scales. This study highlights the benefits of a seamless prediction system across a wide range of time scales.


2014 ◽  
Vol 15 (1) ◽  
pp. 505-516 ◽  
Author(s):  
Yasuhiro Ishizaki ◽  
Tokuta Yokohata ◽  
Seita Emori ◽  
Hideo Shiogama ◽  
Kiyoshi Takahashi ◽  
...  

Abstract A pattern scaling approach allows projection of regional climate changes under a wide range of emission scenarios. A basic assumption of this approach is that the spatial response pattern to global warming (the scaling pattern) is the same for all emission scenarios. Precipitation minus evapotranspiration (PME) over land can be considered a measure of the maximum available renewable freshwater resource, and estimation of PME is fundamentally important for the assessment of water resources. The authors assessed the basic assumption of pattern scaling for PME using five global climate models. A significant scenario dependency (SD) of the scaling pattern of PME was found over some regions, due mainly to the SD and the nonlinear response of large-scale atmospheric and oceanic changes. When the SD of the scaling pattern of PME is significant in a target area, projections of the impact of climate change need to take this SD carefully into consideration. Although the SD of the anthropogenic aerosol scaling patterns tended to induce SDs of the precipitation and evapotranspiration scaling patterns, the SDs of precipitation and evapotranspiration tended to cancel each other out. As a result, the SD of the PME scaling pattern tended to be insignificant over most regions where the SD of the anthropogenic aerosol scaling patterns was significant. The authors could not find large impacts of land use change on the PME scaling pattern, but the former may influence the latter on different time scales or spatial scales.
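Pattern scaling can be written as ΔX(x, t) ≈ P(x) × ΔT_global(t), where P(x) is the scaling pattern. The sketch below estimates P(x) by least squares for two synthetic "scenarios" and shows how a scenario dependency appears as a difference between the recovered patterns; all fields are random stand-ins for climate model output.

```python
# Pattern scaling sketch: per-gridpoint least-squares slope of local
# change against global-mean warming, plus a toy scenario-dependency check.
import numpy as np

rng = np.random.default_rng(2)
nr_years, ny, nx = 100, 4, 5
t_global = np.linspace(0.0, 3.0, nr_years)  # global-mean warming, K

def scaling_pattern(local_change, t_glob):
    """Least-squares slope P(x) in dX(x, t) ~= P(x) * dT_global(t)."""
    t = t_glob - t_glob.mean()
    x = local_change - local_change.mean(axis=0)
    return np.tensordot(t, x, axes=(0, 0)) / (t @ t)

# Two scenarios sharing a true pattern; scenario B's pattern is perturbed,
# mimicking a scenario dependency (SD).
true_p = rng.normal(0.0, 1.0, (ny, nx))
pme_a = t_global[:, None, None] * true_p + rng.normal(0, 0.2, (nr_years, ny, nx))
pme_b = t_global[:, None, None] * (true_p + 0.3) + rng.normal(0, 0.2, (nr_years, ny, nx))

print(np.round(scaling_pattern(pme_a, t_global)
               - scaling_pattern(pme_b, t_global), 2))  # nonzero -> SD
```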


2004 ◽  
Vol 28 (3) ◽  
pp. 317-339 ◽  
Author(s):  
Yvonne Martin ◽  
Michael Church

A resurgence of interest in landscape evolution has occurred as computational technology has made spatially and temporally extended numerical modelling possible. We review elements of a structured approach to model development and testing. It is argued that natural breaks in landscape process and morphology define appropriate spatial domains for the study of landscape evolution. The concept of virtual velocity is used to define appropriate timescales for the study of landscape change. Process specification in numerical modelling requires that the detail incorporated into equations be commensurate with the particular scale being considered. This may entail a mechanistic approach at small (spatial) scales, whereas a generalized approach to process definition may be preferred in large-scale studies. The distinction is illustrated by parameterizations for hillslope and fluvial transport processes based on scale considerations. Issues relevant to model implementation, including validation, verification, calibration and confirmation, are discussed. Finally, key developments and characteristics associated with three approaches to the study of landscape modelling are reviewed: (i) conceptual; (ii) quasi-mechanistic; and (iii) generalized physics.
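The generalized end of this spectrum can be sketched with the classic linear-diffusion parameterization of hillslope transport, dz/dt = K d²z/dx², in which sediment flux is proportional to slope rather than resolved mechanistically. Grid size, diffusivity and time step below are arbitrary illustrations, not values from the review.

```python
# Generalized hillslope transport: explicit 1-D linear diffusion of a
# topographic profile, with flux q = -K dz/dx and mass conservation.
import numpy as np

nx, dx = 101, 10.0          # 1 km profile, 10 m cells
K = 0.01                    # diffusivity, m^2/yr (illustrative)
dt = 0.4 * dx**2 / K        # explicit step below the dx^2/(2K) stability limit

z = np.where(np.arange(nx) < nx // 2, 100.0, 0.0)  # initial scarp, m

for _ in range(100):        # ~400 kyr of relaxation
    flux = -K * np.diff(z) / dx          # downslope sediment flux
    z[1:-1] -= dt * np.diff(flux) / dx   # mass conservation in the interior

print(f"max slope after relaxation: {(np.abs(np.diff(z)) / dx).max():.3f}")
```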

