Independent evaluation of the SNODAS snow depth product using regional-scale LiDAR-derived measurements


2015 ◽  
Vol 9 (1) ◽  
pp. 13-23 ◽  
Author(s):  
A. Hedrick ◽  
H.-P. Marshall ◽  
A. Winstral ◽  
K. Elder ◽  
S. Yueh ◽  
...  

Abstract. Repeated light detection and ranging (lidar) surveys are quickly becoming the de facto method for measuring the spatial variability of montane snowpacks at high resolution. This study examines the potential of a 750 km² lidar-derived data set of snow depths, collected during the 2007 northern Colorado Cold Lands Processes Experiment (CLPX-2), as a validation source for an operational hydrologic snow model. The SNOw Data Assimilation System (SNODAS) model framework, operated by the US National Weather Service, combines a physically based energy-and-mass-balance snow model with satellite, airborne and automated ground-based observations to provide daily estimates of snowpack properties at nominally 1 km resolution over the conterminous United States. Independent validation data are scarce due to the assimilating nature of SNODAS, compelling the need for an independent validation data set with substantial geographic coverage. Within 12 distinctive 500 m × 500 m study areas located throughout the survey swath, ground crews performed approximately 600 manual snow depth measurements during each of the CLPX-2 lidar acquisitions. This supplied a data set for constraining the uncertainty of upscaled lidar estimates of snow depth at the 1 km SNODAS resolution, resulting in a root-mean-square difference of 13 cm. Upscaled lidar snow depths were then compared to the SNODAS estimates over the entire study area for the dates of the lidar flights. The remotely sensed snow depths provided a more spatially continuous comparison data set and agreed more closely with the model estimates than the in situ measurements alone did. Finally, the results revealed three distinct areas where the differences between lidar observations and SNODAS estimates were most drastic, providing insight into the causal influences of natural processes on model uncertainty.
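As a rough illustration of the two comparison steps described in the abstract (block-averaging fine-resolution lidar depths to the 1 km SNODAS grid, then computing a root-mean-square difference), the following Python sketch uses synthetic arrays; the grid size, aggregation factor, noise level and variable names are assumptions, not the study's actual processing chain.

```python
import numpy as np

def upscale_mean(depth, factor):
    """Block-average a fine snow-depth grid to a coarser resolution."""
    ny, nx = depth.shape
    ny, nx = ny - ny % factor, nx - nx % factor   # trim to clean multiples
    blocks = depth[:ny, :nx].reshape(ny // factor, factor, nx // factor, factor)
    return np.nanmean(blocks, axis=(1, 3))

rng = np.random.default_rng(0)
lidar = rng.gamma(2.0, 0.5, size=(2000, 2000))           # stand-in 5 m depths (m)
upscaled = upscale_mean(lidar, 200)                      # 5 m x 200 -> nominal 1 km
snodas = upscaled + rng.normal(0, 0.13, upscaled.shape)  # stand-in model field

rmsd = np.sqrt(np.nanmean((upscaled - snodas) ** 2))
print(f"RMSD: {rmsd * 100:.1f} cm")
```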


2013 ◽  
Vol 54 (62) ◽  
pp. 273-281 ◽  
Author(s):  
Kjetil Melvold ◽  
Thomas Skaugen

Abstract. This study presents results from an Airborne Laser Scanning (ALS) mapping survey of snow depth on the mountain plateau Hardangervidda, Norway, in 2008 and 2009 at the approximate time of maximum snow accumulation during the winter. The spatial extent of the survey area is >240 km². Large variability is found for snow depth at a local scale (2 m²), and similar spatial patterns in accumulation are found between 2008 and 2009. The local snow-depth measurements were aggregated by averaging to produce new datasets at 10, 50, 100, 250 and 500 m² and 1 km² resolution. The measured values at 1 km² were compared with simulated snow depth from the seNorge snow model (www.senorge.no), which is run on a 1 km² grid resolution. Results show that the spatial variability decreases as the scale increases. At a scale of about 500 m² to 1 km² the variability of snow depth is somewhat larger than that modeled by seNorge. This analysis shows that (1) the regional-scale spatial pattern of snow distribution is well captured by the seNorge model and (2) relatively large differences in snow depth between the measured and modeled values are present.
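The scale dependence reported here (spatial variability shrinking as the aggregation cell grows) can be sketched with a simple block-averaging experiment; the grid, aggregation factors and lognormal depths below are synthetic stand-ins, not the Hardangervidda data.

```python
import numpy as np

rng = np.random.default_rng(1)
depth_2m = rng.lognormal(mean=0.3, sigma=0.6, size=(1024, 1024))  # ~2 m cells

for factor in (1, 5, 25, 125):        # illustrative aggregation factors
    ny = (1024 // factor) * factor    # trim to a clean multiple
    blocks = depth_2m[:ny, :ny].reshape(ny // factor, factor, ny // factor, factor)
    coarse = blocks.mean(axis=(1, 3))
    print(f"factor {factor:4d}: std = {coarse.std():.3f} m")
```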


2006 ◽  
Vol 7 (5) ◽  
pp. 880-895 ◽  
Author(s):  
M. J. Tribbeck ◽  
R. J. Gurney ◽  
E. M. Morris

Abstract Models of snow processes in areas of possible large-scale change need to be site-independent and physically based. Here, the accumulation and ablation of the seasonal snow cover beneath a fir canopy have been simulated with a new physically based snow–soil–vegetation–atmosphere transfer (Snow-SVAT) scheme called SNOWCAN. The model was formulated by coupling a canopy optical and thermal radiation model to a physically based multilayer snow model. Simple representations of other forest effects were included. These include the reduction of wind speed and hence turbulent transfer beneath the canopy, sublimation of intercepted snow, and deposition of debris on the surface. This paper tests this new modeling approach fully at a fir site within the Reynolds Creek Experimental Watershed, Idaho. Model parameters were determined at an open site and subsequently applied to the fir site. SNOWCAN was evaluated using measurements of snow depth, subcanopy solar and thermal radiation, and snowpack profiles of temperature, density, and grain size. Simulations showed good agreement with observations (e.g., fir site snow depth was estimated over the season with r² = 0.96), generally to within measurement error. However, the simulated temperature profiles were less accurate after a melt–freeze event, when the temperature discrepancy resulted from underestimation of the rate of liquid water flow and/or the rate of refreeze. This indicates both that the general modeling approach is applicable and that a still more complete representation of liquid water in the snowpack will be important.
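The goodness-of-fit statistic quoted above (r² between simulated and observed snow depth) is the squared Pearson correlation; a minimal sketch follows, with placeholder series rather than SNOWCAN output.

```python
import numpy as np

obs = np.array([0.10, 0.35, 0.60, 0.85, 1.10, 0.95, 0.70, 0.30])  # depth (m)
sim = np.array([0.12, 0.30, 0.65, 0.80, 1.05, 1.00, 0.65, 0.35])  # depth (m)

r = np.corrcoef(obs, sim)[0, 1]   # Pearson correlation coefficient
print(f"r^2 = {r**2:.2f}")
```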


Geosciences ◽  
2021 ◽  
Vol 11 (1) ◽  
pp. 35
Author(s):  
Luca Schilirò ◽  
José Cepeda ◽  
Graziella Devoli ◽  
Luca Piciullo

In Norway, shallow landslides are generally triggered by intense rainfall and/or snowmelt events. However, the interactions of hydrometeorological processes (e.g., precipitation and snowmelt) acting at different time scales, and the local variations of terrain conditions (e.g., thickness of the surficial cover), are complex and often unknown. With the aim of better defining the triggering conditions of shallow landslides at a regional scale, we used the physically based model TRIGRS (Transient Rainfall Infiltration and Grid-based Regional Slope stability) in an area located in the upper Gudbrandsdalen valley in south-eastern Norway. We performed numerical simulations to reconstruct two scenarios that triggered many landslides in the study area, on 10 June 2011 and 22 May 2013. A large part of the work was dedicated to the parameterization of the numerical model. The initial soil-hydraulic conditions and the spatial variation of the surficial cover thickness were evaluated by applying different methods. To fully evaluate the accuracy of the model, ROC (receiver operating characteristic) curves were obtained by comparing the safety factor maps with the source areas in the two periods of analysis. The results of the numerical simulations show the high susceptibility of the study area to the occurrence of shallow landslides and emphasize the importance of proper model calibration for improving the reliability of the results.
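The ROC evaluation described above can be sketched as follows, assuming a hypothetical factor-of-safety raster and a binary mask of mapped source cells; since a lower factor of safety means less stability, its negation serves as the classification score. All values are synthetic.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(2)
fs = rng.normal(1.3, 0.3, size=10_000)                # stand-in FS per grid cell
sources = (fs + rng.normal(0, 0.25, fs.size)) < 1.0   # stand-in landslide cells

fpr, tpr, _ = roc_curve(sources.astype(int), -fs)     # -FS: higher = less stable
print(f"AUC = {auc(fpr, tpr):.2f}")
```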


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
S Gao ◽  
D Stojanovski ◽  
A Parker ◽  
P Marques ◽  
S Heitner ◽  
...  

Abstract Background Correctly identifying views acquired in a 2D echocardiographic examination is paramount to the post-processing and quantification steps often performed as part of most clinical workflows. In many exams, particularly in stress echocardiography, microbubble contrast is used, which greatly affects the appearance of the cardiac views. Here we present a bespoke, fully automated convolutional neural network (CNN) which identifies apical 2, 3, and 4 chamber, and short axis (SAX) views acquired with and without contrast. The CNN was tested in a completely independent, external dataset, with the data acquired in a different country than that used to train the neural network. Methods Training data comprised 2D echocardiograms from 1014 subjects in a prospective multisite, multi-vendor UK trial, with more than 17,500 frames in each view. Prior to view classification model training, images were processed using standard techniques to ensure homogeneous and normalised image inputs to the training pipeline. A bespoke CNN was built using the minimum number of convolutional layers required, with batch normalisation and with dropout for reducing overfitting. Before processing, the data were split into 90% for model training (211,958 frames) and 10% for validation (23,946 frames); image frames from each subject were assigned entirely to either the training or the validation dataset. Further, a separate trial dataset of 240 studies acquired in the USA was used as an independent test dataset (39,401 frames). Results Figure 1 shows the confusion matrices for both the validation data (left) and the independent test data (right), with an overall accuracy of 96% and 95% for the validation and test datasets respectively. The accuracy of >99% for the non-contrast cardiac views exceeds that seen in other works. The combined datasets included images acquired across ultrasound manufacturers and models from 12 clinical sites. Conclusion We have developed a CNN capable of automatically and accurately identifying all relevant cardiac views used in “real world” echo exams, including views acquired with contrast. Use of the CNN in a routine clinical workflow could improve the efficiency of quantification steps performed after image acquisition. The model was tested on an independent dataset acquired in a different country from the training data and was found to perform similarly, indicating its generalisability. Figure 1. Confusion matrices Funding Acknowledgement Type of funding source: Private company. Main funding source(s): Ultromics Ltd.
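For readers unfamiliar with the ingredients named in the Methods (a compact CNN with batch normalisation and dropout), here is a minimal PyTorch sketch of a view classifier. The layer layout, the eight-class output (four views, with and without contrast) and the 128 × 128 input are assumptions for illustration; the bespoke network itself is not described in detail in the abstract.

```python
import torch
import torch.nn as nn

class ViewClassifier(nn.Module):
    def __init__(self, n_classes=8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                  # global average pooling
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Dropout(0.5), nn.Linear(64, n_classes)
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = ViewClassifier()
logits = model(torch.randn(4, 1, 128, 128))  # batch of four grayscale frames
print(logits.shape)                          # torch.Size([4, 8])
```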


2021 ◽  
Vol 10 (2) ◽  
pp. 88
Author(s):  
Dana Kaziyeva ◽  
Martin Loidl ◽  
Gudrun Wallentin

Transport planning strategies regard cycling promotion as a suitable means for tackling problems connected with motorized traffic, such as limited space, congestion, and pollution. However, the evidence base for optimizing cycling promotion is weak in most cases, and information on bicycle patterns at a sufficient resolution is largely lacking. In this paper, we propose agent-based modeling to simulate bicycle traffic flows at the regional scale for an entire day. The feasibility of the model is demonstrated in a use case in the Salzburg region, Austria. The simulation results in distinct spatio-temporal bicycle traffic patterns at high spatial (road segment) and temporal (minute) resolution. Scenario analysis positively assesses the model’s level of complexity, where the demographically parametrized behavior of cyclists outperforms stochastic null models. Validation with reference data from three sources shows a high correlation between simulated and observed bicycle traffic, where the predictive power is primarily related to the quality of the input and validation data. In conclusion, the implemented agent-based model successfully simulates the bicycle patterns of 186,000 inhabitants within a reasonable time. This spatially explicit approach to modeling individual mobility behavior opens new opportunities for evidence-based planning and decision making in the wide field of cycling promotion.
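As a toy illustration of the agent-based idea (individual cyclists generating per-segment, per-minute flows), consider the sketch below; the segments, the fixed route and the departure-time distribution are invented for illustration and bear no relation to the Salzburg model.

```python
from collections import Counter
import random

random.seed(3)
segments = ["s1", "s2", "s3", "s4"]
flows = Counter()                       # (segment, minute) -> cyclists

for _ in range(1000):                   # 1000 toy cyclists
    minute = random.randint(420, 540)   # depart between 07:00 and 09:00
    for seg in segments:                # ride a fixed toy route
        flows[(seg, minute)] += 1
        minute += random.randint(1, 3)  # travel time to the next segment

seg_min, count = max(flows.items(), key=lambda kv: kv[1])
print(f"Busiest segment-minute: {seg_min} with {count} cyclists")
```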


Water ◽  
2021 ◽  
Vol 13 (5) ◽  
pp. 692
Author(s):  
Boyu Mi ◽  
Haorui Chen ◽  
Shaoli Wang ◽  
Yinlong Jin ◽  
Jiangdong Jia ◽  
...  

Research on water movement in irrigation districts is important for food production. Many hydrological models have been proposed to simulate water movement at the regional scale, yet few of them comprehensively consider the processes specific to irrigation districts. A novel physically based distributed model, the Irrigation Districts Model (IDM), was constructed in this study to address this problem. The model combines 1D canal and ditch flow, 1D soil water movement, 2D groundwater movement, and the water interactions among these processes. It was calibrated and verified with two years of experimental data from the Shahaoqu Sub-Irrigation Area in Hetao Irrigation District. The overall water balance error is 2.9% and 1.6% for the two years, respectively. The Nash–Sutcliffe efficiency coefficient (NSE) of water table depth and soil water content is 0.72 and 0.64 in the calibration year and 0.68 and 0.64 in the verification year. The results show good correspondence between simulations and observations, and it is practicable to apply the model to water movement research in irrigation districts.
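The Nash–Sutcliffe efficiency used to score the water table and soil water simulations is straightforward to compute; the sketch below uses placeholder series, not the Shahaoqu observations.

```python
import numpy as np

def nse(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = [1.8, 1.6, 1.5, 1.7, 2.0, 2.2, 2.1]   # e.g. water table depth (m)
sim = [1.7, 1.6, 1.6, 1.8, 1.9, 2.1, 2.2]
print(f"NSE = {nse(obs, sim):.2f}")         # 1 is perfect, 0 matches the mean
```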


2016 ◽  
Vol 9 (12) ◽  
pp. 4491-4519 ◽  
Author(s):  
Aurélien Gallice ◽  
Mathias Bavay ◽  
Tristan Brauchli ◽  
Francesco Comola ◽  
Michael Lehning ◽  
...  

Abstract. Climate change is expected to strongly impact the hydrological and thermal regimes of Alpine rivers within the coming decades. In this context, the development of hydrological models accounting for the specific dynamics of Alpine catchments appears as one of the most promising approaches to reduce the uncertainty in projections of future mountain hydrology. This paper describes the improvements brought to StreamFlow, an existing model for hydrological and stream temperature prediction built as an external extension to the physically based snow model Alpine3D. StreamFlow's source code has been entirely rewritten, taking advantage of object-oriented programming to significantly improve its structure and ease the implementation of future developments. The source code is now publicly available online, along with complete documentation. A special emphasis has been put on modularity during the re-implementation of StreamFlow, so that many model aspects can be represented using different alternatives. For example, several options are now available to model the advection of water within the stream. This allows for an easy and fast comparison between different approaches and helps in defining more reliable uncertainty estimates of the model forecasts. In particular, a case study in a Swiss Alpine catchment reveals that the stream temperature predictions are particularly sensitive to the approach used to model the temperature of subsurface flow, a fact which has been poorly reported in the literature to date. Based on the case study, StreamFlow is shown to reproduce hourly mean discharge with a Nash–Sutcliffe efficiency (NSE) of 0.82 and hourly mean temperature with an NSE of 0.78.
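The modularity described above amounts to alternative process representations sharing one interface, so they can be swapped and compared without touching the rest of the model. The sketch below is a generic Python pattern, not StreamFlow's actual API (StreamFlow extends Alpine3D and is not written in Python), with two hypothetical options for the temperature assigned to subsurface flow, the aspect the case study found most sensitive.

```python
from abc import ABC, abstractmethod

class SubsurfaceTemperature(ABC):
    """Interface shared by all subsurface flow temperature options."""
    @abstractmethod
    def temperature(self, t_air: float, t_soil: float) -> float: ...

class AirProxy(SubsurfaceTemperature):
    def temperature(self, t_air, t_soil):
        return t_air      # subsurface flow takes the air temperature

class SoilProxy(SubsurfaceTemperature):
    def temperature(self, t_air, t_soil):
        return t_soil     # subsurface flow takes the soil temperature

# Swapping options requires no change elsewhere in the model:
for option in (AirProxy(), SoilProxy()):
    print(type(option).__name__, option.temperature(t_air=8.0, t_soil=4.0))
```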


2018 ◽  
Author(s):  
Fabien Maussion ◽  
Anton Butenko ◽  
Julia Eis ◽  
Kévin Fourteau ◽  
Alexander H. Jarosch ◽  
...  

Abstract. Despite their importance for sea-level rise, seasonal water availability, and as a source of geohazards, mountain glaciers are one of the few remaining sub-systems of the global climate system for which no globally applicable, open source, community-driven model exists. Here we present the Open Global Glacier Model (OGGM, http://www.oggm.org), developed to provide a modular and open source numerical model framework for simulating past and future change of any glacier in the world. The modelling chain comprises data downloading tools (glacier outlines, topography, climate, validation data), a preprocessing module, a mass-balance model, a distributed ice thickness estimation model, and an ice flow model. The monthly mass balance is obtained from gridded climate data and a temperature index melt model. To our knowledge, OGGM is the first global model to explicitly simulate glacier dynamics: the model relies on the shallow ice approximation to compute the depth-integrated flux of ice along multiple connected flowlines. In this paper, we describe and illustrate each processing step by applying the model to a selection of glaciers before running global simulations under idealized climate forcings. Even without in-depth calibration, the model shows very realistic behaviour. We are able to reproduce earlier estimates of global glacier volume by varying the ice dynamical parameters within a range of plausible values. At the same time, the increased complexity of OGGM compared to other prevalent global glacier models comes at a reasonable computational cost: several dozen glaciers can be simulated on a personal computer, while global simulations realized in a supercomputing environment take up to a few hours per century. Thanks to the modular framework, modules of various complexity can be added to the codebase, allowing new kinds of model intercomparisons to be run in a controlled environment. Future developments will add new physical processes to the model as well as tools to calibrate the model in a more comprehensive way. OGGM spans a wide range of applications, from ice-climate interaction studies at millennial timescales to estimates of the contribution of glaciers to past and future sea-level change. It has the potential to become a self-sustained, community-driven model for global and regional glacier evolution.
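The temperature index melt model mentioned above follows the classical degree-day form, melt ∝ max(T − T_melt, 0); the melt factor and threshold in this sketch are illustrative defaults, not OGGM's calibrated parameters.

```python
import numpy as np

def monthly_melt(t_monthly_c, melt_factor=6.0, t_melt=-1.0, days=30):
    """Melt (mm w.e.) ~ melt_factor [mm/K/day] * positive degrees * days."""
    t = np.asarray(t_monthly_c, dtype=float)
    return melt_factor * np.maximum(t - t_melt, 0.0) * days

temps = [-8.0, -3.0, 1.0, 5.0, 9.0]   # monthly mean air temperature (degC)
print(monthly_melt(temps))            # mm w.e. melt for each month
```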


2015 ◽  
Vol 138 (1) ◽  
Author(s):  
Jeff R. Harris ◽  
Blake W. Lance ◽  
Barton L. Smith

A computational fluid dynamics (CFD) validation dataset for turbulent forced convection on a vertical plate is presented. The design of the apparatus is based on recent validation literature and provides a means to simultaneously measure boundary conditions (BCs) and system response quantities (SRQs). All inflow quantities important for Reynolds-averaged Navier–Stokes (RANS) CFD are also measured. Data are acquired at two heating conditions and cover the ranges 40,000 < Re_x < 300,000, 357 < Re_δ2 < 813, and 0.02 < Gr/Re² < 0.232.
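For orientation, the ratio Gr/Re² quoted above gauges the strength of buoyancy relative to forced convection; the arithmetic below simply recovers Grashof numbers from illustrative points inside the reported ranges, not from the dataset itself.

```python
# Illustrative pairings of Reynolds number and Gr/Re^2 within the ranges above.
for re_x, ratio in ((40_000, 0.02), (300_000, 0.232)):
    gr = ratio * re_x ** 2    # Gr = (Gr/Re^2) * Re^2
    print(f"Re_x = {re_x:,}  Gr/Re^2 = {ratio:.3f}  ->  Gr = {gr:,.0f}")
```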

