On the Uncertainty of the Image Velocimetry Method Parameters

Hydrology ◽  
2020 ◽  
Vol 7 (3) ◽  
pp. 65
Author(s):  
Evangelos Rozos ◽  
Panayiotis Dimitriadis ◽  
Katerina Mazi ◽  
Spyridon Lykoudis ◽  
Antonis Koussis

Image velocimetry is a popular remote sensing method, mainly because of the very modest cost of the necessary equipment. However, image velocimetry methods employ parameters whose values must be selected with considerable expertise in order to obtain accurate surface flow velocity estimates. This raises concerns about the subjectivity involved in defining the parameter values and about its impact on the estimated surface velocity. Alternatively, a statistical approach can be employed instead of directly selecting a value for each image velocimetry parameter: first, a probability distribution is defined for each model parameter, and then Monte Carlo simulations are run. In this paper, we demonstrate how this statistical approach can be used to simultaneously produce confidence intervals for the estimated surface velocity, reduce the uncertainty of some parameters (more specifically, the size of the interrogation area), and reduce the subjectivity. Since image velocimetry algorithms are CPU-intensive, an alternative random number generator that allows the confidence intervals to be obtained with a limited number of iterations is suggested. The case study indicated that, if the statistical approach is applied diligently, this threefold objective can be achieved.
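The workflow described above (sample the uncertain parameters from assigned distributions, rerun the velocimetry estimator, and read off empirical confidence intervals) can be sketched as follows. This is a minimal Python illustration: estimate_velocity is a hypothetical stand-in for a full image velocimetry run, the uniform distribution for the interrogation-area size is an assumption, and a plain pseudo-random generator is used where the paper suggests a more efficient alternative.

```python
import numpy as np

def estimate_velocity(ia_size: int, rng: np.random.Generator) -> float:
    """Hypothetical stand-in for a full image velocimetry run (e.g. LSPIV)
    returning a surface-velocity estimate for one interrogation-area size;
    a real routine would cross-correlate consecutive frame pairs."""
    return 0.50 + 0.02 * np.sin(ia_size / 8.0) + rng.normal(0.0, 0.01)

rng = np.random.default_rng(seed=1)

# 1. Assign a probability distribution to the uncertain parameter
#    (here: interrogation-area size drawn uniformly from 16-64 px).
ia_samples = rng.integers(16, 65, size=200)

# 2. Monte Carlo: run the velocimetry estimator once per sampled value.
#    (The paper suggests a more efficient generator to cut the iteration count.)
velocities = np.array([estimate_velocity(ia, rng) for ia in ia_samples])

# 3. Empirical 95 % confidence interval of the estimated surface velocity.
lo, hi = np.percentile(velocities, [2.5, 97.5])
print(f"surface velocity: {velocities.mean():.3f} m/s (95% CI {lo:.3f}-{hi:.3f} m/s)")
```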

2021 ◽  
Author(s):  
Silvano Fortunato Dal Sasso ◽  
Alonso Pizarro ◽  
Sophie Pearce ◽  
Ian Maddock ◽  
Matthew T. Perks ◽  
...  

Optical sensors coupled with image velocimetry techniques are becoming popular for river monitoring applications. In this context, new opportunities and challenges are emerging for the research community, aimed at: i) defining standardized practices and methodologies; and ii) overcoming recognized sources of uncertainty at the field scale. In this regard, the accuracy of image velocimetry techniques strongly depends on the occurrence and distribution of visible features on the water surface in consecutive frames. In a natural environment, the amount, spatial distribution and visibility of natural features on the river surface change continuously because of environmental factors and hydraulic conditions. The dimensionless seeding distribution index (SDI), recently introduced by Pizarro et al. (2020a, b) and Dal Sasso et al. (2020), is a metric based on the seeding density and the spatial distribution of tracers for identifying the best frame window (FW) within video footage. In this work, a methodology based on the SDI was applied to different case studies with the Large Scale Particle Image Velocimetry (LSPIV) technique. The videos are taken from the repository recently created by the COST Action Harmonious, which includes 13 case studies across Europe and beyond for image velocimetry applications (Perks et al., 2020). The optimal frame window selection is based on two criteria: i) maximization of the number of frames and ii) minimization of the SDI. This methodology allowed an error reduction of between 20 and 39% with respect to using the entire video. This novel idea appears suitable for performing image velocimetry in natural settings where environmental and hydraulic conditions are extremely challenging, and it is particularly useful for real-time observations from fixed river gauging stations, where a large number of frames is usually recorded and analyzed.

References

Dal Sasso S.F., Pizarro A., Manfreda S., Metrics for the Quantification of Seeding Characteristics to Enhance Image Velocimetry Performance in Rivers. Remote Sensing, 12, 1789, doi:10.3390/rs12111789, 2020.

Perks M.T., Dal Sasso S.F., Hauet A., Jamieson E., Le Coz J., Pearce S., … Manfreda S., Towards harmonisation of image velocimetry techniques for river surface velocity observations. Earth System Science Data, 12(3), 1545–1559, doi:10.5194/essd-12-1545-2020, 2020.

Pizarro A., Dal Sasso S.F., Manfreda S., Refining image-velocimetry performances for streamflow monitoring: Seeding metrics to errors minimisation. Hydrological Processes, 1–9, doi:10.1002/hyp.13919, 2020.

Pizarro A., Dal Sasso S.F., Perks M., Manfreda S., Identifying the optimal spatial distribution of tracers for optical sensing of stream surface flow. Hydrology and Earth System Sciences, 24, 5173–5185, doi:10.5194/hess-24-5173-2020, 2020.
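The frame-window selection criteria described in the abstract above (maximize the number of frames, minimize the SDI) can be illustrated with a short Python sketch. The SDI formulation below is a simplified stand-in (the actual index is defined in Pizarro et al., 2020), the per-frame seeding statistics are synthetic, and the combined score is an assumption; only the search over candidate frame windows reflects the general methodology.

```python
import numpy as np

def sdi_like_index(density: np.ndarray, dispersion: np.ndarray) -> float:
    """Simplified stand-in for the seeding distribution index (SDI): low
    values mean dense, evenly spread tracers.  The actual formulation is
    given in Pizarro et al. (2020) and is not reproduced here."""
    return float(np.mean(dispersion) / (np.mean(density) + 1e-9))

def best_frame_window(density, dispersion, min_frames=50):
    """Scan candidate frame windows of at least `min_frames` frames and
    return the one with the best trade-off between window length and SDI."""
    n = len(density)
    best, best_score = None, np.inf
    for start in range(0, n - min_frames):
        for end in range(start + min_frames, n + 1):
            sdi = sdi_like_index(density[start:end], dispersion[start:end])
            # Combined score used purely for illustration: the study applies
            # its two criteria (maximise frames, minimise SDI) explicitly.
            score = sdi / (end - start)
            if score < best_score:
                best, best_score = (start, end), score
    return best

# Synthetic per-frame seeding statistics for a 300-frame video.
rng = np.random.default_rng(0)
density = rng.uniform(0.5, 2.0, size=300)      # tracer density (illustrative)
dispersion = rng.uniform(0.1, 1.0, size=300)   # spatial-clustering measure
print("optimal frame window (start, end):", best_frame_window(density, dispersion))
```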


Geosciences ◽  
2018 ◽  
Vol 8 (10) ◽  
pp. 383 ◽  
Author(s):  
Donatella Termini ◽  
Alice Di Leonardo

Digital particle image velocimetry records high-resolution images and allows the positions of points to be identified at different time instants. This paper explores the efficiency of the digital image technique for remote monitoring of surface velocity and discharge measurement in hyper-concentrated flow by way of laboratory experiments. One of the challenges in the application of the image technique is the evaluation of the error in estimating surface velocity. Quantifying this error is complex because it depends on many factors characterizing the experimental conditions and/or the processing algorithm. In the present work, attention is devoted to the estimation error due either to the acquisition time or to the size of the sub-images (interrogation areas) to be correlated. The analysis is conducted with the aid of data collected in a scale laboratory flume constructed at the Hydraulics Laboratory of the Department of Civil, Environmental, Aerospace and Materials Engineering (DICAM), University of Palermo (Italy), and the image processing is carried out with the help of the PIVlab toolbox in MATLAB. The results confirm that the number of frames used in the processing procedure strongly affects the estimated surface velocity; the estimation error decreases as the number of frames increases. The size of the interrogation area also plays an important role in the flow velocity estimation. For the examined case, halving the interrogation area relative to its original size yielded low values of the velocity estimation error. The results also demonstrate the ability of the digital image technique to estimate the discharge at given cross-sections. The discharge values estimated by applying the digital image technique downstream of the inflow sections, using the aforementioned interrogation area size, compare well with the measured values.
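The core operation whose error is being assessed, cross-correlating a pair of interrogation areas to obtain a displacement, can be sketched as follows. This is a simplified Python analogue of what a PIV code such as PIVlab does internally (no window overlap, no sub-pixel peak fitting); the synthetic frames and the chosen interrogation-area sizes are illustrative only.

```python
import numpy as np
from scipy.signal import fftconvolve

def window_displacement(frame_a, frame_b, y, x, ia=32):
    """Estimate the pixel displacement of one interrogation area (IA)
    between two consecutive frames by cross-correlation of the two
    zero-mean sub-images (integer-pixel peak only, for simplicity)."""
    a = frame_a[y:y + ia, x:x + ia].astype(float)
    b = frame_b[y:y + ia, x:x + ia].astype(float)
    a -= a.mean()
    b -= b.mean()
    corr = fftconvolve(b, a[::-1, ::-1], mode="full")      # cross-correlation
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    return dy - (ia - 1), dx - (ia - 1)

# Two synthetic frames: random texture shifted by 3 px in the x direction.
rng = np.random.default_rng(2)
frame1 = rng.random((128, 128))
frame2 = np.roll(frame1, shift=3, axis=1)

# Halving the IA (32 -> 16 px) mimics the sensitivity test discussed above.
for ia in (32, 16):
    print(ia, window_displacement(frame1, frame2, 48, 48, ia=ia))
```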


Author(s):  
Oleg Gorlenko

A method for the technological assurance of roughness parameters of machine parts, based on an experimental-statistical approach, is considered. The essence of the method is to machine test blanks (or their test surfaces) according to a pre-planned scheme, to assess the roughness parameters of the test blanks, to develop a mathematical-statistical model relating the roughness parameters to the technological factors and, on the basis of this model, to determine the factor levels that ensure the specified roughness parameter values when machining the main batch of blanks. The peculiarities of the technological assurance of the relative bearing lengths of the roughness profile are touched upon, along with a method for forming complex functional parameters of a rough surface. The need to create portable measuring and control systems that allow this method to be implemented in practice is emphasized.
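As an illustration of the experimental-statistical step (fitting a model that relates a roughness parameter to the technological factors, then inverting it to pick factor levels), a minimal Python sketch is given below. The power-law model form, the test-blank data and the target roughness value are all assumptions made for illustration, not values taken from the paper.

```python
import numpy as np

# Hypothetical test-blank data: machining factors and measured roughness Ra.
feed  = np.array([0.05, 0.05, 0.15, 0.15, 0.25, 0.25])    # feed rate, mm/rev
speed = np.array([100., 300., 100., 300., 100., 300.])    # cutting speed, m/min
ra    = np.array([0.70, 0.60, 2.00, 1.80, 3.40, 3.10])    # roughness Ra, um

# Power-law model Ra = C * feed**a * speed**b, fitted in log space, in the
# spirit of the experimental-statistical approach described above.
X = np.column_stack([np.ones_like(feed), np.log(feed), np.log(speed)])
coef, *_ = np.linalg.lstsq(X, np.log(ra), rcond=None)
logC, a, b = coef

# Invert the fitted model: the feed required to reach a target Ra at a
# fixed cutting speed when machining the main batch of blanks.
target_ra, fixed_speed = 1.6, 180.0
feed_required = np.exp((np.log(target_ra) - logC - b * np.log(fixed_speed)) / a)
print(f"fitted exponents: a = {a:.2f}, b = {b:.2f}")
print(f"feed for Ra = {target_ra} um at {fixed_speed} m/min: {feed_required:.3f} mm/rev")
```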


2020 ◽  
Author(s):  
Sophie Pearce ◽  
Robert Ljubicic ◽  
Salvador Pena-Haro ◽  
Matthew Perks ◽  
Flavia Tauro ◽  
...  

Image velocimetry (IV) is a remote technique which calculates surface flow velocities of rivers (or fluids) via a range of cross-correlation and tracking algorithms. IV can be implemented with a range of camera sensors, which can be mounted on tripods or Unmanned Aerial Systems (UAS). IV has proven a powerful technique for monitoring river flows during flood conditions, when traditional in-situ techniques would be unsafe to deploy. However, little research has focussed on the application of such techniques during low flow conditions. The applicability of IV to low flow studies could aid data collection at a higher spatial and temporal resolution than is currently available. Many IV techniques that utilise different cross-correlation and tracking algorithms are under development, including Large Scale Particle Image Velocimetry (LSPIV), Large Scale Particle Tracking Velocimetry (LSPTV), Optical Tracking Velocimetry (OTV), Kanade-Lucas-Tomasi Image Velocimetry (KLT-IV) and Surface Structure Image Velocimetry (SSIV). Nevertheless, the true applications and limitations of such algorithms have yet to be extensively tested. Therefore, this study aimed to conduct a sensitivity analysis on the parameters common to the different algorithms, namely the particle identification area (Interrogation Area for LSPIV, LSPTV and SSIV, Block Size for KLT-IV and Trajectory Length for OTV) and the feature extraction rate (or sub-sampled frame rate).

Fieldwork was carried out on the Kolubara River near the city of Obrenovac in central Serbia. The cross-sectional surface width was relatively constant, varying between 23.30 and 23.45 m. During the experiment, low flow conditions were present, with a discharge of approximately 3.4 m³ s⁻¹ (estimated using a SonTek M9 ADCP) and depths of up to 1.9 m. A DJI Phantom 4 Pro UAS was used to collect video data of the surface flow. Artificial seeding material (wood mulch) was distributed homogeneously across the river's surface in order to improve the conditions for IV techniques during slow flows. Two 30-second videos were used for the surface velocity analysis.

This study highlighted that KLT-IV, SSIV, OTV and LSPIV are the algorithms least sensitive to changing parameters when no pre- or post-processing of the results is conducted. LSPTV, on the other hand, must undergo post-processing in order to avoid spurious results; only then may its results be considered reliable. Furthermore, KLT-IV and SSIV showed a slight sensitivity to the feature extraction rate, whereas changing the particle identification area did not significantly affect the output surface velocities. OTV and LSPTV, in contrast, showed higher variability in the results when the particle identification area was changed, whilst the feature extraction rate did not affect the surface velocity outputs. LSPIV proved to be sensitive to changes in both the feature extraction rate and the particle identification area.

This analysis leads to the conclusion that, under the sampled conditions, with surface velocities of approximately 0.12 m s⁻¹ and homogeneous seeding on the river's surface, IV techniques can provide results comparable to traditional techniques such as ADCPs during low flow conditions. All IV algorithms provided results that were, on average, within 0.05 m s⁻¹ of the ADCP measurements.
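A sensitivity analysis of this kind amounts to sweeping the shared parameters and comparing each run against the ADCP reference. The Python sketch below shows the structure of such a sweep; run_iv is a hypothetical wrapper (real runs would call the individual IV codes), and the parameter grids and synthetic response are illustrative only.

```python
import itertools
import numpy as np

ADCP_REFERENCE = 0.12   # m/s, mean surface velocity from the ADCP survey

def run_iv(ia_size: int, frame_step: int) -> float:
    """Hypothetical wrapper around one IV code (e.g. SSIV or LSPIV) that
    returns the mean surface velocity for one parameter combination; the
    response below is synthetic and only keeps the sketch runnable."""
    rng = np.random.default_rng(ia_size * 100 + frame_step)
    return ADCP_REFERENCE + rng.normal(0.0, 0.02)

# Sweep the two cross-algorithm parameters examined in the study
# (grids below are assumed, not the study's actual values).
ia_sizes = [16, 32, 64, 128]   # particle identification area, px
frame_steps = [1, 2, 3, 4]     # feature extraction rate (sub-sampling factor)

results = {(ia, step): run_iv(ia, step)
           for ia, step in itertools.product(ia_sizes, frame_steps)}

errors = np.array([abs(v - ADCP_REFERENCE) for v in results.values()])
print(f"mean |error| vs ADCP: {errors.mean():.3f} m/s, max: {errors.max():.3f} m/s")
```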


2020 ◽  
Vol 0 (0) ◽  
Author(s):  
Weixin Cai ◽  
Mark van der Laan

The Highly Adaptive least absolute shrinkage and selection operator (LASSO) Targeted Minimum Loss Estimator (HAL-TMLE) is an efficient plug-in estimator of a pathwise differentiable parameter in a statistical model that at minimum (and possibly only) assumes that the sectional variation norms of the true nuisance functions (i.e., the relevant parts of the data distribution) are finite. It relies on an initial estimator (HAL-MLE) of the nuisance functions obtained by minimizing the empirical risk over the parameter space under the constraint that the sectional variation norm of the candidate functions is bounded by a constant, where this constant can be selected by cross-validation. In this article we establish that the nonparametric bootstrap for the HAL-TMLE, fixing the sectional variation norm at a value larger than or equal to the cross-validation selector, provides a consistent method for estimating the normal limit distribution of the HAL-TMLE. In order to optimize the finite sample coverage of the nonparametric bootstrap confidence intervals, we propose a selection method for this sectional variation norm that is based on running the nonparametric bootstrap for all values of the sectional variation norm larger than the one selected by cross-validation, and subsequently determining the value at which the width of the resulting confidence intervals reaches a plateau. We demonstrate our method for 1) nonparametric estimation of the average treatment effect when observing a covariate vector, binary treatment, and outcome, and 2) nonparametric estimation of the integral of the square of the multivariate density of the data distribution. We also present simulation results for these two examples demonstrating the excellent finite sample coverage of bootstrap-based confidence intervals.
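The plateau-based selection of the sectional variation norm can be sketched schematically: for each candidate norm at or above the cross-validation choice, run the nonparametric bootstrap, record the confidence-interval width, and stop once the width stabilises. In the Python sketch below, hal_tmle_estimate is a hypothetical placeholder (a real implementation would refit the HAL-MLE and apply the targeting step on each bootstrap sample), and the 2% plateau tolerance is an arbitrary illustrative choice.

```python
import numpy as np

def hal_tmle_estimate(sample: np.ndarray, variation_norm: float) -> float:
    """Hypothetical stand-in for a HAL-TMLE fit with the sectional variation
    norm of the HAL-MLE fixed at `variation_norm` (illustrative only)."""
    return float(np.mean(sample)) + 0.1 / variation_norm

def bootstrap_ci_width(data, variation_norm, n_boot=200, alpha=0.05):
    """Width of the nonparametric-bootstrap confidence interval obtained
    with the sectional variation norm held fixed."""
    rng = np.random.default_rng(0)
    boots = [hal_tmle_estimate(rng.choice(data, size=len(data), replace=True),
                               variation_norm)
             for _ in range(n_boot)]
    lo, hi = np.quantile(boots, [alpha / 2, 1 - alpha / 2])
    return hi - lo

data = np.random.default_rng(3).normal(loc=1.0, scale=1.0, size=500)
cv_norm = 2.0                                    # value picked by cross-validation
candidates = cv_norm * np.array([1.0, 1.5, 2.0, 3.0, 4.0, 6.0])

widths = [bootstrap_ci_width(data, m) for m in candidates]
# Plateau rule (arbitrary 2 % tolerance): stop once the CI width stabilises.
for norm, w_prev, w_curr in zip(candidates[1:], widths, widths[1:]):
    if abs(w_curr - w_prev) / w_prev < 0.02:
        print(f"plateau reached at sectional variation norm {norm}")
        break
```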


Author(s):  
Christopher Pagano ◽  
Flavia Tauro ◽  
Salvatore Grimaldi ◽  
Maurizio Porfiri

Large scale particle image velocimetry (LSPIV) is a nonintrusive environmental monitoring methodology that allows for continuous characterization of surface flows in natural catchments. Despite its promise, the implementation of LSPIV in natural environments is limited to areas accessible to human operators. In this work, we propose a novel experimental configuration that allows for unsupervised LSPIV over large water bodies. Specifically, we design, develop, and characterize a lightweight, low-cost, and stable quadricopter hosting a digital acquisition system. An active gimbal keeps the camera lens orthogonal to the water surface, thus preventing severe image distortions. Field experiments are performed to characterize the vehicle and assess the feasibility of the approach. We demonstrate that the quadricopter can hover above an area of 1 × 1 m² for 4–5 minutes with a payload of 500 g. Further, LSPIV measurements on a natural stream confirm that the methodology can be reliably used for surface flow studies.


2002 ◽  
Vol 757 ◽  
Author(s):  
W. L. Ebert ◽  
J. C. Cunnane ◽  
N. L. Dietz

This paper describes how the results of vapor hydration tests (VHTs) are used to model the corrosion of waste glasses exposed to humid air in the glass degradation model for total system performance assessment (TSPA) calculations for the proposed Yucca Mountain disposal system. Corrosion rates measured in VHTs conducted at 125, 150, 175, and 200°C are compared with the rate equation for aqueous dissolution to determine parameter values that are applicable to glass degradation in humid air. These will be used to determine the minimum for the range and distribution of parameter values in calculations for the Yucca Mountain disposal system license application (TSPA-LA). The rate equation for glass dissolution is rate = kE · 10^(η·pH) · exp(−Ea/RT). Uncertainties in the calculated rate due to the range of waste glass compositions and water exposure conditions are taken into account by using a range of values for the rate coefficient kE. The parameter values for the pH dependence (η) and temperature dependence (Ea) and the upper limit for kE are being determined with other tests. Using the values of η and Ea from the site recommendation model, the VHT results described in this paper provide a value of log kE = 5.1 as the minimum value for the rate expression. This value will change slightly if different pH- and temperature-dependencies are used for the TSPA-LA model.
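For concreteness, the rate expression above can be evaluated directly once the parameters are fixed. In the short Python sketch below, log kE = 5.1 is the minimum value reported in the abstract, while η, Ea, the pH and the rate units are illustrative placeholders, not the site-recommendation values.

```python
import numpy as np

R = 8.314e-3   # gas constant, kJ mol^-1 K^-1

def glass_rate(log_kE: float, eta: float, Ea: float, pH: float, T: float) -> float:
    """Glass degradation rate, rate = kE * 10**(eta*pH) * exp(-Ea/(R*T))."""
    return 10.0 ** log_kE * 10.0 ** (eta * pH) * np.exp(-Ea / (R * T))

# log kE = 5.1 is the minimum value reported above; eta, Ea, pH and the
# rate units are illustrative placeholders, not the site-recommendation values.
for T_celsius in (125, 150, 175, 200):
    rate = glass_rate(log_kE=5.1, eta=0.5, Ea=80.0, pH=9.0, T=T_celsius + 273.15)
    print(f"T = {T_celsius:3d} C  ->  rate ~ {rate:.3e} (assumed g m^-2 d^-1)")
```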


2000 ◽  
Vol 23 (4) ◽  
pp. 869-875 ◽  
Author(s):  
José Marcelo Soriano Viana

The parametric restrictions of Griffing's diallel analysis model, method 2 (parents and F1 generations) and model 1 (fixed), were studied in order to address the following questions: i) does the statistical model need to be restricted? ii) do the restrictions satisfy the genetic parameter values? and iii) do they make the analysis and interpretation easier? Objectively, these questions can be answered as: i) yes, ii) not all of them, and iii) the analysis is easier, but the interpretation is the same as in the model with restrictions that satisfy the parameter values. The main conclusions were that the statistical models for combining ability analysis are necessarily restricted; that in the Griffing model (method 2, model 1) the restrictions imposed on the specific combining ability (SCA) effects for all j do not satisfy the parametric values; and that the same inferences should be established from the analyses using the model with restrictions that satisfy the parametric values of the SCA effects and the model suggested by Griffing. A consequence of the restrictions of the Griffing model is that they allow formulas to be defined for estimating the effects, their variances and the variances of contrasts of effects, as well as for calculating orthogonal sums of squares.


2016 ◽  
Author(s):  
Valerio De Biagi ◽  
Maria Lia Napoli ◽  
Monica Barbero ◽  
Daniele Peila

With reference to rockfall risk estimation and the planning of rockfall protection devices, one of the most critical and most debated problems is the correct definition of the design block, taking into account its return period. In this paper, a methodology for assessing the design block together with its return period is proposed and discussed, following a statistical approach. The procedure is based on a survey of the blocks already detached from the slope and accumulated at its foot, combined with the available historical data.
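One simple way to link a surveyed block population to a return period, in the spirit of the statistical approach outlined above, is sketched below in Python. The surveyed volumes, the observation window, and the exponential tail model are all assumptions made for illustration; the paper's actual statistical model may differ.

```python
import numpy as np

# Hypothetical survey: volumes (m^3) of blocks found at the slope foot,
# assumed to have accumulated over an observation window of 50 years.
volumes = np.array([0.1, 0.2, 0.2, 0.3, 0.5, 0.6, 0.8, 1.2, 1.5, 2.4, 3.8, 6.0])
observation_years = 50.0
rate = len(volumes) / observation_years        # detachments per year

# Fit an exponential tail to the block-volume distribution (simple choice
# for the sketch): P(V > v) = exp(-v / mean_volume).
mean_volume = volumes.mean()

def design_block(return_period_years: float) -> float:
    """Volume whose exceedance has the requested return period, using
    T = 1 / (rate * P(V > v))  ->  v = mean_volume * ln(rate * T)."""
    return mean_volume * np.log(rate * return_period_years)

for T in (10, 50, 100):
    print(f"design block for {T:4d}-yr return period: {design_block(T):.2f} m^3")
```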


Author(s):  
M. Gambini ◽  
G. L. Guizzi ◽  
M. Vellini

In this paper, the thermodynamic potential and limits of H2/O2 cycles are investigated. Starting from conventional gas turbine and steam turbine technology, the paper qualitatively tackles the problems related to a change of oxidizer and fuel: from these considerations, an internal combustion steam cycle (ICSC) is analyzed in which steam, injected into the combustion chamber together with oxygen and hydrogen, is produced in a regenerative way and plays the important role of the inert diluent. A parametric analysis is then performed in order to evaluate the influence of the main working parameters on the overall performance of H2/O2 cycles. All the results are obtained neglecting the energy requirements of the O2 and H2 production systems and taking into account only their compression work. This choice permits great freedom in the definition of these thermodynamic cycles and allows general considerations, because no specification of the H2 and/or O2 production systems or of their integration with the thermodynamic cycles is needed. The paper can therefore be framed in a context of centralized oxygen and hydrogen production (for example, by nuclear or renewable energy sources) and of their distribution as pure gases to the point of use. Adopting realistic assumptions, with a turbine inlet temperature (TIT) of about 1350°C, the potential of H2/O2 cycles is very limited: the net efficiency reaches a value of about 50%. Adopting futuristic assumptions instead, with TIT = 1700°C, a different H2/O2 cycle scheme can be proposed and more interesting performance is attained (a net efficiency over 60%). The thermodynamic and technological aspects are fully addressed in the paper, underlining the great importance of the choice of the main working parameters.

