HESS Opinions "More efforts and scientific rigour are needed to attribute trends in flood time series"

2012 ◽  
Vol 16 (5) ◽  
pp. 1379-1387 ◽  
Author(s):  
B. Merz ◽  
S. Vorogushyn ◽  
S. Uhlemann ◽  
J. Delgado ◽  
Y. Hundecha

Abstract. The question of whether the magnitude and frequency of floods have changed due to climate change or other drivers of change is of high interest. The number of flood trend studies is rapidly rising. When changes are detected, many studies link the identified change to the underlying causes, i.e. they attribute the changes in flood behaviour to certain drivers of change. We propose a hypothesis testing framework for trend attribution which consists of the essential ingredients for a sound attribution: evidence of consistency, evidence of inconsistency, and provision of a confidence statement. Further, we evaluate the current state of the art of flood trend attribution. We assess how selected recent studies approach the attribution problem, and to what extent their attribution statements seem defendable. In our opinion, the current state of flood trend attribution is poor. Attribution statements are mostly based on qualitative reasoning or even speculation. Typically, the focus of flood trend studies is the detection of change, i.e. the statistical analysis of time series, and attribution is regarded as an appendix: (1) flood time series are analysed by means of trend tests, (2) if a significant change is detected, a hypothesis on the cause of change is given, and (3) explanations or published studies are sought which support the hypothesis. We believe that we need a change in perspective and more scientific rigour: detection should be seen as an integral part of the more challenging attribution problem, and detection and attribution should be placed in a sound hypothesis testing framework.
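The "trend tests" in step (1) are typically rank-based tests; the abstract does not prescribe a specific one, but the Mann-Kendall test is a common choice in the flood-trend literature. The sketch below shows that detection step on hypothetical annual peak discharges; a significant trend on its own says nothing about attribution, which is exactly the paper's point.

```python
# Minimal Mann-Kendall trend test -- a common "detection" step for flood
# time series (illustrative sketch, not the specific procedure of the paper).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Return the MK statistic S, its normal-approximation Z score and p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    # Variance of S for a series without ties.
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * norm.sf(abs(z))  # two-sided p-value
    return s, z, p

# Example: 60 years of hypothetical annual peak discharges with a weak upward drift.
rng = np.random.default_rng(42)
peaks = 100 + 0.5 * np.arange(60) + rng.normal(0, 10, 60)
s, z, p = mann_kendall(peaks)
print(f"S={s:.0f}, Z={z:.2f}, p={p:.3f}")  # detection only -- attribution is a separate problem
```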


2021 ◽  
Vol 13 (10) ◽  
pp. 2006 ◽  
Author(s):  
Jun Hu ◽  
Qiaoqiao Ge ◽  
Jihong Liu ◽  
Wenyan Yang ◽  
Zhigui Du ◽  
...  

The Interferometric Synthetic Aperture Radar (InSAR) technique has been widely used to obtain the ground surface deformation of geohazards (e.g., mining subsidence and landslides). As one of the inherent errors in the interferometric phase, the digital elevation model (DEM) error is usually estimated with the help of an a priori deformation model. However, it is difficult to determine an a priori deformation model that can fit the deformation time series well, leading to possible bias in the estimation of the DEM error and the deformation time series. In this paper, we propose a method that can construct an adaptive deformation model, based on a set of predefined functions and hypothesis testing theory, in the framework of the small baseline subset InSAR (SBAS-InSAR) method. Since it is difficult to fit the deformation time series over a long time span by using only one function, the phase time series is first divided into several groups with overlapping regions. In each group, hypothesis testing is employed to adaptively select the optimal deformation model from the predefined functions. The parameters of the adaptive deformation models and the DEM error can be modeled with the phase time series and solved by a least squares method. Simulations and real-data experiments in the Pingchuan mining area, Gansu Province, China, demonstrate that, compared to state-of-the-art deformation modeling strategies (e.g., the linear deformation model and the function group deformation model), the proposed method can significantly improve the accuracy of DEM error estimation and can benefit the estimation of the deformation time series.
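The abstract does not spell out which predefined functions or which test statistic the method uses, so the following is only an illustrative sketch: for one group of the phase time series, candidate deformation models (linear, linear plus an annual seasonal term, linear plus an acceleration term) are fitted by least squares, and a standard F-test on nested models decides whether a richer model than the linear one is warranted. The candidate functions, threshold, and simulated data are assumptions for illustration.

```python
# Sketch of hypothesis-testing-based model selection for one group of a phase
# time series (assumed candidate functions and F-test; the paper's exact
# function set and test statistic are not given in the abstract).
import numpy as np
from scipy.stats import f as f_dist

def fit(X, y):
    """Least-squares fit; return coefficients and residual sum of squares."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return beta, rss

def prefer_extended(X_small, X_big, y, alpha=0.05):
    """F-test: does the extended (nested) model significantly reduce the RSS?"""
    n = len(y)
    _, rss0 = fit(X_small, y)
    _, rss1 = fit(X_big, y)
    df1 = X_big.shape[1] - X_small.shape[1]
    df2 = n - X_big.shape[1]
    F = ((rss0 - rss1) / df1) / (rss1 / df2)
    return f_dist.sf(F, df1, df2) < alpha

# Hypothetical unwrapped phase time series for one overlapping group.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 40)                                   # acquisition times (years)
phase = 3.0 * t + 0.8 * np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size)

ones = np.ones_like(t)
X_lin = np.column_stack([ones, t])                              # linear model
X_seas = np.column_stack([ones, t, np.sin(2 * np.pi * t),
                          np.cos(2 * np.pi * t)])               # + annual term
X_quad = np.column_stack([ones, t, t ** 2])                     # + acceleration

model = X_lin
for candidate in (X_seas, X_quad):
    if prefer_extended(X_lin, candidate, phase):
        model = candidate
        break                      # accept the first extension the data support
beta, _ = fit(model, phase)
print("selected model with", model.shape[1], "parameters:", np.round(beta, 2))
```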


Author(s):  
Lars Kegel ◽  
Claudio Hartmann ◽  
Maik Thiele ◽  
Wolfgang Lehner

Abstract. Processing and analyzing time series datasets have become a central issue in many domains, requiring data management systems to support time series as a native data type. A core access primitive for time series is matching, which requires efficient algorithms on top of appropriate representations, with the symbolic aggregate approximation (SAX) representing the current state of the art. This technique reduces a time series to a low-dimensional space by segmenting it and discretizing each segment into a small symbolic alphabet. Unfortunately, SAX ignores the deterministic behavior of time series, such as cyclical repeating patterns or a trend component affecting all segments, which may lead to sub-optimal representation accuracy. We therefore introduce a novel season-aware and a trend-aware symbolic approximation and demonstrate improved representation accuracy without increasing the memory footprint. Most importantly, our techniques also enable more efficient time series matching by providing a match up to three orders of magnitude faster than SAX.
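For readers unfamiliar with the baseline, the sketch below shows plain SAX as described above: z-normalisation, piecewise aggregate approximation (PAA), and discretisation against equiprobable Gaussian breakpoints. It is not the season- or trend-aware variant proposed in the paper, and the segment and alphabet sizes are arbitrary.

```python
# Plain SAX (the baseline discussed above): z-normalise, reduce with PAA,
# then discretise each segment mean against Gaussian breakpoints.
import numpy as np
from scipy.stats import norm

def sax(ts, n_segments=8, alphabet_size=4):
    ts = np.asarray(ts, dtype=float)
    ts = (ts - ts.mean()) / ts.std()                 # z-normalisation
    # Piecewise Aggregate Approximation: mean of each (roughly) equal-length segment.
    paa = np.array([seg.mean() for seg in np.array_split(ts, n_segments)])
    # Breakpoints that cut N(0, 1) into equiprobable regions.
    breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
    symbols = np.searchsorted(breakpoints, paa)      # indices 0 .. alphabet_size-1
    return "".join(chr(ord("a") + int(s)) for s in symbols)

# Example: a noisy sine; the seasonal structure is flattened by the segment
# means, which is exactly the weakness a season-aware representation targets.
t = np.linspace(0, 4 * np.pi, 256)
print(sax(np.sin(t) + 0.1 * np.random.default_rng(1).normal(size=t.size)))
```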


Author(s):  
Nikolay Atanasov ◽  
Bharath Sankaran ◽  
Jerome Le Ny ◽  
Thomas Koletschka ◽  
George J. Pappas ◽  
...  

2017 ◽  
Vol 232 (2) ◽  
pp. R131-R139 ◽  
Author(s):  
Smithamol Sithara ◽  
Tamsyn M Crowley ◽  
Ken Walder ◽  
Kathryn Aston-Mourney

Type 2 diabetes (T2D) is increasing in prevalence at an alarming rate around the world. Much effort has gone into the discovery and design of antidiabetic drugs; however, those already available are unable to combat the underlying causes of the disease and instead only moderate the symptoms. The reason for this is that T2D is a complex disease, and attempts to target one biological pathway are insufficient to combat its full extent. Additionally, the underlying pathophysiology of the disease is yet to be fully elucidated, making it difficult to design drugs that target the mechanisms involved. Therefore, the approach of designing new drugs aimed at a specific molecular target is not optimal, and a more expansive, unbiased approach is required. In this review, we look at the current state of diabetes treatments and how these target the disease symptoms but are unable to combat the underlying causes. We also review how the technique of gene expression signatures (GESs) has been used successfully for other complex diseases and how it may be applied as a powerful tool for the discovery of new drugs for T2D.


2018 ◽  
Vol 10 (9) ◽  
pp. 1383 ◽  
Author(s):  
Jili Yuan ◽  
Xiaolei Lv ◽  
Rui Li

To improve the suppression of speckle noise in synthetic aperture radar (SAR) images and the preservation of spatiotemporal information in the filtered image without losing spatial resolution, a novel multitemporal filtering method based on hypothesis testing is proposed in this paper. A two-step similarity measure strategy is adopted to further enhance the filtering results. First, a bi-date analysis using a two-sample Kolmogorov-Smirnov (KS) test is conducted in step 1 to extract homogeneous patches for the generation of 3-D patch stacks. Subsequently, the similarity between patch stacks is compared by a sliding time-series likelihood ratio (STSLR) test algorithm in step 2, which utilizes the multi-dimensional data structure of the stacks to improve the accuracy of unchanged-pixel detection. Finally, the filtered values are obtained by averaging the similar pixels in the time series. The experimental results and analysis of two multitemporal datasets acquired by TerraSAR-X show that the proposed method outperforms other typical methods with regard to the overall filtering effect, especially in terms of the consistency between the filtered images and the original ones. Furthermore, the performance of the proposed method is also discussed by analyzing the results from step 1 and step 2.
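A hedged sketch of the step-1 idea follows: a two-sample KS test (here via scipy.stats.ks_2samp) decides whether two patches from different acquisition dates come from the same distribution, and pixels from the dates judged homogeneous are averaged in time. The patch size, significance level, and simulated speckle statistics are illustrative assumptions, and the step-2 STSLR test is not reproduced.

```python
# Step-1 sketch: KS-test-based selection of homogeneous dates, followed by
# temporal averaging of the similar pixels (illustrative, not the full method).
import numpy as np
from scipy.stats import ks_2samp

def patches_homogeneous(patch_a, patch_b, alpha=0.05):
    """Accept homogeneity when the KS test cannot reject equal distributions."""
    stat, p_value = ks_2samp(patch_a.ravel(), patch_b.ravel())
    return p_value > alpha

rng = np.random.default_rng(7)
# 7x7 intensity patches from three acquisition dates; single-look speckle for
# an unchanged scene is simulated as exponential, the third date has changed.
date1 = rng.exponential(scale=1.0, size=(7, 7))
date2 = rng.exponential(scale=1.0, size=(7, 7))
date3 = rng.exponential(scale=3.0, size=(7, 7))   # changed area

stack = [date1, date2, date3]
reference = stack[0]
similar = [p for p in stack if patches_homogeneous(reference, p)]
filtered_center = np.mean([p[3, 3] for p in similar])   # temporal average of the centre pixel
print(len(similar), "similar dates; filtered value:", round(float(filtered_center), 3))
```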


2020 ◽  
Vol 33 (5) ◽  
pp. 2134-2179 ◽  
Author(s):  
Tarun Chordia ◽  
Amit Goyal ◽  
Alessio Saretto

Abstract. We use information from over 2 million trading strategies randomly generated using real data and from strategies that survive the publication process to infer the statistical properties of the set of strategies that could have been studied by researchers. Using this set, we compute $t$-statistic thresholds that control for multiple hypothesis testing, when searching for anomalies, at 3.8 and 3.4 for time-series and cross-sectional regressions, respectively. We estimate the expected proportion of false rejections that researchers would produce if they failed to account for multiple hypothesis testing to be about 45%.
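The size of these thresholds is easier to appreciate with a small simulation. The sketch below generates pure-noise strategies (zero true alpha), computes their t-statistics, and compares the usual single-test cutoff of 1.96 with a Bonferroni-style family-wise threshold. The simulation design is an illustration only, not the authors' procedure, which is based on millions of strategies mined from real data.

```python
# Illustration of the multiple-testing problem behind elevated t-statistic
# thresholds: every "significant" strategy here is a false rejection.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_strategies, n_months = 10_000, 360
# Monthly returns of noise strategies with zero true mean (no real alpha).
returns = rng.normal(0.0, 0.05, size=(n_strategies, n_months))

# t-statistic of the mean return for each strategy.
t_stats = returns.mean(axis=1) / (returns.std(axis=1, ddof=1) / np.sqrt(n_months))

single_test = norm.ppf(0.975)                       # ~1.96, valid for one test only
bonferroni = norm.ppf(1 - 0.025 / n_strategies)     # family-wise 5% threshold

false_hits = np.mean(np.abs(t_stats) > single_test)
print(f"share of noise strategies with |t| > 1.96: {false_hits:.1%}")
print(f"max |t| over {n_strategies} noise strategies: {np.abs(t_stats).max():.2f}")
print(f"Bonferroni-style two-sided threshold: {bonferroni:.2f}")
```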

