Sea level simulation with signal decomposition and machine learning

2021 ◽  
Vol 241 ◽  
pp. 110109
Author(s):  
Chao Song ◽  
Xiaohong Chen ◽  
Xinjun Ding ◽  
Lele Zhang


2020 ◽  
Vol 10 (23) ◽  
pp. 8481
Author(s):  
Cesar Federico Caiafa ◽  
Jordi Solé-Casals ◽  
Pere Marti-Puig ◽  
Sun Zhe ◽  
Toshihisa Tanaka

In many machine learning applications, measurements are sometimes incomplete or noisy, resulting in missing features. In other cases, and for different reasons, the datasets are small to begin with, so more data samples are required to derive useful supervised or unsupervised classification methods. Correct handling of incomplete, noisy or small datasets is a fundamental and classic challenge in machine learning. In this article, we provide a unified review of recently proposed methods based on signal decomposition for missing-feature imputation (data completion), classification of noisy samples and artificial generation of new data samples (data augmentation). We illustrate the application of these signal decomposition methods in a selection of practical machine learning examples, including brain–computer interfaces, classification of epileptic intracranial electroencephalogram signals, face recognition/verification and water-network data analysis. We show that a signal decomposition approach can provide valuable tools for improving machine learning performance on low-quality datasets.
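
As a rough illustration of the decomposition-based imputation idea reviewed above, the sketch below fills missing entries of a data matrix by iterating a truncated SVD, one of the simplest signal-decomposition approaches to data completion. The rank, iteration count and synthetic data are illustrative assumptions, not one of the specific methods surveyed in the article.

```python
# Minimal sketch of decomposition-based imputation: iteratively fill missing
# entries with a rank-k SVD reconstruction. Illustrative only; rank, iteration
# count and data are placeholder assumptions, not the reviewed algorithms.
import numpy as np

def lowrank_impute(X, rank=2, n_iter=50):
    """Impute NaNs in X by alternating SVD truncation and re-filling."""
    X = np.asarray(X, dtype=float)
    mask = np.isnan(X)                                 # missing-feature locations
    filled = np.where(mask, np.nanmean(X, axis=0), X)  # start from column means
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        s[rank:] = 0.0                                 # keep leading components only
        approx = (U * s) @ Vt
        filled[mask] = approx[mask]                    # update only missing entries
    return filled

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=(20, 5))
    data[rng.random(data.shape) < 0.1] = np.nan        # simulate 10% missing values
    print(lowrank_impute(data).shape)
```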


2021 ◽  
Author(s):  
Diarmuid Corr ◽  
Amber Leeson ◽  
Malcolm McMillan ◽  
Ce Zhang

Mass loss from the Greenland and Antarctic ice sheets is predicted to be the dominant contribution to global sea level rise in coming years. Supraglacial lakes and channels are thought to play a significant role in ice sheet mass balance by causing grounded ice to speed up and by weakening floating ice shelves to the point of collapse. Identifying the location, distribution and life cycle of these hydrological features on both the Greenland and Antarctic ice sheets is therefore important for understanding their present and future contribution to global sea level rise. Supraglacial hydrological features can easily be identified by eye in optical satellite imagery. However, given that there are many thousands of these features, and that they appear in many hundreds of satellite images, automated approaches to mapping them are urgently needed.

Current automated approaches to mapping supraglacial hydrology tend to have high false positive and false negative rates and are often followed by manual correction and quality control. Given the scale of the data, however, methods that require manual post-processing are not feasible for repeat monitoring of surface hydrology at continental scale. Here, we present initial results from our work conducted as part of the 4D Greenland and 4D Antarctica projects, which increases the accuracy of supraglacial lake and channel delineation from Sentinel-2 and Landsat-7/8 imagery while reducing the need for manual intervention. We use machine learning approaches, including a Random Forest algorithm trained to recognise water, ice, cloud, rock, shadow, blue-ice and crevassed regions. Both labelled optical imagery and auxiliary data (e.g. digital elevation models) are used in our approach. Our methods are trained and validated using data covering a range of glaciological and climatological conditions, including images of both ice sheets and images acquired at different points during the melt season. The workflow, developed on the Google Cloud Platform, which hosts the entire archive of Sentinel-2 and Landsat-8 data, allows for large-scale application over the Greenland and Antarctic ice sheets and is intended for repeated use throughout future melt seasons.
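
For readers unfamiliar with this kind of classifier, the sketch below shows a minimal per-pixel Random Forest of the sort described above, using scikit-learn. The band/feature layout, class list and synthetic labelled pixels are placeholder assumptions and do not reproduce the 4D Greenland / 4D Antarctica workflow.

```python
# Minimal sketch of a per-pixel Random Forest classifier of the kind described
# above. Feature layout, classes and synthetic data are placeholder assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

CLASSES = ["water", "ice", "cloud", "rock", "shadow", "blue-ice", "crevassed"]

# Stand-in for labelled pixels: optical band reflectances plus an elevation
# value from an auxiliary DEM, one row per pixel.
rng = np.random.default_rng(42)
X = rng.random((5000, 7))            # e.g. blue, green, red, NIR, SWIR1, SWIR2, DEM
y = rng.integers(0, len(CLASSES), size=5000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
clf.fit(X_train, y_train)

print(classification_report(y_test, clf.predict(X_test), target_names=CLASSES))
```

In a real workflow the random arrays would be replaced by reflectance and DEM values sampled at labelled pixel locations.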


2019 ◽  
Vol 36 (9) ◽  
pp. 1889-1902
Author(s):  
Magnus Hieronymus ◽  
Jenny Hieronymus ◽  
Fredrik Hieronymus

Long sea level records with high temporal resolution are of paramount importance for future coastal protection and adaptation plans. Here we discuss the application of machine learning techniques to some regression problems commonly encountered when analyzing such time series. The performance of artificial neural networks is compared with that of multiple linear regression models on sea level data from the Swedish coast. The neural networks are found to be superior when local sea level forcing is used together with remote sea level forcing and meteorological forcing, whereas the linear models and the neural networks show similar performance when local sea level forcing is excluded. The overall performance of the machine learning algorithms is good, often surpassing that of the much more computationally costly numerical ocean models used at our institute.
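
A minimal sketch of the kind of comparison described above is given below: a multiple linear regression model and a small neural network fitted to the same synthetic regression task with scikit-learn. The predictor names and data are placeholders, not the Swedish sea level records used in the study.

```python
# Minimal sketch comparing multiple linear regression with a small neural
# network on a sea-level regression task. Predictors and data are synthetic
# placeholders, not the paper's Swedish-coast dataset.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
n = 2000
# Stand-ins for local sea level, remote sea level and meteorological forcing.
local_sl = rng.normal(size=n)
remote_sl = rng.normal(size=n)
wind, pressure = rng.normal(size=n), rng.normal(size=n)
target = (0.6 * local_sl + 0.3 * remote_sl + 0.2 * wind - 0.1 * pressure
          + 0.05 * local_sl * wind + rng.normal(scale=0.1, size=n))

X = np.column_stack([local_sl, remote_sl, wind, pressure])
X_tr, X_te, y_tr, y_te = train_test_split(X, target, test_size=0.3, random_state=0)

for name, model in [("linear", LinearRegression()),
                    ("neural net", MLPRegressor(hidden_layer_sizes=(32, 32),
                                                max_iter=2000, random_state=0))]:
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: RMSE = {rmse:.3f}")
```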


2020 ◽  
Author(s):  
Jeremy Rohmer ◽  
Daniel Lincke ◽  
Jochen Hinkel ◽  
Goneri Le Cozannet ◽  
Erwin Lambert

Global-scale assessments of coastal flood damage and adaptation costs under 21st-century sea-level rise are associated with a wide range of uncertainties, including those in future projections of socioeconomic development (SSP scenarios), of greenhouse gas emissions (RCP scenarios) and of sea-level rise (SLR), as well as structural uncertainties related to the modeling of extreme sea levels, vulnerability functions and the translation of flooding-induced damage into costs. This raises the following questions: what is the relative importance of each source of uncertainty in the final global-scale results? Which sources of uncertainty need to be considered, and which are of negligible influence? Better insight into the role played by these uncertainties makes them easier to communicate and helps structure the message on future coastal impacts and induced losses. Using the integrated DIVA model (see e.g. Hinkel et al., 2014, PNAS), we extensively explore the impact of these uncertainties in a global manner, i.e. by considering a large number (~3,000) of scenario combinations and analyzing the associated results using a regression-based machine learning technique (regression decision trees). On this basis, we show that the role of uncertainty in the modeling of extremes decreases over time, while the roles of SSP and RCP increase after 2030 for damage costs and after 2080 for adaptation costs, respectively. This means that mitigation of climate change helps to reduce the uncertainty in adaptation costs, and choosing a particular SSP reduces the uncertainty in the expected damages. In addition, the tree structure of the machine learning technique allows an in-depth analysis of the interactions between the different uncertain factors. These results are discussed depending on the SLR data selected for the analysis, i.e. before and after the IPCC SROCC report released in September 2019.
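
As an illustration of the regression-tree analysis described above, the sketch below fits a decision tree to a large set of coded scenario combinations and reads off feature importances as a rough measure of each uncertainty's influence. The factor names and the synthetic cost response are placeholder assumptions, not DIVA output.

```python
# Minimal sketch of a regression-tree uncertainty analysis: fit a decision tree
# to many scenario combinations and inspect feature importances. Factors and
# the response are synthetic placeholders, not DIVA model results.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(7)
n = 3000  # order of magnitude of the scenario combinations mentioned above
scenarios = pd.DataFrame({
    "rcp":       rng.integers(0, 3, n),   # coded RCP scenario
    "ssp":       rng.integers(0, 5, n),   # coded SSP scenario
    "slr_quant": rng.random(n),           # sea-level-rise quantile
    "extremes":  rng.integers(0, 2, n),   # extreme sea-level model choice
    "vulnerab":  rng.integers(0, 2, n),   # vulnerability function choice
})
# Placeholder "damage cost" response with interactions between factors.
cost = (scenarios["rcp"] * scenarios["slr_quant"]
        + 0.5 * scenarios["ssp"]
        + 0.2 * scenarios["extremes"]
        + rng.normal(scale=0.1, size=n))

tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(scenarios, cost)
for name, imp in sorted(zip(scenarios.columns, tree.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name:10s} importance: {imp:.2f}")
```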


2020 ◽  
Author(s):  
Davide Faranda ◽  
Mathieu Vrac ◽  
Pascal Yiou ◽  
Flavio Maria Emanuele Pons ◽  
Adnane Hamid ◽  
...  

Abstract. Recent advances in statistical and machine learning have opened up the possibility of forecasting the behavior of chaotic systems using recurrent neural networks. In this article we investigate the applicability of such a framework to geophysical flows, which are known to involve multiple scales in length, time and energy and to feature intermittency. We show that both multiscale dynamics and intermittency introduce severe limitations on the applicability of recurrent neural networks, both for short-term forecasts and for the reconstruction of the underlying attractor. We suggest that possible strategies to overcome these limitations should be based on separating the smooth large-scale dynamics from the intermittent, small-scale features. We test these ideas on global sea-level pressure data for the past 40 years, a proxy for atmospheric circulation dynamics. Better short- and long-term forecasts of sea-level pressure data can be obtained with an optimal choice of spatial coarse graining and time filtering.
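
The sketch below illustrates the pre-processing step suggested above: spatially coarse-graining a gridded sea-level-pressure field and low-pass filtering it in time before it is handed to a forecasting model. The grid size, block size and filter window are illustrative assumptions, not the settings used in the article.

```python
# Minimal sketch of spatial coarse graining and time filtering of a gridded
# sea-level-pressure field. Grid sizes, block size and window length are
# illustrative assumptions.
import numpy as np

def coarse_grain(field, block):
    """Average a (time, lat, lon) array over non-overlapping spatial blocks."""
    t, ny, nx = field.shape
    ny2, nx2 = ny // block, nx // block
    trimmed = field[:, :ny2 * block, :nx2 * block]
    return trimmed.reshape(t, ny2, block, nx2, block).mean(axis=(2, 4))

def time_filter(field, window):
    """Running mean along the time axis (a simple low-pass filter)."""
    kernel = np.ones(window) / window
    return np.apply_along_axis(lambda x: np.convolve(x, kernel, mode="valid"),
                               0, field)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    slp = rng.normal(size=(365, 72, 144))      # daily field on a 2.5-degree grid
    smooth = time_filter(coarse_grain(slp, block=4), window=7)
    print(smooth.shape)                        # coarse, weekly-smoothed field
```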


2020 ◽  
Vol 6 (2) ◽  
pp. 36-41
Author(s):  
Raini Hassan ◽  
Abid Ebna Saif Utsha ◽  
Mahfuzealahi Noman

Natural calamities are often unforeseen and cause massive destruction, and they are extremely difficult to predict. Existing machine learning techniques are not reliable enough to identify the countries affected by earthquakes and rising sea levels. The aim of this paper is to use predictive analysis to find the countries that will be affected by earthquakes and rising sea levels, and to see how machine learning techniques perform on sudden calamities such as earthquakes compared with slow calamities such as sea level rise. The results were obtained through data analysis and deep learning techniques, namely Long Short-Term Memory (LSTM) networks. It was found that the approach presented in this paper can identify the countries that are going to be affected and predict both earthquake and sea level anomalies accurately. For earthquakes, the model was able to assign earthquake events to a particular quarter of the year with a Root Mean Square Error (RMSE) of 0.504; for sea level rise, the RMSE was 0.064. It was concluded that deep learning techniques such as LSTM work better with slow changes like sea level anomalies than with sudden events like earthquakes. The techniques used in this paper can be extended in the future to help more endangered countries prepare better against these calamities.
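
For illustration, the sketch below trains a small LSTM on a synthetic sea-level-anomaly series and reports a test RMSE, in the spirit of the approach described above. The series, window length and network size are placeholder assumptions, not the paper's data or model.

```python
# Minimal sketch of an LSTM forecaster for a univariate sea-level-anomaly
# series. Data, window length and network size are placeholder assumptions.
import numpy as np
import tensorflow as tf

def make_windows(series, window):
    """Turn a 1-D series into (samples, window, 1) inputs and next-step targets."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    return X[..., np.newaxis], series[window:]

rng = np.random.default_rng(0)
t = np.arange(1200)
# Synthetic anomaly: slow trend + seasonal cycle + noise.
anomaly = (0.003 * t + 0.2 * np.sin(2 * np.pi * t / 365)
           + rng.normal(scale=0.05, size=t.size))

X, y = make_windows(anomaly, window=30)
split = int(0.8 * len(X))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(30, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[:split], y[:split], epochs=5, batch_size=32, verbose=0)

pred = model.predict(X[split:], verbose=0).ravel()
rmse = float(np.sqrt(np.mean((pred - y[split:]) ** 2)))
print(f"test RMSE: {rmse:.3f}")
```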

