Bare-earth DEM Generation in Urban Areas Based on a Machine Learning Method

Author(s):  
Yinxue Liu ◽  
Paul Bates ◽  
Jeffery Neal ◽  
Dai Yamazaki

Precise representation of global terrain is of great significance for estimating global flood risk. As the areas most vulnerable to flooding, urban areas need GDEMs of high quality. However, current Global Digital Elevation Models (GDEMs) are all Digital Surface Models (DSMs) in urban areas, which causes substantial blockage of flow pathways within flood inundation models. Taking GPS and LIDAR data as terrain observations, the errors of popular GDEMs (the SRTM 1" void-filled DEM - SRTM, the Multi-Error-Removed Improved-Terrain DEM - MERIT, and the TanDEM-X 3" resolution DEM - TDM3) were analysed in seven cities of varied type. It was found that the RMSE of GDEM errors ranges from 2.3 m to 7.9 m, and that MERIT and TDM3 both outperform SRTM. A comparison of errors between MERIT and TDM3 showed that the most accurate model varies among the studied cities. Generally, the error of TDM3 is slightly lower than that of MERIT, but TDM3 has more extreme errors (absolute value exceeding 15 m). For cities that have experienced rapid development in the past decade, the RMSE of MERIT is lower than that of TDM3, which is mainly caused by the difference in acquisition time between the two models. A machine learning method was adopted to estimate the MERIT error. Fourteen factors drawn from widely available datasets - night-time light, world population density, OpenStreetMap building data, slope, elevation and neighbourhood elevation values - were used in the regression. Models were trained on single cities and on combinations of cities, and then used to estimate the error in a target city. With this approach, the RMSE of the corrected MERIT declines by up to 75% when the model is trained on the target city, and by a less pronounced 35%-68% when the combined model excludes the target city from the training data. Further validation via flood simulation in a small city showed that the corrected MERIT improves on the original MERIT in terms of both flood extent and inundation depth, although it did not match TDM3 in this case. The method has the potential to generate a better bare-earth global DEM in urban areas, but its sensitivity under extrapolation to new sites needs investigation in more study areas.
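The regression described above maps the fourteen urban predictors to the MERIT-minus-reference error at each pixel and then subtracts the predicted error from MERIT. A minimal sketch of one plausible setup is shown below; the choice of learner (a random forest) and the function names are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch: predict the MERIT elevation error per urban pixel from ancillary
# predictors, then subtract it to obtain a corrected bare-earth estimate.
# The random-forest learner and array layout are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def train_error_model(features, merit_elev, reference_elev):
    """features: (n_pixels, 14) predictors such as night-time light, population
    density, OSM building data, slope, elevation and neighbourhood elevations;
    reference_elev: GPS/LIDAR terrain heights."""
    error = merit_elev - reference_elev              # regression target: MERIT error
    model = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
    model.fit(features, error)
    return model

def correct_dem(model, features, merit_elev):
    return merit_elev - model.predict(features)      # corrected elevations
```

Trained on a single city or a pool of cities, the same model can then be applied to the feature stack of a target city, mirroring the single-city and combined-city experiments reported above.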

2019 ◽  
Author(s):  
Ge Liu ◽  
Haoyang Zeng ◽  
Jonas Mueller ◽  
Brandon Carter ◽  
Ziheng Wang ◽  
...  

Abstract. The precise targeting of antibodies and other protein therapeutics is required for their proper function and the elimination of deleterious off-target effects. Often the molecular structure of a therapeutic target is unknown and randomized methods are used to design antibodies without a model that relates antibody sequence to desired properties. Here we present a machine learning method that can design human Immunoglobulin G (IgG) antibodies with target affinities that are superior to candidates from phage display panning experiments within a limited design budget. We also demonstrate that machine learning can improve target-specificity by the modular composition of models from different experimental campaigns, enabling a new integrative approach to improving target specificity. Our results suggest a new path for the discovery of therapeutic molecules by demonstrating that predictive and differentiable models of antibody binding can be learned from high-throughput experimental data without the need for target structural data.

Significance. Antibody-based therapeutics must meet both affinity and specificity metrics, and existing in vitro methods for meeting these metrics are based upon randomization and empirical testing. We demonstrate that with sufficient target-specific training data machine learning can suggest novel antibody variable domain sequences that are superior to those observed during training. Our machine learning method does not require any target structural information. We further show that data from disparate antibody campaigns can be combined by machine learning to improve antibody specificity.
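One way to read the "modular composition of models" described above is that an on-target affinity predictor and an off-target predictor, learned from separate campaigns, are combined into a single design objective. The sketch below illustrates that idea with hypothetical predictor functions and a simple greedy mutation loop; it is not the authors' architecture or optimisation procedure.

```python
# Sketch: combine hypothetical on-target and off-target binding predictors to
# propose antibody variable-domain sequences with improved specificity.
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def specificity_score(seq, predict_on_target, predict_off_target, penalty=1.0):
    # Higher is better: strong on-target binding, weak off-target binding.
    return predict_on_target(seq) - penalty * predict_off_target(seq)

def greedy_design(seed_seq, predict_on, predict_off, n_rounds=200, rng_seed=0):
    rng = random.Random(rng_seed)
    best = seed_seq
    best_score = specificity_score(best, predict_on, predict_off)
    for _ in range(n_rounds):
        pos = rng.randrange(len(best))
        candidate = best[:pos] + rng.choice(AMINO_ACIDS) + best[pos + 1:]
        score = specificity_score(candidate, predict_on, predict_off)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score
```

With differentiable predictors, the same objective could instead be optimised by gradient ascent over a relaxed sequence representation, which is closer to the "predictive and differentiable models" emphasised in the abstract.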


Author(s):  
Jian Yi

The stability of the economic market is an important factor for rapid economic development; for listed companies in particular, financial and economic stability affects the stability of the financial market. Accurate early warning of the financial condition of listed enterprises therefore supports the healthy development of both enterprises and financial markets. This paper briefly introduces the support vector machine (SVM) and back-propagation neural network (BPNN) algorithms from machine learning. To compensate for the shortcomings of each algorithm, the two were combined and applied to enterprise financial early warning. A simulation experiment was carried out in MATLAB on the single SVM-based model, the single BPNN-based model, and the combined SVM-BPNN model. The results show that the combined SVM-BPNN model converges faster and achieves higher precision, higher recall, and a larger area under the curve (AUC) than either single-algorithm model.
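The abstract does not specify how the two learners are combined; a common choice is a soft-voting ensemble in which the SVM and the neural network are trained on the same warning indicators and their probability outputs are averaged. The sketch below (scikit-learn rather than the authors' MATLAB implementation) illustrates that assumption.

```python
# Sketch: soft-voting ensemble of an SVM and a back-propagation neural network
# for binary financial-distress early warning. X, y are assumed to hold
# financial-ratio indicators and distress labels.
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import VotingClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
bpnn = make_pipeline(StandardScaler(),
                     MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000))

combined = VotingClassifier(
    estimators=[("svm", svm), ("bpnn", bpnn)],
    voting="soft",          # average the predicted class probabilities
)
# combined.fit(X_train, y_train); combined.predict_proba(X_test)
```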


2021 ◽  
Author(s):  
Ying Yang ◽  
Huaixin Cao

Abstract. With the rapid development of machine learning, artificial neural networks provide a powerful tool to represent or approximate many-body quantum states. It has been proved that every graph state can be generated by a neural network. In this paper, we introduce digraph states and explore their neural network representations (NNRs). Based on a discussion of digraph states and neural network quantum states (NNQSs), we construct explicitly the NNR for any digraph state, implying that every digraph state is an NNQS. The results provide a theoretical foundation for solving the quantum many-body problem with machine learning methods when the wave-function is an unknown digraph state or can be approximated by digraph states.
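For context, a neural network quantum state is commonly parameterised as a restricted-Boltzmann-machine ansatz over spin configurations; the generic form is shown below. This is the standard NNQS parameterisation, not the paper's explicit digraph-state construction.

```latex
\Psi(s_1,\dots,s_N;\,a,b,W)
  = \sum_{\{h_j=\pm 1\}} \exp\!\Big(\sum_{i} a_i s_i + \sum_{j} b_j h_j
      + \sum_{i,j} W_{ij}\, s_i h_j\Big)
  = e^{\sum_i a_i s_i}\,\prod_{j} 2\cosh\!\Big(b_j + \sum_i W_{ij}\, s_i\Big)
```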


Author(s):  
N. A. K. Doan ◽  
W. Polifke ◽  
L. Magri

We propose a physics-constrained machine learning method, based on reservoir computing, to time-accurately predict extreme events and long-term velocity statistics in a model of chaotic flow. The method leverages the strengths of two different approaches: empirical modelling based on reservoir computing, which learns the chaotic dynamics from data only, and physical modelling based on conservation laws. This enables the reservoir computing framework to output physical predictions when training data are unavailable. We show that the combination of the two approaches is able to accurately reproduce the velocity statistics and to predict the occurrence and amplitude of extreme events in a model of the self-sustaining process in turbulence. In this flow, the extreme events are abrupt transitions from turbulent to quasi-laminar states, which are deterministic phenomena that cannot traditionally be predicted because of chaos. Furthermore, the physics-constrained machine learning method is shown to be robust with respect to noise. This work opens up new possibilities for synergistically enhancing data-driven methods with physical knowledge for the time-accurate prediction of chaotic flows.
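A minimal illustration of the reservoir computing backbone is given below: a random, fixed reservoir is driven by the input signal and only the linear read-out is trained by ridge regression. In the physics-constrained variant described above, the read-out training loss would additionally penalise the residual of the governing conservation law; only the standard data-driven training is sketched here, and the parameter values are illustrative assumptions.

```python
# Sketch: minimal echo state network (reservoir computing) for time-series
# prediction. The physics-constrained variant would add a penalty on the
# conservation-law residual to the read-out loss; that term is omitted here.
import numpy as np

rng = np.random.default_rng(0)

def build_reservoir(n_in, n_res=300, spectral_radius=0.9, density=0.05):
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-1.0, 1.0, (n_res, n_res)) * (rng.random((n_res, n_res)) < density)
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))   # rescale spectral radius
    return W_in, W

def run_reservoir(u_seq, W_in, W):
    states = np.zeros((len(u_seq), W.shape[0]))
    x = np.zeros(W.shape[0])
    for t, u in enumerate(u_seq):
        x = np.tanh(W_in @ u + W @ x)                        # reservoir update
        states[t] = x
    return states

def train_readout(states, targets, ridge=1e-6):
    # Ridge regression for the output weights (data loss only).
    A = states.T @ states + ridge * np.eye(states.shape[1])
    return np.linalg.solve(A, states.T @ targets)
```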


Author(s):  
Vincent X. Gong ◽  
Winnie Daamen ◽  
Alessandro Bozzon ◽  
Serge P. Hoogendoorn

City events are being organized more frequently, and with larger crowds, in urban areas. There is an increased need for novel methods and tools that can provide information on the sentiments of crowds as an input for crowd management. Previous work has explored sentiment analysis, and a large number of methods have been proposed for various contexts. None of them, however, aimed at deriving the sentiments of crowds using social media in city events, and no existing event-based dataset is available for such studies. This paper investigates how social media can be used to estimate the sentiments of crowds in city events. First, a set of lexicon-based and machine learning-based methods was selected to perform sentiment analysis, and an event-based sentiment-annotated dataset was constructed. The selected methods were then trained and tested in an experiment using both common and event-based datasets. Results show that the machine learning method LinearSVC achieves the lowest estimation error for sentiment analysis of social media in city events. The proposed event-based dataset is essential for training methods that reduce estimation error in such contexts.
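The LinearSVC approach reported as best can be reproduced in outline with a standard scikit-learn text pipeline; the feature choice (TF-IDF over word n-grams) and the placeholder examples below are assumptions, not the exact configuration or data used in the paper.

```python
# Sketch: post-level sentiment classifier with TF-IDF features and LinearSVC,
# trained on an event-based annotated dataset (placeholder texts and labels).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

texts = ["great atmosphere at the parade", "too crowded, awful queues",
         "loved the fireworks tonight", "the event was badly organised"]   # placeholders
labels = ["positive", "negative", "positive", "negative"]                  # placeholders

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # unigrams and bigrams
    LinearSVC(),
)
clf.fit(texts, labels)
print(clf.predict(["amazing crowd, fantastic show"]))
```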


Author(s):  
D.-L. Cheng ◽  
W.-H. Lai

Abstract. The rapid development of unmanned aircraft systems (UAS) in recent years has been accompanied by many potential risk factors arising from faults, so diagnosing UAS health status remains an important issue. This study adopted the self-organizing map (SOM), an unsupervised machine learning clustering method, to establish a model for diagnosing the health status of a quadcopter. Vibration features were taken from three flight states (undamaged, loose motor mount, unbalanced broken propeller). From these training data, the model can cluster the different vibration patterns of the fault conditions. It not only classifies the failure status with 99% accuracy but also provides pre-failure indicators.
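A small sketch of the clustering step is given below using the MiniSom package; the map size, feature layout and placeholder data are assumptions, since the abstract does not specify the study's actual implementation.

```python
# Sketch: cluster quadcopter vibration features with a self-organizing map
# (MiniSom). `features` stands in for per-flight vibration descriptors from the
# three states: undamaged, loose motor mount, unbalanced broken propeller.
import numpy as np
from minisom import MiniSom

features = np.random.rand(90, 6)            # placeholder: 90 samples, 6 features
som = MiniSom(x=8, y=8, input_len=features.shape[1],
              sigma=1.0, learning_rate=0.5, random_seed=0)
som.random_weights_init(features)
som.train_random(features, num_iteration=5000)

# Map each sample to its best-matching unit; samples from the same fault state
# should fall on neighbouring units, giving the health-status clusters.
bmus = np.array([som.winner(f) for f in features])
```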


2019 ◽  
Author(s):  
Hironori Takemoto ◽  
Tsubasa Goto ◽  
Yuya Hagihara ◽  
Sayaka Hamanaka ◽  
Tatsuya Kitamura ◽  
...  
