local convexity
Recently Published Documents


TOTAL DOCUMENTS: 91 (FIVE YEARS: 17)
H-INDEX: 13 (FIVE YEARS: 1)

Author(s):  
Chuin-Shan Chen ◽  
Jimmy Gaspard Jean ◽  
Tung-Huan Su
Keyword(s):  

2021 ◽  
Author(s):  
Junko Iwahashi ◽  
Dai Yamazaki

Abstract: Global terrain classification data have been used for various issues known to be related to topography, such as the estimation of soil types, the estimation of Vs30, and the creation of seismic hazard maps. However, due to the resolution of the DEMs used, the terrain classification data from previous studies could not discriminate small landforms, such as narrow valley-bottom plains and small rises within the plains. We created a global polygon dataset in shapefile format, divided into uniform slope units derived from slope gradient and HAND (height above the nearest drainage) calculated from the 90 m spatial resolution MERIT DEM, and combined these data with the unit catchments of MERIT-Basins. The dataset contains the calculated terrain measurements (slope gradient, HAND, surface texture, local convexity, and sinks) and polygon areas as attributes, as well as the ID number of the corresponding MERIT-Basins unit catchment. In addition, the results of k-means clustering using slope gradient, HAND, and surface texture, which can be joined with the dataset as a simple terrain classification, are also available. The dataset can be used as a proxy for topographic conditions and is expected to contribute to the modeling and estimation of various phenomena known to be related to topography.
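
As a rough illustration of the k-means step described in this abstract, the sketch below clusters terrain-unit polygons by slope gradient, HAND, and surface texture. The file name, attribute names, and cluster count are assumptions for illustration only, not the dataset's actual schema.

```python
# Hypothetical sketch: k-means clustering of terrain polygons by slope
# gradient, HAND, and surface texture. File/column names and k are assumed.
import geopandas as gpd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

polygons = gpd.read_file("terrain_polygons.shp")               # assumed file name
features = polygons[["slope", "hand", "texture"]].to_numpy()   # assumed columns

# Standardize so the three measures contribute comparably to the distance.
scaled = StandardScaler().fit_transform(features)

# Cluster the polygons; k = 15 is an arbitrary illustrative choice.
polygons["cluster"] = KMeans(n_clusters=15, random_state=0).fit_predict(scaled)
polygons.to_file("terrain_clusters.shp")
```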


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Junko Iwahashi ◽  
Dai Yamazaki ◽  
Takayuki Nakano ◽  
Ryo Endo

Abstract: This study aims to create a terrain classification of Japan that allows geomorphological and geoengineering classifications to coexist without large contradictions and that distinguishes landform elements even in urban plains, where digital elevation models (DEMs) include noise. Because Japan is susceptible to natural disasters, we designed the classification to reflect the ground vulnerability of both alluvial plains and mountains, with applications of the terrain classification data to landslide susceptibility and seismic zoning. We updated an existing DEM-based terrain classification method for application to a high-resolution 30 m DEM, using topographic measurements that do not amplify man-made unevenness or noise, which are usually the main problems when using high-resolution DEMs with high vertical accuracy. We selected the height above the nearest drainage (HAND), slope gradient, surface texture, and local convexity as geometric signatures, devised so as not to detect noise. Segment polygon data of terrain units were derived from the raster data of slope and HAND. The polygons were classified into 40 clusters using the attributes of slope, HAND, and surface texture, and then grouped into 16 legend classes following comparisons with existing geological and geomorphological maps and supplementary reclassification by HAND and local convexity. The derived terrain classification, except for man-made cuts and fills, showed similarities with existing expert-driven maps and some association with areas where shallow landslides or floods frequently occur. Based on a trial in California using a 30 m DEM, we concluded that the proposed method can be adopted in regions outside Japan.
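
For context, the local convexity signature used in DEM-based terrain classification is commonly computed as the fraction of convex-upward cells (positive response of a Laplacian filter) within a moving window. The sketch below follows that common definition; the kernel, sign convention, and window size are illustrative assumptions, not necessarily the exact parameters used in this study.

```python
# Hedged sketch of a common "local convexity" computation on a DEM raster:
# the fraction of convex-upward cells within a moving window.
import numpy as np
from scipy import ndimage

def local_convexity(dem: np.ndarray, window: int = 21) -> np.ndarray:
    # 3x3 Laplacian kernel; negating the response makes convex-upward cells
    # (e.g. ridge tops) come out positive.
    laplacian = np.array([[0.0,  1.0, 0.0],
                          [1.0, -4.0, 1.0],
                          [0.0,  1.0, 0.0]])
    curvature = -ndimage.convolve(dem, laplacian, mode="nearest")
    convex = (curvature > 0).astype(float)
    # The window mean of the 0/1 flag is the fraction of convex cells.
    return ndimage.uniform_filter(convex, size=window, mode="nearest")
```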


2021 ◽  
Vol 13 (3) ◽  
pp. 477
Author(s):  
Juan Carlos Marrero ◽  
David Martín de Diego ◽  
Eduardo Martínez

A theory of local convexity for a second order differential equation (SODE) on a Lie algebroid is developed. The particular case when the SODE is homogeneous quadratic is extensively discussed.
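
For readers unfamiliar with the setting: in standard local coordinates $(x^i)$ on the base and $(y^\alpha)$ on the fibres of a Lie algebroid with anchor $\rho$, a SODE takes the form below, and the homogeneous quadratic case corresponds to coefficients quadratic in the fibre coordinates. This coordinate expression is the standard one, not taken from the paper.

```latex
% Standard local form of a SODE on a Lie algebroid with anchor \rho;
% in the homogeneous quadratic case the coefficients f^\alpha are
% quadratic in the fibre coordinates y.
\begin{aligned}
  \dot{x}^{i}      &= \rho^{i}_{\alpha}(x)\, y^{\alpha}, \\
  \dot{y}^{\alpha} &= f^{\alpha}(x, y),
  \qquad f^{\alpha}(x, y) = -\Gamma^{\alpha}_{\beta\gamma}(x)\, y^{\beta} y^{\gamma}
  \quad \text{(homogeneous quadratic case)}.
\end{aligned}
```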


2020 ◽  
Author(s):  
Sorush Niknamian

Point cloud data reconstruction is the basis of point cloud data processing, and the quality of the reconstruction has a large impact on applications. To address the low precision, large error, and high time consumption of existing scattered point cloud reconstruction algorithms, this paper proposes a new reconstruction algorithm based on local convexity. First, using surface variation based on the local outlier factor (SVLOF), noise points are divided into near outliers and far outliers and filtered out during preprocessing. The local-convexity-based algorithm is then improved: a local connection point set is constructed in place of triangulation to analyze the relationships between neighboring points, and a connected-part identification method is used for reconstruction. Experimental results show that the proposed method reconstructs scattered point cloud data accurately, with high precision, small error, and low time consumption.
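
A hedged sketch of the preprocessing idea (not the paper's exact SVLOF formulation): compute a per-point surface-variation feature from the PCA of each neighborhood and score it with a local outlier factor, splitting anomalous points into near and far outliers. The neighborhood size and thresholds below are assumptions.

```python
# Illustrative outlier filtering: per-point surface variation (smallest PCA
# eigenvalue over the eigenvalue sum) scored with a local outlier factor.
import numpy as np
from sklearn.neighbors import NearestNeighbors, LocalOutlierFactor

def surface_variation(points: np.ndarray, k: int = 20) -> np.ndarray:
    _, idx = NearestNeighbors(n_neighbors=k).fit(points).kneighbors(points)
    variation = np.empty(len(points))
    for i, neighborhood in enumerate(points[idx]):
        eigvals = np.sort(np.linalg.eigvalsh(np.cov(neighborhood.T)))
        variation[i] = eigvals[0] / max(eigvals.sum(), 1e-12)
    return variation

def split_outliers(points: np.ndarray, k: int = 20,
                   near_thresh: float = 1.5, far_thresh: float = 3.0):
    """Return (filtered points, near-outlier mask, far-outlier mask)."""
    feature = surface_variation(points, k).reshape(-1, 1)
    lof_model = LocalOutlierFactor(n_neighbors=k).fit(feature)
    score = -lof_model.negative_outlier_factor_        # larger = more anomalous
    near = (score > near_thresh) & (score <= far_thresh)   # assumed thresholds
    far = score > far_thresh
    return points[~(near | far)], near, far
```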


Sensors ◽  
2020 ◽  
Vol 20 (4) ◽  
pp. 1192 ◽  
Author(s):  
Mihai Niculiță

Archaeological topography identification from high-resolution DEMs (Digital Elevation Models) is a current method used with high success in archaeological prospecting over wide areas. I present a methodology through which burial mounds (tumuli) can be identified from LiDAR (Light Detection And Ranging) DEMs. The methodology uses geomorphometric and statistical methods to identify burial mound candidates with high accuracy. Peaks, defined as local elevation maxima, are found in a first step. In the second step, local convexity watershed segments and their seeds are compared with the positions of the local peaks, and the peaks that coincide with, or lie near, local convexity segment seeds are selected. The local convexity segments corresponding to these selected peaks are then fed to a Random Forest algorithm, together with shape descriptors and descriptive statistics of geomorphometric variables, in order to build a classification model. Multiple approaches to tuning and selecting the proper training dataset, settings, and variables were tested. The model was validated both on the full dataset used for training and on an external dataset, in order to test the usability of the method for other areas with a similar geomorphological and archaeological setting. Validation was performed against manually mapped and field-checked burial mounds from two neighboring study areas of 100 km2 each. The results show that by training the Random Forest on a dataset composed of between 75% and 100% of the segments corresponding to burial mounds and ten times as many non-burial-mound segments selected using Latin hypercube sampling, 93% of the burial mound segments from the external dataset are identified. There are 42 false positive cases that need to be checked, and two burial mound segments are missed. The method shows great promise for burial mound detection over wider areas by delineating a certain number of tumuli for model training.
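
As a hedged sketch of two of the building blocks described above, the code below detects local elevation peaks in a DEM and fits a Random Forest over per-segment descriptors. Window size, feature set, and hyperparameters are illustrative assumptions rather than the study's actual settings.

```python
# Illustrative building blocks: local elevation peaks and a Random Forest
# over per-segment descriptors (area, mean slope, convexity statistics, ...).
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

def local_peaks(dem: np.ndarray, size: int = 5) -> np.ndarray:
    """Boolean mask of cells that are the maximum of their size-by-size window."""
    return dem == ndimage.maximum_filter(dem, size=size, mode="nearest")

def train_mound_classifier(segment_features: np.ndarray, labels: np.ndarray):
    """labels: 1 = burial-mound segment, 0 = other segment."""
    model = RandomForestClassifier(n_estimators=500, class_weight="balanced",
                                   random_state=0)
    model.fit(segment_features, labels)
    return model
```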


Author(s):  
Mihai Niculita

Archaeological topography identification from high-resolution DEMs is a current method used with high success in archaeological prospecting over wide areas. I present a methodology through which burial mounds (tumuli) can be identified from LiDAR DEMs. The methodology uses geomorphometric and statistical methods to identify burial mound candidates with high accuracy. Peaks, defined as local elevation maxima, are found in a first step. In the second step, local convexity watershed segments and their seeds are compared with the positions of the local peaks, and the peaks that coincide with, or lie near, local convexity segment seeds are selected. The local convexity segments corresponding to these selected peaks are then fed to a Random Forest algorithm, together with shape descriptors and descriptive statistics of geomorphometric variables, in order to build a classification model. Multiple approaches to tuning and selecting the proper training dataset, settings, and variables were tested. The model was validated both on the full dataset used for training and on an external dataset, in order to test the usability of the method for other areas with a similar geomorphological and archaeological setting. Validation was performed against manually mapped and field-checked burial mounds from two neighboring study areas of 100 km2 each. The results show that by training the Random Forest on a dataset composed of between 75% and 100% of the segments corresponding to burial mounds and ten times as many non-burial-mound segments selected using Latin hypercube sampling, 93% of the burial mound segments from the external dataset are identified. There are 42 false positive cases that need to be checked, and two burial mound segments are missed. The method shows great promise for burial mound detection over wider areas by delineating a certain number of tumuli for model training.

