Benthic Habitat Mapping using Sentinel 2A: A preliminary Study in Image Classification Approach in An Absence of Training Data

2021 ◽ Vol 750 (1) ◽ pp. 012029
Author(s): Munawaroh, AW Rudiastuti, RS Dewi, YH Ramadhani, A Rahadiati, ...
2016 ◽ Vol 76 ◽ pp. 200-208
Author(s): Christopher E. Parrish, Jennifer A. Dijkstra, Jarlath P.M. O'Neil-Dunne, Lindsay McKenna, Shachak Pe'eri

Sensors ◽ 2021 ◽ Vol 21 (5) ◽ pp. 1573
Author(s): Loris Nanni, Giovanni Minchio, Sheryl Brahnam, Gianluca Maguolo, Alessandra Lumini

Traditionally, classifiers are trained to predict patterns within a feature space. The image classification system presented here instead trains classifiers to predict patterns within a vector space built by combining the dissimilarity spaces generated by a large set of Siamese Neural Networks (SNNs). A set of centroids is calculated from the patterns in the training data sets with supervised k-means clustering. The centroids are used to generate the dissimilarity space via the Siamese networks. The vector-space descriptors are extracted by projecting patterns onto the dissimilarity spaces, and SVMs classify an image by its dissimilarity vector. The versatility of the proposed approach is demonstrated by evaluating the system on different types of images across two domains: two medical data sets and two animal audio data sets with vocalizations represented as images (spectrograms). Results show that the proposed system is competitive with the best-performing methods in the literature, obtaining state-of-the-art performance on one of the medical data sets, and does so without ad hoc optimization of the clustering methods on the tested data sets.
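The pipeline the abstract describes (cluster the training patterns, measure each pattern's dissimilarity to the cluster centroids, then classify the resulting dissimilarity vectors) can be sketched in a few lines. This is only an illustrative sketch on toy data: plain k-means and Euclidean distance stand in for the paper's supervised clustering and SNN-learned dissimilarity, and a nearest-class-mean rule stands in for the SVM; all names and data below are hypothetical.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    # Plain k-means; stands in for the paper's supervised clustering step.
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids

def dissimilarity_vectors(X, centroids):
    # Each pattern is described by its distances to the centroids.
    # Euclidean distance stands in for the SNN-learned dissimilarity.
    return np.linalg.norm(X[:, None] - centroids[None], axis=2)

# Toy two-class data (hypothetical, not from the paper).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

centroids = kmeans(X, k=4)
D = dissimilarity_vectors(X, centroids)

# Classify in the dissimilarity space with a nearest-class-mean rule
# (a simple substitute for the SVM used in the paper).
class_means = np.stack([D[y == c].mean(axis=0) for c in (0, 1)])
pred = np.linalg.norm(D[:, None] - class_means[None], axis=2).argmin(axis=1)
accuracy = (pred == y).mean()
```

On well-separated toy clusters like these, the dissimilarity-space classifier recovers the labels almost perfectly; the point of the sketch is only the shape of the pipeline, not the numbers.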


2021 ◽ Vol 3 ◽ pp. 100015
Author(s): Benjamin Misiuk, Myriam Lacharité, Craig J. Brown

2020 ◽ Vol 26 (4) ◽ pp. 405-425
Author(s): Javed Miandad, Margaret M. Darrow, Michael D. Hendricks, Ronald P. Daanen

ABSTRACT This study presents a new methodology to identify landslide and landslide-susceptible locations in Interior Alaska using only geomorphic properties from light detection and ranging (LiDAR) derivatives (i.e., slope, profile curvature, and roughness) and the normalized difference vegetation index (NDVI), focusing on the effect of different LiDAR image resolutions. We developed a semi-automated object-oriented image classification approach in ArcGIS 10.5 and prepared a landslide inventory from visual observation of hillshade images. The multistage workflow combined derivatives from 1-, 2.5-, and 5-m-resolution LiDAR with image segmentation, image classification using a support vector machine classifier, and image generalization to remove false positives. We assessed classification accuracy by generating confusion matrix tables. Analysis of the results indicated that LiDAR image scale played an important role in the classification, and that including NDVI improved the results. Overall, the 5-m-resolution LiDAR image with NDVI generated the best results, with a kappa value of 0.55 and an overall accuracy of 83 percent. The 1-m-resolution LiDAR image with NDVI generated the highest producer's accuracy for landslide locations, at 73 percent. We produced a combined overlay map by summing the individual classified maps; it delineated landslide objects better than the individual maps did. The combined classified map from 1-, 2.5-, and 5-m-resolution LiDAR with NDVI generated producer's accuracies of 60, 80, and 86 percent and user's accuracies of 39, 51, and 98 percent for landslide, landslide-susceptible, and stable locations, respectively, with an overall accuracy of 84 percent and a kappa value of 0.58. This semi-automated object-oriented image classification approach demonstrated potential as a viable tool with further refinement and/or in combination with additional data sources.
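The accuracy figures reported in the abstract (producer's and user's accuracies, overall accuracy, and kappa) are all standard measures derived from a confusion matrix. The sketch below computes them for an arbitrary two-class matrix; the `accuracy_metrics` function and the example matrix are our own illustration, not data or code from the study.

```python
import numpy as np

def accuracy_metrics(cm):
    """Standard remote-sensing accuracy measures from a confusion matrix
    (rows = reference classes, columns = classified classes)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    diag = np.diag(cm)
    producers = diag / cm.sum(axis=1)  # correct / reference totals (omission error view)
    users = diag / cm.sum(axis=0)      # correct / classified totals (commission error view)
    overall = diag.sum() / n
    # Cohen's kappa: observed agreement corrected for chance agreement.
    expected = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2
    kappa = (overall - expected) / (1.0 - expected)
    return producers, users, overall, kappa

# Illustrative 2-class matrix (hypothetical, not results from the study).
cm = [[40, 10],
      [10, 40]]
producers, users, overall, kappa = accuracy_metrics(cm)
```

For this matrix, 80 of 100 samples sit on the diagonal, so the overall accuracy is 0.80; chance agreement is 0.50, giving a kappa of 0.60, which illustrates why kappa is always lower than overall accuracy whenever chance agreement is nonzero.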


2016
Author(s): Pu Hong, Xiao-feng Ye, Hui Yu, Zhi-jie Zhang, Yu-fei Cai, ...
