Performance of Different Classifiers for Marine Habitat Mapping using Side Scan Sonar and Object-Based Image Analysis

Author(s): Raihanah Rusmadi, Rozaimi Che Hasan

2020, Vol 12 (3), pp. 554
Author(s): Apostolos Papakonstantinou, Chrysa Stamati, Konstantinos Topouzelis

The use of unmanned aerial systems (UAS) has grown rapidly in recent years owing to their agility and their ability to image an area with high-quality products. UAS are a low-cost method for close-range remote sensing, giving scientists high-resolution data with limited deployment time and access to even the most difficult-to-reach areas. This study aims to produce marine habitat maps by comparing the results obtained from true-color RGB (tc-RGB) and multispectral high-resolution orthomosaics derived from UAS geodata using object-based image analysis (OBIA). The aerial data were acquired using two different sensors, one true-color RGB and one multispectral, both attached to a UAS and capturing images simultaneously. Additionally, divers' underwater images and echo sounder measurements were collected as in situ data. The produced orthomosaics were processed under three scenarios, each applying a different classification approach for marine habitat mapping. In the first and second scenarios, the k-nearest neighbor (k-NN) algorithm and fuzzy rules were applied as classifiers, respectively. In the third scenario, fuzzy rules were applied to the echo sounder data to create samples for the classification process, and the k-NN algorithm was then used as the classifier. The in situ data were used as reference and training data, and also for calculating the overall accuracy of the OBIA process in all scenarios. The classification results of the three scenarios were compared. Using tc-RGB instead of multispectral data provides better accuracy in detecting and classifying marine habitats when the k-NN is applied as the classifier; in this case, the overall accuracy was 79% and the Kappa index of agreement (KIA) was 0.71, illustrating the effectiveness of the proposed approach. The results showed that sub-decimeter-resolution UAS data reveal the sub-bottom complexity of relatively shallow areas to a large extent, providing accurate information that permits habitat mapping in extreme detail. The produced habitat datasets are ideal as reference data for studying complex coastal environments using satellite imagery.
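As a rough illustration of the classification and accuracy assessment described above, the sketch below classifies image objects with k-NN and scores the result with overall accuracy and the Kappa index of agreement. It is a minimal sketch, assuming per-object mean band values as features; the feature values, class labels, and n_neighbors setting are illustrative and not the authors' configuration.

```python
# Minimal sketch of object-based k-NN classification with accuracy assessment.
# Assumes each image object (segment) is summarized by its mean band values;
# features, labels, and k are illustrative, not the authors' actual settings.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Hypothetical per-object features: mean reflectance per band (rows = objects).
X_train = np.array([[0.12, 0.34, 0.21],   # e.g. seagrass sample
                    [0.45, 0.40, 0.38],   # e.g. sand sample
                    [0.20, 0.25, 0.15]])  # e.g. rocky reef sample
y_train = np.array(["seagrass", "sand", "reef"])

X_test = np.array([[0.14, 0.33, 0.22],
                   [0.44, 0.41, 0.36]])
y_test = np.array(["seagrass", "sand"])  # in situ reference labels

# k-NN classifier applied to the segmented image objects.
knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X_train, y_train)
y_pred = knn.predict(X_test)

# Overall accuracy and Kappa index of agreement (KIA), as reported in the study.
print("Overall accuracy:", accuracy_score(y_test, y_pred))
print("Kappa (KIA):     ", cohen_kappa_score(y_test, y_pred))
```

In practice, the per-object features would be exported from the OBIA segmentation, and the reference labels would come from the in situ data (divers' images and echo sounder samples).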


2018, Vol 208, pp. 27-41
Author(s): Chris Roelfsema, Eva Kovacs, Juan Carlos Ortiz, Nicholas H. Wolff, David Callaghan, ...

2015, Vol 24, pp. 222-227
Author(s): Nurhalis Wahidin, Vincentius P. Siregar, Bisman Nababan, Indra Jaya, Sam Wouthuyzen

2018, Vol 39 (1-2), pp. 271-288
Author(s): Daniel Ierodiaconou, Alexandre C. G. Schimel, David Kennedy, Jacquomo Monk, Grace Gaylard, ...

2021, Vol 193 (2)
Author(s): Jens Oldeland, Rasmus Revermann, Jona Luther-Mosebach, Tillmann Buttschardt, Jan R. K. Lehmann

Plant species that negatively affect their environment by encroachment require constant management and monitoring through field surveys. Drones have been suggested to support field surveyors, allowing more accurate mapping with just-in-time aerial imagery. Furthermore, object-based image analysis tools could increase the accuracy of species maps. However, only a few studies compare species distribution maps resulting from traditional field surveys with those from object-based image analysis of drone imagery. We acquired drone imagery for a saltmarsh area (18 ha) on the Hallig Nordstrandischmoor (Germany) with patches of Elymus athericus, a tall grass that encroaches on the higher parts of saltmarshes. A field survey was conducted afterwards using the drone orthoimagery as a baseline. We used object-based image analysis (OBIA) to segment the CIR imagery into polygons, which were classified into eight land cover classes. Finally, we compared the polygons of the field-based and OBIA-based maps visually and for location, area, and overlap before and after post-processing. OBIA-based classification yielded good results (kappa = 0.937) and agreed in general with the field-based maps (field = 6.29 ha, drone = 6.22 ha of E. athericus dominance). Post-processing revealed 0.31 ha of misclassified polygons, often related to water runnels or shadows, leaving 5.91 ha of E. athericus cover. The overlap of the two polygon maps was only 70%, owing to many small patches being identified where E. athericus was absent. In sum, drones can greatly support field surveys in the monitoring of plant species by providing accurate species maps and just-in-time, very-high-resolution imagery.
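The area-and-overlap comparison of the field-based and OBIA-based polygon maps can be sketched with standard geometry tooling. The snippet below is a minimal sketch using shapely, with toy rectangles standing in for E. athericus patches; the geometries and the choice of normalizing overlap by the larger mapped area are assumptions, not the authors' workflow.

```python
# Minimal sketch of comparing a field-mapped and an OBIA-derived polygon layer
# for area and overlap. Geometries are toy rectangles; real maps would be read
# from vector files (e.g. with geopandas) rather than constructed by hand.
from shapely.geometry import box
from shapely.ops import unary_union

# Hypothetical Elymus athericus patches from the two maps (coordinates in metres).
field_patches = [box(0, 0, 100, 100), box(150, 0, 250, 80)]
obia_patches = [box(5, 5, 100, 105), box(160, 0, 250, 75), box(300, 300, 310, 310)]

field_union = unary_union(field_patches)
obia_union = unary_union(obia_patches)

field_area = field_union.area
obia_area = obia_union.area
overlap_area = field_union.intersection(obia_union).area

print(f"Field-mapped area : {field_area / 1e4:.2f} ha")
print(f"OBIA-mapped area  : {obia_area / 1e4:.2f} ha")
print(f"Overlap           : {overlap_area / max(field_area, obia_area):.0%}")
```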


2021, Vol 13 (4), pp. 830
Author(s): Adam R. Benjamin, Amr Abd-Elrahman, Lyn A. Gettys, Hartwig H. Hochmair, Kyle Thayer

This study investigates the use of unmanned aerial systems (UAS) mapping for monitoring the efficacy of invasive aquatic vegetation (AV) management on a floating-leaved AV species, Nymphoides cristata (CFH). The study site consists of 48 treatment plots (TPs). Based on six flights over two days at three different flight altitudes, using both a multispectral and an RGB sensor, accuracy assessment of the final object-based image analysis (OBIA)-derived classified images yielded overall accuracies ranging from 89.6% to 95.4%. The multispectral sensor was significantly more accurate than the RGB sensor at measuring CFH areal coverage within each TP only at the highest multispectral spatial resolution (2.7 cm/pix at 40 m altitude). When measuring the response in AV community area between the day of treatment and two weeks after treatment, there was no significant difference between the temporal area change derived from the reference datasets and the area changes derived from either the RGB or the multispectral sensor. Thus, water resource managers need to weigh the small gains in accuracy from multispectral sensors against operational considerations such as additional processing time due to larger file sizes, higher equipment costs, and longer flight durations in the field.
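As a hedged sketch of how areal coverage and its change between treatment dates could be derived from a classified image, the snippet below counts pixels of a hypothetical CFH class inside a treatment-plot mask and converts them to area. The class code, pixel size, and arrays are placeholders, not the study's actual data or processing chain.

```python
# Minimal sketch of deriving CFH areal coverage per treatment plot from a
# classified raster and measuring the change between two dates.
# Class codes, pixel size, and arrays are illustrative placeholders.
import numpy as np

CFH_CLASS = 1           # hypothetical class code for Nymphoides cristata
PIXEL_SIZE_M = 0.027    # ~2.7 cm/pix ground sampling distance

def cfh_area_m2(classified: np.ndarray, plot_mask: np.ndarray) -> float:
    """Area (m^2) classified as CFH inside a treatment-plot mask."""
    n_pixels = np.count_nonzero((classified == CFH_CLASS) & plot_mask)
    return n_pixels * PIXEL_SIZE_M ** 2

# Toy classified rasters for the day of treatment and two weeks after.
rng = np.random.default_rng(0)
day0 = rng.integers(0, 3, size=(200, 200))
day14 = rng.integers(0, 3, size=(200, 200))

plot = np.zeros((200, 200), dtype=bool)
plot[50:150, 50:150] = True  # hypothetical treatment-plot footprint

change = cfh_area_m2(day14, plot) - cfh_area_m2(day0, plot)
print(f"CFH area change in plot: {change:.2f} m^2")
```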

