Coherence-based land cover classification in forested areas of Chattisgarh, Central India, using environmental satellite—advanced synthetic aperture radar data

2011 · Vol 5 (1) · pp. 059501 · Author(s): Vyjayanthi Nizalapur
1994 · Vol 38 · pp. 759-764 · Author(s): Yasuto TACHIKAWA, Seiji SUHARA, Michiharu SHIIBA, Takuma TAKASAO, Kaoru TAKARA

2019 · Vol 11 (13) · pp. 1518 · Author(s): Rubén Valcarce-Diñeiro, Benjamín Arias-Pérez, Juan M. Lopez-Sanchez, Nilda Sánchez

Land-cover monitoring is one of the core applications of remote sensing. Monitoring and mapping changes in the distribution of agricultural land covers provide a reliable source of information that supports environmental sustainability and agricultural policies. Synthetic Aperture Radar (SAR) can contribute considerably to this monitoring effort. The first objective of this research is to extend the use of time series of polarimetric data for land-cover classification using a decision tree classification algorithm. With this aim, RADARSAT-2 (quad-pol) and Sentinel-1 (dual-pol) data were acquired over an area of 600 km² in central Spain. Ten polarimetric observables were derived from both datasets, and seven scenarios were created with different sets of observables to evaluate a multitemporal parcel-based approach for classifying eleven land-cover types, most of which were agricultural crops. The study demonstrates that good overall accuracies, greater than 83%, were achieved for all of the proposed scenarios, and that the scenario with all RADARSAT-2 polarimetric observables was the best option (89.1%). Very high accuracies were also obtained when dual-pol data from RADARSAT-2 or Sentinel-1 were used, with overall accuracies of 87.1% and 86%, respectively. In terms of individual crop accuracy, rapeseed achieved a producer's accuracy of at least 95% in all scenarios, followed by the spring cereals (wheat and barley), which achieved high producer's accuracies (79.9%–95.3%) and user's accuracies (85.5% and 93.7%).
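
The abstract does not include code, so the following is a minimal, hypothetical sketch in Python (scikit-learn) of the kind of parcel-based decision-tree classification it describes, together with the accuracy measures it reports (overall, producer's, user's). The feature matrix, number of dates, class count, and tree settings below are placeholder assumptions, not the authors' actual data, observables, or parameters.

```python
# Minimal sketch: decision-tree classification of per-parcel multitemporal
# polarimetric features, plus overall / producer's / user's accuracy.
# All data here are random placeholders standing in for real parcel statistics.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

rng = np.random.default_rng(0)

# Hypothetical dataset: 1000 parcels, 10 polarimetric observables x 6 dates
# stacked into one feature vector per parcel, 11 land-cover classes.
n_parcels, n_features, n_classes = 1000, 10 * 6, 11
X = rng.normal(size=(n_parcels, n_features))
y = rng.integers(0, n_classes, size=n_parcels)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# Decision-tree classifier (depth is an arbitrary illustrative choice).
clf = DecisionTreeClassifier(max_depth=10, random_state=0)
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)

# Confusion matrix: rows = reference labels, columns = predicted labels.
cm = confusion_matrix(y_test, y_pred, labels=np.arange(n_classes))
overall = accuracy_score(y_test, y_pred)
producers = np.diag(cm) / cm.sum(axis=1)              # producer's accuracy = recall
users = np.diag(cm) / cm.sum(axis=0).clip(min=1)      # user's accuracy = precision
                                                      # (clip avoids /0 for classes never predicted)

print(f"Overall accuracy: {overall:.3f}")
for k, (pa, ua) in enumerate(zip(producers, users)):
    print(f"class {k}: producer's {pa:.3f}, user's {ua:.3f}")
```

Producer's accuracy is computed over the reference parcels of each class (how many were found), while user's accuracy is computed over the parcels assigned to that class (how many assignments were correct), which is why both are derived from the same confusion matrix along different axes.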

