OPPORTUNITIES FOR MACHINE LEARNING AND ARTIFICIAL INTELLIGENCE IN NATIONAL MAPPING AGENCIES: ENHANCING ORDNANCE SURVEY WORKFLOW

Author(s):  
J. Murray ◽  
I. Sargent ◽  
D. Holland ◽  
A. Gardiner ◽  
K. Dionysopoulou ◽  
...  

Abstract. National mapping agencies (NMAs) are frequently tasked with providing highly accurate geospatial data for a range of customers. Traditionally, this challenge has been met by combining the collection of remote sensing data with extensive field work, and by manually interpreting and processing the combined data. This is a significant logistical undertaking that yields high-quality output but is extremely expensive to deliver. Novel approaches that can automate feature extraction and classification from remotely sensed data are therefore of great potential interest to NMAs across the entire sector. Using research undertaken at Great Britain's NMA, Ordnance Survey (OS), as an example, this paper provides an overview of recent advances at an NMA in the use of artificial intelligence (AI), including machine learning (ML) and deep learning (DL) based applications. Examples include automating feature extraction and classification from remotely sensed aerial imagery. Recent OS research applying deep (convolutional) neural network architectures to image classification is also described. This overview is intended to be useful to other NMAs who may be considering the adoption of similar approaches within their workflows.
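
As a rough illustration of the deep-learning direction mentioned above, the sketch below builds a small convolutional network for classifying aerial image tiles. The architecture, the 64x64 RGB tile size, the class labels and the Keras/TensorFlow tooling are all assumptions for illustration, not the OS production model.

```python
# Minimal sketch of a CNN tile classifier for aerial imagery.
# Tile size, class list and architecture are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 4  # e.g. building, road, vegetation, water (hypothetical labels)

model = tf.keras.Sequential([
    layers.Input(shape=(64, 64, 3)),             # RGB image tile
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_tiles, train_labels, epochs=10, validation_split=0.2)
```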

Author(s):  
Nikifor Ostanin

The coastal zone of the eastern Gulf of Finland is subject to significant natural and anthropogenic impacts, with abrasion and accumulation being the predominant processes. Because some coastal protection structures are old and ruined, monitoring and coastal management remain pressing problems. Remotely sensed data are an important component of the geospatial information needed for coastal environment research, and the rapid development of modern satellite remote sensing techniques and data processing algorithms has made such data essential for monitoring and management. The multispectral imagers of modern high-resolution satellites enable advanced image processing, such as relative water depth estimation, sea-bottom classification, and detection of changes in shallow-water environments. Within the framework of the project to develop a new coast protection plan for the Kurortny District of St. Petersburg, a series of archival and modern satellite images were collected and analyzed. As a result, several schemes of the underwater parts of the coastal zone and schemes of relative bathymetry for the key areas were produced. Comparative analysis of the multi-temporal images allows trends of environmental change in the study areas to be revealed. This information, compared with field observations, shows that remotely sensed data are useful and efficient for geospatial planning and the development of a new coast protection scheme.
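
For context, relative water depth is often derived from multispectral imagery with a band-ratio transform (e.g. the Stumpf log-ratio). The abstract does not specify the method used, so the sketch below is only an assumed illustration; the band choice, the constant n and the synthetic reflectances are all hypothetical.

```python
# Hedged sketch of a band-ratio relative-depth index from multispectral data.
# Output is relative (uncalibrated) depth; n is chosen to keep the logs positive.
import numpy as np

def relative_depth(blue, green, n=1000.0):
    """blue, green: water-leaving reflectance arrays over shallow water."""
    blue = np.clip(blue, 1e-6, None)
    green = np.clip(green, 1e-6, None)
    # Blue is attenuated faster with depth than green, so the ratio tracks depth.
    return np.log(n * blue) / np.log(n * green)

# Example with synthetic reflectance values
blue = np.random.uniform(0.01, 0.10, size=(100, 100))
green = np.random.uniform(0.01, 0.10, size=(100, 100))
depth_index = relative_depth(blue, green)
```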


2019 ◽  
Vol 11 (3) ◽  
pp. 284 ◽  
Author(s):  
Linglin Zeng ◽  
Shun Hu ◽  
Daxiang Xiang ◽  
Xiang Zhang ◽  
Deren Li ◽  
...  

Soil moisture mapping at a regional scale is commonplace since these data are required in many applications, such as hydrological and agricultural analyses. The use of remotely sensed data for the estimation of deep soil moisture at a regional scale has received far less emphasis. The objective of this study was to map the 500-m, 8-day average and daily soil moisture at different soil depths in Oklahoma from remotely sensed and ground-measured data using the random forest (RF) method, a machine-learning approach. In order to investigate the estimation accuracy of the RF method at both a spatial and a temporal scale, two independent soil moisture estimation experiments were conducted using data from 2010 to 2014: a year-to-year experiment (with a root mean square error (RMSE) ranging from 0.038 to 0.050 m3/m3) and a station-to-station experiment (with an RMSE ranging from 0.044 to 0.057 m3/m3). The data requirements, importance factors, and spatial and temporal variations in estimation accuracy were then discussed based on the results, using training data selected by iterated random sampling. The highly accurate estimations of both the surface and the deep soil moisture for the study area reveal the potential of RF methods for mapping soil moisture at a regional scale, especially given the high heterogeneity of land-cover types and topography in the study area.
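
A minimal sketch of the RF regression workflow described above, using scikit-learn with synthetic predictors standing in for the remotely sensed and ground-measured inputs; the feature names and data are assumptions, not the study's.

```python
# Random forest regression of soil moisture with RMSE scoring (illustrative data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
X = rng.random((n, 4))          # e.g. NDVI, land-surface temperature, albedo, precipitation
y = 0.05 + 0.3 * X[:, 0] + 0.02 * rng.standard_normal(n)   # soil moisture (m3/m3)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)

rmse = np.sqrt(mean_squared_error(y_test, rf.predict(X_test)))
print(f"RMSE: {rmse:.3f} m3/m3")        # the study reports 0.038-0.057 m3/m3
print("feature importances:", rf.feature_importances_)
```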


2021 ◽  
Vol 13 (3) ◽  
pp. 368
Author(s):  
Christopher A. Ramezan ◽  
Timothy A. Warner ◽  
Aaron E. Maxwell ◽  
Bradley S. Price

The size of the training data set is a major determinant of classification accuracy. Nevertheless, the collection of a large training data set for supervised classifiers can be a challenge, especially for studies covering a large area, which may be typical of many real-world applied projects. This work investigates how variations in training set size, ranging from a large sample size (n = 10,000) to a very small sample size (n = 40), affect the performance of six supervised machine-learning algorithms applied to classify large-area high-spatial-resolution (HR) (1–5 m) remotely sensed data within the context of a geographic object-based image analysis (GEOBIA) approach. GEOBIA, in which adjacent similar pixels are grouped into image-objects that form the unit of the classification, offers the potential benefit of allowing the use of multiple additional variables, such as measures of object geometry and texture, which increase the dimensionality of the classification input data. The six supervised machine-learning algorithms are support vector machines (SVM), random forests (RF), k-nearest neighbors (k-NN), single-layer perceptron neural networks (NEU), learning vector quantization (LVQ), and gradient-boosted trees (GBM). RF, the algorithm with the highest overall accuracy, was notable for its negligible decrease in overall accuracy, 1.0%, when the training sample size decreased from 10,000 to 315 samples. GBM provided similar overall accuracy to RF; however, the algorithm was very expensive in terms of training time and computational resources, especially with large training sets. In contrast to RF and GBM, NEU and SVM were particularly sensitive to decreasing sample size, with NEU classifications generally producing overall accuracies that were on average slightly higher than SVM classifications for larger sample sizes, but lower than SVM for the smallest sample sizes. NEU, however, required a longer processing time. The k-NN classifier saw less of a drop in overall accuracy than NEU and SVM as training set size decreased; however, the overall accuracies of k-NN were typically less than those of the RF, NEU, and SVM classifiers. LVQ generally had the lowest overall accuracy of all six methods, but was relatively insensitive to sample size, down to the smallest sample sizes. Overall, due to its relatively high accuracy with small training sample sets, its minimal variation in overall accuracy between very large and small sample sets, and its relatively short processing time, RF was a good classifier for large-area land-cover classifications of HR remotely sensed data, especially when training data are scarce. However, as the performance of different supervised classifiers varies in response to training set size, investigating multiple classification algorithms is recommended to achieve optimal accuracy for a project.
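
The core experiment can be sketched as below: train several of the named classifiers on progressively smaller training subsets and compare overall accuracy. Synthetic features stand in for the GEOBIA image-object variables, and LVQ is omitted because it has no scikit-learn implementation; everything here is an illustrative assumption, not the authors' code.

```python
# Compare supervised classifiers across shrinking training-set sizes (synthetic data).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=12000, n_features=20, n_informative=10,
                           n_classes=5, random_state=0)
X_train_full, y_train_full = X[:10000], y[:10000]
X_test, y_test = X[10000:], y[10000:]

classifiers = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "GBM": GradientBoostingClassifier(n_estimators=50, random_state=0),
    "SVM": SVC(),
    "k-NN": KNeighborsClassifier(),
    "NEU": MLPClassifier(max_iter=1000, random_state=0),
}

for n_train in (10000, 1000, 315, 40):
    idx = np.random.default_rng(0).choice(10000, n_train, replace=False)
    for name, clf in classifiers.items():
        clf.fit(X_train_full[idx], y_train_full[idx])
        acc = accuracy_score(y_test, clf.predict(X_test))
        print(f"n={n_train:5d}  {name:5s}  overall accuracy={acc:.3f}")
```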


Author(s):  
Derya Yiltas-Kaplan

This chapter focuses on the application of machine learning to software-defined networks (SDNs) and their security mechanisms, with attention to the SDN architecture. Machine learning has been studied widely for traditional network problems, but only a limited number of studies in the literature connect SDN security and machine learning approaches. The main reason is that the SDN architecture is relatively new and differs from that of traditional networks; these structural differences are also summarized and compared in this chapter. After the main properties of the network architectures, several intrusion detection studies on SDN are introduced and analyzed according to their advantages and disadvantages. In this way, the chapter also aims to be the first organized guide that presents the referenced studies on SDN security and artificial intelligence together.
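
As a purely illustrative sketch (not taken from the chapter), an ML-based intrusion detector in an SDN setting might classify per-flow statistics exported by the controller as benign or malicious. The flow features and synthetic data below are assumptions.

```python
# Toy flow-statistics intrusion detector for an SDN controller (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# Hypothetical per-flow features: packet count, byte count, duration (s), distinct dst ports
benign = rng.normal([50, 4e4, 10, 2], [20, 1e4, 5, 1], size=(500, 4))
attack = rng.normal([500, 2e4, 1, 40], [100, 5e3, 0.5, 10], size=(500, 4))
X = np.vstack([benign, attack])
y = np.array([0] * 500 + [1] * 500)          # 0 = benign, 1 = malicious

clf = RandomForestClassifier(random_state=0).fit(X, y)
# A new flow reported by the controller: many short connections to many ports
print(clf.predict([[450, 1.8e4, 0.8, 35]]))  # expected: [1] (flagged as malicious)
```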


Author(s):  
Ram L. Ray ◽  
Maurizio Lazzari ◽  
Tolulope Olutimehin

Landslides are among the costliest and most fatal geological hazards, threatening and influencing socioeconomic conditions in many countries globally. Remote sensing approaches are widely used in landslide studies. Landslide threats can also be investigated through slope stability models, susceptibility mapping, hazard assessment, risk analysis, and other methods. Although it is possible to conduct landslide studies using in-situ observation, it is time-consuming, expensive, and sometimes challenging to collect data in inaccessible terrain. Remote sensing data can be used in landslide monitoring, mapping, hazard prediction and assessment, and other investigations. The primary goal of this chapter is to review existing remote sensing approaches and techniques used to study landslides and to explore potential remote sensing tools that can be used effectively in landslide studies in the future. This chapter also provides a critical and comprehensive review of landslide studies, focusing on the role played by remote sensing data and approaches in landslide hazard assessment. Further, the review discusses the application of remotely sensed products for landslide detection, mapping, prediction, and evaluation around the world. This systematic review may contribute to a better understanding of the extensive use of remotely sensed data and spatial analysis techniques in landslide studies at a range of scales.


2020 ◽  
Vol 12 (24) ◽  
pp. 4139
Author(s):  
Ruirui Wang ◽  
Wei Shi ◽  
Pinliang Dong

Nighttime light (NTL) on the surface of the Earth is an important indicator of the human transformation of the world. NTL remotely sensed data have been widely used in urban development, population estimation, economic activity, resource development, and other fields. With the increasing use of artificial lighting technology in agriculture, it has become possible to use NTL remote sensing data to monitor agricultural activities. In this study, National Polar-orbiting Partnership (NPP)-Visible Infrared Imaging Radiometer Suite (VIIRS) NTL remote sensing data were used to observe the seasonal variation of artificial lighting in dragon fruit cropland in Binh Thuan Province, Vietnam. When compared with district-level statistics on the planted area, productive area, and production of dragon fruit in the Statistical Yearbook of Binh Thuan Province 2018, the mean and standard deviation of NTL brightness show significant positive correlations with the statistical data. The results suggest that NTL remotely sensed data could be used to accurately reveal certain agricultural activities, such as dragon fruit production, by monitoring seasonal artificial lighting. This research demonstrates the application potential of NTL remotely sensed data in agriculture.
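
A hedged sketch of the described analysis: summarize NTL brightness per district (mean and standard deviation) and correlate it with yearbook statistics. The district values below are synthetic placeholders, not the study's data.

```python
# Per-district NTL brightness statistics correlated with yearbook values (synthetic).
import numpy as np
from scipy.stats import pearsonr

# Hypothetical NTL radiance pixels grouped by district
districts = {
    "A": np.array([12.1, 15.3, 9.8, 20.4]),
    "B": np.array([30.2, 28.7, 35.1, 40.0]),
    "C": np.array([5.2, 6.1, 4.8, 7.0]),
}
planted_area = np.array([1200.0, 3100.0, 450.0])   # hypothetical yearbook values (ha)

ntl_mean = np.array([v.mean() for v in districts.values()])
ntl_std = np.array([v.std(ddof=1) for v in districts.values()])

r_mean, p_mean = pearsonr(ntl_mean, planted_area)
r_std, p_std = pearsonr(ntl_std, planted_area)
print(f"mean NTL vs planted area: r={r_mean:.2f}, p={p_mean:.3f}")
print(f"std  NTL vs planted area: r={r_std:.2f}, p={p_std:.3f}")
```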


Author(s):  
Yung Ming ◽  
Lily Yuan

Machine learning (ML) and artificial intelligence (AI) methods are transforming many commercial and academic areas, including feature extraction, autonomous driving, computational linguistics, and voice recognition. These new technologies are now having a significant impact on radiography, forensics, and many other areas where the availability of automated systems may improve the precision and repeatability of essential tasks. In this systematic review, we begin by providing a short overview of the different methods currently being developed, with a particular emphasis on those utilized in biomedical studies.


2020 ◽  
Vol 12 (20) ◽  
pp. 3338
Author(s):  
Rami Al-Ruzouq ◽  
Mohamed Barakat A. Gibril ◽  
Abdallah Shanableh ◽  
Abubakir Kais ◽  
Osman Hamed ◽  
...  

Remote sensing technologies and machine learning (ML) algorithms play an increasingly important role in the accurate detection and monitoring of oil spill slicks, assisting scientists in forecasting their trajectories, developing clean-up plans, taking timely and urgent actions, and applying effective treatments to contain and alleviate adverse effects. A review and analysis of different sources of remotely sensed data and the various components of ML classification systems for oil spill detection and monitoring are presented in this study. More than 100 publications in the field of oil spill remote sensing, published in the past 10 years, are reviewed in this paper. The first part of this review discusses the strengths and weaknesses of different sources of remotely sensed data used for oil spill detection. The necessary preprocessing and preparation of data for developing classification models are then highlighted. Feature extraction, feature selection, and widely used handcrafted features for oil spill detection are subsequently introduced and analyzed. The second part of this review explains the use and capabilities of different classical and state-of-the-art ML techniques for oil spill detection. Finally, an in-depth discussion of the limitations, open challenges, and considerations of oil spill classification systems using remote sensing and state-of-the-art ML algorithms is presented, along with conclusions and insights into future directions.
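
A minimal sketch of the classical pipeline the review surveys: handcrafted features computed for dark patches segmented from SAR imagery, then a binary classifier separating oil slicks from look-alikes. The feature set and synthetic data are assumptions, not drawn from any reviewed study.

```python
# Handcrafted-feature oil slick vs look-alike classification (synthetic illustration).
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
# Hypothetical patch features: area (km2), perimeter-to-area ratio,
# mean backscatter contrast (dB), border gradient
slicks = rng.normal([4.0, 0.8, -8.0, 2.5], [1.5, 0.2, 2.0, 0.8], size=(300, 4))
lookalikes = rng.normal([6.0, 1.5, -4.0, 1.0], [2.0, 0.4, 1.5, 0.5], size=(300, 4))
X = np.vstack([slicks, lookalikes])
y = np.array([1] * 300 + [0] * 300)          # 1 = oil slick, 0 = look-alike

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X, y)
print(model.predict([[3.5, 0.7, -9.0, 2.8]]))  # expected: [1]
```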


Acta Tropica ◽  
2018 ◽  
Vol 185 ◽  
pp. 167-175 ◽  
Author(s):  
Juan M. Scavuzzo ◽  
Francisco Trucco ◽  
Manuel Espinosa ◽  
Carolina B. Tauro ◽  
Marcelo Abril ◽  
...  

1973 ◽  
Vol 1973 (1) ◽  
pp. 117-125
Author(s):  
J. E. Estes ◽  
P. G. Mikolaj ◽  
R. R. Thaman ◽  
L. W. Senger

ABSTRACT The detection, measurement, and monitoring of oil pollution in the marine environment are receiving increased attention owing to: 1) the growing incidence of oil spills; 2) the associated need for improved cleanup procedures; and 3) the need for more effective surveillance systems capable of gathering legal evidence for the prosecution of violators. The Geography Remote Sensing Unit and the Department of Chemical and Nuclear Engineering at the University of California, Santa Barbara have for two and a half years been conducting experiments on the application of remotely sensed data to these problem areas. As part of a United States Coast Guard test of a high-seas oil containment device, a system for estimating the volume of oil loss resulting from oil pollution incidents was developed. This system involved coordinating remote sensing data acquisition with the simultaneous collection of surface sampling data. Results indicate that remotely sensed data, when effectively correlated with surface sampling data, can provide a basis for volumetric estimation of a given oil slick. Refinement of these techniques can lead to more efficient, real-time, day/night operational monitoring of marine oil pollution incidents.
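
A minimal worked example of the kind of volumetric estimate described: multiply the slick area delineated from imagery by an average film thickness derived from surface sampling. All values below are illustrative assumptions, not figures from the paper.

```python
# Volumetric estimate from slick area (remote sensing) and film thickness (surface sampling).
slick_area_m2 = 2.5e6            # slick area delineated from imagery (m^2), hypothetical
mean_thickness_m = 1.0e-6        # average film thickness from surface samples (m), hypothetical

volume_m3 = slick_area_m2 * mean_thickness_m
print(f"Estimated oil volume: {volume_m3:.2f} m^3 (~{volume_m3 * 1000:.0f} litres)")
```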

