Classification of Video Observation Data for Volcanic Activity Monitoring Using Computer Vision and Modern Neural Networks (on Klyuchevskoy Volcano Example)

2021 ◽  
Vol 13 (23) ◽  
pp. 4747
Author(s):  
Sergey Korolev ◽  
Aleksei Sorokin ◽  
Igor Urmanov ◽  
Aleksandr Kamaev ◽  
Olga Girina

Currently, video observation systems are actively used for volcano activity monitoring. Video cameras allow us to remotely assess the state of a dangerous natural object and, where technical capabilities permit, to detect thermal anomalies. However, the continuous use of visible-band cameras instead of specialized tools (for example, thermal cameras) produces a large number of images that require special algorithms both to pre-filter images in which the area of interest is hidden by weather or illumination conditions, and to detect volcanic activity. Existing algorithms analyze preselected regions of interest in the frame. Such a region may be changed occasionally to observe events in a specific area of the volcano, and it is difficult to set it in advance and keep it up to date, especially for an observation network with multiple cameras. The accumulated multi-year archives of images with documented eruptions allow us to apply modern deep learning technologies to whole-frame analysis to solve this task. The article presents the development of algorithms to classify volcano images produced by video observation systems. The focus is on algorithms that create a labelled dataset from an unstructured archive, using both existing techniques and techniques proposed by the authors. The developed solution was tested on the archive of the video observation system for the volcanoes of Kamchatka, in particular the observation data for the Klyuchevskoy volcano. The tests show the high efficiency of convolutional neural networks in volcano image classification, with a classification accuracy of 91%. The resulting dataset of 15,000 images, labelled into three classes of scenes, is the first of its kind for Kamchatka volcanoes. It can be used to develop monitoring systems for other stratovolcanoes that occupy most of the video frame.
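The pre-filtering step described above (discarding frames in which weather or illumination hides the volcano) can be approximated with very simple image statistics. The sketch below is an illustrative stand-in, not the authors' algorithm: it flags frames that are too dark, too bright, or too low-contrast (fog, cloud, night) using global brightness and contrast thresholds, which are assumptions chosen here for demonstration.

```python
import numpy as np

def frame_usable(gray, contrast_thresh=0.08, brightness_range=(0.05, 0.95)):
    """Crude whole-frame quality check: frames obscured by fog, cloud,
    or darkness tend to have low global contrast or extreme brightness.
    `gray` is a 2-D array of pixel intensities scaled to [0, 1]."""
    mean = gray.mean()
    contrast = gray.std()
    return bool(brightness_range[0] < mean < brightness_range[1]
                and contrast > contrast_thresh)

# A flat grey "fog" frame is rejected; a frame with structure passes.
fog = np.full((64, 64), 0.6)
scene = np.tile(np.linspace(0.1, 0.9, 64), (64, 1))
print(frame_usable(fog), frame_usable(scene))  # False True
```

In practice such a filter would only be a cheap first pass before the convolutional classifier; the thresholds would have to be tuned per camera.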

Author(s):  
Michael Evans ◽  
Taylor Minich

We have an unprecedented ability to analyze and map the Earth's surface, as deep learning technologies are applied to an abundance of Earth observation systems collecting images of the planet daily. To realize the potential of these data to improve conservation outcomes, simple, free, and effective methods are needed that enable a wide variety of stakeholders to derive actionable insights from these tools. In this paper we demonstrate simple methods and workflows that use free, open computing resources to train well-studied convolutional neural networks and apply them to delineate objects of interest in publicly available Earth observation images. With limited training datasets (<1000 observations), we used Google Earth Engine and TensorFlow to process Sentinel-2 and National Agriculture Imagery Program data, and used these to train U-Net and DeepLab models that delineate ground-mounted solar arrays and parking lots in satellite imagery. The trained models achieved 81.5% intersection over union between predictions and ground-truth observations in validation images. These validation images were generated at different times and in different places from the training data, indicating the models' ability to generalize beyond the data on which they were trained. The two case studies we present illustrate how these methods can be used to inform and improve the development of renewable energy in a manner that is consistent with wildlife conservation.
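The 81.5% figure reported above is intersection over union (IoU), the standard metric for segmentation quality. For binary masks it reduces to a few lines; this is a generic sketch of the metric, not code from the authors' pipeline:

```python
import numpy as np

def iou(pred, truth):
    """Intersection over union for two binary masks (1 = object pixel)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return inter / union if union else 1.0

# Two partially overlapping 4x4 squares: 4 shared pixels, 28 in the union.
pred = np.zeros((10, 10)); pred[2:6, 2:6] = 1
truth = np.zeros((10, 10)); truth[4:8, 4:8] = 1
print(iou(pred, truth))  # 4/28 ≈ 0.143
```

Averaging this score over all validation tiles gives the kind of aggregate figure quoted in the abstract.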


Author(s):  
Basiyr D. Rodney ◽  
David Devraj Kumar ◽  
Andrew Binder

This chapter discusses the conceptualization and development of a methodological tool for conducting classroom research and teacher evaluations, with application to the analysis of the Trends in International Mathematics and Science Study (TIMSS) classroom data. The development process involved the creation of a structured video observation system (called the Synchronized Video Observation System, SIVOS) built on top of a database application. The concept integrates an on-screen video frame containing classroom-teaching episodes alongside a structured teaching evaluation rubric. The conceptualization and development of such an application leverages rapid application development techniques. The application is significant because it allows for the fine-grained and iterative analysis of classroom teaching episodes. It leverages the storing, searching, and retrieval capacity of a database application to code video segments with a structured observation tool. The tool offers an opportunity to enhance the fairness, accuracy, and transparency of teacher evaluations. The approach values a low-inference, low-learning-curve design and allows data to be quickly and easily analyzed. With such tools, teachers, researchers, and administrators have the ability to examine teaching behaviors for continuous improvement.


Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2538
Author(s):  
Shuang Zhang ◽  
Feng Liu ◽  
Yuang Huang ◽  
Xuedong Meng

The direct-sequence spread-spectrum (DSSS) technique has been widely used in secure wireless communications. In this technique, the baseband signal is spread over a wider bandwidth using pseudo-random sequences to avoid interference or interception. In this paper, the authors propose methods to adaptively detect DSSS signals based on knowledge-enhanced compressive measurements and artificial neural networks. Compared with a conventional non-compressive detection system, the compressive detection framework can achieve a reasonable balance between detection performance and sampling hardware cost. In contrast to existing compressive sampling techniques, the proposed methods are shown to enable adaptive measurement kernel design with high efficiency. Through theoretical analysis and simulation results, the proposed adaptive compressive detection methods are also demonstrated to provide significantly enhanced detection performance compared to counterparts that use conventional random measurement kernels.
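The spreading mechanism the abstract refers to is easy to illustrate at baseband. The sketch below shows classical DSSS spreading and correlation-based despreading only, not the paper's compressive detection method; the chip length and noise level are arbitrary assumptions for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pseudo-random spreading sequence of +/-1 chips (illustrative length).
chips_per_bit = 16
pn = rng.choice([-1, 1], size=chips_per_bit)

def spread(bits):
    """Map each data bit (+/-1) onto the PN sequence, widening bandwidth."""
    return np.concatenate([b * pn for b in bits])

def despread(signal):
    """Correlate each chip block with the PN sequence to recover bits."""
    blocks = signal.reshape(-1, chips_per_bit)
    return np.sign(blocks @ pn)

bits = np.array([1, -1, 1, 1, -1])
noisy = spread(bits) + rng.normal(scale=0.5, size=bits.size * chips_per_bit)
recovered = despread(noisy)
print(np.array_equal(recovered, bits))  # True
```

The correlation gain (here a factor of 16) is what lets the despreader pull the bits out of noise; an interceptor without the PN sequence sees only a noise-like wideband signal.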


2021 ◽  
Author(s):  
Tom Smith

Often in developing countries the spatial coverage of surface weather observations is sparse and the reliability of existing systems is lower than in other parts of the world. These gaps in the availability of observation data have significant negative consequences, locally and globally. For decades, international funds have been used to acquire meteorological infrastructure with little to no focus on life-cycle management. Furthermore, improvements in one part of the value chain are often not connected with further downstream services, meaning local benefits are generated with substantial delay, if at all.

DTN is one of the few organizations offering comprehensive solutions across the value chain, from the deployment and operation of observation systems through to weather analytics creating valuable insights for businesses, consumers, and governments across the globe. DTN not only project-manages the setup of weather observation systems but also maintains and operates measurement networks on different continents. A sensor-agnostic approach enables us to offer the right sensor solution for each situation.

We see an opportunity to correct the mistakes of the past by changing the focus from acquiring observation systems to life-cycle management, ensuring the systems are maintained and leveraged effectively to provide forecasts and warnings for the protection of life and property, and enabling NMSs to focus on fulfilling their mission.

Funding organizations such as the World Bank must change the focus from hardware procurement to a performance-based PPE/P model that ensures the value of investments in infrastructure is realized. This sustainable approach will: ensure long-lasting partnerships; harness the innovation in the private sector; create local jobs maintaining infrastructure; and enable economic development through an improved ability to manage the impact of weather and climate events.


2020 ◽  
Vol 63 (10) ◽  
pp. 856-861
Author(s):  
A. V. Fedosov ◽  
G. V. Chumachenko

The article considers the monitoring of thermal conditions during alloy melting and casting at foundries. It is noted that the least reliable method is one in which both measuring and recording the temperature are assigned to the worker. On the other hand, a fully automatic approach is not always affordable for small foundries. In this regard, the expediency of an automated approach is shown, in which the measurement is assigned to the worker while the values are recorded automatically. This method involves an algorithm for the automatic classification of temperature measurements based on an end-to-end array of data obtained in the production stream. The task is solved in three stages. In the first stage, raw data are prepared for classification. In the second stage, the measurements are classified using neural network principles. Analysis of the artificial neural network's results has shown its high efficiency and the close correspondence of its output with the actual situation at the work site. It was also noted that applying artificial neural network principles makes the classification process flexible, owing to the ability to easily supplement the process with new parameters and neurons. The final stage is the analysis of the obtained results. Correctly performed data classification provides an opportunity not only to assess compliance with technological discipline at the site, but also to improve the process of identifying the causes of casting defects. The proposed approach reduces the influence of the human factor in the analysis of the thermal conditions of alloy melting and casting, with minimal costs for melting monitoring.
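The second stage, classifying measurements on neural network principles, can be illustrated with the smallest possible network. The sketch below is not the authors' model: it trains a single sigmoid neuron by gradient descent to separate two hypothetical process stages (the temperature ranges and class names are invented for the illustration).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical labelled readings: class 0 = pouring-range temperatures,
# class 1 = melt-range temperatures (synthetic numbers, °C).
temps = np.concatenate([rng.normal(1350, 15, 200), rng.normal(1550, 15, 200)])
labels = np.concatenate([np.zeros(200), np.ones(200)])

# Stage 1 of the pipeline: prepare the raw data (normalize the feature).
x = (temps - temps.mean()) / temps.std()

# Stage 2: a single-neuron network trained with cross-entropy gradients.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(w * x + b)))   # sigmoid activation
    grad = p - labels                         # cross-entropy gradient
    w -= lr * (grad * x).mean()
    b -= lr * grad.mean()

accuracy = (((w * x + b) > 0) == labels).mean()
print(accuracy)
```

The flexibility the abstract mentions comes from the fact that adding a new input parameter or neuron only extends the weight arrays; the training loop is unchanged.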


2016 ◽  
Vol 47 (1) ◽  
pp. 275
Author(s):  
E. Kokinou ◽  
C. Panagiotakis ◽  
Th. Kinigopoulos

Image processing and understanding, followed by pattern recognition, comprise a valuable tool for the automatic extraction of information from digital topography. The aim of this work is the retrieval of areas with similar topography using digital elevation data. It can be applied to geomorphology, forestry, regional and urban planning, and many other applications for analyzing and managing natural resources. Specifically, the user selects the area of interest by navigating over a high-resolution elevation image and determines three parameters (step, number of local minima, and display scale). The regions with relief similar to the initial model are then determined. Experimental results show the high efficiency of the proposed scheme.
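One simple way to realize "retrieval of regions with similar relief" is template matching by normalized cross-correlation over the elevation grid. This is a generic sketch under that assumption, not the authors' scheme (which is parameterized by step and local minima):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two elevation patches."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return (a * b).mean()

def similar_patches(dem, template, thresh=0.9):
    """Scan a DEM and return top-left corners of patches whose relief
    correlates with the template above `thresh`."""
    h, w = template.shape
    hits = []
    for i in range(dem.shape[0] - h + 1):
        for j in range(dem.shape[1] - w + 1):
            if ncc(dem[i:i + h, j:j + w], template) > thresh:
                hits.append((i, j))
    return hits

# Query with a patch cut from the DEM itself: its location is retrieved.
rng = np.random.default_rng(2)
dem = rng.normal(size=(20, 20))
template = dem[5:9, 5:9].copy()
hits = similar_patches(dem, template)
print((5, 5) in hits)  # True
```

Because the correlation is normalized, patches with the same relief shape match even if their absolute elevations differ, which is the point of shape-based retrieval.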


Author(s):  
A. P. Sysoev ◽  

The substantiation of the parameters of a 3D observation system is considered from the perspective of Kirchhoff migration. At the first step of this transformation, on the basis of a diffraction transformation of a common-shot-point (CSP) gather, the problem of extracting the wavelets reflected from specified points of the medium (image points) is solved. The directivity characteristic of this transformation is determined by the parameters of the receiver arrangement. At the second step, summation is performed over common-image-point (CIP) gathers. The source density of the observation system determines the stacking fold by CIP. When selecting survey parameters, a comparative analysis of equivalent observation systems, i.e., systems with the same data properties for the migration task but different parameters, is of great importance. The relationship between the common-midpoint step of the observation system and the trace step of the resulting images of the medium is discussed. The Gaussian beam migration algorithm is considered as a method for constructing an image of the medium that correctly takes into account the irregularity of the initial data.


Algorithms ◽  
2020 ◽  
Vol 13 (3) ◽  
pp. 63 ◽  
Author(s):  
Krzysztof Ropiak ◽  
Piotr Artiemjew

The set of heuristics constituting deep learning methods has proved very efficient in complex artificial intelligence problems such as pattern recognition and speech recognition, solving them with better accuracy than previously applied methods. Our aim in this work has been to add the concept of the rough set to the repository of tools applied in deep learning, in the form of rough mereological granular computing. In our previous research we demonstrated the high efficiency of our decision system approximation techniques (creating granular reflections of systems), which, despite a large reduction in the size of the training systems, maintained the internal knowledge of the original data. The current research has led us to ask whether granular reflections of decision systems can be effectively learned by neural networks, and whether deep learning can extract the knowledge from the approximated decision systems. Our results show that granulated datasets perform well when mined by deep learning tools. We have performed exemplary experiments using data from the UCI repository; the PyTorch and TensorFlow libraries were used to build the neural networks and run the classification process. It turns out that deep learning methods work effectively on the reduced training sets. Approximating decision systems before neural network learning can thus be an important step toward learning in reasonable time.
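The core idea of a granular reflection can be sketched without the full rough mereological machinery. The toy version below (an assumption of this illustration, simpler than the paper's majority-voting granules) absorbs each object into an existing granule when its descriptors agree with the granule's representative on at least r attributes and the decisions match; otherwise it starts a new granule. The result is a smaller training set:

```python
# A minimal sketch of a granular reflection of a decision system:
# objects agreeing with a granule representative on >= r attributes
# (and on the decision) are absorbed, shrinking the training set.

def granular_reflection(rows, r):
    """rows: list of (attributes_tuple, decision); r: agreement radius."""
    reflection = []
    for attrs, dec in rows:
        for rep_attrs, rep_dec in reflection:
            agree = sum(a == b for a, b in zip(attrs, rep_attrs))
            if agree >= r and dec == rep_dec:
                break  # absorbed by an existing granule
        else:
            reflection.append((attrs, dec))  # becomes a new representative
    return reflection

data = [((1, 0, 1), 'yes'), ((1, 0, 0), 'yes'), ((1, 1, 1), 'yes'),
        ((0, 1, 0), 'no'),  ((0, 1, 1), 'no')]
reduced = granular_reflection(data, r=2)
print(len(reduced), len(data))  # 2 5
```

The reduced set (here two representatives instead of five rows) is what would then be fed to the neural network in place of the original decision system.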


2011 ◽  
Vol 16 (1) ◽  
pp. 73-81
Author(s):  
Gregory M. Benton ◽  
Bitapi C. Sinha

The first study of interpretation in India examined the effectiveness of interpretive facilities and exhibits in conveying conservation messages. Kanha Tiger Reserve features a large budget, advanced technology, and international visitation. A single-case, multiple-methods approach examined visitor knowledge and behavior regarding exhibits: pre- and post-program surveys, video observation of visitor flow through the interpretive center, and the readability of exhibit text were analyzed. Results from the survey indicate that visitor knowledge increased in spite of noise in the center. Video observation data suggest that visitor interest, measured by attention index and holding power, was greatest for the management-related exhibits and decreased as participants moved further into the interpretive center. Images of tigers were found to be more important for attraction and holding power than the center's advanced floor light panels and other interpretive techniques. Dioramas, maps, and models were favored by visitors over text for readability.


2018 ◽  
Vol 146 (11) ◽  
pp. 3885-3900 ◽  
Author(s):  
Stephan Rasp ◽  
Sebastian Lerch

Abstract Ensemble weather predictions require statistical postprocessing of systematic errors to obtain reliable and accurate probabilistic forecasts. Traditionally, this is accomplished with distributional regression models in which the parameters of a predictive distribution are estimated from a training period. We propose a flexible alternative based on neural networks that can incorporate nonlinear relationships between arbitrary predictor variables and forecast distribution parameters that are automatically learned in a data-driven way rather than requiring prespecified link functions. In a case study of 2-m temperature forecasts at surface stations in Germany, the neural network approach significantly outperforms benchmark postprocessing methods while being computationally more affordable. Key components of this improvement are the use of auxiliary predictor variables and of station-specific information with the help of embeddings. Furthermore, the trained neural network can be used to gain insight into the importance of meteorological variables, thereby challenging the notion of neural networks as uninterpretable black boxes. Our approach can easily be extended to other statistical postprocessing and forecasting problems. We anticipate that recent advances in deep learning, combined with the ever-increasing amounts of model and observation data, will transform the postprocessing of numerical weather forecasts in the coming decade.
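Networks of this kind are typically trained by minimizing the continuous ranked probability score (CRPS) of a Gaussian predictive distribution, for which a closed form exists. The sketch below shows that closed form as the scoring/loss function; the specific forecast numbers are illustrative, not from the paper:

```python
import math

def crps_normal(mu, sigma, y):
    """CRPS of a Gaussian forecast N(mu, sigma^2) against observation y.
    Lower is better; the closed form is commonly used as the training
    loss in network-based distributional postprocessing."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

# A sharper forecast centred on the observation scores better (lower):
print(crps_normal(20.0, 1.0, 20.0) < crps_normal(20.0, 3.0, 20.0))  # True
```

Because the expression is differentiable in mu and sigma, the network outputs that parameterize the distribution can be trained end-to-end by gradient descent on this score.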

