Spatio-temporal video search using the object based video representation

Author(s):
Di Zhong, Shih-Fu Chang


Author(s):
Niels Svane, Troels Lange, Sara Egemose, Oliver Dalby, Aris Thomasberger, ...

Traditional monitoring (e.g., in-water surveys) of eelgrass meadows and perennial macroalgae in coastal areas is time- and labor-intensive, requires extensive equipment, and yields data with low temporal resolution. Further, diver and Remotely Operated Vehicle (ROV) surveys have a limited spatial extent, covering only small fractions of full systems. The inherent heterogeneity of eelgrass meadows and macroalgae assemblages in these coastal systems complicates interpolation and extrapolation of observations, so methods that collect data at larger spatial scales while retaining high spatial resolution are required to guide management. Recently, Unoccupied Aerial Vehicles (UAVs) have gained popularity in the ecological sciences due to their ability to rapidly collect large amounts of area-based, georeferenced data, making it possible to monitor the spatial extent and status of submerged aquatic vegetation (SAV) communities with limited equipment requirements compared to ROV or diver surveys. This paper focuses on the added value of UAV-based data collection (visual/Red-Green-Blue imagery) combined with Object-Based Image Analysis for improving the understanding of eelgrass recovery. It is demonstrated that delineation and classification of two SAV species (Fucus vesiculosus and Zostera marina) is possible, with an error matrix indicating 86–92% accuracy. Classified maps also highlighted the increasing biomass and areal coverage of F. vesiculosus as a potential stressor to eelgrass meadows. Further, the authors derive a statistically significant conversion of percentage cover to biomass (R2 = 0.96 for Fucus vesiculosus, R2 = 0.89 for Zostera marina total biomass, and R2 = 0.94 for above-ground biomass (AGB) alone; p < 0.001). The results provide an example of mapping SAV cover and biomass, and a tool for spatio-temporal analyses that enhance the understanding of eelgrass ecosystem dynamics.
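To make the cover-to-biomass conversion concrete, the following is a minimal sketch (not the study's code or data) of fitting and applying a linear percentage-cover-to-biomass regression of the kind reported above; the sample values are hypothetical placeholders.

```python
# Hedged sketch: fit a percentage-cover -> biomass regression and apply it to
# mapped objects. The observations below are hypothetical, not study data.
import numpy as np
from scipy import stats

# Hypothetical quadrat observations: percent cover (0-100) and dry biomass (g m^-2)
cover = np.array([5, 12, 20, 35, 48, 60, 72, 85, 93])
biomass = np.array([8, 21, 35, 61, 80, 104, 121, 148, 160])

fit = stats.linregress(cover, biomass)
print(f"biomass ~= {fit.slope:.2f} * cover + {fit.intercept:.2f}")
print(f"R^2 = {fit.rvalue**2:.2f}, p = {fit.pvalue:.3g}")

# Apply the fitted model to classified map objects (percent cover per polygon)
object_cover = np.array([25.0, 55.0, 90.0])
predicted_biomass = fit.slope * object_cover + fit.intercept
print(predicted_biomass)
```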


2001, Vol 01 (03), pp. 507-526
Author(s):
Tong Lin, Hong-Jiang Zhang, Qing-Yun Shi

In this paper, we present a novel scheme for video content representation that explores spatio-temporal information. A pseudo-object-based shot representation containing richer semantics is proposed to measure shot similarity, and a force-competition approach is proposed to group shots into scenes based on content coherence between shots. Two color-object content descriptors, Dominant Color Histograms (DCH) and Spatial Structure Histograms (SSH), are introduced. To represent temporal content variations, a shot is segmented into several subshots of coherent content, and shot similarity is formulated as a subshot similarity measure that serves shot retrieval. With this shot representation, the scene structure can be extracted by analyzing the splitting and merging force competitions at each shot boundary. Experimental results on real-world sports video show that the proposed approach to video shot retrieval achieves the best performance on average recall (AR) and average normalized modified retrieval rank (ANMRR), and experiments on MPEG-7 test videos show promising results for the proposed scene extraction algorithm.
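As an illustration of the general idea only, the sketch below computes a histogram-intersection similarity between shots represented by per-subshot color histograms; the actual DCH/SSH descriptors and the force-competition scene grouping are defined in the paper, and the function names and aggregation choice here are assumptions for illustration.

```python
# Illustrative sketch (not the paper's descriptors): generic shot similarity
# built from subshot color-histogram comparisons.
import numpy as np

def histogram_intersection(h1: np.ndarray, h2: np.ndarray) -> float:
    """Similarity in [0, 1] between two L1-normalised histograms."""
    return float(np.minimum(h1, h2).sum())

def shot_similarity(subshots_a: list[np.ndarray], subshots_b: list[np.ndarray]) -> float:
    """For each subshot of shot A, take its best match in shot B, then average
    (one common way to lift subshot similarity to shot similarity)."""
    best_matches = [
        max(histogram_intersection(ha, hb) for hb in subshots_b)
        for ha in subshots_a
    ]
    return float(np.mean(best_matches))

# Toy usage with random 64-bin normalised histograms
rng = np.random.default_rng(0)
shot_a = [h / h.sum() for h in rng.random((3, 64))]
shot_b = [h / h.sum() for h in rng.random((2, 64))]
print(shot_similarity(shot_a, shot_b))
```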


Author(s):
Sotirios Chatzis, Anastasios Doulamis, Dimitrios Kosmopoulos, Theodora Varvarigou

2020, Vol 12 (22), pp. 3798
Author(s):
Lei Ma, Michael Schmitt, Xiaoxiang Zhu

Recently, time series of optical satellite data have been frequently used in object-based land-cover classification. This poses a significant challenge to object-based image analysis (OBIA) owing to the complex spatio-temporal information contained in time-series data. This study evaluates object-based land-cover classification in the northern suburbs of Munich using time series of optical Sentinel data. Using a random forest classifier as the backbone, experiments were designed to analyze the impact of the segmentation scale, the features (including spectral and temporal features), the categories, and the frequency and acquisition timing of the optical satellite images. Based on our analyses, the following findings are reported: (1) optical Sentinel images acquired over four seasons make a significant contribution to the classification of agricultural areas, even though this contribution varies between spectral bands for the same period; (2) the use of time-series data alleviates the issue of identifying the "optimal" segmentation scale. The findings of this study provide a more comprehensive understanding of the effects of classification uncertainty on object-based dense multi-temporal image classification.
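A minimal sketch of the kind of object-based time-series classification described above, assuming per-object spectral-temporal features have already been extracted; the feature matrix, class labels, and parameters below are hypothetical and do not reproduce the study's setup.

```python
# Hedged sketch: random forest on per-object spectral-temporal features.
# All data here are random placeholders, so scores are meaningless except as a demo.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_objects, n_dates, n_bands = 500, 12, 4          # e.g. 12 acquisitions, 4 bands
X = rng.random((n_objects, n_dates * n_bands))    # object x (date, band) feature matrix
y = rng.integers(0, 5, size=n_objects)            # 5 hypothetical land-cover classes

clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")

# Feature importances can be aggregated per acquisition date to see which
# periods contribute most, analogous to the seasonal analysis in the abstract.
clf.fit(X, y)
per_date_importance = clf.feature_importances_.reshape(n_dates, n_bands).sum(axis=1)
print(per_date_importance)
```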


2020, Vol 12 (13), pp. 2118
Author(s):
Bos Debusscher, Lisa Landuyt, Frieke Van Coillie

Insights into flood dynamics, rather than solely flood extent, are critical for effective flood disaster management, particularly in the context of emergency relief and damage assessment. Although flood dynamics provide insight into the spatio-temporal behaviour of a flood event, operational visualization tools are to date scarce or even non-existent. In this letter, we distil a flood dynamics map from a radar satellite image time series (SITS). To this end, we upscaled and refined an existing design, originally developed for a small area, that describes flood dynamics using an object-based approach and a graph-based representation. Two case studies demonstrate the operational value of this method by visualizing flood dynamics that are not visible on regular flood extent maps. Delineated water bodies are grouped into graphs according to their spatial overlap on consecutive timesteps. Differences in area and backscatter are used to quantify the amount of variation, resulting in a global variation map and a temporal profile for each water body that visually describes the evolution of the backscatter and of the number of polygons making up the water body. The upscaling process led us to apply a different water delineation approach and a different way of ensuring the minimum mapping unit, and to increase code efficiency. The framework delivers a new, straightforward, and efficient way of visualizing floods. The produced global variation maps can be applied in the context of data assimilation and disaster impact management.
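The sketch below illustrates the overlap-based grouping idea with hypothetical polygons, using shapely and networkx as assumed tooling (not necessarily the authors' pipeline): water bodies from consecutive timesteps are linked when they spatially overlap, and each connected component then traces one evolving water body.

```python
# Hedged sketch: link delineated water-body polygons across consecutive
# acquisitions by spatial overlap; connected components give temporal profiles.
# Polygons and the area-change metric are illustrative assumptions.
import networkx as nx
from shapely.geometry import box

# Water bodies per timestep: {timestep: [polygon, ...]}
timesteps = {
    0: [box(0, 0, 2, 2), box(5, 5, 6, 6)],
    1: [box(1, 1, 3, 3)],              # first body grew and shifted
    2: [box(1.5, 1.5, 2.5, 2.5)],      # then shrank
}

g = nx.Graph()
for t, polys in timesteps.items():
    for i, p in enumerate(polys):
        g.add_node((t, i), geom=p, area=p.area)

# Connect bodies in consecutive timesteps that spatially overlap
for t in sorted(timesteps)[:-1]:
    for i, p in enumerate(timesteps[t]):
        for j, q in enumerate(timesteps[t + 1]):
            if p.intersects(q):
                g.add_edge((t, i), (t + 1, j), area_change=q.area - p.area)

# Each connected component is the temporal profile of one water body
for component in nx.connected_components(g):
    nodes = sorted(component)
    print(nodes, [round(g.nodes[n]["area"], 2) for n in nodes])
```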

