SWARM: Adaptive Load Balancing in Distributed Streaming Systems for Big Spatial Data

2021 ◽  
Vol 7 (3) ◽  
pp. 1-43
Author(s):  
Anas Daghistani ◽  
Walid G. Aref ◽  
Arif Ghafoor ◽  
Ahmed R. Mahmood

The proliferation of GPS-enabled devices has led to the development of numerous location-based services. These services need to process massive amounts of streamed spatial data in real-time. The current scale of spatial data cannot be handled using centralized systems. This has led to the development of distributed spatial streaming systems. Existing systems use static spatial partitioning to distribute the workload. In contrast, real-time streamed spatial data follows non-uniform spatial distributions that continuously change over time. Distributed spatial streaming systems need to react to changes in the distribution of spatial data and queries. This article introduces SWARM, a lightweight adaptivity protocol that continuously monitors the data and query workloads across the distributed processes of the spatial data streaming system and redistributes and rebalances the workloads as soon as performance bottlenecks are detected. SWARM is able to handle multiple query-execution and data-persistence models. A distributed streaming system can directly use SWARM to adaptively rebalance the system's workload among its machines with minimal changes to the original code of the underlying spatial application. Extensive experimental evaluation using real and synthetic datasets illustrates that, on average, SWARM achieves a 2x improvement in throughput over a static grid partitioning that is determined based on observing a limited history of the data and query workloads. Moreover, SWARM reduces execution latency by 4x on average compared with the static-partitioning technique.
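The abstract's monitor-then-rebalance loop can be illustrated with a minimal sketch. This is not SWARM's actual protocol (which monitors distributed processes and repartitions spatial workloads); it is a hypothetical simplification in which per-worker load counters trigger a migration plan from the hottest worker to the coolest one once the load imbalance crosses a threshold.

```python
class LoadMonitor:
    """Toy per-worker load monitor (a simplification, not SWARM's algorithm).

    Tracks how many tuples each worker has processed and flags a rebalance
    when the hottest worker's load exceeds `threshold` times the mean load.
    """

    def __init__(self, workers, threshold=2.0):
        self.load = {w: 0 for w in workers}
        self.threshold = threshold  # max/mean load ratio that triggers rebalancing

    def record(self, worker, tuples):
        self.load[worker] += tuples

    def needs_rebalance(self):
        mean = sum(self.load.values()) / len(self.load)
        return mean > 0 and max(self.load.values()) / mean > self.threshold

    def plan_migration(self):
        # Shift work from the hottest worker toward the coolest one.
        hot = max(self.load, key=self.load.get)
        cold = min(self.load, key=self.load.get)
        return hot, cold


# Skewed workload: one worker receives most of the stream.
monitor = LoadMonitor(["w1", "w2", "w3"])
monitor.record("w1", 900)
monitor.record("w2", 50)
monitor.record("w3", 50)
```

A real system would then migrate spatial partitions (not abstract "load units") from `hot` to `cold`, which is where SWARM's lightweight protocol does the actual work.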

2020 ◽  
Vol 6 (1) ◽  
Author(s):  
Svenja Ipsen ◽  
Sven Böttger ◽  
Holger Schwegmann ◽  
Floris Ernst

Ultrasound (US) imaging, in contrast to other image guidance techniques, offers the distinct advantage of providing volumetric image data in real-time (4D) without using ionizing radiation. The goal of this study was to perform the first quantitative comparison of three different 4D US systems with fast matrix array probes and real-time data streaming regarding their target tracking accuracy and system latency. Sinusoidal motion of varying amplitudes and frequencies was used to simulate breathing motion with a robotic arm and a static US phantom. US volumes and robot positions were acquired online and stored for retrospective analysis. A template matching approach was used for target localization in the US data. Target motion measured in US was compared to the reference trajectory performed by the robot to determine localization accuracy and system latency. Using the robotic setup, all investigated 4D US systems could detect a moving target with sub-millimeter accuracy. However, especially high system latency increased tracking errors substantially and should be compensated with prediction algorithms for respiratory motion compensation.
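The template matching used for target localization can be sketched in one dimension with normalized cross-correlation; the study operates on volumetric US data, so this 1-D version over a simulated sinusoidal "breathing" trace is only an illustration of the principle, and all names here are assumptions.

```python
import numpy as np

def match_template_1d(signal, template):
    """Locate `template` in `signal` by normalized cross-correlation.

    A 1-D sketch of template matching; the study applies the same idea
    to 3-D ultrasound volumes. Returns the index of the best match.
    """
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    best_score, best_idx = -np.inf, 0
    for i in range(len(signal) - n + 1):
        w = signal[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        score = float(np.dot(w, t)) / n  # in [-1, 1]; 1 means identical shape
        if score > best_score:
            best_score, best_idx = score, i
    return best_idx


# Sinusoidal motion trace standing in for breathing; the template is a
# window cut from the trace, so matching should recover its position.
x = np.sin(np.linspace(0, 4 * np.pi, 200))
template = x[120:140]
found = match_template_1d(x, template)
```

Comparing the sequence of matched positions against the known reference trajectory, as the study does with the robot, is what yields the localization error and, via the temporal offset, the system latency.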


Author(s):  
Gayathri Nadarajan ◽  
Cheng-Lin Yang ◽  
Yun-Heh Chen-Burger ◽  
Yu-Jung Cheng ◽  
Sun-In Lin ◽  
...  

2017 ◽  
Vol 43 (4) ◽  
pp. 142-146 ◽  
Author(s):  
Ugo FALCHI

The goal of this paper is to provide a brief summary of the status of geographic information in Italy in light of technological developments and national regulations. The acquisition, processing, and sharing of spatial data have accelerated significantly thanks to advances in computer technology and the recognized need to standardize and homogenize the information held by public authorities and private individuals. Spatial data represents essential knowledge for the management and development of a territory, both for planning and for safety and environmental prevention. Italy holds an enormous heritage of spatial information that has historically suffered from problems of consistency and uniformity, which often make it contradictory when used by public decision-makers and private parties. The recent history of geographic information is characterized by a significant effort to optimize this decisive technical and cultural heritage, making it available to all citizens in a logic of sharing and re-use so that it may finally represent a common good available to all.


2012 ◽  
Vol 39 (9) ◽  
pp. 1072-1082 ◽  
Author(s):  
Ali Montaser ◽  
Ibrahim Bakry ◽  
Adel Alshibani ◽  
Osama Moselhi

This paper presents an automated method for estimating the productivity of earthmoving operations in near-real-time. The developed method utilizes the Global Positioning System (GPS) and Google Earth to extract the data needed to perform the estimation process. A GPS device is mounted on a hauling unit to capture the spatial data along designated hauling roads for the project. The variations in the captured cycle times were used to model the uncertainty associated with the operation involved. This was carried out by automated classification, data fitting, and computer simulation. The automated classification is applied through a spreadsheet application that classifies GPS data and identifies, accordingly, the durations of the different activities in each cycle using the spatial coordinates and directions captured by GPS and recorded on its receiver. The data fitting was carried out using commercially available software to generate the probability distribution functions used in the simulation software "Extend V.6". The simulation was utilized to balance the production of an excavator with that of the hauling units. A spreadsheet application was developed to perform the calculations. An example of an actual project was analyzed to demonstrate the use of the developed method and to illustrate its essential features. The analyzed case study demonstrates how the proposed method can assist project managers in taking corrective actions based on the near-real-time actual data captured and processed to estimate the productivity of the operations involved.
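The classification step, assigning each GPS fix to an activity and deriving cycle durations from zone entries, can be sketched as follows. The paper uses a spreadsheet with coordinates and headings; this sketch substitutes simple proximity to hypothetical loading and dumping zone centres, so the zone coordinates, radius, and labels are all illustrative assumptions.

```python
from math import hypot

# Hypothetical zone centres (x, y in metres); the actual method works on
# GPS coordinates and travel directions recorded along the hauling road.
ZONES = {"loading": (0.0, 0.0), "dumping": (1000.0, 0.0)}
RADIUS = 50.0  # assumed zone radius in metres

def classify(point):
    """Label a GPS fix as 'loading', 'dumping', or 'travel' by proximity."""
    for name, (zx, zy) in ZONES.items():
        if hypot(point[0] - zx, point[1] - zy) <= RADIUS:
            return name
    return "travel"

def cycle_durations(track):
    """track: list of (t_seconds, x, y) fixes in time order.

    Returns the durations of completed haul cycles, measured between
    successive entries of the hauling unit into the loading zone.
    """
    entries, prev = [], None
    for t, x, y in track:
        label = classify((x, y))
        if label == "loading" and prev != "loading":
            entries.append(t)
        prev = label
    return [b - a for a, b in zip(entries, entries[1:])]


# Two complete load-haul-dump-return cycles of 400 s each.
track = [(0, 0, 0), (100, 500, 0), (200, 1000, 0), (300, 500, 0),
         (400, 0, 0), (500, 500, 0), (600, 1000, 0), (700, 500, 0),
         (800, 0, 0)]
durations = cycle_durations(track)
```

Fitting probability distributions to such duration samples is the next step the paper describes, feeding the simulation that balances excavator and hauler production.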


2018 ◽  
Vol 7 (12) ◽  
pp. 467 ◽  
Author(s):  
Mengyu Ma ◽  
Ye Wu ◽  
Wenze Luo ◽  
Luo Chen ◽  
Jun Li ◽  
...  

Buffer analysis, a fundamental function in a geographic information system (GIS), identifies the areas surrounding geographic features within a given distance. Real-time buffer analysis for large-scale spatial data remains a challenging problem since the computational scales of conventional data-oriented methods expand rapidly with increasing data volume. In this paper, we introduce HiBuffer, a visualization-oriented model for real-time buffer analysis. An efficient buffer generation method is proposed which introduces spatial indexes and a corresponding query strategy. Buffer results are organized into a tile-pyramid structure to enable stepless zooming. Moreover, a fully optimized hybrid parallel processing architecture is proposed for the real-time buffer analysis of large-scale spatial data. Experiments using real-world datasets show that our approach can reduce computation time by up to several orders of magnitude while preserving superior visualization effects. Additional experiments were conducted to analyze the influence of spatial data density, buffer radius, and request rate on HiBuffer performance, and the results demonstrate the adaptability and stability of HiBuffer. The parallel scalability of HiBuffer was also tested, showing that HiBuffer achieves good parallel acceleration. Experimental results verify that HiBuffer is capable of handling 10-million-scale data.
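The core idea of index-plus-query buffer analysis, answering "is this location within the buffer radius of any feature?" through a spatial index rather than by constructing buffer geometry, can be sketched with a uniform grid over point features. The cell size, class names, and query strategy below are assumptions for illustration, not HiBuffer's actual index design.

```python
from collections import defaultdict
from math import hypot, floor

class GridIndex:
    """Uniform grid index over 2-D points.

    A minimal stand-in for the spatial indexes HiBuffer builds: inserting
    hashes each point into a cell, and a radius query only inspects the
    cells that could contain a hit, instead of scanning every feature.
    """

    def __init__(self, cell):
        self.cell = cell
        self.cells = defaultdict(list)

    def _key(self, x, y):
        return (floor(x / self.cell), floor(y / self.cell))

    def insert(self, p):
        self.cells[self._key(p[0], p[1])].append(p)

    def within(self, q, radius):
        """Return the indexed points within `radius` of query point `q`."""
        r = int(radius // self.cell) + 1
        cx, cy = self._key(q[0], q[1])
        hits = []
        for i in range(cx - r, cx + r + 1):
            for j in range(cy - r, cy + r + 1):
                for p in self.cells.get((i, j), ()):
                    if hypot(p[0] - q[0], p[1] - q[1]) <= radius:
                        hits.append(p)
        return hits


idx = GridIndex(cell=10.0)
for p in [(1.0, 1.0), (5.0, 5.0), (95.0, 95.0)]:
    idx.insert(p)
near = idx.within((0.0, 0.0), radius=8.0)  # only nearby cells are scanned
```

In a visualization-oriented model, such a query runs once per rendered pixel or tile, which is what makes the tile-pyramid organization and parallel architecture described above pay off.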


2010 ◽  
Vol 6 (1) ◽  
pp. 970868 ◽  
Author(s):  
G. W. Eidson ◽  
S. T. Esswein ◽  
J. B. Gemmill ◽  
J. O. Hallstrom ◽  
T. R. Howard ◽  
...  

Water resources are under unprecedented strain. The combined effects of population growth, climate change, and rural industrialization have led to greater demand for an increasingly scarce resource. Ensuring that communities have adequate access to water—an essential requirement for community health and prosperity—requires fine-grained management policies based on real-time in situ data, both environmental and hydrological. To address this requirement at the state level, we have developed the South Carolina Digital Watershed, an end-to-end system for monitoring water resources. In this paper, we describe the design and implementation of the core system components: (i) in situ sensing hardware, (ii) collection and uplink facilities, (iii) data streaming middleware, and (iv) back-end repository and presentation services. We conclude by discussing key organizational and technical challenges encountered during the development process.
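The four-tier path the abstract names, sensing, uplink, streaming middleware, back-end repository, can be sketched as a toy pipeline. The message schema, station identifier, and queue-based middleware below are all illustrative assumptions, not the South Carolina Digital Watershed's actual design.

```python
import json
import queue
from dataclasses import dataclass, asdict

@dataclass
class Reading:
    """One in-situ sensor observation (field names are illustrative,
    not the actual South Carolina Digital Watershed schema)."""
    station: str
    parameter: str   # e.g. a stage height or water-quality measurand
    value: float
    timestamp: float

# Toy stand-ins for tiers (ii)-(iv): uplink serializes readings into the
# streaming middleware; the back end drains the stream into a repository.
stream = queue.Queue()
repository = []

def uplink(reading):
    stream.put(json.dumps(asdict(reading)))  # serialize at the collection tier

def drain():
    while not stream.empty():
        repository.append(json.loads(stream.get()))  # persist at the back end

uplink(Reading("station-01", "stage_m", 1.27, 1.7e9))
drain()
```

Decoupling producers from the repository through middleware like this is what lets intermittently connected field hardware and always-on presentation services evolve independently.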

