Real-time Analysis of City Scale Transportation Networks in New Orleans Metropolitan Area using an Agent Based Model Approach

2019 ◽  
Vol 271 ◽  
pp. 06007
Author(s):  
Millard McElwee ◽  
Bingyu Zhao ◽  
Kenichi Soga

The primary focus of this research is to develop and implement an agent-based model (ABM) to analyze the New Orleans Metropolitan transportation network in near real time. ABMs have grown in popularity because of their ability to analyze multifaceted community-scale resilience with hundreds of thousands of links and millions of agents. Road closures and reductions in capacity are examples of influences on the weights or removal of edges, which can affect the travel time, speed, and route of agents in the transportation model. Recent advances in high-performance computing (HPC) have made modeling networks at the city scale much less computationally intensive. We introduce an open-source ABM which utilizes parallel distributed computing to enable faster convergence on large-scale problems. We simulate 50,000 agents on the entire southeastern Louisiana road network and part of Mississippi. This demonstrates the capability to simulate both city- and regional-scale transportation networks in near real time.
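The closure-as-edge-reweighting idea in the abstract can be sketched with a minimal shortest-path routing example. This is an illustrative toy, not the authors' model: the network, travel times, and closure penalty below are invented.

```python
import heapq

def dijkstra(graph, src, dst):
    """Shortest travel time from src to dst; graph: {node: [(nbr, minutes), ...]}."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# Toy road network: edge weights are free-flow travel times in minutes.
network = {
    "A": [("B", 5.0), ("C", 12.0)],
    "B": [("D", 6.0)],
    "C": [("D", 4.0)],
    "D": [],
}
baseline = dijkstra(network, "A", "D")   # A-B-D = 11 min

# A closure on B-D is modeled by inflating that edge's weight.
network["B"] = [("D", 60.0)]
rerouted = dijkstra(network, "A", "D")   # agents shift to A-C-D = 16 min
```

Each agent in such a model repeats this routing step against the current edge weights, which is why edge reweighting propagates into changed travel times and routes.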

Geosciences ◽  
2019 ◽  
Vol 9 (7) ◽  
pp. 317
Author(s):  
Jumadi Jumadi ◽  
Steve J. Carver ◽  
Duncan J. Quincey

Mass evacuation should be conducted when a disaster threatens at a regional scale. It is reported that 400,000 people were evacuated during the last eruption of Merapi Volcano in 2010. Such a large-scale evacuation can lead to chaos or congestion unless well managed. Staged evacuation has been investigated as a solution for reducing the degree of chaos during evacuation processes. However, there is limited understanding of how the stages should be ordered, in terms of which group should move first and which should follow. This paper develops an evacuation stage ordering based on the geographical characteristics of the people at risk and examines the ordering scenarios through an agent-based model of evacuation. We use several geographical features, such as proximity to the hazard, road network conditions (accessibility), size of the population, and demographics, as the parameters for ranking the order of each population unit in GIS. From this concept, we produced several ranking scenarios based on different weightings of the parameters. We applied the scenarios in an agent-based volcanic evacuation experiment to observe the results. Afterwards, the results were evaluated on their ability to reduce risk and spatio-temporal traffic density along road networks, compared to the result of simultaneous evacuation, to establish the relative effectiveness of each outcome. The results show that the staged scenario is better able to reduce potential traffic congestion during the peak time of the evacuation than the simultaneous strategy. However, the simultaneous strategy performs better regarding the speed of risk reduction. An evaluation of the relative performance of the four staged scenarios is also presented and discussed in this paper.
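The weighted-parameter ranking of population units described above can be sketched as follows. The unit names, parameter values, and weights are hypothetical; the abstract does not publish its actual weightings.

```python
# Each population unit scored on normalized geographic parameters in [0, 1];
# a higher composite score means an earlier evacuation stage.
units = {
    "village_A": {"hazard_proximity": 0.9, "accessibility": 0.3, "population": 0.5},
    "village_B": {"hazard_proximity": 0.4, "accessibility": 0.8, "population": 0.9},
    "village_C": {"hazard_proximity": 0.7, "accessibility": 0.5, "population": 0.2},
}

# One weighting scenario: prioritize proximity to the hazard.
weights = {"hazard_proximity": 0.6, "accessibility": 0.2, "population": 0.2}

def stage_order(units, weights):
    """Rank units by weighted sum of their parameters, highest score first."""
    score = lambda params: sum(weights[k] * params[k] for k in weights)
    return sorted(units, key=lambda u: score(units[u]), reverse=True)

order = stage_order(units, weights)
```

Different weight vectors produce different stage orders, which is exactly what the paper's scenarios then feed into the agent-based evacuation experiment.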


2021 ◽  
Vol 77 (2) ◽  
pp. 98-108
Author(s):  
R. M. Churchill ◽  
C. S. Chang ◽  
J. Choi ◽  
J. Wong ◽  
S. Klasky ◽  
...  

2021 ◽  
Vol 10 (2) ◽  
pp. 88
Author(s):  
Dana Kaziyeva ◽  
Martin Loidl ◽  
Gudrun Wallentin

Transport planning strategies regard cycling promotion as a suitable means for tackling problems connected with motorized traffic such as limited space, congestion, and pollution. However, the evidence base for optimizing cycling promotion is weak in most cases, and information on bicycle patterns at a sufficient resolution is largely lacking. In this paper, we propose agent-based modeling to simulate bicycle traffic flows at a regional scale for an entire day. The feasibility of the model is demonstrated in a use case in the Salzburg region, Austria. The simulation results in distinct spatio-temporal bicycle traffic patterns at high spatial (road segment) and temporal (minute) resolution. Scenario analysis positively assesses the model’s level of complexity, where the demographically parametrized behavior of cyclists outperforms stochastic null models. Validation with reference data from three sources shows a high correlation between simulated and observed bicycle traffic, where the predictive power is primarily related to the quality of the input and validation data. In conclusion, the implemented agent-based model successfully simulates the bicycle patterns of 186,000 inhabitants within a reasonable time. This spatially explicit approach to modeling individual mobility behavior opens new opportunities for evidence-based planning and decision making in the wide field of cycling promotion.
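The validation step, correlating simulated against observed traffic, can be sketched with a Pearson correlation over counts at one counting station. The hourly count values here are invented for illustration; they are not the paper's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical hourly bicycle counts at one counting station.
simulated = [12, 40, 95, 60, 30, 8]
observed  = [10, 45, 90, 70, 25, 5]
r = pearson_r(simulated, observed)   # close to 1.0 for well-matching series
```

A per-station r computed this way against several reference sources is one standard way to quantify the "high correlation" the abstract reports.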


Author(s):  
Lele Zhang ◽  
Jiangyan Huang ◽  
Zhiyuan Liu ◽  
Hai L. Vu

2018 ◽  
Vol 7 (12) ◽  
pp. 467 ◽  
Author(s):  
Mengyu Ma ◽  
Ye Wu ◽  
Wenze Luo ◽  
Luo Chen ◽  
Jun Li ◽  
...  

Buffer analysis, a fundamental function in a geographic information system (GIS), identifies areas by the surrounding geographic features within a given distance. Real-time buffer analysis for large-scale spatial data remains a challenging problem, since the computational cost of conventional data-oriented methods grows rapidly with increasing data volume. In this paper, we introduce HiBuffer, a visualization-oriented model for real-time buffer analysis. An efficient buffer generation method is proposed which introduces spatial indexes and a corresponding query strategy. Buffer results are organized into a tile-pyramid structure to enable stepless zooming. Moreover, a fully optimized hybrid parallel processing architecture is proposed for the real-time buffer analysis of large-scale spatial data. Experiments using real-world datasets show that our approach can reduce computation time by up to several orders of magnitude while preserving superior visualization effects. Additional experiments were conducted to analyze the influence of spatial data density, buffer radius, and request rate on HiBuffer performance, and the results demonstrate the adaptability and stability of HiBuffer. The parallel scalability of HiBuffer was also tested, showing that it achieves high parallel acceleration. Experimental results verify that HiBuffer is capable of handling 10-million-scale data.
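The role a spatial index plays in buffer queries can be sketched with a uniform grid: the index prunes most candidates, and an exact distance test runs only on the surviving cells. This is a generic illustration, not HiBuffer's actual index structure.

```python
from math import floor, hypot

class GridIndex:
    """Uniform-grid spatial index: cheap candidate filtering before exact distance tests."""

    def __init__(self, points, cell):
        self.cell = cell
        self.buckets = {}
        for p in points:
            key = (floor(p[0] / cell), floor(p[1] / cell))
            self.buckets.setdefault(key, []).append(p)

    def within(self, q, radius):
        """All indexed points within `radius` of query point q."""
        r_cells = int(radius / self.cell) + 1
        cx, cy = floor(q[0] / self.cell), floor(q[1] / self.cell)
        hits = []
        for i in range(cx - r_cells, cx + r_cells + 1):
            for j in range(cy - r_cells, cy + r_cells + 1):
                for p in self.buckets.get((i, j), []):   # exact test, pruned set only
                    if hypot(p[0] - q[0], p[1] - q[1]) <= radius:
                        hits.append(p)
        return hits

index = GridIndex([(0, 0), (3, 4), (10, 10)], cell=5.0)
near = index.within((0, 0), radius=5.0)   # (0,0) and (3,4); (10,10) is pruned
```

The tile-pyramid organization the abstract mentions applies the same idea hierarchically, so that zooming only touches tiles at the current level of detail.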


2014 ◽  
Vol 571-572 ◽  
pp. 497-501 ◽  
Author(s):  
Qi Lv ◽  
Wei Xie

Real-time log analysis on large-scale data is important for many applications; here, real-time refers to UI latency within 100 ms. Therefore, techniques which efficiently support real-time analysis over large log data sets are desired. MongoDB provides good query performance, an aggregation framework, and a distributed architecture, which make it suitable for real-time data query and massive log analysis. In this paper, a novel implementation approach for an event-driven file log analyzer is presented, and the performance of query, scan, and aggregation operations over MongoDB, HBase, and MySQL is compared. Our experimental results show that HBase performs the most balanced across all operations, while MongoDB provides query speeds under 10 ms in some operations, which is most suitable for real-time applications.
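The kind of aggregation the comparison covers can be illustrated with a MongoDB-style group-and-count, here reproduced over an in-memory list so it runs without a database server. The log fields are hypothetical; the real pipeline would go through `pymongo`'s `aggregate()`.

```python
from collections import Counter

# Log records as an aggregation pipeline would see them (hypothetical fields).
logs = [
    {"level": "ERROR", "latency_ms": 120},
    {"level": "INFO",  "latency_ms": 8},
    {"level": "ERROR", "latency_ms": 95},
    {"level": "WARN",  "latency_ms": 30},
]

# In-memory equivalent of the MongoDB pipeline:
#   db.logs.aggregate([{"$group": {"_id": "$level", "count": {"$sum": 1}}}])
counts = Counter(rec["level"] for rec in logs)
```

Benchmarking this `$group`-style aggregation, alongside point queries and full scans, is what distinguishes the three stores' performance profiles in the paper.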


Author(s):  
Manudul Pahansen de Alwis ◽  
Karl Garme

The stochastic environmental conditions, together with craft design and operational characteristics, make it difficult to predict the vibration environments aboard high-performance marine craft, particularly the risk of impact acceleration events and the shock component of the exposure, which are often associated with structural failure and human injuries. The different timescales and magnitudes involved complicate the real-time analysis of vibration and shock conditions aboard these craft. The article introduces a new measure, the severity index, indicating the risk of severe impact acceleration, and proposes a method for real-time feedback on the severity of impact exposure together with accumulated vibration exposure. The method analyzes the immediately preceding 60 s of vibration exposure history and computes the severity of the present impact exposure based on the severity index. The severity index probes the characteristics of the present acceleration stochastic process, that is, the risk of an upcoming heavy impact, and serves as an alert to the crew. The accumulated vibration exposure, important for mapping and logging the crew exposure, is determined by the ISO 2631:1997 vibration dose value. The severity due to the impact and accumulated vibration exposure is communicated to the crew every second as a color-coded indicator: green, yellow and red, representing low, medium and high, based on defined impact and dose limits. The severity index and feedback method are developed and validated on a data set of 27 three-hour simulations of a planing craft in irregular waves, and verified for feasibility in real-world applications with full-scale acceleration data recorded aboard high-speed planing craft in operation.
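The accumulated-exposure part of the method uses the ISO 2631-1 vibration dose value, VDV = (∫ a(t)⁴ dt)^(1/4), computed over frequency-weighted acceleration. A minimal discrete-time sketch (the 16 s constant-acceleration signal is a made-up check case, not the paper's data):

```python
def vibration_dose_value(accels, dt):
    """Vibration dose value per ISO 2631-1: VDV = (integral of a(t)^4 dt)^(1/4).

    accels: frequency-weighted acceleration samples in m/s^2
    dt: sampling interval in seconds
    Returns VDV in m/s^1.75.
    """
    return (sum(a ** 4 for a in accels) * dt) ** 0.25

# Check case: constant 2 m/s^2 over 16 s at 1 Hz sampling
# gives (2^4 * 16)^(1/4) = 256^(1/4) = 4.0 m/s^1.75.
vdv = vibration_dose_value([2.0] * 16, dt=1.0)
```

Because of the fourth power, VDV weights rare high-magnitude impacts far more heavily than sustained low-level vibration, which is why it complements the impact-oriented severity index.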


Sensors ◽  
2019 ◽  
Vol 19 (10) ◽  
pp. 2229 ◽  
Author(s):  
Sen Zhang ◽  
Yong Yao ◽  
Jie Hu ◽  
Yong Zhao ◽  
Shaobo Li ◽  
...  

Traffic congestion prediction is critical for implementing intelligent transportation systems that improve the efficiency and capacity of transportation networks. However, despite its importance, traffic congestion prediction is far less investigated than traffic flow prediction, partly due to the lack of large-scale, high-quality traffic congestion data and suitable algorithms. This paper proposes an accessible and general workflow to acquire large-scale traffic congestion data and to create traffic congestion datasets based on image analysis. With this workflow we create a dataset named Seattle Area Traffic Congestion Status (SATCS), based on traffic congestion map snapshots from a publicly available online traffic service provider, the Washington State Department of Transportation. We then propose a deep autoencoder-based neural network model with symmetrical layers for the encoder and the decoder to learn the temporal correlations of a transportation network and to predict traffic congestion. Our experimental results on the SATCS dataset show that the proposed DCPN model can efficiently and effectively learn the temporal relationships of congestion levels in the transportation network for traffic congestion forecasting. Our method outperforms two other state-of-the-art neural network models in prediction performance, generalization capability, and computational efficiency.
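The symmetric encoder/decoder structure can be sketched as a forward pass through mirrored layer widths. This is a generic illustration with random weights, not the paper's DCPN architecture or training procedure; the layer sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric layer widths: the decoder mirrors the encoder around the bottleneck.
sizes = [64, 32, 8, 32, 64]   # encoder: 64 -> 32 -> 8, decoder: 8 -> 32 -> 64
weights = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x):
    """Run a batch through the mirrored layers with a tanh nonlinearity."""
    for w in weights:
        x = np.tanh(x @ w)
    return x

x = rng.standard_normal((5, 64))   # batch of 5 congestion-level vectors
recon = forward(x)                 # reconstruction matches the input shape
```

Training would then minimize the reconstruction (or next-step prediction) error, so the bottleneck layer is forced to capture the network-wide temporal correlations the abstract refers to.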


2020 ◽  
Author(s):  
Markus Wiedemann ◽  
Bernhard S.A. Schuberth ◽  
Lorenzo Colli ◽  
Hans-Peter Bunge ◽  
Dieter Kranzlmüller

<p>Precise knowledge of the forces acting at the base of tectonic plates is of fundamental importance, but models of mantle dynamics are still often qualitative in nature to date. One particular problem is that we cannot access the deep interior of our planet and therefore cannot make direct in situ measurements of the relevant physical parameters. Fortunately, modern software and powerful high-performance computing infrastructures allow us to generate complex three-dimensional models of the time evolution of mantle flow through large-scale numerical simulations.</p><p>In this project, we aim to visualize the resulting convective patterns that occur thousands of kilometres below our feet and to make them "accessible" using high-end virtual reality techniques.</p><p>Models with several hundred million grid cells are nowadays possible using modern supercomputing facilities, such as those available at the Leibniz Supercomputing Centre. These models provide quantitative estimates of otherwise inaccessible parameters, such as buoyancy and temperature, as well as predictions of the associated gravity field and seismic wavefield that can be tested against Earth observations.</p><p>3-D visualizations of the computed physical parameters allow us to inspect the models as if one were actually travelling down into the Earth. This way, convective processes that occur thousands of kilometres below our feet become virtually accessible by combining the simulations with high-end VR techniques.</p><p>The large data set used here poses severe challenges for real-time visualization, because it cannot fit into graphics memory while requiring rendering with strict deadlines. This raises the necessity to balance the amount of displayed data against the time needed for rendering it.</p><p>As a solution, we introduce a rendering framework and describe our workflow for visualizing this geoscientific dataset. Our example exceeds 16 TByte in size, which is beyond the capabilities of most visualization tools. To display this dataset in real time, we reduce and declutter it through isosurfacing and mesh optimization techniques.</p><p>Our rendering framework relies on multithreading and data decoupling mechanisms that allow us to upload data to graphics memory while maintaining high frame rates. The final visualization application can be executed in a CAVE installation as well as on head-mounted displays such as the HTC Vive or Oculus Rift. The latter devices will allow for viewing our example on-site at the EGU conference.</p>
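The data-decoupling idea, a loader thread staging chunks while the render loop keeps drawing, can be sketched with a bounded producer/consumer queue. The chunk names and queue depth are illustrative; the actual framework targets graphics memory, not a Python list.

```python
import queue
import threading

# A small bounded queue decouples disk I/O from the render loop:
# the loader blocks when the staging buffer is full, never the renderer.
staged = queue.Queue(maxsize=2)
chunks = [f"chunk-{i}" for i in range(5)]
uploaded = []

def loader():
    for c in chunks:
        staged.put(c)        # blocks while the staging buffer is full
    staged.put(None)         # sentinel: no more data

threading.Thread(target=loader, daemon=True).start()

while True:                  # stand-in for the per-frame render loop
    c = staged.get()
    if c is None:
        break
    uploaded.append(c)       # here: upload to graphics memory and draw
```

Keeping the queue shallow bounds memory use, which matters when the full dataset is orders of magnitude larger than graphics memory.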

