Data Challenges in Development of a Regional Assignment-Simulation Model to Evaluate Transit Signal Priority in Chicago

Author(s):  
Elaine Chang ◽  
Athanasios Ziliaskopoulos

Recent years have seen major advances in the field of simulation-based dynamic traffic assignment (DTA), resulting in the development of DTA software packages capable of simulating real-world networks. However, simulation of such networks requires not only sophisticated algorithms and software but also large and detailed data sets. Although algorithmic and software-related issues in large-scale DTA development have received considerable research attention, there is little reported experience with the data-related challenges of applying large-scale simulation models to real-world networks. This paper describes the challenges encountered in using a large-scale simulation-assignment model to evaluate transit signal priority (TSP) in the Chicago, Illinois, region. The relevant impacts of TSP are described to provide a framework for comparing simulation approaches and data sets. The practice of using microsimulation models to evaluate TSP impacts on short corridors is compared with regional assignment-simulation approaches, with an emphasis on the TSP impacts that can be captured and observed with each approach. The data sets used for the regional Chicago TSP study are then described, along with the assumptions made to adapt each data set to the task of regional time-dependent simulation.

Author(s):  
Lior Shamir

Abstract Several recent observations using large data sets of galaxies have shown a non-random distribution of the spin directions of spiral galaxies, even when the galaxies are too far from each other to interact gravitationally. Here, a data set of $\sim8.7\cdot10^3$ spiral galaxies imaged by the Hubble Space Telescope (HST) is used to test and profile a possible asymmetry between galaxy spin directions. The asymmetry between galaxies with opposite spin directions is compared to the asymmetry of galaxies from the Sloan Digital Sky Survey (SDSS). The two data sets contain different galaxies at different redshift ranges, and each data set was annotated using a different method. Both data sets show a similar asymmetry in the COSMOS field, which is covered by both telescopes. Fitting the asymmetry to a cosine dependence yields a dipole axis with significance of $\sim2.8\sigma$ and $\sim7.38\sigma$ in HST and SDSS, respectively. The most likely dipole axis identified in the HST galaxies is at $(\alpha=78^\circ,\delta=47^\circ)$, well within the $1\sigma$ error range of the most likely dipole axis in the SDSS galaxies with $z>0.15$, identified at $(\alpha=71^\circ,\delta=61^\circ)$.
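The dipole-axis fit can be sketched as follows: for each candidate axis on a sky grid, the spin labels are regressed against the cosine of each galaxy's angular distance to the axis, and the axis maximizing the significance of the fitted amplitude is kept. The sketch below is a hypothetical illustration, not the author's pipeline; the input file, column layout, and grid resolution are assumptions.

```python
# Hypothetical sketch of a cosine (dipole) fit to galaxy spin directions.
import numpy as np

def angular_distance(ra1, dec1, ra2, dec2):
    """Angular distance (radians) between points given in degrees."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    return np.arccos(np.clip(
        np.sin(dec1) * np.sin(dec2) +
        np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2), -1.0, 1.0))

def dipole_significance(ra, dec, spin, axis_ra, axis_dec):
    """Least-squares cos(phi) dipole amplitude divided by its standard error."""
    x = np.cos(angular_distance(ra, dec, axis_ra, axis_dec))
    A = np.vstack([x, np.ones_like(x)]).T
    coef, residuals, *_ = np.linalg.lstsq(A, spin, rcond=None)
    dof = len(spin) - 2
    sigma2 = residuals[0] / dof if residuals.size else np.var(spin - A @ coef)
    cov = sigma2 * np.linalg.inv(A.T @ A)
    return abs(coef[0]) / np.sqrt(cov[0, 0])

# Scan a coarse grid of candidate axes and keep the most significant one.
# "hst_spins.csv" (RA, Dec, spin label of +1/-1 per galaxy) is an assumed file.
ra, dec, spin = np.loadtxt("hst_spins.csv", delimiter=",", unpack=True)
grid = [(a, d) for a in range(0, 360, 5) for d in range(-90, 91, 5)]
best = max(grid, key=lambda ad: dipole_significance(ra, dec, spin, *ad))
print("most likely dipole axis (RA, Dec):", best)
```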


1977 ◽  
Vol 3 (1/2) ◽  
pp. 126
Author(s):  
W. Brian Arthur ◽  
Geoffrey McNicoll

2015 ◽  
Vol 8 (1) ◽  
pp. 421-434 ◽  
Author(s):  
M. P. Jensen ◽  
T. Toto ◽  
D. Troyan ◽  
P. E. Ciesielski ◽  
D. Holdridge ◽  
...  

Abstract. The Midlatitude Continental Convective Clouds Experiment (MC3E) took place during the spring of 2011, centered in north-central Oklahoma, USA. The main goal of this field campaign was to capture the dynamical and microphysical characteristics of precipitating convective systems in the US Central Plains. A major component of the campaign was a six-site radiosonde array designed to capture the large-scale variability of the atmospheric state with the intent of deriving model forcing data sets. Over the course of the 46-day MC3E campaign, a total of 1362 radiosondes were launched from the enhanced sonde network. This manuscript provides details on the instrumentation used as part of the sounding array, the data processing activities including quality checks and humidity bias corrections, and an analysis of the impacts of bias correction and algorithm assumptions on the determination of convective levels and indices. It is found that corrections for known radiosonde humidity biases and assumptions regarding the characteristics of the surface convective parcel result in significant differences in the derived convective levels and indices in many soundings. In addition, the impact of including the humidity corrections and quality controls on the thermodynamic profiles used to derive a large-scale model forcing data set is investigated. The results show a significant impact on the derived large-scale vertical velocity field, illustrating the importance of addressing these humidity biases.
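To see why a humidity bias correction matters for derived convective levels, consider the lifted condensation level (LCL) of a surface parcel: a small change in relative humidity shifts the dewpoint and hence the LCL. The sketch below is purely illustrative and is not the MC3E processing code; the 5% correction factor, the Magnus constants, and the 125 m per degree LCL approximation are assumptions made for the example.

```python
# Illustrative effect of a humidity bias correction on a derived convective level.
import numpy as np

def dewpoint_from_rh(t_c, rh):
    """Dewpoint (deg C) from temperature (deg C) and relative humidity (0-1), Magnus formula."""
    a, b = 17.625, 243.04
    gamma = np.log(rh) + a * t_c / (b + t_c)
    return b * gamma / (a - gamma)

def lcl_height_m(t_c, td_c):
    """Approximate LCL height above the surface (m): ~125 m per deg C of dewpoint depression."""
    return 125.0 * (t_c - td_c)

t_sfc = 30.0                          # surface temperature, deg C (illustrative)
rh_raw = 0.55                         # raw (biased-dry) relative humidity
rh_corr = min(rh_raw * 1.05, 1.0)     # assumed 5% multiplicative bias correction

for label, rh in [("raw", rh_raw), ("corrected", rh_corr)]:
    td = dewpoint_from_rh(t_sfc, rh)
    print(f"{label}: dewpoint {td:.1f} C, LCL ~ {lcl_height_m(t_sfc, td):.0f} m")
```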


2020 ◽  
Vol 223 (2) ◽  
pp. 1378-1397
Author(s):  
Rosemary A Renaut ◽  
Jarom D Hogue ◽  
Saeed Vatankhah ◽  
Shuang Liu

SUMMARY We discuss the focusing inversion of potential field data for the recovery of sparse subsurface structures from surface measurements on a uniform grid. For the uniform grid, the model sensitivity matrices have a block Toeplitz Toeplitz block (BTTB) structure for each block of columns related to a fixed depth layer of the subsurface. All forward operations with the sensitivity matrix, or its transpose, can therefore be performed using the 2-D fast Fourier transform. Simulations show that the implementation of the focusing inversion algorithm using the fast Fourier transform is efficient, and that the algorithm can be realized on standard desktop computers with sufficient memory for storage of volumes up to size n ≈ 10^6. The linear systems of equations arising in the focusing inversion algorithm are solved using either Golub–Kahan bidiagonalization or randomized singular value decomposition algorithms. The two algorithms are contrasted for their efficiency on large-scale problems with respect to the sizes of the projected subspaces adopted for the solutions of the linear systems. The results confirm earlier findings that the randomized algorithm is to be preferred for the inversion of gravity data, and that for data sets of size m it is sufficient to use projected spaces of size approximately m/8. For the inversion of magnetic data sets, we show that it is more efficient to use Golub–Kahan bidiagonalization, and that it is again sufficient to use projected spaces of size approximately m/8. The conclusions are supported by simulations and verified through the inversion of a magnetic data set obtained over the Wuskwatim Lake region in Manitoba, Canada.
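The key computational idea is that a BTTB matrix acts on a gridded model like a 2-D convolution with a spatially invariant kernel, so the matrix-vector product can be formed with zero-padded 2-D FFTs instead of a dense matrix. The sketch below illustrates this for a single depth layer; the Gaussian kernel and grid sizes are assumptions for the example, not the gravity or magnetic kernels of the paper.

```python
# BTTB (block Toeplitz Toeplitz block) matrix-vector product via zero-padded 2-D FFTs.
import numpy as np
from scipy.signal import convolve2d

def bttb_matvec(kernel, x):
    """Apply the BTTB operator defined by a spatially invariant kernel to the
    model grid x, using FFTs (linear convolution through zero-padding)."""
    m, n = x.shape
    p, q = kernel.shape
    M, N = m + p - 1, n + q - 1                 # pad to avoid wrap-around
    X = np.fft.rfft2(x, (M, N))
    K = np.fft.rfft2(kernel, (M, N))
    full = np.fft.irfft2(X * K, (M, N))
    r0, c0 = (p - 1) // 2, (q - 1) // 2         # 'same'-sized result on the model grid
    return full[r0:r0 + m, c0:c0 + n]

# Consistency check against a direct spatial-domain convolution.
x = np.random.rand(64, 64)                      # model slice for one depth layer
u = np.linspace(-3, 3, 15)
kernel = np.exp(-u[:, None] ** 2 - u[None, :] ** 2)   # assumed smooth kernel
assert np.allclose(bttb_matvec(kernel, x), convolve2d(x, kernel, mode="same"))
```

The transpose product needed inside the iterative solvers can be formed the same way, with the kernel flipped before the transform.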


2009 ◽  
Vol 2 (1) ◽  
pp. 87-98 ◽  
Author(s):  
C. Lerot ◽  
M. Van Roozendael ◽  
J. van Geffen ◽  
J. van Gent ◽  
C. Fayt ◽  
...  

Abstract. Total O3 columns have been retrieved from six years of SCIAMACHY nadir UV radiance measurements using SDOAS, an adaptation of the GDOAS algorithm previously developed at BIRA-IASB for the GOME instrument. GDOAS and SDOAS have been implemented by the German Aerospace Center (DLR) in version 4 of the GOME Data Processor (GDP) and in version 3 of the SCIAMACHY Ground Processor (SGP), respectively. The processors are run at the DLR processing centre on behalf of the European Space Agency (ESA). We first describe the SDOAS algorithm, with particular attention to the impact of uncertainties in the reference O3 absorption cross-sections. Second, the resulting SCIAMACHY total ozone data set is evaluated globally through large-scale comparisons with results from GOME and OMI as well as with ground-based correlative measurements. The various total ozone data sets agree within 2% on average. However, a negative trend of 0.2–0.4% per year has been identified in the SCIAMACHY O3 columns; this probably originates from instrumental degradation effects that have not yet been fully characterized.
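As a back-of-the-envelope illustration of how such a drift can be quantified, the sketch below fits a straight line to a time series of relative differences between two total-ozone records. The input file, its sampling, and its layout are assumptions; this is not the GDP/SGP validation chain.

```python
# Minimal sketch: estimate a percent-per-year drift from relative differences.
import numpy as np

# monthly mean relative differences (%) between SCIAMACHY and a reference record;
# "sciamachy_vs_reference.txt" (time in years, difference in %) is an assumed file
t_years, rel_diff_pct = np.loadtxt("sciamachy_vs_reference.txt", unpack=True)

# ordinary least-squares fit: rel_diff ~ drift * (t - t0) + offset
A = np.vstack([t_years - t_years[0], np.ones_like(t_years)]).T
(drift, offset), *_ = np.linalg.lstsq(A, rel_diff_pct, rcond=None)
print(f"estimated drift: {drift:+.2f} %/year (offset {offset:+.2f} %)")
```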


2017 ◽  
Vol 14 (4) ◽  
pp. 172988141770907 ◽  
Author(s):  
Hanbo Wu ◽  
Xin Ma ◽  
Zhimeng Zhang ◽  
Haibo Wang ◽  
Yibin Li

Human daily activity recognition has been an active topic in computer vision for decades. Despite best efforts, activity recognition in naturally uncontrolled settings remains a challenging problem. Recently, by perceiving depth and visual cues simultaneously, RGB-D cameras have greatly boosted the performance of activity recognition. However, due to practical difficulties, the publicly available RGB-D data sets are not sufficiently large for benchmarking when considering the diversity of their activities, subjects, and backgrounds. This severely limits the applicability of complicated learning-based recognition approaches. To address the issue, this article provides a large-scale RGB-D activity data set built by merging five public RGB-D data sets that differ from each other in many aspects, such as the length of actions, the nationality of subjects, and the camera angles. The merged data set comprises 4528 samples depicting 7 action categories (up to 46 subcategories) performed by 74 subjects. To verify how challenging the data set is, three feature representation methods are evaluated: depth motion maps, the spatiotemporal depth cuboid similarity feature, and curvature space scale. Results show that the merged large-scale data set is more realistic and challenging and therefore more suitable for benchmarking.
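As a flavour of the first baseline, a depth motion map accumulates frame-to-frame motion energy from a depth sequence. The sketch below is a simplified, front-view-only variant run on synthetic data; the frame sizes and noise threshold are assumptions, not the benchmark's settings.

```python
# Simplified depth motion map (DMM): accumulate thresholded absolute differences
# between consecutive depth frames (front view only).
import numpy as np

def depth_motion_map(frames, threshold=10.0):
    """frames: (T, H, W) array of depth images; returns an (H, W) motion-energy map."""
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
    diffs[diffs < threshold] = 0.0          # suppress sensor noise
    return diffs.sum(axis=0)

# example with synthetic data: 30 frames of 240x320 depth images (millimetres)
frames = np.random.randint(500, 4000, size=(30, 240, 320)).astype(np.float32)
dmm_front = depth_motion_map(frames)
print(dmm_front.shape)                      # (240, 320)
```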


2017 ◽  
Vol 44 (2) ◽  
pp. 203-229 ◽  
Author(s):  
Javier D Fernández ◽  
Miguel A Martínez-Prieto ◽  
Pablo de la Fuente Redondo ◽  
Claudio Gutiérrez

The publication of semantic web data, commonly represented in Resource Description Framework (RDF), has experienced outstanding growth over the last few years. Data from all fields of knowledge are shared publicly and interconnected in active initiatives such as Linked Open Data. However, despite the increasing availability of applications managing large-scale RDF information such as RDF stores and reasoning tools, little attention has been given to the structural features emerging in real-world RDF data. Our work addresses this issue by proposing specific metrics to characterise RDF data. We specifically focus on revealing the redundancy of each data set, as well as common structural patterns. We evaluate the proposed metrics on several data sets, which cover a wide range of designs and models. Our findings provide a basis for more efficient RDF data structures, indexes and compressors.
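Simple structural metrics of this kind can be computed directly from the triples of a graph. The sketch below uses rdflib to tabulate subject out-degrees and predicate usage; these particular metrics and the input file are illustrative stand-ins, not the paper's specific redundancy and pattern metrics.

```python
# Illustrative structural metrics over an RDF graph: subject out-degree and
# predicate usage. "data.ttl" is an assumed Turtle input file.
from collections import Counter
from rdflib import Graph

g = Graph()
g.parse("data.ttl", format="turtle")

subject_out_degree = Counter(s for s, p, o in g)
predicate_usage = Counter(p for s, p, o in g)

n_triples = len(g)
print("triples:", n_triples)
print("distinct subjects:", len(subject_out_degree))
print("mean subject out-degree:", n_triples / len(subject_out_degree))
print("most used predicates:", predicate_usage.most_common(5))
```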


2011 ◽  
Vol 7 (3) ◽  
pp. 88-101 ◽  
Author(s):  
DongHong Sun ◽  
Li Liu ◽  
Peng Zhang ◽  
Xingquan Zhu ◽  
Yong Shi

Due to the flexibility of multi-criteria optimization, Regularized Multiple Criteria Linear Programming (RMCLP) has received attention in decision support systems. Numerous theoretical and empirical studies have demonstrated that RMCLP is effective and efficient in classifying large-scale data sets. However, a possible limitation of RMCLP is its poor interpretability and low comprehensibility for end users and experts. This deficiency has limited RMCLP's use in many real-world applications where both accuracy and transparency of decision making are required, such as Customer Relationship Management (CRM) and Credit Card Portfolio Management. In this paper, the authors present a clustering-based rule extraction method that derives explainable and understandable rules from the RMCLP model. Experiments on both synthetic and real-world data sets demonstrate that this rule extraction method can effectively extract explicit decision rules from RMCLP with only a small compromise in performance.
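The general recipe of clustering-based rule extraction can be sketched as follows: score the data with the trained classifier, cluster the instances within each predicted class, and read each cluster's bounding box off as an interval rule. In the sketch below a logistic regression stands in for the RMCLP scorer, and the synthetic data, feature names, and cluster count are assumptions; this is not the authors' algorithm.

```python
# Clustering-based rule extraction sketch with a surrogate linear classifier.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))                       # synthetic two-feature data
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)       # synthetic labels

surrogate = LogisticRegression().fit(X, y)          # stand-in for the RMCLP scorer
pred = surrogate.predict(X)

rules = []
for label in np.unique(pred):
    pts = X[pred == label]
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pts)
    for c in np.unique(km.labels_):
        box = pts[km.labels_ == c]
        lo, hi = box.min(axis=0), box.max(axis=0)   # cluster bounding box -> interval rule
        rules.append((label, list(zip(lo.round(2), hi.round(2)))))

for label, bounds in rules:
    conds = " AND ".join(f"{l} <= x{i} <= {h}" for i, (l, h) in enumerate(bounds))
    print(f"IF {conds} THEN class {label}")
```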


1983 ◽  
Vol 66 ◽  
pp. 411-425
Author(s):  
Frank Hill ◽  
Juri Toomre ◽  
Laurence J. November

Abstract. Two-dimensional power spectra of solar five-minute oscillations display prominent ridge structures in (k, ω) space, where k is the horizontal wavenumber and ω is the temporal frequency. The positions of these ridges in k and ω can be used to probe temperature and velocity structures in the subphotosphere. We have been carrying out a continuing program of observations of five-minute oscillations with the diode array instrument on the vacuum tower telescope at Sacramento Peak Observatory (SPO). We have sought to establish whether power spectra taken on separate days show shifts in ridge locations; such shifts may arise from different velocity and temperature patterns having been brought into our sampling region by solar rotation. Power spectra have been obtained for six days of observations of Doppler velocities using the Mg I λ5173 and Fe I λ5434 spectral lines. Each data set covers 8 to 11 hr in time and samples a region 256″ × 1024″ in spatial extent, with a spatial resolution of 2″ and temporal sampling of 65 s. We have detected statistically significant shifts in ridge locations between certain data sets. The character of these displacements, when analyzed in terms of eastward- and westward-propagating waves, implies that changes have occurred in both the temperature and horizontal velocity fields underlying our observing window. We estimate the magnitude of the velocity changes to be on the order of 100 m s⁻¹; we may be detecting the effects of large-scale convection akin to giant cells.
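The (k, ω) diagnostic itself amounts to a Fourier transform of a space-time cube of Doppler velocities followed by taking the power. The sketch below illustrates this for a single spatial dimension with synthetic data; the array sizes are assumptions, while the 65 s temporal and 2″ spatial sampling follow the values quoted above.

```python
# Minimal (k, omega) power spectrum of a space-time slice of Doppler velocities.
import numpy as np

dt = 65.0                              # temporal sampling (s)
dx = 2.0                               # spatial sampling (arcsec)
cube = np.random.randn(512, 128)       # synthetic (time, space) velocity slice

# detrend, apply a window, transform, and keep the power
cube = cube - cube.mean(axis=0)
window = np.hanning(cube.shape[0])[:, None] * np.hanning(cube.shape[1])[None, :]
spec = np.fft.fftshift(np.fft.fft2(cube * window))
power = np.abs(spec) ** 2

omega = np.fft.fftshift(np.fft.fftfreq(cube.shape[0], d=dt)) * 2 * np.pi  # rad/s
k = np.fft.fftshift(np.fft.fftfreq(cube.shape[1], d=dx)) * 2 * np.pi      # rad/arcsec
print(power.shape, omega.max(), k.max())   # ridge positions would be read from 'power'
```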

