Transit Network Optimization Method Based on BRT Attracting Region: A Case Study of Changzhou

ICCTP 2009 ◽  
2009 ◽  
Author(s):  
Zimu Li ◽  
Xuewu Chen ◽  
Xiaoqing Liu

2021 ◽
Vol 7 (4) ◽  
pp. 64
Author(s):  
Tanguy Ophoff ◽  
Cédric Gullentops ◽  
Kristof Van Beeck ◽  
Toon Goedemé

Object detection models are usually trained and evaluated on highly complicated, challenging academic datasets, which results in deep networks requiring large amounts of computation. However, many operational use-cases involve more constrained situations: a limited number of classes to be detected, less intra-class variance, less lighting and background variance, constrained or even fixed camera viewpoints, etc. In these cases, we hypothesize that smaller networks could be used without deteriorating the accuracy. However, there are multiple reasons why this does not happen in practice. Firstly, overparameterized networks tend to learn better, and secondly, transfer learning is usually used to reduce the necessary amount of training data. In this paper, we investigate how much we can reduce the computational complexity of a standard object detection network in such constrained object detection problems. As a case study, we focus on a well-known single-shot object detector, YoloV2, and combine three different techniques to reduce the computational complexity of the model without reducing its accuracy on our target dataset. To investigate the influence of the problem complexity, we compare two datasets: a prototypical academic one (Pascal VOC) and a real-life operational one (LWIR person detection). The three optimization steps we exploited are: swapping all convolutions for depth-wise separable convolutions, pruning, and weight quantization. The results of our case study indeed substantiate our hypothesis that the more constrained a problem is, the more the network can be optimized. On the constrained operational dataset, combining these optimization techniques allowed us to reduce the computational complexity by a factor of 349, compared to only a factor of 9.8 on the academic dataset.
When running a benchmark on an Nvidia Jetson AGX Xavier, our fastest model runs more than 15 times faster than the original YoloV2 model, whilst increasing the accuracy by 5% Average Precision (AP).
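The depth-wise separable substitution alone accounts for a large share of the reduction; a back-of-the-envelope multiply-accumulate count (a hedged sketch with an illustrative layer size, not the authors' exact accounting) shows why:

```python
def conv_flops(h, w, c_in, c_out, k=3):
    """Multiply-accumulates for a standard k x k convolution."""
    return h * w * c_in * c_out * k * k

def dw_separable_flops(h, w, c_in, c_out, k=3):
    """Depth-wise (k x k per input channel) followed by point-wise (1 x 1)."""
    depthwise = h * w * c_in * k * k
    pointwise = h * w * c_in * c_out
    return depthwise + pointwise

# Illustrative layer: 104 x 104 feature map, 64 -> 128 channels, 3 x 3 kernel
std = conv_flops(104, 104, 64, 128)
sep = dw_separable_flops(104, 104, 64, 128)
print(round(std / sep, 1))  # -> 8.4, i.e. roughly 8x fewer operations
```

The per-layer gain grows with the output channel count, which is why the swap pays off most in the deeper, wider layers of the detector.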


2021 ◽  
Vol 13 (13) ◽  
pp. 7504
Author(s):  
Jie Liu ◽  
Paul Schonfeld ◽  
Jinqu Chen ◽  
Yong Yin ◽  
Qiyuan Peng

Time reliability in a Rail Transit Network (RTN) is usually measured according to clock-based trip time, but travel conditions such as comfort and convenience cannot be reflected by clock-based trip time. Here, the crowding level of trains, seat availability, and transfer times are considered to compute passengers' Perceived Trip Time (PTT). Compared with the average PTT, the extra PTT needed for arriving reliably, which equals the 95th percentile PTT minus the average PTT, is converted into a monetary cost for estimating the Perceived Time Reliability Cost (PTRC). The ratio of the extra PTT needed for arriving reliably to the average PTT, analogous to the buffer time index, is proposed to measure Perceived Time Reliability (PTR). To overcome the difficulty of obtaining the PTT of passengers who travel among rail transit modes, a Monte Carlo simulation is applied to generate passengers' PTT for computing PTR and PTRC. A case study of Chengdu's RTN shows that the proposed metrics and method measure the PTR and PTRC in an RTN effectively. PTR, PTRC, and their influential factors exhibit significant linear relations, and the fitted linear regression models can guide passengers to travel reliably.
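The metric definitions can be sketched as follows; the lognormal trip-time distribution, the crowding and transfer weights, and the value-of-time rate are illustrative assumptions, not the paper's calibrated values:

```python
import random
import statistics

def perceived_trip_time(base, crowding_factor, transfers, transfer_penalty=5.0):
    """PTT: in-vehicle time inflated by crowding, plus a penalty per transfer.
    The weights here are placeholders, not the paper's calibration."""
    return base * crowding_factor + transfers * transfer_penalty

def ptr_and_ptrc(ptt_samples, value_of_time=0.5):
    """PTR = (95th-percentile PTT - mean PTT) / mean PTT (buffer time index);
    PTRC = the same extra PTT converted to money via a value-of-time rate."""
    ptts = sorted(ptt_samples)
    mean_ptt = statistics.fmean(ptts)
    p95 = ptts[int(0.95 * (len(ptts) - 1))]
    extra = p95 - mean_ptt
    return extra / mean_ptt, extra * value_of_time

random.seed(42)
# Monte Carlo: draw in-vehicle times, crowding levels, and transfer counts
samples = [perceived_trip_time(random.lognormvariate(3.4, 0.2),
                               random.uniform(1.0, 1.5),
                               random.choice([0, 1, 2]))
           for _ in range(10_000)]
ptr, ptrc = ptr_and_ptrc(samples)
```

Because PTR is a ratio of the same-unit quantities, it stays comparable across origin-destination pairs with very different average trip times, which is the point of the buffer-time-index form.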


2021 ◽  
Author(s):  
Oliver Benning ◽  
Jonathan Calles ◽  
Burak Kantarci ◽  
Shahzad Khan

This article presents a practical method for assessing the risk profiles of communities by tracking/acquiring, fusing, and analyzing data on public transportation, district population distribution, passenger interactions, and cross-locality travel. The proposed framework fuses these data sources into a realistic simulation of a transit network over a given time span. By shedding credible insight into the impact of public transit on pandemic spread, the findings lay the groundwork for tools that could provide pandemic response teams and municipalities with a robust framework for evaluating which city districts are most at risk and how to adjust municipal services accordingly.
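The contact-counting idea behind such a simulation can be illustrated minimally; the toy network, the random route assignment, and the co-presence proxy below are synthetic stand-ins for the fused real-world data sources, not the authors' framework:

```python
import random
from collections import Counter
from itertools import combinations

# Toy transit network: each route serves a sequence of district stops.
ROUTES = {"R1": ["A", "B", "C"], "R2": ["B", "C", "D"], "R3": ["A", "D"]}

def simulate_contacts(n_passengers, seed=1):
    """Assign each passenger to a random route and count pairwise
    co-presence on the same route as a crude interaction proxy."""
    rng = random.Random(seed)
    riders = {route: [] for route in ROUTES}
    for pid in range(n_passengers):
        riders[rng.choice(list(ROUTES))].append(pid)
    contacts = Counter()
    for route, pids in riders.items():
        for _pair in combinations(pids, 2):
            contacts[route] += 1
    return contacts

contacts = simulate_contacts(200)
# Route-level contact counts can then be mapped back onto the districts
# each route serves to rank districts by exposure risk.
```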


2018 ◽  
Vol 7 (7) ◽  
pp. 278-283 ◽  
Author(s):  
Koji Oshima ◽  
Takumu Kobayashi ◽  
Yuki Taenaka ◽  
Kaori Kuroda ◽  
Mikio Hasegawa

2017 ◽  
Vol 46 (1) ◽  
pp. 84-102 ◽  
Author(s):  
Ruihong Huang

To measure job accessibility, person-based approaches have the advantage of capturing all accessibility components: land use, the transportation system, an individual's mobility and travel preferences, and an individual's space and time constraints. This has made person-based approaches more favored than traditional aggregate approaches in recent years. However, person-based accessibility measures require detailed individual trip data, which are very difficult and expensive to acquire, especially at large scales. In addition, traveling by public transportation is a highly time-sensitive activity that traditional accessibility measures can hardly handle. This paper presents an agent-based model for simulating individual work trips, in the hope of providing an alternative or supplementary approach to person-based accessibility studies. In the model, the population is simulated as three levels of agents: census tracts, households, and individual workers. Job opportunities (businesses) are simulated as employer agents. Census tract agents can generate household and worker agents based on their demographic profiles and a road network. Worker agents are the most active agents: they can search for jobs and find the best commuting paths. Employer agents can estimate the number of transit-dependent employees, hire workers, and update vacancies. A case study is conducted in the Milwaukee metropolitan area in Wisconsin. Several person-based accessibility measures are computed from the simulated trips, which reveal low-accessibility inner-city neighborhoods that are nonetheless well covered by the transit network.
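The worker-employer interaction can be sketched minimally; the class names, the greedy nearest-vacancy matching, and the 1-D "travel time" below are illustrative assumptions rather than the model's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Employer:
    """Employer agent: posts vacancies and hires workers."""
    name: str
    location: float          # 1-D position stands in for the road network
    vacancies: int

@dataclass
class Worker:
    """Worker agent: searches jobs and picks the shortest commute."""
    home: float
    job: Optional[str] = None

    def search(self, employers):
        open_jobs = [e for e in employers if e.vacancies > 0]
        if not open_jobs:
            return None
        best = min(open_jobs, key=lambda e: abs(e.location - self.home))
        best.vacancies -= 1       # employer updates its vacancy count
        self.job = best.name
        return abs(best.location - self.home)  # commute "time"

employers = [Employer("Mill", 2.0, 1), Employer("Clinic", 8.0, 2)]
workers = [Worker(1.0), Worker(7.5), Worker(3.0)]
commutes = [w.search(employers) for w in workers]
# A person-based accessibility score could then be derived from each
# worker's simulated commute, e.g. its inverse.
```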


2020 ◽  
Vol 0 (5) ◽  
pp. 45
Author(s):  
Muhammad Rayhan Azzindani ◽  
Nabila Fajri Kusuma Ningrum ◽  
Mega Rizkah Sudiar ◽  
Anak Agung Ngurah Perwira Redi

2017 ◽  
Vol 10 (2) ◽  
pp. 67
Author(s):  
Vina Ayumi ◽  
L.M. Rasdi Rere ◽  
Mohamad Ivan Fanany ◽  
Aniati Murni Arymurthy

Metaheuristic algorithms are powerful optimization methods that can solve problems believed to be hard in general by exploring their typically large solution search spaces. However, the performance of these algorithms depends significantly on the setting of their parameters, which is not easy to do accurately since it relies heavily on the problem's characteristics. To fine-tune the parameters automatically, many methods have been proposed, including fuzzy logic, chaos, random adjustment, and others. For many years these methods have been developed independently for the automatic setting of metaheuristic parameters, and the integration of two or more of them has not yet been explored much. Thus, a method that combines the advantages of chaos and random adjustment is proposed. Several popular metaheuristic algorithms are used to test the performance of the proposed method: simulated annealing, particle swarm optimization, differential evolution, and harmony search. The case study for this research is contrast enhancement of the Cameraman, Lena, Boat, and Rice images. In general, the simulation results show that the proposed methods are better than the original metaheuristics, chaotic metaheuristics, and metaheuristics with random adjustment.
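The combination can be illustrated on simulated annealing, where the cooling rate is adjusted each iteration by a logistic-map chaotic term plus a small random perturbation; the objective function and all constants here are illustrative assumptions, not the paper's code:

```python
import math
import random

def chaotic_random_sa(f, x0, iters=2000, seed=0):
    """Simulated annealing whose cooling rate combines a logistic-map
    chaotic sequence with a random adjustment (a sketch of the
    chaos + random-adjustment idea)."""
    rng = random.Random(seed)
    x = best = x0
    temp, z = 1.0, 0.7                        # z: logistic-map state in (0, 1)
    for _ in range(iters):
        z = 4.0 * z * (1.0 - z)               # chaotic term
        alpha = 0.97 + 0.02 * z + rng.uniform(-0.005, 0.005)  # adjusted cooling
        cand = x + rng.gauss(0.0, max(temp, 1e-3))            # propose a move
        delta = f(cand) - f(x)
        if delta < 0 or rng.random() < math.exp(-delta / max(temp, 1e-9)):
            x = cand                          # Metropolis acceptance
        if f(x) < f(best):
            best = x
        temp *= min(alpha, 0.999)             # cool with the adjusted rate
    return best

# Toy objective with minimum at x = 3
best = chaotic_random_sa(lambda x: (x - 3.0) ** 2, x0=-5.0)
```

Varying the cooling rate chaotically keeps the temperature schedule from settling into a fixed geometric decay, while the random term adds further diversity; both are meant to reduce sensitivity to a hand-picked cooling constant.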

