Softpressure: A Schedule-Driven Backpressure Algorithm for Coping with Network Congestion

Author(s):  
Hsu-Chieh Hu ◽  
Stephen F. Smith

We consider the problem of minimizing the delay of jobs moving through a directed graph of service nodes. In this problem, each node may have several links and is constrained to serve one link at a time. As jobs move through the network, they can pass through a node only after they have been serviced by that node. The objective is to minimize the delay jobs incur sitting in queues waiting to be serviced. Two popular approaches to this problem are the backpressure algorithm and schedule-driven control. In this paper, we present a hybrid of these two methods that incorporates the stability guarantees of queueing theory into schedule-driven control. We then demonstrate how this hybrid method outperforms the other two in a real-time traffic signal control problem, where the nodes are traffic lights, the links are roads, and the jobs are vehicles. We show through simulations that, in scenarios with heavy congestion, the hybrid method reduces delay by 50% and 15% relative to schedule-driven control and backpressure, respectively. A theoretical analysis also supports these results.
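The backpressure side of such a hybrid can be illustrated with the classic max-pressure service rule, in which a node serves the link whose queue most exceeds the queue immediately downstream of it. A minimal sketch with illustrative queue counts; this is the textbook rule, not the paper's hybrid algorithm:

```python
# Classic backpressure at a single service node: serve the link with
# maximal pressure = own queue count minus downstream queue count.
# Queue values below are made-up examples.

def backpressure_choice(queues, downstream):
    """Return the link with the largest pressure."""
    pressures = {link: q - downstream.get(link, 0) for link, q in queues.items()}
    return max(pressures, key=pressures.get)

# Example: an intersection with three incoming links.
queues = {"north": 8, "east": 3, "south": 5}
downstream = {"north": 2, "east": 1, "south": 4}
print(backpressure_choice(queues, downstream))  # serves "north" (pressure 6)
```

Because pressure subtracts the downstream queue, the rule avoids pushing vehicles toward an already-congested receiving link, which is the source of its stability guarantee.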

2015 ◽  
Vol 713-715 ◽  
pp. 915-918
Author(s):  
Yuan Xin Xu ◽  
Wan Ying Yang ◽  
Wen Shi

Aiming at the problems of isolated control of urban traffic lights and fixed signal timing, this paper proposes a real-time traffic-light control method based on the Kalman filter. The method uses a Kalman filter to predict the traffic flow in the next interval and then updates the signal timing accordingly. The traffic flow at an intersection was surveyed in the field during peak hours, the flow was predicted, and the signal timing was updated; the intersection was then simulated in VISSIM. The simulation results show that vehicle queue lengths decreased significantly and the number of stops dropped, greatly improving the efficiency of access.
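The predict/update cycle this method relies on can be sketched with a one-dimensional Kalman filter on interval vehicle counts. The random-walk flow model and the noise variances q and r below are illustrative assumptions, not values from the paper:

```python
# One-dimensional Kalman filter for next-interval traffic flow.
# q: process noise variance, r: measurement noise variance (assumed values).

def kalman_step(x, p, z, q=1.0, r=4.0):
    """One predict/update cycle: state x, variance p, measurement z."""
    # Predict: random-walk model for the flow rate.
    x_pred, p_pred = x, p + q
    # Update with the newly observed vehicle count z.
    k = p_pred / (p_pred + r)        # Kalman gain
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

x, p = 20.0, 10.0                    # initial flow estimate (veh/interval)
for z in [24, 26, 25, 30]:           # observed peak-hour counts
    x, p = kalman_step(x, p, z)
print(round(x, 1), round(p, 2))
```

The predicted flow x for the next interval would then drive the green-time split; the filter's variance p shrinks as measurements accumulate.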


Author(s):  
Chan Yu ◽  
Souran Manoochehri

A hybrid method combining a genetic-algorithm-based containment algorithm with a complex mating algorithm is presented. The approach uses mating between pairs of objects as a means to accelerate the packaging process. In this study, mating between two objects is defined as positioning one object relative to another by merging common features that are assigned through mating conditions between them. A constrained move set is derived from the mating condition, allowing the transformation of a component in each mating pair to be fully or partially constrained with respect to the other. By using mating in packaging, the number of components to be placed can be reduced significantly, and the overall speed of the packaging process can also be improved. The hybrid method uses a genetic algorithm to search for mating pairs and the global positions of selected objects. Each mating pair is first mated using a simple mating condition derived from the geometric features of the mating objects. If a proper mating is not obtained, the complex mating algorithm finds an optimal mating condition using a quasi-Newton method.
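The genetic-algorithm layer of such a hybrid can be sketched with a toy permutation GA. The 1-D bin-packing fitness, swap mutation, and all parameters below are illustrative assumptions, not the paper's containment or mating formulation:

```python
# Toy permutation GA: search for an ordering of component widths that
# packs into the fewest bins of capacity 6. Elitist selection keeps the
# top half; children are parents with one swap mutation.
import random

random.seed(0)
WIDTHS = [4, 3, 2, 2, 1]               # component widths (assumed values)

def fitness(order):
    """Negative count of bin overflows when placing greedily left to right."""
    span, penalty = 0, 0
    for w in order:
        span += w
        if span > 6:                   # overflow: start a new bin
            penalty += 1
            span = w
    return -penalty

def evolve(pop, gens=30):
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]          # elitism
        children = []
        for parent in parents:
            child = parent[:]
            i, j = random.sample(range(len(child)), 2)
            child[i], child[j] = child[j], child[i]   # swap mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

pop = [random.sample(WIDTHS, len(WIDTHS)) for _ in range(10)]
best = evolve(pop)
print(best, fitness(best))
```

In the paper's setting the chromosome would instead encode mating-pair choices and global positions, but the select/mutate/evaluate loop has the same shape.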


2019 ◽  
Vol 1 (2) ◽  
pp. 67-73
Author(s):  
Mendil Samir ◽  
Aguili Taoufik

This article deals with a hybrid method combining the method of moments (MoM) with the geometrical theory of diffraction (GTD). This hybrid approach is used to analyze antennas located near perfectly conducting bodies of arbitrarily curved shape. Several examples, e.g., an antenna mounted near a perfectly conducting cylinder with two plates, demonstrate that the hybrid approach is well suited to modeling large-scale objects of arbitrary shape. This approach allows us to solve problems that neither method can solve alone. More generally, radiation from sources located on or near an arbitrarily shaped body can be analyzed with this technique, which is a key strength of the method.


Author(s):  
Hsu-Chieh Hu ◽  
Allen M. Hawkes ◽  
Stephen F. Smith

Key to the effectiveness of schedule-driven approaches to real-time traffic control is an ability to accurately predict when sensed vehicles will arrive at and pass through the intersection. Prior work in schedule-driven traffic control has assumed a static vehicle arrival model. However, this static predictive model ignores the fact that the queue count and the incurred delay should vary as different partial signal timing schedules (i.e., different possible futures) are explored during the online planning process. In this paper, we propose an alternative arrival time model that incorporates queueing dynamics into this forward search process for a signal timing schedule, to more accurately capture how the intersection’s queues vary over time. As each search state is generated, an incremental queueing delay is dynamically projected for each vehicle. The resulting total queueing delay is then considered in addition to the cumulative delay caused by signal operations. We demonstrate the potential of this approach through microscopic traffic simulation of a real-world road network, showing a 10-15% reduction in average wait times over the schedule-driven traffic signal control system in heavy traffic scenarios.
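The incremental delay projection can be sketched as follows, assuming queued vehicles discharge at a fixed headway once the green phase starts. The arrival times, headway, and the two candidate green-start times are illustrative, not the paper's search mechanics:

```python
# Project total queueing delay for one candidate green-start time,
# assuming a fixed discharge headway (seconds per vehicle).

def queueing_delay(arrivals, green_start, headway=2.0):
    """Sum of per-vehicle waits: each vehicle discharges one headway after
    the later of (a) the previous discharge and (b) its own arrival."""
    total, depart = 0.0, green_start
    for t in sorted(arrivals):
        depart = max(depart, t) + headway    # this vehicle's discharge time
        total += depart - t - headway        # wait before being served
    return total

arrivals = [0.0, 1.0, 5.0, 6.0]              # sensed arrival times (s)
# Compare two partial schedules: green starting at t=4 vs t=8.
print(queueing_delay(arrivals, 4.0))   # 16.0 s total delay
print(queueing_delay(arrivals, 8.0))   # 32.0 s total delay
```

In the forward search, each expanded state would carry such a projected delay, so schedules that let queues build are penalized beyond the signal-operation delay alone.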


Author(s):  
Godfrey C. Hoskins ◽  
V. Williams ◽  
V. Allison

The method demonstrated is an adaptation of a proven procedure for accurately determining the magnification of light photomicrographs. Because of the stability of modern electrical lenses, the method is shown to be directly applicable for providing precise reproducibility of magnification in various models of electron microscopes.

A readily recognizable area of a carbon replica of a crossed-line diffraction grating is used as a standard. The same area of the standard was photographed in Phillips EM 200, Hitachi HU-11B2, and RCA EMU 3F electron microscopes at taps representative of the range of magnification of each. Negatives from one microscope were selected as guides and printed at convenient magnifications; then negatives from each of the other microscopes were projected to register with these prints. By deferring measurement to the print rather than comparing negatives, correspondence of magnification of the specimen in the three microscopes could be brought to within 2%.


2020 ◽  
Vol 12 (7) ◽  
pp. 2767 ◽  
Author(s):  
Víctor Yepes ◽  
José V. Martí ◽  
José García

The optimization of cost and CO2 emissions in earth-retaining walls is of practical relevance, since these structures are widely used in civil engineering. Optimizing cost is essential for the competitiveness of the construction company, while optimizing emissions reduces the environmental impact of construction. To address the optimization, a black hole metaheuristic was used, along with a discretization mechanism based on min–max normalization. The stability of the algorithm was evaluated with respect to the solutions obtained, and the steel and concrete quantities obtained in both optimizations were analyzed. Additionally, the geometric variables of the structure were compared. Finally, the results were compared with another algorithm previously applied to the same problem. The results show a trade-off between the use of steel and concrete: the solutions that minimize CO2 emissions favor concrete over those that minimize cost. When the geometric variables are compared, most remain similar in both optimizations except for the distance between buttresses. In the comparison with the other algorithm, the black hole algorithm shows good optimization performance.
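The min–max-normalization discretization step can be sketched as follows; the candidate set, bounds, and toy black-hole move below are illustrative assumptions, not the paper's design variables:

```python
# Map a continuous metaheuristic position to a discrete design choice via
# min-max normalization. The candidate list stands in for a discrete
# structural variable (e.g. a wall thickness); values are assumptions.
import random

random.seed(1)
CHOICES = [0.25, 0.30, 0.35, 0.40, 0.45]   # discrete candidate values (m)

def discretize(x, lo, hi, choices):
    """Normalize x into [0, 1], then map to the nearest discrete choice."""
    t = (x - lo) / (hi - lo)
    t = min(max(t, 0.0), 1.0)                # clamp out-of-range positions
    return choices[round(t * (len(choices) - 1))]

def black_hole_step(star, best):
    """Move a candidate solution ('star') toward the current best
    ('black hole') by a random fraction of the distance."""
    return star + random.random() * (best - star)

star, best = 0.9, 0.3
for _ in range(20):
    star = black_hole_step(star, best)
print(discretize(star, 0.0, 1.0, CHOICES))
```

The continuous update keeps the metaheuristic's search dynamics intact, while the discretization ensures every evaluated design is buildable.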


2021 ◽  
Vol 13 (10) ◽  
pp. 5411
Author(s):  
Elisabeth Bloder ◽  
Georg Jäger

Traffic and transportation are major contributors to global CO2 emissions and the resulting climate change. Especially in urban areas, traffic flow is not optimal and thus offers possibilities to reduce emissions. The concept of a Green Wave, i.e., the coordinated switching of traffic lights to favor a single direction and reduce congestion, is often discussed as a simple mechanism to avoid braking and accelerating, thereby reducing fuel consumption. On the other hand, making car use more attractive might also increase emissions. In this study, we use an agent-based model to investigate the benefit of a Green Wave in order to find out whether it can outweigh the effects of increased car use. We find that although the Green Wave has the potential to reduce emissions, there is also a high risk of a net increase in emissions, depending on the specifics of the traffic system.
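The Green Wave mechanism itself reduces to choosing green-phase offsets equal to platoon travel times along the corridor. A minimal sketch with an assumed speed and signal spacing, not the study's agent-based model:

```python
# Green Wave offsets: delay each light's green start by the travel time
# from the corridor start, so a platoon at the design speed always
# arrives on green. Speed and positions are assumed example values.

SPEED = 13.9                  # design speed in m/s (~50 km/h)
SPACING = [0, 300, 550, 900]  # distance of each signal from the start (m)

def green_offsets(positions, speed):
    """Green-start offset of each light relative to the first, in seconds."""
    return [round(d / speed, 1) for d in positions]

offsets = green_offsets(SPACING, SPEED)
print(offsets)  # a car holding SPEED reaches each light as it turns green
```

The study's point is that this only helps the corridor direction at the design speed; induced demand from more attractive car travel can cancel the per-vehicle saving.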


2021 ◽  
Vol 5 (2) ◽  
pp. 32
Author(s):  
Esmehan Uçar ◽  
Sümeyra Uçar ◽  
Fırat Evirgen ◽  
Necati Özdemir

Mobile phone worms are computer viruses that can take control of cell phones by exploiting their flaws, and they are transmitted from one device to another in increasing numbers. Today, one of the services gaining currency for circulating these malicious worms is SMS. The differences between computers and mobile devices mean that existing propagation models of computer worms cannot be applied directly to mobile networks, and this is particularly true for the SMS framework. The susceptible–affected–infectious–suspended–recovered (SAIDR) model with a classical derivative was introduced by Xiao et al. (2017) in order to correctly estimate the spread of worms by means of SMS. Building on the SAIDR model, this study is the first to apply an Atangana–Baleanu (AB) fractional derivative to it. The existence and uniqueness of the model's solutions, together with the stability analysis, are shown through the Banach fixed-point theorem. The special solution of the model is investigated using the Laplace transform, and we then present a set of numerical graphics obtained by varying the fractional order θ, with the intention of showing the effectiveness of the fractional derivative.
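For intuition, classical (integer-order) SAIDR-style dynamics can be integrated with forward Euler. The transition structure and every rate below are simplifying assumptions for illustration, not the equations of Xiao et al. or the fractional AB formulation:

```python
# Forward-Euler integration of a toy SAIDR-style compartmental model.
# Compartments: susceptible, affected, infectious, suspended (disconnected),
# recovered. All rates (beta, k, g, m) and dt are assumed example values.

def saidr_step(s, a, i, d, r, beta=0.4, k=0.3, g=0.2, m=0.1, dt=0.1):
    """One Euler step of the toy flows S -> A -> I -> D -> R."""
    n = s + a + i + d + r
    new_a = beta * s * i / n   # susceptibles contacted by worm SMS
    new_i = k * a              # affected devices become infectious
    new_d = g * i              # infectious devices get suspended
    new_r = m * d              # suspended devices are cleaned
    return (s - dt * new_a,
            a + dt * (new_a - new_i),
            i + dt * (new_i - new_d),
            d + dt * (new_d - new_r),
            r + dt * new_r)

state = (990.0, 0.0, 10.0, 0.0, 0.0)   # 1000 devices, 10 initially infected
for _ in range(200):                   # simulate 20 time units
    state = saidr_step(*state)
print([round(x, 1) for x in state])
```

The flows cancel in pairs, so the population is conserved at every step; a fractional-order version would replace the Euler step with a memory-carrying AB-derivative scheme.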


Games ◽  
2021 ◽  
Vol 12 (3) ◽  
pp. 53
Author(s):  
Roberto Rozzi

We consider an evolutionary model of social coordination in a 2 × 2 game where two groups of players prefer to coordinate on different actions. Players can pay a cost to learn their opponent's group: if they pay it, they can condition their action on that group. We assess the stability of outcomes in the long run using stochastic stability analysis. We find that three elements matter for equilibrium selection: the group size, the strength of preferences, and the cost of information. If the cost is too high, players never learn the group of their opponents in the long run. If one group has stronger preferences for its favorite action than the other, or its size is sufficiently large compared to the other group, every player plays that group's favorite action. If both groups are strong enough in preferences, or if neither group's size is large enough, players play their favorite actions and miscoordinate in inter-group interactions. Lower levels of the cost favor coordination. Indeed, when the cost is low, players always coordinate on their favorite action in inside-group interactions, while in inter-group interactions they coordinate on the favorite action of the group that has stronger preferences or is large enough.
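The information-cost tradeoff can be made concrete with a toy expected-payoff comparison; the payoff values and the 60/40 group split are illustrative assumptions, not the model's parameters:

```python
# Toy expected payoffs for one player: stay uninformed and always play the
# own favorite, vs. pay `cost` to learn the opponent's group and coordinate
# in every match. Payoff 2 for own-favorite coordination, 1 for the other.

def payoff_uninformed(p_own_group, strong=2.0):
    """Always play own favorite: coordinate only inside the group."""
    return p_own_group * strong

def payoff_informed(p_own_group, cost, strong=2.0, weak=1.0):
    """Learn the group: own favorite inside, opponent's favorite outside."""
    return p_own_group * strong + (1 - p_own_group) * weak - cost

# With a 60/40 split, learning pays only when the cost is low enough.
for c in (0.1, 0.6):
    print(c, payoff_informed(0.6, c) > payoff_uninformed(0.6))
# learning is worthwhile at c=0.1 but not at c=0.6
```

This mirrors the abstract's finding that high information costs keep players uninformed, while low costs support conditioning on the opponent's group.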

