Mechanism Design for Facility Location Problems: A Survey

Author(s):  
Hau Chan ◽  
Aris Filos-Ratsikas ◽  
Bo Li ◽  
Minming Li ◽  
Chenhao Wang

The study of approximate mechanism design for facility location has been at the center of research at the intersection of artificial intelligence and economics for the last decade, largely due to its practical importance in various domains, such as social planning and clustering. At a high level, the goal is to select a number of locations on which to build a set of facilities, aiming to optimize some social objective based on the preferences of strategic agents, who might have incentives to misreport their private information. This paper presents a comprehensive survey of the significant progress that has been made since the introduction of the problem, highlighting all the different variants and methodologies, as well as the most interesting directions for future research.
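
A canonical example of the kind of mechanism this literature studies is the median rule for locating a single facility on a line. The Python sketch below is a minimal illustration (not taken from the survey): for agents with single-peaked preferences, placing the facility at the median of the reports is strategyproof and also minimizes the social cost.

```python
import statistics

def median_mechanism(reports):
    """Place one facility at the median of the reported locations.

    For single-peaked preferences on a line, the median is strategyproof:
    no agent can pull the facility closer to their true location by
    misreporting. It also minimizes the sum of distances (social cost).
    """
    return statistics.median_low(reports)  # median_low returns a reported point

def social_cost(facility, locations):
    return sum(abs(facility - x) for x in locations)

reports = [0.1, 0.4, 0.9]
facility = median_mechanism(reports)  # facility placed at 0.4
print(facility, social_cost(facility, reports))
```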

2018 ◽  
Vol 18 (5-6) ◽  
pp. 759-805 ◽  
Author(s):  
THOM FRÜHWIRTH

Abstract. Constraint Handling Rules (CHR) is both an effective concurrent declarative programming language and a versatile computational logic formalism. In CHR, guarded reactive rules rewrite a multiset of constraints. Concurrency is inherent, since rules can be applied to the constraints in parallel. In this comprehensive survey, we give an overview of the concurrent, parallel, and distributed CHR semantics, standard and more exotic, that have been proposed over the years at various levels of refinement. These semantics range from the abstract to the concrete. They are related by formal soundness results. Their correctness is proven as a correspondence between parallel and sequential computations. On the more practical side, we present common concise example CHR programs that have been widely used in experiments and benchmarks. We review parallel and distributed CHR implementations in software as well as hardware. The experimental results obtained show a parallel speed-up for unmodified sequential CHR programs. The software implementations are available online for free download, and we give the web links. Due to its high level of abstraction, the CHR formalism can also be used to implement and analyse models for concurrency. To this end, the Software Transaction Model, the Actor Model, Colored Petri Nets and the Join-Calculus have been faithfully encoded in CHR. Finally, we identify and discuss commonalities of the approaches surveyed and indicate what problems are left open for future research.
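
The flavor of CHR's multiset rewriting can be conveyed with the classic two-rule gcd program, one of the concise examples commonly used in CHR benchmarks. The Python sketch below is only an illustrative emulation under simplifying assumptions (sequential, naive rule matching), not an actual CHR implementation: it applies the two rules to a constraint store until no rule fires.

```python
def chr_gcd(goal):
    r"""Naively simulate the classic CHR gcd program.

    CHR rules:
        gcd(0) <=> true.                         % remove a zero constraint
        gcd(N) \ gcd(M) <=> M >= N | gcd(M - N). % reduce the larger value
    """
    store = list(goal)  # the constraint store: a multiset of gcd/1 arguments
    changed = True
    while changed:
        changed = False
        if 0 in store:                 # rule 1: gcd(0) <=> true.
            store.remove(0)
            changed = True
            continue
        for i, n in enumerate(store):  # rule 2: keep gcd(N), rewrite gcd(M)
            for j, m in enumerate(store):
                if i != j and m >= n:
                    store[j] = m - n
                    changed = True
                    break
            if changed:
                break
    return store

print(chr_gcd([12, 8]))  # -> [4]
```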


1981 ◽  
Vol 13 (8) ◽  
pp. 1001-1028 ◽  
Author(s):  
G Leonardi

This paper, a condensed report on the present state of the work in the Public Facility Location Task (formerly the Normative Location Modeling Task) at IIASA, has three main aims: first, to build a general framework for location problems; second, to use this framework to unify existing location models; and, third, to use the framework to develop new, more general, and more meaningful location models. Suggestions are also given on how to introduce multiple services and multiple time periods into location problems. The multiactivity dynamic location models that this perspective generates are the subject of future research in the Public Facility Location Task. This first part of the paper gives a nontechnical description of the proposed general framework for analyzing location problems. The second part, which will appear in the next issue, will describe mathematical models for static, single-service facility location problems and their possible extensions and improvements.


Methodology ◽  
2017 ◽  
Vol 13 (1) ◽  
pp. 9-22 ◽  
Author(s):  
Pablo Livacic-Rojas ◽  
Guillermo Vallejo ◽  
Paula Fernández ◽  
Ellián Tuero-Herrero

Abstract. Low precision in the inferences drawn from data analyzed with univariate or multivariate Analysis of Variance (ANOVA) models in repeated-measures designs is associated with non-normally distributed data, nonspherical covariance structures with freely varying variances and covariances, lack of knowledge of the error structure underlying the data, and the wrong choice of covariance structure by different selectors. In this study, the levels of statistical power exhibited by the Modified Brown–Forsythe (MBF) procedure and two mixed-model approaches (Akaike's criterion and the Correctly Identified Model [CIM]) are compared. The data were analyzed with the Monte Carlo simulation method using the statistical package SAS 9.2, a split-plot design, and six manipulated variables. The results show that the procedures exhibit high statistical power for within-subjects and interaction effects, and moderate to low power for between-groups effects under the different conditions analyzed. For the latter, only the Modified Brown–Forsythe shows a high level of power, mainly for groups with 30 cases and Unstructured (UN) and Autoregressive Heterogeneity (ARH) matrices. For this reason, we recommend using this procedure, since it exhibits higher levels of power for all effects and does not require assuming the matrix type that underlies the structure of the data. Future research should compare the power of corrected selectors using single-level and multilevel designs for fixed and random effects.
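
The Monte Carlo logic behind such power comparisons is simple to sketch. The hypothetical Python example below (the study itself used SAS 9.2; the test, effect size, and sample size here are placeholders, not the authors' settings) estimates the empirical power of a between-groups test by repeatedly simulating data under a fixed alternative and counting rejections.

```python
import numpy as np
from scipy import stats

def empirical_power(n_per_group=30, effect=0.5, sd=1.0,
                    alpha=0.05, n_sims=5000, seed=0):
    """Monte Carlo estimate of the power of a one-way ANOVA F test.

    Simulates two groups whose means differ by `effect` standard
    deviations, tests each simulated data set, and returns the
    proportion of rejections at level `alpha` (the empirical power).
    """
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        g1 = rng.normal(0.0, sd, n_per_group)
        g2 = rng.normal(effect * sd, sd, n_per_group)
        _, p = stats.f_oneway(g1, g2)
        if p < alpha:
            rejections += 1
    return rejections / n_sims

print(f"power ≈ {empirical_power():.3f}")
```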


2019 ◽  
pp. 68-72
Author(s):  
E. A. Volkova

A monograph, “Vegetation and biotopes of the ‘Narochansky’ National Park”, edited by A. V. Pugachevsky, was published in Minsk, Belarus in 2017 (Grummo et al., 2017). It includes the Map of terrestrial vegetation (scale 1 : 60 000) and the Map of biotopes (scale 1 : 60 000). Some small-scale maps are also given, such as the Map of changes in forest cover of the “Narochansky” National Park for the period 1985–2016 and the Map of forest loss in the “Narochansky” National Park for the period 1985–2016, along with a series of inventory and analytical maps of the Naroch Lake basin. The monograph can be considered a small regional atlas with detailed explanatory texts for the maps. It presents the experience in vegetation mapping accumulated in the Laboratory of Geobotany and Vegetation Mapping of the Institute of Experimental Botany of the National Academy of Sciences of Belarus. Despite some critical comments, mainly concerning the biotope map, this publication by Belarusian geobotanists deserves approval. It gives full answers to the questions posed: “What do we protect?” and “What is the current state of the vegetation of the National Park, and what are the main trends in its dynamics?” The cartographic design is of a high standard; the maps have both scientific and practical importance for the planning of environmental and economic activities.


2020 ◽  
Author(s):  
Sina Faizollahzadeh Ardabili ◽  
Amir Mosavi ◽  
Pedram Ghamisi ◽  
Filip Ferdinand ◽  
Annamaria R. Varkonyi-Koczy ◽  
...  

Several outbreak prediction models for COVID-19 are being used by officials around the world to make informed decisions and enforce relevant control measures. Among the standard models for COVID-19 global pandemic prediction, simple epidemiological and statistical models have received more attention from authorities, and they are popular in the media. Due to a high level of uncertainty and a lack of essential data, standard models have shown low accuracy for long-term prediction. Although the literature includes several attempts to address this issue, the essential generalization and robustness abilities of existing models need to be improved. This paper presents a comparative analysis of machine learning and soft computing models to predict the COVID-19 outbreak as an alternative to SIR and SEIR models. Among a wide range of machine learning models investigated, two showed promising results: the multi-layered perceptron (MLP) and the adaptive network-based fuzzy inference system (ANFIS). Based on the results reported here, and due to the highly complex nature of the COVID-19 outbreak and the variation in its behavior from nation to nation, this study suggests machine learning as an effective tool to model the outbreak. This paper provides an initial benchmarking to demonstrate the potential of machine learning for future research. The paper further suggests that real novelty in outbreak prediction can be realized by integrating machine learning and SEIR models.
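
As a rough illustration of the machine-learning approach the paper advocates, the hypothetical sketch below fits a multi-layered perceptron to a cumulative case series using lagged values as features. The data, lag width, and hyperparameters are placeholders for illustration, not the authors' configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_lagged(series, n_lags=5):
    """Turn a 1-D series into (lag-window, next-value) training pairs."""
    X = [series[i:i + n_lags] for i in range(len(series) - n_lags)]
    return np.array(X), np.array(series[n_lags:])

# Toy stand-in for a cumulative case series (placeholder data, not real counts).
cases = np.array([10, 14, 21, 30, 44, 62, 90, 127, 180, 250, 345, 470], float)
cases /= cases.max()  # scale to [0, 1] so the MLP trains stably

X, y = make_lagged(cases)
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X[:-1], y[:-1])  # hold back the last window as a sanity check
print("predicted:", model.predict(X[-1:])[0], "actual:", y[-1])
```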


2020 ◽  
Vol 12 (11) ◽  
pp. 4460 ◽  
Author(s):  
Mohammadsoroush Tafazzoli ◽  
Ehsan Mousavi ◽  
Sharareh Kermanshachi

Although the two concepts of lean and sustainable construction were developed in response to different incentives and do not pursue exactly the same goals, there is considerable commonality between them. This paper discusses the potential for integrating the two approaches and their practices, and how the resulting synergy can lead to higher levels of fulfillment of the individual goals of each. Some limitations and challenges to implementing the integrated approach are also discussed. Based on a comprehensive review of existing papers on sustainable and lean construction topics, the commonality between the two approaches is discussed and grouped into five categories: (1) cost savings, (2) waste minimization, (3) jobsite safety improvement, (4) reduced energy consumption, and (5) improved customer satisfaction. The challenges of this integration are similarly identified and discussed in four main categories: (1) additional initial costs to the project, (2) difficulty of providing specialized expertise, (3) contractors' unwillingness to adopt the additional requirements, and (4) challenges in establishing a high level of teamwork. Industry professionals were then interviewed to rank the elements in each of the two categories of opportunities and challenges. The results of the study highlight how future research can pursue the development of a new Green-Lean approach by building on the commonalities and meeting the challenges of this integration.


Author(s):  
Mateusz Iwo Dubaniowski ◽  
Hans Rudolf Heinimann

A system-of-systems (SoS) approach is often used for simulating disruptions to business and infrastructure system networks, allowing several models to be integrated into one simulation. However, the integration is frequently challenging, as each system is designed individually with different characteristics, such as time granularity. Understanding the impact of time granularity on the propagation of disruptions between businesses and infrastructure systems, and finding the appropriate granularity for the SoS simulation, remain major challenges. To tackle these, we explore how time granularity, recovery time, and disruption size affect the propagation of disruptions between the constituent systems of an SoS simulation. We developed a high-level architecture (HLA) simulation of three networks and performed a series of simulation experiments. Our results reveal that time granularity, and especially recovery time, has a substantial impact on the propagation of disruptions. Consequently, we developed a model for selecting an appropriate time granularity for an SoS simulation based on the expected recovery time. Our simulation experiments show that the time granularity should be less than 1.13 times the expected recovery time. We identify some areas for future research centered on extending the space of experimental factors.
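
A toy illustration of why the ratio of time granularity to recovery time matters (a simplified sketch under strong assumptions, not the authors' HLA model): a downstream system that samples the SoS state every dt time units sees an upstream outage rounded up to whole steps, so coarse granularity overstates how long a disruption propagates.

```python
import math

def observed_outage(recovery_time, dt):
    """Outage length as seen on a simulation grid with step size dt.

    An upstream failure that truly lasts `recovery_time` occupies
    ceil(recovery_time / dt) whole steps, so the coarser the grid,
    the more the propagated disruption is overstated.
    """
    return math.ceil(recovery_time / dt) * dt

recovery = 1.0  # expected recovery time of the disrupted system
for dt in (0.3, 0.7, 1.5):
    seen = observed_outage(recovery, dt)
    print(f"dt={dt}  observed={seen:.2f}  overshoot={seen / recovery - 1:+.0%}")
```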


2021 ◽  
Vol 54 (4) ◽  
pp. 1-34
Author(s):  
Pengzhen Ren ◽  
Yun Xiao ◽  
Xiaojun Chang ◽  
Po-yao Huang ◽  
Zhihui Li ◽  
...  

Deep learning has made substantial breakthroughs in many fields due to its powerful automatic representation capabilities. It has been proven that neural architecture design is crucial to the feature representation of data and to final performance. However, the design of a neural architecture relies heavily on researchers' prior knowledge and experience, and the limits of inherent human knowledge make it difficult to break out of the original thinking paradigm and design an optimal model. An intuitive idea, therefore, is to reduce human intervention as much as possible and let an algorithm design the neural architecture automatically. Neural Architecture Search (NAS) is just such a revolutionary approach, and the related research work is rich and complex. A comprehensive and systematic survey of NAS is therefore essential. Previous related surveys have classified existing work mainly by the key components of NAS: search space, search strategy, and evaluation strategy. While this classification is intuitive, it makes it difficult for readers to grasp the challenges and the landmark work involved. Therefore, in this survey we provide a new perspective: we begin with an overview of the characteristics of the earliest NAS algorithms, summarize the problems in these early algorithms, and then present the solutions developed in subsequent research. In addition, we conduct a detailed and comprehensive analysis, comparison, and summary of these works. Finally, we suggest some possible directions for future research.
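
To make the three components concrete, the hypothetical sketch below frames NAS in miniature: a tiny search space of layer depths and widths, random search as the search strategy, and a stand-in evaluation function. The proxy score is an assumption for illustration; a real NAS run would train each candidate and report validation accuracy.

```python
import random

# Search space: depth and per-layer width of a feed-forward network.
DEPTHS = [1, 2, 3, 4]
WIDTHS = [16, 32, 64, 128]

def sample_architecture(rng):
    """Search strategy (here: random search) draws one candidate."""
    depth = rng.choice(DEPTHS)
    return [rng.choice(WIDTHS) for _ in range(depth)]

def evaluate(arch):
    """Evaluation strategy. A real NAS run would train the candidate and
    measure validation accuracy; this proxy merely prefers moderate
    capacity (a placeholder, not a real performance estimator)."""
    return -abs(sum(arch) - 160)  # peak score near 160 total units

rng = random.Random(0)
best, best_score = None, float("-inf")
for _ in range(100):  # search budget
    arch = sample_architecture(rng)
    score = evaluate(arch)
    if score > best_score:
        best, best_score = arch, score
print("best architecture found:", best)
```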

