Performance guarantees: Recently Published Documents


TOTAL DOCUMENTS: 476 (last five years: 138)
H-INDEX: 30 (last five years: 6)

2022 ◽ Vol 22 (1) ◽ pp. 1-18 ◽ Author(s): Alessio Pagani, Zhuangkun Wei, Ricardo Silva, Weisi Guo

Infrastructure monitoring is critical for safe operation and sustainability. Like many networked systems, water distribution networks (WDNs) exhibit both graph topological structure and complex embedded flow dynamics. The resulting networked cascade dynamics are difficult to predict without extensive sensor data. However, ubiquitous sensor monitoring in underground settings is expensive, and a key challenge is to infer contaminant dynamics from sparse, partial monitoring data. Existing approaches use multi-objective optimization to find a minimum set of essential monitoring points but lack performance guarantees and a theoretical framework. Here, we first develop a novel Graph Fourier Transform (GFT) operator that compresses networked contamination dynamics to identify the essential principal data collection points with inference performance guarantees; the GFT approach thus provides a theoretical sampling bound. We then go below this bound by building auto-encoder (AE) neural networks (NNs) that generalize the GFT sampling process and under-sample further from the initial sampling set, allowing a very small set of data points to largely reconstruct the contamination dynamics over real and artificial WDNs. Testing various contamination sources, we obtain high-accuracy reconstruction using around 5%–10% of the network nodes for known contaminant sources and 50%–75% for unknown sources; although these sets are larger than those needed for contaminant detection and source identification, they are smaller than current sampling schemes for contaminant data recovery. This general approach of compression and under-sampled recovery via NNs can be applied to a wide range of networked infrastructures to enable efficient data sampling for digital twins.
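The GFT sampling idea above can be sketched in a few lines: a signal that is bandlimited to the first k Laplacian eigenvectors can be recovered exactly from k well-chosen node readings. Everything here (the six-node path graph, the synthetic signal, the choice of monitored nodes) is a hypothetical toy, not the paper's WDN model:

```python
import numpy as np

# Toy stand-in for a WDN: a 6-node path graph (hypothetical, not the paper's data)
A = np.zeros((6, 6))
for i in range(5):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A         # graph Laplacian
lam, U = np.linalg.eigh(L)             # GFT basis: Laplacian eigenvectors

k = 2                                  # assume the signal is bandlimited to k modes
x = U[:, :k] @ np.array([1.0, 0.5])    # synthetic "contamination" signal

sample = [0, 3]                        # monitor only these two nodes
coef, *_ = np.linalg.lstsq(U[sample, :k], x[sample], rcond=None)
x_hat = U[:, :k] @ coef                # reconstruct the signal at every node
```

Because the sampled 2x2 submatrix of the GFT basis is invertible, `x_hat` matches `x` up to round-off; the paper's AE networks then push the sample count below this GFT bound.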


2022 ◽ Author(s): Mahsa Derakhshan, Negin Golrezaei, Vahideh Manshadi, Vahab Mirrokni

On online platforms, consumers face an abundance of options that are displayed in the form of a position ranking. Only products placed in the first few positions are readily accessible to the consumer, and she needs to exert effort to access more options. For such platforms, we develop a two-stage sequential search model where, in the first stage, the consumer sequentially screens positions to observe the preference weight of the products placed in them and forms a consideration set. In the second stage, she observes the additional idiosyncratic utility that she can derive from each product and chooses the highest-utility product within her consideration set. For this model, we first characterize the optimal sequential search policy of a welfare-maximizing consumer. We then study how platforms with different objectives should rank products. We focus on two objectives: (i) maximizing the platform’s market share and (ii) maximizing the consumer’s welfare. Somewhat surprisingly, we show that ranking products in decreasing order of their preference weights does not necessarily maximize market share or consumer welfare. Such a ranking may shorten the consumer’s consideration set due to the externality effect of high-positioned products on low-positioned ones, leading to insufficient screening. We then show that both problems—maximizing market share and maximizing consumer welfare—are NP-complete. We develop novel near-optimal polynomial-time ranking algorithms for each objective. Further, we show that, even though ranking products in decreasing order of their preference weights is suboptimal, such a ranking enjoys strong performance guarantees for both objectives. We complement our theoretical developments with numerical studies using synthetic data, in which we show (1) that heuristic versions of our algorithms that do not rely on model primitives perform well and (2) that our model can be effectively estimated using a maximum likelihood estimator. 
This paper was accepted by Gabriel Weintraub, revenue management and market analytics.
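The externality effect described above (a high-weight product placed early can shorten the consideration set and cost the platform sales) can be illustrated with a deliberately simplified simulation. The satisficing stopping rule, the weights, the noise scale, and the outside-option utility below are all assumptions for illustration, not the paper's model:

```python
import itertools
import random

random.seed(0)
W = {"A": 0.9, "B": 0.6, "C": 0.5}   # preference weights (hypothetical)
SATISFICE = 0.8                      # stop screening once a weight this high is seen (assumed rule)
OUTSIDE = 1.0                        # utility of the outside option (assumed)

def market_share(order, trials=20000):
    """Monte-Carlo purchase probability under the simplified two-stage search."""
    buys = 0
    for _ in range(trials):
        consideration = []
        for p in order:              # stage 1: sequential screening of positions
            consideration.append(p)
            if W[p] >= SATISFICE:    # good enough: consumer stops searching
                break
        # stage 2: idiosyncratic utility revealed; buy if best beats outside option
        best = max(W[p] + random.gauss(0, 0.3) for p in consideration)
        buys += best > OUTSIDE
    return buys / trials

shares = {o: market_share(o) for o in itertools.permutations(W)}
```

Under these assumptions, the descending-weight ranking ("A", "B", "C") truncates screening after one product and earns a lower share than an ascending ranking, mirroring the abstract's claim that decreasing-weight ranking is not always optimal.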


2021 ◽ Vol 46 (4) ◽ pp. 1-49 ◽ Author(s): Alejandro Grez, Cristian Riveros, Martín Ugarte, Stijn Vansummeren

Complex event recognition (CER) has emerged as the unifying field for technologies that require processing and correlating distributed data sources in real time. CER finds applications in diverse domains, which has resulted in a large number of proposals for expressing and processing complex events. Existing CER languages lack a clear semantics, however, which makes them hard to understand and generalize. Moreover, there are no general techniques for evaluating CER query languages with clear performance guarantees. In this article, we embark on the task of giving a rigorous and efficient framework to CER. We propose a formal language for specifying complex events, called complex event logic (CEL), that contains the main features used in the literature and has a denotational and compositional semantics. We also formalize the so-called selection strategies, which had only been presented as by-design extensions to existing frameworks. We give insight into the language design trade-offs regarding the strict sequencing operators of CEL and selection strategies. With a well-defined semantics at hand, we discuss how to efficiently process complex events by evaluating CEL formulas with unary filters. We start by introducing a formal computational model for CER, called complex event automata (CEA), and study how to compile CEL formulas with unary filters into CEA. Furthermore, we provide efficient algorithms for evaluating CEA over event streams using constant time per event followed by output-linear delay enumeration of the results.
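A minimal sketch of the automaton-based evaluation idea: states connected by predicate-labelled transitions (unary filters), run over an event stream under skip-till-any semantics. The pattern, the events, and the naive run-set bookkeeping are hypothetical; the paper's algorithm achieves constant time per event with output-linear delay enumeration, which this sketch does not attempt:

```python
# Unary filters (hypothetical pattern: a hot reading followed later by a dry one)
def high_temp(e): return e["type"] == "temp" and e["value"] > 40
def low_hum(e): return e["type"] == "hum" and e["value"] < 20

# CEA as a dict: state -> [(predicate, next_state)]; state 2 is accepting
CEA = {0: [(high_temp, 1)], 1: [(low_hum, 2)]}

def evaluate(stream):
    """Return the lists of event indices forming each complete match."""
    runs, matches = [(0, [])], []
    for i, e in enumerate(stream):
        next_runs = []
        for state, picked in runs:
            next_runs.append((state, picked))     # skip-till-any: event may be ignored
            for pred, nxt in CEA.get(state, []):
                if pred(e):
                    if nxt == 2:                  # accepting state reached
                        matches.append(picked + [i])
                    else:
                        next_runs.append((nxt, picked + [i]))
        runs = next_runs
    return matches

stream = [{"type": "temp", "value": 45},
          {"type": "hum", "value": 30},
          {"type": "hum", "value": 10}]
```

Here `evaluate(stream)` pairs the hot reading at index 0 with the dry reading at index 2, skipping the non-matching event in between.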


2021 ◽ Author(s): Isabella Rodas Arango, Mateo Dulce Rubio, Alvaro J. Riascos Villegas

We address the tradeoff between developing good predictive models for police allocation and optimally deploying police officers over a city in a way that does not imply an unfair allocation of resources. We modify the fair allocation algorithm of [1] to tackle a real-world problem: crime in the city of Bogotá, Colombia. Our approach allows for more sophisticated prediction models, and we show that the whole methodology outperforms the city's current police allocation mechanism. Results show that even a simple model, such as a kernel density estimate of crime, yields much better predictions than the current police model while mitigating fairness concerns. Although we cannot provide general performance guarantees, our results apply to a real-life problem and should be seriously considered by policy makers.
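A kernel density estimate of the kind mentioned above can be sketched as follows; the crime coordinates, grid, bandwidth, and top-k patrol rule are synthetic assumptions for illustration, and no fairness constraint is applied here:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical historical crime coordinates: two spatial clusters
crimes = np.vstack([rng.normal([2, 2], 0.3, (50, 2)),
                    rng.normal([7, 7], 0.3, (30, 2))])

def kde(points, grid, bw=0.5):
    """Gaussian kernel density estimate evaluated at each grid cell."""
    d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bw ** 2)).sum(1) / (len(points) * 2 * np.pi * bw ** 2)

# Score a coarse 10x10 grid of city cells and patrol the densest cells
xs, ys = np.meshgrid(np.arange(10), np.arange(10))
cells = np.c_[xs.ravel(), ys.ravel()].astype(float)
risk = kde(crimes, cells)
patrol = cells[np.argsort(risk)[::-1][:5]]   # 5 officers, one per top-risk cell
```

The top-ranked cells land on the two synthetic hotspots; the paper's contribution is to temper exactly this kind of risk-greedy allocation with a fairness criterion.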


2021 ◽ Vol 66 (2) ◽ pp. 79 ◽ Author(s): S.-M. Avram

In this paper, we investigate the performance of students during the second semester of the academic year 2020–2021, when the Covid pandemic forced a complete move to online education. We examine the results obtained by students on laboratory work and on the practical and final exams, focusing on the impact of consistent behaviour (or the lack of it) on final student performance. We find that, even in an online setting, strong involvement (in terms of attendance and good performance) guarantees good final results. The investigation uses Formal Concept Analysis (FCA), a very powerful instrument that we have applied in previous research to detect student behaviour in an e-learning portal. A further set of results shows that changing the final-mark formula to weight the lab work more heavily brought the mark closer to students' actual overall performance.
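The core Formal Concept Analysis machinery, the two derivation operators and the closed (extent, intent) pairs, can be sketched on a hypothetical student-attribute context (the students and attributes below are invented, not the paper's data):

```python
from itertools import combinations

# Hypothetical student x attribute context from e-learning logs
context = {
    "s1": {"attends", "good_lab", "passed"},
    "s2": {"attends", "good_lab", "passed"},
    "s3": {"attends"},
    "s4": set(),
}
attrs = {"attends", "good_lab", "passed"}

def extent(intent_):   # students possessing every attribute in the intent
    return {s for s, a in context.items() if intent_ <= a}

def intent(extent_):   # attributes shared by every student in the extent
    return set.intersection(*(context[s] for s in extent_)) if extent_ else set(attrs)

# A formal concept is an (extent, intent) pair closed under both operators
concepts = set()
for r in range(len(attrs) + 1):
    for combo in combinations(sorted(attrs), r):
        e = frozenset(extent(set(combo)))
        concepts.add((e, frozenset(intent(e))))
```

For instance, {s1, s2} and {attends, good_lab, passed} form a concept: the consistently involved students are exactly those sharing all three attributes, which is the kind of grouping the paper reads off the concept lattice.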


Author(s): Shizhen Zhao, Peirui Cao, Xinbing Wang

As a first step in designing Optical-circuit-switched Data Centers (ODCs), physical topology design is critical: it determines the scalability and the performance limit of the entire ODC. However, prior works on ODCs have not paid much attention to physical topology design, and the adopted physical topologies either scale poorly or lack performance guarantees. In this paper, we offer a mathematical foundation for the design and performance analysis of ODC physical topologies. We introduce a new performance metric, β(G), to evaluate the gap between a physical topology G and the ideal physical topology. We develop a coupling technique that bypasses a significant amount of the computational complexity of calculating β(G). Using β(G) and the coupling technique, we study four physical topologies that are representative of those in the literature, analyze their scalability, and prove their performance guarantees. Our analysis may provide new guidance for network operators to design better physical topologies for their ODCs.
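The notion of a gap between a physical topology and the ideal one can be illustrated with a much cruder stand-in than the paper's β(G): comparing the brute-force bisection bandwidth of a candidate topology against an all-to-all ideal. This metric and the eight-node examples are assumptions for illustration only:

```python
import itertools

def bisection_bandwidth(n, edges):
    """Minimum cut capacity over all balanced bipartitions (brute force)."""
    best = None
    for left in itertools.combinations(range(n), n // 2):
        ls = set(left)
        cut = sum(1 for u, v in edges if (u in ls) != (v in ls))
        best = cut if best is None else min(best, cut)
    return best

n = 8
ring = [(i, (i + 1) % n) for i in range(n)]                  # scales cheaply, low capacity
mesh = [(i, j) for i in range(n) for j in range(i + 1, n)]   # "ideal" all-to-all

gap = bisection_bandwidth(n, mesh) / bisection_bandwidth(n, ring)
```

Any balanced cut of the ring severs only 2 links versus 16 for the full mesh, an 8x gap; the paper's coupling technique exists precisely because brute-force comparisons like this one do not scale.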


Author(s): Adrián Esteban-Pérez, Juan M. Morales

Abstract: We consider stochastic programs conditional on some covariate information, where the only knowledge of the possible relationship between the uncertain parameters and the covariates is reduced to a finite data sample of their joint distribution. By exploiting the close link between the notion of trimmings of a probability measure and the partial mass transportation problem, we construct a data-driven Distributionally Robust Optimization (DRO) framework to hedge the decision against the intrinsic error in the process of inferring conditional information from limited joint data. We show that our approach is computationally as tractable as the standard (without side information) Wasserstein-metric-based DRO and enjoys performance guarantees. Furthermore, our DRO framework can be conveniently used to address data-driven decision-making problems under contaminated samples. Finally, the theoretical results are illustrated using a single-item newsvendor problem and a portfolio allocation problem with side information.
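A conditional newsvendor of the kind described above can be sketched with a plain k-nearest-neighbour estimate of the conditional demand distribution; this stands in for, and does not implement, the paper's trimming-based Wasserstein DRO hedge. The costs, the demand model, and the neighbourhood size are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
# Joint sample of (covariate, demand): demand rises with the covariate (synthetic)
x = rng.uniform(0, 10, 200)
d = 20 + 3 * x + rng.normal(0, 2, 200)

c_under, c_over = 4.0, 1.0                 # newsvendor costs (hypothetical)
fractile = c_under / (c_under + c_over)    # classic critical-fractile target

def order_quantity(x0, k=30):
    """Order the critical fractile of the demands seen at the k nearest covariates."""
    nearest = np.argsort(np.abs(x - x0))[:k]
    return np.quantile(d[nearest], fractile)

q = order_quantity(5.0)
```

With demand near 35 when the covariate is 5, the order lands a little above the conditional mean, as the 4:1 underage-to-overage cost ratio dictates; the paper robustifies exactly this inference step against the error of estimating conditional information from finite joint data.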

