Worst-case & average-case efficiency trade-offs for search problems

2021 ◽  
Author(s):  
Preeti Sharma

Evacuation problems fall under the broad area of search theory and operations research. The problem of evacuating two robots from a unit disc has been studied with the aim of minimizing evacuation time. Work so far has focused on improving the worst-case evacuation time with deterministic algorithms. We study the average-case evacuation time using randomized algorithms, while considering the efficiency trade-off between worst-case and average-case costs. Our other contribution is an analysis of average-case and worst-case costs for the cowpath problem (another search problem), which helped us develop a parallel approach for the evacuation problem.
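For readers unfamiliar with the cowpath problem, here is a minimal Python sketch (not the author's algorithm) of the classic deterministic doubling strategy on the line, whose worst-case cost approaches 9 times the distance to the target; randomizing the turning points is what improves the average case:

```python
def doubling_search_cost(target: float, first_step: float = 1.0) -> float:
    """Total distance walked until a target at signed position `target`
    (assumed nonzero) on the line is found with the doubling strategy."""
    walked = 0.0
    reach = first_step
    direction = 1                     # start by sweeping to the right
    while True:
        # this sweep finds the target if it lies within `reach` on this side
        if direction * target > 0 and abs(target) <= reach:
            return walked + abs(target)
        walked += 2 * reach           # walk out to `reach` and back to 0
        reach *= 2
        direction = -direction

if __name__ == "__main__":
    d = 2 ** 10 + 1e-6                # just past a turning point: a bad case
    print(doubling_search_cost(d) / d)  # ratio close to the worst case of 9
```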


2005 ◽  
Vol 15 (02) ◽  
pp. 151-166
Author(s):  
TAKESHI KANDA ◽  
KOKICHI SUGIHARA

This paper studies the two-dimensional range search problem and constructs a simple and efficient algorithm based on the Voronoi diagram. In this problem, a set of points and a query range are given, and we want to enumerate all the points inside the query range as quickly as possible. Most previous research on this problem restricts the query range to particular shapes such as circles, rectangles, and triangles, and pursues improvements in worst-case performance. The algorithm proposed in this paper, by contrast, is designed for a general query-range shape in two-dimensional space and is intended to achieve good average-case performance. This performance is confirmed by numerical experiments, in which we compare the execution time of the proposed algorithm with those of other representative algorithms, such as ones based on the bucketing technique and the k-d tree. Our algorithm shows better performance in almost all cases.
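As a rough illustration of range search with a general query shape (a simplified analogue, not the authors' Voronoi-based algorithm), one can collect candidates from a bounding circle using a k-d tree, one of the paper's comparison baselines, and then filter with an arbitrary membership predicate; the example shape below is an assumption:

```python
import numpy as np
from scipy.spatial import cKDTree

def range_search(points, center, radius, inside):
    """Return points p with inside(p) True, assuming the query shape
    fits inside the circle (center, radius). In real use the tree
    would be built once during preprocessing, not per query."""
    tree = cKDTree(points)
    candidates = tree.query_ball_point(center, radius)
    return [points[i] for i in candidates if inside(points[i])]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.random((10_000, 2))
    # Illustrative query shape: an axis-aligned ellipse at (0.5, 0.5).
    inside = lambda p: ((p[0] - 0.5) / 0.3) ** 2 + ((p[1] - 0.5) / 0.1) ** 2 <= 1.0
    hits = range_search(pts, center=(0.5, 0.5), radius=0.3, inside=inside)
    print(len(hits), "points inside the query range")
```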


2019 ◽  
Vol 2019 (2) ◽  
pp. 166-186
Author(s):  
Hans Hanley ◽  
Yixin Sun ◽  
Sameer Wagh ◽  
Prateek Mittal

Recent work has shown that Tor is vulnerable to attacks that manipulate inter-domain routing to compromise user privacy. Proposed solutions such as Counter-RAPTOR [29] attempt to ameliorate this issue by favoring Tor entry relays that have high resilience to these attacks. However, because these defenses bias Tor path selection on the identity of the client, they invariably leak probabilistic information about client identities. In this work, we make the following contributions. First, we identify a novel means to quantify privacy leakage in guard selection algorithms using the metric of Max-Divergence. Max-Divergence ensures that probabilistic privacy loss is within strict bounds while also providing composability over time. Second, we utilize Max-Divergence and multiple notions of entropy to understand worst-case privacy loss for Counter-RAPTOR. Our worst-case analysis provides a fresh perspective to the field, as prior work such as Counter-RAPTOR only analyzed average-case privacy loss. Third, we propose modifications to Counter-RAPTOR that incorporate worst-case Max-Divergence in its design. Specifically, we utilize the exponential mechanism (a mechanism for differential privacy) to guarantee a worst-case bound on Max-Divergence/privacy loss. For the quality function used in the exponential mechanism, we show that a Monte-Carlo sampling-based method for stochastic optimization can be used to improve multi-dimensional trade-offs between security, privacy, and performance. Finally, we demonstrate that compared to Counter-RAPTOR, our approach achieves an 83% decrease in Max-Divergence after one guard selection and a 245% increase in worst-case Shannon entropy after 5 guard selections. Notably, experimental evaluations using the Shadow emulator show that our approach provides these privacy benefits with minimal impact on system performance.
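As a sketch of the differential-privacy building block mentioned above, the exponential mechanism selects a candidate with probability proportional to exp(ε·q/2Δ); the quality scores below are illustrative placeholders, not Counter-RAPTOR's actual resilience-based quality function:

```python
import numpy as np

def exponential_mechanism(qualities: np.ndarray, eps: float,
                          sensitivity: float, rng=None) -> int:
    """Sample index i with probability proportional to
    exp(eps * qualities[i] / (2 * sensitivity))."""
    rng = rng or np.random.default_rng()
    scores = eps * qualities / (2.0 * sensitivity)
    weights = np.exp(scores - scores.max())   # shift for numerical stability
    probs = weights / weights.sum()
    return rng.choice(len(qualities), p=probs)

if __name__ == "__main__":
    # Placeholder resilience-weighted quality per candidate guard.
    q = np.array([0.9, 0.7, 0.85, 0.4])
    guard = exponential_mechanism(q, eps=1.0, sensitivity=1.0)
    print("selected guard index:", guard)
```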


2020 ◽  
Vol 15 (1) ◽  
pp. 60-71
Author(s):  
Thijs Laarhoven

We revisit the approximate Voronoi cells approach for solving the closest vector problem with preprocessing (CVPP) on high-dimensional lattices, and settle the open problem of Doulgerakis–Laarhoven–De Weger [PQCrypto, 2019] of determining exact asymptotics on the volume of these Voronoi cells under the Gaussian heuristic. As a result, we obtain improved upper bounds on the time complexity of the randomized iterative slicer when using less than $2^{0.076d + o(d)}$ memory, and we show how to obtain time–memory trade-offs even when using less than $2^{0.048d + o(d)}$ memory. We also settle the open problem of obtaining a continuous trade-off between the size of the advice and the query time complexity: the time complexity with subexponential advice in our approach scales as $d^{d/2 + o(d)}$, matching worst-case enumeration bounds and achieving the same asymptotic scaling as average-case enumeration algorithms for the closest vector problem.
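To convey the core reduction step, here is a minimal sketch of an iterative slicer for CVP, under the simplifying assumption that a short preprocessed list of lattice vectors is given; the full CVPP algorithm uses Voronoi-relevant vectors and rerandomization, which this toy loop omits:

```python
import numpy as np

def iterative_slicer(target: np.ndarray, short_vectors: np.ndarray) -> np.ndarray:
    """Reduce `target` modulo the lattice using the list `short_vectors`;
    the returned t' satisfies target - t' in the lattice, and no listed
    vector brings t' strictly closer to the origin."""
    t = target.astype(float).copy()
    improved = True
    while improved:
        improved = False
        for v in short_vectors:
            for w in (v, -v):
                if np.linalg.norm(t - w) < np.linalg.norm(t):
                    t = t - w            # this vector shortens t: apply it
                    improved = True
    return t

if __name__ == "__main__":
    # Toy 2D lattice Z^2: the unit vectors serve as the short list.
    L = np.array([[1.0, 0.0], [0.0, 1.0]])
    t = np.array([3.7, -2.2])
    r = iterative_slicer(t, L)
    print("remainder:", r, "closest lattice vector:", t - r)
```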


Author(s):  
Sunil Pathak

Background: Significant work has been done on identifying suspects, gathering information, and examining videos from CCTV footage. This work aims to recognize suspicious activities, i.e., object exchange, entry of another person, peeping into another's answer sheet, and person exchange, from video captured by a surveillance camera during examinations. This requires face recognition, hand recognition, and detection of contact between the face and hands of the same person as well as between different people. Methods: Segmented frames are given as input to obtain foreground images with the help of Gaussian filtering and a background modeling method. These foreground images are then passed to an activity recognition model to classify activity as normal or suspicious. Results: Accuracy, precision, and recall are calculated for activity detection and contact detection in the best, average, and worst cases. Simulation results are compared across performance scenarios such as material exchange, position exchange, introduction of a new person, face and hand detection, and multi-person settings. Conclusion: In this paper, a framework is presented for suspect detection. This framework could bring about a revolution in the field of security surveillance in the education domain.
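A minimal sketch of the foreground-extraction step described in the Methods (Gaussian filtering plus background modeling), using OpenCV's MOG2 subtractor as a stand-in for the paper's background model; the video filename is hypothetical:

```python
import cv2

def foreground_masks(video_path: str):
    """Yield (frame, foreground mask) pairs for each frame of the video."""
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        blurred = cv2.GaussianBlur(frame, (5, 5), 0)  # suppress sensor noise
        mask = subtractor.apply(blurred)              # per-pixel fg/bg label
        yield frame, mask
    cap.release()

if __name__ == "__main__":
    for frame, mask in foreground_masks("exam_room.avi"):  # hypothetical file
        cv2.imshow("foreground", mask)
        if cv2.waitKey(1) == 27:                      # Esc to quit
            break
```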


2014 ◽  
Vol 2014 ◽  
pp. 1-11
Author(s):  
Wei Zhou ◽  
Zilong Tan ◽  
Shaowen Yao ◽  
Shipu Wang

Resource location in structured P2P systems has a critical influence on system performance. Existing analytical studies of the Chord protocol have shown some potential improvements in performance. In this paper, a new splay-tree-based Chord structure called SChord is proposed to improve the efficiency of locating resources. We consider a novel implementation of the Chord finger table (routing table) based on the splay tree. This approach extends the Chord finger table with additional routing entries. An adaptive routing algorithm is proposed for the implementation, and it can be shown that the hop count is significantly reduced without introducing any other protocol overhead. We analyze the hop count of the adaptive routing algorithm, as compared to Chord variants, and demonstrate sharp upper and lower bounds for both worst-case and average-case settings. In addition, we theoretically analyze the hop reduction in SChord and show that SChord can significantly reduce routing hops as compared to Chord. Several simulations are presented to evaluate the performance of the algorithm and support our analytical findings. The simulation results show the efficiency of SChord.
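Since SChord's splay-tree-extended finger table is specific to the paper, the sketch below shows only the baseline it improves on: greedy Chord finger-table routing with hop counting, which exhibits the O(log N) hop behavior that SChord's additional routing entries aim to reduce:

```python
import bisect
import random

M = 16                       # identifier bits; ring positions 0 .. 2^M - 1
RING = 1 << M

def successor(members, x):
    """First live node clockwise from position x (members is sorted)."""
    i = bisect.bisect_left(members, x % RING)
    return members[i % len(members)]

def dist(a, b):
    """Clockwise distance from a to b on the identifier ring."""
    return (b - a) % RING

def route_hops(members, src, key):
    """Hops taken by greedy finger-table routing from src to key's owner."""
    owner = successor(members, key)
    cur, hops = src, 0
    while cur != owner:
        # cur's finger i points at successor(cur + 2^i)
        table = {successor(members, cur + (1 << i)) for i in range(M)}
        # forward to the finger closest to the key without passing it
        best = min((f for f in table if dist(f, key) < dist(cur, key)),
                   key=lambda f: dist(f, key), default=owner)
        cur = best
        hops += 1
    return hops

if __name__ == "__main__":
    random.seed(1)
    members = sorted(random.sample(range(RING), 1024))
    print("hops:", route_hops(members, members[0], key=12345))
```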


Algorithmica ◽  
2021 ◽  
Author(s):  
Jie Zhang

Apart from the principles and methodologies inherited from Economics and Game Theory, the studies in Algorithmic Mechanism Design typically employ the worst-case analysis and design of approximation schemes of Theoretical Computer Science. For instance, the approximation ratio, which is the canonical measure of evaluating how well an incentive-compatible mechanism approximately optimizes the objective, is defined in the worst-case sense. It compares the performance of the optimal mechanism against the performance of a truthful mechanism, for all possible inputs. In this paper, we take the average-case analysis approach, and tackle one of the primary motivating problems in Algorithmic Mechanism Design—the scheduling problem (Nisan and Ronen, in: Proceedings of the 31st annual ACM symposium on theory of computing (STOC), 1999). One version of this problem, which includes a verification component, is studied by Koutsoupias (Theory Comput Syst 54(3):375–387, 2014). It was shown that the problem has a tight approximation ratio bound of $(n+1)/2$ for the single-task setting, where n is the number of machines. We show, however, that when the costs of the machines for executing the task follow any independent and identical distribution, the average-case approximation ratio of the mechanism given by Koutsoupias (Theory Comput Syst 54(3):375–387, 2014) is upper bounded by a constant. This positive result asymptotically separates the average-case ratio from the worst-case ratio. It indicates that the optimal mechanism devised for a worst-case guarantee works well on average.
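To make the distinction concrete, the two ratios can be sketched as follows (notation assumed, not the paper's exact formalism), with cost profiles $c$ drawn i.i.d. from a distribution $\mathcal{D}$:

```latex
% ALG_M(c): the mechanism's objective value on cost profile c;
% OPT(c): the optimal objective value on c (notation illustrative).
\[
  r_{\mathrm{worst}}(M) \;=\; \sup_{c} \frac{\mathrm{ALG}_M(c)}{\mathrm{OPT}(c)},
  \qquad
  r_{\mathrm{avg}}(M) \;=\; \mathbb{E}_{c \sim \mathcal{D}^n}
    \!\left[ \frac{\mathrm{ALG}_M(c)}{\mathrm{OPT}(c)} \right].
\]
```

The paper's result is that $r_{\mathrm{worst}} = (n+1)/2$ while $r_{\mathrm{avg}} = O(1)$ for the mechanism in question.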


2017 ◽  
Vol 139 (11) ◽  
Author(s):  
Wei Chen ◽  
Mark Fuge

To solve a design problem, sometimes it is necessary to identify the feasible design space. For design spaces with implicit constraints, sampling methods are usually used. These methods typically bound the design space; that is, they limit the range of design variables. But bounds that are too small will fail to cover all possible designs, while bounds that are too large will waste sampling budget. This paper tries to solve the problem of efficiently discovering (possibly disconnected) feasible domains in an unbounded design space. We propose a data-driven adaptive sampling technique—ε-margin sampling—which learns the domain boundary of feasible designs and also expands our knowledge of the design space as the available budget increases. This technique is data-efficient, in that it makes principled probabilistic trade-offs between refining existing domain boundaries and expanding the design space. We demonstrate that this method can better identify feasible domains on standard test functions compared to both random and active sampling (via uncertainty sampling). However, a fundamental problem when applying adaptive sampling to real-world designs is that designs often have high dimensionality and thus require (in the worst case) exponentially more samples per dimension. We show how coupling design manifolds with ε-margin sampling allows us to actively expand high-dimensional design spaces without incurring this exponential penalty. We demonstrate this on real-world examples of glassware and bottle design, where our method discovers designs that differ in appearance and functionality from the initial design set.
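The sketch below captures the spirit of ε-margin sampling rather than the authors' exact algorithm: a probabilistic classifier is refit as labels arrive, candidates whose predicted feasibility lies within a margin ε of the 0.5 decision boundary are preferred, and the candidate box slowly expands with the budget; the feasibility test and expansion schedule are illustrative assumptions:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier

def adaptive_sample(feasible, n_init=20, budget=100, eps=0.1, seed=0):
    """Label points adaptively, favoring the feasible/infeasible boundary
    while gradually expanding the sampled region."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1, 1, size=(n_init, 2))
    y = np.array([feasible(x) for x in X])
    for t in range(budget):
        half = 1.0 + 0.04 * t                    # slowly expand the box
        cand = rng.uniform(-half, half, size=(256, 2))
        clf = GaussianProcessClassifier().fit(X, y)
        p = clf.predict_proba(cand)[:, 1]        # predicted P(feasible)
        margin = np.abs(p - 0.5)
        near = cand[margin < eps]                # candidates near the boundary
        x_next = near[0] if len(near) else cand[np.argmin(margin)]
        X = np.vstack([X, x_next])
        y = np.append(y, feasible(x_next))
    return X, y

if __name__ == "__main__":
    # Illustrative feasible set: two disconnected discs, one outside
    # the initial sampling box.
    feas = lambda x: (np.linalg.norm(x) < 0.8
                      or np.linalg.norm(x - np.array([3.0, 0.0])) < 1.0)
    X, y = adaptive_sample(feas)
    print("feasible designs found:", int(y.sum()), "of", len(y))
```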

