What search algorithm gives optimal average-case performance when search resources are highly limited?

Author(s):  
David Mutchler
2010 ◽  
Vol 5 (1) ◽  
pp. 78-88 ◽  
Author(s):  
Marcelo Porto ◽  
André Silva ◽  
Sergio Almeida ◽  
Eduardo Da Costa ◽  
Sergio Bampi

This paper presents a real-time HDTV (High Definition Television) architecture for Motion Estimation (ME) using efficient adder compressors. The architecture is based on the Quarter Sub-sampled Diamond Search (QSDS) algorithm with the Dynamic Iteration Control (DIC) algorithm. The main characteristic of the proposed architecture is the large number of Processing Units (PUs) used to calculate the SAD (Sum of Absolute Differences) metric. The internal structure of each PU is composed of a large number of addition operations to calculate the SADs. In this paper, efficient 4-2 and 8-2 adder compressors, which enable the simultaneous addition of 4 and 8 operands respectively, are used in the PU architecture to reach the performance needed to process HDTV video in real time at 30 frames per second. The PUs using adder compressors were applied to the ME architecture. The implemented architecture was described in VHDL and synthesized to FPGA and, with the Leonardo Spectrum tool, to the TSMC 0.18μm CMOS standard-cell technology. Synthesis results indicate that the new QSDS-DIC architecture reaches the best performance and enables gains of 12% in processing rate. The architecture achieves real time for full HDTV (1920x1080 pixels) at 65 frames per second in the worst case, and it can process 269 HDTV frames per second in the average case.
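As a point of reference, the SAD metric that the PUs accumulate can be sketched in a few lines. The Python fragment below is illustrative only, not the VHDL architecture described above: the 4x4 block size and the four-operand grouping (loosely mirroring what a 4-2 adder compressor does in hardware) are assumptions made for the example.

```python
# Illustrative sketch of the SAD (Sum of Absolute Differences) metric used in
# block-matching motion estimation. Not the VHDL architecture described above;
# the block size and the 4-operand grouping are assumptions for illustration.

def sad(current_block, candidate_block):
    """Sum of absolute differences between two equally sized pixel blocks."""
    diffs = [abs(c - r)
             for row_c, row_r in zip(current_block, candidate_block)
             for c, r in zip(row_c, row_r)]
    # A 4-2 adder compressor adds four operands at once; in software we can
    # mimic that by accumulating the absolute differences in groups of four.
    total = 0
    for i in range(0, len(diffs), 4):
        total += sum(diffs[i:i + 4])
    return total

# Example: two 4x4 blocks of pixel intensities.
cur = [[10, 12, 11, 13]] * 4
cand = [[11, 12, 10, 15]] * 4
print(sad(cur, cand))  # -> 16
```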


2006 ◽  
Vol 6 (6) ◽  
pp. 483-494
Author(s):  
T. Tulsi ◽  
L.K. Grover ◽  
A. Patel

The standard quantum search lacks a feature, enjoyed by many classical algorithms, of having a fixed point, i.e. monotonic convergence towards the solution. Recently a fixed-point quantum search algorithm has been discovered, referred to as the Phase-π/3 search algorithm, which gets around this limitation. While searching a database for a target state, this algorithm reduces the error probability from ε to ε^(2q+1) using q oracle queries, which has since been proved to be asymptotically optimal. A different algorithm is presented here, which has the same worst-case behavior as the Phase-π/3 search algorithm but much better average-case behavior. Furthermore, the new algorithm gives ε^(2q+1) convergence for all integral q, whereas the Phase-π/3 search algorithm requires q to be (3^n − 1)/2 with n a positive integer. In the new algorithm, the operations are controlled by two ancilla qubits, and fixed-point behavior is achieved by irreversible measurement operations applied to these ancillas. It is an example of how measurement can allow us to bypass some restrictions imposed by unitarity on quantum computing.
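To make the stated convergence concrete, the small computation below (an illustration added here, not part of the paper, with ε = 0.1 an arbitrary choice) shows how the error probability shrinks as ε^(2q+1) with the number of oracle queries q, and which query counts q = (3^n − 1)/2 the Phase-π/3 algorithm itself is restricted to.

```python
# Illustrative numbers only: a fixed-point search reduces the error
# probability from eps to eps**(2*q + 1) with q oracle queries, while the
# Phase-pi/3 algorithm only supports query counts q = (3**n - 1) / 2.

eps = 0.1  # assumed initial error probability, chosen for illustration

for q in range(1, 6):
    print(f"q = {q}: error <= {eps ** (2 * q + 1):.1e}")

phase_pi3_allowed = [(3 ** n - 1) // 2 for n in range(1, 5)]
print("q values available to the Phase-pi/3 algorithm:", phase_pi3_allowed)
# -> [1, 4, 13, 40]; the new algorithm described above allows every integer q.
```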


2011 ◽  
Vol 03 (04) ◽  
pp. 457-471 ◽  
Author(s):  
B. BALAMOHAN ◽  
P. FLOCCHINI ◽  
A. MIRI ◽  
N. SANTORO

In a network environment supporting mobile entities (called robots or agents), a black hole is a harmful site that destroys any incoming entity without leaving any visible trace. The black-hole search problem is the task of a team of k > 1 mobile entities, starting from the same safe location and executing the same algorithm, to determine within finite time the location of the black hole. In this paper, we consider the black-hole search problem in asynchronous ring networks of n nodes, and focus on time complexity. It is known that any algorithm for black-hole search in a ring requires at least 2(n - 2) time in the worst case. The best known algorithm achieves this bound with a team of n - 1 agents with an average time cost of 2(n - 2), equal to the worst case. In this paper, we first show how the same number of agents, using 2 extra time units in the worst case, can solve the problem in only [Formula: see text] time on the average. We then prove that the optimal average-case complexity of [Formula: see text] can be achieved without increasing the worst case using 2(n - 1) agents. Finally, we design an algorithm that achieves asymptotically optimal worst-case and average-case time complexities employing an optimal team of k = 2 agents, thus improving on the earlier results that required O(n) agents.


Algorithms ◽  
2022 ◽  
Vol 15 (1) ◽  
pp. 23
Author(s):  
Yang Zhang ◽  
Jiacheng Li ◽  
Lei Li

To overcome the shortcomings of the harmony search algorithm, such as its slow convergence rate and poor global search ability, a reward-population-based differential genetic harmony search algorithm is proposed. In this algorithm, the population is divided into four ordinary sub-populations and one reward sub-population, each of which evolves using the differential genetic harmony search strategy. After the evolution, the sub-population with the optimal average fitness is combined with the reward population to produce a new reward population. In the experiments, tests were first conducted to determine the values of the harmony memory size (HMS) and the harmony memory consideration rate (HMCR), followed by an analysis of the effect of these values on the performance of the proposed algorithm. Then, six benchmark functions were selected, and the results of the standard harmony search algorithm, the reward population harmony search algorithm, the differential genetic harmony algorithm, and the reward population-based differential genetic harmony search algorithm were compared. The results suggest that the reward population-based differential genetic harmony search algorithm has the merits of strong global search ability, high solution accuracy, and satisfactory stability.
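For readers unfamiliar with the baseline being improved, a minimal sketch of the standard harmony search loop is given below. It reproduces plain harmony search only, not the reward-population, differential, or genetic operators of the proposed algorithm; the HMS, HMCR, pitch-adjusting rate and bandwidth values are arbitrary illustrative choices.

```python
import random

# Minimal sketch of the standard harmony search algorithm (not the
# reward-population-based differential genetic variant proposed above).
# HMS, HMCR, PAR and the bandwidth are illustrative parameter choices.

def harmony_search(f, dim, lower, upper,
                   hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
    # Harmony memory: HMS random candidate solutions.
    memory = [[random.uniform(lower, upper) for _ in range(dim)]
              for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if random.random() < hmcr:                 # memory consideration
                value = random.choice(memory)[d]
                if random.random() < par:              # pitch adjustment
                    value += random.uniform(-bw, bw)
            else:                                      # random selection
                value = random.uniform(lower, upper)
            new.append(min(max(value, lower), upper))
        worst = max(memory, key=f)
        if f(new) < f(worst):                          # replace worst harmony
            memory[memory.index(worst)] = new
    return min(memory, key=f)

# Example on the sphere benchmark function (minimum at the origin).
def sphere(x):
    return sum(v * v for v in x)

best = harmony_search(sphere, dim=5, lower=-5.0, upper=5.0)
print(best, sphere(best))
```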


2011 ◽  
Vol 8 (2) ◽  
pp. 625-629
Author(s):  
Baghdad Science Journal

There are many methods of searching a large amount of data to find one particular piece of information, such as finding the name of a person in a mobile phone's contact list. Certain methods of organizing data make the search process more efficient; the objective of these methods is to find the element with the least cost (least time). The binary search algorithm is faster than sequential and other commonly used search algorithms. This research develops the binary search algorithm by using a new structure called Triple, in which data are represented as triples consisting of three locations (1-Top, 2-Left, and 3-Right). Binary search divides the search interval in half, so the maximum number of comparisons (the average-case complexity of the search) is O(log2 n) (read "big-O of log n", the order of magnitude) when searching a list of N elements. In this research the number of comparisons is reduced by using the Triple structure, so that the maximum number of comparisons becomes O(log2(n)/3 + 1) when searching for a key in a list of N elements.
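For comparison with the comparison counts quoted above, a plain binary search and its probe count are sketched below in Python; the Triple structure itself is only outlined in the abstract, so only the standard O(log2 n) baseline is shown here.

```python
# Baseline for the counts quoted above: standard binary search over a sorted
# list probes at most about log2(n) elements. The Triple structure described
# in the paper is not reproduced here.

import math

def binary_search(items, key):
    """Return (index of key or None, number of elements probed)."""
    lo, hi, probes = 0, len(items) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if items[mid] == key:
            return mid, probes
        elif items[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return None, probes

data = list(range(0, 2000, 2))          # 1000 sorted elements
idx, used = binary_search(data, 1998)
print(idx, used, math.ceil(math.log2(len(data))))  # worst case ~ log2(n) probes
```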


1985 ◽  
Vol 17 (01) ◽  
pp. 231-233 ◽  
Author(s):  
Dietmar Pfeifer

We give an upper bound for the average complexity (i.e. the expected number of steps until termination) of a continuous random search algorithm, using results from renewal theory. It is thus possible to show that for a predefined accuracy ε, the average complexity of the algorithm is O(−log ε) as ε → 0, which is optimal up to a constant factor.
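The rate quoted above can be made tangible with a toy experiment that is not the algorithm analysed in the paper: a random bisection of an interval around a hidden point, where each step keeps the sub-interval containing that point, shrinks the interval by a constant factor on average, so reaching accuracy ε takes on the order of −log ε steps.

```python
import random, math

# Toy illustration of the O(-log eps) rate, NOT the algorithm analysed above:
# random bisection of [0, 1] around a hidden target. Each step samples a
# uniform point and keeps the sub-interval containing the target, so the
# interval length shrinks by a constant factor in expectation.

def random_bisection_steps(target, eps):
    lo, hi, steps = 0.0, 1.0, 0
    while hi - lo > eps:
        x = random.uniform(lo, hi)
        if x < target:
            lo = x
        else:
            hi = x
        steps += 1
    return steps

random.seed(0)
for eps in (1e-2, 1e-4, 1e-6):
    avg = sum(random_bisection_steps(0.3, eps) for _ in range(200)) / 200
    print(f"eps = {eps:g}: ~{avg:.1f} steps (-log eps = {-math.log(eps):.1f})")
```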

