Exact Nearest Neighbour Search within Constrained Neighbourhood Using the Forest of Vp-Tree-Like Structures

2021, Vol. 2096 (1), pp. 012199
Author(s): E Myasnikov

Abstract In this paper, we address the problem of fast nearest neighbour search. Unfortunately, well-known indexing data structures, such as vp-trees, perform poorly on some datasets and do not provide significant acceleration compared to the brute force approach. We consider an alternative solution, which can be applied when we are not interested in some fraction of distant nearest neighbours. The solution is based on building a forest of vp-tree-like structures and guarantees exact nearest neighbour search within the epsilon-neighbourhood of the query point.
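The abstract does not describe the forest construction itself, but the effect of constraining the search to an epsilon-neighbourhood can be illustrated on a single vp-tree: initialising the search radius to epsilon instead of infinity lets the triangle-inequality pruning skip far more subtrees, while the reported neighbour is still exact whenever the true nearest neighbour lies within epsilon of the query. The following is a minimal Python sketch under these assumptions; the class and function names are illustrative and not taken from the paper.

```python
import random

class VPNode:
    def __init__(self, point, radius, inside, outside):
        self.point, self.radius = point, radius
        self.inside, self.outside = inside, outside

def build_vp_tree(points, dist):
    """Recursively build a vantage-point tree over `points`."""
    if not points:
        return None
    vp, rest = points[0], points[1:]
    if not rest:
        return VPNode(vp, 0.0, None, None)
    dvp = [(dist(vp, p), p) for p in rest]
    mu = sorted(d for d, _ in dvp)[len(dvp) // 2]    # median distance = ball radius
    inside  = [p for d, p in dvp if d <  mu]
    outside = [p for d, p in dvp if d >= mu]
    return VPNode(vp, mu, build_vp_tree(inside, dist), build_vp_tree(outside, dist))

def nn_within_eps(root, q, eps, dist):
    """Exact nearest neighbour of q, provided it lies within eps of q; else None."""
    best = [None, eps]                               # [best point, best distance so far]
    def search(node):
        if node is None:
            return
        d = dist(q, node.point)
        if d < best[1]:
            best[0], best[1] = node.point, d
        # Triangle inequality: descend into a child only if its region can
        # still contain a point closer than the current best distance.
        if d < node.radius:
            search(node.inside)
            if d + best[1] >= node.radius:
                search(node.outside)
        else:
            search(node.outside)
            if d - best[1] <= node.radius:
                search(node.inside)
    search(root)
    return best[0]

if __name__ == "__main__":
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    pts = [(random.random(), random.random()) for _ in range(1000)]
    tree = build_vp_tree(pts, dist)
    print(nn_within_eps(tree, (0.5, 0.5), eps=0.1, dist=dist))
```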

2016, Vol. 43 (4), pp. 440-457
Author(s): Youngki Park, Heasoo Hwang, Sang-goo Lee

Finding k-nearest neighbours (k-NN) is one of the most important primitives of many applications, such as search engines and recommendation systems. However, its computational cost is extremely high when searching for k-NN points in a huge collection of high-dimensional points. Locality-sensitive hashing (LSH) has been introduced for efficient k-NN approximation, but none of the existing LSH approaches clearly outperforms the others. We propose a novel LSH approach, Signature Selection LSH (S2LSH), which finds approximate k-NN points very efficiently across a variety of datasets. It first constructs a large pool of highly diversified signature regions of various sizes. Given a query point, it dynamically generates a query-specific signature region by merging highly effective signature regions selected from the signature pool. We also suggest S2LSH-M, a variant of S2LSH that processes multiple queries more efficiently by using query-specific features and optimization techniques. Extensive experiments show the performance superiority of our approaches in diverse settings.
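The signature-pool construction of S2LSH is not detailed in the abstract. For background, the sketch below shows the generic hashing-and-reranking pattern that LSH-based k-NN approximation relies on, using standard random-hyperplane hashing: points falling into the same bucket as the query are retrieved as candidates, and only the candidates are ranked by exact distance. All names and parameters here are illustrative, not part of S2LSH.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)

def build_lsh_tables(data, n_tables=8, n_bits=12):
    """Hash each point into one bucket per table using random hyperplanes."""
    dim = data.shape[1]
    planes = [rng.normal(size=(n_bits, dim)) for _ in range(n_tables)]
    tables = [defaultdict(list) for _ in range(n_tables)]
    for t, P in enumerate(planes):
        signs = (data @ P.T) > 0                     # (n_points, n_bits) boolean signatures
        for i, sig in enumerate(signs):
            tables[t][sig.tobytes()].append(i)
    return planes, tables

def approx_knn(q, data, planes, tables, k=10):
    """Collect candidates from all tables, then rank them by exact distance."""
    candidates = set()
    for P, table in zip(planes, tables):
        sig = ((P @ q) > 0).tobytes()
        candidates.update(table.get(sig, []))
    if not candidates:
        return []
    cand = np.fromiter(candidates, dtype=int)
    d = np.linalg.norm(data[cand] - q, axis=1)
    return cand[np.argsort(d)[:k]].tolist()

if __name__ == "__main__":
    data = rng.normal(size=(10_000, 64))
    planes, tables = build_lsh_tables(data)
    print(approx_knn(data[0], data, planes, tables, k=5))
```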


2013, Vol. 23 (04n05), pp. 335-355
Author(s): Haim Kaplan, Micha Sharir

Let P be a set of n points in the plane. We present an efficient algorithm for preprocessing P so that, for a given query point q, we can quickly report the largest disk that contains q and whose interior is disjoint from P. The storage required by the data structure is O(n log n), the preprocessing cost is O(n log² n), and a query takes O(log² n) time. We also present an alternative solution with an improved query cost and slightly worse storage and preprocessing requirements.
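The paper's data structure is not described in the abstract, but the problem itself is easy to state operationally: a disk centred at c whose radius equals the distance from c to the nearest point of P has an interior disjoint from P, and it contains q exactly when |c − q| does not exceed that radius. Scanning candidate centres over a grid and keeping the best feasible one gives a slow, approximate reference answer. The sketch below is only this brute-force baseline, not the O(n log n)-storage structure of the paper.

```python
import numpy as np

def largest_empty_disk_grid(P, q, grid_res=200, pad=1.0):
    """Approximate the largest disk that contains q and whose interior avoids P,
    by brute-force search over candidate centres on a grid."""
    P, q = np.asarray(P, float), np.asarray(q, float)
    lo = np.minimum(P.min(axis=0), q) - pad
    hi = np.maximum(P.max(axis=0), q) + pad
    xs = np.linspace(lo[0], hi[0], grid_res)
    ys = np.linspace(lo[1], hi[1], grid_res)
    best = (0.0, q)                                  # (radius, centre)
    for x in xs:
        for y in ys:
            c = np.array([x, y])
            r = np.min(np.linalg.norm(P - c, axis=1))   # largest empty radius at c
            if np.linalg.norm(c - q) <= r and r > best[0]:
                best = (r, c)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    P = rng.uniform(0, 1, size=(50, 2))
    print(largest_empty_disk_grid(P, np.array([0.5, 0.5])))
```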


Nature, 2018, Vol. 560 (7718), pp. 293-294
Author(s): Davide Castelvecchi

2013, Vol. 11, pp. 25-36
Author(s): Eva Stopková

This paper deals with the development and testing of a module for GRASS GIS [1] based on Nearest Neighbour Analysis. The method can be useful for assessing whether points located in an area of interest are distributed randomly, in clusters, or separately (dispersed). Its main principle is to compare the observed average distance between nearest neighbours, r_A, to the average distance between nearest neighbours, r_E, expected in the case of randomly distributed points; the result should then be tested statistically. The two- and three-dimensional versions of the method differ in how r_E is computed. The paper also extends the mathematical background by deriving the standard deviation of r_E, which is needed for the statistical test of the analysis result. As the spatial arrangement of the phenomena (e.g. the distribution of birds' nests or plant species) and the test results suggest, an anisotropic function would represent relationships between points in three-dimensional space better than the isotropic function used in this work.
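For the two-dimensional case the quantities in the abstract have well-known closed forms (Clark & Evans, 1954): with point density λ = n/A, the expected mean nearest-neighbour distance under complete spatial randomness is r_E = 1/(2√λ), the ratio R = r_A/r_E indicates clustering (R < 1), randomness (R ≈ 1), or dispersion (R > 1), and the z-test uses the standard error 0.26136/√(nλ). The GRASS GIS module itself is not reproduced here; the sketch below is only this standard 2D form of the test.

```python
import numpy as np
from scipy.spatial import cKDTree

def clark_evans_2d(points, area):
    """Two-dimensional Nearest Neighbour Analysis (Clark & Evans, 1954).

    Returns (R, z): R < 1 suggests clustering, R > 1 dispersion,
    R ~ 1 a random pattern; z is the test statistic for r_A = r_E.
    """
    pts = np.asarray(points, float)
    n = len(pts)
    # Observed mean distance to the nearest neighbour, r_A.
    d, _ = cKDTree(pts).query(pts, k=2)              # k=2: the first hit is the point itself
    r_A = d[:, 1].mean()
    # Expected mean distance under complete spatial randomness, r_E = 1 / (2 sqrt(lambda)).
    lam = n / area
    r_E = 1.0 / (2.0 * np.sqrt(lam))
    # Standard error of r_E used in the z-test.
    se = 0.26136 / np.sqrt(n * lam)
    return r_A / r_E, (r_A - r_E) / se

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pts = rng.uniform(0, 10, size=(200, 2))          # points in a 10 x 10 study area
    print(clark_evans_2d(pts, area=100.0))
```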


Author(s): Rajesh Prasad

The word matching problem is to find all exact occurrences of a pattern P[0...m-1] in a text T[0...n-1], where P contains no white space and is neither preceded nor followed by a space. In the parameterized word matching problem, a given word P[0...m-1] is said to match a sub-word t of the text T[0...n-1] if there exists a one-to-one correspondence between the symbols of P and the symbols of t. The Exact Word Matching (EWM) problem has previously been solved by partitioning the text into a number of tables in the pre-processing phase and then applying either a brute force approach or fast hashing during the searching phase. This paper presents an extension of the EWM approach to parameterized word matching. It first splits the text into a number of tables in the pre-processing phase and then applies prev-encoding and a bit-parallel technique, Parameterized Shift-Or (PSO), during the searching phase. Experimental results show that this technique performs better than PSO alone.
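The one-to-one correspondence criterion is usually checked via Baker's prev-encoding: each symbol is replaced by the distance to its previous occurrence (0 for a first occurrence), and two words p-match exactly when their encodings are equal. The sketch below shows the encoding and a naive word-by-word check; the table partitioning and the PSO bit-parallel automaton from the paper are not reproduced.

```python
def prev_encode(s):
    """Baker's prev-encoding: distance to the previous occurrence of each
    symbol, or 0 if the symbol has not appeared before."""
    last, out = {}, []
    for i, ch in enumerate(s):
        out.append(i - last[ch] if ch in last else 0)
        last[ch] = i
    return out

def parameterized_word_match(pattern, text):
    """Naive check of every whitespace-delimited word of the text against
    the pattern under parameterized (one-to-one renaming) matching.
    Returns the indices of the matching words."""
    enc_p = prev_encode(pattern)
    return [i for i, word in enumerate(text.split())
            if len(word) == len(pattern) and prev_encode(word) == enc_p]

if __name__ == "__main__":
    # "xyzx" p-matches "abca": both encode to [0, 0, 0, 3].
    print(prev_encode("xyzx"), prev_encode("abca"))
    print(parameterized_word_match("abca", "xyzx qrst xyyx"))   # -> [0]
```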


2020, Vol. 70 (6), pp. 612-618
Author(s): Maiya Din, Saibal K. Pal, S. K. Muttoo, Sushila Madan

The Playfair cipher is a symmetric key cryptosystem based on the encryption of digrams of letters. The cipher shows higher cryptanalytic complexity than mono-alphabetic ciphers due to the use of 625 different letter digrams in encryption instead of 26 letters of the Roman alphabet. Population-based techniques such as genetic algorithms (GA) and swarm intelligence (SI) are more suitable than the brute force approach for cryptanalysis of the cipher because of the specific and unique structure of its key table. This work is an attempt to automate the process of cryptanalysis using hybrid computational intelligence. A hybrid technique combining multiple particle swarm optimization (MPSO) and GA (MPSO-GA) is proposed and applied to solving Playfair ciphers. The authors attempt to find the key used to generate Playfair crypts by using the proposed hybrid technique to reduce the exhaustive search space. According to the computed results, the MPSO-GA technique obtained the correct solution for Playfair ciphers of 100 to 200 letters in length, and it provided better results than either a GA-based or a PSO-based technique alone. Furthermore, the technique was also able to recover partial English text from short Playfair ciphers of 80 to 120 characters in length.
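The MPSO-GA search itself is not given in the abstract, but any such attack needs two building blocks: decryption of digram pairs under a candidate 5×5 key table, and a fitness score for the resulting plaintext, typically based on English n-gram statistics. A minimal sketch of both follows; the digram list used for scoring is a stand-in, not the authors' fitness function.

```python
def playfair_decrypt(ciphertext, key_table):
    """Decrypt a Playfair cryptogram under a candidate 25-letter key table
    (row-major 5x5, J merged into I)."""
    pos = {ch: (i // 5, i % 5) for i, ch in enumerate(key_table)}
    out = []
    for a, b in zip(ciphertext[0::2], ciphertext[1::2]):
        (ra, ca), (rb, cb) = pos[a], pos[b]
        if ra == rb:                                  # same row: shift left
            out += [key_table[ra * 5 + (ca - 1) % 5],
                    key_table[rb * 5 + (cb - 1) % 5]]
        elif ca == cb:                                # same column: shift up
            out += [key_table[((ra - 1) % 5) * 5 + ca],
                    key_table[((rb - 1) % 5) * 5 + cb]]
        else:                                         # rectangle: swap columns
            out += [key_table[ra * 5 + cb], key_table[rb * 5 + ca]]
    return "".join(out)

def fitness(plaintext, common=("TH", "HE", "IN", "ER", "AN", "RE", "ON", "AT")):
    """Toy fitness: count of common English digrams in the candidate plaintext.
    A real attack would score di-/tri-gram log-probabilities instead."""
    return sum(plaintext.count(d) for d in common)

if __name__ == "__main__":
    key = "PLAYFIREXMBCDGHKNOQSTUVWZ"   # key table built from the keyword "PLAYFAIR EXAMPLE"
    plain = playfair_decrypt("BMODZBXDNABEKUDMUIXMMOUVIF", key)
    print(plain, fitness(plain))        # -> HIDETHEGOLDINTHETREXESTUMP ...
```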


Perception, 10.1068/p3416, 2003, Vol. 32 (7), pp. 871-886
Author(s): Douglas Vickers, Pierre Bovet, Michael D Lee, Peter Hughes

The planar Euclidean version of the travelling salesperson problem (TSP) requires finding a tour of minimal length through a two-dimensional set of nodes. Despite the computational intractability of the TSP, people can produce rapid, near-optimal solutions to visually presented versions of such problems. To explain this, MacGregor et al (1999, Perception 28 1417–1428) have suggested that people use a global-to-local process, based on a perceptual tendency to organise stimuli into convex figures. We review the evidence for this idea and propose an alternative, local-to-global hypothesis, based on the detection of least distances between the nodes in an array. We present the results of an experiment in which we examined the relationships between three objective measures and performance measures of optimality and response uncertainty in tasks requiring participants to construct a closed tour or an open path. The data are not well accounted for by a process based on the convex hull. In contrast, results are generally consistent with a locally focused process based initially on the detection of nearest-neighbour clusters. Individual differences are interpreted in terms of a hierarchical process of constructing solutions, and the findings are related to a more general analysis of the role of nearest neighbours in the perception of structure and motion.
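The local-to-global idea can be made concrete with the textbook nearest-neighbour construction: start at a node, repeatedly move to the closest unvisited node, and close the tour at the end. The sketch below implements only this greedy heuristic, as an illustration of a locally focused strategy, not as the authors' model of human performance.

```python
import math, random

def nearest_neighbour_tour(nodes, start=0):
    """Greedy tour: always move to the closest unvisited node, then close the loop.
    Returns the visiting order and the total tour length."""
    dist = lambda a, b: math.dist(nodes[a], nodes[b])
    unvisited = set(range(len(nodes))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist(tour[-1], j))
        tour.append(nxt)
        unvisited.remove(nxt)
    length = sum(dist(a, b) for a, b in zip(tour, tour[1:] + [start]))
    return tour, length

if __name__ == "__main__":
    random.seed(3)
    pts = [(random.random(), random.random()) for _ in range(20)]
    tour, length = nearest_neighbour_tour(pts)
    print(tour, round(length, 3))
```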


2021
Author(s): Charles R. Krouse, Grant O. Musgrove, Taewoan Kim, Seungmin Lee, Muhyoung Lee, ...

Abstract The Chaboche model is a well-validated non-linear kinematic hardening material model. This material model, like many models, depends on a set of material constants that must be calibrated for it to match the experimental data. Due to the challenge of calibrating these constants, the Chaboche model is often disregarded. The challenge with calibrating the Chaboche constants is that the most reliable method for doing the calibration is a brute force approach, which tests thousands of combinations of constants. Different sampling techniques and optimization schemes can be used to select different combinations of these constants, but ultimately, they all rely on iteratively selecting values and running simulations for each selected set. In the experience of the authors, such brute force methods require roughly 2,500 combinations to be evaluated in order to have confidence that a reasonable solution is found. This process is not efficient. It is time-intensive and labor-intensive. It requires long simulation times, and it requires significant effort to develop the accompanying scripts and algorithms that are used to iterate through combinations of constants and to calculate agreement. A better, more automated method exists for calibrating the Chaboche material constants. In this paper, the authors describe a more efficient, automated method for calibrating Chaboche constants. The method is validated by using it to calibrate Chaboche constants for an IN792 single-crystal material and a CM247 directionally-solidified material. The calibration results using the automated approach were compared to calibration results obtained using a brute force approach. It was determined that the automated method achieves agreeable results that are equivalent to, or supersede, results obtained using the conventional brute force method. After validating the method for cases that only consider a single material orientation, the automated method was extended to multiple off-axis calibrations. The Chaboche model that is available in commercial software, such as ANSYS, will only accept a single set of Chaboche constants for a given temperature. There is no published method for calibrating Chaboche constants that considers multiple material orientations. Therefore, the approach outlined in this paper was extended to include multiple material orientations in a single calibration scheme. The authors concluded that the automated approach can be used to successfully, accurately, and efficiently calibrate multiple material directions. The approach is especially well-suited when off-axis calibration must be considered concomitantly with longitudinal calibration. Overall, the automated Chaboche calibration method yielded results that agreed well with experimental data. Thus, the method can be used with confidence to efficiently and accurately calibrate the Chaboche non-linear kinematic hardening material model.
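As a small illustration of what automated calibration can replace, consider a single Armstrong-Frederick/Chaboche backstress, whose contribution under monotonic loading has the closed form σ = σ_y + (C/γ)(1 − exp(−γ ε_p)). Fitting C and γ to a stress versus plastic-strain curve with a least-squares optimiser takes one call instead of thousands of brute-force combinations. This is a toy sketch with synthetic data and a single kinematic term, not the multi-orientation procedure described in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def chaboche_monotonic(eps_p, sigma_y, C, gamma):
    """Stress under monotonic loading for one Armstrong-Frederick backstress:
    sigma = sigma_y + (C / gamma) * (1 - exp(-gamma * eps_p))."""
    return sigma_y + (C / gamma) * (1.0 - np.exp(-gamma * eps_p))

if __name__ == "__main__":
    # Synthetic "experimental" curve with noise (stress in MPa, plastic strain).
    rng = np.random.default_rng(4)
    eps_p = np.linspace(0.0, 0.05, 60)
    sigma_exp = chaboche_monotonic(eps_p, 350.0, 60_000.0, 400.0)
    sigma_exp += rng.normal(scale=2.0, size=eps_p.size)

    # Automated calibration: nonlinear least squares instead of a brute-force grid.
    p0 = (300.0, 30_000.0, 200.0)                    # rough initial guess
    popt, _ = curve_fit(chaboche_monotonic, eps_p, sigma_exp, p0=p0)
    print("sigma_y, C, gamma =", popt)
```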


Author(s): Eliot Rudnick-Cohen

Abstract Multi-objective decision making problems can sometimes involve an infinite number of objectives. In this paper, an approach is presented for solving multi-objective optimization problems containing an infinite number of parameterized objectives, termed "infinite objective optimization". A formulation is given for infinite objective optimization problems, and an approach for checking whether a Pareto frontier is a solution to this formulation is detailed. Using this approach, a new sampling-based approach is developed for solving infinite objective optimization problems. The new approach is tested on several different example problems and is shown to be faster and to perform better than a brute force approach.
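The paper's formulation is not given in the abstract; the sketch below only illustrates the sampling idea in its simplest form: draw a finite sample of the objective parameter t, evaluate the parameterized objective f(x, t) at every sample, and keep the designs that are non-dominated with respect to the sampled objective vectors. The function names and the example objective are illustrative assumptions.

```python
import numpy as np

def sampled_pareto(designs, objective, t_samples):
    """Approximate the Pareto set of an infinite family of objectives
    f(x, t), t in [0, 1], by sampling t and filtering dominated designs."""
    # Objective matrix: one row per design, one column per sampled objective.
    F = np.array([[objective(x, t) for t in t_samples] for x in designs])
    keep = []
    for i in range(len(designs)):
        # Design i is dominated if some design is no worse everywhere and strictly better somewhere.
        dominated = np.any(np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1))
        if not dominated:
            keep.append(designs[i])
    return keep

if __name__ == "__main__":
    # Toy parameterized objective: f(x, t) = (x - t)^2, minimised over scalar designs x.
    rng = np.random.default_rng(5)
    designs = rng.uniform(-0.5, 1.5, size=50)
    t_samples = np.linspace(0, 1, 11)
    frontier = sampled_pareto(designs, lambda x, t: (x - t) ** 2, t_samples)
    print(sorted(round(x, 3) for x in frontier))
```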

