Covering Algorithms: Recently Published Documents

Total documents: 28 (last five years: 3)
H-index: 7 (last five years: 1)

2021, Vol 6 (1). Author(s): Péter Tamás Kovács, Marcell Nagy, Roland Molontay

Abstract: Research on fractal networks is a dynamically growing field of network science. A central issue is to analyze the fractality with the so-called box-covering method. As this problem is known to be NP-hard, a plethora of approximating algorithms has been proposed over the years. This study aims to establish a unified framework for comparing approximating box-covering algorithms by collecting, implementing, and evaluating these methods in various aspects, including running time and approximation ability. This work may also serve as a reference for both researchers and practitioners, allowing fast selection from a rich collection of box-covering algorithms with a publicly available codebase.
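For context, box covering asks for the minimum number of boxes of a given size l_B needed to cover a network; for fractal networks the box count scales as N_B(l_B) ~ l_B^(-d_B). Below is a minimal Python sketch of one well-known approximating algorithm in this family, the greedy-coloring heuristic of Song, Gallos, Havlin, and Makse (2007), which reduces box covering to greedy vertex coloring of an auxiliary graph. The function and variable names are illustrative and are not taken from the paper's published codebase.

```python
# A minimal sketch of the classic greedy-coloring box-covering heuristic
# (Song, Gallos, Havlin & Makse, 2007), one member of the algorithm family
# such surveys compare. Names are illustrative, not from the paper's codebase.
import networkx as nx

def greedy_box_covering(G, l_B):
    """Approximate the minimum number of boxes of diameter < l_B needed to
    cover G, via greedy vertex coloring of an auxiliary 'dual' graph."""
    # All-pairs shortest-path distances.
    dist = dict(nx.all_pairs_shortest_path_length(G))

    # Dual graph: connect two nodes iff their distance is >= l_B, so any
    # color class consists of nodes that are pairwise closer than l_B
    # and therefore fit into a single box.
    dual = nx.Graph()
    dual.add_nodes_from(G)
    nodes = list(G)
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            # Disconnected pairs count as infinitely far apart.
            if dist[u].get(v, float("inf")) >= l_B:
                dual.add_edge(u, v)

    # Greedy sequential coloring; each color class becomes one box.
    coloring = nx.greedy_color(dual, strategy="largest_first")
    n_boxes = max(coloring.values()) + 1
    return [[n for n, c in coloring.items() if c == k] for k in range(n_boxes)]

if __name__ == "__main__":
    G = nx.barabasi_albert_graph(200, 2, seed=42)
    for l_B in (2, 3, 4):
        print(f"l_B = {l_B}: {len(greedy_box_covering(G, l_B))} boxes")
```

The quality of the resulting cover depends on the coloring order, and the quadratic all-pairs loop dominates the running time on large graphs; running-time versus approximation-ability trade-offs of exactly this kind are what a unified comparison framework measures.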


2020, Vol 12 (14), pp. 2285. Author(s): Joshua E. Hammond, Cory A. Vernon, Trent J. Okeson, Benjamin J. Barrett, Samuel Arce, ...

Remote sensing with unmanned aerial vehicles (UAVs) facilitates photogrammetry for environmental and infrastructural monitoring, and reducing the number of photos required lowers the computational cost of model creation. Optimal camera locations for reducing the number of photos needed for structure-from-motion (SfM) are determined through eight mathematical set-covering algorithms, constrained by solve time. The algorithms examined are: traditional greedy, reverse greedy, carousel greedy (CG), linear programming, particle swarm optimization, simulated annealing, genetic algorithms, and ant colony optimization. Coverage and solve time are investigated for each algorithm. CG is the best method for choosing optimal camera locations, as it balances the number of photos required against the time needed to calculate camera positions, as shown through a Pareto-front-style analysis. CG obtains a statistically significant 3.2 fewer cameras per modeled area than the base greedy algorithm while requiring only one additional order of magnitude of solve time. For comparison, linear programming achieves fewer cameras than base greedy but takes at least three orders of magnitude longer to solve. A grid independence study serves as a sensitivity analysis of the CG algorithm's α (iteration number) and β (percentage to be recalculated) parameters, which adjust the traditional greedy heuristic, and a case study at the Rock Canyon collection dike in Provo, UT, USA, compares the results of all eight algorithms and the uniqueness of each selected set (in terms of percentage comparisons based on location/angle metadata and qualitative visual comparison). Though this specific study uses SfM, the principles could apply to other instruments such as multi-spectral cameras or aerial LiDAR.
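Camera placement here is an instance of minimum set cover: each candidate camera position covers a set of surface points, and the goal is to cover every point with as few cameras as possible. Below is a minimal Python sketch of plain greedy set cover alongside a carousel greedy variant; the α (iteration multiplier) and β (fraction of the solution removed) semantics follow the carousel greedy literature (Cerrone, Cerulli & Golden, 2017), while the toy data and function names are illustrative, not the authors' implementation.

```python
# Sketch of greedy set cover and a carousel greedy (CG) variant for an
# abstract camera-placement instance. Data and names are hypothetical.
from collections import deque

def greedy_cover(universe, coverage, chosen=()):
    """Classic greedy set cover: repeatedly add the candidate that covers
    the most still-uncovered points until everything is covered."""
    chosen = list(chosen)
    covered = set().union(*(coverage[c] for c in chosen)) if chosen else set()
    while covered < universe:
        best = max((c for c in coverage if c not in chosen),
                   key=lambda c: len(coverage[c] - covered), default=None)
        if best is None or not coverage[best] - covered:
            break  # no remaining candidate improves coverage
        chosen.append(best)
        covered |= coverage[best]
    return chosen

def carousel_greedy(universe, coverage, alpha=5, beta=0.2):
    """Carousel greedy: build a greedy solution, drop the newest beta
    fraction, then for alpha * len(partial) rounds remove the oldest pick
    and greedily insert a replacement; finally complete with plain greedy."""
    start = greedy_cover(universe, coverage)
    partial = deque(start[: max(1, int(len(start) * (1 - beta)))])
    for _ in range(alpha * len(partial)):
        removed = partial.popleft()
        covered = set().union(*(coverage[c] for c in partial)) if partial else set()
        # Re-pick greedily, forbidding the element just removed.
        partial.append(max((c for c in coverage if c not in partial and c != removed),
                           key=lambda c: len(coverage[c] - covered),
                           default=removed))
    return greedy_cover(universe, coverage, chosen=partial)

if __name__ == "__main__":
    # Toy instance: candidate camera positions and the surface points each sees.
    coverage = {
        "cam_a": {1, 2, 3}, "cam_b": {3, 4, 5}, "cam_c": {5, 6},
        "cam_d": {1, 4, 6}, "cam_e": {2, 5, 7}, "cam_f": {7},
    }
    universe = set().union(*coverage.values())
    print("greedy:  ", greedy_cover(universe, coverage))                        # 4 cameras
    print("carousel:", carousel_greedy(universe, coverage, alpha=5, beta=0.3))  # 3 cameras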


Author(s): Ran Ben Basat, Guy Even, Ken-ichi Kawarabayashi, Gregory Schwartzman

2013, Vol 29 (1), pp. 237-272. Author(s): Bart Minnaert, David Martens, Manu De Backer, Bart Baesens

2013, Vol 146 (1-2), pp. 583-615. Author(s): Anupam Gupta, Viswanath Nagarajan, R. Ravi
