pure random search
Recently Published Documents


TOTAL DOCUMENTS

15
(FIVE YEARS 3)

H-INDEX

3
(FIVE YEARS 1)

Mathematics ◽  
2021 ◽  
Vol 9 (23) ◽  
pp. 3043
Author(s):  
Manuel L. Esquível ◽  
Nadezhda P. Krasii ◽  
Pedro P. Mota ◽  
Nélio Machado

We propose a stochastic algorithm for global optimisation of a regular function, possibly unbounded, defined on a bounded set with regular boundary; a function that attains its extremum on the boundary of its domain of definition. The algorithm is driven by a diffusion process that is associated with the function by means of a strictly elliptic operator ensuring an adequate maximum principle. To prevent the algorithm from being trapped in a local extremum, we add a pure random search step to it. We show that an adequate parallelisation procedure can increase the rate of convergence, thus mitigating the main drawback of adding the pure random search step.
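The pure random search step referred to in the abstract can be illustrated on its own. The following is a minimal sketch, not the authors' implementation, assuming a box-shaped domain and a minimisation objective: candidate points are drawn uniformly over the box and the best point seen so far is retained.

```python
import random

def pure_random_search(f, bounds, iters=10000, seed=0):
    """Minimal pure random search: sample points uniformly over the box
    given by `bounds` (a list of (lo, hi) pairs) and keep the best one."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(iters):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:            # retain the best candidate found so far
            best_x, best_f = x, fx
    return best_x, best_f

# Example: minimise the 2-D sphere function on [-1, 1]^2
best_x, best_f = pure_random_search(lambda v: sum(t * t for t in v),
                                    [(-1, 1), (-1, 1)])
```

Because every sample is independent of the previous ones, the method cannot be trapped in a local extremum, which is exactly the property the abstract exploits; the price is a slow convergence rate, which the proposed parallelisation is meant to mitigate.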


2021 ◽  
Vol 11 (11) ◽  
pp. 5053
Author(s):  
Vagelis Plevris ◽  
Nikolaos P. Bakas ◽  
German Solorzano

A new, fast, elegant, and simple stochastic optimization search method is proposed, which exhibits surprisingly good performance and robustness considering its simplicity. We name the algorithm pure random orthogonal search (PROS). The method makes no assumptions, has no parameters to adjust, and uses basic calculations to evolve a single candidate solution. The idea is that a single decision variable is randomly changed at every iteration and the candidate solution is updated only when an improvement is observed; the solution thus moves orthogonally towards the optimum. Due to its simplicity, PROS can be implemented with basic programming skills, and any non-expert in optimization can use it to solve problems and start exploring the fascinating optimization world. In the present work, PROS is explained in detail and is used to optimize 12 multi-dimensional test functions with various levels of complexity. Its performance is compared with the pure random search strategy and three other well-established algorithms: genetic algorithms (GA), particle swarm optimization (PSO), and differential evolution (DE). The results indicate that, despite its simplicity, the proposed PROS method exhibits very good performance, with fast convergence rates and quick execution times. The method can serve as a simple alternative to established and more complex optimizers. Additionally, it could be used as a benchmark for other metaheuristic optimization algorithms, as one of the simplest, yet powerful, optimizers. The full MATLAB source code of the algorithm is provided for anybody interested in using, testing, or exploring it.
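The abstract describes PROS precisely enough to sketch it: at each iteration one randomly chosen decision variable is re-drawn within its bounds, and the change is kept only if the objective improves. The sketch below is a reading of that description in Python rather than the authors' MATLAB code; the function name and parameters are illustrative.

```python
import random

def pros(f, bounds, iters=10000, seed=0):
    """Pure random orthogonal search (sketch): perturb one randomly
    chosen coordinate per iteration; accept only improving moves."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for lo, hi in bounds]
    fx = f(x)
    for _ in range(iters):
        j = rng.randrange(len(bounds))      # pick one decision variable
        cand = x[:]
        cand[j] = rng.uniform(*bounds[j])   # re-draw it within its bounds
        fc = f(cand)
        if fc < fx:                         # minimisation: keep improvements only
            x, fx = cand, fc
    return x, fx

# Example: minimise the 2-D sphere function on [-5, 5]^2
best_x, best_f = pros(lambda v: sum(t * t for t in v), [(-5, 5), (-5, 5)])
```

Since each accepted move changes exactly one coordinate, the candidate solution indeed moves parallel to one axis at a time, which is the "orthogonal" motion the abstract refers to.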


Author(s):  
Zachary Bethune ◽  
Michael Choi ◽  
Randall Wright

Abstract We analyse dynamic general equilibrium models with more-or-less directed search by informed buyers and random search by uninformed buyers. This nests existing specifications and generates new insights. A quantitative application concerns the welfare cost of inflation, which is known to be quite high with pure random search and low with pure directed search. Our calibration implies that the impact of inflation is fairly low, in part because, in addition to the usual costs, it provides benefits by more heavily taxing high-price sellers that inefficiently profit from exploiting the uninformed. Other applications analyse changes in credit conditions and information, both analytically and numerically.


2012 ◽  
Vol 4 (6) ◽  
pp. 359 ◽  
Author(s):  
Michael P. Poland ◽  
Christopher D. Nugent ◽  
Hui Wang ◽  
Liming Chen

2011 ◽  
Vol 19 (3) ◽  
pp. 137-160 ◽  
Author(s):  
Michael P. Poland ◽  
Chris D. Nugent ◽  
Hui Wang ◽  
Liming Chen

Optimization ◽  
2010 ◽  
Vol 59 (2) ◽  
pp. 289-303 ◽  
Author(s):  
Dragan Radulović
