lsmear: a variable selection strategy for interval branch and bound solvers

2017 · Vol. 71 (3) · pp. 483-500
Author(s): Ignacio Araya, Bertrand Neveu


2021 · Vol. 2021 · pp. 1-10
Author(s): Ziting Zhao, Tong Liu, Xudong Zhao

Machine learning plays an important role in computational intelligence and has been widely used in many engineering fields. Surface voids, or bugholes, which frequently appear on concrete surfaces after the casting process, make manual inspection time consuming, costly, labor intensive, and inconsistent. Automatic classification of concrete bugholes is therefore needed to improve inspection of the concrete surface. In this paper, a variable selection strategy is proposed to pursue feature interpretability, together with an automatic ensemble classification scheme designed to improve the accuracy of bughole classification. A texture feature derived from Gabor filters and gray-level run lengths is extracted from concrete surface images. Interpretable variables, which are components of this feature, are selected according to the proposed cumulative voting strategy. An ensemble classifier, with its base classifiers assigned automatically, detects whether a surface void exists in an image. Experimental results on 1000 image samples demonstrate the effectiveness of the method, with comparable prediction accuracy and an explicable model.
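
As a rough illustration of the kind of pipeline described above, the sketch below (Python, using scikit-image and scikit-learn) extracts Gabor-filter texture statistics and feeds them through feature selection into a soft-voting ensemble. It is a minimal sketch under assumptions: the paper's gray-level run-length features, cumulative voting selection, and automatic base-classifier assignment are not reproduced; a generic univariate selector and a fixed set of base learners stand in for them, and all function names below are hypothetical.

```python
# Hypothetical sketch only: Gabor texture statistics feeding a feature
# selector and a soft-voting ensemble for binary bughole classification.
import numpy as np
from skimage.filters import gabor
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def gabor_features(image, frequencies=(0.1, 0.2, 0.4),
                   thetas=(0.0, np.pi / 4, np.pi / 2)):
    """Mean and variance of Gabor response magnitudes over a small filter bank."""
    feats = []
    for frequency in frequencies:
        for theta in thetas:
            real, imag = gabor(image, frequency=frequency, theta=theta)
            magnitude = np.hypot(real, imag)
            feats.extend([magnitude.mean(), magnitude.var()])
    return np.asarray(feats)


def build_classifier(k_features=10):
    """Univariate feature selection followed by a soft-voting ensemble."""
    ensemble = VotingClassifier(
        estimators=[
            ("lr", LogisticRegression(max_iter=1000)),
            ("svm", SVC(probability=True)),
            ("rf", RandomForestClassifier(n_estimators=100)),
        ],
        voting="soft",
    )
    return make_pipeline(StandardScaler(),
                         SelectKBest(f_classif, k=k_features),
                         ensemble)


# Usage (X: one gabor_features row per image, y: 0 = clean surface, 1 = bughole):
# clf = build_classifier().fit(X_train, y_train)
# print(clf.score(X_test, y_test))
```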


2014 · Vol. 2014 · pp. 1-15
Author(s): Dhiranuch Bunnag

This paper presents global optimization algorithms that combine the idea of interval branch and bound with stochastic search algorithms. Two algorithms for unconstrained problems are proposed: a hybrid interval simulated annealing and a combined interval branch and bound and genetic algorithm. Numerical experiments show better results than Hansen's algorithm and simulated annealing in terms of storage, speed, and number of function evaluations. A convergence proof is given. Moreover, the ideas behind both algorithms suggest a structure for an integrated interval branch and bound and genetic algorithm for constrained problems, which is also described and tested. The aim is to capture one of the solutions with higher accuracy and lower cost. The results show better solution quality with fewer function evaluations than the traditional GA.
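
For readers unfamiliar with the interval branch and bound core on which these hybrids build, below is a minimal sketch of the bound-and-bisect loop for one-dimensional minimization, written around a small hand-rolled interval type. It illustrates the generic technique only, not the paper's hybrid simulated-annealing or genetic-algorithm variants; the objective f, its natural interval extension F, and the tolerance are assumptions chosen for the example.

```python
# Minimal sketch of interval branch and bound for 1-D global minimization.
import heapq
from dataclasses import dataclass


@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        products = (self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi)
        return Interval(min(products), max(products))

    @property
    def width(self):
        return self.hi - self.lo

    @property
    def mid(self):
        return 0.5 * (self.lo + self.hi)


def branch_and_bound(f, F, box, tol=1e-6):
    """Minimize f over `box`, where F is an interval extension enclosing f."""
    best_upper = f(box.mid)                  # incumbent from a point evaluation
    heap = [(F(box).lo, box.lo, box.hi)]     # boxes ordered by their lower bound
    while heap:
        lower, lo, hi = heapq.heappop(heap)
        if lower > best_upper:               # box cannot contain a better minimum
            continue
        box = Interval(lo, hi)
        best_upper = min(best_upper, f(box.mid))
        if box.width < tol:                  # small enough: stop splitting
            continue
        for child in (Interval(lo, box.mid), Interval(box.mid, hi)):
            child_lower = F(child).lo
            if child_lower <= best_upper:
                heapq.heappush(heap, (child_lower, child.lo, child.hi))
    return best_upper


# Example: minimize f(x) = x^2 - 2x on [-2, 3]; the true minimum is -1 at x = 1.
f = lambda x: x * x - 2.0 * x
F = lambda X: X * X - Interval(2.0, 2.0) * X      # natural interval extension
print(branch_and_bound(f, F, Interval(-2.0, 3.0)))   # approximately -1.0
```

The stochastic elements described in the abstract (simulated annealing moves or genetic operators) would replace or supplement the midpoint sampling used here to tighten the incumbent upper bound.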


2009 · Vol. 91 (5) · pp. 307-311
Author(s): Klara L. Verbyla, Ben J. Hayes, Philip J. Bowman, Michael E. Goddard

Summary: Genomic selection describes a selection strategy based on genomic breeding values predicted from dense single nucleotide polymorphism (SNP) data. Multiple methods have been proposed, but the critical issue is how to decide whether an SNP should be included in the predictive set used to estimate breeding values. One major disadvantage of the traditional Bayes B approach is its high computational demand, caused by the changing dimensionality of the models. Stochastic search variable selection (SSVS) retains the same assumptions about the distribution of SNP effects as Bayes B while maintaining constant dimensionality. When Bayesian SSVS was used to predict genomic breeding values for real dairy data over a range of traits, it produced accuracies higher than or equivalent to those of other genomic selection methods, with significantly lower computational and time demands than Bayes B.
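
A minimal sketch of the SSVS idea follows: a Gibbs sampler for a linear SNP-effect model with a spike-and-slab normal prior, so every SNP retains a (possibly tiny) effect and the model dimension never changes. This is an illustration of the general technique, not the authors' implementation; the hyperparameters v0, v1, pi_incl and the fixed residual variance sigma2 are assumptions chosen for readability.

```python
# Minimal sketch of SSVS: Gibbs sampling for y = X @ beta + e with a
# spike-and-slab normal prior on each SNP effect (constant dimensionality).
# v0, v1, pi_incl and the fixed sigma2 are illustrative assumptions.
import numpy as np


def ssvs_gibbs(X, y, n_iter=2000, v0=1e-4, v1=1.0, pi_incl=0.05,
               sigma2=1.0, seed=0):
    rng = np.random.default_rng(seed)
    _, p = X.shape
    beta = np.zeros(p)
    gamma = np.zeros(p, dtype=bool)
    resid = y - X @ beta
    col_ss = np.einsum("ij,ij->j", X, X)   # X_j' X_j for every SNP column
    incl = np.zeros(p)                     # running posterior inclusion counts

    for _ in range(n_iter):
        for j in range(p):
            xj = X[:, j]
            r = resid + xj * beta[j]       # residual with SNP j removed
            xr = xj @ r

            def log_marg(v):
                # Log marginal likelihood of r under prior variance v,
                # up to terms shared by both indicator states.
                return (-0.5 * np.log1p(v * col_ss[j] / sigma2)
                        + 0.5 * v * xr ** 2 / (sigma2 * (sigma2 + v * col_ss[j])))

            log_odds = (log_marg(v1) - log_marg(v0)
                        + np.log(pi_incl) - np.log(1.0 - pi_incl))
            gamma[j] = rng.random() < 1.0 / (1.0 + np.exp(-log_odds))

            # Draw beta_j from its conditional normal given the chosen prior variance.
            v = v1 if gamma[j] else v0
            post_var = 1.0 / (col_ss[j] / sigma2 + 1.0 / v)
            post_mean = post_var * xr / sigma2
            beta[j] = rng.normal(post_mean, np.sqrt(post_var))
            resid = r - xj * beta[j]
        incl += gamma

    return beta, incl / n_iter   # last draw of effects, posterior inclusion rates


# Usage: SNPs with high posterior inclusion probability form the predictive set.
# beta_draw, pip = ssvs_gibbs(X_genotypes, y_phenotypes)
```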

