Does Feature Selection Improve Classification? A Large Scale Experiment in OpenML

Author(s):  
Martijn J. Post ◽  
Peter van der Putten ◽  
Jan N. van Rijn
2015 ◽  
Vol 282 (1805) ◽  
pp. 20150120 ◽  
Author(s):  
Robert A. McCleery ◽  
Adia Sovie ◽  
Robert N. Reed ◽  
Mark W. Cunningham ◽  
Margaret E. Hunter ◽  
...  

To address the ongoing debate over the impact of invasive species on native terrestrial wildlife, we conducted a large-scale experiment to test the hypothesis that invasive Burmese pythons (Python molurus bivittatus) were a cause of the precipitous decline of mammals in Everglades National Park (ENP). Evidence linking pythons to mammal declines has been indirect, and there are reasons to question whether pythons, or any predator, could have caused the precipitous declines seen across a range of mammalian functional groups. Experimentally manipulating marsh rabbits, we found that pythons accounted for 77% of rabbit mortalities within 11 months of their translocation to ENP and that python predation appeared to preclude the persistence of rabbit populations in ENP. On control sites outside of the park, no rabbits were killed by pythons, and 71% of attributable marsh rabbit mortalities were classified as mammal predations. Burmese pythons pose a serious threat to the faunal communities and ecological functioning of the Greater Everglades Ecosystem, a threat that will probably spread as python populations expand their range.


2021 ◽  
Vol 26 (1) ◽  
pp. 67-77
Author(s):  
Siva Sankari Subbiah ◽  
Jayakumar Chinnappan

Nowadays, organizations collect huge volumes of data without knowing its usefulness. The rapid development of the Internet enables organizations to capture data in many different formats through the Internet of Things (IoT), social media, and other disparate sources. Dataset dimensionality grows day by day at an extraordinary rate, resulting in large-scale datasets with high dimensionality. The present paper reviews the opportunities and challenges of feature selection for processing high-dimensional data with reduced complexity and improved accuracy. In the modern big data world, feature selection plays a significant role in reducing the dimensionality and overfitting of the learning process. Researchers have proposed many feature selection methods for obtaining more relevant features, especially from big datasets, that help provide accurate learning results without performance degradation. This paper discusses the importance of feature selection, basic feature selection approaches, centralized and distributed big data processing using Hadoop and Spark, and the challenges of feature selection, and it summarizes the related research work done by various researchers. As a result, big data analysis combined with feature selection improves the accuracy of learning.
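The filter-style feature selection the abstract describes can be illustrated in a few lines. The sketch below is not from the paper; it uses scikit-learn on a synthetic high-dimensional dataset and keeps only the features most informative about the label, the dimensionality-reduction step that the review argues improves accuracy and reduces overfitting. The dataset sizes and the choice of mutual information as the scoring function are illustrative assumptions.

```python
# Minimal sketch of filter-based feature selection on a
# synthetic high-dimensional classification dataset.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# 500 samples, 100 features, of which only 10 are informative.
X, y = make_classification(n_samples=500, n_features=100,
                           n_informative=10, random_state=0)

# Keep the 10 features with the highest mutual information with the label.
selector = SelectKBest(mutual_info_classif, k=10)
X_reduced = selector.fit_transform(X, y)

print(X.shape, X_reduced.shape)  # dimensionality drops from 100 to 10
```

In a distributed setting such as Spark, the same filter scoring can be computed per partition and aggregated, which is why filter methods scale more readily than wrapper methods on big datasets.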


2021 ◽  
Author(s):  
Hyeyoung Koh ◽  
Hannah Beth Blum

This study presents a machine learning-based approach for sensitivity analysis to examine how parameters affect a given structural response while accounting for uncertainty. Reliability-based sensitivity analysis involves repeated evaluations of the performance function incorporating uncertainties to estimate the influence of a model parameter, which can lead to prohibitive computational costs. This challenge is exacerbated for large-scale engineering problems, which often carry a large quantity of uncertain parameters. The proposed approach is based on feature selection algorithms that rank feature importance and remove redundant predictors during model development, which improves model generality and training performance by focusing only on the significant features. The approach allows sensitivity analysis of structural systems to be performed by providing feature rankings with reduced computational effort. The proposed approach is demonstrated with two designs of a two-bay, two-story planar steel frame with different failure modes: inelastic instability of a single member and progressive yielding. The feature variables in the data are uncertainties including material yield strength, Young's modulus, frame sway imperfection, and residual stress. The Monte Carlo sampling method is used to generate random realizations of the frames from published distributions of the feature parameters, and the response variable is the frame ultimate strength obtained from finite element analyses. Decision trees are trained to identify important features. Feature rankings are derived by four feature selection techniques: impurity-based importance, permutation importance, SHAP, and Spearman's correlation. Predictive performance of the model including the important features is discussed using the Matthews correlation coefficient, an evaluation metric suited to imbalanced datasets. Finally, the results are compared with those from reliability-based sensitivity analysis on the same example frames to show the validity of the feature selection approach. As the proposed machine learning-based approach produces the same results as the reliability-based sensitivity analysis with improved computational efficiency and accuracy, it could be extended to other structural systems.
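The ranking workflow described above can be sketched with scikit-learn: sample the uncertain parameters, train a decision tree on a binary failure response, and rank the features by impurity-based and permutation importance, scoring with the Matthews correlation coefficient. The parameter names follow the abstract, but the distributions and the toy capacity model below are invented stand-ins, not the finite-element response used in the study.

```python
# Hypothetical illustration of tree-based feature ranking for
# sensitivity analysis (toy response model, not the paper's FE model).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.inspection import permutation_importance
from sklearn.metrics import matthews_corrcoef
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Monte Carlo samples of the uncertain parameters (illustrative values).
fy    = rng.normal(350, 25, n)      # yield strength [MPa]
E     = rng.normal(200e3, 8e3, n)   # Young's modulus [MPa]
sway  = rng.normal(0, 1 / 500, n)   # frame sway imperfection
resid = rng.normal(0.3, 0.05, n)    # residual stress ratio
X = np.column_stack([fy, E, sway, resid])

# Binary response: "failure" when a toy capacity falls below a threshold,
# giving an imbalanced dataset (30% positive class).
capacity = fy * (1 - resid) - 5e4 * np.abs(sway)
y = (capacity < np.quantile(capacity, 0.3)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

# Impurity-based (Gini) ranking.
print("impurity importances:", tree.feature_importances_)
# Permutation ranking on held-out data.
perm = permutation_importance(tree, X_te, y_te, n_repeats=10, random_state=0)
print("permutation importances:", perm.importances_mean)
# Matthews correlation coefficient handles the class imbalance.
print("MCC:", matthews_corrcoef(y_te, tree.predict(X_te)))
```

The appeal of this route is that the tree is trained once on the Monte Carlo sample, after which both rankings are cheap, whereas reliability-based sensitivity analysis re-evaluates the performance function for each parameter perturbation.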


Author(s):  
Taha Yasseri ◽  
Jannie Reher

Abstract: Through a large-scale online field experiment, we provide new empirical evidence for the presence of the anchoring bias in people’s judgement due to irrational reliance on a piece of information that they are initially given. The comparison of the anchoring stimuli and respective responses across different tasks reveals a positive, yet complex relationship between the anchors and the bias in participants’ predictions of the outcomes of events in the future. Participants in the treatment group were equally susceptible to the anchors regardless of their level of engagement, previous performance, or gender. Given the strong and ubiquitous influence of anchors quantified here, we should take great care to closely monitor and regulate the distribution of information online to facilitate less biased decision making.


Engineering ◽  
2012 ◽  
Vol 04 (09) ◽  
pp. 557-567 ◽  
Author(s):  
Joongu Kang ◽  
Changsung Kim ◽  
Sanghwa Jung ◽  
Hongkoo Yeo
