A Lightweight Machine Learning Assisted Power Optimization for Minimum Error in NOMA-CRS over Nakagami-m channels

Author(s):  
Ferdi Kara ◽  
Hakan Kaya ◽  
Halim Yanikomeroglu
2014 ◽  
Vol 10 (S306) ◽  
pp. 288-291
Author(s):  
Lise du Buisson ◽  
Navin Sivanandam ◽  
Bruce A. Bassett ◽  
Mathew Smith

Abstract Using transient imaging data from the 2nd and 3rd years of the SDSS supernova survey, we apply various machine learning techniques to the problem of separating transients (e.g. SNe) from artefacts, one of the first steps in any transient detection pipeline, and one that is often still carried out by human scanners. Using features mostly obtained from PCA, we show that we can match human levels of classification success, and find that a K-nearest neighbours algorithm and SkyNet perform best, while the Naive Bayes, SVM and minimum error classifiers have performances ranging from slightly to significantly worse.
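The pipeline the abstract describes (compact PCA features feeding a K-nearest-neighbours classifier) can be sketched as below. The data, image size, component count and k are illustrative stand-ins, not the survey's actual configuration:

```python
import numpy as np

def pca_features(X, n_components):
    """Project centred data onto its top principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def knn_classify(train_X, train_y, query, k=5):
    """Majority vote among the k nearest training points (Euclidean)."""
    d = np.linalg.norm(train_X - query, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

# Illustrative stand-in data: 200 "cutout images" of 8x8 pixels.
# Class 0 = artefact-like pure noise, class 1 = transient-like blob.
rng = np.random.default_rng(0)
n, side = 200, 8
y = rng.integers(0, 2, size=n)
X = rng.normal(0.0, 1.0, size=(n, side * side))
blob = np.exp(-((np.arange(side) - side / 2) ** 2) / 4.0)
psf = np.outer(blob, blob).ravel()
X[y == 1] += 3.0 * psf  # transients carry a point-source signature

Z = pca_features(X, n_components=10)  # compact features, as in the abstract
train, test = slice(0, 150), slice(150, n)
preds = np.array([knn_classify(Z[train], y[train], z, k=5) for z in Z[test]])
accuracy = (preds == y[test]).mean()
print(f"hold-out accuracy: {accuracy:.2f}")
```

On this toy data the blob signal is strong, so the hold-out accuracy is near perfect; the point is only the shape of the pipeline, not the number.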


Author(s):  
Alessandro Maria Selvitella ◽  
Julio J. Valdés

In this paper, we discuss the problem of estimating the minimum error reachable by a regression model given a dataset, prior to learning. More specifically, we extend the Gamma Test estimates of the variance of the noise from the continuous case to the binary case. We give some heuristics for further possible extensions of the theory in the continuous case with the [Formula: see text]-norm and conclude with some applications and simulations. From the point of view of machine learning, the result is relevant because it gives conditions under which there is no need to learn the model in order to predict the best possible performance.
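The continuous-case Gamma Test that the paper extends estimates the variance of the noise r in y = f(x) + r directly from nearest-neighbour statistics, before any model is fitted: for each neighbour rank p, regress the half mean squared output difference (gamma) on the mean squared input distance (delta) and read off the intercept. A minimal sketch under that standard formulation; the function name, data and p_max are illustrative:

```python
import numpy as np

def gamma_test(X, y, p_max=10):
    """Estimate Var(r) in y = f(X) + r from nearest-neighbour statistics.

    delta(p): mean squared distance to the p-th nearest input neighbour.
    gamma(p): half the mean squared difference of the matching outputs.
    The intercept of the gamma-vs-delta regression (delta -> 0) is the
    Gamma statistic, an estimate of the noise variance.
    """
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    np.fill_diagonal(d2, np.inf)                          # exclude self
    order = np.argsort(d2, axis=1)                        # nearest first
    deltas, gammas = [], []
    for p in range(p_max):
        nb = order[:, p]
        deltas.append(d2[np.arange(n), nb].mean())
        gammas.append(0.5 * ((y[nb] - y) ** 2).mean())
    slope, intercept = np.polyfit(deltas, gammas, 1)
    return intercept

# Smooth target plus additive noise of known variance 0.2**2 = 0.04
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(1000, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.2, size=1000)
g = gamma_test(X, y)
print(f"Gamma statistic: {g:.3f}  (true noise variance: 0.040)")
```

This illustrates the paper's point: the estimate of the best reachable error is obtained from the data alone, with no regression model trained.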


2020 ◽  
Vol 43 ◽  
Author(s):  
Myrthe Faber

Abstract Gilead et al. state that abstraction supports mental travel, and that mental travel critically relies on abstraction. I propose an important addition to this theoretical framework, namely that mental travel might also support abstraction. Specifically, I argue that spontaneous mental travel (mind wandering), much like data augmentation in machine learning, provides variability in mental content and context necessary for abstraction.


2020 ◽  
Author(s):  
Mohammed J. Zaki ◽  
Wagner Meira, Jr

2020 ◽  
Author(s):  
Marc Peter Deisenroth ◽  
A. Aldo Faisal ◽  
Cheng Soon Ong

Author(s):  
Lorenza Saitta ◽  
Attilio Giordana ◽  
Antoine Cornuejols
