AI and Algorithmic Bias: Source, Detection, Mitigation and Implications

2020 ◽  
Author(s):  
Runshan Fu ◽  
Yan Huang ◽  
Param Vir Singh

2021 ◽  
Author(s):  
Adam Augustyniak ◽  
David J. Hanley ◽  
Timothy W. Bretl ◽  
Neil J. Hejmanowski ◽  
David L. Carroll

interactions ◽  
2018 ◽  
Vol 25 (6) ◽  
pp. 58-63 ◽  
Author(s):  
Henriette Cramer ◽  
Jean Garcia-Gathright ◽  
Aaron Springer ◽  
Sravana Reddy

2021 ◽  
pp. 146144482110127
Author(s):  
Marcus Carter ◽  
Ben Egliston

Virtual reality (VR) is an emerging technology with the potential to extract significantly more data about learners and the learning process. In this article, we present an analysis of how VR education technology companies frame, use and analyse this data. We found both an expansion and acceleration of what data are being collected about learners and of how these data are being mobilised in potentially discriminatory and problematic ways. Beyond providing evidence for how VR represents an intensification of the datafication of education, we discuss three interrelated critical issues that are specific to VR: the fantasy that VR data is ‘perfect’, the datafication of soft-skills training, and the commercialisation and commodification of VR data. In the context of the issues identified, we caution against the unregulated and uncritical application of learning analytics to the data that are collected from VR training.


Proceedings ◽  
2019 ◽  
Vol 33 (1) ◽  
pp. 21
Author(s):  
Fabrizia Guglielmetti ◽  
Eric Villard ◽  
Ed Fomalont

A stable and unique solution to the ill-posed inverse problem in radio synthesis image analysis is sought by employing Bayesian probability theory combined with a probabilistic two-component mixture model. The solution of the ill-posed inverse problem is given by inferring the values of the model parameters that fully describe the physical system giving rise to the data. The analysed data are calibrated visibilities, Fourier transformed from the (u, v) plane to the image plane. Adaptive splines are explored to model the complex background, which is corrupted by the strongly varying dirty beam in the image plane. The deconvolution of the dirty image from the dirty beam is tackled in probability space. Probability maps for source detection at several resolution values quantify the knowledge acquired about the celestial source distribution from a given state of information. The available information comprises the data constraints, prior knowledge and uncertain information. The novel algorithm aims to provide an alternative imaging approach for the Atacama Large Millimeter/Submillimeter Array (ALMA), complementing the widely used Common Astronomy Software Applications (CASA) package and enhancing its source-detection capabilities.
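
The two-component mixture idea described above lends itself to a compact illustration. The sketch below is a hypothetical example, not the authors' ALMA/CASA implementation: it assigns each pixel of a dirty image a posterior probability of belonging to a "source" component rather than a smooth background. The row-wise spline background, the Gaussian and Cauchy component densities, and the names source_probability_map, sigma and p_source are all assumptions made for illustration.

# A minimal, illustrative sketch of a probabilistic two-component mixture
# model for per-pixel source detection, in the spirit of the approach
# described above. It is NOT the authors' ALMA/CASA algorithm: the spline
# background, the Gaussian/Cauchy component choices, the noise level `sigma`
# and the prior `p_source` are assumptions made for illustration only.

import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.stats import norm, cauchy

def source_probability_map(dirty_image, sigma, p_source=0.01, spline_smoothing=None):
    """Posterior probability that each pixel belongs to the 'source' component."""
    ny, nx = dirty_image.shape
    x = np.arange(nx)
    if spline_smoothing is None:
        # Generous smoothing budget so the background spline does not chase point sources.
        spline_smoothing = nx * (5.0 * sigma) ** 2

    # Smooth background estimate: a 1-D smoothing spline per image row,
    # a crude stand-in for the adaptive-spline background of the abstract.
    background = np.empty_like(dirty_image)
    for j in range(ny):
        background[j] = UnivariateSpline(x, dirty_image[j], s=spline_smoothing)(x)

    residual = dirty_image - background

    # Likelihood of each pixel under the two mixture components:
    # narrow Gaussian for background + noise, broad Cauchy for signal.
    like_bg = norm.pdf(residual, loc=0.0, scale=sigma)
    like_src = cauchy.pdf(residual, loc=0.0, scale=5.0 * sigma)

    # Bayes' rule: posterior probability of the source component, per pixel.
    num = p_source * like_src
    return num / (num + (1.0 - p_source) * like_bg)

# Synthetic check: flat noisy background with one injected point source.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, size=(64, 64))
img[32, 32] += 25.0
prob = source_probability_map(img, sigma=1.0)
print(prob[32, 32], prob[0, 0])  # close to 1 at the injected source, small elsewhere

The output mirrors the probability maps described in the abstract: each pixel carries a probability of hosting signal given the data, the priors and the assumed component densities, rather than a hard detection threshold applied to a cleaned image.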


2017 ◽  
Vol 29 (18) ◽  
pp. e4203
Author(s):  
Gunther H. Weber ◽  
Mark S. Bandstra ◽  
Daniel H. Chivers ◽  
Hamdy H. Elgammal ◽  
Valerie Hendrix ◽  
...  
