algorithmic accountability
Recently Published Documents


TOTAL DOCUMENTS: 44 (five years: 31)
H-INDEX: 7 (five years: 4)

2021, pp. 146144482110329. Author(s): Philipp Müller, Ruben L Bach

This study explores voters’ populist alternative news use during (different types of) democratic elections and investigates starting points for preventing potentially harmful effects. We draw on two combined data sets of web-tracking and survey data collected during the 2017 German Bundestag campaign (1,523 participants) and the 2019 European Parliamentary election campaign in Germany (1,009 participants). Results indicate that while populist alternative news outlets drew more interest during the first-order election campaign, they reached only 16.5% of users even then. Moreover, most users visited their websites only rarely. Nonetheless, our data suggest that alternative news exposure is strongly linked to voting for (right-wing) populist parties. Regarding the origins of exposure, our analyses underscore the role of platforms in referring users to populist alternative news: in both data sets, about 40% of website visits originated from Facebook alone, and another third came from search engines. This raises questions about algorithmic accountability.
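As a rough illustration of the kind of referral-origin breakdown reported above (a sketch only: the record format, domain list, and helper names are hypothetical and not the authors’ actual pipeline), the snippet below classifies tracked visits by referrer domain and computes the share coming from Facebook, search engines, and other sources.

```python
# Illustrative sketch of a referral-origin breakdown for tracked website visits.
# Record format, domain list, and function names are assumptions for this example.
from urllib.parse import urlparse
from collections import Counter

SEARCH_ENGINES = {"google.com", "google.de", "bing.com", "duckduckgo.com"}

def referral_source(referrer_url: str) -> str:
    """Map a referrer URL to a coarse source category."""
    domain = urlparse(referrer_url).netloc.removeprefix("www.")
    if domain == "facebook.com":
        return "facebook"
    if domain in SEARCH_ENGINES:
        return "search engine"
    return "other" if domain else "direct"

visits = [  # hypothetical web-tracking records of alternative-news visits
    {"url": "https://example-altnews.de/a", "referrer": "https://www.facebook.com/"},
    {"url": "https://example-altnews.de/b", "referrer": "https://www.google.de/search"},
    {"url": "https://example-altnews.de/c", "referrer": ""},
]
shares = Counter(referral_source(v["referrer"]) for v in visits)
total = sum(shares.values())
for source, n in shares.most_common():
    print(f"{source}: {n / total:.0%}")
```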


AI & Society ◽  
2021 ◽  
Author(s):  
Antonin Descampe ◽  
Clément Massart ◽  
Simon Poelman ◽  
François-Xavier Standaert ◽  
Olivier Standaert

2021, Vol. 3. Author(s): Nikolaus Poechhacker, Severin Kacianka

The increasing use of automated decision making (ADM) and machine learning has sparked an ongoing discussion about algorithmic accountability. Within computer science, a new way of producing accountability has recently been discussed: causality as an expression of algorithmic accountability, formalized using structural causal models (SCMs). However, causality itself is a concept that needs further exploration. In this contribution we therefore confront ideas from SCMs with insights from social theory, specifically pragmatism, and argue that formal expressions of causality must always be seen in the context of the social system in which they are applied. This results in the formulation of further research questions and directions.
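For readers unfamiliar with the formalism the abstract refers to, the following minimal sketch illustrates what an SCM consists of: exogenous noise terms, structural equations, and interventions via the do-operator. The three-variable chain and all names are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a structural causal model (SCM), assuming a simple
# three-variable chain: data -> decision -> outcome. Variable names and
# structural equations are hypothetical, chosen only to show the formalism.
import random

def sample(intervene_decision=None):
    """Draw one sample from the SCM; optionally apply do(decision = value)."""
    u_data, u_decision, u_outcome = (random.gauss(0, 1) for _ in range(3))

    # Structural equations: each variable is a function of its parents plus noise.
    data = u_data
    decision = (1 if data + u_decision > 0 else 0) if intervene_decision is None \
        else intervene_decision          # do-operator: cuts the edge data -> decision
    outcome = decision + 0.5 * u_outcome
    return {"data": data, "decision": decision, "outcome": outcome}

# Observational distribution vs. interventional distribution under do(decision = 1):
observational = [sample() for _ in range(10_000)]
interventional = [sample(intervene_decision=1) for _ in range(10_000)]
avg = lambda rows: sum(r["outcome"] for r in rows) / len(rows)
print(f"E[outcome]           = {avg(observational):.3f}")
print(f"E[outcome | do(d=1)] = {avg(interventional):.3f}")
```

The point of the do-operator line is that an intervention replaces a structural equation rather than conditioning on observed data, which is the distinction SCMs add over purely statistical models.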


2021, pp. 000276422198978. Author(s): Kitae Kim, Shin-Il Moon

Content curation on contemporary digital platforms leverages both algorithmic decision making and human judgment. As algorithms have become an integral part of digital configurations, there are growing concerns about the lack of accountability surrounding algorithm-driven digital services. The issue of algorithmic accountability is attributed not only to the intrinsic opacity of computational processes but also to the lack of transparency in platform governance. This article discusses two controversial cases surrounding algorithmic transparency in the South Korean digital environment. It first outlines the notion of algorithmic transparency as a prerequisite for accountability. It then situates the use of algorithms for online content curation in the South Korean digital environment to illustrate how algorithmic transparency is complicated by sociopolitical conditions. Finally, the article offers several suggestions for promoting a more accountable algorithm society.


2020, Vol. 8 (4), pp. 456-467. Author(s): Marc J. C. Van den Homberg, Caroline M. Gevaert, Yola Georgiadou

Over the past two decades, humanitarian conduct has been drifting away from the classical paradigm. This drift is caused by the blurring of boundaries between development aid and humanitarianism and by the increasing reliance on digital technologies and data. New humanitarianism, especially in the form of disaster risk reduction, involved government authorities in plans to strengthen their capacity to deal with disasters. Digital humanitarianism now enrolls remote data analytics: GIS capacity, local data and information management experts, and digital volunteers. It harnesses the power of artificial intelligence to strengthen the capacity of humanitarian agencies and governments to anticipate and cope better with crises. In this article, we first trace how the meaning of accountability changed from classical to new and finally to digital humanitarianism. We then describe a recent empirical case of anticipatory humanitarian action in the Philippines. The Red Cross Red Crescent movement designed an artificial intelligence algorithm to trigger the release of funds, typically used for humanitarian response, in advance of an impending typhoon in order to start early actions that mitigate its potential impact. We highlight emerging actors and fora in the accountability relationship of anticipatory humanitarian action, as well as the consequences arising from actors’ (mis)conduct. Finally, we reflect on the implications of this new form of algorithmic accountability for classical humanitarianism.
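As a rough illustration of how such a forecast-based trigger can work (the damage-ratio output, the 10% threshold, and all names are assumptions for this sketch, not the model described in the article), the snippet below releases early-action funds for municipalities whose predicted typhoon damage exceeds a threshold.

```python
# Hypothetical sketch of a forecast-based trigger for anticipatory action,
# assuming a model that predicts the share of houses damaged per municipality.
# Threshold and field names are illustrative only.
from dataclasses import dataclass

DAMAGE_THRESHOLD = 0.10  # assumed: trigger if >=10% of houses predicted damaged

@dataclass
class MunicipalityForecast:
    name: str
    predicted_damage_ratio: float  # model output in [0, 1]

def municipalities_to_trigger(forecasts: list[MunicipalityForecast]) -> list[str]:
    """Return the municipalities where early-action funds would be released."""
    return [f.name for f in forecasts
            if f.predicted_damage_ratio >= DAMAGE_THRESHOLD]

forecasts = [MunicipalityForecast("A", 0.04),
             MunicipalityForecast("B", 0.17)]
print(municipalities_to_trigger(forecasts))  # ['B']
```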


PLoS ONE, 2020, Vol. 15 (11), pp. e0241286. Author(s): Irene Unceta, Jordi Nin, Oriol Pujol

Significance: This is blurring boundaries between traditional operational remits and is making it difficult to understand and mitigate risk, at both organisational and systemic levels.
Impacts: New models of service provision (rather than ownership) make risk control more difficult. Legal and practical safeguards for algorithmic accountability are weak. Privacy protections are being tested by the huge data flows and data processing potential of the Industrial IoT.

