From pre-emption to slowness: Assessing the contrasting temporalities of data-driven predictive policing

2020 ◽  
Vol 22 (9) ◽  
pp. 1528-1544
Author(s):  
Mark Andrejevic ◽  
Lina Dencik ◽  
Emiliano Treré

Debates on the temporal shift associated with digitalization often stress notions of speed and acceleration. With the advent of big data and predictive analytics, the time-compressing features of digitalization are compounded within a distinct operative logic: that of pre-emption. The temporality of pre-emption attempts to project the past into a simulated future that can be acted upon in the present; a temporality of pure imminence. Yet, inherently paradoxical, pre-emption is marked by myriad contrasts and frictions as it is caught between the supposedly all-encompassing knowledge of the data-processing ‘Machine’ and the daily reality of decision-making practices by relevant social actors. In this article, we explore the contrasting temporalities of automated data processing and predictive analytics, using policing as an illustrative example. Drawing on insights from two cases of predictive policing systems that have been implemented among UK police forces, we highlight the prevalence of counter-temporalities as predictive analytics is situated in institutional contexts and consider the conditions of possibility for agency and deliberation. Analysing these temporal tensions in relation to ‘slowness’ as a mode of resistance, the contextual examination of predictive policing advanced in the article contributes to a deeper awareness of the politics of time in automated data processing; one that may serve to counter the imperative of pre-emption that, taken to the limit, seeks to foreclose the time for politics, action and life.

Author(s):  
Christian Pentzold ◽  
Denise Fechner

This article explores how newsmakers exploit numeric records to anticipate the future. As this nascent area of data journalism experiments with predictive analytics, we examine its reports and computer-generated presentations, often infographics and data visualizations, and ask what time frames and topics these diagrammatic displays cover. We also interrogate the strategies employed to modulate the uncertainty involved in calculating more than one possible outlook. Based on a comprehensive sample of projects, our analysis shows how data journalism seeks accuracy but has to cope with a number of different prospective probabilities and the puzzle of how to address this multiplicity of futures. Despite their predictive ambition, these forecasts are inherently grounded in the past because they are based on archival data. We conclude that this form of quantified premediation limits the range of imaginable futures to one preferred mode, namely extrapolation.
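The extrapolative mode described above lends itself to a minimal illustration: fit a trend to archival observations and project it forward, with a residual-based band standing in for the multiplicity of prospective probabilities. The series and horizon below are hypothetical, not drawn from the study's sample.

```python
import numpy as np

# Hypothetical archival series: ten years of annual observations.
years = np.arange(2010, 2020)
values = np.array([3.1, 3.4, 3.3, 3.8, 4.0, 4.4, 4.3, 4.8, 5.1, 5.2])

# Fit a linear trend to the past -- the "one preferred mode" of forecasting.
slope, intercept = np.polyfit(years, values, deg=1)

# Extrapolate five years ahead; the residual spread gives a crude uncertainty band.
future = np.arange(2020, 2025)
forecast = slope * future + intercept
resid_sd = np.std(values - (slope * years + intercept), ddof=2)

for year, point in zip(future, forecast):
    print(f"{year}: {point:.2f} +/- {2 * resid_sd:.2f}")
```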


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Giacomo Baggio ◽  
Danielle S. Bassett ◽  
Fabio Pasqualetti

Our ability to manipulate the behavior of complex networks depends on the design of efficient control algorithms and, critically, on the availability of an accurate and tractable model of the network dynamics. While the design of control algorithms for network systems has seen notable advances in the past few years, knowledge of the network dynamics is a ubiquitous assumption that is difficult to satisfy in practice. In this paper, we overcome this limitation and develop a data-driven framework to control a complex network optimally and without any knowledge of the network dynamics. Our optimal controls are constructed using a finite set of data, where the unknown network is stimulated with arbitrary and possibly random inputs. Although our controls are provably correct for networks with linear dynamics, we also characterize their performance against noisy data and in the presence of nonlinear dynamics, as they arise in power grid and brain networks.
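As a rough sketch of the data-driven idea (not the authors' algorithm, whose optimality guarantees and data requirements are established in the paper), an unknown linear network can be steered using only recorded input/final-state pairs: any target reachable from the data is hit by replaying a suitable linear combination of the experimental inputs. The toy dynamics and all names below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown linear network x(t+1) = A x(t) + B u(t), hidden from the controller.
n, m, T = 5, 2, 8                      # states, inputs, control horizon
A = rng.normal(scale=0.3, size=(n, n))
B = rng.normal(size=(n, m))

def run(u_seq):
    """Simulate the (unknown) network from x(0) = 0 under an input sequence."""
    x = np.zeros(n)
    for u in u_seq:
        x = A @ x + B @ u
    return x

# Data collection: stimulate the network with N random input sequences,
# recording only the inputs and the resulting final states.
N = 50
U = rng.normal(size=(N, T, m))                 # experimental inputs
X = np.stack([run(U[i]) for i in range(N)])    # measured final states

# Data-driven control: to reach a target x*, find coefficients alpha with
# X^T alpha ~ x*, then replay the matching blend of recorded inputs.
x_target = rng.normal(size=n)
alpha, *_ = np.linalg.lstsq(X.T, x_target, rcond=None)
u_star = np.einsum("i,itm->tm", alpha, U)      # blended input sequence

print("reaching error:", np.linalg.norm(run(u_star) - x_target))
```

Here the controller never identifies A or B; the target is reached by recombining data alone, which is the spirit of the framework described in the abstract.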


2021 ◽  
Vol 75 (3) ◽  
pp. 76-82
Author(s):  
G.T. Balakayeva ◽  
D.K. Darkenbayev ◽  
M. Turdaliyev ◽  
...  

The volume of data handled by enterprises has grown significantly over the last decade. Research has shown that over the past two decades the amount of data has increased roughly tenfold every two years, outpacing Moore's Law, under which processor power doubles over a comparable period. About thirty thousand gigabytes of data accumulate every second, and processing them demands ever more efficient methods. Videos, photos and messages uploaded by users of social networks add large volumes of data, much of it unstructured. Enterprises must therefore work with big data in different formats, which has to be prepared in specific ways before modeling and computation can yield results. The research presented in this article on processing and storing enterprise big data, on developing a model and algorithms, and on applying new technologies is therefore relevant. Enterprise information flows will undoubtedly keep growing, making the storage and processing of large data volumes a pressing problem. The article's relevance also stems from growing digitalization and the increasing shift of professional activity online across many areas of modern society. The article provides a detailed analysis of these new technologies.
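As a minimal sketch of the kind of data preparation the abstract alludes to (the article's own model and algorithms are not reproduced here), the snippet below streams a large, partly malformed newline-delimited JSON file in fixed-size batches, so the data never has to fit in memory. The file name and record fields are hypothetical.

```python
import json
from collections import Counter

def stream_records(path, batch_size=100_000):
    """Yield batches of parsed records from a large newline-delimited JSON file."""
    batch = []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            try:
                batch.append(json.loads(line))
            except json.JSONDecodeError:
                continue  # skip malformed (unstructured) records
            if len(batch) >= batch_size:
                yield batch
                batch = []
    if batch:
        yield batch

# Aggregate without ever loading the whole file: count records per user.
counts = Counter()
for batch in stream_records("events.ndjson"):   # hypothetical file
    counts.update(rec.get("user_id", "unknown") for rec in batch)
print(counts.most_common(5))
```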


2021 ◽  
Author(s):  
Aleksei Seleznev ◽  
Dmitry Mukhin ◽  
Andrey Gavrilov ◽  
Alexander Feigin

We investigate the decadal-to-centennial ENSO variability based on nonlinear data-driven stochastic modeling. We construct a data-driven model of yearly Niño-3.4 indices reconstructed from paleoclimate proxies based on three different sea-surface temperature (SST) databases over the time interval from 1150 to 1995 [1]. The data-driven model is forced by the solar activity and CO2 concentration signals. We find a persistent antiphase relationship between the solar forcing and Niño-3.4 SST on the bicentennial time scale. The dynamical mechanism of such a response is discussed.

The work was supported by the Russian Science Foundation (Grant No. 20-62-46056).

1. Emile-Geay, J., Cobb, K. M., Mann, M. E., & Wittenberg, A. T. (2013). Estimating Central Equatorial Pacific SST Variability over the Past Millennium. Part II: Reconstructions and Implications. Journal of Climate, 26(7), 2329-2352.
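The model in the abstract is nonlinear and stochastic, built with the authors' Bayesian data-driven framework; as a far simpler stand-in that conveys the idea of fitting a forced stochastic model to a yearly index, the sketch below estimates a linear AR model with exogenous solar and CO2 terms by least squares. The arrays are random placeholders for the reconstructed series.

```python
import numpy as np

# Placeholder yearly series (1150-1995 in the study); replace with real data.
rng = np.random.default_rng(1)
T = 846
nino, solar, co2 = rng.normal(size=T), rng.normal(size=T), rng.normal(size=T)

# ARX(2): nino[t] = a1*nino[t-1] + a2*nino[t-2] + b*solar[t] + c*co2[t] + noise
p = 2
X = np.column_stack([nino[p - 1:-1], nino[p - 2:-2],
                     solar[p:], co2[p:], np.ones(T - p)])
y = nino[p:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a1, a2, b, c, bias = coef
print(f"lag-1={a1:.3f} lag-2={a2:.3f} solar={b:.3f} co2={c:.3f}")

# The residual variance defines the stochastic part of the fitted model.
noise_sd = np.std(y - X @ coef, ddof=len(coef))
print("noise sd:", round(noise_sd, 3))
```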


2019 ◽  
Vol 15 (S367) ◽  
pp. 199-209
Author(s):  
Shanshan Li ◽  
Chenzhou Cui ◽  
Cuilan Qiao ◽  
Dongwei Fan ◽  
Changhua Li ◽  
...  

Astronomy education and public outreach (EPO) is an important part of the future development of astronomy. During the past few years, with the rapid evolution of the Internet and continuous changes in policy, the environment for science EPO has kept improving and the number of related projects has boomed. EPO is no longer just a matter for teachers and science educators; it has also attracted the attention of professional astronomers. Among all the activities of astronomy EPO, data-driven astronomy education and public outreach (abbreviated as DAEPO) is special and important. It benefits from the development of Big Data and Internet technology and is full of flexibility and diversity. We present the history, definition, best practices and prospective development of DAEPO for a better understanding of this active field.


2021 ◽  
pp. 026638212110619
Author(s):  
Sharon Richardson

During the past two decades, there have been a number of breakthroughs in the fields of data science and artificial intelligence, made possible by advanced machine learning algorithms trained through access to massive volumes of data. However, their adoption and use in real-world applications remain a challenge. This paper posits that a key limitation in making AI applicable has been a failure to modernise the theoretical frameworks needed to evaluate and adopt outcomes. Such a need was anticipated with the arrival of the digital computer in the 1950s but has remained unrealised. This paper reviews how the field of data science emerged and led to rapid breakthroughs in algorithms underpinning research into artificial intelligence. It then discusses the contextual framework now needed to advance the use of AI in real-world decisions that impact human lives and livelihoods.

