mass digitization
Recently Published Documents


TOTAL DOCUMENTS: 89 (FIVE YEARS: 31)

H-INDEX: 7 (FIVE YEARS: 1)

Semiotica, 2021, Vol. 0 (0)
Author(s): Maarja Ojamaa, Indrek Ibrus

Abstract: Digital cultural archives and databases promise an era of heritage democratization and an enhanced role for the arts in everyday cultures. It is hoped that mass digitization initiatives in many corners of the world can facilitate the secure preservation of human cultural heritage, with easy access and diverse ways for creative reuse. Understanding the dialogic processes within these increasingly vast databases requires a dynamic conceptualization of the data they contain. The paper argues that such a conceptualization can be found in Juri Lotman’s cultural semiotic notion of the text and its functions in culture. It elaborates on the three key characteristics of text (expression, boundary, and structure) as manifested within the digital semiosphere. At the same time, the textual dialogues within digital archives are increasingly conditioned by metadata, which is here conceptualized as a metalanguage that induces a modeling effect on archived texts and defines their possible sphere of dynamics. To balance this account of the creative operations of digital archives, the paper also demonstrates their auto-communicative mechanisms for facilitating cultural continuity and stability.


2021, Vol. 10, pp. 30-33
Author(s): Sanne Muurling, Evelien Walhout

The future of the Historical Sample of the Netherlands (HSN) will certainly include the enrichment of the foundational database with additional, new sources of information. In general, the HSN would benefit greatly from current mass digitization projects involving citizen science. This essay proposes a pilot linking 19th- and early 20th-century criminal records to the HSN. Despite the extensive state and parish registration documenting individual and family lives in close, systematic detail, life course approaches to historical crime remain relatively uncommon. The large datasets necessary for longitudinal life course research into deviant behaviour will facilitate both the analysis of criminality as an event and the scrutiny of the trajectories of individuals' lives leading up to their involvement in crime.


2021, pp. 17-45
Author(s): Adam Crymble

This chapter outlines the multiple origin myths of “digital” historical research, arguing that social science-inspired cliometricians and linguistically inclined humanities computing scholars working on textual collections were both using computers from the mid-twentieth century, but with very different intellectual agendas and only occasionally crossing paths. With the rise of mass digitization in the 1990s, both groups inspired a new generation of “digital” historians who worked to unlock the potential of the newly digitized archives. Wrestling with practical and intellectual challenges ranging from poor-quality transcription to incomplete data, this group generated new knowledge and answered new questions, such as “what do you do with a million books?”, but did not necessarily contribute directly to the existing conversations of the historiography.


2021, pp. 46-78
Author(s): Adam Crymble

By the twenty-first century, billions of historical sources had been digitized, with many historians actively involved in this unprecedented archival revisionism. Understanding the history of mass digitization is fundamental to understanding the environment in which historians worked in the late twentieth and early twenty-first centuries, as well as one of the key ways that historians applied computers to their cause. Charting the history of the archive through waves of interest in hypertext, multimedia, the Internet, Web 2.0, user experience, and mobile computing, this chapter argues that changes in technology enabled historians to revise the nature of the archive, first by bringing primary sources into the classroom and then into the streets.


Author(s): Tonia Sutherland, Alyssa Purcell

This article uses Indigenous decolonizing methodologies and Critical Race Theory (CRT) as methodological and theoretical frameworks to address colonial and racialized concerns about archival description; to argue against notions of diversity and inclusion in archival descriptive practices; and to make recommendations for decolonizing description and embracing redescription as liberatory archival praxis. First, we argue that extant descriptive practices do not diversify archives; rather, we find that descriptive work that isolates and scatters aims to erase the identifiable existence of unique Indigenous voices. Next, we argue that while the mass digitization of slavery-era records holds the promise of new historical knowledge and of genealogical reconstruction for descendants of enslaved peoples, this trend also belies a growing tendency to reinscribe racist ideologies and to codify damaging ideas about how we organize and create new knowledge through harmful descriptive practices. Finally, working specifically against the rhetoric of diversity and inclusion, we challenge the ways archives claim diverse representation by uncritically describing records rooted in generational trauma, hatred, and genocide, and we advocate for developing and employing decolonizing and redescriptive practices that support an archival praxis rooted in justice and liberation, rather than more palatable (and less effective) notions of “diversity and inclusion”.


2020, pp. 1-16
Author(s): Sarah Brayne

This introductory chapter provides a definition of big data as well as an overview of the use of big data in policing. Big data is a data environment made possible by the mass digitization of information and is associated with the use of advanced analytics, including network analysis and machine learning algorithms. Law enforcement’s adoption of big data is part of a broader shift toward the use of big data and machine-learned decisions throughout the criminal justice system. From surveillance to pretrial determinations and sentencing, big data saturates American criminal justice. It is also the subject of contentious debate in policy, media, legal, regulatory, advocacy, and academic circles. Focusing on the Los Angeles Police Department’s use of big data and associated surveillance technologies, this book studies how big data is actually used by police in practice—and to what consequence.


2020, pp. 37-55
Author(s): Sarah Brayne

This chapter discusses dragnet surveillance, which is the collection and analysis of information on everyone, rather than merely those under suspicion. Dragnet surveillance—and the data it produces—can be useful for law enforcement to solve crimes. Dragnet surveillance widens and deepens social oversight: it includes a broader swath of people and can follow any single individual across a greater range of institutional settings. It is associated with three key transformations in the practice of policing: the shift from query-based to alert-based systems makes it possible to systematically surveil an unprecedentedly large number of people; individuals with no direct police contact are now included in law enforcement systems, lowering the threshold for inclusion in police databases; and institutional data systems are integrated, with police now collecting and using information gleaned from institutions not typically associated with crime control. However, dragnet surveillance is not an inevitable result of mass digitization. Rather, it is the result of choices that reflect the social and political positions of the subjects and the subject matter under surveillance.

