Racial Bias in Child Protection? A Comparison of Competing Explanations Using National Data

PEDIATRICS ◽  
2011 ◽  
Vol 127 (3) ◽  
pp. 471-478 ◽  
Author(s):  
B. Drake ◽  
J. M. Jolley ◽  
P. Lanier ◽  
J. Fluke ◽  
R. P. Barth ◽  
...  

2003 ◽  
Vol 28 (2) ◽  
pp. 45-47 ◽  
Author(s):  
Helen Johnstone

This paper outlines the parameters of the national out-of-home care data collection managed by the Australian Institute of Health and Welfare. The paper discusses the need for national data, what is included in the national data collection and the current data collection process. In addition, possible developments to the national collection are outlined, in particular the proposal to collect the data electronically in unit record format. The benefits of this would include greater flexibility in using the data and the ability to analyse how children move through the child protection and out-of-home care systems.


2018 ◽  
Vol 4 (1) ◽  
pp. 1-16 ◽  
Author(s):  
Mitali Thakor

In this paper, I discuss the embodied labor of policing child pornography through the ways in which algorithms and human reviewers like Linda see abuse images. I employ the concept of “apprehension” to suggest that the ways that reviewers “see” child pornography is always already oriented toward the capture and arrest of suspected offenders. As I have argued elsewhere (Author 2017; Forthcoming), the use of new digital techniques to find child pornography has fundamentally transformed and expanded policing into a distributed network of labor increasingly done by computer scientists and technology companies. Rather than suggest new software is the cause of these transformations, I draw attention to the constitutive and mutually defining relation between computing and corporeality, or how image detection algorithms need the work of human perception to put their detective skills to work.

I argue further still that the case study of child pornography detection offers an entry point into examining the algorithmic management of race. I suggest that childhood innocence is coded as whiteness, and whiteness as innocence, in the algorithmic detection of victims and abusers. By taking ‘detection’ as a dynamic practice between human and machine, I make an intervention into critical algorithm studies that have tended to focus solely on the programming of racial bias into software. The algorithmic detection of child pornography hinges, crucially, upon practice and the tacit observation of human reviewers, whose instinctual feelings about child protection and offender apprehension become embedded within the reviewing and reporting process as cases escalate for law enforcement.


2020 ◽  
Vol 49 (4) ◽  
pp. 273-284 ◽  
Author(s):  
Jordan G. Starck ◽  
Travis Riddle ◽  
Stacey Sinclair ◽  
Natasha Warikoo

Schools are heralded by some as unique sites for promoting racial equity. Central to this characterization is the presumption that teachers embrace racial equity and teaching about this topic. In contrast, others have documented the ongoing role of teachers in perpetuating racial inequality in schools. In this article, we employ data from two national data sets to investigate teachers’ explicit and implicit racial bias, comparing them to adults with similar characteristics. We find that both teachers and nonteachers hold pro-White explicit and implicit racial biases. Furthermore, differences between teachers and nonteachers were negligible or statistically nonsignificant. The findings suggest that if schools are to effectively promote racial equity, teachers should be provided with training to either shift or mitigate the effects of their own racial biases.


2011 ◽  
Vol 9 (1) ◽  
pp. 67-86
Author(s):  
Gabriella Tonk ◽  
Júlia Szigeti

Abstract This article performs a mainly qualitative analysis of the national data collection system on child abuse and neglect from the perspective of the child's rights to protection, participation and development. The study, carried out at the initiative of the ChildONEurope network, identifies several characteristics of good practice in the development of complex databases on child abuse and neglect. These characteristics are organized around three elements that constitute the analytical criteria for the present study, applied to the Romanian data systems: objectives, criteria and resources. The main aims of the study are: 1. presenting the Romanian data system regarding child abuse and neglect; 2. identifying the strengths and weaknesses of the Romanian data system based on the above-mentioned criteria; and 3. identifying new directions for developing data collection and monitoring in the Romanian child protection system. To achieve these aims, a qualitative analysis of the Romanian institutional-legislative framework was conducted through document analysis, complemented by an interview with a representative of the National Authority for the Protection of Family and Child's Rights; here the study aims to identify possible gaps and ambiguities in the legal framework that could be one source of unreliable data. The national database was analyzed through document analysis, while the county-level databases were examined through a questionnaire implemented in 47 county directorates for social assistance and child protection, complemented by 3 interviews with representatives of two local institutions in Cluj County.


Author(s):  
Jane Nusbaum Feller ◽  
Howard A. Davidson ◽  
Mark Hardin ◽  
Robert M. Horowitz