Auto-essentialization: Gender in automated facial analysis as extended colonial project

2021 ◽  
Vol 8 (2) ◽  
pp. 205395172110537
Author(s):  
Morgan Klaus Scheuerman ◽  
Madeleine Pape ◽  
Alex Hanna

Scholars are increasingly concerned about social biases in facial analysis systems, particularly with regard to the tangible consequences of misidentification of marginalized groups. However, few have examined how automated facial analysis technologies intersect with the historical genealogy of racialized gender—the gender binary and its classification as a highly racialized tool of colonial power and control. In this paper, we introduce the concept of auto-essentialization: the use of automated technologies to re-inscribe the essential notions of difference that were established under colonial rule. We consider how the face has emerged as a legitimate site of gender classification, despite being historically tied to projects of racial domination. We examine the history of gendering the face and body, from colonial projects aimed at disciplining bodies which do not fit within the European gender binary, to sexology's role in normalizing that binary, to physiognomic practices that ascribed notions of inferiority to non-European groups and women. We argue that the contemporary auto-essentialization of gender via the face is both racialized and trans-exclusive: it asserts a fixed gender binary and it elevates the white face as the ultimate model of gender difference. We demonstrate that imperialist ideologies are reflected in modern automated facial analysis tools in computer vision through two case studies: (1) commercial gender classification and (2) the security of both small-scale (women-only online platforms) and large-scale (national borders) spaces. Thus, we posit a rethinking of ethical attention to these systems: not as immature and novel, but as mature instantiations of much older technologies.

2021 ◽  
Vol 9 (08) ◽  
pp. 604-610
Author(s):  
Tanmay Munjal

Large-scale censorship and control over the free flow of information on the internet, long implemented in authoritarian countries such as China, has over the past decade spread to more liberal Western democracies, including India and the US. This trend has raised privacy concerns and, among many great thinkers of our era, fears of a dystopian future in which tyrannical governments, empowered by digital surveillance technology, become essentially undefeatable on a level unforeseen in the history of humanity. In this paper, we outline a method, based on blockchain technology, to combat and ultimately eliminate both the possibility and the current practice of censorship and control over the flow of information on the internet, heralding an era of free flow of information throughout the world and destroying practically all mind control that tyrannical governments can hold over their people, in essence ending the era of propaganda and tyranny forever.


Author(s):  
Jama Shelton ◽  
Kel Kroehle ◽  
Emilie K. Clark ◽  
Kristie Seelman ◽  
SJ Dodd

The enforcement of the gender binary is a root cause of gender-based violence (GBV) for trans people. Disrupting GBV requires that we ensure that ‘gender’ is not presumed synonymous with White cisgender womanhood. Transfeminists suggest that attaining gender equity requires confronting all forms of oppression that police people and their bodies, including White supremacy, colonialism and capitalism (Silva and Ornat, 2016; Simpkins, 2016). Part of this project, we argue, includes confronting the structures of GBV embedded within digital technologies that are increasingly part of our everyday lives. Informed by transfeminist theory (Koyama, 2003; Stryker and Bettcher, 2016; Simpkins, 2016; Weerawardhana, 2018), we interrogate the ways in which digital technologies naturalise and reinforce GBV against bodies marked as divergent. We examine the subtler ways that digital technology can fortify binary gender as a mechanism of power and control. We highlight how gendered forms of data violence cannot be disentangled from digital technologies that surveil, police or punish on the basis of race, nationhood and citizenship, particularly in relation to predictive policing practices. 
We conclude with recommendations to guide technological development to reduce the violence enacted upon trans people and those whose gender presentations transgress society’s normative criteria for what constitutes a compliant (read: appropriately gendered) citizen.

Key messages:
- Violence against trans people is inherently gender-based.
- A root cause of gender-based violence against trans people is the strict reinforcement of the gender binary.
- Digital technology and predictive policing can fortify binary gender as a mechanism of power and control.
- Designers of digital technologies and the policymakers regulating surveillance capitalism must interrogate the ways in which their work upholds the gender binary and gender-based violence against trans people.


This chapter extends the book’s insights about nature, technology, and nation to the larger history of the modern period. While the modern nation loses its grip as a locus of identity and analysis, attempts to understand the operation, disruption, and collapse of continental and global infrastructures continue to mix the natural and the machinic in ways that define them both. Those vulnerabilities emphasize large-scale catastrophe; historiographically, they mask the crucial role of small-scale failures in the experience and culture of late modernity, including its definition of nature. Historical actors turned the uneven geographical distribution of small-scale failures into a marker of distinctive local natures and an element of regional and national identity. Attending to those failures helps not only situate cold-war technologies in the larger modern history of natural and machinic orders; it helps provincialize the superpowers by casting problematic “other” natures as central and primary.


2019 ◽  
pp. 107-130
Author(s):  
Samy Cohen

2006–2010: during these decisive years, the peace movement experienced a dramatic eclipse. Within an Israeli society that had grown increasingly nationalist, more attached to symbols of Jewish identity and the memory of the Holocaust, more concerned than ever about security, and less interested in making peace with the Palestinians, the movement was incapable both of promoting a message of peace and of taking a stance on the subject of human rights. It seemed apathetic, paralyzed, almost non-existent in the face of the terrible events that marked the period. This chapter shows how and why this eclipse occurred. These years were punctuated by two large-scale military operations: the war in Lebanon in July 2006 and Operation Cast Lead in the Gaza Strip from late 2008 to early 2009. These hostilities caused turmoil in Israeli collective psychology and in perceptions of war and peace.


Author(s):  
Hans-Jörg Schmid

This chapter discusses how the Entrenchment-and-Conventionalization Model explains language change. First, it is emphasized that not only innovation and variation, but also the frequency of repetition can serve as important triggers of change. Conventionalization and entrenchment processes can interact and be influenced by numerous forces in many ways, resulting in various small-scale processes of language change, which can stop, change direction, or even become reversed. This insight serves as a basis for the systematic description of nine basic modules of change which differ in the ways in which they are triggered and controlled by processes and forces. Large-scale pathways of change such as grammaticalization, lexicalization, pragmaticalization, context-induced change, or colloquialization and standardization are all explained by reference to these modules. The system is applied in a case study on the history of do-periphrasis.


2010 ◽  
Vol 16 (1) ◽  
Author(s):  
Rob Garbutt

Clearings make settlement possible. Whether on a small scale using an axe and other hand implements to make way for a dwelling and a garden, or on a large scale with a chain strung between two D9 bulldozers in preparation for a major agribusiness development, the process of clearing creates spaces for installing something new. This paper uses the idea of (the) clearing, as practice, process, outcome and metaphor, to examine the installation of the locals in a settler society. Using Lismore on the far-north coast of New South Wales, Australia, as a case example, the particular work of clearing discussed here is a practice that enables a form of colonisation and settlement that distances itself from its history of migration. This is a history of settler locals who were 'always here', and a colonial form of clearing clears the land and the mind of troubling pasts and of troubling presences. For the locals within a place, then, clearing manages and simplifies a complex set of social and material relations, histories and identities. Using Anthony Appiah's concept of the 'space clearing gesture', the paper concludes with a reflection on the space in which the idea of 'the clearing', and this paper, appears. Do places, in this instance rural places, provide a type of clearing in which certain ideas might appear that may not appear elsewhere? If situatedness matters, then the diversity of places where thinking is done is important for our ecology of thought, and in connection with this, perhaps what 'rural cultural studies' does is clear a particular type of space for thinking.


1990 ◽  
Vol 5 ◽  
pp. 262-272
Author(s):  
William Miller

Paleontologists have lavished much time and energy on description and explanation of large-scale patterns in the fossil record (e.g., mass extinctions, histories of monophyletic taxa, deployment of major biogeographic units), while paying comparatively little attention to biologic patterns preserved only in local stratigraphic sequences. Interpretation of the large-scale patterns will always be seen as the chief justification for the science of paleontology, but solving problems framed by long time spans and large areas is rife with tenuous inference and patterns are prone to varied interpretation by different investigators using virtually the same data sets (as in the controversy over ultimate cause of the terminal Cretaceous extinctions). In other words, the large-scale patterns in the history of life are the true philosophical property of paleontology, but there will always be serious problems in attempting to resolve processes that transpired over millions to hundreds-of-millions of years and encompassed vast areas of seafloor or landscape. By contrast, less spectacular and more commonplace changes in local habitats (often related to larger-scale events and cycles) and attendant biologic responses are closer to our direct experience of the living world and should be easier to interpret unequivocally. These small-scale responses are reflected in the fossil record at the scale of local outcrops.


2000 ◽  
Vol 407 ◽  
pp. 105-122 ◽  
Author(s):  
JACQUES VANNESTE

The effect of a small-scale topography on large-scale, small-amplitude oceanic motion is analysed using a two-dimensional quasi-geostrophic model that includes free-surface and β effects, Ekman friction and viscous (or turbulent) dissipation. The topography is two-dimensional and periodic; its slope is assumed to be much larger than the ratio of the ocean depth to the Earth's radius. An averaged equation of motion is derived for flows with spatial scales that are much larger than the scale of the topography and either (i) much larger than or (ii) comparable to the radius of deformation. Compared to the standard quasi-geostrophic equation, this averaged equation contains an additional dissipative term that results from the interaction between topography and dissipation. In case (i) this term simply represents an additional Ekman friction, whereas in case (ii) it is given by an integral over the history of the large-scale flow. The properties of the additional term are studied in detail. For case (i) in particular, numerical calculations are employed to analyse the dependence of the additional Ekman friction on the structure of the topography and on the strength of the original dissipation mechanisms.
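The averaged equation described in this abstract can be sketched schematically. The notation below (streamfunction ψ, deformation radius L_d, Ekman friction coefficient r, memory kernel K) is illustrative and not taken from the paper itself:

```latex
% Schematic averaged quasi-geostrophic equation (illustrative notation):
\partial_t\!\left(\nabla^2\psi - \frac{\psi}{L_d^2}\right)
  + J(\psi,\nabla^2\psi) + \beta\,\partial_x\psi
  = -r\,\nabla^2\psi + \nu\,\nabla^4\psi + D[\psi]
% where D[\psi] is the additional dissipative term from the
% topography--dissipation interaction:
%   case (i),  scales much larger than L_d:
%     D[\psi] = -r_{\mathrm{eff}}\,\nabla^2\psi   % extra Ekman friction
%   case (ii), scales comparable to L_d:
%     D[\psi] = -\int_0^{t} K(t-s)\,\nabla^2\psi(\cdot,s)\,\mathrm{d}s
%     % an integral over the history of the large-scale flow
```

The sketch shows only the structural point made in the abstract: in case (i) the new term acts as an enhanced Ekman friction, while in case (ii) it carries a memory of the flow's history.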


2020 ◽  
pp. 109963622093582
Author(s):  
Juho T Siivola ◽  
Shu Minakuchi ◽  
Tadahito Mizutani ◽  
Kazuya Kitamoto ◽  
Nobuo Takeda

Dimpling in the composite face sheets of honeycomb sandwich structures due to mismatch in the thermal expansion coefficients of the constituent materials was studied with emphasis on its monitoring and prediction. Strain distributions along optical fibers embedded in the face sheet were monitored during manufacturing. Dimple formation and in-plane strain distributions in the face sheets were studied using finite element analysis, and an analytical model based on beam theory was constructed to predict the dimple depths from the strain data. A system using twin optical fiber sensors was proposed to accurately measure the dimpling-induced strains. The usability and performance of the system were evaluated using small-scale specimens and finally on a more realistic large-scale specimen. The system could measure the strain changes due to dimpling of the face sheets and provided a good prediction of the dimple depth distribution along the sandwich panels.


2020 ◽  
Vol 12 (4) ◽  
pp. 1-20
Author(s):  
Kuan-Chung Shih ◽  
Yan-Kwang Chen ◽  
Yi-Ming Li ◽  
Chih-Teng Chen

Integrated decisions on merchandise image display and inventory planning are closely related to the operational performance of online stores. A visual-attention-dependent demand (VADD) model has been developed to support online stores in making these decisions. In the face of evolving products, customer needs, and competitors in an e-commerce environment, the benefit of using the VADD model depends on how fast the model runs on a computer. Accordingly, a discrete particle swarm optimization (DPSO) method is employed to solve the VADD model. To verify the usability and effectiveness of the DPSO method, it was compared with existing methods on large-scale, medium-scale, and small-scale problems. The comparison results show that both the GA and DPSO methods perform well in terms of approximation rate, but the DPSO method takes less time than the GA method. A sensitivity analysis is conducted to determine the model parameters that influence this comparison result.
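The abstract does not give the DPSO formulation, so the following is a minimal sketch of a generic binary (discrete) particle swarm optimizer of the kind the paper's method belongs to, applied to a stand-in objective rather than the VADD model. All names and parameter values here are illustrative assumptions:

```python
import math
import random

def dpso(objective, n_bits, n_particles=20, n_iters=50,
         w=0.7, c1=1.5, c2=1.5, seed=0):
    """Binary PSO sketch: velocities are real-valued; each position bit is
    resampled with probability sigmoid(velocity), and personal/global bests
    guide the velocity update. `objective` maps a 0/1 list to a score
    (higher is better)."""
    rng = random.Random(seed)
    sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
    # Random initial bit-vector positions and zero velocities.
    pos = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    vel = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal bests
    pbest_val = [objective(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(n_bits):
                r1, r2 = rng.random(), rng.random()
                # Inertia + attraction toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Discrete step: bit is 1 with probability sigmoid(velocity).
                pos[i][d] = 1 if rng.random() < sigmoid(vel[i][d]) else 0
            val = objective(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy usage: maximize the number of selected items (objective = sum of bits),
# standing in for a display/inventory selection score.
best, best_val = dpso(sum, n_bits=8)
```

In the paper's setting, `objective` would be the VADD model's expected-profit evaluation over a candidate display/inventory configuration; the sketch only shows the discrete PSO mechanics.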

