Making Sense of “Microaggression”

2021 ◽  
Vol 37 (1) ◽  
pp. 111-124
Author(s):  
Heather Stewart ◽  

Though philosophers are beginning to pay attention to the phenomenon of microaggressions, they have yet to draw fully on their training and skills in conceptual analysis to help make sense of what microaggression is. In this paper, I offer a philosophical analysis of the concept of microaggression. I ultimately argue that ‘microaggression’ as a concept gets its meaning not by decomposing into a set of necessary and sufficient conditions, but rather by means of what Ludwig Wittgenstein (1953) has called “family resemblance.” That is to say, what unifies the concept of microaggression is a set of common, overlapping features that link related instances together, but are not necessarily all present in all cases. I identify and explain a common set of features that together form the basis for a family resemblance account of the concept. I then argue that despite the difficulty that microaggressions pose in terms of being reliably recognized and understood as such, some people, in virtue of their epistemic standpoint, are better suited to recognize these features and subsequently identify instances of microaggression in practice. I argue this by drawing on the vast literature in feminist standpoint epistemology (Alcoff, 1993; Hill Collins, 1990, 2004; hooks, 2004; Harding, 2004, 2008; Wylie, 2013).

Naukovedenie ◽  
2021 ◽  
pp. 40-52
Author(s):  
Mikhail Sushchin ◽  

The problem of characterizing scientific knowledge and distinguishing it from other forms of activity has been discussed since antiquity. Nevertheless, in its modern guise, the problem of demarcating science from non-science appeared in the works of the logical positivists and their critic K. Popper. Thanks to their efforts, it became one of the central problems of twentieth-century philosophy of science. However, the difficulties facing the well-known classical and modern criteria of demarcation (including the verificationist and falsificationist criteria) prompted the American philosopher of science L. Laudan to declare “the demise of the demarcation problem.” According to Laudan, any satisfactory criterion of demarcation must provide a set of necessary and sufficient conditions on the basis of which alone it would be possible to distinguish science from non-science. Given the heterogeneity of the forms of scientific knowledge and the long, unsuccessful attempts to supply a set of necessary and sufficient conditions for what counts as science, Laudan concluded that attempts to find a criterion of demarcation are futile. However, as further investigations have shown, Laudan may have been hasty in his conclusions. In particular, one promising approach to the demarcation problem may be associated with the idea of family resemblance popularized by L. Wittgenstein and the application of developments from the field of fuzzy logic.
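The pairing of family resemblance with fuzzy logic mentioned above can be sketched computationally: instead of a strict necessary-and-sufficient test, each candidate activity receives a graded degree of membership in “science” according to which overlapping features it exhibits. This is a minimal illustrative sketch only; the feature names and weights below are invented for the example and are not drawn from the paper.

```python
# A fuzzy, family-resemblance-style membership grade for "science":
# no single feature is necessary, and membership comes in degrees.
# Feature names and weights are hypothetical illustrations.

FEATURES = {
    "makes_testable_predictions": 0.3,
    "uses_systematic_observation": 0.2,
    "revises_claims_on_evidence": 0.2,
    "employs_peer_review": 0.15,
    "builds_mathematical_models": 0.15,
}

def science_likeness(present_features):
    """Degree of membership in 'science', a value in [0, 1]."""
    return sum(w for f, w in FEATURES.items() if f in present_features)

# Two hypothetical candidate activities sharing different subsets of features:
astronomy = set(FEATURES)                        # exhibits every feature
astrology = {"builds_mathematical_models"}       # exhibits only one

print(round(science_likeness(astronomy), 2))     # full family resemblance
print(round(science_likeness(astrology), 2))     # partial resemblance
```

The point of the sketch is structural: a classical criterion would be a Boolean conjunction of necessary conditions, whereas the fuzzy grade lets two activities count as science-like to different degrees while sharing no single common feature.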


2020 ◽  
Vol 76 (6) ◽  
pp. 1473-1491
Author(s):  
E. E. Lawrence

Purpose – The term diverse books is increasingly popular yet persistently nebulous. The purpose of this paper – Part I of II – is to illuminate both that the concept is in need of a unified account and that conceptual analysis, though at first seemingly quite promising, fails as a method for identifying one.
Design/methodology/approach – This paper utilizes traditional (or intuitive) conceptual analysis to specify the respective clusters of necessary and sufficient conditions that constitute four broad candidate accounts of diverse books.
Findings – Though diverse books is a concept in need of a definition, conceptual analysis is not an appropriate method for adjudicating between the definitions on offer. This is because the concept is fundamentally political, serving as a resource for re-shaping collective social arrangements and ways of life. The conceptual problem outlined here requires for its resolution a method that moves us from a descriptive project to an explicitly normative one, wherein we consider what we properly work to achieve with and through the concept in question.
Originality/value – This paper initiates a systematic analytical project aimed at defining diverse books. In illustrating a moment of methodological failure, it paves the way for a critical alternative – namely, Part II's proposal of an analytical intervention in which political concepts are defined partially in terms of their benefits vis-à-vis informational justice.


NASKO ◽  
2011 ◽  
Vol 3 (1) ◽  
pp. 151 ◽  
Author(s):  
Melodie J. Fox

Classical theories of classification and concepts, originating in ancient Greek logic, have been criticized by classificationists, feminists, and scholars of marginalized groups because of the rigidity of conceptual boundaries and hierarchical structure. Despite this criticism, the principles of classical theory still underlie major library classification schemes. Rosch’s prototype theory, originating from cognitive psychology, uses Wittgenstein’s “family resemblance” as a basis for conceptual definition. Rather than requiring all necessary and sufficient conditions, prototype theory requires possession of some but not all common qualities for membership in a category. This paper explores prototype theory to determine whether it captures the fluidity of gender to avoid essentialism and accommodate transgender and queer identities. Ultimately, prototype theory constitutes a desirable conceptual framework for gender because it permits commonality without essentialism, difference without eliminating similarity. However, the instability of prototypical definitions would be difficult to implement in a practical environment and could still be manipulated to subordinate. Therefore, at best, prototype theory could complement more stable concept theories by incorporating contextual difference.


Philosophy ◽  
2013 ◽  
Author(s):  
Aaron M. Griffith

The notions of “truthmaking” and “truthmakers” are central to many attempts in contemporary metaphysics to come to grips with the connection between truth and reality. The intuitive motivation for theories of truthmaking is the idea that truth depends on reality: that truth is not primitive or fundamental, but rather derivative and dependent. The idea, more precisely stated, is that true propositions (or whatever are the primary truth-bearers, e.g., statements, sentences, or beliefs) are not true in and of themselves but must be made true by reality. Truthmaker theorists think that for a proposition to be made true is for it to be true in virtue of the existence of some entity, which is called its “truthmaker.” While many find the thought that truths are “true in virtue of,” or “grounded in,” or “determined by” reality compelling, not everyone finds the truthmaker theorist’s way of articulating this idea adequate. This article focuses on recent truthmaker theories, their challenges, and alternative approaches to truthmaking. One major point of contention surveyed here is the scope of truthmaking: i.e., whether every truth has a truthmaker, or only some. Another important issue is the nature of truthmakers. Some contend that states of affairs are truthmakers, while others hold that particular property instances (“tropes”) are better qualified to ground truths. Truthmaker theorists also disagree about how to characterize the “truthmaking relation” that holds between truths and their truthmakers. The various principles of truthmaking (principles setting out necessary and sufficient conditions under which an entity is a truthmaker for some proposition) offered in the literature are also surveyed in this entry. Perhaps the most contentious matter in truthmaker theory is how to deal with “problem cases”: i.e., truths for which there are no obvious truthmakers, such as negative existential truths, necessary truths, and subjunctive conditional truths. 
Some deny that these truths have truthmakers, but others have come up with ingenious, and therefore controversial, accounts of the truthmakers for these truths. Works on the relation between theories of truth and theories of truthmaking are also surveyed. Because it brings together foundational issues in ontology and truth, the nature of truthmaking and truthmakers has been, and will continue to be, a source of interest and excitement for philosophers.


Author(s):  
Keith Dowding

Chapter 10 discusses conceptual analysis. It argues that we should try to define terms in as non-normative a manner as possible. Whilst defining terms for specific purposes is justified, we cannot expect to define important political concepts in a universal manner without acknowledging the research question being posed as part of that analysis. Whilst defining terms by necessary and sufficient conditions will always seem desirable, given that society changes and morality develops, normative terms will evolve and change over time, much like species. The chapter returns to essential contestability and suggests that some concepts are actually incoherent once we try to bring precision to them. This incoherence is hidden by their vagueness in application. It argues that power is not a vague term in the sense that freedom or democracy are. The analysis of power in this book is designed to give a scientific account of our folk understandings and to enable a scientific description and analysis of the power and luck structure. It returns to the type–token distinction, bringing out how important the distinction is to the analysis of power in the book, which is directed at type-level explanations. The analysis is comparative statics, but dynamic game theory can provide a way to examine token power struggles as they unfold. It shows how the analysis offered in the book is structural despite seeing power as measured by the resources of agents – their capacity is given by the power and luck structure. It acknowledges that deep structure goes right into the heart of the formation of human preferences.


Philosophy ◽  
2007 ◽  
Vol 82 (2) ◽  
pp. 275-299 ◽  
Author(s):  
Nicholas Everitt

Virtue ethics (VE for short) is currently so widely embraced that different versions of the theory can now be distinguished. Some of these are mapped out in Statman's useful introduction to his collection. There are enough of these versions to constitute a family, and consequently what they share is a family resemblance rather than agreement to a defining set of necessary and sufficient conditions. What I propose to do, therefore, is to criticise one of the main versions of VE. Rosalind Hursthouse is the main proponent of the version which I will criticise. I choose her as a spokesperson, not because her version of VE is especially weak. On the contrary, it is because she is one of the leading protagonists of VE, and because her writings provide a lucid, powerful and elegant exposition of VE that her version of the theory is an appropriate object of scrutiny.


1986 ◽  
Vol 23 (04) ◽  
pp. 851-858 ◽  
Author(s):  
P. J. Brockwell

The Laplace transform of the extinction time is determined for a general birth and death process with arbitrary catastrophe rate and catastrophe size distribution. It is assumed only that the birth rates satisfy λ0 = 0 and λj > 0 for each j > 0. Necessary and sufficient conditions for certain extinction of the population are derived. The results are applied to the linear birth and death process (λj = jλ, μj = jμ) with catastrophes of several different types.
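The linear birth and death process with catastrophes described above can be illustrated with a simple Monte Carlo simulation. This sketch is not the paper's analytical Laplace-transform result: it assumes arbitrary example parameter values and treats each catastrophe as wiping out the whole population, which is only one special case of the catastrophe size distributions the paper allows.

```python
# Gillespie-style simulation of a linear birth-death process
# (birth rate n*lam, death rate n*mu when the population is n)
# with total catastrophes arriving at constant rate kappa.
# All parameter values below are arbitrary illustrations.
import random

def extinction_time(n0, lam, mu, kappa, rng, t_max=1e6):
    """Simulate one trajectory and return the time at which n hits 0
    (capped at t_max as a safety limit)."""
    t, n = 0.0, n0
    while n > 0:
        rate = n * (lam + mu) + kappa        # total event rate
        t += rng.expovariate(rate)           # exponential waiting time
        if t > t_max:
            return t_max
        u = rng.random() * rate              # pick which event occurred
        if u < kappa:
            n = 0                            # total catastrophe
        elif u < kappa + n * lam:
            n += 1                           # birth
        else:
            n -= 1                           # death
    return t

rng = random.Random(1)
times = [extinction_time(5, 1.0, 0.5, 0.2, rng) for _ in range(2000)]
print(sum(times) / len(times))               # Monte Carlo mean extinction time
```

Note that with kappa > 0 the simulated population goes extinct with certainty even in the supercritical case λ > μ, since catastrophes arrive at a constant rate regardless of population size; this is one simple instance of the certain-extinction behaviour whose exact conditions the paper characterizes.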

