Case Selection in Small-N Research

Author(s):  
Jason Seawright

Recent methodological work on systematic case selection offers ways of choosing cases for in-depth analysis that increase the probability of learning from them. This research has undermined several long-standing ideas about case selection. In particular, random selection of cases, paired or grouped selection for purposes of controlled comparison, typical cases, and extreme cases on the outcome variable all appear to be much less useful than their reputations suggest. Instead, scholars appear to gain the most, in terms of making new discoveries about causal relationships, when they study extreme cases on the causal variable or deviant cases.
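As a rough illustration of how these two recommended strategies might be operationalized, the sketch below (not drawn from the article; the regression, variables, and cut-offs are hypothetical) ranks cases by extremeness on the causal variable and by the size of their regression residuals (deviance).

```python
# Illustrative sketch only (hypothetical data): operationalizing extreme-case
# selection on the causal variable and deviant-case selection via residuals.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                # hypothetical causal variable
y = 0.5 * x + rng.normal(size=n)      # hypothetical outcome

model = sm.OLS(y, sm.add_constant(x)).fit()

# Extreme cases on the causal variable: largest |x - mean(x)|
extreme_on_x = np.argsort(np.abs(x - x.mean()))[-3:]

# Deviant cases: largest absolute residuals from the fitted model
deviant = np.argsort(np.abs(np.asarray(model.resid)))[-3:]

print("extreme-on-X case indices:", extreme_on_x)
print("deviant case indices:", deviant)
```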

2021, pp. 004912412110046
Author(s):  
Daniel J. Galvin ◽  
Jason N. Seawright

Scholarship on multimethod case selection in the social sciences has developed rapidly in recent years, but many possibilities remain unexplored. This essay introduces an attractive new alternative: selecting cases that are extreme on the treatment variable, net of the statistical influence of the known control variables. Cases that are extreme in this way are those in which the value of the main causal variable is as surprising as possible, so the approach can be described as seeking “surprising causes.” Selecting on surprising causes has practical advantages, as well as advantages in statistical efficiency that facilitate case-study discovery. We first argue for these advantages in general terms and then demonstrate them in an application to the dynamics of U.S. labor legislation.
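A minimal sketch of the idea, with hypothetical data and variable names (not the article’s labor-legislation application): regress the treatment on the known controls and treat the cases with the largest absolute residuals from that equation as the most “surprising causes.”

```python
# Hypothetical illustration: "surprising causes" as cases whose treatment
# value is most extreme net of the known control variables.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
controls = pd.DataFrame({"z1": rng.normal(size=n), "z2": rng.normal(size=n)})
treatment = 0.8 * controls["z1"] - 0.3 * controls["z2"] + rng.normal(size=n)

# Model the treatment as a function of the known controls.
first_stage = sm.OLS(treatment, sm.add_constant(controls)).fit()

# Residuals measure how surprising each case's treatment value is given the
# controls; the largest absolute residuals are the case-study candidates.
surprise = np.abs(np.asarray(first_stage.resid))
candidates = np.argsort(surprise)[-5:]
print("surprising-cause candidates (row indices):", candidates)
```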


2020, pp. 123-150
Author(s):  
Julia Saviello

Smell and taste are, of the five senses, the two most strongly stimulated by smoking tobacco. The article presents an in-depth analysis of how both forms of sensory perception are reflected in textual and visual sources concerning the early consumption of the herb. In a first step, tobacco’s changing reception, first as medicine and then as stimulant, is traced through the years of its increasing distribution in Europe, starting in the middle of the 16th century. As this overview reveals, the still little-known substance gave rise at that time to new forms of sense perception. Following recent studies on smell and gustation, which have stressed the need to take into account the interactions between these senses, the article probes the manifold stimulation of the senses by tobacco with reference to allegorical representations and genre scenes addressing the five senses. The smoking of tobacco was thematized in both of these art forms as a means of visualizing either smell or taste. Yet these depictions show no indication of any deliberate engagement with the exchange of sense data between mouth and nose. The question posed at the end of the paper is whether this also holds true for early smokers’ still lifes. In the so-called toebakjes or rookertjes, a subgenre of still-life painting that, like tobacco, was still a novelty at the beginning of the 17th century, various smoking paraphernalia – such as rolled or cut tobacco, pipes and tins – are arrayed with various kinds of food and drink. Finally, the article addresses a selection of such smokers’ still lifes, using the toebakje by Pieter Claesz., probably the first of its kind, as a starting point and the work by Georg Flegel as a comparative example. Through their selection of objects, both offer a complex image of how tobacco engages different senses.


2003, Vol. 17 (1), pp. 1-14
Author(s):  
Peggy A. Hite ◽  
John Hasseldine

This study analyzes a random selection of Internal Revenue Service (IRS) office audits from October 1997 to July 1998, the type of audit that concerns most taxpayers. Taxpayers engage paid preparers in order to avoid this type of audit and any resulting tax adjustments. The study examines whether tax returns prepared with paid-preparer assistance receive more audit adjustments and penalty assessments than returns prepared without such assistance. Comparing the frequency of adjustments on IRS office audits, the study finds significantly fewer tax adjustments on paid-preparer returns than on self-prepared returns. Moreover, CPA-prepared returns resulted in fewer audit adjustments than non-CPA-prepared returns.
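For readers unfamiliar with this kind of frequency comparison, a two-proportion test on made-up counts (purely hypothetical, not the study’s data) gives the flavor of the analysis.

```python
# Hypothetical counts only (NOT the study's data): compare adjustment rates
# on paid-preparer vs. self-prepared returns with a two-proportion z-test.
from statsmodels.stats.proportion import proportions_ztest

adjusted = [120, 180]   # returns with at least one audit adjustment
totals = [400, 400]     # audited returns: [paid-preparer, self-prepared]

stat, pvalue = proportions_ztest(count=adjusted, nobs=totals)
print(f"z = {stat:.2f}, p = {pvalue:.4f}")
```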


2021, Vol. 54 (3), pp. 1-36
Author(s):  
Syed Wasif Abbas Hamdani ◽  
Haider Abbas ◽  
Abdul Rehman Janjua ◽  
Waleed Bin Shahid ◽  
Muhammad Faisal Amjad ◽  
...  

Cyber threats have grown tremendously in recent years. Significant developments in the threat landscape have made strengthening digital infrastructure security essential. Better security can be achieved by fine-tuning system parameters to optimized security levels. For the protection of infrastructure and information systems, several well-known organizations have provided guidelines in the form of cybersecurity standards. Because security vulnerabilities can lead to severe financial, reputational, informational, and organizational compromise, it is imperative that a baseline for standards compliance be established. Selecting security standards and extracting requirements from those standards in an organizational context is a tedious task. This article presents a detailed literature review, a comprehensive analysis of various cybersecurity standards, and statistics on cyber-attacks related to operating systems (OS). In addition, it provides an explicit comparison of the frameworks, tools, and software available for OS compliance testing, together with an in-depth analysis of the most common software solutions for ensuring compliance with particular cybersecurity standards. Finally, based on the cybersecurity standards under consideration, a comprehensive set of minimum requirements for OS hardening is proposed and a few open research challenges are discussed.
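As a rough sketch of what automated compliance checking against a hardening baseline can look like, the snippet below tests two hypothetical Linux baseline items (the specific parameters and thresholds are illustrative assumptions, not requirements drawn from the article or any particular standard).

```python
# Illustrative sketch only: checking two hypothetical OS-hardening items on a
# Linux host. Real compliance tools evaluate far larger rule sets.
import subprocess
from pathlib import Path

def sysctl_equals(key: str, expected: str) -> bool:
    """Return True if the kernel parameter `key` has the expected value."""
    out = subprocess.run(["sysctl", "-n", key], capture_output=True, text=True)
    return out.returncode == 0 and out.stdout.strip() == expected

def mode_within(path: str, max_mode: int) -> bool:
    """Return True if no permission bits beyond `max_mode` are set on `path`."""
    p = Path(path)
    return p.exists() and (p.stat().st_mode & 0o777 & ~max_mode) == 0

checks = {
    "IP forwarding disabled": sysctl_equals("net.ipv4.ip_forward", "0"),
    "/etc/shadow not world-readable": mode_within("/etc/shadow", 0o640),
}
for name, ok in checks.items():
    print(f"[{'PASS' if ok else 'FAIL'}] {name}")
```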


2012, Vol. 22 (03), pp. 1250007
Author(s):  
PEDRO RODRÍGUEZ ◽  
MARÍA CECILIA RIVARA ◽  
ISAAC D. SCHERSON

A novel parallelization of the Lepp-bisection algorithm for triangulation refinement on multicore systems is presented. Randomization and careful use of the memory hierarchy are shown to substantially improve algorithm performance. Given a list of selected triangles to be refined, random selection of candidates together with pre-fetching of Lepp submeshes leads to a scalable and efficient multicore parallel implementation. The quality of the refinement is shown to be preserved.
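The scheduling pattern described (randomized selection of candidate triangles processed by concurrent workers) can be sketched roughly as below; this is only the parallel driver pattern with a placeholder for the mesh-dependent Lepp-submesh refinement, not the authors’ implementation.

```python
# Scheduling-pattern sketch only (not the Lepp-bisection algorithm itself):
# candidates are drawn in randomized order so concurrent workers are unlikely
# to operate on overlapping regions of the mesh at the same time.
import random
from concurrent.futures import ThreadPoolExecutor

def refine(triangle_id: int) -> str:
    # Placeholder for locating the Lepp submesh of this triangle and
    # bisecting it; the real work depends on the mesh data structure.
    return f"refined triangle {triangle_id}"

selected = list(range(100))      # ids of triangles marked for refinement
random.shuffle(selected)         # randomized candidate order

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(refine, selected))
print(len(results), "triangles processed")
```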


2021, Vol. ahead-of-print (ahead-of-print)
Author(s):  
Kent McFadzien ◽  
Lawrence W. Sherman

Purpose – The purpose of this paper is to demonstrate a “maintenance pathway” for ensuring a low false negative rate when closing investigations unlikely to lead to a clearance (detection).
Design/methodology/approach – A randomised controlled experiment testing solvability factors for non-domestic cases of minor violence.
Findings – A random selection of 788 cases, of which 428 would have been screened out, was sent forward for full investigation. The number of cases actually detected was 22; 19 of these were among the 360 recommended for allocation. This represents an improvement in accuracy over the original tests of the model three years earlier.
Research limitations/implications – The study shows how the safety of an investigative triage tool can be checked on a continuous basis for accuracy in predicting the cases unlikely to be solved if referred for full investigation.
Practical implications – This safety-check pathway means that many more cases can be closed after preliminary investigations, saving substantial time for cases more likely to yield a detection if sufficient investigative time is devoted to them.
Social implications – More offenders may be caught and brought to justice by using triage with a safety backstop for accurate forecasting.
Originality/value – This is the first published study of a maintenance pathway based on a random selection of cases that would otherwise not have been investigated. If widely applied, it could yield far greater time for police to pursue high-harm, serious violence.
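Working directly from the figures reported above, the implied false negative rate among the cases the triage model would have screened out can be computed as follows.

```python
# Arithmetic from the abstract's figures: detections that would have been
# missed had the 428 screened-out cases actually been closed.
total_cases = 788
screened_out = 428                  # cases the model would have closed
recommended = 360                   # cases recommended for full investigation
detected_total = 22
detected_among_recommended = 19

detected_among_screened_out = detected_total - detected_among_recommended  # 3
false_negative_rate = detected_among_screened_out / screened_out
print(f"{detected_among_screened_out} detections among {screened_out} "
      f"screened-out cases ({false_negative_rate:.1%})")   # about 0.7%
```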


2020, Vol. 38 (2), pp. 311-327
Author(s):  
Luis Lizasoain Hernández

The aim of this article is to present the statistical criteria and models used in a school effectiveness study carried out in the Basque Country Autonomous Community, using as criterion variables the scores in mathematics, reading comprehension in Spanish, and reading comprehension in Basque from the diagnostic assessments administered over five years. Four school effectiveness criteria are defined: extreme scores, extreme residuals, score growth, and residual growth. Multilevel regression techniques were applied using hierarchical linear models (HLM). The results allow a selection of both high- and low-effectiveness schools based on four distinct and complementary approaches to school effectiveness (or ineffectiveness).
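A minimal sketch, with simulated data and hypothetical column names, of the kind of two-level model (pupils nested in schools) from which school-level residuals can be extracted; the study’s criteria rank schools by such residuals and scores, and by their growth across assessment years.

```python
# Illustrative sketch (simulated data, hypothetical names): a two-level model
# with pupils nested in schools; the estimated school-level residuals (random
# intercepts) correspond to one of the effectiveness criteria described above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
schools = np.repeat(np.arange(50), 30)                   # 50 schools x 30 pupils
ses = rng.normal(size=schools.size)                      # pupil-level covariate
school_effect = rng.normal(scale=10.0, size=50)[schools]
math = 250 + 10 * ses + school_effect + rng.normal(scale=15, size=schools.size)
df = pd.DataFrame({"math": math, "ses": ses, "school": schools})

model = smf.mixedlm("math ~ ses", df, groups=df["school"]).fit()

# Rank schools by their estimated random intercepts (school-level residuals).
resids = pd.Series({g: float(v.iloc[0]) for g, v in model.random_effects.items()})
print("highest-residual schools:", resids.nlargest(3).index.tolist())
print("lowest-residual schools:", resids.nsmallest(3).index.tolist())
```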

