Doing and Allowing

Philosophy ◽  
2016 ◽  
Author(s):  
Fiona Woollard

According to common-sense morality, the difference between doing and allowing harm matters morally. Doing harm can be wrong when merely allowing harm would not be, even if all other factors are equal: the level of harm is the same, the agent’s motivation is the same, the cost to the agent of avoiding countenancing harm is the same, and so on. Suppose that you have accidentally swallowed poison and you need to get to hospital urgently. It might be permissible for you to refuse to stop and help if you spot Sarah drowning, but impermissible for you to push her into a river to clear your way to the hospital. Without this moral distinction between doing and allowing, it seems likely that our everyday morality would look very different. Treating doing and allowing harm as equivalent seems to leave us with a morality that is either much more permissive than we normally think it is (permitting us to do harm to others to avoid personal sacrifices) or much more demanding (requiring us to prevent harm to others even at great personal sacrifice). Yet the moral significance of the distinction is highly controversial. When serious harm to others is at stake, it may seem puzzling that it should matter whether the harm is done or merely allowed. Powerful critics have argued that the distinction is morally irrelevant. Others have charged that the distinction itself falls apart under scrutiny: our intuitions about whether behavior counts as doing or allowing harm do not reflect any clear, nonmoral distinction. Much of the early contemporary debate on the moral relevance of the distinction between doing and allowing harm focused on appeals to intuitions. We are asked to examine “contrast cases” in which all other factors are supposedly held constant. However, appeals to intuitions are of limited use. They may establish whether we treat the distinction as morally relevant, but they cannot show whether we ought to do so. The real challenge for a defender of the doing/allowing distinction is to provide a clear analysis of the distinction and a convincing argument that, under this analysis, the distinction connects appropriately to more fundamental moral concepts. This article maps out the philosophical literature on the analysis and moral significance of the distinction between doing and allowing harm, from the beginnings of contemporary interest in the issue in the 1960s and 1970s to recent trends and developments.

Author(s):  
Jeff Sovern

For more than forty years, a standard tool in the consumer protection toolbox has been the cooling-off period. Federal statutes, state statutes, and federal regulations all oblige merchants to give consumers three days to rescind certain contracts. This paper reports on a survey of businesses subject to such cooling-off periods. The study has two principal findings. First, the respondents indicated that few consumers rescind their purchases. Thus, the study raises doubts about whether cooling-off periods benefit consumers or whether they provide only illusory consumer protection. The article also offers speculations about why cooling-off periods have been of so little value to consumers. Second, the study found that consumers who receive both oral and written notice of their rights are more likely to avail themselves of those rights than those who receive only written notice, and that the differences are statistically significant. Fifty-three percent of the sellers who gave only a written notice and did not speak of the buyer’s right to cancel said buyers never cancelled, nearly double the percentage for sellers who did tell buyers (27%). Businesses that provided both oral in-person and written notices of the right to rescind were more than twice as likely as those that provided only written notices to report that more than 1% of their customers cancelled contracts. The article also explores why oral and written notices combined were more effective than written notice alone. Finally, the survey asked respondents about the cost of cooling-off periods. More than four-fifths of the respondents who answered the question reported that the right to cancel had cost them either nothing or very little. This contrasts with the vehement opposition such rules met when they were first adopted in the 1960s and 1970s.


Author(s):  
Anatoly Vishnevsky ◽  
Boris Denisov ◽  
Victoria Sakevich

In the 1960s and 1970s, with the introduction of hormonal contraception and of a new generation of intrauterine devices, Western countries saw changes in methods of fertility regulation so significant that the American demographers Charles Westoff and Norman Ryder called them "the contraceptive revolution." By this time, the transition to low fertility in developed countries, as in Russia, had been completed, and family planning had become a common practice. However, the new technologies significantly increased the effectiveness of birth control, and this change would have important social and demographic consequences. Underestimation of the importance of family planning and the underdevelopment of the corresponding services in the USSR and in Russia meant that the contraceptive revolution began there much later than in the West, not until the post-Soviet years, with the arrival of a market economy and information openness. For decades, induced abortion played a key role in the regulation of fertility, and only in the 1990s did modern methods of contraception become widespread and the unfavorable ratio of abortions to births begin to change for the better. The article describes the mix of contraceptive methods used in countries of European culture and in Russia, and attempts to explain the difference between them. Based on nationally representative sample data, an analysis is made of current contraceptive practice in Russia. The conclusion is drawn that the contraceptive revolution in Russia is proceeding rather quickly, but without substantial state support.


Author(s):  
Andrew Davies

‘Arup’s adhocracy and projects in theory’ considers how the spread of adaptive project structures in the 1960s and 1970s encouraged management scholars to develop new ways of thinking about organizations. It begins with Ove Arup’s work on the Sydney Opera House, which established a new model of architect and engineer collaborating in project teams to innovate and solve challenging problems. It then goes on to discuss some of the theoretical insights and perspectives introduced by organizational scholars to help us think about projects as an adaptive structure in a complex, unstable, and rapidly changing environment. It explains organization theory and adhocracy, the difference between stable and flexible project teams, and the contingent dimension of projects.


2016 ◽  
Vol 42 (1) ◽  
pp. 233 ◽  
Author(s):  
W. Jlassi ◽  
E. Nadal-Romero ◽  
J. M. García-Ruiz

Large rainfed dryland areas were transformed into irrigated land in northeast Spain, where rivers from the Pyrenees ensure the availability of abundant water resources. The transformation of the Riegos del Alto Aragón area (RAA), carried out mainly during the second half of the 20th century, faced major problems during the 1960s and 1970s, including a monoculture of poorly productive winter cereals, water wastage, and soil degradation. Since the 1990s the RAA has undergone modernization involving: (i) a change in the mode of irrigation, from gravity to sprinkler systems; (ii) the consolidation of plots to enlarge irrigated fields; and (iii) the introduction of more productive and highly water-consuming crops (corn, lucerne, vegetables). These changes coincided with an enlargement of the irrigated area, increasing water demand at a time of growing water scarcity caused by restrictions on the construction of new large reservoirs and by declining water resources due to climate and land use changes. Addressing this major problem has required new strategies, specifically the construction of small reservoirs and water ponds within the irrigated area. The ponds increase water reserves and facilitate sprinkler irrigation by adding pressure to the system. However, this has entailed a large rise in electricity consumption, which has increased the cost of production.


2018 ◽  
Vol 3 (3) ◽  
pp. 105-111
Author(s):  
Yukihiro Yamamoto

In Henri Lefebvre’s theory, space in the process of social production is regarded as the very condition of accomplishing the ‘desire’ to do or to create something. This article argues that we need to understand the implications of this ‘desire’ in order to make use of his urban theory in today’s planning. Introducing this idea in the 1960s and 1970s, Lefebvre urged us to create our own style of living, that is, to produce appropriated space distinct from technocratically planned spaces in which people devote themselves to repetitively fulfilling their needs for specific objects, like a laboratory rat in a closed-loop experiment. For all his utopian strategies, Lefebvre made practical suggestions for making our cities more desire-based, that is to say, more democratically designed; it would be very helpful for today’s urban planning to return to his argument on the difference between ‘desire’ and ‘need’, or on the connection between ‘desire’ and the style of living.


2015 ◽  
Vol 12 (4) ◽  
pp. 539-555 ◽  
Author(s):  
Kevin M. Flanagan

This article traces Ken Russell's explorations of war and wartime experience over the course of his career. In particular, it argues that Russell's scattered attempts at coming to terms with war, the rise of fascism and memorialisation are best understood in terms of a combination of Russell's own tastes and personal style, wider stylistic and thematic trends in Euro-American cinema during the 1960s and 1970s, and discourses of collective national experience. In addition to identifying Russell's recurrent techniques, this article focuses on how the residual impacts of the First and Second World Wars appear in his favoured genres: literary adaptations and composer biopics. Although the article looks for patterns and similarities in Russell's war output, it differentiates between his First and Second World War films by indicating how he engages with, and temporarily inhabits, the stylistic regime of the enemy within the latter group.


2013 ◽  
Vol 10 (1) ◽  
pp. 27-48 ◽  
Author(s):  
Alan Burton

Brainwashing assumed the proportions of a cultural fantasy during the Cold War period. The article examines the various political, scientific and cultural contexts of brainwashing, and proceeds to a consideration of the place of mind control in British spy dramas made for cinema and television in the 1960s and 1970s. Particular attention is given to the films The Mind Benders (1963) and The Ipcress File (1965), and to the television dramas Man in a Suitcase (1967–8), The Prisoner (1967–8) and Callan (1967–81), which gave expression to the anxieties surrounding thought-control. Attention is given to the scientific background to the representations of brainwashing, and the significance of spy scandals, treasons and treacheries as a distinct context to the appearance of brainwashing on British screens.


2020 ◽  
Vol 22 (3) ◽  
pp. 341-361
Author(s):  
Gonzalo Grau-Pérez ◽  
J. Guillermo Milán

In Uruguay, Lacanian ideas arrived in the 1960s, into a context of Kleinian hegemony. Adopting a discursive approach, this study researched the initial reception of these ideas and its effects on clinical practice. We gathered a corpus of discursive data from clinical cases and theoretical-doctrinal articles (from the 1960s, 1970s and 1980s). In order to examine the effects of Lacanian ideas, we analysed how the interpretation of clinical material differed before and after the reception of Lacan. The results of this research illuminate some epistemological problems of psychoanalysis, especially the relationship between theory and clinical practice.


2016 ◽  
Vol 25 (3) ◽  
pp. 294-316 ◽  
Author(s):  
Chik Collins ◽  
Ian Levitt

This article reports findings of research into the far-reaching plan to ‘modernise’ the Scottish economy, which emerged from the mid-to-late 1950s and was formally adopted by government in the early 1960s. It shows the growing awareness amongst policy-makers from the mid-1960s of the profoundly deleterious effects the implementation of the plan was having on Glasgow. By 1971 these effects were understood to be substantial, with likely severe consequences for the future. Nonetheless, there was no proportionate adjustment to the regional policy that was creating these acknowledged ‘unwanted’ outcomes, even when such an adjustment was proposed by the Secretary of State for Scotland. After presenting these findings, the paper offers some consideration of their relevance to the task of accounting for Glasgow's ‘excess mortality’. It is suggested that regional policy can be seen to have contributed to the accumulation of ‘vulnerabilities’, particularly in Glasgow but also more widely in Scotland, during the 1960s and 1970s, and that the impact of the post-1979 UK government policy agenda on these vulnerabilities is likely to have been salient in the increase in ‘excess mortality’ evident in subsequent years.

