Why the pretense of pursuing democracy? The necessity and rationality of democratic slogans for civil revolutions

2018 ◽  
Vol 4 (3) ◽  
pp. 242-257
Author(s):  
Taku Yukawa ◽  
Kaoru Hidaka

While democratic revolutions are not uniform in their pursuit of democracy, they do have something in common: those calling for revolution and participating in demonstrations do so under the banner of democracy. However, studies have revealed that these citizens were not at first committed to democracy per se; rather, driven by their struggles with poverty and social inequality, they took the opportunity to vent their frustration against the current regime. Why, then, do citizens who are not pursuing democracy per se participate in revolutions under the banner of democracy? Previous studies have failed to clarify this point. To fill this gap, we outline three strategic rationalities and necessities behind the use of “democracy” as a common slogan to justify civil revolutions: 1) organizing large-scale dissident movements within a country; 2) attracting international support; and 3) imitating successful examples from the past. Evidence from the 2003 Rose Revolution in Georgia and the 2005 Orange Revolution in Ukraine supports this theory.

2021 ◽  
Vol 7 ◽  
pp. 237802312110201
Author(s):  
Thomas A. DiPrete ◽  
Brittany N. Fox-Williams

Social inequality is a central topic of research in the social sciences. Decades of research have deepened our understanding of the characteristics and causes of social inequality. At the same time, social inequality has markedly increased during the past 40 years, and progress on reducing poverty and improving the life chances of Americans in the bottom half of the distribution has been frustratingly slow. How useful has sociological research been to the task of reducing inequality? The authors analyze the stance taken by sociological research on the subject of reducing inequality. They identify an imbalance in the literature between the discipline’s continual efforts to motivate the plausibility of large-scale change and its lesser efforts to identify feasible strategies of change either through social policy or by enhancing individual and local agency with the potential to cumulate into meaningful progress on inequality reduction.


AERA Open ◽  
2019 ◽  
Vol 5 (4) ◽  
pp. 233285841988889 ◽  
Author(s):  
Joseph R. Cimpian ◽  
Jennifer D. Timmer

Although numerous survey-based studies have found that students who identify as lesbian, gay, bisexual, or questioning (LGBQ) have elevated risk for many negative academic, disciplinary, psychological, and health outcomes, the validity of the types of data on which these results rest has come under increased scrutiny. Over the past several years, a variety of data-validity screening techniques have been used in attempts to scrub data sets of “mischievous responders,” youth who systematically provide extreme and untrue responses to outcome items and who tend to falsely report being LGBQ. We conducted a preregistered replication of Cimpian et al. with the 2017 Youth Risk Behavior Survey to (1) estimate new LGBQ-heterosexual disparities on 20 outcomes; (2) test a broader, mechanistic theory relating mischievousness effects to a feature of items (i.e., item response-option extremity); and (3) compare four techniques used to address mischievous responders. Our results are consistent with Cimpian et al.’s findings that potentially mischievous responders inflate LGBQ-heterosexual disparities, do so more among boys than girls, and affect outcomes differentially. For example, we find that removing students suspected of being mischievous responders can cut male LGBQ-heterosexual disparities in half overall and can completely or mostly eliminate disparities in outcomes including fighting at school, driving drunk, and using cocaine, heroin, and ecstasy. Methodologically, we find that some methods are better than others at addressing the issue of data integrity: boosted regressions coupled with data removal lead to potentially very large decreases in estimated LGBQ-heterosexual disparities, whereas regression adjustment has almost no effect. While the empirical focus of this article is on LGBQ youth, the issues discussed are relevant to research on other minority groups and to youth generally, and they speak to survey development, methodology, and the robustness and transparency of research.
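To make the comparison of screening techniques concrete, here is a minimal sketch in the spirit of (but not reproducing) the boosted-regression-plus-removal approach compared in the article: score each respondent's probability of being a mischievous responder from rare, extreme screener items with a boosted classifier, drop high-scoring respondents, and recompute the group disparity. The data are simulated, and all variable names, rates, and the removal cutoff are hypothetical.

```python
# Illustrative sketch (not the authors' code) of boosted-model screening
# followed by data removal, on simulated survey data.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 2000

# Simulated survey: a small share of respondents are "mischievous" and are
# more likely to endorse rare/extreme screener items, to falsely claim
# minority status, and to falsely report the outcome.
mischievous = rng.binomial(1, 0.05, n)
minority = rng.binomial(1, 0.05 + 0.30 * mischievous)
screeners = rng.binomial(1, 0.01 + 0.40 * mischievous[:, None], size=(n, 3))
outcome = rng.binomial(1, 0.10 + 0.40 * mischievous)

df = pd.DataFrame(screeners, columns=["screen1", "screen2", "screen3"])
df["minority"], df["outcome"] = minority, outcome

# Train a boosted model to score suspicion. In a real survey the training
# labels would come from validation items or consistency checks, not truth.
features = ["screen1", "screen2", "screen3"]
clf = GradientBoostingClassifier().fit(df[features], mischievous)
df["suspicion"] = clf.predict_proba(df[features])[:, 1]

def disparity(d: pd.DataFrame) -> float:
    """Minority-minus-majority difference in the outcome rate."""
    return (d.loc[d.minority == 1, "outcome"].mean()
            - d.loc[d.minority == 0, "outcome"].mean())

kept = df[df.suspicion < 0.3]  # hypothetical removal cutoff
print(f"Disparity, full sample:   {disparity(df):+.3f}")
print(f"Disparity, after removal: {disparity(kept):+.3f}")
```

The contrast the article draws is with regression adjustment, which keeps every respondent and instead conditions on a suspicion score as a covariate; in the replication reported here, that approach left disparity estimates essentially unchanged.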


2020 ◽  
Vol 57 (4) ◽  
pp. 87-106
Author(s):  
Maria A. Sekatskaya ◽  

The most important difference between contemporary compatibilist and libertarian theories is not their positions regarding the truth of the thesis of physical determinism, but their different approaches to the causal role of agents. According to libertarians, volitional acts performed by agents constitute a specific type of cause, one that is not itself caused by other causes. In this respect, event-causal libertarianism is similar to agent-causal libertarianism, because it insists that in performing a volitional act an agent can choose one of the alternative outcomes without being caused to do so by anything else, where ‘anything else’ includes all the facts about the past and the present. Since event-causal libertarians maintain that volitional acts and the causal role of agents can be explained naturalistically, they must solve the problem of luck, i.e., they must explain how an agent is able to control her choices, given that she can choose one way or another without there being any difference in her state immediately preceding the moment of choice. This problem arises not from indeterminism per se, but from the way it is coupled with the causal role of agents. In section one, I consider the historical development of compatibilist views on physical determinism and indeterminism. In section two, I present an overview of conditional analyses of alternative possibilities. In section three, I analyze the reasons why libertarians reject any type of conditional analysis, and show that intuitive objections against physical determinism, which portray it as an obstacle to freedom, are untenable. In section four, I consider the consequence argument and show how it is related to the libertarian condition of sourcehood. In the final section, I analyze the problem of luck and show that it inevitably arises for any version of libertarianism. I demonstrate that indeterminism is a problem for libertarians, even though they need it, whereas it is not a problem for compatibilists, who do not need it but can incorporate it into their theories without facing the problem of luck.


2014 ◽  
Vol 11 (2) ◽  
pp. 125-136 ◽  
Author(s):  
Roland Verwiebe ◽  
Laura Wiesböck ◽  
Roland Teitzer

This article deals mainly with new forms of intra-European migration, processes of integration and inequality, and the dynamics of emerging transnational labour markets in Europe. We discuss these issues against the background of the fundamental changes that have been taking place on the European continent over the past two decades. Drawing on available comparative European data, we examine, in a first step, whether the changes in intra-European migration patterns have been accompanied by a differentiation of the causes of migration. In a second step, we discuss the extent to which new forms of transnational labour markets have been emerging within Europe and their effects on systems of social stratification.


Author(s):  
Georgi Derluguian

The author develops ideas about the origin of social inequality during the evolution of human societies and reflects on the possibilities of overcoming it. What makes human beings different from other primates is a high level of egalitarianism and altruism, which contributed to the more successful adaptability of human collectives at early stages of the development of society. The transition to agriculture, coupled with substantially increasing population density, was marked by the emergence and institutionalisation of social inequality based on the inequality of tangible assets and symbolic wealth. Then new institutions of warfare came into existence, aimed at conquering and enslaving neighbours engaged in productive labour. While exercising control over nature, people also established and strengthened their power over other people. Chiefdom came into being as a new type of polity. Elementary forms of power (political, economic and ideological) served as a basis for the formation of early states. The societies in those states were characterised by social inequality and cruelties, including slavery, mass violence and numerous victims. Nowadays, the old elementary forms of power inherent in personalistic chiefdom still function alongside modern institutions of public and private bureaucracy. This constitutes the key contradiction of our time: the juxtaposition of individual despotic power and public infrastructural power. However, society is evolving towards an ever more efficient combination of social initiatives with the sustainability and viability of large-scale organisations.


2014 ◽  
Vol 7 (1) ◽  
pp. 1-25
Author(s):  
Jodie Eichler-Levine

In this article I analyze how Americans draw upon the authority of both ancient, so-called “hidden” texts and the authority of scholarly discourse, even overtly fictional scholarly discourse, in their imaginings of the “re-discovered” figure of Mary Magdalene. Reading recent treatments of Mary Magdalene provides me with an entrance onto three topics: how Americans see and use the past, how Americans understand knowledge itself, and how Americans construct “religion” and “spirituality.” I do so through close studies of contemporary websites of communities that focus on Mary Magdalene, as well as examinations of relevant books, historical novels, reader reviews, and comic books. Focusing on Mary Magdalene alongside tropes of wisdom also uncovers the gendered dynamics at play in constructions of antiquity, knowledge, and religious accessibility.


1997 ◽  
Vol 36 (4I) ◽  
pp. 321-331
Author(s):  
Sarfraz Khan Qureshi

It is an honour for me, as President of the Pakistan Society of Development Economists, to welcome you to the 13th Annual General Meeting and Conference of the Society. I consider it a great privilege to do so, as this Meeting coincides with the Golden Jubilee celebrations of the state of Pakistan, a state which emerged on the map of the postwar world as a result of the Muslim freedom movement in the Indian Subcontinent. Fifty years to the day, we have been jubilant about it, and, both as citizens of Pakistan and as professionals in the social sciences, we have also been thoughtful about it. We are trying to see what development has meant in Pakistan over the past half century. As the subject has acquired so many dimensions since its rather simplistic beginnings, we thought the Golden Jubilee of Pakistan an appropriate occasion for such stock-taking.


2020 ◽  
Author(s):  
Lungwani Muungo

The purpose of this review is to evaluate progress in molecular epidemiology over the past 24 years in cancer etiology and prevention to draw lessons for future research incorporating the new generation of biomarkers. Molecular epidemiology was introduced in the study of cancer in the early 1980s, with the expectation that it would help overcome some major limitations of epidemiology and facilitate cancer prevention. The expectation was that biomarkers would improve exposure assessment, document early changes preceding disease, and identify subgroups in the population with greater susceptibility to cancer, thereby increasing the ability of epidemiologic studies to identify causes and elucidate mechanisms in carcinogenesis. The first generation of biomarkers has indeed contributed to our understanding of risk and susceptibility related largely to genotoxic carcinogens. Consequently, interventions and policy changes have been mounted to reduce risk from several important environmental carcinogens. Several new and promising biomarkers are now becoming available for epidemiologic studies, thanks to the development of high-throughput technologies and theoretical advances in biology. These include toxicogenomics, alterations in gene methylation and gene expression, proteomics, and metabonomics, which allow large-scale studies, including discovery-oriented as well as hypothesis-testing investigations. However, most of these newer biomarkers have not been adequately validated, and their role in the causal paradigm is not clear. There is a need for their systematic validation using principles and criteria established over the past several decades in molecular cancer epidemiology.


1987 ◽  
Vol 19 (5-6) ◽  
pp. 701-710 ◽  
Author(s):  
B. L. Reidy ◽  
G. W. Samson

A low-cost wastewater disposal system was commissioned in 1959 to treat domestic and industrial wastewaters generated in the Latrobe River valley in the province of Gippsland, within the State of Victoria, Australia (Figure 1). The Latrobe Valley is the centre for large-scale generation of electricity and for the production of pulp and paper. In addition, other industries have utilised the brown coal resource of the region, e.g. gasification and char production. Consequently, industrial wastewaters have dominated the disposal system for the past twenty-five years. The mixed industrial-domestic wastewaters were to be transported some eighty kilometres to be treated and disposed of by irrigation to land. Several important lessons have been learnt during twenty-five years of operating this system. Firstly, the composition of the mixed waste stream has varied significantly with the passage of time and the development of the industrial base in the Valley, so that what was appropriate treatment in 1959 is not necessarily acceptable in 1985. Secondly, the magnitude of the adverse environmental impacts engendered by this low-cost disposal procedure was not imagined when the proposal was implemented. As a consequence, clean-up procedures which could remedy the adverse effects of twenty-five years of impact are likely to be costly. The question may then be asked: when the total costs, including rehabilitation, are considered, is there really a low-cost solution for the environmentally safe disposal of complex wastewater streams?


2019 ◽  
Vol 19 (1) ◽  
pp. 4-16 ◽  
Author(s):  
Qihui Wu ◽  
Hanzhong Ke ◽  
Dongli Li ◽  
Qi Wang ◽  
Jiansong Fang ◽  
...  

Over the past decades, peptides as therapeutic candidates have received increasing attention in drug discovery, especially antimicrobial peptides (AMPs), anticancer peptides (ACPs) and anti-inflammatory peptides (AIPs). Peptides are thought to be able to modulate various complex diseases that were previously untreatable. In recent years, the critical problem of antimicrobial resistance has driven the pharmaceutical industry to look for new therapeutic agents. Compared to small-molecule drugs, peptide-based therapy exhibits high specificity and minimal toxicity; thus, peptides are widely used in the design and discovery of new potent drugs. Currently, large-scale screening of peptide activity with traditional approaches is costly, time-consuming and labor-intensive. Hence, in silico methods, mainly machine learning approaches, have been introduced to predict peptide activity for their accuracy and effectiveness. In this review, we document recent progress in machine learning-based prediction of peptide activity, which will be of great benefit to the discovery of potentially active AMPs, ACPs and AIPs.
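As a concrete illustration of the kind of in silico pipeline surveyed in such reviews, the sketch below trains a classifier on amino acid composition features to separate active from inactive peptides. It is a minimal sketch under stated assumptions, not the method of any particular study: the sequences, labels, feature set, and model choice are all hypothetical stand-ins for curated databases and richer descriptors.

```python
# A minimal, illustrative sketch of machine learning-based peptide activity
# prediction. Everything here is a hypothetical stand-in: real studies draw
# sequences and activity labels from curated databases and use far richer
# descriptors and stronger models.
from collections import Counter

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition_features(sequence: str) -> np.ndarray:
    """Fraction of each of the 20 standard amino acids in the peptide."""
    counts = Counter(sequence)
    return np.array([counts[aa] / len(sequence) for aa in AMINO_ACIDS])

# Toy training set: sequences loosely patterned on cationic, amphipathic
# antimicrobial peptides (label 1) versus inert repeats (label 0).
peptides = [
    "KWKLFKKIGAVLKVLTTG", "GIGKFLHSAKKFGKAFVGEIMNS", "KKLLKKLLKKLL",
    "AAAAAGGGGGSSSSS", "DDEEDDEEDDEE", "PPGGPPGGPPGG",
]
labels = np.array([1, 1, 1, 0, 0, 0])

X = np.stack([composition_features(p) for p in peptides])

# Cross-validate, then fit a simple classifier on the full toy set.
model = LogisticRegression(max_iter=1000)
print("CV accuracy:", cross_val_score(model, X, labels, cv=3).mean())
model.fit(X, labels)

# Score a new candidate peptide (hypothetical sequence).
candidate = composition_features("FLPIIAKLLSGLL").reshape(1, -1)
print("Predicted active:", bool(model.predict(candidate)[0]))
```

The methods covered by reviews like this one range from support vector machines and random forests over composition and physicochemical descriptors to deep sequence models, but the pipeline shape (featurize, train, screen candidates in silico) is the common core.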

