Trade-offs, Transitivity, and Temkin

2015 ◽  
Vol 12 (3) ◽  
pp. 331-342
Author(s):  
Dale Dorsey

In this essay I critically assess Larry S. Temkin’s new book, Rethinking the Good: Moral Ideals and the Nature of Practical Reasoning. While I find that there is much to praise about this work, I focus on two points of critique. Generally, Temkin’s aims in this book are to expose a radical tension in our beliefs about value, and to show that one potentially palatable (if not ultimately acceptable) option is to reject the transitivity of the predicate “better than”. However, I argue that in both his motivation for claiming that such a tension exists, and one of his arguments that rejecting transitivity is a palatable option, his discussion is missing a crucial step: a first-order discussion of the relationship between intrinsic values, both personal welfare goods and impersonal goods (such as equality, overall utility, etc.).

2015 ◽  
Vol 12 (4) ◽  
pp. 363-392 ◽  
Author(s):  
Larry S. Temkin

This article gives a brief overview of Rethinking the Good, whose impossibility arguments illuminate the difficulty of arriving at a coherent theory of the good. I show that an additive-aggregationist principle is plausible for some comparisons, while an anti-additive-aggregationist principle is plausible for others. Invoking Spectrum Arguments, I show that these principles are incompatible with an empirical premise and various Axioms of Transitivity. I argue that whether the “all-things-considered better than” relation is transitive is a matter not of language or logic, but of the nature of moral ideals. If an Internal Aspects View holds, then many standard assumptions about rationality follow, including the Axioms of Transitivity, but not if an Essentially Comparative View holds. Yet many important ideals are essentially comparative. My results have important implications for the normative significance of economics, and they require substantial revision in our understanding of the good, moral ideals, and the nature of practical reasoning.


2021 ◽  
Vol 75 (1) ◽  
pp. 71-102
Author(s):  
Anton Strezhnev ◽  
Judith G. Kelley ◽  
Beth A. Simmons

The international community often seeks to promote political reforms in recalcitrant states. Recently, some scholars have argued that, rather than helping, international law and advocacy create new problems because they have negative spillovers that increase rights violations. We review three mechanisms for such spillovers: backlash, trade-offs, and counteraction, and concentrate on the last of these. Some researchers assert that governments sometimes “counteract” international human rights pressures by strategically substituting violations in adjacent areas that are either not targeted or are harder to monitor. However, most such research shows only that both outcomes correlate with an intervention (positively for the targeted outcome, negatively for the spillover). The burden of proof, however, should be as rigorous as that for studies of first-order policy consequences. We show that these correlations by themselves are insufficient to demonstrate counteraction outside of the narrow case where the intervention is assumed to have no direct effect on the spillover, a situation akin to having a valid instrumental variable design. We revisit two prominent findings and show that the evidence for the counteraction claim is weak in both cases. The article contributes methodologically to the study of negative spillovers in general by proposing mediation and sensitivity analysis within an instrumental variables framework for assessing such arguments. It revisits important prior findings that claim negative consequences of human rights law and/or advocacy, and raises critical normative questions regarding how we empirically evaluate hypotheses about causal mechanisms.


2020 ◽  
Vol 55 (6) ◽  
pp. 848-862
Author(s):  
Thi Kim Phung Dang

Although forest devolution has become a key strategy of forestry reforms to mobilise local resources for sustainable forest management, there is growing concern about the legitimacy of this strategy. There have been escalating disputes between forestry agencies and local people as to who receives the rights to forests. Examining the policy of forest land allocation in Vietnam helps us to understand this legitimacy issue. Research findings from three case studies show trade-offs between the policy’s two goals, environmental protection and livelihood improvement, due to locals’ low awareness of the intrinsic values of forests and their lack of knowledge regarding the policy.


2013 ◽  
Vol 2013 ◽  
pp. 1-6 ◽  
Author(s):  
Mohammad Ahmadian ◽  
Sohyla Reshadat ◽  
Nader Yousefi ◽  
Seyed Hamed Mirhossieni ◽  
Mohammad Reza Zare ◽  
...  

Due to the complex composition of leachate, no comprehensive leachate treatment method has yet been demonstrated. Moreover, improper management of leachate can lead to many environmental problems. The aim of this study was the application of the Fenton process for decreasing the major pollutants of landfill leachate in Kermanshah city. The leachate was collected from the Kermanshah landfill site and treated by the Fenton process. The effect of various parameters, including solution pH, Fe2+ and H2O2 dosage, Fe2+/H2O2 molar ratio, and reaction time, was investigated. The results showed that with increasing Fe2+ and H2O2 dosage, Fe2+/H2O2 molar ratio, and reaction time, the COD, TOC, TSS, and color removal increased. The maximum COD, TOC, TSS, and color removal were obtained at low pH (pH 3). The kinetic data were analyzed in terms of zero-order, first-order, and second-order expressions. The first-order kinetic model described the removal of COD, TOC, TSS, and color from leachate better than the other two kinetic models. Despite the extreme difficulty of leachate treatment, these results seem encouraging for the application of Fenton’s oxidation.
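The first-order fit mentioned above can be sketched numerically: for a first-order model, ln(C/C0) is linear in time, so the rate constant falls out of a linear regression. The concentrations below are illustrative synthetic values, not data from the study.

```python
import numpy as np

# Hypothetical COD measurements (mg/L) over reaction time (min);
# generated from a first-order decay with k = 0.035 1/min for illustration.
t = np.array([0.0, 10.0, 20.0, 30.0, 45.0, 60.0])
cod = 2400.0 * np.exp(-0.035 * t)

# First-order model: ln(C/C0) = -k * t, so the slope of a linear fit gives -k.
slope, intercept = np.polyfit(t, np.log(cod / cod[0]), 1)
k = -slope
print(f"estimated rate constant k = {k:.3f} 1/min")
```

In practice one would fit all three candidate models (zero-, first-, and second-order) and compare their coefficients of determination, as the study does.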


Author(s):  
Javier Vidal

According to the method of transparency, genuine self-knowledge is the outcome of an inference from world to mind. A. Byrne (2018) has developed a theory in which the method of transparency consists in following an epistemic rule in order to form self-verifying second-order beliefs. In this paper, I argue that Byrne’s theory does not establish sufficient conditions for having self-knowledge of first-order beliefs. Examining a case of self-deception, I strive to show that following such a rule might not result in self-knowledge when one is involved in rational deliberation. In the case under consideration, one precisely comes to believe that one believes that p without coming to believe that p. The explanation for one’s not forming the belief that p, with its distinctive causal pattern in mental life and behaviour, is that one already held the unconscious belief that not-p, a belief that is not sensitive to the principles governing theoretical and practical reasoning.


2020 ◽  
Vol 1 (2) ◽  
pp. 54-62
Author(s):  
Naser Al Amery ◽  
Hussein Rasool Abid ◽  
Shaobin Wang ◽  
Shaomin Liu

In this study, two improved versions of UiO-66 were successfully synthesised. Modified UiO-66 and UiO-66-Ce were characterised to confirm the integrity of the structure, the stability of functional groups on the surface, and the thermal stability. Activated samples were used for removal of the harmful anionic dye methyl orange (MO) from wastewater. A batch adsorption process was used to investigate the competition between these MOFs in removing MO from aqueous solution. Based on the results, at a higher initial concentration the maximum MO uptake was achieved by UiO-66-Ce, which outperformed modified UiO-66; they adsorbed 71.5 and 62.5 mg g−1, respectively. Langmuir and Freundlich isotherms were employed to simulate the experimental data. In addition, pseudo-first-order and pseudo-second-order equations were used to describe the dynamic behaviour of MO through the adsorption process. The high adsorption capacities of these adsorbents make them promising candidates for industrial applications.
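The Langmuir fit used in studies like this one can be sketched as follows: the linearised form Ce/qe = Ce/qmax + 1/(qmax·b) lets both parameters be recovered from a straight-line fit. The equilibrium data below are synthetic, generated from assumed parameters for illustration only.

```python
import numpy as np

# Illustrative equilibrium data (not from the study): Ce in mg/L, qe in mg/g,
# generated from an assumed Langmuir isotherm with qmax = 71.5, b = 0.12.
qmax_true, b_true = 71.5, 0.12
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
qe = qmax_true * b_true * Ce / (1.0 + b_true * Ce)

# Linearised Langmuir: Ce/qe = Ce/qmax + 1/(qmax*b)
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qmax = 1.0 / slope          # mg/g, maximum monolayer capacity
b = slope / intercept       # L/mg, Langmuir affinity constant
print(f"qmax = {qmax:.1f} mg/g, b = {b:.3f} L/mg")
```

A Freundlich fit proceeds analogously on log-transformed data (log qe vs. log Ce).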


2021 ◽  
Author(s):  
Richard Czikhardt ◽  
Juraj Papco ◽  
Peter Ondrejka ◽  
Peter Ondrus ◽  
Pavel Liscak

SAR interferometry (InSAR) is inherently a relative geodetic technique, requiring one temporal and one spatial reference to obtain datum-free estimates of millimetre-level displacements within a network of radar scatterers. To correct systematic errors, such as the varying atmospheric delay, and to solve the phase ambiguities, it relies on a first-order estimation network of coherent point scatterers (PS).

For vegetated and sparsely urbanised areas, commonly affected by landslides in Slovakia, it is often difficult to construct a reliable first-order estimation network, as such areas lack PS. Purposely deploying corner reflectors (CR) in these areas strengthens the estimation network and, if the CR are collocated with Global Navigation Satellite System (GNSS) receivers, they provide an absolute geodetic reference to a well-defined terrestrial reference frame (TRF), as well as independent quality control.

For landslides, line-of-sight (LOS) InSAR displacements can be difficult to interpret. Using double CR, i.e. two reflectors for ascending/descending geometries within a single instrument, enables the assumption-free decomposition of the observed cross-track LOS displacements into the vertical and the horizontal displacement components.

In this study, we perform InSAR analysis on one year of Sentinel-1 time series over five landslide-affected areas in Slovakia. Twenty-four double back-flipped trihedral CR were carefully deployed at these sites to form a reference network, guaranteeing reliable displacement information over the critical landslide zones. To confirm the measurement quality, we show that the temporal average signal-to-clutter ratio (SCR) of the CR is better than 20 dB. The observed CR motions in the vertical and east-west directions vary from several millimetres up to 3 centimetres, with an average standard deviation better than 0.5 mm. Repeated GNSS measurements of the CR confirm the displacements observed by InSAR, improve the positioning precision of the nearby PS, and enable the transformation into the national TRF.
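The ascending/descending decomposition described above reduces to solving a small linear system: each LOS observation is a projection of the vertical and east-west components. The incidence angles, sign convention, and displacement values below are assumptions for illustration (Sentinel-1-like right-looking geometry), not figures from the study.

```python
import numpy as np

# Assumed incidence angles for the ascending and descending passes (radians).
theta_asc, theta_dsc = np.radians(39.0), np.radians(36.0)

# Under one common convention (LOS positive toward the satellite,
# right-looking sensor): LOS = cos(theta)*d_up + s*sin(theta)*d_east,
# with s = -1 for ascending and s = +1 for descending.
A = np.array([
    [np.cos(theta_asc), -np.sin(theta_asc)],
    [np.cos(theta_dsc),  np.sin(theta_dsc)],
])

d_los = np.array([-4.2, 1.1])  # hypothetical asc/dsc LOS displacements, mm
d_up, d_east = np.linalg.solve(A, d_los)
print(f"vertical: {d_up:.2f} mm, east-west: {d_east:.2f} mm")
```

The north component is poorly constrained by near-polar-orbiting SAR and is conventionally neglected in this two-geometry decomposition.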


1996 ◽  
Vol 22 (3) ◽  
Author(s):  
T. L. Nell ◽  
L. Kamfer ◽  
R. P. Van Der Merwe ◽  
D. J. L. Venter

The personality profile of successful prison warders. In an attempt to develop a personality profile for successful prison warders, scores on Cattell’s 16-PF (SA92 form) were obtained from 361 warders employed by the South African Department of Correctional Services. Independent criterion information (tempo of promotion) was also obtained and used as an indicator of job success. Using Hotelling’s T², it was found that the first-order factor profiles of successful and unsuccessful warders differed significantly; there was no difference in their second-order profiles. By means of stepwise discriminant analysis, with personality as the independent and success (expressed as a dichotomy) as the dependent variable, four first-order factors were identified and formulae derived which predicted 14.8% better than chance whether a warder would be correctly classified as successful or not on the dichotomous success criterion.
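The two-group discriminant setup above can be sketched with Fisher's linear discriminant, a close relative of the stepwise procedure the study uses. All data below are synthetic: two groups described by four hypothetical factor scores, classified against a dichotomous criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the study's setup: two groups ("successful" /
# "unsuccessful"), each described by four first-order factor scores.
n = 200
succ = rng.normal(loc=[5.5, 4.0, 6.0, 5.0], scale=1.0, size=(n, 4))
unsucc = rng.normal(loc=[4.5, 5.0, 5.0, 5.5], scale=1.0, size=(n, 4))
X = np.vstack([succ, unsucc])
y = np.array([1] * n + [0] * n)

# Fisher's linear discriminant: w = Sw^-1 (mu1 - mu0),
# with the decision threshold at the midpoint of the projected means.
mu1, mu0 = succ.mean(axis=0), unsucc.mean(axis=0)
Sw = np.cov(succ, rowvar=False) + np.cov(unsucc, rowvar=False)
w = np.linalg.solve(Sw, mu1 - mu0)
threshold = w @ (mu1 + mu0) / 2.0

pred = (X @ w > threshold).astype(int)
accuracy = (pred == y).mean()
print(f"classification accuracy: {accuracy:.1%} (chance = 50%)")
```

The study's "better than chance" figure is the analogue of this accuracy margin over the 50% baseline, estimated on its own four selected factors.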


Author(s):  
Özgür Şimşek

The lexicographic decision rule is one of the simplest methods of choosing among decision alternatives. It is based on a simple priority ranking of the attributes available. According to the lexicographic decision rule, a decision alternative is better than another alternative if and only if it is better than the other alternative in the most important attribute on which the two alternatives differ. In other words, the lexicographic decision rule does not allow trade-offs among the various attributes. For example, if quality is considered to be more important than cost, no difference in price can compensate for a difference in quality: The lexicographic decision rule chooses the item with the best quality regardless of the cost. Over the years, the lexicographic decision rule has been compared to various statistical learning methods, including multiple linear regression, support vector machines, decision trees, and random forests. The results show that the lexicographic decision rule can sometimes compete remarkably well with more complex statistical methods, and even outperform them, despite its naively simple structure. These results have stimulated a rich scientific literature on why, and under what conditions, lexicographic decision rules yield accurate decisions. Due to the simplicity of its decision process, its fast execution time, and the robustness of its performance in various decision environments, the lexicographic decision rule is considered to be a plausible model of human decision making. In particular, the lexicographic decision rule is put forward as a model of how the human mind implements bounded rationality to make accurate decisions when information is scarce, time is short, and computational capacity is limited.
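The rule described above is simple enough to state directly in code: walk the attributes in priority order and decide at the first one where the alternatives differ. The attribute names and values are illustrative; cost is negated so that "higher is better" holds uniformly.

```python
# A minimal sketch of the lexicographic decision rule. Higher attribute
# values are assumed better; `priority` lists attributes most important first.

def lexicographic_better(a, b, priority):
    """Return True if alternative `a` beats `b` under the lexicographic rule;
    False if `b` beats `a` or the two are identical on every attribute."""
    for attr in priority:
        if a[attr] != b[attr]:
            return a[attr] > b[attr]  # decide on the first differing attribute
    return False

# Quality outranks (negated) cost: no price difference can compensate.
item_x = {"quality": 9, "neg_cost": -120}
item_y = {"quality": 8, "neg_cost": -10}
print(lexicographic_better(item_x, item_y, ["quality", "neg_cost"]))  # True
```

Note that item_x wins despite being far more expensive, which is exactly the no-trade-offs property the abstract describes.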


2019 ◽  
Vol 113 (3) ◽  
pp. 694-709 ◽  
Author(s):  
MICHAEL BECHER ◽  
IRENE MENÉNDEZ GONZÁLEZ

We examine the effect of electoral institutions on two important features of representation that are often studied separately: policy responsiveness and the quality of legislators. Theoretically, we show that while a proportional electoral system is better than a majoritarian one at representing popular preferences in some contexts, this advantage can come at the price of undermining the selection of good politicians. To empirically assess the relevance of this trade-off, we analyze an unusually controlled electoral reform in Switzerland early in the twentieth century. To account for endogeneity, we exploit variation in the intensive margin of the reform, which introduced proportional representation, based on administrative constraints and data on voter preferences. A difference-in-difference analysis finds that higher reform intensity increases the policy congruence between legislators and the electorate and reduces legislative effort. Contemporary evidence from the European Parliament supports this conclusion.
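The difference-in-differences logic behind the analysis above can be illustrated with toy numbers (not the study's data): the estimate is the change in the reformed districts minus the change in the unreformed ones, which nets out common trends.

```python
# Hypothetical mean outcomes (e.g. a policy-congruence score) for districts
# that adopted the reform vs. those that did not, before and after.
treated_pre, treated_post = 0.40, 0.55
control_pre, control_post = 0.42, 0.47

# DiD = (change in treated group) - (change in control group)
did = (treated_post - treated_pre) - (control_post - control_pre)
print(f"difference-in-differences estimate: {did:+.2f}")
```

The study's design is richer (it exploits the intensive margin of reform intensity), but the identifying subtraction has this same structure.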

