Frequency of enforcement is more important than the severity of punishment in reducing violation behaviors

2021 ◽  
Vol 118 (42) ◽  
pp. e2108507118
Author(s):  
Kinneret Teodorescu ◽  
Ori Plonsky ◽  
Shahar Ayal ◽  
Rachel Barkan

External enforcement policies aimed at reducing violations differ on two key components: the probability of inspection and the severity of the punishment. Different lines of research offer different insights regarding the relative importance of each component. In four studies, students and Prolific crowdsourcing participants (Ntotal = 816) repeatedly faced temptations to commit violations under two enforcement policies. Controlling for expected value, we found that a policy combining a high probability of inspection with a low severity of fines (HILS) was more effective than an economically equivalent policy that combined a low probability of inspection with a high severity of fines (LIHS). The advantage of prioritizing inspection frequency over punishment severity (HILS over LIHS) was greater for participants who, in the absence of enforcement, started out with a higher violation rate. Consistent with studies of decisions from experience, frequent enforcement with small fines was more effective than rare severe fines even when we announced the severity of the fine in advance to boost deterrence. In addition, in line with the phenomenon of underweighting of rare events, the effect was stronger when the probability of inspection was rarer (as in most real-life inspection probabilities) and was eliminated under moderate inspection probabilities. We thus recommend that policymakers seeking to reduce recurring violations among noncriminal populations consider increasing inspection rates rather than punishment severity.
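The key manipulation, holding expected value constant while trading inspection probability against fine size, can be illustrated with a minimal sketch. The probabilities and fine amounts below are hypothetical illustrations, not the parameters used in the studies.

```python
import math

def expected_fine(p_inspection: float, fine: float) -> float:
    """Expected cost of a single violation under a given enforcement policy."""
    return p_inspection * fine

# HILS: high probability of inspection, low severity of fine (hypothetical values)
hils = {"p_inspection": 0.50, "fine": 4.0}
# LIHS: low probability of inspection, high severity of fine (hypothetical values)
lihs = {"p_inspection": 0.05, "fine": 40.0}

# Both policies carry the same expected fine per violation (2.0 here),
# yet the studies find that the frequent, mild policy (HILS) reduces
# violations more, consistent with underweighting of rare events.
assert math.isclose(expected_fine(**hils), expected_fine(**lihs))
print("Expected fine per violation:", expected_fine(**hils))
```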

2021 ◽  
Author(s):  
Kinneret Teodorescu ◽  
Ori Plonsky ◽  
Shahar Ayal ◽  
Rachel Barkan

External enforcement policies aimed at reducing violations differ on two key components: the probability of inspection and the severity of punishments. Different lines of research offer competing predictions regarding the relative importance of each component. In three incentive-compatible studies, students and Prolific crowdsourcing participants (Ntotal = 430) repeatedly faced temptations to commit violations under two enforcement policies. Controlling for expected value, the results indicated that a policy combining a high probability of inspection with a low severity of fine (HILS) was more effective than a policy combining a low probability of inspection with a high severity of fine (LIHS). Consistent with the predictions of decisions-from-experience research, this finding held even when the severity of the fine was stated in advance to boost deterrence. In addition, the advantage of HILS over LIHS was greater as participants' baseline rate of violation (without enforcement) was higher, implying that HILS is more effective among frequent offenders.


2018 ◽  
Author(s):  
Michel Failing ◽  
Benchi Wang ◽  
Jan Theeuwes

Where and what we attend to is determined not only by what we are currently looking for but also by what we have encountered in the past. Recent studies suggest that biasing the probability with which distractors appear at locations in visual space may lead to attentional suppression of high-probability distractor locations, which effectively reduces capture by a distractor but also impairs target selection at those locations. However, in many of these studies, introducing a high-probability distractor location was tantamount to increasing the probability of the target appearing in any of the other locations (i.e., the low-probability distractor locations). Here, we investigate an alternative interpretation of previous findings, according to which attentional selection at high-probability distractor locations is not suppressed; instead, selection at low-probability distractor locations is facilitated. In two visual search tasks, we found no evidence for this hypothesis: neither when there was only a bias in target presentation but no bias in distractor presentation (Experiment 1), nor when there was only a bias in distractor presentation but no bias in target presentation (Experiment 2). We conclude that recurrent presentation of a distractor at a specific location leads to attentional suppression of that location through a mechanism that is unaffected by any regularities regarding the target location.


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
D Doudesis ◽  
J Yang ◽  
A Tsanas ◽  
C Stables ◽  
A Shah ◽  
...  

Introduction: The myocardial-ischemic-injury-index (MI3) is a promising machine-learned algorithm that predicts the likelihood of myocardial infarction in patients with suspected acute coronary syndrome. Whether this algorithm performs well in unselected patients or predicts recurrent events is unknown. Methods: In an observational analysis from a multi-centre randomised trial, we included all patients with suspected acute coronary syndrome and serial high-sensitivity cardiac troponin I measurements without ST-segment elevation myocardial infarction. Using gradient boosting, MI3 incorporates age, sex, and two troponin measurements to compute a value (0–100) reflecting an individual's likelihood of myocardial infarction, and estimates the negative predictive value (NPV) and positive predictive value (PPV). Model performance for an index diagnosis of myocardial infarction, and for subsequent myocardial infarction or cardiovascular death at one year, was determined using previously defined low- and high-probability thresholds (1.6 and 49.7, respectively). Results: In total, 20,761 of 48,282 (43%) patients (64±16 years, 46% women) were eligible, of whom 3,278 (15.8%) had myocardial infarction. MI3 was well discriminated, with an area under the receiver-operating-characteristic curve of 0.949 (95% confidence interval 0.946–0.952), identifying 12,983 (62.5%) patients as low probability (sensitivity 99.3% [99.0–99.6%], NPV 99.8% [99.8–99.9%]) and 2,961 (14.3%) as high probability (specificity 95.0% [94.7–95.3%], PPV 70.4% [69–71.9%]). At one year, subsequent myocardial infarction or cardiovascular death occurred more often in high-probability than in low-probability patients (17.6% [520/2,961] versus 1.5% [197/12,983], P<0.001). Conclusions: In unselected consecutive patients with suspected acute coronary syndrome, the MI3 algorithm accurately estimates the likelihood of myocardial infarction and predicts the probability of subsequent adverse cardiovascular events. [Performance of MI3 at example thresholds] Funding Acknowledgement: Type of funding source: Foundation. Main funding source(s): Medical Research Council.
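The reported thresholds translate directly into a simple triage rule. Below is a minimal sketch of that rule, assuming only the published cut-offs quoted above (1.6 and 49.7); the MI3 score itself comes from a gradient-boosting model not reproduced here, and the handling of scores exactly at the boundaries is an assumption.

```python
# Sketch of the triage rule implied by the reported MI3 thresholds.
# Only the cut-offs (1.6 and 49.7) come from the abstract; boundary
# handling (strict vs. non-strict comparison) is an assumption.

LOW_THRESHOLD = 1.6    # below this: low probability of myocardial infarction (high NPV)
HIGH_THRESHOLD = 49.7  # above this: high probability of myocardial infarction (high PPV)

def triage(mi3_score: float) -> str:
    """Map an MI3 score (0-100) to a probability category."""
    if mi3_score < LOW_THRESHOLD:
        return "low probability"
    if mi3_score > HIGH_THRESHOLD:
        return "high probability"
    return "intermediate probability"

for score in (0.8, 25.0, 63.2):  # hypothetical example scores
    print(score, "->", triage(score))
```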


2010 ◽  
Vol 2010 ◽  
pp. 1-5 ◽  
Author(s):  
Wael N. Yacoub ◽  
Mikael Petrosyan ◽  
Indu Sehgal ◽  
Yanling Ma ◽  
Parakrama Chandrasoma ◽  
...  

The objective was to develop a score to stratify patients with acute cholecystitis into high, intermediate, or low probability of gangrenous cholecystitis. The probability of gangrenous cholecystitis (score) was derived from a logistic regression of a clinical and pathological review of 245 patients undergoing urgent cholecystectomy. Sixty-eight patients had gangrenous inflammation, 132 acute inflammation, and 45 no inflammation. The score comprised: age > 45 years (1 point), heart rate > 90 beats/min (1 point), male sex (2 points), leucocytosis > 13,000/mm³ (1.5 points), and ultrasound gallbladder wall thickness > 4.5 mm (1 point). The prevalence of gangrenous cholecystitis was 13% in the low-probability (0–2 points), 33% in the intermediate-probability (2–4.5 points), and 87% in the high-probability (>4.5 points) category. A cutoff score of 2 identified 31 (69%) patients with no acute inflammation (PPV 90%). This scoring system can prioritize patients for emergent cholecystectomy based on their expected pathology.
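The scoring rule is simple enough to express directly. Here is a sketch using only the point values and category cut-offs given in the abstract; because the reported bands overlap at 2 and 4.5 points, the handling of scores exactly at those boundaries is an assumption.

```python
# Sketch of the gangrenous-cholecystitis score described in the abstract.
# Point values and cut-offs are taken from the text; assignment of scores
# exactly at 2 and 4.5 is an assumption, since the reported bands overlap.

def cholecystitis_score(age: int, heart_rate: int, male: bool,
                        wbc_per_mm3: float, wall_thickness_mm: float) -> float:
    score = 0.0
    if age > 45:
        score += 1.0
    if heart_rate > 90:
        score += 1.0
    if male:
        score += 2.0
    if wbc_per_mm3 > 13_000:
        score += 1.5
    if wall_thickness_mm > 4.5:
        score += 1.0
    return score

def risk_category(score: float) -> str:
    if score <= 2:
        return "low probability (13% prevalence)"
    if score <= 4.5:
        return "intermediate probability (33% prevalence)"
    return "high probability (87% prevalence)"

# Hypothetical example patient
s = cholecystitis_score(age=60, heart_rate=105, male=True,
                        wbc_per_mm3=14_500, wall_thickness_mm=5.0)
print(s, "->", risk_category(s))  # 6.5 -> high probability
```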


2018 ◽  
Vol 35 (10) ◽  
pp. 1032-1038 ◽  
Author(s):  
Aaron S. Weinberg ◽  
William Chang ◽  
Grace Ih ◽  
Alan Waxman ◽  
Victor F. Tapson

Objective: Computed tomography angiography is limited in the intensive care unit (ICU) due to renal insufficiency, hemodynamic instability, and difficulty transporting unstable patients. A portable ventilation/perfusion (V/Q) scan can be used instead. However, it is commonly believed that an abnormal chest radiograph results in a nondiagnostic scan. In this retrospective study, we demonstrate that portable V/Q scans can help rule in or rule out clinically significant pulmonary embolism (PE) in the ICU despite an abnormal chest x-ray. Design: Two physicians reviewed charts and the original V/Q reports. A staff radiologist with 40 years of experience rated chest x-ray abnormalities using predetermined criteria. Setting: The study was conducted in the ICU. Patients: The first 100 consecutive patients with suspected PE who underwent a portable V/Q scan. Interventions: Those with a portable V/Q scan. Results: A normal baseline chest radiograph was found in only 6% of patients. Fifty-three percent had moderate, 24% had severe, and 10% had very severe radiographic abnormalities. Despite the abnormal x-rays, 88% of the V/Q scans were low probability for PE, even though the average radiograph abnormality rating was moderate. A high-probability V/Q scan for PE was diagnosed in 3% of the population despite chest x-ray ratings of moderate to severe. Six patients had their empiric anticoagulation discontinued after the results of the V/Q scan were obtained, and no anticoagulation was started for PE after a low-probability V/Q scan. Conclusion: Despite the large percentage of moderate-to-severe x-ray abnormalities, PE can still be diagnosed (high-probability scan) in the ICU with a portable V/Q scan. Although low-probability scans do not rule out acute PE, it appeared less likely that any patient with a low-probability V/Q scan had severe hypoxemia or hemodynamic instability due to a significant PE, which was useful to clinicians and allowed them to either stop or not start anticoagulation.


Author(s):  
S. Karaali ◽  
S. Bilir ◽  
E. Yaz Gökçe ◽  
O. Plevne

We used the spectroscopic and astrometric data provided by the GALactic Archaeology with HERMES (GALAH) Data Release 2 (DR2) and Gaia DR2, respectively, for a large sample of stars to investigate the behaviour of the [$\alpha$/Fe] abundances via two procedures, that is, kinematically and spectroscopically. With the kinematical procedure, we investigated the distribution of the [$\alpha$/Fe] abundances into the high-/low-probability thin-disc and high-/low-probability thick-disc populations in terms of total space velocity, [Fe/H] abundance, and age. The high-probability thin-disc stars dominate in all sub-intervals of [$\alpha$/Fe], including the rich ones, [$\alpha$/Fe] $> 0.3$ dex, where the high-probability thick-disc stars are expected to dominate. This result can be explained by the limiting apparent magnitude of GALAH DR2 ($V < 14$ mag) and the intermediate galactic latitude of the star sample. Stars in the four populations share equivalent [$\alpha$/Fe] and [Fe/H] abundances, total space velocities, and ages. Hence, none of these parameters can be used alone to separate a sample of stars into different populations. High-probability thin-disc stars with abundance $-1.3 < \mathrm{[Fe/H]} \leq -0.5$ dex and age $9 < \tau \leq 13$ Gyr are assumed to have different birth places relative to the metal-rich and younger ones. With the spectroscopic procedure, we separated the sample stars into $\alpha$-rich and $\alpha$-poor categories by means of their ages as well as their [$\alpha$/Fe] and [Fe/H] abundances. Stars older than 8 Gyr are richer in [$\alpha$/Fe] than the younger ones. We could estimate the abundance [$\alpha$/Fe] = 0.14 dex as the boundary separating the $\alpha$-rich and $\alpha$-poor sub-samples in the [$\alpha$/Fe] $\times$ [Fe/H] plane.
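The spectroscopic separation described above reduces to a single boundary in the [$\alpha$/Fe] $\times$ [Fe/H] plane. A minimal sketch of that classification follows, using only the [$\alpha$/Fe] = 0.14 dex boundary and the ~8 Gyr age contrast quoted in the abstract; the example stars are hypothetical, and treating the boundary as a flat cut is a simplification of the authors' estimate.

```python
# Minimal sketch of the spectroscopic alpha-rich / alpha-poor split,
# using the [alpha/Fe] = 0.14 dex boundary from the abstract.
# The example stars below are hypothetical, not GALAH DR2 entries.

ALPHA_FE_BOUNDARY = 0.14  # dex

def classify_star(alpha_fe: float) -> str:
    """Classify a star as alpha-rich or alpha-poor by its [alpha/Fe] abundance."""
    return "alpha-rich" if alpha_fe > ALPHA_FE_BOUNDARY else "alpha-poor"

# ([alpha/Fe] in dex, age in Gyr); the abstract notes stars older than
# ~8 Gyr tend to be richer in [alpha/Fe] than younger ones.
stars = [(0.28, 11.0), (0.05, 4.5), (0.16, 9.2)]
for alpha_fe, age in stars:
    print(f"[alpha/Fe]={alpha_fe:+.2f} dex, age={age:.1f} Gyr -> {classify_star(alpha_fe)}")
```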


2012 ◽  
Vol 2012 ◽  
pp. 1-20 ◽  
Author(s):  
Mousumi Gupta ◽  
Debasish Bhattacharjee

We propose two new methods to solve the fuzzy goal programming (FGP) problem by the weighting method, where the relative weights represent the relative importance of the objective functions. The proposed methods involve one additional goal constraint, introducing only underdeviation variables for the fuzzy operator λ (resp., 1−λ), which is more efficient than some well-known existing methods such as those proposed by Zimmermann, Hannan, Tiwari, and Mohamed. Mohamed proposed that every fuzzy linear program has an equivalent weighted linear goal program in which the weights are restricted to the reciprocals of the admissible violation constants, but this proposition is not always true. Furthermore, the proposed methods are easy to apply in real-life situations and give better solutions in the sense that the objective values are sufficiently close to their aspiration levels. Finally, for illustration, two real examples are used to demonstrate the correctness and usefulness of the proposed methods.
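For context, here is a minimal sketch of the classical Zimmermann-style max-min fuzzy LP formulation that weighted FGP approaches such as these build on: each fuzzy goal is turned into a membership constraint and the overall satisfaction level λ is maximized. The goals, aspiration levels, tolerances, and system constraint below are hypothetical illustration data; the paper's own weighted formulation with underdeviation variables is not reproduced here.

```python
# Sketch of the Zimmermann max-min fuzzy LP formulation (maximize lambda).
# Goals, aspiration levels, tolerances, and the constraint are hypothetical.
from scipy.optimize import linprog

# Decision variables: x1, x2, lambda (overall satisfaction level).
# Goal 1: x1 + 2*x2 >= 10, admissible violation (tolerance) 4
# Goal 2: 3*x1 + x2 >= 12, admissible violation (tolerance) 6
# System constraint: x1 + x2 <= 6
# Membership mu_k >= lambda becomes: goal_k(x) >= aspiration_k - tolerance_k*(1 - lambda)

c = [0.0, 0.0, -1.0]                     # minimize -lambda, i.e. maximize lambda
A_ub = [[-1.0, -2.0, 4.0],               # -(x1 + 2*x2) + 4*lambda <= -(10 - 4)
        [-3.0, -1.0, 6.0],               # -(3*x1 + x2) + 6*lambda <= -(12 - 6)
        [ 1.0,  1.0, 0.0]]               # x1 + x2 <= 6
b_ub = [-6.0, -6.0, 6.0]
bounds = [(0, None), (0, None), (0, 1)]  # x1, x2 >= 0; 0 <= lambda <= 1

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x1, x2, lam = res.x
print(f"x1={x1:.3f}, x2={x2:.3f}, lambda={lam:.3f}")  # lambda = 6/7 for this toy data
```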


1977 ◽  
Vol 5 (2) ◽  
pp. 295-304 ◽  
Author(s):  
Alice Ross Gold ◽  
Pamela G. Landerman ◽  
Kathryn Wold Bullock

Two studies were conducted that explored observers' perceptions of the responsibility of a victim for her involvement in a premeditated crime. Male and female college students listened to tapes of a purported victim describing a crime (either a rape or a mugging). Severity of crime was manipulated by having some of the crimes described as unsuccessful attempts and others as successful ones. There was a general tendency toward what we have called a sympathetic reaction pattern, that is, for victims to be assigned less responsibility the more severe the crime. This effect was strongest among those individuals who believed they had a low probability of encountering a fate similar to that of the victim. Those individuals who believed they had a high probability of encountering a fate similar to the victim's tended to make defensive attributions.


1953 ◽  
Vol 31 (5) ◽  
pp. 675-692 ◽  
Author(s):  
G. P. Thomas ◽  
D. G. Podmore

A study of decay of black cottonwood, Populus trichocarpa Torr. and Gray, revealed that, although 70 species of fungi caused decay, only six caused significant loss in living trees, and that two of these, Polyporus delectans Peck and Pholiota destruens (Brand.) Quél., caused 92% of this loss. The sporophores and associated decays of certain of the fungi are described and their relative importance is indicated. Data on the relation of decay to tree age showed that, despite a high incidence of infection, the average volume of decay per infected tree was low. In general, decay proved important only to the recovery of specialty products such as plywood. A significant reduction of strength was found to occur in wood containing an early stage of decay caused by P. destruens. The practicability of segregating trees having a high or low probability of being decayed was demonstrated through the use of decay indicators.


Blood ◽  
2019 ◽  
Vol 134 (Supplement_1) ◽  
pp. 3687-3687
Author(s):  
Fabian Zanchetta-Balint ◽  
France Pirenne ◽  
Marc Michel ◽  
Armand Mekontso-Dessap ◽  
Matthieu Mahevas ◽  
...  

Background: Transfusion is a major therapy for sickle cell disease (SCD); however, delayed hemolytic transfusion reaction (DHTR) is one of its most feared complications. Prevention of alloimmunization by extended RBC matching is insufficient to prevent all cases of DHTR. B-cell depletion therapy should therefore also be useful, especially in previously immunized patients, to avoid the emergence of new allo-antibodies. Rituximab (RTX) is used to prevent alloimmunization in patients with a history of DHTR, so secondary prevention with rituximab prior to a new exposure to transfused RBCs could be a relevant option. Here, we report our experience with RTX in adult SCD patients with a previous history of DHTR. Methods: In this retrospective observational study, we analysed data from 58 consecutive RTX infusions in 44 SCD patients with a history of DHTR at our French referral center for SCD. Medical, biological, and blood bank records, clinical signs, and the rate of hemoglobin A (HbA) after transfusion (TF) were collected. To evaluate the persistence of transfused RBCs, the DHTR risk probability on days 15 and 30 after TF was evaluated according to Mekontso Dessap's nomogram. We also recorded serious adverse events, such as infections, in the year after RTX infusion. In cases of planned surgery, 1 gram of RTX was administered on days 1 and 15 a few weeks beforehand, or as a single injection in emergency situations, with low-dose steroids. Adjuvant measures to avoid transfusion, such as EPO, iron injection, and hydroxyurea, were used in some cases. Results: We analyzed 58 cases of RTX administered to 44 adult patients with SCD, 10 of whom received the drug two or more times. A transfusion was required in 33/58 cases (56%). We distinguished three groups of patients. In the first group of 21 cases (36%), rituximab was used preventively before planned surgery at risk of bleeding; only 8 of these cases were transfused. In the second group of 30 cases (53%), RTX was given during an acute event; 19 of these patients received a transfusion. The third group of 7 patients received RTX during an active DHTR with hyperhemolysis requiring transfusion, to protect an imminent transfusion; 6 of them were transfused. To evaluate the efficacy of transfusion, we analyzed groups 1 and 2 together and the third group (active DHTR with hyperhemolysis) separately. In the first and second groups, HbA measurements were not available or interpretable in 11.1% of cases. On day 15 after TF, 77.8% of cases were classified as having a low probability of hemolysis, 7.4% an intermediate probability, and 3.7% a high probability. On day 30 after TF, 55.6% were in the low-probability subgroup, 11.1% in the intermediate-probability subgroup, and 22.2% in the high-probability subgroup (Figure 1). In group 3, HbA measurement was not available in 2 cases. On day 15 after TF, no cases were classified as having a low probability of hemolysis, 33.3% had an intermediate probability, and 33.3% a high probability. On day 30 after TF, 33.3% were in the intermediate-probability and 50% in the high-probability group (Figure 2). Infections requiring intravenous antibiotics were observed in 19/58 cases (32.7%), with bacterial documentation in 73.7%. In 63% of these cases, patients had been hospitalized in an intensive care unit for acute events before RTX administration and had other risk factors for infection. The median time to onset of infection was 28 days [11.5-46.5]. We report 4 deaths (6.8%): two patients died of a hyperhemolysis syndrome with multiorgan failure that started before RTX administration, and two of end-stage cancer. These deaths were not related to the use of RTX. Conclusion: This study suggests that RTX can be safely used to prevent DHTR in patients with a previous history of DHTR and detected antibodies. We show that transfusion efficiency at day 15 post-TF is better than at day 30 post-TF. The effectiveness of TF in active DHTR with hyperhemolysis is much lower, as most patients lose the transfused units by day 30 post-TF. Beyond the use of RTX, the use of other measures such as hydroxyurea and erythropoietin to avoid the need for transfusion in these patients must be emphasized. The infection risk after RTX therapy is difficult to assess, as in most cases an active inflammatory event was in progress. Additional prospective studies are needed to improve the management of this challenging clinical situation. Disclosures Michel: Novartis: Consultancy; Amgen: Consultancy; Rigel: Consultancy. Galactéros: Addmedica: Membership on an entity's Board of Directors or advisory committees. Bartolucci: Novartis: Membership on an entity's Board of Directors or advisory committees; AddMedica: Honoraria, Membership on an entity's Board of Directors or advisory committees; Roche: Membership on an entity's Board of Directors or advisory committees; HEMANEXT: Membership on an entity's Board of Directors or advisory committees; Global Blood Therapeutics: Membership on an entity's Board of Directors or advisory committees; Agios: Membership on an entity's Board of Directors or advisory committees.

