The crimes of first-time offenders: same or different from the crimes of habitual criminals?

2019 ◽  
Vol 10 (1) ◽  
pp. 1-15
Author(s):  
Glenn D. Walters

Purpose The purpose of this paper is to illustrate how first-time offenders and habitual criminals, while displaying wide differences in offense frequency, appear to follow a similar pattern in committing crime. Design/methodology/approach A conceptual approach is adopted in this paper. Findings It is argued that criminal thinking is the common denominator in both patterns, the difference being that habitual criminals have a higher resting level of proactive and reactive criminal thinking than first-time offenders. With an earlier age of onset, the habitual criminal may be more impulsive and reactive than first-time offenders, which partially explains why most low-rate offenders are not identified until adulthood. Practical implications Because actual and perceived deterrents to crime correlate weakly, if at all, it is recommended that perceived environmental events and criminal thinking be the primary targets of prevention and intervention programs. Social implications Environmental stimuli (such as events that produce general strain, increase opportunities for crime, reinforce criminal associations, irritate the individual, or interfere with the deterrent effect of perceived certainty) can both augment and interact with criminal thinking to increase the likelihood of a criminal act in both first-time offenders and habitual criminals. Originality/value The unique aspect of this paper is that it illustrates that certain features of crime and criminality are found across offending levels, whereas other features are more specific to a particular level.

2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Helen Kathryn Cyrus

Purpose Overview of coaching for recovery. The paper aims to give an overview of work carried out over 11 years with groups of mental health and physical health staff. For the facilitator, who had run this course for the whole period in Nottingham, this was an excellent opportunity to be at the forefront of a brand new project. Design/methodology/approach The skills are taught over two consecutive days, followed by a further day a month later. The idea of coaching is to enable people to find the answers in themselves through the use of powerful questions and the GROW model; combined with practice, this enables the brain to come up with its own answers. Rapport and effective communication are used to deliver the outcome. Findings Evidence from staff/clients shows that when the facilitator steps back, individual patients/staff are given the space to process, create and come up with their own solutions, which then helps them to buy into the process and creates ownership. Research limitations/implications The evidence suggests that the approach in place prior to the course was very much a clinical one: treating the person and administering medication rather than focusing on the inner person or personal recovery. The staff review has shown that, in the clinical context, change is happening from the inside out. Practical implications "Helps change culture"; "change of work practice"; "it changed staff focus – not so prescriptive"; "powerful questions let clients come to their own conclusions"; "coaching gives the ability to find half full. Helps to offer reassurance and to find one spark of hope". Social implications This has shown that the approach is now person-centred/holistic, which has been the "difference that has made the difference". When the issues are looked at from a different angle, in this case a coaching approach applying technique, knowledge and powerful questions, the results change. With the same clients, the same staff and the same problems but a different approach, there is evidence of a different outcome, which speaks for itself. The coaching method is more facilitative and therefore elicits a different response, and so a different result. Originality/value The results/evidence start with the individuals attending and their commitment to the process over the two-day course, then going away for four weeks (six for managers) with a renewed commitment to practise, and returning to share with the group whatever impact there has been. This, in turn, helps to inspire and to draw motivation from the feedback, so that participants go back to work invigorated and keep going.


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Mark Edward Phillips ◽  
Hannah Tarver

Purpose This study furthers metadata quality research by providing complementary network-based metrics and insights to analyze metadata records and identify areas for improvement. Design/methodology/approach Metadata record graphs apply network analysis to metadata field values; this study evaluates the interconnectedness of subjects within each Hub aggregated into the Digital Public Library of America. It also reviews the effects of NACO normalization (simulating revision of values for consistency) and of breaking up pre-coordinated subject headings (simulating the application of the Faceted Application of Subject Terminology to Library of Congress Subject Headings). Findings Network statistics complement count- or value-based metrics by providing context related to the number of records a user might actually find starting from one item and moving to others via shared subject values. Additionally, connectivity increases through the normalization of values to correct or adjust for formatting differences or by breaking pre-coordinated subject strings into separate topics. Research limitations/implications This analysis focuses on exact-string matches, which is the lowest common denominator for searching, although many search engines and digital library indexes may use less stringent matching methods. In terms of practical implications for evaluating or improving subjects in metadata, the normalization components demonstrate where resources may be most effectively allocated for these activities (depending on a collection). Originality/value Although the individual components of this research are not particularly novel, network analysis has not generally been applied to metadata analysis. This research furthers previous studies related to metadata quality analysis of aggregations and digital collections in general.
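
A minimal sketch of the record-graph idea described above, assuming the Python networkx library; the record identifiers, subject strings and the crude normalization function are invented for illustration and are not drawn from the study. It shows how exact-string matching leaves variant subject forms disconnected, and how a simulated normalization increases the number of records reachable from a given item.

```python
# Minimal sketch of a "metadata record graph" (illustrative data, not from the study).
import networkx as nx

records = {
    "rec1": ["Texas -- History", "Railroads"],
    "rec2": ["Railroads", "Bridges"],
    "rec3": ["Texas--History"],          # same topic, different formatting
}

def build_graph(records, normalize=lambda s: s):
    """Bipartite graph: record nodes linked to (possibly normalized) subject nodes."""
    g = nx.Graph()
    for rec_id, subjects in records.items():
        for subject in subjects:
            g.add_edge(("record", rec_id), ("subject", normalize(subject)))
    return g

def reachable_records(g, rec_id):
    """Records a user could reach from rec_id by following shared subject values."""
    component = nx.node_connected_component(g, ("record", rec_id))
    return {name for kind, name in component if kind == "record"} - {rec_id}

# Exact-string matching: rec1 and rec3 stay disconnected.
g_exact = build_graph(records)
print(reachable_records(g_exact, "rec1"))          # {'rec2'}

# Simulated normalization (collapse punctuation/spacing) increases connectivity.
norm = lambda s: " ".join(s.replace("--", " ").lower().split())
g_norm = build_graph(records, normalize=norm)
print(reachable_records(g_norm, "rec1"))           # {'rec2', 'rec3'}
```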


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Isabel María Parra Oller ◽  
Salvador Cruz Rambaud ◽  
María del Carmen Valls Martínez

Purpose The main purpose of this paper is to determine the discount function which better fits individuals' preferences through an empirical analysis of the different functions used in the field of intertemporal choice. Design/methodology/approach After an in-depth review of the existing literature, and unlike most studies which only focus on exponential and hyperbolic discounting, this manuscript compares the fit of the data to six different discount functions. To do this, the analysis is based on the usual statistical methods and on non-linear least squares regression, through the Gauss-Newton algorithm, in order to estimate the models' parameters; finally, the AICc method is used to compare the significance of the six proposed models. Findings This paper shows that the so-called q-exponential function deformed by the amount is the model which better explains individuals' preferences on both delayed gains and losses. To the extent of the authors' knowledge, this is the first time that a function different from the general hyperbola fits individuals' preferences better. Originality/value This paper contributes to the search for an alternative model able to explain individual behavior in a more realistic way.
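
As a rough sketch of the estimation pipeline described above (not the authors' code or data), a candidate discount function can be fitted by non-linear least squares and the fitted models compared with AICc; the two functional forms, starting values and observations below are assumptions standing in for the six functions actually compared in the paper.

```python
# Illustrative only: fit two candidate discount functions by non-linear least
# squares and compare them with AICc. Data and starting values are invented.
import numpy as np
from scipy.optimize import curve_fit

delay = np.array([1, 7, 30, 90, 180, 365], dtype=float)      # days
discount = np.array([0.97, 0.90, 0.74, 0.55, 0.42, 0.30])    # observed V(t)/V(0)

def exponential(t, k):
    return np.exp(-k * t)

def hyperbolic(t, k):
    return 1.0 / (1.0 + k * t)

def aicc(y, y_hat, n_params):
    """Small-sample corrected AIC computed from the residual sum of squares."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    aic = n * np.log(rss / n) + 2 * n_params
    return aic + 2 * n_params * (n_params + 1) / (n - n_params - 1)

for name, f in [("exponential", exponential), ("hyperbolic", hyperbolic)]:
    params, _ = curve_fit(f, delay, discount, p0=[0.01])
    print(name, params, aicc(discount, f(delay, *params), n_params=1))
```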


Kybernetes ◽  
2008 ◽  
Vol 37 (3/4) ◽  
pp. 453-457 ◽  
Author(s):  
Wujia Zhu ◽  
Yi Lin ◽  
Guoping Du ◽  
Ningsheng Gong

Purpose The purpose is to show that all uncountable infinite sets are self-contradictory non-sets. Design/methodology/approach A conceptual approach is taken in the paper. Findings Given the result of this paper that the set N = {x | n(x)} of all natural numbers, where n(x) =df "x is a natural number", is a self-contradicting non-set, the authors prove that in the framework of modern axiomatic set theory ZFC, various uncountable infinite sets are either non-existent or self-contradicting non-sets. Therefore, it can be astonishingly concluded that, in both naive set theory and modern axiomatic set theory, if any actual infinite set exists, it must be a self-contradicting non-set. Originality/value For the first time in history, it is shown that such a convenient notion as the set of all real numbers needs to be reconsidered.


2011 ◽  
Vol 55 (5) ◽  
pp. 1906-1911 ◽  
Author(s):  
Gabriel Cabot ◽  
Alain A. Ocampo-Sosa ◽  
Fe Tubau ◽  
María D. Macia ◽  
Cristina Rodríguez ◽  
...  

ABSTRACT The prevalence and impact of the overexpression of AmpC and efflux pumps were evaluated with a collection of 190 Pseudomonas aeruginosa isolates recovered from bloodstream infections in a 2008 multicenter study (10 hospitals) in Spain. The MICs of a panel of 13 antipseudomonal agents were determined by microdilution, and the expressions of ampC, mexB, mexY, mexD, and mexF were determined by real-time reverse transcription (RT)-PCR. Up to 39% of the isolates overexpressed at least one of the mechanisms. ampC overexpression (24.2%) was the most prevalent mechanism, followed by mexY (13.2%), mexB (12.6%), mexF (4.2%), and mexD (2.2%). The overexpression of mexB plus mexY, documented for 5.3% of the isolates, was the only combination showing a significantly (P = 0.02) higher prevalence than expected from the frequencies of the individual mechanisms (1.6%). Additionally, all imipenem-resistant isolates studied (25 representative isolates) showed inactivating mutations in oprD. Most of the isolates nonsusceptible to piperacillin-tazobactam (96%) and ceftazidime (84%) overexpressed ampC, while mexB (25%) and mexY (29%) overexpressions gained relevance among cefepime-nonsusceptible isolates. Nevertheless, the prevalence of mexY overexpression was highest among tobramycin-nonsusceptible isolates (37%), and that of mexB was highest among meropenem-nonsusceptible isolates (33%). Regarding ciprofloxacin-resistant isolates, besides the expected increased prevalence of efflux pump overexpression, a highly significant link to ampC overexpression was documented for the first time: up to 52% of ciprofloxacin-nonsusceptible isolates overexpressed ampC, sharply contrasting with the 24% documented for the complete collection (P < 0.001). In summary, mutation-driven resistance was frequent in P. aeruginosa isolates from bloodstream infections, whereas metallo-β-lactamases, detected in 2 isolates (1%) producing VIM-2, although with increasing prevalences, were still uncommon.
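
The "higher prevalence than expected" claim for mexB plus mexY can be illustrated with a simple independence check: under independence, the expected joint frequency is the product of the individual frequencies (0.126 × 0.132 ≈ 1.7%, reported as 1.6%), against which the observed 5.3% can be tested. The sketch below reconstructs approximate counts from the reported percentages; the exact counts and the statistical test used by the authors are assumptions.

```python
# Back-of-the-envelope check (not the authors' analysis) of whether mexB+mexY
# co-overexpression exceeds what independence would predict. Counts are derived
# from the reported percentages and n = 190, so they are approximate.
from scipy.stats import binomtest

n = 190
p_mexB, p_mexY = 0.126, 0.132
expected_joint = p_mexB * p_mexY            # ~1.7% under independence
observed_joint = round(0.053 * n)           # ~10 isolates overexpressing both

result = binomtest(observed_joint, n, expected_joint, alternative="greater")
print(f"expected {expected_joint:.1%}, observed {observed_joint}/{n}, p = {result.pvalue:.3f}")
```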


2018 ◽  
Vol 14 (S343) ◽  
pp. 487-488
Author(s):  
Ernst Paunzen ◽  
Jan Janík ◽  
Petr Kurfürst ◽  
Jiří Liška ◽  
Martin Netopil ◽  
...  

Abstract The a-index samples the flux of the 5200 Å region by comparing the flux at the center with that of the adjacent regions. The final intrinsic peculiarity index Δa was defined as the difference between the individual a-values and the a-values of normal stars of the same colour (spectral type). Here we present, for the first time, a case study to detect and analyse Asymptotic Giant Branch (AGB) stars in the Magellanic Clouds. For this, we use our photometric survey of the Magellanic Clouds in the a-index. We find that AGB stars can be detected easily and efficiently on the basis of their Δa index.
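
In schematic terms, the indices described above compare a narrow measurement at the 5200 Å flux depression with the adjacent continuum and then subtract the normal-star locus at the same colour. The filter designations and centres in the sketch below follow the standard Maitzen-type Δa system from memory and are assumptions, not details given in this abstract.

```latex
% Schematic form of the a and \Delta a indices (indicative only): g_2 samples
% the 5200 A depression, g_1 and y sample the adjacent regions, and a_0 is the
% a-value of normal stars of the same colour (spectral type).
a = g_2 - \frac{g_1 + y}{2},
\qquad
\Delta a = a - a_0\left[(b - y);\ \text{spectral type}\right]
```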


2018 ◽  
Vol 120 (12) ◽  
pp. 2843-2856 ◽  
Author(s):  
Daniel Leufkens

Purpose For a long time, the European geographical indication (GI) regulation has been of great interest to economists and policymakers. To justify exclusive European regulation, it is necessary to prove the positive value of a GI quality signal (i.e. label), which is often achieved by quantifying its monetary value for consumers. But even though a large number of contributions in the literature already deal with this question, they lack an evaluation of overall effect sizes for the GI label. The purpose of this paper is, therefore, to quantify and evaluate the overall marginal consumer willingness to pay for the European GI label. Design/methodology/approach To reach this aim, a meta-analysis is used, for which a literature survey was carried out in order to determine the GI label effects (LEs). In addition to previous works, this paper not only includes a meta-analysis but also implements a heterogeneity analysis to distinguish between the LEs of individual GI standards. To eliminate study- and product-specific determinants of heterogeneity, moderator variables are used. Findings The empirical results indicate that consumers have a highly significant and positive marginal willingness to pay for GIs. However, the marginal willingness to pay differs significantly between the individual GI standards and indicates great heterogeneity between the protected products. Originality/value As an extension of previous studies and meta-analyses, this paper includes the most extensive GI meta-data set so far, conducts for the first time an independent heterogeneity analysis to distinguish between the LEs of individual GI standards, and implements a moderator analysis to eliminate study- and product-specific determinants of heterogeneity from the GI effects.
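
One common way to implement the moderator analysis described above is a weighted meta-regression of study-level label effects on study and product characteristics, weighting by inverse variance. The sketch below shows that structure with invented effect sizes and moderators; it is not the author's specification.

```python
# Minimal meta-regression sketch (invented data, not the paper's data set):
# study-level GI label effects regressed on moderators, weighted by precision.
import numpy as np
import statsmodels.api as sm

effect = np.array([0.12, 0.25, 0.08, 0.30, 0.18])   # WTP premium per study
se = np.array([0.05, 0.08, 0.04, 0.10, 0.06])       # standard errors per study
is_pdo = np.array([1, 0, 1, 0, 1])                  # moderator: PDO vs PGI label
is_cheese = np.array([0, 1, 0, 1, 1])               # moderator: product category

X = sm.add_constant(np.column_stack([is_pdo, is_cheese]))
model = sm.WLS(effect, X, weights=1.0 / se**2).fit()
print(model.summary())
```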


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
André de Waal

Purpose The purpose of this paper is to arrive at a general definition of a high performance organization (HPO) and a (practical) way to measure one. Managers are looking for techniques to strengthen their organizations in a way that lets them not only cope with threats but also quickly take advantage of opportunities, and thus grow and thrive. The academic and especially the practitioner fields reacted to this "thirst for high performance knowledge" with a plethora of books and articles on the topic of HPOs. These publications each came with their own description and measurement of HPOs, which created a lot of confusion among practitioners. Design/methodology/approach In this study, the following research question is answered: how can an HPO be defined and its performance measured? With the answer, this paper aims to take away the aforementioned confusion. It does so by conducting an extensive systematic review of the literature on HPOs, after which it synthesizes the findings into a proposal on how to define and measure the HPO. Findings This paper was able to obtain from the literature a list of definitions and measurements of an HPO. The common denominator in these definitions and measurements turned out to be respondents giving their opinion on the effects of the organizational practices they apply on organizational performance vis-à-vis that of competitors. This paper therefore concluded that an HPO should be defined and measured relative to competitors and should be based on the perceptions of managers and employees of the organization: an HPO is an organization that achieves results that are better than those of its peer group over a longer period of time. Research limitations/implications With the answer to the research question, this paper fills the current gap in the definition and measurement literature on HPOs and thus moves research into HPOs forward, as researchers can use these results in their future studies on high performance and HPOs. Originality/value Although there is a plethora of literature on high performance and HPOs, no univocal definition and measurement of the HPO can be found. This study provides for the first time an academically well-founded definition and measurement method.


2017 ◽  
Vol 7 (1) ◽  
pp. 114-130 ◽  
Author(s):  
Yiming Hu ◽  
Ying Yang ◽  
Pengfei Han

Purpose The purpose of this paper is to examine the differences in credit enhancement among variously secured bonds issued by local government financing platforms (local government financing platform bonds, LGFPBs). Design/methodology/approach The approaches used to secure the bonds usually include mortgage, collateral, guarantee, etc. Findings Using a sample of LGFPBs issued during the 2007-2013 period, the authors find that all of the approaches to secure the bonds increase the bond rating and that compounded approaches have a higher credit enhancement effect than single approaches. Among these approaches, the requirement of collateral has the strongest enhancement effect. Moreover, the authors find that a guarantee provided by a state-owned bank or enterprise increases the bond rating more than a guarantee provided by other local government financing platforms. Research limitations/implications The findings in this study suggest that credit enhancement is deeply affected by the approach used to secure the bond. Practical implications These results can help local governments make better decisions when issuing bonds. Originality/value This study empirically analyzes the different credit enhancement approaches for securing LGFPBs for the first time and contributes to the literature regarding credit ratings of local government bonds.


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Hossein Derakhshanfar ◽  
J. Jorge Ochoa ◽  
Konstantinos Kirytopoulos ◽  
Wolfgang Mayer ◽  
Craig Langston

Purpose The purpose of this research is to identify the most impactful delay risks in Australian construction projects, including the associations amongst those risks as well as the project phases in which they are most likely to be present. The correlation between project and organisational characteristics and the impact of delay risks was also studied. Design/methodology/approach A questionnaire survey was used to collect data from 118 delayed construction projects in Australia. Data were analysed to rank the most impactful delay risks, their correlation to project and organisational characteristics, and the project phases where those risks are likely to emerge. Association rule learning was used to capture associations between the delay risks. Findings The top five most impactful delay risks in Australia were changes by the owner, slow decisions by the owner, preparation and approval of design drawings, underestimation of project complexity and unrealistic duration imposed on the project, respectively. There is a set of delay risks that are mutually associated with project complexity. In addition, while delay risks associated with resources most likely arise in the execution phase, stakeholder- and process-related risks are more evenly distributed across all project phases. Originality/value This research for the first time investigated the impact of delay risks, the associations amongst them and the project phases in which they are likely to happen in the Australian context. This research also, for the first time, sheds light on the project phases of the individual delay risks, which aids project managers in understanding where to focus during each phase of the project.
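
The association rule learning step can be sketched by encoding each delayed project as a set of binary indicators for the delay risks observed and then mining frequent itemsets and rules from them. The library (mlxtend), the thresholds and the risk labels below are assumptions for illustration, not the authors' setup.

```python
# Illustrative association-rule mining over per-project delay-risk indicators.
# Risk labels, data and thresholds are invented; mlxtend is assumed installed.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

projects = pd.DataFrame(
    [
        {"owner_changes": 1, "slow_decisions": 1, "design_approval": 0, "complexity": 1},
        {"owner_changes": 1, "slow_decisions": 0, "design_approval": 1, "complexity": 1},
        {"owner_changes": 0, "slow_decisions": 1, "design_approval": 1, "complexity": 0},
        {"owner_changes": 1, "slow_decisions": 1, "design_approval": 0, "complexity": 1},
    ]
).astype(bool)

itemsets = apriori(projects, min_support=0.5, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.7)
print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])
```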

