Theoretical Foundation
Recently Published Documents


Total documents: 2123 (five years: 624)
H-index: 56 (five years: 7)

2022, Vol. ahead-of-print
Author(s): Eden Yin, Abeer Mahrous

Purpose: Despite the growing importance of workplace spirituality, organisations have been reluctant to integrate spirituality into their workplaces; this paper discusses how to integrate spirituality into the workplace.
Design/methodology/approach: This is a theoretical paper that builds its arguments on a synthesis of workplace spirituality and contemporary management paradigms.
Findings: The study argues that workplace spirituality is an extremely important driving force for the sustainable and healthy growth of any organisation; however, infusing workplace spirituality into companies of the industrial and digital eras would be a futile effort, as industrial organisations are built on an ethos highly incongruent with spiritual principles. Therefore, in the post-digital era, spirituality-driven organisations (SDOs) will emerge, marking the beginning of a true “spiritual paradigm” for business and human society at large. The study also elaborates on the characteristics of the post-digital era and the nature of SDOs.
Originality/value: Workplace spirituality has been a research topic for years but has never gained sufficient momentum. The Covid-19 global pandemic has made workplace spirituality a more pertinent issue on corporate agendas. This paper therefore provides the theoretical foundation to embed workplace spirituality in contemporary management thought and practice.


2022, Vol. 6 (POPL), pp. 1-24
Author(s): Wenlei He, Julián Mestre, Sergey Pupyrev, Lei Wang, Hongtao Yu

Profile-guided optimization (PGO) is an important component in modern compilers. By allowing the compiler to leverage the program’s dynamic behavior, it can often generate substantially faster binaries. Sampling-based profiling is the state-of-the-art technique for collecting execution profiles in data-center environments. However, the lower profile accuracy caused by sampling fully optimized binaries often undermines the benefits of PGO; thus, an important problem is to overcome the inaccuracy in a profile after it has been collected. In this paper we tackle this problem, also known as profile inference and profile rectification. We investigate the classical approach to profile inference, based on computing minimum-cost maximum flows in a control-flow graph, and develop an extended model capturing the desired properties of real-world profiles. Next we provide a solid theoretical foundation for the corresponding optimization problem by studying its algorithmic aspects. We then describe a new efficient algorithm for the problem along with its implementation in an open-source compiler. An extensive evaluation of the algorithm and existing profile inference techniques on a variety of applications, including Facebook production workloads and SPEC CPU benchmarks, indicates that the new method outperforms its competitors by significantly improving the accuracy of profile data and the performance of generated binaries.
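The classical formulation referenced in this abstract can be sketched on a toy control-flow graph: route the function's measured entry count through the CFG as a minimum-cost flow, so the inferred edge counts obey flow conservation. The CFG, the edge costs, and the use of networkx below are illustrative assumptions, not the paper's extended model or its compiler implementation.

```python
# Toy profile-inference sketch: treat the measured entry count as flow that
# must travel entry -> exit through the CFG; edge weights penalize routing
# profile mass through edges the sampled profile suggests are cold.
# Hypothetical CFG and costs; networkx is used only for convenience.
import networkx as nx

entry_count = 100  # samples observed at function entry

G = nx.DiGraph()
# demand < 0 marks a source, demand > 0 a sink
G.add_node("entry", demand=-entry_count)
G.add_node("exit", demand=entry_count)

# CFG edges with costs (higher weight = profile says this edge is colder)
G.add_edge("entry", "if", weight=0)
G.add_edge("if", "then", weight=1)
G.add_edge("if", "else", weight=0)
G.add_edge("then", "join", weight=0)
G.add_edge("else", "join", weight=0)
G.add_edge("join", "exit", weight=0)

flow = nx.min_cost_flow(G)  # dict of dicts: flow[u][v] = inferred edge count
print(flow["if"])
```

In this toy instance all 100 units are routed through the cheaper `else` branch; a realistic rectifier would additionally add arcs that let inferred counts deviate from the measured block counts at a cost proportional to the deviation.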


Author(s): Maximilian Paul Niroomand, Conor T Cafolla, John William Roger Morgan, David J Wales

Abstract: One of the most common metrics for evaluating neural network classifiers is the area under the receiver operating characteristic curve (AUC). However, optimising the AUC as the loss function during network training is not a standard procedure. Here we compare minimising the cross-entropy (CE) loss and optimising the AUC directly. In particular, we analyse the loss function landscape (LFL) of approximate AUC (appAUC) loss functions to discover the organisation of this solution space. We discuss various surrogates for AUC approximation and show their differences. We find that the characteristics of the appAUC landscape are significantly different from those of the CE landscape. The approximate AUC loss function improves testing AUC, and the appAUC landscape has substantially more minima, but these minima are less robust, with larger average Hessian eigenvalues. We provide a theoretical foundation to explain these results. Lastly, to generalise our results, we provide an overview of how the LFL can help to guide loss function analysis and selection.
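One standard way to make AUC trainable is to replace the pairwise indicator 1[s_pos > s_neg] with a smooth sigmoid of the margin. The abstract discusses several surrogates; this particular surrogate and the scores below are assumptions chosen only to illustrate the idea:

```python
# Exact AUC vs. a smooth pairwise surrogate ("appAUC"-style) in NumPy.
import numpy as np

def auc(pos, neg):
    """Exact AUC: fraction of (positive, negative) pairs ranked correctly."""
    return float(np.mean(pos[:, None] > neg[None, :]))

def approx_auc_loss(pos, neg, beta=1.0):
    """Differentiable surrogate: sigmoid of the negated pairwise margin."""
    margins = pos[:, None] - neg[None, :]          # s_pos - s_neg per pair
    return float(np.mean(1.0 / (1.0 + np.exp(beta * margins))))

pos = np.array([0.9, 0.8, 0.4])  # classifier scores for positive examples
neg = np.array([0.1, 0.3, 0.5])  # classifier scores for negative examples
print(auc(pos, neg))             # 8 of 9 pairs ranked correctly
```

Minimising the surrogate pushes pairwise margins up, which maximises the exact AUC; as all margins grow, the surrogate loss approaches zero.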


2022, Vol. 23 (4), pp. 1041-1050
Author(s): N. A. Kurakina, I. S. Achinovich

Phono-stylistics is a promising research area: the expressive power of a text depends on its phonetic imagery. The research objective was to identify the pragmatic features of phonic expressive means in translations of contemporary English poetry. The methods included comparative analysis, phono-semantic and phono-stylistic interpretation of the original poems and their translations, and Tynyanov's law of versification. The sound-counting method developed by E. V. Elkina and L. S. Yudina was used to calculate the frequency of sounds in the Russian translations, in the context of phono-semantic analysis; the sound-counting method designed by Tsoi Vi Chuen Thomas was used to calculate the frequency of sounds in the original English texts. The theoretical foundation of the research was formed by works by M. A. Balash, G. V. Vekshin, Z. S. Dotmurzieva, V. N. Elkina, A. P. Zhuravlev, L. V. Laenko, F. Miko, L. P. Prokofyeva, E. A. Titov, etc. The study featured the phonics and pragmatics of S. Dugdale’s poem Zaitz and its three translations made by E. Tretyakova, A. Shchetinina, and M. Vinogradova, and of C. E. Duffy’s Anne Hathaway translated by Yu. Fokina. The authors compared the pragmatics of sound imagery in the English originals and their Russian translations. The research made it possible to define the role of sound imagery in poetic discourse, as well as the relationship between the sound organization of poetic speech and its pragmatic value at the phonographic level. The results can be used in courses on translation, stylistics, and phonetics.
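In its simplest form, sound counting reduces to tallying relative frequencies of letters as a proxy for sounds. The sketch below is only a schematic illustration of that idea; it is not the Elkina-Yudina or Tsoi Vi Chuen Thomas method, and the sample line is chosen arbitrarily.

```python
# Schematic sound counting: relative letter frequencies as a crude proxy
# for sound frequencies (real phono-semantic methods also weight sounds by
# position and stress, which this sketch ignores).
from collections import Counter

def sound_frequencies(text, alphabet):
    letters = [ch for ch in text.lower() if ch in alphabet]
    counts = Counter(letters)
    total = len(letters)
    return {ch: n / total for ch, n in counts.items()}

line = "Shall I compare thee to a summer's day"
freqs = sound_frequencies(line, set("abcdefghijklmnopqrstuvwxyz"))
print(round(freqs["s"], 3))  # 3 of the 30 letters are 's' -> 0.1
```

Comparing such frequency profiles between an original and its translation is one crude way to quantify how much of the sound imagery survives.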


2022, pp. 205556362110616
Author(s): Katri Nousiainen

We need law and economics to do the scientific measurement necessary for legal design to be recognised as a science. Law and economics, the application of economic theory (especially microeconomic theory) to the analysis and practice of law, is a valid tool and approach for reflecting on what should be empirically investigated in the practice of legal design. The neoclassical (mainstream) theoretical foundation of the economic analysis of law is, however, at times far from reality, as it often predicts uncooperative and even selfish behaviour. In real life people do cooperate, feel empathy and emotion, and even behave altruistically. For those reasons, behavioural law and economics and conventional wisdom are needed to complement the teachings of standard theory in the field of commercial contracting.


Author(s):  
Nahúm Misael Tórrez

Textbooks hold a fundamental position in English Language Teaching (ELT). Today, their main aim is to contribute to the development of the learner’s communicative competence. This paper sets out to lay the basis for a framework for characterizing ELT textbooks in terms of the opportunities they offer to promote communicative competence. In order to provide a theoretical foundation for the framework, it first introduces the notions of input (Krashen, 1989) and output (Swain & Lapkin, 1995). It then presents two influential models of communicative competence: those of Canale and Swain (1980) and of the Common European Framework of Reference for Languages (Council of Europe, 2001, 2018). Following that, it presents two widely cited sets of principles for the study of learning materials in Communicative Language Teaching (CLT): those of Richards and Rodgers (2014) and Nation (2007). Building on these models and principles, the paper suggests eleven criteria for characterizing communication-oriented ELT textbooks, covering input in the form of topics and texts, and output in the form of activities. A short discussion of the main affordances of the suggested framework is provided at the end of the article.
Keywords: Communicative Competence, ELT Textbooks, Textbook Analysis, Communicative Language Teaching (CLT).


2022
Author(s): Shubin Liu, Shujing Zhong, Xin He, Siyuan Liu, Bin Wang, ...

Chemical bonds and noncovalent interactions are extraordinarily important concepts in chemistry and beyond. Using density-based quantities to describe them has a long history in the literature, yet no such quantity can satisfactorily describe the entire spectrum of interactions, from strong chemical bonds to weak van der Waals forces. In this work, employing the Pauli energy as the theoretical foundation, we fill that knowledge gap. Our results show that the newly established density-based index can describe single and multiple covalent bonds, ionic bonds, metallic bonds, and different kinds of noncovalent interactions, all with unique and readily identifiable signature shapes. Two new descriptors, the NBI (nonbonding and bonding identification) index and the USI (ultra-strong interaction) index, are introduced in this work. Together with the NCI (noncovalent interaction) and SCI (strong covalent interaction) indexes already available in the literature, a density-based description of both chemical bonds and noncovalent interactions is accomplished.

