Axiomatic theory
Recently Published Documents


TOTAL DOCUMENTS: 160 (five years: 23)
H-INDEX: 20 (five years: 1)

Author(s):  
Alireza Jamali

An axiomatic theory is proposed that reconciles the existence of an absolute scale for time (the Planck time) with special relativity. According to this theory, the speed of light c becomes a variable, which is proposed to be taken as the fifth dimension.


2021 ◽  
Author(s):  
Filippo A. Salustri ◽  
R. D. Venter

An axiomatic theory of engineering design information


Author(s):  
Abiodun Babatunde Onamusi

Research question: This study assessed the combined moderating effect of organizational structure and environmental turbulence on the entry mode strategy-performance link, focusing on the baby care industry in Lagos State, Nigeria.
Motivation: Manufacturers of baby care products in Nigeria have struggled to understand the complexities of entry mode strategy and how to use firm-specific capabilities to contain threats from new entrants, given the idiosyncrasies of the Nigerian business environment. Considering the fast-changing business environment and the need for firms to align internal organizational structure with external environmental challenges, this study drew on Hage's (1965) axiomatic theory of organizations to examine the joint moderating effect of organizational structure and environmental turbulence on the entry mode strategy-performance linkage.
Idea: Empirical evidence on the combined moderating effect of organizational structure and environmental turbulence on the interaction of entry mode strategy and organizational performance is sparse; this study addressed that gap in the literature.
Data: The study adopted a survey research design with a sample of 518 employees of FMCG manufacturers in Lagos State, Nigeria.
Tools: A validated structured questionnaire was the instrument of data collection, and hierarchical regression analysis was used to test the hypotheses.
Findings: The results showed that the relationship between entry mode strategy and firm performance was positive and significant. Further analysis revealed that the interaction term of organizational structure and environmental turbulence accounted for a rise in firm performance, suggesting that the two are jointly significant moderators. This suggests that the appropriateness of entry mode strategy is key to firm performance and that fit between organizational structure and the macro-environment is a precondition for higher performance.
Contribution: This study adds to recent empirical literature on the link between entry mode strategy, organizational structure, environmental turbulence, and firm performance within an emerging-economy context, and it provides additional support for the assumptions of the eclectic theory and Hage's axiomatic theory of organizations.
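The hierarchical (stepwise) moderated-regression design the abstract describes can be sketched on synthetic data. The variable names, coefficients, and data below are purely illustrative assumptions, not the study's data or results; the point is only the mechanics of entering a main effect first, then moderators and an interaction term, and comparing fit:

```python
import random

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination. X is a list of predictor rows."""
    k, n = len(X[0]), len(X)
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):  # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

def r_squared(X, y, beta):
    yhat = [sum(b * x for b, x in zip(beta, row)) for row in X]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Synthetic data: performance depends on entry-mode strategy, with the
# effect strengthened by a structure x turbulence interaction.
random.seed(42)
rows, y = [], []
for _ in range(200):
    entry = random.gauss(0, 1)   # entry mode strategy score (illustrative)
    struct = random.gauss(0, 1)  # organizational structure score (illustrative)
    turb = random.gauss(0, 1)    # environmental turbulence score (illustrative)
    perf = 0.5 * entry + 0.3 * struct * turb * entry + random.gauss(0, 0.5)
    rows.append((entry, struct, turb))
    y.append(perf)

# Step 1: intercept + main effect only.
X1 = [[1.0, e] for e, s, t in rows]
b1 = ols(X1, y)
r2_1 = r_squared(X1, y, b1)

# Step 2: add the moderators and the interaction term.
X2 = [[1.0, e, s, t, s * t * e] for e, s, t in rows]
b2 = ols(X2, y)
r2_2 = r_squared(X2, y, b2)

print(f"Step 1 R^2 = {r2_1:.3f}, Step 2 R^2 = {r2_2:.3f}")
```

The increase in R² from step 1 to step 2 is what licenses the "joint significant moderators" reading in such a design.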


PARADIGMA ◽  
2021 ◽  
pp. 221-256
Author(s):  
Manfred Borovcnik

In this paper, we analyse the various meanings of probability and its different applications, focusing especially on the classical, the frequentist, and the subjectivist view. We describe the different problems of how probability can be measured in each of the approaches, and how each of them can be justified by a mathematical theory. We analyse the foundations of probability, where the scientific analysis of the theory that allows for a frequentist interpretation leads to unsolvable problems. Kolmogorov's axiomatic theory does not suffice to establish statistical inference without further definitions and principles. Finally, we show how statistical inference essentially determines the meaning of probability, and how a shift emerges from purely objectivist views to a complementary conception of probability with frequentist and subjectivist constituents. For didactical purposes, the present analyses explain basic problems of teaching that originate from a biased focus on frequentist aspects of probability, and they indicate a high priority for the design of suitable learning paths towards a complementary conception of probability. In applications, modellers use information in a pragmatic way, processing it, regardless of its connotation, into formal mathematical models, which are always understood as essentially wrong but useful.
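The two readings of probability the paper contrasts can be illustrated with a short simulation (a hedged sketch, not taken from the paper): the frequentist reading identifies probability with a limiting relative frequency, while the subjectivist reading treats it as a degree of belief updated by data, here via a uniform Beta(1, 1) prior and Laplace's rule of succession.

```python
import random

random.seed(0)
p_true = 0.7  # the unknown "true" chance of heads (illustrative)
flips = [random.random() < p_true for _ in range(10_000)]

# Frequentist view: probability as the limit of relative frequencies.
rel_freq = sum(flips) / len(flips)

# Subjectivist view: probability as a degree of belief, updated by data.
# With a uniform Beta(1, 1) prior, the posterior mean after h heads in
# n flips is (h + 1) / (n + 2) -- Laplace's rule of succession.
h, n = sum(flips), len(flips)
posterior_mean = (h + 1) / (n + 2)

print(f"relative frequency : {rel_freq:.3f}")
print(f"posterior mean     : {posterior_mean:.3f}")
```

With enough data the two numbers agree, which is one face of the complementarity the paper argues for; they come apart when data are scarce and the prior carries weight.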


Author(s):  
Jae Kyu Lee ◽  
Jinsoo Park ◽  
Shirley Gregor ◽  
Victoria Yoon

Axiomatic Theories and Improving the Relevance of Information Systems Research

This paper examines the fact that a significant number of empirical information systems (IS) studies engage in confirmatory testing of self-evident axiomatic theories without yielding highly relevant knowledge for the IS community. The authors conduct both a horizontal analysis of 72 representative IS theories and an in-depth vertical analysis of 3 well-known theories (the technology acceptance model, diffusion of innovation theory, and institutional theory) in order to measure how pervasive such testing of axiomatic theories is. More than 60% of the 666 hypotheses in the horizontal analysis could be regarded as axiomatic theory elements; in the vertical analysis, 68.1% of 1,301 hypotheses from 148 articles were axiomatic. Based on these findings, the authors propose four complementary IS research approaches: (1) identifying disconfirming boundary conditions; (2) measuring the relative importance of axiomatic causal factors; (3) measuring the stage of progression toward visionary goals when the nature of the axiomatic theory can be extended to future visions; and (4) engaging in the conceptual design of visionary axiomatic goals. They argue that these complementary approaches can enhance the relevance of IS research outcomes without sacrificing methodological rigor.


2021 ◽  
Author(s):  
Andrey Shishkin

The textbook contains an exposition of the basic concepts and theorems of the axiomatic theory of the basic elementary functions of real and complex variables. It is written on the basis of lectures given by the author over a number of years at the Armavir State Pedagogical University, at the Slavyansk-on-Kuban State Pedagogical Institute, and at the branch of the Kuban State University in Slavyansk-on-Kuban. It is intended for students in natural sciences and mathematics tracks of the "Pedagogical Education" degree programme, and it can be used in the study of mathematical analysis, the theory of functions of a real variable, the theory of functions of a complex variable, etc.


2021 ◽  
Vol 31 ◽  
Author(s):  
MAX S. NEW ◽  
DANIEL R. LICATA ◽  
AMAL AHMED

Gradually typed languages are designed to support both dynamically typed and statically typed programming styles while preserving the benefits of each. Sound gradually typed languages dynamically check types at runtime at the boundary between statically typed and dynamically typed modules. However, there is much disagreement in the gradual typing literature over how to enforce complex types such as tuples, lists, functions, and objects. In this paper, we propose a new perspective on the design of runtime gradual type enforcement: runtime type casts exist precisely to ensure the correctness of certain type-based refactorings and optimizations. For instance, for simple types, a language designer might desire that beta-eta equality be valid. We show that this perspective is useful by demonstrating that a cast semantics can be derived from beta-eta equality. We do this by providing an axiomatic account of program equivalence in a gradual cast calculus, in a logic we call gradual type theory (GTT). Based on Levy's call-by-push-value, GTT allows us to axiomatize both call-by-value and call-by-name gradual languages. We then show that we can derive the behavior of casts for simple types from the corresponding eta equality principle and the assumption that the language satisfies a property called graduality, also known as the dynamic gradual guarantee. Since we can derive the semantics from the assumption of eta equality, we also obtain a useful contrapositive: any observably different cast semantics that satisfies graduality must violate eta equality. We show the consistency and applicability of our axiomatic theory by proving that a contract-based implementation using the lazy cast semantics gives a logical relations model of our type theory, in which equivalence in GTT implies contextual equivalence of the programs. Since GTT also axiomatizes the dynamic gradual guarantee, our model also establishes this central theorem of gradual typing. The model is parameterized by the implementation of the dynamic types, and so gives a family of implementations that validate type-based optimization and the gradual guarantee.
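The wrapping behavior that the eta law forces on function casts can be sketched in a few lines. This is an illustrative toy in the style of higher-order contracts, not the paper's GTT calculus: a cast at a function type cannot inspect the function, so by eta it must wrap it, casting the argument contravariantly and the result covariantly, with result-side failures surfacing lazily at call sites.

```python
class CastError(Exception):
    """Raised when a runtime type cast fails."""

def cast(value, ty):
    """Runtime cast of a dynamically typed value to type `ty`.
    `ty` is 'int', 'str', or a tuple ('fun', dom, cod)."""
    if ty == "int":
        if isinstance(value, int):
            return value
        raise CastError(f"{value!r} is not an int")
    if ty == "str":
        if isinstance(value, str):
            return value
        raise CastError(f"{value!r} is not a str")
    if isinstance(ty, tuple) and ty[0] == "fun":
        _, dom, cod = ty
        if not callable(value):
            raise CastError(f"{value!r} is not a function")
        # Eta-expansion: wrap rather than inspect the function.
        return lambda x: cast(value(cast(x, dom)), cod)
    raise ValueError(f"unknown type {ty!r}")

# A dynamically typed function, cast to int -> int:
dyn_f = lambda x: x + 1
f = cast(dyn_f, ("fun", "int", "int"))
print(f(41))  # 42: both the argument and result casts succeed

# The cast itself succeeds here; the failure is deferred to the call,
# when the result cast discovers a str where an int was promised.
g = cast(lambda x: "oops", ("fun", "int", "int"))
try:
    g(0)
except CastError as e:
    print("cast failed:", e)
```

The deferred failure in `g` is the "lazy cast semantics" flavor of enforcement; an eager semantics would have to reject the function up front, which eta rules out for opaque functions.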


Author(s):  
Tabithah Kabui Kimani ◽  
Paul Omato Gesimba ◽  
David Gichuhi

There are visible challenges with employee development at Telkom Kenya, where employees remain in one position for many years, some until retirement. Anecdotal evidence suggests that this problem could be linked to the design of the organization, but no systematic study has been conducted to ascertain this claim. This study therefore sought to determine the influence of work specialization on employee development at the Telkom Kenya Nakuru Branch. The study was guided by the Axiomatic Theory of Organizations. It used a correlational research design in which all 51 employees of the Telkom Kenya Nakuru Branch were included in a census study. Questionnaires were used to collect quantitative data from operational staff, while interview guides were used to gather qualitative data from heads of departments. Quantitative data were analyzed using frequencies, percentages, means, and multiple regression, while qualitative data were analyzed using thematic content analysis. Results showed that work specialization (β = 0.491, p = .000) had a statistically significant positive influence on the development of employees at TKNB. Based on the findings, the study recommends that management at TKNB enhance work specialization by implementing job rotation programs and promoting a match between employees' skills and their jobs.


2020 ◽  
Vol 07 (02) ◽  
pp. 155-181
Author(s):  
Selmer Bringsjord ◽  
Naveen Sundar Govindarajulu

We provide an overview of the theory of cognitive consciousness (TCC) and of Λ; the latter provides a means of measuring the amount of cognitive consciousness present in a given cognizer, whether natural or artificial, at a given time, along a number of different dimensions. TCC and Λ stand in stark contrast to Tononi's Integrated Information Theory (IIT) and Φ. We believe, for reasons we present, that the former pair is superior to the latter. TCC includes a formal axiomatic theory, the 12 axioms of which we present and briefly comment upon herein; no such formal theory accompanies IIT/Φ. TCC/Λ and IIT/Φ each offer radically different verdicts as to whether and to what degree AIs of yesterday, today, and tomorrow were/are/will be conscious. Another noteworthy difference between TCC/Λ and IIT/Φ is that the former enables the measurement of cognitive consciousness in those who have passed on, and in fictional characters; no such enablement is remotely possible for IIT/Φ. For instance, we apply Λ to measure the cognitive consciousness of Descartes, and of the first fictional detective described on Earth (by Edgar Allan Poe), Auguste Dupin. We also apply Λ to compute the cognitive consciousness of an artificial agent able to make ethical decisions using the Doctrine of Double Effect.

