theory of measurement
Recently Published Documents

TOTAL DOCUMENTS: 289 (five years: 31)
H-INDEX: 29 (five years: 1)

2022 ◽  
pp. 55-76
Author(s):  
Marcelo de Carvalho Alves ◽  
Luciana Sanches

2021 ◽  
Vol 11 (3) ◽  
Author(s):  
Elina Vessonen

The Representational Theory of Measurement (RTM) is the best known account of the kind of representation measurement requires. However, RTM has been challenged from various angles, with critics claiming, e.g., that RTM fails to account for actual measurement practice and that it is ambiguous about the nature of measurable attributes. In this paper I use the critical literature on RTM to formulate Representation Minimalism – a characterization of what measurement-relevant representation requires at the minimum. I argue that Representation Minimalism avoids the main problems with RTM while acknowledging its usefulness as the formal foundation of representation in measurement.


2021 ◽  
Vol 2 (2) ◽  
pp. 111-119
Author(s):  
Joko Syahputra ◽  
Alex Rikki

A thesis is a requirement for obtaining an undergraduate degree (S1) at every university in Indonesia; the term "thesis" for an undergraduate final project is used only in Indonesia. To complete the thesis, students are required to apply what they have learned in a particular field, whether in coursework or in the form of a computer application. The Analytical Hierarchy Process (AHP) is a general theory of measurement used to derive ratio scales from both discrete and continuous pairwise comparisons. AHP can decompose complex, unstructured, strategic, and dynamic problems into their parts and arrange the variables in a hierarchy (levels). The AHP method is able to solve multi-objective problems based on comparisons of the preferences for each element in the hierarchy.
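The ratio-scale weighting that AHP derives from pairwise comparisons can be sketched as follows. This is a generic illustration with invented comparison values on Saaty's 1–9 scale, not the decision model of the paper; the row geometric-mean method is one common approximation of the AHP priority vector.

```python
from math import prod

def ahp_weights(A):
    """Approximate the AHP priority vector of a pairwise comparison
    matrix A using the row geometric-mean method."""
    n = len(A)
    gms = [prod(row) ** (1.0 / n) for row in A]  # geometric mean of each row
    total = sum(gms)
    return [g / total for g in gms]             # normalize to sum to 1

# Hypothetical comparison of three criteria: A[i][j] says how much more
# important criterion i is than criterion j (reciprocals below the diagonal).
A = [
    [1,     3,     5],
    [1/3,   1,     3],
    [1/5,   1/3,   1],
]
weights = ahp_weights(A)  # the first criterion receives the largest weight
```

The principal-eigenvector method of classical AHP gives very similar weights for consistent matrices; the geometric mean is used here because it needs no linear-algebra dependency.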


Entropy ◽  
2021 ◽  
Vol 23 (1) ◽  
pp. 121
Author(s):  
Claudio Garola

Most scholars maintain that quantum mechanics (QM) is a contextual theory and that quantum probability does not allow for an epistemic (ignorance) interpretation. By inquiring into possible connections between contextuality and non-classical probabilities, we show that a class TμMP of theories can be selected in which probabilities are introduced as classical averages of Kolmogorovian probabilities over sets of (microscopic) contexts, which endows them with an epistemic interpretation. The conditions characterizing TμMP are compatible with classical mechanics (CM), statistical mechanics (SM), and QM, hence we assume that these theories belong to TμMP. In the case of CM and SM this assumption is irrelevant, as all of the notions introduced in them as members of TμMP reduce to standard notions. In the case of QM, it leads to interpreting quantum probability as a derived notion in a Kolmogorovian framework, explains why it is non-Kolmogorovian, and provides it with an epistemic interpretation. These results were anticipated in a previous paper, but they are obtained here in a general framework without reference to individual objects, which shows that they hold even if only a minimal (statistical) interpretation of QM is adopted in order to avoid the problems following from the standard quantum theory of measurement.
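The construction of probabilities as classical averages of Kolmogorovian probabilities over contexts is, numerically, just the law of total probability. A toy sketch with invented numbers (not values from the paper):

```python
# A measure mu over (microscopic) contexts, and a Kolmogorovian
# probability p_c(E) of an event E within each context c.
mu = {"c1": 0.4, "c2": 0.6}
p_in_context = {"c1": 0.9, "c2": 0.25}

# Overall probability of E: the classical (epistemic) average over contexts.
p_E = sum(mu[c] * p_in_context[c] for c in mu)  # 0.4*0.9 + 0.6*0.25 = 0.51
```

Each p_c(E) obeys the Kolmogorov axioms within its context; the averaged p_E need not obey them jointly across incompatible events, which is the source of the non-Kolmogorovian behaviour the abstract describes.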


2021 ◽  
Vol 1 (1) ◽  
pp. 18-27
Author(s):  
L. S. Zvyagin

The development of the general theory of measurement has stimulated the expansion of metrological requirements and of the nomenclature of data-quality indicators, as well as the emergence of new types of measurement, metrological certification, and control, in particular of metrological algorithms, models, objects, and measurement conditions. As a rule, modern measurement problems involve complex experimental conditions: significant a priori uncertainty about the properties of the objects and of the factors affecting the environment in which they operate, and about the relationships between them; inaccurate and incomplete experimental information; and the inaccessibility of many properties of the objects or influencing factors to direct observation. This distinguishes the cognitive function of the methodology for solving such problems from its fundamental one. Formulating these problems as measurement problems therefore increases the role of the cognitive function of measurement and requires that their solutions take the form of knowledge (analytical expressions for models, as well as conclusions and decisions) based on the entire body of a priori information and of information obtained during the measurement experiment, including non-numerical information.


2020 ◽  
Vol 2 (4) ◽  
pp. 600-616
Author(s):  
Andrea Oldofredi

It is generally accepted that quantum mechanics entails a revision of the classical propositional calculus as a consequence of its physical content. However, the universal claim according to which a new quantum logic is indispensable in order to model the propositions of every quantum theory is challenged. In the present essay, we critically discuss this claim by showing that classical logic can be rehabilitated in a quantum context by taking into account Bohmian mechanics. It will be argued, indeed, that such a theoretical framework provides the necessary conceptual tools to reintroduce a classical logic of experimental propositions by virtue of its clear metaphysical picture and its theory of measurement. More precisely, it will be shown that the rehabilitation of a classical propositional calculus is a consequence of the primitive ontology of the theory, a fact that is not yet sufficiently recognized in the literature concerning Bohmian mechanics. This work aims to fill this gap.


2020 ◽  
pp. 32-78
Author(s):  
Pieter Adriaans

A computational theory of meaning tries to understand the phenomenon of meaning in terms of computation. Here we give an analysis in the context of Kolmogorov complexity. This theory measures the complexity of a data set in terms of the length of the smallest program that generates the data set on a universal computer. As a natural extension, the set of all programs that produce a data set on a computer can be interpreted as the set of meanings of the data set. We give an analysis of the Kolmogorov structure function and some other attempts to formulate a mathematical theory of meaning in terms of two-part optimal model selection. We show that such theories will always be context-dependent: the invariance conditions that make Kolmogorov complexity a valid theory of measurement fail for this more general notion of meaning. One cause is the notion of polysemy: one data set (i.e., a string of symbols) can have different programs with no mutual information that compress it. Another cause is the existence of recursive bijections between ℕ and ℕ² for which the two-part code is always more efficient. This generates vacuous optimal two-part codes. We introduce a formal framework to study such contexts in the form of a theory that generalizes the concept of Turing machines to learning agents that have a memory and have access to each other's functions in terms of a possible-world semantics. In such a framework, the notions of randomness and informativeness become agent-dependent. We show that such a rich framework explains many of the anomalies of the correct theory of algorithmic complexity. It also provides perspectives for, among other things, the study of cognitive and social processes. Finally, we sketch some application paradigms of the theory.
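K(x) itself is uncomputable, but the "length of the smallest program" idea can be approximated from above with a general-purpose compressor. Using zlib as a stand-in for the universal machine is an assumption of this sketch, not something the chapter proposes:

```python
import os
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Length of a zlib compression of the data: an upper-bound proxy
    for Kolmogorov complexity (the true K(x) is uncomputable)."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500      # highly regular: a short program generates it
noise = os.urandom(1000)   # incompressible with overwhelming probability

# The regular string compresses far below noise of the same length,
# mirroring the gap between low and high Kolmogorov complexity.
```

Note the invariance caveat from the abstract: the bound depends on the choice of compressor, which is exactly the kind of context dependence that becomes fatal for the more general notion of meaning.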


Entropy ◽  
2020 ◽  
Vol 22 (10) ◽  
pp. 1147
Author(s):  
Valentin Lychagin ◽  
Mikhail Roop

We present the development of an approach to thermodynamics based on measurement. First of all, we recall that, by considering classical thermodynamics as a theory of measurement of extensive variables, one obtains a description of thermodynamic states as Legendrian or Lagrangian manifolds representing the averages of measurable quantities and the extremal measures. Secondly, the variance of the random vectors induces Riemannian structures on the corresponding manifolds. Computing higher-order central moments, one arrives at the corresponding higher-order structures, namely the cubic and the fourth-order forms. The cubic form is responsible for the skewness of the extremal distribution; the condition that it vanish yields the so-called symmetric processes. The positivity of the fourth-order structure gives an additional requirement on the thermodynamic state.
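The role of the cubic form can be illustrated with ordinary sample moments: the third central moment vanishes exactly for a symmetric distribution and is positive for a right-skewed one. A generic numerical sketch, not the geometric construction of the paper:

```python
def central_moment(xs, k):
    """k-th central moment of a finite sample."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** k for x in xs) / len(xs)

symmetric = [-2, -1, 0, 1, 2]   # symmetric about its mean
skewed = [0, 0, 0, 1, 4]        # long right tail

mu3_sym = central_moment(symmetric, 3)   # vanishes: a "symmetric process"
mu3_skew = central_moment(skewed, 3)     # positive: right-skewed
```

In the paper's setting the analogous condition, vanishing of the cubic form built from third central moments of the extremal measure, singles out the symmetric processes.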

