ratio scales
Recently Published Documents


TOTAL DOCUMENTS

69
(FIVE YEARS 12)

H-INDEX

19
(FIVE YEARS 1)

SAGE Open ◽  
2022 ◽  
Vol 12 (1) ◽  
pp. 215824402110684
Author(s):  
Zhonggen Yu ◽  
Mingle Gao

Although the flipped pedagogical approach has been exhaustively explored, the effect of video length remains sparsely studied. Through a mixed design, videos, and three ratio scales, this study determined the effect of video length on English proficiency, student engagement, and satisfaction in a flipped English classroom in China. We concluded that: (1) the flipped English classroom assisted with short videos (shorter than 5 minutes) led to significantly higher English proficiency, student engagement, and satisfaction than the classroom assisted with medium-length videos (10–20 minutes); and (2) the classroom assisted with medium-length videos led to significantly higher English proficiency, student engagement, and satisfaction than the classroom assisted with long videos (longer than 30 minutes). Video designers could accordingly make every effort to provide short videos to improve English proficiency, engagement, and satisfaction, and could also consider a scale, a mobile platform, quizzes, pictures, and multimedia in the future design and innovation of videos.


2021 ◽  
Vol 14 (9) ◽  
pp. 6181-6193
Author(s):  
Nobuyuki Aoki ◽  
Shigeyuki Ishidoya ◽  
Yasunori Tohjima ◽  
Shinji Morimoto ◽  
Ralph F. Keeling ◽  
...  

Abstract. A study was conducted to compare the δ(O2/N2) scales used by four laboratories engaged in atmospheric δ(O2/N2) measurements. These laboratories are the Research Institute for Environmental Management Technology, Advanced Industrial Science and Technology (EMRI/AIST); the National Institute for Environmental Studies (NIES); Tohoku University (TU); and Scripps Institution of Oceanography (SIO). To this end, five high-precision standard mixtures for the O2 molar fraction, gravimetrically prepared by the National Metrology Institute of Japan, AIST (NMIJ/AIST) with a standard uncertainty of less than 5 per meg (0.001 ‰), were used as round-robin standard mixtures. EMRI/AIST, NIES, TU, and SIO reported the analyzed values of the standard mixtures on their own δ(O2/N2) scales, and the values were compared with the δ(O2/N2) values gravimetrically determined by NMIJ/AIST (the NMIJ/AIST scale). The δ(O2/N2) temporal drift in the five standard mixtures during the intercomparison experiment from May 2017 to March 2020 was corrected based on the δ(O2/N2) values analyzed before and after the laboratory measurements by EMRI/AIST. The scales are compared based on offsets in zero and span. The relative span offsets of the EMRI/AIST, TU, NIES, and SIO scales against the NMIJ/AIST scale were −0.11 % ± 0.10 %, −0.10 % ± 0.13 %, 3.39 % ± 0.13 %, and 0.93 % ± 0.10 %, respectively. The largest offset corresponded to a 0.30 Pg yr−1 decrease and increase in global estimates for land biospheric and oceanic CO2 uptakes based on trends in atmospheric CO2 and δ(O2/N2). The deviations of the measured δ(O2/N2) values on the laboratory scales from the NMIJ/AIST scale are 65.8 ± 2.2, 425.7 ± 3.1, 404.5 ± 3.0, and 596.4 ± 2.4 per meg for EMRI/AIST, TU, NIES, and SIO, respectively. The difference between the atmospheric δ(O2/N2) values observed at Hateruma Island (HAT; 24.05° N, 123.81° E), Japan, by EMRI/AIST and NIES was reduced from −329.3 ± 6.9 to −6.6 ± 6.8 per meg by converting their scales to the NMIJ/AIST scale.
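A zero-and-span comparison like the one above implies a linear relation between laboratory scales, so converting a measurement from one scale to a reference scale combines a span (slope) correction with a zero (origin) shift. The sketch below is a minimal Python illustration assuming a simple linear convention; the function and all numbers are invented, not the paper's values:

```python
def convert_scale(delta, zero_offset, span_offset):
    """Convert a delta(O2/N2) value (per meg) from a laboratory scale to a
    reference scale, assuming a linear scale relation: the span offset
    rescales values and the zero offset shifts the origin.
    Illustrative convention only; real conversions follow each
    laboratory's documented scale definition."""
    return delta * (1.0 + span_offset) - zero_offset

# Hypothetical example: a 1 % span offset and a 100 per meg zero offset.
value_on_reference = convert_scale(-500.0, zero_offset=100.0, span_offset=0.01)
```

With these invented offsets, −500 per meg on the laboratory scale maps to about −605 per meg on the reference scale.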


2021 ◽  
Author(s):  
Nobuyuki Aoki ◽  
Shigeyuki Ishidoya ◽  
Yasunori Tohjima ◽  
Shinji Morimoto ◽  
Ralph F. Keeling ◽  
...  

Abstract. A study was conducted to compare the δ(O2/N2) scales used by four laboratories engaged in atmospheric δ(O2/N2) measurements. These laboratories are the Research Institute for Environmental Management Technology, Advanced Industrial Science and Technology (EMRI/AIST), the National Institute for Environmental Studies (NIES), Tohoku University (TU), and Scripps Institution of Oceanography (SIO). To this end, five high-precision standard mixtures for O2 molar fraction, gravimetrically prepared by the National Metrology Institute of Japan (NMIJ), AIST (NMIJ/AIST) with a standard uncertainty of less than 5 per meg, were used as round-robin standard mixtures. EMRI/AIST, NIES, TU, and SIO reported the analysed values of the standard mixtures on their own δ(O2/N2) scales, and the values were compared with the δ(O2/N2) values gravimetrically determined by NMIJ/AIST (the NMIJ/AIST scale). The δ(O2/N2) temporal drift in the five standard mixtures during the inter-comparison experiment was corrected based on the δ(O2/N2) values analysed before and after the experiments by EMRI/AIST. The scales are compared based on offsets in zero and span. The span offsets from the NMIJ/AIST scale ranged from −0.17 % to 3.3 %, corresponding to a difference of 0.29 Pg yr−1 in the estimates for land biospheric and oceanic CO2 uptakes. The zero offsets from the NMIJ/AIST scale are −581.0 ± 2.2, −221.4 ± 3.1, −243.0 ± 3.0, and −50.7 ± 2.4 per meg for EMRI/AIST, TU, NIES, and SIO, respectively. The atmospheric δ(O2/N2) values observed at Hateruma Island (HAT; 24.05° N, 123.81° E), Japan, by EMRI/AIST and NIES became comparable after converting their scales to the NMIJ/AIST scale.


2021 ◽  
pp. 111-128 ◽  
Author(s):  
Ahmet Aytekin ◽  
Hasan Durucasu

In MCDM problems, the decision maker is often ready to adopt the solution closest to the reference values in a choice or ranking problem. The reference values represent the desired results, established subjectively by the decision maker or determined through various scientific tools. In a given criterion, the reference value could be the maximum value, the minimum value, or a specific value or range. Moreover, the acceptance degrees of ranges outside the reference may differ within a criterion. Furthermore, measurements in a criterion may have been obtained on any of the nominal, ordinal, interval, or ratio scales. For decision problems that include qualitative criteria, existing MCDM methods cannot reach a solution without first scaling the criteria. The purpose of this study is to propose the Nearest Solution to References (REF) method, a novel reference-based MCDM method, for solving decision problems with a mixed data structure in which references can be determined for the criteria.
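The general idea of ranking alternatives by closeness to per-criterion references can be sketched generically. The snippet below is not the authors' REF method, only an illustrative distance-to-reference ranking with range normalisation; the data and weights are invented:

```python
def rank_by_reference(alternatives, references, weights):
    """Rank alternatives by weighted closeness to per-criterion reference
    values. Each criterion is normalised by its observed range so mixed
    units become comparable; a smaller total distance ranks higher.
    Generic sketch only, not the REF method from the paper."""
    columns = list(zip(*alternatives))
    spans = [max(col) - min(col) or 1.0 for col in columns]  # avoid div-by-zero

    def distance(alt):
        return sum(w * abs(a - r) / s
                   for a, r, s, w in zip(alt, references, spans, weights))

    return sorted(range(len(alternatives)), key=lambda i: distance(alternatives[i]))

# Invented data: three alternatives, two criteria with references 5 and 50.
order = rank_by_reference([[3, 40], [5, 35], [4, 50]],
                          references=[5, 50], weights=[0.5, 0.5])
```

Here the third alternative ranks first because it matches one reference exactly and is closest overall.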


Nanophotonics ◽  
2020 ◽  
Vol 10 (1) ◽  
pp. 513-521 ◽  
Author(s):  
Charles A. Downing ◽  
Luis Martín-Moreno

Abstract. We consider a periodic chain of oscillating dipoles, interacting via long-range dipole–dipole interactions, embedded inside a cuboid cavity waveguide. We show that the mixing between the dipolar excitations and cavity photons into polaritons can lead to the appearance of new states localized at the ends of the dipolar chain, which are reminiscent of Tamm surface states found in electronic systems. A crucial requirement for the formation of polaritonic Tamm states is that the cavity cross section is above a critical size. Above this threshold, the degree of localization of the Tamm states is highly dependent on the cavity size, since their participation ratio scales linearly with the cavity cross-sectional area. Our findings may be important for quantum confinement effects in one-dimensional systems with strong light–matter coupling.
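The participation ratio invoked here as a localisation measure has a standard definition, PR = (Σᵢ|ψᵢ|²)² / Σᵢ|ψᵢ|⁴. A small illustrative sketch (the states are toy examples, not the paper's polaritonic modes):

```python
def participation_ratio(psi):
    """Participation ratio PR = (sum |psi_i|^2)^2 / sum |psi_i|^4.
    PR is about N for a state spread uniformly over N sites and
    about 1 for a state confined to a single site."""
    weights = [abs(amplitude) ** 2 for amplitude in psi]
    return sum(weights) ** 2 / sum(w ** 2 for w in weights)

extended  = [1.0] * 10          # uniform over 10 sites
localised = [1.0] + [0.0] * 9   # confined to a single site
```

The extended state gives PR = 10 and the localised one PR = 1, which is why PR tracks the degree of localisation.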


F1000Research ◽  
2020 ◽  
Vol 9 ◽  
pp. 1048
Author(s):  
Paul C. Langley ◽  
Stephen P. McKenna

Over the past 30 years, a mainstay of health technology assessment has been the creation of modeled incremental cost-per-quality-adjusted life year (QALY) claims. These are intended to inform resource allocation decisions. Unfortunately, the reliance on the construction of QALYs from generic utility scales is misplaced. Those advocating QALY-based lifetime modeled claims fail to appreciate the limitations placed on these constructs by the axioms of fundamental measurement. Utility scales, such as those created by the EQ-5D-3L instrument, are nothing more than multidimensional, ordinal scales. Such scales cannot support basic arithmetic operations. Interval scales can support addition and subtraction; ratio scales support the further operations of multiplication and division. Those who advocate the construction of QALYs fail to appreciate that such an operation is only possible if the utility scale is unidimensional and has ratio properties with a true zero. The utility measures available do not meet these requirements. As we cannot produce meaningful utility values, the QALY is an invalid construct. Consequently, incremental cost-per-QALY claims are impossible to sustain, and the application of cost-per-QALY thresholds is meaningless. As utility is a latent, unidimensional variable, the best a measure of utility could achieve would be unidimensionality and interval scaling properties. Where such measures are available, they could support claims for response to therapy. Consequently, there would be no need to continue constructing imaginary lifetime value assessment frameworks. Admitting that the QALY is a fatally flawed construct means rejecting 30 years of cost-per-QALY models.
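The article's central objection, that ordinal scales cannot support averaging, can be demonstrated concretely: any order-preserving relabelling of ordinal scores is admissible, yet it can reverse which group has the higher mean. A small sketch with invented scores:

```python
# Invented ordinal ratings on a 1-4 scale for two groups.
group_a = [1, 4]
group_b = [2, 3]

# Two admissible (order-preserving) relabellings of the same four categories.
stretch_top    = {1: 1, 2: 2, 3: 3, 4: 100}
stretch_bottom = {1: -100, 2: 2, 3: 3, 4: 4}

mean = lambda xs: sum(xs) / len(xs)

a_top = mean([stretch_top[x] for x in group_a])        # 50.5
b_top = mean([stretch_top[x] for x in group_b])        # 2.5
a_bottom = mean([stretch_bottom[x] for x in group_a])  # -48.0
b_bottom = mean([stretch_bottom[x] for x in group_b])  # 2.5

# Under one relabelling group A has the higher mean; under the other it has
# the lower mean, so a "mean utility" on an ordinal scale is not meaningful.
```

Since the relabellings preserve all ordinal information, yet flip the comparison of means, the mean carries no scale-invariant meaning for ordinal data.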


2020 ◽  
Author(s):  
Dao Duy Tung

A measurement scale is used to qualify or quantify data variables in statistics, and it determines the kinds of techniques to be used for statistical analysis. Measurement scales cover both qualitative and quantitative data: nominal and ordinal scales are used to measure qualitative data, while interval and ratio scales are used to measure quantitative data.
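The pairing of scale types with the statistical techniques they support can be summarised in a short sketch following Stevens' classic taxonomy; the mapping below is illustrative, not exhaustive:

```python
# Permissible descriptive statistics by scale type (Stevens' taxonomy).
# Each scale supports the statistics of the scales below it, plus its own.
PERMISSIBLE = {
    "nominal":  {"mode"},
    "ordinal":  {"mode", "median"},
    "interval": {"mode", "median", "mean"},
    "ratio":    {"mode", "median", "mean", "geometric_mean"},
}

def allowed(scale, statistic):
    """True if the statistic is meaningful for data measured on the scale."""
    return statistic in PERMISSIBLE[scale]
```

For example, a mean is meaningful for interval or ratio data but not for ordinal ratings, which only support the median and mode.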


Author(s):  
J. E. Wolff

This chapter addresses two challenges for using the representational theory of measurement (RTM) as a basis for a metaphysics of quantities. The first is the dominant interpretation of representationalism as being committed to operationalism and empiricism. The chapter argues in favour of treating RTM itself as a mathematical framework open to different interpretations and proposes a more realist understanding of RTM, which treats the mapping between represented and representing structure as an isomorphism rather than a mere homomorphism. This adjustment then enables us to address the second challenge, which is the permissivism present in standard representationalism, according to which there is no special division into quantitative and non-quantitative attributes. Based on results in abstract measurement theory, the chapter argues that, on the contrary, RTM provides the means to draw such a distinction at an intuitively plausible place: only attributes representable on ‘super-ratio scales’ are quantitative.


2020 ◽  
Vol 28 (1) ◽  
pp. 80-90
Author(s):  
Krzysztof Dmytrów ◽  
Anna Gdakowicz ◽  
Ewa Putek-Szeląg

Abstract. Variables occurring in a real estate market are frequently presented on scales other than interval or ratio scales. Most frequently, the scale is ordinal (for instance: onerous, unfavourable, neutral, favourable), or possibly nominal. That is why measures intended for quantitative attributes (such as the Pearson linear correlation coefficient) cannot be used. The paper presents the results of employing other coefficients (Kendall's τB and Spearman's ρ) in analyzing correlations on the real estate market. The objective of the article is to present a method of analyzing the correlation of qualitative variables (attributes) and the possibility of using the obtained results in the process of real estate appraisal.
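A minimal sketch of the tie-corrected Kendall τB coefficient used for such ordinal attributes, with invented ratings; the pure-Python implementation is for illustration (library routines such as scipy.stats.kendalltau do the same job):

```python
from collections import Counter
from itertools import combinations
from math import sqrt

def kendall_tau_b(x, y):
    """Kendall's tau-b rank correlation for ordinal data (tie-corrected)."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        sign = (x1 - x2) * (y1 - y2)
        if sign > 0:
            concordant += 1
        elif sign < 0:
            discordant += 1
    n0 = len(x) * (len(x) - 1) // 2
    tx = sum(t * (t - 1) // 2 for t in Counter(x).values())  # ties in x
    ty = sum(t * (t - 1) // 2 for t in Counter(y).values())  # ties in y
    return (concordant - discordant) / sqrt((n0 - tx) * (n0 - ty))

# Invented ordinal attributes encoded as ranks (1 = onerous ... 4 = favourable).
location = [1, 2, 2, 3, 4, 4, 3, 1]
access   = [1, 1, 2, 3, 3, 4, 4, 2]
tau = kendall_tau_b(location, access)   # 2/3, a clear positive association
```

Because τB works only with rank order and corrects for ties, it remains valid where the Pearson coefficient does not.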


2019 ◽  
Vol 11 (4) ◽  
pp. 489-538
Author(s):  
Victor Tang

Purpose: The purpose of this paper is to present a fresh approach to stimulating individual creativity. It introduces a mathematical representation for creative ideas, six creativity operators and methods of matrix algebra to evaluate, improve and stimulate creative ideas. Creativity begins with ideas to resolve a problem or tackle an opportunity. By definition, a creative idea must be simultaneously novel and useful. To inject analytic rigor into these concepts, the author introduces a feature-attribute matrix construct to represent ideas, creativity operators that use ideas as operands, and methods of matrix algebra. It is demonstrated that it is now possible to analytically and quantitatively evaluate the intensity of the variables that make an idea more, equally, or less creative than another. The six creativity operators are illustrated with detailed multi-disciplinary real-world examples, and the mathematics and working principles of each operator are discussed.

Design/methodology/approach: The unit of analysis is ideas, not theory. Ideas are man-made artifacts, represented here by an original feature-attribute matrix construct. Using matrix algebra, idea matrices can be manipulated to improve their creative intensity, which is now quantitatively measurable. Unlike atoms and cute rabbits, creative ideas do not occur in nature; only people can conceive and develop creative ideas for embodiment in physical forms, non-physical forms, or a mix of both, for example as widgets, abstract theorems, business processes, symphonies, organization structures, and so on. The feature-attribute matrix construct is used to represent novelty and usefulness, and the multiplicative product of these two matrices forms the creativity matrix. Six creativity operators and matrix algebra are introduced to stimulate and measure creative ideas; the operators use idea matrices as operands. Uses of the six operators are demonstrated using multi-disciplinary real-world examples. Metrics for novelty, usefulness and creativity are on ratio scales, grounded in the Weber–Fechner law, which concerns a person's ability to discern differences in the intensity of stimuli.

Findings: Ideas are represented using feature-attribute matrices. This construct represents novel, useful and creative ideas with more clarity and precision than before, showing how to unambiguously represent creative ideas endowed with novelty and usefulness. Applying matrix algebra to idea matrices makes it possible to analyze multi-disciplinary, real-world cases of creative ideas with clarity and discriminatory power, uncovering insights about novelty and usefulness. Idea matrices and the methods of matrix algebra have strong explanatory and predictive power. Using matrix algebra and eigenvalue analyses of idea matrices, it is demonstrated how to quantitatively rank ideas, features and attributes of creative ideas. Matrix methods operationalize and quantitatively measure creativity, novelty and usefulness, so the specific elementary variables that characterize these factors can now be quantitatively ranked. Creativity, novelty and usefulness are no longer treated as monolithic, irreducible, vague "lumpy" qualitative factors, but as explicit sets of elementary, specific and measurable variables on ratio scales. This significantly improves the acuity and discriminatory power of analyses of creative ideas. The feature-attribute matrix approach and its operators are conceptually consistent with, and complementary to, key extant theories of engineering design and creativity.

Originality/value: This paper is the first to define and specify ideas as feature-attribute matrices. It is demonstrated that creative, novel and useful ideas can be analytically and unambiguously specified and measured for creativity; verbose qualitative narratives will no longer be the exclusive means of specifying creative ideas, but will instead complement the matrix specifications. The paper is the first to specify six creativity operators that enable matrix algebra to operate on idea matrices as operands to generate new ideas, a capability that informs and guides a person's intuition. The myth of, and dependency on, non-repeatable or non-reproducible serendipity, flashes of "eureka" moments or divine inspiration can now be vacated, though their existence cannot be ruled out. It is also the first to specify matrix-algebra and eigenvalue methods for quantitative analyses of feature-attribute matrices that rank the importance of the elementary variables characterizing novelty, usefulness and creativity. Verbose qualitative narratives of these factors as monolithic "lumpy" constructs, which risk being ambiguous, imprecise, unreliable and non-reproducible, can give way to more reliable and consistent analytic methods. Finally, it is the first to specify a method of "attacking the negatives" to systematically pinpoint improvements to an idea's novelty, usefulness and creativity, a procedure that informs and methodically guides the improvement of deficient ideas.
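As a purely illustrative sketch of the kind of operation this abstract describes, the snippet below combines an invented feature-attribute novelty matrix with a usefulness matrix and ranks features via an eigen-analysis. Reading the "multiplicative product" as an elementwise product is an assumption; none of the numbers or conventions are taken from the paper:

```python
import numpy as np

# Invented 3-feature x 2-attribute matrices scoring novelty and usefulness.
novelty    = np.array([[3.0, 1.0],
                       [2.0, 4.0],
                       [5.0, 2.0]])
usefulness = np.array([[2.0, 3.0],
                       [4.0, 1.0],
                       [3.0, 5.0]])

# One reading of the "multiplicative product": an elementwise (Hadamard)
# product, so each cell scores how novel AND useful a feature-attribute
# pairing is. This reading is an assumption, not the paper's definition.
creativity = novelty * usefulness

# Rank features by the dominant eigenvector of C @ C.T, a stand-in for the
# eigenvalue ranking the abstract mentions.
eigenvalues, eigenvectors = np.linalg.eigh(creativity @ creativity.T)
feature_weights = np.abs(eigenvectors[:, -1])   # eigh sorts ascending
ranking = np.argsort(feature_weights)[::-1]     # most important feature first
```

With these invented scores, the third feature dominates the creativity matrix and ranks first.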

