An Empirical Test of Formal Equivalence between Emmert's Law and the Size-Distance Invariance Hypothesis

2006 ◽  
Vol 9 (2) ◽  
pp. 295-299 ◽  
Author(s):  
Mariko Imamura ◽  
Sachio Nakamizo

Emmert's law and the size-distance invariance hypothesis have been said to be formally equivalent, provided that Emmert's law is taken to mean that the perceived size of an afterimage is proportional to the perceived distance of the surface onto which it is projected. However, very few studies have attempted to verify this formal equivalence empirically. We measured both the perceived size and the perceived distance of afterimages and of real objects with the same proximal size. Nineteen participants projected afterimages subtending 1 deg of visual angle onto walls located 1 to 23 meters away. They also observed real objects (discs cut from Styrofoam board) with the same proximal size as the afterimages, placed at the same physical distances as the walls on which the afterimages were projected. Each participant reproduced the apparent sizes of the afterimages and real objects using the reproduction method and estimated their apparent distances using the magnitude estimation method. When the mean apparent sizes of the afterimages and real objects, expressed as a function of apparent distance, were fitted with a linear function, the slopes for the afterimages and real objects did not differ significantly. These results are interpreted as evidence for the formal equivalence of Emmert's law and the size-distance invariance hypothesis.
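
A minimal sketch of the slope comparison described above, with hypothetical size and distance judgments standing in for the study's data (Emmert's law predicts perceived size proportional to perceived distance, so the two fitted slopes should be comparable):

```python
import numpy as np

# Hypothetical (distance, size) judgments; in the study these came from the
# magnitude-estimation and reproduction tasks, respectively.
apparent_distance = np.array([1.0, 3.0, 6.0, 10.0, 15.0, 23.0])   # metres
size_afterimage = 0.018 * apparent_distance + 0.002               # metres
size_real_object = 0.017 * apparent_distance + 0.003              # metres

# Fit a linear function to each condition and compare the slopes.
slope_ai, intercept_ai = np.polyfit(apparent_distance, size_afterimage, 1)
slope_ro, intercept_ro = np.polyfit(apparent_distance, size_real_object, 1)

print(f"afterimage slope:  {slope_ai:.4f}")
print(f"real-object slope: {slope_ro:.4f}")
```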

Sensors ◽  
2021 ◽  
Vol 21 (5) ◽  
pp. 1867 ◽
Author(s):  
Tasbiraha Athaya ◽  
Sunwoong Choi

Blood pressure (BP) monitoring is of major importance in the treatment of hypertension and of various cardiovascular diseases. Because photoplethysmogram (PPG) signals can be recorded non-invasively, much recent research has focused on measuring BP from PPG. In this paper, we propose a U-net deep learning architecture that takes a fingertip PPG signal as input and estimates the arterial BP (ABP) waveform non-invasively. From this waveform, we also derive systolic BP (SBP), diastolic BP (DBP), and mean arterial pressure (MAP). The proposed method was evaluated on a subset of 100 subjects from two publicly available databases, MIMIC and MIMIC-III. The predicted ABP waveforms correlated highly with the reference waveforms, with an average Pearson's correlation coefficient of 0.993. The mean absolute error is 3.68 ± 4.42 mmHg for SBP, 1.97 ± 2.92 mmHg for DBP, and 2.17 ± 3.06 mmHg for MAP, which satisfies the requirements of the Association for the Advancement of Medical Instrumentation (AAMI) standard and earns grade A under the British Hypertension Society (BHS) grading. The results show that the proposed method is an effective way to estimate the ABP waveform directly from fingertip PPG.
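
The abstract does not reproduce the U-net itself, but the post-processing it describes (deriving SBP/DBP/MAP from a predicted waveform and scoring it against a reference) can be sketched as below; the toy waveform, sampling rate, and peak-detection settings are assumptions, not the paper's:

```python
import numpy as np
from scipy.signal import find_peaks
from scipy.stats import pearsonr

def bp_from_abp(abp, fs=125):
    """Derive SBP/DBP/MAP from an ABP waveform via beat-wise peaks/troughs."""
    peaks, _ = find_peaks(abp, distance=fs // 2)      # systolic peaks
    troughs, _ = find_peaks(-abp, distance=fs // 2)   # diastolic troughs
    sbp, dbp = abp[peaks].mean(), abp[troughs].mean()
    map_ = dbp + (sbp - dbp) / 3.0                    # common approximation
    return sbp, dbp, map_

# Toy waveform standing in for a U-net prediction against its reference.
fs = 125
t = np.arange(0, 10, 1 / fs)
reference = 90 + 30 * np.clip(np.sin(2 * np.pi * 1.2 * t), 0, None)
predicted = reference + np.random.normal(0, 1.0, t.size)

r, _ = pearsonr(predicted, reference)
print("Pearson r:", round(r, 3))
print("SBP/DBP/MAP:", bp_from_abp(predicted, fs))
```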


2009 ◽  
Vol 139 (6) ◽  
pp. 1905-1920 ◽  
Author(s):  
G. Dierckx ◽  
J. Beirlant ◽  
D. De Waal ◽  
A. Guillou

2017 ◽  
Vol 35 (2) ◽  
pp. 211-234 ◽  
Author(s):  
Asterios Zacharakis ◽  
Maximos Kaliakatsos-Papakostas ◽  
Costas Tsougras ◽  
Emilios Cambouropoulos

The cognitive theory of conceptual blending may be employed to understand how music becomes meaningful and, at the same time, may form a basis for musical creativity per se. This work constitutes a case study in which conceptual blending is used as a creative tool for inventing musical cadences. Specifically, the perfect and the Renaissance Phrygian cadential sequences are used as input spaces to a cadence blending system that produces various cadential blends based on musicological and blending optimality criteria. A selection of "novel" cadences is subjected to empirical evaluation in order to gain a better understanding of the perceptual relationships between cadences. Pairwise dissimilarity ratings between cadences are transformed into a perceptual space, and a verbal attribute magnitude estimation method on six descriptive axes (preference, originality, tension, closure, expectancy, and fit) is used to associate the dimensions of this space with descriptive qualities (closure and tension emerged as the most prominent). The novel cadences generated by the computational blending system are mainly perceived as single-scope blends (i.e., blends in which one input space is dominant), since categorical perception seems to play a significant role (especially in relation to the upward leading-note movement). Insights into perceptual aspects of conceptual blending are presented, and ramifications for developing sophisticated creative systems are discussed.
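
Transforming pairwise dissimilarity ratings into a perceptual space is typically done with multidimensional scaling; a minimal sketch follows, with a hypothetical 4x4 dissimilarity matrix and cadence labels standing in for the study's stimuli:

```python
import numpy as np
from sklearn.manifold import MDS

# Hypothetical symmetric dissimilarity matrix for four cadences
# (0 = identical); in the study these were averaged pairwise ratings.
D = np.array([[0.0, 2.1, 4.5, 3.9],
              [2.1, 0.0, 3.8, 4.2],
              [4.5, 3.8, 0.0, 1.7],
              [3.9, 4.2, 1.7, 0.0]])

# Embed the ratings in a 2-D perceptual space.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
space = mds.fit_transform(D)

for name, xy in zip(["perfect", "Phrygian", "blend A", "blend B"], space):
    print(f"{name:9s} -> {xy}")
```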


Mathematics ◽  
2020 ◽  
Vol 8 (9) ◽  
pp. 1578 ◽  
Author(s):  
Hazem Al-Mofleh ◽  
Ahmed Z. Afify ◽  
Noor Akma Ibrahim

In this paper, a new two-parameter generalized Ramos–Louzada distribution is proposed. The proposed model provides greater flexibility in modeling data with increasing, decreasing, J-shaped, and reversed-J-shaped hazard rate functions. Several statistical properties of the model are derived. The unknown parameters of the new distribution are estimated using eight frequentist estimation approaches. Comparing these approaches is important for developing guidelines on choosing the best estimation method for the model parameters, which would be of great interest to practitioners and applied statisticians. Detailed numerical simulations are presented to examine the bias and mean square error of the proposed estimators. The best estimation method and the ordering of the estimators' performance were determined using the partial and overall ranks of all estimation methods across various parameter combinations. The performance of the proposed distribution is illustrated using two real datasets from medicine and geology; on both, the new model fits better than the Marshall–Olkin exponential, exponentiated exponential, beta exponential, gamma, Poisson–Lomax, Lindley geometric, generalized Lindley, and Lindley distributions, among others.
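
The bias/MSE simulation design described above can be sketched generically. Since the abstract does not reproduce the generalized Ramos–Louzada pdf, a gamma distribution stands in for it here, and only two of the eight methods (maximum likelihood and method of moments) are compared; all values are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
shape_true, scale_true, n, reps = 2.0, 1.5, 100, 500

mle, mom = [], []
for _ in range(reps):
    x = rng.gamma(shape_true, scale_true, n)
    a_hat, _, s_hat = stats.gamma.fit(x, floc=0)   # maximum likelihood
    mle.append((a_hat, s_hat))
    m, v = x.mean(), x.var()
    mom.append((m * m / v, v / m))                 # method of moments

truth = np.array([shape_true, scale_true])
for name, est in [("MLE", np.array(mle)), ("MoM", np.array(mom))]:
    bias = est.mean(axis=0) - truth
    mse = ((est - truth) ** 2).mean(axis=0)
    print(name, "bias:", bias.round(3), "MSE:", mse.round(3))
```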


2020 ◽  
Vol 30 (01) ◽  
pp. 2050003 ◽
Author(s):  
Wenjie Peng ◽  
Kaiqi Fu ◽  
Wei Zhang ◽  
Yanlu Xie ◽  
Jinsong Zhang

Pitch-range estimation from brief speech segments can benefit many tasks, such as automatic speech recognition and speaker recognition. To estimate pitch range, previous studies have proposed deep-learning-based models that take spectral information as input, and have demonstrated that such models still achieve reliable estimates when the speech segment is as brief as 300 ms. In this study, we evaluated the robustness of this method under three conditions: (1) a large number of training speakers; (2) different language backgrounds; and (3) monosyllabic utterances with different tones. Experimental results showed that (1) using a large number of training speakers improved estimation accuracy; (2) the mean absolute percentage error (MAPE) on L2 speakers is similar to that on native speakers; and (3) tonal information affects the LSTM-based model, but this influence is limited compared with the baseline method, which calculates pitch-range targets from the distribution of F0 values. These results verify the efficiency of the LSTM-based pitch-range estimation method.
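
A minimal sketch of an LSTM-based pitch-range regressor of the kind described, in PyTorch; the layer sizes, two-value range target (lower/upper bound), and frame counts are assumptions, not the paper's configuration:

```python
import torch
import torch.nn as nn

class PitchRangeLSTM(nn.Module):
    """Sketch: spectral frames -> pitch-range estimate (illustrative sizes)."""
    def __init__(self, n_bins=128, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_bins, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)   # e.g. lower/upper bound of range

    def forward(self, spec):               # spec: (batch, frames, n_bins)
        out, _ = self.lstm(spec)
        return self.head(out[:, -1])       # predict from the final frame

model = PitchRangeLSTM()
segment = torch.randn(4, 30, 128)          # ~300 ms of spectral frames
pred = model(segment)                      # (4, 2) pitch-range estimates

# MAPE, the evaluation metric used in the paper (targets here are random).
target = torch.rand(4, 2) * 100 + 80
mape = ((pred - target).abs() / target).mean() * 100
print(f"MAPE: {mape.item():.1f}%")
```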


Sensors ◽  
2019 ◽  
Vol 19 (6) ◽  
pp. 1416 ◽  
Author(s):  
Jin-Hee Ahn ◽  
Young-Soo Jeong ◽  
In-Tae Kim ◽  
Seok-Hyeon Jeon ◽  
Chan-Hee Park

In this study, a method is proposed for estimating time-dependent corrosion depth from atmospheric corrosion monitor (ACM) sensor data in order to evaluate time-dependent corrosion behavior. For the time-dependent corrosion depth estimation of uncoated carbon steel and weathering steel, accelerated corrosion tests were conducted in salt-spray environments and evaluated using ACM sensing data together with corrosion loss data from the tested steel specimens. To estimate the time-dependent corrosion depth from the corrosion current recorded by the ACM sensor, the relationship between the corrosion current and the mean corrosion depth obtained from the weight-loss method was evaluated, and the mean corrosion depth was then estimated from the accumulated corrosion current over the period of interest. The test and estimation results show that the corrosion current has a good linear correlation with the mean corrosion depth of both carbon steel and weathering steel. The estimated mean corrosion depth is nearly the same as that of the tested specimens, so the method can be used to estimate the corrosion rate of uncoated carbon steel and weathering steel.
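
The core of the method is a linear calibration between accumulated ACM corrosion current and weight-loss-derived depth, then inverting that fit for new monitoring data. A minimal sketch with hypothetical calibration values (units and numbers are illustrative):

```python
import numpy as np

# Hypothetical calibration data: cumulative ACM corrosion current (charge)
# vs. mean corrosion depth from the weight-loss method.
charge = np.array([0.5, 1.1, 2.0, 3.2, 4.6])     # e.g. C/cm^2, illustrative
depth = np.array([4.8, 10.5, 19.7, 31.0, 45.2])  # micrometres

# The study reports a good linear correlation, so fit depth = a*Q + b.
a, b = np.polyfit(charge, depth, 1)
r = np.corrcoef(charge, depth)[0, 1]
print(f"depth ~= {a:.2f}*Q + {b:.2f}  (r = {r:.3f})")

# Estimate time-dependent depth from newly monitored charge values.
new_charge = np.array([1.5, 2.5])
print("estimated depths:", (a * new_charge + b).round(1))
```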


2016 ◽  
Vol 24 (2) ◽  
pp. 194-204 ◽  
Author(s):  
Teodor Sommestad ◽  
Henrik Karlzén ◽  
Peter Nilsson ◽  
Jonas Hallberg

Purpose
In methods and manuals, the product of an information security incident's probability and severity is treated as a risk to be managed. The purpose of the test described in this paper is to investigate whether information security risk is actually perceived in this way, and whether decision-making style or the level of information security expertise influences the relationship between the three variables.
Design/methodology/approach
Ten respondents assessed 105 potential information security incidents. Ratings of the associated risks were obtained independently from ratings of the probability and severity of the incidents. Decision-making style was measured using a scale inspired by the Cognitive Style Index; information security expertise was self-reported. Regression analysis was used to test the relationships between the variables.
Findings
The ten respondents did not assess risk as the product of probability and severity, regardless of experience, expertise, and decision-making style. The mean variance explained in risk ratings by an additive model is 54.0 or 38.4 per cent, depending on how risk is measured. Adding a multiplicative term increased the mean variance explained by only 1.5 or 2.4 per cent, and for most respondents the contribution of the multiplicative term is statistically insignificant.
Practical implications
The inability or unwillingness to see risk as a product of probability and severity suggests that procedural support (e.g., risk matrices) has a role to play in risk assessment processes.
Originality/value
This study is the first to test whether information security risk is assessed as an interaction between probability and severity using suitable scales and a within-subject design.
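
The analysis contrasts an additive regression model with one that adds a multiplicative (interaction) term and compares the variance explained. A minimal sketch with simulated ratings (scales, coefficients, and noise are assumptions for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Hypothetical ratings for 105 incidents on illustrative 1-7 scales.
prob = rng.uniform(1, 7, 105)
sev = rng.uniform(1, 7, 105)
risk = 0.6 * prob + 0.8 * sev + rng.normal(0, 1, 105)  # additive ground truth

X_add = np.column_stack([prob, sev])                    # additive model
X_mul = np.column_stack([prob, sev, prob * sev])        # + interaction term

r2_add = LinearRegression().fit(X_add, risk).score(X_add, risk)
r2_mul = LinearRegression().fit(X_mul, risk).score(X_mul, risk)
print(f"additive R^2: {r2_add:.3f}, with interaction: {r2_mul:.3f}")
```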


2011 ◽  
Vol 8 (4) ◽  
pp. 6419-6442 ◽  
Author(s):  
T. H. Choo ◽  
I. J. Jeong ◽  
S. K. Chae ◽  
H. C. Yoon ◽  
H. S. Son

Abstract. This study proposes a new discharge estimation method using a mean velocity formula derived from Chiu's two-dimensional velocity formula, which is based on the probabilistic entropy concept, together with the river-bed shear stress of the channel. In particular, the method calculates the mean velocity, which is hard to measure in natural rivers during floods, from basic hydraulic characteristics that are easily obtained in the field, such as river-bed slope, wetted perimeter, width, and water level. To test the proposed method, we used highly reliable discharge data measured in the field and published in SCI journals, estimated the entropy parameter M from the mean velocity formula, and at the same time calculated the maximum velocity. In particular, we obtained phi(M), which expresses the overall equilibrium state of the river, through regression analysis between the maximum and mean velocities, and estimated the discharge from the newly proposed mean velocity formula. The relationship between estimated and measured discharge was analyzed using the discrepancy ratio, and the results show that the estimated values are quite close to the measured data.
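
In Chiu's entropy framework, phi(M) is the ratio of mean to maximum velocity, phi(M) = e^M/(e^M - 1) - 1/M, and discharge follows as mean velocity times cross-sectional area. A minimal sketch with hypothetical values for M, the maximum velocity, and the area (not the study's data):

```python
import numpy as np

def phi(M):
    """Chiu's entropy ratio: u_mean / u_max = e^M/(e^M - 1) - 1/M."""
    return np.exp(M) / (np.exp(M) - 1.0) - 1.0 / M

# Illustrative values; in the study M and u_max were obtained by
# regression between measured mean and maximum velocities.
M = 2.2        # entropy parameter for the reach
u_max = 3.1    # m/s, maximum velocity
area = 42.0    # m^2, flow cross-sectional area

u_mean = phi(M) * u_max
Q = u_mean * area
print(f"phi(M) = {phi(M):.3f}, u_mean = {u_mean:.2f} m/s, Q = {Q:.1f} m^3/s")
```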


2020 ◽  
Vol 2020 ◽  
pp. 1-14 ◽
Author(s):  
Xiaoxue Huo ◽  
Qiong Wu ◽  
Huiming Tang ◽  
Zhen Meng ◽  
Di Wang ◽  
...  

Trace intensity is defined as the mean total trace length of discontinuities per unit area, an important geometric parameter for describing fracture networks. Because discontinuity orientation is scattered and probabilistically distributed, each trace has a different probability of appearing on the sampling surface, and this factor should be taken into account when estimating trace intensity. This paper presents an approach for estimating two-dimensional trace intensity from rectangular sampling windows that accounts for the unequal appearing probabilities of discontinuities. The estimator requires the number of discontinuities intersecting the window; the appearing probabilities of discontinuities with both ends observed, one end observed, and both ends censored; and the mean trace length of the discontinuities intersecting the window. The new estimator is validated using discontinuity data from an outcrop in the Wenchuan area of China. For comparison, circular windows and Mauldon's equation are applied to the trace data of the same outcrop. The results indicate that the proposed rectangular-window method achieves comparable accuracy with less variability than the circular-window method, which is affected by finite sample size and by the variability of window location. The rectangular-window method is also advantageous for sampling surfaces that are much longer in one direction than the other, such as tunnel cross sections, and for curved sampling surfaces, such as outcrops exhibiting curvature.
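
A heavily simplified sketch of the underlying idea: weight each observed trace by the reciprocal of its appearing probability and divide the weighted total length by the window area. This is a generic probability-weighted stand-in, not the paper's estimator, and the lengths, probabilities, and window size are hypothetical:

```python
import numpy as np

# Rectangular sampling window (illustrative dimensions).
window_area = 12.0 * 5.0                 # m^2

# Observed trace lengths and their orientation-dependent appearing
# probabilities (hypothetical values, not the Wenchuan outcrop data).
trace_length = np.array([1.8, 3.2, 0.9, 5.1, 2.4])   # metres
appear_prob = np.array([0.9, 0.7, 0.95, 0.6, 0.8])

# Trace intensity: mean total trace length per unit area, corrected for
# the unequal probability of each trace intersecting the window.
intensity = np.sum(trace_length / appear_prob) / window_area
print(f"estimated trace intensity: {intensity:.3f} m/m^2")
```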

