Geodesic Distance on Gaussian Manifolds to Reduce the Statistical Errors in the Investigation of Complex Systems

Complexity ◽  
2019 ◽  
Vol 2019 ◽  
pp. 1-24 ◽  
Author(s):  
Michele Lungaroni ◽  
Andrea Murari ◽  
Emmanuele Peluso ◽  
Pasqualino Gaudio ◽  
Michela Gelfusa

In recent years the reputation of medical, economic, and scientific expertise has been badly damaged by a series of false predictions and contradictory studies. The lax application of statistical principles has certainly contributed to this uncertainty and loss of confidence in the sciences. Various assumptions generally held as valid in statistical treatments have proved to have their limits. In particular, it has become quite clear that even slight departures from normality and homoscedasticity can significantly affect classic significance tests. Robust statistical methods have been developed that can provide much more reliable estimates. On the other hand, they do not address an additional problem typical of the natural sciences, whose data are often the output of delicate measurements. The data can therefore not only be sampled from a nonnormal pdf but also be affected by significant levels of additive Gaussian noise of various amplitudes. To tackle this additional source of uncertainty, this paper shows how already developed robust statistical tools can be usefully complemented with the geodesic distance on Gaussian manifolds. This metric is conceptually more appropriate, and practically more effective, for handling Gaussian-distributed noise than the traditional Euclidean distance. The results of a series of systematic numerical tests show the advantages of the proposed approach in all the main aspects of statistical inference, from measures of location and scale to effect sizes and hypothesis testing. Particularly relevant is the reduction of up to 35% in Type II errors, demonstrating the substantial improvement in power obtained by applying the proposed methods. It is worth emphasizing that the proposed approach provides a general framework in which noise with other statistical distributions can also be handled.
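For univariate Gaussians, the geodesic (Fisher-Rao) distance the paper builds on has a well-known closed form, because the Gaussian manifold equipped with the Fisher information metric is a scaled copy of the hyperbolic upper half-plane. A minimal sketch of that closed form (function names are illustrative, not from the paper):

```python
import math

def fisher_rao_distance(mu1, sigma1, mu2, sigma2):
    """Geodesic (Fisher-Rao) distance between N(mu1, sigma1^2) and
    N(mu2, sigma2^2), via the hyperbolic upper half-plane closed form."""
    num = (mu1 - mu2) ** 2 / 2.0 + (sigma1 - sigma2) ** 2
    return math.sqrt(2.0) * math.acosh(1.0 + num / (2.0 * sigma1 * sigma2))

def euclidean_distance(mu1, sigma1, mu2, sigma2):
    """Naive Euclidean distance in (mu, sigma) parameter space."""
    return math.hypot(mu1 - mu2, sigma1 - sigma2)
```

The two metrics can disagree sharply: N(0, 1) and N(1, 1) are Euclidean distance 1 apart in parameter space, and so are N(0, 0.1) and N(1, 0.1), yet the geodesic distance of the second pair is far larger, since a one-unit shift of the mean is much more significant when the noise amplitude is small.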

2020 ◽  
pp. 37-55 ◽  
Author(s):  
A. E. Shastitko ◽  
O. A. Markova

Digital transformation has changed the business models of traditional players in existing markets. Moreover, new entrants and new markets have appeared, in particular platforms and multisided markets. The emergence and rapid development of platforms are driven primarily by the existence of so-called indirect network externalities. This raises the question of whether the existing instruments of competition law enforcement and market analysis remain relevant when analyzing markets with digital platforms. This paper aims to discuss the advantages and disadvantages of using various tools to define markets with platforms. In particular, we describe the features of the SSNIP test when applied to markets with platforms. Furthermore, we analyze adjustments to tests for platform market definition in terms of possible Type I and Type II errors. All in all, it turns out that, to reduce the likelihood of Type I and Type II errors when applying market definition techniques to markets with platforms, one should consider the type of platform analyzed: transaction platforms without pass-through and non-transaction matching platforms should be treated as players in a multisided market, whereas non-transaction platforms should be analyzed as players in several interrelated markets. However, if the platform is allowed to adjust prices, an additional challenge emerges: the regulator and companies may manipulate the results of the SSNIP test by applying different models of competition.
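A standard quantitative companion to the SSNIP test in single-sided markets is critical loss analysis: a price rise of X on contribution margin M becomes unprofitable once sales fall by more than X/(X+M). A minimal sketch under that one-sided assumption (function names are illustrative; for multisided platforms the feedback between sides invalidates this simple calculation, which is precisely the adjustment problem the paper discusses):

```python
def critical_loss(price_rise, margin):
    """Fraction of sales a hypothetical monopolist can lose before a price
    rise of `price_rise` becomes unprofitable, with contribution margin
    `margin` (both expressed as fractions of price)."""
    return price_rise / (price_rise + margin)

def widen_market(price_rise, margin, predicted_loss):
    """SSNIP logic: if the predicted sales loss exceeds the critical loss,
    the price rise is unprofitable and the candidate market is too narrow."""
    return predicted_loss > critical_loss(price_rise, margin)
```

For example, a 5% rise on a 45% margin gives a critical loss of 0.05 / 0.50 = 10% of sales.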


2018 ◽  
Vol 41 (1) ◽  
pp. 1-30 ◽  
Author(s):  
Chelsea Rae Austin

ABSTRACT While not explicitly stated, many tax avoidance studies seek to investigate tax avoidance that is the result of firms' deliberate actions. However, measures of firms' tax avoidance can also be affected by factors outside the firms' control—tax surprises. This study examines potential complications caused by tax surprises when measuring tax avoidance by focusing on one specific type of surprise tax savings—the unanticipated tax benefit from employees' exercise of stock options. Because the cash effective tax rate (ETR) includes the benefits of this tax surprise, the cash ETR mismeasures firms' deliberate tax avoidance. The analyses conducted show this mismeasurement is material and can lead to both Type I and Type II errors in studies of deliberate tax avoidance. Suggestions to aid researchers in mitigating these concerns are also provided.
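The mismeasurement mechanism is simple arithmetic on the cash ETR. A sketch with made-up numbers (the figures and function name are illustrative, not from the study):

```python
def cash_etr(cash_taxes_paid, pretax_income):
    """Cash effective tax rate: cash taxes paid over pretax book income."""
    return cash_taxes_paid / pretax_income

# Illustrative figures: a firm's deliberate planning yields 300 of cash tax
# on 1000 of pretax income, but employees' option exercises produce a
# surprise deduction worth 50 of additional cash tax savings.
deliberate_rate = cash_etr(300, 1000)       # reflects deliberate avoidance only
observed_rate = cash_etr(300 - 50, 1000)    # what the reported data actually show
```

The gap between the two rates is entirely tax surprise, yet a study sorting firms on the observed cash ETR would read this firm as a more aggressive avoider than its deliberate choices imply.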


1999 ◽  
Vol 18 (1) ◽  
pp. 37-54 ◽  
Author(s):  
Andrew J. Rosman ◽  
Inshik Seol ◽  
Stanley F. Biggs

The effect of different task settings within an industry on auditor behavior is examined for the going-concern task. Using an interactive computer process-tracing method, experienced auditors from four Big 6 accounting firms examined cases based on real data that differed on two dimensions of task settings: stage of organizational development (start-up and mature) and financial health (bankrupt and nonbankrupt). Auditors made judgments about each entity's ability to continue as a going concern and, if they had substantial doubt about continued existence, they listed evidence they would seek as mitigating factors. There are seven principal results. First, information acquisition and, by inference, problem representations were sensitive to differences in task settings. Second, financial mitigating factors dominated nonfinancial mitigating factors in both start-up and mature settings. Third, auditors' behavior reflected configural processing. Fourth, categorizing information into financial and nonfinancial dimensions was critical to understanding how auditors' information acquisition and, by inference, problem representations differed across settings. Fifth, Type I errors (determining that a healthy company is a going-concern problem) differed from correct judgments in terms of information acquisition, although Type II errors (determining that a problem company is viable) did not. This may indicate that Type II errors are primarily due to deficiencies in other stages of processing, such as evaluation. Sixth, auditors who were more accurate tended to follow flexible strategies for financial information acquisition. Finally, accurate performance in the going-concern task was found to be related to acquiring (1) fewer information cues, (2) proportionately more liquidity information and (3) nonfinancial information earlier in the process.


2002 ◽  
Vol 12 (03) ◽  
pp. 249-261 ◽  
Author(s):  
XUEHOU TAN

Let π(a,b) denote the shortest path between two points a, b inside a simple polygon P that lies entirely in P. The geodesic distance between a and b in P is defined as the length of π(a,b), denoted by gd(a,b), in contrast with the Euclidean distance between a and b in the plane, denoted by d(a,b). Given two disjoint polygons P and Q in the plane, the bridge problem asks for a line segment (optimal bridge) that connects a point p on the boundary of P and a point q on the boundary of Q such that the sum of the three distances gd(p′,p), d(p,q) and gd(q,q′), for any p′ ∈ P and any q′ ∈ Q, is minimized. We present an O(n log³ n) time algorithm for finding an optimal bridge between two simple polygons, significantly improving upon the previous O(n²) time bound. Our result is obtained by making substantial use of a hierarchical structure that consists of segment trees, range trees and persistent search trees, and of a structure that supports dynamic ray shooting and shortest path queries.
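The objective can be made concrete with a brute-force sketch. Restricting attention to convex polygons (where every geodesic distance collapses to a straight-line distance) and to polygon vertices as candidate bridge endpoints, a quadratic search over endpoint pairs evaluates the worst-case cost directly. This only illustrates the objective under those simplifying assumptions; it is not Tan's algorithm:

```python
from itertools import product
from math import hypot

def dist(a, b):
    return hypot(a[0] - b[0], a[1] - b[1])

def brute_force_bridge(P, Q):
    """Cheapest bridge (p, q) between CONVEX polygons P and Q (vertex lists),
    with bridge endpoints restricted to vertices.  Convexity makes
    gd(p', p) = d(p', p), so the worst-case cost of a bridge is
        max_{p' in P} d(p', p) + d(p, q) + max_{q' in Q} d(q, q').
    Returns (cost, p, q)."""
    best = None
    for p, q in product(P, Q):
        cost = (max(dist(v, p) for v in P)
                + dist(p, q)
                + max(dist(q, w) for w in Q))
        if best is None or cost < best[0]:
            best = (cost, p, q)
    return best
```

For two unit squares separated by a gap, the search picks vertices on the facing sides, as one would expect.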


PEDIATRICS ◽  
1973 ◽  
Vol 51 (4) ◽  
pp. 753-753
Author(s):  
Emperor Watcher ◽  
C. A. S.

Was the layout editor making a sly comment on the present state of American pediatrics by juxtaposing Mrs. Seymour's letter with the articles concerning Child Health Associates in the January issue (Pediatrics 51:1-16, 1973)? If the word "pediatrician" is substituted for "surgeon" in the 1754 letter, it has a surprisingly modern ring. One gets the impression from reading the four articles that CHAs have demonstrated that they are capable of doing good when compared with practicing pediatricians, but it is not clear whether evidence has been collected to deal with the question of whether the associates cause less harm (in testing hypotheses one is liable to two kinds of error, and the relationship between type I and type II errors is the basis of the Neyman-Pearson theory).


Author(s):  
Robert Shearer ◽  
Truman Clark

Linear models are the most commonly used analytical tools in the nonprofit literature. Academics and practitioners utilize these models to test different hypotheses in support of their research efforts, seeking to find significant results that substantiate their theories. And yet the authors of this article have discovered a surprisingly large number of insignificant results in articles from established nonprofit journals. Insignificant hypotheses and Type II errors surely account for a number of these results, but the authors believe the majority are due to a different cause, one that is detectable and preventable: multicollinearity.
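Multicollinearity of the kind the authors describe is detectable before interpreting a fitted model, most commonly with variance inflation factors. A minimal sketch using only NumPy (an assumption of this note, not code from the article):

```python
import numpy as np

def vif(X):
    """Variance inflation factor of each column of design matrix X
    (observations in rows): VIF_j = 1 / (1 - R_j^2), where R_j^2 comes
    from regressing column j on the remaining columns plus an intercept.
    Values above ~10 are a common rule-of-thumb flag for multicollinearity."""
    n, k = X.shape
    factors = []
    for j in range(k):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        factors.append(1.0 / (1.0 - r2))
    return np.array(factors)
```

Two independent predictors yield VIFs near 1; adding a column that is nearly a linear combination of the others sends its VIF into the thousands.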


1989 ◽  
Vol 25 (3) ◽  
pp. 451-454 ◽  
Author(s):  
Joel Berger ◽  
Michael D. Kock
Keyword(s):  
Type I ◽  
Type II ◽ 
The Real ◽  

2019 ◽  
Vol 8 (4) ◽  
pp. 1849-1853

Nowadays people seek loans from banks for their needs, but banks cannot extend credit to everyone, so they use various measures to identify eligible customers. Sensitivity and specificity are widely used to measure the performance of categorical classifiers in medicine and, tangentially, in econometrics. Even after applying such measures, granting loans to customers who may not be able to repay, or denying loans to customers who can repay, leads to Type I and Type II errors. To minimize these errors, this study explains, first, how to determine whether sensitivity is large or small and, second, how to establish benchmarks for forecasting with the model through fuzzy analysis based on fuzzy weights, which is compared with the sensitivity analysis.
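The quantities involved follow directly from a confusion matrix. A minimal sketch (names are illustrative; note that which error counts as "Type I" depends on the choice of null hypothesis, taken here as "the customer is creditworthy"):

```python
def loan_classifier_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity, and the two error rates for a binary
    loan decision, with positive = 'will repay'.
    tp: repayers approved, fn: repayers rejected,
    fp: defaulters approved, tn: defaulters rejected."""
    sensitivity = tp / (tp + fn)    # share of good customers approved
    specificity = tn / (tn + fp)    # share of bad customers rejected
    type_i_rate = fn / (tp + fn)    # good customer rejected = 1 - sensitivity
    type_ii_rate = fp / (tn + fp)   # bad customer approved = 1 - specificity
    return sensitivity, specificity, type_i_rate, type_ii_rate
```

With 80 repayers approved, 20 rejected, 10 defaulters approved, and 90 rejected, sensitivity is 0.8, specificity 0.9, and the two error rates are their complements.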


Author(s):  
Gary McClelland ◽  
John G. Lynch ◽  
Julie R. Irwin ◽  
Stephen A. Spiller ◽  
Gavan J. Fitzsimons
