Taxonomy of Edge Computing: Challenges, Opportunities, and Data Reduction Methods

Author(s):  
Kusumlata Jain ◽  
Smaranika Mohapatra


Author(s):  
T. Chen ◽  
C. M. Harvey ◽  
S. Wang ◽  
V. V. Silberschmidt

Abstract Double-cantilever beams (DCBs) are widely used to study mode-I fracture behavior and to measure mode-I fracture toughness under quasi-static loads. Recently, the authors have developed analytical solutions for DCBs under dynamic loads with consideration of structural vibration and wave propagation. There are two methods of beam-theory-based data reduction to determine the energy release rate: (i) using an effective built-in boundary condition at the crack tip, and (ii) employing an elastic foundation to model the uncracked interface of the DCB. In this letter, analytical corrections for crack-tip rotation of DCBs under quasi-static and dynamic loads are presented, afforded by combining both these data-reduction methods and the authors’ recent analytical solutions for each. Convenient and easy-to-use analytical corrections for DCB tests are obtained, which avoid the complexity and difficulty of the elastic foundation approach, and the need for multiple experimental measurements of DCB compliance and crack length. The corrections are, to the best of the authors’ knowledge, completely new. Verification cases based on numerical simulation are presented to demonstrate the utility of the corrections.
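For context, a minimal sketch of the beam-theory data reduction that such corrections refine, written as standard quasi-static simple beam theory with an effective crack-length (crack-tip rotation) correction. This is a generic textbook form, not the authors' dynamic solutions; the correction parameter χ is assumed to come from an elastic-foundation model of the uncracked interface.

% Quasi-static simple beam theory (SBT) for a DCB: opening load P on each
% arm, crack length a, specimen width B, arm thickness h, axial modulus E.
\begin{equation}
  G_\mathrm{I} = \frac{12 P^2 a^2}{E B^2 h^3}
\end{equation}
% Crack-tip rotation is commonly accounted for by replacing the crack
% length a with an effective length a + \chi h, where \chi is the
% root-rotation correction factor from an elastic-foundation (Winkler)
% model of the uncracked interface:
\begin{equation}
  G_\mathrm{I} = \frac{12 P^2 (a + \chi h)^2}{E B^2 h^3}
\end{equation}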


1987 ◽  
Vol 26 ◽  
pp. 1-11
Author(s):  
Walman ◽  
James Jay Anderson

Currents and mixing properties on the sill of Ambon Bay were measured with drogues. A current speed of 0.62 m/s was observed. A mixing model suggests that material released on the sill would decrease by a factor of 2 × 10⁴ in one hour. Drogue construction and data-reduction methods are described.
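As a rough illustration only, assuming the mixing model is first-order (exponential) dilution, which the abstract does not state, the quoted factor implies a rate constant of about 2.8 × 10⁻³ s⁻¹:

import math

# Illustrative only: assuming first-order (exponential) dilution,
# C(t) = C0 * exp(-k * t), a 2e4-fold decrease over one hour implies:
dilution_factor = 2e4
t = 3600.0  # seconds in one hour
k = math.log(dilution_factor) / t
print(f"implied rate constant k = {k:.2e} 1/s")   # ~2.75e-03 1/s
print(f"e-folding time = {1.0 / k / 60:.1f} min")  # ~6.1 min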


2003 ◽  
Vol 125 (3) ◽  
pp. 274-276 ◽  
Author(s):  
R. R. de Swardt

During a recent study, the residual strain/stress states through the walls of autofrettaged thick-walled high-strength steel cylinders were measured with neutron diffraction, Sachs boring, and the compliance method (Venter et al., 2000, J. Strain Anal. Eng. Des., 35, pp. 459–469). The Sachs boring method was developed before the advent of high-speed computers, and a new data-reduction method was proposed. To verify the proposed procedure, the Sachs boring experiment was simulated using finite element modeling. A residual stress field was introduced into the finite element model by an elasto-plastic finite element analysis, and the physical process of material removal by boring was simulated by step-by-step removal of elements from the finite element mesh. Both the traditional and the newly proposed data-reduction methods were then used to calculate the residual stresses; the new method compares favorably with the traditional one.
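A minimal schematic of the step-by-step element-removal procedure described above, in Python. This is not the study's finite element code: the mesh is reduced to a list of radial element layers, and solve_outer_strain is a hypothetical placeholder for the equilibrium re-solve performed after each removal step.

# Schematic sketch of simulating Sachs boring by element removal.
# A real study would re-solve the FE model for equilibrium after every
# removal step and record the outer-surface strain response.

def solve_outer_strain(layers):
    """Hypothetical placeholder for an FE equilibrium solve returning
    the hoop and axial strains at the outer surface."""
    return 0.0, 0.0  # dummy values; a real solver goes here

inner_radius, outer_radius, n_layers = 0.05, 0.10, 50  # metres (illustrative)
dr = (outer_radius - inner_radius) / n_layers
layers = [inner_radius + (i + 0.5) * dr for i in range(n_layers)]

history = []
while len(layers) > 1:                        # bore out layer by layer
    bored_radius = layers[0] + 0.5 * dr       # current bore radius
    layers = layers[1:]                       # "remove" innermost layer
    eps_hoop, eps_axial = solve_outer_strain(layers)
    history.append((bored_radius, eps_hoop, eps_axial))

# 'history' plays the role of the experimental record that either the
# traditional Sachs equations or the new data-reduction method would
# post-process into residual stresses.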


2018 ◽  
pp. 956-979
Author(s):  
Morteza Shafiee Sardasht ◽  
Saeed Saheb

Predicting corporate bankruptcy has been an important and challenging research topic in accounting and finance. In bankruptcy prediction, researchers often confront a large number of observations and variables, typically a vast set of financial ratios. By reducing the number of variables and selecting the relevant data in a given dataset, data reduction can improve bankruptcy prediction. This study addresses four well-known data reduction methods, the t-test, correlation analysis, principal component analysis (PCA), and factor analysis (FA), and evaluates them for bankruptcy prediction on the Tehran Stock Exchange (TSE). To this end, starting from 35 financial ratios, the output of each data reduction method was used separately to train a support vector machine (SVM) as the prediction model. According to the empirical results, the t-test led to the highest prediction accuracy, at 97.1%, followed by PCA at 95.1%.
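A minimal sketch of the PCA-plus-SVM variant of such a pipeline, using scikit-learn. The synthetic data stand in for the 35 financial ratios, and the number of retained components and SVM hyperparameters are assumptions, not values from the study.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for a table of 35 financial ratios with a
# bankrupt / non-bankrupt label (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 35))
y = (X[:, :5].sum(axis=1) + 0.5 * rng.normal(size=200) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Data reduction (PCA) followed by an SVM classifier.
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),      # assumed number of retained components
    SVC(kernel="rbf", C=1.0),  # assumed hyperparameters
)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")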


2019 ◽  
Vol 11 (13) ◽  
pp. 1610 ◽  
Author(s):  
Marta Wlodarczyk-Sielicka ◽  
Andrzej Stateczny ◽  
Jacek Lubczonek

Water areas occupy over 70 percent of the Earth’s surface and are constantly subject to research and analysis. Often, hydrographic remote sensors are used for such research, allowing the collection of information on the shape of the bottom of a water area and the objects located on it. Information about the quality and reliability of depth data is important, especially during coastal modelling. Inshore areas undergo continuous transformation and must be monitored and analyzed. Presently, bathymetric geodata are usually collected with modern hydrographic systems and comprise very large sequences of data points that must then be processed through long and laborious workflows, including reduction. As existing bathymetric data reduction methods rely on interpolated values, there is a clear need to search for new solutions. With the accuracy of bathymetric maps in mind, a new method is presented here that retains real geodata, specifically the measured positions and depths. This study describes the developed method for reducing geodata while maintaining true survey values.
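As an illustration of reduction that keeps true survey values rather than interpolated ones (not the authors' algorithm), the sketch below keeps, for each grid cell, the shallowest real sounding, a common navigation-safe convention:

import numpy as np

def reduce_soundings(points, cell_size):
    """Keep one real (x, y, depth) sounding per grid cell: the shallowest.
    Illustrative sketch only, not the method developed in the paper."""
    points = np.asarray(points, dtype=float)          # columns: x, y, depth
    ix = np.floor(points[:, 0] / cell_size).astype(int)
    iy = np.floor(points[:, 1] / cell_size).astype(int)
    kept = {}
    for (cx, cy), row in zip(zip(ix, iy), points):
        # Shallowest depth per cell is the conservative choice for charts.
        if (cx, cy) not in kept or row[2] < kept[(cx, cy)][2]:
            kept[(cx, cy)] = row
    return np.array(list(kept.values()))

# Example: 100,000 synthetic soundings reduced on a 5 m grid.
rng = np.random.default_rng(1)
pts = np.column_stack([rng.uniform(0, 500, 100_000),
                       rng.uniform(0, 500, 100_000),
                       rng.uniform(2, 30, 100_000)])
reduced = reduce_soundings(pts, cell_size=5.0)
print(len(pts), "->", len(reduced), "soundings")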


1987 ◽  
Vol 33 (7) ◽  
pp. 1207-1210 ◽  
Author(s):  
M C Haven ◽  
P J Orsulak ◽  
L L Arnold ◽  
G Crowley

Abstract In an attempt to optimize curve fitting for immunoradiometric assays, we investigated eight data-reduction methods with two commercially available assays of thyrotropin. Four of these methods use linear data-reduction models: the logit-log programs of Iso-Data, Micromedic, and Hewlett-Packard, and the probit-log program of Hewlett-Packard. The other four use nonlinear data-reduction models: Iso-Data's "French curve" (modified spline), a four-parameter logistic function, a point-to-point method, and a nonlinear least-squares method. Applying the eight data-reduction methods to data from analyses of 78 patients' samples, we found clinically relevant differences between models. In fact, the differences introduced by changing data-reduction models were greater than the difference between the two commercial kits.
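A minimal sketch of one of the nonlinear models mentioned above, a four-parameter logistic (4PL) calibration fit with back-calculation of an unknown. The calibrator data are synthetic and the parameter values are assumptions; this is not any vendor's program.

import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: response as a function of concentration x.
    a = response at zero dose, d = response at infinite dose,
    c = inflection point (EC50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Synthetic calibrator data (illustrative): concentrations and counts.
conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0, 100.0])
counts = four_pl(conc, 200.0, 1.2, 8.0, 12000.0)
counts *= 1 + 0.02 * np.random.default_rng(2).normal(size=conc.size)

params, _ = curve_fit(four_pl, conc, counts,
                      p0=[counts.min(), 1.0, np.median(conc), counts.max()],
                      bounds=(0, np.inf))
a, b, c, d = params

# Invert the fitted curve to read an unknown sample's concentration
# from its measured response y (standard 4PL back-calculation).
y = 6000.0
x_unknown = c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)
print(f"estimated concentration: {x_unknown:.2f}")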

