A Summary of F-Transform Techniques in Data Analysis

Electronics ◽  
2021 ◽  
Vol 10 (15) ◽  
pp. 1771
Author(s):  
Ferdinando Di Martino ◽  
Irina Perfilieva ◽  
Salvatore Sessa

The fuzzy transform (F-transform) is a technique for approximating a function of one or more variables that researchers have applied to a variety of image- and data-analysis tasks. In this work we present a summary of fuzzy transform methods proposed in recent years in different data mining disciplines, such as the detection of relationships between features, the extraction of association rules, time series analysis, and data classification. After giving the definition of the Fuzzy Transform in one or more dimensions, including the constraint of sufficient data density with respect to fuzzy partitions, we analyze the data analysis approaches recently proposed in the literature that are based on the Fuzzy Transform. In particular, we examine the strategies these approaches adopt for managing the constraint of sufficient data density, and we compare their measured performance with that of other methods in the literature. The last section is dedicated to final considerations and future scenarios for the use of the Fuzzy Transform in the analysis of massive, high-dimensional data.
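The direct F-transform replaces a function with a short vector of components, each a weighted local average over one element of a fuzzy partition; the inverse F-transform reconstructs a smooth approximation from those components. A minimal one-dimensional sketch with a uniform triangular (Ruspini) partition follows; this is an illustration of the general definition, not the authors' implementation, and all function names are ours:

```python
import numpy as np

def triangular_partition(a, b, n):
    """Uniform fuzzy partition of [a, b]: n triangular basis functions
    centred at equally spaced nodes, each with support of width 2h."""
    nodes = np.linspace(a, b, n)
    h = nodes[1] - nodes[0]
    def A(k, x):
        # k-th triangular membership function evaluated at x
        return np.maximum(0.0, 1.0 - np.abs(x - nodes[k]) / h)
    return nodes, A

def direct_ft(f_vals, xs, A, n):
    """Discrete direct F-transform: component F_k is the A_k-weighted
    mean of the data (f_vals sampled at points xs)."""
    return np.array([
        np.sum(f_vals * A(k, xs)) / np.sum(A(k, xs))
        for k in range(n)
    ])

def inverse_ft(F, xs, A, n):
    """Inverse F-transform: normalized weighted sum of the components."""
    num = sum(F[k] * A(k, xs) for k in range(n))
    den = sum(A(k, xs) for k in range(n))
    return num / den
```

The "sufficient data density" constraint discussed in the paper shows up directly here: if some basis function's support contains no data point, the normalizing sum in `direct_ft` is zero and the component is undefined, so the partition must be coarse enough for the available data.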

2017 ◽  
Vol 3 (2) ◽  
pp. 89
Author(s):  
Ilham Tri Maulana ◽  
Fadil Firdian

<p>This research aimed to design a learning medium in the form of Interactive Multimedia for a software applications course. The Interactive Multimedia is designed as a learning medium for lecturers in the classroom. The study uses the SDLC (System Development Life Cycle) method, whose procedure comprises: requirements definition, needs analysis, data collection, application design, coding, and system testing. The data are primary data provided by the course lecturer. The data analysis technique used is descriptive analysis, describing the design of the Interactive Multimedia. The interactive multimedia design is expected to meet the needs of students and lecturers to study the material at any time, without limitation of time or place. The software used is Macromedia Flash CS 3.</p>


1998 ◽  
Vol 3 (1) ◽  
pp. 13-36 ◽  
Author(s):  
Ruth Guttman ◽  
Charles W. Greenbaum

This article gives an overview of Facet Theory, a systematic approach to facilitating theory construction, research design, and data analysis for complex studies that is particularly appropriate to the behavioral and social sciences. Facet Theory is based on (1) a definitional framework for a universe of observations in the area of study; (2) empirical structures of observations within this framework; (3) a search for correspondence between the definitional system and aspects of the empirical structure of the observations. The development of Facet Theory and Facet Design is reviewed from early scale analysis and the Guttman Scale, leading to the concepts of “mapping sentence,” “universe of content,” “common range,” “content facets,” and nonmetric multidimensional methods of data analysis. In Facet Theory, the definition of the behavioral domain provides a rationale for hypothesizing structural relationships among the variables employed in a study. Examples are presented from various areas of research (intelligence, infant development, animal behavior, etc.) to illustrate the methods and results of structural analysis with Smallest Space Analysis (SSA), Multidimensional Scalogram Analysis (MSA), and Partial Order Scalogram Analysis (POSA). The “radex” and “cylindrex” of intelligence tests are shown to be outstanding examples of predicted spatial configurations that have demonstrated the ubiquitous emergence of the same empirical structures in different studies. Further examples are given from studies of spatial abilities, infant development, animal behavior, and others. The use of Facet Theory, with careful construction of theory and design, is shown to provide new insights into existing data; it allows for the diagnosis and discrimination of behavioral traits and makes the generalizability and replication of findings possible, which in turn makes possible the discovery of lawfulness. Achievements, issues, and future challenges of Facet Theory are discussed.


2019 ◽  
Vol 20 (1) ◽  
pp. 30-35
Author(s):  
Agus Prasetya

This article is motivated by the fact that the street vendor (PKL) profession is a manifestation of the scarcity of work and the lack of jobs. Because the number of available jobs is out of balance with the size of the workforce, the number of street vendors has exploded. People become street vendors as a livelihood, to make a living and feed their families; the lack of formal employment has caused the number of traders to increase. The scarcity of jobs pushes job seekers who migrate into the informal sector to develop an independent, entrepreneurial spirit, trading with their own capital as true actors of the people's economy. The problems addressed are: (1) how to organize, regulate, and empower street vendors in the cities; (2) how to foster and educate street vendors; (3) how to help street vendors find capital; and (4) how to describe the hardships of life as a street vendor. This paper aims to find solutions to these problems, so that conflicts, disputes, and clashes between street vendors and the municipal police (Satpol PP) can be avoided. To this end, the following must be sought: (1) an understanding of the causes of the explosion in the number of street vendors; (2) an understanding of the problems street vendors face; (3) solutions for street vendors in big cities; and (4) a description of street vendors as actors of the people's economy. This article is qualitative research within the social-definition paradigm; data were collected through observation, in-depth interviews, and documentation. Data analysis uses Miles and Huberman's interactive model, with the stages of data collection, data display, data reduction, and verification (drawing conclusions).


Author(s):  
Ambar Widianingrum ◽  
Joko Sulianto ◽  
Rahmat Rais

The purpose of this study was to describe the feasibility of teaching materials based on an open-ended approach for improving the reasoning abilities of fourth-grade elementary school students. This type of research is research and development (R&D). The subjects of this study were 3 classroom teachers. The data analysis techniques used were descriptive qualitative analysis (data reduction, data presentation, and drawing conclusions) and descriptive quantitative analysis. Stage 1 media validation yielded a score of 84.8%, and stage 2 media validation yielded 94.8%. Stage 1 material validation yielded 84.6%, and stage 2 material validation yielded 93.3%. The initial field trials yielded 93.7% for the media and 92.3% for the material. This shows that the teaching material is valid and suitable for use. Based on these results, teaching materials based on an open-ended approach can be used as teaching tools and learning resources for students.


2021 ◽  
pp. 1-27
Author(s):  
D. Sartori ◽  
F. Quagliotti ◽  
M.J. Rutherford ◽  
K.P. Valavanis

Backstepping represents a promising control law for fixed-wing Unmanned Aerial Vehicles (UAVs). Its non-linearity and its adaptation capabilities guarantee adequate control performance over the whole flight envelope, even when the aircraft model is affected by parametric uncertainties. In the literature, several works apply backstepping controllers to various aspects of fixed-wing UAV flight. Unfortunately, many of them have not been implemented in a real-time controller, and only a few attempt simultaneous longitudinal and lateral–directional aircraft control. In this paper, an existing backstepping approach able to control longitudinal and lateral–directional motions is adapted to define a control strategy suitable for small UAV autopilots. Rapidly changing inner-loop variables are controlled with non-adaptive backstepping, while slower outer-loop navigation variables are Proportional–Integral–Derivative (PID) controlled. The controller is evaluated through numerical simulations of two very different fixed-wing aircraft performing complex manoeuvres. The controller's behaviour under model parametric uncertainties and in the presence of noise is also tested. The performance of a real-time implementation on a microcontroller is evaluated through hardware-in-the-loop simulation.
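Backstepping stabilizes a cascaded system step by step: a virtual control law is designed for the outer state, and the actual input then drives the inner state to track that virtual law, with a Lyapunov function guaranteeing stability at each step. A textbook sketch on a double integrator illustrates the idea; this is not the paper's fixed-wing UAV controller, and the gains and names are illustrative:

```python
def backstepping_double_integrator(x1, x2, k1=2.0, k2=2.0):
    """Backstepping control for the chain x1' = x2, x2' = u.
    Step 1: the virtual control alpha = -k1*x1 would stabilize x1.
    Step 2: u drives the tracking error z2 = x2 - alpha to zero.
    With V = 0.5*x1**2 + 0.5*z2**2 this gives V' = -k1*x1**2 - k2*z2**2."""
    alpha = -k1 * x1
    z2 = x2 - alpha
    # alpha' = -k1 * x1' = -k1 * x2, folded into the control law below
    u = -x1 - k2 * z2 - k1 * x2
    return u

def simulate(x1, x2, dt=0.001, steps=20000):
    """Forward-Euler simulation of the closed loop from (x1, x2)."""
    for _ in range(steps):
        u = backstepping_double_integrator(x1, x2)
        x1 += dt * x2
        x2 += dt * u
    return x1, x2
```

The same recursion extends to longer chains, which is what makes the method attractive for the nested attitude/velocity dynamics of an aircraft.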


2021 ◽  
pp. 1-12
Author(s):  
Jian Zheng ◽  
Jianfeng Wang ◽  
Yanping Chen ◽  
Shuping Chen ◽  
Jingjin Chen ◽  
...  

Neural networks can approximate data because they contain many compact non-linear layers. In high-dimensional space, the curse of dimensionality makes the data distribution sparse, so the data cannot provide sufficient information, and the approximation task becomes even harder for neural networks. To address this issue, two deviations are derived from the Lipschitz condition: the deviation of neural networks trained using high-dimensional functions, and the deviation of high-dimensional functions approximating data. The purpose is to improve the ability of neural networks to approximate data in high-dimensional space. Experimental results show that neural networks trained using high-dimensional functions outperform those trained using data when approximating data in high-dimensional space. We find that neural networks trained using high-dimensional functions are more suitable for high-dimensional space than those trained using data, so there is no need to retain large amounts of data for network training. Our findings also suggest that in high-dimensional space, tuning the hidden layers of a neural network has little positive effect on the precision of data approximation.
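The sparsity the authors appeal to can be made concrete: for a fixed sample size, the average distance from a point to its nearest neighbour grows quickly with dimension, so each sample carries less local information about the function being approximated. A small self-contained illustration (ours, not from the paper):

```python
import math
import random

def mean_nn_distance(n_points, dim, seed=0):
    """Average Euclidean distance from each of n_points uniform samples
    in the unit cube [0, 1]^dim to its nearest neighbour."""
    rng = random.Random(seed)
    pts = [[rng.random() for _ in range(dim)] for _ in range(n_points)]
    total = 0.0
    for i, p in enumerate(pts):
        nearest = min(math.dist(p, q) for j, q in enumerate(pts) if j != i)
        total += nearest
    return total / n_points
```

With 200 samples, the mean nearest-neighbour distance in 20 dimensions is several times larger than in 2 dimensions, which is the sparsity that makes purely data-driven approximation hard.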


2021 ◽  
Author(s):  
Graziano Patti ◽  
Sabrina Grassi ◽  
Gabriele Morreale ◽  
Mauro Corrao ◽  
Sebastiano Imposa

Abstract The occurrence of strong and abrupt rainfall, together with poor land-use planning and uncontrolled urban development, can constitute a risk to infrastructure and population. Under certain conditions, water flow in the subsoil may cause the formation of underground cavities. This phenomenon, known as soil piping, can evolve and generate surface collapse. In densely urbanized areas such phenomena represent an unpredictable and significant risk factor that can interfere with social activities. In this study, a multidisciplinary approach has been developed to obtain information useful for mitigating the risks associated with soil piping phenomena in urban areas. The approach aims to define the causes of sudden soil subsidence events, as well as the extent and possible evolution of these unstable areas. The information obtained from rainfall data analysis, together with a study of the morphological, geological and hydrogeological characteristics, has allowed us to evaluate the causes that led to the formation of soil pipes. Furthermore, 3D electrical resistivity surveys performed in the area affected by the instability have allowed us to estimate the extent of the soil pipes in the subsoil and to identify further areas susceptible to instability.


2010 ◽  
Vol 118-120 ◽  
pp. 601-605
Author(s):  
Han Ming

Evaluation methods for reliability parameter estimation need to be improved as science and technology advance. This paper develops a new method of parameter estimation, named the E-Bayesian estimation method. For the case of one hyper-parameter, the definition of the E-Bayesian estimate of the failure probability is provided; moreover, the formulas for the E-Bayesian estimate and the hierarchical Bayesian estimate, and a property of the E-Bayesian estimate of the failure probability, are also given. Finally, calculations on practical problems show that the proposed method is feasible and easy to perform.
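The general idea of E-Bayesian estimation is to take the ordinary Bayes estimate of the failure probability and average it over a hyper-prior placed on the prior's hyper-parameter. A sketch under one common setup, which we assume here rather than take from the paper: a Beta(a, b) conjugate prior for a binomial failure count, with a fixed and b uniform on (0, c); the closed form below is our own integration of the posterior mean:

```python
import math

def bayes_failure_prob(r, n, b, a=1.0):
    """Posterior-mean Bayes estimate of the failure probability:
    r failures in n trials, Beta(a, b) prior -> (r + a) / (n + a + b)."""
    return (r + a) / (n + a + b)

def e_bayes_failure_prob(r, n, c, a=1.0):
    """E-Bayesian estimate: the Bayes estimate averaged over the
    hyper-prior b ~ Uniform(0, c).  Integrating (r+a)/(n+a+b) in b gives
    (r + a) / c * ln((n + a + c) / (n + a))."""
    return (r + a) / c * math.log((n + a + c) / (n + a))
```

Averaging over the hyper-parameter is what distinguishes the E-Bayesian estimate from a single Bayes estimate and makes it less sensitive to the choice of b, which is useful when failures are rare (e.g. r = 0).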


2018 ◽  
Vol 46 (1) ◽  
pp. 32-39 ◽  
Author(s):  
Ilana G. Raskind ◽  
Rachel C. Shelton ◽  
Dawn L. Comeau ◽  
Hannah L. F. Cooper ◽  
Derek M. Griffith ◽  
...  

Data analysis is one of the most important, yet least understood, stages of the qualitative research process. Through rigorous analysis, data can illuminate the complexity of human behavior, inform interventions, and give voice to people’s lived experiences. While significant progress has been made in advancing the rigor of qualitative analysis, the process often remains nebulous. To better understand how our field conducts and reports qualitative analysis, we reviewed qualitative articles published in Health Education & Behavior between 2000 and 2015. Two independent reviewers abstracted information in the following categories: data management software, coding approach, analytic approach, indicators of trustworthiness, and reflexivity. Of the 48 (n = 48) articles identified, the majority (n = 31) reported using qualitative software to manage data. Double-coding transcripts was the most common coding method (n = 23); however, nearly one third of articles did not clearly describe the coding approach. Although the terminology used to describe the analytic process varied widely, we identified four overarching trajectories common to most articles (n = 37). Trajectories differed in their use of inductive and deductive coding approaches, formal coding templates, and rounds or levels of coding. Trajectories culminated in the iterative review of coded data to identify emergent themes. Few articles explicitly discussed trustworthiness or reflexivity. Member checks (n = 9), triangulation of methods (n = 8), and peer debriefing (n = 7) were the most common procedures. Variation in the type and depth of information provided poses challenges to assessing quality and enabling replication. Greater transparency and more intentional application of diverse analytic methods can advance the rigor and impact of qualitative research in our field.

