A Novel Approach for Data Analysis using Explainable AI

2021 ◽  
Vol 05 (02) ◽  
pp. 74-80
Author(s):  
Nirupama P


2005 ◽
Vol 02 (01) ◽  
pp. 63-76
Author(s):  
M. Z. ISKANDARANI ◽  
N. F. SHILBAYEH

An innovative NDT (non-destructive testing) technique for interrogating materials for defects has been developed. The technique takes a novel approach to data analysis, employing intensity, RGB signal re-mixing, and wavelength variation of a thermally generated IR beam directed onto the specimen under test, whose response can be sensed and displayed on a computer screen as an image. Specimen inspection and data analysis are carried out through pixel-level re-ordering and shelving techniques within a transformed image file, using a sequence grouping and regrouping software system developed specifically for this work. The interaction between an impact-damaged RIM composite structure and thermal energy is recorded, analyzed, and modeled using an equivalent electronic circuit. The effect of impact damage on the integrity of the composite structure is also discussed.
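The pixel-level "shelving" idea described above can be pictured as grouping the pixels of a thermal image into a small number of intensity bands, so regions with a similar thermal response cluster together. A minimal sketch, assuming a simple uniform quantization; the function name and shelf count are illustrative, not the authors' software:

```python
import numpy as np

def shelve_pixels(image: np.ndarray, n_shelves: int = 8) -> np.ndarray:
    """Quantize pixel intensities into 'shelves' (intensity bands) so
    regions of similar thermal response are grouped together."""
    lo, hi = image.min(), image.max()
    # Map each pixel linearly onto a shelf index in 0..n_shelves-1.
    shelves = ((image - lo) / max(hi - lo, 1e-9) * (n_shelves - 1)).round()
    return shelves.astype(int)

# Toy 4x4 "thermal" frame: a hot spot (possible defect) in the centre.
frame = np.array([[10, 10, 12, 10],
                  [10, 80, 90, 12],
                  [11, 85, 95, 11],
                  [10, 12, 11, 10]], dtype=float)
print(shelve_pixels(frame, n_shelves=4))
```

With 4 shelves, the cool background collapses onto shelf 0 and the hot spot onto the top shelf, making the anomalous region easy to isolate.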


Author(s):  
Naonori Ueda ◽  
Futoshi Naya

Machine learning is a promising technology for analyzing diverse types of big data. The Internet of Things era will feature the collection of real-world information linked to time and space (location) from all sorts of sensors. In this paper, we discuss spatio-temporal multidimensional collective data analysis for creating innovative services from such spatio-temporal data, and we describe its core technologies: smart data collection, spatio-temporal data analysis and prediction, and a novel approach to real-time, proactive navigation in crowded environments such as event spaces and urban areas. Our challenge is to develop a real-time navigation system that guides the movements of entire groups efficiently, without causing congestion, by making near-future predictions of people flow. We show the effectiveness of our navigation approach through computer simulation using artificial people-flow data.
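The proactive-navigation idea (predict near-future people flow, then steer groups away from congestion) can be sketched in a few lines. This is a deliberately naive illustration, not the authors' system: the one-step linear extrapolation and the area names are assumptions.

```python
# Minimal sketch: predict each area's near-future crowd count by linear
# extrapolation from two timesteps, then route a group to the area with
# the lowest predicted congestion.

def predict_counts(prev: dict, curr: dict) -> dict:
    """Naive near-future prediction: extend each area's trend one step."""
    return {area: max(0, 2 * curr[area] - prev[area]) for area in curr}

def least_congested(prev: dict, curr: dict) -> str:
    """Pick the area with the smallest predicted count."""
    pred = predict_counts(prev, curr)
    return min(pred, key=pred.get)

prev = {"gate_A": 40, "gate_B": 55, "gate_C": 30}
curr = {"gate_A": 60, "gate_B": 50, "gate_C": 45}
# gate_A trends up (40 -> 60 -> 80), gate_B down (55 -> 50 -> 45),
# gate_C up (30 -> 45 -> 60): guide the group toward gate_B.
print(least_congested(prev, curr))  # → gate_B
```

A real system would replace the extrapolation with a learned spatio-temporal model and account for the feedback effect of its own guidance on future flow.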


Author(s):  
Ricardo G. Villar ◽  
Jigg L. Pelayo ◽  
Ray Mari N. Mozo ◽  
James B. Salig Jr. ◽  
Jojemar Bantugan

Leaning on results derived by the Central Mindanao University Phil-LiDAR 2.B.11 Image Processing Component, this paper applies Light Detection and Ranging (LiDAR) derived products to producing a quality land-cover classification, taking a theoretical approach to data analysis that minimizes common problems in image classification: misclassification of objects and non-distinguishable interpretation of pixelated features, which cause confusion between object classes due to their closely related spectral resemblance, as well as unbalanced saturation of RGB information. Only low-density LiDAR point cloud data (2 pts/m²) is exploited in this research, which nevertheless yields essential derived information such as textures and matrices (number of returns, intensity textures, nDSM, etc.) for establishing selection characteristics. A novel approach exploits the idea of object-based image analysis together with the principle of allometric relations between two or more observables, which are aggregated for each acquired dataset to establish a proportionality function for data partitioning. To separate two or more data sets into distinct regions of a feature space of distributions, non-trivial distribution-fitting computations were employed to formulate the ideal hyperplane. Once the distributions were computed, the allometric relations were evaluated and matched with the necessary rotation, scaling, and transformation techniques to find applicable border conditions. A customized hybrid feature was thus developed and embedded in every object-class feature to serve as a classifier, with a hierarchical clustering strategy employed for cross-examining and filtering features. These features are then boosted using machine learning algorithms as trainable sets of information for more competent feature detection.
The classification produced in this investigation was compared with a classification based on a conventional object-oriented approach using the straightforward functionality of the eCognition software. A compelling rise in overall accuracy (from 74.4% to 93.4%) and in the kappa index of agreement (from 70.5% to 91.7%) is evident relative to the initial process, indicating that even a low-density LiDAR dataset can be sufficient to generate a substantial increase in classification accuracy.
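Both figures of merit quoted above, overall accuracy and the kappa index of agreement, are standard statistics computed from a classification confusion matrix. A minimal sketch with a made-up 3-class matrix (not the study's data):

```python
import numpy as np

def overall_accuracy(cm: np.ndarray) -> float:
    """Fraction of samples on the diagonal (correctly classified)."""
    return np.trace(cm) / cm.sum()

def kappa(cm: np.ndarray) -> float:
    """Cohen's kappa: agreement corrected for chance agreement."""
    n = cm.sum()
    po = np.trace(cm) / n                 # observed agreement
    pe = (cm.sum(0) @ cm.sum(1)) / n**2   # expected agreement by chance
    return (po - pe) / (1 - pe)

# Toy 3-class confusion matrix (rows: reference, columns: classified).
cm = np.array([[50,  2,  3],
               [ 4, 40,  6],
               [ 1,  5, 39]])
print(f"OA = {overall_accuracy(cm):.3f}, kappa = {kappa(cm):.3f}")
```

Kappa is always at or below overall accuracy, since it discounts the agreement a random classifier would achieve given the class marginals.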


2018 ◽  
Author(s):  
Yulia Panina ◽  
Arno Germond ◽  
Brit G. David ◽  
Tomonobu M. Watanabe ◽  

The real-time quantitative polymerase chain reaction (qPCR) is routinely used for quantification of nucleic acids and is considered the gold standard in the field of relative nucleic acid measurements. The efficiency of the qPCR reaction is one of the most important parameters to be determined, reported, and incorporated into data analysis in any qPCR experiment. The Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines recognize the calibration curve as the method of choice for estimating qPCR efficiency. The precision of this method has been reported to be between SD = 0.007 (3 replicates) and SD = 0.022 (no replicates). In this manuscript we present a novel approach to analysing qPCR data obtained by running a dilution series. Unlike previously developed methods, our method relies on a new formula that describes pairwise relationships between data points on separate amplification curves and thus draws on extensive statistics (hundreds of estimations). A comparison of our method with the classical calibration curve by Monte Carlo simulation shows that our approach can almost double the precision of efficiency and gene expression ratio estimations on the same dataset.
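For context, the classical calibration-curve baseline that the authors compare against estimates efficiency from the slope of Cq versus log10 template amount, via E = 10^(-1/slope) - 1. A minimal sketch with synthetic dilution-series data; this illustrates the standard method only, not the authors' pairwise approach:

```python
import numpy as np

def calibration_efficiency(log10_conc, cq):
    """Classical calibration-curve estimate: fit Cq against log10 template
    amount; efficiency E = 10^(-1/slope) - 1 (E = 1.0 means 100%)."""
    slope, _intercept = np.polyfit(log10_conc, cq, 1)
    return 10 ** (-1 / slope) - 1

# Synthetic 10-fold dilution series for a perfectly efficient reaction:
# each 10-fold dilution shifts Cq by log2(10) ≈ 3.32 cycles.
log10_conc = np.array([0.0, -1.0, -2.0, -3.0, -4.0])
cq = 15.0 - log10_conc / np.log10(2)   # slope = -3.32
print(round(calibration_efficiency(log10_conc, cq), 3))  # → 1.0
```

With real data the points scatter around the fitted line, and the reported SD figures above quantify how that scatter propagates into the efficiency estimate.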

