Comparison and Evaluation of Dimensionality Reduction Techniques for Hyperspectral Data Analysis

Proceedings, 2019, Vol 24 (1), pp. 6
Author(s):  
K Nivedita Priyadarshini ◽  
V Sivashankari ◽  
Sulochana Shekhar ◽  
K Balasubramani

Hyperspectral datasets capture ground cover in hundreds of contiguous bands, and filtering them can help discriminate surface features. In this study, the number of spectral bands is therefore reduced without losing the original information, a process known as dimensionality reduction (DR). Redundant bands arise because neighboring bands are highly correlated and carry similar information. Dimensionality reduction both reduces the complexity of data processing and transforms the original data to remove the correlation among bands. In this paper, two DR methods, principal component analysis (PCA) and minimum noise fraction (MNF), are applied to the Airborne Visible/Infrared Imaging Spectrometer-Next Generation (AVIRIS-NG) dataset of Kalaburagi and discussed.
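The band-reduction step described above can be sketched in a few lines of scikit-learn; this is an illustration on a synthetic cube with correlated neighboring bands, not the AVIRIS-NG scene (all sizes are invented, and MNF, not shown here, is available in remote-sensing software such as ENVI):

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for a hyperspectral cube: 50x50 pixels, 200 bands
# (real AVIRIS-NG scenes have several hundred bands; sizes are illustrative).
rng = np.random.default_rng(0)
cube = rng.normal(size=(50, 50, 200))
# Cumulative sum makes neighboring bands highly correlated,
# mimicking the redundancy of real hyperspectral data.
cube = np.cumsum(cube, axis=2)

# Flatten pixels to rows so each band is one feature column.
X = cube.reshape(-1, cube.shape[2])

# Keep just enough principal components to explain 99% of the variance.
pca = PCA(n_components=0.99)
X_reduced = pca.fit_transform(X)
print(X_reduced.shape)  # far fewer than 200 columns remain
```

Passing a float in (0, 1) as `n_components` tells scikit-learn's PCA to retain the smallest number of components reaching that variance fraction.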

Author(s):  
H. Ma ◽  
W. Feng ◽  
X. Cao ◽  
L. Wang

Hyperspectral images usually consist of more than one hundred spectral bands, which have the potential to provide rich spatial and spectral information. However, the application of hyperspectral data remains challenging due to "the curse of dimensionality". In this context, many techniques that aim to make full use of both the spatial and the spectral information have been investigated. To preserve the geometrical information while using fewer spectral bands, we propose a novel method that combines principal component analysis (PCA), guided image filtering, and the random forest (RF) classifier. In detail, PCA is first employed to reduce the dimension of the spectral bands. Second, guided image filtering is introduced to smooth land objects while preserving their edges. Finally, the features are fed into the RF classifier. To illustrate the effectiveness of the method, we carry out experiments on the popular Indian Pines data set, collected by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) sensor. Comparing the proposed method with methods using only PCA or only the guided image filter, we find that the proposed method performs better.
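A rough sketch of this three-stage pipeline on toy data: a plain box filter stands in for the paper's guided image filter (a true guided filter exists in OpenCV's ximgproc contrib module), and the cube, class layout, and sizes are all invented:

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Toy stand-in for Indian Pines: 30x30 pixels, 100 bands, 3 classes.
h, w, bands = 30, 30, 100
# Piecewise-constant label map (10x10 patches) mimics spatially
# coherent land-cover classes.
labels = np.add.outer(np.arange(h) // 10, np.arange(w) // 10) % 3
cube = labels[..., None] + rng.normal(scale=2.0, size=(h, w, bands))

# 1) PCA compresses the spectral dimension.
X = cube.reshape(-1, bands)
pcs = PCA(n_components=5).fit_transform(X).reshape(h, w, 5)

# 2) Spatial smoothing of each component image (box filter as a
#    stand-in for the paper's edge-preserving guided filter).
feats = np.stack([uniform_filter(pcs[..., i], size=3) for i in range(5)],
                 axis=-1)

# 3) Pixel-wise random forest classification.
Xf, y = feats.reshape(-1, 5), labels.ravel()
Xtr, Xte, ytr, yte = train_test_split(Xf, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
print(f"test accuracy: {clf.score(Xte, yte):.2f}")
```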


2020, Vol 12 (22), pp. 3703
Author(s):  
Adrian Doicu ◽  
Dmitry S. Efremenko ◽  
Thomas Trautmann

A spectral acceleration approach for the spherical harmonics discrete ordinate method (SHDOM) is designed. This approach combines the correlated k-distribution method with dimensionality reduction techniques applied to the optical parameters of an atmospheric system. The dimensionality reduction techniques used in this study are the linear embedding methods: principal component analysis (PCA), locality pursuit embedding, locality preserving projection, and locally embedded analysis. A numerical analysis shows that, relative to the correlated k-distribution method, PCA in conjunction with a second-order-of-scattering approximation yields an acceleration factor of 12. This implies that SHDOM equipped with this acceleration approach is efficient enough to perform spectral integration of radiance fields in inhomogeneous multi-dimensional media.
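The role PCA plays here, compressing spectrally correlated optical parameters into a few scores, can be illustrated on synthetic data (the profile shapes and sizes are invented; no SHDOM physics is reproduced):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
# Stand-in for optical parameters over a spectral interval:
# 500 spectral points x 40 atmospheric layers of optical depth.
base = np.exp(-np.linspace(0, 4, 40))          # smooth vertical profile
spectra = np.outer(rng.lognormal(size=500), base)
spectra += 0.01 * rng.normal(size=spectra.shape)

# Compress the spectral dimension into a handful of scores.
pca = PCA(n_components=4)
scores = pca.fit_transform(spectra)
recon = pca.inverse_transform(scores)

rel_err = np.linalg.norm(recon - spectra) / np.linalg.norm(spectra)
print(f"relative reconstruction error: {rel_err:.1e}")
```

Because atmospheric optical properties vary smoothly across a spectral interval, a few components reconstruct the full set almost exactly, which is what makes the downstream radiative transfer acceleration possible.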


2017, Vol 10 (13), pp. 355
Author(s):  
Reshma Remesh ◽  
Pattabiraman. V

Dimensionality reduction techniques are used to reduce the complexity of analyzing high-dimensional data sets. The raw input data set may have many dimensions, and analysis can be slow and yield wrong predictions if unnecessary attributes are considered. Using dimensionality reduction techniques, one can reduce the dimensions of the input data, enabling accurate prediction at lower cost. In this paper, different machine learning approaches used for dimensionality reduction, such as PCA, SVD, LDA, kernel principal component analysis (KPCA), and artificial neural networks (ANN), are studied.
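The linear and kernel methods surveyed here can be compared side by side in scikit-learn; a minimal sketch on the classic iris data (the ANN/autoencoder variant is omitted since it needs a deep learning framework; note that LDA is the one supervised method in the list, so it also receives the labels):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA, TruncatedSVD, KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)  # 150 samples, 4 features, 3 classes

reducers = {
    "PCA": PCA(n_components=2),
    "SVD": TruncatedSVD(n_components=2),
    "KernelPCA": KernelPCA(n_components=2, kernel="rbf"),
    # LDA is supervised: it uses the class labels y,
    # and allows at most (n_classes - 1) = 2 components.
    "LDA": LinearDiscriminantAnalysis(n_components=2),
}

shapes = {}
for name, red in reducers.items():
    Xr = red.fit_transform(X, y) if name == "LDA" else red.fit_transform(X)
    shapes[name] = Xr.shape
    print(f"{name}: {X.shape[1]} -> {Xr.shape[1]} features")
```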


2021, Vol 54 (4), pp. 1-34
Author(s):  
Felipe L. Gewers ◽  
Gustavo R. Ferreira ◽  
Henrique F. De Arruda ◽  
Filipi N. Silva ◽  
Cesar H. Comin ◽  
...  

Principal component analysis (PCA) is often applied for analyzing data across diverse areas. This work reports, in an accessible and integrated manner, several theoretical and practical aspects of PCA. The basic principles underlying PCA, data standardization, possible visualizations of PCA results, and outlier detection are addressed in turn. Next, the potential of using PCA for dimensionality reduction is illustrated on several real-world datasets. Finally, we summarize PCA-related approaches and other dimensionality reduction techniques. The overall objective of this work is to assist researchers from diverse areas in using and interpreting PCA.
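One of the practical points such tutorials stress, that standardization changes what PCA finds, can be demonstrated in a few lines (synthetic data; the scales are chosen only to make the effect obvious):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# Two correlated variables on very different scales.
a = rng.normal(size=300)
X = np.column_stack([a * 1000, a + rng.normal(scale=0.5, size=300)])

# Without standardization, the large-scale column dominates PC1 entirely.
raw_ratio = PCA().fit(X).explained_variance_ratio_
# After z-scoring, both variables contribute on an equal footing.
Xs = StandardScaler().fit_transform(X)
std_ratio = PCA().fit(Xs).explained_variance_ratio_

print("raw:", raw_ratio.round(3))
print("standardized:", std_ratio.round(3))
```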


2019, pp. 85-98
Author(s):  
Ana del Águila ◽  
Dmitry S. Efremenko ◽  
Thomas Trautmann

Hyper-spectral sensors take measurements in narrow contiguous bands across the electromagnetic spectrum. Usually, the goal is to detect a certain object or a component of the medium with a unique spectral signature. In particular, hyper-spectral measurements are used in atmospheric remote sensing to detect trace gases. To improve the efficiency of hyper-spectral processing algorithms, data reduction methods are applied. This paper outlines dimensionality reduction techniques in the context of hyper-spectral remote sensing of the atmosphere. Dimensionality reduction excludes redundant information from the data and is now an integral part of high-performance radiative transfer models. This survey shows how principal component analysis can be applied to spectral radiance modelling and the retrieval of atmospheric constituents, speeding up data processing by orders of magnitude. The discussed techniques are generic and can readily be applied to atmospheric as well as material science problems.
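The core acceleration idea, evaluating the expensive model only at the mean state and along a few principal directions and linearizing elsewhere, can be sketched as follows (the "model" is a toy Beer-Lambert-style functional, not a real radiative transfer solver, and all dimensions are invented):

```python
import numpy as np
from sklearn.decomposition import PCA

def expensive_model(profile):
    # Stand-in for a costly radiative transfer solve: a nonlinear,
    # Beer-Lambert-like functional of an optical-depth profile.
    return np.exp(-profile).sum()

rng = np.random.default_rng(4)
layers, channels = 40, 2000
base = np.linspace(0.1, 2.0, layers)
# Spectrally correlated profiles: one vertical shape, scaled per channel.
profiles = base * rng.lognormal(sigma=0.05, size=(channels, 1))

# Project all spectral channels onto a few principal components.
k = 3
pca = PCA(n_components=k).fit(profiles)
scores = pca.transform(profiles)

# Evaluate the expensive model only at the mean profile and along each
# principal direction, then linearize in the PC scores.
f0 = expensive_model(pca.mean_)
eps = 1e-3
grads = np.array([
    (expensive_model(pca.mean_ + eps * pca.components_[i]) - f0) / eps
    for i in range(k)
])
approx = f0 + scores @ grads      # k + 1 model calls instead of 2000
exact = np.array([expensive_model(p) for p in profiles])

rel_err = np.abs(approx - exact).max() / np.abs(exact).max()
print(f"max relative error: {rel_err:.1e}")
```

Because the spectral variability lives in a low-dimensional subspace, a handful of model evaluations replaces thousands, which is the source of the orders-of-magnitude speed-up described above.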


2019, Vol 8 (S3), pp. 66-71
Author(s):  
T. Sudha ◽  
P. Nagendra Kumar

Data mining is one of the major areas of research, and clustering is one of its main functionalities. High dimensionality is one of the main issues in clustering, and dimensionality reduction can be used as a solution to this problem. The present work makes a comparative study of two dimensionality reduction techniques, t-distributed stochastic neighbour embedding (t-SNE) and probabilistic principal component analysis (PPCA), in the context of clustering. High-dimensional data were reduced to low-dimensional data using each technique, and cluster analysis was performed on the high-dimensional data as well as on the reduced data sets, with varying numbers of clusters. Mean squared error, time, and space were considered as parameters for comparison. The results show that converting the high-dimensional data into low-dimensional data takes longer with PPCA than with t-SNE, while the data set reduced through PPCA requires less storage space than the one reduced through t-SNE.
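A minimal version of this reduce-then-cluster comparison, with scikit-learn's PCA standing in for probabilistic PCA (scikit-learn fits the same underlying probabilistic model inside its PCA class but exposes no separate PPCA estimator; the timings are machine-dependent and merely printed, not compared):

```python
import time
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

# Toy high-dimensional data with a known cluster structure.
X, _ = make_blobs(n_samples=300, n_features=50, centers=4, random_state=0)

results = {}
for name, reducer in [
    ("t-SNE", TSNE(n_components=2, random_state=0)),
    ("PCA", PCA(n_components=2)),   # stand-in for PPCA
]:
    t0 = time.perf_counter()
    Xr = reducer.fit_transform(X)
    elapsed = time.perf_counter() - t0
    # Cluster the reduced data, as in the study.
    KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Xr)
    results[name] = (Xr.shape, elapsed)
    print(f"{name}: reduced to {Xr.shape} in {elapsed:.2f}s")
```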


Metabolites, 2021, Vol 11 (8), pp. 545
Author(s):  
Rebecca J. Schmidt ◽  
Donghai Liang ◽  
Stefanie A. Busgang ◽  
Paul Curtin ◽  
Cecilia Giulivi

Maternal and cord plasma metabolomics were used to elucidate biological pathways associated with increased diagnosis risk for autism spectrum disorders (ASD). Metabolome-wide associations were assessed in both maternal and umbilical cord plasma in relation to diagnoses of ASD and other non-typical development (Non-TD) compared to typical development (TD) in the Markers of Autism risk in Babies: Learning Early Signs (MARBLES) cohort study of children born to mothers who already have at least one child with ASD. Analyses were stratified by sample matrix type, machine mode, and annotation confidence level. Dimensionality reduction techniques [i.e., principal component analysis (PCA) and random subset weighted quantile sum regression (WQSRS)] were used to minimize the high multiple-comparison burden. With WQSRS, a metabolite mixture obtained from the negative mode of maternal plasma decreased the odds of Non-TD compared to TD. These metabolites, all related to the prostaglandin pathway, underscored the relevance of neuroinflammation status. No other significant findings were observed. Dimensionality reduction strategies provided confirming evidence that a set of maternal plasma metabolites is important in distinguishing Non-TD from TD diagnosis. A lower risk for Non-TD was linked to anti-inflammatory elements, thereby linking neuroinflammation to detrimental brain function, consistent with studies ranging from neurodevelopment to neurodegeneration.
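The weighted-quantile-sum idea behind WQSRS can be sketched crudely in Python. Real analyses use dedicated implementations with bootstrapped, constrained weight estimation over random subsets; here the weight step is a deliberately simplified stand-in, and all data are synthetic:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
# Toy stand-in: 400 subjects x 8 metabolites; the first two drive the outcome.
n, p = 400, 8
X = rng.normal(size=(n, p))
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

# 1) Score each metabolite into quartiles (0-3), as WQS does.
Q = np.stack([np.searchsorted(np.quantile(X[:, j], [0.25, 0.5, 0.75]),
                              X[:, j]) for j in range(p)], axis=1)

# 2) Crude weight estimate: per-metabolite logistic coefficients,
#    clipped to be non-negative and normalized to sum to one.
coefs = LogisticRegression().fit(Q, y).coef_.ravel()
w = np.clip(coefs, 0, None)
w /= w.sum()

# 3) Regress the outcome on the single weighted quantile index,
#    collapsing p comparisons into one.
index = Q @ w
fit = LogisticRegression().fit(index.reshape(-1, 1), y)
print("weights:", w.round(2), "index coef:", fit.coef_.ravel().round(2))
```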


2020, Vol 0 (0)
Author(s):  
Alexandra-Maria Tăuţan ◽  
Alessandro C. Rossi ◽  
Ruben de Francisco ◽  
Bogdan Ionescu

Methods developed for automatic sleep stage detection make use of large amounts of data in the form of polysomnographic (PSG) recordings to build predictive models. In this study, we investigate the effect of several dimensionality reduction techniques, i.e., principal component analysis (PCA), factor analysis (FA), and autoencoders (AE), on common classifiers, e.g., random forests (RF), multilayer perceptrons (MLP), and long short-term memory (LSTM) networks, for automated sleep stage detection. Experimental testing is carried out on the MGH Dataset provided in the "You Snooze, You Win: The PhysioNet/Computing in Cardiology Challenge 2018". The signals used as input are the six available electroencephalographic (EEG) channels and their combinations with the other PSG signals provided: electrocardiogram (ECG), electromyogram (EMG), and respiration-based signals (respiratory efforts and airflow). We observe that similar or improved accuracy is obtained in most cases when using any of the dimensionality reduction techniques, a promising result as it reduces the computational load while maintaining, and in some cases improving, the accuracy of automated sleep stage detection. In our study, using autoencoders for dimensionality reduction maintains the performance of the model, while using PCA and FA improves accuracy in most cases.
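A reduced sketch of the reduce-then-classify comparison on synthetic epoch-level features (autoencoders are omitted since they need a deep learning framework; the data generator, feature count, and stage structure are all invented, not taken from the MGH Dataset):

```python
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
# Toy stand-in for epoch-level PSG features: 1000 epochs x 60 features,
# 5 sleep stages (the real study derives features from EEG/ECG/EMG etc.).
n, d, stages = 1000, 60, 5
y = rng.integers(0, stages, size=n)
X = rng.normal(size=(n, d)) + y[:, None] * rng.normal(size=d)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

accs = {}
for name, reducer in [("PCA", PCA(n_components=10)),
                      ("FA", FactorAnalysis(n_components=10))]:
    # Dimensionality reduction feeds a common classifier, as in the study.
    model = make_pipeline(reducer, RandomForestClassifier(random_state=0))
    accs[name] = model.fit(Xtr, ytr).score(Xte, yte)
    print(f"{name} + RF accuracy: {accs[name]:.2f}")
```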

