Maternal Plasma Metabolic Profile Demarcates a Role for Neuroinflammation in Non-Typical Development of Children

Metabolites ◽  
2021 ◽  
Vol 11 (8) ◽  
pp. 545
Author(s):  
Rebecca J. Schmidt ◽  
Donghai Liang ◽  
Stefanie A. Busgang ◽  
Paul Curtin ◽  
Cecilia Giulivi

Maternal and cord plasma metabolomics were used to elucidate biological pathways associated with increased diagnosis risk for autism spectrum disorders (ASD). Metabolome-wide associations were assessed in both maternal and umbilical cord plasma in relation to diagnoses of ASD and other non-typical development (Non-TD) compared to typical development (TD) in the Markers of Autism Risk in Babies: Learning Early Signs (MARBLES) cohort study of children born to mothers who already have at least one child with ASD. Analyses were stratified by sample matrix type, machine mode, and annotation confidence level. Dimensionality reduction techniques [i.e., principal component analysis (PCA) and random subset weighted quantile sum regression (WQSRS)] were used to minimize the high multiple-comparison burden. With WQSRS, a metabolite mixture obtained from the negative mode of maternal plasma decreased the odds of Non-TD compared to TD. These metabolites, all related to the prostaglandin pathway, underscored the relevance of neuroinflammation status. No other significant findings were observed. Dimensionality reduction strategies provided confirming evidence that a set of maternal plasma metabolites is important in distinguishing Non-TD from TD diagnosis. A lower risk for Non-TD was linked to anti-inflammatory elements, tying neuroinflammation to detrimental brain function, consistent with studies ranging from neurodevelopment to neurodegeneration.
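
As a rough illustration of the dimensionality reduction step described above, the sketch below applies PCA to a hypothetical maternal metabolite matrix and relates the leading components to a binary Non-TD vs. TD outcome; the data, dimensions, and model are placeholders, not the MARBLES analysis pipeline.

```python
# Minimal sketch: PCA-based dimensionality reduction of a metabolite matrix,
# followed by logistic regression on the leading components.
# Synthetic placeholder data; not the MARBLES pipeline or its WQSRS step.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 800))          # 200 samples x 800 metabolite features
y = rng.integers(0, 2, size=200)         # 0 = TD, 1 = Non-TD (placeholder labels)

X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=10).fit(X_std)    # collapse correlated metabolites
scores = pca.transform(X_std)            # far fewer tests than 800 features

model = LogisticRegression(max_iter=1000).fit(scores, y)
odds_ratios = np.exp(model.coef_.ravel())  # per-component odds of Non-TD vs. TD
print(odds_ratios)
```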

2020 ◽  
Vol 12 (22) ◽  
pp. 3703
Author(s):  
Adrian Doicu ◽  
Dmitry S. Efremenko ◽  
Thomas Trautmann

A spectral acceleration approach for the spherical harmonics discrete ordinate method (SHDOM) is designed. This approach combines the correlated k-distribution method with dimensionality reduction techniques applied to the optical parameters of an atmospheric system. The dimensionality reduction techniques used in this study are the linear embedding methods: principal component analysis (PCA), locality pursuit embedding, locality preserving projection, and locally embedded analysis. Through a numerical analysis, it is shown that, relative to the correlated k-distribution method, PCA in conjunction with a second-order-of-scattering approximation yields an acceleration factor of 12. This implies that SHDOM equipped with this acceleration approach is efficient enough to perform spectral integration of radiance fields in inhomogeneous multi-dimensional media.
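
The sketch below shows the kind of linear embedding such an acceleration relies on, assuming a matrix of layer-resolved optical depths sampled at many wavenumbers (random placeholder data; the actual coupling to SHDOM and the correlated k-distribution is not shown): project the spectral samples onto a few principal components, work in the reduced space, and reconstruct.

```python
# Sketch of linear-embedding compression of spectral optical parameters.
# Placeholder random spectra; not the SHDOM acceleration scheme itself.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_wavenumbers, n_layers = 5000, 60
tau = rng.lognormal(size=(n_wavenumbers, n_layers))   # optical depth per layer

pca = PCA(n_components=4).fit(tau)
scores = pca.transform(tau)                 # 5000 x 4 instead of 5000 x 60
tau_approx = pca.inverse_transform(scores)  # reconstruction used downstream

rel_err = np.linalg.norm(tau - tau_approx) / np.linalg.norm(tau)
print(f"compression {n_layers} -> {scores.shape[1]} dims, relative error {rel_err:.3f}")
```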


2017 ◽  
Vol 10 (13) ◽  
pp. 355 ◽  
Author(s):  
Reshma Remesh ◽  
Pattabiraman. V

Dimensionality reduction techniques are used to reduce the complexity of analyzing high-dimensional data sets. The raw input data set may have a large number of dimensions, and analysis can be slow and lead to wrong predictions if unnecessary attributes are considered. Using dimensionality reduction techniques, one can reduce the dimensions of the input data toward accurate prediction at lower cost. In this paper, the different machine learning approaches used for dimensionality reduction, such as PCA, SVD, LDA, kernel principal component analysis, and artificial neural networks, have been studied.
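
A small sketch of most of the techniques surveyed above, side by side on a stand-in dataset (scikit-learn's digits; not the paper's experiments, and the neural-network approach is omitted):

```python
# Illustrative 2-D embeddings with several of the surveyed techniques.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA, TruncatedSVD, KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_digits(return_X_y=True)

embeddings = {
    "PCA": PCA(n_components=2).fit_transform(X),
    "SVD": TruncatedSVD(n_components=2).fit_transform(X),
    "LDA": LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y),
    "Kernel PCA": KernelPCA(n_components=2, kernel="rbf").fit_transform(X),
}
for name, Z in embeddings.items():
    print(name, Z.shape)   # each technique maps 64 pixel features to 2 dims
```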


2021 ◽  
Vol 54 (4) ◽  
pp. 1-34
Author(s):  
Felipe L. Gewers ◽  
Gustavo R. Ferreira ◽  
Henrique F. De Arruda ◽  
Filipi N. Silva ◽  
Cesar H. Comin ◽  
...  

Principal component analysis (PCA) is often applied for analyzing data in the most diverse areas. This work reports, in an accessible and integrated manner, several theoretical and practical aspects of PCA. The basic principles underlying PCA, data standardization, possible visualizations of the PCA results, and outlier detection are subsequently addressed. Next, the potential of using PCA for dimensionality reduction is illustrated on several real-world datasets. Finally, we summarize PCA-related approaches and other dimensionality reduction techniques. All in all, the objective of this work is to assist researchers from the most diverse areas in using and interpreting PCA.
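
The following is a hedged sketch of the workflow the paper walks through: standardization, projection, inspection of explained variance, and a simple outlier flag based on score distance. The Iris dataset stands in for the real-world datasets discussed in the article.

```python
# Sketch of a standard PCA workflow: standardize, project, inspect variance,
# flag outliers by distance in the score plane. Stand-in data, not the paper's.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)
X_std = StandardScaler().fit_transform(X)       # standardization before PCA

pca = PCA().fit(X_std)
print(pca.explained_variance_ratio_.cumsum())   # variance retained per component

scores = pca.transform(X_std)[:, :2]
dist = np.sqrt((scores ** 2).sum(axis=1))       # distance from the origin in PC space
outliers = np.where(dist > dist.mean() + 3 * dist.std())[0]
print("candidate outliers:", outliers)
```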


2019 ◽  
pp. 85-98 ◽  
Author(s):  
Ana del Águila ◽  
Dmitry S. Efremenko ◽  
Thomas Trautmann

Hyper-spectral sensors take measurements in narrow contiguous bands across the electromagnetic spectrum. Usually, the goal is to detect a certain object or a component of the medium with a unique spectral signature. In particular, hyper-spectral measurements are used in atmospheric remote sensing to detect trace gases. To improve the efficiency of hyper-spectral processing algorithms, data reduction methods are applied. This paper outlines dimensionality reduction techniques in the context of hyper-spectral remote sensing of the atmosphere. Dimensionality reduction excludes redundant information from the data and is currently an integral part of high-performance radiative transfer models. In this survey, it is shown how principal component analysis can be applied to spectral radiance modelling and retrieval of atmospheric constituents, thereby speeding up data processing by orders of magnitude. The discussed techniques are generic and can be readily applied to atmospheric as well as material science problems.
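
A toy sketch of the compression idea, assuming a set of simulated radiance spectra (synthetic sinusoidal data, not real radiative-transfer output): PCA keeps only the components needed to retain almost all of the variance, so downstream processing operates on a handful of numbers per spectrum.

```python
# Sketch: compress simulated hyper-spectral radiances with PCA so that
# downstream processing works on a few principal components per spectrum.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_spectra, n_channels = 1000, 3000
base = np.sin(np.linspace(0, 20, n_channels))            # smooth spectral shape
radiances = base + 0.01 * rng.normal(size=(n_spectra, n_channels))

pca = PCA(n_components=0.999).fit(radiances)             # keep 99.9% of the variance
compressed = pca.transform(radiances)
print(f"{n_channels} channels -> {compressed.shape[1]} components")
```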


2019 ◽  
Vol 8 (S3) ◽  
pp. 66-71
Author(s):  
T. Sudha ◽  
P. Nagendra Kumar

Data mining is one of the major areas of research, and clustering is one of its main functionalities. High dimensionality is one of the main issues in clustering, and dimensionality reduction can be used as a solution to this problem. The present work makes a comparative study of dimensionality reduction techniques, namely t-distributed stochastic neighbour embedding (t-SNE) and probabilistic principal component analysis (PPCA), in the context of clustering. High-dimensional data have been reduced to low-dimensional data using both techniques. Cluster analysis has been performed on the high-dimensional data as well as on the low-dimensional data sets obtained through t-SNE and PPCA, with varying numbers of clusters. Mean squared error, time, and space have been considered as parameters for comparison. The results obtained show that the time taken to convert the high-dimensional data into low-dimensional data using PPCA is higher than the time taken using t-SNE, while the storage space required by the data set reduced through PPCA is less than that required by the data set reduced through t-SNE.
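
A rough comparison in the spirit of the study is sketched below: reduce to two dimensions with t-SNE and with PCA (used here as a stand-in for probabilistic PCA, which scikit-learn does not expose as a separate class), time both, and cluster each embedding. The dataset and parameters are placeholders, not those of the paper.

```python
# Rough timing-and-clustering comparison of t-SNE vs. PCA (stand-in for PPCA).
import time
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE
from sklearn.cluster import KMeans

X, _ = load_digits(return_X_y=True)

t0 = time.perf_counter()
Z_tsne = TSNE(n_components=2, random_state=0).fit_transform(X)
t_tsne = time.perf_counter() - t0

t0 = time.perf_counter()
Z_pca = PCA(n_components=2, random_state=0).fit_transform(X)
t_pca = time.perf_counter() - t0

for name, Z, t in [("t-SNE", Z_tsne, t_tsne), ("PCA", Z_pca, t_pca)]:
    labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(Z)
    print(f"{name}: {t:.2f} s to embed, {len(np.unique(labels))} clusters found")
```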


2020 ◽  
Vol 0 (0) ◽  
Author(s):  
Alexandra-Maria Tăuţan ◽  
Alessandro C. Rossi ◽  
Ruben de Francisco ◽  
Bogdan Ionescu

Methods developed for automatic sleep stage detection make use of large amounts of data in the form of polysomnographic (PSG) recordings to build predictive models. In this study, we investigate the effect of several dimensionality reduction techniques, i.e., principal component analysis (PCA), factor analysis (FA), and autoencoders (AE), on common classifiers, e.g., random forests (RF), multilayer perceptrons (MLP), and long short-term memory (LSTM) networks, for automated sleep stage detection. Experimental testing is carried out on the MGH Dataset provided in the “You Snooze, You Win: The PhysioNet/Computing in Cardiology Challenge 2018”. The signals used as input are the six available electroencephalographic (EEG) channels and their combinations with the other PSG signals provided: ECG (electrocardiogram), EMG (electromyogram), and respiration-based signals (respiratory efforts and airflow). We observe that similar or improved accuracy is obtained in most cases when using any of the dimensionality reduction techniques, which is a promising result as it reduces the computational load while maintaining, and in some cases improving, the accuracy of automated sleep stage detection. In our study, using autoencoders for dimensionality reduction maintains the performance of the model, while with PCA and FA the accuracy of the models is in most cases improved.
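
The reduce-then-classify setup described above can be sketched as follows, with synthetic per-epoch features standing in for the PSG-derived ones and a random forest as the classifier (the autoencoder and LSTM variants are omitted for brevity):

```python
# Sketch: dimensionality reduction (PCA or FA) feeding a classifier,
# on synthetic stand-in features rather than the MGH Dataset.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA, FactorAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(600, 120))     # epochs x hand-crafted features (placeholder)
y = rng.integers(0, 5, size=600)    # five sleep stages (placeholder labels)

for name, reducer in [("PCA", PCA(n_components=20)),
                      ("FA", FactorAnalysis(n_components=20))]:
    clf = make_pipeline(StandardScaler(), reducer,
                        RandomForestClassifier(random_state=0))
    acc = cross_val_score(clf, X, y, cv=3).mean()
    print(f"{name} + RF accuracy: {acc:.2f}")
```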


2011 ◽  
Vol 08 (02) ◽  
pp. 161-169
Author(s):  
E. SIVASANKAR ◽  
H. SRIDHAR ◽  
V. BALAKRISHNAN ◽  
K. ASHWIN ◽  
R. S. RAJESH

Data mining methods are used to mine voluminous data for useful information. The data to be mined may have a large number of dimensions, so the mining process can take a long time; in general, the computation time is an exponential function of the number of dimensions. It is in this context that dimensionality reduction techniques are used to speed up the decision-making process. Dimensionality reduction techniques can be categorized as feature selection and feature extraction techniques, and in this paper we compare the two categories. Feature selection has been implemented using the information gain and Goodman–Kruskal measures, while principal component analysis has been used for feature extraction. In order to compare the accuracy of the methods, we have also implemented a classifier using a back-propagation neural network. In general, it is found that feature extraction methods are more accurate than feature selection methods in the framework of credit risk analysis.
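
A hedged sketch of the selection-versus-extraction comparison is given below: mutual information is used as a stand-in for the information gain measure (the Goodman–Kruskal measure is not implemented here), PCA provides feature extraction, and each feeds a back-propagation neural network. The data are synthetic, not the credit data set used in the paper.

```python
# Feature selection (mutual information, a stand-in for information gain)
# vs. feature extraction (PCA), each feeding a back-propagation network.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=40,
                           n_informative=8, random_state=0)

pipelines = {
    "feature selection": make_pipeline(StandardScaler(),
                                       SelectKBest(mutual_info_classif, k=10),
                                       MLPClassifier(max_iter=1000, random_state=0)),
    "feature extraction": make_pipeline(StandardScaler(),
                                        PCA(n_components=10),
                                        MLPClassifier(max_iter=1000, random_state=0)),
}
for name, pipe in pipelines.items():
    print(name, round(cross_val_score(pipe, X, y, cv=3).mean(), 3))
```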


Proceedings ◽  
2019 ◽  
Vol 24 (1) ◽  
pp. 6 ◽  
Author(s):  
K Nivedita Priyadarshini ◽  
V Sivashankari ◽  
Sulochana Shekhar ◽  
K Balasubramani

Hyperspectral datasets capture ground cover explicitly in hundreds of bands. Filtering contiguous hyperspectral bands can potentially discriminate surface features. Therefore, in this study, the number of spectral bands is reduced without losing the original information through a process known as dimensionality reduction (DR). Redundant bands arise because neighboring bands are highly correlated and share similar information. The benefits of utilizing dimensionality reduction include reducing the complexity of data during processing and transforming the original data to remove the correlation among bands. In this paper, two DR methods, principal component analysis (PCA) and minimum noise fraction (MNF), are applied to the Airborne Visible/Infrared Imaging Spectrometer-Next Generation (AVIRIS-NG) dataset of Kalaburagi and discussed.
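
Below is a rough numpy sketch of the two DR methods named above on a synthetic hyperspectral cube (the AVIRIS-NG data are not used). The MNF step is a simplified version in which the noise covariance is estimated from differences of neighbouring pixels, the data are noise-whitened, and PCA is then applied.

```python
# Sketch of PCA and a simplified MNF on a synthetic hyperspectral cube.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
rows, cols, bands = 50, 50, 200
cube = rng.normal(size=(rows, cols, bands)) + np.linspace(0, 5, bands)
X = cube.reshape(-1, bands)                        # pixels x bands

# PCA on the flattened cube
pcs = PCA(n_components=10).fit_transform(X)

# Simplified MNF: estimate noise from neighbouring-pixel differences,
# whiten by the noise covariance, then apply PCA to the whitened data.
noise = (cube[:, 1:, :] - cube[:, :-1, :]).reshape(-1, bands) / np.sqrt(2)
noise_cov = np.cov(noise, rowvar=False) + 1e-6 * np.eye(bands)
L = np.linalg.cholesky(noise_cov)
X_whitened = np.linalg.solve(L, (X - X.mean(axis=0)).T).T
mnf = PCA(n_components=10).fit_transform(X_whitened)

print(pcs.shape, mnf.shape)                        # 10 components per pixel each
```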


2019 ◽  
Vol 8 (3) ◽  
pp. 7153-7160

In the analysis of big data, dimensionality reduction techniques play a significant role in fields where the data are huge, with many columns or classes. High-dimensional data contain thousands of features, many of which carry useful information, alongside many redundant or irrelevant features that reduce data quality and computational efficiency. Procedures that mathematically reduce the number of dimensions are known as dimensionality reduction techniques. Dimensionality reduction algorithms such as principal component analysis (PCA), random projection (RP), and non-negative matrix factorization (NMF) aim to remove irrelevant information from the data, although the features and attributes they extract are not always able to characterize the data into distinct divisions. This paper reviews the traditional machine learning methods used for reducing dimensionality and proposes a view of how deep learning can be used for dimensionality reduction.
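
A small sketch of the three traditional techniques mentioned above, applied to a non-negative stand-in matrix (NMF requires non-negative input); the deep-learning approach proposed in the paper is not shown.

```python
# Illustrative reduction of a non-negative matrix with PCA, RP, and NMF.
import numpy as np
from sklearn.decomposition import PCA, NMF
from sklearn.random_projection import GaussianRandomProjection

rng = np.random.default_rng(5)
X = rng.random((300, 1000))          # samples x features, non-negative placeholder

reduced = {
    "PCA": PCA(n_components=20).fit_transform(X),
    "RP": GaussianRandomProjection(n_components=20, random_state=0).fit_transform(X),
    "NMF": NMF(n_components=20, init="nndsvda", max_iter=500).fit_transform(X),
}
for name, Z in reduced.items():
    print(name, Z.shape)             # 1000 features reduced to 20 in each case
```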

