Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis

2007 ◽  
Vol 56 (6) ◽  
pp. 75-83 ◽  
Author(s):  
X. Flores ◽  
J. Comas ◽  
I.R. Roda ◽  
L. Jiménez ◽  
K.V. Gernaey

The main objective of this paper is to present the application of selected multivariable statistical techniques to plant-wide wastewater treatment plant (WWTP) control strategy analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulating several control strategies on the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques make it possible i) to determine natural groups or clusters of control strategies with similar behaviour, ii) to find and interpret hidden, complex and causal relationships in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets and allows an improved use of the information for effective evaluation of control strategies.
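To illustrate how these three techniques fit together, the following sketch (not the authors' code; the evaluation matrix, group structure and library choices are invented for illustration) clusters a synthetic strategy-by-criteria matrix, projects it with PCA, and uses discriminant analysis to check how well the criteria separate the groups:

```python
# Hypothetical sketch, not from the paper: CA + PCA + DA on a synthetic
# "evaluation matrix" of 30 control strategies x 6 performance criteria.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# two synthetic families of strategies with distinct performance profiles
X = np.vstack([rng.normal(0, 1, (15, 6)), rng.normal(4, 1, (15, 6))])
Xs = StandardScaler().fit_transform(X)

clusters = AgglomerativeClustering(n_clusters=2).fit_predict(Xs)  # CA
scores = PCA(n_components=2).fit_transform(Xs)                    # PCA/FA view
lda = LinearDiscriminantAnalysis().fit(Xs, clusters)              # DA
print(lda.score(Xs, clusters))  # how cleanly the criteria separate the groups
```

In a real application, X would hold the simulated evaluation criteria for each BSM2 control strategy, and the DA coefficients would point to the most discriminant variables.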

Author(s):  
Andrew J. Connolly ◽  
Jacob T. VanderPlas ◽  
Alexander Gray ◽  
...  

With the dramatic increase in data available from a new generation of astronomical telescopes and instruments, many analyses must address the complexity as well as the size of the data set. This chapter deals with how we can learn which measurements, properties, or combinations thereof carry the most information within a data set. It describes techniques related to concepts discussed in the context of Gaussian distributions, density estimation, and information content. The chapter begins with an exploration of the problems posed by high-dimensional data. It then describes the data sets used in this chapter and introduces perhaps the most important and widely used dimensionality reduction technique, principal component analysis (PCA). The remainder of the chapter discusses several alternative techniques that address some of the weaknesses of PCA.


2018 ◽  
Vol 17 ◽  
pp. 117693511877108 ◽  
Author(s):  
Min Wang ◽  
Steven M Kornblau ◽  
Kevin R Coombes

Principal component analysis (PCA) is one of the most common techniques in the analysis of biological data sets, but applying PCA raises 2 challenges. First, one must determine the number of significant principal components (PCs). Second, because each PC is a linear combination of genes, it rarely has a biological interpretation. Existing methods to determine the number of PCs are either subjective or computationally extensive. We review several methods and describe a new R package, PCDimension, that implements additional methods, the most important being an algorithm that extends and automates a graphical Bayesian method. Using simulations, we compared the methods. Our newly automated procedure is competitive with the best methods when considering both accuracy and speed and is the most accurate when the number of objects is small compared with the number of attributes. We applied the method to a proteomics data set from patients with acute myeloid leukemia. Proteins in the apoptosis pathway could be explained using 6 PCs. By clustering the proteins in PC space, we were able to replace the PCs by 6 “biological components,” 3 of which could be immediately interpreted from the current literature. We expect this approach combining PCA with clustering to be widely applicable.
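One classical rule for the first challenge (how many PCs are significant) is the broken-stick criterion; a minimal, self-contained sketch of that rule follows (this is the textbook criterion, not the automated graphical Bayesian procedure the PCDimension package adds):

```python
# Illustrative sketch of the broken-stick rule for choosing the number of
# significant PCs. Compare each PC's variance fraction with the expected
# fraction b_k = (1/p) * sum_{i=k}^{p} 1/i from randomly breaking a stick.
import numpy as np

def broken_stick_dimension(eigenvalues):
    """Count leading PCs whose variance fraction exceeds the
    broken-stick expectation."""
    lam = np.asarray(eigenvalues, dtype=float)
    p = lam.size
    frac = lam / lam.sum()
    expected = np.array([np.sum(1.0 / np.arange(k, p + 1)) / p
                         for k in range(1, p + 1)])
    d = 0
    for f, e in zip(frac, expected):
        if f > e:
            d += 1
        else:
            break
    return d

print(broken_stick_dimension([10, 5, 3, 0.1, 0.1, 0.1]))  # -> 3
```

The eigenvalues would normally come from the PCA of a real data matrix; the vector above is an invented example with three dominant components.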


2013 ◽  
Vol 13 (6) ◽  
pp. 3133-3147 ◽  
Author(s):  
Y. L. Roberts ◽  
P. Pilewskie ◽  
B. C. Kindel ◽  
D. R. Feldman ◽  
W. D. Collins

Abstract. The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is a climate observation system that has been designed to monitor the Earth's climate with unprecedented absolute radiometric accuracy and SI traceability. Climate Observation System Simulation Experiments (OSSEs) have been generated to simulate CLARREO hyperspectral shortwave imager measurements to help define the measurement characteristics needed for CLARREO to achieve its objectives. To evaluate how well the OSSE-simulated reflectance spectra reproduce the Earth's climate variability at the beginning of the 21st century, we compared the variability of the OSSE reflectance spectra to that of the reflectance spectra measured by the Scanning Imaging Absorption Spectrometer for Atmospheric Cartography (SCIAMACHY). Principal component analysis (PCA) is a multivariate decomposition technique used to represent and study the variability of hyperspectral radiation measurements. Using PCA, between 99.7% and 99.9% of the total variance of the OSSE and SCIAMACHY data sets can be explained by subspaces defined by six principal components (PCs). To quantify how much information is shared between the simulated and observed data sets, we spectrally decomposed the intersection of the two data set subspaces. The results from four cases in 2004 showed that the two data sets share eight (January and October) and seven (April and July) dimensions, which correspond to about 99.9% of the total SCIAMACHY variance for each month. The spectral nature of these shared spaces, understood by examining the transformed eigenvectors calculated from the subspace intersections, exhibits similar physical characteristics to the original PCs calculated from each data set, such as water vapor absorption, vegetation reflectance, and cloud reflectance.
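The subspace-comparison idea can be sketched numerically with principal angles between the two PCA subspaces; this is a hedged stand-in for the paper's spectral decomposition of the subspace intersection, using synthetic spectra rather than OSSE or SCIAMACHY data:

```python
# Illustrative sketch: build PCA subspaces for two synthetic spectral data
# sets driven by the same underlying modes, then measure their overlap via
# principal angles (angles near 0 indicate shared dimensions).
import numpy as np
from scipy.linalg import subspace_angles
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
modes = rng.normal(size=(6, 100))  # six shared "physical" spectral modes

# two data sets (e.g. simulated vs observed) driven by the same modes
A = rng.normal(size=(500, 6)) @ modes + 0.01 * rng.normal(size=(500, 100))
B = rng.normal(size=(500, 6)) @ modes + 0.01 * rng.normal(size=(500, 100))

U = PCA(n_components=6).fit(A).components_.T  # 100 x 6 orthonormal basis
V = PCA(n_components=6).fit(B).components_.T
angles = subspace_angles(U, V)                # radians
print(np.rad2deg(angles).max())               # small = subspaces nearly shared
```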


2016 ◽  
Vol 2 (4) ◽  
pp. 211
Author(s):  
Girdhari Lal Chaurasia ◽  
Mahesh Kumar Gupta ◽  
Praveen Kumar Tandon

Water is an essential resource for all organisms, plants and animals, including human beings. It is the backbone of the agricultural and industrial sectors and of all small business units. Growth in human population and economic activity has tremendously increased the demand for large-scale supplies of fresh water among various competing end users. The quality of water is evaluated in terms of physical, chemical and biological parameters. A particular problem in water quality monitoring is the complexity associated with analyzing the large number of measured variables. The data sets contain rich information about the behavior of the water resources, and multivariate statistical approaches allow hidden information about the possible influences of the environment on water quality to be derived from them. Classification, modeling and interpretation of monitored data are the most important steps in the assessment of water quality. Different multivariate statistical techniques, such as cluster analysis (CA), principal component analysis (PCA) and factor analysis (FA), help to identify important components or factors accounting for most of the variance of a system. In the present study, water samples were analyzed for various physicochemical parameters by different methods following the standards of APHA, BIS and WHO, and the results were subjected to further statistical analysis, viz. cluster analysis, to understand the similarities and differences among the sampling stations. Three clusters were found: cluster 1 comprised sampling locations 1, 3 and 5; cluster 2 comprised sampling location 2; and cluster 3 comprised sampling location 4. Principal component analysis/factor analysis is a pattern recognition technique used to assess the correlations between observations in terms of underlying factors that are not directly observable.
Observations that are correlated, either positively or negatively, are likely to be affected by the same factors, while observations that are not correlated are influenced by different factors. In our study, three factors explained 99.827% of the variance. Factor 1 accounted for 51.619% of the total variance, with strong positive loadings on TSS, TS, Temp, TDS and phosphate and a moderate loading on electrical conductivity (loading values 0.986, 0.970, 0.792, 0.744, 0.695 and 0.701, respectively). Factor 2 accounted for 27.236% of the total variance, with moderate positive loadings on total alkalinity and temperature (0.723 and 0.606, respectively) and moderate negative loadings on conductivity, TDS and chloride (-0.698, -0.690 and -0.582). Factor 3 accounted for 20.972% of the variance, with a strong positive loading on pH (0.872) and moderate positive loadings on chloride and phosphate (0.721 and 0.569, respectively).
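As a hedged illustration of where loading tables like the one above come from, the sketch below runs PCA on standardized synthetic data; the parameter names only mirror the study's, and since the values are random, the loadings will not reproduce the reported numbers:

```python
# Illustrative sketch: PCA loadings as variable-component correlations on a
# synthetic water-quality matrix (values are random placeholders).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
params = ["TSS", "TS", "Temp", "TDS", "Phosphate", "EC", "pH", "Chloride"]
X = rng.normal(size=(50, len(params)))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=50)  # make TS track TSS
X[:, 3] = X[:, 0] + 0.2 * rng.normal(size=50)  # make TDS track TSS too

Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=3).fit(Z)
# loadings = eigenvector * sqrt(eigenvalue): the correlation of each
# standardized variable with each component
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
for name, row in zip(params, loadings):
    print(f"{name:10s}", np.round(row, 3))
```

The correlated block (TSS, TS, TDS) loads strongly on the first component, the synthetic analogue of the strong Factor 1 loadings reported above.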


2016 ◽  
Author(s):  
Dasapta Erwin Irawan ◽  
Thomas Triadi Putranto

Abstract. The following paper briefly describes the data set related to our project "Hydrochemical assessment of Semarang Groundwater Quality". All 58 samples were taken in 1992, 1993, 2003, 2006, and 2007 using well point data from several reports by the Ministry of Energy and Mineral Resources and independent consultants. We provide 20 parameters for each sample (sample id, coord X, coord Y, well depth, water level, water elevation, TDS, pH, EC, K, Ca, Na, Mg, Cl, SO4, HCO3, year, ion balance, screen location, and chemical facies). The chemical composition was determined in the Water Quality Laboratory, Universitas Diponegoro, using mass spectrophotometry. The statistical treatment of the data set (available on Zenodo, doi:10.5281/zenodo.57293) was as follows: (1) data preparation into csv file format and loading into the R environment; (2) data treatment, including: correlation matrix, cluster analysis using k-means and hierarchical cluster analysis, and principal component analysis. For analysis and visualization we used the following R packages: ggplot2, dplyr, FactoMineR, factoextra, cluster, ggcorrplot, and ape.
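A rough Python transcription of those treatment steps is sketched below (the authors worked in R; the column names follow their parameter list, but the values are random placeholders, not the Zenodo data):

```python
# Illustrative sketch of the described workflow: correlation matrix, k-means
# and hierarchical clustering, and PCA on a placeholder 58-sample table.
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
cols = ["TDS", "pH", "EC", "K", "Ca", "Na", "Mg", "Cl", "SO4", "HCO3"]
df = pd.DataFrame(rng.normal(size=(58, len(cols))), columns=cols)

corr = df.corr()                                       # (1) correlation matrix
Z = StandardScaler().fit_transform(df)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(Z)          # k-means
hc = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")  # HCA
scores = PCA(n_components=2).fit_transform(Z)          # (2) PCA
print(corr.shape, scores.shape)
```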


2012 ◽  
Vol 12 (10) ◽  
pp. 28305-28341
Author(s):  
Y. L. Roberts ◽  
P. Pilewskie ◽  
B. C. Kindel ◽  
D. R. Feldman ◽  
W. D. Collins

Abstract. The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is a climate observation system that has been designed to monitor the Earth's climate with unprecedented absolute radiometric accuracy and SI traceability. Climate Observation System Simulation Experiments (OSSEs) have been generated to simulate CLARREO hyperspectral shortwave imager measurements to help define the measurement characteristics needed for CLARREO to achieve its objectives. To evaluate how well the OSSE-simulated reflectance spectra reproduce the Earth's climate variability at the beginning of the 21st century, we compared the variability of the OSSE reflectance spectra to that of the reflectance spectra measured by the Scanning Imaging Absorption Spectrometer for Atmospheric Cartography (SCIAMACHY). Principal component analysis (PCA) is a multivariate spectral decomposition technique used to represent and study the variability of hyperspectral radiation measurements. Using PCA, between 99.7% and 99.9% of the total variance of the OSSE and SCIAMACHY data sets can be explained by subspaces defined by six principal components (PCs). To quantify how much information is shared between the simulated and observed data sets, we spectrally decomposed the intersection of the two data set subspaces. The results from four cases in 2004 showed that the two data sets share eight (January and October) and seven (April and July) dimensions, which correspond to about 99.9% of the total SCIAMACHY variance for each month. The spectral nature of these shared spaces, understood by examining the transformed eigenvectors calculated from the subspace intersections, exhibits similar physical characteristics to the original PCs calculated from each data set, such as water vapor absorption, vegetation reflectance, and cloud reflectance.


2020 ◽  
Vol 498 (3) ◽  
pp. 3440-3451
Author(s):  
Alan F Heavens ◽  
Elena Sellentin ◽  
Andrew H Jaffe

ABSTRACT Bringing a high-dimensional data set into science-ready shape is a formidable challenge that often necessitates data compression. Compression has accordingly become a key consideration for contemporary cosmology, affecting public data releases, and reanalyses searching for new physics. However, data compression optimized for a particular model can suppress signs of new physics, or even remove them altogether. We therefore provide a solution for exploring new physics during data compression. In particular, we store additional agnostic compressed data points, selected to enable precise constraints of non-standard physics at a later date. Our procedure is based on the maximal compression of the MOPED algorithm, which optimally filters the data with respect to a baseline model. We select additional filters, based on a generalized principal component analysis, which are carefully constructed to scout for new physics at high precision and speed. We refer to the augmented set of filters as MOPED-PC. They enable an analytic computation of Bayesian Evidence that may indicate the presence of new physics, and fast analytic estimates of best-fitting parameters when adopting a specific non-standard theory, without further expensive MCMC analysis. As there may be large numbers of non-standard theories, the speed of the method becomes essential. Should no new physics be found, then our approach preserves the precision of the standard parameters. As a result, we achieve very rapid and maximally precise constraints of standard and non-standard physics, with a technique that scales well to large dimensional data sets.
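A minimal numerical sketch of the MOPED compression that MOPED-PC builds on is given below, assuming Gaussian noise and a one-parameter linear model; all numbers are synthetic, and this is the textbook single-parameter filter, not the authors' augmented filter set:

```python
# Illustrative sketch of MOPED compression: one filter per model parameter,
# built from the parameter derivative of the model mean weighted by the
# inverse noise covariance, turning a long data vector into one number.
import numpy as np

rng = np.random.default_rng(0)
n = 200                              # data dimension
t = np.linspace(0.0, 1.0, n)
sigma = 0.05
Cinv = np.eye(n) / sigma**2          # inverse noise covariance

# model mean mu(theta) = theta * t, so dmu/dtheta = t
dmu = t
b = Cinv @ dmu
b = b / np.sqrt(dmu @ Cinv @ dmu)    # MOPED filter, unit Fisher norm

theta_true = 1.7
x = theta_true * t + sigma * rng.normal(size=n)
y = b @ x                            # single compressed number
theta_hat = y / (b @ dmu)            # parameter estimate from compressed datum
print(theta_hat)
```

For a linear model with Gaussian noise, this single compressed number carries the same Fisher information about theta as the full data vector, which is what makes the compression lossless for the baseline model.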


2020 ◽  
Author(s):  
Anna Morozova ◽  
Rania Rebbah ◽  
M. Alexandra Pais

Geomagnetic field (GMF) variations from external sources are classified as regular diurnal variations or variations occurring during periods of disturbances. The most significant regular variations are the quiet solar daily variation (Sq) and the disturbance daily variation (SD). These variations have well-recognized daily cycles and need to be accounted for before analysis of the disturbed field. Preliminary analysis of the GMF variations shows that principal component analysis (PCA) is a useful tool for extracting regular variations of the GMF; however, the requirements on data set length, geomagnetic activity level, etc. still need to be established.

Here we present preliminary results of a PCA-based Sq extraction procedure based on the analysis of Coimbra Geomagnetic Observatory (COI) measurements of the geomagnetic field components H, X, Y and Z between 2007 and 2015. The PCA-based Sq curves are compared with the standard ones obtained using the 5 international quiet days (IQD) per month. PCA was applied to data sets of different lengths: either a one-month-long data set from one of the years 2007-2015, or data series for the same month from different years (2007-2015) combined together. For most of the analyzed years the first PCA mode (PC1) was identified as the SD variation and the second mode (PC2) as the Sq variation.
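The extraction idea can be sketched as follows, assuming hourly values arranged as a days-by-hours matrix; the synthetic daily cycles here only stand in for real COI data, and which mode captures SD versus Sq depends on their relative day-to-day variability:

```python
# Illustrative sketch: PCA on a month of synthetic hourly geomagnetic data,
# with the leading components as candidate SD and Sq daily curves.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
hours = np.arange(24)
sq = 10 * np.sin(2 * np.pi * hours / 24)        # stand-in quiet daily cycle
sd = 4 * np.sin(4 * np.pi * hours / 24 + 1.0)   # stand-in disturbance cycle

# one month: days x hours, with day-to-day amplitude variability plus noise
days = 30
X = (rng.normal(1.0, 0.1, (days, 1)) * sq
     + rng.normal(1.0, 0.5, (days, 1)) * sd
     + 0.5 * rng.normal(size=(days, 24)))

pca = PCA(n_components=2).fit(X)
pc1, pc2 = pca.components_   # candidate daily-variation curves
print(pca.explained_variance_ratio_)
```

In this synthetic setup the disturbance cycle varies more from day to day, so PC1 tracks the SD-like curve and PC2 the Sq-like curve, mirroring the identification described above.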


2018 ◽  
Vol 74 (5) ◽  
pp. 411-421 ◽  
Author(s):  
Michael C. Thompson ◽  
Duilio Cascio ◽  
Todd O. Yeates

Real macromolecular crystals can be non-ideal in a myriad of ways. This often creates challenges for structure determination, while also offering opportunities for greater insight into the crystalline state and the dynamic behavior of macromolecules. To evaluate whether different parts of a single crystal of a dynamic protein, EutL, might be informative about crystal and protein polymorphism, a microfocus X-ray synchrotron beam was used to collect a series of 18 separate data sets from non-overlapping regions of the same crystal specimen. A principal component analysis (PCA) approach was employed to compare the structure factors and unit cells across the data sets, and it was found that the 18 data sets separated into two distinct groups, with large R values (in the 40% range) and significant unit-cell variations between the members of the two groups. This categorization mapped the different data-set types to distinct regions of the crystal specimen. Atomic models of EutL were then refined against two different data sets obtained by separately merging data from the two distinct groups. A comparison of the two resulting models revealed minor but discernable differences in certain segments of the protein structure, and regions of higher deviation were found to correlate with regions where larger dynamic motions were predicted to occur by normal-mode molecular-dynamics simulations. The findings emphasize that large spatially dependent variations may be present across individual macromolecular crystals. This information can be uncovered by simultaneous analysis of multiple partial data sets and can be exploited to reveal new insights about protein dynamics, while also improving the accuracy of the structure-factor data ultimately obtained in X-ray diffraction experiments.
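A hedged sketch of the grouping step: treat each partial data set as a vector of structure-factor amplitudes and look for groups in PC space. The amplitudes below are synthetic, constructed so that the 18 "data sets" fall into two families, as the real data did:

```python
# Illustrative sketch: PCA + clustering to separate 18 partial data sets
# (rows of structure-factor amplitudes) into two groups.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_refl = 1000
amps_a = rng.normal(10.0, 3.0, n_refl)           # group-A amplitudes
amps_b = amps_a + rng.normal(0.0, 4.0, n_refl)   # group B, well separated

# 18 partial "data sets": 9 per group, plus measurement noise
F = np.vstack([amps_a + rng.normal(0, 0.5, (9, n_refl)),
               amps_b + rng.normal(0, 0.5, (9, n_refl))])

scores = PCA(n_components=2).fit_transform(F)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(labels)
```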


2017 ◽  
Vol 10 (13) ◽  
pp. 355 ◽  
Author(s):  
Reshma Remesh ◽  
Pattabiraman. V

Dimensionality reduction techniques are used to reduce the complexity of analyzing high-dimensional data sets. A raw input data set may have many dimensions, and analysis can become time-consuming and yield wrong predictions if unnecessary data attributes are included. Using dimensionality reduction techniques, one can reduce the dimensionality of the input data, enabling more accurate predictions at lower cost. In this paper, the different machine learning approaches used for dimensionality reduction, such as PCA, SVD, LDA, kernel principal component analysis and artificial neural networks, are studied.
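To make the contrast between two of the surveyed techniques concrete, the illustrative sketch below (dataset and kernel parameters are standard textbook choices, not from the paper) shows kernel PCA recovering structure that linear PCA cannot:

```python
# Illustrative sketch: linear PCA cannot untangle two concentric circles,
# while kernel PCA with an RBF kernel makes them linearly separable.
import numpy as np
from sklearn.datasets import make_circles
from sklearn.decomposition import PCA, KernelPCA
from sklearn.linear_model import LogisticRegression

# two concentric circles: not linearly separable in the original space
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

Z_lin = PCA(n_components=2).fit_transform(X)
Z_rbf = KernelPCA(n_components=2, kernel="rbf", gamma=10).fit_transform(X)

# a linear classifier succeeds only after the kernel mapping
acc_lin = LogisticRegression().fit(Z_lin, y).score(Z_lin, y)
acc_rbf = LogisticRegression().fit(Z_rbf, y).score(Z_rbf, y)
print(acc_lin, acc_rbf)
```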

