Gray-Box Identification for High-Dimensional Manifold Constrained Regression

2009 ◽  
Vol 42 (10) ◽  
pp. 1292-1297 ◽  
Author(s):  
Henrik Ohlsson ◽  
Lennart Ljung


2005 ◽  
Vol 4 (1) ◽  
pp. 22-31 ◽  
Author(s):  
Timo Similä

One of the main tasks in exploratory data analysis is to create an appropriate representation for complex data. In this paper, the problem of creating a representation for observations lying on a low-dimensional manifold embedded in high-dimensional coordinates is considered. We propose a modification of the self-organizing map (SOM) algorithm that is able to learn the manifold structure in the high-dimensional observation coordinates. Any manifold learning algorithm may be incorporated into the proposed training strategy to guide the map onto the manifold surface instead of becoming trapped in local minima; in this paper, the locally linear embedding (LLE) algorithm is adopted. We apply the proposed method successfully to several data sets with manifold geometry, including an illustrative example of a surface as well as image data. We also show with further experiments that the advantage of the method over the basic SOM is restricted to this specific type of data.
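The locally linear embedding step adopted in this abstract can be sketched as follows. This is a minimal NumPy implementation of standard LLE (Roweis and Saul), not the authors' SOM-guided variant; the neighborhood size and regularization constant are illustrative choices:

```python
import numpy as np

def lle(X, n_neighbors=10, n_components=2, reg=1e-3):
    """Minimal locally linear embedding.

    X: (n_samples, n_features) points assumed to lie near a manifold.
    Returns (n_samples, n_components) low-dimensional coordinates.
    """
    n = X.shape[0]
    # Pairwise squared distances -> k nearest neighbors (excluding self).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    knn = np.argsort(d2, axis=1)[:, 1:n_neighbors + 1]

    # Reconstruction weights: express each point from its neighbors.
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[knn[i]] - X[i]                          # centered neighborhood
        C = Z @ Z.T
        C += reg * np.trace(C) * np.eye(n_neighbors)  # regularize local Gram
        w = np.linalg.solve(C, np.ones(n_neighbors))
        W[i, knn[i]] = w / w.sum()                    # weights sum to one

    # Embedding: bottom eigenvectors of M = (I - W)^T (I - W),
    # discarding the constant eigenvector.
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, 1:n_components + 1]
```

For example, unrolling a noisy helix sampled in three dimensions down to one coordinate recovers the arc-length ordering of the points.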


2003 ◽  
Vol 15 (6) ◽  
pp. 1373-1396 ◽  
Author(s):  
Mikhail Belkin ◽  
Partha Niyogi

One of the central problems in machine learning and pattern recognition is to develop appropriate representations for complex data. We consider the problem of constructing a representation for data lying on a low-dimensional manifold embedded in a high-dimensional space. Drawing on the correspondence between the graph Laplacian, the Laplace–Beltrami operator on the manifold, and the connections to the heat equation, we propose a geometrically motivated algorithm for representing the high-dimensional data. The algorithm provides a computationally efficient approach to nonlinear dimensionality reduction that has locality-preserving properties and a natural connection to clustering. Some potential applications and illustrative examples are discussed.
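The construction described here, commonly known as Laplacian eigenmaps, can be sketched in a few lines: build a k-nearest-neighbor graph with heat-kernel weights, form the graph Laplacian, and embed using the bottom nontrivial generalized eigenvectors. This is a minimal illustrative NumPy version; the neighborhood size and heat-kernel bandwidth `t` are arbitrary choices, not values from the paper:

```python
import numpy as np

def laplacian_eigenmaps(X, n_neighbors=10, n_components=2, t=1.0):
    """Minimal Laplacian-eigenmaps embedding.

    Builds a symmetrized k-NN graph with heat-kernel weights
    W_ij = exp(-||x_i - x_j||^2 / t), then embeds with the bottom
    nontrivial eigenvectors of the generalized problem L y = lambda D y.
    """
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    knn = np.argsort(d2, axis=1)[:, 1:n_neighbors + 1]

    # Heat-kernel weights on k-NN edges, symmetrized.
    W = np.zeros((n, n))
    for i in range(n):
        W[i, knn[i]] = np.exp(-d2[i, knn[i]] / t)
    W = np.maximum(W, W.T)

    D = np.diag(W.sum(1))
    L = D - W                                  # graph Laplacian

    # Solve L y = lambda D y via the symmetric form D^{-1/2} L D^{-1/2}.
    Dm = np.diag(1.0 / np.sqrt(np.diag(D)))
    vals, vecs = np.linalg.eigh(Dm @ L @ Dm)
    Y = Dm @ vecs                              # back-transform eigenvectors
    return Y[:, 1:n_components + 1]            # skip the constant eigenvector
```

The locality-preserving property comes from the objective the bottom eigenvectors minimize: heavily weighted (nearby) points are mapped close together in the embedding.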


Author(s):  
Michael Elmegaard ◽  
Jan Rübel ◽  
Mizuho Inagaki ◽  
Atsushi Kawamoto ◽  
Jens Starke

Mechanical systems are typically described by finite element models, resulting in high-dimensional dynamical systems. The high dimensionality precludes the application of certain investigation methods, such as numerical continuation and bifurcation analysis, for studying the dynamical behaviour and its parameter dependence. Nevertheless, the dynamical behaviour usually lives on a low-dimensional manifold, although typically no closed equations are available for the macroscopic quantities of interest. Therefore, an equation-free approach is suggested here to analyse and investigate the vibration behaviour of nonlinear rotating machinery. In a subsequent step, this allows the rotor design specifications to be optimized to reduce unbalance vibrations of a rotor-bearing system with nonlinear effects such as the oil film dynamics. As an example, we provide a simple model of a passenger car turbocharger in which we investigate how the maximal vibration amplitude of the rotor depends on the viscosity of the oil used in the bearings.


Author(s):  
MIAO CHENG ◽  
BIN FANG ◽  
YUAN YAN TANG ◽  
HENGXIN CHEN

Many problems in pattern classification and feature extraction involve dimensionality reduction as a necessary processing step. Traditional manifold learning algorithms, such as ISOMAP, LLE, and Laplacian Eigenmap, seek the low-dimensional manifold in an unsupervised way, while local discriminant analysis methods identify the underlying supervised submanifold structures. In addition, it is well known that the intraclass null subspace contains the most discriminative information when the original data lie in a high-dimensional space. In this paper, we seek the local null space in accordance with the null space LDA (NLDA) approach and show that its computational expense depends mainly on the number of connected edges in the graphs, which may still be unacceptable when a large number of samples is involved. To address this limitation, an improved local null space algorithm is proposed that employs the penalty subspace to approximate the local discriminant subspace. Compared with the traditional approach, the proposed method is more efficient, avoiding the overload problem, at the cost of a theoretically slight loss of discriminant power. A comparative study on classification shows that the performance of the approximative algorithm is quite close to that of the genuine one.
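The null-space LDA idea this paper builds on can be sketched as follows: find the null space of the within-class scatter matrix Sw (where within-class variation vanishes) and maximize between-class scatter inside it. This is a generic global NLDA sketch, not the paper's local or penalty-subspace variant, and the rank tolerance `tol` is an illustrative choice:

```python
import numpy as np

def null_space_lda(X, y, n_components=1, tol=1e-8):
    """Minimal null-space LDA.

    Finds discriminant directions inside the null space of the
    within-class scatter Sw, where between-class scatter Sb is maximized.
    Requires high-dimensional data (more features than samples) so that
    the null space of Sw is nontrivial.
    """
    classes = np.unique(y)
    mean = X.mean(0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)

    # Basis of the null space of Sw: eigenvectors with ~zero eigenvalue.
    w_vals, w_vecs = np.linalg.eigh(Sw)
    N = w_vecs[:, w_vals < tol * w_vals.max()]

    # Maximize projected between-class scatter inside that null space.
    b_vals, b_vecs = np.linalg.eigh(N.T @ Sb @ N)
    V = N @ b_vecs[:, ::-1][:, :n_components]   # top directions first
    return V                                    # columns: discriminant dirs
```

Inside the null space, each class projects (up to numerical precision) onto a single point, so all discriminability comes from the between-class term.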


2012 ◽  
Vol 263-266 ◽  
pp. 2126-2130 ◽  
Author(s):  
Zhi Gang Lou ◽  
Hong Zhao Liu

Manifold learning is a new unsupervised learning method. Its main purpose is to find the inherent structure of a data set. When applied to high-dimensional nonlinear fault samples, it can identify the low-dimensional manifold embedded in the high-dimensional data space and thus effectively discover the essential characteristics of the data for fault identification. Among the many types of faults, some fault conditions resemble the normal operation of the equipment and are therefore easily misjudged. In oil pipeline transportation, for example, the spectral characteristics of normal operations such as running the regulating pump, adjusting valves, and switching pumps are similar to those of a pipeline leakage fault, so leakage faults are easily misdiagnosed. This paper applies a manifold learning algorithm to fault pattern clustering and recognition, and evaluates the algorithm through experiments.


Author(s):  
Navendu S. Patil ◽  
Joseph P. Cusumano

Detecting bifurcations in noisy and/or high-dimensional physical systems is an important problem in nonlinear dynamics. Near bifurcations, the dynamics of even a high-dimensional system is typically dominated by its behavior on a low-dimensional manifold. Since the system is sensitive to perturbations near bifurcations, they can be detected by looking at the apparent deterministic structure generated by the interaction between the noise and the low-dimensional dynamics. We use minimal hidden Markov models built from the noisy time series to quantify this deterministic structure at the period-doubling bifurcations in the two-well forced Duffing oscillator perturbed by noise. The apparent randomness in the system is characterized using the entropy rate of the discrete stochastic process generated by partitioning time series data. We show that as the bifurcation parameter is varied, sharp changes in the statistical complexity and the entropy rate can be used to locate incipient bifurcations.
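The entropy rate of a partitioned time series can be estimated in several ways; a crude but self-contained sketch uses empirical block entropies, h ≈ H(L) − H(L−1), on a symbol sequence obtained from a threshold partition. This is an illustrative estimator, not the paper's minimal hidden-Markov-model construction, and the binary threshold is an arbitrary choice:

```python
import numpy as np
from collections import Counter

def entropy_rate(series, threshold=0.0, block=3):
    """Crude entropy-rate estimate of a time series in bits per symbol.

    Partitions the series into binary symbols by thresholding, then
    approximates the entropy rate as the difference of empirical block
    entropies H(block) - H(block - 1).
    """
    symbols = (np.asarray(series) > threshold).astype(int)

    def block_entropy(L):
        counts = Counter(tuple(symbols[i:i + L])
                         for i in range(len(symbols) - L + 1))
        p = np.array(list(counts.values()), float)
        p /= p.sum()
        return -(p * np.log2(p)).sum()

    return block_entropy(block) - block_entropy(block - 1)
```

A fully random symbol sequence gives roughly one bit per symbol, while a constant (fully deterministic) sequence gives zero; sharp changes in such estimates as a parameter varies are the signature used above to flag incipient bifurcations.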


Author(s):  
BAO WANG ◽  
STAN J. OSHER

Improving the accuracy and robustness of deep neural nets (DNNs) and adapting them to small training data are primary tasks in deep learning (DL) research. In this paper, we replace the output activation function of DNNs, typically the data-agnostic softmax function, with a graph Laplacian-based high-dimensional interpolating function which, in the continuum limit, converges to the solution of a Laplace–Beltrami equation on a high-dimensional manifold. Furthermore, we propose end-to-end training and testing algorithms for this new architecture. The proposed DNN with graph interpolating activation integrates the advantages of both deep learning and manifold learning. Compared to conventional DNNs with the softmax output activation, the new framework demonstrates the following major advantages: First, it is better suited to data-efficient learning, in which we train high-capacity DNNs without a large amount of training data. Second, it remarkably improves both natural accuracy on clean images and robust accuracy on adversarial images crafted by both white-box and black-box adversarial attacks. Third, it is a natural choice for semi-supervised learning. This paper is a significant extension of our earlier work published in NeurIPS, 2018. For reproducibility, the code is available at https://github.com/BaoWangMath/DNN-DataDependentActivation.
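The flavor of a graph-Laplacian interpolating function can be illustrated with a harmonic label-interpolation sketch on raw feature vectors: known labels are held fixed, and labels on the remaining points solve the graph Laplace equation. This is a generic semi-supervised construction in the style of harmonic label propagation, not the paper's end-to-end trained architecture, and the heat-kernel bandwidth `t` is an arbitrary choice:

```python
import numpy as np

def laplacian_interpolate(X, y_labeled, labeled_idx, t=1.0):
    """Harmonic interpolation of labels on a similarity graph.

    X: (n, d) feature vectors for all points.
    y_labeled: (m, k) one-hot labels for the labeled subset.
    labeled_idx: indices of the m labeled points.
    Returns (n, k) soft class scores; argmax gives predictions.
    """
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / t)               # heat-kernel similarity graph
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(1)) - W         # graph Laplacian

    # Split L into labeled/unlabeled blocks and solve L_uu f_u = -L_ul f_l,
    # i.e. the interpolant is harmonic on the unlabeled set.
    unlabeled_idx = np.setdiff1d(np.arange(n), labeled_idx)
    Luu = L[np.ix_(unlabeled_idx, unlabeled_idx)]
    Lul = L[np.ix_(unlabeled_idx, labeled_idx)]
    f_u = np.linalg.solve(Luu, -Lul @ y_labeled)

    F = np.zeros((n, y_labeled.shape[1]))
    F[labeled_idx] = y_labeled
    F[unlabeled_idx] = f_u
    return F
```

Because the interpolant depends on the geometry of all the points, labeled and unlabeled alike, this kind of output layer is data-dependent in a way softmax is not, which is the intuition behind its suitability for semi-supervised and data-efficient settings.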

