A Regression Based Non-Intrusive Method Using Separated Representation for Uncertainty Quantification

Author(s):  
Prashant Rai ◽  
Mathilde Chevreuil ◽  
Anthony Nouy ◽  
Jayant Sen Gupta

This paper aims at handling high-dimensional uncertainty propagation problems by proposing a tensor product approximation method based on regression techniques. The underlying assumption is that the model output functional can be well represented in a separated form, as a sum of elementary tensors in the stochastic tensor product space. The proposed method consists in constructing a tensor basis with a greedy algorithm and then computing an approximation in the generated approximation space using regression with sparse regularization. With appropriate regularization techniques, the regression problems are well posed even with only a few sample evaluations, and they provide accurate approximations of model outputs.
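The sparse-regularization step can be illustrated with a minimal sketch: fit a model output from few samples by solving an l1-regularized least-squares problem over a fixed basis. The one-dimensional polynomial basis, the hand-rolled ISTA solver, and all parameter values below are illustrative assumptions; the paper's greedy construction of a tensor basis is not reproduced here.

```python
import numpy as np

def ista_lasso(Phi, y, lam=1e-3, n_iter=10000):
    """Solve min_w 0.5*||Phi w - y||^2 + lam*||w||_1 by iterative soft thresholding."""
    L = np.linalg.norm(Phi, 2) ** 2               # Lipschitz constant of the gradient
    w = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ w - y)
        z = w - grad / L                           # gradient step
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return w

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=30)                    # only a few sample evaluations
y = 1.0 + 2.0 * x**3                               # "model output" to approximate
Phi = np.vander(x, N=5, increasing=True)           # polynomial basis, degree 4
w = ista_lasso(Phi, y)
# the recovered coefficient vector is sparse: mass on the constant and cubic terms
```

The l1 penalty is what keeps the regression well posed here: with 5 basis functions and 30 samples the plain least-squares fit would already work, but with larger bases and fewer samples the sparsity prior becomes essential.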

2021 ◽  
Vol 5 (2) ◽  
pp. 42
Author(s):  
María A. Navascués ◽  
Ram Mohapatra ◽  
Md. Nasim Akhtar

In this paper, we define fractal bases and fractal frames of L2(I×J), where I and J are real compact intervals, in order to approximate two-dimensional square-integrable maps whose domain is a rectangle, using the identification of L2(I×J) with the tensor product space L2(I)⨂L2(J). First, we recall the procedure of constructing a fractal perturbation of a continuous or integrable function. Then, we define fractal frames and bases of L2(I×J) composed of products of such fractal functions. We also obtain weaker families, such as Bessel, Riesz and Schauder sequences, for the same space. Additionally, we study some properties of the tensor product of the fractal operators associated with the maps corresponding to each variable.


2018 ◽  
Vol 17 (2) ◽  
pp. 235-251 ◽  
Author(s):  
Andreas Meinel ◽  
Sebastián Castaño-Candamil ◽  
Benjamin Blankertz ◽  
Fabien Lotte ◽  
Michael Tangermann

Author(s):  
K. Darshana Abeyrathna ◽  
Ole-Christoffer Granmo ◽  
Xuan Zhang ◽  
Lei Jiao ◽  
Morten Goodwin

Relying simply on bitwise operators, the recently introduced Tsetlin machine (TM) has provided competitive pattern classification accuracy in several benchmarks, including text understanding. In this paper, we introduce the regression Tsetlin machine (RTM), a new class of TMs designed for continuous input and output, targeting nonlinear regression problems. In brief, we convert continuous input into a binary representation based on thresholding, and transform the propositional formula formed by the TM into an aggregated continuous output. Our empirical comparison of the RTM with state-of-the-art regression techniques reveals either superior or on-par performance on five datasets. This article is part of the theme issue ‘Harmonizing energy-autonomous computing and intelligence’.
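The thresholding idea can be sketched in a few lines: encode a continuous value as bits via a bank of thresholds, then map a bit count back to a continuous value. The evenly spaced thresholds and the simple decoding rule below are illustrative assumptions; in the actual RTM the aggregated output comes from clauses learned by the Tsetlin machine, not from the raw encoding.

```python
def binarize(x, thresholds):
    """Thermometer-style encoding: one bit per threshold, 1 if x exceeds it."""
    return [1 if x > t else 0 for t in thresholds]

def aggregate(bits, lo, step):
    """Map the number of active bits back to a continuous value."""
    return lo + sum(bits) * step

# 8 evenly spaced thresholds on [0, 1): resolution of 1/8
thresholds = [i / 8 for i in range(8)]
bits = binarize(0.4, thresholds)    # -> [1, 1, 1, 1, 0, 0, 0, 0]
value = aggregate(bits, 0.0, 1 / 8) # -> 0.5, the quantized reconstruction
```

Note that the round trip quantizes: 0.4 comes back as 0.5, so the number of thresholds controls the output resolution.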


2021 ◽  
Vol 19 (2) ◽  
pp. 75-82
Author(s):  
Niranjan Bora

Linear multiparameter eigenvalue problems (LMEPs), formulated via determinantal operators on the tensor product space, were introduced mainly through the work of Atkinson. The area of multiparameter eigenvalue problems has since received continued attention from mathematicians, who have pointed out that a variety of mixed eigenvalue problems with several parameters arise in different scientific domains. This article aims to bring to light a variety of scientific problems that appear naturally as LMEPs. Of course, the list of applications presented here is far from complete, and there are bound to be many more applications of which we are currently unaware. The paper may serve as a review of applications of multiparameter eigenvalue problems in different scientific domains and of possible future applications in both theoretical and applied disciplines.


NeuroImage ◽  
2015 ◽  
Vol 104 ◽  
pp. 163-176 ◽  
Author(s):  
Holger Mohr ◽  
Uta Wolfensteller ◽  
Steffi Frimmel ◽  
Hannes Ruge

2018 ◽  
Vol 38 (1) ◽  
pp. 197 ◽  
Author(s):  
Dipankar Das ◽  
Nilakshi Goswami ◽  
Vishnu Narayan Mishra

For two real Banach algebras $\mathbb{A}_1$ and $\mathbb{A}_2$, let $K_p$ be the projective cone in $\mathbb{A}_1\otimes_\gamma \mathbb{A}_2$. Using this, we define a cone norm on the algebraic tensor product of two vector spaces over the Banach algebra $\mathbb{A}_1\otimes_\gamma \mathbb{A}_2$ and discuss some of its properties. We derive some fixed point theorems in this projective cone normed tensor product space over a Banach algebra, with a suitable example. For two self-mappings $S$ and $T$ on a cone Banach space over a Banach algebra, the stability of the iteration scheme $x_{2n+1}=Sx_{2n}$, $x_{2n+2}=Tx_{2n+1},\;n=0,1,2,\dots$ converging to the common fixed point of $S$ and $T$ is also discussed.
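The alternating iteration scheme can be sketched numerically. The two contractions below are illustrative stand-ins on the real line, chosen so that S and T share the fixed point 2; the paper's setting is a projective cone normed tensor product space over a Banach algebra, not the reals.

```python
def S(x):
    return x / 2 + 1        # contraction with fixed point x = 2

def T(x):
    return x / 3 + 4 / 3    # contraction with fixed point x = 2

x = 0.0                     # arbitrary starting point x_0
for n in range(50):
    x = S(x)                # x_{2n+1} = S(x_{2n})
    x = T(x)                # x_{2n+2} = T(x_{2n+1})
# x converges to 2, the common fixed point of S and T
```

Because both maps are contractions with the same fixed point, the alternating sequence converges from any starting point; the stability question studied in the paper concerns how such convergence behaves under perturbation of the iterates.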


1999 ◽  
Vol 14 (4) ◽  
pp. 319-340 ◽  
Author(s):  
İLHAN UYSAL ◽  
H. ALTAY GÜVENIR

Predicting or learning numeric features is called regression in the statistical literature, and it is a subject of research in both machine learning and statistics. This paper reviews the important regression techniques and algorithms developed by both communities. Regression is important for many applications, since many real-life problems can be modeled as regression problems. The review includes Locally Weighted Regression (LWR), rule-based regression, Projection Pursuit Regression (PPR), instance-based regression, Multivariate Adaptive Regression Splines (MARS), and recursive partitioning regression methods that induce regression trees (CART, RETIS and M5).
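Of the reviewed techniques, Locally Weighted Regression is the easiest to sketch: at each query point, fit a linear model by least squares with sample weights that decay with distance from the query. The Gaussian kernel and bandwidth below are illustrative choices, not tied to any specific algorithm from the survey.

```python
import numpy as np

def lwr_predict(xq, X, y, tau=0.5):
    """Predict at query xq with a locally weighted linear fit (1-D LWR)."""
    w = np.exp(-((X - xq) ** 2) / (2 * tau ** 2))     # Gaussian kernel weights
    A = np.column_stack([np.ones_like(X), X])          # design matrix [1, x]
    W = np.diag(w)
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)   # weighted least squares
    return beta[0] + beta[1] * xq

X = np.linspace(0.0, 4.0, 21)
y = 2.0 * X + 1.0                                      # exactly linear data
pred = lwr_predict(2.0, X, y)                          # -> 5.0
```

On exactly linear data the local fit recovers the global line, so the prediction at x = 2 is 5; on nonlinear data the kernel weights let each local fit track the curvature near its query point.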

