Low-Rank Tensor Decompositions for Nonlinear System Identification: A Tutorial with Examples
2022 ◽ Vol 42 (1) ◽ pp. 54-74
Author(s): Kim Batselier
Nonlinear System Identification via Tensor Completion
2020 ◽ Vol 34 (04) ◽ pp. 4420-4427
Author(s): Nikos Kargas, Nicholas D. Sidiropoulos

Function approximation from input-output data pairs is a fundamental problem in supervised learning. Deep neural networks are currently the most popular method for learning to mimic the input-output relationship of a general nonlinear system, as they have proven very effective at approximating complex, highly nonlinear functions. In this work, we show that identifying a general nonlinear function y = f(x1,…,xN) from input-output examples can be formulated as a tensor completion problem, and that, under certain conditions, provably correct nonlinear system identification is possible. Specifically, we model the interactions between the N input variables and the scalar output of a system with a single N-way tensor, and set up a weighted low-rank tensor completion problem with smoothness regularization, which we tackle using a block coordinate descent algorithm. We extend our method to the multi-output setting and to the case of partially observed data, which cannot be readily handled by neural networks. Finally, we demonstrate the effectiveness of the approach on several regression tasks, including standard benchmarks and a challenging student grade prediction task.
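To make the general recipe in the abstract concrete, below is a minimal sketch, not the authors' implementation, of rank-R CP tensor completion fitted by block coordinate descent over the factor matrices. It assumes the N inputs have already been discretized to integer indices in [0, I_n), and it substitutes plain ridge regularization for the paper's weighted smoothness regularizer; the function names (`fit_cp_completion`, `predict_cp`) and the toy data are illustrative only.

```python
# Sketch: low-rank (CP) tensor completion for nonlinear regression,
# fitted by block coordinate descent. Assumes pre-discretized inputs
# and ridge regularization (a simplification of the paper's setup).
import numpy as np

def fit_cp_completion(idx, y, dims, rank=5, n_iters=50, lam=1e-3, seed=0):
    """idx: (S, N) integer multi-indices; y: (S,) outputs;
    dims: mode sizes (I_1, ..., I_N). Returns the CP factor matrices."""
    rng = np.random.default_rng(seed)
    N, S = len(dims), len(y)
    A = [rng.standard_normal((I, rank)) * 0.1 for I in dims]

    for _ in range(n_iters):
        for n in range(N):  # one "block" = one factor matrix
            # Prediction for sample s is <A[n][i_n, :], W[s, :]>, where
            # W[s, :] is the elementwise product of the other factors' rows.
            W = np.ones((S, rank))
            for m in range(N):
                if m != n:
                    W *= A[m][idx[:, m], :]
            # Each row of A[n] solves a small ridge least-squares problem
            # over the observed samples that hit that index.
            for i in range(dims[n]):
                mask = idx[:, n] == i
                if not mask.any():
                    continue
                Wi, yi = W[mask], y[mask]
                G = Wi.T @ Wi + lam * np.eye(rank)
                A[n][i] = np.linalg.solve(G, Wi.T @ yi)
    return A

def predict_cp(A, idx):
    W = np.ones((idx.shape[0], A[0].shape[1]))
    for n, An in enumerate(A):
        W *= An[idx[:, n], :]
    return W.sum(axis=1)

if __name__ == "__main__":
    # Toy usage: learn y = sin(x1) + x2**2 on a 20x20 grid from 500 samples.
    dims = (20, 20)
    grid = [np.linspace(-1, 1, I) for I in dims]
    rng = np.random.default_rng(1)
    idx = np.stack([rng.integers(0, I, 500) for I in dims], axis=1)
    y = np.sin(grid[0][idx[:, 0]]) + grid[1][idx[:, 1]] ** 2
    A = fit_cp_completion(idx, y, dims, rank=4)
    print("train RMSE:", np.sqrt(np.mean((predict_cp(A, idx) - y) ** 2)))
```

The block coordinate structure mirrors the abstract: holding all but one factor matrix fixed turns the low-rank completion objective into independent linear least-squares subproblems, one per row of the free factor, which is what makes the alternating updates cheap.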

