Joint estimation of low-rank components and connectivity graph in high-dimensional graph signals: Application to brain imaging

2021 ◽  
Vol 182 ◽  
pp. 107931
Author(s):  
Rui Liu ◽  
Ngai-Man Cheung
Mathematics ◽  
2021 ◽  
Vol 9 (12) ◽  
pp. 1330
Author(s):  
Raeyong Kim

The conjugacy problem for a group G is one of the fundamental algorithmic decision problems: deciding whether or not two given elements of G are conjugate to each other. In this paper, we analyze the graph-of-groups structure of the fundamental group of a high-dimensional graph manifold and study the conjugacy problem. We also provide a new proof of the solvability of the word problem.
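For reference, the underlying decision problem can be stated in one line; this is the standard formulation, not anything specific to this paper:

```latex
% Conjugacy problem for a group G: given g, h in G, decide whether
\[
  \exists\, x \in G : \quad x\, g\, x^{-1} = h .
\]
% The word problem is the special case h = e: since x g x^{-1} = e forces
% g = e, deciding conjugacy to the identity is deciding whether g = e in G.
```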


2017 ◽  
Vol 59 (3) ◽  
pp. 289-310 ◽  
Author(s):  
Yong He ◽  
Xinsheng Zhang ◽  
Jiadong Ji ◽  
Bin Liu

2019 ◽  
Vol 19 (1) ◽  
pp. 39-53 ◽  
Author(s):  
Martin Eigel ◽  
Johannes Neumann ◽  
Reinhold Schneider ◽  
Sebastian Wolf

Abstract: This paper examines a completely non-intrusive, sample-based method for the computation of functional low-rank solutions of high-dimensional parametric random PDEs, an area of intensive research in Uncertainty Quantification (UQ). To obtain a generalized polynomial chaos representation of the approximate stochastic solution, a novel black-box, rank-adapted tensor reconstruction procedure is proposed. The performance of the described approach is illustrated with several numerical examples and compared to (Quasi-)Monte Carlo sampling.
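As a rough illustration of the non-intrusive, sample-based idea, the sketch below fits generalized polynomial chaos coefficients to black-box samples by least squares. The model u(y), the sample count, and the polynomial degree are illustrative assumptions; the paper's rank-adaptive tensor reconstruction for many parameters is not reproduced here.

```python
# Minimal sketch of a non-intrusive, sample-based polynomial chaos fit.
# The parametric model u(y) is a hypothetical stand-in for a PDE quantity
# of interest; the least-squares fit illustrates the black-box idea only,
# not the paper's rank-adaptive tensor reconstruction.
import numpy as np
from numpy.polynomial.legendre import legvander

rng = np.random.default_rng(0)

def u(y):
    # Hypothetical scalar quantity of interest for a parameter y in [-1, 1].
    return np.exp(0.5 * y) / (2.0 + y)

# Draw samples of the uniform parameter and evaluate the black-box model.
n_samples, degree = 200, 8
ys = rng.uniform(-1.0, 1.0, n_samples)
us = u(ys)

# Least-squares fit of Legendre (gPC) coefficients from the samples.
V = legvander(ys, degree)                 # design matrix of Legendre polynomials
coeffs, *_ = np.linalg.lstsq(V, us, rcond=None)

# Evaluate the surrogate at a test point and compare to the true model.
y_test = 0.3
u_hat = legvander(np.array([y_test]), degree) @ coeffs
print(u_hat[0], u(y_test))
```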


2016 ◽  
Vol 6 (2) ◽  
pp. 109-130 ◽  
Author(s):  
Siu-Long Lei ◽  
Xu Chen ◽  
Xinhe Zhang

Abstract: High-dimensional two-sided space-fractional diffusion equations with variable diffusion coefficients are discussed. The problems can be solved by an implicit finite difference scheme that is proven to be uniquely solvable, unconditionally stable, and first-order convergent in the infinity norm. A nonsingular multilevel circulant preconditioner is proposed to accelerate the convergence of the Krylov subspace linear system solver efficiently. Under certain conditions, the preconditioned matrix is a sum of the identity matrix, a matrix of small norm, and a matrix of low rank, which accounts for the fast convergence. Moreover, the preconditioner is practical, with an O(N log N) operation cost and O(N) memory requirement. Illustrative numerical examples are also presented.
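For intuition about the O(N log N) cost, the sketch below applies a single-level circulant preconditioner via FFTs, using the fact that circulant matrices are diagonalized by the discrete Fourier transform. The stencil values are hypothetical, and the paper's multilevel construction is not reproduced.

```python
# Minimal sketch of applying a circulant preconditioner with FFTs.
# A circulant matrix is diagonalized by the DFT, so C^{-1} v costs
# O(N log N); the first column `c` below is an illustrative stand-in,
# not the multilevel preconditioner of the paper.
import numpy as np

def circulant_solve(c, v):
    """Solve C z = v, where C is circulant with first column c."""
    eigs = np.fft.fft(c)                  # eigenvalues of C
    return np.real(np.fft.ifft(np.fft.fft(v) / eigs))

# Example: a symmetric positive definite circulant from a simple stencil.
N = 8
c = np.zeros(N)
c[0], c[1], c[-1] = 2.5, -1.0, -1.0       # hypothetical stencil values
v = np.arange(1.0, N + 1.0)

z = circulant_solve(c, v)

# Verify against an explicitly assembled circulant matrix.
C = np.array([[c[(i - j) % N] for j in range(N)] for i in range(N)])
print(np.allclose(C @ z, v))              # True
```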


2021 ◽  
Author(s):  
Ziwei Zhu ◽  
Xudong Li ◽  
Mengdi Wang ◽  
Anru Zhang

Taming high-dimensional Markov models

In "Learning Markov models via low-rank optimization", Z. Zhu, X. Li, M. Wang, and A. Zhang focus on learning a high-dimensional Markov model with low-dimensional latent structure from a single trajectory of states. To overcome the curse of dimensionality, the authors propose to equip standard maximum-likelihood estimation (MLE) with either nuclear norm regularization or a rank constraint. They show that both approaches can estimate the full transition matrix accurately from a trajectory whose length is merely proportional to the number of states. To solve the rank-constrained MLE, which is a nonconvex problem, the authors develop a new DC (difference-of-convex) programming algorithm. Finally, they apply the proposed methods to analyze taxi trips in Manhattan and partition the island according to the destination preferences of customers; this partition can help balance the supply and demand of taxi service and optimize the allocation of traffic resources.
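A minimal sketch of the single-trajectory setting appears below: it tabulates the empirical (MLE) transition matrix, then projects it to low rank via truncated SVD with clipping and renormalization. This is a simple stand-in for the paper's rank-constrained MLE and DC programming algorithm, with all problem sizes chosen purely for illustration.

```python
# Minimal sketch of low-rank estimation of a Markov transition matrix
# from a single trajectory. Truncated SVD plus renormalization serves
# as a simple surrogate for the rank-constrained MLE of the paper.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground truth: a rank-2 transition matrix over 6 states.
n, r = 6, 2
P_true = rng.random((n, r)) @ rng.random((r, n))
P_true /= P_true.sum(axis=1, keepdims=True)    # make rows stochastic

# Simulate one trajectory and count transitions (the tabulated MLE).
T, state = 5000, 0
counts = np.zeros((n, n))
for _ in range(T):
    nxt = rng.choice(n, p=P_true[state])
    counts[state, nxt] += 1
    state = nxt
P_mle = (counts + 1e-9) / (counts + 1e-9).sum(axis=1, keepdims=True)

# Project the MLE onto rank r, then clip and renormalize the rows.
Us, s, Vt = np.linalg.svd(P_mle)
P_lr = Us[:, :r] * s[:r] @ Vt[:r]
P_lr = np.clip(P_lr, 1e-12, None)
P_lr /= P_lr.sum(axis=1, keepdims=True)

# Compare estimation errors of the low-rank and plain MLE estimates.
print(np.linalg.norm(P_lr - P_true), np.linalg.norm(P_mle - P_true))
```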

