On decompositions and approximations of conjugate partial-symmetric tensors

CALCOLO, 2021, Vol. 58 (4)
Author(s):  
Taoran Fu ◽  
Bo Jiang ◽  
Zhening Li

Abstract Hermitian matrices have played an important role in matrix theory and complex quadratic optimization. Their higher-order generalization, conjugate partial-symmetric (CPS) tensors, has attracted growing interest in tensor theory and computation, particularly in application-driven complex polynomial optimization problems. In this paper, we study CPS tensors with a focus on ranks, the computation of rank-one decompositions and approximations, and their applications. We prove constructively that any CPS tensor can be decomposed into a sum of rank-one CPS tensors, which provides an explicit method to compute such rank-one decompositions. Three types of ranks for CPS tensors are defined and shown to be different in general; this implies that the conjugate version of Comon's conjecture does not hold. We then study rank-one approximations and matricizations of CPS tensors. By carefully unfolding CPS tensors into Hermitian matrices, rank-one equivalence can be preserved, which enables us to develop new convex optimization models and algorithms to compute best rank-one approximations of CPS tensors. Numerical experiments on data sets from radar waveform design, elasticity tensors, and quantum entanglement demonstrate the capability of our methods.
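The construction below is a minimal NumPy sketch, not code from the paper, of the rank-one building blocks the abstract refers to: assuming the standard definition of a CPS tensor of order 2m, a rank-one CPS tensor has the form λ · x^{⊗m} ⊗ conj(x)^{⊗m} with λ real, and conjugating while swapping the two index blocks leaves it unchanged. The vector, scalar, and dimensions are arbitrary choices for the example.

```python
import numpy as np

def rank_one_cps(x, lam=1.0, m=2):
    """Build the rank-one CPS tensor lam * x^{(x) m} (x) conj(x)^{(x) m} (lam real)."""
    t = np.array([lam], dtype=complex)
    for _ in range(m):
        t = np.tensordot(t, x, axes=0)           # first m slots carry x
    for _ in range(m):
        t = np.tensordot(t, np.conj(x), axes=0)  # last m slots carry conj(x)
    return t[0]                                  # drop the singleton axis from the scalar seed

def is_cps(t, m=2, tol=1e-12):
    """Check the conjugate partial-symmetric property of an order-2m tensor:
    swapping the first and last m indices and conjugating leaves t unchanged."""
    perm = tuple(range(m, 2 * m)) + tuple(range(m))
    return np.allclose(t, np.conj(np.transpose(t, perm)), atol=tol)

x = np.array([1.0 + 1.0j, 0.5 - 2.0j, 3.0j])
T = rank_one_cps(x, lam=2.5, m=2)    # an order-4 CPS tensor of dimension 3
print(T.shape, is_cps(T, m=2))       # (3, 3, 3, 3) True
```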

Author(s):  
Constanze Liaw ◽  
Sergei Treil ◽  
Alexander Volberg

Abstract The classical Aronszajn–Donoghue theorem states that for a rank-one perturbation of a self-adjoint operator (by a cyclic vector) the singular parts of the spectral measures of the original and perturbed operators are mutually singular. As simple direct-sum-type examples show, this result does not hold for finite-rank perturbations. However, the set of exceptional perturbations is rather small. Namely, for a family of rank $d$ perturbations $A_{\boldsymbol{\alpha }}:= A + {\textbf{B}} {\boldsymbol{\alpha }} {\textbf{B}}^*$, ${\textbf{B}}:{\mathbb C}^d\to{{\mathcal{H}}}$, with ${\operatorname{Ran}}{\textbf{B}}$ cyclic for $A$, parametrized by $d\times d$ Hermitian matrices ${\boldsymbol{\alpha }}$, the singular parts of the spectral measures of $A$ and $A_{\boldsymbol{\alpha }}$ are mutually singular for all ${\boldsymbol{\alpha }}$ except for a small exceptional set $E$. It was shown earlier by the first two authors, see [4], that $E$ is a subset of measure zero of the space $\textbf{H}(d)$ of $d\times d$ Hermitian matrices. In this paper, we show that the set $E$ has small Hausdorff dimension, $\dim E \le \dim \textbf{H}(d)-1 = d^2-1$.
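As a finite-dimensional illustration only (the theorem itself concerns spectral measures of self-adjoint operators on a Hilbert space), the sketch below builds the perturbation family $A_{\boldsymbol{\alpha}} = A + \textbf{B}\boldsymbol{\alpha}\textbf{B}^*$ for a toy Hermitian $A$ and checks that each member is again self-adjoint and differs from $A$ by a matrix of rank at most $d$. All sizes and entries are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 2

# A self-adjoint "operator" on C^n (a random Hermitian matrix) and a map B: C^d -> C^n.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (A + A.conj().T) / 2
B = rng.standard_normal((n, d)) + 1j * rng.standard_normal((n, d))

def perturbed(alpha):
    """Rank-d perturbation A_alpha = A + B alpha B^* from the family in the abstract."""
    return A + B @ alpha @ B.conj().T

alpha = np.array([[1.0, 0.5 - 0.2j],
                  [0.5 + 0.2j, -0.3]])            # a d x d Hermitian parameter
A_alpha = perturbed(alpha)
print(np.allclose(A_alpha, A_alpha.conj().T))      # True: A_alpha is again self-adjoint
print(np.linalg.matrix_rank(A_alpha - A) <= d)     # True: the perturbation has rank at most d
```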


Author(s):  
Mareike Dressler ◽  
Adam Kurpisz ◽  
Timo de Wolff

Abstract Various key problems from theoretical computer science can be expressed as polynomial optimization problems over the boolean hypercube. One particularly successful way to prove complexity bounds for these types of problems is based on sums of squares (SOS) as nonnegativity certificates. In this article, we initiate the study of optimization problems over the boolean hypercube via a recent alternative certificate called sums of nonnegative circuit polynomials (SONC). We show that key results for SOS-based certificates remain valid: first, for polynomials that are nonnegative over the $n$-variate boolean hypercube with constraints of degree $d$, there exists a SONC certificate of degree at most $n+d$. Second, if there exists a degree-$d$ SONC certificate for nonnegativity of a polynomial over the boolean hypercube, then there also exists a short degree-$d$ SONC certificate that includes at most $n^{O(d)}$ nonnegative circuit polynomials. Moreover, we prove that, in contrast to SOS, the SONC cone is not closed under affine transformations of variables, and that for SONC there is no equivalent of Putinar's Positivstellensatz for SOS. We discuss these results from both the algebraic and the optimization perspective.
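For readers unfamiliar with SONC certificates, the sketch below illustrates their basic building block, a nonnegative circuit polynomial, on the classical Motzkin polynomial. The AM-GM-based criterion used here (a circuit polynomial with negative inner coefficient d is nonnegative when d ≥ -Θ_f, where Θ_f is the circuit number) is standard in the SONC literature and is not specific to this article; the function names are ours.

```python
import numpy as np

def circuit_number(vertices, coeffs, inner):
    """Theta_f = prod_j (c_j / lambda_j)^{lambda_j} for a circuit polynomial
    f = sum_j c_j x^{alpha(j)} + d x^{beta}, where lambda are the barycentric
    coordinates of the inner exponent beta w.r.t. the simplex vertices alpha(j)."""
    V = np.array(vertices, dtype=float)
    # Solve [V^T; 1 ... 1] lambda = [beta; 1] for the barycentric coordinates.
    M = np.vstack([V.T, np.ones(len(vertices))])
    rhs = np.append(np.array(inner, dtype=float), 1.0)
    lam, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return float(np.prod((np.array(coeffs) / lam) ** lam))

# Motzkin polynomial x^4 y^2 + x^2 y^4 + 1 - 3 x^2 y^2: its inner coefficient
# d = -3 satisfies d >= -Theta_f = -3, which certifies nonnegativity via AM-GM.
theta = circuit_number(vertices=[(4, 2), (2, 4), (0, 0)],
                       coeffs=[1.0, 1.0, 1.0],
                       inner=(2, 2))
d = -3.0
print(theta, d >= -theta - 1e-12)   # 3.0 True
```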


2012, Vol. 24 (4), pp. 1047-1084
Author(s):  
Xiao-Tong Yuan ◽  
Shuicheng Yan

We investigate Newton-type optimization methods for solving piecewise linear systems (PLSs) with a nondegenerate coefficient matrix. Such systems arise, for example, from the numerical solution of the linear complementarity problem, which is useful for modeling several learning and optimization problems. In this letter, we propose an effective damped Newton method, PLS-DN, to find the exact (up to machine precision) solution of nondegenerate PLSs. PLS-DN exhibits a provable semi-iterative property, that is, the algorithm converges globally to the exact solution in a finite number of iterations. The rate of convergence is shown to be at least linear before termination. We emphasize the applications of our method in modeling, from the novel perspective of PLSs, statistical learning problems such as box-constrained least squares, elitist Lasso (Kowalski & Torrésani, 2008), and support vector machines (Cortes & Vapnik, 1995). Numerical results on synthetic and benchmark data sets demonstrate the effectiveness and efficiency of PLS-DN on these problems.
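The sketch below is a generic damped (semismooth) Newton iteration for one standard PLS reformulation of the linear complementarity problem, F(x) = min(x, Mx + q) = 0. It illustrates the kind of system PLS-DN targets but is not the PLS-DN algorithm itself; the example matrix, damping rule, and tolerances are arbitrary choices.

```python
import numpy as np

def pls_newton(M, q, x0=None, tol=1e-12, max_iter=100):
    """Damped Newton iteration for the piecewise linear system
    F(x) = min(x, M x + q) = 0, a reformulation of the LCP
    (x >= 0, Mx + q >= 0, x'(Mx + q) = 0). Illustrative sketch only."""
    n = len(q)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    I = np.eye(n)
    for _ in range(max_iter):
        w = M @ x + q
        F = np.minimum(x, w)
        f0 = np.linalg.norm(F)
        if f0 <= tol:
            break
        # One element of the generalized Jacobian of F:
        # row i is e_i where x_i <= w_i, and row i of M otherwise.
        J = np.where((x <= w)[:, None], I, M)
        d = np.linalg.solve(J, -F)
        # Damping: backtrack until the residual norm decreases sufficiently.
        t = 1.0
        while t > 1e-8:
            x_trial = x + t * d
            if np.linalg.norm(np.minimum(x_trial, M @ x_trial + q)) <= (1.0 - 1e-4 * t) * f0:
                break
            t *= 0.5
        x = x + t * d
    return x

# Small example with positive definite M, so the underlying LCP has a unique solution.
M = np.array([[4.0, 1.0], [1.0, 3.0]])
q = np.array([-1.0, -2.0])
x = pls_newton(M, q)
print(x, np.minimum(x, M @ x + q))   # residual is (numerically) zero
```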


1991, Vol. 113 (3), pp. 280-285
Author(s):  
T. J. Beltracchi ◽  
G. A. Gabriele

The Recursive Quadratic Programming (RQP) method has become known as one of the most effective and efficient algorithms for solving engineering optimization problems. The RQP method uses variable metric updates to build approximations of the Hessian of the Lagrangian. If the approximation converges to the true Hessian of the Lagrangian, then the RQP method converges quadratically. The choice of variable metric update therefore has a direct effect on the convergence of the Hessian approximation. Most of the research performed with the RQP method uses some modification of the Broyden-Fletcher-Shanno (BFS) variable metric update. This paper describes a hybrid variable metric update that yields good approximations to the Hessian of the Lagrangian. The hybrid update combines the best features of the Symmetric Rank One (SR1) and BFS updates: it is less sensitive to inexact line searches than the BFS update and more stable than the SR1 update. Testing shows that the efficiency of the RQP method is unaffected by the new update, while more accurate Hessian approximations are produced. This should increase the accuracy of the solutions obtained with the RQP method and, more importantly, provide more reliable information for post-optimality analyses, such as parameter sensitivity studies.
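As an illustration of the idea only (the paper's precise switching rule is not reproduced here), the sketch below applies an SR1 correction when its denominator is safely bounded away from zero and otherwise falls back to a rank-two BFS-type correction that requires the curvature condition s'y > 0; both branches satisfy the secant condition B_new s = y. The safeguard threshold and names are our own choices.

```python
import numpy as np

def hybrid_update(B, s, y, r=1e-8):
    """One hybrid variable metric update of a Hessian approximation B,
    with s = x_{k+1} - x_k and y = grad L(x_{k+1}) - grad L(x_k).
    Illustrative sketch; the paper's exact hybrid rule may differ."""
    v = y - B @ s
    denom = v @ s
    if abs(denom) >= r * np.linalg.norm(s) * np.linalg.norm(v):
        # SR1: symmetric rank-one correction.
        return B + np.outer(v, v) / denom
    sy = s @ y
    if sy > 1e-12:
        # BFS-type rank-two correction (requires s'y > 0).
        Bs = B @ s
        return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy
    return B  # skip the update when neither branch is safe

# Toy usage on a quadratic with Hessian H: the update enforces the secant equation.
H = np.array([[3.0, 1.0], [1.0, 2.0]])
B = np.eye(2)
s = np.array([0.4, -0.1])
y = H @ s
B_new = hybrid_update(B, s, y)
print(np.allclose(B_new @ s, y))   # True: secant condition holds after the update
```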

