A new feasible descent algorithm combining SQP with generalized projection for optimization problems without strict complementarity

2005 ◽  
Vol 162 (3) ◽  
pp. 1065-1081 ◽  
Author(s):  
Jin-Bao Jian


2020 ◽
Author(s):  
Ren Wang ◽  
Meng Wang ◽  
Jinjun Xiong

Abstract: Higher-order tensors can represent scores in a rating system, frames in a video, and images of the same subject. In practice, measurements are often highly quantized, owing to the sampling strategy or the quality of the devices. Existing work on tensor recovery has focused on data losses and random noise. Only a few works consider tensor recovery from quantized measurements, and those are restricted to binary measurements. This paper, for the first time, addresses the problem of tensor recovery from multi-level quantized measurements by leveraging the low CANDECOMP/PARAFAC (CP) rank property. We study the recovery of both general low-rank tensors and tensors that have a tensor singular value decomposition (TSVD) by solving nonconvex optimization problems. We provide theoretical upper bounds on the recovery error, which diminish to zero as the sizes of the dimensions increase to infinity. We further characterize the fundamental limit of any recovery algorithm and show that our recovery error is nearly order-wise optimal. A tensor-based alternating proximal gradient descent algorithm with a convergence guarantee and a TSVD-based projected gradient descent algorithm are proposed to solve the nonconvex problems. Our recovery methods can also handle data losses and do not necessarily require knowledge of the quantization rule. The methods are validated on synthetic data, image datasets, and music recommender datasets.
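To make the general approach concrete, here is a minimal sketch of recovering a low-CP-rank tensor from multi-level quantized observations by maximum likelihood under an assumed additive Gaussian noise model, using plain alternating gradient steps on the CP factors. This is not the authors' exact algorithm (in particular, it omits the proximal terms of their method); the function names `nll_and_grad` and `fit_quantized_cp`, the Gaussian noise assumption, and all parameter values are hypothetical.

```python
# Sketch: maximum-likelihood recovery of a low-CP-rank tensor from
# multi-level quantized entries (assumptions: Gaussian noise, known
# thresholds; names and parameters are illustrative, not from the paper).
import numpy as np
from scipy.stats import norm

def nll_and_grad(X, Y, taus, sigma):
    """Negative log-likelihood of quantization levels Y (integers 0..L-1)
    given latent tensor X, with thresholds taus (length L-1), plus the
    entrywise gradient d(NLL)/dX."""
    bounds = np.concatenate(([-np.inf], taus, [np.inf]))
    lo = (bounds[Y] - X) / sigma      # standardized lower bin edge
    hi = (bounds[Y + 1] - X) / sigma  # standardized upper bin edge
    p = np.clip(norm.cdf(hi) - norm.cdf(lo), 1e-12, None)
    grad = (norm.pdf(hi) - norm.pdf(lo)) / (sigma * p)
    return -np.log(p).sum(), grad

def fit_quantized_cp(Y, taus, R, sigma=0.1, steps=500, lr=1e-3, seed=0):
    """Alternating gradient descent on CP factors A, B, C of the latent
    tensor X = sum_r a_r (outer) b_r (outer) c_r."""
    rng = np.random.default_rng(seed)
    A, B, C = (0.1 * rng.standard_normal((n, R)) for n in Y.shape)
    for _ in range(steps):
        for mode in range(3):  # update one factor at a time
            X = np.einsum('ir,jr,kr->ijk', A, B, C)
            _, G = nll_and_grad(X, Y, taus, sigma)
            if mode == 0:
                A -= lr * np.einsum('ijk,jr,kr->ir', G, B, C)
            elif mode == 1:
                B -= lr * np.einsum('ijk,ir,kr->jr', G, A, C)
            else:
                C -= lr * np.einsum('ijk,ir,jr->kr', G, A, B)
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Toy usage: quantize a random rank-2 tensor to 4 levels and recover it.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((20, 2)) for _ in range(3))
X_true = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
taus = np.quantile(X_true, [0.25, 0.5, 0.75])  # 3 thresholds, 4 levels
Y = np.searchsorted(taus, X_true.ravel()).reshape(X_true.shape)
X_hat = fit_quantized_cp(Y, taus, R=2)
```

The per-entry gradient follows from differentiating -log(Φ(hi) - Φ(lo)) with respect to the latent entry; the chain rule through the CP factorization then reduces each factor update to a single einsum against the gradient tensor.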


2015 ◽  
Vol 32 (03) ◽  
pp. 1550012 ◽  
Author(s):  
Suxiang He ◽  
Liwei Zhang ◽  
Jie Zhang

It is well known that a linear rate of convergence can be established for the classical augmented Lagrangian method for constrained optimization problems without strict complementarity. Whether this result remains valid for other nonlinear Lagrangian methods (NLM) is an interesting question. This paper proposes a nonlinear Lagrangian function based on the Fischer–Burmeister (F–B) nonlinear complementarity problem (NCP) function for constrained optimization problems. The rate of convergence of this NLM is analyzed under the linear independence constraint qualification and the strong second-order sufficient condition without strict complementarity, when the subproblems are solved exactly and inexactly, respectively. Interestingly, it is demonstrated that the Lagrange multipliers associated with inactive inequality constraints at the local minimum point converge to zero superlinearly. Several illustrative examples are reported to show the behavior of the NLM.
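For context, the Fischer–Burmeister function referred to here is the standard NCP function (this recalls the textbook definition only; the paper's particular Lagrangian built from it is not reproduced):

```latex
\varphi_{\mathrm{FB}}(a,b) \;=\; \sqrt{a^2 + b^2} \;-\; (a + b),
\qquad
\varphi_{\mathrm{FB}}(a,b) = 0 \;\Longleftrightarrow\; a \ge 0,\ b \ge 0,\ ab = 0 .
```

Because φ_FB encodes the complementarity conditions in a single equation and is smooth everywhere except at (0, 0), which is exactly the case excluded by strict complementarity, it is a natural building block for Lagrangian methods that must work without that assumption.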

