First-Order Algorithms for Optimization Problems with a Maximum Eigenvalue/Singular Value Cost and/or Constraints

Author(s):  
Elijah Polak
2021 ◽  
Vol 2 (Original research articles) ◽


Author(s):  
Lisa C. Hegerhorst-Schultchen ◽  
Christian Kirches ◽  
Marc C. Steinbach

This work continues an ongoing effort to compare non-smooth optimization problems in abs-normal form with Mathematical Programs with Complementarity Constraints (MPCCs). We study general nonlinear programs with equality and inequality constraints in abs-normal form, so-called abs-normal NLPs, and their relation to equivalent MPCC reformulations. We introduce the concepts of Abadie's and Guignard's kink qualification and prove relations to MPCC-ACQ and MPCC-GCQ for the counterpart MPCC formulations. Due to the non-uniqueness of a specific slack reformulation suggested in [10], these relations are non-trivial. It turns out that constraint qualifications of Abadie type are preserved. We also prove the weaker result that equivalence of Guignard's (and Abadie's) constraint qualifications holds for all branch problems, while the question of GCQ preservation remains open. Finally, we introduce M-stationarity and B-stationarity concepts for abs-normal NLPs and prove first-order optimality conditions corresponding to the MPCC counterpart formulations.
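
As a point of reference, the abs-normal structure and the slack-based MPCC counterpart it is compared against can be sketched as follows. The notation (c_z, the slacks u and v) follows the general pattern of the abs-normal literature and is illustrative; it is not a reproduction of the exact formulation in [10].

```latex
% Abs-normal form: all non-smoothness enters through |z|, where the
% switching variable z is defined implicitly and the dependence of
% c_z on |z| is strictly lower triangular (so z is well defined).
\begin{align*}
  \min_{x}\quad & f(x, |z|)
  \qquad \text{s.t.} \quad z = c_z(x, |z|), \quad
  g(x, |z|) \le 0, \quad h(x, |z|) = 0.
\end{align*}
% Slack reformulation as an MPCC: write z = u - v with complementary
% non-negative slacks, so that |z| = u + v.
\begin{align*}
  z = u - v, \qquad |z| = u + v, \qquad
  u \ge 0, \quad v \ge 0, \quad u^{\top} v = 0.
\end{align*}
```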


2013 ◽  
Vol 2013 ◽  
pp. 1-10
Author(s):  
Wenlong Xu ◽  
Xiaofang Liu ◽  
Xia Li

Parallel imaging is a rapid magnetic resonance imaging technique. Because the reconstruction problem is ill-conditioned, noise and aliasing artifacts are amplified during reconstruction and become especially severe at high acceleration factors. In this paper, a sparsity-constrained reconstruction problem is formulated for parallel imaging, and an effective solution based on the variable splitting method is devised. The first-order-norm and second-order-norm terms are first split, and the constrained problem is then converted into an unconstrained minimization problem by the augmented Lagrangian method. Finally, the first-order-norm and second-order-norm subproblems are solved alternately by different methods. Using the discrepancy principle as the stopping criterion, reconstructions of simulated and real parallel magnetic resonance data are presented and discussed. Compared with routine parallel imaging reconstruction methods, the results show that noise and aliasing artifacts in the reconstructed image are evidently reduced at large acceleration factors.
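
The abstract does not pin down the exact splitting, so the sketch below shows the generic pattern it describes: separate the quadratic (second-order norm) data-fidelity term from the l1 (first-order norm) sparsity term, couple them through an augmented Lagrangian, and alternate the two subproblem solves (an ADMM-style iteration). A real-valued encoding matrix `E` stands in for the coil-sensitivity-weighted Fourier operator, which in practice is complex-valued; this is a minimal sketch, not the paper's exact algorithm.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of the l1 norm: the first-order-norm subproblem
    # reduces to this componentwise shrinkage.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def admm_sparse_recon(E, y, lam=0.05, rho=1.0, n_iter=100):
    # Variable splitting for  min_x 0.5*||E x - y||^2 + lam*||x||_1 :
    # introduce z = x, form the augmented Lagrangian, and alternate
    # between the quadratic (second-order norm) solve in x and the
    # l1 (first-order norm) shrinkage in z.
    n = E.shape[1]
    z = np.zeros(n)
    u = np.zeros(n)                  # scaled Lagrange multiplier
    A = E.T @ E + rho * np.eye(n)
    Ety = E.T @ y
    for _ in range(n_iter):
        x = np.linalg.solve(A, Ety + rho * (z - u))  # quadratic subproblem
        z = soft_threshold(x + u, lam / rho)         # l1 subproblem
        u += x - z                                   # multiplier update
    return x
```

In practice the loop would terminate via the discrepancy principle mentioned in the abstract rather than after a fixed iteration count.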


2018 ◽  
Vol 10 (9) ◽  
pp. 168781401879333 ◽  
Author(s):  
Zhiliang Huang ◽  
Tongguang Yang ◽  
Fangyi Li

Conventional decoupling approaches usually employ the first-order reliability method (FORM) to handle probabilistic constraints in reliability-based design optimization (RBDO) problems. In FORM, the constraint functions are transformed into a standard normal space. The extra non-linearity introduced by this non-normal-to-normal transformation can increase the error of the reliability analysis and thus degrade the accuracy of the RBDO solution. In this article, a decoupling approach is proposed as an alternative tool for RBDO problems. To improve accuracy, the reliability analysis is performed by a first-order asymptotic integration method without any additional non-linear transformation. To achieve high efficiency, an approximate reliability-analysis technique is given that avoids repeated evaluations of the time-consuming performance function. Two numerical examples and a practical laptop structural design application are presented to validate the effectiveness of the proposed approach.
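
For contrast with the proposed first-order asymptotic integration, the conventional FORM baseline the article argues against can be sketched as the standard Hasofer–Lind/Rackwitz–Fiessler iteration in standard normal space. The limit-state function `g` and its gradient below are illustrative placeholders, not the article's examples.

```python
import numpy as np
from scipy.stats import norm

def hlrf_form(g, grad_g, u0, max_iter=100, tol=1e-10):
    # Hasofer-Lind/Rackwitz-Fiessler iteration: the conventional FORM
    # step in standard normal space. Returns the reliability index
    # beta = ||u*|| at the most probable failure point u*, and the
    # first-order failure-probability estimate Phi(-beta).
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        grad = grad_g(u)
        u_new = ((grad @ u - g(u)) / (grad @ grad)) * grad
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    beta = np.linalg.norm(u)
    return beta, norm.cdf(-beta)

# Linear limit state g(u) = 3 - u1 - u2: FORM is exact here,
# and the iteration returns beta = 3 / sqrt(2).
beta, pf = hlrf_form(lambda u: 3 - u[0] - u[1],
                     lambda u: np.array([-1.0, -1.0]),
                     u0=np.array([0.1, 0.1]))
```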


Author(s):  
Daniel Bartl ◽  
Samuel Drapeau ◽  
Jan Obłój ◽  
Johannes Wiesel

We consider the sensitivity of a generic stochastic optimization problem to model uncertainty. We take a non-parametric approach and capture model uncertainty using Wasserstein balls around the postulated model. We provide explicit formulae for the first-order correction to both the value function and the optimizer, and we further extend our results to optimization under linear constraints. We present applications to statistics, machine learning, mathematical finance, and uncertainty quantification. In particular, we provide an explicit first-order approximation for square-root LASSO regression coefficients and deduce coefficient shrinkage compared with ordinary least-squares regression. We consider the robustness of call option pricing and deduce a new Black–Scholes sensitivity, a non-parametric version of the so-called Vega. We also compute sensitivities of optimized certainty equivalents in finance and propose measures to quantify the robustness of neural networks to adversarial examples.
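
In its simplest unconstrained form, a first-order correction of this kind admits a direct Monte Carlo sketch: for a p-Wasserstein ball of radius delta, the robust value behaves to first order like V(delta) ≈ V(0) + delta · Υ, where Υ is the L^q norm (1/p + 1/q = 1) of the gradient of the loss in the sample variable, evaluated at the baseline optimizer. The names below (`grad_x_f`, the implicit optimizer a*) are illustrative assumptions, and this sketch simplifies the paper's general statement.

```python
import numpy as np

def first_order_sensitivity(grad_x_f, samples, p=2.0):
    # Monte Carlo estimate of Upsilon = (E_P[ ||grad_x f(a*, X)||^q ])^(1/q)
    # with 1/p + 1/q = 1, for a p-Wasserstein ball around the baseline
    # model P. `grad_x_f` maps a sample x to the gradient of the loss in
    # the sample variable at the baseline optimizer a* (hypothetical names).
    q = p / (p - 1.0)
    norms = np.array([np.linalg.norm(grad_x_f(x)) for x in samples])
    return np.mean(norms ** q) ** (1.0 / q)
```

For p = 2 this is simply the root-mean-square gradient norm under the baseline model.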


Author(s):  
Yi Xu ◽  
Zhuoning Yuan ◽  
Sen Yang ◽  
Rong Jin ◽  
Tianbao Yang

Extrapolation is a well-known technique for solving convex optimization problems and variational inequalities, and it has recently attracted attention for non-convex optimization. Several recent works have empirically shown its success in some machine learning tasks. However, it has not been analyzed for non-convex minimization, and a gap remains between theory and practice. In this paper, we analyze gradient descent and stochastic gradient descent with extrapolation for finding an approximate first-order stationary point of smooth non-convex optimization problems. Our convergence upper bounds show that the algorithms with extrapolation converge faster than their counterparts without extrapolation.
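
The extrapolation scheme in question evaluates the gradient at a point shifted along the previous displacement rather than at the current iterate. A minimal deterministic sketch, with step size `eta` and extrapolation parameter `beta` as illustrative constants (the paper's exact parameter choices and the stochastic variant are not reproduced):

```python
import numpy as np

def gd_with_extrapolation(grad, x0, eta=0.1, beta=0.5, n_iter=1000):
    # Gradient descent with extrapolation: the gradient is evaluated
    # at the extrapolated point z_t = x_t + beta * (x_t - x_{t-1})
    # instead of at x_t itself.
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(n_iter):
        z = x + beta * (x - x_prev)        # extrapolation step
        x_prev, x = x, x - eta * grad(z)   # gradient step taken at z
    return x

# Usage on a smooth test problem: minimize f(x) = 0.5 * ||x||^2.
x_star = gd_with_extrapolation(lambda x: x, x0=np.ones(5))
```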

