A Neurodynamic Optimization Approach for L1 Minimization with Application to Compressed Image Reconstruction

2021 · Vol 30 (01) · pp. 2140007
Author(s): Chengchen Dai, Hangjun Che, Man-Fai Leung

This paper presents a neurodynamic optimization approach for l1 minimization based on an augmented Lagrangian function. By using the threshold function of the locally competitive algorithm (LCA), the subgradient at a nondifferentiable point is equivalently replaced with the difference between the neuronal state and its mapping. The efficacy of the proposed approach is substantiated by reconstructing three compressed images.
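As an illustration of the mechanism mentioned in the abstract (not the authors' implementation), the Python sketch below runs LCA-style dynamics for a generic l1-regularized reconstruction problem. The key point is that, for the soft-threshold mapping T_lam, the quantity u - T_lam(u) lies in lam times the subgradient of ||T_lam(u)||_1, so no explicit subgradient is ever evaluated at a nondifferentiable point. The function names, the measurement matrix A, and the parameters lam and dt are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's code) of a locally competitive
# algorithm (LCA) for l1 reconstruction: min 0.5*||A x - y||_2^2 + lam*||x||_1.
import numpy as np

def soft_threshold(u, lam):
    """LCA activation: elementwise soft-thresholding T_lam(u)."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca_l1(A, y, lam=0.1, dt=0.05, steps=2000):
    """Forward-Euler integration of the LCA ODE
       du/dt = -u + T_lam(u) + A^T (y - A T_lam(u));
       the subgradient of the l1 term is supplied implicitly by u - T_lam(u)."""
    u = np.zeros(A.shape[1])             # internal neuronal states
    for _ in range(steps):
        x = soft_threshold(u, lam)       # neuronal outputs (sparse code)
        du = -u + x + A.T @ (y - A @ x)  # leak + feedback + data-fidelity drive
        u = u + dt * du
    return soft_threshold(u, lam)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, k = 200, 80, 5                 # signal length, measurements, sparsity
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    y = A @ x_true
    x_hat = lca_l1(A, y, lam=0.01)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```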

2010 · Vol 121-122 · pp. 123-127
Author(s): Wen Ling Zhao, Jing Zhang, Jin Chuan Zhou

In connection with Problem (P), which has both equality and inequality constraints, we introduce a new augmented Lagrangian function. We establish the existence of a local saddle point under a weaker second-order sufficient condition and discuss the relationship between a local optimal solution of the primal problem and a local saddle point of the augmented Lagrangian function.
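The abstract does not give the new function's explicit form. For reference only, a classical Rockafellar-type augmented Lagrangian for a Problem (P) of the form min f(x) subject to h_i(x) = 0 and g_j(x) <= 0, together with the local saddle-point condition studied in such results, reads as follows (symbols f, h_i, g_j, r are assumptions, not taken from the paper):

```latex
% Classical Rockafellar-type augmented Lagrangian (reference form only):
\[
  \mathcal{L}_r(x,\lambda,\mu)
  = f(x) + \sum_{i=1}^{m}\lambda_i h_i(x) + \frac{r}{2}\sum_{i=1}^{m} h_i(x)^2
  + \frac{1}{2r}\sum_{j=1}^{p}\Bigl(\max\{0,\;\mu_j + r\,g_j(x)\}^2 - \mu_j^2\Bigr).
\]
% Local saddle point at (x^*, \lambda^*, \mu^*), \mu^* \ge 0:
\[
  \mathcal{L}_r(x^\ast,\lambda,\mu)
  \;\le\; \mathcal{L}_r(x^\ast,\lambda^\ast,\mu^\ast)
  \;\le\; \mathcal{L}_r(x,\lambda^\ast,\mu^\ast)
  \quad\text{for all } (x,\lambda,\mu) \text{ in a neighborhood of } (x^\ast,\lambda^\ast,\mu^\ast).
\]
```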


2011 · Vol 467-469 · pp. 877-881
Author(s): Ai Ping Jiang, Feng Wen Huang

In this paper, two modifications are proposed for solving the nonlinear programming problem (NLP), based on Fletcher and Leyffer's filter method, which differs from traditional merit functions with penalty terms. First, we replace one component of the filter pairs, the constraint-violation function, with an NCP function, thereby avoiding the difficulty of selecting penalty parameters; we also prove that the modified algorithm is globally and superlinearly convergent under certain conditions. Second, we convert the objective function into an augmented Lagrangian function to avoid the incompatibility that may arise in the sub-problems.
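The abstract does not specify which NCP function is used or how the filter test is stated. As a hedged sketch under those assumptions, the Python fragment below shows one common choice, the Fischer-Burmeister function, standing in for the usual constraint-violation measure in a filter pair; all names (fischer_burmeister, ncp_violation, filter_acceptable) and the margin gamma are illustrative.

```python
# Sketch only: an NCP-based infeasibility measure plugged into a standard
# filter acceptance test. Not the paper's exact construction.
import numpy as np

def fischer_burmeister(a, b):
    """Elementwise NCP function: phi(a,b) = 0  iff  a >= 0, b >= 0, a*b = 0."""
    return np.sqrt(a**2 + b**2) - a - b

def ncp_violation(g_vals, mu):
    """NCP-based measure for constraints g(x) <= 0 with multipliers mu >= 0;
    it vanishes exactly where feasibility and complementarity both hold."""
    return np.linalg.norm(fischer_burmeister(-g_vals, mu))

def filter_acceptable(candidate, filter_pairs, gamma=1e-5):
    """Standard filter test: the pair (h, f) is acceptable if no stored pair
    dominates it, up to a small margin gamma."""
    h_c, f_c = candidate
    return all(h_c <= (1 - gamma) * h_k or f_c <= f_k - gamma * h_k
               for (h_k, f_k) in filter_pairs)
```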

