nonsmooth optimization problem
Recently Published Documents


TOTAL DOCUMENTS: 12 (FIVE YEARS: 4)
H-INDEX: 2 (FIVE YEARS: 0)

2022 ◽ Vol 40 ◽ pp. 1-16
Author(s): Fakhrodin Hashemi, Saeed Ketabchi

Optimal correction of an infeasible system of equations of the form Ax + B|x| = b leads to a non-convex fractional problem. In this paper, a regularization method (ℓp-norm, 0 < p < 1) is presented to solve this fractional problem. With this method, the problem can be formulated as a non-convex, nonsmooth optimization problem that is not Lipschitz. The objective function can be decomposed as a difference of convex functions (DC); for this reason, we use a special smoothing technique based on DC programming. Numerical results on generated problems show the high performance and effectiveness of the proposed method.
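The smoothing step behind such ℓp-regularization can be sketched numerically: the non-Lipschitz term |x_i|^p is replaced by the differentiable surrogate (x_i^2 + ε)^(p/2). This is a minimal sketch of the generic smoothing idea only, not the paper's DC scheme; all names (`smoothed_lp`, `solve`, the parameters) are illustrative.

```python
import numpy as np

def smoothed_lp(x, p=0.5, eps=1e-4):
    # Smooth surrogate for the non-Lipschitz penalty sum |x_i|^p:
    # (x_i^2 + eps)^(p/2) is differentiable everywhere, including 0.
    return np.sum((x**2 + eps) ** (p / 2))

def smoothed_lp_grad(x, p=0.5, eps=1e-4):
    # d/dx (x^2 + eps)^(p/2) = p * x * (x^2 + eps)^(p/2 - 1)
    return p * x * (x**2 + eps) ** (p / 2 - 1)

def solve(A, b, lam=0.1, p=0.5, iters=2000, step=1e-2):
    # Plain gradient descent on 0.5||Ax - b||^2 + lam * smoothed ||x||_p^p.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b) + lam * smoothed_lp_grad(x, p)
        x -= step * g
    return x
```

On a trivial instance (A the identity, b = (2, 0)), the smoothed penalty leaves the zero component at zero and only slightly shrinks the large one.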


2021 ◽ Vol 2021 (1)
Author(s): Doaa Filali, Mohammad Dilshad, Mohammad Akram, Feeroz Babu, Izhar Ahmad

Abstract: This article introduces and analyzes a viscosity method for hierarchical variational inequalities involving a ϕ-contraction mapping defined over the common solution set of a variational inclusion and the fixed points of a nonexpansive mapping on Hadamard manifolds. Several consequences of the composed method and its convergence theorem are presented. The convergence results generalize and extend some existing results from Hilbert/Banach spaces and from Hadamard manifolds. We also present an application to a nonsmooth optimization problem. Finally, we illustrate the convergence analysis of the proposed method with computational numerical experiments on a Hadamard manifold.
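Since Euclidean space R^n is itself a (flat) Hadamard manifold, the generic viscosity scheme can be illustrated there: x_{k+1} = α_k φ(x_k) + (1 − α_k) T(x_k), with φ a contraction, T nonexpansive, and α_k → 0. This is a sketch of the classical Moudafi-type iteration only, not the paper's manifold algorithm; the mappings below are made-up examples.

```python
import numpy as np

def viscosity_iteration(T, phi, x0, iters=500):
    # Viscosity approximation in R^n (a flat Hadamard manifold):
    # x_{k+1} = a_k * phi(x_k) + (1 - a_k) * T(x_k), with a_k -> 0.
    x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        a = 1.0 / (k + 1)  # diminishing viscosity parameter
        x = a * phi(x) + (1 - a) * T(x)
    return x

# Example: T is the (nonexpansive) projection onto the unit ball,
# phi is a 0.5-contraction; the limit selected among Fix(T) is 0.
def T(x):
    n = np.linalg.norm(x)
    return x if n <= 1 else x / n

phi = lambda x: 0.5 * x
```

Starting outside the ball, the iterates enter Fix(T) and the vanishing viscosity term steers them toward the fixed point selected by the contraction.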


Complexity ◽ 2021 ◽ Vol 2021 ◽ pp. 1-8
Author(s): Jia-Tong Li, Jie Shen, Na Xu

For the CVaR (conditional value-at-risk) portfolio nonsmooth optimization problem, we propose an infeasible incremental bundle method based on the improvement function and on the main idea of incremental methods for solving convex finite min-max problems. The algorithm employs only the objective function and one component of the constraint functions to form an approximate model of the improvement function. By introducing an aggregation technique, we retain information from previous iterate points that would otherwise be deleted from the bundle, overcoming difficulties of numerical computation and storage. The algorithm enforces neither the feasibility of iterate points nor the monotonicity of the objective function, and its global convergence is established under mild conditions. Compared with available results, our method relaxes the requirement of computing the whole constraint function, which makes the algorithm easier to implement.
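The role of the improvement function H_k(y) = max{f(y) − f(x_k), g(y)} can be shown with a scheme far simpler than a bundle method: at an infeasible point the constraint branch of H_k is active, so the step follows a subgradient of g; otherwise it follows a subgradient of f. This sketch (toy problem and names are hypothetical) illustrates only that mechanism, not the aggregation or incremental model of the paper.

```python
import numpy as np

def improvement_subgradient(f, g, sg_f, sg_g, x0, step=0.05, iters=200):
    # At the current point x, H(y) = max{f(y) - f(x), g(y)} satisfies
    # H(x) = max{0, g(x)}: if g(x) > 0 the constraint branch is active,
    # otherwise the objective branch is.  Step along the negative
    # subgradient of whichever branch is active.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        d = sg_g(x) if g(x) > 0 else sg_f(x)
        x = x - step * d
    return x

# Toy problem: minimize |x1| + |x2| subject to x1 + x2 - 1 <= 0.
f = lambda x: np.abs(x).sum()
g = lambda x: x[0] + x[1] - 1.0
sg_f = lambda x: np.sign(x)
sg_g = lambda x: np.array([1.0, 1.0])
```

Started at the infeasible point (2, 2), the iterates first restore feasibility and then drive the objective toward its constrained minimum at the origin, without ever enforcing feasibility explicitly.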


2018 ◽ Vol 9 (1) ◽ pp. 68-86
Author(s): L. Antonelli, V. De Simone

Abstract: Segmentation is a typical task in image processing whose main goal is partitioning the image into multiple segments in order to simplify its interpretation and analysis. One of the more popular segmentation models, formulated by Chan-Vese, is the piecewise constant Mumford-Shah model restricted to the case of two-phase segmentation. We consider a convex relaxation of the segmentation model, which can be regarded as a nonsmooth optimization problem because of the presence of the l1-term. Two basic approaches to its nondifferentiability can be distinguished: smoothing methods and nonsmoothing methods. In this work, a numerical comparison of first-order methods from both approaches is presented. The relationships among the different methods are shown, and accuracy and efficiency tests are performed on several images.
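The "nonsmoothing" route handles the l1-term exactly through its proximal operator (soft-thresholding), as in proximal-gradient/ISTA-type first-order methods. A minimal sketch on the generic problem min ½||Ax − b||² + λ||x||₁, not on the Chan-Vese functional itself:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: componentwise shrinkage.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, iters=500):
    # Proximal gradient (ISTA) for min 0.5||Ax - b||^2 + lam * ||x||_1.
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft_threshold(x - (A.T @ (A @ x - b)) / L, lam / L)
    return x
```

With A the identity the method reduces in one step to soft-thresholding of b, which makes the shrinkage effect of the l1-term easy to see.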


2018 ◽ Vol 1 (1) ◽ pp. 1-10
Author(s): S. Dempe, G. Luo, S. Franke

In this paper, we investigate the pessimistic bilevel linear optimization problem (PBLOP). Based on the lower-level optimal value function and duality, the PBLOP can be transformed into a single-level, albeit nonconvex and nonsmooth, optimization problem. Using linear optimization duality, we obtain a tractable and equivalent transformation and propose algorithms for computing global or local optimal solutions. A small example illustrates the feasibility of the method.
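The value-function reformulation can be made concrete by brute force on a tiny grid: the leader minimizes the pessimistic value Φ(x) = max{F(x, y) : y ∈ argmin_y f(x, y)}. This enumeration sketch (hypothetical toy data, not the duality-based algorithm of the paper) also shows why Φ is nonsmooth: the follower's solution set can jump.

```python
import numpy as np

def pessimistic_value(F, f, xs, ys, tol=1e-9):
    # Brute-force value-function reformulation of a pessimistic bilevel
    # problem on a grid: Phi(x) = max{F(x, y) : y in argmin_y f(x, y)}.
    best_x, best_val = None, np.inf
    for x in xs:
        fy = np.array([f(x, y) for y in ys])
        argmin = ys[fy <= fy.min() + tol]   # follower's solution set
        val = max(F(x, y) for y in argmin)  # worst case for the leader
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val
```

In the toy instance below the follower minimizes f(x, y) = x·y over y ∈ [0, 1]; at x = 0 the follower is indifferent, so the leader's worst case jumps, and the pessimistic optimum sits at x = 0.5.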


2018 ◽ Vol 2018 ◽ pp. 1-9
Author(s): Miao Chen, Shou-qiang Du

We study a method for solving a class of nonsmooth optimization problems with the l1-norm, which arises widely in compressed sensing, image processing, and related optimization problems with broad application background in engineering. Transformed via absolute value equations, this nonsmooth optimization problem is rewritten as a general unconstrained optimization problem, which is then solved by a smoothing FR (Fletcher-Reeves) conjugate gradient method. Finally, numerical experiments show the effectiveness of the proposed smoothing FR conjugate gradient method.
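The smoothing idea pairs naturally with Fletcher-Reeves CG: replace each |x_i| by the smooth surrogate sqrt(x_i² + μ²) and run nonlinear CG on the resulting differentiable objective. A minimal sketch with Armijo backtracking and a descent-restart safeguard; parameter names are illustrative and this is not the paper's exact scheme.

```python
import numpy as np

def fr_cg(obj, grad, x0, iters=200, tol=1e-8):
    # Fletcher-Reeves nonlinear conjugate gradient with Armijo
    # backtracking; restarts when the FR direction fails to descend.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        t = 1.0
        while obj(x + t * d) > obj(x) + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves beta
        d_new = -g_new + beta * d
        if g_new @ d_new >= 0:            # safeguard: restart on ascent
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

# Smoothed l1-regularized least squares:
# 0.5||Ax - b||^2 + lam * sum_i sqrt(x_i^2 + mu^2)
mu, lam = 1e-3, 0.5
A, b = np.eye(2), np.array([2.0, 0.0])
obj = lambda x: 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.sqrt(x**2 + mu**2))
grad_f = lambda x: A.T @ (A @ x - b) + lam * x / np.sqrt(x**2 + mu**2)
```

For μ small the smoothed minimizer essentially matches the soft-thresholded solution (here roughly 1.5 in the first component, 0 in the second).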


2014 ◽ Vol 2014 ◽ pp. 1-6
Author(s): Longquan Yong

The method of least absolute deviation (LAD) finds applications in many areas due to its robustness compared to the least squares regression (LSR) method: LAD is resistant to outliers in the data, which is helpful in studies where outliers may be discounted. Since LAD is a nonsmooth optimization problem, this paper proposes a metaheuristic algorithm, the novel global harmony search (NGHS), for solving it. Numerical results show that the NGHS method has good convergence properties and is effective in solving LAD problems.
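Since harmony search needs no derivatives, it applies directly to the nonsmooth LAD objective. The sketch below is basic harmony search, not NGHS itself (NGHS adds a position-updating rule and genetic mutation on top of this scheme); the outlier-laden toy data and all parameter choices are illustrative.

```python
import random

def lad_loss(params, xs, ys):
    # Least absolute deviation loss for the line y = a + b*x.
    a, b = params
    return sum(abs(y - (a + b * x)) for x, y in zip(xs, ys))

def harmony_search(loss, bounds, hms=10, hmcr=0.9, par=0.3, iters=3000, seed=0):
    # Basic harmony search: improvise a candidate from the harmony
    # memory (rate hmcr), optionally pitch-adjust it (rate par), and
    # replace the worst memory entry whenever the candidate is better.
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [loss(h) for h in memory]
    for _ in range(iters):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:
                v = memory[rng.randrange(hms)][j]
                if rng.random() < par:  # pitch adjustment
                    v += rng.uniform(-0.05, 0.05) * (hi - lo)
            else:
                v = rng.uniform(lo, hi)
            new.append(min(max(v, lo), hi))
        s = loss(new)
        worst = max(range(hms), key=scores.__getitem__)
        if s < scores[worst]:
            memory[worst], scores[worst] = new, s
    best = min(range(hms), key=scores.__getitem__)
    return memory[best], scores[best]
```

On data following y = 1 + 2x with one gross outlier, the LAD-optimal line passes through the clean points (loss 91 from the outlier alone), illustrating the robustness the abstract describes.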

