difference of convex functions
Recently Published Documents


TOTAL DOCUMENTS: 62 (FIVE YEARS 19)

H-INDEX: 14 (FIVE YEARS 2)

2022 ◽ Vol 40 ◽ pp. 1-16
Author(s): Fakhrodin Hashemi, Saeed Ketabchi

Optimal correction of an infeasible system of equations of the form Ax + B|x| = b leads to a non-convex fractional problem. In this paper, a regularization method (ℓp-norm, 0 < p < 1) is presented to solve this fractional problem. With this method, the problem can be formulated as a non-convex, nonsmooth optimization problem whose objective is not Lipschitz. The objective function can be decomposed as a difference of convex functions (DC), so we use a special smoothing technique based on DC programming. The numerical results obtained for generated problems show the high performance and effectiveness of the proposed method.
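The DC structure mentioned in the abstract drives the standard DCA iteration: linearize the concave part at the current iterate and minimize the resulting convex model. A minimal sketch on a toy function of our own choosing (f(x) = x⁴ − x², not the paper's fractional problem):

```python
# DCA sketch for minimizing f(x) = g(x) - h(x) with g, h convex.
# Toy example (not the paper's problem): f(x) = x**4 - x**2,
# i.e. g(x) = x**4, h(x) = x**2.  Each DCA step linearizes h at
# the current iterate x_k and minimizes the convex model:
#   x_{k+1} = argmin_x  g(x) - h'(x_k) * x
# For g(x) = x**4 the subproblem has the closed form
#   4 x**3 = h'(x_k) = 2 x_k   =>   x = (x_k / 2) ** (1/3).

def dca(x0, iters=60):
    # assumes x0 > 0 so the real cube root below stays valid
    x = x0
    for _ in range(iters):
        slope = 2.0 * x                 # h'(x_k)
        x = (slope / 4.0) ** (1.0 / 3)  # closed-form convex subproblem
    return x

x_star = dca(1.0)
print(x_star)  # converges to 1/sqrt(2) ≈ 0.7071, a stationary point of f
```

The fixed point satisfies x³ = x/2, i.e. x = 1/√2, which is indeed a critical point of x⁴ − x².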


2021 ◽ Vol 2021 ◽ pp. 1-19
Author(s): Zhijun Luo, Zhibin Zhu, Benxin Zhang

This paper proposes a nonconvex model (called LogTVSCAD) for deblurring images with impulsive noise, using the log-function penalty as the regularizer and the smoothly clipped absolute deviation (SCAD) function as the data-fitting term. The proposed nonconvex model can effectively overcome the poor performance of the classical TVL1 model under high-level impulsive noise. A difference-of-convex-functions algorithm (DCA) is proposed to solve the nonconvex model, and the model subproblem is solved with the alternating direction method of multipliers (ADMM). Global convergence is discussed based on the Kurdyka–Łojasiewicz property. Experimental results show the advantages of the proposed nonconvex model over existing models.


Author(s): Sorin-Mihai Grad, Felipe Lara

Abstract We introduce and investigate a new generalized convexity notion for functions called prox-convexity. The proximity operator of such a function is single-valued and firmly nonexpansive. We provide examples of (strongly) quasiconvex, weakly convex, and DC (difference of convex) functions that are prox-convex; however, none of these classes fully contains the class of prox-convex functions or is contained in it. We show that the classical proximal point algorithm remains convergent when the convexity of the proper lower semicontinuous function to be minimized is relaxed to prox-convexity.
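For reference, the classical proximal point algorithm the abstract extends is simply the fixed-point iteration x_{k+1} = prox_{λf}(x_k). A minimal sketch on a convex example of our own (f(x) = |x|, whose prox is soft-thresholding); the paper's contribution is that this scheme still converges when convexity is relaxed to prox-convexity:

```python
# Classical proximal point algorithm: x_{k+1} = prox_{lam * f}(x_k).
# Toy convex example f(x) = |x|, whose proximity operator is the
# soft-thresholding map prox_{lam|.|}(v) = sign(v) * max(|v| - lam, 0).

def soft_threshold(v, lam):
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

def proximal_point(x0, lam=1.0, iters=20):
    x = x0
    for _ in range(iters):
        x = soft_threshold(x, lam)  # prox step; here single-valued
    return x

print(proximal_point(3.5))  # reaches the minimizer x = 0 of |x|
```

The single-valuedness of the prox map used on each line above is exactly the property the paper establishes for prox-convex functions.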


2021 ◽ Vol 0 (0)
Author(s): Jie Shen, Na Xu, Fang-Fang Guo, Han-Yang Li, Pan Hu

Abstract For nonlinear nonsmooth DC programming (minimization of a difference of convex functions), we introduce a new redistributed proximal bundle method. Subgradient information for both DC components is gathered from a neighbourhood of the current stability center and used to build a separate approximation for each component in the DC representation. In particular, we employ a nonlinear redistribution technique to model the second component of the DC function by constructing a local convexification cutting plane. The corresponding convexification parameter is adjusted dynamically and taken sufficiently large to make the "augmented" linearization errors nonnegative. These techniques yield a new convex cutting-plane model of the original objective function, on which the redistributed proximal bundle method is designed; convergence of the proposed algorithm to a Clarke stationary point is proved. A simple numerical experiment shows the validity of the presented algorithm.
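The convexification-parameter rule can be sketched concretely. Assuming linearization errors e_i of the second DC component at bundle points y_i around center x, and augmentation of the form e_i + (η/2)‖y_i − x‖², the smallest admissible η is easy to compute; the notation and rule below are a plausible illustration, not the paper's exact update:

```python
import numpy as np

# Sketch of a convexification-parameter update: given (possibly
# negative) linearization errors e_i of the second DC component at
# bundle points y_i, pick eta so that the "augmented" errors
#   e_i + (eta / 2) * ||y_i - x||**2
# are all nonnegative.  Notation here is illustrative only.

def convexification_parameter(errors, points, center):
    d2 = np.array([np.dot(y - center, y - center) for y in points])
    mask = d2 > 0
    if not mask.any():
        return 0.0
    needed = np.max(-2.0 * errors[mask] / d2[mask])
    return max(0.0, float(needed))

errors = np.array([0.3, -0.4, -1.0])        # two errors are negative
points = [np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([2.0, 2.0])]
center = np.array([0.0, 0.0])

eta = convexification_parameter(errors, points, center)
d2 = np.array([np.dot(y - center, y - center) for y in points])
augmented = errors + 0.5 * eta * d2
print(eta, augmented)  # all augmented errors are >= 0 by construction
```

Taking the maximum over the bundle makes every cutting plane of the convexified second component sit below the model, which is what restores a convex overall cutting-plane model.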


2021 ◽ Vol 0 (0) ◽ pp. 0
Author(s): Abdellatif Moudafi, Paul-Emile Mainge

Based on a work by M. Dür and J.-B. Hiriart-Urruty [3], we consider the problem of deciding whether a symmetric matrix is copositive, formulated as a difference-of-convex-functions problem. Since the convex nondifferentiable function in this d.c. decomposition is proximable, we apply a proximal-gradient method to approximate the related stationary points, whereas in [3] the DCA algorithm was used.
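The underlying formulation: A is copositive iff min{xᵀAx : x ≥ 0, Σx = 1} ≥ 0 over the standard simplex. Since the prox of the simplex indicator is Euclidean projection, a proximal-gradient step on xᵀAx reduces to projected gradient descent. A minimal sketch with our own step size and test matrix (not the authors' exact scheme or d.c. splitting):

```python
import numpy as np

# Copositivity check sketch: A is copositive iff
#   min { x^T A x : x >= 0, sum(x) = 1 }  >=  0.
# A proximal-gradient step with the simplex indicator as the
# nonsmooth part is projected gradient descent.

def project_simplex(v):
    # Euclidean projection onto {x >= 0, sum(x) = 1} (sort-based)
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u + (1.0 - css) / idx > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0.0)

def simplex_min_quadratic(A, x0, step=0.05, iters=500):
    x = project_simplex(np.asarray(x0, dtype=float))
    for _ in range(iters):
        x = project_simplex(x - step * 2.0 * A @ x)  # grad of x^T A x is 2Ax
    return x, float(x @ A @ x)

A = np.array([[1.0, -2.0], [-2.0, 1.0]])       # symmetric, not copositive
x, q = simplex_min_quadratic(A, [0.7, 0.3])
print(x, q)  # a negative minimum value certifies A is not copositive
```

For this 2×2 example the restriction of xᵀAx to the simplex is 6t² − 6t + 1 in x = (t, 1 − t), so the iteration settles at t = 1/2 with value −1/2, a certificate of non-copositivity.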


2020 ◽ pp. 1-13
Author(s): Xufang Li, Zhong Wu, Fang Zhang, Deqiang Qu

BACKGROUND: Many medical image processing problems can be translated into optimization models, and in practice many of these models are nonconvex. OBJECTIVE: In this paper, we focus on a special class of robust nonconvex optimization, namely robust optimization in which, for fixed parameters, the objective function can be expressed as a difference of convex functions. METHODS: We present a necessary optimality condition under general assumptions, and propose a sequential robust convex optimization algorithm to solve the problem. RESULTS: We show that the new algorithm converges globally to a stationary point of the original problem under a general assumption on the uncertainty set. An application to medical image enhancement is conducted, and the numerical results show its efficiency.
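The "sequential" idea admits a small worked illustration: at each outer step, linearize the concave part of the DC objective at the current iterate and minimize the resulting robust convex model. Everything below (the functions f_u, the finite uncertainty set U, the grid solver) is a toy of our own, not the paper's setup:

```python
import numpy as np

# Sequential robust convex optimization sketch on a toy problem:
#   minimize_x  max_{u in U}  f_u(x),   f_u(x) = (x - u)**2 - 0.5 * x**2,
# a DC function for every fixed parameter u, with U = {-1, +1}.
# Each outer step linearizes the concave part -0.5*x**2 at x_k and
# minimizes the resulting robust *convex* model over a fine grid.

U = [-1.0, 1.0]
grid = np.linspace(-3.0, 3.0, 6001)

def robust_convex_model(x, xk):
    # convex part max_u (x-u)^2 stays; concave part linearized at xk
    worst = np.max([(x - u) ** 2 for u in U], axis=0)
    return worst - (0.5 * xk ** 2 + xk * (x - xk))

def sequential_robust(x0, iters=10):
    xk = x0
    for _ in range(iters):
        xk = grid[np.argmin(robust_convex_model(grid, xk))]
    return xk

x_star = sequential_robust(2.0)
robust_obj = max((x_star - u) ** 2 - 0.5 * x_star ** 2 for u in U)
print(x_star, robust_obj)  # x ≈ 0, robust objective value ≈ 1
```

Each subproblem is a robust convex program (a max of convex functions plus an affine term), so any convex solver could replace the grid scan; the outer loop is the DC linearization that drives convergence to a stationary point.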

