DC Decomposition
Recently Published Documents

Total documents: 7 (five years: 3)
H-index: 3 (five years: 1)

2021 · Vol 2021 · pp. 1-9
Author(s): Feichao Shen, Ying Zhang, Xueyong Wang

In this paper, we propose an accelerated proximal point algorithm for the difference-of-convex (DC) optimization problem, combining an extrapolation technique with the proximal difference-of-convex algorithm. By fully exploiting the special structure of the DC decomposition and the stepsize information, we prove that the proposed algorithm converges at a rate of O(1/k²) under mild conditions. Numerical experiments show the superiority of the proposed algorithm over some existing algorithms.
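A minimal sketch of such a scheme, assuming a generic proximal DC step with Nesterov-style extrapolation (the function names, toy problem, and parameters below are illustrative assumptions, not the authors' exact method):

```python
import numpy as np

def accelerated_proximal_dca(prox_g, grad_h, x0, lam=1.0, steps=200):
    """Sketch only: minimize f(x) = g(x) - h(x) with g, h convex by
    linearizing h at an extrapolated point, then taking a proximal
    step on g."""
    x_prev = x = np.asarray(x0, dtype=float)
    t_prev = t = 1.0
    for _ in range(steps):
        beta = (t_prev - 1.0) / t                    # extrapolation weight
        y = x + beta * (x - x_prev)                  # extrapolated point
        x_prev, x = x, prox_g(y + lam * grad_h(y))   # linearize h, prox step on g
        t_prev, t = t, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))  # Nesterov update
    return x

# Toy DC program: f(x) = 0.5*||x - c||^2 - mu*||x||_1, minimizer c + mu*sign(c).
c, mu = np.array([1.0, -2.0, 0.5]), 0.3
prox_g = lambda v: (v + c) / 2.0     # prox of lam*g with g = 0.5*||x - c||^2, lam = 1
grad_h = lambda y: mu * np.sign(y)   # subgradient of h = mu*||x||_1
x_star = accelerated_proximal_dca(prox_g, grad_h, np.zeros(3))
```

On this strongly convex toy instance the iterates settle at the stationary point c + mu·sign(c); the extrapolation weight follows the usual Nesterov t-sequence.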


Algorithms · 2019 · Vol 12 (12) · pp. 249
Author(s): Annabella Astorino, Antonio Fuduli, Giovanni Giallombardo, Giovanna Miglionico

A multiple instance learning problem consists of categorizing objects, each represented as a set (bag) of points. Unlike the supervised classification paradigm, where each point of the training set is labeled, labels are associated only with bags, while the labels of the points inside the bags are unknown. We focus on the binary classification case, where the objective is to discriminate between positive and negative bags using a separating surface. Adopting a support vector machine setting at the training level, the problem of minimizing the classification-error function can be formulated as a nonconvex nonsmooth unconstrained program. We propose a difference-of-convex (DC) decomposition of the nonconvex function, which we tackle with an appropriate nonsmooth DC algorithm. Numerical results on benchmark data sets are reported.
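For intuition, here is one common way such a DC split can arise (an illustrative decomposition, not necessarily the paper's exact one): for a positive bag with instance scores s_j = ⟨w, x_j⟩, the hinge-type bag loss max(0, 1 − max_j s_j) is nonconvex in w, but since max(0, 1 − M) = max(1, M) − M, it equals g(w) − h(w) with both parts pointwise maxima of affine functions, hence convex:

```python
import numpy as np

def bag_loss(w, X):
    """Nonconvex hinge-type loss of a positive bag; rows of X are instances."""
    return max(0.0, 1.0 - np.max(X @ w))

def g(w, X):
    """Convex part: max over affine functions of w and the constant 1."""
    return max(1.0, np.max(X @ w))

def h(w, X):
    """Convex part: max over affine functions of w."""
    return np.max(X @ w)

w = np.array([0.1, 0.1])
X = np.array([[1.0, 2.0], [3.0, -1.0]])  # one bag with two instances
# bag_loss(w, X) and g(w, X) - h(w, X) both evaluate to 1 - max score ≈ 0.7
```

The identity holds case by case: if the best score M is at least 1 both sides are 0, otherwise both equal 1 − M.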


Author(s): Duy Nhat Phan, Hoai Minh Le, Hoai An Le Thi

In this work, we present a variant of DCA (Difference of Convex functions Algorithm) that aims to improve its convergence speed. The proposed algorithm, named Accelerated DCA (ADCA), incorporates Nesterov's acceleration technique into DCA. We first investigate ADCA for solving the standard DC program and rigorously study its convergence properties and convergence rate. Secondly, we develop ADCA for a special case of the standard DC program whose objective function is the sum of a differentiable function with L-Lipschitz gradient (possibly nonconvex) and a nonsmooth DC function. We exploit the special structure of the problem to propose an efficient DC decomposition for which the corresponding ADCA scheme is inexpensive. As an application, we consider the sparse binary logistic regression problem. Numerical experiments on several benchmark datasets illustrate the efficiency of our algorithm and its superiority over well-known methods.
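As a concrete instance of a nonsmooth DC function used for sparsity, the capped-ℓ1 penalty admits a simple DC split (this particular penalty is an assumed example for illustration, not necessarily the paper's choice): min(|t|, a) = |t| − max(|t| − a, 0), with both terms convex:

```python
import numpy as np

def capped_l1(t, a=1.0):
    """Nonconvex capped-l1 penalty min(|t|, a), applied elementwise."""
    return np.minimum(np.abs(t), a)

def g_part(t):
    """First convex term of the DC split: |t|."""
    return np.abs(t)

def h_part(t, a=1.0):
    """Second convex term of the DC split: max(|t| - a, 0)."""
    return np.maximum(np.abs(t) - a, 0.0)

t = np.linspace(-3.0, 3.0, 13)
# capped_l1(t) equals g_part(t) - h_part(t) elementwise
```

The split checks out in both cases: for |t| ≤ a both sides give |t|, and for |t| > a both give a. In a DCA scheme, h would be linearized via its subgradient while g stays in the convex subproblem.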


2017 · Vol 169 (1) · pp. 69-94
Author(s): Amir Ali Ahmadi, Georgina Hall

2008 · Vol 45 (2) · pp. 187-201
Author(s): Zvi Drezner, Stefan Nickel
