Convexity assumption
Recently Published Documents

TOTAL DOCUMENTS: 31 (last five years: 10)
H-INDEX: 4 (last five years: 2)

Author(s):  
Guolin Yu ◽  
Siqi Li ◽  
Xiao Pan ◽  
Wenyan Han

This paper is devoted to the investigation of optimality conditions for approximate quasi-weakly efficient solutions to a class of nonsmooth vector equilibrium problems (VEP) via convexificators. First, a necessary optimality condition for approximate quasi-weakly efficient solutions to problem (VEP) is presented by making use of the properties of convexificators. Second, the notion of an approximate pseudoconvex function defined in terms of convexificators is introduced, and its existence is verified by a concrete example. Under this generalized convexity assumption, a sufficient optimality condition for approximate quasi-weakly efficient solutions to problem (VEP) is also established. Finally, a scalar characterization of approximate quasi-weakly efficient solutions to problem (VEP) is obtained by taking advantage of Tammer’s function.
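
For orientation (standard formulations, which may differ in detail from those used in the paper): given a nonempty set $K$, a bifunction $F\colon K\times K\to Y$ and a closed convex cone $C\subset Y$ with nonempty interior, the weak problem (VEP) asks for $\bar{x}\in K$ with
$$F(\bar{x},y)\notin-\operatorname{int}C\quad\text{for all }y\in K,$$
and, for $\varepsilon\ge 0$ and a fixed $e\in\operatorname{int}C$, a point $\bar{x}\in K$ is commonly called an $\varepsilon$-quasi-weakly efficient solution if
$$F(\bar{x},y)+\varepsilon\lVert y-\bar{x}\rVert\,e\notin-\operatorname{int}C\quad\text{for all }y\in K.$$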


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Nazarii Tupitsa ◽  
Pavel Dvurechensky ◽  
Alexander Gasnikov ◽  
Sergey Guminov

We consider alternating minimization procedures for convex and non-convex optimization problems in which the vector of variables is divided into several blocks, each block being amenable to minimization with respect to its own variables while the other blocks are held constant. In the case of two blocks, we prove a linear convergence rate for an alternating minimization procedure under the Polyak–Łojasiewicz (PL) condition, which can be seen as a relaxation of the strong convexity assumption. Under the strong convexity assumption in the many-block setting, we provide an accelerated alternating minimization procedure whose linear convergence rate depends on the square root of the condition number, as opposed to the condition number itself for the non-accelerated method. We also consider the problem of finding an approximate non-negative solution to a linear system of equations $Ax=y$ by alternating minimization of the Kullback–Leibler (KL) divergence between $Ax$ and $y$.
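
For reference, the two-block scheme and the PL condition mentioned above can be written as follows (standard formulations, not specific to this paper): the iteration alternates exact minimizations
$$x_{1}^{k+1}\in\arg\min_{x_{1}}f(x_{1},x_{2}^{k}),\qquad x_{2}^{k+1}\in\arg\min_{x_{2}}f(x_{1}^{k+1},x_{2}),$$
and $f$ satisfies the PL condition with constant $\mu>0$ if
$$\tfrac{1}{2}\lVert\nabla f(x)\rVert^{2}\ge\mu\bigl(f(x)-f^{*}\bigr)\quad\text{for all }x,$$
where $f^{*}$ is the optimal value; every $\mu$-strongly convex function satisfies this inequality with the same $\mu$, but many non-convex functions satisfy it as well.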


2020 ◽  
Vol 2020 (766) ◽  
pp. 61-71 ◽  
Author(s):  
Or Hershkovits

We show that the bowl soliton in $\mathbb{R}^{3}$ is the unique translating solution of the mean curvature flow whose tangent flow at $-\infty$ is the shrinking cylinder. As an application, we show that for a generic mean curvature flow, all (non-static) translating limit flows are the bowl soliton. The crucial point is that we do not make any global convexity assumption, while at the same time the asymptotic requirement is very weak.
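
For context (standard definitions, not specific to this paper): a family of hypersurfaces $x\colon M^{2}\times[0,T)\to\mathbb{R}^{3}$ moves by mean curvature flow if
$$\partial_{t}x=\vec{H}=-H\nu,$$
and a translating solution with velocity $v$ is one evolving by $x(\cdot,t)=x(\cdot,0)+tv$; up to the choice of orientation this amounts to the equation $H=\langle v,\nu\rangle$ on the initial hypersurface. The bowl soliton is the rotationally symmetric, strictly convex translator asymptotic to a paraboloid.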


Author(s):  
Mohsine Jennane ◽  
El Mostafa Kalmoun ◽  
Lahoussine Lafhim

We consider a nonsmooth semi-infinite interval-valued vector programming problem in which the objective and constraint functions need not be locally Lipschitz. Using Abadie's constraint qualification and convexificators, we provide Karush–Kuhn–Tucker necessary optimality conditions by converting the initial problem into a bi-criteria optimization problem. Furthermore, we establish sufficient optimality conditions under the asymptotic convexity assumption.
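
Schematically, convexificator-based KKT conditions for a scalar objective $f$ and inequality constraints $g_{t}$, $t\in T$, take the form (the interval-valued, semi-infinite statement in the paper is more involved; this is only the generic pattern)
$$0\in\operatorname{conv}\partial^{*}f(\bar{x})+\sum_{t\in T(\bar{x})}\lambda_{t}\operatorname{conv}\partial^{*}g_{t}(\bar{x}),\qquad\lambda_{t}\ge 0,$$
with only finitely many $\lambda_{t}$ nonzero, where $T(\bar{x})$ denotes the constraints active at $\bar{x}$ and $\partial^{*}$ a convexificator.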


2019 ◽  
Vol 36 (04) ◽  
pp. 1950021
Author(s):  
Tijani Amahroq ◽  
Abdessamad Oussarhan

Optimality conditions are established in terms of Lagrange–Fritz–John multipliers as well as Lagrange–Kuhn–Tucker multipliers for set optimization problems (without any convexity assumption) by using new scalarization techniques. Additionally, we indicate how these results may be applied to some particular weak vector equilibrium problems.


Author(s):  
Xiang Geng ◽  
Bin Gu ◽  
Xiang Li ◽  
Wanli Shi ◽  
Guansheng Zheng ◽  
...  

Semi-supervised learning (SSL) plays an increasingly important role in the big data era because a large number of unlabeled samples can be used effectively to improve the performance of a classifier. The semi-supervised support vector machine (S3VM) is one of the most appealing methods for SSL, but scaling up S3VM for kernel learning remains an open problem. Recently, a doubly stochastic gradient (DSG) algorithm was proposed to achieve efficient and scalable training for kernel methods. However, the DSG algorithm and its theoretical analysis are developed under a convexity assumption, which makes them unsuitable for non-convex problems such as S3VM. To address this problem, we propose a triply stochastic gradient algorithm for S3VM, called TSGS3VM. Specifically, to handle the two types of data instances involved in S3VM, TSGS3VM samples a labeled instance, an unlabeled instance, and random features in each iteration to compute a triply stochastic gradient, and uses this approximate gradient to update the solution. More importantly, we establish a new theoretical analysis for TSGS3VM which guarantees that it converges to a stationary point. Extensive experimental results on a variety of datasets demonstrate that TSGS3VM is much more efficient and scalable than existing S3VM algorithms.
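
As a rough illustration of the triply stochastic idea, the sketch below (an illustrative simplification, not the authors' implementation) keeps a DSG-style expansion over random Fourier features for an RBF kernel and, in each iteration, draws one labeled instance, one unlabeled instance and one fresh random feature; the hinge loss on labeled data, the non-convex hat loss max(0, 1 - |f(x)|) on unlabeled data and the weight gamma on the unlabeled term are all assumptions made here for concreteness.

    # Sketch only: three sources of randomness per iteration, in the spirit of TSGS3VM.
    import numpy as np

    def rff(x, omega, b):
        # random Fourier feature sqrt(2) * cos(omega . x + b) for an RBF kernel
        return np.sqrt(2.0) * np.cos(omega @ x + b)

    def tsg_s3vm_sketch(X_lab, y_lab, X_unl, T=2000, sigma=1.0, lam=1e-4, gamma=0.1, seed=0):
        rng = np.random.default_rng(seed)
        d = X_lab.shape[1]
        omegas, offsets, alphas = [], [], []

        def f(x):
            # evaluate the current model with all random features stored so far (O(t) cost)
            return sum(a * rff(x, w, b) for a, w, b in zip(alphas, omegas, offsets))

        for t in range(1, T + 1):
            eta = 1.0 / (lam * t)                          # decaying step size
            i = rng.integers(len(X_lab))                   # 1) sample a labeled instance
            x_l, y_l = X_lab[i], y_lab[i]
            x_u = X_unl[rng.integers(len(X_unl))]          # 2) sample an unlabeled instance
            omega = rng.normal(0.0, 1.0 / sigma, size=d)   # 3) sample a fresh random feature
            b = rng.uniform(0.0, 2.0 * np.pi)

            fl, fu = f(x_l), f(x_u)
            g_l = -y_l if y_l * fl < 1.0 else 0.0          # hinge-loss derivative (labeled)
            g_u = -np.sign(fu) if abs(fu) < 1.0 else 0.0   # hat-loss derivative (unlabeled, non-convex)

            for k in range(len(alphas)):                   # shrinkage from L2 regularization
                alphas[k] *= 1.0 - eta * lam
            alphas.append(-eta * (g_l * rff(x_l, omega, b) + gamma * g_u * rff(x_u, omega, b)))
            omegas.append(omega)
            offsets.append(b)

        return lambda x: np.sign(f(x))                     # resulting classifier

With X_lab, y_lab (labels in {-1, +1}) and X_unl given as NumPy arrays, predict = tsg_s3vm_sketch(X_lab, y_lab, X_unl) returns a classifier; the actual TSGS3VM pairs such updates with the analysis that guarantees convergence to a stationary point.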


2019 ◽  
Vol 21 (02) ◽  
pp. 1850007 ◽  
Author(s):  
Dorin Bucur ◽  
Ilaria Fragalà

We prove that the optimal cluster problem for the sum or the max of the first Robin eigenvalue of the Laplacian, in the limit of a large number of convex cells, is asymptotically solved by (the Cheeger sets of) the honeycomb of regular hexagons. The same result is established for the Robin torsional rigidity. In the specific case of the max of the first Robin eigenvalue, we are able to remove the convexity assumption on the cells.
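
For orientation, the first Robin eigenvalue of the Laplacian on a cell $\Omega$ with boundary parameter $\beta>0$ admits the standard variational characterization
$$\lambda_{1}^{\beta}(\Omega)=\min_{u\in H^{1}(\Omega)\setminus\{0\}}\frac{\int_{\Omega}|\nabla u|^{2}\,dx+\beta\int_{\partial\Omega}u^{2}\,d\sigma}{\int_{\Omega}u^{2}\,dx}.$$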


2019 ◽  
Vol 25 ◽  
pp. 55
Author(s):  
Xi Yin Zheng ◽  
Kung-Fu Ng

Under either a linearity or a convexity assumption, several authors have studied the stability of error bounds for inequality systems when the data involved undergo small perturbations. In this paper, we consider the corresponding issue for a more general conic inequality (most constraint systems in optimization can be described by an inequality of this type). In terms of coderivatives for vector-valued functions, we carry out a perturbation analysis of error bounds for conic inequalities in the subsmooth setting. The main results of this paper are new even in the convex/smooth case.
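
Here an error bound for a conic inequality $f(x)\in-C$ with solution set $S$ means, in a standard formulation, an estimate
$$d(x,S)\le\tau\,d\bigl(f(x),-C\bigr)\quad\text{for all }x\text{ near the reference point},$$
for some $\tau>0$; in the classical scalar case $C=\mathbb{R}_{+}$ this reduces to $d(x,S)\le\tau[f(x)]_{+}$.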


2019 ◽  
Vol 39 (2) ◽  
pp. 145-157
Author(s):  
Miguel de Benito Delgado ◽  
Jesus Ildefonso Díaz

We study some properties of the coincidence set for the boundary Signorini problem, improving results from previous works by the second author and collaborators. Among other new results, we show that the convexity assumption on the domain, made in earlier work on the location of the coincidence set, can be avoided under suitable alternative conditions on the data.

