Sufficient conditions for global minimum in partially convex constrained problems

Author(s):  
V.N. Solov'ev


Author(s):  
Bruce Calvert ◽  
M. K. Vamanamurthy

Abstract: Let p: R² → R be a polynomial with a local minimum at its only critical point. This must give a global minimum if the degree of p is < 5, but not necessarily if the degree is ≥ 5. It is an open question what happens for cubics and quartics in more variables, except for cubics in three variables. Other sufficient conditions for a global minimum of a general function are given.

1980 Mathematics subject classification (Amer. Math. Soc.): 26 B 99, 26 C 99.
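The degree-5 threshold can be checked concretely. Below is a sketch, in Python with SymPy, of a classical degree-5 polynomial (chosen here for illustration; not necessarily the example used in the paper) whose only critical point is a strict local minimum that is nevertheless not global:

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)
p = x**2 * (1 + y)**3 + y**2   # degree 5

# The only critical point is the origin.
crit = sp.solve([sp.diff(p, x), sp.diff(p, y)], [x, y], dict=True)
print(crit)  # [{x: 0, y: 0}]

# The Hessian there is positive definite, so (0, 0) is a strict local minimum.
H = sp.hessian(p, (x, y)).subs({x: 0, y: 0})
print(H.is_positive_definite)  # True

# Yet p is unbounded below: along y = -2, p(t, -2) = 4 - t**2.
print(p.subs({x: 10, y: -2}))  # -96
```

The cubic factor (1 + y)³ is what makes this possible: it flips sign far from the origin and drags the function to −∞ while leaving the Hessian at the critical point positive definite.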


2014 ◽  
Vol 2014 ◽  
pp. 1-6
Author(s):  
Martin Branda

We deal with conditions that ensure exact penalization in stochastic programming problems under finite discrete distributions. We give several sufficient conditions for problem calmness, including graph calmness, existence of an error bound, and a generalized Mangasarian-Fromowitz constraint qualification. We propose a new version of the theorem on asymptotic equivalence of local minimizers of chance constrained problems and problems with an exact penalty objective. We apply the theory to a problem with a stochastic vanishing constraint.
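The exact-penalization idea can be illustrated on a deterministic toy problem (a hedged one-dimensional sketch only; it does not reproduce the paper's stochastic, chance-constrained setting): once the penalty parameter exceeds a threshold, the unconstrained minimizer of the penalized objective coincides with the constrained minimizer.

```python
# Toy constrained problem: minimize f(x) = x**2 subject to x >= 1, optimum x* = 1.
# Exact penalty reformulation: F_rho(x) = f(x) + rho * max(0, 1 - x).
# For rho above the relevant multiplier (here 2), the unconstrained
# minimizer of F_rho is exactly x* = 1.

def penalty_objective(x, rho):
    return x * x + rho * max(0.0, 1.0 - x)

def grid_argmin(obj, lo=-3.0, hi=3.0, n=60001):
    # Brute-force search on a uniform grid (sufficient for this 1-D sketch).
    step = (hi - lo) / (n - 1)
    best_x, best_v = lo, obj(lo)
    for i in range(1, n):
        xi = lo + i * step
        v = obj(xi)
        if v < best_v:
            best_x, best_v = xi, v
    return best_x

x_small = grid_argmin(lambda t: penalty_objective(t, rho=1.0))   # rho too small
x_large = grid_argmin(lambda t: penalty_objective(t, rho=10.0))  # rho large enough
print(round(x_small, 2), round(x_large, 2))  # 0.5 1.0
```

With rho = 1 the penalized minimizer (0.5) is infeasible, while rho = 10 recovers the constrained optimum exactly, which is the hallmark of an exact (non-smooth) penalty as opposed to a quadratic one.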


Author(s):  
Ion Necoara ◽  
Martin Takáč

Abstract: In this paper we consider large-scale smooth optimization problems with multiple linear coupled constraints. Due to the non-separability of the constraints, arbitrary random sketching is not guaranteed to work. Thus, we first investigate necessary and sufficient conditions on the sketch sampling for the resulting algorithms to be well defined. Based on these sampling conditions we develop new sketch descent methods for solving general smooth linearly constrained problems, in particular, the random sketch descent (RSD) and accelerated random sketch descent (A-RSD) methods. To our knowledge, this is the first convergence analysis of RSD algorithms for optimization problems with multiple non-separable linear constraints. For the general case, when the objective function is smooth and non-convex, we prove a sublinear rate in expectation for the non-accelerated variant, with respect to an appropriate optimality measure. In the smooth convex case, we derive sublinear convergence rates in the expected objective values for both algorithms, RSD and A-RSD. Additionally, if the objective function satisfies a strong-convexity-type condition, both algorithms converge linearly in expectation. In special cases where complexity bounds are known for particular sketching algorithms, such as coordinate descent methods for optimization problems with a single linear coupled constraint, our theory recovers the best known bounds. Finally, we present several numerical examples to illustrate the performance of our new algorithms.
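The single-coupled-constraint special case mentioned above can be sketched as 2-coordinate descent: picking a random pair (i, j) and moving along e_i − e_j leaves sum(x) unchanged, so every iterate stays feasible. The code below is an illustrative instance only (function names, step size, and the quadratic test objective are assumptions, not the paper's RSD/A-RSD methods):

```python
import random

def pairwise_descent(grad, x, steps=2000, lr=0.1, seed=0):
    # Each update moves along e_i - e_j, so sum(x) is preserved exactly.
    rng = random.Random(seed)
    n = len(x)
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        g = grad(x)
        d = g[i] - g[j]        # directional derivative along e_i - e_j
        x[i] -= lr * d
        x[j] += lr * d
    return x

# Example: minimize 0.5 * sum((x_k - c_k)^2) subject to sum(x) = 0.
# The constrained optimum is x_k = c_k - mean(c).
c = [3.0, -1.0, 2.0]
grad = lambda x: [x[k] - c[k] for k in range(len(c))]
x = pairwise_descent(grad, [0.0, 0.0, 0.0])   # feasible start: sum = 0
print([round(v, 3) for v in x])  # [1.667, -2.333, 0.667]
```

The iterates converge to the projection of c onto the constraint subspace, illustrating why a sampling condition is needed: the sampled directions must span the feasible subspace for the method to reach the constrained optimum.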


Filomat ◽  
2017 ◽  
Vol 31 (11) ◽  
pp. 3407-3420 ◽  
Author(s):  
P. Cheraghi ◽  
Ali Farajzadeh ◽  
Gradimir Milovanovic

Some necessary conditions for the weak subdifferential of a function to be nonempty are presented, and the positive homogeneity of the weak subdifferential operator is proved. Necessary and sufficient conditions for achieving a global minimum of a weakly subdifferentiable function are stated, as well as a link between the subdifferential and the Fréchet differential with a weak subdifferential. A result about equality in the fuzzy sum rule inclusion is also investigated. Finally, some examples are included.


Entropy ◽  
2019 ◽  
Vol 21 (10) ◽  
pp. 924 ◽  
Author(s):  
Tailin Wu ◽  
Ian Fischer ◽  
Isaac L. Chuang ◽  
Max Tegmark

The Information Bottleneck (IB) method provides an insightful and principled approach for balancing compression and prediction in representation learning. The IB objective I(X;Z) − βI(Y;Z) employs a Lagrange multiplier β to tune this trade-off. However, in practice, not only is β chosen empirically without theoretical guidance, but there is also a lack of theoretical understanding of the relation between β, learnability, the intrinsic nature of the dataset, and model capacity. In this paper, we show that if β is improperly chosen, learning cannot happen: the trivial representation P(Z|X) = P(Z) becomes the global minimum of the IB objective. We show how this can be avoided by identifying a sharp phase transition between the unlearnable and the learnable regimes which arises as β is varied. This phase transition defines the concept of IB-Learnability. We prove several sufficient conditions for IB-Learnability, which provide theoretical guidance for choosing a good β. We further show that IB-Learnability is determined by the largest confident, typical, and imbalanced subset of the examples (the conspicuous subset), and discuss its relation with model capacity. We give practical algorithms to estimate the minimum β for a given dataset. We also empirically demonstrate our theoretical conditions with analyses of synthetic datasets, MNIST, and CIFAR10.
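The role of the trivial representation can be verified numerically. The sketch below (with a made-up 2×2 joint distribution, not data from the paper) checks that the encoder P(Z|X) = P(Z) gives I(X;Z) = I(Y;Z) = 0, so the IB objective equals 0 for every β, and that an informative encoder attains a negative objective once β is large enough, which is the learnability criterion in miniature:

```python
import math

def mutual_information(pab):
    """I(A;B) in nats for a joint distribution given as a nested list."""
    pa = [sum(row) for row in pab]
    pb = [sum(col) for col in zip(*pab)]
    mi = 0.0
    for i, row in enumerate(pab):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log(p / (pa[i] * pb[j]))
    return mi

def ib_objective(pxy, encoder, beta):
    """encoder[i][k] = P(Z=k | X=i); returns I(X;Z) - beta * I(Y;Z)."""
    px = [sum(row) for row in pxy]
    nx, ny, nz = len(pxy), len(pxy[0]), len(encoder[0])
    pxz = [[px[i] * encoder[i][k] for k in range(nz)] for i in range(nx)]
    # Markov chain Z - X - Y: P(Y=j, Z=k) = sum_i P(X=i, Y=j) * P(Z=k | X=i).
    pyz = [[sum(pxy[i][j] * encoder[i][k] for i in range(nx))
            for k in range(nz)] for j in range(ny)]
    return mutual_information(pxz) - beta * mutual_information(pyz)

pxy = [[0.4, 0.1], [0.1, 0.4]]        # toy correlated X and Y
trivial = [[0.5, 0.5], [0.5, 0.5]]    # P(Z|X) = P(Z): ignores X
identity = [[1.0, 0.0], [0.0, 1.0]]   # Z = X

print(abs(ib_objective(pxy, trivial, beta=5.0)) < 1e-9)  # True: objective is 0
print(ib_objective(pxy, identity, beta=5.0) < 0)         # True: learnable here
```

Since the trivial encoder always scores 0, learnability amounts to the existence of some encoder with a strictly negative objective, which for this toy distribution requires β above roughly H(X)/I(X;Y).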


Mathematics ◽  
2019 ◽  
Vol 7 (10) ◽  
pp. 900 ◽  
Author(s):  
Hamed H Al-Sulami ◽  
Nawab Hussain ◽  
Jamshaid Ahmad

A best proximity point theorem furnishes sufficient conditions for the existence and computation of an approximate solution ω that is optimal in the sense that the error σ(ω, Jω) attains the global minimum value σ(θ, ϑ). The aim of this paper is to define the notion of a Suzuki α-Θ-proximal multivalued contraction and prove the existence of best proximity points ω satisfying σ(ω, Jω) = σ(θ, ϑ), where J is assumed to be continuous or the space M is regular. We derive some best proximity results on metric spaces with graphs and on ordered metric spaces as consequences. We also provide a nontrivial example to support our main results. As applications of our main results, we discuss some variational inequality problems and dynamic programming problems.


2004 ◽  
Vol 06 (01) ◽  
pp. 15-20 ◽  
Author(s):  
PIERRE CARTIGNY ◽  
CHRISTOPHE DEISSENBERG

George Leitmann's direct method makes it possible to find the global minimum of a variational problem by means of a change of variables. In this article, we extend this method to a class of scalar, inequality-constrained problems. An application example is given.


2021 ◽  
Vol 17 (4) ◽  
pp. 1-29
Author(s):  
Monaldo Mastrolilli

Given an ideal I and a polynomial f, the Ideal Membership Problem (IMP) is to test whether f ∈ I. This is a fundamental algorithmic problem with important applications, and it is notoriously intractable. We study the complexity of the IMP for combinatorial ideals that arise from constrained problems over the Boolean domain. As our main result, we identify the borderline of tractability. Using Gröbner bases techniques, we extend Schaefer's dichotomy theorem [STOC, 1978], which classifies all Constraint Satisfaction Problems (CSPs) over the Boolean domain as either in P or NP-hard. Moreover, our result implies necessary and sufficient conditions for the efficient computation of Theta Body Semi-Definite Programming (SDP) relaxations, thereby identifying the borderline of tractability for constraint language problems. This article is motivated by the pursuit of understanding the recently raised issue of the bit complexity of Sum-of-Squares (SoS) proofs [O'Donnell, ITCS, 2017]. Raghavendra and Weitz [ICALP, 2017] show how IMP tractability for combinatorial ideals implies bounded coefficients in SoS proofs.
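The Gröbner-basis membership test underlying this line of work can be sketched with SymPy (a generic textbook illustration, not the paper's combinatorial-ideal machinery): f lies in I exactly when the remainder of f on division by a Gröbner basis of I is zero.

```python
import sympy as sp

x, y = sp.symbols("x y")
# Ideal encoding the Boolean constraints x^2 = x, y^2 = y, and x + y = 1.
G = sp.groebner([x**2 - x, y**2 - y, x + y - 1], x, y, order="lex")

f_in = x * y    # vanishes on the solution set {(1,0), (0,1)}; lies in the ideal
f_out = x - y   # equals +/-1 on the solution set; not in the ideal

_, r_in = G.reduce(f_in)    # remainder 0  <=>  membership
_, r_out = G.reduce(f_out)
print(r_in == 0, r_out == 0)  # True False
```

Here membership can be checked by hand as well: x·y = x(x + y − 1) − (x² − x), an explicit combination of the generators, while x − y reduces to a nonzero linear remainder.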

