cardinality constraint
Recently Published Documents

Total documents: 99 (five years: 54)
H-index: 11 (five years: 1)

Author(s): Bruno Ordozgoiti, Ananth Mahadevan, Antonis Matakos, Aristides Gionis

Abstract When searching for information in a data collection, we are often interested not only in finding relevant items, but also in assembling a diverse set, so as to explore different concepts that are present in the data. This problem has been researched extensively. However, finding a set of items with minimal pairwise similarities can be computationally challenging, and most existing works striving for quality guarantees assume that item relatedness is measured by a distance function. Given the widespread use of similarity functions in many domains, we believe this to be an important gap in the literature. In this paper we study the problem of finding a diverse set of items, when item relatedness is measured by a similarity function. We formulate the diversification task using a flexible, broadly applicable minimization objective, consisting of the sum of pairwise similarities of the selected items and a relevance penalty term. To find good solutions we adopt a randomized rounding strategy, which is challenging to analyze because of the cardinality constraint present in our formulation. Even though this obstacle can be overcome using dependent rounding, we show that it is possible to obtain provably good solutions using an independent approach, which is faster, simpler to implement and completely parallelizable. Our analysis relies on a novel bound for the ratio of Poisson-Binomial densities, which is of independent interest and has potential implications for other combinatorial-optimization problems. We leverage this result to design an efficient randomized algorithm that provides a lower-order additive approximation guarantee. We validate our method using several benchmark datasets, and show that it consistently outperforms the greedy approaches that are commonly used in the literature.
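To make the independent-rounding idea concrete, here is a minimal Python sketch: a fractional solution x with sum(x) = k is rounded by keeping each item independently with probability x[i], and the resulting set (whose size is Poisson-Binomial distributed and concentrated around k) is repaired to exactly k items. The repair rule below is a simple heuristic chosen for illustration; it is not the procedure analyzed in the paper.

```python
import numpy as np

def independent_round(x, k, rng=None):
    """Round a fractional solution x (assumed sum(x) == k) to a size-k set.

    Each item i is kept independently with probability x[i]; the set is
    then repaired to exactly k elements by keeping/adding the items with
    the largest fractional values. Illustrative sketch only.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    keep = rng.random(len(x)) < x              # independent Bernoulli draws
    chosen = np.flatnonzero(keep)
    if len(chosen) > k:                        # too many: keep k largest-x picks
        chosen = chosen[np.argsort(x[chosen])[::-1][:k]]
    elif len(chosen) < k:                      # too few: add largest-x leftovers
        rest = np.flatnonzero(~keep)
        extra = rest[np.argsort(x[rest])[::-1][: k - len(chosen)]]
        chosen = np.concatenate([chosen, extra])
    return np.sort(chosen)
```

Because the draws are independent, the rounding step parallelizes trivially across items, which is the practical advantage the abstract highlights over dependent rounding.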


Author(s): Jinhak Kim, Mohit Tawarmalani, Jean-Philippe P. Richard

We develop techniques to convexify a set that is invariant under permutation and/or change of sign of variables and discuss applications of these results. First, we convexify the intersection of the unit ball of a permutation- and sign-invariant norm with a cardinality constraint. This gives a nonlinear formulation for the feasible set of sparse principal component analysis (PCA) and an alternative derivation of the K-support norm. Second, we characterize the convex hull of sets of matrices defined by constraining their singular values. As a consequence, we generalize an earlier result that characterizes the convex hull of rank-constrained matrices whose spectral norm is below a given threshold. Third, we derive convex and concave envelopes of various permutation-invariant nonlinear functions and their level sets over hypercubes, with congruent bounds on all variables. Finally, we develop new relaxations for the exterior product of sparse vectors. Using these relaxations for sparse PCA, we show that our relaxation closes 98% of the gap left by a classical semidefinite programming relaxation for instances where the covariance matrices are of dimension up to 50 × 50.
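In the Euclidean special case, the first construction recovers a known identity: the convex hull of the cardinality-constrained unit ball is exactly the unit ball of the k-support norm. Sketched below, with $\|\cdot\|^{sp}_k$ denoting the k-support norm (notation ours, not the paper's):

```latex
% Feasible set of sparse PCA (Euclidean case) and its convex hull:
\[
  \mathcal{S}_k = \{\, x \in \mathbb{R}^n : \|x\|_2 \le 1,\ \|x\|_0 \le k \,\},
  \qquad
  \operatorname{conv}(\mathcal{S}_k) = \{\, x \in \mathbb{R}^n : \|x\|^{sp}_k \le 1 \,\}.
\]
```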


Author(s): Liman Du, Wenguo Yang, Suixiang Gao

As more people interact with their friends through social networks, word-of-mouth marketing has become an effective way to promote the sale of products. This paper focuses on the Constrained Profit Maximization in Attribute networks (CPMA) problem, an extension of the classical influence maximization problem. We formulate profit maximization in attribute networks under a cardinality constraint, which better reflects practical settings. The profit-spread metric of CPMA accounts for the total benefit and cost generated by all active nodes. Unlike in the classical influence maximization problem, influence strength must be recalculated according to the emotional tendency and classification labels of nodes in attribute networks, and the profit-spread metric is, in general, neither monotone nor submodular. Because the profit-spread metric can be expressed as the difference of two submodular functions, i.e., it admits a DS decomposition, we propose a three-phase algorithm named Marginal increment and Community-based Prune and Search (MCPS), built on the Louvain algorithm and the logistic function. Thanks to the marginal-increment method, MCPS computes profit spread more directly and accurately. Experiments demonstrate the effectiveness of the MCPS algorithm.
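As a rough illustration of the marginal-increment idea on a DS-decomposed objective, here is a hedged Python sketch of a plain greedy for a profit function p(S) = f(S) - c(S) under a cardinality budget. It shows only the marginal-gain computation; it is not the paper's three-phase MCPS algorithm, which additionally prunes candidates using community structure.

```python
def marginal_increment_greedy(nodes, f, c, k):
    """Greedy seed selection for a profit objective p(S) = f(S) - c(S)
    under a cardinality constraint |S| <= k.

    f and c are set functions (benefit and cost); their difference is
    generally neither monotone nor submodular. Illustrative sketch only,
    not the paper's three-phase MCPS algorithm.
    """
    S = set()
    for _ in range(k):
        best, best_gain = None, 0.0
        for v in nodes:
            if v in S:
                continue
            # marginal increment of profit when adding v to S
            gain = (f(S | {v}) - f(S)) - (c(S | {v}) - c(S))
            if gain > best_gain:        # accept only strictly profitable nodes
                best, best_gain = v, gain
        if best is None:                # no node improves profit; stop early
            break
        S.add(best)
    return S
```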


Author(s): Zhicheng Liu, Hong Chang, Ran Ma, Donglei Du, Xiaoyan Zhang

Abstract We consider a two-stage submodular maximization problem subject to a cardinality constraint and k matroid constraints, where the objective function is the expected difference of a nonnegative monotone submodular function and a nonnegative monotone modular function. We give two bi-factor approximation algorithms for this problem. The first is a deterministic $\left(\frac{1}{k+1}\left(1-\frac{1}{e^{k+1}}\right),\, 1\right)$-approximation algorithm, and the second is a randomized $\left(\frac{1}{k+1}\left(1-\frac{1}{e^{k+1}}\right)-\varepsilon,\, 1\right)$-approximation algorithm with improved time efficiency.
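For context, a bi-factor $(\alpha, \beta)$-approximation for an objective of the form $f - \ell$ is commonly read as below; this reading is our assumption for illustration, and the paper fixes the precise definition.

```latex
% Bi-factor guarantee, in the sense commonly used for
% (submodular - modular) objectives; S* denotes an optimal solution:
\[
  \mathbb{E}\big[\, f(S) - \ell(S) \,\big] \;\ge\; \alpha\, f(S^{*}) - \beta\, \ell(S^{*}),
  \qquad
  \alpha = \frac{1}{k+1}\left(1 - \frac{1}{e^{k+1}}\right), \quad \beta = 1.
\]
```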


2021
Author(s): Liping Pang, Menglong Xue, Na Xu

Abstract In this paper, we consider the cardinality-constrained optimization problem and propose a new sequential optimality condition for its continuous relaxation reformulation, which has recently gained popularity. The condition is stronger than existing results and remains a first-order necessary condition for the cardinality-constrained problem without any additional assumptions. We also provide a problem-tailored weaker constraint qualification, which guarantees that points satisfying the new sequential condition are Mordukhovich-type stationary points. On the other hand, we improve the theoretical results for the augmented Lagrangian algorithm: under the same assumptions as in existing results, we prove that any feasible accumulation point of the iterative sequence generated by the algorithm satisfies the new sequential optimality condition. Furthermore, the algorithm converges to a Mordukhovich-type (essentially strong) stationary point if the problem-tailored constraint qualification is satisfied.
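The abstract does not restate the relaxation it builds on; for context, the continuous relaxation reformulation commonly used in this line of work (in the style of Burdakov, Kanzow and Schwartz) replaces the cardinality constraint with complementarity constraints on auxiliary variables, as sketched below. The paper's exact formulation may differ in details.

```latex
% Cardinality-constrained problem (at most s nonzero components):
\[
  \min_{x \in \mathbb{R}^n}\ f(x)
  \quad \text{s.t.} \quad \|x\|_0 \le s,
\]
% and its continuous relaxation with auxiliary variables y:
\[
  \min_{x,\, y}\ f(x)
  \quad \text{s.t.} \quad
  e^{\top} y \ge n - s, \qquad
  x_i\, y_i = 0, \quad 0 \le y_i \le 1, \quad i = 1, \dots, n.
\]
```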


2021, Vol 12 (1)
Author(s): C. Maria Keet, Rolf Grütter

Abstract
Background: The ontology authoring step in ontology development involves making choices about what subject domain knowledge to include. This may concern sorting out ontological differences and choosing between conflicting axioms due to limitations in the logic or the subject domain semantics. Examples include reconciling different foundational ontologies in ontology alignment, and choosing between OWL 2 DL's transitive object property and a qualified cardinality constraint. Such conflicts have to be resolved somehow. However, only isolated and fragmented guidance for doing so is available, which results in ad hoc decision-making that may not be the best choice or may be forgotten later.
Results: This work takes steps towards a framework that deals with the various types of modeling conflicts through meaning negotiation and conflict resolution in a systematic way. It proposes an initial library of common conflicts, a conflict set, typical steps toward resolution, and the software availability and requirements needed for it. The approach was evaluated on an actual case of domain knowledge usage in the context of an epizootic disease outbreak, avian influenza, and with running examples using COVID-19 ontologies.
Conclusions: The evaluation demonstrated the potential and feasibility of a conflict resolution framework for ontologies.


Author(s): Victoria G. Crawford

In this paper, the monotone submodular maximization problem (SM) is studied. SM is to find a subset of size kappa from a universe of size n that maximizes a monotone submodular objective function f. We show using a novel analysis that the Pareto optimization algorithm achieves a worst-case ratio of (1 − epsilon)(1 − 1/e) in expectation for every cardinality constraint kappa < P, where P ≤ n + 1 is an input, in O(nP ln(1/epsilon)) queries of f. In addition, a novel evolutionary algorithm, the biased Pareto optimization algorithm, is proposed that achieves a worst-case ratio of (1 − epsilon)(1 − 1/e − epsilon) in expectation for every cardinality constraint kappa < P in O(n ln(P) ln(1/epsilon)) queries of f. Further, the biased Pareto optimization algorithm can be modified to achieve a worst-case ratio of (1 − epsilon)(1 − 1/e − epsilon) in expectation for a single cardinality constraint kappa in O(n ln(1/epsilon)) queries of f. An empirical evaluation corroborates our theoretical analysis: the algorithms exceed the stochastic greedy solution value at roughly the point one would expect based on our analysis.
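For concreteness, here is a minimal Python sketch of the standard (unbiased) Pareto optimization scheme, often called GSEMO, applied to this problem: solutions evolve by bit-wise mutation, and the population retains only solutions that are mutually non-dominated with respect to the bi-objective (f(S), −|S|). This is an illustrative sketch of the baseline scheme, not the biased variant proposed in the paper; the size cap on offspring is a common practical choice, assumed here.

```python
import random

def gsemo_submodular(n, f, kappa, iters):
    """Pareto optimization (GSEMO-style) for maximizing a monotone
    submodular f over subsets of {0, ..., n-1} with |S| <= kappa.

    The bi-objective is (f(S), -|S|); the population keeps only mutually
    non-dominated solutions. Illustrative sketch of the standard scheme,
    not the paper's biased variant.
    """
    def dominates(a, b):
        # a dominates b: no worse in both objectives, strictly better in one
        return (a[1] >= b[1] and a[2] <= b[2]) and (a[1] > b[1] or a[2] < b[2])

    empty = frozenset()
    pop = [(empty, f(empty), 0)]                 # (set, f(set), |set|)
    for _ in range(iters):
        S, _, _ = random.choice(pop)
        T = set(S)
        for v in range(n):                       # flip each bit w.p. 1/n
            if random.random() < 1.0 / n:
                T.symmetric_difference_update({v})
        if len(T) > kappa + 1:                   # discard oversized offspring
            continue
        cand = (frozenset(T), f(T), len(T))
        if any(dominates(p, cand) for p in pop):
            continue
        pop = [p for p in pop if not dominates(cand, p)] + [cand]
    best = max((p for p in pop if p[2] <= kappa), key=lambda p: p[1])
    return best[0]
```

With a coverage-style objective, a call such as gsemo_submodular(n, coverage_fn, kappa, iters=10 * n * kappa) (budget chosen for illustration) returns the best feasible set found.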


2021
Author(s): Ming-Yu Zhang, Yong-Jun Liu

Abstract Due to long lead times, uncertain outcomes and a lack of sufficient historical data, pharmaceutical research and development (R&D) portfolio selection is often a very complex decision problem. The aim of this paper is to investigate pharmaceutical R&D portfolio selection with unavailable and unreliable project information, where borrowed capital is allowed. Based on fuzzy set theory, we propose two pharmaceutical R&D portfolio optimization models with minimum borrowed capital, taking into account corporate strategy for developing new products, scarcity of resources, a limited investment budget and a cardinality constraint. In the two proposed models, the pharmaceutical R&D company aims to maximize terminal wealth and minimize the cumulative borrowed capital over the whole investment horizon. We then transform the two bi-objective models into corresponding single-objective models using the weighted sum approach and employ a modified artificial bee colony (ABC) algorithm to solve the transformed models. Finally, we provide a numerical example to illustrate the application of the proposed models.
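The weighted sum approach mentioned above scalarizes the two objectives in the standard way sketched below; the weight $\lambda$ and any normalization of the two terms are modeling choices, and the paper's exact formulation may differ.

```latex
% Standard weighted-sum scalarization of the bi-objective model
% (maximize terminal wealth W, minimize cumulative borrowed capital B):
\[
  \max_{x}\ \ \lambda\, W(x) \;-\; (1 - \lambda)\, B(x),
  \qquad \lambda \in [0, 1],
\]
% subject to the resource, budget and cardinality constraints of the model.
```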

