Approximate Hypergraph Vertex Cover and generalized Tuza's conjecture

2022 ◽  
pp. 927-944
Author(s):  
Venkatesan Guruswami ◽  
Sai Sandeep

Author(s):  
Eiji MIYANO ◽  
Toshiki SAITOH ◽  
Ryuhei UEHARA ◽  
Tsuyoshi YAGITA ◽  
Tom C. van der ZANDEN

2020 ◽  
Vol 287 ◽  
pp. 77-84
Author(s):  
Pengcheng Liu ◽  
Zhao Zhang ◽  
Xianyue Li ◽  
Weili Wu

Symmetry ◽  
2021 ◽  
Vol 13 (6) ◽  
pp. 1036
Author(s):  
Abel Cabrera Martínez ◽  
Alejandro Estrada-Moreno ◽  
Juan Alberto Rodríguez-Velázquez

This paper is devoted to the study of the quasi-total strong differential of a graph, and it is a contribution to the Special Issue “Theoretical computer science and discrete mathematics” of Symmetry. Given a vertex $x \in V(G)$ of a graph $G$, the neighbourhood of $x$ is denoted by $N(x)$. The neighbourhood of a set $X \subseteq V(G)$ is defined to be $N(X) = \bigcup_{x \in X} N(x)$, while the external neighbourhood of $X$ is defined to be $N_e(X) = N(X) \setminus X$. Now, for every set $X \subseteq V(G)$ and every vertex $x \in X$, the external private neighbourhood of $x$ with respect to $X$ is defined as the set $P_e(x, X) = \{\, y \in V(G) \setminus X : N(y) \cap X = \{x\} \,\}$. Let $X_w = \{\, x \in X : P_e(x, X) \neq \varnothing \,\}$. The strong differential of $X$ is defined to be $\partial_s(X) = |N_e(X)| - |X_w|$, while the quasi-total strong differential of $G$ is defined to be $\partial_s^*(G) = \max\{\, \partial_s(X) : X \subseteq V(G) \text{ and } X_w \subseteq N(X) \,\}$. We show that the quasi-total strong differential is closely related to several graph parameters, including the domination number, the total domination number, the 2-domination number, the vertex cover number, the semitotal domination number, the strong differential, and the quasi-total Italian domination number. As a consequence of the study, we show that the problem of finding the quasi-total strong differential of a graph is NP-hard.
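As a quick illustration of these definitions (an example constructed here, not taken from the paper), consider the star $K_{1,3}$ with centre $c$ and leaves $v_1, v_2, v_3$, and take $X = \{c, v_1\}$:

\begin{align*}
N(X) &= \{c, v_1, v_2, v_3\}, \qquad N_e(X) = N(X) \setminus X = \{v_2, v_3\},\\
P_e(c, X) &= \{v_2, v_3\}, \qquad P_e(v_1, X) = \varnothing, \qquad X_w = \{c\},\\
\partial_s(X) &= |N_e(X)| - |X_w| = 2 - 1 = 1.
\end{align*}

Since $c \in N(v_1) \subseteq N(X)$, the condition $X_w \subseteq N(X)$ holds, so this $X$ is admissible in the maximum defining $\partial_s^*(G)$; by contrast, $X = \{c\}$ yields the larger value $\partial_s(X) = 3 - 1 = 2$ but violates $X_w \subseteq N(X) = \{v_1, v_2, v_3\}$, so it is excluded from the quasi-total maximum.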


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Yaoxin Li ◽  
Jing Liu ◽  
Guozheng Lin ◽  
Yueyuan Hou ◽  
Muyun Mou ◽  
...  

In computer science there exist a large number of optimization problems defined on graphs, in which the goal is to find a best node-state configuration or a network structure such that a designed objective function is optimized under given constraints. These problems are notoriously hard to solve, since most of them are NP-hard or NP-complete. Although traditional general-purpose methods such as simulated annealing (SA) and genetic algorithms (GA) have been applied to these hard problems, their accuracy and running time are often unsatisfactory in practice. In this work we propose a simple, fast, and general algorithmic framework based on the automatic differentiation machinery provided by deep learning frameworks. By introducing the Gumbel-softmax technique, we can optimize the objective function directly by gradient descent despite the discrete nature of the variables. We also introduce an evolution-strategy-based parallel version of our algorithm. We test the algorithm on representative optimization problems on graphs, including modularity optimization from network science, the Sherrington–Kirkpatrick (SK) model from statistical physics, the maximum independent set (MIS) and minimum vertex cover (MVC) problems from combinatorial optimization, and the influence maximization problem from computational social science. High-quality solutions can be obtained with far less computation time than with the traditional approaches.
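The published article describes the framework in full; the snippet below is only a minimal sketch of the core idea, written here in PyTorch for a toy minimum vertex cover instance. The graph, the penalty weight, and the optimization schedule are illustrative assumptions, not the authors' settings or released code.

# Minimal sketch: Gumbel-softmax relaxation for minimum vertex cover.
# Illustrative only; the toy graph, penalty weight, and hyperparameters are assumptions.
import torch
import torch.nn.functional as F

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # toy graph on 4 nodes
n = 4
logits = torch.zeros(n, 2, requires_grad=True)      # per-node logits: [not in cover, in cover]
opt = torch.optim.Adam([logits], lr=0.1)
penalty = 5.0                                       # weight of the "edge uncovered" penalty

for step in range(500):
    # Differentiable (soft) one-hot samples; column 1 acts as the probability the node is in the cover.
    x = F.gumbel_softmax(logits, tau=1.0, hard=False)[:, 1]
    cover_size = x.sum()
    # An edge (u, v) is uncovered when neither endpoint is chosen.
    uncovered = torch.stack([(1 - x[u]) * (1 - x[v]) for u, v in edges]).sum()
    loss = cover_size + penalty * uncovered
    opt.zero_grad()
    loss.backward()
    opt.step()

# Round the relaxed solution to a discrete cover and report it.
cover = (logits[:, 1] > logits[:, 0]).nonzero(as_tuple=True)[0].tolist()
print("vertex cover:", cover)

The rounding step at the end is the simplest possible decoding of the relaxed variables; the framework described in the abstract also supports an evolution-strategy-driven parallel version, which this sketch omits.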

