Extending Dual Arc Consistency

Author(s):  
S. Nagarajan ◽  
S. D. Goodwin ◽  
A. Sattar

Many extensions to existing binary constraint satisfaction algorithms have been proposed that deal directly with non-binary constraints. An alternative is to apply a structural transformation to the representation of the problem, yielding a binary CSP in which the original non-binary constraints are replaced by binary compatibility constraints between relations. Much recent work has focused on comparing the levels of local consistency enforceable in the non-binary representation with those enforceable in the dual representation. In this paper we present extensions to the standard dual encoding that use constraint coverings to represent the given CSP compactly in an equivalent dual encoding that preserves all of the original solutions. We show that enforcing arc consistency in these constraint-covering-based encodings strictly dominates enforcing generalized arc consistency (GAC) on the primal non-binary encoding.
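
The covering-based encodings cannot be reconstructed from the abstract, but the standard dual encoding they extend is easy to sketch. Below is a minimal Python illustration with made-up variable and constraint names: each non-binary constraint becomes a dual variable whose domain is the constraint's set of satisfying tuples, and two dual variables whose scopes overlap are linked by a binary compatibility constraint requiring their tuples to agree on the shared original variables.

```python
from itertools import product

def dual_encoding(domains, constraints):
    # domains: dict variable -> iterable of values
    # constraints: list of (scope, predicate); scope is a tuple of
    # variables, predicate tests a tuple of values for that scope.
    dual_domains = []
    for scope, pred in constraints:
        tuples = [t for t in product(*(domains[v] for v in scope))
                  if pred(t)]
        dual_domains.append((scope, tuples))

    def compatible(i, j, ti, tj):
        # Tuples ti, tj (for dual variables i, j) must agree on every
        # original variable their scopes share.
        si, sj = dual_domains[i][0], dual_domains[j][0]
        return all(ti[si.index(v)] == tj[sj.index(v)]
                   for v in set(si) & set(sj))

    # A binary compatibility constraint exists wherever scopes overlap.
    edges = [(i, j) for i in range(len(constraints))
             for j in range(i + 1, len(constraints))
             if set(dual_domains[i][0]) & set(dual_domains[j][0])]
    return dual_domains, edges, compatible

# Hypothetical example: x + y = z together with y != w.
doms = {v: range(3) for v in "xyzw"}
cons = [(("x", "y", "z"), lambda t: t[0] + t[1] == t[2]),
        (("y", "w"), lambda t: t[0] != t[1])]
dual_doms, edges, compatible = dual_encoding(doms, cons)
```

Enforcing ordinary arc consistency over `compatible` on these dual domains is the baseline that the paper's covering-based encodings strengthen.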

2001 ◽  
Vol 1 (6) ◽  
pp. 713-750 ◽  
Author(s):  
KRZYSZTOF R. APT ◽  
ERIC MONFROY

We study here a natural situation in which constraint programming can be reduced entirely to rule-based programming. To this end we first explain how one can compute on constraint satisfaction problems using rules represented by simple first-order formulas. We then consider constraint satisfaction problems that are based on predefined, explicitly given constraints. To solve them we derive rules from these explicitly given constraints and limit the computation process to a repeated application of these rules, combined with labeling. We consider two types of rules here. The first type, which we call equality rules, leads to a new notion of local consistency, called rule consistency, that turns out to be weaker than arc consistency for constraints of arbitrary arity (called hyper-arc consistency in Marriott & Stuckey (1998)). For Boolean constraints, rule consistency coincides with the closure under the well-known propagation rules for Boolean constraints. The second type, which we call membership rules, yields a rule-based characterization of arc consistency. To show the feasibility of this rule-based approach to constraint programming, we show how both types of rules can be generated automatically, as CHR rules of Frühwirth (1995). This yields an implementation of the approach by means of constraint logic programming. We illustrate its usefulness by discussing various examples, including Boolean constraints, two typical examples of many-valued logics, constraints dealing with Waltz's language for describing polyhedral scenes, and Allen's qualitative approach to temporal logic.
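
As a rough illustration of the second rule type, here is a small Python sketch of a propagation loop driven by membership-style rules. The rule format (a single membership premise x ∈ S implying the removal of one value from another domain) is a simplification assumed for illustration; the paper's rules are generated as CHR rules and may carry several premises.

```python
def apply_membership_rules(domains, rules):
    # domains: dict variable -> set of values (mutated in place).
    # rules: list of (x, S, y, a); a rule fires when D(x) is a
    # non-empty subset of S, removing value a from D(y).
    changed = True
    while changed:
        changed = False
        for x, S, y, a in rules:
            if domains[x] and domains[x] <= S and a in domains[y]:
                domains[y].discard(a)
                changed = True
    return domains

# Boolean conjunction z = (x and y) expressed as membership rules.
doms = {"x": {0}, "y": {0, 1}, "z": {0, 1}}
rules = [("x", {0}, "z", 1), ("y", {0}, "z", 1),
         ("z", {1}, "x", 0), ("z", {1}, "y", 0)]
print(apply_membership_rules(doms, rules))   # D(z) shrinks to {0}
```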


2014 ◽  
Vol 23 (04) ◽  
pp. 1460017
Author(s):  
Jinsong Guo ◽  
Hongbo Li ◽  
Zhanshan Li ◽  
Yonggang Zhang ◽  
Xianghua Jia

Maintaining local consistencies can improve the efficiency of search algorithms for solving constraint satisfaction problems (CSPs). Compared with arc consistency, the most widely used local consistency, stronger local consistencies shrink the search space further but incur a higher computational cost. In this paper, we attempt a compromise between pruning ability and computational cost. We propose a new local consistency called singleton strong bound consistency (SSBC) and a light version of it, light SSBC. A search algorithm maintaining light SSBC outperforms MAC on a considerable number of problems.
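
The abstract does not define SSBC precisely, but singleton consistencies share a common scheme: tentatively assign each value, run the underlying propagation, and delete the value if propagation wipes out some domain. A generic Python sketch of that scheme follows, with the underlying (bound) consistency abstracted as a `propagate` callback; it is a sketch of the general pattern, not the paper's algorithm.

```python
import copy

def singleton_filter(domains, propagate):
    # domains: dict variable -> set of values (pruned in place).
    # propagate: enforces the underlying consistency on a copy of the
    # domains, returning False on a domain wipe-out.
    changed = True
    while changed:
        changed = False
        for x in domains:
            for a in list(domains[x]):
                trial = copy.deepcopy(domains)
                trial[x] = {a}              # tentative assignment x = a
                if not propagate(trial):    # wipe-out: a is inconsistent
                    domains[x].discard(a)
                    changed = True
    return all(domains.values())            # False if some domain emptied
```

The "light" variants of such consistencies typically restrict which values or variables undergo the singleton test, trading pruning power for speed.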


2005 ◽  
Vol 24 ◽  
pp. 641-684 ◽  
Author(s):  
N. Samaras ◽  
K. Stergiou

A non-binary Constraint Satisfaction Problem (CSP) can be solved directly using extended versions of binary techniques. Alternatively, the non-binary problem can be translated into an equivalent binary one. In this case, it is generally accepted that the translated problem can be solved by applying well-established techniques for binary CSPs. In this paper we evaluate the applicability of the latter approach. We demonstrate that the use of standard techniques for binary CSPs in the encodings of non-binary problems is problematic and results in models that are very rarely competitive with the non-binary representation. To overcome this, we propose specialized arc consistency and search algorithms for binary encodings, and we evaluate them theoretically and empirically. We consider three binary representations: the hidden variable encoding, the dual encoding, and the double encoding. Theoretical and empirical results show that, for certain classes of non-binary constraints, binary encodings are a competitive option, and in many cases a better one than the non-binary representation.
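
Of the three representations, the hidden variable encoding is the simplest to illustrate. The Python sketch below (with assumed data structures, not the paper's implementation) introduces one hidden variable per non-binary constraint, whose domain is the constraint's set of satisfying tuples, linked to each original variable in its scope through a binary projection constraint.

```python
from itertools import product

def hidden_variable_encoding(domains, constraints):
    # Each non-binary constraint k gets a hidden variable h_k whose
    # domain is the set of tuples satisfying the constraint.
    hidden_domains = {}
    links = []   # binary constraints (hidden var, original var, test)
    for k, (scope, pred) in enumerate(constraints):
        h = f"h{k}"
        hidden_domains[h] = [t for t in product(*(domains[v] for v in scope))
                             if pred(t)]
        for i, v in enumerate(scope):
            # The tuple chosen for h must project onto v's value.
            links.append((h, v, (lambda i: lambda t, a: t[i] == a)(i)))
    return hidden_domains, links

# Hypothetical example: a single ternary constraint x + y = z.
doms = {"x": range(3), "y": range(3), "z": range(3)}
hd, links = hidden_variable_encoding(
    doms, [(("x", "y", "z"), lambda t: t[0] + t[1] == t[2])])
```

The dual encoding drops the original variables and connects hidden variables to each other; the double encoding keeps both kinds of links.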


Author(s):  
Marlene Arangú ◽  
Miguel Salido

A fine-grained arc-consistency algorithm for non-normalized constraint satisfaction problems

Constraint programming is a powerful software technology for solving numerous real-life problems. Many of these problems can be modeled as Constraint Satisfaction Problems (CSPs) and solved using constraint programming techniques. However, solving a CSP is NP-complete, so filtering techniques that reduce the search space remain necessary. Arc-consistency algorithms are widely used to prune the search space. The concept of arc-consistency is bidirectional, i.e., it must be ensured in both directions of the constraint (the direct and the inverse constraint). Two of the most well-known and frequently used arc-consistency algorithms for filtering CSPs are AC3 and AC4. These algorithms repeatedly carry out revisions and require support checks to identify and delete all unsupported values from the domains. Nevertheless, many revisions are ineffective: they delete no values yet consume many checks and much time. In this paper, we present AC4-OP, an optimized version of AC4 that processes binary non-normalized constraints in only one direction, storing the inverse supports it finds for later evaluation. It thus shortens the propagation phase by avoiding unnecessary or ineffective checks. AC4-OP reduces the number of constraint checks by 50% while pruning the same search space as AC4. The evaluation section shows the improvement of AC4-OP over AC4, AC6 and AC7 on random and non-normalized instances.
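
AC4-OP itself is only described at a high level in the abstract, but the AC4 baseline it optimizes can be sketched: AC4 counts, for each value, its supports on each neighbouring variable, records inverse support lists, and propagates deletions by decrementing counters. A compact Python rendering of classic AC4 for binary constraints follows; the one-directional optimization of AC4-OP is not reproduced.

```python
from collections import defaultdict, deque

def ac4(domains, constraints):
    # domains: dict variable -> set of values (pruned in place).
    # constraints: dict (x, y) -> predicate(a, b); both orientations of
    # every constraint are assumed present, since AC4 checks each arc
    # separately (the redundancy that AC4-OP removes).
    counter = {}                       # counter[(x, a, y)] = #supports
    supports = defaultdict(list)       # supports[(y, b)] = [(x, a), ...]
    queue = deque()
    for (x, y), pred in constraints.items():
        for a in list(domains[x]):
            n = 0
            for b in domains[y]:
                if pred(a, b):
                    n += 1
                    supports[(y, b)].append((x, a))
            counter[(x, a, y)] = n
            if n == 0 and a in domains[x]:   # a has no support on y
                domains[x].discard(a)
                queue.append((x, a))
    while queue:                        # propagate deletions
        y, b = queue.popleft()
        for x, a in supports[(y, b)]:
            if a in domains[x]:
                counter[(x, a, y)] -= 1
                if counter[(x, a, y)] == 0:
                    domains[x].discard(a)
                    queue.append((x, a))
    return all(domains.values())

# x < y over {0, 1, 2}, stated in both orientations:
doms = {"x": {0, 1, 2}, "y": {0, 1, 2}}
cons = {("x", "y"): lambda a, b: a < b,
        ("y", "x"): lambda a, b: b < a}
ac4(doms, cons)                         # prunes x = 2 and y = 0
```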


2020 ◽  
Vol 23 (4) ◽  
pp. 17-32
Author(s):  
Konul Khalilova ◽  
Irina Orujova

The current article addresses the losses, gains, and survivals that translation contributes to literature. It presents a thorough study based on John Steinbeck's novel The Grapes of Wrath in English and its translation into Azerbaijani by Ulfet Kurchayli. It investigates the problematic areas and challenges that emerge from discrepancies with the source text. Furthermore, the article concentrates on the issue of cultural non-equivalence, that is, the losses that occur in translating English literary texts into Azerbaijani. The paper identifies the translation techniques adopted by the translator of John Steinbeck's The Grapes of Wrath; adopting certain techniques rather than others has led to many losses on different levels. The translator's important role as a cultural insider is also emphasized. The wide gap between the cultures, languages, and thought patterns of English and Azerbaijani speakers is the main factor behind the various losses in the process of translation. Coping with these extra-linguistic constraints is harder than coping with linguistic ones, as the translator has no choice in the given situations but to delete these elements from the TT or replace them with elements that do not fit the context. This article aims at determining translation losses and gains and at defining the ways the translator employs to compensate for losses, through an analysis of John Steinbeck's style in The Grapes of Wrath. The article concludes that there are situations in which translating a text from the SL into the TL entails an alteration of the whole informational content of the text, in the form of expressions or words.


Author(s):  
Robert Ganian ◽  
Andre Schidler ◽  
Manuel Sorge ◽  
Stefan Szeider

Treewidth and hypertree width have proven to be highly successful structural parameters in the context of the Constraint Satisfaction Problem (CSP). When either of these parameters is bounded by a constant, CSP becomes solvable in polynomial time. However, the order of the polynomial in the running time depends on the width, and this is known to be unavoidable; the problem is therefore not fixed-parameter tractable when parameterized by either width measure. Here we introduce an enhancement of treewidth and hypertree width through a novel notion of thresholds, allowing the associated decompositions to take into account information about the computational cost of solving the given CSP instance. Aside from introducing these notions, we obtain efficient theoretical as well as empirical algorithms for computing threshold treewidth and threshold hypertree width, and we show that these parameters give rise to fixed-parameter algorithms for CSP as well as other, more general problems. We complement our theoretical results with experimental evaluations of heuristics as well as exact methods based on SAT/SMT encodings.
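
Threshold decompositions are beyond a short sketch, but ordinary treewidth, the parameter they refine, can be upper-bounded with a simple elimination heuristic. The Python fragment below uses min-degree elimination on a constraint graph; it is offered only for intuition. The threshold variants would additionally weigh each decomposition bag by the cost of solving its part of the instance, which is not modeled here.

```python
def greedy_width(graph):
    # Upper-bound the treewidth of a constraint graph via the
    # min-degree elimination heuristic.
    # graph: dict mapping vertex -> set of neighbours.
    g = {v: set(ns) for v, ns in graph.items()}
    width = 0
    while g:
        v = min(g, key=lambda u: len(g[u]))   # smallest-degree vertex
        width = max(width, len(g[v]))
        for u in g[v]:                        # make its neighbourhood a clique
            g[u] |= g[v] - {u}
            g[u].discard(v)
        del g[v]
    return width

# Constraint graph of a 4-cycle: treewidth 2.
print(greedy_width({1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}))
```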


2021 ◽  
Vol 7 (1) ◽  
pp. 15
Author(s):  
Robiatul Munajah ◽  
Asep Supena

The success of students in learning does not depend only on their own abilities; several contributing factors need to be optimized. The teacher's strategy is crucial for optimizing students' multiple intelligences according to the indicators each student exhibits. Every child has a range of intelligences at different levels and with different indicators, which shows that all children are, by nature, intelligent; the difference lies in the level and indicators of intelligence. These differences are determined by various factors, one of which is the stimulation children receive during the learning process carried out by the teacher. The differences in intelligence among students demand a fair and open-minded way of thinking from educators. This research is a literature review that examines, on the basis of research references and books, the teacher's strategies for optimizing multiple intelligences in elementary schools. Good educators are able to detect children's intelligence by observing their behavior, tendencies, interests, and the manner and quality of their reactions to a given stimulus. Educators can recognize all indicators of intelligence and then construct a profile of each child's intelligence. Therefore, every teacher should know how to develop the intelligence of their students, by identifying each indicator of children's intelligence and realizing the importance of developing all the intelligences of their students.


2001 ◽  
Vol 14 ◽  
pp. 53-81 ◽  
Author(s):  
X. Chen ◽  
P. Van Beek

In recent years, many improvements to backtracking algorithms for solving constraint satisfaction problems have been proposed. The techniques for improving backtracking algorithms can be conveniently classified as look-ahead schemes and look-back schemes. Unfortunately, look-ahead and look-back schemes are not entirely orthogonal, as it has been observed empirically that enhancing look-ahead techniques is sometimes counterproductive to the effects of look-back techniques. In this paper, we focus on the relationship between the two most important look-ahead techniques (using a variable ordering heuristic and maintaining a level of local consistency during the backtracking search) and the look-back technique of conflict-directed backjumping (CBJ). We show that there exists a "perfect" dynamic variable ordering such that CBJ becomes redundant. We also show theoretically that the higher the level of local consistency maintained in the backtracking search, the less backjumping will be an improvement. Our theoretical results partially explain why a backtracking algorithm that does more in the look-ahead phase benefits less from the backjumping look-back scheme. Finally, we show empirically that adding CBJ to a backtracking algorithm that maintains generalized arc consistency (GAC), an algorithm we refer to as GAC-CBJ, can still provide orders-of-magnitude speedups. Our empirical results contrast with Bessiere and Regin's conclusion (1996) that CBJ is useless to an algorithm that maintains arc consistency.
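
For readers unfamiliar with CBJ, the following Python sketch shows Prosser-style conflict-directed backjumping for binary CSPs: each level records the earlier levels whose assignments caused its consistency checks to fail, and a dead end jumps straight back to the deepest such level, merging culprit sets. This is plain CBJ under assumed data structures, not the GAC-CBJ hybrid evaluated in the paper; the GAC maintenance is omitted.

```python
def cbj(domains, constraints, order):
    # domains: dict variable -> iterable of values
    # constraints: dict (x, y) -> predicate(a, b), where x precedes y
    # in `order`. Returns a solution dict or None.
    n = len(order)
    assign = {}
    conf = [set() for _ in range(n)]       # conflicting levels per level

    def consistent(i, a):
        for h in range(i):
            pred = constraints.get((order[h], order[i]))
            if pred and not pred(assign[order[h]], a):
                conf[i].add(h)             # remember the culprit level
                return False
        return True

    def label(i):
        # Returns n on success, otherwise the level to jump back to.
        # (Conflict sets of jumped-over levels are kept conservatively;
        # that only weakens later jumps, never correctness.)
        if i == n:
            return n
        for a in domains[order[i]]:
            if consistent(i, a):
                assign[order[i]] = a
                jump = label(i + 1)
                if jump == n:
                    return n               # solution found below
                del assign[order[i]]
                if jump < i:               # this level is being jumped over
                    return jump
        if not conf[i]:                    # no culprits: unsolvable
            return -1
        h = max(conf[i])                   # deepest conflicting level
        conf[h] |= conf[i] - {h}           # carry the culprits back to h
        conf[i].clear()
        return h

    return dict(assign) if label(0) == n else None

# 3-colouring of a triangle with a pendant vertex:
ne = lambda a, b: a != b
doms = {v: [0, 1, 2] for v in "abcd"}
cons = {("a", "b"): ne, ("a", "c"): ne, ("b", "c"): ne, ("c", "d"): ne}
print(cbj(doms, cons, ["a", "b", "c", "d"]))   # e.g. a=0, b=1, c=2, d=0
```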

