rule consistency
Recently Published Documents

Total documents: 15 (five years: 3)
H-index: 5 (five years: 0)

Author(s): Lucia Quaglia

The elemental regime on margins for derivatives not cleared through CCPs was added to the international regulatory agenda relatively late. The US was a pace-setter at the international level and a first-mover at the domestic level, promoting relatively precise, stringent and consistent margin requirements. The EU supported the US's international standard-setting efforts but adopted domestic regulation only after the international rules were set. There were no foot-draggers, even though several jurisdictions on the fringe were reluctant followers. Domestic regulators gathered in international standard-setting bodies, which facilitated the ironing out of differences among and within jurisdictions. Transgovernmental networks also fostered rule consistency, helping to manage the regime complexity resulting from several interlinked elemental regimes on derivatives. Margins were heavily contested by the financial industry, which mobilized to make them less precise and stringent. Private actors also urged regulators to consider this reform in conjunction with other post-crisis standards, notably capital requirements.


2020, Vol. 51 (1), pp. 28-39
Author(s): Wesley Kaufmann, Alex Ingrams, Daan Jacobs

A growing stream of research in public administration is concerned with how red tape and administrative burden affect citizens. Drawing on the procedural fairness literature, we argue that the consistent application of rules reduces perceived red tape. We also hypothesize that red tape perceptions are affected by outcome favorability, and that an interaction effect exists between consistency and outcome favorability. We test this reasoning with a survey experiment, set in the context of a federal jury duty summons procedure and administered to a sample of U.S. citizens through TurkPrime. The statistical results support our hypotheses: perceived red tape is lower when rules are applied consistently and when citizens receive a favorable outcome. We also find that applying a procedure consistently reduces perceived red tape further when citizens receive a favorable outcome. The implications of these findings for research and practice are discussed.


2020
Author(s): Ilker Kose

Abstract Background: Electronic claim processing (ECP) systems in healthcare insurance require comprehensive and secure management of medical information. Even though state-of-the-art ECP systems can read payment rules written in plain text, a conventional ECP system contains hundreds of rules, each with dozens of conditions, and those conditions in turn refer to thousands of medical entities and concepts. Although domain experts can manage plain-text payment rules, the length and complexity of the rules lead to low comprehensibility and to poor in-rule and inter-rule consistency. Hence, a more efficient and straightforward system is required. This study aims to make the medical data bank of a claim management system more efficient using ontology. Method: We developed an ontology-based medical information management system (ONTMIMS) in healthcare insurance to simplify payment rules. 1,312 sets of diagnoses and health services were included in the ONTMIMS. The development of the ontology comprised four stages: i) specification and conceptualization; ii) formalization; iii) implementation; and iv) evaluation. Protégé and the Apache Jena library were used to execute queries on the ontologies, and the ONTMIMS was tested on an active ECP system. Results: The experiments indicated that ONTMIMS increased comprehensibility rates for domain experts from 35.1% to 64.9%. The rate of distinguishing in-rule inconsistencies increased from 65% to 82.5%, and that of distinguishing inter-rule inconsistencies increased from 78.8% to 85%. Conclusions: Ontology, as in many other studies, proved very useful for representing and processing information. This is the first study applying ontology to ECP systems for health insurance institutions. The results demonstrate that applying ontology increased in-rule and inter-rule consistency and made rule sentences more comprehensible to domain experts.
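The gain from moving rules out of plain text is easiest to see in miniature. The sketch below is ours, not the ONTMIMS implementation, and every identifier in it is illustrative: it represents two payment rules as ontology-style triples so that an inter-rule inconsistency (one rule allowing a service that another denies for the same diagnosis) is found by a query rather than by reading prose.

```python
# Hypothetical sketch: payment rules as ontology-style triples.
# All rule, diagnosis and service names below are made up for illustration.

triples = {
    ("rule:R1", "appliesTo", "dx:J45"),         # rule R1 covers diagnosis J45
    ("rule:R1", "allowsService", "svc:spiro"),  # ...and allows spirometry
    ("rule:R2", "appliesTo", "dx:J45"),         # rule R2 covers the same diagnosis
    ("rule:R2", "deniesService", "svc:spiro"),  # ...but denies the same service
}

def services_for(diagnosis):
    """Collect (rule, relation, service) facts for all rules covering a diagnosis."""
    rules = {s for (s, p, o) in triples if p == "appliesTo" and o == diagnosis}
    return {(s, p, o) for (s, p, o) in triples
            if s in rules and p in ("allowsService", "deniesService")}

def inter_rule_conflicts(diagnosis):
    """Report services that one rule allows and another denies for the diagnosis."""
    facts = services_for(diagnosis)
    allowed = {o for (_, p, o) in facts if p == "allowsService"}
    denied = {o for (_, p, o) in facts if p == "deniesService"}
    return allowed & denied

print(inter_rule_conflicts("dx:J45"))  # → {'svc:spiro'}
```

In a real system the triples would live in an ontology store and the query would be SPARQL (e.g. via Apache Jena, as in the study), but the consistency check has the same shape.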


2011, Vol. 05 (03), pp. 271-280
Author(s): Philippe Besnard

Representing knowledge in a rule-based system takes place by means of "if…then…" statements. These are called production rules because new information is produced when a rule fires. The logic attached to rule-based systems is taken to be classical, inasmuch as "if…then…" is encoded by material implication. However, the notion of triggering an "if…then…" rule admits several different logical definitions. The paper investigates the matter, with an emphasis on consistency, because reading "if…then…" statements as rules calls for a notion of rule consistency that does not conform to consistency in the classical sense. Natural deduction is used to explore entailment and equivalence among various formulations and properties.
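The gap between material implication and rule triggering can be made concrete. The sketch below is our illustration, not Besnard's formalization: "if p then q" as material implication is vacuously true when p is false, while a production rule with an unsatisfied antecedent simply never fires and produces nothing.

```python
# Contrast: material implication (a truth function) vs. a production rule
# (a forward-chaining step that only acts when its antecedent is a known fact).

def material_implication(p, q):
    """Classical 'if p then q': true whenever p is false or q is true."""
    return (not p) or q

def fire(rule, facts):
    """Fire one production rule: add the consequent only if the
    antecedent is among the known facts; otherwise change nothing."""
    antecedent, consequent = rule
    return facts | {consequent} if antecedent in facts else facts

rule = ("p", "q")
print(material_implication(False, False))  # → True (vacuously)
print(fire(rule, set()))                   # → set()  (rule never triggered)
print(sorted(fire(rule, {"p"})))           # → ['p', 'q']  (rule fired)
```

The second and third lines show why a consistency notion built on firing behaves differently from classical consistency: an unfired rule contributes nothing that could contradict anything.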


2004, Vol. 28 (5), pp. 428-434
Author(s): Drew Nesdale, Michael Scarlett

This study examined the effect of one group-based variable (group status) and two situational variables (rule legitimacy and rule consistency) on pre-adolescent children's attitudes to bullying. Pre-adolescent boys (n = 229) read a story about a group of boys who had high or low (handball) status. The legitimacy (high versus low) of the rules governing the use of a handball court, and the extent to which the group's claim to the court was consistent with the rules (high versus low), were also manipulated. The participants' liking, causal attribution, deservingness, and punishment responses to an intergroup bullying episode, instigated by the group of boys against children from another class, indicated that the participants recognised the import of the situational variables and, at least to some extent, took them into account. Nevertheless, the results indicated that the children favoured the bully group, and that their responses systematically reflected this bias. Possible bases for understanding these effects are discussed.


Author(s): Josef Tkadlec, Ivan Bruha

This paper deals with the multiple-rule problem, which arises when several decision rules (of different classes) match ("fire" for) an unseen input object that is to be classified. The paper focuses on the formal aspects of, and theoretical methodology for, this problem. General definitions of the notions of a Designer, Learner and Classifier are presented in a formal manner, including the parameters usually attached to these concepts, such as rule consistency, completeness, quality, matching rate, etc. We thus provide minimum-requirement definitions as necessary conditions for these concepts. Any designer (decision-system builder) of a new multiple-rule system may start with these minimum requirements. We only expect that the Classifier makes its decisions according to a decision scheme induced as a knowledge base (theory, model, concept description). Two case studies are also discussed. We conclude with a general flow chart for a decision-system builder, who can follow it and select the parameters of a Learner and Classifier in line with the minimum requirements provided.
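As an illustration only (the paper treats Designer, Learner and Classifier abstractly), the sketch below shows the multiple-rule problem and one common resolution scheme: when several rules of different classes fire for the same object, let the rule of highest quality decide. The rules and quality values here are invented.

```python
# Minimal sketch of the multiple-rule problem: two rules of class "stop" and
# one of class "warning" can fire for the same object; quality breaks the tie.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    condition: Callable   # predicate over an object (a dict of attributes)
    klass: str            # class predicted when the rule fires
    quality: float        # e.g. the rule's accuracy on the training set

rules = [
    Rule(lambda o: o["colour"] == "red", "stop", 0.9),
    Rule(lambda o: o["shape"] == "octagon", "stop", 0.8),
    Rule(lambda o: o["colour"] == "red", "warning", 0.6),
]

def classify(obj, rules, default="unknown"):
    """Return the class of the highest-quality firing rule, or a default."""
    fired = [r for r in rules if r.condition(obj)]
    if not fired:
        return default        # no rule matches: fall back to a default class
    return max(fired, key=lambda r: r.quality).klass

print(classify({"colour": "red", "shape": "circle"}, rules))  # prints "stop"
```

Quality-based resolution is just one scheme; voting or combining matching rates would be alternative Classifier decision schemes under the same minimum requirements.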


2001, Vol. 1 (6), pp. 713-750
Author(s): Krzysztof R. Apt, Eric Monfroy

We study a natural situation in which constraint programming can be entirely reduced to rule-based programming. To this end we first explain how one can compute on constraint satisfaction problems using rules represented by simple first-order formulas. We then consider constraint satisfaction problems that are based on predefined, explicitly given constraints. To solve them we first derive rules from these explicitly given constraints and limit the computation process to a repeated application of these rules, combined with labeling. We consider two types of rule. The first type, which we call equality rules, leads to a new notion of local consistency, called rule consistency, that turns out to be weaker than arc consistency for constraints of arbitrary arity (called hyper-arc consistency in Marriott & Stuckey (1998)). For Boolean constraints, rule consistency coincides with closure under the well-known propagation rules for Boolean constraints. The second type, which we call membership rules, yields a rule-based characterization of arc consistency. To show the feasibility of this rule-based approach to constraint programming, we show how both types of rules can be automatically generated, as the CHR rules of Frühwirth (1995). This yields an implementation of the approach by means of constraint logic programming. We illustrate its usefulness by discussing various examples, including Boolean constraints, two typical examples of many-valued logics, constraints dealing with Waltz's language for describing polyhedral scenes, and Allen's qualitative approach to temporal logic.
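For the Boolean case, a minimal sketch (ours, and covering only a representative subset of the propagation rules) shows how repeatedly applying equality-style rules to variable domains reaches the closure that the abstract identifies with rule consistency, here for the constraint x AND y = z.

```python
# Each rule fires when one variable's domain equals a premise value and then
# prunes another variable's domain; we apply them to a fixpoint. Domains are
# subsets of {0, 1}. Only some of the full AND rule set is shown.

def propagate_and(dom):
    """dom maps 'x', 'y', 'z' to domains for the constraint x AND y = z."""
    rules = [
        ("z", {1}, "x", {1}),   # z = 1  implies  x = 1
        ("z", {1}, "y", {1}),   # z = 1  implies  y = 1
        ("x", {0}, "z", {0}),   # x = 0  implies  z = 0
        ("y", {0}, "z", {0}),   # y = 0  implies  z = 0
    ]
    changed = True
    while changed:              # iterate until no rule can prune anything
        changed = False
        for var, premise, target, restriction in rules:
            if dom[var] == premise and not dom[target] <= restriction:
                dom[target] = dom[target] & restriction
                changed = True
    return dom

dom = {"x": {0, 1}, "y": {0, 1}, "z": {1}}
print(propagate_and(dom))  # x and y are both pruned to {1}
```

In the paper such rules are generated automatically from the constraint's truth table and emitted as CHR rules; this loop only mimics their repeated application by hand.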

