Minimal Transitive Factorizations of Permutations into Cycles

2009 ◽  
Vol 61 (5) ◽  
pp. 1092-1117 ◽  
Author(s):  
John Irving

Abstract. We introduce a new approach to an enumerative problem closely linked with the geometry of branched coverings, that is, we study the number $H_{\alpha}(i_2, i_3, \dots)$ of ways a given permutation (with cycles described by the partition $\alpha$) can be decomposed into a product of exactly $i_2$ 2-cycles, $i_3$ 3-cycles, etc., with certain minimality and transitivity conditions imposed on the factors. The method is to encode such factorizations as planar maps with certain descent structure and apply a new combinatorial decomposition to make their enumeration more manageable. We apply our technique to determine $H_{\alpha}(i_2, i_3, \dots)$ when $\alpha$ has one or two parts, extending earlier work of Goulden and Jackson. We also show how these methods are readily modified to count inequivalent factorizations, where equivalence is defined by permitting commutations of adjacent disjoint factors. Our technique permits us to generalize recent work of Goulden, Jackson, and Latour, while allowing for a considerable simplification of their analysis.
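For very small cases the counts $H_{\alpha}(i_2, i_3, \dots)$ can be checked by brute force. The sketch below is ours, not the paper's map-based method: it enumerates ordered factorizations of a fixed permutation into transpositions only, testing the transitivity condition with a union-find over the factors. For a full n-cycle the minimal length is n - 1 and the classical count (Dénes) is n^(n-2).

```python
from itertools import combinations, product

def compose(p, q):
    """Compose permutation tuples: apply q first, then p."""
    return tuple(p[q[i]] for i in range(len(p)))

def transpositions(n):
    """All transpositions of {0, ..., n-1} as permutation tuples."""
    ts = []
    for a, b in combinations(range(n), 2):
        t = list(range(n))
        t[a], t[b] = t[b], t[a]
        ts.append(tuple(t))
    return ts

def is_transitive(factors, n):
    """Do the factors generate a group acting transitively on {0, ..., n-1}?"""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for f in factors:
        for i in range(n):
            if f[i] != i:
                parent[find(i)] = find(f[i])
    return len({find(i) for i in range(n)}) == 1

def count_minimal_transitive(target, r):
    """Count ordered r-tuples of transpositions multiplying to `target`
    and generating a transitive group."""
    n = len(target)
    count = 0
    for seq in product(transpositions(n), repeat=r):
        prod = tuple(range(n))
        for t in seq:
            prod = compose(t, prod)
        if prod == target and is_transitive(seq, n):
            count += 1
    return count

# The 4-cycle sending 0 -> 1 -> 2 -> 3 -> 0; minimal length is n - 1 = 3.
four_cycle = (1, 2, 3, 0)
print(count_minimal_transitive(four_cycle, 3))  # expected 16 = 4^(4-2)
```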

1991 ◽  
Vol 110 (3) ◽  
pp. 545-558 ◽  
Author(s):  
J. D. Biggins ◽  
N. H. Bingham

The occurrence of certain ‘near-constancy phenomena’ in some aspects of the theory of (simple) branching processes forms the background for the work below. The problem arises out of work by Karlin and McGregor [8, 9]. A detailed study of the theoretical and numerical aspects of the Karlin–McGregor near-constancy phenomenon was given by Dubuc [7], and considered further by Bingham [4]. We give a new approach which simplifies and generalizes the results of these authors. The primary motivation for doing this was the recent work of Barlow and Perkins [3], who observed near-constancy in a framework not immediately covered by the results then known.


10.37236/3386 ◽  
2015 ◽  
Vol 22 (2) ◽  
Author(s):  
Marie Albenque ◽  
Dominique Poulalhon

This article presents a unified bijective scheme between planar maps and blossoming trees, where a blossoming tree is defined as a spanning tree of the map decorated with some dangling half-edges that make it possible to reconstruct its faces. Our method generalizes a previous construction of Bernardi by loosening its conditions of application so as to include annular maps, that is, maps embedded in the plane with a root face different from the outer face. The bijective construction presented here relies heavily on the theory of $\alpha$-orientations introduced by Felsner, and in particular on the existence of minimal and accessible orientations. Since most families of maps can be characterized by such orientations, our generic bijective method is proved to capture as special cases many previously known bijections involving blossoming trees: for example Eulerian maps, $m$-Eulerian maps, non-separable maps, and simple triangulations and quadrangulations of a $k$-gon. Moreover, it also permits us to obtain new bijective constructions for bipolar orientations and $d$-angulations of girth $d$ of a $k$-gon. As for applications, each specialization of the construction translates into enumerative by-products, either via a closed formula or via a recursive computational scheme. Besides, for every family of maps described in the paper, the construction can be implemented in linear time; it thus yields an effective way to encode or sample planar maps. In recent work, Bernardi and Fusy introduced another unified bijective scheme; we adopt here a different strategy which allows us to capture different bijections. These two approaches should be seen as two complementary ways of unifying bijections between planar maps and decorated trees.
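For readers unfamiliar with Felsner's $\alpha$-orientations, which the construction relies on, the minimal sketch below (toy graph, function names, and data are ours) only checks the defining condition, namely that every vertex has out-degree exactly $\alpha(v)$; it does not touch the minimal/accessible orientations or the bijection itself.

```python
def is_alpha_orientation(vertices, arcs, alpha):
    """arcs is a set of directed edges (u, v); check out-degrees against alpha."""
    outdeg = {v: 0 for v in vertices}
    for u, _ in arcs:
        outdeg[u] += 1
    return all(outdeg[v] == alpha[v] for v in vertices)

# A 4-cycle a-b-c-d oriented as a directed cycle: every out-degree equals 1.
vertices = ["a", "b", "c", "d"]
arcs = {("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")}
alpha = {"a": 1, "b": 1, "c": 1, "d": 1}
print(is_alpha_orientation(vertices, arcs, alpha))  # True
```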


2018 ◽  
Author(s):  
Jasmijn A. Baaijens ◽  
Alexander Schönhuth

Abstract. Haplotype-aware genome assembly plays an important role in genetics, medicine, and various other disciplines, yet generation of haplotype-resolved de novo assemblies remains a major challenge. Beyond distinguishing between errors and true sequential variants, one needs to assign the true variants to the different genome copies. Recent work has pointed out that the enormous quantities of traditional NGS read data have been greatly underexploited in terms of haplotig computation so far, which reflects that methodology for reference-independent haplotig computation has not yet reached maturity. We present POLYTE (POLYploid genome fitTEr) as a new approach to de novo generation of haplotigs for diploid and polyploid genomes. Our method follows an iterative scheme where in each iteration reads or contigs are joined, based on their interplay in terms of an underlying haplotype-aware overlap graph. Along the iterations, contigs grow while preserving their haplotype identity. Benchmarking experiments on both real and simulated data demonstrate that POLYTE establishes new standards in terms of error-free reconstruction of haplotype-specific sequence. As a consequence, POLYTE outperforms state-of-the-art approaches in various relevant aspects, where advantages become particularly distinct in polyploid settings. POLYTE is freely available as part of the HaploConduct package at https://github.com/HaploConduct/HaploConduct, implemented in Python and C++.
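As a rough, hedged illustration of overlap-driven iterative contig growth in the spirit of the description above, and emphatically not POLYTE itself (no haplotype-aware scoring, no error handling), the toy sketch below repeatedly merges the pair of sequences with the longest exact suffix-prefix overlap; all names and the example reads are invented.

```python
def overlap(a, b, min_len=3):
    """Length of the longest suffix of `a` that is a prefix of `b`."""
    for k in range(min(len(a), len(b)), min_len - 1, -1):
        if a[-k:] == b[:k]:
            return k
    return 0

def assemble(reads, min_len=3):
    """Greedily merge the best-overlapping pair until no usable overlap remains."""
    contigs = list(reads)
    while len(contigs) > 1:
        k, i, j = max(((overlap(a, b, min_len), i, j)
                       for i, a in enumerate(contigs)
                       for j, b in enumerate(contigs) if i != j),
                      key=lambda t: t[0])
        if k == 0:
            break                                   # nothing left to join
        merged = contigs[i] + contigs[j][k:]
        contigs = [c for idx, c in enumerate(contigs) if idx not in (i, j)]
        contigs.append(merged)
    return contigs

reads = ["ACGTAC", "GTACGG", "ACGGTT"]
print(assemble(reads))   # ['ACGTACGGTT']
```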


Author(s):  
Olivier Bonami ◽  
Berthold Crysmann

In recent work, Crysmann and Bonami (2012) propose to reconcile the insights of inferential-realisational morphology (Anderson, 1992; Stump, 2001; Brown and Hippisley, 2012) with the full typology of variable morphotactics: situations where the expression of analogous feature sets can appear in various positions in the string. They account for these facts by importing into HPSG a variant of Paradigm Function Morphology (Stump, 2001) in which realisation rules are doubly indexed for linear position and paradigmatic opposition. In this paper we first present further empirical challenges for theories of morphotactics that neither PFM nor the reformist approach of Crysmann and Bonami (2012) can accommodate. We then argue for a reappraisal of methods for morph introduction, and propose a new approach that replaces stipulation of classes of paradigmatic opposition with a general distinction between expression and conditioning (Carstairs, 1987; Noyer, 1992), which greatly expands the scope of Pāṇini’s Principle.
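As a loose, hedged illustration of realisation rules doubly indexed for linear position and the features they express, with Pāṇini's Principle read as "the most specific matching rule wins at each position", the toy sketch below uses an invented English-like rule inventory; it is not Crysmann and Bonami's formalism, nor an HPSG implementation.

```python
# (position, features expressed, exponent) -- toy data, invented for illustration.
RULES = [
    (1, {"tense": "past"},                               "ed"),
    (2, {"person": 3, "number": "sg"},                   "s"),
    (2, {"person": 3, "number": "sg", "tense": "past"},  ""),   # past blocks -s
]

def realise(stem, featset):
    """Per position, apply the most specific rule whose features are contained
    in `featset` (a crude stand-in for Panini's Principle)."""
    form = stem
    for pos in sorted({p for p, _, _ in RULES}):
        matches = [(feats, exp) for p, feats, exp in RULES
                   if p == pos and set(feats.items()) <= set(featset.items())]
        if matches:
            _, exp = max(matches, key=lambda m: len(m[0]))
            form += exp
    return form

print(realise("walk", {"tense": "past"}))                               # walked
print(realise("walk", {"person": 3, "number": "sg"}))                   # walks
print(realise("walk", {"person": 3, "number": "sg", "tense": "past"}))  # walked
```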


2020 ◽  
pp. 173-186
Author(s):  
Bob Hale

In recent work, Kit Fine proposes a new approach to the philosophy of mathematics, which he calls procedural postulationism: the postulates from which a mathematical theory is derived are imperatival, rather than indicative, in character. According to procedural postulationism, what is postulated in mathematics are not propositions true in a given mathematical domain, but rather procedures for the construction of that domain. Fine claims some very significant advantages for procedural postulationism over other approaches. This chapter raises some questions for the view and its promised advantages. One crucial set of questions concerns how exactly the commands of procedural postulationism are to be understood. And in particular, how literally are we to take talk of construction?


1988 ◽  
Vol 12 (2) ◽  
pp. 337-361
Author(s):  
Carl Vetters

The paper proposes a new approach to temporal clauses and temporal adverbs, which are usually said to "localize" the main verb. In recent work by Vincenzo Lo Cascio and others, temporal clauses and adverbs are treated completely differently and independently. We want to show that they should be handled together. We therefore start from a different conception of temporal localization, based on non-linguistic localization as used, for example, in geography. This allows us to show similarities between temporal clauses and adverbs. Our conclusion is that the localizer is always the time interval situated in the background, although it is not necessarily expressed by the temporal clause or adverb; on the contrary, a temporal clause or adverb can also be localized by the verb of the main clause.


2020 ◽  
Vol 34 (06) ◽  
pp. 9900-9907 ◽  
Author(s):  
Michael Katz ◽  
Shirin Sohrabi ◽  
Octavian Udrea

The need for finding a set of plans rather than a single plan has been motivated by a variety of planning applications. The problem is studied in the context of both diverse and top-k planning: while diverse planning focuses on the difference between pairs of plans, the focus of top-k planning is on the quality of each individual plan. Recent work in diverse planning additionally introduced restrictions on solution quality. Naturally, there are application domains where diversity plays the major role and domains where quality is the predominant feature. In both cases, however, the number of plans produced is often an artificial constraint, and therefore the actual number has little meaning. Inspired by the recent work in diverse planning, we propose a new family of computational problems called top-quality planning, where solution validity is defined through a bound on plan quality rather than an arbitrary number of plans. Switching to bounding plan quality allows us to implicitly represent sets of plans. In particular, it makes it possible to represent sets of plans that correspond to valid plan reorderings with a single plan. We formally define the unordered top-quality planning computational problem and present the first planner for that problem. We empirically demonstrate the superior performance of our approach compared to a top-k planner-based baseline, ranging from a 41% increase in coverage for finding all optimal plans to a 69% increase in coverage for finding all plans of quality up to 120% of the optimal plan cost. Finally, complementing the new approach with a complete procedure for generating all valid reorderings of a given plan, we derive a top-quality planner. We show this planner to be competitive with a top-k planner-based baseline.
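The shift from a fixed number of plans to a quality bound can be made concrete on a toy problem. The sketch below uses an invented state graph and a plain bounded depth-first search, not the authors' planner: it returns every plan whose cost is at most q times the optimum.

```python
import heapq

GRAPH = {                      # state -> [(action, next_state, cost)]; toy data
    "s0": [("a", "s1", 1), ("b", "s2", 2)],
    "s1": [("c", "g", 2)],
    "s2": [("d", "g", 2)],
    "g":  [],
}

def optimal_cost(graph, init, goal):
    """Plain Dijkstra over the state graph (positive action costs assumed)."""
    dist = {init: 0}
    heap = [(0, init)]
    while heap:
        d, s = heapq.heappop(heap)
        if s == goal:
            return d
        if d > dist.get(s, float("inf")):
            continue
        for _, t, c in graph[s]:
            if d + c < dist.get(t, float("inf")):
                dist[t] = d + c
                heapq.heappush(heap, (dist[t], t))
    return float("inf")

def top_quality_plans(graph, init, goal, q):
    """All plans with cost <= q * optimal, found by cost-bounded DFS."""
    bound = q * optimal_cost(graph, init, goal)
    plans = []

    def dfs(state, plan, cost):
        if cost > bound:
            return
        if state == goal:
            plans.append((plan, cost))
            return
        for action, nxt, c in graph[state]:
            dfs(nxt, plan + [action], cost + c)

    dfs(init, [], 0)
    return plans

print(top_quality_plans(GRAPH, "s0", "g", 1.0))  # only the optimal plan, cost 3
print(top_quality_plans(GRAPH, "s0", "g", 1.5))  # both plans, costs 3 and 4
```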


Author(s):  
J. Xia ◽  
Q. J. Ge

Abstract. This paper extends the recent work of Xia and Ge (1999) to develop methods for the exact analysis of the swept surface of a cylindrical surface undergoing two-parameter rational Bézier motions. Instead of analyzing the point trajectory of an object motion, as is common in swept volume analysis, this paper develops a new approach by studying the plane trajectory of a rational motion. It seeks to bring together recent work in swept volume analysis, the plane representation of developable surfaces, and computer-aided synthesis of freeform rational motions. The results have applications in the design and approximation of freeform surfaces as well as tool path planning for 5-axis machining of freeform surfaces.
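As a hedged, rotational-only toy (not the paper's treatment of cylinders under full rigid two-parameter motions), the sketch below blends a 2x2 net of control quaternions with Bernstein polynomials in (u, v). Because the blend is left unnormalised, the induced rotation has entries that are rational in (u, v), and one can sample either the trajectory of a point or the trajectory of the normal of a plane through the origin. All control data and names are invented.

```python
import numpy as np
from math import comb, cos, sin

def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([pw*qw - px*qx - py*qy - pz*qz,
                     pw*qx + px*qw + py*qz - pz*qy,
                     pw*qy - px*qz + py*qw + pz*qx,
                     pw*qz + px*qy - py*qx + pz*qw])

def bernstein(n, i, t):
    return comb(n, i) * t**i * (1 - t)**(n - i)

def blend(ctrl, u, v):
    """Tensor-product Bezier combination of control quaternions, unnormalised,
    so the resulting rotation is a rational function of (u, v)."""
    m, n = len(ctrl) - 1, len(ctrl[0]) - 1
    q = np.zeros(4)
    for i in range(m + 1):
        for j in range(n + 1):
            q += bernstein(m, i, u) * bernstein(n, j, v) * np.asarray(ctrl[i][j], float)
    return q

def rotate(q, x):
    """Rotate x by the (not necessarily unit) quaternion q: q x q* / |q|^2."""
    y = qmul(qmul(q, np.array([0.0, *x])), np.array([q[0], -q[1], -q[2], -q[3]]))
    return y[1:] / np.dot(q, q)

# Invented 2x2 net of control rotations: identity, then rotations about z, x, y.
ctrl = [
    [(1, 0, 0, 0),              (cos(0.5), 0, 0, sin(0.5))],
    [(cos(0.4), sin(0.4), 0, 0), (cos(0.6), 0, sin(0.6), 0)],
]

q = blend(ctrl, 0.3, 0.7)
point = np.array([1.0, 0.0, 0.0])
normal = np.array([0.0, 0.0, 1.0])          # plane z = 0 through the origin
print(rotate(q, point))    # one sample of the point trajectory
print(rotate(q, normal))   # one sample of the plane-normal trajectory
```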


2019 ◽  
Vol 7 (2) ◽  
pp. 196-215 ◽  
Author(s):  
Joel G. Thomas ◽  
Paul B. Sharp

Efforts to understand the causes of psychopathology have remained stifled in part because current practices do not clearly describe how psychological constructs differ from biological phenomena and how to integrate them in unified explanations. The present article extends recent work in philosophy of science by proposing a framework called mechanistic science as a promising way forward. This approach maintains that integrating psychological and biological phenomena involves demonstrating how psychological functions are implemented in biological structures. Successful early attempts to advance mechanistic explanations of psychological phenomena are reviewed, and lessons are derived to show how the framework can be applied to a range of clinical psychological phenomena, including gene-by-environment findings, computational models of reward processing in schizophrenia, and self-related processes in personality pathology. Pursuing a mechanistic approach can ultimately facilitate more productive and successful collaborations across a range of disciplines.


Author(s):  
Wayne Glausser

This chapter focuses on an old and a new approach to analyzing the seven deadly sins: Thomas Aquinas’s medieval theology, and contemporary cognitive science. As different as the two perspectives might seem, they share a common indebtedness to Aristotle, and they entangle more compellingly than one might have expected. Certain themes occupy and key problems vex both theology and science. The chapter first sets out the Aristotle connection, then engages in comparative analyses of Aquinas and science for each of the deadly sins. Although there is no single, definitive text for contemporary cognitive science that has the authority of the Summa Theologica, a recent issue of Scientific American provides a helpful window onto recent work by psychologists and neuroscientists that relates to the deadly sins.

