Varieties of Noisy Harmonic Grammar

Author(s):  
Bruce Hayes

Noisy Harmonic Grammar (NHG) is a framework for stochastic grammars that uses the GEN-cum-EVAL system originated in Optimality Theory. As a form of Harmonic Grammar, NHG outputs as winner the candidate with the smallest harmonic penalty (weighted sum of constraint violations). It is stochastic because at each "evaluation time," constraint weights are nudged upward or downward by a random amount, resulting in a particular probability distribution over candidates. This “classical” form of NHG can be modified in various ways, creating alternative theories. I explore these variants in a variety of simple simulations intended to reveal key differences in their behavior; maxent grammars are also included in the comparison. In conclusion I offer hints from the empirical world regarding which of these rival theories might be correct.
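The evaluation step the abstract describes can be sketched in a few lines of Python. This is an illustrative sketch, not Hayes' implementation: the toy grammar, candidate set, and the choice of Gaussian noise are all assumptions (the abstract says only that weights are nudged by a random amount).

```python
import random

def nhg_winner(candidates, weights, noise_sd=1.0):
    """One NHG evaluation (a sketch): each constraint weight is perturbed
    by zero-mean Gaussian noise, then the candidate with the smallest
    harmonic penalty (weighted sum of violations) wins."""
    noisy = [w + random.gauss(0.0, noise_sd) for w in weights]
    penalty = lambda viols: sum(w * v for w, v in zip(noisy, viols))
    return min(candidates, key=lambda c: penalty(candidates[c]))

# Hypothetical toy grammar: two constraints, two candidates.
candidates = {"pat": [0, 1], "pad": [1, 0]}  # violation vectors
weights = [3.0, 2.0]                         # constraint weights
rate = sum(nhg_winner(candidates, weights) == "pat" for _ in range(10000)) / 10000
print(rate)  # "pat" has the lower expected penalty, so it wins most of the time
```

Repeating the perturbed evaluation many times, as above, yields the probability distribution over candidates that makes the grammar stochastic.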

Phonology, 2013, Vol 30 (1), pp. 27-71
Author(s):  
Gaja Jarosz

This paper explores the relative merits of constraint ranking vs. weighting in the context of a major outstanding learnability problem in phonology: learning in the face of hidden structure. Specifically, the paper examines a well-known approach to the structural ambiguity problem, Robust Interpretive Parsing (RIP; Tesar & Smolensky 1998), focusing on its stochastic extension first described by Boersma (2003). Two related problems with the stochastic formulation of RIP are revealed, rooted in a failure to take full advantage of probabilistic information available in the learner's grammar. To address these problems, two novel parsing strategies are introduced and applied to learning algorithms for both probabilistic ranking and weighting. The novel parsing strategies yield significant improvements in performance, asymmetrically improving the performance of OT learners. Once RIP is replaced with the proposed modifications, the apparent advantage of HG over OT learners reported in previous work (Boersma & Pater 2008) disappears.


Phonology, 2016, Vol 33 (3), pp. 493-532
Author(s):  
Giorgio Magri

OT error-driven learning admits guarantees of efficiency, stochastic tolerance and noise robustness which hold independently of any substantive assumptions on the constraints. This paper shows that the HG learner used in the current literature does not admit such constraint-independent guarantees. The HG theory of error-driven learning thus needs to be substantially restricted to specific constraint sets.


Phonology, 2015, Vol 32 (3), pp. 353-383
Author(s):  
Robert Daland

A phonotactic grammar assigns a well-formedness score to all possible surface forms. This paper considers whether phonotactic grammars should be probabilistic, and gives several arguments that they need to be. Hayes & Wilson (2008) demonstrate the promise of a maximum entropy Harmonic Grammar as a probabilistic phonotactic grammar. This paper points out a theoretical issue with maxent phonotactic grammars: they are not guaranteed to assign a well-defined probability distribution, because sequences that contain arbitrary repetitions of unmarked sequences may be underpenalised. The paper motivates a solution to this issue: include a *Struct constraint. A mathematical proof of necessary and sufficient conditions to avoid the underpenalisation problem is given in online supplementary materials.
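The underpenalisation problem is easy to see numerically. A maxent grammar scores each form as exp(−H), where H is the weighted violation sum; if repeating an unmarked unit adds no penalty, the total score summed over all lengths diverges, so no probability distribution exists, while a positive *Struct-style weight per unit makes the sum a convergent geometric series. A sketch with illustrative numbers (not the paper's proof):

```python
import math

def total_maxent_score(penalty_per_unit, n_units_max=10000):
    """Sum maxent scores exp(-H) over forms built by repeating one
    unmarked unit n times, with H(n) = n * penalty_per_unit.
    With zero per-unit penalty every length scores 1 and the sum grows
    without bound; a positive weight makes the series converge."""
    return sum(math.exp(-n * penalty_per_unit) for n in range(1, n_units_max + 1))

print(total_maxent_score(0.0))  # equals n_units_max: diverges as it grows
print(total_maxent_score(1.0))  # converges toward 1 / (e - 1)
```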


2021, pp. 1-22
Author(s):  
Jennifer L. Smith

In a phonological saltation alternation, a segment or class “skips” a relatively similar category to surface as something less similar, as when /ɡ/ alternates with [x], skipping [k]. White (2013) and Hayes and White (2015) argue that saltation is unnatural—difficult to learn in the laboratory and diachronically unstable. They propose that the phonological grammar includes a learning bias against such unnatural patterns. White and Hayes further demonstrate that Harmonic Grammar (HG; Legendre, Miyata, and Smolensky 1990) cannot model typical saltation without nondefault mechanisms that would require extra steps in acquisition, making HG consistent with their proposed learning bias. I identify deletion saltation as a distinct saltation subtype and show that HG, with faithfulness formalized in standard Correspondence Theory (CT; McCarthy and Prince 1995), can model this pattern. HG/CT thus predicts that deletion saltation, unlike typical (here called segment-scale) saltation, is natural. Other frameworks fail to distinguish the two saltation types—they can either model both types, or neither. Consequently, if future empirical work finds deletion saltation to be more natural than other saltation patterns, this would support weighted-constraint models such as HG over ranked-constraint models such as Optimality Theory (OT; Prince and Smolensky 1993, 2004); would support CT over the *MAP model of faithfulness (Zuraw 2013); and would support formalizing CT featural-faithfulness constraints in terms of IDENT constraints, binary features, or both.


2011, Vol 90-93, pp. 1200-1204
Author(s):  
Xiang Hai Zhou
Juan Du

To investigate how vehicle loads on highways have developed in recent years, vehicle-load data were collected and statistically analysed in an overload non-controlling area. The results show that the vehicle loads follow a multiple-peaked distribution. The vehicle-load probability distribution in the overload non-controlling area was modelled as a weighted sum of four normal distributions. To describe the load limit of heavy-duty vehicles accurately for bridge safety, the distribution was also fitted with a piecewise function, and the maximum vehicle loads over different periods were modelled with a type I extreme value distribution.
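The multiple-peaked model the abstract describes is a finite mixture of normal distributions. A minimal sketch of such a density; the four component weights, means, and standard deviations below are hypothetical placeholders, not the paper's fitted values:

```python
import math

def mixture_pdf(x, components):
    """Density of a weighted sum (mixture) of normal distributions.
    `components` is a list of (weight, mean, sd) triples; the weights
    should sum to 1 so the mixture integrates to 1."""
    return sum(w * math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
               for w, mu, sd in components)

# Hypothetical 4-component fit (tonnes): e.g. empty, part-loaded,
# fully loaded, and overloaded vehicle populations.
fit = [(0.3, 10, 3), (0.3, 25, 5), (0.25, 40, 6), (0.15, 60, 10)]
print(mixture_pdf(30.0, fit))  # density at a 30 t load
```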


2021, Vol 8 (2)
Author(s):  
Giorgio Magri
Benjamin Storme

The classical constraints used in phonological theory apply to a single candidate at a time. Yet, some proposals in the phonological literature have enriched the classical constraint toolkit with constraints that instead apply to multiple candidates simultaneously. For instance, Dispersion Theory (Flemming 2002, 2004, 2008) adopts distinctiveness constraints that penalize pairs of surface forms which are not sufficiently dispersed. Also, some approaches to paradigm uniformity effects (Kenstowicz 1997; McCarthy 2005) adopt Optimal Paradigm faithfulness constraints that penalize pairs of stems in a paradigm which are not sufficiently similar. As a consequence, these approaches need to “lift” the classical constraints from a single candidate to multiple candidates by summing constraint violations across multiple candidates.

Is this assumption of constraint summation typologically innocuous? Or do the classical constraints make different typological predictions when they are summed, independently of the presence of distinctiveness or optimal paradigm faithfulness constraints? The answer depends on the underlying model of constraint optimization, namely on how the profiles of constraint violations are ordered to determine the smallest one. Extending an independent result by Prince (2015), this paper characterizes those orderings for which the assumption of constraint summation is typologically innocuous. As a corollary, the typological innocuousness of constraint summation is established within both Optimality Theory and Harmonic Grammar.
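The "lifting" operation itself is simple to state in code: a single-candidate constraint becomes a multi-candidate constraint by summing its violations across a tuple of candidates. A sketch with a hypothetical coda-counting constraint, not the paper's formalism:

```python
def lift(constraint, candidate_tuple):
    """Lift a classical single-candidate constraint to a tuple of
    candidates by summing its violations across the tuple, as in
    Dispersion Theory or Optimal Paradigm analyses."""
    return sum(constraint(cand) for cand in candidate_tuple)

# Hypothetical markedness constraint: one violation per closed syllable
# (syllables are dot-separated; "closed" = not vowel-final).
codas = lambda form: sum(1 for syll in form.split(".")
                         if not syll.endswith(("a", "e", "i", "o", "u")))

paradigm = ("pat.ka", "pat.kan")   # a two-member stem paradigm
print(lift(codas, paradigm))       # 1 + 2 = 3 summed violations
```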


2013, Vol 44 (4), pp. 569-609
Author(s):  
Giorgio Magri

Various authors have recently endorsed Harmonic Grammar (HG) as a replacement for Optimality Theory (OT). One argument for this move is that OT seems not to have close correspondents within machine learning while HG allows methods and results from machine learning to be imported into computational phonology. Here, I prove that this argument in favor of HG and against OT is wrong. In fact, I show that any algorithm for HG can be turned into an algorithm for OT. Hence, HG has no computational advantages over OT. This result allows tools from machine learning to be systematically adapted to OT. As an illustration of this new toolkit for computational OT, I prove convergence for a slight variant of Boersma’s (1998) (nonstochastic) Gradual Learning Algorithm.
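Magri's reduction goes from HG to OT; the converse, standard observation is that exponentially spaced weights let an HG harmony comparison reproduce OT's strict domination, given a bound on violation counts. The sketch below illustrates that relationship between the two comparison schemes with toy violation vectors; it is not Magri's construction.

```python
def ot_compare(viol_a, viol_b):
    """Classical OT comparison over ranked constraints (index 0 is
    top-ranked): a beats b iff the highest-ranked constraint on which
    their violation counts differ favours a."""
    for va, vb in zip(viol_a, viol_b):
        if va != vb:
            return va < vb
    return False

def hg_simulates_ot(viol_a, viol_b, max_viol):
    """If weights are spaced by a factor exceeding the maximum violation
    count, the HG weighted sum reproduces strict domination."""
    n = len(viol_a)
    weights = [(max_viol + 1) ** (n - 1 - i) for i in range(n)]
    harmony = lambda v: sum(w * x for w, x in zip(weights, v))
    return harmony(viol_a) < harmony(viol_b)

# Toy violation vectors: a wins under OT despite more total violations.
a, b = [0, 2, 2], [1, 0, 0]
print(ot_compare(a, b), hg_simulates_ot(a, b, max_viol=2))  # True True
```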


2018, Vol 55 (1), pp. 123-159
Author(s):  
Miranda McCarvel
Aaron Kaplan

In Tamil, coronals are licensed in onsets and initial syllables, exemplifying what Jesney (2011b) calls Licensing in Multiple Contexts (LMC). Jesney shows that while only positional faithfulness produces LMC in Optimality Theory, positional licensing provides a competing analysis of LMC in Harmonic Grammar (HG). This suggests that positional faithfulness may not be necessary in HG. We argue, though, that positional faithfulness remains essential. First, other facts in Tamil are incompatible with the positional licensing approach to LMC, rendering the positional faithfulness alternative the only viable analysis. Second, only with positional faithfulness can certain typological generalizations concerning assimilation between consonants be captured.


Author(s):  
Anna Mai
Eric Bakovic

We show that, in general, Optimality Theory (OT) grammars containing a restricted family of locally-conjoined constraints (Smolensky 2006) make the same typological predictions as corresponding Harmonic Grammar (HG) grammars. We provide an intuition for the generalization using a simple contrast and neutralization typology, as well as a formal proof. This demonstration adds structure to claims about the (non)equivalence of HG and OT with local conjunction (Legendre et al. 2006, Pater 2016) and provides a tool for understanding how different sets of constraints lead to the same typological predictions in HG and OT.
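In Smolensky's local conjunction, the conjoined constraint C1 & C2 is violated in a domain where both conjuncts are violated. A minimal sketch with hypothetical constraints evaluated on a single syllable (an illustration of the mechanism, not the paper's restricted family):

```python
def conjoin(c1, c2):
    """Local conjunction (sketched): one violation per domain in which
    BOTH conjunct constraints are violated."""
    return lambda domain: 1 if c1(domain) > 0 and c2(domain) > 0 else 0

# Hypothetical conjuncts, with the syllable as the domain.
no_coda = lambda syll: 0 if syll[-1] in "aeiou" else 1
no_voiced_obstruent = lambda syll: sum(1 for seg in syll if seg in "bdgvz")

banned = conjoin(no_coda, no_voiced_obstruent)
print(banned("pad"), banned("pat"), banned("da"))  # 1 0 0
```

The conjoined constraint penalizes only forms that violate both conjuncts in the same domain, which is what lets a ranked grammar mimic the cumulative ("gang") effects that HG derives from weight addition.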

