Allegro rules in colloquial Thai: some thoughts on process phonology

1986 ◽  
Vol 22 (2) ◽  
pp. 331-354 ◽  
Author(s):  
Ken Lodge

As part of an investigation into rapid speech and its rule-based processes, I want to present an analysis of colloquial spoken Thai and show how different tempi can be related to one another. I also want to see whether the processes displayed by colloquial Thai fit into the general picture of phonological processes which has emerged over the past 15 years or so (roughly Stampe, 1969, onwards) within different theoretical frameworks. In particular I shall try to relate my findings to the increasingly accepted notions of richer phonological structure now being envisaged (e.g. Clements & Keyser, 1983 – tridimensional; Goldsmith, 1976a & b – autosegmental; Liberman & Prince, 1977 and Kiparsky, 1979 – metrical; Anderson & Ewen, 1980, and Durand, 1986a – dependency).

1992 ◽  
Vol 23 (1) ◽  
pp. 52-60 ◽  
Author(s):  
Pamela G. Garn-Nunn ◽  
Vicki Martin

This study explored whether standard administration and scoring of conventional articulation tests accurately identified children as phonologically disordered, and whether information from these tests established severity level and programming needs. Results of standard scoring procedures from the Assessment of Phonological Processes-Revised, the Goldman-Fristoe Test of Articulation, the Photo Articulation Test, and the Weiss Comprehensive Articulation Test were compared for 20 phonologically impaired children. All tests identified the children as phonologically delayed/disordered, but the conventional tests failed to clearly and consistently differentiate varying severity levels. Conventional test results also showed limitations in error sensitivity, ease of computation for scoring procedures, and implications for remediation programming. The use of some type of rule-based analysis for phonologically impaired children is highly recommended.


2021 ◽  
pp. 147737082110006
Author(s):  
José A. Brandariz

In what might be called the ‘austerity-driven hypothesis’, a consistent strand of literature has sought to explain the prison downsizing witnessed in many jurisdictions of the global north over the past decade by referring to the financial crisis of the late 2000s to early 2010s and its effects in terms of public spending cuts. Since this economic phase is essentially over, whereas the (moderate) decarceration turn is still ongoing, there are good reasons to challenge this hypothesis. This article delves into the non-economic forces that are fostering a prison population decline that, 10 years on, is becoming the new ‘penal normal’. The article thereby aims to spark a dialogue not only with the scholarship exploring the prison downsizing but also with certain theoretical frameworks that have played a key role in examining the punitive turn era. Additionally, the article contributes to the conversation on the need to reframe materialist readings on penality in a ‘non-reductionist’ fashion. By revisiting heterodox theses and scrutinizing the impact of recent penal changes on traditional materialist accounts, the article joins the collective endeavour seeking to update political economic perspectives on punishment and the penal field.


2021 ◽  
pp. 026638212110619
Author(s):  
Sharon Richardson

During the past two decades, there have been a number of breakthroughs in the fields of data science and artificial intelligence, made possible by advanced machine learning algorithms trained through access to massive volumes of data. However, their adoption and use in real-world applications remains a challenge. This paper posits that a key limitation in making AI applicable has been a failure to modernise the theoretical frameworks needed to evaluate and adopt outcomes. Such a need was anticipated with the arrival of the digital computer in the 1950s but has remained unrealised. This paper reviews how the field of data science emerged and led to rapid breakthroughs in algorithms underpinning research into artificial intelligence. It then discusses the contextual framework now needed to advance the use of AI in real-world decisions that impact human lives and livelihoods.


2021 ◽  
Author(s):  
Bulat Zagidullin ◽  
Ziyan Wang ◽  
Yuanfang Guan ◽  
Esa Pitkänen ◽  
Jing Tang

Application of machine and deep learning (ML/DL) methods in drug discovery and cancer research has gained considerable attention in recent years. As the field grows, it becomes crucial to systematically evaluate the performance of novel DL solutions in relation to established techniques. To this end we compare rule-based and data-driven molecular representations in the prediction of drug combination sensitivity and drug synergy scores, using standardized results of 14 high-throughput screening studies comprising 64,200 unique combinations of 4,153 molecules tested in 112 cancer cell lines. We evaluate the clustering performance of molecular fingerprints and quantify their similarity by adapting the Centred Kernel Alignment metric. Our work demonstrates that, in order to identify an optimal representation type, it is necessary to supplement quantitative benchmark results with qualitative considerations, such as model interpretability and robustness, which may vary between and throughout preclinical drug development projects.
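As a rough illustration of the similarity metric mentioned above, the following is a minimal sketch (not the authors' code) of linear Centred Kernel Alignment between two fingerprint matrices; the matrix sizes and representation names are hypothetical.

```python
# Minimal sketch of linear Centred Kernel Alignment (CKA) between two
# representations of the same set of molecules. Illustrative only.
import numpy as np

def linear_cka(X: np.ndarray, Y: np.ndarray) -> float:
    """Linear CKA between two feature matrices with one row per molecule."""
    # Centre each feature (column) so the implied Gram matrices are centred too.
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # Squared Frobenius norm of the cross-covariance, normalised by the
    # self-similarity of each representation.
    cross = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    norm_x = np.linalg.norm(X.T @ X, ord="fro")
    norm_y = np.linalg.norm(Y.T @ Y, ord="fro")
    return float(cross / (norm_x * norm_y))

# Toy usage: compare a rule-based binary fingerprint with a data-driven
# (learned) embedding of the same 100 molecules.
rng = np.random.default_rng(0)
fp_rule_based = rng.integers(0, 2, size=(100, 1024)).astype(float)
fp_learned = rng.normal(size=(100, 256))
print(linear_cka(fp_rule_based, fp_learned))  # value in [0, 1]
```

Values close to 1 indicate that the two representations place the molecules in near-identical geometric relations, which is one way to gauge whether a data-driven representation adds information over a rule-based one.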


Phonology ◽  
2018 ◽  
Vol 35 (3) ◽  
pp. 441-479
Author(s):  
Külli Prillop

This article introduces basic principles of a generative theory of phonology that unifies aspects of parallel constraint-based theories and serial rule-based theories. At the core of the grammar are phonological processes, each consisting of a markedness constraint and a repair. Processes are universal, but every language activates a different set and applies them in different orders. Phonological processes may be in bleeding or feeding relations. These two basic relations are sufficient to define more complicated interactions, such as blocking, derived and non-derived environment effects, chain shifts and allophony.


2020 ◽  
Vol 12 (1) ◽  
pp. 113-121
Author(s):  
Carla Piazzon Ramos Vieira ◽  
Luciano Antonio Digiampietri

The technologies supporting Artificial Intelligence (AI) have advanced rapidly over the past few years, and AI is becoming commonplace in every aspect of life, from self-driving cars to earlier health diagnosis. For this to occur in the near future, the entire community stands before the barrier of explainability, an inherent problem of the latest models (e.g. Deep Neural Networks) that was not present in the previous AI hype cycle (linear and rule-based models). Most of these recent models are used as black boxes, with only partial or no understanding of how different features influence the model's predictions, which undermines algorithmic transparency. In this paper, we focus on how much we can understand the decisions made by an SVM classifier in a post-hoc, model-agnostic approach. Furthermore, we train an inherently interpretable tree-based model on labels produced by the SVM (secondary training data) to provide explanations, compare the permutation importance method with more commonly used measures such as accuracy, and show that our methods are more reliable and meaningful techniques to use. We also outline the main challenges for such methods and conclude that model-agnostic interpretability is a key component in making machine learning more trustworthy.
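A minimal sketch of the kind of post-hoc, model-agnostic workflow described above, using scikit-learn; the dataset, model settings and tree depth are illustrative assumptions rather than the authors' actual setup.

```python
# Sketch: fit a black-box SVM, relabel the data with its predictions
# ("secondary training data"), train an interpretable tree surrogate on those
# labels, and compare permutation importance with plain accuracy.
from sklearn.datasets import make_classification
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=8, n_informative=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Black-box model whose decisions we want to explain.
svm = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)

# Surrogate: an inherently interpretable tree trained on the SVM's own labels.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X_train, svm.predict(X_train))

print("SVM test accuracy: ", svm.score(X_test, y_test))
print("Surrogate fidelity:", surrogate.score(X_test, svm.predict(X_test)))

# Permutation importance of the black box: how much does shuffling each
# feature degrade its predictive performance?
result = permutation_importance(svm, X_test, y_test, n_repeats=20, random_state=0)
for i, imp in enumerate(result.importances_mean):
    print(f"feature {i}: {imp:.3f}")
```

The surrogate's fidelity (how often the tree reproduces the SVM's labels) and the permutation importances give complementary, model-agnostic views of the black box, alongside plain accuracy.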


2020 ◽  
Vol 20 (5) ◽  
pp. 590-603
Author(s):  
Curtis Redd ◽  
Emma K Russell

In recent years, we have witnessed a tide of government apologies for historic laws criminalising homosexuality. Complicating a conventional view of state apologies as a progressive effort to come to terms with past mistakes, queer theoretical frameworks help to elucidate the power effects and self-serving nature of the new politics of regret. We argue that through the discourse of gay apology, the state extolls pride in its present identity by expressing shame for its ‘homophobic past’. In doing so, it discounts the possibility that systemic homophobia persists in the present. Through a critical discourse analysis of the ‘world first’ gay apology from the parliament of the Australian state of Victoria in 2016, we identify five key themes: the inexplicability of the past, the individualisation of homophobia, the construction of a ‘post-homophobic’ society, the transformation of shame into state pride and subsuming the ‘unhappy queer’ through the expectation of forgiveness.


Worldview ◽  
1972 ◽  
Vol 15 (5) ◽  
pp. 35-41
Author(s):  
Ashok Kapur

In the past two decades most discussions about Indian foreign policy dealt with the nature and limitations of nonalignment. The discussions usually had an air of remoteness. In American perceptions, India as a policy area was peripheral to America's immediate political and strategic interests. Even when humanitarian motives were invoked as factors for political consideration, there was a feeling that India was too bulky, too riddled with immense problems. Thus, Americans were never sure whether intimate political ties with India were probable or even desirable from America's point of view. If this was the general picture accepted by liberals in America, conservatives asserted that India was “wishy-washy” toward the Communist bloc. India's refusal to join the American alliance system confirmed this suspicion.


1959 ◽  
Vol 15 (05) ◽  
pp. 327-344
Author(s):  
L. V. Martin

The past 25 years have brought dramatic improvements in mortality rates in Great Britain. Infant mortality is now about 40% of what it was a quarter of a century ago: death-rates for children have been reduced to less than 1 per 1000. Not only children have benefited; young adults in the 15–44 age-group experience mortality little more than a third of what it was in the early thirties, whilst women aged 45–64 have had an improvement of about 40%.

It is therefore regrettable that the experience of men aged 45–64 is a black spot in the general picture. The improvement for men in this age-group has not kept pace with that for women of the same age, being only about 15% in 25 years, little more than ½% a year. Table 1 compares the death-rates for men and women in post-war years and demonstrates very clearly how over a period of 12 years the ratio of male to female mortality has steadily increased.
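For readers who want to verify the annualised figure, a back-of-envelope check (my own arithmetic, not from the paper) of "about 15% in 25 years":

```latex
% Simple average improvement per year
\[ \frac{15\%}{25} = 0.6\% \text{ per year} \]
% Compound (geometric) average improvement per year
\[ 1 - (1 - 0.15)^{1/25} \approx 0.0065 = 0.65\% \text{ per year} \]
```

Both are consistent with the quoted "little more than ½% a year".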


2015 ◽  
Vol 64 (2) ◽  
Author(s):  
Mark A. Andor ◽  
Manuel Frondel ◽  
Stephan Sommer

In Europe's Emission Trading System (ETS), prices for emission permits have remained low for many years now. This fact gave rise to controversies on whether there is a need for fundamentally reforming the ETS. Potential reform proposals include the introduction of a minimum price for certificates and a market stability reserve (MSR), a rule-based mechanism for steering the volume of permits in the market. In preparing the introduction of this instrument, the European Commission hopes to be able to increase and stabilize certificate prices in the medium and long term. In this article, we recommend retaining the ETS as it is, rather than supplementing it with a minimum price floor or a market stability reserve. Instead, mistakes from the past should be corrected by a single intervention: the final elimination of those 900 million permits that were taken out of the market in 2014 but would otherwise re-enter it in 2019 and 2020 (backloading).

