Implicatures and Naturalness

Author(s):  
Igor Douven

Abstract Pragmatics postulates a rich typology of implicatures to explain how true assertions can nevertheless be misleading. This typology has mainly been defended on the basis of a priori considerations. We consider whether the typology corresponds to an independent reality, specifically whether the various types of implicatures constitute natural concepts. To answer this question, we rely on the conceptual spaces framework, which represents concepts geometrically and which provides a formally precise criterion for naturalness. Using data from a previous study, a space for the representation of implicatures is constructed. Examination of the properties of the various types of implicatures as represented in that space then gives some reason to believe that most, or even all, types of implicatures correspond to natural concepts.
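
In the conceptual spaces framework, the usual formal criterion for naturalness is convexity: any point lying between two instances of a concept should itself fall under that concept. As a rough illustration only (not the paper's construction), the sketch below induces concept regions by nearest-prototype classification and tests convexity along line segments between instances; all coordinates, labels, and prototypes are invented:

```python
# Sketch of a convexity test for concept regions in a 2D conceptual space.
# Regions are Voronoi cells of hypothetical prototypes; a concept passes the
# test if sampled points between each pair of its instances keep its label.

from itertools import combinations

def nearest(prototypes, point):
    """Label of the prototype closest to `point` (squared Euclidean distance)."""
    return min(prototypes,
               key=lambda lab: sum((a - b) ** 2 for a, b in zip(prototypes[lab], point)))

def is_convex_concept(instances, label, prototypes, steps=5):
    """Check that points sampled between every pair of instances still classify as `label`."""
    for p, q in combinations(instances, 2):
        for k in range(1, steps):
            t = k / steps
            mid = tuple(a + t * (b - a) for a, b in zip(p, q))
            if nearest(prototypes, mid) != label:
                return False
    return True

# Invented prototypes and instances for two implicature types.
prototypes = {"scalar": (0.0, 0.0), "particularized": (1.0, 1.0)}
scalar_instances = [(0.1, 0.0), (0.0, 0.2), (0.3, 0.1)]
print(is_convex_concept(scalar_instances, "scalar", prototypes))  # True
```

Under this criterion a concept counts as natural when no in-between point "leaks" into a rival region; the paper applies a formally precise version of this idea to a space built from prior experimental data.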

2018
Vol 63
pp. 691-742
Author(s):
Hadi Banaee
Erik Schaffernicht
Amy Loutfi

There is an increasing need to derive semantics from real-world observations to facilitate natural information sharing between machines and humans. Conceptual spaces theory is a possible approach: it has been proposed as a mid-level representation between symbolic and sub-symbolic representations, whereby concepts are represented in a geometrical space characterised by a number of quality dimensions. Currently, much of the work has demonstrated how conceptual spaces are created in a knowledge-driven manner, relying on prior knowledge to form concepts and identify quality dimensions. This paper presents a method to create semantic representations using data-driven conceptual spaces, which are then used to derive linguistic descriptions of numerical data. Our contribution is a principled approach to automatically construct a conceptual space from a set of known observations in which the quality dimensions and domains are not known a priori. The novelty of the approach is the ability to select and group semantic features to discriminate between concepts in a data-driven manner while preserving the semantic interpretation needed to infer linguistic descriptions for interaction with humans. Two data sets, representing leaf images and time series signals, are used to evaluate the method. An empirical evaluation for each case study assesses how well linguistic descriptions generated from the conceptual spaces identify unknown observations. Furthermore, comparisons are made with descriptions derived from alternative approaches for generating semantic models.
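
One way to picture the final step of such a pipeline: once quality dimensions have been selected from data, each dimension can be discretised into words whose thresholds are themselves learned from the observations. This is a minimal, hypothetical sketch, not the authors' method; the feature names, tercile discretisation, and word inventory are illustrative assumptions:

```python
# Sketch: derive linguistic descriptions of numeric observations by
# discretising each quality dimension with data-driven cut points.

def tercile_cuts(values):
    """Two cut points splitting the sorted values into rough thirds."""
    s = sorted(values)
    return s[len(s) // 3], s[2 * len(s) // 3]

def describe(observation, cuts, words=("low", "medium", "high")):
    """Map each dimension's value to a word via its learned cut points."""
    parts = []
    for dim, value in observation.items():
        lo, hi = cuts[dim]
        parts.append(f"{words[0 if value < lo else (1 if value < hi else 2)]} {dim}")
    return ", ".join(parts)

# Hypothetical training observations for two leaf-shape dimensions.
data = {"width": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
        "elongation": [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]}
cuts = {dim: tercile_cuts(vals) for dim, vals in data.items()}
print(describe({"width": 5.5, "elongation": 0.15}, cuts))  # high width, low elongation
```

The point of the sketch is the direction of information flow: the vocabulary is grounded in the space's dimensions, so descriptions remain interpretable to humans while the thresholds come from data rather than prior knowledge.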


Author(s):  
Mustafa S. Abd
Suhad Faisal Behadili

Psychological research centers indirectly connect professionals across the fields of human life, the job environment, family life, and psychological support for psychiatric patients. This research aims to detect patterns of job apathy in the behavior of employee groups at the University of Baghdad and the Iraqi Ministry of Higher Education and Scientific Research. The investigation presents an approach that uses data mining techniques to acquire new knowledge, and it differs from statistical studies in its support for researchers' evolving needs. These techniques prune redundant or irrelevant attributes to discover interesting patterns. The principal task is to identify the most important and effective questions in a questionnaire recommended by psychiatric researchers; useless questions are pruned using an attribute selection method. The information gained through each question is then measured with respect to a specific class, and the questions are ranked accordingly. Association rule mining with the Apriori algorithm is used to detect the most influential and interrelated questions in the questionnaire. Consequently, the decisive parameters that may lead to job apathy are determined.
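
The Apriori step finds sets of questions that frequently co-occur. Below is a minimal brute-force sketch of frequent-itemset mining with support-based pruning; the transactions and question IDs are hypothetical, and a full Apriori implementation would additionally generate size-k candidates only from frequent size-(k-1) sets:

```python
# Sketch of frequent-itemset mining: count itemsets up to a given size and
# keep those whose support (fraction of transactions containing them) meets
# a threshold. If no itemset of some size is frequent, none larger can be.

from itertools import combinations

def frequent_itemsets(transactions, min_support, max_size=2):
    """Return {itemset: support} for itemsets up to max_size meeting min_support."""
    n = len(transactions)
    result = {}
    for size in range(1, max_size + 1):
        counts = {}
        for t in transactions:
            for items in combinations(sorted(t), size):
                counts[items] = counts.get(items, 0) + 1
        level = {items: c / n for items, c in counts.items() if c / n >= min_support}
        if not level:  # Apriori property: supersets of infrequent sets are infrequent
            break
        result.update(level)
    return result

# Hypothetical transactions: questionnaire items answered affirmatively.
tx = [{"Q1", "Q3"}, {"Q1", "Q3", "Q7"}, {"Q1", "Q7"}, {"Q3"}]
print(frequent_itemsets(tx, min_support=0.5))
```

Frequent pairs such as ("Q1", "Q3") are the raw material for association rules linking interrelated questions, which is how the study surfaces the parameters most associated with job apathy.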


Author(s):  
Saurabh Basu
Zhiyu Wang
Christopher Saldana

Tool chatter is envisaged as a technique for creating undulations on fabricated biomedical components. Herein, a priori designed topographies were fabricated using modulation-assisted machining of oxygen-free high-conductivity copper. Subsequently, the underpinnings of microstructure evolution in this machining process were characterized using electron backscatter diffraction-based orientation imaging microscopy. These underpinnings were related to the unsteady mechanical states present during modulation-assisted machining, which were numerically modeled using data obtained from simpler machining configurations. In this manner, relationships between the final microstructural states and the underlying mechanics were found. Finally, these results were discussed in the context of the unsteady mechanics present during tool chatter; it was shown that tool chatter produces statistically predictable microstructural outcomes.


1997
Vol 43 (143)
pp. 180-191
Author(s):
E. M. Morris
H.-P. Bader
P. Weilenmann

Abstract A physics-based snow model has been calibrated using data collected at Halley Bay, Antarctica, during the International Geophysical Year. Variations in snow temperature and density are well simulated using values of the model parameters within the range reported from other polar field experiments. The effect of uncertainty in the parameter values on the accuracy of the predictions is no greater than the effect of instrumental error in the input data. Thus, the model can be used with parameters determined a priori rather than by optimization. The model has been validated using an independent data set from Halley Bay and then used to estimate 10 m temperatures on the Antarctic Peninsula plateau over the last half-century.


2019
Vol 20 (2)
pp. 251-274
Author(s):
Zeinab Takbiri
Ardeshir Ebtehaj
Efi Foufoula-Georgiou
Pierre-Emmanuel Kirstetter
F. Joseph Turk

Abstract Monitoring changes in precipitation phase from space is important for understanding the mass balance of Earth’s cryosphere in a changing climate. This paper examines a Bayesian nearest-neighbor approach for prognostic detection of precipitation and its phase using passive microwave observations from the Global Precipitation Measurement (GPM) satellite. The method uses a weighted Euclidean distance metric to search an a priori database populated with coincident GPM radiometer and radar observations as well as ancillary snow-cover data. The algorithm’s performance is evaluated using data from GPM official precipitation products, ground-based radars, and high-fidelity simulations from the Weather Research and Forecasting Model. Using the presented approach, we demonstrate that the hit probability of terrestrial precipitation detection can reach 0.80 while the probability of false alarm remains below 0.11. The algorithm shows higher skill in detecting snowfall than rainfall, by 10% on average. In particular, the probabilities of detecting precipitation and its solid phase increase by 11% and 8%, respectively, over dry snow cover compared to other surface types. The main reason is the algorithm’s ability to capture the signal of increased liquid water content in snowy clouds over radiometrically cold snow-covered surfaces.
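
The retrieval at the core of such an algorithm can be pictured as a weighted nearest-neighbour search over the a priori database. In this toy sketch, database entries pair hypothetical two-channel brightness temperatures with phase labels, and the channel weights are arbitrary illustrations; the actual database, channel set, and Bayesian weighting are far richer:

```python
# Sketch of weighted nearest-neighbour phase detection: classify a new
# radiometer observation by the majority label of its k nearest entries
# in a labelled a priori database.

def weighted_dist(a, b, w):
    """Weighted Euclidean distance between two channel vectors."""
    return sum(wi * (ai - bi) ** 2 for ai, bi, wi in zip(a, b, w)) ** 0.5

def knn_phase(obs, database, weights, k=3):
    """Majority phase label among the k nearest database entries."""
    ranked = sorted(database, key=lambda entry: weighted_dist(obs, entry[0], weights))[:k]
    labels = [lab for _, lab in ranked]
    return max(set(labels), key=labels.count)

# Hypothetical (brightness temperatures in K, phase label) pairs.
db = [((250.0, 220.0), "snow"), ((252.0, 221.0), "snow"),
      ((270.0, 260.0), "rain"), ((272.0, 261.0), "rain"),
      ((251.0, 223.0), "snow")]
print(knn_phase((253.0, 222.0), db, weights=(1.0, 0.5)))  # snow
```

Down-weighting a noisy channel (here the second weight is 0.5) is the sketch's stand-in for the paper's distance weighting; ancillary snow-cover data would enter as extra dimensions of the same search.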


1992
Vol 2 (4)
pp. 407-435
Author(s):  
François Bourdoncle

Abstract The essential part of abstract interpretation is to build a machine-representable abstract domain expressing interesting properties about the possible states reached by a program at runtime. Many techniques have been developed that assume one knows in advance the class of properties of interest. There are cases, however, with no a priori indication of the 'best' abstract properties to use. We introduce a new framework that enables non-unique representations of abstract program properties to be used, and present a method, called dynamic partitioning, that allows the dynamic determination of interesting abstract domains using data structures built over simpler domains. Finally, we show how dynamic partitioning can be used to compute non-trivial approximations of functions over infinite domains, and give an application to the computation of minimal function graphs.
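
For readers unfamiliar with the base machinery: an abstract domain supplies a join (least upper bound) and, for infinite-height domains, a widening operator to force fixpoint iteration to terminate. The interval domain below is a standard example of the "simpler domains" that dynamic partitioning builds over; it is textbook material, not the paper's construction:

```python
# Sketch of the interval abstract domain: join and widening on (lo, hi)
# pairs, then an abstract fixpoint of `x = 0; while ...: x = x + 1`.

def join(a, b):
    """Least upper bound of two intervals."""
    return (min(a[0], b[0]), max(a[1], b[1]))

def widen(a, b):
    """Standard widening: jump unstable bounds to infinity to force termination."""
    lo = a[0] if a[0] <= b[0] else float("-inf")
    hi = a[1] if a[1] >= b[1] else float("inf")
    return (lo, hi)

x = (0, 0)  # abstract value of x after `x = 0`
while True:
    # one more loop iteration adds 1 to x; join with the current value, then widen
    nxt = widen(x, join(x, (x[0] + 1, x[1] + 1)))
    if nxt == x:
        break
    x = nxt
print(x)  # (0, inf)
```

Plain joins would enumerate (0, 1), (0, 2), ... forever; widening reaches the sound answer "x is at least 0" in one step, at the cost of precision that partitioned, non-unique representations aim to recover.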


1997
Vol 119 (3)
pp. 574-578
Author(s):
B. Guerrier
H. G. Liu
C. Bénard

The profile and time evolution of a solid/liquid interface in a phase-change process are estimated by solving an inverse heat transfer problem, using measurements taken in the solid phase only. One then faces the inverse resolution of a heat equation on a variable and a priori unknown 2D domain. This ill-posed problem is solved by a regularization approach: the unknown function (the position of the melting front) is obtained by minimizing a two-component criterion, consisting of a distance between the output of a simulation model and the measured data, to which a penalizing function is added in order to restore the continuity of the inverse operator. A numerical study is developed to analyze the validity domain of the identification method. Simulation tests show that the minimum signal-to-noise ratio that can be handled depends strongly on the positions of the measurement sensors.
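
The two-component criterion can be sketched as a data-misfit term plus a penalty. In this toy version, the forward model, sensor positions, prior value, and penalty weight are all invented stand-ins for the paper's heat-conduction simulation; the point is only that the regularised estimate is pulled away from the misfit-only optimum toward the prior, stabilising the inversion:

```python
# Sketch of Tikhonov-style regularisation for an inverse problem:
# minimise  ||measured - forward(front)||^2 + lam * (front - prior)^2
# over a grid of candidate front positions.

def forward(front):
    """Hypothetical simulated sensor readings as a function of front position."""
    return [20.0 - front * x for x in (0.2, 0.4, 0.6)]  # three sensor locations

def criterion(front, measured, prior=1.0, lam=0.1):
    misfit = sum((m - s) ** 2 for m, s in zip(measured, forward(front)))
    penalty = lam * (front - prior) ** 2  # restores stability of the inverse operator
    return misfit + penalty

measured = [19.0, 18.0, 17.0]  # data exactly consistent with front = 5.0
best = min((criterion(f / 100, measured), f / 100) for f in range(0, 1001))
print(best[1])  # regularised estimate: between the prior (1.0) and misfit optimum (5.0)
```

With the penalty weight at zero the estimate would sit exactly at 5.0; raising it trades fidelity to noisy data for robustness, which is the trade-off the paper's validity-domain study quantifies.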


2014
Vol 45 (6)
pp. 868-892
Author(s):
Timothy D. Jones
Nick A. Chappell

With the aim of quantifying the purely hydrological control on fast water quality dynamics, a modelling approach was used to identify the structure (and dynamic response characteristics, or DRCs) of the relationship between rainfall and hydrogen ion (H+) load, with reference to the rainfall-to-streamflow response. Unlike most hydrochemistry studies, the method makes no a priori assumptions about the complexity of the dynamics (e.g., the number of flow-paths), but instead uses objective statistical methods, together with uncertainty analysis, to define these. The robust models identified are based on continuous-time transfer functions and demonstrate high simulation efficiency with constrained uncertainty, allowing hydrological interpretation of the dominant flow-paths and the behaviour of H+ load in four upland headwaters. The identified models demonstrated that the short-term dynamics in H+ concentration were closely associated with the streamflow response, suggesting a dominant hydrological control. The second-order structure identified for the rainfall-to-streamflow response was also the optimal model for rainfall to H+ load, even given the very dynamic concentration response, possibly indicating that the same two flow-paths are responsible for both integrated responses.
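
A second-order transfer function of this kind is commonly interpreted as two first-order stores acting in parallel, one quick and one slow flow-path. The discrete-time sketch below illustrates that interpretation with invented gains and recession coefficients; it is not the authors' identified continuous-time model:

```python
# Sketch: a second-order response as the sum of two parallel first-order
# stores, each following y[t] = a*y[t-1] + g*(1-a)*u[t] for rainfall u.

def two_store_response(rain, quick=(0.6, 0.5), slow=(0.4, 0.95)):
    """Streamflow (or H+ load) from a quick and a slow (gain, recession) store."""
    out, yq, ys = [], 0.0, 0.0
    for u in rain:
        gq, aq = quick
        gs, as_ = slow
        yq = aq * yq + gq * (1 - aq) * u  # quick flow-path: fast rise, fast recession
        ys = as_ * ys + gs * (1 - as_) * u  # slow flow-path: small, persistent response
        out.append(yq + ys)
    return out

# Response to a single hypothetical rainfall pulse.
flow = two_store_response([10.0] + [0.0] * 5)
print([round(v, 3) for v in flow])
```

The quick store dominates the peak and the slow store the tail; finding that one such structure fits both streamflow and H+ load is what suggests a shared pair of flow-paths in the study.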


2019
Author(s):
Wiktor Młynarski
Michal Hledík
Thomas R. Sokolowski
Gašper Tkačik

Normative theories and statistical inference provide complementary approaches for the study of biological systems. A normative theory postulates that organisms have adapted to efficiently solve essential tasks, and proceeds to mathematically work out testable consequences of such optimality; parameters that maximize the hypothesized organismal function can be derived ab initio, without reference to experimental data. In contrast, statistical inference focuses on efficient utilization of data to learn model parameters, without reference to any a priori notion of biological function, utility, or fitness. Traditionally, these two approaches were developed independently and applied separately. Here we unify them in a coherent Bayesian framework that embeds a normative theory into a family of maximum-entropy “optimization priors.” This family defines a smooth interpolation between a data-rich inference regime (characteristic of “bottom-up” statistical models) and a data-limited ab initio prediction regime (characteristic of “top-down” normative theory). We demonstrate the applicability of our framework using data from the visual cortex, the retina, and C. elegans, and argue that the flexibility it affords is essential to address a number of fundamental challenges relating to inference and prediction in complex, high-dimensional biological problems.
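
The interpolation can be made concrete with a prior of the form p(θ) ∝ exp(βU(θ)): β = 0 recovers pure data-driven inference, while large β concentrates the posterior on the optimum of the hypothesized utility U. The sketch below uses an invented scalar parameter, utility, and likelihood purely to show the two regimes:

```python
# Toy "optimization prior": log-posterior = log-likelihood + beta * U(theta).
# beta interpolates between the data-rich and the normative regime.

thetas = [i / 100 for i in range(101)]  # grid over a single parameter in [0, 1]

def U(th):
    return -(th - 0.8) ** 2              # hypothesized organismal objective, optimum at 0.8

def loglik(th):
    return -10 * (th - 0.4) ** 2         # hypothetical data favour theta near 0.4

def posterior_mode(beta):
    """Grid maximiser of loglik(theta) + beta * U(theta)."""
    scores = [loglik(th) + beta * U(th) for th in thetas]
    return thetas[scores.index(max(scores))]

print(posterior_mode(0.0))     # beta = 0: mode at the likelihood peak, 0.4
print(posterior_mode(1000.0))  # large beta: mode pulled to the optimum of U, 0.8
```

Intermediate β values yield modes between the two extremes, which is the smooth interpolation the framework exploits when data are scarce but a normative hypothesis is available.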


2020
Author(s):
Taeeung Kim
Junhye Kwon
Chung Gun Lee
Chang-Yong Jang

Abstract
Background: Childhood obesity is a serious public health threat. Although many researchers have studied the socioecological determinants of childhood obesity, their longitudinal effects remain inconclusive, especially among young children. This study examined socioecological factors and the associated transitions in children’s body mass index (BMI) status from kindergarten through the elementary school years, using data from a national longitudinal sample.
Methods: The baseline sample included 1,264 children (weighted N=379,297) from the Early Childhood Longitudinal Study (baseline mean age: 5.24 years). The socioecological framework guided the selection of socioecological obesogenic variables (e.g., family activity and parental involvement). Longitudinal ordered logistic regressions were performed to determine the associations between socioecological obesogenic variables and unhealthy/healthy changes in BMI status, capturing transitions between healthy and unhealthy weight status (i.e., overweight, obesity, and severe obesity).
Results: Children of Hispanic ethnicity and nonwhite race, with less socioeconomic and environmental support, and living in households with fewer family members were more likely than their counterparts to experience unhealthy changes in BMI status over time (all ps<0.05). Over the study period, girls were more likely than boys to transition to unhealthy BMI status (all ps<0.05).
Conclusion: As hypothesized a priori, the findings of the current study affirmed multiple dimensions of how socioecological obesogenic factors may influence changes in children’s BMI status in a longitudinal setting. To maintain children’s long-term healthy weight, more attention should be paid to the socioecological obesogenic factors surrounding children as well as to individual determinants of obesity (e.g., being physically active and having well-balanced nutrition).

