The grounding and sharing of symbols

2006 ◽  
Vol 14 (2) ◽  
pp. 275-285 ◽  
Author(s):  
Angelo Cangelosi

The double function of language, as a social/communicative means and as an individual/cognitive capability, derives from its fundamental property that allows us to internally re-represent the world we live in. This is possible through the mechanism of symbol grounding, i.e., the ability to associate entities and states in the external and internal world with internal categorical representations. The symbol grounding mechanism, like language, has both an individual and a social component. The individual component, called “Physical Symbol Grounding”, refers to the ability of each individual to create an intrinsic link between world entities and internal categorical representations. The social component, called “Social Symbol Grounding”, refers to the collective negotiation for the selection of shared symbols (words) and their grounded meanings. The paper discusses these two aspects of symbol grounding in relation to distributed cognition, using examples from cognitive modeling research on grounded agents and robots.
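To make the two grounding components concrete, the following toy simulation (not Cangelosi's actual model; all class names and parameters are hypothetical) sketches physical grounding as nearest-prototype categorization and social grounding as a naming game in which hearers adopt the speaker's word, so that a shared lexicon emerges.

```python
# Illustrative sketch only: a toy "naming game" combining physical symbol
# grounding (categories formed from sensory prototypes) with social symbol
# grounding (agents negotiating shared words). Names and parameters are
# hypothetical, not taken from Cangelosi's models.
import random

import numpy as np


class Agent:
    def __init__(self, prototypes):
        self.prototypes = prototypes          # category centers (physical grounding)
        self.lexicon = {}                     # category index -> preferred word

    def categorize(self, stimulus):
        """Map a sensory vector to the nearest internal category."""
        dists = [np.linalg.norm(stimulus - p) for p in self.prototypes]
        return int(np.argmin(dists))

    def name(self, category):
        """Return this agent's word for a category, inventing one if needed."""
        if category not in self.lexicon:
            self.lexicon[category] = f"w{random.randint(0, 9999)}"
        return self.lexicon[category]


def naming_game(agents, stimuli, rounds=2000):
    """Pairs of agents name the same stimulus; the hearer adopts the speaker's word."""
    for _ in range(rounds):
        speaker, hearer = random.sample(agents, 2)
        stimulus = random.choice(stimuli)
        word = speaker.name(speaker.categorize(stimulus))
        hearer.lexicon[hearer.categorize(stimulus)] = word  # social alignment


prototypes = [np.array([0.0, 0.0]), np.array([1.0, 1.0])]
agents = [Agent(prototypes) for _ in range(10)]
stimuli = [p + np.random.normal(0, 0.1, 2) for p in prototypes for _ in range(20)]
naming_game(agents, stimuli)
print({i: a.lexicon for i, a in enumerate(agents[:3])})  # shared words emerge
```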

Author(s):  
Dongho Kim

The demand for qualified teachers with sufficient pedagogical knowledge and skills is high. However, existing teacher education programs do not provide adequate experiences through which pre-service teachers can develop their professional foundations. This study recognized Open Educational Resources (OER) as a means of enhancing teacher education. The purpose of this study was to propose a framework for integrating OER into lesson design activities for pre-service teachers. In this study, a focused literature review investigated the frameworks of distributed cognition and example-based learning. This review process resulted in a unified framework that describes how pre-service teachers learn with OER at both the individual and cognitive-system levels. Four principles and ten guidelines are provided to guide the implementation of OER-based lesson design activities in real settings. The new framework has the potential to enhance pre-service teachers’ Web resource-based professional development.


2021 ◽  
Vol 28 (4) ◽  
Author(s):  
Selwin Hageraats ◽  
Mathieu Thoury ◽  
Stefan Stanescu ◽  
Katrien Keune

X-ray linear dichroism (XLD) is a fundamental property of many ordered materials that can, for instance, provide information on the origin of magnetic properties and the existence of differently ordered domains. Conventionally, measurements of XLD are performed on single crystals, crystalline thin films, or highly ordered nanostructure arrays. Here, it is demonstrated how quantitative measurements of XLD can be performed on powders, relying on the random orientation of many particles instead of the controlled orientation of a single ordered structure. The technique is based on a scanning X-ray transmission microscope operated in the soft X-ray regime. The use of a Fresnel zone plate allows X-ray absorption features to be probed at ∼40 nm lateral resolution – a scale small enough to probe the individual crystallites in most powders. Quantitative XLD parameters were then retrieved by determining the intensity distributions of certain diagnostic dichroic absorption features, estimating the angle between their transition dipole moments, and fitting the distributions with four-parameter dichroic models. Analysis of several differently produced ZnO powders shows that the experimentally obtained distributions indeed follow the theoretical model for XLD. Monte Carlo simulations were used to estimate uncertainties in the calculated dichroic model parameters, establishing that longer X-ray exposure times lead to a decrease in the amplitude of the XLD effect in ZnO.
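As a rough illustration of the powder approach, the sketch below simulates per-crystallite intensities under a simple two-parameter dichroic law, I = I_iso + A·cos²θ, for randomly oriented uniaxial crystallites, and uses repeated Monte Carlo runs to gauge the spread of a recovered amplitude. The rate law, parameter names, and the amplitude estimator are assumptions for illustration only; they are not the paper's four-parameter dichroic model.

```python
# Illustrative sketch only: Monte Carlo of the intensity distribution expected
# for a dichroic absorption feature in a powder of randomly oriented uniaxial
# crystallites. The two-parameter law I = I_iso + A*cos^2(theta) is an
# assumption for illustration; it is not the four-parameter model of the paper.
import numpy as np


def sample_intensities(i_iso, amplitude, n_particles=100_000, seed=None):
    """Draw per-crystallite absorption intensities for random orientations."""
    rng = np.random.default_rng(seed)
    # For uniformly random 3D orientations, cos(theta) relative to the fixed
    # X-ray polarization axis is uniform on [-1, 1].
    cos_theta = rng.uniform(-1.0, 1.0, n_particles)
    return i_iso + amplitude * cos_theta**2


def mc_uncertainty(i_iso, amplitude, noise=0.02, n_runs=200):
    """Repeat the simulation with measurement noise to estimate the spread of
    the recovered dichroic amplitude (here crudely taken as max - min)."""
    estimates = []
    for seed in range(n_runs):
        intens = sample_intensities(i_iso, amplitude, n_particles=5_000, seed=seed)
        intens += np.random.default_rng(seed + 10_000).normal(0, noise, intens.size)
        estimates.append(intens.max() - intens.min())
    return np.mean(estimates), np.std(estimates)


mean_amp, std_amp = mc_uncertainty(i_iso=1.0, amplitude=0.3)
print(f"recovered amplitude ≈ {mean_amp:.3f} ± {std_amp:.3f}")
```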


2018 ◽  
Vol 22 (1) ◽  
pp. 187-197
Author(s):  
Åsa Harvard Maare

This paper looks at how players of a card game create spatial arrangements of playing cards, and at the cognitive and communicative effects of such arrangements. The data is an episode of two 8-year-old children and a teacher playing the combinatorial card game Set in the setting of a leisure-time center. The paper explores and explains how the visual resources of the game are used for externalizing information, in terms of distributed cognition and epistemic actions. The paper also examines how other participants attend to the visual arrangements and self-directed talk of the active player. The argument is that externalizing information may be a strategy for reducing cognitive load for the individual problem-solver, but it is also a communicative behaviour that affects other participants and causes them to engage with the problem and the problem-solver. Seeing and hearing players who have succeeded in finding a set provides observers with rich learning opportunities and increases their motivation to play the game. From the point of view of learning design, the consequence is that bystanders merit consideration as potential learners of a pedagogical game as much as the players themselves.
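For readers unfamiliar with the game, the combinatorial rule that players externalize can be stated compactly: three cards form a set exactly when each attribute is all-same or all-different across the cards. The sketch below (the card representation is an illustrative assumption) checks that rule and enumerates sets among face-up cards.

```python
# Minimal sketch of the combinatorial rule behind the card game Set: three
# cards form a set if, for every attribute, the values are all equal or all
# different. The dictionary card representation is an illustrative assumption.
from itertools import combinations

ATTRIBUTES = ("number", "colour", "shape", "shading")


def is_set(card_a, card_b, card_c):
    """True if the three cards are all-same or all-different on each attribute."""
    for attr in ATTRIBUTES:
        values = {card_a[attr], card_b[attr], card_c[attr]}
        if len(values) == 2:          # two equal, one different -> not a set
            return False
    return True


def find_sets(cards):
    """Enumerate every valid set among the face-up cards."""
    return [trio for trio in combinations(cards, 3) if is_set(*trio)]


cards = [
    {"number": 1, "colour": "red", "shape": "oval", "shading": "solid"},
    {"number": 2, "colour": "red", "shape": "oval", "shading": "striped"},
    {"number": 3, "colour": "red", "shape": "oval", "shading": "open"},
    {"number": 1, "colour": "green", "shape": "diamond", "shading": "solid"},
]
print(find_sets(cards))   # the first three cards form the only set
```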


Author(s):  
Fernando Vidal ◽  
Francisco Ortega

The chapter explores the cerebralization of psychological distress. The psychopharmacological revolution took place in the 1950s. Later on, the nosological biologization of mental disorders received a crucial impetus when DSM-III opened the way to redescribing in neurological terms disorders such as schizophrenia, autism, and depression. Behaviors previously considered merely awkward, such as shyness, or seen as having a major social component, like alcoholism or obesity, have become predominantly neurological conditions. The chapter provides an overview of this situation, as well as a more detailed examination of the cerebralization of depression, which is a particularly complex cultural and biopolitical phenomenon. It also explores the consequences of the cerebralizing trend for the constitution of “forms of living.” While biological psychiatry has been criticized as dehumanizing, it has also contributed to freeing patients and families from blame and stigma. Insofar as a problem resides within the brain, the individual bears no guilt; though organic, the disorder is externalized relative to the person’s identity. This has sustained a “neurodiversity” movement, led by high-functioning autistics who believe that their condition is not a disease to be treated and, if possible, cured, but rather a human specificity to be respected like other forms of difference.


2021 ◽  
Vol 3 (3) ◽  
pp. 302-321
Author(s):  
Hegumen Anthony Kamenchuk

This article outlines the key features of the Christian understanding of divine providence in comparison with the philosophical trends of Antiquity from the 1st to the 3rd centuries (before Neoplatonism). The author identifies three paradigms of understanding divine providence in the ancient pagan philosophy of this period (atheistic, pantheistic, and deistic) and in this context defines the Christian paradigm as “dialogical panentheism”. According to the author, Christianity at its core offers a worldview uncharacteristic of paganism: the cosmos is oriented toward the realization of a dialogue between man and God and the achievement of existential intimacy between the Creator and creation. It is also noted that Christianity, in contrast to ancient thought, placed the emphasis on the fact that the fundamental property of the higher Deity is His openness toward the Other, and not merely self-contemplating or self-contained calmness. This, in turn, determines two other aspects of the Christian doctrine of providence: the all-pervading participation of God in the life of the world and His concern for the individual and for those who are flawed. The author also argues that the Orthodox understanding of providence is a harmonious middle path between the extremes of pantheism and deism.


2017 ◽  
Vol 3 (3) ◽  
pp. 141 ◽  
Author(s):  
V. V. Malakhov

The principles of stoichiography and novel reference-free methods of molecular and phase analysis for complex unknown mixtures are considered. Stoichiography builds on the stoichiometry of mass transfer in unsteady homo- and heterophase processes and combines two operations: separation of the mixture (by chromatography, electromigration, dissolution, or other means) and determination of the stoichiometry of the substance flow over time. Stoichiography allows a chemical compound to be identified by its primary property, namely the stoichiometry of its elemental composition. Stoichiograms provide the basis for this information: they are the time profiles of the molar ratios of the mass-transfer rates of the chemical elements released from multielement substances. A fundamental property of the stoichiograms of individual compounds is their invariance to solvent concentration, temperature, and hydrodynamic regime; the stoichiograms therefore remain constant and equal to the stoichiometric coefficients of the compound’s formula. The theory and methodology of the new stoichiographic methods, differential dissolution and ion-chromato-stoichiography, are presented. New equipment (the stoichiograph) and a new differential-dissolution procedure (stoichiographic titration) are discussed in detail. Applications of differential dissolution to the analysis of multielement and polyphase crystalline and amorphous samples are given.
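A minimal numerical sketch of the stoichiogram idea, with hypothetical rate laws and phases: the element-rate ratio is constant for a single compound but drifts when two phases with different stoichiometries dissolve together.

```python
# Illustrative sketch only: how a stoichiogram (the time profile of the molar
# ratio of element mass-transfer rates) stays constant for a single compound
# but varies for a mixture of phases. Rate laws and values are hypothetical.
import numpy as np

t = np.linspace(0.0, 10.0, 200)

# Hypothetical dissolution rates (mol/min) of two phases, AB and AB2,
# dissolving with different kinetics.
rate_AB = 1.0 * np.exp(-0.3 * t)       # phase with A:B = 1:1
rate_AB2 = 0.5 * np.exp(-1.0 * t)      # phase with A:B = 1:2

# Element mass-transfer rates for the pure phase and for the mixture.
rate_A_pure, rate_B_pure = rate_AB, rate_AB
rate_A_mix = rate_AB + rate_AB2
rate_B_mix = rate_AB + 2.0 * rate_AB2

stoichiogram_pure = rate_A_pure / rate_B_pure    # constant, equal to 1
stoichiogram_mix = rate_A_mix / rate_B_mix       # drifts from ~0.75 toward 1

print(stoichiogram_pure[:3], stoichiogram_mix[[0, 50, -1]])
```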


Author(s):  
Luís Moniz Pereira ◽  
Ari Saptawijaya

We address problems in machine ethics using computational techniques. Our research has focused on Computational Logic, particularly Logic Programming, and its appropriateness for modeling morality in the realm of the individual, namely moral permissibility, its justification, and the dual-process nature of moral judgments. In the collective realm, we have used Evolutionary Game Theory in populations of individuals to study the emergence of norms and morality computationally. To start with, these individuals are not equipped with much cognitive capability and simply act from a predetermined set of actions. Our research shows that introducing cognitive capabilities such as intention recognition, commitment, and apology, separately and jointly, reinforces the emergence of cooperation in populations, compared with their absence. Bridging such capabilities between the two realms helps us understand the emergent ethical behavior of agents in groups and implement it not just in simulations, but in the world of future robots and their swarms. Evolutionary Anthropology provides further lessons.
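As a hedged illustration of the population-level setting, the sketch below runs pairwise-imitation (Fermi rule) dynamics for a one-shot Prisoner's Dilemma, the kind of baseline on which mechanisms such as intention recognition, commitment, or apology are typically layered. The payoffs and update rule are generic textbook choices, not the authors' specific models.

```python
# Illustrative sketch only: pairwise-imitation dynamics for a one-shot
# Prisoner's Dilemma. Payoffs, parameters, and the Fermi update rule are
# generic textbook choices, not the authors' specific models.
import random

import numpy as np

R, S, T, P = 3.0, 0.0, 5.0, 1.0      # standard PD payoffs (T > R > P > S)
PAYOFF = {("C", "C"): R, ("C", "D"): S, ("D", "C"): T, ("D", "D"): P}


def average_payoff(strategy, population):
    """Expected payoff of `strategy` against a randomly drawn co-player."""
    return np.mean([PAYOFF[(strategy, other)] for other in population])


def imitation_step(population, beta=1.0):
    """One Fermi-rule imitation event: i copies j with payoff-dependent probability."""
    i, j = random.sample(range(len(population)), 2)
    pi_i = average_payoff(population[i], population)
    pi_j = average_payoff(population[j], population)
    if random.random() < 1.0 / (1.0 + np.exp(-beta * (pi_j - pi_i))):
        population[i] = population[j]


population = ["C"] * 50 + ["D"] * 50
for _ in range(5_000):
    imitation_step(population)
print("fraction of cooperators:", population.count("C") / len(population))
```

Without additional mechanisms, defection takes over; the paper's point is that capabilities such as commitment and apology, added on top of such a baseline, restore cooperation.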


2021 ◽  
Vol 12 ◽  
Author(s):  
Marco Ragni ◽  
Daniel Brand ◽  
Nicolas Riesterer

In the last few decades, the number of cognitive theories explaining human spatial relational reasoning has grown. Few of these theories have been implemented as computational models, however, and even fewer have been compared computationally with each other. A computational model comparison requires, among other things, a still missing quantitative benchmark of core spatial relational reasoning problems. By presenting a new evaluation approach, this paper addresses: (1) the development of a benchmark including participants' raw data, (2) the reimplementation, adaptation, and extension of existing cognitive models to predict individual responses, and (3) a thorough evaluation of the cognitive models on the benchmark data. The paper shifts the research focus of cognitive modeling from reproducing aggregated response patterns toward assessing the predictive power of models for the individual reasoner. It demonstrates that not all psychological effects can discriminate between theories. We discuss implications for modeling spatial relational reasoning.
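The shift toward individual-level evaluation can be sketched as follows: score each candidate model by its predictive accuracy for every participant rather than by its fit to aggregate frequencies. The data structures and toy "models" below are hypothetical placeholders, not the benchmark or models of the paper.

```python
# Illustrative sketch only: scoring competing cognitive models by their
# predictive accuracy for each individual reasoner, rather than by their fit
# to aggregate response frequencies. Data structures and model names are
# hypothetical placeholders.
from collections import defaultdict


def evaluate(models, trials):
    """Return per-model, per-participant predictive accuracy.

    `trials` is a list of dicts with keys: participant, task, response.
    Each model maps a task description to a predicted response.
    """
    hits = defaultdict(lambda: defaultdict(int))
    counts = defaultdict(int)
    for trial in trials:
        counts[trial["participant"]] += 1
        for name, model in models.items():
            if model(trial["task"]) == trial["response"]:
                hits[name][trial["participant"]] += 1
    return {
        name: {p: hits[name][p] / counts[p] for p in counts}
        for name in models
    }


# Two toy "models" of a transitive spatial inference (A left of B,
# B left of C -> where is A relative to C?).
models = {
    "always-left": lambda task: "left",
    "always-right": lambda task: "right",
}
trials = [
    {"participant": "p1", "task": "A L B; B L C; A ? C", "response": "left"},
    {"participant": "p1", "task": "C R B; B R A; A ? C", "response": "left"},
    {"participant": "p2", "task": "A L B; B L C; A ? C", "response": "right"},
]
print(evaluate(models, trials))
```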


2018 ◽  
Vol 5 (2) ◽  
pp. 240-246 ◽  
Author(s):  
Ron Sun

Cognitive social simulation lies at the intersection of cognitive modeling and social simulation, two forms of computer-based, quantitative modeling and understanding. Cognitive modeling centers on producing precise computational or mathematical models of mental processes (such as human reasoning or decision making), while social simulation focuses on precise models of social processes (such as group discussion or collective decision making). By combining cognitive and social models, cognitive social simulation is poised to address issues concerning both individual and social processes. Detailed simulation enables precise analysis of possible scenarios and outcomes, helping policy makers anticipate the implications of policies. Thus, cognitive social simulation will have practical applications in policy making in many areas that require understanding at both the individual and the aggregate level.
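A minimal sketch of what "combining cognitive and social models" can look like, under generic assumptions (not Sun's architecture): each agent carries a simple cognitive state (an opinion plus an openness parameter), and the social layer is random pairwise interaction with a bounded-confidence update.

```python
# Illustrative sketch only: a minimal cognitive social simulation. The
# "cognitive" part is each agent's opinion and openness parameter; the
# "social" part is random pairwise interaction. Parameters and the
# bounded-confidence update rule are generic, not Sun's architecture.
import numpy as np

rng = np.random.default_rng(0)
n_agents = 100
opinions = rng.uniform(-1.0, 1.0, n_agents)        # individual cognitive state
openness = rng.uniform(0.1, 0.9, n_agents)         # willingness to shift views


def interact(opinions, openness, steps=10_000, tolerance=0.5):
    """Bounded-confidence updates: an agent only moves toward a partner
    whose opinion lies within its tolerance, scaled by its openness."""
    for _ in range(steps):
        i, j = rng.integers(0, len(opinions), 2)
        if i != j and abs(opinions[i] - opinions[j]) < tolerance:
            opinions[i] += openness[i] * (opinions[j] - opinions[i]) * 0.5
    return opinions


final = interact(opinions.copy(), openness)
print("opinion clusters (rounded):", sorted(set(np.round(final, 1))))
```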


Author(s):  
Eldad Yechiam ◽  
Tim Rakow

We examined the relative weight given to obtained and foregone outcomes (i.e., outcomes from the non-chosen options) in repeated choices using cognitive modeling. Previous modeling studies have yielded mixed results. When participants’ choices are analyzed by models that predict the next choice ahead in a sequence of decisions, the results imply that people give less weight to foregone than to obtained outcomes. In contrast, in simulation models of n trials ahead, the results imply that, on average, people give equal weight to foregone and obtained outcomes. Using datasets of experience-based binary choices with fixed (stationary) payoff distributions (Erev & Haruvy, in press) and dynamic (nonstationary) payoff distributions (Rakow & Miler, 2009), we employed generalization tests at the individual level to examine whether the findings derived from the one-step-ahead method are due to overfitting. The results of trial-ahead model fitting implied that for the nonstationary tasks only, foregone outcomes received lower weight. However, when this dataset was assessed via generalization criteria at the individual level, equal weighting of foregone and obtained outcomes was the best assumption. This implies that overfitting is implicated in the superior fit of models that assume discounting of foregone outcomes.
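A hedged sketch of the kind of model at issue: a two-option reinforcement-learning rule with a separate weight on foregone payoffs, scored by one-step-ahead log-likelihood over a choice sequence. Parameter names and the exact update rule are assumptions for illustration, not the specific models compared in the paper.

```python
# Illustrative sketch only: a simple reinforcement-learning choice model with
# a separate weight on foregone payoffs, and a one-step-ahead log-likelihood
# for fitting it to a choice sequence. The parameterization is an assumption
# for illustration, not the models compared in the paper.
import numpy as np


def one_step_ahead_loglik(choices, obtained, foregone,
                          alpha=0.2, w_foregone=0.5, temperature=1.0):
    """Log-likelihood of each next choice given the history so far.

    choices:  sequence of 0/1 (which of two options was picked)
    obtained: payoff of the chosen option on each trial
    foregone: payoff the other option would have given on each trial
    """
    q = np.zeros(2)                      # value estimates for the two options
    loglik = 0.0
    for choice, got, missed in zip(choices, obtained, foregone):
        p = np.exp(q / temperature)
        p /= p.sum()
        loglik += np.log(p[choice])
        other = 1 - choice
        q[choice] += alpha * (got - q[choice])
        q[other] += alpha * w_foregone * (missed - q[other])   # discounted foregone update
    return loglik


choices = [0, 0, 1, 1, 1]
obtained = [1.0, 0.0, 2.0, 2.0, 1.5]
foregone = [2.0, 2.0, 0.5, 1.0, 0.0]
print(one_step_ahead_loglik(choices, obtained, foregone))
```

Setting w_foregone to 1 gives equal weighting of obtained and foregone outcomes; values below 1 implement the discounting of foregone outcomes that one-step-ahead fitting tends to favor.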

