Change and variation in responses to perceptual information.

Author(s):
Nancy de Villiers Rader

2018
Author(s):
Andrew D Wilson

Ever since Gibson proposed the concept of ‘affordances’ in 1979, we have been arguing about the best way to formalize the idea so that it can successfully explain behavior. The first approach was to consider affordances as dispositional properties of the task environment that can support skillful perception and action. A more recent approach considers them more broadly as relations between properties of organisms and their environments. This expands the spatial and temporal scope of affordances; we stand in many kinds of relations not only to our physical environments but also to our social and cultural ones. Relational affordances are therefore offered as an ecological way to explain behaviors in these domains. However, relational affordances do not, as a rule, interact with perceptual media and therefore do not create perceptual information about themselves. This means they cannot be perceived, which in turn means they cannot play any role in an ecological explanation of behavior. This paper briefly reviews the dispositional versus relational accounts of affordances, explains the problem, and proposes an information-based alternative (building on Golonka, 2015). Affordances are not relational, but fortunately information is, and this is where we should focus our attention.


2020
Vol 10 (1)
Author(s):
Martin Giesel
Anna Nowakowska
Julie M. Harris
Constanze Hesse

Abstract
When we use virtual and augmented reality (VR/AR) environments to investigate behaviour or train motor skills, we expect that the insights or skills acquired in VR/AR transfer to real-world settings. Motor behaviour is strongly influenced by perceptual uncertainty and the expected consequences of actions. VR/AR differ in both of these aspects from natural environments. Perceptual information in VR/AR is less reliable than in natural environments, and the knowledge of acting in a virtual environment might modulate our expectations of action consequences. Using mirror reflections to create a virtual environment free of perceptual artefacts, we show that hand movements in an obstacle avoidance task systematically differed between real and virtual obstacles, and that these behavioural differences occurred independently of the quality of the available perceptual information. This suggests that even when perceptual correspondence between natural and virtual environments is achieved, action correspondence does not necessarily follow, owing to the disparity in the expected consequences of actions in the two environments.


2019
Vol 29 (5)
pp. 676-696
Author(s):
Sabrina Golonka
Andrew D. Wilson

In 2010, Bechtel and Abrahamsen defined and described what it means to be a dynamic causal mechanistic explanatory model. They discussed the development of a mechanistic explanation of circadian rhythms as an exemplar of the process and challenged cognitive science to follow this example. This article takes on that challenge. A mechanistic model is one that accurately represents the real parts and operations of the mechanism being studied. These real components must be identified by an empirical programme that decomposes the system at the correct scale and localises the components in space and time. Psychological behaviour emerges from the nature of our real-time interaction with our environments—here we show that the correct scale to guide decomposition is picked out by the ecological perceptual information that enables that interaction. As proof of concept, we show that a simple model of coordinated rhythmic movement, grounded in information, is a genuine dynamical mechanistic explanation of many key coordination phenomena.
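The information-grounded model of coordinated rhythmic movement discussed in this abstract is not reproduced here. As a rough sketch of the kind of dynamical model at issue, the classic Haken-Kelso-Bunz (HKB) relative-phase equation can be integrated numerically; the parameter values below are illustrative assumptions, not values taken from the paper.

```python
import math

# Illustrative sketch only: the Haken-Kelso-Bunz (HKB) equation for the
# relative phase phi between two rhythmically moving limbs:
#     dphi/dt = -a*sin(phi) - 2*b*sin(2*phi)
# The parameters a = b = 1 are assumptions chosen so that both coordination
# modes (in-phase and anti-phase) are stable, which holds whenever b/a > 0.25.

def simulate_hkb(phi0: float, a: float = 1.0, b: float = 1.0,
                 dt: float = 0.01, steps: int = 5000) -> float:
    """Euler-integrate the HKB equation from initial relative phase phi0
    and return the final relative phase, wrapped to (-pi, pi]."""
    phi = phi0
    for _ in range(steps):
        dphi = -a * math.sin(phi) - 2 * b * math.sin(2 * phi)
        phi += dphi * dt
    return math.atan2(math.sin(phi), math.cos(phi))

# Initial phases near 0 settle into in-phase coordination (phi -> 0);
# initial phases near pi settle into anti-phase coordination (phi -> pi).
in_phase = simulate_hkb(0.3)
anti_phase = simulate_hkb(math.pi - 0.3)
```

The two attractors at 0 and pi correspond to the two stable coordination patterns (in-phase and anti-phase) that dominate the coordination phenomena such models are used to explain.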


2000
Vol 20 (22)
pp. RC108-RC108
Author(s):
Charan Ranganath
Marcia K. Johnson
Mark D'Esposito

2021
Author(s):
Mattia Rosso
Pieter-Jan Maes
Marc Leman

Abstract
Rhythmic joint coordination is ubiquitous in daily human activities. In order to coordinate their actions towards shared goals, individuals need to co-regulate their timing and move together at the collective level of behavior. Remarkably, basic forms of coordinated behavior tend to emerge spontaneously as long as two individuals are exposed to each other's rhythmic movements. The present study investigated the dynamics of spontaneous dyadic entrainment, and more specifically how they depend on the sensory modalities mediating the informational coupling. By means of a novel interactive paradigm, we showed that dyadic entrainment systematically takes place during a minimalistic rhythmic task despite explicit instructions to ignore the partner. Crucially, the interaction was organized by clear, modality-dependent dynamics: we observed highly consistent coordination patterns in visually mediated entrainment, whereas the auditorily mediated counterpart showed more chaotic and more variable profiles. The proposed experimental paradigm yields empirical evidence for the overwhelming tendency of dyads to behave as coupled rhythmic units. In the context of our experimental design, it showed that coordination dynamics differ according to the availability and nature of the perceptual information. Interventions aimed at rehabilitating, teaching or training sensorimotor functions can ultimately be informed and optimized by such fundamental knowledge.
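The study's paradigm is behavioral, but spontaneous entrainment of the kind it describes is commonly idealized with coupled phase oscillators. As a hedged illustration only (the model, the coupling strength K, and the natural frequencies are assumptions, not the authors' analysis), two mutually coupled Kuramoto-style oscillators lock their relative phase whenever the coupling exceeds their frequency detuning:

```python
import math

# Minimal sketch, not the authors' model: two mutually coupled phase
# oscillators as an idealization of dyadic entrainment. With detuning
# dw = w2 - w1 and total coupling K, the relative phase psi obeys
#     dpsi/dt = dw - K*sin(psi)
# and locks at psi* = asin(dw/K) whenever |dw| <= K.

def entrain(w1: float = 2.0, w2: float = 2.3, K: float = 0.5,
            dt: float = 0.001, steps: int = 50000) -> float:
    """Euler-integrate both oscillators and return the final relative
    phase, wrapped to (-pi, pi]."""
    th1, th2 = 0.0, 0.0
    for _ in range(steps):
        d1 = w1 + (K / 2) * math.sin(th2 - th1)  # each partner adjusts
        d2 = w2 + (K / 2) * math.sin(th1 - th2)  # toward the other
        th1, th2 = th1 + d1 * dt, th2 + d2 * dt
    psi = th2 - th1
    return math.atan2(math.sin(psi), math.cos(psi))

# Here dw = 0.3 < K = 0.5, so the pair phase-locks: sin(psi) -> dw/K = 0.6.
# With K = 0 the relative phase would drift indefinitely instead.
locked = entrain()
```

Varying K per sensory channel is one simple way such a model could express the finding that visually and auditorily mediated entrainment show different coordination profiles.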


2018
Vol 47 (1)
pp. 106-116
Author(s):
Karlos Luna
Pedro B. Albuquerque
Beatriz Martín-Luengo

Fast track article for IS&T International Symposium on Electronic Imaging 2021: Human Vision and Electronic Imaging proceedings.

