TouchScope: A passive-haptic reading device to investigate tactile perception

2021
Author(s):
Ana Baciero
Manuel Perea
Jon Andoni Duñabeitia
Pablo Gomez

The sense of touch is underrepresented in cognitive psychology research. One reason is that controlling the timing of stimulus presentation, a hallmark of cognitive research, is considerably more difficult for tactile stimuli than for visual or auditory stimuli. To contribute to the development of tactile research, we present a system that displays tactile stimuli and collects response times, with the capability to present both static and dynamic (passive haptic) stimuli. While the system requires some construction, it can be assembled from commercially available materials. We describe the hardware and software implementation and provide examples of experiments.
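The abstract does not reproduce the implementation, but the general approach of host-side response-time collection can be sketched. The example below is a hypothetical illustration only, not the TouchScope protocol: it assumes a microcontroller that reports stimulus onsets and button presses as text lines over a serial link, read with the pyserial library.

```python
# Hypothetical sketch: collecting response times for tactile stimuli.
# Assumes a microcontroller that prints "STIM <trial_id>" when a tactile
# stimulus is delivered and "RESP <trial_id>" when the participant responds.
# This is NOT the TouchScope protocol, only an illustration.
import time
import serial  # pyserial

def run_block(port="/dev/ttyACM0", n_trials=10):
    """Collect response times (in seconds), keyed by trial id."""
    onsets, rts = {}, {}
    with serial.Serial(port, baudrate=115200, timeout=1) as dev:
        while len(rts) < n_trials:
            line = dev.readline().decode("ascii", errors="ignore").strip()
            if line.startswith("STIM"):
                onsets[line.split()[1]] = time.perf_counter()
            elif line.startswith("RESP"):
                trial = line.split()[1]
                if trial in onsets:
                    rts[trial] = time.perf_counter() - onsets[trial]
    return rts
```

In practice, timestamping on the microcontroller itself would avoid the latency and jitter introduced by host-side reads; the sketch only shows how onset and response events could be paired into response times.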

1997
Vol 272 (2)
pp. R648-R655
Author(s):
M. R. Opp
L. A. Toth
E. A. Tolley

Slow-wave activity in the electroencephalogram is thought to reflect the depth or intensity of sleep. This hypothesis is derived primarily from studies of rats or humans. However, some characteristics of sleep in rabbits differ from those of rats or humans. To determine whether slow-wave activity (power density in the delta frequency band of 0.5-5.0 Hz) correlates with arousability in rabbits, we presented auditory stimuli (72-90 dB) to control or sleep-deprived animals during slow-wave sleep. The resulting behavioral responses, defined by changes in eye state and body posture, and the latency to return to sleep were used as measures of arousability. Behavioral responsiveness to auditory stimuli increased with increasing stimulus intensity in both control and sleep-deprived animals. Overall, however, sleep-deprived animals exhibited fewer postural changes and eye openings than did control rabbits. Sleep-deprived rabbits also returned to sleep more rapidly after stimulus presentation than did control animals. Latency to return to sleep was correlated with delta power before stimulus presentation, but behavioral responsiveness was not. These data suggest that, in this rabbit model, delta power may not be predictive of behavioral arousability but may reflect sleep propensity.
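Slow-wave activity is defined here as power density in the 0.5-5.0 Hz delta band. As a minimal sketch of that quantity (not the authors' analysis pipeline), delta power can be estimated from an EEG segment with a standard Welch periodogram:

```python
# Minimal sketch: estimate delta-band (0.5-5.0 Hz) power from an EEG segment.
# Illustrates the quantity described in the abstract, not the authors' code.
import numpy as np
from scipy.signal import welch

def delta_power(eeg, fs, band=(0.5, 5.0)):
    """Mean power spectral density within the delta band."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))  # 4-second windows
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Example with synthetic data: 30 s of noise sampled at 250 Hz
fs = 250
eeg = np.random.randn(30 * fs)
print(delta_power(eeg, fs))
```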


2007
Vol 29 (4)
pp. 457-478
Author(s):
Derek T.Y. Mann
A. Mark Williams
Paul Ward
Christopher M. Janelle

Research focusing on perceptual-cognitive skill in sport is abundant. However, the existing qualitative syntheses of this research lack the quantitative detail necessary to determine the magnitude of differences between groups of varying skill levels, thereby limiting the theoretical and practical contribution of this body of literature. We present a meta-analytic review focusing on perceptual-cognitive skill in sport (N = 42 studies, 388 effect sizes) with the primary aim of quantifying expertise differences. Effects were calculated for a variety of dependent measures (i.e., response accuracy, response time, number of visual fixations, visual fixation duration, and quiet eye period) using the point-biserial correlation. Results indicated that experts are better than nonexperts at picking up perceptual cues, as revealed by measures of response accuracy and response time. Systematic differences in visual search behaviors were also observed, with experts using fewer fixations of longer duration, including prolonged quiet eye periods, compared with nonexperts. Several factors (e.g., sport type, research paradigm employed, and stimulus presentation modality) significantly moderated the relationship between level of expertise and perceptual-cognitive skill. Practical and theoretical implications are presented and suggestions for empirical work are provided.
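Effect sizes in the review are expressed as point-biserial correlations between group membership (expert vs. nonexpert) and a continuous dependent measure. A hedged illustration with made-up data, using scipy's implementation:

```python
# Illustration only: computing a point-biserial effect size for an
# expert vs. nonexpert comparison on a continuous measure (e.g., response time).
import numpy as np
from scipy.stats import pointbiserialr

# Hypothetical data: 1 = expert, 0 = nonexpert; values are response times (ms)
group = np.array([1] * 20 + [0] * 20)
rt = np.concatenate([np.random.normal(450, 40, 20),   # experts: faster
                     np.random.normal(520, 40, 20)])  # nonexperts: slower

r_pb, p_value = pointbiserialr(group, rt)
print(f"point-biserial r = {r_pb:.2f}, p = {p_value:.3f}")
```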


2006
Vol 95 (2)
pp. 995-1007
Author(s):
Rory Sayres
Kalanit Grill-Spector

Object-selective cortical regions exhibit a decreased response when an object stimulus is repeated [repetition suppression (RS)]. RS is often associated with priming: reduced response times and increased accuracy for repeated stimuli. It is unknown whether RS reflects stimulus-specific repetition, the associated changes in response time, or the combination of the two. To address this question, we performed a rapid event-related functional MRI (fMRI) study in which we measured BOLD signal in object-selective cortex, as well as object recognition performance, while we manipulated stimulus repetition. Our design allowed us to examine separately the roles of response time and repetition in explaining RS. We found that repetition played a robust role in explaining RS: repeated trials produced weaker BOLD responses than nonrepeated trials, even when comparing trials with matched response times. In contrast, response time played a weak role in explaining RS when repetition was controlled for: it explained BOLD responses only for one region of interest (ROI) and one experimental condition. Thus repetition suppression seems to be mostly driven by repetition rather than performance changes. We further examined whether RS reflects processes occurring at the same time as recognition or after recognition by manipulating stimulus presentation duration. In one experiment, durations were longer than required for recognition (2 s), whereas in a second experiment, durations were close to the minimum time required for recognition (85–101 ms). We found significant RS for brief presentations (albeit with a reduced magnitude), which again persisted when controlling for performance. This suggests a substantial amount of RS occurs during recognition.
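The central analytic step described above is comparing repeated and nonrepeated trials at matched response times. One generic way to approximate such a comparison, sketched here with hypothetical column names and without claiming to reproduce the authors' procedure, is to bin trials by response time and average the repetition effect within bins:

```python
# Hypothetical sketch: compare BOLD responses for repeated vs. nonrepeated
# trials within response-time bins, so that any remaining difference cannot be
# attributed to response-time differences. Not the authors' actual analysis.
import pandas as pd

def rt_matched_repetition_effect(df, n_bins=5):
    """df has columns: 'bold' (response amplitude), 'rt' (s), 'repeated' (bool)."""
    df = df.copy()
    df["rt_bin"] = pd.qcut(df["rt"], q=n_bins, labels=False)
    means = df.groupby(["rt_bin", "repeated"])["bold"].mean().unstack()
    # Mean repetition effect across RT bins (repeated minus nonrepeated)
    return (means[True] - means[False]).mean()
```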


2012
Vol 2012
pp. 1-13
Author(s):
Katri Salminen
Veikko Surakka
Jani Lylykangas
Jussi Rantala
Teemu Ahmaniemi
...  

Traditionally, only speech has communicated emotions via mobile phone. However, in daily communication the sense of touch also mediates emotional information during conversation. The present aim was to study whether tactile stimulation affects emotional ratings of speech as measured with scales of pleasantness, arousal, approachability, and dominance. In Experiment 1, participants rated speech-only and speech-tactile stimuli. The tactile signal mimicked the amplitude changes of the speech. In Experiment 2, the aim was to study whether the way the tactile signal was produced affected the ratings. The tactile signal either mimicked the amplitude changes of the speech sample in question or the amplitude changes of another speech sample; a concurrent static vibration condition was also included. The results showed that the speech-tactile stimuli were rated as more arousing and dominant than the speech-only stimuli. The speech-only stimuli were rated as more approachable than the speech-tactile stimuli, but only in Experiment 1. Variations in tactile stimulation also affected the ratings: when the tactile stimulation was static vibration, the speech-tactile stimuli were rated as more arousing than when the concurrent tactile stimulation mimicked speech samples. The results suggest that tactile stimulation offers new ways of modulating and enriching the interpretation of speech.
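The tactile signal in Experiment 1 "mimicked the amplitude changes of the speech." As a rough, hypothetical sketch of that general idea (not the stimulus-generation chain used in the study), a speech waveform's amplitude envelope can be extracted and used to modulate a vibrotactile carrier:

```python
# Hypothetical sketch: derive a vibration signal from a speech waveform by
# rectifying and low-pass filtering to get the amplitude envelope, then
# modulating a fixed-frequency carrier. Illustration only, not the study's stimuli.
import numpy as np
from scipy.signal import butter, filtfilt

def speech_to_vibration(speech, fs, carrier_hz=200.0, cutoff_hz=20.0):
    # Amplitude envelope: full-wave rectification + low-pass filter
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")
    envelope = np.clip(filtfilt(b, a, np.abs(speech)), 0, None)
    # Modulate a sinusoidal carrier suitable for a vibrotactile actuator
    t = np.arange(len(speech)) / fs
    return envelope * np.sin(2 * np.pi * carrier_hz * t)
```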


2019
Vol 13 (6)
pp. 901-916
Author(s):  
Tim Fawns

This article offers a framework for understanding how different kinds of memory work together in interaction with people, photographs and other resources. Drawing on evidence from two qualitative studies of photography and memory, as well as literature from cognitive psychology, distributed cognition and media studies, I highlight complexities that have seldom been taken into account in cognitive psychology research. I then develop a ‘blended memory’ framework in which memory and photography can be interdependent, blending together as part of a wider activity of distributed remembering that is structured by interaction and phenomenology. In contrast to studies of cued recall, which commonly feature isolated categories or single instances of recall, this framework takes account of people’s histories of photographic practices and beliefs to explain the long-term convergence of episodic, semantic and inferential memory. Finally, I discuss implications for understanding and designing future memory research.


Author(s):  
Patrick Bruns
Brigitte Röder

It is well known that spatial discrepancies between synchronized auditory and visual events can lead to mislocalizations of the auditory stimulus toward the visual stimulus, the so-called ventriloquism effect. Recently, a similar effect of touch on audition has been reported. This study investigated whether this audio-tactile ventriloquism effect depends on hand posture. Participants reported the perceived location of brief auditory stimuli that were presented from left, right, and center locations, either alone or with concurrent tactile stimuli to the fingertips situated at the left and right sides of the speaker array. Compared to unimodal presentations, auditory localization was biased toward the side of the concurrent tactile stimulus in the bimodal trials. This effect was reduced but still significant when participants adopted a crossed-hands posture. In this condition a partial (incomplete) localization bias was observed only for large audio-tactile spatial discrepancies. However, localization was still shifted toward the external location of the tactile stimulus, and not toward the side of the anatomical hand that was stimulated. These results substantiate recent evidence for the existence of an audio-tactile ventriloquism effect and extend these findings by demonstrating that this illusion operates predominantly in an external coordinate system.


2014
Vol 2 (2)
pp. 129-144
Author(s):
Charles Viau-Quesnel
Rémi Gaudreault
Andrée-Anne Ouellet
Claudette Fortin

Tones are perceived as longer than visual stimuli of the same duration. One interpretation of this modality effect is that auditory stimuli capture attention more easily than visual stimuli, resulting in more efficient temporal processing. During time interval production, expecting a break signal lengthens the produced interval, an effect explained by attention sharing between timing and monitoring for the signal's occurrence. In the present study, participants produced a brief time interval defined by a visual or an auditory stimulus, and in most trials there was a break in stimulus presentation. The effect of break expectancy was significantly stronger when the timing stimulus was presented in the visual than in the auditory modality, an interaction supporting attentional interpretations of the modality and expectancy effects. We conclude that auditory stimuli orient attention to time more readily than visual stimuli in a context of attention sharing, which reduces the distracting effect of break expectancy.

