tactile navigation
Recently Published Documents

Total documents: 25 (last five years: 6)
H-index: 6 (last five years: 1)

2021 ◽  
Vol 13 (21) ◽  
pp. 11613
Author(s):  
Nora Weinberger ◽  
Silvia Woll ◽  
Christopher Conrad Maximillian Kyba ◽  
Nona Schulte-Römer

The participation of citizens in scientific research has a long tradition, and in some disciplines, especially medical research, it is even common practice. In Technology Assessment (TA), Responsible Research and Innovation (RRI), and Sustainable Development (SD), the participation of citizens can be of considerable value. In this paper, we explore this value for the three concepts, based on the researchers’ insights from three participatory research projects. The first project is the citizen science project TeQfor1, which was conducted with, for, and on the type 1 diabetes community, who do not feel adequately supported by the conventional health care system. In the second project, citizens with vision impairments participated in the technological development of an audio-tactile navigation tool in the TERRAIN project. The third project (Nachtlichter) dealt with light pollution. Based on the three projects presented, we show that citizen participation makes specific contributions to TA, RRI, and SD. We also investigate the specificity of citizen engagement and motivation by differentiating between existing and emerging involvement. In conclusion, we discuss the benefits that participatory approaches may add to the three concepts of TA, RRI, and SD.


2021 ◽  
Vol 28 (4) ◽  
pp. 1-35
Author(s):  
Oliver Beren Kaul ◽  
Michael Rohs ◽  
Marc Mogalle ◽  
Benjamin Simon

Tactile patterns are a means to convey navigation instructions to pedestrians and are especially helpful for people with visual impairments. This article presents a concept to provide precise micro-navigation instructions through a tactile around-the-head display. Our system presents four tactile patterns for fundamental navigation instructions in conjunction with continuous directional guidance. We followed an iterative, user-centric approach to design the patterns for the fundamental navigation instructions, combined them with a continuous directional guidance stimulus, and tested our system with 13 sighted (blindfolded) and 2 blind participants in an obstacle course, including stairs. We optimized the patterns and validated the final prototype with another five blind participants in a follow-up study. The system steered our participants successfully with a 5.7 cm average absolute deviation from the optimal path. Our guidance is only a little less precise than the usual shoulder wobbling during normal walking and an order of magnitude more precise than previous tactile navigation systems. Our system allows various new use cases of micro-navigation for people with visual impairments, e.g., preventing collisions on a sidewalk or as an anti-veering tool. It also has applications in other areas, such as personnel working in low-vision environments (e.g., firefighters).
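The continuous directional guidance described above can be pictured as mapping a target bearing onto a ring of vibration actuators around the head. The sketch below is purely illustrative: the actuator count, even spacing, and linear intensity falloff are assumptions for exposition, not the published design of this system.

```python
def directional_cue(bearing_deg, n_actuators=8, max_intensity=1.0):
    """Map a target bearing (degrees, 0 = straight ahead, clockwise)
    to the index and drive intensity of the nearest actuator in an
    evenly spaced around-the-head ring.

    Illustrative assumptions: 8 actuators, even spacing, and a linear
    intensity falloff toward the midpoint between neighbours.
    """
    spacing = 360.0 / n_actuators
    bearing = bearing_deg % 360.0
    idx = int(round(bearing / spacing)) % n_actuators
    # Angular error between the bearing and the chosen actuator.
    err = abs((bearing - idx * spacing + 180.0) % 360.0 - 180.0)
    # Full intensity when aligned, fading linearly to the midpoint.
    intensity = max_intensity * (1.0 - err / (spacing / 2.0))
    return idx, max(0.0, intensity)
```

With 8 actuators, a bearing of 0° drives the front actuator at full intensity, while a bearing halfway between two actuators drives the nearer one weakly, so the wearer perceives direction as a graded, continuous cue rather than a discrete step.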


2020 ◽  
Vol 27 (5) ◽  
pp. 128-143
Author(s):  
A. A. Kurmangulov ◽  
E. E. Korchagin ◽  
Yu. S. Reshetnikova ◽  
N. I. Golovina ◽  
N. S. Brynza

Background. Given people’s stronger demands on healthcare systems, changes in the legal framework, and increasing competition, medical institutions are seeking new approaches and mechanisms for improving the comfort of medical care services. Navigation support, as part of the visualisation system in a medical unit, should comply with certain standards of colour design and be presented to patients in a comfortable way.

Objectives. To formulate basic principles for effective colour navigation support in medical facilities, using the best national and foreign practices of colour design for navigation and visualisation solutions in healthcare.

Methods. Relevant publications were mined in Scopus, Web of Science, MedLine, the Cochrane Library, eLibrary and PubMed. Search depth was limited to 10 years. The main keyword queries, in English and Russian, were “lean production”, “lean healthcare”, “lean medicine”, “navigation”, “visualization”, “lean polyclinic” and “qualimetry”. The data obtained were interpreted using legal, historical and descriptive analytical methods, as well as content analysis.

Results. Colour solutions in the Russian healthcare system are implemented in a variety of ways to support data visualisation. As part of a visual and tactile navigation system, colour solutions serve to highlight individual navigation objects, pattern space, order and structure navigation elements, and emphasise and manage relevant textual information. Incorrect colour navigation support can create waste, in the lean sense, in the main, auxiliary and maintenance processes of a medical facility. Russian federal legislation does not currently regulate colour solutions for visualisation support in medical institutions.

Conclusion. Colour solutions, as part of navigation support systems, should be regulated by federal and regional legal acts to allow qualimetric assessment and improvement of navigation and visual systems in medical facilities. High-quality design of navigation support requires detailed information about the managed institution.


Author(s):  
Oliver Korn ◽  
James Gay ◽  
Rúben Gouveia ◽  
Lea Buchweitz ◽  
Annika Sabrina Schulz ◽  
...  
Keyword(s):  

2019 ◽  
Author(s):  
Alireza Azarfar ◽  
Tansu Celikel

Navigation is the result of complex sensorimotor computation, which requires the integration of sensory information in allocentric and egocentric coordinates as the brain computes a motor plan to drive navigation. In this active sensing process, motor commands are adaptively regulated based on prior sensory information. In darkness, rodents commonly rely on their tactile senses, in particular their whiskers, to gather the necessary sensory information and instruct navigation. Previous research has shown that rodents can process whisker input to guide mobility even prior to whisking onset, by the end of the second postnatal week; however, when and how adaptive sensorimotor control of whisker position matures is still not known. Here, we addressed this question in rats longitudinally as animals searched for a stationary target in darkness. The results showed that juvenile rats perform object localization by controlling their body, but not whisker position, based on the expected location of the target. Adaptive, closed-loop control of whisker position matures only after the third postnatal week. Computational modeling of active whisking showed that the emergence of closed-loop control of whisker position and of reactive retraction, i.e. whisker retraction that keeps the duration of tactile sampling constant, facilitates the maturation of sensorimotor exploration strategies during active sensing. These results argue that adaptive motor control of the body and whiskers develops sequentially, and that sensorimotor control of whisker position emerges later in postnatal development, upon the maturation of intracortical sensorimotor circuits.
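The idea of reactive retraction — retracting the whisker a fixed sampling interval after contact, so touch duration is the same wherever the object sits — can be illustrated with a minimal simulated whisk cycle. Every parameter below (step size, angles, the post-contact sampling count) is an illustrative assumption, not the authors’ actual model.

```python
def whisk_cycle(contact_angle, max_protraction=60.0, step=2.0,
                sampling_steps=5):
    """Simulate one whisk: protract in fixed angular steps until the
    whisker reaches the object (contact_angle) or full protraction,
    then retract after a fixed number of post-contact sampling steps.

    Reactive retraction: the retraction trigger is a constant sampling
    duration after contact, so in-contact time does not depend on
    object position. All values are illustrative assumptions.

    Returns the list of whisker angles visited during protraction.
    """
    angle, trace, touched = 0.0, [], 0
    while angle < max_protraction:
        angle = min(angle + step, max_protraction)
        trace.append(angle)
        if contact_angle is not None and angle >= contact_angle:
            touched += 1
            if touched >= sampling_steps:  # constant-duration sampling
                break
    return trace
```

Running the cycle with a near object (20°) and a far one (40°) yields traces of different total length but the same number of in-contact steps, which is the constancy the abstract attributes to reactive retraction.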


2017 ◽  
Author(s):  
Suzanne Amador Kane ◽  
Daniel Van Beveren ◽  
Roslyn Dakin

Feathers act as vibrotactile sensors that can detect mechanical stimuli during avian flight and tactile navigation, suggesting that they may also detect stimuli during social displays. In this study, we present the first measurements of the biomechanical properties of the feather crests found on the heads of birds, with an emphasis on those from the Indian peafowl (Pavo cristatus). We show that in peafowl these crest feathers are coupled to filoplumes, small feathers known to function as mechanosensors. We also determined that airborne stimuli with the frequencies used during peafowl courtship and social displays couple efficiently via resonance to the vibrational response of their feather crests. Specifically, vibrational measurements showed that although different types of feathers have a wide range of fundamental resonant frequencies, peafowl crests are driven near-optimally by the shaking frequencies used by peacocks performing train-rattling displays. Peafowl crests were also driven to vibrate near resonance in a playback experiment that mimicked the effect of these mechanical sounds in the acoustic very near-field, reproducing the way peafowl displays are experienced at distances ≤ 1.5 m in vivo. When peacock wing-shaking courtship behaviour was simulated in the laboratory, the resulting airflow excited measurable vibrations of crest feathers. These results demonstrate that peafowl crests have mechanical properties that allow them to respond to airborne stimuli at the frequencies typical of this species’ social displays. This suggests a new hypothesis that mechanosensory stimuli could complement acoustic and visual perception and/or proprioception of social displays in peafowl and other bird species. We suggest behavioral studies to explore these ideas and their functional implications.
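The resonance argument can be made concrete with a textbook single-mode sketch: the steady-state amplitude of a driven damped harmonic oscillator peaks when the drive frequency matches the fundamental resonance. Treating a crest feather as one damped mode with quality factor q is a simplifying assumption for illustration; the study measured real, multi-mode feathers.

```python
import math

def steady_state_amplitude(f_drive, f0, q, force=1.0):
    """Relative steady-state amplitude of a damped harmonic oscillator
    (a single-mode stand-in for a crest feather) driven at f_drive,
    given fundamental resonant frequency f0 and quality factor q.

    Standard driven-oscillator response; f0 and q here are assumed
    illustrative values, not measured feather parameters.
    """
    r = f_drive / f0
    return force / math.sqrt((1.0 - r**2) ** 2 + (r / q) ** 2)
```

At resonance (r = 1) the amplitude is q times the static response, while driving well below or above f0 yields a much smaller response — which is why shaking frequencies near the crest’s fundamental resonance transfer vibration most efficiently.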


2017 ◽  
Vol 139 (03) ◽  
pp. 36-41
Author(s):  
John Kosowatz

This article provides an overview of applications of high-tech sensors, visual detection software, and mobile computing power that are being developed to enable visually impaired people to navigate. By adapting technology developed for robots, automobiles, and other products, researchers and developers are creating wearable devices that can aid the visually impaired as they navigate through their daily routines, even identifying people and places. The Eyeronman system, developed by NYU’s Visuomotor Integration Laboratory and Tactile Navigation Tools, combines a sensor-laden outer garment or belt with a vest studded with vibrating actuators. The sensors detect objects in the immediate environment and relay their locations via buzzes on the wearer’s torso. At OrCam, a computer vision company in Jerusalem, a team of programmers, computer engineers, and hardware designers has developed the MyEye device, which attaches to the temple of a pair of eyeglasses. The device instructs the user on how to store items in memory, including things such as credit cards and the faces of friends and family.

