Mobile Text Input with Soft Keyboards: Optimization by Means of Visual Clues

Author(s): Laurent Magnien ◽ Jean Léon Bouraoui ◽ Nadine Vigouroux
2020 ◽ Vol 6 (3) ◽ pp. 571-574
Author(s): Anna Schaufler ◽ Alfredo Illanes ◽ Ivan Maldonado ◽ Axel Boese ◽ Roland Croner ◽ ...

In robot-assisted procedures, the surgeon controls the surgical instruments from a remote console while visually monitoring the procedure through the endoscope. No haptic feedback is available to the surgeon, which impedes the assessment of diseased tissue and the detection of hidden structures beneath the tissue, such as vessels. Only visual clues are available to the surgeon for controlling the force applied to the tissue by the instruments, which poses a risk of iatrogenic injury. Providing the surgeon with additional information on the haptic interaction between the instruments and the treated tissue during robotic surgery could compensate for this deficit. Acoustic emissions (AE) from instrument/tissue interactions, transmitted along the instrument, are a potential source of this information. AE can be recorded by audio sensors that do not have to be integrated into the instruments but can instead be modularly attached to the outside of the instrument's shaft or enclosure. The location of the sensor on a robotic system is essential for the applicability of the concept in real situations: while the signal strength of the acoustic emissions decreases with distance from the point of interaction, an installation close to the patient would require sterilization measures. The aim of this work is to investigate whether it is feasible to install the audio sensor in non-sterile areas far away from the patient and still receive useful AE signals. To determine whether signals can be recorded at different potential mounting locations, instrument/tissue interactions with different textures were simulated in an experimental setup. The results showed that meaningful and valuable AE can be recorded in the non-sterile area of a robotic surgical system despite the expected signal losses.
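
By way of illustration only, the minimal Python sketch below shows one way recordings from different candidate mounting positions could be compared; the file names, frequency band, and the band-energy/SNR metric are assumptions for the sketch, not the authors' actual evaluation pipeline.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfiltfilt

def band_energy(signal, fs, lo=1_000.0, hi=20_000.0):
    """Mean energy of the signal in the band where AE activity is assumed to lie.
    Assumes the recording is sampled well above 2*hi."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, signal.astype(float))
    return float(np.mean(filtered ** 2))

def snr_db(contact, idle, fs):
    """SNR of an interaction recording relative to an idle (no-contact) recording."""
    return 10.0 * np.log10(band_energy(contact, fs) / band_energy(idle, fs))

# Hypothetical recordings from three mounting positions (file names are placeholders).
positions = {
    "instrument_shaft": ("shaft_contact.wav", "shaft_idle.wav"),
    "robot_arm": ("arm_contact.wav", "arm_idle.wav"),
    "non_sterile_base": ("base_contact.wav", "base_idle.wav"),
}

for name, (contact_file, idle_file) in positions.items():
    fs, contact = wavfile.read(contact_file)
    _, idle = wavfile.read(idle_file)
    print(f"{name}: SNR = {snr_db(contact, idle, fs):.1f} dB")
```

A position would be considered usable if interaction events remain clearly separable from the idle baseline after the attenuation along the transmission path.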


Sensors ◽ 2021 ◽ Vol 21 (15) ◽ pp. 5178
Author(s): Sangbong Yoo ◽ Seongmin Jeong ◽ Seokyeon Kim ◽ Yun Jang

Gaze movement and visual stimuli have been utilized to analyze human visual attention intuitively. Gaze behavior studies mainly present statistical analyses of eye movements and human visual attention. In these analyses, the eye movement data and the saliency map are shown to analysts either as separate views or as merged views. However, analysts become frustrated when they have to memorize all of the separate views, or when the eye movements obscure the saliency map in the merged views. It is therefore not easy to analyze how visual stimuli affect gaze movements, since existing techniques focus excessively on the eye movement data. In this paper, we propose a novel visualization technique for analyzing gaze behavior that uses saliency features as visual clues to express an observer's visual attention. The visual clues that represent visual attention are analyzed to reveal which saliency features are prominent for the visual stimulus analysis. We visualize the gaze data together with the saliency features to interpret visual attention, and we analyze gaze behavior with the proposed visualization to evaluate whether embedding saliency features within the visualization helps us understand an observer's visual attention.
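
As a loose illustration of carrying saliency as a visual clue on the gaze data itself, the Python sketch below colors each gaze sample by the saliency value of the stimulus underneath it; the gradient-based saliency proxy and the synthetic stimulus/scanpath are assumptions for the sketch, not the authors' saliency features or data.

```python
import numpy as np
import matplotlib.pyplot as plt

def gradient_saliency(image_gray):
    """Very simple saliency proxy: normalized local gradient magnitude.
    A stand-in for a real saliency model."""
    gy, gx = np.gradient(image_gray.astype(float))
    sal = np.hypot(gx, gy)
    return (sal - sal.min()) / (np.ptp(sal) + 1e-9)

def plot_gaze_with_saliency(image_gray, gaze_xy):
    """Draw the scanpath and color each gaze sample by the saliency beneath it."""
    sal = gradient_saliency(image_gray)
    xs = np.clip(gaze_xy[:, 0].astype(int), 0, sal.shape[1] - 1)
    ys = np.clip(gaze_xy[:, 1].astype(int), 0, sal.shape[0] - 1)
    values = sal[ys, xs]

    fig, ax = plt.subplots()
    ax.imshow(image_gray, cmap="gray")
    ax.plot(xs, ys, linewidth=0.5, alpha=0.4)               # scanpath
    sc = ax.scatter(xs, ys, c=values, cmap="viridis", s=25)  # saliency-colored fixations
    fig.colorbar(sc, ax=ax, label="saliency at gaze sample")
    plt.show()

# Synthetic demo data: a random stimulus and a random scanpath.
rng = np.random.default_rng(0)
stimulus = rng.random((480, 640))
gaze = np.column_stack([rng.uniform(0, 640, 50), rng.uniform(0, 480, 50)])
plot_gaze_with_saliency(stimulus, gaze)
```

Because the saliency value travels with each gaze point, the saliency map never has to be shown as a separate or occluding layer, which is the frustration the abstract describes.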


2012 ◽ Vol 55 ◽ pp. e131-e132
Author(s): S. Pouplin ◽ J. Robertson ◽ J.-Y. Antoine ◽ A. Blanchet ◽ J.-L. Kahloun ◽ ...

2011 ◽ Vol 135-136 ◽ pp. 944-949
Author(s): Ji Quan Yu ◽ Wan Tao Qian ◽ Xin Gang He

The PAC (programmable automation controller) is a new trend in industrial controllers, but most IDEs (integrated development environments) still offer users only the PLC programming mode, which cannot take full advantage of a PAC. Furthermore, in China there is still no such IDE with complete intellectual property rights for PACs designed by Chinese companies. To address this, the CHD-PACIDE was implemented, supporting the ARM Cortex-Mx series of microcontrollers. The IDE consists of three layers, an interface layer, a data management layer, and a kernel layer, each managing its own functional modules. Based on Engineer C, a C-like language defined by our research team, the interface layer provides a structured graphical input mode and a text input mode for users to edit their code. The data management layer uses XML in a specified format to manage the flow of data. The kernel layer has two parts, implemented in the IDE and in the STM8S debug microcontroller; this layer is used to debug user code through the JTAG port under ARM's CoreSight debugging architecture. The IDE can be updated easily by adding the specific XML file for the new microcontroller used by a particular PAC.
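
The XML-driven extensibility described above can be sketched as follows; the XML schema, element names, and the `load_mcu_description` helper are purely hypothetical, since the actual format used by CHD-PACIDE is not given here.

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass

@dataclass
class McuDescription:
    """Data the data-management layer might hand to the interface and kernel layers."""
    name: str
    core: str
    flash_kb: int
    debug_port: str

def load_mcu_description(xml_path: str) -> McuDescription:
    """Parse a hypothetical per-microcontroller XML file used to extend the IDE."""
    root = ET.parse(xml_path).getroot()
    return McuDescription(
        name=root.get("name"),
        core=root.findtext("core"),
        flash_kb=int(root.findtext("flash_kb")),
        debug_port=root.findtext("debug_port"),
    )

# Example of the assumed XML format (written to disk only so the sketch is runnable).
example = """<mcu name="STM32F103">
  <core>Cortex-M3</core>
  <flash_kb>128</flash_kb>
  <debug_port>JTAG/CoreSight</debug_port>
</mcu>"""

with open("stm32f103.xml", "w", encoding="utf-8") as f:
    f.write(example)

mcu = load_mcu_description("stm32f103.xml")
print(f"Target: {mcu.name} ({mcu.core}), {mcu.flash_kb} KB flash, debug via {mcu.debug_port}")
```

Keeping the target description in a standalone XML file is what lets a new microcontroller be supported without rebuilding the IDE itself, which is the update path the abstract claims.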

