Optimal User Interface for a RIS by using Modern Input Devices

Author(s):  
J. Weber ◽  
Ch. Gärtner ◽  
S. Nissen-Meyer ◽  
U. Fink ◽  
Th. Hilbertz


2021 ◽  
Vol 7 (2) ◽  
pp. 211-214
Author(s):  
Max B. Schäfer ◽  
Bha A. Al-Abboodi ◽  
Peter P. Pott

In robotic telemanipulation for minimally invasive surgery, the lack of haptic sensation and the non-congruent movement of input device and manipulator are major drawbacks. Input devices based on cable-driven parallel mechanisms have the potential to be a stiff alternative to input devices based on rigid parallel or serial kinematics by offering low inertia and a scalable workspace. In this paper, the haptic user interface of a cable-driven input device and its technical specifications are presented and assessed. The haptic user interface allows the user to intuitively control the gripping movement of the manipulator's end effector via a two-finger precision grasp. By design, the interface allows input angles between 0° and 45° to be commanded. Furthermore, interaction forces from the manipulator's end effector can be displayed to the user's two-finger grasp in a range from 0 N to 6 N with a frequency bandwidth of 17 Hz.
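As an illustration of the ranges stated in the abstract, the sketch below maps a measured two-finger grip angle to a normalized gripper command and saturates the displayed force at the stated limit. The class and function names are assumptions made for this example; only the 0°–45° input range and the 0–6 N displayable force range come from the abstract.

```python
# Minimal sketch of the input/feedback mapping described above.
# The class and method names are illustrative assumptions, not the
# authors' implementation; only the ranges (0-45 deg input angle,
# 0-6 N display force) are taken from the abstract.

def clamp(value, low, high):
    """Limit a value to the closed interval [low, high]."""
    return max(low, min(high, value))


class TwoFingerHapticInterface:
    MAX_ANGLE_DEG = 45.0   # commandable input angle range: 0..45 deg
    MAX_FORCE_N = 6.0      # displayable interaction force: 0..6 N

    def grip_command(self, angle_deg: float) -> float:
        """Map the measured grip angle to a normalized gripper command (0..1)."""
        angle = clamp(angle_deg, 0.0, self.MAX_ANGLE_DEG)
        return angle / self.MAX_ANGLE_DEG

    def display_force(self, measured_force_n: float) -> float:
        """Clamp the end-effector interaction force to the displayable range."""
        return clamp(measured_force_n, 0.0, self.MAX_FORCE_N)


if __name__ == "__main__":
    ui = TwoFingerHapticInterface()
    print(ui.grip_command(30.0))    # -> 0.666... (normalized gripper command)
    print(ui.display_force(7.5))    # -> 6.0 (saturated at the display limit)
```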


Author(s):  
Mario Covarrubias ◽  
Michele Antolini ◽  
Monica Bordegoni ◽  
Umberto Cugini

This paper describes a multimodal system whose aim is to replicate, in a virtual reality environment, some typical operations performed by professional designers with real splines laid over the surface of a physical prototype of an aesthetic product, in order to better evaluate the characteristics of the shape they are creating. The system described is able not only to haptically render a continuous contact along a curve, by means of a servo-controlled haptic strip, but also to allow the user to modify the shape by applying force directly on the haptic device. The haptic strip is able to bend and twist in order to better approximate the portion of the surface of the virtual object over which the strip is lying. The device is 600 mm long and is controlled by 11 digital servos for the control of the shape (6 for bending and 5 for twisting) and by two MOOG-FCS HapticMaster devices plus two additional digital servos for 6-DOF positioning. We have developed additional input devices, integrated with the haptic strip, which consist of two force-sensitive handles positioned at the extremities of the strip, a capacitive linear touch sensor placed along the surface of the strip, and four buttons. These devices are used to interact with the system, to select menu options, and to apply deformations to the virtual object. The paper describes the interaction modalities and the developed user interface, the applied methodologies, the achieved results, and the conclusions drawn from the user tests.
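To make the bending idea concrete, the sketch below discretizes a target curve into per-servo turning angles for a strip with 6 bending actuators, as described above. The function and sampling scheme are assumptions made for this illustration; the strip's actual control software is not described in the abstract.

```python
# Illustrative sketch: sample the turning angle of a target curve at the
# positions of the 6 bending servos. Names and sampling are assumptions.
import math

NUM_BEND_SERVOS = 6  # the strip uses 6 servos for bending (plus 5 for twisting)

def bend_angles(curve_points):
    """Approximate the turning angle (degrees) at NUM_BEND_SERVOS evenly
    spaced samples along a densely sampled planar target curve.

    curve_points: list of (x, y) tuples tracing the curve the 600 mm strip
    should approximate.
    """
    n = len(curve_points)
    angles = []
    for k in range(1, NUM_BEND_SERVOS + 1):
        i = round(k * (n - 1) / (NUM_BEND_SERVOS + 1))
        (x0, y0), (x1, y1), (x2, y2) = curve_points[i - 1:i + 2]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        d = (a2 - a1 + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi)
        angles.append(math.degrees(d))
    return angles

# Example: a gentle circular arc sampled with 100 points.
arc = [(500 * math.cos(0.01 * i), 500 * math.sin(0.01 * i)) for i in range(100)]
print(bend_angles(arc))
```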


Author(s):  
Anu Sathyan ◽  
L C Manikandan

Touch screens are ever-present in our day-to-day lives; only a few people do without them. Touch screens were built for ease of work and to save time, and they serve as an assistive technology. This interface can be beneficial to those who have difficulty using other input devices such as a mouse or keyboard. When used in conjunction with software such as on-screen keyboards or other assistive technology, touch screens can make computing resources more available to people who have difficulty using computers. The touch screen is a widely used and still emerging technology that is sensitive to human touch, allowing a user to interact with the computer by touching pictures or words on the screen. It provides a very good user interface for applications that normally require a mouse. The touch screen interface is set to revolutionize interactive electronic devices in a big way. The purpose of this study is to analyse the various technologies used to build touch screens; it is also intended to help future researchers who study the topic.
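As a concrete illustration of the mechanism described (selecting a picture or word on the screen by touch instead of with a mouse), the sketch below performs the hit-testing step that an on-screen keyboard or similar touch interface needs: mapping a touch coordinate to the on-screen target it falls on. The key layout and names are made up for this example and are not taken from the study.

```python
# Minimal sketch of touch-to-target hit testing, the core step that lets a
# user select on-screen pictures or words by touch instead of with a mouse.
# The key layout below is a made-up example, not taken from the study.

# Each target: (label, x, y, width, height) in screen pixels.
onscreen_keys = [
    ("A", 0, 0, 100, 100),
    ("B", 100, 0, 100, 100),
    ("Enter", 200, 0, 200, 100),
]

def key_at(touch_x: int, touch_y: int):
    """Return the label of the key containing the touch point, or None."""
    for label, x, y, w, h in onscreen_keys:
        if x <= touch_x < x + w and y <= touch_y < y + h:
            return label
    return None

print(key_at(150, 50))   # -> "B"
print(key_at(500, 50))   # -> None (touch outside all keys)
```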


Author(s):  
Mikael Wiberg

Whether we think of interaction design as a design tradition aimed at giving form to the interaction with computational objects, or simply as user interface design, it is hard to escape the fact that the user interface to a large extent defines the scene and the form of the interaction. Without adopting a fully deterministic perspective, it is still a fact that if the user interface is screen-based and graphical and the input modality is mouse-based, then the form of that interaction, that is, what the turn-taking looks like and what is demanded of the user, is likely to be very similar to that of other screen-based interfaces with similar input devices. However, the design space for the form of interaction is growing fast. While command-based and text-based interfaces more or less defined the whole design space in the 1970s, the development since then, including novel ways of bringing sensors, actuators, and smart materials to the user interface, has certainly opened up a broader design space for interaction design. But it is not only the range of materials that has been extended over the last few decades; we have also moved through a number of form paradigms for interaction design. With this as a point of departure, I will in this chapter reflect on how we have moved from the early days of command-based user interfaces, via the use of metaphors in the design of graphical user interfaces (GUIs), towards ways of interacting with the computer via tangible user interfaces (TUIs). Further on, I will describe how this movement towards TUIs was a first step away from building user interfaces based on representations and metaphors and a first step towards material interactions.


1982 ◽  
Vol 26 (4) ◽  
pp. 300-300
Author(s):  
J. R. Kornfeld

Using video disc storage technology and alternative input devices, system designers can give end-users of interactive systems more flexible access to information. But improving accessibility does not automatically improve the usability of the information provided by such systems. To ensure that users not only understand but also make efficient use of information, human factors engineers need to develop new styles of structuring the information that must eventually be presented to end-users. New types of user-interface functions must be designed to give end-users better control over the means by which they can access and use the information presented by these systems. This paper summarizes the experience gained in improving the user-interface to an interactive video disc system, installed as a network of touch-sensitive terminals in a large public area. Questions are proposed to aid the human performance engineer in defining functional requirements, and methods are outlined for structuring the information content. Finally, step-by-step guidelines are offered for conducting structured walkthroughs of the user-interface design, and matrix formats are presented for documenting the results of these procedures.
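The abstract's "matrix formats" for documenting walkthrough results are not reproduced in this listing; the sketch below is only a hypothetical illustration of the general idea, recording observed problems per (user task, interface function) cell. All labels and issue text are invented for the example.

```python
# Hypothetical sketch of a walkthrough-results matrix: rows are user tasks,
# columns are interface functions, cells collect observed problems.
# Labels and issues are illustrative only, not from the paper.

user_tasks = ["Find a department", "Get directions", "Browse events"]
interface_functions = ["Touch menu", "Map display", "Help screen"]

# Each cell holds the problems observed for a (task, function) pair.
walkthrough_matrix = {
    (task, func): [] for task in user_tasks for func in interface_functions
}

walkthrough_matrix[("Find a department", "Touch menu")].append(
    "Label ambiguous: users expected an alphabetical listing"
)

def report(matrix):
    """Print only cells where at least one problem was recorded."""
    for (task, func), issues in matrix.items():
        for issue in issues:
            print(f"{task} / {func}: {issue}")

report(walkthrough_matrix)
```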


Author(s):  
Markos Mentzelopoulos ◽  
Jeffrey Ferguson ◽  
Aristidis Protopsaltis

The use of perceptual inputs is an emerging area within HCI that suggests a developing Perceptual User Interface (PUI) that may prove advantageous for those involved in mobile serious games and immersive social network environments. Since there are a large variety of input devices, software platforms, possible interactions, and myriad ways to combine all of the above elements in pursuit of a PUI, we propose in this paper a basic experimental framework that will be able to standardize the study of a wide range of interactive applications, testing their efficacy for learning or information retrieval and also suggesting improvements to emerging PUIs by enabling quick iteration. This rapid iteration will start to define a targeted range of interactions that are intuitive and comfortable as perceptual inputs, and that enhance learning and information retention in comparison to traditional GUI systems. The work focuses on the planning of the technical development of two scenarios, and on the first steps in developing a framework to evaluate these and other PUIs for efficacy and pedagogy.
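The paper's framework is only planned, so no interface is given in the abstract; the sketch below merely illustrates the kind of abstraction such a framework could standardize: every input modality is wrapped behind one interface and its events are logged uniformly so that efficacy metrics can be compared across devices. All class and method names are assumptions.

```python
# Hypothetical sketch of a standardizing abstraction for perceptual inputs.
# Names are assumptions; the paper's framework is only planned, not specified.
import time
from abc import ABC, abstractmethod

class PerceptualInput(ABC):
    """Common interface for any input modality (gaze, gesture, keyboard, ...)."""

    @abstractmethod
    def poll(self) -> dict:
        """Return the latest input event as a plain dictionary."""

class EventLog:
    """Uniform event log so learning/retention metrics can be compared later."""
    def __init__(self):
        self.events = []

    def record(self, device_name: str, event: dict):
        self.events.append({"t": time.time(), "device": device_name, **event})

class KeyboardFallback(PerceptualInput):
    """Baseline GUI-style input used for comparison against PUIs."""
    def poll(self) -> dict:
        return {"type": "key", "value": "space"}   # stub event

log = EventLog()
device = KeyboardFallback()
log.record("keyboard", device.poll())
print(log.events)
```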


2004 ◽  
Vol 4 (3) ◽  
pp. 178-185 ◽  
Author(s):  
Gerold Wesche

This paper presents a virtual environment based user interface for the conceptual design of free form surface models from scratch. The user performs sketching and elaboration directly within a projection-based, table-like environment. He uses head-tracked stereo glasses and simple input devices. We describe user interface components for creation, manipulation, and application control, which were specifically designed for use in a 3D environment. These components are part of a two-handed interaction scheme. In our modeling approach, the user draws curves and constructs a curve network that forms the skeleton of the surface. Automatic surfacing methods generate shapes that correspond to the outlined boundary, thus freeing the designer from specifying all surface parameters by hand. We demonstrate how the use of a virtual environment benefits such creation and manipulation tasks.
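The abstract does not specify which automatic surfacing method generates the shapes corresponding to the outlined boundary; as one classical illustration of surfacing from boundary curves, the sketch below builds a bilinearly blended Coons patch from four boundary curves of a curve network cell. This is an example technique, not necessarily the one used in the paper.

```python
# Illustration only: a bilinearly blended Coons patch interpolating four
# boundary curves, one classical way to surface an outlined boundary.

def coons_patch(c0, c1, d0, d1):
    """Return S(u, v) interpolating four boundary curves.

    c0(u) = S(u, 0), c1(u) = S(u, 1), d0(v) = S(0, v), d1(v) = S(1, v);
    the curves must meet at the corners. Each curve maps [0, 1] -> (x, y, z).
    """
    p00, p10 = c0(0.0), c0(1.0)
    p01, p11 = c1(0.0), c1(1.0)

    def surface(u, v):
        return tuple(
            (1 - v) * c0(u)[i] + v * c1(u)[i]
            + (1 - u) * d0(v)[i] + u * d1(v)[i]
            - ((1 - u) * (1 - v) * p00[i] + u * (1 - v) * p10[i]
               + (1 - u) * v * p01[i] + u * v * p11[i])
            for i in range(3)
        )
    return surface

# Example: a flat square patch from four straight boundary edges.
S = coons_patch(lambda u: (u, 0.0, 0.0), lambda u: (u, 1.0, 0.0),
                lambda v: (0.0, v, 0.0), lambda v: (1.0, v, 0.0))
print(S(0.5, 0.5))   # -> (0.5, 0.5, 0.0), the patch centre
```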


Author(s):  
M.A. O’Keefe ◽  
J. Taylor ◽  
D. Owen ◽  
B. Crowley ◽  
K.H. Westmacott ◽  
...  

Remote on-line electron microscopy is rapidly becoming more available as improvements continue to be developed in the software and hardware of interfaces and networks. Scanning electron microscopes have been driven remotely across both wide and local area networks. Initial implementations with transmission electron microscopes have targeted unique facilities such as an advanced analytical electron microscope, a biological 3-D IVEM, and an HVEM capable of in situ materials science applications. As implementations of on-line transmission electron microscopy become more widespread, it is essential that suitable standards be developed and followed. Two such standards have been proposed for a high-level protocol language for on-line access, and we have proposed a rational graphical user interface. The user interface we present here is based on experience gained with a full-function materials science application providing users of the National Center for Electron Microscopy with remote on-line access to a 1.5 MeV Kratos EM-1500 in situ high-voltage transmission electron microscope via existing wide area networks. We have developed and implemented, and are continuing to refine, a set of tools, protocols, and interfaces to run the Kratos EM-1500 on-line for collaborative research. Computer tools for capturing and manipulating real-time video signals are integrated into a standardized user interface that may be used for remote access to any transmission electron microscope equipped with a suitable control computer.
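The proposed high-level protocol language is not reproduced in this listing. The sketch below only illustrates the general pattern such on-line access implies: sending line-oriented text commands to an instrument's control computer over a network. The host, port, and command strings are invented for this example and are not the proposed standard.

```python
# Toy client for a line-oriented instrument-control protocol. The command
# vocabulary, host, and port are invented; they are NOT the protocol
# language proposed in the work described above.
import socket

class RemoteMicroscopeClient:
    """Minimal text-command client for a remote control computer."""

    def __init__(self, host: str, port: int = 5000):
        self.sock = socket.create_connection((host, port), timeout=10)
        self.reader = self.sock.makefile("r", encoding="ascii")

    def send_command(self, command: str) -> str:
        """Send one newline-terminated command and return the reply line."""
        self.sock.sendall((command + "\n").encode("ascii"))
        return self.reader.readline().strip()

    def close(self):
        self.reader.close()
        self.sock.close()

# Usage against a hypothetical control computer (commands are invented):
#   client = RemoteMicroscopeClient("microscope.example.org")
#   print(client.send_command("STAGE MOVE X 10.0 Y -2.5"))
#   print(client.send_command("MAG SET 50000"))
#   client.close()
```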

