Improving Pointing in Graphical User Interfaces for People With Motor Impairments Through Ability-Based Design

2019
pp. 1193-1243
Author(s):  
Jacob O. Wobbrock

Pointing to targets in graphical user interfaces remains a frequent and fundamental necessity in modern computing systems. Yet for millions of people with motor impairments, children, and older users, pointing—whether with a mouse cursor, a stylus, or a finger on a touch screen—remains a major access barrier because of the fine-motor skills required. In a series of projects inspired by and contributing to ability-based design, we have reconsidered the nature and assumptions behind pointing, resulting in changes to how mouse cursors work, the types of targets used, the way interfaces are designed and laid out, and even how input devices are used. The results from these explorations show that people with motor difficulties can acquire targets in graphical user interfaces when interfaces are designed to better match the abilities of their users. Ability-based design, as both a design philosophy and a design approach, provides a route to realizing a future in which people can utilize whatever abilities they have to express themselves not only to machines, but to the world.


Author(s):  
Sha Xin Wei

Since 1984, graphical user interfaces have typically relied on visual icons that mimic physical objects, like the folder, button, and trash can, or canonical geometric elements, like menus and spreadsheet cells. GUIs leverage our intuition about the physical environment. But the world can be thought of as being made of stuff as well as things. Making interfaces from this point of view requires a way to simulate the physics of stuff in real-time response to continuous gesture, driven by behavior logic that can be understood by both the user and the designer. The author argues for leveraging the corporeal intuition that people learn from birth about heat flow, water, and smoke to develop interfaces at the density of matter that in turn leverage the state of the art in computational physics.


Author(s):  
Richard Pekelney ◽  
Robin Chu

The rapid growth of graphical user interfaces on personal computers has led to the mouse input device playing a prominent and central role in the control of computer applications. As mouse use increases, design and comfort issues are becoming more and more critical. This report describes the ergonomic design criteria and resulting product attributes of a commercially successful computer mouse input device. Although well-founded ergonomic principles were incorporated into the design criteria, very little ergonomic research has been published on the design of mice. There is a need for additional research on the ergonomics of computer mouse input devices.


1995
Vol 8 (2)
pp. 99-108
Author(s):  
Alison Black ◽  
Jacob Buur

Solid user interfaces (SUIs) are the key to many of the electronic products that people use in their everyday personal and working lives: domestic and consumer equipment, telephones, ticket machines, measuring and metering devices, and so on. Despite their prevalence, they have received relatively little research attention compared to the graphical user interfaces of computing systems, and are rarely considered as a class. This paper identifies the characteristics of SUIs that can help or constrain people's interactions with products. It highlights techniques for analysis, design, and testing required to ensure that solid user interfaces are informative and easy to use.



Author(s):  
Noam Shemtov

This chapter examines the scope of protection for which graphical user interfaces may be eligible under various intellectual property rights, namely trade marks, unfair-competition laws, design rights, copyright, and patents. It first considers the extent of copyright protection over a software product’s ‘look-and-feel’ elements, with particular emphasis on the protection of graphical user interfaces under US and EU laws. It then discusses trade-mark, trade-dress, and unfair-competition protection for graphical user interfaces, along with protection through design patents and registered designs. Finally, it describes patent protection for graphical user interfaces in the United States and at the European Patent Office.


Author(s):  
Roman Bruch ◽  
Paul M. Scheikl ◽  
Ralf Mikut ◽  
Felix Loosli ◽  
Markus Reischl

Behavioral analysis of moving animals relies on a faithful recording and track analysis to extract relevant parameters of movement. To study group behavior and social interactions, often simultaneous analyses of individuals are required. To detect social interactions, for example to identify the leader of a group as opposed to followers, one needs an error-free segmentation of individual tracks throughout time. While automated tracking algorithms exist that are quick and easy to use, inevitable errors will occur during tracking. To solve this problem, we introduce a robust algorithm called epiTracker for segmentation and tracking of multiple animals in two-dimensional (2D) videos along with an easy-to-use correction method that allows one to obtain error-free segmentation. We have implemented two graphical user interfaces to allow user-friendly control of the functions. Using six labeled 2D datasets, the effort to obtain accurate labels is quantified and compared to alternative available software solutions. Both the labeled datasets and the software are publicly available.
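The abstract does not detail how epiTracker links detections into tracks. As a rough, generic illustration of the frame-to-frame linking step that any multi-animal tracker must perform, the short Python sketch below uses greedy nearest-centroid matching; the function name, distance threshold, and approach are illustrative assumptions, not epiTracker's published method.

import numpy as np

def link_tracks(prev_centroids, curr_centroids, max_dist=50.0):
    # Match each previous animal centroid to its nearest detection in the
    # current frame; return None for tracks with no detection within max_dist.
    assignments, used = [], set()
    for p in prev_centroids:
        dists = np.linalg.norm(curr_centroids - p, axis=1)
        match = None
        for j in np.argsort(dists):
            if j not in used and dists[j] <= max_dist:
                used.add(int(j))
                match = int(j)
                break
        assignments.append(match)
    return assignments

# Two animals appear in swapped list order between frames; linking by
# proximity recovers their identities.
prev_frame = np.array([[10.0, 10.0], [40.0, 40.0]])
curr_frame = np.array([[42.0, 39.0], [11.0, 12.0]])
print(link_tracks(prev_frame, curr_frame))  # -> [1, 0]

In practice, ambiguous matches produced by such greedy linking are exactly the kind of tracking errors the paper's correction interface is meant to let users repair.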

