HUMAN–COMPUTER INTERACTION: LESSONS FROM HUMAN-HUMAN COMMUNICATION

1990 ◽  
pp. 51-66 ◽  
Author(s):  
Pierre FALZON
Author(s):  
John Neumann ◽  
Jennifer M. Ross ◽  
Peter Terrence ◽  
Mustapha Mouloua

This report examines research trends over the years 1989–2004 as published in the International Journal of Human-Computer Interaction. Over this period, there has been a concerted effort by scholars and practitioners to bring issues such as interface design, usability engineering, human information processing, and user-centric system development into the mainstream consciousness of engineers and developers. Our research aims to provide both scholars and developers with information on past and current trends in the growing field of human-computer interaction (HCI). Using the PsycINFO journal database, we compiled an extensive Excel workbook containing relevant information on all the articles appearing in the journal since its inception. We then classified each document using the ACM SIGCHI taxonomy developed by Hewett et al., which permits classification of articles on six factors within one of 17 possible categories. Several other dimensions were examined, including year and period of publication (1989–1993; 1994–1999; 2000–2004), author affiliation, geographic location, number of empirical studies per paper, and average sample size per study. We also reported the classification of each article as given by PsycINFO. Besides the clear growth in the total number of articles published each period, our results indicate that the field of HCI has seen changes in research focus. Current trends point to an increase in research on developmental processes, usability evaluation methods, human communication and interaction, and applications. Another trend is a notable decrease in empirical studies using human participants over the 15-year period.
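The period bucketing and taxonomy tallying described in this abstract can be sketched in a few lines of Python. The helper, records, and category labels below are illustrative assumptions only; the study's actual data came from PsycINFO:

```python
from collections import Counter

# Hypothetical helper: bucket a publication year into the three
# periods used in the survey (1989-1993, 1994-1999, 2000-2004).
def period_of(year):
    if 1989 <= year <= 1993:
        return "1989-1993"
    if 1994 <= year <= 1999:
        return "1994-1999"
    if 2000 <= year <= 2004:
        return "2000-2004"
    raise ValueError(f"year {year} outside survey range")

# Invented records for illustration, not the study's dataset.
articles = [
    {"year": 1990, "category": "interface design"},
    {"year": 1996, "category": "usability evaluation"},
    {"year": 2001, "category": "usability evaluation"},
    {"year": 2003, "category": "applications"},
]

# Tally articles per (period, category) pair.
counts = Counter((period_of(a["year"]), a["category"]) for a in articles)
print(counts[("2000-2004", "usability evaluation")])  # 1
```

Trends across periods then fall out of comparing these counts between the three buckets.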


Author(s):  
Robert J. K. Jacob

The problem of human-computer interaction can be viewed as two powerful information processors (human and computer) attempting to communicate with each other via a narrow-bandwidth, highly constrained interface (Tufte, 1989). To address it, we seek faster, more natural, and more convenient means for users and computers to exchange information. The user's side is constrained by the nature of human communication organs and abilities; the computer's is constrained only by the input/output devices and interaction techniques we can invent. Current technology has been stronger in the computer-to-user direction than in the user-to-computer direction; hence today's user-computer dialogues are rather one-sided, with the bandwidth from computer to user far greater than that from user to computer. Using eye movements as a user-to-computer communication medium can help redress this imbalance. This chapter describes the relevant characteristics of the human eye, eye-tracking technology, how to design interaction techniques that incorporate eye movements into the user-computer dialogue in a convenient and natural way, and the relationship between eye-movement interfaces and virtual environments. As in other areas of research and design in human-computer interaction, it is helpful to build on the equipment and skills humans have acquired through evolution and experience and to search for ways to apply them to communicating with a computer. Direct manipulation interfaces have enjoyed great success largely because they draw on analogies to existing human skills (pointing, grabbing, moving objects in space) rather than on trained behaviors. Similarly, we try to make use of natural eye movements in designing interaction techniques for the eye.
Because eye movements are so different from conventional computer inputs, our overall approach in designing interaction techniques is, wherever possible, to obtain information from a user's natural eye movements while viewing the screen, rather than requiring the user to make specific trained eye movements to actuate the system. This requires careful attention to issues of human design, as does any successful work in virtual environments. The goal is for human-computer interaction to start with studies of the characteristics of human communication channels and skills and then develop devices, interaction techniques, and interfaces that communicate effectively to and from those channels.
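As a sketch of what obtaining information from natural eye movements can look like in practice, the following dwell-time selection loop (a common gaze-interaction technique, not necessarily the chapter's own implementation) triggers a target only after the gaze rests on it long enough, avoiding actions on every passing glance. The sample rate, target geometry, and threshold are invented for illustration:

```python
# Dwell-time target selection from a hypothetical stream of (x, y)
# gaze samples arriving at a fixed rate.
DWELL_SAMPLES = 15  # e.g. roughly 250 ms at 60 Hz

def hit_target(sample, targets):
    """Return the name of the target containing the gaze sample, if any."""
    x, y = sample
    for name, (x0, y0, x1, y1) in targets.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def dwell_select(samples, targets, dwell=DWELL_SAMPLES):
    """Yield a selection each time the gaze dwells on one target."""
    current, count = None, 0
    for s in samples:
        name = hit_target(s, targets)
        if name == current:
            count += 1
        else:
            current, count = name, 1
        if current is not None and count == dwell:
            yield current

targets = {"ok_button": (0, 0, 100, 40)}
gaze = [(50, 20)] * 20  # 20 consecutive samples inside the button
print(list(dwell_select(gaze, targets, dwell=15)))  # ['ok_button']
```

The dwell threshold is the design lever here: too short and the interface suffers the "Midas touch" of selecting everything the user looks at; too long and it stops feeling natural.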


Author(s):  
Patrik T. Schuler ◽  
Katherina A. Jurewicz ◽  
David M. Neyens

Gestures are a natural input method for human communication and may be an effective way for drivers to interact with in-vehicle infotainment systems (IVIS). Most of the existing work on gesture-based human-computer interaction (HCI) in and outside of the vehicle focuses on how well computer systems can distinguish gestures. The purpose of this study was to identify gesture sets used for IVIS tasks and to compare task times across the different functions for gesturing and touchscreens. Task times for user-defined gestures were quicker than for a novel touchscreen. Several functions resulted in relatively intuitive gesture mappings (e.g., zooming in and zooming out on a map), while others did not have strong mappings across participants (e.g., decreasing volume and playing the next song). The findings of this study suggest that user-centric gestures can be used instead of touchscreens to interact with IVIS, and future work should evaluate how to account for variability in intuitive gestures. Understanding gesture variability among end users can support the development of an in-vehicle gestural input system that is intuitive for all users.
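The strength of a gesture-to-function mapping across participants can be quantified with an agreement score, a measure commonly used in gesture-elicitation studies (not necessarily the analysis this paper performed). The proposal lists below are invented examples, sized to echo the strong and weak mappings mentioned above:

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement score for one function: the sum of squared proportions
    of identical gesture proposals (1.0 means everyone proposed the
    same gesture; values near 1/n mean no consensus at all)."""
    n = len(proposals)
    return sum((c / n) ** 2 for c in Counter(proposals).values())

# Invented proposals, not the study's data:
zoom_in = ["pinch-out"] * 8 + ["tap"] * 2                          # strong mapping
next_song = ["swipe-left", "swipe-right", "flick", "tap", "wave"]  # weak mapping

print(round(agreement_rate(zoom_in), 2))    # 0.68
print(round(agreement_rate(next_song), 2))  # 0.2
```

Functions with low scores are the ones where a deployed system would need to tolerate variability, for example by accepting several gesture variants per command.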


2021 ◽  
Vol 11 (1) ◽  
pp. 1-9
Author(s):  
Riya Jain ◽  
Muskan Jain ◽  
Roopal Jain ◽  
Suman Madan

The creation of intelligent and natural interfaces between users and computer systems has received a lot of attention. Several input modalities, such as vision, audio, and pen, used individually or in combination, have been proposed in support of this endeavour. Human communication relies heavily on gestures to convey information. Gesture recognition is a topic in computer science and language technology concerned with mathematically interpreting human gestures. It makes it possible for people to communicate naturally with machines without any mechanical devices. Hand gestures are a form of nonverbal communication that can be applied in many fields, including deaf-mute communication, robot control, human-computer interaction (HCI), home automation, and medical applications. Research on hand gestures has used many different methods, including those based on instrumented sensor technology and computer vision. Hand gestures may also be categorized along several dimensions, including posture and motion, static and dynamic, or a combination of the two. This paper provides an extensive study of hand gesture methods and explores their applications.
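The static/dynamic distinction mentioned above can be sketched as a simple threshold on total hand motion across tracked frames; the threshold, coordinate units, and trajectories below are invented for illustration:

```python
import math

def total_motion(frames):
    """Sum of frame-to-frame displacements of the hand centroid."""
    return sum(math.dist(a, b) for a, b in zip(frames, frames[1:]))

def classify(frames, threshold=10.0):
    """Label a tracked gesture as static (a held pose) or dynamic
    (a moving gesture) based on accumulated centroid motion."""
    return "dynamic" if total_motion(frames) > threshold else "static"

still = [(100.0, 100.0)] * 30                         # pose held in place
swipe = [(float(x), 100.0) for x in range(0, 60, 2)]  # hand moving right

print(classify(still))  # static
print(classify(swipe))  # dynamic
```

Real recognizers replace this heuristic with models robust to tracking noise, but the static/dynamic split shapes the whole pipeline: static gestures reduce to pose classification, while dynamic ones require modeling a trajectory over time.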


Author(s):  
Aditya Thakur ◽  
Rahul Rai

Gestures are an important medium of human communication and have been studied for centuries from different viewpoints and in domains ranging from the arts, linguistics, and philosophy to engineering. Recent developments in gesture-based human-computer interaction (HCI) studies are noteworthy. However, commercial application of hand gestures in computer-aided sketching and modeling is rarely found. The present study focuses on identifying aspects of hand gestures that can be used as input to gesture-based CAD software for sketching and 3D modeling tasks. First, we experimentally observed and studied hand gestures performed by users to convey CAD sketch and 3D modeling commands. Next, we performed a literature study to compile a repository of gestures used by researchers to convey various commands for gesture-based human-computer interaction. With the knowledge gleaned from these two steps, a simplified yet representative taxonomy was created to classify hand gestures that can be used for drawing tasks in CAD. In the course of these studies we identified various attributes and requirements of gesture-based CAD software, consideration of which will help enhance the CAD designer's experience.
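A gesture-based CAD front end ultimately reduces recognized gestures to modeling commands. A minimal dispatch table might look like the following, where both the gesture labels and the command names are hypothetical; the paper's actual gesture set and taxonomy are not reproduced here:

```python
# Hypothetical mapping from recognized gesture labels to CAD commands.
GESTURE_COMMANDS = {
    "pinch-out": "zoom_in",
    "pinch-in": "zoom_out",
    "point-drag": "sketch_line",
    "fist-rotate": "rotate_view",
}

def dispatch(gesture):
    """Translate a recognized gesture into a CAD command, or ignore it."""
    return GESTURE_COMMANDS.get(gesture, "no_op")

print(dispatch("pinch-out"))  # zoom_in
print(dispatch("open-palm"))  # no_op
```

The explicit no-op fallback matters in practice: incidental hand movement should never fire a modeling command, which is one of the requirements such a study would surface.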

