Encyclopedia of Human Computer Interaction

Published by IGI Global
ISBN: 9781591405627, 9781591407980
Total documents: 109 (five years: 0)
H-index: 6 (five years: 0)

Author(s):  
David R. Danielson

Credibility evaluation processes on the World Wide Web are subject to a number of unique selective pressures. The Web’s potential for supplying timely, accurate, and comprehensive information contrasts with its lack of centralized quality control mechanisms, resulting in its simultaneous potential for doing more harm than good to information seekers. Web users must balance the problems and potentials of accepting Web content and do so in an environment for which traditional, familiar ways of evaluating credibility do not always apply. Web credibility research aims to better understand this delicate balance and the resulting evaluation processes employed by Web users. This article reviews credibility conceptualizations utilized in the field, unique characteristics of the Web relevant to credibility, theoretical perspectives on Web credibility evaluation processes, factors influencing Web credibility assessments, and future trends.


Author(s):  
Julie Thomas ◽  
Claudia Roda

As Kress and Van Leeuwen (2001) state, there is no communication without interaction. Broadly, levels of “interactivity” can be recognized as depending on the quality of feedback and control and on the exchange of discourse, according to the mode or modes (“multimodal discourse”) involved. Important constraints that modify interactivity of any kind include the amount of “common ground” (Clark, 1996), constraints of space and time, relative embodiment, and choice of or control over the means, manner, and/or medium of feedback. Ha and James (1998) emphasize the element of response, characterized by playfulness, choice, connectedness, information collection, and reciprocal communication.


Author(s):  
Alan Woolrych ◽  
Mark Hindmarch

Usability inspection method (UIM) is the term used for a variety of analytical methods designed to “find” usability problems in an interface design. The basic principle is that analysts inspect the interface against a set of predetermined rules, standards, or requirements and predict potential usability problems based on breaches of those rules. None of the UIMs currently in use can detect all of the problems associated with an interface. After describing some of the UIMs in use, this article looks at the authors’ work on improving these methods by focusing on the resources analysts bring to an inspection.
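The inspection principle described above can be sketched as rule-checking over an interface description. The following is a minimal illustrative sketch, not any particular UIM: the `Widget` model and the two rules are hypothetical stand-ins for the predetermined rules an analyst would apply.

```python
# Illustrative sketch: a usability inspection pass as rule-checking.
# The Widget model and Rule set are hypothetical, not from any real UIM.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Widget:
    name: str
    label: str = ""          # visible label shown to the user
    has_feedback: bool = True  # does the control acknowledge actions?

@dataclass
class Rule:
    rule_id: str
    description: str
    breached: Callable[[Widget], bool]  # True when the rule is violated

def inspect(widgets, rules):
    """Predict potential usability problems: one per rule breach."""
    problems = []
    for w in widgets:
        for r in rules:
            if r.breached(w):
                problems.append((w.name, r.rule_id, r.description))
    return problems

rules = [
    Rule("R1", "Every control needs a visible label",
         lambda w: not w.label),
    Rule("R2", "Actions must give feedback",
         lambda w: not w.has_feedback),
]

ui = [Widget("save_btn", label="Save"),
      Widget("del_btn", label="", has_feedback=False)]

for problem in inspect(ui, rules):
    print(problem)
```

As the abstract notes, such a rule set is inherently incomplete: problems outside the encoded rules go undetected, which is why the analysts’ own resources matter so much.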


Author(s):  
Anxo Cereijo Roibás

Let us recall the first films that introduced the broad public to futuristic communication scenarios, in which users could exchange almost any kind of information and communicate with anyone, anywhere, at any time: Marc Daniels’ “Star Trek” in the 1960s and James Cameron’s “The Terminator” (1984), for example. The consequence was that impersonalized spaces (e.g., airports) (Auge, 1992) could easily become personalized environments for work or leisure, according to the specific needs of each user. Such scenarios have recently been defined as ubiquitous communication environments. These environments are characterized by a system of interfaces that can be either fixed in allocated positions or portable (and/or wearable) devices. Based on our experience with 2G technologies, we can foresee that the incoming 3G communication technologies will ensure that the second type of interface plays an ever more prominent role in our daily lives. The reason is that portable and wearable devices represent a sort of prosthesis and, therefore, reflect more than ever the definition of the interface as an extension of the human body.

When Martin Cooper of Motorola patented an interface called the Radio Telephone System in 1973 (which can be regarded as the first mobile phone), he probably did not suspect the substantial repercussions his invention would have on the human microenvironment and its social sphere. The mobile phone, enabling interpersonal communication that is time- and place-independent, has changed humans’ habits and their way of forming relationships (Rheingold, 1993). This system made possible a permanent and ubiquitous connection among users. At the same time, it has left users free to decide whether or not to be available at any moment and in any place they might be (Hunter, 2002).
This article is based on empirical work in the field with network operators (Vodafone) and handset manufacturers (Nokia) and research at the Politecnico di Milano University, the University of Lapland, and the University of Brighton. The intention is to give a practical approach to the design of interfaces in ubiquitous communication scenarios.


Author(s):  
Anthony Faiola

With the increasing demand for global communication between countries, it is imperative that we understand the importance of national culture in human communication on the World Wide Web (WWW). As we consider the vast array of differences in the way we think, behave, assign value, and interact with others, culture becomes a focal point in research on online communication. More than ever, culture has become an important human-computer interaction (HCI) issue, because it impacts both the substance and the vehicle of communication via communication technologies. Global economics and information delivery are leading to even greater diversification among individuals and groups of users who employ the WWW as a key resource for accessing information and purchasing products. Companies will depend more on the Internet as an integral component of their communication infrastructure. With the shift toward online services for information, business professionals have identified international Web usability as an increasingly relevant area of HCI research. What must be addressed are the cultural factors surrounding Web site design. Specifically, it is argued that culture is a discernible variable in international Web site design, which should therefore better accommodate global users who seek to access online information or products. There are still many unresolved questions regarding cross-cultural HCI and communication and the delivery of information via the Web. To date, no significant connection has been made between cultural context and cognition, cross-cultural Web design, and related issues of HCI. This correlation is relevant for identifying new knowledge in cross-cultural Web design theory and practice.


Author(s):  
Kazuhisa Seta

In the ontological engineering research field, the concept of “task ontology” is well known as a useful technology for systematizing and accumulating the knowledge needed to perform problem-solving tasks (e.g., diagnosis, design, scheduling, and so on). A task ontology refers to a system of vocabulary/concepts used as building blocks to perform a problem-solving task in a machine-readable manner, so that the system and humans can collaboratively solve a problem based on it. The concept of task ontology was proposed by Mizoguchi (Mizoguchi, Tijerino, & Ikeda, 1992, 1995), and its validity is substantiated by the development of many practical knowledge-based systems (Hori & Yoshida, 1998; Ikeda, Seta, & Mizoguchi, 1997; Izumi & Yamaguchi, 2002; Schreiber et al., 2000; Seta, Ikeda, Kakusho, & Mizoguchi, 1997). He stated:

…task ontology characterizes the computational architecture of a knowledge-based system which performs a task. The idea of task ontology which serves as a system of the vocabulary/concepts used as building blocks for knowledge-based systems might provide an effective methodology and vocabulary for both analyzing and synthesizing knowledge-based systems. It is useful for describing inherent problem-solving structure of the existing tasks domain-independently. It is obtained by analyzing task structures of real world problem. ... The ultimate goal of task ontology research is to provide a theory of all the vocabulary/concepts necessary for building a model of human problem solving processes. (Mizoguchi, 2003)

We can also recognize task ontology as a static user model (Seta et al., 1997), which captures the meaning of problem-solving processes, that is, the input/output relation of each activity in a problem-solving task and its effects on the real world as well as on the humans’ mind.
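The input/output view of task activities can be sketched as a tiny machine-readable vocabulary. This is a minimal illustration only, assuming a diagnosis-style task; the activity and concept names below are hypothetical and are not drawn from Mizoguchi’s ontology.

```python
# Illustrative sketch: a task ontology as a machine-readable vocabulary.
# Each activity names the concepts it consumes and produces, so a system
# can chain activities into a problem-solving process. Names are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Activity:
    name: str
    inputs: tuple    # concepts consumed by this activity
    outputs: tuple   # concepts produced by this activity

# A tiny vocabulary for a diagnosis-style task.
ONTOLOGY = [
    Activity("observe",     inputs=("patient",),          outputs=("symptoms",)),
    Activity("hypothesize", inputs=("symptoms",),         outputs=("candidate_faults",)),
    Activity("test",        inputs=("candidate_faults",), outputs=("diagnosis",)),
]

def plan(goal, available, ontology):
    """Chain activities whose inputs are satisfied until `goal` is produced."""
    have = set(available)
    steps = []
    progress = True
    while goal not in have and progress:
        progress = False
        for act in ontology:
            used = [s.name for s in steps]
            if act.name not in used and set(act.inputs) <= have:
                steps.append(act)
                have.update(act.outputs)
                progress = True
    return steps if goal in have else None

steps = plan("diagnosis", {"patient"}, ONTOLOGY)
print([s.name for s in steps])  # → ['observe', 'hypothesize', 'test']
```

The point of the sketch is the “static user model” reading above: because each activity’s input/output relation is explicit, both the system and a human can inspect why a given step appears in the solution process.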


Author(s):  
Patricia M. Boechler

Computers have become commonplace tools in educational environments and are used to provide both basic and supplemental instruction to students on a variety of topics. Searching for information in hypermedia documents, whether on the Web or through individual educational sites, is a common task in learning activities. Previous research has identified a number of variables that impact how students use electronic documents. Individual differences such as learning style or cognitive style (Andris, 1996; Fitzgerald & Semrau, 1998), prior topic knowledge (Ford & Chen, 2000), level of interest (Lawless & Kulikowich, 1998), and gender (Beasley & Vila, 1992) all influence performance. Additionally, characteristics of the document such as the inherent structure of the material, the linking structure (Korthauer & Koubek, 1994), and the types of navigation tools that accompany the document can affect student performance and behaviour (Boechler & Dawson, 2002; McDonald & Stevenson, 1998, 1999). In short, the effective use of hypermedia documents in educational settings depends on complex interactions between individual skills (e.g., spatial and reading skills) and the features of the document itself.


Author(s):  
William A. Janvier ◽  
Claude Ghaoui

HCI-related subjects need to be considered to make e-learning more effective; examples of such subjects are: psychology, sociology, cognitive science, ergonomics, computer science, software engineering, users, design, usability evaluation, learning styles, teaching styles, communication preference, personality types, and neuro-linguistic programming language patterns. This article discusses the way some components of HI can be introduced to increase the effectiveness of e-learning by using an intuitive interactive e-learning tool that incorporates communication preference (CP), specific learning styles (LS), neurolinguistic programming (NLP) language patterns, and subliminal text messaging. The article starts by looking at the current state of distance learning tools (DLTs), intelligent tutoring systems (ITS) and “the way we learn”. It then discusses HI and shows how this was implemented to enhance the learning experience.


Author(s):  
Lorenzo Magnani ◽  
Emanuele Bardone ◽  
Michele Bocchiola

Our contention is that interactions between humans and computers have a moral dimension. That is to say, a computer cannot be taken as a neutral tool or a kind of neutral technology (Norman, 1993). This conclusion seems a bit puzzling and surely paradoxical. How can a computer be moral? All computational apparatuses can generally be considered moral mediators, but for our considerations, computers are the best representative tools. First of all, they are the most widespread technological devices; they are relatively cheap in comparison to other technological utilities; and, very importantly, they can be easily interconnected all over the world through the Internet. This last feature allows people to keep in contact with each other and, consequently, to improve their relations. Computers require interactions with humans, but also allow interactions between humans. Since morality relates to how we treat other people within interactive behaviors, computers can help us to act morally in several ways. For instance, as the concept of moral mediators suggests, computers can help us to acquire new information that is useful for treating other human beings in a more satisfactory moral way.


Author(s):  
Leah P. Macfadyen ◽  
Sabine Doff

Amid the many published pages of excited hyperbole regarding the potential of the Internet for human communications, one salient feature of current Internet communication technologies is frequently overlooked: the reality that Internet- and computer-mediated communications, to date, are communicative environments constructed through language (mostly text). In cyberspace, written language therefore mediates the human-computer interface as well as the human-human interface. What are the implications of the domination of Internet and computer-mediated communications by text? Researchers from diverse disciplines—from distance educators to linguists to social scientists to postmodern philosophers—have begun to investigate this question. They ask: Who speaks online, and how? Is online language really text, or is it “speech”? How does culture affect the language of cyberspace? Approaching these questions from their own disciplinary perspectives, they variously position cyberlanguage as “text,” as “semiotic system,” as “socio-cultural discourse” or even as the medium of cultural hegemony (domination of one culture over another). These different perspectives necessarily shape their analytical and methodological approaches to investigating cyberlanguage, underlying decisions to examine, for example, the details of online text, the social contexts of cyberlanguage, and/or the social and cultural implications of English as Internet lingua franca. Not surprisingly, investigations of Internet communications cut across a number of pre-existing scholarly debates: on the nature and study of “discourse,” on the relationships between language, technology and culture, on the meaning and significance of literacy, and on the liter…

