Encyclopedia of Information Science and Technology, Second Edition

Published by IGI Global
ISBN: 9781605660264, 9781605660271
Total documents: 663 · H-index: 5

Author(s):  
Jan Achterbergh

This overview approaches information and communication technology (ICT) for competitive intelligence from the perspective of strategy formulation. It provides an ICT architecture for supporting the knowledge processes that produce knowledge relevant to strategy formulation. To determine what this architecture looks like, we first examine the process of strategy formulation and identify the knowledge it requires; for this purpose, we use Beer's viable system model (VSM). Second, we model the knowledge processes in which the intelligence relevant to strategy formulation is produced and processed. Given these two elements, we describe an ICT architecture that supports the knowledge processes producing the knowledge needed for the strategic process.
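The chapter itself presents no code, but the role the VSM plays in such an architecture can be loosely illustrated. The sketch below encodes Beer's five systems and routes incoming knowledge signals to the system that consumes them; the routing rule and signal names are invented for illustration, not taken from the chapter.

```python
# Hypothetical sketch: Beer's five VSM systems as a routing table for
# intelligence signals. The system descriptions follow Beer; the routing
# rule below is illustrative only, not the architecture from the chapter.

VSM_SYSTEMS = {
    1: "operations",     # the primary, value-producing activities
    2: "coordination",   # damping oscillation between operational units
    3: "control",        # internal regulation, resource bargaining, audit
    4: "intelligence",   # scanning the environment; feeds strategy formulation
    5: "policy",         # identity and ultimate decision making
}

def route_signal(source: str) -> int:
    """Route a knowledge signal to the VSM system that consumes it.

    Environmental signals (competitor moves, market shifts) feed System 4;
    internal performance signals feed System 3. Illustrative rule only.
    """
    return 4 if source == "environment" else 3

target = route_signal("environment")
print(f"signal routed to System {target}: {VSM_SYSTEMS[target]}")
```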


Author(s):  
Stu Westin

Studies that rely on Web usage mining can be experimental or observational in nature. The focus of such studies is quite varied and may involve such topics as predicting online purchase intentions (Hooker & Finkelman, 2004; Moe, 2003; Montgomery, Li, Srinivasan, & Liechty, 2004), designing recommender systems for e-commerce products and sites (Cho & Kim, 2004; Kim & Cho, 2003), understanding navigation and search behavior (Chiang, Dholakia, & Westin, 2004; Gery & Haddad, 2003; Johnson, Moe, Fader, Bellman, & Lohse, 2004; Li & Zaiane, 2004), or a myriad of other subjects. Regardless of the issue being studied, data collection for Web usage mining studies often proves to be a vexing problem, and ideal research designs are frequently sacrificed in the interest of finding a reasonable data capture or collection mechanism. Despite the difficulties involved, the research community has recognized the value of Web-based experimental research (Saeed, Hwang, & Yi, 2003; Zinkhan, 2005), and has, in fact, called on investigators to exploit “non-intrusive means of collecting usage and exploration data” (Gao, 2003, p. 31) in future Web studies. In this article we discuss some of the methodological complexities that arise when conducting studies that involve Web usage mining. We then describe an innovative, software-based methodology that addresses many of these problems. The methods described here are most applicable to experimental studies, but they can be applied in ex post observational research settings as well.
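One widely used, non-intrusive data source in Web usage mining is the ordinary server access log. As a minimal sketch (assuming the common 30-minute session timeout convention; the records are invented), per-visitor click-stream sessions can be reconstructed like this:

```python
# Minimal sketch: reconstructing per-visitor sessions from (ip, time, url)
# records, as in conventional server-log usage mining. The 30-minute
# timeout is a common convention; the records here are invented.
from collections import defaultdict
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

def sessionize(records):
    """Group (ip, timestamp, url) records into click-stream sessions."""
    records = sorted(records, key=lambda r: (r[0], r[1]))
    sessions = defaultdict(list)                   # ip -> list of sessions
    for ip, ts, url in records:
        last = sessions[ip][-1] if sessions[ip] else None
        if last and ts - last[-1][0] <= SESSION_TIMEOUT:
            last.append((ts, url))                 # continue current session
        else:
            sessions[ip].append([(ts, url)])       # start a new session
    return sessions

logs = [
    ("10.0.0.1", datetime(2008, 1, 1, 9, 0), "/home"),
    ("10.0.0.1", datetime(2008, 1, 1, 9, 5), "/products"),
    ("10.0.0.1", datetime(2008, 1, 1, 11, 0), "/home"),   # > 30 min gap
]
print(len(sessionize(logs)["10.0.0.1"]))           # -> 2 sessions
```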


Author(s):  
Arthur Tatnall

In general terms, a portal can be seen as “a door, gate or entrance” (Macquarie Library, 1981), and in its simplest form the word just means a gateway; however, it is often a gateway to somewhere other than just the next room or street. The Oxford Reference Dictionary defines a portal as “a doorway or gate etc, especially a large and elaborate one” (Pearsall & Trumble, 1996). In the context of this article, a Web portal is considered to be a special Internet (or intranet) site designed to act as a gateway giving access to other specific sites. A Web portal can be said to aggregate information from multiple sources and make this information available to various users (Tatnall, 2005c). It is a Web site that can be used to find and gain access to other sites, but one that also acts as a guide, helping to protect users from the chaos of the Internet and to direct them toward a specific goal. More generally, however, a portal should be seen as providing a gateway not just to sites on the Web, but to all network-accessible resources, whether involving intranets, extranets, or the Internet. In other words, a portal offers centralised access to all relevant content and applications.
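As a toy illustration of the aggregation idea (not drawn from the article), a portal can be thought of as a single entry point that gathers content from several configured sources; the source names and fetch functions below are hypothetical:

```python
# Toy illustration: a portal as one entry point aggregating several
# sources. Source names and fetch functions are hypothetical.

def fetch_intranet_news():
    return ["Quarterly results posted on the intranet"]

def fetch_web_links():
    return ["http://example.org/industry-news"]

PORTAL_SOURCES = {
    "intranet": fetch_intranet_news,
    "web": fetch_web_links,
}

def portal_view(user_interests):
    """Gather content from every configured source the user cares about."""
    return {name: fetch() for name, fetch in PORTAL_SOURCES.items()
            if name in user_interests}

print(portal_view({"intranet", "web"}))
```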


Author(s):  
Jerzy A. Kisielnicki

A new management trend in the application of global information technology (IT), virtualization, has appeared in contemporary management. Virtualization is a process of enterprise transformation (using IT) that allows an organization to break through various limitations imposed by organizational constraints. Virtualization dramatically changes the image of business, especially for small and medium enterprises (SMEs); by adopting the concept of virtualization, they can become fully competitive and may operate effectively in the global market, as the barriers of scale between SMEs and large organizations disappear. This new type of organization is often called, in the literature, a modern organization or virtual organization. Organizations of this type have an effective decision-making process and function based on economic criteria; consequently, their opportunities to grow and to compete in the global market are greater than those of traditional SMEs. Hence the thesis that virtualization allows individual organizations to enter strategic cooperative alliances with other similar businesses, and that such virtual organizations hold a competitive position in the global market. In the literature, many terms are used to describe the virtual organization: “network organizations” (Drucker, 1988, p. 9), “organizations after re-engineering” (Hammer & Champy, 1993, pp. 77-79), “crazy organization,” “crazy time for crazy organization” (Peters, 1994, pp. 5-7), and “intelligent enterprise” (Quinn, 1992, p. 3).


Author(s):  
Evangelia Nidelkou, Vasileios Papastathis, Maria Papadogiorgaki

A major theme of Information Science and Technology research is the study of personalization. The key issue in personalization is understanding human behaviour and simulating it by machine, so that machines can treat users as individuals with respect to their distinct personalities, preferences, goals, and so forth. The general fields of research in personalization are user modeling and adaptive systems, which can be traced back to the late 1970s, with the use of models of agents by Perrault, Allen, and Cohen (1978) and the introduction of stereotypes by Rich (1979). With the wide progress in hardware and telecommunications technologies over the last decade, which has led to a vast increase in services and in the volume and multimodality (text and multimedia) of content, the need for personalization systems has become critical: they enable consumers to manage the volume and complexity of available information, and vendors to remain competitive in the market.
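As a minimal sketch of the stereotype idea introduced by Rich (1979): a user is matched to a stereotype whose defaults stand in for unknown preferences until individual evidence overrides them. The stereotype names and attributes below are invented for illustration:

```python
# Toy stereotype-based user model in the spirit of Rich (1979): stereotype
# defaults fill in unknown preferences; observed evidence overrides them.
# Stereotype names and attributes are invented for illustration.

STEREOTYPES = {
    "novice": {"detail_level": "basic", "media": "text"},
    "expert": {"detail_level": "advanced", "media": "multimedia"},
}

def build_user_model(stereotype: str, observed_prefs: dict) -> dict:
    """Start from stereotype defaults, then apply individual evidence."""
    model = dict(STEREOTYPES[stereotype])
    model.update(observed_prefs)      # individual observations win
    return model

print(build_user_model("novice", {"media": "multimedia"}))
# -> {'detail_level': 'basic', 'media': 'multimedia'}
```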


Author(s):  
Leire San Jose Ruiz de Aguirre

The use of new information and communication technologies (ICT) as a business tool has increased rapidly over the past 10 years (Bonsón, Coffin, & Watson, 2000; Claessens, Glaessner, & Klingebiel, 2000; Vasarhelyi & Greenstein, 2003). More specifically, financial software, e-banking, and the Internet, as core aspects of the various technologies used, have become driving forces behind the expansion of firms and the development of cash management. New technologies are considered one of the most attractive ways for businesses to increase revenue and achieve economies of scale that can reduce unit costs (Ballantine & Stray, 1998; Barajas & Villanueva, 2001; Daniel, 1999; Daniel & Storey, 1997; Deyoung, 2001; Downes & Muy, 1998; Faulder, 2001; Jayawardhena & Foley, 2000). Several studies have examined how the use of ICT in enterprise management contributes to enterprise performance; Brynjolfsson and Hitt (2000) and Nájera (2005) have reviewed and classified this research. Unfortunately, there are no specific works or empirical studies on the use of e-banking in cash management; consequently, this work focuses on that gap. The rest of the chapter is structured as follows. The theoretical foundation on which the study is based is explained in Section 2. Section 3 presents the data and the analysis procedure used to conduct the empirical study. The main results of the investigation are shown in Section 4, and Section 5 presents conclusions. The chapter ends with a list of bibliographical references.


Author(s):  
Fred Kitchens

For hundreds of years, actuaries used pencil and paper to perform their statistical analysis. It was a long time before they had the help of a mechanical adding machine, and only recently have they had the benefit of computers. As recently as 1981, computers were not considered important to the process of insurance underwriting. Leading experts in insurance underwriting believed that the judgment factor involved in the underwriting process was too complex for any computer to handle as effectively as a human underwriter (Holtom, 1981). Recent research on the application of technology to the underwriting process has shown that Holtom’s statement may no longer hold true (Gaunt, 1972; Kitchens, 2000; Rose, 1986). The time for computers to take on an important role in the insurance underwriting process may be upon us. The author intends to illustrate the applicability of artificial neural networks to the insurance underwriting process.
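As a minimal sketch of the kind of model the author has in mind, a small feed-forward neural network can map applicant features to an accept/decline decision. The features, data, and the choice of scikit-learn below are assumptions made for illustration, not the author's actual model:

```python
# Hypothetical sketch: a small feed-forward network mapping applicant
# features to an underwriting decision. Features, data, and the choice
# of scikit-learn are illustrative assumptions, not the author's model.
from sklearn.neural_network import MLPClassifier

# columns: age, prior claims, years licensed (hypothetical features)
X = [[25, 2, 5], [50, 0, 30], [19, 3, 1], [40, 0, 20]]
y = [0, 1, 0, 1]                      # 0 = refer/decline, 1 = accept

net = MLPClassifier(hidden_layer_sizes=(4,), max_iter=2000, random_state=0)
net.fit(X, y)
print(net.predict([[35, 1, 10]]))     # predicted decision for a new applicant
```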


Author(s):  
Michael Chapple, Charles R. Crowell

The American legal system, along with many of its counterparts around the globe, is only beginning to grapple with the legal challenges of the knowledge age. The past decade has witnessed a multitude of new laws and regulations seeking to address these challenges and provide a common framework for the legal and technical professions. Those charged with information security responsibilities face a myriad of complex and often vague requirements. In this article, we establish a four-level taxonomy for information security laws and explore the major components of each level.
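The four levels themselves are developed in the article; as a purely structural sketch (with placeholder level numbers and fictional laws, since the abstract does not name them), such a taxonomy might be encoded like this:

```python
# Structural sketch only: the levels are placeholders and the laws are
# fictional; the article's actual four levels are defined in its text.
from dataclasses import dataclass

@dataclass
class SecurityLaw:
    name: str
    level: int                        # 1..4 in the article's taxonomy

def classify(laws):
    """Group laws by their taxonomy level."""
    grouped = {lvl: [] for lvl in (1, 2, 3, 4)}
    for law in laws:
        grouped[law.level].append(law.name)
    return grouped

sample = [SecurityLaw("Hypothetical Privacy Act", 2),
          SecurityLaw("Hypothetical Breach-Notice Rule", 3)]
print(classify(sample))
```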


Author(s):  
K. Lee, X.W. Xu

The three main methods of digitization can be broadly defined as contact digitization, image-based digitization (photogrammetry), and geometry-based digitization (laser scanning). With the development of the latter two digitization methods, and advanced rendering technologies, virtual displays and museums can now be used widely (Hung, 2007). Furthermore, recent developments in interactive 3-D computer graphics technology have seen an increased interest in, and use of, 3-D digitization for cultural heritage objects (Muller-Wittig, Zhu, & Voss, 2007). Technologies for reconstructing or remodeling physical components in 3-D formats are not new in the engineering field, in particular within manufacturing engineering. However, 3-D digitization used for the preservation and archiving of cultural artifacts is relatively recent.
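As an illustrative aside (not from the article), geometry-based digitization typically yields a point cloud, and a routine archival preprocessing step is normalizing the cloud into a canonical frame; the scan data below is invented:

```python
# Illustrative sketch: laser scanning yields a point cloud; a routine
# archival step is normalizing it into a canonical frame. Data invented.
import numpy as np

def normalize_point_cloud(points: np.ndarray) -> np.ndarray:
    """Center a point cloud at the origin and scale it to unit extent."""
    centered = points - points.mean(axis=0)
    extent = np.abs(centered).max()
    return centered / extent if extent > 0 else centered

scan = np.array([[0.0, 0.0, 0.0], [10.0, 5.0, 2.0], [4.0, 8.0, 6.0]])
print(normalize_point_cloud(scan))
```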


Author(s):  
Jaymie Strecker, Atif M. Memon

This chapter describes the state of the art in testing GUI-based software. Traditionally, GUI testing has been performed manually or semimanually, with the aid of capture-replay tools. Since this process may be too slow and ineffective to meet the demands of today’s developers and users, recent research in GUI testing has pushed toward automation. Model-based approaches are being used to generate and execute test cases, implement test oracles, and perform regression testing of GUIs automatically. This chapter shows how research to date has addressed the difficulties of testing GUIs in today’s rapidly evolving technological world, and it points to the many challenges that lie ahead.
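One influential model in this line of research, including Memon's own work, is the event-flow graph: nodes are GUI events, edges record which event may follow which, and test cases are paths through the graph. The events below are hypothetical, and the enumeration strategy is a simplified sketch:

```python
# Simplified sketch of event-flow-graph test generation: nodes are GUI
# events, edges say which event may legally follow which, and each path
# is a candidate test case. Events here are hypothetical.

EVENT_FLOW = {
    "open_dialog": ["type_text", "cancel"],
    "type_text": ["click_ok", "cancel"],
    "click_ok": [],
    "cancel": [],
}

def generate_tests(start: str, max_len: int = 3):
    """Enumerate event sequences up to max_len as candidate test cases."""
    tests = []
    def walk(path):
        tests.append(path)
        if len(path) < max_len:
            for nxt in EVENT_FLOW[path[-1]]:
                walk(path + [nxt])
    walk([start])
    return tests

for case in generate_tests("open_dialog"):
    print(" -> ".join(case))
```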

