Intelligent, Adaptive and Reasoning Technologies
Latest Publications


TOTAL DOCUMENTS: 17 (FIVE YEARS: 0)

H-INDEX: 1 (FIVE YEARS: 0)

Published By IGI Global
ISBN: 9781609605957, 9781609605964

Author(s):  
Ray R. Hashemi ◽  
Louis A. Le Blanc ◽  
Azita A. Bahrami ◽  
Mahmood Bahar ◽  
Bryan Traywick

A large sample (initially 33,000 cases, a ten percent sample) of university alumni giving records for a large public university in the southwestern United States is analyzed by Formal Concept Analysis (FCA). This likely represents the first attempt to analyze such data by means of a machine learning technique. The variables employed include the gift amount to the university foundation as well as traditional demographic variables such as year of graduation, gender, ethnicity, and marital status. The foundation serves as one of the institution’s non-profit, fund-raising organizations. It pursues substantial gifts that are designated for the educational or leadership programs of the giver’s choice. Although it processes gifts of all sizes, the foundation focuses on major gifts and endowments. Association analysis of the dataset is a two-step process: in the first step, FCA is applied to identify concepts and their relationships; in the second step, association rules are defined for each concept. The hypothesis examined in this paper is that the generosity of alumni toward their alma mater can be predicted using association rules obtained through the FCA approach.
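The two-step process (FCA to identify concepts, then association rules read off those concepts) can be sketched on a toy formal context. All objects, attributes, and values below are invented for illustration and are not drawn from the study’s data:

```python
from itertools import combinations

# Toy formal context (objects x attributes); all values invented for illustration.
context = {
    "a1": {"male", "grad_1990s", "donor"},
    "a2": {"male", "grad_1990s", "donor"},
    "a3": {"female", "grad_2000s"},
    "a4": {"female", "grad_1990s", "donor"},
}
attributes = set().union(*context.values())

def extent(attr_set):
    """Objects possessing every attribute in attr_set."""
    return {o for o, attrs in context.items() if attr_set <= attrs}

def intent(objs):
    """Attributes shared by every object in objs."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

# Step 1 (FCA): enumerate formal concepts by closing every attribute subset.
concepts = set()
for r in range(len(attributes) + 1):
    for subset in combinations(sorted(attributes), r):
        e = extent(set(subset))
        concepts.add((frozenset(e), frozenset(intent(e))))

# Step 2: read an association rule off a concept and compute its confidence.
lhs, rhs = {"grad_1990s"}, {"donor"}
confidence = len(extent(lhs | rhs)) / len(extent(lhs))
print(f"grad_1990s -> donor: confidence {confidence:.2f}")
```

In this toy context every 1990s graduate is a donor, so the rule holds with full confidence; on real giving records the confidence of such rules is what the hypothesis tests.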


Author(s):  
Azzam-ul-Asar ◽  
M. Sadeeq Ullah ◽  
Mudasser F. Wyne ◽  
Jamal Ahmed ◽  
Riaz-ul-Hasnain

This paper proposes a neural network based traffic signal controller that eliminates most of the problems associated with the Traffic Responsive Plan Selection (TRPS) mode of the closed loop system. Instead of storing timing plans for different traffic scenarios, which requires clustering and threshold calculations, the proposed approach uses an Artificial Neural Network (ANN) model that produces optimal plans based on optimized weights obtained through its learning phase. Clustering is the root of the problems in a closed loop system and has therefore been eliminated in the proposed approach. The Particle Swarm Optimization (PSO) technique is used both in the ANN’s learning rule and in generating its training cases, in the form of optimized timing plans based on Highway Capacity Manual (HCM) delay, for all traffic demands found in historical data. The ANN generates optimal plans online to address real-time traffic demands and is thus more responsive to varying traffic conditions.
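A minimal sketch of the PSO element, with a toy quadratic delay proxy standing in for the HCM delay computation. The demand values, green-time bounds, and PSO coefficients below are all illustrative assumptions:

```python
import random

random.seed(42)

# Illustrative delay proxy: squared error of green splits vs. demand proportions.
demand = [0.5, 0.3, 0.2]  # relative traffic demand per signal phase (assumed)

def delay(plan):
    total = sum(plan)
    return sum((g / total - dem) ** 2 for g, dem in zip(plan, demand))

# Minimal PSO over three-phase green times (seconds), bounded to [10, 60].
n_particles, n_iters, dim = 20, 100, 3
w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients (assumed)
pos = [[random.uniform(10, 60) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=delay)

for _ in range(n_iters):
    for i in range(n_particles):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] = min(60.0, max(10.0, pos[i][d] + vel[i][d]))
        if delay(pos[i]) < delay(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=delay)

splits = [g / sum(gbest) for g in gbest]
print([round(s, 2) for s in splits])  # green splits for this toy objective
```

In the paper’s setting the plans found this way (against HCM delay rather than this proxy) become training cases for the ANN, which then reproduces near-optimal plans online.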


Author(s):  
Joerg Leukel ◽  
Vijayan Sugumaran

Product-related information can be integrated with the help of a product ontology, which provides consensual definitions of the concepts and interrelationships relevant to a product domain of interest. A product ontology is either supplied by a third party or results from ontology engineering. In both cases, the problem is how to assess its quality and then select the “right” ontology. This chapter (1) proposes a metrics suite for product ontology evaluation based on semiotic theory, and (2) demonstrates the feasibility and usefulness of the metrics suite using a supply chain model. The contribution of this research is a comprehensive metrics suite that takes into account the various quality dimensions of product ontologies.
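To illustrate the flavour of such a metrics suite, the sketch below computes two toy metrics over an invented product ontology: an attribute-richness measure (semantic layer) and a task-coverage measure (pragmatic layer). The specific formulas and the ontology are assumptions, not the chapter’s actual metrics:

```python
# Toy product ontology: class -> (parent, own attributes); invented for illustration.
ontology = {
    "Product":  (None,       {"id", "name"}),
    "Hardware": ("Product",  {"weight"}),
    "Laptop":   ("Hardware", {"cpu", "ram"}),
    "Service":  ("Product",  set()),
}

def inherited_attrs(cls):
    """All attributes of a class, own plus inherited along the parent chain."""
    attrs = set()
    while cls is not None:
        parent, own = ontology[cls]
        attrs |= own
        cls = parent
    return attrs

# Semantic-layer metric (assumed): average attribute richness per class.
richness = sum(len(inherited_attrs(c)) for c in ontology) / len(ontology)

# Pragmatic-layer metric (assumed): coverage of terms a supply-chain task needs.
task_terms = {"Laptop", "cpu", "price", "Supplier"}
known = set(ontology) | set().union(*(attrs for _, attrs in ontology.values()))
coverage = len(task_terms & known) / len(task_terms)

print(round(richness, 2), coverage)
```

A low coverage score here would flag that the ontology lacks terms (e.g. pricing, supplier concepts) the supply-chain task requires, which is the kind of signal a quality metrics suite is meant to surface before ontology selection.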


Author(s):  
Cecil Eng Huang Chua ◽  
Roger H. Chiang ◽  
Veda C. Storey

Search engines are ubiquitous tools for seeking information from the Internet and, as such, have become an integral part of our information society. New search engines that combine ideas from separate search engines generally outperform the search engines from which they took ideas. Designers, however, may not be aware of the work of other search engine developers or such work may not be available in modules that can be incorporated into another search engine. This research presents an interoperability architecture for building customized search engines. Existing search engines are analyzed and decomposed into self-contained components that are classified into six categories. A prototype, called the Automated Software Development Environment for Information Retrieval, was developed to implement the interoperability architecture, and an assessment of its feasibility was carried out. The prototype resolves conflicts between components of separate search engines and demonstrates how design features across search engines can be integrated.
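The idea of decomposing engines into self-contained, interchangeable components can be sketched as follows. The three stages and toy corpus below are illustrative assumptions and do not reproduce the prototype’s actual six component categories:

```python
# Toy corpus; documents and component stages invented for illustration.
documents = {
    1: "neural search engines rank documents",
    2: "classical boolean retrieval of documents",
    3: "cooking recipes for pasta",
}

def parse(query):                  # query-processing component
    return set(query.lower().split())

def retrieve(terms):               # retrieval component
    return [doc_id for doc_id, text in documents.items()
            if terms & set(text.split())]

def rank(doc_ids, terms):          # ranking component
    overlap = lambda d: len(terms & set(documents[d].split()))
    return sorted(doc_ids, key=overlap, reverse=True)

def search(query, parse, retrieve, rank):
    """Assemble interchangeable components into one customized engine."""
    terms = parse(query)
    return rank(retrieve(terms), terms)

results = search("retrieval of documents", parse, retrieve, rank)
print(results)
```

Because each stage is a self-contained function with a fixed interface, a ranker taken from one engine can be swapped in behind a retriever taken from another, which is the interoperability idea the architecture formalizes.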


Author(s):  
Simon Polovina ◽  
Simon Andrews

As 80-85% of all corporate information remains unstructured and outside the processing scope of enterprise systems, many enterprises rely on information systems that expose them to transactions based on missing information (errors of omission) or misleading information (errors of commission). To address this concern, the fundamental business concept of monetary transactions is extended to include qualitative business concepts. A Transaction Concept (TC) is accordingly identified that provides a structure for these unstructured but vital aspects of business transactions. Based on REA (Resources, Events, Agents) and modelled using Conceptual Graphs (CGs) and Formal Concept Analysis (FCA), the TC provides businesses with a more balanced view of the transactions they engage in and a means of discovering new transactions that they might otherwise have missed. A simple example illustrates this integration and reveals a key missing element; it is supported by reference to a wide range of case studies and application areas that demonstrate the added value of the TC. The TC is then advanced into a Transaction-Oriented Architecture (TOA), which provides the framework by which an enterprise’s business processes are orchestrated according to the TC. TOA thus aligns Service-Oriented Architecture (SOA) and the productivity of enterprise applications with the real, transactional world in which enterprises actually operate.
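A minimal sketch of how an REA-style transaction structure can expose an error of omission. The event schema and the balancing rule below are assumptions for illustration, not the chapter’s CG/FCA model:

```python
# REA-style transaction record (assumed schema): each economic event names the
# resource transferred, the providing agent, and the receiving agent.
transaction = [
    {"event": "ship_goods", "resource": "widgets",
     "from": "Supplier", "to": "Buyer"},
    # The balancing payment event is absent: an error of omission.
]

def check_balance(events):
    """Each transfer should have a reciprocal transfer in the opposite
    direction between the same two agents (the REA duality idea)."""
    missing = []
    for e in events:
        reciprocal = any(o["from"] == e["to"] and o["to"] == e["from"]
                         for o in events if o is not e)
        if not reciprocal:
            missing.append(e["event"])
    return missing

missing_before = check_balance(transaction)  # flags the unbalanced shipment
transaction.append({"event": "pay_cash", "resource": "cash",
                    "from": "Buyer", "to": "Supplier"})
missing_after = check_balance(transaction)   # balanced once payment is recorded
print(missing_before, missing_after)
```

Surfacing the missing reciprocal event is exactly the kind of “key missing element” the chapter’s simple example reveals, here reduced to a structural duality check.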


Author(s):  
Fergle D’Aubeterre ◽  
Lakshmi S. Iyer ◽  
Richard Ehrhardt ◽  
Rahul Singh

In the context of a customer-oriented value chain, companies must effectively address customers’ changing information needs during the process of acquiring a product or service in order to remain competitive. The ultimate goal of semantic matchmaking is to identify the best resources (supply) that fully meet the requirements (demand); however, this goal is very difficult to achieve because the relevant information is distributed over disparate systems. To alleviate this problem in the context of eMarketplaces, the authors suggest an agent-enabled, infomediary-based eMarketplace that enables semantic matchmaking. They extend and apply the exact, partial, and potential match algorithms developed in Di Noia et al. (2004) to show how partial and potential matches can become full matches. Specifically, the authors show how multi-criteria decision-making techniques can be utilized to rank matches. They describe mechanisms for knowledge representation and exchange that allow partner organizations to seamlessly share information and knowledge, facilitating the discovery process in an eMarketplace context.
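A sketch that loosely mirrors the exact/partial/potential distinction and a simple weighted-sum ranking. The flat attribute encoding, the conflict rule, and the weights are illustrative assumptions; the actual algorithms are the description-logic-based ones of Di Noia et al.:

```python
# Demand and supplier offers as flat "attribute:value" sets (assumed encoding).
demand = {"colour:red", "size:large", "material:leather"}
suppliers = {
    "S1": {"colour:red", "size:large", "material:leather"},  # satisfies all
    "S2": {"colour:red", "size:large"},                      # missing info only
    "S3": {"colour:blue", "size:large", "material:leather"}, # conflicting value
}

def classify(offer):
    """Exact: all demands met. Partial: some demand conflicts with the offer.
    Potential: nothing conflicts, information is merely missing."""
    if demand <= offer:
        return "exact"
    keys = {a.split(":")[0] for a in offer}
    conflict = any(a not in offer and a.split(":")[0] in keys for a in demand)
    return "partial" if conflict else "potential"

# Multi-criteria ranking: weighted sum over satisfied demand criteria (assumed weights).
weights = {"colour": 0.5, "size": 0.3, "material": 0.2}
score = lambda offer: sum(weights[a.split(":")[0]] for a in demand & offer)
ranked = sorted(suppliers, key=lambda s: score(suppliers[s]), reverse=True)
print(ranked)
```

A potential match like S2 can become a full match simply by the supplier disclosing the missing attribute, whereas the partial match S3 requires negotiating away the conflict, which is the refinement path the chapter develops.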


Author(s):  
Alexander Serenko

This study investigates user perceptions and employment of interface agents for email notification, addressing three research questions pertaining to user demographics, typical usage, and perceptions of this technology. A survey instrument was administered to 75 email interface agent users. Current email interface agent users are predominantly male, well-educated, well-off, innovative individuals who work in the IS/IT sector, use email heavily, and reside in an English-speaking country. They use agents to announce incoming messages and calendar reminders. The key reasons they like using agents are perceived usefulness, enjoyment, ease of use, attractiveness, social image, and an agent’s reliability and personalization. The major reasons they dislike them are perceived intrusiveness, agent-system interference, and incompatibility. Users envision ‘ideal email notification agents’ as highly intelligent applications that deliver messages in a non-intrusive yet persistent manner. A model of agent acceptance and use is suggested.


Author(s):  
Manish Agrawal ◽  
Kaushal Chari

Prior research on negotiation support systems (NSS) has paid limited attention to the information content in negotiators’ observed bid sequences, as well as to the cognitive limitations of individual negotiators and their impact on negotiation performance. In this paper, we assess the performance of human subjects in the context of agent-based NSS, and the accuracy of an exponential functional form in representing observed human bid sequences. We then predict negotiators’ reservation values based on their observed bids. Finally, we study the impact of negotiation support systems in helping users realize superior negotiation outcomes. Results indicate that an exponential function is a good model for observed bids. Based on the negotiation behaviors of human subjects, we find that accurate estimates of the opponent’s reservation value for the negotiation issue most important to subjects improve negotiation outcomes when an agreement is reached. Results also show that bids are correlated with reservation values, and that automated negotiations using NSS can lead to superior negotiation outcomes compared to human subjects.
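The exponential bid model and reservation-value prediction can be sketched as follows. The functional form b_t = r + (b_0 - r)·e^(-kt), the synthetic bid data, and the grid-search fitting procedure are illustrative assumptions, not the paper’s estimation method:

```python
import math

# Synthetic seller bids decaying toward an unknown reservation value (assumed form).
true_r, b0, k = 100.0, 200.0, 0.4
bids = [true_r + (b0 - true_r) * math.exp(-k * t) for t in range(8)]

def fit_reservation(bids):
    """Grid-search the asymptote r: for the right r, log(b_t - r) is linear
    in t, so pick the r whose log-linear fit has the smallest residual."""
    best_sse, best_r = float("inf"), None
    for step in range(1, 200):
        r = min(bids) - 0.5 * step          # candidates strictly below all bids
        ys = [math.log(b - r) for b in bids]
        n, xs = len(ys), range(len(ys))
        xbar, ybar = sum(xs) / n, sum(ys) / n
        slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
                 / sum((x - xbar) ** 2 for x in xs))
        sse = sum((y - (ybar + slope * (x - xbar))) ** 2
                  for x, y in zip(xs, ys))
        if sse < best_sse:
            best_sse, best_r = sse, r
    return best_r

est = fit_reservation(bids)
print(round(est, 2))
```

Recovering the asymptote of the concession curve is the sense in which observed bids carry information about a negotiator’s reservation value, which an NSS agent can then exploit.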


Author(s):  
Dhavalkumar Thakker ◽  
Taha Osman ◽  
David Al-Dabass

Web service development is encouraging scenarios where individual or integrated application services can be seamlessly and securely published on the Web without the need to expose their implementation details. However, as Web services proliferate, it becomes difficult to matchmake and integrate them in response to users’ requests. The goal of our research is to investigate the utilization of the Semantic Web in building a developer-transparent framework that facilitates the automatic discovery and composition of Web services. In this chapter, we present a Semantic Case Based Reasoner (SCBR) framework that utilizes the case-based reasoning methodology for modelling dynamic Web service discovery and composition. Our approach is original in that it considers the runtime behaviour of a service resulting from its execution. Moreover, we demonstrate that the accuracy of automatic matchmaking of Web services can be further improved by taking into account the adequacy of past matchmaking experiences for the requested task. To facilitate Web service composition, we extend our fundamental discovery and matchmaking algorithm with a lightweight knowledge-based substitution approach that adapts candidate service experiences to the requested solution before suggesting more complex and computationally taxing AI planning-based transformations. The inconsistency problem that occurs while adapting existing service composition solutions is addressed with a novel methodology based on the Constraint Satisfaction Problem (CSP).
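A minimal sketch of case-based retrieval for service discovery. The case schema and the Jaccard similarity over input/output concepts are assumptions for illustration, not SCBR’s actual matchmaking metric:

```python
# Assumed case base: each past composition records the request it served
# (input/output concepts) and the service chain that solved it.
case_base = [
    {"inputs": {"City", "Date"}, "outputs": {"WeatherForecast"},
     "solution": ["GeocodeService", "ForecastService"]},
    {"inputs": {"City"}, "outputs": {"Hotel"},
     "solution": ["HotelSearchService"]},
]

def jaccard(a, b):
    """Set overlap in [0, 1]; 1.0 for two empty sets by convention."""
    return len(a & b) / len(a | b) if a | b else 1.0

def retrieve(inputs, outputs):
    """Return the past case most similar to a new request, averaging
    input-side and output-side similarity."""
    sim = lambda c: (jaccard(inputs, c["inputs"])
                     + jaccard(outputs, c["outputs"])) / 2
    return max(case_base, key=sim)

best = retrieve({"City", "Date"}, {"WeatherForecast"})
print(best["solution"])
```

When the best case only partially fits, the retained solution is first adapted by substituting individual services, with full AI planning reserved as the more expensive fallback, mirroring the chapter’s lightweight-substitution-before-planning strategy.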

