Rough Computing
Latest Publications


TOTAL DOCUMENTS: 12 (FIVE YEARS: 0)

H-INDEX: 3 (FIVE YEARS: 0)

Published by IGI Global

ISBN: 9781599045528, 9781599045542

2011, pp. 186-203
Author(s): James F Peters

This chapter introduces a monocular vision system that learns with approximation spaces to control the pan and tilt operations of a digital camera tracking a moving target. The system has been designed to facilitate inspection by a line-crawling robot that moves along an electric power transmission line. The principal problem considered in this chapter is how to use various forms of reinforcement learning to control the movements of a digital camera. Prior work on this problem was done by Chris Gaskett using neural Q-learning, starting in 1998 and more recently in 2002. However, recent experiments have revealed that both classical target tracking and other forms of reinforcement learning control outperform Q-learning. This chapter therefore considers various forms of the actor-critic (AC) method to solve the camera movement control problem: both the conventional AC method and a modified AC method with a built-in run-and-twiddle (RT) control strategy mechanism. The RT mechanism, introduced by Oliver Selfridge in 1981, is an action control strategy in which an organism continues what it has been doing while things are improving (increasing action reward) and twiddles (changes its action strategy) when past actions yield diminishing rewards. In this work, RT is governed by measurements (by a critic) of the degree of overlap between past behavior patterns and a behavior pattern template representing a standard; these measurements are carried out within the framework of approximation spaces introduced by Zdzislaw Pawlak during the early 1980s. The chapter considers how to guide reinforcement learning based on knowledge of acceptable behavior patterns. Its contribution is an introduction to actor-critic learning methods that benefit from approximation spaces in controlling camera movements during target tracking.
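The control loop can be pictured with a short sketch. Everything concrete below is an illustrative assumption (the toy 1-D tracking task, the reward, the twiddle rule based on a sliding reward window); it shows the shape of an actor-critic learner with a run-and-twiddle trigger, not the chapter's approximation-space formulation.

```python
import random
import math
from collections import defaultdict

# Toy 1-D tracking task: the state is the discretized horizontal offset of
# the target from the image centre; actions pan the camera left/none/right.
ACTIONS = (-1, 0, 1)

def reward(offset):
    # Illustrative reward: highest when the target is centred.
    return -abs(offset)

def run_episode(steps=200, alpha=0.1, beta=0.1, gamma=0.9):
    V = defaultdict(float)      # critic: state-value estimates
    pref = defaultdict(float)   # actor: action preferences
    temp = 1.0                  # softmax exploration temperature
    recent = []                 # reward window driving the RT twiddle
    offset = random.randint(-5, 5)
    for _ in range(steps):
        # Softmax action selection from the actor's preferences.
        ws = [math.exp(pref[(offset, a)] / temp) for a in ACTIONS]
        a = random.choices(ACTIONS, weights=ws)[0]
        target_move = random.choice((-1, 0, 1))   # target drifts randomly
        nxt = max(-10, min(10, offset + target_move - a))
        r = reward(nxt)
        # The TD error drives both the critic and the actor updates.
        delta = r + gamma * V[nxt] - V[offset]
        V[offset] += alpha * delta
        pref[(offset, a)] += beta * delta
        # Run-and-twiddle: keep going while rewards improve; when the
        # recent average reward falls, "twiddle" by raising exploration.
        recent.append(r)
        if len(recent) > 20:
            recent.pop(0)
            older, newer = recent[:10], recent[10:]
            temp = 1.5 if sum(newer) < sum(older) else max(0.5, temp * 0.95)
        offset = nxt
    return V

run_episode()
```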


2011, pp. 152-161
Author(s): Cory J. Butz

In this chapter, we review a graphical framework for reasoning from data, called rough set flow graphs (RSFGs), and point out issues of current interest involving RSFG inference. Our discussion begins by examining two methods for conducting inference in an RSFG. We highlight the fact that the order of variable elimination, called an elimination ordering, affects the amount of computation needed for inference. The culminating result is the incorporation of an algorithm for obtaining a good elimination ordering into our RSFG inference algorithm.
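To see why the elimination ordering matters, the sketch below uses the generic variable-elimination cost model from graphical models rather than the chapter's RSFG-specific algorithm; the variables, domain sizes, and edges are hypothetical.

```python
import itertools

# Eliminating a variable creates an intermediate table over its remaining
# neighbours, so the cost of an ordering grows with the product of their
# domain sizes. Different orderings can differ by orders of magnitude.

def elimination_cost(order, domains, edges):
    """Total size of the intermediate tables for a given elimination order."""
    graph = {v: set() for v in domains}
    for u, v in edges:
        graph[u].add(v)
        graph[v].add(u)
    total = 0
    for v in order:
        nbrs = graph.pop(v)
        table = 1
        for n in nbrs:
            table *= domains[n]
            graph[n].discard(v)
        total += table * domains[v]
        # Eliminating v connects all of its neighbours (fill-in edges).
        for a, b in itertools.combinations(nbrs, 2):
            graph[a].add(b)
            graph[b].add(a)
    return total

# Hypothetical flow-graph variables with unequal domain sizes.
domains = {"A": 4, "B": 4, "C": 2, "D": 2}
edges = [("A", "B"), ("B", "C"), ("C", "D")]
for order in itertools.permutations(domains):
    print(order, elimination_cost(order, domains, edges))
```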


2011, pp. 70-107
Author(s): Richard Jensen

Feature selection aims to determine a minimal feature subset from a problem domain while retaining a suitably high accuracy in representing the original features. Rough set theory (RST) has been used as such a tool with much success. RST enables the discovery of data dependencies and the reduction of the number of attributes contained in a dataset using the data alone, requiring no additional information. This chapter describes the fundamental ideas behind RST-based approaches and reviews related feature selection methods that build on these ideas. Extensions to the traditional rough set approach are discussed, including recent selection methods based on tolerance rough sets, variable precision rough sets and fuzzy-rough sets. Alternative search mechanisms are also highly important in rough set feature selection. The chapter includes the latest developments in this area, including RST strategies based on hill-climbing, genetic algorithms and ant colony optimization.
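A minimal sketch of the hill-climbing style of rough set feature selection (in the spirit of QuickReduct): attributes are added greedily while the dependency degree of the decision on the selected subset keeps rising. The toy decision table is hypothetical.

```python
from itertools import groupby

def partition(rows, attrs):
    """Equivalence classes of the indiscernibility relation IND(attrs)."""
    key = lambda i: tuple(rows[i][a] for a in attrs)
    idx = sorted(range(len(rows)), key=key)
    return [set(g) for _, g in groupby(idx, key=key)]

def gamma(rows, attrs, decision):
    """Dependency degree: fraction of objects in the positive region."""
    pos = 0
    for block in partition(rows, attrs):
        if len({rows[i][decision] for i in block}) == 1:
            pos += len(block)
    return pos / len(rows)

def quickreduct(rows, conditions, decision):
    """Greedy hill-climbing reduct search (QuickReduct-style sketch)."""
    reduct, best = set(), 0.0
    full = gamma(rows, conditions, decision)
    while best < full:
        add = max(conditions - reduct,
                  key=lambda a: gamma(rows, reduct | {a}, decision))
        reduct.add(add)
        best = gamma(rows, reduct, decision)
    return reduct

# Hypothetical decision table: conditions a, b, c and decision d.
rows = [
    {"a": 0, "b": 0, "c": 1, "d": "no"},
    {"a": 0, "b": 1, "c": 1, "d": "yes"},
    {"a": 1, "b": 0, "c": 0, "d": "no"},
    {"a": 1, "b": 1, "c": 0, "d": "yes"},
]
print(quickreduct(rows, {"a", "b", "c"}, "d"))   # {'b'} determines d
```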


2011, pp. 162-174
Author(s): Annibal Parracho Sant’Anna

A new index of quality of approximation, called the index of mutual information, is proposed in this chapter. It measures the mutual information between the relations respectively determined by condition and decision attributes. Its computation is based on the comparison of two graphs, each one representing a set of attributes. Applications in the context of indiscernibility as well as in the context of dominance relations are considered. The combination of the new measurement approach with the transformation into probabilities of being the preferred option is also explored. A procedure to select the most important attributes is outlined. Illustrative examples are provided.
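The chapter's graph-based comparison is not reproduced here; the sketch below only computes the underlying quantity, the mutual information between the partitions induced by condition and decision attributes, on a hypothetical labelling.

```python
import math
from collections import Counter

def mutual_information(cond_labels, dec_labels):
    """I(X;Y) between the blocks induced by condition attributes (X)
    and the blocks induced by decision attributes (Y)."""
    n = len(cond_labels)
    px = Counter(cond_labels)
    py = Counter(dec_labels)
    pxy = Counter(zip(cond_labels, dec_labels))
    mi = 0.0
    for (x, y), c in pxy.items():
        mi += (c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
    return mi

# Hypothetical data: each object's indiscernibility class under the
# condition attributes, and its decision class.
cond = ["c1", "c1", "c2", "c2", "c3", "c3"]
dec  = ["yes", "yes", "no", "yes", "no", "no"]
print(mutual_information(cond, dec))
```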


2011, pp. 108-127
Author(s): Yiyu Yao

Rough set analysis (RSA) and formal concept analysis (FCA) are two theories of intelligent data analysis. They can be compared, combined and applied to each other. In this chapter, we review the existing studies on the comparisons and combinations of rough set analysis and formal concept analysis and report some new results. A comparative study of two theories in a unified framework provides a better understanding of data analysis.
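One way to make the comparison concrete is to run both derivations on the same binary context: FCA's extent/intent operators yield formal concepts, while grouping objects with identical attribute sets yields the indiscernibility classes used for rough approximations. The context below is hypothetical.

```python
# A toy binary context shared by both theories: objects x attributes.
CONTEXT = {
    "o1": {"a", "b"},
    "o2": {"a", "b"},
    "o3": {"a"},
    "o4": {"c"},
}

def intent(objs):
    """FCA derivation: attributes common to all objects in objs."""
    sets = [CONTEXT[o] for o in objs]
    return set.intersection(*sets) if sets else set()

def extent(attrs):
    """FCA derivation: objects possessing every attribute in attrs."""
    return {o for o, has in CONTEXT.items() if attrs <= has}

def approximations(target):
    """RSA: lower/upper approximations w.r.t. the indiscernibility
    classes of objects sharing exactly the same attribute set."""
    classes = {}
    for o, has in CONTEXT.items():
        classes.setdefault(frozenset(has), set()).add(o)
    lower = set().union(*(c for c in classes.values() if c <= target))
    upper = set().union(*(c for c in classes.values() if c & target))
    return lower, upper

print(extent(intent({"o1", "o3"})))   # extent of a formal concept
print(approximations({"o1", "o3"}))   # rough approximations of the same set
```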


2011, pp. 204-227
Author(s): Tomasz G. Smolinski, Astrid A. Prinz

Classification of sampled continuous signals into one of a finite number of predefined classes is possible when some distance measure between the signals in the dataset is introduced. However, it is often difficult to come up with a “temporal” distance measure that is both accurate and efficient computationally. Thus in the problem of signal classification, extracting particular features that distinguish one process from another is crucial. Extraction of such features can take the form of a decomposition technique, such as Principal Component Analysis (PCA) or Independent Component Analysis (ICA). Both these algorithms have proven to be useful in signal classification. However, their main flaw lies in the fact that nowhere during the process of decomposition is the classificatory aptitude of the components taken into consideration. Thus the ability to differentiate between classes, based on the decomposition, is not assured. Classificatory decomposition (CD) is a general term that describes attempts to improve the effectiveness of signal decomposition techniques by providing them with “classification-awareness.” We propose a hybridization of multi-objective evolutionary algorithms (MOEA) and rough sets (RS) to perform the task of decomposition in the light of the underlying classification problem itself.
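A sketch of the two objectives such a hybridization balances. The rough-set classificatory measure is stood in for here by a simple between/within scatter score, so this is an illustrative assumption, not the authors' fitness function.

```python
import numpy as np

def reconstruction_error(X, components, weights):
    """Objective 1: how faithfully the components rebuild the signals."""
    return float(np.mean((X - weights @ components) ** 2))

def separability(weights, labels):
    """Objective 2, a stand-in for a rough-set classificatory measure:
    between-class vs. within-class scatter of the component weights."""
    classes = sorted(set(labels.tolist()))
    centroids = {c: weights[labels == c].mean(axis=0) for c in classes}
    within = np.mean([np.sum((w - centroids[c]) ** 2)
                      for w, c in zip(weights, labels)])
    grand = weights.mean(axis=0)
    between = np.mean([np.sum((m - grand) ** 2)
                       for m in centroids.values()])
    return float(between / (within + 1e-9))

# Hypothetical evaluation of one candidate decomposition; a MOEA would
# evolve `components` toward the Pareto front of (error, -separability).
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 50))            # 8 sampled signals, 50 samples each
labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])
components = rng.normal(size=(3, 50))   # candidate basis signals
weights = X @ np.linalg.pinv(components)
print(reconstruction_error(X, components, weights),
      separability(weights, labels))
```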


2011, pp. 38-69
Author(s): Hung Son Nguyen

This chapter presents the Boolean reasoning approach to problem solving and its applications in rough sets. The Boolean reasoning approach has become a powerful tool for designing effective and accurate solutions for many problems in decision-making, approximate reasoning and optimization. In recent years, Boolean reasoning has become a recognized technique for developing many interesting concept approximation methods in rough set theory. This chapter presents a general framework for concept approximation that combines the classical Boolean reasoning method with many modern techniques in machine learning and data mining. This modified approach, called the "approximate Boolean reasoning" methodology, has been proposed as an even more powerful tool for problem solving in rough set theory and its applications in data mining. Through some of the most representative applications to KDD problems, including feature selection, feature extraction, data preprocessing, classification with decision rules and decision trees, and association analysis, the author hopes to show that the proposed approach not only maintains all the merits of its antecedent but also makes it possible to balance the quality of the designed solution against its computational time.
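The core of the Boolean reasoning route to reducts can be sketched directly: pairs of objects with different decisions contribute clauses of discerning attributes, and the minimal attribute sets hitting every clause (the prime implicants of the discernibility function) are the reducts. The brute-force search below is for illustration only; the toy table is hypothetical.

```python
from itertools import combinations

def discernibility_clauses(rows, conditions, decision):
    """One clause per pair of objects with different decisions:
    the attributes on which the pair can be discerned."""
    clauses = []
    for i, r in enumerate(rows):
        for s in rows[i + 1:]:
            if r[decision] != s[decision]:
                clause = frozenset(a for a in conditions if r[a] != s[a])
                if clause:
                    clauses.append(clause)
    return clauses

def reducts(rows, conditions, decision):
    """Minimal attribute sets satisfying every clause of the
    discernibility function (its prime implicants)."""
    clauses = discernibility_clauses(rows, conditions, decision)
    hits = lambda B: all(clause & B for clause in clauses)
    found = []
    for k in range(1, len(conditions) + 1):
        for B in map(set, combinations(conditions, k)):
            if hits(B) and not any(r <= B for r in found):
                found.append(B)
    return found

rows = [
    {"a": 0, "b": 0, "c": 0, "d": "no"},
    {"a": 0, "b": 1, "c": 1, "d": "yes"},
    {"a": 1, "b": 0, "c": 1, "d": "yes"},
    {"a": 1, "b": 1, "c": 0, "d": "no"},
]
print(reducts(rows, ["a", "b", "c"], "d"))   # [{'c'}, {'a', 'b'}]
```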


2011, pp. 227-238
Author(s): Jerzy W. Grzymala-Busse, Zdzislaw S. Hippe, Teresa Mroczek

We present the results of our research on using two approaches, both based on rough sets, to mine three data sets describing bed caking during the hop extraction process. For data mining we used two methods: direct rule induction by the MLEM2 algorithm, and generation of belief networks combined with the conversion of those networks into rule sets by the BeliefSEEKER system. Statistics for the rule sets, including error rates, are presented. Finally, six rule sets were ranked by an expert. Our results show that both approaches to data mining are of approximately the same quality.
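A covering sketch in the LEM2 spirit (not the full MLEM2 algorithm: no numeric attributes, missing values, or rule post-processing), on a hypothetical caking-style table: each rule is grown one (attribute, value) pair at a time until it covers only the target class.

```python
def lem2_sketch(rows, conditions, decision, target):
    """Greedy covering: grow a rule until it covers only `target` rows,
    then repeat on the goal rows it has not covered yet. Assumes the
    table is consistent (no identical rows with different classes)."""
    goal = [r for r in rows if r[decision] == target]
    rules = []
    while goal:
        covered, rule = list(rows), []
        while any(r[decision] != target for r in covered):
            pairs = {(a, r[a]) for r in covered for a in conditions}
            pairs -= set(rule)
            # Prefer the pair matching the most still-uncovered goal rows.
            best = max(pairs, key=lambda p: sum(1 for r in goal
                                                if r[p[0]] == p[1]))
            rule.append(best)
            covered = [r for r in covered
                       if all(r[a] == v for a, v in rule)]
        rules.append(rule)
        goal = [r for r in goal
                if not all(r[a] == v for a, v in rule)]
    return rules

rows = [
    {"temp": "high", "caking": "yes", "cls": "fault"},
    {"temp": "high", "caking": "no",  "cls": "fault"},
    {"temp": "low",  "caking": "no",  "cls": "ok"},
    {"temp": "low",  "caking": "yes", "cls": "fault"},
]
print(lem2_sketch(rows, ["temp", "caking"], "cls", "fault"))
```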


2011, pp. 175-184
Author(s): Zbigniew W. Ras, Elzbieta M. Wyrzykowska

Action rules can be seen as logical terms describing knowledge, hidden in a decision system, about possible actions associated with objects. The classical strategy for discovering them from a database requires prior extraction of classification rules, which are then evaluated pair by pair with the goal of building a strategy of action based on condition features in order to achieve a desired effect on a decision feature. An actionable strategy is represented as a term r = [(ω) ∧ (α → β)] → (φ → ψ), where ω, α, β, φ, and ψ are descriptions of events. The term r states that when the fixed condition ω is satisfied and the changeable behavior (α → β) occurs in objects represented as tuples from a database, so does the expectation (φ → ψ). With each object a number of actionable strategies can be associated, and each of them may lead to different expectations and thus to different reclassifications of objects. This chapter focuses on a new strategy that constructs action rules directly from single classification rules instead of from pairs of classification rules. In this way we gain not only in the simplicity of the construction method but also in its time complexity. The chapter presents a modified tree-based strategy for constructing action rules, followed by a new simplified strategy for constructing them; finally, the two strategies are compared.
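A sketch of the single-rule construction in the r = [(ω) ∧ (α → β)] → (φ → ψ) shape: stable attributes must already agree with the target classification rule (ω), and the flexible ones supply the proposed changes (α → β). The attribute names and the example rule are hypothetical.

```python
def action_rule(target_rule, obj, stable):
    """target_rule: (conditions dict, desired decision value).
    Returns the changes (alpha -> beta) that move obj toward the
    desired decision, provided its stable attributes already agree."""
    conds, desired = target_rule
    if any(obj.get(a) != v for a, v in conds.items() if a in stable):
        return None   # omega not satisfied: the rule is inapplicable
    changes = {a: (obj.get(a), v) for a, v in conds.items()
               if a not in stable and obj.get(a) != v}
    return {"fix": {a: v for a, v in conds.items() if a in stable},
            "change": changes, "expect": desired}

# Classification rule: (profile = silver) ^ (rate = low) -> status = loyal.
rule = ({"profile": "silver", "rate": "low"}, "loyal")
customer = {"profile": "silver", "rate": "high", "status": "churn"}
print(action_rule(rule, customer, stable={"profile"}))
# -> fix profile = silver, change rate high -> low, expect status = loyal
```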


2011, pp. 1-37
Author(s): Piotr Wasilewski, Dominik Slezak

We present three types of knowledge that can be specified according to rough set theory, and then the three corresponding types of algebraic structures appearing in the theory. This leads to the following three types of vagueness: crispness, classical vagueness, and a new concept of "intermediate" vagueness. We also propose two classifications of information systems and approximation spaces, and based on them we differentiate between information and knowledge.

