Acceleration of Association-Rule Based Markov Decision Processes

Author(s): Ma. de G. García-Hernández, J. Ruiz-Pinales, A. Reyes-Ballesteros, E. Onaindía, J. Gabriel Aviña-Cervantes, ...

In this paper, we present a new approach for the estimation of Markov decision processes based on efficient association rule mining techniques such as Apriori. For the fastest solution of the resulting association‐rule based Markov decision process, several accelerating procedures such as asynchronous updates and prioritization using a static ordering have been applied. A new criterion for state reordering in decreasing order of maximum reward is also compared with a modified topological reordering algorithm. Experimental results obtained on a finite state and action‐space stochastic shortest path problem demonstrate the feasibility of the new approach.
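As a rough illustration of the accelerations described in this abstract (asynchronous, in-place Bellman backups combined with a static ordering of states by decreasing maximum reward), the following Python sketch shows how such a scheme can be arranged. The arrays P and R, the discount factor, the tolerance and the ordering heuristic are illustrative assumptions only; the sketch omits the association-rule (Apriori-based) construction of the model and is not the authors' implementation.

import numpy as np

def async_value_iteration(P, R, gamma=0.95, tol=1e-6, max_sweeps=1000):
    # Gauss-Seidel (asynchronous) value iteration with a static state ordering.
    # P: (S, A, S) transition probabilities, R: (S, A) expected rewards.
    # The ordering heuristic (largest attainable reward first) is an
    # illustrative assumption, not the authors' exact criterion.
    S, A = R.shape
    V = np.zeros(S)
    order = np.argsort(-R.max(axis=1))      # static ordering, computed once
    for _ in range(max_sweeps):
        delta = 0.0
        for s in order:
            q = R[s] + gamma * P[s] @ V     # Q-values using the newest V (in place)
            v_new = q.max()
            delta = max(delta, abs(v_new - V[s]))
            V[s] = v_new
        if delta < tol:                     # stop when a full sweep changes little
            break
    greedy = np.argmax(R + gamma * np.einsum('ijk,k->ij', P, V), axis=1)
    return V, greedy

Because each backup immediately overwrites V[s], later states in the same sweep already see the updated values, which is the source of the speed-up over synchronous (Jacobi-style) value iteration.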

Author(s): M. de G. García-Hernández, J. Ruiz-Pinales, E. Onaindía, S. Ledesma-Orozco, J. G. Aviña-Cervantes, ...

In this paper we propose combining accelerated variants of value iteration with improved prioritized sweeping for the fast solution of stochastic shortest-path Markov decision processes. Value iteration is a classical algorithm for solving Markov decision processes, but this algorithm and its variants are quite slow when solving considerably large problems. In order to improve the solution time, acceleration techniques such as asynchronous updates, prioritization and prioritized sweeping have been explored in this paper. A topological reordering algorithm was also compared with static reordering. Experimental results obtained on finite state and action-space stochastic shortest-path problems show that our approach achieves a considerable reduction in the solution time with respect to the tested variants of value iteration. For instance, in one test the experiments showed a reduction of 5.7 times with respect to value iteration with asynchronous updates.
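The prioritized-sweeping idea mentioned above replaces fixed sweeps with a priority queue, so that Bellman backups are spent where the current value estimates are most in error. The sketch below is a minimal, generic version under assumed dense model arrays P and R; the priority measure (Bellman residual), the threshold theta and the predecessor bookkeeping are illustrative choices, not the improved scheme reported in the paper.

import heapq
import numpy as np

def prioritized_sweeping_vi(P, R, gamma=1.0, theta=1e-6, max_updates=100000):
    # Value iteration driven by a priority queue (prioritized sweeping).
    # P: (S, A, S) transitions, R: (S, A) rewards; gamma and theta are
    # illustrative settings, not those used in the paper.
    S, A = R.shape
    V = np.zeros(S)
    # Predecessors of each state: states that can reach it under some action.
    preds = [set() for _ in range(S)]
    for s in range(S):
        for a in range(A):
            for sp in np.nonzero(P[s, a])[0]:
                preds[sp].add(s)

    def residual(s):
        # Bellman residual of state s under the current value estimates.
        return abs((R[s] + gamma * P[s] @ V).max() - V[s])

    heap = [(-residual(s), s) for s in range(S)]
    heapq.heapify(heap)
    for _ in range(max_updates):
        if not heap:
            break
        neg_pri, s = heapq.heappop(heap)
        if -neg_pri < theta:                     # largest remaining priority is tiny
            break
        V[s] = (R[s] + gamma * P[s] @ V).max()   # Bellman backup at s only
        for p in preds[s]:                       # re-queue affected predecessors
            pri = residual(p)
            if pri > theta:
                heapq.heappush(heap, (-pri, p))
    policy = np.argmax(R + gamma * np.einsum('ijk,k->ij', P, V), axis=1)
    return V, policy

Stale duplicate entries left in the heap are harmless here: popping one simply repeats a backup whose residual has already shrunk.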


Construction is a highly knowledge-intensive domain in which many factors are involved. Undertaking any action therefore requires an understanding of these factors and of how best to combine them to achieve a favourable and optimal outcome, so decision-making techniques have been used extensively in construction. The aim of this chapter is to review various decision support systems and to provide insights into their applications in the construction domain. Specifically, the cost-index principle, the sub-work chaining diagram method, the linear regression and cost over-runs in time-overrun context (CCOTOV) model, Markov decision processes (MDP), and ontology and rule-based systems have been reviewed. Based on this review, Markov decision processes (MDP), ontology, and rule-based systems were chosen as the most suitable for the cost-control case considered in this study.


2006, Vol. 43 (3), pp. 603-621
Author(s): Huw W. James, E. J. Collins

This paper is concerned with the analysis of Markov decision processes in which a natural form of termination ensures that the expected future costs are bounded, at least under some policies. Whereas most previous analyses have restricted attention to the case where the set of states is finite, this paper analyses the case where the set of states is not necessarily finite or even countable. It is shown that all the existence, uniqueness, and convergence results of the finite-state case hold when the set of states is a general Borel space, provided we make the additional assumption that the optimal value function is bounded below. We give a sufficient condition for the optimal value function to be bounded below which holds, in particular, if the set of states is countable.
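For orientation, the optimality equation whose solution the paper analyses can be written as follows; the notation (S a Borel state space, A(x) the admissible actions at state x, c the one-stage cost, q the transition kernel) is a generic assumption rather than the paper's exact formulation:

\[
  V^*(x) \;=\; \inf_{a \in A(x)} \Big\{\, c(x,a) + \int_{S} V^*(y)\, q(\mathrm{d}y \mid x,a) \,\Big\}, \qquad x \in S .
\]

Value iteration repeatedly applies the operator on the right-hand side, V_{n+1} = T V_n, and the boundedness-below condition on V^* is what allows the finite-state existence, uniqueness and convergence results to carry over to this general setting.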

