contributor author | William T. Scherer | |
contributor author | Douglas M. Glagola | |
date accessioned | 2017-05-08T21:03:01Z | |
date available | 2017-05-08T21:03:01Z | |
date copyright | January 1994 | |
date issued | 1994 | |
identifier other | %28asce%290733-947x%281994%29120%3A1%2837%29.pdf | |
identifier uri | http://yetl.yabesh.ir/yetl/handle/yetl/36757 | |
description abstract | The typical infrastructure maintenance decision‐making environment is dynamic and involves multiple objectives and uncertainty. One of the most commonly used infrastructure models is the Markov decision process (MDP). MDP models have been applied to numerous sequential decision‐making situations involving uncertainty and multiple objectives, including applications related to infrastructure problems. In this paper we explore the use of Markov models for bridge management systems. In particular, we examine two critical issues associated with MDP models. The first is state‐space explosion, one of the most common problems with MDP models; we address the issue of state‐space cardinality and present approaches for managing this complexity. The second is compliance with the Markovian property. For both issues we use the Virginia bridge system and its data to illustrate the concepts. Our research indicates that MDPs are a powerful and useful technique for bridge management systems; however, data collection on repair and maintenance history should be improved in order to build more accurate and complete MDP‐based models. | |
publisher | American Society of Civil Engineers | |
title | Markovian Models for Bridge Maintenance Management | |
type | Journal Paper | |
journal volume | 120 | |
journal issue | 1 | |
journal title | Journal of Transportation Engineering, Part A: Systems | |
identifier doi | 10.1061/(ASCE)0733-947X(1994)120:1(37) | |
tree | Journal of Transportation Engineering, Part A: Systems:;1994:;Volume ( 120 ):;issue: 001 | |
contenttype | Fulltext | |
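
The abstract describes modeling bridge maintenance as a Markov decision process over discretized condition states. As a minimal sketch of that idea, the value-iteration example below solves a tiny hypothetical bridge MDP; the four condition states, the two actions, and every transition probability and cost are illustrative assumptions, not figures from the paper or from the Virginia bridge data.

```python
# Illustrative value iteration for a small bridge-maintenance MDP.
# All states, actions, transition probabilities, and costs below are
# hypothetical assumptions, not values from the paper or Virginia data.

# Condition states: 0 = good ... 3 = poor (assumed discretization).
STATES = range(4)
ACTIONS = ("do-nothing", "repair")

# P[action][s] -> probability distribution over next states (assumed).
P = {
    "do-nothing": [          # bridge deteriorates if left alone
        [0.8, 0.2, 0.0, 0.0],
        [0.0, 0.7, 0.3, 0.0],
        [0.0, 0.0, 0.6, 0.4],
        [0.0, 0.0, 0.0, 1.0],
    ],
    "repair": [              # repair tends to restore good condition
        [1.0, 0.0, 0.0, 0.0],
        [0.9, 0.1, 0.0, 0.0],
        [0.8, 0.2, 0.0, 0.0],
        [0.7, 0.3, 0.0, 0.0],
    ],
}

# Immediate costs: worse condition and repair work both cost more (assumed).
COST = {"do-nothing": [0, 2, 5, 20], "repair": [1, 6, 12, 25]}

GAMMA = 0.95  # discount factor for future costs


def value_iteration(tol=1e-8):
    """Return the optimal expected discounted cost-to-go V and a greedy policy."""
    V = [0.0] * len(STATES)
    while True:
        # Q[a][s]: cost of taking action a in state s, then acting optimally.
        Q = {a: [COST[a][s] + GAMMA * sum(p * V[t] for t, p in enumerate(P[a][s]))
                 for s in STATES] for a in ACTIONS}
        V_new = [min(Q[a][s] for a in ACTIONS) for s in STATES]
        if max(abs(v - w) for v, w in zip(V, V_new)) < tol:
            policy = [min(ACTIONS, key=lambda a: Q[a][s]) for s in STATES]
            return V_new, policy
        V = V_new


V, policy = value_iteration()
print(policy)  # one recommended action per condition state
```

The state-space explosion the authors discuss arises because a real bridge network multiplies such per-element state spaces together; this sketch deliberately keeps a single four-state element so the dynamic-programming recursion stays visible.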