
Contributor author: William T. Scherer
Contributor author: Douglas M. Glagola
Date accessioned: 2017-05-08T21:03:01Z
Date available: 2017-05-08T21:03:01Z
Date copyright: January 1994
Date issued: 1994
Identifier (other): (asce)0733-947x(1994)120:1(37).pdf
Identifier (URI): http://yetl.yabesh.ir/yetl/handle/yetl/36757
Abstract: The typical infrastructure maintenance decision-making environment is dynamic and involves multiple objectives and uncertainty. One of the most commonly used infrastructure models is the Markov decision process (MDP). MDP models have been applied to numerous sequential decision-making situations involving uncertainty and multiple objectives, including applications related to infrastructure problems. In this paper we explore the use of Markov models for bridge management systems. In particular, we examine two critical issues associated with MDP models. The first is state-space explosion, one of the most common problems with MDP models; we address the issue of state-space cardinality and present approaches for managing this complexity. The second is compliance with the Markovian property. For both issues we use the Virginia bridge system and its data to illustrate the concepts. Our research indicates that MDPs are a powerful and useful technique for bridge management systems; however, data collection for repair and maintenance history can be improved in order to build more accurate and complete MDP-based models.
Publisher: American Society of Civil Engineers
Title: Markovian Models for Bridge Maintenance Management
Type: Journal Paper
Journal volume: 120
Journal issue: 1
Journal title: Journal of Transportation Engineering, Part A: Systems
Identifier (DOI): 10.1061/(ASCE)0733-947X(1994)120:1(37)
Tree: Journal of Transportation Engineering, Part A: Systems; 1994; Volume 120; Issue 1
Content type: Fulltext
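To illustrate the kind of MDP formulation the abstract describes, the sketch below runs value iteration over a small set of bridge condition states with "do nothing" and "repair" actions. All transition probabilities, costs, and the discount factor are invented for demonstration; they are not the Virginia bridge data or the authors' model.

```python
# Minimal value-iteration sketch for a hypothetical bridge-maintenance MDP.
# States are illustrative condition ratings (0 = good ... 3 = poor); all
# numbers below are assumptions for demonstration only.
import numpy as np

n_states = 4
actions = ["do_nothing", "repair"]

# P[a][s, s']: transition probabilities under each action (hypothetical).
P = {
    "do_nothing": np.array([   # deterioration drifts toward the worst state
        [0.8, 0.2, 0.0, 0.0],
        [0.0, 0.7, 0.3, 0.0],
        [0.0, 0.0, 0.6, 0.4],
        [0.0, 0.0, 0.0, 1.0],
    ]),
    "repair": np.array([       # repair mostly restores the best state
        [1.0, 0.0, 0.0, 0.0],
        [0.9, 0.1, 0.0, 0.0],
        [0.9, 0.1, 0.0, 0.0],
        [0.8, 0.2, 0.0, 0.0],
    ]),
}

# Immediate costs: a condition penalty plus a fixed repair cost (hypothetical units).
condition_cost = np.array([0.0, 1.0, 4.0, 10.0])
cost = {"do_nothing": condition_cost, "repair": condition_cost + 5.0}

gamma = 0.95               # discount factor
V = np.zeros(n_states)
for _ in range(1000):      # V(s) = min_a [ c(s,a) + gamma * sum_s' P(s'|s,a) V(s') ]
    Q = np.array([cost[a] + gamma * P[a] @ V for a in actions])
    V_new = Q.min(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

# Greedy policy with respect to the converged value function.
Q = np.array([cost[a] + gamma * P[a] @ V for a in actions])
policy = [actions[i] for i in Q.argmin(axis=0)]
print(policy)
```

Under these invented numbers the optimal policy leaves a bridge in good condition alone and repairs one in the worst state; the state-space explosion discussed in the paper arises when such condition ratings are tracked per component across a whole bridge network, multiplying the number of states.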


