
contributor author: Waleed A. Farahat
contributor author: H. Harry Asada
date accessioned: 2017-05-09T00:48:59Z
date available: 2017-05-09T00:48:59Z
date copyright: November, 2012
date issued: 2012
identifier issn: 0022-0434
identifier other: JDSMAA-926036#061003_1.pdf
identifier uri: http://yetl.yabesh.ir/yetl/handle/yetl/148428
description abstract: Vector Markov processes (also known as population Markov processes) are an important class of stochastic processes that have been used to model a wide range of technological, biological, and socioeconomic systems. The dynamics of vector Markov processes are fully characterized, in a stochastic sense, by the state transition probability matrix P. In most applications, P has to be estimated based on either incomplete or aggregated process observations. Here, in contrast to established methods for estimation from aggregate data, we develop Bayesian formulations for estimating P from asynchronous aggregate (longitudinal) observations of the population dynamics. Such observations are common, for example, in the study of aggregate biological cell population dynamics via flow cytometry. We derive the Bayesian formulation and show that computing estimates via exact marginalization is, in general, computationally expensive. Consequently, we rely on Markov chain Monte Carlo sampling approaches to estimate the posterior distributions efficiently. By explicitly integrating problem constraints into these sampling schemes, significant efficiencies are attained. We illustrate the algorithm via simulation examples and show that the Bayesian estimation schemes can attain significant advantages over point-estimate schemes such as maximum likelihood.
publisher: The American Society of Mechanical Engineers (ASME)
title: Estimation of State Transition Probabilities in Asynchronous Vector Markov Processes
type: Journal Paper
journal volume: 134
journal issue: 6
journal title: Journal of Dynamic Systems, Measurement, and Control
identifier doi: 10.1115/1.4006087
journal firstpage: 61003
identifier eissn: 1528-9028
keywords: Flow (Dynamics)
keywords: Algorithms
keywords: Markov processes
keywords: Probability
tree: Journal of Dynamic Systems, Measurement, and Control; 2012; volume 134; issue 6
contenttype: Fulltext
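
As a rough, self-contained illustration of the estimation idea summarized in the abstract above (not the authors' formulation), the sketch below simulates aggregate counts for a small population Markov process and samples the row-stochastic matrix P with Metropolis-Hastings, using Dirichlet proposals so every proposed row automatically satisfies the simplex constraint. The multinomial approximation to the aggregate likelihood, the flat row priors, and all names and settings are assumptions made for this example.

```python
# Illustrative sketch only: Bayesian sampling of a row-stochastic transition
# matrix P from aggregate population counts. Not the paper's algorithm; the
# aggregate likelihood here is a multinomial approximation.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(0)

def simulate_counts(P, n0, T):
    """Evolve aggregate state counts of a population whose members follow P."""
    counts = [np.asarray(n0, dtype=int)]
    for _ in range(T):
        nxt = np.zeros(len(n0), dtype=int)
        for i, n_i in enumerate(counts[-1]):
            nxt += rng.multinomial(n_i, P[i])  # individuals in state i move per row P[i]
        counts.append(nxt)
    return np.array(counts)

def log_lik(P, counts):
    """Approximate aggregate log-likelihood: score observed next-step counts
    against the state fractions predicted by P."""
    ll = 0.0
    for t in range(len(counts) - 1):
        frac = counts[t] @ P / counts[t].sum()
        ll += np.sum(counts[t + 1] * np.log(frac + 1e-12))
    return ll

def dirichlet_logpdf(x, alpha):
    """Log-density of Dirichlet(alpha) evaluated at a point x on the simplex."""
    return gammaln(alpha.sum()) - gammaln(alpha).sum() + np.sum((alpha - 1.0) * np.log(x))

def sample_posterior(counts, n_states, n_iter=3000, conc=200.0):
    """Metropolis-Hastings over P, one row at a time, with a flat prior on each row.
    Dirichlet proposals keep every row on the probability simplex by construction."""
    P = np.full((n_states, n_states), 1.0 / n_states)
    ll = log_lik(P, counts)
    samples = []
    for _ in range(n_iter):
        for i in range(n_states):
            alpha_fwd = conc * P[i] + 1.0
            row = np.clip(rng.dirichlet(alpha_fwd), 1e-12, None)
            row /= row.sum()
            prop = P.copy()
            prop[i] = row
            ll_prop = log_lik(prop, counts)
            alpha_rev = conc * row + 1.0
            log_acc = (ll_prop - ll
                       + dirichlet_logpdf(P[i], alpha_rev)   # reverse proposal density
                       - dirichlet_logpdf(row, alpha_fwd))   # forward proposal density
            if np.log(rng.uniform()) < log_acc:
                P, ll = prop, ll_prop
        samples.append(P.copy())
    return np.array(samples)

if __name__ == "__main__":
    P_true = np.array([[0.80, 0.15, 0.05],
                       [0.10, 0.80, 0.10],
                       [0.05, 0.25, 0.70]])
    counts = simulate_counts(P_true, n0=[400, 300, 300], T=30)
    samples = sample_posterior(counts, n_states=3)
    print("posterior mean of P (after burn-in):\n", samples[len(samples) // 2:].mean(axis=0))
```

The constraint-respecting proposals are the point of the sketch: because every Dirichlet draw is already a valid probability row, no proposed matrix is wasted on infeasible values, which is one simple way to realize the efficiency gain from integrating problem constraints into the sampler. A full treatment would use the exact (or the paper's) aggregate likelihood rather than this approximation.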

