Erik Jørgensen

Embedding a State Space Model Into a Markov Decision Process

Research output: Contribution to journal › Journal article › Research › peer-review

In agriculture, Markov decision processes (MDPs) with finite state and action spaces are often used to model sequential decision making over time. For instance, the states in the process represent possible levels of traits of the animal, and the transition probabilities are based on biological models estimated from data collected from the animal or herd.
State space models (SSMs) are a general tool for modeling repeated measurements over time where the model parameters can evolve dynamically.
In this paper, we consider methods for embedding an SSM into an MDP with finite state and action spaces. Different ways of discretizing an SSM are discussed, and methods for reducing the state space of the MDP are presented. An example from dairy production is given.
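The following is a minimal sketch (in Python, not code from the article) of the general idea described above: a hypothetical scalar linear Gaussian SSM for a trait level is discretized into intervals, interval-to-interval transition probabilities are obtained by integrating the conditional density over each interval, and the resulting finite MDP is solved by value iteration. The dynamics, state grid, actions ("keep"/"replace"), rewards, and discount factor are all illustrative assumptions, not values taken from the article.

```python
# Illustrative sketch: embed a scalar linear Gaussian SSM into a finite MDP.
import numpy as np
from scipy.stats import norm

# --- Hypothetical SSM:  x_{t+1} = a * x_t + w_t,  w_t ~ N(0, q) ---
a, q = 0.9, 1.0                  # assumed persistence and process-noise variance

# --- Discretization: interval midpoints and boundaries over a plausible range ---
n_states = 21
mids = np.linspace(-5.0, 5.0, n_states)             # interval midpoints
half = (mids[1] - mids[0]) / 2.0
edges = np.concatenate(([-np.inf], mids[:-1] + half, [np.inf]))

def transition_matrix(decay):
    """P[i, j] = Pr(next state falls in interval j | current state at midpoint i)."""
    P = np.empty((n_states, n_states))
    for i, x in enumerate(mids):
        cdf = norm.cdf(edges, loc=decay * x, scale=np.sqrt(q))
        P[i] = np.diff(cdf)      # probability mass of each interval
    return P

# Two illustrative actions: "keep" follows the SSM dynamics,
# "replace" resets the trait level to that of a new animal ~ N(0, 1).
P_keep = transition_matrix(a)
P_replace = np.tile(norm.cdf(edges[1:]) - norm.cdf(edges[:-1]), (n_states, 1))

r_keep = mids                    # assumed reward: higher trait level pays more
r_replace = mids - 3.0           # assumed replacement cost of 3 units

# --- Value iteration on the embedded finite MDP (discount factor beta) ---
beta, V = 0.95, np.zeros(n_states)
for _ in range(1000):
    Q = np.column_stack((r_keep + beta * P_keep @ V,
                         r_replace + beta * P_replace @ V))
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = np.where(Q.argmax(axis=1) == 0, "keep", "replace")
print(dict(zip(np.round(mids, 1), policy)))
```

A finer grid reduces the discretization error but enlarges the embedded MDP, which is the trade-off that motivates the state-space reduction methods discussed in the article.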

Original language: English
Journal: Annals of Operations Research
Volume: 190
Issue: 1
Pages (from-to): 289-309
ISSN: 0254-5330
Publication status: Published - 2011

Research areas

• State space model, Markov decision process, Sequential decision making, Stochastic dynamic programming

