
Daphne Koller Publications

Learning Continuous Time Bayesian Networks (2003)

by U. Nodelman, C.R. Shelton, and D. Koller

Abstract: Continuous time Bayesian networks (CTBNs) describe structured stochastic processes with finitely many states that evolve over continuous time. A CTBN is a directed (possibly cyclic) dependency graph over a set of variables, each of which represents a finite-state, continuous-time Markov process whose transition model is a function of its parents. We address the problem of learning the parameters and structure of a CTBN from fully observed data. We define a conjugate prior for CTBNs and show how it can be used both for Bayesian parameter estimation and as the basis of a Bayesian score for structure learning. Because acyclicity is not a constraint in CTBNs, we show that the structure learning problem is significantly easier, both in theory and in practice, than structure learning for dynamic Bayesian networks (DBNs). Furthermore, as CTBNs can tailor the parameters and dependency structure to the different time granularities of the evolution of different variables, they can provide a better fit to continuous-time processes than DBNs with a fixed time granularity.
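As a minimal illustration of the conjugate-prior idea described above, the sketch below places a Gamma prior over a state's leaving intensity and updates it from the sufficient statistics of a fully observed trajectory (total dwell time per state and transition counts). The trajectory data, prior hyperparameters, and all variable names are invented for illustration; this is a sketch of the general conjugacy pattern, not the paper's implementation.

```python
import numpy as np

# Hypothetical fully observed trajectory of one binary variable:
# (state, dwell time) segments, e.g. state 0 held for 1.4 time units.
trajectory = [(0, 1.4), (1, 0.6), (0, 2.1), (1, 0.9), (0, 0.5)]

n_states = 2
alpha, tau = 1.0, 1.0  # assumed Gamma(alpha, tau) prior over each intensity

# Sufficient statistics for fully observed CTBN parameter learning:
# T[x]   = total time spent in state x
# M[x,y] = number of observed transitions x -> y
T = np.zeros(n_states)
M = np.zeros((n_states, n_states))
for i, (state, dwell) in enumerate(trajectory):
    T[state] += dwell
    if i + 1 < len(trajectory):
        M[state, trajectory[i + 1][0]] += 1

# Conjugacy: with m_x transitions out of state x observed over total
# dwell time T[x], the posterior over the leaving intensity q_x is
# again a Gamma, with parameters (alpha + m_x, tau + T[x]).
for x in range(n_states):
    m_x = M[x].sum() - M[x, x]
    post_mean = (alpha + m_x) / (tau + T[x])
    print(f"state {x}: posterior mean intensity = {post_mean:.3f}")
```

Because the posterior stays in the Gamma family, parameter estimation reduces to accumulating these counts, and the same statistics feed the closed-form Bayesian score used for structure learning.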

Download Information

U. Nodelman, C.R. Shelton, and D. Koller (2003). "Learning Continuous Time Bayesian Networks." Proc. Nineteenth Conference on Uncertainty in Artificial Intelligence (UAI) (pp. 451-458). Winner of the Best Paper Award.

Bibtex citation

@inproceedings{nodelman03ctbn,
  title = {Learning Continuous Time Bayesian Networks},
  author = {U. Nodelman and C.R. Shelton and D. Koller},
  booktitle = {Proc. Nineteenth Conference on Uncertainty in Artificial Intelligence (UAI)},
  pages = {451--458},
  year = {2003},
  note = {Winner of the Best Paper Award}
}