Learning Bayesian Networks with Local Structure

N. Friedman and M. Goldszmidt

To appear in M. I. Jordan ed. Learning and Inference in Graphical Models. 1998.

PostScript version (411K)

PDF version


We examine a novel extension to existing methods for learning Bayesian networks from data, one that improves the quality of the learned networks. Our approach explicitly represents and learns the local structure in the conditional probability distributions (CPDs) that quantify these networks. This enlarges the space of possible models, enabling the representation of CPDs with a variable number of parameters. The resulting learning procedure induces models that better emulate the interactions present in the data. We describe the theoretical foundations and practical aspects of learning local structures and provide an empirical evaluation of the proposed learning procedure. This evaluation indicates that the learning curves characterizing this procedure converge faster, in the number of training instances, than those of the standard procedure, which ignores the local structure of the CPDs. Our results also show that networks learned with local structures tend to be more complex (in terms of arcs), yet require fewer parameters.
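To make the parameter savings concrete, the following is a minimal sketch (not the authors' code) contrasting a full tabular CPD with a tree-structured CPD that exploits context-specific independence. The particular variables, tree shape, and leaf count are hypothetical, chosen only to illustrate the counting argument.

```python
# Illustrative sketch: parameter counts for a tabular CPD versus a
# tree-structured CPD over the same parent set. All variables binary.

def table_params(num_parents, arity=2):
    """A full CPD table for a binary child stores one free parameter
    (e.g., P(Y=1 | parent configuration)) per joint parent configuration."""
    return arity ** num_parents

# Hypothetical tree CPD for P(Y | X1, X2, X3):
#   if X1 = 1: Y is independent of X2 and X3  -> one leaf
#   if X1 = 0: split on X2                    -> two more leaves
# Each leaf holds one parameter, so the tree needs 3 parameters
# where the full table needs 2**3 = 8.
tree_params = 3

print(table_params(3), tree_params)  # 8 vs. 3
```

A learner scoring such trees can thus accept an extra arc (more global structure) while still reducing the total parameter count, which matches the trade-off reported in the abstract.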
