Learning minimal latent directed information polytrees

Jalal Etesami, Negar Kiyavash, Todd Coleman

Research output: Contribution to journal › Article › peer-review



We propose an approach for learning latent directed polytrees whenever an appropriately defined discrepancy measure exists between the observed nodes. Specifically, we apply our approach to learning directed information polytrees in which samples are available from only a subset of the processes. Directed information trees are a type of probabilistic graphical model that represents the causal dynamics among a set of random processes in a stochastic system. We prove that the approach is consistent for learning minimal latent directed trees, and we analyze the sample complexity of the learning task when the empirical estimator of mutual information is used as the discrepancy measure.
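The abstract's sample-complexity analysis relies on the plug-in (empirical) estimator of mutual information as the discrepancy measure. As an illustration only, and not the paper's algorithm (which is defined for directed information between processes), the following sketch shows a plug-in estimate of mutual information between two discrete sample sequences; the function name and the toy data are illustrative assumptions.

```python
from collections import Counter
from math import log

def empirical_mutual_information(xs, ys):
    """Plug-in estimate of I(X; Y) in nats from paired discrete
    samples, computed from joint and marginal frequency counts.
    (Illustrative sketch; not the estimator's exact form in the paper.)"""
    n = len(xs)
    joint = Counter(zip(xs, ys))   # empirical joint distribution counts
    px = Counter(xs)               # marginal counts of X
    py = Counter(ys)               # marginal counts of Y
    mi = 0.0
    for (x, y), c in joint.items():
        pxy = c / n
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), with counts rescaled by n
        mi += pxy * log(pxy * n * n / (px[x] * py[y]))
    return mi

# Toy example: Y is a copy of X, so I(X; Y) = H(X) = log 2 nats
xs = [0, 0, 1, 1, 0, 1, 0, 1]
ys = xs[:]
print(empirical_mutual_information(xs, ys))  # ≈ 0.693 (log 2)
```

Such a pairwise quantity can serve as the discrepancy measure between observed nodes from which the latent tree structure is then reconstructed.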

Original language: English
Pages (from-to): 1723-1768
Number of pages: 46
Journal: Neural Computation
Issue number: 9
State: Published - 1 Sep 2016
Externally published: Yes

