

Poster in Workshop: Workshop on neuro Causal and Symbolic AI (nCSI)

Discrete Learning Of DAGs Via Backpropagation

Andrew Wren · Pasquale Minervini · Luca Franceschi · Valentina Zantedeschi


Abstract:

Recently, continuous relaxations have been proposed in order to learn directed acyclic graphs (DAGs) by backpropagation, instead of by combinatorial optimization. However, a number of techniques for fully discrete backpropagation could instead be applied. In this paper, we explore this direction and propose DAG-DB, a framework for learning DAGs by Discrete Backpropagation, based on the architecture of Implicit Maximum Likelihood Estimation (I-MLE). DAG-DB performs competitively using either of two fully discrete backpropagation techniques: I-MLE itself, or straight-through estimation.
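To illustrate one of the two discrete backpropagation techniques named above, the sketch below shows a generic straight-through estimator applied to sampling a binary adjacency matrix from edge logits: the forward pass uses hard Bernoulli samples, while gradients flow through the edge probabilities. This is only a minimal, hypothetical example in PyTorch, not the authors' DAG-DB implementation; in particular, it does not show I-MLE or how DAG-DB enforces acyclicity, and the toy target/training loop is purely illustrative.

```python
import torch

def straight_through_adjacency(logits: torch.Tensor) -> torch.Tensor:
    """Sample a binary adjacency matrix from edge logits.

    Forward pass: hard Bernoulli samples (fully discrete).
    Backward pass: gradients are passed through as if the output
    were the edge probabilities (the straight-through surrogate).
    """
    probs = torch.sigmoid(logits)      # edge probabilities
    hard = torch.bernoulli(probs)      # discrete sample, carries no gradient
    # Straight-through trick: value of `hard` in the forward pass,
    # gradient of `probs` in the backward pass.
    return hard + probs - probs.detach()

# Toy usage (hypothetical): fit logits so sampled graphs match a fixed
# upper-triangular target adjacency matrix, which is trivially a DAG.
if __name__ == "__main__":
    d = 4
    target = torch.triu(torch.ones(d, d), diagonal=1)
    logits = torch.zeros(d, d, requires_grad=True)
    opt = torch.optim.Adam([logits], lr=0.1)
    for _ in range(200):
        adj = straight_through_adjacency(logits)
        loss = ((adj - target) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(torch.sigmoid(logits).round())
```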
