Poster
Efficient Relational Learning with Hidden Variable Detection
Ni Lao · Jun Zhu · Liu Xinwang · Yandong Liu · William Cohen

Mon Dec 6th

Markov networks (MNs) can incorporate arbitrarily complex features when modeling relational data. However, this flexibility comes at the steep price of training an exponentially complex model. To address this challenge, we propose a novel relational learning approach, which consists of a restricted class of relational MNs (RMNs) called relation-tree-based RMNs (treeRMNs), and an efficient Hidden Variable Detection algorithm called Contrastive Variable Induction (CVI). On one hand, the restricted treeRMN considers only simple (e.g., unary and pairwise) features of the relational data and thus achieves computational efficiency; on the other hand, the CVI algorithm efficiently detects hidden variables that can capture long-range dependencies. The resulting approach is therefore highly efficient without sacrificing expressive power. Empirical results on four real datasets show that the proposed method achieves prediction quality similar to that of state-of-the-art approaches while being significantly more efficient to train, and that the induced hidden variables are semantically meaningful and crucial for improving both the training speed and the prediction quality of treeRMNs.
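The efficiency of the restricted model comes from limiting the log-potential to unary and pairwise features, so scoring an assignment is linear in the number of nodes and edges. The minimal sketch below illustrates this general idea only; all names (`score`, `unary_w`, `pair_w`) and the dictionary-based feature encoding are illustrative assumptions, not the paper's actual treeRMN implementation.

```python
def score(labels, edges, unary_w, pair_w):
    """Unnormalized log-score of a joint label assignment under a
    Markov network restricted to unary and pairwise features.

    labels:  dict node -> label
    edges:   list of (node_a, node_b) pairs
    unary_w: dict (node, label) -> weight
    pair_w:  dict (label_a, label_b) -> weight
    """
    s = 0.0
    for node, lab in labels.items():   # unary features: one term per node
        s += unary_w.get((node, lab), 0.0)
    for a, b in edges:                 # pairwise features: one term per edge
        s += pair_w.get((labels[a], labels[b]), 0.0)
    return s


# Tiny example: two linked entities whose labels prefer to agree.
unary_w = {("e1", "pos"): 1.0, ("e2", "pos"): 0.5}
pair_w = {("pos", "pos"): 2.0, ("pos", "neg"): -1.0}
edges = [("e1", "e2")]
print(score({"e1": "pos", "e2": "pos"}, edges, unary_w, pair_w))  # 3.5
```

The cost of this restriction is that long-range dependencies are not directly expressible, which is the gap the CVI-induced hidden variables are meant to fill.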

Author Information

Ni Lao (Carnegie Mellon University)
Jun Zhu (Tsinghua University)
Liu Xinwang (Carnegie Mellon University)
Yandong Liu (Carnegie Mellon University)
William Cohen (Google AI)
