Structured Energy Network As a Loss
Jay Yoon Lee · Dhruvesh Patel · Purujit Goyal · Wenlong Zhao · Zhiyang Xu · Andrew McCallum

Tue Nov 29 09:00 AM -- 11:00 AM (PST) @ Hall J #223

Belanger & McCallum (2016) and Gygli et al. (2017) have shown that an energy network can capture arbitrary dependencies among the output variables in structured prediction; however, their reliance on gradient-based inference (GBI) makes inference slow and unstable. In this work, we propose Structured Energy As Loss (SEAL) to take advantage of the expressivity of energy networks without incurring the high inference cost. SEAL is a novel learning framework that uses an energy network as a trainable loss function (loss-net) to train a separate neural network (task-net), which then performs inference with a single forward pass. We establish SEAL as a general framework wherein various learning strategies, such as margin-based, regression, and noise-contrastive objectives, can be employed to learn the parameters of the loss-net. Through extensive evaluation on multi-label classification, semantic role labeling, and image segmentation, we demonstrate that SEAL provides various useful design choices, is faster at inference than GBI, and leads to significant performance gains over the baselines.
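The two-network scheme in the abstract can be illustrated with a toy sketch. Everything below is an illustrative assumption, not the paper's actual architecture or objectives: both networks are linear, the loss-net energy is a squared residual, and the loss-net is fit with a simple regression-style update that pulls its prediction toward the gold label. The point is only the alternation: the loss-net is trained against the gold outputs, while the task-net is trained by descending the loss-net's energy, after which inference is a single forward pass through the task-net.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n = 5, 3, 200
M = rng.normal(size=(k, d))            # hidden gold mapping (synthetic data)
X = rng.normal(size=(n, d))
Y = X @ M.T                            # gold structured outputs

theta = rng.normal(size=(k, d)) * 0.1  # loss-net parameters (illustrative)
W = rng.normal(size=(k, d)) * 0.1      # task-net parameters (illustrative)
lr = 0.02

def energy(theta, x, y):
    """Toy loss-net energy: low when y agrees with the loss-net's view of x."""
    r = theta @ x - y
    return float(r @ r)

for step in range(2000):
    i = step % n
    x, y_gold = X[i], Y[i]
    # (1) Loss-net update: a regression-style target that shapes the energy
    #     so that gold outputs get low energy (grad of ||theta@x - y_gold||^2).
    theta -= lr * 2 * np.outer(theta @ x - y_gold, x)
    # (2) Task-net update: descend the loss-net's energy evaluated at the
    #     task-net's own output (grad of ||theta@x - W@x||^2 w.r.t. W).
    y_hat = W @ x
    W -= lr * 2 * np.outer(y_hat - theta @ x, x)

# Inference is just a forward pass through the trained task-net.
err = np.mean((X @ W.T - Y) ** 2)
print(err)
```

In this toy setting the loss-net converges toward the gold mapping and the task-net tracks it, so the final forward-pass error is small; the actual SEAL framework replaces the linear maps with deep networks and the regression update with the margin-based, regression, or noise-contrastive strategies named in the abstract.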

Author Information

Jay Yoon Lee (Graduate School of Data Science, Seoul National University)
Dhruvesh Patel (College of Information and Computer Science, University of Massachusetts, Amherst)
Purujit Goyal (College of Information and Computer Science, University of Massachusetts, Amherst)
Wenlong Zhao (University of Massachusetts Amherst)
Zhiyang Xu (Virginia Tech)
Andrew McCallum (UMass Amherst)
